WO2022105830A1 - Sleep assessment method, electronic device and storage medium - Google Patents

Sleep assessment method, electronic device and storage medium

Info

Publication number
WO2022105830A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
sleep
time
mobile terminal
wearable device
Prior art date
Application number
PCT/CN2021/131469
Other languages
English (en)
Chinese (zh)
Inventor
李坤阳
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2022105830A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438: Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116: Determining posture transitions
    • A61B5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B5/1128: Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
    • A61B5/48: Other medical applications
    • A61B5/4806: Sleep evaluation
    • A61B5/4812: Detecting sleep stages or cycles
    • A61B5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801: Arrangements specially adapted to be attached to or worn on the body surface
    • A61B5/6802: Sensor mounted on worn items

Definitions

  • the embodiments of the present application relate to the field of communication technologies, and in particular, to a sleep assessment method, an electronic device, and a storage medium.
  • Sleep latency refers to the time a user experiences from going to bed to falling asleep, which is an effective indicator for evaluating sleep disorders.
  • going to bed mainly refers to the stage in which the user lies on the bed after turning off the lights and has the intention to sleep
  • falling asleep mainly refers to the stage in which the user actually enters a sleep state. If the sleep latency exceeds 30 minutes, the user is clinically considered to have a sleep disorder.
  • the above-mentioned sleep latency is usually used as a reference index.
  • With current methods of measuring sleep latency, the user's sleep latency cannot be accurately calculated, which may lead to overestimation or underestimation of the user's actual sleep latency, so that the user's sleep quality cannot be accurately evaluated.
  • Embodiments of the present application provide a sleep assessment method, an electronic device, and a storage medium, so as to provide an accurate method for measuring sleep latency.
  • an embodiment of the present application provides a sleep evaluation method, which is applied to a wearable device, and the wearable device establishes a communication connection with a mobile terminal, including:
  • Acquire first information, and determine the user's sleep start time based on the first information; specifically, the first information may include information such as the user's heart rate and environmental status.
  • The sleep start time may be the starting point for calculating the sleep latency.
  • Send a first instruction to the mobile terminal, where the first instruction is used to instruct the mobile terminal to monitor the user's action events, so that the moment of each action event can be recorded and the duration of the user's action events can then be calculated.
  • Acquire second information, and determine the user's sleep onset time based on the second information; specifically, the second information may include information such as the user's body movement status and the change of the user's heart rate.
  • The sleep latency is determined based on the sleep start time and the sleep onset time. Specifically, a mathematical calculation can be performed based on the sleep start time and the sleep onset time to obtain the sleep latency. Exemplarily, the sleep latency can be determined as the difference between the sleep onset time and the sleep start time.
  • In this embodiment of the present application, the sleep start time is determined by monitoring the user's physical state and the surrounding environment state, and the sleep onset time is further determined by monitoring the user's physical state. Since the sleep latency is determined based on both the sleep start time and the sleep onset time, the sleep latency can be measured more accurately; a minimal sketch of the basic calculation follows.
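  • As an illustration of this basic calculation, the following is a minimal sketch in Python; it is not the claimed implementation, and all names and timestamps are hypothetical.

```python
# Illustrative sketch only: the sleep latency is the gap between the sleep
# start time and the sleep onset (falling-asleep) time. Names are invented.
from datetime import datetime

def sleep_latency_minutes(sleep_start: datetime, sleep_onset: datetime) -> float:
    """Return the sleep latency in minutes as onset minus start."""
    return (sleep_onset - sleep_start).total_seconds() / 60.0

start = datetime(2021, 11, 18, 23, 5)   # e.g., lights off, resting heart rate
onset = datetime(2021, 11, 18, 23, 47)  # e.g., verified falling-asleep moment
print(sleep_latency_minutes(start, onset))  # 42.0
```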
  • In one possible implementation, the first information includes the user's heart rate and the environmental state, and obtaining the first information and determining the user's sleep start time based on the first information includes:
  • continuously monitoring the user's heart rate and environmental status; specifically, the environmental status may include the lights-off status. Exemplarily, the lights-off status may be lights off or lights on, so that whether the user has the intention to sleep can be determined more accurately.
  • judging whether the user's heart rate and environmental state meet preset conditions; specifically, the preset condition for the heart rate may be that the user's heart rate falls within the resting heart rate interval, and the preset condition for the environmental state may be that the surrounding environment is in a lights-off state.
  • When the user's heart rate and the environmental state both satisfy the preset conditions at a moment, that moment is determined to be the sleep start time. Exemplarily, if at a certain time t the user's heart rate falls within the resting heart rate interval and the surrounding environment is in a lights-off state, time t is determined as the sleep start time; a sketch of this test follows.
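  • As a hedged sketch of the two-condition test above; the resting heart rate interval bounds below are invented placeholders, not values from this application.

```python
# Minimal sketch: the sleep start time is the first moment at which the
# heart rate lies in the resting interval AND the environment is lights-off.
def is_sleep_start(heart_rate_bpm: float, lights_off: bool,
                   resting_low: float = 50.0, resting_high: float = 70.0) -> bool:
    """Both preset conditions must hold at the same moment."""
    return (resting_low <= heart_rate_bpm <= resting_high) and lights_off

# (moment, heart rate, lights-off flag) samples; the first hit is the start.
samples = [(0, 78, False), (1, 66, False), (2, 64, True), (3, 61, True)]
sleep_start = next(t for t, hr, off in samples if is_sleep_start(hr, off))
print(sleep_start)  # 2
```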
  • In one possible implementation, the second information includes the user's body movement status and the change of the user's heart rate, and determining the user's sleep onset time based on the second information includes:
  • continuously monitoring the user's body movement status and the change of the user's heart rate; specifically, the body movement status is used to identify the user's body movements, and the heart rate change is used to characterize the declining slope of the user's heart rate.
  • the preset condition of the user's body movement condition may be a critical threshold. Exemplarily, if the user's body movement amount reaches the critical threshold, it may be determined that the user's body movement condition satisfies the preset condition.
  • When the user's body movement condition satisfies the preset condition at a moment, that moment is a potential sleep onset time; specifically, the potential sleep onset time can serve as a reference point for the sleep onset time, and it is further verified by the change in the user's heart rate to determine the real sleep onset time.
  • The heart rate decline time series is obtained based on the user's heart rate change, and the heart rate decline time series includes one or more heart rate decline moments; specifically, the user's heart rate at each moment can be read from the user's heart rate change curve, and one or more heart rate decline moments can be calculated from it.
  • The potential sleep onset time is compared with the one or more heart rate decline moments in the heart rate decline time series, and the sleep onset time is determined according to the comparison result. Specifically, since there may be multiple heart rate decline moments, each of them may be compared with the potential sleep onset time, and the sleep onset time is obtained from the comparison result.
  • In this embodiment, the reference point of the sleep onset time is obtained from the user's body movement status, the heart rate decline moments are obtained from the change of the user's heart rate, and the sleep onset time is then obtained by combining the reference point with the decline moments, as the sketch below illustrates.
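  • The sketch below is one hedged reading of this verification step; the slope threshold, search window, and data are invented, and the real comparison rule may differ.

```python
# Hypothetical onset verification: a body-movement lull marks a potential
# sleep onset moment, confirmed by a nearby heart rate decline moment.
def heart_rate_decline_moments(hr_series, slope_threshold=-0.5):
    """Moments where the heart rate slope drops below a negative threshold."""
    return [t for t in range(1, len(hr_series))
            if hr_series[t] - hr_series[t - 1] <= slope_threshold]

def confirm_onset(potential_t, decline_moments, max_gap=5):
    """Pick the first decline moment at or shortly after the potential one."""
    for t in decline_moments:
        if potential_t <= t <= potential_t + max_gap:
            return t
    return None  # no confirmation: keep monitoring

hr = [66, 66, 65, 63, 61, 60, 60]  # minute-by-minute heart rate (invented)
potential = 2                       # body movement fell below its threshold
print(confirm_onset(potential, heart_rate_decline_moments(hr)))  # 2
```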
  • In one possible implementation, before acquiring the second information, the method further includes:
  • receiving user action information sent by the mobile terminal, where the user action information includes an action event time sequence;
  • the action event time sequence includes a plurality of action event moments, each of which corresponds one-to-one to a user action event.
  • The action events include pick-up action events and put-down action events, where each pick-up action corresponds to a put-down action, and the duration of an action event can be obtained from the moment of a pick-up action event and the moment of the corresponding put-down action event.
  • It is understandable that a user can have multiple action events, and therefore there can be multiple action event durations.
  • In this way, the durations of the user's action events can be calculated and excluded from the sleep latency, thereby improving the accuracy of the sleep latency measurement; a pairing sketch follows.
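  • A small pairing sketch, assuming events arrive as an ordered list of (kind, moment) tuples; the event names and moments are placeholders.

```python
# Pair each pick-up moment with the following put-down moment; each pair
# yields one activity duration to be excluded from the sleep latency.
def activity_durations(event_sequence):
    durations, pick_up_t = [], None
    for kind, t in event_sequence:
        if kind == "pick_up":
            pick_up_t = t
        elif kind == "put_down" and pick_up_t is not None:
            durations.append(t - pick_up_t)
            pick_up_t = None
    return durations

# Moments in minutes since the sleep start time (invented example data).
events = [("pick_up", 10), ("put_down", 25), ("pick_up", 30), ("put_down", 42)]
print(activity_durations(events))  # [15, 12]
```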
  • In one possible implementation, determining the sleep latency based on the sleep start time and the sleep onset time includes:
  • the sleep latency is determined based on the sleep start time, the sleep onset time, and the user activity duration, wherein the user activity duration is determined by a plurality of action event moments in the action event time sequence.
  • In this embodiment, the action event durations are determined from the action event moments, so that the ineffective duration within the sleep latency can be calculated more accurately, thereby improving the accuracy of the sleep latency measurement.
  • In one possible implementation, the user action information further includes the interruption duration, and determining the sleep latency based on the sleep start time and the sleep onset time includes:
  • the sleep latency is determined based on the sleep start time, the sleep onset time, and the user activity duration, wherein the user activity duration is determined by multiple action event moments and interruption durations in the action event time sequence.
  • In this way, the ineffective duration within the sleep latency can be calculated more accurately, thereby improving the accuracy of the sleep latency measurement; see the sketch below.
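  • Combining the pieces, one hedged reading of the refined calculation is: the raw start-to-onset gap minus phone-activity time minus interruption time. All names and numbers below are invented.

```python
# Sketch of the refined latency; all quantities are in minutes.
def refined_sleep_latency(start_t, onset_t, activity_durations, interruption=0):
    return (onset_t - start_t) - sum(activity_durations) - interruption

# 60-minute raw gap, 27 minutes of phone use, 5 minutes up for water.
print(refined_sleep_latency(0, 60, [15, 12], interruption=5))  # 28
```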
  • In one possible implementation, the method further includes: prompting the user based on the sleep latency.
  • Specifically, the prompt may be a voice prompt or a vibration prompt, and may be a one-time prompt or a scheduled prompt, which is not particularly limited in this embodiment of the present application.
  • In this embodiment, the user's sleep quality is judged from the sleep latency and the user is prompted accordingly, which can help improve the user's sleep quality and user experience; a minimal prompting sketch follows.
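  • A minimal prompting sketch, using the 30-minute clinical threshold cited earlier; the prompt channel and wording are placeholders.

```python
# If the measured latency exceeds the clinical 30-minute threshold, issue a
# prompt (voice/vibration, one-time or scheduled, per the embodiment above).
def sleep_prompt(latency_minutes: float) -> str:
    if latency_minutes > 30:
        return "vibration prompt: long sleep latency detected"
    return "no prompt"

print(sleep_prompt(42.0))  # vibration prompt: long sleep latency detected
```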
  • the embodiment of the present application also provides a sleep evaluation method, which is applied to a mobile terminal, and the mobile terminal establishes a communication connection with the wearable device, including:
  • The mobile terminal receives a first instruction sent by the wearable device, where the first instruction is used to instruct the mobile terminal to monitor the user's action events; specifically, the first instruction can be used to instruct the mobile terminal to record the moments of the user's action events. The mobile terminal may be a terminal device such as a mobile phone or a tablet, or another mobile device, which is not particularly limited in this embodiment of the present application.
  • The user's action events are monitored, the moments corresponding to the action events are recorded, and an action event time sequence is generated, where the action event time sequence includes multiple action event moments; the action event time sequence is then sent to the wearable device.
  • In this way, the wearable device can calculate the action durations based on the above moments, thereby improving the accuracy of the sleep latency measurement; a recorder sketch follows.
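  • A hypothetical mobile-terminal-side recorder is sketched below; the transport callback stands in for the Bluetooth/Wi-Fi link and is not a real device API.

```python
# On each accelerometer-classified action event, store its moment; later
# ship the whole action event time sequence to the wearable device.
import time

class ActionEventRecorder:
    def __init__(self):
        self.event_moments = []  # the action event time sequence

    def on_action_event(self, kind: str):
        """Called when a pick-up or put-down event is detected."""
        self.event_moments.append((kind, time.time()))

    def send_to_wearable(self, send):
        """`send` abstracts the wireless link to the wearable device."""
        send(self.event_moments)

recorder = ActionEventRecorder()
recorder.on_action_event("pick_up")
recorder.on_action_event("put_down")
recorder.send_to_wearable(print)
```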
  • In one possible implementation, the method further includes: monitoring the user's activity events, recording the moments corresponding to the activity events, and determining the interruption duration based on those moments.
  • Specifically, an interruption event may include events such as the user getting up to drink water or going to the bathroom, and may also include other interruption events, which are not particularly limited in this embodiment of the present application.
  • the interruption duration is sent to the wearable device, thereby improving the accuracy of the sleep latency measurement.
  • In one possible implementation, the mobile terminal establishes a communication connection with a smart lighting device, and before monitoring the user's action events, the method further includes:
  • detecting the user's operation of turning off the smart lighting device, and in response to the detected operation, recognizing the user's gesture; specifically, the user's gesture can be recognized through the camera of the mobile terminal. Exemplarily, an image can be captured by the camera and image recognition performed on the image to recognize the user's gesture; monitoring of the user's action events is then started based on the recognition result.
  • In this way, the complete set of the user's action event moments can be obtained, thereby improving the accuracy of the sleep latency measurement.
  • an embodiment of the present application provides a sleep evaluation device, which is applied to a wearable device, and the wearable device establishes a communication connection with a mobile terminal, including:
  • a first obtaining module, configured to obtain first information and determine the user's sleep start time based on the first information;
  • a sending module, configured to send a first instruction to the mobile terminal, where the first instruction is used to instruct the mobile terminal to monitor the user's action events;
  • a second obtaining module, configured to obtain second information and determine the user's sleep onset time based on the second information;
  • a computing module, configured to determine the sleep latency based on the sleep start time and the sleep onset time.
  • the first information includes the user's heart rate and the environmental state
  • the above-mentioned first acquisition module includes:
  • a first monitoring unit for continuously monitoring the user's heart rate and environmental status
  • a first judging unit for judging whether the user's heart rate and environmental state meet preset conditions
  • the first acquiring unit is configured to determine the time as the sleep start time when the user's heart rate and the environmental state at any time meet the preset conditions.
  • the second information includes the user's physical activity status and the change of the user's heart rate
  • the above-mentioned second obtaining module includes:
  • the second monitoring unit is used to continuously monitor the user's body movement status and the change of the user's heart rate
  • a second judging unit configured to judge whether the user's body movement condition satisfies a preset condition
  • the second obtaining unit is configured to: determine a moment as a potential sleep onset time when the user's body movement condition meets the preset condition at that moment; obtain the heart rate decline time series based on the user's heart rate change, the heart rate decline time series including one or more heart rate decline moments; and compare the potential sleep onset time with the one or more heart rate decline moments, and determine the sleep onset time according to the comparison result.
  • the above-mentioned device further includes:
  • the receiving module is used for receiving user action information sent by the mobile terminal.
  • the user action information includes an action event time sequence, and the action event time sequence includes a plurality of action event moments, and each action event moment corresponds to a user's action event one-to-one.
  • the above-mentioned computing module is also used to determine the sleep latency based on the sleep start time, the sleep onset time, and the user activity duration, wherein the user activity duration is determined by a plurality of action event moments in the action event time sequence.
  • the user activity information further includes the interruption duration
  • the above-mentioned calculation module is further configured to determine the sleep latency based on the sleep start time, the sleep onset time, and the user activity duration, wherein the user activity duration is determined by the multiple action event moments in the action event time sequence and the interruption duration.
  • the above-mentioned device further includes:
  • the prompt module is used for prompting the user based on the sleep latency.
  • the embodiment of the present application also provides a sleep evaluation device, which is applied to a mobile terminal, and the mobile terminal establishes a communication connection with the wearable device, including:
  • a receiving module configured to receive a first instruction sent by the wearable device, where the first instruction is used to instruct the mobile terminal to monitor the action event of the user;
  • the first recording module is used to monitor the action events of the user, record the moments corresponding to the action events, and generate the action event time sequence, wherein the action event time sequence includes a plurality of action event moments;
  • the first sending module is used for sending the action event time sequence to the wearable device.
  • the above-mentioned device further includes:
  • the second recording module is used to monitor the user's activity events, record the moments corresponding to the activity events, and determine the interruption duration based on those moments;
  • the second sending module is used for sending the interruption duration to the wearable device.
  • the mobile terminal establishes a communication connection with the intelligent lighting device, and the above-mentioned apparatus further includes:
  • the detection module is used to detect the operation of the user to turn off the smart lighting device
  • an identification module for identifying the user's gesture in response to the detected operation
  • the starting module is used for starting the monitoring of the user's action event based on the identification result.
  • an embodiment of the present application provides a wearable device, and the wearable device establishes a communication connection with a mobile terminal, including:
  • a memory, where the memory is used to store computer program code, and the computer program code includes instructions;
  • when the wearable device reads the instructions from the memory, the wearable device performs the following steps:
  • the sleep latency is determined based on the sleep start time and the sleep onset time.
  • the first information includes the user's heart rate and environmental status
  • When the above-mentioned instructions are executed by the above-mentioned wearable device, the step of causing the wearable device to obtain the first information and determine the user's sleep start time based on the first information includes:
  • continuously monitoring the user's heart rate and environmental state;
  • judging whether the user's heart rate and environmental state meet the preset conditions;
  • when the user's heart rate and the environmental state at any moment meet the preset conditions, determining that moment as the sleep start time.
  • the second information includes the user's body movement status and the change of the user's heart rate
  • When the above-mentioned instructions are executed by the above-mentioned wearable device, the step of causing the wearable device to determine the user's sleep onset time based on the second information includes: continuously monitoring the user's body movement status and the change of the user's heart rate; judging whether the user's body movement condition satisfies the preset condition; and when the condition is satisfied at any moment, determining that moment as a potential sleep onset time;
  • obtaining the heart rate decline time series based on the user's heart rate change, where the heart rate decline time series includes one or more heart rate decline moments;
  • comparing the potential sleep onset time with the one or more heart rate decline moments in the heart rate decline time series, and determining the sleep onset time according to the comparison result.
  • When the above-mentioned instructions are executed by the above-mentioned wearable device, the wearable device further performs the following step before acquiring the second information:
  • the user action information sent by the mobile terminal is received.
  • the user action information includes an action event time sequence, and the action event time sequence includes a plurality of action event moments, and each action event moment corresponds to a user's action event one-to-one.
  • When the above-mentioned instructions are executed by the above-mentioned wearable device, the step of determining the sleep latency based on the sleep start time and the sleep onset time includes:
  • the sleep latency is determined based on the sleep start time, the sleep onset time, and the user activity duration, wherein the user activity duration is determined by a plurality of action event moments in the action event time sequence.
  • the user activity information further includes the interruption duration.
  • When the above-mentioned instructions are executed by the above-mentioned wearable device, the step of causing the wearable device to determine the sleep latency based on the sleep start time and the sleep onset time includes:
  • the sleep latency is determined based on the sleep start time, the sleep onset time, and the user activity duration, wherein the user activity duration is determined by multiple action event moments and interruption durations in the action event time sequence.
  • An embodiment of the present application further provides a mobile terminal, which establishes a communication connection with a wearable device, including: a memory, where the memory is used to store computer program code, and the computer program code includes instructions. When the mobile terminal reads the above instructions from the memory, the mobile terminal performs the following steps: receiving a first instruction sent by the wearable device, where the first instruction is used to instruct the mobile terminal to monitor the user's action events; monitoring the user's action events, recording the moments corresponding to the action events, and generating the action event time sequence; and sending the action event time sequence to the wearable device.
  • When the above-mentioned instructions are executed by the above-mentioned mobile terminal, the mobile terminal further performs the following steps:
  • monitoring the user's activity events, recording the moments corresponding to the activity events, and determining the interruption duration based on those moments;
  • sending the interruption duration to the wearable device.
  • The mobile terminal establishes a communication connection with the intelligent lighting device, and when the above-mentioned instructions are executed by the above-mentioned mobile terminal, the mobile terminal also performs the following steps: detecting the user's operation of turning off the smart lighting device; recognizing the user's gesture in response to the detected operation; and starting the monitoring of the user's action events based on the recognition result.
  • an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when it runs on a computer, causes the computer to execute the method described in the first aspect.
  • an embodiment of the present application provides a computer program, which is used to execute the method described in the first aspect when the computer program is executed by a computer.
  • the program in the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, and may also be stored in whole or in part in a memory not packaged with the processor.
  • FIG. 1 is an application scenario architecture diagram of a sleep assessment method provided by an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of a wearable device provided by an embodiment of the present application.
  • FIG. 4 is a flowchart of an embodiment of a sleep assessment method provided by the present application.
  • FIG. 5 is a schematic diagram of an embodiment of a heart rate decline moment calculation method provided by the present application;
  • FIG. 6 is a schematic diagram of another embodiment of a heart rate decline moment calculation method provided by the present application;
  • FIG. 7 is a schematic diagram of the calculation of sleep latency provided by an embodiment of the present application.
  • FIG. 8 is a flowchart of another embodiment of a sleep assessment method provided by the present application.
  • FIG. 9 is a schematic structural diagram of an embodiment of a sleep assessment device provided by the present application.
  • FIG. 10 is a schematic structural diagram of another embodiment of a sleep assessment device provided by the present application.
  • first and second are only used for descriptive purposes, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • plural means two or more.
  • the smart wearable device can recognize the user's awake or sleep state based on body motion, and has been widely used.
  • some products identify whether the user is lying down by classifying heart rate or heart rate variability, and use the difference between the relatively quiet moment after the user lies down and the moment the user falls asleep as the sleep latency.
  • With the popularity of electronic products such as smart terminals, the correlation between lying in bed and sleeping is weakening. That is to say, many users use electronic products, for example to read novels, after lying in bed with the lights out.
  • The duration of using electronic products is often not short, and the user's posture may remain unchanged for a long time; however, after the user puts down the electronic product, it may take only a short time to fall asleep.
  • Therefore, measuring sleep latency based on the above scheme usually overestimates the sleep latency, which introduces error into the sleep latency measurement.
  • Conversely, users may be mistakenly identified as having fallen asleep while they are still lying quietly in bed trying to fall asleep, resulting in an underestimation of their actual sleep latency, which likewise introduces error into the sleep latency measurement.
  • an embodiment of the present application proposes a sleep evaluation method.
  • FIG. 1 is an example diagram of an application scenario provided by the embodiment of the present application.
  • the above-mentioned application scenario includes a mobile terminal 100, a smart wearable device 200 (for example, a smart watch), a smart wearable device 201 (e.g., smart glasses), and a smart home device 300 (e.g., a smart lamp).
  • A connection between the smart wearable device 200 and the mobile terminal 100 can be established wirelessly, and the wireless methods may include wireless communication methods such as Wi-Fi, Bluetooth, and cellular mobile networks (e.g., 4G, 5G, etc.), which are not specifically limited in this application.
  • The connections between the mobile terminal 100 and the smart home device 300 and the smart wearable device 201 may also be established wirelessly, and the methods may likewise include wireless communication methods such as Wi-Fi, which are not specifically limited in this application.
  • the smart wearable device 200 may be a wearable device with a wireless communication function and a display screen, such as a smart watch.
  • the smart wearable device 201 may be a wearable device with a wireless communication function, for example, smart glasses.
  • the mobile terminal 100 may also be referred to as terminal equipment, user equipment (UE), access terminal, subscriber unit, mobile station, remote terminal, mobile device, user terminal, terminal, wireless communication device, or user equipment.
  • The mobile terminal 100 may be a cellular telephone, a cordless telephone, a personal digital assistant (PDA), a handheld communication device with wireless communication capabilities, and/or another device for communicating over a wireless system or a next-generation communication system, such as a mobile terminal in a 5G network or in a future evolved public land mobile network (PLMN), for example a mobile phone or a tablet.
  • the smart home device 300 may be a home device with a wireless communication function, for example, a smart lamp, a smart switch, a smart socket, and the like.
  • FIG. 2 shows a schematic structural diagram of the mobile terminal 100 .
  • the mobile terminal 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile terminal 100 .
  • the mobile terminal 100 may include more or less components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the mobile terminal 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface, so as to realize the shooting function of the mobile terminal 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the mobile terminal 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the mobile terminal 100, and can also be used to transmit data between the mobile terminal 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in this embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the mobile terminal 100 .
  • the mobile terminal 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the mobile terminal 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the mobile terminal 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile terminal 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the mobile terminal 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 may provide wireless communication solutions applied on the mobile terminal 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the mobile terminal 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile terminal 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite based augmentation systems (SBAS).
  • the mobile terminal 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the mobile terminal 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the mobile terminal 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the mobile terminal 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the user's posture can be recognized by capturing an image of the user through the camera, and thus the user's sleeping intention can be determined.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the mobile terminal 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and the like.
  • Video codecs are used to compress or decompress digital video.
  • the mobile terminal 100 may support one or more video codecs.
  • the mobile terminal 100 can play or record videos in various encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the mobile terminal 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile terminal 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the mobile terminal 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the mobile terminal 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the mobile terminal 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the mobile terminal 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
  • The user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the mobile terminal 100 may be provided with at least one microphone 170C.
  • the mobile terminal 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals.
  • the mobile terminal 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D can be the USB interface 130, or can be a 3.5mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • A capacitive pressure sensor may consist of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the mobile terminal 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the mobile terminal 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the mobile terminal 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
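  • As an illustrative mapping of the short-message example above; the threshold value below is a placeholder, not a value from this application.

```python
# Dispatch an operation based on the touch intensity detected by the
# pressure sensor 180A; the threshold is a hypothetical normalized force.
FIRST_PRESSURE_THRESHOLD = 0.5

def sms_icon_action(touch_intensity: float) -> str:
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view short messages"
    return "create a new short message"

print(sms_icon_action(0.3), "|", sms_icon_action(0.8))
```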
  • the gyro sensor 180B may be used to determine the motion attitude of the mobile terminal 100 .
  • The angular velocity of the mobile terminal 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the angle at which the mobile terminal 100 shakes, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shake of the mobile terminal 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the mobile terminal 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the mobile terminal 100 may detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the mobile terminal 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Further, according to the detected opening and closing state of the leather case or the opening and closing state of the flip cover, characteristics such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the mobile terminal 100 in various directions (generally three axes). The magnitude and direction of gravity can be detected when the mobile terminal 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc. In this embodiment of the present application, the acceleration sensor 180E can identify the action event of the user.
  • the action event may include: an event of the user picking up the mobile terminal and an event of the user putting down the mobile terminal.
  • the distance sensor 180F is used to measure distance; the mobile terminal 100 may measure the distance through infrared or laser. In some embodiments, when shooting a scene, the mobile terminal 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the mobile terminal 100 emits infrared light to the outside through the light emitting diode.
  • the mobile terminal 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the mobile terminal 100 . When insufficient reflected light is detected, the mobile terminal 100 may determine that there is no object near the mobile terminal 100 .
  • the mobile terminal 100 can use the proximity light sensor 180G to detect that the user holds the mobile terminal 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the mobile terminal 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the mobile terminal 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the mobile terminal 100 can use the collected fingerprint characteristics to unlock the fingerprint, access the application lock, take a picture with the fingerprint, answer the incoming call with the fingerprint, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the mobile terminal 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile terminal 100 performs performance reduction of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • when the temperature is lower than another threshold, the mobile terminal 100 heats the battery 142 to avoid an abnormal shutdown of the mobile terminal 100 caused by the low temperature.
  • the mobile terminal 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • the touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile terminal 100 , which is different from the position where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M may also be disposed in an earphone to form a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the mobile terminal 100 may receive key inputs and generate key signal inputs related to user settings and function control of the mobile terminal 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • different application scenarios (for example: time reminder, receiving information, alarm clock, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to and separated from the mobile terminal 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the mobile terminal 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the mobile terminal 100 interacts with the network through the SIM card to realize functions such as calls and data communication.
  • the mobile terminal 100 employs an eSIM, ie an embedded SIM card.
  • the eSIM card can be embedded in the mobile terminal 100 and cannot be separated from the mobile terminal 100 .
  • the smart wearable device 200 may include a heart rate sensor 210 , a body motion sensor 220 , an ambient light sensor 230 , a processor 240 , a memory 250 , a power supply unit 260 and a wireless communication module 270 .
  • the heart rate sensor 210 , the body motion sensor 220 , the ambient light sensor 230 , the processor 240 , the memory 250 , the power supply unit 260 and the wireless communication module 270 can communicate with each other through an internal connection path to transmit control and/or data signals.
  • the heart rate sensor 210 is used to monitor the user's heart rate to determine the current posture of the user and whether the user is in a sleep state.
  • the body motion sensor 220 is used to monitor the body motion status of the user to determine whether the user has a sleep intention and assist in determining whether the user is in a sleep state, and the body motion status can be used to identify the user's body motion amount.
  • the ambient light sensor 230 is used to monitor the intensity of the current ambient light to determine whether the current environment is in a lights-off environment, thereby further determining whether the user has sleep intentions.
  • the memory 250 can be used to store a computer program, and the processor 240 can be used to call and run the computer program from the memory 250; in a specific implementation, the processor 240 can be a Micro-Controller Unit (MCU),
  • the memory 250 may be a buffer, and the memory 250 may also be used to store monitored data (eg, the user's heart rate) and data sent by the mobile terminal 100 (eg, the moment of the user's action event).
  • the power supply unit 260 is used to provide power to various devices or circuits in the smart wearable device 200 .
  • the wireless communication module 270 can provide applications on the smart wearable device 200 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), bluetooth (BT), global navigation Satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 270 may be one or more devices integrating at least one communication processing module.
  • the above-mentioned memory 250 can be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions; it can also be an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • the above-mentioned processor 240 may be combined with the memory 250 to form a processing device; more commonly, they are components independent of each other, and the processor 240 may be used to execute the program code stored in the memory 250 to realize the above-mentioned functions.
  • the memory 250 may also be integrated in the processor 240 , or be independent of the processor 240 .
  • the smart wearable device 200 may further include one or more of an audio circuit 280 and a vibrator 290, and the audio circuit 280 may further include a speaker 281. The audio circuit 280 and the speaker 281 can be used for voice broadcast to prompt the user to change their work and rest habits to improve sleep quality; the vibrator 290 can vibrate to prompt the user to change their work and rest habits to improve sleep quality.
  • the processor 240 in the smart wearable device 200 shown in FIG. 3 may be a system-on-chip (SOC); the processor 240 may include a central processing unit (CPU), and may further include other types of processors, for example, a graphics processing unit (GPU), etc.
  • the smart wearable device 201 includes at least the above-mentioned ambient light sensor 230 .
  • FIG. 4 is a schematic flowchart of an embodiment of a sleep assessment method provided by an embodiment of the present application, including:
  • Step 101 the smart wearable device 200 monitors the user's heart rate.
  • the smart wearable device 200 can monitor the user's heart rate through the heart rate sensor 210, thereby determining whether the user is in a sleep state.
  • by monitoring the user's heart rate, the smart wearable device 200 can classify the monitored heart rate values.
  • the smart wearable device 200 may classify the heart rate into two categories, the resting heart rate and the falling asleep heart rate.
  • the above-mentioned resting heart rate and falling asleep heart rate indicate a heart rate interval.
  • the resting heart rate may correspond to a heart rate interval of 64-78 beats per minute
  • the falling asleep heart rate may correspond to a heart rate interval of 56-63 beats per minute.
  • the numerical values are only examples, and may also be other numerical values, which are not particularly limited in the embodiments of the present application. If the detected heart rate matches the above interval, the detected heart rate is classified into the corresponding heart rate category.
  • the resting heart rate is used to identify that the user is in a resting state but has not entered a sleep state; the sleep-onset heart rate is used to identify that the user has entered a sleep state.
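  • As an illustrative sketch only (not part of the claimed method), the classification above can be expressed in Python; the interval values are the example values from this embodiment, and the function name and the "unclassified" fallback are assumptions:

        # Minimal sketch of the heart-rate classification described above.
        RESTING_RANGE = (64, 78)          # resting: at rest but not yet asleep (example values)
        FALLING_ASLEEP_RANGE = (56, 63)   # falling asleep: entered a sleep state (example values)

        def classify_heart_rate(bpm: float) -> str:
            """Map a monitored heart rate to one of the two categories."""
            if RESTING_RANGE[0] <= bpm <= RESTING_RANGE[1]:
                return "resting"
            if FALLING_ASLEEP_RANGE[0] <= bpm <= FALLING_ASLEEP_RANGE[1]:
                return "falling_asleep"
            return "unclassified"  # e.g. user active, or measurement noise

        print(classify_heart_rate(72))  # resting
        print(classify_heart_rate(58))  # falling_asleep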
  • Step 102 the smart wearable device 200 monitors the current environment state.
  • the current environmental state may include a light-off state and a non-light-off state.
  • the non-light-off state is used to identify that the current environment the user is in is a lights-on state, from which it can be inferred that the user may still be active.
  • the light-off state is used to identify that the current environment the user is in is the light-off state, so that it can be determined that the user is ready to sleep.
  • the detection of the current environment state may be performed by the ambient light sensor 230 in the smart wearable device 200 .
  • for example, the ambient light intensity threshold can be preset to 15 lux. If the ambient light intensity monitored by the ambient light sensor 230 is greater than or equal to the ambient light intensity threshold, the current environment state can be considered the non-light-off state; if the monitored ambient light intensity is less than the ambient light intensity threshold, the current environment state can be considered the light-off state. It can be understood that the above-mentioned ambient light intensity threshold may be any other value, which is not particularly limited in this embodiment of the present application.
  • the current environment state may also correspond to the light-on and light-off operations of the mobile terminal 100 .
  • the above-mentioned non-light-off state may correspond to a user's light-on operation.
  • the user may perform a light-on operation on the display interface of the mobile terminal 100 to turn on the smart home device 300 (eg, a smart lamp).
  • the mobile terminal 100 may determine that the current environmental state is a non-light-off state.
  • the above-described light-off state may correspond to a user's light-off operation.
  • the user may perform a light-off operation on the display interface of the mobile terminal 100 to turn off the smart home device 300 (eg, smart lamps).
  • the mobile terminal 100 may determine that the current environmental state is the light-off state. After the mobile terminal 100 acquires the current environment state, the current environment state may be sent to the smart wearable device 200 .
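  • The following is a minimal sketch of how the two signal sources just described could be combined to determine the current environment state; the 15 lux threshold is the example value above, and the event labels ("on"/"off") are illustrative assumptions:

        from typing import Optional

        AMBIENT_LIGHT_THRESHOLD_LUX = 15.0  # example threshold from the text

        def environment_state(ambient_lux: Optional[float],
                              last_light_event: Optional[str]) -> str:
            """Return 'light_off' or 'non_light_off'."""
            # A smart-home light operation on the mobile terminal is an explicit signal.
            if last_light_event == "off":
                return "light_off"
            if last_light_event == "on":
                return "non_light_off"
            # Otherwise fall back to ambient light sensor 230.
            if ambient_lux is not None and ambient_lux < AMBIENT_LIGHT_THRESHOLD_LUX:
                return "light_off"
            return "non_light_off"  # default: assume the user may still be active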
  • step 102 and step 101 are executed in no particular order; that is, step 102 can be executed before step 101, after step 101, or simultaneously with step 101, which is not particularly limited in this embodiment of the present application.
  • Step 103 the smart wearable device 200 determines the sleep start time T start based on the above-mentioned user's heart rate and the current environment state, and starts sleep onset detection.
  • the smart wearable device 200 can make a comprehensive judgment on the above-mentioned user's heart rate and the current environment state.
  • the smart wearable device 200 may record the sleep start time T start .
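  • A minimal sketch of the comprehensive judgment in step 103, assuming the heart rate categories and environment states defined above; the rule that both conditions must hold at the same moment is a natural reading of the text, not an explicit formula from it:

        from datetime import datetime
        from typing import Optional

        def detect_sleep_start(heart_rate_category: str, environment: str,
                               now: datetime) -> Optional[datetime]:
            # T start is recorded when the user is resting (not yet asleep)
            # and the environment is in the light-off state.
            if heart_rate_category == "resting" and environment == "light_off":
                return now  # sleep start time T start; sleep-onset detection begins
            return None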
  • Step 104 the smart wearable device 200 may also send instruction information to the mobile terminal 100 .
  • the indication information is used to instruct the mobile terminal 100 to start monitoring the user action event.
  • step 105 the mobile terminal 100 receives the indication information sent by the smart wearable device 200, and monitors the action events of the user.
  • the action event may include a pick-up event and a put-down event.
  • if the user picks up the mobile terminal 100, the event is a pick-up event; if the user puts down the mobile terminal 100, the event is a put-down event.
  • the pick-up event can be used to identify that the user is active, that is, the user has no sleep intention temporarily; the put-down event can be used to identify that the user is about to fall asleep, that is, the user has a sleep intention.
  • the above-mentioned action events may be identified by the acceleration sensor 180E in the mobile terminal 100 .
  • the above-mentioned action event may also correspond to a screen unlocking event and a screen locking event of the mobile terminal 100 .
  • the unlocking event may correspond to a pick-up event, and the unlocking event may be triggered by a user's unlocking operation.
  • the user may perform an unlocking operation on the screen of the mobile terminal 100 , for example, by entering a password or fingerprint to unlock.
  • the mobile terminal 100 triggers an unlocking event to unlock the screen, so that it can be determined that the user has picked up the mobile terminal 100 .
  • the screen lock event may correspond to a drop event, and the screen lock event may be triggered by a user's screen lock operation.
  • the user may also perform a screen lock operation on the screen of the mobile terminal 100, for example, press a screen lock key.
  • the mobile terminal 100 triggers a screen-locking event to lock the screen, whereby it can be determined that the user has put down the mobile terminal 100 .
  • the above-mentioned action event may also be identified by the smart wearable device 201 .
  • the smart wearable device 201 may be smart glasses. Through the smart glasses, the light of the mobile terminal 100 can be detected, and the above-mentioned action events can be made to correspond to the light. If the smart glasses detect that light from the mobile terminal 100 is present, it can be determined that the current event is a pick-up event, that is, the mobile terminal 100 is in a bright-screen state and the user is using the mobile terminal 100; if the smart glasses detect that the light of the mobile terminal 100 disappears, it can be determined that the current event is a put-down event, that is, the mobile terminal 100 is in a screen-off state and the user has put down the mobile terminal 100. After acquiring a pick-up event or a put-down event, the smart wearable device 201 can send the pick-up event information or the put-down event information to the mobile terminal 100.
  • Step 106 the mobile terminal 100 records the time corresponding to the action event, and generates a time series of the action event.
  • the user may pick up and put down the mobile terminal 100 multiple times, that is, there are multiple pick-up events and put-down events. Therefore, after the mobile terminal 100 monitors a pick-up event or a put-down event, it can record the time corresponding to that event, thereby obtaining two action event time series, namely the pick-up event time series and the put-down event time series.
  • the above pick-up event time sequence includes all the moments corresponding to the pick-up event.
  • the above-mentioned drop event time series includes all the moments corresponding to the drop events.
  • the pick-up event time series may include two moments of 23:37 and 0:10
  • the drop event time series may include two moments of 23:57 and 0:20.
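  • A minimal sketch of how the two action event time series could be recorded in step 106; the event labels are illustrative, and the triggers would correspond to the acceleration sensor 180E or the screen unlock/lock events described above:

        from datetime import datetime
        from typing import List

        pickup_times: List[datetime] = []   # pick-up event time series
        putdown_times: List[datetime] = []  # put-down event time series

        def on_action_event(event: str, moment: datetime) -> None:
            """Record the moment of a pick-up or put-down event."""
            if event == "pickup":      # e.g. screen unlock / accelerometer gesture
                pickup_times.append(moment)
            elif event == "putdown":   # e.g. screen lock
                putdown_times.append(moment)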
  • Step 107 the mobile terminal 100 sends the above two action event time series to the smart wearable device 200 .
  • step 108 the smart wearable device 200 monitors the physical movement status of the user, and determines the potential time to fall asleep.
  • the smart wearable device 200 may monitor the user's body movement status through the body movement sensor 220 .
  • the body movement status may reflect the user's body momentum.
  • the smart wearable device 200 can judge the user's body momentum through the body motion sensor 220. If the smart wearable device 200 judges that the user's body momentum is less than or equal to the preset body momentum threshold, it can record the moment and use it as the potential sleep time T sleep1, where the potential sleep time T sleep1 is used to identify the user's potential sleep moment. It should be noted that the potential sleep time T sleep1 does not indicate that the user has actually fallen asleep; therefore, further judgment can be made to determine the real sleep-onset time.
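  • A minimal sketch of this potential-sleep check; the numeric threshold is a hypothetical placeholder, since the text gives no value:

        from datetime import datetime
        from typing import Optional

        BODY_MOMENTUM_THRESHOLD = 5.0  # hypothetical preset body momentum threshold

        def check_potential_sleep(body_momentum: float,
                                  now: datetime) -> Optional[datetime]:
            # T sleep1 marks a potential (not yet confirmed) sleep moment.
            if body_momentum <= BODY_MOMENTUM_THRESHOLD:
                return now
            return None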
  • Step 109 the smart wearable device 200 monitors the change of the user's heart rate, and determines the heart rate drop time T sleep2.
  • when the user falls asleep, the heart rate drops sharply. Therefore, by monitoring the user's heart rate, the moment at which the user's heart rate drops sharply can be found, and that moment can be used as a reference time for the user falling asleep.
  • the smart wearable device 200 can use the heart rate sensor 210 to monitor the heart rate values at any two successive moments, calculate the difference between the heart rates corresponding to those two moments, and compare the difference with a preset difference threshold; if the difference is greater than or equal to the preset difference threshold, it can be considered that the user's heart rate has dropped sharply, and the later moment can be recorded as the heart rate drop time T sleep2.
  • for example, heart rate values can be monitored starting from a time T, where T can be the sleep start time T start or any time after T start, which is not specifically limited in this embodiment of the present application. Suppose monitoring runs from time T to time Tm, and the period from T to Tm includes the moments T1, T2, T3 and T4. The heart rate difference between T1 and T2, between T2 and T3, and between T3 and T4 can be calculated respectively, so that the heart rate drop time T sleep2 among T1, T2, T3 and T4 can be determined. For example, if the heart rate difference between T1 and T2 is greater than or equal to the preset difference threshold, T2 is the heart rate drop time T sleep2.
  • alternatively, the smart wearable device 200 can monitor the average heart rate of two successive time periods, calculate the difference between the average heart rate values corresponding to those two periods, and compare the difference with the preset difference threshold. If the difference is greater than or equal to the preset difference threshold, it can be considered that the user's heart rate has dropped sharply, so the first moment of the later time period can be recorded as the heart rate drop time T sleep2; any moment in the later time period may also be recorded as the heart rate drop time T sleep2, which is not particularly limited in this embodiment of the present application. For example, suppose the preset monitoring time span is Td and monitoring runs from time T to time Tm; the period from T to Tm then includes sub-periods such as Td1, Td2, Td3 and Td4, each of which contains multiple moments. The average heart rate of each of the sub-periods Td1, Td2, Td3 and Td4 can be calculated respectively, and then the differences between the average heart rates of Td1 and Td2, of Td2 and Td3, and of Td3 and Td4, so that the heart rate drop time T sleep2 among Td1, Td2, Td3 and Td4 can be determined.
  • the smart wearable device 200 may store a heart rate drop time series, wherein the heart rate drop time series may include one or more heart rate drop times T sleep2.
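  • A minimal sketch of the point-wise variant of step 109 (the averaging variant over sub-periods Td1 to Td4 would be analogous); the difference threshold is left as a parameter, since the text gives no numeric value:

        from datetime import datetime
        from typing import List, Tuple

        def detect_heart_rate_drops(samples: List[Tuple[datetime, float]],
                                    diff_threshold: float) -> List[datetime]:
            """samples: (moment, heart rate) pairs ordered in time."""
            drop_times = []
            for (t_prev, hr_prev), (t_next, hr_next) in zip(samples, samples[1:]):
                if hr_prev - hr_next >= diff_threshold:
                    drop_times.append(t_next)  # the later moment is recorded as T sleep2
            return drop_times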
  • step 110 the smart wearable device 200 determines the sleep-onset time T sleep based on the potential sleep time T sleep1 and the heart rate drop time T sleep2.
  • the smart wearable device 200 can compare the potential sleep time T sleep1 with each heart rate drop time T sleep2 in the heart rate drop time series.
  • if the smart wearable device 200 determines that the difference between the potential sleep time T sleep1 and a heart rate drop time T sleep2 is greater than or equal to the preset difference threshold, that is, the heart rate drop time T sleep2 is significantly earlier than the potential sleep time T sleep1 (for example, by more than 3 minutes), that heart rate drop time T sleep2 can be considered an invalid time, and the search can continue among the other heart rate drop times T sleep2 in the above heart rate drop time series to determine the sleep-onset time T sleep.
  • the search may be performed in chronological order. For example, first compare the earliest heart rate drop time T sleep2 with the potential sleep time T sleep1; if the earliest heart rate drop time T sleep2 does not meet the condition, continue to compare the next earliest heart rate drop time T sleep2 with the potential sleep time T sleep1, until a heart rate drop time T sleep2 that satisfies the condition is found, and that heart rate drop time T sleep2 can be taken as the sleep-onset time T sleep.
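  • A minimal sketch of this search, using the 3-minute tolerance from the example above as the validity condition:

        from datetime import datetime, timedelta
        from typing import List, Optional

        def select_sleep_onset(t_sleep1: datetime,
                               drop_times: List[datetime],
                               max_lead: timedelta = timedelta(minutes=3)) -> Optional[datetime]:
            # Scan drop times in chronological order; a T sleep2 that is more than
            # max_lead earlier than T sleep1 is treated as invalid.
            for t_sleep2 in sorted(drop_times):
                if t_sleep1 - t_sleep2 <= max_lead:
                    return t_sleep2  # taken as the sleep-onset time T sleep
            return None  # no valid drop time found yet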
  • Step 111 the smart wearable device 200 determines the sleep latency based on the sleep start time T start , the sleep time T sleep and the action event time sequence.
  • the smart wearable device 200 can further determine whether there is a time corresponding to the action event.
  • if the smart wearable device 200 receives the action event time sequence sent by the mobile terminal 100, it can determine that there are moments corresponding to action events, and thus the sleep latency can be determined based on the sleep start time T start, the sleep-onset time T sleep, and the above action event time sequence.
  • the user activity duration T active may be determined from the moments corresponding to the action events in the above-mentioned action event time sequence.
  • for example, suppose the user turns off the lights at 00:31 and prepares to start sleeping; the 00:31 moment corresponds to T start. Ten minutes later, that is, at 00:41, the user picks up the phone again and starts reading a novel.
  • the 00:41 time corresponds to the pickup event 1 time T pickup1 .
  • the 00:56 time corresponds to the time T putdown1 of the drop event 1.
  • the 01:04 time corresponds to the time T pickup2 of the pick-up event 2.
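  • The worked example stops at pick-up event 2; the sketch below completes it with assumed values (put-down event 2 at 01:26 and a sleep-onset time of 01:35 are hypothetical) to show, on a natural reading of step 111, how the activity duration T active is excluded from the span between T start and T sleep:

        from datetime import datetime, timedelta

        fmt = "%Y-%m-%d %H:%M"
        t_start = datetime.strptime("2021-01-01 00:31", fmt)     # lights off (from the example)
        pickups = [datetime.strptime("2021-01-01 00:41", fmt),   # pick-up event 1
                   datetime.strptime("2021-01-01 01:04", fmt)]   # pick-up event 2
        putdowns = [datetime.strptime("2021-01-01 00:56", fmt),  # put-down event 1
                    datetime.strptime("2021-01-01 01:26", fmt)]  # put-down event 2 (assumed)
        t_sleep = datetime.strptime("2021-01-01 01:35", fmt)     # sleep-onset time (assumed)

        # T active: total phone-use time within the T start..T sleep span
        t_active = sum((down - up for up, down in zip(pickups, putdowns)), timedelta())

        sleep_latency = (t_sleep - t_start) - t_active
        print(sleep_latency)  # 0:27:00 with the assumed values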
  • Step 112 the smart wearable device 200 performs a sleep reminder based on the sleep latency.
  • the smart wearable device 200 may compare the sleep latency with a preset sleep latency threshold.
  • if the calculated sleep latency is greater than the preset sleep latency threshold, it can be considered that the user has a sleep disorder, and relevant prompts can be displayed on the display screen of the smart wearable device 200 to remind the user that there is a sleep disorder and that existing work and rest habits can be changed to improve sleep quality.
  • the smart wearable device 200 can also send an indication message to the mobile terminal 100, so that the mobile terminal 100 can wake up the corresponding application (for example, a health APP) after receiving the indication message, so that the application can push relevant health knowledge to the user, such as work and rest habits, diet and sleeping position.
  • the smart wearable device 200 can also adjust the user's living environment to create a good sleeping environment for the user: at the sleep moment, the light can be adjusted to a preset brightness (for example, by adjusting the brightness of the smart lamp), or the volume of the current sound source can be lowered or turned off (for example, by lowering the volume of the smart speaker or turning off the smart speaker), thereby creating a good sleeping atmosphere and helping the user to fall asleep as soon as possible, improving the user's sleep quality.
  • in this embodiment, the user's action events are monitored by the mobile terminal, the moments of the user's actions are recorded, the duration of the user's action events is determined from those moments, and the sleep latency is estimated based on the action event duration, which can improve the accuracy of the sleep latency calculation.
  • FIG. 8 is a schematic flowchart of another embodiment of the sleep assessment method provided by the embodiment of the present application, including:
  • Step 201 in response to the user's operation, the mobile terminal 100 turns off the smart home device 300.
  • the user may perform a light-off operation on the display interface of the mobile terminal 100 to turn off the smart home device 300 (eg, smart lamps).
  • the mobile terminal 100 may turn off the smart home device 300 .
  • in this way, the lights in the user's current environment can be turned off, which is beneficial for the user to go to sleep as soon as possible.
  • step 202 the mobile terminal 100 turns on the camera 193 and acquires an image, so as to recognize the posture of the user.
  • after the mobile terminal 100 turns off the smart home device 300, the camera 193 can be turned on to obtain an image of the user, thereby identifying the user's posture, for example, whether the user is in a lying posture, so that it can be determined whether the user is ready to sleep.
  • the mobile terminal 100 may also continuously acquire images of the user. For example, the mobile terminal 100 may acquire the user's posture within a preset time period (eg, 5 minutes) through the camera 193 . If the mobile terminal 100 determines that the user has been in a still-lying position within the preset time period, the mobile terminal 100 may record the start time of the user lying down, that is, the sleep start time T start .
  • the sleep start time T start may be the time when the mobile terminal 100 recognizes the user's posture.
  • Step 203 the mobile terminal 100 monitors the user's action event.
  • the action event may include a pick-up event and a put-down event.
  • if the user picks up the mobile terminal 100, the event is a pick-up event; if the user puts down the mobile terminal 100, the event is a put-down event.
  • the pick-up event can be used to identify that the user is active, that is, the user has no sleep intention temporarily; the put-down event can be used to identify that the user is about to fall asleep, that is, the user has a sleep intention.
  • the above-mentioned action events may be identified by the acceleration sensor 180E in the mobile terminal 100 .
  • the above-mentioned action event may also correspond to a screen unlocking event and a screen locking event of the mobile terminal 100 .
  • the unlocking event may correspond to a pick-up event, and the unlocking event may be triggered by a user's unlocking operation.
  • the user may perform an unlocking operation on the screen of the mobile terminal 100, for example, by entering a password or fingerprint to unlock.
  • the mobile terminal 100 triggers an unlocking event to unlock the screen, so that it can be determined that the user has picked up the mobile terminal 100 .
  • the screen lock event may correspond to a drop event, and the screen lock event may be triggered by a user's screen lock operation.
  • the user may also perform a screen lock operation on the screen of the mobile terminal 100, for example, press a screen lock key.
  • the mobile terminal 100 triggers a screen-locking event to lock the screen, whereby it can be determined that the user has put down the mobile terminal 100 .
  • the above-mentioned action event may also be identified by the smart wearable device 201 .
  • the smart wearable 201 may be smart glasses.
  • the light of the mobile terminal 100 can be detected by the smart glasses, and the action event can be made to correspond to the light. If the smart glasses detect that the light of the mobile terminal 100 is present, it can be determined that the current event is a pick-up event, that is, the user is using the mobile terminal 100; if the smart glasses detect that the light of the mobile terminal 100 disappears, it can be determined that the current event is a put-down event, that is, the user has put down the mobile terminal 100. After acquiring a pick-up event or a put-down event, the smart wearable device 201 can send the pick-up event information or the put-down event information to the mobile terminal 100.
  • Step 204 the mobile terminal 100 records the time corresponding to the action event, and generates a time series of the action event.
  • the user may pick up and put down the mobile terminal 100 multiple times, that is, there are multiple pick-up events and put-down events. Therefore, after the mobile terminal 100 monitors a pick-up event or a put-down event, it can record the time corresponding to that event, thereby obtaining two action event time series, namely the pick-up event time series and the put-down event time series.
  • the above pick-up event time sequence includes all the moments corresponding to the pick-up event.
  • the above-mentioned drop event time series includes all the moments corresponding to the drop events.
  • step 205 the mobile terminal 100 monitors the interruption event and determines the interruption duration.
  • the interruption event may include activities such as the user getting up, drinking water, and turning on the light.
  • getting up can be identified through images obtained by the camera of the mobile terminal 100, and turning on the light can be identified through the user's light-on operation on the display interface of the mobile terminal 100, or through the ambient light sensor in the smart wearable device 200.
  • the interruption time of the first interruption event is T pause1
  • the interruption recovery time of the first interruption event is T resume1
  • the interruption time of the second interruption event is T pause2
  • the interruption recovery time of the second interruption event is T resume2
  • the interruption duration T interrupt = (T resume1 - T pause1) + (T resume2 - T pause2).
  • the mobile terminal 100 can also compare the interruption duration with a preset duration threshold; if the interruption duration is greater than or equal to the preset duration threshold, it can be considered that the user's sleep has been interrupted, and the sleep start time T start needs to be re-determined.
  • the sleep start time T start may be the last interrupt recovery time T resume .
  • the pick-up event time series and the drop event time series recorded in step 204 can be cleared, and steps 203 to 205 are further performed to obtain the pick-up event time series and the drop event time series again.
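  • A minimal sketch of the bookkeeping in step 205; the preset duration threshold value is a hypothetical placeholder, and clearing the accumulated interruption total on reset is an assumption, since the text only states that the event series are cleared and steps 203 to 205 are re-executed:

        from datetime import datetime, timedelta
        from typing import List, Tuple

        PRESET_DURATION_THRESHOLD = timedelta(minutes=30)  # hypothetical value

        def total_interruption(pauses: List[datetime],
                               resumes: List[datetime]) -> timedelta:
            """T interrupt = (T resume1 - T pause1) + (T resume2 - T pause2) + ..."""
            return sum((r - p for p, r in zip(pauses, resumes)), timedelta())

        def maybe_reset_sleep_start(t_start: datetime, pauses: List[datetime],
                                    resumes: List[datetime]) -> Tuple[datetime, timedelta]:
            t_interrupt = total_interruption(pauses, resumes)
            if resumes and t_interrupt >= PRESET_DURATION_THRESHOLD:
                # Sleep is considered interrupted: T start becomes the last
                # interruption recovery time, and the event series are cleared.
                return resumes[-1], timedelta()
            return t_start, t_interrupt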
  • Step 206 the mobile terminal 100 sends the above-mentioned pick-up event time sequence, drop event time sequence, and interruption duration T interrupt to the smart wearable device 200 .
  • Step 207 the smart wearable device 200 monitors the physical movement status of the user, and determines the time to fall asleep T sleep .
  • the smart wearable device 200 can monitor the user's body movement status through the body movement sensor 220 .
  • the body movement status may reflect the user's body momentum.
  • the smart wearable device 200 can judge the user's body momentum through the body motion sensor 220. If the smart wearable device 200 judges that the user's body momentum is less than or equal to the preset body momentum threshold, the moment can be recorded and taken as the sleep-onset time T sleep.
  • Step 208 the smart wearable device 200 determines the sleep latency based on the sleep start time T start , the sleep time T sleep , the action event time sequence and the interruption duration T interrupt .
  • the smart wearable device 200 can further determine whether there are moments corresponding to action events and whether there is an interruption duration.
  • if the smart wearable device 200 receives the action event time sequence and the interruption duration sent by the mobile terminal 100, it can determine that there are moments corresponding to action events and that there is an interruption duration, and thus the sleep latency can be determined based on the sleep start time T start, the sleep-onset time T sleep, the action event time sequence, and the interruption duration T interrupt.
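  • On a natural reading of step 208 (not an explicit formula from the text), the sleep latency excludes both the user activity duration and the interruption duration:

        from datetime import datetime, timedelta

        def sleep_latency(t_start: datetime, t_sleep: datetime,
                          t_active: timedelta,
                          t_interrupt: timedelta = timedelta()) -> timedelta:
            # Phone-use time and interruption time are excluded from the span.
            return (t_sleep - t_start) - t_active - t_interrupt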
  • Step 209 the smart wearable device 200 performs a sleep reminder based on the sleep latency.
  • the smart wearable device 200 may compare the sleep latency with a preset sleep latency threshold.
  • if the calculated sleep latency is greater than the preset sleep latency threshold, it can be considered that the user has a sleep disorder, and relevant prompts can be displayed on the display screen of the smart wearable device 200 to remind the user that there is a sleep disorder and that existing work and rest habits can be changed to improve sleep quality.
  • the smart wearable device 200 can also send an indication message to the mobile terminal 100, so that the mobile terminal 100 can wake up the corresponding application (for example, a health APP) after receiving the indication message, so that the application can push relevant health knowledge to the user, such as work and rest habits, diet and sleeping position.
  • the smart wearable device 200 can also adjust the user's living environment to create a good sleeping environment for the user: at the sleep moment, the light can be adjusted to a preset brightness (for example, by adjusting the brightness of the smart lamp), or the volume of the current sound source can be lowered or turned off (for example, by lowering the volume of the smart speaker or turning off the smart speaker), thereby creating a good sleeping atmosphere and helping the user to fall asleep as soon as possible, improving the user's sleep quality.
  • in this embodiment, the user's action events and interruption events are monitored by the mobile terminal, thereby recording the moments of the user's actions and interruptions; the action moments determine the duration of the user's action events, and the interruption moments determine the user's interruption duration.
  • the sleep latency is estimated based on the above action event duration and the interruption duration, which can improve the accuracy of the sleep latency calculation.
  • FIG. 9 is a schematic structural diagram of an embodiment of a sleep evaluation apparatus of the present application. As shown in FIG. 9, the above-mentioned sleep evaluation apparatus 900 is applied to a wearable device, and the wearable device establishes a communication connection with a mobile terminal; the apparatus may include: a first obtaining module 910, a sending module 920, a second obtaining module 930, and a computing module 940;
  • a first obtaining module 910 configured to obtain first information, and determine the user's sleep start time based on the first information
  • a sending module 920 configured to send a first instruction to the mobile terminal, where the first instruction is used to instruct the mobile terminal to monitor the action event of the user;
  • the second obtaining module 930 is configured to obtain second information, and determine the user's sleep time based on the second information
  • the computing module 940 is configured to determine the sleep latency based on the sleep start time and the sleep onset time.
  • the first information includes the user's heart rate and the environmental state
  • the first obtaining module 910 includes: a first monitoring unit 911 , a first judging unit 912 , and a first obtaining unit 913 ;
  • the first monitoring unit 911 is used to continuously monitor the user's heart rate and environmental status
  • the first judgment unit 912 is used for judging whether the user's heart rate and the environmental state meet the preset conditions
  • the first obtaining unit 913 is configured to determine the time as the sleep start time when the user's heart rate and the environmental state at any time meet the preset conditions.
  • the second information includes the user's body movement status and the change of the user's heart rate
  • the second obtaining module 930 includes: a second monitoring unit 931 , a second judging unit 932 , and a second obtaining unit 933 ;
  • the second monitoring unit 931 is used to continuously monitor the user's body movement status and the change of the user's heart rate
  • a second judgment unit 932 configured to judge whether the user's body movement condition satisfies a preset condition
  • the second obtaining unit 933 is configured to: determine that a moment is the potential sleep moment when the user's body movement condition satisfies the preset condition at that moment; obtain a heart rate drop time series based on the change of the user's heart rate, where the heart rate drop time series includes one or more heart rate drop times; and compare the potential sleep moment with the one or more heart rate drop times in the heart rate drop time series, determining the sleep-onset time according to the comparison result.
  • the above-mentioned apparatus 900 further includes: a receiving module 950;
  • the receiving module 950 is configured to receive user action information sent by the mobile terminal, the user action information includes an action event time sequence, the action event time sequence includes a plurality of action event moments, and each action event moment corresponds to a user's action event one-to-one.
  • the above-mentioned calculation module 940 is further configured to determine the sleep latency based on the sleep start time, the sleep-onset time, and the user activity duration, wherein the user activity duration is determined by a plurality of action event moments in the action event time sequence .
  • the user activity information further includes the interruption duration
  • the above-mentioned calculation module 940 is further configured to determine the sleep latency based on the sleep start time, the sleep-onset time, and the user activity duration, wherein the user activity duration is determined by the multiple action event moments in the action event time sequence and the interruption duration.
  • the above-mentioned apparatus 900 further includes: a prompting module 960;
  • the prompting module 960 is configured to perform sleep prompting based on the sleep latency.
  • FIG. 10 is a schematic structural diagram of another embodiment of the sleep evaluation apparatus of the present application.
  • the above-mentioned sleep evaluation apparatus 1000 is applied to a mobile terminal, and the mobile terminal establishes a communication connection with a wearable device, and may include: a receiving module 1010, a first recording module 1020 and a first sending module 1030;
  • a receiving module 1010 configured to receive a first instruction sent by the wearable device, where the first instruction is used to instruct the mobile terminal to monitor the user's action event;
  • the first recording module 1020 is used to monitor the action events of the user, record the moments corresponding to the action events, and generate an action event time sequence, wherein the action event time sequence includes a plurality of action event moments;
  • the first sending module 1030 is configured to send the action event time sequence to the wearable device.
  • the above-mentioned apparatus 1000 further includes: a second recording module 1040 and a second sending module 1050;
  • the second recording module 1040 is used to monitor the activity events of the user, record the times corresponding to the activity events, and determine the interruption duration based on the times corresponding to the activity events;
  • the second sending module 1050 is configured to send the interruption duration to the wearable device.
  • the mobile terminal establishes a communication connection with the intelligent lighting device, and the above-mentioned apparatus 1000 further includes: a detection module 1060, an identification module 1070, and a startup module 1080;
  • a detection module 1060 configured to detect the operation of the user to turn off the smart lighting device
  • a recognition module 1070 configured to recognize the gesture of the user in response to the detected operation
  • the starting module 1080 is configured to start monitoring the action event of the user based on the identification result.
  • each module of the sleep evaluation apparatus shown in FIG. 9 and FIG. 10 is only a division of logical functions, and may be fully or partially integrated into a physical entity in actual implementation, or may be physically separated.
  • these modules can all be implemented in the form of software calling through processing elements; they can also all be implemented in hardware; some modules can also be implemented in the form of software calling through processing elements, and some modules can be implemented in hardware.
  • the detection module may be a separately established processing element, or may be integrated in a certain chip of the electronic device.
  • the implementation of other modules is similar.
  • all or part of these modules can be integrated together, and can also be implemented independently.
  • each step of the above-mentioned method or each of the above-mentioned modules can be completed by an integrated logic circuit of hardware in the processor element or an instruction in the form of software.
  • the above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASIC), or one or more digital signal processors (DSP), or one or more field-programmable gate arrays (FPGA), etc.
  • these modules can be integrated together and implemented in the form of a system-on-a-chip (System-On-a-Chip; hereinafter referred to as: SOC).
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation on the mobile terminal 100 and the wearable device 200 .
  • the mobile terminal 100 and the wearable device 200 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the above-mentioned mobile terminal 100 and the wearable device 200 etc. include corresponding hardware structures and/or software modules for executing each function.
  • the embodiments of the present application can be implemented in hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Experts may use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of the embodiments of the present application.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that, the division of modules in the embodiments of the present application is schematic, and is only a logical function division, and there may be other division manners in actual implementation.
  • Each functional unit in each of the embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
  • a computer-readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: flash memory, removable hard disk, read-only memory, random access memory, magnetic disk or optical disk and other media that can store program codes.

Abstract

The present invention relates to the technical field of communications, and provides a sleep assessment method, an electronic device, and a storage medium. The method comprises: acquiring first information, and determining a sleep start time point of a user on the basis of the first information; sending a first instruction to a mobile terminal, the first instruction being used to instruct the mobile terminal to monitor action events of the user; acquiring second information, and determining a sleep-onset time point of the user on the basis of the second information; and determining a sleep latency on the basis of the sleep start time point and the sleep-onset time point. The sleep assessment method can improve the accuracy of measuring sleep latency.
PCT/CN2021/131469 2020-11-23 2021-11-18 Procédé d'évaluation de sommeil, dispositif électronique et support de stockage WO2022105830A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011322218.0A CN114521878A (zh) 2020-11-23 2020-11-23 睡眠评估方法、电子设备及存储介质
CN202011322218.0 2020-11-23

Publications (1)

Publication Number Publication Date
WO2022105830A1 true WO2022105830A1 (fr) 2022-05-27

Family

ID=81618747

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/131469 WO2022105830A1 (fr) 2020-11-23 2021-11-18 Procédé d'évaluation de sommeil, dispositif électronique et support de stockage

Country Status (2)

Country Link
CN (1) CN114521878A (fr)
WO (1) WO2022105830A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024087952A1 (fr) * 2022-10-26 2024-05-02 华为技术有限公司 Procédé, système et dispositif électronique de réglage de la température

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117481614A (zh) * 2023-12-26 2024-02-02 荣耀终端有限公司 睡眠状态检测的方法及相关设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060019224A1 (en) * 2004-07-23 2006-01-26 Pics, Inc. Insomnia assessment and treatment device and method
WO2008096307A1 (fr) * 2007-02-07 2008-08-14 Philips Intellectual Property & Standards Gmbh Gestion du sommeil
US20170094046A1 (en) * 2015-09-30 2017-03-30 Apple Inc. Adjusting alarms based on sleep onset latency
US20180344515A1 (en) * 2017-06-05 2018-12-06 Parker-Hannifin Corporation Thermoregulatory glove and method for producing a convergence in body temperature
CN109328034A (zh) * 2016-06-27 2019-02-12 皇家飞利浦有限公司 用于确定对象的睡眠阶段的确定系统和方法
CN110584666A (zh) * 2019-09-24 2019-12-20 喜临门家具股份有限公司 一种入睡潜伏期细化分类系统

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016187429A (ja) * 2015-03-30 2016-11-04 パイオニア株式会社 就寝案内装置
US10561362B2 (en) * 2016-09-16 2020-02-18 Bose Corporation Sleep assessment using a home sleep system
US10939866B2 (en) * 2017-12-12 2021-03-09 Koninklijke Philips N.V. System and method for determining sleep onset latency

Also Published As

Publication number Publication date
CN114521878A (zh) 2022-05-24

Similar Documents

Publication Publication Date Title
JP7426470B2 (ja) 音声起動方法及び電子デバイス
CN110989852B (zh) 一种触摸屏、电子设备、显示控制方法
WO2021213151A1 (fr) Procédé de commande d'affichage et dispositif portable
WO2022105830A1 (fr) Procédé d'évaluation de sommeil, dispositif électronique et support de stockage
WO2020019355A1 (fr) Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire
CN113691271B (zh) 数据传输方法及可穿戴设备
WO2022089000A1 (fr) Procédé de vérification de système de fichiers, dispositif électronique et support de stockage lisible par ordinateur
WO2022156555A1 (fr) Procédé de réglage de luminosité d'écran, appareil et dispositif terminal
WO2022151887A1 (fr) Procédé de surveillance du sommeil et appareil associé
WO2023005706A1 (fr) Procédé de commande de dispositif, dispositif électronique et support de stockage
WO2022135144A1 (fr) Procédé d'affichage auto-adaptatif, dispositif électronique et support de stockage
CN115022807B (zh) 快递信息提醒方法和电子设备
WO2021204036A1 (fr) Procédé de surveillance du risque de sommeil, dispositif électronique et support de stockage
CN113918003A (zh) 检测皮肤接触屏幕时长的方法、装置及电子设备
CN114554012A (zh) 来电接听方法、电子设备及存储介质
CN113391735A (zh) 显示形态的调整方法、装置、电子设备及存储介质
CN113838478A (zh) 异常事件检测方法、装置和电子设备
CN114125144B (zh) 一种防误触的方法、终端及存储介质
WO2024093748A1 (fr) Procédé de collecte de signal, dispositif électronique et support de stockage
CN113364067B (zh) 充电精度校准方法及电子设备
WO2024055881A1 (fr) Procédé de synchronisation d'horloge, dispositif électronique, système, et support de stockage
CN113509145B (zh) 睡眠风险监测方法、电子设备及存储介质
CN114115513B (zh) 一种按键控制方法和一种按键装置
WO2023237087A1 (fr) Procédé de prévision de fenêtre de fertilité, appareil et dispositif électronique
CN113380374B (zh) 基于运动状态感知的辅助运动方法、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21893977

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21893977

Country of ref document: EP

Kind code of ref document: A1