CN110742580A - Sleep state identification method and device


Info

Publication number
CN110742580A
Authority
CN
China
Prior art keywords
sleep state
user
screen
electronic device
time information
Prior art date
Legal status
Pending
Application number
CN201910882108.0A
Other languages
Chinese (zh)
Inventor
朱小陆
赵京
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201910882108.0A
Publication of CN110742580A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4806 Sleep evaluation
    • A61B5/4809 Sleep detection, i.e. determining whether a subject is asleep or not
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device


Abstract

The present application provides a sleep state identification method and device. An electronic device can determine a user's personalized sleep time period according to the user's personalized sleep state recognition model and, within that period, judge whether the user is in the sleep state according to collected data used for identifying the sleep state, so that the recognition result better matches the user's sleep habits.

Description

Sleep state identification method and device
Technical Field
The present application relates to the field of intelligent terminal technologies, and in particular, to a sleep state identification method and device.
Background
When an intelligent terminal device (hereinafter referred to as an electronic device) is in use, the state of the user affects how the electronic device is managed. If the user is in the sleep state, the electronic device can apply strong management and control measures to optimize its power consumption and extend its standby time; when the user is not in the sleep state, the electronic device can apply weak management and control measures so that it runs efficiently and the user experience is improved.
Current electronic devices determine whether the user is in the sleep state only within a preset time period, such as 23:00 at night to 6:00 in the early morning, and make no such determination outside that period. In other words, a personalized sleep state determination scheme cannot currently be configured for a user, so the electronic device is not flexible in determining whether the user is in the sleep state and cannot adapt to the sleep habits of different users.
Disclosure of Invention
The present application provides a sleep state identification method and device, which are used for flexibly determining whether a user is in the sleep state according to the user's own sleep habits.
In a first aspect, the present application provides a sleep state identification method. The method may be implemented by an electronic device, such as a mobile phone or a tablet computer, or by a chip in the electronic device. According to the method, the electronic device determines the user's personalized sleep time period according to the user's personalized sleep state recognition model. The personalized sleep state recognition model is obtained by training an initial sleep state recognition model with the screen-off time information of this electronic device and the sleep state information corresponding to that screen-off time information; the initial sleep state recognition model is in turn obtained by training a neural network model with the screen-off time information of a plurality of electronic devices and the corresponding sleep state information, where the sleep state information indicates whether a user is in the sleep state during the time corresponding to the screen-off time information. Within the user's personalized sleep time period, the electronic device judges whether the user is actually in the sleep state according to collected data used for identifying the sleep state.
With this method, the electronic device can determine the user's sleep time period from the user's personalized sleep state recognition model and thus flexibly judge, within that personalized period, whether the user is in the sleep state, so that the recognition result better matches the user's sleep habits.
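As a rough illustration of the first aspect, the sketch below derives a personalized sleep time period from per-hour sleep probabilities that a trained recognition model might output. The per-hour probability representation, the 0.5 threshold, and the single contiguous period (no midnight wrap-around) are all assumptions for illustration, not details fixed by this application.

```python
def personalized_sleep_period(sleep_prob_by_hour, threshold=0.5):
    """Derive a user's sleep time period from a recognition model's
    per-hour sleep probabilities (a list of 24 values in [0, 1]).
    Simplification: assumes one contiguous period that does not
    wrap around midnight."""
    asleep = [h for h, p in enumerate(sleep_prob_by_hour) if p >= threshold]
    if not asleep:
        return None  # the model never predicts sleep for this user
    return (min(asleep), max(asleep))  # (start hour, end hour), inclusive
```

A user whose model assigns high sleep probability only to hours 0 through 5 would get the period (0, 5), and the device would restrict its sleep state checks to those hours.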
The electronic device may receive the initial sleep state recognition model from a server. Alternatively, the initial sleep state identification model may be pre-configured in the electronic device.
The electronic device may also send its screen-off time information to the server, so that the server can train the initial sleep state recognition model according to the screen-off time information sent by the electronic device.
The electronic device may also determine the sleep state information corresponding to each piece of screen-off time information according to the screen-off duration corresponding to that piece of screen-off time information.
The electronic device may also iteratively train the initial sleep state recognition model according to the user's screen-off time information and the corresponding sleep state information; when an iteration termination condition is met, the electronic device terminates the iterative training and obtains the user's personalized sleep state recognition model.
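The iterative training with a termination condition might be sketched as follows. The loss-plateau tolerance and the maximum iteration count are assumed termination conditions, since this application does not fix particular values, and `train_step` stands in for one pass of whatever training procedure is used.

```python
def train_until_converged(train_step, max_iters=100, tol=1e-4):
    """Run `train_step` (one training pass returning the current loss)
    repeatedly, terminating when the loss improvement falls below `tol`
    or `max_iters` is reached. Returns the number of iterations run."""
    prev_loss = float("inf")
    iters = 0
    for iters in range(1, max_iters + 1):
        loss = train_step()
        if prev_loss - loss < tol:  # loss has stopped improving
            break
        prev_loss = loss
    return iters
```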
The data for identifying whether the user is in the sleep state may include at least one of the following: the duration for which the electronic device has been in the screen-off state; ambient light information; ambient sound information; or location information.
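One hedged way to combine these data sources is a veto rule: any strong "awake" signal overrides the sleep hypothesis. All thresholds below are illustrative assumptions, not values from this application.

```python
def likely_asleep(screen_off_min, lux=None, sound_db=None, moved=None,
                  min_off=30, max_lux=10, max_db=40):
    """Fuse the listed signals into a single sleep judgment.
    Optional signals (None) are simply skipped; thresholds are
    illustrative assumptions."""
    if screen_off_min < min_off:          # screen used too recently
        return False
    if lux is not None and lux >= max_lux:        # room is lit
        return False
    if sound_db is not None and sound_db >= max_db:  # environment is loud
        return False
    if moved:                              # device was displaced
        return False
    return True
```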
In a second aspect, the present application provides a sleep state identification method. The method may be implemented by a server (for example, a network server, a cloud server, or another device or apparatus that can provide computing services) or by a chip in the server. According to the method, the server determines, from the screen-off time information of a plurality of first electronic devices, the sleep state information corresponding to that screen-off time information, where the sleep state information indicates whether a user is in the sleep state during the corresponding time. The server then trains a neural network model with the sleep state information corresponding to the screen-off time information to obtain an initial sleep state recognition model, which is used to determine a common sleep time period of users.
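On the server side, the screen-off logs of many first electronic devices could be turned into labeled training samples roughly as in this sketch; the (start hour, duration) log format and the hour-of-day feature encoding are assumptions, since the application does not specify them.

```python
def build_training_set(device_logs, min_sleep_hours=4):
    """Turn many devices' screen-off logs into (hour_of_day, label)
    pairs for training an initial, population-level model.
    Each log entry is an assumed (start hour, off-duration in hours)
    tuple; the 4-hour labeling threshold is also an assumption."""
    samples = []
    for log in device_logs:                  # one log per device
        for start_hour, duration_h in log:
            label = 1 if duration_h >= min_sleep_hours else 0
            samples.append((start_hour % 24, label))
    return samples
```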
The server may also receive the screen-off time information from the plurality of first electronic devices.
The server may also send the initial sleep state recognition model to one or more second electronic devices.
In a third aspect, the present application provides an electronic device. The electronic device may include one or more processors, a memory, and one or more computer programs, where the one or more computer programs are stored in the memory and, when executed by the processors, implement the method according to the first aspect and any possible design of the first aspect. For example, functional modules corresponding to the functions, steps, or operations in the above method may be provided in the electronic device to support the electronic device in executing the method. The electronic device may be a mobile phone, a tablet computer, or the like.
In a fourth aspect, the present application provides a server. The server may include one or more processors, a memory, and one or more computer programs, where the one or more computer programs are stored in the memory and, when executed by the processors, implement the method according to the second aspect and any possible design of the second aspect. For example, functional modules corresponding to the functions, steps, or operations in the above method may be provided in the server to support the server in executing the method. The server may be a network server, a cloud server, or the like.
In a fifth aspect, an embodiment of the present application provides a system. The system may include an electronic device as described in any possible design of the third aspect and a server as described in any possible design of the fourth aspect.
In a sixth aspect, an embodiment of the present application provides a chip. The chip is coupled to a memory in an electronic device, so that when running, the chip invokes a computer program stored in the memory to implement the method according to the first aspect and any possible design of the first aspect.
In a seventh aspect, an embodiment of the present application provides a chip. The chip is coupled to a memory in an electronic device, so that when running, the chip invokes a computer program stored in the memory to implement the method according to the second aspect and any possible design of the second aspect.
In an eighth aspect, a computer-readable storage medium of the embodiments of the present application stores a computer program, which, when run on an electronic device, causes the electronic device to perform the method according to the first aspect of the embodiments of the present application and any one of the possible designs related to the first aspect.
In a ninth aspect, a computer storage medium of the embodiments of the present application stores a computer program, which, when run on an electronic device, causes the electronic device to execute the method according to the second aspect of the embodiments of the present application and any one of the possible designs related to the second aspect.
In a tenth aspect, a computer program product according to this embodiment of the present application, when running on an electronic device, causes the electronic device to perform a method that implements the first aspect of this embodiment and any possible design related to the first aspect.
In an eleventh aspect, a computer program product according to the embodiments of the present application, when run on an electronic device, causes the electronic device to perform a method for implementing any one of the above-mentioned second aspect and possible designs related to the second aspect of the embodiments of the present application.
In addition, for the technical effects brought by any possible design in the second to eleventh aspects, reference may be made to the technical effects of the corresponding designs in the method parts above, and details are not repeated here.
Drawings
Fig. 1 is a schematic architecture diagram of an electronic device provided in the present application;
Fig. 2 is a block diagram of another electronic device provided in the present application;
Fig. 3 is a schematic architecture diagram of a server provided in the present application;
Fig. 4 is a schematic flowchart of a sleep state identification method provided in the present application;
Fig. 5 is a block diagram of another electronic device provided in the present application;
Fig. 6 is a schematic architecture diagram of another server provided in the present application.
Detailed Description
In order to flexibly determine whether a user is in the sleep state, an embodiment of the present application provides a sleep state identification method. In the method, the electronic device 101 determines the user's personalized sleep time period through a neural network model and judges, within that period, whether the user is in the sleep state, thereby obtaining a determination result that conforms to the user's sleep habits better than the prior art does.
The sleep state identification method provided in the embodiments of the present application can be applied to any electronic device, which may also be called user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like, for example a handheld device or an in-vehicle device with a wireless connection function. The electronic device may be a portable electronic device running, but not limited to, the iOS, Android, Microsoft, or another operating system. The portable electronic device may also be a device such as a laptop computer with a touch-sensitive surface (e.g., a touch panel). Currently, some examples of the electronic device 101 are: a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical care, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and the like. It should be understood that in this application, a terminal may also be referred to as a smart terminal device, a terminal apparatus, an electronic device, or the like.
Fig. 1 is a schematic diagram of a hardware structure of a possible electronic device 101. The electronic device 101 may be configured to execute the sleep state identification method provided in the embodiment of the present application. It should be understood that the hardware structure of the electronic device 101 as shown in fig. 1 is only one example. Also, the electronic device 101 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in fig. 1, the electronic device 101 includes a processor 110, an internal memory 121, an external memory interface 122, an antenna 1, a mobile communication module 131, an antenna 2, a wireless communication module 132, an audio module 140, a speaker 140A, a receiver 140B, a microphone 140C, an earphone interface 140D, a display screen 151, a Subscriber Identity Module (SIM) card interface 152, a camera 153, keys 154, a sensor module 160, a Universal Serial Bus (USB) interface 170, a charge management module 180, a power management module 181, and a battery 182. In other embodiments, the electronic device 101 may also include a motor, an indicator, and the like.
Processor 110 may include one or more processing units, among others. For example: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processor (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
In some embodiments, a memory may also be provided in the processor 110 for storing instructions and data. For example, the memory in the processor 110 may be a cache. The cache may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs those instructions or data again, it can call them directly from the cache, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the instructions stored in the internal memory 121 to perform the various functional applications and data processing of the electronic device 101. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the electronic device 101 (such as audio data and a phone book), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The external memory interface 122 may be used to connect an external memory card (e.g., a Micro SD card) to extend the storage capability of the electronic device 101. The external memory card communicates with the processor 110 through the external memory interface 122 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 101 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 131 may provide a solution for wireless communication including 2G/3G/4G/5G applied on the electronic device 101. The mobile communication module 131 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 131 can receive electromagnetic wave signals through the antenna 1, perform processing such as filtering and amplification on the received signals, and transmit them to the modem processor for demodulation. The mobile communication module 131 can also amplify signals modulated by the modem processor and convert them into electromagnetic waves through the antenna 1 for radiation. In some embodiments, at least some of the functional modules of the mobile communication module 131 may be provided in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 131 may be disposed in the same device as at least some of the modules of the processor 110. For example, the mobile communication module 131 may transmit data to another electronic device, or receive data transmitted by another electronic device.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 140A, the receiver 140B, etc.) or displays an image or video through the display screen 151. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 131 or other functional modules, independent of the processor 110.
The wireless communication module 132 may provide a solution for wireless communication applied to the electronic device 101, including wireless local area networks (WLANs) such as Wi-Fi networks, Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 132 may be one or more devices integrating at least one communication processing module. The wireless communication module 132 receives electromagnetic wave signals through the antenna 2, performs frequency modulation and filtering on them, and transmits the processed signals to the processor 110. The wireless communication module 132 may also receive a signal to be transmitted from the processor 110, frequency modulate and amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation. For example, the wireless communication module 132 may transmit the screen-off time information collected by the electronic device 101 to a server, and may receive the initial sleep state recognition model transmitted by the server.
The above GNSS may include a global positioning system (GPS), an assisted global positioning system (AGPS), a global navigation satellite system (GLONASS), a BeiDou satellite navigation system (BDS), a Galileo positioning system, a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS). For example, the wireless communication module 132 may perform a GPS positioning function, an AGPS positioning function, a GLONASS positioning function, a BDS positioning function, a QZSS positioning function, and/or an SBAS positioning function to obtain the location information of the electronic device 101.
In some embodiments, antenna 1 of electronic device 101 is coupled to mobile communication module 131 and antenna 2 is coupled to wireless communication module 132 so that electronic device 101 can communicate with networks and other devices through wireless communication techniques. The wireless communication technologies may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others.
The electronic device 101 may implement audio functions through the audio module 140, the speaker 140A, the receiver 140B, the microphone 140C, the headphone interface 140D, the application processor, and the like. Such as music playing, recording, etc.
The audio module 140 may be used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 140 may also be used to encode and decode audio signals. In some embodiments, the audio module 140 may be disposed in the processor 110, or some functional modules of the audio module 140 may be disposed in the processor 110.
The speaker 140A, also called a "horn", is used to convert audio electrical signals into sound signals. The electronic device 101 can listen to music or a hands-free call through the speaker 140A.
The receiver 140B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 101 answers a call or voice information, the voice can be answered by placing the receiver 140B close to the ear of the person.
The microphone 140C, also called a "mike", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 140C, which collects the user's voice and converts it into an electrical signal. The electronic device 101 may be provided with at least one microphone 140C. In other embodiments, the electronic device 101 may be provided with two microphones 140C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 101 may further include three, four, or more microphones 140C to achieve sound signal collection, noise reduction, sound source identification, directional recording, and the like. In implementation, the microphone 140C may collect the ambient sound information of the environment in which the electronic device 101 is currently located, and if the collected sound intensity reaches the sound intensity threshold, the user is not determined to be in the sleep state.
The headphone interface 140D is used to connect wired headphones. The headphone interface 140D may be the USB interface 170, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like. In implementation, an audio collecting device such as an earphone connected to the headphone interface 140D may collect the ambient sound information of the current environment of the electronic device 101, and if the collected sound intensity reaches the sound intensity threshold, the user is not determined to be in the sleep state.
The electronic device 101 may implement display functions via the GPU, the display screen 151, and the application processor, among others. The GPU is a microprocessor for image processing, and is connected to the display screen 151 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 151 may be used to display images, videos, and the like. The display screen 151 may include a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 101 may include 1 or N display screens 151, where N is a positive integer greater than 1. In this application, the processor 110 or the display screen 151 may collect the screen-off time information of the display screen 151, and the internal memory 121 and/or an external memory connected through the external memory interface 122 may store the screen-off time information. The screen-on/off duration data can be used to determine whether the user is in the sleep state.
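Given raw screen on/off events, the screen-off durations mentioned above could be computed as in the sketch below; the ('off'/'on', timestamp) event format is an assumption for illustration.

```python
def screen_off_durations(events):
    """From an ordered list of ('off' | 'on', timestamp-in-seconds)
    screen events, compute the duration of each screen-off interval."""
    durations, off_at = [], None
    for state, t in events:
        if state == "off":
            off_at = t                      # screen just turned off
        elif state == "on" and off_at is not None:
            durations.append(t - off_at)    # close the off interval
            off_at = None
    return durations
```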
The electronic device 101 may implement a shooting function through the ISP, the camera 153, the video codec, the GPU, the display screen 151, and the application processor, etc.
The ISP may be used to process data fed back by the camera 153. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithmic optimization on the noise, brightness, and skin color of the image, as well as on parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 153.
The camera 153 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 101 may include 1 or N cameras 153, N being a positive integer greater than 1.
The keys 154 may include a power key, a volume key, and the like. The keys 154 may be mechanical keys or touch keys. The electronic device 101 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 101.
The sensor module 160 may include one or more sensors, for example, a touch sensor 160A, a fingerprint sensor 160B, a gyro sensor 160C, a pressure sensor 160D, an acceleration sensor 160E, and the like. In some embodiments, the sensor module 160 may also include environmental sensors, distance sensors, proximity light sensors, bone conduction sensors, and the like. In implementation, environmental information of the environment in which the electronic device 101 is located may be collected by an environmental sensor. For example, the environmental sensor may include an ambient light sensor, which may be used to collect ambient light information of the environment in which the electronic device 101 is located. If the collected illumination intensity reaches or exceeds the illumination threshold, the electronic device 101 is more likely to be located in a well-lit area, and the user is not determined to be in the sleep state. In addition, the environmental sensor may include an ambient sound sensor, which may be used to collect ambient sound information of the environment in which the electronic device 101 is located. If the collected sound intensity of the surrounding environment reaches or exceeds the sound intensity threshold, the user is not determined to be in the sleep state.
In the present application, the sensor module 160 can be used to collect data (or parameters) for identifying whether the user is in a sleep state. For example, the motion data of the electronic device 101 collected by the gyro sensor 160C and/or the acceleration sensor 160E may be used to determine whether the electronic device 101 has been displaced, so as to evaluate whether the user is in a sleep state.
The touch sensor 160A may also be referred to as a "touch panel". The touch sensor 160A may be disposed on the display screen 151, and together they form a touch screen, also called a "touchscreen". The touch sensor 160A is used to detect a touch operation applied on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 151. In other embodiments, the touch sensor 160A may be disposed on a surface of the electronic device 101 at a position different from that of the display screen 151. In implementation, the touch operation data collected by the touch sensor 160A may be used to determine whether the user performs touch operations on the touch screen; if the user performs many touch operations (for example, one or more within a specific duration), the user is not determined to be in the sleep state.
The fingerprint sensor 160B may be used to collect a fingerprint. The electronic device 101 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint-based call answering, and the like. In implementation, the fingerprint data collected by the fingerprint sensor 160B may be used to determine whether the user triggers operations of the electronic device 101 with a specific finger, such as fingerprint unlocking, application-lock access, fingerprint photographing, or answering an incoming call with a fingerprint; if such operations are triggered many times (for example, one or more times within a specific duration), the user is not determined to be in the sleep state.
The gyro sensor 160C may be used to determine the placement posture of the electronic device 101. In some embodiments, the angular velocity of electronic device 101 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 160C. In implementation, the motion data collected by the gyro sensor 160C may be used to determine whether the electronic device 101 is in a posture change, such as a change from lying down to standing upright, and the like, and if the electronic device 101 is in a posture change, the user is not determined to be in a sleep state.
The pressure sensor 160D is used to sense a pressure signal and can convert it into an electrical signal. In some embodiments, the pressure sensor 160D may be disposed on the display screen 151. Pressure sensors come in various types, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 160D, the capacitance between the electrodes changes, and the electronic device 101 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 151, the electronic device 101 detects the intensity of the touch operation through the pressure sensor 160D, and may also calculate the touched position from the detection signal of the pressure sensor 160D. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
In implementation, the pressure data collected by the pressure sensor 160D may be used to determine whether the user inputs a control instruction through a touch operation with a specific intensity, so as to control the electronic device 101 to perform a corresponding operation according to the control instruction, and if the electronic device 101 detects the control instruction for a large number of times (for example, if there are one or more times in a specific duration), it is not determined that the user is in a sleep state.
The acceleration sensor 160E can detect the magnitude of the acceleration of the electronic device 101 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the electronic device 101 is stationary. It can also be used to recognize the posture of the electronic device, in applications such as landscape/portrait switching and pedometers. In implementation, the motion data collected by the acceleration sensor 160E may be used to determine the moving state of the electronic device 101; if the electronic device 101 moves frequently, for example, its landscape/portrait posture changes multiple times or the pedometer counts multiple times, the user is not determined to be in the sleep state.
In other embodiments, the processor 110 may also include one or more interfaces. For example, the interface may be the SIM card interface 152 or the USB interface 170. The interface may also be an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, or the like. It is understood that in the embodiments of the present application, these interfaces may connect different modules of the electronic device 101 so that the electronic device 101 can implement different functions, such as photographing and image processing. The connection manner of the interfaces in the electronic device 101 is not limited in the embodiments of the present application.
The SIM card interface 152 may be used to connect a SIM card. A SIM card can be connected to or separated from the electronic device 101 by inserting it into or pulling it out of the SIM card interface 152. The electronic device 101 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 152 may support a Nano-SIM card, a Micro-SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 152 at the same time; the cards may be of the same type or different types. The SIM card interface 152 may also be compatible with different types of SIM cards, as well as with an external memory card. The electronic device 101 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 101 employs an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the electronic device 101 and cannot be separated from it.
The USB interface 170 is an interface conforming to the USB standard specification, and may be, for example, a Mini USB interface, a Micro USB interface, or a USB Type-C interface. The USB interface 170 may be used to connect a charger to charge the electronic device 101, and may also be used to transmit data between the electronic device 101 and peripheral devices. It can also be used to connect earphones and play audio through them, as well as to connect other electronic devices, such as augmented reality (AR) devices.
The charging management module 180 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 180 may receive the charging input from a wired charger via the USB interface 170. In some wireless charging embodiments, the charging management module 180 may receive a wireless charging input through a wireless charging coil of the electronic device 101. While charging the battery 182, the charging management module 180 may also supply power to the electronic device through the power management module 181.
The power management module 181 is used to connect the battery 182, the charging management module 180 and the processor 110. The power management module 181 receives an input of the battery 182 and/or the charging management module 180, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 151, the camera 153, the mobile communication module 131, the wireless communication module 132, and the like. The power management module 181 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), and the like. In some other embodiments, the power management module 181 may also be disposed in the processor 110. In other embodiments, the power management module 181 and the charging management module 180 may be disposed in the same device.
Fig. 2 is a schematic diagram of a logic architecture of an electronic device 101 according to an embodiment of the present disclosure. As shown in fig. 2, the electronic device 101 provided in the embodiment of the present application may be disposed with a user behavior collection module, a local database module, a sleep state identification module, and other components. The electronic device 101 may further include an interface, which may be the mobile communication module 131 and the wireless communication module 132 in fig. 1, and is mainly used for the electronic device 101 to perform communication.
Specifically, the user behavior acquisition module can acquire screen-off time information (or called data) of the display screen and acquire data used for identifying whether the user is in a sleep state. In this application, the screen turn-off time information of the display screen 151 may be the time when the electronic device 101 starts or ends the screen turn-off state, the duration of the screen turn-off state of the electronic device 101, and the like. Specifically, the screen-off state (or the screen-off state) may include a state in which the display screen (e.g., the display screen 151 shown in fig. 1) of the electronic device 101 turns off the screen display and a state in which the screen-off display is turned on. In this example, the user behavior collection module may be implemented by the processor 110 and/or the display screen 151 shown in fig. 1. Specifically, the screen-off time information collected by the electronic device 101 may include one or more of the sets of data shown in table 1.
Index    Start time    End time
1        23:00         7:00
2        23:05         6:30
3        9:00          11:00
……       ……            ……

TABLE 1
Illustratively, when the user behavior collection module collects the screen-off time information of the display screen 151, screen-on periods shorter than t that occur within a screen-off period may be ignored, where t is, for example, 1 minute (min). This filters out cases where the electronic device 101 briefly lights up the screen upon receiving a message, or where the user wakes up briefly and lights up the screen to check the time. For example, taking t = 2 min, if the display screen 151 enters the screen-off state at 23:00, the screen is turned on from 0:00 to 0:01 at night, and the screen is then off from 0:01 to 7:00 in the morning, the electronic device 101 may determine that the display screen 151 entered the screen-off state at 23:00, ended the screen-off state at 7:00 the next morning, and that the electronic device 101 was in the screen-off state for 8 hours.
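The consolidation rule in this example can be sketched as follows; the interval representation and function name are illustrative assumptions, not from the patent:

```python
from datetime import datetime, timedelta

def merge_screen_off(intervals, t_minutes=2):
    """intervals: chronological list of (off_start, off_end) datetime pairs.
    Adjacent screen-off intervals separated by a screen-on gap shorter than
    t_minutes are merged into one consolidated screen-off span."""
    gap = timedelta(minutes=t_minutes)
    merged = []
    for start, end in intervals:
        if merged and start - merged[-1][1] < gap:
            # the screen-on gap is too short: ignore it and extend the span
            merged[-1] = (merged[-1][0], end)
        else:
            merged.append((start, end))
    return merged
```

With the example from the text (off 23:00–0:00, on for one minute, off 0:01–7:00 and t = 2 min), the two intervals merge into a single 8-hour screen-off span.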
The data for identifying whether the user is in the sleep state may include a screen-off duration of the electronic device 101 currently in the screen-off state, ambient light information of an environment where the electronic device 101 is currently located, ambient sound information, data of touch operation, fingerprint data, pressure data, motion data, and the like, and the data may be used to determine whether the user is in the sleep state. When collecting data of whether the user is in a sleep state, the user behavior collecting module may be implemented by at least one of the microphone 140C, the earphone interface 140D, the camera 153, or the sensor module 160 shown in fig. 1.
The local database module can be used for storing data acquired by the user behavior acquisition module, or storing data acquired after processing and processing the data acquired by the user behavior acquisition module. In one possible example, the electronic device 101 may store the above data for identifying whether the user is in a sleep state in the local database module. In another example, the electronic device 101 may label data of the electronic device 101 in the screen-off state, and store the labeled data in the local database module.
Specifically, during labeling, the electronic device 101 may determine sleep state information according to the screen-off duration indicated by the screen-off time information, and store the correspondence between the screen-off time information and the sleep state information. The sleep state information may be used to indicate whether the user is in a sleep state during the time period corresponding to the screen-off time information.
The electronic device 101 may label, as sleep-state data, the screen-off time information of the display screen 151 whose screen-off duration is not less than (or exceeds) a screen-off duration threshold T, and/or label, as non-sleep-state data, the screen-off time information whose screen-off duration is less than (or not greater than) the threshold T. The screen-off duration threshold T is, for example, 7 hours (h), or another value. In other words, screen-off time information whose screen-off duration is not less than (or exceeds) T indicates that the user is in the sleep state during the corresponding time period, and screen-off time information whose screen-off duration is less than (or not greater than) T indicates that the user is in a non-sleep state during the corresponding time period.
Taking T = 7 h as an example, the electronic device 101 may mark, for the screen-off time information shown in table 1, whether the user is in the sleep state in each screen-off time period, to obtain the marked screen-off time information shown in table 2. The marked screen-off time information can reflect the sleep habits of the user, such as whether the user is accustomed to sleeping during the day or at night, and the sleep duration the user is accustomed to.
Index    Start time    End time    Sleep state information
1        23:00         7:00        Sleep state
2        23:05         6:30        Sleep state
3        9:00          11:00       Non-sleep state
……       ……            ……          ……

TABLE 2
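The labeling rule behind Table 2 can be sketched as follows, with T = 7 h taken from the example above; the record layout and label strings are assumptions:

```python
from datetime import datetime, timedelta

T = timedelta(hours=7)  # screen-off duration threshold from the example

def label_record(start: datetime, end: datetime) -> str:
    """Label a screen-off record: durations not less than T are marked as
    sleep-state data, the rest as non-sleep-state data."""
    return "sleep" if end - start >= T else "non-sleep"
```

Applied to Table 2's rows, the 23:00–7:00 record (8 h) is labeled "sleep" and the 9:00–11:00 record (2 h) "non-sleep".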
Illustratively, the sleep state recognition module may train the sleep state recognition model according to the marked screen-off time information shown in table 2, so as to obtain the personalized sleep state recognition model of the user.
The sleep state identification module can be used to determine the personalized sleep time period of the user according to the user's personalized sleep state identification model. It can then judge, within that personalized sleep time period, whether the user is actually in the sleep state according to the collected data for identifying whether the user is in a sleep state, so that the judgment flexibly follows the user's sleep habits.
The interface shown in fig. 2 may be used for the electronic device 101 to communicate with the server. For example, the interface may be configured to periodically (for example, every week, every month, or every N days, where N is a positive integer) send the screen-off time information of the user collected by the electronic device 101 to the server. The server may use this screen-off time information to train the neural network model to obtain an initial sleep state recognition model, which may be used to determine a general sleep time period of users. In addition, the interface may be further configured to enable the electronic device 101 to obtain the initial sleep state recognition model from the server, so that the electronic device 101 determines the general sleep time period according to the initial sleep state recognition model. It should be understood that the electronic device 101 itself may also train the initial sleep state recognition model according to the screen-off time information it collects, so as to obtain the personalized sleep state recognition model of the user. Specifically, the interface may include the mobile communication module 131 and/or the wireless communication module 132 shown in fig. 1.
It should be understood that the server described herein may be deployed in a centralized or distributed manner, which is not specifically limited in this application. As shown in fig. 3, the server provided in the embodiment of the present application may be deployed with a big data platform, a data analysis module, a model training module, and a communication interface. The communication interface may be a wired or wireless communication module used by the server for communication. The server may be implemented by one or more devices.
The above big data platform may be used to collect and store a large amount of screen-off time information of a large number of users, which may be received by a server from a plurality of electronic devices through a communication interface.
The data analysis module can be used to mark the screen-off time information of a large number of users collected and stored by the big data platform and to determine which screen-off time information corresponds to a sleep state. The manner in which the data analysis module marks the screen-off time information may follow the description of how the local database module shown in fig. 2 marks the screen-off time information.
The model training module can be used for iteratively training a neural network model (such as a convolutional neural network model or other neural network models) according to the screen-off time information labeled by the data analysis module to obtain an initial sleep state recognition model.
The method provided by the embodiment of the present application will be described in detail below by taking the communication system shown in fig. 1 as an example.
As shown in fig. 4, the sleep state identification method provided in the embodiment of the present application may include the following steps:
S101: the electronic device 101 determines the personalized sleep time period of the user according to the user's personalized sleep state identification model, where the personalized sleep state identification model is obtained by training an initial sleep state identification model with the collected screen-off time information of the user labeled with the sleep state.
The initial sleep state recognition model can be obtained by training screen-off time information of a large number of different users on the basis of a neural network model, and reflects the common sleep time period of the users. The initial sleep state identification model may be received by the electronic device 101 from a server, or may be pre-configured in the internal memory 121 of the electronic device 101, which is not particularly limited in this application.
Specifically, the screen turn-off time information labeled with the sleep state is shown in table 2, for example.
S102: the electronic device 101 determines whether the user is in the sleep state according to the collected data for identifying whether the user is in the sleep state during the personalized sleep time period of the user.
By adopting the method, the electronic equipment 101 can determine the sleep time period of the user according to the personalized sleep state identification model of the user, so as to flexibly judge whether the user is in the sleep state or not according to the sleep time period of the user, and the identification result of the sleep state is more in line with the sleep habit of the user.
The following describes a manner of determining the initial sleep state recognition model in the embodiment of the present application.
Taking the example that the electronic device 101 receives the initial sleep state recognition model from the server, the server may iteratively train a neural network model (such as a convolutional neural network model or another neural network model) according to a plurality of pieces of screen-off time information of a plurality of users to obtain the initial sleep state recognition model. Taking the convolutional neural network model as an example, the model may adopt a four-layer network comprising three hidden layers and an output layer; the numbers of neuron nodes of the three hidden layers are 512, 128 and 32, respectively, and each hidden layer uses the rectified linear unit (ReLU) activation function. The iteration termination condition of the model training is that the recognition accuracy reaches (or exceeds) a recognition accuracy threshold, and/or the misrecognition rate is below (or not higher than) a misrecognition rate threshold. For example, the recognition accuracy threshold is 99.5% (or another value), and the misrecognition rate threshold is 0.5%. The recognition accuracy is the proportion of samples for which, when the start time and end time of the screen-off state in the screen-off time information (or the screen-off duration between them) are input into the neural network model during iteration, the sleep state output by the model (hereinafter referred to as the prediction result) is consistent with the sleep state labeled in the screen-off time information; conversely, the misrecognition rate is the proportion of samples for which the prediction result output by the neural network model is inconsistent with the labeled sleep state.
It should be understood that the above convolutional neural network model may be replaced by other neural network models, such as long-short term memory model, stacked codec, and deep belief network, etc.
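For illustration, the layer sizes given above (three hidden layers of 512, 128, and 32 ReLU units plus an output layer) can be sketched as a plain forward pass. The weights here are random placeholders rather than a trained model, and the two-value input is an assumed encoding of the start and end times:

```python
import random

random.seed(0)

def dense(n_in, n_out):
    """Random placeholder weight matrix for a fully connected layer."""
    return [[random.uniform(-0.1, 0.1) for _ in range(n_in)] for _ in range(n_out)]

# 2 inputs (start hour, end hour) -> 512 -> 128 -> 32 -> 1 output
LAYERS = [dense(2, 512), dense(512, 128), dense(128, 32), dense(32, 1)]

def relu(v):
    return [max(0.0, x) for x in v]

def forward(x):
    """x: [start_hour, end_hour] of a screen-off record; returns a raw
    score whose thresholding would indicate the predicted sleep state."""
    for i, w in enumerate(LAYERS):
        x = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
        if i < len(LAYERS) - 1:   # ReLU on hidden layers only
            x = relu(x)
    return x[0]
```
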
Specifically, a plurality of electronic devices including the electronic device 101 may periodically upload their respective screen-off time information to the server. The electronic devices may upload the screen-off time information anonymously to protect user privacy. The server may periodically perform data analysis on the screen-off time information uploaded by one or more electronic devices, mark whether the user is in a sleep state within the screen-off time period corresponding to each piece of screen-off time information, and periodically and iteratively train the neural network model according to the marked data.
Further, the server can iteratively train the neural network model according to part or all of the marked screen-off time information as input data. In each iteration, the predicted result (i.e. the predicted sleep state) of a part (such as one or more groups) of input data can be predicted through the neural network model according to the starting time and the ending time of the screen-off state in the part of the input data, and the accuracy of prediction can be determined according to the labeling result of the part of the input data. Because each set of input data is labeled, the difference between the prediction result and the labeling result of the current neural network model can be calculated. And finally, updating the values of the parameters in the neural network model by means of a back propagation algorithm and the like based on the difference, so that the prediction result of the neural network model for the group of input data is closer to the labeling result.
When the recognition accuracy of the iterative training reaches (or exceeds) a recognition accuracy threshold, and/or the misrecognition rate is not higher than (or lower than) a misrecognition rate threshold, the iteration is considered to be terminated. At which point the server may obtain an initial sleep state recognition model. The server may send the initial sleep state recognition model obtained after the iteration terminates to an electronic device, such as electronic device 101.
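The termination condition can be sketched as follows, with thresholds taken from the 99.5% / 0.5% example above; the list-of-labels representation is an assumption:

```python
ACCURACY_THRESHOLD = 0.995        # recognition accuracy threshold (example value)
MISRECOGNITION_THRESHOLD = 0.005  # misrecognition rate threshold (example value)

def should_stop(predictions, labels):
    """Return True when iterative training may terminate: accuracy reaches
    its threshold and the misrecognition rate does not exceed its threshold."""
    correct = sum(p == l for p, l in zip(predictions, labels))
    accuracy = correct / len(labels)
    misrecognition = 1.0 - accuracy
    return accuracy >= ACCURACY_THRESHOLD and misrecognition <= MISRECOGNITION_THRESHOLD
```
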
It should be understood that the manner in which the initial sleep state identification model is generated by the server is described above by way of example only, and the present application is not limited to the initial sleep state identification model being configured in the electronic device 101 before shipment from the electronic device 101. In this example, the initial sleep state recognition model may be determined by the vendor of the electronic device 101 in a similar manner as when the server determines the initial sleep state recognition model, and the initial sleep state recognition model may be configured in the electronic device 101.
Before S101, the electronic device 101 may train the initial sleep state recognition model according to its own marked screen-off time information. Because these data reflect the user's usage habits on the electronic device 101, a personalized sleep state recognition model of the user can be obtained.
Specifically, the electronic device 101 may select a part of data from the screen-off time information of the user of the electronic device 101 as input data according to the screen-off time information shown in table 2, and iteratively train the initial sleep state recognition model to obtain the personalized sleep state recognition model of the user. In each iteration, a prediction result (i.e., a predicted sleep state) of a portion (e.g., one or more groups) of input data of the electronic device 101 may be predicted by the initial sleep state recognition model according to a start time and an end time of a screen-out state in the portion of the input data, and accuracy of the prediction may be determined according to a labeling result of the portion of the input data. Because each set of input data is labeled, the difference between the prediction result and the labeling result of the current initial sleep state recognition model can be calculated. And finally, updating the values of the parameters in the initial sleep state identification model by a back propagation algorithm and the like based on the difference, so that the prediction result of the initial sleep state identification model for the group of input data is closer to the labeling result. And when the iteration termination condition is met, if the identification accuracy of the iteration training reaches (or exceeds) the identification accuracy threshold value and the false identification rate is not higher than (or lower than) the false identification rate threshold value, the iteration is considered to be terminated. At this point, a personalized sleep state recognition model for the user may be obtained.
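A minimal stand-in for this fine-tuning loop is sketched below. The patent trains a neural network by backpropagation; for brevity this uses a two-feature logistic regression updated by gradient descent on labeled (start hour, end hour, sleep) records, which has the same predict–compare–update structure:

```python
import math

def fine_tune(records, lr=0.05, epochs=500):
    """records: list of (start_hour, end_hour, label) with label 1 for sleep.
    Each iteration predicts, compares with the label, and updates the
    parameters so the prediction moves closer to the labeling result."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for start, end, label in records:
            z = w[0] * start + w[1] * end + b
            pred = 1.0 / (1.0 + math.exp(-z))  # predicted sleep probability
            err = pred - label                  # difference from the label
            w[0] -= lr * err * start            # gradient-descent update
            w[1] -= lr * err * end
            b -= lr * err
    return w, b

def predict(w, b, start, end):
    """True when the fitted model predicts the sleep state."""
    return 1.0 / (1.0 + math.exp(-(w[0] * start + w[1] * end + b))) >= 0.5
```

Trained on the Table 2 example records, the model learns to separate the habitual night-time screen-off spans from the short daytime one.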
In the implementation of S101, the electronic device 101 may output the sleep time period of the user through the user's personalized sleep state recognition model. For example, a sleep time period from 23:00 at night to 9:00 the next morning may correspond to a night sleep mode, used to determine whether a user accustomed to sleeping at night is in the sleep state; a sleep time period from 6:00 in the morning to 15:00 in the afternoon may correspond to a daytime sleep mode, used to determine whether a user accustomed to daytime sleep is in the sleep state.
In the implementation of S102, the electronic device 101 may collect data for identifying whether the user is in a sleep state during the sleep time period according to the sleep time period of the user obtained in S101.
Specifically, the electronic device 101 may collect information such as position information, ambient light information, ambient sound information, whether to turn on a do-not-disturb mode, whether to set an alarm clock, whether to be in a screen-off state, a duration of being in a screen-off state, whether to be in a still state, and a duration of being in a still state. The location information may be used to determine whether the electronic device 101 is currently within a specified range (e.g., the user's home). For example, the electronic device 101 may compare the current location information of the electronic device 101 with the location information of the designated range, and if the current location information of the electronic device 101 is included in the location information of the designated range, the electronic device 101 is considered to be currently in the designated range.
Specifically, the above location information may be acquired by the electronic device 101 through GNSS.
In a specific example, the electronic device 101 may use the position information of the location where the user is during the sleep-state time period as the specified range. For example, the electronic device 101 may determine a time period in which the user is in the sleep state according to one or more pieces of marked screen-off time information, query the position information of where the user was located during that time period, and determine that position information as the specified range in which the user may be in the sleep state, such as the user's home.
In another example, the specified range may be set by a user. For example, the user may input an instruction to mark a range in which the current position is located as a specified range, and the electronic device 101 collects position information of the position.
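A sketch of the "within a specified range" check follows. The patent gives no distance formula, so this assumes the standard haversine great-circle distance and a hypothetical 100 m radius for the user-marked range:

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_specified_range(current, home, radius_m=100.0):
    """current/home: (lat, lon) as collected via GNSS; radius_m is an
    assumed size for the user-marked range (e.g., the user's home)."""
    return haversine_m(*current, *home) <= radius_m
```
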
For example, when the position information collected by the electronic device 101 indicates that the electronic device 101 is in the range of the home set by the user, the electronic device 101 may recognize that the user is in the sleep state. When the illumination intensity represented by the ambient light information collected by the electronic device 101 is below (or not above) the illumination threshold, the electronic device 101 may identify that the user is in a sleep state. When the sound intensity represented by the ambient sound information collected by the electronic device 101 is lower (or not higher) than the sound intensity threshold, the electronic device 101 may recognize that the user is in a sleep state. When the electronic device 101 turns on the do-not-disturb mode, the electronic device 101 may recognize that the user is in a sleep state. When the electronic device 101 is provided with an alarm clock, the electronic device 101 may recognize that the user is in a sleep state. When the electronic device 101 is in the screen-off state and the duration of the screen-off state reaches a set duration (e.g., 15 minutes, or other duration), the electronic device 101 may identify that the user is in the sleep state. When the electronic device 101 is in a stationary state (e.g., no displacement of the electronic device 101 or change in the placement posture is detected) and the stationary state is for a set duration (e.g., 15 minutes, or other duration), the electronic device 101 may identify that the user is in a sleep state.
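The individual signals enumerated above can be combined in a rule-based sketch like the following; the field names, threshold values, and the all-conditions combination are illustrative assumptions, since the patent only describes each signal separately:

```python
def recognize_sleep(signals: dict) -> bool:
    """signals: a dict of collected readings; missing keys default to values
    that fail the corresponding check."""
    return bool(
        signals.get("in_home_range", False)               # within specified range
        and signals.get("illumination_lux", 1e9) < 50.0   # assumed light threshold
        and signals.get("sound_db", 1e9) < 40.0           # assumed sound threshold
        and signals.get("screen_off_minutes", 0) >= 15    # set screen-off duration
        and signals.get("stationary_minutes", 0) >= 15    # set stationary duration
    )
```
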
After the electronic device 101 recognizes that the user is currently in the sleep state, the electronic device 101 may apply stronger control measures to reduce the power consumption of the device. For example, after recognizing that the user is currently in the sleep state, the electronic device 101 may perform at least one of the following control measures: disabling the mobile network connection function, disabling the WLAN network connection function, disabling the Bluetooth communication function, disabling the GNSS function, or stopping the system clock of the electronic device.
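The choice of which subsystems to shut off once the sleep state is recognized can be sketched as a simple mapping. The subsystem names below are illustrative stand-ins for a device's real power-management APIs, which the text does not name.

```python
# Illustrative names for the control measures listed in the text.
SLEEP_CONTROL_MEASURES = (
    "mobile_network",   # mobile network connection function
    "wlan",             # WLAN network connection function
    "bluetooth",        # Bluetooth communication function
    "gnss",             # GNSS function
    "system_clock",     # system clock
)

def select_power_controls(user_asleep: bool) -> list:
    """Return the subsystems to disable; empty when the user is not asleep."""
    return list(SLEEP_CONTROL_MEASURES) if user_asleep else []
```

In practice the device would apply only a subset of these measures ("at least one"), e.g. keeping the alarm clock functional while radios are off.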
The sleep state identification method provided by the embodiment of the present application is described below by way of example.
The electronic device 101 receives the initial sleep state recognition model from the server and determines, according to the initial sleep state recognition model, that the general sleep period of the user is 23:00 at night to 9:00 the next morning.
If any one or more of the following conditions exist, the electronic device 101 cannot obtain the user's personalized sleep state recognition model by training on the basis of the initial sleep state recognition model: the electronic device 101 is a newly enabled electronic device; or the electronic device 101 does not detect valid screen-off time information of the user (for example, because the user cleared the storage, the amount of screen-off time information is below the data amount threshold, so the initial sleep state recognition model cannot be effectively trained on the screen-off time information); or the user has not enabled intelligent recognition of the sleep state. In these cases, the electronic device 101 may collect data for identifying whether the user is in the sleep state during the user's general sleep period, and determine whether the user is in the sleep state according to the collected data.
For example, if the general sleep period of the user is 23:00 at night to 9:00 the next morning, the electronic device 101 may start to collect data for identifying whether the user is in the sleep state after 23:00 at night, and determine whether the user is in the sleep state according to the collected data. If so, the electronic device 101 may apply stronger control measures to reduce the power consumption of the device.
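Checking whether the current time falls inside the general sleep period must handle windows that wrap past midnight, such as 23:00 to 9:00. A minimal sketch, with the window boundaries as parameters:

```python
from datetime import time

def in_general_sleep_period(now, start=time(23, 0), end=time(9, 0)):
    """True when `now` falls in the general sleep period.
    Handles windows that wrap past midnight (start > end)."""
    if start <= end:
        return start <= now <= end
    return now >= start or now <= end
```

The device would poll this check (or schedule a timer at `start`) before beginning to collect sleep-identification data.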
The electronic device 101 trains the initial sleep state recognition model according to the user's screen-off time information to obtain the user's personalized sleep state recognition model, which can be used to determine the user's personalized sleep period. During periodic training, when the next training time is reached, the electronic device 101 may train the personalized sleep state recognition model obtained in the previous round according to the screen-off time information collected both before and after that round, so as to obtain the latest personalized sleep state recognition model of the user. Thereafter, when recognizing the sleep state of the user, the user's personalized sleep period can be determined according to the latest personalized sleep state recognition model. Within the user's personalized sleep period, the electronic device 101 can accurately identify whether the user is in the sleep state, thereby balancing power-consumption control against user experience.
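One round of the periodic training described above can be sketched as follows. The duration-based labelling step reflects the idea (also in claim 4) that sleep state information is derived from screen-off duration; the 3-hour threshold is an assumption for illustration, and `fit` stands in for the actual model-training routine, which the text does not specify.

```python
def label_events(events, sleep_threshold_s=3 * 3600):
    """Label each screen-off interval as sleep (True) or not by its duration.
    The 3 h threshold is an assumption; events are (start_ts, duration_s) pairs."""
    return [(start_ts, dur_s, dur_s >= sleep_threshold_s) for start_ts, dur_s in events]

def periodic_retrain(prev_model, old_samples, new_events, fit):
    """One periodic-training round: fine-tune the previously trained
    personalized model on samples gathered before AND after the last round.
    `fit(model, samples)` is a placeholder for the real training routine."""
    samples = old_samples + label_events(new_events)
    return fit(prev_model, samples), samples
```

The returned sample list would be carried into the next round, so each retraining sees the full history of screen-off observations.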
The user's personalized sleep period may be the same as or different from the user's general sleep period. For example, the user's personalized sleep period may be 21:00 at night to 6:00 the next morning, or 6:00 in the morning to 15:00 in the afternoon.
In addition, after collecting the user's screen-off time information, the electronic device 101 may upload the screen-off time information to the server, so that the server periodically trains the initial sleep state recognition model.
Based on the same concept, an embodiment of the present application further provides an electronic device, which is used for implementing the steps or operations executed by the electronic device in the sleep state identification method provided by the embodiment of the present application.
In a specific example, fig. 5 illustrates an electronic device 500 provided herein. The electronic device 500 may include at least one processor 510 and at least one memory 520. The processor 510 is coupled with the memory 520; the coupling in this embodiment is an indirect coupling or a communication connection between devices, units, or modules, may be electrical, mechanical, or in another form, and is used for information interaction between the devices, units, or modules.
It should be understood that the electronic device 500 may be used to implement the steps or operations performed by the electronic device in the sleep state identification method provided by the embodiment of the present application.
In particular, memory 520 may be used to store program instructions.
The processor 510 is configured to call the program instructions stored in the memory 520, so that the electronic device 500 performs the steps or operations performed by the electronic device in the sleep state identification method provided by the embodiment of the present application.
The electronic device 500 may further include a user behavior collection module. The processor 510 may control the user behavior collection module to collect data for identifying whether the user is in a sleep state. For example, the user behavior collection module may include at least one of a microphone, an earphone interface, a camera, an environment sensor, a gyroscope sensor, an acceleration sensor, a touch sensor, or a pressure sensor.
The electronic device 500 may also include at least one transceiver 530. The transceiver 530 may be coupled to the processor 510 and the memory 520, respectively. The transceiver 530 may be used for the electronic device 500 to communicate. For example, the transceiver 530 may be used for the electronic device 500 to communicate with an ephemeris server and/or a network location server.
For example, the electronic device 500 may be implemented by the electronic device 101 shown in fig. 1. In particular, the processor 110 of the electronic device 101 may be used to implement the processor 510. Internal memory 121 of electronic device 101 may be used to implement memory 520. The user behavior collection module may be formed of at least one of the microphone 140C, the earphone interface 140D, or the sensor module 160 shown in fig. 1.
In another example, functional modules corresponding to the functions, steps, or operations in the above methods may be provided in the electronic device to support the electronic device in executing the above methods. For example, the electronic device may be provided with a communication module and a processing module, where the communication module may be used for the electronic device to perform communication, and the processing module may be used for the electronic device to perform processing operations, such as training the initial sleep state recognition model to obtain the user's personalized sleep state recognition model, recognizing the user's sleep state according to the personalized sleep state recognition model, generating information/messages to be sent, or processing received signals to obtain information/messages.
Illustratively, the electronic device may be implemented by the electronic device 101 as shown in fig. 2. Alternatively, the electronic device may have a logical structure as shown in fig. 2.
Based on the same concept, embodiments of the present application further provide a server, which is used to implement the steps or operations executed by the server in the sleep state identification method provided by the embodiments of the present application.
In a specific example, fig. 6 illustrates a server 600 provided herein. The server 600 may include at least one processor 610 and at least one memory 620. Wherein the processor 610 is coupled to the memory 620.
In particular, memory 620 may be used to store program instructions.
The processor 610 is configured to call the program instructions stored in the memory 620, so that the server 600 performs the steps or operations performed by the server in the sleep state identification method provided by the embodiment of the present application.
In another example, functional modules corresponding to the functions, steps, or operations in the above methods may be provided in the server to support the server in executing the above methods. For example, the server may be provided with a communication module and a processing module, where the communication module may be used for the server to perform communication, and the processing module may be used for the server to perform processing operations, such as obtaining the initial sleep state recognition model, generating information/messages to be transmitted, or processing received signals to obtain information/messages.
Illustratively, the server may be implemented by a server as shown in FIG. 3. Alternatively, the server may have a logical structure as shown in fig. 3.
Based on the same concept as the method embodiments, the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes a computer to perform the operations performed by the electronic device or the server in the method embodiments or any possible implementation thereof.
Based on the same concept as the method embodiments, the present application further provides a computer program product which, when invoked and executed, causes a computer to perform the operations performed by the electronic device or the server in the method embodiments or any possible implementation thereof.
Based on the same concept as the method embodiments described above, the present application also provides a chip or a chip system, which may include a processor. The chip may further include or be coupled with a memory (or a storage module) and/or a transceiver (or a communication module), where the transceiver (or the communication module) may be used to support the chip for wired and/or wireless communication, and the memory (or the storage module) may be used to store a program that is called by the processor to implement the operations performed by the electronic device or the server in any of the possible implementations of the above-described method embodiments and method embodiments. The chip system may include the above chip, and may also include the above chip and other discrete devices, such as a memory (or storage module) and/or a transceiver (or communication module).
Based on the same concept as the method embodiment, the application also provides a communication system which can comprise the electronic equipment and/or the server. The communication system may be used to implement the method embodiments described above, the operations performed by the electronic device or the server in any of its possible implementations.
It is clear to those skilled in the art that the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions described above may be stored on, or transmitted over, a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, where communication media include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example and not limitation: the computer-readable medium may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if software is transmitted from a website, a server, or another remote source using a coaxial cable, a fiber optic cable, a twisted pair, a digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, the fiber optic cable, the twisted pair, the DSL, or the wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
In summary, the foregoing is merely embodiments of the present application and is not intended to limit the scope of protection of the present application. Any modification, equivalent replacement, improvement, and the like made in accordance with the disclosure of the present application shall be included within the protection scope of the present application.

Claims (11)

1. A sleep state recognition method, comprising:
the electronic equipment determines an individualized sleep time period of a user according to an individualized sleep state identification model of the user, the individualized sleep state identification model of the user is obtained by training an initial sleep state identification model according to screen-off time information of the electronic equipment and sleep state information corresponding to the screen-off time information, the initial sleep state identification model is obtained by training a neural network model according to the sleep state information corresponding to the screen-off time information of a plurality of pieces of electronic equipment, and the sleep state information is used for indicating whether the user is in a sleep state within time corresponding to the screen-off time information;
and the electronic equipment judges whether the user is in the sleep state according to the collected data for identifying whether the user is in the sleep state or not in the personalized sleep time period of the user.
2. The method of claim 1, wherein the method further comprises:
the electronic device receives the initial sleep state recognition model from a server.
3. The method of claim 1 or 2, wherein the method further comprises:
and the electronic equipment sends the screen-off time information of the electronic equipment to a server.
4. The method of any one of claims 1-3, wherein the method further comprises:
and the electronic equipment determines the sleep state information corresponding to each piece of screen-off time information according to the screen-off duration corresponding to each piece of screen-off time information of the electronic equipment.
5. The method of any one of claims 1-4, wherein the method further comprises:
the electronic equipment iteratively trains an initial sleep state recognition model according to the screen-off time information of the electronic equipment and the sleep state information corresponding to the screen-off time information;
and when the iteration termination condition is met, the electronic equipment terminates the iteration training and obtains the personalized sleep state recognition model of the user.
6. The method of any one of claims 1-5, wherein the data for identifying whether the user is in a sleep state comprises at least one of:
the duration for which the electronic equipment is in the screen-off state; or,
ambient light information; or,
ambient sound information; or,
location information.
7. A sleep state recognition method, comprising:
determining sleep state information corresponding to the screen turn-off time information according to the screen turn-off time information of the first electronic equipment, wherein the sleep state information is used for indicating whether a user is in a sleep state within the time corresponding to the screen turn-off time information;
and training a neural network model according to the sleep state information corresponding to the screen-off time information to obtain an initial sleep state recognition model, wherein the initial sleep state recognition model is used for determining the common sleep time period of the user.
8. The method of claim 7, further comprising:
receiving the screen-off time information of a plurality of first electronic devices from the plurality of first electronic devices.
9. The method of claim 7 or 8, further comprising:
sending the initial sleep state recognition model to one or more second electronic devices.
10. An electronic device, comprising: one or more processors and memory, and one or more computer programs;
wherein the one or more computer programs are stored in the memory and, when executed by the processor, cause the electronic device to implement the method of any of claims 1 to 6.
11. A server, comprising: one or more processors and memory, and one or more computer programs;
wherein the one or more computer programs are stored in the memory, which when executed by the processor, cause the server to implement the method of any of claims 7 to 9.
CN201910882108.0A 2019-09-18 2019-09-18 Sleep state identification method and device Pending CN110742580A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910882108.0A CN110742580A (en) 2019-09-18 2019-09-18 Sleep state identification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910882108.0A CN110742580A (en) 2019-09-18 2019-09-18 Sleep state identification method and device

Publications (1)

Publication Number Publication Date
CN110742580A true CN110742580A (en) 2020-02-04

Family

ID=69276720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910882108.0A Pending CN110742580A (en) 2019-09-18 2019-09-18 Sleep state identification method and device

Country Status (1)

Country Link
CN (1) CN110742580A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112435441A (en) * 2020-11-19 2021-03-02 维沃移动通信有限公司 Sleep detection method and wearable electronic device
CN113138656A (en) * 2021-05-14 2021-07-20 上海传英信息技术有限公司 Control method, mobile terminal and storage medium
CN113420740A (en) * 2021-08-24 2021-09-21 深圳小小小科技有限公司 Control method of smart home, electronic device and computer readable medium
WO2022068332A1 (en) * 2020-09-29 2022-04-07 Oppo广东移动通信有限公司 Sleep monitoring method and apparatus, electronic device, and computer-readable medium
CN116684524A (en) * 2022-09-30 2023-09-01 荣耀终端有限公司 Site marking method, electronic equipment and storage medium
CN117061658A (en) * 2023-07-11 2023-11-14 荣耀终端有限公司 Sleep time identification method, electronic device and storage medium
CN117077812A (en) * 2023-09-13 2023-11-17 荣耀终端有限公司 Network training method, sleep state evaluation method and related equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101765074A (en) * 2008-12-24 2010-06-30 宏达国际电子股份有限公司 Method, device and recording medium for controlling operation pattern of electronic device
WO2012149707A1 (en) * 2011-05-03 2012-11-08 中兴通讯股份有限公司 Method and device for making mobile terminal enter standby
CN103024205A (en) * 2012-12-14 2013-04-03 华为终端有限公司 Method, device and terminal for controlling power
CN106095059A (en) * 2016-06-08 2016-11-09 维沃移动通信有限公司 A kind of method reducing mobile terminal power consumption and mobile terminal
CN106502371A (en) * 2016-11-08 2017-03-15 珠海市魅族科技有限公司 A kind of electricity-saving control method and device
CN107277907A (en) * 2017-07-31 2017-10-20 努比亚技术有限公司 Method for controlling mobile terminal, mobile terminal and computer-readable recording medium
CN107861605A (en) * 2017-11-06 2018-03-30 北京小米移动软件有限公司 Data processing method and device
CN107943266A (en) * 2017-11-20 2018-04-20 北京小米移动软件有限公司 power consumption control method, device and equipment
CN109793497A (en) * 2017-11-17 2019-05-24 广东乐心医疗电子股份有限公司 Sleep state identification method and device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101765074A (en) * 2008-12-24 2010-06-30 宏达国际电子股份有限公司 Method, device and recording medium for controlling operation pattern of electronic device
WO2012149707A1 (en) * 2011-05-03 2012-11-08 中兴通讯股份有限公司 Method and device for making mobile terminal enter standby
CN103024205A (en) * 2012-12-14 2013-04-03 华为终端有限公司 Method, device and terminal for controlling power
CN106095059A (en) * 2016-06-08 2016-11-09 维沃移动通信有限公司 A kind of method reducing mobile terminal power consumption and mobile terminal
CN106502371A (en) * 2016-11-08 2017-03-15 珠海市魅族科技有限公司 A kind of electricity-saving control method and device
CN107277907A (en) * 2017-07-31 2017-10-20 努比亚技术有限公司 Method for controlling mobile terminal, mobile terminal and computer-readable recording medium
CN107861605A (en) * 2017-11-06 2018-03-30 北京小米移动软件有限公司 Data processing method and device
CN109793497A (en) * 2017-11-17 2019-05-24 广东乐心医疗电子股份有限公司 Sleep state identification method and device
CN107943266A (en) * 2017-11-20 2018-04-20 北京小米移动软件有限公司 power consumption control method, device and equipment

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022068332A1 (en) * 2020-09-29 2022-04-07 Oppo广东移动通信有限公司 Sleep monitoring method and apparatus, electronic device, and computer-readable medium
CN114343587A (en) * 2020-09-29 2022-04-15 Oppo广东移动通信有限公司 Sleep monitoring method and device, electronic equipment and computer readable medium
CN112435441A (en) * 2020-11-19 2021-03-02 维沃移动通信有限公司 Sleep detection method and wearable electronic device
CN113138656A (en) * 2021-05-14 2021-07-20 上海传英信息技术有限公司 Control method, mobile terminal and storage medium
CN113420740A (en) * 2021-08-24 2021-09-21 深圳小小小科技有限公司 Control method of smart home, electronic device and computer readable medium
CN113420740B (en) * 2021-08-24 2021-12-03 深圳小小小科技有限公司 Control method of smart home, electronic device and computer readable medium
CN116684524A (en) * 2022-09-30 2023-09-01 荣耀终端有限公司 Site marking method, electronic equipment and storage medium
CN116684524B (en) * 2022-09-30 2024-04-05 荣耀终端有限公司 Site marking method, electronic equipment and storage medium
CN117061658A (en) * 2023-07-11 2023-11-14 荣耀终端有限公司 Sleep time identification method, electronic device and storage medium
CN117077812A (en) * 2023-09-13 2023-11-17 荣耀终端有限公司 Network training method, sleep state evaluation method and related equipment
CN117077812B (en) * 2023-09-13 2024-03-08 荣耀终端有限公司 Network training method, sleep state evaluation method and related equipment

Similar Documents

Publication Publication Date Title
CN110347269B (en) Empty mouse mode realization method and related equipment
CN112289313A (en) Voice control method, electronic equipment and system
CN110742580A (en) Sleep state identification method and device
CN110506416A (en) A kind of method and terminal of terminal switching camera
CN113395382B (en) Method for data interaction between devices and related devices
CN113542580B (en) Method and device for removing light spots of glasses and electronic equipment
CN112150778A (en) Environmental sound processing method and related device
CN112334860A (en) Touch method of wearable device, wearable device and system
CN111865646A (en) Terminal upgrading method and related device
CN114880251B (en) Memory cell access method, memory cell access device and terminal equipment
CN113973398A (en) Wireless network connection method, electronic equipment and chip system
CN114490174A (en) File system detection method, electronic device and computer readable storage medium
CN113472861B (en) File transmission method and electronic equipment
CN114221402A (en) Charging method and device of terminal equipment and terminal equipment
CN113467735A (en) Image adjusting method, electronic device and storage medium
CN114077519A (en) System service recovery method and device and electronic equipment
CN109285563B (en) Voice data processing method and device in online translation process
CN114095602A (en) Index display method, electronic device and computer-readable storage medium
CN113901485B (en) Application program loading method, electronic device and storage medium
CN113467747B (en) Volume adjusting method, electronic device and storage medium
CN115022807A (en) Express delivery information reminding method and electronic equipment
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
CN114116610A (en) Method, device, electronic equipment and medium for acquiring storage information
CN113509145A (en) Sleep risk monitoring method, electronic device and storage medium
CN113918003A (en) Method and device for detecting time length of skin contacting screen and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200204