CN114424927A - Sleep monitoring method and device, electronic equipment and computer readable storage medium

Info

Publication number
CN114424927A
Authority
CN
China
Prior art keywords
data
user
monitoring
vehicle
sleep
Prior art date
Legal status
Pending
Application number
CN202011185682.XA
Other languages
Chinese (zh)
Inventor
许德省
李靖
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202011185682.XA (published as CN114424927A)
Priority to PCT/CN2021/115753 (published as WO2022088938A1)
Publication of CN114424927A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/48 Other medical applications
    • A61B5/4806 Sleep evaluation
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/681 Wristwatch-type devices


Abstract

The application is applicable to the field of data processing, and provides a sleep monitoring method and apparatus, an electronic device, and a computer-readable storage medium. The sleep monitoring method includes the following steps: the electronic device determines whether the user is located on a vehicle; if the user is located on a vehicle, the electronic device acquires first monitoring data of the user on the vehicle, where the first monitoring data includes vehicle-related information; the electronic device obtains target data according to the first monitoring data, where the target data is obtained by filtering the vehicle-related information out of the first monitoring data; and the electronic device monitors the sleep of the user according to the target data. With this method, the sleep of a user on a vehicle can be accurately recorded.

Description

Sleep monitoring method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of data processing, and in particular, to a sleep monitoring method and apparatus, an electronic device, and a computer-readable storage medium.
Background
Currently, sleep monitoring is a common function of electronic devices (such as mobile phones, smart bands, smartwatches, and the like). Through this function, the user can learn sleep data such as the time of falling asleep, the time of waking up, and the sleep duration, and thus keep track of his or her sleep state.
A sleep monitoring method needs to determine the user's falling-asleep point and waking-up point. Currently, an electronic device generally determines the falling-asleep point and waking-up point of a user according to acceleration data monitored by the electronic device, so as to monitor the sleep of the user. However, when a user is on a vehicle, the electronic device may be unable to accurately determine the user's falling-asleep point and waking-up point, and therefore cannot accurately record the sleep of the user on the vehicle.
Disclosure of Invention
The application provides a sleep monitoring method, a sleep monitoring device, electronic equipment and a computer readable storage medium, which can accurately record the sleep condition of a user on a vehicle.
To achieve the above purpose, the technical solutions of the application are as follows:
In a first aspect, a sleep monitoring method is provided, which is applied to an electronic device and includes:
the electronic device determines whether the user is located on a vehicle; if the user is located on a vehicle, the electronic device acquires first monitoring data of the user on the vehicle, where the first monitoring data includes vehicle-related information; the electronic device obtains target data according to the first monitoring data, where the target data is obtained by filtering the vehicle-related information out of the first monitoring data; and the electronic device monitors the sleep of the user according to the target data.
In the above sleep monitoring method, because the user is located on a vehicle, the first monitoring data includes vehicle-related information. If the electronic device monitored the user's sleep directly from the first monitoring data, the vehicle-related information could prevent it from accurately calculating the user's falling-asleep point and/or waking-up point on the vehicle, so the sleep of a user on a vehicle is often not recorded, or only partially recorded. In the embodiment of the application, the electronic device filters the vehicle-related information out of the first monitoring data, thereby removing the components that interfere with the user's motion data. The filtered target data therefore accurately reflects the user's motion state, so the electronic device can accurately calculate the user's falling-asleep point and/or waking-up point on the vehicle. This solves the problem that the sleep of a user on a vehicle is not recorded or is under-recorded, and improves the user experience.
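As a hedged illustration only, the following Python sketch shows how the two-branch flow described above might look in code; the function and parameter names, the 50-sample window, and the 0.05 threshold are assumptions made for illustration and are not taken from the patent.

```python
# Minimal sketch of the two-branch flow described above; all names and
# thresholds are illustrative assumptions, not the patent's implementation.
import numpy as np

def monitor_sleep(accel_samples: np.ndarray, on_vehicle: bool,
                  vehicle_noise_filter=None) -> list:
    """accel_samples: (N, 3) accelerometer samples from the wearable device."""
    if on_vehicle and vehicle_noise_filter is not None:
        # First monitoring mode: the first monitoring data contains
        # vehicle-related information, so filter it out to get the target data.
        target = vehicle_noise_filter(accel_samples)
    else:
        # Second monitoring mode: no vehicle-related filtering is needed.
        target = accel_samples
    # Toy classification: low motion variability over a window means "asleep".
    magnitude = np.linalg.norm(target, axis=1)
    n_windows = len(magnitude) // 50
    windows = magnitude[: n_windows * 50].reshape(n_windows, 50)
    return ["asleep" if w.std() < 0.05 else "awake" for w in windows]
```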
With reference to the first aspect, in some embodiments, the method provided in the embodiments herein further includes: if the user is not located on a vehicle, the electronic device monitors the sleep of the user according to acquired second monitoring data, where the second monitoring data includes state information of the user.
In the embodiment of the application, the electronic device can monitor the user's sleep in different states. That is, the electronic device can monitor sleep not only when the user is on a vehicle but also when the user is not on a vehicle, which improves the user experience. The first monitoring mode is used when the user is on a vehicle, that is, the influence of vehicle-related information must be filtered out. The second monitoring mode is used when the user is not on a vehicle; in this case, the monitoring data collected by the electronic device usually does not include vehicle-related information, so no filtering is needed and the collected monitoring data is used for sleep monitoring directly. With these two monitoring modes, sleep monitoring is more targeted and the sleep monitoring result is more accurate.
With reference to the first aspect, in some embodiments, before the electronic device obtains the first monitoring data of the user on the vehicle, the method provided by the embodiment of the present application further includes: the electronic device determines to monitor the sleep of the user on the vehicle in a first monitoring mode, where the first monitoring mode is a mode in which vehicle-related information is filtered out of the monitoring data acquired by the electronic device.
With reference to the first aspect, in some embodiments, before the electronic device monitors the sleep of the user according to the obtained second monitoring data, the method provided in this embodiment of the application further includes: the electronic device determines to monitor the sleep of the user in a second monitoring mode, where the second monitoring mode is a mode in which vehicle-related information does not need to be filtered out of the monitoring data acquired by the electronic device.
In the embodiment of the application, the electronic device can judge whether the user is located on a vehicle, and decide, according to the judgment result, whether to monitor the sleep of the user in the first monitoring mode or in the second monitoring mode. In this way, the electronic device can switch the monitoring mode automatically, which makes the function more intelligent and improves the user experience.
With reference to the first aspect, in some embodiments, the obtaining, by the electronic device, of the target data according to the first monitoring data includes: the electronic device acquires driving data of the vehicle; the electronic device determines the data component in the first monitoring data that matches the driving data as the vehicle-related information; and the electronic device filters the vehicle-related information out of the first monitoring data to obtain the target data.
Compared with simply denoising the first monitoring data, this method filters out the vehicle-related information in a targeted manner, and therefore yields more accurate state information about the user.
With reference to the first aspect, in some embodiments, the driving data includes first acceleration data when the vehicle is in a stationary state and second acceleration data when the vehicle is in a driving state; the electronic device determines a data component in the first monitoring data, which is matched with the driving data, as the vehicle-related information, and comprises the following steps: the electronic equipment constructs a noise data matrix according to the first acceleration data and the second acceleration data; the electronic equipment inputs the noise data matrix and the first monitoring data into a preset decomposition model to obtain a data component matched with the driving data in the first monitoring data; the electronic device determines a data component of the first monitoring data that matches the travel data as vehicle-related information.
Because the noise data matrix includes both the first acceleration data and the second acceleration data, the subsequent filtering takes into account both the data interference generated while the vehicle is driving and the data interference present while the vehicle is stationary, so the filtering is more thorough. In addition, using a trained preset decomposition model for the decomposition effectively improves the monitoring efficiency and accuracy.
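The patent only specifies that a noise data matrix and the first monitoring data are fed into a "preset decomposition model"; the sketch below uses an ordinary least-squares projection as a stand-in for that model, so the decomposition technique, the use of magnitude signals, and all names are assumptions made purely for illustration.

```python
import numpy as np

def filter_vehicle_component(first_monitoring_data: np.ndarray,
                             accel_stationary: np.ndarray,
                             accel_moving: np.ndarray) -> np.ndarray:
    """Remove the component of the wrist data that matches the driving data.

    first_monitoring_data: (N,) motion-magnitude samples from the wearable.
    accel_stationary:      (N,) first acceleration data (vehicle at rest).
    accel_moving:          (N,) second acceleration data (vehicle driving).
    """
    # Noise data matrix built from the two components of the driving data.
    noise_matrix = np.column_stack([accel_stationary, accel_moving])
    # Stand-in for the preset decomposition model: find the combination of
    # noise columns that best explains the first monitoring data.
    coeffs, *_ = np.linalg.lstsq(noise_matrix, first_monitoring_data, rcond=None)
    vehicle_related = noise_matrix @ coeffs          # matched data component
    # Target data = first monitoring data with the vehicle-related part removed.
    return first_monitoring_data - vehicle_related
```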
With reference to the first aspect, in some embodiments, the electronic device monitors sleep of the user according to the target data, including: when the data amount in the target data reaches a preset amount, the electronic equipment monitors the sleep of the user according to the target data; and when the data quantity in the target data does not reach the preset quantity, the electronic equipment continues to acquire the first monitoring data.
Performing sleep monitoring only once a preset amount of target data has been accumulated reduces the amount of data processing and avoids an incidental monitoring result that would be obtained from the data at a single sampling moment.
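A minimal sketch of the data-amount check described above; the preset amount of 300 samples and the helper names are arbitrary illustrative choices, not values from the patent.

```python
from collections import deque

PRESET_AMOUNT = 300          # illustrative value only

_buffer: deque = deque()

def on_target_sample(sample, classify_batch):
    """Accumulate target-data samples; run sleep classification only once a
    preset amount has been collected, otherwise keep acquiring data."""
    _buffer.append(sample)
    if len(_buffer) < PRESET_AMOUNT:
        return None                      # continue acquiring first monitoring data
    batch = list(_buffer)
    _buffer.clear()
    return classify_batch(batch)         # monitor sleep on the accumulated batch
```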
With reference to the first aspect, in some embodiments, the electronic device monitors sleep of the user according to the target data, including: the electronic equipment obtains multiple groups of data according to the target data; the electronic equipment determines a first state label corresponding to each group of data according to a statistical characteristic value corresponding to each group of data in the plurality of groups of data, wherein the first state label comprises a sleep state and a waking state; and the electronic equipment determines a sleep monitoring result according to the first state label corresponding to each group of data.
With reference to the first aspect, in some embodiments, the determining, by the electronic device, a sleep monitoring result according to the first state tag corresponding to each group of data includes: the electronic equipment acquires pulse wave data of a user; the electronic equipment determines a second state label corresponding to each group of data according to the pulse wave data and the statistical characteristic value corresponding to each group of data, wherein the second state label comprises a sleep state and a waking state; the electronic equipment determines a final state label corresponding to each group of data according to the first state label and the second state label; and the electronic equipment determines a sleep monitoring result according to the final state label corresponding to each group of data.
With this method, the monitoring result based on the sensor data and the monitoring result based on the PPG data are considered together, which further improves the accuracy of the sleep monitoring result.
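To make the two-label fusion concrete, the sketch below derives a first state label from a per-group motion statistic, a second state label from pulse wave (heart-rate) data plus the same statistic, and keeps "sleep" only when the two labels agree; every threshold and the agreement rule are assumptions for illustration, since the patent does not specify them.

```python
import numpy as np

def first_state_labels(groups, motion_thresh=0.05):
    # First state label: a group with a low motion statistic is labelled sleep.
    return ["sleep" if g.std() < motion_thresh else "awake" for g in groups]

def second_state_labels(groups, heart_rates, hr_thresh=60.0, motion_thresh=0.08):
    # Second state label: combines pulse wave data (here reduced to a mean
    # heart rate per group) with the statistical feature of each group.
    return ["sleep" if hr < hr_thresh and g.std() < motion_thresh else "awake"
            for g, hr in zip(groups, heart_rates)]

def final_state_labels(first, second):
    # Final label per group: sleep only when both labels agree (assumed rule).
    return ["sleep" if a == b == "sleep" else "awake" for a, b in zip(first, second)]

# Example with synthetic data: three groups of motion magnitudes plus heart rates.
groups = [np.random.normal(0, 0.02, 50), np.random.normal(0, 0.2, 50),
          np.random.normal(0, 0.03, 50)]
heart_rates = [55.0, 75.0, 62.0]
labels = final_state_labels(first_state_labels(groups),
                            second_state_labels(groups, heart_rates))
```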
With reference to the first aspect, in some embodiments, the electronic device determining whether the user is located on a vehicle includes: the electronic equipment acquires travel information and/or motion information of a user and judges whether the user is located on the vehicle or not according to the travel information and/or the motion information; or the electronic equipment displays prompt information on the display screen, wherein the prompt information is used for prompting the user to select whether the user is located on the vehicle; the electronic equipment monitors a first operation instruction input by a user; the electronic device determines whether the user is located on the vehicle according to the first operation instruction.
Judging whether the user is on a vehicle from the travel information and/or motion information allows the electronic device to switch the sleep monitoring mode automatically, which makes the sleep monitoring function more intelligent and improves the user experience. In addition, a user-selection mode is provided, in which the user decides whether he or she is on a vehicle; this mode can determine the user's state more accurately.
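The sketch below illustrates both options described above: an automatic check based on hypothetical travel information and a motion-speed heuristic, with a fallback to the on-screen prompt; the itinerary fields, the 20 km/h threshold, and the callback are all assumptions, not details from the patent.

```python
from typing import Callable, Optional

def is_on_vehicle(trip_info: Optional[dict],
                  speed_kmh: Optional[float],
                  ask_user: Optional[Callable[[], bool]] = None) -> bool:
    """Decide whether the user is located on a vehicle.

    trip_info : hypothetical itinerary record, e.g. {"type": "train", ...}, or None.
    speed_kmh : sustained speed estimated from motion/position data, or None.
    ask_user  : optional callback that shows the prompt and returns the choice.
    """
    # Option 1: travel information and/or motion information.
    if trip_info is not None and trip_info.get("type") in ("train", "flight", "bus", "subway"):
        return True
    if speed_kmh is not None and speed_kmh > 20.0:
        return True                      # faster than normal walking or running
    # Option 2: display prompt information and use the user's operation instruction.
    if ask_user is not None:
        return bool(ask_user())
    return False
```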
In a second aspect, an embodiment of the present application provides a sleep monitoring apparatus, including:
a judging unit, configured to judge whether the user is located on a vehicle;
a first monitoring unit, configured to: if the user is located on a vehicle, acquire first monitoring data of the user on the vehicle, where the first monitoring data includes vehicle-related information; obtain target data according to the first monitoring data, where the target data is obtained by filtering the vehicle-related information out of the first monitoring data; and monitor the sleep of the user according to the target data.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a processor, and the processor is configured to execute a computer program stored in a memory to implement the method as provided in any one of the possible implementation manners of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, which includes computer instructions, when the computer instructions are executed on a computer or a processor, the computer or the processor is caused to execute the method as provided in any one of the possible implementation manners of the first aspect.
In a fifth aspect, the embodiments of the present application provide a computer program product, which when run on a computer or a processor, causes the computer or the processor to execute the method as provided in any one of the possible embodiments of the first aspect.
It is to be understood that the sleep monitoring apparatus of the second aspect provided above corresponds to the method of the first aspect, and the electronic device of the third aspect, the computer storage medium of the fourth aspect, or the computer program product of the fifth aspect is used to execute the method of the first aspect. Therefore, the beneficial effects achieved by the method can refer to the beneficial effects in the corresponding method, and are not described herein again.
Drawings
FIG. 1 is a schematic diagram of a sleep stage heart rate trend provided by an embodiment of the present application;
FIG. 2 is a diagram illustrating a variation trend of an acceleration data amplitude provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application;
fig. 4 is a block diagram of a software structure of the electronic device 100 according to an embodiment of the present disclosure;
fig. 5 is a schematic view of an application interface for sleep monitoring provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a home screen interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of an application interface for monitoring mode selection according to an embodiment of the present application;
fig. 8 is a schematic flowchart of a sleep monitoring method according to an embodiment of the present application;
fig. 9 is a schematic flowchart of a sleep monitoring method in a first monitoring mode according to an embodiment of the present application;
FIG. 10 is a schematic diagram of decomposition results provided by embodiments of the present application;
fig. 11 is a block diagram of a sleep monitoring device according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and action. For example, the first message and the second message are only used for distinguishing different messages, and the sequence order of the messages is not limited. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
A sleep monitoring technique generally monitors the time at which a user falls asleep and/or wakes up based on acceleration data, gyroscope data, and/or photoplethysmography (PPG) data, among others.
The accelerometer and the gyroscope are highly sensitive and easily influenced by external factors, so the monitoring result of the user's motion obtained from them varies greatly from case to case, and the magnitude of the user's motion cannot be accurately monitored. Likewise, for heart rate monitoring based on PPG data, the monitoring result varies greatly between users because of individual differences. For example, refer to fig. 1, which is a schematic diagram of the heart rate trend across sleep stages provided by an embodiment of the present application. Fig. 1 (a) shows the heart rate of user A; the time corresponding to the vertical solid line is the time of falling asleep, and it can be seen that the heart rate of user A falls overall after falling asleep. Fig. 1 (B) shows the heart rate of user B; the time corresponding to the vertical solid line is the time of falling asleep, and it can be seen that the heart rate of user B rises overall after falling asleep. As the example of fig. 1 shows, the heart rate trend around sleep may differ between users. In summary, if sleep monitoring is performed solely according to the acceleration data, the gyroscope data, or the PPG data, the accuracy of the resulting sleep monitoring result is low. Therefore, in the prior art, it is common to monitor the magnitude of the user's motion from the acceleration data and the gyroscope data, monitor the heart rate of the user from the PPG data, and then determine the user's falling-asleep/waking-up time by combining the magnitude of the user's motion and the heart rate of the user.
However, existing sleep monitoring techniques do not take into account the situation in which the user is located on a vehicle. When a user is on a means of transport such as a subway, a high-speed train, a bus, or an airplane, the monitored acceleration data or gyroscope data cannot accurately reflect the user's motion because of the shaking or acceleration of the vehicle, so the time at which the user falls asleep or wakes up cannot be accurately judged. For example, refer to fig. 2, which is a schematic diagram of the variation trend of the acceleration data amplitude provided in an embodiment of the present application. Fig. 2 shows the trend of the amplitude of the acceleration data monitored while the user is on a vehicle. As can be seen from fig. 2, the falling-asleep time determined from the acceleration data is not the user's actual falling-asleep time. Therefore, when a user is on a vehicle, the existing sleep monitoring techniques cannot accurately monitor the time at which the user falls asleep or wakes up, so the user's sleep on the vehicle is not recorded or is under-recorded, which affects the user experience.
To address the above problems, an embodiment of the present application provides a sleep monitoring method. In this method, when a user is located on a vehicle, the monitoring data acquired by the electronic device is first filtered to remove vehicle-related information, and the sleep of the user is then monitored according to the filtered data. Because the influence of the vehicle's shaking or acceleration on the measured user motion is filtered out, the sleep monitoring method provided in the embodiments of the present application can accurately monitor the time at which the user falls asleep or wakes up on the vehicle, and thus solves the problem that the user's sleep on the vehicle is not recorded or is under-recorded.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
First, an electronic device related to the embodiment of the present application is introduced, where the electronic device may be a mobile phone, a wearable device, a tablet computer, a netbook, a notebook computer, an ultra-mobile personal computer (UMPC), a Personal Digital Assistant (PDA), and other devices having a function of monitoring a user's sleep, and the embodiment of the present application does not set any limitation on a specific type of the electronic device. For example, the wearable device may be a smart bracelet, a smart watch, or the like.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. Illustratively, the processor is configured to execute the sleep monitoring method provided in the embodiments of the present application, for example, the processor executes the following steps S801 to S803 or steps S901 to S903.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. GNSS may include Global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), and/or Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, information such as the first monitoring data acquired in the sleep monitoring method provided in the embodiment of the present application, the filtered data obtained after processing, and the like is stored in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sleep monitoring function) required by at least one function, and the like. The storage data area may store data created during the use of the electronic device 100 (e.g., monitoring data acquired by the sleep monitoring function), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, cue signal playing, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic device 100 may listen to music, listen to a hands-free call, or listen to a reminder signal through the speaker 170A.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyroscope sensor 180B can be used for shooting anti-shake, and can also be used for navigation and body sensing game scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip leather case using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open can then be set according to the detected opening and closing state of the leather case or the flip cover.
The acceleration sensor 180E may monitor the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes, i.e., the x, y, and z axes). The magnitude and direction of gravity may be monitored when the electronic device 100 is stationary. The sensor can also be used to recognize the posture of the electronic device and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode or pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature handling strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to prevent an abnormal shutdown caused by the low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to prevent an abnormal shutdown caused by the low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the application processor may analyze the heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement the heart rate monitoring function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input and generate a key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention uses an Android system with a layered architecture as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 4 is a block diagram of a software structure of the electronic device 100 according to an embodiment of the present disclosure.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, a system library and Android runtime (Android runtime), and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 4, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, sports health, bluetooth, music, video, short message, etc. For example, the sports health application may include a sleep monitoring function. When the sports health application with the sleep monitoring function is installed on the electronic device, the electronic device can acquire monitoring data through the sleep monitoring function and analyze and process the monitoring data to monitor the sleep of the user.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 4, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include monitoring data obtained by the sensor module 180 (e.g., acceleration data obtained by the acceleration sensor 180E), video, images, audio, phone calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts (e.g., travel messages), and the like. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, an indicator light flashes, and so on.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), Media Libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
The following embodiments of the sleep monitoring method may be implemented on an electronic device having the above hardware/software architecture.
The following describes a sleep detection function according to an embodiment of the present application.
The sleep monitoring function on the electronic device may be set to a default on state, for example, the sleep monitoring function is in an on state as long as the electronic device is turned on. The sleep monitoring function can also be set to be on/off by user self-definition, for example, if the user self-definition time period is 22:00-07:00, the sleep monitoring function is turned on at 22:00 and turned off at 07:00 of the next day.
When the sleep monitoring function in the electronic device is turned on, the function can run in the foreground: an interactive interface of the sleep monitoring application is displayed on the electronic device, software function controls are displayed on the interactive interface, and related information of sleep monitoring is displayed in real time. Referring to fig. 5, fig. 5 is a schematic view of an application interface for sleep monitoring provided in an embodiment of the present application. As shown in fig. 5, the interface is an application interface 50 for sleep monitoring of the electronic device 100. The interface 50 may include a monitoring identifier 501, a data display area 502, and a function control 503. Wherein:
The monitoring identifier 501 is used to indicate that sleep monitoring is in progress. The monitoring identifier may be a static picture or a dynamic image, and is not specifically limited here.
The data display area 502 is used to display the monitoring data. The display form of the monitoring data may be a digital form, as shown in fig. 5 (a), or a statistical graph, as shown in fig. 5 (b). The display form of the monitoring data is not specifically limited in the embodiments of the present application.
The function control 503 is used to implement a corresponding software function when the user clicks/touches it. As shown in fig. 5, the function control 503 may be an "end" control that, when clicked by the user, turns off the sleep monitoring function.
The foreground operation mode is convenient for the user to operate, but occupies more system resources. In addition, when the sleep monitoring function is in the foreground operation mode, the user cannot operate other applications.
When the sleep monitoring function in the electronic device is turned on, the function may also run in the background. The electronic device can acquire the monitoring data of the sensors in real time through this function, and can notify the user through the display device after a message is generated in the background. The background operating mode can conserve system resources. Referring to fig. 6, fig. 6 is a schematic diagram of a home screen interface provided in an embodiment of the present application. As shown in fig. 6, the interface is a home screen interface 60 of the electronic device 100. The home screen interface 60 may include a status bar 601 and a notification bar 602. Wherein:
the status bar 601 may include information such as operator name, time, signal strength, and current remaining power.
The notification bar 602 is used to display notification information of an application, such as the notification information of the sleep monitoring application shown in fig. 6. The position, display mode, and display duration of the notification bar on the home screen interface can be customized by the user and are not specifically limited here. The user can enter the sleep monitoring application through the notification bar, for example, by clicking or long-pressing the area occupied by the notification bar.
It should be noted that the interfaces described in the embodiments of fig. 5 and fig. 6 are only examples, and are not used to limit the application interface of sleep monitoring and the home screen interface of the electronic device.
The following describes a sleep monitoring method provided in an embodiment of the present application.
In practical applications, the travel states of the user can be divided into two types: a travel state on a vehicle and a travel state not on a vehicle. When the user is located on a vehicle, the monitoring data acquired by the electronic device includes both the state information of the user and vehicle-related information. In this case, if the sleep of the user is monitored directly, the state information of the user is interfered with by the vehicle-related information, which affects the accuracy of the sleep monitoring result. When the user is not located on a vehicle, the monitoring data acquired by the electronic device includes the state information of the user and does not include vehicle-related information. In this case, the sleep of the user can be monitored using the monitoring data directly. Accordingly, the sleep monitoring method provided in the embodiments of the application may provide two sleep monitoring modes. When the user is located on a vehicle, the electronic device determines to monitor the sleep of the user on the vehicle in a first monitoring mode. The first monitoring mode refers to filtering the vehicle-related information from the monitoring data acquired by the electronic device. When the user is not located on a vehicle, the electronic device determines to monitor the sleep of the user in a second monitoring mode. In the second monitoring mode, the vehicle-related information does not need to be filtered from the monitoring data acquired by the electronic device.
In one application scenario, the user may select the monitoring mode at his or her discretion. The electronic device determines whether to adopt the first monitoring mode or the second monitoring mode for sleep monitoring according to an operation instruction input by the user. For example, refer to fig. 7, which is a schematic view of an application interface for monitoring mode selection provided in an embodiment of the present application. As shown in fig. 7, the interface is an application interface 70 for sleep monitoring of the electronic device 100. The interface 70 may include a selection information area 701 and a selection control 702. Wherein:
The selection information area 701 is used to display the names of the monitoring modes. As shown in fig. 7 (a), an "On a vehicle" mode and a "Not on a vehicle" mode are displayed in the selection information area 701. The "On a vehicle" mode corresponds to the first monitoring mode, and the "Not on a vehicle" mode corresponds to the second monitoring mode. It should be noted that the names of the monitoring modes shown in fig. 7 are only examples and are not specifically limited; in practical applications, the names of the monitoring modes may be defined otherwise, as long as they are easy for the user to distinguish.
The selection control 702 is used to implement a corresponding software function when the user clicks/touches it, and each selection control corresponds to one item of selection information. As shown in fig. 7 (a), the selection information "On a vehicle" and "Not on a vehicle" each correspond to one selection control. For example, when the user clicks/touches the selection control behind "On a vehicle", the electronic device 100 displays the interface shown in (b) of fig. 7; at this time, the electronic device determines that the user is located on a vehicle and determines to detect the sleep of the user using the first monitoring mode of the sleep monitoring function. When the user clicks/touches "Not on a vehicle", the electronic device 100 displays the interface shown in (c) of fig. 7; at this time, the electronic device determines that the user is not located on a vehicle and determines to detect the sleep of the user using the second monitoring mode of the sleep monitoring function.
In another application scenario, the electronic device determines whether the user is located on a vehicle, and determines to monitor the sleep of the user in the first monitoring mode or the second monitoring mode according to the determination result.
Fig. 8 is a schematic flowchart of a sleep monitoring method according to an embodiment of the present application. As shown in fig. 8, the sleep monitoring method may include the steps of:
S801, the electronic device judges whether the user is located on a vehicle.
S802, if the user is located on the vehicle, the electronic device determines to monitor the sleep of the user on the vehicle by adopting a first monitoring mode.
S803, if the user is not located on the vehicle, the electronic device determines to monitor the sleep of the user by adopting the second monitoring mode.
For example, the second monitoring mode may be set as the default monitoring mode, and the electronic device switches to the first monitoring mode when it determines that the user is located on the vehicle. Optionally, when the electronic device determines to adopt the first monitoring mode, a prompt message may be sent to the user to indicate that the first monitoring mode will subsequently be used for sleep monitoring. The prompt message may be one or more of a voice message, a negative one-screen message, a vibration signal, or the like. In this application scenario, sleep monitoring is more intelligent and the user experience is better.
In one embodiment of the present application, when the electronic device determines that the user is located on the vehicle, the electronic device may prompt the user whether to employ the first monitoring mode. And then determining whether to adopt the first monitoring mode according to an operation instruction input by a user. In this way, the wrong selection of the monitoring mode caused by the misjudgment of the electronic equipment can be effectively avoided.
For example, when the electronic device determines that the user is located on the vehicle, the electronic device prompts the user whether to adopt the first monitoring mode, and if the user agrees to adopt the first monitoring mode, the electronic device monitors the sleep of the user based on the instruction of the user, that is, the first monitoring mode is subsequently adopted.
Optionally, if the user does not make feedback within a specified time (e.g., 1 minute) after the electronic device prompts the user whether to adopt the first monitoring mode, the electronic device determines to adopt the first monitoring mode to monitor the sleep of the user.
For example, when the electronic device determines that the user is located on the vehicle, the electronic device prompts the user whether to adopt the first monitoring mode. If the user does not agree to adopt the first monitoring mode, the electronic device follows the user's instruction, that is, it subsequently adopts the second monitoring mode to monitor the sleep of the user.
The following describes a method for determining whether the user is located on the vehicle by the electronic device in S801. The following two methods may be included:
In the first method, the electronic device acquires the travel information of the user, and determines whether the user is located on a vehicle according to the travel information.
The travel information may include departure time, running time, arrival time, and the like of the transportation means on which the user is seated, such as takeoff/landing time of an airplane, arrival/departure time of a train, and the like. The travel information can be obtained from other applications of the electronic device, such as travel prompting short messages, ticket booking mails, ticket booking applications, negative one-screen information (e.g. event reminding information, "flight number CN 2222", "takeoff time 2020/09/10 10:00" set by the user as shown in fig. 6), and the like.
The electronic device can determine the time range during which the user is on the vehicle according to the travel information. For example, assuming that the obtained travel information indicates that the takeoff time of the airplane is 10:00 and the landing time is 14:00, it can be determined that the user is on the airplane from 10:00 to 14:00. As another example, assuming that the obtained travel information is "Beijing-Shanghai, G2222", the electronic device can query the schedule of train G2222 on the Internet according to the travel information and determine from the queried schedule that the journey from Beijing to Shanghai is 10:00-17:00, so that it can be determined that the user is on a high-speed train from 10:00 to 17:00.
Accordingly, the electronic device may preset the operating time of the monitoring mode according to the determined time range. For example, assuming that the electronic device has determined that the user is in the vehicle for a time ranging from 10:00 to 14:00, the electronic device may switch the monitoring mode from the second monitoring mode to the first monitoring mode at 10:00 and from the first monitoring mode to the second monitoring mode at 14:00.
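Purely as an illustration of this time-window-based mode switching, a minimal Python sketch is given below. The window boundaries, mode names, and the select_mode helper are assumptions made for illustration and are not defined by this application.

```python
from datetime import datetime, time

# Assumed time window derived from the travel information (10:00-14:00 flight).
ON_VEHICLE_START = time(10, 0)
ON_VEHICLE_END = time(14, 0)

def select_mode(now: datetime) -> str:
    """Return the monitoring mode that should be active at the given moment.

    The second monitoring mode is the default; the first monitoring mode is
    used only inside the time range during which the user is on the vehicle.
    """
    if ON_VEHICLE_START <= now.time() <= ON_VEHICLE_END:
        return "first_monitoring_mode"   # vehicle-related information must be filtered
    return "second_monitoring_mode"      # no filtering needed

if __name__ == "__main__":
    print(select_mode(datetime(2020, 9, 10, 11, 30)))  # first_monitoring_mode
    print(select_mode(datetime(2020, 9, 10, 15, 0)))   # second_monitoring_mode
```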
In one embodiment of the present application, the electronic device may switch from the first monitoring mode to the second monitoring mode when the user is away from the vehicle. I.e. exit the first monitoring mode and enable the second monitoring mode. In one embodiment of the present application, the electronic device may automatically leave the first monitoring mode and automatically enable the second monitoring mode.
In another embodiment of the present application, the electronic device may leave the first monitoring mode and enable the second monitoring mode under the direction of the user.
For example, before the electronic device exits the first monitoring mode, the user may be prompted that the second monitoring mode will be entered after n hours. If the user agrees to enter the second monitoring mode, the electronic device enters the second monitoring mode after n hours. If the user does not agree to enter the second monitoring mode, the electronic device continues to monitor the sleep of the user in the first monitoring mode after n hours. Alternatively, the electronic device automatically enters the second monitoring mode a preset time after prompting the user. Or, after the electronic device prompts the user, if the user does not give feedback within a specified time, the electronic device enters the second monitoring mode after n hours.
Since the user will typically board the vehicle before the vehicle departs (e.g., for a 10:00 airplane takeoff time, the user boards the airplane at 09:45), the time range during which the user is actually on the vehicle is usually larger than the time range determined according to the travel information. In order to ensure the accuracy of the monitoring, the determined time range of the user on the vehicle (i.e. the working time of the first monitoring mode) can be extended appropriately. For example, assuming that the takeoff time of the aircraft is determined to be 10:00 according to the travel information, the electronic device may set the monitoring mode to the first monitoring mode at 09:30.
When the travel information of the user cannot be acquired, the second method is used for the determination.
In the second method, the electronic device acquires motion information of the user and determines whether the user is on a vehicle according to the motion information.
The motion information may include motion data and health data.
For example, the processor in the electronic device may acquire motion data (e.g., motion speed, motion time, motion route, etc.) of the user through an acceleration sensor (e.g., 180E shown in fig. 3), a distance sensor (e.g., 180F shown in fig. 3), etc. installed on the electronic device, and the processor in the electronic device may acquire health data (e.g., heart rate value, etc.) of the user through a bone conduction sensor (e.g., 180M shown in fig. 3), etc. installed on the electronic device.
The electronic device can determine a speed threshold of the movement speed of the user according to the acquired movement data of the user. The electronic device may determine a normal heart rate range of the user according to the obtained health data of the user (the normal heart rate range in the embodiment of the present application refers to a range of heart rate values of the user in a non-exercise state).
For example, the electronic device can monitor the motion data and the health data of the user through the sports health application, determine the speed threshold of the user's motion speed and the normal heart rate range of the user from these data, and then judge whether the user is on a vehicle according to the determined speed threshold and normal heart rate range.
In one application scenario, a user may set a time range allowing the electronic device to acquire the exercise data and the health data of the user through the sleep monitoring function. For example: the settings page of the sleep monitoring function in the electronic device gives the options "always allowed", "allowed only during use". After the user selects the option of 'always allow', the electronic device can also acquire the health data of the user in real time in the background through the sleep monitoring function even if the sleep monitoring function is in the off state in the foreground. When the user selects the "allow only during use" option, the electronic device may acquire the health data of the user through the sleep monitoring function only when the sleep monitoring function is in an on state in the foreground.
Typically, when the user is in an exercise state (e.g., running), the motion speed of the user monitored by the electronic device is high and the heart rate value of the user exceeds the normal heart rate range. When the user is located on a vehicle, the motion speed of the user monitored by the electronic device is also high, but the heart rate value of the user stays within the normal heart rate range.
In an embodiment of the present application, one way for the electronic device to determine whether the user is located on a vehicle according to the motion information may be: when the user's motion speed is greater than the speed threshold and the user's heart rate value is within the normal heart rate range, the electronic device determines that the user is on a vehicle.
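A minimal sketch of this determination is given below; the threshold values, heart rate range, and function name are illustrative assumptions, not values defined by this application.

```python
def is_on_vehicle(speed_kmh: float, heart_rate: float,
                  speed_threshold: float = 20.0,
                  hr_range: tuple = (55.0, 90.0)) -> bool:
    """Judge whether the user is on a vehicle from motion and health data.

    High movement speed together with a heart rate inside the normal (resting)
    range suggests the user is being carried by a vehicle rather than exercising.
    """
    hr_low, hr_high = hr_range
    moving_fast = speed_kmh > speed_threshold
    heart_rate_normal = hr_low <= heart_rate <= hr_high
    return moving_fast and heart_rate_normal

# Example: 60 km/h with a resting heart rate -> on a vehicle;
# 12 km/h with an elevated heart rate -> exercising, not on a vehicle.
print(is_on_vehicle(60.0, 72.0))   # True
print(is_on_vehicle(12.0, 150.0))  # False
```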
When the vehicle is in a stationary state or a low-speed driving state, because the movement speed of the user obtained by the electronic device is small and the heart rate value of the user is within the normal heart rate range, the electronic device may not determine whether the user is on the vehicle by using the second method, and thus the electronic device may not accurately determine the time range of the user on the vehicle.
In order to improve the monitoring accuracy, a caching mechanism may be provided, that is, data can be cached for a period of time. For example, assuming that it is determined from the motion information that the user is on the vehicle at 10:00 and the buffering duration is 0.5 h, 09:30 can be recorded as the time at which the user boarded the vehicle. This corresponds to extending the operating time of the first monitoring mode by 0.5 h.
Of course, in order to ensure the accuracy of the identified time at which the user is on the vehicle, the first method and the second method can be combined and the identification results of the two methods considered together.
The first monitoring mode in S802 is described below. Fig. 9 is a schematic flowchart of a sleep monitoring method in the first monitoring mode according to an embodiment of the present application. As shown in fig. 9, by way of example and not limitation, the electronic device determines to detect sleep of the user in the first monitoring mode when the user is located on a vehicle. In the first monitoring mode, the method specifically includes the following steps:
S901, the electronic equipment acquires first monitoring data of a user on a vehicle.
The processor in the electronic device can acquire various sensor data such as acceleration data, speed data, heart rate value and the like in real time through the sensor module installed on the electronic device. The electronic device can acquire sensor data in real time whether the user is on the vehicle or not. In the first monitoring mode, however, the data used for sleep monitoring is actually sensor data acquired while the user is on the vehicle. Therefore, the first monitored data in the embodiment of the present application is data monitored when the user is on the vehicle.
For example: the working time of the sleep monitoring function in the electronic equipment is 06:00-22:00, and the electronic equipment always collects sensor data in the time period of 06:00-22:00. Assuming that the electronic device determines that the user is located on the vehicle within the time period of 09:00-10:00, the electronic device records the monitoring data acquired by the electronic device within the time period of 09:00-10:00 as first monitoring data, and records the monitoring data acquired by the electronic device within the time periods of 06:00-09:00 and 10:00-22:00 as second monitoring data.
The first monitoring data includes both the state information of the user and the vehicle-related information, and may include measurement error data due to the hardware configuration of the sensor itself. The vehicle-related information may include acceleration data generated by shaking of the vehicle, acceleration data of the vehicle itself during traveling of the vehicle, and the like.
Since the user is located on the vehicle, the first monitoring data is influenced not only by the user's motion but also by the vehicle-related information. Therefore, it is necessary to filter out the vehicle-related information from the first monitored data using the method in S902 described below.
And S902, the electronic equipment obtains target data according to the first monitoring data.
The target data is data obtained after vehicle-related information in the first monitoring data is filtered out.
To monitor the sleep of the user, the state information of the user is needed. Therefore, the vehicle-related information can be regarded as a kind of noise data, and the first monitoring data can be regarded as data in which the state information of the user is mixed with the noise data. The process of filtering the vehicle-related information out of the first monitoring data can then be regarded as a process of denoising the first monitoring data.
In one embodiment, a manner of obtaining, by the electronic device, the target data according to the first monitoring data in S902 may include the following steps:
I. The electronic device obtains driving data of a vehicle.
The driving data of the vehicle can be used to characterize the driving state of the vehicle and is the data that interferes most with the state information of the user. The driving data of the vehicle may include speed data and/or acceleration data of the vehicle, etc.
The data attributes of the driving data of the vehicle are consistent with the data attributes of the first monitoring data. For example, if the first monitoring data includes acceleration data, the corresponding driving data of the vehicle also includes acceleration data; if the first monitoring data includes speed data, the corresponding driving data of the vehicle also includes speed data. In other words, if the speed data in the first monitoring data needs to be denoised, the electronic device needs to acquire the speed data of the vehicle; if the acceleration data in the first monitoring data needs to be denoised, the electronic device needs to acquire the acceleration data of the vehicle.
The speed is used to represent the amount of change in displacement per unit time, and the acceleration is used to represent the amount of change in speed per unit time. In contrast, the acceleration data can reflect the action state of the user more sensitively. Therefore, in the embodiment of the present application, the travel data of the vehicle may be acceleration data of the vehicle.
Typically, the acceleration sensor acquires three-axis accelerations, i.e., an x-axis acceleration, a y-axis acceleration, and a z-axis acceleration. Therefore, the three-axis acceleration may be included in the travel data of the vehicle. Accordingly, the first monitoring data may also include three-axis acceleration.
The travel data of the vehicle can represent a travel state of the vehicle, and the travel state of the vehicle may include both a stationary state and a travel state. When the vehicle is running, the acceleration of the vehicle itself is large, and therefore, the state information of the user is greatly influenced. Therefore, the travel data of the vehicle generally refers to acceleration data when the vehicle travels. However, in practical applications, when the vehicle is stationary, the vehicle itself usually has a certain acceleration due to slight shaking of the vehicle itself, sensor measurement errors, and the like, and further the state information of the user is affected.
In order to obtain accurate state information of the user and improve monitoring accuracy, in the embodiment of the application, the driving data of the vehicle may include both first acceleration data for indicating that the vehicle is in a stationary state and second acceleration data for indicating that the vehicle is in a driving state. Therefore, in the subsequent denoising process, the data interference caused in the driving process of the vehicle is filtered, and the data interference existing when the vehicle is static is also filtered, so that the target data obtained after filtering can more accurately reflect the action state of the user.
The following describes the acquisition methods of the first acceleration data and the second acceleration data, respectively.
The acquisition mode of the first acceleration data may include, but is not limited to, the following two modes:
In the first mode, the electronic device determines the departure time of the vehicle according to the travel information of the user; within a preset time before the departure time of the vehicle, the electronic device acquires acceleration data in real time and records the acceleration data as candidate data; the electronic device performs noise-reduction filtering processing on the candidate data; the electronic device judges whether the user is relatively still according to the candidate data after the noise-reduction filtering processing; and if the user is relatively still, the electronic device records the corresponding candidate data obtained while the user is relatively still as the first acceleration data.
Typically, the user is not in an absolutely stationary state. The user's body may make a slight movement due to the user's breathing, small movements of the limbs, and shaking of the vehicle. But these slight actions do not result in the electronic device acquiring relatively large acceleration data. Therefore, after the electronic equipment performs noise reduction filtering processing on the acquired acceleration data, the acceleration data generated by slight actions of the user can be effectively filtered.
If the vehicle is in a driving state, the value of the acceleration data acquired by the electronic device worn by the user is relatively large. If the vehicle is stationary, the electronic device worn by the user acquires acceleration data having a relatively small value.
In an embodiment of the present application, the determining, by the electronic device, whether the user is relatively still according to the candidate data after the noise reduction filtering processing may include: and if the variation of the acceleration of any two axes in the candidate data is smaller than a first preset value, the electronic equipment judges that the user is relatively still. The amount of change in the uniaxial acceleration is small, which means that the range of motion of the user in a certain direction is small. The variation of the acceleration of any two shafts is small, which indicates that the user is in a relatively static state and the vehicle is in a static state at the moment. The first preset value can be preset according to the requirement of monitoring precision.
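For illustration, one possible realization of this relative-stillness check is sketched below. The first preset value, the use of the max-min range as the "variation", and the array layout are assumptions, not values fixed by this application.

```python
import numpy as np

def is_relatively_still(candidate: np.ndarray, first_preset: float = 0.05) -> bool:
    """Judge relative stillness from noise-filtered candidate acceleration data.

    `candidate` is an (N, 3) array of x/y/z accelerations. If the variation
    (max - min) of any two of the three axes is below the first preset value,
    the user is regarded as relatively still.
    """
    variation = candidate.max(axis=0) - candidate.min(axis=0)  # per-axis range
    return int((variation < first_preset).sum()) >= 2

# Example with nearly constant readings on all three axes (vehicle stationary).
samples = np.array([[0.01, 0.02, 9.80],
                    [0.02, 0.01, 9.81],
                    [0.01, 0.02, 9.79]])
print(is_relatively_still(samples))  # True
```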
If the travel information of the user cannot be acquired, the first acceleration data is acquired according to the second mode.
In the second mode, the electronic device acquires acceleration data in real time, records the acceleration data as candidate data, and caches it; the electronic device performs noise-reduction filtering processing on the candidate data; the electronic device judges whether the vehicle is in a driving state according to the candidate data after the noise-reduction filtering processing; if the vehicle is judged to be in the driving state, the electronic device determines the departure time of the vehicle and obtains the cached data before the departure time; the electronic device judges whether the user is relatively still according to the cached data; and if the user is relatively still, the electronic device records the corresponding candidate data obtained while the user is relatively still as the first acceleration data.
In an embodiment of the present application, the determining, by the electronic device, whether the vehicle is in a driving state according to the candidate data after the noise reduction filtering process may include: and if the variation of the acceleration of a certain axis in the candidate data is larger than a second preset value, the electronic equipment judges that the vehicle is in a driving state. The second preset value may be preset with reference to a traveling speed of the vehicle and a moving speed of the user.
Optionally, the electronic device may further determine whether the vehicle is in a driving state according to the acquired data of the gyro sensor. The gyro sensor can monitor the shift of the center of gravity of the human body, and the center of gravity of a user on the vehicle shifts when the vehicle starts. The electronic device can therefore judge whether the vehicle is in a driving state through the center-of-gravity offset monitored by the gyro sensor. For example, if the electronic device monitors a continuous and regular angle offset through the gyro sensor and the angle offset is larger than a preset offset, the electronic device determines that the vehicle is in a driving state. The angle offset of the gyro sensor generated when the vehicle starts is usually greater than the angle offset generated when the user walks, so the preset offset can be set in advance with reference to the angle offset generated when the user walks.
The electronic device determines whether the user is relatively still according to the cached data, and may refer to the above-mentioned manner in which the electronic device determines whether the user is relatively still according to the candidate data after the noise reduction filtering processing, which is not described herein again.
In practice, the user may be in motion throughout the period from the time the user boards the vehicle to the departure time of the vehicle, i.e. there may be no period in which the user is relatively still. In this case, the first acceleration data may be set to a first preset acceleration. The first preset acceleration may be 0, a small constant value, a random value, or the like.
The acquisition mode of the second acceleration data may include, but is not limited to, the following two modes:
In the first mode, after determining that the vehicle is in a driving state, the electronic device acquires acceleration data in real time and records the acceleration data as data to be monitored; the electronic device then determines the single-axis acceleration with the largest variation in the data to be monitored as the second acceleration data.
The first mode is more suitable for the acceleration and deceleration phases of the vehicle, since the acceleration of the vehicle varies more during acceleration and deceleration. When the vehicle moves at a constant speed, however, the acceleration of the vehicle often varies less, and the second acceleration data obtained in the first mode may not be accurate.
In the second mode, after determining that the vehicle is in a driving state, the electronic device acquires Global Positioning System (GPS) data; the electronic device calculates a travel distance from the GPS data and calculates the second acceleration data from the travel distance.
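A minimal sketch of deriving acceleration from GPS-based travel distance is given below; the equally spaced sampling layout and the helper name are illustrative assumptions.

```python
import numpy as np

def acceleration_from_gps(distances_m: np.ndarray, dt_s: float) -> np.ndarray:
    """Estimate vehicle acceleration from cumulative GPS travel distances.

    `distances_m` holds the travel distance (in metres) at equally spaced
    sampling instants, `dt_s` is the sampling interval in seconds. Speed is the
    first difference of distance, acceleration the first difference of speed.
    """
    speed = np.diff(distances_m) / dt_s   # m/s
    acceleration = np.diff(speed) / dt_s  # m/s^2
    return acceleration

# Example: a vehicle accelerating uniformly (distance grows quadratically).
t = np.arange(0.0, 10.0, 1.0)
distance = 0.5 * 1.2 * t ** 2                     # a = 1.2 m/s^2
print(acceleration_from_gps(distance, dt_s=1.0))  # ~[1.2, 1.2, ...]
```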
The method for determining whether the vehicle is in the driving state may refer to the description of the method for acquiring the first acceleration data, and will not be described herein again.
In practical applications, during the period from the time the user boards the vehicle to the departure time of the vehicle, the vehicle is in a stationary state, i.e., no second acceleration data is generated. In this case, the second acceleration data may be set to a second preset acceleration. The second preset acceleration may be 0, a small constant value, a random value, or the like.
In addition, in practical applications, in order to improve the accuracy of the acquired second acceleration data, two modes, i.e., the first mode and the second mode, and other possible modes may be considered together.
II. The electronic device determines a data component of the first monitoring data that matches the travel data as vehicle-related information.
III. The electronic device filters the vehicle-related information out of the first monitoring data to obtain the target data.
The sensor module, as in the embodiment described in fig. 3, typically acquires sensor data at a sampling frequency. Thus, one or more sensor data may be included in the first monitoring data, each sensor data in the first monitoring data corresponding to a sampling instant. Correspondingly, the data in the driving data correspond to the data in the first monitoring data one by one. For example, as described in S902, the first monitoring data and the driving data may each include acceleration data. It is assumed that the first monitored data includes acceleration data corresponding to the sampling times t1 and t2, and accordingly, the driving data also includes acceleration data corresponding to the sampling times t1 and t 2.
The data component matching the travel data includes a data component equivalent to the travel data correspondence, and a data component proportional to the travel data correspondence.
Correspondingly equal means that the data values of the two at the same sampling instant are equal. For example, assume that the acceleration data values corresponding to the sampling instants t1 and t2 in the driving data are p and q; then the acceleration data values corresponding to t1 and t2 in the data component matching the driving data are also p and q.
Correspondingly proportional means that the data values of the two at the same sampling instant are proportional. For example, assume that the acceleration data values corresponding to the sampling instants t1 and t2 in the driving data are p and q; then the acceleration data values corresponding to t1 and t2 in the data component matching the driving data are, for example, 5p and 5q.
Optionally, one implementation manner of obtaining the target data from the first monitoring data by the electronic device is as follows:
The electronic device determines the data component matching the driving data in the first monitoring data as the vehicle-related information, and then subtracts that data component from the first monitoring data. The difference data obtained by the subtraction is the target data.
The electronic device may process the three-axis acceleration when subtracting the data component matched with the driving data from the first monitoring data. Specifically, the electronic device subtracts the x-axis acceleration of the data component matched with the running data from the x-axis acceleration of each sampling time in the first monitoring data, subtracts the y-axis acceleration of the data component matched with the running data from the y-axis acceleration of each sampling time in the first monitoring data, and subtracts the z-axis acceleration of the data component matched with the running data from the z-axis acceleration of each sampling time in the first monitoring data.
Alternatively, when subtracting the data component matched with the driving data from the first monitoring data, the electronic device may first fuse the three-axis accelerations and then process the fused accelerations. Specifically, the electronic device fuses the x-axis, y-axis, and z-axis accelerations corresponding to each sampling moment in the first monitoring data into one acceleration to obtain a first fused acceleration; fuses the x-axis, y-axis, and z-axis accelerations corresponding to each sampling moment in the data component matched with the driving data into one acceleration to obtain a second fused acceleration; and then subtracts the second fused acceleration from the first fused acceleration at each sampling moment.
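A simplified sketch of the fuse-then-subtract variant is given below; the choice of the vector magnitude as the fusion statistic and the array names are assumptions made for illustration, not requirements of this application.

```python
import numpy as np

def fuse_triaxial(acc_xyz: np.ndarray) -> np.ndarray:
    """Fuse the x/y/z accelerations at each sampling instant into one value.

    Here the vector magnitude (root of the sum of squares) is used as the
    fused value; other statistics could be chosen instead.
    """
    return np.linalg.norm(acc_xyz, axis=1)

def subtract_driving_component(first_monitoring: np.ndarray,
                               driving_component: np.ndarray) -> np.ndarray:
    """Subtract the fused driving-data component from the fused monitoring data."""
    return fuse_triaxial(first_monitoring) - fuse_triaxial(driving_component)

# Both arrays are (N, 3) and aligned by sampling moment.
first_monitoring = np.array([[0.1, 0.2, 9.9], [0.3, 0.1, 10.2]])
driving_component = np.array([[0.0, 0.1, 9.8], [0.2, 0.0, 9.9]])
print(subtract_driving_component(first_monitoring, driving_component))
```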
The method for obtaining the target data from the first monitoring data by the electronic equipment has low algorithm complexity and is easy to implement. However, the user's status information and vehicle-related information are typically not simply superimposed but rather are intermingled. If the data component matching the travel data is directly subtracted from the first monitored data, the resulting target data may not accurately reflect the user's action state.
In order to improve the monitoring accuracy, optionally, another implementation of obtaining the target data from the first monitoring data by the electronic device may include the following steps:
the electronic equipment constructs a noise data matrix according to the first acceleration data and the second acceleration data; the electronic equipment inputs the noise data matrix and the first monitoring data into a preset decomposition model to obtain a data component matched with the driving data in the first monitoring data; the electronic device determines a data component of the first monitoring data that matches the travel data as vehicle-related information, and the electronic device determines a data component of the first monitoring data that does not match the travel data as target data.
As described in I, the traveling data includes the first acceleration data and the second acceleration data. The first acceleration data and the second acceleration data may each include a plurality of data.
Optionally, one way for the electronic device to construct the noise data matrix according to the first acceleration data and the second acceleration data may be: using the data in the first acceleration data and the second acceleration data as elements of the noise data matrix.
For example, it is assumed that (x1, y1, z1), (x2, y2, z2) two sets of three-axis accelerations are included in the first acceleration data, and (x3, y3, z3), (x4, y4, z4), (x5, y5, z5) three sets of three-axis accelerations are included in the second acceleration data. An example of a noisy data matrix is given as follows:
[ x1 y1 z1 ]
[ x2 y2 z2 ]
[ x3 y3 z3 ]
[ x4 y4 z4 ]
[ x5 y5 z5 ]
The above noise data matrix is only an example and is not intended to specifically limit the form of the noise data matrix.
Optionally, another way for the electronic device to construct the noise data matrix according to the first acceleration data and the second acceleration data may be: the electronic equipment respectively carries out data statistics on the first acceleration data and the second acceleration data, and then constructs a noise data matrix according to the data after statistics.
For example, it is assumed that (x1, y1, z1), (x2, y2, z2) two sets of three-axis accelerations are included in the first acceleration data, and (x3, y3, z3), (x4, y4, z4), (x5, y5, z5) three sets of three-axis accelerations are included in the second acceleration data. The electronic device can first fuse each group of three-axis accelerations in the first acceleration data into one acceleration value to obtain a1 and a2, fuse each group of three-axis accelerations in the second acceleration data into one acceleration value to obtain a3, a4 and a5, and then construct a1, a2, a3, a4 and a5 into a noise data matrix. Since the amount of data in the first acceleration data is not equal to that in the second acceleration data, the matrix can be filled with a preset value. An example of a noise data matrix is given as follows:
[ a1 a2 0 ]
[ a3 a4 a5 ]
the predetermined value in the noise data matrix is 0.
The above-mentioned manner of fusing each group of three-axis accelerations into one acceleration value may be: determining the statistical characteristic value of each group of three-axis accelerations as the fused acceleration value. The statistical characteristic value may be a mean value, a variance, a median, a root mean square, and the like.
The above-mentioned noise data matrix is only an example, and is not used to specifically limit the form of the preset numerical value and the noise data matrix.
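As an illustration of the second construction described above, the sketch below fuses each group of three-axis accelerations into one value and pads the shorter row with a preset value. The use of the mean as the fusion statistic, the 2-row layout, and the padding value of 0 are assumptions for illustration only.

```python
import numpy as np

def build_noise_matrix(first_acc: np.ndarray, second_acc: np.ndarray,
                       pad_value: float = 0.0) -> np.ndarray:
    """Build a noise data matrix from the first and second acceleration data.

    Each group of three-axis accelerations is fused into one value (the mean is
    used here as the statistical characteristic value); the shorter row is
    padded with a preset value so the two rows have equal length.
    """
    row_static = first_acc.mean(axis=1)    # fused values a1, a2, ...
    row_driving = second_acc.mean(axis=1)  # fused values a3, a4, a5, ...
    width = max(len(row_static), len(row_driving))
    matrix = np.full((2, width), pad_value)
    matrix[0, :len(row_static)] = row_static
    matrix[1, :len(row_driving)] = row_driving
    return matrix

first_acc = np.array([[0.01, 0.02, 9.80], [0.02, 0.01, 9.81]])               # 2 groups
second_acc = np.array([[0.5, 0.2, 9.9], [0.8, 0.3, 10.1], [0.6, 0.1, 9.8]])  # 3 groups
print(build_noise_matrix(first_acc, second_acc))
```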
Because the preset decomposition model needs to process both the noise data matrix and the first monitoring data, in order to ensure the consistency of data dimensions, when the form of the noise data matrix is set, the dimensions of the noise data matrix can be set to match the dimensions of the first monitoring data according to the data-processing requirements of the preset decomposition model. For example, suppose the preset decomposition model requires the noise data matrix to be multiplied by the first monitoring data, and the first monitoring data is (x0, y0, z0). If the first monitoring data is formed into a 3 × 1 vector, the dimensions of the noise data matrix need to satisfy n × 3 (n is a positive integer); if the first monitoring data is formed into a 1 × 3 vector, the dimensions of the noise data matrix need to satisfy 3 × n accordingly.
The preset decomposition model can be an algorithm model capable of achieving a data denoising function, namely data in the noise data matrix are used as noise data, and denoising processing is conducted on the first monitoring data according to the noise data. The preset decomposition model may also be an algorithm model capable of implementing a signal decomposition function, that is, the first monitoring data is decomposed into a noise part (i.e., driving data) and a non-noise part (i.e., target data), such as a neural network, a variational modal decomposition model, a classical modal decomposition model, and the like.
Taking a preset decomposition model that implements a signal decomposition function as an example, see fig. 10, which is a schematic diagram of a decomposition result provided in the embodiment of the present application. In fig. 10, (a) is a data curve fitted to the first monitoring data, (b) is a data curve fitted to the driving data, (c) is a data curve fitted to the data component, separated from the first monitoring data, that matches the driving data, and (d) is a data curve fitted to the target data. As can be seen from fig. 10, the preset decomposition model decomposes the first monitoring data into a data component matching the driving data and a data component not matching the driving data (i.e., the target data), which is essentially the process of separating a noise part and a non-noise part from the first monitoring data. Separating the data component that matches the driving data from the first monitoring data amounts to filtering out the vehicle-related information in the first monitoring data. The decomposed data component that does not match the driving data (i.e., the target data) corresponds to the user state information obtained after the vehicle-related information is filtered out.
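The application does not fix a particular decomposition algorithm. Purely as an illustration, the sketch below splits a monitored signal into a component proportional to the driving data (fitted by least squares) and a residual treated as the target data; this simple stand-in is an assumption and is not the decomposition model described in this application.

```python
import numpy as np

def decompose(first_monitoring: np.ndarray, driving: np.ndarray):
    """Split the monitored signal into a driving-matched component and a residual.

    A scale factor k is fitted by least squares so that k * driving best explains
    the monitored signal; k * driving is taken as the vehicle-related part and
    the residual as the target data.
    """
    k = float(np.dot(driving, first_monitoring) / np.dot(driving, driving))
    matched = k * driving                 # component matching the driving data
    target = first_monitoring - matched   # component not matching -> target data
    return matched, target

rng = np.random.default_rng(0)
driving = np.sin(np.linspace(0, 6, 200))       # stand-in vehicle acceleration
user = 0.3 * (rng.random(200) < 0.05)          # sparse user body movements
monitored = 2.0 * driving + user               # mixed first monitoring data
matched, target = decompose(monitored, driving)
print(np.allclose(target, user, atol=0.05))    # residual recovers the user movements
```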
In order to improve monitoring efficiency and monitoring accuracy, the decomposition model may be trained in advance. For example: sensor data when the user is not in the vehicle can be acquired as sample data of the state information of the user; acquiring sensor data of a vehicle as sample data of driving data; fusing the sample data of the driving data serving as noise data into the sample data of the state information of the user, and using the fused data as the sample data of the first monitoring data; and then training the decomposition model according to the sample data of the state information of the user, the sample data of the driving data and the sample data of the monitoring data.
According to the target data obtained by the method, the action state of the user can be accurately reflected. The sleep of the user is monitored according to the target data, the influence of the acceleration of the vehicle on the action of the user can be avoided, and the accuracy of sleep monitoring is effectively improved.
And S903, the electronic equipment monitors the sleep of the user according to the target data.
Since acceleration data reflect the action state of the user sensitively, even small-amplitude actions of the user (such as breathing or slight shaking) are still clearly reflected in the acceleration data, which interferes with sleep monitoring. In order to improve the monitoring accuracy, noise-reduction filtering can be performed on the target data to filter out the influence of such small-amplitude actions on sleep monitoring. Furthermore, since the values of the acceleration data are varied and the data calculation is complex, the target data after the noise-reduction filtering may be normalized in order to simplify the data processing. For example, target data smaller than a certain threshold is recorded as 0, and target data greater than or equal to the threshold is recorded as 1.
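For illustration, the sketch below uses a moving average as a stand-in for the noise-reduction filtering and then binarizes the result; the window length and threshold are assumptions, not values specified by this application.

```python
import numpy as np

def binarize_target(target: np.ndarray, window: int = 5,
                    threshold: float = 0.1) -> np.ndarray:
    """Smooth the target data and normalize it to 0/1 activity markers.

    A moving-average filter stands in for the noise-reduction filtering; samples
    below the threshold are marked 0 (no significant body movement), samples at
    or above it are marked 1.
    """
    kernel = np.ones(window) / window
    smoothed = np.convolve(np.abs(target), kernel, mode="same")
    return (smoothed >= threshold).astype(np.int8)

target = np.array([0.02, 0.01, 0.5, 0.6, 0.02, 0.01, 0.0, 0.3, 0.02, 0.01])
print(binarize_target(target))
```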
In practical applications, the electronic device could perform sleep monitoring according to the target data at each sampling moment. However, such data processing is too frequent, the amount of calculation is large, and a monitoring result obtained from the data at a single sampling moment is subject to chance. Therefore, the electronic device can perform sleep monitoring according to the target data at a plurality of sampling moments.
Specifically, S903 may include the following steps:
when the data amount in the target data reaches a preset amount, the electronic equipment monitors the sleep of the user according to the target data; and when the data quantity in the target data does not reach the preset quantity, the electronic equipment continues to acquire the first monitoring data.
Optionally, a caching mechanism may be provided. Because the sampling frequency is fixed, only the buffer duration needs to be set in order to limit the amount of data.
Accordingly, S903 may include: when the cache duration reaches a preset duration, the electronic equipment monitors the sleep of the user according to the target data; and when the cache duration does not reach the preset duration, the electronic equipment continues to acquire the first monitoring data.
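A minimal sketch of such a caching mechanism is given below; the class name, durations, and sampling rate are assumptions made for illustration.

```python
from collections import deque

class TargetDataBuffer:
    """Accumulate target-data samples until a preset buffer duration is reached.

    The sampling frequency is fixed, so limiting the buffer duration also limits
    the amount of data: capacity = duration * sampling_rate samples.
    """
    def __init__(self, duration_s: float, sampling_rate_hz: float):
        self.capacity = int(duration_s * sampling_rate_hz)
        self.samples = deque()

    def push(self, sample: float) -> bool:
        """Add one sample; return True when enough data is buffered to monitor."""
        self.samples.append(sample)
        return len(self.samples) >= self.capacity

    def drain(self):
        """Hand the buffered samples to the sleep-monitoring step and reset."""
        data, self.samples = list(self.samples), deque()
        return data

buf = TargetDataBuffer(duration_s=3.0, sampling_rate_hz=2.0)  # capacity of 6 samples
for sample in [0, 1, 0, 0, 1, 0, 1]:
    if buf.push(sample):
        print("monitor on", buf.drain())
```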
Further, an implementation manner of monitoring the sleep of the user according to the target data may be: and inputting the target data into a preset monitoring model to obtain an output monitoring result. The preset monitoring model may be a neural network model, a clustering model, or the like, and the monitoring result may be used to indicate that the user is in a waking state or a sleeping state.
As described above, the target data includes data at a plurality of sampling times, and the user may fall asleep or wake up at any of these sampling times. With the method described so far, only the overall sleep state corresponding to the target data can be determined; the specific times at which the user falls asleep and wakes up cannot be determined.
In order to solve the above problem, optionally, another implementation manner in which the electronic device monitors the sleep of the user according to the target data may be that the electronic device obtains multiple sets of data according to the target data; the electronic equipment determines a first state label corresponding to each group of data according to the statistical characteristic value corresponding to each group of data in the multiple groups of data; and the electronic equipment determines a sleep monitoring result according to the first state label corresponding to each group of data.
The electronic device obtains the multiple sets of data by dividing the target data into multiple groups.
The statistical characteristic value may be a median, a root mean square, an average, a movement frequency, a spectral energy, or the like. In addition, to make the statistical characteristics of the data more obvious, after the first monitoring data is acquired in S901 and the driving data of the vehicle is acquired, band-pass filtering may be applied to the acquired first monitoring data and driving data to remove extremely low and extremely high frequencies.
The first state label includes a sleep state and an awake state. The sleep monitoring result includes the fall-asleep time, the sleep duration, and the like. The first state label corresponding to each group of data may be determined using a neural network, a clustering algorithm, a decision tree algorithm, or the like. In this way, the times at which the user falls asleep and wakes up can be determined more accurately.
Illustratively, assume that the target data corresponds to the time period 10:00 to 10:50 and is divided into 10 groups: the first group contains the data of 10:00-10:05, the second group the data of 10:06-10:10, and so on, with the tenth group containing the data of 10:46-10:50 (that is, the target data is divided by time, each group covering 5 min). The electronic device calculates the statistical characteristic value of each group, obtaining 10 statistical characteristic values, and determines the first state label corresponding to each of them: for example, the labels of the first and second characteristic values are the awake state, the labels of the third through eighth characteristic values are the sleep state, and the labels of the ninth and tenth characteristic values are the awake state. The electronic device can then determine from the first state labels that the user fell asleep at the start time corresponding to the third group of data (i.e., 10:15) and woke up at the end time corresponding to the seventh group of data (i.e., 10:35). Further, the electronic device may determine that the user's sleep duration is 20 min.
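The grouping-and-labelling procedure illustrated above might look as follows in simplified form; the root-mean-square feature and the fixed 0.2 threshold stand in for the statistical characteristic value and the trained classifier, and the 5-minute group length follows the example.

```python
import numpy as np

def monitor_sleep(target_data: np.ndarray, start_ts: float,
                  sample_rate_hz: int = 25, group_minutes: int = 5):
    """Split target data into fixed-length groups, label each group
    (1 = awake, 0 = sleep) and derive the fall-asleep / wake-up times."""
    group_len = sample_rate_hz * 60 * group_minutes
    n_groups = len(target_data) // group_len
    groups = target_data[:n_groups * group_len].reshape(n_groups, group_len)

    # Statistical characteristic value per group: here the root mean square;
    # the median, mean, movement frequency or spectral energy would also do.
    rms = np.sqrt(np.mean(groups.astype(float) ** 2, axis=1))

    # Stand-in classifier: little residual motion -> sleep. A trained neural
    # network, clustering model or decision tree would replace this rule.
    labels = (rms >= 0.2).astype(int)          # 1 = awake, 0 = sleep

    sleep_idx = np.flatnonzero(labels == 0)
    if sleep_idx.size == 0:
        return None                            # no sleep detected
    fall_asleep = start_ts + sleep_idx[0] * group_minutes * 60
    wake_up = start_ts + (sleep_idx[-1] + 1) * group_minutes * 60
    return {"fall_asleep": fall_asleep,
            "wake_up": wake_up,
            "sleep_minutes": (wake_up - fall_asleep) / 60}
```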
It should be noted that the above is merely one example of how the electronic device determines the sleep monitoring result from the target data; the division rule of the target data and the like are not specifically limited. Naturally, the more groups the target data is divided into, the more accurate the monitored fall-asleep and wake-up times, but the larger the amount of data processing.
The sleep monitoring described above relies on acceleration data. To further improve accuracy, PPG data, such as pulse-wave time-domain characteristic data and pulse-wave frequency-domain characteristic data, may additionally be taken into account.
Specifically, the monitoring method may include:
the electronic equipment acquires pulse wave data of a user; the electronic equipment determines a second state label corresponding to each group of data according to the pulse wave data and the statistical characteristic value corresponding to each group of data; the electronic equipment determines a final state label corresponding to each group of data according to the first state label and the second state label; and the electronic equipment determines a sleep monitoring result according to the final state label corresponding to each group of data.
The second state label includes the sleep state and the awake state.
The second state label corresponding to the pulse wave data and the statistical characteristic value can be determined by classifying the pulse wave data and the statistical characteristic value by utilizing a neural network, a decision tree algorithm, a machine learning algorithm and the like. To avoid the disadvantages of a single algorithm, different algorithms may be employed in obtaining the first state label and the second state label. For example: the first state label is obtained by a decision tree algorithm, and the second state label is obtained by a machine learning algorithm (such as a LightGBM model).
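As one way to realize the "different algorithms" suggestion, the sketch below trains a decision tree on motion-derived statistical features for the first state label and a LightGBM classifier on pulse-wave features for the second; the feature arrays here are random placeholders standing in for previously annotated recordings, and the feature choices are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from lightgbm import LGBMClassifier

# Assumed training material: per-group feature vectors with known
# awake(1)/sleep(0) annotations from earlier recordings.
motion_features = np.random.rand(200, 4)    # e.g. RMS, median, move freq, spectral energy
ppg_features = np.random.rand(200, 6)       # e.g. pulse-wave time/frequency-domain features
annotations = np.random.randint(0, 2, 200)

# First state label: decision tree on the motion statistics.
first_label_model = DecisionTreeClassifier(max_depth=4).fit(motion_features, annotations)

# Second state label: LightGBM on the pulse-wave features.
second_label_model = LGBMClassifier(n_estimators=50).fit(ppg_features, annotations)

def label_group(motion_vec: np.ndarray, ppg_vec: np.ndarray):
    """Return (first_state_label, second_state_label) for one data group."""
    first = int(first_label_model.predict(motion_vec.reshape(1, -1))[0])
    second = int(second_label_model.predict(ppg_vec.reshape(1, -1))[0])
    return first, second
```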
After the first state label and the second state label are obtained, the first state label and the second state label can be fused into a final state label according to different weights.
For example, assume that the awake state is quantized to 1 and the sleep state to 0, the first state label is weighted 0.8, and the second state label is weighted 0.2. When the first state label is 1 and the second state label is 0, the quantized value of the final state label is 1 × 0.8 + 0 × 0.2 = 0.8. Assuming the quantization threshold of the final state label is 0.5 (i.e., awake state when greater than or equal to 0.5, sleep state when less than 0.5), the final state label is accordingly determined to be the awake state.
It should be noted that the above is only an example of determining the final state label, and the weight value, the quantization threshold, the determination method, and the like are not specifically limited.
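The weighted fusion in the example can be written directly as follows; the 0.8/0.2 weights and the 0.5 threshold are taken from the example above and are not mandated by the method.

```python
def fuse_labels(first_label: int, second_label: int,
                w_first: float = 0.8, w_second: float = 0.2,
                threshold: float = 0.5) -> int:
    """Fuse two per-group state labels (1 = awake, 0 = sleep) into a
    final label using the weights from the example above."""
    score = first_label * w_first + second_label * w_second
    return 1 if score >= threshold else 0     # 1 = awake, 0 = sleep

# Example from the text: first label awake (1), second label sleep (0)
# -> score 1*0.8 + 0*0.2 = 0.8 >= 0.5, so the final label is awake.
assert fuse_labels(1, 0) == 1
```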
The second monitoring mode in S803 is described below. When the user is not located on a vehicle, the electronic device determines to monitor the user's sleep in the second monitoring mode. In the second monitoring mode, the method may specifically include: the electronic device monitors the sleep of the user according to the acquired second monitoring data.
The second monitoring data is obtained by the electronic device when the user is not located on the vehicle.
When the user is not located on a vehicle, the second monitoring data includes the state information of the user but no vehicle-related information. Therefore, the second monitoring data does not need to be denoised, and sleep monitoring can be performed on it directly.
Through the two sleep monitoring modes provided in this embodiment, the electronic device can monitor the user's sleep in different situations: not only when the user is on a vehicle, but also when the user is not. This addresses the problem that sleep on a vehicle is not recorded, or is under-recorded, and improves user experience.
Based on the sleep monitoring method provided by the above embodiment, an application scenario of the method is described below. The application scenarios may include offline monitoring scenarios, online monitoring scenarios, and fusion monitoring scenarios.
In an offline monitoring scenario, the electronic device acquires sensor data and caches it. After determining the time node at which the user was on a vehicle, the electronic device determines the first monitoring data in the cached sensor data according to that time node (i.e., the data monitored while the user was on the vehicle). It then filters the vehicle-related information out of the first monitoring data to obtain the target data and monitors the user's sleep according to the target data. When filtering the vehicle-related information, the electronic device can obtain the driving data of the vehicle from the cached sensor data.
In this application scenario, the electronic device needs to cache a large amount of sensor data, which places a high demand on its storage space. To relieve the storage pressure, in one application scenario the electronic device may upload the acquired sensor data to a third-party storage space such as a cloud server and, when the sleep monitoring method needs to be executed, retrieve the relevant data from that storage space. In another application scenario, the electronic device that collects the sensor data and the electronic device that performs the sleep monitoring method may be different devices. For example, the device collecting the sensor data may be a sports bracelet, and the device performing the sleep monitoring method may be a mobile phone: after the sports bracelet collects the sensor data, it transmits the data to the mobile phone, and the mobile phone executes the sleep monitoring method provided in the embodiments of the present application.
Offline monitoring analyzes and processes the data only after the electronic device has acquired all of the monitoring data, so the monitoring result is delayed and the user cannot learn about his or her sleep in a timely manner. However, the data available to offline monitoring is more comprehensive, so its monitoring result is more accurate. Offline monitoring is therefore better suited to cases with low requirements on the timeliness of monitoring and high requirements on the accuracy of the result.
In an online monitoring scenario, the electronic device acquires sensor data in real time. When it determines that the user is on a vehicle, the electronic device records the subsequently acquired sensor data as the first monitoring data and enters the first monitoring mode. It then filters the vehicle-related information out of the first monitoring data to obtain the target data and monitors the user's sleep according to the target data.
Online monitoring processes the monitoring data in real time, so the delay of the monitoring result is low. However, when the vehicle-related information is filtered, the real-time monitoring data available is less comprehensive, so the acquired driving data of the vehicle may be inaccurate and the final sleep monitoring result may contain errors. Compared with offline monitoring, online monitoring is therefore better suited to cases with high requirements on timeliness where some error in the monitoring result is acceptable.
To avoid the errors that may occur in online monitoring, in a fusion monitoring scenario the electronic device may, when it determines that the user is on a vehicle, first cache part of the monitoring data so as to ensure that more accurate driving data of the vehicle can be acquired. In other words, the electronic device starts sleep monitoring only after acquiring part of the monitoring data. Although the monitoring result is briefly delayed at the initial stage of entering the first monitoring mode, higher accuracy of the monitoring result can be ensured. Compared with offline monitoring, fusion monitoring has lower delay; compared with online monitoring, fusion monitoring has higher accuracy.
Fig. 11 is a block diagram of a sleep monitoring apparatus according to an embodiment of the present application, corresponding to the sleep monitoring method of the foregoing embodiments. For ease of description, only the portions related to the embodiments of the present application are shown. Referring to fig. 11, the apparatus includes:
a determining unit 111, configured to determine whether the user is located on a vehicle;
a first determining unit 112, configured to determine, if the user is located on a vehicle, to monitor the sleep of the user on the vehicle in a first monitoring mode;
a second determining unit 113, configured to determine, if the user is not located on a vehicle, to monitor the sleep of the user in a second monitoring mode.
Optionally, the apparatus further comprises:
the first monitoring unit 114 is configured to, if the user is located on a vehicle, acquire first monitoring data of the user located on the vehicle, where the first monitoring data includes information related to the vehicle; the electronic equipment obtains target data according to the first monitoring data, wherein the target data is obtained after information related to vehicles in the first monitoring data is filtered; the electronic equipment monitors the sleep of the user according to the target data.
The second monitoring unit 115 is configured to, if the user is not located on the vehicle, monitor sleep of the user by the electronic device according to the obtained second monitoring data, where the second monitoring data includes state information of the user.
Optionally, the first monitoring unit 114 is further configured to:
the electronic equipment acquires driving data of a vehicle; the electronic equipment determines a data component matched with the driving data in the first monitoring data as vehicle-related information; the electronic equipment filters out information related to the vehicle from the first monitoring data to obtain target data.
Optionally, the driving data includes first acceleration data when the vehicle is in a stationary state and second acceleration data when the vehicle is in a driving state.
Further, the first monitoring unit 114 is further configured to:
the electronic equipment constructs a noise data matrix according to the first acceleration data and the second acceleration data; the electronic equipment inputs the noise data matrix and the first monitoring data into a preset decomposition model to obtain a data component matched with the driving data in the first monitoring data; the electronic device determines a data component of the first monitoring data that matches the travel data as vehicle-related information.
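The patent does not fix the form of the preset decomposition model. Purely as an illustration, the sketch below stacks the first and second acceleration data into a noise data matrix and removes the matching component from the first monitoring data by least-squares projection; a trained source-separation model could fill the same role.

```python
import numpy as np

def filter_vehicle_component(first_monitoring: np.ndarray,
                             accel_stationary: np.ndarray,
                             accel_driving: np.ndarray) -> np.ndarray:
    """Remove the data component matching the driving data.

    first_monitoring : (N,) acceleration measured on the user's device
                       while on the vehicle.
    accel_stationary : (N,) first acceleration data (vehicle stationary).
    accel_driving    : (N,) second acceleration data (vehicle driving).
    """
    # Noise data matrix built from the two driving-data components.
    noise_matrix = np.column_stack([accel_stationary, accel_driving])
    # Least-squares stand-in for the preset decomposition model: estimate
    # how much of each noise component is present in the monitoring data...
    coeffs, *_ = np.linalg.lstsq(noise_matrix, first_monitoring, rcond=None)
    vehicle_related = noise_matrix @ coeffs
    # ...and subtract that vehicle-related component to obtain the target data.
    return first_monitoring - vehicle_related
```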
Optionally, the first monitoring unit 114 is further configured to:
when the data amount in the target data reaches a preset amount, the electronic equipment monitors the sleep of the user according to the target data; and when the data quantity in the target data does not reach the preset quantity, the electronic equipment continues to acquire the first monitoring data.
Optionally, the first monitoring unit 114 is further configured to:
the electronic equipment obtains multiple groups of data according to the target data; the electronic equipment determines a first state label corresponding to each group of data according to the statistical characteristic value corresponding to each group of data in the multiple groups of data; and the electronic equipment determines a sleep monitoring result according to the first state label corresponding to each group of data.
The first state label includes a sleep state and an awake state.
Optionally, the first monitoring unit 114 is further configured to:
the electronic equipment acquires pulse wave data of a user; the electronic equipment determines a second state label corresponding to each group of data according to the pulse wave data and the statistical characteristic value corresponding to each group of data; the electronic equipment determines a final state label corresponding to each group of data according to the first state label and the second state label; and the electronic equipment determines a sleep monitoring result according to the final state label corresponding to each group of data.
The second state label includes a sleep state and an awake state.
Optionally, the determining unit 111 is further configured to:
the electronic equipment acquires travel information and/or motion information of a user and judges whether the user is located on a vehicle or not according to the travel information and/or the motion information;
or the electronic equipment displays prompt information on the display screen, wherein the prompt information is used for prompting the user to select whether the user is located on the vehicle; the electronic equipment monitors a first operation instruction input by a user; the electronic device determines whether the user is located on the vehicle according to the first operation instruction.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Embodiments of the present application further provide a computer-readable storage medium, which includes computer instructions, and when the computer instructions are executed on a computer or a processor, the computer or the processor is caused to execute the steps in the above embodiments of the sleep monitoring method.
Embodiments of the present application provide a computer program product, which when executed on a computer or a processor, enables the computer or the processor to implement the steps in the foregoing sleep monitoring method embodiments.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in, or transmitted through, a computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
The embodiment of the present application further provides a chip system, where the chip system includes a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement the steps in the foregoing sleep monitoring method embodiments. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A sleep monitoring method is applied to an electronic device, and comprises the following steps:
the electronic device determining whether a user is located on a vehicle;
if the user is located on a vehicle, the electronic equipment acquires first monitoring data of the user located on the vehicle, wherein the first monitoring data comprises information related to the vehicle;
the electronic equipment obtains target data according to the first monitoring data, wherein the target data is obtained after information related to the vehicles in the first monitoring data is filtered;
and the electronic equipment monitors the sleep of the user according to the target data.
2. The sleep monitoring method as set forth in claim 1, wherein prior to the electronic device acquiring first monitoring data that a user is located on the vehicle, the method further comprises:
the electronic device determines to monitor the sleep of the user on the vehicle by adopting a first monitoring mode, wherein the first monitoring mode refers to a process of filtering information related to the vehicle from monitoring data acquired by the electronic device.
3. The sleep monitoring method as claimed in claim 1 or 2, wherein the obtaining of the target data by the electronic device according to the first monitoring data comprises:
the electronic equipment acquires driving data of the vehicle;
the electronic device determining a data component of the first monitoring data that matches the travel data as the vehicle-related information;
and the electronic equipment filters the information related to the vehicles from the first monitoring data to obtain the target data.
4. The sleep monitoring method as set forth in claim 3, wherein the traveling data includes first acceleration data when the vehicle is in a stationary state and second acceleration data when the vehicle is in a traveling state;
the electronic device determines a data component of the first monitoring data that matches the travel data as the vehicle-related information, including:
the electronic equipment constructs a noise data matrix according to the first acceleration data and the second acceleration data;
the electronic equipment inputs the noise data matrix and the first monitoring data into a preset decomposition model to obtain a data component matched with the driving data in the first monitoring data;
the electronic device determines a data component of the first monitoring data that matches the travel data as the vehicle-related information.
5. The sleep monitoring method as claimed in any one of claims 1 to 4, wherein the electronic device monitors the sleep of the user according to the target data, comprising:
when the data amount in the target data reaches a preset amount, the electronic equipment monitors the sleep of the user according to the target data;
and when the data volume in the target data does not reach the preset volume, the electronic equipment continues to acquire the first monitoring data.
6. The sleep monitoring method as claimed in claim 5, wherein the monitoring of the user's sleep by the electronic device according to the target data comprises:
the electronic equipment obtains multiple groups of data according to the target data;
the electronic equipment determines a first state label corresponding to each group of data according to a statistical characteristic value corresponding to each group of data in the multiple groups of data, wherein the first state label comprises a sleep state and a waking state;
and the electronic equipment determines a sleep monitoring result according to the first state label corresponding to each group of data.
7. The sleep monitoring method as claimed in claim 6, wherein the determining, by the electronic device, the sleep monitoring result according to the first status label corresponding to each group of data includes:
the electronic equipment acquires pulse wave data of a user;
the electronic equipment determines a second state label corresponding to each group of data according to the pulse wave data and the statistical characteristic value corresponding to each group of data, wherein the second state label comprises the sleep state and the waking state;
the electronic equipment determines a final state label corresponding to each group of data according to the first state label and the second state label;
and the electronic equipment determines a sleep monitoring result according to the final state label corresponding to each group of data.
8. The sleep monitoring method as claimed in any one of claims 1 to 7, wherein the determining by the electronic device whether the user is located on a vehicle comprises:
the electronic equipment acquires travel information and/or motion information of a user, and judges whether the user is located on the vehicle or not according to the travel information and/or the motion information;
or,
the electronic equipment displays prompt information on a display screen, wherein the prompt information is used for prompting a user to select whether the user is located on the vehicle;
the electronic equipment monitors a first operation instruction input by a user;
the electronic device determines whether a user is located on the vehicle according to the first operation instruction.
9. The sleep monitoring method as claimed in any one of claims 1 to 8, characterized in that the method further comprises:
and if the user is not positioned on the vehicle, the electronic equipment monitors the sleep of the user according to the acquired second monitoring data, wherein the second monitoring data comprises the state information of the user.
10. The sleep monitoring method according to claim 9, wherein before the electronic device monitors the sleep of the user according to the acquired second monitoring data, the method further comprises:
the electronic equipment determines to monitor the sleep of the user in a second monitoring mode, wherein the second monitoring mode refers to a process in which information related to the vehicle in the monitoring data acquired by the electronic equipment does not need to be filtered out.
11. An electronic device, characterized in that the electronic device comprises a processor for executing a computer program stored in a memory for implementing the method according to any of claims 1 to 10.
12. A computer readable storage medium comprising computer instructions which, when run on a computer or processor, cause the computer or processor to perform the method of any of claims 1 to 10.
CN202011185682.XA 2020-10-29 2020-10-29 Sleep monitoring method and device, electronic equipment and computer readable storage medium Pending CN114424927A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011185682.XA CN114424927A (en) 2020-10-29 2020-10-29 Sleep monitoring method and device, electronic equipment and computer readable storage medium
PCT/CN2021/115753 WO2022088938A1 (en) 2020-10-29 2021-08-31 Sleep monitoring method and apparatus, and electronic device and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011185682.XA CN114424927A (en) 2020-10-29 2020-10-29 Sleep monitoring method and device, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114424927A true CN114424927A (en) 2022-05-03

Family

ID=81309428

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011185682.XA Pending CN114424927A (en) 2020-10-29 2020-10-29 Sleep monitoring method and device, electronic equipment and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN114424927A (en)
WO (1) WO2022088938A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115040092B (en) * 2022-06-13 2024-06-14 天津大学 Channel state information-based heart rate monitoring method and respiratory event detection method
CN117678970A (en) * 2022-09-09 2024-03-12 荣耀终端有限公司 Sleep state detection method, electronic equipment and system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9463805B2 (en) * 2014-12-17 2016-10-11 Honda Motor Co., Ltd. System and method for dynamic vehicle control affecting sleep states of vehicle occupants
CN104905795B (en) * 2015-06-15 2017-10-10 深圳市奋达科技股份有限公司 A kind of BLE networkings sleep monitor method and device
CN205379309U (en) * 2015-12-09 2016-07-13 深圳市新元素医疗技术开发有限公司 Sleep guardianship device
CN106897562B (en) * 2017-02-28 2022-04-29 镇江市高等专科学校 Working method of portable health terminal for travel

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101159793A (en) * 2007-11-06 2008-04-09 西安电子科技大学 Mobile phone wireless medical treatment health detection tutelage system
CN101243973A (en) * 2008-01-31 2008-08-20 杨杰 Method and apparatus for monitoring and awakening fatigue doze
WO2018049852A1 (en) * 2016-09-13 2018-03-22 深圳市迈迪加科技发展有限公司 Sleep evaluation method, apparatus and system
CN106846585A (en) * 2017-02-13 2017-06-13 上海量明科技发展有限公司 The safety-type shared vehicles and implementation method, safety device and system
CN111655135A (en) * 2017-12-22 2020-09-11 瑞思迈传感器技术有限公司 Apparatus, system, and method for physiological sensing in a vehicle
CN110151137A (en) * 2019-05-28 2019-08-23 深圳如一探索科技有限公司 Sleep state monitoring method, device, equipment and medium based on data fusion

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116616721A (en) * 2023-07-24 2023-08-22 北京中科心研科技有限公司 Work and rest information determining method and device based on PPG (program G) signal and wearable equipment
CN116616721B (en) * 2023-07-24 2023-10-13 北京中科心研科技有限公司 Work and rest information determining method and device based on PPG (program G) signal and wearable equipment

Also Published As

Publication number Publication date
WO2022088938A1 (en) 2022-05-05

Similar Documents

Publication Publication Date Title
CN110134316B (en) Model training method, emotion recognition method, and related device and equipment
CN110138959B (en) Method for displaying prompt of human-computer interaction instruction and electronic equipment
WO2022088938A1 (en) Sleep monitoring method and apparatus, and electronic device and computer-readable storage medium
CN111819533B (en) Method for triggering electronic equipment to execute function and electronic equipment
CN114449599A (en) Network link switching method based on electronic equipment position and electronic equipment
CN113747527B (en) Network link switching method based on electronic equipment state and electronic equipment
CN112671080B (en) Charging method and device
WO2021052139A1 (en) Gesture input method and electronic device
WO2021238460A1 (en) Risk pre-warning method, risk behavior information acquisition method, and electronic device
CN112204532A (en) Method for evaluating AI task support capability by terminal and terminal
CN115589051B (en) Charging method and terminal equipment
CN114546511A (en) Plug-in management method, system and device
CN110058729B (en) Method and electronic device for adjusting sensitivity of touch detection
WO2022022335A1 (en) Method and apparatus for displaying weather information, and electronic device
CN114995715A (en) Control method of floating ball and related device
CN113284585A (en) Data display method, terminal device and storage medium
CN115964231A (en) Load model-based assessment method and device
CN114911400A (en) Method for sharing pictures and electronic equipment
CN114548141A (en) Method and device for generating waveform file, electronic equipment and readable storage medium
CN113380240B (en) Voice interaction method and electronic equipment
CN115022807B (en) Express information reminding method and electronic equipment
CN116048831A (en) Target signal processing method and electronic equipment
CN116450259A (en) Service abnormality reminding method, electronic equipment and storage medium
CN114828098A (en) Data transmission method and electronic equipment
CN115373957A (en) Application killing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination