WO2022151887A1 - Sleep monitoring method and related apparatus

Sleep monitoring method and related apparatus

Info

Publication number
WO2022151887A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
user
sleep state
data
mobile phone
Application number
PCT/CN2021/137461
Other languages
English (en)
Chinese (zh)
Inventor
韩佳
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022151887A1

Classifications

    • A61B5/4806 Sleep evaluation
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A61B5/4809 Sleep detection, i.e. determining whether a subject is asleep or not
    • A61B5/4812 Detecting sleep stages or cycles
    • A61B5/4815 Sleep quality
    • A61B5/681 Wristwatch-type devices
    • G06N3/04 Neural networks; architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • H04M1/72412 User interfaces specially adapted for cordless or mobile telephones, interfacing with external accessories using two-way short-range wireless interfaces
    • H04M1/72448 User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces adapting the functionality of the device according to context-related or environment-related conditions
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the present application relates to the technical field of artificial intelligence, and in particular, to a sleep monitoring method and related devices.
  • the quality of sleep is closely related to people's physical health. More and more electronic devices (such as wristbands, watches, etc.) have a sleep monitoring function to monitor the quality of a person's sleep.
  • Sleep monitoring needs to determine whether the user has entered a sleep state and whether the user has woken up.
  • Current electronic devices, such as wristbands, usually determine whether a user is in a sleep state by monitoring their own motion state and changes in the user's heart rate.
  • the above method is likely to cause misjudgment of the sleep state. For example, when the user uses a mobile phone in a fixed posture before going to bed, the user has not entered the sleep state, but the wristband often judges that the user has entered the sleep state, resulting in inaccurate sleep quality monitoring.
  • the present application provides a sleep monitoring method and a related device, which can determine whether a user enters a sleep state through the cooperation of multiple electronic devices, so as to improve the accuracy of monitoring the time when the user enters the sleep state.
  • the present application provides a sleep monitoring method.
  • the first electronic device may receive the first request of the second electronic device.
  • the first electronic device has a binding relationship with the second electronic device.
  • the first request may be sent when the second electronic device is in a wearing state and the first data is monitored.
  • the first data is consistent with the data that the user enters the sleep state.
  • the first electronic device may determine whether the first user wearing the second electronic device is using the first electronic device, and send the first determination result or the second determination result to the second electronic device.
  • the first determination result is that the first user is using the first electronic device
  • the second determination result is that the first user is not using the first electronic device.
  • the above-mentioned first electronic device may be an electronic device such as a mobile phone, a tablet computer, a laptop computer, a handheld computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), smart glasses, or the like.
  • the above-mentioned second electronic device may be an electronic device for monitoring the sleep quality of the user, for example, a wristband or a watch. When the second electronic device is in the wearing state, it can monitor the sleep quality of the user.
  • the above-mentioned first data is the data monitored when the second electronic device is worn by the first user.
  • the above-mentioned first data may include physiological characteristic data of the first user and motion data of the second electronic device.
  • the above-mentioned physiological characteristic data may be, for example, heart rate data.
  • the motion data of the second electronic device may be, for example, acceleration data and angular velocity data.
  • the above-mentioned first data is consistent with the data that the user enters the sleep state, which may indicate that the second electronic device prejudges that the first user enters the sleep state.
  • the above-mentioned first data may be obtained through big data, and may reflect the physiological characteristic data of a general user who wears the second electronic device and enters a sleep state, as well as the motion data of the second electronic device.
  • the above-mentioned second data may include the physiological characteristic data of the first user collected while the first user wears the second electronic device and actually enters a sleep state, as well as the motion data of the second electronic device. Using this data, the second electronic device can more accurately predict whether the first user enters the sleep state.
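For illustration only, the sketch below shows how a wearable might prejudge sleep entry from heart-rate and motion data of the kind described above. The thresholds, window length, and helper names are assumptions introduced for this example; they are not values given in the patent.

```python
# Illustrative sketch only: thresholds and names are assumptions, not patent values.
from dataclasses import dataclass
from statistics import mean

@dataclass
class SensorWindow:
    heart_rate_bpm: list[float]    # physiological characteristic data of the first user
    accel_magnitude: list[float]   # motion data of the second electronic device (m/s^2)

def prejudge_sleep_entry(window: SensorWindow,
                         resting_hr: float = 60.0,
                         hr_band: float = 8.0,
                         motion_threshold: float = 0.05) -> bool:
    """Return True if the monitored data is consistent with data of a user
    entering the sleep state (little motion, heart rate near the resting rate)."""
    hr_near_resting = abs(mean(window.heart_rate_bpm) - resting_hr) <= hr_band
    body_still = (max(window.accel_magnitude) - min(window.accel_magnitude)) <= motion_threshold
    return hr_near_resting and body_still

# Example: a quiet window with heart rate close to the resting rate is prejudged as sleep,
# after which the wearable would query the bound phone for confirmation.
window = SensorWindow(heart_rate_bpm=[62, 61, 60, 59], accel_magnitude=[0.01, 0.02, 0.01, 0.015])
print(prejudge_sleep_entry(window))  # True
```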
  • the first electronic device may determine under the first condition that the first electronic device is being used by the user and the user is the first user, and the first electronic device obtains the first determination result.
  • the first condition may include one or more of the following: the first electronic device determines that it is in a non-stationary state, the first electronic device detects a user operation within a first time period, the first electronic device detects that a human eye is watching the screen of the first electronic device, and the first electronic device detects that it is running a screen-casting application.
  • the first electronic device can determine under the first condition that the first electronic device is being used by the user and the user is not the first user, or the first electronic device determines that the first electronic device is not being used by the user; in either case, the first electronic device obtains the second determination result.
  • if the first electronic device determines that it is in a non-stationary state, the first electronic device can determine that a user is using the first electronic device. If the first electronic device monitors a user operation within the first time period, the first electronic device may determine that a user is using the first electronic device. If the first electronic device detects that a human eye is looking at the screen of the first electronic device, the first electronic device may determine that a user is using the first electronic device. If the first electronic device is running a screen-casting application, the first electronic device can determine that a user is using the first electronic device.
  • the first electronic device may first determine whether it is in a stationary state. If the first electronic device is in a non-stationary state and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If it is in a stationary state, the first electronic device may further determine whether a user operation is monitored within the first time period. If the user operation is monitored within the first time period, and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If no user operation is monitored within the first time period, the first electronic device may further monitor whether human eyes are watching the screen of the first electronic device.
  • if it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If it is detected that no human eyes are watching the screen of the first electronic device, the first electronic device can obtain the second determination result. If it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is not the first user, the first electronic device can obtain the second determination result.
  • the first electronic device may further monitor whether it runs an application program for screen projection. If a screen-casting application is running, and it is determined that the user is the first user, the first electronic device can obtain the first judgment result. If it is detected that no human eye is watching the screen of the first electronic device, and no application program for screen projection is running, the first electronic device can obtain the second judgment result.
  • the first electronic device may first determine whether it is in a stationary state. If the first electronic device is in a non-stationary state and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If it is in a stationary state, the first electronic device can further monitor whether human eyes are watching the screen of the first electronic device. If it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If it is detected that no human eyes are watching the screen of the first electronic device, the first electronic device can obtain the second judgment result. If it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is not the first user, the first electronic device can obtain the second determination result.
  • the first electronic device may first determine whether a user operation is monitored within the first time period. If the user operation is monitored within the first time period, and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If no user operation is monitored within the first time period, the first electronic device may further monitor whether human eyes are watching the screen of the first electronic device. If it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If it is detected that no human eyes are watching the screen of the first electronic device, the first electronic device can obtain the second judgment result. If it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is not the first user, the first electronic device can obtain the second determination result.
  • the first electronic device may monitor whether human eyes are watching the screen of the first electronic device. If it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is the first user, the first electronic device can obtain the first determination result. If it is detected that no human eyes are watching the screen of the first electronic device, the first electronic device can obtain the second judgment result. If it is detected that human eyes are watching the screen of the first electronic device, and it is determined that the user is not the first user, the first electronic device can obtain the second determination result.
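The following sketch condenses the phone-side decision flow described in the preceding paragraphs. The input field names and the function are hypothetical; the logic combines the optional checks (non-stationary state, recent user operation, eye gaze, screen casting) with the "user is the first user" confirmation.

```python
# Hedged sketch of the decision flow above; field and function names are illustrative.
from dataclasses import dataclass

@dataclass
class UsageSignals:
    stationary: bool                # from acceleration/gyroscope sensors
    user_operation_in_window: bool  # user operation detected within the first time period
    eye_gaze_on_screen: bool        # a human eye is watching the screen
    screen_casting: bool            # a screen-casting application is running
    user_is_first_user: bool        # e.g. the captured image contains the first user's face

def determination_result(s: UsageSignals) -> str:
    in_use = (not s.stationary or s.user_operation_in_window
              or s.eye_gaze_on_screen or s.screen_casting)
    if in_use and s.user_is_first_user:
        return "first determination result"   # the first user is using the first electronic device
    return "second determination result"      # not in use, or in use by someone else

print(determination_result(UsageSignals(True, False, True, False, True)))
# -> "first determination result": eye gaze detected and the face matches the first user
```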
  • the first electronic device may use sensors such as an acceleration sensor, a gyroscope sensor and the like to determine whether it is in a stationary state.
  • the above-mentioned user operation may be a touch operation acting on the screen of the first electronic device, a user operation acting on a button of the first electronic device, an input operation of a voice command, an input operation of an air gesture, and the like.
  • the first electronic device may determine whether a human eye is looking at the screen of the first electronic device through a human eye gaze recognition model.
  • the human eye gaze recognition model may be a neural network model.
  • the training data for training the human eye gaze recognition model may include image data of the human eye looking at the screen and image data of the human eye not looking at the screen.
  • the trained human eye gaze recognition model can identify the characteristics of the image in which the human eye gazes at the screen, so as to determine whether the human eye gazes at the screen of the mobile phone 100 .
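As a rough illustration of such a model, the sketch below defines a small binary gaze classifier and one training step with backpropagation, assuming PyTorch is available. The architecture, input size, and hyperparameters are invented for this example; the patent only states that a neural network is trained on images of eyes gazing and not gazing at the screen.

```python
# Minimal eye-gaze classifier sketch (assumes PyTorch); architecture is illustrative only.
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 1)  # assumes 64x64 crops of the eye region

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))  # logit: gazing at the screen vs. not

model = GazeNet()
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step with random stand-in tensors; real training data would be
# labelled images of eyes looking at / away from the screen, as described above.
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 2, (8, 1)).float()
loss = criterion(model(images), labels)
optimizer.zero_grad()
loss.backward()      # backpropagation (cf. classification G06N3/084)
optimizer.step()
```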
  • the method for the first electronic device to determine that the user is the first user may include: the first electronic device captures a first image through a camera and determines that the first image contains a face image of the first user, so as to determine that the user of the first electronic device is the first user.
  • the above-mentioned camera for collecting the first image may be a camera of the first electronic device, or a camera of a screen projection device.
  • the first electronic device can also determine whether the user of the first electronic device is the first user by collecting other biometric information (such as voiceprint information, fingerprint information, etc.).
  • the first electronic device may establish a binding relationship with the second electronic device by pairing with Bluetooth.
  • the first electronic device may establish a binding relationship with the second electronic device in response to the first user operation.
  • the first user operation may be used to indicate that the owner of the first electronic device is the first user.
  • the first electronic device may establish a binding relationship with the second electronic device by logging into the same account.
  • the first electronic device can determine that the owner of the first electronic device and the user wearing the second electronic device are the same person. In this way, misjudgment of whether the user enters the sleep state caused when the user of the first electronic device and the user wearing the second electronic device are not the same person can be reduced.
  • the second electronic device may request the first electronic device to further confirm whether the first user enters the sleep state.
  • the first electronic device may confirm whether the first user enters the sleep state by judging whether the first user is using the first electronic device. Compared with using the second electronic device alone to monitor whether the user enters the sleep state, this can reduce the misjudgment that occurs when the user maintains a fixed posture for a long time without being in the sleep state, and improves the accuracy of monitoring the time when the user enters the sleep state.
  • the first electronic device monitors a user operation to unlock the first electronic device within the second time period, and the first electronic device may send the first message to the second electronic device.
  • the first message may be used to indicate that the first user is using the first electronic device.
  • if the first electronic device detects a user operation to turn off an alarm clock within the second time period, and the user who turns off the alarm clock is the first user, the first electronic device may send the above-mentioned first message to the second electronic device.
  • the above unlocking method may be a method of unlocking by using biometric information.
  • the biometric information may be, for example, face information, voiceprint information, fingerprint information, and the like.
  • the first electronic device may determine that the unlocked user is the first user.
  • the above-mentioned second time period may be a time period of a first duration from when the first electronic device sends the second judgment result to the second electronic device.
  • the second time period may be a preset time period.
  • the second electronic device can use the first electronic device to determine whether the first user exits the sleep state. This can reduce the misjudgment of whether the first user has exited the sleep state when the first user has woken up but has not yet gotten up, and improve the accuracy of sleep quality monitoring.
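A possible phone-side handler for this wake-up path is sketched below: an unlock by the first user within the second time period triggers the first message. The function, parameter names, and message format are assumptions for illustration.

```python
# Sketch of the phone-side wake-up notification; names and message format are assumed.
import time

def on_unlock_event(unlock_user_id: str, first_user_id: str,
                    window_start: float, first_duration_s: float, send_to_wearable) -> None:
    within_second_time_period = (time.time() - window_start) <= first_duration_s
    unlocked_by_first_user = unlock_user_id == first_user_id  # e.g. face or fingerprint unlock
    if within_second_time_period and unlocked_by_first_user:
        # First message: the first user is using the first electronic device, so the
        # wearable can mark the user as having exited the sleep state.
        send_to_wearable({"type": "first_message", "unlock_timestamp": time.time()})
```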
  • the present application also provides a sleep monitoring method.
  • the second electronic device monitors the first data while in the wearing state.
  • the first data is consistent with the data that the user enters the sleep state.
  • the second electronic device sends the first request to the first electronic device.
  • the first electronic device has a binding relationship with the second electronic device.
  • in the case of receiving the first determination result, the second electronic device determines that the first user wearing the second electronic device has not entered a sleep state.
  • the first determination result is a determination result of determining that the first user is using the first electronic device after the first electronic device receives the first request.
  • the above-mentioned first data is the data monitored when the second electronic device is worn by the first user.
  • the above-mentioned first data may include physiological characteristic data of the first user and motion data of the second electronic device.
  • the above-mentioned physiological characteristic data may be, for example, heart rate data.
  • the motion data of the second electronic device may be, for example, acceleration data and angular velocity data.
  • the above-mentioned first data is consistent with the data that the user enters the sleep state, which may indicate that the second electronic device prejudges that the first user enters the sleep state.
  • the above-mentioned first data may be obtained through big data, and may reflect the physiological characteristic data of a general user who wears the second electronic device and enters a sleep state, as well as the motion data of the second electronic device.
  • the above-mentioned second data may include the physiological characteristic data of the first user collected while the first user wears the second electronic device and actually enters a sleep state, as well as the motion data of the second electronic device. Using this data, the second electronic device can more accurately predict whether the first user enters the sleep state.
  • the second electronic device may monitor the second data while in the wearing state, and determine whether the second data matches the data that the user enters the sleep state. That is, the second electronic device can again predict whether the first user enters the sleep state.
  • the second electronic device may perform a pre-judgment every preset time period (such as 5 minutes), and when the pre-judgment result is that the first user enters a sleep state, request the first electronic device to determine whether the first user enters a sleep state.
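A simple loop of that kind might look like the sketch below. The 5-minute interval comes from the example above; the sensor-reading, pre-judgment, and messaging helpers are hypothetical and passed in as parameters.

```python
# Sketch of the wearable's periodic pre-judgment loop; helper functions are hypothetical.
import time

PREJUDGE_INTERVAL_S = 5 * 60  # "every preset time period (such as 5 minutes)"

def record_sleep_onset(ts: float) -> None:
    print(f"sleep onset recorded at {ts}")

def monitoring_loop(read_window, prejudge_sleep_entry, ask_phone_is_user_active) -> None:
    while True:
        window = read_window()                 # heart-rate and motion data for the last interval
        if prejudge_sleep_entry(window):       # data consistent with entering the sleep state
            if ask_phone_is_user_active():     # first request -> first/second determination result
                pass                           # phone in use: the first user has not fallen asleep
            else:
                record_sleep_onset(time.time())  # phone idle: mark the user as asleep
                break
        time.sleep(PREJUDGE_INTERVAL_S)
```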
  • the above-mentioned second data is data of the first user, and the second data may include physiological characteristic data of the first user and motion data of the second electronic device.
  • the second electronic device may record the monitored physiological characteristic data of the user as data not in a sleep state.
  • in the case of receiving the second determination result, the second electronic device determines that the first user enters a sleep state.
  • the second determination result is the determination result that the first electronic device determines that the first user does not use the first electronic device after receiving the first request.
  • the second electronic device may record the monitored physiological characteristic data of the user as data in the sleep state.
  • the second electronic device may request the first electronic device to further confirm whether the first user enters the sleep state.
  • the first electronic device may confirm whether the first user enters the sleep state by judging whether the first user is using the first electronic device. Compared with using the second electronic device alone to monitor whether the user enters the sleep state, this can reduce the misjudgment that occurs when the user maintains a fixed posture for a long time without being in the sleep state, and improves the accuracy of monitoring the time when the user enters the sleep state.
  • the second electronic device receives the first message from the first electronic device when the detected state of the first user is the sleep state, and can determine that the state of the first user detected by the second electronic device is a non-sleep state.
  • the first message may be used to indicate that the first user is using the first electronic device.
  • the second electronic device can use the first electronic device to determine whether the first user exits the sleep state. This can reduce the misjudgment of whether the first user has exited the sleep state when the first user has woken up but has not yet gotten up, and improve the accuracy of sleep quality monitoring.
  • the present application also provides a sleep monitoring method.
  • the second electronic device monitors the first data while in the wearing state.
  • the first data is consistent with the data that the user enters the sleep state.
  • the second electronic device may send the first request to the first electronic device.
  • the first electronic device has a binding relationship with the second electronic device.
  • the first electronic device receives the first request from the second electronic device.
  • the first electronic device may determine whether the first user wearing the second electronic device is using the first electronic device, and send the first determination result or the second determination result to the second electronic device.
  • the first determination result is that the first user is using the first electronic device.
  • the second judgment result is that the first user does not use the first electronic device. In the case of receiving the first judgment result, the second electronic device may determine that the first user does not enter the sleep state.
  • the second electronic device may request the first electronic device to further confirm whether the first user enters the sleep state.
  • the first electronic device may confirm whether the first user enters the sleep state by judging whether the first user is using the first electronic device. Compared with using the second electronic device alone to monitor whether the user enters the sleep state, this can reduce the misjudgment that occurs when the user maintains a fixed posture for a long time without being in the sleep state, and improves the accuracy of monitoring the time when the user enters the sleep state.
  • the first electronic device determines under the first condition that the first electronic device is being used by the user and the user is the first user, and the first electronic device obtains the first determination result.
  • the first condition includes one or more of the following: the first electronic device determines that it is in a non-stationary state, the first electronic device detects a user operation within the first time period, the first electronic device detects that a human eye is watching the screen of the first electronic device, and the first electronic device detects that it is running a screen-casting application.
  • the first electronic device determines that the first electronic device is being used by the user and the user is not the first user, or the first electronic device determines that the first electronic device is not being used by the user; in either case, the first electronic device obtains the second determination result.
  • the method for the first electronic device to determine that the user is the first user may be: the first electronic device captures a first image through a camera, and determines that the first image contains a face image of the first user.
  • the above-mentioned camera for collecting the first image may be a camera of the first electronic device, or a camera of a screen projection device.
  • the first electronic device can also determine whether the user of the first electronic device is the first user by collecting other biometric information (such as voiceprint information, fingerprint information, etc.).
  • the second electronic device may determine that the first user enters a sleep state.
  • the first electronic device may establish a binding relationship with the second electronic device by pairing with Bluetooth.
  • the first electronic device may establish a binding relationship with the second electronic device in response to the first user operation.
  • the first user operation may be used to indicate that the owner of the first electronic device is the first user.
  • the first electronic device may establish a binding relationship with the second electronic device by logging into the same account.
  • the first electronic device can determine that the owner of the first electronic device and the user wearing the second electronic device are the same person. In this way, misjudgment of whether the user enters the sleep state caused when the user of the first electronic device and the user wearing the second electronic device are not the same person can be reduced.
  • the first electronic device monitors a user operation of unlocking the first electronic device within the second time period, and the first electronic device may send the first message to the second electronic device.
  • the first message may be used to indicate that the first user is using the first electronic device.
  • if the first electronic device detects a user operation to turn off an alarm clock within the second time period, and the user who turns off the alarm clock is the first user, the first electronic device may send the first message to the second electronic device.
  • the above unlocking method may be a method of unlocking by using biometric information.
  • the biometric information can be, for example, face information, voiceprint information, fingerprint information, and the like.
  • the first electronic device may determine that the unlocked user is the first user.
  • the above-mentioned second time period may be a time period of a first duration from when the first electronic device sends the second judgment result to the second electronic device.
  • the second time period may be a preset time period.
  • the second electronic device can use the first electronic device to determine whether the first user exits the sleep state. This can reduce the misjudgment of whether the first user has exited the sleep state when the first user has woken up but has not yet gotten up, and improve the accuracy of sleep quality monitoring.
  • the present application also provides a sleep monitoring method.
  • the second electronic device receives the first message from the first electronic device when detecting that the state of the first user wearing the second electronic device is a sleep state, and can determine that the state of the first user detected by the second electronic device is a non-sleep state.
  • the first electronic device has a binding relationship with the second electronic device.
  • the first message may be sent by the first electronic device after monitoring the user operation of unlocking the first electronic device within the first monitoring time period.
  • determining that the detected state of the first user is a non-sleep state may specifically be marking the detected state of the first user from a sleep state to a non-sleep state. That is to say, when receiving the above-mentioned first message, the second electronic device may determine that the first user wakes up. Furthermore, the second electronic device may determine the moment when the first message is received or the moment when the first electronic device detects the unlocking user operation as the moment when the first user wakes up.
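On the wearable side, that re-marking and the choice of wake-up moment could be handled as in the sketch below. The class and field names are illustrative assumptions; the logic follows the description: change the state from sleep to non-sleep and record either the unlock moment reported by the phone or the moment the message was received.

```python
# Wearable-side handling of the first message; names are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SleepSession:
    state: str = "sleep"                 # current detected state of the first user
    wake_time: Optional[float] = None

    def on_first_message(self, message: dict) -> None:
        if self.state == "sleep":
            self.state = "non-sleep"     # mark the detected state as non-sleep
            # Prefer the unlock moment reported by the phone; otherwise use the moment
            # the message was received as the time the first user woke up.
            self.wake_time = message.get("unlock_timestamp", message.get("received_at"))
```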
  • the second electronic device can use the first electronic device to determine whether the first user exits the sleep state. This can reduce the misjudgment of whether the first user has exited the sleep state when the first user has woken up but has not yet gotten up, and improve the accuracy of sleep quality monitoring.
  • the above unlocking method may be a method for unlocking by using biometric information.
  • the biometric information may be, for example, face information, voiceprint information, fingerprint information, and the like.
  • the first electronic device may determine that the unlocked user is the first user.
  • the second electronic device monitors the first data while in the wearing state.
  • the first data is consistent with the data that the user enters the sleep state.
  • the second electronic device may send the first request to the first electronic device.
  • the second electronic device receives the judgment result that the first electronic device indicates that the first user does not use the first electronic device, and can determine that the state of the first user detected by the second electronic device is the sleep state.
  • the above-mentioned first monitoring time period may be a time period for which the second electronic device estimates the first user to exit the sleep state.
  • the above-mentioned first monitoring time period may be a fixed time period, such as a time period from 5:00 a.m. to 10:00 a.m.
  • the above-mentioned first monitoring time period may be a time period of the first duration starting from when the first electronic device sends, to the second electronic device, the judgment result indicating that the first user is not using the first electronic device.
  • the above-mentioned first monitoring time period and the second time period in the foregoing embodiment may be the same time period.
  • the first electronic device may establish a binding relationship with the second electronic device by pairing with Bluetooth.
  • the first electronic device may establish a binding relationship with the second electronic device in response to the first user operation.
  • the first user operation may be used to indicate that the owner of the first electronic device is the first user.
  • the first electronic device may establish a binding relationship with the second electronic device by logging into the same account.
  • the first electronic device can determine that the owner of the first electronic device and the user wearing the second electronic device are the same person. In this way, misjudgment of whether the user enters the sleep state caused when the user of the first electronic device and the user wearing the second electronic device are not the same person can be reduced.
  • the second electronic device may further record the monitored physiological characteristic data of the user as data not in a sleep state.
  • the present application also provides a sleep monitoring method.
  • the first electronic device may monitor the user operation of unlocking the first electronic device within the first monitoring time period, and send the first message to the second electronic device.
  • the first message is used to indicate that the first user wearing the second electronic device is using the first electronic device.
  • the first electronic device has a binding relationship with the second electronic device.
  • the second electronic device receives the first message when detecting that the state of the first user is a sleep state, and may determine that the detected state of the first user is a non-sleep state.
  • determining that the detected state of the first user is a non-sleep state may specifically be marking the detected state of the first user from a sleep state to a non-sleep state. That is to say, when receiving the above-mentioned first message, the second electronic device may determine that the first user wakes up. Furthermore, the second electronic device may determine the moment when the first message is received or the moment when the first electronic device detects the unlocking user operation as the moment when the first user wakes up.
  • the second electronic device can use the first electronic device to determine whether the first user exits the sleep state. This can reduce the misjudgment of whether the first user has exited the sleep state when the first user has woken up but has not yet gotten up, and improve the accuracy of sleep quality monitoring.
  • the second electronic device monitors the first data when it is in the wearing state, and the first data is consistent with the data that the user enters the sleep state.
  • the second electronic device sends the first request to the first electronic device.
  • the first electronic device receives the first request, determines whether the first user is using the first electronic device, and obtains a determination result indicating that the first user is not using the first electronic device.
  • the first electronic device may send a judgment result indicating that the first user does not use the first electronic device to the second electronic device.
  • the second electronic device may determine that the first user enters a sleep state.
  • the above-mentioned first monitoring time period may be a time period during which the second electronic device estimates the first user to exit the sleep state.
  • the above-mentioned first monitoring time period may be a time period of the first duration starting from when the first electronic device sends, to the second electronic device, the judgment result indicating that the first user is not using the first electronic device.
  • before sending the above-mentioned first message to the second electronic device, the first electronic device may further determine under the first condition that the first electronic device is being used by the user and the user is the first user.
  • the first condition includes one or more of the following: the first electronic device determines that it is in a non-stationary state, the first electronic device detects a user operation within the first time period, the first electronic device detects that a human eye is watching the screen of the first electronic device, and the first electronic device detects that it is running a screen-casting application.
  • the second electronic device may record the monitored physiological characteristic data of the user as data not in the sleep state.
  • the first electronic device may establish a binding relationship with the second electronic device by pairing with Bluetooth.
  • the first electronic device may establish a binding relationship with the second electronic device in response to the first user operation.
  • the first user operation may be used to indicate that the owner of the first electronic device is the first user.
  • the first electronic device may establish a binding relationship with the second electronic device by logging into the same account.
  • the first electronic device can determine that the owner of the first electronic device and the user wearing the second electronic device are the same person. In this way, misjudgment of whether the user enters the sleep state caused when the user of the first electronic device and the user wearing the second electronic device are not the same person can be reduced.
  • the present application provides an electronic device.
  • the electronic device is a first electronic device.
  • the first electronic device may include a camera, a communication module, a memory and a processor.
  • a camera can be used to capture images.
  • the communication module can be used to establish a communication connection with the second electronic device.
  • Memory can be used to store computer programs.
  • the processor may be configured to invoke a computer program, so that the first electronic device executes any of the possible implementation methods of the above-mentioned first aspect.
  • the present application further provides an electronic device.
  • the electronic device is the second electronic device.
  • the second electronic device may include a communication module, a memory and a processor.
  • the communication module can be used to establish a communication connection with the first electronic device.
  • Memory can be used to store computer programs.
  • the processor may be configured to invoke a computer program, so that the second electronic device executes any possible implementation method of the second aspect above or any possible implementation method of the fourth aspect above.
  • the present application provides a sleep monitoring system, which may include the electronic device provided in the sixth aspect and the electronic device provided in the seventh aspect.
  • an embodiment of the present application provides a chip, which is applied to the electronic device provided in the sixth aspect or the electronic device provided in the seventh aspect. The chip includes one or more processors, and the processors are configured to invoke computer instructions to cause the electronic device provided in the sixth aspect to perform any possible implementation method in the first aspect, or cause the electronic device provided in the seventh aspect to perform any possible implementation method in the second aspect or any possible implementation method in the fourth aspect.
  • an embodiment of the present application provides a computer program product containing instructions. When the computer program product runs on an electronic device, it causes the electronic device provided in the sixth aspect to perform any possible implementation method in the first aspect, or causes the electronic device provided in the seventh aspect to perform any possible implementation method in the second aspect or any possible implementation method in the fourth aspect.
  • an embodiment of the present application provides a computer storage medium, including computer instructions. When the computer instructions run on an electronic device, they cause the electronic device provided in the sixth aspect to perform any possible implementation method in the first aspect, or cause the electronic device provided in the seventh aspect to perform any possible implementation method in the second aspect or any possible implementation method in the fourth aspect.
  • the electronic device provided in the sixth aspect, the electronic device provided in the seventh aspect, the sleep monitoring system provided in the eighth aspect, the chip provided in the ninth aspect, the computer program product provided in the tenth aspect, and the computer-readable storage medium provided in the eleventh aspect are all used to execute the methods provided by the embodiments of the present application. Therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the corresponding methods, which will not be repeated here.
  • FIG. 1 is a schematic structural diagram of a first electronic device 100 provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a sleep monitoring scenario provided by an embodiment of the present application.
  • FIG. 3 is a flowchart of a sleep monitoring method provided by an embodiment of the present application.
  • FIG. 5 is a flowchart of another sleep monitoring method provided by an embodiment of the present application.
  • FIG. 6 is a flowchart of another sleep monitoring method provided by an embodiment of the present application.
  • the terms "first" and "second" are only used for descriptive purposes, and should not be understood as indicating or implying relative importance or implying the number of indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • the present application provides a sleep monitoring method, which can monitor whether a user enters a sleep state through the cooperation of a first electronic device and a second electronic device.
  • Sleep state can refer to the form that a person exhibits when sleeping.
  • the sleep state may include a sleep onset stage, a light sleep stage, and a deep sleep stage.
  • a user in a sleeping state maintains a fixed posture for a long time or has a small change in the posture of the limbs.
  • the heart rate of the user in the sleeping state fluctuates around the resting heart rate.
  • the first electronic device may be an electronic device such as a mobile phone, a tablet computer, a laptop computer, a handheld computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), smart glasses, or the like.
  • the second electronic device is an electronic device for monitoring the user's sleep quality.
  • the sleep quality monitoring may include determining the total duration of the user in a sleep state, and data such as the duration of sleep onset, light sleep, and deep sleep in the sleep state.
  • the second electronic device may be, for example, an electronic device such as a wristband, a watch, or the like. That is, the user can monitor his own sleep quality by wearing the second electronic device.
  • the embodiments of the present application do not limit the specific types of the first electronic device and the second electronic device.
  • the second electronic device is worn on the first user.
  • a sleep model is stored in the second electronic device.
  • the second electronic device may use a sleep model to predict whether the first user enters a sleep state according to data collected by sensors such as an acceleration sensor and a heart rate sensor.
  • the second electronic device may send a request for confirming whether the first user enters the sleep state to the first electronic device.
  • the first electronic device can detect whether the first user is using the first electronic device. If it is determined that the first user is using the first electronic device, the first electronic device may notify the second electronic device that the first user is using the first electronic device.
  • the second electronic device may determine that the first user has not entered a sleep state according to the notification. Otherwise, the first electronic device may notify the second electronic device that the first electronic device is not being used by the first user. The second electronic device may determine that the first user enters the sleep state according to the notification.
  • the second electronic device can request the first electronic device to further confirm whether the first user enters the sleep state. Compared with using the second electronic device alone to monitor whether the user enters the sleep state, this can reduce the misjudgment that occurs when the user maintains a fixed posture for a long time without being in the sleep state, and improves the accuracy of monitoring the time when the user enters the sleep state.
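To tie the two roles together, the sketch below simulates the cooperation just described in a single process: the wearable prejudges sleep entry, asks the bound phone, and commits to the sleep state only if the phone reports that it is not in use. All class and message names are illustrative assumptions, not the patent's terminology.

```python
# End-to-end sketch of the phone/wearable cooperation; names are illustrative assumptions.
class Phone:
    def __init__(self, first_user_in_use: bool):
        self.first_user_in_use = first_user_in_use

    def handle_first_request(self) -> str:
        # Determine whether the first user is using the phone (motion, recent operation,
        # eye gaze, screen casting), then reply with a determination result.
        return "in_use" if self.first_user_in_use else "not_in_use"

class Wearable:
    def __init__(self, phone: Phone):
        self.phone = phone
        self.user_state = "awake"

    def on_prejudged_sleep(self) -> None:
        # The sleep model prejudged sleep entry from acceleration and heart-rate data;
        # ask the bound phone before committing to the sleep state.
        if self.phone.handle_first_request() == "in_use":
            self.user_state = "awake"   # the first user has not entered the sleep state
        else:
            self.user_state = "asleep"  # confirm sleep onset

watch = Wearable(Phone(first_user_in_use=True))
watch.on_prejudged_sleep()
print(watch.user_state)  # "awake": the phone is in use, so no sleep onset is recorded
```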
  • FIG. 1 exemplarily shows a schematic structural diagram of a first electronic device 100 provided by an embodiment of the present application.
  • the first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, Battery 142, Antenna 1, Antenna 2, Mobile Communication Module 150, Wireless Communication Module 160, Audio Module 170, Speaker 170A, Receiver 170B, Microphone 170C, Headphone Jack 170D, Sensor Module 180, Key 190, Motor 191, Indicator 192, A camera 193, a display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the first electronic device 100 .
  • the first electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and the like. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the first electronic device 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the first electronic device 100, and can also be used to transmit data between the first electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the first electronic device 100 may be used to cover a single or multiple communication frequency bands.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G etc. applied on the first electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the wireless communication module 160 can provide wireless communication solutions applied on the first electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the first electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the first electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the first electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • the first electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the first electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted through the lens to the camera photosensitive element, which converts the light signal into an electrical signal and transmits it to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the first electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the first electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the energy of the frequency point, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the first electronic device 100 may support one or more video codecs.
  • the first electronic device 100 can play or record videos in various encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the first electronic device 100 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the first electronic device 100.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the first electronic device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area may store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the first electronic device 100 and the like.
  • the first electronic device 100 may implement audio functions, such as music playback and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals.
  • Speaker 170A also referred to as a “speaker” is used to convert audio electrical signals into sound signals.
  • the receiver 170B also referred to as “earpiece”, is used to convert audio electrical signals into sound signals.
  • the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the earphone jack 170D is used to connect wired earphones.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the first electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the first electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the first electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the motion attitude of the first electronic device 100 .
  • the angular velocity of the first electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by means of the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the first electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the first electronic device 100 through reverse motion, thereby achieving image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the first electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the first electronic device 100 is stationary. It can also be used for recognizing the posture of the first electronic device 100, and can be used in applications such as switching between horizontal and vertical screens, and pedometers.
  • Distance sensor 180F for measuring distance.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the first electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the first electronic device 100 . When insufficient reflected light is detected, the first electronic device 100 may determine that there is no object near the first electronic device 100 .
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the first electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the first electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the first electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the first electronic device 100 , which is different from the position where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the bone that vibrates when a person speaks.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive a blood pressure beat signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the bone that vibrates when a person speaks, so as to implement a voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power-on key, a volume key, and the like.
  • the first electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the first electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact with and separation from the first electronic device 100 .
  • the first electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the first electronic device 100 employs an eSIM, ie an embedded SIM card.
  • the eSIM card can be embedded in the first electronic device 100 and cannot be separated from the first electronic device 100 .
  • the sleep monitoring method provided by the present application is specifically introduced below by taking the first electronic device 100 as a mobile phone and the second electronic device 200 as a bracelet.
  • FIG. 2 exemplarily shows a sleep monitoring scenario involved in the present application.
  • the first user wears the bracelet 200 .
  • the first user uses the mobile phone 100 while lying on the bed and maintaining a fixed posture.
  • the bracelet 200 can use a sleep model to predict that the first user has entered a sleep state according to data collected by sensors such as an acceleration sensor and a heart rate sensor.
  • the bracelet 200 may request the mobile phone 100 to further confirm whether the first user enters the sleep state.
  • the mobile phone 100 can confirm whether the first user enters the sleep state by judging whether the first user is using the mobile phone 100 . If it is determined that the first user is using the mobile phone 100 , the mobile phone 100 may send a message to the bracelet 200 to indicate that the first user is using the mobile phone 100 .
  • the bracelet 200 may determine that the first user has not entered a sleep state. If it is determined that the first user does not use the mobile phone 100 , the mobile phone 100 may send a message to the bracelet, indicating that the first user does not use the mobile phone 100 . When receiving the message, the bracelet 200 can determine that the first user has entered a sleep state.
  • the mobile phone 100 can determine whether the first user is using the mobile phone 100 by monitoring whether the mobile phone 100 is in a stationary state, whether user operations acting on the mobile phone 100 are detected within a preset time, and whether the first user's eyes are looking at the screen of the mobile phone 100.
  • the mobile phone 100 can determine whether it is in a stationary state by using the data collected by the acceleration sensor and the gyroscope sensor. If it is determined that it is in a non-stationary state (that is, the posture of the mobile phone 100 changes), the mobile phone 100 can determine that a user is using the mobile phone 100 . Further, the mobile phone 100 can determine whether the user using the mobile phone 100 is the first user.
  • the mobile phone 100 can monitor whether there is a user operation acting on the mobile phone 100 within a preset time. This can reduce the misjudgment of whether the user is in a sleep state when the mobile phone 100 is in a stationary state but is still being used by the user. If the user operation is monitored within the preset time, the mobile phone 100 may determine that a user is using the mobile phone 100 . Further, the mobile phone 100 can determine whether the user using the mobile phone 100 is the first user. This embodiment of the present application does not limit the length of the foregoing preset time.
  • the mobile phone 100 can monitor whether human eyes are watching the screen of the mobile phone 100 . This can reduce the misjudgment of whether the user is in a sleep state when the mobile phone is in a stationary state and there is no user operation for a preset period of time, but is still used by the user (for example, a scene where the mobile phone is in a stationary state to play a video). If a human eye looks at the screen of the mobile phone 100, the mobile phone 100 can determine whether the user looking at the screen is the first user. If no human eyes look at the screen of the mobile phone 100 or the user looking at the screen of the mobile phone 100 is not the first user, the mobile phone 100 can determine that the first user does not use the mobile phone 100 .
  • the mobile phone 100 and the bracelet 200 have a binding relationship.
  • the mobile phone 100 and the bracelet 200 may establish a binding relationship through Bluetooth pairing.
  • the mobile phone 100 may mark the bracelet 200 as a bracelet worn by the owner of the mobile phone 100.
  • the mobile phone 100 may add an owner tag to the bracelet 200 when storing the Bluetooth address of the bracelet 200 .
  • the mobile phone 100 can detect whether the first user is using the mobile phone 100 .
  • the mobile phone 100 may not process the request.
  • the mobile phone 100 in response to the first user operation, may establish a binding relationship with the bracelet 200 .
  • the first user operation may be used to indicate that the owner of the mobile phone 100 and the user wearing the bracelet 200 are the same user.
  • the mobile phone 100 can manage the bracelets that have established a communication connection with itself.
  • the setting options for managing the bracelet in the mobile phone 100 may include a sleep assist function.
  • the mobile phone 100 can enable the sleep assist function in the setting options for managing the bracelet 200 .
  • the bracelet 200 may request the mobile phone 100 to further confirm whether the owner of the mobile phone 100 is in the sleep state after prejudging that the user wearing the bracelet 200 has entered the sleep state.
  • the mobile phone 100 and the bracelet 200 may establish a binding relationship by associating with the same account (for example, a Huawei account). That is, the accounts logged in on the mobile phone 100 and the bracelet 200 are the same account. Since the mobile phone 100 and the bracelet 200 have a binding relationship, the mobile phone 100 can assist the bracelet 200 in monitoring whether the user wearing the bracelet 200 (i.e., the first user) has entered the sleep state by judging whether the owner (i.e., the first user) is using the mobile phone 100. This can reduce the misjudgment of whether the user has entered the sleep state that occurs when the user of the mobile phone 100 and the user wearing the bracelet 200 are not the same person.
  • the embodiments of the present application do not limit the manner in which the mobile phone 100 and the bracelet 200 establish a binding relationship.
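  • As an illustration of how such a binding record might be kept on the phone side, the sketch below stores the bracelet's Bluetooth address together with an owner tag and an optional shared-account identifier; the structure, field names, and example values are assumptions made for illustration, not the application's own data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BoundBracelet:
    """Record the phone might keep for a bracelet it is bound to."""
    bluetooth_address: str
    owner_tag: bool                        # True: the bracelet is worn by the phone's owner
    shared_account: Optional[str] = None   # same-account binding, if that mechanism is used

# Hypothetical example: bracelet 200 marked as worn by the owner of mobile phone 100.
bracelet_200 = BoundBracelet(
    bluetooth_address="AA:BB:CC:DD:EE:FF",
    owner_tag=True,
    shared_account="example-account",
)
```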
  • a neural network can be composed of neural units, and a neural unit can refer to an operation unit that takes x_s and an intercept of 1 as inputs, and the output of the operation unit can refer to the following formula (1):
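  • Formula (1) itself is not reproduced in this text extract; a minimal LaTeX reconstruction, assuming the standard single-neuron form implied by the definitions of W_s, b, and f given in the following items, would be:

```latex
h_{W,b}(x) = f\!\left(\sum_{s=1}^{n} W_s x_s + b\right) \tag{1}
```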
  • W_s is the weight of x_s.
  • b is the bias of the neural unit.
  • f is an activation function of the neural unit, which is used to introduce nonlinear characteristics into the neural network to convert the input signal in the neural unit into an output signal. The output signal of this activation function can be used as the input of the next convolutional layer.
  • the activation function can be a sigmoid function.
  • a neural network is a network formed by connecting many of the above single neural units together, that is, the output of one neural unit can be the input of another neural unit.
  • the input of each neural unit can be connected with the local receptive field of the previous layer to extract the features of the local receptive field, and the local receptive field can be an area composed of several neural units.
  • the convolutional neural network can use the error back propagation (BP) algorithm to adjust the values of the parameters in the initial super-resolution model during the training process, so that the reconstruction error loss of the super-resolution model becomes smaller and smaller. Specifically, the input signal is passed forward until an error loss is produced at the output, and the parameters in the initial super-resolution model are updated by back-propagating the error loss information, so that the error loss converges.
  • the back propagation algorithm is a backward pass dominated by the error loss, aiming to obtain the parameters of an optimal super-resolution model, for example, the weight matrices.
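  • As a brief illustration of the parameter update that back propagation enables (a generic gradient-descent step, not a formula taken from this application), each weight matrix W and bias b can be adjusted against the error loss L with a learning rate η:

```latex
W \leftarrow W - \eta\,\frac{\partial L}{\partial W}, \qquad b \leftarrow b - \eta\,\frac{\partial L}{\partial b}
```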
  • FIG. 3 exemplarily shows a flowchart of a sleep monitoring method provided by an embodiment of the present application. As shown in FIG. 3, the method may include steps S101-S108, wherein:
  • S101: The bracelet 200 uses a sleep model to predict that the first user enters a sleep state.
  • the sleep model may be a trained neural network model.
  • the bracelet 200 can collect acceleration data in real time through an acceleration sensor, and collect heart rate data of the first user through a heart rate sensor in real time. Based on the above acceleration data and heart rate data, the bracelet 200 can use the sleep model to predict whether the first user enters a sleep state.
  • when the user is in a sleep state, the posture of the bracelet 200 usually changes little within a certain period of time, or even remains unchanged. In addition, the posture of the bracelet 200 when it is worn by the user and remains stationary is different from its posture when it is placed on a desktop and remains stationary.
  • the heart rate usually drops gradually after the user goes to sleep. Therefore, in combination with the acceleration data and the heart rate data detected by the bracelet 200, the bracelet 200 can predict whether the user enters a sleep state.
  • the training data for training the sleep model may include acceleration data and heart rate data of the bracelet 200 when the user is actually in a sleeping state, and acceleration data and heart rate data of the bracelet 200 when the user is actually in a non-sleep state. These data can be collected through big data. That is, these data may be acceleration data and heart rate data in a sleep state and acceleration data and heart rate data in a non-sleep state when a general user wears the bracelet 200 . Alternatively, these data may be acceleration data and heart rate data in a sleep state and acceleration data and heart rate data in a non-sleep state when the first user is wearing the bracelet 200 .
  • the sleep model trained by using the data of the first user can better predict whether the first user enters a sleep state.
  • the trained sleep model can identify the characteristics of the acceleration data and heart rate data of the bracelet 200 when the first user is actually in a sleep state, so as to predict whether the first user is in a sleep state. That is to say, when the acceleration data monitored by the bracelet 200 is consistent with the acceleration data of a user who is actually sleeping, and the monitored heart rate data is consistent with the heart rate data of a user who is actually sleeping, the bracelet 200 can predict that the first user enters a sleep state.
  • the bracelet 200 can also use the sleep model to predict whether the first user is in a sleep state based on data such as angular velocity data collected by a gyro sensor, ambient light data collected by an ambient light sensor, and ambient sound data collected by a microphone. That is, the training data of the sleep model may also include angular velocity data, ambient light data, ambient sound data, and the like.
  • the embodiments of the present application do not limit the training data used for training the sleep model and the method for training the sleep model.
  • for the method by which the bracelet 200 predicts whether the first user enters the sleep state, reference may also be made to methods used by electronic devices such as wristbands in the prior art for judging whether a user enters a sleep state.
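  • A minimal sketch of the bracelet-side prediction step is shown below, assuming a pre-trained binary scorer over two simple features (movement intensity derived from the acceleration stream and mean heart rate); the feature choice, the logistic form, and the weight values are illustrative assumptions, not the trained sleep model described in this application.

```python
import math
from typing import Sequence, Tuple

def sleep_features(accel: Sequence[Tuple[float, float, float]],
                   heart_rate: Sequence[float]) -> Tuple[float, float]:
    """Summarise a monitoring window: movement intensity and mean heart rate."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel]
    mean_mag = sum(mags) / len(mags)
    movement = sum((m - mean_mag) ** 2 for m in mags) / len(mags)  # variance of |a|
    return movement, sum(heart_rate) / len(heart_rate)

def predict_sleep(accel, heart_rate,
                  w_move: float = -8.0, w_hr: float = -0.15, bias: float = 10.0) -> bool:
    """Toy logistic scorer: low movement and a low heart rate push the score up."""
    if not accel or not heart_rate:          # no data in the window
        return False
    movement, mean_hr = sleep_features(accel, heart_rate)
    score = w_move * movement + w_hr * mean_hr + bias
    return 1.0 / (1.0 + math.exp(-score)) > 0.5   # True = predicted asleep
```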
  • S102: The bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user is in a sleep state.
  • S103: The mobile phone 100 determines whether it is in a stationary state.
  • the fact that the mobile phone 100 is in a non-stationary state can indicate that a user is using the mobile phone 100 .
  • the fact that the mobile phone 100 is in a stationary state cannot directly indicate that no user is using the mobile phone 100 .
  • the mobile phone 100 is in a stationary state, but is still used by the user.
  • the mobile phone 100 can determine whether it is in a stationary state through acceleration data collected by the acceleration sensor.
  • If it is determined that it is in a stationary state, the mobile phone 100 may perform the following step S104.
  • If it is determined that it is in a non-stationary state, the mobile phone 100 may perform the following step S106.
  • the method for determining whether the mobile phone 100 is in a stationary state in this embodiment of the present application is not limited.
  • the mobile phone 100 can also determine whether it is in a stationary state through the angular velocity data collected by the gyro sensor.
  • the mobile phone 100 monitors whether the first user is in a sleep state by judging whether it is in a stationary state, which can reduce the misjudgment of whether the first user is in a sleep state when the mobile phone 100 is in a stationary state but is still used by the first user.
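  • A minimal sketch of the stationary-state check, assuming access to a short buffer of recent accelerometer samples on the phone; the tolerance value and the magnitude-range criterion are illustrative assumptions (the gyroscope stream could be checked in the same way).

```python
import math
from typing import Sequence, Tuple

def is_stationary(accel_samples: Sequence[Tuple[float, float, float]],
                  tolerance: float = 0.05) -> bool:
    """Treat the phone as stationary if the acceleration magnitude barely varies
    over the buffered samples (tolerance is an illustrative value in m/s^2)."""
    if not accel_samples:
        return False
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
    return (max(mags) - min(mags)) < tolerance
```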
  • S104: The mobile phone 100 determines whether a user operation is monitored within a preset time.
  • the fact that the mobile phone 100 monitors the user operation within the preset time can indicate that a user is using the mobile phone 100 .
  • the fact that the mobile phone 100 does not monitor the user operation within the preset time cannot directly indicate that no user is using the mobile phone 100 .
  • for example, the length of the above-mentioned preset time is 10 minutes.
  • for example, the mobile phone 100 may not receive any user operation during video playback, but it is still being used by the user.
  • the above-mentioned user operation may be, for example, a touch operation acting on the screen of the mobile phone 100, a user operation acting on a button of the mobile phone 100, an input operation of a voice command, an input operation of an air gesture, and the like.
  • This embodiment of the present application does not limit the specific types of the above-mentioned user operations.
  • If no user operation is monitored within the preset time, the mobile phone 100 may perform the following step S105.
  • If a user operation is monitored within the preset time, the mobile phone 100 may perform the following step S106.
  • This embodiment of the present application does not limit the length of the foregoing preset time.
  • the mobile phone 100 monitors whether the first user enters a sleep state by judging whether a user operation is monitored within a preset time. This can reduce the misjudgment of whether the first user has entered a sleep state in scenes where the mobile phone 100 is in a stationary state but is still being used by the first user.
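  • The "no user operation within a preset time" check can be sketched with a last-interaction timestamp that the input subsystem updates on every touch, key, voice-command, or air-gesture event; the 10-minute default mirrors the example above, and the class and method names are assumptions for illustration.

```python
import time

PRESET_IDLE_SECONDS = 10 * 60  # example value from the description: 10 minutes

class InteractionMonitor:
    def __init__(self) -> None:
        self._last_operation = time.monotonic()

    def on_user_operation(self) -> None:
        """Call from touch / key / voice-command / air-gesture handlers."""
        self._last_operation = time.monotonic()

    def operation_within_preset_time(self) -> bool:
        """True if some user operation was seen within the preset time."""
        return (time.monotonic() - self._last_operation) < PRESET_IDLE_SECONDS
```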
  • S105: The mobile phone 100 determines whether human eyes are watching the screen.
  • the mobile phone 100 can capture images through the front camera. Based on the above image, the mobile phone 100 can use the human eye gaze recognition model to determine whether the human eye gazes at the screen.
  • the human eye gaze recognition model may be a neural network model.
  • the training data for training the human eye gaze recognition model may include image data of the human eye looking at the screen and image data of the human eye not looking at the screen.
  • the trained human eye gaze recognition model can identify the characteristics of the image in which the human eye gazes at the screen, so as to determine whether the human eye gazes at the screen of the mobile phone 100 .
  • the aforementioned front-facing camera for capturing images may be a low-power camera, for example, an infrared camera.
  • the above-mentioned low-power camera can be in a working state in real time. This embodiment of the present application does not limit the type of the above-mentioned front camera.
  • the mobile phone 100 when it is determined that there is no user operation acting on the mobile phone 100 within a preset time period, the mobile phone 100 can turn on the above-mentioned front camera to capture images.
  • the mobile phone 100 may turn on the aforementioned front-facing camera. The embodiment of the present application does not limit the time when the mobile phone 100 turns on the front camera.
  • If it is determined that human eyes are watching the screen, the mobile phone 100 may perform the following step S106.
  • If it is determined that no human eyes are watching the screen, the mobile phone 100 may perform the following step S108.
  • the mobile phone 100 monitors whether the first user is in a sleep state by judging whether human eyes are watching the screen. This can reduce the misjudgment of whether the first user has entered the sleep state in scenes where the mobile phone 100 is in a stationary state and there is no user operation within a preset time, but the mobile phone 100 is not being used by the first user (for example, the mobile phone plays a video while stationary, but the first user falls asleep during the video playback).
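  • A sketch of the gaze check is shown below, assuming a low-power front-camera frame and a pre-trained gaze classifier are available through hypothetical callables (`capture_front_frame`, `gaze_probability`); it only illustrates the control flow, not an actual camera or model API.

```python
from typing import Any, Callable, Optional

def eyes_on_screen(capture_front_frame: Callable[[], Optional[Any]],
                   gaze_probability: Callable[[Any], float],
                   threshold: float = 0.5) -> bool:
    """Grab one frame from the (low-power) front camera and score it with a
    pre-trained gaze model; both callables are hypothetical stand-ins."""
    frame = capture_front_frame()   # e.g. an infrared frame, or None if unavailable
    if frame is None:
        return False
    return gaze_probability(frame) >= threshold
```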
  • S106: The mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
  • if the mobile phone 100 determines that it is in a non-stationary state, or determines that it is in a stationary state but a user operation is monitored within the preset time, or determines that it is in a stationary state and no user operation is detected within the preset time but it is recognized that someone is watching the screen of the mobile phone 100, the mobile phone 100 can determine that a user is using the mobile phone 100. Further, the mobile phone 100 can determine whether that user is the first user.
  • the mobile phone 100 may determine whether the user is the first user through face recognition.
  • there is a binding relationship between the bracelet 200 and the mobile phone 100.
  • the first user wearing the bracelet 200 is the owner of the mobile phone 100 .
  • the mobile phone 100 can compare the face image collected by the front camera with the face image of the owner stored in the mobile phone 100 to determine whether the user of the mobile phone 100 is the first user.
  • the above-mentioned face image of the owner stored in the mobile phone 100 may be a face image used for face recognition to unlock the mobile phone 100 .
  • the embodiment of the present application does not limit the method for the mobile phone 100 to compare whether the face image collected by the front camera and the face image of the owner stored in the mobile phone 100 are the face image of the same user.
  • the mobile phone 100 may also determine whether the user of the mobile phone 100 is the first user by means of biometric identification methods such as voiceprint recognition and fingerprint recognition. This embodiment of the present application does not limit the specific method for determining whether the user of the mobile phone 100 is the first user.
  • If it is determined that the user is the first user, the mobile phone 100 may execute the following step S107.
  • If it is determined that the user is not the first user, the mobile phone 100 may execute the following step S108.
  • the mobile phone 100 detects whether the first user enters the sleep state by identifying whether the user is the first user. This can reduce the misjudgment of whether the first user has entered a sleep state that occurs when the user of the mobile phone 100 and the first user wearing the bracelet 200 are not the same person.
  • S107: The mobile phone 100 sends the first judgment result to the bracelet 200, indicating that the first user is using the mobile phone 100.
  • the mobile phone 100 can determine that the first user is using the mobile phone 100 . Then, the mobile phone 100 may send the first judgment result to the bracelet 200 to indicate that the first user is using the mobile phone 100 .
  • after receiving the first judgment result, the bracelet 200 can use the sleep model again to predict whether the first user enters the sleep state, and request the mobile phone 100 to confirm whether the first user enters the sleep state.
  • the bracelet 200 can use a sleep model to predict whether the first user is in a sleep state every preset time period (eg, 5 minutes, etc.).
  • the bracelet 200 may record the monitored data (such as heart rate data) as the data that the first user is in a non-sleep state.
  • S108: The mobile phone 100 sends the second judgment result to the bracelet 200, indicating that the first user is not using the mobile phone 100.
  • the mobile phone 100 may determine that the first user does not use the mobile phone 100 . Then, the mobile phone 100 can send the second judgment result to the bracelet 200, indicating that the first user does not use the mobile phone 100.
  • the bracelet 200 may determine the moment when the second judgment result is received as the moment when the first user enters the sleep state. Alternatively, the bracelet 200 may determine the moment at which it prejudged that the first user entered the sleep state as the moment when the first user enters the sleep state. In this embodiment of the present application, the moment at which the bracelet 200 determines that the first user enters the sleep state is not particularly limited. The bracelet 200 may record the data (such as heart rate data) monitored after the first user enters the sleep state as data of the first user in the sleep state.
  • the bracelet 200 can use the mobile phone 100 to confirm whether the first user enters the sleep state. This can reduce the misjudgment of whether the first user has entered the sleep state that occurs when the bracelet 200 alone monitors the first user and the first user maintains a fixed posture for a long time without being in a sleep state, and can improve the accuracy of monitoring the moment when the first user enters the sleep state. Thus, the bracelet 200 can improve the accuracy of sleep quality monitoring.
  • after determining that it is in a non-stationary state, the mobile phone 100 can directly determine whether the user of the mobile phone 100 is the first user. In this way, the mobile phone 100 does not need to perform steps S104 and S105, thereby saving the power consumption of the mobile phone 100.
  • if the mobile phone 100 determines that a user operation has been monitored within the preset time, it can directly determine whether the user of the mobile phone 100 is the first user. In this way, the mobile phone 100 does not need to perform step S105, thereby saving the power consumption of the mobile phone 100.
  • the execution order of the above steps S103 and S104 may be reversed. That is, after receiving the request from the bracelet 200 for confirming whether the first user enters the sleep state, the mobile phone 100 may first determine whether there is a user operation within the preset time. If it is determined that there is a user operation within the preset time, the mobile phone 100 may perform step S106. If it is determined that there is no user operation within the preset time, the mobile phone 100 can further determine whether it is in a stationary state. If it is determined that it is in a stationary state, the mobile phone 100 may execute step S105. If it is determined that it is in a non-stationary state, the mobile phone 100 may execute step S106.
  • the mobile phone 100 can simultaneously execute the above steps S103, S104, and S105 to determine whether the mobile phone 100 is being used by a user. If it is determined that the mobile phone 100 is being used by a user, the mobile phone 100 may further perform step S106 to determine whether the user is the first user. Otherwise, the mobile phone 100 may execute step S108 to indicate to the bracelet 200 that the first user has entered the sleep state.
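  • Putting steps S103-S108 together, the phone-side handler for the bracelet's request can be sketched as below; every helper on `phone` (`is_stationary`, `operation_within_preset_time`, `eyes_on_screen`, `current_user_is_owner`, `send_to_bracelet`) stands in for behaviour described above and is an assumption, not a real device API.

```python
def handle_sleep_confirmation_request(phone) -> None:
    """Phone-side sketch of FIG. 3: decide whether the first user is using the phone."""
    user_present = False
    if not phone.is_stationary():                      # S103: moving => someone is using it
        user_present = True
    elif phone.operation_within_preset_time():         # S104: recent touch/key/voice input
        user_present = True
    elif phone.eyes_on_screen():                       # S105: someone is watching the screen
        user_present = True

    if user_present and phone.current_user_is_owner():        # S106: is that user the first user?
        phone.send_to_bracelet("FIRST_USER_USING_PHONE")       # S107: first user is not asleep
    else:
        phone.send_to_bracelet("FIRST_USER_NOT_USING_PHONE")   # S108: treat first user as asleep
```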
  • FIG. 4 exemplarily shows a flowchart of another sleep monitoring method provided by an embodiment of the present application.
  • the method may include steps S201-S207, wherein:
  • the bracelet 200 uses the sleep model to predict that the first user enters the sleep state.
  • the bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user enters the sleep state.
  • steps S201 and S202 may refer to steps S101 and S102 in the method shown in FIG. 3 , respectively.
  • the mobile phone 100 determines whether it is in a stationary state.
  • for the method by which the mobile phone 100 determines whether it is in a stationary state, reference may be made to step S103 in the method shown in FIG. 3.
  • If it is determined that it is in a stationary state, the mobile phone 100 may execute step S204. That is, the mobile phone 100 can determine whether human eyes are watching the screen.
  • If it is determined that it is in a non-stationary state, the mobile phone 100 may execute step S205.
  • the mobile phone 100 determines whether human eyes are watching the screen.
  • the mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
  • the mobile phone 100 sends the first judgment result to the bracelet 200 , indicating that the first user is using the mobile phone 100 .
  • the mobile phone 100 sends the second judgment result to the bracelet 200 , indicating that the first user does not use the mobile phone 100 .
  • steps S204 to S207 reference may be made to steps S105 to S108 in the method shown in FIG. 3 , which will not be repeated here.
  • FIG. 5 exemplarily shows a flowchart of another sleep monitoring method provided by an embodiment of the present application.
  • the method may include steps S301-S307, wherein:
  • the bracelet 200 uses the sleep model to predict that the first user enters the sleep state.
  • the bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user enters the sleep state.
  • steps S301 and S302 may refer to steps S101 and S102 in the method shown in FIG. 3 , respectively.
  • the mobile phone 100 determines whether a user operation is monitored within a preset time.
  • for the method by which the mobile phone 100 determines whether a user operation is monitored within the preset time, reference may be made to step S104 in the method shown in FIG. 3.
  • If no user operation is monitored within the preset time, the mobile phone 100 may perform step S304. That is, the mobile phone 100 can determine whether human eyes are watching the screen.
  • If a user operation is monitored within the preset time, the mobile phone 100 may perform step S305.
  • the mobile phone 100 determines whether human eyes are watching the screen.
  • the mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
  • the mobile phone 100 sends the first judgment result to the bracelet 200 , indicating that the first user is using the mobile phone 100 .
  • the mobile phone 100 sends the second judgment result to the bracelet 200 , indicating that the first user does not use the mobile phone 100 .
  • steps S304 to S307 reference may be made to steps S105 to S108 in the method shown in FIG. 3 , which will not be repeated here.
  • FIG. 6 exemplarily shows a flowchart of another sleep monitoring method provided by an embodiment of the present application.
  • the method may include steps S401-S406, wherein:
  • the bracelet 200 uses the sleep model to predict that the first user enters the sleep state.
  • the bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user is in a sleep state.
  • steps S401 and S402 may refer to steps S101 and S102 in the method shown in FIG. 3 , respectively.
  • the mobile phone 100 determines whether human eyes are watching the screen.
  • the mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
  • the mobile phone 100 sends the first judgment result to the bracelet 200 , indicating that the first user is using the mobile phone 100 .
  • the mobile phone 100 sends the second judgment result to the bracelet 200 , indicating that the first user does not use the mobile phone 100 .
  • steps S403 to S406 reference may be made to steps S105 to S108 in the method shown in FIG. 3 , and details are not repeated here.
  • the mobile phone 100 when receiving a request from the bracelet 200 for determining whether the first user enters a sleep state, can monitor whether it is running a screen-casting application.
  • the fact that the mobile phone 100 runs a screen-casting application can indicate that a user is using the mobile phone 100 .
  • the mobile phone 100 can detect whether the user of the mobile phone 100 is the first user.
  • the mobile phone 100 may send a message for capturing images to a screen projection device, such as a TV.
  • when receiving the message, the screen projection device may capture an image of the area from which the screen projection device is viewed, and send the image to the mobile phone 100.
  • the mobile phone 100 can determine whether the image from the screen projection device includes the face image of the first user.
  • the fact that the image collected by the screen-casting device includes the face image of the first user may indicate that the first user casts the screen through the mobile phone 100 and watches the content played on the screen-casting device. That is, the first user is not in a sleep state.
  • the mobile phone 100 may send the first judgment result in the foregoing embodiment to the bracelet 200 , indicating that the first user is using the mobile phone 100 .
  • the fact that the image collected by the screen projection device does not include the face image of the first user may indicate that the users who use the mobile phone 100 to project the screen and watch the content played on the screen projection device do not include the first user.
  • the mobile phone 100 may send the second judgment result in the foregoing embodiment to the bracelet 200, indicating that the first user does not use the mobile phone 100.
  • the mobile phone 100 may detect that it is in a stationary state, there is no user operation within a preset time, and no human eyes are watching. However, the first user did not go to sleep, but was watching the screen-casting device.
  • the above method monitors whether the mobile phone 100 is running a screen-casting application, and uses images collected by the screen-casting device to monitor whether the first user enters a sleep state when the screen-casting application is running. This can reduce the misjudgment of whether the first user is in a sleep state in scenes where the mobile phone 100 is in a stationary state, has no user operation within a preset time, and no human eyes are watching its screen, but is still being used by the first user.
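  • The screen-casting variant can be sketched as follows; `is_casting`, `request_viewer_image`, and `contains_first_user_face` are hypothetical helpers standing in for the screen-casting state, the image request to the projection device, and the face comparison described above.

```python
from typing import Optional

def confirm_via_casting_device(phone) -> Optional[str]:
    """If a screen-casting app is running, decide using the projection device's image;
    return None to fall back to the checks in FIG. 3 to FIG. 6."""
    if not phone.is_casting():
        return None
    image = phone.request_viewer_image()        # ask the TV for a frame of its viewing area
    if image is not None and phone.contains_first_user_face(image):
        return "FIRST_USER_USING_PHONE"         # first user is watching the cast content
    return "FIRST_USER_NOT_USING_PHONE"         # casting, but the first user is not in view
```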
  • the mobile phone 100 may request other electronic devices having an image capturing device (eg, a camera), such as a TV, to perform image capturing.
  • when an image captured by an electronic device having an image capture device is obtained, the mobile phone 100 can determine whether the image contains the first user and determine the state of the first user. In this way, the mobile phone 100 can monitor whether the first user enters the sleep state, and send the monitoring result to the bracelet 200.
  • the mobile phone 100 may send a message for capturing images to the TV.
  • the TV can send the images captured by the camera to the mobile phone 100 .
  • the fact that the image collected by the television through the camera includes the face image of the first user may indicate that the first user is watching the television. That is, the first user has not entered the sleep state. If the mobile phone 100 determines that the image from the TV contains the face image of the first user, the mobile phone 100 can send the first judgment result in the foregoing embodiment to the bracelet 200 to indicate that the first user is using the mobile phone 100.
  • the embodiments of the present application do not limit the manner in which the mobile phone 100 establishes a communication connection with an electronic device having an image acquisition device, such as a television.
  • the above-mentioned communication connection manner may be, for example, a Bluetooth connection, a Wi-Fi network connection, or the like.
  • the bracelet 200 can establish a communication connection with other electronic devices having image capturing devices.
  • the bracelet 200 may send a message for capturing an image to the electronic device having the image capturing device.
  • the electronic device with the image acquisition device can send the acquired image to the bracelet 200 .
  • the bracelet 200 can determine the state of the first user according to these images to determine whether the first user has entered a sleep state.
  • the bracelet 200 can send these images to an electronic device with strong processing capability, such as the mobile phone 100 , among the electronic devices that have established a communication connection with itself.
  • the mobile phone 100 can determine the state of the first user according to the received image, so as to determine whether the first user has entered a sleep state.
  • the manner in which the bracelet 200 establishes a communication connection with an electronic device having an image acquisition device in this embodiment of the present application is not limited.
  • the bracelet 200 can use the mobile phone 100 to determine whether the first user exits the sleep state (ie, the first user wakes up).
  • the mobile phone 100 may send a message to the bracelet 200 for indicating that the first user exits the sleep state when it monitors, within the second time period, the user operation of unlocking the mobile phone 100 for the first time.
  • the bracelet 200 may determine the time when the user operation of unlocking the mobile phone 100 is detected as the time when the first user exits the sleep state. Combined with the moment when the first user enters the sleep state determined by the sleep monitoring method in the foregoing embodiment, the bracelet 200 can determine the total duration for which the first user is in the sleep state, and evaluate the sleep quality of the first user during the period from entering the sleep state to exiting the sleep state.
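  • Once the bracelet has an enter-sleep moment (from the flow above) and an exit-sleep moment (from the phone's unlock or alarm-off indication), the total sleep duration is a simple difference; a minimal sketch, assuming the two moments are available as datetime values:

```python
from datetime import datetime, timedelta

def sleep_duration(entered_sleep: datetime, exited_sleep: datetime) -> timedelta:
    """Total time the first user was considered to be in the sleep state."""
    return exited_sleep - entered_sleep

# Hypothetical example: asleep at 23:40, phone first unlocked at 07:05 the next day.
start = datetime(2021, 1, 15, 23, 40)
end = datetime(2021, 1, 16, 7, 5)
print(sleep_duration(start, end))  # 7:25:00
```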
  • the above unlocking method may be a method for unlocking by using biometric information.
  • the biometric information may be, for example, face information, voiceprint information, fingerprint information, and the like.
  • the mobile phone 100 may determine that the unlocked user is the first user.
  • the above-mentioned second time period may be a period of time (e.g., 12 hours) after the mobile phone 100 sends the second judgment result to the bracelet 200 indicating that the first user is not using the mobile phone 100.
  • the above-mentioned second time period may be a preset time period, for example, a time period from 5:00 am to 10:00 am.
  • the above-mentioned second time period may also be a time period estimated by the bracelet 200 when the first user exits the sleep state. The bracelet 200 can estimate the time period for the first user to exit the sleep state according to the data of the first user detected multiple times.
  • the mobile phone 100 may determine whether the user who turns off the alarm clock of the mobile phone 100 is the first user when monitoring, within the above-mentioned second time period, the user operation of turning off the alarm clock of the mobile phone 100. If it is determined that the user who turns off the alarm clock of the mobile phone 100 is the first user, the mobile phone 100 may send a message to the bracelet 200 for indicating that the first user exits the sleep state. When receiving the above-mentioned message, the bracelet 200 may determine the time when the user operation of turning off the alarm clock of the mobile phone 100 is monitored as the time when the first user exits the sleep state.
  • This embodiment of the present application does not limit the method for the mobile phone 100 to determine whether the user who turns off the alarm clock of the mobile phone 100 is the first user.
  • the mobile phone 100 can determine whether the user who turns off the alarm clock is the first user by identifying the biometric information such as face recognition, voiceprint recognition, and fingerprint recognition.
  • the mobile phone 100 may also use the methods shown in FIG. 3 to FIG. 6 to determine whether the first user is using the mobile phone 100 during the second time period. If it is determined that the first user is using the mobile phone 100, the mobile phone 100 may send a message to the bracelet 200 for indicating that the first user exits the sleep state.
  • This embodiment of the present application does not limit the implementation method for the mobile phone 100 to determine whether the first user exits the sleep state.
  • the bracelet 200 may use the sleep model in the foregoing embodiment to determine whether the first user exits the sleep state based on the acceleration data and the heart rate data.
  • it is possible that the first user has exited the sleep state, but the change in the posture of the bracelet 200 is small, or even the posture remains unchanged. In that case, the result determined by the bracelet 200 using the sleep model is often that the first user is still in a sleep state. This reduces the accuracy of sleep quality monitoring.
  • the above method, in which the bracelet 200 uses the mobile phone 100 to judge whether the first user exits the sleep state, can reduce the misjudgment of whether the first user exits the sleep state when the first user has woken up but has not gotten up, and improve the accuracy of sleep quality monitoring.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • Physiology (AREA)
  • Cardiology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pulmonology (AREA)
  • Dentistry (AREA)
  • Anesthesiology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Disclosed are a sleep monitoring method and a related apparatus, relating to the field of artificial intelligence. In the method, a second electronic device (200) that is worn on a first user and used for monitoring sleep quality can determine, by means of a first electronic device (100), whether the first user has entered a sleep state. After preliminarily determining that the first user has entered the sleep state, the second electronic device (200) sends, to the first electronic device (100), a request for determining whether the first user has entered the sleep state. When it is determined that the first user is using the first electronic device (100), the first electronic device (100) sends, to the second electronic device (200), a message indicating that the first user has not entered the sleep state. The method can reduce misjudgments, caused by using the second electronic device (200) alone for monitoring, as to whether the user has entered the sleep state, and improve the accuracy of monitoring the time when the user enters the sleep state.
PCT/CN2021/137461 2021-01-15 2021-12-13 Procédé de surveillance du sommeil et appareil associé WO2022151887A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110057952.7 2021-01-15
CN202110057952.7A CN114762588A (zh) 2021-01-15 2021-01-15 睡眠监测方法及相关装置

Publications (1)

Publication Number Publication Date
WO2022151887A1 true WO2022151887A1 (fr) 2022-07-21

Family

ID=82364577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137461 WO2022151887A1 (fr) 2021-01-15 2021-12-13 Procédé de surveillance du sommeil et appareil associé

Country Status (2)

Country Link
CN (1) CN114762588A (fr)
WO (1) WO2022151887A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024051790A1 (fr) * 2022-09-09 2024-03-14 荣耀终端有限公司 Procédé de détection d'état de sommeil, dispositifs électroniques et système

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117481614A (zh) * 2023-12-26 2024-02-02 荣耀终端有限公司 睡眠状态检测的方法及相关设备


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150181368A1 (en) * 2013-12-20 2015-06-25 Kabushiki Kaisha Toshiba Electronic apparatus, method and storage medium
CN105380596A (zh) * 2014-08-26 2016-03-09 三星电子株式会社 电子设备和电子设备中的睡眠监视方法
CN105042769A (zh) * 2015-06-30 2015-11-11 广东美的制冷设备有限公司 睡眠状态监控方法及装置、空调器系统
CN105433904A (zh) * 2015-11-24 2016-03-30 小米科技有限责任公司 睡眠状态检测方法、装置及系统
CN106308752A (zh) * 2016-08-23 2017-01-11 广东小天才科技有限公司 一种基于可穿戴设备的睡眠监测方法和系统
CN111839465A (zh) * 2020-07-30 2020-10-30 歌尔科技有限公司 睡眠检测方法、装置、智能穿戴设备及可读存储介质


Also Published As

Publication number Publication date
CN114762588A (zh) 2022-07-19

Similar Documents

Publication Publication Date Title
CN112289313A (zh) 一种语音控制方法、电子设备及系统
WO2020207328A1 (fr) Procédé de reconnaissance d'image et dispositif électronique
WO2021213165A1 (fr) Procédé de traitement de données multi-sources, dispositif électronique et support de stockage lisible par ordinateur
WO2021213151A1 (fr) Procédé de commande d'affichage et dispositif portable
WO2020019176A1 (fr) Procédé de mise à jour de voix de réveil d'un assistant vocal par un terminal, et terminal
WO2021169515A1 (fr) Procédé d'échange de données entre dispositifs, et dispositif associé
WO2022151887A1 (fr) Procédé de surveillance du sommeil et appareil associé
WO2021104104A1 (fr) Procédé de traitement d'affichage écoénergétique, et appareil
WO2020019355A1 (fr) Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire
WO2022007720A1 (fr) Procédé de détection de port pour un dispositif pouvant être porté, appareil et dispositif électronique
WO2021068926A1 (fr) Procédé de mise à jour de modèle, nœud de travail et système de mise à jour de modèle
CN111835907A (zh) 一种跨电子设备转接服务的方法、设备以及系统
CN113676339B (zh) 组播方法、装置、终端设备及计算机可读存储介质
WO2022105830A1 (fr) Procédé d'évaluation de sommeil, dispositif électronique et support de stockage
WO2022100407A1 (fr) Masque oculaire intelligent, dispositif de terminal, et procédé et système de gestion de santé
CN113467735A (zh) 图像调整方法、电子设备及存储介质
WO2022237598A1 (fr) Procédé de test d'état de sommeil et dispositif électronique
WO2022135144A1 (fr) Procédé d'affichage auto-adaptatif, dispositif électronique et support de stockage
WO2021204036A1 (fr) Procédé de surveillance du risque de sommeil, dispositif électronique et support de stockage
CN113467747B (zh) 音量调节方法、电子设备及存储介质
WO2021244186A1 (fr) Procédé de gestion et de contrôle de santé d'utilisateur, et dispositif électronique
CN114116610A (zh) 获取存储信息的方法、装置、电子设备和介质
CN111026285B (zh) 一种调节压力阈值的方法及电子设备
WO2022252786A1 (fr) Procédé d'affichage d'écran divisé en fenêtres et dispositif électronique
WO2021239079A1 (fr) Procédé de mesure de données et appareil associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21919077

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21919077

Country of ref document: EP

Kind code of ref document: A1