WO2024075982A1 - Electronic device and operation method thereof - Google Patents

Electronic device and operation method thereof

Info

Publication number
WO2024075982A1
Authority
WIPO (PCT)
Prior art keywords
electronic device
data
user
sensor
processor
Application number
PCT/KR2023/012668
Other languages
English (en)
Korean (ko)
Inventor
정승용
한승욱
황민경
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from KR1020220144336A external-priority patent/KR20240047876A/ko
Application filed by Samsung Electronics Co., Ltd. (삼성전자주식회사)
Publication of WO2024075982A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/01: Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/05: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/145: Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B 5/1455: Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters

Definitions

  • Embodiments of the present invention relate to electronic devices and methods of operating the same.
  • a user's sleep time or sleep score can be checked, but it may be difficult to specifically determine the user's sleep habits or any problems with the user's posture while sleeping.
  • sleep quality can vary depending on sleeping position.
  • the electronic device includes at least one communication module; at least one sensor; a display module; and a processor.
  • the processor may be configured to receive, from the wearable device through the at least one communication module, first data obtained through a motion sensor of the wearable device.
  • the processor may be configured to acquire, through the at least one sensor, second data related to the movement of the user of the wearable device.
  • the processor may be configured to check the user's movement and/or posture based on the first data and the second data.
  • the processor may be configured to receive, from the wearable device through the at least one communication module, third data obtained through a biometric sensor of the wearable device.
  • the processor may be configured to check the user's sleep state based on the third data.
  • the processor may be configured to display a screen through the display module using rendering data generated from the first data, the second data, and/or the third data.
  • a method of operating an electronic device may include receiving, from the wearable device through at least one communication module of the electronic device, first data obtained through a motion sensor of the wearable device.
  • the method may include acquiring, through at least one sensor of the electronic device, second data related to the movement of the user of the wearable device.
  • the method may include checking the user's movement and/or posture based on the first data and the second data.
  • the method may include receiving, from the wearable device through the at least one communication module, third data obtained through a biometric sensor of the wearable device.
  • the method may include checking the user's sleep state based on the third data.
  • the method may include displaying a screen through the display module using rendering data generated from the first data, the second data, and/or the third data.
  • the at least one operation may include receiving, from a wearable device through at least one communication module of the electronic device, first data obtained through a motion sensor of the wearable device.
  • the at least one operation may include acquiring, through at least one sensor of the electronic device, second data related to the movement of the user of the wearable device.
  • the at least one operation may include checking the user's movement and/or posture based on the first data and the second data.
  • the at least one operation may include receiving, from the wearable device through the at least one communication module, third data obtained through a biometric sensor of the wearable device.
  • the at least one operation may include checking the user's sleep state based on the third data.
  • the at least one operation may include displaying a screen through the display module using rendering data generated from the first data, the second data, and/or the third data.
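The electronic-device-side flow restated above can be sketched in Python. This is a minimal illustration only: the function names, data shapes, and thresholds below are assumptions for readability, and the disclosure does not specify any concrete algorithm for combining motion data, classifying sleep state, or generating rendering data.

```python
# Illustrative sketch of the claimed electronic-device flow.
# All names (fuse_posture, classify_sleep_state, ...) and thresholds
# are hypothetical; the patent does not define concrete algorithms.

def fuse_posture(first_data, second_data):
    """Combine wearable motion data (first data) with the electronic
    device's own sensor data (second data) to estimate movement/posture."""
    movement = max(first_data["accel_magnitude"], second_data["motion_level"])
    # Toy rule: a flat orientation is treated as a supine posture.
    posture = "supine" if second_data["orientation"] == "flat" else "side"
    return movement, posture

def classify_sleep_state(third_data):
    """Derive a coarse sleep state from biometric data (third data)."""
    # Toy thresholds; real systems use validated sleep-staging models.
    if third_data["heart_rate"] < 55:
        return "deep"
    return "light" if third_data["heart_rate"] < 65 else "awake"

def render_screen(movement, posture, sleep_state):
    """Stand-in for generating rendering data and displaying a screen."""
    return f"posture={posture}, movement={movement:.1f}, state={sleep_state}"

# Example data as it might arrive from the wearable and local sensors.
first_data = {"accel_magnitude": 0.2}                       # wearable motion sensor
second_data = {"motion_level": 0.1, "orientation": "flat"}  # device's own sensors
third_data = {"heart_rate": 52}                             # wearable biometric sensor

movement, posture = fuse_posture(first_data, second_data)
state = classify_sleep_state(third_data)
print(render_screen(movement, posture, state))
```

In this sketch the first and third data arrive over the communication module, while the second data comes from the device's own sensors (e.g., the LiDAR or photo sensor described later); the rendering step stands in for the screen displayed through the display module.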
  • the wearable device includes at least one sensor; a communication module; and a processor.
  • the processor may be configured to confirm the start of the user's sleep based on data acquired through the at least one sensor.
  • the processor may be configured to transmit, to the electronic device through the communication module, a signal requesting activation of at least one sensor of the electronic device, based on confirming the start of the sleep.
  • the processor may be configured to transmit, to the electronic device through the communication module, first data acquired through a motion sensor among the at least one sensor of the wearable device and second data acquired through a biometric sensor among the at least one sensor.
  • a method of operating a wearable device may include confirming the start of a user's sleep based on data acquired through at least one sensor of the wearable device.
  • the method may include transmitting, to an electronic device through a communication module of the wearable device, a signal requesting activation of at least one sensor of the electronic device, based on confirming the start of the sleep.
  • the method may include transmitting, to the electronic device through the communication module, first data acquired through a motion sensor among the at least one sensor of the wearable device and second data acquired through a biometric sensor among the at least one sensor.
  • the at least one operation may include confirming the start of the user's sleep based on data acquired through at least one sensor of the wearable device.
  • the at least one operation may include transmitting, to an electronic device through a communication module of the wearable device, a signal requesting activation of at least one sensor of the electronic device, based on confirming the start of the sleep.
  • the at least one operation may include transmitting, to the electronic device through the communication module, first data acquired through a motion sensor among the at least one sensor of the wearable device and second data acquired through a biometric sensor among the at least one sensor.
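The wearable-side flow (confirm sleep onset, ask the electronic device to activate its sensors, then transmit motion and biometric data) can also be sketched. Again, the heuristic for sleep onset and the message format are hypothetical stand-ins, not the disclosed method.

```python
# Hedged sketch of the wearable-side operations. The sleep-onset
# heuristic and message schema below are assumptions for illustration.

def sleep_started(samples, hr_threshold=60, motion_threshold=0.05):
    """Confirm sleep onset from recent sensor samples: sustained low
    heart rate and low motion (a toy heuristic, not the patent's method)."""
    return all(s["heart_rate"] < hr_threshold and
               s["motion"] < motion_threshold for s in samples)

class Link:
    """Stand-in for the wearable's communication module (e.g., a BLE link)."""
    def __init__(self):
        self.sent = []
    def send(self, message):
        self.sent.append(message)

def run_wearable(link, samples, motion_data, biometric_data):
    if sleep_started(samples):
        # Ask the electronic device to activate its own sensors first.
        link.send({"type": "activate_sensors_request"})
        # Then stream the motion (first) and biometric (second) data.
        link.send({"type": "first_data", "payload": motion_data})
        link.send({"type": "second_data", "payload": biometric_data})

link = Link()
samples = [{"heart_rate": 54, "motion": 0.01}] * 5
run_wearable(link, samples, motion_data=[0.01, 0.02], biometric_data={"hr": 54})
print([m["type"] for m in link.sent])
```

Note the ordering: the activation request precedes the data transfer, matching the claim in which transmission follows confirmation of sleep onset.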
  • FIG. 1 is a block diagram of an electronic device in a network environment according to an embodiment.
  • Figure 2 is a block diagram of devices included in a system according to an embodiment.
  • Figure 3 is a block diagram of an electronic device according to an embodiment.
  • Figure 4 is a block diagram of a wearable device according to an embodiment.
  • Figure 5 is a block diagram of a server according to an embodiment.
  • Figure 6 is a block diagram of an external device according to an embodiment.
  • Figure 7 is a flowchart of a method of operating an electronic device, according to an embodiment.
  • FIG. 8 is a flowchart of a method of operating an electronic device, according to an embodiment.
  • Figure 9 is a flowchart of a method of operating a wearable device, according to an embodiment.
  • Figure 10 is a flowchart of a method of operating an electronic device, according to an embodiment.
  • FIG. 11 is a diagram for explaining the operation of an electronic device according to an embodiment.
  • Figure 12 is a flowchart of a method of operating a wearable device, according to an embodiment.
  • FIG. 13 is a diagram for explaining the operation of an electronic device according to an embodiment.
  • FIG. 14 is a diagram for explaining the operation of an electronic device according to an embodiment.
  • FIG. 15 is a diagram for explaining the operation of devices included in the system according to an embodiment.
  • Figure 16 is a diagram for explaining a rendered image according to an embodiment.
  • the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into a single component (e.g., the display module 160).
  • the processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor.
  • when the electronic device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be configured to use less power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which the artificial intelligence model is executed, or through a separate server (e.g., the server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • An artificial neural network may be a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to these examples.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176) of the electronic device 101. The data may include, for example, input data or output data for software (e.g., the program 140) and commands related thereto.
  • The memory 130 may include the volatile memory 132 or the non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive a command or data to be used by a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., from a user).
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g., a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. The speaker can be used for general purposes, such as multimedia playback or recording playback.
  • The receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from, or as part of, the speaker.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (e.g., to a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • the audio module 170 can convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or may output sound through the sound output module 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a photo sensor, a light detection and ranging (LiDAR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (e.g., the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (e.g., the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • the corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • the wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the wireless communication module 192 may support a 5G network after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support a high frequency band (e.g., a mmWave band), for example, to achieve a high data rate.
  • the wireless communication module 192 may support various technologies for securing performance in a high frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements specified for the electronic device 101, an external electronic device (e.g., the electronic device 104), or a network system (e.g., the second network 199).
  • the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit signals or power to or receive signals or power from the outside (e.g., an external electronic device).
  • the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (e.g., a printed circuit board (PCB)).
  • the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • other components (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • a mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
  • at least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 and 104 may be the same type of device as the electronic device 101 or a different type.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • instead of executing the function or service on its own, the electronic device 101 may request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • Figure 2 is a block diagram of devices included in a system according to an embodiment.
  • the system may include an electronic device 101, a wearable device 202, and a server 203.
  • the system may include an external device 204.
  • the wearable device 202 may be a device that can be worn on the body of the user of the electronic device 101.
  • the wearable device 202 may be a smart watch, but there is no limitation on the type of the wearable device 202.
  • the wearable device 202 may communicate with the electronic device 101 (e.g., through wireless or wired communication). There are no restrictions on the communication method between the wearable device 202 and the electronic device 101.
  • the wearable device 202 can transmit data to the electronic device 101.
  • the wearable device 202 may communicate with the server 203 (e.g., through wireless or wired communication). There are no restrictions on the communication method between the wearable device 202 and the server 203.
  • the wearable device 202 can transmit data to the server 203.
  • the wearable device 202 may transmit data to the server 203 through the electronic device 101.
  • the wearable device 202 may receive data from the server 203.
  • the electronic device 101 may communicate with the wearable device 202 (e.g., through wireless or wired communication).
  • the electronic device 101 can receive a signal from the wearable device 202.
  • the electronic device 101 may receive data related to the user of the wearable device 202 from the wearable device 202 .
  • the electronic device 101 may communicate with the server 203 (e.g., through wireless or wired communication). There are no restrictions on the communication method between the electronic device 101 and the server 203.
  • the electronic device 101 can transmit data to the server 203.
  • the electronic device 101 may receive data from the server 203.
  • the electronic device 101 may communicate with the external device 204 (e.g., through wireless or wired communication).
  • the electronic device 101 can transmit data to the external device 204.
  • the external device 204 may be an augmented reality (AR) device, a virtual reality (VR) device, or an extended reality (XR) device, but there is no limitation on the type of the external device 204.
  • the external device 204 can receive data from the electronic device 101 .
  • the external device 204 may receive data from the server 203.
  • the external device 204 can display a screen using the received data. There are no restrictions on the way the external device 204 displays the screen.
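The data paths among the devices of this system (wearable, electronic device, server, external AR/VR/XR device) can be sketched with simple message-passing stand-ins. The `Node` class and message contents are illustrative assumptions; the disclosure places no restriction on the actual transport.

```python
# Hypothetical sketch of the data paths in the system of FIG. 2:
# wearable -> electronic device -> server, and phone/server -> external device.

class Node:
    """Stand-in for any device in the system that can receive data."""
    def __init__(self, name):
        self.name = name
        self.inbox = []
    def receive(self, sender, data):
        self.inbox.append((sender, data))

phone = Node("electronic_device_101")
server = Node("server_203")
xr = Node("external_device_204")

# Wearable data may reach the server via the electronic device,
# one of the paths described above.
wearable_data = {"motion": [0.1], "heart_rate": 58}
phone.receive("wearable_202", wearable_data)
server.receive(phone.name, wearable_data)

# The external (AR/VR/XR) device may receive display data from the
# electronic device or from the server.
xr.receive(phone.name, {"render": "sleep-posture scene"})
print(len(phone.inbox), len(server.inbox), len(xr.inbox))
```

The relay step mirrors the statement that the wearable device may transmit data to the server through the electronic device, and the final delivery mirrors the external device receiving data from either the electronic device or the server.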
  • Figure 3 is a block diagram of an electronic device according to an embodiment.
  • the electronic device 101 may include at least one of a speaker 301 (e.g., the sound output module 155 of FIG. 1), a microphone 302 (e.g., the input module 150 of FIG. 1), a processor 303 (e.g., the processor 120 of FIG. 1), a display 304 (e.g., the display module 160 of FIG. 1), a camera 305 (e.g., the camera module 180 of FIG. 1), a communication module 306 (e.g., the communication module 190 of FIG. 1), a memory 307 (e.g., the memory 130 of FIG. 1), a photo sensor 308 (e.g., a photo sensor included in the sensor module 176 of FIG. 1), or a LiDAR sensor 309 (e.g., a LiDAR sensor included in the sensor module 176 of FIG. 1). There is no limit on the type and number of communication modules 306 of the electronic device 101.
  • the electronic device 101 may include a sensor other than the photo sensor 308 or the LiDAR sensor 309 (eg, at least one sensor included in the sensor module 176 of FIG. 1).
  • the electronic device 101 can use the LiDAR sensor 309 to check the distance to an object (eg, the user's body or an object) or the coordinates of the object.
  • the electronic device 101 may include a sensor (e.g., illuminance sensor, temperature sensor, photo sensor) to check information (e.g., illuminance, temperature) about the user's surrounding environment.
  • Figure 4 is a block diagram of a wearable device according to an embodiment.
  • the wearable device 202 includes a speaker 401, a microphone 402, a processor 403, a display 404, a biometric sensor 405, a communication module 406, a memory 407, It may include at least one of a motion sensor 408 or a temperature sensor 409.
  • the motion sensor 408 may be a sensor for detecting the user's movement.
  • motion sensor 408 may be a gyro sensor or an acceleration sensor.
  • the motion sensor 408 may include a gyro sensor or an acceleration sensor.
  • the biometric sensor 405 may be a sensor for detecting a user's biosignal.
  • the wearable device 202 may check the user's heart rate or blood oxygen saturation (eg, SpO 2 concentration) through the biometric sensor 405.
  • the wearable device 202 may display a screen showing changes according to the characteristics of the user's biological signals through the display 404.
  • the wearable device 202 may output a sound representing a change according to the characteristics of the user's biological signals through the speaker 401.
  • the wearable device 202 may check an event in which the user leaves the sleeping space based on data acquired through the microphone 402 (e.g., sound of water intake, toilet sound, notification sound).
  • Figure 5 is a block diagram of a server according to an embodiment.
  • the server 203 may include at least one of a processor 501, a memory 502, or a communication module 503.
  • Server 203 receives data (e.g., data related to the user or related to the user's surroundings) from the wearable device 202 or electronic device 101, via the communication module 503. data) can be received.
  • the server 203 may receive data in real time from the wearable device 202 or the electronic device 101.
  • the server 203 may render data received from the wearable device 202 or the electronic device 101.
  • Server 203 may render data related to the user's posture (or movement) and/or the user's surrounding environment.
  • the server 203 provides the user's body mass index (BMI) data and/or data acquired through a sensor (e.g., the camera 305 of the electronic device 101, or the lidar sensor 309) (e.g., the user It can be rendered based on data related to the user's environment or data related to the user's surroundings.
  • the server 203 may correct the rendered data using existing clinical data related to sleep.
  • the server 203 may generate rendered data by using data received from the wearable device 202 or the electronic device 101 and existing clinical data related to sleep.
  • the server 203 may store data received from the wearable device 202 or the electronic device 101 and rendered data. Data related to sleep is shared in real time, and users can directly observe the rendered user before or after the set wake-up time.
  • the user can access the prepared virtual space and observe the user's sleep in 3D through an AR device, VR device, XR device, or device with a display (e.g., electronic device 101 or external device 204). there is.
  • Figure 6 is a block diagram of an external device according to an embodiment.
  • the external device 204 may include at least one of a processor 601, a memory 602, a communication module 603, or a display 604.
  • the external device 204 (eg, processor 601) may receive data (eg, rendered data) from the server 203 directly through the communication module 603.
  • the external device 204 may receive data (e.g., rendered data) provided from the server 203 to the electronic device 101 from the electronic device 101 through the communication module 603.
  • the external device 204 displays a screen (e.g., a screen indicating sleeping posture (or movement) and/or the surrounding environment) on the display 604 based on data provided from the server 203 or the electronic device 101. can be displayed.
  • FIG. 7 is a flowchart of a method of operating an electronic device according to an embodiment.
  • At least some of the operations in FIG. 7 may be omitted.
  • the operation order of the operations in FIG. 7 may be changed. Operations other than the operations of FIG. 7 may be performed before, during, or after the operations of FIG. 7 are performed.
  • the electronic device 101 may check the user's movement and/or posture.
  • the user may be a user of the electronic device 101.
  • the user may be a user of wearable device 202.
  • the user may be a user of the electronic device 101 and the wearable device 202.
  • the electronic device 101 may check the movement and/or posture of a person other than the user of the electronic device 101. Movement may be a dynamic movement of the body during or before sleep.
  • a posture can be a movement and/or posture.
  • checking posture may mean checking movement and/or posture.
  • checking one's posture may mean checking one's posture at one point in time, or it may mean continuously checking one's posture (or movement).
  • checking the posture may mean confirming the posture by analyzing an image.
  • checking the posture may mean checking the posture by analyzing the coordinates where the body is placed.
  • the electronic device 101 may check the user's movement and/or posture based on the image acquired through the camera 305.
  • the electronic device 101 may check the user's movement and/or posture based on data acquired through at least one sensor (eg, LiDAR sensor 309).
  • the electronic device 101 may check the user's movement and/or posture based on data received from the wearable device 202 through the communication module 306.
  • the electronic device 101 may receive an image acquired through a camera 305, data acquired through at least one sensor (e.g., a lidar sensor 309), and/or a wearable device through a communication module 306. Based on the data received from 202), the user's movement and/or posture can be confirmed.
  • the electronic device 101 may receive an image acquired through a camera 305, data acquired through at least one sensor (e.g., a lidar sensor 309), and/or a wearable device through a communication module 306. By synchronizing the data received from 202) in real time, the user's movements and/or posture can be confirmed.
  • the electronic device 101 may check the user's posture at one point in time, or may check the user's movements and/or posture for a specified time or continuously.
  • the user's posture e.g., sleeping posture
  • the electronic device 101 may check the user's biosignal.
  • the biosignal may include the user's heart rate or blood oxygen saturation (eg, SpO 2 concentration).
  • the electronic device 101 may check the user's biological signals based on data received from the wearable device 202 through the communication module 306.
  • the wearable device 202 can detect the user's biometric signal through the biometric sensor 405.
  • the wearable device 202 may transmit data about the user's biological signals acquired through the biometric sensor 405 to the electronic device 101 through the communication module 406.
  • the electronic device 101 may check the user's biometric signal based on data about the user's biosignal received from the wearable device 202 through the communication module 306.
  • the electronic device 101 may check the user's biological signals at one point in time, or may check the user's biological signals for a specified time or continuously.
  • the electronic device 101 may transmit data about the user to the server 203.
  • the electronic device 101 may transmit data about the user's movement and/or posture (eg, an image of the user or coordinate information of the user's body) to the server 203.
  • the electronic device 101 may transmit the user's image (or video) to the server 203.
  • the electronic device 101 may transmit coordinate information of the user's body to the server 203.
  • the electronic device 101 may transmit the user's image (or video) and coordinate information of the user's body to the server 203.
  • the electronic device 101 may transmit information about the user's biological signals to the server 203.
  • the electronic device 101 may transmit information about the user's surrounding environment (e.g., snoring sound, noise, brightness, air purity, degree of light flicker, location of surrounding objects) to the server 203.
  • the electronic device 101 may perform operation 705 while performing the operations of FIG. 7 .
  • the electronic device 101 may perform operation 705 in real time, depending on settings.
  • the electronic device 101 may perform operation 705 periodically or aperiodically, depending on settings. Depending on settings, the electronic device 101 may not perform operation 705.
  • the electronic device 101 may confirm the start of the user's sleep.
  • the start of sleep may be the point at which the user enters sleep.
  • the electronic device 101 may confirm the start of the user's sleep based on data about the user's movement received from the wearable device 202 through the communication module 306.
  • the electronic device 101 may confirm the beginning of the user's sleep based on the user's absence of movement for a specified period of time, but there is no limit to the method of confirming the beginning of the user's sleep.
  • the electronic device 101 may confirm the start of the user's sleep based on data on the user's biosignals received from the wearable device 202 through the communication module 306.
  • the electronic device 101 may confirm the beginning of a user's sleep based on the user's biosignals maintaining a specified level for a specified period of time, but there is no limit to the method of confirming the beginning of the user's sleep. .
  • the electronic device 101 receives a signal (e.g., a signal indicating the start of the user's sleep) received from the wearable device 202 through the communication module 306, or at least one sensor of the electronic device 101 (e.g., Based on the signal requesting activation of the sensor 309), the start of the user's sleep can be confirmed.
  • a signal e.g., a signal indicating the start of the user's sleep
  • the electronic device 101 may check the user's sleep state.
  • the sleep state may include a deep sleep state, a light sleep state, a waking state during sleep, a REM sleep state, a state entering sleep, or a state in which sleep has ended.
  • the sleep state may include the level or change in the user's biosignals (eg, heart rate or blood oxygen saturation).
  • the sleep state may include the user's movements and/or posture.
  • the sleeping state may include information about the surrounding environment of the sleeping user (e.g., noise, brightness, air purity, degree of light flickering, location of surrounding objects).
  • the electronic device 101 includes a microphone 302, a camera 305, at least one sensor (e.g., a photo sensor 308, a lidar sensor 309, or at least one sensor included in the sensor module 176). ), it is possible to obtain information about the user's surrounding environment (e.g. snoring sound, noise, brightness, air purity, degree of light flickering, location of surrounding objects).
  • the electronic device 101 may check the user's sleep state based on the information obtained in operation 701 or 703.
  • the electronic device 101 may check the user's sleeping state based on the image acquired through the camera 305.
  • the electronic device 101 may check the user's sleep state based on data acquired through at least one sensor (eg, the photo sensor 308 or the LiDAR sensor 309).
  • the electronic device 101 may check the user's sleep state based on data received from the wearable device 202 through the communication module 306.
  • the electronic device 101 may include an image acquired through a camera 305, data acquired through at least one sensor (e.g., a photo sensor 308 or a lidar sensor 309), and/or a communication module 306. ), the user's sleep state can be confirmed based on data received from the wearable device 202.
  • the electronic device 101 may transmit information about the sleep state to the server 203.
  • the electronic device 101 monitors the deep sleep state, light sleep state, waking state during sleep, REM sleep state, entering sleep state, sleep ending state, and the user's biosignals (e.g., heart rate or blood oxygen saturation).
  • the user's biosignals e.g., heart rate or blood oxygen saturation.
  • Level or change, information about the user's movements and/or posture, or information about the user's surroundings e.g. snoring sounds, noise, brightness, air quality, flicker of light, location of surrounding objects
  • the electronic device 101 may transmit information about when the user's sleep state changes to the server 203.
  • the electronic device 101 may transmit information about when the user's heart rate or blood oxygen saturation changes to the server 203. Sleep status or changes in sleep status will be described later.
  • the electronic device 101 may perform operation 713 while performing the operations of FIG. 7 .
  • the electronic device 101 can perform operation 713 in real time, depending on settings.
  • the electronic device 101 may perform operation 713 periodically or aperiodically, depending on settings. Depending on settings, the electronic device 101 may not perform operation 713.
  • the electronic device 101 may receive rendered data from the server 203 through the communication module 306.
  • the server 203 may render data received from the electronic device 101 (or the wearable device 202).
  • the server 203 may render the user's sleeping posture.
  • the server 203 may render the user's sleeping posture based on the user's image.
  • the server 203 may render the user's sleeping posture based on coordinate information of the user's body.
  • the server 203 may render the user's sleeping posture based on the user's image and coordinate information of the user's body.
  • the user's sleeping posture may be a static posture or may be a dynamic movement.
  • the server 203 may render the user's surrounding environment (e.g., location of objects around the user, snoring sounds, noise, brightness, air purity, and flicker of light) based on the received data. .
  • the server 203 may transmit the rendered data to the electronic device 101 . Rendered data will be described later.
  • the electronic device 101 may perform at least a portion of the operation of the server 203 within the electronic device 101.
  • the electronic device 101 may render the user's sleeping posture.
  • the electronic device 101 may render the user's sleeping posture based on the user's image.
  • the electronic device 101 may render the user's sleeping posture based on coordinate information of the user's body.
  • the electronic device 101 may render the user's sleeping posture based on the user's image and coordinate information of the user's body.
  • the electronic device 101 may render the user's surrounding environment (e.g., location of objects around the user, snoring sounds, noise, brightness, air purity, and flicker level of light).
  • the electronic device 101 may render the user's sleeping position and/or the user's surrounding environment without performing operations 705 and/or 713.
  • the electronic device 101 may include some of the data of the 705 operation (e.g., data about the user or data about the surrounding environment) and/or information of the 713 operation (e.g., information about the sleeping state or information about the surrounding environment). It may be transmitted to the server 203, and the remaining part may not be transmitted to the server 203.
  • the electronic device 101 may render the user's sleeping position and/or the user's surrounding environment based on data and/or information not transmitted to the server 203.
  • the electronic device 101 may render the user's sleeping position and/or the user's surrounding environment for data transmitted to the server 203, depending on settings.
  • the electronic device 101 uses rendered data received from the server 203 to display a screen (e.g., sleeping position) through the display 304. (or movement), and/or a screen representing the surrounding environment) may be displayed.
  • the electronic device 101 uses data rendered within the electronic device 101 to display a screen (e.g., sleeping position (or movement), and/or surrounding environment) through the display 304. screen) can be displayed.
  • the electronic device 101 may display the user's sleeping posture (or movement) and/or surrounding environment through the display 304 using the rendered data.
  • the electronic device 101 may transmit data (eg, rendered data) to the external device 204.
  • the electronic device 101 may transmit data (eg, rendered data) to the external device 204 so that the external device 204 displays the user's sleeping position (or movement) or surrounding environment.
  • data eg, rendered data
  • the screen display of the user's sleeping position (or movement) or surrounding environment will be described later.
  • FIG. 8 is a flowchart of a method of operating an electronic device according to an embodiment.
  • At least some of the operations in FIG. 8 may be omitted.
  • the operation order of the operations in FIG. 8 may be changed. Operations other than the operations of FIG. 8 may be performed before, during, or after the operations of FIG. 8 are performed. At least some of the operations in FIG. 8 may correspond to at least some of the operations in FIG. 7 .
  • the operations of FIG. 8 can be performed organically with the operations of FIG. 7.
  • the electronic device 101 may check a designated event.
  • the specified event may be the start of the user's sleep.
  • the electronic device 101 may confirm the start of the user's sleep in operation 709.
  • the designated event may confirm a request for activation of at least one sensor (eg, LiDAR sensor 309) of the electronic device 101.
  • the electronic device 101 may receive a signal requesting activation of at least one sensor (eg, the lidar sensor 309) of the electronic device 101 in operation 709.
  • the designated event may be that the electronic device 101 is placed on a holder.
  • the electronic device 101 can confirm that the electronic device 101 is mounted on the holder. There are no restrictions on the method of confirming that the electronic device 101 is mounted on the holder.
  • the designated event may be the start of charging of the electronic device 101. There are no limitations to the method of confirming the start of charging of the electronic device 101. There are no restrictions on the type of user input.
  • the designated event may be the arrival of the scheduled sleep time reserved for the electronic device 101.
  • the designated event may be a user input requesting activation of at least one sensor (eg, lidar sensor 309) of the electronic device 101. After confirming a user input requesting activation of at least one sensor (e.g., lidar sensor 309) of the electronic device 101, the electronic device 101 is based on the fact that the user's biosignal enters a designated range. Thus, at least one sensor (e.g., lidar sensor 309) can be activated.
  • the electronic device 101 may activate at least one sensor (e.g., lidar sensor 309) upon confirming a designated event. .
  • the electronic device 101 may deactivate at least one sensor (e.g., LiDAR sensor 309).
  • the electronic device 101 may activate at least one sensor (e.g., LiDAR sensor 309) as it confirms a specified event. You can.
  • the electronic device 101 may check the user's movement and/or posture.
  • the electronic device 101 may check the user's movement and/or posture based on data acquired through at least one activated sensor (eg, LiDAR sensor 309).
  • the electronic device 101 may include an image acquired through a camera 305, data acquired through at least one activated sensor (e.g., lidar sensor 309), and/or a wearable device through a communication module 306. Based on data received from the device 202, the user's movement and/or posture can be confirmed.
  • Operation 805 may be operation 701 of FIG. 7.
  • the electronic device 101 may perform the operations of FIG. 7 organically with operation 805.
  • Figure 9 is a flowchart of a method of operating a wearable device according to an embodiment.
  • At least some of the operations in FIG. 9 may be omitted.
  • the operation order of the operations in FIG. 9 may be changed. Operations other than the operations of FIG. 9 may be performed before, during, or after the operations of FIG. 9 are performed.
  • the operations of FIG. 9 can be performed organically with the operations of FIGS. 7 and 8.
  • the wearable device 202 may confirm the start of the user's sleep based on the user's data.
  • the wearable device 202 through at least one sensor (e.g., biometric sensor 405, motion sensor 408, or temperature sensor 409), collects user data (e.g., data of the user's biometric signals, user data of the user's movement, or data of the temperature of the user's body) can be obtained.
  • the wearable device 202 may confirm the start of the user's sleep based on the user's data. For example, the wearable device 202 may confirm the beginning of the user's sleep based on the user's absence of movement for a specified period of time.
  • the wearable device 202 may confirm the start of the user's sleep based on the user's biosignals maintaining a specified level for a specified period of time. There are no limitations to the way the wearable device 202 confirms the start of the user's sleep.
  • the wearable device 202 sends a signal requesting activation of at least one sensor (e.g., lidar sensor 309) of the electronic device 101. Can be transmitted to the electronic device 101 through the communication module 406.
  • the wearable device 202 confirms the start of the user's sleep, the wearable device 202 communicates a signal requesting activation of at least one sensor (e.g., the lidar sensor 309) of the electronic device 101. It can be transmitted to the electronic device 101 through the module 406.
  • the wearable device 202 collects the user's data (e.g., data of the user's biosignals, data of the user's movements, or temperature of the user's body). data) can be transmitted to the electronic device 101 through the communication module 406. Before confirming the start of the user's sleep, the wearable device 202 collects the user's data (e.g., data of the user's biosignals, data of the user's movement, or data of the user's body temperature) to the electronic device 101. ) can be transmitted.
  • the user's data e.g., data of the user's biosignals, data of the user's movements, or temperature of the user's body.
  • the wearable device 202 After confirming the start of the user's sleep, the wearable device 202 sends the user's data (e.g., data of the user's biological signals, data of the user's movement, or data of the user's body temperature) to the electronic device 101. It can be sent to .
  • the wearable device 202 may continuously transmit user data (eg, data on the user's biosignals, data on the user's movements, or data on the temperature of the user's body) to the electronic device 101 . Transmission of the user's data after confirming the start of the user's sleep will be described later.
  • FIG. 10 is a flowchart of a method of operating an electronic device according to an embodiment.
  • FIG. 10 can be explained with reference to FIG. 11.
  • FIG. 11 is a diagram for explaining the operation of an electronic device according to an embodiment.
  • At least some of the operations in FIG. 10 may be omitted.
  • the operation order of the operations in FIG. 10 may be changed. Operations other than the operations of FIG. 10 may be performed before, during, or after the operations of FIG. 10 are performed. At least some of the operations of FIG. 10 may correspond to at least some of the operations of FIGS. 7 and 8.
  • the operations of FIG. 10 can be performed organically with the operations of FIGS. 7, 8, and 9.
  • the electronic device 101 may check the user's movement and/or posture.
  • Operation 1001 may be operation 701 of FIG. 7 or operation 805 of FIG. 8.
  • the electronic device 101 may match the posture pattern.
  • Operation 1003 may be included in operation 1001.
  • the electronic device 101 may check the user's movement and/or posture through an operation of matching the posture pattern.
  • the electronic device 101 may match the information confirmed in operation 1001 with the posture pattern stored in the memory 307.
  • the posture pattern may be a pattern stored in the memory 307 in relation to the sleeping posture.
  • the postural pattern is at least one sleep position (e.g., Left to Up, Up to Left, Right to Up, Up to Right, Down to Left, Left to Down, Down to Right, Right to Down, Up to Up, Left to Left, Right to Right, or Down to Down), and there is no limit to the type of posture pattern. (a) of FIG.
  • 11 may be a sleeping position in which the sleeping position is curled up while lying on the right side. 11(b) may be a sleeping position lying down.
  • the electronic device 101 may determine the user's sleeping posture by comparing the image (or video) obtained at 1001 with the posture pattern stored in the memory 307.
  • the electronic device 101 may determine the user's sleeping posture by comparing the body coordinate data obtained at 1001 and the posture pattern stored in the memory 307.
  • the electronic device 101 may determine the user's sleeping posture through matching the posture pattern.
  • the electronic device 101 may determine the user's sleeping posture by matching the posture pattern using pre-stored user information (e.g., height, weight, or body type).
  • the electronic device 101 may determine the user's sleeping posture through matching the posture pattern.
  • the electronic device 101 may detect the user's tossing and turning during sleep based on data confirmed in operation 1001.
  • the electronic device 101 may measure the intensity of tossing and turning through matching of posture patterns according to the user's tossing and turning over time.
  • the electronic device 101 may transmit data on the user's movement and/or posture to the server 203.
  • Operation 1005 may be operation 705 of FIG. 7.
  • the user's movement and/or posture data may include information about the sleeping posture confirmed through matching the pattern of the posture of the 1003 motion.
  • Figure 12 is a flowchart of a method of operating a wearable device according to an embodiment.
  • At least some of the operations in FIG. 12 may be omitted.
  • the operation order of the operations in FIG. 12 may be changed. Operations other than the operations of FIG. 12 may be performed before, during, or after the operations of FIG. 12 are performed. At least some of the operations in FIG. 12 may correspond to at least some of the operations in FIG. 9 .
  • the operations of FIG. 12 can be performed organically with the operations of FIG. 9.
  • the wearable device 202 may confirm the start of the user's sleep.
  • Operation 1201 may be operation 901 of FIG. 9.
  • the wearable device 202 upon confirming the start of sleep of the user, detects at least one sensor (e.g., a biometric sensor) of the wearable device 202.
  • the mode of the 405), motion sensor 408, or temperature sensor 409) can be changed to sleep mode.
  • the mode of at least one sensor e.g, the biometric sensor 405, the motion sensor 408, or the temperature sensor 409) may include a normal mode and a sleep mode. In normal mode and sleep mode, the sensing settings (e.g., cycle, sensitivity, etc.) of at least one sensor (e.g., biometric sensor 405, motion sensor 408, or temperature sensor 409) may be changed.
  • the wearable device 202 changes at least one sensor (e.g., biometric sensor 405, motion sensor 408, or temperature sensor) to a sleep mode.
  • Data can be obtained through (409)).
  • the wearable device 202 may transmit the data obtained in operation 1205 to the electronic device 101 through the communication module 406.
  • the wearable device 202 collects data acquired through at least one sensor (e.g., biometric sensor 405, motion sensor 408, or temperature sensor 409) in sleep mode. can be transmitted to the electronic device 101.
  • the wearable device 202 acquires information through at least one sensor in the normal mode (e.g., the biometric sensor 405, the motion sensor 408, or the temperature sensor 409). Data can be transmitted to the electronic device 101.
  • FIG. 13 is a diagram for explaining the operation of an electronic device according to an embodiment.
  • Figure 13 shows the degree of the user's tossing and turning (e.g., toss and turn), blood oxygen saturation (e.g., SpO 2 ), heart rate (e.g., HR), and size of snoring (e.g., snoring) scaled over time. It's a graph.
  • the unit on the left side of the graph in FIG. 13 is arbitrarily designated, and there is no limitation on the unit of the graph.
  • the electronic device 101 monitors the user's degree of tossing and turning, blood oxygen saturation, heart rate, and snoring volume over time based on the operations of FIGS. 7, 8, 9, 10, or 12. You can check it.
  • FIG. 13 is a diagram illustrating only some of the data checked by the electronic device 101, and the electronic device 101 can also check other data related to the user's sleeping state.
  • operations 711 e.g., checking the sleep state
  • 713 e.g., transmitting information about the sleep state
  • the electronic device 101 may confirm that the degree of the user's tossing and turning (eg, tossing and turning) exceeds the reference value at the first time point (eg, t1) in FIG. 13 .
  • the electronic device 101 may check a change in the user's sleep state at a first time point (eg, t1) depending on the degree of the user's tossing and turning (eg, tossing and turning).
  • the electronic device 101 may confirm that the blood oxygen saturation (eg, SpO 2 ) decreases below the reference value (eg, 90%) at the second time point (eg, t2) in FIG. 13 .
  • the electronic device 101 may check a change in the user's sleep state at a second time point (eg, t2) according to blood oxygen saturation (eg, SpO 2 ).
  • the reference value (e.g., 90%) of blood oxygen saturation (e.g., SpO 2 ) may be determined differently depending on the user's information (e.g., age, weight, height, race, nationality, gender, or occupation).
  • the electronic device 101 may confirm that the heart rate (eg, HR) increases above the reference value (eg, 130) at the second time point (eg, t2) in FIG. 13 .
  • the electronic device 101 may check a change in the user's sleep state at a second time point (eg, t2) according to the heart rate (eg, HR).
  • the reference value (e.g., 130) of heart rate (e.g., HR) may be determined differently depending on the user's information (e.g., age, weight, height, race, nationality, gender, or occupation).
  • the electronic device 101 may confirm, at the second time point (e.g., t2) in FIG. 13, that blood oxygen saturation (e.g., SpO2) decreases below its reference value (e.g., 90%) and that heart rate (e.g., HR) increases above its reference value (e.g., 130), and may accordingly check a change in the user's sleep state at the second time point (e.g., t2).
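  • As a rough illustration of this check, the threshold comparison above can be sketched as follows (a minimal sketch, not the patent's implementation; the function name and the or-combination are assumptions, and the patent notes that reference values may be personalized per user):

```python
# Illustrative sketch of the sleep-state change check: a change is flagged
# when SpO2 falls below its reference value (e.g., 90%) and/or heart rate
# rises above its reference value (e.g., 130). Names are hypothetical.

SPO2_REF = 90.0  # reference blood oxygen saturation, in percent
HR_REF = 130.0   # reference heart rate, in beats per minute

def sleep_state_changed(spo2: float, hr: float) -> bool:
    """Flag a sleep-state change when either biosignal crosses its reference."""
    return spo2 < SPO2_REF or hr > HR_REF
```

In this sketch a single crossing is enough to flag a change, matching the description that either signal alone (or both together) may indicate a change at t2.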
  • the electronic device 101 may confirm that the size of snoring (eg, snoring) exceeds the reference value at the third time point (eg, t3) in FIG. 13 .
  • the electronic device 101 may confirm a change in the user's sleep state at a third time point (eg, t3) according to the size of snoring (eg, snoring).
  • the electronic device 101 may confirm that the user's sleep state has stabilized at the fourth time point (eg, t4) in FIG. 13 .
  • the electronic device 101 may confirm that the user's sleep state has stabilized at the fourth time point (e.g., t4) according to the user's degree of tossing and turning (e.g., toss and turn), blood oxygen saturation (e.g., SpO2), heart rate (e.g., HR), and/or snoring volume (e.g., snoring).
  • FIG. 14 is a diagram for explaining the operation of an electronic device according to an embodiment.
  • Figure 14 is a graph showing the user's blood oxygen saturation (eg, SpO 2 ) and heart rate (eg, HR) over time.
  • the electronic device 101 may check the user's sleep state according to the user's blood oxygen saturation (eg, SpO 2 ) and heart rate (eg, HR).
  • the electronic device 101 may check when the user's blood oxygen saturation begins to decrease (eg, t1, t3) or begins to increase (eg, t2, t4).
  • the electronic device 101 may check when the user's heart rate begins to increase (eg, t3) or decrease (eg, t2, t4).
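  • The identification of time points where a signal begins to increase or decrease (e.g., t1 through t4 above) can be pictured as a direction-change scan over sampled values. A hypothetical sketch; the patent does not specify an algorithm:

```python
def turning_points(samples):
    """Indices where the sampled series changes direction
    (i.e., begins to rise after falling, or begins to fall after rising)."""
    points = []
    prev = 0  # sign of the previous difference: +1 rising, -1 falling, 0 unknown
    for i in range(1, len(samples)):
        diff = samples[i] - samples[i - 1]
        sign = (diff > 0) - (diff < 0)
        if sign != 0 and sign != prev:
            if prev != 0:
                # the direction changed at the previous sample
                points.append(i - 1)
            prev = sign
    return points
```

For an SpO2 series that falls, recovers, and falls again, the returned indices would correspond to time points such as t2 (begins to increase) in the description above.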
  • the electronic device 101 may determine that the period from the first time point (e.g., t1) to the second time point (e.g., t2) in FIG. 14 is a period in which the user's sleep condition worsens (e.g., the sleep score decreases). For example, the electronic device 101 may determine that the user's sleeping position changed at the first time point (e.g., t1) and that the user's sleep condition then worsens (e.g., the sleep score decreases) from the first time point (e.g., t1) to the second time point (e.g., t2).
  • the electronic device 101 may determine that the period from the second time point (e.g., t2) to the third time point (e.g., t3) is a period in which the user's sleep condition improves (e.g., the sleep score increases). For example, the electronic device 101 may determine that the user's sleeping position changed at the second time point (e.g., t2) and that the user's sleep condition then improves (e.g., the sleep score increases) from the second time point (e.g., t2) to the third time point (e.g., t3).
  • the electronic device 101 may determine that the period from the third time point (e.g., t3) to the fourth time point (e.g., t4) is a period in which the user's sleep condition worsens (e.g., the sleep score decreases).
  • the electronic device 101 may determine that the period after the fourth time point (e.g., t4) is a period in which the user's sleep condition improves (e.g., the sleep score increases).
  • the electronic device 101 may determine that the reason the user's blood oxygen saturation is low and the heart rate is high at the second time point (e.g., t2) in FIG. 14 is that the user's posture changed at the first time point (e.g., t1).
  • the electronic device 101 may determine that the reason the user's blood oxygen saturation is high and the heart rate is low at the third time point (e.g., t3) in FIG. 14 is that the user's posture changed at the second time point (e.g., t2).
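  • The worsening/improving judgment above pairs the two trends: falling SpO2 together with rising HR reads as a worsening period, and the opposite as an improving period. A hedged sketch (the function name and the "stable" fallback are assumptions):

```python
def classify_period(spo2_start, spo2_end, hr_start, hr_end):
    """Classify a period between two time points from its SpO2 and HR trends:
    'worsening' (sleep score decreasing) when SpO2 falls while HR rises,
    'improving' (sleep score increasing) when SpO2 rises while HR falls."""
    if spo2_end < spo2_start and hr_end > hr_start:
        return "worsening"
    if spo2_end > spo2_start and hr_end < hr_start:
        return "improving"
    return "stable"
```

Applied to FIG. 14, the spans t1→t2 and t3→t4 would classify as "worsening" and t2→t3 as "improving".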
  • the electronic device 101 may transmit the information (e.g., information about sleep state) determined in the description of FIG. 14 to the server 203 (e.g., operation 713 in FIG. 7).
  • FIG. 15 is a diagram for explaining the operation of devices included in the system according to an embodiment.
  • FIG. 15 can be explained with reference to FIGS. 7, 8, 9, 10, 11, 12, 13, 14, 15, and 16.
  • Figure 16 is a diagram for explaining a rendered image according to an embodiment.
  • At least some of the operations in FIG. 15 may be omitted.
  • the operation order of the operations in FIG. 15 may be changed. Operations other than the operations of FIG. 15 may be performed before, during, or after the operations of FIG. 15 are performed. At least some of the operations in FIG. 15 may correspond to at least some of the operations in FIGS. 7, 8, 9, 10, 11, 12, 13, and 14. The operations of FIG. 15 may be performed organically with the operations of FIGS. 7, 8, 9, 10, 11, 12, 13, and 14.
  • the wearable device 202 may acquire data (e.g., data of the user's biosignals, data of the user's movement, or data of the temperature of the user's body) through at least one sensor (e.g., the biometric sensor 405, the motion sensor 408, or the temperature sensor 409).
  • the wearable device 202 may use at least one sensor (e.g., the biometric sensor 405, the motion sensor 408, or the temperature sensor 409) in normal mode.
  • the wearable device 202 may check the user's sleep (eg, start of sleep).
  • the wearable device 202 may confirm the user's sleep (e.g., start of sleep) based on data (e.g., data of the user's biosignals, data of the user's movement, or data of the temperature of the user's body).
  • Operation 1503 may be operation 901 of FIG. 9 or operation 1201 of FIG. 12.
  • the electronic device 101 may check the user's surrounding environment. Before the user's sleep (e.g., start of sleep) is confirmed, the electronic device 101 may obtain information about the user's surrounding environment (e.g., snoring sound, noise, brightness, air purity, degree of light flickering, or the location of surrounding objects) through the microphone 302, the camera 305, and/or at least one sensor (e.g., the photo sensor 308, the lidar sensor 309, or at least one sensor included in the sensor module 176).
  • upon confirming the user's sleep (e.g., start of sleep), the wearable device 202 may transmit a signal to the electronic device 101 through the communication module 406.
  • the wearable device 202 may transmit a signal indicating the user's sleep (eg, the start of sleep) to the electronic device 101.
  • the wearable device 202 may transmit, to the electronic device 101, a signal requesting activation of at least one sensor of the electronic device 101 (e.g., the photo sensor 308, the lidar sensor 309, or at least one sensor included in the sensor module 176).
  • the electronic device 101 may receive, from the wearable device 202 through the communication module 306, a signal (e.g., a signal indicating the user's sleep (e.g., start of sleep), or a signal requesting activation of at least one sensor of the electronic device 101 (e.g., the photo sensor 308, the lidar sensor 309, or at least one sensor included in the sensor module 176)).
  • upon receiving the signal of operation 1507 from the wearable device 202, the electronic device 101 may activate at least one sensor of the electronic device 101 (e.g., the photo sensor 308, the lidar sensor 309, or at least one sensor included in the sensor module 176).
  • the electronic device 101 may check the user's surrounding environment through the activated at least one sensor (e.g., the photo sensor 308, the lidar sensor 309, or at least one sensor included in the sensor module 176).
  • upon receiving the signal of operation 1507 from the wearable device 202, the electronic device 101 may obtain information about the user's surrounding environment (e.g., snoring sound, noise, brightness, air purity, degree of light flickering, or the location of surrounding objects) through the microphone 302, the camera 305, and/or at least one sensor (e.g., the photo sensor 308, the lidar sensor 309, or at least one sensor included in the sensor module 176).
  • upon receiving the signal of operation 1507 from the wearable device 202, the electronic device 101 may obtain user data (e.g., data on the user's movement and/or the location (or coordinates) of the user's body) through the activated at least one sensor (e.g., the photo sensor 308, the lidar sensor 309, or at least one sensor included in the sensor module 176).
  • upon confirming the user's sleep (e.g., start of sleep), the wearable device 202 (e.g., the processor 403) may change the mode of at least one sensor of the wearable device 202 (e.g., the biometric sensor 405, the motion sensor 408, or the temperature sensor 409) to sleep mode.
  • Operation 1513 may be operation 1203 of FIG. 12.
  • the wearable device 202 may acquire data (e.g., the user's data) through at least one sensor (e.g., the biometric sensor 405, the motion sensor 408, or the temperature sensor 409) whose mode has been changed to sleep mode.
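  • The mode change above can be pictured as lowering each sensor's sampling rate once sleep onset is confirmed, so that biosignals are still collected at reduced power. An illustrative sketch; the class, mode names, and sampling intervals are assumptions, not the wearable's actual firmware API:

```python
# Hypothetical sensor model: "normal" mode samples frequently while the
# user is awake; "sleep" mode samples less often to save power overnight.

class Sensor:
    NORMAL_INTERVAL_S = 1   # assumed: one sample per second while awake
    SLEEP_INTERVAL_S = 30   # assumed: one sample per 30 s during sleep

    def __init__(self, name):
        self.name = name
        self.mode = "normal"
        self.interval_s = self.NORMAL_INTERVAL_S

    def enter_sleep_mode(self):
        """Switch this sensor to the low-rate sleep-mode sampling interval."""
        self.mode = "sleep"
        self.interval_s = self.SLEEP_INTERVAL_S

# on confirming sleep onset, the wearable switches all its sensors
sensors = [Sensor("biometric"), Sensor("motion"), Sensor("temperature")]
for s in sensors:
    s.enter_sleep_mode()
```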
  • the wearable device 202 may transmit data acquired through at least one sensor changed to sleep mode (e.g., the microphone 402, the biometric sensor 405, the motion sensor 408, or the temperature sensor 409) to the electronic device 101 through the communication module 406.
  • the electronic device 101 (e.g., the processor 303) may receive, from the wearable device 202, the data acquired through at least one sensor (e.g., the microphone 402, the biometric sensor 405, the motion sensor 408, or the temperature sensor 409).
  • Operation 1515 may be operation 1207 of FIG. 12.
  • the electronic device 101 may check the user's movement and/or posture.
  • Operation 1517 may be operation 701 of FIG. 7, operation 805 of FIG. 8, or operations 1001 and 1003 of FIG. 10.
  • the electronic device 101 may transmit data about the user's movement and/or posture to the server 203.
  • the server 203 may receive data about the user's movement and/or posture (eg, an image of the user or coordinate information of the user's body) from the electronic device 101 .
  • Operation 1519 may be operation 705 of FIG. 7 or operation 1005 of FIG. 10.
  • the electronic device 101 may check the user's sleep state.
  • Operation 1521 may be operation 711 of FIG. 7.
  • the electronic device 101 may transmit information about the user's sleep state to the server 203. Operation 1523 may be operation 713 of FIG. 7. As described above with respect to FIG. 7, the electronic device 101 may not transmit data or information to the server 203, or may transmit part of the data or information to the server 203 and not transmit the remaining part. The electronic device 101 may perform at least part of the operation of the server 203 within the electronic device 101. For example, the electronic device 101 may directly render data based on the data and/or information. An embodiment in which rendering is performed directly in the electronic device 101 can be understood with reference to the description of FIG. 7.
  • operations 1519, 1523, 1525, and 1527 may be omitted or simplified.
  • all of the real-time data (e.g., data about the user, data about the surrounding environment, information about the sleep state, and/or information about the surrounding environment) may be rendered on the electronic device 101, or all of it may be rendered on the server 203.
  • at least some of the real-time data (e.g., data about the user, data about the surrounding environment, information about the sleep state, and/or information about the surrounding environment) may be rendered on the electronic device 101, and at least some of the remaining data may be rendered on the server 203.
  • the server 203 may render data received from the electronic device 101 (or the wearable device 202).
  • the server 203 may generate rendering data using data received from the electronic device 101 (or the wearable device 202).
  • the server 203 may render the user's sleeping posture.
  • the server 203 may render the user's sleeping posture based on the user's image.
  • the server 203 may render the user's sleeping posture based on coordinate information of the user's body.
  • the server 203 may render the user's sleeping posture based on the user's image and coordinate information of the user's body.
  • Figure 16(a) is a rendering of the sleeping posture of the character corresponding to the user.
  • Figure 16(b) renders the sleeping posture of a virtual person corresponding to the user by modeling the user.
  • the user's sleeping posture may be a static posture or may be a dynamic movement.
  • Server 203 may correct the rendered data based on sleep-related clinical data.
  • the server 203 may store the rendered data in the memory 502.
  • the server 203 may transmit rendered data to the electronic device 101 through the communication module 503.
  • the electronic device 101 may receive rendered data from the server 203 through the communication module 306.
  • the rendered data may be a static image or a dynamic image.
  • Operation 1527 may be operation 715 of FIG. 7.
  • the rendered data may include data about sounds (e.g., snoring sounds, ambient noise).
  • the rendered data that the electronic device 101 receives from the server 203 may be rendered data corresponding to the entire period.
  • the total period may be the total period corresponding to data provided from the electronic device 101 to the server 203.
  • the server 203 can render all data corresponding to the entire period.
  • the server 203 may transmit all rendered data corresponding to the entire period to the electronic device 101 .
  • the rendered data that the electronic device 101 receives from the server 203 may be rendered data corresponding to a specific period.
  • the specific period may be determined based on information about the sleep state (e.g., the time of a change in the sleep state) provided from the electronic device 101 to the server 203. For example, referring to FIGS. 13 and 14, the server 203 may determine, as the specific period, a designated period including the change in the sleep state, based on the information about the sleep state (e.g., the time of the change in the sleep state) provided from the electronic device 101.
  • the server 203 may render only data corresponding to a specific period and transmit the rendered data corresponding to the specific period to the electronic device 101 .
  • the server 203 may render all data corresponding to the entire period and transmit the rendered data corresponding to a specific period to the electronic device 101 .
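  • Selecting the rendered data for a designated period around a sleep-state change time can be sketched as a simple time-window filter over timestamped frames. The names and interval lengths below are hypothetical; the patent only states that the period includes the change time:

```python
def render_window(change_t, before_s, after_s, frames):
    """Keep only the (timestamp, frame) pairs that fall within the
    designated period [change_t - before_s, change_t + after_s]."""
    lo, hi = change_t - before_s, change_t + after_s
    return [(t, f) for (t, f) in frames if lo <= t <= hi]
```

For instance, with a change detected at t = 50 s and a window of 15 s on each side, only frames timestamped 35–65 s would be rendered and transmitted.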
  • the electronic device 101 may display, through the display 304 using the rendered data received from the server 203, a screen (e.g., a screen representing the sleeping position (or movement) and/or the surrounding environment). Operation 1529 may be operation 717 of FIG. 7.
  • the electronic device 101 (e.g., the processor 303) may display, through the display 304, a screen (e.g., the diagram of FIG. 14) representing changes in the user's biosignals and the change time points of the user's sleep state (e.g., t1, t2, t3, or t4).
  • the electronic device 101 may display, on (or together with) the screen (e.g., the diagram of FIG. 14) showing the changes in the user's biosignals and the change time points of the user's sleep state through the display 304, a screen (e.g., a screen representing the sleeping position (or movement) and/or the surrounding environment) using the rendered data received from the server 203.
  • the electronic device 101 may display, through the display 304, a screen (e.g., a screen representing the sleeping position (or movement) and/or the surrounding environment) using the rendered data corresponding to a period before and after the first time point (e.g., t1), according to a user input selecting an icon corresponding to the second time point (e.g., t2) on the screen corresponding to FIG. 14.
  • the electronic device 101 may display, through the display 304, a screen (e.g., a screen representing the sleeping position (or movement) and/or the surrounding environment) using the rendered data corresponding to a period from the first time point (e.g., t1) to the second time point (e.g., t2), according to a user input selecting an icon corresponding to the second time point (e.g., t2) on the screen corresponding to FIG. 14.
  • the electronic device 101 may display, through the display 304, a screen (e.g., a screen representing the sleeping position (or movement) and/or the surrounding environment) using the rendered data corresponding to a period before and after the second time point (e.g., t2), according to a user input selecting an icon corresponding to the third time point (e.g., t3) on the screen corresponding to FIG. 14.
  • the electronic device 101 may display, through the display 304, a screen (e.g., a screen representing the sleeping position (or movement) and/or the surrounding environment) using the rendered data corresponding to a period from the second time point (e.g., t2) to the third time point (e.g., t3), according to a user input selecting an icon corresponding to the third time point (e.g., t3) on the screen corresponding to FIG. 14.
  • the electronic device 101 may display, through the display 304, a screen (e.g., a screen representing the sleeping position (or movement) and/or the surrounding environment) using the rendered data corresponding to a specific time point, according to a user input selecting an icon corresponding to the specific time point.
  • the electronic device 101 can capture the user's movements.
  • the electronic device 101 may provide feedback using lighting, vibration, sound, etc.
  • the electronic device 101 or the server 203 may analyze captured user movements while providing feedback to the user.
  • by analyzing the captured user's movements, the electronic device 101 or the server 203 may detect the position of the user's upper airway and/or body (e.g., the position of joints and internal organs) when a change in a body signal (e.g., blood oxygen saturation (e.g., SpO2) or heart rate (e.g., HR)) occurs.
  • the electronic device 101 or the server 203 may define the user's snoring sound in decibel units.
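  • Expressing the snoring sound in decibel units amounts to a logarithmic conversion of the microphone amplitude. A minimal sketch; the reference amplitude and function name are assumptions:

```python
import math

def amplitude_to_db(rms: float, ref: float = 1.0) -> float:
    """Convert an RMS microphone amplitude to decibels relative to
    the reference amplitude `ref`, using the standard 20*log10 rule."""
    return 20.0 * math.log10(rms / ref)
```

With this convention a tenfold increase in amplitude corresponds to +20 dB, which is how a snoring-volume pattern over time (as in FIG. 13) could be scaled.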
  • the electronic device 101 may display sleep apnea and sleep stages based on data received from the server 203 or on an internal calculation result of the electronic device 101, using the volume and pattern of the user's snoring sound and data about the user's sleeping position.
  • in order to guide a good posture through virtualization and secure quality sleep time according to the user's physical characteristics, the electronic device 101 or the server 203 may monitor the user's heart rate, SpO2 level, and snoring according to the user's body fat mass and posture, and may guide the user by personalizing and shaping the most suitable posture.
  • the electronic device 101 or the server 203 may provide feedback about poor sleeping posture to the user through sleep indicators.
  • the electronic device 101 may transmit data (eg, rendered data) to the external device 204.
  • the electronic device 101 may transmit data (e.g., rendered data) to the external device 204 so that the external device 204 displays a graph about the user's sleep state (e.g., FIG. 14), a screen about the user's sleeping posture (or movement), or a screen about the user's surrounding environment.
  • the electronic device 101 may transmit data (eg, rendered data) to the wearable device 202.
  • the electronic device 101 may transmit data (e.g., rendered data) to the wearable device 202 so that the wearable device 202 displays a graph about the user's sleep state (e.g., FIG. 14), a screen about the user's sleeping posture (or movement), or a screen about the user's surrounding environment.
  • based on data received from the electronic device 101, the external device 204 may display, through the display 604, a screen (e.g., a graph about the user's sleep state (e.g., FIG. 14), or a screen showing the user's sleeping posture (or movement) or the user's surrounding environment).
  • the external device 204 may be an augmented reality (AR) device, a virtual reality (VR) device, or an extended reality (XR) device, and there is no limitation on the type of screen displayed on the external device 204.
  • the user can virtually check the user's sleeping position in bed through the electronic device 101 or the external device 204.
  • the electronic device 101 or the external device 204 may intuitively provide the user with a screen showing the correlation between attention factors related to the user's sleep (e.g., bed life) and the actual user's sleep.
  • Data related to sleep is shared in real time, and users can directly observe the rendered user before or after the set wake-up time.
  • the data shared in real time may be rendered in the electronic device 101 or the server 203 based on the user's BMI (body mass index) data or data acquired through a sensor with respect to the user's head, chest, left hand, right hand, left leg, and right leg.
  • the server 203 may perform rendering using the coordinate data.
  • the electronic device 101 may perform rendering within the electronic device 101 using coordinate data related to the location of the user's body.
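  • The per-body-part input described above (coordinates for head, chest, left hand, right hand, left leg, and right leg, plus BMI) can be pictured as the payload assembled before rendering. A hedged sketch with hypothetical field names; the patent does not define a data format:

```python
# Assumed body parts, matching the list in the description above.
BODY_PARTS = ("head", "chest", "left_hand", "right_hand", "left_leg", "right_leg")

def build_render_input(bmi, coords):
    """Assemble the per-body-part coordinate payload used for rendering.
    `coords` maps each body part to an (x, y, z) tuple; a missing part
    is treated as an error rather than silently dropped."""
    missing = [p for p in BODY_PARTS if p not in coords]
    if missing:
        raise ValueError(f"missing coordinates for: {missing}")
    return {"bmi": bmi, "coords": {p: coords[p] for p in BODY_PARTS}}
```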
  • the user can access the prepared virtual space and observe the user's sleep in 3D through an AR device, a VR device, an XR device, or a device with a display (e.g., the electronic device 101 or the external device 204).
  • using the rendered data, the user can observe, on the screen displayed on the electronic device 101 or the external device 204, what the user looks like according to the sleep state (e.g., a good, average, or bad sleep state) in relation to certain events (e.g., events related to HR, SpO2, snoring, or tossing and turning).
  • the electronic device 101 or the server 203 may render a sleep simulation based on the user's sleep pattern when the sleep score is good, or may match the same sleep data to the user in the virtual world.
  • the electronic device 101 or the server 203 may render the user's sleeping posture based on an event that occurred during the user's sleeping time.
  • the electronic device 101 or the external device 204 may project the user's sleep data onto a virtual self and provide simulation results so that the user can observe himself or herself at the moment of falling asleep.
  • the electronic device 101, the wearable device 202, or the external device 204 may display a sleep graph so that the user can check the user's sleep status over time. Sleep data can be categorized by country, race, age, or gender.
  • the electronic device 101 includes at least one communication module 306; at least one sensor (176; 308; 309); display module (160; 304); and a processor (120; 303).
  • the processor (120; 303) may be set to receive, from the wearable device 202 through the at least one communication module 306, first data obtained through the motion sensor 408 of the wearable device 202.
  • the processor may be set to acquire second data related to the movement of the user of the wearable device 202 through the at least one sensor (176; 308; 309).
  • the processor may be set to check the user's movement and/or posture based on the first data and the second data.
  • the processor may be set to receive, from the wearable device 202 through the at least one communication module 306, third data acquired through the biometric sensor 405 of the wearable device 202.
  • the processor may be set to check the user's sleep state based on the third data.
  • the processor may be set to display a screen through the display module (160; 304) using rendering data generated using the first data, the second data, and/or the third data.
  • At least part of the rendering data may be generated in the electronic device 101.
  • At least part of the rendering data may be generated in the server 203.
  • the processor 120 (303) may be configured to transmit fourth data about the movement and/or the posture of the user to the server 203 through the at least one communication module 306.
  • the processor 120 (303) may be set to transmit information about the sleep state to the server 203 through the at least one communication module 306.
  • the processor 120 (303) may be configured to receive at least part of the rendering data from the server 203 through the at least one communication module 306.
  • the at least one sensor may include a LiDAR sensor (309).
  • the electronic device 101 may include a memory 130; 307.
  • the processor (120; 303) may be set to match the first data and the second data with a posture pattern stored in the memory (130; 307).
  • the processor 120 (303) may be set to check the user's posture based on the matching result.
  • the processor 120 (303) may be set to check the sleeping state of the user based on the confirmed posture and the third data.
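  • The matching of the first and second data against stored posture patterns can be sketched as a nearest-neighbor comparison over body coordinates. An illustrative sketch; the patent does not specify the matching metric, and the labels and coordinates in the example are hypothetical:

```python
def match_posture(sample, patterns):
    """Return the label of the stored posture pattern whose coordinate
    vector is closest (squared Euclidean distance) to the sampled one."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(patterns, key=lambda label: sq_dist(sample, patterns[label]))
```

For example, a sample near the stored "left_side" pattern would be identified as that posture, and the confirmed posture could then be combined with the biometric (third) data to check the sleep state.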
  • the processor (120; 303) may be set to activate the at least one sensor (176; 308; 309) upon confirming a designated event.
  • the processor 120; 303 may be set to acquire the second data through the activated at least one sensor 176; 308; 309.
  • the designated event may be confirming the start of sleep of the user of the wearable device 202 based on a signal received from the wearable device 202 through the at least one communication module 306.
  • the designated event may be confirming, based on a signal received from the wearable device 202 through the at least one communication module 306, a request for activation of the at least one sensor (176; 308; 309) of the electronic device 101.
  • the designated event may be confirmation that the electronic device 101 is mounted on a holder.
  • the designated event may be confirming the start of charging of the electronic device 101.
  • the processor (120; 303) may be set to check, based on the third data, a first time point at which a change in the sleep state is detected.
  • the processor 120 (303) may be set to transmit information about the first time point to the server 203 as information about the sleep state.
  • the processor (120; 303) may be set to identify, based on the third data, the time point at which changes in the user's oxygen saturation and heart rate are detected as the first time point.
  • the server 203 may be set to generate the rendering data using the fourth data from a second time point, which is a first time interval before the first time point, to a third time point, which is a second time interval after the first time point.
  • the electronic device 101 may include a camera 180; 305.
  • the processor (120; 303) may be set to transmit an image of the surrounding environment acquired through the camera (180; 305) to the server (203).
  • the server 203 may be set to generate the rendering data using the surrounding environment image and the fourth data.
  • a method of operating the electronic device 101 may include receiving, from the wearable device 202 through at least one communication module 306 of the electronic device 101, first data acquired through the motion sensor 408 of the wearable device 202.
  • the method may include acquiring second data related to the movement of the user of the wearable device 202 through at least one sensor 176; 308; 309 of the electronic device 101.
  • the method may include checking the user's movement and/or posture based on the first data and the second data.
  • the method may include receiving, from the wearable device 202 through the at least one communication module 306, third data obtained through the biometric sensor 405 of the wearable device 202.
  • the method may include checking the user's sleep state based on the third data.
  • the method may include displaying a screen through the display module (160; 304) of the electronic device 101 using rendering data generated using the first data, the second data, and/or the third data.
  • At least part of the rendering data may be generated in the electronic device 101.
  • At least part of the rendering data may be generated in the server 203.
  • the method may include transmitting fourth data about the movement and/or posture of the user to the server 203 through the at least one communication module 306.
  • the method may include transmitting information about the sleep state to the server 203 through the at least one communication module 306.
  • the method may include receiving the at least part of the rendering data from the server 203 through the at least one communication module 306.
  • the at least one sensor may include a LiDAR sensor (309).
  • the method may include matching the first data and the second data with a posture pattern stored in the memory 130 (307) of the electronic device 101.
  • the method may include an operation of confirming the posture of the user based on the result of the matching.
  • the method may include confirming the sleeping state of the user based on the confirmed posture and the third data.
  • the method may include activating the at least one sensor (176; 308; 309) upon identifying a designated event.
  • the method may include acquiring the second data through the activated at least one sensor (176; 308; 309).
  • the designated event may be confirming the start of sleep of the user of the wearable device 202 based on a signal received from the wearable device 202 through the at least one communication module 306.
  • the designated event may be confirming, based on a signal received from the wearable device 202 through the at least one communication module 306, a request for activation of the at least one sensor (176; 308; 309) of the electronic device 101.
  • the designated event may be confirmation that the electronic device 101 is mounted on a holder.
  • the designated event may be confirming the start of charging of the electronic device 101.
  • the method may include confirming a first time point at which the change in sleep is detected based on the third data.
  • the method may include transmitting information about the first time point to the server 203 as information about the sleep state.
  • the method may include an operation of confirming, as the first time point, the time at which changes in the user's oxygen saturation and heart rate are detected, based on the third data.
  • the server 203 may be set to generate the rendering data using the fourth data from a second time point, which precedes the first time point by a first time interval, to a third time point, which follows the first time point by a second time interval.
  • the method may include transmitting an image of the surrounding environment obtained through the camera 180 (305) of the electronic device 101 to the server 203.
  • the server 203 may be set to generate the rendering data using the surrounding environment image and the fourth data.
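The posture-matching operation enumerated in the claims above (the first data from the wearable's motion sensor and the second data from the electronic device's sensor, e.g. the LiDAR sensor 309, matched against posture patterns stored in memory) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the pattern representation, the cosine-similarity score, and all names and values are invented for the sketch.

```python
# Illustrative sketch of matching sensed data against stored posture patterns.
# first_data:  gravity-direction vector from the wearable's motion sensor (assumed).
# second_data: a body-height ratio estimated from the device's LiDAR scan (assumed).
import math

POSTURE_PATTERNS = {
    # pattern name: (reference acceleration vector, reference body-height ratio)
    "supine": ((0.0, 0.0, 1.0), 0.15),
    "side":   ((0.0, 1.0, 0.0), 0.30),
    "prone":  ((0.0, 0.0, -1.0), 0.18),
}

def _cosine(a, b):
    """Cosine similarity between two vectors; 0.0 if either is zero-length."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_posture(first_data, second_data):
    """Return the stored posture pattern that best matches the sensed data."""
    best, best_score = None, -math.inf
    for name, (accel_ref, ratio_ref) in POSTURE_PATTERNS.items():
        # Higher cosine similarity and smaller ratio mismatch -> better match.
        score = _cosine(first_data, accel_ref) - abs(second_data - ratio_ref)
        if score > best_score:
            best, best_score = name, score
    return best
```

The confirmed posture would then be combined with the third (biometric) data to confirm the sleeping state, as the claims describe.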
  • the at least one operation may include receiving, from the wearable device 202 through at least one communication module 306 of the electronic device 101, first data obtained through the motion sensor 408 of the wearable device 202.
  • the at least one operation may include acquiring second data related to the movement of the user of the wearable device 202 through at least one sensor (176; 308; 309) of the electronic device 101.
  • the at least one operation may include an operation of checking the user's movement and/or posture based on the first data and the second data.
  • the at least one operation may include receiving, from the wearable device 202 through the at least one communication module 306, third data obtained through the biometric sensor 405 of the wearable device 202.
  • the at least one operation may include checking the user's sleeping state based on the third data.
  • the at least one operation may include displaying a screen through the display module (160; 304) of the electronic device 101 using rendering data generated using the first data, the second data, and/or the third data.
  • At least part of the rendering data may be generated in the electronic device 101.
  • At least part of the rendering data may be generated in the server 203.
  • the at least one operation may include transmitting fourth data about the movement and/or the posture of the user to the server 203 through the at least one communication module 306.
  • the at least one operation may include transmitting information about the sleep state to the server 203 through the at least one communication module 306.
  • the at least one operation may include receiving the at least part of the rendering data from the server 203 through the at least one communication module 306.
  • the at least one sensor may include a LiDAR sensor (309).
  • the at least one operation may include matching the first data and the second data with a posture pattern stored in the memory 130 (307) of the electronic device 101.
  • the at least one operation may include an operation of confirming the posture of the user based on the result of the matching.
  • the at least one operation may include confirming the sleeping state of the user based on the confirmed posture and the third data.
  • the at least one operation may include activating the at least one sensor (176; 308; 309) upon confirming a designated event.
  • the at least one operation may include acquiring the second data through the activated at least one sensor (176; 308; 309).
  • the designated event may be confirming the start of sleep of the user of the wearable device 202, based on a signal received from the wearable device 202 through the at least one communication module 306.
  • the designated event may be confirming a request for activation of the at least one sensor (176; 308; 309) of the electronic device 101, based on a signal received from the wearable device 202 through the at least one communication module 306.
  • the designated event may be confirmation that the electronic device 101 is mounted on a holder.
  • the designated event may be confirmation of the start of charging of the electronic device 101.
  • the at least one operation may include confirming a first time point at which the change in sleep is detected based on the third data.
  • the at least one operation may include transmitting information about the first time point as information about the sleep state to the server 203.
  • the at least one operation may include an operation of confirming, as the first time point, the time at which changes in the user's oxygen saturation and heart rate are detected, based on the third data.
  • the server 203 may be set to generate the rendering data using the fourth data from a second time point, which precedes the first time point by a first time interval, to a third time point, which follows the first time point by a second time interval.
  • the at least one operation may include transmitting an image of the surrounding environment obtained through the camera 180 (305) of the electronic device 101 to the server 203.
  • the server 203 may be set to generate the rendering data using the surrounding environment image and the fourth data.
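The time-window behavior stated above (the server generating rendering data from the fourth data between a second time point preceding the detected sleep change and a third time point following it) can be sketched as below. The time units and the sample layout are illustrative assumptions; the claims do not fix them.

```python
# Illustrative sketch of the rendering-data time window described above.
# first_time_point: the moment a change in sleep is detected (third data).
# The server then uses the fourth data (movement/posture) captured between
# (first - first_interval) and (first + second_interval). Units are assumed.

def rendering_window(first_time_point, first_interval, second_interval):
    """Return (second_time_point, third_time_point) bracketing the sleep change."""
    second_time_point = first_time_point - first_interval
    third_time_point = first_time_point + second_interval
    return second_time_point, third_time_point

def select_fourth_data(samples, first_time_point, first_interval, second_interval):
    """Keep only fourth-data samples whose timestamp lies inside the window."""
    start, end = rendering_window(first_time_point, first_interval, second_interval)
    return [s for s in samples if start <= s["t"] <= end]
```

For example, with a detection at t=100 and intervals of 30 and 60, only samples between t=70 and t=160 would feed the rendering.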
  • the wearable device 202 may include at least one sensor (405; 408; 409); a communication module 406; and a processor 403.
  • the processor 403 may be set to confirm the start of the user's sleep based on data acquired through the at least one sensor (405; 408; 409). Based on confirming the start of the sleep, the processor 403 may be set to transmit, to the electronic device 101 through the communication module 406, a signal requesting activation of at least one sensor (176; 308; 309) of the electronic device 101.
  • the processor 403 may be set to transmit, to the electronic device 101 through the communication module 406, first data acquired through a motion sensor 408 among the at least one sensor (405; 408; 409) of the wearable device 202, and second data acquired through a biometric sensor 405 among the at least one sensor (405; 408; 409).
  • the processor 403 may be set to change the mode of the motion sensor 408 and the biometric sensor 405 to a sleep mode based on confirming the start of the sleep.
  • the processor 403 may be set to acquire the first data through the motion sensor 408 that has been changed to the sleep mode.
  • the processor 403 may be set to acquire the second data through the biometric sensor 405 changed to the sleep mode.
  • the electronic device 101 may be set to confirm the first time point at which the change in sleep is detected, based on the first data, the second data, and data acquired through the activated at least one sensor (176; 308; 309) of the electronic device 101. The electronic device 101 may be set to transmit information about the first time point to the server 203.
  • the processor 403 may be set to transmit, to the electronic device 101 through the communication module 406, third data acquired through the temperature sensor 409 among the at least one sensor (405; 408; 409).
  • the wearable device 202 may include a microphone 402.
  • the processor 403 may be set to transmit the fourth data obtained through the microphone 402 to the electronic device 101 through the communication module 406.
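The wearable-side sequence in the claims above, i.e. confirm sleep onset, request that the electronic device activate its sensors, switch the wearable's own sensors to a sleep mode, then forward sensor data, can be sketched as a small controller. The `comm` object and the sensor interfaces are placeholders standing in for the communication module (406) and the sensors (405; 408); their APIs are assumptions, not the actual module interfaces.

```python
# Hedged sketch of the wearable-side flow described in the claims. All method
# names (send, set_mode, read) are invented for illustration.

class WearableController:
    def __init__(self, comm, motion_sensor, biometric_sensor):
        self.comm = comm                      # stands in for communication module 406
        self.motion_sensor = motion_sensor    # stands in for motion sensor 408
        self.biometric_sensor = biometric_sensor  # stands in for biometric sensor 405
        self.sleeping = False

    def on_sleep_start(self):
        # 1) Ask the paired electronic device to activate its sensors.
        self.comm.send({"type": "activate_sensors"})
        # 2) Change the local sensors to their sleep mode.
        self.motion_sensor.set_mode("sleep")
        self.biometric_sensor.set_mode("sleep")
        self.sleeping = True

    def forward_samples(self):
        # 3) Stream first data (motion) and second data (biometric) to the device.
        self.comm.send({"type": "first_data", "value": self.motion_sensor.read()})
        self.comm.send({"type": "second_data", "value": self.biometric_sensor.read()})
```

The activation request sent in step 1 corresponds to the designated event that the electronic device checks before activating its own sensors.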
  • the operating method of the wearable device 202 may include confirming the start of the user's sleep based on data acquired through at least one sensor (405; 408; 409) of the wearable device 202.
  • the method may include, based on confirming the start of the sleep, transmitting, to the electronic device 101 through the communication module 406 of the wearable device 202, a signal requesting activation of at least one sensor (176; 308; 309) of the electronic device 101.
  • the method may include transmitting, to the electronic device 101 through the communication module 406, first data acquired through a motion sensor 408 among the at least one sensor (405; 408; 409) of the wearable device 202, and second data acquired through a biometric sensor 405 among the at least one sensor (405; 408; 409).
  • the method may include changing the mode of the motion sensor 408 and the biometric sensor 405 to a sleep mode based on confirming the start of the sleep.
  • the method may include acquiring the first data through the motion sensor 408 that has been changed to the sleep mode.
  • the method may include acquiring the second data through the biometric sensor 405 changed to the sleep mode.
  • the electronic device 101 may be set to confirm the first time point at which the change in sleep is detected, based on the first data, the second data, and data acquired through the activated at least one sensor (176; 308; 309) of the electronic device 101. The electronic device 101 may be set to transmit information about the first time point to the server 203.
  • the method may include transmitting, to the electronic device 101 through the communication module 406, third data acquired through the temperature sensor 409 among the at least one sensor (405; 408; 409).
  • the method may include transmitting, to the electronic device 101 through the communication module 406, fourth data obtained through the microphone 402 of the wearable device 202.
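The "change the sensor mode to a sleep mode" operations above typically amount to reconfiguring sampling parameters for low-power operation. The sketch below is a minimal illustration under assumed values; the claims only state that the motion sensor (408) and biometric sensor (405) are changed to a sleep mode, and do not specify rates.

```python
# Minimal sketch of switching a sensor between a normal and a sleep mode.
# The sampling rates are invented for illustration.

class Sensor:
    RATES_HZ = {"normal": 50, "sleep": 5}  # assumed sampling rates

    def __init__(self):
        self.mode = "normal"
        self.sampling_hz = self.RATES_HZ["normal"]

    def set_mode(self, mode):
        """Change the sensor mode; a sleep mode lowers the sampling rate."""
        if mode not in self.RATES_HZ:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode
        self.sampling_hz = self.RATES_HZ[mode]
```

In this reading, data acquired "through the motion sensor changed to the sleep mode" is simply data sampled at the reduced rate.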
  • the at least one operation may include confirming the start of the user's sleep based on data acquired through at least one sensor (405; 408; 409) of the wearable device 202.
  • the at least one operation may include, based on confirming the start of the sleep, transmitting, to the electronic device 101 through the communication module 406 of the wearable device 202, a signal requesting activation of at least one sensor (176; 308; 309) of the electronic device 101.
  • the at least one operation may include transmitting, to the electronic device 101 through the communication module 406, first data acquired through a motion sensor 408 among the at least one sensor (405; 408; 409) of the wearable device 202, and second data acquired through a biometric sensor 405 among the at least one sensor (405; 408; 409).
  • the at least one operation may include changing the mode of the motion sensor 408 and the biometric sensor 405 to a sleep mode based on confirming the start of the sleep.
  • the at least one operation may include acquiring the first data through the motion sensor 408 that has been changed to the sleep mode.
  • the at least one operation may include acquiring the second data through the biometric sensor 405 that has been changed to the sleep mode.
  • the electronic device 101 may be set to confirm the first time point at which the change in sleep is detected, based on the first data, the second data, and data acquired through the activated at least one sensor (176; 308; 309) of the electronic device 101. The electronic device 101 may be set to transmit information about the first time point to the server 203.
  • the at least one operation may include transmitting, to the electronic device 101 through the communication module 406, third data acquired through the temperature sensor 409 among the at least one sensor (405; 408; 409).
  • the at least one operation may include transmitting, to the electronic device 101 through the communication module 406, fourth data acquired through the microphone 402 of the wearable device 202.
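On the electronic-device side, the claims enumerate several designated events that trigger sensor activation: a sleep-start signal from the wearable device, an explicit activation request, mounting on a holder, and the start of charging. That dispatch can be sketched as below; the event names and the `activate()` interface are assumptions made for the sketch.

```python
# Sketch of the designated-event handling described in the claims: confirming
# any listed event activates the electronic device's sensors (e.g., the LiDAR
# sensor 309). Event names are illustrative, not from the patent.

DESIGNATED_EVENTS = {
    "wearable_sleep_start",       # sleep-start signal from the wearable device
    "sensor_activation_request",  # explicit activation request from the wearable
    "mounted_on_holder",          # device placed on a holder
    "charging_started",           # device starts charging
}

def handle_event(event, sensors):
    """Activate all sensors when a designated event is confirmed; else do nothing."""
    if event not in DESIGNATED_EVENTS:
        return False
    for sensor in sensors:
        sensor.activate()
    return True
```

After activation, the device would begin acquiring the second data through the newly activated sensors, as the claims state.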
  • the electronic device 101 may provide intuitive data to the user by observing changes in sleeping posture when the user temporarily wakes up or enters deep sleep or REM sleep.
  • Electronic devices may be of various types. Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances. Electronic devices according to embodiments of this document are not limited to the above-described devices.
  • Terms such as 'first' and 'second' may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • When one (e.g., first) component is referred to as "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" used in the embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • A module may be an integrally formed part, or a minimum unit or portion thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Embodiments of this document may be implemented as software (e.g., a program) including one or more instructions stored in a storage medium (e.g., internal memory or external memory) that can be read by a machine (e.g., the wireless power transmission device 100).
  • A processor (e.g., the processor 201) of a device (e.g., the wireless power transmission device 100) may call at least one instruction among the one or more instructions stored in the storage medium and execute it. This allows the device to be operated to perform at least one function according to the called at least one instruction.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • 'Non-transitory' only means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); this term does not distinguish between cases where data is semi-permanently stored in the storage medium and cases where it is temporarily stored.
  • the method according to the embodiments disclosed in this document may be provided and included in a computer program product.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (e.g., module or program) of the above-described components may include a single or multiple entities, and some of the multiple entities may be separately arranged in other components.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • Multiple components (e.g., modules or programs) may be integrated into a single component.
  • The integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as they were performed by the corresponding component prior to the integration.
  • Operations performed by a module, program, or other component may be executed sequentially, in parallel, iteratively, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Physiology (AREA)
  • Cardiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an electronic device that may comprise: at least one communication module; at least one sensor; a display module; and a processor. The processor may be configured to receive, from a wearable device, first data obtained through a motion sensor of the wearable device. The processor may be configured to obtain second data related to the movement of a user of the wearable device through the at least one sensor. The processor may be configured to identify the movement and/or posture of the user. The processor may be configured to receive, from the wearable device, third data obtained through a biometric sensor of the wearable device. The processor may be configured to identify the sleep state of the user. The processor may be configured to display a screen using rendering data generated using the first data, the second data, and/or the third data.
PCT/KR2023/012668 2022-10-05 2023-08-25 Electronic device and operating method thereof WO2024075982A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2022-0127408 2022-10-05
KR20220127408 2022-10-05
KR10-2022-0144336 2022-11-02
KR1020220144336A KR20240047876A (ko) 2022-10-05 2022-11-02 Electronic device and operating method thereof

Publications (1)

Publication Number Publication Date
WO2024075982A1 true WO2024075982A1 (fr) 2024-04-11

Family

ID=90608657

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/012668 WO2024075982A1 (fr) 2022-10-05 2023-08-25 Dispositif électronique et son procédé de fonctionnement

Country Status (1)

Country Link
WO (1) WO2024075982A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050085738A1 (en) * 2003-09-18 2005-04-21 Stahmann Jeffrey E. Sleep logbook
KR20140003867A * 2012-06-29 2014-01-10 Korea Electronics Technology Institute Sleep monitoring system and method for monitoring sleep apnea and sleep stages
KR20170129689A * 2015-01-06 2017-11-27 David Burton Mobile wearable monitoring systems
US20190175858A1 (en) * 2017-12-10 2019-06-13 SomnaCardia Inc. Devices and methods for non-invasive cardio-adaptive positive pressure ventilation therapy
WO2021048820A1 (fr) * 2019-09-13 2021-03-18 Resmed Sensor Technologies Limited Systèmes et procédés de soin continu



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23875052

Country of ref document: EP

Kind code of ref document: A1