WO2024096419A1 - Method for providing a graphical user interface representing information about or an evaluation of a user's sleep - Google Patents

Method for providing a graphical user interface representing information about or an evaluation of a user's sleep

Info

Publication number
WO2024096419A1
Authority
WO
WIPO (PCT)
Prior art keywords
sleep
information
user
user interface
time
Prior art date
Application number
PCT/KR2023/016512
Other languages
English (en)
Korean (ko)
Inventor
이동헌
홍준기
박혜아
김형국
강소라
배재현
김종목
김대우
김성연
조욱남
최지혜
김승훈
Original Assignee
주식회사 에이슬립
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 에이슬립 filed Critical 주식회사 에이슬립
Priority to KR1020237044063A priority Critical patent/KR20240065214A/ko
Priority to KR1020237041712A priority patent/KR20240065211A/ko
Priority to KR1020237044152A priority patent/KR20240065215A/ko
Publication of WO2024096419A1 publication Critical patent/WO2024096419A1/fr

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/4806: Sleep evaluation
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/742: Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B 5/743: Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61M: DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 21/02: Devices for inducing sleep or relaxation, e.g. by direct nerve stimulation, hypnosis, analgesia
    • G: PHYSICS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/60: ICT specially adapted for patient-specific data, e.g. for electronic patient records
    • G16H 20/30: ICT for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 20/70: ICT for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • G16H 50/20: ICT for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present invention relates to a method of providing a graphical user interface that displays information about, or an evaluation of, a user's sleep.
  • the number of patients with sleep disorders in Korea increased by about 8% on average per year from 2014 to 2018, and the number of patients treated for sleep disorders in Korea in 2018 reached approximately 570,000.
  • Republic of Korea Patent Publication No. 10-2023-0012133 discloses an electronic device that provides a user interface according to sleep state and a method of operating the same. It is disclosed that an abnormal REM sleep event can be detected and a user interface based on specified action information can be provided in response.
  • Because the conventional technology detects abnormal REM sleep events by examining all sleep stages, there is a risk that the accuracy of sleep stage measurement may be lowered, and because its basic unit of measurement is minutes, sleep-related events may not be detected in real time. There is also the limitation that general consumers who are not sleep experts cannot intuitively evaluate the quality of their own sleep.
  • Figures 31A to 31E are diagrams showing hypnogram graphs of sleep stage information expressed in a conventional sleep measurement interface.
  • a method of providing a graphical user interface indicating the date and/or time point at which information about the user's sleep was acquired is disclosed.
  • a method of providing a graphical user interface according to an embodiment of the present invention may include a sleep information acquisition step of acquiring the user's sleep information from one or more sleep information sensor devices, where the user's sleep information includes the user's sleep sound information.
  • the method may include a sleep state information acquisition step of acquiring the user's sleep state information based on the acquired sleep information, where the sleep state information includes at least one of sleep onset time information, sleep time information, or wake-up time information.
  • the method may include an acquisition time display information generation step of generating information indicating the time point at which the sleep information was acquired, based on at least one of the sleep onset time information, sleep time information, or wake-up time information included in the acquired sleep state information.
  • a method of providing a graphical user interface according to an embodiment of the present invention may include displaying a graphical user interface including the information indicating the acquisition time point.
  • the step of generating acquisition time display information may be characterized by generating information indicating the time point at which the sleep information was acquired based on wake-up time information included in sleep state information.
  • the acquisition time display information generation step may generate the information indicating the acquisition time of the sleep state information displayed on the graphical user interface as a shape that marks the date of the wake-up included in the wake-up time information, and the shape may be any one of a point, a polygon, a circle, an oval, a fan shape, or a combination of straight lines and curves.
  • in the acquisition time display information generation step, when N wake-up times (where N is zero or a natural number) fall on the same date, the information displaying that date on the graphical user interface may be generated to include N shapes.
  • the acquisition time display information generation step may generate the information indicating the time point at which the sleep information was acquired based on the sleep onset time information included in the sleep state information.
  • in this case, the information indicating the acquisition time of the sleep state information displayed on the graphical user interface is generated as a shape that marks the date of the sleep onset time included in the sleep onset time information, and the shape may be any one of a point, a polygon, a circle, an oval, a fan shape, or a combination of straight lines and curves.
  • when N sleep onset times (where N is zero or a natural number) fall on the same date, the information displaying that date on the graphical user interface may be generated to include N shapes.
  • the acquisition time display information generation step may generate the information indicating the time point at which the sleep information was acquired based on both the sleep onset time information and the wake-up time information included in the sleep state information.
  • in this case, if the date of the sleep onset time included in the sleep onset time information differs from the date of the wake-up time included in the wake-up time information, the information indicating the acquisition time is generated as a continuous shape running from the date of the sleep onset to the date of the wake-up, and the shape may be any one of a point, a polygon, a circle, an oval, a fan shape, or a combination of straight lines and curves.
  • if the date of the sleep onset time and the date of the wake-up time are the same, the information indicating the displayed date is generated as a continuous shape connecting two different points, and the shape may be any one of a point, a polygon, a circle, an oval, a fan shape, or a combination of straight lines and curves.
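The date-marking rules above can be reduced to a small helper. This is an illustrative sketch only: the descriptor format, the "capsule"/"point" vocabulary, and the function names are invented here, not taken from the patent.

```python
from datetime import date

def acquisition_mark(sleep_onset: date, wake_up: date) -> dict:
    """Shape descriptor for one sleep record on a calendar-style GUI."""
    if sleep_onset == wake_up:
        # same date: a continuous shape connecting two points within one date cell
        return {"shape": "capsule", "dates": (sleep_onset, sleep_onset)}
    # different dates: a continuous shape from the onset date to the wake-up date
    return {"shape": "capsule", "dates": (sleep_onset, wake_up)}

def marks_for_date(n_wake_ups: int) -> list:
    """N wake-ups on one date produce N shapes (N may be zero)."""
    return [{"shape": "point"}] * n_wake_ups

print(acquisition_mark(date(2024, 1, 15), date(2024, 1, 16)))
print(marks_for_date(2))
```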
  • the acquisition time display information generation step includes generating information indicating the time point at which the sleep information was acquired based on the wake-up time information included in the sleep state information, where the information indicating the acquisition time point includes time zone information generated to display the time zone to which the wake-up time belongs, based on a preset time zone definition.
  • according to an embodiment of the present invention, the preset time zone definition used in the acquisition time display information generation step to determine the time zone to which the wake-up time belongs may be: from 12:00 a.m. (midnight) to before 5:00 a.m. is the dawn time zone; from 5:00 a.m. to before 9:00 a.m. is the morning time zone; from 9:00 a.m. to before 5:00 p.m. is the daytime time zone; from 5:00 p.m. to before 9:00 p.m. is the evening time zone; and from 9:00 p.m. to before 12:00 a.m. (midnight) is the night time zone.
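A minimal sketch of this time-zone lookup, assuming each range is closed at its start and open at its end (the function name is invented for illustration):

```python
from datetime import datetime

def classify_wake_time_zone(wake_time: datetime) -> str:
    """Map a wake-up time to the preset time-zone keyword described above.

    Boundaries follow the definition in the text: [00:00, 05:00) dawn,
    [05:00, 09:00) morning, [09:00, 17:00) day, [17:00, 21:00) evening,
    [21:00, 24:00) night.
    """
    h = wake_time.hour
    if h < 5:
        return "dawn"
    if h < 9:
        return "morning"
    if h < 17:
        return "day"
    if h < 21:
        return "evening"
    return "night"

# Example: a 7:30 a.m. wake-up falls in the "morning" time zone.
print(classify_wake_time_zone(datetime(2024, 1, 15, 7, 30)))  # -> "morning"
```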
  • the acquisition time display information generation step may generate the information indicating the acquisition time of the sleep state information displayed on the graphical user interface as a shape that displays the date of the wake-up time included in the wake-up time information together with the time zone information, and the shape may be any one of a point, a polygon, a circle, an oval, a fan shape, or a combination of straight lines and curves.
  • the shape created according to an embodiment of the present invention may have at least one of its type or color displayed differently depending on the time zone information included in the wake-up time information.
  • the graphical user interface further includes text displaying the time zone information; the text includes the keyword "dawn" if the time zone to which the wake-up time belongs is the dawn time zone, the keyword "morning" for the morning time zone, the keyword "day" for the daytime time zone, the keyword "evening" for the evening time zone, and the keyword "night" for the night time zone.
  • the sleep state information acquisition step may include converting the sleep sound information included in the acquired sleep information into information including changes in the frequency components of the sleep sound information along the time axis, and performing analysis on the converted information.
  • the converted information may be a spectrogram.
  • the sleep state information acquisition step may acquire the sleep state information based on dividing the spectrogram into 30-second epochs.
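A minimal preprocessing sketch of the conversion described above. The sample rate, FFT window, and use of scipy are assumptions for illustration; the text only specifies a time-frequency representation split into 30-second epochs.

```python
import numpy as np
from scipy import signal

def sound_to_epoch_spectrograms(audio: np.ndarray, sample_rate: int = 16000,
                                epoch_seconds: int = 30):
    """Convert sleep sound into per-epoch spectrograms.

    Returns a list of 2-D arrays, one spectrogram (frequency x time) per
    30-second epoch, capturing the 'changes in frequency components along
    the time axis' described above.
    """
    samples_per_epoch = sample_rate * epoch_seconds
    n_epochs = len(audio) // samples_per_epoch
    epochs = []
    for i in range(n_epochs):
        chunk = audio[i * samples_per_epoch:(i + 1) * samples_per_epoch]
        _, _, sxx = signal.spectrogram(chunk, fs=sample_rate, nperseg=1024)
        epochs.append(sxx)
    return epochs

# Example with 2 minutes of synthetic audio -> four 30-second epochs.
audio = np.random.randn(16000 * 120).astype(np.float32)
print(len(sound_to_epoch_spectrograms(audio)))  # -> 4
```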
  • the sleep information receiving method according to an embodiment of the present invention includes: a sleep information receiving step of receiving the user's sleep information, including the user's sleep sound information, from one or more sleep information sensor devices; a sleep state information acquisition step of acquiring the user's sleep state information based on the received sleep information; a step of generating one or more graphical user interfaces for receiving desired wake-up time information; a step of receiving the desired wake-up time information through the one or more graphical user interfaces; an alarm time information generation step of generating alarm time information based on the acquired sleep state information and the received desired wake-up time information; a graphical user interface generation step of generating one or more graphical user interfaces for performing an alarm function based on the generated alarm time information; and a display step of displaying the generated graphical user interface through a display device.
  • if the sleep state information acquired in the sleep state information acquisition step indicates that the user is in REM sleep within a time range including the desired wake-up time, the alarm time information generation step generates the alarm time information based on the sleep state information at a time point after the REM sleep appears; a method of providing a graphical user interface representing information about the user's sleep is thereby provided.
  • the graphical user interface generation step includes, when a trigger area for activating the notification function is activated, generating a graphical user interface for starting sleep measurement together with activating the alarm function, and, when the trigger area is not activated, generating a graphical user interface for starting sleep measurement together with deactivating the alarm function.
  • the graphical user interface generation step includes, when the trigger area for activating the notification function is not activated, generating a graphical user interface that provides information corresponding to sleep with an undetermined wake-up time.
  • the graphical user interface generation step includes generating a graphical user interface including a slide area for selecting the desired wake-up time information through a slide method, and the desired wake-up time information is received through the slide area.
  • in the graphical user interface including the slide area, sliding the slide area in a first direction delays the desired wake-up time, and sliding it in a second direction advances the desired wake-up time; a method of providing a graphical user interface representing information about the user's sleep including this behavior is provided.
  • the graphical user interface generation step includes generating a graphical user interface that represents a range of the scheduled wake-up time based on the received desired wake-up time information, where the range of the scheduled wake-up time spans from a predetermined time before the desired wake-up time to the desired wake-up time.
  • upon receiving the desired wake-up time information through the slide area, the graphical user interface generation step may generate a graphical user interface indicating an estimated sleep time based on predicted wake-up time information inferred from the received sleep state information of the user, a predicted sleep time, or a combination of predicted sleep time and sleep efficiency.
  • upon receiving the desired wake-up time information through the slide area, the graphical user interface generation step may generate a graphical user interface indicating an expected sleep time based on the difference between the received desired wake-up time and the current time.
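Both slide behaviors and the expected-sleep-time computation above amount to simple time arithmetic. A sketch, with the slide step size and direction names assumed for illustration:

```python
from datetime import datetime, timedelta

SLIDE_STEP = timedelta(minutes=5)  # assumed granularity per slide step

def slide_wake_time(desired: datetime, direction: str) -> datetime:
    """Delay the desired wake-up time when slid one way, advance it the other."""
    return desired + SLIDE_STEP if direction == "first" else desired - SLIDE_STEP

def expected_sleep_time(desired: datetime, now: datetime) -> timedelta:
    """Expected sleep time as the difference between desired wake-up and now."""
    return desired - now

now = datetime(2024, 1, 15, 23, 0)
wake = datetime(2024, 1, 16, 7, 0)
print(expected_sleep_time(wake, now))          # -> 8:00:00
print(slide_wake_time(wake, "first").time())   # -> 07:05:00
```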
  • the graphical user interface generation step includes, when the alarm function is activated and the start of sleep measurement is input through the graphical user interface for starting sleep measurement, generating a graphical user interface including a screen indicating that sleep measurement is in progress.
  • the step of generating the graphical user interface including the screen indicating that sleep measurement is in progress may further include generating a graphical user interface indicating that the alarm function is activated and a graphical user interface representing the range of the scheduled wake-up time.
  • the step of generating the graphical user interface including the screen indicating that sleep measurement is in progress may further include generating a graphical user interface through which the sleep measurement can be ended.
  • when the trigger area for activating the notification function is not activated and the start of sleep measurement is input through the graphical user interface for starting sleep measurement, the graphical user interface generation step generates a graphical user interface including a screen indicating that sleep measurement is in progress.
  • in this case, generating the graphical user interface including the screen indicating that sleep is being measured includes generating a graphical user interface indicating that the trigger area for activating the alarm function is not activated.
  • the method may further include generating a graphical user interface representing the range of the scheduled wake-up time, generating a graphical user interface representing the waveform of the user's sleep sound information, and generating a graphical user interface through which the sleep measurement can be ended.
  • the graphical user interface generation step includes generating a graphical user interface that displays a pop-up window so that the user can check the expected sleep time when the expected sleep time exceeds a predetermined time.
  • when the expected sleep time is less than a predetermined time, the graphical user interface generation step includes generating a graphical user interface informing the user that sleep measurement results can be confirmed after sleeping for the predetermined time or longer.
  • the time range including the user's desired wake-up time spans from a predetermined time before the desired wake-up time to the desired wake-up time; if the sleep state information within that time range does not indicate that the user is in REM sleep, the alarm time information is generated at the user's desired wake-up time, providing a method of providing an alarm based on the user's sleep state information.
  • if the sleep state information within that time range indicates that the user is in REM sleep, the alarm time information is generated at the time point indicating that the user is in REM sleep.
  • for the alarm based on the user's sleep state information, the sleep information receiving step further includes a sleep information conversion step of converting the user's sleep sound information in the time domain into information in the frequency domain.
  • the sleep information receiving step further includes a sleep information inference step of inferring information about sleep by using the user's sleep sound information as an input to a sleep information inference deep learning model.
  • a method of providing alarm time information based on a user's sleep state information includes: a sleep state information acquisition step of acquiring the user's sleep state information; a desired wake-up time information receiving step of receiving the user's desired wake-up time information; and an alarm time information generation step of generating alarm time information based on the acquired sleep state information and the received desired wake-up time information. If the sleep state information indicates that the user is in REM sleep during the time from a predetermined time before the desired wake-up time to the desired wake-up time, the alarm time information is set to the time point at which the user is in REM sleep.
  • alternatively, if the sleep state information indicates that the user is in REM sleep during that time range, a time point a predetermined time after the REM sleep time point may be set as the alarm time information.
  • alternatively, the alarm time information may be set to a time point indicating that the user's sleep stage is other than REM sleep.
  • after the sleep state information indicates that the user is in REM sleep during the time from a predetermined time before the desired wake-up time to the desired wake-up time, the alarm time information may be generated at a time point when the reliability of the sleep state information is lowered.
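A schematic sketch of the alarm-timing rules above. The 30-second epoch matches the text; the stage labels and the "predetermined time" offset are assumptions for illustration:

```python
from datetime import datetime, timedelta

EPOCH = timedelta(seconds=30)

def alarm_time(stages, window_start: datetime, desired_wake: datetime,
               rem_offset: timedelta = timedelta(minutes=5)) -> datetime:
    """Pick the alarm time within [window_start, desired_wake].

    stages: per-epoch sleep stages ("wake", "light", "deep", "rem") covering
    the window. If REM appears in the window, the alarm fires a predetermined
    time after the first REM epoch (capped at the desired wake-up time);
    otherwise it fires at the desired wake-up time.
    """
    t = window_start
    for stage in stages:
        if stage == "rem":
            return min(t + rem_offset, desired_wake)
        t += EPOCH
    return desired_wake

start = datetime(2024, 1, 16, 6, 30)
wake = datetime(2024, 1, 16, 7, 0)
stages = ["light"] * 20 + ["rem"] * 40  # REM begins 10 minutes into the window
print(alarm_time(stages, start, wake))  # -> 06:45 (REM onset + 5 min)
```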
  • a method of providing a graphical user interface that displays information about a user's sleep is disclosed.
  • a method of providing a graphical user interface may include a sleep information acquisition step of acquiring the user's sleep information from one or more sleep information sensor devices, where the user's sleep information includes the user's sleep sound information.
  • a method of providing a graphical user interface may include a sleep state information acquisition step of acquiring the user's sleep state information in real time based on the acquired sleep information, where the sleep state information includes a plurality of sleep stage information.
  • alternatively, the method may include a sleep state information acquisition step of acquiring the user's sleep state information based on the acquired sleep information, where the sleep state information includes a plurality of sleep stage information.
  • a method of providing a graphical user interface may include a sleep state information graph generation step of generating a graph representing the user's sleep state information over time based on the acquired sleep state information.
  • a method of providing a graphical user interface according to an embodiment of the present invention may include displaying a graphical user interface including a generated graph.
  • a method of providing a graphical user interface includes a sleep state information graph generation step of generating a graph representing the user's sleep state information over time based on the acquired sleep state information, where the graph includes a plurality of rectangles.
  • the method may include generating a sleep state information graph including a plurality of rectangles, each corresponding to one of a plurality of sleep stages.
  • the graphical user interface provided according to an embodiment of the present invention includes a plurality of areas, each allocated to one of a plurality of sleep stages, and each of the rectangles included in the sleep state information graph is displayed only in the area allocated to its corresponding sleep stage.
  • a method of providing a graphical user interface includes generating a graph representing the user's sleep state information over time based on the acquired sleep state information, where the graph expresses the sleep state information discretely.
  • the method may include generating a sleep state information graph including a plurality of shapes that each correspond to one of a plurality of sleep stages and are separated from each other.
  • the graphical user interface provided according to an embodiment of the present invention includes a plurality of areas, each allocated to one of a plurality of sleep stages; each area has an assigned height, the areas are separated by an assigned pitch, and each of the shapes included in the sleep state information graph is discrete based on that pitch.
  • At least one of the boundaries between a plurality of areas included in the graphical user interface provided according to an embodiment of the present invention may be expressed as a distinguishable line.
  • a method of providing a graphical user interface includes generating a sleep state information graph including a plurality of shapes, each corresponding to one of a plurality of sleep stages, where at least one of the shapes is displayed separated from the remaining shapes.
  • the shapes displayed separately from the rest may correspond to the waking stage.
  • the plurality of shapes, each corresponding to a sleep stage, may be expressed as shapes of the same form, specifically as rectangles.
  • the graphical user interface includes a plurality of areas, each allocated to one of the sleep stages, and each shape is displayed only in the area allocated to its corresponding sleep stage; when a shape corresponding to a sleep stage is displayed separately from the remaining shapes, that shape is not displayed in the areas allocated to the other sleep stages.
  • the user's sleep state information can be acquired in 30-second increments.
  • a graph representing sleep stage information in units of 30 seconds included in the obtained sleep state information may be generated.
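A rendering sketch of the discrete, rectangle-based sleep-stage graph described above: each stage gets its own horizontal band (area), and contiguous 30-second epochs of the same stage become one rectangle drawn only inside that band. The band heights, pitch, and stage set are assumptions for illustration:

```python
import matplotlib.pyplot as plt

STAGE_Y = {"wake": 3, "rem": 2, "light": 1, "deep": 0}  # one band per stage
BAND_HEIGHT, PITCH = 0.6, 1.0  # assigned height and spacing (pitch)
EPOCH_SEC = 30

def plot_discrete_hypnogram(stages):
    fig, ax = plt.subplots(figsize=(8, 2.5))
    start, prev = 0, stages[0]
    for i, stage in enumerate(stages[1:] + [None], start=1):
        if stage != prev:  # close a run of identical epochs as one rectangle
            y = STAGE_Y[prev] * PITCH
            ax.broken_barh([(start * EPOCH_SEC, (i - start) * EPOCH_SEC)],
                           (y, BAND_HEIGHT))
            start, prev = i, stage
    ax.set_yticks([v * PITCH + BAND_HEIGHT / 2 for v in STAGE_Y.values()])
    ax.set_yticklabels(list(STAGE_Y))
    ax.set_xlabel("time (s)")
    return fig

fig = plot_discrete_hypnogram(["wake"] * 4 + ["light"] * 20 + ["deep"] * 30 +
                              ["light"] * 10 + ["rem"] * 16)
fig.savefig("hypnogram.png")
```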
  • according to an embodiment of the present invention, the acoustic information in the time domain included in the acquired sleep information is converted into information in the frequency domain, and the user's sleep state information is acquired based on the converted frequency-domain information.
  • information on the frequency domain may be a spectrogram or a Mel spectrogram to which a Mel scale is applied.
  • the sleep state information may be obtained based on dividing the spectrogram into epochs of 30 seconds.
  • each graph generated in the sleep state information graph generation step according to an embodiment of the present invention may be expressed in different colors assigned to correspond to each of the plurality of sleep stage information included in the user's sleep state information.
  • the sleep state information graph may be generated such that the user's sleep onset time information or wake-up time information included in the sleep state information acquired in the sleep state information acquisition step is displayed together in the graph.
  • the graph generated in the sleep state information graph generation step may express the sleep onset time information or wake-up time information displayed together in the graph numerically, connected to the sleep state information graph with a line and displayed alongside it.
  • the present invention provides a method for starting sleep measurement through a user device, including a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at a time when the trigger is sensed based on the sensed sleep measurement start trigger.
  • according to an embodiment, the sleep measurement start trigger is at least one of the user's desired sleep time and a sleep mode set on the user device, where the sleep mode refers to a mode in which the output of information indicating the occurrence of events related to applications running on the user device is limited; a method for starting sleep measurement through a user device is provided, characterized in that sensing is performed after the set desired sleep time or during the period set for the sleep mode.
  • according to an embodiment, the sleep measurement start trigger is motion or movement applied to the user device; when the user device senses the motion in the sensing step, the starting step starts sleep measurement at the time the motion is sensed.
  • according to an embodiment, the sleep measurement start trigger is a voice input to the user device indicating the user's intention to sleep; when the user device senses this voice in the sensing step, the starting step starts sleep measurement at the time the voice is sensed.
  • according to the present invention, when the user device senses the start of charging within a predetermined time range in the sensing step, the recognition step recognizes the charging-start sensing time as the sensing time of the sleep measurement start trigger, providing a method for starting sleep measurement through the user device.
  • the present invention provides a method for initiating sleep measurement through a user device, wherein the charging initiated by the user device is wired charging.
  • the present invention provides a method for initiating sleep measurement through a user device, wherein the charging initiated by the user device is wireless charging.
  • according to the present invention, when the user device senses in the sensing step that the lock mode of the display screen is released after a predetermined time, sleep measurement starts at the time the lock mode release is sensed.
  • according to the present invention, when the display unit of the user device senses a swipe input in the sensing step, the recognition step recognizes the time the swipe is sensed as the sensing time of the sleep measurement start trigger, providing a method for starting sleep measurement using the user device.
  • the present invention provides a method for initiating sleep measurement through a user device, wherein the swipe input is input with the user's finger.
  • the present invention provides a method for initiating sleep measurement through a user device, wherein the swipe input is input with the user's palm.
  • the present invention provides a method for starting sleep measurement through a user device in which, when two or more of the above trigger sensing operations are sensed together, the recognition step recognizes that time as the sensing time of the sleep measurement start trigger.
  • the present invention provides a method for starting sleep measurement through a user device in which, when another device connected to the user device through a network senses sleep measurement start trigger information and transmits it, the method includes a receiving step of receiving the transmitted sleep measurement start trigger information, and a starting step of starting sleep measurement at the time the trigger information was sensed, based on the received information.
  • according to an embodiment, the sleep measurement start trigger is at least one of the user's desired sleep time and a sleep mode set on at least one of the user device and the other device, where the sleep mode refers to a mode in which the output of information indicating the occurrence of events related to applications running on the user device or the other device is limited; a method for starting sleep measurement through a user device is provided, characterized in that sensing is performed after the set desired sleep time or during the period set for the sleep mode.
  • according to an embodiment, the sleep measurement start trigger is a physical contact between the other device and the user device, and the starting step starts sleep measurement at the time the physical contact is sensed.
  • according to an embodiment, the sleep measurement start trigger is a physical contact between the other device and the user's body, and the starting step starts sleep measurement at the time the physical contact is sensed.
  • according to an embodiment, the sleep measurement start trigger is a sleep measurement start trigger voice input by the user to the other device, and the starting step starts sleep measurement at the time the voice is sensed.
  • according to an embodiment, the sleep measurement start trigger is a signal that the other device has started charging after a predetermined time, and the starting step starts sleep measurement, through the user device, at the time the other device starts charging.
  • the present invention provides a method for starting sleep measurement through a user device, comprising: a sensing step of sensing information related to the user; a start decision information generation step of determining a time to start sleep measurement based on the sensed user-related information; and a starting step of starting sleep measurement at the determined start time.
  • according to the present invention, the start decision information generation step may determine the start time based on information indicating that the user has been using the user device for a predetermined period during a period in which the user device is set to a mode that restricts the output of information indicating the occurrence of events related to applications running on it.
  • the information that the user is using the user device includes the user's motion or movement detected by the user device, a signal that the display of the user device is turned on, or the user's voice input to the user device.
  • the present invention provides an electronic device that starts sleep measurement of a user, comprising: a sensing unit that senses a sleep measurement start trigger; a memory unit in which applications can be recorded; and a processor unit capable of executing the applications, wherein the processor unit starts sleep measurement of the electronic device at the time the trigger is sensed by the sensing unit.
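A toy sketch of the trigger-driven start: several trigger sources (charging start, motion, swipe, sleep-intent voice) funnel into one recognition step that records the sensing time and starts measurement. The trigger names and callback wiring are assumptions for illustration:

```python
from datetime import datetime
from typing import Callable

class SleepMeasurementStarter:
    TRIGGERS = {"charging_start", "motion", "swipe", "sleep_voice"}

    def __init__(self, on_start: Callable[[datetime, str], None]):
        self.on_start = on_start
        self.started = False

    def sense(self, trigger: str) -> None:
        """Recognition step: treat the sensing time as the measurement start time."""
        if trigger in self.TRIGGERS and not self.started:
            self.started = True
            self.on_start(datetime.now(), trigger)

starter = SleepMeasurementStarter(
    lambda t, trig: print(f"sleep measurement started at {t} via {trig}"))
starter.sense("charging_start")  # e.g. the phone is placed on its charger at night
```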
  • the sleep information receiving method according to an embodiment of the present invention includes: a sleep information receiving step of receiving the user's sleep information, including the user's sleep sound information, from one or more sleep information sensor devices; a sleep state information acquisition step of acquiring the user's sleep state information based on the received sleep information; a step of receiving average sleep data of others; a step of generating a graphical user interface including a graph comparing the received average sleep data of others with the acquired sleep state information of the user; and a step of displaying the graphical user interface.
  • a method of providing a graphical user interface that displays information about a user's sleep is provided.
  • the graph provided for comparing the received average sleep data of others with the acquired sleep state information of the user is characterized in that its length is proportional to the data value; a method of providing a graphical user interface representing information about the user's sleep is thereby provided.
  • the step of generating the graphical user interface including the comparison graph may express the evaluation obtained by comparing the received average sleep data of others with the acquired sleep state information of the user as an emoticon.
  • a method of providing a graphical user interface representing information about a user's sleep is provided, wherein the average sleep data of others is an average value calculated based on medically obtained information.
  • a method of providing a graphical user interface displaying information about the user's sleep is provided, wherein the average sleep data of others is an average value calculated based on medically recommended information.
  • a method of providing a graphical user interface representing information about a user's sleep, wherein the average sleep data of others is statistically obtained is provided.
  • a method of providing a graphical user interface displaying information about the user's sleep is provided, wherein the average sleep data of others is obtained by acquiring the sleep state information of others and analyzing it.
  • a method of providing a graphical user interface that displays information about the user's sleep is provided, wherein the average sleep data of others is classified based on the other person's age group.
  • a method of providing a graphical user interface displaying information about a user's sleep is provided, wherein the other person's average sleep data is classified based on the other person's gender.
  • a method of providing a graphical user interface that displays information about the user's sleep is provided, wherein the average sleep data of others is classified based on the other person's occupation.
  • a method of providing a graphical user interface displaying information about a user's sleep is provided, wherein the other person's average sleep data is classified based on the other person's daily activity record.
  • a method of providing a graphical user interface representing information about a user's sleep is provided, wherein the other person's average sleep data is classified based on the other person's financial data information.
  • a method of providing a graphical user interface displaying information about the user's sleep is provided, wherein the average sleep data of others is classified based on a combination of two or more of the others' age group, gender, occupation, daily activity record, and financial data.
  • the received average sleep data of others is the average number of awakenings during sleep, and the acquired sleep state information of the user is the number of awakenings during the user's sleep; a method of providing a graphical user interface displaying this information about the user's sleep is provided.
  • the received average sleep data of others is average light sleep proportion information, and the acquired sleep state information of the user is the user's light sleep proportion information; a method of providing a graphical user interface representing this information about the user's sleep is provided.
  • the received average sleep data of others is a combination of two or more of average sleep-onset latency information, average deep sleep proportion information, average REM sleep proportion information, average number of awakenings during sleep, and light sleep proportion information, and the acquired sleep state information of the user is a combination of two or more of the corresponding items for the user; a method of providing a graphical user interface representing information about the user's sleep is provided.
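A minimal sketch of the comparison graph described above, with paired bars whose lengths are proportional to the data values. The metric names and values are illustrative placeholders:

```python
import matplotlib.pyplot as plt
import numpy as np

metrics = ["deep sleep %", "REM sleep %", "awakenings", "light sleep %"]
user_values = [18.0, 22.0, 3.0, 55.0]
others_avg = [20.0, 20.0, 2.0, 52.0]

y = np.arange(len(metrics))
fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(y - 0.2, user_values, height=0.4, label="user")
ax.barh(y + 0.2, others_avg, height=0.4, label="others' average")
ax.set_yticks(y)
ax.set_yticklabels(metrics)
ax.set_xlabel("value (bar length proportional to data value)")
ax.legend()
fig.tight_layout()
fig.savefig("sleep_comparison.png")
```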
  • according to an embodiment of the present invention, a method of providing a graphical user interface representing an evaluation of a user's sleep using text includes: a sleep information acquisition step of acquiring sleep information from one or more sleep information sensor devices, where the sleep information includes the user's sleep sound information; and a sleep phrase generation step of generating a sleep phrase comprising at least two words indicating an evaluation of the user's sleep based on the acquired sleep information.
  • a method of generating one or more graphical user interfaces representing an evaluation of the user's sleep is thereby provided.
  • the sleep phrase expresses the user's evaluation of his or her sleep in a non-numerical manner.
  • the sleep information may further include sleep environment information and the user's life information.
  • the method may include a sleep log information storage step of storing sleep log information related to an account assigned to the user in a memory.
  • the sleep information acquisition step may further include a sleep information conversion step of converting the user's sleep sound information in the time domain into information in the frequency domain.
  • the sleep information acquisition step may further include a sleep information inference step of inferring information about sleep by using the sleep sound information as an input to a sleep information inference deep learning model.
  • the sleep phrase generation step includes a first sleep phrase generation step of generating text of high interest in the evaluation of the user's sleep, and a second sleep phrase generation step of generating text constituting the evaluation of the user's sleep.
  • the sleep phrase generation step may include a third sleep phrase generation step of generating advice text based on the evaluation of the user's sleep.
  • the sleep phrase generation step may further include a lookup table sleep phrase generation step of generating the sleep phrase based on a lookup table, thereby providing a method for generating one or more graphical user interfaces representing an evaluation of the user's sleep.
  • the lookup table sleep phrase generation step includes a lookup table user sleep characteristic classification step of classifying the user's sleep characteristics based on the sleep information, and a lookup table sleep phrase extraction step of extracting a sleep phrase from the lookup table corresponding to the user's sleep characteristics.
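A minimal sketch of the lookup-table path: classify the night into a sleep-characteristic key, then extract the matching first/second/third phrases. The keys, thresholds, and phrases are invented placeholders, not the patent's actual table:

```python
PHRASE_TABLE = {
    "short_sleep":   ("Short night", "You slept under 6 hours.",
                      "Try going to bed 30 minutes earlier."),
    "frequent_wake": ("Restless",    "You woke up several times.",
                      "Limit caffeine late in the day."),
    "good_sleep":    ("Well rested", "Your sleep looks solid.",
                      "Keep your current routine."),
}

def classify_sleep(total_hours: float, awakenings: int) -> str:
    """Lookup-table user sleep characteristic classification step."""
    if total_hours < 6:
        return "short_sleep"
    if awakenings >= 3:
        return "frequent_wake"
    return "good_sleep"

def extract_phrases(total_hours: float, awakenings: int):
    """Lookup-table sleep phrase extraction step: first/second/third phrases."""
    return PHRASE_TABLE[classify_sleep(total_hours, awakenings)]

headline, body, advice = extract_phrases(5.5, 1)
print(headline, "|", body, "|", advice)
```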
  • the sleep phrase generation step may further include a large-scale language model sleep phrase generation step of generating the sleep phrase based on a large-scale language model, thereby providing a method for generating one or more graphical user interfaces representing an evaluation of the user's sleep.
  • the large-scale language model sleep phrase generation step includes a large-scale language model user sleep characteristic classification step of classifying the user's sleep characteristics based on the sleep information, and a large-scale language model sleep phrase extraction step of extracting sleep phrases by using the user's sleep characteristics as an input to the large-scale language model.
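A hedged sketch of the large-scale language model path. `call_llm` is a placeholder for whatever model endpoint is used; the prompt format and characteristic fields are assumptions, not specified by the text:

```python
import json

def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in an actual language model client here")

def llm_sleep_phrases(sleep_characteristics: dict) -> str:
    """Use classified sleep characteristics as the LLM input to extract phrases."""
    prompt = (
        "Write a two-word headline, a one-sentence evaluation, and one piece "
        "of advice for this night of sleep, as plain text:\n"
        + json.dumps(sleep_characteristics)
    )
    return call_llm(prompt)

characteristics = {"total_hours": 5.5, "awakenings": 4, "rem_ratio": 0.18}
# llm_sleep_phrases(characteristics)  # returns model-generated sleep phrases
```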
  • the graphical user interface display step may include a user sleep graph generation step of generating a graph of the sleep stages within the user's sleep period based on the sleep information, thereby providing a method for generating one or more graphical user interfaces representing an evaluation of the user's sleep.
  • the third sleep phrase may be arranged to be spaced apart from the generated first sleep phrase and the generated second sleep phrase, and placed at the bottom of the generated user sleep graph.
  • according to an embodiment of the present invention, a non-transitory computer-readable storage medium is provided that stores one or more programs configured to be executed by one or more processors to generate one or more graphical user interfaces representing an evaluation of the user's sleep, wherein the one or more programs include instructions to perform a method according to an embodiment of the present invention.
  • an apparatus for generating one or more graphical user interfaces representing an evaluation of a user's sleep comprises: a display unit; one or more processors; and a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs comprising instructions to perform a method according to an embodiment of the present invention.
  • according to an embodiment of the present invention, a graphical user interface representing the evaluation of the user's sleep is provided using both a non-numerical method and a numerical method.
  • the method includes a sleep information acquisition step of acquiring sleep information from one or more sleep information sensor devices, where the sleep information includes the user's sleep sound information, and thereby provides a method of generating one or more graphical user interfaces representing an evaluation of the user's sleep.
  • the sleep information may further include sleep environment information and the user's life information.
  • a method of generating one or more graphical user interfaces representing an evaluation of a user's sleep including a sleep log information storage step of storing sleep log information related to an account assigned to the user in a memory.
  • the sleep information acquisition step may further include a sleep information conversion step of converting the user's sleep sound information in the time domain into information in the frequency domain.
  • the sleep information acquisition step may further include a sleep information inference step of inferring information about sleep by using the sleep sound information as an input to a sleep information inference deep learning model.
  • the method includes a first sleep phrase generation step of generating text of high interest in the evaluation of the user's sleep, and a second sleep phrase generation step of generating text constituting the evaluation of the user's sleep.
  • the sleep evaluation generation step may include a third sleep phrase generation step of generating advice text based on the evaluation of the user's sleep.
  • the sleep evaluation generation step may include a numerical sleep information generation step of providing the user with an evaluation of sleep in a numerical manner based on the evaluation of the user's sleep, thereby providing one or more graphical user interfaces representing an evaluation of the user's sleep.
  • the sleep evaluation generation step may further include a lookup table sleep evaluation generation step of generating the sleep evaluation based on a lookup table, thereby providing a method for generating one or more graphical user interfaces representing an evaluation of the user's sleep.
  • the lookup table sleep evaluation generation step includes a lookup table user sleep characteristic classification step of classifying the user's sleep characteristics based on the sleep information, and a lookup table sleep evaluation extraction step of extracting a sleep evaluation from the lookup table corresponding to the user's sleep characteristics.
  • the sleep evaluation generation step may further include a large-scale language model sleep evaluation generation step of generating the sleep evaluation based on a large-scale language model.
  • the large-scale language model sleep evaluation generation step includes a large-scale language model user sleep characteristic classification step of classifying the user's sleep characteristics based on the sleep information, and a large-scale language model sleep evaluation extraction step of extracting sleep phrases by using the user's sleep characteristics as an input to the large-scale language model.
  • the graphical user interface display step may include a user sleep graph generation step of generating a graph of the sleep stages within the user's sleep period based on the sleep information, thereby providing a method for generating one or more graphical user interfaces representing an evaluation of the user's sleep.
  • according to an embodiment of the present invention, a non-transitory computer-readable storage medium is provided that stores one or more programs configured to be executed by one or more processors to generate one or more graphical user interfaces representing an evaluation of the user's sleep, wherein the one or more programs include instructions to perform a method according to an embodiment of the present invention.
  • an apparatus for generating one or more graphical user interfaces representing an evaluation of a user's sleep comprises: a display unit; one or more processors; and a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs comprising instructions to perform a method according to an embodiment of the present invention.
  • a method for collecting feedback on a user's sleep and providing a graphical user interface includes: a sleep information acquisition step of acquiring sleep information from one or more sleep information sensor devices; a sleep state information acquisition step of acquiring the user's sleep state information based on the acquired sleep information, the user's sleep state information including at least one of the user's sleep stage information and the user's sleep event information; a sleep service providing step of providing a sleep service based on the acquired sleep state information or the acquired sleep information; and a step of displaying a user feedback graphical user interface to collect user feedback on the provided sleep service.
  • a method of collecting feedback on a user's sleep and a method of providing a graphical user interface can be provided.
  • the sleep information may include one or more of sleep environment information, sleep sound information, and sleep life information.
  • the sleep information acquisition step may include a preprocessing step of converting the sleep sound information into information including changes in frequency components along the time axis.
  • the sleep service providing step may provide a sleep content service based on the acquired sleep state information or the acquired sleep information.
  • the sleep service providing step may provide a sleep analysis information providing service based on the acquired sleep state information or the acquired sleep information.
  • the sleep service providing step may provide a sleep environment control service based on the acquired sleep state information or the acquired sleep information.
  • displaying the user feedback graphical user interface may include collecting a user's response to the provided sleep service through the displayed user feedback graphical user interface; and storing the collected user responses.
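  • As a minimal illustration of collecting and storing such a response, the following sketch appends each response to a local file; the JSON-lines storage format is an assumption made for the example.

```python
# Minimal sketch of storing a response entered through the user feedback
# graphical user interface. The storage format is assumed.

import json
import time

def store_feedback(service_id: str, response: str, path: str = "feedback.jsonl") -> None:
    """Append the user's response to the provided sleep service."""
    record = {"service": service_id, "response": response, "timestamp": time.time()}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

store_feedback("sleep_content_v1", "liked")  # e.g., a thumbs-up tap in the GUI
```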
  • a method for collecting feedback on a user's sleep and providing a graphical user interface includes: a sleep information acquisition step of acquiring sleep information from one or more sleep information sensor devices; a sleep state information acquisition step of acquiring the user's sleep state information based on the acquired sleep information, the user's sleep state information including at least one of the user's sleep stage and the user's sleep event; a sleep service providing step of providing a sleep service based on the acquired sleep state information or the acquired sleep information; and a step of detecting user motion feedback to collect user feedback on the provided sleep service.
  • the sleep information may include one or more of sleep environment information, sleep sound information, and sleep life information.
  • the sleep information acquisition step may include a preprocessing step of converting the sleep sound information into information including changes in frequency components along the time axis.
  • the sleep service providing step may provide a sleep content service based on the acquired sleep state information or the acquired sleep information.
  • the sleep service providing step may provide a sleep analysis information providing service based on the acquired sleep state information or the acquired sleep information.
  • the sleep service providing step may provide a sleep environment control service based on the acquired sleep state information or the acquired sleep information.
  • the step of detecting user motion feedback includes detecting a user's motion for the provided sleep service; and converting the detected user's motion into information about feedback.
  • the step of detecting user motion feedback may include one or more of voice motion feedback, physical motion feedback, and sleep environment control feedback.
  • a method for collecting feedback on a user's sleep and providing a graphical user interface includes: a sleep information acquisition step of acquiring sleep information from one or more sleep information sensor devices; a sleep state information acquisition step of acquiring the user's sleep state information based on the acquired sleep information, the user's sleep state information including at least one of the user's sleep stage and the user's sleep event; a sleep service providing step of providing a sleep service based on the acquired sleep state information or the acquired sleep information; and a step of detecting changes in the acquired sleep state information to collect user feedback on the provided sleep service.
  • the sleep information may include one or more of sleep environment information, sleep sound information, and sleep life information.
  • the sleep information acquisition step may include a preprocessing step of converting the sleep sound information into information including changes in frequency components along the time axis.
  • the sleep service providing step may provide a sleep content service based on the acquired sleep state information or the acquired sleep information.
  • the sleep service providing step may provide a sleep analysis information providing service based on the acquired sleep state information or the acquired sleep information.
  • the sleep service providing step may provide a sleep environment control service based on the acquired sleep state information or the acquired sleep information.
  • the method may further include obtaining sleep state information feedback regarding a sleep service based on the sensed change in sleep state information.
  • detecting a change in the acquired sleep state information includes obtaining corresponding information between the obtained sleep state information and the provided sleep service; and collecting changes in sleep state information as feedback based on the obtained corresponding information.
  • detecting a change in the acquired sleep state information includes determining whether the acquired change in sleep state information falls within a positive sleep reference range; and determining positive sleep feedback if it falls within the positive sleep reference range, and negative sleep feedback if it does not.
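  • A minimal sketch of this positive/negative determination follows; the choice of the deep sleep ratio as the monitored quantity and its reference range are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch of classifying a change in sleep state information as
# positive or negative feedback. The reference range is assumed.

POSITIVE_DEEP_SLEEP_RANGE = (0.10, 0.25)  # assumed acceptable deep sleep ratio

def sleep_state_feedback(ratio_before: float, ratio_after: float) -> str:
    """Compare the deep sleep ratio before and after a sleep service was
    provided, and classify the change as positive or negative feedback."""
    low, high = POSITIVE_DEEP_SLEEP_RANGE
    if low <= ratio_after <= high and ratio_after >= ratio_before:
        return "positive"
    return "negative"

print(sleep_state_feedback(ratio_before=0.08, ratio_after=0.14))  # -> positive
```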
  • the sleep state information may include information about apnea.
  • the sleep state information may include information about the sleep onset latency.
  • the sleep state information may include information about the REM sleep latency.
  • the sleep state information may include deep sleep ratio information.
  • the sleep state information may include information on the total amount of sleep.
  • a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors to collect feedback on the user's sleep and provide a graphical user interface may be provided, wherein the one or more programs include instructions for performing any one of the methods described above.
  • a device that collects feedback on the user's sleep and provides a graphical user interface may be provided, comprising: a display unit; one or more processors; and a memory storing one or more programs configured to be executed by the one or more processors, the one or more programs comprising instructions for performing any one of the methods described above.
  • according to the present invention, by generating and providing an evaluation of the user's sleep through a graphical user interface, it is possible to contribute to improving the quality of the user's sleep by providing the user with an evaluation of their sleep.
  • according to the present invention, by generating and providing information about the user's sleep through a graphical user interface, it is possible to contribute to improving the user's sleep quality by providing the user with information about their sleep. Additionally, it can be helpful in analyzing information about sleep by intuitively displaying when the information about the user's sleep was obtained.
  • the present invention has the effect of accurately capturing the timing of REM sleep and providing an alarm to the user. Additionally, it provides the expected wake-up time and estimated sleep time on one screen. Additionally, when the scheduled wake-up time is changed, the estimated sleep time is updated in real time. Additionally, minimal time information is shown on the screen after the alarm is set, to prevent users from feeling anxious about the wake-up time. Additionally, a hypnogram is displayed on the screen after the alarm is set, allowing users to know whether they woke up exactly at the time of REM sleep. Additionally, it provides users with information about how much longer they can sleep, rather than when the alarm will sound.
  • the present invention provides a graphical user interface that analyzes the user's sleep sound information to provide sleep state information, and presents the results so that the user's sleep can be easily recognized at a glance by comparing the user's sleep with the sleep of others.
  • the present invention can conveniently trigger sleep measurement to provide sleep state information on a user device.
  • FIG. 1A is a conceptual diagram illustrating a system in which various aspects of an apparatus for generating one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention may be implemented.
  • FIG. 1B is a conceptual diagram illustrating a system in which various aspects of a device providing one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention may be implemented.
  • FIG. 2A is a conceptual diagram illustrating a system in which the creation and/or provision of one or more graphical user interfaces representing information about a user's sleep are implemented in a user terminal according to an embodiment of the present invention.
  • Figure 2b is a conceptual diagram showing a system in which various aspects of various electronic devices according to the present invention can be implemented.
  • Figure 3 is a block diagram showing the configuration of a device that generates/provides one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention.
  • FIGS. 4A to 4G are diagrams illustrating a graphical user interface showing information on the date and/or time when information on the user's sleep was acquired based on the waking time according to embodiments of the present invention.
  • FIGS. 5A to 5E are diagrams illustrating a graphical user interface showing information on the date and/or time when information on the user's sleep was acquired, based on the sleep onset time, according to embodiments of the present invention.
  • FIGS. 6A to 6E are diagrams illustrating a graphical user interface showing information on the date and/or time when information on the user's sleep was acquired, based on the sleep onset time and the waking time, according to embodiments of the present invention.
  • FIGS. 7A to 7G are diagrams illustrating a graphical user interface showing information on the date and/or time when information on the user's sleep was acquired, based on the definition of the time zone to which the waking time belongs, according to embodiments of the present invention.
  • Figure 8 is a diagram for explaining the process of acquiring sleep sound information in the sleep analysis method according to the present invention.
  • Figure 9 is a diagram for explaining a method of obtaining a spectrogram corresponding to sleep sound information in the sleep analysis method according to the present invention.
  • Figure 10 is a flowchart of a method for creating and providing one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention.
  • Figure 11 is a schematic diagram showing one or more network functions for performing the sleep analysis method according to the present invention.
  • Figure 12 is a diagram for explaining sleep stage analysis using a spectrogram in the sleep analysis method according to the present invention.
  • Figure 13 is a diagram for explaining sleep disorder determination using a spectrogram in the sleep analysis method according to the present invention.
  • Figure 14 is a diagram showing an experimental process for verifying the performance of the sleep analysis method according to the present invention.
  • Figure 15 is a diagram for explaining the overall structure of a sleep analysis model according to an embodiment of the present invention.
  • Figure 16 is a diagram for explaining a feature extraction model and a feature classification model according to an embodiment of the present invention.
  • FIGS. 17A and 17B are diagrams comparing polysomnography results (PSG results) with analysis results (AI results) obtained using the AI algorithm according to the present invention, as graphs verifying the performance of the sleep analysis method according to the present invention.
  • Figure 18 is a graph verifying the performance of the sleep analysis method according to the present invention, comparing polysomnography (PSG) results relating to sleep apnea and hypopnea with the analysis results (AI results) obtained using the AI algorithm according to the present invention.
  • Figure 19a is a diagram showing a graphical user interface showing information about the sleep of a user with the alarm function activated according to the present invention.
  • Figure 19b is a diagram illustrating a graphical user interface showing information about the sleep of a user in which the alarm function according to the present invention is not activated.
  • Figure 20a is a diagram showing a graphical user interface indicating that sleep is being measured while the alarm function according to the present invention is activated.
  • Figure 20b is a diagram showing a graphical user interface indicating that sleep is being measured while the alarm function according to the present invention is deactivated.
  • Figure 21 is a diagram illustrating a graphical user interface that provides information to confirm the user's expected sleep time according to the present invention when the user's expected sleep time is more than a predetermined time.
  • Figure 22 is a diagram illustrating a user interface that provides information corresponding to sleep measurement when the user's expected sleep time is less than a predetermined time according to the present invention.
  • Figure 23 is a diagram for explaining a method of providing an alarm based on the user's sleep state information according to the present invention.
  • Figure 24a is a diagram for explaining the case where the AI alarm of the present invention is not set using a hypnogram.
  • Figure 24b is a diagram for explaining the case of setting the AI alarm of the present invention through a hypnogram.
  • Figure 25c is a diagram showing a respiratory stability graph according to an embodiment of the present invention.
  • Figure 25e is a diagram showing a graphical user interface including an explanatory display for respiratory instability according to an embodiment of the present invention.
  • 26A and 26B are diagrams illustrating a graphical user interface including statistical information of sleep state information according to embodiments of the present invention.
  • Figures 27a and 27b are diagrams illustrating a graphical user interface including sleep state information acquired over a week according to embodiments of the present invention.
  • FIGS. 28A and 28B are diagrams illustrating a graphical user interface including sleep state information acquired over a predetermined period of time according to embodiments of the present invention.
  • Figure 29 is a flowchart of a method for creating and providing one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention.
  • Figures 31A to 31E are diagrams showing hypnogram graphs of sleep stage information expressed in a conventional sleep measurement interface.
  • Figure 32 is a diagram showing that when a finger swipe input is sensed on the display unit of a user device, which is an embodiment of the present invention, the time of sensing the finger swipe is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 33 is a diagram showing that when a palm swipe input is sensed on the display unit of a user device, which is an embodiment of the present invention, the time of sensing the palm swipe is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 34 is a diagram showing that when the user device, which is an embodiment of the present invention, senses the start of wired charging, it recognizes the charging start sensing time as the sensing time of the sleep measurement start trigger.
  • Figure 35 is a diagram showing that when a user device, which is an embodiment of the present invention, senses the start of wireless charging, it recognizes the charging start sensing time as the sensing time of the sleep measurement start trigger.
  • Figure 36 is a diagram showing that when exercise or movement is sensed by a user device, which is an embodiment of the present invention, the time at which the exercise or movement is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 37 is a diagram showing that when a user's voice is sensed by a user device, which is an embodiment of the present invention, the time when the user's voice is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 38 is a diagram illustrating that when another device connected to the user device through a network senses sleep measurement start trigger information, which is an embodiment of the present invention, the sensing time is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 39 is a diagram showing that when sleep sound information is sensed by a user device according to an embodiment of the present invention, the time at which the sleep sound information is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • FIG. 40 is a diagram illustrating that when sleep sound information is sensed by a user device in a sleep mode according to an embodiment of the present invention, the time at which the sleep sound information is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • FIG. 41 is a diagram illustrating that when unlocking of the locking device of the user device is sensed, which is an embodiment of the present invention, the time when unlocking of the locking device is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 42 is a diagram showing that, during the sleep mode according to an embodiment of the present invention, if finger tapping is not sensed by the user device, the point at which a predetermined time has elapsed since finger tapping was last sensed is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 43 is a diagram showing that, during the sleep mode according to an embodiment of the present invention, if the user's exercise or movement is not sensed by the user device, the point at which a predetermined time has elapsed since the user's exercise or movement was last sensed is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 44 is a diagram showing that, when the display of the user device is turned off during the sleep mode according to an embodiment of the present invention, the point at which a predetermined time has elapsed since the display was turned off is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 45 is a diagram showing that, during the sleep mode according to an embodiment of the present invention, when the user's voice is not recognized by the user device, the point at which a predetermined time has elapsed since the user's voice was last recognized is recognized as the sensing time of the sleep measurement start trigger.
  • Figure 46a is a diagram illustrating a graphical user interface showing a comparison between the average sleep onset latency of the general public and the user's average sleep onset latency, according to an embodiment of the present invention.
  • Figure 46b is a diagram illustrating a graphical user interface showing a comparison of the user's average sleep onset latency with the average sleep onset latency of a specific age group and a specific gender, according to an embodiment of the present invention.
  • Figure 46c is a diagram illustrating a graphical user interface showing a comparison between the average sleep onset latency of a specific occupation and the user's average sleep onset latency, according to an embodiment of the present invention.
  • Figure 46d is a diagram illustrating a graphical user interface showing a comparison of the user's average sleep onset latency with the average sleep onset latency of a specific age group, specific gender, and specific occupation, according to an embodiment of the present invention.
  • Figure 47a is a diagram illustrating a graphical user interface showing a comparison between the average deep sleep of a general person and the average deep sleep of a user, according to an embodiment of the present invention.
  • FIG. 47B is a diagram illustrating a graphical user interface showing a comparison of a user's average deep sleep with that of a specific age group and a specific gender, according to an embodiment of the present invention.
  • Figure 47c is a diagram illustrating a graphical user interface showing a comparison between deep sleep of a specific occupation and deep sleep of a user, according to an embodiment of the present invention.
  • FIG. 47D is a diagram illustrating a graphical user interface showing a comparison of deep sleep of a user with deep sleep of a specific age group, specific gender, and specific occupation, according to an embodiment of the present invention.
  • FIG. 48A is a diagram illustrating a graphical user interface showing a comparison between the average REM sleep of an ordinary person and the REM sleep of a user, according to an embodiment of the present invention.
  • FIG. 48B is a diagram illustrating a graphical user interface showing a comparison of the user's REM sleep with the average REM sleep of a specific age group and a specific gender, according to an embodiment of the present invention.
  • Figure 48c is a diagram illustrating a graphical user interface showing a comparison between the REM sleep of a specific occupation and the REM sleep of a user, according to an embodiment of the present invention.
  • FIG. 48D is a diagram illustrating a graphical user interface showing a comparison of the user's REM sleep with that of a specific age group, specific gender, and specific occupation, according to an embodiment of the present invention.
  • Figure 49A is a flowchart of a method for creating one or more graphical user interfaces representing a non-numerical assessment of a user's sleep according to an embodiment of the present invention.
  • Figure 49b is a flowchart of a method for creating one or more graphical user interfaces that represent the evaluation of the user's sleep in a non-numerical and numerical manner according to an embodiment of the present invention.
  • FIGS. 50A to 50C are diagrams for explaining one or more graphical user interfaces representing non-numerical evaluations of a user's sleep according to an embodiment of the present invention.
  • FIG. 51A is a diagram illustrating one or more graphical user interfaces representing a numerical evaluation of a user's sleep, according to an embodiment of the present invention.
  • FIG. 51B is a diagram illustrating one or more graphical user interfaces representing a numerical evaluation of a user's sleep, according to an embodiment of the present invention.
  • Figure 51c is a diagram for explaining factors affecting sleep quality, identified from a survey of 36 psychiatrists conducted to generate a formula for calculating a numerical evaluation of the user's sleep, according to an embodiment of the present invention.
  • FIG. 51D is a diagram illustrating sleep keywords and a sleep Likert scale for calculating a numerical evaluation of a user's sleep, according to an embodiment of the present invention.
  • Figure 52 is a diagram illustrating the structure of a sleep analysis model using deep learning to analyze a user's sleep, according to an embodiment of the present invention.
  • FIG. 53 is a diagram illustrating a method of generating a sleep phrase based on a lookup table in a method of providing one or more graphical user interfaces representing an evaluation of a user's sleep, according to an embodiment of the present invention.
  • FIG. 54 is a diagram illustrating a method of generating a sleep phrase based on a large-scale language model, in a method of providing one or more graphical user interfaces representing an evaluation of a user's sleep, according to an embodiment of the present invention.
  • Figure 55 is a diagram for explaining a user sleep graph generation step of generating a graph of the sleep stages within a user's sleep period, in a method of providing one or more graphical user interfaces representing an evaluation of a user's sleep, according to an embodiment of the present invention.
  • Figure 56 is a diagram for explaining a hypnogram displaying a sleep stage within a user's sleep period according to an embodiment of the present invention.
  • Figure 57 is a diagram for explaining a hypnodensity graph displaying the sleep stage within the user's sleep period according to an embodiment of the present invention.
  • FIG. 58 is a diagram illustrating providing a sleep content service to a user in order to collect feedback, according to an embodiment of the present invention.
  • FIG. 59 is a diagram illustrating providing sleep analysis information to a user to collect feedback, according to an embodiment of the present invention.
  • FIG. 60 is a diagram illustrating providing a sleep environment adjustment service to a user in order to collect feedback, according to an embodiment of the invention.
  • Figure 61 is a diagram for explaining a user graphic display for collecting user feedback on sleep content according to an embodiment of the present invention.
  • Figure 62 is a diagram for explaining a user graphic display for collecting user feedback on sleep analysis information, according to an embodiment of the present invention.
  • FIG. 63 is a diagram illustrating a user graphic display for collecting user feedback on a sleep service according to an embodiment of the present invention.
  • Figure 64 is a diagram for explaining user action feedback for collecting user feedback on a sleep environment control service according to an embodiment of the present invention.
  • FIG. 65 is a diagram illustrating user physical motion feedback for collecting user feedback on sleep services or information according to an embodiment of the present invention.
  • FIG. 66 is a diagram illustrating user voice motion feedback for collecting user feedback on sleep services or information according to an embodiment of the present invention.
  • FIG. 67 is a diagram illustrating sleep state information feedback for collecting user feedback on sleep services or information according to an embodiment of the present invention.
  • Figure 68 is a flowchart illustrating a method for collecting user feedback on sleep services or information using user interface feedback, according to an embodiment of the present invention.
  • Figure 69 is a flowchart illustrating a method for collecting user feedback on sleep services or information using user motion feedback, according to an embodiment of the present invention.
  • Figure 70 is a flowchart illustrating a method for collecting user feedback on sleep services or information using user sleep state feedback, according to an embodiment of the present invention.
  • Figure 71 is a diagram to explain the structure of the Transformer model, which is the basis of a large language model.
  • Figure 72 is a diagram for explaining the inverter model of the diffusion model in content generation artificial intelligence according to an embodiment of the present invention.
  • Figure 73 is a diagram for explaining the generator and discriminator of a GAN (Generative Adversarial Network) in content generation artificial intelligence according to an embodiment of the present invention.
  • each step described in this specification is described as being performed by a computer, but the subject of each step is not limited thereto, and depending on the embodiment, at least part of each step may be performed in a different device.
  • FIG. 1A is a conceptual diagram illustrating a system in which various aspects of an apparatus 100 that generates one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention can be implemented.
  • a system according to embodiments of the present invention may include a device 100 for generating a graphical user interface, a user terminal 10, an external server 20, and a network.
  • the system in which the device 100 for generating one or more graphical user interfaces showing information about the user's sleep shown in FIG. 1A is implemented is according to one embodiment, and its components are not limited to the embodiment shown in FIG. 1A; they may be added, changed, or deleted as needed.
  • FIG. 1B is a conceptual diagram illustrating a system in which various aspects of the device 200 that provides one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention can be implemented.
  • a system according to embodiments of the present invention may include a device 200 that provides a graphical user interface, a user terminal 10, an external server 20, and a network.
  • the system in which the device 200 providing one or more graphical user interfaces showing information about the user's sleep shown in FIG. 1B is implemented is according to one embodiment, and its components are not limited to the embodiment shown in FIG. 1B; they may be added, changed, or deleted as needed.
  • FIG. 2A is a conceptual diagram showing a system in which the creation and/or provision of one or more graphical user interfaces representing information about the user's sleep is implemented in the user terminal 10, according to another embodiment of the present invention. As shown in FIG. 2A, one or more graphical user interfaces representing information about the user's sleep may be created and/or provided on the user terminal 10 itself, without a separate generating device 100 and/or a separate providing device 200.
  • FIG. 2B shows a conceptual diagram illustrating a system in which various aspects of various electronic devices related to another embodiment of the present invention can be implemented.
  • the electronic devices shown in FIG. 2B may perform at least one of the operations performed by various devices according to embodiments of the present invention.
  • operations performed by the various electronic devices may include acquiring environmental sensing information, performing learning for sleep analysis, performing inference for sleep analysis, and acquiring sleep state information.
  • they may also include receiving information related to the user's sleep, transmitting or receiving environmental sensing information, determining environmental sensing information, extracting acoustic information from environmental sensing information, processing data, or providing services.
  • they may further include building a learning data set based on environmental sensing information or information related to the user's sleep, storing acquired data or a plurality of data to be input to a neural network, or transmitting or receiving various information.
  • they may also include mutually transmitting and receiving data for the system according to embodiments of the present invention through a network, generating one or more graphical user interfaces that display information about the user's sleep, or providing one or more graphical user interfaces that display information about the user's sleep.
  • the electronic devices shown in FIG. 2B may individually perform the operations performed by various electronic devices according to embodiments of the present invention, but may also perform one or more operations simultaneously or in time series.
  • the electronic devices 10a to 10d shown in FIG. 2B may be electronic devices within the range of the area 11a in which object state information or environmental sensing information can be obtained.
  • hereinafter, the area 11a in which object state information or environmental sensing information can be obtained will be referred to as "area 11a."
  • the electronic devices 10a and 10d may be a combination of two or more electronic devices. Additionally, the electronic devices 10a and 10b may be electronic devices connected to a network within the area 11a. Additionally, the electronic devices 10c and 10d may be electronic devices that are not connected to the network within the area 11a.
  • the electronic devices 20a to 20b may be electronic devices outside the range of the area 11a. Additionally, there may be a network that interacts with electronic devices within the confines of area 11a, and there may be a network that interacts with electronic devices outside the confines of area 11a.
  • a network that interacts with electronic devices within the scope of area 11a may serve to transmit and receive information for controlling smart home appliances.
  • the network interacting with electronic devices within the scope of area 11a may be, for example, a local area network or a local network.
  • the network interacting with electronic devices outside the scope of area 11a may be, for example, a remote network or a global network.
  • referring to FIG. 2B, there may be one or more electronic devices connected through a network outside the range of area 11a; in this case, the electronic devices may distribute data to each other or perform one or more operations separately.
  • electronic devices connected through a network outside the scope of area 11a may include server devices.
  • the electronic devices may perform various operations independently of each other.
  • according to the present invention, the apparatus 100 for generating one or more graphical user interfaces representing information about the user's sleep and the device 200 for providing one or more graphical user interfaces representing information about the user's sleep can transmit and receive data for the system according to embodiments of the present invention with the user terminal 10 through a network.
  • even if the generating device 100 and the providing device 200 for one or more graphical user interfaces representing information about the user's sleep are not separately provided, the user terminal 10 can perform the roles of both devices through a network, thereby transmitting and receiving data for systems according to embodiments of the present invention.
  • various electronic devices according to the present invention can transmit and receive data for the system according to embodiments of the present invention through a network.
  • networks may use various wired communication systems such as Public Switched Telephone Network (PSTN), x Digital Subscriber Line (xDSL), Rate Adaptive DSL (RADSL), Multi Rate DSL (MDSL), Very High Speed DSL (VDSL), Universal Asymmetric DSL (UADSL), High Bit Rate DSL (HDSL), and local area network (LAN).
  • additionally, the network according to embodiments of the present invention may use various wireless communication systems such as Code Division Multi Access (CDMA), Time Division Multi Access (TDMA), Frequency Division Multi Access (FDMA), Orthogonal Frequency Division Multi Access (OFDMA), and Single Carrier-FDMA (SC-FDMA).
  • the network according to embodiments of the present invention can be configured regardless of the communication mode, such as wired or wireless, and may be composed of various communication networks such as a personal area network (PAN) and a wide area network (WAN). Additionally, the network may be the well-known World Wide Web (WWW), and may also use wireless transmission technology used for short-distance communication, such as Infrared Data Association (IrDA) or Bluetooth.
  • FIG. 3 is a block diagram showing the configuration of an apparatus 100 for generating/providing one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention.
  • an apparatus 100/200 for generating/providing one or more graphical user interfaces representing information about a user's sleep may include a display 120, a memory 140 that stores one or more programs configured to be executed by one or more processors, and one or more processors 160.
  • the memory 140 storing one or more programs may include high-speed random access memory such as DRAM, SRAM, or DDR RAM, or other random access solid-state memory devices, and may include non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. Additionally, the memory may store instructions for performing a method of providing one or more graphical user interfaces representing information about the user's sleep.
  • the processor 160 may be composed of one or more processors. Additionally, the processor may execute the one or more programs stored in the memory.
  • Environmental sensing information or sleep information may be obtained from one or more sensor devices. Additionally, the sensor device according to an embodiment of the present invention may be implemented in the form of a user terminal 10.
  • Environmental sensing information may refer to sensing information obtained from the space where the user is located.
  • Environmental sensing information may be sensing information obtained in a space where the user is located through a non-contact method.
  • environmental sensing information may be acoustic information obtained in the bedroom where the user sleeps.
  • the environmental sensing information acquired through the user terminal 10 may be the basis for obtaining the user's sleep state information in the present invention.
  • sleep state information related to whether the user is before, during, or after sleep may be obtained through environmental sensing information obtained in relation to the user's activities.
  • environmental sensing information may be at least one of noise information commonly occurring in daily life (e.g., sound information related to cleaning, sound information related to cooking food, sound information related to watching TV, cat sounds, dog sounds, bird sounds, car sounds, wind sounds, rain sounds, etc.) or other biometric information (e.g., electrocardiogram, brain waves, pulse information, information on muscle movement, etc.).
  • This user terminal 10 may refer to any type of entity(s) in the system that has a mechanism for communication with the device 100 for generating/providing a graphical user interface.
  • these user terminals 10 may include personal computers (PCs), notebooks, mobile terminals, smartphones, tablet PCs, artificial intelligence (AI) speakers, artificial intelligence TVs, and wearable devices, and may include all types of terminals that can access wired/wireless networks.
  • the user terminal 10 may include an arbitrary server implemented by at least one of an agent, an application programming interface (API), and a plug-in. Additionally, the user terminal 10 may include an application source and/or client application.
  • an external server that stores information about a plurality of learning data for learning a neural network may be further provided.
  • the plurality of learning data may include, for example, health checkup information or sleep checkup information.
  • the external server may be at least one of a hospital server and an information server, and may be a server that stores information about a plurality of polysomnography records, electronic health records, and electronic medical records.
  • a polysomnographic record may include information on the sleep examination subject's breathing and movements during sleep, and information on sleep diagnosis results (e.g., sleep stages, etc.) corresponding to that information.
  • Information stored in an external server (not shown) can be used as learning data, verification data, and test data to train the neural network in the present invention.
  • an external server may store an artificial intelligence model for analyzing sleep state information.
  • sleep state information can be generated based on the environmental sensing information through an artificial intelligence model mounted on the external server.
  • the acquired sleep sound information is transmitted to the external server, and the external server may generate sleep state information based on the received sleep sound information.
  • the device 100 for generating/providing a graphical user interface may receive health checkup information or sleep checkup information from an external server, and may build a learning data set based on the corresponding information.
  • the apparatus 100 for generating/providing a graphical user interface may perform learning on one or more network functions using the learning data set, thereby creating a sleep analysis model that obtains sleep state information based on environmental sensing information.
  • the external server may be a digital device equipped with a processor and memory and having computing power, such as a laptop computer, a notebook computer, a desktop computer, a web pad, or a mobile phone.
  • the external server may be a web server that processes the service.
  • the types of servers described above are merely examples and the present invention is not limited thereto.
  • the device 100/200 for generating/providing a graphical user interface acquires the user's sleep state information, and may create and/or provide one or more graphical user interfaces representing information about the user's sleep based on that sleep state information.
  • the apparatus 100/200 for generating/providing a graphical user interface may obtain sleep state information related to whether the user is before, during, or after sleep based on environmental sensing information, and may generate and/or provide one or more graphical user interfaces representing information about the user's sleep based on the acquired sleep state information.
  • one or more sleep sensor devices may include a microphone module, a camera, and an illumination sensor provided in the user terminal 10.
  • information related to the user's activities in the space where the user is located may be obtained through a microphone module provided in the user terminal 10.
  • when sensing information through a microphone module provided in the user terminal 10, the microphone module must fit into the relatively small user terminal 10, so it may be configured as a Micro-Electro Mechanical System (MEMS) microphone.
  • Microphone modules according to embodiments of the present invention can be manufactured very small, but may have a lower signal-to-noise ratio (SNR) than a condenser microphone or dynamic microphone.
  • a low signal-to-noise ratio may mean that the ratio of noise, which is a sound that is not to be identified, to the sound that is to be identified is high, making it difficult to identify the sound (i.e., unclear).
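  • As a brief illustration of the signal-to-noise ratio notion, the following sketch computes the SNR in decibels for a faint tone buried in louder noise; the signal and noise levels are invented for the example.

```python
# A small sketch of the SNR notion referenced above: the quieter the target
# sound relative to the background noise, the lower the SNR in decibels,
# and the harder the sound is to identify.

import numpy as np

def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    """SNR in dB: ratio of mean signal power to mean noise power."""
    return 10.0 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))

rng = np.random.default_rng(0)
breathing = 0.01 * np.sin(2 * np.pi * 5 * np.arange(16000) / 16000.0)  # faint tone
noise = 0.02 * rng.standard_normal(16000)                              # louder noise
print(f"{snr_db(breathing, noise):.1f} dB")  # negative: the noise dominates
```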
  • the environmental sensing information that is the subject of analysis in the present invention may include acoustic information related to the user's breathing and movement acquired during sleep.
  • this acoustic information concerns very small sounds (i.e., sounds that are difficult to distinguish), such as the user's breathing and movement, and is acquired along with other sounds in the sleep environment, so if it is acquired through a microphone module with a low signal-to-noise ratio as described above, its detection and analysis can be very difficult.
  • the electronic device can convert and/or adjust environmental sensing information that is obtained indistinctly and contains a lot of noise into data that can be analyzed, and can utilize the converted and/or adjusted data with an artificial neural network.
  • through the learned neural network (e.g., an acoustic analysis model), the user's sleep state information can be obtained based on the converted and/or adjusted data corresponding to the sleep acoustic information.
  • the sleep state information may include sleep stage information related to changes in the user's sleep stage during sleep, as well as information related to whether the user is sleeping.
  • the sleep state information may include sleep stage information indicating that the user was in REM sleep at a first time point, and that the user was in light sleep at a second time point different from the first time point. In this case, information that the user fell into a relatively deep sleep at the first time and a lighter sleep at the second time may be obtained through the corresponding sleep state information.
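  • For illustration, such per-time-point sleep stage information might be represented as in the following sketch, assuming conventional 30-second scoring epochs and a Wake/Light/Deep/REM label set; neither is a requirement of the disclosure.

```python
# Illustrative sketch of per-time-point sleep stage information: a timeline
# from which "deeper at time A, lighter at time B" statements can be read.

from dataclasses import dataclass

@dataclass
class SleepStageInfo:
    epoch_seconds: int    # duration of one scoring epoch (assumed 30 s)
    stages: list          # one label per epoch, e.g., "REM", "Light"

    def stage_at(self, seconds_from_start: int) -> str:
        """Return the sleep stage at a given time point within the night."""
        return self.stages[seconds_from_start // self.epoch_seconds]

night = SleepStageInfo(30, ["Wake", "Light", "Deep", "Deep", "REM", "Light"])
print(night.stage_at(120))  # this time point falls in "REM"
```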
  • when sleep sound information with a low signal-to-noise ratio is acquired through a user terminal widely used to collect sound (e.g., an artificial intelligence speaker, a bedroom IoT device, a mobile phone, a wearable device, etc.), the device 100/200 for generating/providing a graphical user interface can process it into data appropriate for analysis, and can process the processed data to provide sleep state information related to changes in sleep stages.
  • although the device 100 for generating a graphical user interface is represented in FIG. 1A as a separate entity from the user terminal 10, according to an embodiment of the present invention, as shown in FIG. 2A, the device 100 may be included in the user terminal 10 and perform the functions of measuring sleep status and recommending products related to sleep in one integrated device.
  • likewise, although the device 200 providing the graphical user interface is represented as a separate entity from the user terminal 10, according to an embodiment of the present invention, as shown in FIG. 2A, the device 200 may be included in the user terminal 10 and perform the functions of measuring sleep status and verifying products related to sleep in one integrated device.
  • the device 100/200 for generating/providing a graphical user interface may be a terminal or a server, and may include any type of device.
  • the device 100/200 for generating/providing a graphical user interface may be a digital device equipped with a processor and memory and having computing power, such as a laptop computer, a notebook computer, a desktop computer, a web pad, or a mobile phone.
  • the device 100/200 for generating/providing a graphical user interface may be a web server that processes services.
  • the types of servers described above are merely examples and the present invention is not limited thereto.
  • the device 100/200 for generating/providing a graphical user interface may be a server providing a cloud computing service. More specifically, it may be a server providing a cloud computing service, a type of Internet-based computing that processes information not on the user's computer but on another computer connected to the Internet.
  • the cloud computing service may be a service that stores data on the Internet and can be used anytime, anywhere through Internet access without the user having to install necessary data or programs on his or her computer.
  • the cloud computing service may be a service that allows data stored on the Internet to be easily manipulated, shared, and forwarded with a simple click.
  • cloud computing services may not only store data on a server on the Internet, but may also allow desired tasks to be performed using the functions of applications provided on the web without installing a separate program, and may allow multiple people to work while viewing and sharing documents at the same time.
  • cloud computing services may be implemented in at least one of the following forms: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), virtual machine-based cloud server, and container-based cloud server.
  • the apparatus 100/200 for generating/providing the graphical user interface of the present invention may be implemented in at least one of the cloud computing service forms described above.
  • the specific description of the cloud computing service described above is merely an example, and may include any platform for constructing the cloud computing environment of the present invention.
  • environmental sensing information may include sleeping environment information and user's life information.
  • the sleep environment information may be acoustic information about the user's sleep.
  • one or more sleep information sensor devices may collect raw data about sounds generated during sleep in order to analyze sleep. Raw data about sounds occurring during sleep may be in the time domain.
  • sleep sound information may be related to breathing and movement patterns related to the user's sleep. For example, in the awake state, all nervous systems are activated, so breathing patterns may be irregular and body movements may be frequent. Additionally, breathing sounds may be very low because the neck muscles are not relaxed. On the other hand, when the user sleeps, the autonomic nervous system stabilizes, breathing changes regularly, body movements may decrease, and breathing sounds may become louder. Additionally, when apnea occurs during sleep, loud breathing sounds may occur immediately after apnea as a compensation mechanism. In other words, by collecting raw data about sleep, analysis of sleep can be performed.
  • the acquired environmental sensing information and acoustic information are information in the time domain and may undergo a preprocessing process of noise (e.g., white noise) reduction.
  • the noise reduction process can be accomplished using algorithms such as spectral gating and spectral subtraction to remove background noise.
  • a noise removal process can be performed using a deep learning-based noise reduction algorithm.
  • the deep learning-based noise reduction algorithm can use a noise reduction algorithm specialized for the user's breathing or breathing sounds, that is, a noise reduction algorithm learned through the user's breathing or breathing sounds.
  • the above preprocessing may be performed during the learning process of sleep state information, or may be performed during the inference process.
  • spectral gating, or spectral noise gating, is a preprocessing method for acoustic information. Noise reduction can be performed on all of the acquired acoustic information, or the information can first be split at regular time intervals (e.g., 5 minutes) and noise reduction performed on each split segment. To perform noise reduction on acoustic information split at regular time intervals, a spectrum may first be calculated for each frame.
  • a method may be included in which the frame having the lowest-energy frequency spectrum among the spectrum frames is assumed to be static noise, and the frequencies of that frame are attenuated from the other spectrum frames.
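  • A minimal sketch of this spectral-subtraction idea follows, assuming illustrative frame and hop sizes; it treats the lowest-energy frame of the magnitude spectrogram as static noise and attenuates its spectrum from every frame.

```python
# Minimal sketch of spectral subtraction: estimate static noise from the
# lowest-energy frame and subtract it from all frames. Frame and hop sizes
# are illustrative assumptions.

import numpy as np

def spectral_subtraction(audio: np.ndarray, n_fft: int = 512, hop: int = 256) -> np.ndarray:
    # Short-time magnitude spectrogram: one row per windowed frame.
    frames = [audio[i:i + n_fft] * np.hanning(n_fft)
              for i in range(0, len(audio) - n_fft, hop)]
    mags = np.abs(np.fft.rfft(np.stack(frames), axis=1))
    # Assume the frame with the lowest total energy is static noise.
    noise_profile = mags[np.argmin(mags.sum(axis=1))]
    # Attenuate the noise spectrum from every frame (floored at zero).
    return np.maximum(mags - noise_profile, 0.0)

denoised = spectral_subtraction(np.random.default_rng(0).standard_normal(16000))
print(denoised.shape)  # (num_frames, n_fft // 2 + 1)
```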
  • a deep learning-based noise reduction method performed on raw acoustic information in the time domain rather than the frequency domain may be used.
  • a method may be used in which information such as sleep sound information, which is necessary information to be used as input to a sleep analysis model, is maintained, and other sounds are attenuated.
  • Noise reduction can be performed not only on sound information obtained through PSG test results, but also on sound information acquired through a microphone built into a user terminal such as a smartphone.
  • in order to analyze sleep sound information, raw sound information in the time domain can be converted into information including the changes in its frequency components along the time axis.
  • converting to information in the frequency domain may mean converting raw sleep sound information into information including changes in frequency components along the time axis.
  • changes along the time axis of frequency components included in raw acoustic information from which noise has been removed can be visualized and converted into information represented as an image.
  • raw acoustic information can be converted into a spectrogram among information including changes in frequency components along the time axis.
  • a method of converting raw acoustic information into a spectrogram based only on the amplitude, excluding the phase, can be used. Through this method, not only is privacy protected, but processing speed can also be improved by lowering the data volume.
  • One embodiment of the present invention can generate a sleep analysis model using a spectrogram (SP) converted based on sleep sound information (SS).
  • one embodiment of the present invention removes noise from sleep sound information using the above-described method, converts it into a spectrogram, and trains on the spectrogram to create a sleep analysis model, thereby reducing the amount of computation and computation time while protecting individual privacy.
  • the sleep acoustic information required for sleep stage analysis (e.g., the user's breathing sounds) may be relatively quieter than other noise, but when converted to a spectrogram, it can be identified relatively well against the surrounding noise.
  • a method may be included to convert acquired acoustic information into a spectrogram in real time.
  • the leakage of personal information can be prevented as compression of the frequency resolution of the spectrogram can be performed on the user's smartphone rather than on a server or cloud.
  • the spectrogram according to an embodiment of the present invention may be a Mel spectrogram to which the Mel scale is applied.
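As an illustration of the amplitude-only spectrogram and Mel-scale conversion described above, the following sketch uses the librosa library; the sampling rate, FFT size, and the choice of 20 mel bins are assumptions made for the example, not parameters from this disclosure.

```python
import numpy as np
import librosa

def to_mel_spectrogram(audio: np.ndarray, sr: int = 16000,
                       n_fft: int = 1024, hop_length: int = 256,
                       n_mels: int = 20) -> np.ndarray:
    # Magnitude-only STFT: the phase is discarded, which both protects
    # privacy (the waveform cannot be trivially reconstructed) and
    # reduces the volume of data to be stored or transmitted.
    mag = np.abs(librosa.stft(audio, n_fft=n_fft, hop_length=hop_length))
    # Apply the Mel scale and log compression for use as model input.
    mel = librosa.feature.melspectrogram(S=mag ** 2, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(mel)
```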
  • Figure 8 is a diagram for explaining the process of acquiring sleep sound information in the sleep analysis method according to the present invention.
  • Figure 9 is a diagram for explaining a method of obtaining a spectrogram corresponding to sleep sound information in the sleep analysis method according to the present invention.
  • the processor 130 may generate a spectrogram (SP) in response to the sleep sound information (SS).
  • Raw data may refer to raw acoustic information in the time domain.
  • raw data according to the present invention can also be collected through polysomnography (PSG) in a hospital environment.
  • user information in a home environment may be collected through a microphone built into a user terminal such as a wearable device or smartphone.
  • raw data may be acquired through the user terminal 10 from a start point input by the user to an end point; it may be acquired from the time the user manipulates the device (e.g., sets an alarm) up to the point corresponding to that manipulation (e.g., the alarm time); the acquisition period may be selected automatically based on the user's sleep pattern; or the start point may be determined automatically based on sound (the user's speech, breathing sounds, the sound of peripheral devices such as a TV or washing machine, etc.) or changes in illumination that indicate the user's intention to sleep. Meanwhile, the sleep intention timing according to an embodiment of the present invention may be calculated from the user's sleep intention information.
  • the processor 130 may generate a sleep spectrogram (SP) by performing fast Fourier transform on the sleep sound information (SS).
  • a spectrogram (SP) is intended to visualize and understand sound or waves, and may be a combination of waveform and spectrum characteristics.
  • a spectrogram (SP) may represent the difference in amplitude according to changes in the time axis and frequency axis as a difference in printing density or display color.
  • Preprocessed acoustic-related raw data can be cut into 30-second increments and converted into a spectrogram. Accordingly, a 30-second spectrogram has dimensions of 20 frequency bins x 1201 time steps.
  • a rectangular spectrogram can be converted into a shape close to a square by using various methods such as reshaping, resizing, and split-cat; by using such methods, the amount of information can also be relatively well preserved.
  • the present invention can use a method of simulating breathing sounds measured in various home environments by adding various noises occurring in the home environment to clean breathing sounds. Because sounds have additive properties, they can be added to one another. However, adding original sound signals such as mp3 or pcm and converting them to a spectrogram results in very large consumption of computing resources. Therefore, the present invention proposes a method of converting breathing sounds and noise into spectrograms and adding them, respectively. Through this, it is possible to secure the robustness of the artificial intelligence model to information from various home environments by simulating breathing sounds measured in various home environments and using them to learn artificial intelligence models.
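A hedged sketch of the proposed augmentation follows: clean breathing sounds and household noises are each converted to spectrograms once, and mixing is then done by addition in the spectrogram domain, avoiding a fresh transform for every simulated recording. The mixing weight is an assumption for illustration.

```python
import numpy as np

def augment_with_noise(breath_spec: np.ndarray, noise_spec: np.ndarray,
                       noise_scale: float = 0.5) -> np.ndarray:
    """Simulate a home-environment recording in the spectrogram domain.

    Because sound is additive, adding the (magnitude) spectrograms
    approximates mixing the waveforms, while consuming far fewer
    computing resources than re-transforming each mixed signal.
    """
    assert breath_spec.shape == noise_spec.shape
    return breath_spec + noise_scale * noise_spec
```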
  • the purpose of converting data according to an embodiment of the present invention into a spectrogram is to use it as the input of a sleep analysis model and to infer, through the learned model, which sleep state or sleep stage the pattern in the spectrogram corresponds to.
  • some preprocessing may be required before the spectrogram is used as input to the sleep analysis model. This preprocessing may be performed only during the learning process, during both the learning and inference processes, or only during the inference process.
  • the preprocessing of the spectrogram may involve adding Gaussian noise to the data to augment the amount of data, or pitch shifting to raise or lower the overall pitch of the sound.
  • a data augmentation preprocessing technique such as TUT (Tile-UnTile) augmentation may be included, in which a spectrogram or mel spectrogram is converted into a vector during the learning process, the converted vector is randomly cut (tiled) at the input stage of a node (neuron), and then untiled again after the output of the node (neuron).
  • the data augmentation preprocessing methods may also include a noise addition augmentation method that adds noise occurring in various environments other than Gaussian noise (e.g., external sounds, sounds of nature, the sound of a fan running, doors opening or closing, sounds made by animals or people, people talking, movement sounds, etc.).
  • noise addition augmentation may include converting noise information into a spectrogram and then artificially adding it to the spectrogram of the sleep sound information. In this case, there may be no significant difference between the spectrogram obtained by adding the noise information to the sleep sound information in the original sound domain and the spectrogram obtained by adding the sleep sound information and the noise information in the spectrogram domain.
  • because noise-added augmentation makes it difficult to convert the spectrogram back into the original signal, it can protect the user's privacy: by keeping only the amplitude from the spectrogram of each piece of sleep sound information and noise information and adding a random phase, conversion back to the original signal from the spectrogram can be made difficult, as sketched below.
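A minimal sketch of that privacy step, under the assumption that the magnitude spectrogram is available: only the amplitude is kept and a uniformly random phase is attached, so a straightforward inverse transform cannot recover the original waveform.

```python
import numpy as np

def randomize_phase(magnitude_spec: np.ndarray) -> np.ndarray:
    # Keep only the amplitude and attach a random phase; the resulting
    # complex spectrogram no longer inverts to the original audio.
    random_phase = np.random.uniform(0.0, 2.0 * np.pi, magnitude_spec.shape)
    return magnitude_spec * np.exp(1j * random_phase)
```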
  • noise addition augmentation may include not only a method of adding sound information in the spectrogram domain, but also a method of adding noise in the domain of a Mel spectrogram to which the Mel scale has been applied.
  • according to an embodiment of the present invention, performing the addition on the Mel scale can shorten the time required for the hardware to process the data.
  • a preprocessing method may be performed to convert information or a spectrogram in the frequency domain into a nearly square form.
  • sleep state information may include information related to whether the user is sleeping.
  • the sleep state information may include at least one of first sleep state information indicating that the user is before sleep, second sleep state information indicating that the user is sleeping, and third sleep state information indicating that the user is after sleep.
  • when the first sleep state information is obtained, the processor 130 may determine that the user is in a pre-sleep state (i.e., before going to bed); when the second sleep state information is inferred, it may determine that the user is sleeping; and when the third sleep state information is obtained, it may determine that the user is in a post-sleep state (i.e., has woken up).
  • the sleep state information may include information about at least one of sleep apnea, snoring, tossing and turning, coughing, sneezing, or bruxism, in addition to information related to the user's sleep stage.
  • depending on the sleep state to be analyzed, acoustic information acquired over a long time interval may be required, or acoustic information acquired over a relatively short time interval (e.g., 1 minute) before and after the corresponding sleep state occurs may be required.
  • This sleep state information may be obtained based on environmental sensing information.
  • Environmental sensing information may include sensing information obtained in a non-contact manner in the space where the user is located.
  • the processor 130 may obtain environmental sensing information. Specifically, environmental sensing information can be obtained through the user terminal 10 carried by the user. For example, environmental sensing information related to the space in which the user operates may be obtained through the user terminal 10 carried by the user, and the processor 130 may receive the corresponding environmental sensing information from the user terminal 10.
  • the processor 130 may obtain sleep state information based on acoustic information, actigraphy, and biometric information obtained from the user terminal 10. Specifically, the processor 130 may identify a singularity in acoustic information.
  • the singularity of the acoustic information may be related to breathing and movement patterns associated with sleep. For example, in the awake state, all nervous systems are activated, so breathing patterns may be irregular and body movements may be frequent. Additionally, breathing sounds may be very quiet because the neck muscles are not relaxed.
  • in contrast, during sleep, the autonomic nervous system stabilizes, breathing becomes regular, body movements may decrease, and breathing sounds may become louder.
  • the processor 130 may identify the point in time at which a pattern of acoustic information related to regular breathing, small body movement, or small breathing sounds is detected as a singular point in the acoustic information. Additionally, the processor 130 may obtain sleep sound information based on sound information obtained based on the identified singularity. The processor 130 may identify a singularity related to the user's sleep time from the sound information acquired in time series and obtain sleep sound information based on the singularity.
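The following is an illustrative heuristic, not the claimed algorithm: it scans a short-time energy envelope of the recording for the first window whose autocorrelation shows a breathing-rate periodicity, and reports that window as the singular point P. The window length, envelope rate, breathing band, and threshold are all assumptions.

```python
import numpy as np

def find_singularity(envelope: np.ndarray, fs_env: float = 10.0,
                     win_s: float = 60.0, thresh: float = 0.6):
    """envelope: short-time energy of the audio, sampled at fs_env Hz."""
    win = int(win_s * fs_env)
    for start in range(0, len(envelope) - win, win):
        seg = envelope[start:start + win]
        seg = seg - seg.mean()
        ac = np.correlate(seg, seg, mode="full")[win - 1:]
        ac = ac / (ac[0] + 1e-12)              # normalized autocorrelation
        # Regular breathing of ~0.2-0.5 Hz shows a peak at a 2-5 s lag.
        lags = slice(int(2 * fs_env), int(5 * fs_env))
        if ac[lags].max() > thresh:
            return start / fs_env              # seconds into the recording
    return None                                # no singular point found
```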
  • the processor 130 can identify, from the acoustic information (E), a singular point (P) related to the point in time at which acoustic information with a pattern of regular breathing, small body movements, or quiet breathing sounds is detected.
  • the processor 130 may acquire sleep sound information (SS) based on the identified singular point (P) and sound information acquired after the singular point (P).
  • the waveforms and singularities related to sound in FIG. 8 are merely examples for understanding the present invention, and the present invention is not limited thereto.
  • the processor 130 identifies the singularity (P) related to the user's sleep from the acoustic information and, based on the singularity (P), can obtain the sleep acoustic information (SS) by extracting only it from the vast amount of environmental sensing information (i.e., acoustic information).
  • the processor 130 may obtain sleep state information related to whether the user is before sleep or asleep based on the singular point (P) identified from the sound information (E). Specifically, if the singular point (P) is not identified, the processor 130 may determine that the user has not yet fallen asleep; if the singular point (P) is identified, it may determine that the user is sleeping from the singular point (P) onward. In addition, after the singular point (P) is identified, the processor 130 identifies a time point (e.g., the wake-up time) at which the identified pattern is no longer observed, and when that time point is identified, it may determine that the user has woken up after sleeping.
  • the processor 130 can obtain sleep state information related to whether the user is before, during, or after sleep based on whether a singular point (P) is identified in the acoustic information (E) and on whether a preset pattern continues to be detected after the singular point is identified.
  • the processor 130 may also obtain sleep state information based on actigraphy or biometric information rather than acoustic information (E); it may be advantageous to obtain the user's movement information through a sensor unit in contact with the body. In the present invention, since the user's sleep state information is identified in advance using actigraphy or biometric information during the first sleep analysis, the reliability of the sleep state analysis can be further improved.
  • the processor 130 may extract sleep stage information. Sleep stage information may be extracted based on the user's environmental sensing information. Sleep stages can be divided into NREM (non-REM) sleep and REM (rapid eye movement) sleep, and NREM sleep can be further subdivided into multiple stages (e.g., two stages of light and deep, or four stages N1 to N4).
  • the sleep stage setting may be defined as a general sleep stage, but may also be arbitrarily set to various sleep stages depending on the designer. Through sleep stage analysis, it is possible to predict not only sleep-related sleep quality, but also sleep diseases (e.g. sleep apnea) and their underlying causes (e.g. snoring).
  • the processor 130 may generate product recommendation information and verification information related to sleep based on sleep stage information.
  • when a word indicating the light sleep stage is displayed in Korean, it may be displayed as 'light sleep' or 'normal sleep'. Users who do not have expert knowledge about sleep stages may misunderstand the word 'light sleep' as meaning that they did not sleep properly during that period. In contrast, displaying the word 'normal sleep' may reduce the possibility of this misunderstanding.
  • Figure 15 is a diagram for explaining the overall structure of a sleep analysis model according to an embodiment of the present invention.
  • sleep state information can be obtained through a sleep analysis model that analyzes the user's sleep stage based on sound information (sleep sound information).
  • the sleep sound information (SS) may be a very quiet sound because it relates to breathing and body movement acquired during the user's sleep. Accordingly, the present invention performs the sound analysis by converting the sleep sound information (SS) into a spectrogram (SP) as described above. Because the spectrogram (SP) contains information showing how the frequency spectrum of the sound changes over time, breathing or movement patterns associated with relatively quiet sounds can be easily identified, improving the efficiency of the analysis.
  • it may be difficult to classify the sleep sound information into at least one of the awake state, REM sleep state, light sleep state, and deep sleep state based solely on changes in the energy level of the sleep sound information; however, by converting the sleep sound information into a spectrogram, changes in the frequency spectrum of each piece of sleep sound information can be easily detected, so analysis corresponding to quiet sounds (e.g., breathing and body movements) may be possible.
  • the processor 130 may obtain sleep state information by processing the spectrogram (SP) as an input to a sleep analysis model.
  • the sleep analysis model is a model for obtaining sleep state information related to changes in the user's sleep stage, and can output sleep state information by inputting sleep sound information acquired during the user's sleep.
  • the sleep analysis model may include a neural network model constructed through one or more network functions.
  • Figure 11 is a schematic diagram showing one or more network functions for performing the sleep analysis method according to the present invention.
  • a sleep analysis model is comprised of one or more network functions, and one or more network functions may be comprised of a set of interconnected computational units, which may generally be referred to as 'nodes'. These 'nodes' may also be referred to as 'neurons'.
  • One or more network functions are composed of at least one or more nodes. Nodes (or neurons) that make up one or more network functions may be interconnected by one or more 'links'.
  • one or more nodes connected through a link may form a relative input node and output node relationship.
  • the concepts of input node and output node are relative, and any node in an output node relationship with one node may be in an input node relationship with another node, and vice versa.
  • input node to output node relationships can be created around links.
  • One or more output nodes can be connected to one input node through a link, and vice versa.
  • the value of the output node may be determined based on data input to the input node.
  • the links connecting input nodes and output nodes may have weights. The weights may be variable and may be varied by the user or by an algorithm in order for the neural network to perform a desired function. For example, when one or more input nodes are connected to one output node by respective links, the value of the output node can be determined based on the values input to the input nodes connected to the output node and the weights set on the links corresponding to each input node, as illustrated below.
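As a one-line illustration of this computation, an output node's value is an activation of the weighted sum of its input nodes; the tanh activation here is an arbitrary choice for the example.

```python
import numpy as np

def node_output(inputs: np.ndarray, weights: np.ndarray, bias: float = 0.0) -> float:
    # Value of one output node: weighted sum over the connected input nodes.
    return float(np.tanh(inputs @ weights + bias))
```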
  • one or more nodes are interconnected through one or more links to form an input node and output node relationship within the neural network.
  • the characteristics of the neural network may be determined according to the number of nodes and links within the neural network, the correlation between the nodes and links, and the value of the weight assigned to each link. For example, if there are two neural networks with the same number of nodes and links and different weight values between the links, the two neural networks may be recognized as different from each other.
  • Some of the nodes constituting the neural network may form one layer based on the distances from the first input node.
  • a set of nodes at distance n from the initial input node may constitute the n-th layer.
  • the distance from the initial input node can be defined by the minimum number of links that must be passed to reach the node from the initial input node.
  • this definition of a layer is arbitrary for explanation purposes, and the order of a layer within a neural network may be defined in a different way than described above.
  • a layer of nodes may be defined by distance from the final output node.
  • the initial input node may refer to one or more nodes in the neural network through which data is directly input without going through links in relationships with other nodes. Alternatively, in the relationship between nodes based on a link within a neural network, it may refer to nodes that do not have other input nodes connected by a link. Similarly, the final output node may refer to one or more nodes that do not have an output node in their relationship with other nodes among the nodes in the neural network. Additionally, hidden nodes may refer to nodes constituting a neural network other than the first input node and the last output node.
  • the neural network according to an embodiment of the present invention may have more nodes in the input layer than the nodes in the hidden layer close to the output layer, and may be a neural network in which the number of nodes decreases as it progresses from the input layer to the hidden layer.
  • a neural network may contain one or more hidden layers.
  • the hidden node of the hidden layer can take the output of the previous layer and the output of surrounding hidden nodes as input.
  • the number of hidden nodes for each hidden layer may be the same or different.
  • the number of nodes in the input layer may be determined based on the number of data fields of the input data and may be the same as or different from the number of hidden nodes.
  • Input data input to the input layer can be operated by the hidden node of the hidden layer and output by the fully connected layer (FCL), which is the output layer.
  • a deep neural network may refer to a neural network that includes multiple hidden layers in addition to the input layer and output layer.
  • Deep neural networks make it possible to identify latent structures in data. In other words, they can identify the latent structure of a photo, text, video, voice, or music (e.g., what object is in the photo, what the content and emotion of the text are, what the content and emotion of the voice are, etc.).
  • Deep neural networks include convolutional neural networks (CNN), recurrent neural networks (RNN), auto encoders, generative adversarial networks (GAN), and restricted Boltzmann machines (RBMs).
  • the network function may include an auto encoder.
  • An autoencoder may be a type of artificial neural network for producing output data similar to its input data.
  • the autoencoder may include at least one hidden layer, and an odd number of hidden layers may be placed between input and output layers.
  • the number of nodes in each layer may be reduced from the input layer toward an intermediate layer called the bottleneck layer (encoding), and then expanded symmetrically from the bottleneck layer toward the output layer, which is symmetrical to the input layer (decoding).
  • the nodes of the dimensionality reduction layer and dimensionality restoration layer may or may not be symmetric.
  • Autoencoders can perform nonlinear dimensionality reduction.
  • the number of nodes in the input layer and the output layer may correspond to the dimensionality of the input data remaining after preprocessing.
  • the number of nodes in the hidden layers included in the encoder may decrease with distance from the input layer. If the number of nodes in the bottleneck layer (the layer with the fewest nodes, located between the encoder and the decoder) is too small, not enough information may be conveyed, so it may be maintained above a certain number (e.g., more than half the number of nodes in the input layer), as in the sketch below.
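A hedged PyTorch sketch of that layout follows: node counts shrink toward the bottleneck and expand symmetrically back, with the bottleneck kept at half the input width per the guidance above. The dimensions are assumptions for illustration, not the claimed architecture.

```python
import torch.nn as nn

def make_autoencoder(n_in: int = 64) -> nn.Module:
    bottleneck = max(n_in // 2, 1)              # keep >= half the input width
    mid = (n_in + bottleneck) // 2
    return nn.Sequential(
        nn.Linear(n_in, mid), nn.ReLU(),        # encoder: reduce...
        nn.Linear(mid, bottleneck), nn.ReLU(),  # ...down to the bottleneck
        nn.Linear(bottleneck, mid), nn.ReLU(),  # decoder: expand symmetrically
        nn.Linear(mid, n_in),                   # output layer mirrors the input
    )
```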
  • a neural network may be trained in at least one of supervised learning, unsupervised learning, and semi-supervised learning. Learning of a neural network is intended to minimize errors in output.
  • learning data is repeatedly input into the neural network, the error between the neural network's output for the learning data and the target is calculated, and the weights of each node of the neural network are updated by backpropagating the error from the output layer of the neural network toward the input layer in the direction that reduces the error.
  • in supervised learning, learning data in which each item is labeled with the correct answer (i.e., labeled learning data) is used, whereas in unsupervised learning, the correct answer may not be labeled in each item of learning data. For example, in the case of supervised learning for data classification, the learning data may be data in which each training example is labeled with a category; labeled training data is input to the neural network, and the error can be calculated by comparing the output (category) of the neural network with the label of the training data. As another example, in the case of unsupervised learning for data classification, the error can be calculated by comparing the input training data with the neural network output.
  • the calculated error is backpropagated in the reverse direction (i.e., from the output layer to the input layer) in the neural network, and the connection weight of each node in each layer of the neural network can be updated according to backpropagation.
  • the amount of change in the connection weight of each updated node may be determined according to the learning rate.
  • the neural network's calculation of input data and backpropagation of errors can constitute a learning cycle (epoch).
  • the learning rate may be applied differently depending on the number of repetitions of the learning cycle of the neural network. For example, in the early stages of neural network training, a high learning rate can be used to increase efficiency by allowing the neural network to quickly achieve a certain level of performance, and in the later stages of training, a low learning rate can be used to increase accuracy.
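A minimal sketch of such a schedule in PyTorch, with a stand-in model; the rates and step size are assumptions chosen only to show the high-then-low pattern.

```python
import torch

model = torch.nn.Linear(10, 2)                      # stand-in network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)   # high initial rate
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(30):                             # one learning cycle = 1 epoch
    x, y = torch.randn(8, 10), torch.randn(8, 2)
    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()                                 # backpropagate the error
    optimizer.step()
    scheduler.step()                                # lr: 0.1 -> 0.01 -> 0.001
```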
  • the training data can generally be a subset of the real data (i.e., the data to be processed using the learned neural network), and thus there may be learning cycles in which the error on the training data decreases while the error on the real data increases.
  • Overfitting is a phenomenon in which the error on real data increases due to excessive learning on the training data. For example, a neural network that learned cats from yellow cats and then fails to recognize a non-yellow cat as a cat is a type of overfitting. Overfitting can cause the error of AI algorithms to increase. To prevent such overfitting, various optimization methods can be used, such as increasing the amount of learning data, regularization, and dropout, which omits some of the network's nodes during the learning process.
  • the data structure may include a neural network.
  • the data structure including the neural network may be stored in a computer-readable medium.
  • Data structures including neural networks may also include data input to the neural network, weights of the neural network, hyperparameters of the neural network, data obtained from the neural network, activation functions associated with each node or layer of the neural network, and loss functions for learning the neural network.
  • a data structure containing a neural network may include any of the components disclosed above. In other words, the data structure including the neural network may be configured to include all of the components disclosed above or any combination thereof.
  • a data structure containing a neural network may include any other information that determines the characteristics of the neural network. Additionally, the data structure may include all types of data used or generated in the computational process of a neural network and is not limited to the above.
  • Computer-readable media may include computer-readable recording media and/or computer-readable transmission media.
  • a neural network can generally consist of a set of interconnected computational units, which can be referred to as nodes. These nodes may also be referred to as neurons.
  • a neural network consists of at least one node.
  • Figure 16 is a diagram for explaining a feature extraction model and a feature classification model according to an embodiment of the present invention.
  • the sleep analysis model used in the present invention may include a feature extraction model that extracts one or more features for each predetermined epoch, and a feature classification model that generates sleep state information by classifying each of the features extracted through the feature extraction model into one or more sleep stages.
  • the feature extraction model can extract features related to breathing sounds, breathing patterns, and movement patterns by analyzing the time-series frequency pattern of the spectrogram (SP).
  • the feature extraction model may be constructed from part of a neural network model pre-trained on a training data set.
  • the sleep analysis model used in the present invention may include a feature extraction model and a feature classification model.
  • the feature extraction model may be a deep learning learning model based on a natural language processing model that can learn the time-series correlation of given data.
  • the feature classification model may be a learning model based on a natural language processing model that can learn the time-series correlation of given data.
  • deep learning models based on natural language processing models that can learn time-series correlations may include Transformer, ViT, MobileViT, and MobileViT2, but are not limited thereto.
  • the learning data set according to an embodiment of the present invention may be composed of data in the frequency domain and a plurality of sleep state information corresponding to each data.
  • the learning data set according to an embodiment of the present invention may be composed of a plurality of spectrograms and a plurality of sleep state information corresponding to each spectrogram.
  • the learning data set according to an embodiment of the present invention may be composed of a plurality of Mel spectrograms and a plurality of sleep state information corresponding to each Mel spectrogram.
  • the configuration and performance of the sleep analysis model according to an embodiment of the present invention will be described in detail based on the data set of the spectrogram.
  • the learning data used in the sleep analysis model of the present invention is not limited to spectrograms; information in the frequency domain, a spectrogram, or a mel spectrogram can be used as learning data.
  • the feature extraction model can be pre-trained through a one-to-one proxy task in which one spectrogram is input and the model is trained to predict the sleep state information corresponding to that spectrogram.
  • learning may be performed by adopting the structure of FC (Fully Connected Layer) or FCN (Fully Connected Neural Network).
  • learning may be performed by adopting the structure of the intermediate layer.
  • the feature classification model takes a plurality of consecutive spectrograms as input, predicts the sleep state information of each spectrogram, and can be trained to predict or classify the overall sleep state information by analyzing the sequence of the plurality of consecutive spectrograms.
  • pre-training is performed on the feature extraction model through a one-to-one proxy task, after which fine-tuning can be performed on the pre-trained feature extraction model and the feature classification model through a many-to-many task.
  • the sleep stage may be inferred by inputting a sequence of 40 consecutive spectrograms into a plurality of feature extraction models learned through a one-to-one proxy task and outputting 20 pieces of sleep state information.
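The following PyTorch sketch shows the general shape of such a pipeline, under stated assumptions: a small CNN stands in for the per-epoch feature extraction model, a transformer encoder stands in for the sequence-level feature classification model, and 40 input epochs yield predictions for the 20 centre epochs. All layer sizes are illustrative, not the claimed architecture.

```python
import torch
import torch.nn as nn

class SleepAnalysisModel(nn.Module):
    def __init__(self, d=128, n_stages=4, out_len=20):
        super().__init__()
        self.out_len = out_len
        self.extract = nn.Sequential(      # one 30-s spectrogram -> d-dim feature
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, d),
        )
        layer = nn.TransformerEncoderLayer(d_model=d, nhead=4, batch_first=True)
        self.sequence = nn.TransformerEncoder(layer, num_layers=2)
        self.classify = nn.Linear(d, n_stages)   # per-epoch stage logits

    def forward(self, spec_seq):           # (B, 40, n_mels, n_steps)
        b, s, f, t = spec_seq.shape
        feats = self.extract(spec_seq.reshape(b * s, 1, f, t)).reshape(b, s, -1)
        ctx = self.sequence(feats)         # past and future epochs considered
        lo = (s - self.out_len) // 2       # keep only the centre epochs
        return self.classify(ctx[:, lo:lo + self.out_len])   # (B, 20, 4)

logits = SleepAnalysisModel()(torch.randn(2, 40, 20, 1201))  # -> (2, 20, 4)
```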
  • the above-described specific numerical descriptions regarding the number of spectrograms, the number of feature extraction models, and the number of sleep state information are merely examples, and the present invention is not limited thereto.
  • an inference model is created to extract the user's sleep state and sleep stage through deep learning of environmental sensing information.
  • environmental sensing information including sound information is converted into a spectrogram, and an inference model is created based on the spectrogram.
  • the inference model may be built in the apparatus 100 for generating/providing a graphical user interface 200, as described above.
  • environmental sensing information including user sound information acquired through the user terminal 10 is input to the corresponding inference model, and sleep state information and/or sleep stage information is output as a result value.
  • learning and inference may be performed by the same entity or by separate entities. That is, both learning and inference may be performed by the device 100 that generates the graphical user interface of FIG. 1A or by the device 200 that provides the graphical user interface of FIG. 1B; alternatively, learning may be performed by the device 100 of FIG. 1A or the device 200 of FIG. 1B while inference is performed by the user terminal 10; or both learning and inference may be performed by the user terminal 10.
  • learning or inference may be performed by at least one of the electronic devices shown in FIG. 2B.
  • the feature extraction model may be composed of an independent deep learning model learned through a training data set.
  • the feature extraction model can be learned through supervised learning or unsupervised learning methods.
  • a feature extraction model can be trained to output output data similar to input data through a learning data set.
  • only the core feature data (or features) of the input spectrogram can be learned through the hidden layer.
  • the output data of the hidden layer may be an approximation of the input data (i.e., spectrogram) rather than a perfect copy value.
  • Each of the plurality of spectrograms included in the learning data set may be tagged with sleep state information.
  • Each of the plurality of spectrograms may be input to a feature extraction model, and the output corresponding to each spectrogram may be stored by matching the tagged sleep state information.
  • for example, when first learning data sets (i.e., multiple spectrograms) tagged with first sleep state information (e.g., light sleep) are input, the features related to the outputs for those inputs can be stored by matching them to the first sleep state information.
  • one or more features relevant to the output may be represented in a vector space.
  • since the features output for each of the first learning data sets are derived from spectrograms related to the first sleep stage, they may be located relatively close to one another in the vector space. That is, learning can be performed so that the plurality of spectrograms corresponding to each sleep stage output similar features.
  • when the feature extraction model trained through the above-described learning process receives a spectrogram (e.g., a spectrogram converted from sleep sound information) as input, the features corresponding to that spectrogram can be extracted.
  • the processor 130 may extract features by processing the spectrogram (SP) generated in response to the sleep sound information (SS) as an input to a feature extraction model.
  • the processor 130 may divide the spectrogram (SP) into predetermined epochs. For example, the processor 130 may obtain a plurality of spectrograms by dividing the spectrogram (SP) corresponding to the sleep sound information (SS) into 30-second increments. For example, if sleep sound information is acquired during the user's 7-hour (i.e., 420-minute) sleep, the processor 130 may obtain 840 spectrograms by dividing the spectrogram in 30-second increments.
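A minimal sketch of this segmentation, assuming the 20 mel bins × 1201 time steps per 30-second epoch mentioned earlier:

```python
import numpy as np

def split_into_epochs(spec: np.ndarray, steps_per_epoch: int = 1201):
    """spec: (n_mels, total_time_steps) full-night spectrogram."""
    n_epochs = spec.shape[1] // steps_per_epoch   # e.g. 7 h of sleep -> 840
    return [spec[:, i * steps_per_epoch:(i + 1) * steps_per_epoch]
            for i in range(n_epochs)]
```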
  • the detailed numerical description of the above-described sleep time, division time unit of the spectrogram, and number of divisions is only an example, and the present invention is not limited thereto.
  • the processor 130 may process each of the plurality of segmented spectrograms as input to a feature extraction model to extract a plurality of features corresponding to each of the plurality of spectrograms. For example, if the number of spectrograms is 840, the number of features extracted by the feature extraction model correspondingly may also be 840.
  • the above-described specific numerical description regarding the spectrogram and number of features is only an example, and the present invention is not limited thereto.
  • the feature extraction model according to an embodiment of the present invention may be trained using a one-to-one proxy task. Additionally, in the process of learning to extract sleep state information for one spectrogram, it may be learned to extract sleep state information by combining a feature extraction model and another NN (Neural Network).
  • the learning time of the feature extraction model can be shortened or the learning efficiency can be increased.
  • for example, one spectrogram divided into 30-second units may be used as the input to the feature extraction model, and the model may be trained so that the output vector, used in turn as the input to another neural network, yields the sleep state information.
  • the processor 130 may obtain sleep state information by processing a plurality of features output through the feature extraction model as input to a feature classification model.
  • the feature classification model may be a neural network model modeled to predict sleep stages in response to features.
  • the feature classification model includes a fully connected layer and may be a model that classifies features into at least one of the sleep stages. For example, when the first feature corresponding to the first spectrogram is input to the feature classification model, the first feature may be classified as light sleep.
  • the feature classification model can perform multi-epoch classification to predict sleep stages of multiple epochs by using spectrograms related to multiple epochs as input.
  • Multi-epoch classification does not provide a single piece of sleep stage analysis information in response to the spectrogram of a single epoch (i.e., one spectrogram corresponding to 30 seconds); rather, it uses spectrograms corresponding to multiple epochs (i.e., a combination of spectrograms, each corresponding to 30 seconds) as input to estimate several sleep stages (e.g., changes in sleep stage over time) at once.
  • for example, the feature classification model may take 40 spectrograms (e.g., 40 spectrograms each corresponding to 30 seconds) as input and perform prediction for the 20 spectrograms located in the center. That is, all spectrograms from the 1st to the 40th are examined, but the sleep stages can be predicted through classification corresponding to the 20 center spectrograms (e.g., the 11th to the 30th).
  • the detailed numerical description of the number of spectrograms described above is only an example, and the present invention is not limited thereto.
  • spectrograms corresponding to multiple epochs are used as input so that all information related to the past and the future can be considered, thereby improving the accuracy of the output. A sliding-window sketch of this is shown below.
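A hedged sketch of driving such a model over a whole night: a 40-epoch window slides across the recording and only the 20 centre predictions of each window are kept, so every scored epoch has seen both past and future context. The model is assumed to behave like the SleepAnalysisModel sketched earlier; edge epochs are simply left unscored here.

```python
import torch

def predict_night(model, epochs, window=40, out_len=20):
    """epochs: (N, n_mels, n_steps) tensor of consecutive 30-s spectrograms."""
    n = epochs.shape[0]
    preds = torch.full((n,), -1, dtype=torch.long)   # -1 = not predicted
    margin = (window - out_len) // 2
    for start in range(0, n - window + 1, out_len):
        logits = model(epochs[start:start + window].unsqueeze(0))  # (1, 20, 4)
        preds[start + margin:start + margin + out_len] = logits.argmax(-1)[0]
    return preds    # per-epoch sleep stage indices
```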
  • Figure 12 is a diagram for explaining sleep stage analysis using a spectrogram in the sleep analysis method according to the present invention.
  • the second analysis based on sleep acoustic information uses the sleep analysis model described above, as shown in FIG. 12, so that the corresponding sleep stage (Wake, REM, Light, Deep) can be immediately inferred.
  • the secondary analysis based on sleep sound information can extract the times at which sleep disorders (sleep apnea, hyperpnea) or snoring occurred through singularities in the Mel spectrum corresponding to the sleep stage.
  • Figure 13 is a diagram for explaining sleep disorder determination using a spectrogram in the sleep analysis method according to the present invention.
  • the breathing pattern is analyzed in one Mel spectrogram, and when characteristics corresponding to a sleep apnea or hyperpnea event are detected, that point in time can be determined as the time at which the sleep disorder occurred. A process of classifying snoring as snoring, rather than as sleep apnea or hyperpnea, through frequency analysis may further be included.
  • Figure 14 is a diagram showing an experimental process for verifying the performance of the sleep analysis method according to the present invention.
  • the user's sleep image and sleep sound are acquired in real time, and the acquired sleep sound information can be immediately converted to information in the frequency domain.
  • the user's sleep sound information may be immediately converted into a spectrogram.
  • a preprocessing process of sleep sound information may be performed.
  • the converted frequency domain information or spectrogram can be input into a sleep analysis model and the sleep stage can be analyzed immediately.
  • the operation may be performed as follows.
  • a spectrogram containing time series information can be used as the input to a CNN-based deep learning model, and a vector with reduced dimensionality can be output; this reduced vector can then be processed by a transformer-based deep learning model.
  • the output vector of the transformer-based deep learning model can be input to a 1D CNN (1D convolutional neural network) so that an average pooling technique is applied; through this averaging over the time series information, a process of converting the time series information into an N-dimensional vector in which time is implied can also be performed.
  • the N-dimensional vector containing time series information corresponds to data that still contains the time series information, differing from the input data only in resolution.
  • prediction of various sleep stages can be performed by performing multi-epoch classification on a combination of N-dimensional vectors containing output time series information.
  • continuous prediction of sleep state information can be performed by using the output vectors of transformer-based deep learning models as input to a plurality of fully connected layers (FC).
  • the operation can be performed as follows.
  • a spectrogram containing time series information can be used as an input to a Mobile ViT-based deep learning model to output a vector with reduced dimensionality.
  • features can be extracted from each spectrogram as the output of a Mobile ViT-based deep learning model.
  • a vector containing time series information can be output by using a vector with a reduced dimension as an input to the intermediate layer.
  • the intermediate layer model may include at least one of a linear step to condense the vector information, a layer normalization step to normalize the mean and variance, or a dropout step to disable some nodes, as sketched below.
  • overfitting can be prevented by performing a process of outputting a vector containing time series information by using a vector with a reduced dimension as an input to the intermediate layer.
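A minimal PyTorch sketch of such an intermediate layer, with assumed dimensions:

```python
import torch.nn as nn

intermediate = nn.Sequential(
    nn.Linear(256, 128),   # linear step condensing the vector information
    nn.LayerNorm(128),     # layer normalization of mean and variance
    nn.Dropout(p=0.1),     # disable some nodes to curb overfitting
)
```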
  • sleep state information can be output by using the output vector of the intermediate layer as an input to a ViT-based deep learning model.
  • sleep state information corresponding to information on the frequency domain containing time series information, a spectrogram, or a mel spectrogram can be output.
  • sleep state information corresponding to a series of frequency domain information, spectrogram, or mel spectrogram containing time series information can be output.
  • various deep learning models in addition to the above-mentioned AI models may be employed to perform learning or inference; the specific descriptions related to the types of deep learning models described above are merely examples, and the present invention is not limited thereto.
  • FIGS. 17A and 17B are graphs verifying the performance of the sleep analysis method according to the present invention, comparing polysomnography results (PSG results) with the analysis results of the AI algorithm according to the present invention (AI results).
  • Existing sleep analysis models predict sleep stages using ECG (electrocardiogram) or HRV (heart rate variability) as input, whereas the present invention converts sleep sound information into information including the changes of the frequency components of the time-domain sleep sound information along the time axis, i.e., into a spectrogram or mel spectrogram, and uses it as input to perform sleep stage analysis and inference. Therefore, unlike existing sleep analysis models, because the sleep sound information is converted into frequency domain information, a spectrogram, or a mel spectrogram as input, the sleep stage can be obtained in real time through analysis of singularities in the sleep pattern.
  • the hypnodensity graph shown at the bottom of FIG. 17A shows the probability of belonging to each of the four sleep stage classes.
  • the hypnogram, which is the graph shown in the center of FIG. 17A, can be obtained by determining the sleep stage with the highest probability from the hypnodensity graph.
  • the probability of belonging to one of the four classes can be indicated in 30 second increments.
  • the four classes refer to the awake state, light sleep state, deep sleep state, and REM sleep state, respectively.
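A minimal sketch of deriving the hypnogram from the hypnodensity output: per-30-second class probabilities are reduced to the most probable stage. The class ordering is an assumption for the example.

```python
import numpy as np

STAGES = ["Wake", "Light", "Deep", "REM"]    # assumed class order

def hypnogram_from_hypnodensity(probs: np.ndarray):
    """probs: (n_epochs, 4) class probabilities per 30-s epoch."""
    return [STAGES[i] for i in probs.argmax(axis=1)]
```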
  • the sleep analysis results obtained according to the present invention showed very consistent performance when compared with the labeling data obtained through polysomnography.
  • Figure 18 is a graph verifying the performance of the sleep analysis method according to the present invention, comparing polysomnography (PSG) results with the analysis results of the AI algorithm according to the present invention (AI results) in relation to sleep apnea and hypopnea.
  • the probability graph shown at the bottom of FIG. 18 shows, in 30-second increments, the probability that the received user sleep sound information belongs to each of the two diseases (sleep apnea, hypopnea) when predicting a sleep disease.
  • the graph shown in the middle can be obtained by determining the disease with the highest probability from the probability graph shown at the bottom.
  • the sleep state information obtained according to the present invention showed performance very consistent with polysomnography. In addition, it demonstrated the ability to include more precise analysis information related to apnea and respiratory depression.
  • when a sleep disorder (e.g., sleep hyperpnea, sleep hypopnea) is detected, stimulation (tactile, auditory, olfactory, etc.) may be provided so that the sleep disorder can be temporarily alleviated.
  • a method for generating information for displaying the date on which information about a user's sleep was acquired can be provided.
  • the method of providing information on the date on which information about sleep was acquired according to embodiments of the present invention has the advantage of allowing sleep to be analyzed more easily by enabling the user to intuitively understand the date on which the information about sleep was acquired.
  • a graphical user interface that displays information about the user's sleep can be provided. Additionally, a graphical user interface may be provided that displays information about the date and/or time when information about the user's sleep was acquired.
  • a graphic user interface representing information about the user's sleep may be displayed on the user terminal 10 or an electronic device implemented with various types of displays.
  • FIGS. 4A to 4F are diagrams illustrating a graphical user interface showing information on the date and/or time when information on the user's sleep was acquired based on the time of waking up according to embodiments of the present invention.
  • a method of generating information on the date on which information about sleep was acquired, based on the wake-up time information included in the acquired sleep state information, may be provided.
  • a method of displaying the date on which the sleep state information was acquired based on the wake-up time information included in the acquired sleep state information may also be provided. For example, if the date on which the corresponding sleep state information was acquired is set based on the wake-up time and the user woke up on April 6th, the date can be expressed as April 6th, as illustrated below.
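As an illustration of this dating rule, a sleep session is attributed to the calendar date of its wake-up time:

```python
from datetime import datetime

def record_date(wake_time: datetime) -> str:
    # The session is filed under the date on which the user woke up.
    return wake_time.date().isoformat()

print(record_date(datetime(2023, 4, 6, 7, 30)))   # -> "2023-04-06"
```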
  • a graphical user interface including a figure displaying the date may be provided.
  • multiple sleep sessions may occur on the same day. For example, if waking up occurs N times on the same day due to multiple sleep sessions (N is a non-negative integer), N pieces of information indicating the date on which information about sleep was acquired may be generated for the same date, based on the information about the corresponding wake-up times. In this case, when the information indicating the date is displayed on a graphical user interface, it may be displayed so as to include N shapes.
  • a graphical user interface representing the sleep state information obtained in a specific session among the three sleep sessions measured on that day may be displayed at the bottom. The sleep state information displayed at the bottom may be displayed together with at least one piece of text information indicating the sleep onset time, the wake-up time, the sleep duration, and the information about sleep included in the information about the user's sleep obtained on that day.
  • the shape displaying the date according to embodiments of the present invention may be displayed as at least one of a point, a polygon, a circle, an oval, a sector, or a shape consisting of a combination of straight lines and curves.
  • the shape indicating the date according to an embodiment of the present invention may be expressed as a star shape, but this is only an example; as described above, it can be expressed as one or more of various shapes.
  • the graphical user interface may display together at least one piece of text information indicating the sleep onset time, the wake-up time, the sleep duration, and the information about sleep included in the information about the user's sleep acquired on the relevant date, as shown by reference numbers 102 to 104 in FIGS. 4A to 4C.
  • whether sleep information was obtained on a certain date can be indicated using a shape, as shown by reference numeral 101a, so as to distinguish that date from other dates; alternatively, the date may be distinguished from other dates by changing its color or brightness.
  • for example, when the rate of deep sleep was relatively high, a phrase such as “Long sleep. I slept deeply last night! You will be able to start your day on a good note” may be displayed (reference numbers 102 and 103). Additionally, according to embodiments of the present invention, the actual sleep duration and the time taken to fall asleep may be displayed side by side (reference number 104).
  • the phrases that appear in the above-described sleep evaluation text information are merely examples for explaining embodiments of the present invention, and the present invention is not limited thereto.
  • the graphical user interface according to the embodiments of the present invention discussed above is not limited to being displayed in Korean and can be displayed in various languages.
  • when the date is displayed in English, it may be expressed as shown in FIG. 4F.
  • FIGS. 5A to 5D are diagrams illustrating a graphical user interface showing information on the date and/or time when information on the user's sleep was acquired, based on the sleep onset time, according to embodiments of the present invention.
  • a method of generating information on the date on which information about sleep was acquired, based on the sleep onset time information included in the acquired sleep state information, may be provided.
  • a method of displaying the date on which the sleep state information was acquired based on the sleep onset time information included in the acquired sleep state information may be provided. For example, if the date on which the sleep state information was acquired is set based on the sleep onset time and the user fell asleep on April 5th, the date can be expressed as April 5th.
  • a graphical user interface including a figure displaying the corresponding date may be provided.
  • multiple sleep sessions may occur on the same day. For example, if falling asleep occurs N times on the same day due to multiple sleep sessions (N is a non-negative integer), N pieces of information indicating the date on which information about sleep was acquired may be generated for the same date, based on the information about the corresponding sleep onset times. Additionally, in this case, when the information indicating the date is displayed on a graphical user interface, it may be displayed so as to include N shapes.
  • for example, if sleep-related information was acquired for three sleep sessions on the same day, three shapes representing that the sleep-related information was acquired can be created, and the three shapes may be displayed as indicated by reference number 201d.
  • a graphical user interface representing the sleep state information obtained in a specific session among the three sleep sessions measured on that day may be displayed at the bottom. As shown in FIG. 5E, the sleep state information displayed at the bottom may be displayed together with at least one piece of text information indicating the sleep onset time, the wake-up time, the sleep duration, and the information about sleep included in the information about the user's sleep obtained on that day.
  • the shape displaying the date according to embodiments of the present invention may be displayed as at least one of a point, a polygon, a circle, an oval, a sector, or a combination of straight lines and curves.
  • the shape that displays the date according to an embodiment of the present invention may be expressed as a star shape, but this is just an example; as described above, it can be expressed as one or more of various shapes.
  • the graphical user interface may display together at least one piece of text information indicating the sleep onset time, the wake-up time, the sleep duration, and the information about sleep included in the information about the user's sleep acquired on the relevant date, as shown by reference numbers 202 to 204 in FIGS. 5A to 5C.
  • whether sleep information was obtained on a certain date can be indicated using a shape, as shown by reference numeral 201a, so as to distinguish that date from other dates; alternatively, the date may be distinguished from other dates by changing its color or brightness.
  • for example, when the rate of deep sleep was relatively high, a phrase such as “Long sleep. I slept deeply last night! You will be able to start your day on a good note” may be displayed (reference numbers 202 and 203). Additionally, according to embodiments of the present invention, the actual sleep duration and the time taken to fall asleep may be displayed side by side (reference number 204).
  • the phrases that appear in the above-described sleep evaluation text information are merely examples for explaining embodiments of the present invention, and the present invention is not limited thereto.
  • FIGS. 6A to 6C are diagrams illustrating a graphical user interface showing information on the date and/or time when information on the user's sleep was acquired, based on the sleep onset time and the wake-up time, according to embodiments of the present invention.
  • a method of generating information on the date on which sleep-related information was acquired, based on the sleep onset time information and the wake-up time information included in the acquired sleep state information, may be provided.
  • a method of displaying the period from the sleep onset point to the wake-up point as the date on which the sleep state information was acquired, based on the sleep onset time information and the wake-up time information included in the acquired sleep state information, may be provided.
  • a graphical user interface including a figure displaying the corresponding sleep period may be provided.
  • the figure displaying the date according to embodiments of the present invention may be displayed as a figure having a continuous shape spanning from the date of the sleep onset point to the date of the wake-up point.
  • a shape according to embodiments of the present invention may be displayed as at least one of a point, a polygon, a circle, an oval, a sector, or a combination of straight lines and curves.
  • multiple sleep sessions may occur on the same day. For example, if several sleep sessions occurred on the same day and, accordingly, sleep onset and waking up each occurred N times on the same day (N is a non-negative integer), N pieces of information indicating the date on which information about sleep was acquired may be generated for the same date, based on the information about the corresponding sleep onset and wake-up times. Additionally, in this case, when the information indicating the date is displayed on a graphical user interface, it may be displayed so as to include N shapes.
  • the N shapes can be created as figures with a continuous shape connecting two different points, and each of the N shapes according to embodiments of the present invention may be displayed as at least one of a point, a polygon, a circle, an oval, a sector, or a shape made up of a combination of straight lines and curves.
  • for example, if sleep onset and wake-up each occurred three times on July 1 across three sleep sessions, three figures may be created indicating that information about sleep was acquired on July 1.
  • three shapes may be displayed, as indicated by reference number 301d.
  • a graphical user interface representing the sleep state information obtained in a specific session among the three sleep sessions measured on that day may be displayed at the bottom.
  • the sleep state information displayed at the bottom may include at least one piece of text information indicating the sleep onset time, wake-up time, sleep duration, and information about the sleep included in the information about the user's sleep obtained on that day.
  • a graphical user interface may be created as shown by reference numerals 305 and 301e of FIG. 6E.
  • Reference number 305 is a code indicating an area where the date is displayed.
  • Reference numeral 301e is a figure, displayed in the area designated by reference numeral 305, indicating the date on which information about sleep was acquired.
  • within the area designated by reference numeral 305, a graphical user interface may be created in which a figure such as that indicated by reference numeral 301e displays the portions corresponding to the sleep onset time point and the wake-up time point.
  • date information may be displayed as a figure spanning the 5th to the 6th in the area designated by reference numeral 305.
  • the starting point of the shape can be displayed in the area allocated to the 5th close to the border between the 5th and the 6th.
  • the end point of the figure may be displayed in the 7th sub-area of the area allocated to the 6th, where that area is divided into 24 equal parts along the horizontal axis, each part corresponding to one hour of the day.
  • a graphical user interface indicating the sleep state information acquired on the relevant date may be displayed at the bottom; as shown in FIG. 6E, the sleep state information displayed at the bottom may include at least one piece of text information indicating the sleep onset time, wake-up time, sleep duration, and information about the sleep included in the information about the user's sleep obtained on that date.
  • by changing the color or brightness of the figure indicated by reference numeral 301e, the interface can indicate which of the sleep sessions measured on those dates the information displayed at the bottom is based on.
  • the graphical user interface may display, together with the date, at least one piece of text information indicating the sleep onset time, wake-up time, sleep duration, and information about the sleep included in the information about the user's sleep acquired on the relevant date, as shown by reference numerals 302 to 304 in FIGS. 6A to 6C.
  • whether sleep information was obtained on a given date can be indicated using a figure, as shown by reference numeral 301a, to distinguish that date from other dates; alternatively, the figure may be distinguished from other dates by changing its color or brightness.
  • when the proportion of REM sleep is relatively high, the sleep may be labeled “stress-relieving sleep,” and a phrase such as “The REM sleep rate was high! Sleep not only relieves stress but also helps you be creative” may be displayed (reference numerals 302 and 303).
  • the actual sleep time and the time taken to fall asleep may be displayed side by side (reference numeral 304).
  • the phrases that appear in the above-described sleep evaluation text information are merely examples for explaining embodiments of the present invention, and the present invention is not limited thereto.
  • FIGS. 7A to 7E are diagrams illustrating a graphical user interface showing information on the date and/or time at which information about the user's sleep was acquired based on the definition of the time zone to which the wake-up time belongs, according to embodiments of the present invention.
  • a method of generating date information on which information about sleep was acquired based on wake-up time information included in the acquired sleep state information may be provided.
  • a method of displaying the date of the wake-up time as the date on which the sleep state information was acquired, based on the wake-up time information included in the acquired sleep state information, may be provided.
  • information on the date on which information about sleep was acquired can also be generated based on the definition of the time zone to which the wake-up time belongs.
  • the definition of the time zones may be preset; for example, from 12 AM (midnight) to before 5 AM may be set as “dawn,” from 5 AM to before 9 AM as “morning,” from 9 AM to before 5 PM as “day,” from 5 PM to before 9 PM as “evening,” and from 9 PM to before 12 AM (midnight) as “night.”
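  • the preset definition quoted above can be transcribed directly; the following minimal Python sketch (with an illustrative function name) implements exactly those boundaries.

```python
# Transcription of the preset time-zone definition described above.
from datetime import time

def classify_time_zone(t: time) -> str:
    if t < time(5):   return "dawn"     # 12 AM (midnight) to before 5 AM
    if t < time(9):   return "morning"  # 5 AM to before 9 AM
    if t < time(17):  return "day"      # 9 AM to before 5 PM
    if t < time(21):  return "evening"  # 5 PM to before 9 PM
    return "night"                      # 9 PM to before 12 AM (midnight)

print(classify_time_zone(time(8, 36)))  # -> "morning"
```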
  • for sleep state information indicating that the user went to bed on April 8 and woke up on the same day, April 8, as shown by reference numeral 401d in FIG. 7C, April 8, the date of the wake-up time, may be expressed as the date on which the corresponding sleep state information was acquired.
  • a graphical user interface including a figure displaying the date may be provided.
  • the figures displaying the dates on which sleep-related information was acquired may all be expressed in the same color (e.g., white); alternatively, as shown in FIG. 7D, the colors of the figures may be displayed differently depending on the time zone in which the information about the user's sleep was obtained.
  • the figure indicating the date on which the sleep information was acquired may be displayed in dark blue.
  • the time zone to which the sleep time included in the information about the user's sleep belongs is mainly in the “day” time zone, the figure indicating the date on which the information about the sleep was acquired may be displayed in yellow.
  • the shape displaying the date according to embodiments of the present invention may be displayed as at least one of a point, a polygon, a circle, an oval, a sector, or a shape consisting of a combination of straight lines and curves.
  • the type of shape that displays the date information on sleep may be represented by the same type (e.g., dot), but as shown in FIG. 7C, the type of the shape may be displayed differently depending on the time zone in which information about the user's sleep was obtained.
  • as shown by reference numeral 401c in FIG. 7C, the figure displaying the date on which the information about sleep was acquired may also be displayed as a star shape.
  • as shown by reference numeral 401d, the figure indicating the date on which the information about sleep was acquired may also be displayed as a square shape.
  • multiple sleep sessions may occur on the same day. For example, if several sleep sessions occurred on the same day and, accordingly, wake-up occurred N times on that day (where N is a non-negative integer), N pieces of information indicating the date on which information about sleep was acquired may be generated for the same date based on the corresponding wake-up time information. Additionally, in this case, when the information indicating the date is displayed on a graphical user interface, it may be displayed to include N figures.
  • a graphical user interface representing the sleep state information obtained in a specific session among the three sleep sessions measured on that day may be displayed at the bottom.
  • as shown in FIG. 7F, the sleep state information displayed at the bottom may include at least one piece of text information indicating the sleep onset time, wake-up time, sleep duration, and information about the sleep included in the information about the user's sleep obtained on that day.
  • the type of figure that displays the date information on sleep may be represented by the same type (e.g., a dot), but, as shown by reference numeral 401g in FIG. 7G, the type of figure may be displayed differently depending on the time zone in which the information about the user's sleep was obtained.
  • the graphical user interface may display, together with the date, at least one piece of text information indicating the sleep onset time, wake-up time, sleep duration, and information about the sleep included in the information about the user's sleep acquired on the relevant date, as shown by reference numerals 402 to 404 in FIGS. 7A to 7E.
  • whether sleep information was obtained on a given date can be indicated using a figure, as shown by reference numeral 401a, to distinguish that date from other dates; alternatively, the figure may be distinguished from other dates by changing its color or brightness.
  • a graphical user interface may be provided that further includes text displaying information on the time zone in which information about sleep was obtained.
  • for example, if the wake-up time belongs to the “morning” time zone, the text “Morning sleep report” may be displayed.
  • if the time zone of the sleep included in the information about the user's sleep mainly falls in the “day” time zone, the text “Nap report” may be displayed; if it falls in the “evening” time zone, “Evening sleep report”; if it falls in the “night” time zone, “Night sleep report”; and if it falls in the “dawn” time zone, “Dawn sleep report.”
  • the text displaying the time zone in which the information about the user's sleep was obtained is not limited to the examples above; it may be displayed to include the keyword “dawn” if the time zone to which the user's wake-up time belongs is the dawn time zone, the keyword “morning” for the morning time zone, the keyword “day” for the daytime, the keyword “evening” for the evening, and the keyword “night” for the night.
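  • as a sketch only, the report-title keywords above can be held in a simple lookup table; the dictionary contents follow the examples in the text, while the fallback title is an assumption.

```python
# Illustrative mapping from the wake-up time zone to a report title.
REPORT_TITLES = {
    "dawn": "Dawn sleep report",
    "morning": "Morning sleep report",
    "day": "Nap report",
    "evening": "Evening sleep report",
    "night": "Night sleep report",
}

def report_title(time_zone: str) -> str:
    # fallback title is an assumption for unknown inputs
    return REPORT_TITLES.get(time_zone, "Sleep report")

print(report_title("morning"))  # -> "Morning sleep report"
```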
  • Figure 10 is a flowchart of a method for creating and providing one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention.
  • a method of generating one or more graphical user interfaces representing information about sleep may include: a step of acquiring sleep information from the user (S100); a step of converting the frequency components included in the acquired sleep information into information including changes along the time axis (S120); a step of acquiring sleep state information (S140); a step of generating a graphical user interface (S160); and a step of providing the graphical user interface (S180).
  • the sleep information acquired in the step of acquiring sleep information may include environmental sensing information or sleep sound information.
  • a method of generating one or more graphical user interfaces representing information about sleep may further include a sleep log storage step (not shown) of storing, in memory, sleep log information associated with an account assigned to the user.
  • the step (S120) of converting the frequency components included in the acquired sleep information into information including changes along the time axis may include performing preprocessing that converts the acquired sleep information into raw acoustic information in the time domain or into information in the frequency domain.
  • the step of converting the acquired sleep information into information on the frequency domain (S120) may include converting acoustic information into spectrogram information.
  • a step of applying the Mel scale to the spectrogram to convert it into a Mel spectrogram may further be included.
  • the step (S140) of acquiring sleep state information may include extracting sleep state information corresponding to each piece of frequency-domain information, spectrogram, or Mel spectrogram divided into 30-second units.
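  • a minimal sketch of this preprocessing pipeline (sound → Mel spectrogram → 30-second units) is shown below, assuming librosa as the signal-processing library and 64 Mel bands; the disclosure itself does not prescribe a specific implementation.

```python
# Sketch of S120/S140 preprocessing: audio -> log-Mel spectrogram -> 30 s epochs.
import numpy as np
import librosa

def to_mel_epochs(audio: np.ndarray, sr: int, epoch_sec: int = 30):
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=64)  # assumed n_mels
    log_mel = librosa.power_to_db(mel)
    frames_per_epoch = int(epoch_sec * sr / 512)  # 512 = librosa's default hop length
    n_epochs = log_mel.shape[1] // frames_per_epoch
    # one (n_mels, frames_per_epoch) slice per 30-second unit
    return [log_mel[:, i * frames_per_epoch:(i + 1) * frames_per_epoch]
            for i in range(n_epochs)]

epochs = to_mel_epochs(np.random.randn(16000 * 90).astype(np.float32), sr=16000)
print(len(epochs))  # 3 epochs for 90 seconds of audio
```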
  • the step (S160) of generating a graphical user interface may include generating information on the date on which the information about the user's sleep was acquired, based on the sleep onset time and/or wake-up time included in the sleep information.
  • the step of generating information on the date on which information about sleep was acquired may further include generating information on the time zone in which the information about sleep was acquired, based on the time zone to which the wake-up time belongs.
  • time zone information may be determined according to a preset definition.
  • Figure 19a is a diagram showing a graphical user interface showing information about the sleep of a user with the alarm function activated according to the present invention.
  • Figure 19b is a diagram illustrating a graphical user interface showing information about the sleep of a user in which the alarm function according to the present invention is not activated.
  • a user may trigger an alarm function, and an alarm function trigger activation state user interface 19100 and an alarm function trigger deactivation state user interface 19200 may be provided to the user.
  • the user may trigger the alarm function; if the user triggers it in a first direction, the alarm function trigger activation state user interface 19100 may be provided to the user, and if the user triggers it in a second direction, the alarm function trigger deactivation state user interface 19200 may be provided to the user, but the present invention is not limited thereto.
  • the graphical user interface creation step may include a step of creating a graphical user interface that activates the alarm function and starts sleep measurement when the trigger area for activating the alarm function is activated, and a step of creating a graphical user interface that starts sleep measurement with the alarm function deactivated when the trigger area is not activated.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, and the alarm window user interface 19101 may include a user interface 19110 for inputting the desired wake-up time in the alarm function trigger activated state.
  • the desired wake-up time input notification phrase user interface 19110 in the alarm function trigger activated state may be in the form of providing the phrase “When will I wake up?”
  • it may also be in the form of providing a phrase such as “What time do you want to wake up?” or “When should I wake you up?”, but these are only a few of many examples and the present invention is not limited thereto.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, and it may include a scheduled wake-up range user interface 19120 including the desired wake-up time information in the alarm function trigger activated state, or a slide area user interface 19130 provided to allow input of the desired wake-up time information in the alarm function trigger activated state.
  • the slide area user interface 19130, provided to allow input of the desired wake-up time information in the alarm function trigger activated state, may include a slide area bar user interface 19131 provided for the same purpose; by sliding the slide area bar, the user can select the desired wake-up time.
  • the step of creating a graphical user interface including a slide area for selecting the desired wake-up time information may include creating a graphical user interface in which the desired wake-up time becomes later when the slide area is slid in a first direction, and earlier when the slide area is slid in a second direction.
  • for example, sliding in one direction may set the desired wake-up time to an earlier time and sliding in the other direction may set it to a later time, but the present invention is not limited thereto.
  • the slide area bar user interface 19131, provided to allow the user to input the desired wake-up time information with the alarm function trigger activated, may be provided to indicate 8:36 AM.
  • the scheduled wake-up range user interface 19120, including the desired wake-up time information in the alarm function trigger activated state, may indicate a predetermined range ending at 8:36 AM; for example, if the range is set to 30 minutes, it may represent the range from 8:06 AM to 8:36 AM.
  • the scheduled wake-up range user interface 19120 including the desired wake-up time information in the alarm function trigger activated state may indicate a section within which the user's predicted wake-up time falls.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, and it may include a user interface 19140 that provides information on the expected sleep time in the alarm function trigger activated state.
  • the step of creating a graphical user interface may include a step of creating a graphical user interface that, upon receiving the desired wake-up time information through the slide area, indicates the expected sleep time based on the difference between the received desired wake-up time and the current time. For example, as shown in FIG. 19A, if the desired wake-up time is 8:36 AM and the current time is 12:39 AM, the difference is 7 hours and 57 minutes, and the user interface 19140 providing information on the expected sleep time in the alarm function trigger activated state may be provided accordingly.
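  • a minimal sketch of this difference calculation follows; rolling the wake-up time to the next day when it is not later than the current clock time is an assumption that matches the example above.

```python
# Sketch: expected sleep time = desired wake-up time - current time.
from datetime import datetime, timedelta

def expected_sleep_time(now: datetime, wake: datetime) -> timedelta:
    if wake <= now:                 # assumed: wake-up time falls on the next day
        wake += timedelta(days=1)
    return wake - now

now = datetime(2023, 10, 31, 0, 39)   # 12:39 AM
wake = datetime(2023, 10, 31, 8, 36)  # 8:36 AM
print(expected_sleep_time(now, wake)) # 7:57:00
```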
  • the step of creating a graphical user interface may include receiving the desired wake-up time information through the slide area user interface 19130 provided to allow the user to input the desired wake-up time information in the alarm function trigger activated state, and generating a graphical user interface representing the expected sleep time based on predicted wake-up time information, a predicted sleep time, or a combination of the predicted sleep time and sleep efficiency inferred from the received sleep state information of the user.
  • sleep efficiency may be calculated as (user's actual sleep time / time from the start of sleep measurement until the user wakes up) × 100; even more specifically, as {(sum of the times of the NREM and REM stages during the user's sleep, excluding the time of the WAKE stage) / (time from the start of sleep measurement until the user wakes up)} × 100.
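  • transcribing that formula directly (argument names are illustrative) gives the following sketch; the example values reuse the stage durations quoted later in connection with FIG. 25B.

```python
# Direct transcription of the sleep-efficiency formula described above.
def sleep_efficiency(nrem_min: float, rem_min: float, measured_min: float) -> float:
    """(NREM + REM time, i.e. total with WAKE excluded) / measured time * 100."""
    return (nrem_min + rem_min) / measured_min * 100

# light 274 min + deep 58 min (NREM) and REM 97 min over 442 measured minutes
print(round(sleep_efficiency(nrem_min=274 + 58, rem_min=97, measured_min=442), 1))  # 97.1
```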
  • the expected sleep time provided in the slide area user interface 19130, which is provided so that the user can input the desired wake-up time information in the alarm function trigger activated state, may be the time from the current time until the time at which REM sleep is predicted to occur, or until a predetermined time has elapsed from that time, based on the user's sleep state information within the scheduled wake-up range including the desired wake-up time information.
  • alternatively, the expected sleep time may be a time calculated by multiplying that time by the sleep efficiency at the time at which REM sleep is predicted to occur, or at a predetermined time after it, based on the user's sleep state information within the scheduled wake-up range including the desired wake-up time information in the alarm function trigger activated state.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, and at the same time a user interface 19150 for initiating sleep measurement in the alarm function trigger activated state may be provided.
  • the user interface 19150 that initiates sleep measurement in the alarm function trigger activated state may be in the form of providing the phrase “Go to sleep.” Additionally, it may be in the form of providing a phrase such as “Start measuring sleep” or “Shall we go to sleep?”, but these are only examples and the present invention is not limited thereto.
  • the user can trigger an alarm function, and an alarm function trigger activation state user interface 19100 or an alarm function trigger deactivation state user interface 19200 may be provided to the user.
  • the user may trigger the alarm function; if the user triggers it in a first direction, the alarm function trigger activation state user interface 19100 may be provided to the user, and if the user triggers it in a second direction, the alarm function trigger deactivation state user interface 19200 may be provided to the user.
  • when the alarm function trigger deactivation state user interface 19200 is provided, the alarm function is not triggered, and a user interface 19201 providing information corresponding to sleep measurement in a state where the scheduled wake-up time is not set may be provided.
  • the information providing user interface 19201 suitable for sleep measurement in a state where the scheduled wake-up time is not set may be a phrase providing the information that sleep measurement results can be provided only after sleeping for a certain amount of time; for example, it could be a phrase like “If you sleep for more than 30 minutes, you can get a report.”
  • Another example could be a phrase like “Why not try using an AI alarm for your next sleep?” This is just an example and is not limited to this.
  • a user interface 19250 that initiates sleep measurement in the alarm function trigger deactivation state may be provided.
  • the user interface 19250 for initiating sleep measurement with the alarm function trigger deactivated may be provided in the form of phrases such as “go to bed without an alarm,” “turn off the alarm and sleep,” or “turn off the alarm and start sleep measurement,” but these are only examples and the present invention is not limited thereto.
  • Figure 20a is a diagram showing a graphical user interface indicating that sleep is being measured while the alarm function according to the present invention is activated.
  • Figure 20b is a diagram showing a graphical user interface indicating that sleep is being measured while the alarm function according to the present invention is deactivated.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, and at the same time a user interface 19150 may be provided to initiate sleep measurement in the alarm function trigger activated state.
  • a user interface 20310 indicating that the alarm function trigger is activated may be provided during sleep measurement in the alarm function trigger activated state of FIG. 20A.
  • the user interface 20310 indicating that the alarm function trigger is activated may be provided in the form of a phrase indicating that the alarm function is activated, such as “AI will definitely wake you up!”
  • it may also be provided in the form of phrases such as “AI alarm function is activated!”, “AI will help you wake up refreshed!”, or “I will wake you up, so sleep well!”, but these are only examples and the present invention is not limited thereto.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, together with a user interface 19150 for initiating sleep measurement, and at the same time a user interface 20320 may be provided indicating the scheduled wake-up range including the desired wake-up time information during sleep measurement in the alarm function trigger activated state.
  • the user interface 20320, which represents the scheduled wake-up range including the desired wake-up time information during sleep measurement in the alarm function trigger activated state, may provide information about the scheduled alarm over the same range as the scheduled wake-up range user interface 19120 including the desired wake-up time information in the alarm function trigger activated state.
  • for example, if the scheduled wake-up range user interface 19120, which includes the desired wake-up time information in the alarm function trigger activated state, provides the range “between 8:06 AM and 8:36 AM,” the user interface 20320 indicating the scheduled wake-up range during sleep measurement may provide the phrase “alarm between 8:06 AM and 8:36 AM.”
  • likewise, if the scheduled wake-up range user interface 19120 provides the range “between 8:06 AM and 8:36 AM,” the user interface 20320 may provide the phrase “The alarm will sound between 8:06 AM and 8:36 AM!”
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, together with a user interface 19150 for initiating sleep measurement and a user interface 20320 indicating the scheduled wake-up range including the desired wake-up time information during sleep measurement, and at the same time a user interface 20340a may be provided indicating the first wave of the user's sleep sound information during sleep measurement in the alarm function trigger activated state.
  • the step of creating a graphical user interface including a screen indicating that sleep is being measured may include a step of creating a graphical user interface indicating whether the trigger area for activating the alarm function is activated, and may further include a step of creating a graphical user interface representing the range of the scheduled wake-up time and a step of creating a graphical user interface representing waves of the user's sleep sound information.
  • the user interface 20340a, which represents the first wave of the user's sleep sound information during sleep measurement in the alarm function trigger activated state, may change in amplitude in real time depending on the loudness of the user's sleep sound. As a specific example, the user interface 20340a representing the first wave may move in real time in a random direction while its amplitude changes with the loudness of the user's sleep sound.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, together with a user interface 19150 for initiating sleep measurement, a user interface 20320 indicating the scheduled wake-up range including the desired wake-up time information during sleep measurement, and a user interface 20340a representing the first wave of the user's sleep sound information; in addition, a user interface 20330 for stopping sleep measurement during sleep measurement in the alarm function trigger activated state may be provided.
  • part (a) of FIG. 20A is a diagram showing that a user interface 20340a is provided indicating the first wave of the user's sleep sound information during sleep measurement with the alarm function trigger activated, and part (b) of FIG. 20A is a diagram showing the user interface 20340b indicating the second wave of the user's sleep sound information during sleep measurement with the alarm function trigger activated.
  • the user interface 20340b representing the second wave may be provided differently depending on the user's terminal model; the user interface 20340b, which represents the second wave of the user's sleep sound information during sleep measurement with the alarm function trigger activated, is characterized in that the amplitudes of the plurality of waves all change equally in size with the loudness of the user's sleep sound, and it may be provided differently depending on the type of user terminal, specifically the smartphone model.
  • the step of generating a graphical user interface including a screen indicating that sleep measurement is in progress may further include the step of creating a graphical user interface capable of ending sleep measurement.
  • the user interface 20330 for stopping sleep measurement during sleep measurement with the alarm function trigger activated may be provided in the form of the phrase “wake up”; it may also be provided in the form of phrases such as “Go to wake up” or “End sleep measurement,” but these are only examples and the present invention is not limited thereto.
  • a user interface 19250 may be provided that initiates sleep measurement in the alarm function trigger deactivated state.
  • a user interface 20610 may be provided representing a phrase and/or an emoticon indicating that sleep measurement is in progress while the alarm function trigger is not activated.
  • the user interface 20610 representing a phrase and/or emoticon indicating that sleep is being measured may be provided in the form of a sleep-inducing phrase, together with an emoticon, such as “Close your eyes, take a good rest.”
  • the user interface 20610 may also be provided in the form of the phrase “I will sleep comfortably,” or a phrase such as “Tonight is a comfortable night,” but these are only examples and the present invention is not limited thereto.
  • a user interface 19250 may be provided to initiate sleep measurement in the alarm function trigger deactivated state.
  • a user interface 20610 representing a phrase and/or emoticon indicating that sleep measurement is in progress while the alarm function trigger is not activated may be provided, and at the same time a user interface 20640a may be provided showing the first wave of the user's sleep sound information during sleep measurement while the alarm function trigger is not activated.
  • the user interface 20640a, which represents the first wave of the user's sleep sound information during sleep measurement when the alarm function trigger is not activated, may change in amplitude in real time depending on the loudness of the user's sleep sound. As a specific example, it may move in real time in a random direction while its amplitude changes with the loudness of the user's sleep sound.
  • a user interface 19250 may be provided to initiate sleep measurement in the alarm function trigger deactivated state; at the same time, the user interface 20610 indicating that sleep measurement is in progress and the user interface 20640a representing the first wave of the user's sleep sound information may be provided, together with an information providing user interface 20620 suitable for sleep measurement in a state where the scheduled wake-up time is not set.
  • the step of creating a graphical user interface may include a step of creating a graphical user interface that provides information corresponding to sleep with an undetermined wake-up time when the trigger area for activating the alarm function is not activated.
  • the user interface 20620, which provides information suitable for sleep measurement in a state where the scheduled wake-up time is not set, may be provided in the form of phrases providing information for receiving sleep measurement results, such as “You can receive a report if you sleep for more than 30 minutes.”
  • it may also provide information useful when the scheduled wake-up time is not set, such as “The best sleep time is 7 hours!”, or be provided in the form of phrases such as “Tomorrow is the weekend! I can sleep more than on weekdays!”, but these are only examples and the present invention is not limited thereto.
  • part (a) of FIG. 20B is a diagram showing that a user interface 20640a is provided indicating the first wave of the user's sleep sound information during sleep measurement in a state in which the alarm function trigger is not activated, and part (b) of FIG. 20B is a diagram illustrating the user interface 20640b showing the second wave of the user's sleep sound information during sleep measurement in a state in which the alarm function trigger is not activated.
  • the user interface 20640b representing the second wave may be provided differently depending on the user's terminal model; the user interface 20640b, which represents the second wave of the user's sleep sound information during sleep measurement when the alarm function trigger is not activated, is characterized in that the amplitudes of the plurality of waves all change equally in size with the loudness of the user's sleep sound, and it may be provided differently depending on the model of the user terminal, specifically the smartphone model.
  • a user interface 19250 may be provided to initiate sleep measurement in the alarm function trigger deactivated state; at the same time, the user interface 20610 indicating that sleep measurement is in progress, the user interface 20640a representing the first wave of the user's sleep sound information, and the information providing user interface 20620 suitable for sleep measurement in a state where the scheduled wake-up time is not set may be provided, and a user interface 20630 for stopping sleep measurement during sleep measurement in a state in which the alarm function trigger is not activated may be provided.
  • the user interface 20630 for stopping sleep measurement during sleep measurement in a state in which the alarm function trigger is not activated may be provided in the form of the phrase “wake up”; it may also be provided in the form of phrases such as “Go to wake up” or “End sleep measurement,” but these are only examples and the present invention is not limited thereto.
  • Figure 21 is a diagram illustrating a graphical user interface that provides information to confirm the user's expected sleep time according to the present invention when the user's expected sleep time is more than a predetermined time.
  • the step of creating a graphical user interface may include a step of creating a graphical user interface that displays a pop-up window to confirm the expected sleep time when the expected sleep time exceeds a predetermined time.
  • for example, if the expected sleep time exceeds 12 hours, a pop-up window user interface 21700 for confirming the expected sleep time may be provided. However, the condition is not limited to exceeding 12 hours; the pop-up window user interface 21700 for confirming the expected sleep time may be provided whenever the expected sleep time exceeds the predetermined time.
  • the pop-up window user interface 21700 for checking the expected sleep time may include a user interface 21710 that describes a pop-up window for checking the expected sleep time when the expected sleep time is more than a predetermined time.
  • the user interface 21710 explaining a pop-up window for checking the expected sleep time may be in the form of providing the phrase “Confirm sleep time.”
  • the user interface 21710, which describes the pop-up window for checking the expected sleep time, may be in the form of providing phrases such as “The estimated sleep time is a little long!” that lead the user to check the desired wake-up time once again.
  • when the expected sleep time is more than the predetermined time, the pop-up window user interface 21700 for checking the expected sleep time may also include a user interface 21720 that displays a question about the desired wake-up time in the pop-up window.
  • the user interface 21720, which displays a phrase asking about the desired wake-up time in the pop-up window for checking the expected sleep time, may be provided in the form of the phrase “The estimated sleep time is 00:00. Is the time set correct?” It may also be provided in the form of the phrase “The estimated sleep time is a little long! Is the desired alarm time 00:00 AM?” or a phrase such as “The estimated sleep time is over 00 hours. I think today is a day when I want to sleep a lot!”, but these are only examples and the present invention is not limited thereto.
  • when the expected sleep time is more than the predetermined time, the pop-up window user interface 21700 for checking the expected sleep time may further include a user interface 21730 for resetting the desired wake-up time and a user interface 21740 for maintaining the preset desired wake-up time.
  • the user interface 21730 for resetting the desired wake-up time in the pop-up window for checking the expected sleep time may be provided in the form of the phrase “reset”; it may also be provided in the form of phrases such as “go to reset” or “Reset alarm settings,” but these are only examples.
  • the user interface 21740 for maintaining the desired wake-up time set in the pop-up window for confirming the expected sleep time may be provided in the form of the phrase “Yes, that's right”; it may also be provided in the form of phrases such as “The time is right!” or “Yes, I want to go to bed,” but the present invention is not limited thereto.
  • an alarm window user interface 19101 in the alarm function trigger activated state may be provided, and it may include a user interface 22140 that provides information on the expected sleep time in the alarm function trigger activated state; when the expected sleep time is less than a predetermined time, an information providing user interface 22202 consistent with the sleep measurement may be provided.
  • the information providing user interface 22202 consistent with the sleep measurement may be provided in the form of a phrase such as “Sleep 27 more minutes and you can get a report.”
  • Figure 23 is a diagram for explaining a method of providing an alarm based on the user's sleep state information according to the present invention.
  • the method of providing an alarm based on the user's sleep state information may include: a sleep information receiving step (S2310) of receiving the user's sleep information, including the user's sleep sound information, from one or more sleep information sensor devices; a sleep state information acquisition step (S2320) of acquiring the user's sleep state information, including the user's average sleep time information, based on the received sleep information; a desired wake-up time information receiving step (S2330) of receiving the desired wake-up time information; an alarm time information generating step (S2340) of generating alarm time information based on the acquired sleep state information and the received desired wake-up time information; and an alarm sound providing step (S2350) of providing an alarm sound based on the generated alarm time information.
  • the alarm time information generation step (S2340) can generate alarm timing information based on sleep state information.
  • the time range including the desired wake-up time information spans from a predetermined time before the desired wake-up time to the desired wake-up time; if sleep state information indicating that the user is in REM sleep is not obtained within that range, the alarm time information may be generated at the desired wake-up time.
  • for example, if the desired wake-up time is 8:36 AM and the predetermined time is 30 minutes, the range spans from 8:06 AM to 8:36 AM; if the user's REM sleep is not detected between 8:06 AM and 8:36 AM, the alarm time information may be generated at 8:36 AM.
  • likewise, the time range including the desired wake-up time information spans from a predetermined time before the desired wake-up time to the desired wake-up time; if sleep state information indicating that the user is in REM sleep is obtained within that range, the alarm time information may be generated at the time that sleep state information was acquired.
  • for example, if the desired wake-up time is 8:36 AM, the range spans from 8:06 AM to 8:36 AM, and REM sleep is detected within that range, the alarm time information may be generated at the time of detection or after a predetermined time has elapsed from it.
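  • the two rules above can be sketched as a single function: wake at the first REM detection inside the window [desired wake-up time − predetermined time, desired wake-up time], otherwise at the desired time; the optional delay parameter and data layout are illustrative assumptions.

```python
# Sketch of the S2340 alarm-timing rule described above.
from datetime import datetime, timedelta

def alarm_time(desired_wake: datetime,
               rem_detections: list,
               window: timedelta = timedelta(minutes=30),
               delay: timedelta = timedelta(0)) -> datetime:
    window_start = desired_wake - window
    for t in sorted(rem_detections):
        if window_start <= t <= desired_wake:
            return t + delay       # optionally a predetermined time later
    return desired_wake            # no REM detected inside the window

desired = datetime(2023, 10, 31, 8, 36)
print(alarm_time(desired, [datetime(2023, 10, 31, 8, 12)]))  # 08:12
print(alarm_time(desired, []))                               # 08:36
```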
  • Figure 24a is a diagram for explaining, using a hypnogram, the case where the AI alarm of the present invention is not set.
  • Figure 24b is a diagram for explaining, using a hypnogram, the case where the AI alarm of the present invention is set.
  • if the user's REM sleep continues from 30 minutes before the time in the user's desired wake-up time information and a time point appears at which the user's sleep stage becomes a stage other than REM sleep, that time point may be set as the alarm time information. For example, if the time in the user's desired wake-up time information is 8:30 AM, REM sleep was detected at 8:10 AM within the window from 8:00 AM to 8:30 AM (30 minutes before the desired time), and the user's sleep state information at 8:20 AM indicates normal (light) sleep, then 8:20 AM may be set as the alarm time information.
  • the alarm time information can be set to the point in time when the reliability of the sleep state information is lowered.
  • Figure 25a is a diagram showing a graphical user interface including a hypnogram according to an embodiment of the present invention.
  • the phrase “sleep stage” may be displayed together, as indicated by reference number 25101.
  • the hypnogram is expressed with time (reference numeral 25114) on the x-axis and sleep stage information on the y-axis, and may include shapes corresponding to each piece of sleep stage information.
  • sleep stage information has a total of four stages and may include a plurality of areas allocated to each sleep stage information. For example, an area allocated to the waking stage (reference number 25102), an area allocated to the REM sleep stage (reference number 25103), an area allocated to the light sleep (or 'normal sleep') stage (reference number 25104), and an area allocated to the deep sleep stage. It can be expressed divided into allocated areas (reference number 25105).
  • the colors of shapes corresponding to each sleep stage information may be expressed differently.
  • the color of the shape corresponding to the deep sleep stage may be a relatively darker color than other shapes (reference number 25113).
  • the color of the shape corresponding to the REM sleep stage may be relatively brighter than other shapes (reference number 25111).
  • the colors of the figures and background corresponding to each piece of sleep stage information may be colors expressed at different positions in the color space. For example, the background may be expressed in black, the figures corresponding to the deep sleep stage in a relatively dark blue, and the figures corresponding to the light sleep (normal sleep) stage in a less dark blue; the figure corresponding to the REM sleep stage may be expressed in a light blue (reference numeral 25112) or in a purple color, and the figure corresponding to the waking stage may be expressed in a yellow color.
  • shapes corresponding to sleep stage information may be shapes expressed discretely.
  • when shapes are expressed discretely according to an embodiment of the present invention, unlike the prior art, there may be cases where shapes corresponding to different sleep stages share at most a single intersection point with each other.
  • the fact that the shapes are expressed discretely may mean that the interface includes a plurality of shapes corresponding to a plurality of sleep stages, and that at least one of the plurality of shapes is separated (isolated) from the remaining shapes.
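  • a minimal sketch of such a 'discrete' hypnogram, drawing one isolated rectangle per contiguous run of a stage in that stage's own lane with no connecting lines, is shown below; matplotlib and the sample stage sequence are assumptions, since the disclosure names no plotting library.

```python
# Sketch: discrete hypnogram with one isolated rectangle per stage run.
import matplotlib.pyplot as plt

EPOCH_MIN = 0.5  # 30-second epochs
LANES = {"wake": 3, "rem": 2, "light": 1, "deep": 0}  # top-to-bottom lanes
stages = ["wake"] * 4 + ["light"] * 40 + ["deep"] * 30 + ["rem"] * 20 + ["wake"] * 2

fig, ax = plt.subplots(figsize=(8, 2.5))
start = 0
for i in range(1, len(stages) + 1):
    if i == len(stages) or stages[i] != stages[start]:
        lane = LANES[stages[start]]
        # height < lane spacing leaves a visible gap, so rectangles stay isolated
        ax.broken_barh([(start * EPOCH_MIN, (i - start) * EPOCH_MIN)],
                       (lane + 0.1, 0.8))
        start = i
ax.set_yticks([v + 0.5 for v in LANES.values()])
ax.set_yticklabels(list(LANES))
ax.set_xlabel("minutes since sleep onset")
plt.show()
```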
  • a graphical user interface that displays information about the user's sleep can be provided.
  • Figures 30A to 30C are diagrams illustrating a graphical user interface displayed on various display units according to embodiments of the present invention.
  • a graphic user interface representing information about the user's sleep may be displayed on an electronic device or user terminal implemented with various types of displays.
  • in the prior art, there were cases where the figures were displayed connected to each other by lines, such as dotted or solid lines, between the figures.
  • the discrete expression has the effect of reducing the misunderstanding that, in the process of transitioning from a first sleep stage to a second sleep stage, one must pass through other sleep stages in between.
  • for example, the light sleep (normal sleep) stage may switch to the waking stage and then switch back to the light sleep (normal sleep) stage; if this transition or change of sleep stages is expressed as a 'continuous' hypnogram graph, it may lead to the misunderstanding that the REM sleep stage must be passed through when the sleep stage switches, whereas expressing it as a 'discrete' hypnogram graph according to the present invention can reduce such misunderstandings.
  • when the shape corresponding to each piece of sleep stage information is expressed discretely in the hypnogram, the frequency of each sleep stage is clearly expressed, so the frequency of sleep stages within a specific section can be directly confirmed and the relative proportion of each sleep stage is easy to grasp. This can help identify features or abnormalities in the user's sleep pattern.
  • the shape corresponding to the sleep stage information may be expressed as at least one shape selected from the group consisting of a trapezoid, an isosceles trapezoid, a kite, a parallelogram, a rhombus, a rectangle, a square, and other general quadrilaterals; preferably, it may be expressed as a rectangle.
  • a plurality of sleep stage information may be expressed as a plurality of rectangles. At this time, at least one of the plurality of rectangles may be isolated from the remaining rectangles, such as the rectangle indicated by reference number 25110 in FIG. 25A. In the embodiment shown in FIG. 25A, a rectangle 25110 corresponding to the waking stage of sleep stage information is separated from the remaining rectangles.
  • the rectangle representing the awakening stage can be displayed separately from the rectangles representing other sleep stages.
  • because the rectangle representing the waking stage can be displayed separately from the rectangles representing the other sleep stages, sleep can be analyzed with occurrences of the waking stage clearly distinguished. According to prior-art interfaces, when information on waking stages during sleep is frequently acquired, each sleep stage and the waking stages are connected with lines (solid or dotted), so there is a risk that the hypnogram is expressed relatively unclearly, and there were concerns that occurrences of the waking stage could not be clearly distinguished.
  • areas allocated to the waking stage, REM sleep stage, light sleep stage, and deep sleep stage may be displayed in order from top to bottom (reference numbers 25102 to 25105).
  • in the interface according to embodiments of the present invention, when at least one of the shapes corresponding to the plurality of sleep stages is isolated from the remaining shapes, no shape exists at that time in the areas allocated to the other sleep stages, which has the effect of making it possible to clearly understand which sleep stage occurred at that point in time.
  • shapes corresponding to the plurality of sleep stages may be expressed as shapes of the same form (e.g., rectangles), with at least one of them separated (isolated) from the remaining shapes; this embodiment likewise has the effect of making it possible to clearly understand which sleep stage occurred at a given point in time, by ensuring that no shapes exist in the areas allocated to the other sleep stages.
  • the shape corresponding to at least one sleep stage may have the same shape as the shape corresponding to the other sleep stage.
  • a figure corresponding to at least one sleep stage may be displayed separately from other figures corresponding to the same sleep stage. Additionally, shapes corresponding to at least one sleep stage may be displayed separately from shapes corresponding to other sleep stages.
  • for example, when the figure corresponding to the waking stage is displayed only in the area of reference numeral 25102 and is not displayed in the areas allocated to the remaining sleep stages (reference numerals 25103 to 25105), the user can clearly understand which sleep stage (specifically, the waking stage) occurred at that time just by looking at the hypnogram graph according to the present invention.
  • when a word indicating the light sleep stage is displayed in Korean, it may be displayed as 'light sleep' or 'normal sleep'. Users without expert knowledge of sleep stages may misunderstand from the word 'light sleep' that they did not sleep properly during that sleep period; displaying the word 'normal sleep' instead has the possibility of reducing this misunderstanding (reference numeral 25104).
  • shapes corresponding to each sleep stage can be displayed only in a plurality of areas allocated to each sleep stage information (reference numbers 25102 to 25105).
  • the sleep stage information during the sleep period can be clearly distinguished and expressed, allowing the user to distinguish the sleep stage information during the sleep period and accurately understand the graph of the sleep stage information.
  • the line representing the boundary may be straight or curved.
  • lines representing the boundaries of a plurality of areas allocated to each piece of sleep stage information may be displayed in different colors or brightness. Specifically, the color of the border dividing the waking stage area from the remaining sleep stage area may be displayed relatively brightly.
  • a plurality of areas allocated to each piece of sleep stage information may have a height allocated to each.
  • the heights assigned to a plurality of areas assigned to each piece of sleep stage information may be the same or different. Specifically, the height allocated to the area allocated to the awakening stage information may be different from the height allocated to other areas (see reference number 25102).
  • a gap (pitch) may be assigned between the plurality of areas allocated to each piece of sleep stage information.
  • the figure corresponding to each piece of sleep stage information may be expressed discretely based on the gaps allocated between the plurality of areas; through this discrete expression, the user can distinguish the sleep stage information during the sleep period and accurately understand the graph of the sleep stage information.
  • at least one of the sleep onset time and the wake-up time may be displayed together in the hypnogram.
  • at least one of the sleep onset time and the wake-up time may be displayed as time information on the x-axis (reference numerals 25106 and 25107).
  • this allows the time interval from the sleep onset time to the wake-up time to be easily determined just by looking at the hypnogram.
  • when at least one of the sleep onset time and the wake-up time is displayed as time information on the x-axis, it may be displayed connected by a line to the figure corresponding to the relevant sleep stage information.
  • the line connecting at least one of the sleep onset time and the wake-up time with the figure corresponding to the sleep stage information may be displayed as a solid line, a dotted line, a straight line, or a curved line.
  • comment information based on sleep stage information or sleep state information may be displayed together with the hypnogram graph indicating the sleep stages (reference numeral 25109).
  • Figure 25d is a diagram showing a graphical user interface including a hypnogram according to another embodiment of the present invention.
  • the hypnogram graph can be generated in real time during the sleep period.
  • sleep information including the user's sleep sound is acquired in real time, and the acquired sleep sound information can be immediately converted into a spectrogram.
  • Sleep stages can be analyzed immediately by using the converted spectrogram as input to the sleep analysis model. Accordingly, a hypnogram graph representing sleep stage information can be generated in real time simultaneously with the analysis of sleep stages.
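  • A minimal sketch of the real-time flow described above (sound chunk, then spectrogram, then sleep-stage inference, then a new hypnogram point); `stage_model` is a hypothetical stand-in for the artificial intelligence sleep analysis model, and the chunk length and FFT parameters are assumptions.

```python
import numpy as np

def spectrogram(chunk, n_fft=512, hop=128):
    """Magnitude STFT of one mono audio chunk (1-D float array, len > n_fft)."""
    frames = [chunk[i:i + n_fft] * np.hanning(n_fft)
              for i in range(0, len(chunk) - n_fft, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1)).T  # freq x time

def stream_hypnogram(audio_chunks, stage_model):
    """Yield one sleep-stage label per incoming audio chunk (e.g. 30 s each)."""
    for chunk in audio_chunks:
        yield stage_model(spectrogram(chunk))  # e.g. 'wake'/'light'/'deep'/'rem'
```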
  • Figure 25b is a graph showing the time ratio of each sleep stage measured according to an embodiment of the present invention.
  • in a graphical user interface that includes a graph displaying the time ratio of each sleep stage according to an embodiment of the present invention, the phrase "My sleep at a glance" may be displayed together, as indicated by reference number 25201.
  • the time ratio of each sleep stage according to the present invention can be calculated as the ratio of the time corresponding to each sleep stage to the total sleep period.
  • the total sleep time can be calculated as the time from the time of falling asleep to the time of waking up.
  • the time corresponding to each sleep stage can be calculated based on the sleep stage inferred by the artificial intelligence model according to an embodiment of the present invention.
  • for example, if the total sleep time from the time of falling asleep to the time of waking up is 7 hours and 22 minutes, the period inferred as the general sleep (light sleep) stage is 4 hours and 34 minutes, the period inferred as the REM sleep stage is 1 hour and 37 minutes, the period inferred as the deep sleep stage is 58 minutes, and the period inferred as the waking stage is 13 minutes, then the time corresponding to each period is divided by the total sleep time, the percentage value is calculated, and the percentage of each sleep stage may be displayed.
  • the specific time values described above are merely examples, and the present invention is not limited thereto.
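  • The ratio computation just described can be sketched directly with the example figures from the text (durations in minutes); this is illustrative only, not the patent's internal representation.

```python
total_min = 7 * 60 + 22                  # 442 minutes, falling asleep to waking
stage_min = {"light": 4 * 60 + 34, "rem": 1 * 60 + 37, "deep": 58, "wake": 13}

ratios = {s: round(100 * m / total_min, 1) for s, m in stage_min.items()}
print(ratios)   # {'light': 62.0, 'rem': 21.9, 'deep': 13.1, 'wake': 2.9}
```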
  • time information corresponding to each sleep stage and the ratio of each sleep stage may be displayed side by side on the left and right (reference number 25202). Additionally, as shown in reference numbers 25204 to 25207 of FIG. 25B, the time ratio corresponding to each sleep stage can be visually displayed using a predetermined shape.
  • the color of the figure corresponding to each sleep stage may be displayed the same as the color of the figure corresponding to the sleep stage information shown in the hypnogram of FIG. 25A.
  • the figures corresponding to each sleep stage may be displayed in descending order of the time ratio corresponding to each sleep stage.
  • the order of displayed sleep stages may be changed.
  • the deep sleep stage may be displayed first, and the waking stage may be displayed last.
  • the waking stage may be configured to be displayed first, and the deep sleep stage may be displayed last.
  • the REM sleep stage may be configured to be displayed first, or the normal sleep stage may be configured to be displayed first. This order may be selected directly by the user, or may be automatically configured by an algorithm. The above-described sequence is merely an example, and the present invention is not limited thereto.
  • Figure 25c is a diagram showing a respiratory stability graph according to an embodiment of the present invention.
  • in a graphical user interface including a respiratory stability graph according to an embodiment of the present invention, the phrase "respiratory stability" may be displayed together, as indicated by reference number 25301.
  • sleep disorders (e.g., sleep apnea) or their underlying causes (e.g., snoring) may be identified based on the stability of breathing.
  • the stability of breathing can be determined based on criteria such as breathing cycle, breathing frequency change, and breathing pattern.
  • if the breathing pattern is irregular or changes suddenly during the sleep period, breathing may be judged to be unstable.
  • if the breathing frequency fluctuates significantly or shows irregular changes during the sleep period, breathing may be judged to be unstable.
  • in such cases, the breathing stability during the relevant period can be expressed as 'respiratory instability' (reference number 25303). Additionally, the respiratory stability in sections that do not correspond to 'respiratory instability' can be expressed as 'respiratory stability' (reference number 25302).
  • respiratory stability can be graphed over time.
  • the phrase “unstable” may also be displayed (reference number 25306).
  • an AHI (apnea-hypopnea index) level can be determined through sleep analysis using an artificial intelligence model, and the rate of respiratory stability can be calculated and displayed on the interface based on the determined AHI level (reference numbers 25304 and 25305).
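  • A minimal sketch of deriving a stability rate from detected unstable segments; the AHI severity cut-offs (5/15/30 events per hour) are the commonly used clinical bands, and the segment format is an illustrative assumption, not the patent's internal representation.

```python
def ahi_severity(event_count, sleep_hours):
    """Classify an apnea-hypopnea index using common clinical cut-offs."""
    ahi = event_count / sleep_hours
    if ahi < 5:
        return "normal"
    if ahi < 15:
        return "mild"
    if ahi < 30:
        return "moderate"
    return "severe"

def stability_rate(unstable_segments, total_minutes):
    """unstable_segments: list of (start_min, end_min) judged unstable."""
    unstable = sum(end - start for start, end in unstable_segments)
    return round(100 * (total_minutes - unstable) / total_minutes, 1)

print(ahi_severity(52, 7.4))                          # 'mild'
print(stability_rate([(120, 135), (300, 310)], 442))  # 94.3
```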
  • the color of the figure corresponding to the point in time when it is determined that breathing is unstable may be displayed brighter. This display has the advantage of being able to better identify the occurrence of problematic events from the viewpoint of respiratory stability.
  • a hypnogram graph representing sleep stage information and a graph representing breathing stability may be displayed together, in parallel above and below, on the same time axis. When displayed in this way, the user can see at a glance what the breathing stability was at each sleep stage, which can be effective in analyzing sleep.
  • if a breathing instability event occurs during the REM sleep stage, it can be interpreted as more serious. When the hypnogram graph representing sleep stage information and the graph representing breathing stability are displayed together in parallel on the same time axis as described above, the occurrence of such an event can be identified at a glance, making it easier to interpret and/or judge sleep information.
  • Figure 25e is a diagram showing a graphical user interface including an explanatory display for respiratory instability according to an embodiment of the present invention.
  • the phrase “What is respiratory instability?” may be displayed together, as indicated by reference number 25401.
  • a graphical user interface may display a description of respiratory instability (reference number 25402). Additionally, when the user clicks on the part assigned to the predetermined area indicated by reference number 25403, the user may be connected to an external website explaining respiratory instability or a screen displaying information accompanying a more detailed explanation.
  • the graphical user interface located in the background may be displayed relatively darkly (reference number 25404).
  • Figures 26A and 26B are diagrams illustrating a graphical user interface including statistical information of sleep state information according to embodiments of the present invention.
  • the phrase “sleep statistics” may be displayed together, as indicated by reference number 26501.
  • daily sleep time can be expressed as a bar graph. Specifically, the date on which sleep state information was acquired may be displayed on the x-axis (reference number 26503), and the sleep time value may be displayed on the y-axis (reference number 26504).
  • the sleep time calculated based on sleep state information acquired on the day corresponding to the date on the x-axis can be displayed in the form of a bar graph (reference number 26502).
  • day of the week information on the day the corresponding sleep state information was acquired may be displayed in the upper part of the bar graph (reference number 26505).
  • information indicating averages of actual sleep time may be displayed at the bottom of the bar graph (reference numbers 26506 to 26508). Specifically, at least one of the average of sleep times obtained over a predetermined period, the average of sleep times measured on weekdays, or the average of sleep times measured on weekends may be displayed together.
  • the average of sleep times obtained over a predetermined period of time, the average of sleep times measured during weekdays, or the average of sleep times measured during weekends may all be displayed together.
  • a graphical user interface may be provided in which the sleep time averages indicated by reference number 26509 are displayed separated by lines.
  • the line for distinction may be a dotted line or a solid line, and may be expressed as a curved line or a straight line.
  • in this case, the graphical user interface may be displayed together as shown at reference number 26508 in FIG. 26A. Additionally, the bar graphs corresponding to the 11th (Saturday) and the 12th (Sunday) of FIG. 26A may not be formed.
  • the specific description of the above-mentioned days and dates is merely an example, and the present invention is not limited thereto. For example, if sleep analysis is not performed on the 9th (Thursday), a bar graph corresponding to the 9th (Thursday) may not be formed.
  • average information on sleep time acquired over a predetermined period may be displayed along with a bar graph.
  • for example, if the average sleep time acquired during a predetermined period is 6 hours and 26 minutes, the average sleep time may be displayed as a shape such as a line (e.g., a dotted line) at the position corresponding to 6 hours and 26 minutes on the y-axis of the bar graph (reference number 26510).
  • likewise, if the average sleep time acquired during a predetermined period is 6 hours and 4 minutes, the average sleep time may be displayed as a line or similar shape at the position corresponding to 6 hours and 4 minutes on the y-axis of the bar graph.
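  • A minimal sketch of the averages shown under the bar graph: overall, weekday-only, and weekend-only mean sleep time over the period. The per-day record format and the sample values are illustrative assumptions.

```python
from statistics import mean

# Minutes slept per day over one week (illustrative data)
week = [("Mon", 386), ("Tue", 402), ("Wed", 371), ("Thu", 355),
        ("Fri", 412), ("Sat", 455), ("Sun", 448)]

def avg(records):
    return round(mean(m for _, m in records)) if records else None

weekend = {"Sat", "Sun"}
print(avg(week))                                      # overall average: 404
print(avg([r for r in week if r[0] not in weekend]))  # weekday average: 385
print(avg([r for r in week if r[0] in weekend]))      # weekend average: 452
```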
  • a graphical user interface containing statistical information of sleep state information may display the phrase 'sleep statistics', as indicated by reference number 26511, together with information on the period during which the corresponding sleep state information was acquired (e.g., period information "2022.6.6 ~ 6.12").
  • a graphical user interface containing statistical information of sleep state information may also display a brief description of the corresponding graph (e.g., "I slept this much this week" or "The more consistent the heights of the bars, the better"), as indicated by reference number 26511.
  • the specific figures for the above-described average sleep time and brief explanatory text on the graph are merely examples, and the present invention is not limited thereto.
  • Figures 27a and 27b are diagrams illustrating a graphical user interface including sleep state information acquired over a week according to embodiments of the present invention.
  • sleep state information obtained over a week can be expressed as a bar graph where the x-axis is the day of the week (reference number 27601) and the y-axis is time information.
  • time information on the y-axis may be expressed in 2-hour units (reference number 27602).
  • lines (solid or dotted lines) corresponding to each time on the y-axis can also be displayed (reference number 27603).
  • the entire bar graph may be expressed in a single color (reference number 27604).
  • the bar graph may be displayed with separate sections corresponding to the REM sleep, deep sleep, light sleep, and awakening stages (reference numbers 27605 to 27608). Specifically, if the colors of the figures corresponding to each piece of sleep stage information in the hypnogram graph are expressed differently, the figures corresponding to each sleep stage in the bar graph for that date may be displayed in those colors, in proportion to the ratio of the corresponding sleep stage.
  • the part corresponding to the awakening stage may be placed at the top of the bar graph (reference number 27608), and the part corresponding to the deep sleep stage may be placed at the bottom of the bar graph (reference number 27605).
  • the portion corresponding to a sleep stage with a high time ratio may be placed lower in the bar graph, and the portion corresponding to a sleep stage with a lower time ratio may be placed higher in the bar graph.
  • this arrangement order is merely an example and the present invention is not limited thereto.
  • the part corresponding to the deep sleep stage, the part corresponding to the light sleep stage, the part corresponding to the REM sleep stage, and the part corresponding to the awakening stage may be arranged in that order.
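  • A minimal sketch of building one day's stacked bar: stage portions are stacked in a fixed order (deep at the bottom, wake at the top), matching one of the arrangements described above; the ordering and the data format are illustrative assumptions.

```python
STACK_ORDER = ["deep", "light", "rem", "wake"]     # bottom to top

def stacked_segments(stage_minutes):
    """Return (stage, y_start, y_end) tuples for one day's stacked bar."""
    segments, y = [], 0
    for stage in STACK_ORDER:
        m = stage_minutes.get(stage, 0)
        segments.append((stage, y, y + m))
        y += m
    return segments

print(stacked_segments({"deep": 58, "light": 274, "rem": 97, "wake": 13}))
# [('deep', 0, 58), ('light', 58, 332), ('rem', 332, 429), ('wake', 429, 442)]
```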
  • Figures 28A and 28B are diagrams illustrating a graphical user interface including sleep state information acquired over a predetermined period of time according to embodiments of the present invention.
  • sleep state information obtained over a week may be expressed as a bar graph where the x-axis is the date (reference numbers 28701a and 28701b) and the y-axis is time information.
  • a bar graph can be displayed based on the bedtime information and wake-up time information included in the obtained sleep state information. For example, if it is determined that the user went to bed after the evening of the 12th and woke up after the early morning of the 13th, this may be expressed in the shape of the leftmost bar graph shown in FIG. 28A. At this time, as shown in FIGS. 28A and 28B, both ends of the bar graph of sleep state information acquired during a predetermined period may correspond to the bedtime and the wake-up time, respectively.
  • time information on the y-axis may be expressed in time units, but may also be expressed with words such as noon, evening, midnight, and dawn, as shown in FIGS. 28A and 28B (reference number 28702). Additionally, lines (solid or dotted) corresponding to each time on the y-axis can also be displayed (reference number 28703).
  • the line corresponding to midnight may be distinguished from the lines corresponding to other times by being displayed relatively thicker or brighter.
  • the entire bar graph may be expressed in a single color (reference number 28704).
  • the bar graph may display separate parts corresponding to the REM sleep, deep sleep, light sleep, and waking stages. Specifically, if the colors of the figures corresponding to each piece of sleep stage information in the hypnogram graph are expressed differently, the figures corresponding to each sleep stage in the bar graph for that date may be displayed in those colors, in proportion to the ratio of the corresponding sleep stage.
  • the part corresponding to each sleep stage may be arranged in the order of the time at which the corresponding sleep stage was detected.
  • Figure 29 is a flowchart of a method for creating and providing one or more graphical user interfaces representing information about a user's sleep according to an embodiment of the present invention.
  • a method of generating one or more graphical user interfaces representing information about sleep may include acquiring sleep information (S29120), converting sleep information acquired in the time domain into information in the frequency domain (S29140), generating a graphical user interface (S29160), and providing the graphical user interface (S29180).
  • the sleep information acquired in the step of acquiring sleep information may include environmental sensing information or sleep sound information.
  • the method of generating one or more graphical user interfaces representing information about sleep may further include a sleep log storage step of storing sleep log information related to the account assigned to the user in a memory.
  • the step of converting environmental sensing information obtained in the time domain into information in the frequency domain may include performing preprocessing on raw acoustic information in the time domain or on information in the frequency domain.
  • the step of converting environmental sensing information obtained in the time domain into information in the frequency domain may include converting acoustic information into spectrogram information in the frequency domain.
  • a step of applying a Mel scale to the spectrogram to convert it into a Mel spectrogram may further be included.
  • the step of generating a graphical user interface may include a hypnogram graph generating step of generating a graph of the sleep stage within the user's sleep period based on sleep information.
  • sleep information that is the basis for generating a hypnogram graph may be obtained by converting sleep information obtained in the time domain into information in the frequency domain.
  • the information in the frequency domain may be a spectrogram or a Mel spectrogram to which a Mel scale is applied.
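  • A minimal sketch of the conversion step (S29140) using the librosa library; the file name, sampling rate, and FFT parameters are illustrative assumptions, not values from the disclosure.

```python
import librosa
import numpy as np

# Load the recorded sleep sound (time domain); path and rate are assumptions
y, sr = librosa.load("sleep_audio.wav", sr=16000, mono=True)

# S29140: time domain -> frequency domain (power spectrogram)
spec = np.abs(librosa.stft(y, n_fft=1024, hop_length=256)) ** 2

# Optional further step: apply a Mel scale to obtain a Mel spectrogram
mel = librosa.feature.melspectrogram(S=spec, sr=sr, n_mels=64)
mel_db = librosa.power_to_db(mel, ref=np.max)   # log-compressed model input
```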
  • the step of generating a graphical user interface (S29160) may include generating a graph about sleep stability within the user's sleep period based on sleep information.
  • in a method for starting sleep measurement through the user device 10, the user can set the user's desired sleep time in advance. For example, if the user's desired sleep time is set to 10 p.m., the sleep measurement start trigger, which will be described below, may be set to be activated.
  • when the user device 10 is set to a mode in which the output of information indicating the occurrence of events related to applications operating on the user device 10 is restricted (e.g., sleep mode, do-not-disturb mode, falling-asleep mode, etc.), the method may include a sensing step of sensing a sleep measurement start trigger to start sleep measurement only after the sleep mode is set; and a start step of starting sleep measurement at the time the trigger is sensed.
  • the user device 10 when the user's desired sleep time is set, the user device 10 may be set to operate in a sleep mode, etc. after the set user's desired sleep time, and in this case, the user's desired sleep time Setting may mean setting the time at which the user device 10 starts operating in sleep mode, etc.
  • sleep measurement may be started not only when each sleep measurement start trigger is sensed independently; the method may also be configured so that only when the trigger sensing operations according to two or more embodiments are sensed together is that time recognized as the trigger sensing point for sleep measurement, as sketched below.
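  • A minimal sketch of requiring two or more trigger types to fire together (within a short window) before sleep measurement starts; the window length and trigger names are illustrative assumptions.

```python
import time

WINDOW_S = 60                                   # co-occurrence window (assumed)
REQUIRED = {"charging_started", "no_motion"}    # example trigger combination
recent = {}

def on_trigger(name):
    """Record one sensed trigger; True means all required triggers co-occur."""
    now = time.time()
    recent[name] = now
    live = {t for t, ts in recent.items() if now - ts <= WINDOW_S}
    return REQUIRED <= live     # start measurement only when all are present
```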
  • Figure 32 is a diagram showing that when a finger swipe input is sensed on the display unit of a user device, which is an embodiment of the present invention, the time of sensing the finger swipe is recognized as the sensing time of the sleep measurement start trigger.
  • a method for starting sleep measurement through the user device 10 includes a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at the time when the trigger is sensed.
  • the user device 10 may sense a swipe in the first swipe input direction 32110 as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time when the sleep measurement start trigger is sensed. For example, when starting sleep measurement, the user device 10 may display the user interface 32100" indicating that sleep measurement is in progress, shown in FIG. 34(b), on the user device display unit 32100' of the user device 10.
  • the user device 10 may sense the swipe in the second swipe input direction 32120 as a sleep measurement start trigger.
  • the user can set which of the first swipe input direction 32110 and the second swipe input direction 32120 is the specific direction for sensing a swipe as a sleep measurement start trigger, and if the preset specific direction for sensing the sleep measurement start trigger is the second swipe input direction 32120, sleep measurement may be started accordingly.
  • Figure 33 is a diagram showing that when a palm swipe input is sensed on the display unit of a user device, which is an embodiment of the present invention, the time of sensing the palm swipe is recognized as the sensing time of the sleep measurement start trigger.
  • a method for starting sleep measurement through the user device 10 includes a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at the time when the trigger is sensed.
  • the user may swipe the user device display unit 32100' of the user device 10 with the palm in the first swipe input direction 33110.
  • the user device 10 may sense the swipe in the first swipe input direction 33110 as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time of sensing the sleep measurement start trigger. For example, when starting sleep measurement, the user device 10 may display the user interface 32100" indicating that sleep measurement is in progress, shown in Figure 34(b), on the user device display unit 32100' of the user device 10.
  • the user may swipe the display unit of the user device 10 with the palm in the second swipe input direction 33120.
  • the user device 10 may sense the swipe in the second swipe input direction 33120 as a sleep measurement start trigger.
  • the user can set which of the first swipe input direction 33110 and the second swipe input direction 33120 is the specific direction for sensing a swipe as a sleep measurement start trigger, and if the preset specific direction for sensing the sleep measurement start trigger is the second swipe input direction 33120, sleep measurement may be started accordingly.
  • while sleep measurement is in progress, the user may touch the display unit of the user device 10 with the palm and swipe in the first swipe input direction 33110.
  • the user device 10 may sense a swipe in the second swipe input direction 33120 as a sleep measurement interruption trigger to stop sleep measurement, Sleep measurement can be stopped when the sleep measurement interruption trigger is sensed.
  • while sleep measurement is in progress, the user may touch the display unit of the user device 10 with the palm and swipe in the second swipe input direction 33120.
  • the user device 10 may sense a swipe in the first swipe input direction 33110 as a sleep measurement interruption trigger to stop sleep measurement, Sleep measurement can be stopped when the sleep measurement interruption trigger is sensed.
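  • A minimal sketch of mapping a swipe direction to starting or interrupting measurement, as described for Figures 32 and 33; the direction names and handler shape are illustrative assumptions (a real app would use the platform's gesture API).

```python
measuring = False
START_DIRECTION = "second"      # user-configurable start direction (per text)

def on_palm_swipe(direction):
    """direction: 'first' or 'second'; toggles measurement per the figures."""
    global measuring
    if not measuring and direction == START_DIRECTION:
        measuring = True        # start trigger sensed
    elif measuring and direction != START_DIRECTION:
        measuring = False       # opposite swipe acts as interruption trigger
    return measuring
```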
  • Figure 34 is a diagram showing that when the user device, which is an embodiment of the present invention, senses the start of wired charging, it recognizes the charging start sensing time as the sensing time of the sleep measurement start trigger.
  • a method for starting sleep measurement through the user device 10 includes a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at the time when the trigger is sensed.
  • when sleep measurement has not been started, the user interface 32100" indicating that sleep measurement is in progress may not appear on the user device display unit 32100' of the user device 10.
  • when the user device 10 is connected to the user device charger 34300, the time at which the user device 10 is connected to the user device charger 34300 can be sensed as a sleep measurement start trigger, and that time can be recognized as the sleep measurement start time to start sleep measurement. Accordingly, sleep measurement may be initiated, and a user interface 32100" indicating that sleep measurement is in progress may appear on the user device display unit 32100' of the user device 10.
  • when the user device 10 is connected to the user device charger 34300 after the user's desired sleep time, the time at which the user device 10 is connected to the user device charger 34300 can be sensed as a sleep measurement start trigger, and the user device 10 can recognize the time connected to the user device charger 34300 as the sleep measurement start time and start sleep measurement.
  • the time range set by the user may be a range in which the user usually goes to sleep. For example, if the user mainly falls asleep after 10 p.m., the time range may be set to after 10 p.m.
  • in this case, the user device 10 may sense the time when the user device 10 is connected to the user device charger 34300 as a sleep measurement start trigger, recognize that time as the start time of sleep measurement, and start sleep measurement.
  • when the user device 10 is connected to the user device charger 34300 after the user's desired sleep time, the connection time can be sensed as a sleep measurement start trigger, and after the user device 10 recognizes the time connected to the user device charger 34300 as the sleep measurement start time and starts sleep measurement, if the connection between the user device 10 and the user device charger 34300 is disconnected, sleep measurement may be interrupted. In this case, the user interface 32100" indicating that sleep measurement is in progress may no longer appear on the user device display unit 32100' of the user device 10.
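  • A minimal sketch of the wired-charging trigger just described: measurement starts when the charger connects after the user's desired sleep time and is interrupted on disconnection; the threshold time is illustrative.

```python
from datetime import datetime, time

DESIRED_SLEEP_TIME = time(22, 0)   # 10 p.m. (assumed setting)
measuring = False

def on_charger_event(connected, now=None):
    """Start on charger connection after the desired sleep time; stop on unplug.
    (Simplified: times past midnight would need date-aware handling.)"""
    global measuring
    now = now or datetime.now()
    if connected and now.time() >= DESIRED_SLEEP_TIME:
        measuring = True
    elif not connected and measuring:
        measuring = False
    return measuring
```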
  • Figure 35 is a diagram showing that when a user device, which is an embodiment of the present invention, senses the start of wireless charging, it recognizes the charging start sensing time as the sensing time of the sleep measurement start trigger.
  • a method for starting sleep measurement through the user device 10 includes a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at the time when the trigger is sensed.
  • when sleep measurement has not been initiated, the user interface 32100" indicating that sleep measurement is in progress may not appear on the user device display unit 32100' of the user device 10.
  • when the user device 10 comes into contact with the user device wireless charging pad 35400, the point of contact with the user device wireless charging pad 35400 can be sensed as a sleep measurement start trigger, and the user device 10 can recognize that point of contact as the sleep measurement start point and start sleep measurement. Accordingly, sleep measurement may be initiated, and a user interface 32100" indicating that sleep measurement is in progress may appear on the user device display unit 32100' of the user device 10.
  • when the user device 10 comes into contact with the user device wireless charging pad 35400 after the user's desired sleep time, the time of contact with the user device wireless charging pad 35400 can be sensed as a sleep measurement start trigger, and the time when the user device 10 touches the user device wireless charging pad 35400 can be recognized as the sleep measurement start time to start sleep measurement.
  • the time range set by the user may be a range in which the user usually goes to sleep. For example, if the user mainly falls asleep after 10 p.m., the time range may be set to after 10 p.m.
  • in this case, when the user device 10 comes into contact with the user device wireless charging pad 35400, the user device 10 can sense the contact time as a sleep measurement start trigger and recognize the time when the user device 10 touches the user device wireless charging pad 35400 as the sleep measurement start time to start sleep measurement.
  • when the user device 10 comes into contact with the user device wireless charging pad 35400 within the time range set by the user, the contact time can be sensed as a sleep measurement start trigger, and the user device 10 can recognize the contact time as the sleep measurement start time and start sleep measurement. Thereafter, if the user releases the contact between the user device 10 and the user device wireless charging pad 35400, sleep measurement may be stopped, and in this case, the user interface 32100" indicating that sleep measurement is in progress may no longer appear on the user device display unit 32100' of the user device 10.
  • Figure 36 is a diagram showing that when exercise or movement is sensed by a user device, which is an embodiment of the present invention, the time at which the exercise or movement is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • a method for starting sleep measurement through the user device 10 includes a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at the time when the trigger is sensed.
  • the user device 10 can sense the user's exercise or movement.
  • the accelerometer sensor of the user device 10 may detect the user's exercise or movement.
  • alternatively, a gyro sensor may detect the user's exercise or movement; unlike an accelerometer, which measures acceleration, a gyro sensor measures angular velocity.
  • the user device 10 may recognize the time when the accelerometer sensor of the user device 10 detects the user's exercise or movement as the time when the sleep measurement start trigger is sensed, and start sleep measurement. For example, when starting sleep measurement, the user device 10 may display the user interface 32100" indicating that sleep measurement is in progress, shown in FIG. 36(b), on the user device display unit 32100' of the user device 10.
  • the user device 10 may recognize the time when the accelerometer sensor of the user device 10 detects the user's exercise or movement as the time when the sleep measurement start trigger is sensed and start sleep measurement; at this time, when the user's hand 32200 shakes the user device 10 once more while sleep is being measured, the user device 10 may sense this additional shaking as a sleep measurement interruption trigger and stop sleep measurement at the time when the interruption trigger is sensed.
  • when the user device 10 is shaken while the user's hand 32200 is holding it within a time range set by the user, the user device 10 can sense the user's exercise or movement, recognize the time when the accelerometer sensor detects the movement as the time when the sleep measurement start trigger is sensed, and start sleep measurement. For example, if the time range is set to after 10 p.m., when the user usually goes to sleep, shaking the user device 10 in the user's hand 32200 after 10 p.m. may be sensed as the sleep measurement start trigger, and sleep measurement may be started accordingly.
  • likewise, after sleep measurement has been started in this way (including when it was started within the time range set by the user), when the user's hand 32200 shakes the user device 10 once more while sleep is being measured, the user device 10 may sense this additional shaking as a sleep measurement interruption trigger and stop sleep measurement at the time when the interruption trigger is sensed.
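  • A minimal sketch of shake detection from accelerometer samples: a jump in acceleration magnitude above a threshold toggles measurement on, and a second shake toggles it off, as described above. The threshold and sample format are illustrative assumptions; a practical implementation would also debounce repeated samples from a single shake.

```python
import math

THRESHOLD = 15.0    # m/s^2, above the ~9.8 m/s^2 gravity baseline (assumed)
measuring = False

def on_accel_sample(ax, ay, az):
    """A magnitude spike toggles measurement on; a second shake toggles it off."""
    global measuring
    if math.sqrt(ax * ax + ay * ay + az * az) > THRESHOLD:
        measuring = not measuring
    return measuring
```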
  • the above-described movement or specific pattern of movement (e.g., shaking) is merely an example, and the present invention is not limited thereto.
  • Figure 37 is a diagram showing that when a user's voice is sensed by a user device, which is an embodiment of the present invention, the time when the user's voice is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • a method for starting sleep measurement through the user device 10 includes a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at the time when the trigger is sensed.
  • sleep measurement may be started at the time when the voice is sensed.
  • the user sleep measurement start trigger voice 37210 may be a voice that the user device 10 senses at a level higher than a predetermined decibel, or a voice sensed at such a level within the time range set by the user, and the user device 10 may recognize the meaning of the user sleep measurement start trigger voice 37210.
  • alternatively, the external server 20 may analyze the user sleep measurement start trigger voice received from the user device 10 through a speech data analysis module, which may include at least one of an automatic speech recognition (ASR) module or a natural language understanding (NLU) module.
  • the user device 10 can recognize the meaning of the user sleep measurement start trigger voice 37210 and start sleep measurement at the point when the user sleep measurement start trigger voice 37210 is sensed.
  • the user sleep measurement start trigger voice 37210 may be a voice indicating the user's intention to sleep, such as "I'm sleepy," "I'm going to sleep," or "I'm going to sleep now," etc.
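  • A minimal sketch of the voice trigger just described: a loud-enough utterance is passed to speech recognition, and a simple phrase check stands in for the NLU module; `transcribe` is a hypothetical ASR callable, and the decibel threshold and phrases are assumptions.

```python
SLEEP_PHRASES = ("i'm sleepy", "i'm going to sleep", "going to sleep now")
LEVEL_DB = 50                   # predetermined decibel threshold (assumed)

def is_sleep_intent(text):
    """A trivial stand-in for the NLU module's intent classification."""
    text = text.lower()
    return any(p in text for p in SLEEP_PHRASES)

def on_utterance(level_db, audio_bytes, transcribe):
    """transcribe: hypothetical ASR callable mapping audio bytes to text."""
    if level_db < LEVEL_DB:
        return False            # too quiet to count as a trigger voice
    return is_sleep_intent(transcribe(audio_bytes))
```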
  • Figure 38 is a diagram illustrating that when another device connected to the user device through a network senses sleep measurement start trigger information, which is an embodiment of the present invention, the sensing time is recognized as the sensing time of the sleep measurement start trigger.
  • the method for starting sleep measurement through the user device 10 may include another device 38220, connected to the user device through a network 38230, sensing sleep measurement start trigger information.
  • the trigger for starting sleep measurement through the user device 10 may be physical contact between another device 38220 and the user's body.
  • the user device 10 can sense, through the network 38230, the point at which there is physical contact between the other device 38220 and the user's body, and start sleep measurement at the sensed time.
  • the other device 38220 may be a “smart watch” type device configured to be worn by the user on the body, and when the user wears it on the body and physical contact occurs, the user device 10 can sense the point in time when there is physical contact between another device 38220 and the user's body and start measuring sleep at the sensed point in time.
  • Other devices (38220) of the present invention include smart watches, smart rings, Oura rings, Bluetooth earphones, smart bands, smart glasses, wearable computers, smart home appliances, electronic devices for adjusting the user's sleep environment, etc. , but is not limited to this.
  • sleep measurement when a sleep measurement start trigger occurs within a time range set by the user, sleep measurement may be started. For example, if the time set by the user is 10 p.m., and there is physical contact between the other device 38220 and the user's body after 10 p.m., the user device 10 contacts the other device 38220. The point at which there is physical contact between the user's bodies can be sensed through the network 38230, and sleep measurement can be started at the sensed point in time.
  • the other device 38220 may be a "smart watch" type device configured to be worn on the body; when it is worn on the user's body after 10 p.m., the time set by the user, and physical contact occurs, the user device 10 may sense the point in time when there is physical contact between the other device 38220 and the user's body and start measuring sleep at the sensed point in time.
  • a method for starting sleep measurement through the user device 10 includes another device 38220 connected to the user device through a network 38230 sensing sleep measurement start trigger information, When the other device transmits the sleep measurement start trigger information sensed, a receiving step of receiving the transmitted sleep measurement start trigger information; and a starting step of starting sleep measurement at a time when the trigger information is sensed based on the received sleep measurement start trigger information.
  • the sleep measurement start trigger may be physical contact between the other device 38220 and the user device 10, and the start step may include initiating sleep measurement when physical contact between the other device and the user device is sensed.
  • when the user wears the "smart watch" corresponding to the other device 38220 and then touches the currently worn device to the user device 10, the user device 10 may sense the point at which the user device 10 and the other device 38220 come into physical contact as a sleep measurement start trigger.
  • the sleep measurement start trigger may be a user sleep measurement start trigger voice 37210 input to the other device 38220, and the start step may include starting sleep measurement when the user sleep measurement start trigger voice 37210 input to the other device 38220 is sensed.
  • when the other device 38220 recognizes the meaning of the user sleep measurement start trigger voice 37210, the user device 10 can start sleep measurement at the point when the user sleep measurement start trigger voice 37210 is sensed.
  • specifically, when the other device 38220 recognizes the meaning of the user sleep measurement start trigger voice 37210 and transmits the time at which the voice was sensed to the user device 10 through the network 38230, the user device 10 can start sleep measurement at the time when the user sleep measurement start trigger voice 37210 was sensed by the other device 38220.
  • the user sleep measurement start trigger voice 37210 may be a voice indicating the user's intention to sleep, such as "I'm sleepy," "I'm going to sleep," or "I'm going to sleep now," etc.
  • the sleep measurement start trigger may be a signal indicating that the other device 38220 has started charging after the user's desired sleep time.
  • the initiating step may include initiating sleep measurement at the time when the other device 38220 starts charging.
  • the user device 10 can receive, through the network 38230, the time at which the other device 38220 is connected to the charger and sense it as a sleep measurement start trigger; the user device 10 can then recognize the time when the other device 38220 was connected to the charger as the sleep measurement start time and start sleep measurement.
  • the user's desired sleep time may be a time range in which the user usually goes to sleep.
  • in this case, the user device 10 can sense the time when the other device 38220 is connected to the charger as a sleep measurement start trigger, recognize that time as the sleep measurement start time, and start sleep measurement.
  • the accelerometer sensor of the other device 38220 may detect the exercise or movement of the other device 38220; as a specific example, a gyro sensor may detect the exercise or movement of the other device 38220, and in this case the gyro sensor measures angular velocity, unlike an accelerometer, which measures acceleration. Accordingly, the user device 10 may recognize the time when the sensor of the other device 38220 detects the exercise or movement of the other device as the time when the sleep measurement start trigger is sensed, and start sleep measurement.
  • the user device 10 can receive the exercise or motion detection time of the other device 38220 through the network 38230 and sense it as a sleep measurement start trigger; the user device 10 can then recognize the exercise or motion detection time of the other device 38220 as the sleep measurement start time and start sleep measurement.
  • the user's desired sleep time may be a time range in which the user usually goes to sleep. For example, if the user mainly falls asleep after 10 p.m., the range may be set to after 10 p.m.
  • in this case, the user device 10 can sense the time when the exercise or movement of the other device 38220 is sensed as a sleep measurement start trigger and recognize that time as the sleep measurement start time to start sleep measurement.
  • when the other device 38220 detects a specific pattern of exercise or movement, the user device 10 may recognize the detection time of that pattern of exercise or movement as the sleep measurement start time.
  • when shaking is detected in the other device 38220, the user device 10 may recognize the time when shaking is detected in the other device 38220 as the time when the sleep measurement start trigger is sensed and start sleep measurement. In this case, when shaking is detected once more in the other device 38220, the user device 10 may sense that shaking as a sleep measurement interruption trigger and stop sleep measurement at the point when the interruption trigger is sensed.
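  • A minimal sketch of the user device handling a trigger relayed from another device (e.g., a smart watch) over the network; the message format, trigger names, and time-range check are illustrative assumptions.

```python
from datetime import datetime, time

DESIRED_SLEEP_TIME = time(22, 0)
KNOWN_TRIGGERS = {"body_contact", "device_contact", "charging", "shake", "voice"}

def on_trigger_message(msg):
    """msg example: {'type': 'body_contact', 'sensed_at': datetime(...)}.
    Returns True when the relayed trigger should start sleep measurement."""
    if msg["type"] not in KNOWN_TRIGGERS:
        return False
    if msg["sensed_at"].time() < DESIRED_SLEEP_TIME:
        return False            # outside the user's set time range (simplified)
    return True                 # msg['sensed_at'] becomes the start time
```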
  • the above-described movement or specific pattern of movement (e.g., shaking) is merely an example, and the present invention is not limited thereto.
  • Figure 39 is a diagram showing that when sleep sound information is sensed by a user device according to an embodiment of the present invention, the time at which the sleep sound information is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • a method for starting sleep measurement through the user device 10 includes a sensing step of sensing a sleep measurement start trigger; and a starting step of starting sleep measurement at the time when the trigger is sensed.
  • the user device 10 may sense the user's sleep sound information.
  • the user device 10 may receive the user's sleep sound information (SS), and the user's sleep sound information (SS) may include environmental sensing information.
  • environmental sensing information may include information acquired in a non-invasive manner while the user is active or asleep in the space.
  • environmental sensing information may include sounds generated as the user tosses and turns during sleep, sounds related to muscle movements, or sounds related to the user's breathing during sleep.
  • the environmental sensing information may include movement and distance information related to the user's movement during sleep, and breathing information generated based on this.
  • the environmental sensing information may include sleep sound information (SS), and the sleep sound information (SS) may mean sound information related to movement patterns and breathing patterns that occur during the user's sleep.
  • the environmental sensing information may include sleep movement information, and the sleep movement information may mean information related to movement patterns and breathing patterns that occur during the user's sleep.
  • this sleep sound information concerns very small sounds (i.e., sounds that are difficult to distinguish), such as the user's breathing and movement, and is acquired together with other sounds in the sleep environment, so it has a low signal-to-noise ratio; when acquired through a microphone module, its detection and analysis can therefore be very difficult.
  • the user device 10 when the user has not yet fallen asleep, the user device 10 does not sense sleep sound information from the user 39240 who is not asleep. Therefore, the user device 10 cannot sense the user's sleep sound information as a sleep measurement start trigger.
  • likewise, when the user is not yet asleep, the user device 10 cannot sense sleep sensing data from the user 39240 who is not asleep, and therefore cannot sense the user's sleep sensing data as a sleep measurement start trigger.
  • the sleep sensing data may include the user's breathing information acquired during a predetermined time period related to the user's sleep environment.
  • the sleep sensing data may further include the user's movement information and heart rate information acquired during a predetermined time period in relation to the user's sleep environment. That is, the user device 10 may be configured to include a sensor module for obtaining at least one of breathing information, movement information, and heart rate information from the user in relation to the user's sleeping environment.
  • the user device 10 may be provided with one or more transmitting modules that transmit radio waves of a specific frequency (e.g., microwaves) and a receiving module that receives the reflected waves generated in response.
  • the user device 10 can obtain sleep sensing data from the user in a non-contact manner by detecting a phase difference or frequency change according to the moving distance of the reflected wave corresponding to the radio wave transmitted from the transmission module. For example, when radio waves transmitted through a transmission module hit an object and are reflected, the phase difference or frequency may vary. For example, when an object gets closer to the transmission module, the frequency of the reflected wave may become shorter, and as it moves further away, the frequency of the reflected wave may become longer.
  • the user device 10 may acquire sleep sensing information related to the user's breathing by detecting movement of the user's body (eg, abdomen or chest, etc.) based on the phase difference or frequency change state of the reflected wave. For example, when the user sleeps, the user device 10 may be located in an area of the space where the user sleeps.
  • the user device 10 can transmit radio waves with a specific frequency through a transmission module, and receive reflected waves reflected from the user's body in response to the radio waves through a reception module. Additionally, the user device 10 may obtain at least one of the user's breathing information, movement information, and heart rate information based on the phase difference or frequency change of the reflected wave received through the reception module.
  • the user device 10 may be equipped with a transmission module and a reception module to obtain sleep sensing data through RF (Radio Frequency) sensing, but the present disclosure is not limited thereto.
  • the user device 10 of the present disclosure is configured to further include a sensor module for transmitting and detecting WiFi radio waves and a sensor module for detecting airflow related to the user's breathing to perform sleep sensing. Data can be obtained.
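  • A minimal sketch of estimating a breathing rate from the reflected-wave phase signal described above: chest motion modulates the phase, so the dominant spectral peak in the typical breathing band gives breaths per minute; the sampling rate and band limits are illustrative assumptions.

```python
import numpy as np

def breathing_rate_bpm(phase, fs=20.0):
    """Estimate breaths/min from a reflected-wave phase signal sampled at fs Hz."""
    phase = np.asarray(phase, dtype=float)
    phase -= phase.mean()                       # remove the static offset
    spectrum = np.abs(np.fft.rfft(phase))
    freqs = np.fft.rfftfreq(len(phase), d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)      # 6 to 30 breaths per minute
    return float(freqs[band][np.argmax(spectrum[band])] * 60.0)
```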
  • when the user device 10 senses the sleep sound information of the user 39240' who is asleep, the user device 10 can sense the user's sleep sound information as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time of sensing the sleep measurement start trigger. For example, when starting sleep measurement, the user device 10 may display the user interface 32100" indicating that sleep measurement is in progress, shown in FIG. 39(b), on the user device display unit 32100' of the user device 10.
  • likewise, when the user device 10 senses the sleep sensing data of the user 39240' who is asleep, the user device 10 can sense the user's sleep sensing data as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time of sensing the sleep measurement start trigger, and may display the user interface 32100" indicating that sleep measurement is in progress, shown in Figure 39(b), on the user device display unit 32100' of the user device 10.
  • when the user's desired sleep time is set to 10 p.m. and, after 10 p.m., the user device 10 senses sleep sound information of the sleeping user 39240', the user device 10 may sense that sleep sound information as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time when the sleep measurement start trigger is sensed, and may display the user interface 32100" indicating that sleep measurement is in progress, shown in FIG. 39(b), on the user device display unit 32100' of the user device 10.
  • FIG. 40 is a diagram illustrating that when sleep sound information is sensed by a user device in a sleep mode according to an embodiment of the present invention, the time at which the sleep sound information is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • a method for initiating sleep measurement through a user device may include a sensing step of sensing information related to the user; a start decision information generating step of determining a time to start sleep measurement based on the sensed user-related information; and a start step of starting sleep measurement at the determined start time.
  • the user device 10 may sense the user's sleep sound information.
  • the user device 10 may receive the user's sleep sound information (SS), and the user's sleep sound information (SS) may include environmental sensing information.
  • environmental sensing information may include information acquired in a non-invasive manner during the user's activities or sleep in the work space.
  • environmental sensing information may include sounds generated as the user tosses and turns during sleep, sounds related to muscle movements, or sounds related to the user's breathing during sleep.
  • the environmental sensing information may include movement and distance information related to the user's movement during sleep, and breathing information generated based on this.
  • the environmental sensing information may include sleep sound information (SS), and the sleep sound information (SS) may mean sound information related to movement patterns and breathing patterns that occur during the user's sleep.
  • the environmental sensing information may include sleep movement information, and the sleep movement information may mean information related to movement patterns and breathing patterns that occur during the user's sleep.
  • This sleep sound information is information about very small sounds (i.e., sounds that are difficult to distinguish) such as the user's breathing and movement, and is acquired along with other sounds during the sleep environment, so it has a low signal-to-noise ratio. When acquired through a microphone module such as a bar, detection and analysis can be very difficult.
  • the user device 10 when the user has not yet fallen asleep, the user device 10 does not sense sleep sound information from the user 39240 who is not asleep. Therefore, the user device 10 cannot sense the user's sleep sound information as a sleep measurement start trigger.
  • the user device 10 when the user is not yet asleep, the user device 10 cannot sense sleep sensing data from the user 39240 who is not asleep, so the user device 10
  • the user's sleep sensing data cannot be sensed as a sleep measurement start trigger.
  • the sleep sensing data may include the user's breathing information acquired during a predetermined time period related to the user's sleep environment.
  • the sleep sensitivity data may further include the user's movement information and heart rate information acquired during a predetermined time period in relation to the user's sleep environment. That is, the user device 10 may be configured to include a sensor module for obtaining at least one of breathing information, movement information, and heart rate information from the user in relation to the user's sleeping environment.
  • the user device 10 includes one or more transmitting modules that transmit radio waves of a specific frequency and a receiving module that receives reflected waves generated in response to radio waves of a specific frequency (e.g., microwaves). It may be provided including.
  • the user device 10 can obtain sleep sensing data from the user in a non-contact manner by detecting a phase difference or frequency change according to the moving distance of the reflected wave corresponding to the radio wave transmitted from the transmission module. For example, when radio waves transmitted through a transmission module hit an object and are reflected, the phase difference or frequency may vary. For example, when an object gets closer to the transmission module, the frequency of the reflected wave may become shorter, and as it moves further away, the frequency of the reflected wave may become longer.
  • the user device 10 may acquire sleep sensing information related to the user's breathing by detecting movement of the user's body (eg, abdomen or chest, etc.) based on the phase difference or frequency change state of the reflected wave. For example, when the user sleeps, the user device 10 may be located in an area of the space where the user sleeps.
  • the user device 10 may acquire sleep sensing information related to the user's breathing by detecting movement of the user's body (eg, abdomen or chest, etc.) based on the phase difference or frequency change state of the reflected wave. For example, when the user sleeps, the user device 10 may be located in an area of the space where the user sleeps.
  • the user device 10 can transmit radio waves with a specific frequency through a transmission module, and receive reflected waves reflected from the user's body in response to the radio waves through a reception module. Additionally, the user device 10 may obtain at least one of the user's breathing information, movement information, and heart rate information based on the phase difference or frequency change of the reflected wave received through the reception module.
  • the user device 10 may be equipped with a transmission module and a reception module to obtain sleep sensing data through RF (Radio Frequency) sensing, but the present disclosure is not limited thereto.
  • The user device 10 of the present disclosure may further include a sensor module for transmitting and detecting WiFi radio waves and a sensor module for detecting airflow related to the user's breathing, and may obtain sleep sensing data through these.
  • When the user device 10 senses the sleep sound information of the user 39240' who is asleep, it can sense the user's sleep sound information as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time of sensing the sleep measurement start trigger. For example, when starting sleep measurement, the user device 10 may display the user interface 32100'' indicating that sleep measurement is in progress, shown in FIG. 39(b), on the user device display unit 32100' of the user device 10.
  • When the user device 10 senses the sleep sensing data of the user 39240' who is asleep, it can sense the user's sleep sensing data as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time of sensing the sleep measurement start trigger. For example, when starting sleep measurement, the user device 10 may display the user interface 32100'' indicating that sleep measurement is in progress, shown in FIG. 40(b), on the user device display unit 32100' of the user device 10.
  • When the user's desired sleep time is set to 10 p.m., if, after 10 p.m., the user device 10 senses sleep sound information of the sleeping user 39240', the user device 10 may sense that sleep sound information as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time of sensing the sleep measurement start trigger. For example, when starting sleep measurement, the user device 10 may display the user interface 32100'' indicating that sleep measurement is in progress, shown in FIG. 40(b), on the user device display unit 32100' of the user device 10.
  • FIG. 41 is a diagram illustrating that when sleep sound information is sensed by a user device in a sleep mode according to an embodiment of the present invention, the time at which the sleep sound information is sensed is recognized as the sensing time of the sleep measurement start trigger.
  • A method for initiating sleep measurement through a user device may include: a sensing step of sensing information related to the user; a start decision information generating step of determining a time to start sleep measurement based on the sensed user-related information; and a start step of starting sleep measurement at the determined start time.
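  • Purely as an illustrative sketch of the three steps above (all field and function names are hypothetical, not the claimed method), the flow can be outlined as follows:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensedInfo:
    """User-related information from the sensing step (hypothetical fields)."""
    timestamp: float       # when the information was sensed (epoch seconds)
    is_sleep_sound: bool   # e.g., sounds classified as sleep breathing/movement
    is_device_use: bool    # e.g., tapping, display turned on, voice input

def decide_start_time(sensed: SensedInfo, grace_s: float = 300.0) -> Optional[float]:
    """Start decision step: derive a sleep-measurement start time, if any."""
    if sensed.is_sleep_sound:
        return sensed.timestamp            # sleep sound itself is the start trigger
    if not sensed.is_device_use:
        return sensed.timestamp + grace_s  # start once the idle period elapses
    return None                            # user is active: no trigger yet

def start_measurement(start_time: Optional[float]) -> None:
    """Start step: begin sleep measurement at the determined start time."""
    if start_time is not None:
        print(f"sleep measurement starts at t={start_time}")

start_measurement(decide_start_time(SensedInfo(0.0, False, False)))  # t=300.0
```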
  • the user device 10 may sense the user's sleep sound information.
  • the user device 10 may receive the user's sleep sound information (SS), and the user's sleep sound information (SS) may include environmental sensing information.
  • Environmental sensing information may include information acquired in a non-invasive manner during the user's activities or sleep in the space where the user is located.
  • environmental sensing information may include sounds generated as the user tosses and turns during sleep, sounds related to muscle movements, or sounds related to the user's breathing during sleep.
  • the environmental sensing information may include movement and distance information related to the user's movement during sleep, and breathing information generated based on this.
  • the environmental sensing information may include sleep sound information (SS), and the sleep sound information (SS) may mean sound information related to movement patterns and breathing patterns that occur during the user's sleep.
  • the environmental sensing information may include sleep movement information, and the sleep movement information may mean information related to movement patterns and breathing patterns that occur during the user's sleep.
  • When the user device 10 senses the sleep sound information of the user 39240' who is asleep, it can sense the user's sleep sound information as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time when the sleep measurement start trigger is sensed. For example, when starting sleep measurement, the user device 10 may display the user interface 32100'' indicating that sleep measurement is in progress, shown in FIG. 40(b), on the user device display unit 32100' of the user device 10.
  • When the user device 10 senses the sleep sensing data of the user 39240' who is asleep, it can sense the user's sleep sensing data as a sleep measurement start trigger. Accordingly, the user device 10 may start sleep measurement at the time when the sleep measurement start trigger is sensed. For example, when starting sleep measurement, the user device 10 may display the user interface 32100'' indicating that sleep measurement is in progress, shown in FIG. 41(b), on the user device display unit 32100' of the user device 10.
  • FIG. 41 is a diagram illustrating that, in an embodiment of the present invention, when unlocking of the lock of the user device is sensed, the time at which the unlocking is sensed is recognized as the sensing time of the sleep measurement start trigger. This embodiment may be applied in the same way as the above-described embodiments.
  • FIG. 42 is a diagram showing that, in an embodiment of the present invention, if finger tapping is not sensed by the user device during sleep mode, the point at which a predetermined time has elapsed from the last sensed finger tapping is recognized as the sensing time of the sleep measurement start trigger.
  • A method for initiating sleep measurement through a user device may include: a sensing step of sensing information related to the user; a start decision information generating step of determining a time to start sleep measurement based on the sensed user-related information; and a start step of starting sleep measurement at the determined start point.
  • In the start decision information generation step, if information indicating that the user is using the user device is not sensed for a predetermined period of time during a period set to a restricted mode in which the output of information indicating the occurrence of events related to applications operating on the user device is limited, the time at which the predetermined period ends may be determined as the start time of the sleep measurement.
  • The information indicating that the user is using the user device may be at least one of: the user's exercise or movement detected by the user device, a signal that the display of the user device is turned on, the user's voice input to the user device, or the user's tapping detected on the display of the user device.
  • The sleep mode operation user interface S may be a user interface that appears on the user device display unit 32100' while the device is in a restricted mode in which the output of information indicating the occurrence of events related to applications running on the user device 10 is limited. For example, if the user intends to go to bed soon, a sleep mode, bedtime mode, falling-asleep mode, or the like can be set on the user device 10, and while such a mode is set, the output of information indicating the occurrence of events related to applications running on the user device 10 can be restricted.
  • For example, the illuminance of the user device display unit 32100' can be kept below a predetermined output, the sound of the user device 10 can be kept below a predetermined output, and alarms of an application functioning as a messenger can be blocked, but the restriction is not limited to these. In this case, if the user is not asleep during the time when the sleep mode operation user interface S appears, the finger tapping 42241 of the user who is not asleep may be detected on the user device display unit 32100', and the user interface 32100'' indicating that sleep is being measured may not appear on the user device display unit 32100' of the user device 10.
  • A sleep mode, bedtime mode, falling-asleep mode, or the like may limit the output of information indicating the occurrence of events related to applications operating on the user device 10.
  • For example, the illuminance of the user device display unit 32100' can be kept below a predetermined output, the sound of the user device 10 can be kept below a predetermined output, and alarms of an application functioning as a messenger can be blocked, but the restriction is not limited to these.
  • When the user is asleep, the finger 42241' of the sleeping user is not detected on the user device display unit 32100', so a user interface 32100'' indicating that sleep is being measured may appear on the user device display unit 32100' of the user device 10. More specifically, if the finger tapping 42241 of the user who is not asleep is not detected again within a predetermined time, for example 5 minutes, from the time at which the last finger tapping 42241 was detected, the user device 10 can determine the end of that predetermined time, i.e., the point 5 minutes after the last finger tapping 42241 was detected, as the start time of the sleep measurement.
  • However, even if the finger tapping 42241 of the user who is not asleep is not detected again within the predetermined time from the last detected finger tapping 42241, when the user is plainly using the user device 10, such as when a video is being played on the user device display unit 32100', the user interface 32100'' indicating that sleep is being measured may not appear on the user device display unit 32100' of the user device 10.
  • Likewise, when the finger 42241' of the sleeping user is not detected on the user device display unit 32100', a user interface 32100'' indicating that sleep is being measured may appear on the user device display unit 32100' of the user device 10. At this time, if the finger tapping 42241 of a user who is not asleep is sensed again on the user device display unit 32100', the user device 10 that is measuring sleep can sense a sleep measurement interruption trigger at the time when the finger tapping 42241 is sensed again, and the sleep measurement can be stopped at the point when the sleep measurement interruption trigger is sensed. A sketch of this start/interruption logic is given below.
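  • A minimal sketch of the start and interruption triggers described above, assuming the 5-minute window from the example (the class and method names are hypothetical; the same logic applies when the activity source is motion, display-on events, or voice instead of taps):

```python
import time

IDLE_WINDOW_S = 5 * 60  # predetermined time from the last detected activity

class SleepMeasurementTrigger:
    """Derives the sleep-measurement start and interruption triggers from
    user-activity events such as finger taps on the display."""

    def __init__(self, idle_window_s: float = IDLE_WINDOW_S):
        self.idle_window_s = idle_window_s
        self.last_activity = time.monotonic()
        self.measuring = False

    def on_activity(self) -> None:
        """Called on any sign of use (e.g., a finger tap is sensed again)."""
        if self.measuring:
            self.measuring = False  # activity during measurement = interruption trigger
        self.last_activity = time.monotonic()

    def tick(self) -> bool:
        """Called periodically; returns True while sleep is being measured."""
        idle = time.monotonic() - self.last_activity
        if not self.measuring and idle >= self.idle_window_s:
            self.measuring = True  # end of the idle window = measurement start time
        return self.measuring
```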
  • In FIG. 42, an embodiment of a sleep measurement start trigger that is sensed after 'sleep mode, etc.' is set has been described.
  • this embodiment is not limited to the sleep measurement start trigger being sensed during sleep mode.
  • The sleep measurement start trigger according to the above embodiment may also be sensed after the time set as the user's desired sleep time, and, even when neither the user's desired sleep time nor a sleep mode is set, the above-described sleep measurement start trigger may likewise be sensed.
  • FIG. 43 is a diagram showing that, in an embodiment of the present invention, if the user's exercise or movement is not sensed by the user device during sleep mode, the point at which a predetermined time has elapsed from the last sensed exercise or movement is recognized as the sensing time of the sleep measurement start trigger.
  • A method for initiating sleep measurement through a user device may include: a sensing step of sensing information related to the user; a start decision information generating step of determining a time to start sleep measurement based on the sensed user-related information; and a start step of starting sleep measurement at the determined start point.
  • In the start decision information generation step, if information indicating that the user is using the user device is not sensed for a predetermined period of time during a period set to a restricted mode in which the output of information indicating the occurrence of events related to applications operating on the user device is limited, the time at which the predetermined period ends may be determined as the start time of the sleep measurement.
  • The information indicating that the user is using the user device may be at least one of: the user's exercise or movement detected by the user device, a signal that the display of the user device is turned on, the user's voice input to the user device, or the user's tapping detected on the display of the user device.
  • While the user is not asleep, a finger tapping 43241 by the user who is not asleep, or the user's use of the user device 10, may be detected.
  • The accelerometer sensor of the user device 10 may detect the user's exercise or movement, and, as another specific example, a gyro sensor may detect the user's exercise or movement; in this case, unlike the accelerometer, which measures acceleration, the gyro sensor measures angular velocity. An illustrative sketch of this distinction is given below.
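  • As a hypothetical illustration (the thresholds are invented for the sketch, not taken from the disclosure), motion can be flagged when either the accelerometer magnitude deviates from gravity or the gyroscope reports non-trivial angular velocity:

```python
import math

# Hypothetical thresholds; real values would be tuned per device.
ACCEL_THRESHOLD = 0.3   # m/s^2 deviation from the gravity-only magnitude
GYRO_THRESHOLD = 0.2    # rad/s angular-velocity magnitude
GRAVITY = 9.81

def is_moving(accel_xyz, gyro_xyz) -> bool:
    """Flag user motion from one accelerometer sample (acceleration, m/s^2)
    and one gyroscope sample (angular velocity, rad/s)."""
    accel_mag = math.sqrt(sum(a * a for a in accel_xyz))
    gyro_mag = math.sqrt(sum(g * g for g in gyro_xyz))
    # At rest the accelerometer reads ~gravity and the gyro reads ~0.
    return abs(accel_mag - GRAVITY) > ACCEL_THRESHOLD or gyro_mag > GYRO_THRESHOLD

print(is_moving((0.0, 0.1, 9.8), (0.01, 0.0, 0.02)))  # False: effectively at rest
print(is_moving((1.5, 0.2, 9.8), (0.5, 0.1, 0.0)))    # True: angular motion detected
```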
  • If the user's exercise or movement is then no longer detected, the user device 10 recognizes the point at which a predetermined time has elapsed from the time the last exercise or movement was detected as the sensing time of the sleep measurement start trigger, and sleep measurement can be started.
  • At this time, the user device 10 may display a user interface 32100'' indicating that sleep measurement is in progress on the user device display unit 32100' of the user device 10.
  • Likewise, if neither a finger tapping 43241 by a user who is not asleep nor the user's use of the user device 10 is detected, and the accelerometer sensor of the user device 10 does not detect the user's exercise or movement for a predetermined period of time, the point at which the predetermined period has elapsed from the time the last exercise or movement was detected is recognized as the sensing time of the sleep measurement start trigger, and sleep measurement can be started.
  • In addition, the user device 10 that is measuring sleep may sense a sleep measurement interruption trigger at the point when the user's exercise or movement is sensed again, and the sleep measurement can be stopped at the point when the sleep measurement interruption trigger is sensed.
  • In FIG. 43, an example of a sleep measurement start trigger that is sensed after 'sleep mode, etc.' is set has been described.
  • this embodiment is not limited to the sleep measurement start trigger being sensed during sleep mode.
  • The sleep measurement start trigger according to the above embodiment may also be sensed after the time set as the user's desired sleep time, and, even when neither the user's desired sleep time nor a sleep mode is set, the above-described sleep measurement start trigger may likewise be sensed.
  • FIG. 44 is a diagram showing that, in an embodiment of the present invention, when the display of the user device is turned off during sleep mode, the point at which a predetermined time has elapsed from the time the display is turned off is recognized as the sensing time of the sleep measurement start trigger.
  • While the user device display unit 32100' is turned on and the user device display on screen 32100'a is displayed, the user interface 32100'' indicating that sleep measurement is in progress may not appear on the user device display unit 32100' of the user device 10.
  • When the user device display unit 32100' is turned off, the user device display off screen 32100'b may appear. In this case, since information indicating that the user is using the user device 10 is not sensed for a predetermined period of time, the end of the predetermined period may be determined as the start point of the sleep measurement.
  • For example, since the user 39240' who is asleep does not use the user device 10, if the user device display on screen 32100'a does not appear within 5 minutes of the user device display off screen 32100'b last appearing, the point 5 minutes after the user device display off screen 32100'b last appeared can be determined as the start time of the sleep measurement.
  • In FIG. 44, an embodiment of a sleep measurement start trigger that is sensed after 'sleep mode, etc.' is set has been described.
  • this embodiment is not limited to the sleep measurement start trigger being sensed during sleep mode.
  • The sleep measurement start trigger according to the above embodiment may also be sensed after the time set as the user's desired sleep time, and, even when neither the user's desired sleep time nor a sleep mode is set, the above-described sleep measurement start trigger may likewise be sensed.
  • FIG. 45 is a diagram showing that, in an embodiment of the present invention, when the user's voice is not recognized by the user device during sleep mode, the point at which a predetermined time has elapsed from the time the user's voice was last recognized is recognized as the sensing time of the sleep measurement start trigger.
  • FIG. 46A is a diagram illustrating a graphical user interface showing a comparison between the average sleep latency of the general public and the average sleep latency of the user, according to an embodiment of the present invention.
  • The graphical user interface representing the comparison between the average sleep latency of the general public and the average sleep latency of the user may provide a sleep comparison title user interface 46100 and a first information delivery user interface 46200A regarding sleep latency.
  • The sleep comparison title user interface 46100 can inform the user that sleep analysis information about the user's sleep is provided in an easy-to-understand form.
  • the sleep comparison title user interface 46100 may be provided in the form of phrases such as “My sleep is easy to understand.” As another example, it may be provided in the form of a phrase such as “Compare my sleep with others,” and may be provided in the form of a phrase such as “Compare my sleep,” but is not limited thereto.
  • The first information delivery user interface 46200A regarding sleep latency may provide the user with information about the user's sleep latency, that is, information about how similar the user's sleep latency is to the average sleep latency of others.
  • a specific example may be provided in the form of a phrase such as, “Close your eyes and fall asleep within 30 minutes.”
  • it may be provided as a way to give positive feedback about sleep, such as "I fell asleep earlier than the average person! It's great!”, but it is not limited to this.
  • The graphical user interface representing the comparison between the average sleep latency of the general public and the average sleep latency of the user may be provided in a manner that provides information that one of a sleep latency comparison check button user interface 46310, a deep sleep proportion comparison check button user interface 46320, and a REM sleep proportion comparison check button user interface 46330 has been selected; however, it is not limited to these, and other types of comparison check button user interfaces, such as a number-of-awakenings comparison check button user interface or a light sleep comparison check button user interface, may be provided.
  • The sleep latency comparison check button user interface 46310 may be in a form that provides a phrase such as “to fall asleep,” a phrase such as “time to fall asleep,” or a phrase such as “how long does it take to fall asleep?”, but is not limited to these.
  • A graphical user interface representing a comparison between the average sleep latency of the general public and the average sleep latency of the user may include a user sleep latency information user interface 46400A and a general public average sleep latency information user interface 46410A.
  • The user sleep latency information user interface 46400A may be provided as a bar graph that compares the user's sleep latency, based on the sleep data of the report date selected by the user, with the general public average sleep latency information user interface 46410A.
  • Since each bar graph has a length proportional to its data value, comparing the bar length of the user sleep latency information user interface 46400A with that of the general public average sleep latency information user interface 46410A makes it easy to convey which is longer, the user's sleep latency or the general public's average sleep latency.
  • The general public average sleep latency information user interface 46410A may be in a form that provides medically obtained average sleep latency information for the general public.
  • The general public average sleep latency information user interface 46410A may be in a form that provides medically recommended average sleep latency information for the general public.
  • The general public average sleep latency information user interface 46410A may be in a form that provides statistically obtained average sleep latency information for the general public. Additionally, it may be in the form of providing average sleep latency information for the general public obtained by analyzing other people's sleep state information.
  • The average sleep latency information for the general public obtained by analyzing other people's sleep state information may be the overall average sleep latency statistically obtained through the service, the overall average sleep latency of a specific group within the data statistically obtained through the service, medically recommended sleep latency information, or average sleep latency information obtained from the user's own past data, but is not limited to these.
  • The average sleep latency information for the general public obtained by analyzing other people's sleep state information may be information obtained based on the other person's activity record for that day. Specific examples include whether they drank coffee that day and, if so, how much; whether they drank alcohol that day and, if so, how much; and whether they smoked that day and, if so, how much, but the information is not limited to these.
  • The average sleep latency information for the general public obtained by analyzing other people's sleep state information may also be information obtained based on other people's financial data.
  • information acquired based on another person's financial data information may be information obtained based on another person's assets or information obtained based on another person's salary, but is not limited thereto.
  • The overall average sleep latency information for a specific group may be, for example, the average sleep latency of men, the average sleep latency of people in their 30s, the average sleep latency of people in their 30s sleeping on a weekend, or the average sleep latency of people who drank coffee, and it may be a combination of two or more of these, but this is only an example and is not limited thereto. A hypothetical sketch of deriving such a group average is given below.
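  • As a hypothetical sketch of how such a group average could be derived (the record fields and values are invented for illustration), attribute filters can be combined freely:

```python
from statistics import mean

# Hypothetical records of other users' sleep latency (minutes) with attributes.
records = [
    {"latency": 18, "gender": "M", "age_group": "30s", "weekend": True,  "coffee": False},
    {"latency": 35, "gender": "M", "age_group": "30s", "weekend": True,  "coffee": True},
    {"latency": 22, "gender": "F", "age_group": "50s", "weekend": False, "coffee": False},
]

def group_average_latency(records, **criteria) -> float:
    """Average sleep latency over records matching every given attribute,
    so criteria can be combined (e.g., men AND in their 30s AND on a weekend)."""
    matched = [r["latency"] for r in records
               if all(r.get(k) == v for k, v in criteria.items())]
    return mean(matched) if matched else float("nan")

print(group_average_latency(records, gender="M", age_group="30s", weekend=True))  # 26.5
```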
  • A graphical user interface representing a comparison between the average sleep latency of the general public and the average sleep latency of the user may include a user sleep latency emoticon information user interface 46400A'.
  • The user sleep latency emoticon information user interface 46400A' may express, as an “emoticon,” the evaluation obtained by comparing the received average sleep data of others with the acquired sleep state information of the user.
  • For example, when the bar of the user sleep latency information user interface 46400A is drawn longer than the bar of the general public average sleep latency information user interface 46410A, the user sleep latency emoticon information user interface 46400A' may display a crying emoticon, easily conveying to the user that, with respect to sleep latency, their sleep quality is worse than the general public average.
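  • A small illustrative sketch of the comparison just described (the function name, pixel scale, and labels are assumptions): bar lengths are made proportional to the latency values, and the emoticon follows from which bar is longer.

```python
def latency_comparison_view(user_latency_min: float,
                            reference_latency_min: float,
                            max_bar_px: int = 200) -> dict:
    """Build the data behind the comparison UI: bar lengths proportional to the
    latency values and an emoticon reflecting the evaluation (hypothetical)."""
    longest = max(user_latency_min, reference_latency_min)
    scale = max_bar_px / longest if longest else 0
    return {
        "user_bar_px": round(user_latency_min * scale),
        "reference_bar_px": round(reference_latency_min * scale),
        # A shorter latency than the reference reads as better sleep onset.
        "emoticon": "smiling" if user_latency_min < reference_latency_min else "crying",
    }

# e.g., the user fell asleep in 20 min vs. a 30 min reference average
print(latency_comparison_view(20, 30))
# {'user_bar_px': 133, 'reference_bar_px': 200, 'emoticon': 'smiling'}
```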
  • A graphical user interface representing a comparison between the average sleep latency of the general public and the average sleep latency of the user may include a second information delivery user interface 46500A regarding sleep latency.
  • The second information delivery user interface 46500A regarding sleep latency may be provided in the form of conveying medical information about sleep latency, such as “Alcohol consumption may help you fall asleep quickly, but it may prevent you from falling asleep deeply and actually reduce the quality of your sleep.” It may also be provided in the form of delivering medical information about sleep, such as “Tryptophan in milk helps you sleep well. If it takes a long time to fall asleep, how about drinking warm milk?”, but is not limited to these.
  • FIG. 46B is a diagram illustrating a graphical user interface showing a comparison between the user's average sleep latency and the average sleep latency of a specific age group and a specific gender, according to an embodiment of the present invention.
  • A graphical user interface representing a comparison between the user's average sleep latency and the average sleep latency of a specific age group and a specific gender may provide a sleep comparison title user interface 46100 and a first information delivery user interface 46200A regarding sleep latency.
  • The sleep comparison title user interface 46100 can inform the user that sleep analysis information about the user's sleep is provided in an easy-to-understand form.
  • the sleep comparison title user interface 46100 may be provided in the form of phrases such as “My sleep is easy to understand.” As another example, it may be provided in the form of a phrase such as “Compare my sleep with others,” and may be provided in the form of a phrase such as “Compare my sleep,” but is not limited thereto.
  • The first information delivery user interface 46200A regarding sleep latency may provide the user with information about the user's sleep latency, that is, information about how similar the user's sleep latency is to the average sleep latency of others.
  • a specific example may be provided in the form of a phrase such as, “Close your eyes and fall asleep within 30 minutes.”
  • it may be provided as a way to give positive feedback about sleep, such as "I fell asleep earlier than the average for women in their 50s! It's great!”, but it is not limited to this.
  • A graphical user interface representing a comparison between the user's average sleep latency and the average sleep latency of a specific age group and a specific gender may be provided in a manner that provides information that one of a sleep latency comparison check button user interface 46310, a deep sleep proportion comparison check button user interface 46320, and a REM sleep proportion comparison check button user interface 46330 has been selected; however, it is not limited to these, and other types of comparison check button user interfaces, such as a number-of-awakenings comparison check button user interface or a light sleep comparison check button user interface, may be provided.
  • The sleep latency comparison check button user interface 46310 may be in a form that provides a phrase such as “to fall asleep,” a phrase such as “time to fall asleep,” or a phrase such as “how long does it take to fall asleep?”, but is not limited to these.
  • A graphical user interface representing a comparison between the user's average sleep latency and the average sleep latency of a specific age group and a specific gender may include a user sleep latency information user interface 46400A and an average sleep latency information user interface 46411A for a specific age group and a specific gender. Specifically, the user sleep latency information user interface 46400A may be provided as a bar graph that compares the user's sleep latency, based on the sleep data of the report date selected by the user, with the average sleep latency information user interface 46411A for a specific age group and a specific gender.
  • Since each bar graph has a length proportional to its data value, comparing the bar length of the user sleep latency information user interface 46400A with that of the average sleep latency information user interface 46411A for a specific age group and a specific gender makes it easy to convey which sleep latency is longer.
  • The average sleep latency information user interface 46411A for a specific age group and a specific gender may be in a form that provides medically obtained average sleep latency information for a specific age group and a specific gender.
  • The average sleep latency information user interface 46411A for a specific age group and a specific gender may be in a form that provides medically recommended average sleep latency information for a specific age group and a specific gender.
  • The average sleep latency information user interface 46411A for a specific age group and a specific gender may be in a form that provides statistically obtained average sleep latency information for a specific age group and a specific gender. Additionally, it may be in the form of providing average sleep latency information for a specific age group and a specific gender obtained by analyzing other people's sleep state information.
  • The average sleep latency information for a specific age group and a specific gender obtained by analyzing other people's sleep state information may be the overall average sleep latency statistically obtained through the service, the overall average sleep latency of a specific group within the data statistically obtained through the service, medically recommended sleep latency information, or average sleep latency information obtained from the user's own past data, but is not limited to these.
  • The average sleep latency information for a specific age group and a specific gender obtained by analyzing other people's sleep state information may be information obtained based on the other person's activity record for that day. Specific examples include whether they drank coffee that day and, if so, how much; whether they drank alcohol that day and, if so, how much; and whether they smoked that day and, if so, how much, but the information is not limited to these.
  • The average sleep latency information for the general public obtained by analyzing other people's sleep state information may also be information obtained based on other people's financial data.
  • information acquired based on another person's financial data information may be information obtained based on another person's assets or information obtained based on another person's salary, but is not limited thereto.
  • The overall average sleep latency information for a specific group may be, for example, the average sleep latency of men, the average sleep latency of people in their 30s, the average sleep latency of people in their 30s sleeping on a weekend, or the average sleep latency of people who drank coffee, and it may be a combination of two or more of these, but this is only an example and is not limited thereto.
  • A graphical user interface representing a comparison between the user's average sleep latency and the average sleep latency of a specific age group and a specific gender may include a user sleep latency emoticon information user interface 46400A'.
  • The user sleep latency emoticon information user interface 46400A' may express, as an “emoticon,” the evaluation obtained by comparing the received average sleep data of others with the acquired sleep state information of the user.
  • For example, when the bar of the user sleep latency information user interface 46400A is drawn shorter than the bar of the average sleep latency information user interface 46411A for a specific age group and a specific gender, the user sleep latency emoticon information user interface 46400A' may display a smiling emoticon, easily conveying to the user that, with respect to sleep latency, their sleep quality is better than the average for that age group and gender.
  • A graphical user interface representing a comparison between the user's average sleep latency and the average sleep latency of a specific age group and a specific gender may include a second information delivery user interface 46500A regarding sleep latency.
  • The second information delivery user interface 46500A regarding sleep latency may be provided in the form of conveying medical information about sleep latency, such as “Alcohol consumption may help you fall asleep quickly, but it may prevent you from falling asleep deeply and actually reduce the quality of your sleep.” It may also be provided in the form of delivering medical information about sleep, such as “Tryptophan in milk helps you sleep well. If it takes a long time to fall asleep, how about drinking warm milk?”, but is not limited to these.
  • FIG. 46C is a diagram illustrating a graphical user interface showing a comparison between the average sleep latency of a specific occupation and the average sleep latency of the user, according to an embodiment of the present invention.
  • A graphical user interface representing a comparison between the average sleep latency of a specific occupation and the average sleep latency of the user may provide a sleep comparison title user interface 46100 and a first information delivery user interface 46200A regarding sleep latency.
  • The sleep comparison title user interface 46100 can inform the user that sleep analysis information about the user's sleep is provided in an easy-to-understand form.
  • the sleep comparison title user interface 46100 may be provided in the form of phrases such as “My sleep is easy to understand.” As another example, it may be provided in the form of a phrase such as “Compare my sleep with others,” and may be provided in the form of a phrase such as “Compare my sleep,” but is not limited thereto.
  • The first information delivery user interface 46200A regarding sleep latency may provide the user with information about the user's sleep latency, that is, information about how similar the user's sleep latency is to the average sleep latency of others.
  • a specific example may be provided in the form of a phrase such as, “Close your eyes and fall asleep within 30 minutes.”
  • it may be provided as a way to provide positive feedback about sleep, such as, "I fell asleep earlier than the nurse average! It's great!”, but is not limited to this.
  • A graphical user interface representing a comparison between the average sleep latency of a specific occupation and the user's average sleep latency may be provided in a manner that provides information that one of a sleep latency comparison check button user interface 46310, a deep sleep proportion comparison check button user interface 46320, and a REM sleep proportion comparison check button user interface 46330 has been selected; however, it is not limited to these, and other types of comparison check button user interfaces, such as a number-of-awakenings comparison check button user interface or a light sleep comparison check button user interface, may be provided.
  • The sleep latency comparison check button user interface 46310 may be in a form that provides a phrase such as “to fall asleep,” a phrase such as “time to fall asleep,” or a phrase such as “how long does it take to fall asleep?”, but is not limited to these.
  • A graphical user interface representing a comparison between the average sleep latency of a specific occupation and the average sleep latency of the user may include a user sleep latency information user interface 46400A and an average sleep latency information user interface 46412A for a specific occupation.
  • The user sleep latency information user interface 46400A may be provided as a bar graph that compares the user's sleep latency, based on the sleep data of the report date selected by the user, with the average sleep latency information user interface 46412A for a specific occupation.
  • Since each bar graph has a length proportional to its data value, comparing the bar length of the user sleep latency information user interface 46400A with that of the average sleep latency information user interface 46412A for a specific occupation makes it easy to convey which is longer, the user's sleep latency or the specific occupation's average sleep latency.
  • The average sleep latency information user interface 46412A for a specific occupation may be in a form that provides medically obtained average sleep latency information for a specific occupation.
  • The average sleep latency information user interface 46412A for a specific occupation may be in a form that provides medically recommended average sleep latency information for a specific occupation.
  • The average sleep latency information user interface 46412A for a specific occupation may be in a form that provides statistically obtained average sleep latency information for a specific occupation. Additionally, it may be in the form of providing average sleep latency information for a specific occupation obtained by analyzing other people's sleep state information.
  • The average sleep latency information for a specific occupation obtained by analyzing other people's sleep state information may be the average sleep latency for that occupation statistically obtained through the service, the overall average sleep latency for a specific occupation within the data statistically obtained through the service, medically recommended sleep latency information, or average sleep latency information obtained from the user's own past data, but is not limited to these.
  • The average sleep latency information for a specific occupation obtained by analyzing other people's sleep state information may be information obtained based on the other person's activity record for that day. Specific examples include whether they drank coffee that day and, if so, how much; whether they drank alcohol that day and, if so, how much; and whether they smoked that day and, if so, how much, but the information is not limited to these.
  • The average sleep latency information for the general public obtained by analyzing other people's sleep state information may also be information obtained based on other people's financial data.
  • information acquired based on another person's financial data information may be information obtained based on another person's assets or information obtained based on another person's salary, but is not limited thereto.
  • The overall average sleep latency information for a specific group may be, for example, the average sleep latency of men, the average sleep latency of people in their 30s, the average sleep latency of people in their 30s sleeping on a weekend, or the average sleep latency of people who drank coffee, and it may be a combination of two or more of these, but this is only an example and is not limited thereto.
  • A graphical user interface representing a comparison between the average sleep latency of a specific occupation and the average sleep latency of the user may include a user sleep latency emoticon information user interface 46400A'.
  • The user sleep latency emoticon information user interface 46400A' may express, as an “emoticon,” the evaluation obtained by comparing the received average sleep data of others with the acquired sleep state information of the user.
  • A graphical user interface representing a comparison between the average sleep latency of a specific occupation and the average sleep latency of the user may include a second information delivery user interface 46500A regarding sleep latency.
  • The second information delivery user interface 46500A regarding sleep latency may be provided in the form of conveying medical information about sleep latency, such as “Alcohol consumption may help you fall asleep quickly, but it actually reduces the quality of your sleep by preventing you from falling asleep deeply.” It may also be provided in the form of delivering medical information about sleep, such as “Tryptophan in milk helps you sleep well. If it takes a long time to fall asleep, how about drinking warm milk?”, but is not limited to these.
  • FIG. 46D is a diagram illustrating a graphical user interface showing a comparison between the user's average sleep latency and the average sleep latency of a specific age group, a specific gender, and a specific occupation, according to an embodiment of the present invention.
  • A graphical user interface representing a comparison between the user's average sleep latency and the average sleep latency of a specific age group, a specific gender, and a specific occupation may provide a sleep comparison title user interface 46100 and a first information delivery user interface 46200A regarding sleep latency.
  • The sleep comparison title user interface 46100 can inform the user that sleep analysis information about the user's sleep is provided in an easy-to-understand form.
  • the sleep comparison title user interface 46100 may be provided in the form of phrases such as “My sleep is easy to understand.” As another example, it may be provided in the form of a phrase such as “Compare my sleep with others,” and may be provided in the form of a phrase such as “Compare my sleep,” but is not limited thereto.
  • The first information delivery user interface 46200A regarding sleep latency may provide the user with information about the user's sleep latency, that is, information about how similar the user's sleep latency is to the average sleep latency of others.
  • a specific example could be given in the form of a phrase such as, “Close your eyes and fall asleep within 30 minutes.”
  • it may be provided as a way to give positive feedback about sleep, such as "I fell asleep earlier than the average for a female nurse in her 50s! It's great!”, but it is not limited to this.
  • A graphical user interface representing a comparison between the user's average sleep latency and the average sleep latency of a specific age group, a specific gender, and a specific occupation may be provided in a manner that provides information that one of a sleep latency comparison check button user interface 46310, a deep sleep proportion comparison check button user interface 46320, and a REM sleep proportion comparison check button user interface 46330 has been selected; however, it is not limited to these, and other types of comparison check button user interfaces, such as a number-of-awakenings comparison check button user interface or a light sleep comparison check button user interface, may be provided.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Psychiatry (AREA)
  • Anesthesiology (AREA)
  • Psychology (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Evolutionary Computation (AREA)
  • Signal Processing (AREA)
  • Pain & Pain Management (AREA)
  • Acoustics & Sound (AREA)
  • Hematology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to a method for providing a graphical user interface representing an evaluation of a user's sleep, namely a method for generating one or more graphical user interfaces representing an evaluation of a user's sleep, the method comprising: a sleep information acquisition step of acquiring sleep information from one or more sleep information sensor devices, the sleep information including sleep sound information of a user; a sleep expression generation step of generating, on the basis of the acquired sleep information, a sleep expression comprising at least two words representing an evaluation of the user's sleep; and a step of displaying a graphical user interface comprising the generated expression.
PCT/KR2023/016512 2022-11-01 2023-10-23 Procédé pour fournir une interface utilisateur graphique représentant des informations ou une évaluation du sommeil d'un utilisateur WO2024096419A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020237044063A KR20240065214A (ko) 2022-11-01 2023-10-23 사용자의 수면에 관한 정보 또는 평가를 나타내는 그래픽 사용자 인터페이스 제공 방법
KR1020237041712A KR20240065211A (ko) 2022-11-01 2023-10-23 사용자의 수면에 관한 정보 또는 평가를 나타내는 그래픽사용자 인터페이스 제공 방법
KR1020237044152A KR20240065215A (ko) 2022-11-01 2023-10-23 사용자의 수면에 관한 정보 또는 평가를 나타내는 그래픽 사용자 인터페이스 제공 방법

Applications Claiming Priority (24)

Application Number Priority Date Filing Date Title
KR10-2022-0143598 2022-11-01
KR20220143598 2022-11-01
KR20230038894 2023-03-24
KR10-2023-0038894 2023-03-24
KR10-2023-0063097 2023-05-16
KR20230063097 2023-05-16
KR20230068443 2023-05-26
KR10-2023-0068443 2023-05-26
KR10-2023-0071935 2023-06-02
KR20230071935 2023-06-02
KR10-2023-0073194 2023-06-07
KR20230073194 2023-06-07
KR10-2023-0081202 2023-06-23
KR20230081202 2023-06-23
KR20230090108 2023-07-11
KR10-2023-0090108 2023-07-11
KR20230098507 2023-07-27
KR10-2023-0098507 2023-07-27
KR10-2023-0103710 2023-08-08
KR20230103710 2023-08-08
KR10-2023-0107174 2023-08-16
KR20230107174 2023-08-16
KR10-2023-0141493 2023-10-20
KR20230141493 2023-10-20

Publications (1)

Publication Number Publication Date
WO2024096419A1 true WO2024096419A1 (fr) 2024-05-10

Family

ID=90930908

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/016512 WO2024096419A1 (fr) 2022-11-01 2023-10-23 Procédé pour fournir une interface utilisateur graphique représentant des informations ou une évaluation du sommeil d'un utilisateur

Country Status (2)

Country Link
KR (1) KR20240065211A (fr)
WO (1) WO2024096419A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016042914A * 2014-08-20 2016-04-04 Kita Denshi Corporation Sleep information display control program, sleep information display device, and sleep information display method
KR102406157B1 * 2015-09-03 2022-06-10 Samsung Electronics Co., Ltd. User terminal and sleep management method
KR20180014417A * 2016-07-29 2018-02-08 Yonsei University Wonju Industry-Academic Cooperation Foundation Apparatus and method for sleep/wake classification of patients with sleep-disordered breathing using nasal pressure signals
JP2019063200A * 2017-09-29 2019-04-25 Sleep Health Research Institute (NPO) Respiration evaluation system, analysis system, and program
KR102440214B1 * 2020-09-10 2022-09-05 Industry-Academic Cooperation Foundation of Gyeongsang National University Method for improving sleep quality using biometric information and system therefor

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Tran, Hai Hong. "Prediction of Sleep Stages Via Deep Learning Using Smartphone Audio Recordings in Home Environments: Model Development and Validation." Journal of Medical Internet Research, vol. 25, 1 June 2023, e46216. ISSN 1438-8871; XP093159265. DOI: 10.2196/46216 *
Hong, Joonki, Hai Tran, Jinhwan Jeong, Hyeryung Jang, In-Young Yoon, Jung Kyung Hong, and Jeong-Whun Kim. "0348 Sleep Staging Using End-to-End Deep Learning Model Based on Nocturnal Sound for Smartphones." Sleep, vol. 45, suppl. 1, 25 May 2022, p. A156. XP093131680. DOI: 10.1101/2021.10.13.21264974 *
Hong, Jung Kyung, Taeyoung Lee, Roben Deocampo Delos Reyes, Joonki Hong, Hai Hong Tran, Dongheon Lee, Jinhwan Jung, and In-Young Yoon. "Confidence-Based Framework Using Deep Learning for Automated Sleep Stage Scoring." Nature and Science of Sleep, vol. 13, 2021, pp. 2239-2250. ISSN 1179-1608; XP093131678. DOI: 10.2147/NSS.S333566 *
Kim, Jongmok, Daewoo Kim, Eunsung Cho, Hai Hong Tran, Joonki Hong, Dongheon Lee, Jungkyung Hong, In-Young Yoon, Jeong-Whun Kim, et al. "Sound-Based Sleep Staging by Exploiting Real-World Unlabeled Data." ICLR 2023, 2 March 2023, pp. 1-7. XP093131686 *
Le, Vu Linh, Daewoo Kim, Eunsung Cho, Hyeryung Jang, Roben Delos Reyes, Hyunggug Kim, Dongheon Lee, In-Young Yoon, Joonki Hong, et al. "Real-Time Detection of Sleep Apnea Based on Breathing Sounds and Prediction Reinforcement Using Home Noises: Algorithm Development and Validation." Journal of Medical Internet Research, vol. 25, 22 February 2023, e44818. ISSN 1438-8871; XP093131684. DOI: 10.2196/44818 *

Also Published As

Publication number Publication date
KR20240065211A (ko) 2024-05-14

Similar Documents

Publication Publication Date Title
WO2017146524A1 (fr) 2017-08-31 Apparatus and method for assessing heart failure
WO2022031038A1 (fr) 2022-02-10 Computing device for predicting a sleep state on the basis of data measured in a user's sleep environment
WO2023128713A1 (fr) 2023-07-06 Method, computing device, and computer program for analyzing a user's sleep state through sound information
WO2018182202A1 (fr) 2018-10-04 Electronic device and method for executing an operation of the electronic device
WO2016080804A1 (fr) 2016-05-26 Apparatus for measuring bioelectrical signals
EP3403235A1 (fr) 2018-11-21 Sensor-assisted evaluation of health and rehabilitation
WO2020138624A1 (fr) 2020-07-02 Noise suppression apparatus and method therefor
WO2016200233A1 (fr) 2016-12-15 Method and apparatus for controlling a temperature adjustment device
WO2015137788A1 (fr) 2015-09-17 Electronic apparatus for providing health status information, control method therefor, and computer-readable storage medium
WO2016175622A1 (fr) 2016-11-03 Sound output apparatus, electronic apparatus, and control method therefor
WO2019172667A1 (fr) 2019-09-12 Sleep environment management device using reinforcement learning
WO2017171356A1 (fr) 2017-10-05 Video positioning method, terminal apparatus, and cloud server
WO2019027240A1 (fr) 2019-02-07 Electronic device and method for providing a search result thereof
WO2017078288A1 (fr) 2017-05-11 Electronic device and method for generating a user profile
WO2015126095A1 (fr) 2015-08-27 Electronic device
EP3220815A1 (fr) 2017-09-27 Apparatus for measuring bioelectrical signals
WO2016200204A1 (fr) 2016-12-15 Electronic device and control method thereof
WO2018084576A1 (fr) 2018-05-11 Electronic device and control method therefor
WO2019240513A1 (fr) 2019-12-19 Method and apparatus for providing biometric information by an electronic device
WO2020230933A1 (fr) 2020-11-19 Artificial intelligence device for recognizing a user's voice and method therefor
WO2020222622A1 (fr) 2020-11-05 Breathing meditation induction device combined with earphones for detecting brainwave signals, breathing meditation induction system for displaying and storing brainwave signals by using same, and system for managing brainwave signals through an intermediate manager
EP3523709A1 (fr) 2019-08-14 Electronic device and control method therefor
WO2024096419A1 (fr) 2024-05-10 Method for providing a graphical user interface representing information or an evaluation of a user's sleep
WO2020122293A1 (fr) 2020-06-18 Laundry scheduling apparatus
WO2020141641A1 (fr) 2020-07-09 Sleep induction device

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23886115

Country of ref document: EP

Kind code of ref document: A1