WO2019164145A1 - Electronic device and posture correction method thereof - Google Patents

Electronic device and posture correction method thereof

Info

Publication number
WO2019164145A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
user
posture
data
electronic device
Prior art date
Application number
PCT/KR2019/001166
Other languages
English (en)
Korean (ko)
Inventor
키스겐나디
바실리예프드미트로
쿠덴척세르히
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사
Publication of WO2019164145A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 - Photometry, e.g. photographic exposure meter
    • G01J1/10 - Photometry, e.g. photographic exposure meter by comparison with reference light or electric value (provisionally void)
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 - Testing or monitoring of control systems or parts thereof
    • G05B23/02 - Electric testing or monitoring
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 - Testing or monitoring of control systems or parts thereof
    • G05B23/02 - Electric testing or monitoring
    • G05B23/0205 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0218 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults
    • G05B23/0224 - Process history based detection method, e.g. whereby history implies the availability of large amounts of data
    • G05B23/0227 - Qualitative history assessment, whereby the type of data acted upon, e.g. waveforms, images or patterns, is not relevant, e.g. rule based assessment; if-then decisions
    • G05B23/0235 - Qualitative history assessment, whereby the type of data acted upon, e.g. waveforms, images or patterns, is not relevant, e.g. rule based assessment; if-then decisions based on a comparison with predetermined threshold or range, e.g. "classical methods", carried out during normal operation; threshold adaptation or choice; when or how to compare with the threshold
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/163 - Wearable computers, e.g. on a belt
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725 - Cordless telephones

Definitions

  • The present disclosure relates to an electronic device and a posture correction method thereof and, more particularly, to an electronic device and a control method thereof capable of outputting a message for correcting a user's posture by measuring the distance to an object viewed by the user to protect the user's eyesight, measuring the ambient illuminance, or measuring the tilt of the user's head.
  • The present disclosure addresses the above-described problem and relates to an electronic device and a wearable device that help a user correct his or her posture.
  • An electronic device for correcting a user's posture according to an embodiment may include a communication unit, an output unit, a memory storing at least one instruction, and a processor connected to the communication unit, the output unit, and the memory to control the electronic device. By executing the at least one instruction, the processor establishes a communication connection with a wearable device worn by the user through the communication unit, receives, from the wearable device through the communication unit, sensing data sensed by the wearable device, and, when the received sensing data satisfies a preset condition, controls the output unit to output a message corresponding to the preset condition to guide the user's posture.
  • The sensing data may include at least one of data about the distance between the wearable device and an object detected by a distance measuring sensor of the wearable device, data about an angle detected by a gyro sensor of the wearable device, and data about illuminance detected by an illuminance sensor of the wearable device.
  • When the sensing data is data about the distance detected by the distance measuring sensor, the processor may control the output unit to output information for guiding the user's posture when the detected distance is less than or equal to a first threshold.
  • The processor may match the detected distance with time information, store it in the memory, and control the output unit to output the information for guiding the user's posture when the detected distance remains less than or equal to the first threshold for a preset time.
  • When the sensing data is data about the angle detected by the gyro sensor, the processor may control the output unit to output information for guiding the user's posture when the detected angle is less than or equal to a second threshold.
  • The processor may match the detected angle with time information, store it in the memory, and control the output unit to output information for guiding the user's posture when the detected angle remains less than or equal to the second threshold for a preset time.
  • When the sensing data is data about the illuminance detected by the illuminance sensor, the processor may control the output unit to output information for guiding the user's posture when the detected illuminance is greater than or equal to a third threshold.
  • The processor may receive, through the communication unit, first sensing data obtained from a first sensor that detects the illuminance around the wearable device and second sensing data obtained from a second sensor that detects the illuminance of the area viewed by the user wearing the wearable device, and may control the output unit to output information for guiding the user's posture when the difference between a first illuminance value obtained from the first sensing data and a second illuminance value obtained from the second sensing data is greater than or equal to a fourth threshold.
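As a non-limiting illustration of the condition-to-message mapping summarized above, the following sketch (Python is used here purely for readability) shows how received sensing values might be compared against the four thresholds; the threshold values, parameter names, and message texts are assumptions and are not defined by the disclosure.

```python
# Illustrative only: threshold values, names, and messages are assumptions.
from typing import Optional

FIRST_THRESHOLD_M = 0.4       # assumed minimum viewing distance (metres)
SECOND_THRESHOLD_DEG = 30.0   # assumed head-tilt bound (degrees)
THIRD_THRESHOLD_LUX = 500.0   # assumed ambient-illuminance bound (lux)
FOURTH_THRESHOLD_LUX = 300.0  # assumed ambient/object illuminance gap (lux)

def guidance_message(distance_m: Optional[float] = None,
                     tilt_deg: Optional[float] = None,
                     ambient_lux: Optional[float] = None,
                     object_lux: Optional[float] = None) -> Optional[str]:
    """Return a posture-guidance message when a preset condition is satisfied."""
    if distance_m is not None and distance_m <= FIRST_THRESHOLD_M:
        return "The object in front is too close. Please keep a proper distance."
    if tilt_deg is not None and tilt_deg >= SECOND_THRESHOLD_DEG:
        return "Please maintain a correct posture for your neck health."
    if ambient_lux is not None and ambient_lux >= THIRD_THRESHOLD_LUX:
        return "The surroundings are too bright. Please lower the illuminance."
    if (ambient_lux is not None and object_lux is not None
            and abs(object_lux - ambient_lux) >= FOURTH_THRESHOLD_LUX):
        return "The monitor brightness does not match the surroundings."
    return None  # no preset condition satisfied, so no guidance is output
```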
  • A control method of an electronic device for correcting a user's posture according to an embodiment includes establishing a communication connection with a wearable device worn by the user, receiving sensing data sensed by the wearable device from the wearable device, and, when the received sensing data satisfies a preset condition, outputting a message corresponding to the preset condition to guide the user's posture.
  • The sensing data may include at least one of data about the distance between the wearable device and an object detected by the distance measuring sensor of the wearable device, data about an angle detected by the gyro sensor of the wearable device, and data about illuminance detected by the illuminance sensor of the wearable device.
  • In the outputting, when the sensing data is data about the distance between the wearable device and the object detected by the distance measuring sensor and the detected distance is less than or equal to the first threshold, information for guiding the user's posture may be output.
  • The outputting may include matching the detected distance with time information and outputting the information for guiding the user's posture when the detected distance remains less than or equal to the first threshold for a preset time.
  • The outputting may include outputting information for guiding the user's posture when the sensing data is data about the angle detected by the gyro sensor and the detected angle is less than or equal to the second threshold.
  • The outputting may include matching the detected angle with time information and outputting information for guiding the user's posture when the detected angle remains less than or equal to the second threshold for a preset time.
  • The outputting may include outputting information for guiding the user's posture when the sensing data is data about the illuminance detected by the illuminance sensor and the detected illuminance is greater than or equal to the third threshold.
  • According to the various embodiments described above, the electronic device may output a message for correcting the user's posture by measuring at least one of the distance to an object viewed by the user, the ambient illuminance, and the tilt of the user's neck.
  • FIG. 1 is an exemplary view illustrating the incorrect and correct postures of people in general.
  • FIGS. 2A and 2B are exemplary views illustrating a posture correcting system according to an embodiment of the present disclosure.
  • FIG. 3A is a block diagram illustrating a configuration of an electronic device 100 according to an embodiment of the present disclosure.
  • FIG. 3B is a block diagram illustrating in detail the configuration of the electronic device 100 according to an embodiment of the present disclosure.
  • FIG. 4 is a block diagram illustrating a configuration of a wearable device according to an embodiment of the present disclosure.
  • FIGS. 5A to 5C are exemplary views for describing a method of determining the distance between a user and an object and outputting a message for guiding the user's posture according to the determination result, according to an embodiment of the present disclosure.
  • FIGS. 6A to 6E are exemplary views for describing a method of determining the tilt of a user's neck and outputting a message for guiding the user's posture according to the determination result, according to an embodiment of the present disclosure.
  • FIGS. 7A to 7D are exemplary views for describing a method of determining the illuminance around a user and outputting a message for guiding the user's posture according to the determination result, according to an embodiment of the present disclosure.
  • FIG. 8 is a flowchart illustrating a control method of an electronic device according to an embodiment of the present disclosure.
  • Embodiments of the present disclosure may be variously modified and take various forms, and specific embodiments are illustrated in the drawings and described in detail in the written description. However, this is not intended to limit the scope to the specific embodiments; the disclosure should be understood to include all modifications, equivalents, and substitutes falling within the spirit and scope of the disclosed technology. In describing the embodiments, when it is determined that a detailed description of related known technology may obscure the gist of the disclosure, the detailed description is omitted.
  • Terms such as 'first' and 'second' may be used to describe various components, but the components should not be limited by these terms. The terms are only used to distinguish one component from another.
  • the 'module' or 'unit' performs at least one function or operation, and may be implemented in hardware or software or in a combination of hardware and software.
  • A plurality of modules or units may be integrated into at least one module, except for modules or units that need to be implemented with specific hardware, and may be implemented as at least one processor (not shown).
  • When a part is 'connected' to another part, this includes not only the case where it is 'directly connected' but also the case where it is 'electrically connected' with another element interposed therebetween.
  • When a part is said to 'include' a certain component, this means that it may further include other components rather than excluding other components, unless otherwise stated.
  • An 'application' refers to a set of computer programs designed to perform a specific task.
  • Applications may vary; examples include a game application, a video playback application, a map application, a memo application, a calendar application, a phone book application, a broadcast application, a workout support application, a payment application, a photo folder application, a medical device control application, and an application providing user interfaces for various medical devices, but the application is not limited thereto.
  • FIG. 2A is an exemplary diagram for describing a posture correction system according to an embodiment of the present disclosure.
  • The posture correcting system may be composed of the electronic device 100 and the wearable device 200.
  • the electronic device 100 is configured to determine whether a user's posture is wrong based on the sensing data received from the wearable device 200, and output a message for correcting the posture.
  • The electronic device 100 may be implemented as a smartphone, but this is only an example; it may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a personal digital assistant (PDA), a portable multimedia player (PMP), or an MP3 player.
  • the electronic device 100 may be a home appliance.
  • Home appliances may include, for example, at least one of a television, a digital video disk (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™, PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
  • the wearable device 200 is a component for collecting sensing data.
  • The wearable device 200 may include at least one of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, a contact lens, or a head-mounted device (HMD)), an earphone or headphone type, a fabric- or clothing-integrated type (e.g., an electronic garment), a body-attached type (e.g., a skin pad or tattoo), or a bio-implantable type (e.g., an implantable circuit).
  • the wearable device 200 may be an independent device.
  • the wearable device may be an independent device including a configuration capable of collecting sensing data, and may be attached to various external devices.
  • the wearable device 200 may be configured to be attached to various accessories worn on a user's head, such as glasses, earphones, a headband, and headphones.
  • Hereinafter, the wearable device 200 is described as a device worn on the user's head or attached to a device worn on the user's head, but it is not limited thereto and may be worn on or attached to various positions.
  • FIG. 3A is a block diagram illustrating a configuration of an electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 includes a communication unit 110, an output unit 120, a memory 130, and a processor 140.
  • the communication unit 110 is a component for performing communication with an external device.
  • the electronic device 100 may be connected to the wearable device 200 through the communication unit 110.
  • the electronic device 100 may transmit and receive various data to and from another external device through the communication unit 110.
  • the output unit 120 is a component for outputting various data of the electronic device 100.
  • the electronic device 100 may output a message for correcting the posture of the user through the output unit 120.
  • the message output through the output unit 120 may be a UI screen related to posture correction or an audio signal related to posture correction.
  • the memory 130 may store various programs and data necessary for the operation of the electronic device 100.
  • the memory 130 may be implemented as a nonvolatile memory, a volatile memory, a flash-memory, a hard disk drive (HDD), or a solid state drive (SSD).
  • the memory 130 may include at least one instruction, and the processor 140 may execute at least one instruction.
  • the processor 140 controls the overall operation of the electronic device 100.
  • the processor 140 may control the communication unit 110 to perform a communication connection with the wearable device 200 and to receive sensing data from the wearable device 200.
  • When the received sensing data satisfies a preset condition, the processor 140 may control the output unit 120 to output a message corresponding to the preset condition to guide the user's posture.
  • the sensing data may be data detected by a distance sensor, a gyro sensor, an illumination sensor, or the like included in the wearable device 200.
  • the sensing data may include at least one of distance data from the wearable device 200 to an object, gradient data of the wearable device 200, and peripheral illumination data.
  • The preset condition may vary depending on the type of sensing data. Specifically, when the sensing data is data about the distance between the wearable device 200 and the object detected by the distance measuring sensor of the wearable device 200, the preset condition may be whether the distance between the wearable device 200 and the object is less than or equal to a first threshold. Alternatively, the preset condition may relate to the time during which the distance between the wearable device 200 and the object remains less than or equal to the first threshold, or it may be whether the average distance between the wearable device 200 and the object is less than or equal to the first threshold.
  • When the sensing data is data about the angle detected by the gyro sensor, the preset condition may be whether the tilt of the wearable device 200 is greater than or equal to (or less than or equal to) a second threshold.
  • Alternatively, the preset condition may relate to the time during which the inclination of the wearable device 200 remains at or below the second threshold.
  • Alternatively, the preset condition may be whether the average tilt value of the wearable device 200 is less than or equal to the second threshold.
  • When the sensing data is data about the illuminance detected by the illuminance sensor, the preset condition may be whether the illuminance around the wearable device 200 is greater than or equal to (or less than or equal to) a third threshold.
  • Alternatively, the preset condition may relate to the time during which the illuminance around the wearable device 200 remains less than or equal to the third threshold.
  • Alternatively, the preset condition may be whether the difference between the illuminance value around the wearable device 200 and the illuminance value of the object region is greater than or equal to (or less than or equal to) a fourth threshold.
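The duration-based variants of the preset condition (a value staying beyond a threshold for a preset time) can be sketched with a small helper; the class name, the example predicate, and the five-minute hold time below are assumptions for illustration only.

```python
class SustainedCondition:
    """Reports True once a predicate has held continuously for `hold_seconds`."""

    def __init__(self, predicate, hold_seconds: float):
        self.predicate = predicate
        self.hold_seconds = hold_seconds
        self._since = None  # timestamp at which the predicate first became true

    def update(self, timestamp_s: float, value: float) -> bool:
        if self.predicate(value):
            if self._since is None:
                self._since = timestamp_s
            return timestamp_s - self._since >= self.hold_seconds
        self._since = None  # condition broken, restart the timer
        return False

# Example: a distance of 0.4 m or less sustained for 5 minutes triggers guidance.
too_close_too_long = SustainedCondition(lambda d: d <= 0.4, hold_seconds=300)
```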
  • FIG. 3B is a block diagram illustrating in detail the configuration of the electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 may further include an input unit 150, an audio processor 160, etc. in addition to the communication unit 110, the output unit 120, the memory 130, and the processor 140.
  • the present invention is not limited to the above-described configuration, and some configurations may be added or omitted as necessary.
  • the communication unit 110 may communicate with an external device.
  • the communication unit 110 may include various communication chips such as a Wi-Fi chip 111, a Bluetooth chip 112, a wireless communication chip 113, and an NFC chip 114.
  • The Wi-Fi chip 111, the Bluetooth chip 112, and the NFC chip 114 perform communication using a LAN method, a Wi-Fi method, a Bluetooth method, and an NFC method, respectively.
  • When the Wi-Fi chip 111 or the Bluetooth chip 112 is used, various connection information such as an SSID and a session key may first be transmitted and received, and then various data may be transmitted and received using the connection information.
  • the wireless communication chip 113 refers to a chip that performs communication according to various communication standards such as IEEE, Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), and the like.
  • the output unit 120 is a component for outputting various data of the electronic device 100.
  • the output unit 120 may include a display 121 and an audio output unit 122.
  • the display 121 may display various screens as described above.
  • the display 121 may be implemented as various types of display panels.
  • The display panel may be implemented with various display technologies such as liquid crystal display (LCD), organic light-emitting diode (OLED), active-matrix organic light-emitting diode (AM-OLED), liquid crystal on silicon (LCoS), or digital light processing (DLP).
  • the display 121 may be coupled to at least one of a front region, a side region, and a rear region of the electronic device 100 in the form of a flexible display.
  • The flexible display may be characterized in that it can be bent, curved, or rolled without damage through a thin and flexible substrate, like paper.
  • Such a flexible display may be manufactured using not only a commonly used glass substrate but also a plastic substrate. When a plastic substrate is used, it may be formed using a low-temperature manufacturing process instead of a conventional manufacturing process to prevent damage to the substrate.
  • the glass substrate surrounding the flexible liquid crystal may be replaced with a plastic film, thereby giving flexibility to fold and unfold.
  • Such a flexible display is not only thin and light but also resistant to impact, and it can be curved or bent and manufactured in various forms.
  • the audio output unit 122 is configured to output not only various audio data on which various processing tasks such as decoding, amplification, and noise filtering are performed by the audio processor 160, but also various notification sounds or voice messages.
  • the audio processor 160 is a component that processes audio data.
  • the audio processor 160 may perform various processing such as decoding, amplification, noise filtering, and the like on the audio data.
  • The audio data processed by the audio processor 160 may be output to the audio output unit 122.
  • the audio output unit 122 may be implemented as a speaker, but this is only an example and may be implemented as an output terminal capable of outputting audio data.
  • the input unit 150 may include a touch panel 151, a pen sensor 152, a key 153, and a microphone 154 to receive various inputs.
  • the touch panel 151 may be configured by combining the display 121 and a touch sensor (not shown), and the touch sensor may use at least one of capacitive, pressure sensitive, infrared, and ultrasonic methods.
  • The touch panel 151 may have not only a display function but also a function of detecting a touch input position, a touched area, and a touch input pressure, and may further have a function of detecting a proximity touch as well as a real touch.
  • the pen sensor 152 may be implemented as part of the touch panel 151 or may include a separate sheet for recognition.
  • the key 153 may include a physical button, an optical key or a keypad.
  • the microphone 154 may include at least one of an internal microphone or an external microphone.
  • the input unit 150 may receive an external command from the various components described above and transmit it to the processor 140.
  • the processor 140 may generate a control signal corresponding to the received input to control the electronic device 100.
  • the processor 140 may control overall operations of the electronic device 100 using various programs stored in the memory 130.
  • the processor 140 may include a RAM 141, a ROM 142, a graphics processor 143, a main CPU 144, a first to n interface 145-1 to 145-n, and a bus 146.
  • the RAM 141, the ROM 142, the graphic processor 143, the main CPU 144, and the first to n-th interfaces 145-1 to 145-n may be connected to each other through the bus 146. .
  • The RAM 141 stores an O/S and application programs.
  • Specifically, the O/S may be stored in the RAM 141, and various application data selected by the user may be stored in the RAM 141.
  • the ROM 142 stores a command set for system booting.
  • The main CPU 144 copies the O/S stored in the memory 130 to the RAM 141 according to the command stored in the ROM 142, and executes the O/S.
  • the main CPU 144 copies various application programs stored in the memory 130 to the RAM 141 and executes the application programs copied to the RAM 141 to perform various operations.
  • the graphic processor 143 generates a screen including various objects such as an item, an image, and a text by using a calculator (not shown) and a renderer (not shown).
  • The calculator may calculate attribute values, such as coordinate values, shape, size, and color, with which each object is to be displayed according to the layout of the screen, using the control command received from the input unit 150.
  • the renderer may be configured to generate screens of various layouts including objects based on the attribute values calculated by the calculator. The screen generated by the renderer may be displayed in the display area of the display 121.
  • the main CPU 144 accesses the memory 130 and performs booting using an OS stored in the memory 130.
  • the main CPU 144 performs various operations using various programs, contents, data, and the like stored in the memory 130.
  • the first to n interfaces 145-1 to 145-n are connected to the various components described above.
  • One of the first to n-th interfaces 145-1 to 145-n may be a network interface connected to an external device through a network.
  • the wearable device 200 includes a communication unit 210, a sensor 220, and a processor 230.
  • the communication unit 210 is configured to perform communication with an external device, and as described above, may include various communication chips such as a Wi-Fi chip, a Bluetooth chip, a wireless communication chip, an NFC chip, and the like.
  • the communicator 210 may transmit various sensing data sensed by the sensor 220 to the electronic device 100.
  • the sensor 220 is a component for sensing various sensing data. As shown in FIG. 4, the sensor 220 may include a distance sensor, an illumination sensor, a noise sensor, a gyro sensor, and the like.
  • the distance measuring sensor is a sensor for measuring the distance between the wearable device 200 and the object.
  • the distance sensor may be configured as an optical sensor, but is not limited thereto. If necessary, the distance sensor may be configured as an infrared sensor or an ultrasonic sensor.
  • the illuminance sensor is a configuration for measuring the ambient illuminance.
  • Two or more illuminance sensors may be provided. In this case, a first illuminance sensor may detect the illuminance around the wearable device 200, and a second illuminance sensor may detect the illuminance near the area where the object is located.
  • The noise measuring sensor is a component for measuring ambient noise, and the gyro sensor is a sensor that recognizes more detailed and precise motion by adding rotation to an existing acceleration sensor so as to recognize six axial directions, and is a component for detecting the position of the wearable device 200.
  • the sensor 220 may include various types of sensors, such as an acceleration sensor, a proximity sensor, an impact sensor, a gravity sensor, and a geomagnetic sensor, in addition to the above configurations.
  • the processor 230 is a component for controlling the wearable device 200 and may control the communication unit 210 to transmit various information sensed by the sensor 220 to the electronic device 100.
  • the wearable device 200 may further include additional components as necessary in addition to the above-described components.
  • For example, when the wearable device 200 is implemented as an earphone, the wearable device 200 may further include an audio output unit. In this case, the wearable device 200 may receive data about an audio message related to posture correction from the electronic device 100 and output it.
  • Alternatively, when the wearable device 200 is implemented as a head-mounted display (HMD), the wearable device 200 may further include a display. In this case, the wearable device 200 may receive data about a UI message related to posture correction from the electronic device 100 and output it.
  • In addition, various components may be added depending on the type of device as which the wearable device 200 is implemented.
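For illustration only, the sensing data that the wearable device 200 transmits could be modeled as a simple record; the disclosure does not define a payload format, so the field names and units below are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensingData:
    timestamp_s: float                   # time at which the sample was taken
    distance_m: Optional[float] = None   # distance measuring sensor
    tilt_deg: Optional[float] = None     # gyro sensor, relative to gravity
    ambient_lux: Optional[float] = None  # first illuminance sensor (surroundings)
    object_lux: Optional[float] = None   # second illuminance sensor (viewed area)
    noise_db: Optional[float] = None     # noise measuring sensor
```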
  • 5A to 5C are exemplary views for describing a method of determining a distance between a user and an object and outputting a message for guiding a posture of the user according to the determined result according to an embodiment of the present disclosure.
  • Staring at an object that is too close, or staring at an object at the same distance for a long time, may cause conditions such as astigmatism or myopia. Therefore, the electronic device 100 needs to guide the user's posture by measuring the distance to the object the user is gazing at and the time for which the user gazes at it.
  • the distance measuring sensor 221 of the wearable device 200 may measure the distance w between the wearable device 200 and the object 500 and transmit the measured distance w to the electronic device 100.
  • the wearable device 200 may measure the distance data at a predetermined time interval (for example, at 10 second intervals) and transmit the distance data to the electronic device 100.
  • the transmission method of the distance w may vary.
  • That is, the wearable device 200 may transmit the distance data to the electronic device 100 whenever data on the distance w is obtained, or may transmit it at predetermined time intervals.
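A minimal sketch of the transmission behavior described above, sampling at a fixed interval and sending either every sample or a batch of samples, is given below; `read_distance_m` and `send` are hypothetical stand-ins for the sensor driver and the communication unit 210.

```python
import time

def sampling_loop(read_distance_m, send, interval_s=10.0, batch_size=1):
    """Sample the distance every `interval_s` seconds and transmit it."""
    batch = []
    while True:
        batch.append({"t": time.time(), "distance_m": read_distance_m()})
        if len(batch) >= batch_size:   # batch_size=1 sends every sample immediately
            send(batch)
            batch = []
        time.sleep(interval_s)         # e.g. one sample every 10 seconds
```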
  • the electronic device 100 may determine the posture of the user by analyzing the distance data received from the wearable device 200. For example, when the distance between the wearable device 200 and the object is equal to or less than the first threshold value (eg, 0.4 m), the electronic device 100 may determine that the posture correction of the user is necessary.
  • The electronic device 100 may output a message for guiding the user's posture. For example, as shown in FIG. 5B, the electronic device 100 may output a message UI such as "The distance to the object in front is too close. Please maintain a proper distance for your eye health."
  • the electronic device 100 may vibrate the electronic device 100 or output an alarm sound when the message UI is displayed.
  • the data about the message UI may be transmitted to the wearable device 200 and displayed by the wearable device 200. That is, when the wearable device 200 is a head mounted display device, the message UI may be output from the wearable device 200.
  • the message UI may be output in the form of audio data.
  • That is, the audio data "The distance to the object in front is too close. Please keep a proper distance for your eye health." may be output from the electronic device 100 or the wearable device 200.
  • the electronic device 100 may store the received distance data by matching the time information.
  • The electronic device 100 may obtain the average gaze distance at which the user gazes at an object based on the distance data matched with the time information.
  • When the obtained average gaze distance is less than or equal to the first threshold, the electronic device 100 may output a message guiding the user's posture.
  • The electronic device 100 may also measure the change in the distance data over time. If the distance data does not change for a preset time, the electronic device 100 may output a message guiding the user's posture. For example, the electronic device 100 may output a message such as "You have been looking at the same distance for too long. Look into the distance for your eye health."
  • That is, the electronic device 100 may use the distance data to determine 1) whether the user is gazing at a nearby object, 2) the average distance of the objects the user gazes at, and 3) how long the user has been gazing at the same object (an object located at the same distance), and may output a message guiding the user's posture accordingly.
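The three determinations listed above can be sketched over time-stamped distance samples as follows; the thresholds, the distance tolerance, and the message texts are illustrative assumptions rather than values taken from the disclosure.

```python
def analyze_gaze(samples, near_m=0.4, hold_s=300.0, jitter_m=0.05):
    """samples: list of (timestamp_s, distance_m) pairs in chronological order."""
    if not samples:
        return []
    messages = []
    # 1) the user is currently gazing at a nearby object
    if samples[-1][1] <= near_m:
        messages.append("The object in front is too close.")
    # 2) the average gaze distance over the stored samples is too short
    average = sum(d for _, d in samples) / len(samples)
    if average <= near_m:
        messages.append("On average you are viewing objects too closely.")
    # 3) the gaze distance has stayed (almost) the same for too long
    first_t, first_d = samples[0]
    if (all(abs(d - first_d) <= jitter_m for _, d in samples)
            and samples[-1][0] - first_t >= hold_s):
        messages.append("You have looked at the same distance for too long.")
    return messages
```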
  • FIGS. 6A to 6E are exemplary views for describing a method of determining the tilt of a user's neck and outputting a message for guiding the user's posture according to the determination result, according to an embodiment of the present disclosure.
  • the electronic device 100 needs to guide the posture of the user by measuring the posture of the user's head and the inclination of the neck.
  • Specifically, the electronic device 100 may measure the inclination of the user's neck using a gyro sensor, a gravity sensor, a geomagnetic sensor, or the like. As illustrated in FIG. 6B, the electronic device 100 may receive, from the wearable device 200, angle data about the inclination of the wearable device 200 with respect to the direction of gravity. When the received angle data is greater than or equal to the second threshold (for example, 30 degrees), the electronic device 100 may output a message for guiding the user's posture.
  • For example, as illustrated in the drawings, the electronic device 100 may output a message UI such as "Please maintain a correct posture for your neck health."
  • the electronic device 100 may vibrate the electronic device 100 or output an alarm sound when the message UI is displayed.
  • the data about the message UI may be transmitted to the wearable device 200 and displayed by the wearable device 200. That is, when the wearable device 200 is a head mounted display device, the message UI may be output from the wearable device 200.
  • the message UI may be output in the form of audio data.
  • That is, the audio data "Please maintain a correct posture for your neck health." may be output from the electronic device 100 or the wearable device 200.
  • Meanwhile, in order to obtain accurate angle data, the wearable device 200 needs to be worn correctly. That is, the user may wear the wearable device 200 such that the inclination measured by the sensor becomes 0 when the user's posture is correct, so that the electronic device 100 can accurately determine the user's posture. To this end, the electronic device 100 may output a message for guiding the wearing of the wearable device.
  • the electronic device 100 may display a message UI that guides the user to wear the wearable device 200 correctly.
  • Specifically, the electronic device 100 may output a message UI such as "Please put on the wearable device." and, when the user presses OK, proceed to adjusting the position of the wearable device.
  • In addition, the electronic device 100 may display a UI for adjusting the position of the wearable device 200, as illustrated in FIG. 6E. When a confirmation command is input after the position of the wearable device 200 has been adjusted so that the angle measured by the sensor is zero, the electronic device 100 determines that the wearing of the wearable device 200 is completed and may determine the user's posture using the sensing data.
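The wearing-adjustment step can be sketched as a zero-offset calibration: the angle measured while the user holds a correct posture is stored, and later readings are evaluated relative to it. The class, tolerance, and method names below are assumptions for illustration.

```python
class TiltCalibrator:
    def __init__(self, tolerance_deg: float = 1.0):
        self.tolerance_deg = tolerance_deg
        self.offset_deg = 0.0

    def calibrate(self, raw_angle_deg: float) -> bool:
        """Store the current reading as the zero reference.

        Returns True when the raw reading is already close enough to zero,
        i.e. the wearable device can be considered correctly worn.
        """
        self.offset_deg = raw_angle_deg
        return abs(raw_angle_deg) <= self.tolerance_deg

    def corrected(self, raw_angle_deg: float) -> float:
        """Angle used for the posture check after the offset is removed."""
        return raw_angle_deg - self.offset_deg
```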
  • the electronic device 100 may match the angle data with the time information.
  • The electronic device 100 may obtain, based on the angle data matched with the time information, the average time during which the user's head is inclined.
  • When the obtained value satisfies a preset condition, the electronic device 100 may output a message guiding the user's posture.
  • The electronic device 100 may also measure the change in the angle data over time. If the angle data does not change for a preset time, the electronic device 100 may output a message guiding the user's posture. For example, the electronic device 100 may output a message such as "You have held the same posture for a long time. Please stretch for your neck health."
  • FIGS. 7A to 7D are exemplary views for describing a method of determining the illuminance around a user and outputting a message for guiding the user's posture according to the determination result, according to an embodiment of the present disclosure.
  • The electronic device 100 may determine the ambient illuminance and the illuminance of the object region and guide the user to an appropriate ambient illuminance.
  • The electronic device 100 may receive ambient illuminance data from the wearable device 200. If the received ambient illuminance is greater than (or less than) the third threshold (for example, 500 lux), the electronic device 100 may output, as shown in FIG. 7B, a message UI such as "The surroundings are too bright. Please lower the illuminance for your eye health." or "The surroundings are too dark. Please increase the illuminance for your eye health." As described above, the message may be output as audio or may be output from the wearable device 200.
  • the object may be a light source that emits light.
  • Specifically, the electronic device 100 may determine whether the viewing environment is appropriate for the user's eye health by determining the ambient illuminance and the intensity of light emitted by the object. For example, if the difference between the ambient illuminance value and the illuminance value of the object area is greater than or equal to the fourth threshold, the electronic device 100 may display a message UI such as "The monitor is too bright. Please dim the monitor." as shown in FIG. 7D. However, if necessary, the electronic device 100 may display "The monitor is too dark. Please brighten the monitor.", "The surroundings are too bright. Please lower the illuminance for your eye health.", or "The surroundings are too dark. Please increase the illuminance for your eye health."
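The illuminance-based guidance described above, comparing the ambient illuminance against the third threshold and the ambient/object-area difference against the fourth threshold, might be sketched as follows; the threshold values, the lower "too dark" bound, and the message texts are assumptions.

```python
def illuminance_guidance(ambient_lux, object_lux=None,
                         third_lux=500.0, fourth_lux=300.0, dark_lux=50.0):
    messages = []
    # ambient illuminance against the third threshold (and an assumed dark bound)
    if ambient_lux >= third_lux:
        messages.append("The surroundings are too bright. Please lower the illuminance.")
    elif ambient_lux <= dark_lux:
        messages.append("The surroundings are too dark. Please increase the illuminance.")
    # difference between the object area (e.g. a monitor) and the surroundings
    if object_lux is not None and abs(object_lux - ambient_lux) >= fourth_lux:
        if object_lux > ambient_lux:
            messages.append("The monitor is too bright. Please dim the monitor.")
        else:
            messages.append("The monitor is too dark. Please brighten the monitor.")
    return messages
```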
  • The various thresholds described above may be determined by the electronic device 100 or by an external server. In the following description, it is assumed that the external server determines the thresholds.
  • the external server can collect medically approved raw data.
  • the raw data obtained from all the sensors for a given time may be represented by the variable vector Vi.
  • Collected data can be classified by various experts. Experts can then add binary marks to the raw data. That is, the raw data can be appended with a binary mark of either 0 (e.g. bad posture) or 1 (e.g. correct posture).
  • the external server can classify the raw data with the binary mark added in various ways.
  • The external server can train a classifier that automatically selects key segments through principal component analysis (PCA).
  • the classifier may be implemented as a Naive Bayes Classifier, but is not limited thereto.
  • raw data may be classified through supervised learning.
  • Alternatively, the raw data may be classified through unsupervised learning, in which a model learns from the training data on its own without separate guidance and discovers a classification criterion.
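As a sketch of the server-side pipeline described above, expert-labeled raw vectors Vi, key-segment selection by principal component analysis, and a naive Bayes classifier, the following assumes scikit-learn is available; the synthetic data and dimensions are purely illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
V = rng.normal(size=(200, 12))                 # raw sensor vectors Vi (200 samples, 12 features)
labels = (V[:, 0] + V[:, 3] > 0).astype(int)   # expert marks: 0 = bad posture, 1 = correct posture

model = make_pipeline(PCA(n_components=4), GaussianNB())
model.fit(V, labels)                           # supervised learning on the labeled raw data
print(model.predict(V[:5]))                    # predicted posture labels for new samples
```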
  • the raw data classification model may be based on a neural network model.
  • Neural network models can be designed to simulate the human brain structure on a computer.
  • the neural network model may include a plurality of weighted network nodes that simulate the neurons of a human neural network.
  • the plurality of network nodes may form a connection relationship so that neurons simulate synaptic activity through which signals are sent and received through synapses.
  • the raw data classification model may include a neural network model or a deep learning model developed from the neural network model. In the deep learning model, a plurality of network nodes may be located at different depths (or layers) and exchange data according to a convolutional connection relationship.
  • a model such as a deep neural network (DNN), a recurrent neural network (RNN), and a bidirectional recurrent deep neural network (BRDNN) may be used as the classification model, but is not limited thereto.
  • the various models described above may be trained using learning algorithms including error back-propagation or gradient descent.
  • The electronic device 100 may receive information on the various thresholds described above from the external server and determine the user's posture by comparing the sensing data received from the wearable device 200 with the thresholds received from the external server. In this case, the electronic device 100 may receive the information on the thresholds from the external server at a predetermined cycle. Alternatively, when a threshold determined by the external server is changed, the electronic device 100 may receive information about the changed threshold.
  • In the above description, the electronic device 100 receives the information on the thresholds from the external server, but this is only an example; the electronic device 100 itself may obtain the thresholds through the above-described method.
  • FIG. 8 is a flowchart illustrating a control method of the electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 may perform a communication connection with the wearable device 200 (S810).
  • the electronic device 100 may receive sensing data sensed by the wearable device 200 from the wearable device 200 that is connected in communication (S820).
  • The sensing data received by the electronic device 100 may be at least one of data about the distance between the wearable device 200 and an object detected by the distance measuring sensor of the wearable device 200, data about an angle detected by the gyro sensor of the wearable device 200, and data about illuminance detected by the illuminance sensor of the wearable device 200.
  • When the received sensing data satisfies a preset condition, the electronic device 100 may output a message corresponding to the preset condition in order to guide the user's posture (S830).
  • The preset condition may be a condition relating the sensing data to a threshold corresponding to the sensing data.
  • Specifically, when the sensing data is distance data between the wearable device 200 and the object, the preset condition may be whether the distance between the wearable device 200 and the object is less than or equal to a first threshold.
  • Alternatively, the preset condition may be whether the distance between the wearable device 200 and the object has remained less than or equal to the first threshold for a preset time (for example, 5 minutes) or more.
  • When the sensing data is angle data detected by the gyro sensor, the preset condition may be whether the angle of the wearable device 200 is less than or equal to a second threshold.
  • Alternatively, the preset condition may be whether the angle of the wearable device 200 has remained less than or equal to the second threshold for a preset time (for example, 5 minutes) or more.
  • When the sensing data is illuminance data detected by the illuminance sensor, the preset condition may be whether the ambient illuminance is less than or equal to the third threshold.
  • Alternatively, the preset condition may be whether the ambient illuminance has remained less than or equal to the third threshold for a preset time (for example, 5 minutes).
  • Alternatively, the preset condition may be whether the difference between the ambient illuminance obtained from the first illuminance sensor of the wearable device 200 and the illuminance of the area viewed by the user obtained from the second illuminance sensor of the wearable device 200 is greater than or equal to a fourth threshold.
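Tying the steps of FIG. 8 together, a compact sketch of the control flow is given below; `connect`, `receive`, and `output` are hypothetical stand-ins for the communication unit and output unit, and `guidance_message` refers to the earlier illustrative sketch.

```python
def control_loop(connect, receive, output, guidance_message):
    wearable = connect()                      # S810: establish the communication connection
    for data in receive(wearable):            # S820: receive sensing data from the wearable device
        message = guidance_message(**data)    # evaluate the preset conditions
        if message is not None:
            output(message)                   # S830: output a message guiding the user's posture
```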
  • the methods described above may be embodied in the form of program instructions that may be executed by various computer means and may be recorded in a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape, optical media such as CD-ROMs and DVDs, and magneto-optical media such as floptical disks.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Automation & Control Theory (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to an electronic device for correcting a user's posture. An electronic device according to an embodiment of the present disclosure comprises: a communication unit; an output unit; a memory including at least one instruction; and a processor connected to the communication unit, the output unit, and the memory so as to control the electronic device. By executing the at least one instruction, the processor establishes a communication connection with a wearable device worn by the user through the communication unit, receives, from the wearable device through the communication unit, sensing data sensed by the wearable device and, when the received sensing data satisfies a preset condition, controls the output unit so that a message corresponding to the preset condition is output in order to guide the user's posture.
PCT/KR2019/001166 2018-02-20 2019-01-28 Electronic device and posture correction method thereof WO2019164145A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180019821A KR102620967B1 (ko) 2018-02-20 2018-02-20 Electronic device and posture correction method thereof
KR10-2018-0019821 2018-02-20

Publications (1)

Publication Number Publication Date
WO2019164145A1 true WO2019164145A1 (fr) 2019-08-29

Family

ID=67687791

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/001166 WO2019164145A1 (fr) 2018-02-20 2019-01-28 Electronic device and posture correction method thereof

Country Status (2)

Country Link
KR (1) KR102620967B1 (fr)
WO (1) WO2019164145A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112333500A (zh) * 2019-11-26 2021-02-05 河南水滴智能技术有限公司 Eyesight protection technique and method for children watching television
CN113297876A (zh) * 2020-02-21 2021-08-24 佛山市云米电器科技有限公司 Exercise posture correction method based on smart refrigerator, smart refrigerator, and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102130761B1 (ko) * 2019-09-02 2020-07-08 트레스테크(주) Smart healthcare speaker assembly

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11133937A (ja) * 1997-11-04 1999-05-21 Sony Corp Device and method for preventing deterioration of eyesight
KR20060005239A (ko) * 2004-07-12 2006-01-17 엘지전자 주식회사 Video display device having eyesight protection function and control method thereof
WO2010147281A1 (fr) * 2009-06-16 2010-12-23 (주)엘지전자 Viewing range notification method and television receiver for implementing the same
KR20140076666A (ko) * 2012-12-12 2014-06-23 엘지전자 주식회사 Image output device and driving method thereof
KR20170092232A (ko) * 2016-02-03 2017-08-11 경북대학교 산학협력단 Apparatus and method for protecting eyesight, and recording medium for performing the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013127548A (ja) 2011-12-19 2013-06-27 Nikon Corp Display device and display control program
KR101729008B1 (ko) * 2014-04-04 2017-04-21 엘지전자 주식회사 Lighting system and operating method thereof
KR20160062521A (ko) 2014-11-25 2016-06-02 인제대학교 산학협력단 Neck disc prevention system and method


Also Published As

Publication number Publication date
KR20190099847A (ko) 2019-08-28
KR102620967B1 (ko) 2024-01-05

Similar Documents

Publication Publication Date Title
WO2016108537A1 Electronic device comprising rotating body and control method therefor
KR102072788B1 Portable device and method for changing content screen of the portable device
WO2019164145A1 Electronic device and posture correction method thereof
EP3616050A1 Apparatus and method for voice command context
WO2019098797A1 Apparatus and method for providing haptic feedback through wearable device
WO2018217060A1 Method and wearable device for performing actions using body sensor array
WO2016032124A1 Rotary device and electronic device including the same
WO2020130689A1 Electronic device for recommending game content and operating method therefor
WO2013133618A1 Method for controlling at least one function of a device by eye action, and device for performing the method
KR20160035394A Method and apparatus for processing sensor data
WO2015126208A1 Method and system for remote control of electronic device
WO2016144095A1 Method and apparatus for controlling electronic device in communication system
WO2018124633A1 Electronic device and method for outputting message thereby
US20210394369A1 Healthcare robot and control method therefor
WO2019151689A1 Electronic device and control method therefor
WO2016039532A1 Method for controlling screen of electronic device, and electronic device therefor
WO2022158700A1 Electronic device and control method therefor
WO2022131488A1 Electronic device and control method therefor
WO2021145551A1 Electronic device and control method thereof
US11393170B2 Presentation of content based on attention center of user
WO2023137087A1 Optimization on input sensor based on sensor data
WO2021233051A1 Interference alerting method and device
WO2021167289A1 Device and control method thereof
WO2020226264A1 Electronic device for acquiring location information on basis of image, and operating method thereof
WO2019216630A1 Image editing method and electronic device supporting same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19756612

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19756612

Country of ref document: EP

Kind code of ref document: A1