WO2024071708A1 - Wearable device and control method therefor


Info

Publication number
WO2024071708A1
Authority
WO
WIPO (PCT)
Prior art keywords
upper body
body movement
user
wearable device
information
Application number
PCT/KR2023/013062
Other languages
English (en)
Korean (ko)
Inventor
박홍식
이승준
한훈
Original Assignee
삼성전자주식회사
Application filed by 삼성전자주식회사
Publication of WO2024071708A1

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02Stretching or bending or torsioning apparatus for exercising
    • A61H1/0237Stretching or bending or torsioning apparatus for exercising for the lower limbs
    • A61H1/0255Both knee and hip of a patient, e.g. in supine or sitting position, the feet being moved together in a plane substantially parallel to the body-symmetrical plane
    • A61H1/0262Walking movement; Appliances for aiding disabled persons to walk
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02Stretching or bending or torsioning apparatus for exercising
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0006Exoskeletons, i.e. resembling a human figure
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/12Driving means
    • A61H2201/1207Driving means with electric or magnetic drive
    • A61H2201/1215Rotary drive
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1602Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/1628Pelvis
    • A61H2201/163Pelvis holding means therefor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1602Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/164Feet or leg, e.g. pedal
    • A61H2201/1642Holding means therefor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1602Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/165Wearable interfaces
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50Control means thereof
    • A61H2201/5007Control means thereof computer controlled
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50Control means thereof
    • A61H2201/5058Sensors or detectors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50Control means thereof
    • A61H2201/5058Sensors or detectors
    • A61H2201/5069Angle sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2203/00Additional characteristics concerning the patient
    • A61H2203/04Position of the patient
    • A61H2203/0406Standing on the feet

Definitions

  • Various embodiments relate to technology for controlling wearable devices.
  • According to an embodiment, a wearable device includes a processor that controls the wearable device, at least one sensor that measures the angle of a thigh support frame, a motor driver circuit controlled by the processor, a motor electrically connected to the motor driver circuit, and a thigh support frame that transmits the torque generated by the motor to at least a portion of the user's lower extremities. The processor acquires upper body movement information about the upper body movement of the user wearing the wearable device, calculates an optimal stride length for the user based on the upper body movement information, determines a first value of a control parameter corresponding to the optimal stride length, and controls the motor driver circuit of the wearable device based on the first value of the control parameter. The control parameter is used to adjust at least one of the magnitude, direction, or timing of the torque to be output.
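The control flow described in this embodiment (upper body movement information → optimal stride length → control parameter → motor driver command) can be illustrated with a minimal sketch. The function names, the cadence-and-amplitude stride estimate, and the linear stride-to-gain mapping below are illustrative assumptions, not values or formulas from this publication.

```python
# Minimal sketch of the described control flow (illustrative assumptions only):
# upper body movement info -> optimal stride length -> control parameter -> motor command.

from dataclasses import dataclass

@dataclass
class UpperBodyMovementInfo:
    arm_swing_frequency_hz: float   # e.g., derived from arm-worn IMUs (assumed feature)
    arm_swing_amplitude_deg: float  # e.g., shoulder rotation range (assumed feature)

def calculate_optimal_stride_length(info: UpperBodyMovementInfo) -> float:
    """Hypothetical mapping: faster, larger arm swings suggest a longer stride."""
    base_stride_m = 0.6
    return (base_stride_m
            + 0.1 * info.arm_swing_frequency_hz
            + 0.002 * info.arm_swing_amplitude_deg)

def determine_control_parameter(optimal_stride_m: float) -> float:
    """Hypothetical first value of the control parameter (here: a torque gain)."""
    return 1.0 + 2.0 * (optimal_stride_m - 0.6)

def control_motor_driver(gain: float) -> None:
    # In the real device this would command the motor driver circuit;
    # here the value that would be applied is simply printed.
    print(f"apply torque gain {gain:.2f} to the motor driver")

info = UpperBodyMovementInfo(arm_swing_frequency_hz=1.2, arm_swing_amplitude_deg=40.0)
control_motor_driver(determine_control_parameter(calculate_optimal_stride_length(info)))
```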
  • According to an embodiment, a wearable device control method performed by a wearable device includes acquiring upper body movement information about the upper body movement of the user wearing the wearable device, calculating an optimal stride length for the user based on the upper body movement information, determining a first value of a control parameter corresponding to the optimal stride length, and controlling the motor driver circuit of the wearable device based on the first value of the control parameter.
  • According to an embodiment, a wearable device control method performed by a wearable device includes acquiring first upper body movement information about the upper body movement of a user wearing the wearable device, detecting a first user input based on the first upper body movement information, determining a target value of a control parameter based on second upper body movement information obtained after the first user input is detected, detecting a second user input based on third upper body movement information, and, when the second user input is detected, controlling the motor driver circuit of the wearable device based on the target value of the control parameter.
  • FIG. 1 is a configuration diagram of a system for providing an exercise program to a user, according to an embodiment.
  • Figure 2 is a block diagram of an electronic device in a network environment, according to one embodiment.
  • FIGS. 3A, 3B, 3C, and 3D are diagrams for explaining a wearable device according to an embodiment.
  • FIG. 4 is a diagram illustrating a wearable device that communicates with an electronic device, according to an embodiment.
  • FIGS. 5 and 6 are diagrams for explaining a torque output method of a wearable device, according to an embodiment.
  • Figure 7 shows a motion sensing circuit worn on the user's upper body, according to one embodiment.
  • Figure 8 is a diagram for explaining target angle information according to one embodiment.
  • Figure 9 is a diagram for explaining target angle information regarding a stretch sensor, according to an embodiment.
  • Figure 10 is a flowchart of a method for controlling a wearable device, according to an embodiment.
  • Figure 11 is a flowchart of a method for controlling a motor driver of a wearable device, according to an embodiment.
  • Figure 12 is a flowchart of a method for calculating an optimal stride length, according to one embodiment.
  • Figure 13 is a flowchart of a method for calculating an optimal stride length, according to one embodiment.
  • Figure 14 is a flowchart of a method for determining whether a user's upper body movement corresponds to running, according to one embodiment.
  • Figure 15 is a diagram for explaining a method of determining whether a user's upper body movement corresponds to running, according to an embodiment.
  • Figure 16 is a flowchart of a method for controlling a wearable device, according to an embodiment.
  • Figure 17 shows a graph for detecting user input, according to one embodiment.
  • Figure 18 is a flowchart of a method for determining the value of a control parameter, according to one embodiment.
  • Figure 19 is a diagram for explaining a method for determining the value of a control parameter, according to an embodiment.
  • Figure 20 is a flowchart of a method for determining the value of a control parameter, according to one embodiment.
  • Figure 21 is a diagram for explaining a method for determining the value of a control parameter, according to an embodiment.
  • Figure 22 is a flowchart of a method for controlling a wearable device based on detection of a user input, according to an embodiment.
  • Figure 23 is a flowchart of a method for controlling a wearable device based on detection of a motion stop input, according to an embodiment.
  • FIG. 1 is a configuration diagram of a system for providing an exercise program to a user, according to an embodiment.
  • According to an embodiment, the system 100 for providing an exercise program to a user may include an electronic device 110, a wearable device 120, an additional device 130, a server 140, and smart clothing 150.
  • the electronic device 110 may be a user terminal that can be connected to the wearable device 120 using short-range wireless communication.
  • the electronic device 110 may transmit a control signal to control the wearable device 120 to the wearable device 120 .
  • the wearable device 120 may transmit information indicating the status of the wearable device 120 to the electronic device 110.
  • the electronic device 110 is described in detail below with reference to FIG. 2 , and the transmission of the control signal is described in detail below with reference to FIG. 4 .
  • The wearable device 120 may provide, to a user wearing the wearable device 120, an assistive force that assists gait or exercise, or a resistance force that hinders gait. The resistance force may be provided for the user's exercise.
  • the assisting force or resistance force output by the wearable device 120 can be controlled by controlling the values of various control parameters used in the wearable device 120. The structure and driving method of the wearable device 120 will be described in detail below with reference to FIGS. 3A, 3B, 3C, 3D, 4, 5, and 6.
  • the electronic device 110 may be connected to an additional device 130 (eg, wireless earphones 131, smart watch 132, or smart glasses 133) using short-range wireless communication.
  • the electronic device 110 may output information indicating the state of the electronic device 110 or the state of the wearable device 120 to the user through the additional device 130.
  • feedback information about the walking state of the user wearing the wearable device 120 may be output through the haptic device, speaker device, and display device of the additional device 130.
  • the electronic device 110 may be connected to the server 140 using short-range wireless communication or cellular communication.
  • the server 140 may include a database storing information about a plurality of exercise programs that can be provided to the user through the wearable device 120.
  • the server 140 may manage user accounts for users of the electronic device 110 or the wearable device 120.
  • the server 140 may store and manage the exercise program performed by the user and the results of the exercise program in association with the user account.
  • the electronic device 110 may be connected to smart clothing 150 using short-range wireless communication.
  • the electronic device 110 may receive upper body movement information about the user's upper body movement from the smart clothing 150.
  • smart clothing 150 may receive information about the user's walking state from the electronic device 110.
  • the wearable device 120 may be connected to smart clothing 150 using short-range wireless communication or wired communication. Smart clothing 150 is described in detail below with reference to FIG. 7 .
  • smart clothing 150 may generate movement information of a user wearing smart clothing 150.
  • Smart clothing 150 may include smart tops 151 and smart bottoms 152.
  • For example, the optimal stride length for the user may be calculated by the electronic device 110 or the wearable device 120 based on the upper body movement information generated by the smart top 151.
  • the system 100 may provide the user with a wearable device 120 that can be controlled based on movement information generated by the smart clothing 150 regarding the user's movements. For example, when the user's movement corresponds to running, the assistance force output by the wearable device 120 may be controlled to increase.
  • a method of controlling a wearable device based on movement information generated by smart clothing 150 is described in detail with reference to FIGS. 10 to 23 . For example, feedback information about the walking state of a user wearing smart clothing 150 may be output through the haptic device of smart clothing 150.
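As a minimal illustration of the running example above (increasing the assistance force when the user's movement corresponds to running), one plausible rule is sketched below; the arm-swing frequency threshold and the gain values are assumptions, not values from this publication.

```python
# Illustrative sketch: raise the assistance gain when upper body movement
# suggests running. The 2.5 Hz threshold and the gain values are assumptions.

def is_running(arm_swing_frequency_hz: float) -> bool:
    return arm_swing_frequency_hz > 2.5

def select_assist_gain(arm_swing_frequency_hz: float) -> float:
    walking_gain, running_gain = 1.0, 1.8
    return running_gain if is_running(arm_swing_frequency_hz) else walking_gain

print(select_assist_gain(1.5))  # walking -> 1.0
print(select_assist_gain(3.0))  # running -> 1.8
```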
  • Figure 2 is a block diagram of an electronic device in a network environment, according to one embodiment.
  • FIG. 2 is a block diagram of an electronic device 201 (eg, electronic device 110 of FIG. 1 ) in a network environment 200, according to an embodiment.
  • The electronic device 201 may communicate with the electronic device 202 through a first network 298 (e.g., a short-range wireless communication network), or may communicate with at least one of the electronic device 204 or the server 208 through a second network 299 (e.g., a long-range wireless communication network).
  • the electronic device 201 may communicate with the electronic device 204 through the server 208.
  • The electronic device 201 may include a processor 220, a memory 230, an input module 250, a sound output module 255, a display module 260, an audio module 270, a sensor module 276, an interface 277, a connection terminal 278, a haptic module 279, a camera module 280, a power management module 288, a battery 289, a communication module 290, a subscriber identification module 296, or an antenna module 297.
  • In some embodiments, at least one of these components (e.g., the connection terminal 278) may be omitted, or one or more other components may be added to the electronic device 201.
  • In some embodiments, some of these components (e.g., the sensor module 276, the camera module 280, or the antenna module 297) may be integrated into one component (e.g., the display module 260).
  • The processor 220 may execute software (e.g., the program 240) to control at least one other component (e.g., a hardware or software component) of the electronic device 201 connected to the processor 220, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 220 may store commands or data received from another component (e.g., the sensor module 276 or the communication module 290) in the volatile memory 232, process the commands or data stored in the volatile memory 232, and store the resulting data in the non-volatile memory 234.
  • The processor 220 may include a main processor 221 (e.g., a central processing unit or an application processor) or an auxiliary processor 223 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 221.
  • When the electronic device 201 includes both the main processor 221 and the auxiliary processor 223, the auxiliary processor 223 may be set to use less power than the main processor 221 or to be specialized for a designated function.
  • the auxiliary processor 223 may be implemented separately from the main processor 221 or as part of it.
  • The auxiliary processor 223 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 201 (e.g., the display module 260, the sensor module 276, or the communication module 290), on behalf of the main processor 221 while the main processor 221 is in an inactive (e.g., sleep) state, or together with the main processor 221 while the main processor 221 is in an active (e.g., application execution) state.
  • According to one embodiment, the auxiliary processor 223 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 280 or the communication module 290).
  • the auxiliary processor 223 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models can be created through machine learning. For example, such learning may be performed in the electronic device 201 itself on which the artificial intelligence model is performed, or may be performed through a separate server (e.g., server 208).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to these examples.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 230 may store various data used by at least one component (eg, the processor 220 or the sensor module 276) of the electronic device 201. Data may include, for example, input data or output data for software (e.g., program 240) and instructions related thereto.
  • Memory 230 may include volatile memory 232 or non-volatile memory 234.
  • the program 240 may be stored as software in the memory 230 and may include, for example, an operating system 242, middleware 244, or application 246.
  • the input module 250 may receive commands or data to be used in a component of the electronic device 201 (e.g., the processor 220) from outside the electronic device 201 (e.g., a user).
  • the input module 250 may include, for example, a microphone, mouse, keyboard, keys (eg, buttons), or digital pen (eg, stylus pen).
  • the sound output module 255 may output sound signals to the outside of the electronic device 201.
  • the sound output module 255 may include, for example, a speaker or a receiver. Speakers can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 260 can visually provide information to the outside of the electronic device 201 (eg, a user).
  • the display module 260 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 260 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • The audio module 270 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 270 may acquire sound through the input module 250, or may output sound through the sound output module 255 or an external electronic device (e.g., an electronic device 202 such as a speaker or headphones) directly or wirelessly connected to the electronic device 201.
  • The sensor module 276 may detect the operating state (e.g., power or temperature) of the electronic device 201 or the external environmental state (e.g., a user state) and generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 276 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 277 may support one or more designated protocols that can be used to connect the electronic device 201 directly or wirelessly with an external electronic device (eg, the electronic device 202).
  • the interface 277 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 278 may include a connector through which the electronic device 201 can be physically connected to an external electronic device (eg, the electronic device 202).
  • the connection terminal 278 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 279 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 279 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 280 can capture still images and moving images.
  • the camera module 280 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 288 can manage power supplied to the electronic device 201.
  • the power management module 288 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • Battery 289 may supply power to at least one component of electronic device 201.
  • the battery 289 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • The communication module 290 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 201 and an external electronic device (e.g., the electronic device 202, the electronic device 204, or the server 208), and performance of communication through the established communication channel. The communication module 290 may operate independently of the processor 220 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication. According to one embodiment, the communication module 290 may include a wireless communication module 292 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 294 (e.g., a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, the corresponding communication module may communicate with the external electronic device 204 through the first network 298 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 299 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or WAN)).
  • The wireless communication module 292 may use subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 296 to communicate within a communication network such as the first network 298 or the second network 299.
  • The wireless communication module 292 may support a 5G network after a 4G network and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 292 may support high frequency bands (e.g., mmWave bands), for example, to achieve high data rates.
  • The wireless communication module 292 may support various technologies for securing performance in a high frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 292 may support various requirements specified in the electronic device 201, an external electronic device (e.g., electronic device 204), or a network system (e.g., second network 299).
  • The wireless communication module 292 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency for realizing URLLC.
  • the antenna module 297 may transmit or receive signals or power to or from the outside (e.g., an external electronic device).
  • the antenna module 297 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • The antenna module 297 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 298 or the second network 299, may be selected from the plurality of antennas by, for example, the communication module 290. A signal or power may be transmitted or received between the communication module 290 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 297.
  • the antenna module 297 may form a mmWave antenna module.
  • According to one embodiment, a mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first side (e.g., the bottom side) of the printed circuit board and capable of supporting a designated high frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., the top or side) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
  • At least some of the above-described components may be connected to each other and exchange signals (e.g., commands or data) with each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
  • commands or data may be transmitted or received between the electronic device 201 and the external electronic device 204 through the server 208 connected to the second network 299.
  • Each of the external electronic devices 202 or 204 may be of the same or different type as the electronic device 201.
  • all or part of the operations performed in the electronic device 201 may be executed in one or more of the external electronic devices 202, 204, or 208.
  • For example, when the electronic device 201 needs to perform a function or service, the electronic device 201 may request one or more external electronic devices to perform at least part of the function or service, instead of, or in addition to, executing the function or service on its own.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 201.
  • The electronic device 201 may process the result, as is or additionally, and provide it as at least part of a response to the request. To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 201 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 204 may include an Internet of Things (IoT) device.
  • Server 208 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 204 or server 208 may be included in the second network 299.
  • the electronic device 201 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • Electronic devices may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smart phones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • electronic devices according to embodiments of this document are not limited to the above-mentioned devices.
  • Terms such as "first", "second", or "first or second" may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • When one (e.g., first) component is referred to as being "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the component may be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • A module may be an integrally formed component, or a minimum unit of the component or a part thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 240) including one or more instructions stored in a storage medium (e.g., the internal memory 236 or the external memory 238) readable by a machine (e.g., the electronic device 201).
  • For example, a processor (e.g., the processor 220) of the machine (e.g., the electronic device 201) may call at least one of the one or more instructions stored in the storage medium and execute it. This allows the machine to be operated to perform at least one function according to the at least one instruction called.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not include signals (e.g., electromagnetic waves); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • at least a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (e.g., module or program) of the above-described components may include a single or plural entity, and some of the plurality of entities may be separately placed in other components.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, multiple components (e.g., modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as they were performed by the corresponding component of the plurality of components prior to the integration.
  • According to various embodiments, operations performed by a module, program, or other component may be executed sequentially, in parallel, iteratively, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIGS. 3A, 3B, 3C, and 3D are diagrams for explaining a wearable device according to an embodiment.
  • a wearable device 300 may be mounted on the user to assist the user in walking.
  • the wearable device 300 may be a device that assists the user's walking.
  • the wearable device 300 may be an exercise device that not only assists the user's movement (eg, walking or exercise), but also provides exercise functions by providing resistance to the user.
  • the resistance provided to the user may be a force actively applied to the user, such as a force output by a device such as a motor.
  • the resistance force is not a force actively applied to the user, but may be a force that hinders the user's movement, such as friction force. In other words, resistance can be expressed as exercise load.
  • FIGS. 3A, 3B, 3C, and 3D show a hip-type wearable device 300, but the type of wearable device is not limited to the hip type; the wearable device may be a type that supports the entire lower extremity or a portion of the lower extremity. Additionally, the wearable device may be one of a form that supports part of the lower extremity, a form that supports up to the knee, a form that supports up to the ankle, and a form that supports the entire body.
  • Embodiments described with reference to FIGS. 3A, 3B, 3C, and 3D may be applied to a hip type, but are not limited thereto and may be applied to various types of wearable devices.
  • The wearable device 300 may include a driving unit 310, a sensor unit 320, an inertial measurement unit (IMU) 330, a control unit 340, a battery 350, and a communication module 352.
  • the IMU 330 and the control unit 340 may be placed within the main frame of the wearable device 300.
  • the IMU 330 and the control unit 340 may be included in a housing formed on (or attached to) the outside of the main frame of the wearable device 300.
  • the driving unit 310 may include a motor 314 and a motor driver circuit 312 for driving the motor 314.
  • the sensor unit 320 may include at least one sensor 321.
  • the control unit 340 may include a processor 342, memory 344, and input interface 346.
  • In FIG. 3C, one sensor 321, one motor driver circuit 312, and one motor 314 are shown, but this is only an example.
  • For example, a wearable device 300-1 may include a plurality of sensors 321 and 321-1, a plurality of motor driver circuits 312 and 312-1, and a plurality of motors 314 and 314-1. Additionally, depending on the implementation, the wearable device 300 may include a plurality of processors. The number of motor driver circuits, motors, or processors may vary depending on the part of the body on which the wearable device 300 is worn.
  • the driving unit 310 may drive the user's hip joint.
  • the driving unit 310 may be located on the user's right hip and/or left hip.
  • the driving unit 310 may be additionally located at the user's knees and ankles.
  • the driving unit 310 includes a motor 314 capable of generating rotational torque and a motor driver circuit 312 for driving the motor 314.
  • the sensor unit 320 can measure the angle of the user's hip joint when walking.
  • Information about the angle of the hip joint sensed by the sensor unit 320 may include the angle of the right hip joint, the angle of the left hip joint, the difference between the angles of both hip joints, and the direction of hip joint movement.
  • the sensor 321 may be located within the driving unit 310.
  • the sensor unit 320 may additionally measure the user's knee angle and ankle angle.
  • the sensor 321 may be an encoder.
  • sensor 321 may be a Hall sensor. Information on the joint angle measured by the sensor unit 320 may be transmitted to the control unit 340.
  • the sensor unit 320 may include a potentiometer.
  • the potentiometer can sense the R-axis joint angle, L-axis joint angle, R-axis joint angular velocity, and L-axis joint angular velocity according to the user's walking motion.
  • the R/L axis may be a reference axis for the user's right/left leg.
  • the R/L axis may be set to be perpendicular to the ground, have a negative value on the front side of the person's torso, and have a positive value on the back side of the person's torso.
  • the IMU 330 can measure acceleration information and posture information while walking.
  • the IMU 330 may sense X-, Y-, and Z-axis acceleration and X-, Y-, and Z-axis angular velocities according to the user's walking motion. Acceleration information and posture information measured by the IMU 330 may be transmitted to the control unit 340.
  • The wearable device 300 may include a sensor (for example, an electromyogram (EMG) sensor) capable of sensing changes in the user's momentum or biosignals according to the walking motion.
  • the control unit 340 may generally control the operation of the wearable device 300.
  • the control unit 340 may receive information sensed by each of the sensor unit 320 and the IMU 330.
  • The information sensed by the IMU 330 may include acceleration information and posture information, and the information sensed by the sensor unit 320 may include the angle of the right hip joint, the angle of the left hip joint, the difference between the angles of both hip joints, and the direction of hip joint movement.
  • the control unit 340 may calculate the difference between the angles of both hip joints based on the angle of the right hip joint and the angle of the left hip joint.
  • the control unit 340 may generate a signal to control the driver 310 based on the sensed information.
  • The torque generated according to the signal may be an assistive force that assists the user's movement.
  • Alternatively, the torque generated according to the signal may be a resistance force that impedes the user's movement. The resistance force may be provided for the user's exercise.
  • A negative magnitude of the exercise load (or torque) may mean a resistance force, and a positive magnitude may mean an assistance force.
  • the processor 342 of the control unit 340 may control the driving unit 310 to provide resistance to the user.
  • the driving unit 310 may provide resistance to the user by actively applying force to the user through the motor 314.
  • the driving unit 310 may provide resistance to the user by using the back-drivability of the motor 314 without actively applying force to the user.
  • The back-drivability of a motor may mean the responsiveness of the motor's rotation axis to an external force. The higher the motor's back-drivability, the more easily the rotation axis of the motor reacts to (i.e., rotates in response to) an external force acting on it. Even if the same external force is applied to the rotation axis of the motor, the degree to which the rotation axis rotates may vary depending on the degree of back-drivability.
  • the processor 342 of the control unit 340 may control the driving unit 310 so that the driving unit 310 outputs torque (or auxiliary torque) to assist the user's movement.
  • The driving unit 310 may be disposed on each of the left hip portion and the right hip portion, and the control unit 340 may output a control signal that controls the driving unit 310 to generate torque.
  • the driving unit 310 may generate torque based on the control signal output by the control unit 340.
  • the torque value for generating torque may be set externally or may be set by the control unit 340.
  • The control unit 340 may use the magnitude of the current of the signal transmitted to the driving unit 310 to indicate the magnitude of the torque value. That is, the larger the current received by the driving unit 310, the larger the torque value may be.
  • The processor 342 of the control unit 340 may transmit a control signal to the motor driver circuit 312 of the driving unit 310, and the motor driver circuit 312 may control the motor 314 by generating a current corresponding to the control signal.
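A minimal sketch of the torque-to-current idea described above follows; the torque constant and current limit are hypothetical values, and a real motor driver circuit would be commanded through its own interface rather than a returned number.

```python
# Sketch: convert a desired torque into a current command for the motor driver.
# The torque constant (Nm per A) and current limit are hypothetical.

MOTOR_TORQUE_CONSTANT_NM_PER_A = 0.12
MAX_CURRENT_A = 8.0

def torque_to_current_command(torque_nm: float) -> float:
    """Larger torque -> larger current, clamped to the driver's limit."""
    current_a = torque_nm / MOTOR_TORQUE_CONSTANT_NM_PER_A
    return max(-MAX_CURRENT_A, min(MAX_CURRENT_A, current_a))

print(torque_to_current_command(0.6))   # 5.0 A
print(torque_to_current_command(2.0))   # clamped to 8.0 A
```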
  • Battery 350 supplies power to components of wearable device 300.
  • The wearable device 300 may further include a circuit (e.g., a power management integrated circuit (PMIC)) that converts the power of the battery 350 to match the operating voltage of the components of the wearable device 300 and provides it to those components. Additionally, depending on the operation mode of the wearable device 300, the battery 350 may or may not supply power to the motor 314.
  • the communication module 352 may support establishment of a direct (e.g., wired) communication channel or wireless communication channel between the wearable device 300 and an external electronic device, and performance of communication through the established communication channel.
  • Communication module 352 may include one or more communication processors that support direct (e.g., wired) or wireless communication.
  • The communication module 352 may include a wireless communication module (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module (e.g., a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, the corresponding communication module may communicate with an external electronic device through a first network (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network).
  • These various types of communication modules may be integrated into one component (e.g., a single chip), or may be implemented as a plurality of components (e.g., multiple chips) separate from each other.
  • the electronic device 201 described above with reference to FIG. 2 may be included in the wearable device 300.
  • The electronic device 201 described above with reference to FIG. 2 may be a separate device that is physically separated from the wearable device 300, and the electronic device 201 and the wearable device 300 may be connected through short-range wireless communication.
  • FIG. 4 is a diagram illustrating a wearable device that communicates with an electronic device, according to an embodiment.
  • The wearable device 300 described above with reference to FIGS. 3A, 3B, 3C, and 3D may communicate with the electronic device 201 described above with reference to FIG. 2.
  • the electronic device 201 may be the user's electronic device of the wearable device 300.
  • the wearable device 300 and the electronic device 201 may be connected using a short-range wireless communication method.
  • the electronic device 201 may display a user interface (UI) for controlling the operation of the wearable device 300 on the display 201-1.
  • the UI may include at least one soft key that allows the user to control the wearable device 300.
  • The user may input a command to control the operation of the wearable device 300 through the UI on the display 201-1 of the electronic device 201, and the electronic device 201 may generate a control command corresponding to the command and transmit the generated control command to the wearable device 300.
  • the wearable device 300 can operate according to the received control command and transmit the control result to the electronic device 201.
  • the electronic device 201 may display a control completion message on the display 201-1 of the electronic device 201.
  • FIGS. 5 and 6 are diagrams for explaining a torque output method of a wearable device, according to an embodiment.
  • the driving units 310-1 and 310-2 of the wearable device 300 of FIG. 3 are located near the user's hip joint.
  • the control unit 340 of the wearable device 300 may be located near the waist.
  • the positions of the driving units 310-1 and 310-2 and the control unit 340 are not limited to the examples shown in FIGS. 5 and 6.
  • the wearable device 300 measures (or senses) the user's left hip joint angle q_l and right hip joint angle q_r .
  • the wearable device 300 can measure the user's left hip joint angle q_l through a left encoder (or Hall sensor), and can measure the user's right hip joint angle q_r through a right encoder.
  • When the left leg is ahead of the baseline 620, the left hip joint angle q_l may be a negative number, and when the right leg is behind the baseline 620, the right hip joint angle q_r may be a positive number.
  • Conversely, the right hip joint angle q_r may be negative when the right leg is ahead of the baseline 620, and the left hip joint angle q_l may be positive when the left leg is behind the baseline 620.
  • The wearable device 300 may obtain a first angle (e.g., q_r) and a second angle (e.g., q_l) by filtering a first raw angle (e.g., q_r_raw) of a first joint (e.g., the right hip joint) and a second raw angle (e.g., q_l_raw) of a second joint (e.g., the left hip joint) measured by the sensor unit 320. For example, the wearable device 300 may filter the first raw angle and the second raw angle based on a first previous angle and a second previous angle measured at the previous time.
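The text states only that the raw joint angles are filtered using angles measured at the previous time, without giving the filter itself. The sketch below assumes a simple first-order low-pass (exponential smoothing) filter for illustration; the smoothing factor is hypothetical.

```python
# Sketch: filter raw hip-joint angles using the previously filtered angles.
# A first-order low-pass filter is assumed; the smoothing factor is hypothetical.

ALPHA = 0.2  # smoothing factor (assumed)

def filter_angle(raw_angle_deg: float, previous_angle_deg: float) -> float:
    return ALPHA * raw_angle_deg + (1.0 - ALPHA) * previous_angle_deg

q_r_prev, q_l_prev = 10.0, -12.0          # previously filtered angles
q_r = filter_angle(11.5, q_r_prev)        # filtered right hip angle
q_l = filter_angle(-13.0, q_l_prev)       # filtered left hip angle
print(q_r, q_l)
```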
  • The wearable device 300 may determine a torque value τ(t) based on the left hip joint angle q_l, the right hip joint angle q_r, an offset angle c, a sensitivity, a gain, and a delay Δt, and may control the motor driver circuit 312 of the wearable device 300 to output the determined torque value τ(t).
  • The force provided to the user by the torque value τ(t) may be referred to as force feedback.
  • For example, the wearable device 300 may determine the torque value τ(t) based on [Equation 1] below.
  • In [Equation 1], y may be a state factor, q_r may be the right hip joint angle, and q_l may be the left hip joint angle.
  • The state factor y may be related to the distance between the two legs. For example, when y is 0, it may indicate a state in which the distance between the legs is 0 (i.e., a crossing state), and when the absolute value of y is at its maximum, it may indicate a state in which the angle between the legs is at its maximum (i.e., a landing state).
  • the state factor may be expressed as y(t) .
  • The gain is a parameter that represents the magnitude and direction of the output torque. The larger the gain, the stronger the torque that can be output. If the gain is a negative number, torque that acts as a resistance force on the user may be output, and if the gain is a positive number, torque that acts as an assistive force on the user may be output.
  • The delay Δt is a parameter related to the output timing of the torque. The value of the gain and the value of the delay Δt may be set in advance and may be adjusted by the user, the wearable device 300, or the electronic device 201 described above with reference to FIG. 2.
  • a model that outputs torque that acts as an assistive force to the user using [Equation 1] can be defined as a torque output model (or torque output algorithm).
  • the wearable device 300 or the electronic device 201 can determine the size and delay of the torque to be output by inputting the values of the input parameters received through sensors into the torque output model.
  • The wearable device 300 or the electronic device 201 may determine a first torque value through [Equation 2] below by applying a first gain value and a first delay value, which are parameter values determined for the state factor, to the first state factor y(t).
  • the calculated first torque value may include a value for the first joint and a value for the second joint.
  • For example, the value for the first joint may be a value for the right hip joint, and the value for the second joint may be a value for the left hip joint. The two values may have the same magnitude, and the directions of the torques may be opposite to each other.
  • the wearable device 300 may control the motor driver 312 of the wearable device 300 to output torque corresponding to the first torque value.
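[Equation 1] and [Equation 2] are not reproduced in this text, so the sketch below assumes a commonly used delayed-feedback form, in which the torque is the gain applied to the state factor evaluated Δt earlier and the state factor is the difference between the two hip joint angles. It is meant only to illustrate how the gain, the delay, and the opposite-direction torques for the two joints fit together, not to restate the publication's exact equations.

```python
# Illustrative torque output model. The exact equations in the publication are
# not given in this text; tau(t) = gain * y(t - dt) with y = q_l - q_r is an
# assumed, commonly used form.

from collections import deque

class TorqueOutputModel:
    def __init__(self, gain: float, delay_samples: int):
        self.gain = gain
        # buffer of past state factors so the torque can be delayed by dt
        self.history = deque([0.0] * (delay_samples + 1), maxlen=delay_samples + 1)

    def step(self, q_l_deg: float, q_r_deg: float):
        y = q_l_deg - q_r_deg          # assumed state factor (leg-angle difference)
        self.history.append(y)
        delayed_y = self.history[0]    # y(t - dt)
        tau = self.gain * delayed_y
        # same magnitude, opposite direction for the two hip joints
        return {"right_hip": tau, "left_hip": -tau}

model = TorqueOutputModel(gain=0.5, delay_samples=3)
for q_l, q_r in [(-10, 12), (-5, 8), (0, 0), (8, -6)]:
    print(model.step(q_l, q_r))
```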
  • the wearable device 300 may provide asymmetric torque to both legs of the user to assist the asymmetric walking. For example, stronger assistance can be provided to the leg with a short stride or slow swing speed.
  • the leg with a short stride or slow swing speed is referred to as the affected leg or target leg.
  • the swing time of the affected leg may be shorter or the stride length may be shorter than that of the sound leg.
  • a method of adjusting the timing of torque acting on the affected leg to assist the user's walking may be considered.
  • an offset angle may be added to the actual joint angle for the affected leg to increase the output time of torque to assist the swing motion of the affected leg.
  • c may be a value of a parameter indicating the offset angle between joint angles.
  • the value of the input parameter input to the torque output model mounted (or applied) to the wearable device 300 can be adjusted.
  • the values of q_r and q_l can be adjusted through [Equation 3] below.
  • In [Equation 3], c_r may mean the offset angle for the right hip joint, and c_l may mean the offset angle for the left hip joint.
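[Equation 3] is likewise not reproduced in this text. Based on the surrounding description, a straightforward additive adjustment of the joint angles by the offset angles is sketched below as an assumption.

```python
# Sketch of the offset-angle adjustment described around [Equation 3].
# The simple additive form below is an assumption based on the surrounding text.

def apply_offset_angles(q_r_deg: float, q_l_deg: float,
                        c_r_deg: float = 0.0, c_l_deg: float = 0.0):
    """Shift the joint angles fed into the torque output model so that, e.g.,
    assistance for the affected leg's swing starts earlier."""
    return q_r_deg + c_r_deg, q_l_deg + c_l_deg

# Example: offset only the left (affected) hip joint by 5 degrees.
print(apply_offset_angles(12.0, -10.0, c_l_deg=5.0))
```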
  • The wearable device 300 may filter the state factor to reduce discomfort felt by the user due to irregular torque output. For example, the wearable device 300 or the electronic device 201 may determine an initial state factor y_raw(t) at the current time t based on the first angle of the first joint and the second angle of the second joint, and may determine the first state factor y(t) based on the previous state factor y_prv determined for the previous time t-1 and the initial state factor y_raw(t).
  • the current time t may mean the processing time for the t-th data (or sample), and the previous time t-1 may mean the processing time for the t-1-th data.
  • the difference between the current time t and the previous time t-1 may be the operation cycle of the processor that generates or processes the corresponding data.
  • the sensitivity may be the value of a parameter indicating the sensitivity of the state factor filtering.
  • the sensitivity value may be continuously adjusted during the test walk, but the sensitivity value may be preset to a constant value to reduce computational complexity.
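  • The exact filtering equation is likewise not shown in this text; the sketch below assumes a simple exponential-smoothing form in which the sensitivity acts as the smoothing coefficient between the previous state factor and the newly computed raw state factor.

```python
def filter_state_factor(y_raw_t: float, y_prv: float, sensitivity: float) -> float:
    """Assumed exponential-smoothing filter for the state factor.

    A sensitivity close to 1 follows the raw state factor closely, while a
    sensitivity close to 0 smooths aggressively, reducing irregular torque.
    """
    return y_prv + sensitivity * (y_raw_t - y_prv)


# Example: smooth a noisy sequence of raw state factors with a fixed sensitivity.
y = 0.0
for y_raw in [0.0, 0.5, 0.2, 0.6, 0.55]:
    y = filter_state_factor(y_raw, y, sensitivity=0.3)
```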
  • the torque output method based on the state factor described with reference to [Equation 1] to [Equation 3] can be used when the user wearing the wearable device 300 is in a walking state.
  • a motion control model corresponding to the exercise performed by the user may be used to control the wearable device 300.
  • Figure 7 shows a motion sensing circuit worn on the user's upper body, according to one embodiment.
  • smart clothing 700 may include a motion sensing circuit 710.
  • the motion sensing circuit 710 may include at least one IMU (721; 722; 723), a core 730, at least one stretch sensor 750, a connector 760, and a cable 740 connecting the elements of the motion sensing circuit 710.
  • At least one IMU (721, 722, 723) and at least one stretch sensor 750 may be symmetrically connected by a cable 740.
  • the IMU 721 may be located on the back or chest of the clothing 700, the IMU 722 may be located above the elbow of the arm, the IMU 723 may be located below the elbow of the arm, and the stretch sensor 750 may be located at the elbow.
  • Upper body movement information about the upper body movement of a user wearing smart clothing 700 may be generated by the motion sensing circuit 710.
  • the upper body movement information may include acceleration information generated by at least one IMU (721; 722; 723) included in the motion sensing circuit 710.
  • the upper body movement information may include resistance value information generated by at least one stretch sensor 750.
  • At least one IMU (721; 722; 723) may measure (or sense) acceleration information according to the user's movement.
  • the IMU (721; 722; 723) can sense acceleration of the X-axis, Y-axis, and Z-axis and angular velocity of the X-axis, Y-axis, and Z-axis according to the user's movement.
  • the user's shoulder rotation angle may be determined based on acceleration information generated by the IMU 722 located on the upper arm of the smart clothing 700.
  • the user's elbow angle may be determined based on acceleration information generated by the IMU 722 located above the elbow and the IMU 723 located below the elbow of the arm portion of the smart clothing 700. Acceleration information measured by the IMU (721; 722; 723) may be transmitted to the core 730.
  • At least one stretch sensor 750 is located on the user's elbow and can measure (or sense) resistance value information according to the user's movement. Resistance value information measured by the stretch sensor 750 may be transmitted to the core 730.
  • the resistance value may vary depending on the extent to which the stretch sensor 750 is stretched due to bending of the area to which the stretch sensor 750 is attached. A larger resistance value measured by the stretch sensor 750 may indicate a greater degree of bending of the area where the stretch sensor 750 is located. For example, the user's elbow angle may be determined based on the resistance value measured by the stretch sensor 750.
  • the measured resistance value may also vary depending on how the user taps the area to which the stretch sensor 750 is attached. The more times the resistance value measured by the stretch sensor 750 momentarily increases and then decreases again, the greater the number of times the area to which the stretch sensor 750 is attached has been tapped. For example, a user input (e.g., a tap or double tap) may be detected based on the resistance value measured by the stretch sensor 750.
  • the upper body movement information may be information in the form of an analog signal (eg, a current value).
  • the core 730 may convert upper body movement information from analog signal form to digital signal form.
  • core 730 may include an analog-to-digital converter (ADC).
  • the core 730 may acquire and process upper body movement information measured by the IMU (721; 722; 723) and/or the stretch sensor 750.
  • the core 730 may perform operations described with reference to FIGS. 10 to 23 below.
  • the core 730 may include a battery to supply power to the core 730 and the motion sensing circuit 710.
  • the core 730 may be further connected to an input device.
  • the input device may be a button.
  • the core 730 may directly obtain user input through an input device.
  • smart clothing 700 may be connected to a wearable device (e.g., wearable device 120 in FIG. 1 or wearable device 300 in FIG. 3) through the core 730 using short-range wireless communication.
  • Upper body movement information may be converted from analog signal form to digital signal form by the core 730.
  • smart clothing 700 may transmit upper body movement information to a wearable device through the core 730.
  • smart clothing 700 may be connected to a wearable device by wire using a connector 760.
  • the wearable device can obtain upper body movement information from the smart clothing 700 through the connector 760.
  • Upper body movement information can be converted from analog signal form to digital signal form by a wearable device.
  • smart clothing 700 may not include a core.
  • the wearable device can receive upper body movement information by controlling smart clothing 700 through the connector 760.
  • Figure 8 is a diagram for explaining target angle information according to one embodiment.
  • the upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • Target angle information may be determined based on upper body movement information.
  • the target angle information may include at least one of an elbow angle, a shoulder rotation angle, or a combination thereof.
  • target angle information determined based on the user's upper body movement information may include at least one of an angle 810 or an angle 820.
  • Angle 810 may represent the user's elbow angle, and angle 820 may represent the user's shoulder rotation angle. For example, when the user moves the upper body from a posture in which the elbow is extended with the fingertips pointing toward the ground to a posture in which the arm is bent and swung forward while walking, the angle 810 may increase and the angle 820 may decrease.
  • Figure 9 is a diagram for explaining target angle information regarding a stretch sensor, according to an embodiment.
  • the solid line 900 in the graph of FIG. 9 may indicate target angle information determined based on resistance value information measured by a stretch sensor (e.g., stretch sensor 750 of FIG. 7) included in the motion sensing circuit (e.g., the motion sensing circuit 710 of FIG. 7) of smart clothing (e.g., smart clothing 150 of FIG. 1 or smart clothing 700 of FIG. 7).
  • solid line 900 may represent the elbow angle.
  • the measured resistance value may vary depending on the extent to which the stretch sensor is stretched due to bending of the area where the stretch sensor is attached. A larger resistance value measured by the stretch sensor may indicate a greater degree of bending (e.g., elbow angle) of the area where the stretch sensor is located.
  • the constant initial value of the solid line 900 may indicate a state in which the user straightens the elbow, the point 910 may indicate a state in which the user bends the elbow, and the point 920 may indicate a state in which the user straightens the elbow again.
  • Figure 10 is a flowchart of a method for controlling a wearable device, according to an embodiment.
  • operations 1010 to 1040 below may be performed by the wearable device.
  • the processor of the wearable device may obtain upper body movement information about the upper body movement of the user wearing the wearable device.
  • the upper body movement information may include acceleration information generated by at least one IMU (e.g., IMU (721; 722; 723) of FIG. 7) included in a motion sensing circuit (e.g., motion sensing circuit 710 of FIG. 7).
  • the upper body movement information may include resistance value information generated by at least one stretch sensor (eg, stretch sensor 750 of FIG. 7).
  • the processor may obtain the upper body movement information from smart clothing (e.g., smart clothing 150 of FIG. 1 or smart clothing 700 of FIG. 7) that is connected to the wearable device by wire through a connector (e.g., connector 760 of FIG. 7).
  • upper body movement information may be information in the form of an analog signal (e.g., current value).
  • the upper body movement information may be information in the form of a digital signal converted from an analog signal form by the core of the smart clothing (e.g., core 730 of FIG. 7).
  • the processor may obtain upper body movement information from smart clothing connected to a wearable device using short-range wireless communication.
  • upper body movement information may be information in the form of an analog signal.
  • the upper body movement information may be information in the form of a digital signal converted from an analog signal form by the core of the smart clothing.
  • the processor may calculate an optimal stride length for the user based on the upper body movement information.
  • the method of calculating the optimal stride length for a user is explained in detail with reference to FIGS. 12 and 13 below.
  • the processor may determine a first value of the control parameter corresponding to the optimal stride length.
  • the control parameter may be a parameter for adjusting at least one of the size, direction, or timing of the torque to be output.
  • the control parameter may include at least one of gain, delay, or a combination thereof.
  • Gain may represent the magnitude and direction of output torque, and delay may represent torque output timing.
  • the value of gain or delay may be set in advance and may be adjusted by the user, smart clothing, or wearable device. For example, when the control parameter is gain, if the gain value is negative, torque acting as a resistance force to the user may be output, and if the gain value is positive, torque acting as an assisting force may be output to the user. The larger the gain value, the stronger the torque can be output.
  • the processor may determine the first value of the control parameter based on preset optimal parameter information corresponding to the optimal stride length.
  • the wearable device may have data on preset optimal parameter information corresponding to different optimal stride lengths stored in a database in a memory (e.g., memory 344 in FIG. 3).
  • the processor may determine the first value of the control parameter using data stored in the memory.
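  • A minimal sketch of operation 1030 under these assumptions: the preset optimal parameter information is modeled as a dictionary keyed by optimal stride length, and the entry whose key is closest to the calculated optimal stride length is selected. The dictionary layout and all numeric values are hypothetical.

```python
def determine_control_parameter(optimal_stride: float,
                                preset_params: dict[float, dict]) -> dict:
    """Pick the preset control-parameter set whose stored optimal stride
    length is closest to the calculated one (illustrative lookup only)."""
    key = min(preset_params, key=lambda stride: abs(stride - optimal_stride))
    return preset_params[key]


# Example: parameter sets stored in memory for three optimal stride lengths (m).
preset = {
    0.5: {"gain": 0.4, "delay": 2},
    0.7: {"gain": 0.6, "delay": 2},
    0.9: {"gain": 0.8, "delay": 1},
}
first_value = determine_control_parameter(0.72, preset)  # -> {"gain": 0.6, "delay": 2}
```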
  • the processor may control the motor driver of the wearable device based on the first value of the control parameter. For example, if the torque value determined based on the first value of the control parameter is a positive number, the processor may control the driving unit (e.g., the driving unit 310 in FIG. 3) to output torque (or assistance torque) through a motor (e.g., the motor 314 in FIG. 3) to help the user walk.
  • the driving unit can provide an assistive force by applying force to the user through the motor.
  • the processor may also control the motor driver of the wearable device based on a setting value directly input by the user or a preset default value.
  • Figure 11 is a flowchart of a method for controlling a motor driver of a wearable device, according to an embodiment.
  • operation 1040 described above with reference to FIG. 10 may include operations 1110 and 1120 below.
  • Operations 1110 and 1120 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may determine the torque value based on the first value of the control parameter.
  • the method of determining the torque value based on the value of the control parameter has been described in detail with reference to FIGS. 5 and 6, and redundant information will be omitted.
  • the processor may control the motor driver of the wearable device to output a torque value.
  • the processor may provide assistance to the user by controlling a motor (eg, motor 314 in FIG. 3) based on the torque value.
  • the processor drives the motor to output auxiliary torque using the battery of the wearable device (e.g., battery 350 in FIG. 3), and an auxiliary force may be provided to the user by the auxiliary torque output by the motor.
  • the torque value refers to the control signal applied to the motor
  • the auxiliary torque refers to the rotational torque output by the motor based on the torque value
  • the auxiliary force refers to the force felt by the user due to the auxiliary torque.
  • Figure 12 is a flowchart of a method for calculating an optimal stride length, according to one embodiment.
  • operation 1020 described above with reference to FIG. 10 may include operations 1210 to 1230 below.
  • Operations 1210 to 1230 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may determine target angle information based on upper body movement information.
  • upper body movement information may include acceleration information.
  • upper body movement information may include resistance value information.
  • Target angle information may be determined based on upper body movement information.
  • the target angle information may include at least one of an elbow angle, a shoulder rotation angle, or a combination thereof.
  • the processor may determine the elbow angle based on resistance value information. For example, the processor may determine the user's elbow angle based on a resistance value measured by a stretch sensor (eg, stretch sensor 750 in FIG. 7).
  • the processor may determine the elbow angle based on acceleration information. For example, the processor may determine the user's elbow angle based on acceleration information generated by the IMU (e.g., IMU 722 of FIG. 7) located at the top of the arm portion of the smart clothing (e.g., smart clothing 700 of FIG. 7) and the IMU (e.g., IMU 723 of FIG. 7) located at the bottom.
  • the processor may determine the elbow angle based on resistance value information and acceleration information.
  • the processor may determine shoulder rotation angle information based on acceleration information. For example, the processor may determine the user's shoulder rotation angle information based on acceleration information generated by an IMU located on the upper arm of the smart clothing.
  • the processor may calculate the optimal stride ratio based on the target angle information.
  • the optimal stride ratio may represent a parameter that allows a user wearing a wearable device to walk optimally between the maximum stride length and the minimum stride length.
  • the maximum stride length and minimum stride length are described in detail with reference to operation 1230 below.
  • the optimal stride ratio can be calculated using [Equation 4] below.
  • the shoulder rotation angle and the elbow angle may be the angles, indicated by the target angle information determined based on the upper body movement information in operation 1210, at the time when the upper body movement information is acquired.
  • the shoulder rotation angle and the elbow angle can be expressed as the current shoulder rotation angle and the current elbow angle.
  • the maximum shoulder rotation angle and minimum shoulder rotation angle may be optimal shoulder rotation angles corresponding to when the user walks with the maximum stride length and minimum stride length, respectively.
  • the maximum elbow angle and minimum elbow angle may be optimal elbow angles corresponding to when the user walks at the maximum and minimum strides, respectively.
  • the shoulder weighting ratio may represent a weighting ratio of the shoulder rotation angle to the elbow angle between 0 and 1. For example, the shoulder weight ratio may be set to 0.5 by default.
  • the processor may calculate the optimal stride length based on the optimal stride ratio.
  • the optimal stride length can be calculated using [Equation 5] below.
  • the processor may calculate the optimal stride length for the user's preset maximum stride length and minimum stride length based on the optimal stride ratio.
  • the user's preset maximum and minimum stride lengths may be strides corresponding to the maximum torque and minimum torque that the wearable device can provide, respectively.
  • the user's preset maximum and minimum stride lengths may be determined based on the user's leg length that the user pre-enters in order to use the wearable device.
  • the user's preset maximum and minimum stride lengths may be arbitrarily input by the user.
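  • Because [Equation 4] and [Equation 5] are not reproduced here, the sketch below assumes forms consistent with the surrounding description: the optimal stride ratio as a shoulder-weighted blend of the normalized shoulder rotation angle and elbow angle, and the optimal stride length as an interpolation between the user's preset minimum and maximum stride lengths. All angle and stride values are illustrative.

```python
def optimal_stride_ratio(shoulder: float, elbow: float,
                         shoulder_min: float, shoulder_max: float,
                         elbow_min: float, elbow_max: float,
                         shoulder_weight: float = 0.5) -> float:
    """Assumed form of [Equation 4]: blend the normalized shoulder rotation
    angle and elbow angle using the shoulder weighting ratio (0 to 1)."""
    shoulder_norm = (shoulder - shoulder_min) / (shoulder_max - shoulder_min)
    elbow_norm = (elbow - elbow_min) / (elbow_max - elbow_min)
    ratio = shoulder_weight * shoulder_norm + (1.0 - shoulder_weight) * elbow_norm
    return min(max(ratio, 0.0), 1.0)  # keep the ratio between 0 and 1


def optimal_stride_length(ratio: float, stride_min: float, stride_max: float) -> float:
    """Assumed form of [Equation 5]: interpolate between the user's preset
    minimum and maximum stride lengths using the optimal stride ratio."""
    return stride_min + ratio * (stride_max - stride_min)


# Example with illustrative angles (degrees) and stride lengths (meters).
r = optimal_stride_ratio(shoulder=25.0, elbow=100.0,
                         shoulder_min=10.0, shoulder_max=40.0,
                         elbow_min=80.0, elbow_max=140.0)
stride = optimal_stride_length(r, stride_min=0.45, stride_max=0.80)
```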
  • Figure 13 is a flowchart of a method for calculating an optimal stride length, according to one embodiment.
  • operation 1020 described above with reference to FIG. 10 may include operations 1310 to 1330 below.
  • Operations 1310 to 1330 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may determine target angle information based on upper body movement information.
  • descriptions that overlap with those described in operation 1210 with reference to FIG. 12 are omitted.
  • the processor may determine whether the user's upper body movement corresponds to running based on the target angle information. Since the movements of the user's upper and lower body during walking are related to each other, whether the user's upper body movement corresponds to running may be determined based on target angle information determined based on the upper body movement information. For example, since the shoulder rotation angle indicated by the target angle information and the rate of change of the shoulder rotation angle are proportional to the user's stride length and the rate of change of the stride length, the processor may determine whether the user's upper body movement corresponds to running based on the stride length and the rate of change of the stride length inferred from the target angle information.
  • the processor may calculate the optimal stride length based on whether the upper body movement corresponds to running and the target angle information. After operation 1330, as described with reference to FIG. 10, the processor may determine a first value of the control parameter corresponding to the optimal stride length and control the motor driver of the wearable device based on the first value of the control parameter.
  • the processor may calculate the optimal stride length based on the running mode. For example, if the user's upper body movement corresponds to running, the optimal stride length can be calculated based on the running mode.
  • the processor may calculate the optimal stride length based on the walking mode. For example, if the user's upper body movement does not correspond to running, the optimal stride length can be calculated based on the walking mode.
  • the processor may calculate the optimal stride length based on target angle information. Regarding the method of calculating the optimal stride length, content that overlaps with the content described with reference to operations 1220 and 1230 of FIG. 12 will be omitted.
  • the processor may transmit information about the running mode to an electronic device (eg, the electronic device 110 of FIG. 1 or the electronic device 201 of FIG. 2).
  • when calculating the optimal stride length based on the walking mode, the processor may re-perform operation 1010 of FIG. 10 described above.
  • the processor may transmit information regarding the walking mode to the electronic device.
  • Figure 14 is a flowchart of a method for determining whether a user's upper body movement corresponds to running, according to one embodiment.
  • operation 1320 described above with reference to FIG. 13 may include operations 1410 and 1420 below.
  • Operations 1410 and 1420 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may determine whether the angles of both elbows indicated by the target angle information are greater than or equal to a preset threshold angle.
  • the preset threshold angle may be 110°.
  • the processor may determine whether the rate of change of the angles of both elbows is greater than or equal to a preset threshold angle change rate.
  • the preset threshold angle change rate may be 30°/sec.
  • the processor may determine that the user's upper body movement corresponds to running when the angles of both elbows indicated by the target angle information are greater than or equal to the preset threshold angle and the rate of change of the angles of both elbows is greater than or equal to the preset threshold angle change rate.
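  • A small sketch of the running check in operations 1410 and 1420, using the example thresholds given in the text (110° and 30°/sec); how the angle change rate is measured and whether the comparisons are inclusive are assumptions.

```python
def is_running(elbow_left: float, elbow_right: float,
               elbow_left_rate: float, elbow_right_rate: float,
               threshold_angle: float = 110.0,
               threshold_rate: float = 30.0) -> bool:
    """Treat the upper body movement as running when both elbow angles
    (degrees) reach the threshold angle and both angle change rates
    (degrees per second) reach the threshold change rate."""
    angles_ok = elbow_left >= threshold_angle and elbow_right >= threshold_angle
    rates_ok = elbow_left_rate >= threshold_rate and elbow_right_rate >= threshold_rate
    return angles_ok and rates_ok


# Example: both elbows bent to about 115° with a 35°/sec change rate -> running.
running = is_running(115.0, 118.0, 35.0, 40.0)
```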
  • Figure 15 is a diagram for explaining a method of determining whether a user's upper body movement corresponds to running, according to an embodiment.
  • the graph in FIG. 15 shows changes in sensing data according to the user's walking.
  • the horizontal axis of the graph represents time, and the vertical axis represents resistance values measured by a stretch sensor (e.g., stretch sensor 750 in FIG. 7).
  • line 1510 may represent the resistance value of the left elbow
  • line 1520 may represent the resistance value of the right elbow.
  • the section 1530 may represent a time interval (eg, 1 second) at which the angle change rate is determined. For example, if the angles of both elbows of the user change by more than 30° within 1 second while walking and are determined to be more than 110°, the user's upper body movement may be determined to correspond to running. If the user's upper body movement corresponds to running, the optimal stride length may be calculated based on the running mode.
  • Figure 16 is a flowchart of a method for controlling a wearable device, according to an embodiment.
  • operations 1610 to 1640 below may be performed after operation 1040 described above with reference to FIG. 10.
  • Operations 1610 to 1640 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may detect the first user input based on upper body movement information.
  • the upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • the first user input may represent a user double-tapping smart clothing (eg, smart clothing 150 in FIG. 1 or smart clothing 700 in FIG. 7).
  • the first user input may represent an action in which the user taps a stretch sensor (eg, stretch sensor 750 in FIG. 7) twice.
  • the first user input may be expressed as a double tap.
  • the processor can detect a double tap based on resistance value information.
  • the processor may determine a second value of the control parameter based on second upper body movement information obtained after the first user input is detected.
  • the second upper body movement information may represent information acquired between the time when the first user input is detected and the time when the second user input is detected.
  • the second upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • the control parameter may include at least one of gain, delay, or a combination thereof.
  • the second value of the control parameter (e.g., the value of gain or the value of delay) may be adjustable by the user, the smart clothing, or the wearable device.
  • the second upper body movement information may be about the user's arbitrary upper body movement to adjust the value of the gain.
  • the processor may detect the second user input based on the third upper body movement information.
  • the third upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • the second user input may represent the user double-tapping the smart clothing.
  • the processor may control the motor driver of the wearable device based on the second value of the control parameter when the second user input is detected. For example, when a double tap is detected as a second user input, the processor may control the motor driver of the wearable device based on the second value of the control parameter.
  • the method of controlling the motor driver of the wearable device based on the second value of the control parameter corresponds to the content described in operation 1040 with reference to FIG. 10, and overlapping content is omitted.
  • Figure 17 shows a graph for detecting user input, according to one embodiment.
  • the graph of FIG. 17 represents a resistance value measured by a stretch sensor (e.g., stretch sensor 750 of FIG. 7) included in the motion sensing circuit (e.g., the motion sensing circuit 710 of FIG. 7) of smart clothing (e.g., smart clothing 150 of FIG. 1 or smart clothing 700 of FIG. 7).
  • the horizontal axis of the graph represents time, and the vertical axis represents resistance value.
  • the measured resistance value may vary depending on tapping the area where the stretch sensor is attached. The more times the resistance value measured by the stretch sensor instantly increases and then decreases again, the greater the number of times the area where the stretch sensor is attached is tapped.
  • Point 1710 may represent a user tapping the stretch sensor once, and point 1720 may represent a user tapping the stretch sensor twice.
  • Point 1710 may be represented as a tap, and point 1720 may be represented as a double tap.
  • the fact that the resistance value before and after the point 1720 is greater than the resistance value before and after the point 1710 may indicate that the angle of the elbow where the stretch sensor is located at the time the user double taps the stretch sensor is larger.
  • point 1710 may represent the action of the user tapping the stretch sensor with his right hand while extending his left elbow.
  • point 1720 may represent the action of the user double-tapping the stretch sensor with his right hand while his left elbow is bent.
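  • The following sketch illustrates one possible way to count taps from a stream of stretch-sensor resistance samples in the spirit of FIG. 17: a tap is a momentary rise above a local baseline followed by a return. The thresholding scheme, window size, and sample values are assumptions made for illustration.

```python
def count_taps(resistance: list[float], spike_threshold: float,
               baseline_window: int = 5) -> int:
    """Count momentary resistance spikes (rise above the recent baseline by
    more than spike_threshold, then fall back), each treated as one tap."""
    taps = 0
    in_spike = False
    for i, r in enumerate(resistance):
        start = max(0, i - baseline_window)
        baseline = sum(resistance[start:i]) / (i - start) if i > start else r
        if not in_spike and r > baseline + spike_threshold:
            in_spike = True            # rising edge of a spike
        elif in_spike and r <= baseline + spike_threshold:
            in_spike = False           # falling edge: one complete tap
            taps += 1
    return taps


# Example: two short spikes over a flat baseline -> interpreted as a double tap.
samples = [10, 10, 10, 18, 10, 10, 18, 10, 10]
double_tap_detected = count_taps(samples, spike_threshold=5) == 2
```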
  • Figure 18 is a flowchart of a method for determining the value of a control parameter, according to one embodiment.
  • operation 1620 described above with reference to FIG. 16 may include operations 1810 and 1820 below.
  • Operations 1810 and 1820 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may determine input angle information based on the second upper body movement information.
  • the second upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • the second upper body movement information may be about the user's arbitrary upper body movement (eg, bending the user's elbow) to adjust the value of the gain.
  • the input angle information may be the user's arbitrary elbow angle for adjusting the value of the gain.
  • the processor may determine the user's elbow angle based on the second upper body movement information.
  • the processor may determine a second value of the control parameter to correspond to the input angle information.
  • the wearable device may have data regarding second values of control parameters corresponding to different input angle information stored in a database in a memory (e.g., memory 344 in FIG. 3).
  • the processor may determine the second value of the control parameter from data stored in the memory.
  • the data stored in the memory may relate a second value of the control parameter to each of several angle sections of the input angle information (e.g., the user's elbow angle). For example, if the input angle information indicates that the user's elbow angle is 120°, the second value (e.g., gain) of the control parameter may be determined to be 5.
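  • A hypothetical sketch of such a lookup: elbow-angle sections are mapped to gain values, with 120° mapping to a gain of 5 as in the example above; every other boundary and value below is invented purely for illustration.

```python
def gain_for_elbow_angle(elbow_angle: float) -> float:
    """Map angle sections of the input angle information (elbow angle in
    degrees) to a gain value, mimicking a table stored in memory."""
    sections = [
        (0.0, 60.0, 1.0),
        (60.0, 90.0, 2.0),
        (90.0, 110.0, 3.0),
        (110.0, 130.0, 5.0),   # 120° falls in this section -> gain 5
        (130.0, 180.0, 6.0),
    ]
    for low, high, gain in sections:
        if low <= elbow_angle < high:
            return gain
    return sections[-1][2]


second_value = gain_for_elbow_angle(120.0)  # -> 5.0
```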
  • Figure 19 is a diagram for explaining a method for determining the value of a control parameter, according to an embodiment.
  • the graph of FIG. 19 shows changes in resistance values sensed by a stretch sensor (e.g., stretch sensor 750 of FIG. 7) according to the user's upper body movement.
  • the horizontal axis of the graph represents time, and the vertical axis represents the resistance value measured by the stretch sensor.
  • the larger the resistance value measured by the stretch sensor, the greater the degree of bending of the area where the stretch sensor is located; and the greater the number of times the resistance value momentarily increases and then decreases again, the greater the number of times the area to which the stretch sensor is attached has been tapped.
  • at points 1910 and 1920, the number of times the resistance value measured by the stretch sensor momentarily increases and then decreases again is two each, which may mean that the area to which the stretch sensor is attached has been tapped twice at each point.
  • points 1910 and 1920 may represent the action of a user double-tapping the stretch sensor.
  • point 1910 may represent the first user input described with reference to FIG. 16 and point 1920 may represent the second user input.
  • Second upper body movement information may be obtained in the section between point 1910 and point 1920.
  • Input angle information may be determined based on the second upper body movement information.
  • the user's elbow angle may be determined based on the second upper body movement information.
  • the motor driver of the wearable device may be controlled, at the point in time (e.g., point 1920) at which the second user input is detected, based on the second value (e.g., gain) of the control parameter determined to correspond to the input angle information (e.g., elbow angle).
  • for example, if the input angle information is 120°, the second value of the control parameter may be determined to be 5, and when the double tap action is detected again at point 1920, the motor driver of the wearable device may be controlled based on the determined second value of the control parameter.
  • Figure 20 is a flowchart of a method for determining the value of a control parameter, according to one embodiment.
  • operation 1620 described above with reference to FIG. 16 may include operations 2010 to 2040 below.
  • Operations 2010 to 2040 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may detect the first adjustment input based on the second upper body movement information.
  • the second upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • the first adjustment input may represent an action in which the user taps smart clothing (eg, smart clothing 150 in FIG. 1 or smart clothing 700 in FIG. 7) once.
  • the first adjustment input may represent a user tapping a stretch sensor (eg, stretch sensor 750 in FIG. 7) once.
  • the first adjustment input can be represented as a tap.
  • the processor may detect a tap based on resistance value information. The method for detecting the first adjustment input is described in detail below with reference to FIG. 21.
  • the processor may determine input angle information indicated by the second upper body movement information at the time the first adjustment input is detected.
  • the second upper body movement information may include at least one of acceleration information or resistance value information.
  • the second upper body movement information may be about the user's arbitrary upper body movement (eg, bending the user's elbow) to adjust the value of the gain.
  • the input angle information may be the user's arbitrary elbow angle for adjusting the value of the gain.
  • the processor may determine the user's elbow angle based on the second upper body movement information.
  • the processor may determine whether to increase or decrease the current value of the control parameter based on the input angle information.
  • the current value of the control parameter means the value of the control parameter at the time the first adjustment input is detected.
  • the current value of the control parameter may be the first value of the control parameter determined corresponding to the optimal stride length for the user in operation 1030 of FIG. 10.
  • the processor may determine to increase the current value of the control parameter when the angle indicated by the input angle information is greater than or equal to a preset angle.
  • the processor may determine to decrease the current value of the control parameter when the angle indicated by the input angle information is less than a preset angle. For example, if the angle indicated by the input angle information (e.g., elbow angle) is greater than or equal to a preset angle (e.g., 90°), the processor may determine to increase the current value of the control parameter by 1.
  • the processor may determine a second value of the control parameter based on whether the current value of the control parameter increases or decreases. For example, when it is determined four times to increase the current value of the control parameter and once to decrease it, the second value of the control parameter may be determined to be the current value of the control parameter plus 3.
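  • A minimal sketch of the accumulation described in operations 2010 to 2040, using the example threshold (90°) and a step of 1 per adjustment input; the function signature and the assumption that the adjusted parameter is the gain are illustrative.

```python
def adjust_gain_by_taps(current_gain: float, elbow_angles_at_taps: list[float],
                        threshold: float = 90.0, step: float = 1.0) -> float:
    """Each single-tap adjustment input increases the current value (elbow
    angle at or above the threshold) or decreases it (below the threshold);
    the accumulated result becomes the second value of the control parameter
    once the closing double tap is detected."""
    value = current_gain
    for angle in elbow_angles_at_taps:
        value += step if angle >= threshold else -step
    return value


# Example: taps at 85°, 95° and 100° -> one decrease and two increases -> +1 overall.
second_value = adjust_gain_by_taps(current_gain=3.0,
                                   elbow_angles_at_taps=[85.0, 95.0, 100.0])
```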
  • Figure 21 is a diagram for explaining a method for determining the value of a control parameter, according to an embodiment.
  • the graph of FIG. 21 shows changes in resistance values sensed by a stretch sensor (e.g., stretch sensor 750 of FIG. 7) according to the user's upper body movement.
  • the horizontal axis of the graph represents time, and the vertical axis represents the resistance value measured by the stretch sensor.
  • the larger the resistance value measured by the stretch sensor the greater the degree of bending of the area where the stretch sensor is located, and the greater the number of times the resistance value momentarily increases and then decreases again, the greater the number of times the area to which the stretch sensor is attached is tapped.
  • at points 2110 and 2150, the resistance value measured by the stretch sensor increases instantaneously and then decreases twice each, so the number of times the area to which the stretch sensor is attached is tapped may be two each.
  • at points 2120, 2130, and 2140, the resistance value measured by the stretch sensor increases instantaneously and then decreases once each, which may mean that the area to which the stretch sensor is attached has been tapped once at each point.
  • points 2110 and 2150 may represent the action of a user double-tapping the stretch sensor.
  • point 2120, point 2130, and point 2140 may represent the action of a user tapping the stretch sensor.
  • point 2110 may represent the first user input described with reference to FIG. 16 .
  • point 2120, point 2130, and point 2140 may represent the first adjustment input described with reference to FIG. 20.
  • point 2150 may represent the second user input described with reference to FIG. 16 .
  • second upper body movement information may be obtained in the section between point 2110 and point 2150.
  • Input angle information may be determined based on the second upper body movement information.
  • the user's elbow angle may be determined based on the second upper body movement information.
  • the input angle information (e.g., elbow angle) indicated by the second upper body movement information at the time points at which the first adjustment input is detected in the section between the point 2110 and the point 2150 may continuously change. And, based on the input angle information, whether the current value of the control parameter increases or decreases may vary at each time point when the first adjustment input is detected in response to a change in the input angle information.
  • when a second user input (e.g., a double tap) is detected, a second value of the control parameter may be determined based on the increases and decreases of the current value of the control parameter accumulated up to the point at which the second user input is detected (e.g., point 2150), and the motor driver of the wearable device may be controlled based on the second value of the control parameter. For example, if the input angle information indicated by the second upper body movement information at the time corresponding to point 2120, at which the first adjustment input was detected after point 2110 at which the double tap motion was detected, is less than 90°, it may be determined to decrease the current value of the control parameter. If the input angle information indicated by the second upper body movement information at the time corresponding to point 2130, at which the first adjustment input was detected, is 90° or more, it may be determined to increase the current value of the control parameter.
  • if the input angle information indicated by the second upper body movement information at the time corresponding to point 2140, at which the first adjustment input was detected, is 90° or more, it may be determined to increase the current value of the control parameter.
  • through the above operations, the second value of the control parameter may be determined as the current value of the control parameter plus 1.
  • the motor driver of the wearable device may then be controlled based on the second value of the control parameter, that is, the current value plus 1.
  • Figure 22 is a flowchart of a method for controlling a wearable device, according to an embodiment.
  • operations 2210 to 2240 below may be performed by the wearable device.
  • a processor may obtain upper body movement information about the upper body movement of a user wearing a wearable device.
  • the upper body movement information may include at least one of acceleration information generated by at least one IMU (e.g., IMUs 721; 722; 723 of FIG. 7) included in a motion sensing circuit (e.g., motion sensing circuit 710 of FIG. 7), resistance value information generated by at least one stretch sensor (e.g., stretch sensor 750 of FIG. 7), or a combination thereof.
  • the processor may obtain the upper body movement information from smart clothing (e.g., smart clothing 150 of FIG. 1 or smart clothing 700 of FIG. 7) that is connected to the wearable device by wire through a connector (e.g., connector 760 of FIG. 7).
  • upper body movement information may be information in the form of an analog signal.
  • the upper body movement information may be information in the form of a digital signal converted from an analog signal form by the core of the smart clothing (e.g., core 730 of FIG. 7).
  • the processor may obtain upper body movement information from smart clothing connected to a wearable device using short-range wireless communication.
  • upper body movement information may be information in the form of an analog signal.
  • the upper body movement information may be information in the form of a digital signal converted from an analog signal form by the core of the smart clothing.
  • the processor may detect the first user input based on upper body movement information.
  • the upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • the first user input may represent the user double-tapping smart clothing.
  • the first user input may represent the user tapping the stretch sensor twice.
  • the first user input may be expressed as a double tap.
  • the processor can detect a double tap based on resistance value information. The method for detecting user input has been described in detail with reference to FIG. 17 .
  • the processor may determine a target value of the control parameter based on second upper body movement information obtained after the first user input is detected.
  • the second upper body movement information may be obtained between the time when the first user input is detected and the time when the second user input is detected.
  • the second upper body movement information may include at least one of acceleration information or resistance value information.
  • the control parameter may include at least one of gain, delay, or a combination thereof.
  • the target value of the control parameter (e.g., the value of gain or the value of delay) may be preset and may be adjustable by the user, smart clothing, or wearable device.
  • the second upper body movement information may be about the user's arbitrary upper body movement to adjust the value of the gain.
  • the processor may detect the second user input based on the third upper body movement information.
  • the third upper body movement information may include at least one of acceleration information, resistance value information, or a combination thereof.
  • the second user input may represent the user double-tapping the smart clothing.
  • the processor may control the motor driver of the wearable device based on the target value of the control parameter. For example, if the torque value determined based on the target value of the control parameter is a positive number, the processor may control the driving unit (e.g., the driving unit 310 in FIG. 3) to output torque (or assistance torque) through a motor (e.g., the motor 314 in FIG. 3) to help the user walk.
  • the driving unit can provide an assistive force by applying force to the user through the motor.
  • Figure 23 is a flowchart of a method for controlling a wearable device, according to an embodiment.
  • operations 2310 and 2320 below may be performed after operation 1040 described above with reference to FIG. 10.
  • Operations 2310 and 2320 may be performed by a processor (e.g., processor 342 of FIG. 3) of a wearable device (e.g., wearable device 120 of FIG. 1 or wearable device 300 of FIG. 3).
  • the processor may detect a motion stop input based on upper body movement information.
  • the motion stop input may represent the user double-tapping the smart clothing.
  • the motion stop input may represent an action in which the user double taps the stretch sensor attached to the arm of the smart clothing on the opposite side from the stretch sensor used for the first and second user inputs described with reference to FIGS. 16 to 22.
  • the first user input and the second user input may represent an action in which the user double taps a stretch sensor attached to the left arm of smart clothing.
  • the motion stop input may be preset to indicate that the user double taps the stretch sensor attached to the right arm of the smart clothing.
  • the processor may stop the operation of controlling the motor driver of the wearable device when an operation stop input is detected.
  • the torque value output from the wearable device may be 0.
  • a wearable device may operate in freestyle mode.
  • the freestyle mode may be a mode in which the reverse drivability of a motor (e.g., the motor 314 in FIG. 3C) is maximized.
  • the method of controlling the motor driver of the wearable device may be performed again.
  • the processor may control the motor driver of the wearable device based on the first value of the control parameter at the time of detecting the operation stop input in operation 2310.
  • the processor may perform the operations of FIG. 10 again.
  • the wearable device (120; 201; 300) may include a processor (220; 342) that controls the wearable device, at least one sensor (276; 321) that measures the angle of a thigh support frame, a motor driver circuit 312 controlled by the processor, a motor 314 electrically connected to the motor driver circuit, and a thigh support frame that transmits torque generated by the motor to at least a portion of the user's lower extremities, wherein the processor may perform an operation (1010) of acquiring upper body movement information about the upper body movement of a user wearing the wearable device, an operation (1020) of calculating an optimal stride length for the user based on the upper body movement information, an operation (1030) of determining a first value of a control parameter corresponding to the optimal stride length - the control parameter is a parameter for adjusting at least one of the magnitude, direction, or timing of the torque to be output -, and an operation (1040) of controlling the motor driver of the wearable device based on the first value of the control parameter.
  • the operation of controlling the motor driver of the wearable device based on the first value of the control parameter includes determining a torque value based on the first value of the control parameter (1110), and the torque value is It may include an operation 1120 of controlling the motor driver of the wearable device to output.
  • the upper body movement information may be generated by a motion sensing circuit worn on the user's upper body, and may include at least one of acceleration information generated by at least one IMU included in the motion sensing circuit, resistance value information generated by at least one stretch sensor, or a combination thereof.
  • the operation of calculating the optimal stride length for the user based on the upper body movement information includes an operation 1210 of determining target angle information based on the upper body movement information, and an optimal stride ratio based on the target angle information. It may include an operation 1220 of calculating , and an operation 1230 of calculating the optimal stride length based on the optimal stride ratio.
  • the operation of calculating the optimal stride length based on the optimal stride ratio may include calculating the optimal stride length for the user's maximum stride length and minimum stride length preset based on the optimal stride ratio.
  • the operation of calculating the optimal stride length for the user based on the upper body movement information may include an operation (1310) of determining target angle information based on the upper body movement information, an operation (1320) of determining whether the user's upper body movement corresponds to running based on the target angle information, and an operation (1330) of calculating the optimal stride length based on the target angle information and whether the upper body movement corresponds to running.
  • the operation of determining whether the user's upper body movement corresponds to running based on the target angle information includes the operation of determining whether the angles of both elbows indicated by the target angle information are greater than or equal to a preset threshold angle ( 1410), and an operation 1420 of determining whether the rate of change of the angles of both elbows is greater than or equal to a preset threshold angle change rate.
  • the operation of calculating the optimal stride length based on whether the upper body movement corresponds to running and the target angle information may include an operation of calculating the optimal stride length based on the running mode when the angles of both elbows indicated by the target angle information are greater than or equal to a preset threshold angle and the rate of change of the angles of both elbows is greater than or equal to a preset threshold angle change rate.
  • the operation of determining the first value of the control parameter corresponding to the optimal stride length may include determining the first value of the control parameter based on preset optimal parameter information corresponding to the optimal stride length.
  • the processor may further perform an operation (1610) of detecting a first user input based on the upper body movement information, an operation (1620) of determining a second value of the control parameter based on second upper body movement information obtained after the first user input is detected, an operation (1630) of detecting a second user input based on third upper body movement information, and an operation (1640) of controlling the motor driver of the wearable device based on the second value of the control parameter when the second user input is detected.
  • the operation of determining the second value of the control parameter based on the second upper body movement information may include an operation (1810) of determining input angle information based on the second upper body movement information, and an operation (1820) of determining a second value of the control parameter preset to correspond to the input angle information.
  • the operation of determining the second value of the control parameter based on the second upper body movement information may include an operation (2010) of detecting a first adjustment input based on the second upper body movement information, an operation (2020) of determining input angle information indicated by the second upper body movement information at the time the first adjustment input is detected, an operation (2030) of determining whether to increase or decrease the current value of the control parameter based on the input angle information, and an operation (2040) of determining the second value of the control parameter based on whether the current value increases or decreases.
  • the operation of determining whether to increase or decrease the current value of the control parameter based on the input angle information may include determining to increase the current value when the angle indicated by the input angle information is greater than or equal to a preset angle, and determining to decrease the current value when the angle indicated by the input angle information is less than the preset angle.
  • the processor may further perform an operation (2310) of detecting a motion stop input based on the upper body movement information, and an operation (2320) of stopping the operation of controlling the motor driver of the wearable device when the motion stop input is detected.
  • a wearable device control method performed by a wearable device may include an operation (1010) of acquiring upper body movement information about the upper body movement of a user wearing the wearable device, an operation (1020) of calculating an optimal stride length for the user based on the upper body movement information, an operation (1030) of determining a first value of a control parameter corresponding to the optimal stride length - the control parameter is a parameter for adjusting at least one of the magnitude, direction, or timing of the torque to be output -, and an operation (1040) of controlling the motor driver of the wearable device based on the first value of the control parameter.
  • a wearable device control method performed by a wearable device may include an operation (2210) of acquiring upper body movement information about the upper body movement of a user wearing the wearable device, an operation (2220) of detecting a first user input based on the upper body movement information, an operation (2230) of determining a target value of a control parameter based on second upper body movement information obtained after the first user input is detected, an operation (2240) of detecting a second user input based on third upper body movement information, and an operation (2250) of controlling the motor driver of the wearable device based on the target value of the control parameter when the second user input is detected.
  • the operation of determining the target value of the control parameter based on the second upper body movement information may include an operation (1810) of determining input angle information based on the second upper body movement information, and an operation (1820) of determining a target value of the control parameter preset to correspond to the input angle information.
  • the operation of determining the target value of the control parameter based on the second upper body movement information may include an operation (2010) of detecting a first adjustment input based on the second upper body movement information, an operation (2020) of determining input angle information indicated by the second upper body movement information at the time the first adjustment input is detected, an operation (2030) of determining whether to increase or decrease the current value of the control parameter based on the input angle information, and an operation (2040) of determining the target value of the control parameter based on whether the current value increases or decreases.
  • the operation of determining whether to increase or decrease the current value of the control parameter based on the input angle information may include determining to increase the current value when the angle indicated by the input angle information is greater than or equal to a preset angle, and determining to decrease the current value when the angle indicated by the input angle information is less than the preset angle.
  • a method for controlling a wearable device may further include an operation (2310) of detecting a motion stop input based on the upper body movement information, and an operation (2320) of stopping the operation of controlling the motor driver of the wearable device when the motion stop input is detected.
  • the embodiments described above may be implemented with hardware components, software components, and/or a combination of hardware components and software components.
  • the devices, methods, and components described in the embodiments may be implemented using, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and software applications running on the operating system. Additionally, a processing device may access, store, manipulate, process, and generate data in response to the execution of software.
  • a single processing device may be described as being used; however, those skilled in the art will understand that a processing device may include multiple processing elements and/or multiple types of processing elements.
  • a processing device may include multiple processors or one processor and one controller. Additionally, other processing configurations, such as parallel processors, are possible.
  • Software may include a computer program, code, instructions, or a combination of one or more of these, and may configure a processing device to operate as desired or may command the processing device independently or collectively.
  • Software and/or data may be embodied, permanently or temporarily, in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device.
  • Software may be distributed over networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored on a computer-readable recording medium.
  • the method according to the embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium.
  • a computer-readable medium may include program instructions, data files, data structures, and the like, singly or in combination, and the program instructions recorded on the medium may be specially designed and constructed for the embodiment or may be known and available to those skilled in the art of computer software.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices specially configured to store and execute program instructions.
  • Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter, etc.
  • the hardware devices described above may be configured to operate as one or multiple software modules to perform the operations of the embodiments, and vice versa.
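
As a rough illustration of operations 2210-2250 and 2310-2320 above, the following Python sketch shows one possible upper-body-gesture control loop. It is a minimal sketch under assumed names and values: the angle stream, MotorDriver, detect_gesture, the 30-degree gesture threshold, the 5-degree stop threshold, and the preset torque table are all hypothetical and do not come from the disclosure.

    # Hypothetical sketch of the upper-body-gesture control loop (operations
    # 2210-2250, 2310-2320). Sensor access and the motor are stubbed; all
    # thresholds and names are illustrative assumptions.

    class MotorDriver:
        def apply(self, torque):
            print(f"driving motor, target torque {torque:.2f} Nm")

        def stop(self):
            print("motor stopped")


    def detect_gesture(angle_deg, threshold=30.0):
        """Treat a forward bend of at least `threshold` degrees as a user input."""
        return angle_deg >= threshold


    def control_loop(angle_stream, preset_torque_table, motor):
        """`angle_stream` yields upper body flexion angles (degrees) over time."""
        armed = False           # becomes True after the first user input (2220)
        target_torque = None
        for angle in angle_stream:              # operation 2210: acquire info
            if not armed:
                armed = detect_gesture(angle)   # operation 2220: first input
            elif target_torque is None:
                # operation 2230: the second upper body movement selects the
                # target value from a preset table keyed by the rounded angle
                target_torque = preset_torque_table.get(round(angle / 10) * 10, 1.0)
            elif detect_gesture(angle):         # operation 2240: second input
                motor.apply(target_torque)      # operation 2250: drive the motor
            elif angle < 5.0:                   # operations 2310/2320: stop input
                motor.stop()
                armed, target_torque = False, None


    # Simulated samples: arm, select a ~40 degree entry, confirm, then stop.
    control_loop(iter([10, 35, 42, 41, 38, 3]), {40: 2.5}, MotorDriver())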
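
The angle-based increase/decrease rule of operations 2010-2040 can be sketched in the same spirit. The disclosure only states that an input angle at or above a preset angle increases the current value and a smaller angle decreases it; the function name, step size, and clamping range below are assumptions made for illustration.

    # Hypothetical sketch of operations 2010-2040: the upper body angle measured
    # when an adjustment input is detected decides whether the current value of
    # the control parameter is raised or lowered. Step size and limits are
    # assumptions, not values from the disclosure.

    def adjust_parameter(current_value, input_angle_deg,
                         preset_angle_deg=45.0, step=0.5,
                         min_value=0.0, max_value=5.0):
        """Return the new target value of the control parameter (operation 2040)."""
        if input_angle_deg >= preset_angle_deg:
            target = current_value + step   # operation 2030: decide to increase
        else:
            target = current_value - step   # operation 2030: decide to decrease
        return max(min_value, min(max_value, target))   # keep within a safe range


    print(adjust_parameter(2.0, 60.0))  # 2.5: bending past the preset angle raises it
    print(adjust_parameter(2.0, 20.0))  # 1.5: a smaller bend lowers it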

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Pain & Pain Management (AREA)
  • Mechanical Engineering (AREA)
  • Epidemiology (AREA)
  • Robotics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable device according to an embodiment may: acquire upper body movement information relating to the upper body movement of a user wearing the wearable device; calculate an optimal stride length for the user on the basis of the upper body movement information; determine a first value of a control parameter corresponding to the optimal stride length; and control a motor driver of the wearable device on the basis of the first value of the control parameter.
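
As a hedged illustration of the flow summarized in the abstract, the sketch below maps an estimated optimal stride length to a first value of the control parameter. The linear models, coefficients, and function names are assumptions introduced only for this example; the publication does not specify how the optimal stride length or the corresponding parameter value is computed.

    # Hypothetical sketch of the flow in the abstract: estimate an optimal stride
    # length from upper body movement, map it to a first value of the control
    # parameter, and hand that value to the motor driver. The linear models and
    # coefficients are assumptions made only for this example.

    def estimate_stride_length(trunk_swing_amplitude_deg):
        """Assume stride length (m) grows roughly linearly with trunk swing."""
        return 0.4 + 0.01 * trunk_swing_amplitude_deg


    def parameter_from_stride(stride_m, gain=2.0):
        """Assume the control parameter (e.g. an assist torque) scales with stride."""
        return gain * stride_m


    stride = estimate_stride_length(trunk_swing_amplitude_deg=25.0)   # 0.65 m
    first_value = parameter_from_stride(stride)                       # 1.3
    print(f"optimal stride {stride:.2f} m -> control parameter value {first_value:.2f}")
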
PCT/KR2023/013062 2022-09-26 2023-09-01 Dispositif portable et son procédé de commande WO2024071708A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020220121867A KR20240042953A (ko) 2022-09-26 2022-09-26 웨어러블 장치 및 그 장치를 제어하는 방법
KR10-2022-0121867 2022-09-26

Publications (1)

Publication Number Publication Date
WO2024071708A1 true WO2024071708A1 (fr) 2024-04-04

Family

ID=90478395

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/013062 WO2024071708A1 (fr) 2022-09-26 2023-09-01 Dispositif portable et son procédé de commande

Country Status (2)

Country Link
KR (1) KR20240042953A (fr)
WO (1) WO2024071708A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101224056B1 (ko) * 2011-06-27 2013-01-21 엘지전자 주식회사 보행 보조 장치 및 그 작동 제어 방법
JP2013111408A (ja) * 2011-11-30 2013-06-10 Honda Motor Co Ltd 歩行補助装置
US20180228683A1 (en) * 2015-11-05 2018-08-16 Boe Technology Group Co., Ltd. Wearable device and control method thereof
KR20210069557A (ko) * 2019-12-03 2021-06-11 삼성전자주식회사 웨어러블 장치의 사용자에게 저항력을 제공하는 방법 및 장치
US20210298984A1 (en) * 2016-07-29 2021-09-30 The United States Of America, As Represented By The Secretary, Dept Of Health And Human Services Powered gait assistance systems
KR20210153505A (ko) * 2020-06-10 2021-12-17 삼성전자주식회사 웨어러블 장치 및 이의 동작 방법


Also Published As

Publication number Publication date
KR20240042953A (ko) 2024-04-02

Similar Documents

Publication Publication Date Title
WO2022191435A1 Electronic device and system for assisting user movement
WO2022097862A1 Display control method and electronic device supporting same
WO2022108110A1 Electronic device having flexible display and operating method therefor
WO2020111483A1 Method and device for selecting reception beam on basis of artificial neural network
WO2022025444A1 Method and apparatus for displaying screen
WO2020213935A1 Electronic device and method for controlling flexible display
WO2022154483A1 Electronic device and operating method of electronic device
WO2023153694A1 Method for determining value of control parameter, and electronic device for carrying out same method
WO2022059968A1 Electronic device for providing augmented reality content and operation method thereof
WO2024071708A1 Wearable device and control method thereof
WO2023219245A1 Method and system for providing exercise program to user
WO2022225235A1 Method and device for measuring muscular condition of user by using wearable device
WO2024034889A1 Method for determining gait state, and method for operating device
WO2024076131A1 Method for outputting reference image on basis of sensing information of wearable device, and electronic device performing same
WO2023128651A1 Method for controlling wearable device on basis of exercise mode, and operating method of electronic device
WO2022124561A1 Electronic device control method using plurality of sensors, and electronic device thereof
WO2022131784A1 Electronic device and method for providing augmented reality content
WO2022154349A1 Method for controlling electronic device on basis of remaining battery capacity, and electronic device thereof
WO2022030746A1 Electronic device for detecting wearing state of electronic device by using inertial sensor, and control method thereof
WO2022154166A1 Method for providing content creation function and electronic device supporting same
WO2023063803A1 Method and system for providing exercise program to user
WO2023106763A1 Method for generating motion control model for wearable device, and electronic device carrying out said method
WO2023080417A1 Method and device for controlling wearable device on basis of user's heart rate
WO2023043035A1 Electronic device and method for controlling at least one wearable device, and non-transitory computer-readable storage medium
WO2024150966A1 Electronic device for outputting notification information, and operation method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23872828

Country of ref document: EP

Kind code of ref document: A1