US20170296100A1 - Apparatus and method for recognizing gait motion

Apparatus and method for recognizing gait motion

Info

Publication number
US20170296100A1
Authority
US
United States
Prior art keywords
hip joint
motion
time
horizon
landing point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/641,655
Inventor
Jun-Won JANG
Kyung-Rock KIM
Youngbo SHIM
Jusuk LEE
Bokman LIM
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US15/641,655
Publication of US20170296100A1
Legal status: Abandoned

Classifications

    • A61B 5/112: Gait analysis
    • A61B 5/1116: Determining posture transitions
    • A61B 5/1121: Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/1122: Determining geometric values of movement trajectories
    • A61B 5/1123: Discriminating type of movement, e.g. walking or running
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7275: Determining trends in physiological measurement data; predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A61B 2562/0219: Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61H 1/0244: Stretching or bending apparatus for exercising the lower limbs; hip
    • A61H 3/00: Appliances for aiding patients or disabled persons to walk about
    • A61H 2201/165: Wearable interfaces
    • A61H 2201/5058: Sensors or detectors
    • A61H 2201/5061: Force sensors
    • A61H 2201/5069: Angle sensors
    • A61H 2201/5079: Velocity sensors
    • A61H 2201/5084: Acceleration sensors
    • G07C 1/22: Registering, indicating or recording the time of events or elapsed time in connection with sports or games

Definitions

  • Example embodiments relate to apparatuses and/or methods for recognizing a gait motion, and more particularly, to apparatuses and/or methods for recognizing a gait motion based on biometric data of a user sensed by, for example, a walking assistance apparatus.
  • Human walking is performed using different operating mechanisms of hip joints for level walking, walking in an upward inclined direction, for example, walking up stairs, and walking in a downward inclined direction, for example, walking down stairs.
  • When a walking assistance apparatus is unable to recognize a gait motion of a user while assisting the user's walking, the apparatus may assist walking by, for example, collectively generating an oscillator-based pattern for each gait motion. However, such a walking assistance apparatus may not provide walking assistance optimized for each gait motion.
  • By recognizing gait motions, the walking assistance apparatus may operate with a different operating mechanism for each recognized gait motion, thereby providing an optimized walking assistance.
  • At least one example embodiment relates to an apparatus for recognizing a gait motion.
  • an apparatus for recognizing a gait motion includes a gait motion inference unit configured to infer a gait motion based on right and left hip joint angle information of a user, the right and left hip joint angle information sensed at a point in time at which a foot of the user lands, and a landing leg detector configured to detect a landing leg between both legs of the user based on the inferred gait motion.
  • the right and left hip joint angle information may include at least one of angles of a right hip joint and a left hip joint, a difference between the angles of the right hip joint and the left hip joint, and motion directions of the right hip joint and the left hip joint.
  • the gait motion may include a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • the apparatus may further include a landing point in time detector configured to detect a landing point in time of a foot of the user based on sensed acceleration information.
  • the landing point in time detector may be configured to detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • the base horizon may be set to follow a freeze horizon preset from a previous landing point in time to prevent or mitigate an error in detection of the landing point in time.
  • the landing point in time detector may be configured to detect the landing point in time of the foot of the user by shifting the prediction horizon when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value.
  • the mean acceleration for the base horizon may be updated for each step or preset to a first acceleration.
  • the gait motion inference unit may be configured to infer the gait motion using a fuzzy logic.
  • the gait motion inference unit may be configured to infer the gait motion by performing defuzzification based on a desired (or alternatively, preset) fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a membership function, and the membership function may be set based on the right and left hip joint angle information.
  • the landing leg detector may be configured to detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction.
  • the landing leg detector may be configured to detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
  • At least one example embodiment relates to a walking assistance apparatus.
  • a walking assistance apparatus includes a driving portion configured to drive a right hip joint and a left hip joint of a user, a sensor portion configured to sense right and left hip joint angle information, an inertial measurement unit (IMU) sensor configured to sense acceleration information in response to walking of the user, and a controller configured to control the driving portion by inferring a gait motion of the user based on the right and left hip joint angle information sensed at a landing point in time of a foot of the user, the landing point in time being detected based on the acceleration information, and by detecting a landing leg based on the inferred gait motion.
  • the controller may include a landing point in time detector configured to detect the landing point in time of the foot of the user based on the sensed acceleration information, a gait motion inference unit configured to infer the gait motion based on the right and left hip joint angle information of the user sensed at the detected landing point in time, and a landing leg detector configured to detect the landing leg between both legs of the user based on the inferred gait motion.
  • the landing point in time detector may be further configured to detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • the gait motion inference unit may be configured to infer the gait motion using a fuzzy logic.
  • the landing leg detector may be configured to detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction, and to detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
  • At least one example embodiment relates to a method of recognizing a gait motion.
  • a method of recognizing a gait motion includes inferring a gait motion based on right and left hip joint angle information of a user sensed at a landing point in time of a foot of the user, and detecting a landing leg between both legs of the user based on the inferred gait motion.
  • the method may further include detecting the landing point in time of the foot of the user based on sensed acceleration information.
  • the detecting a landing point in time may include detecting a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • the inferring may include inferring the gait motion using a fuzzy logic, and inferring the gait motion by performing defuzzification based on a desired (or alternatively, preset) fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a membership function.
  • the membership function may be set based on the right and left hip joint angle information.
  • the detecting a landing leg may include detecting, as the landing leg, a leg having a greater hip joint angle between angles of a right hip joint and a left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction, and detecting, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
  • At least one example embodiment relates to an operating method of a walking assistance apparatus.
  • an operating method of a walking assistance apparatus includes determining a gait motion based on sensed right and left hip joint angle information of a user, and outputting a driving control signal to drive a right hip joint and a left hip joint of the user based on the determined gait motion.
  • the method may further include detecting a landing leg between a right leg and a left leg of the user based on the gait motion.
  • FIG. 1 illustrates a user wearing a walking assistance apparatus according to an example embodiment
  • FIG. 2 shows a block diagram of an apparatus for recognizing a gait motion according to an example embodiment
  • FIG. 3 illustrates a sensed acceleration and horizons to be used to detect a landing point in time according to an example embodiment
  • FIG. 4 illustrates a process of inferring a gait motion using a fuzzy logic according to an example embodiment
  • FIG. 5 illustrates trajectories of angles of both hip joints of a user for a walking motion in an upward inclined direction according to an example embodiment
  • FIG. 6 illustrates trajectories of angles of both hip joints of a user for a walking motion in a downward inclined direction according to an example embodiment
  • FIG. 7 illustrates trajectories of angles of both hip joints of a user for a level walking motion according to an example embodiment
  • FIG. 8 shows a flow chart illustrating a method of recognizing a gait motion according to an example embodiment
  • FIG. 9 shows a flow chart illustrating a method of detecting a landing point in time according to an example embodiment.
  • FIG. 1 illustrates a user wearing a walking assistance apparatus according to example embodiments.
  • the walking assistance apparatus includes a driving portion 110 , a sensor portion 120 , an inertial measurement unit (IMU) sensor 130 , and a controller 140 .
  • although FIG. 1 illustrates a hip-type walking assistance apparatus, the type of the walking assistance apparatus is not limited thereto.
  • the descriptions herein may also be applicable to, for example, a walking assistance apparatus that supports an entire pelvic limb, a walking assistance apparatus that supports a portion of a pelvic limb, etc.
  • a walking assistance apparatus that supports a portion of a pelvic limb may be, for example, a walking assistance apparatus that supports up to a knee, a walking assistance apparatus that supports up to an ankle, etc.
  • the driving portion 110 may be disposed on, for example, each of a right hip portion and a left hip portion of a user to drive both hip joints of the user.
  • the sensor portion 120 may measure both hip joint angle information of the user while the user is walking.
  • the both hip joint angle information may also be referred to as right and left hip joint angle information.
  • the sensor portion 120 may be disposed in the driving portion 110 .
  • the both hip joint angle information sensed by the sensor portion 120 may include at least one of angles of both hip joints, a difference between the angles of both hip joints, and motion directions of both hip joints.
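
For concreteness in the sketches that follow, the sensed quantities can be grouped in a small record. This is an illustrative structure only; the field names and the sign convention for the angle difference are assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class HipJointInfo:
    """Right and left hip joint angle information at one sample point."""
    right_angle: float     # angle of the right hip joint
    left_angle: float      # angle of the left hip joint
    right_velocity: float  # motion direction of the right hip joint (signed)
    left_velocity: float   # motion direction of the left hip joint (signed)

    @property
    def angle_difference(self) -> float:
        # difference between the angles of both hip joints
        # (sign convention is an assumption made for illustration)
        return self.right_angle - self.left_angle
```
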
  • the IMU sensor 130 may measure acceleration information and posture information while the user is walking. A landing point in time of a foot of the user may be detected based on the acceleration information measured by the IMU sensor 130 . However, when a sensor capable of detecting a landing point in time of a foot is included in the walking assistance apparatus, the IMU sensor 130 may not be provided to recognize a gait motion.
  • the controller 140 may infer a gait motion of the user based on right and left hip joint angle information sensed at the detected landing point in time of the foot of the user, and detect a landing leg based on the inferred gait motion.
  • the gait motion of the user recognized by the controller 140 may include, for example, a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • human walking is performed using different operating mechanisms of hip joints for level walking, walking in an upward inclined direction (e.g., walking up stairs), and walking in a downward inclined direction (e.g., walking down stairs).
  • the controller 140 may recognize the gait motion of the user as described above, and output a control signal to control the driving portion 110 based on at least one of the inferred gait motion and the detected landing leg.
  • the driving portion 110 may drive the hip joints of the user suitably for the recognized gait motion based on the control signal output from the controller 140 .
  • FIG. 2 shows a block diagram of an apparatus 200 for recognizing a gait motion according to an example embodiment.
  • the apparatus 200 for recognizing a gait motion includes a landing point in time detector 210 , a gait motion inference unit 220 , and a landing leg detector 230 .
  • the landing point in time detector 210 may detect a landing point in time of a foot of a user based on acceleration information sensed by the IMU sensor 130 of FIG. 1 or a separate acceleration sensor (not shown).
  • a walking assistance apparatus that supports an entire pelvic limb of a user may include a foot force sensor configured to detect a landing point in time of a foot of the user.
  • in this case, the landing point in time detector 210 may detect the landing point in time of the foot of the user based on information sensed by the foot force sensor.
  • the foot force sensor may be provided on a bottom of a shoe to easily detect a landing point in time.
  • accordingly, the landing point in time detector 210 may not be included in the walking assistance apparatus.
  • a walking assistance apparatus that supports a portion of a pelvic limb may not include a foot force sensor configured to detect a landing point in time of a foot of a user. In such cases, the landing point in time of the foot of the user is to be detected separately.
  • the landing point in time detector 210 may detect the landing point in time of the foot of the user based on the acceleration information sensed by the IMU sensor 130 or the acceleration sensor.
  • the acceleration information may be, for example, an acceleration in a vertical direction, or a sum of squares of accelerations in an x-axial direction, a y-axial direction, and a z-axial direction.
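
A minimal sketch of the second option, assuming three-axis samples from the IMU sensor 130; the function name and the choice of not taking a square root are assumptions:

```python
def acceleration_magnitude(ax: float, ay: float, az: float) -> float:
    """Reduce three-axis accelerations to one scalar for landing detection.

    Implements the sum-of-squares option described above; taking the
    square root (the Euclidean norm) would be a common variant. Either
    way, the value rises sharply when the foot lands.
    """
    return ax * ax + ay * ay + az * az
```
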
  • the landing point in time detector 210 may detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • the base horizon may refer to a horizon in which a landing point in time does not occur in the previous step duration horizon.
  • the base horizon may be set to follow a freeze horizon preset from a landing point in time after a previous landing point in time occurs.
  • the base horizon may be set to be uniform for each step, or may be updated for each step based on a previous base horizon.
  • the mean acceleration for the base horizon may also be updated for each step, or may be predetermined to be a uniform value as desired.
  • the base horizon and the mean acceleration for the base horizon may be preset based on a general gait motion of a human. However, to detect a landing point in time more precisely based on characteristics of each user, the base horizon and the mean acceleration for the base horizon may be updated for each step.
  • the prediction horizon may start after a freeze horizon and a base horizon occur subsequent to the previous landing point in time. This reflects that a desired (or alternatively, predetermined) time is required between a current landing point in time and a subsequent landing point in time during human walking. Thus, the prediction horizon may be minimized. Further, detection of the landing point in time may be attempted in a horizon with a relatively high landing point in time detection probability. Thus, a detection performance may increase.
  • the landing point in time detector 210 may detect the landing point in time of the foot of the user by shifting the prediction horizon when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value.
  • the prediction horizon may be set to be a horizon with a relatively high landing point in time detection probability.
  • a horizon in which a landing point in time is detected for each step of the user may be non-uniform depending on a walking condition for the user.
  • the landing point in time may not be detected in the prediction horizon set to follow the base horizon.
  • when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value, it may be determined that landing does not occur in the prediction horizon.
  • the landing point in time detector 210 may shift the prediction horizon, and compare the mean acceleration for the base horizon to a mean acceleration for the shifted prediction horizon.
  • a method by which the landing point in time detector 210 detects the landing point in time of the foot of the user will be described later with reference to FIG. 3 .
  • the gait motion inference unit 220 may infer a gait motion based on right and left hip joint angle information of the user sensed at the landing point in time of the foot of the user.
  • the gait motion inference unit 220 may infer the gait motion of the user based on right and left hip joint angle information at a single step point in time of the user.
  • the gait motion inference unit 220 may infer the gait motion using a fuzzy logic.
  • the gait motion inference unit 220 may infer the gait motion of the user based on, for example, angles of both hip joints of the user at the landing point in time of the foot of the user, a difference between the angles of both hip joints of the user, and motion directions of both hip joints of the user.
  • the gait motion of the user may be inferred by comparing the both hip joint angle information of the user to a threshold value, or through a separately preset rule.
  • the gait motion inference unit 220 may receive the right and left hip joint angle information of the user, and infer the gait motion of the user through fuzzification and defuzzification of the received right and left hip joint angle information.
  • the gait motion inference unit 220 may infer the gait motion of the user by performing defuzzification based on a desired (or alternatively, preset) fuzzy rule and a value obtained by fuzzification of the received right and left hip joint angle information using a membership function.
  • the membership function may be set based on the right and left hip joint angle information.
  • the fuzzy rule may be “IF-THEN” rules which are set based on both hip joint angle information for, for example, a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • the landing leg detector 230 may detect a landing leg between both legs of the user based on the gait motion inferred by the gait motion inference unit 220 .
  • the landing leg detector 230 may detect the landing leg using different methods depending on the inferred gait motion.
  • when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction, the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint.
  • when the inferred gait motion corresponds to a walking motion in a downward inclined direction, the landing leg detector 230 may detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint.
  • the landing leg detector 230 may set a method of detecting the landing leg differently for the gait motion inferred by the gait motion inference unit 220 .
  • a method of detecting the landing leg using the landing leg detector 230 will be described later with reference to FIGS. 5 through 7 .
  • the gait motion recognized by the apparatus 200 for recognizing a gait motion may be applied as information to be used by the walking assistance apparatus to provide a user with a walking assistance optimized for each gait motion.
  • FIG. 3 illustrates a sensed acceleration and horizons to be used to detect a landing point in time according to an example embodiment.
  • FIG. 3 is a graph illustrating a relationship between a time and an acceleration sensed by the IMU sensor 130 of FIG. 1 or an acceleration sensor.
  • in FIG. 3, t_psh denotes a previous stride horizon, t_sh denotes a current stride horizon, t_bh denotes a base horizon, t_fh denotes a freeze horizon, and t_ph denotes a prediction horizon.
  • the base horizon may be a horizon in which a landing point in time does not occur in a previous step duration horizon.
  • the base horizon may be set to be uniform for each step, or may be updated for each step based on a previous base horizon.
  • the base horizon may be set to follow the freeze horizon preset from a previous landing point in time to prevent or mitigate an error in detection of the landing point in time after the landing point in time is detected. Taking into account that a time period is required between the landing point in time and a subsequent landing point in time, the prediction horizon may be set to follow the base horizon.
  • a method of detecting a landing point in time of a subsequent step based on a current step using the landing point in time detector 210 will be described.
  • a horizon for the current step may be estimated using a horizon for a previous step.
  • a desired freeze horizon may be set from the landing point in time.
  • the freeze horizon may be a horizon set or preset to prevent or mitigate an error in detection of the landing point in time.
  • a mean acceleration for the base horizon may be compared to a mean acceleration for a prediction horizon.
  • the freeze horizon, set or preset to prevent or mitigate an error in detection of the landing point in time, also allows the mean acceleration for the base horizon, which is estimated to be a horizon in which a landing point in time does not occur, to be set accurately.
  • the current base horizon may be set based on a step duration horizon for the previous step.
  • the horizon for the current step may be estimated based on the previous step duration horizon, and the base horizon may be set based on the estimated horizon for the current step.
  • a difference between a mean acceleration for a previous base horizon and a mean acceleration for an initially set prediction horizon may be less than a threshold value.
  • the prediction horizon may be shifted to detect the landing point in time.
  • an actual duration horizon for the previous step estimated based on a step previous to the previous step may increase to an extent corresponding to a shifted portion of the prediction horizon.
  • the horizon for the current step may be set based on the actual duration horizon for the previous step.
  • the current base horizon may be updated to a horizon obtained by adding the shifted portion of the prediction horizon to the previous base horizon.
  • the prediction horizon may be shifted and set to follow the freeze horizon and the base horizon after the landing point in time of the current step occurs.
  • the prediction horizon may be set or preset to be as short as possible while allowing the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon to be greater than or equal to the threshold value when the landing point in time occurs.
  • the landing point in time detector 210 may detect the prediction horizon as the landing point in time. For example, a point in time at which acceleration is maximized in the prediction horizon may be detected as the landing point in time.
  • otherwise, the landing point in time detector 210 may determine that landing of the foot of the user does not occur in the prediction horizon.
  • the landing point in time detector 210 may shift the prediction horizon, and compare a difference between the mean acceleration for the base horizon and a mean acceleration for the shifted prediction horizon to the threshold value.
  • the landing point in time detector 210 may detect the landing point in time by shifting the prediction horizon until the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is greater than or equal to the threshold value.
  • the landing point in time detector 210 may store a final step duration horizon corresponding to an actual duration horizon for the current step. By storing the final step duration horizon for the current step, the landing point in time detector 210 may estimate a horizon for the subsequent step.
  • a duration horizon for the subsequent step may be estimated through the stored final step duration horizon for the current step. Further, a subsequent base horizon may also be updated based on the base horizon for the current step and the prediction horizon in which the landing point in time is detected.
  • the base horizon and the mean acceleration for the base horizon may be updated for each step. While the user is walking, a step duration horizon and an acceleration may be non-uniform. Thus, a current base horizon and a mean acceleration for the current base horizon may be updated for each step through a previous step duration horizon.
  • alternatively, the base horizon and the mean acceleration for the base horizon may be set to be uniform values, thereby reducing a computational complexity of the landing point in time detector 210 .
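
The horizon bookkeeping described above can be condensed into a short sketch. This is a minimal reading of the procedure operating on sample indices, not the patent's implementation; the helper names, the window arithmetic, and the fallback when no landing is found are assumptions:

```python
from statistics import mean

def detect_landing(accel, prev_landing, t_fh, t_bh, t_ph, threshold):
    """Detect the next landing point in time in an acceleration series.

    accel            : acceleration samples (e.g., sum-of-squares values)
    prev_landing     : sample index of the previous landing point in time
    t_fh, t_bh, t_ph : freeze, base, and prediction horizon lengths
    threshold        : minimum mean-acceleration difference for a landing
    """
    # The base horizon follows the freeze horizon after the previous landing.
    base_start = prev_landing + t_fh
    base_mean = mean(accel[base_start:base_start + t_bh])

    # The prediction horizon initially follows the base horizon, then
    # shifts until the mean-acceleration difference is large enough.
    start = base_start + t_bh
    while start + t_ph <= len(accel):
        window = accel[start:start + t_ph]
        if abs(mean(window) - base_mean) >= threshold:
            # Landing detected in this prediction horizon; take the point
            # at which the acceleration is maximized as the landing point.
            offset = max(range(t_ph), key=lambda i: window[i])
            return start + offset
        start += 1  # shift the prediction horizon and try again
    return None  # no landing detected yet in the available samples
```

After a detection, the final step duration horizon (from prev_landing to the returned index) would be stored to estimate the horizon for the subsequent step, and the base horizon and its mean acceleration could be updated for each step as described above.
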
  • the landing point in time detected through the landing point in time detector 210 may be provided to the gait motion inference unit 220 .
  • the gait motion inference unit 220 may infer a gait motion based on both hip joint angle information of the user at the provided landing point in time.
  • FIG. 4 illustrates a process of inferring a gait motion using a fuzzy logic according to an example embodiment.
  • an input 410 includes a landing point in time and both hip joint angle information to be input into the gait motion inference unit 220 .
  • the input 410 includes, as an input parameter, at least one of a landing point in time, an angle of a left hip joint, an angle of a right hip joint, a difference between the angles of both hip joints, a motion direction of the left hip joint, and a motion direction of the right hip joint.
  • a member function may be set or preset for each input 410 to be provided to the gait motion inference unit 220 .
  • the member function may be set or preset based on a characteristic of each input parameter included in the input 410 .
  • a member function set for the angle of the left hip joint, among the input parameters, may be classified into ranges of NEMID, NELOW, ZERO, POLOW, POMID, POHIGH, and POVHIGH based on the angle of the left hip joint, and expressed as a membership function.
  • the membership function may indicate a degree of a value of an input parameter belonging to a classified range based on the value of the input parameter.
  • a member function corresponding to each of the input parameters may be classified into ranges and expressed as a membership function.
  • the foregoing is provided as an example for ease of description, and may be set differently based on a characteristic of each input parameter and a characteristic of a user.
  • the gait motion inference unit 220 may perform fuzzification 420 on a value of each input parameter through a member function corresponding to each input parameter.
  • the gait motion inference unit 220 may obtain a fuzzified value of each input parameter by performing the fuzzification 420 on each input parameter through the member function.
  • the fuzzification 420 may correspond to a process of calculating a degree of the value of each input parameter belonging to each range classified in a member function corresponding to each input parameter. For example, when the angle of the left hip joint is 20°, an input angle of the left hip joint belonging to POLOW by 0.5 and POMID by 0.5 may be expressed by the fuzzified value.
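
Triangular membership functions are one common way to realize this step. The breakpoints below are invented so that an angle of 20° belongs to POLOW by 0.5 and to POMID by 0.5, matching the example above; real ranges would be tuned per input parameter and per user, as the text notes:

```python
def triangular(x, a, b, c):
    """Membership degree of x in a triangle with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical ranges (degrees) for the left hip joint angle.
LEFT_HIP_SETS = {
    "ZERO":   (-10.0,  0.0, 10.0),
    "POLOW":  (  5.0, 15.0, 25.0),
    "POMID":  ( 15.0, 25.0, 35.0),
    "POHIGH": ( 25.0, 35.0, 45.0),
}

def fuzzify(value, sets):
    """Map a crisp input value to membership degrees per labelled range."""
    return {label: triangular(value, *abc) for label, abc in sets.items()}

# fuzzify(20.0, LEFT_HIP_SETS) yields POLOW = 0.5 and POMID = 0.5, with
# ZERO and POHIGH at 0.0, as in the 20 degree example above.
```
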
  • the gait motion inference unit 220 may perform defuzzification 430 based on a set or preset fuzzy rule and the value obtained by the fuzzification 420 of each input parameter using the member function.
  • the fuzzy rule may be “IF-THEN” rules which are set or preset based on both hip joint angle information for a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • the fuzzy rule may be defined as “IF-THEN” rules as follows.
  • Rules 1 through 6 may be included in a single fuzzy rule, and may be a fuzzy rule to be used to infer a gait motion based on each input parameter.
  • Rule 1 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POVHIGH, an angle of a right hip joint belongs to POLOW, and a difference between the angles of both hip joints belongs to HIGH, a gait motion is inferred as a walking motion in an upward inclined direction.
  • Rule 2 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POVHIGH, an angle of a right hip joint belongs to ZERO, and a difference between the angles of both hip joints belongs to HIGH, a gait motion is inferred as a walking motion in an upward inclined direction.
  • Rule 3 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POMID, an angle of a right hip joint belongs to POMID, and a difference between the angles of both hip joints belongs to HIGH, a gait motion is inferred as a walking motion in a downward inclined direction.
  • Rule 4 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POMID, an angle of a right hip joint belongs to POMID, and a difference between the angles of both hip joints belongs to LOW, a gait motion is inferred as a walking motion in a downward inclined direction.
  • Rule 5 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POHIGH, an angle of a right hip joint belongs to NELOW, and a difference between the angles of both hip joints belongs to HIGH, a gait motion is inferred as a level walking motion.
  • Rule 6 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POHIGH, an angle of a right hip joint belongs to NEMID, and a difference between the angles of both hip joints belongs to VHIGH, a gait motion is inferred as a level walking motion.
  • the “IF-THEN” rules are provided as an example for ease of description. It is obvious to those skilled in the art that the rules may be set differently depending on a characteristic of a gait motion.
  • the gait motion inference unit 220 may infer a gait motion of a user by performing the defuzzification 430 based on the fuzzy rule, a range to which each input parameter belongs, and the value obtained by the fuzzification 420 of each input parameter using the member function.
  • the gait motion inference unit 220 may output results 440 of finally inferring the gait motion through the defuzzification 430 .
  • the results 440 of inferring the gait motion may be classified into, for example, a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion, and may be output.
  • a fuzzy logic is one example of artificial intelligence technologies for performing deductive inference based on a fuzzy rule.
  • the gait motion inference unit 220 may infer the gait motion of the user using the fuzzy logic, thereby inferring the gait motion of the user with a relatively intuitive and robust expression in comparison to a method using a simple threshold value and/or combination of rules.
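
A Mamdani-style sketch of how Rules 1 through 6 above could be fired on fuzzified inputs. The min operator for AND, the max aggregation per motion, and the argmax "defuzzification" over motion classes are common fuzzy-logic choices assumed here for illustration; the patent does not commit to these operators:

```python
# Antecedents: (left-angle label, right-angle label, angle-difference label).
RULES = [
    (("POVHIGH", "POLOW", "HIGH"),  "upward"),    # Rule 1
    (("POVHIGH", "ZERO",  "HIGH"),  "upward"),    # Rule 2
    (("POMID",   "POMID", "HIGH"),  "downward"),  # Rule 3
    (("POMID",   "POMID", "LOW"),   "downward"),  # Rule 4
    (("POHIGH",  "NELOW", "HIGH"),  "level"),     # Rule 5
    (("POHIGH",  "NEMID", "VHIGH"), "level"),     # Rule 6
]

def infer_gait(left_fz, right_fz, diff_fz):
    """Infer a gait motion from fuzzified hip joint angle information.

    Each argument is a dict of label -> membership degree, e.g. the
    output of a fuzzify() step such as the one sketched earlier.
    """
    strength = {}
    for (l, r, d), motion in RULES:
        # AND of the rule antecedents, taken as the minimum degree.
        w = min(left_fz.get(l, 0.0), right_fz.get(r, 0.0), diff_fz.get(d, 0.0))
        strength[motion] = max(strength.get(motion, 0.0), w)
    if not strength or max(strength.values()) == 0.0:
        return "standing"  # assumed default when no rule fires
    return max(strength, key=strength.get)
```
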
  • FIG. 5 illustrates trajectories of angles of both hip joints of a user for a walking motion in an upward inclined direction according to an example embodiment.
  • a graph illustrating a trajectory 520 of an angle of a right hip joint of a user and a trajectory 530 of an angle of a left hip joint of the user for a walking motion in an upward inclined direction is provided.
  • in the graph, an x axis denotes a time, and a y axis denotes a hip joint angle.
  • the landing leg detector 230 may detect, as a landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint.
  • the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint at each landing point in time 511, 512, 513, 514, or 515.
  • the landing leg at each landing point in time 511, 512, 513, 514, or 515 may be detected as follows.
  • a right leg may be detected as the landing leg because the angle of the right hip joint is greater than the angle of the left hip joint.
  • a left leg may be detected as the landing leg because the angle of the left hip joint is greater than the angle of the right hip joint.
  • FIG. 6 illustrates trajectories of angles of both hip joints of a user for a walking motion in a downward inclined direction according to an example embodiment.
  • a graph illustrating a trajectory 620 of an angle of a right hip joint of a user and a trajectory 630 of an angle of a left hip joint of the user for a walking motion in a downward inclined direction is provided.
  • in the graph, an x axis denotes a time, and a y axis denotes a hip joint angle.
  • the landing leg detector 230 may detect, as a landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint.
  • the landing leg detector 230 may detect, as the landing leg, a leg having a motion direction with a negative velocity between the motion directions of the right hip joint and the left hip joint at each landing point in time 611, 612, 613, or 614.
  • the landing leg at each landing point in time 611, 612, 613, or 614 may be detected as follows.
  • a right leg may be detected as the landing leg because the motion direction of the right hip joint has a negative velocity.
  • a left leg may be detected as the landing leg because the motion direction of the left hip joint has a negative velocity.
  • FIG. 7 illustrates trajectories of angles of both hip joints of a user for a level walking motion according to an example embodiment.
  • a graph illustrating a trajectory 720 of an angle of a right hip joint of a user and a trajectory 730 of an angle of a left hip joint of the user for a level walking motion is provided.
  • in the graph, an x axis denotes a time, and a y axis denotes a hip joint angle.
  • the landing leg detector 230 may detect, as a landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint.
  • the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint at each landing point in time 711, 712, 713, 714, or 715.
  • the landing leg at each landing point in time 711, 712, 713, 714, or 715 may be detected as follows.
  • a left leg may be detected as the landing leg because the angle of the left hip joint is greater than the angle of the right hip joint.
  • a right leg may be detected as the landing leg because the angle of the right hip joint is greater than the angle of the left hip joint.
  • the landing leg detector 230 may detect a landing leg based on different criteria for a gait motion inferred by the gait motion inference unit 220 .
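
The two criteria can be written down directly. A sketch assuming hip joint angles and signed angular velocities are available at the landing point in time; the parameter names and gait-motion labels are illustrative:

```python
def detect_landing_leg(gait_motion, right_angle, left_angle,
                       right_velocity, left_velocity):
    """Detect the landing leg from hip joint angle information.

    Level walking and upward-incline walking: the leg with the greater
    hip joint angle at the landing point in time is the landing leg.
    Downward-incline walking: the leg whose hip joint moves with a
    negative velocity is the landing leg.
    """
    if gait_motion in ("level", "upward"):
        return "right" if right_angle > left_angle else "left"
    if gait_motion == "downward":
        return "right" if right_velocity < 0.0 else "left"
    return None  # standing motion: no landing leg
```
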
  • FIG. 8 shows a flow chart illustrating a method of recognizing a gait motion according to an example embodiment.
  • the landing point in time detector 210 of FIG. 2 may detect a landing point in time of a foot of a user based on acceleration information sensed by the IMU sensor 130 of FIG. 1 or a separate acceleration sensor.
  • the landing point in time detector 210 may detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • the landing point in time detector 210 may detect the landing point in time of the foot of the user by shifting the prediction horizon when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value.
  • the gait motion inference unit 220 of FIG. 2 may infer a gait motion based on right and left hip joint angle information of the user sensed at the landing point in time of the foot of the user.
  • the gait motion inference unit 220 may infer the gait motion of the user based on right and left hip joint angle information at a single step point in time of the user.
  • the gait motion inference unit 220 may infer the gait motion using a fuzzy logic.
  • the gait motion inference unit 220 may infer the gait motion of the user based on, for example, angles of both hip joints of the user at the landing point in time of the foot of the user, a difference between the angles of both hip joints of the user, and motion directions of both hip joints of the user.
  • the gait motion inference unit 220 may infer the gait motion of the user by performing defuzzification based on a preset fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a member function.
  • the member function may be set based on the right and left hip joint angle information.
  • the landing leg detector 230 of FIG. 2 may detect a landing leg between both legs of the user based on the gait motion inferred by the gait motion inference unit 220 .
  • the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between angles of a right hip joint and a left hip joint.
  • the landing leg detector 230 may detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint.
  • FIG. 9 shows a flow chart illustrating a method of detecting a landing point in time according to an example embodiment.
  • the landing point in time detector 210 of FIG. 2 may estimate a horizon for a current step based on a horizon for a previous step.
  • the horizon for the current step may be estimated based on the horizon for the previous step considering that a horizon for each step may not differ greatly.
  • the landing point in time detector 210 may set a base horizon based on the estimated horizon for the current step.
  • the base horizon may be a horizon in which a landing point in time does not occur in a previous step duration horizon.
  • the base horizon may be set to be uniform for each step, or may be updated for each step based on a previous base horizon.
  • the landing point in time detector 210 may shift a prediction horizon to be set to follow a freeze horizon and the base horizon after a landing point in time of the current step occurs.
  • the prediction horizon may be set or preset to be as short as possible while allowing a difference between a mean acceleration for the base horizon and a mean acceleration for the prediction horizon to be greater than or equal to a threshold value when the landing point in time occurs.
  • the landing point in time detector 210 may compare the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon to the threshold value.
  • the threshold value is a desired (or alternatively, preset) value, and a reference value to be used to determine whether the landing point in time occurs based on the difference between the mean accelerations.
  • the landing point in time detector 210 may determine that landing of a foot of the user does not occur in the prediction horizon. In this example, the landing point in time detector 210 may shift the prediction horizon, and compare a difference between the mean acceleration for the base horizon and a mean acceleration for the shifted prediction horizon to the threshold value.
  • the landing point in time detector 210 may detect the prediction horizon as the landing point in time. For example, a point in time at which an acceleration is maximized in the prediction horizon may be detected as the landing point in time.
  • the landing point in time detector 210 may store a final step duration horizon corresponding to an actual duration horizon for the current step. By storing the final step duration horizon for the current step, the landing point in time detector 210 may estimate a horizon for the subsequent step.
  • the portions, units, and/or modules described herein may be implemented using hardware components and software components.
  • the hardware components may include microphones, amplifiers, band-pass filters, audio to digital convertors, and processing devices.
  • the processing device(s) may include a processor, a controller and an arithmetic logic unit, a digital signal processor, a microcomputer, a field programmable array, a programmable logic unit, a microprocessor or any other device capable of responding to and executing instructions in a defined manner.
  • the processing device may run an operating system (OS) and one or more software applications that run on the OS.
  • the processing device also may access, store, manipulate, process, and create data in response to execution of the software.
  • the description of a processing device is used as singular; however, one skilled in the art will appreciate that a processing device may include multiple processing elements and multiple types of processing elements.
  • a processing device may include multiple processors or a processor and a controller.
  • different processing configurations are possible, such as parallel processors.
  • the software may include a computer program, a piece of code, an instruction, or some combination thereof, to independently or collectively instruct and/or configure the processing device to operate as desired, thereby transforming the processing device into a special purpose processor.
  • Software and data may be embodied permanently or temporarily in any type of machine, component, physical or virtual equipment, computer storage medium or device, or in a propagated signal wave capable of providing instructions or data to or being interpreted by the processing device.
  • the software also may be distributed over network coupled computer systems so that the software is stored and executed in a distributed fashion.
  • the software and data may be stored by one or more non-transitory computer readable recording mediums.
  • the methods according to the above-described example embodiments may be recorded in non-transitory computer-readable media including program instructions to implement various operations of the above-described example embodiments.
  • the media may also include, alone or in combination with the program instructions, data files, data structures, and the like.
  • the program instructions recorded on the media may be those specially designed and constructed for the purposes of example embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • examples of non-transitory computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM discs, DVDs, and/or Blu-ray discs; magneto-optical media such as optical discs; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory (e.g., USB flash drives, memory cards, memory sticks, etc.), and the like.
  • program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
  • the above-described devices may be configured to act as one or more software modules in order to perform the operations of the above-described example embodiments, or vice versa.

Abstract

An apparatus and method for recognizing a gait motion by detecting a landing point in time of a foot of a user based on sensed acceleration information, inferring a gait motion based on right and left hip joint angle information of the user sensed at the detected landing point in time of the foot of the user, and detecting a landing leg between both legs of the user based on the inferred gait motion may be provided.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This is a Continuation application of U.S. application Ser. No. 14/556,841, filed Dec. 1, 2014, which claims priority under 35 U.S.C. §119 to Korean Patent Application No. 10-2014-0096320, filed on Jul. 29, 2014, in the Korean Intellectual Property Office, the entire contents of each of which are herein incorporated by reference.
  • BACKGROUND
  • 1. Field
  • Example embodiments relate to apparatuses and/or methods for recognizing a gait motion, and more particularly, to apparatuses and/or methods for recognizing a gait motion based on biometric data of a user sensed by, for example, a walking assistance apparatus.
  • 2. Description of the Related Art
  • Human walking is performed using different operating mechanisms of hip joints for level walking, walking in an upward inclined direction, for example, walking up stairs, and walking in a downward inclined direction, for example, walking down stairs.
  • When a walking assistance apparatus is unable to recognize a gait motion of a user while assisting the walking of the user, the walking assistance apparatus may assist walking by, for example, collectively generating an oscillator-based pattern for each gait motion. However, such a walking assistance apparatus may not provide walking assistance suited to each gait motion.
  • Thus, when human walking is to be assisted by, for example, a walking assistance apparatus, it is desirable to recognize the gait motion of the user. The walking assistance apparatus may then operate differently, using a different operating mechanism for each recognized gait motion, thereby providing an optimized walking assistance.
  • SUMMARY
  • At least one example embodiment relates to an apparatus for recognizing a gait motion.
  • According to an example embodiment, an apparatus for recognizing a gait motion includes a gait motion inference unit configured to infer a gait motion based on right and left hip joint angle information of a user, the right and left hip joint angle information sensed at a point in time at which a foot of the user lands, and a landing leg detector configured to detect a landing leg between both legs of the user based on the inferred gait motion.
  • According to some example embodiments, the right and left hip joint angle information may include at least one of angles of a right hip joint and a left hip joint, a difference between the angles of the right hip joint and the left hip joint, and motion directions of the right hip joint and the left hip joint.
  • According to some example embodiments, the gait motion may include a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • According to some example embodiments, the apparatus may further include a landing point in time detector configured to detect a landing point in time of a foot of the user based on sensed acceleration information.
  • According to some example embodiments, the landing point in time detector may be configured to detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • According to some example embodiments, the base horizon may be set to follow a freeze horizon preset from a previous landing point in time to prevent or mitigate an error in detection of the landing point in time.
  • According to some example embodiments, the landing point in time detector may be configured to detect the landing point in time of the foot of the user by shifting the prediction horizon when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value.
  • According to some example embodiments, the mean acceleration for the base horizon may be updated for each step or preset to a first acceleration.
  • According to some example embodiments, the gait motion inference unit may be configured to infer the gait motion using a fuzzy logic. The gait motion inference unit may be configured to infer the gait motion by performing defuzzification based on a desired (or alternatively, preset) fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a membership function, and the membership function may be set based on the right and left hip joint angle information.
  • According to some example embodiments, the landing leg detector may be configured to detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction.
  • According to some example embodiments, the landing leg detector may be configured to detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
  • At least one example embodiment relates to a walking assistance apparatus.
  • According to an example embodiment, a walking assistance apparatus includes a driving portion configured to drive a right hip joint and a left hip joint of a user, a sensor portion configured to sense right and left hip joint angle information, an inertial measurement unit (IMU) sensor configured to sense acceleration information in response to walking of the user, and a controller configured to control the driving portion by inferring a gait motion of the user based on the right and left hip joint angle information, the right and left hip joint angle information sensed at a landing point in time of a foot of the user, the landing point in time detected based on the acceleration information, and detect a landing leg based on the inferred gait motion.
  • According to some example embodiments, the controller may include a landing point in time detector configured to detect the landing point in time of the foot of the user based on the sensed acceleration information, a gait motion inference unit configured to infer the gait motion based on the right and left hip joint angle information of the user sensed at the detected landing point in time, and a landing leg detector configured to detect the landing leg between both legs of the user based on the inferred gait motion.
  • According to some example embodiments, the landing point in time detector may be further configured to detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • According to some example embodiments, the gait motion inference unit may be configured to infer the gait motion using a fuzzy logic.
  • According to some example embodiments, the landing leg detector may be configured to detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction, and to detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
  • At least one example embodiment relates to a method of recognizing a gait motion.
  • According to an example embodiment, a method of recognizing a gait motion includes inferring a gait motion based on right and left hip joint angle information of a user sensed at a landing point in time of a foot of the user, and detecting a landing leg between both legs of the user based on the inferred gait motion.
  • According to some example embodiments, the method may further include detecting the landing point in time of the foot of the user based on sensed acceleration information.
  • According to some example embodiments, the detecting a landing point in time may include detecting a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • According to some example embodiments, the inferring may include inferring the gait motion using a fuzzy logic, and inferring the gait motion by performing defuzzification based on a desired (or alternatively, preset) fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a membership function. The membership function may be set based on the right and left hip joint angle information.
  • According to some example embodiments, the detecting a landing leg may include detecting, as the landing leg, a leg having a greater hip joint angle between angles of a right hip joint and a left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction, and detecting, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
  • At least one example embodiment relates to an operating method of a walking assistance apparatus.
  • According to an example embodiment, an operating method of a walking assistance apparatus includes determining a gait motion based on sensed right and left hip joint angle information of a user, and outputting a driving control signal to drive a right hip joint and a left hip joint of the user based on the determined gait motion.
  • According to some example embodiments, the method may further include detecting a landing leg between a right leg and a left leg of the user based on the gait motion.
  • Additional aspects of example embodiments will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and/or other features will become apparent and more readily appreciated from the following description of example embodiments, taken in conjunction with the accompanying drawings of which:
  • FIG. 1 illustrates a user wearing a walking assistance apparatus according to an example embodiment;
  • FIG. 2 shows a block diagram of an apparatus for recognizing a gait motion according to an example embodiment;
  • FIG. 3 illustrates a sensed acceleration and horizons to be used to detect a landing point in time according to an example embodiment;
  • FIG. 4 illustrates a process of inferring a gait motion using a fuzzy logic according to an example embodiment;
  • FIG. 5 illustrates trajectories of angles of both hip joints of a user for a walking motion in an upward inclined direction according to an example embodiment;
  • FIG. 6 illustrates trajectories of angles of both hip joints of a user for a walking motion in a downward inclined direction according to an example embodiment;
  • FIG. 7 illustrates trajectories of angles of both hip joints of a user for a level walking motion according to an example embodiment;
  • FIG. 8 shows a flow chart illustrating a method of recognizing a gait motion according to an example embodiment; and
  • FIG. 9 shows a flow chart illustrating a method of detecting a landing point in time according to an example embodiment.
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to various example embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. Some example embodiments are described below to explain the present disclosure by referring to the figures.
  • In the drawings, the thicknesses of layers and regions are exaggerated for clarity. Like reference numerals in the drawings denote like elements.
  • Some detailed illustrative example embodiments are disclosed herein. However, specific structural and functional details disclosed herein are merely representative for purposes of describing the example embodiments. Example embodiments may be embodied in many alternate forms and should not be construed as limited to only those set forth herein.
  • It should be understood that there is no intent to limit this disclosure to the particular example embodiments disclosed herein. On the contrary, the example embodiments described herein are to cover all modifications, equivalents, and alternatives falling within the scope of example embodiments.
  • It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of this disclosure. As used herein, the term “and/or,” includes any and all combinations of one or more of the associated listed items.
  • It will be understood that when an element is referred to as being “connected,” or “coupled,” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected,” or “directly coupled,” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between,” versus “directly between,” “adjacent,” versus “directly adjacent,” etc.).
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the,” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
  • Various example embodiments will now be described more fully with reference to the accompanying drawings in which some example embodiments are shown.
  • FIG. 1 illustrates a user wearing a walking assistance apparatus according to example embodiments.
  • Referring to FIG. 1, the walking assistance apparatus includes a driving portion 110, a sensor portion 120, an inertial measurement unit (IMU) sensor 130, and a controller 140. Although FIG. 1 illustrates a hip-type walking assistance apparatus, the type of the walking assistance apparatus is not limited thereto. The walking assistance apparatus may be applicable to, for example, a walking assistance apparatus that supports an entire pelvic limb, a walking assistance apparatus that supports a portion of a pelvic limb, etc. The walking assistance apparatus that supports a portion of a pelvic limb may be applicable to, for example, a walking assistance apparatus that supports up to a knee, a walking assistance apparatus that supports up to an ankle, etc.
  • The driving portion 110 may be disposed on, for example, each of a right hip portion and a left hip portion of a user to drive both hip joints of the user. The sensor portion 120 may measure both hip joint angle information of the user while the user is walking. Herein, the both hip joint angle information may also be referred to as right and left hip joint angle information. The sensor portion 120 may be disposed in the driving portion 110. The both hip joint angle information sensed by the sensor portion 120 may include at least one of angles of both hip joints, a difference between the angles of both hip joints, and motion directions of both hip joints.
  • The IMU sensor 130 may measure acceleration information and posture information while the user is walking. A landing point in time of a foot of the user may be detected based on the acceleration information measured by the IMU sensor 130. However, when a sensor capable of detecting a landing point in time of a foot is included in the walking assistance apparatus, the IMU sensor 130 may not be provided to recognize a gait motion.
  • The controller 140 may infer a gait motion of the user based on right and left hip joint angle information sensed at the detected landing point in time of the foot of the user, and detect a landing leg based on the inferred gait motion.
  • The gait motion of the user recognized by the controller 140 may include, for example, a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • As described above, human walking is performed using different operating mechanisms of hip joints for level walking, walking in an upward inclined direction (e.g., walking up stairs), and walking in a downward inclined direction (e.g., walking down stairs).
  • The controller 140 may recognize the gait motion of the user as described above, and output a control signal to control the driving portion 110 based on at least one of the inferred gait motion and the detected landing leg. The driving portion 110 may drive the hip joints of the user suitably for the recognized gait motion based on the control signal output from the controller 140.
  • Hereinafter, an apparatus for recognizing a gait motion included in the controller 140, and a method thereof will be described.
  • FIG. 2 shows a block diagram of an apparatus 200 for recognizing a gait motion according to an example embodiment.
  • Referring to FIG. 2, the apparatus 200 for recognizing a gait motion includes a landing point in time detector 210, a gait motion inference unit 220, and a landing leg detector 230.
  • The landing point in time detector 210 may detect a landing point in time of a foot of a user based on acceleration information sensed by the IMU sensor 130 of FIG. 1 or a separate acceleration sensor (not shown). For example, in the event that a walking assistance apparatus that supports an entire pelvic limb of a user includes a foot force sensor configured to detect a landing point in time of the foot of the user, the landing point in time may instead be detected based on information sensed by the foot force sensor.
  • In the walking assistance apparatus that supports an entire pelvic limb, the foot force sensor may be provided on a bottom of a shoe to easily detect a landing point in time. In this example, the landing point in time detector 210 may not be included in the walking assistance apparatus. However, a walking assistance apparatus that supports a portion of a pelvic limb may not include a foot force sensor configured to detect a landing point in time of a foot of a user. In such cases, the landing point in time of the foot of the user is to be detected separately.
  • The landing point in time detector 210 may detect the landing point in time of the foot of the user based on the acceleration information sensed by the IMU sensor 130 or the acceleration sensor. The acceleration information may be, for example, a vertical acceleration, or a sum of squares of accelerations in an x-axial direction, a y-axial direction, and a z-axial direction corresponding to a vertical direction.
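  • As a minimal illustration of such a scalar signal, the following Python sketch (all names here are assumptions for illustration, not taken from this disclosure) combines the three axis readings into one value that the landing point in time detector could monitor:

```python
def landing_signal(ax, ay, az):
    """Sum of squares of the three axis accelerations, one of the scalar
    signals mentioned above; a plain vertical acceleration could be used
    instead. Units and axis conventions are assumptions."""
    return ax * ax + ay * ay + az * az
```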
  • The landing point in time detector 210 may detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • The base horizon may refer to a horizon in which a landing point in time does not occur within the previous step duration horizon. The base horizon may be set to follow a freeze horizon that is preset to start from the previous landing point in time. The base horizon may be set to be uniform for each step, or may be updated for each step based on a previous base horizon.
  • The mean acceleration for the base horizon may also be updated for each step, or may be predetermined to be a uniform value as desired. As described above, the base horizon and the mean acceleration for the base horizon may be preset based on a general gait motion of a human. However, to detect a landing point in time more precisely based on characteristics of each user, the base horizon and the mean acceleration for the base horizon may be updated for each step.
  • The prediction horizon may start after a freeze horizon and a base horizon occur subsequent to the previous landing point in time. This reflects that a desired (or alternatively, predetermined) time is required between a current landing point in time and a subsequent landing point in time during human walking. Thus, the prediction horizon may be minimized. Further, detection of the landing point in time may be attempted in a horizon with a relatively high landing point in time detection probability. Thus, a detection performance may increase.
  • The landing point in time detector 210 may detect the landing point in time of the foot of the user by shifting the prediction horizon when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value. As described above, the prediction horizon may be set to be a horizon with a relatively high landing point in time detection probability. However, a horizon in which a landing point in time is detected for each step of the user may be non-uniform depending on a walking condition for the user.
  • Thus, the landing point in time may not be detected in the prediction horizon set to follow the base horizon. When the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value, it may be determined that landing does not occur in the prediction horizon.
  • In this example, the landing point in time detector 210 may shift the prediction horizon, and compare the mean acceleration for the base horizon to a mean acceleration for the shifted prediction horizon. A method of the landing point in time detector 210 detecting the landing point in time of the foot of the user will be described later with reference to FIG. 3.
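  • The following minimal Python sketch illustrates this compare-and-shift procedure under stated assumptions: horizons are expressed in sample counts, and the function and parameter names are illustrative only, not taken from this disclosure:

```python
from statistics import mean

def detect_landing_index(accel, base_start, base_len,
                         pred_start, pred_len, threshold):
    # Sketch of the horizon comparison described above. `accel` is a list
    # of acceleration samples; all positions and lengths are in samples.
    base_mean = mean(accel[base_start:base_start + base_len])
    start = pred_start
    while start + pred_len <= len(accel):
        window = accel[start:start + pred_len]
        if abs(mean(window) - base_mean) >= threshold:
            # Landing detected in this prediction horizon; report the
            # sample at which the acceleration peaks, as described with
            # reference to FIG. 3 below.
            return start + max(range(pred_len), key=lambda i: window[i])
        start += 1  # landing not found: shift the prediction horizon
    return None  # ran out of samples without detecting a landing
```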
  • The gait motion inference unit 220 may infer a gait motion based on right and left hip joint angle information of the user sensed at the landing point in time of the foot of the user. The gait motion inference unit 220 may infer the gait motion of the user based on right and left hip joint angle information at a single step point in time of the user.
  • The gait motion inference unit 220 may infer the gait motion using a fuzzy logic. The gait motion inference unit 220 may infer the gait motion of the user based on, for example, angles of both hip joints of the user at the landing point in time of the foot of the user, a difference between the angles of both hip joints of the user, and motion directions of both hip joints of the user.
  • The gait motion of the user may be inferred by comparing the both hip joint angle information of the user to a threshold value, or through a separately preset rule. However, because each user has different walking characteristics and a walking condition for each user can be non-uniform, it may be difficult to infer a gait motion of a user accurately by simply setting a threshold value or using a set rule.
  • However, when a fuzzy logic is used, inference with a relatively intuitive and robust expression may be possible in comparison to the conventional threshold-based or rule-based methods described above. According to some example embodiments, the gait motion inference unit 220 may receive the right and left hip joint angle information of the user, and infer the gait motion of the user through fuzzification and defuzzification of the received right and left hip joint angle information.
  • For example, the gait motion inference unit 220 may infer the gait motion of the user by performing defuzzification based on a desired (or alternatively, preset) fuzzy rule and a value obtained by fuzzification of the received right and left hip joint angle information using a membership function. The membership function may be set based on the right and left hip joint angle information.
  • The fuzzy rule may be “IF-THEN” rules which are set based on both hip joint angle information for, for example, a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • A process of inferring the gait motion of the user based on the right and left hip joint angle information will be described later with reference to FIG. 4.
  • The landing leg detector 230 may detect a landing leg between both legs of the user based on the gait motion inferred by the gait motion inference unit 220. For the walking assistance apparatus to assist walking of the user, the landing leg may be detected.
  • The landing leg detector 230 may detect the landing leg using different methods depending on the inferred gait motion. When the inferred gait motion corresponds to the level walking motion or the walking motion in the upward inclined direction, the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint.
  • When the inferred gait motion corresponds to the walking motion in the downward inclined direction, the landing leg detector 230 may detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint.
  • In view of the both hip joint angle information, the landing leg detector 230 may set a method of detecting the landing leg differently for the gait motion inferred by the gait motion inference unit 220. A method of detecting the landing leg using the landing leg detector 230 will be described later with reference to FIGS. 5 through 7.
  • As described above, the gait motion recognized by the apparatus 200 for recognizing a gait motion may be applied as information to be used by the walking assistance apparatus to provide a user with a walking assistance optimized for each gait motion.
  • FIG. 3 illustrates a sensed acceleration and horizons to be used to detect a landing point in time according to an example embodiment.
  • FIG. 3 is a graph illustrating a relationship between a time and an acceleration sensed by the IMU sensor 130 of FIG. 1 or an acceleration sensor. In the graph, t_psh denotes a previous stride horizon, and t_sh denotes a current stride horizon. t_bh denotes a base horizon, t_fh denotes a freeze horizon, and t_ph denotes a prediction horizon.
  • The base horizon may be a horizon in which a landing point in time does not occur in a previous step duration horizon. The base horizon may be set to be uniform for each step, or may be updated for each step based on a previous base horizon.
  • The base horizon may be set to follow the freeze horizon preset from a previous landing point in time to prevent or mitigate an error in detection of the landing point in time after the landing point in time is detected. Taking into account that a time period is required between the landing point in time and a subsequent landing point in time, the prediction horizon may be set to follow the base horizon.
  • A method of detecting a landing point in time of a subsequent step based on a current step using the landing point in time detector 210 will be described. A horizon for the current step may be estimated using a horizon for a previous step. When a landing point in time of the current step is detected, a desired freeze horizon may be set from the landing point in time. As described above, the freeze horizon may be a horizon set or preset to prevent or mitigate an error in detection of the landing point in time.
  • To detect the landing point in time of the subsequent step, a mean acceleration for the base horizon may be compared to a mean acceleration for a prediction horizon. The freeze horizon helps the mean acceleration for the base horizon to be set accurately, the base horizon being estimated to be a horizon in which a landing point in time does not occur.
  • The current base horizon may be set based on a step duration horizon for the previous step. The horizon for the current step may be estimated based on the previous step duration horizon, and the base horizon may be set based on the estimated horizon for the current step.
  • For example, in detection of the landing point in time of the current step from the previous step, a difference between a mean acceleration for a previous base horizon and a mean acceleration for an initially set prediction horizon may be less than a threshold value. In this example, the prediction horizon may be shifted to detect the landing point in time.
  • When a landing point in time is detected in the shifted prediction horizon, an actual duration horizon for the previous step estimated based on a step previous to the previous step may increase to an extent corresponding to a shifted portion of the prediction horizon. The horizon for the current step may be set based on the actual duration horizon for the previous step. Thus, the current base horizon may be updated to a horizon obtained by adding the shifted portion of the prediction horizon to the previous base horizon.
  • The prediction horizon may be shifted and set to follow the freeze horizon and the base horizon after the landing point in time of the current step occurs. The prediction horizon may be set or preset to be as short as possible while still allowing the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon to be greater than or equal to the threshold value when the landing point in time occurs.
  • When the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is greater than or equal to the threshold value, the landing point in time detector 210 may detect the prediction horizon as the landing point in time. For example, a point in time at which acceleration is maximized in the prediction horizon may be detected as the landing point in time.
  • When the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value, the landing point in time detector 210 may determine that landing of the foot of the user does not occur in the prediction horizon.
  • In this example, the landing point in time detector 210 may shift the prediction horizon, and compare a difference between the mean acceleration for the base horizon and a mean acceleration for the shifted prediction horizon to the threshold value. The landing point in time detector 210 may detect the landing point in time by shifting the prediction horizon until the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is greater than or equal to the threshold value.
  • When the landing point in time of the subsequent step is detected, the landing point in time detector 210 may store a final step duration horizon corresponding to an actual duration horizon for the current step. By storing the final step duration horizon for the current step, the landing point in time detector 210 may estimate a horizon for the subsequent step.
  • When a landing point in time of a step subsequent to the subsequent step is to be detected, a duration horizon for the subsequent step may be estimated through the stored final step duration horizon for the current step. Further, a subsequent base horizon may also be updated based on the base horizon for the current step and the prediction horizon in which the landing point in time is detected.
  • As described above, the base horizon and the mean acceleration for the base horizon may be updated for each step. While the user is walking, a step duration horizon and an acceleration may be non-uniform. Thus, a current base horizon and a mean acceleration for the current base horizon may be updated for each step through a previous step duration horizon.
  • However, when a step duration horizon and an acceleration of a user do not have large deviations for each step, the base horizon and the mean acceleration for the base horizon may be set to be uniform values, thereby reducing a computational complexity of the landing point in time detector 210.
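  • A minimal sketch of this per-step bookkeeping (field names and sample-count units are assumptions; the update rules follow the description above):

```python
class HorizonState:
    """Per-step bookkeeping sketch for the horizon updates described
    above; field names and units (sample counts) are assumptions."""

    def __init__(self, base_len, base_mean):
        self.base_len = base_len    # length of the base horizon
        self.base_mean = base_mean  # mean acceleration over the base horizon
        self.step_len = None        # final duration horizon of the last step

    def on_landing(self, estimated_step_len, shift, new_base_mean):
        # The landing was found after shifting the prediction horizon by
        # `shift` samples, so the actual step duration and the next base
        # horizon both grow by that amount; the base-horizon mean is
        # refreshed so the detector adapts to the user's current gait.
        self.step_len = estimated_step_len + shift
        self.base_len += shift
        self.base_mean = new_base_mean
```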
  • As described above, in a walking assistance apparatus not including a foot force sensor, a landing point in time may be detected through the landing point in time detector 210. The landing point in time detected through the landing point in time detector 210 may be provided to the gait motion inference unit 220. The gait motion inference unit 220 may infer a gait motion based on both hip joint angle information of the user at the provided landing point in time.
  • FIG. 4 illustrates a process of inferring a gait motion using a fuzzy logic according to an example embodiment.
  • Referring to FIG. 4, an input 410 includes a landing point in time and both hip joint angle information to be input into the gait motion inference unit 220. The input 410 includes, as an input parameter, at least one of a landing point in time, an angle of a left hip joint, an angle of a right hip joint, a difference between the angles of both hip joints, a motion direction of the left hip joint, and a motion direction of the right hip joint.
  • A member function may be set or preset for each input 410 to be provided to the gait motion inference unit 220. The member function may be set or preset based on a characteristic of each input parameter included in the input 410.
  • For example, a member function set for the angle of the left hip joint, among the input parameters, may be classified into ranges of NEMID, NELOW, ZERO, POLOW, POMID, POHIGH, and POVHIGH based on the angle of the left hip joint, and expressed as a membership function. The membership function may indicate a degree of a value of an input parameter belonging to a classified range based on the value of the input parameter.
  • Similar to the angle of the left hip joint, a member function corresponding to each of the input parameters may be classified into ranges and expressed as a membership function. However, the foregoing is provided as an example for ease of description, and may be set differently based on a characteristic of each input parameter and a characteristic of a user.
  • The gait motion inference unit 220 may perform fuzzification 420 on a value of each input parameter through a member function corresponding to each input parameter. The gait motion inference unit 220 may obtain a fuzzified value of each input parameter by performing the fuzzification 420 on each input parameter through the member function.
  • The fuzzification 420 may correspond to a process of calculating a degree of the value of each input parameter belonging to each range classified in a member function corresponding to each input parameter. For example, when the angle of the left hip joint is 20°, an input angle of the left hip joint belonging to POLOW by 0.5 and POMID by 0.5 may be expressed by the fuzzified value.
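  • The following sketch reproduces this 20° example using triangular membership functions with illustrative breakpoints; both the shape and the breakpoints are assumptions, since the disclosure does not fix either:

```python
def triangular(x, a, b, c):
    # Triangular membership function peaking at b; an assumed shape.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative breakpoints only, chosen so that an input of 20 degrees
# belongs to POLOW by 0.5 and to POMID by 0.5, matching the example above.
LEFT_HIP_ANGLE_SETS = {
    "POLOW": (0.0, 10.0, 30.0),
    "POMID": (10.0, 30.0, 50.0),
}

def fuzzify(angle_deg, fuzzy_sets=LEFT_HIP_ANGLE_SETS):
    # Degree to which the input belongs to each classified range.
    return {name: triangular(angle_deg, *abc) for name, abc in fuzzy_sets.items()}

print(fuzzify(20.0))  # -> {'POLOW': 0.5, 'POMID': 0.5}
```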
  • The gait motion inference unit 220 may perform defuzzification 430 based on a set or preset fuzzy rule and the value obtained by the fuzzification 420 of each input parameter using the member function. For example, the fuzzy rule may be “IF-THEN” rules which are set or preset based on both hip joint angle information for a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
  • For example, the fuzzy rule may be defined as “IF-THEN” rules as follows.
      • 1. rule: if FootStrike is ON and LeftHipAng is POVHIGH and RightHipAng is POLOW and AbsHipAngDiff is HIGH then WalkMode is STAIRUP
      • 2. rule: if FootStrike is ON and LeftHipAng is POVHIGH and RightHipAng is ZERO and AbsHipAngDiff is HIGH then WalkMode is STAIRUP
      • 3. rule: if FootStrike is ON and LeftHipAng is POMID and RightHipAng is POMID and AbsHipAngDiff is VLOW then WalkMode is STAIRDOWN
      • 4. rule: if FootStrike is ON and LeftHipAng is POMID and RightHipAng is POMID and AbsHipAngDiff is LOW then WalkMode is STAIRDOWN
      • 5. rule: if FootStrike is ON and LeftHipAng is POHIGH and RightHipAng is NELOW and AbsHipAngDiff is HIGH then WalkMode is LEVEL
      • 6. rule: if FootStrike is ON and LeftHipAng is POHIGH and RightHipAng is NEMID and AbsHipAngDiff is VHIGH then WalkMode is LEVEL
  • Rules 1 through 6 may be included in a single fuzzy rule, and may be a fuzzy rule to be used to infer a gait motion based on each input parameter.
  • Rule 1 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POVHIGH, an angle of a right hip joint belongs to POLOW, and a difference between the angles of both hip joints belongs to HIGH, a gait motion is inferred as a walking motion in an upward inclined direction.
  • Rule 2 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POVHIGH, an angle of a right hip joint belongs to ZERO, and a difference between the angles of both hip joints belongs to HIGH, a gait motion is inferred as a walking motion in an upward inclined direction.
  • Rule 3 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POMID, an angle of a right hip joint belongs to POMID, and a difference between the angles of both hip joints belongs to VLOW, a gait motion is inferred as a walking motion in a downward inclined direction.
  • Rule 4 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POMID, an angle of a right hip joint belongs to POMID, and a difference between the angles of both hip joints belongs to LOW, a gait motion is inferred as a walking motion in a downward inclined direction.
  • Rule 5 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POHIGH, an angle of a right hip joint belongs to NELOW, and a difference between the angles of both hip joints belongs to HIGH, a gait motion is inferred as a level walking motion.
  • Rule 6 is a rule in which, when a landing point in time is detected, an angle of a left hip joint belongs to POHIGH, an angle of a right hip joint belongs to NEMID, and a difference between the angles of both hip joints belongs to VHIGH, a gait motion is inferred as a level walking motion.
  • However, the “IF-THEN” rules are provided as an example for ease of description. It is obvious to those skilled in the art that the rules may be set differently depending on a characteristic of a gait motion.
  • As described above, the gait motion inference unit 220 may infer a gait motion of a user by performing the defuzzification 430 based on the fuzzy rule, a range to which each input parameter belongs, and the value obtained by the fuzzification 420 of each input parameter using the member function.
  • The gait motion inference unit 220 may output results 440 of finally inferring the gait motion through the defuzzification 430. The results 440 of inferring the gait motion may be classified into, for example, a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion, and may be output.
  • A fuzzy logic is one example of artificial intelligence technologies for performing deductive inference based on a fuzzy rule. The gait motion inference unit 220 may infer the gait motion of the user using the fuzzy logic, thereby inferring the gait motion of the user with a relatively intuitive and robust expression in comparison to a method using a simple threshold value and/or combination of rules.
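  • The following sketch evaluates rules 1 through 6 in a Mamdani-style manner, taking min() for AND and aggregating rule strengths with max() before selecting the strongest gait motion; these operator choices are assumptions, since the disclosure does not specify the defuzzification method:

```python
# IF-THEN rules 1-6 transcribed from the listing above; each antecedent
# maps an input parameter to the fuzzy label it must match.
FUZZY_RULES = [
    ({"FootStrike": "ON", "LeftHipAng": "POVHIGH", "RightHipAng": "POLOW",
      "AbsHipAngDiff": "HIGH"}, "STAIRUP"),
    ({"FootStrike": "ON", "LeftHipAng": "POVHIGH", "RightHipAng": "ZERO",
      "AbsHipAngDiff": "HIGH"}, "STAIRUP"),
    ({"FootStrike": "ON", "LeftHipAng": "POMID", "RightHipAng": "POMID",
      "AbsHipAngDiff": "VLOW"}, "STAIRDOWN"),
    ({"FootStrike": "ON", "LeftHipAng": "POMID", "RightHipAng": "POMID",
      "AbsHipAngDiff": "LOW"}, "STAIRDOWN"),
    ({"FootStrike": "ON", "LeftHipAng": "POHIGH", "RightHipAng": "NELOW",
      "AbsHipAngDiff": "HIGH"}, "LEVEL"),
    ({"FootStrike": "ON", "LeftHipAng": "POHIGH", "RightHipAng": "NEMID",
      "AbsHipAngDiff": "VHIGH"}, "LEVEL"),
]

def infer_walk_mode(fuzzified):
    # `fuzzified` maps each input parameter to its fuzzified values, e.g.
    # fuzzified["LeftHipAng"]["POVHIGH"] == 0.7. AND is taken as min() and
    # rule outputs are aggregated with max(); both choices are assumptions.
    strengths = {}
    for antecedents, mode in FUZZY_RULES:
        s = min(fuzzified.get(p, {}).get(label, 0.0)
                for p, label in antecedents.items())
        strengths[mode] = max(strengths.get(mode, 0.0), s)
    # Defuzzify by selecting the gait motion with the strongest support.
    return max(strengths, key=strengths.get)
```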
  • FIG. 5 illustrates trajectories of angles of both hip joints of a user for a walking motion in an upward inclined direction according to an example embodiment.
  • Referring to FIG. 5, a graph illustrating a trajectory 520 of an angle of a right hip joint of a user and a trajectory 530 of an angle of a left hip joint of the user for a walking motion in an upward inclined direction is provided. In the graph, an x axis denotes a time, and a y axis denotes a hip joint angle.
  • When a gait motion inferred by the gait motion inference unit 220 corresponds to a walking motion in an upward inclined direction, the landing leg detector 230 may detect, as a landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint.
  • Considering the trajectories 520, 530 of angles of both hip joints for a walking motion in an upward inclined direction shown in FIG. 5, the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint at each landing point in time 511, 512, 513, 514, or 515.
  • Based on the foregoing description, the landing leg at each landing point in time 511, 512, 513, 514, or 515 may be detected as follows. At the landing points in time 511, 513, and 515, a right leg may be detected as the landing leg because the angle of the right hip joint is greater than the angle of the left hip joint. At the landing points in time 512 and 514, a left leg may be detected as the landing leg because the angle of the left hip joint is greater than the angle of the right hip joint.
  • FIG. 6 illustrates trajectories of angles of both hip joints of a user for a walking motion in a downward inclined direction according to an example embodiment.
  • Referring to FIG. 6, a graph illustrating a trajectory 620 of an angle of a right hip joint of a user and a trajectory 630 of an angle of a left hip joint of the user for a walking motion in a downward inclined direction is provided. In the graph, an x axis denotes a time, and a y axis denotes a hip joint angle.
  • When a gait motion inferred by the gait motion inference unit 220 corresponds to a walking motion in a downward inclined direction, the landing leg detector 230 may detect, as a landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint.
  • Considering the trajectories 620, 630 of angles of both hip joints for a walking motion in a downward inclined direction shown in FIG. 6, the landing leg detector 230 may detect, as the landing leg, a leg having a motion direction with a negative velocity between the motion directions of the right hip joint and the left hip joint at each landing point in time 611, 612, 613, or 614.
  • Based on the foregoing description, the landing leg at each landing point in time 611, 612, 613, or 614 may be detected as follows. At the landing points in time 611 and 613, a right leg may be detected as the landing leg because the motion direction of the right hip joint has a negative velocity. At the landing points in time 612 and 614, a left leg may be detected as the landing leg because the motion direction of the left hip joint has a negative velocity.
  • FIG. 7 illustrates trajectories of angles of both hip joints of a user for a level walking motion according to an example embodiment.
  • Referring to FIG. 7, a graph illustrating a trajectory 720 of an angle of a right hip joint of a user and a trajectory 730 of an angle of a left hip joint of the user for a level walking motion is provided. In the graph, an x axis denotes a time, and a y axis denotes a hip joint angle.
  • When a gait motion inferred by the gait motion inference unit 220 corresponds to a level walking motion, the landing leg detector 230 may detect, as a landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint.
  • Considering the trajectories 720, 730 of angles of both hip joints for a level walking motion shown in FIG. 7, the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between the angles of the right hip joint and the left hip joint at each landing point in time 711, 712, 713, 714, or 715.
  • Based on the foregoing description, the landing leg at each landing point in time 711, 712, 713, 714, or 715 may be detected as follows. At the landing points in time 711, 713, and 715, a left leg may be detected as the landing leg because the angle of the left hip joint is greater than the angle of the right hip joint. At the landing points in time 712 and 714, a right leg may be detected as the landing leg because the angle of the right hip joint is greater than the angle of the left hip joint.
  • As described with reference to FIGS. 5 through 7, the landing leg detector 230 may detect a landing leg based on different criteria for a gait motion inferred by the gait motion inference unit 220.
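  • A minimal sketch of these per-motion criteria (argument names, units, and the mode labels carried over from the fuzzy-rule sketch above are assumptions):

```python
def detect_landing_leg(gait_motion, right_angle, left_angle,
                       right_velocity, left_velocity):
    # Sketch of the criteria described with FIGS. 5 through 7. Angles are
    # in degrees and angular velocities are signed; both are assumptions.
    if gait_motion in ("LEVEL", "STAIRUP"):
        # Level walking or upward incline: the landing leg is the one
        # whose hip joint angle is greater at the landing point in time.
        return "right" if right_angle > left_angle else "left"
    if gait_motion == "STAIRDOWN":
        # Downward incline: the landing leg is the one whose hip joint is
        # moving with a negative velocity at the landing point in time.
        return "right" if right_velocity < 0 else "left"
    return None  # standing motion: no landing leg to detect
```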
  • FIG. 8 shows a flow chart illustrating a method of recognizing a gait motion according to an example embodiment.
  • Referring to FIG. 8, in operation 810, the landing point in time detector 210 of FIG. 2 may detect a landing point in time of a foot of a user based on acceleration information sensed by the IMU sensor 130 of FIG. 1 or a separate acceleration sensor. The landing point in time detector 210 may detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
  • The landing point in time detector 210 may detect the landing point in time of the foot of the user by shifting the prediction horizon when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value.
  • In operation 820, the gait motion inference unit 220 of FIG. 2 may infer a gait motion based on right and left hip joint angle information of the user sensed at the landing point in time of the foot of the user. The gait motion inference unit 220 may infer the gait motion of the user based on right and left hip joint angle information at a single step point in time of the user.
  • The gait motion inference unit 220 may infer the gait motion using a fuzzy logic. The gait motion inference unit 220 may infer the gait motion of the user based on, for example, angles of both hip joints of the user at the landing point in time of the foot of the user, a difference between the angles of both hip joints of the user, and motion directions of both hip joints of the user.
  • The gait motion inference unit 220 may infer the gait motion of the user by performing defuzzification based on a preset fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a member function. The member function may be set based on the right and left hip joint angle information.
  • In operation 830, the landing leg detector 230 of FIG. 2 may detect a landing leg between both legs of the user based on the gait motion inferred by the gait motion inference unit 220.
  • When the gait motion inferred by the gait motion inference unit 220 corresponds to a level walking motion or a walking motion in an upward inclined direction, the landing leg detector 230 may detect, as the landing leg, a leg having a greater hip joint angle between angles of a right hip joint and a left hip joint.
  • When the gait motion inferred by the gait motion inference unit 220 corresponds to a walking motion in a downward inclined direction, the landing leg detector 230 may detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint.
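  • Chaining the helper sketches above gives a compact, illustrative view of operations 810 through 830; the horizon parameters below are placeholders, not values from this disclosure:

```python
def recognize_gait(accel, fuzzified_inputs,
                   right_angle, left_angle, right_velocity, left_velocity):
    # Reuses detect_landing_index, infer_walk_mode, and detect_landing_leg
    # from the earlier sketches; all numeric arguments are placeholders.
    landing = detect_landing_index(accel, base_start=10, base_len=20,
                                   pred_start=45, pred_len=10, threshold=2.0)
    if landing is None:
        return None  # operation 810 found no landing point in time
    mode = infer_walk_mode(fuzzified_inputs)                  # operation 820
    leg = detect_landing_leg(mode, right_angle, left_angle,   # operation 830
                             right_velocity, left_velocity)
    return mode, leg
```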
  • FIG. 9 shows a flow chart illustrating a method of detecting a landing point in time according to an example embodiment.
  • Referring to FIG. 9, in operation 910, the landing point in time detector 210 of FIG. 2 may estimate a horizon for a current step based on a horizon for a previous step. The horizon for the current step may be estimated based on the horizon for the previous step considering that a horizon for each step may not differ greatly.
  • In operation 920, the landing point in time detector 210 may set a base horizon based on the estimated horizon for the current step. The base horizon may be a horizon in which a landing point in time does not occur in a previous step duration horizon. The base horizon may be set to be uniform for each step, or may be updated for each step based on a previous base horizon.
  • In operation 930, the landing point in time detector 210 may set a prediction horizon to follow a freeze horizon and the base horizon after a landing point in time of the current step occurs. The prediction horizon may be set or preset to be as short as possible while still allowing a difference between a mean acceleration for the base horizon and a mean acceleration for the prediction horizon to be greater than or equal to a threshold value when the landing point in time occurs.
  • In operation 940, the landing point in time detector 210 may compare the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon to the threshold value. The threshold value is a desired (or alternatively, preset) value, and serves as a reference value used to determine whether the landing point in time occurs based on the difference between the mean accelerations.
  • When the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value, the landing point in time detector 210 may determine that landing of a foot of the user does not occur in the prediction horizon. In this example, the landing point in time detector 210 may shift the prediction horizon, and compare a difference between the mean acceleration for the base horizon and a mean acceleration for the shifted prediction horizon to the threshold value.
  • In operation 950, when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is greater than or equal to the threshold value, the landing point in time detector 210 may detect the prediction horizon as the landing point in time. For example, a point in time at which an acceleration is maximized in the prediction horizon may be detected as the landing point in time.
  • In operation 960, when a landing point in time of a subsequent step is detected, the landing point in time detector 210 may store a final step duration horizon corresponding to an actual duration horizon for the current step. By storing the final step duration horizon for the current step, the landing point in time detector 210 may estimate a horizon for the subsequent step.
  • The portions, units, and/or modules described herein may be implemented using hardware components and software components. For example, the hardware components may include microphones, amplifiers, band-pass filters, audio to digital convertors, and processing devices. A processing device (e.g., controller) may be implemented using one or more hardware devices configured to carry out and/or execute program code by performing arithmetical, logical, and input/output operations. The processing device(s) may include a processor, a controller and an arithmetic logic unit, a digital signal processor, a microcomputer, a field programmable array, a programmable logic unit, a microprocessor or any other device capable of responding to and executing instructions in a defined manner. The processing device may run an operating system (OS) and one or more software applications that run on the OS. The processing device also may access, store, manipulate, process, and create data in response to execution of the software. For purposes of simplicity, the description of a processing device is used as singular; however, one skilled in the art will appreciate that a processing device may include multiple processing elements and multiple types of processing elements. For example, a processing device may include multiple processors or a processor and a controller. In addition, different processing configurations are possible, such as parallel processors.
  • The software may include a computer program, a piece of code, an instruction, or some combination thereof, to independently or collectively instruct and/or configure the processing device to operate as desired, thereby transforming the processing device into a special purpose processor. Software and data may be embodied permanently or temporarily in any type of machine, component, physical or virtual equipment, computer storage medium or device, or in a propagated signal wave capable of providing instructions or data to or being interpreted by the processing device. The software also may be distributed over network coupled computer systems so that the software is stored and executed in a distributed fashion. The software and data may be stored by one or more non-transitory computer readable recording mediums.
  • The methods according to the above-described example embodiments may be recorded in non-transitory computer-readable media including program instructions to implement various operations of the above-described example embodiments. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The program instructions recorded on the media may be those specially designed and constructed for the purposes of example embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts. Examples of non-transitory computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM discs, DVDs, and Blu-ray discs; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory (e.g., USB flash drives, memory cards, memory sticks, etc.), and the like. Examples of program instructions include both machine code, such as that produced by a compiler, and files containing higher-level code that may be executed by the computer using an interpreter. The above-described devices may be configured to act as one or more software modules in order to perform the operations of the above-described example embodiments, or vice versa.
  • A number of example embodiments have been described above. Nevertheless, it should be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.

Claims (20)

What is claimed is:
1. An apparatus for recognizing a gait motion, the apparatus comprising:
a gait motion inference unit configured to infer a gait motion based on right and left hip joint angle information of a user, the right and left hip joint angle information sensed at a point in time at which a foot of the user lands; and
a landing leg detector configured to detect a landing leg between both legs of the user based on the inferred gait motion.
2. The apparatus of claim 1, wherein the right and left hip joint angle information comprises at least one of angles of a right hip joint and a left hip joint, a difference between the angles of the right hip joint and the left hip joint, and motion directions of the right hip joint and the left hip joint.
3. The apparatus of claim 1, wherein the gait motion comprises a level walking motion, a walking motion in an upward inclined direction, a walking motion in a downward inclined direction, and a standing motion.
4. The apparatus of claim 1, further comprising:
a landing point in time detector configured to detect a landing point in time of a foot of the user based on sensed acceleration information.
5. The apparatus of claim 4, wherein the landing point in time detector is configured to detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
6. The apparatus of claim 5, wherein the base horizon is set to follow a freeze horizon set from a previous landing point in time.
7. The apparatus of claim 5, wherein the landing point in time detector is configured to detect the landing point in time of the foot of the user by shifting the prediction horizon when the difference between the mean acceleration for the base horizon and the mean acceleration for the prediction horizon is less than the threshold value.
8. The apparatus of claim 1, wherein the gait motion inference unit is configured to infer the gait motion using a fuzzy logic.
9. The apparatus of claim 8, wherein the gait motion inference unit is configured to infer the gait motion by performing defuzzification based on a fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a member function, and
the member function is set based on the right and left hip joint angle information.
10. The apparatus of claim 1, wherein the landing leg detector is configured to detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction.
11. The apparatus of claim 1, wherein the landing leg detector is configured to detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
12. A walking assistance apparatus comprising:
a driving portion configured to drive a right hip joint and a left hip joint of a user;
a sensor portion configured to sense right and left hip joint angle information;
an inertial measurement unit (IMU) sensor configured to sense acceleration information in response to walking of the user; and
a controller configured to
control the driving portion by inferring a gait motion of the user based on the right and left hip joint angle information, the right and left hip joint angle information sensed at a landing point in time of a foot of the user, the landing point in time detected based on the acceleration information, and
detect a landing leg based on the inferred gait motion.
13. The apparatus of claim 12, wherein the controller comprises:
a landing point in time detector configured to detect the landing point in time of the foot of the user based on the sensed acceleration information;
a gait motion inference unit configured to infer the gait motion based on the right and left hip joint angle information of the user sensed at the detected landing point in time; and
a landing leg detector configured to detect the landing leg between both legs of the user based on the inferred gait motion.
14. The apparatus of claim 13, wherein the landing point in time detector is further configured to detect a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
15. The apparatus of claim 13, wherein the gait motion inference unit is configured to infer the gait motion using a fuzzy logic.
16. The apparatus of claim 13, wherein the landing leg detector is configured:
to detect, as the landing leg, a leg having a greater hip joint angle between angles of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction, and
to detect, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
17. A method of recognizing a gait motion, the method comprising:
detecting a landing point in time of a foot of a user based on sensed acceleration information;
inferring a gait motion based on right and left hip joint angle information of the user sensed at the detected landing point in time of the foot of the user; and
detecting a landing leg between both legs of the user based on the inferred gait motion.
18. The method of claim 17, wherein the detecting a landing point in time comprises detecting a prediction horizon as the landing point in time when a difference between a mean acceleration for a base horizon set based on a previous step duration horizon and a mean acceleration for the prediction horizon is greater than or equal to a threshold value.
19. The method of claim 17, wherein the inferring comprises inferring the gait motion by performing defuzzification based on a fuzzy rule and a value obtained by fuzzification of the right and left hip joint angle information using a member function, and
the member function is set based on the right and left hip joint angle information.
20. The method of claim 17, wherein the detecting a landing leg comprises:
detecting, as the landing leg, a leg having a greater hip joint angle between angles of a right hip joint and a left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a level walking motion or a walking motion in an upward inclined direction; and
detecting, as the landing leg, a leg having a motion direction with a negative velocity between motion directions of the right hip joint and the left hip joint included in the right and left hip joint angle information when the inferred gait motion corresponds to a walking motion in a downward inclined direction.
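
The claims above describe three pieces of detection logic in prose: the sliding-horizon landing detection of claims 5 to 7, 14, and 18; the fuzzy inference of claims 8, 9, 15, and 19; and the landing-leg selection rules of claims 10, 11, 16, and 20. The following Python sketch is offered only to make that logic concrete. It is a minimal, non-limiting reading of the claim language: every function name, parameter, horizon length, and member-function breakpoint below is a hypothetical choice for this sketch, not anything taken from the specification. It assumes a one-dimensional acceleration trace sampled at a constant rate, horizon lengths derived from the previous step duration, and hip joint angles measured with flexion positive.

    import numpy as np

    def detect_landing_time(accel, freeze_len, base_len, pred_len, threshold):
        """Sliding-horizon landing detection (claims 5-7, illustrative).

        The base horizon follows a freeze horizon set from the previous
        landing point in time; in practice its length would be set based
        on the previous step duration horizon. A prediction horizon is
        shifted forward one sample at a time until the difference between
        the two mean accelerations reaches the threshold value.
        """
        base_start = freeze_len                  # base horizon follows the freeze horizon
        base_mean = np.mean(accel[base_start:base_start + base_len])
        t = base_start + base_len                # first candidate prediction horizon
        while t + pred_len <= len(accel):
            pred_mean = np.mean(accel[t:t + pred_len])
            if abs(pred_mean - base_mean) >= threshold:
                return t                         # prediction horizon detected as the landing point in time
            t += 1                               # difference below threshold: shift the prediction horizon
        return None                              # no landing found in this trace

    def triangular(x, a, b, c):
        """One possible member-function shape for fuzzification (claim 9)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def infer_gait_motion(angle_diff):
        """Toy fuzzy inference over the difference between the right and
        left hip joint angles (claims 8-9). Real member functions would be
        set based on the measured hip joint angle information; the
        breakpoints and the two rules here are placeholders."""
        small = triangular(abs(angle_diff), -1.0, 0.0, 15.0)   # fuzzification
        large = triangular(abs(angle_diff), 10.0, 30.0, 60.0)
        # defuzzification by a weighted average of the rule outputs
        # (rule 1: small difference -> standing; rule 2: large difference -> walking)
        score = (small * 0.0 + large * 1.0) / max(small + large, 1e-9)
        return "walking" if score >= 0.5 else "standing"

    def detect_landing_leg(gait_motion, right_angle, left_angle,
                           right_velocity, left_velocity):
        """Landing-leg selection rules of claims 10, 11, 16, and 20."""
        if gait_motion in ("level", "upward"):
            # level walking or walking in an upward inclined direction:
            # the leg with the greater hip joint angle is the landing leg
            return "right" if right_angle > left_angle else "left"
        if gait_motion == "downward":
            # walking in a downward inclined direction: the leg whose hip
            # joint moves with a negative velocity is the landing leg
            return "right" if right_velocity < 0 else "left"
        return None                              # standing motion: no landing leg

In the apparatus of claim 12, these steps would run on the controller, with the IMU sensor supplying the acceleration trace and the sensor portion supplying the right and left hip joint angle information.
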
US15/641,655 2014-07-29 2017-07-05 Apparatus and method for recognizing gait motion Abandoned US20170296100A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/641,655 US20170296100A1 (en) 2014-07-29 2017-07-05 Apparatus and method for recognizing gait motion

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2014-0096320 2014-07-29
KR1020140096320A KR102378018B1 (en) 2014-07-29 2014-07-29 Gait motion recognition apparatus and method thereof
US14/556,841 US9737240B2 (en) 2014-07-29 2014-12-01 Apparatus and method for recognizing gait motion
US15/641,655 US20170296100A1 (en) 2014-07-29 2017-07-05 Apparatus and method for recognizing gait motion

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/556,841 Continuation US9737240B2 (en) 2014-07-29 2014-12-01 Apparatus and method for recognizing gait motion

Publications (1)

Publication Number Publication Date
US20170296100A1 true US20170296100A1 (en) 2017-10-19

Family

ID=55178765

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/556,841 Active 2035-01-29 US9737240B2 (en) 2014-07-29 2014-12-01 Apparatus and method for recognizing gait motion
US15/641,655 Abandoned US20170296100A1 (en) 2014-07-29 2017-07-05 Apparatus and method for recognizing gait motion

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/556,841 Active 2035-01-29 US9737240B2 (en) 2014-07-29 2014-12-01 Apparatus and method for recognizing gait motion

Country Status (2)

Country Link
US (2) US9737240B2 (en)
KR (3) KR102378018B1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6660110B2 (en) * 2015-07-23 2020-03-04 原田電子工業株式会社 Gait analysis method and gait analysis system
JP6418652B2 (en) * 2016-04-22 2018-11-07 トヨタ自動車株式会社 Upper limb rehabilitation support apparatus and method for operating the same
EP3778139B1 (en) * 2016-08-17 2023-10-18 Power Assist International Corporation Wearable assist robot apparatus
US11246507B2 (en) * 2016-08-18 2022-02-15 Sigmasense, Llc. Wireless in-shoe physical activity monitoring apparatus
KR102556924B1 (en) 2016-09-05 2023-07-18 삼성전자주식회사 Method for walking assist, and device operating the same
KR102701390B1 (en) 2017-02-21 2024-09-02 삼성전자주식회사 Method and apparatus for walking assistance
CN109325479B (en) * 2018-11-28 2020-10-16 清华大学 Step detection method and device
CN110558991B (en) * 2019-07-30 2022-05-20 福建省万物智联科技有限公司 Gait analysis method
KR102278728B1 (en) * 2019-08-09 2021-07-16 재단법인대구경북과학기술원 System for automatically determining scale of spasticity based on inertia sensor
CN110522458A (en) * 2019-10-15 2019-12-03 北京理工大学 A kind of gait real-time identification method suitable for knee joint ectoskeleton
CN112869732B (en) * 2019-11-29 2024-05-28 宝成工业股份有限公司 Method and device for analyzing gait
WO2024034889A1 (en) * 2022-08-12 2024-02-15 삼성전자주식회사 Method for determining gait state, and device performing method

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6360597B1 (en) 1997-01-08 2002-03-26 The Trustees Of Boston University In-shoe remote telemetry gait analysis system
US8864846B2 (en) 2005-03-31 2014-10-21 Massachusetts Institute Of Technology Model-based neuromechanical controller for a robotic leg
JP5120795B2 (en) * 2005-11-15 2013-01-16 学校法人日本大学 Human posture motion discrimination device and energy consumption calculation device
KR100651639B1 (en) * 2005-12-30 2006-12-01 서강대학교산학협력단 Foot pressure sensor of robot for assistant exoskeletal power
KR101252634B1 (en) * 2006-04-07 2013-04-09 삼성전자주식회사 system for analyzing walking motion
JP2012516717A (en) * 2009-01-30 2012-07-26 マサチューセッツ インスティテュート オブ テクノロジー Actuator-powered knee prosthesis with antagonistic muscle action
KR101053491B1 (en) * 2009-04-07 2011-08-08 (주)휴레브 Walking Cycle Detection System and Method Using Motion Sensor
US9188963B2 (en) 2009-07-06 2015-11-17 Autonomous Id Canada Inc. Gait-based authentication system
JP5802131B2 (en) 2009-10-21 2015-10-28 本田技研工業株式会社 Method for controlling exercise assist device, walking assist device and rehabilitation method
KR20130096631A (en) 2010-04-05 2013-08-30 아이워크, 아이엔씨. Controlling torque in a prosthesis or orthosis
US8548740B2 (en) 2010-10-07 2013-10-01 Honeywell International Inc. System and method for wavelet-based gait classification
KR101384988B1 (en) * 2011-04-08 2014-04-21 연세대학교 원주산학협력단 System and method of robotic gait training
JP5885292B2 (en) * 2011-11-08 2016-03-15 公立大学法人首都大学東京 Action recognition program and processing device for action recognition
KR101317354B1 (en) * 2011-11-21 2013-10-11 서강대학교산학협력단 Control method of walking assistance torque and walking assistance apparatus
KR101323019B1 (en) * 2011-11-25 2013-10-29 신대섭 Rehabilitation Therapy Device Using Walking Assist Robot
KR101361362B1 (en) 2012-02-14 2014-02-12 한국산업기술대학교산학협력단 Walking Assistance Robot for Actively Determining Moving Speed Based on User Gait Cycle
US9682005B2 (en) 2012-02-24 2017-06-20 Massachusetts Institute Of Technology Elastic element exoskeleton and method of using same
US9451881B2 (en) 2012-12-06 2016-09-27 Autonomous_Id Canada Inc. Gait-based biometric system for detecting weight gain or loss
US9204797B2 (en) 2012-12-06 2015-12-08 Autonomous Id Canada Inc. Gait-based biometric system for detecting pathomechanical abnormalities relating to disease pathology
KR101358943B1 (en) 2013-02-12 2014-02-07 한국과학기술연구원 Pelvis support device for gait rehabilitation robot
KR102119536B1 (en) * 2014-01-15 2020-06-05 삼성전자주식회사 Wearable robot and control method for the same

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140121575A1 (en) * 2012-11-01 2014-05-01 Honda Motor Co., Ltd. Walking motion assist device
US20160107309A1 (en) * 2013-05-31 2016-04-21 President And Fellows Of Harvard College Soft Exosuit for Assistance with Human Motion

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Martínez-Solís, Fermín, et al. "Design of a low cost measurement system based on accelerometers for gait analysis." Acta Scientiarum. Technology 36.1 (2014): 111-121. *
Ng, Sau Kuen, and Howard Jay Chizeck. "Fuzzy model identification for classification of gait events in paraplegics." IEEE Transactions on Fuzzy Systems 5.4 (1997): 536-544. *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109919943A (en) * 2019-04-16 2019-06-21 广东省妇幼保健院 Infant hip joint angle automatic testing method, system and calculating equipment
GB2616682A (en) * 2022-03-18 2023-09-20 Biomex Ltd Apparatus and method for determining phase of gait of a subject

Also Published As

Publication number Publication date
KR20230124532A (en) 2023-08-25
KR20220039694A (en) 2022-03-29
KR20160014284A (en) 2016-02-11
US9737240B2 (en) 2017-08-22
US20160029928A1 (en) 2016-02-04
KR102569009B1 (en) 2023-08-21
KR102378018B1 (en) 2022-03-24

Similar Documents

Publication Publication Date Title
US9737240B2 (en) Apparatus and method for recognizing gait motion
US10449106B2 (en) Method and apparatus for walking assistance
US11234890B2 (en) Method and apparatus for recognizing gait task
US10575761B2 (en) Method and apparatus for recognizing gait motion
US11219572B2 (en) Method and apparatus for controlling a walking assistance apparatus
EP3166033A1 (en) Walking assistance apparatus and method of controlling same
US11919156B2 (en) Method and apparatus for recognizing user motion
US10835405B2 (en) Method and apparatus for walking assist
JP6438579B2 (en) Apparatus and method for determining a desired target
KR20160012537A (en) Neural network training method and apparatus, data processing apparatus
US20180146890A1 (en) Apparatus and method for recognizing gait state
US10548803B2 (en) Method and device for outputting torque of walking assistance device
US20220142850A1 (en) Method and apparatus for recognizing gait task
KR102046707B1 (en) Techniques of performing convolutional neural network-based gesture recognition using inertial measurement unit
KR102018416B1 (en) Device and system for monitoring micturition or ejaculation based on user's posture or change in posture, method of monitoring micturition or ejaculation, and computer-readable recording medium for implementing same method
KR101870542B1 (en) Method and apparatus of recognizing a motion
JP7369783B2 (en) Information processing method, program and information processing device
TWI505707B (en) Abnormal object detecting method and electric device using the same
Ravindran et al. Deictic option schemas
Bhattacharyya et al. A Fall Detection System using Hybrid Inertial and Physiological Signal Classifiers for Dynamic Environments
KR20240061787A (en) Method and apparatus for object tracking
Liang et al. Studying the capture of stochastic events using radar and a mobile robot
Meriçli et al. Dealing with Uncertainty in Structured Environments: A Robot Soccer Case Study
KR20210046219A (en) Method and apparatus for detecting human interaction with an object

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION