CN111588597A - Intelligent interactive walking training system and implementation method thereof

Intelligent interactive walking training system and implementation method thereof

Info

Publication number
CN111588597A
Authority
CN
China
Prior art keywords
gait
lower limb
flexion
angle
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010321869.1A
Other languages
Chinese (zh)
Inventor
董梁
鲜沛宜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ikcare Medical Electrical Equipment Co ltd
Original Assignee
Ikcare Medical Electrical Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2020-04-22
Filing date
2020-04-22
Publication date
Application filed by Ikcare Medical Electrical Equipment Co ltd
Priority to CN202010321869.1A
Publication of CN111588597A
Legal status: Pending

Classifications

    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • A63B 24/0075 Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • A63B 69/0028 Training appliances or apparatus for special sports, for running, jogging or speed-walking
    • A63B 71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance, relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A61H 2201/1207 Driving means with electric or magnetic drive
    • A61H 2201/165 Wearable interfaces
    • A63B 2071/0636 3D visualisation
    • A63B 2071/0647 Visualisation of executed movements
    • A63B 2220/18 Inclination, slope or curvature
    • A63B 2220/34 Angular speed
    • A63B 2220/803 Motion sensors

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Biophysics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Pain & Pain Management (AREA)
  • Rehabilitation Therapy (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention discloses an intelligent interactive walking training system comprising a lower limb motion capture sensor combination, a gait processor connected to the sensor combination, a feedback controller connected to the gait processor, and a prompting module connected to the feedback controller. The invention also discloses an implementation method of the intelligent interactive walking training system, which includes establishing a gait model in the gait recognition module and setting training target values in the training setting module; the lower limb motion capture sensor combination acquires the magnetic field strength, angular velocity and acceleration of the left and right lower limbs in real time during movement. The gait recognition method adopted by the invention decomposes each step into 4 gaits, each of very short duration, so abnormal gaits are recognized very quickly; once an abnormal gait is detected, a corrective prompt can be given the next time the same gait recurs. The method therefore responds very quickly and closely resembles real-time guidance by a rehabilitation therapist.

Description

Intelligent interactive walking training system and implementation method thereof
Technical Field
The invention relates to the field of walking rehabilitation training equipment, in particular to a walking rehabilitation training system that intelligently identifies abnormal gaits and corrects them interactively, and a method of using the same.
Background
Walking ability is an important evaluation index of human transfer function. Each walking action of the lower limbs is completed through the cooperation of multiple organs and body parts, including brain control, nerve conduction, lower limb muscle contraction, and lower limb joint flexion and extension. A failure at any of these links can cause a walking disorder, so a large number of diseases and sequelae are associated with walking disorders, including central nervous system injury, spinal cord injury, muscle injury, orthopedic disease, and prolonged postoperative immobilization. Most patients with walking disorders have suffered central nervous system injuries, mainly caused by stroke; in these patients the brain region controlling lower limb movement is damaged while the spinal cord, muscles and joints are relatively normal. Many kinds of walking training are given to such patients, for example early standing training with a standing training bed, partial-weight-bearing walking training on a gait trainer with body-weight support, equipment-assisted walking training once the patient can bear weight on the feet, and obstacle-crossing or slope training once the patient has some walking ability. The main goals of such training are to establish new brain reflex zones, induce motor control reflexes, enhance control of isolated movements, and improve balance. Almost all of this training is performed in hospitals, either with large equipment or under the guidance of professional rehabilitation therapists, and it is difficult to carry out for discharged patients.
Walking training outside the hospital is mostly prescribed by the doctor as a long-term medical order, such as one hour of walking training per day or daily stair-climbing training. However, without direct monitoring and supervision it is impossible to evaluate whether the user's walking actions meet the training requirements, and no real-time guidance can be given when an abnormal gait occurs. As a result, the quality of post-discharge walking training cannot be guaranteed, the walking function of many patients improves slowly, and they must rely on assistive devices for a long time.
In view of the above, a home gait training device with interaction and monitoring functions would be of great value, but unfortunately no mature device of this kind is available on the market. One technology that can be referenced is the interactive upper limb rehabilitation device based on miniature sensors disclosed in patent ZL2011205619611, which uses several wearable sensors to track upper limb movement, enables interactive training, and can also evaluate upper limb function. However, that device is designed specifically for upper limb movement, and neither its monitoring algorithm nor its evaluation algorithm is suitable for walking training. Patent application No. 2019102852009 discloses an interactive, difficulty-graded neck training system that uses a single sensor to detect head and neck movements, automatically recommends the next training movement according to how the user performs the current one, and monitors and guides the user's movement angle, stretch holding time and so on in real time during training. Although that technology is intelligent and interacts in real time, like the previous patent neither its algorithm nor its interaction mode is suitable for walking training. An intelligent home interactive walking training system must comprehensively consider the following factors. First, large body-weight-support devices cannot be deployed in a home environment and facilities cannot be rebuilt as in a hospital rehabilitation department, so for safety the walking training mainly targets patients who already have basic walking and weight-support ability. Second, the home environment is not limited to indoors but includes places suitable for walking training such as the home, the residential community and parks, so the target device must be portable and adaptable to the environment. Third, the main external manifestation of walking is the periodic flexion and extension of the hip, knee and ankle joints, so effective gait monitoring must acquire the real-time angles of these three joints and also consider their coordination. Fourth, the interaction in walking training cannot rely on vision, because during training the user must watch the road conditions underfoot rather than a screen, so the interaction should mainly be voice and body-surface sensation. It is therefore important to provide a new type of walking training system.
Disclosure of Invention
The invention aims to overcome the above defects and provides an intelligent interactive walking training system and a method of using the same. The system supervises the user's walking training in real time, collects the flexion and extension angles of the hip, knee and ankle joints during training, uses a Hidden Markov Model (HMM) to recognize and evaluate the gait, and, after an abnormal gait is detected, gives a prompt and a stimulus by voice and body-surface vibration when the next occurrence of the same gait starts, helping the user correct the abnormal gait.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows: an intelligent interactive walking training system comprising:
lower limb movement capture sensor combination: the lower limb movement capturing sensor combination is used for measuring magnetic field intensity, angular velocity and acceleration values of the worn limb relative to three axes of the sensor under each gait in real time.
Training setting module: the training setting module is used for setting target values of all gaits during training; the target value of each gait comprises the maximum and minimum value ranges of the flexion and extension angles of the hip joint, the knee joint and the ankle joint under each gait, and the duration range of each gait.
A gait recognition module: the gait recognition module calculates the flexion and extension angles of all joints of the worn limb according to data measured by the lower limb movement capture sensor combination, recognizes the current gait according to the flexion and extension angles, and updates the maximum value and the minimum value of all joint flexion and extension angles of the current gait and the duration of the current gait; the gait recognition module is also used for comparing the flexion and extension angles of all joints with the set flexion and extension angle target value, meanwhile, comparing the duration time of the finished gait with the set gait duration time, judging whether the finished gait reaches the standard or not, and outputting a judgment result; and when the gait changes, the gait recognition module outputs a new gait mark.
A gait storage module: the gait storage module is used for storing the judgment result of the historical gait.
A feedback generation module: the feedback generation module determines whether to give a prompt or feedback when the next gait starts, according to the judgment result of the same gait in the previous cycle.
A prompt module: the prompt module is used for sending prompt information.
The lower limb motion capture sensor combination comprises a left lower limb motion capture sensor node and a right lower limb motion capture sensor node.
The intelligent interactive walking training system further comprises a calibration module, wherein the calibration module is used for measuring the wearing error of the lower limb movement capturing sensor combination and the angle of the advancing direction relative to each sensor, and inputting the wearing error and the advancing direction into the gait recognition module.
The gait recognition module is also used for correcting the flexion and extension angles of the hip, knee and ankle joints of the left lower limb and the right lower limb according to the wearing error and the advancing direction provided by the calibration module.
The intelligent interactive walking training system also comprises a sensor interface, wherein the sensor interface comprises a left lower limb sensor interface and a right lower limb sensor interface; the left lower limb sensor interface is used for providing power for the left lower limb motion capture sensor node and acquiring measurement data of the left lower limb motion capture sensor node; the right lower limb sensor interface is used for providing power for the right lower limb motion capture sensor node and acquiring measurement data of the right lower limb motion capture sensor node.
The prompting module comprises a vibration motor combination, an earphone or a sound box.
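To make the data flow between these modules concrete, the following minimal Python sketch (not part of the original disclosure; all type and field names are illustrative assumptions) shows the kinds of records that could be exchanged: a raw nine-axis sample from one sensor node, a per-limb joint angle vector, and a stored gait judgment.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class ImuSample:
    """One reading from a lower limb motion capture sensor node:
    magnetic field strength, angular velocity and acceleration
    about the sensor's three axes."""
    mag: Vec3     # magnetic field strength, e.g. in microtesla
    gyro: Vec3    # angular velocity, e.g. in deg/s
    accel: Vec3   # acceleration, e.g. in m/s^2


@dataclass
class JointAngles:
    """Flexion-extension angles of one lower limb at a given instant (degrees)."""
    hip: float
    knee: float
    ankle: float


@dataclass
class GaitJudgment:
    """Judgment of one completed gait, as stored by the gait storage module."""
    side: str          # "left" or "right"
    gait: str          # "heel_off", "forward_swing", "heel_strike", "backward_swing"
    duration_s: float  # how long the gait lasted
    abnormal: bool     # True if any angle or the duration missed its target range
    reason: str = ""   # e.g. "left knee below target angle", "gait too slow"
```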
An implementation method of an intelligent interactive walking training system comprises the following steps:
step 1: establishing a gait model in a gait recognition module, and setting a training target value in a training setting module;
step 2: the lower limb movement capturing sensor combination acquires the magnetic field intensity, the angular velocity and the acceleration value of the left lower limb and the right lower limb in the movement process in real time;
step 3: the gait recognition module respectively calculates the flexion and extension angles of hip joints, knee joints and ankle joints of the left lower limb and the right lower limb according to the acquired magnetic field intensity, angular velocity and acceleration values of the left lower limb and the right lower limb;
step 4: the gait recognition module is used for recognizing the current gait of the left lower limb and the right lower limb by combining the gait model with the flexion and extension angles of the hip joint, the knee joint and the ankle joint of the left lower limb and the right lower limb;
step 5: the gait recognition module updates the maximum flexion-extension angle value and the minimum flexion-extension angle value which are reached by the hip joint, the knee joint and the ankle joint of the left lower limb and the right lower limb in the current gait, and simultaneously updates the duration time of the current gait of the left lower limb and the right lower limb;
step 6: the gait recognition module judges whether gait conversion occurs or not; if yes, the gait recognition module compares the maximum flexion-extension angle value and the minimum flexion-extension angle value of the hip joint, the knee joint and the ankle joint which finish the gait in front of the left lower limb and the right lower limb, and the duration time of the finished gait with the training target value set by the training setting module, judges whether the finished gait is abnormal or not, sends the judgment result to the gait storage module for storage, and simultaneously sends a new gait mark to the feedback generation module; if not, returning to the step 2;
step 7: after receiving the new gait mark sent by the gait recognition module, the feedback generation module inquires whether the judgment result of the same gait on the limb on the same side stored by the gait storage module is abnormal gait; if yes, outputting prompt information; otherwise, returning to the step 2.
Further, the gait model in step 1 is as follows: each step of one side's lower limb is one walking cycle, and the walking cycle is divided into four gaits: heel off the ground, forward swing, heel strike, and relative backward swing; the left and right lower limbs complete the gait transitions in sequence during walking. When one lower limb is in the heel-off gait, the other lower limb is in the heel-strike gait; when one side's lower limb is in the forward swing gait, the opposite lower limb is in the backward swing gait; when one side's lower limb is in the heel-strike gait, the other side's lower limb enters the heel-off gait; and when one side's lower limb is in the relative backward swing gait, the opposite side enters the forward swing gait.
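One way to picture this gait model is the sketch below (an illustrative assumption, not taken from the patent). It encodes the four gaits, the rule that each state can only repeat or advance to its neighbour with S4 wrapping back to S1, and the half-cycle phase offset between the two limbs described above.

```python
from enum import Enum


class Gait(Enum):
    HEEL_OFF = 1        # S1: heel off the ground
    FORWARD_SWING = 2   # S2: forward swing
    HEEL_STRIKE = 3     # S3: heel touching the ground
    BACKWARD_SWING = 4  # S4: relative backward swing


def next_gait(current: Gait) -> Gait:
    """A state may only repeat itself or advance to the next state;
    the state after S4 is S1 (one full walking cycle per step)."""
    return Gait((current.value % 4) + 1)


def contralateral_gait(current: Gait) -> Gait:
    """Phase relation between the two limbs: the opposite limb is
    half a cycle (two states) away."""
    return Gait(((current.value + 1) % 4) + 1)


if __name__ == "__main__":
    # Left limb in heel-off (S1) -> right limb in heel-strike (S3), and so on.
    for g in Gait:
        print(g.name, "->", contralateral_gait(g).name)
```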
The training target values set in step 1 include the following (see the sketch after this list):
1) in the heel-off gait: the left hip joint minimum angle LH_S1, the right hip joint minimum angle RH_S1, the left ankle joint minimum angle LA_S1 and the right ankle joint minimum angle RA_S1;
2) in the forward swing gait: the left hip joint minimum angle LH_S2, the right hip joint minimum angle RH_S2, the left knee joint maximum angle LK_S2, the right knee joint maximum angle RK_S2, the left ankle joint maximum angle LA_S2 and the right ankle joint maximum angle RA_S2;
3) in the heel-strike gait: the left hip joint maximum angle LH_S3, the right hip joint maximum angle RH_S3, the left ankle joint maximum angle LA_S3 and the right ankle joint maximum angle RA_S3;
4) in the relative backward swing gait: the left hip joint maximum angle LH_S4, the right hip joint maximum angle RH_S4, the left knee joint minimum angle LK_S4 and the right knee joint minimum angle RK_S4;
5) the duration range of each gait.
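A compact way to hold these target values is sketched below (an assumption for illustration; field names and the example numbers are not from the patent): each gait stores the per-joint angle bounds that matter in that gait plus a duration range.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

Range = Tuple[float, float]  # (minimum, maximum)


@dataclass
class GaitTargets:
    """Target values for one gait (S1..S4) of one limb, in degrees and seconds.
    Only the joints constrained in that gait need an entry; for example the
    heel-off gait constrains the hip and ankle minimum angles."""
    angle_ranges: Dict[str, Range] = field(default_factory=dict)  # joint -> (min, max)
    duration_s: Range = (0.0, 10.0)                               # allowed gait duration


# Example with purely illustrative numbers, to be set by the therapist per user.
left_heel_off_targets = GaitTargets(
    angle_ranges={"hip": (10.0, 40.0), "ankle": (-15.0, 10.0)},
    duration_s=(0.2, 0.8),
)
```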
The specific method for identifying the current gait of the left and right lower limbs in step 4 is as follows: obtain the flexion-extension angle vector l_t of the left lower limb and the flexion-extension angle vector r_t of the right lower limb at any time t; take l_1, l_2, l_3, ..., l_t and r_1, r_2, r_3, ..., r_t as the outputs of two random processes from time 1 to time t, and apply the Viterbi algorithm to the gait model to infer the gaits of the left and right lower limbs with the maximum posterior probability at times 1, 2, 3, ..., t, which are the gaits identified by the gait model. Here the flexion-extension angle vector of the left lower limb is l_t = (lh_t, lk_t, la_t), where lh_t is the flexion-extension angle of the left hip joint at time t, lk_t is the flexion-extension angle of the left knee joint at time t, and la_t is the flexion-extension angle of the left ankle joint at time t; the flexion-extension angle vector of the right lower limb is r_t = (rh_t, rk_t, ra_t), where rh_t is the flexion-extension angle of the right hip joint at time t, rk_t is the flexion-extension angle of the right knee joint at time t, and ra_t is the flexion-extension angle of the right ankle joint at time t.
Step 1.1 is also included before step 2; step 1.1: the calibration module measures the wearing error of the lower limb movement capturing sensor combination and the angle of the travel direction relative to each sensor.
Compared with the prior art, the invention has the following advantages and beneficial effects:
(1) The invention uses a Hidden Markov Model (HMM) to recognize gait. This mathematical model tolerates fuzziness well, provides good recognition accuracy for different users and different walking styles, and can be optimized automatically during use to better match the user's walking habits.
(2) The gait recognition method adopted by the invention decomposes each step into 4 gaits, and the duration of each gait is very short, so abnormal gaits are recognized very quickly; once an abnormal gait is found, a corrective prompt can be given immediately the next time the same gait occurs. The method therefore responds very quickly, much like the real-time guidance of a rehabilitation therapist.
(3) The invention uses an interactive mode to set the 4 gaits of each step independently, including the target joint angles and the duration of each gait, rather than setting a single general threshold.
(4) The invention uses sound and body-surface vibration stimulation as prompts and feedback, so the user can concentrate on the road conditions without watching a screen, which ensures training safety.
(5) The motion sensors, earphone, vibration motors, processor and the like used by the invention can be integrated into a small portable device, so the user can easily wear the device and train at any time in various settings such as indoors, residential areas and parks, without any modification of the environment, giving the invention good applicability.
Drawings
FIG. 1 is a block diagram of the structure of the intelligent interactive walking training system of the present invention.
Fig. 2 is a schematic view of the lower limb movement capturing sensor combination and the vibration motor combination of the present invention.
FIG. 3 is a flowchart illustrating a method for implementing the intelligent interactive walking training system of the present invention.
The figures of the above drawings are numbered:
1-sensor, 2-vibrating motor.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but the present invention is not limited thereto.
Example 1
As shown in fig. 1, the intelligent interactive walking training system of the present embodiment includes a lower limb motion capture sensor assembly, a gait processor connected to the lower limb motion capture sensor assembly, a feedback controller connected to the gait processor, and a prompt module connected to the feedback controller.
Specifically, the lower limb movement capturing sensor combination is used for measuring magnetic field strength, angular velocity and acceleration values of the worn limb relative to three axes of the sensor under each gait in real time. The lower limb motion capture sensor combination comprises a left lower limb motion capture sensor node and a right lower limb motion capture sensor node, wherein the left lower limb motion capture sensor node and the right lower limb motion capture sensor node are both composed of a plurality of cascaded sensors 1. In use, as shown in fig. 2, the sensors are fixed to the right thigh outer side, the right calf outer side, the right instep, the left thigh outer side, the left calf outer side and the left instep, respectively, and the magnetic field strength, the angular velocity and the acceleration value of each joint of the left lower limb and the right lower limb relative to three axes of the sensors are measured in real time by the plurality of sensors.
The sensors used in the lower limb motion capture sensor combination are prior art; for example, the MMocap motion capture system from Wuxi micro technology company Limited or the YD122 motion tracking sensor from Sichuan Xukang medical appliances Limited can be selected. The arrangement and use of the sensors in the lower limb motion capture sensor combination can be completed by a person skilled in the art according to this description without creative labor, so they are not described further here.
The gait processor comprises a gait recognition module, a training setting module, a left lower limb sensor interface and a right lower limb sensor interface. The training setting module is connected with the gait recognition module, and the gait recognition module is connected with the feedback controller; the left lower limb motion capture sensor node is connected with the gait recognition module through a left lower limb sensor interface, and the right lower limb motion capture sensor node is connected with the gait recognition module through a right lower limb sensor interface.
The left lower limb sensor interface is used for providing power for the left lower limb motion capture sensor node and acquiring measurement data of the left lower limb motion capture sensor node; the right lower limb sensor interface is used for providing power for the right lower limb motion capture sensor node and acquiring measurement data of the right lower limb motion capture sensor node. The left and right lower limb sensor interfaces are power and data interfaces on the mobile interactive equipment; they are connected to the lower limb motion capture sensor combination in a wired manner, supply power to it, and at the same time acquire its measurement data and send the data to the gait recognition module.
The training setting module is used for setting a target value during specific gait training; the target values of the gaits include the maximum and minimum ranges of flexion and extension angles of the hip joint, knee joint and ankle joint of the left and right lower limbs under the specific gait, and the duration range of the specific gait.
The gait recognition module calculates the flexion and extension angles of each joint of the left lower limb and the right lower limb according to data measured by the lower limb motion capture sensor; a gait model is built in the gait recognition device, and the current gait of the left lower limb and the current gait of the right lower limb are recognized according to the gait model and the flexion-extension angle; before gait conversion, the gait recognition module continuously updates the maximum value and the minimum value of the flexion-extension angle of each joint of the left lower limb and the right lower limb of the current gait and the duration of the current gait, wherein the maximum value and the minimum value of the flexion-extension angle of each joint comprise the maximum value and the minimum value of the flexion-extension angle of the hip joint, the knee joint and the ankle joint of the left lower limb and the right lower limb; when the gait is converted, the gait recognition module also compares the flexion and extension angles of all joints with the flexion and extension angle target value set in the training setting module, compares the duration time of the finished gait with the gait duration time set in the training setting module, judges whether the finished gait reaches the standard or not, and outputs the judgment result to the feedback controller; and when the gait changes, the gait recognition module also outputs a new gait mark to the feedback controller.
In order to improve the training effect, as another preferable scheme, the gait processor also comprises a calibration module which is connected with the gait recognition module.
The calibration module is used for measuring the wearing error of the lower limb motion capture sensor combination and the included angle between the advancing direction and each sensor, and inputting the wearing error and the advancing direction into the gait recognition module. During calibration, the calibration module prompts the user to complete several calibration actions and measures the wearing error of each sensor node and the included angle between the advancing direction and each sensor, mainly the angle between the vertical axis of the worn sensor and the plumb direction and the angle between the horizontal axis of the sensor and the straight-ahead direction of the body; the calibration module inputs these angle values, which represent the wearing error, to the gait recognition module. The gait recognition module is then further used for correcting the flexion and extension angles of the hip, knee and ankle joints of the left and right lower limbs according to the wearing error and the advancing direction provided by the calibration module, so as to improve the monitoring accuracy of the system.
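Purely as an illustration (the patent gives no formulas for this step), a wearing-error calibration could be organized as in the sketch below: while the user holds the upright calibration pose, the tilt of each sensor's vertical axis away from the plumb line and the yaw of its horizontal axis away from straight ahead are recorded, and later subtracted from the measured joint angles. The function names and the simple additive-offset model are assumptions.

```python
from dataclasses import dataclass


@dataclass
class WearingOffset:
    """Per-sensor wearing error captured during calibration (degrees)."""
    tilt_deg: float  # angle between the sensor vertical axis and the plumb line
    yaw_deg: float   # angle between the sensor horizontal axis and straight ahead


def capture_offset(tilt_measured_deg: float, yaw_measured_deg: float) -> WearingOffset:
    """Called while the user holds the 'both legs upright' calibration pose,
    when the true segment tilt and heading should both be close to zero."""
    return WearingOffset(tilt_deg=tilt_measured_deg, yaw_deg=yaw_measured_deg)


def corrected_angle(raw_angle_deg: float, offset: WearingOffset) -> float:
    """Remove the wearing error from a measured flexion-extension angle
    (simplified to a constant additive offset)."""
    return raw_angle_deg - offset.tilt_deg
```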
When the gait processor is specifically set, each module in the gait processor is arranged in mobile interaction equipment, such as a mobile phone, a tablet or other mobile terminals, and the interaction setting function of the training setting module is realized through a touch screen or a keyboard of the mobile interaction equipment.
The feedback controller comprises a gait storage module and a feedback generation module connected with the gait storage module, the feedback generation module and the gait storage module are both connected with the gait recognition module, and the feedback generation module is also connected with the prompt module. The various modules of the feedback controller are also provided in the mobile interaction device.
The gait storage module is used for receiving and storing the judgment results of historical gaits sent by the gait recognition module; it is a storage space allocated in software.
The feedback generation module is used for receiving the new gait mark sent by the gait recognition module and determining, according to the judgment result of the same gait in the previous cycle, whether to give a prompt or feedback when the next gait starts. It works as follows: the gait storage module stores the previous judgment results of several gaits sent by the gait recognition module, where a judgment result may be a normal gait, a joint flexion-extension angle that did not reach the target angle, a joint flexion-extension angle that exceeded the target angle, or a gait that lasted too long or too short. When the feedback generation module receives a new gait mark from the gait recognition module, it checks the judgment result of the same gait on the limb on the same side in the gait storage module, and if that gait was abnormal it decides to give a prompt or feedback.
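A minimal sketch of this feedback decision, under the assumption that judgments are kept per (side, gait) as described above, is shown below; the names and the prompt table are illustrative, not from the patent.

```python
from typing import Dict, Optional, Tuple

# Last stored judgment per (side, gait): None means the gait was normal or has
# no record yet; otherwise the value is the abnormality reason string.
GaitKey = Tuple[str, str]           # ("left" | "right", "heel_off" | ...)
gait_store: Dict[GaitKey, Optional[str]] = {}

# Illustrative mapping from abnormality reason to a voice clip and a motor index.
PROMPTS: Dict[str, Tuple[str, int]] = {
    "hip below target angle":   ("swing the thigh forward", 0),
    "knee above target angle":  ("straighten the knee joint", 1),
    "ankle below target angle": ("flex the ankle joint", 2),
    "gait too slow":            ("a little faster", 3),
}


def on_new_gait(side: str, gait: str) -> Optional[Tuple[str, int]]:
    """Called when the gait recognition module emits a new gait marker.
    Returns (voice_clip, vibration_motor_index) if the same gait on the
    same side was abnormal in the previous cycle, otherwise None."""
    reason = gait_store.get((side, gait))
    if reason is None:
        return None
    return PROMPTS.get(reason, ("please adjust this movement", 0))
```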
The prompting module is used for sending prompt information and comprises a vibration motor combination and an earphone or a sound box; the earphone or the sound box is connected to the feedback generation module in a wired or wireless manner. The prompt function of the calibration module is realized through the earphone or sound box: the user is told by voice to complete several standard actions while the wearing error and the advancing direction are measured. The voice clips of the feedback generation module are stored in the mobile interaction device and mainly contain prompt phrases, for example: "swing the thigh forward", "straighten the knee joint", "flex the ankle joint", "a little faster", and so on, which are played to the trainee through the earphone or sound box during training. The vibration motor combination is connected to the feedback generation module in a wired manner. All of the above wired interfaces are provided on the mobile interaction device.
The vibration motor combination comprises several independently controlled vibration motors 2. In use, the vibration motors are fixed respectively on the right thigh posterior muscle group, the right thigh anterior muscle group near the knee joint, the right calf posterior muscle group, the left thigh posterior muscle group, the left thigh anterior muscle group near the knee joint, and the left calf posterior muscle group. When a gait is abnormal, the feedback generation module drives the corresponding vibration motor in the combination to vibrate for a specified duration to prompt the trainee to correct the gait. The vibration motor can be, for example, a KPD-FLAT-0827/0834 flat vibration motor, a KPD4C-H056 cylindrical vibration motor or a KPD-RS3322B patch vibration motor from Shenzhen Kunberda electromechanical Limited, a 1027/1034/1020/0820 mobile phone flat motor from Shenzhen Shizhen Quzhen technology Limited, or a DJ0612 coreless-cup vibration motor from Shenzhen electromechanical Limited; these are conventional and are not described further here.
Example 2
As shown in fig. 3, this embodiment is a method for implementing the intelligent interactive walking training system of embodiment 1, and the method includes the following steps:
step 1: and a gait model is established in the gait recognition module, and a rehabilitation therapist sets a training target value through the training setting module according to the walking ability of the user.
Specifically, the gait model established by the gait recognition module is a Hidden Markov Model (HMM), and the gait model is as follows: each step of one side's lower limb is one walking cycle, one walking cycle is divided into 4 gaits of heel off the ground, forward swing, heel strike and relative backward swing, and the left and right lower limbs complete the gait transitions in sequence during walking. For example, while the left lower limb is in the heel-off gait, the right lower limb is in the heel-strike gait; when the left lower limb is in the forward swing gait, the right lower limb is in the backward swing gait; when the left lower limb is in the heel-strike gait, the right lower limb enters the heel-off gait; and when the left lower limb is in the relative backward swing gait, the right lower limb enters the forward swing gait.
The specific modeling method is as follows. Let the left hip joint angle observed at time t be lh_t, the right hip joint angle be rh_t, the left knee joint angle be lk_t, the right knee joint angle be rk_t, the left ankle joint angle be la_t, and the right ankle joint angle be ra_t. The observation of the left lower limb at time t is written in vector form as l_t = (lh_t, lk_t, la_t); the observation vector of the right lower limb at time t is written as r_t = (rh_t, rk_t, ra_t). Taking l_t as the observed output at time t of a random process and L (with any l_t in L) as the observation space, a gait model H_L of the left lower limb is established from the 4 gaits. The model is a first-order left-right Hidden Markov Model (L-R HMM) with 4 states: state S1 corresponds to the heel-off gait, state S2 to the forward swing gait, state S3 to the heel-strike gait, and state S4 to the relative backward swing gait. Any state can only repeat itself or enter the next adjacent state with a certain probability, the state adjacent to S4 being S1, and each state outputs an observation vector l_t in the observation space L according to its own probability density. In the same way, taking R (with any r_t in R) as the observation space, a 4-state right lower limb gait HMM H_R is established.
After modeling, the gait models must be initialized. Specifically, H_L and H_R are initialized with probabilities computed from standard gait: the transition probability between adjacent states is set according to the duration of each state in normal gait, and the state output probability is set according to the probability density of the hip, knee and ankle flexion-extension angles observed in normal gait.
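The sketch below shows one way such an initialization could be written; it is purely illustrative, and the numeric values and the choice of diagonal-Gaussian emissions are assumptions not taken from the patent. Self-transition probabilities are derived from typical state durations at the assumed sampling interval, each state may only stay or advance (with S4 wrapping to S1), and each state emits the (hip, knee, ankle) angle vector according to a Gaussian density.

```python
import numpy as np

N_STATES = 4   # S1 heel-off, S2 forward swing, S3 heel-strike, S4 backward swing
DT = 0.02      # sampling interval in seconds (assumed 50 Hz)


def init_transitions(mean_durations_s):
    """Left-right HMM transition matrix: each state either repeats or advances
    to the next; the self-probability gives a mean dwell time of about d seconds."""
    A = np.zeros((N_STATES, N_STATES))
    for i, d in enumerate(mean_durations_s):
        stay = max(0.0, 1.0 - DT / d)
        A[i, i] = stay
        A[i, (i + 1) % N_STATES] = 1.0 - stay
    return A


def init_emissions(mean_angles, std_angles):
    """Per-state Gaussian emission parameters over (hip, knee, ankle) angles,
    estimated from angles observed during normal gait."""
    return [{"mean": np.asarray(m, float), "std": np.asarray(s, float)}
            for m, s in zip(mean_angles, std_angles)]


# Illustrative normal-gait statistics (degrees / seconds), not clinical values.
A = init_transitions([0.15, 0.40, 0.15, 0.40])
B = init_emissions(
    mean_angles=[(15, 40, -10), (25, 60, 5), (30, 10, 5), (0, 15, -5)],
    std_angles=[(8, 12, 6)] * N_STATES,
)
pi = np.array([0.25, 0.25, 0.25, 0.25])   # uniform initial state distribution
```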
Specifically, a training target value can be set to the training setting module through a touch screen or a keyboard of the mobile interactive device, wherein the set training target value includes:
1) in the heel-off gait S1: the left hip joint minimum angle LH_S1, the right hip joint minimum angle RH_S1, the left ankle joint minimum angle LA_S1 and the right ankle joint minimum angle RA_S1;
2) in the forward swing gait S2: the left hip joint minimum angle LH_S2, the right hip joint minimum angle RH_S2, the left knee joint maximum angle LK_S2, the right knee joint maximum angle RK_S2, the left ankle joint maximum angle LA_S2 and the right ankle joint maximum angle RA_S2;
3) in the heel-strike gait S3: the left hip joint maximum angle LH_S3, the right hip joint maximum angle RH_S3, the left ankle joint maximum angle LA_S3 and the right ankle joint maximum angle RA_S3;
4) in the relative backward swing gait S4: the left hip joint maximum angle LH_S4, the right hip joint maximum angle RH_S4, the left knee joint minimum angle LK_S4 and the right knee joint minimum angle RK_S4;
5) the duration range of each gait, where the heel-off gait S1 has the range (T_S1min, T_S1max), the forward swing gait S2 has (T_S2min, T_S2max), the heel-strike gait S3 has (T_S3min, T_S3max), and the relative backward swing gait S4 has (T_S4min, T_S4max).
After the above work is done, the following preparatory work is needed before training starts:
a. the motion sensors of the lower limb motion capture sensor combination are respectively fixed on the outer sides of the left/right thighs, the outer sides of the left/right shanks and the left/right dorsum of feet, and are respectively connected to a left lower limb sensor interface and a right lower limb sensor interface of a gait processor after being cascaded by data lines.
b. The vibration motors combined by the vibration motors are respectively fixed on the muscle group at the back of the left/right thigh, the muscle group at the front of the left/right thigh near the knee joint and the muscle group at the back of the left/right crus, and the vibration motors are connected with the feedback controller in a wired mode.
c. And connecting the feedback controller with an earphone or a sound box.
d. Correcting the wearing error of the lower limb motion capture sensor combination. The specific operation is as follows: following the voice prompts from the earphone or sound box, the user performs the following calibration actions in turn: standing upright on both legs, stepping forward with the left leg, and stepping forward with the right leg; the calibration module of the gait processor acquires the angles between the sensor axes and the ground normal and the forward direction, completing the correction of the wearing position and the advancing direction.
After the preparatory work is done, the earphone or sound box announces the start of training; the user begins walking training and the method enters the following steps.
Step 2: the lower limb motion capture sensor combination acquires the magnetic field intensity, the angular velocity and the acceleration value of the left lower limb and the right lower limb in the motion process in real time and transmits the values to the gait recognition module through the left lower limb sensor interface and the right lower limb sensor interface.
Step 3: the gait recognition module respectively calculates the flexion and extension angles of hip joints, knee joints and ankle joints of the left lower limb and the right lower limb according to the acquired magnetic field intensity, angular velocity and acceleration values of the left lower limb and the right lower limb.
During specific calculation, the included angle between the vertical axis of the sensor and the earth normal is obtained after complementary filtering of the magnetic field intensity, the angular velocity and the acceleration output by the sensor worn on the thigh position, and the hip joint flexion and extension angle is obtained after correction according to the wearing error and the advancing direction. The magnetic field intensity, the angular velocity and the acceleration output by the sensor worn on the thigh position and the sensor worn on the shank position are subjected to complementary filtering to obtain the vertical axis included angle of the two sensors, and the knee joint flexion and extension angle is obtained after correction according to the wearing error and the advancing direction. And magnetic field intensity, angular velocity and acceleration output by the sensor worn at the position of the shank and the sensor worn at the position of the instep are subjected to complementary filtering to obtain an included angle between the vertical axis of the shank sensor and the horizontal axis of the instep sensor, and the included angle is corrected according to wearing errors and the advancing direction to obtain the ankle joint flexion and extension angle. The above calculation method is the prior art, and will not be described herein.
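As background for this step, a complementary filter blends the fast but drifting gyroscope-integrated angle with the slower but drift-free inclination computed from the accelerometer (and magnetometer for heading). The sketch below is a generic one-axis illustration under simplifying assumptions (a single flexion axis and a fixed fusion weight alpha); it is not the patent's exact filter.

```python
import math


def accel_inclination_deg(ax: float, ay: float, az: float) -> float:
    """Tilt about one axis estimated from the measured gravity direction (degrees)."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))


def complementary_filter(angle_deg: float, gyro_dps: float,
                         accel_angle_deg: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One update step: integrate the gyro, then nudge the result toward the
    accelerometer-derived angle to cancel long-term drift."""
    return alpha * (angle_deg + gyro_dps * dt) + (1.0 - alpha) * accel_angle_deg


# Example: the knee flexion-extension angle could then be taken as the
# difference between the filtered thigh and shank segment angles.
def knee_angle(thigh_angle_deg: float, shank_angle_deg: float) -> float:
    return thigh_angle_deg - shank_angle_deg
```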
Step 4: the gait recognition module recognizes, by combining the built gait model with the flexion and extension angles of the hip, knee and ankle joints of the left and right lower limbs, whether the current gait of each lower limb is the heel-off, forward swing, heel-strike or relative backward swing gait.
Specifically, the method for identifying the current gaits of the left and right lower limbs is as follows: obtain the flexion-extension angle vector l_t of the left lower limb and the flexion-extension angle vector r_t of the right lower limb at any time t; take l_1, l_2, l_3, ..., l_t and r_1, r_2, r_3, ..., r_t as the outputs of two random processes from time 1 to time t, and apply the Viterbi algorithm to the gait models to infer the gaits of the left and right lower limbs with the maximum posterior probability at times 1, 2, 3, ..., t; these are the gaits identified by the gait model.
Here the flexion-extension angle vector of the left lower limb is l_t = (lh_t, lk_t, la_t), where lh_t is the flexion-extension angle of the left hip joint at time t, lk_t is the flexion-extension angle of the left knee joint at time t, and la_t is the flexion-extension angle of the left ankle joint at time t; the flexion-extension angle vector of the right lower limb is r_t = (rh_t, rk_t, ra_t), where rh_t is the flexion-extension angle of the right hip joint at time t, rk_t is the flexion-extension angle of the right knee joint at time t, and ra_t is the flexion-extension angle of the right ankle joint at time t.
In addition, during monitoring the gait models H_L and H_R are also corrected from the observation vectors l_t and r_t using the EM algorithm; the corrected models better match the gait characteristics of the current user.
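A compact Viterbi decoder consistent with the model sketched earlier is shown below; it is again an illustrative assumption, with diagonal-Gaussian emissions and made-up parameters. Given the sequence of flexion-extension angle vectors l_1, ..., l_t, it returns the most probable gait (state) sequence and would be run separately for the left and right limbs; online, the same recursion can be advanced one frame at a time, and the parameters could additionally be re-estimated with EM (Baum-Welch) as the text describes.

```python
import numpy as np

GAITS = ["heel_off", "forward_swing", "heel_strike", "backward_swing"]

# Illustrative parameters (compare the initialization sketch above).
A = np.array([[0.87, 0.13, 0.00, 0.00],
              [0.00, 0.95, 0.05, 0.00],
              [0.00, 0.00, 0.87, 0.13],
              [0.05, 0.00, 0.00, 0.95]])
MEANS = np.array([(15, 40, -10), (25, 60, 5), (30, 10, 5), (0, 15, -5)], float)
STDS = np.full((4, 3), 10.0)
PI = np.full(4, 0.25)


def log_emission(obs):
    """Log Gaussian density of obs = (hip, knee, ankle) under each state."""
    z = (obs - MEANS) / STDS
    return -0.5 * np.sum(z * z + np.log(2 * np.pi * STDS ** 2), axis=1)


def viterbi(observations):
    """Most probable state (gait) sequence for a sequence of angle vectors."""
    obs = np.asarray(observations, float)
    T = len(obs)
    logA = np.log(A + 1e-12)
    delta = np.log(PI) + log_emission(obs[0])
    back = np.zeros((T, 4), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA            # scores[i, j]: move from i to j
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(4)] + log_emission(obs[t])
    path = [int(np.argmax(delta))]
    for t in range(T - 1, 0, -1):                 # backtrack
        path.append(int(back[t][path[-1]]))
    return [GAITS[s] for s in reversed(path)]


if __name__ == "__main__":
    print(viterbi([(14, 38, -8), (20, 50, 0), (26, 58, 4), (29, 12, 5)]))
```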
Step 5: according to the flexion-extension angle values calculated in step 3, the gait recognition module continuously updates the maximum and minimum flexion-extension angle values reached by the hip, knee and ankle joints of the left and right lower limbs in the current gait, and continuously updates the duration of the current gait of the left and right lower limbs.
Step 6: the gait recognition module judges whether gait conversion occurs, namely whether the left lower limb and the right lower limb change from one gait to the next gait. If yes, the current gait is finished, the new gait is started, the gait recognition module compares the maximum flexion-extension angle value and the minimum flexion-extension angle value of the hip joint, the knee joint and the ankle joint with finished gaits in front of the left lower limb and the right lower limb, and the duration time of the finished gaits with the training target value set by the training setting module, judges whether the finished gaits are abnormal or not, sends the judgment result to the gait storage module for storage, and meanwhile, the gait recognition module also sends the new gait mark to the feedback generation module. If not, the current gait is not finished, the step 2 is returned, namely the gait recognition module continuously updates the maximum value and the minimum value of the flexion-extension angles of the hip joint, the knee joint and the ankle joint of the current gait of the left lower limb and the right lower limb, and continuously accumulates the duration of the current gait.
The method for judging whether the completed gait is abnormal in this step is as follows: if the maximum and minimum flexion-extension angle values of the hip, knee and ankle joints in the just-completed gait of the left and right lower limbs and the duration of the completed gait are all within the target value ranges, the gait is judged normal; if any value falls outside the target range, the gait is judged abnormal, namely a joint did not reach its target angle, a joint exceeded its target angle, or the gait took too long or ended too quickly.
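A direct transcription of this judging rule into code might look like the sketch below (illustrative only; the reason strings and the dictionary layout are assumptions). The recorded extreme angles of each joint and the gait duration are checked against the target ranges set for that gait, and the first violation found labels the gait abnormal.

```python
from typing import Dict, Tuple

Range = Tuple[float, float]  # (minimum, maximum)


def judge_gait(extremes: Dict[str, float],
               duration_s: float,
               angle_targets: Dict[str, Range],
               duration_target: Range) -> Tuple[bool, str]:
    """Return (abnormal, reason) for one completed gait of one limb.

    extremes: e.g. {"hip_max": 28.0, "knee_min": 3.0, "ankle_max": 9.0}
    angle_targets: target (min, max) range for each tracked extreme.
    """
    for name, value in extremes.items():
        lo, hi = angle_targets.get(name, (float("-inf"), float("inf")))
        if value < lo:
            return True, f"{name} below target angle"
        if value > hi:
            return True, f"{name} above target angle"
    lo, hi = duration_target
    if duration_s > hi:
        return True, "gait too slow"
    if duration_s < lo:
        return True, "gait too fast"
    return False, "normal gait"
```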
Step 7: after receiving the new gait mark sent by the gait recognition module, the feedback generation module queries the stored judgment result of the same gait on the limb on the same side. If it was an abnormal gait, the feedback generation module drives the earphone or sound box to give a voice prompt and simultaneously drives the vibration motor at the corresponding position to give a muscle vibration stimulus; thus, when the user enters the same gait in the next cycle, the user hears the voice prompt from the earphone or sound box and also feels the vibration stimulus at the corresponding muscle group, and accordingly performs the required action and corrects the wrong gait. If the stored judgment result of the same gait on the same-side limb is a normal gait, the method returns to step 2 and monitoring continues.
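Putting steps 2 to 7 together, the monitoring loop could be organized roughly as in the sketch below. This is a structural sketch only: the sensor read, angle computation, gait decoding, judging, storing and prompting are stand-in callables for the components described above (or for the earlier sketches), and every name here is an assumption.

```python
import time


def training_loop(read_imu, compute_angles, decode_gait, judge, store, prompt,
                  dt: float = 0.02) -> None:
    """One pass of the step 2 to 7 cycle per sensor frame (stand-in callables)."""
    state = {"left": None, "right": None}    # current gait per limb
    started = {"left": 0.0, "right": 0.0}    # when the current gait began
    extremes = {"left": {}, "right": {}}     # running min/max joint angles

    while True:
        samples = read_imu()                          # step 2
        for side in ("left", "right"):
            angles = compute_angles(side, samples)    # step 3: {"hip": ..., ...}
            gait = decode_gait(side, angles)          # step 4 (e.g. Viterbi)
            ext = extremes[side]                      # step 5: update extremes
            for joint, value in angles.items():
                ext[joint + "_max"] = max(ext.get(joint + "_max", value), value)
                ext[joint + "_min"] = min(ext.get(joint + "_min", value), value)
            if gait != state[side]:                   # step 6: gait transition
                if state[side] is not None:
                    duration = time.monotonic() - started[side]
                    abnormal, reason = judge(side, state[side], ext, duration)
                    store(side, state[side], abnormal, reason)
                    prompt(side, gait)                # step 7: feedback check
                state[side] = gait
                started[side] = time.monotonic()
                extremes[side] = {}
        time.sleep(dt)
```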
According to the embodiments, the invention can be well realized.

Claims (10)

1. An intelligent interactive walking training system, comprising:
lower limb movement capture sensor combination: the lower limb movement capturing sensor combination is used for measuring magnetic field intensity, angular velocity and acceleration values of the worn limb relative to three axes of the sensor under each gait in real time;
training sets up the module: the training setting module is used for setting target values of all gaits during training; the target value of each gait comprises the maximum and minimum value ranges of the flexion and extension angles of the hip joint, the knee joint and the ankle joint under each gait and the duration time range of each gait;
a gait recognition module: the gait recognition module calculates the flexion and extension angles of all joints of the worn limb according to data measured by the lower limb movement capture sensor combination, recognizes the current gait according to the flexion and extension angles, and updates the maximum value and the minimum value of all joint flexion and extension angles of the current gait and the duration of the current gait; the gait recognition module is also used for comparing the flexion and extension angles of all joints with the set flexion and extension angle target value, meanwhile, comparing the duration time of the finished gait with the set gait duration time, judging whether the finished gait reaches the standard or not, and outputting a judgment result; when gait conversion occurs, the gait recognition module outputs a new gait mark;
a gait storage module: the gait storage module is used for storing the judgment result of the historical gait;
a feedback generation module: the feedback generation module determines whether to give a prompt or feedback when the next gait starts according to the judgment result of the same gait in the previous period;
a prompt module: the prompt module is used for sending prompt information.
2. The intelligent interactive walking training system of claim 1, wherein the lower limb motion capture sensor combination comprises a left lower limb motion capture sensor node and a right lower limb motion capture sensor node.
3. The intelligent interactive walking training system of claim 1, further comprising a calibration module for measuring wearing errors of the lower limb motion capture sensor combination and angles of the direction of travel with respect to each sensor and inputting the wearing errors and the direction of travel to the gait recognition module;
the gait recognition module is also used for correcting the flexion and extension angles of the hip, knee and ankle joints of the left lower limb and the right lower limb according to the wearing error and the advancing direction provided by the calibration module.
4. The intelligent interactive walking training system of claim 2, further comprising sensor interfaces including a left lower limb sensor interface and a right lower limb sensor interface; the left lower limb sensor interface is used for providing power for the left lower limb motion capture sensor node and acquiring measurement data of the left lower limb motion capture sensor node; the right lower limb sensor interface is used for providing power for the right lower limb motion capture sensor node and acquiring measurement data of the right lower limb motion capture sensor node.
5. The intelligent interactive walking training system of claim 1, wherein the prompting module comprises a vibrating motor combination, an earphone or a sound box.
6. An implementation method of the intelligent interactive walking training system of any one of claims 1 to 5, comprising the following steps:
step 1: establishing a gait model in a gait recognition module, and setting a training target value in a training setting module;
step 2: the lower limb movement capturing sensor combination acquires the magnetic field intensity, the angular velocity and the acceleration value of the left lower limb and the right lower limb in the movement process in real time;
step 3: the gait recognition module respectively calculates the flexion and extension angles of hip joints, knee joints and ankle joints of the left lower limb and the right lower limb according to the acquired magnetic field intensity, angular velocity and acceleration values of the left lower limb and the right lower limb;
step 4: the gait recognition module is used for recognizing the current gait of the left lower limb and the right lower limb by combining the gait model with the flexion and extension angles of the hip joint, the knee joint and the ankle joint of the left lower limb and the right lower limb;
step 5: the gait recognition module updates the maximum flexion-extension angle value and the minimum flexion-extension angle value which are reached by the hip joint, the knee joint and the ankle joint of the left lower limb and the right lower limb in the current gait, and simultaneously updates the duration time of the current gait of the left lower limb and the right lower limb;
step 6: the gait recognition module judges whether gait conversion occurs or not; if yes, the gait recognition module compares the maximum flexion-extension angle value and the minimum flexion-extension angle value of the hip joint, the knee joint and the ankle joint which finish the gait in front of the left lower limb and the right lower limb, and the duration time of the finished gait with the training target value set by the training setting module, judges whether the finished gait is abnormal or not, sends the judgment result to the gait storage module for storage, and simultaneously sends a new gait mark to the feedback generation module; if not, returning to the step 2;
step 7: after receiving the new gait mark sent by the gait recognition module, the feedback generation module inquires whether the judgment result of the same gait on the limb on the same side stored by the gait storage module is abnormal gait; if yes, outputting prompt information; otherwise, returning to the step 2.
7. The method for implementing the intelligent interactive walking training system of claim 6, wherein the gait model in Step 1 is as follows: each step of one lower limb constitutes a walking cycle, and the walking cycle is divided into four gaits: heel off the ground, forward swing, heel touching the ground, and relative backward swing; during walking, the left and right lower limbs complete the gait transitions in sequence, wherein when one lower limb is in the heel-off gait, the opposite lower limb is in the heel-touching-the-ground gait; when one lower limb is in the forward-swing gait, the opposite lower limb is in the relative backward-swing gait; when one lower limb is in the heel-touching-the-ground gait, the opposite lower limb enters the heel-off gait; and when one lower limb is in the relative backward-swing gait, the opposite lower limb enters the forward-swing gait.
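The contralateral pairing described in this claim can be captured by a small lookup structure; a minimal Python sketch follows. The English labels heel_off, forward_swing, heel_strike and backward_swing are chosen for the example and correspond to heel off the ground, forward swing, heel touching the ground and relative backward swing.

```python
# Cyclic order of the four gaits within one limb's walking cycle.
GAIT_CYCLE = ["heel_off", "forward_swing", "heel_strike", "backward_swing"]

# While one limb is in the key gait, the opposite limb is in (or enters) the value gait.
OPPOSITE_GAIT = {
    "heel_off": "heel_strike",
    "forward_swing": "backward_swing",
    "heel_strike": "heel_off",
    "backward_swing": "forward_swing",
}

def next_gait(gait: str) -> str:
    """The gait a limb moves into next in the walking cycle."""
    return GAIT_CYCLE[(GAIT_CYCLE.index(gait) + 1) % len(GAIT_CYCLE)]

def contralaterally_consistent(left: str, right: str) -> bool:
    """Check a left/right gait pair against the model's pairing constraint."""
    return OPPOSITE_GAIT[left] == right
```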
8. The method of claim 7, wherein the training target values set in Step 1 include:
1) the minimum left hip joint angle LH_S1, minimum right hip joint angle RH_S1, minimum left ankle joint angle LA_S1 and minimum right ankle joint angle RA_S1 in the heel-off gait;
2) the minimum left hip joint angle LH_S2, minimum right hip joint angle RH_S2, maximum left knee joint angle LK_S2, maximum right knee joint angle RK_S2, maximum left ankle joint angle LA_S2 and maximum right ankle joint angle RA_S2 in the forward-swing gait;
3) the maximum left hip joint angle LH_S3, maximum right hip joint angle RH_S3, maximum left ankle joint angle LA_S3 and maximum right ankle joint angle RA_S3 in the heel-strike gait;
4) the maximum left hip joint angle LH_S4, maximum right hip joint angle RH_S4, minimum left knee joint angle LK_S4 and minimum right knee joint angle RK_S4 in the relative backward-swing gait;
5) the duration range of each gait (a possible layout for these target values is sketched after this list).
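One possible in-memory layout for the target values listed above, compatible with the violates_targets comparison sketched after claim 6. The gait names and joint keys are illustrative, and every numeric value is a placeholder chosen for the example; the patent does not specify concrete numbers.

```python
# Targets for a single limb; all numbers below are placeholders, not patent values.
TRAINING_TARGETS = {
    "heel_off": {            # phase S1: LH_S1 / RH_S1, LA_S1 / RA_S1
        "min_angles": {"hip": 10.0, "ankle": -15.0},
        "duration": (0.1, 0.6),
    },
    "forward_swing": {       # phase S2: LH_S2 / RH_S2, LK_S2 / RK_S2, LA_S2 / RA_S2
        "min_angles": {"hip": 20.0},
        "max_angles": {"knee": 60.0, "ankle": 5.0},
        "duration": (0.2, 0.8),
    },
    "heel_strike": {         # phase S3: LH_S3 / RH_S3, LA_S3 / RA_S3
        "max_angles": {"hip": 30.0, "ankle": 10.0},
        "duration": (0.1, 0.6),
    },
    "backward_swing": {      # phase S4: LH_S4 / RH_S4, LK_S4 / RK_S4
        "max_angles": {"hip": -10.0},
        "min_angles": {"knee": 5.0},
        "duration": (0.2, 0.8),
    },
}
```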
9. The method of claim 8, wherein the specific method for identifying the current gait of the left and right lower limbs in Step 4 is as follows: obtain the flexion-extension angle vector l_t of the left lower limb and the flexion-extension angle vector r_t of the right lower limb at any time t; taking l_1, l_2, l_3, ..., l_t and r_1, r_2, r_3, ..., r_t as the outputs of two random processes from time 1 to time t, the Viterbi algorithm is applied to infer backwards over the gait model and obtain the gaits of the left and right lower limbs with the maximum posterior probability at times 1, 2, 3, ..., t, namely the gaits identified by the gait model; wherein the flexion-extension angle vector of the left lower limb is l_t = (lh_t, lk_t, la_t), in which lh_t is the flexion-extension angle of the left hip joint at time t, lk_t is the flexion-extension angle of the left knee joint at time t, and la_t is the flexion-extension angle of the left ankle joint at time t; and the flexion-extension angle vector of the right lower limb is r_t = (rh_t, rk_t, ra_t), in which rh_t is the flexion-extension angle of the right hip joint at time t, rk_t is the flexion-extension angle of the right knee joint at time t, and ra_t is the flexion-extension angle of the right ankle joint at time t.
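The claim names the Viterbi algorithm but leaves the gait model's start, transition and emission probabilities unspecified. The following is a generic Viterbi decoder, sketched in Python, that would recover the maximum-a-posteriori gait sequence once those scores are supplied; the observations would be the per-instant angle vectors l_t = (lh_t, lk_t, la_t) or r_t = (rh_t, rk_t, ra_t), and all function names are illustrative.

```python
def viterbi(observations, states, log_start, log_trans, log_emit):
    """Generic Viterbi decoding: `observations` is a sequence of angle vectors,
    `states` the four gaits, and log_start/log_trans/log_emit the (unspecified)
    log-probability scores of the gait model."""
    # best[s]: best log-probability of any state path ending in state s at the current time.
    best = {s: log_start(s) + log_emit(s, observations[0]) for s in states}
    backpointers = []
    for obs in observations[1:]:
        prev, best, back = best, {}, {}
        for s in states:
            score, argmax = max((prev[r] + log_trans(r, s), r) for r in states)
            best[s] = score + log_emit(s, obs)
            back[s] = argmax
        backpointers.append(back)
    # Trace the maximum-a-posteriori state sequence backwards from the best final state.
    state = max(best, key=best.get)
    path = [state]
    for back in reversed(backpointers):
        state = back[state]
        path.append(state)
    return list(reversed(path))
```

Running the decoder once on l_1, ..., l_t and once on r_1, ..., r_t, with the four gaits as states, yields the two maximum-posterior gait sequences referred to in the claim.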
10. The method of claim 6, wherein Step 2 is preceded by Step 1.1: the calibration module measures the wearing error of the lower limb motion capture sensor combination and the angle of the direction of travel relative to each sensor.
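A minimal sketch of the calibration of Step 1.1 and the correction referred to in claim 3, under two simplifying assumptions that are not taken from the patent: the wearing error is estimated as the mean joint angle read during a short neutral standing capture, and the travel-direction misalignment is compensated with a simple cosine projection.

```python
import math

def measure_wearing_error(neutral_angles):
    """Step 1.1 (assumed procedure): in a neutral standing pose the flexion-extension
    angles should read zero, so the mean reading per joint is taken as the wearing error.
    `neutral_angles` maps joint name -> list of angles sampled while standing still."""
    return {joint: sum(samples) / len(samples) for joint, samples in neutral_angles.items()}

def correct_angle(measured_deg, wearing_error_deg, heading_offset_deg):
    """Correction used by the gait recognition module (simplified): remove the wearing
    error, then project the sagittal-plane angle onto the direction of travel."""
    return (measured_deg - wearing_error_deg) * math.cos(math.radians(heading_offset_deg))
```

In practice the travel-direction compensation would more likely be applied to the sensor orientations before the joint angles are formed; the projection above only illustrates where the two calibration outputs enter the computation.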
CN202010321869.1A 2020-04-22 2020-04-22 Intelligent interactive walking training system and implementation method thereof Pending CN111588597A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010321869.1A CN111588597A (en) 2020-04-22 2020-04-22 Intelligent interactive walking training system and implementation method thereof

Publications (1)

Publication Number Publication Date
CN111588597A true CN111588597A (en) 2020-08-28

Family

ID=72181790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010321869.1A Pending CN111588597A (en) 2020-04-22 2020-04-22 Intelligent interactive walking training system and implementation method thereof

Country Status (1)

Country Link
CN (1) CN111588597A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130146659A1 (en) * 2011-07-18 2013-06-13 Dylan T X Zhou Wearable personal digital device for facilitating mobile device payments and personal use
CN103267524A (en) * 2013-04-24 2013-08-28 华中科技大学 Wearable personnel gait-detection indoor-positioning system and method therefor
CN103383292A (en) * 2013-07-17 2013-11-06 北京航空航天大学 Wearable real-time monitoring and feedback alarm device and method for scissors gaits
CN105727443A (en) * 2016-02-05 2016-07-06 福州大学 Foot drop therapeutic method based on MEMS sensor
CN108720841A (en) * 2018-05-22 2018-11-02 上海交通大学 Wearable lower extremity movement correction system based on cloud detection
CN108720842A (en) * 2018-05-23 2018-11-02 上海交通大学 Wearable lower limb rehabilitation system based on electromyography signal feedback
CN110916679A (en) * 2019-12-31 2020-03-27 复旦大学 Human body lower limb pose gait detection device and method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112704491A (en) * 2020-12-28 2021-04-27 华南理工大学 Lower limb gait prediction method based on attitude sensor and dynamic capture template data
CN113057850A (en) * 2021-03-11 2021-07-02 东南大学 Recovery robot control method based on probability motion primitive and hidden semi-Markov
CN114712052A (en) * 2022-05-06 2022-07-08 燕山大学 Gait correction device and method
CN117078976A (en) * 2023-10-16 2023-11-17 华南师范大学 Action scoring method, action scoring device, computer equipment and storage medium
CN117078976B (en) * 2023-10-16 2024-01-30 华南师范大学 Action scoring method, action scoring device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111588597A (en) Intelligent interactive walking training system and implementation method thereof
CN108236560B (en) Electric walking aid for promoting gait movement and application method thereof
CN108056898B (en) Virtual scene interactive rehabilitation training robot based on lower limb connecting rod model and force sense information and control method thereof
US10290235B2 (en) Rehabilitation using a prosthetic device
JP6301862B2 (en) Lower leg exercise device and control method thereof
JP2014509919A (en) Active robotic walking training system and method
KR102010361B1 (en) User gait feedback device and driving method thereof
CN106112985B (en) Exoskeleton hybrid control system and method for lower limb walking aid machine
CN104434466A (en) Rehabilitation robot system for old people with cerebral apoplexy
CN106691770A (en) Session program for generating and executing training
US20170352288A1 (en) Method and system for physical training and rehabilitation
KR102578261B1 (en) Method for walking assist, and devices operating the same
CN112137844A (en) Rehabilitation support system, estimation device, learning method, and storage medium
CN112405504B (en) Exoskeleton robot
Brantley et al. Prediction of lower-limb joint kinematics from surface EMG during overground locomotion
CN109771216A (en) A kind of patients with cerebral apoplexy rehabilitation fes signal accurate positioning method
WO2021174340A1 (en) System and method for gait monitoring and improvement
CN112137846A (en) Learning system, walking training system, method, program, and learning completion model
CN110338952B (en) Device for training normal gait and application thereof
CN112137837B (en) Learning system, walking training system, method, program, and learning completion model
Martinez et al. Preliminary assessment of a lower-limb exoskeleton controller for guiding leg movement in overground walking
Baud et al. Bio-inspired standing balance controller for a full-mobilization exoskeleton
CN112137839A (en) Learning system, walking training system, method, program, and learned model
CN112137838A (en) Learning device, walking training system, method, program, and learning completion model
KR20120012010A (en) Rehabilitation apparatus which follows the action of a normal human body part

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200828