CN108992072B - Human body lower limb movement intention identification method based on gait event


Info

Publication number
CN108992072B
CN108992072B (granted from application CN201810915082.0A)
Authority
CN
China
Prior art keywords
gait
event
stride
time
step frequency
Prior art date
Legal status
Active
Application number
CN201810915082.0A
Other languages
Chinese (zh)
Other versions
CN108992072A (en)
Inventor
陈正
卢形
黄方昊
周时钊
朱世强
梅珑
金来
郭凯
Current Assignee
Anhui Sanlian Robot Technology Co ltd
Zhejiang University ZJU
Original Assignee
Anhui Sanlian Robot Technology Co ltd
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Anhui Sanlian Robot Technology Co ltd, Zhejiang University ZJU
Priority to CN201810915082.0A
Publication of CN108992072A
Application granted
Publication of CN108992072B
Legal status: Active

Links

Images

Classifications

    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/112 Gait analysis
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/6828 Specially adapted to be attached to the leg
    • A61B5/7235 Details of waveform analysis

Abstract

The invention discloses a human movement intention recognition method based on gait event detection. The method first defines the gait events of the human lower limbs, then detects those events in real time, next derives the real-time step frequency and stride from the detected events, and finally adjusts the standard joint angle trajectory according to the real-time step frequency and stride to generate a reference joint angle trajectory synchronized in step frequency and stride. Joint angle data are acquired in real time by inertial measurement units (IMUs) mounted on the lower limbs; from these data a joint angle reference trajectory synchronized in step frequency and stride is obtained and used as the tracking trajectory for motion control, so that the lower-limb exoskeleton ultimately moves in synchrony with the human gait.

Description

Human body lower limb movement intention identification method based on gait event
Technical Field
The invention relates to a human body lower limb movement intention identification method based on gait events.
Background
With the continuous development of electromechanical technology, research on robotic systems has become a hot topic. Wearable lower-limb exoskeleton robots have gradually matured and are now widely applied in the military and medical fields.
Recognizing the movement intention of the human lower limbs is one of the key technologies for making exoskeleton robots intelligent. At present there are two main approaches to lower-limb movement intention recognition: one recognizes the intention from the body's electrical signals, such as electromyography (EMG) or electroencephalography (EEG); the other recognizes it by measuring the human-robot interaction.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a human body lower limb movement intention identification method based on gait events.
The method first detects the real-time gait parameters (step frequency and stride) of the human lower limbs, then adjusts the standard joint angle trajectory according to the real-time step frequency and stride to obtain a joint angle control target trajectory synchronized with the human step frequency and stride, and finally uses this target trajectory as the tracking trajectory of the control system for motion control, so that the lower-limb exoskeleton moves in synchrony with the human gait, which constitutes recognition of the movement intention of the human lower limbs.
The technical scheme of the invention comprises the following specific contents:
(1) Definition of human lower-limb gait events.
(2) Real-time gait event detection algorithm.
(3) Real-time step frequency and stride estimation based on gait events.
(4) Adjustment of the standard joint angle trajectory according to the real-time step frequency and stride, generating a reference joint angle trajectory synchronized in step frequency and stride.
The invention has the following beneficial effects: it provides a brand-new method for recognizing lower-limb movement intention, in which joint angle data are acquired in real time by inertial measurement units (IMUs) mounted on the lower limbs, a joint angle reference trajectory synchronized in step frequency and stride is derived from the acquired data and used as the tracking trajectory for motion control, and the lower-limb exoskeleton finally moves in synchrony with the human gait.
Drawings
Fig. 1 is an overall process block diagram;
Fig. 2 is a gait event definition diagram.
Detailed Description
The overall scheme of the invention comprises three main steps: first, real-time gait event detection; second, real-time step frequency and stride detection; and finally, adjustment of the standard joint angle trajectory curve and generation of the reference angle trajectory, as shown in Fig. 1.
(1) Gait event detection
The invention divides a full gait cycle into four gait phases: the initial stance phase (ISt), the mid-stance phase (MSt), the terminal stance phase (TSt) and the swing phase (Sw). The onset of each gait phase is defined as a gait event: the initial contact event (IC), the thigh erect event (TE), the pre-toe-off event (pTO) and the toe-off event (TO), as shown in Fig. 2.
According to these gait event definitions and experimental results, there is a definite mapping between the gait events and the hip joint angle of the lower limb; the specific rules are given in Table 1.
Table 1. Mapping between gait events and the hip joint angle

Gait event:       IC        TE          pTO       TO
Hip joint angle:  maximum   about zero  minimum   about zero
A data acquisition card collects the real-time hip joint angle data of the human lower limbs from the IMUs; with a program implementing the rules above, each gait event in every gait cycle can be detected in real time.
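For illustration, the following minimal Python sketch shows one way the Table 1 rules might be implemented on a sampled hip-angle stream; the sampling setup, the synthetic test signal and all function names are assumptions for illustration, not part of the patent, and a real implementation would first low-pass filter and debounce the IMU signal.

```python
import numpy as np

def detect_gait_events(theta_hip, t):
    """Rule-based gait event detection following Table 1.

    theta_hip : hip joint angle samples (deg) of one leg
    t         : matching time stamps (s)
    Returns a list of (time, event) tuples, event in {IC, TE, pTO, TO}.
    """
    events = []
    for i in range(1, len(theta_hip) - 1):
        prev, cur, nxt = theta_hip[i - 1], theta_hip[i], theta_hip[i + 1]
        if prev < cur >= nxt and cur > 0:
            events.append((t[i], "IC"))   # hip angle maximum -> initial contact
        elif prev > cur <= nxt and cur < 0:
            events.append((t[i], "pTO"))  # hip angle minimum -> pre-toe-off
        elif prev > 0 >= cur:
            events.append((t[i], "TE"))   # descending zero crossing -> thigh erect
        elif prev < 0 <= cur:
            events.append((t[i], "TO"))   # ascending zero crossing -> toe-off
    return events

# Synthetic 1 Hz hip-angle trace with 25 deg amplitude (illustrative only).
t = np.arange(0.0, 2.0, 0.01)
theta = 25.0 * np.sin(2 * np.pi * t)
print(detect_gait_events(theta, t))  # IC, TE, pTO, TO once per cycle
```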
(2) Step frequency and stride algorithms
2.1 Step frequency algorithm
Based on the detected gait events, the time interval between adjacent identical gait events can be obtained, from which the gait cycle T_s is deduced:

T_s = t_n − t_{n−1}

where t_n and t_{n−1} respectively denote the times at which the nth and (n−1)th occurrences of the gait event are detected. To improve the accuracy of the gait cycle calculation, all four gait events defined in a gait cycle are used to calculate the cycle and the results are averaged, according to the formula:
T_s = (1/k) Σ_{i=1}^{k} (t_n^i − t_{n−1}^i),    ω_s = 2π / T_s

where t_n^i and t_{n−1}^i respectively denote the times of the nth and (n−1)th occurrences of the i-th gait event, ω_s denotes the gait frequency, and k = 4. After the gait cycle is obtained, the gait phase angle is defined as follows:
φ_s(t) = ω_s (t − t_n) mod 2π
where φ_s denotes the gait phase angle function, with values ranging from 0 to 2π. The gait progress percentage s is then defined from the phase angle as:

s = φ_s / (2π) × 100%
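As a sketch of the period, frequency and phase computation just described, the Python fragment below averages the most recent interval of each of the four event types; the event-log layout and the choice of the last IC event as the phase reference are illustrative assumptions, not part of the patent.

```python
import math

def gait_cycle_and_phase(event_log, t_now):
    """Estimate gait period T_s, frequency omega_s, phase phi_s and
    gait progress percentage s from logged gait event times.

    event_log : dict mapping event name -> list of occurrence times (s),
                e.g. {"IC": [...], "TE": [...], "pTO": [...], "TO": [...]}
    t_now     : current time (s)
    """
    # One interval per event type, averaged over the k = 4 event types.
    intervals = [times[-1] - times[-2]
                 for times in event_log.values() if len(times) >= 2]
    T_s = sum(intervals) / len(intervals)
    omega_s = 2 * math.pi / T_s                  # gait frequency (rad/s)
    t_ref = event_log["IC"][-1]                  # assumed phase reference
    phi_s = (omega_s * (t_now - t_ref)) % (2 * math.pi)
    s = phi_s / (2 * math.pi) * 100.0            # gait progress percentage
    return T_s, omega_s, phi_s, s

log = {"IC": [0.25, 1.26], "TE": [0.51, 1.52],
       "pTO": [0.75, 1.76], "TO": [1.01, 2.02]}
print(gait_cycle_and_phase(log, t_now=1.80))    # ~1.01 s period
```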
2.2 Stride algorithm
In the present design, a stride is defined as the distance between the centers of mass of the two feet when the initial contact (IC) or pre-toe-off (pTO) event occurs. From this definition and the simplified lower-limb model, the stride is calculated as:
L = l_3 sin θ_3 + l_4 sin θ_4 − (l_1 sin θ_1 + l_2 sin θ_2)
where θ_1 and θ_2 denote the knee joint angles, θ_3 and θ_4 denote the hip joint angles, and l_1 to l_4 denote the corresponding segment lengths of the simplified lower-limb model.
in a complete gait cycle, there are two gait events that can be used to calculate the stride, and to improve the accuracy of the calculation result, the stride calculation formula can be expressed as follows:
Figure BDA0001762770260000043
in the formula
Figure BDA0001762770260000044
And
Figure BDA0001762770260000045
indicates hip joint rotation angle values when IC (i ═ 1) and pTO (i ═ 2) gait events occur,
Figure BDA0001762770260000046
and
Figure BDA0001762770260000047
represents the knee joint angle value when IC (i ═ 1) and pTO (i ═ 2) gait events occur.
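A short sketch of this stride computation under the simplified model follows; the segment lengths are assumed placeholder values, since the patent does not state them.

```python
import math

# Assumed segment lengths (m) of the simplified lower-limb model;
# l1, l2 pair with the knee angles, l3, l4 with the hip angles.
l1 = l2 = 0.40
l3 = l4 = 0.45

def stride(th1, th2, th3, th4):
    """Stride from one set of joint angles (rad), per the formula above."""
    return (l3 * math.sin(th3) + l4 * math.sin(th4)
            - (l1 * math.sin(th1) + l2 * math.sin(th2)))

def stride_averaged(angles_ic, angles_pto):
    """Average of the stride evaluated at the IC (i = 1) and pTO (i = 2)
    events; each argument is a tuple (theta1, theta2, theta3, theta4)."""
    return 0.5 * (stride(*angles_ic) + stride(*angles_pto))

print(stride_averaged((-0.10, 0.20, 0.35, -0.15),
                      (-0.15, 0.25, 0.30, -0.20)))
```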
(3) Joint angle trajectory adjustment algorithm
The exoskeleton reference joint tracking trajectory is generated from standard joint trajectories, specifically those of the hip and knee joints. The standard joint angle trajectory is obtained from a clinical gait analysis (CGA) database or from experimental results. To synchronize the exoskeleton robot with the lower-limb movement intention, the period and amplitude of the standard joint angle trajectory must be adjusted according to the step frequency and stride detected in real time.
3.1 Step frequency adjustment algorithm
Assume the standard hip and knee joint angle curves are f_h(s) and f_k(s) respectively, where the variable s is the gait progress percentage, a time-varying parameter defined as:

s(t) = φ_s(t) / (2π) × 100%

where φ_s(t) denotes the gait phase angle function obtained by the step frequency detection algorithm. The joint angle functions synchronized to the step frequency then follow as:

f_h(t) = f_h(s(t))
f_k(t) = f_k(s(t))
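In practice, this synchronization amounts to re-indexing the standard curve by the measured phase. A minimal sketch follows, assuming a placeholder sinusoid stands in for the CGA curve and that omega_s and the phase reference come from the event-based detector above.

```python
import numpy as np

# Standard hip trajectory tabulated over one cycle, indexed by gait
# progress percentage s in [0, 100]; placeholder sinusoid, not CGA data.
s_grid = np.linspace(0.0, 100.0, 101)
f_h_standard = 25.0 * np.sin(2 * np.pi * s_grid / 100.0)

def f_h(t, omega_s, t_ref):
    """Hip reference angle f_h(s(t)) warped to the measured step frequency."""
    phi_s = (omega_s * (t - t_ref)) % (2 * np.pi)   # gait phase angle
    s = phi_s / (2 * np.pi) * 100.0                 # gait progress percentage
    return np.interp(s, s_grid, f_h_standard)       # table lookup of f_h(s)

print(f_h(t=1.80, omega_s=2 * np.pi / 1.01, t_ref=1.26))
```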
3.2 Stride adjustment algorithm
Assume the standard gait stride is L_s and the detected stride is L_r. The stride adjustment parameter K is then defined as:

K = L_r / L_s

From the gait event detection result, the gait progress percentage s_TO at which the TO event occurs can be obtained as:

s_TO = (t_TO − t_TE) / T_s × 100%

where t_TO denotes the time at which the TO event occurs, t_TE the time at which the TE event occurs, and T_s the gait cycle. The standard trajectory function is then adjusted as follows:
f_n(s) = K · f(s) for s ∈ Θ_1,    f_n(s) = f(s) for s ∈ Θ_2
where f(s) denotes the standard joint angle function, Θ_1 ∈ (0, s_TO) is the stride stretch interval, and Θ_2 ∈ (s_TO, 100%) is the stride contraction interval. Finally, applying the step frequency adjustment algorithm yields:

f_n(t) = f_n(s(t))

where f_n(t) denotes the reference joint angle trajectory function, i.e. the joint angle trajectory curve synchronized to the human step frequency and stride.
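The fragment below sketches one plausible reading of this piecewise stride adjustment (scaling by K over the stretch interval and leaving the contraction interval unscaled); the exact scaling rule in each interval is an assumption, since the original equation image is not recoverable.

```python
def adjusted_trajectory(f_standard, s, s_to, K):
    """Piecewise stride-adjusted trajectory f_n(s) (assumed scaling rule).

    f_standard : callable, standard joint angle curve f(s)
    s          : gait progress percentage in (0, 100]
    s_to       : gait progress percentage of the TO event
    K          : stride adjustment parameter L_r / L_s
    """
    if s <= s_to:
        return K * f_standard(s)   # Theta_1: stride stretch interval
    return f_standard(s)           # Theta_2: stride contraction interval

f = lambda s: 25.0 * (s / 100.0)   # toy standard curve, illustrative only
print(adjusted_trajectory(f, s=30.0, s_to=60.0, K=1.2))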
In summary, the invention provides a human intention recognition method based on gait event detection. The method uses four inertial measurement units (IMUs) to measure the lower-limb joint angles (hip and knee) in real time; these sensors are relatively inexpensive and easy to mount, which suits the industrial development of the robot. In terms of algorithmic complexity, the designed algorithm is relatively simple and computationally light.

Claims (3)

1. A human lower-limb movement intention recognition method based on gait events, characterized by comprising the following steps:
(1) defining human lower-limb gait events;
the start of each gait phase is a gait event; the gait events comprise an initial contact event, a thigh erect event, a pre-toe-off event and a toe-off event, each of which has a corresponding mapping relation with the hip joint angle;
(2) detecting gait events in real time;
collecting real-time hip joint angle data of the human lower limbs, and detecting each gait event in each gait cycle according to the mapping relation, wherein each gait cycle is divided into four gait phases: an initial stance phase, a mid-stance phase, a terminal stance phase and a swing phase;
(3) obtaining the real-time step frequency and stride based on the gait events;
the process of obtaining the real-time step frequency comprises the following steps:
obtaining the time interval between adjacent identical gait events from the gait event detection result, thereby deducing the gait cycle T_s:

T_s = t_n − t_{n−1},    ω_s = 2π / T_s

where t_n and t_{n−1} respectively denote the times of the nth and (n−1)th occurrences of the gait event, and ω_s denotes the gait frequency;
after the gait cycle is obtained, the gait phase angle is defined as follows:

φ_s(t) = ω_s (t − t_n) mod 2π

wherein φ_s is the gait phase angle function, with values ranging from 0 to 2π; the gait progress percentage s is then defined from the phase angle as:

s = φ_s / (2π) × 100%

the process of obtaining the real-time stride comprises the following steps:
defining the stride as the distance between the centers of mass of the two feet when the initial contact event or the pre-toe-off event occurs, the stride being calculated as:

L = l_3 sin θ_3 + l_4 sin θ_4 − (l_1 sin θ_1 + l_2 sin θ_2)

where θ_1 and θ_2 denote the knee joint angles, θ_3 and θ_4 denote the hip joint angles, and l_1 to l_4 denote the corresponding segment lengths of the simplified lower-limb model;
(4) adjusting the standard joint angle trajectory according to the real-time step frequency and stride, and generating a reference joint angle trajectory synchronized in step frequency and stride;
step frequency adjustment:
assuming the standard hip and knee joint angle curves are f_h(s) and f_k(s) respectively, the joint angle functions synchronized to the step frequency are:

f_h(t) = f_h(s(t))
f_k(t) = f_k(s(t))

stride adjustment:
assuming the standard gait stride is L_s and the detected stride is L_r, the stride adjustment parameter K is defined as:

K = L_r / L_s

according to the gait event detection result, obtaining the gait progress percentage s_TO at which the toe-off event occurs:

s_TO = (t_TO − t_TE) / T_s × 100%

where t_TO denotes the time at which the toe-off event occurs and t_TE the time at which the thigh erect event occurs; the standard trajectory function is then adjusted as follows:

f_n(s) = K · f(s) for s ∈ Θ_1,    f_n(s) = f(s) for s ∈ Θ_2

where f(s) denotes the standard joint angle function, Θ_1 ∈ (0, s_TO) is the stride stretch interval and Θ_2 ∈ (s_TO, 100%) is the stride contraction interval; finally, applying the step frequency adjustment yields:

f_n(t) = f_n(s(t))

where f_n(t) denotes the reference joint angle trajectory function, i.e. the joint angle trajectory curve synchronized to the human step frequency and stride.
2. The human lower-limb movement intention recognition method based on gait events as claimed in claim 1, wherein, in order to improve the accuracy of the gait cycle calculation, all four gait events defined in a gait cycle are used to calculate the gait cycle and the results are averaged, according to the formula:

T_s = (1/k) Σ_{i=1}^{k} (t_n^i − t_{n−1}^i),    k = 4

where t_n^i and t_{n−1}^i respectively denote the times of the nth and (n−1)th occurrences of the i-th gait event.
3. The human lower-limb movement intention recognition method based on gait events as claimed in claim 1, wherein two gait events in a complete gait cycle can be used to calculate the stride and, in order to improve the accuracy of the calculation result, the stride calculation formula is expressed as:

L = (1/2) Σ_{i=1}^{2} [ l_3 sin θ_3^i + l_4 sin θ_4^i − (l_1 sin θ_1^i + l_2 sin θ_2^i) ]

where θ_3^i and θ_4^i denote the hip joint angle values and θ_1^i and θ_2^i the knee joint angle values when the initial contact (i = 1) and pre-toe-off (i = 2) events occur.
CN201810915082.0A 2018-08-13 2018-08-13 Human body lower limb movement intention identification method based on gait event Active CN108992072B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810915082.0A CN108992072B (en) 2018-08-13 2018-08-13 Human body lower limb movement intention identification method based on gait event


Publications (2)

Publication Number Publication Date
CN108992072A CN108992072A (en) 2018-12-14
CN108992072B (en) 2020-09-11

Family

ID=64595801

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810915082.0A Active CN108992072B (en) 2018-08-13 2018-08-13 Human body lower limb movement intention identification method based on gait event

Country Status (1)

Country Link
CN (1) CN108992072B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111469117B (en) * 2020-04-14 2022-06-03 武汉理工大学 Human motion mode detection method of rigid-flexible coupling active exoskeleton
CN113425290A (en) * 2021-06-15 2021-09-24 燕山大学 Joint coupling time sequence calculation method for human body rhythm movement


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104302251A (en) * 2012-03-22 2015-01-21 埃克苏仿生公司 Human machine interface for lower extremity orthotics
CN105992554A (en) * 2013-12-09 2016-10-05 哈佛大学校长及研究员协会 Assistive flexible suits, flexible suit systems, and methods for making and control thereof to assist human mobility
CN107361992A (en) * 2016-05-13 2017-11-21 深圳市肯綮科技有限公司 A kind of human body lower limbs move power assisting device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Gong Chen et al., "Gait-Event-Based Synchronization Method for Gait Rehabilitation Robots via a Bioinspired Adaptive Oscillator," IEEE Transactions on Biomedical Engineering, vol. 64, no. 6, pp. 1345-1356, June 2017. *

Also Published As

Publication number Publication date
CN108992072A (en) 2018-12-14

Similar Documents

Publication Publication Date Title
Schmitz et al. The measurement of in vivo joint angles during a squat using a single camera markerless motion capture system as compared to a marker based system
Zheng et al. Gait phase estimation based on noncontact capacitive sensing and adaptive oscillators
Joshi et al. Classification of gait phases from lower limb EMG: Application to exoskeleton orthosis
Ivanenko et al. Temporal components of the motor patterns expressed by the human spinal cord reflect foot kinematics
EP2796123B1 (en) Movement assistance device
Cerveri et al. Robust recovery of human motion from video using Kalman filters and virtual humans
Sadeghi et al. Reduction of gait data variability using curve registration
CN108992072B (en) Human body lower limb movement intention identification method based on gait event
CN110801226A (en) Human knee joint moment testing system method based on surface electromyographic signals and application
do Carmo et al. Alteration in the center of mass trajectory of patients after stroke
Yi et al. Continuous prediction of lower-limb kinematics from multi-modal biomedical signals
CN111096830B (en) Exoskeleton gait prediction method based on LightGBM
Mendoza-Crespo et al. An adaptable human-like gait pattern generator derived from a lower limb exoskeleton
Glackin et al. Gait trajectory prediction using Gaussian process ensembles
Chen et al. Adaptive control strategy for gait rehabilitation robot to assist-when-needed
Mallikarjuna et al. Feedback-based gait identification using deep neural network classification
Al-Quraishi et al. Impact of feature extraction techniques on classification accuracy for EMG based ankle joint movements
Xiong et al. Continuous human gait tracking using sEMG signals
CN116999034B (en) Evaluation system and evaluation method
Lu et al. Gait-Event-Based human intention recognition approach for lower limb
Cao et al. Research on human sports rehabilitation design based on object-oriented technology
Skaro et al. Knee angles after crosstalk correction with principal component analysis in gait and cycling
Yang et al. Biomechanics analysis of human walking with load carriage
Meng et al. Effect of walking variations on complementary filter based inertial data fusion for ankle angle measurement
Caby et al. Multi-modal movement reconstruction for stroke rehabilitation and performance assessment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant