CN113143256B - Gait feature extraction method, lower limb evaluation and control method, device and medium - Google Patents


Info

Publication number
CN113143256B
Authority
CN
China
Prior art keywords
lower limb
calculating
monitored object
inertial sensor
pelvis
Prior art date
Legal status
Active
Application number
CN202110117565.8A
Other languages
Chinese (zh)
Other versions
CN113143256A (en)
Inventor
夏林清
李福生
范渊杰
Current Assignee
Shanghai Electric Group Corp
Original Assignee
Shanghai Electric Group Corp
Priority date
Filing date
Publication date
Application filed by Shanghai Electric Group Corp filed Critical Shanghai Electric Group Corp
Priority to CN202110117565.8A
Publication of CN113143256A
Application granted
Publication of CN113143256B
Status: Active


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/112 Gait analysis
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique

Abstract

The invention discloses a gait feature extraction method, a lower limb evaluation and control method, a device and a medium. The gait feature extraction method comprises the following steps: collecting lower limb movement parameters of a monitored object through an inertial sensor; acquiring the joint angles of the monitored object according to the lower limb movement parameters; and calculating the pelvis height according to the joint angles. According to the invention, the lower limb movement parameters are obtained from inertial sensors worn by the user, the joint angles are calculated from the lower limb movement parameters, and important gait feature data including the step frequency, stride and pelvis height are further calculated from the joint angles. On the one hand this facilitates embedded development, saves development cost and improves development efficiency; on the other hand it improves calculation accuracy and facilitates further processing of the gait feature data.

Description

Gait feature extraction method, lower limb evaluation and control method, device and medium
Technical Field
The invention relates to the technical field of computer technology and application, in particular to a gait feature extraction method, a lower limb evaluation and control method, equipment and a medium.
Background
Worldwide, population aging has become a significant problem that cannot be ignored. As population aging deepens, the incidence of diseases among the elderly keeps rising. Taking stroke, the most serious of these, as an example, statistics show that by 2019 the stroke mortality rate in China had reached 140.3 per 100,000. Even when patients are treated in time, about 75% of stroke patients are still left with sequelae of varying degrees, which greatly reduce their ability to care for themselves and seriously affect the quality of life of patients and their families. Among these sequelae, hemiplegia has the highest incidence. Clinical practice shows that scientific exercise rehabilitation training, combined with surgical and drug treatment, can significantly increase the probability of limb function recovery in stroke patients with hemiplegia: timely and repeated rehabilitation exercise training helps repair the nervous system damaged during the stroke and strengthens the musculoskeletal and other motor systems, which promotes the recovery of motor function in the affected limbs.
With the development of chip technology, collaborative robots have likewise made great strides in miniaturization and intelligence, and rehabilitation robots, with their flexibility, well-developed rehabilitation modes, high interactivity and engagement, are gradually replacing traditional rehabilitation training led by a rehabilitation therapist. With the introduction of robotics into the rehabilitation training process, the evaluation of the monitored subject's state and of rehabilitation progress is gradually shifting from the therapist's direct subjective judgment to a more automatic and intelligent mode based on data analysis. In lower limb rehabilitation training, gait data of the lower limbs must be acquired in order to evaluate the rehabilitation state of the monitored subject's lower limbs.
How to acquire human lower limb gait data is a popular research topic in the fields of rehabilitation science and robotics. In terms of basic technical principle, existing approaches fall mainly into two categories: image-based gait recognition using optical sensors, and motion-parameter-based gait recognition using inertial sensors. In 2017 and 2019, Harbin Engineering University disclosed two patents for dividing the human lower limb gait cycle based on video images, namely a multi-angle gait cycle detection method and a convolutional-neural-network-based gait cycle detection method. Both divide the video into frames, perform image preprocessing consisting of pedestrian contour extraction and centroid normalization, and then either compute feature values of the normalized lower limb gait images or feed them into a pre-trained convolutional neural network, so that the human lower limb gait cycle can be segmented. The Shenzhen Institute of Advanced Technology of the Chinese Academy of Sciences disclosed a gait feature extraction and gait recognition method based on an inertial sensor, in which the inertial sensor is placed at the middle of the left and right lower legs to capture gait information, from which the corresponding acceleration and angular velocity gait features are extracted.
Existing human lower limb gait feature extraction algorithms mainly have the following problems. On the one hand, lower limb gait feature extraction based on optical sensors requires expensive sensors, produces large volumes of data in use, and places high computing-power demands on the control system; moreover, during lower limb rehabilitation training the rehabilitation robot partially occludes the human lower limbs, which affects calculation accuracy. On the other hand, inertial-sensor-based algorithms merely process the acceleration, angular velocity and other parameters output by the sensor, or simply segment the collected data, so only simple gait features can be obtained and the accuracy of the extracted data is limited.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, namely the low accuracy, high cost and limited set of extractable parameters of existing gait feature extraction, and provides a gait feature extraction method, a lower limb evaluation and control method, a device and a medium that are low-cost and highly accurate and can obtain multiple important parameters.
The invention solves the technical problems by the following technical scheme:
The invention provides a gait feature extraction method based on an inertial sensor, which comprises the following steps:
collecting lower limb movement parameters of a monitored object through the inertial sensor;
acquiring the joint angle of the monitored object according to the lower limb movement parameters;
the pelvic height is calculated from the joint angle.
Preferably, the lower limb movement parameters comprise joint angular velocity and/or joint acceleration, and the joint angle comprises a hip joint angle;
the step of calculating the pelvic height from the joint angle includes:
and inputting the lower limb movement parameters, the joint angles, the gait features and object characteristic parameters into a pelvis height prediction model to obtain a predicted pelvis height, wherein the pelvis height prediction model is a neural network model trained by taking a plurality of pre-acquired standard lower limb movement parameters, corresponding standard joint angles, standard object characteristic parameters and standard gait features as inputs and the corresponding pelvis height as the output.
Preferably, the object characteristic parameters comprise the ratio of the pelvis height of the corresponding object to the object's height, and/or the object's height, and the gait features further comprise at least one of step frequency, stride, pace and gait cycle.
Preferably, a Vicon motion capture system (a human motion capture system) is used to acquire the plurality of standard lower limb motion parameters and corresponding standard joint angles in advance.
Preferably, the gait feature extraction method further comprises: and calculating the step frequency according to the joint angle.
Preferably, the lower limb movement parameter includes a joint angular velocity, the joint angle includes a hip joint angle, and the step of calculating the step frequency according to the joint angle includes:
calculating a hip joint angular velocity average value according to the acquired plurality of hip joint angular velocities in the preset time range;
calculating an angular velocity triggering threshold according to the average value of the hip joint angular velocity and a preset coefficient;
when the acquired angular velocities of the two hip joints trigger the angular velocity triggering threshold value and the change rate of the angular velocities of the two hip joints is positive or negative at the same time, taking the corresponding time point as a time intercept point and taking the time interval of the adjacent time intercept points as a gait cycle;
and calculating the step frequency according to the gait cycle.
Preferably, the step of calculating the angular velocity trigger threshold according to the average value of the hip joint angular velocity and a preset coefficient includes:
And calculating an angular velocity triggering threshold according to the average value of the angular velocity of the hip joint, a preset coefficient and a preset delay time.
Preferably, the gait feature extraction method further comprises: and calculating the stride according to the joint angle.
Preferably, the joint angle includes a hip joint angle and a knee joint angle, and the step of calculating the stride according to the joint angle includes:
calculating thigh projection of the thigh of the monitored object on the ground according to the hip joint angle;
calculating the lower leg projection of the lower leg of the monitored object on the ground according to the knee joint angle;
and calculating the stride according to the thigh projection and the shank projection.
Preferably, when the moving lower limb is located on the front side of the monitored subject's body, the step of calculating the stride from the thigh projection and the calf projection comprises: calculating the stride from the difference of the thigh projection and the calf projection; when the moving lower limb is located on the rear side of the monitored subject's body, the step of calculating the stride from the thigh projection and the calf projection comprises: calculating the stride from the sum of the thigh projection and the calf projection.
The invention also provides an evaluation method of the lower limb state, which comprises the following steps:
acquiring monitoring data of a monitored object, wherein the monitoring data comprise at least one of lower limb movement parameters, joint angles, pelvis height, step frequency and stride, the monitoring data being acquired according to the inertial-sensor-based gait feature extraction method described above;
judging whether the obtained monitoring data fall within a preset range of the standard data; if so, confirming that the monitored object is in a rehabilitated state, and if not, confirming that the monitored object is in a non-rehabilitated state.
The invention also provides a control method of the lower limb rehabilitation robot, which comprises the following steps:
acquiring gait features of a user, wherein the gait features are acquired according to the gait feature extraction method based on the inertial sensor;
and controlling the degrees of freedom of the lower limb rehabilitation robot in different directions according to the gait characteristics.
Preferably, the gait feature comprises pace and pelvic height;
the step of controlling the degrees of freedom of the lower limb rehabilitation robot in different directions according to the gait characteristics comprises the following steps:
controlling the degree of freedom of the lower limb rehabilitation robot in the horizontal direction according to the pace speed;
and controlling the freedom degree of the lower limb rehabilitation robot in the vertical direction according to the pelvis height.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method as described above when executing the computer program.
The invention also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the steps of the method as described above.
The invention has the following positive effects. The lower limb movement parameters are obtained from inertial sensors worn by the user, the joint angles are calculated from the lower limb movement parameters, and important gait feature data including the step frequency, stride and pelvis height are further calculated from the joint angles; on the one hand this facilitates embedded development, saves development cost and improves development efficiency, and on the other hand it improves calculation accuracy and facilitates further processing of the gait feature data.
The invention can track and evaluate the rehabilitation state of the monitored object using the gait features calculated from the joint angles, and can accurately evaluate that state without increasing the complexity of the rehabilitation robot system.
The invention can use the calculated gait features as input signals for controlling the movement of the rehabilitation robot: the degree of freedom of movement in the vertical direction is controlled through the pelvis height, and the degree of freedom of movement in the horizontal direction through the stride, pace and the like. This reduces the use of encoders and torque sensors in the rehabilitation robot system, lowers the complexity of the robot control system and improves the robustness of the control system.
Drawings
Fig. 1 is a flowchart of a gait feature extraction method in embodiment 1 of the invention.
Fig. 2 is a schematic diagram of the positional relationship of the coordinate system defined in example 1.
Fig. 3 is a schematic view of the joint angle of the lower limb coordinate system in example 1.
Fig. 4 is a schematic structural diagram of a BP neural network in a specific scenario in embodiment 1.
Fig. 5 is a flowchart of a method for calculating a step frequency according to the joint angle in embodiment 1.
Fig. 6 is a schematic view of the real-time hip joint angle of the lower limb intercepted through a sliding rectangular window in a specific scenario in example 1.
Fig. 7 is a schematic view showing the change of angular velocity in the specific scene in example 1.
Fig. 8 is a flowchart of the method for calculating the stride according to the joint angle in embodiment 1.
Fig. 9 is a schematic diagram of a lower limb model of the subject in example 1 while walking.
Fig. 10 is a schematic diagram showing the comparison of the hip joint angle obtained in example 1 with a standard hip joint angle.
Fig. 11 is a schematic diagram showing the comparison of the knee angle obtained in example 1 with a standard knee angle.
Fig. 12 is a schematic diagram showing the comparison of the pelvis height obtained in example 1 with a standard pelvis height.
Fig. 13 is a graph showing the comparison of the stride obtained in example 1 with a standard stride.
Fig. 14 is a flowchart of a method for evaluating the lower limb state in embodiment 2 of the present invention.
Fig. 15 is a flowchart of a control method of the lower limb rehabilitation robot in embodiment 3 of the present invention.
Fig. 16 is a schematic diagram of the positional relationship between the lower limb rehabilitation robot and the monitoring object in a specific scenario in embodiment 3.
Fig. 17 is a flow chart showing overall data processing in embodiment 3.
Fig. 18 is a schematic block diagram of an electronic device in embodiment 4 of the present invention.
Detailed Description
The invention is further illustrated by means of the following examples, which are not intended to limit the scope of the invention.
Example 1
The present embodiment provides a gait feature extraction method based on inertial sensors, wherein the inertial sensors are worn on the user's lower legs, thighs and pelvis. As shown in fig. 1, the gait feature extraction method of this embodiment comprises:
Step 101, collecting lower limb movement parameters of a monitored object through the inertial sensors.
Step 102, acquiring the joint angle of the monitored object according to the lower limb movement parameters.
Step 103, calculating the pelvis height according to the joint angle.
In this embodiment, joint data including the joint angles are obtained from the inertial sensors worn on the user, and gait feature data including the pelvis height are further calculated from these joint data. On the one hand this facilitates embedded development, saves development cost and improves development efficiency; on the other hand it improves calculation accuracy and facilitates subsequent further processing of the gait feature data.
An inertial sensor, or inertial measurement unit (IMU), is a device that measures an object's three-axis attitude angles (or angular rates) and accelerations. In general, an IMU consists of a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer. It can measure the angular velocity and acceleration of an object in each direction of three-dimensional space, and the object's quaternion can be computed from these measurements, so that the pose of the object in three-dimensional space can be determined.
In order to better understand the present embodiment, a method for acquiring a joint angle of a lower limb on a side of a monitored subject will be briefly described by way of a specific example:
As shown in fig. 2, a global coordinate system may be defined according to the right-hand rule, with the X-axis pointing laterally, the Y-axis pointing forward and the Z-axis pointing vertically. The first IMU coordinate system, i.e. the hip (pelvis) coordinate system, is taken as the global reference frame. Let the second IMU coordinate system, i.e. the initial thigh reference frame, and the third IMU coordinate system, i.e. the initial calf reference frame, be related to the global frame by the rotation matrices ${}^{0}R_{G}^{H}$ and ${}^{0}R_{G}^{F}$, where $R$ denotes a rotation matrix, the superscript $0$ denotes the initial position, and the subscripts $G$, $H$ and $F$ denote the global (pelvis), thigh and calf reference frames respectively. A rotation matrix $R_{M}^{N}$ represents the pose of reference frame $N$ with respect to reference frame $M$; its column vectors are the unit vectors of the coordinate axes of frame $N$ expressed in frame $M$.
The orientations of the thigh coordinate system with respect to the coordinate system of IMU2 and of the calf coordinate system with respect to the coordinate system of IMU3 are expressed as the constant rotation matrices $R_{U}^{H}$ and $R_{L}^{F}$, where the subscript $U$ denotes the coordinate system of IMU2 worn on the thigh, the subscript $L$ denotes the coordinate system of IMU3 worn on the calf, and IMU1 is the sensor worn on the pelvis.
When the subject moves the lower limb to a new posture, the transformation of the thigh and the calf with respect to the pelvis coordinate system in the global frame can be described by the rotation matrices ${}^{n}R_{G}^{H}$ and ${}^{n}R_{G}^{F}$, where the superscript $n$ denotes the new posture of the monitored subject.
Each IMU outputs a quaternion $q = (q_0, q_1, q_2, q_3)$, where $(q_1, q_2, q_3)$ is the vector part and $q_0$ is the scalar part.
The pose of the thigh in the global coordinate system can be expressed from the quaternion as:

$$R = \begin{pmatrix} 1-2(q_2^2+q_3^2) & 2(q_1q_2-q_0q_3) & 2(q_1q_3+q_0q_2) \\ 2(q_1q_2+q_0q_3) & 1-2(q_1^2+q_3^2) & 2(q_2q_3-q_0q_1) \\ 2(q_1q_3-q_0q_2) & 2(q_2q_3+q_0q_1) & 1-2(q_1^2+q_2^2) \end{pmatrix}$$

The joint angle diagram of the lower limb coordinate system is shown in fig. 3. Consider the three Euler angles between the calf coordinate system and the thigh coordinate system, denoted $\alpha$, $\beta$ and $\gamma$ for the rotations about the $x$, $y$ and $z$ axes, i.e. the swivel angle, pitch angle and yaw angle of the target reference frame.
Initially the thigh coordinate system coincides with the pelvis coordinate system; it is then rotated about the $x$ axis by $\alpha$ radians, then about the $y$ axis by $\beta$ radians, and finally about the $z$ axis by $\gamma$ radians, the rotations being taken about the fixed pelvis axes. The thigh coordinate system rotation matrix can therefore be written as:

$$R = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha) = \begin{pmatrix} c_\gamma c_\beta & c_\gamma s_\beta s_\alpha - s_\gamma c_\alpha & c_\gamma s_\beta c_\alpha + s_\gamma s_\alpha \\ s_\gamma c_\beta & s_\gamma s_\beta s_\alpha + c_\gamma c_\alpha & s_\gamma s_\beta c_\alpha - c_\gamma s_\alpha \\ -s_\beta & c_\beta s_\alpha & c_\beta c_\alpha \end{pmatrix}$$

where $c_\alpha$ is shorthand for $\cos\alpha$, $s_\alpha$ for $\sin\alpha$, and likewise for $\beta$ and $\gamma$.
From the quaternions output by IMU1 and IMU2, the relative movement angle of the monitored subject's thigh with respect to the pelvis can be calculated. Equating the two expressions for the rotation matrix and solving gives the relative motion angles in the three directions:

$$\alpha = \operatorname{atan2}(r_{32}, r_{33}), \quad \beta = -\arcsin(r_{31}), \quad \gamma = \operatorname{atan2}(r_{21}, r_{11})$$

where $r_{ij}$ is the element in row $i$, column $j$ of the rotation matrix, and $\alpha$, $\beta$ and $\gamma$ are the swivel angle, pitch angle and yaw angle of the hip joint respectively. Similarly, the swivel, pitch and yaw angles of the knee joint of the monitored object can be calculated from the quaternion outputs of IMU1 and IMU3.
Similarly, the limb movement parameters of the lower limb on the other side of the monitored object can be collected by the IMUs located at the pelvis and at the corresponding thigh and calf, so as to obtain the hip joint angle and knee joint angle of that lower limb. In this way, the knee and hip joint angles of both lower limbs of the subject can be determined, and the posture of the subject can be obtained from the determined joint angles.
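For illustration only, the following is a minimal Python sketch of the joint-angle computation described above. It assumes each IMU outputs a unit quaternion with the scalar part first and uses the fixed-axis x-y-z (swivel, pitch, yaw) convention adopted above; the function names are illustrative and not part of the embodiment.

```python
import numpy as np

def quat_to_rot(q):
    """Unit quaternion (q0 scalar, q1..q3 vector) to a 3x3 rotation matrix."""
    q0, q1, q2, q3 = q
    return np.array([
        [1 - 2*(q2**2 + q3**2), 2*(q1*q2 - q0*q3),     2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),     1 - 2*(q1**2 + q3**2), 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),     2*(q2*q3 + q0*q1),     1 - 2*(q1**2 + q2**2)],
    ])

def hip_angles(q_pelvis, q_thigh):
    """Thigh pose relative to the pelvis, decomposed into swivel/pitch/yaw (radians)."""
    R_gp = quat_to_rot(q_pelvis)   # pelvis (IMU1) pose in the global frame
    R_gt = quat_to_rot(q_thigh)    # thigh (IMU2) pose in the global frame
    R = R_gp.T @ R_gt              # thigh pose expressed in the pelvis frame
    alpha = np.arctan2(R[2, 1], R[2, 2])              # swivel, about x
    beta = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))    # pitch, about y
    gamma = np.arctan2(R[1, 0], R[0, 0])              # yaw, about z
    return alpha, beta, gamma
```

The knee angles follow the same pattern using the quaternions of IMU1 and IMU3.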
In this embodiment, after the joint angles of the monitored object are obtained in step 102, the lower limb features can be predicted in step 103. The pelvis height serves as a specific lower limb feature: it can be obtained through a pre-trained pelvis height prediction model, which may be obtained as follows.
the neural network is widely applied in pattern recognition, and has self-learning and self-adapting capabilities and the capability of searching an optimal solution at a high speed. The BP neural network is a multi-layer feedforward network, and is characterized by that its structure includes input layer, hidden layer and output layer, so that said embodiment adopts BP neural network as classifier to make model training, and uses the extracted characteristic values of joint angle and angular velocity of two sides of human lower limb, human pelvis height, height-occupying height proportion of human pelvis and gait cycle length as model input, and uses pelvis height as output, according to empirical mode, the model training method is implemented Training a model, wherein n is an implicit layer, m is an input layer number, k is an output layer number, and a is an adjustment constant between 1 and 10. Specifically, as shown in fig. 4, in a specific scenario, the number of layers of the Input layer is 3, the number of layers of the Hidden layer Hidden is 4, the number of layers of the Output layer Output is 1, and the number of Hidden layer nodes is selected to be 10.
Before training the neural network, the feature values of 3000 groups of standard movements of healthy people are collected as the sample space, of which 70% are used as training samples, 15% as validation samples and 15% as test samples. Through testing and tuning, the network target error is set to 0.001, the number of training iterations to 20, and the learning rate to 0.3.
The pelvis height prediction model is a data prediction model built on the gait cycle. First, a standard gait data acquisition system, such as a Vicon motion capture system, is used to collect the lower limb motion parameters, joint angles and gait features of normal people while walking. The lower limb motion parameters include the angular velocity and acceleration of the hip joint; the gait features include the pelvis height, gait cycle, angular velocity, ratio of the pelvis height to the body height and other features that reflect walking posture. The parameters other than the pelvis height are used as input data of the BP neural network model, and the pelvis height is used as the output data for training the BP neural network model.
To ensure the robustness of the model and accelerate the gradient descent during model training, the training data of the neural network need to be normalized: the hip joint angle, angular velocity and acceleration are mapped to the interval [0, 1] by

$$\hat{q} = \frac{q - q_{\min}}{q_{\max} - q_{\min}}$$

where $\hat{q}$ is the datum after mapping, $q$ the datum before mapping, $q_{\max}$ the maximum of the data before mapping and $q_{\min}$ the minimum of the data before mapping. With the training data in this form, gait cycle intervals are divided at equal intervals according to a peak-finding algorithm and the data sampling rate, and labels are assigned according to the position of each datum within the gait cycle.
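A minimal sketch of this min-max normalization, assuming NumPy arrays; applying it column-wise handles multi-channel data such as angle, angular velocity and acceleration:

```python
import numpy as np

def min_max_normalize(q):
    """Map raw samples to the [0, 1] interval, as in the formula above."""
    q = np.asarray(q, dtype=float)
    q_min, q_max = q.min(), q.max()
    return (q - q_min) / (q_max - q_min)

# e.g. normalize each feature channel of a (samples x features) matrix X:
# X_norm = np.apply_along_axis(min_max_normalize, 0, X)
```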
In this embodiment, to account for changes in the pelvis height output caused by body height and walking speed, parameters reflecting individual human characteristics can additionally be collected as object characteristic data, such as a person's height and walking speed. In one specific scenario, 10 subjects were recruited, with heights ranging from 1.56 m to 1.86 m; the data collection environment was a Vicon motion capture system, the test speeds were 0.3 m/s, 0.5 m/s and 0.7 m/s, and the total size of the model library was 30, which was used to further train the neural network model.
After model training is completed, a pelvis height prediction model is obtained, and the current lower limb movement parameters, joint angles, gait characteristics and object characteristic parameters are input into the pelvis height prediction model to obtain the predicted pelvis height.
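The embodiment does not name a training framework; as a hedged sketch, the setup above (70/15/15 split, target error 0.001, 20 training iterations, learning rate 0.3, hidden size from $n = \sqrt{m+k} + a$) could be reproduced with scikit-learn's MLPRegressor standing in for the BP network. A single hidden layer is used here for simplicity; all names and the default value of a are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def hidden_nodes(m, k, a=7):
    """Empirical sizing rule n = sqrt(m + k) + a, a in [1, 10] (a=7 is an assumed
    choice that yields the 10 hidden nodes of the scenario when m + k = 9)."""
    return int(round(np.sqrt(m + k) + a))

def train_pelvis_model(X, y):
    """X: rows of normalized inputs (joint angles, angular velocities,
    pelvis/height ratio, gait cycle, ...); y: normalized pelvis height."""
    X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, train_size=0.70)
    X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.50)
    model = MLPRegressor(
        hidden_layer_sizes=(hidden_nodes(X.shape[1], 1),),
        solver="sgd",
        learning_rate_init=0.3,   # learning rate from the embodiment
        max_iter=20,              # training iterations from the embodiment
        tol=1e-3,                 # target error from the embodiment
    )
    model.fit(X_train, y_train)
    print("validation R^2:", model.score(X_val, y_val))
    return model, (X_test, y_test)
```

At prediction time, the current normalized lower limb movement parameters, joint angles, gait features and object characteristic parameters are assembled into one input row and passed to model.predict.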
In this embodiment, the pelvis height of the monitored object can be predicted in real time by the pelvis height prediction model from the joint angles obtained by the inertial sensors. On the one hand this improves the accuracy of the pelvis height prediction and allows the motion posture of the monitored object to be obtained more accurately; on the other hand it reduces the use of encoders and torque sensors in the rehabilitation robot system and lowers the complexity of the robot control system.
In this embodiment, the degree of freedom of the lower limb rehabilitation robot in the vertical direction may also be controlled according to the predicted pelvis height, so as to achieve the purpose of automatically and intelligently training the lower limb of the monitored subject through the lower limb rehabilitation robot.
The step frequency, the stride, and other gait features calculated from them, such as the pace, may in this embodiment be obtained by existing calculation methods, or may be calculated from the joint angles obtained in step 102. When the latter approach is used, the specific steps are as follows.
The method for calculating the step frequency from the joint angle is shown in fig. 5 and comprises:
step 201, calculating a hip joint angular velocity average value according to a plurality of obtained hip joint angular velocities within a preset time range.
Specifically, after the joint angles of the monitored object's lower limb movement are obtained in step 102, the angular velocities of the bilateral joint movements can be calculated by a difference method: the hip joint angle calculated at the previous time is subtracted from the hip joint angle calculated at the current time, and the difference is divided by the time difference between the two calculations. The average hip joint angular velocity is then obtained by dividing the sum of the hip joint angular velocities within the preset time range by the number of angular velocity calculations in that range.
And 202, calculating an angular velocity trigger threshold according to the average value of the hip joint angular velocity and a preset coefficient.
To further prevent noise from interfering with the gait cycle calculation described below, step 202 may further comprise calculating the angular velocity trigger threshold from the average hip joint angular velocity, the preset coefficient and a preset delay time.
And 203, when the acquired angular velocities of the two hip joints trigger an angular velocity triggering threshold value and the change rates of the angular velocities of the two hip joints are positive or negative at the same time, taking the corresponding time point as a time intercept point and taking the time interval of the adjacent time intercept points as a gait cycle.
Step 204, calculating the step frequency according to the gait cycle.
For better understanding of the principle of step frequency calculation in this embodiment, the step frequency calculation is further described below by way of an example in a specific scenario:
As shown in fig. 6, in this scenario a sliding rectangular window of size 250 (i.e. the window contains 250 data points) is used to intercept the current real-time hip joint angle of the lower limb. The average hip joint angular velocity Aver_Hip_Angle within the current window is calculated, and an angular velocity trigger threshold Thre_Val is set as Thre_Val = 1.5 × Aver_Hip_Angle, where 1.5 is the preset coefficient in this scenario; the preset coefficient may be set according to actual requirements.
It is then judged whether the current hip joint angular velocity simultaneously satisfies the trigger conditions: the angular velocity at the current time and the angular velocity delayed by the preset delay time both trigger the threshold Thre_Val, and the rates of change of the two angular velocities are simultaneously positive.
Here a delay of 3 is used; in actual operation, different delay times can be preset for different situations.
In another case, the rates of change of the two angular velocities may instead be required to be simultaneously negative.
Fig. 7 shows the change of the angular velocity in this specific scenario, where a1 and a2 each mark a point at which the angular velocity triggers the threshold with a positive rate of change. The time T between a1 and a2 is the gait cycle of one movement, and the step frequency is Stride_Fre = 1/T.
It should be understood that the above scenario is only illustrated by taking the lower limb on one side of the monitored object as an example, and the principle of the method for calculating the step frequency of the lower limb on the other side of the monitored object is the same as that of the above method, and will not be described herein.
In this embodiment, the gait cycle can be calculated from the hip joint angle, and the step frequency of the corresponding side of the monitored object can be calculated from the gait cycle, so that an accurate step frequency is obtained. On this basis the rehabilitation condition of the monitored object can be further evaluated, or the lower limb on the corresponding side of the monitored object can be further trained.
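As a hedged sketch of steps 201 to 204 and the scenario above, the following Python routine differentiates the hip angle, averages the angular velocity over a sliding window, forms the trigger threshold, and treats the interval between successive trigger points as the gait cycle. Uniform sampling at fs Hz and the exact crossing logic are assumptions; the names are illustrative.

```python
import numpy as np

def step_frequency(hip_angle, fs, window=250, coeff=1.5, delay=3):
    """Estimate (gait cycle T, step frequency 1/T) from a hip-angle series."""
    omega = np.diff(hip_angle) * fs                    # angular velocity by differences
    hits, cycles = [], []
    for i in range(window + delay + 1, len(omega)):
        aver = np.mean(np.abs(omega[i - window:i]))    # windowed average velocity
        thre = coeff * aver                            # trigger threshold
        crossed = omega[i] > thre and omega[i - 1] <= thre   # upward threshold crossing
        still_low = omega[i - delay] <= thre                 # delay guard against noise
        rising = omega[i] > omega[i - 1]                     # positive rate of change
        if crossed and still_low and rising:
            if hits:
                cycles.append((i - hits[-1]) / fs)     # gait cycle T in seconds
            hits.append(i)
    if not cycles:
        return float("nan"), float("nan")
    T = float(np.mean(cycles))
    return T, 1.0 / T                                  # Stride_Fre = 1 / T
```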
Further, as shown in fig. 8, a method for calculating a stride specifically according to a joint angle includes:
step 301, calculating thigh projection of the thigh of the monitored object on the ground according to the hip joint angle;
step 302, calculating the lower leg projection of the lower leg of the monitored object on the ground according to the knee joint angle;
step 303, calculating the stride according to the thigh projection and the shank projection.
In step 303, when the moving lower limb is located on the front side of the monitored subject's body, the stride of the corresponding lower limb is calculated from the difference between the thigh projection and the calf projection; when the moving lower limb is located on the rear side of the monitored subject's body, the stride of the corresponding lower limb is calculated from the sum of the thigh projection and the calf projection.
The stride calculation principle in the present embodiment is described below by way of a specific example:
Fig. 9 shows the lower limb model of a subject while walking and the triangle relations applied to it. For the lower limb moving on the rear side of the monitored subject's body, the projection on the ground between the rear heel and the pelvis can be calculated from the thigh length thigh, the calf length shank, the rear-side hip joint angle hip_l and the rear-side knee joint angle knee_l as Stride_L = abs(thigh × sin(hip_l)) + shank × sin(knee_l − hip_l).
For the lower limb moving on the front side of the monitored subject's body, the projection on the ground between the front heel and the pelvis can be calculated from the thigh length thigh, the calf length shank, the front-side hip joint angle hip_r and the front-side knee joint angle knee_r as Stride_R = abs(thigh × sin(hip_r)) − shank × sin(knee_r − hip_r).
The sum of the rear-foot projection and the front-foot projection is the stride of the monitored subject when walking, i.e. Stride = Stride_R + Stride_L.
In this embodiment, the projections of the front-side foot and the rear-side foot can be calculated from the knee and hip joint angles, and the overall stride of the monitored object can be calculated from the two, so that an accurate stride is obtained. On this basis the rehabilitation condition of the monitored object can be further evaluated, or the lower limb on the corresponding side of the monitored object can be further trained.
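A minimal sketch of steps 301 to 303 under the formulas above; segment lengths in metres and joint angles in radians are assumed, with _l denoting the rear-side and _r the front-side lower limb:

```python
import math

def stride_length(thigh, shank, hip_l, knee_l, hip_r, knee_r):
    """Stride from the ground projections of the thigh and calf segments."""
    # rear limb: heel-to-pelvis projection (sum of the two segment projections)
    stride_l = abs(thigh * math.sin(hip_l)) + shank * math.sin(knee_l - hip_l)
    # front limb: heel-to-pelvis projection (difference of the two projections)
    stride_r = abs(thigh * math.sin(hip_r)) - shank * math.sin(knee_r - hip_r)
    return stride_l + stride_r   # Stride = Stride_R + Stride_L
```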
The inertial-sensor-based gait feature extraction method of this embodiment can obtain near-standard gait features: accurate hip joint angles and knee joint angles can be obtained, and from these joint angles accurate pelvis heights and strides. Figs. 10, 11, 12 and 13 compare, for a specific scene, the data acquired by the standard Vicon motion capture system with the data extracted by the gait feature extraction method of this embodiment. The abscissa of each of the four figures represents time; the ordinate of fig. 10 represents the hip joint angle, that of fig. 11 the knee joint angle, that of fig. 12 the pelvis height and that of fig. 13 the stride. Curves R1, R2, R3 and R4 respectively show the hip joint angle, knee joint angle, pelvis height and stride extracted by the gait feature extraction method of this embodiment, and curves T1, T2, T3 and T4 respectively show the corresponding hip joint angle, knee joint angle, pelvis height and stride acquired by the standard Vicon motion capture system. As the figures show, the extracted curves closely track the standard curves, so the hip joint angle, knee joint angle, pelvis height and stride are all obtained within an acceptable error of the standard values.
The wearable inertial sensors adopted in this embodiment have the advantages of small size, light weight, low cost, good privacy protection and high portability. Since each comprises a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer, the joint angles, angular velocities, movement intention, hip joint position and velocity of the patient during rehabilitation training can be collected or derived. On the one hand, this allows the patient's rehabilitation state to be tracked and evaluated; on the other hand, these quantities can serve as input signals of the robot control system, which reduces the use of encoders and torque sensors in the rehabilitation robot system, lowers the complexity of the robot control system and improves the robustness of the control system.
In this embodiment, the lower limb rehabilitation state of the monitored object may be evaluated based on features such as the joint angle, step frequency and stride, and lower limb rehabilitation training may further be performed on the monitored object based on these data.
Example 2
The present embodiment provides a method for evaluating the lower limb state, based on embodiment 1. As shown in fig. 14, the method comprises:
step 401, obtaining monitoring data of a monitored object.
The monitoring data include the lower limb movement parameters, the joint angles and the gait features. The lower limb movement parameters and joint angles are defined as in embodiment 1 and are not repeated here. The gait features, including the pelvis height, step frequency and stride, are obtained by the inertial-sensor-based gait feature extraction method of embodiment 1.
Step 402, judging whether the obtained monitoring data fall within a preset range of the standard data; if yes, step 403 is executed, and if not, step 404 is executed.
Step 403, confirming that the monitored object is in a rehabilitated state.
Step 404, confirming that the monitored object is in a non-rehabilitated state.
In this embodiment, the joint angles can be calculated from the lower limb motion parameters collected by the inertial sensors, and the rehabilitation state of the monitored object can be tracked and evaluated using the gait features calculated from the joint angles. Accurate evaluation of the rehabilitation state of the monitored object is thus achieved without increasing the complexity of the rehabilitation robot system.
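A minimal sketch of steps 401 to 404; the feature names and the ±10% preset range around each standard value are illustrative assumptions, since the embodiment leaves the preset range unspecified:

```python
def evaluate_lower_limb(monitoring, standard, tolerance=0.10):
    """Compare monitored gait data against standard data, feature by feature."""
    for key, ref in standard.items():   # e.g. "pelvis_height", "step_frequency", "stride"
        lo, hi = ref * (1 - tolerance), ref * (1 + tolerance)
        value = monitoring.get(key)
        if value is None or not (lo <= value <= hi):
            return "non-rehabilitated"  # step 404
    return "rehabilitated"              # step 403
```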
Example 3
The present embodiment provides a control method of a lower limb rehabilitation robot, where the present embodiment is based on embodiment 1 or embodiment 2, as shown in fig. 15, and the control method includes:
step 501, acquiring gait characteristics of a user.
The gait features include the pelvis height, pace, step frequency and stride, and are obtained by the inertial-sensor-based gait feature extraction method of embodiment 1.
Step 502, controlling the degrees of freedom of the lower limb rehabilitation robot in different directions according to gait characteristics.
Step 502 may specifically include: the degree of freedom of the lower limb rehabilitation robot in the horizontal direction is controlled according to the pace speed, and the degree of freedom of the lower limb rehabilitation robot in the vertical direction is controlled according to the pelvis height.
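A hedged sketch of step 502: a simple proportional law is assumed for illustration, in which the column (vertical degree of freedom) tracks the pelvis height and the treadmill (horizontal degree of freedom) tracks the pace; the gains and reference values are hypothetical, not part of the embodiment.

```python
def follow_control(pelvis_height, pace, ref_height, ref_pace, k_z=1.0, k_x=1.0):
    """Map gait features to velocity commands for the robot's two controlled DOFs."""
    v_vertical = k_z * (pelvis_height - ref_height)   # column (vertical DOF) command
    v_horizontal = k_x * (pace - ref_pace)            # treadmill (horizontal DOF) command
    return v_vertical, v_horizontal
```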
For better understanding of the present embodiment, the following illustrates the control method in the present embodiment through a specific scenario:
The physiotherapist fixes inertial measurement sensors to each movable unit of the patient's lower limbs (pelvis, left and right thighs, left and right calves) through a binding mechanism. The number of sensors can be increased or decreased for different application scenarios: if the user only needs to train the left thigh, two inertial sensors suffice, one bound on the pelvis and the other on the left thigh; when the user needs to train the left and right calves, inertial sensors are bound on the pelvis, left thigh, right thigh, left calf and right calf respectively. In general, 2 to 5 inertial sensors can be selected according to the user's different requirements.
The physical therapist switches the lower limb rehabilitation robot to a rehabilitation training mode, such as a 'treadmill following' mode, determines the body-weight-support ratio of the rehabilitation training and the parameters necessary for following training, such as the starting force, and then starts the training.
By starting the training function, the physical therapist makes the main control module of the lower limb rehabilitation robot call the program function module written according to the gait feature extraction method of embodiment 1, so that the system activates the lower limb gait feature extraction function and, during rehabilitation, extracts the patient's pelvis height together with the hip and knee joint angles, pace, step frequency, stride and other information of the lower limbs on both sides. Meanwhile, the pelvis height and the pace are used as input signals of the motors that drive the treadmill and the upright column; the main control program calls the relevant control programs to drive them so that the lower limb rehabilitation robot follows the patient's training. The treadmill controls the robot's degree of freedom in the horizontal direction and the column controls its degree of freedom in the vertical direction; the specific structures of the treadmill, the column and the other components of the lower limb rehabilitation robot, and their interconnections, can be obtained from the prior art. The patient completes the rehabilitation training following the prompts, games and the like in the training system.
If the patient being trained needs to be replaced, the above steps are repeated until the prescribed rehabilitation training is finished. When the training time is up, or 'end' is clicked, the rehabilitation training process ends.
The physical therapist exports the information recorded during the patient's training, including the patient's pelvis height and the hip and knee joint angles, pace, step frequency and stride of both lower limbs. With this information a doctor can further evaluate the patient's rehabilitation state and plan the patient's next rehabilitation program.
Fig. 16 shows a schematic diagram of the positional relationship between the lower limb rehabilitation robot and the monitored object in a specific scenario. To enhance the user's experience, the lower limb rehabilitation robot may further include a VR (virtual reality) device, through which the user can play games and receive prompts.
Fig. 17 shows the overall data processing flow from the acquisition of data by the inertial sensors to the calculation of gait features and the training with the lower limb rehabilitation robot. The three-axis acceleration, three-axis angular velocity and three-axis magnetic field intensity of the monitored object, collected by the inertial sensors under natural gait, are filtered and fused to obtain quaternions; from the quaternions the lower limb movement parameters and joint angles are obtained, and from these the monitored object's step frequency, stride and pelvis height are calculated, so that the robot's degrees of freedom in the vertical and horizontal directions can be controlled through the pelvis height and through the step frequency and stride respectively.
In this embodiment, the calculated gait features can be used as input signals for controlling the movement of the rehabilitation robot: the degree of freedom of movement in the vertical direction is controlled through the pelvis height, and the degree of freedom of movement in the horizontal direction through the stride, pace and the like. This reduces the use of encoders and torque sensors in the rehabilitation robot system, lowers the complexity of the robot control system and improves the robustness of the control system.
It should be understood that the lower limb rehabilitation robot in the embodiments of the present invention includes a control structure in the vertical direction, such as a column, and a control structure in the horizontal direction. The lower limb rehabilitation robot of this embodiment may be any prior-art lower limb rehabilitation robot used for lower limb training; this embodiment does not specifically limit its type or structure.
Example 4
The present embodiment provides an electronic device, which may take the form of a computing device (for example, a server device), comprising a memory, a processor and a computer program stored on the memory and executable on the processor. When executing the computer program, the processor implements the inertial-sensor-based gait feature extraction method of embodiment 1, the lower limb state evaluation method of embodiment 2, or the control method of the lower limb rehabilitation robot of embodiment 3.
Fig. 18 shows a schematic diagram of the hardware structure of the present embodiment, and as shown in fig. 18, the electronic device 9 specifically includes:
at least one processor 91, at least one memory 92, and a bus 93 for connecting the different system components (including the processor 91 and the memory 92), wherein:
the bus 93 includes a data bus, an address bus, and a control bus.
The memory 92 includes volatile memory such as Random Access Memory (RAM) 921 and/or cache memory 922, and may further include Read Only Memory (ROM) 923.
Memory 92 also includes a program/utility 925 having a set (at least one) of program modules 924, such program modules 924 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The processor 91 executes various functional applications and data processing by running a computer program stored in the memory 92, such as the gait feature extraction method based on the inertial sensor in embodiment 1 of the present invention, the evaluation method of the lower limb state in embodiment 2, or the control method of the lower limb rehabilitation robot in embodiment 3.
The electronic device 9 may further communicate with one or more external devices 94 (e.g., keyboard, pointing device, etc.). Such communication may occur through an input/output (I/O) interface 95. Also, the electronic device 9 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through a network adapter 96. The network adapter 96 communicates with other modules of the electronic device 9 via the bus 93. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in connection with the electronic device 9, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID (disk array) systems, tape drives, data backup storage systems, and the like.
It should be noted that although several units/modules or sub-units/modules of an electronic device are mentioned in the above detailed description, such a division is merely exemplary and not mandatory. Indeed, the features and functionality of two or more units/modules described above may be embodied in one unit/module in accordance with embodiments of the present application. Conversely, the features and functions of one unit/module described above may be further divided into ones that are embodied by a plurality of units/modules.
Example 5
The present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the inertial sensor-based gait feature extraction method of embodiment 1, the lower limb state evaluation method of embodiment 2, or the control method of the lower limb rehabilitation robot of embodiment 3.
More specifically, among others, readable storage media may be employed including, but not limited to: portable disk, hard disk, random access memory, read only memory, erasable programmable read only memory, optical storage device, magnetic storage device, or any suitable combination of the foregoing.
In a possible embodiment, the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps of implementing the inertial sensor based gait feature extraction method of example 1, the lower limb state assessment method of example 2 or the control method of the lower limb rehabilitation robot of example 3 when the program product is run on the terminal device.
Wherein the program code for carrying out the invention may be written in any combination of one or more programming languages, which program code may execute entirely on the user device, partly on the user device, as a stand-alone software package, partly on the user device and partly on the remote device or entirely on the remote device.
While specific embodiments of the invention have been described above, it will be appreciated by those skilled in the art that this is by way of example only, and the scope of the invention is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the principles and spirit of the invention, but such changes and modifications fall within the scope of the invention.

Claims (8)

1. The gait feature extraction method based on the inertial sensor is characterized by comprising the following steps of:
collecting lower limb movement parameters of a monitored object through the inertial sensor; the inertial sensor comprises an inertial sensor positioned at the pelvis of the monitored object, an inertial sensor positioned at the thigh of the monitored object and an inertial sensor positioned at the calf of the monitored object;
acquiring the joint angle of the monitored object according to the lower limb movement parameters; the lower limb movement parameters comprise joint angular velocity and/or joint acceleration, and the joint angles comprise hip joint angles and knee joint angles; acquiring lower limb movement parameters of the monitored object through an inertial sensor positioned at the pelvis of the monitored object and an inertial sensor positioned at the thigh of the monitored object; calculating a relative movement angle of the thigh of the monitored object relative to the pelvis according to the lower limb movement parameters, and taking the relative movement angle of the thigh of the monitored object relative to the pelvis as a hip joint angle;
Acquiring lower limb movement parameters of the monitored object through an inertial sensor positioned at the pelvis of the monitored object and an inertial sensor positioned at the calf of the monitored object; calculating a relative movement angle of the lower leg of the monitored object relative to the pelvis according to the lower limb movement parameters, and taking the relative movement angle of the lower leg of the monitored object relative to the pelvis as a knee joint angle;
calculating gait cycle, step frequency and stride according to the joint angle;
calculating according to the step frequency or the step length to obtain the step speed;
inputting at least one of the stride frequency, the stride, the pace speed and the gait cycle and the lower limb movement parameters, the joint angles and object characteristic parameters into a pelvis height prediction model to obtain a predicted pelvis height, wherein the object characteristic parameters comprise the ratio of the pelvis height of a corresponding object to the height and/or the height of the object, the pelvis height prediction model is a model obtained by taking at least one of a pre-acquired standard stride frequency, standard stride, standard pace and standard gait cycle and a plurality of standard lower limb movement parameters and corresponding standard joint angles and standard object characteristic parameters as inputs, and the corresponding pelvis height is taken as an output training neural network model.
2. The inertial-sensor-based gait feature extraction method of claim 1, wherein the plurality of standard lower limb movement parameters and the corresponding standard joint angles are acquired in advance by a Vicon motion capture system.
3. The inertial-sensor-based gait feature extraction method of claim 1, wherein the step of calculating the step frequency according to the joint angles comprises (see the sketch following the claims):
calculating a hip joint angular velocity average from a plurality of hip joint angular velocities acquired within a preset time range;
calculating an angular velocity trigger threshold according to the hip joint angular velocity average and a preset coefficient;
when the acquired angular velocities of both hip joints trigger the angular velocity trigger threshold and their rates of change are simultaneously positive or simultaneously negative, taking the corresponding time point as a time-intercept point, and taking the interval between adjacent time-intercept points as the gait cycle;
and calculating the step frequency according to the gait cycle.
4. The inertial-sensor-based gait feature extraction method of claim 3, wherein the step of calculating the angular velocity trigger threshold according to the hip joint angular velocity average and the preset coefficient comprises:
calculating the angular velocity trigger threshold according to the hip joint angular velocity average, the preset coefficient and a preset delay time.
5. The inertial-sensor-based gait feature extraction method of claim 1, wherein the step of calculating the stride according to the joint angles comprises (see the sketch following the claims):
calculating a thigh projection of the thigh of the monitored object on the ground according to the hip joint angle;
calculating a calf projection of the calf of the monitored object on the ground according to the knee joint angle;
and calculating the stride according to the thigh projection and the calf projection.
6. The inertial-sensor-based gait feature extraction method of claim 5, wherein, when the moving lower limb is positioned on the anterior side of the monitored object's body, the step of calculating the stride according to the thigh projection and the calf projection comprises: calculating the stride from the difference of the thigh projection and the calf projection; and when the moving lower limb is positioned on the posterior side of the monitored object's body, the step comprises: calculating the stride from the sum of the thigh projection and the calf projection.
7. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the method of any one of claims 1 to 6.
8. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 6.
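
Illustrative code sketches (4)

The four sketches below are editorial illustrations of the computations recited in the claims, not part of the patent disclosure; all function names, constants and layer sizes are assumptions. This first sketch traces the joint-angle step of claim 1. It assumes each inertial sensor's readings have already been fused into an orientation quaternion [w, x, y, z] in a shared world frame (the claims do not prescribe a fusion algorithm or axis convention), takes the hip joint angle as the thigh's rotation relative to the pelvis, and, following claim 1's definition, takes the knee joint angle as the calf's rotation relative to the pelvis.

    import numpy as np

    def quat_conj(q):
        """Conjugate of a unit quaternion [w, x, y, z]."""
        w, x, y, z = q
        return np.array([w, -x, -y, -z])

    def quat_mul(a, b):
        """Hamilton product a * b of quaternions [w, x, y, z]."""
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        ])

    def relative_sagittal_angle(q_segment, q_pelvis):
        """Sagittal-plane angle (degrees) of a segment relative to the pelvis,
        assuming the sensor frame's y axis is the flexion/extension axis."""
        q_rel = quat_mul(quat_conj(q_pelvis), q_segment)
        w, x, y, z = q_rel
        pitch = np.arcsin(np.clip(2.0 * (w * y - z * x), -1.0, 1.0))
        return np.degrees(pitch)

    # toy orientations: pelvis level, thigh flexed ~20 degrees about y
    q_pelvis = np.array([1.0, 0.0, 0.0, 0.0])
    theta = np.radians(20.0)
    q_thigh = np.array([np.cos(theta / 2), 0.0, np.sin(theta / 2), 0.0])

    hip_angle = relative_sagittal_angle(q_thigh, q_pelvis)  # ~20.0 degrees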
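The second sketch is one plausible reading of the gait-cycle segmentation of claims 3 and 4: the trigger threshold is the mean absolute hip angular velocity scaled by a preset coefficient (the value 0.6 is an assumption), a time-intercept point is a sample at which both hip angular velocities trigger the threshold while their rates of change share a sign, and the step frequency follows from the gait cycle under the usual two-steps-per-cycle convention (also an assumption). The preset delay time of claim 4 is omitted for brevity.

    import numpy as np

    def detect_gait_cycles(omega_left, omega_right, t, coeff=0.6):
        """Segment gait by threshold crossings of the hip angular velocities.

        omega_left, omega_right: hip angular-velocity traces (deg/s)
        t: sample times (s); coeff: the preset coefficient of claim 3
        Returns the gait cycle durations (s) and the derived step frequency (Hz).
        """
        # trigger threshold from the mean hip angular velocity and the coefficient
        omega_mean = np.mean(np.abs(np.concatenate([omega_left, omega_right])))
        threshold = coeff * omega_mean

        # rates of change of the two angular velocities
        d_left = np.gradient(omega_left, t)
        d_right = np.gradient(omega_right, t)

        # both hips exceed the threshold while their rates of change share a sign
        trigger = ((np.abs(omega_left) > threshold)
                   & (np.abs(omega_right) > threshold)
                   & (np.sign(d_left) == np.sign(d_right)))

        # rising edges of the trigger mask are the time-intercept points
        edges = np.flatnonzero(np.diff(trigger.astype(int)) == 1)
        cut_times = t[edges + 1]

        cycles = np.diff(cut_times)  # gait cycle = interval between adjacent cuts
        step_freq = 2.0 / cycles.mean() if cycles.size else float("nan")
        return cycles, step_freq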
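The third sketch follows claims 5 and 6: each leg's ground projection is built from the thigh and calf projections, taking their difference for the leg on the anterior side of the body and their sum for the leg on the posterior side. The segment lengths, the L * sin(angle) projection model, and the final composition of the two legs' projections into a stride are all assumptions; the trailing pace line shows one convention for claim 1's pace step.

    import numpy as np

    THIGH_LEN_M = 0.44  # hypothetical segment lengths; a real system would
    CALF_LEN_M = 0.43   # measure or estimate them per subject

    def leg_ground_projection(hip_angle_deg, knee_angle_deg, leg_is_anterior):
        """Ground projection of one leg per claims 5-6.

        Angles are taken here as unsigned sagittal-plane inclinations from the
        vertical, so a segment of length L contributes L * sin(angle) of
        ground distance. Anterior leg: difference of projections; posterior
        leg: their sum.
        """
        thigh_proj = THIGH_LEN_M * np.sin(np.radians(hip_angle_deg))
        calf_proj = CALF_LEN_M * np.sin(np.radians(knee_angle_deg))
        return thigh_proj - calf_proj if leg_is_anterior else thigh_proj + calf_proj

    # stride as the sum of the two legs' projections at the same instant
    # (one plausible combination; the claims leave the exact composition open)
    stride_m = (leg_ground_projection(25.0, 10.0, leg_is_anterior=True)
                + leg_ground_projection(20.0, 35.0, leg_is_anterior=False))

    step_frequency_hz = 1.8                       # stand-in value
    pace_m_per_s = stride_m * step_frequency_hz   # pace from stride and step frequency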
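The last sketch mirrors the pelvis height prediction model of claims 1 and 2: a regression neural network trained with gait features, movement parameters, joint angles and object characteristic parameters as inputs and Vicon-derived pelvis heights as labels. The architecture and the eight-dimensional feature vector are placeholders, and PyTorch is used only for illustration.

    import torch
    import torch.nn as nn

    class PelvisHeightNet(nn.Module):
        """Small regression MLP; layer sizes are assumptions, not from the patent."""
        def __init__(self, n_features: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_features, 64), nn.ReLU(),
                nn.Linear(64, 32), nn.ReLU(),
                nn.Linear(32, 1),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.net(x)

    # hypothetical feature vector standing in for: step frequency, stride,
    # pace, gait cycle, lower limb movement parameters, joint angles, and the
    # object characteristic parameters (pelvis-height/height ratio, height)
    model = PelvisHeightNet(n_features=8)
    loss_fn = nn.MSELoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # stand-in batch; real inputs would come from the IMU pipeline and the
    # labels from Vicon-measured pelvis heights (claim 2)
    x = torch.randn(16, 8)
    y = torch.randn(16, 1)
    for _ in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()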
CN202110117565.8A 2021-01-28 2021-01-28 Gait feature extraction method, lower limb evaluation and control method, device and medium Active CN113143256B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110117565.8A CN113143256B (en) 2021-01-28 2021-01-28 Gait feature extraction method, lower limb evaluation and control method, device and medium

Publications (2)

Publication Number Publication Date
CN113143256A (en) 2021-07-23
CN113143256B (en) 2023-09-26

Family

ID=76878950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110117565.8A Active CN113143256B (en) 2021-01-28 2021-01-28 Gait feature extraction method, lower limb evaluation and control method, device and medium

Country Status (1)

Country Link
CN (1) CN113143256B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114041782A (en) * 2021-07-26 2022-02-15 南宁师范大学 Multi-channel human body lower limb movement information acquisition system and method
CN113768760B (en) * 2021-09-08 2022-12-20 中国科学院深圳先进技术研究院 Control method and system of walking aid and driving device
CN113876316B (en) * 2021-09-16 2023-10-10 河南翔宇医疗设备股份有限公司 System, method, device, equipment and medium for detecting abnormal lower limb flexion and extension activities

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4051558A (en) * 1976-06-30 1977-10-04 The United States Of America As Represented By The United States National Aeronautics And Space Administration Mechanical energy storage device for hip disarticulation
EP1260201A1 (en) * 2001-05-24 2002-11-27 Argo Medical Technologies Ltd. Gait-locomotor apparatus
JP2004197253A (en) * 2002-12-17 2004-07-15 Norikazu Suzuki Support suit
KR20070072314A (en) * 2005-12-31 2007-07-04 고려대학교 산학협력단 Method for controlling the pose of a robot using a neural network, recording medium therefor, apparatus for controlling the pose of a robot using a neural network, and robot therewith
CN102944200A (en) * 2012-11-01 2013-02-27 沈阳工业大学 Method for obtaining hip joint angle and displacement from hip joint center to platform
WO2015008937A1 (en) * 2013-07-15 2015-01-22 서울대학교산학협력단 Method and apparatus for generating motion data
JP2015062654A (en) * 2013-08-28 2015-04-09 日本電信電話株式会社 Gait estimation device, program thereof, stumble risk calculation device and program thereof
CN104613963A (en) * 2015-01-23 2015-05-13 南京师范大学 Pedestrian navigation system and navigation positioning method based on kinesiology model
CN104921851A (en) * 2015-05-25 2015-09-23 河北工业大学 Predictive control method for knee joints of active above-knee prostheses
WO2015164421A1 (en) * 2014-04-21 2015-10-29 The Trustees Of Columbia University In The City Of New York Human movement research, therapeutic, and diagnostic devices, methods, and systems
WO2016057521A1 (en) * 2014-10-06 2016-04-14 Inmotion, Llc Systems, devices and methods relating to motion data
KR20160089791A (en) * 2015-01-20 2016-07-28 한국생산기술연구원 System and method for recognizing gait phase
WO2017014294A1 (en) * 2015-07-23 2017-01-26 国立大学法人北海道大学 Gait analysis method and gait analysis system
CN106815857A (en) * 2015-11-27 2017-06-09 财团法人工业技术研究院 Gesture estimation method for mobile auxiliary robot
CN106821391A (en) * 2017-03-23 2017-06-13 北京精密机电控制设备研究所 Body gait acquisition analysis system and method based on inertial sensor information fusion
WO2017156617A1 (en) * 2016-03-15 2017-09-21 Laboratoire Victhom Inc. Biomechanical analysis and validation system and method
CN108697377A (en) * 2016-01-25 2018-10-23 贝泰米亚公司 Gait analyzer system and method
CN108836346A (en) * 2018-04-16 2018-11-20 大连理工大学 Human body gait analysis method and system based on inertial sensors
CN109521771A (en) * 2018-11-22 2019-03-26 西北工业大学 Hexapod robot motion control algorithm
CN110021398A (en) * 2017-08-23 2019-07-16 陆晓 Gait analysis and training method and system
CN110215648A (en) * 2019-06-28 2019-09-10 华中科技大学 Exoskeleton coordinated gait control method based on human gait motor coordination characteristics
CN110420029A (en) * 2019-08-03 2019-11-08 苏州自如医疗器械有限公司 Wireless gait detection system based on multi-sensor fusion
CN110969114A (en) * 2019-11-28 2020-04-07 四川省骨科医院 Human body action function detection system, detection method and detector
CN111142378A (en) * 2020-01-07 2020-05-12 四川省桑瑞光辉标识系统股份有限公司 Neural network optimization method of biped robot neural network controller
CN111631727A (en) * 2020-06-11 2020-09-08 国家康复辅具研究中心 Evaluation method and evaluation device for artificial limb adaptation effect
CN111904793A (en) * 2020-08-13 2020-11-10 上海电气集团股份有限公司 Lower limb rehabilitation robot based on parallel mechanism and control system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE60327073D1 (en) * 2002-02-07 2009-05-20 Ecole Polytech BODY MOVEMENT MONITOR
US20100094174A1 (en) * 2007-03-13 2010-04-15 Yong Jae Choi Method for three-dimensional biomechanical data and parameter analysis and system using the same method
EP1970005B1 (en) * 2007-03-15 2012-10-03 Xsens Holding B.V. A system and a method for motion tracking using a calibration unit
US8444564B2 (en) * 2009-02-02 2013-05-21 Jointvue, Llc Noninvasive diagnostic system
KR102387378B1 (en) * 2014-10-07 2022-04-15 삼성전자주식회사 Method and apparatus for recognizing gait motion
KR102503955B1 (en) * 2016-11-02 2023-02-28 삼성전자주식회사 Method and apparatus for controlling balance
KR20180096241A (en) * 2017-02-21 2018-08-29 삼성전자주식회사 Method and apparatus for walking assistance
JP6882916B2 (en) * 2017-03-29 2021-06-02 本田技研工業株式会社 Walking support system, walking support method, and walking support program
US20180289579A1 (en) * 2017-04-11 2018-10-11 The Trustees Of Columbia University In The City Of New York Powered Walking Assistant and Associated Systems and Methods
US11672480B2 (en) * 2018-07-09 2023-06-13 V Reuben F. Burch Wearable flexible sensor motion capture system
JP7183963B2 (en) * 2019-06-07 2022-12-06 トヨタ自動車株式会社 Gait training system, display method, and display program

Also Published As

Publication number Publication date
CN113143256A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN113143256B (en) Gait feature extraction method, lower limb evaluation and control method, device and medium
CN106650687B (en) Posture correction method based on depth information and skeleton information
Yin et al. FootSee: an interactive animation system.
Wang et al. Using wearable sensors to capture posture of the human lumbar spine in competitive swimming
CN111631923A (en) Neural network control system of exoskeleton robot based on intention recognition
Huang et al. Posture estimation and human support using wearable sensors and walking-aid robot
CN104424650B Arm information compensation method in optical human body motion capture
KR20180096241A (en) Method and apparatus for walking assistance
CN110193830B (en) Ankle joint gait prediction method based on RBF neural network
Kim et al. StrokeTrack: wireless inertial motion tracking of human arms for stroke telerehabilitation
Nomm et al. Monitoring of the human motor functions rehabilitation by neural networks based system with kinect sensor
Yuan et al. Adaptive recognition of motion posture in sports video based on evolution equation
Zhen et al. Hybrid deep-learning framework based on Gaussian fusion of multiple spatiotemporal networks for walking gait phase recognition
He et al. A new Kinect-based posture recognition method in physical sports training based on urban data
Ren et al. Multivariate analysis of joint motion data by Kinect: application to Parkinson’s disease
Wang et al. Arbitrary spatial trajectory reconstruction based on a single inertial sensor
CN117109567A (en) Riding gesture monitoring method and system for dynamic bicycle movement and wearable riding gesture monitoring equipment
Jiang et al. Deep learning algorithm based wearable device for basketball stance recognition in basketball
WO2023035457A1 (en) Walking aid control method and system, and driving device
Lianzhen et al. Athlete Rehabilitation Evaluation System Based on Internet of Health Things and Human Gait Analysis Algorithm
WO2014129917A2 (en) System and method for evaluating the motion of a subject
CN115018962A (en) Human motion attitude data set generation method based on virtual character model
Lueken et al. Using synthesized IMU data to train a long short-term memory-based neural network for unobtrusive gait analysis with a sparse sensor setup
KR20220066535A (en) Method, Server and System for Recognizing Motion in Video
Nergui et al. Human gait behavior interpretation by a mobile home healthcare robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20210723

Assignee: SHANGHAI ELECTRIC INTELLIGENT REHABILITATION MEDICAL TECHNOLOGY Co.,Ltd.

Assignor: Shanghai Electric Group Co.,Ltd.

Contract record no.: X2023310000146

Denomination of invention: Gait feature extraction method, lower limb evaluation, control method, equipment and medium

License type: Exclusive License

Record date: 20230919