CN114831627A - Lower limb prosthesis movement identification method based on three decision trees - Google Patents

Lower limb prosthesis movement identification method based on three decision trees

Info

Publication number
CN114831627A
CN114831627A (application CN202210267991.4A)
Authority
CN
China
Prior art keywords
decision tree
data
disabled
healthy
tree model
Prior art date
Legal status
Pending
Application number
CN202210267991.4A
Other languages
Chinese (zh)
Inventor
任雷
张尧
修豪华
李振男
阎凌云
韩阳
王旭
钱志辉
任露泉
Current Assignee
Jilin University
Original Assignee
Jilin University
Priority date
Filing date
Publication date
Application filed by Jilin University
Priority to CN202210267991.4A
Publication of CN114831627A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
        • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
                    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
                        • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
                            • A61B 5/1116 Determining posture transitions
                            • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
                                • A61B 5/1122 Determining geometric values of movement trajectories
                            • A61B 5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
                • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
                    • A61B 5/7235 Details of waveform analysis
                        • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
                            • A61B 5/7267 Classification of physiological signals or data involving training the classification device
    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F 18/00 Pattern recognition
                    • G06F 18/20 Analysing
                        • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
                            • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
                        • G06F 18/24 Classification techniques
                            • G06F 18/243 Classification techniques relating to the number of classes
                                • G06F 18/24323 Tree-organised classifiers
                        • G06F 18/25 Fusion techniques
                            • G06F 18/254 Fusion techniques of classification results, e.g. of results related to same input data
                            • G06F 18/259 Fusion by voting

Abstract

A lower limb prosthesis motion recognition method based on three decision trees, belonging to the technical field of pattern recognition. The invention first collects experimental data for nine motion states from the left legs of 8 left-side above-knee amputees and 16 healthy subjects; it then trains an independent decision tree model for the individual disabled subject, a generalized decision tree model for disabled subjects, and a generalized decision tree model for healthy subjects; finally, a soft voting classifier integrates the predictions of the three decision trees for real-time prediction. Exploiting the decision tree's short prediction time and imitating a small forest structure, the invention combines disabled-subject and healthy-subject experimental data to train several decision tree models; it can greatly improve the algorithm's accuracy while reducing prediction time and processor memory, and realizes real-time prediction of the amputee's motion intention.

Description

Lower limb prosthesis movement identification method based on three decision trees
Technical Field
The invention belongs to the technical field of online pattern recognition, and particularly relates to a lower limb prosthesis movement recognition method based on three decision trees.
Background
Over the years, the focus of lower limb prosthesis research has gradually shifted from early passive designs to active (powered) designs that save effort for the wearer. The development of powered prostheses requires more advanced and intuitive control strategies, and accurately judging the motion intention of lower limb amputees is particularly important. Because collecting electromyographic signals is inconvenient for a user during daily activities, mechanical sensors embedded in the prosthesis are used to collect data, and informative features are screened from the raw data to improve model training. In past research, many machine learning algorithms have been used for human intent recognition, but some have overly complex model structures or require storing large amounts of training data, making them unsuitable for real-time motion state prediction on a prosthesis system. The random forest algorithm generalizes well and performs well on multi-feature data, but its internal operation cannot be controlled: several similar decision trees may outvote the true result, and its execution is much slower than that of a single decision tree. A single decision tree is fast and accurate, can be regarded as a set of if-then rules, and is easy to execute in real time on an STM32 processor; the split condition at each node can be viewed as a heuristic rule, simplifying a complex machine learning classifier into a multi-layer decision structure.
The residual limb length and cause of amputation differ for each amputee subject, so the sensor data differ as well; an independent decision tree model trained only on a given subject's data achieves the highest prediction accuracy for that subject. From the perspective of model generalization, however, data from many disabled subjects should be combined to train the model. Moreover, the joint motion trajectories of healthy subjects have always been an important basis for tuning the parameters of prosthesis control algorithms, so researchers add a certain proportion of healthy-subject motion data to the disabled-subject training data; simply mixing the experimental data, however, blurs the boundary values of the disabled-subject data.
Because of the various shortcomings of existing algorithms, their performance in practical applications is not ideal. The invention therefore proposes a lower limb prosthesis motion recognition method that predicts the subject's motion state using three independent decision tree models.
Disclosure of Invention
The invention aims to provide a lower limb prosthesis motion recognition method based on three decision trees, which achieves real-time recognition and prediction of an amputee's motion intention by integrating, through a soft voting classifier, the predictions of an independent decision tree model for the individual disabled subject, a generalized decision tree model for disabled subjects, and a generalized decision tree model for healthy subjects.
The invention relates to a lower limb prosthesis movement identification method based on three decision trees, which comprises the following steps:
1.1 acquire the motion data collected by each sensor in the prosthetic knee joint, comprising the following steps:
1.1.1 using a knee joint angle sensor, a weighing sensor and an IMU sensor placed on the prosthetic knee joint, collect data from 7 male and 1 female disabled subjects during slow horizontal walking, normal horizontal walking, fast horizontal walking, ascending, descending, sitting, standing, and ascending and descending motions;
1.1.2 using knee joint angle sensors and IMU sensors placed on the knee joints of healthy subjects, collect data from 14 healthy male subjects and 2 healthy female subjects during the same nine motions;
1.1.3 the data collected by the sensors on the prosthetic knee joint comprise the knee joint angle, the ground reaction force, and the IMU sensor's X-, Y- and Z-axis accelerations and X-, Y- and Z-axis angular velocities;
1.1.4 the data collected by the sensors on the knee joints of the healthy subjects comprise the knee joint angle and the IMU sensor's X-, Y- and Z-axis accelerations and X-, Y- and Z-axis angular velocities;
1.1.5 preprocessing: denoise the data collected in steps 1.1.1 and 1.1.2, remove abnormal data, and add classification labels to the normal data;
1.1.6 process the sensor data using a fixed 200 ms time window and extract two time-domain feature values from each data dimension within the window, the mean and the standard deviation, described by the following formulas:
mean value:

$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$$

standard deviation:

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2}$$

where $x_i$ is the $i$-th sampled value and $N$ is the number of samples in the 200 ms time window;
1.2 because the decision tree computes quickly and accurately, it is suitable for real-time human intention recognition on an STM32; three decision trees are constructed for ensemble classification, comprising the following steps:
1.2.1 using the experimental data of all 7 male and 1 female disabled subjects, train the generalized decision tree model for disabled subjects as follows:
1.2.1.1 use the tree.DecisionTreeClassifier() class from the sklearn framework in Python as the classifier; set its parameters so that the Gini coefficient selects the optimal split attribute, the maximum depth of each decision tree is 7, and the minimum number of samples per node is 5 (a node with fewer than 5 samples is not split further);
1.2.1.2 train a decision tree model using the feature-extracted experimental data and labels of the 7 male and 1 female disabled subjects as the data set;
1.2.1.3 visualize the decision tree using the pydotplus module;
1.2.1.4 after training, store the decision condition of each node in the model;
1.2.2 using the collected experimental data of the disabled subject who will wear the prosthesis for real-time intention recognition, train that subject's independent decision tree model as follows:
1.2.2.1 same as step 1.2.1.1;
1.2.2.2 train a decision tree model using the experimental data and labels of the disabled subject who will wear the prosthesis for real-time intention recognition as the data set;
1.2.2.3 same as step 1.2.1.3;
1.2.2.4 same as step 1.2.1.4;
1.2.3 using the experimental data of all 14 healthy male subjects and 2 healthy female subjects, train the generalized decision tree model for healthy subjects as follows:
1.2.3.1 same as step 1.2.1.1;
1.2.3.2 train a decision tree model using the feature-extracted experimental data and labels of all 14 healthy male subjects and 2 healthy female subjects as the data set;
1.2.3.3 same as step 1.2.1.3;
1.2.3.4 same as step 1.2.1.4;
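The training setup of steps 1.2.1.1 and 1.2.1.2 can be sketched as below. The classifier and its three parameters come from the text; the feature matrix X (16 columns, assumed to be the mean and standard deviation of the 8 prosthesis sensor channels) and the labels y for the nine motion states are synthetic placeholders standing in for the real experimental data.

```python
import numpy as np
from sklearn import tree

rng = np.random.default_rng(0)
X = rng.normal(size=(900, 16))           # 900 windows x 16 features (assumed shape)
y = rng.integers(0, 9, size=900)         # labels for the nine motion states

clf = tree.DecisionTreeClassifier(
    criterion="gini",        # Gini coefficient selects the optimal split attribute
    max_depth=7,             # maximum depth of each decision tree
    min_samples_split=5,     # a node with fewer than 5 samples is not split further
)
clf.fit(X, y)
```

For step 1.2.1.3, the fitted tree could then be exported with tree.export_graphviz(clf) and rendered through the pydotplus module.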
1.3 use the three decision tree classification models to distinguish the nine motion states of slow horizontal walking, normal horizontal walking, fast horizontal walking, ascending, descending, sitting, standing, and ascending and descending, comprising the following steps:
1.3.1 the data set of each decision tree classification model is composed of the nine motion state data in equal proportions;
1.3.2 the training data sets of the generalized decision tree model for disabled subjects and the generalized decision tree model for healthy subjects contain more data than that used by the independent decision tree model;
1.3.3 integrate the predictions of the independent decision tree model, the generalized decision tree model for disabled subjects and the generalized decision tree model for healthy subjects with a soft voting classifier, and take the result as the final prediction of the three-decision-tree classification model.
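The soft-voting integration of step 1.3.3 can be sketched as follows: each fitted tree exposes predict_proba(), the class-probability vectors are averaged, and the class with the highest mean probability wins. The function name `soft_vote` and the toy one-feature data are illustrative, not from the patent.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def soft_vote(models, X):
    """Average predict_proba over the models; return the winning class per row."""
    mean_proba = np.mean([m.predict_proba(X) for m in models], axis=0)
    return mean_proba.argmax(axis=1)

# toy stand-ins for the independent, generalized-disabled and generalized-healthy trees
X_train = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
y_train = [0, 0, 0, 1, 1, 1]
models = [DecisionTreeClassifier(max_depth=d).fit(X_train, y_train) for d in (1, 2, 3)]
pred = soft_vote(models, [[0.5], [4.5]])
```

Averaging probabilities (soft voting) rather than taking a majority of hard labels lets a confident model outweigh two uncertain ones, which matters when only three voters are available.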
The 7 male and 1 female disabled subjects of step 1.1.1 are all left-side above-knee amputees.
The sampling frequency of the knee joint angle sensor, weighing sensor and IMU sensor on the prosthetic knee joint in step 1.1.1, and of the knee joint angle sensors and IMU sensors placed on the knee joints of the healthy subjects in step 1.1.2, is 100 Hz.
For the three decision tree classification models described in step 1.3, the classification accuracy is computed by ten-fold cross-validation: the data set is divided into ten parts; nine parts in turn serve as training data and one part as test data, and the mean accuracy over the ten test runs is taken as the estimate of the algorithm's accuracy.
The beneficial effects of the invention are:
The lower limb prosthesis motion recognition method based on three decision trees can greatly improve the accuracy of amputee motion intention recognition. Because machine learning algorithms often suffer from overly long prediction times and complex model structures in real-time prediction, the decision tree algorithm is used for pattern recognition. To ensure model generalization while referencing the motion states of healthy subjects, a soft voting classifier integrates the predictions of the generalized decision tree model for disabled subjects, the independent decision tree model, and the generalized decision tree model for healthy subjects to identify the motion mode; hence three decision tree algorithms are used.
Drawings
FIG. 1 is a flow chart of a method for lower limb prosthesis motion recognition based on three decision trees;
FIG. 2 is a diagram of a decision process of a decision tree algorithm;
FIG. 3 is a schematic diagram of the soft voting classifier.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
The implementation flow of the lower limb prosthesis motion recognition algorithm based on three decision trees is shown in FIG. 1; the method comprises the following steps:
1.1 acquire the motion data collected by each sensor in the prosthetic knee joint, comprising the following steps:
1.1.1 using a knee joint angle sensor, a weighing sensor and an IMU sensor placed on the prosthetic knee joint, collect data from 7 male and 1 female disabled subjects during slow horizontal walking, normal horizontal walking, fast horizontal walking, ascending, descending, sitting, standing, and ascending and descending motions; before data acquisition, each subject wears the prosthesis for ten hours of adaptation training, and during the experiment each subject provides 5 minutes of motion data in each motion state;
1.1.2 the 7 male and 1 female disabled subjects of step 1.1.1 are all left-side above-knee amputees;
1.1.3 using knee joint angle sensors and IMU sensors placed on the knee joints of healthy subjects, collect data from 14 healthy male subjects and 2 healthy female subjects during the same nine motions; the knee joint angle sensor and IMU sensor are fixed at the left knee of each healthy subject with straps, the fixing position is kept the same for every subject, and during the experiment 5 minutes of motion data are collected in each motion state;
1.1.4 the data collected by the sensors on the prostheses of the disabled subjects comprise the knee joint angle, the ground reaction force, and the IMU sensor's X-, Y- and Z-axis accelerations and X-, Y- and Z-axis angular velocities;
1.1.5 the data collected by the sensors worn by the healthy subjects comprise the knee joint angle and the IMU sensor's X-, Y- and Z-axis accelerations and X-, Y- and Z-axis angular velocities;
1.1.6 preprocessing: denoise the data collected in steps 1.1.1 and 1.1.3, remove abnormal data, and add classification labels to the normal data;
1.1.7 process the sensor data using a fixed 200 ms time window and extract two time-domain feature values from each data dimension within the window, the mean and the standard deviation, described by the following formulas:
mean value:

$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$$

standard deviation:

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2}$$

where $x_i$ is the $i$-th sampled value and $N$ is the number of samples in the 200 ms time window;
1.2 train the three decision trees with different experimental data, comprising the following steps:
1.2.1 using the experimental data of all 7 male and 1 female disabled subjects, with the same amount of data for each motion state of each subject, train the generalized decision tree model for disabled subjects as follows:
1.2.1.1 use the tree.DecisionTreeClassifier() class from the sklearn framework in Python as the classifier, as shown in FIG. 2; set its parameters so that the Gini coefficient selects the optimal split attribute, the maximum depth of each decision tree is 7, and the minimum number of samples per node is 5 (a node with fewer than 5 samples is not split further);
1.2.1.2 train a decision tree model using the feature-extracted experimental data and labels of the 7 male and 1 female disabled subjects as the data set;
1.2.1.3 visualize the decision tree using the pydotplus module;
1.2.1.4 after training, store the decision condition of each node in the model;
1.2.2 using the collected experimental data of the disabled subject who will wear the prosthesis for real-time intention recognition, train that subject's independent decision tree model as follows:
1.2.2.1 same as step 1.2.1.1;
1.2.2.2 train a decision tree model using the experimental data and labels of the disabled subject who will wear the prosthesis for real-time intention recognition as the data set;
1.2.2.3 same as step 1.2.1.3;
1.2.2.4 same as step 1.2.1.4;
1.2.3 using the experimental data of all 14 healthy male subjects and 2 healthy female subjects, train the generalized decision tree model for healthy subjects as follows:
1.2.3.1 same as step 1.2.1.1;
1.2.3.2 train a decision tree model using the feature-extracted experimental data and labels of all 14 healthy male subjects and 2 healthy female subjects as the data set;
1.2.3.3 same as step 1.2.1.3;
1.2.3.4 same as step 1.2.1.4;
1.3 use the three decision tree classification models to distinguish the nine motion states of slow horizontal walking, normal horizontal walking, fast horizontal walking, ascending, descending, sitting, standing, and ascending and descending, comprising the following steps:
1.3.1 the data set of each decision tree classifier is composed of the nine motion state data in equal proportions;
1.3.2 the training data sets of the generalized decision tree model for disabled subjects and the generalized decision tree model for healthy subjects contain more data than that used by the independent decision tree model;
1.3.3 compute the classification accuracy of the generalized decision tree model for disabled subjects, the independent decision tree model, and the generalized decision tree model for healthy subjects respectively;
1.3.4 write code for the three prediction functions from the node decision conditions stored in steps 1.2.1.4, 1.2.2.4 and 1.2.3.4, and port the code to an STM32 microcontroller;
1.3.5 as shown in FIG. 3, during real-time prediction, integrate the predictions of the independent decision tree model, the generalized decision tree model for disabled subjects and the generalized decision tree model for healthy subjects with the soft voting classifier, and take the result as the prediction of the overall classification model.
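Step 1.3.4 turns each tree's stored node conditions into a standalone prediction function for the STM32. One way to sketch this is to walk sklearn's internal tree_ arrays and emit nested C if/else source text; the helper name `tree_to_c`, the demo feature name, and the tiny one-feature demo tree are illustrative, not from the patent.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def tree_to_c(clf, feature_names):
    """Emit a fitted tree's node conditions as a nested C if/else function."""
    t = clf.tree_
    lines = []

    def recurse(node, depth):
        pad = "    " * depth
        if t.children_left[node] == -1:                   # leaf: emit its class
            lines.append(f"{pad}return {int(np.argmax(t.value[node]))};")
        else:                                             # internal: emit the split
            name = feature_names[t.feature[node]]
            lines.append(f"{pad}if ({name} <= {t.threshold[node]:.6f}f) {{")
            recurse(t.children_left[node], depth + 1)
            lines.append(f"{pad}}} else {{")
            recurse(t.children_right[node], depth + 1)
            lines.append(f"{pad}}}")

    recurse(0, 1)
    args = ", ".join(f"float {n}" for n in feature_names)
    return f"int predict({args}) {{\n" + "\n".join(lines) + "\n}\n"

# tiny demo tree over a single feature
clf = DecisionTreeClassifier(max_depth=2).fit([[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1])
c_source = tree_to_c(clf, ["knee_angle_mean"])
```

The generated function needs no training data or sklearn runtime on the target, which is what makes the decision tree's if-then structure convenient for an STM32.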

Claims (4)

1. A lower limb prosthesis movement identification method based on three decision trees, characterized by comprising the following steps:
1.1 acquire the motion data collected by each sensor in the prosthetic knee joint, comprising the following steps:
1.1.1 using a knee joint angle sensor, a weighing sensor and an IMU sensor placed on the prosthetic knee joint, collect data from 7 male and 1 female disabled subjects during slow horizontal walking, normal horizontal walking, fast horizontal walking, ascending, descending, sitting, standing, and ascending and descending motions;
1.1.2 using knee joint angle sensors and IMU sensors placed on the knee joints of healthy subjects, collect data from 14 healthy male subjects and 2 healthy female subjects during the same nine motions;
1.1.3 the data collected by the sensors on the prosthetic knee joint comprise the knee joint angle, the ground reaction force, and the IMU sensor's X-, Y- and Z-axis accelerations and X-, Y- and Z-axis angular velocities;
1.1.4 the data collected by the sensors on the knee joints of the healthy subjects comprise the knee joint angle and the IMU sensor's X-, Y- and Z-axis accelerations and X-, Y- and Z-axis angular velocities;
1.1.5 preprocessing: denoise the data collected in steps 1.1.1 and 1.1.2, remove abnormal data, and add classification labels to the normal data;
1.1.6 process the sensor data using a fixed 200 ms time window and extract two time-domain feature values from each data dimension within the window, the mean and the standard deviation, described by the following formulas:
mean value:

$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$$

standard deviation:

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2}$$

where $x_i$ is the $i$-th sampled value and $N$ is the number of samples in the 200 ms time window;
1.2 because the decision tree computes quickly and accurately, it is suitable for real-time human intention recognition on an STM32; three decision trees are constructed for ensemble classification, comprising the following steps:
1.2.1 using the experimental data of all 7 male and 1 female disabled subjects, train the generalized decision tree model for disabled subjects as follows:
1.2.1.1 use the tree.DecisionTreeClassifier() class from the sklearn framework in Python as the classifier; set its parameters so that the Gini coefficient selects the optimal split attribute, the maximum depth of each decision tree is 7, and the minimum number of samples per node is 5 (a node with fewer than 5 samples is not split further);
1.2.1.2 train a decision tree model using the feature-extracted experimental data and labels of the 7 male and 1 female disabled subjects as the data set;
1.2.1.3 visualize the decision tree using the pydotplus module;
1.2.1.4 after training, store the decision condition of each node in the model;
1.2.2 using the collected experimental data of the disabled subject who will wear the prosthesis for real-time intention recognition, train that subject's independent decision tree model as follows:
1.2.2.1 same as step 1.2.1.1;
1.2.2.2 train a decision tree model using the experimental data and labels of the disabled subject who will wear the prosthesis for real-time intention recognition as the data set;
1.2.2.3 same as step 1.2.1.3;
1.2.2.4 same as step 1.2.1.4;
1.2.3 using the experimental data of all 14 healthy male subjects and 2 healthy female subjects, train the generalized decision tree model for healthy subjects as follows:
1.2.3.1 same as step 1.2.1.1;
1.2.3.2 train a decision tree model using the feature-extracted experimental data and labels of all 14 healthy male subjects and 2 healthy female subjects as the data set;
1.2.3.3 same as step 1.2.1.3;
1.2.3.4 same as step 1.2.1.4;
1.3 use the three decision tree classification models to distinguish the nine motion states of slow horizontal walking, normal horizontal walking, fast horizontal walking, ascending, descending, sitting, standing, and ascending and descending, comprising the following steps:
1.3.1 the data set of each decision tree classification model is composed of the nine motion state data in equal proportions;
1.3.2 the training data sets of the generalized decision tree model for disabled subjects and the generalized decision tree model for healthy subjects contain more data than that used by the independent decision tree model;
1.3.3 integrate the predictions of the independent decision tree model, the generalized decision tree model for disabled subjects and the generalized decision tree model for healthy subjects with a soft voting classifier, and take the result as the final prediction of the three-decision-tree classification model.
2. The lower limb prosthesis movement identification method based on three decision trees according to claim 1, characterized in that the 7 male and 1 female disabled subjects of step 1.1.1 are all left-side above-knee amputees.
3. The lower limb prosthesis movement identification method based on three decision trees according to claim 1, characterized in that the sampling frequency of the knee joint angle sensor, weighing sensor and IMU sensor on the prosthetic knee joint in step 1.1.1, and of the knee joint angle sensors and IMU sensors placed on the knee joints of the healthy subjects in step 1.1.2, is 100 Hz.
4. The lower limb prosthesis movement identification method based on three decision trees according to claim 1, characterized in that, for the three decision tree classification models described in step 1.3, the classification accuracy is computed by ten-fold cross-validation: the data set is divided into ten parts; nine parts in turn serve as training data and one part as test data, and the mean accuracy over the ten test runs is taken as the estimate of the algorithm's accuracy.
Application CN202210267991.4A, filed 2022-03-17, priority date 2022-03-17: Lower limb prosthesis movement identification method based on three decision trees, published as CN114831627A (pending)

Priority Applications (1)

Application CN202210267991.4A, priority and filing date 2022-03-17: Lower limb prosthesis movement identification method based on three decision trees


Publications (1)

CN114831627A (en), published 2022-08-02

Family

ID=82562403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210267991.4A Pending CN114831627A (en) 2022-03-17 2022-03-17 Lower limb prosthesis movement identification method based on three decision trees

Country Status (1)

Country Link
CN (1) CN114831627A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006314670A (en) * 2005-05-16 2006-11-24 Kenichi Katsura Walking supporting device, and rehabilitation system
CN101533467A (en) * 2009-04-28 2009-09-16 南京航空航天大学 Method for identifying a plurality of human postures based on decision tree
US20130338540A1 (en) * 2012-06-14 2013-12-19 Rehabilitation Institute Of Chicago Systems and methods for hierarchical pattern recognition for simultaneous control of multiple-degree of freedom movements for prosthetics
CN107669278A (en) * 2017-09-22 2018-02-09 广州杰赛科技股份有限公司 Moving state identification method and system, animal behavior identifying system
CN107918492A (en) * 2017-12-22 2018-04-17 安庆师范大学 A kind of human motion in face of Intelligent lower limb artificial limb is intended to precognition recognition methods
CN108509897A (en) * 2018-03-29 2018-09-07 同济大学 A kind of human posture recognition method and system
CN108681685A (en) * 2018-03-23 2018-10-19 天津科技大学 A kind of body work intension recognizing method based on human body surface myoelectric signal
CN110969108A (en) * 2019-11-25 2020-04-07 杭州电子科技大学 Limb action recognition method based on autonomic motor imagery electroencephalogram
CN113314209A (en) * 2021-06-11 2021-08-27 吉林大学 Human body intention identification method based on weighted KNN
US20210401324A1 (en) * 2020-06-28 2021-12-30 The Chinese University Of Hong Kong Method for recognizing a motion pattern of a limb
CN114041783A (en) * 2021-11-11 2022-02-15 吉林大学 Lower limb movement intention identification method based on empirical rule combined with machine learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MA Chao: "Research on Semi-supervised Random Forest Classification Algorithm and Its Parallelization", China Master's Theses Full-text Database (Information Science and Technology), 15 June 2018 (2018-06-15), pages 140 - 99 *

Similar Documents

Publication Publication Date Title
Liu et al. Intent pattern recognition of lower-limb motion based on mechanical sensors
Hargrove et al. The effect of electrode displacements on pattern recognition based myoelectric control
Joshi et al. High energy spectrogram with integrated prior knowledge for EMG-based locomotion classification
US20120310370A1 (en) Systems and methods for providing a neural-machine interface for artificial legs
Liu et al. Towards zero retraining for myoelectric control based on common model component analysis
CN106308809A (en) Method for recognizing gait of thigh amputation subject
US9907489B2 (en) Systems and methods for hierarchical pattern recognition for simultaneous control of multiple-degree of freedom movements for prosthetics
CN110363152B (en) Method for identifying road condition of lower limb prosthesis based on surface electromyographic signals
Zhuojun et al. sEMG pattern recognition of muscle force of upper arm for intelligent bionic limb control
CN104983489A (en) Road condition identifying method for lower limb prosthesis walking
CN108968918A (en) The wearable auxiliary screening equipment of early stage Parkinson
Chuang et al. A wearable activity sensor system and its physical activity classification scheme
CN113850104A (en) Motion pattern recognition method for limbs
CN205031391U (en) Road conditions recognition device of power type artificial limb
CN113314209B (en) Human body intention identification method based on weighted KNN
CN113749644A (en) Intelligent garment capable of monitoring lumbar movement of human body and automatically correcting posture
CN114831627A (en) Lower limb prosthesis movement identification method based on three decision trees
Sun et al. A fault-tolerant algorithm to enhance generalization of EMG-based pattern recognition for lower limb movement
CN114831784A (en) Lower limb prosthesis terrain recognition system and method based on multi-source signals
Lei et al. Leg amputees motion pattern recognition based on principal component analysis and BP network
Sheng et al. Motion intent recognition in intelligent lower limb prosthesis using one-dimensional dual-tree complex wavelet transforms
Dominguez et al. The role of machine learning in improved functionality of lower limb prostheses
CN113625882B (en) Myoelectric gesture recognition method based on sparse multichannel correlation characteristics
CN115985463B (en) Real-time muscle fatigue prediction method and system based on wearable equipment
Liu et al. Continuous estimation of lower limb prosthesis based on gbdbn and bp network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination