CN114041783A - Lower limb movement intention identification method based on empirical rule combined with machine learning - Google Patents
- Publication number
- CN114041783A (application CN202111330071.4A)
- Authority
- CN
- China
- Prior art keywords
- data
- sensor
- algorithm
- time
- threshold
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
(All within A61B5/00: Measuring for diagnostic purposes; identification of persons)
- A61B5/112: Gait analysis
- A61B5/1121: Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1123: Discriminating type of movement, e.g. walking or running
- A61B5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
- A61B5/6811: Sensor mounted on worn items; external prosthesis
- A61B5/7203: Signal processing for noise prevention, reduction or removal
- A61B5/7235: Details of waveform analysis
- A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267: Classification of physiological signals or data involving training the classification device
Abstract
A lower limb movement intention recognition method based on empirical rules combined with machine learning belongs to the technical field of pattern recognition. The method first acquires data from a knee joint angle sensor, a weighing sensor and an IMU sensor, denoises the data and removes outliers, then designs three classifiers that recognize human intention through empirical threshold judgments and an improved weighted KNN algorithm. The method accurately identifies seven common motion modes with a small number of mechanical sensors, greatly reduces the training-set size required by the improved weighted KNN algorithm, shortens the algorithm's running time on the STM32 microcontroller, and ensures real-time prediction of the human motion state. Built on experimental data, the proposed combination of empirical rules and a machine learning algorithm aims to promote the development of commercial prostheses and to make daily use more convenient for lower limb amputees.
Description
Technical Field
The invention belongs to the technical field of pattern recognition, and particularly relates to a method for recognizing the movement intention of lower limb amputees based on empirical rules and machine learning.
Background
Accurately judging the movement intention of lower limb amputees effectively reduces the risk of falling and helps meet daily mobility needs. Commercial prostheses mostly rely on empirical threshold judgments to switch motion states, while some researchers have tried machine learning classification algorithms for intention recognition. Because machine learning algorithms require large amounts of data for offline training, and the trained models carry many parameters, real-time prediction and recognition on the STM32 still faces challenges. The KNN algorithm is a classic machine learning method with high classification accuracy and insensitivity to outliers; for classifying multiple motion states from mechanical-sensor data in lower limb prosthesis experiments, its accuracy is difficult for other machine learning algorithms to match. However, it suffers from a huge computational load and long processing times when predicting human intention in real time on the STM32.
Human motion intention recognition is the most important part of an intelligent powered lower limb prosthesis control system, and is mainly divided into recognition based on neural signals and recognition based on mechanical signals. Neural biosignals have weak amplitudes and their acquisition is affected by sweat, which severely restricts their application in controlling intelligent powered prostheses. Recognizing movement modes and movement intentions from biomechanical signals, using mechanical sensors integrated on powered prostheses, is therefore a current research hotspot in the lower limb prosthesis field. Many algorithms achieve more than 95% accuracy in switching between motion modes, but an error rate of about 5% can still cause falls, and redundant sensors also burden amputees.
Because of these shortcomings, existing algorithms perform poorly in practical applications and need improvement. Addressing the defects of the prior art, the invention provides a movement intention recognition method for lower limb amputees based on empirical rules and machine learning.
Disclosure of Invention
The invention aims to provide a lower limb movement intention recognition method combining empirical rules with machine learning, which improves the accuracy of human body movement intention recognition classification to the maximum extent and reduces the processing time of real-time prediction.
The invention relates to a lower limb movement intention recognition method based on empirical rules combined with machine learning, which comprises the following steps:
1.1 acquire the action data collected by each sensor in the knee joint prosthesis, comprising the following steps:
1.1.1 using a knee joint angle sensor, a weighing sensor and an IMU sensor placed on a knee joint prosthesis, collect data from 6 amputee subjects during walking, going upstairs, going downstairs, sitting, standing, going uphill and going downhill;
1.1.2 preprocessing: denoise the acquired data, remove abnormal data, and add a classification label to the normal data;
1.1.3 analyze and compare the sensor data under different motion states, and summarize the sitting threshold Threshold_Sit, the standing threshold Threshold_Stand, the uphill threshold Threshold_RA and the downhill threshold Threshold_RD;
1.2 use the value of the knee joint angle sensor to distinguish the sitting and standing states from other motion states, comprising the following steps:
1.2.1 calculate the mean knee joint angle Knee_τ every 4 s;
1.2.2 compare Knee_τ with the sitting threshold Threshold_Sit and the standing threshold Threshold_Stand, and judge whether the state is sitting, standing or another motion state;
1.3 use the pitch angle solved by the IMU sensor at the full-foot-strike moment to distinguish uphill and downhill from other motion states, comprising the following steps:
1.3.1 using the STM32 microcontroller, take the difference between two consecutive heel-strike times to obtain the duration T of a complete gait cycle:
T = t_1HS - t_0HS
wherein t_0HS is the previous heel-strike time and t_1HS is the current heel-strike time;
1.3.2 estimate the moment of full foot strike:
t_FF = t_HS + 0.25*T
wherein t_HS is the detected heel-strike time, T is the gait cycle time from step 1.3.1, and the factor 0.25 reflects that full foot strike occurs at about 25% of the gait cycle;
1.3.3 compare the pitch angle of the IMU sensor at full foot strike with the uphill threshold Threshold_RA and the downhill threshold Threshold_RD, and judge whether the state is uphill, downhill or another motion state;
1.4 use the improved weighted KNN classification algorithm to distinguish the three motion states of walking, going upstairs and going downstairs from the other motion states, comprising the following steps:
1.4.1 segment the sensor data with a 200 ms fixed time window; collect knee joint angle, pressure, X/Y/Z-axis acceleration and X/Y/Z-axis angular velocity data, and extract the mean and standard deviation of each data dimension;
1.4.2 apply Min-Max normalization to the acquired data, and use the Euclidean distance to measure the distance between two samples in the KNN algorithm;
1.4.3 calculate the weight of each feature by the sensitivity method: remove the s-th feature (s = 1, 2, …, l) in turn, classify with the KNN algorithm, and count the total number of samples n and the number of classification errors n_s; compute the error rate U_s = n_s/n, where a larger error indicates a larger contribution of the s-th feature to classification; the weighting factor W_s of the s-th feature is defined as:
W_s = U_s / Σ_{k=1}^{l} U_k
wherein U_s is the classification error rate of the algorithm after removing the s-th feature, and U_k is the classification error rate after removing the k-th feature;
1.4.4 after processing the sensor data, form a data set from walking, going upstairs, going downstairs and the other states in a 1 : 1 : 1 : 1 ratio, and calculate the classification accuracy with the improved weighted KNN algorithm;
1.4.5 reduce the data volume by 50% with the K-means clustering algorithm to satisfy the STM32 memory constraint.
The sampling frequency of the knee joint angle sensor, the weighing sensor and the IMU sensor on the knee joint prosthesis in step 1.1.1 is 100 Hz.
The gait cycle described in step 1.3.1 is divided into a stance phase (foot in contact with the ground) and a swing phase; the foot-contact marker states are heel strike, full foot strike, heel off and toe off.
The IMU sensor in step 1.3.3 is mounted on the lower limb prosthesis below the knee joint, approximately perpendicular to the ground.
In step 1.4.4, "other" refers to sitting, standing, going uphill and going downhill; the improved weighted KNN algorithm calculates the classification accuracy by 10-fold cross-validation: the data set is divided into ten parts, 9 parts serve in turn as training data and 1 part as test data, and the mean accuracy over the 10 runs estimates the algorithm's accuracy.
The invention has the beneficial effects that:
the invention combines a machine learning method based on experience rules, and can realize effective identification of amputee's movement intention: a lower limb movement intention identification method combining empirical rules with machine learning is provided. Because the improved KNN algorithm has the defects of occupying a large amount of memory storage training sets, long calculation time and the like when STM32 is used for real-time prediction, four motion states of sitting, standing, ascending and descending are respectively distinguished by using two groups of experience thresholds, and the prediction time of the improved KNN algorithm can be greatly reduced under the condition of ensuring the classification accuracy. Because the traditional KNN algorithm has the same view on each dimension of characteristics, but actually, the contribution degree of each dimension of characteristics is different, the sensitivity method is used for calculating the weight of each dimension of characteristic quantity, the characteristics with large contribution degree are provided, and a larger weight value is given, so that the aim of improving the accuracy of the algorithm is fulfilled.
Drawings
FIG. 1 is a flow chart of a method for identifying lower limb movement intention based on empirical rules in combination with machine learning;
FIG. 2 is a logic diagram of a knee angle threshold classifier;
FIG. 3 is a diagram of gait event and phase definitions;
FIG. 4 is a logic diagram of the pitch angle threshold classifier;
fig. 5 is a logic diagram of an improved weighted KNN classifier.
Detailed Description
The implementation of the invention is further described below with reference to the drawings, so that those skilled in the art can better understand the invention.
The implementation flow of the lower limb movement intention recognition method based on the empirical rule and combined with machine learning is shown in figure 1, and the method comprises the following steps:
1. Acquire the action data collected by each sensor in the knee joint prosthesis, specifically comprising the following steps:
1.1 using a knee joint angle sensor, a weighing sensor and an IMU sensor placed on a knee joint prosthesis, collect data from 6 amputee subjects during walking, going upstairs, going downstairs, sitting, standing, going uphill and going downhill;
1.2 preprocessing: denoise the acquired data, remove abnormal data, and add a classification label to the normal data;
1.3 analyze and compare the sensor data in different motion states, and summarize the sitting threshold Threshold_Sit, the standing threshold Threshold_Stand, the uphill threshold Threshold_RA and the downhill threshold Threshold_RD;
1.3.1 determine the sitting threshold Threshold_Sit and the standing threshold Threshold_Stand from the knee joint angle value:
Threshold_Sit = 87°
Threshold_Stand = 1°
1.3.2 determine the uphill threshold Threshold_RA and the downhill threshold Threshold_RD from the pitch angle at the full-foot-strike moment:
Threshold_RA = Pitch_W - 0.9*Slope
Threshold_RD = Pitch_W + 0.9*Slope
wherein Pitch_W is the pitch angle at full foot strike during level-ground walking, and Slope is the ramp gradient;
2. As shown in fig. 2, distinguish the sitting and standing states from other motion states using the knee joint angle value, specifically as follows:
2.1 calculate the mean knee joint angle Knee_τ every 4 s;
2.2 if the mean Knee_τ is greater than the sitting threshold Threshold_Sit for a duration t > 4 seconds, the state is judged as sitting; if Knee_τ is less than the standing threshold Threshold_Stand for a duration t > 4 seconds, the state is judged as standing; otherwise, the state is another motion state.
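A minimal sketch of this knee-angle stage, assuming plain Python on offline data: the 100 Hz rate, the 4 s averaging window, the thresholds and the > 4 s persistence condition follow the text, while the function names and structure are illustrative.

```python
import statistics

FS = 100        # sampling frequency in Hz (step 1.1.1)
TH_SIT = 87.0   # Threshold_Sit, degrees
TH_STAND = 1.0  # Threshold_Stand, degrees

def knee_mean_4s(samples):
    """Mean knee angle over the latest 4 s window (400 samples at 100 Hz)."""
    return statistics.fmean(samples[-4 * FS:])

def classify_posture(knee_mean, duration_s):
    """Label one window given its mean angle and how long the threshold
    condition has persisted (the text requires t > 4 s)."""
    if knee_mean > TH_SIT and duration_s > 4:
        return "sitting"
    if knee_mean < TH_STAND and duration_s > 4:
        return "standing"
    return "other"
```

For example, a mean knee angle of 90° held for 5 s is labelled sitting, while the same angle held for only 2 s stays in the "other" state.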
3. The gait events and phases of human motion are defined as shown in fig. 3; the foot-contact marker states are heel strike, full foot strike (foot flat), heel off and toe off. Uphill and downhill are distinguished from other motion states using the pitch angle solved by the IMU sensor at the full-foot-strike moment, specifically as follows:
3.1 using the STM32 microcontroller, take the difference between two consecutive heel-strike times to obtain the duration T of a complete gait cycle:
T = t_1HS - t_0HS
wherein t_0HS is the previous heel-strike time and t_1HS is the current heel-strike time;
3.2 full foot strike generally occurs at around 25% of the gait cycle, so its moment is estimated as the detected heel-strike time plus 25% of the gait cycle time:
t_FF = t_HS + 0.25*T
wherein t_HS is the detected heel-strike time and T is the gait cycle time from step 3.1;
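The timing computation in steps 3.1 and 3.2 can be sketched as follows; the function names are illustrative.

```python
# Gait timing: the cycle duration T is the difference of two consecutive
# heel-strike times, and full foot strike is estimated at 25% of the cycle
# after the current heel strike.

def gait_cycle_time(t_hs_prev, t_hs_curr):
    """T = t_1HS - t_0HS, the duration of one complete gait cycle."""
    return t_hs_curr - t_hs_prev

def foot_flat_time(t_hs, cycle_time):
    """t_FF = t_HS + 0.25 * T."""
    return t_hs + 0.25 * cycle_time
```

Heel strikes at 10.0 s and 11.2 s, for instance, give T = 1.2 s and an estimated full-foot-strike moment of 11.5 s.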
3.3 as shown in fig. 4, compare the pitch angle of the IMU sensor at full foot strike with the uphill threshold Threshold_RA and the downhill threshold Threshold_RD, and judge whether the state is uphill, downhill or another motion state.
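The threshold comparison of this stage can be sketched as below, using the thresholds from step 1.3.2. Which side of the level-ground pitch Pitch_W corresponds to ascent versus descent is an assumption here; the patent only states that the comparison decides among uphill, downhill and other.

```python
# Pitch-threshold stage: Threshold_RA = Pitch_W - 0.9*Slope and
# Threshold_RD = Pitch_W + 0.9*Slope (step 1.3.2). The comparison
# directions below are assumptions for illustration.

def ramp_thresholds(pitch_walk, slope_deg):
    """Thresholds derived from the level-ground pitch and the ramp slope."""
    return pitch_walk - 0.9 * slope_deg, pitch_walk + 0.9 * slope_deg

def classify_ramp(pitch_at_ff, th_ra, th_rd):
    """Compare the pitch at full foot strike against both thresholds."""
    if pitch_at_ff < th_ra:
        return "ramp_ascent"
    if pitch_at_ff > th_rd:
        return "ramp_descent"
    return "other"
```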
4. As shown in fig. 5, distinguish the three motion states of walking, going upstairs and going downstairs from the other motion states using the improved weighted KNN classification algorithm, specifically comprising the following steps:
4.1 segment the sensor data with a 200 ms fixed time window; collect knee joint angle, pressure, X/Y/Z-axis acceleration and X/Y/Z-axis angular velocity data, and extract the mean and standard deviation of each data dimension;
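This windowed feature extraction can be sketched as follows. At the 100 Hz sampling rate of step 1.1.1, a 200 ms window holds 20 samples; eight channels with a mean and standard deviation each give a 16-dimensional feature vector. The function name is illustrative.

```python
import statistics

WINDOW = 20  # samples per 200 ms window at 100 Hz

def extract_features(window_samples):
    """window_samples: list of 8-element readings per sample
    (knee angle, pressure, 3-axis acceleration, 3-axis angular velocity).
    Returns the 16-dimensional (mean, std) feature vector of the window."""
    features = []
    for ch in range(8):
        column = [row[ch] for row in window_samples]
        features.append(statistics.fmean(column))
        features.append(statistics.pstdev(column))
    return features
```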
4.2 apply Min-Max normalization to the acquired data, and use the Euclidean distance to measure the distance between two samples in the KNN algorithm;
4.3 calculate the weight of each feature by the sensitivity method: remove the s-th feature (s = 1, 2, …, l) in turn, classify with the KNN algorithm, and count the total number of samples n and the number of classification errors n_s; compute the error rate U_s = n_s/n, where a larger error indicates a larger contribution of the s-th feature to classification; the weighting factor W_s of the s-th feature is defined as:
W_s = U_s / Σ_{k=1}^{l} U_k
wherein U_s is the classification error rate of the algorithm after removing the s-th feature, and U_k is the classification error rate after removing the k-th feature;
4.4 after processing the sensor data, form a data set from walking, going upstairs, going downstairs and the others (sitting, standing, going uphill and going downhill) in a 1 : 1 : 1 : 1 ratio. Calculate the classification accuracy with the improved weighted KNN algorithm; experimental results show that the feature weights computed by the sensitivity method effectively improve the accuracy of the algorithm, by about 3%;
4.5 reduce the data volume by 50% with the K-means clustering algorithm, and calculate the classification accuracy again with the improved weighted KNN algorithm.
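The training-set condensation of step 4.5 can be sketched as follows: replace the stored samples with K-means centroids, with k set to half the sample count to match the ~50% reduction the patent reports. The patent does not detail the procedure, so this pure-Python Lloyd's iteration (applied per class in practice; shown here on one sample set) is an assumption; offline, a library implementation would normally be used.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's algorithm; points are equal-length feature lists."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for p in points:
            # assign p to its nearest centroid (squared Euclidean distance)
            j = min(range(k), key=lambda c: sum((a - b) ** 2
                    for a, b in zip(p, centroids[c])))
            buckets[j].append(p)
        for j, bucket in enumerate(buckets):
            if bucket:  # empty clusters keep their old centroid
                centroids[j] = [sum(col) / len(bucket)
                                for col in zip(*bucket)]
    return centroids

def condense(samples, reduction=0.5):
    """Shrink a training set to `reduction` of its size via centroids."""
    k = max(1, int(len(samples) * reduction))
    return kmeans(samples, k)
```

The KNN then searches only the centroids, halving both the memory footprint and the per-query distance computations on the STM32.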
Claims (5)
1. A lower limb movement intention recognition method based on empirical rules combined with machine learning, characterized by comprising the following steps:
1.1 acquiring action data acquired by each sensor in the knee joint prosthesis, comprising the following steps:
1.1.1 using a knee joint angle sensor, a weighing sensor and an IMU sensor placed on a knee joint prosthesis, collect data from 6 amputee subjects during walking, going upstairs, going downstairs, sitting, standing, going uphill and going downhill;
1.1.2 preprocessing: denoise the acquired data, remove abnormal data, and add a classification label to the normal data;
1.1.3 analyze and compare the sensor data under different motion states, and summarize the sitting threshold Threshold_Sit, the standing threshold Threshold_Stand, the uphill threshold Threshold_RA and the downhill threshold Threshold_RD;
1.2 use the value of the knee joint angle sensor to distinguish the sitting and standing states from other motion states, comprising the following steps:
1.2.1 calculate the mean knee joint angle Knee_τ every 4 s;
1.2.2 compare Knee_τ with the sitting threshold Threshold_Sit and the standing threshold Threshold_Stand, and judge whether the state is sitting, standing or another motion state;
1.3 use the pitch angle solved by the IMU sensor at the full-foot-strike moment to distinguish uphill and downhill from other motion states, comprising the following steps:
1.3.1 using the STM32 microcontroller, take the difference between two consecutive heel-strike times to obtain the duration T of a complete gait cycle:
T = t_1HS - t_0HS
wherein t_0HS is the previous heel-strike time and t_1HS is the current heel-strike time;
1.3.2 estimate the moment of full foot strike:
t_FF = t_HS + 0.25*T
wherein t_HS is the detected heel-strike time, T is the gait cycle time from step 1.3.1, and the factor 0.25 reflects that full foot strike occurs at about 25% of the gait cycle;
1.3.3 compare the pitch angle of the IMU sensor at full foot strike with the uphill threshold Threshold_RA and the downhill threshold Threshold_RD, and judge whether the state is uphill, downhill or another motion state;
1.4 use the improved weighted KNN classification algorithm to distinguish the three motion states of walking, going upstairs and going downstairs from the other motion states, comprising the following steps:
1.4.1 segment the sensor data with a 200 ms fixed time window; collect knee joint angle, pressure, X/Y/Z-axis acceleration and X/Y/Z-axis angular velocity data, and extract the mean and standard deviation of each data dimension;
1.4.2 apply Min-Max normalization to the acquired data, and use the Euclidean distance to measure the distance between two samples in the KNN algorithm;
1.4.3 calculate the weight of each feature by the sensitivity method: remove the s-th feature (s = 1, 2, …, l) in turn, classify with the KNN algorithm, and count the total number of samples n and the number of classification errors n_s; compute the error rate U_s = n_s/n, where a larger error indicates a larger contribution of the s-th feature to classification; the weighting factor W_s of the s-th feature is defined as:
W_s = U_s / Σ_{k=1}^{l} U_k
wherein U_s is the classification error rate of the algorithm after removing the s-th feature, and U_k is the classification error rate after removing the k-th feature;
1.4.4 after processing the sensor data, form a data set from walking, going upstairs, going downstairs and the other states in a 1 : 1 : 1 : 1 ratio, and calculate the classification accuracy with the improved weighted KNN algorithm;
1.4.5 reduce the data volume by 50% with the K-means clustering algorithm to satisfy the STM32 memory constraint.
2. The lower limb movement intention recognition method based on empirical rules combined with machine learning according to claim 1, characterized in that: the sampling frequency of the knee joint angle sensor, the weighing sensor and the IMU sensor on the knee joint prosthesis in step 1.1.1 is 100 Hz.
3. The lower limb movement intention recognition method based on empirical rules combined with machine learning according to claim 1, characterized in that: the gait cycle described in step 1.3.1 is divided into a stance phase (foot in contact with the ground) and a swing phase; the foot-contact marker states are heel strike, full foot strike, heel off and toe off.
4. The lower limb movement intention recognition method based on empirical rules combined with machine learning according to claim 1, characterized in that: the IMU sensor in step 1.3.3 is mounted on the lower limb prosthesis below the knee joint, approximately perpendicular to the ground.
5. The lower limb movement intention recognition method based on empirical rules combined with machine learning according to claim 1, characterized in that: in step 1.4.4, "other" refers to sitting, standing, going uphill and going downhill; the improved weighted KNN algorithm calculates the classification accuracy by 10-fold cross-validation: the data set is divided into ten parts, 9 parts serve in turn as training data and 1 part as test data, and the mean accuracy over the 10 runs estimates the algorithm's accuracy.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111330071.4A CN114041783B (en) | 2021-11-11 | Lower limb movement intention recognition method based on combination of experience rules and machine learning |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111330071.4A CN114041783B (en) | 2021-11-11 | Lower limb movement intention recognition method based on combination of experience rules and machine learning |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114041783A true CN114041783A (en) | 2022-02-15 |
CN114041783B CN114041783B (en) | 2024-04-26 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114831627A (en) * | 2022-03-17 | 2022-08-02 | 吉林大学 | Lower limb prosthesis movement identification method based on three decision trees |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050033200A1 (en) * | 2003-08-05 | 2005-02-10 | Soehren Wayne A. | Human motion identification and measurement system and method |
CN110742712A (en) * | 2019-11-05 | 2020-02-04 | 哈工大机器人湖州国际创新研究院 | Artificial limb movement intention identification method and device based on source end fusion |
CN112754468A (en) * | 2021-01-07 | 2021-05-07 | 华南理工大学 | Human body lower limb movement detection and identification method based on multi-source signals |
CN113011458A (en) * | 2021-02-19 | 2021-06-22 | 华南理工大学 | Load-maneuvering exoskeleton human motion intention identification method and exoskeleton system |
CN113314209A (en) * | 2021-06-11 | 2021-08-27 | 吉林大学 | Human body intention identification method based on weighted KNN |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050033200A1 (en) * | 2003-08-05 | 2005-02-10 | Soehren Wayne A. | Human motion identification and measurement system and method |
CN110742712A (en) * | 2019-11-05 | 2020-02-04 | 哈工大机器人湖州国际创新研究院 | Artificial limb movement intention identification method and device based on source end fusion |
CN112754468A (en) * | 2021-01-07 | 2021-05-07 | 华南理工大学 | Human body lower limb movement detection and identification method based on multi-source signals |
CN113011458A (en) * | 2021-02-19 | 2021-06-22 | 华南理工大学 | Load-maneuvering exoskeleton human motion intention identification method and exoskeleton system |
CN113314209A (en) * | 2021-06-11 | 2021-08-27 | 吉林大学 | Human body intention identification method based on weighted KNN |
Non-Patent Citations (1)
Title |
---|
SUN Wei; YANG Yihan; SUN Feng; JIANG Wei: "Identification of pedestrian motion features on mobile terminals based on the nearest-neighbor rule", Journal of Chinese Inertial Technology, no. 02, 15 April 2018 (2018-04-15), pages 97-101 *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114831627A (en) * | 2022-03-17 | 2022-08-02 | 吉林大学 | Lower limb prosthesis movement identification method based on three decision trees |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112754468B (en) | Human body lower limb movement detection and identification method based on multi-source signals | |
CN109166275B (en) | Human body falling detection method based on acceleration sensor | |
Xu et al. | Real-time on-board recognition of continuous locomotion modes for amputees with robotic transtibial prostheses | |
Sant’Anna et al. | A symbol-based approach to gait analysis from acceleration signals: Identification and detection of gait events and a new measure of gait symmetry | |
Song et al. | Speed estimation from a tri-axial accelerometer using neural networks | |
CN104983489B (en) | Road conditions recognition methods during artificial leg walking | |
EP3459453B1 (en) | Information processing device, information processing method, and information processing program | |
CN111611859B (en) | Gait recognition method based on GRU | |
CN103345626A (en) | Intelligent wheelchair static gesture identification method | |
Ryu et al. | sEMG-signal and IMU sensor-based gait sub-phase detection and prediction using a user-adaptive classifier | |
CN113850104A (en) | Motion pattern recognition method for limbs | |
CN110977961A (en) | Motion information acquisition system of self-adaptive power-assisted exoskeleton robot | |
Song et al. | Adaptive neural fuzzy reasoning method for recognizing human movement gait phase | |
CN112263244A (en) | Gait-based fatigue degree evaluation system and method | |
CN112137779A (en) | Intelligent artificial limb and mode judgment method thereof | |
Zheng et al. | Locomotion mode recognition with robotic transtibial prosthesis in inter-session and inter-day applications | |
Hu et al. | A novel fusion strategy for locomotion activity recognition based on multimodal signals | |
Eskofier et al. | Pattern classification of foot strike type using body worn accelerometers | |
CN114041783B (en) | Lower limb movement intention recognition method based on combination of experience rules and machine learning | |
CN114041783A (en) | Lower limb movement intention identification method based on empirical rule combined with machine learning | |
CN107007285B (en) | Fall detection method based on pressure and acceleration transducer | |
CN103632133A (en) | Human gesture recognition method | |
KR102280291B1 (en) | Apparatus and method for identify patients with parkinson's disease and patients with podarthritis by performing neural network analysis by various detection information | |
CN116999057A (en) | Hemiplegia gait recognition and hemiplegia gait evaluation method based on wearable sensor | |
Baptista et al. | new approach of cycling phases detection to improve FES-pedaling in SCI individuals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |