CN112057834A - Rehabilitation action standard judging method based on sensor - Google Patents
Rehabilitation action standard judging method based on sensor
- Publication number
- CN112057834A (application CN202010944378.2A)
- Authority
- CN
- China
- Prior art keywords
- sensor
- action
- rehabilitation
- evaluated
- sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4836—Diagnosis combined with treatment in closed-loop systems or methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24147—Distances to closest patterns, e.g. nearest neighbour classification
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Data Mining & Analysis (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Epidemiology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Physical Education & Sports Medicine (AREA)
- Surgery (AREA)
- Theoretical Computer Science (AREA)
- Molecular Biology (AREA)
- Primary Health Care (AREA)
- Evolutionary Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Computational Biology (AREA)
- Physiology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Rehabilitation Tools (AREA)
Abstract
The invention discloses a sensor-based upper limb rehabilitation action standard evaluation method. Rehabilitation training is first divided into N levels according to the accuracy of completion, and training at each level produces a sample set. The action sequence of each variable collected by a sensor during the patient's rehabilitation training is then taken as the action sequence to be evaluated; a DTW algorithm calculates the standardized distance between the sequence to be evaluated for each variable and the corresponding sequence of each sample in the sample set, and the standardized distances of the variables are summed. Finally, the summed standardized distances are input into a KNN classifier, and the class held by the majority of the first K samples with the smallest standardized distance to the sequence to be evaluated is the class of the action to be evaluated. The sensor collects the patient's rehabilitation training data in real time and accurately, so a doctor does not need to analyze the standard of each rehabilitation action individually, which reduces the doctor's workload and the medical cost.
Description
Technical field:
The invention belongs to the technical field of rehabilitation training, and particularly relates to a sensor-based rehabilitation action standard judging method.
Background art:
Limb rehabilitation (physical rehabilitation) refers to treatment methods that improve movement disorders through active exercise; its main contents include joint mobility training, muscle strengthening training and posture correction training. Limb rehabilitation action training uses purposeful, selected activities as the treatment means to improve and assist the patient's limb function; it aims to maximally restore the patient's abilities in daily self-care, work and leisure, to improve quality of life, and is an ideal route for the patient to return to family and society. Because limb rehabilitation requires professional guidance from doctors and patients must travel to specialized rehabilitation institutions, rehabilitation training is constrained by time and space and consumes a large amount of doctor resources. Under the current shortage of doctors, many patients can only train blindly and receive no feedback on whether their rehabilitation actions are standard, which reduces the rehabilitation effect and slows progress. Therefore, in view of today's high medical costs, scarce doctor resources and unclear rehabilitation standards, research on judging the accuracy of limb rehabilitation actions provides later-stage rehabilitation support for patients with limb dysfunction and is of great importance to modern rehabilitation medicine.
Patent CN110675934A discloses a remote rehabilitation training system for patients with limb movement disorders, which collects limb movement data in three-dimensional space through an inertial sensor, constructs a rehabilitation movement track from these data, evaluates the patient's completion of the rehabilitation movement (for example, by scoring) and gives correction suggestions. However, it does not specify how the completion degree is evaluated. As a further study of this subject, the present invention analyzes the rehabilitation action sequence with a DTW-KNN algorithm and makes a classified judgment on the accuracy of the rehabilitation action.
Summary of the invention:
The invention aims to overcome the above defects in the prior art by designing a sensor-based rehabilitation action standard judging method. By collecting real-time data of a rehabilitation action and judging it against the standard action, the rehabilitation action can be further improved and optimized, effectively raising the rehabilitation efficiency of patients with limb dysfunction and solving the problem that rehabilitation actions of such patients are currently difficult to judge accurately.
In order to achieve the purpose, the invention relates to a sensor-based upper limb rehabilitation action standard judging method, which specifically comprises the following steps:
s1, dividing rehabilitation training into N levels according to accuracy of completion degree, wherein each level is trained for M times, and acquiring action data of each training through a sensor in the training process, wherein the action data comprises an action sequence consisting of two-dimensional arrays of a plurality of variables changing along with time, and each training is used as a sample to form a sample set comprising N multiplied by M samples;
s2, the action sequence of each variable in the patient rehabilitation training process acquired through the sensor is used as an action sequence to be evaluated, the standardized distance between the action sequence to be evaluated corresponding to each variable and the action sequence corresponding to each sample in the sample set is calculated through a DTW algorithm, the sum of the standardized distances corresponding to a plurality of variables is obtained, and the similarity degree between the action of the patient and the action corresponding to each group of samples in the sample set is evaluated;
s3, inputting the sum of the obtained standardized distances into a KNN classifier, wherein the maximum class of the first K samples with the minimum standardized distance with the action sequence to be evaluated in the rehabilitation training process of the patient is the class of the action to be evaluated;
specifically, the sensor includes one of an acceleration sensor, a gyroscope, and a direction sensor, wherein the acceleration sensor measures three-axis acceleration values ax, ay, az of the sensor, angular velocity values ω x, ω y, ω z of the sensor measured by the gyroscope about three axes, and angles β (pitch angle), γ (course angle), and α (roll angle) of the sensor rotation measured by the direction sensor, and thus, the motion sequence is a set of sequence points where any one of variables of ax, ay, az, ω x, ω y, ω z, β, γ, and α changes with time.
Further, the sensor may be a sensor on an existing smart device.
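As a concrete illustration of steps S1 to S3, the DTW distance computation and the KNN classification can be sketched in a few lines of Python. This is a minimal sketch under assumed data layouts; the function names, the flat list-of-tuples sample set and the toy sequences are illustrative, not part of the patent:

```python
from collections import Counter

def dtw_distance(x, y):
    # Cumulative-distance DTW: M(i, j) = |x_i - y_j|, and
    # Mc(i, j) = min(Mc(i-1, j-1), Mc(i-1, j), Mc(i, j-1)) + M(i, j).
    n, m = len(x), len(y)
    inf = float("inf")
    mc = [[inf] * (m + 1) for _ in range(n + 1)]
    mc[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            mc[i][j] = min(mc[i - 1][j - 1], mc[i - 1][j], mc[i][j - 1]) + d
    return mc[n][m]  # standardized distance between the two sequences

def classify_action(variables_to_evaluate, samples, k):
    # samples: list of (variable_sequences, label) pairs. The distance to a
    # sample is the sum of DTW distances over all variables (step S2); the
    # label held by most of the k nearest samples is returned (step S3).
    dists = []
    for sample_vars, label in samples:
        total = sum(dtw_distance(v, s)
                    for v, s in zip(variables_to_evaluate, sample_vars))
        dists.append((total, label))
    dists.sort(key=lambda t: t[0])
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

Because the distance is accumulated along a warping path rather than computed point-by-point, sequences of different lengths can still be compared, which is exactly why DTW replaces the Euclidean distance inside KNN here.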
Compared with the prior art, the invention has the following beneficial effects: (1) the accumulated distance between the patient's rehabilitation action sequence and a sample's rehabilitation action sequence is calculated by a DTW algorithm and replaces the Euclidean distance commonly used in the KNN algorithm, solving the problem that a Euclidean distance cannot be computed directly between action sequences; (2) the rehabilitation action sequence is analyzed by the DTW-KNN algorithm and the accuracy of the rehabilitation action is classified, so the patient can correct the action according to the feedback, which guarantees the training effect, improves rehabilitation efficiency and accelerates the patient's recovery; (3) the sensor collects the patient's rehabilitation training data in real time and accurately, so doctors no longer need to analyze each rehabilitation action individually, reducing their workload, the medical cost and the economic burden on patients' families.
Description of the drawings:
FIG. 1 is a schematic diagram of the KNN classification algorithm according to the present invention.
FIG. 2 is a chart of the course angle sequence to be evaluated and the course angle sample sequence.
Detailed description of embodiments:
The invention is further illustrated by the following specific examples in combination with the accompanying drawings.
Example 1:
a sensor-based upper limb rehabilitation action standard judging method specifically comprises the following steps:
S1, dividing rehabilitation training into N levels according to the accuracy of completion, training M times at each level, and collecting the action data of each training session through a sensor; the action data comprise an action sequence consisting of two-dimensional arrays of a plurality of variables changing over time, and each training session serves as one sample, forming a sample set of N×M samples.
For example, each rehabilitation action is trained 90 times, at three levels (good, general and poor) according to the accuracy of completion, 30 times per level; the action sequence of each training session is collected by the sensor, and each session serves as one sample, forming a sample set of 90 samples.
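Such a sample set can be represented as labelled sequences. The sketch below simulates the acquisition step; the names, the simulated angle values and the noise levels are illustrative assumptions rather than the patent's data format:

```python
import random

# Hypothetical construction of the sample set from step S1: three levels
# ("good", "general", "poor"), 30 training sessions per level, each session
# recorded as one time series per variable (only one variable is simulated
# here for brevity).
LEVELS = ["good", "general", "poor"]
SESSIONS_PER_LEVEL = 30

def record_session(level):
    # Placeholder for real sensor acquisition: returns one variable's time
    # series for a training session of the given quality level, with more
    # deviation from the nominal trajectory at lower levels.
    noise = {"good": 1.0, "general": 3.0, "poor": 6.0}[level]
    return [10.0 + random.uniform(-noise, noise) for _ in range(50)]

sample_set = [
    ((record_session(level),), level)
    for level in LEVELS
    for _ in range(SESSIONS_PER_LEVEL)
]
print(len(sample_set))  # 90 samples, as in the example
```

Each element pairs a tuple of per-variable sequences with its level label, so the set can be fed directly to a DTW-KNN classifier.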
S2, taking the action sequence of each variable collected by the sensor during the patient's rehabilitation training as the action sequence to be evaluated, calculating with a DTW algorithm the standardized distance between the sequence to be evaluated for each variable and the corresponding sequence of each sample in the sample set, and summing the standardized distances over the variables to evaluate the similarity between the patient's action and the action of each sample in the sample set.
S3, inputting the summed standardized distances into a KNN classifier; among the first K samples whose standardized distance to the sequence to be evaluated is smallest, the class held by the most samples is the class of the action to be evaluated.
Preferably, in order to ensure classification accuracy and reduce the chance of tied votes among the three categories, K is an odd number such as 7, 9 or 11.
The K-nearest-neighbor (KNN) classification algorithm is a theoretically mature method. Its idea is: in the feature space, if the majority of the K samples nearest to a given sample belong to a certain class, the sample is assigned to that class. As shown in fig. 1, circles, triangles and squares correspond to good, general and poor samples respectively; the hollow circle at the origin is the patient's rehabilitation training action, and the samples are distributed around it according to their standardized distances. When K is 9, there are 8 triangles and 1 square among the nearest samples, so the patient's action is rated general; when K is 11 the triangles still hold the majority, and the action is again rated general.
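The majority vote described above takes only a few lines. The distances below are invented for illustration, loosely mirroring the triangle majority of fig. 1; they are not measured data:

```python
from collections import Counter

def knn_vote(distance_label_pairs, k):
    # Sort samples by their standardized distance to the patient's action
    # and let the k nearest samples vote; the majority class wins.
    nearest = sorted(distance_label_pairs, key=lambda t: t[0])[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Illustrative distances (assumed values): mostly "general" samples close
# to the origin, with a few "poor" and "good" samples farther away.
neighbours = (
    [(d, "general") for d in (1.0, 1.2, 1.5, 1.8, 2.0, 2.2, 2.5, 2.8)]
    + [(2.6, "poor"), (3.0, "poor"), (3.5, "good")]
)
print(knn_vote(neighbours, 9))   # 8 general + 1 poor among the 9 nearest
print(knn_vote(neighbours, 11))  # "general" still holds the majority
```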
Specifically, the sensor includes one of an acceleration sensor, a gyroscope and a direction sensor: the acceleration sensor measures the three-axis acceleration values ax, ay, az; the gyroscope measures the angular velocity values ωx, ωy, ωz of rotation about the three axes; and the direction sensor measures the rotation angles β (pitch angle), γ (course angle) and α (roll angle). The action sequence is therefore the set of sequence points of any one of the variables ax, ay, az, ωx, ωy, ωz, β, γ and α changing over time. Fig. 2 shows the course angle sequence to be evaluated and a course angle sample sequence, where the test action is the sequence to be evaluated and the class A template action is the sample sequence.
Further, the sensor may be a sensor on an existing smart device, such as a cell phone.
The specific processing process of the DTW algorithm comprises the following steps:
s201, according to the euclidean distance d (i, j) ═ f (X1.. xn) between the sequence points in the patient rehabilitation motion sequence Y (Y1.... ym) and the rehabilitation motion sequence X (X1.. xn) of the samplei,yi) The sequence distance matrix M is obtained when the sequence distance matrix M is more than or equal to 0;
s202, generating a cumulative distance matrix Mc, where Mc (i, j) ═ Min (Mc (i-1, j-1), Mc (i-1, j), Mc (i, j-1)) + M (i, j), according to the sequence distance matrix, where i is greater than or equal to 1 and less than or equal to N, j is greater than or equal to 1 and less than or equal to M, and the last row in the cumulative distance matrix Mc, and the data corresponding to the last column is the distance between two sequences, which is also called a normalized distance.
For example, X: (3, 5, 6, 7, 7, 1); y: (3,6,6,7,8,1,1). The distance matrix M between X and Y is shown in table 1 below and the cumulative distance matrix Mc is shown in table 2 below.
TABLE 1
X/Y | 3 | 6 | 6 | 7 | 8 | 1 | 1 |
3 | 0 | 3 | 3 | 4 | 5 | 2 | 2 |
5 | 2 | 1 | 1 | 2 | 3 | 4 | 4 |
6 | 3 | 0 | 0 | 1 | 2 | 5 | 5 |
7 | 4 | 1 | 1 | 0 | 1 | 6 | 6 |
7 | 4 | 1 | 1 | 0 | 1 | 6 | 6 |
1 | 2 | 5 | 5 | 6 | 7 | 0 | 0 |
TABLE 2
X/Y | 3 | 6 | 6 | 7 | 8 | 1 | 1 |
3 | 0 | 3 | 6 | 10 | 15 | 17 | 19 |
5 | 2 | 1 | 2 | 4 | 7 | 11 | 15 |
6 | 5 | 1 | 1 | 2 | 4 | 9 | 14 |
7 | 9 | 2 | 2 | 1 | 2 | 8 | 14 |
7 | 13 | 3 | 3 | 1 | 2 | 8 | 14 |
1 | 15 | 8 | 8 | 7 | 8 | 2 | 2 |
The entry in the last row and last column, Mc(6, 7) = 2, is the standardized distance between X and Y.
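The worked example can be reproduced programmatically. The sketch below rebuilds the sequence distance matrix of Table 1 and the cumulative distance matrix from the recurrence of step S202 (the function name is illustrative):

```python
def dtw_matrices(x, y):
    # Build the sequence distance matrix M (step S201) and the cumulative
    # distance matrix Mc (step S202) for two action sequences.
    n, m = len(x), len(y)
    dist = [[abs(xi - yj) for yj in y] for xi in x]
    cum = [[0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                prev = 0
            elif i == 0:
                prev = cum[0][j - 1]          # first row: accumulate left
            elif j == 0:
                prev = cum[i - 1][0]          # first column: accumulate up
            else:
                prev = min(cum[i - 1][j - 1], cum[i - 1][j], cum[i][j - 1])
            cum[i][j] = prev + dist[i][j]
    return dist, cum

X = [3, 5, 6, 7, 7, 1]
Y = [3, 6, 6, 7, 8, 1, 1]
M, Mc = dtw_matrices(X, Y)
print(M[0])        # first row of Table 1: [0, 3, 3, 4, 5, 2, 2]
print(Mc[-1][-1])  # standardized distance between X and Y: 2
```

The bottom-right entry of the cumulative matrix is 2, i.e. the two sequences are very similar despite having different lengths.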
Claims (3)
1. A sensor-based upper limb rehabilitation action standard judging method is characterized by comprising the following steps:
s1, dividing rehabilitation training into N levels according to accuracy of completion degree, wherein each level is trained for M times, and acquiring action data of each training through a sensor in the training process, wherein the action data comprises an action sequence consisting of two-dimensional arrays of a plurality of variables changing along with time, and each training is used as a sample to form a sample set comprising N multiplied by M samples;
s2, the action sequence of each variable in the patient rehabilitation training process acquired through the sensor is used as an action sequence to be evaluated, the standardized distance between the action sequence to be evaluated corresponding to each variable and the action sequence corresponding to each sample in the sample set is calculated through a DTW algorithm, the sum of the standardized distances corresponding to a plurality of variables is obtained, and the similarity degree between the action of the patient and the action corresponding to each group of samples in the sample set is evaluated;
and S3, inputting the sum of the obtained standardized distances into the KNN classifier, wherein the maximum class of the first K samples with the minimum standardized distance with the action sequence to be evaluated in the rehabilitation training process of the patient belongs to, and the maximum class is the class of the action to be evaluated.
2. The sensor-based upper limb rehabilitation action standard judging method according to claim 1, wherein the sensor comprises one of an acceleration sensor, a gyroscope and a direction sensor, the acceleration sensor measuring the three-axis acceleration values ax, ay, az, the gyroscope measuring the angular velocity values ωx, ωy, ωz of rotation about the three axes, and the direction sensor measuring the rotation angles β (pitch angle), γ (course angle) and α (roll angle), so that the action sequence is a set of sequence points of any one of the variables ax, ay, az, ωx, ωy, ωz, β, γ and α changing over time.
3. The sensor-based upper limb rehabilitation action standard evaluation method according to claim 2, wherein the sensor can be a sensor on an existing smart device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010944378.2A CN112057834A (en) | 2020-09-10 | 2020-09-10 | Rehabilitation action standard judging method based on sensor |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112057834A true CN112057834A (en) | 2020-12-11 |
Family
ID=73663401
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010944378.2A Pending CN112057834A (en) | 2020-09-10 | 2020-09-10 | Rehabilitation action standard judging method based on sensor |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112057834A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105496418A (en) * | 2016-01-08 | 2016-04-20 | 中国科学技术大学 | Arm-belt-type wearable system for evaluating upper limb movement function |
CN106175781A (en) * | 2016-08-25 | 2016-12-07 | 歌尔股份有限公司 | Utilize method and the wearable device of wearable device monitoring swimming state |
CN106295544A (en) * | 2016-08-04 | 2017-01-04 | 山东师范大学 | A kind of unchanged view angle gait recognition method based on Kinect |
CN108371545A (en) * | 2018-02-02 | 2018-08-07 | 西北工业大学 | A kind of human arm action cognitive method based on Doppler radar |
WO2020018463A1 (en) * | 2018-07-14 | 2020-01-23 | Mars, Incorporated | Biomarkers and test models for chronic kidney disease |
Non-Patent Citations (1)
Title |
---|
Li Linfeng: "Motion pattern recognition based on wrist triaxial acceleration", China Master's Theses Full-text Database, Information Science and Technology Series *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115543087A (en) * | 2022-10-14 | 2022-12-30 | 广州强基信息技术有限公司 | Artificial intelligence scoring method for virtual environment skill practice |
CN115543087B (en) * | 2022-10-14 | 2023-04-07 | 广州强基信息技术有限公司 | Artificial intelligence scoring method for virtual environment skill practice |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104573665B (en) | A kind of continuous action recognition methods based on improvement viterbi algorithm | |
US7404774B1 (en) | Rule based body mechanics calculation | |
CN101499168B (en) | Structured light strip center extraction method based on ridge line tracing and Hessian matrix | |
CN108721870B (en) | Exercise training evaluation method based on virtual environment | |
CN107330249A (en) | A kind of Parkinsonian symptoms area of computer aided method of discrimination based on KINECT skeleton datas | |
CN112057834A (en) | Rehabilitation action standard judging method based on sensor | |
CN106228567A (en) | A kind of vertebra characteristic point automatic identifying method based on mean curvature flow | |
CN111259716A (en) | Human body running posture identification and analysis method and device based on computer vision | |
Sim et al. | Analysis of pelvis-thorax coordination patterns of professional and amateur golfers during golf swing | |
CN105631899A (en) | Ultrasonic image motion object tracking method based on gray-scale texture feature | |
CN107301409A (en) | Learn the system and method for processing electrocardiogram based on Wrapper feature selectings Bagging | |
CN1135492C (en) | Handwriting verification device | |
CN105844096A (en) | Hand function evaluation method based on image processing technology | |
Bosch et al. | Analysis of indoor rowing motion using wearable inertial sensors | |
CN113974612A (en) | Automatic assessment method and system for upper limb movement function of stroke patient | |
CN105069766B (en) | A kind of an inscription on a tablet restorative procedure based on the description of Chinese character image contour feature | |
CN112990089A (en) | Method for judging human motion posture | |
CN114092854A (en) | Intelligent rehabilitation auxiliary training system for spinal degenerative disease based on deep learning | |
Shi et al. | ROI detection of hand bone based on YOLO V3 | |
Li et al. | Path planning for a cable-driven parallel waist rehabilitation robot based on discriminant analysis model | |
CN114642588B (en) | Control method, device and system of rehabilitation robot | |
Li et al. | Research on multi-dimensional intelligent quantitative assessment of upper limb function based on kinematic parameters | |
CN117011944B (en) | Gait recognition correction method and system based on deep learning | |
CN113459158B (en) | Exoskeleton assistance efficiency evaluation method and device and computer readable storage medium | |
CN117577332B (en) | Rehabilitation evaluation method and system based on visual analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20201211 |
|