CN116869490A - Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence - Google Patents


Info

Publication number
CN116869490A
CN116869490A (application number CN202311155294.0A)
Authority
CN
China
Prior art keywords: training, module, patient, rehabilitation training, somatosensory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311155294.0A
Other languages
Chinese (zh)
Other versions
CN116869490B (en)
Inventor
区永康
蔡跃新
唐小武
刘鹏
黄才祥
周文管
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shurui Medical Technology Co ltd
Sun Yat Sen Memorial Hospital Sun Yat Sen University
Original Assignee
Guangzhou Shurui Medical Technology Co ltd
Sun Yat Sen Memorial Hospital Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shurui Medical Technology Co ltd, Sun Yat Sen Memorial Hospital Sun Yat Sen University filed Critical Guangzhou Shurui Medical Technology Co ltd
Priority to CN202311155294.0A priority Critical patent/CN116869490B/en
Publication of CN116869490A publication Critical patent/CN116869490A/en
Application granted granted Critical
Publication of CN116869490B publication Critical patent/CN116869490B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A61B5/4848 Monitoring or testing the effects of treatment, e.g. of medication
    • A61B3/028 Subjective types for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B3/113 Objective types for determining or recording eye movement
    • A61B5/1036 Measuring load distribution, e.g. podologic studies
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A61B5/4023 Evaluating sense of balance
    • A61H5/00 Exercisers for the eyes
    • A61M21/00 Other devices or methods to cause a change in the state of consciousness
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G16H20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising


Abstract

The application discloses an artificial-intelligence-based dynamic evaluation system for vestibular rehabilitation training, which relates to the field of rehabilitation training and comprises the following components: a user login module, through which the patient logs in to the training system to customize personalized vestibular rehabilitation training; a training action input module, used to input training actions and standard action videos; and an illness state evaluation module, used to evaluate the patient's illness state according to the standard action videos stored in the training action input module. The system collects the patient's plantar pressure information through a plantar pressure sensor, so that the plantar force distribution can be analyzed while the patient performs a standard action and the training effect can be evaluated indirectly. An inertial measurement device measures the patient's limb movement angle and the somatosensory angle during somatosensory training, so that the system can detect whether the patient's actions are standard during rehabilitation training.

Description

Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence
Technical Field
The application relates to rehabilitation training technology, and in particular to an artificial-intelligence-based dynamic evaluation system for vestibular rehabilitation training.
Background
The vestibular system is an important component of the human balance system. It has specialized receptors that receive appropriate stimulation and transmit the stimulus information through the vestibular nerve to the vestibular nuclei of the brain stem and to the cerebellum. After being integrated and processed with other sensory information (such as visual information and other body senses), the information is transmitted through multiple neural pathways to higher centers in the brain for high-level processing, possibly forming subjective awareness, or is transmitted through certain neural pathways to motor nuclei (such as the oculomotor nuclei and the anterior horn motor nuclei of the spinal cord), thereby producing specific and nonspecific functional responses. The main symptoms after a lesion of the vestibular system include dizziness and balance disorders.
With existing rehabilitation training systems, a patient with vestibular dysfunction trains with the help of therapists and rehabilitation instruments in the closed, flat indoor environment of a rehabilitation treatment room, which differs from the natural environment. The training mode is monotonous, so the patient easily becomes bored. Moreover, the patient's movement trajectory cannot be recorded during treatment, the direction of deviation during movement cannot be judged, and targeted treatment therefore cannot be carried out.
Disclosure of Invention
The application aims to provide an artificial-intelligence-based dynamic evaluation system for vestibular rehabilitation training, so as to remedy the defects in the prior art.
To achieve the above object, the application provides the following technical solution: an artificial-intelligence-based dynamic evaluation system for vestibular rehabilitation training, comprising:
the user login module, through which the patient logs in to the training system to customize personalized vestibular rehabilitation training;
the training action input module is used for inputting training actions and standard action videos;
the illness state evaluation module is used for evaluating the patient's illness state according to the standard action video stored in the training action input module, wherein the evaluation comprises visual evaluation and somatosensory evaluation;
the rehabilitation training plan making module is used for making a corresponding rehabilitation training plan according to the illness state evaluation module's evaluation of the patient, wherein the training plan comprises a visual training plan and a somatosensory training plan;
the rehabilitation training module is used for the patient to perform rehabilitation training according to the plan made by the rehabilitation training plan making module, wherein the rehabilitation training comprises visual training and somatosensory training;
the rehabilitation training evaluation module is used for evaluating the results of the patient's rehabilitation training and sending the evaluation results to the rehabilitation training plan making module to update the rehabilitation training plan.
Further, the user login module is used for patient registration and illness state entry, and also for managing each patient using the system; the management operations comprise viewing the patient list, adding and deleting patients, and viewing and modifying user attributes.
Further, the rehabilitation training planning module comprises:
the visual evaluation result identification module is used for scoring each item of the patient's visual evaluation results from the illness state evaluation module;
the somatosensory evaluation result identification module is used for scoring each item of the patient's somatosensory evaluation results from the illness state evaluation module;
the training parameter input module is used for medical staff to input the training duty ratio parameters of visual training and somatosensory training, wherein the visual training parameter and the somatosensory training parameter sum to 1;
the visual training plan making module is used for multiplying each item score of the visual evaluation result identification module by the visual training duty ratio parameter and sorting the results in ascending order;
and the somatosensory training plan making module is used for multiplying each item score of the somatosensory evaluation result identification module by the somatosensory training duty ratio parameter and sorting the results in ascending order.
Further, the rehabilitation training plan making module works as follows:
A1, the visual evaluation result identification module scores each item of the patient's visual evaluation results from the illness state evaluation module;
A2, the somatosensory evaluation result identification module scores each item of the patient's somatosensory evaluation results;
A3, medical staff input the training duty ratio parameters of visual training and somatosensory training through the training parameter input module;
A4, the visual training plan making module multiplies each item score of the visual evaluation result identification module by the visual training duty ratio parameter and sorts the results in ascending order;
A5, the somatosensory training plan making module multiplies each item score of the somatosensory evaluation result identification module by the somatosensory training duty ratio parameter and sorts the results in ascending order.
Further, the rehabilitation training module comprises:
a vision training module, the vision training module comprising:
the wearable VR module is used for the patient to perform visual training through VR;
the eyeball tracking module is used for tracking the eyeballs of a patient during vision training;
the somatosensory module is used for detecting the somatosensory of the patient when the patient performs visual training;
the somatosensory training module, the somatosensory training module comprising:
the body type identification module is used for identifying the body type information of the patient;
the limb identification module is used for identifying limb information of the patient according to the patient body type information identified by the body type identification module;
the joint point identification module is used for identifying the patient's joint points according to the limb information identified by the limb identification module and adjusting the somatosensory training posture according to the joint point positions;
the plantar pressure sensor is used for collecting plantar pressure information of a patient;
and the inertia measurement device is used for measuring the limb movement angle of the patient and the somatosensory angle of the patient during somatosensory training.
Further, the specific method by which the somatosensory training module calculates the somatosensory angle comprises the following steps:
B1, the gravitational acceleration is expressed in the geographic coordinate system as $g=(0,0,1)^T$; multiplying it by the rotation matrix $C_n^s$ from the geographic coordinate system to the sensor coordinate system gives its representation $g'=(g'_x, g'_y, g'_z)$ in the sensor coordinate system;
B2, the accelerometer measures the components $a_x$, $a_y$, $a_z$ of the carrier's acceleration $a$ on the three sensor axes at the current moment, and the offset $e$ of the gravity direction measured at the current moment relative to the gravity direction calculated at the previous moment;
B3, a proportional-integral controller added to the algorithm corrects the offset, improving the stability of the system;
B4, the angular velocity $\omega$ is adjusted with the corrected offset;
B5, the quaternion is updated by solving the quaternion differential equation;
B6, after the quaternion is updated it is normalized and the attitude angle is calculated.
Further, the specific working mode of the rehabilitation training evaluation module is as follows:
c1, firstly, any group of human body joint point data is taken, and data information representing action characteristics is abstracted in a vector form;
c2, adding each standard rehabilitation training action into a standard posture vector database;
c3, detecting human body joint points of a certain action in actual training of a patient, and abstracting the human body joint points into vector data information;
and C4, carrying out similarity calculation on the joint point information of the identified gesture and each standard gesture information in the standard gesture vector database.
Compared with the prior art, the artificial-intelligence-based dynamic evaluation system for vestibular rehabilitation training provided by the application collects the patient's plantar pressure information through a plantar pressure sensor, so that the plantar force distribution can be analyzed while the patient performs a standard action and the training effect can be evaluated indirectly. An inertial measurement device measures the patient's limb movement angle and the somatosensory angle during somatosensory training, so that whether the patient's actions are standard can be detected during rehabilitation training.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings required for the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present application; a person of ordinary skill in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an overall structure according to an embodiment of the present application.
Detailed Description
In order to make the technical scheme of the present application better understood by those skilled in the art, the present application will be further described in detail with reference to the accompanying drawings.
Referring to fig. 1, an artificial intelligence based dynamic vestibular rehabilitation training assessment system includes:
the user login module, through which the patient logs in to the training system to customize personalized vestibular rehabilitation training;
the training action input module is used for inputting training actions and standard action videos;
the illness state evaluation module is used for evaluating the patient's illness state according to the standard action video stored in the training action input module, wherein the evaluation comprises visual evaluation and somatosensory evaluation;
the rehabilitation training plan making module is used for making a corresponding rehabilitation training plan according to the illness state evaluation module's evaluation of the patient, wherein the training plan comprises a visual training plan and a somatosensory training plan;
the rehabilitation training module is used for the patient to perform rehabilitation training according to the plan made by the rehabilitation training plan making module, wherein the rehabilitation training comprises visual training and somatosensory training;
the rehabilitation training evaluation module is used for evaluating the results of the patient's rehabilitation training and sending the evaluation results to the rehabilitation training plan making module to update the rehabilitation training plan.
The system works as follows: the patient logs in to the training system through the user login module to customize personalized vestibular rehabilitation training; training actions and standard action videos are entered through the training action input module; the illness state evaluation module evaluates the patient's illness state according to the standard action videos stored in the training action input module; the rehabilitation training plan making module makes a corresponding rehabilitation training plan according to the patient's evaluation; the rehabilitation training module has the patient perform rehabilitation training according to that plan; and the rehabilitation training evaluation module evaluates the results of the patient's training and sends the evaluation results to the rehabilitation training plan making module to update the rehabilitation training plan.
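The module flow described above can be sketched as a simple pipeline. This is a structural illustration only; the class and callable names are assumptions, not from the patent.

```python
# Structural sketch of the module flow (names are illustrative, not from the patent).
class VestibularRehabSystem:
    def __init__(self, evaluate, make_plan, train, assess):
        # each argument is a callable standing in for one module of the system
        self.evaluate, self.make_plan = evaluate, make_plan
        self.train, self.assess = train, assess

    def run_cycle(self, patient, standard_videos):
        condition = self.evaluate(patient, standard_videos)  # illness state evaluation
        plan = self.make_plan(condition)                     # visual + somatosensory plan
        results = self.train(patient, plan)                  # rehabilitation training
        assessment = self.assess(results)                    # training evaluation
        plan = self.make_plan(assessment)                    # plan updated from results
        return plan

# Toy stand-ins: scores per training category, plan = categories sorted by score ascending.
system = VestibularRehabSystem(
    evaluate=lambda p, v: {"visual": 60, "somatosensory": 40},
    make_plan=lambda c: sorted(c, key=c.get),
    train=lambda p, plan: {"visual": 70, "somatosensory": 55},
    assess=lambda r: r,
)
plan = system.run_cycle("patient-1", ["video-1"])
```

The key point of the design is the feedback loop: the evaluation result is fed back into the plan-making step, so the plan is re-derived after every training cycle.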
The user login module is used for patient registration and illness state entry, and also for managing each patient using the system; the management operations comprise viewing the patient list, adding and deleting patients, and viewing and modifying user attributes.
The rehabilitation training planning module comprises:
the visual evaluation result identification module is used for scoring each item of the patient's visual evaluation results from the illness state evaluation module;
the somatosensory evaluation result identification module is used for scoring each item of the patient's somatosensory evaluation results from the illness state evaluation module;
the training parameter input module is used for medical staff to input the training duty ratio parameters of visual training and somatosensory training, wherein the visual training parameter and the somatosensory training parameter sum to 1;
the visual training plan making module is used for multiplying each item score of the visual evaluation result identification module by the visual training duty ratio parameter and sorting the results in ascending order;
and the somatosensory training plan making module is used for multiplying each item score of the somatosensory evaluation result identification module by the somatosensory training duty ratio parameter and sorting the results in ascending order.
The rehabilitation training plan making module works as follows:
A1, the visual evaluation result identification module scores each item of the patient's visual evaluation results from the illness state evaluation module;
A2, the somatosensory evaluation result identification module scores each item of the patient's somatosensory evaluation results;
A3, medical staff input the training duty ratio parameters of visual training and somatosensory training through the training parameter input module;
A4, the visual training plan making module multiplies each item score of the visual evaluation result identification module by the visual training duty ratio parameter and sorts the results in ascending order;
A5, the somatosensory training plan making module multiplies each item score of the somatosensory evaluation result identification module by the somatosensory training duty ratio parameter and sorts the results in ascending order.
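Steps A1-A5 can be sketched in a few lines: each evaluation item's score is weighted by its category's duty ratio and the weighted items are sorted in ascending order, so the weakest items come first. The item names and scores below are illustrative, not from the patent.

```python
# Minimal sketch of plan-making steps A1-A5 (item names/scores are illustrative).
def make_training_plan(visual_scores, somato_scores, visual_ratio):
    """Weight each evaluation item by its training duty ratio and sort ascending.

    visual_scores / somato_scores: dict mapping item name -> identification score.
    visual_ratio: visual training duty ratio; per step A3 the two ratios sum to 1,
    so the somatosensory ratio is 1 - visual_ratio.
    """
    somato_ratio = 1.0 - visual_ratio
    visual_plan = sorted(
        ((item, score * visual_ratio) for item, score in visual_scores.items()),
        key=lambda kv: kv[1],  # ascending order: lowest weighted score first
    )
    somato_plan = sorted(
        ((item, score * somato_ratio) for item, score in somato_scores.items()),
        key=lambda kv: kv[1],
    )
    return visual_plan, somato_plan

visual_plan, somato_plan = make_training_plan(
    {"gaze stabilization": 80, "smooth pursuit": 55},
    {"standing balance": 70},
    visual_ratio=0.6,
)
```

The ascending sort means items where the patient scored worst (after weighting) appear at the head of the plan, which matches the intent of prioritizing the weakest abilities.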
The rehabilitation training module comprises:
the vision training module, the vision training module includes:
the wearable VR module is used for the patient to perform visual training through VR;
the eyeball tracking module is used for tracking the eyeballs of a patient during vision training;
the somatosensory module is used for detecting the somatosensory of the patient when the patient performs visual training;
the somatosensory training module, the somatosensory training module comprising:
the body type identification module is used for identifying the body type information of the patient;
the limb identification module is used for identifying limb information of the patient according to the patient body type information identified by the body type identification module;
the joint point identification module is used for identifying the patient's joint points according to the limb information identified by the limb identification module and adjusting the somatosensory training posture according to the joint point positions;
the plantar pressure sensor is used for collecting plantar pressure information of a patient;
and the inertia measurement device is used for measuring the limb movement angle of the patient and the somatosensory angle of the patient during somatosensory training.
The rehabilitation training module works as follows. The patient performs visual training through the wearable VR module; during visual training the eyeball tracking module tracks the patient's eyeballs, so that whether the eyes are in the training state can be detected in real time, while the somatosensory module monitors the patient's body sense to prevent accidents such as falls and slips. When the patient performs somatosensory training, the body type identification module identifies the patient's body type information, the limb identification module identifies the patient's limb information from that body type information, and the joint point identification module identifies the patient's joint points from the identified limb information and adjusts the somatosensory training posture according to the joint point positions, so that recognition errors do not occur during action identification. At the same time, the plantar pressure sensor collects the patient's plantar pressure information, so the plantar force distribution can be analyzed while the patient performs a standard action and the training effect can be evaluated indirectly, and the inertial measurement device measures the patient's limb movement angle and the somatosensory angle during somatosensory training, so that whether the patient's actions are standard can be detected during rehabilitation training.
The specific method for calculating the somatosensory angle by the somatosensory training module comprises the following steps:
b1, the gravitational acceleration is expressed as g= (0, 1) in the geographical coordinate system T Multiplying by rotation matrix C transformed from geographical coordinate system to sensing coordinate system s n After that, its representation g ' = (g ' in the sensor coordinate system is derived ' x ,g' y ,g' z ) Wherein g' x 、g' y 、g' z Representing the g components in the X-axis, Y-axis and Z-axis respectively, due to C S n Is obtained by sequentially rotating the sensor coordinate system around the X axis, the Y axis and the Z axis, so that the geographic coordinate system can be converted into the sensor coordinate system only by sequentially rotating the sensor coordinate system around the Z axis, the Y axis and the X axis, and C s n =(C n sT =(C h bT The expression of the gravitational acceleration in the sensor coordinate system is:
wherein due to C s n The four-element number of the previous integration period is used for calculation, so that g' represents the coordinate of the gravity acceleration g under the sensor coordinate system at the previous moment;
b2, the accelerometer measures the components of the motion acceleration a of the carrier at the current moment on three axes of a sensor coordinate system, and the components are respectivelyIs a as x 、a y 、a z The expression is as follows:
because ofAccording to the limit theorem, when the offset angle alpha is small enough, the sine sin father alpha of the offset angle= father alpha;
b3, the proportional-integral controller added in the algorithm can correct the deviation to improve the stability of the system, wherein K is used p Represents proportional gain, K i Representing the integral gain, dt represents the sampling period, and the corrected angle offset e c The method comprises the following steps:
b4, adjusting the angular velocity ω with the corrected offset amount, thereby correcting the angular velocity ω c The method comprises the following steps:
wherein omega cx ,ω cy ,ω cz The components of the adjusted angular velocity on the X axis, the Y axis and the Z axis are respectively;
and B5, updating the quaternion by solving a differential equation of the quaternion, wherein the differential equation of the quaternion is solved according to a first-order Dragon-Aldrich tower method, and the differential equation of the quaternion can be obtained by the following steps:
wherein R represents the corrected angular velocity omega c Matrix representation in quaternion space, namely:
let T be the sampling period, according to the first order Dragon lattice-base tower expansion, get the quaternion after updating, namely:
and B6, carrying out normalization after the quaternion is updated, and calculating an attitude angle, wherein a normalization formula is as follows:
converting the quaternion into Euler angle representation, wherein the formula is as follows:
where φ is the limb rotation attitude angle converted from the sensor coordinate system to the geographic coordinate system.
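The attitude-solution steps B1 to B6 can be sketched as a Mahony-style complementary filter in Python. The patent's display equations are lost from the source, so the function name and the cross-product form of the gravity error below are illustrative assumptions, not the patent's literal implementation:

```python
import numpy as np

def mahony_step(q, gyro, accel, Kp, Ki, integral, dt):
    """One update of the quaternion attitude estimate (steps B1-B6).

    q        : current unit quaternion (w, x, y, z)
    gyro     : measured angular velocity in sensor axes (rad/s)
    accel    : measured acceleration in sensor axes
    Kp, Ki   : proportional / integral gains of the PI corrector
    integral : running integral of the gravity error (returned updated)
    dt       : sampling period T
    """
    w, x, y, z = q
    # B1: gravity direction predicted from the previous-period quaternion,
    # i.e. C_n^s applied to g = (0, 0, 1)^T in the geographic frame
    g_est = np.array([2 * (x * z - w * y),
                      2 * (w * x + y * z),
                      w * w - x * x - y * y + z * z])
    # B2: error between measured and predicted gravity (cross product;
    # its magnitude behaves like sin(alpha) ~ alpha for small offsets)
    a = accel / np.linalg.norm(accel)
    e = np.cross(a, g_est)
    # B3: PI controller on the error: e_c = Kp*e + Ki*sum(e*dt)
    integral = integral + e * dt
    e_c = Kp * e + Ki * integral
    # B4: corrected angular velocity: w_c = w + e_c
    w_c = gyro + e_c
    # B5: first-order Runge-Kutta integration of dq/dt = 0.5*R*q
    wx, wy, wz = w_c
    q_dot = 0.5 * np.array([-x * wx - y * wy - z * wz,
                             w * wx + y * wz - z * wy,
                             w * wy - x * wz + z * wx,
                             w * wz + x * wy - y * wx])
    q_new = q + q_dot * dt
    # B6: normalise, then convert to Euler angles
    q_new = q_new / np.linalg.norm(q_new)
    w, x, y, z = q_new
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return q_new, integral, (roll, pitch, yaw)
```

With a stationary, level sensor (zero gyro, acceleration along +Z) the estimate stays at the identity quaternion and all three attitude angles remain zero.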
The rehabilitation training evaluation module specifically works in the following manner:
c1, firstly, any group of human body joint point data is taken, and data information representing action characteristics is abstracted in a vector form;
c2, each standard rehabilitation training action is added into a standard posture vector database, wherein a standard posture vector is represented by b = (x_2, y_2);
c3, then, human body joint point detection is performed for a certain action in the patient's actual training and the action is abstracted into vector data information, wherein the actual human action vector is represented by a = (x_1, y_1);
and C4, similarity calculation is carried out between the joint point information of the recognized posture and each standard posture in the standard posture vector database, obtained from the cosine similarity of the two vectors:

cos θ = (a·b) / (|a|·|b|) = (x_1·x_2 + y_1·y_2) / (sqrt(x_1² + y_1²)·sqrt(x_2² + y_2²));
wherein the function value is 0 when the two posture vectors are perpendicular, and the closer the function value is to 1, the more standard the action.
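The similarity scoring of steps C1 to C4 can be sketched as follows; the function names and the example database are illustrative, not from the patent:

```python
import numpy as np

def pose_similarity(actual, standard):
    """Cosine similarity between the actual pose vector a = (x1, y1)
    and a standard pose vector b = (x2, y2):
        cos(theta) = (a . b) / (|a| * |b|)
    Returns 1.0 for identical directions and 0.0 for perpendicular poses."""
    a = np.asarray(actual, dtype=float)
    b = np.asarray(standard, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(actual, standard_db):
    """Score the recognised pose against every entry of the
    standard-posture vector database; the highest score is the
    closest (most standard) action."""
    scores = {name: pose_similarity(actual, vec)
              for name, vec in standard_db.items()}
    return max(scores, key=scores.get), scores
```

For instance, a pose vector pointing along the x-axis scores 1.0 against a standard vector in the same direction and 0.0 against a perpendicular one.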
Working principle: when the system is used, the patient logs in to the training system through the user login module to customize personalized vestibular rehabilitation training. Training actions and standard action videos are input through the training action input module. The condition evaluation module evaluates the patient's condition according to the standard action videos stored in the training action input module, and the rehabilitation training plan formulation module formulates a corresponding rehabilitation training plan according to that evaluation. The rehabilitation training module then carries out rehabilitation training according to the formulated plan, the rehabilitation training evaluation module evaluates the patient's training, and the evaluation result is sent back to the rehabilitation training plan formulation module to update the rehabilitation training plan.
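The closed feedback loop of the working principle (assess, plan, train, evaluate, re-plan) can be sketched as follows; every function name here is a hypothetical stand-in for the corresponding module, not an identifier from the patent:

```python
def rehabilitation_loop(patient, standard_videos,
                        assess, make_plan, train, evaluate, rounds=3):
    """Condition assessment -> plan formulation -> training -> evaluation,
    with each evaluation result fed back to update the next plan."""
    condition = assess(patient, standard_videos)   # condition evaluation module
    plan = make_plan(condition)                    # plan formulation module
    history = []
    for _ in range(rounds):
        session = train(patient, plan)             # rehabilitation training module
        result = evaluate(session)                 # rehabilitation training evaluation module
        history.append(result)
        plan = make_plan(result)                   # plan updated from the evaluation
    return history
```

The key design point is that `make_plan` is called again inside the loop, so the plan is dynamic rather than fixed at intake, which is what makes the evaluation "dynamic" in the patent's sense.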
While certain exemplary embodiments of the present application have been described above by way of illustration only, it will be apparent to those of ordinary skill in the art that modifications may be made to the described embodiments in various different ways without departing from the spirit and scope of the application. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive of the scope of the application, which is defined by the appended claims.

Claims (7)

1. Vestibular rehabilitation training dynamic evaluation system based on artificial intelligence, which is characterized by comprising:
the user login module is used for customizing the personalized vestibule rehabilitation training by the patient login training system;
the training action input module is used for inputting training actions and standard action videos;
the condition evaluation module is used for evaluating the patient's condition according to the standard action videos stored in the training action input module;
the rehabilitation training plan making module is used for making a corresponding rehabilitation training plan according to the condition evaluation of the patient by the condition evaluation module, wherein the training plan comprises a visual training plan and a somatosensory training plan;
the rehabilitation training module is used for performing rehabilitation training by the patient according to a rehabilitation training program formulated by the rehabilitation training program formulation module, wherein the rehabilitation training comprises visual training and somatosensory training;
the rehabilitation training evaluation module is used for evaluating training results of rehabilitation training of patients and sending the evaluation results to the rehabilitation training plan making module for updating the rehabilitation training plan.
2. The artificial intelligence based vestibular rehabilitation training dynamic assessment system according to claim 1, wherein the user login module is used for patient registration and condition input, and is further used for managing the individual patients using the system, the management operations including list viewing, adding, deleting, viewing and modifying patient attributes.
3. The artificial intelligence based vestibular rehabilitation training dynamic assessment system according to claim 2, wherein the rehabilitation training planning module comprises:
the visual evaluation result identification module is used for carrying out identification scoring on each item of visual evaluation result of the patient by the illness state evaluation module;
the body sensing evaluation result identification module is used for carrying out identification scoring on each item of the body sensing evaluation result of the patient for the illness state evaluation module;
the training parameter input module is used for inputting training duty ratio parameters of visual training and somatosensory training by medical staff, wherein the sum of the visual training parameters and the somatosensory training parameters is 1;
the visual training plan making module is used for multiplying the identification scores of each item of the visual evaluation result identification module with the visual training duty ratio parameters and arranging the visual evaluation result identification scores from small to large;
and the somatosensory training plan making module is used for multiplying the identification scores of each item of the somatosensory evaluation result identification module with the somatosensory training duty ratio parameter and arranging the items from small to large.
4. The artificial intelligence based vestibular rehabilitation training dynamic evaluation system according to claim 3, wherein the specific working mode of the rehabilitation training planning module is as follows:
a1, a visual evaluation result identification module carries out identification scoring on each item of visual evaluation results of patients on the condition evaluation module;
a2, the body sensing evaluation result identification module carries out identification scoring on each item of the body sensing evaluation result of the patient;
a3, medical staff inputs training duty ratio parameters of visual training and somatosensory training through a training parameter input module;
a4, multiplying each item identification score of the visual evaluation result identification module by the visual training duty ratio parameter by the visual training plan making module, and arranging from small to large;
and A5, multiplying the individual item identification scores of the somatosensory evaluation result identification module by the somatosensory training duty ratio parameters by the somatosensory training plan making module, and arranging from small to large.
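The plan-formulation steps A1 to A5 reduce to weighting each assessment item's identification score by the corresponding training duty-ratio parameter and sorting ascending, so the weakest items lead the plan. A minimal sketch, with hypothetical item names and scores:

```python
def make_training_plan(item_scores, ratio):
    """Multiply each assessment item's identification score by the
    training duty-ratio parameter and sort from small to large
    (steps A4/A5), so the lowest weighted scores come first.

    item_scores : {item name: identification score}
    ratio       : duty-ratio parameter in [0, 1]
    """
    weighted = {item: score * ratio for item, score in item_scores.items()}
    return sorted(weighted.items(), key=lambda kv: kv[1])

# The visual and somatosensory ratios must sum to 1 (step A3).
visual_ratio, somato_ratio = 0.6, 0.4
visual_plan = make_training_plan({"gaze stability": 70, "saccade": 55}, visual_ratio)
somato_plan = make_training_plan({"standing balance": 80, "gait": 40}, somato_ratio)
```

With these example scores, "saccade" and "gait" head their respective plans because their weighted scores are the smallest.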
5. The artificial intelligence based vestibular rehabilitation training dynamic assessment system according to claim 4, wherein the rehabilitation training module comprises:
a vision training module, the vision training module comprising:
the wearable VR module is used for the patient to perform visual training through VR;
the eyeball tracking module is used for tracking the eyeballs of a patient during vision training;
the somatosensory module is used for detecting the somatosensory of the patient when the patient performs visual training;
the motion training module, motion training module includes:
the body type identification module is used for identifying the body type information of the patient;
the limb identification module is used for identifying limb information of the patient according to the patient body type information identified by the body type identification module;
the joint point identification module is used for identifying joint points of the patient according to the user limb information identified by the limb identification module and adjusting the somatosensory training gesture according to the joint point position;
the plantar pressure sensor is used for collecting plantar pressure information of a patient;
and the inertia measurement device is used for measuring the limb movement angle of the patient and the somatosensory angle of the patient during somatosensory training.
6. The dynamic vestibular rehabilitation training assessment system based on artificial intelligence according to claim 5, wherein the specific method for calculating the somatosensory angle by the somatosensory training module is as follows:
b1, the gravitational acceleration is expressed in the geographic coordinate system as g = (0, 0, 1)^T; after multiplying by the rotation matrix C_n^s that transforms from the geographic coordinate system to the sensor coordinate system, its representation in the sensor coordinate system is obtained as g' = (g'_x, g'_y, g'_z);
B2, the accelerometer measures the components of the motion acceleration a of the carrier at the current moment on the three axes of the sensor coordinate system, respectively a_x, a_y and a_z, and the offset e of the gravitational acceleration measured at the current moment relative to the gravitational acceleration calculated at the previous moment is obtained;
b3, a proportional-integral controller added in an algorithm can correct the deviation so as to improve the stability of the system;
b4, adjusting the angular velocity omega by using the corrected offset;
b5, updating the quaternion by solving a differential equation of the quaternion;
and B6, normalizing after the quaternion is updated, and calculating an attitude angle.
7. The dynamic vestibular rehabilitation training assessment system based on artificial intelligence according to claim 6, wherein the specific working mode of the rehabilitation training assessment module is as follows:
c1, firstly, any group of human body joint point data is taken, and data information representing action characteristics is abstracted in a vector form;
c2, adding each standard rehabilitation training action into a standard posture vector database;
c3, detecting human body joint points of a certain action in actual training of a patient, and abstracting the human body joint points into vector data information;
and C4, carrying out similarity calculation on the joint point information of the identified gesture and each standard gesture information in the standard gesture vector database.
CN202311155294.0A 2023-09-08 2023-09-08 Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence Active CN116869490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311155294.0A CN116869490B (en) 2023-09-08 2023-09-08 Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence

Publications (2)

Publication Number Publication Date
CN116869490A true CN116869490A (en) 2023-10-13
CN116869490B CN116869490B (en) 2024-01-09

Family

ID=88260940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311155294.0A Active CN116869490B (en) 2023-09-08 2023-09-08 Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence

Country Status (1)

Country Link
CN (1) CN116869490B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015179910A1 (en) * 2014-05-30 2015-12-03 Isotechnology Pty Ltd System and method for facilitating patient rehabilitation
JP2017012691A (en) * 2015-07-06 2017-01-19 パラマウントベッド株式会社 Rehabilitation support device, rehabilitation support system, rehabilitation support method and program
KR20170062826A (en) * 2015-11-30 2017-06-08 권대규 A postural balance training system
CN107050763A (en) * 2017-06-14 2017-08-18 韶关星火创客科技有限公司 A kind of its control method of new robot for rehabilitation of anklebone
US20180228430A1 (en) * 2017-02-10 2018-08-16 Mindmaze Holding Sa System, method and apparatus for rehabilitation with tracking
KR20190012986A (en) * 2017-07-31 2019-02-11 전북대학교산학협력단 A dynamic harness rehabilitation training system based motion recognition
CN111883229A (en) * 2020-07-31 2020-11-03 焦点科技股份有限公司 Intelligent movement guidance method and system based on visual AI
CN115624678A (en) * 2022-10-27 2023-01-20 上海韶脑传感技术有限公司 Rehabilitation training system and training method

Also Published As

Publication number Publication date
CN116869490B (en) 2024-01-09

Similar Documents

Publication Publication Date Title
US11803241B2 (en) Wearable joint tracking device with muscle activity and methods thereof
Madgwick et al. An extended complementary filter for full-body MARG orientation estimation
CN102567638B (en) A kind of interactive upper limb healing system based on microsensor
Baghdadi et al. Hip and trunk kinematics estimation in gait through Kalman filter using IMU data at the ankle
US20150201867A1 (en) Electronic free-space motion monitoring and assessments
CN203149575U (en) Interactive upper limb rehabilitation device based on microsensor
Chen Human motion analysis with wearable inertial sensors
Memar et al. Quantification of whole-body bradykinesia in Parkinson's disease participants using multiple inertial sensors
Majumder et al. A robust orientation filter for wearable sensing applications
Alexandre et al. Wearable and IoT technologies application for physical rehabilitation
Sabatini Inertial sensing in biomechanics: a survey of computational techniques bridging motion analysis and personal navigation
Lin et al. Assessment of shoulder range of motion using a wearable inertial sensor network
Caserman et al. Real-time step detection using the integrated sensors of a head-mounted display
Li et al. Real-time hand gesture tracking for human–computer interface based on multi-sensor data fusion
Liu et al. A wearable gait analysis and recognition method for Parkinson’s disease based on error state Kalman filter
Du et al. RETRACTED: Research on the intelligent model of progress in physical education training based on motion sensor
Spasojević et al. A vision-based system for movement analysis in medical applications: the example of Parkinson disease
Mascret et al. A wearable sensor network with embedded machine learning for real-time motion analysis and complex posture detection
Fang et al. Wearable technology for robotic manipulation and learning
CN116869490B (en) Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence
Miao et al. A novel approach for upper limb functionality assessment based on deep learning and multimodal sensing data
Nguyen et al. Quantification of compensatory torso motion in post-stroke patients using wearable inertial measurement units
Meng et al. An inertial-based upper-limb motion assessment model: performance validation across various motion tasks
Huang et al. Evaluating power rehabilitation actions using a fuzzy inference method
Lueken et al. Using synthesized imu data to train a long-short term memory-based neural network for unobtrusive gait analysis with a sparse sensor setup

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant