CN107789803B - Cerebral stroke upper limb rehabilitation training method and system - Google Patents


Info

Publication number
CN107789803B
Authority
CN
China
Prior art keywords: hand, virtual, action, scene, patient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711046995.5A
Other languages: Chinese (zh)
Other versions: CN107789803A (en)
Inventor
蒋晟
吴剑煌
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201711046995.5A priority Critical patent/CN107789803B/en
Publication of CN107789803A publication Critical patent/CN107789803A/en
Application granted granted Critical
Publication of CN107789803B publication Critical patent/CN107789803B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B23/00 Exercising apparatus specially adapted for particular parts of the body
    • A63B23/035 Exercising apparatus specially adapted for limbs, i.e. upper or lower limbs, e.g. simultaneously
    • A63B23/12 Exercising apparatus for upper limbs or related muscles, e.g. chest, upper back or shoulder muscles
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638 Displaying moving images of recorded environment, e.g. virtual environment
    • A63B2071/0647 Visualisation of executed movements
    • A63B2071/0658 Position or arrangement of display
    • A63B2071/0661 Position or arrangement of display arranged on the user
    • A63B2071/0666 Display worn on the head or face, e.g. combined with goggles or glasses
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/803 Motion sensors
    • A63B2220/83 Special sensors characterised by the position of the sensor
    • A63B2220/836 Sensors arranged on the body of the user

Abstract

A stroke upper limb rehabilitation training method and system comprising the following steps: constructing a virtual rehabilitation training scene; acquiring dynamic data of the patient's hand coordinates, processing the dynamic data together with the position information of a hand action object, and recognizing the hand's action from the processing result so that the hand's action interacts with the scene to complete the designated rehabilitation training action; and displaying the motion of the hand in the virtual reality rehabilitation training scene in real time through a virtual reality headset. The application collects hand coordinates through a somatosensory controller, which is inexpensive, highly accurate, and requires no prior calibration; the scene content is engaging and the virtual reality device is strongly immersive, so the patient is continuously motivated during rehabilitation training and can persist with it over the long term.

Description

Cerebral stroke upper limb rehabilitation training method and system
Technical Field
The application relates to the field of medical care, and in particular to a stroke upper limb rehabilitation training method and system.
Background
Central nervous system diseases such as stroke are a major cause of disability: most patients suffer limb motor impairment, or even severe disability, after a stroke. Exercise rehabilitation training begun early after a stroke can minimize the probability of lasting disability. Traditional rehabilitation is one-on-one, manually assisted training by a rehabilitation physician; however, rehabilitation physicians are in severe shortage (currently only about 0.4 rehabilitation physicians per 100,000 people in need nationwide), so this mode is inefficient because demand far exceeds supply. In addition, traditional rehabilitation training is tedious, which weakens patients' willingness to persist with it over the long term. A patient who does not receive effective rehabilitation training in time during the early post-stroke period faces an increased probability of disability.
Virtual reality is a good fit for exercise rehabilitation training. Virtual rehabilitation training can reduce the patient's dependence on rehabilitation therapists and even enable home rehabilitation. A virtual rehabilitation training scene can also make otherwise tedious training engaging, helping the patient persist with it.
Current virtual rehabilitation training solutions for the upper limb include data glove devices and Kinect devices. A data glove is a wearable device that detects hand motion information and supports touching, grabbing, releasing and other actions in a virtual scene. Mainstream data gloves include optical-fiber-sensor data gloves and force-feedback data gloves. Optical-fiber-sensor data gloves measure accurately and faithfully reflect hand motion information. Force-feedback data gloves detect hand motion information and additionally provide tactile feedback, letting the hand feel forces from the virtual environment. However, data gloves are expensive, which greatly increases the cost of a rehabilitation training system, and they require motion calibration before each use, which is unfavorable for home rehabilitation.
Kinect is a somatosensory device made by Microsoft that comprises a color camera and a depth camera; its skeleton tracking and depth information can detect and track human body movement, and gestures or motions can be recognized by measuring motion similarity. Kinect can detect whole-body movement, but its accuracy is insufficient for detailed motions: it cannot, for example, accurately recognize finger movements, so finger rehabilitation training cannot be carried out with it.
Disclosure of Invention
The application provides a stroke upper limb rehabilitation training method and system.
According to a first aspect of the present application, there is provided a stroke upper limb rehabilitation training method, including:
constructing a virtual rehabilitation training scene;
acquiring dynamic data of hand coordinates of a patient, processing the dynamic data and position information of a hand action object, identifying the action of the hand through a processing result, mapping a virtual hand in a virtual scene, and enabling the action of the hand to interact with the scene so as to finish the designated action of rehabilitation training;
and displaying the motion of the hand in the virtual reality rehabilitation training scene in real time through the virtual reality device.
In the above method, recognizing the action of the hand from the processing result specifically includes:
letting the distance between the virtual hand H and the hand action object A in the scene be d_HA;
when d_HA ≤ 0, determining that the virtual hand touches the hand action object.
In the above method, recognizing the action of the hand from the processing result further includes:
when d_HA ≤ 0 and the distance d_TI between the thumb T and the index finger I of the virtual hand is less than a first preset value, determining that the virtual hand grabs the specific object, binding the specific object as a child object of the virtual hand, and having the specific object move and rotate with the virtual hand.
In the above method, recognizing the action of the hand from the processing result further includes:
in the grabbing state, when d_TI is greater than a second preset value, determining that the virtual hand releases the specific object; the specific object is then unbound from the virtual hand's child objects and no longer moves and rotates with the virtual hand.
In the above method, recognizing the action of the hand from the processing result further includes:
within a predetermined time, when the distance d_iz between the highest z-axis coordinate z_i1 of the virtual hand's index finger and its lowest z-axis coordinate z_i2 is greater than a third preset value, and the distance d_wz between the highest z-axis coordinate z_w1 of the wrist joint and the lowest z-axis coordinate z_w2 of the wrist joint is less than a fourth preset value, recognizing a clapping action.
In the above method, mapping a virtual hand in a virtual scene specifically includes:
the dynamic data of the hand coordinates of the patient comprises coordinate information of the hand joint positions of the patient;
and mapping a virtual hand through the corresponding position of the joint coordinate information in the virtual scene, so that the posture and the relative position of the hand are consistent with those in reality.
The method further comprises the following steps:
when the virtual hand interacts with the virtual reality rehabilitation training scene and touches an object in the scene, issuing a prompt that the object has been touched.
The method further comprises the following steps:
establishing a quantitative correspondence between hand and upper limb actions and the Ueda assessment method;
judging, from the dynamic data of the patient's hand coordinates, whether the posture, angle, speed and range of the hand and upper limb motions meet the requirements of the designated evaluation action in the Ueda assessment method;
and evaluating how well the action was completed according to the judgment result.
According to a second aspect of the present application, there is provided a stroke upper limb rehabilitation training system, comprising:
a somatosensory controller for acquiring dynamic data of the patient's hand coordinates;
an intelligent terminal comprising a processing module, a display module and a control module, the processing module being used for collecting dynamic data of the patient's hand coordinates, processing the dynamic data together with the position information of a hand action object, recognizing the hand's action from the processing result, mapping a virtual hand in the virtual scene, and having the hand's action interact with the scene to complete the designated rehabilitation training action;
and a virtual reality device for displaying the motion of the hand in the virtual reality rehabilitation training scene in real time.
In the above system, the distance between the virtual hand H and the hand action object A in the scene is set as d_HA;
the processing module is further used for determining, when d_HA ≤ 0, that the virtual hand touches the hand action object.
In the above system, the processing module is further configured to determine, when d_HA ≤ 0 and the distance d_TI between the thumb T and the index finger I of the virtual hand is less than a first preset value, that the virtual hand grabs the specific object, to bind the specific object as a child object of the virtual hand, and to have the specific object move and rotate with the virtual hand.
In the above system, the processing module is further configured to determine, in the grabbing state, when d_TI is greater than a second preset value, that the virtual hand releases the specific object; the specific object is then unbound from the virtual hand's child objects and no longer moves and rotates with the virtual hand.
In the above system, the processing module is further configured to recognize a clapping action when, within a predetermined time, the distance d_iz between the highest z-axis coordinate z_i1 of the virtual hand's index finger and its lowest z-axis coordinate z_i2 is greater than a third preset value, and the distance d_wz between the highest z-axis coordinate z_w1 of the wrist joint and its lowest z-axis coordinate z_w2 is less than a fourth preset value.
In the system, the dynamic data of the hand coordinates of the patient comprises coordinate information of the joint positions of the hand of the patient;
the processing module is further used for mapping a virtual hand at a corresponding position in the virtual scene through the joint coordinate information, so that the posture and the relative position of the hand are consistent with those in reality.
The system further comprises an armband vibration sensor, which, when the virtual hand interacts with the virtual reality rehabilitation training scene and touches an object in the scene, issues a prompt that the object has been touched.
In the above system, the intelligent terminal further comprises an evaluation module for establishing a quantitative correspondence between hand and upper limb actions and the Ueda assessment method; judging, from the dynamic data of the patient's hand coordinates, whether the posture, angle, speed and range of the hand and upper limb motions meet the requirements of the designated evaluation action in the Ueda assessment method; and evaluating how well the action was completed according to the judgment result.
Due to the adoption of the technical scheme, the beneficial effects of the application are as follows:
⑴ In the specific implementation of the application, a virtual rehabilitation training scene is constructed, and the dynamic data of the patient's hand coordinates and the position information of the hand action object are processed to recognize the hand's action, so that the hand's action interacts with the scene to complete the designated rehabilitation training action. The application collects hand coordinates through a somatosensory controller, which is inexpensive, highly accurate, and requires no prior calibration; the scene content is engaging and the virtual reality device is strongly immersive, so the patient is continuously motivated during rehabilitation training and can persist with it over the long term.
⑵ In the specific implementation of the application, an assessment report can be generated automatically after the patient completes training. This solves the prior-art problem that only a physician could assess the patient manually, and avoids assessment being heavily influenced by the physician's subjectivity, thereby both improving assessment accuracy and saving labor cost.
Drawings
FIG. 1 is a flow chart of the method of the present application in one embodiment;
FIG. 2 is a flow chart of the method of the present application identifying touch, grab and release actions in one embodiment;
FIG. 3 is a flow chart of a method of the present application in identifying a clapping action in one embodiment;
FIG. 4 is a functional block diagram of the system of the present application in one embodiment;
FIG. 5 is a schematic view of a system of the present application in use in one embodiment;
fig. 6 is a functional block diagram of the system of the present application in another embodiment.
Detailed Description
The present application will be described in further detail below with reference to the accompanying drawings by way of specific embodiments.
The first embodiment is as follows:
as shown in fig. 1, an embodiment of the method for rehabilitation training of upper limbs of stroke according to the present application includes the following steps:
step 102: and constructing a virtual rehabilitation training scene.
In this embodiment, the virtual reality rehabilitation training scenes can be built with Unity3D, each scene corresponding to a different rehabilitation training action, such as hook-shaped finger grasping, lateral pinching and thumb release, spherical and cylindrical finger grasping, horizontal abduction and adduction of the upper limb, elbow flexion and extension, horizontal movement of the upper limb, memory training, and so on.
Step 104: collecting dynamic data of hand coordinates of a patient, processing the dynamic data and position information of a hand action object, identifying the action of the hand through a processing result, mapping out a virtual hand in a virtual scene, and enabling the action of the hand to interact with the scene so as to finish the designated action of rehabilitation training.
In this embodiment, the coordinate information of the patient's hand can be obtained with a Leap Motion controller and then mapped to a virtual hand at the corresponding position in the virtual scene; the posture and relative position of the virtual hand are consistent with those of the real patient's hand.
The mapping of the virtual hand in the virtual scene may specifically include:
the dynamic data of the hand coordinates of the patient comprises coordinate information of the hand joint positions of the patient;
and mapping a virtual hand at the corresponding position in the virtual scene through the joint coordinate information, so that the posture and the relative position of the hand are consistent with those in reality.
The coordinate information may include the coordinates of a plurality of joints of a single hand; for example, all or part of the 29 joints of the hand may be selected. In this embodiment the coordinates of all 29 joints, (x1, y1, z1), (x2, y2, z2), ..., (x29, y29, z29), are selected and mapped to the corresponding positions in the virtual scene to form a virtual hand whose posture and relative position are consistent with the real patient's hand.
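As an illustrative sketch of this mapping (the function name and the scale and offset parameters are assumptions for illustration; the patent only requires that posture and relative position be preserved), the per-frame copying of the 29 joint coordinates onto a virtual hand might look like:

```python
# Hypothetical sketch of the hand-mapping step: 29 joint coordinates
# captured by the somatosensory controller are transformed rigidly into
# scene space, so the virtual hand's posture matches the real hand.

NUM_JOINTS = 29  # the embodiment selects coordinates for 29 hand joints

def map_virtual_hand(joint_coords, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Map real hand joint coordinates into virtual-scene coordinates.

    joint_coords: list of 29 (x, y, z) tuples from the sensor.
    Returns a list of 29 (x, y, z) tuples in scene space.
    """
    if len(joint_coords) != NUM_JOINTS:
        raise ValueError("expected %d joints, got %d"
                         % (NUM_JOINTS, len(joint_coords)))
    ox, oy, oz = offset
    # A uniform scale plus translation keeps the relative joint positions
    # (the hand's posture) identical to reality, as the method requires.
    return [(x * scale + ox, y * scale + oy, z * scale + oz)
            for (x, y, z) in joint_coords]

# Example frame: 29 joints spaced along x.
joints = [(i * 0.01, 0.0, 0.0) for i in range(29)]
virtual = map_virtual_hand(joints, scale=2.0, offset=(1.0, 0.0, 0.0))
```

In a real engine the transformed coordinates would drive the joints of a rigged hand model each frame.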
Step 106: and displaying the motion of the hand in the virtual reality rehabilitation training scene in real time through the virtual reality device.
The virtual reality rehabilitation training scene and the virtual hand are displayed in the virtual reality device; in one implementation, a virtual reality headset can be used so that the patient obtains an immersive experience.
In the stroke upper limb rehabilitation training method of the application, recognizing the hand's action from the processing result may include the following steps:
Let the distance between the virtual hand H and the hand action object A in the scene be d_HA; when d_HA ≤ 0, it is determined that the virtual hand touches the hand action object. In one embodiment, a signal can be sent over Bluetooth to a feedback device worn on the patient's hand, which vibrates to indicate that the patient has touched an object. In one embodiment, the feedback device may be an armband vibration sensor.
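A minimal sketch of this touch rule, assuming for illustration that the hand and the object are approximated by bounding spheres (the patent does not specify the geometry used to compute d_HA):

```python
import math

def signed_distance(hand_center, obj_center, hand_radius, obj_radius):
    """Signed distance d_HA between virtual hand H and action object A,
    modelled here as two bounding spheres (an assumption, not the patent's
    exact geometry): zero or negative when the volumes interpenetrate."""
    dx = hand_center[0] - obj_center[0]
    dy = hand_center[1] - obj_center[1]
    dz = hand_center[2] - obj_center[2]
    center_dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    return center_dist - (hand_radius + obj_radius)

def is_touching(d_ha):
    # The method's rule: touch is recognized when d_HA <= 0.
    return d_ha <= 0.0

# Example: spheres of radii 0.5 and 0.6 with centers 1.0 apart overlap.
d = signed_distance((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5, 0.6)
```

On a touch event the system would then send the Bluetooth signal to the armband vibration sensor.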
In one embodiment, recognizing the motion of the hand by the processing result may further include:
When d_HA ≤ 0 and the distance d_TI between the thumb T and the index finger I of the virtual hand is less than a first preset value N1, it is determined that the virtual hand grabs the specific object; the object is bound as a child object of the virtual hand and moves and rotates with it. The first preset value N1 can be set as required.
In one embodiment, recognizing the motion of the hand by the processing result may further include:
In the grabbing state, when d_TI is greater than a second preset value N2, it is determined that the virtual hand releases the specific object; the object is then unbound from the virtual hand's child objects and no longer moves and rotates with it. The second preset value N2 can be set as required.
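The grab and release rules form a small state machine with hysteresis: grabbing requires the pinch distance to close below N1, releasing requires it to open beyond N2. The Python sketch below is a hypothetical reading of that logic; the threshold values are illustrative, not taken from the patent:

```python
class GrabStateMachine:
    """Sketch of the grab/release recognition. `holding` stands in for the
    engine-side act of binding/unbinding the object as a child of the
    virtual hand; N1 and N2 defaults are illustrative values."""

    def __init__(self, n1=0.03, n2=0.06):
        self.n1 = n1          # first preset value: grab when d_TI < N1
        self.n2 = n2          # second preset value: release when d_TI > N2
        self.holding = False  # whether the object is bound to the hand

    def update(self, d_ha, d_ti):
        if not self.holding:
            # Grab: hand touches the object (d_HA <= 0) and the
            # thumb-index distance d_TI closes below N1.
            if d_ha <= 0.0 and d_ti < self.n1:
                self.holding = True
        else:
            # Release: in the grabbing state, d_TI opening beyond N2
            # unbinds the object so it no longer follows the hand.
            if d_ti > self.n2:
                self.holding = False
        return self.holding

states = []
sm = GrabStateMachine(n1=0.03, n2=0.06)
states.append(sm.update(-0.01, 0.02))  # touch + pinch: grab
states.append(sm.update(0.50, 0.04))   # moved away but still pinched: hold
states.append(sm.update(0.50, 0.08))   # fingers open past N2: release
```

The N1 < N2 gap prevents the object from flickering between grabbed and released when the pinch distance hovers near a single threshold.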
In the method of the present application, a flow chart of recognizing the virtual hand touching an object in the scene and the grabbing and releasing actions is shown in fig. 2.
In one embodiment, recognizing the motion of the hand by the processing result may further include:
Within a predetermined time T, when the distance d_iz between the highest z-axis coordinate z_i1 of the virtual hand's index finger and its lowest z-axis coordinate z_i2 is greater than a third preset value N3, and the distance d_wz between the highest z-axis coordinate z_w1 of the wrist joint and its lowest z-axis coordinate z_w2 is less than a fourth preset value N4, a clapping action is recognized. The predetermined time T, the third preset value N3, and the fourth preset value N4 can be set as required. The flow chart for recognizing a clap is shown in fig. 3.
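This rule can be checked over a sliding window of samples covering the time T: the index finger's z-range must be large while the wrist's z-range stays small (the fingers swing while the wrist stays still). The following sketch is an assumed reading of the rule; the frame layout and the threshold values are illustrative, not from the patent:

```python
def detect_clap(frames, n3=0.10, n4=0.03):
    """Recognize a clapping action over a window of samples.

    frames: list of (index_finger_z, wrist_z) pairs collected during the
    predetermined time T. A clap is recognized when the index finger's
    z-range d_iz = z_i1 - z_i2 exceeds N3 while the wrist's z-range
    d_wz = z_w1 - z_w2 stays below N4.
    """
    if not frames:
        return False
    index_z = [f[0] for f in frames]
    wrist_z = [f[1] for f in frames]
    d_iz = max(index_z) - min(index_z)  # finger travel along z
    d_wz = max(wrist_z) - min(wrist_z)  # wrist travel along z
    return d_iz > n3 and d_wz < n4

# Example window: the finger sweeps 0.22 in z while the wrist barely moves.
window = [(0.00, 0.000), (0.12, 0.005), (0.22, 0.002)]
clap = detect_clap(window, n3=0.10, n4=0.03)
```

Requiring the wrist to stay still distinguishes a finger clap from a whole-arm motion that would also move the fingertip.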
When the virtual hand touches an object in the scene, the feedback device worn on the patient's hand can give vibration feedback to prompt the patient that an object has been touched.
The basic actions described above are composed into games: for example, touching, grabbing and releasing can be composed into a scene for training finger grasping, and the clapping action can be composed into a scene for training the wrist joint. The scenes are designed as entertaining mini-games such as balloon popping, apple picking and whack-a-mole; each score during a game triggers an encouraging sound effect and visual effect, so the patient is continuously motivated during rehabilitation, attention is diverted from the discomfort of rehabilitation exercise, and training can be sustained over the long term.
The stroke upper limb rehabilitation training method of the application may further include the following steps:
when the virtual hand interacts with the virtual reality rehabilitation training scene, when the virtual hand touches an object in the scene, prompt information that the object is touched is sent out.
In one embodiment, the method for rehabilitation training of upper limbs of stroke according to the present application may further include:
establishing a quantitative correspondence between hand and upper limb actions and the Ueda assessment method;
judging, from the dynamic data of the patient's hand coordinates, whether the posture, angle, speed and range of the hand and upper limb motions meet the requirements of the designated evaluation action in the Ueda assessment method;
and evaluating how well the action was completed according to the judgment result.
The dynamic data of the patient hand coordinates may particularly comprise dynamic coordinates of a plurality of joints of the patient hand.
To use the application, a user must register and log in; each user name corresponds to one patient. The evaluation method combines the hand and upper limb actions of the Ueda assessment method, automatically evaluates how well the patient completes the designated evaluation actions, computes the patient's recovery-stage grade under the Ueda assessment method, and generates a visual evaluation report for the rehabilitation physician's reference.
The evaluation method is as follows. Each hand and upper limb item of the Ueda assessment method is scored from 0 to 2 points: 0 means the action cannot be completed, 1 means it can only be partially completed, and 2 means it can be fully completed. The patient's recovery-stage grade ranges from grade 1 to grade 12 and is determined as follows: a score of 2 on the associated reaction item corresponds to grade 1; a score of 2 on the voluntary contraction item corresponds to grade 2; scores of 1, 2, 3 or 4 on the synergy movement item correspond to grades 3, 4, 5 and 6 respectively; a score of 2 on the partial separation movement item, with no sub-item scoring 0, corresponds to grade 7, and a score of 4 to grade 8; a score of 2 on the separation movement item, with no sub-item scoring 0, corresponds to grade 9, a score of 4 with no sub-item scoring 0 to grade 10, and a score of 6 to grade 11; and a score of 6 on the speed test item, meeting the speed requirement, corresponds to grade 12.
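The staging rules can be read as a lookup that returns the highest grade whose condition is met. The Python sketch below is one illustrative reading of the text; the item key names and the `no_zero_subitems` flag are assumptions about how the scores are organized, and this is not clinical software:

```python
def ueda_stage(scores, no_zero_subitems=True):
    """Compute a recovery-stage grade (1-12) from per-item scores,
    following the staging rules described in the text. `scores` maps
    hypothetical item names to summed item scores; returns 0 if no
    staging condition is met."""
    stage = 0
    s = scores
    if s.get("associated_reaction", 0) >= 2:
        stage = max(stage, 1)
    if s.get("voluntary_contraction", 0) >= 2:
        stage = max(stage, 2)
    # Synergy movement: 1/2/3/4 points map to grades 3/4/5/6.
    syn = s.get("synergy_movement", 0)
    for pts, grade in ((1, 3), (2, 4), (3, 5), (4, 6)):
        if syn >= pts:
            stage = max(stage, grade)
    # Partial separation movement: 2 pts (no sub-item 0) -> 7; 4 pts -> 8.
    ps = s.get("partial_separation", 0)
    if ps >= 2 and no_zero_subitems:
        stage = max(stage, 7)
    if ps >= 4:
        stage = max(stage, 8)
    # Separation movement: 2/4 pts (no sub-item 0) -> 9/10; 6 pts -> 11.
    sep = s.get("separation", 0)
    if sep >= 2 and no_zero_subitems:
        stage = max(stage, 9)
    if sep >= 4 and no_zero_subitems:
        stage = max(stage, 10)
    if sep >= 6:
        stage = max(stage, 11)
    # Speed test: 6 pts meeting the speed requirement -> grade 12.
    if s.get("speed", 0) >= 6:
        stage = max(stage, 12)
    return stage
```

Such a function would be the core of the automatically generated assessment report mentioned in the beneficial effects.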
Example two:
As shown in figs. 4 to 6, an embodiment of the stroke upper limb rehabilitation training system of the application includes a somatosensory controller 10, an intelligent terminal 20 and a virtual reality device 30. The somatosensory controller 10 is used to acquire dynamic data of the patient's hand coordinates. The intelligent terminal 20 includes a processing module 21, which collects the dynamic data of the patient's hand coordinates, processes the dynamic data together with the position information of a hand action object, recognizes the hand's action from the processing result, maps a virtual hand in the virtual scene, and has the hand's action interact with the scene to complete the designated rehabilitation training action. The intelligent terminal 20 may be a computer or a handheld terminal such as a mobile phone or tablet. The virtual reality device 30 displays the motion of the hand in the virtual reality rehabilitation training scene in real time; it may be a virtual reality headset or another device.
In one embodiment, the system of the application may further include an armband vibration sensor 40, which issues a prompt that an object has been touched when the virtual hand, interacting with the virtual reality rehabilitation training scene, touches the object.
In the stroke upper limb rehabilitation training system provided by the application, the distance between the virtual hand H and a hand action object A in the scene is denoted d_HA; the processing module may also be adapted to determine, when d_HA ≤ 0, that the virtual hand touches the hand action object.
In one embodiment, the processing module may be further configured to judge, when d_HA ≤ 0 and the distance d_TI between the thumb T and the index finger I of the virtual hand is smaller than a first preset value, that the specific object is grabbed by the virtual hand; the specific object is then bound as a child object of the virtual hand and moves and rotates with the virtual hand. The first preset value N1 can be set as required.
In another embodiment, the processing module can be further configured to judge, in the grabbing state, that the specific object is released by the virtual hand when d_TI is larger than a second preset value; the specific object is then unbound from the child objects of the virtual hand and no longer follows the virtual hand's movement and rotation. The second preset value N2 can be set as required.
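The touch, grab, and release rules above can be sketched as a small state machine. This is an illustrative assumption: the class and method names, the surface-distance formulation of d_HA (distance to the object centre minus its radius), and the threshold values N1 and N2 are not specified in the patent.

```python
# Hypothetical sketch of the touch / grab / release logic described above.
import math

class GraspDetector:
    def __init__(self, n1=0.03, n2=0.06):
        self.n1 = n1          # first preset value N1: grab threshold (assumed metres)
        self.n2 = n2          # second preset value N2: release threshold
        self.grasping = False

    def update(self, hand_pos, obj_pos, obj_radius, thumb, index):
        # d_HA <= 0 is modelled here as the hand reaching the object's surface.
        d_ha = math.dist(hand_pos, obj_pos) - obj_radius
        d_ti = math.dist(thumb, index)    # thumb-to-index-finger distance
        if not self.grasping:
            if d_ha <= 0 and d_ti < self.n1:
                self.grasping = True      # bind object as child of the virtual hand
                return "grab"
            return "touch" if d_ha <= 0 else "none"
        if d_ti > self.n2:
            self.grasping = False         # unbind; object stops following the hand
            return "release"
        return "hold"
```

Using a release threshold N2 larger than the grab threshold N1 gives hysteresis, so small tracking jitter does not toggle the grab state.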
In the stroke upper limb rehabilitation training system provided by the application, the processing module can be further used to identify a patting action when, within a predetermined time, the distance d_iz between the highest z-axis coordinate z_i1 of the index finger of the virtual hand and its lowest z-axis coordinate z_i2 is larger than a third preset value, and the distance d_wz between the highest z-axis coordinate z_w1 of the wrist joint and the lowest z-axis coordinate z_w2 of the wrist joint is smaller than a fourth preset value. The predetermined time T, the third preset value N3, and the fourth preset value N4 can be set as required.
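The patting rule reduces to comparing two z-coordinate ranges over the window T: the index finger must move up and down enough while the wrist stays roughly still. The sketch below assumes the z-samples for the window are already collected; the function name and the default thresholds N3 and N4 are illustrative.

```python
# Illustrative sketch of the patting-action rule described above.
def is_patting(index_z, wrist_z, n3=0.05, n4=0.02):
    """index_z / wrist_z: z-coordinate samples gathered over the
    predetermined time T. Returns True when a patting action is detected."""
    d_iz = max(index_z) - min(index_z)   # z_i1 - z_i2, index-finger z range
    d_wz = max(wrist_z) - min(wrist_z)   # z_w1 - z_w2, wrist-joint z range
    return d_iz > n3 and d_wz < n4       # finger moves, wrist stays still
```

Gating on the wrist range distinguishes a finger pat from a whole-arm movement, which would raise both ranges together.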
In one embodiment, the dynamic data of the patient's hand coordinates includes coordinate information of the patient's hand joint positions; the processing module can be further used to map a virtual hand at the positions in the virtual scene corresponding to the joint coordinate information, so that the posture and relative position of the hand are consistent with reality.
The coordinate information may include the coordinates of a plurality of joints of a single hand, for example all or some of the 29 joints of the hand. In this embodiment, the coordinates of all 29 joints (x1, y1, z1), (x2, y2, z2), ..., (x29, y29, z29) may be selected; a virtual hand is then mapped at the corresponding positions in the virtual scene, its posture and relative position consistent with the real patient's hand.
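Because the mapping preserves posture and relative joint positions, it can be sketched as a rigid translation of all 29 joint coordinates into scene space. The fixed offset and the function name are assumptions; a real system would also handle rotation and the controller-to-scene scale.

```python
# Minimal sketch of mapping 29 tracked joints into the virtual scene while
# preserving the hand's posture (relative joint positions unchanged).
def map_to_virtual(joints, scene_offset=(0.0, 0.0, 0.5)):
    """joints: list of (x, y, z) tuples, e.g. 29 per hand, from the
    somatosensory controller. Returns the joints in scene coordinates."""
    ox, oy, oz = scene_offset
    return [(x + ox, y + oy, z + oz) for (x, y, z) in joints]
```

Applying the same offset to every joint keeps all inter-joint distances intact, which is exactly the "posture consistent with reality" requirement.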
In another embodiment, the intelligent terminal 20 may further include an evaluation module 22. The evaluation module 22 is configured to establish a quantitative correspondence between the motions of the hand and upper limb and the Ueda (Shang Tian Min) assessment method; to judge, from the dynamic data of the patient's hand coordinates, whether the posture, angle, speed, and range of the hand and upper-limb motions meet the requirements of the designated evaluation actions in the Ueda assessment method; and to evaluate the action completion from the judgment result. The dynamic data of the patient's hand coordinates may in particular comprise the dynamic coordinates of a plurality of joints of the patient's hand. Combined with the hand and upper-limb part of the Ueda assessment method, the evaluation module automatically generates an evaluation report and determines the patient's recovery-stage grade.
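One way such a posture/angle/speed judgment could work is to derive a joint angle from three tracked points and score the item 0/1/2 from the angle reached and the time taken. Everything below is a hypothetical illustration: the elbow-extension example, the angle formula, and the thresholds are assumptions, not the patent's specified criteria.

```python
# Hypothetical sketch: scoring one evaluation action (elbow extension) from
# joint coordinates. Thresholds and the chosen action are illustrative only.
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def score_action(shoulder, elbow, wrist, duration_s,
                 min_angle=150.0, max_time=3.0):
    """0 = not completed, 1 = completed but too slowly, 2 = fully completed."""
    if joint_angle(shoulder, elbow, wrist) < min_angle:
        return 0
    return 2 if duration_s <= max_time else 1
```

The item scores produced this way would then feed the 0-2-per-item grading of the Ueda method described elsewhere in this document.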
When the application is used, a user must register and log in, and each user name corresponds to one patient. Combined with the hand and upper-limb part of the Ueda (Shang Tian Min) assessment method, the evaluation is carried out automatically according to how the patient completes the designated evaluation actions, the patient's recovery-stage grade under the Ueda assessment method is calculated, and a visual evaluation report is generated for reference by the rehabilitation doctor.
The evaluation method is as follows. Each item of the hand and upper-limb part of the Ueda (Shang Tian Min) assessment method is scored from 0 to 2: 0 means the action cannot be completed, 1 means it can only be completed partially, and 2 means it can be completed fully. The patient's recovery-stage grade ranges from grade 1 to grade 12. If the patient scores 2 points on the associated-reaction item, the Ueda assessment rates the recovery stage as grade 1; if the patient scores 2 points on the voluntary-contraction item, the recovery stage is rated grade 2. On the synergy-movement item, a score of 1 point gives grade 3, 2 points grade 4, 3 points grade 5, and 4 points grade 6. If the patient scores 2 points on the partial-separation-movement item with no sub-item scored 0, the recovery stage is grade 7; a score of 4 points gives grade 8. If the patient scores 2 points on the separation-movement item with no sub-item scored 0, the grade is 9; 4 points with no sub-item scored 0 gives grade 10; 6 points gives grade 11. If the patient scores 6 points on the speed-test item and meets the speed requirement, the recovery stage is rated grade 12.
The foregoing is a more detailed description of the present application in connection with specific embodiments, and the specific implementation of the present application is not to be considered limited to these descriptions. It will be apparent to those skilled in the art that many simple derivations or substitutions can be made without departing from the spirit of the disclosure.

Claims (10)

1. A stroke upper limb rehabilitation training method is characterized by comprising the following steps:
constructing a virtual reality rehabilitation training scene;
acquiring dynamic data of hand coordinates of a patient, processing the dynamic data and position information of a hand action object, identifying the action of the hand through a processing result, mapping a virtual hand in a virtual scene, and enabling the action of the hand to interact with the scene so as to finish the designated action of rehabilitation training;
displaying the motion of the hand in the virtual reality rehabilitation training scene in real time through a virtual reality device;
the recognizing the motion of the hand by the processing result specifically includes:
letting the distance between the virtual hand H and the hand action object A in the scene be d_HA;
when d_HA ≤ 0, determining that the virtual hand touches the hand action object;
when d isHALess than or equal to 0, and the distance d between the thumb T and the index finger I of the virtual handTIIf the number of the specific objects is smaller than the first preset value, judging that the specific objects are grabbed by the virtual hand, binding the specific objects as sub-objects of the virtual hand, and moving and rotating the specific objects along with the virtual hand;
the recognizing the motion of the hand by the processing result further includes:
within a predetermined time, when the distance d_iz between the highest z-axis coordinate z_i1 of the index finger of the virtual hand and its lowest z-axis coordinate z_i2 is larger than a third preset value, and the distance d_wz between the highest z-axis coordinate z_w1 of the wrist joint and the lowest z-axis coordinate z_w2 of the wrist joint is smaller than a fourth preset value, identifying a patting action.
2. The method of claim 1, wherein the identifying the motion of the hand by the processing result further comprises:
in the grabbing-action state, when d_TI is larger than a second preset value, judging that the specific object is released by the virtual hand; the specific object is then unbound from the child objects of the virtual hand, and the object no longer moves and rotates with the virtual hand.
3. The method of claim 1, wherein mapping out a virtual hand in a virtual scene specifically comprises:
the dynamic data of the hand coordinates of the patient comprises coordinate information of the hand joint positions of the patient;
and mapping a virtual hand at a corresponding position in a virtual scene through the coordinate information of the hand joint position, so that the posture and the relative position of the hand are consistent with those in reality.
4. The method of claim 1, further comprising:
when the virtual hand interacts with the virtual reality rehabilitation training scene, when the virtual hand touches an object in the scene, prompt information that the object is touched is sent out.
5. The method of any of claims 1 to 4, further comprising:
establishing a quantitative correspondence between the motions of the hand and upper limb and the Ueda (Shang Tian Min) assessment method;
judging, according to the dynamic data of the patient's hand coordinates, whether the posture, angle, speed, and range of the hand and upper-limb motions meet the requirements of the designated evaluation actions in the Ueda assessment method;
and evaluating the action completion condition according to the judgment result.
6. A stroke upper limb rehabilitation training system, comprising:
the body sensation controller is used for acquiring dynamic data of hand coordinates of the patient;
the intelligent terminal comprising a processing module, wherein the processing module is used for acquiring dynamic data of the hand coordinates of a patient, processing the dynamic data and the position information of a hand action object, identifying the action of the hand through the processing result, mapping out a virtual hand in a virtual scene, and enabling the action of the hand to interact with the scene so as to finish the designated action of rehabilitation training;
the virtual reality device is used for displaying the motion of the hand in the virtual reality rehabilitation training scene in real time;
the distance between the virtual hand H and the hand action object A in the scene being denoted d_HA;
wherein the processing module is also used for determining, when d_HA ≤ 0, that the virtual hand touches the hand action object;
when d isHALess than or equal to 0, and the distance d between the thumb T and the index finger I of the virtual handTIIf the number of the specific objects is smaller than the first preset value, judging that the specific objects are grabbed by the virtual hand, binding the specific objects as sub-objects of the virtual hand, and moving and rotating the specific objects along with the virtual hand;
the processing module is further used for determining the highest z-axis coordinate z of the index finger of the virtual hand within a predetermined timei1With its lowest z-axis coordinate zi2A distance d betweenizWhen the Z-axis coordinate is larger than the third preset value and the highest Z-axis coordinate of the wrist jointw1Lowest z-axis coordinate z with wrist jointw2A distance d betweenwzAnd when the second preset value is less than the fourth preset value, the action of the racket is identified.
7. The system of claim 6, wherein the processing module is further configured to judge, in the grabbing-action state, when d_TI is larger than a second preset value, that the specific object is released by the virtual hand; the specific object is then unbound from the child objects of the virtual hand, and the object no longer moves and rotates with the virtual hand.
8. The system of claim 6, wherein the dynamic data of patient hand coordinates includes coordinate information of patient hand joint positions;
the processing module is further used for mapping a virtual hand from the corresponding position of the coordinate information of the hand joint position in the virtual scene, so that the posture and the relative position of the hand are consistent with those in reality.
9. The system of claim 6, further comprising an armband vibration sensor for issuing a notification that an object has been touched when the virtual hand touches an object in the scene while the virtual hand interacts with the virtual reality rehabilitation training scene.
10. The system of claim 6, wherein the intelligent terminal further comprises an evaluation module for establishing a quantitative correspondence between the motions of the hand and upper-limb parts and the Ueda (Shang Tian Min) assessment method; judging, according to the dynamic data of the patient's hand coordinates, whether the posture, angle, speed, and range of the hand and upper-limb motions meet the requirements of the designated evaluation actions in the Ueda assessment method; and evaluating the action completion condition according to the judgment result.
CN201711046995.5A 2017-10-31 2017-10-31 Cerebral stroke upper limb rehabilitation training method and system Active CN107789803B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711046995.5A CN107789803B (en) 2017-10-31 2017-10-31 Cerebral stroke upper limb rehabilitation training method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711046995.5A CN107789803B (en) 2017-10-31 2017-10-31 Cerebral stroke upper limb rehabilitation training method and system

Publications (2)

Publication Number Publication Date
CN107789803A CN107789803A (en) 2018-03-13
CN107789803B true CN107789803B (en) 2020-07-24

Family

ID=61548547

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711046995.5A Active CN107789803B (en) 2017-10-31 2017-10-31 Cerebral stroke upper limb rehabilitation training method and system

Country Status (1)

Country Link
CN (1) CN107789803B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109316732A (en) * 2018-09-28 2019-02-12 安阳市翔宇医疗设备有限责任公司 A kind of Training valuation device, equipment and readable storage medium storing program for executing
CN109350923B (en) * 2018-10-25 2021-06-01 北京机械设备研究所 Upper limb rehabilitation training system based on VR and multi-position sensors
CN109550233A (en) * 2018-11-15 2019-04-02 东南大学 Autism child attention training system based on augmented reality
CN109192272A (en) * 2018-11-26 2019-01-11 燕山大学 Based on the Leap Motion healing hand function training system combined with VR and its implementation
CN109830282A (en) * 2019-03-29 2019-05-31 贾艳滨 Scene training data processing method and processing device
CN110232963B (en) * 2019-05-06 2021-09-07 中山大学附属第一医院 Upper limb movement function evaluation system and method based on stereoscopic display technology
CN114082158B (en) * 2021-11-18 2022-05-20 南京医科大学 Upper limb rehabilitation training system for stroke patient
CN114367086A (en) * 2021-12-31 2022-04-19 华南理工大学 Lower limb rehabilitation training game system
CN115054903A (en) * 2022-06-30 2022-09-16 北京工业大学 Virtual game rehabilitation system and method for active rehabilitation of stroke patient

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103230664A (en) * 2013-04-17 2013-08-07 南通大学 Upper limb movement rehabilitation training system and method based on Kinect sensor
CN106422203A (en) * 2016-11-23 2017-02-22 佛山科学技术学院 Upper limb rehabilitation training method based on photoelectric multimode feedback of mirror image therapy
CN106821333A (en) * 2017-03-21 2017-06-13 黑龙江尔惠科技有限公司 A kind of cognition dysfunction rehabilitation detection means based on virtual scene, method and therapeutic equipment

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US9696795B2 (en) * 2015-02-13 2017-07-04 Leap Motion, Inc. Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN103230664A (en) * 2013-04-17 2013-08-07 南通大学 Upper limb movement rehabilitation training system and method based on Kinect sensor
CN106422203A (en) * 2016-11-23 2017-02-22 佛山科学技术学院 Upper limb rehabilitation training method based on photoelectric multimode feedback of mirror image therapy
CN106821333A (en) * 2017-03-21 2017-06-13 黑龙江尔惠科技有限公司 A kind of cognition dysfunction rehabilitation detection means based on virtual scene, method and therapeutic equipment

Non-Patent Citations (1)

Title
Leap Motion-based active exercise system for upper-limb function rehabilitation after stroke; Liu Zhihui et al.; Journal of Donghua University (Natural Science Edition); 2016-08-31; pp. 572-575 *

Also Published As

Publication number Publication date
CN107789803A (en) 2018-03-13

Similar Documents

Publication Publication Date Title
CN107789803B (en) Cerebral stroke upper limb rehabilitation training method and system
CN109350923B (en) Upper limb rehabilitation training system based on VR and multi-position sensors
Mousavi Hondori et al. A spatial augmented reality rehab system for post-stroke hand rehabilitation
Da Gama et al. Motor rehabilitation using Kinect: a systematic review
Webster et al. Systematic review of Kinect applications in elderly care and stroke rehabilitation
KR101784410B1 (en) Kinect-based Pose Recognition Method and System for Exercise Game
KR20140107062A (en) Posture training system and method of control thereof
MX2011001698A (en) 3d monocular visual tracking therapy system for the rehabilitation of human upper limbs.
Karime et al. A fuzzy-based adaptive rehabilitation framework for home-based wrist training
Koutsiana et al. Serious gaming technology in upper extremity rehabilitation: scoping review
WO2023087954A1 (en) Upper limb rehabilitation training system for stroke patients
Song et al. Activities of daily living-based rehabilitation system for arm and hand motor function retraining after stroke
CN110404243A (en) A kind of method of rehabilitation and rehabilitation system based on posture measurement
Matos et al. Kinteract: a multi-sensor physical rehabilitation solution based on interactive games
KR101563298B1 (en) Hand rehabilitation system based on hand motion recognition
KR102048551B1 (en) System and Method for Virtual reality rehabilitation training using Smart device
Helmer et al. Smart textiles: Position and motion sensing for sport, entertainment and rehabilitation
KR20150097050A (en) learning system using clap game for child and developmental disorder child
Shen et al. A novel approach in rehabilitation of hand-eye coordination and finger dexterity
KR101567859B1 (en) System for rehabilitation exercise using user motion
Fazeli et al. A virtual environment for hand motion analysis
Kolsanov et al. Augmented Reality application for hand motor skills rehabilitation
Kuchinke et al. Technical view on requirements for future development of hand-held rehabilitation devices
Park et al. Development of a dance rehabilitation system using kinect and a vibration feedback glove
Yasmin Virtual Reality and Assistive Technologies: A Survey.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant