CN114367091B - Interaction method and device for double upper limb non-contact rehabilitation training

Interaction method and device for double upper limb non-contact rehabilitation training

Info

Publication number
CN114367091B
CN114367091B
Authority
CN
China
Prior art keywords
upper limb
patients
patient
grid
actions
Prior art date
Legal status
Active
Application number
CN202210034551.4A
Other languages
Chinese (zh)
Other versions
CN114367091A
Inventor
王俊华
王兆坤
Current Assignee
Guangzhou Xiaokang Medical Technology Co ltd
Original Assignee
Guangzhou Xiaokang Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Xiaokang Medical Technology Co ltd filed Critical Guangzhou Xiaokang Medical Technology Co ltd
Priority to CN202210034551.4A
Publication of CN114367091A
Application granted
Publication of CN114367091B



Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B23/00Exercising apparatus specially adapted for particular parts of the body
    • A63B23/035Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously
    • A63B23/12Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for upper limbs or related muscles, e.g. chest, upper back or shoulder muscles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118Determining activity level
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121Determining geometric values, e.g. centre of rotation or angular range of movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0075Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0084Exercising apparatus with means for competitions, e.g. virtual races
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0087Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0087Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
    • A63B2024/0096Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load using performance related parameters for controlling electronic or video games or avatars
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638Displaying moving images of recorded environment, e.g. virtual environment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0647Visualisation of executed movements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/065Visualisation of specific exercise parameters
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/803Motion sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/806Video cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/807Photo cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/62Measuring physiological parameters of the user posture
    • A63B2230/625Measuring physiological parameters of the user posture used as a control parameter for the apparatus

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The embodiment of the invention discloses an interaction method and device for double (two-patient) upper limb non-contact rehabilitation training, comprising the following steps: skeleton data of the two patients are captured and returned by a somatosensory sensor, the two patients are distinguished by their left-right positional relation, each patient's upper limb actions are reconstructed from the skeleton data, and the two patients are guided to perform upper limb movements; the patients' upper limb actions are defined on a region grid, every cell of which is matched to one of the patient's basic upper limb actions, and the two patients locate their upper limb movement trajectories with the region grid; the upper limb movement assistive-support mode and the body-position mode for upper limb training are determined separately from each patient's limb function evaluation results; through the somatosensory sensor and a virtual reality interaction system, the two patients use their respective upper limb motions to complete interactive upper limb rehabilitation training schemes of different difficulty levels. The invention not only guides patients through upper limb movements and makes upper limb training more engaging, but also improves equipment utilization.

Description

Interaction method and device for double upper limb non-contact rehabilitation training
Technical Field
The invention relates to the technical field of virtual reality, in particular to an interaction method and device for double upper limb non-contact rehabilitation training.
Background
Upper limb motor dysfunction in patients with stroke, traumatic brain injury and spinal cord injury seriously affects quality of life, and these patients need active rehabilitation training to restore function. The prior art relies mainly on single-patient, single-task training, which ties up a large amount of rehabilitation equipment and uses it inefficiently; and because such training offers no competition or interaction, it is difficult to keep patients engaged, so the rehabilitation process is long and its effect is not obvious.
Chinese patent applications CN109589557A and CN109589556A propose a contact-type double upper limb rehabilitation training system and evaluation method based on a virtual reality environment, in which two patients each hold a handle moving on the same training plane and complete upper limb rehabilitation training through contact interaction between the two handles and a virtual scene and virtual target on that plane. This prior art is therefore a double upper limb training method in contact with a virtual target: a handle must be held throughout training to interact with the target, which limits practicality, restricts the space available for three-dimensional training of the two patients, and risks limb collisions when the two patients are not paying attention during training.
Disclosure of Invention
In view of the above technical problems, an object of the present invention is to provide an interaction method and apparatus for double upper limb non-contact rehabilitation training. A Kinect is used as the input device for two-person upper limb motion capture; a computer receives the two-person skeleton data captured and returned by the Kinect, reconstructs each patient's upper limb actions from the skeleton data, and, using the two patients' upper limb movement trajectories located by their respective region grids and the relationship between those trajectories and a virtual target, quickly judges the effectiveness of each patient's upper limb actions and compares the two.
The technical scheme of the invention is as follows:
according to a specific embodiment of the invention, the invention discloses an interaction method for double upper limb non-contact rehabilitation training, which is characterized by comprising the following steps of:
s1: bone data of the two patients are captured and returned through the somatosensory sensor, the two patients are distinguished through the left-right position relation, and the upper limb actions of the human body are reconstructed according to the bone data of the two patients;
s2: defining the upper limb actions of the patients based on the region cells, and positioning a plurality of standard upper limb actions and a plurality of standard movement tracks of the upper limbs for the two patients by using the region cells, wherein all cells in the region correspond to a plurality of basic upper limb actions of the patients;
s3: defining a virtual target falling point based on region grids for guiding two patients to finish matched upper limb actions, wherein each grid in the region is the falling point position of the virtual target;
s4: respectively determining an upper limb movement auxiliary support mode and a body position mode during upper limb training according to the limb function evaluation results of the two patients;
s5: the two patients respectively control respective virtual dummy bodies in the virtual scene to interact with the virtual object by using respective upper limb motions through the somatosensory sensor and the virtual reality interaction system, so as to complete upper limb interactive rehabilitation training schemes with different difficulty degrees;
s6, in the training process, according to the interaction accuracy of the virtual avatar and the virtual object operated by the two patients, judging the completion degree of the upper limb actions of the two patients within the specified time and giving timely feedback;
s7: after training is finished, upper limb training scores are evaluated according to the conditions of finishing the upper limb actions of the two patients, the times of the unfinished upper limb actions and the unfinished upper limb actions are analyzed, and the upper limb rehabilitation training results of the two patients are compared and analyzed.
In the above technical solution, reconstructing the upper limb actions in step S1 means using the positional relationship of the wrist, elbow and shoulder joints computed from the two patients' skeleton data to reconstruct either nine basic upper limb actions at the nine positions up, down, left, right, middle, upper left, lower left, upper right and lower right, or six basic upper limb actions at the six positions up, down, upper left, lower left, upper right and lower right.
In the above technical solution, the region grid in step S2 means that the region of the patient's upper limb movement is divided into a nine-grid (3×3) or a six-grid (2×3);
the nine-grid takes the patient's nine basic upper limb action positions (up, down, left, right, middle, upper left, lower left, upper right and lower right) as markers, where "left, middle, right" refer to the strip-shaped region in which the upper limb moves above the umbilical line and below the shoulder line, "upper left, up, upper right" refer to the action positions above the shoulder line, and "lower left, down, lower right" refer to the action positions below the umbilical line; the patient's nine basic upper limb actions match the nine cells of the nine-grid, and a complex upper limb action consists of 2 or more basic upper limb actions;
the six-grid takes the patient's six basic upper limb action positions (up, down, upper left, lower left, upper right and lower right) as markers; the upper limb movement region is divided into an upper part and a lower part whose boundary is the midline between the shoulder line and the umbilical line, "upper left, up, upper right" referring to action positions above the midline and "lower left, down, lower right" to action positions below the midline; the patient's six basic upper limb actions match the six cells of the six-grid, and a complex upper limb action consists of 2 or more basic upper limb actions.
In the above technical solution, step S3 specifically includes:
step S301: the drop point position of a virtual object is located in advance through the nine-grid or six-grid;
step S302: the drop point position is prompted in advance, so that the patient can complete the corresponding upper limb action in time;
step S303: the patient successfully touches the virtual object upon reaching the corresponding cell of the nine-grid or six-grid in time.
In the above technical solution, the difficulty level of the upper limb interactive rehabilitation training scheme in step S5 is determined by one or more of the virtual object's size, speed, movement distance and movement trajectory.
In the above technical solution, whether the two patients' upper limb actions are completed in time is judged in step S6 as follows: when a patient performs an upper limb action that brings the virtual avatar to a cell of the matched nine-grid or six-grid, and the current time is the time at which the corresponding virtual object reaches its drop point, the patient's upper limb action is judged to be in place; otherwise it is judged not in place.
According to another specific embodiment of the invention, the invention discloses an interactive device for double upper limb non-contact rehabilitation training, comprising:
the captured data reconstruction module is used for capturing and returning the bone data of the two patients through the somatosensory sensor, distinguishing the two patients through the left-right position relation, and reconstructing the basic upper limb actions of the human body according to the bone data of the two patients; the motion sensing sensor adopts Kinect depth image acquisition equipment or a wearable human body position tracker;
the regional division setting module is used for defining the basic upper limb actions of the patient based on the nine-grid or six-grid, each grid in the nine-grid or six-grid is matched with a certain basic upper limb action of the patient, and the two patients can position the basic upper limb actions by utilizing the nine-grid or six-grid;
the virtual object falling point setting module is used for defining a virtual object falling point based on the nine-palace grid or the six-palace grid and guiding the two patients to perform matched basic upper limb actions, wherein each grid in the nine-palace grid or the six-palace grid is the falling point position of the virtual object;
the upper limb training auxiliary module is used for respectively determining an upper limb movement auxiliary support mode and a body position mode during upper limb training according to the limb function evaluation results of the two patients;
the upper limb movement training module is used for controlling respective virtual dummy bodies in the virtual scene to interact with virtual objects through the somatosensory sensor and the virtual reality interaction system by utilizing respective upper limb movement of two patients to complete an upper limb interaction rehabilitation training scheme with different difficulty degrees;
the training instant feedback module is used for judging the completion degree of the upper limb actions of the two patients within the specified time according to the interaction accuracy of the virtual avatar and the virtual object manipulated by the two patients in the training process and giving timely feedback;
and after the training is finished, the upper limb training score is evaluated according to the upper limb action completion condition of the two patients, the times of the upper limb actions which are not finished and the times of the upper limb actions which are not finished are analyzed, and the upper limb rehabilitation training results of the two patients are compared and analyzed.
In the above technical solution, the captured-data reconstruction module includes an action reconstruction module, which, from the positional relationship of the wrist, elbow and shoulder joints computed from the two patients' skeleton data, reconstructs the basic upper limb actions at the nine positions up, down, left, right, middle, upper left, lower left, upper right and lower right of the patient's upper limb, or at the six positions up, down, upper left, lower left, upper right and lower right.
In the above technical solution, the nine-grid takes the patient's nine basic upper limb action positions (up, down, left, right, middle, upper left, lower left, upper right and lower right) as markers, where "left, middle, right" refer to the strip-shaped region in which the upper limb moves above the umbilical line and below the shoulder line, "upper left, up, upper right" refer to the action positions above the shoulder line, and "lower left, down, lower right" refer to the action positions below the umbilical line; the patient's nine basic upper limb actions match the nine cells of the nine-grid, and a complex upper limb action consists of 2 or more basic upper limb actions;
the six-grid takes the patient's six basic upper limb action positions (up, down, upper left, lower left, upper right and lower right) as markers; the upper limb movement region is divided into an upper part and a lower part whose boundary is the midline between the shoulder line and the umbilical line, "upper left, up, upper right" referring to action positions above the midline and "lower left, down, lower right" to action positions below the midline; the patient's six basic upper limb actions match the six cells of the six-grid, and a complex upper limb action consists of 2 or more basic upper limb actions.
In the above technical solution, the virtual object drop point setting module includes:
the positioning falling point position module is used for positioning the falling point position of the virtual object in advance through the nine-square grid or the six-square grid;
the drop point position prompting module is used for prompting the drop point position in advance so that the patient can easily complete corresponding upper limb actions in time;
and the virtual module touch module is used for judging whether the virtual object is successfully touched according to whether the patient timely reaches each position of the nine-grid or the six-grid.
In the above technical solution, the difficulty level of the upper limb interactive rehabilitation training scheme is determined by one or more of the virtual object's size, speed, movement distance and movement trajectory.
In the above technical solution, the training instant-feedback module judges the degree of completion of the two patients' upper limb actions within the prescribed time as follows: when a patient performs an upper limb action that brings the virtual avatar to a cell of the matched nine-grid or six-grid, and the current time is the time at which the corresponding virtual object reaches its drop point, the patient's upper limb action is judged to be in place; otherwise it is judged not in place.
Compared with the prior art, the specific embodiment of the invention has the following beneficial effects:
1. The invention uses the Kinect as the two-person upper limb motion capture input device; after receiving the two-person skeleton data captured and returned by the Kinect, the computer reconstructs each patient's upper limb actions from the skeleton data.
2. In a virtual reality environment, a target moving in from a distance poses a dynamic three-dimensional positioning problem, and the two patients find it difficult to judge the virtual target's drop point position accurately and in time; by locating and prompting the drop point in advance on the region grid, the invention removes this difficulty.
3. The invention marks upper limb movement by the position of the wrist joint relative to the elbow and shoulder joints, and guides the patient's wrist to move to the different cells of the nine-grid.
4. The invention can simultaneously process, on the same system and the same display screen, the non-contact wrist movement positions of two people and their interactions with the virtual target, completing two-person non-contact interactive competitive training and interactive cooperative training.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below; it is obvious that the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic diagram of the definition of upper limb joint points and their positions in world coordinates according to the present invention;
FIG. 2 is a schematic diagram of an interaction method for double upper limb non-contact rehabilitation training according to the present invention;
FIG. 3 is a schematic diagram of the definition of upper limb actions on the nine-grid of the present invention;
FIG. 4 is a schematic diagram illustrating a motion trajectory based on the nine-grid of the present invention;
FIG. 5 is a schematic diagram of the definition of upper limb actions on the six-grid of the present invention;
FIG. 6 is a schematic diagram illustrating a motion trajectory based on the six-grid.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention.
Since the invention mainly studies the upper limb movement of the two patients, the embodiment of the invention mainly defines the joint points of the upper body, as shown in fig. 1. Specifically, for patient 2: head A_3, right shoulder A_4, left shoulder A_8, spine A_1, right hand A_21, right hip A_12, hip center A_0, left hip A_16, left hand A_22, right foot A_15, shoulder center A_20; for patient 1: head B_3, right shoulder B_4, left shoulder B_8, spine B_1, right hand B_21, right hip B_12, hip center B_0, left hip B_16, left hand B_22, right foot B_15, shoulder center B_20.
For the coordinate expression of the joint points, the coordinate values of all joint points of the two patients in the world coordinate system can be obtained directly from the Kinect device; the coordinate symbols are defined in table 1.
TABLE 1 Coordinate definitions of all joint points: each joint point A_i of patient 2 has world coordinates (X_{Ai}, Y_{Ai}, Z_{Ai}), and each joint point B_i of patient 1 has world coordinates (X_{Bi}, Y_{Bi}, Z_{Bi}).
According to an embodiment of the present invention, as shown in fig. 2, the invention discloses an interaction method for double upper limb non-contact rehabilitation training, comprising the following steps:
s1: capturing and returning skeleton data of the two patients through the somatosensory sensor, distinguishing the two patients through the left-right position relation, and reconstructing the upper limb action of the human body according to the skeleton data of the two patients;
in the specific embodiment of this application, the sensor is felt to the body can adopt Kinect degree of depth image acquisition equipment, also can be wearing formula human position tracker.
As for distinguishing patient 1 from patient 2: the Kinect depth-image acquisition device separates the two skeleton coordinate systems according to the order in which the patients enter, but cannot determine which is patient 1 and which is patient 2. The invention therefore uses the left-right positional relation to decide whether a tracked person is patient 1 or patient 2.
When two sets of skeleton coordinate systems are found, one set is named J and the other K. When the X-axis coordinate value J_{0X} of point J_0 in skeleton J and the X-axis coordinate value K_{0X} of point K_0 in skeleton K satisfy formula 1, J_0 and the skeleton containing it are taken to be patient 2 and its 0 point is marked A_0, while the other skeleton is taken to be patient 1 and its 0 point is marked B_0:

J_{0X} < K_{0X} - 0.3    (1)
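As an illustration of this assignment rule, the following minimal Python sketch (not the patent's code; the function name, data layout and `margin` parameter are assumptions) applies formula 1 to two tracked skeletons given as dictionaries mapping joint index to (x, y, z) world coordinates:

```python
# Minimal sketch: assigning patient roles by the left-right relation of
# formula 1. Skeletons are assumed to be dicts mapping joint index ->
# (x, y, z) in Kinect world coordinates (metres); joint 0 is the hip
# center, and margin = 0.3 m comes from formula 1.

def assign_patients(skeleton_j, skeleton_k, margin=0.3):
    """Return (patient_1, patient_2) for two tracked skeletons J and K."""
    j0x = skeleton_j[0][0]  # J_0X: X coordinate of J's hip center
    k0x = skeleton_k[0][0]  # K_0X: X coordinate of K's hip center
    if j0x < k0x - margin:              # formula (1): J_0X < K_0X - 0.3
        return skeleton_k, skeleton_j   # J is patient 2, K is patient 1
    return skeleton_j, skeleton_k       # otherwise the roles are swapped
```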
Reconstructing the upper limb actions from the two patients' skeleton data means using the positional relationship of the wrist, elbow and shoulder joints computed from the skeleton data to reconstruct either nine basic upper limb actions at the nine positions up, down, left, right, middle, upper left, lower left, upper right and lower right of the patient's upper limb, or six basic upper limb actions at the six positions up, down, upper left, lower left, upper right and lower right.
S2: defining a regional grid-based patient upper limb motion for two patients using the regional grid to locate a plurality of standard upper limb motions and to locate a plurality of standard trajectories for the upper limbs, all grids in a region corresponding to a plurality of basic upper limb motions of the patient.
Upper limb rehabilitation training mainly guides the two patients to swing the upper limbs up, down, left and right. Based on these swing characteristics, the embodiment of the invention designs an upper limb action description method based on the nine-grid or six-grid to describe the two patients' upper limb actions quickly in real time, meeting the needs of a two-player game.
In step S2, the region grid means that the region of the patient's upper limb movement is divided into a nine-grid or a six-grid.
The nine-grid takes the patient's nine basic upper limb action positions (up, down, left, right, middle, upper left, lower left, upper right and lower right) as markers, where "left, middle, right" refer to the strip-shaped region in which the upper limb moves above the umbilical line and below the shoulder line, "upper left, up, upper right" refer to the action positions above the shoulder line, and "lower left, down, lower right" refer to the action positions below the umbilical line; as shown in fig. 3, the patient's nine basic upper limb actions match the nine cells of the nine-grid. A complex upper limb action consists of 2 or more basic upper limb actions; for example, a transection action may consist of the three basic actions left → middle → right.
The six-grid takes the patient's six basic upper limb action positions (up, down, upper left, lower left, upper right and lower right) as markers; the upper limb movement region is divided into an upper part and a lower part whose boundary is the midline between the shoulder line and the umbilical line, "upper left, up, upper right" referring to action positions above the midline and "lower left, down, lower right" to action positions below the midline; the patient's six basic upper limb actions match the six cells of the six-grid, and a complex upper limb action consists of 2 or more basic upper limb actions, as described in detail below.
Physical definition of the nine-grid upper limb actions:
the physical definition of the upper limb movements of patient 1 and patient 2 and their mathematical expressions are similar, differing only in the joint of patient 1 or patient 2; patient 2 was used as the subject of the formula below:
(1) Definition of the umbilical line height H and its mathematical expression:

H = (Y_{A1} + Y_{A16}) / 2    (2)

(2) Definition of the upper limb actions and their mathematical expressions:

"Left, middle, right" refer to the strip in which the hand moves above the umbilical line and below the shoulder line; the action positions above the shoulder line are "upper left, up, upper right"; the action positions below the umbilical line are "lower left, down, lower right".

The physical definition of the upper limb action "up": the hand moves between the two shoulder-hip lines and above the shoulder line, i.e. it stays basically vertically upward. The mathematical expression is shown in formula 3.

((Y_{A21} > Y_{A4}) ∩ (X_{A21} > X_{A8} ∩ X_{A21} < X_{A4})) ∪ ((Y_{A22} > Y_{A8}) ∩ (X_{A22} > X_{A8} ∩ X_{A22} < X_{A4}))    (3)

The physical definition of the upper limb action "down": the hand moves between the two shoulder-hip lines and below the umbilical line, i.e. it stays basically vertically downward. The mathematical expression is shown in formula 4.

((Y_{A21} < H) ∩ (X_{A21} > X_{A8} ∩ X_{A21} < X_{A4})) ∪ ((Y_{A22} < H) ∩ (X_{A22} > X_{A8} ∩ X_{A22} < X_{A4}))    (4)

The physical definition of the upper limb action "left": the hand moves, within the middle strip between the umbilical line and the shoulder line, outside the left shoulder-hip line. The mathematical expression is shown in formula 5.

((X_{A21} < X_{A4}) ∩ (Y_{A21} > H ∩ Y_{A21} < Y_{A4})) ∪ ((X_{A22} < X_{A8}) ∩ (Y_{A22} > H ∩ Y_{A22} < Y_{A8}))    (5)

The physical definition of the upper limb action "right": the hand moves, within the middle strip between the umbilical line and the shoulder line, outside the right shoulder-hip line. The mathematical expression is shown in formula 6.

((X_{A21} > X_{A4}) ∩ (Y_{A21} > H ∩ Y_{A21} < Y_{A4})) ∪ ((X_{A22} > X_{A8}) ∩ (Y_{A22} > H ∩ Y_{A22} < Y_{A8}))    (6)

The physical definition of the upper limb action "middle": the hand moves in the region where the strip between the two shoulder-hip lines crosses the strip between the umbilical line and the shoulder line. The mathematical expression is shown in formula 7.

((Y_{A21} > H) ∩ (Y_{A21} < Y_{A4} ∩ X_{A21} > X_{A8} ∩ X_{A21} < X_{A4})) ∪ ((Y_{A22} > H) ∩ (Y_{A22} < Y_{A4} ∩ X_{A22} > X_{A8} ∩ X_{A22} < X_{A4}))    (7)

The physical definition of the upper limb action "upper left": the hand moves in the upper-left sector where the left shoulder-hip line and the shoulder line intersect. The mathematical expression is shown in formula 8.

(Y_{A21} > Y_{A4} ∩ X_{A21} < X_{A4}) ∪ (Y_{A22} > Y_{A8} ∩ X_{A22} < X_{A8})    (8)

The physical definition of the upper limb action "lower left": the hand moves in the lower-left sector where the left shoulder-hip line and the umbilical line intersect. The mathematical expression is shown in formula 9.

(Y_{A21} < H ∩ X_{A21} < X_{A4}) ∪ (Y_{A22} < H ∩ X_{A22} < X_{A8})    (9)

The physical definition of the upper limb action "upper right": the hand moves in the upper-right sector where the right shoulder-hip line and the shoulder line intersect. The mathematical expression is shown in formula 10.

(Y_{A21} > Y_{A4} ∩ X_{A21} > X_{A4}) ∪ (Y_{A22} > Y_{A8} ∩ X_{A22} > X_{A8})    (10)

The physical definition of the upper limb action "lower right": the hand moves in the lower-right sector where the right shoulder-hip line and the umbilical line intersect. The mathematical expression is shown in formula 11.

(Y_{A21} < H ∩ X_{A21} > X_{A4}) ∪ (Y_{A22} < H ∩ X_{A22} > X_{A8})    (11)
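The nine-grid classification of formulas 2-11 can be sketched in Python as below. This is a hedged, per-hand simplification: the patent's formulas take the union over both hands (A_21 and A_22) with per-hand shoulder thresholds, whereas this sketch classifies one hand at a time using the left and right shoulder verticals as the column boundaries; all names are illustrative.

```python
# Sketch of the nine-grid classifier built from formulas 2-11.

def umbilical_height(y_spine, y_left_hip):
    """Formula (2): umbilical line height H = (Y_A1 + Y_A16) / 2."""
    return (y_spine + y_left_hip) / 2.0

def classify_nine_grid(hand_x, hand_y, x_left, x_right, y_shoulder, h):
    """Map a hand position to one of the nine cells.

    x_left / x_right: X of the left/right shoulder (X_A8 < X_A4);
    y_shoulder: shoulder line height (Y_A4); h: umbilical line height.
    """
    if hand_y > y_shoulder:                       # above the shoulder line
        if hand_x < x_left:  return "upper left"      # formula (8)
        if hand_x > x_right: return "upper right"     # formula (10)
        return "up"                                   # formula (3)
    if hand_y < h:                                # below the umbilical line
        if hand_x < x_left:  return "lower left"      # formula (9)
        if hand_x > x_right: return "lower right"     # formula (11)
        return "down"                                 # formula (4)
    if hand_x < x_left:  return "left"                # formula (5)
    if hand_x > x_right: return "right"               # formula (6)
    return "middle"                                   # formula (7)
```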
According to actual needs, the embodiment of the present application can also locate the patient's basic upper limb actions in the analogous six-grid, described in detail below.
Region division has been described in detail for the nine-grid; in another specific embodiment of the invention the region is divided into a six-grid, and the upper limb actions of the two patients are described quickly and in real time with the six-grid-based action description method to meet the needs of a two-player game. According to the six-grid positions, the upper limb actions are defined as six categories: up, down, upper left, lower left, upper right and lower right, the boundary between the upper and lower halves being the midline between the shoulder line and the umbilical line; the action positions above the midline are "upper left, up, upper right", and those below the midline are "lower left, down, lower right", as shown in fig. 5.
Physical definition of the six-grid upper limb actions:
the physical definition and mathematical expression of the upper limb movements of patient 1 and patient 2 are similar, differing only in the joint of patient 1, or in the joint of patient 2; patient 2 was used as the subject of the formula below:
(1) Definition of the midline height H and its mathematical expression:

H = ((Y_{A1} + Y_{A16}) / 2 + Y_{A8}) / 2    (12)

(2) Definition of the upper limb actions and their mathematical expressions:

The action positions above the midline are "upper left, up, upper right"; the action positions below the midline are "lower left, down, lower right".

The physical definition of the upper limb action "up": the hand moves between the two shoulder-hip lines and above the midline, i.e. it stays basically vertically upward. The mathematical expression is shown in formula 13.

((Y_{A21} > H) ∩ (X_{A21} > X_{A8} ∩ X_{A21} < X_{A4})) ∪ ((Y_{A22} > H) ∩ (X_{A22} > X_{A8} ∩ X_{A22} < X_{A4}))    (13)

The physical definition of the upper limb action "down": the hand moves between the two shoulder-hip lines and below the midline, i.e. it stays basically vertically downward. The mathematical expression is shown in formula 14.

((Y_{A21} < H) ∩ (X_{A21} > X_{A8} ∩ X_{A21} < X_{A4})) ∪ ((Y_{A22} < H) ∩ (X_{A22} > X_{A8} ∩ X_{A22} < X_{A4}))    (14)

The physical definition of the upper limb action "upper left": the hand moves in the upper-left sector where the left shoulder-hip line and the midline intersect. The mathematical expression is shown in formula 15.

(Y_{A21} > H ∩ X_{A21} < X_{A4}) ∪ (Y_{A22} > H ∩ X_{A22} < X_{A8})    (15)

The physical definition of the upper limb action "lower left": the hand moves in the lower-left sector where the left shoulder-hip line and the midline intersect. The mathematical expression is shown in formula 16.

(Y_{A21} < H ∩ X_{A21} < X_{A4}) ∪ (Y_{A22} < H ∩ X_{A22} < X_{A8})    (16)

The physical definition of the upper limb action "upper right": the hand moves in the upper-right sector where the right shoulder-hip line and the midline intersect. The mathematical expression is shown in formula 17.

(Y_{A21} > H ∩ X_{A21} > X_{A4}) ∪ (Y_{A22} > H ∩ X_{A22} > X_{A8})    (17)

The physical definition of the upper limb action "lower right": the hand moves in the lower-right sector where the right shoulder-hip line and the midline intersect. The mathematical expression is shown in formula 18.

(Y_{A21} < H ∩ X_{A21} > X_{A4}) ∪ (Y_{A22} < H ∩ X_{A22} > X_{A8})    (18)
The six cell positions of the upper limb movement on the human body are matched with the 6 cell positions of the moving part of the virtual object operated by the patient in the virtual scene.
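A corresponding sketch for the six-grid of formulas 12-18, under the same assumptions as the nine-grid sketch above (per-hand simplification, illustrative names): a single midline replaces the shoulder/umbilical pair, and the middle row disappears.

```python
# Six-grid variant of the classifier (formulas 12-18).

def midline_height(y_spine, y_left_hip, y_left_shoulder):
    """Formula (12): H = ((Y_A1 + Y_A16) / 2 + Y_A8) / 2."""
    return ((y_spine + y_left_hip) / 2.0 + y_left_shoulder) / 2.0

def classify_six_grid(hand_x, hand_y, x_left, x_right, h):
    """Map a hand position to one of the six cells."""
    if hand_y > h:                                # above the midline
        if hand_x < x_left:  return "upper left"      # formula (15)
        if hand_x > x_right: return "upper right"     # formula (17)
        return "up"                                   # formula (13)
    if hand_x < x_left:  return "lower left"          # formula (16)
    if hand_x > x_right: return "lower right"         # formula (18)
    return "down"                                     # formula (14)
```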
S3: and defining a virtual target landing point based on region grids for guiding the two patients to complete the matched upper limb actions, wherein each grid in the region is the landing point position of the virtual target.
Step S3 specifically includes:
step S301: positioning the falling point position of the virtual object in advance through the nine-square grids or the six-square grids;
step S302: the position of the falling point is prompted in advance, so that the patient can easily complete corresponding upper limb actions in time;
step S303: the patient can easily and successfully touch the virtual object when arriving at the corresponding position of the nine-palace lattice or the six-palace lattice.
For the nine-grid or six-grid case, the cells in which the upper limb moves on the human body are matched with the 9 or 6 cells of the moving part of the virtual object operated by the patient in the virtual scene. An interactive virtual object is generated randomly and automatically assigned, at any given moment, to one cell of the grid it belongs to. In a cooperative two-player game, the shared virtual object is bound to one cell of a shared nine-grid; in a competitive two-player game, the two players' objects are bound to the corresponding cells of their respective nine-grids. The cell to which a virtual object is bound may change over time. As shown in fig. 4, each cell of the grid is matched to a patient upper limb action.
When a patient performs an upper limb action, if it matches the cell to which the virtual object belongs at the current moment, the patient is judged to have touched the virtual object.
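The cell binding and touch test just described might look as follows in Python; the class and function names are assumptions, and `NINE_CELLS` lists the nine-grid cells (a six-grid game would pass the six cells instead).

```python
# Sketch of cell binding and the touch test. In a cooperative game both
# players share one grid and one object; in a competitive game each
# player has their own grid and bound cell.

import random

NINE_CELLS = ["upper left", "up", "upper right",
              "left", "middle", "right",
              "lower left", "down", "lower right"]

class VirtualObject:
    def __init__(self, cells=NINE_CELLS):
        self.cell = random.choice(cells)    # cell the object is bound to now

    def rebind(self, cells=NINE_CELLS):
        self.cell = random.choice(cells)    # the binding may change over time

def touched(action_cell, obj):
    """A patient touches the object when the cell reached by their upper
    limb action matches the object's bound cell at the current moment."""
    return action_cell == obj.cell
```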
Besides the simple touch mode, the nine-grid defined by the invention supports more complex motion trajectory descriptions to meet the needs of more complex rehabilitation games such as rowing or fruit picking. Suppose the user is required to make a diagonal cut from upper left to lower right, as shown by the thick diagonal trace in fig. 4. The method describes this as: within the valid time interval t, the movement point must pass through upper left → middle → lower right, and apart from these mandatory cells may pass only through the "up" and "right" cells. Whatever diagonal-cut motion the user actually makes, it is finally corrected to the standard upper left → middle → lower right trajectory, as shown by the thin diagonal trace in fig. 4.
The trajectory-type motion interaction is much the same as the touch interaction: a corresponding motion trajectory is planned at game design time, and if the user's motion trajectory matches the corrected trajectory, the user's motion is considered consistent with the plan and the next processing step proceeds on that input.
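A minimal sketch of this trajectory check, assuming the patient's motion is sampled as the ordered list of cells visited within the valid interval t (the function and its arguments are illustrative, not the patent's implementation):

```python
# The cells visited within interval t must contain the mandatory
# sequence in order, and any other cells visited must come from the
# allowed detour set.

def matches_trajectory(visited, mandatory, allowed_detours):
    """visited: cells sampled in time order within the interval t."""
    stream = iter(visited)
    # mandatory cells must appear as an ordered subsequence of `visited`
    in_order = all(any(cell == key for cell in stream) for key in mandatory)
    extras_ok = all(c in set(mandatory) | set(allowed_detours) for c in visited)
    return in_order and extras_ok

# Diagonal cut from the text: upper left -> middle -> lower right, with
# only "up" and "right" allowed as detours; a raw motion that passes is
# then corrected to the standard trajectory.
ok = matches_trajectory(visited=["upper left", "up", "middle", "lower right"],
                        mandatory=["upper left", "middle", "lower right"],
                        allowed_detours=["up", "right"])
```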
For six-grid human-machine interaction, the six cells of the upper limb movement on the human body are matched with the 6 cells of the moving part of the virtual object operated by the patient in the virtual scene.
Under the touch interaction method, an interactive virtual object is generated randomly and automatically assigned, at any given moment, to one of the six cells of the grid it belongs to. In a cooperative two-player game, the shared virtual object is bound to one cell of a shared six-grid; in a competitive two-player game, the two players' objects are bound to the corresponding cells of their respective six-grids. The cell to which a virtual object is bound may change over time. As shown in fig. 6, each cell of the six-grid matches a patient upper limb action defined above.
When a patient performs an upper limb action, if it matches the cell to which the virtual object belongs at the current moment, the patient is judged to have touched the virtual object.
Besides the simple touch mode, the six-grid defined by the invention also supports more complex motion trajectory descriptions for more complex rehabilitation games such as rowing or fruit picking. Suppose the user is required to cut diagonally from upper left to lower right, as shown by the thick trace in fig. 6; the method describes this as: within the valid time interval t, the movement point must pass through upper left → lower right, and apart from these mandatory cells may pass only through the "up" or "down" cell.
The trajectory-type motion interaction is much the same as the touch interaction: a corresponding motion trajectory is planned at game design time, and as long as the user's motion trajectory matches the corrected trajectory, the user's motion is considered consistent with the plan and the next processing step proceeds on that input.
S4: and respectively determining an upper limb movement auxiliary support mode and a body position mode during upper limb training according to the limb function evaluation results of the two patients. The upper limb training posture modes respectively comprise a standing posture mode and a sitting posture mode, wherein the standing posture needs to be supported by a standing frame.
S5: the two patients use respective upper limb movement to respectively control respective virtual dummy in the virtual scene to interact with the virtual object through the somatosensory sensor and the virtual reality interaction system, and the upper limb interactive rehabilitation training scheme with different difficulty degrees is completed.
The difficulty level of the upper limb interactive rehabilitation training scheme is determined by one or more of the virtual object's size, speed, movement distance and movement trajectory.
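As an illustration only, these difficulty factors could be collected into a parameter set like the following; the field names and the EASY/HARD values are assumptions, since the patent does not specify numeric levels.

```python
# Illustrative parameterization of the difficulty factors named above.

from dataclasses import dataclass

@dataclass
class Difficulty:
    object_size: float      # smaller objects are harder to touch
    object_speed: float     # faster objects leave less reaction time
    travel_distance: float  # longer approach path to the drop point
    trajectory_cells: int   # cells in a required movement trajectory

EASY = Difficulty(object_size=1.0, object_speed=0.5,
                  travel_distance=1.0, trajectory_cells=1)
HARD = Difficulty(object_size=0.5, object_speed=1.5,
                  travel_distance=2.0, trajectory_cells=3)
```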
S6: during training, the degree of completion of the two patients' upper limb actions within the prescribed time is judged from the accuracy of the interaction between the virtual avatars operated by the two patients and the virtual objects, and timely feedback is given.
During training, whether the two patients' upper limb actions are completed in time is judged, and the scores are tallied, according to the accuracy of the interaction between the virtual avatars or virtual objects operated by the two patients and the virtual target; at the same time, the upper limb actions not completed in time and their counts are tallied for each patient, where an uncompleted upper limb action refers to a cell of the nine-grid or six-grid that the patient's upper limb movement failed to reach adequately.
Whether the two patients' upper limb actions are completed in time is judged as follows: when a patient performs an upper limb action that brings the virtual avatar to the matched cell of the nine-grid or six-grid, and the current time is the time at which the corresponding virtual object reaches its drop point, the patient's upper limb action is judged to be in place; otherwise it is judged not in place.
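A minimal sketch of this in-time judgment, assuming cell labels as in the sketches above; the `tolerance` window is an assumed parameter, since the patent only states that the current time must be the time the virtual object reaches its drop point.

```python
# The action is "in place" when the avatar is in the matched cell at
# the moment the virtual object reaches its drop point.

def completed_in_time(avatar_cell, drop_cell, t_now, t_drop, tolerance=0.2):
    return avatar_cell == drop_cell and abs(t_now - t_drop) <= tolerance
```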
S7: after training is finished, upper limb training scores are evaluated according to the conditions of finishing the upper limb actions of the two patients, the times of the unfinished upper limb actions and the unfinished upper limb actions are analyzed, and the upper limb rehabilitation training results of the two patients are compared and analyzed.
According to another specific embodiment of the present invention, the present invention further discloses another specific embodiment of the present invention, and the present invention discloses an interactive device for double upper limb non-contact rehabilitation training, comprising:
the captured data reconstruction module is used for capturing and returning the bone data of the two patients through the somatosensory sensor, distinguishing the two patients through the left-right position relation, and reconstructing the basic upper limb actions of the human body according to the bone data of the two patients; the motion sensing sensor adopts Kinect depth image acquisition equipment or a wearable human body position tracker.
And the region division setting module is used for defining the basic upper limb actions of the patient based on the nine-palace division or the six-palace division, each division in the nine-palace division or the six-palace division is matched with a certain basic upper limb action of the patient, and the two patients can position the basic upper limb actions by utilizing the nine-palace division or the six-palace division.
And the virtual object falling point setting module is used for defining virtual object falling points based on the nine-grid or six-grid and guiding the two patients to perform matched basic upper limb actions, wherein each grid in the nine-grid or six-grid is the falling point position of the virtual object.
And the upper limb training auxiliary module is used for respectively determining an upper limb movement auxiliary support mode and a body position mode during upper limb training according to the limb function evaluation results of the two patients.
The upper limb movement training module is used for controlling respective virtual dummy bodies in the virtual scene to interact with virtual objects through the somatosensory sensor and the virtual reality interaction system by utilizing respective upper limb movement of two patients to complete an upper limb interaction rehabilitation training scheme with different difficulty degrees;
the training instant feedback module is used for judging the completion degree of the upper limb actions of the two patients within the specified time according to the interaction accuracy of the virtual dummy operated by the two patients and the virtual object in the training process and giving timely feedback;
and after the training is finished, estimating upper limb training scores according to the upper limb action completion conditions of the two patients, analyzing the number of times of the incomplete upper limb actions and the incomplete upper limb actions, and comparing and analyzing the upper limb rehabilitation training results of the two patients.
In an embodiment of the present invention, the captured-data reconstruction module includes an action reconstruction module, which, from the positional relationship of the wrist, elbow and shoulder joints computed from the two patients' skeleton data, reconstructs the basic upper limb actions at the nine positions up, down, left, right, middle, upper left, lower left, upper right and lower right of the patient's upper limb, or at the six positions up, down, upper left, lower left, upper right and lower right.
In the specific embodiment of the invention, the nine-palace lattice takes nine basic upper limb movement positions of the patient, namely, upper, lower, left, right, middle, upper left, lower left, upper right and lower right, as markers, wherein the left, middle and right refer to strip-shaped positions of the upper limbs of the human moving above an umbilical line and below a shoulder line; "upper left, upper right" refers to the upper extremity action position above the shoulder line; "lower left, lower right" refers to the position of the upper extremity movement below the umbilical line; the nine basic upper limb movements of the patient are matched with the nine positions of the squared figure, and the complex upper limb movements consist of 2 or more than 2 basic upper limb movements, such as the transection movement, which may be performed by left-middle-right-three basic upper limb movements.
Six palace divisions take six basic upper limb movement positions of the upper part, the lower part, the upper left part, the lower left part, the upper right part and the lower right part of a patient as markers, an upper limb movement area is divided into an upper part and a lower part, and a boundary line of the upper limb movement area is a midline between a shoulder line and an umbilical line; "upper left, upper right" refers to the upper extremity action position above the midline; "lower left, lower right" refers to the position of the upper extremity below the midline; six basic upper limb movements of the patient are matched with six positions of the six grids, and the complex upper limb movement is composed of 2 or more than 2 basic upper limb movements.
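As an illustrative sketch only, the following shows one way a tracked wrist position could be classified into a nine-square-grid cell using the shoulder line and umbilical line described above. The coordinate convention, function names and the body_half_width parameter are assumptions of this sketch, not part of the claimed method.

```python
# Hypothetical sketch: classify a wrist position into a nine-square-grid cell.
# Coordinates are assumed to be 2D (x, y) in the sensor's image plane, with
# y increasing upward; the shoulder-center and umbilicus joints supply the
# shoulder line and umbilical line.

def nine_grid_cell(wrist, shoulder_center, umbilicus, body_half_width):
    """Return one of the nine named grid positions for a wrist at (x, y)."""
    x, y = wrist
    shoulder_y, umbilical_y = shoulder_center[1], umbilicus[1]

    # Row: above the shoulder line, between the two lines, or below the
    # umbilical line, as in the nine-square-grid definition above.
    if y > shoulder_y:
        row = "upper"
    elif y < umbilical_y:
        row = "lower"
    else:
        row = "middle"

    # Column relative to the body midline (an assumed threshold).
    mid_x = shoulder_center[0]
    if x < mid_x - body_half_width:
        col = "left"
    elif x > mid_x + body_half_width:
        col = "right"
    else:
        col = "center"

    table = {
        ("upper", "left"): "upper left", ("upper", "center"): "upper",
        ("upper", "right"): "upper right",
        ("middle", "left"): "left", ("middle", "center"): "middle",
        ("middle", "right"): "right",
        ("lower", "left"): "lower left", ("lower", "center"): "lower",
        ("lower", "right"): "lower right",
    }
    return table[(row, col)]
```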
In an embodiment of the present invention, the virtual object drop point setting module includes the following sub-modules (a sketch of the flow is given after the list):
a drop point positioning module, used for positioning the drop point of the virtual object in advance through the nine-square grid or the six-square grid;
a drop point prompting module, used for prompting the drop point position in advance so that the patient can complete the corresponding upper limb action in time;
and a virtual object touch module, used for judging whether the virtual object is successfully touched according to whether the patient reaches the corresponding position of the nine-square grid or the six-square grid in time.
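For illustration only, the pre-positioning and advance prompting of drop points might be organized as a small schedule, sketched below; the lead-time value and all names here are assumptions of this sketch.

```python
# Hypothetical sketch: pre-position drop points on the grid and prompt each
# one ahead of time so the patient can prepare the matching action.
import heapq

PROMPT_LEAD_TIME = 1.5  # seconds of advance warning (illustrative value)

def make_schedule(drop_events):
    """drop_events: iterable of (drop_time_s, cell) pairs, e.g.
    [(3.0, "upper left"), (6.0, "lower right")]. Returns a min-heap of
    (prompt_time, drop_time, cell) entries ordered by prompt time."""
    heap = [(t - PROMPT_LEAD_TIME, t, cell) for t, cell in drop_events]
    heapq.heapify(heap)
    return heap

def poll_prompts(heap, now):
    """Pop and return every drop point whose prompt time has arrived,
    so the UI can highlight the target cell before the object lands."""
    due = []
    while heap and heap[0][0] <= now:
        _, drop_time, cell = heapq.heappop(heap)
        due.append((drop_time, cell))
    return due
```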
In an embodiment of the present invention, the difficulty level of the upper limb interactive rehabilitation training scheme is determined according to one or more of the size, speed, movement distance and movement trajectory of the virtual object.
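As a minimal sketch, the four difficulty factors named above could be gathered into one configuration object; the field names and preset values below are assumptions of this illustration, not prescribed by the embodiment.

```python
# Hypothetical sketch: difficulty as a configuration of the four factors.
from dataclasses import dataclass

@dataclass
class Difficulty:
    object_size: float      # larger virtual objects are easier to touch
    object_speed: float     # grid cells per second toward the drop point
    travel_distance: float  # how far the object travels before landing
    trajectory: str         # e.g. "linear" or "arc" (illustrative choices)

EASY = Difficulty(object_size=1.5, object_speed=0.5,
                  travel_distance=2.0, trajectory="linear")
HARD = Difficulty(object_size=0.7, object_speed=1.5,
                  travel_distance=4.0, trajectory="arc")
```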
In an embodiment of the present invention, the training instant feedback module judges the degree of completion of the two patients' upper limb actions within the specified time as follows: when a patient performs a given upper limb action, the virtual avatar reaches the matched cell of the nine-square grid or six-square grid, and the current moment is the moment at which the corresponding virtual object reaches its drop point, the patient's upper limb action is judged to be in place; otherwise it is judged not to be in place.
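For illustration only, this judgment reduces to a single check, sketched below. The timing tolerance is an assumption of the sketch; the embodiment above states the check at the exact drop moment.

```python
# Hypothetical sketch of the "in place" judgment: the avatar must occupy the
# matched grid cell at (approximately) the moment the object reaches its
# drop point.
TOLERANCE_S = 0.2  # assumed timing window, in seconds

def action_in_place(avatar_cell, target_cell, now, drop_time):
    """True when the avatar is in the target cell within the timing window."""
    return avatar_cell == target_cell and abs(now - drop_time) <= TOLERANCE_S
```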
It should be noted that the interaction device of the present invention is implemented in combination with the interaction method; each module of the interaction device operates according to the interaction method described above, and the details are not repeated here.
According to the invention, the Kinect serves as the motion capture input device for both patients' upper limbs. After the computer receives the two patients' skeleton data captured and returned by the Kinect, it reconstructs the upper limb actions of each patient from that data. Using either two individual nine-square grids or one shared nine-square grid, the grid positions the upper limb movement trajectories of both patients, and the relationship between those trajectories and the virtual target is judged, so that the effectiveness of each patient's upper limb actions can be determined quickly and the two patients' actions compared; the same can be done with two individual six-square grids or one shared six-square grid. The invention not only guides patients through upper limb movements and makes upper limb training more engaging, but also improves the training efficiency of the equipment.
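For illustration only, distinguishing the two patients by their left-right positional relationship might look like the sketch below; the joint name spine_base and the camera-space convention are assumptions of this sketch, not the sensor's actual API.

```python
# Hypothetical sketch: assign the two tracked skeletons to "left patient" and
# "right patient" by comparing a torso joint's horizontal coordinate. Bodies
# are assumed to be dicts mapping joint names to (x, y, z) in camera space,
# with x increasing to the sensor's right.
def assign_patients(bodies):
    """bodies: list of exactly two tracked skeletons."""
    if len(bodies) != 2:
        raise ValueError("two tracked patients are required")
    left, right = sorted(bodies, key=lambda b: b["spine_base"][0])
    return {"left_patient": left, "right_patient": right}
```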
The foregoing description sets out specific embodiments of the present invention. It is to be understood that the invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by those skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments of the present application, and the features of those embodiments, may be combined with one another arbitrarily provided there is no conflict.

Claims (12)

1. An interaction method for double upper limb non-contact rehabilitation training, characterized by comprising the following steps:
S1: capturing and returning skeleton data of two patients through a somatosensory sensor, distinguishing the two patients by their left-right positional relationship, and reconstructing each patient's upper limb actions from the skeleton data;
S2: defining the patients' upper limb actions based on a region grid, and using the region grid to position a plurality of standard upper limb actions and standard upper limb movement trajectories for the two patients, wherein the cells of the region grid correspond to a plurality of basic upper limb actions of the patients;
S3: defining virtual object drop points based on the region grid so as to guide the two patients to complete the matched upper limb actions, wherein each cell of the region grid is a drop point position of a virtual object;
S4: determining, for each patient, an upper limb movement auxiliary support mode and a body position mode during upper limb training according to the limb function evaluation results of the two patients;
S5: the two patients, through the somatosensory sensor and the virtual reality interaction system, control their respective virtual avatars in the virtual scene with their own upper limb movements to interact with the virtual objects, completing upper limb interactive rehabilitation training schemes of different difficulty levels;
S6: during training, judging the degree to which the two patients complete the upper limb actions within the specified time according to the interaction accuracy between the virtual avatars operated by the two patients and the virtual objects, and giving immediate feedback;
S7: after training ends, evaluating upper limb training scores according to how each patient completed the upper limb actions, analyzing the uncompleted upper limb actions and the number of times they were left uncompleted, and comparing the upper limb rehabilitation training results of the two patients.
2. The interaction method for double upper limb non-contact rehabilitation training according to claim 1, wherein in step S1, reconstructing the upper limb actions from the skeleton data of the two patients means reconstructing, from the positional relationship of the wrist, elbow and shoulder joints calculated from that data, either nine basic upper limb actions at nine positions of the patient's upper limb (upper, lower, left, right, middle, upper left, lower left, upper right and lower right) or six basic upper limb actions at six positions (upper, lower, upper left, lower left, upper right and lower right).
3. The interaction method for double upper limb non-contact rehabilitation training according to claim 2, wherein in step S2, the region grid means that the region of the patient's upper limb movement is divided into a nine-square grid or a six-square grid;
the nine-square grid uses nine basic upper limb action positions of the patient as markers: upper, lower, left, right, middle, upper left, lower left, upper right and lower right, wherein "left, middle, right" refer to the band-shaped region in which the upper limb moves above the umbilical line and below the shoulder line, "upper left, upper right" refer to upper limb action positions above the shoulder line, and "lower left, lower right" refer to upper limb action positions below the umbilical line; the nine basic upper limb actions of the patient are matched with the nine positions of the nine-square grid, and a complex upper limb action consists of 2 or more basic upper limb actions;
the six-square grid uses six basic upper limb action positions of the patient as markers: upper, lower, upper left, lower left, upper right and lower right, with the upper limb movement region divided into an upper part and a lower part whose boundary is the midline between the shoulder line and the umbilical line, wherein "upper left, upper right" refer to upper limb action positions above the midline and "lower left, lower right" refer to upper limb action positions below the midline; the six basic upper limb actions of the patient are matched with the six positions of the six-square grid, and a complex upper limb action consists of 2 or more basic upper limb actions.
4. The interaction method for double upper limb non-contact rehabilitation training according to claim 3, wherein step S3 specifically comprises:
step S301: positioning the drop point of the virtual object in advance through the nine-square grid or the six-square grid;
step S302: prompting the drop point position in advance so that the patient can complete the corresponding upper limb action in time;
step S303: judging that the virtual object is successfully touched when the patient reaches the corresponding position of the nine-square grid or the six-square grid in time.
5. The interaction method for double upper limb non-contact rehabilitation training according to claim 1, wherein the difficulty level of the upper limb interactive rehabilitation training scheme in step S5 is determined according to one or more of the size, speed, movement distance and movement trajectory of the virtual object.
6. The interaction method for double upper limb non-contact rehabilitation training according to claim 3, wherein in step S6, whether the two patients complete the upper limb actions in time is judged as follows: when a patient performs a given upper limb action, the virtual avatar reaches the matched cell of the nine-square grid or six-square grid, and the current moment is the moment at which the corresponding virtual object reaches its drop point, the patient's upper limb action is judged to be in place; otherwise it is judged not to be in place.
7. An interaction device for double upper limb non-contact rehabilitation training, characterized by comprising:
a captured data reconstruction module, used for capturing and returning skeleton data of two patients through a somatosensory sensor, distinguishing the two patients by their left-right positional relationship, and reconstructing the basic upper limb actions of each patient from the skeleton data, wherein the somatosensory sensor adopts a Kinect depth image acquisition device or a wearable human body position tracker;
a region division setting module, used for defining the basic upper limb actions of the patient based on the nine-square grid or the six-square grid, wherein the cells of the nine-square grid or the six-square grid are matched with a plurality of basic upper limb actions of the patient, and the two patients can position their basic upper limb actions by means of the nine-square grid or the six-square grid;
a virtual object drop point setting module, used for defining virtual object drop points based on the nine-square grid or the six-square grid and for guiding the two patients to perform the matched basic upper limb actions, wherein each cell of the nine-square grid or the six-square grid is a drop point position of a virtual object;
an upper limb training auxiliary module, used for determining, for each patient, an upper limb movement auxiliary support mode and a body position mode during upper limb training according to the limb function evaluation results of the two patients;
an upper limb movement training module, used for enabling the two patients, through the somatosensory sensor and the virtual reality interaction system, to control their respective virtual avatars in the virtual scene with their own upper limb movements so as to interact with the virtual objects and complete upper limb interactive rehabilitation training schemes of different difficulty levels;
and a training instant feedback module, used for judging, during training, the degree to which the two patients complete the upper limb actions within the specified time, according to the interaction accuracy between the virtual avatars operated by the two patients and the virtual objects, and for giving immediate feedback; after training ends, upper limb training scores are evaluated according to how each patient completed the upper limb actions, the uncompleted upper limb actions and the number of times they were left uncompleted are analyzed, and the upper limb rehabilitation training results of the two patients are compared and analyzed.
8. The interaction device for double upper limb non-contact rehabilitation training according to claim 7, wherein the captured data reconstruction module comprises a motion reconstruction module, which reconstructs, from the positional relationship of the wrist, elbow and shoulder joints calculated from the skeleton data of the two patients, either the basic upper limb actions at nine positions of the patient's upper limb (upper, lower, left, right, middle, upper left, lower left, upper right and lower right) or the basic upper limb actions at six positions (upper, lower, upper left, lower left, upper right and lower right).
9. The interaction device for double upper limb non-contact rehabilitation training according to claim 8, wherein the nine-square grid uses nine basic upper limb action positions of the patient as markers: upper, lower, left, right, middle, upper left, lower left, upper right and lower right, wherein "left, middle, right" refer to the band-shaped region in which the upper limb moves above the umbilical line and below the shoulder line, "upper left, upper right" refer to upper limb action positions above the shoulder line, and "lower left, lower right" refer to upper limb action positions below the umbilical line; the nine basic upper limb actions of the patient are matched with the nine positions of the nine-square grid, and a complex upper limb action consists of 2 or more basic upper limb actions;
the six-square grid uses six basic upper limb action positions of the patient as markers: upper, lower, upper left, lower left, upper right and lower right, with the upper limb movement region divided into an upper part and a lower part whose boundary is the midline between the shoulder line and the umbilical line, wherein "upper left, upper right" refer to upper limb action positions above the midline and "lower left, lower right" refer to upper limb action positions below the midline; the six basic upper limb actions of the patient are matched with the six positions of the six-square grid, and a complex upper limb action consists of 2 or more basic upper limb actions.
10. The interaction device for double upper limb non-contact rehabilitation training according to claim 9, wherein the virtual object drop point setting module comprises:
a drop point positioning module, used for positioning the drop point of the virtual object in advance through the nine-square grid or the six-square grid;
a drop point prompting module, used for prompting the drop point position in advance so that the patient can complete the corresponding upper limb action in time;
and a virtual object touch module, used for judging whether the virtual object is successfully touched according to whether the patient reaches the corresponding position of the nine-square grid or the six-square grid in time.
11. The interaction device for double upper limb non-contact rehabilitation training according to claim 7, wherein the difficulty level of the upper limb interactive rehabilitation training scheme is determined according to one or more of the size, speed, movement distance and movement trajectory of the virtual object.
12. The interaction device for double upper limb non-contact rehabilitation training according to claim 9, wherein the training instant feedback module judges the degree of completion of the two patients' upper limb actions within the specified time as follows: when a patient performs a given upper limb action, the virtual avatar reaches the matched cell of the nine-square grid or six-square grid, and the current moment is the moment at which the corresponding virtual object reaches its drop point, the patient's upper limb action is judged to be in place; otherwise it is judged not to be in place.
CN202210034551.4A 2022-01-13 2022-01-13 Interaction method and device for double upper limb non-contact rehabilitation training Active CN114367091B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210034551.4A CN114367091B (en) 2022-01-13 2022-01-13 Interaction method and device for double upper limb non-contact rehabilitation training

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210034551.4A CN114367091B (en) 2022-01-13 2022-01-13 Interaction method and device for double upper limb non-contact rehabilitation training

Publications (2)

Publication Number Publication Date
CN114367091A CN114367091A (en) 2022-04-19
CN114367091B true CN114367091B (en) 2022-12-06

Family

ID=81143090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210034551.4A Active CN114367091B (en) 2022-01-13 2022-01-13 Interaction method and device for double upper limb non-contact rehabilitation training

Country Status (1)

Country Link
CN (1) CN114367091B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109589557A (en) * 2018-11-29 2019-04-09 广州晓康医疗科技有限公司 Based on reality environment tandem race rehabilitation training of upper limbs system and appraisal procedure
CN112617810A (en) * 2021-01-04 2021-04-09 重庆大学 Virtual scene parameter self-adaption method for restraining upper limb shoulder elbow rehabilitation compensation
WO2021068542A1 (en) * 2019-10-12 2021-04-15 东南大学 Force feedback technology-based robot system for active and passive rehabilitation training of upper limbs
CN113332668A (en) * 2021-06-02 2021-09-03 上海市徐汇区中心医院 Trunk control training device and training method thereof

Also Published As

Publication number Publication date
CN114367091A (en) 2022-04-19

Similar Documents

Publication Publication Date Title
JP7061694B2 (en) Image processing methods and equipment, imaging equipment, and storage media
US10058773B2 (en) Man-machine interaction controlling method and applications thereof
CN108721870B (en) Exercise training evaluation method based on virtual environment
CN106485055B (en) A kind of old type 2 diabetes patient's athletic training system based on Kinect sensor
CN110739040A (en) rehabilitation evaluation and training system for upper and lower limbs
CN106420254A (en) Multi-person interactive virtual reality rehabilitation training and evaluation system
US7404774B1 (en) Rule based body mechanics calculation
CN109453509A (en) It is a kind of based on myoelectricity and motion-captured virtual upper limb control system and its method
CN108537284A (en) Posture assessment scoring method based on computer vision deep learning algorithm and system
WO2011009302A1 (en) Method for identifying actions of human body based on multiple trace points
CN109589556B (en) Double-person cooperative upper limb rehabilitation training system based on virtual reality environment and evaluation method
CN103019386B (en) A kind of control method of human-computer interaction and its utilization
CN104353240A (en) Running machine system based on Kinect
CN107293175A (en) A kind of locomotive hand signal operation training method based on body-sensing technology
KR101317383B1 (en) Cognitive ability training apparatus using robots and method thereof
CN110232963A (en) A kind of upper extremity exercise functional assessment system and method based on stereo display technique
CN109421052A (en) A kind of quintet game Chinese-chess robot based on artificial intelligence
CN105107200A (en) Face change system and method based on real-time deep somatosensory interaction and augmented reality technology
CN109589557B (en) Upper limb rehabilitation training system and evaluation method based on virtual reality environment double competition
CN112076440A (en) Double-person interactive upper limb rehabilitation system based on recognition screen and training method thereof
CN102446359B (en) Small ball sport processing method based on computer and system thereof
CN107272884A (en) A kind of control method and its control system based on virtual reality technology
CN112007343A (en) Double-arm boxing training robot
CN114367091B (en) Interaction method and device for double upper limb non-contact rehabilitation training
CN113975775A (en) Wearable inertial body feeling ping-pong exercise training system and working method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 510000 units 403 and 404, plant C, No. 3, Huachuang Animation Industrial Park, No. 22 Huateng Road, Jinshan Village, Shiqi Town, Panyu District, Guangzhou City, Guangdong Province
Applicant after: GUANGZHOU XIAOKANG MEDICAL TECHNOLOGY Co.,Ltd.
Address before: 510000 unit 502, building 6, phase II, Huachuang Animation Industrial Park, Shiqi Town, Panyu District, Guangzhou City, Guangdong Province
Applicant before: GUANGZHOU XIAOKANG MEDICAL TECHNOLOGY Co.,Ltd.

GR01 Patent grant