WO2014160172A1 - Method and system for analysing a virtual rehabilitation activity/exercise - Google Patents


Info

Publication number
WO2014160172A1
Authority
WO
WIPO (PCT)
Prior art keywords
movement
rehabilitation
events
determining
rules
Prior art date
Application number
PCT/US2014/025970
Other languages
English (en)
Inventor
Mark EVIN
Julie GUEHO
David SCHACTER
Alexis YOUSSEF
Sung Jun BAE
Original Assignee
Jintronix, Inc.
Priority date
Filing date
Publication date
Application filed by Jintronix, Inc. filed Critical Jintronix, Inc.
Priority to US14/774,960 (US20160023046A1)
Publication of WO2014160172A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0075 Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062 Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00 Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09 Rehabilitation or training
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1124 Determining motor skills
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012 Comparing movements or motion sequences with a registered reference
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012 Comparing movements or motion sequences with a registered reference
    • A63B2024/0015 Comparing movements or motion sequences with computerised simulations of movements or motion sequences, e.g. for generating an ideal template as reference to be achieved by the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062 Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B2024/0065 Evaluating the fitness, e.g. fitness level or fitness index
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062 Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B2024/0068 Comparison to target or threshold, previous performance or not real time comparison to other individuals

Definitions

  • the present invention relates to the field of physical rehabilitation for patients, and more particularly to virtual rehabilitation systems and methods.
  • a computer-implemented method for analysing one of a rehabilitation activity and a performance of a user during a virtual rehabilitation exercise comprising: receiving one of a rehabilitation activity and executed movements performed by the user during the virtual rehabilitation exercise, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the virtual rehabilitation exercise, the rehabilitation activity comprising at least one virtual user-controlled element and input parameters; determining movement rules corresponding to the one of the rehabilitation activity and the rehabilitation exercise; each one of the movement rules comprising a correlation between a given group consisting of at least a property of the virtual user-controlled element and a body part, and at least one of a respective elementary movement and a respective task-oriented movement; determining a sequence of movement events corresponding to the one of the rehabilitation activity and the executed movements, each one of the movement events corresponding to a given state of the property of the virtual user-controlled object in the interactive environment, the given state corresponding to one of
  • the step of receiving comprises receiving the rehabilitation activity
  • the step of determining movement rules comprises: determining a rehabilitation scenario that corresponds to the received rehabilitation activity; and retrieving from a database the movement rules that correspond to the determined rehabilitation scenario.
  • the step of determining a sequence of movement events comprises determining the movement events from at least one of the input parameters.
  • the step of determining a sequence of movement events comprises retrieving predefined movement events from a storing unit.
  • the step of determining a movement sequence comprising at least one of elementary movements and a task-oriented movement comprises: determining movement segments from the movement events; assigning a respective one of the movement rules to each one of the movement segments; and assigning at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.
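The three steps just described (derive movement segments from the events, match a movement rule to each segment, and label the segment with the movement from that rule) can be sketched as follows. This is a minimal illustrative reading of the claim; the event fields, rule table, and movement names are assumptions, not taken from the patent.

```python
# Hedged sketch: segment movement events, then assign each segment the
# elementary/task-oriented movement from its matching movement rule.
def analyse_events(events, rules):
    """events: list of dicts with 'property' and 'state' ('begin'/'end').
    rules: hypothetical mapping from an object property to a movement."""
    segments = []
    open_events = {}
    for event in events:
        if event["state"] == "begin":
            # A segment opens when a property starts changing...
            open_events[event["property"]] = event
        elif event["property"] in open_events:
            # ...and closes at the matching 'end' event.
            segments.append({"property": event["property"]})
            del open_events[event["property"]]
    # Assign the movement contained in the matching rule to each segment.
    for segment in segments:
        segment["movement"] = rules.get(segment["property"], "unknown")
    return segments

rules = {"fish_x": "right shoulder flexion"}   # hypothetical rule
events = [{"property": "fish_x", "state": "begin"},
          {"property": "fish_x", "state": "end"}]
print(analyse_events(events, rules))
```

A real implementation would also carry timestamps and body-part positions on each event so that ranges of movement can be computed per segment.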
  • the method further comprises a step of determining and outputting at least one clinical objective corresponding to the received rehabilitation activity.
  • the step of determining the at least one clinical objective comprises: comparing a given one of the input parameters to a challenge threshold; and when the given one of the input parameters is greater than the challenge threshold, identifying a movement characteristic related to the given one of the input parameters as being the at least one clinical objective.
  • the method further comprises a step of generating and outputting an alert.
  • the step of generating the alert comprises: comparing a given one of the input parameters to a danger threshold; and when the given one of the input parameters is greater than the danger threshold, identifying a potential danger for the patient.
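The two threshold comparisons above (challenge threshold for clinical objectives, danger threshold for alerts) can be sketched as a single check; the parameter names and threshold values are illustrative assumptions.

```python
# Hedged sketch of the described threshold checks: a parameter above the
# challenge threshold marks a clinical objective; above the danger
# threshold it flags a potential danger for the patient.
def evaluate_parameter(name, value, challenge_threshold, danger_threshold):
    result = {"clinical_objective": None, "alert": None}
    if value > challenge_threshold:
        # The movement characteristic tied to this input parameter
        # is identified as a clinical objective.
        result["clinical_objective"] = name
    if value > danger_threshold:
        result["alert"] = f"potential danger: {name} exceeds safe range"
    return result

print(evaluate_parameter("piranha_speed", 7,
                         challenge_threshold=5, danger_threshold=10))
```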
  • the step of receiving comprises receiving the executed movements performed by a user during a rehabilitation exercise.
  • the step of determining movement rules comprises retrieving movement rules corresponding to the rehabilitation exercise.
  • the step of determining a sequence of movement events comprises retrieving a sequence of ordered movement events corresponding to the rehabilitation exercise.
  • the step of determining a sequence of movement events comprises receiving unordered movement event triggers corresponding to the rehabilitation exercise, and ordering the unordered movement event triggers, thereby obtaining ordered movement events.
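Ordering the unordered movement event triggers can be as simple as a sort over an ordering key; the patent does not specify the key, so a timestamp field is assumed here for illustration.

```python
# Hedged sketch: order unordered movement event triggers into a
# sequence of ordered movement events. The 'timestamp' field is an
# assumed ordering key.
def order_triggers(triggers):
    return sorted(triggers, key=lambda t: t["timestamp"])

triggers = [{"timestamp": 2.5, "event": "end"},
            {"timestamp": 0.1, "event": "begin"}]
ordered = order_triggers(triggers)
print([t["event"] for t in ordered])
```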
  • the step of determining a movement sequence comprising at least one of elementary movements and a task-oriented movement comprises: determining movement segments from the movement events; assigning a respective one of the movement rules to each one of the movement segments; and assigning at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.
  • a system for analysing one of a rehabilitation activity and a performance of a user during a virtual rehabilitation exercise during which the user performs executed movements, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the virtual rehabilitation exercise, the rehabilitation activity comprising at least one virtual user-controlled element and input parameters
  • the system comprising: a movement rules determining module for determining movement rules corresponding to one of the rehabilitation activity and the rehabilitation exercise; each one of the movement rules comprising a correlation between a given group consisting of a property of the virtual user-controlled element and a body part, and at least one of a respective elementary movement and a respective task-oriented movement; a movement events determining module for determining a sequence of movement events corresponding to one of the rehabilitation activity and the executed movements, each one of the movement events corresponding to a given state of the property of the virtual user-controlled object in the interactive environment, the given state corresponding to one of a beginning and
  • the movement rules determining module is adapted to receive the rehabilitation activity.
  • the movement rules determining module is adapted to determine a rehabilitation scenario that corresponds to the received rehabilitation activity, and retrieve from a database the movement rules that correspond to the determined rehabilitation scenario.
  • the movement rules determining module is adapted to determine the movement events from at least one of the input parameters.
  • the movement events determining module is adapted to retrieve predefined movement events from a storing unit.
  • the elementary movement determining module is adapted to: determine movement segments from the movement events; assign a respective one of the movement rules to each one of the movement segments; and assign at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.
  • the system further comprises a clinical objective module for determining and outputting at least one clinical objective corresponding to the received rehabilitation activity.
  • the clinical objective module is adapted to: compare a given one of the input parameters to a challenge threshold; and when the given one of the input parameters is greater than the challenge threshold, identify a movement characteristic related to the given one of the input parameters as being the at least one clinical objective.
  • the system further comprises an alert module for generating and outputting an alert.
  • the alert module is adapted to: compare a given one of the input parameters to a danger threshold; and when the given one of the input parameters is greater than the danger threshold, identify a potential danger for the patient.
  • the movement rules determining module is adapted to receive the executed movements performed by a user during a rehabilitation exercise.
  • the movement rules determining module is adapted to retrieve movement rules corresponding to the rehabilitation exercise.
  • the movement events determining module is adapted to retrieve a sequence of ordered movement events corresponding to the rehabilitation exercise.
  • the movement events determining module is adapted to receive unordered movement event triggers corresponding to the rehabilitation exercise, and order the unordered movement event triggers in order to obtain ordered movement events.
  • the elementary movement determining module is adapted to: determine movement segments from the movement events; assign a respective one of the movement rules to each one of the movement segments; and assign at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.
  • the given group further comprises a direction of change for the property.
  • a computer-implemented method for creating a rehabilitation activity comprising: receiving clinical objectives; determining a rehabilitation scenario and corresponding movement rules adapted to the clinical objectives; determining movement events using the movement rules; determining customizable parameters for the rehabilitation scenario, thereby obtaining the rehabilitation activity; and outputting the rehabilitation activity.
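The creation method just claimed can be sketched end to end: look up a scenario and its movement rules from the clinical objectives, derive movement events from the rules, then attach the customizable parameters. The scenario table, event derivation, and field names are invented placeholders, not the patent's own data model.

```python
# Hedged sketch of the activity-creation method. SCENARIOS is a
# hypothetical lookup from a clinical objective to a scenario and
# its movement rules.
SCENARIOS = {
    "continuous movement precision": {
        "scenario": "underwater fish",
        "rules": {"fish_position": "shoulder/elbow movement"},
    },
}

def create_activity(objectives, parameters):
    for objective in objectives:
        if objective in SCENARIOS:
            entry = SCENARIOS[objective]
            # Derive begin/end movement events from the movement rules.
            events = [{"property": prop, "state": state}
                      for prop in entry["rules"]
                      for state in ("begin", "end")]
            return {"scenario": entry["scenario"],
                    "movement_events": events,
                    "parameters": parameters}
    raise ValueError("no scenario matches the clinical objectives")

activity = create_activity(["continuous movement precision"],
                           {"food_size": 2, "repetitions": 3})
print(activity["scenario"])
```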
  • a system for creating a rehabilitation activity comprising: a scenario determining module for receiving clinical objectives and determining a rehabilitation scenario and corresponding movement rules adapted to the clinical objectives; a movement events determining module for determining movement events using the movement rules; and a scenario parameter determining module for determining customizable parameters for the rehabilitation scenario in order to obtain the rehabilitation activity, and output the rehabilitation activity.
  • Figure 1 is a flow chart of a method for analysing a rehabilitation activity, in accordance with an embodiment.
  • Figure 2 illustrates a right-hand Cartesian coordinate system, in accordance with the prior art.
  • Figure 3 is a block diagram of a system for analysing a rehabilitation activity, in accordance with an embodiment.
  • Figure 4 is a flow chart of a method for determining the performance of a patient during a virtual rehabilitation exercise, in accordance with an embodiment.
  • Figure 5 is a flow chart of a method for generating a rehabilitation activity, in accordance with an embodiment.
  • a virtual rehabilitation system is adapted to display an interactive simulation with which a patient may interact to execute a rehabilitation exercise.
  • the virtual rehabilitation system comprises at least a simulation generator for generating the interactive simulation, a display unit for displaying the simulation to the patient, and a movement tracking unit for tracking a movement of at least a body part of the patient.
  • a given element of the rehabilitation simulation may be controlled by a given body part of the patient.
  • the movement tracking unit tracks the movement of the given body part and transmits the position of the given body part to the simulation generator.
  • the simulation generator modifies a characteristic/property of the given element of the simulation in substantially real time as a function of the new position of the given body part received from the movement tracking unit, thereby rendering the simulation interactive.
  • the position of the given element may be changed in the simulation according to the change of position of the tracked body part.
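The per-frame update described above (tracked body-part position drives a property of the user-controlled element) can be sketched as a simple mapping function; the scale factor and field names are illustrative assumptions.

```python
# Hedged sketch: map the tracked body-part position onto the position
# property of the user-controlled element, as the simulation generator
# would do each frame. 'scale' is an assumed calibration factor.
def update_element(element, tracked_position, scale=1.0):
    element["position"] = tuple(scale * coord for coord in tracked_position)
    return element

fish = {"position": (0.0, 0.0, 0.0)}
update_element(fish, (0.2, 0.5, 1.0))  # e.g. new wrist position
print(fish["position"])
```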
  • a medical professional such as a therapist or a clinician creates the virtual rehabilitation exercise that is adapted to the needs and/or condition of a patient.
  • the medical professional is presented with a user interface for creating the virtual rehabilitation exercise.
  • the medical professional may access a database of rehabilitation exercises through the interface, and select a given rehabilitation exercise that is adequate for the patient.
  • the medical professional may further specify some parameters for the rehabilitation exercise.
  • the present description presents a method and system for analysing a rehabilitation exercise created by a medical professional in order to provide the medical professional with a list of movements to be executed by the patient according to the created rehabilitation exercise. The medical professional may then verify whether the rehabilitation exercise that he has created/selected is adequate for the patient. The method may also be used to analyse the movements performed by the patient during the rehabilitation exercise.
  • the present description further presents a method and system for creating a virtual rehabilitation exercise.
  • a rehabilitation activity refers to an interactive environment to be used for generating a simulation that corresponds to a virtual rehabilitation exercise.
  • a rehabilitation activity comprises all required information to create a virtual rehabilitation exercise
  • a rehabilitation scenario refers to an incomplete interactive environment, and comprises at least one virtual user-controlled element which is mapped to at least one body part to be exercised, and at least one virtual reference element of which a characteristic is not specified.
  • the rehabilitation scenario may further comprise a background scene.
  • a rehabilitation scenario comprises a background scene and a set of one or more virtual elements with properties/parameters that may be configured by the medical professional.
  • a property or parameter refers to any property of the background scene and the virtual elements, such as the color, size or position of an object, and/or any state or change in the interactive environment.
  • a rehabilitation scenario may be seen as an incomplete rehabilitation activity of which some properties/parameters such as a property or characteristic of the virtual user-controlled element have not been specified. The unspecified properties/parameters may help to define the type of involved movement required, the movement characteristic (quality of movement) required, and/or the like.
  • Each rehabilitation scenario covers at least one corresponding general rehabilitation focus. Examples of general focuses comprise executive functioning, bilateral coordination, posture, balance, and the like.
  • a rehabilitation scenario further comprises its corresponding general focus(es).
  • the general focus may be included in the metadata of the rehabilitation scenario.
  • the first scenario is directed to unilateral shoulder and elbow movement with a focus on continuous movement precision which may be expressed as an angle, a unit of distance, a percentage, or the like.
  • the continuous movement precision represents the patient's deviation from a target movement axis or path.
  • the present scenario mandates the performance of elementary movements with a certain required degree of continuous movement precision.
  • the first scenario comprises a 3D underwater background scene, a fish whose 3D position within the background scene is to be controlled by the shoulder and elbow movements of the patient, a sequence of food objects of which the position within the background scene determines a path to be followed by the fish to guide the movements of the patient, an optional piranha that may chase the fish, and optional obstacles for the fish.
  • an interactive simulation is displayed to the patient.
  • the interactive simulation comprises the fish, food objects, the piranha, and the obstacles inserted into the 3D underwater background scene.
  • the patient moves his shoulder and elbow to move the fish so that the fish eats the food elements and avoids the obstacles.
  • the patient should also control the speed of the fish so that it may not be caught by the piranha.
  • the customizable properties/parameters for the first scenario may include: the size of the food objects that controls the required precision of the patient's movement path; the shape of the sequence of food objects or the location of the food objects within the 3D underwater background scene in order to control the shape of the user's required movement path.
  • the food objects may be arranged in the shape of a square, a triangle, a figure eight, etc.; the speed of the piranha to control the minimum speed required of the patient's movements; and/or the number of movement repetitions required.
  • the customizable properties/parameters for the first scenario may comprise additional properties/parameters such as the location and size of the obstacles.
  • the rehabilitation scenario is then referred to as a rehabilitation activity.
  • size of food objects: 2 units diameter
  • shape of the sequence of food objects: square
  • speed of the piranha: 5 cm per second
  • number of movement repetitions required: 3
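The customized first scenario with the parameter values listed above can be represented as a plain configuration mapping; the field names are illustrative, not the patent's schema.

```python
# Hedged sketch: a rehabilitation activity as a scenario identifier
# plus the values assigned to its customizable parameters.
first_activity = {
    "scenario": "underwater fish",
    "parameters": {
        "food_object_diameter": 2,        # units
        "food_sequence_shape": "square",
        "piranha_speed_cm_s": 5,
        "repetitions": 3,
    },
}
print(first_activity["parameters"]["repetitions"])
```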
  • the rehabilitation scenario further comprises the following general rehabilitation focuses: continuous movement precision and unilateral shoulder and elbow movement.
  • scenarios may also comprise a discrete movement precision that may be expressed as an angle, a unit of distance, a percentage, or the like.
  • a discrete movement precision represents the patient's ability to accurately reach an end-point of a movement within a desired area-threshold. For example, an activity may require a patient to hit a discrete target with their wrist with a certain degree of precision.
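A discrete-precision check of this kind reduces to asking whether the movement end-point lies within the area threshold around the target. The patent leaves the metric unspecified; plain Euclidean distance is assumed here for illustration.

```python
# Hedged sketch: did the movement end-point land within the desired
# area threshold of the target? Euclidean distance is an assumption.
import math

def within_threshold(end_point, target, threshold):
    return math.dist(end_point, target) <= threshold

# e.g. wrist end-point 0.2 units from a discrete target, threshold 0.5
print(within_threshold((1.0, 1.0, 0.0), (1.2, 1.0, 0.0), threshold=0.5))
```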
  • the second scenario is directed to bilateral shoulder and elbow movement with a focus on executive functioning and bilateral coordination.
  • the second scenario features a 3D outdoor garden scene, and comprises a tree, fruits that fall from the tree, two basket elements or halves of a basket that form a basket when brought together, and a pail.
  • the depth and horizontal coordinates of each basket element are controlled by the 3D position of a respective elbow, shoulder, and hand of the patient, and the patient forms the complete basket by bringing his hands together.
  • the patient has to bring his hands together to catch the falling fruits in the basket, move his hands while keeping them in contact to bring the basket containing the fruit on top of the pail, and then move his hands apart so that the fruit falls from the basket into the pail.
  • the customizable properties/parameters for the second scenario may comprise: the number of falling fruit; the size of the basket and pail to set the required precision of patient's movements; the horizontal and depth positions of falling fruit; the horizontal and depth position of the pail; and/or the speed and frequency for the falling fruits.
  • the second rehabilitation scenario further comprises the following general rehabilitation focuses: executive functioning and bilateral coordination.
  • Table 1 presents exemplary values for the customizable properties/parameters for the second exemplary scenario. Once the values are assigned to the customizable properties/parameters, the scenario is referred to as an activity.
  • Table 1 Exemplary values for the customizable properties/parameters for the second exemplary scenario.
  • This scenario is directed to trunk movement while the patient is in a sitting position with an emphasis on posture and balance.
  • the patient controls the position of a ball using the angle of his trunk in order to hit targets at various horizontal and depth positions with the ball.
  • the third scenario comprises the following virtual elements: a background scene; a ball of which the position is controlled by the angle of tilt of the patient's trunk; and target objects.
  • the customizable properties/parameters for the third scenario may comprise: the size of the target objects which defines the precision requirement for the trunk movements; the horizontal position of target objects which defines the lateral trunk tilt requirement; the depth position of target objects which defines the forward trunk tilt requirement; and the number of target objects.
  • the third rehabilitation scenario further comprises the following general rehabilitation focuses: posture, trunk movement, and balance.
  • the fourth scenario is directed to discrete precision of upper-extremity (i.e. arms) to perform task-oriented movements.
  • a task-oriented movement is defined as a movement that may require one or more elementary movements in order to achieve a basic functional movement purpose. Examples of task-oriented movements are reaching, crossing midline (moving from right side to left side or vice versa), raising, and diagonal movement. Some task-oriented movements may not have a descriptive label, and may be defined by the beginning position and end position of a certain body part, for example, the beginning and end position of the wrist.
  • a task-oriented movement may also be defined in terms of the position of the user's hand relative to the user's body, for example "ipsilateral to contraproximal" (same side to opposite side and near).
  • the general interactive mechanic corresponding to the fourth scenario is the following: the user must control an object in the interactive environment by manipulating the position of his wrist in order to hit target objects.
  • the position of the user-controlled object within the interactive environment is determined as a function of the 3D position of the patient's wrist relative to his trunk.
  • the patient manipulates the position of his wrist relative to his trunk in order to collide the user- controlled object with at least one target object within the interactive environment.
  • the fourth scenario comprises the following virtual elements: a background scene, a user- controlled object and at least one target object.
  • the virtual elements may optionally comprise obstacles that move according to a given speed, and should be avoided by the user-controlled object.
  • the customizable properties/parameters for the fourth scenario may comprise: the vertical and horizontal position of target objects; the depth position of target objects; the number of target objects; and the size, frequency, and speed of the target objects.
  • the fourth rehabilitation scenario further comprises the following general rehabilitation focus: sequential movement.
  • Table 2 presents exemplary values for the customizable properties/parameters of the fourth scenario.
  • Table 2 Exemplary values for the parameters of the fourth exemplary scenario.
  • once their customizable parameters have been assigned values, rehabilitation scenarios are referred to as rehabilitation activities.
  • Figure 1 illustrates one embodiment of a computer-implemented method 10 for analysing a given rehabilitation activity by determining the movements to be executed by a patient that correspond to the given rehabilitation activity.
  • a medical professional may create a rehabilitation activity for a patient who is unable, or is not advised, to perform an elementary movement such as a right shoulder flexion, or a task-oriented movement such as "right arm arm-raising".
  • the method 10 analyses the rehabilitation activity. In this case, the method 10 is used by the medical professional to determine the elementary movements and/or the task-oriented movements required by the rehabilitation activity to be performed by the patient. If the results of the method 10 show that the rehabilitation activity created by the medical professional requires a right shoulder flexion to be performed by the patient, then the medical professional realizes that the rehabilitation exercise he created is not adapted for the patient. The use of the method 10 thereby helps avoid injury or aggravation of the patient's condition.
  • a rehabilitation activity that has been created or selected by a medical professional is received at step 12.
  • the rehabilitation activity may have been created as described above.
  • an identification of the rehabilitation scenario corresponding to the rehabilitation activity is received along with the customized properties/parameters set by the medical professional.
  • the rehabilitation scenario and the customized properties/parameters then form a rehabilitation activity.
  • movement rules that correspond to the received rehabilitation activity are retrieved.
  • corresponding movement rules are stored in a database for each rehabilitation scenario.
  • the database is stored on a local or external storing unit.
  • the rehabilitation scenario corresponding to the received rehabilitation activity is first determined, and then the movement rules corresponding to the determined scenario are retrieved from the database.
  • Each rehabilitation scenario has a corresponding interactive mechanic which includes: a body-part to virtual-object input mapping; involved body parts, and customizable properties/parameters. Each interactive mechanic is expressed through its associated movement rules.
  • a movement rule is defined as the correlation between a user-controlled object property in the interactive environment and a patient elementary movement or task-oriented movement.
  • An elementary movement is defined as a body movement type for a given body part of the patient according to a single degree of freedom. In a sequence of elementary movements, each elementary movement may be characterized by a respective range of movement. Examples of body movement types comprise flexion, extension, adduction, abduction, rotation, pronation, supination, etc. Examples of body parts comprise a trunk, a joint such as left knee, right shoulder, left elbow, left wrist, and/or the like, etc.
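The concepts above (elementary movement, movement rule) could be represented with a simple data model. This is an illustrative sketch only; the class and field names are assumptions, not terms taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical data model for the concepts defined above; field names are
# illustrative, not drawn from the patent text.
@dataclass(frozen=True)
class ElementaryMovement:
    body_part: str        # e.g. "right shoulder"
    movement_type: str    # e.g. "forward flexion" (single degree of freedom)

@dataclass(frozen=True)
class MovementRule:
    object_property: str          # user-controlled object property, e.g. "diameter"
    property_change: str          # e.g. "increasing"
    movement: ElementaryMovement  # movement correlated with that property change

# Example in the spirit of Table 7a: a growing ball diameter maps to
# shoulder forward flexion.
rule = MovementRule("diameter", "increasing",
                    ElementaryMovement("shoulder", "forward flexion"))
```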
  • Table 3 Exemplary movement rules for the first exemplary rehabilitation scenario.
  • Table 4 Exemplary movement rules for the second exemplary rehabilitation scenario.
  • Table 5 Exemplary movement rules for the third exemplary rehabilitation scenario.
  • Table 6 Exemplary movement rules for the fourth exemplary rehabilitation scenario.
  • Table 7a illustrates the movement rules for an interactive environment comprising a ball of which the size is to be controlled by the movement of the user's shoulder forward flexion during the virtual rehabilitation simulation.
  • Table 7a Movement rules for a rehabilitation scenario in which the size of a ball is to be controlled by a patient.
  • the movement rules contain only the property of a user-controlled object, a body part, and at least one of an elementary movement or task-oriented movement, as illustrated in Table 7b.
  • Table 7b Another embodiment of movement rules for a rehabilitation scenario in which the size of a ball is to be controlled by a patient.
  • In Table 7a, for example, if the diameter of the user-controlled object is increasing, and the body part controlling the diameter of the user-controlled object is the shoulder, then the elementary movement is "Shoulder forward flexion".
  • step 16 consists in determining a sequence of movement events for the received rehabilitation activity.
  • a movement event is defined as a state in the interactive environment, which prompts the beginning or end of a movement required to be performed by the patient in order for the activity to be successfully completed.
  • the required movement may comprise at least one elementary movement, and/or at least one task-oriented movement.
  • a movement event may be triggered by the state of a user-controlled object property, or by any state or change in the interactive environment.
  • a movement segment is defined as the movement taking place in between two successive movement events.
  • the movement events are used to define the movement segments that the patient should perform in order to complete a task in the virtual environment.
  • a movement event marks the point when there is a change in a movement segment such as the beginning or the end of a movement segment.
  • a movement event is defined by the state of a user-controlled object property, the achievement of which is required in order for the activity to be successfully completed.
  • a movement event may be defined by any property of the user-controlled object such as a position, a size, and a color, or by any state or change in the interactive environment.
  • a movement event may correspond to a respective position of the user-controlled object.
  • a movement event may correspond to a respective color of the user-controlled object.
  • a movement event may correspond to a respective size of the user-controlled object.
  • input scenario parameters translate directly into movement events.
  • movement events may each correspond to a respective position within the interactive environment.
  • the vertical, horizontal, and depth positions of the target objects may translate into movement events.
  • Table 8 presents the possible vertical and horizontal positions for the target objects while Table 9 presents the possible depth positions for the target objects.
  • Table 9 Depth positions for target objects.
  • Movement events as a function of target object positions are determined via pre-sets.
  • the shape of the sequence of food objects may be customized and each shape selection contains a series of pre-configured movement events.
  • Table 11 presents different examples of movement events determined from the shape of food objects.
  • Table 11 Movement events according to the pattern of food objects.
  • a movement event may be triggered by any change in the interactive environment, for example, the appearance of an object or the change in color of an object. A movement event may also be expressed as a relative change in position from the previous movement event.
  • the second scenario illustrates how, for the purposes of anticipating the required movements, rules may be stated to define specifically how multiple types of changes in the interactive environment can trigger movement events and segments, as illustrated in Table 12.
  • Table 12 Movement segments as a function of changes in the interactive environment.
  • Tables 13 and 14 illustrate how properties/parameters for the second scenario are translated into movement events:
  • Table 13 presents exemplary parameters entered by a medical professional for the second exemplary scenario.
  • Table 13 Exemplary parameters for the second rehabilitation scenario.
  • Table 14 presents the movement segments and the movement events that are determined for the second exemplary activity.
  • Table 14 Segment movements and event movements for the second exemplary activity.
  • step 18 the sequence of elementary movements and/or task-oriented movements that correspond to the received rehabilitation activity are determined using the movement rules and the movement events.
  • a functional task may be further determined from the determined elementary movements.
  • a functional movement or functional task is a sequence of elementary movements required to perform a specific task.
  • functional tasks may comprise brushing teeth, hanging a coat, cooking, self-grooming, playing a sport, and/or the like.
  • a movement segment is the movement to occur between two successive movement events.
  • the elementary movements and/or task-oriented movements per segment are determined by examining each movement segment, i.e. each two successive movement events in the rehabilitation activity, and assigning a respective movement rule to each movement segment.
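The step described above can be sketched as follows. This is a simplified, hypothetical stand-in for the rule lookup: it treats each movement event as a recorded value of the user-controlled object property and decides, per segment, whether the property is increasing or decreasing.

```python
# Sketch of step 18: pair successive movement events into movement segments
# and assign a movement rule (and hence an elementary movement) to each.
def segments_from_events(events):
    """events: ordered list of property values at each movement event."""
    return list(zip(events, events[1:]))  # each pair is one movement segment

def assign_movements(events, rules):
    """rules: maps 'increasing'/'decreasing' to an elementary movement name."""
    out = []
    for start, end in segments_from_events(events):
        direction = "increasing" if end > start else "decreasing"
        out.append(rules[direction])
    return out

# Ball-diameter example in the spirit of Table 7a: the diameter grows from
# 10 to 30, then shrinks to 15.
rules = {"increasing": "shoulder forward flexion",
         "decreasing": "shoulder forward extension"}
print(assign_movements([10, 30, 15], rules))
# → ['shoulder forward flexion', 'shoulder forward extension']
```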
  • Table 15 Determined elementary movements for the first exemplary scenario.
  • the determined elementary movements and/or task-oriented movements are outputted. They may be stored in a memory and/or sent to the machine of the medical professional. The medical professional may then consult the elementary movements to be executed by the patient according to the rehabilitation exercise that he created for the patient. For example, the medical professional may realize that the rehabilitation activity that he created requires the execution of a given elementary movement that is not adequate for the patient. In this case, the medical professional may modify the rehabilitation activity or create a new rehabilitation activity.
  • a rehabilitation activity/scenario comprises a general rehabilitation focus
  • the general rehabilitation focus may be retrieved from the rehabilitation scenario such as from the metadata thereof, and outputted to the medical professional along with the elementary movements and/or task-oriented movements.
  • the method 10 further comprises a step of determining movement ranges for elementary movements, if elementary movements are outputted from the previous step. This is achieved by means of a movement range table, which, for each movement rule, takes any two values of the user-controlled object property relevant to the movement rule, and matches those values with their corresponding angle of rotation for the body part involved.
  • Table 16 is an exemplary movement range table for the first exemplary scenario.
  • Table 16 Movement range table for the first exemplary scenario.
  • any values of the user-controlled object property can be mapped using a range mapping equation.
  • One exemplary range mapping equation is the following: t = b1 + (s − a1) × (b2 − b1) / (a2 − a1), where a1 and a2 are values of the user-controlled object property, b1 and b2 are angle values, s is comprised in the range [a1, a2], and t is comprised in the range [b1, b2]. The value s included in the range [a1, a2] is linearly mapped to the value t in the range [b1, b2].
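The linear range mapping can be written as a one-line function. The numeric ranges below are illustrative assumptions, not values from the patent.

```python
def map_range(s, a1, a2, b1, b2):
    """Linearly map s from the property range [a1, a2] to the angle range
    [b1, b2], per the range mapping equation above."""
    return b1 + (s - a1) * (b2 - b1) / (a2 - a1)

# Hypothetical example: a ball diameter of 55, on a property range of
# 10-100, mapped to a shoulder flexion angle range of 0-180 degrees.
print(map_range(55, 10, 100, 0, 180))  # → 90.0
```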
  • the method 10 further comprises a step of determining the clinical objectives that correspond to the created rehabilitation activity.
  • Clinical objectives comprise a movement description list. They may also comprise a movement range for each movement of the list, a movement characteristic, and a general focus.
  • the movement description list comprises a list of body parts of which a movement is required in order to complete a specific activity.
  • the movement description list comprises a list of elementary movements which are required to complete the specific activity.
  • an elementary movement is defined as a body part and a movement type according to a single degree of freedom and direction.
  • An example of an elementary movement is shoulder [body part] forward flexion [movement type].
  • the movement description list comprises a list of task-oriented movements (or movement tasks).
  • a movement task may be composed of several combinations of elementary movements.
  • the movement task by the right arm of crossing the midline may possibly be achieved either by performing only external shoulder rotation, performing only shoulder horizontal abduction, or by performing a combination of both external shoulder rotation and shoulder horizontal abduction.
  • a movement range for each elementary movement may optionally be included. For example, shoulder flexion 15°-100°.
  • the movement characteristics may be expressed as: a speed, a continuous movement precision such as an average deviation from an ideal path, a reaction time such as the time it takes to react to a certain visual stimuli, and/or the like.
  • Clinical objectives can be formatted in various ways and in various levels of complexity. Table 17 illustrates various combinations for clinical objectives.
  • Table 17 Exemplary clinical objectives.
  • Table 18 illustrates examples of body parts, movement types, movement characteristics, and general foci, that may be used to create clinical objectives. It should be understood that the list of body parts, movement types, movement characteristics, and general foci contained in Table 18 is not exhaustive and exemplary only, and it should also be noted that all categories may be expanded upon.
  • Table 18 Exemplary body parts, movement descriptions, movement characteristics, and general foci to generate clinical objectives.
  • the step of generating the clinical objectives is accomplished by iteratively examining each level configuration of the received rehabilitation activity, and identifying inputs that relate to the movement characteristics such as speed of movement or precision of movement.
  • the parameters are each compared to a respective challenge threshold. If a given one of the parameters exceeds its respective challenge threshold, then the movement characteristic related to that parameter is listed as a clinical objective.
  • a challenge threshold is a value assigned to a performance characteristic and is used to determine whether a specific performance characteristic value is considered as a clinical objective.
  • Challenge thresholds may be expressed as raw values. For example, a challenge threshold for speed may be set at 10 cm/second. In this case, a speed requirement of 12 cm/second would exceed the challenge threshold, and therefore would be considered as a clinical objective.
  • a challenge threshold may be expressed as a comparison to a patient's previous performance of a specific performance characteristic.
  • a previous performance can be summarized as one data point in many ways, such as average, weighted average, or a specific previous performance.
  • a challenge threshold may be expressed as any value that can quantify a comparison.
  • a challenge threshold can be expressed as a "percentage change" from a value that reflects the patient's previous speed performance.
  • challenge thresholds can be set globally for all body parts, elementary movements, and movement tasks. In another embodiment, challenge thresholds can also be set for individual or groups of body parts, elementary movements, or movement tasks.
  • Challenge threshold values are stored in a database and can optionally be customized by a clinician. Challenge threshold values can be set for individual patients, or globally for all patients.
  • the received rehabilitation activity comprises a speed requirement at 6 cm/second
  • this requirement may be compared to the patient's previous performances for the related movement type. If, in previous performances, the patient obtained an average speed of 5 cm/second and the challenge threshold is set to 5%, then it is determined that the speed requirement exceeds the challenge threshold, and the speed is listed as a clinical objective for the patient.
  • the received rehabilitation activity comprises a precision requirement of 2 cm concerning the maximal deviation from a given path
  • this requirement may be compared to the patient's previous performances. If, in previous performances, the patient's average path deviation was 2.2 cm and the challenge threshold is set to 5%, it is then determined that the precision requirement exceeds the challenge threshold (it being assumed that the lower the precision requirement, the more challenging). The precision requirement is then listed as a clinical objective.
  • the received rehabilitation activity comprises a speed requirement of 5 cm/second
  • this requirement may be compared to the patient's previous performances. If, in previous performances, the patient has an average speed of 8 cm/second and the challenge threshold is 5%, it is determined that the speed requirement is below the challenge threshold, and the speed requirement is not listed as a clinical objective.
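The three worked examples above reduce to a single percentage-change comparison. Below is a minimal sketch of that check, assuming the "percentage change" form of challenge threshold; function and parameter names are illustrative.

```python
def exceeds_challenge(required, previous, threshold_pct, lower_is_harder=False):
    """Return True when a requirement exceeds the patient's previous
    performance by more than threshold_pct percent, in which case the
    related movement characteristic is listed as a clinical objective.
    lower_is_harder covers characteristics like maximal path deviation,
    where a smaller required value is more challenging."""
    change = (previous - required) if lower_is_harder else (required - previous)
    return change / previous * 100.0 > threshold_pct

# Speed example: 6 cm/s required vs 5 cm/s previous average, 5% threshold.
print(exceeds_challenge(6.0, 5.0, 5.0))                          # → True
# Precision example: 2 cm max deviation vs 2.2 cm previous, 5% threshold.
print(exceeds_challenge(2.0, 2.2, 5.0, lower_is_harder=True))    # → True
# Speed example: 5 cm/s required vs 8 cm/s previous is below the threshold.
print(exceeds_challenge(5.0, 8.0, 5.0))                          # → False
```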
  • the body parts, movement types, movement tasks, movement ranges, movement characteristics, and clinical objectives can be outputted. They can be provided as a list describing the rehabilitation activity as a whole. For example, the following clinical objectives may be outputted: shoulder horizontal abduction, shoulder horizontal adduction, shoulder forward flexion, and shoulder forward extension. This list is applied to the activity as a whole, and does not indicate which clinical objectives apply to which movement segments.
  • the outputs described above can be provided per movement segment. In this manner, the output can be broken down to describe each movement segment in the rehabilitation activity.
  • Table 19 illustrates an example of such an output format.
  • Table 19 Exemplary elementary movements per movement segment.
  • the outputs described above can be provided per group of two or more movement segments.
  • Table 20 illustrates an example of such an output format.
  • Table 20 Exemplary elementary movements per group of movement segments.
  • the method 10 further comprises a step of generating and outputting alerts regarding potential risk/danger for the patient to perform the received rehabilitation activity.
  • the parameter values for the received rehabilitation activity are compared to a respective danger threshold. If a parameter value exceeds its respective danger threshold, then an alert is generated.
  • a danger or upper-limit threshold is a value related to a performance characteristic, which is used to determine whether a specific performance characteristic value may be excessive or may present a potential danger/risk to the patient.
  • Upper-limit thresholds may be expressed as raw values.
  • an upper-limit threshold for speed may be set at 35 cm/second. In this case, a speed requirement of 40 cm/second would exceed the upper-limit threshold, and therefore would be considered excessive or a potential danger for the patient.
  • an upper-limit threshold may be expressed as a comparison to a patient's previous performance of a specific performance characteristic. In this case, an upper-limit threshold may be expressed as any value that can quantify a comparison.
  • an upper-limit threshold can be expressed as a "percentage change" from a value that reflects the patient's previous speed performance.
  • a previous performance can be summarized as one data point in many ways, such as average, weighted average, or a specific previous performance.
  • Upper-limit threshold values can be set globally for all body parts, elementary movements, and movement tasks. Alternatively, upper-limit threshold values can be set for individual or groups of body parts, elementary movements, and movement tasks.
  • Upper-limit threshold values can optionally be customized by clinicians.
  • Upper-limit threshold values can be set for individual patients, or globally for all patients.
  • a received rehabilitation activity comprises a speed requirement at 12 cm/second
  • this requirement may be compared to the patient's previous performances for the related movement type. If, in previous performances, the patient has an average speed of 7 cm/second and the danger threshold for this parameter is 30%, then it is determined that the speed requirement exceeds the danger threshold. In this case, an alert is generated and outputted. The alert may be indicative of the parameter and its value.
  • a received rehabilitation activity comprises shoulder forward flexion at 0°-140°
  • this requirement may be compared to the patient's previous performances. If, in previous performances, the patient has only been able to achieve a maximum of 110° shoulder forward flexion and the danger threshold for this parameter is 20%, then it is determined that the shoulder forward flexion requirement exceeds the danger threshold. In this case, an alert is generated and outputted.
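The alert-generation step could be sketched as below, assuming the "percentage change" form of danger threshold; the function name and alert wording are illustrative, not from the patent.

```python
def danger_alert(param_name, required, previous_best, danger_pct):
    """Return an alert string when the requirement exceeds the patient's
    previous best performance by more than danger_pct percent, else None.
    The alert is indicative of the parameter and its value, as described
    above."""
    if (required - previous_best) / previous_best * 100.0 > danger_pct:
        return (f"ALERT: {param_name} requirement of {required} exceeds "
                f"the danger threshold")
    return None

# Range-of-motion example above: 140 deg required vs a previous maximum of
# 110 deg, with a 20% danger threshold (a ~27% increase → alert).
print(danger_alert("shoulder forward flexion", 140, 110, 20))
# Speed example above: 12 cm/s required vs 7 cm/s average, 30% threshold.
print(danger_alert("speed", 12, 7, 30))
```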
  • Functional tasks are determined from the elementary movements or task-oriented movements determined at step 18.
  • a database comprises a list of functional tasks and the corresponding elementary movements or task-oriented movements for each functional task.
  • the elementary movements or task-oriented movements that are determined at step 18 are compared to the database, and the corresponding functional task, if any, is retrieved.
  • Table 21 presents one exemplary method for breaking down the functional task "hanging a coat" into its corresponding elementary movements.
  • Table 21 List of elementary movements forming the functional task "hanging a coat"
  • Figure 3 illustrates one embodiment of a system 50 for determining elementary movements from a rehabilitation activity.
  • the system 50 comprises a movement rules determining module 52, a movement events determining module 54, and an elementary movement determining module 56.
  • the movement rules determining module 52 is adapted to determine the movement rules that correspond to the received rehabilitation activity using the above-described method.
  • the movement events determining module 54 is adapted to determine the movement events that correspond to the received rehabilitation activity using the above described method with respect to step 16 of method 10.
  • the determined movement rules and movement events are received by the elementary movement determining module 56 from the movement rules determining module 52 and the movement events determining module 54, respectively.
  • the elementary movement determining module 56 is adapted to determine the elementary movements or task-oriented movements corresponding to the received rehabilitation activity from the received movement rules and movement events, as described above with respect to step 18 of method 10.
  • the system 50 further comprises a clinical objective module (not shown) for generating clinical objectives for the received rehabilitation activity using the above-described method.
  • the system 50 further comprises an alert generating unit (not shown) for generating and outputting alerts using the above-described method.
  • Figure 4 illustrates one embodiment of a method 100 for analysing the movements executed by a user/patient while performing a virtual rehabilitation exercise, i.e. determining the list of elementary movements and/or task-oriented movements executed by the patient while performing the rehabilitation exercise.
  • step 102 data indicative of the movements executed by the patient while performing the rehabilitation exercise are received.
  • step 104 the movement rules corresponding to the rehabilitation exercise performed by the patient are retrieved from a memory.
  • step 106 the movement events contained in the received executed movements are determined.
  • the movement events comprise movement event triggers.
  • a movement event trigger is defined as a function that determines when a movement event is triggered, based on one or more prescribed conditions being satisfied.
  • the prescribed condition can be any defined change in state in the interactive environment, for example, the appearance of an object, the collision of two objects, the position of an object, or the change in color of an object.
  • a prescribed condition may also be defined by a relative change in state from the previous movement event.
  • one example of where movement event triggers are used is a rehabilitation exercise in which the patient is requested to reach with his arm to hit a moving target.
  • a movement event trigger is attached to the target, with the defined condition that when the patient's arm hits the target, a movement event shall occur.
  • the movement event's value can be assigned as a parameter defined by the movement event's corresponding trigger.
  • the value defined by the movement event trigger can be any property of a user-controlled object, or be any state or change in the interactive environment.
  • the movement event trigger assigns the movement event's value as the target object's 3D position at the time the trigger condition is satisfied
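The trigger described above, a condition over the interactive environment plus a value assignment, might be sketched as follows. The class and field names are illustrative assumptions.

```python
# Sketch of a movement event trigger: a predicate over the interactive
# environment state, plus a function that assigns the movement event's
# value when the condition is satisfied.
class MovementEventTrigger:
    def __init__(self, condition, value_fn):
        self.condition = condition  # state -> bool
        self.value_fn = value_fn    # state -> movement event value

    def check(self, state):
        """Return the movement event value when the condition fires, else None."""
        return self.value_fn(state) if self.condition(state) else None

# Target-hit example above: the event fires when the patient's arm hits the
# target, and the event's value is the target's 3D position at that moment.
trigger = MovementEventTrigger(
    condition=lambda s: s["arm_hits_target"],
    value_fn=lambda s: s["target_position"])
print(trigger.check({"arm_hits_target": True,
                     "target_position": (0.2, 1.1, 0.5)}))
# → (0.2, 1.1, 0.5)
```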
  • the movement events or movement event triggers can be sent to the interactive simulation generator to aid in patient performance analysis, as below.
  • the list of movement event values is predefined and pre-ordered prior to the patient's performance of the rehabilitation activity.
  • the ordered list of movement events is not defined prior to patient activity performance.
  • the step of determining the sequence of movement events comprises defining, assigning values, and ordering the movement events based on patient actions during the rehabilitation exercise.
  • One example of an embodiment where movement events are predefined is where the patient must guide his arm along a pre-set and prescribed path, and no movement choice is required from the user.
  • one example of an embodiment where movement events are not defined prior to patient performance is a scenario where the patient must reach with his arm to hit a number of targets in 3D space.
  • the movement event triggers are sent to the interactive simulation, where each movement event trigger corresponds to one target and assigns the value of its corresponding movement event to be the target's position in space.
  • the patient may choose how many targets to hit, in which order to hit them, and when to hit them. In this manner, the movement events can be ordered by the interactive simulation based on the order in which the patient actually hits the targets.
  • movement events can also be triggered by changes in the interactive environment.
  • an exemplary trigger condition is "hand reaches object". For such an embodiment, the actual properties of the user-controlled object at the time of the movement event are not known until the patient performs the activity. These properties are only recorded during or after the performance of the activity.
  • the list of movement events is sent to the interactive simulation generator that will generate the virtual rehabilitation simulation with which the patient will engage.
  • the interactive environment generator uses the movement events as markers with which to segment patient performance data.
  • Table 22 presents an example of an ordered list of movement events. In this case, the first movement event must occur before the second movement event, the second movement event must occur before the third movement event, etc.
  • Table 23 presents an example of an unordered list of movement events.
  • the movement events may occur in any temporal order.
  • Table 23 Exemplary list of unordered movement events.
  • the system collects performance data, which can be extracted from position data provided by a motion tracking unit, any biometric sensory data, or from the interactive simulation generator.
  • the system subdivides the patient's performance into discrete movement segments based on the successive movement events achieved by the patient's performance and determines the patient performances.
  • Performance data may include, but is not limited to, speed of movement, fluidity of movement, angular range of motion, trunk forward compensation, trunk lateral compensation, pulse rate, electromyography (EMG) activity, precision of movement, accuracy of movement, reaction time, scapular compensation, successful movement task completion, and/or the like.
  • the performance data comprise the speed of movement, which is determined from the time taken by the user to move his body part between two movement events and the distance between the real-world positions of the user's body part at each movement event.
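The speed computation just described can be sketched directly, assuming 3D positions in metres and timestamps in seconds (units are illustrative).

```python
import math

def segment_speed(p_start, p_end, t_start, t_end):
    """Average speed over a movement segment: the straight-line distance
    between the body-part positions at two successive movement events,
    divided by the elapsed time."""
    dist = math.dist(p_start, p_end)  # Euclidean distance (Python 3.8+)
    return dist / (t_end - t_start)

# Hypothetical example: the hand moves 0.3 m along x in 2 seconds → 0.15 m/s.
print(segment_speed((0.0, 1.0, 0.5), (0.3, 1.0, 0.5), 10.0, 12.0))  # → 0.15
```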
  • the performance data comprise the precision of movement.
  • the precision of movement is determined from the position of the user-controlled object within the background scene during the performance, i.e. the recorded position of the user-controlled object within the 3D space during the simulation.
  • the precision of movement may be determined from the deviation of the path followed by the user-controlled object from the shortest distance straight line path between two movement events.
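The deviation-based precision measure above can be sketched in 2D as the largest perpendicular distance from the recorded path to the straight line joining the two movement events; coordinates below are illustrative.

```python
import math

def max_path_deviation(path, a, b):
    """Largest perpendicular distance (2D) from the recorded path points to
    the straight line through movement events a and b."""
    (ax, ay), (bx, by) = a, b
    length = math.hypot(bx - ax, by - ay)
    # Point-to-line distance via the cross-product formula.
    return max(abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / length
               for px, py in path)

# Hypothetical example: a slightly bowed path between events at (0,0)
# and (4,0) deviates at most 0.5 units from the straight line.
print(max_path_deviation([(1, 0.2), (2, 0.5), (3, 0.1)], (0, 0), (4, 0)))  # → 0.5
```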
  • the performance data comprise the posterior trunk compensation performed during the movement, which is defined as the distance of forward-back movement performed by the trunk to aid in an upper-body movement.
  • the interactive simulation generator may order the movement events during or after the patient performs the activity, based on how the patient performs the rehabilitation activity. For example, in a scenario where the patient must clap his hands to kill bugs at various horizontal and vertical coordinates, the patient may choose in which order to kill the bugs. The interactive environment would order the movement events based on the order that the patient chooses to kill the bugs.
  • the movement segments are generated in real time or after the performance (based on the patient's choice), and the performance measurements are gathered for each movement segment accordingly.
  • step 108 the sequence of elementary movements and/or task-oriented movements is determined using the above-described method with respect to method 10.
  • From the movement events, movement segments are created, and a respective movement rule is assigned to each movement segment.
  • the elementary movement and/or task-oriented movement that corresponds to the respective movement rule is assigned to the movement segment, thereby obtaining the sequence of elementary movements and/or task-oriented movements that were performed by the patient during the rehabilitation exercise.
  • the performance measurements are matched with the movement rules for the related scenario to correlate the segment-based performance measurements with specific elementary movements or task-oriented movements. This correlation is illustrated in Table 25 for the first exemplary scenario.
  • Table 25 Elementary movement and patient performances per movement segment for the first exemplary scenario.
  • the output of step 108 comprises a list of elementary movements, task-oriented movements, or a combination thereof, which the user has performed during the rehabilitation activity.
  • the output of step 108 further comprises at least one patient performance measurement for at least one of an elementary movement or task oriented movement performed during the rehabilitation exercise.
  • a medical professional can view a list of movements that have been executed by the patient during the rehabilitation exercise. The medical professional may then evaluate which elementary movements or task-oriented movements were performed satisfactorily by the patient during the exercise, and which movements are in need of further clinical attention.
  • Figure 5 illustrates one embodiment of a computer-implemented method 150 for creating a rehabilitation activity.
  • clinical objectives created by a medical professional are received. From the received clinical objectives, a rehabilitation activity is automatically generated, and from the rehabilitation activity a virtual interactive simulation may be created. The patient may then interact with the interactive simulation while executing a rehabilitation exercise.
  • clinical objectives may comprise a list of body parts.
  • clinical objectives may comprise a movement description list, which may include a list of elementary movements, task-oriented movements, or a combination thereof, and corresponding body parts.
  • clinical objectives may comprise at least one functional task.
  • Clinical objectives may also further comprise movement ranges for each elementary movement, movement characteristics for the body-parts and movements involved, as well as a general focus for the activity to be generated.
  • Tables 26-29 illustrate exemplary clinical objectives.
  • the clinical objectives comprise body parts and respective elementary movements.
  • the clinical objectives comprise body parts and respective task-oriented movements and movement characteristics.
  • the clinical objectives comprise a body part, respective elementary movements and movement characteristics, and a general rehabilitation focus.
  • the clinical objectives comprise body parts and respective movements as well as a movement range for each elementary movement.
  • Table 26 First example of received clinical objectives.
  • Table 27 Second example of received clinical objectives.
  • Table 28 Third example of received clinical objectives.
  • Table 29 Fourth example of received clinical objectives.
  • the method 150 further comprises a step of breaking down the functional task into a sequence of elementary movements and/or task- oriented movements.
  • the method 150 further comprises a step of receiving an input indicative of a time of play from the medical professional.
  • the input indicative of the time of play may comprise a number of movement actions in a level, a number of levels in an activity, or the like.
  • the time of play such as the number of movement actions in a level or the number of levels in an activity, may be set randomly within a given range.
  • the method 150 further comprises a step of determining a scenario that corresponds to the received clinical objectives, and movement rules that are adapted to the scenario.
  • more than one scenario is determined for the received clinical objectives, and a given one of the determined scenarios is selected.
  • a plurality of scenarios are stored in a database which further comprises movement rules for each scenario.
  • the elementary movements or the task-oriented movements, and the corresponding body parts to which they apply, contained in the received clinical objectives are compared to the movements listed in the movement rules for each scenario contained in the database. If all of the elementary movements or task-oriented movements correspond to the movements listed in a given set of movement rules, then the scenario corresponding to the given set of movement rules is selected as being adapted for the received clinical objectives.
  • the selection of the adequate scenarios may be further based on the general rehabilitation focus.
  • the general rehabilitation focus of the received clinical objectives is compared to the general rehabilitation focus of the scenarios stored in the database. If a match is found, then the corresponding scenario is selected as being adapted to the received clinical objectives.
  • the scenario selection method based on the general rehabilitation focus may be used to discriminate between the scenarios that have been selected based on the elementary movements or task-oriented movements. For example, if a given scenario, that has been previously selected as being adequate based on its corresponding elementary movements or task-oriented movements, has a general rehabilitation focus that does not match that of the clinical objectives, then the given scenario may be rejected. Alternatively, the given scenario is confirmed as being adapted for the received clinical objectives.
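The selection logic described above can be sketched as a simple two-stage filter. The data shapes, field names, and focus values below are illustrative assumptions, not the patent's actual implementation:

```python
# Hypothetical sketch of scenario selection: a scenario is retained when every
# (body part, movement) pair in the clinical objectives is covered by the
# scenario's movement rules, and the retained scenarios may then be
# discriminated by the general rehabilitation focus, when one is given.

def select_scenarios(objectives, scenario_db):
    """Return the scenarios whose movement rules cover all objective movements."""
    required = {(o["body_part"], o["movement"]) for o in objectives["movements"]}
    selected = []
    for scenario in scenario_db:
        covered = {(r["body_part"], r["movement"])
                   for r in scenario["movement_rules"]}
        if required <= covered:  # every required pair appears in the rules
            selected.append(scenario)
    # Optionally discriminate by the general rehabilitation focus.
    focus = objectives.get("focus")
    if focus is not None:
        selected = [s for s in selected if s.get("focus") == focus]
    return selected
```

A scenario that passes the movement comparison but whose focus does not match is rejected by the second filter, mirroring the discrimination step above.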
  • the fourth exemplary clinical objectives of Table 29 require two scenarios so that all objectives are covered.
  • the second exemplary scenario is selected since its movement rules 5 and 6 illustrated in Table 4 satisfy the body-part to movement description correlation related to elbows for the fourth clinical objectives.
  • the third exemplary scenario is also selected since its movement rules 1, 4, and 6 satisfy the body-part to movement description correlation related to trunk for the fourth clinical objectives.
  • the first exemplary scenario may be selected instead of the second exemplary scenario for the fourth exemplary clinical objectives since the first exemplary scenario satisfies the same required body-part to movement description correlation.
  • a given one of the multiple scenarios may be randomly selected.
  • the medical professional may be requested to select a given one of the multiple scenarios.
  • several or all of the relevant scenarios may be selected. In this case, the activities generated according to these scenarios may be subsequently presented to a medical professional who may decide to only select one activity for the patient or at least two activities for the patient.
  • the next step 156 comprises the step of determining the customizable properties/parameters of the selected scenario(s) to obtain a rehabilitation activity that may be converted into an interactive simulation. Movement events are first generated according to the movement rules for the selected scenario, and the customizable properties/parameters are then determined using the movement events.
  • the relevant movement rules for the selected scenario are first identified.
  • the specific movement rules that satisfy the input's body-part to movement description correlation are isolated.
  • the first exemplary scenario is selected since its movement rules 1, 2, 7, and 8 satisfy the clinical objectives.
  • the movement events, each expressed as a 3D position, may be randomly generated, as long as the following conditions are met:
  • the (x, y, z) coordinates of the movement events do not exceed exterior boundary ranges. These ranges can be defined by the medical professional or be stored in a database;
  • Rules 5 or 6 can never be true in the same sequence of movement events as rules 7 or 8;
  • the difference between movement events is at least a certain minimum.
  • This 'minimum difference' can be defined by the medical professional or be stored in a database. All the movement events together satisfy all the appropriate movement rules (in this case, rules 1, 2, 7, and 8).
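A minimal sketch of randomly generating a sequence of movement events under constraints of this kind: coordinates stay within exterior boundary ranges, and consecutive events are separated by at least the minimum difference. The boundary values, the minimum, and the function name are assumptions for illustration:

```python
import random

def generate_events(n, bounds, min_diff, rng=random.Random(0)):
    """Randomly generate n movement events as 3D positions.

    bounds: ((xmin, xmax), (ymin, ymax), (zmin, zmax)) exterior boundary ranges,
    which could be defined by the medical professional or stored in a database.
    min_diff: minimum allowed distance between consecutive events.
    """
    events = []
    while len(events) < n:
        # Draw a candidate position inside the exterior boundary ranges.
        candidate = tuple(rng.uniform(lo, hi) for lo, hi in bounds)
        if events:
            prev = events[-1]
            dist = sum((a - b) ** 2 for a, b in zip(candidate, prev)) ** 0.5
            if dist < min_diff:
                continue  # too close to the previous event: reject and redraw
        events.append(candidate)
    return events
```

Scenario-specific rules (such as rules 5/6 never holding in the same sequence as rules 7/8) would be checked as an additional rejection step over the whole sequence.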
  • predetermined movement events are stored in a database for each scenario. More than one set of movement events may be stored in the database for the same scenario, and each set of movement events corresponds to a respective set of customizable properties/parameters values.
  • Table 30 illustrates four different sets of movement events for the first exemplary scenario.
  • the first set of movement events corresponds to food objects positioned along a horizontal line.
  • the second set of movement events corresponds to food objects positioned according to a vertically-oriented square.
  • the third set of movement events corresponds to food objects positioned according to a horizontally-oriented square.
  • the fourth set of movement events corresponds to food objects positioned along a vertical line.
  • Table 30 Exemplary sets of predetermined movement events for the first exemplary scenario.
  • the sets 1 and 3 of predetermined movement events satisfy the relevant movement rules 1, 2, 7, and 8, and are therefore selected.
  • the fourth exemplary scenario is selected since its movement rules 1, 2, and 7 satisfy the body-part to movement description correlation.
  • the movement events may be each expressed as a 3D position and they can be randomly generated, as long as the following conditions are met:
  • the x, y, z coordinates of the movement events do not exceed exterior boundary ranges
  • the x, y, z coordinates of the movement events are integers between -1 and 1, inclusive
  • the second exemplary scenario is selected since its movement rules 1, 2, 3, and 4 satisfy the body-part to movement description correlation.
  • the movement events may be each expressed as a 3D position and they can be randomly generated, as long as the following conditions are met:
  • the x coordinates of the movement events are integers between -1 and 1, inclusively;
  • the z coordinates of the movement events are integers between -1 and 0, inclusively;
  • the y coordinate of the movement events is always 0;
  • the odd number movement events in sequence correspond to the location of the falling objects
  • the even number movement events in sequence correspond to the location of the pail; and the even number movement events in sequence shall always be the same.
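The conditions above for the second exemplary scenario can be sketched as a small generator: integer x in [-1, 1], integer z in [-1, 0], y always 0, odd-numbered events giving falling-object locations and even-numbered events giving one constant pail location. The function name and random draws are illustrative assumptions:

```python
import random

def generate_pail_sequence(n_events, rng=random.Random(0)):
    """Generate a sequence of 3D movement events for the falling-objects/pail scenario."""
    # The pail position is drawn once and reused, since every even-numbered
    # event in the sequence shall always be the same.
    pail = (rng.randint(-1, 1), 0, rng.randint(-1, 0))
    events = []
    for i in range(1, n_events + 1):
        if i % 2 == 1:
            # Odd-numbered event: location of a falling object.
            events.append((rng.randint(-1, 1), 0, rng.randint(-1, 0)))
        else:
            # Even-numbered event: the constant pail location.
            events.append(pail)
    return events
```

Each generated triple satisfies the coordinate conditions by construction, so no rejection step is needed here.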
  • the third and fourth exemplary scenarios are selected since their movement rules together satisfy the body-part to movement description correlation.
  • the movement events may be each expressed as a 3D position and a sequence of movement events is generated for each selected scenario.
  • movement events may be automatically generated as long as the following conditions are met:
  • the x,z coordinates of the movement events do not exceed exterior boundary ranges
  • the difference between the movement events is at least a certain minimum.
  • the 'minimum difference' can be defined by the clinician or be stored in a database.
  • a movement range table such as the one illustrated in Table 31, is consulted in order to convert the angle to the movement event value.
  • Table 31 Movement range table for the third exemplary scenario.
  • the angle values specified in the clinical objectives are mapped to their corresponding movement event values (within the virtual space) of the user-controlled object property using a range mapping formula.
  • a range mapping formula is as follows: t = b1 + ((s - a1)(b2 - b1)) / (a2 - a1), where a1 and a2 are angle values, s is comprised in the range [a1, a2], and t is comprised in the range [b1, b2].
  • the value s included in the range [a1, a2] is linearly mapped to the value t in the range [b1, b2].
  • [desired Z value] = 0 + ((50 - 0)(1 - 0)) / (90 - 0) ≈ 0.56
  • a 50-degree Trunk Forward Flexion angle requirement would render a Z value of 0.56.
  • [desired X value] = 0 + ((30 - 0)((+/-)1 - 0)) / (90 - 0) ≈ +/-0.33
  • a 30-degree Trunk Lateral Flexion angle requirement would render an X value of +/-0.33.
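The range mapping formula and the two worked examples above can be written as a short function. The function name is an assumption; the arithmetic follows the formula directly, with a 0-90 degree angle range mapped to the [0, 1] virtual-space range:

```python
def range_map(s, a1, a2, b1, b2):
    """Linearly map a value s in [a1, a2] to the corresponding t in [b1, b2]."""
    return b1 + (s - a1) * (b2 - b1) / (a2 - a1)

# Worked examples from the text:
# 50 degrees of Trunk Forward Flexion over a 0-90 degree range mapped to [0, 1]
z_value = range_map(50, 0, 90, 0, 1)   # approximately 0.56
# 30 degrees of Trunk Lateral Flexion over the same ranges
x_value = range_map(30, 0, 90, 0, 1)   # approximately 0.33 (sign chosen per side)
```

The endpoints map exactly: an angle of 0 yields b1 and an angle of 90 yields b2, which is what a linear mapping requires.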
  • the relevant scenario-specific rules that must be satisfied are as follows: the x, y, z coordinates of the movement events do not exceed exterior boundary ranges as defined in the scenario;
  • the x, y, z coordinates of the movement events are integers between -1 and 1, inclusively;
  • the movement events correspond directly to customizable parameters of the scenario, and the value of each determined movement event is simply assigned to its corresponding customizable parameter.
  • the movement events may correspond directly to the (x,y,z) position of the target objects.
  • the determined positions for the movement events are each assigned to a respective target object.
  • the movement events (outputted from the previous step) are converted into actual rehabilitation activity parameters, which may be read by the simulation generator and edited by a clinician.
  • the second exemplary scenario requires that all odd number movement events be translated into horizontal and depth coordinates for the falling objects, and even number movement events be translated into horizontal and depth coordinates for the pail.
  • Table 32 illustrates exemplary movement events for the second exemplary scenario.
  • Table 32 Exemplary movement events for the second exemplary scenario.
  • the movement events can be translated into vertical, horizontal, and depth positions, based on the following conversion Table 33:
  • Table 33 Conversion table for the movement events of the second exemplary scenario.
  • Table 34 Exemplary position for the falling fruits and the pail.
  • the scenario is complete and referred to as a rehabilitation activity.
  • the rehabilitation activity is outputted.
  • the automatically generated activity may be stored in memory for further download by the patient, sent to the physician for consultation or to the interactive simulation generator.
  • the method 150 further comprises a step of making adjustments related to movement characteristics required and difficulty. This is achieved by consulting the input related to movement characteristics.
  • a challenge level is not expressed: selecting a "movement characteristic" value that falls between a predetermined 'challenge threshold' and a predetermined 'upper-limit threshold'.
  • a challenge level is expressed: selecting a "movement characteristic" value that falls within a sub-range between a predetermined 'challenge threshold' and a predetermined 'upper-limit threshold', the sub-range being defined by the inputted challenge level. Specifically, a lower 'challenge level' would result in a sub-range closer to the challenge threshold, and a higher challenge level would result in a sub-range closer to the 'upper-limit threshold'.
  • the sub-ranges between the 'challenge threshold' and 'upper-limit threshold' can be derived through a linear or non-linear calculation.
  • the method may verify the patient's current ability in terms of speed, and adjust the speed requirement of the activity in order for it to be adequately challenging for the patient. It may be determined that in previous performances, the patient has averaged 10 cm/second, the challenge threshold is 5%, and the upper-limit threshold is 30%. With this information, a speed requirement randomly in between 10.5 and 13 cm/second may be selected.
  • the method may verify the patient's current ability in terms of precision, and adjust the precision requirement of the activity in order for it to be adequately challenging for the patient. It may be determined that in previous performances, the patient's average path deviation was 3 cm, the challenge threshold is 5%, and the upper-limit threshold is 30%. With this information, a precision requirement randomly in between 2.85 cm and 2.1 cm is selected (it being assumed that the lower the precision requirement, the more challenging).
  • Example 3 If, for example, "speed (challenge level: 2/5)" is listed as a clinical objective, the method may verify that the 'challenge threshold' in terms of speed is 10 cm/second, and the upper-limit threshold in terms of speed is 35 cm/second. As the challenge level is 2/5, a speed requirement randomly between the sub-range of 15 to 20 cm/second may be selected.
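The three examples above can be sketched with a pair of small functions: one that derives the sub-range for a given challenge level by dividing the span between the challenge threshold and the upper-limit threshold linearly (the non-linear variant mentioned above is not shown), and one that draws a requirement value from it. Function names are illustrative assumptions:

```python
import random

def challenge_subrange(lower, upper, level=None, max_level=5):
    """Return the (lo, hi) sub-range to draw a movement-characteristic value from.

    lower/upper: the 'challenge threshold' and 'upper-limit threshold' values,
    e.g. the patient's previous average scaled by the threshold percentages.
    level: optional challenge level out of max_level; None means no level given.
    """
    if level is None:
        # No challenge level expressed: use the whole range.
        return (lower, upper)
    # Linear division: a lower level gives a sub-range closer to the challenge
    # threshold, a higher level one closer to the upper-limit threshold.
    width = (upper - lower) / max_level
    return (lower + (level - 1) * width, lower + level * width)

def pick_requirement(lower, upper, level=None, max_level=5, rng=random.Random(0)):
    """Randomly select a requirement value within the appropriate sub-range."""
    lo, hi = challenge_subrange(lower, upper, level, max_level)
    return rng.uniform(lo, hi)
```

With the numbers of Example 3, `challenge_subrange(10, 35, level=2)` gives the (15.0, 20.0) cm/second sub-range; with Example 1, the bounds 10 x 1.05 = 10.5 and 10 x 1.30 = 13.0 cm/second span the whole selection range.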
  • the method 150 may be embodied as a system comprising a scenario determining module, a movement events determining module, and a scenario parameter determining module.
  • the scenario determining module is adapted to receive the clinical objectives and determine a corresponding scenario and corresponding movement rules, as described above.
  • the movement events determining module is adapted to receive the movement rules and determine movement events adapted for the selected scenario, as described above.
  • the scenario parameter determining module is adapted to receive the selected scenario and the determined movement events, and determine the value for the scenario customizable parameters in order to obtain a complete rehabilitation activity.
  • the rehabilitation activity and the clinical objectives may both be received from a medical professional.
  • the rehabilitation activity and/or the clinical objectives may be received from a performance analysis system, a processing machine, and/or the like.

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physiology (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention relates to a method for analysing one of a rehabilitation activity and a performance of a user during a rehabilitation exercise, comprising: receiving one of a rehabilitation activity and executed movements performed by the user during the rehabilitation exercise, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the rehabilitation exercise; determining movement rules corresponding to the one of the rehabilitation activity and the rehabilitation exercise; determining a sequence of movement events corresponding to the one of the rehabilitation activity and the rehabilitation exercise, each of the movement events corresponding to a given state of the characteristic of the user-controlled virtual object within the interactive environment, the given state corresponding to one of a start and an end of a movement; and determining a sequence of elementary movements using the movement rules and the movement events.
PCT/US2014/025970 2013-03-14 2014-03-13 Procédé et système pour analyser une activité/exercice de rééducation virtuel WO2014160172A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/774,960 US20160023046A1 (en) 2013-03-14 2014-03-13 Method and system for analysing a virtual rehabilitation activity/exercise

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361783504P 2013-03-14 2013-03-14
US61/783,504 2013-03-14

Publications (1)

Publication Number Publication Date
WO2014160172A1 true WO2014160172A1 (fr) 2014-10-02

Family

ID=51625349

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/025970 WO2014160172A1 (fr) 2013-03-14 2014-03-13 Procédé et système pour analyser une activité/exercice de rééducation virtuel

Country Status (2)

Country Link
US (1) US20160023046A1 (fr)
WO (1) WO2014160172A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10572733B2 (en) 2016-11-03 2020-02-25 Zimmer Us, Inc. Augmented reality therapeutic movement display and gesture analyzer

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150132731A1 (en) * 2013-11-12 2015-05-14 Health Tech Pal Corp Physical therapy system
US10861604B2 (en) 2016-05-05 2020-12-08 Advinow, Inc. Systems and methods for automated medical diagnostics
US11164679B2 (en) 2017-06-20 2021-11-02 Advinow, Inc. Systems and methods for intelligent patient interface exam station
US10939806B2 (en) 2018-03-06 2021-03-09 Advinow, Inc. Systems and methods for optical medical instrument patient measurements
US11348688B2 (en) 2018-03-06 2022-05-31 Advinow, Inc. Systems and methods for audio medical instrument patient measurements
US11903712B2 (en) * 2018-06-08 2024-02-20 International Business Machines Corporation Physiological stress of a user of a virtual reality environment
US11344727B2 (en) 2018-11-09 2022-05-31 Board Of Regents, The University Of Texas System Stereognosis training system and method for patients with chronic stroke, spinal cord injury or neuropathy
US20200254310A1 (en) * 2019-02-13 2020-08-13 Triad Labs, LLC Adaptive virtual rehabilitation
US11918504B1 (en) 2019-11-13 2024-03-05 Preferred Prescription, Inc. Orthotic device to prevent hyperextension

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6244987B1 (en) * 1996-11-25 2001-06-12 Mitsubishi Denki Kabushiki Kaisha Physical exercise system having a virtual reality environment controlled by a user's movement
US20080061949A1 (en) * 2004-07-29 2008-03-13 Kevin Ferguson Human movement measurement system
US20120296235A1 (en) * 2011-03-29 2012-11-22 Rupp Keith W Automated system and method for performing and monitoring physical therapy exercises

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5429140A (en) * 1993-06-04 1995-07-04 Greenleaf Medical Systems, Inc. Integrated virtual reality rehabilitation system
US6827579B2 (en) * 2000-11-16 2004-12-07 Rutgers, The State University Of Nj Method and apparatus for rehabilitation of neuromotor disorders
EP1729858B1 (fr) * 2004-02-05 2013-04-10 Motorika Ltd. Methodes et appareils d'exercice et d'entrainement de reeducation
US8834169B2 (en) * 2005-08-31 2014-09-16 The Regents Of The University Of California Method and apparatus for automating arm and grasping movement training for rehabilitation of patients with motor impairment
KR100722229B1 (ko) * 2005-12-02 2007-05-29 한국전자통신연구원 사용자 중심형 인터페이스를 위한 가상현실 상호작용 인체모델 즉석 생성/제어 장치 및 방법
WO2008140780A1 (fr) * 2007-05-10 2008-11-20 Grigore Burdea Systèmes et procédés d'évaluation périodique et de rééducation à distance
US20120157263A1 (en) * 2009-01-20 2012-06-21 Mark Sivak Multi-user smartglove for virtual environment-based rehabilitation
US9403056B2 (en) * 2009-03-20 2016-08-02 Northeastern University Multiple degree of freedom rehabilitation system having a smart fluid-based, multi-mode actuator
TWI435744B (zh) * 2010-07-30 2014-05-01 Univ Nat Yang Ming 中風患者之雙側上肢動作訓練與評估系統
BR112013002637B1 (pt) * 2010-08-02 2020-12-29 CARDINAL HEALTH SWITZERLAND 515 GmbH stent flexível tendo articulações que se projetam
WO2012161657A1 (fr) * 2011-05-20 2012-11-29 Nanyang Technological University Systèmes, appareils, dispositifs, et procédés pour la réhabilitation neurophysiologique et/ou le développement fonctionnel synergique
US20150004581A1 (en) * 2011-10-17 2015-01-01 Interactive Physical Therapy, Llc Interactive physical therapy

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6244987B1 (en) * 1996-11-25 2001-06-12 Mitsubishi Denki Kabushiki Kaisha Physical exercise system having a virtual reality environment controlled by a user's movement
US20080061949A1 (en) * 2004-07-29 2008-03-13 Kevin Ferguson Human movement measurement system
US20120296235A1 (en) * 2011-03-29 2012-11-22 Rupp Keith W Automated system and method for performing and monitoring physical therapy exercises

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10572733B2 (en) 2016-11-03 2020-02-25 Zimmer Us, Inc. Augmented reality therapeutic movement display and gesture analyzer
US10621436B2 (en) 2016-11-03 2020-04-14 Zimmer Us, Inc. Augmented reality therapeutic movement display and gesture analyzer
US11176376B2 (en) 2016-11-03 2021-11-16 Zimmer Us, Inc. Augmented reality therapeutic movement display and gesture analyzer

Also Published As

Publication number Publication date
US20160023046A1 (en) 2016-01-28

Similar Documents

Publication Publication Date Title
WO2014160172A1 (fr) Procédé et système pour analyser une activité/exercice de rééducation virtuel
US20140371633A1 (en) Method and system for evaluating a patient during a rehabilitation exercise
CN110121748B (zh) 增强的现实治疗移动显示和手势分析器
US20210272376A1 (en) Virtual or augmented reality rehabilitation
CN110312471B (zh) 从神经肌肉活动测量中导出控制信号的自适应系统
US9788917B2 (en) Methods and systems for employing artificial intelligence in automated orthodontic diagnosis and treatment planning
EP3120256B1 (fr) Procédé et système pour fournir une rétroaction biomécanique au mouvement d'humains et d'objets
US9517111B2 (en) Methods and systems for employing artificial intelligence in automated orthodontic diagnosis and treatment planning
CN108463271A (zh) 用于运动技能分析以及技能增强和提示的系统和方法
US20130297554A1 (en) Methods and systems for employing artificial intelligence in automated orthodontic diagnosis & treatment planning
US20220351824A1 (en) Systems for dynamic assessment of upper extremity impairments in virtual/augmented reality
CN105637531A (zh) 人体姿势识别
JP2022507628A (ja) 様々なタイプの仮想現実および/または拡張現実環境内の神経筋活性化からのフィードバック
Guerrero et al. Kinect-based posture tracking for correcting positions during exercise
CN112288766B (zh) 运动评估方法、装置、系统及存储介质
Muztoba et al. Robust communication with IoT devices using wearable brain machine interfaces
US20160331304A1 (en) System and methods for automated administration and evaluation of physical therapy exercises
Krotov Human control of a flexible object: hitting a target with a bull-whip
Zaher et al. A framework for assessing physical rehabilitation exercises
EP2660742A1 (fr) Appareil d'entraînement
US11992745B2 (en) Method and system for assessing and improving wellness of person using body gestures
WO2020061213A1 (fr) Tâches d'apprentissage en réalité virtuelle pour physiothérapie et rééducation physique
KR20180022495A (ko) 훈련 컨텐츠의 난이도를 설정하는 방법 및 이를 운용하는 전자 장치
Fluet et al. Virtual reality-augmented rehabilitation for patients in sub-acute phase post stroke: a feasibility study
Chatzitofis et al. Technological module for unsupervised, personalized cardiac rehabilitation exercising

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14774615

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14774960

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14774615

Country of ref document: EP

Kind code of ref document: A1