WO2023102599A1 - Methods and systems for rehabilitation of prosthesis users - Google Patents
Methods and systems for rehabilitation of prosthesis users
- Publication number
- WO2023102599A1 (PCT/AU2022/051456)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- user
- virtual
- sensor
- prosthesis
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1107—Measuring contraction of parts of the body, e.g. organ, muscle
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1124—Determining motor skills
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/486—Bio-feedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/0001—Games specially adapted for handicapped, blind or bed-ridden persons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/24—Use of tools
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/25—Bioelectric electrodes therefor
- A61B5/279—Bioelectric electrodes therefor specially adapted for particular uses
- A61B5/296—Bioelectric electrodes therefor specially adapted for particular uses for electromyography [EMG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/389—Electromyography [EMG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4851—Prosthesis assessment or monitoring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6824—Arm or wrist
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6828—Leg
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7455—Details of notification to user or communication with user or patient ; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/68—Operating or control means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/68—Operating or control means
- A61F2/70—Operating or control means electrical
- A61F2/72—Bioelectric control, e.g. myoelectric
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/76—Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/68—Operating or control means
- A61F2002/6827—Feedback system for providing user sensation, e.g. by force, contact or position
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/76—Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
- A61F2002/7615—Measuring means
- A61F2002/7625—Measuring means for measuring angular position
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/76—Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
- A61F2002/7615—Measuring means
- A61F2002/763—Measuring means for measuring spatial position, e.g. global positioning system [GPS]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/76—Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
- A61F2002/7615—Measuring means
- A61F2002/769—Displaying measured values
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/76—Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
- A61F2002/7695—Means for testing non-implantable prostheses
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/0001—Games specially adapted for handicapped, blind or bed-ridden persons
- A63F2009/0007—Games with therapeutic effects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
Definitions
- Embodiments generally relate to methods and systems for rehabilitation of prosthesis users.
- embodiments relate to systems and methods for facilitating rehabilitation of prosthesis users using a virtual or augmented reality platform.
- a prosthesis may provide them with an increased ability to perform tasks such as picking up, holding, and manipulating objects.
- learning to use a prosthesis requires time and training. This can be an arduous process for the prosthesis user, and may require them to attend numerous in-person rehabilitation sessions with a clinician.
- Some embodiments relate to a method of determining performance metrics of a user relating to control of a virtual prosthetic limb in a virtual or augmented reality environment, the method comprising: causing a virtual reality environment to be presented to a user, wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device; receiving sensor data from the sensor device in response to the user causing the virtual prosthetic limb to move within the virtual reality environment, the sensor data comprising motion sensor data and human-prosthesis interface data; and processing the sensor data to derive at least one performance metric; wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter.
- Some embodiments relate to a method of determining performance metrics of a user relating to control of a virtual prosthetic limb in a virtual or augmented reality environment, the method comprising: causing a virtual reality environment to be presented to a user, wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device; receiving sensor data from the sensor device in response to the user causing the virtual prosthetic limb to move within the virtual reality environment, the sensor data comprising motion sensor data and human-prosthesis interface data; and processing the sensor data to derive at least one performance metric; wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter, wherein the at least one performance metric relates to a level of competency with which a user controls the virtual prosthetic limb within the virtual reality environment.
- the performance metric is related to the performance of at least one basic skill.
- the performance metric is related to the performance of at least one functional skill.
- the performance metric is based on at least one motor behaviour metric.
- the at least one motor behaviour metric is at least one of a compensation score, a compensation style, a prosthesis utilisation score, a prosthesis utilisation style, a sensor utilisation score and a sensor utilisation style.
- the performance metric relates to the movement of the virtual limb within the virtual reality environment.
- the performance metric is determined in response to a trigger measurement event.
- the measurement event is triggered by an interaction between the virtual limb and at least one virtual object.
- Some embodiments further comprise prompting the user to perform a task within the virtual environment.
- the task requires the user to interact with a virtual object within the virtual environment.
- Some embodiments further comprise receiving data related to the state of the virtual object, and using this data to derive the at least one performance metric.
- Some embodiments further comprise receiving data related to the pose of the user, and using this data to derive the at least one performance metric.
- Some embodiments further comprise receiving data related to the state of the virtual prosthetic limb, and using this data to derive the at least one performance metric.
- Some embodiments further comprise receiving data related to the state of the sensor device, and using this data to derive the at least one performance metric.
- comparing the sensor data with at least one predetermined parameter comprises comparing the sensor data with historical data generated by a population of users within an equivalent virtual reality environment. Some embodiments further comprise performing a clustering technique on the historical data to group the historical data into clusters, wherein comparing the sensor data with historical data comprises comparing the sensor data with the clustered data. Some embodiments further comprise determining that at least one cluster is a desired cluster, and determining the at least one performance metric by calculating a distance between the desired cluster and the sensor data. In some embodiments, determining the at least one performance metric comprises determining which cluster is closest to the sensor data.
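- As an illustration of the cluster-based comparison described in the preceding embodiments, the following Python sketch computes cluster centroids from historical feature vectors and derives two example metrics: the distance to a designated "desired" cluster and the identity of the nearest cluster. It is a minimal sketch only; the feature layout, the source of the cluster labels, and the function names are assumptions made for illustration and are not taken from the claimed method.

```python
import numpy as np

def cluster_centroids(historical, labels, n_clusters):
    """Centroid of each cluster of historical feature vectors (one vector per session)."""
    return np.array([historical[labels == k].mean(axis=0) for k in range(n_clusters)])

def performance_metric(sample, centroids, desired_cluster):
    """Two illustrative metrics for one session's feature vector:
    distance to the centroid of the 'desired' cluster, and the nearest cluster overall."""
    distances = np.linalg.norm(centroids - sample, axis=1)
    return {
        "distance_to_desired": float(distances[desired_cluster]),
        "nearest_cluster": int(np.argmin(distances)),
    }

# Example: 100 historical sessions, 6 features each, already grouped into 3 clusters
# by any clustering technique (labels here are random placeholders).
rng = np.random.default_rng(0)
historical = rng.normal(size=(100, 6))
labels = rng.integers(0, 3, size=100)
centroids = cluster_centroids(historical, labels, 3)
print(performance_metric(rng.normal(size=6), centroids, desired_cluster=0))
```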
- the performance metric comprises at least one of a motor performance metric and a prosthesis use metric.
- Some embodiments further comprise determining that an interaction has occurred between a virtual object and the virtual prosthetic limb, and generating haptic feedback data to be delivered to the user.
- the haptic feedback data comprises at least one of a vibration amplitude and frequency.
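- A minimal sketch of how an interaction event might be turned into haptic feedback data of this kind is shown below. The HapticEvent structure, the linear force-to-amplitude scaling, and the 80-250 Hz frequency range are illustrative assumptions, not values specified in the embodiments.

```python
from dataclasses import dataclass

@dataclass
class HapticEvent:
    amplitude: float      # normalised vibration strength, 0.0-1.0
    frequency_hz: float   # vibration frequency
    duration_s: float     # pulse length

def haptic_for_interaction(contact_force: float, max_force: float = 10.0) -> HapticEvent:
    """Map a virtual contact force to a vibration amplitude/frequency pair
    (linear scaling and 80-250 Hz range are illustrative choices)."""
    level = max(0.0, min(contact_force / max_force, 1.0))
    return HapticEvent(amplitude=level, frequency_hz=80.0 + 170.0 * level, duration_s=0.05)

print(haptic_for_interaction(4.0))  # amplitude=0.4, frequency_hz=148.0, duration_s=0.05
```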
- the determined performance metrics of the user are communicated to the user to assist the user in performing functional tasks with a physical prosthetic limb in a real-world environment.
- Some embodiments relate to a method of presenting a virtual reality environment to a user, the method comprising: receiving input indicating a high level prosthesis rehabilitation goal; determining, based on the high level prosthesis rehabilitation goal, at least one low level skill relating to the high level goal; generating a virtual reality environment that facilitates the performance of the low level skill; wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device, the sensor device comprising at least one motion sensor and at least one human-prosthesis interface sensor; receiving sensor data from the sensor device, the sensor data comprising motion sensor data and human-prosthesis interface data; and based on the received sensor data, causing movement of the virtual prosthetic limb in the virtual reality environment to allow the user to perform the low level skill.
- the high level goal relates to a real-world task.
- the low level skill comprises at least one of a basic skill and a functional skill.
- the virtual reality environment is used for the purpose of training and/or rehabilitation.
- Some embodiments further comprise processing the sensor data to derive at least one performance metric, wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter.
- Some embodiments further comprise determining a recommendation for the user based on the at least one performance metric, and presenting to the user the recommendation.
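- The goal-to-skill decomposition described above can be pictured as a simple lookup from a high level rehabilitation goal to low level skills, and from those skills to virtual tasks that exercise them. The sketch below is hypothetical: the goal names, skill names, and task descriptions are invented for illustration and do not come from the embodiments.

```python
# Hypothetical mapping from a high level rehabilitation goal to low level skills,
# and from skills to virtual tasks that exercise them.
GOAL_TO_SKILLS = {
    "prepare a meal": ["grasp and release", "wrist rotation", "bimanual coordination"],
    "get dressed": ["pinch grip", "sustained grip", "reach and place"],
}

SKILL_TO_TASK = {
    "grasp and release": "pick up virtual blocks and drop them into a bin",
    "wrist rotation": "turn a virtual door handle",
    "bimanual coordination": "hold a virtual bowl steady while stirring",
    "pinch grip": "pick up a virtual coin",
    "sustained grip": "carry a virtual bag across the room",
    "reach and place": "place virtual objects on a high shelf",
}

def plan_session(goal: str) -> list[str]:
    """Return the virtual tasks that exercise the low level skills behind a goal."""
    return [SKILL_TO_TASK[skill] for skill in GOAL_TO_SKILLS.get(goal, [])]

print(plan_session("prepare a meal"))
```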
- the sensor device comprises at least one inertial measurement unit.
- the sensor device comprises at least one camera.
- the sensor device comprises at least one surface electromyography sensor.
- Some embodiments relate to a computer readable storage medium storing executable code, wherein when a processor executes the code, the processor is caused to perform the method of some other embodiments.
- Figure 1 is a schematic diagram of a system for virtual or augmented reality based therapeutics, according to some embodiments
- Figure 2 is a schematic of an optional dongle of the system of Figure 1, according to some embodiments;
- Figure 3A is a diagram illustrating components of the system of Figure 1 being operated by a user with an upper limb amputation, according to some embodiments;
- Figure 3B is a diagram illustrating components of the system of Figure 1 being operated by a user with a lower limb amputation, according to some embodiments;
- Figure 4 is a schematic diagram of the software components of the system of Figure 1, according to some embodiments.
- Figure 5 is a schematic diagram showing the hardware services components of the system of Figure 4 in further detail;
- Figure 6 is a schematic diagram showing the prosthesis services components of the system of Figure 4 in further detail;
- Figure 7 is a schematic diagram showing the data services components of the system of Figure 4 in further detail;
- Figure 8 is a schematic diagram showing the game services components of the system of Figure 4 in further detail;
- Figure 9 is a schematic diagram showing the rehabilitation management platform of Figure 1 in further detail;
- Figure 10 is a schematic diagram showing the real-time database of Figure 9 in further detail;
- Figure 11 is a schematic diagram showing the raw data processing module of Figure 9 in further detail;
- Figure 12 is a schematic diagram showing the big data database of Figure 9 in further detail;
- Figure 13 is a schematic diagram showing the performance calculation module of Figure 9 in further detail;
- Figure 14 is a process flow diagram of a method of providing virtual reality based therapeutics by presenting games to a user using the system of Figure 1;
- Figure 15 is a process flow diagram of a method of providing virtual reality based assessment using the system of Figure 1;
- Figure 16A is an example screenshot that may be displayed during performance of the method of Figure 14;
- Figure 16B is an example screenshot that may be displayed during performance of the method of Figure 15;
- Figure 17 is an example screenshot that may be displayed to a user during performance of the method of Figure 14 to allow for selection of games to present to a user using the system of Figure 1;
- Figures 18A to 18D are example screenshots that may be displayed to a user during performance of the method of Figure 14;
- Figures 19A to 19D show the headset of Figure 1 displaying images that may be shown to the user during performance of the method of Figure 14;
- Figure 20 is an example screenshot that may be displayed to a user during performance of the method of Figure 14;
- Figures 21A to 21D are further example screenshots that may be displayed to a user during performance of the method of Figure 14.
- Embodiments generally relate to methods and systems for rehabilitation of prosthesis users.
- embodiments relate to systems and methods for facilitating rehabilitation of prosthesis users using a virtual or augmented reality platform.
- a user of a prosthesis may require substantial rehabilitation and training to learn how to effectively use the prosthesis, and to train their body, both neurologically and physically, to use the prosthesis.
- a patient may wear a prosthesis that provides them with functions usually performed by the upper limb, such as picking up, holding, and manipulating objects, for example.
- Mechanical prostheses use human-to-prosthesis interface sensors that a user can interact with to cause the prosthesis to move in the desired way.
- a prosthesis may include surface electromyography sensors that are configured to make contact with the remaining portion of the user’s limb, so that electrical signals created when the wearer's remaining muscles contract are translated into movement of the prosthesis.
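- One common way such surface electromyography signals are translated into movement commands is to compute a short-window envelope on antagonist muscle channels and map the difference to a proportional open/close command. The Python sketch below illustrates that general idea only; it is not the signal processing method of the described embodiments, and the channel roles, window length and threshold are assumptions.

```python
import numpy as np

def emg_envelope(samples: np.ndarray, window: int = 50) -> float:
    """Mean absolute value of the most recent window of raw sEMG samples."""
    return float(np.abs(samples[-window:]).mean())

def grip_command(flexor: np.ndarray, extensor: np.ndarray, threshold: float = 0.05) -> float:
    """Proportional command in [-1, 1]: positive closes the (virtual) hand,
    negative opens it; activity below the threshold produces no movement."""
    close, open_ = emg_envelope(flexor), emg_envelope(extensor)
    if max(close, open_) < threshold:
        return 0.0
    return float(np.clip(close - open_, -1.0, 1.0))

# Example with synthetic signals: strong flexor activity, quiet extensor -> close the hand.
rng = np.random.default_rng(1)
print(grip_command(0.4 * rng.standard_normal(200), 0.02 * rng.standard_normal(200)))
```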
- Virtual reality and augmented reality platforms allow users to control an avatar within a three-dimensional virtual space, and to interact with virtual objects in that space. This may be done using hand-held controllers comprising user interface elements, such as buttons and motion sensors, to capture the user’s desired movements and actions. Interactions with the controllers are translated into movement of the avatar within the three-dimensional virtual space. While the remainder of this document refers to virtual reality, it should be appreciated that this may include augmented reality and/or mixed reality environments.
- Described embodiments use a virtual reality platform to assist in rehabilitation, training and assessment of prosthesis users.
- a sensor device that can track motion and muscle activation
- user inputs can be translated into movements of a virtual prosthesis displayed in a virtual environment.
- the user can learn to manipulate the virtual prosthesis to interact with virtual objects in the virtual space, and analytics data relating to their actions can be provided to a clinician for assessment.
- By configuring the virtual prosthesis to act in a manner similar to how a physical prosthesis would act given the same motion and muscle activation of a user, the user can apply the trained motion and muscle activation to a physical prosthesis.
- training with the virtual prosthesis in the virtual environment may improve a user’s use of a physical prosthesis in their day-to-day life as they move through and interact with their real-world physical environment.
- the virtual reality environment may be used to present games-based tasks to assist in the rehabilitation process. While the term “game” is used in the description below, it should be understood that the present application relates to a system of rehabilitation and assessment, and not to an entertainment gaming system. Tasks are presented to the user in the form of games to encourage the user to perform them, but could be presented without the game-like elements in some embodiments.
- the described virtual reality platform may be used to present assessment tasks and standardised tests to the user, in some embodiments.
- FIG. 1 shows a schematic diagram of a system 100 for virtual reality based therapeutics, according to some embodiments.
- System 100 may be configured to provide a virtual environment that can be used to provide one or more of therapy, rehabilitation, training, assessment, and telehealth.
- System 100 comprises a computing device 110 in communication with a headset 120. Headset 120 is further in communication with a sensor device 130. In some embodiments, headset 120 may be in communication with sensor device 130 via an optional dongle 135. In some embodiments, computing device 110 and headset 120 may further be in communication with a controller 190.
- computing device 110 may be in communication with further external devices and systems via a network 140.
- computing device 110 may further be in communication with one or more of a rehabilitation management platform 150, a user computing device 160 and/or a clinician computing device 170.
- a user may wear and/or use headset 120 and sensor device 130 in order to experience a virtual reality platform provided by system 100, to enable the user to interact with virtual objects in a virtual space presented to them via headset 120 by interaction with sensor device 130.
- Interaction with sensor device 130 may include movement of a limb or portion of a limb which sensor device 130 is configured to monitor, and/or activation of one or more muscles which sensor device 130 is configured to monitor.
- Computing device 110 may be a smartphone, laptop computer, desktop computer, or other computing device configured to communicate with headset 120. According to some embodiments, computing device 110 may be operated by a clinician. According to some embodiments, computing device 110 may be operated by a patient or user of headset 120 and sensor device 130. In some embodiments, rather than being a standalone device, computing device 110 may be integrated with headset 120.
- Computing device 110 comprises a processor 111 in communication with a memory 114.
- Processor 111 comprises one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in memory 114.
- Processor 111 may include an arithmetic logic unit (ALU) for mathematical and/or logical execution of instructions, such as operations performed on data stored in internal registers of processor 111.
- Memory 114 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types.
- memory 114 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
- Memory 114 is configured to store program code 115 accessible by the processor 111.
- Program code 115 may comprise a plurality of executable program code modules executable by processor 111 to cause processor 111 to perform functions as described in further detail below.
- memory 114 may comprise program code 115 in the form of a task presentation application 116, and an assessment application 117, both executable by processor 111.
- Executing task presentation application 116 may cause processor 111 to generate a virtual environment for presentation to a user via headset 120, as described below in further detail with reference to Figure 14.
- Executing assessment application 117 may cause processor 111 to process data received from headset 120 and sensor device 130 to assess a user’s performance within the virtual environment, as described below in further detail with reference to Figure 15. While task presentation application 116 and assessment application 117 are shown as both residing on computing device 110, according to some embodiments one or more of these applications may be stored on or caused to execute on another device of system 100, such as headset 120 or rehabilitation management platform 150, for example.
- Memory 114 may also comprise one or more data files comprising data 180 that is accessible to processor 111 for performing read and write functions. Some data types that may be stored in memory 114 are described below with reference to Figure 4.
- Computing device 110 may also comprise user inputs and outputs 113 capable of receiving inputs, such as requests, from one or more users of user computing device 110, and capable of conveying outputs, such as information, to the user.
- User I/O 113 may comprise one or more user interface components, such as one or more of a display device, a touch screen display, a keyboard, a mouse, a camera, a microphone, and buttons, for example.
- User computing device 110 may further comprise a communications module 112 configured to facilitate communication between user computing device 110 and one or more external computing devices or systems via one or more networks.
- Communications module 112 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel.
- communications module 112 may facilitate communication between user computing device 110 and other devices within system 100 via network 140.
- communications module 112 may facilitate communication between user computing device 110 and rehabilitation management platform 150, user computing device 160, and/or clinician computing device 170 via network 140.
- communications module 112 may facilitate communication of data related to patient performance from user computing device 110 to one or more of rehabilitation management platform 150, user computing device 160, and/or clinician computing device 170, for example.
- communications module 112 may alternatively or additionally facilitate direct communication between computing device 110 and other components of system 100.
- communications module 112 may facilitate direct communication between computing device 110 and headset 120, sensor device 130, and/or controller 190.
- communications module 112 may facilitate communication of calibration data from computing device 110 to one or more of headset 120, sensor device 130, and/or controller 190, for example.
- Communications module 112 may also facilitate communication of sensor and user performance data from one or more of headset 120, sensor device 130, and/or controller 190 to computing device 110.
- Communications module 112 may facilitate communication between computing device 110 and external computing devices or systems via one or more wired or wireless communications protocols.
- communications module 112 may facilitate communication between computing device 110 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols.
- Network 140 may comprise one or more local area networks or wide area networks that facilitate communication between elements of system 100.
- network 140 may be the internet.
- network 140 may comprise at least a portion of any one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, etc. one or more messages, packets, signals, some combination thereof, or so forth.
- Network 140 may include, for example, one or more of: a wireless network, a wired network, an internet, an intranet, a public network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a public-switched telephone network (PSTN), a cable network, a cellular network, a satellite network, a fibre-optic network, or some combination thereof.
- Headset 120 may be a virtual reality headset configured to deliver a virtual reality environment to a user when worn on the user’s head.
- headset 120 may be an off-the-shelf or commercially available virtual reality headset.
- Headset 120 may be an Android-based VR headset in some embodiments, such as an Oculus Quest 1 or Oculus Quest 2 headset from Oculus®; a VIVE Focus 3 headset from HTC®; or a Pico Neo 2 or Pico Neo 3 headset from Pico Interactive®, in some embodiments.
- Headset 120 comprises a processor 121 in communication with a memory 124.
- Processor 121 comprises one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in memory 124.
- Processor 121 may include an arithmetic logic unit (ALU) for mathematical and/or logical execution of instructions, such as operations performed on data stored in internal registers of processor 121.
- Memory 124 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types.
- memory 124 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
- Memory 124 is configured to store program code 125 accessible by the processor 121.
- Program code 125 may comprise a plurality of executable program code modules executable by processor 121 to cause processor 121 to perform functions such as displaying a virtual reality environment to a user via display 123, as described in further detail below.
- program code 125 may include virtual reality content built on a real-time game engine, such as the Unity game engine by Unity Technologies®, for example.
- Memory 124 may also comprise one or more data files comprising data 126 that is accessible to processor 121 for performing read and write functions.
- Headset 120 also comprises a display 123 capable of conveying visual data to a user in the form of a virtual reality environment.
- Display 123 may be a head-mounted display comprising one or more display optics that are positioned to be viewed by at least one eye of a user when headset 120 is worn.
- display 123 may comprise two display optics, with each display optic being configured to be viewed by an eye of the user to present a simulated three dimensional image to the user.
- headset 120 also comprises a motion sensor 127.
- motion sensor 127 may be configured to sense the motion of the headset 120 as it is being worn by a user, to allow for the image displayed by display 123 to be changed with the movement of headset 120.
- Processor 121 executing program code 125 may be caused to adapt the user’s view of the virtual environment being displayed to them via display 123 as the user moves in their real-world physical environment, the physical movement of the user being translated into virtual movement of the user’s avatar within the virtual environment and thus altering the perspective of the virtual environment shown to the user via display 123.
- motion sensor 127 may comprise an inertial measurement unit (IMU), or other device that measures at least one of the specific force, angular rate, and orientation of headset 120.
- motion sensor 127 may comprise one or more of an accelerometer, gyroscope, and magnetometer.
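- As a rough illustration of how readings from such sensors can be fused into an orientation estimate, the sketch below shows a single-axis complementary filter that integrates the gyroscope rate and corrects drift using the accelerometer's gravity direction. This is a textbook technique shown for context; it is not necessarily how headset 120 or its firmware estimates orientation, and the blending factor, axes and sample values are illustrative.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """One pitch update (radians): integrate the gyroscope rate, then correct
    drift towards the accelerometer's gravity-derived pitch. alpha is illustrative."""
    pitch_gyro = pitch_prev + gyro_rate * dt
    pitch_accel = math.atan2(accel_y, accel_z)
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

pitch = 0.0
for gyro_rate, ay, az in [(0.10, 0.02, 0.99), (0.05, 0.03, 0.99)]:  # made-up samples
    pitch = complementary_filter(pitch, gyro_rate, ay, az, dt=0.01)
print(pitch)  # small positive pitch estimate
```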
- Headset 120 may further comprise a communications module 122 configured to facilitate communication between headset 120 and one or more external computing devices or systems.
- Communications module 122 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel.
- communications module 122 may facilitate communication between headset 120 and computing device 110.
- communications module 122 may facilitate communication of calibration and settings data from computing device 110 to headset 120.
- Communications module 122 may additionally or alternatively facilitate communication of sensor and user performance data from headset 120 to computing device 110.
- communications module 122 may facilitate communication of sensor data from one or more of sensor device 130 and controller 190 to headset 120.
- communications module 122 may additionally or alternatively facilitate communication of data from headset 120 to one or more of sensor device 130 and controller 190, which may include haptic feedback data in some embodiments.
- communications module 122 may facilitate direct communication between headset 120 and other components of system 100 via one or more wired or wireless communications protocols.
- communications module 122 may facilitate communication between headset 120 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols.
- headset 120 may communicate with one or more of computing device 110, sensor device 130 and controller 190 via Bluetooth Low Energy (BLE).
- communications module 122 may comprise a software module configured to communicate via the proprietary protocols used by sensor device 130, to allow communications module 122 to extract prosthesis input signals generated by sensor device 130 and use these signals to simulate prosthetic devices in virtual reality via display 123.
- system 100 may comprise a dongle 135 to facilitate communication between headset 120 and sensor device 130.
- dongle 135 may be necessary due to communication limitations in sensor device 130.
- dongle 135 may communicate with sensor device 130 via a wired interface, and may communicate via a wired communication protocol.
- dongle 135 may communicate with sensor device 130 via a proprietary communication protocol, which may be specific to sensor device 130.
- dongle 135 may be configured to communicate with headset 120 via a wireless communication protocol.
- Dongle 135 may be configured to communicate with headset 120 via BLE, for example.
- Dongle 135 may be configured to receive information such as sensor data from sensor device 130, and pass this to headset 120. Dongle 135 may also be configured to receive information from headset 120. Dongle 135 may receive haptic feedback data from headset 120 and pass this to sensor device 130. According to some embodiments, dongle 135 may additionally or alternatively receive configuration data from headset 120. The configuration data may relate to the type of sensor device 130 dongle 135 is to be connected to, and may allow dongle 135 to select an appropriate communication protocol for communication with sensor device 130. According to some embodiments, the type of sensor device may be detected by dongle 135 automatically. According to some embodiments, the type of sensor device may be selected by a clinician when setting up headset 120 for the patient, via user I/O 113 of computing device 110, a user interface of clinician computing device 170 or headset 120, for example.
- Figure 2 shows a schematic diagram of a sub-system 200 of system 100, showing a possible embodiment of dongle 135 in further detail.
- dongle 135 comprises a wired communications module 205 in communication with sensor device 130.
- wired communications module 205 facilitates communication with sensor device 130 via a proprietary communication protocol, using a proprietary bus.
- the bus may comprise three wires, which may be used for data, voltage (Vcc) and Ground in some embodiments.
- wired communications module 205 may comprise a UART to allow for serial communications with sensor device 130.
- wired communications module 205 may facilitate an analogue communication protocol.
- wired communications module 205 may be specific to sensor device 130, and may require configuration based on the properties of sensor device 130. Sensor configuration and/or sensor selection information may be generated by headset 120, received by wireless communications module 215, and used by processor 210 to configure wired communications module 205 to adopt the appropriate configuration.
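- Conceptually, the configuration step described above amounts to looking up the wired-interface settings that match the selected sensor type. The sketch below assumes a hypothetical registry of profiles; the sensor type names, transports and parameter values are invented for illustration and are not taken from the embodiments or any particular sensor device.

```python
# Hypothetical registry of wired-interface profiles per sensor type; the names and
# values are placeholders, not taken from any real sensor device or the embodiments.
WIRED_PROFILES = {
    "emg_array_8ch": {"transport": "uart", "baud": 115200, "frame_bytes": 10},
    "emg_analog_2ch": {"transport": "analog", "sample_hz": 1000},
}

def configure_wired_module(sensor_type: str) -> dict:
    """Return the wired-interface settings the dongle should adopt for the sensor
    type selected on (or detected by) the headset."""
    try:
        return WIRED_PROFILES[sensor_type]
    except KeyError:
        raise ValueError(f"no wired profile for sensor type {sensor_type!r}")

print(configure_wired_module("emg_array_8ch"))
```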
- Dongle 135 further comprises a processor 210 for processing data received from wired communications module 205 and communicating this to a wireless communications module 215.
- processor 210 may comprise one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in a memory 240.
- Memory 240 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types.
- memory 240 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
- Memory 240 may be configured to store program code accessible and executable by processor 210.
- Memory 240 may also comprise one or more data files comprising data that is accessible to processor 210 for performing read and write functions.
- memory 240 may comprise a data map 245, which may be used by processor 210 to map sensor data received from sensor device 130 via wired communication module 205 to pattern/power pairs or commands, which may be passed by wireless communications module 215 to headset 120 to be used for input commands by prosthetic services module 430, as described in further detail below with reference to Figure 4.
- Pattern/power pairs may correspond to data that describes both a combination of sensors being activated and a degree to which they are activated.
- the received sensor data may have values corresponding to both a pattern or combination of sensors activated in a sensor array, and to a power or strength at which the sensors in the pattern were activated.
- the pattern and power data may be mapped to prosthesis functions. For example, a particular pattern of sensors may correspond to a “fist” position being adopted by a prosthetic, and the power value will determine the degree to which the fist should be closed.
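- The data map concept described above can be illustrated as a lookup from a (pattern, power) pair to a prosthesis function and an actuation degree. In the sketch below the bit patterns, function names and 8-bit power range are hypothetical examples, not values defined by the embodiments.

```python
# Hypothetical data map from sensor-activation patterns to prosthesis functions.
# The bit patterns and function names are placeholders for illustration.
PATTERN_MAP = {
    0b0011: "fist",       # e.g. two flexor channels active together
    0b1100: "open_hand",
    0b0110: "pinch",
}

def decode_pattern_power(pattern: int, power: int, max_power: int = 255):
    """Map a (pattern, power) pair to (function, degree): the pattern selects the
    prosthesis function and the power sets how far it is driven, e.g. how tightly
    a fist is closed."""
    function = PATTERN_MAP.get(pattern, "idle")
    degree = min(max(power, 0), max_power) / max_power
    return function, degree

print(decode_pattern_power(0b0011, 128))  # ('fist', ~0.5) -> half-closed fist
```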
- processor 210 and memory 240 may reside within a microcontroller.
- processor 210 and memory 240 may reside within an Arduino® Nano BLE microcontroller in some embodiments.
- dongle 135 further comprises a wireless communications module 215 in communication with headset 120.
- Wireless communications module 215 facilitates communication with headset 120 via a wireless communication protocol.
- wireless communications module 215 comprises a Bluetooth module, and facilitates communication with headset 120 via BLE.
- Dongle 135 may also comprise one or more of a bootloader 225, power source 230, and power management module 220.
- Bootloader 225 may be configured to allow for programming or re-programming of processor 210 and/or memory 240 via a programming port, which may be a USB port in some embodiments.
- bootloader 225 may be caused to erase and/or write to memory 240 to amend stored program code for execution by processor 210.
- Power source 230 may comprise one or more of a battery, USB connector, power plug, or other power source, and may be configured to supply power to the remaining electronic components of dongle 135.
- Power management module 220 may be configured to receive power status information from power source 230, and to communicate this information to processor 210. Processor 210 may in turn communicate this information to headset 120 for display to the user.
- system 100 further comprises at least one sensor device 130.
- sensor device 130 may be configured to be worn on the remaining portion of an amputated or missing limb of a user, and to generate signals based on the movement and use of the remaining portion of the limb.
- sensor device 130 may be configured to otherwise monitor the movement and muscle activation of a limb of a user.
- sensor device 130 may be able to be worn on or otherwise monitor the movement and muscle activation of a non-amputated limb of a user.
- Sensor device 130 comprises a communications module 131 configured to facilitate communication between sensor device 130 and one or more external computing devices or systems.
- Communications module 131 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel.
- communications module 131 may facilitate communication between sensor device 130 and headset 120, as described in further detail above.
- communications module 131 may facilitate communication between sensor device 130 and computing device 110.
- communications module 131 may facilitate direct communication between sensor device 130 and other components of system 100 via one or more wired or wireless communications protocols. For example, communications module 131 may facilitate communication between sensor device 130 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols. In some embodiments, communications module 131 may facilitate direct communication between sensor device 130 and headset 120. For example, communications module 131 may facilitate communication of sensor data from sensor device 130 to headset 120. In some embodiments, communications module 131 may additionally or alternatively facilitate communication of data from headset 120 to sensor device 130. As described above, in some embodiments communication between headset 120 and sensor device 130 may be via dongle 135.
- Sensor device 130 further comprises at least one motion sensor 132 and at least one human-prosthetic interface sensor 133. According to some embodiments, motion sensor 132 and human-prosthetic interface sensor 133 may be separate devices located in separate locations.
- motion sensor 132 may comprise an inertial measurement unit (IMU), or other device that measures at least one of the specific force, angular rate, and orientation of sensor device 130.
- motion sensor 132 may comprise one or more of an accelerometer, gyroscope, and magnetometer.
- motion sensor 132 may comprise one or more cameras.
- Motion sensor 132 may provide data relating to the movement of a user’s limb through physical space.
- motion sensor 132 may be a motion sensor component from an off-the-shelf or commercially available virtual reality controller.
- motion sensor 132 may be a motion sensor component from an Android-based VR controller, such as an Oculus Quest 1 or Oculus Quest 2 controller from Oculus®; a VIVE Focus 3 controller from HTC®; or a Pico Neo 2 or Pico Neo 3 controller from Pico Interactive®, in some embodiments.
- sensor device 130 may comprise a controller 195 housing the motion sensor 132. Controller 195 may be substantially identical to controller 190, as described in further detail below.
- Human-prosthetic interface sensor 133 may be a sensor configured to be interacted with by a wearer of sensor device 130 in order to initiate one or more actions, movements or articulations of a virtual prosthesis displayed by headset 120 within a virtual environment. Human-prosthetic interface sensor 133 may be configured to sense muscle activation in a limb or partial limb of a user. According to some embodiments, human-prosthetic interface sensor 133 may comprise a surface electromyography sensor, or other sensor that generates data in response to contracting and/or relaxation of one or more muscles of a wearer of sensor device 130 that are in contact with sensor device 130, proximate to sensor device 130, or which sensor device 130 is otherwise configured to monitor.
- human-prosthetic interface sensor 133 may be a pattern recognition based sensor, that generates data based on one or more combinations of actions performed by a wearer of sensor device 130.
- human-prosthetic interface sensor 133 may comprise one or more commercial pattern-based surface electromyography sensors, such as one or more of the Ottobock MyoPlus, COAPT Complete Control, and i-Biomed Sense.
- human-prosthetic interface sensor 133 may be a button based sensor, comprising one or more buttons activatable by a wearer of sensor device 130 by contracting and/or relaxing one or more of the muscles in contact with sensor device 130.
- human-prosthetic interface sensor 133 may comprise at least one of an inertial measurement unit (IMU), force sensitive resistor, cable driven interface, or other brain-machine or human-machine interface.
- human-prosthetic interface sensor 133 may generate data which can be used to determine which muscle or combination of muscles a user activated by way of contraction or relaxation.
- human-prosthetic interface sensor 133 may generate data which can be used to determine the degree to which one or more muscles of the user were contracted or relaxed.
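- The following Python sketch illustrates, under assumed thresholds and an assumed per-channel normalisation, how raw muscle activation readings might be reduced to a pattern (which muscles were activated) and a power (how strongly). It is an illustrative sketch only and does not describe the behaviour of any particular human-prosthetic interface sensor 133.

```python
import math

ACTIVATION_THRESHOLD = 0.2  # assumed normalised threshold per channel

def emg_to_pattern_and_power(channels: list[float]) -> tuple[set[int], float]:
    """Reduce normalised per-channel activation envelopes (0.0-1.0) to a
    pattern/power pair.

    The pattern is the set of channels whose activation exceeds a
    threshold (which muscles were contracted); the power is the RMS of
    the active channels (how strongly they were contracted).
    """
    pattern = {i for i, level in enumerate(channels) if level >= ACTIVATION_THRESHOLD}
    if not pattern:
        return set(), 0.0
    power = math.sqrt(sum(channels[i] ** 2 for i in pattern) / len(pattern))
    return pattern, power

# Example: channels 0 and 1 strongly contracted, channel 2 at rest.
print(emg_to_pattern_and_power([0.8, 0.6, 0.05]))
```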
- sensor device 130 may comprise one or more additional user interface mechanisms (not shown), such as buttons, sliders, lights or motors for providing haptic feedback, for example.
- system 100 optionally further comprises a controller 190, which may be a motion-tracked controller configured to be held by a user by an able-bodied limb.
- Controller 190 comprises at least one communications module 191 and at least one motion sensor 192.
- Communications module 191 may be configured to facilitate communication between controller 190 and one or more external computing devices or systems. Communications module 191 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel. According to some embodiments, communications module 191 may facilitate communication between controller 190 and headset 120. According to some embodiments, communications module 191 may facilitate communication between controller 190 and computing device 110. For example, communications module 191 may facilitate communication of sensor data from controller 190 to headset 120. In some embodiments, communications module 191 may additionally or alternatively facilitate communication of data from headset 120 to controller 190. According to some embodiments, communications module 191 may facilitate direct communication between controller 190 and other components of system 100 via one or more wired or wireless communications protocols. For example, communications module 191 may facilitate communication between controller 190 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols.
- Motion sensor 192 of controller 190 may comprise an inertial measurement unit (IMU), or other device that measures at least one of the specific force, angular rate, and orientation of controller 190.
- motion sensor 192 may comprise one or more of an accelerometer, gyroscope, and magnetometer.
- Motion sensor 192 may provide data relating to the movement of controller 190 through physical space.
- controller 190 may comprise one or more additional user interface mechanisms (not shown), such as buttons, sliders, lights or motors for providing haptic feedback, for example.
- controller 190 may be an off-the-shelf or commercially available virtual reality controller.
- controller 190 may be an Android-based VR controller, such as an Oculus Quest 1 or Oculus Quest 2 controller from Oculus®; a VIVE Focus 3 controller from HTC®; or a Pico Neo 2 or Pico Neo 3 controller from Pico Interactive®, in some embodiments.
- System 100 may further comprise a rehabilitation management platform 150 in communication with computing device 110 via network 140.
- Rehabilitation management platform 150 may be configured to host data and/or program code for supporting the rehabilitation services provided by computing device 110.
- rehabilitation management platform 150 may host a data analytics platform for processing data received from computing device 110 and calculating metrics about a user’s performance, as described in further detail below.
- rehabilitation management platform 150 may be a serverless or cloud based platform.
- rehabilitation management platform 150 may be a Firebase® or Google® cloud based platform in some embodiments.
- rehabilitation management platform 150 may comprise one or more computing devices and/or server devices, such as one or more servers, databases, and/or processing devices in communication over a network.
- rehabilitation management platform 150 comprises a processor 151 in communication with a memory 153.
- Processor 151 comprises one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in memory 153.
- Processor 151 may include an arithmetic logic unit (ALU) for mathematical and/or logical execution of instructions, such as operations performed on the data stored in internal registers of processor 151.
- Memory 153 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types.
- memory 153 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
- Memory 153 is configured to store program code 154 accessible by the processor 151.
- Program code 154 may comprise a plurality of executable program code modules executable by processor 151 to cause processor 151 to perform functions as described in further detail below.
- Memory 153 may also comprise data 155 that is accessible to processor 151 for performing read and write functions.
- Rehabilitation management platform 150 may further comprise a communications module 152 configured to facilitate communication between rehabilitation management platform 150 and one or more external computing devices via one or more networks.
- Communications module 152 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel. According to some embodiments, communications module 152 may facilitate communication between rehabilitation management platform 150 and other devices within system 100 via network 140. For example, communications module 152 may facilitate communication between rehabilitation management platform 150 and computing device 110 via network 140.
- Rehabilitation management platform 150 may also be in communication with other computing devices.
- rehabilitation management platform 150 is in communication with user computing device 160 and clinician computing device 170 via network 140.
- User computing device 160 and clinician computing device 170 may each comprise one or more of a laptop, desktop, smart phone, or other computing device.
- user computing device 160 and clinician computing device 170 may each comprise a processor, memory, user I/O and communications module which may be similar to processor 111, memory 114, user I/O 113 and communications module 112 as described above with reference to computing device 110.
- user computing device 160 may be operated by a patient undergoing rehabilitation via system 100, and may be the user’s personal device on which they can set up user profile information, select preferences, and view their progress.
- these functionalities may be accessible via a patient companion application residing in memory of user computing device 160 and executable by a processor of user computing device 160 to be displayed on a user interface of user computing device 160.
- these functionalities may be accessible via a web based application, or other means.
- Clinician computing device 170 may be operated by a clinician facilitating rehabilitation to a patient via system 100, and may be located in a clinic environment in some cases. Clinician computing device 170 may be configured to allow clinicians to enter information and settings for their patient, and to view insights and analytics relating to the patient’s rehabilitation progress. According to some embodiments, these functionalities may be accessible via a dynamic web application hosted on a platform such as the rehabilitation management platform, and accessible via a browser application executable by a processor of clinician computing device 170 to be displayed on a user interface of clinician computing device 170. In some alternative embodiments, these functionalities may be accessible via a native application residing in memory of clinician computing device 170, or other means.
- Figure 3A shows a subsystem 300 of system 100 in position on a user 310 with an upper limb amputation.
- Subsystem 300 comprises headset 120, sensor device 130, and controller 190.
- headset 120 is positioned on a head 312 of user 310 via a strap 320, while a display portion 322 comprising display 123 is configured to be positioned on a face of user 310 in proximity to the eyes of user 310, so that user 310 can view display 123 while headset 120 is being worn.
- sensor device 130 is configured to be worn on an upper limb 314 of a user.
- Limb 314 may be a partially amputated or otherwise disabled limb.
- sensor device 130 may be sized to encircle the limb 314, and to be retained in position via an elastic or friction fit with the girth of limb 314.
- sensor device 130 comprises a controller 195, as described above.
- Controller 190 is configured to be held in a hand of an upper limb 316 of user 310, where limb 316 is an able-bodied limb.
- Figure 3B shows a subsystem 350 of system 100 in position on a user 310 with a lower limb amputation.
- Subsystem 350 comprises headset 120 and sensor device 130.
- headset 120 is positioned on a head 312 of user 310 via a strap 320, while a display portion 322 comprising display 123 is configured to be positioned on a face of user 310 in proximity to the eyes of user 310, so that user 310 can view display 123 while headset 120 is being worn.
- sensor device 130 is configured to be worn on a lower limb 354 of a user.
- Limb 354 may be a partially amputated or otherwise disabled limb.
- sensor device 130 may be sized to encircle the limb 354, and to be retained in position via an elastic or friction fit with the girth of limb 354.
- sensor device 130 comprises a controller 195, as described above.
- Figure 4 shows a schematic diagram of the software components of system 100 that provide a virtual reality platform 400. While some software components of platform 400 may be described as residing on a particular hardware component of system 100, any of the software components may reside on and/or be accessible to any of the hardware elements of system 100.
- virtual reality platform 400 may be configured to provide a virtual reality environment in which a user can train in a number of basic and functional skills required to operate a prosthesis.
- Basic skills may be skills which target rehabilitation and strengthening of the neurology and muscles of the residual limb, and may be used in the operation of the prosthesis.
- Basic skills may include causing the muscles of the patient or user to perform certain actions, which may include movement of a limb and/or muscle activation.
- basic skills may include skills that relate to movement of a residual limb, and muscle activation in the residual limb.
- basic skills may include activating the muscles in a residual arm limb that would cause a hand to make a fist gesture if the limb were not amputated, where that muscle activation can be used to cause a prosthetic hand to close.
- Examples of basic skills may include prosthesis input signal modulation (e.g., EMG signal modulation); prosthesis input pattern activation (e.g., an EMG fist pattern); arm range of motion; prosthetic hand closure modulation; and prosthetic joint rotation modulation (e.g., of the wrist and elbow).
- Functional skills may comprise a combination of basic skills, and target the ability of users to use the basic skills to operate a physical prosthesis for activities of daily living, such as picking up an object with the prosthesis without compensating with their trunk.
- Other functional skills may include pick and place; bi-manual manipulation; single hand manipulation (prosthesis); tool use; utensil use; and use of prosthetic grasp types.
- Functional skills may comprise one or more basic skills performed sequentially and/or simultaneously.
- tool use may comprise basic skills such as movement of an arm, joint rotation, and hand closure, in some cases.
- Virtual reality platform 400 may be configured to provide activities in the form of games or activities which target a set of these basic and/or functional skills.
- games may target skills such as the ability to pick up and place objects; bimanual manipulation of objects; muscle activation and modulation of wearable sensors such as EMG, and other skills involving the use of a prosthesis.
- the activities may be presented in the form of activities or games that require the user to use the skill in order to achieve game objectives.
- a game for developing pick and place abilities may be presented in a barbeque setting (as shown in Figures 19C, 19D and 21C, described in further detail below), where a user must use a virtual prosthesis to pick up and place various virtual foods on a virtual barbeque in order to achieve game objectives and make progress in the game.
- a game for developing bimanual abilities may be an assembly game, where a user must use a virtual prosthesis to assemble various virtual objects into a virtual structure in order to achieve game objectives and make progress in the game.
- the game may be presented in a burger restaurant setting (as shown in Figures 18A and 21B, described in further detail below), where a user must use a virtual prosthesis to pick up and assemble various burger components in order to achieve game objectives and make progress in the game.
- a game for developing EMG modulation abilities may be a painting game, where a user must use a virtual prosthesis to virtually paint on a virtual canvas in order to achieve game objectives and make progress in the game.
- virtual reality platform 400 may be configured to generate assessments of a user’s competency in manipulating a virtual prosthesis. This assessment may be based on a user’s performance of the activities presented by virtual reality platform 400. In some embodiments, the assessment may be based on data relating to the movement and/or muscle activation exhibited by the user during their performance of the activities presented by virtual reality platform 400.
- assessments may be based on comparing user performance data with comparative data.
- the comparative data may be historical performance data collected from previous users, as described in further detail below.
- the assessments conducted by virtual reality platform 400 may be based on known assessments of prosthesis use which may be used by clinicians when assessing a user’s ability to manipulate a physical prosthesis.
- assessments may be based on known techniques such as arm prosthesis race challenges (such as those conducted by Cybathlon®), the Box and Blocks test, and/or the Jebsen-Taylor hand function test, in some embodiments.
- Virtual reality platform 400 may comprise a platform management module 410.
- Platform management module 410 may contain executable code which, when executed, manages the behaviour of the rehabilitation platform provided by system 100.
- Platform management module 410 may manage the high-level interactions and configurations of the platform, based on system and user configuration data 435.
- aspects of system and user configuration data 435 may be generated by a user via a user computing device 160.
- aspects of system and user configuration data 435 may be generated by a clinician via a clinician computing device 170.
- System and user configuration data 435 may include system data relating to the hardware being used, such as data relating to the form and configuration of headset 120, sensor device 130 and/or controller 190.
- System and user configuration data 435 may include data relating to a user of system 100, which may include a user’s name, age, physical characteristics such as height and weight, and details relating to a user’s limb loss or limb difference, such as the side of their body affected and the degree of the amputation or disability.
- system and user configuration data 435 may include data relating to a prosthesis configuration.
- the prosthesis configuration data may relate to a physical prosthesis that a user of system 100 is or will be fitted with, and system 100 may use the prosthesis configuration data to generate a virtual prosthesis having characteristics replicating those of the physical prosthesis.
- the prosthesis configuration data may be dependent on a type of amputation, as well as a clinician’s assessment of the user.
- the prosthesis configuration data may include data relating to one or more components of a prosthesis, which may include a terminal device such as a prosthetic hand, a prosthetic wrist, and/or a prosthetic elbow.
- the virtual prosthesis may include a terminal device, and a wrist.
- a user with a transradial amputation may use a prosthetic that does not include an elbow.
- the terminal device may be a motorised hand with one degree of freedom, and the wrist may be a motorised wrist with one degree of freedom to perform pronation and supination, for example.
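- As a minimal sketch of the kind of prosthesis configuration data described above, the following Python example models a terminal device and wrist, each with one degree of freedom, for a hypothetical transradial fitting. The field and component names are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class ComponentConfig:
    name: str                 # e.g. "motorised_hand", "motorised_wrist"
    degrees_of_freedom: int   # e.g. 1 for open/close or pronation/supination

@dataclass
class ProsthesisConfig:
    """Hypothetical prosthesis configuration record.

    Mirrors the description above: a user with a transradial amputation
    might be configured with a terminal device and a wrist but no elbow.
    """
    side: str                               # "left" or "right"
    terminal_device: ComponentConfig
    wrist: ComponentConfig | None = None
    elbow: ComponentConfig | None = None

transradial_config = ProsthesisConfig(
    side="left",
    terminal_device=ComponentConfig("motorised_hand", degrees_of_freedom=1),
    wrist=ComponentConfig("motorised_wrist", degrees_of_freedom=1),
    elbow=None,  # no prosthetic elbow for a transradial fitting
)
print(transradial_config)
```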
- Platform management module 410 may be configured to process system and user configuration data 435, and to communicate the processed data to one or more further modules.
- platform management module 410 may be configured to derive sensor configuration data 405 from the system and user configuration data 435, and to pass the sensor configuration data 405 to a hardware services module 420.
- hardware services module 420 may be configured to handle communications with hardware such as sensor device 130, such as by using the sensor configuration data 405 to map signals from sensor device 130 to virtual prosthesis commands.
- hardware services module 420 may generate sensor data 445 based on the sensor configuration data 405 and the signals received from sensor device 130 during use of sensor device 130 by a user, and pass this to a prosthesis services module 430.
- prosthesis services module 430 may also pass data to hardware services module 420, which may include haptic feedback data in some cases, which may be generated based on interaction between a virtual prosthesis and other virtual objects presented to a user in a virtual environment displayed via headset 120.
- the haptic feedback data may comprise data relating to a vibration amplitude and frequency to be delivered to the user by the hardware device, which may be sensor device 130 in some embodiments.
- Hardware services module 420 may further generate sensor usage data 465 and pass this to a data services module 450.
- Sensor usage data 465 may comprise data relating to the use of sensor device 130, such as data relating to the sensed movement of sensor device 130 in physical space, and the sensed activation of muscles of a user using sensor device 130.
- platform management module 410 may be configured to derive game configuration data 425 from the system and user configuration data 435, and to pass the game configuration data 425 to a game services module 440.
- Game configuration data 425 may comprise data relating to the presentation of games by system 100, such as the types of games and difficulty level of games to present, for example.
- game services module 440 may be configured to manage virtual environments to be presented by headset 120, including handling activity and game flow within the virtual environments, handling behaviour of the virtual environments, and handling game asset behaviour.
- game services module 440 may receive data relating to a state of a corresponding virtual prosthesis from prosthesis services module 430, and generate data relating to user activity within the virtual reality environment, such as interaction of the virtual prosthesis with virtual objects. Game services module 440 may use this along with the game configuration data 425 received from platform management module 410 to generate interaction event data 455. Interaction event data 455 may include configuration data related to the virtual prosthesis, where such configuration is dependent on game logic. Interaction event data 455 may then be passed to prosthetic services module 430. Game services module 440 may also generate activity and interaction event data 485, and pass this to data services module 450. Activity and interaction event data 485 may include data relating to a user’s activity within the virtual environment, and data relating to interactions between the virtual prosthesis and any virtual objects.
- Platform management module 410 may further be configured to derive prosthesis configuration data 415 from the system and user configuration data 435, and to pass the prosthesis configuration data 415 to the prosthesis services module 430.
- prosthesis services module 430 may use the sensor data 445, prosthesis configuration data 415, and interaction event data 455 to simulate prosthesis function within the virtual environment provided by system 100 by providing movement and activation of the virtual prosthesis based on movement and activation sensed by sensor device 130.
- prosthesis services module 430 may use the sensor data 445, prosthesis configuration data 415, and interaction event data 455 to map sensor data 445 to virtual prosthesis function, which may then be presented to a user within a virtual environment via headset 120.
- Prosthesis usage data 475 may be generated by prosthesis services module 430 and passed to data services module 450.
- Prosthesis usage data 475 may include data relating to the manner in which the virtual prosthesis was used, including the movement of the virtual prosthesis in space and in relation to any virtual objects.
- Data services module 450 may be configured to receive raw data from the virtual environment presented by system 100, and to pass this to rehabilitation management platform 150 for analysis. According to some embodiments, data services module 450 may buffer data gathered during an activity presented via virtual reality platform 400, and may then synchronise the data with rehabilitation management platform 150 once the activity is completed, or periodically. In some alternative embodiments, the data may be sent to rehabilitation management platform 150 in real time.
- the raw data may include sensor usage data 465, prosthesis usage data 475, and activity and interaction event data 485, for example. This data may be captured through an event based system in some embodiments, as described below with reference to Figure 7.
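- The following Python sketch illustrates one way such buffering and synchronisation might be arranged: records are buffered locally during an activity and flushed either periodically or when the activity completes. The upload callable and sync interval are assumptions standing in for the transfer to rehabilitation management platform 150.

```python
import time
from typing import Callable

class ActivityDataBuffer:
    """Buffer raw activity data locally and flush it to an upload
    function once the activity completes or a sync interval elapses.

    The upload callable stands in for the transfer to the rehabilitation
    management platform; its signature here is an assumption.
    """

    def __init__(self, upload: Callable[[list[dict]], None],
                 sync_interval_s: float = 30.0):
        self._upload = upload
        self._buffer: list[dict] = []
        self._interval = sync_interval_s
        self._last_sync = time.monotonic()

    def log(self, record: dict) -> None:
        self._buffer.append(record)
        if time.monotonic() - self._last_sync >= self._interval:
            self.flush()  # periodic synchronisation

    def flush(self) -> None:
        """Called periodically and when the activity is completed."""
        if self._buffer:
            self._upload(self._buffer)
            self._buffer = []
        self._last_sync = time.monotonic()

buffer = ActivityDataBuffer(upload=lambda records: print(f"synced {len(records)} records"))
buffer.log({"event": "grasp", "t": 1.25})
buffer.flush()  # activity completed
```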
- Figure 5 shows a sub-system 500 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as hardware services module 420, in further detail.
- Sub-system 500 includes sensor device 130 including human-prosthetic interface sensor 133.
- As a wearer interacts with sensor 133, such as by engaging muscles in the limb on which sensor device 130 is being worn and/or by moving that limb through physical space, sensor 133 generates sensor data, which may include pattern and power command data 510.
- Pattern and power command data 510 is communicated to virtual reality platform 400. This may be by way of communications module 131 sending pattern and power command data 510 to communications module 122 of headset 120, for example. Pattern and power command data 510 is received by hardware services module 420.
- Hardware services module 420 may comprise a hardware communication module 520 and a sensor manager module 530. Pattern and power command data 510 is received by hardware communication module 520, which handles communication with sensor device 130. According to some embodiments, hardware services module 420 may comprise a native communications module which may be custom configured for each sensor device 130. Hardware communication module 520 processes the received pattern and power command data 510 based on sensor configuration data 550 received from sensor manager module 530, to generate processed pattern and power command data 540, which is passed back to sensor manager module 530. According to some embodiments, the processing may comprise converting the data from a format received from sensor device 130 into a format usable by virtual reality platform 400.
- Sensor manager module 530 receives pattern and power command data 540, and also receives sensor configuration data 405 from platform management module 410. Sensor manager module 530 uses the sensor configuration data 405 to generate sensor configuration data 550, which is passed to hardware communication module 520. This may be done during a configuration process, which may only occur during a setup procedure. Sensor manager module 530 further uses the sensor configuration data 405 to process the received pattern and power commands 540 to generate sensor data 445, which is passed to prosthesis services module 430. This may be done during use of sensor device 130. Sensor manager module 530 may generate sensor data 445 by wrapping the pattern and power command data 540, and mapping this data to particular prosthetic functions. According to some embodiments, this may be done with reference to a sensor command dictionary, which may be specific to each sensor device 130.
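- As an illustration of the wrapping and dictionary lookup described above, the following Python sketch translates device-specific pattern identifiers into platform-level prosthetic functions using a per-device command dictionary. The device model names, pattern identifiers and dictionary contents are assumptions for the example.

```python
# Hypothetical per-device command dictionaries: each sensor device model
# may report patterns under its own identifiers, which the sensor manager
# translates into platform-level prosthetic functions.
SENSOR_COMMAND_DICTIONARIES = {
    "device_model_a": {"P1": "hand_close", "P2": "hand_open"},
    "device_model_b": {"0x01": "hand_close", "0x02": "wrist_rotate"},
}

def wrap_pattern_power(device_model: str, pattern_id: str,
                       power: float) -> dict | None:
    """Wrap a device-specific pattern/power command into sensor data
    usable by the prosthesis services module."""
    dictionary = SENSOR_COMMAND_DICTIONARIES.get(device_model, {})
    function = dictionary.get(pattern_id)
    if function is None:
        return None  # pattern not defined for this device
    return {"function": function, "power": power, "source": device_model}

print(wrap_pattern_power("device_model_a", "P1", 0.5))
```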
- Prosthetic services module 430 receives the sensor data 445, along with prosthesis configuration data 415.
- Prosthetic services module 430 may be configured to facilitate the simulation of different types of prostheses, and to facilitate the dynamic mapping of sensor functions as detected by sensor device 130 to virtual prosthesis functions as displayed to the user via headset 120.
- Prosthetic services module 430 may comprise a prosthetic manager module 560, which may manage the state of the virtual prosthesis, for example. Based on sensor data 445 and prosthesis configuration data 415, prosthetic manager module 560 may manage the virtual position and motion of the virtual prosthesis as displayed to the user via headset 120.
- Prosthetic services module 430 and prosthetic manager module 560 are further illustrated in Figure 6.
- Figure 6 shows a sub-system 600 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as prosthetic services module 430, in further detail.
- Subsystem 600 includes hardware services module 420, comprising sensor manager module 530, as described above.
- sensor manager module 530 generates sensor data 445, which may comprise sensor usage data 605 and sensor state data 615, in some embodiments.
- sensor state data 615 is passed to prosthesis manager module 560 of prosthesis services module 430, while sensor usage data 605 is passed to an event logger module 640 of data services module 450.
- Prosthesis manager module 560 receives sensor state data 615, which may comprise information about the state of one or more human-prosthetic interface sensors 133.
- Prosthesis manager module 560 derives pattern and power data 645 from the sensor state data 615, and uses this to look up prosthesis function data 655 based on a human-prosthetic interface (HPI) map database 630.
- HPI map database 630 may be specific to a selected virtual prosthesis, and may define the relationships between sensor data generated by sensor 133 and the prosthesis function that should result from that sensor data. For example, Table 1 as shown below illustrates an example mapping between particular patterns or combinations of sensors 133 and corresponding prosthesis actions.
- Table 1: Example mapping of sensor patterns to prosthesis actions
- the table shown above is an example only, and the mapping data stored in HPI map database 630 will be dependent on the quantity and type of sensors 133 used, the user’s capabilities and limitations, the user’s goals, and a clinician’s assessment of the user.
- the HPI map database 630 may be constructed based on data received from a clinician, which the clinician may enter via clinician computing device 170.
- HPI map database 630 may be constructed automatically by system 100.
- the automated construction of HPI map database 630 may be facilitated by a machine learning algorithm, which may present recommendations for HPI mapping based on user goals, as well as assessments and insights of user behaviour via system 100.
- When a user activates the predetermined combination of sensors on sensor device 130 by engaging one or more particular muscles, that sensor data 445 is received by prosthesis manager module 560 and converted to a virtual prosthesis action by referencing HPI map database 630 to derive the corresponding prosthesis function data 655.
- Once prosthesis manager module 560 retrieves the prosthesis function data 655 relating to the pattern and power data 645, it may process the prosthesis function data 655 to derive at least one of joint state data 625 and terminal state data 665.
- joint state data 625 is derived to determine the state of the virtual joint required to perform the virtual prosthesis function defined by prosthesis function data 655.
- joint state data 625 may define an angle which the virtual joint should adopt, in some embodiments.
- Joint state data 625 is communicated to prosthesis joint module 610, which may be configured to handle the behaviour of the virtual prosthetic joint within the virtual environment presented by system 100.
- Prosthesis joint module 610 may pass joint usage data 635 to event logger module 640.
- terminal state data 665 is derived to determine the state of the virtual terminal device required to perform the virtual prosthesis function defined by prosthesis function data 655.
- terminal state data 665 may define a position which the virtual terminal device should adopt, which may include an angle of rotation of the virtual terminal device and/or the state of a virtual grasping mechanism, for example.
- Terminal state data 665 is communicated to terminal device module 620, which may be configured to handle the behaviour of the virtual terminal device, such as the animation of a virtual prosthetic hand, within the virtual environment presented by system 100.
- Terminal device module 620 may pass terminal usage data 675 to event logger module 640.
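- The following Python sketch illustrates, with assumed functions and ranges, how a prosthesis function and power value might be decomposed into joint state data (an angle for a virtual joint) and terminal state data (a closure and rotation for a virtual terminal device). It is an illustrative sketch only.

```python
from dataclasses import dataclass

@dataclass
class JointState:
    angle_deg: float        # angle the virtual joint should adopt

@dataclass
class TerminalState:
    closure: float          # 0.0 fully open, 1.0 fully closed
    rotation_deg: float = 0.0

def derive_states(function: str, power: float) -> tuple[JointState | None, TerminalState | None]:
    """Derive joint and terminal device states for a prosthesis function.

    The set of functions handled and the numeric ranges are assumptions
    for illustration only.
    """
    if function == "hand_close":
        return None, TerminalState(closure=power)
    if function == "hand_open":
        return None, TerminalState(closure=1.0 - power)
    if function == "wrist_rotate":
        # power modulates pronation/supination within +/- 90 degrees
        return JointState(angle_deg=(power * 180.0) - 90.0), None
    return None, None

print(derive_states("wrist_rotate", 0.75))
```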
- Event logger module 640 may receive sensor data 445, which may include sensor usage data 605, terminal usage data 675 and joint usage data 635. Event logger module 640 may use the received data to handle event-driven data logging capabilities within the virtual environment presented by system 100, such as during activities or games presented to a user via headset 120.
- Figure 7 shows a sub-system 700 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as data services module 450, in further detail.
- Subsystem 700 includes data services module 450 and event logger module 640 as described above.
- Data services module 450 may be configured to handle data gathering from the virtual environment presented by system 100, and from the user of system 100. For example, as described above with reference to Figure 6, data services module 450 may receive data from other modules of system 100, which may include sensor data 445. Data services module 450 may further be configured to perform database management and syncing of data across databases.
- Data services module 450 may comprise an event logger module 640, as described above. Data services module 450 may further comprise one or more of an event system 730, triggered game object module 740, and tracked game object module 750.
- Triggered game object module 740 may relate to one or more in-game objects that are configured to trigger measurement events.
- the measurement event may result in one or more parameters being measured.
- the in-game objects may include objects that a user can interact with by operating a virtual prosthetic within the game. Where such an in-game object is picked up, put down, moved, touched, knocked, or otherwise interacted with, this may trigger a measurement event.
- the measurement event may result in measurement and logging of one or more parameters, such as a user’s pose, or a position of the virtual prosthesis.
- the triggering of a measurement event may cause triggered game object module 740 to send event data 715 corresponding to the triggered event to an event system 730.
- Event system 730 may be a Unity event system running on the Unity game engine by Unity Technologies®, for example.
- Event data 715 may comprise event invoke data, indicating that a triggering event was invoked, as well as event data relating to the type of event that was triggered.
- event system 730 may be configured to invoke a function call to event logger module 640, to cause event logger module 640 to store the event data 715 within an event log 720, as described in further detail below.
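- A minimal Python sketch of this event-driven flow is shown below: an in-game object triggers a measurement event, an event system invokes subscribed handlers, and an event logger stores the event. The class and method names are assumptions; in particular, this is a generic publish/subscribe stand-in rather than the Unity event system itself.

```python
from typing import Callable

class EventSystem:
    """Minimal publish/subscribe stand-in for the event system."""
    def __init__(self):
        self._handlers: list[Callable[[dict], None]] = []

    def subscribe(self, handler: Callable[[dict], None]) -> None:
        self._handlers.append(handler)

    def invoke(self, event: dict) -> None:
        for handler in self._handlers:
            handler(event)

class EventLogger:
    """Stores invoked events in an in-memory event log."""
    def __init__(self):
        self.event_log: list[dict] = []

    def store(self, event: dict) -> None:
        self.event_log.append(event)

class TriggeredGameObject:
    """In-game object that triggers a measurement event when interacted with."""
    def __init__(self, object_id: str, events: EventSystem):
        self.object_id = object_id
        self._events = events

    def on_interaction(self, interaction_type: str) -> None:
        self._events.invoke({"object_id": self.object_id,
                             "event_type": interaction_type})

events = EventSystem()
logger = EventLogger()
events.subscribe(logger.store)
TriggeredGameObject("virtual_sausage", events).on_interaction("picked_up")
print(logger.event_log)
```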
- Tracked game object module 750 may relate to one or more virtual in-game objects that are tracked within the virtual environment presented by system 100 for data logging purposes.
- the virtual objects that are tracked may include one or more of a user avatar’s head, hand, or prosthesis, for example.
- the movement of the tracked virtual objects may be based on real-world movement by the user that is translated into sensor data by one or more of headset 120, sensor device 130 and controller 190.
- Event logger module 640 may be configured to get object data from tracked game object module 750, and tracked game object module 750 may be caused to pass object data 725 to event logger module 640 based on a request for such data, periodically, or in real-time as such data is generated.
- data gathering by event logger module 640 may be done through an event-driven system.
- Data capture or measurement may be triggered by an in-game event, such as a user grasping or dropping a virtual object using a virtual prosthesis worn by their avatar; activating the virtual prosthesis; hitting a target; or finishing a game session, for example.
- Other events may include actions by the user, prosthetic function event, in-game logic, in-game actions, or other events.
- Event logger module 640 may be configured to store the received event data in one or more databases.
- event logger module may be configured to store event data in an event log 720.
- each event logger module 640 may have access to a single event log 720 for storing event data.
- Each entry in event log 720 may include data such as an event log type, the date and time of the event or activity, and event data 710 relating to the event.
- Examples of event log types are described in Table 2, below.
- Table 2: Example event log types stored in event log 720
- each entry in event log 720 may be stored with associated event data 710.
- the event data 710 is captured and structured by event logger module 640 based on an event type, as described below with reference to Table 3. As indicated in the “Value” column of Table 3, different events may result in different types of data being captured.
- the captured information may be structured in JavaScript Object Notation (JSON) in some embodiments.
- the structured data may be stored in a list within event data 710.
- Event data 710 may comprise one or more of a unique event identifier, a timestamp corresponding to when the event took place, an event type, and one or more values corresponding to the event. Examples of event types and their corresponding values are listed in Table 3, below.
- Table 3: Example event types stored in event data 710
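- The following Python sketch shows one possible JSON structure for an event data entry comprising a unique identifier, a timestamp, an event type and a value. The field names and the example grasp event are assumptions consistent with the description above.

```python
import json
import uuid
from datetime import datetime, timezone

def make_event_entry(event_type: str, value: dict) -> str:
    """Build a JSON-structured event entry with a unique identifier,
    a timestamp, an event type, and one or more values."""
    entry = {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "type": event_type,
        "value": value,
    }
    return json.dumps(entry)

# Example: a grasp event recording which object was grasped and the
# position of the virtual prosthesis at that moment.
print(make_event_entry("object_grasped",
                       {"object_id": "burger_bun",
                        "prosthesis_position": [0.2, 1.1, 0.4]}))
```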
- event data 710 may be retrieved at the end of a game or session and transferred to a server or cloud.
- event log data 735 retrieved from event data 710 by event logger module 640 may be passed to game manager module 795 of game services module 440 for further processing, and to generate corresponding game data 745.
- Game manager module 795 may be configured to handle the state and flow of games presented to a user via system 100. For example, game manager module 795 may be configured to determine when to present a user with a tutorial, to manage the user’s level and move them through the levels when appropriate, and to manage interaction with data services module 450 and platform management module 410.
- New game data 745 generated by game manager module 795 may be passed to an API manager module 760, which may form part of rehabilitation management platform 150 in some embodiments.
- API manager module 760 may be configured to handle interactions between system 100 and an API used by a real-time database platform such as Firebase®, which may form rehabilitation management platform 150 in some embodiments.
- the real-time database platform may be configured to handle both offline or local data synchronisation and online or cloud based synchronisation automatically.
- API manager module 760 may receive system and user configuration data 435 from platform management module 410, and may store this data in a data store 770 of rehabilitation management platform 150, which may reside in memory 153 in some embodiments.
- data store 770 may handle persistent database management, data logging and transfer.
- API manager module 760 may also communicate with one or more of an analytics module 780 and an authentication module 790.
- Analytics module 780 may be configured to handle behavioural data collection for analytics purposes, which may include data relating to a user’s actions and performance within a virtual environment presented by system 100.
- Authentication module 790 may handle user authentication, which may include authentication of direct users of system 100 as well as clinicians monitoring patient use of system 100.
- The data structures used by data store 770 are described in further detail below with reference to Figure 9.
- Figure 8 shows a sub-system 800 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as game services module 440, in further detail.
- Subsystem 800 includes platform management module 410 and game services module 440 having a game manager module 795, as described above.
- Game services module 440 further comprises one or more of a tutorial manager module 810, score manager module 820 and level manager module 830 in communication with game manager module 795.
- tutorial manager module 810 may be configured to facilitate the presentation of in-game tutorials to a user.
- tutorial manager module 810 may handle the logic of individual game tutorials, according to some embodiments.
- Score manager 820 may be configured to manage one or more game scores accrued by a user during play of one or more games. Managing a score may include determining when a score should be incremented or decremented, and an amount by which to increment or decrement the score, which may be based on one or more events that occur during game play.
- a user may have multiple separately stored scores across multiple games within system 100.
- a user may have a single score across a plurality of games they access via system 100.
- Level manager module 830 may be configured to manage one or more levels of a user within system 100. For example, level manager module 830 may be configured to handle the logic of specific levels of a game, such as determining when a user’s level should be incremented, and what settings need to be adjusted as a user changes levels. According to some embodiments, a user may have multiple separately stored levels across multiple games within system 100. In some embodiments, a user may have a single level across a plurality of games they access via system 100.
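- By way of illustration, the following Python sketch shows one possible way a score manager and level manager might respond to in-game events. The point values, level threshold and adjusted settings are assumptions for the example only.

```python
class ScoreManager:
    """Decides when and by how much to change a user's score,
    based on in-game events (point values here are assumptions)."""
    EVENT_POINTS = {"object_placed": 10, "object_dropped": -5, "target_hit": 20}

    def __init__(self):
        self.score = 0

    def on_event(self, event_type: str) -> None:
        self.score = max(0, self.score + self.EVENT_POINTS.get(event_type, 0))

class LevelManager:
    """Increments the user's level once their score passes a threshold
    and reports which settings should change (illustrative only)."""
    def __init__(self, points_per_level: int = 100):
        self.level = 1
        self._points_per_level = points_per_level

    def update(self, score: int) -> dict:
        self.level = 1 + score // self._points_per_level
        # e.g. higher levels could shorten the available time limit
        return {"level": self.level, "time_limit_s": max(30, 120 - 10 * self.level)}

scores, levels = ScoreManager(), LevelManager()
for event in ["object_placed", "target_hit", "object_placed"]:
    scores.on_event(event)
print(scores.score, levels.update(scores.score))
```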
- Figure 9 shows a sub-system 900, illustrating aspects of rehabilitation management platform 150 in further detail.
- sub-system 900 shows the data analytics components of rehabilitation management platform 150, which may be configured to handle data storage, data processing, data analysis and insights based on data generated by system 100 and particularly virtual reality platform 400.
- Subsystem 900 includes the virtual reality platform 400 in communication with data store 770, analytics module 780 and authentication module 790 of rehabilitation management platform 150, as described above with reference to Figure 7.
- Subsystem 900 also shows further components of rehabilitation management platform 150, such as real-time engine 910.
- Real-time engine 910 may be a real-time game engine, such as the Unity game engine by Unity Technologies®, for example.
- Real-time engine 910 may be in communication with data store 770, analytics module 780 and authentication module 790.
- Real-time engine 910 may further be in communication with at least one of a user authentication platform 960, and a real-time database 920.
- User authentication platform 960 may be a cloud based platform in some embodiments. According to some embodiments, user authentication platform 960 may be an identity and access management (IAM) platform. User authentication platform 960 may allow for a user using a user companion application 965 to log in to virtual reality platform 400. According to some embodiments, user companion application 965 may reside in memory on user computing device 160, and may allow a user of user computing device 160 to set user preferences and access analytics generated by virtual reality platform 400.
- Real-time engine 910 may be configured to send data for storage in real-time database 920. Real-time database 920 may be a NoSQL real-time database in some embodiments, and may be a Firestore® database, for example.
- real-time database 920 may be stored on at least one of headset 120 and rehabilitation management platform 150. According to some embodiments, real-time database 920 may be mirrored so as to reside on both of headset 120 and rehabilitation management platform 150. Synchronisation of real-time database 920 may be handled automatically via a cloud hosting service, such as the Firebase® service, for example, and may include online and offline synchronisation methods.
- Real-time database 920 may be a database configured for storing activity raw data generated during use of virtual reality platform 400; user information; and hardware configurations, such as configurations relating to headset 120, sensor device 130 and controller 190.
- Real-time database 920 may receive user data from user companion application 965, in some embodiments.
- the data structures stored in real-time database 920 are described in further detail below with reference to Figure 10.
- Real-time database 920 may be accessible to a raw data processing module 930.
- raw data processing module 930 may be configured to retrieve activity event log data stored in real-time database 920, and may be configured to clear any activity event log data from real-time database 920 once processed.
- Raw data processing module 930 may be configured to perform containerised data processing functions to convert raw activity data received from real-time database 920 into activity and pose logs.
- Raw data processing module 930 is described in further detail below with reference to Figure 11.
- Raw data processing module 930 may pass the processed data to big data database 940 for storage.
- the data may be divided and stored separately depending on the user type, which may include able-bodied, amputee, and able-bodied with emulated amputation, according to some embodiments. In some embodiments, this data may be stored together, but tagged to allow for identification of the user type.
- Big data database 940 may be a tabular data warehouse in some embodiments. Big data database 940 may be structured by datasets, and used to store processed performance and motor behaviour data. According to some embodiments, the datasets may be divided by user type. Each dataset may be configured to collect data for a particular type of user and for a particular activity.
- Each game or activity presented by virtual reality platform 400 may have its own table in each dataset within big data database 940.
- the tables may be configured to store all processed results and raw logs for all players in each group of users identified by user type.
- the data structures stored in big data database 940 are described in further detail below with reference to Figure 12.
- the data stored in big data dataset 940 may be aggregated to be used for training of machine learning algorithms, in some embodiments. In some embodiments, the data stored in big data dataset 940 may be aggregated to be used to derive target performance metrics. According to some embodiments, the data may additionally or alternatively be used to generate insights for each individual user of virtual reality platform 400.
- a clinician may be able to access insights related to individual users via a clinician web application 980, which may reside in memory on clinician computing device 170.
- Clinician web application 980 may be configured to query patient data stored in big data database 940.
- Clinicians may be able to log in to clinician web application 980 via a clinician authentication platform 985 accessible via clinician web application 980.
- Clinician authentication platform 985 may be a cloud based authentication platform in some embodiments.
- clinician authentication platform 985 may be an identity and access management (IAM) platform.
- Big-data database 940 may also be accessible by a performance calculation module 950.
- Performance calculation module 950 may be configured to calculate the performance of one or more users of virtual reality platform 400.
- performance calculation module 950 may receive raw log table data from big data database 940, and process the data to derive activity performance tables. The activity performance table data may then be stored in big data database 940.
- performance calculation module 950 may also derive progress metrics and/or performance metrics, which may be provided to real-time database 920 for storage. Performance calculation module 950 is described in further detail below with reference to Figure 13.
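- The following Python sketch illustrates, with assumed metric definitions, how raw activity log rows might be processed into simple activity performance metrics of the kind described above. It is not a definitive implementation of performance calculation module 950.

```python
from statistics import mean

def calculate_performance(activity_rows: list[dict]) -> dict:
    """Derive simple performance metrics from raw activity log rows.

    Each row is assumed to carry an event type and a timestamp in
    seconds; the metrics chosen here are illustrative only.
    """
    grasps = [r for r in activity_rows if r["event_type"] == "object_grasped"]
    drops = [r for r in activity_rows if r["event_type"] == "object_dropped"]
    placements = [r for r in activity_rows if r["event_type"] == "object_placed"]
    times = sorted(r["t"] for r in activity_rows)
    duration = (times[-1] - times[0]) if len(times) > 1 else 0.0
    return {
        "grasp_success_rate": len(placements) / len(grasps) if grasps else 0.0,
        "drops": len(drops),
        "mean_time_between_events_s": mean(
            b - a for a, b in zip(times, times[1:])) if len(times) > 1 else 0.0,
        "session_duration_s": duration,
    }

rows = [
    {"event_type": "object_grasped", "t": 0.0},
    {"event_type": "object_placed", "t": 4.2},
    {"event_type": "object_grasped", "t": 6.0},
    {"event_type": "object_dropped", "t": 7.5},
]
print(calculate_performance(rows))
```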
- Real-time database 920 may be configured to store collections of data related to virtual reality platform 400.
- real-time database 920 is configured to store three collections of data, being headset data 1010, activity data 1020 and meta data 1030.
- Headset data 1010 may be configured to store data relating to headset configurations of headset 120, and may store a dataset for each individual headset 120 within system 100.
- the stored data may include an assigned clinic of headset 120, a unique identifier of a user of headset 120, and one or more identifiers of sensors contained within headset 120.
- Headset data 1010 may also contain raw activity data 1012, which may be stored separately for each game and/or activity provided by virtual reality platform 400, and may comprise data stored in event log 720 and event data 710, as described in further detail above with reference to Figure 7.
- writing raw data to raw activity data 1012 may trigger a function to cause raw data processing module 930 to process and subsequently clear the raw data stored.
- data stored in headset data 1010 may be anonymised.
- Activity data 1020 may be configured to store processed insights and data used by front-end services of system 100, such as user companion application 965 and clinician web application 980. Activity data 1020 may be stored separately for each user of system 100, and may include progress data 1022 and usage data 1026 for each user. Progress data 1022 may store goal data 1024 relating to a user’s progress to their predefined goals or skills. Each entry in goal data 1024 may include an overall progress score or metric, and/or individual progress scores or metrics for each skill or task that a user has attempted to perform using virtual reality platform 400. Usage data 1026 may store game session data 1028 and assessment data 1029. Game session data 1028 may relate to game sessions that a user has participated in via virtual reality platform 400.
- Each entry in game session data 1028 may include one or more of a name of a game played during the gaming session; a date and time of the gaming session; a score achieved by the player during the gaming session; and game specific metrics that may be particular to the game played.
- data stored in activity data 1020 may be anonymised.
- Meta data 1030 may store user configuration data 1032, which may be stored separately for each user of system 100.
- User configuration data 1032 may include data relating to a user’s personal identifying information, their amputation diagnosis and prosthesis prescription data, and/or the sensor and prosthesis configurations to be used by the user.
- each entry in user configuration data 1032 may include one or more of a unique identifier; a user name; a user’s gender; a user’s year or date of birth; amputation details such as the location of an amputation or limb difference; goal configuration data related to a user’s goals; prosthesis configuration data relating to a type of physical prosthesis fitted to the user and/or a type of virtual prosthesis that the user’s avatar wears in the virtual environment presented by virtual reality platform 400; sensor configuration data relating to the sensors located on sensor device 130; and service configuration data.
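- As a sketch of how the three collections described above might be shaped, the following Python example lays out hypothetical headset data, activity data and meta data records as nested dictionaries. The keys and example values are assumptions for illustration only.

```python
# Hypothetical shape of the three real-time database collections
# described above; keys and values are illustrative only.
realtime_database = {
    "headset_data": {
        "headset_001": {
            "assigned_clinic": "clinic_a",
            "user_id": "user_123",
            "sensor_ids": ["imu_0", "emg_0"],
            "raw_activity_data": {"barbeque_game": []},  # raw event log entries
        },
    },
    "activity_data": {
        "user_123": {
            "progress": {"goals": {"hand_closure_modulation": {"progress": 0.6}}},
            "usage": {
                "game_sessions": [
                    {"game": "barbeque_game", "datetime": "2022-11-30T10:15:00Z",
                     "score": 240, "game_metrics": {"items_cooked": 8}},
                ],
                "assessments": [],
            },
        },
    },
    "meta_data": {
        "user_123": {
            "user_configuration": {
                "year_of_birth": 1990,
                "amputation": {"side": "left", "level": "transradial"},
                "prosthesis_configuration": {"terminal_device": "motorised_hand"},
                "sensor_configuration": {"channels": 2},
            },
        },
    },
}
print(list(realtime_database))
```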
- FIG 11 shows raw data processing module 930 in further detail.
- Raw data processing module 930 may be configured to process raw event data into four logs, being an activity log 1242, sensor log 1244, prosthesis log 1246 and interaction log 1248, and to store the data in big data database 940, as illustrated and described in further detail below with reference to Figure 12.
- each activity and game may have its own data processing method performed by raw data processing module 930.
- When raw data stored in real-time database 920 is updated for a given activity, intervention or game based on data received from real-time engine 910, that data is passed to function call module 1110 of raw data processing module 930.
- Function call module 1110 determines the activity that the data relates to, and calls the appropriate function.
- Data processing module 1120 receives the data and executes the called function to process the data into the appropriate log.
- the processed data is received by data transfer module 1130 and written to big data database 940 into the appropriate logs, as described in further detail below with reference to Figure 12.
- Data transfer module 1130 also causes the raw data to be cleared from real-time database 920.
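- The following Python sketch illustrates the dispatch pattern described above under assumed function names: the activity is identified, a per-activity processing function converts raw rows into logs, the logs are written and the raw data is cleared. The write and clear operations are represented by placeholder callables.

```python
def process_barbeque_game(raw_rows: list[dict]) -> dict[str, list[dict]]:
    """Hypothetical per-activity processing function: split raw event
    rows into the four logs described above (activity, sensor,
    prosthesis, interaction)."""
    logs = {"activity_log": [], "sensor_log": [],
            "prosthesis_log": [], "interaction_log": []}
    for row in raw_rows:
        target = {"sensor": "sensor_log", "prosthesis": "prosthesis_log",
                  "interaction": "interaction_log"}.get(row.get("category"), "activity_log")
        logs[target].append(row)
    return logs

# Per-activity dispatch table used by the function call step.
ACTIVITY_PROCESSORS = {"barbeque_game": process_barbeque_game}

def process_raw_activity_data(activity: str, raw_rows: list[dict],
                              write_logs, clear_raw) -> None:
    """Determine the activity, call its processing function, write the
    resulting logs and clear the raw data (write_logs and clear_raw
    stand in for the database operations)."""
    processor = ACTIVITY_PROCESSORS[activity]
    write_logs(processor(raw_rows))
    clear_raw(activity)

process_raw_activity_data(
    "barbeque_game",
    [{"category": "interaction", "event_type": "object_grasped"}],
    write_logs=print,
    clear_raw=lambda a: print(f"cleared raw data for {a}"),
)
```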
- Big data database 940 may be configured to store raw event logs of data received from raw data processing module 930, as well as processed game data.
- processed game data may be stored in a de-identified form.
- processed game data may be stored in an aggregated form.
- processed game data may be organised in datasets and tables.
- Datasets stored in big data database 940 may represent three different user groups: able-bodied, amputees, able-bodied with emulated prosthesis. Data from these user groups may be stored in separate datasets, such as able-bodied datasets 1210, amputee datasets 1220, and emulated amputee datasets 1230, for example. Tables within each dataset may store data relating to the games and activities presented by virtual reality platform 400, with each game and activity having data stored in both raw log tables 1240 and performance log 1250. Log tables 1240 may comprise activity log 1242, sensor log 1244, prosthesis log 1246 and interaction log 1248. According to some embodiments, data related to each game and activity may be stored in each of activity log 1242, sensor log 1244, prosthesis log 1246, interaction log 1248 and performance log 1250.
- Activity log 1242 comprises log entries relating to activities attempted by one or more users. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted. According to some embodiments, entries in activity log 1242 may additionally or alternatively comprise one or more of a timestamp at which the activity was attempted; an event type relating to an event that occurred during the activity; an object ID relating to a virtual object interacted with within the activity; an object position of the object interacted with; an object quaternion of the object interacted with; a grasp side of the prosthetic used to interact with the object; a head position based on a virtual position of a headset 120 being used by the user; a head quaternion based on a virtual position of a headset 120 being used by the user; a left hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; a left hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user; a right hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; and a right hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user.
- Sensor log 1244 comprises log entries relating to sensor data generated by one or more of headset 120, sensor device 130 and controller 190 during an activity presented by virtual reality platform 400. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted.
- Entries in sensor log 1244 may additionally or alternatively comprise one or more of a timestamp at which the activity was attempted; an event type relating to an event that occurred during the activity; a pattern of sensors activated; a power measure relating to the degree to which a sensor was activated; a head position based on a virtual position of a headset 120 being used by the user; a head quaternion based on a virtual position of a headset 120 being used by the user; a left hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; a left hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user; a right hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; and a right hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user.
- Prosthesis log 1246 comprises log entries relating to prosthesis data generated by sensor device 130 and relating to the actions of a virtual prosthesis during an activity presented by virtual reality platform 400. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted.
- Entries in prosthesis log 1246 may additionally or alternatively comprise one or more of a timestamp at which the activity was attempted; an event type relating to an event that occurred during the activity; a direction in which the virtual prosthesis was moved; a head position based on a virtual position of a headset 120 being used by the user; a head quaternion based on a virtual position of a headset 120 being used by the user; a left hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; a left hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user; a right hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; and a right hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user.
- Interaction log 1248 comprises log entries relating to event data generated during an activity presented by virtual reality platform 400. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted. According to some embodiments, entries in interaction log 1248 may additionally or alternatively comprise one or more of a timestamp at which the activity was attempted; an event type relating to an event that occurred during the activity; a pattern of sensors activated; and one or more values relating to the event.
- Performance log 1250 comprises log entries relating to performance of a user within virtual reality platform 400, and may be generated based on data from one or more of activity log 1242, sensor log 1244, prosthesis log 1246, and interaction log 1248. According to some embodiments, performance log 1250 may be written to by performance calculation module 950. Each entry may comprise a unique identifier.
- Entries in performance log 1250 may additionally or alternatively comprise one or more of a date and/or time at which the activity was attempted; a level of the activity attempted; a playtime indicating a duration of play; an activity score achieved by the user during game play; a compensation score relating to the amount that the user compensated for their prosthetic; a compensation style relating to the manner in which the user compensated for their prosthetic; a hand usage metric; a wrist usage metric; an elbow usage metric; a sensor patterns usage metric; and one or more game specific metrics.
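- A data-class sketch of a performance log entry, built from the fields listed above, is given below. The field names, types and default values are assumptions made for illustration; the specification does not prescribe a particular schema.

```python
# Illustrative schema for one entry in a performance log; the fields mirror the
# list above, but the names and types are assumptions.
from dataclasses import dataclass, field
from typing import Optional
import uuid

@dataclass
class PerformanceLogEntry:
    entry_id: str = field(default_factory=lambda: str(uuid.uuid4()))  # unique identifier
    date_time: Optional[str] = None          # date and/or time of the attempt
    level: Optional[int] = None              # level of the activity attempted
    playtime_s: Optional[float] = None       # duration of play, in seconds
    activity_score: Optional[float] = None   # score achieved during game play
    compensation_score: Optional[float] = None
    compensation_style: Optional[str] = None # e.g. "elbow down" (illustrative)
    hand_usage: Optional[float] = None
    wrist_usage: Optional[float] = None
    elbow_usage: Optional[float] = None
    sensor_patterns_usage: Optional[float] = None
    game_specific: dict = field(default_factory=dict)

entry = PerformanceLogEntry(level=2, playtime_s=312.0, activity_score=87.5)
```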
- Data stored in big data database 940 may be used by system 100 for purposes including the training of machine learning algorithms, generating population-level insights, and generating individual insights that may be accessible to clinician web application 980.
- FIG. 13 shows performance calculation module 950 in further detail.
- Performance calculation module 950 receives data from big data database 940, and writes data to real-time database 920 and big data database 940.
- Performance calculation module 950 receives raw log table data 1305 from big data database 940 when new data is added to logs 1242, 1244, 1246 or 1248.
- Raw log table data 1305 is passed to ML model selection module 1310, which is configured to determine what type of data is contained in raw log table data 1305, and therefore what type of machine learning model should be used to process the data.
- Performance calculation module 950 may comprise a number of algorithms for generating performance data.
- Performance calculation module 950 may contain a separate algorithm for each data log stored in big data database 940.
- Performance calculation module 950 comprises an activity metric computation module 1320, a prosthesis metric computation module 1330, a sensor metric computation module 1340 and a motor behaviour computation module 1350.
- Activity log data 1312 is passed to activity metric computation module 1320 for processing.
- Activity metric computation module 1320 performs a processing algorithm on activity log data 1312 to derive one or more activity metrics 1325.
- Activity metrics 1325 may be intervention or game dependent, in some embodiments.
- The activity metrics 1325 may include metrics related to a user’s performance in the activity. For example, where the activity is a target shooting game (as shown in Figures 18D, 19A and 21A, for example), the activity metrics 1325 may include the duration of the activity, the number of shots fired and the number of targets hit. Further activity metrics, such as an accuracy of the user, may be derived based on these raw metrics. Activity metrics 1325 are then passed to data collection module 1360.
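- As an illustration only, the sketch below derives the activity metrics mentioned for a target shooting game from a list of timestamped events. The event names are hypothetical stand-ins for entries in activity log 1242.

```python
# Derive duration, shots fired, targets hit and accuracy from activity events.
def target_shooting_metrics(events: list[dict]) -> dict:
    shots = sum(1 for e in events if e["event"] == "shot_fired")
    hits = sum(1 for e in events if e["event"] == "target_hit")
    times = [e["t"] for e in events]
    return {
        "duration_s": max(times) - min(times) if times else 0.0,
        "shots_fired": shots,
        "targets_hit": hits,
        # Derived metric: accuracy as the fraction of shots that hit a target.
        "accuracy": hits / shots if shots else 0.0,
    }

metrics = target_shooting_metrics([
    {"event": "shot_fired", "t": 0.0},
    {"event": "target_hit", "t": 0.4},
    {"event": "shot_fired", "t": 3.1},
])
# metrics["accuracy"] == 0.5 for this illustrative event list
```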
- Prosthesis log data 1314 is passed to prosthesis metric computation module 1330 for processing.
- Prosthesis metric computation module 1330 performs a processing algorithm on prosthesis log data 1314 to derive one or more prosthesis use metrics 1335.
- Prosthesis metric computation module 1330 may analyse prosthesis event segments based on prosthesis log data 1314 to calculate metrics associated with prosthesis utilisation, for example.
- Prosthesis metrics 1335 may be intervention dependent, in some embodiments.
- Prosthesis use metrics 1335 are then passed to data collection module 1360.
- Event segments, which may include prosthesis, activity or sensor event segments, may comprise pairs of related events, such as the start and end of a particular movement of the prosthetic by the user. According to some embodiments, an event may be triggered when an action or movement is commenced, and a second event may be triggered when that action or movement is stopped, to define the start and end of an event segment. These event segments may be used to calculate the duration of a particular activity.
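- The pairing of start and end events into segments, and the use of those segments to compute durations, can be sketched as follows. The event names are illustrative only.

```python
# Pair each start event with the next matching end event and sum the durations.
def segment_durations(events: list[dict], start: str, end: str) -> list[float]:
    durations, open_t = [], None
    for e in sorted(events, key=lambda e: e["t"]):
        if e["event"] == start and open_t is None:
            open_t = e["t"]                       # segment opened
        elif e["event"] == end and open_t is not None:
            durations.append(e["t"] - open_t)     # segment closed
            open_t = None
    return durations

grasps = segment_durations(
    [{"event": "grasp_start", "t": 1.0}, {"event": "grasp_end", "t": 2.5},
     {"event": "grasp_start", "t": 4.0}, {"event": "grasp_end", "t": 4.8}],
    "grasp_start", "grasp_end")
total_grasp_time = sum(grasps)   # 2.3 seconds for this illustrative event list
```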
- Sensor log data 1316 is passed to sensor metric computation module 1340 for processing.
- Sensor metric computation module 1340 performs a processing algorithm on sensor log data 1316 to derive one or more sensor use metrics 1345.
- Sensor metric computation module 1340 may analyse sensor event segments based on sensor log data 1316 to calculate metrics associated with sensor utilisation, for example.
- Sensor use metrics 1345 may be intervention dependent, in some embodiments.
- Sensor use metrics 1345 are then passed to data collection module 1360.
- Motor behaviour computation module 1350 performs a processing algorithm on interaction log data 1318 to derive one or more motor behaviour metrics 1355.
- Motor behaviour computation module 1350 may analyse prosthesis events and object grasping data as recorded in interaction log data 1318 to calculate metrics associated with a user’s grasping success rate, for example.
- Motor behaviour metrics 1355 may be intervention or game dependent, in some embodiments.
- Motor behaviour computation module 1350 may comprise one or more machine learning algorithms to derive motor behaviour metrics 1355.
- The machine learning algorithms may be trained to compare data extracted from interaction log data 1318 for a particular user with corresponding data derived from a population of able-bodied or expert prosthesis users.
- For example, a machine learning algorithm may be trained to compare grasping pose data for a given virtual object grasped by a user with grasping pose data derived from a population of able-bodied users grasping an identical virtual object.
- Similarly, a machine learning algorithm may be trained to compare pose data for a given sensor function with pose data derived from a population of expert prosthesis users performing an identical sensor function.
- Data for training such machine learning models may be derived from data retrieved from one or more raw data logs stored in big data database 940.
- For example, interaction pose data may be derived from interaction log data 1318, prosthesis action pose data may be derived from prosthesis log data 1314, and sensor function pose data may be derived from sensor log data 1316, in some embodiments.
- Motor behaviour computation module 1350 may use the one or more machine learning algorithms to derive motor behaviour metrics 1355 such as a compensation score and style, prosthesis utilisation score and style, and sensor utilisation score and style, for example.
- Motor behaviour computation module 1350 may use a clustering-type approach to derive these metrics.
- Historical data may be processed, and users may be clustered into groups based on the data generated during their rehabilitation sessions. Each cluster may relate to a particular motor behaviour, such as an elbow down motor behaviour when grasping an object, for example.
- One or more clusters may be labelled as desired clusters.
- Desired clusters may be labelled manually by one or more clinicians based on the motor behaviour that is considered desirable from a clinical perspective.
- Alternatively, desired clusters may be automatically labelled based on the clusters into which a large proportion of able-bodied or expert prosthesis users fall.
- When new data is generated for a user, motor behaviour computation module 1350 may analyse the distance between the new data and one or more of the desired clusters to determine a score for that user. Motor behaviour computation module 1350 may also determine which cluster the new data is closest to, and determine a style based on the style associated with that cluster. The style may represent a particular feature in motor behaviour, such as an elbow down or elbow up grasping style, for example. The styles may be defined by clinicians based on their observations from clinical practice in some embodiments. This approach may be used to determine score and style for one or more of compensation, prosthesis utilisation and sensor utilisation.
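- A minimal sketch of the score-and-style calculation is given below, assuming cluster centroids and their style labels have already been derived from historical data. The inverse-distance score mapping and the example styles are assumptions for illustration.

```python
import numpy as np

def score_and_style(sample: np.ndarray, centroids: np.ndarray,
                    styles: list[str], desired: list[int]) -> tuple[float, str]:
    """Score a new feature vector against desired clusters and pick a style."""
    dists = np.linalg.norm(centroids - sample, axis=1)
    # Score: inverse of the distance to the nearest desired cluster, squashed
    # into (0, 1]. This particular mapping is illustrative only.
    score = 1.0 / (1.0 + dists[desired].min())
    # Style: the label attached to whichever cluster is closest overall.
    style = styles[int(dists.argmin())]
    return float(score), style

centroids = np.array([[0.1, 0.2], [0.9, 0.8]])    # e.g. fitted on historical data
score, style = score_and_style(np.array([0.2, 0.25]), centroids,
                               styles=["elbow down", "elbow up"], desired=[0])
```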
- Motor behaviour computation module 1350 may also derive one or more range of motion metrics, which may form part of motor behaviour metrics 1355.
- Motor behaviour computation module 1350 may comprise a workspace analysis algorithm that is configured to compute the range of motion of a tracked limb of a user based on interaction log data 1318.
- The range of motion may be a three-dimensional range of motion, in some embodiments.
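- One simple workspace measure consistent with the description above is the axis-aligned extent of the tracked positions recorded in the interaction log; the sketch below is an assumption-laden illustration, and a production implementation might instead use a convex hull or reachable-volume estimate.

```python
import numpy as np

def range_of_motion(positions: np.ndarray) -> dict:
    """positions: (N, 3) array of x, y, z samples for one tracked limb."""
    extents = positions.max(axis=0) - positions.min(axis=0)
    return {
        "x_range": float(extents[0]),
        "y_range": float(extents[1]),
        "z_range": float(extents[2]),
        "bounding_volume": float(np.prod(extents)),  # coarse 3D summary
    }

rom = range_of_motion(np.array([[0.0, 1.1, 0.2],
                                [0.3, 1.4, 0.1],
                                [0.1, 1.2, 0.5]]))
```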
- Motor behaviour metrics 1355 are then passed to data collection module 1360.
- Data collection module 1360 may be configured to receive at least one of activity metrics 1325, prosthesis use metrics 1335, sensor use metrics 1345 and motor behaviour metrics 1355. Data collection module 1360 may be configured to structure the received data into a performance data table, in some embodiments. Data collection module 1360 may then pass the structured data to at least one of a database update module 1370, and a skill progress update module 1380.
- Database update module 1370 may be configured to transfer the activity performance table data created by data collection module 1360 to big data database 940 for storage in performance log 1250.
- Skill progress update module 1380 may be configured to transfer the progress metrics data 1385 created by data collection module 1360 to real-time database 920, for storage in progress data 1022.
- Progress data 1022 may be translated into a skill level that is presented to a user, as described below with reference to Figure 16B.
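- The translation from a stored progress value to the skill level shown to the user (cf. the level indicators in Figure 16B) could be as simple as the threshold mapping sketched below; the thresholds are invented for the sketch and are not defined by the specification.

```python
def skill_level(progress: float, points_per_level: float = 100.0) -> int:
    """Map a cumulative progress score to an integer skill level (1-based)."""
    return 1 + int(progress // points_per_level)

assert skill_level(0.0) == 1
assert skill_level(150.0) == 2   # a user part-way through level 2
```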
- The metrics calculated by performance calculation module 950 may be used to provide insights to a user or their clinician. For example, advice may be provided as to how a user can change their motor behaviour to reduce long-term postural health issues, and the types of prostheses that may assist with this.
- Figure 14 is a process flow diagram of a method 1400 performed by processor 111 of computing device 110 executing task presentation application 116 to present a virtual environment to a user. While method 1400 is described as being performed by processor 111, some or all of method 1400 may alternatively be performed by other components of system 100, such as processor 121 or processor 151, for example.
- At step 1405, processor 111 executing task presentation application 116 receives data corresponding to a high level prosthetic use goal associated with a user.
- The data may be received from user computing device 160, after a user interacts with user computing device 160 to enter data relating to one or more goals.
- Alternatively, the data may be received from clinician computing device 170, after a clinician interacts with clinician computing device 170 to enter data relating to one or more goals of a patient.
- The high level goal may relate to one or more tasks that the user wishes to be able to accomplish using their prosthetic. For example, a high level goal may be “cook for my family” or “drive to work”.
- The high level goal may be selected from a predefined list of goals retrieved from a memory location such as memory 153, for example.
- Alternatively, the high level goal may be a new goal manually entered by the user or clinician.
- Figure 16A shows an example screenshot that may be displayed during step 1405.
- At step 1410, processor 111 executing task presentation application 116 is caused to determine one or more basic skills or functional skills associated with the high level goal.
- Basic skills may be skills which target rehabilitation and strengthening of the neurology and muscles of the residual limb, and may be used in the operation of the prosthesis, while functional skills may comprise a combination of basic skills and target the ability of users to use the basic skills to operate the prosthesis for activities of daily living.
- Processor 111 may be configured to determine one or more basic skills or functional skills associated with the high level goal by referring to a database storing predefined goal information. For example, where the high level goal is cooking, processor 111 may determine that the skills required to achieve the goal include utensil use, bi-manual manipulation and grasp types.
- The one or more basic skills or functional skills associated with the high level goal may alternatively be selected from a predefined list by a user or clinician via user computing device 160 or clinician computing device 170, where the predefined list may be retrieved by processor 111 from a memory location such as memory 153.
- The basic skills or functional skills are specifically chosen for the particular high level goal to assist in accomplishing said high level goal. That is, the basic skills or functional skills chosen are used to provide the user with feedback on performance of said skills to better assist in reaching the high level goal, for example.
- At step 1415, processor 111 determines one or more games or activities to present to the user to assist them in achieving the one or more skills determined at step 1410.
- To do this, processor 111 may compare the skills identified at step 1410 with one or more skills associated with one or more games stored in a memory location such as memory 153.
- Each game may be associated with at least one basic or functional skill. For example, an assembly game where a user must use a virtual prosthesis to assemble various virtual objects into a virtual structure may be associated with the skill of bi-manual manipulation.
- Processor 111 may determine one or more games that match the skills determined at step 1410 to present to the user.
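- The goal-to-skill-to-game lookup described in steps 1410 and 1415 can be sketched as two tables and an overlap test, as below. The mappings are examples only; the "cook for my family" entry mirrors the example given in the text, while the remaining entries are hypothetical.

```python
GOAL_SKILLS = {
    "cook for my family": {"utensil use", "bi-manual manipulation", "grasp types"},
    "drive to work": {"grasp types", "wrist rotation"},   # hypothetical skills
}

GAME_SKILLS = {
    "pick and cook": {"utensil use", "grasp types"},
    "box & blocks": {"grasp types"},
    "assembly": {"bi-manual manipulation"},
}

def games_for_goal(goal: str) -> list[str]:
    """Return games whose associated skills overlap the goal's skills."""
    required = GOAL_SKILLS.get(goal.lower(), set())
    return [game for game, skills in GAME_SKILLS.items() if skills & required]

suggested = games_for_goal("Cook for my family")
# all three illustrative games match at least one required skill here
```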
- At step 1420, processor 111 may cause the one or more games identified at step 1415 to be presented to the user. These may be presented via one or more of a user interface of client computing device 160, user I/O 113 of computing device 110, or display 123 of headset 120.
- An example user interface showing a selection of games being presented to the user is shown in Figure 17 and described in further detail below.
- At step 1425, processor 111 receives a user selection corresponding to one of the games presented at step 1420.
- The user selection may be received via a user interface of client computing device 160, via user I/O 113 of computing device 110, or via controller 190 operating in conjunction with headset 120, in some embodiments.
- At step 1430, processor 111 causes the selected game to be generated as a virtual environment for presentation to the user via headset 120.
- The virtual environment may comprise one or more virtual objects.
- Processor 111 may retrieve the program code associated with the game from a memory location such as memory 153, and may pass the program code to headset 120 for execution by processor 121.
- Example virtual environments are shown in Figures 18A to 21D, and described in further detail below.
- Processor 111 then receives prosthesis data corresponding to a virtual prosthesis for display to the user in the selected game.
- Processor 111 may retrieve the prosthesis data from a memory location such as memory 153.
- The prosthesis data may correspond to a physical prosthesis fitted to the user or proposed to be fitted to the user, and may be entered by a clinician via clinician computing device 170 in some embodiments.
- The prosthesis data may form part of system and user configuration data 435, as described above with reference to Figure 4.
- Processor 111 causes a virtual prosthesis corresponding to the prosthesis data to be generated for presentation to the user via headset 120 within the generated virtual environment.
- At step 1445, processor 111 causes the virtual environment and virtual prosthesis to be presented to the user via headset 120.
- Figure 16A shows an example screenshot 1600 that may be displayed to a clinician during step 1405 of method 1400.
- According to some embodiments, screenshot 1600 is displayed on clinician computing device 170.
- In other embodiments, screenshot 1600 may be displayed on other devices of system 100.
- Screenshot 1600 shows a patient identifier 1610 indicating the patient for whom goals and skills are being selected.
- In the illustrated example, goals and skills are being selected for a patient named John.
- Goal selection boxes 1620 provide a means by which a user of device 170 can select one or more goals to be associated with John’s profile. According to some embodiments, boxes 1620 may be drop-down boxes from which a user can select a predefined goal. According to some embodiments, boxes 1620 may be text boxes into which a user can manually enter a new goal.
- In the illustrated example, a goal of “Cook for my family” has been entered or selected in a first box 1620, while a user has entered the word “Drive” in a second box 1620, causing drop down items 1625 to be shown.
- The drop down items 1625 include the goals “Drive my kids to school” and “Drive to work”. An option to add a new goal that doesn’t currently appear on the list is also shown.
- Screenshot 1600 further provides virtual buttons 1630 and 1640, allowing a user to add a new goal to the list of John’s goals, or to save the goals, respectively. Interacting with virtual button 1640 may cause clinician computing device 170 to save the goals to a local memory location, or to send the goal data to an external device such as rehabilitation management platform 150 for storage in a memory location such as memory 153.
- Figure 15 is a process flow diagram of a method 1500 performed by processor 111 of computing device 110 executing assessment application 117, for assessing a user’s performance within a virtual environment, such as the virtual environment presented to the user at step 1445 of method 1400. While method 1500 is described as being performed by processor 111, some or all of method 1500 may alternatively be performed by other components of system 100, such as processor 121 or processor 151, for example.
- Processor 111 executing assessment application 117 receives historical performance data relating to previous performance of one or more users within a virtual environment presented by system 100.
- The data may be retrieved from big data database 940, and may include data relating to able-bodied users or experienced amputees.
- At step 1510, processor 111 executing assessment application 117 performs a clustering method on the retrieved data, to arrange the data into clusters.
- Processor 111 also determines at least one desired cluster, where the desired cluster is a cluster containing a predetermined percentage of data from able-bodied users or experienced amputees.
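- Step 1510 and the desired-cluster determination could be sketched as below, using k-means over a feature table and marking as desired any cluster in which able-bodied or experienced users are over-represented. The features, the choice of k-means and the 60% threshold are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy historical feature table; in practice features would be derived from the
# raw logs described above.
features = np.array([[0.10, 0.20], [0.15, 0.25], [0.80, 0.90], [0.85, 0.95]])
is_expert = np.array([True, True, False, False])   # able-bodied/experienced flag

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)

# A cluster is "desired" if a predetermined share of its members are expert.
desired_clusters = [
    c for c in range(kmeans.n_clusters)
    if is_expert[kmeans.labels_ == c].mean() >= 0.6
]
```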
- At step 1515, processor 111 executing assessment application 117 receives new performance data relating to performance of a user within a virtual environment presented by system 100.
- The new data may be generated by one or more of headset 120, sensor device 130 or controller 190.
- The data may be retrieved from real-time database 920.
- The new performance data may comprise at least one of: sensor data, prosthesis data, interaction event data, motor behaviour data and postural data.
- Motor behaviour data may include compensation data relating to the manner in which and the amount by which a user compensates for their prosthetic through other body movement.
- Postural data may relate to the posture of the user during interaction with the virtual environment presented by system 100.
- Processor 111 is then caused to compare the data received at step 1515 with a predetermined parameter or predetermined control data, such as the clustered data determined at step 1510. According to some embodiments, processor 111 may determine which cluster the new performance data is closest to. According to some embodiments, processor 111 may determine the distance between the new performance data and at least one desired cluster.
- Processor 111 may then derive at least one performance metric from the data.
- The performance metric may include at least one of a motor performance metric and a prosthesis use metric, for example.
- Processor 111 may analyse the distance between the new data and one or more of the desired clusters to determine a score metric.
- Processor 111 may additionally or alternatively determine which cluster the new data is closest to, and determine a style metric based on the style associated with that cluster.
- The at least one performance metric may be derived from at least one of: sensor data, prosthesis data, interaction event data, motor behaviour data and postural data.
- The at least one performance metric may be indicative of the user’s ability to functionally use a prosthesis outside of the virtual environment. That is, the at least one performance metric may relate to a user’s ability to perform real world tasks such as cooking, driving a vehicle, or using a computer, for example.
- Processor 111 may optionally also determine a recommendation for the user based on the calculated metrics. For example, a recommendation may be generated as to how a user can change their motor behaviour to reduce long-term postural health issues, and/or the types of prostheses that may assist with this.
- At step 1530, processor 111 causes the metric and/or recommendation to be presented to a user and/or clinician. This may be via one or more of a user interface of client computing device 160, a user interface of a clinician computing device 170, user I/O 113 of computing device 110, or display 123 of headset 120.
- Figure 16B shows an example screenshot 1650 that may be displayed to a user during step 1530 of method 1500.
- According to some embodiments, screenshot 1650 is displayed on user computing device 160.
- In other embodiments, screenshot 1650 may be displayed on other devices of system 100.
- Screenshot 1650 shows a goal identifier 1655 indicating a high level goal for which metrics are being displayed.
- In the illustrated example, metrics are being displayed for the goal of “Cook for my family”.
- An overall progress bar 1660 is shown for the high level goal, along with a level indicator 1665.
- A number of further progress bars are also shown for skills associated with the high level goal: a progress bar 1670 is shown for the skill “utensil use”; a progress bar 1680 is shown for the skill “bimanual manipulation”; and a progress bar 1690 is shown for the skill “grasp types”.
- Each progress bar 1670, 1680 and 1690 also has an associated level indicator 1675, 1685 and 1695, respectively.
- Each of level indicators 1665, 1675, 1685 and 1695 shows a level of “2”.
- FIG. 17 shows an example screenshot 1700 that may be displayed to a user during step 1420 of method 1400.
- Screenshot 1700 may be displayed via headset 120, on user computing device 160, or on other devices of system 100.
- Screenshot 1700 shows an environment 1705, which may be presented as a virtual environment when being viewed via headset 120.
- Within environment 1705 is shown a virtual prosthesis 1710, which may be controlled within environment 1705 by interaction with sensor device 130.
- Environment 1705 also contains a menu display 1720 showing a number of menu options 1722, 1724 and 1726. Each of the menu options may correspond to an activity or game selectable by the user to be displayed to them via headset 120.
- Option 1722 relates to a “pick and cook” activity, as described in further detail below with reference to Figures 19C, 19D and 21C.
- Option 1724 relates to a “paintball town” activity, as described in further detail below with reference to Figures 18D, 19A and 21A.
- Option 1726 relates to a “box & blocks” activity, as described in further detail below with reference to Figures 18B, 19B and 20.
- A user may be able to interact with one of the presented options to make a selection, as described above with reference to step 1425 of method 1400.
- Figures 18A to 18D show example screenshots 1800, 1820, 1840 and 1860 that may be displayed to a user during step 1430 of method 1400.
- Screenshots 1800, 1820, 1840 and 1860 may be displayed via headset 120, on user computing device 160, or on other devices of system 100.
- While the screenshots of Figures 18A to 18D show the user avatar from a third person perspective, according to some embodiments these screenshots may be generated for viewing by a third party, such as a clinician, rather than for the user to view via headset 120.
- FIG 18A shows a screenshot 1800.
- Screenshot 1800 shows an environment 1805. Within the environment 1805 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1805 by a user interacting with sensor device 130. Environment 1805 also includes a number of manipulable virtual objects 1815, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1805 may facilitate a building or construction type activity, where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual food objects 1815 and to construct a burger. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1815 using the virtual prosthesis 1710.
- Figure 18B shows a screenshot 1820.
- Screenshot 1820 shows an environment 1825. Within the environment 1825 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1825 by a user interacting with sensor device 130. Environment 1825 also includes a number of manipulable virtual objects 1835, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1825 may facilitate a “box & blocks” type activity, where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual objects 1835 to move them into and out of a virtual box. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1835 using the virtual prosthesis 1710.
- FIG 18C shows a screenshot 1840.
- Screenshot 1840 shows an environment 1845.
- Within the environment 1845 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1845 by a user interacting with sensor device 130.
- Environment 1845 also includes a number of manipulable virtual objects 1855, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130.
- According to some embodiments, environment 1845 may facilitate an activity where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual objects 1855 to move them within the virtual environment. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1855 using the virtual prosthesis 1710.
- FIG 18D shows a screenshot 1860.
- Screenshot 1860 shows an environment 1865.
- Within the environment 1865 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1865 by a user interacting with sensor device 130.
- Environment 1865 also includes a number of virtual target objects 1875, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130.
- According to some embodiments, environment 1865 may facilitate a “shooting” type activity such as paintball, where the user is prompted to guide the virtual prosthesis 1710 to pick up projectile type weapons such as paintball guns with paintball projectiles, and use these to strike virtual targets 1875. This may involve the user grasping, moving, manipulating and releasing the weapons, as well as aiming at and firing at virtual targets 1875 using the virtual prosthesis 1710.
- Figures 19A to 19D show example images 1900, 1920, 1940 and 1960 that show screenshots which may be viewed via a headset 120 by a user during step 1430 of method 1400.
- Figure 19A shows a headset 120 with display optics 1910 and 1912 showing screenshots of images to be displayed to a user wearing headset 120, and configured to each be viewed by an eye of the user to present a simulated three dimensional image to the user.
- Figure 19A shows an environment 1865.
- Within environment 1865 is shown a virtual prosthesis 1710, which may be controlled within environment 1865 by a user interacting with sensor device 130.
- Environment 1865 also shows a virtual weapon 1915 held by the virtual prosthesis 1710, and a virtual target 1875 for shooting with weapon 1915.
- According to some embodiments, environment 1865 may facilitate a “shooting” type activity, as described above with reference to Figure 18D.
- Figure 19B shows a headset 120 with display optics 1910 and 1912 showing screenshots of images to be displayed to a user wearing headset 120, and configured to each be viewed by an eye of the user to present a simulated three dimensional image to the user.
- Figure 19B shows an environment 1825. Within the environment 1825 is shown a virtual prosthesis 1710, which may be controlled within environment 1825 by a user interacting with sensor device 130. Environment 1825 also shows virtual objects 1835, with one virtual object 1935 being held by the virtual prosthesis 1710. According to some embodiments, environment 1825 may facilitate a “box & blocks” type activity, as described above with reference to Figure 18B.
- Figure 19C shows a headset 120 with display optics 1910 and 1912 showing screenshots of images to be displayed to a user wearing headset 120, and configured to each be viewed by an eye of the user to present a simulated three dimensional image to the user.
- Figure 19C shows an environment 1919.
- Within environment 1919 is shown a virtual prosthesis 1710, which may be controlled within environment 1919 by a user interacting with sensor device 130.
- Environment 1919 also shows virtual objects 1917, with one virtual object 1915 being held by the virtual prosthesis 1710.
- According to some embodiments, environment 1919 may facilitate a “barbeque” type activity, where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual objects 1917 to move them onto a grill, rotate them on the grill and remove them from the grill. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1917 using the virtual prosthesis 1710.
- Figure 19D shows an alternative view of environment 1919.
- Figure 20 shows a screenshot 2000 displaying a user perspective view of environment 1825.
- Within environment 1825 is shown a virtual prosthesis 1710, which may be controlled within environment 1825 by a user interacting with sensor device 130.
- Environment 1825 also includes a number of manipulable virtual objects 1835, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130.
- According to some embodiments, environment 1825 may facilitate a “box & blocks” type activity, as described above with reference to Figure 18B.
- Figure 21A shows a screenshot 2100 displaying a user perspective view of environment 1865.
- Within environment 1865 is shown a virtual prosthesis 1710, which may be controlled within environment 1865 by a user interacting with sensor device 130.
- Environment 1865 also includes a virtual weapon 1915 held by virtual prosthesis 1710, and a virtual target 1875 for shooting with virtual weapon 1915.
- According to some embodiments, environment 1865 may facilitate a “shooting” type activity, as described above with reference to Figure 18D.
- Figure 21B shows a screenshot 2120 displaying a user perspective view of environment 1805.
- Within environment 1805 is shown a virtual prosthesis 1710, which may be controlled within environment 1805 by a user interacting with sensor device 130.
- Environment 1805 also includes a number of manipulable virtual objects 1815, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130.
- According to some embodiments, environment 1805 may facilitate a building and construction type activity, as described above with reference to Figure 18A.
- Figure 21C shows a screenshot 2140 displaying a user perspective view of environment 1919.
- Within environment 1919 is shown a virtual prosthesis 1710, which may be controlled within environment 1919 by a user interacting with sensor device 130.
- Environment 1919 also includes a number of manipulable virtual objects 1917, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1919 may facilitate a barbeque type activity, as described above with reference to Figures 19C and 19D.
- Figure 21D shows a screenshot 2160 displaying a user perspective view of an environment 2165.
- Within environment 2165 is shown a virtual prosthesis 2170, which may be controlled within environment 2165 by a user interacting with sensor device 130.
- Virtual prosthesis 2170 is a lower body prosthesis, and may be controlled by a sensor device 130 worn on the lower body, as shown in Figure 3B, for example.
- Environment 2165 also includes a manipulable virtual object 2175, which the user may be able to interact with via the virtual prosthesis 2170. The user may do so by interacting with sensor device 130.
- According to some embodiments, environment 2165 may facilitate a soccer type activity, where the user is prompted to guide the virtual prosthesis 2170 to interact with the virtual object 2175, being a virtual ball, to move the virtual object 2175 through the virtual soccer field and through the virtual goals. This may involve the user kicking, tapping and rolling the virtual object 2175 using the virtual prosthesis 2170.
Abstract
Described embodiments generally relate to a method of determining performance metrics of a user relating to control of a virtual prosthetic limb in a virtual or augmented reality environment. The method comprises causing a virtual reality environment to be presented to a user, wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device; receiving sensor data from the sensor device in response to the user causing the virtual prosthetic limb to move within the virtual reality environment, the sensor data comprising motion sensor data and human-prosthesis interface data; and processing the sensor data to derive at least one performance metric. The at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter, and the at least one performance metric relates to a level of competency with which a user controls the virtual prosthetic limb within the virtual reality environment.
Description
"Methods and systems for rehabilitation of prosthesis users"
Technical Field
Embodiments generally relate to methods and systems for rehabilitation of prosthesis users. In particular, embodiments relate to systems and methods for facilitating rehabilitation of prosthesis users using a virtual or augmented reality platform.
Background
Many people living with limb loss or limb differences elect to use a prosthesis to assist with their daily activities. For example, where a patient has upper limb loss, a prosthesis may provide them with an increased ability to perform tasks such as picking up, holding, and manipulating objects. However, learning to use a prosthesis requires time and training. This can be an arduous process for the prosthesis user, and may require them to attend numerous in-person rehabilitation sessions with a clinician.
It is desired to address or ameliorate one or more shortcomings or disadvantages associated with prior systems for rehabilitation of prosthesis users, or to at least provide a useful alternative thereto.
Any discussion of documents, acts, materials, devices, articles or the like which has been included in the present specification is not to be taken as an admission that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present disclosure as it existed before the priority date of each of the appended claims.
Throughout this specification the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps.
Summary
Some embodiments relate to a method of determining performance metrics of a user relating to control of a virtual prosthetic limb in a virtual or augmented reality environment, the method comprising: causing a virtual reality environment to be presented to a user, wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device; receiving sensor data from the sensor device in response to the user causing the virtual prosthetic limb to move within the virtual reality environment, the sensor data comprising motion sensor data and human-prosthesis interface data; and processing the sensor data to derive at least one performance metric; wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter.
Some embodiments relate to a method of determining performance metrics of a user relating to control of a virtual prosthetic limb in a virtual or augmented reality environment, the method comprising: causing a virtual reality environment to be presented to a user, wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device; receiving sensor data from the sensor device in response to the user causing the virtual prosthetic limb to move within the virtual reality environment, the sensor data comprising motion sensor data and human-prosthesis interface data; and processing the sensor data to derive at least one performance metric;
wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter, wherein the at least one performance metric relates to a level of competency with which a user controls the virtual prosthetic limb within the virtual reality environment.
According to some embodiments, the performance metric is related to the performance of at least one basic skill.
According to some embodiments, the performance metric is related to the performance of at least one functional skill.
In some embodiments, the performance metric is based on at least one motor behaviour metric.
In some embodiments, the at least one motor behaviour metric is at least one of a compensation score, a compensation style, a prosthesis utilisation score, a prosthesis utilisation style, a sensor utilisation score and a sensor utilisation style.
According to some embodiments, the performance metric relates to the movement of the virtual limb within the virtual reality environment.
In some embodiments, the performance metric is determined in response to a trigger measurement event.
In some embodiments, the measurement event is triggered by an interaction between the virtual limb and at least one virtual object.
Some embodiments further comprise prompting the user to perform a task within the virtual environment.
In some embodiments, the task requires the user to interact with a virtual object within the virtual environment. Some embodiments further comprise receiving data related to the state of the virtual object, and using this data to derive the at least one performance metric.
Some embodiments further comprise receiving data related to the pose of the user, and using this data to derive the at least one performance metric.
Some embodiments further comprise receiving data related to the state of the virtual prosthetic limb, and using this data to derive the at least one performance metric.
Some embodiments further comprise receiving data related to the state of the sensor device, and using this data to derive the at least one performance metric.
According to some embodiments, comparing the sensor data with at least one predetermined parameter comprises comparing the sensor data with historical data generated by a population of users within an equivalent virtual reality environment. Some embodiments further comprise performing a clustering technique on the historical data to group the historical data into clusters, wherein comparing the sensor data with historical data comprises comparing the sensor data with the clustered data. Some embodiments further comprise determining that at least one cluster is a desired cluster, and determining the at least one performance metric by calculating a distance between the desired cluster and the sensor data. In some embodiments, determining the at least one performance metric comprises determining which cluster is closest to the sensor data.
In some embodiments, the performance metric comprises at least one of a motor performance metric and a prosthesis use metric.
Some embodiments further comprise determining that an interaction has occurred between a virtual object and the virtual prosthetic limb, and generating haptic feedback data to be delivered to the user. According to some embodiments, the haptic feedback data comprises at least one of a vibration amplitude and frequency.
In some embodiments, the determined performance metrics of the user are communicated to the user to assist the user in performing functional tasks with a physical prosthetic limb in a real-world environment.
Some embodiments relate to a method of presenting a virtual reality environment to a user, the method comprising: receiving input indicating a high level prosthesis rehabilitation goal; determining, based on the high level prosthesis rehabilitation goal, at least one low level skill relating to the high level goal; generating a virtual reality environment that facilitates in the performance of the low level skill; wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device, the sensor device comprising at least one motion sensor and at least one human-prosthesis interface sensor; receiving sensor data from the sensor device, the sensor data comprising motion sensor data and human-prosthesis interface data; and based on the received sensor data, causing movement of the virtual prosthetic limb in the virtual reality environment to allow the user to perform the low level skill.
In some embodiments, the high level goal relates to a real-world task.
In some embodiments, the low level skill comprises at least one of a basic skill and a functional skill.
In some embodiments, the virtual reality environment is used for the purpose of training and/or rehabilitation.
Some embodiments further comprise processing the sensor data to derive at least one performance metric, wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter.
Some embodiments further comprise determining a recommendation for the user based on the at least one performance metric, and presenting to the user the recommendation.
According to some embodiments, the sensor device comprises at least one inertial measurement unit.
According to some embodiments, the sensor device comprises at least one camera.
In some embodiments, the sensor device comprises at least one surface electromyography sensor.
Some embodiments relate to a computer readable storage medium storing executable code, wherein when a processor executes the code, the processor is caused to perform the method of some other embodiments.
Brief Description of Drawings
Embodiments are described in further detail below, by way of example and with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a system for virtual or augmented reality based therapeutics, according to some embodiments;
Figure 2 is a schematic of an optional dongle of the system of Figure 1, according to some embodiments;
Figure 3A is a diagram illustrating components of the system of Figure 1 being operated by a user with an upper limb amputation, according to some embodiments;
Figure 3B is a diagram illustrating components of the system of Figure 1 being operated by a user with a lower limb amputation, according to some embodiments;
Figure 4 is a schematic diagram of the software components of the system of Figure 1, according to some embodiments;
Figure 5 is a schematic diagram showing the hardware services components of the system of Figure 4 in further detail;
Figure 6 is a schematic diagram showing the prosthesis services components of the system of Figure 4 in further detail;
Figure 7 is a schematic diagram showing the data services components of the system of Figure 4 in further detail;
Figure 8 is a schematic diagram showing the game services components of the system of Figure 4 in further detail;
Figure 9 is a schematic diagram showing the rehabilitation management platform of Figure 1 in further detail;
Figure 10 is a schematic diagram showing the real-time database of Figure 9 in further detail;
Figure 11 is a schematic diagram showing the raw data processing module of Figure 9 in further detail;
Figure 12 is a schematic diagram showing the big data database of Figure 9 in further detail;
Figure 13 is a schematic diagram showing the performance calculation module of Figure 9 in further detail;
Figure 14 is a process flow diagram of a method of providing virtual reality based therapeutics by presenting games to a user using the system of Figure 1;
Figure 15 is a process flow diagram of a method of providing virtual reality based assessment using the system of Figure 1;
Figure 16A is an example screenshot that may be displayed during performance of the method of Figure 14;
Figure 16B is an example screenshot that may be displayed during performance of the method of Figure 15;
Figure 17 is an example screenshot that may be displayed to a user during performance of the method of Figure 14 to allow for selection of games to present to a user using the system of Figure 1;
Figures 18 A to 18D are example screenshots that may be displayed to a user during performance of the method of Figure 14;
Figures 19A to 19D show the headset of Figure 1 displaying images that may be shown to the user during performance of the method of Figure 14;
Figure 20 is an example screenshot that may be displayed to a user during performance of the method of Figure 14; and
Figures 21A to 21D are further example screenshots that may be displayed to a user during performance of the method of Figure 14.
Description of Embodiments
Embodiments generally relate to methods and systems for rehabilitation of prosthesis users. In particular, embodiments relate to systems and methods for facilitating rehabilitation of prosthesis users using a virtual or augmented reality platform.
People living with limb loss or limb differences who elect to use a prosthesis may require substantial rehabilitation and training to learn how to effectively use the prosthesis, and to train their body, both neurologically and physically, to use the prosthesis. Where a patient has upper limb loss, they may wear a prosthesis that provides them with functions usually performed by the upper limb, such as picking up, holding, and manipulating objects, for example. Mechanical prostheses use human-to-prosthesis interface sensors that a user can interact with to cause the prosthesis to move in the desired way. For example, a prosthesis may include surface electromyography sensors that are configured to make contact with the remaining portion of the user’s limb, so that electrical signals created when the wearer's remaining muscles contract are translated into movement of the prosthesis. However, it can take time for a user to learn how to manipulate the prosthesis effectively.
Virtual reality and augmented reality platforms allow users to control an avatar within a three-dimensional virtual space, and to interact with virtual objects in that space. This may be done using hand-held controllers comprising user interface elements, such as buttons and motion sensors, to capture the user’s desired movements and actions. Interactions with the controllers are translated into movement of the avatar within the three-dimensional virtual space. While the remainder of this document refers to virtual reality, it should be appreciated that this may include augmented reality and/or mixed reality environments.
Described embodiments use a virtual reality platform to assist in rehabilitation, training and assessment of prosthesis users. By providing a sensor device that can track motion and muscle activation, user inputs can be translated into movements of a virtual prosthesis displayed in a virtual environment. The user can learn to manipulate the virtual prosthesis to interact with virtual objects in the virtual space, and analytics data relating to their actions can be provided to a clinician for assessment. By configuring the virtual prosthesis to act in a manner similar to how a physical prosthesis would act given the same motion and muscle activation of a user, the user can apply the trained motion and muscle activation to a physical prosthesis. In other words, training with the virtual prosthesis in the virtual environment may improve a user’s use of a physical prosthesis in their day-to-day life as they move through and interact with their real-world physical environment.
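A minimal sketch of how motion and muscle-activation readings might be mapped onto a virtual prosthesis is given below. The proportional-control rule, thresholds and channel meanings are assumptions made for illustration and are not the control scheme mandated by the described embodiments.

```python
def update_virtual_prosthesis(emg_rms: float, imu_quaternion: tuple,
                              emg_floor: float = 0.05, emg_ceiling: float = 0.6):
    """Return a (grip_closure, orientation) pair for the virtual prosthesis."""
    # Proportional control: normalised EMG amplitude drives how far the virtual
    # terminal device closes (0.0 = fully open, 1.0 = fully closed).
    span = emg_ceiling - emg_floor
    closure = min(max((emg_rms - emg_floor) / span, 0.0), 1.0)
    # The orientation of the virtual limb follows the IMU orientation directly.
    return closure, imu_quaternion

closure, orientation = update_virtual_prosthesis(0.35, (1.0, 0.0, 0.0, 0.0))
# closure is roughly 0.55 for this illustrative reading
```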
According to some embodiments, the virtual reality environment may be used to present games-based tasks to assist in the rehabilitation process. While the term “game” is used in the description below, it should be understood that the present application relates to a system of rehabilitation and assessment, and not to an entertainment gaming system. Tasks are presented to the user in the form of games to encourage the user to perform them, but could be presented without the game-like elements in some embodiments. For example, the described virtual reality platform may be used to present assessment tasks and standardised tests to the user, in some embodiments.
Figure 1 shows a schematic diagram of a system 100 for virtual reality based therapeutics, according to some embodiments. System 100 may be configured to provide a virtual environment that can be used to provide one or more of therapy, rehabilitation, training, assessment, and telehealth. System 100 comprises a computing device 110 in communication with a headset 120. Headset 120 is further in communication with a sensor device 130. In some embodiments, headset 120 may be in communication with sensor device 130 via an optional dongle 135. In some embodiments, computing device 110 and headset 120 may further be in communication with a controller 190.
According to some embodiments, computing device 110 may be in communication with further external devices and systems via a network 140. For example, computing device 110 may further be in communication with one or more of a rehabilitation management platform 150, a user computing device 160 and/or a clinician computing device 170. According to some embodiments, a user may wear and/or use headset 120 and sensor device 130 in order to experience a virtual reality platform provided by system 100, to enable the user to interact with virtual objects in a virtual space presented to them via headset 120 by interaction with sensor device 130. Interaction with sensor device 130 may include movement of a limb or portion of a limb which sensor device 130 is configured to monitor, and/or activation of one or more muscles which sensor device 130 is configured to monitor.
Computing device 110 may be a smartphone, laptop computer, desktop computer, or other computing device configured to communicate with headset 120. According to some embodiments, computing device 110 may be operated by a clinician. According to some embodiments, computing device 110 may be operated by a patient or user of headset 120 and sensor device 130. In some embodiments, rather than being a standalone device, computing device 110 may be integrated with headset 120.
Computing device 110 comprises a processor 111 in communication with a memory 114. Processor 111 comprises one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in memory 114. Processor 111 may include an arithmetic logic unit (ALU) for mathematical and/or logical
execution of instructions, such as operations performed on data stored in internal registers of processor 111.
Memory 114 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types. For example, memory 114 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory. Memory 114 is configured to store program code 115 accessible by the processor 111. Program code 115 may comprise a plurality of executable program code modules executable by processor 111 to cause processor 111 to perform functions as described in further detail below. For example, memory 114 may comprise program code 115 in the form of a task presentation application 116, and an assessment application 117, both executable by processor 111.
Executing task presentation application 116 may cause processor 111 to generate a virtual environment for presentation to a user via headset 120, as described below in further detail with reference to Figure 14. Executing assessment application 117 may cause processor 111 to process data received from headset 120 and sensor device 130 to assess a user’s performance within the virtual environment, as described below in further detail with reference to Figure 15. While task presentation application 116 and assessment application 117 are shown as both residing on computing device 110, according to some embodiments one or more of these applications may be stored on or caused to execute on another device of system 100, such as headset 120 or rehabilitation management platform 150, for example.
Further executable code modules that may be stored in program code 115 are described below with reference to Figure 4.
Memory 114 may also comprise one or more data files comprising data 180 that is accessible to processor 111 for performing read and write functions. Some data types that may be stored in memory 114 are described below with reference to Figure 4.
Computing device 110 may also comprise user inputs and outputs 113 capable of receiving inputs, such as requests, from one or more users of computing device
110, and capable of conveying outputs, such as information, to the user. User I/O 113 may comprise one or more user interface components, such as one or more of a display device, a touch screen display, a keyboard, a mouse, a camera, a microphone, and buttons, for example.
Computing device 110 may further comprise a communications module 112 configured to facilitate communication between computing device 110 and one or more external computing devices or systems via one or more networks. Communications module 112 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel. According to some embodiments, communications module 112 may facilitate communication between computing device 110 and other devices within system 100 via network 140. For example, communications module 112 may facilitate communication between computing device 110 and rehabilitation management platform 150, user computing device 160, and/or clinician computing device 170 via network 140. According to some embodiments, communications module 112 may facilitate communication of data related to patient performance from computing device 110 to one or more of rehabilitation management platform 150, user computing device 160, and/or clinician computing device 170, for example.
According to some embodiments, communications module 112 may alternatively or additionally facilitate direct communication between computing device 110 and other components of system 100. For example, communications module 112 may facilitate direct communication between computing device 110 and headset 120, sensor device 130, and/or controller 190. According to some embodiments, communications module 112 may facilitate communication of calibration data from computing device 110 to one or more of headset 120, sensor device 130, and/or controller 190, for example. Communications module 112 may also facilitate communication of sensor and user performance data from one or more of headset 120, sensor device 130, and/or controller 190 to computing device 110.
Communications module 112 may facilitate communication between computing device 110 and external computing devices or systems via one or more wired or wireless
communications protocols. For example, communications module 112 may facilitate communication between computing device 110 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols.
Network 140 may comprise one or more local area networks or wide area networks that facilitate communication between elements of system 100. For example, according to some embodiments, network 140 may be the internet. However, network 140 may comprise at least a portion of any one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, and/or process one or more messages, packets, or signals. Network 140 may include, for example, one or more of: a wireless network, a wired network, an internet, an intranet, a public network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a public-switched telephone network (PSTN), a cable network, a cellular network, a satellite network, a fibre-optic network, or some combination thereof.
Headset 120 may be a virtual reality headset configured to deliver a virtual reality environment to a user when worn on the user’s head. According to some embodiments, headset 120 may be an off-the-shelf or commercially available virtual reality headset. Headset 120 may be an Android-based VR headset in some embodiments, such as an Oculus Quest 1 or Oculus Quest 2 headset from Oculus®; a VIVE Focus 3 headset from HTC®; or a Pico Neo 2 or Pico Neo 3 headset from Pico Interactive®.
Headset 120 comprises a processor 121 in communication with a memory 124. Processor 121 comprises one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in memory 124. Processor 121 may include an arithmetic logic unit (ALU) for mathematical and/or logical execution of instructions, such as operations performed on data stored in internal registers of processor 121.
Memory 124 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types. For example, memory 124 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory. Memory 124 is configured to store program code 125 accessible by the processor 121. Program code 125 may comprise a plurality of executable program code modules executable by processor 121 to cause processor 121 to perform functions such as displaying a virtual reality environment to a user via display 123, as described in further detail below. According to some embodiments, program code 125 may include virtual reality content built on a real-time game engine, such as the Unity game engine by Unity Technologies®, for example. Memory 124 may also comprise one or more data files comprising data 126 that is accessible to processor 121 for performing read and write functions.
Headset 120 also comprises a display 123 capable of conveying visual data to a user in the form of a virtual reality environment. Display 123 may be a head-mounted display comprising one or more display optics that are positioned to be viewed by at least one eye of a user when headset 120 is worn. For example, display 123 may comprise two display optics, with each display optic being configured to be viewed by an eye of the user to present a simulated three dimensional image to the user.
According to some embodiments, headset 120 also comprises a motion sensor 127. According to some embodiments, motion sensor 127 may be configured to sense the motion of the headset 120 as it is being worn by a user, to allow for the image displayed by display 123 to be changed with the movement of headset 120. Processor 121 executing program code 125 may be caused to adapt the user’s view of the virtual environment being displayed to them via display 123 as the user moves in their real-world physical environment, the physical movement of the user being translated into virtual movement of the user’s avatar within the virtual environment and thus altering the perspective of the virtual environment shown to the user via display 123. According to some embodiments, motion sensor 127 may comprise an inertial measurement unit (IMU), or other device that measures at least one of the specific force, angular rate, and
orientation of headset 120. According to some embodiments, motion sensor 127 may comprise one or more of an accelerometer, gyroscope, and magnetometer.
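By way of non-limiting illustration only, the short Python sketch below shows one way orientation data of this kind could be turned into a view direction for display 123; the yaw/pitch convention and y-up axis layout are assumptions made purely for the example and do not reflect any particular headset implementation.

```python
import math

def view_direction(yaw_rad: float, pitch_rad: float) -> tuple:
    """Convert headset yaw and pitch (radians) into a unit view vector.

    Assumes a y-up coordinate frame with +z as the initial forward axis;
    this convention is an assumption for illustration only.
    """
    x = math.cos(pitch_rad) * math.sin(yaw_rad)
    y = math.sin(pitch_rad)
    z = math.cos(pitch_rad) * math.cos(yaw_rad)
    return (x, y, z)

# Example: the user turns their head 30 degrees to the right and 10 degrees up.
print(view_direction(math.radians(30), math.radians(10)))
```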
Headset 120 may further comprise a communications module 122 configured to facilitate communication between headset 120 and one or more external computing devices or systems. Communications module 122 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel. According to some embodiments, communications module 122 may facilitate communication between headset 120 and computing device 110. For example, communications module 122 may facilitate communication of calibration and settings data from computing device 110 to headset 120. Communications module 122 may additionally or alternatively facilitate communication of sensor and user performance data from headset 120 to computing device 110. According to some embodiments, communications module 122 may facilitate communication of sensor data from one or more of sensor device 130 and controller 190 to headset 120. In some embodiments, communications module 122 may additionally or alternatively facilitate communication of data from headset 120 to one or more of sensor device 130 and controller 190, which may include haptic feedback data in some embodiments.
According to some embodiments, communications module 122 may facilitate direct communication between headset 120 and other components of system 100 via one or more wired or wireless communications protocols. For example, communications module 122 may facilitate communication between headset 120 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols. According to some embodiments, headset 120 may communicate with one or more of computing device 110, sensor device 130 and controller 190 via Bluetooth Low Energy (BLE). According to some embodiments, communications module 122 may comprise a software module configured to communicate with proprietary protocols used by sensor device 130, to allow communications module 122 to extract prosthesis input signals generated by sensor device 130 and use these signals to simulate prosthetic devices in virtual reality via display 123.
In some embodiments, where direct communication between headset 120 and sensor device 130 is not possible, system 100 may comprise a dongle 135 to facilitate communication between headset 120 and sensor device 130. According to some embodiments, dongle 135 may be necessary due to communication limitations in sensor device 130. According to some embodiments, dongle 135 may communicate with sensor device 130 via a wired interface, and may communicate via a wired communication protocol. In some embodiments, dongle 135 may communicate with sensor device 130 via a proprietary communication protocol, which may be specific to sensor device 130. According to some embodiments, dongle 135 may be configured to communicate with headset 120 via a wireless communication protocol. Dongle 135 may be configured to communicate with headset 120 via BLE, for example. Dongle 135 may be configured to receive information such as sensor data from sensor device 130, and pass this to headset 120. Dongle 135 may also be configured to receive information from headset 120. Dongle 135 may receive haptic feedback data from headset 120 and pass this to sensor device 130. According to some embodiments, dongle 135 may additionally or alternatively receive configuration data from headset 120. The configuration data may relate to the type of sensor device 130 to which dongle 135 is to be connected, and may allow dongle 135 to select an appropriate communication protocol for communication with sensor device 130. According to some embodiments, the type of sensor device may be detected by dongle 135 automatically. According to some embodiments, the type of sensor device may be selected by a clinician when setting up headset 120 for the patient, via user I/O 113 of computing device 110, a user interface of clinician computing device 170 or headset 120, for example.
Figure 2 shows a schematic diagram of a sub-system 200 of system 100, showing a possible embodiment of dongle 135 in further detail. In the illustrated embodiment, dongle 135 comprises a wired communications module 205 in communication with sensor device 130. According to some embodiments, wired communications module 205 facilitates communication with sensor device 130 via a proprietary communication protocol, using a proprietary bus. For example, the bus may comprise three wires, which may be used for data, voltage (Vcc) and Ground in some embodiments. According to some embodiments, wired communications module 205 may comprise a UART to allow for serial communications with sensor device 130. According to some
embodiments, wired communications module 205 may facilitate an analogue communication protocol.
In some embodiments, wired communications module 205 may be specific to sensor device 130, and may require configuration based on the properties of sensor device 130. Sensor configuration and/or sensor selection information may be generated by headset 120, received by wireless communications module 215, and used by processor 210 to configure wired communications module 205 to adopt the appropriate configuration.
Dongle 135 further comprises a processor 210 for processing data received from wired communications module 205 and communicating this to a wireless communications module 215. According to some embodiments, processor 210 may comprise one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in a memory 240.
Memory 240 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types. For example, memory 240 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory. Memory 240 may be configured to store program code accessible and executable by processor 210. Memory 240 may also comprise one or more data files comprising data that is accessible to processor 210 for performing read and write functions. For example, memory 240 may comprise a data map 245, which may be used by processor 210 to map sensor data received from sensor device 130 via wired communication module 205 to pattern/power pairs or commands, which may be passed by wireless communications module 215 to headset 120 to be used for input commands by prosthetic services module 430, as described in further detail below with reference to Figure 4. Pattern/power pairs may correspond to data that describes both a combination of sensors being activated and a degree to which they are activated. Specifically, the received sensor data may have values corresponding to both a pattern or combination of
sensors activated in a sensor array, and to a power or strength at which the sensors in the pattern were activated. The pattern and power data may be mapped to prosthesis functions. For example, a particular pattern of sensors may correspond to a “fist” position being adopted by a prosthetic, and the power value will determine the degree to which the fist should be closed.
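By way of non-limiting illustration, the following Python sketch shows one possible way dongle 135 could apply data map 245 to convert a raw sensor frame into a pattern/power pair and an associated command for forwarding to headset 120; the frame layout, sync byte and command names are assumptions made for the example only and do not represent the proprietary protocol of any particular sensor device.

```python
# Illustrative sketch only: the frame layout and command names are assumptions,
# not the proprietary protocol of any particular sensor device.

# Hypothetical data map (cf. data map 245): pattern identifier -> prosthesis command.
DATA_MAP = {
    0b0001: "hand_open",
    0b0010: "hand_close",
    0b0101: "wrist_pronate",
    0b1010: "wrist_supinate",
}

def parse_frame(frame: bytes) -> tuple[int, float]:
    """Decode an assumed 3-byte frame: [sync byte, pattern, power 0-255]."""
    if len(frame) != 3 or frame[0] != 0xA5:          # 0xA5 is an assumed sync byte
        raise ValueError("malformed sensor frame")
    pattern = frame[1]
    power = frame[2] / 255.0                         # normalise power to 0.0-1.0
    return pattern, power

def to_command(frame: bytes) -> tuple[str, float]:
    """Map a raw frame to a (command, power) pair for forwarding over BLE."""
    pattern, power = parse_frame(frame)
    command = DATA_MAP.get(pattern, "no_op")         # unknown patterns are ignored
    return command, power

# Example: a frame encoding pattern 0b0010 at roughly 60% power.
print(to_command(bytes([0xA5, 0b0010, 153])))        # ('hand_close', 0.6)
```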
According to some embodiments, processor 210 and memory 240 may reside within a microcontroller. For example, processor 210 and memory 240 may reside within an Arduino® Nano BLE microcontroller in some embodiments.
In the illustrated embodiment, dongle 135 further comprises a wireless communications module 215 in communication with headset 120. Wireless communications module 215 facilitates communication with headset 120 via a wireless communication protocol. According to some embodiments, wireless communications module 215 comprises a Bluetooth module, and facilitates communication with headset 120 via BLE.
Dongle 135 may also comprise one or more of a bootloader 225, power source 230, and power management module 220. Bootloader 225 may be configured to allow for programming or re-programming of processor 210 and/or memory 240 via a programming port, which may be a USB port in some embodiments. For example, bootloader 225 may be caused to erase and/or write to memory 240 to amend stored program code for execution by processor 210. Power source 230 may comprise one or more of a battery, USB connector, power plug, or other power source, and may be configured to supply power to the remaining electronic components of dongle 135. Power management module 220 may be configured to receive power status information from power source 230, and to communicate this information to processor 210. Processor 210 may in turn communicate this information to headset 120 for display to the user.
Returning to Figure 1, system 100 further comprises at least one sensor device 130. According to some embodiments, sensor device 130 may be configured to be worn on the remaining portion of an amputated or missing limb of a user, and to generate signals based on the movement and use of the remaining portion of the limb. According to some embodiments, sensor device 130 may be configured to otherwise monitor the
movement and muscle activation of a limb of a user. According to some embodiments, sensor device 130 may be able to be worn on or otherwise monitor the movement and muscle activation of a non-amputated limb of a user.
Sensor device 130 comprises a communications module 131 configured to facilitate communication between sensor device 130 and one or more external computing devices or systems. Communications module 131 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel. According to some embodiments, communications module 131 may facilitate communication between sensor device 130 and headset 120, as described in further detail above. According to some embodiments, communications module 131 may facilitate communication between sensor device 130 and computing device 110.
According to some embodiments, communications module 131 may facilitate direct communication between sensor device 130 and other components of system 100 via one or more wired or wireless communications protocols. For example, communications module 131 may facilitate communication between sensor device 130 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols. In some embodiments, communications module 131 may facilitate direct communication between sensor device 130 and headset 120. For example, communications module 131 may facilitate communication of sensor data from sensor device 130 to headset 120. In some embodiments, communications module
131 may additionally or alternatively facilitate communication of data from headset 120 to sensor device 130. As described above, in some embodiments communication between headset 120 and sensor device 130 may be via dongle 135.
Sensor device 130 further comprises at least one motion sensor 132 and at least one human-prosthetic interface sensor 133. According to some embodiments, motion sensor
132 and human-prosthetic interface sensor 133 may be separate devices located in separate locations.
According to some embodiments, motion sensor 132 may comprise an inertial measurement unit (IMU), or other device that measures at least one of the specific
force, angular rate, and orientation of sensor device 130. According to some embodiments, motion sensor 132 may comprise one or more of an accelerometer, gyroscope, and magnetometer. In some embodiments, motion sensor 132 may comprise one or more cameras. Motion sensor 132 may provide data relating to the movement of a user’s limb through physical space.
According to some embodiments, motion sensor 132 may be a motion sensor component from an off-the-shelf or commercially available virtual reality controller. For example, motion sensor 132 may be a motion sensor component from an Android-based VR controller, such as an Oculus Quest 1 or Oculus Quest 2 controller from Oculus®; a VIVE Focus 3 controller from HTC®; or a Pico Neo 2 or Pico Neo 3 controller from Pico Interactive®, in some embodiments. According to some embodiments, and as illustrated in Figures 3A and 3B, sensor device 130 may comprise a controller 195 housing the motion sensor 132. Controller 195 may be substantially identical to controller 190, as described in further detail below.
Human-prosthetic interface sensor 133 may be a sensor configured to be interacted with by a wearer of sensor device 130 in order to initiate one or more actions, movements or articulations of a virtual prosthesis displayed by headset 120 within a virtual environment. Human-prosthetic interface sensor 133 may be configured to sense muscle activation in a limb or partial limb of a user. According to some embodiments, human-prosthetic interface sensor 133 may comprise a surface electromyography sensor, or other sensor that generates data in response to contracting and/or relaxation of one or more muscles of a wearer of sensor device 130 that are in contact with sensor device 130, proximate to sensor device 130, or which sensor device 130 is otherwise configured to monitor.
According to some embodiments, human-prosthetic interface sensor 133 may be a pattern recognition based sensor that generates data based on one or more combinations of actions performed by a wearer of sensor device 130. For example, human-prosthetic interface sensor 133 may comprise one or more commercial pattern-based surface electromyography sensors, such as one or more of the Ottobock MyoPlus, COAPT Complete Control, and i-Biomed Sense. According to some embodiments, human-prosthetic interface sensor 133 may be a button-based sensor, comprising one or more buttons activatable by a wearer of sensor device 130 by contracting and/or relaxing one or more of the muscles in contact with sensor device 130. According to some embodiments, human-prosthetic interface sensor 133 may comprise at least one of an inertial measurement unit (IMU), force sensitive resistor, cable driven interface, or other brain-machine or human-machine interface. According to some embodiments, human-prosthetic interface sensor 133 may generate data which can be used to determine which muscle or combination of muscles a user activated by way of contraction or relaxation. According to some embodiments, human-prosthetic interface sensor 133 may generate data which can be used to determine the degree to which one or more muscles of the user were contracted or relaxed.
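By way of non-limiting illustration, the Python sketch below shows one simple way a surface electromyography based human-prosthetic interface sensor could derive a pattern and power value from a window of muscle activity; the channel count, threshold and scaling are assumptions made for the example and are not specific to any of the commercial sensors named above.

```python
import numpy as np

# Assumed parameter for illustration: a channel counts as "active" when its
# RMS envelope exceeds 20% of that channel's calibrated maximum.
ACTIVATION_THRESHOLD = 0.2

def emg_pattern_and_power(window: np.ndarray, channel_max: np.ndarray):
    """Derive a (pattern, power) pair from a window of rectified EMG samples.

    window      -- array of shape (samples, channels)
    channel_max -- per-channel maximum voluntary contraction values
    """
    rms = np.sqrt(np.mean(window ** 2, axis=0))        # RMS envelope per channel
    normalised = rms / channel_max                     # 0.0-1.0 per channel
    active = normalised > ACTIVATION_THRESHOLD         # which muscles are "on"
    pattern = sum(1 << i for i, flag in enumerate(active) if flag)
    power = float(normalised[active].mean()) if active.any() else 0.0
    return pattern, min(power, 1.0)

# Example with synthetic data: channels 0 and 1 strongly active, 2 and 3 quiet.
rng = np.random.default_rng(0)
window = np.abs(rng.normal(0.0, [0.8, 0.6, 0.05, 0.05], size=(200, 4)))
print(emg_pattern_and_power(window, channel_max=np.ones(4)))
```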
According to some embodiments, sensor device 130 may comprise one or more additional user interface mechanisms (not shown), such as buttons, sliders, lights or motors for providing haptic feedback, for example.
In some embodiments, system 100 optionally further comprises a controller 190, which may be a motion-tracked controller configured to be held by a user in the hand of an able-bodied limb. Controller 190 comprises at least one communications module 191 and at least one motion sensor 192.
Communications module 191 may be configured to facilitate communication between controller 190 and one or more external computing devices or systems. Communications module 191 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel. According to some embodiments, communications module 191 may facilitate communication between controller 190 and headset 120. According to some embodiments, communications module 191 may facilitate communication between controller 190 and computing device 110. For example, communications module 191 may facilitate communication of sensor data from controller 190 to headset 120. In some embodiments, communications module 191 may additionally or alternatively facilitate communication of data from headset 120 to controller 190.
According to some embodiments, communications module 191 may facilitate direct communication between controller 190 and other components of system 100 via one or more wired or wireless communications protocols. For example, communications module 191 may facilitate communication between controller 190 and external computing devices or systems via Wi-Fi, Bluetooth, Ethernet, USB, or other communication protocols.
Motion sensor 192 of controller 190 may comprise an inertial measurement unit (IMU), or other device that measures at least one of the specific force, angular rate, and orientation of controller 190. According to some embodiments, motion sensor 192 may comprise one or more of an accelerometer, gyroscope, and magnetometer. Motion sensor 192 may provide data relating to the movement of controller 190 through physical space.
According to some embodiments, controller 190 may comprise one or more additional user interface mechanisms (not shown), such as buttons, sliders, lights or motors for providing haptic feedback, for example.
According to some embodiments, controller 190 may be an off-the-shelf or commercially available virtual reality controller. For example, controller 190 may be an Android-based VR controller, such as an Oculus Quest 1 or Oculus Quest 2 controller from Oculus®; a VIVE Focus 3 controller from HTC®; or a Pico Neo 2 or Pico Neo 3 controller from Pico Interactive®, in some embodiments.
System 100 may further comprise a rehabilitation management platform 150 in communication with computing device 110 via network 140. Rehabilitation management platform 150 may be configured to host data and/or program code for supporting the rehabilitation services provided by computing device 110. For example, rehabilitation management platform 150 may host a data analytics platform for processing data received from computing device 110 and calculating metrics about a user’s performance, as described in further detail below.
According to some embodiments, rehabilitation management platform 150 may be a serverless or cloud based platform. For example, rehabilitation management platform
150 may be a Firebase® or Google® cloud based platform in some embodiments. In some alternative embodiments, rehabilitation management platform 150 may comprise one or more computing devices and/or server devices, such as one or more servers, databases, and/or processing devices in communication over a network.
In the illustrated embodiment, rehabilitation management platform 150 comprises a processor 151 in communication with a memory 153. Processor 151 comprises one or more data processors for executing instructions, and may comprise one or more microprocessor based platforms, central processing units (CPUs), application specific instruction set processors (ASIPs), application specific integrated circuits (ASICs), suitable integrated circuits, or other processors capable of fetching and executing instruction code as stored in memory 153. Processor 151 may include an arithmetic logic unit (ALU) for mathematical and/or logical execution of instructions, such as operations performed on the data stored in internal registers of processor 151.
Memory 153 may comprise one or more memory storage locations, which may be volatile or non-volatile memory types. For example, memory 153 may comprise one or more of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM) or flash memory. Memory 153 is configured to store program code 154 accessible by the processor 151. Program code 154 may comprise a plurality of executable program code modules executable by processor 151 to cause processor 151 to perform functions as described in further detail below. Memory 153 may also comprise data 155 that is accessible to processor 151 for performing read and write functions.
Rehabilitation management platform 150 may further comprise a communications module 152 configured to facilitate communication between rehabilitation management platform 150 and one or more external computing devices via one or more networks. Communications module 152 may comprise a combination of network interface hardware and network interface software suitable for establishing, maintaining and facilitating communication over a relevant communication channel. According to some embodiments, communications module 152 may facilitate communication between rehabilitation management platform 150 and other devices within system 100 via network 140. For example, communications module 152 may facilitate communication
between rehabilitation management platform 150 and computing device 110 via network 140.
Rehabilitation management platform 150 may also be in communication with other computing devices. For example, in the illustrated embodiment, rehabilitation management platform 150 is in communication with user computing device 160 and clinician computing device 170 via network 140. User computing device 160 and clinician computing device 170 may each comprise one or more of a laptop, desktop, smart phone, or other computing device. According to some embodiments, user computing device 160 and clinician computing device 170 may each comprise a processor, memory, user I/O and communications module which may be similar to processor 111, memory 114, user I/O 113 and communications module 112 as described above with reference to computing device 110.
According to some embodiments, user computing device 160 may be operated by a patient undergoing rehabilitation via system 100, and may be the user’s personal device on which they can set up user profile information, select preferences, and view their progress. According to some embodiments, these functionalities may be accessible via a patient companion application residing in memory of user computing device 160 and executable by a processor of user computing device 160 to be displayed on a user interface of user computing device 160. In some alternative embodiments, these functionalities may be accessible via a web based application, or other means.
Clinician computing device 170 may be operated by a clinician facilitating rehabilitation of a patient via system 100, and may be located in a clinic environment in some cases. Clinician computing device 170 may be configured to allow clinicians to enter information and settings for their patient, and to view insights and analytics relating to the patient’s rehabilitation progress. According to some embodiments, these functionalities may be accessible via a dynamic web application hosted on a platform such as the rehabilitation management platform, and accessible via a browser application executable by a processor of clinician computing device 170 to be displayed on a user interface of clinician computing device 170. In some alternative embodiments, these functionalities may be accessible via a native application residing in memory of clinician computing device 170, or other means.
Figure 3A shows a subsystem 300 of system 100 in position on a user 310 with an upper limb amputation. Subsystem 300 comprises headset 120, sensor device 130, and controller 190. As illustrated, headset 120 is positioned on a head 312 of user 310 via a strap 320, while a display portion 322 comprising display 123 is configured to be positioned on a face of user 310 in proximity to the eyes of user 310, so that user 310 can view display 123 while headset 120 is being worn.
In the illustrated embodiment, sensor device 130 is configured to be worn on an upper limb 314 of a user. Limb 314 may be a partially amputated or otherwise disabled limb. According to some embodiments, sensor device 130 may be sized to encircle the limb 314, and to be retained in position via an elastic or friction fit with the girth of limb 314. In the illustrated embodiment, sensor device 130 comprises a controller 195, as described above.
Controller 190 is configured to be held in a hand of an upper limb 316 of user 310, where limb 316 is an able-bodied limb.
Figure 3B shows a subsystem 350 of system 100 in position on a user 310 with a lower limb amputation. Subsystem 350 comprises headset 120 and sensor device 130. As in Figure 3A, headset 120 is positioned on a head 312 of user 310 via a strap 320, while a display portion 322 comprising display 123 is configured to be positioned on a face of user 310 in proximity to the eyes of user 310, so that user 310 can view display 123 while headset 120 is being worn.
In the illustrated embodiment, sensor device 130 is configured to be worn on a lower limb 354 of a user. Limb 354 may be a partially amputated or otherwise disabled limb. According to some embodiments, sensor device 130 may be sized to encircle the limb 354, and to be retained in position via an elastic or friction fit with the girth of limb 354. In the illustrated embodiment, sensor device 130 comprises a controller 195, as described above.
Figure 4 shows a schematic diagram of the software components of system 100 that provide a virtual reality platform 400. While some software components of platform 400 may be described as residing on a particular hardware component of system 100,
any of the software components may reside on and/or be accessible to any of the hardware elements of system 100.
According to some embodiments, virtual reality platform 400 may be configured to provide a virtual reality environment in which a user can train in a number of basic and functional skills required to operate a prosthesis.
Basic skills may be skills which target rehabilitation and strengthening of the neurology and muscles of the residual limb, and may be used in the operation of the prosthesis. Basic skills may include causing the muscles of the patient or user to perform certain actions, which may include movement of a limb and/or muscle activation. Where a user is an amputee, basic skills may include skills that relate to movement of a residual limb, and muscle activation in the residual limb. For example, basic skills may include activating the muscles in a residual arm limb that would cause a hand to make a fist gesture if the limb were not amputated, where that muscle activation can be used to cause a prosthetic hand to close. Other basic skills may include prosthesis input signal modulation (e.g., EMG); prosthesis input pattern activation (e.g., EMG fist pattern); arm range of motion; prosthetic hand closure modulation; prosthetic joint rotation modulation (e.g., wrist and elbow); and prosthetic device switching.
Functional skills may comprise a combination of basic skills, and target the ability of users to use the basic skills to operate a physical prosthesis for activities of daily living, such as picking up an object with the prosthesis without compensating with their trunk. Other functional skills may include pick and place; bi-manual manipulation; single hand manipulation (prosthesis); tool use; utensil use; and use of prosthetic grasp types. Functional skills may comprise one or more basic skills performed sequentially and/or simultaneously. For example, tool use may comprise basic skills such as movement of an arm, joint rotation, and hand closure, in some cases.
Virtual reality platform 400 may be configured to provide activities in the form of games or activities which target a set of these basic and/or functional skills. For example, games may target skills such as the ability to pick up and place objects; bimanual manipulation of objects; muscle activation and modulation of wearable sensors
such as EMG, and other skills involving the use of a prosthesis. The activities may be presented in the form of activities or games that require the user to use the skill in order to achieve game objectives. For example, a game for developing pick and place abilities may be presented in a barbeque setting (as shown in Figures 19C, 19D and 21C, described in further detail below), where a user must use a virtual prosthesis to pick up and place various virtual foods on a virtual barbeque in order to achieve game objectives and make progress in the game. A game for developing bimanual abilities may be an assembly game, where a user must use a virtual prosthesis to assemble various virtual objects into a virtual structure in order to achieve game objectives and make progress in the game. For example, the game may be presented in a burger restaurant setting (as shown in Figures 18A and 21B, described in further detail below), where a user must use a virtual prosthesis to pick up and assemble various burger components in order to achieve game objectives and make progress in the game. A game for developing EMG modulation abilities may be a painting game, where a user must use a virtual prosthesis to virtually paint on a virtual canvas in order to achieve game objectives and make progress in the game.
According to some embodiments, virtual reality platform 400 may be configured to generate assessments of a user’s competency in manipulating a virtual prosthesis. This assessment may be based on a user’s performance of the activities presented by virtual reality platform 400. In some embodiments, the assessment may be based on data relating to the movement and/or muscle activation exhibited by the user during their performance of the activities presented by virtual reality platform 400.
According to some embodiments, assessments may be based on comparing user performance data with comparative data. The comparative data may be historical performance data collected from previous users, as described in further detail below. In some embodiments, the assessments conducted by virtual reality platform 400 may be based on known assessments of prosthesis use which may be used by clinicians when assessing a user’s ability to manipulate a physical prosthesis. For example, assessments may be based on known techniques such as arm prosthesis race challenges (such as those conducted by Cybathlon®), the Box and Blocks test, and/or the Jebsen-Taylor hand function test, in some embodiments.
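By way of non-limiting illustration, an assessment of this kind could be sketched as follows, where a session metric is compared against historical performance data; the metric names, reference values and percentile-style comparison are assumptions made purely for the example.

```python
import statistics

# Hypothetical historical completion times (seconds) for a pick-and-place task,
# used purely to illustrate a percentile-style comparison.
HISTORICAL_TIMES = [42.0, 38.5, 51.2, 47.9, 44.3, 60.1, 39.8, 55.0]

def percentile_rank(value: float, reference: list) -> float:
    """Fraction of reference sessions this session matched or beat (lower time is better)."""
    return sum(1 for r in reference if value <= r) / len(reference)

def assess_session(completion_time: float, drops: int) -> dict:
    """Combine a timing percentile with an error count into a simple summary."""
    return {
        "time_percentile": round(percentile_rank(completion_time, HISTORICAL_TIMES), 2),
        "median_reference_time": statistics.median(HISTORICAL_TIMES),
        "objects_dropped": drops,
    }

print(assess_session(completion_time=45.0, drops=1))
```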
Virtual reality platform 400 may comprise a platform management module 410. Platform management module 410 may contain executable code which, when executed, manages the behaviour of the rehabilitation platform provided by system 100. Platform management module 410 may manage the high-level interactions and configurations of the platform, based on system and user configuration data 435. According to some embodiments, aspects of system and user configuration data 435 may be generated by a user via a user computing device 160. According to some embodiments, aspects of system and user configuration data 435 may be generated by a clinician via a clinician computing device 170. System and user configuration data 435 may include system data relating to the hardware being used, such as data relating to the form and configuration of headset 120, sensor device 130 and/or controller 190. System and user configuration data 435 may include data relating to a user of system 100, which may include a user’s name, age, physical characteristics such as height and weight, and details relating to a user’s limb loss or limb difference, such as the side of their body affected and the degree of the amputation or disability.
According to some embodiments, system and user configuration data 435 may include data relating to a prosthesis configuration. The prosthesis configuration data may relate to a physical prosthesis that a user of system 100 is or will be fitted with, and system 100 may use the prosthesis configuration data to generate a virtual prosthesis having characteristics replicating those of the physical prosthesis. The prosthesis configuration data may be dependent on a type of amputation, as well as a clinician’s assessment of the user. The prosthesis configuration data may include data relating to one or more components of a prosthesis, which may include a terminal device such as a prosthetic hand, a prosthetic wrist, and/or a prosthetic elbow. For a user with a transradial amputation, the virtual prosthesis may include a terminal device, and a wrist. A user with a transradial amputation may use a prosthetic that does not include an elbow. The terminal device may be a motorised hand with one degree of freedom, and the wrist may be a motorised wrist with one degree of freedom to perform pronation and supination, for example.
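By way of non-limiting illustration, prosthesis configuration data of this kind could be represented as in the following Python sketch; the field names and the transradial example values are assumptions made for the example only.

```python
from dataclasses import dataclass, field

@dataclass
class JointConfig:
    name: str                  # e.g. "wrist" or "elbow"
    degrees_of_freedom: int
    motorised: bool

@dataclass
class ProsthesisConfig:
    side: str                                   # "left" or "right"
    amputation_level: str                       # e.g. "transradial"
    terminal_device: str                        # e.g. "motorised_hand_1dof"
    joints: list = field(default_factory=list)  # wrist, elbow, ... as fitted

# Example configuration for a transradial user: a terminal device plus a
# motorised wrist (pronation/supination), with no elbow component.
transradial = ProsthesisConfig(
    side="right",
    amputation_level="transradial",
    terminal_device="motorised_hand_1dof",
    joints=[JointConfig(name="wrist", degrees_of_freedom=1, motorised=True)],
)
print(transradial)
```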
Platform management module 410 may be configured to process system and user configuration data 435, and to communicate the processed data to one or more further modules.
According to some embodiments, platform management module 410 may be configured to derive sensor configuration data 405 from the system and user configuration data 435, and to pass the sensor configuration data 405 to a hardware services module 420. As described below in further detail with reference to Figure 5, hardware services module 420 may be configured to handle communications with hardware such as sensor device 130, such as by using the sensor configuration data 405 to map signals from sensor device 130 to virtual prosthesis commands. During use of sensor device 130 by a user, hardware services module 420 may generate sensor data 445 based on the sensor configuration data 405 and the signals received from sensor device 130, and pass this to a prosthesis services module 430. According to some embodiments, prosthesis services module 430 may also pass data to hardware services module 420, which may include haptic feedback data in some cases, which may be generated based on interaction between a virtual prosthesis and other virtual objects presented to a user in a virtual environment displayed via headset 120. The haptic feedback data may comprise data relating to a vibration amplitude and frequency to be delivered to the user by the hardware device, which may be sensor device 130 in some embodiments. Hardware services module 420 may further generate sensor usage data 465 and pass this to a data services module 450. Sensor usage data 465 may comprise data relating to the use of sensor device 130, such as data relating to the sensed movement of sensor device 130 in physical space, and the sensed activation of muscles of a user using sensor device 130.
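By way of non-limiting illustration, the haptic feedback data described above could be derived as in the following Python sketch, in which a vibration amplitude and frequency are computed from the strength of a virtual contact; the scaling constants and the slip flag are assumptions made for the example only.

```python
def haptic_feedback(contact_force: float, grasp_slipping: bool = False) -> dict:
    """Derive illustrative haptic feedback data (vibration amplitude and
    frequency) from a virtual contact between the prosthesis and an object.

    contact_force  -- normalised 0.0-1.0 estimate of contact intensity
    grasp_slipping -- assumed flag indicating the grasped object is slipping
    """
    amplitude = max(0.0, min(contact_force, 1.0))      # clamp to device range
    frequency_hz = 80.0 + 120.0 * amplitude            # assumed 80-200 Hz band
    if grasp_slipping:
        frequency_hz = 250.0                           # distinct cue for slip
    return {"amplitude": amplitude, "frequency_hz": frequency_hz}

print(haptic_feedback(0.4))          # light contact
print(haptic_feedback(0.9, True))    # firm grasp that has started to slip
```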
According to some embodiments, platform management module 410 may be configured to derive game configuration data 425 from the system and user configuration data 435, and to pass the game configuration data 425 to a game services module 440. Game configuration data 425 may comprise data relating to the presentation of games by system 100, such as the types of games and difficulty level of games to present, for example. As described below in further detail with reference to Figure 6, game services module 440 may be configured to manage virtual environments
to be presented by headset 120, including handling activity and game flow within the virtual environments, handling behaviour of the virtual environments, and handling game asset behaviour. During use of sensor device 130, game services module 440 may receive data relating to a state of a corresponding virtual prosthesis from prosthesis services module 430, and generate data relating to user activity within the virtual reality environment, such as interaction of the virtual prosthesis with virtual objects. Game services module 440 may use this along with the game configuration data 425 received from platform management module 410 to generate interaction event data 455. Interaction event data 455 may include configuration data related to the virtual prosthesis, where such configuration is dependent on game logic. Interaction event data 455 may then be passed to prosthetic services module 430. Game services module 440 may also generate activity and interaction event data 485, and pass this to data services module 450. Activity and interaction event data 485 may include data relating to a user’s activity within the virtual environment, and data relating to interactions between the virtual prosthesis and any virtual objects.
Platform management module 410 may further be configured to derive prosthesis configuration data 415 from the system and user configuration data 435, and to pass the prosthesis configuration data 415 to the prosthesis services module 430. During use of sensor device 130, prosthesis services module 430 may use the sensor data 445, prosthesis configuration data 415, and interaction event data 455 to simulate prosthesis function within the virtual environment provided by system 100 by providing movement and activation of the virtual prosthesis based on movement and activation sensed by sensor device 130. For example, prosthesis services module 430 may use the sensor data 445, prosthesis configuration data 415, and interaction event data 455 to map sensor data 445 to virtual prosthesis function, which may then be presented to a user within a virtual environment via headset 120. Prosthesis usage data 475 may be generated by prosthesis services module 430 and passed to data services module 450. Prosthesis usage data 475 may include data relating to the manner in which the virtual prosthesis was used, including the movement of the virtual prosthesis in space and in relation to any virtual objects.
Data services module 450 may be configured to receive raw data from the virtual environment presented by system 100, and to pass this to rehabilitation management platform 150 for analysis. According to some embodiments, data services module 450 may buffer data gathered during an activity presented via virtual reality platform 400, and may then synchronise the data with rehabilitation management platform 150 once the activity is completed, or periodically. In some alternative embodiments, the data may be sent to rehabilitation management platform 150 in real time. The raw data may include sensor usage data 465, prosthesis usage data 475, and activity and interaction event data 485, for example. This data may be captured through an event based system in some embodiments, as described below with reference to Figure 7.
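By way of non-limiting illustration, the buffering and synchronisation behaviour described above could be sketched in Python as follows; the upload transport is left as a placeholder, since the actual interface to rehabilitation management platform 150 is not specified here.

```python
import json
import time

class EventBuffer:
    """Buffer raw usage and interaction events during an activity and
    synchronise them when the activity completes. The upload function is a
    placeholder for whatever transport the platform actually uses."""

    def __init__(self, upload):
        self._upload = upload       # callable taking a JSON string
        self._events = []

    def log(self, event_type: str, payload: dict) -> None:
        self._events.append({
            "type": event_type,
            "timestamp": time.time(),
            "data": payload,
        })

    def sync(self) -> None:
        """Flush buffered events, e.g. at the end of an activity."""
        if self._events:
            self._upload(json.dumps(self._events))
            self._events = []

# Example with a stand-in transport that simply prints the batch.
buffer = EventBuffer(upload=print)
buffer.log("object_grasped", {"object": "burger_bun", "prosthesis": "hand_close"})
buffer.log("activity_complete", {"score": 12})
buffer.sync()
```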
Figure 5 shows a sub-system 500 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as hardware services module 420, in further detail.
Sub-system 500 includes sensor device 130 including human-prosthetic interface sensor 133. As a wearer interacts with sensor 133, such as by engaging muscles in the limb on which sensor device 130 is being worn and/or by moving the limb on which sensor device 130 is being worn through physical space, sensor 133 generates sensor data, which may include pattern and power command data 510. Pattern and power command data 510 is communicated to virtual reality platform 400. This may be by way of communications module 131 sending pattern and power command data 510 to communications module 122 of headset 120, for example. Pattern and power command data 510 is received by hardware services module 420.
Hardware services module 420 may comprise a hardware communication module 520 and a sensor manager module 530. Pattern and power command data 510 is received by hardware communication module 520, which handles communication with sensor device 130. According to some embodiments, hardware services module 420 may comprise a native communications module which may be custom configured for each sensor device 130. Hardware communication module 520 processes the received pattern and power command data 510 based on sensor configuration data 550 received from sensor manager module 530, to generate processed pattern and power command data 540, which is passed back to sensor manager module 530. According to some embodiments,
the processing may comprise converting the data from a format received from sensor device 130 into a format usable by virtual reality platform 400.
Sensor manager module 530 receives pattern and power command data 540, and also receives sensor configuration data 405 from platform management module 410. Sensor manager module 530 uses the sensor configuration data 405 to generate sensor configuration data 550, which is passed to hardware communication module 520. This may be done during a configuration process, which may only occur during a setup procedure. Sensor manager module 530 further uses the sensor configuration data 405 to process the received pattern and power commands 540 to generate sensor data 445, which is passed to prosthesis services module 430. This may be done during use of sensor device 130. Sensor manager module 530 may generate sensor data 445 by wrapping the pattern and power command data 540, and mapping this data to particular prosthetic functions. According to some embodiments, this may be done with reference to a sensor command dictionary, which may be specific to each sensor device 130.
Prosthetic services module 430 receives the sensor data 445, along with prosthesis configuration data 415. Prosthetic services module 430 may be configured to facilitate the simulation of different types of prostheses, and to facilitate the dynamic mapping of sensor functions as detected by sensor device 130 to virtual prosthesis functions as displayed to the user via headset 120. Prosthetic services module 430 may comprise a prosthetic manager module 560, which may manage the state of the virtual prosthesis, for example. Based on sensor data 445 and prosthesis configuration data 415, prosthetic manager module 560 may manage the virtual position and motion of the virtual prosthesis as displayed to the user via headset 120. Prosthetic services module 430 and prosthetic manager module 560 are further illustrated in Figure 6.
Figure 6 shows a sub-system 600 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as prosthetic services module 430, in further detail.
Subsystem 600 includes hardware services module 420, comprising sensor manager module 530, as described above. During use of sensor device 130, sensor manager module 530 generates sensor data 445, which may comprise sensor usage data 605 and sensor state data 615, in some embodiments. In the illustrated embodiment, sensor state
data 615 is passed to prosthesis manager module 560 of prosthesis services module 430, while sensor usage data 605 is passed to an event logger module 640 of data services module 450.
Prosthesis manager module 560 receives sensor state data 615, which may comprise information about the state of one or more human-prosthetic interface sensors 133. Prosthesis manager module 560 derives pattern and power data 645 from the sensor state data 615, and uses this to look up prosthesis function data 655 based on a human-prosthetic interface (HPI) map database 630. HPI map database 630 may be specific to a selected virtual prosthesis, and may define the relationships between sensor data generated by sensor 133 and the prosthesis function that should result from that sensor data. For example, Table 1 as shown below illustrates an example mapping between particular patterns or combinations of sensors 133 and corresponding prosthesis actions.
Table 1: example mapping of sensor patterns to prosthesis actions
Of course, the table shown above is an example only, and the mapping data stored in HPI map database 630 will be dependent on the quantity and type of sensors 133 used, the user’s capabilities and limitations, the user’s goals, and a clinician’s assessment of the user. According to some embodiments, the HPI map database 630 may be constructed based on data received from a clinician, which the clinician may enter via clinician computing device 170. In some embodiments, HPI map database 630 may be constructed automatically by system 100. In some embodiments, the automated construction of HPI map database 630 may be facilitated by a machine learning
algorithm, which may present recommendations for HPI mapping based on user goals, as well as assessments and insights of user behaviour via system 100.
When a user activates the predetermined combination of sensors on sensor device 130 by engaging one or more particular muscles, that sensor data 445 is received by prosthesis manager module 560 and converted to a virtual prosthesis action by referencing HPI map database 630 to derive the corresponding prosthesis function data 655.
Once prosthesis manager module 560 retrieves the prosthesis function data 655 relating to the pattern and power data 645, it may process the prosthesis function data 655 to derive at least one of joint state data 625 and terminal state data 665. Where the selected virtual prosthesis comprises a joint such as an elbow, joint state data 625 is derived to determine the state of the virtual joint required to perform the virtual prosthesis function defined by prosthesis function data 655. For example, joint state data 625 may define an angle which the virtual joint should adopt, in some embodiments. Joint state data 625 is communicated to prosthesis joint module 610, which may be configured to handle the behaviour of the virtual prosthetic joint within the virtual environment presented by system 100. Prosthesis joint module 610 may pass joint usage data 635 to event logger module 640.
Where the selected virtual prosthesis comprises an actuatable terminal device, such as a mechanical hand or grasping mechanism, terminal state data 665 is derived to determine the state of the virtual terminal device required to perform the virtual prosthesis function defined by prosthesis function data 655. For example, terminal state data 665 may define a position which the virtual terminal device should adopt, which may include an angle of rotation of the virtual terminal device and/or the state of a virtual grasping mechanism, for example. Terminal state data 665 is communicated to terminal device module 620, which may be configured to handle the behaviour of the virtual terminal device, such as the animation of a virtual prosthetic hand, within the virtual environment presented by system 100. Terminal device module 620 may pass terminal usage data 675 to event logger module 640.
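By way of non-limiting illustration, the derivation of joint and terminal device states from prosthesis function data could be sketched in Python as follows; the function names, ranges and proportional step size are assumptions made for the example only.

```python
# Assumed ranges for illustration: a 1-DOF hand aperture of 0-100 % and a
# wrist rotation range of -90 to +90 degrees.
HAND_RANGE = (0.0, 100.0)
WRIST_RANGE = (-90.0, 90.0)

def derive_states(function: str, power: float, current: dict) -> dict:
    """Map prosthesis function data to joint and terminal device state targets.

    function -- e.g. "hand_close", "hand_open", "wrist_pronate" (assumed names)
    power    -- 0.0-1.0 modulation value from the pattern/power pair
    current  -- current state, e.g. {"aperture": 40.0, "wrist_angle": 0.0}
    """
    state = dict(current)
    step = 5.0 * power                         # assumed per-update step size
    if function == "hand_close":
        state["aperture"] = max(HAND_RANGE[0], current["aperture"] - step)
    elif function == "hand_open":
        state["aperture"] = min(HAND_RANGE[1], current["aperture"] + step)
    elif function == "wrist_pronate":
        state["wrist_angle"] = max(WRIST_RANGE[0], current["wrist_angle"] - step)
    elif function == "wrist_supinate":
        state["wrist_angle"] = min(WRIST_RANGE[1], current["wrist_angle"] + step)
    return state

print(derive_states("hand_close", power=0.8,
                    current={"aperture": 40.0, "wrist_angle": 0.0}))
```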
Event logger module 640 may receive data including sensor usage data 605, terminal usage data 675 and joint usage data 635. Event logger module 640 may use the received data to handle event-driven data logging capabilities within the virtual environment presented by system 100, such as during activities or games presented to a user via headset 120.
Figure 7 shows a sub-system 700 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as data services module 450, in further detail.
Subsystem 700 includes data services module 450 and event logger module 640 as described above. Data services module 450 may be configured to handle data gathering from the virtual environment presented by system 100, and from the user of system 100. For example, as described above with reference to Figure 6, data services module 450 may receive data from other modules of system 100, which may include sensor data 445. Data services module 450 may further be configured to perform database management and syncing of data across databases.
Data services module 450 may comprise an event logger module 640, as described above. Data services module 450 may further comprise one or more of an event system 730, triggered game object module 740, and tracked game object module 750.
Triggered game object module 740 may relate to one or more in-game objects that are configured to trigger measurement events. The measurement event may result in one or more parameters being measured. For example, the in-game objects may include objects that a user can interact with by operating a virtual prosthetic within the game. Where such an in-game object is picked up, put down, moved, touched, knocked, or otherwise interacted with, this may trigger a measurement event. The measurement event may result in measurement and logging of one or more parameters, such as a user’s pose, or a position of the virtual prosthesis. The triggering of a measurement event may cause triggered game object module 740 to send event data 715 corresponding to the triggered event to an event system 730. Event system 730 may be a Unity event system running on the Unity game engine by Unity Technologies®, for example. Event data 715 may comprise event invoke data, indicating that a triggering event was invoked, as well as event data relating to the type of event that was triggered.
Upon receipt of event data 715, event system 730 may be configured to invoke a function call to event logger module 640, to cause event logger module 640 to store the event data 715 within an event log 720, as described in further detail below.
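As a minimal sketch of the event-driven chain described above, the following Python example shows a triggered game object raising an event, an event system invoking the logger, and the logger appending an entry to an in-memory log. The class and field names mirror the modules described above but are illustrative assumptions, not the Unity implementation.

```python
# Sketch of an event-driven logging chain: a triggered game object raises an
# event, the event system invokes the logger, and the logger appends an entry
# to the event log. Field names are illustrative assumptions.
import time

class EventLogger:
    def __init__(self):
        self.event_log = []  # in-memory stand-in for event log 720

    def log(self, event_data):
        self.event_log.append(event_data)

class EventSystem:
    def __init__(self, logger):
        self.logger = logger

    def invoke(self, event_data):
        # On receipt of event data, call through to the event logger.
        self.logger.log(event_data)

class TriggeredGameObject:
    def __init__(self, object_id, event_system):
        self.object_id = object_id
        self.event_system = event_system

    def on_interaction(self, event_type, avatar_pose):
        # Interaction (pick up, touch, knock, ...) triggers a measurement event.
        self.event_system.invoke({
            "timestamp": time.time(),
            "event_type": event_type,
            "object_id": self.object_id,
            "avatar_pose": avatar_pose,  # e.g. head/hand positions at the time
        })

logger = EventLogger()
block = TriggeredGameObject("block_01", EventSystem(logger))
block.on_interaction("grasp", {"head": (0.0, 1.6, 0.0), "right_hand": (0.3, 1.1, 0.4)})
print(len(logger.event_log))  # -> 1
```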
Tracked game object module 750 may relate to one or more virtual in-game objects that are tracked within the virtual environment presented by system 100 for data logging purposes. According to some embodiments, the virtual objects that are tracked may include one or more of a user avatar’s head, hand, or prosthesis, for example. The movement of the tracked virtual objects may be based on real-world movement by the user that is translated into sensor data by one or more of headset 120, sensor device 130 and controller 190. Event logger module 640 may be configured to get object data from tracked game object module 750, and tracked game object module 750 may be caused to pass object data 725 to event logger module 640 based on a request for such data, periodically, or in real-time as such data is generated.
According to some embodiments, data gathering by event logger module 640 may be done through an event-driven system. Data capture or measurement may be triggered by an in-game event, such as a user grasping or dropping a virtual object using a virtual prosthesis worn by their avatar; activating the virtual prosthesis; hitting a target; or finishing a game session, for example. Other events may include actions by the user, prosthetic function events, in-game logic, or other in-game actions. Event logger module 640 may be configured to store the received event data in one or more databases. For example, event logger module 640 may be configured to store event data in an event log 720. According to some embodiments, each event logger module 640 may have access to a single event log 720 for storing event data. Each entry in event log 720 may include data such as an event log type, the date and time of the event or activity, and event data 710 relating to the event. Examples of event log types are described in Table 2, below.
Table 2: example event log types stored in event log 720
According to some embodiments, each entry in event log 720 may be stored with associated event data 710. The event data 710 is captured and structured by event logger module 640 based on an event type, as described below with reference to Table 3. As indicated in the “Value” column of Table 3, different events may result in different types of data being captured. The captured information may be structured in JavaScript Object Notation (JSON) in some embodiments. The structured data may be stored in a list within event data 710. Event data 710 may comprise one or more of a unique event identifier, a timestamp corresponding to when the event took place, an event type, and one or more values corresponding to the event. Examples of event types and their corresponding values are listed in Table 3, below.
Table 3: example event types stored in event data 710
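For illustration only, the following sketch shows the general shape a single JSON-structured entry in event data 710 might take. The field names and the example event type and values are assumptions consistent with the text above, not the exact schema of Table 3.

```python
# Illustrative JSON structure for a single entry in event data 710. The field
# names and the example event type are assumptions, not the exact schema.
import json
import uuid
from datetime import datetime, timezone

entry = {
    "id": str(uuid.uuid4()),                              # unique event identifier
    "timestamp": datetime.now(timezone.utc).isoformat(),  # when the event took place
    "event_type": "object_grasped",                       # type of event
    "values": [                                           # one or more values for the event
        {"object_id": "block_01"},
        {"grasp_side": "right"},
        {"prosthesis_position": [0.31, 1.12, 0.44]},
    ],
}

print(json.dumps(entry, indent=2))
```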
According to some embodiments, event data 710 may be retrieved at the end of a game or session and transferred to a server or cloud. According to some embodiments, event log data 735 retrieved from event data 710 by event logger module 640 may be passed to game manager module 795 of game services module 440 for further processing, and to generate corresponding game data 745. Game manager module 795 may be configured to handle the state and flow of games presented to a user via system 100. For example, game manager module 795 may be configured to determine when to present a user with a tutorial, to manage the user’s level and move them through the levels when appropriate, and to manage interaction with data services module 450 and platform management module 410. New game data 745 generated by game manager module 795 may be passed to an API manager module 760, which may form part of rehabilitation management platform 150 in some embodiments.
API manager module 760 may be configured to handle interactions between system 100 and an API used by a real-time database platform such as Firebase®, which may form rehabilitation management platform 150 in some embodiments. According to some embodiments, the real-time database platform may be configured to handle both offline or local data synchronisation and online or cloud based synchronisation automatically. As well as receiving new game data 745 from game manager module
795, API manager module 760 may receive system and user configuration data 435 from platform management module 410, and may store this data in a data store 770 of rehabilitation management platform 150, which may reside in memory 153 in some embodiments. According to some embodiments, data store 770 may handle persistent database management, data logging and transfer.
API manager module 760 may also communicate with one or more of an analytics module 780 and an authentication module 790. Analytics module 780 may be configured to handle behavioural data collection for analytics purposes, which may include data relating to a user’s actions and performance within a virtual environment presented by system 100. Authentication module 790 may handle user authentication, which may include authentication of direct users of system 100 as well as clinicians monitoring patient use of system 100.
The data structures used by data store 770 are described in further detail below with reference to Figure 9.
Figure 8 shows a sub-system 800 of the system of Figure 4, illustrating aspects of virtual reality platform 400, such as game services module 440, in further detail.
Subsystem 800 includes platform management module 410 and game services module 440 having a game manager module 795, as described above. Game services module 440 further comprises one or more of a tutorial manager module 810, score manager module 820 and level manager module 830 in communication with game manager module 795.
Tutorial manager module 810 may be configured to facilitate the presentation of ingame tutorials to a user. For example, tutorial manager module 810 may handle the logic of individual game tutorials, according to some embodiments. Score manager 820 may be configured to manage one or more game scores accrued by a user during play of one or more games. Managing a score may include determining when a score should be incremented or decremented, and an amount by which to increment or decrement the score, which may be based on one or more events that occur during game play. According to some embodiments, a user may have multiple separately stored scores
across multiple games within system 100. In some embodiments, a user may have a single score across a plurality of games they access via system 100. Level manager module 830 may be configured to manage one or more levels of a user within system 100. For example, level manager module 830 may be configured to handle the logic of specific levels of a game, such as determining when a user’s level should be incremented, and what settings need to be adjusted as a user changes levels. According to some embodiments, a user may have multiple separately stored levels across multiple games within system 100. In some embodiments, a user may have a single level across a plurality of games they access via system 100.
Figure 9 shows a sub-system 900, illustrating aspects of rehabilitation management platform 150 in further detail. In particular, sub-system 900 shows the data analytics components of rehabilitation management platform 150, which may be configured to handle data storage, data processing, data analysis and insights based on data generated by system 100 and particularly virtual reality platform 400.
Subsystem 900 includes the virtual reality platform 400 in communication with data store 770, analytics module 780 and authentication module 790 of rehabilitation management platform 150, as described above with reference to Figure 7. Subsystem 900 also shows further components of rehabilitation management platform 150, such as real-time engine 910. Real-time engine 910 may be a real-time game engine, such as the Unity game engine by Unity Technologies®, for example. Real-time engine 910 may be in communication with data store 770, analytics module 780 and authentication module 790. Real-time engine 910 may further be in communication with at least one of a user authentication platform 960, and a real-time database 920.
User authentication platform 960 may be a cloud based platform in some embodiments. According to some embodiments, user authentication platform 960 may be an identity and access management (IAM) platform. User authentication platform 960 may allow for a user using a user companion application 965 to log in to virtual reality platform 400. According to some embodiments, user companion application 965 may reside in memory on user computing device 160, and may allow a user of user computing device 160 to set user preferences and access analytics generated by virtual reality platform 400.
Real-time engine 910 may be configured to send data for storage in real-time database 920. Real-time database 920 may be a NoSQL real-time database in some embodiments, and may be a Firestore® database, for example. According to some embodiments, real-time database 920 may be stored on at least one of headset 120 and rehabilitation management platform 150. According to some embodiments, real-time database 920 may be mirrored so as to reside on both of headset 120 and rehabilitation management platform 150. Synchronisation of real-time database 920 may be handled automatically via a cloud hosting service, such as the Firebase® service, for example, and may include online and offline synchronisation methods.
Real-time database 920 may be a database configured for storing activity raw data generated during use of virtual reality platform 400; user information; and hardware configurations, such as configurations relating to headset 120, sensor device 130 and controller 190. Real-time database 920 may receive user data from user companion application 965, in some embodiments. The data structures stored in real-time database 920 are described in further detail below with reference to Figure 10.
Real-time database 920 may be accessible to a raw data processing module 930. According to some embodiments, raw data processing module 930 may be configured to retrieve activity event log data stored in real-time database 920, and may be configured to clear any activity event log data from real-time database 920 once processed. Raw data processing module 930 may be configured to perform containerised data processing functions to convert raw activity data received from real-time database 920 into activity and pose logs. Raw data processing module 930 is described in further detail below with reference to Figure 11.
Raw data processing module 930 may pass the processed data to big data database 940 for storage. According to some embodiments, the data may be divided and stored separately depending on the user type, which may include able-bodied, amputee, and able-bodied with emulated amputation. In some embodiments, this data may be stored together, but tagged to allow for identification of the user type. Big data database 940 may be a tabular data warehouse in some embodiments. Big data database 940 may be structured by datasets, and used to store processed performance and motor behaviour data. According to some embodiments,
the datasets may be divided by user type. Each dataset may be configured to collect data for a particular type of user and for a particular activity. Each game or activity presented by virtual reality platform 400 may have its own table in each dataset within big data database 940. The tables may be configured to store all processed results and raw logs for all players in each group of users identified by user type. The data structures stored in big data database 940 are described in further detail below with reference to Figure 12.
The data stored in big data database 940 may be aggregated to be used for training of machine learning algorithms, in some embodiments. In some embodiments, the data stored in big data database 940 may be aggregated to be used to derive target performance metrics. According to some embodiments, the data may additionally or alternatively be used to generate insights for each individual user of virtual reality platform 400.
According to some embodiments, a clinician may be able to access insights related to individual users via a clinician web application 980, which may reside in memory on clinician computing device 170. Clinician web application 980 may be configured to query patient data stored in big data database 940. Clinicians may be able to log in to clinician web application 980 via a clinician authentication platform 985 accessible via clinician web application 980. Clinician authentication platform 985 may be a cloud based authentication platform in some embodiments. According to some embodiments, clinician authentication platform 985 may be an identity and access management (IAM) platform.
Big-data database 940 may also be accessible by a performance calculation module 950. Performance calculation module 950 may be configured to calculate the performance of one or more users of virtual reality platform 400. According to some embodiments, performance calculation module 950 may receive raw log table data from big data database 940, and process the data to derive activity performance tables. The activity performance table data may then be stored in big data database 940. According to some embodiments, performance calculation module 950 may also derive progress metrics and/or performance metrics, which may be provided to real-time
database 920 for storage. Performance calculation module 950 is described in further detail below with reference to Figure 13.
Figure 10 shows real-time database 920 in further detail. Real-time database 920 may be configured to store collections of data related to virtual reality platform 400. For example, in the illustrated embodiment, real-time database 920 is configured to store three collections of data, being headset data 1010, activity data 1020 and meta data 1030.
Headset data 1010 may be configured to store data relating to headset configurations of headset 120, and may store a dataset for each individual headset 120 within system 100. The stored data may include an assigned clinic of headset 120, a unique identifier of a user of headset 120, and one or more identifiers of sensors contained within headset 120. Headset data 1010 may also contain raw activity data 1012, which may be stored separately for each game and/or activity provided by virtual reality platform 400, and may comprise data stored in event log 720 and event data 710, as described in further detail above with reference to Figure 7. According to some embodiments, writing raw data to raw activity data 1012 may trigger a function to cause raw data processing module 930 to process and subsequently clear the raw data stored. According to some embodiments, data stored in headset data 1010 may be anonymised.
Activity data 1020 may be configured to store processed insights and data used by front-end services of system 100, such as user companion application 965 and clinician web application 980. Activity data 1020 may be stored separately for each user of system 100, and may include progress data 1022 and usage data 1026 for each user. Progress data 1022 may store goal data 1024 relating to a user’s progress to their predefined goals or skills. Each entry in goal data 1024 may include an overall progress score or metric, and/or individual progress scores or metrics for each skill or task that a user has attempted to perform using virtual reality platform 400. Usage data 1026 may store game session data 1028 and assessment data 1029. Game session data 1028 may relate to game sessions that a user has participated in via virtual reality platform 400. Each entry in game session data 1028 may include one or more of a name of a game played during the gaming session; a date and time of the gaming session; a score achieved by the player during the gaming session; and game specific metrics that may
be particular to the game played. According to some embodiments, data stored in activity data 1020 may be anonymised.
Meta data 1030 may store user configuration data 1032, which may be stored separately for each user of system 100. User configuration data 1032 may include data relating to a user’s personal identifying information, their amputation diagnosis and prosthesis prescription data, and/or the sensor and prosthesis configurations to be used by the user. For example, each entry in user configuration data 1032 may include one or more of a unique identifier; a user name; a user’s gender; a user’s year or date of birth; amputation details such as the location of an amputation or limb difference; goal configuration data related to a user’s goals; prosthesis configuration data relating to a type of physical prosthesis fitted to the user and/or a type of virtual prosthesis that the user’s avatar wears in the virtual environment presented by virtual reality platform 400; sensor configuration data relating to the sensors located on sensor device 130; and service configuration data.
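For illustration only, the nested structure below sketches one possible shape for the three collections held in real-time database 920 (headset data 1010, activity data 1020 and meta data 1030). All identifiers and field values are hypothetical placeholders, not actual stored data.

```python
# Illustrative shape of the three collections in real-time database 920.
# All identifiers and field values are hypothetical placeholders.
real_time_database = {
    "headset_data": {
        "headset_001": {
            "assigned_clinic": "clinic_A",
            "user_id": "user_123",
            "sensor_ids": ["imu_01", "imu_02"],
            "raw_activity_data": {"box_and_blocks": []},  # event entries per activity
        }
    },
    "activity_data": {
        "user_123": {
            "progress_data": {"goal_cook_for_my_family": {"overall_progress": 0.4}},
            "usage_data": {"game_sessions": [], "assessments": []},
        }
    },
    "meta_data": {
        "user_123": {
            "name": "John",
            "year_of_birth": 1990,
            "amputation_details": "transradial, left",
            "prosthesis_configuration": {"virtual": "myoelectric_hand"},
            "sensor_configuration": {"emg_channels": 2},
        }
    },
}

print(sorted(real_time_database))  # -> ['activity_data', 'headset_data', 'meta_data']
```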
Figure 11 shows raw data processing module 930 in further detail. Raw data processing module 930 may be configured to process raw event data into four logs, being an activity log 1242, sensor log 1244, prosthesis log 1246 and interaction log 1248, and to store the data in big data database 940, as illustrated and described in further detail below with reference to Figure 12.
Due to the diversity of activities and games presented by virtual reality platform 400, each activity and game may have its own data processing method performed by raw data processing module 930. When raw data stored in real-time database 920 is updated for a given activity, intervention or game based on data received from real-time engine 910, that data is passed to function call module 1110 of raw data processing module 930. Function call module 1110 determines the activity that the data relates to, and calls the appropriate function. Data processing module 1120 receives the data and executes the called function to process the data into the appropriate log. The processed data is received by data transfer module 1130 and written to big data database 940 into the appropriate logs, as described in further detail below with reference to Figure 12. Data transfer module 1130 also causes the raw data to be cleared from real-time database 920.
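A minimal sketch of this per-activity dispatch is shown below. The activity names, the shape of the processed rows and the use of simple dictionaries as stand-ins for the databases are illustrative assumptions.

```python
# Sketch of the dispatch performed by function call module 1110: the activity
# named in the incoming raw data selects which processing function runs.
# Activity names and log shapes are illustrative assumptions.

def process_box_and_blocks(raw_events):
    # Convert raw events into simplified activity log rows.
    return [{"log": "activity", "event": e["event_type"]} for e in raw_events]

def process_paintball(raw_events):
    return [{"log": "activity", "event": e["event_type"]} for e in raw_events]

PROCESSORS = {
    "box_and_blocks": process_box_and_blocks,
    "paintball_town": process_paintball,
}

def handle_raw_update(activity_name, raw_events, big_data_db, real_time_db):
    processor = PROCESSORS[activity_name]                     # function call module 1110
    rows = processor(raw_events)                              # data processing module 1120
    big_data_db.setdefault(activity_name, []).extend(rows)    # data transfer module 1130
    real_time_db[activity_name] = []                          # clear processed raw data

big_db, rt_db = {}, {"box_and_blocks": [{"event_type": "grasp"}]}
handle_raw_update("box_and_blocks", rt_db["box_and_blocks"], big_db, rt_db)
print(big_db)  # -> {'box_and_blocks': [{'log': 'activity', 'event': 'grasp'}]}
```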
Figure 12 shows big data database 940 in further detail. Big data database 940 may be configured to store raw event logs of data received from raw data processing module 930, as well as processed game data. According to some embodiments, processed game data may be stored in a de-identified form. According to some embodiments, processed game data may be stored in an aggregated form. According to some embodiments, processed game data may be organised in datasets and tables.
Datasets stored in big data database 940 may represent three different user groups: able-bodied, amputee, and able-bodied with an emulated prosthesis. Data from these user groups may be stored in separate datasets, such as able-bodied datasets 1210, amputee datasets 1220, and emulated amputee datasets 1230, for example. Tables within each dataset may store data relating to the games and activities presented by virtual reality platform 400, with each game and activity having data stored in both raw log tables 1240 and performance log 1250. Log tables 1240 may comprise activity log 1242, sensor log 1244, prosthesis log 1246 and interaction log 1248. According to some embodiments, data related to each game and activity may be stored in each of activity log 1242, sensor log 1244, prosthesis log 1246, interaction log 1248 and performance log 1250.
Activity log 1242 comprises log entries relating to activities attempted by one or more users. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted. According to some embodiments, entries in activity log 1242 may additionally or alternatively comprise one or more of a timestamp that the activity was attempted; an event type relating to an event that occurred during the activity; an object ID relating to a virtual object interacted with within the activity; an object position of the object interacted with; an object quaternion of the object interacted with; a grasp side of the prosthetic used to interact with the object; a head position based on a virtual position of a headset 120 being used by the user; a head quaternion based on a virtual position of a headset 120 being used by the user; a left hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; a left hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user; a right hand position based on a virtual position of a sensor device 130 or controller 190 being
used by the user; and a right hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user.
Sensor log 1244 comprises log entries relating to sensor data generated by one or more of headset 120, sensor device 130 and controller 190 during an activity presented by virtual reality platform 400. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted. According to some embodiments, entries in sensor log 1244 may additionally or alternatively comprise one or more of a timestamp that the activity was attempted; an event type relating to an event that occurred during the activity; a pattern of sensors activated; a power measure relating to the degree to which a sensor was activated; a head position based on a virtual position of a headset 120 being used by the user; a head quaternion based on a virtual position of a headset 120 being used by the user; a left hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; a left hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user; a right hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; and a right hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user.
Prosthesis log 1246 comprises log entries relating to prosthesis data generated by sensor device 130 and relating to the actions of a virtual prosthesis during an activity presented by virtual reality platform 400. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted. According to some embodiments, entries in prosthesis log 1246 may additionally or alternatively comprise one or more of a timestamp that the activity was attempted; an event type relating to an event that occurred during the activity; a direction in which the virtual prosthesis was moved; a head position based on a virtual position of a headset 120 being used by the user; a head quaternion based on a virtual position of a headset 120 being used by the user; a left hand position based on a virtual position of a sensor device 130 or controller 190 being used by the user; a left hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user; a right hand position based on a virtual position of a sensor device
130 or controller 190 being used by the user; and a right hand quaternion based on a virtual position of a sensor device 130 or controller 190 being used by the user.
Interaction log 1248 comprises log entries relating to event data generated during an activity presented by virtual reality platform 400. Each entry may comprise at least one of a unique identifier; a date and/or time at which the activity was attempted; and a level of the activity attempted. According to some embodiments, entries in interaction log 1248 may additionally or alternatively comprise one or more of a timestamp that the activity was attempted; an event type relating to an event that occurred during the activity; a pattern of sensors activated; and one or more values relating to the event.
Performance log 1250 comprises log entries relating to performance of a user within virtual reality platform 400, and may be generated based on data from one or more of activity log 1242, sensor log 1244, prosthesis log 1246, and interaction log 1248. According to some embodiments, performance log 1250 may be written to by performance calculation module 950. Each entry may comprise a unique identifier. According to some embodiments, entries in performance log 1250 may additionally or alternatively comprise one or more of a date and/or time at which the activity was attempted; a level of the activity attempted; a playtime indicating a duration of play; an activity score achieved by the user during game play; a compensation score relating to the amount that the user compensated for their prosthetic; a compensation style relating to the manner in which the user compensated for their prosthetic; a hand usage metric; a wrist usage metric; an elbow usage metric; a sensor patterns usage metric; and one or more game specific metrics.
Data stored in big data database 940 may be used by system 100 for purposes including the training of machine learning algorithms; generating population-level insights; and generating individual insights that may be accessible to clinician web application 980.
Figure 13 shows performance calculation module 950 in further detail. As described above with reference to Figure 9, performance calculation module 950 receives data from big data database 940, and writes data to real-time database 920 and big data database 940.
Performance calculation module 950 receives raw log table data 1305 from big data database 940 when new data is added to logs 1242, 1244, 1246 or 1248. Raw log table data 1305 is passed to ML model selection module 1310, which is configured to determine what type of data is contained in raw log table data 1305, and therefore what type of machine learning model should be used to process the data. Performance calculation module 950 may comprise a number of algorithms for generating performance data. In some embodiments, performance calculation module 950 may contain a separate algorithm for each data log stored in big data database 940. For example, in the illustrated embodiment, performance calculation module 950 comprises an activity metric computation module 1320, a prosthesis metric computation module 1330, a sensor metric computation module 1340 and a motor behaviour computation module 1350.
Where ML model selection module 1310 determines that raw log table data 1305 contains activity log data, the activity log data 1312 is passed to activity metric computation module 1320 for processing. Activity metric computation module 1320 performs a processing algorithm on activity log data 1312 to derive one or more activity metrics 1325. Activity metrics 1325 may be intervention or game dependent, in some embodiments. According to some embodiments, the activity metrics 1325 may include metrics related to a user’s performance in the activity. For example, where the activity is a target shooting game (as shown in Figures 18D, 19A and 21A, for example), the activity metrics 1325 may include the duration of the activity, the number of shots fired and the number of targets hit. Further activity metrics, such as an accuracy of the user, may be derived based on these raw metrics. Activity metrics 1325 are then passed to data collection module 1360.
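As a minimal sketch of the activity metric computation for the target shooting example described above, the following code derives duration, shot count, hit count and accuracy from activity log entries. The event type names and timestamp format are illustrative assumptions.

```python
# Sketch of an activity metric computation for a target shooting activity:
# raw counts are taken from activity log entries and an accuracy metric is
# derived from them. Event type names are illustrative assumptions.

def compute_shooting_metrics(activity_log):
    shots = sum(1 for e in activity_log if e["event_type"] == "shot_fired")
    hits = sum(1 for e in activity_log if e["event_type"] == "target_hit")
    timestamps = [e["timestamp"] for e in activity_log]
    duration = max(timestamps) - min(timestamps) if timestamps else 0.0
    accuracy = hits / shots if shots else 0.0  # derived metric from raw counts
    return {"duration_s": duration, "shots": shots, "hits": hits, "accuracy": accuracy}

log = [
    {"timestamp": 0.0, "event_type": "shot_fired"},
    {"timestamp": 1.5, "event_type": "target_hit"},
    {"timestamp": 3.0, "event_type": "shot_fired"},
]
print(compute_shooting_metrics(log))
# -> {'duration_s': 3.0, 'shots': 2, 'hits': 1, 'accuracy': 0.5}
```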
Where ML model selection module 1310 determines that raw log table data 1305 contains prosthesis log data, the prosthesis log data 1314 is passed to prosthesis metric computation module 1330 for processing. Prosthesis metric computation module 1330 performs a processing algorithm on prosthesis log data 1314 to derive one or more prosthesis use metrics 1335. According to some embodiments, prosthesis metric computation module 1330 may analyse prosthesis event segments based on prosthesis log data 1314 to calculate metrics associated with prosthesis utilisation, for example.
Prosthesis metrics 1335 may be intervention dependent, in some embodiments. Prosthesis use metrics 1335 are then passed to data collection module 1360.
Event segments, which may include prosthesis, activity or sensor event segments, may comprise pairs of related events, such as the start and end of a particular movement of the prosthetic by the user. According to some embodiments, an event may be triggered when an action or movement is commenced, and a second event may be triggered when that action or movement is stopped, to define the start and end of an event segment. These event segments may be used to calculate the duration of a particular activity.
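The sketch below pairs start and end events into segments and computes their durations, as described above. The event type names are illustrative assumptions rather than the actual log vocabulary.

```python
# Sketch of pairing start/end events into event segments and computing their
# durations, as used for prosthesis, activity and sensor utilisation metrics.
# The event type names are illustrative assumptions.

def segment_durations(events, start_type="movement_start", end_type="movement_end"):
    durations = []
    start_time = None
    for event in sorted(events, key=lambda e: e["timestamp"]):
        if event["event_type"] == start_type:
            start_time = event["timestamp"]
        elif event["event_type"] == end_type and start_time is not None:
            durations.append(event["timestamp"] - start_time)
            start_time = None  # close the segment
    return durations

events = [
    {"timestamp": 10.0, "event_type": "movement_start"},
    {"timestamp": 12.5, "event_type": "movement_end"},
    {"timestamp": 20.0, "event_type": "movement_start"},
    {"timestamp": 21.0, "event_type": "movement_end"},
]
print(segment_durations(events))  # -> [2.5, 1.0]
```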
Where ML model selection module 1310 determines that raw log table data 1305 contains sensor log data, the sensor log data 1316 is passed to sensor metric computation module 1340 for processing. Sensor metric computation module 1340 performs a processing algorithm on sensor log data 1316 to derive one or more sensor use metrics 1345. According to some embodiments, sensor metric computation module 1340 may analyse sensor event segments based on sensor log data 1316 to calculate metrics associated with sensor utilisation, for example. Sensor use metrics 1345 may be intervention dependent, in some embodiments. Sensor use metrics 1345 are then passed to data collection module 1360.
Where ML model selection module 1310 determines that raw log table data 1305 contains interaction log data, the interaction log data 1318 is passed to motor behaviour computation module 1350 for processing. Motor behaviour computation module 1350 performs a processing algorithm on interaction log data 1318 to derive one or more motor behaviour metrics 1355. According to some embodiments, motor behaviour computation module 1350 may analyse prosthesis events and object grasping data as recorded in interaction log data 1318 to calculate metrics associated with a user’s grasping success rate, for example. Motor behaviour metrics 1355 may be intervention or game dependent, in some embodiments.
According to some embodiments, motor behaviour computation module 1350 may comprise one or more machine learning algorithms to derive motor behaviour metrics 1355. The machine learning algorithms may be trained to compare data extracted from interaction log data 1318 for a particular user with corresponding data derived from a
population of able-bodied or expert prosthesis users. For example, a machine learning algorithm may be trained to compare grasping pose data for a given virtual object grasped by a user with grasping pose data derived from a population of able-bodied users grasping an identical virtual object. In some embodiments, a machine learning algorithm may be trained to compare pose data for a given sensor function with pose data derived from a population of expert prosthesis users performing an identical sensor function.
According to some embodiments, data for training such machine learning models may be derived from data retrieved from one or more raw data logs stored in big data database 940. For example, interaction pose data may be derived from interaction log data 1318; prosthesis action pose data may be derived from prosthesis log data 1314; and sensor function pose data may be derived from sensor log data 1316, in some embodiments.
Motor behaviour computation module 1350 may use the one or more machine learning algorithms to derive motor behaviour metrics 1355 such as a compensation score and style, prosthesis utilisation score and style, and sensor utilisation score and style, for example.
According to some embodiments, motor behaviour computation module 1350 may use a clustering-type approach to derive these metrics. Historical data may be processed, and users may be clustered into groups based on the data generated during their rehabilitation sessions. Each cluster may relate to a particular motor behaviour, such as an elbow down motor behaviour when grasping an object, for example. According to some embodiments, one or more clusters may be labelled as desired clusters. According to some embodiments, desired clusters may be labelled manually by one or more clinicians based on the motor behaviour that is considered desirable from a clinical perspective. In some embodiments, desired clusters may be automatically labelled based on the clusters into which a large proportion of able-bodied or expert prosthesis users fall. When new user data is being analysed, motor behaviour computation module 1350 may analyse the distance between the new data and one or more of the desired clusters to determine a score for the new user. Motor behaviour computation module 1350 may also determine which cluster the new data is closest to,
and determine a style based on the style associated with that cluster. The style may represent a particular feature in motor behaviour, such as an elbow down or elbow up grasping style, for example. The styles may be defined by clinicians based on their observations from clinical practice in some embodiments. This approach may be used to determine score and style for one or more of compensation, prosthesis utilisation and sensor utilisation.
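For illustration, a minimal sketch of this clustering-based score and style derivation is shown below, assuming cluster centroids have already been fitted from historical data and one cluster has been labelled as desired by clinicians. The feature space, cluster labels and score scaling are illustrative assumptions.

```python
# Sketch of clustering-based score and style: the closest cluster defines the
# style, and distance to the desired cluster defines the score. Centroids,
# labels and the score scaling are illustrative assumptions.
import numpy as np

# Hypothetical centroids in a pose-feature space (e.g. elbow height, trunk lean)
centroids = {
    "elbow_down_grasp": np.array([0.20, 0.05]),   # labelled as the desired cluster
    "elbow_up_grasp": np.array([0.65, 0.10]),
    "trunk_lean_compensation": np.array([0.30, 0.45]),
}
desired = "elbow_down_grasp"

def score_and_style(sample):
    distances = {style: float(np.linalg.norm(sample - c)) for style, c in centroids.items()}
    style = min(distances, key=distances.get)      # closest cluster defines the style
    score = 1.0 / (1.0 + distances[desired])       # closer to desired cluster -> higher score
    return {"style": style, "score": round(score, 3)}

print(score_and_style(np.array([0.25, 0.08])))
# -> {'style': 'elbow_down_grasp', 'score': 0.945}
```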
According to some embodiments, motor behaviour computation module 1350 may also derive a range of motion metrics, which may form part of motor behaviour metrics 1355. Motor behaviour computation module 1350 may comprise a workspace analysis algorithm that is configured to compute the range of motion of a tracked limb of a user based on interaction log data 1318. The range of motion may be a three-dimensional range of motion, in some embodiments.
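A simple range-of-motion computation is sketched below, taking the per-axis extent of tracked limb positions as a three-dimensional range of motion. This bounding-box measure is an illustrative assumption, not the workspace analysis algorithm itself.

```python
# Sketch of a simple workspace/range-of-motion computation from tracked limb
# positions recorded in interaction log data. Bounding-box extents are used
# as an illustrative stand-in for the workspace analysis algorithm.
import numpy as np

def range_of_motion(positions):
    """positions: N x 3 list/array of tracked limb positions (metres)."""
    pts = np.asarray(positions, dtype=float)
    extents = pts.max(axis=0) - pts.min(axis=0)  # per-axis reach (x, y, z)
    return {axis: round(float(e), 3) for axis, e in zip(("x_m", "y_m", "z_m"), extents)}

tracked = [[0.10, 1.00, 0.30], [0.45, 1.20, 0.25], [0.30, 0.90, 0.55]]
print(range_of_motion(tracked))
# -> {'x_m': 0.35, 'y_m': 0.3, 'z_m': 0.3}
```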
Motor behaviour metrics 1355 are then passed to data collection module 1360.
Data collection module 1360 may be configured to receive at least one of activity metrics 1325, prosthesis use metrics 1335, sensor use metrics 1345 and motor behaviour metrics 1355. Data collection module 1360 may be configured to structure the received data into a performance data table, in some embodiments. Data collection module 1360 may then pass the structured data to at least one of a database update module 1370, and a skill progress update module 1380.
Database update module 1370 may be configured to transfer the activity performance table data created by data collection module 1360 to big data database 940 for storage in performance log 1250. Skill progress update module 1380 may be configured to transfer the progress metrics data 1385 created by data collection module 1360 to real-time database 920, for storage in progress data 1022. According to some embodiments, progress data 1022 may be translated into a skill level that is presented to a user, as described below with reference to Figure 16B.
According to some embodiments, the metrics calculated by performance calculation module 950 may be used to provide insights to a user or their clinician. For example,
advice may be provided as to how a user can change their motor behaviour to reduce long term postural health issues, and the types of prostheses that may assist with this.
Figure 14 is a process flow diagram of a method 1400 performed by processor 111 of computing device 110 executing task presentation application 116 to present a virtual environment to a user. While method 1400 is described as being performed by processor 111, some or all of method 1400 may alternatively be performed by other components of system 100, such as processor 121 or processor 151, for example.
At step 1405, processor 111 executing task presentation application 116 receives data corresponding to a high level prosthetic use goal associated with a user. According to some embodiments, the data may be received from user computing device 160, after a user interacts with user computing device 160 to enter data relating to one or more goals. According to some embodiments, the data may be received from clinician computing device 170, after a clinician interacts with clinician computing device 170 to enter data relating to one or more goals of a patient. The high level goal may relate to one or more tasks that the user wishes to be able to accomplish using their prosthetic. For example, a high level goal may be “cook for my family” or “drive to work”. According to some embodiments, the high level goal may be selected from a predefined list of goals retrieved from a memory location such as memory 153, for example. In some embodiments, the high level goal may be a new goal manually entered by the user or clinician. Figure 16A shows an example screenshot that may be displayed during step 1405.
At step 1410, processor 111 executing task presentation application 116 is caused to determine one or more basic skills or functional skills associated with the high level goal. As described above, basic skills may be skills which target rehabilitation and strengthening of the neurology and muscles of the residual limb, and may be used in the operation of the prosthesis, while functional skills may comprise a combination of basic skills, and target the ability of users to use the basic skills to operate the prosthesis for activities of daily living. Processor 111 may be configured to determine one or more basic skills or functional skills associated with the high level goal by referring to a database storing predefined goal information. For example, where the high level goal is cooking, processor 111 may determine that the skills required to
achieve the goal include utensil use, bi-manual manipulation and grasp types. In some embodiments, the one or more basic skills or functional skills associated with the high level goal may be selected from a predefined list by a user or clinician via user computing device 160 or clinician computing device 170, where the predefined list may be retrieved by processor 111 from a memory location such as memory 153. In some embodiments, basic skills or functional skills are specifically chosen for the particular high level goal to assist in accomplishing said high level goal. That is, the basic skills or functional skills chosen are used to provide the user with feedback on performance of said skills to better assist in reaching the high level goal, for example.
At step 1415, processor 111 determines one or more games or activities to present to the user to assist them in achieving the one or more skills determined at step 1410. According to some embodiments, processor 111 may compare the skills identified at step 1410 with one or more skills associated with one or more games stored in a memory location such as memory 153. According to some embodiments, each game may be associated with at least one basic or functional skill. For example, an assembly game where a user must use a virtual prosthesis to assemble various virtual objects into a virtual structure may be associated with the skill of bi-manual manipulation. Processor 111 may determine one or more games that match the skills determined at step 1410 to present to the user.
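As a minimal sketch of the goal-to-skill-to-game matching in steps 1410 and 1415, the following example presents any game that trains at least one skill associated with the selected goal. The goal names, skill names and game catalogue are illustrative assumptions, not stored system data.

```python
# Sketch of goal -> skills -> games matching. The goal names, skills and game
# catalogue below are illustrative assumptions.

GOAL_SKILLS = {
    "cook for my family": ["utensil use", "bi-manual manipulation", "grasp types"],
    "drive to work": ["grasp types", "sustained grip"],
}

GAME_SKILLS = {
    "pick and cook": ["utensil use", "grasp types"],
    "box & blocks": ["grasp types"],
    "assembly": ["bi-manual manipulation"],
}

def games_for_goal(goal):
    required = set(GOAL_SKILLS.get(goal.lower(), []))
    # Present any game that trains at least one of the required skills.
    return [game for game, skills in GAME_SKILLS.items() if required & set(skills)]

print(games_for_goal("Cook for my family"))
# -> ['pick and cook', 'box & blocks', 'assembly']
```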
At step 1420, processor 111 may cause the one or more games identified at step 1415 to be presented to the user. These may be presented via one or more of a user interface of client computing device 160, user I/O 113 of computing device 110, or display 123 of headset 120. An example user interface showing a selection of games being presented to the user is shown in Figure 17 and described in further detail below.
At step 1425, processor 111 receives a user selection corresponding to one of the games presented at step 1420. The user selection may be received via a user interface of client computing device 160, via user I/O 113 of computing device 110, or via controller 190 operating in conjunction with headset 120, in some embodiments.
At step 1430, processor 111 causes the selected game to be generated as a virtual environment for presentation to the user via headset 120. According to some
embodiments, the virtual environment may comprise one or more virtual objects. According to some embodiments, processor 111 may retrieve the program code associated with the game from a memory location such as memory 153, and may pass the program code to headset 120 for execution by processor 121. Example virtual environments are shown in Figures 18A to 21D, and described in further detail below.
At step 1435, processor 111 receives prosthesis data corresponding to a virtual prosthesis for display to the user in the selected game. According to some embodiments, processor 111 may retrieve the prosthesis data from a memory location such as memory 153. The prosthesis data may correspond to a physical prosthesis fitted to the user or proposed to be fitted to the user, and may be entered by a clinician via clinician computing device 170 in some embodiments. According to some embodiments, the prosthesis data may form part of system and user configuration data 435, as described above with reference to Figure 4.
At step 1440, processor 111 causes a virtual prosthesis corresponding to the prosthesis data to be generated for presentation to the user via headset 120 within the generated virtual environment.
At step 1445, processor 111 causes the virtual environment and virtual prosthesis to be presented to the user via headset 120.
Figure 16A shows an example screenshot 1600 that may be displayed to a clinician during step 1405 of method 1400. In the illustrated example, screenshot 1600 is displayed on clinician computing device 170. In other embodiments, screenshot 1600 may be displayed on other devices of system 100.
Screenshot 1600 shows a patient identifier 1610 indicating the patient for whom goals and skills are being selected. In the illustrated example, goals and skills are being selected for a patient named John.
Goal selection boxes 1620 provide a means by which a user of device 170 can select one or more goals to be associated with John’s profile. According to some embodiments, boxes 1620 may be drop-down boxes from which a user can select a
predefined goal. According to some embodiments, boxes 1620 may be a text box into which a user can manually enter a new goal.
In the illustrated embodiment, a goal of “Cook for my family” has been entered or selected in a first box 1620, while a user has entered the word “Drive” in a second box 1620 causing drop down items 1625 to be shown. In the illustrated embodiment, drop down items 1625 include the goals “Drive my kids to school” and “Drive to work”. An option to add a new goal that doesn’t currently appear on the list is also shown.
Screenshot 1600 further provides virtual buttons 1630 and 1640, allowing a user to add a new goal to the list of John’s goals, or to save the goals, respectively. Interacting with virtual button 1640 may cause clinician computing device 170 to save the goals to a local memory location, or to send the goal data to an external device such as rehabilitation management platform 150 for storage in a memory location such as memory 153.
Figure 15 is a process flow diagram of a method 1500 performed by processor 111 of computing device 110 executing assessment application 117, for assessing a user’s performance within a virtual environment, such as the virtual environment presented to the user at step 1445 of method 1400. While method 1500 is described as being performed by processor 111, some or all of method 1500 may alternatively be performed by other components of system 100, such as processor 121 or processor 151, for example.
At step 1505, processor 111 executing assessment application 117 receives historical performance data relating to previous performance of one or more users within a virtual environment presented by system 100. According to some embodiments, the data may be retrieved from big data database 940, and may include data relating to able-bodied users or experienced amputees.
At step 1510, processor 111 executing assessment application 117 performs a clustering method on the retrieved data, to arrange the data into clusters. According to some embodiments, processor 111 also determines at least one desired cluster, where
the desired cluster is a cluster containing a predetermined percentage of data from able- bodied users or experienced amputees.
At step 1515, processor 111 executing assessment application 117 receives new performance data relating to performance of a user within a virtual environment presented by system 100. The new data may be generated by one or more of headset 120, sensor device 130 or controller 190. According to some embodiments, the data may be retrieved from real-time database 920. In some embodiments, the new performance data may comprise at least one of: sensor data, prosthesis data, interaction event data, motor behaviour data and postural data. Motor behaviour data may include compensation data relating to the manner in which and the amount by which a user compensates for their prosthetic through other body movement. Postural data may relate to the posture of the user during interaction with the virtual environment presented by system 100.
At step 1520, processor 111 is caused to compare the data received at step 1515 with a predetermined parameter or predetermined control data, such as the clustered data determined at step 1510. According to some embodiments, processor 111 may determine which cluster the new performance data is closest to. According to some embodiments, processor 111 may determine the distance between the new performance data and at least one desired cluster.
At step 1525, processor 111 may derive at least one performance metric from the data. The performance metric may include at least one of a motor performance metric and a prosthesis use metric, for example. As described above with reference to motor behaviour module 1350 of Figure 13, processor 111 may analyse the distance between the new data and one or more of the desired clusters to determine a score metric. Processor 111 may additionally or alternatively determine which cluster the new data is closest to, and determine a style metric based on the style associated with that cluster. In some embodiments, the at least one performance metric may be derived from at least one of: sensor data, prosthesis data, interaction event data, motor behaviour data and postural data. The at least one performance metric may be indicative of the user’s ability to functionally use a prosthesis outside of the virtual environment. That is, the at
least one performance metric may relate to a user’s ability to perform real world tasks such as cooking, driving a vehicle, or using a computer, for example.
At step 1525, processor 111 may optionally also determine a recommendation for the user based on the calculated metrics. For example, a recommendation may be generated as to how a user can change their motor behaviour to reduce long term postural health issues, and/or the types of prostheses that may assist with this.
At step 1530, processor 111 causes the metric and/or recommendation to be presented to a user and/or clinician. This may be via one or more of a user interface of client computing device 160; a user interface of a clinician computing device 170; user I/O 113 of computing device 110, or display 123 of headset 120.
Figure 16B shows an example screenshot 1650 that may be displayed to a user during step 1530 of method 1500. In the illustrated example, screenshot 1650 is displayed on user computing device 160. In other embodiments, screenshot 1650 may be displayed on other devices of system 100.
Screenshot 1650 shows a goal identifier 1655 indicating a high level goal for which metrics are being displayed. In the illustrated example, metrics are being displayed for the goal of “Cook for my family”.
An overall progress bar 1660 is shown for the high level goal, along with a level indicator 1665. A number of further progress bars are also shown for skills associated with the high level goal. For example, in the illustrated embodiment, a progress bar 1670 is shown for the skill “utensil use”; a progress bar 1680 is shown for the skill “bi-manual manipulation”; and a progress bar 1690 is shown for the skill “grasp types”. Each progress bar 1670, 1680 and 1690 also has an associated level indicator 1675, 1685 and 1695, respectively. In the illustrated example, each of level indicators 1665, 1675, 1685 and 1695 shows a level of “2”.
Figure 17 shows an example screenshot 1700 that may be displayed to a user during step 1420 of method 1400. Screenshot 1700 may be displayed via headset 120, on user computing device 160, or on other devices of system 100.
Screenshot 1700 shows an environment 1705, which may be presented as a virtual environment when being viewed via headset 120. Within the environment 1705 is shown a virtual prosthesis 1710, which may be controlled within environment 1705 by interaction with sensor device 130. Environment 1705 also contains a menu display 1720 showing a number of menu options 1722, 1724 and 1726. Each of the menu options may correspond to an activity or game selectable by the user to be displayed to them via headset 120. Option 1722 relates to a “pick and cook” activity, as described in further detail below with reference to Figures 19C, 19D and 21C. Option 1724 relates to a “paintball town” activity, as described in further detail below with reference to Figures 18D, 19A and 21A. Option 1726 relates to a “box & blocks” activity, as described in further detail below with reference to Figures 18B, 19B and 20. A user may be able to interact with one of the presented options to make a selection, as described above with reference to step 1425 of method 1400.
Figures 18A to 18D show example screenshots 1800, 1820, 1840 and 1860 that may be displayed to a user during step 1430 of method 1400. Screenshots 1800, 1820, 1840 and 1860 may be displayed via headset 120, on user computing device 160, or on other devices of system 100. As the screenshots of Figures 18A to 18D show the user avatar from a third person perspective, according to some embodiments these screenshots may be generated for viewing by a third party, such as a clinician, rather than for the user to view via headset 120.
Figure 18A shows a screenshot 1800. Screenshot 1800 shows an environment 1805. Within the environment 1805 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1805 by a user interacting with sensor device 130. Environment 1805 also includes a number of manipulable virtual objects 1815, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1805 may facilitate a building or construction type activity, where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual food objects 1815 and to construct a burger. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1815 using the virtual prosthesis 1710.
Figure 18B shows a screenshot 1820. Screenshot 1820 shows an environment 1825. Within the environment 1825 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1825 by a user interacting with sensor device 130. Environment 1825 also includes a number of manipulable virtual objects 1835, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1825 may facilitate a “box & blocks” type activity, where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual objects 1835 to move them into and out of a virtual box. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1835 using the virtual prosthesis 1710.
Figure 18C shows a screenshot 1840. Screenshot 1840 shows an environment 1845. Within the environment 1845 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1845 by a user interacting with sensor device 130. Environment 1845 also includes a number of manipulable virtual objects 1855, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1845 may facilitate an activity where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual objects 1855 to move them within the virtual environment. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1855 using the virtual prosthesis 1710.
Figure 18D shows a screenshot 1860. Screenshot 1860 shows an environment 1865. Within the environment 1865 is shown an avatar 1810 wearing a virtual prosthesis 1710, which may be controlled within environment 1865 by a user interacting with sensor device 130. Environment 1865 also includes a number of virtual target objects 1875, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1865 may facilitate a “shooting” type activity such as paintball, where the user is prompted to guide the virtual prosthesis 1710 to pick up projectile type weapons such as paintball guns with paintball projectiles, and use these to strike virtual targets 1875. This may involve the user grasping, moving, manipulating
and releasing the weapons, as well as aiming at and firing at virtual targets 1875 using the virtual prosthesis 1710.
Figures 19A to 19D show example images 1900, 1920, 1940 and 1960 that show screenshots which may be viewed via a headset 120 by a user during step 1430 of method 1400.
Figure 19A shows a headset 120 with display optics 1910 and 1912 showing screenshots of images to be displayed to a user wearing headset 120, and configured to each be viewed by an eye of the user to present a simulated three dimensional image to the user. Figure 19A shows an environment 1865. Within the environment 1865 is shown a virtual prosthesis 1710, which may be controlled within environment 1865 by a user interacting with sensor device 130. Environment 1865 also shows a virtual weapon 1915 held by the virtual prosthesis 1710, and a virtual target 1875 for shooting with weapon 1915. According to some embodiments, environment 1865 may facilitate a “shooting” type activity, as described above with reference to Figure 18D.
Figure 19B shows a headset 120 with display optics 1910 and 1912 showing screenshots of images to be displayed to a user wearing headset 120, and configured to each be viewed by an eye of the user to present a simulated three dimensional image to the user. Figure 19B shows an environment 1825. Within the environment 1825 is shown a virtual prosthesis 1710, which may be controlled within environment 1825 by a user interacting with sensor device 130. Environment 1825 also shows virtual objects 1835, with one virtual object 1935 being held by the virtual prosthesis 1710. According to some embodiments, environment 1825 may facilitate a “box & blocks” type activity, as described above with reference to Figure 18B.
Figure 19C shows a headset 120 with display optics 1910 and 1912 showing screenshots of images to be displayed to a user wearing headset 120, and configured to each be viewed by an eye of the user to present a simulated three dimensional image to the user. Figure 19C shows an environment 1919. Within the environment 1919 is shown a virtual prosthesis 1710, which may be controlled within environment 1919 by a user interacting with sensor device 130. Environment 1919 also shows virtual objects 1917, with one of the virtual objects 1917 being held by the virtual prosthesis 1710. According
to some embodiments, environment 1919 may facilitate a “barbeque” type activity, where the user is prompted to guide the virtual prosthesis 1710 to interact with the virtual objects 1917 to move them onto a grill, rotate them on the grill and remove them from the grill. This may involve the user grasping, moving, manipulating and releasing the virtual objects 1917 using the virtual prosthesis 1710.
Figure 19D shows an alternative view of environment 1919.
Figure 20 shows a screenshot 2000 displaying a user perspective view of environment 1825. Within the environment 1825 is shown a virtual prosthesis 1710, which may be controlled within environment 1825 by a user interacting with sensor device 130. Environment 1825 also includes a number of manipulable virtual objects 1835, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1825 may facilitate a “box & blocks” type activity, as described above with reference to Figure 18B.
Figure 21A shows a screenshot 2100 displaying a user perspective view of environment 1865. Within the environment 1865 is shown a virtual prosthesis 1710, which may be controlled within environment 1865 by a user interacting with sensor device 130. Environment 1865 also includes a virtual weapon 1915 held by virtual prosthesis 1710, and a virtual target 1875 for shooting with virtual weapon 1915. According to some embodiments, environment 1865 may facilitate a “shooting” type activity, as described above with reference to Figure 18D.
Figure 21B shows a screenshot 2120 displaying a user perspective view of environment 1805. Within the environment 1805 is shown a virtual prosthesis 1710, which may be controlled within environment 1805 by a user interacting with sensor device 130. Environment 1805 also includes a number of manipulable virtual objects 1815, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1805 may facilitate a building and construction type activity, as described above with reference to Figure 18A.
Figure 21C shows a screenshot 2140 displaying a user perspective view of environment 1919. Within the environment 1919 is shown a virtual prosthesis 1710, which may be controlled within environment 1919 by a user interacting with sensor device 130. Environment 1919 also includes a number of manipulable virtual objects 1917, which the user may be able to interact with via the virtual prosthesis 1710. The user may do so by interacting with sensor device 130. According to some embodiments, environment 1919 may facilitate a barbeque type activity, as described above with reference to Figures 19C and 19D.
Figure 21D shows a screenshot 2160 displaying a user perspective view of an environment 2165. Within the environment 2165 is shown a virtual prosthesis 2170, which may be controlled within environment 2165 by a user interacting with sensor device 130. Unlike virtual prosthesis 1710, which is an upper body prosthesis, virtual prosthesis 2170 is a lower body prosthesis, and may be controlled by a sensor device 130 worn on the lower body, as shown in Figure 3B, for example. Environment 2165 also includes a manipulable virtual object 2175, which the user may be able to interact with via the virtual prosthesis 2170. The user may do so by interacting with sensor device 130. According to some embodiments, environment 2165 may facilitate a soccer type activity, where the user is prompted to guide the virtual prosthesis 2170 to interact with the virtual object 2175, being a virtual ball, to move the virtual object 2175 across the virtual soccer field and through the virtual goals. This may involve the user kicking, tapping and rolling the virtual object 2175 using the virtual prosthesis 2170.
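As an illustrative sketch only, an interaction between the virtual prosthesis and a virtual object in any of the above environments could be detected with a simple proximity test and mapped to haptic feedback parameters such as a vibration amplitude and frequency, as contemplated in claims 20 and 21 below; every name, threshold and constant here is an assumption rather than part of the disclosure.

```python
import math

def detect_contact(prosthesis_tip, object_centre, object_radius, margin=0.01):
    """Treat the prosthesis tip as a point and the object as a sphere;
    report contact when they are within a small margin of each other."""
    distance = math.dist(prosthesis_tip, object_centre)
    return distance <= object_radius + margin

def haptic_parameters(grip_force_normalised: float):
    """Map a normalised grip force (0..1) to a vibration amplitude and frequency.
    The linear mapping is purely for illustration."""
    force = max(0.0, min(1.0, grip_force_normalised))
    amplitude = 0.2 + 0.8 * force        # fraction of the actuator's full scale
    frequency_hz = 80.0 + 120.0 * force  # assumed usable vibrotactile band
    return amplitude, frequency_hz

if detect_contact((0.30, 1.10, 0.50), (0.31, 1.10, 0.52), object_radius=0.03):
    amp, freq = haptic_parameters(grip_force_normalised=0.6)
    print(f"vibrate at {amp:.2f} amplitude, {freq:.0f} Hz")
```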
It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the above-described embodiments, without departing from the broad general scope of the present disclosure. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.
Claims
1. A method of determining performance metrics of a user relating to control of a virtual prosthetic limb in a virtual or augmented reality environment, the method comprising: causing a virtual reality environment to be presented to a user, wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device; receiving sensor data from the sensor device in response to the user causing the virtual prosthetic limb to move within the virtual reality environment, the sensor data comprising motion sensor data and human-prosthesis interface data; and processing the sensor data to derive at least one performance metric; wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter, wherein the at least one performance metric relates to a level of competency with which a user controls the virtual prosthetic limb within the virtual reality environment.
2. The method of claim 1, wherein the performance metric is related to the performance of at least one basic skill.
3. The method of claim 1 or 2, wherein the performance metric is related to the performance of at least one functional skill.
4. The method of any one of claims 1 to 3, wherein the performance metric is based on at least one motor behaviour metric.
5. The method of claim 4, wherein the at least one motor behaviour metric is at least one of a compensation score, a compensation style, a prosthesis utilisation score, a prosthesis utilisation style, a sensor utilisation score and a sensor utilisation style.
6. The method of any one of claims 1 to 5, wherein the performance metric relates to the movement of the virtual limb within the virtual reality environment.
7. The method of any one of claims 1 to 6, wherein the performance metric is determined in response to a triggered measurement event.
8. The method of claim 7, wherein the measurement event is triggered by an interaction between the virtual limb and at least one virtual object.
9. The method of any one of claims 1 to 8, further comprising prompting the user to perform a task within the virtual environment.
10. The method of any one of claims 1 to 9, wherein the task requires the user to interact with a virtual object within the virtual environment.
11. The method of claim 10, further comprising receiving data related to the state of the virtual object, and using this data to derive the at least one performance metric.
12. The method of any one of claims 1 to 11, further comprising receiving data related to the pose of the user, and using this data to derive the at least one performance metric.
13. The method of any one of claims 1 to 12, further comprising receiving data related to the state of the virtual prosthetic limb, and using this data to derive the at least one performance metric.
14. The method of any one of claims 1 to 13, further comprising receiving data related to the state of the sensor device, and using this data to derive the at least one performance metric.
15. The method of any one of claims 1 to 14, wherein comparing the sensor data with at least one predetermined parameter comprises comparing the sensor data with historical data generated by a population of users within an equivalent virtual reality environment.
16. The method of claim 15, further comprising performing a clustering technique on the historical data to group the historical data into clusters, wherein comparing the sensor data with historical data comprises comparing the sensor data with the clustered data.
17. The method of claim 16, further comprising determining that at least one cluster is a desired cluster, and determining the at least one performance metric by calculating a distance between the desired cluster and the sensor data.
18. The method of claim 16 or claim 17, wherein determining the at least one performance metric comprises determining which cluster is closest to the sensor data.
19. The method of any one of claims 1 to 18, wherein the performance metric comprises at least one of a motor performance metric and a prosthesis use metric.
20. The method of any one of claims 1 to 19, further comprising determining that an interaction has occurred between a virtual object and the virtual prosthetic limb, and generating haptic feedback data to be delivered to the user.
21. The method of claim 20, wherein the haptic feedback data comprises at least one of a vibration amplitude and frequency.
22. The method of any one of claims 1 to 21, wherein the determined performance metrics of the user are communicated to the user to assist the user in performing functional tasks with a physical prosthetic limb in a real-world environment.
23. A method of presenting a virtual reality environment to a user, the method comprising: receiving input indicating a high level prosthesis rehabilitation goal; determining, based on the high level prosthesis rehabilitation goal, at least one low level skill relating to the high level goal;
generating a virtual reality environment that facilitates the performance of the low level skill; wherein the virtual reality environment comprises a virtual prosthetic limb controllable by the user by way of a sensor device, the sensor device comprising at least one motion sensor and at least one human-prosthesis interface sensor; receiving sensor data from the sensor device, the sensor data comprising motion sensor data and human-prosthesis interface data; and based on the received sensor data, causing movement of the virtual prosthetic limb in the virtual reality environment to allow the user to perform the low level skill.
24. The method of claim 23, wherein the high level goal relates to a real-world task.
25. The method of claim 23 or 24, wherein the low level skill comprises at least one of a basic skill and a functional skill.
26. The method of any one of claims 23 to 25, wherein the virtual reality environment is used for the purpose of training and/or rehabilitation.
27. The method of any one of claims 23 to 26, further comprising processing the sensor data to derive at least one performance metric, wherein the at least one performance metric is derived by comparing the sensor data with at least one predetermined parameter.
28. The method of claim 27, further comprising determining a recommendation for the user based on the at least one performance metric, and presenting to the user the recommendation.
29. The method of any one of claims 1 to 28, wherein the sensor device comprises at least one inertial measurement unit.
30. The method of any one of claims 1 to 29, wherein the sensor device comprises at least one camera.
31. The method of any one of claims 1 to 30, wherein the sensor device comprises at least one surface electromyography sensor.
32. A computer readable storage medium storing executable code, wherein when a processor executes the code, the processor is caused to perform the method of any one of claims 1 to 31.
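The comparison with clustered historical population data recited in claims 15 to 18 above could, purely as a sketch under assumed feature definitions, be implemented by grouping historical feature vectors with k-means and scoring a user by distance from a cluster designated as desirable; the features, cluster count and scoring below are illustrative assumptions only and do not represent the claimed method itself.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical historical feature vectors from a population of users performing
# the same virtual-reality task (e.g. [completion_time_s, mean_emg_activation,
# compensatory_trunk_lean_deg]); values are made up for illustration.
rng = np.random.default_rng(0)
historical = np.vstack([
    rng.normal([12.0, 0.35, 5.0], [2.0, 0.05, 2.0], size=(40, 3)),   # proficient users
    rng.normal([30.0, 0.70, 25.0], [5.0, 0.10, 5.0], size=(40, 3)),  # novice users
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(historical)

# Assume the cluster with the lowest mean completion time is the "desired" cluster.
desired = int(np.argmin(kmeans.cluster_centers_[:, 0]))

def performance_metric(sensor_features: np.ndarray) -> float:
    """Distance from the desired cluster centre; lower indicates behaviour
    closer to that of the proficient population."""
    return float(np.linalg.norm(sensor_features - kmeans.cluster_centers_[desired]))

new_user = np.array([14.0, 0.40, 8.0])
print(f"distance to desired cluster: {performance_metric(new_user):.2f}")
print(f"nearest cluster is desired: {kmeans.predict(new_user.reshape(1, -1))[0] == desired}")
```

Alternatively, as in claim 18, the metric could simply record which cluster is nearest to the user's sensor data rather than a continuous distance.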
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2021903944A0 (en) | 2021-12-06 | | Methods and systems for rehabilitation of prosthesis users |
AU2021903944 | 2021-12-06 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023102599A1 (en) | 2023-06-15 |
Family
ID=86729285
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2022/051456 (WO2023102599A1) | Methods and systems for rehabilitation of prosthesis users | 2021-12-06 | 2022-12-06 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023102599A1 (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006086504A2 (en) * | 2005-02-09 | 2006-08-17 | Alfred E. Mann Institute For Biomedical Engineering At The University Of Southern California | Method and system for training adaptive control of limb movement |
US20180301057A1 (en) * | 2017-04-14 | 2018-10-18 | REHABILITATION INSTITUTE OF CHICAGO d/b/a Shirley Ryan AbilityLab | Prosthetic Virtual Reality Training Interface and Related Methods |
WO2020023421A1 (en) * | 2018-07-23 | 2020-01-30 | Mvi Health Inc. | Systems and methods for physical therapy |
Non-Patent Citations (2)
Title |
---|
CHRISTIAN NISSLER; MARKUS NOWAK; MATHILDE CONNAN; STEFAN BÜTTNER; JÖRG VOGEL; INGO KOSSYK; ZOLTÁN-CSABA MÁRTON: "VITA—an everyday virtual reality setup for prosthetics and upper-limb rehabilitation", JOURNAL OF NEURAL ENGINEERING, INSTITUTE OF PHYSICS PUBLISHING, BRISTOL, GB, vol. 16, no. 2, 13 March 2019 (2019-03-13), GB, pages 026039, XP020339949, ISSN: 1741-2552, DOI: 10.1088/1741-2552/aaf35f * |
D. DHAWAN ET AL.: "Prosthetic Rehabilitation Training in Virtual Reality", 2019 IEEE 7TH INTERNATIONAL CONFERENCE ON SERIOUS GAMES AND APPLICATIONS FOR HEALTH (SEGAH), CONFERENCE, 5 August 2019 (2019-08-05), pages 1 - 8, XP033651077, Retrieved from the Internet <URL:https://ieeexplore.ieee.org/document/8882455> DOI: 10.1109/SeGAH.2019.8882455 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22902512; Country of ref document: EP; Kind code of ref document: A1 |
| DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
| NENP | Non-entry into the national phase | Ref country code: DE |