US20120157263A1 - Multi-user smartglove for virtual environment-based rehabilitation - Google Patents


Info

Publication number
US20120157263A1
US20120157263A1 (application US13/145,436)
Authority
US
United States
Prior art keywords
recited
input device
user
movement
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/145,436
Inventor
Mark Sivak
Maureen K. Holden
Constantinos Mavroidis
Avi Bajpai
Caitlyn Bintz
Jason Chrisos
Andrew Clark
Drew Lentz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US13/145,436 priority Critical patent/US20120157263A1/en
Assigned to NORTHEASTERN UNIVERSITY reassignment NORTHEASTERN UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHRISOS, JASON, SIVAK, MARK, BAJPAI, AVI, CLARK, ANDREW, LENTZ, DREW, BINTZ, CAITLYN, MAVROIDIS, CONSTANTINOS, HOLDEN, MAUREEN K.
Publication of US20120157263A1 publication Critical patent/US20120157263A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7475User input or interface means, e.g. keyboard, pointing device, joystick
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B23/00Exercising apparatus specially adapted for particular parts of the body
    • A63B23/035Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously
    • A63B23/12Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for upper limbs or related muscles, e.g. chest, upper back or shoulder muscles
    • A63B23/16Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for upper limbs or related muscles, e.g. chest, upper back or shoulder muscles for hands or fingers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass
    • G09B19/003Repetitive work cycles; Sequence of movements
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16ZINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS, NOT OTHERWISE PROVIDED FOR
    • G16Z99/00Subject matter not provided for in other main groups of this subclass
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09Rehabilitation or training
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022Monitoring a patient using a global network, e.g. telephone networks, internet
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6804Garments; Clothes
    • A61B5/6806Gloves
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B22/00Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
    • A63B2022/0094Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements for active rehabilitation, e.g. slow motion devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0087Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
    • A63B2024/0096Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load using performance related parameters for controlling electronic or video games or avatars
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638Displaying moving images of recorded environment, e.g. virtual environment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0655Tactile feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/10Positions
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/10Positions
    • A63B2220/16Angular positions
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/40Acceleration
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/50Force related parameters
    • A63B2220/51Force
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/805Optical or opto-electronic sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/89Field sensors, e.g. radar systems
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/20Miscellaneous features of sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/50Wireless data transmission, e.g. by radio transmitters or telemetry
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/50Wireless data transmission, e.g. by radio transmitters or telemetry
    • A63B2225/54Transponders, e.g. RFID
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B23/00Exercising apparatus specially adapted for particular parts of the body
    • A63B23/035Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously
    • A63B23/04Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for lower limbs
    • A63B23/08Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for lower limbs for ankle joints
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes

Definitions

  • a device and system for rehabilitating hand and finger movements of stroke patients with neurological or orthopedic problems are disclosed, and, more specifically, a device and system that are structured and arranged to capture hand and wrist motion for the purpose of guiding a patient/user through rehabilitation exercises.
  • a rehabilitation program that can be performed in a stroke patient's home, without a visiting therapist being physically present, could further increase rehab participation.
  • Such a program saves transportation time and clinical fees.
  • a key component of poor or incomplete functional recovery remains the impaired use of the stroke patient's hand and fingers.
  • Critical motions of the hand and wrist have been determined to be gross finger flexion and extension, opposition of the thumb, radial and ulnar deviation, supination and pronation of the wrist, wrist flexion and extension, and the hand's position and orientation in space. Accordingly, there is a compelling need to improve available methods for UE rehabilitation in stroke patients, and in particular, methods that improve hand and finger function.
  • Hand rehabilitation devices that are designed to be low-cost for in-home use remain a viable option for continuing the rehabilitation of the increasing number of stroke patients in the United States.
  • hand rehabilitation devices by others tend to be complex, expensive, and/or not readily available to clinicians.
  • FIG. 1 shows the P5 Glove™.
  • the P5 Glove™ 10 was released for the home personal computer (PC) video game market in 2002 by Essential Reality, LLC of New York, N.Y. Although it is not wireless, the P5 Glove™ 10 is lightweight and allows for six degrees of tracking freedom including the three translational axes (the x-, y-, z-direction) and the three rotational axes (yaw, pitch, and roll).
  • the P5 Glove™ device 10 uses infrared (IR) technology, e.g., light-emitting diodes (LEDs), and bend sensors, to track movement of the patient/user's hand and/or fingers.
  • Bend sensors 9 , which are discussed in greater detail below, are adapted to measure the bend or flexion of a digit, e.g., a finger or toe.
  • the bend sensors 9 are disposed along the back of each finger and thumb, to provide independent finger and thumb measurements.
  • the bend sensors 9 are mechanically coupled to the patient/user's hand by a ring 8 that fits around each fingertip.
  • IR sensors are structured and arranged to determine three-dimensional (3D) positioning.
  • the P5 Glove™ device 10 is electrically coupled to an infrared tower (or receptor) (not shown), e.g., via a PS/2 cable (not shown).
  • the infrared tower is electrically coupled to a PC (not shown), e.g., via a USB cable (not shown).
  • IR sensors can be used to determine position.
  • the first way involves a single IR LED that is disposed at a pre-set, known position.
  • a sensor is disposed on the tracking object. Based on the angle and intensity of the sensed light, the position of the sensors relative to the LED can be determined.
  • a second way in which IR light can be used to determine position is by disposing a single, IR light detecting sensor at a known position and by moving objects that emit IR light relative to the sensors.
  • the LED and the IR detecting sensors are disposed proximate each other. IR light from the LED reflects off of objects within the illumination area. The sensor picks up the reflected light, from which the position of the reflecting object can be determined.
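As a rough illustration of the emitter/sensor geometry described above, received IR intensity can be converted to distance if an ideal inverse-square falloff is assumed. The following Python sketch is illustrative only and is not taken from the patent; the calibration constants are hypothetical.

```python
import math

def ir_distance_estimate(sensed_intensity: float,
                         calib_intensity: float,
                         calib_distance: float = 1.0) -> float:
    """Estimate emitter-to-sensor distance from received IR intensity.

    Assumes an ideal inverse-square falloff, I(d) = I(d0) * (d0/d)**2,
    so d = d0 * sqrt(I(d0)/I(d)). A real system would also compensate
    for incidence angle and subtract ambient IR.
    """
    if sensed_intensity <= 0:
        raise ValueError("no signal; the beam path may be obstructed")
    return calib_distance * math.sqrt(calib_intensity / sensed_intensity)

# Example: calibrated to read 400 (arbitrary ADC units) at 1 m;
# a reading of 100 then implies a distance of about 2 m.
print(ir_distance_estimate(100, 400))  # -> 2.0
```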
  • Many multi-touch tables such as the Microsoft® Surface utilize reflective infrared technologies.
  • although infrared positioning is accurate and relatively inexpensive, it is not the most useful or most accurate method of determining the position of a P5 Glove™. Indeed, because IR light detection is predicated on beams of light traveling between an emitter and a sensor, obstructions to the beam path limit this capability. Consequently, because a patient/user's hands move in many directions and at many angles, there is no guarantee that emitted IR beams will reach the sensor without being obstructed or reflected.
  • FIG. 2 shows a Hand Mentor Rehabilitation Device 11 (“Hand Mentor”).
  • the Hand Mentor 11 is manufactured by Columbia Scientific, LLC of Arlington, Ariz. and has been cleared by the Food and Drug Administration (FDA) for use in rehabilitation clinics.
  • the Hand Mentor 11 encourages patients to restore the range of motion of their wrist and hand using the principles of Repetitive Task Practice (RTP) and Constraint Induced Therapy (CI). As shown in FIG. 2 , the hand of the patient fits into a sleeve 12 that is adapted to sense and to generate a signal commensurate with the level of resistance caused by flexor spasticity.
  • the device 11 offers three different program types that are adapted to reduce spasticity, to recruit specific muscle groups, and/or to improve motor control.
  • the resistance signals are transmitted to a processing device 13 that includes software (or, alternatively, is hard wired) that is designed for unsupervised patient use of the device 11 .
  • the device 11 can also offer a therapist option, to establish rehabilitation regimens and generate data for documenting and reporting the patient/user's progress.
  • SensAble Technologies manufactures a line of haptic input devices 14 , which are designed to gather motion input and to provide feedback to the patient/user's fingers, hand, and arm.
  • the ST device 14 most suited for use in a rehabilitation virtual environment is a six degree-of-freedom PHANTOM® SensAble model, in which patients/users grasp a pen- or pencil-like portion 15 of the device 14 in either hand and control the x-direction, y-direction, z-direction, roll, pitch, and yaw.
  • the PHANTOM® six degree-of-freedom device 14 interfaces with a PC (not shown) via a parallel port (not shown).
  • the device 14 typically comes bundled with several software demos and with a software development kit specific to the inputs and limitations of the device.
  • the Falcon from Novint Technologies, Inc. of Albuquerque, N. Mex. is shown in FIG. 4 .
  • the Falcon device 16 was originally designed as an input device for playing games on a PC.
  • the device 16 has three degrees-of-freedom; however, different grips with a plurality of buttons or dials can be added to provide more degrees of freedom.
  • the Falcon device 16 includes a 4″×4″×4″ workspace and has a two-pound (force) capability.
  • the Falcon device 16 interfaces with a PC, e.g., using a universal serial bus (USB), e.g., USB 2.0.
  • the Falcon device 16 is sold with several games already available for it as well as driver software to play PC games.
  • the Wii™ controller 17 has two input device components: a Wii Remote™ controller 18 and a Nunchuk™ controller 19.
  • the Wii Remote™ controller 18 , when used alone, is normally held in a player's dominant hand. However, players choosing to use both the Nunchuk™ controller 19 and the Wii Remote™ controller 18 usually hold the Wii Remote™ controller 18 in the right hand and the Nunchuk™ controller 19 , which is electrically coupled to the Wii Remote™ controller 18 , in the player's left hand.
  • the Wii™ gaming console (not shown) connects directly to a power source and to a television or other display device.
  • Each Wii Remote™ controller 18 and each Nunchuk™ controller 19 communicates with the gaming console wirelessly via a sensor bar (not shown), e.g., using Bluetooth wireless technology.
  • a Rutgers Master II-ND Force Feedback Glove 20 is shown.
  • the glove device 20 was developed in 2002 at Rutgers University and is structured and arranged to use a plurality of, e.g., four, pneumatic actuators 21 and a plurality of sensors.
  • the direct-drive configuration of the actuators 21 provides force to the tips of the fingers 23 via finger rings 24 that are mechanically coupled to the actuators 21 .
  • Sensors are disposed on the patient/user's palm 22 , to avoid the presence of wires at the fingertips 23 .
  • the Rutgers Master II-ND device 20 is a research-only device and there is no indication of software or a software development kit.
  • the CyberGlove device 25 is one of the leading products for sensing and capturing motion in the current market.
  • the device 25 is made from lightweight elastic and each sensor is extremely thin and flexible, making the sensors virtually undetectable.
  • the fabric on top of the device 25 is a stretch material that is provided for comfort.
  • the fabric on the bottom of the device 25 is made of an open or mesh material for better ventilation.
  • the device 25 is wireless and has a capacity of making eighteen or twenty-two high-accuracy, joint-angle measurements.
  • the glove 25 uses a proprietary resistive bend-sensing technology to capture real-time digital joint-angle data.
  • the 18-sensor model includes two bend sensors that are disposed on each finger, four abduction sensors, and sensors for monitoring thumb crossover, palm arch, wrist flexion, and wrist abduction.
  • the 22-sensor model includes a third bend sensor for each finger.
  • the CyberGlove II device 25 is electrically coupled to a PC, e.g., using a wireless USB receiver.
  • the software that comes bundled with the glove 25 is for evaluation purposes only and is not for virtual reality.
  • There is no publicly available software development kit for the CyberGlove device 25 .
  • the 5DT Data Glove device 26 shown in FIG. 8 is designed for use in motion capture and animation.
  • the device 26 material is stretch Lycra® and the fingertips are exposed to facilitate the grasping function.
  • the device 26 is adapted to sense multiple bends, e.g., finger flexion, but is unable to measure the attitude or orientation of the hand in space or with respect to the patient/user's body.
  • the device 26 features automatic calibration and has an on-board processor (not shown).
  • the device 26 is available in two models: the 5DT Data Glove 5 Ultra, which includes five bend sensors to measure discrete finger and thumb flexure, and the 5DT Data Glove 14 Ultra (depicted in FIG. 8 ), which uses two bend sensors on each finger and thumb and one bend sensor per abduction between adjacent digits.
  • the 5DT Data Glove 5 Ultra device 26 is adapted to include Bluetooth technology to make it wireless and, also, can include a cross-platform SDK.
  • the bundled software that comes with the device 26 has no rehabilitation applications.
  • FIGS. 1-8 Combinations of the commercially-available prior art technologies shown in FIGS. 1-8 have also been investigated by others for hand rehabilitation purposes.
  • one such system combines the Rutgers Hand Master I and Hand Master II ( FIG. 6 ); the system uses the palm-mounted pneumatic pistons 21 and virtual reality to improve resisted finger flexion and non-resisted finger extension.
  • Robotic devices that train the entire arm have also been shown to benefit stroke patients. More recently, the Bi-Manu-Track robotic arm trainer has been found to be equally as effective as electrical stimulation training.
  • a study utilizing the Howard Hand Robot found greater mobility gains for stroke subjects who exercised with robotic assistance in virtual reality during a relatively longer, e.g., three-week, intervention in comparison with subjects who had robotic assistance only during the last week-and-a-half of training.
  • None of these devices meets the need for a low-cost, simple, UE motor training device that patients could use easily in their homes, and, potentially, use with other patients over a network, the Internet, and the like.
  • few virtual environment-based (VE-based) robotic systems for hand rehabilitation exist. Those that do exist are prohibitively expensive, and most are not commercially available.
  • none is suitable for independent home use by patients and, furthermore, none provides for multiple patient/user interaction over the Internet.
  • there remains a need for a low-cost device that stroke or other patients could independently use in the home to improve UE function, and especially UE function of the hand.
  • Such a device would also be useful as an adjunct to ongoing rehabilitation therapy, providing patients with an interesting and motivating way to perform a home exercise program. If designed appropriately, such a system could be used by a therapist to establish exercise programs that were adjustable in level of difficulty, and tailored to the patient's specific interests. These features would likely increase patient motivation and compliance.
  • the MUVER (Multiple-User Virtual Environment for Rehabilitation) is structured and arranged to enable multiple patients and system users at remote locations to interact with each other in virtual space with activities designed to enhance UE and skilled-hand function.
  • the intended application is for use as a supplemental, in-home rehabilitation tool for people with hand function and coordination disabilities, specifically the type of disability that would result from a stroke.
  • MUVER will be the first inexpensive, VE-based system that patients could purchase, e.g., for home use, that is specifically designed to enhance finger and thumb movement in addition to arm movement.
  • the MUVER system is flexible enough to include a variety of different rehabilitation devices to control the MUVER software.
  • the MUVER system is structured and arranged to monitor force and torque produced by the hand and fingers during grasping and manipulation tasks and can be extended to control ankle movements.
  • the system includes a virtual reality game-type interface that will have “scenes” developed specifically for patients with stroke who need to practice finger, hand, and arm movements.
  • the activities will be functional movements that involve the whole arm as well as hand, but with specific emphasis on hand and finger motions.
  • the device and system uniquely combine an ability to track hand position and orientation in space with tracking of finger and thumb configuration using an input device. This feature combination is critical to using the device to display a wide variety of hand and upper extremity exercises in virtual reality displays.
  • One such input device is fashioned like a glove for use with a multiple-user virtual environment system for rehabilitation exercise of a human hand and digits.
  • the input device is structured and arranged to generate signals corresponding to at least one of a discrete movement and an attitude of said hand and said digits.
  • the device includes a glove that can be readily donned and doffed on either hand by a user; a first plurality of sensors, each sensor being structured and arranged to provide data on movement and range of movement of at least one of the index finger, the middle finger, and the ring finger; a second plurality of sensors that is structured and arranged to provide data on movement of the thumb; and a positioning and tracking system that is structured and arranged to generate position coordinates in three rotational axes and three translational axes to determine at least one of the attitude and a velocity of said hand.
  • Another unique feature will be feedback lights placed on the back of the hand which will allow the patient to know if they are performing the correct motion while looking at their hand, as opposed to the screen. This will allow patients with impaired perceptual abilities to concentrate on the task while not having to interact as much with a computer interface.
  • the rehabilitation exercises will essentially be mini-games, so the device could easily be adapted for non-rehabilitation related virtual gaming.
  • FIG. 1 shows a view of a P5 Glove
  • FIG. 2 shows a view of a Hand Mentor Rehabilitation Device
  • FIG. 3 shows a view of SensAble Phantom® devices
  • FIG. 4 shows a view of a Novint Falcon device
  • FIG. 5 shows front and side views of a Wii™ remote control and a view of a Wii™ Nunchuk™ device
  • FIG. 6 shows a view of a Rutgers Master II-ND Force Feedback Glove device
  • FIG. 7 shows a view of a CyberGlove™ II device
  • FIG. 8 shows a view of a 5DT Data Glove 5 Ultra device
  • FIG. 9 shows a schematic of the multi-user virtual reality rehabilitation (MUVER) system in accordance with the present invention.
  • FIG. 10 shows schematics of four common multi-user virtual interactions
  • FIG. 11 shows a back side of a glove input device having bend sensors disposed on finger portions of the glove in registration with the index, middle, and ring fingers;
  • FIG. 12 shows bend sensors disposed on the glove of FIG. 11 in registration with the thumb and the base of the wrist;
  • FIG. 13 shows bend sensors for capturing wrist flexion/extension and for tracking radial and ulnar deviations, and an IMU;
  • FIG. 14 shows an arrangement of electroluminescent wire light-emitting devices for providing visual signals to a patient/user
  • FIG. 15 shows two views of a second glove input device embodiment
  • FIG. 16 shows an IMU and a processing unit for the glove in FIG. 15 ;
  • FIG. 17 shows an embodiment of an input glove device using Hall effect sensing
  • FIG. 18 shows the palm portion of the input glove device shown in FIG. 17 ;
  • FIG. 19 shows the back of the hand portion of the input glove device shown in FIG. 17 ;
  • FIG. 20 shows an arm sleeve embodiment of a glove input device
  • FIG. 21 shows a schematic of a hardware interface in accordance with the present invention.
  • FIGS. 22A and 22B show embodiments of a banana grip base structure
  • FIG. 23 shows an embodiment of a globe base structure
  • FIGS. 24A and 24B show embodiments of teardrop-shaped base structures
  • FIGS. 25A and 25B show embodiments of pyramid base structures
  • FIG. 26 shows a programming schematic for scripting a virtual reality scene
  • FIGS. 27A-27D show graphics of four stages of an exemplary virtual environment
  • FIG. 28 summarizes the mean and standard deviations of various testing sub-phases shown in FIGS. 27A-27D ;
  • FIG. 29 shows another schematic of the multi-user virtual reality rehabilitation (MUVER) system in accordance with the present invention.
  • FIG. 30 shows top and bottom portions to a ring prototype
  • FIG. 31 shows the top and bottom portions of FIG. 30 assembled
  • FIG. 32 shows an IMU disposed in the assembled ring prototype
  • FIG. 33 shows the ring prototype with a plurality of bend sensors
  • FIG. 34 shows a knuckle plate for the prototype of FIG. 33 ;
  • FIG. 35 shows a ring prototype mechanically coupled to a knuckle plate
  • FIG. 36 shows an exemplary virtual environment scene for a single degree of freedom knob
  • FIG. 37 shows an exemplary virtual environment scene for a single degree of freedom hand device
  • FIG. 38 shows an exemplary virtual environment scene for an active hand device
  • FIG. 39 shows an embodiment of a SmartGlove input device
  • FIGS. 40A and 40B show MCP flexion/extension set-ups for 45 degrees and 90 degrees, respectively;
  • FIGS. 40C and 40D show PIP flexion/extension set-ups for 45 degrees and 90 degrees, respectively;
  • FIG. 41A shows an illustrative bar graph of MCP bend data for the input device shown in FIGS. 40A and 40B ;
  • FIG. 41B shows an illustrative bar graph of PIP bend data for the input device shown in FIGS. 40C and 40D ;
  • FIG. 42 shows a schematic of a bi-manual SmartGlove system with arm splints for neutral wrist position and support.
  • the NU-MUVER (Northeastern University Multiple-User Virtual Environment for Rehabilitation) system is designed to be used at home and/or over a network, e.g., a LAN, a WAN, the World Wide Web, the Internet, and the like, alone or with others, e.g., a therapist, other patients, and so forth.
  • the NU-MUVER consists of three basic components: an input device that generates data on position, attitude, and orientation of the patient/user's hand in space as well as of individual finger and thumb movements; commercially-available graphics software that provides object and animation routines that can be used to construct various movement re-training scenes; and a control unit that includes control software that enables a networking capability, movement parsing, performance scoring, recording, storage, manipulation, and display of data. It also includes multiple training “scenes” that are designed to facilitate the practice of a particular movement(s) that is/are therapeutic for discrete patient populations.
  • a MUVER system 90 in accordance with the present invention is shown.
  • the system 90 is structured and arranged to provide a plurality of virtual environments 100 designed for specific rehabilitation exercises and for multiple patients/users 91 to interact with others, and with third parties 96 , e.g., medical personnel, physical therapists, and the like.
  • Virtual environments 100 , or worlds, that are designed for more than one patient/user 91 are called Multi-User Virtual Environments (MUVE). Because the instant MUVE is for rehabilitation, the system 90 is referred to as a “Multi-User Virtual Environment for Rehabilitation” or MUVER 90 .
  • the elements of the MUVER 90 are shown in the figure and are discussed in greater detail below.
  • the MUVER system 90 is designed to be modular, which is to say that the number of patients/users 91 and the size of the virtual environment as a whole, or of each discrete, individual or personal virtual environment 100 , can vary and, moreover, can be easily changed.
  • each patient/user 91 is equipped with his/her own personal computer (PC) 93 on which MUVER software 94 is installed.
  • an input device 92 , the PC 93 , and the software 94 define each personal virtual environment 100 .
  • the input device 92 in each individual virtual environment 100 is adapted to enable each patient/user 91 to interact with other patients/users 91 , a third party 96 , and the like.
  • Communication from and between personal virtual environments 100 takes place over and through a virtual environment network 95 , e.g., a LAN, a WAN, the World Wide Web, the Internet, and the like.
  • This approach differs appreciably from other virtual environments in which a dedicated server operates the virtual environment for each of the patients/users.
  • This feature facilitates recording and logging communications between the virtual environment network 95 and a third party's computer 96 for later evaluation.
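The serverless character of this arrangement can be pictured with a short sketch: each peer both broadcasts its own state and logs what it receives, so no dedicated server is required and a third party can review the log afterward. This is a minimal Python illustration; the UDP transport, JSON message format, addresses, and log path are all assumptions, not details from the patent.

```python
import json
import socket
import time

PEERS = [("192.168.1.20", 9100)]   # other patients/users (hypothetical address)
LOG_PATH = "muver_session.log"     # kept for later third-party review

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 9100))
sock.setblocking(False)            # poll without stalling the exercise loop

def send_state(state: dict) -> None:
    """Broadcast this peer's hand-pose state to every other peer."""
    packet = json.dumps({"t": time.time(), **state}).encode()
    for peer in PEERS:
        sock.sendto(packet, peer)

def poll_and_log() -> list:
    """Drain any pending peer messages, logging each one."""
    received = []
    with open(LOG_PATH, "a") as log:
        while True:
            try:
                data, addr = sock.recvfrom(4096)
            except BlockingIOError:
                break
            log.write(f"{addr}: {data.decode()}\n")
            received.append(json.loads(data))
    return received
```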
  • the design of a unique virtual environment has several stages.
  • the first stage is to use the nature of the patient/user's disability to determine what rehabilitation exercises or movements would be appropriate and feasible to emulate in a virtual environment.
  • the rehabilitation exercises or movements are selected by a physical therapist, a physician, a medical specialist, and the like.
  • the next step is to choose the character of multi-user interaction.
  • the character of interaction is determined by a physical therapist, a physician, a medical specialist, and the like.
  • Common types of multiplayer or multi-user virtual interaction, e.g., competitive interaction, counter-operative (versus) interaction, cooperative interaction, mixed interaction, and any other combination of the first three interactions, are illustrated in FIG. 10 .
  • “Competitive interaction” occurs where each patient/user 91 of a plurality of patients/users 91 , who have no direct interaction between them, completes the same task having the same goals for which a comparative score can be assigned. “Counter-operative” or “versus interactions” occur where a first patient/user 91 works against a second patient(s)/user(s) 91 to achieve competing goals, which only one of the patients/users 91 can obtain. “Cooperative interaction” occurs where two or more patients/users 91 work jointly to complete a common goal or task. “Mixed interactions” occur where patients/users 91 work together to complete a common goal but the performance of each patient/user 91 is scored comparatively.
  • the setting of the individual virtual environment 100 can be established.
  • the setting is simple and appropriate for the discrete patient/user 91 and, moreover, is designed to make the goal(s) clear for each patient/user. After the virtual environment 100 is completely designed, rigorous testing for both interaction and substance is necessary.
  • the MUVER system 90 includes three components: an input device 92 , a graphic display device, and a controller.
  • the MUVER system 90 will be described in terms of a SmartGlove™ as the input device, a Panda3D graphics engine for the graphics display device, and specialty software and driver programs for controlling the system 90 . These components are discussed in the subsequent sections; first, however, brief descriptions of sensors and of positioning and tracking systems are provided.
  • Sensing devices are provided to sense movement, e.g., bending, flexion, and so forth.
  • the bend or flexion of a human finger can be measured using various methods, which are collectively referred to as bend sensors.
  • Bend sensors use physical geometries and material properties to alter an electrical signal in proportion with angle or pressure. Bend radius and bend angle affect sensor output voltage. Bend sensors have been used for finger position measurement for quite some time, with the first large-scale commercial application appearing in 1989 with the Nintendo® Power Glove. There remains a wide range of currently-available products that use bend sensors, from very simple to very expensive.
  • bend sensors include optical fiber sensors and mechanical measurement devices.
  • Electromechanical sensors provided in, for example, the Nintendo® Power Glove use the patented technology of Abrams Gentile Entertainment Inc. (“Abrams”). Abrams defines five different electromechanical methods for changing the resistance of an electrically-conductive construction based on a bend angle.
  • a first economic application of these technologies involves a resistive sensor having a carbon-based, electrically-conductive ink as a stretched part, which changes electrical resistance in response to applied pressure. Using simple baseline calibration routines, reliable measurements of bend angle are attainable.
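A two-point baseline calibration of this kind can be as simple as recording the sensor reading with the finger straight and again at a known reference bend, then interpolating. The sketch below is a minimal, hypothetical example; the ADC counts are per-session calibration values, not figures from the patent.

```python
FLAT_COUNTS = 512   # hypothetical ADC reading with the finger straight (0 deg)
BENT_COUNTS = 310   # hypothetical ADC reading at a known 90-degree bend

def counts_to_angle(adc_counts: int) -> float:
    """Linearly interpolate bend angle (degrees) from raw ADC counts."""
    return 90.0 * (adc_counts - FLAT_COUNTS) / (BENT_COUNTS - FLAT_COUNTS)

print(counts_to_angle(411))  # halfway between the calibration points -> 45.0
```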
  • Optical bend sensors typically include a light source that is coupled to a light detector using, for example, an optical fiber. As the fiber bends, rays escape instead of undergoing total internal reflection (TIR), so less light traverses the length of the fiber. For example, at higher bend angles, relatively few rays strike the detector and more rays exit the fiber at large angles.
  • This particular optical technology was used to develop the Data Glove, one of the earliest hand data recording systems, made by VPL Research, Inc.
  • the optical method provides a repeatable measurement of a bend angle; however, it is less cost effective than either of the previously described technologies.
  • Other optical technologies improve on the concept by using multiple fibers in a bundle or by pre-bending the fiber in a certain direction, and are therefore able to measure direction of bend as well as magnitude.
  • Hall effect sensors are switches that are activated in the presence of a magnetic field, such as that generated by a magnetic field-producing device, e.g., a magnet.
  • the sensor contains a conductor that carries an electrical current and generates a magnetic field perpendicular thereto.
  • the moving charges follow a straight line except when in proximity to an external magnetic field, at which time the path of each charge becomes non-linear, i.e., curves, and charge accumulates on one face of the sensor.
  • the distance at which the magnetic field causes the sensor to act like a switch is a function of the strength of the magnetic field and, therefore, the magnet, and the current density specified by the sensor.
  • the ability to generate position coordinates in six axes (three translational and three rotational) and the ability to continuously track the position coordinates are critical to the operability of the device and system.
  • Several commercially-available positioning systems can produce position coordinates accurately and track multiple points at once.
  • magnetic tracking systems combine very high tracking resolution with high-speed sampling, which makes them well-suited to virtual reality simulations.
  • magnetic tracking systems are very expensive and, furthermore, the likelihood of successfully integrating magnetic tracking in an inexpensive, home system is not very high.
  • the magnetic fields associated with these systems also may experience high interference in home operation, affecting proper and satisfactory system operation.
  • Radio frequency positioning and tracking, e.g., using a few radio-frequency identification (RFID) tags in combination with a plurality of receiving units, is another option.
  • RFID systems determine the positioning of an object, e.g., a hand, by triangulating, e.g., measuring the time it takes for the RFID signal to travel to/from the object for each of the plurality of receiving units.
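The triangulation step can be sketched as a range-based trilateration: each receiver's travel-time measurement becomes a range, and the tag position is recovered by solving the resulting sphere equations. The linearized least-squares solve below is a generic illustration, not the patent's method; the receiver positions and ranges are made up.

```python
import numpy as np

def trilaterate(receivers, ranges):
    """Solve |x - p_i| = r_i for x by subtracting the first sphere equation
    from the others, which yields a linear system 2(p_i - p_0)·x = b_i."""
    p = np.asarray(receivers, dtype=float)   # shape (n, 3), n >= 4 for 3D
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
         - (r[1:] ** 2 - r[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

receivers = [[0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2]]
tag = np.array([0.5, 0.7, 0.3])
ranges = [np.linalg.norm(tag - np.array(q)) for q in receivers]
print(trilaterate(receivers, ranges))  # -> approx. [0.5, 0.7, 0.3]
```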
  • RFID systems operate at or near the wireless spectrum of most household devices, making interference an issue. RF systems also would not provide the accuracy necessary for the present invention.
  • Infrared positioning, which is discussed above, can be both accurate and inexpensive. However, IR relies on line-of-sight signals, making obstructions a huge problem.
  • inertial measurement units (IMUs) are adapted to determine the orientation of an object (in space), the velocity of the object, and 3D positions using dead reckoning.
  • IMUs can be structured and arranged to gather data from all six degrees of freedom and avoid the shortcomings of the IR and electro-magnetic options.
  • one reason why IMUs have not been used heretofore has to do with dead reckoning.
  • Dead reckoning refers to all object positions being measured relative to a pre-established and known initial starting (“home”) point.
  • sensors e.g., multi-axis accelerometers, gyroscopes, and the like, provide data to a processing unit that is adapted to calculate the speed of the IMU and the distance traveled from home.
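In code, dead reckoning reduces to double-integrating gravity-compensated acceleration from the known home point, re-zeroing whenever the glove returns to the base station. The sketch below is a bare-bones illustration (no gyro-based frame rotation or drift correction); the sample rate is an assumption.

```python
import numpy as np

DT = 0.01  # assumed 100 Hz IMU sample period

class DeadReckoner:
    def __init__(self):
        self.vel = np.zeros(3)
        self.pos = np.zeros(3)  # displacement relative to the home point

    def home(self) -> None:
        """Re-zero when the glove rests at the known starting point."""
        self.vel[:] = 0.0
        self.pos[:] = 0.0

    def step(self, accel_xyz) -> np.ndarray:
        """Integrate one gravity-compensated accelerometer sample (m/s^2)."""
        a = np.asarray(accel_xyz, dtype=float)
        self.vel += a * DT
        self.pos += self.vel * DT
        return self.pos
```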
  • an illumination source emits an IR light having a discrete, pre-established frequency, e.g., 44 MHz.
  • a plurality of sensors (half of which operate at the pre-established frequency and half of which are out-of-phase with that frequency) measures the time it takes for the IR light to be reflected by an object and to return to the sensor, which is to say, the time-of-flight (TOF).
  • TOF data provide accurate depth data of the object, which can be gathered as quickly as an acceptable 60 frames per second.
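The depth computation itself is simple: the light travels out and back, so the one-way depth is half the round-trip time multiplied by the speed of light. The sketch below illustrates this; phase-based sensors, like the in-phase/out-of-phase pair described above, derive the round-trip time from phase shift rather than by timing a pulse directly.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_depth(round_trip_seconds: float) -> float:
    """One-way depth of the reflecting object from IR time-of-flight."""
    return C * round_trip_seconds / 2.0

# Example: a ~6.7 ns round trip corresponds to roughly 1 m of depth.
print(tof_depth(6.67e-9))  # -> ~1.0
```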
  • SmartGlove™
  • An ideal input device 92 is a wearable glove that is sized to be universal, i.e., useable on either hand, or adjustable, and, optionally, has its fingertip portions removed, to accommodate different hand sizes.
  • the input device 92 is structured and arranged to enable a patient/user 91 to don it and doff it using only one hand. A total weight not to exceed one pound and a dorsal weight not to exceed eight ounces is recommended to facilitate use by patients/users 91 .
  • the input device 92 is structured and arranged to measure at least one of the following accurately: finger flexion/extension measured to at least 90°; wrist flexion/extension, i.e., dorsal action, at ±90°; wrist-radial deviation up to 40°; wrist-ulnar deviation up to 50°; and supination/pronation of the forearm up to 180°.
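Those requirements translate naturally into a self-test that compares the glove's measured ranges against the specification. The sketch below mirrors the list above; the dictionary keys and any measured values passed in are hypothetical.

```python
ROM_SPEC_DEGREES = {
    "finger_flexion_extension": 90,
    "wrist_flexion_extension": 90,      # i.e., +/- 90 degrees dorsal action
    "wrist_radial_deviation": 40,
    "wrist_ulnar_deviation": 50,
    "forearm_supination_pronation": 180,
}

def meets_spec(measured_max: dict) -> dict:
    """Return pass/fail for each motion the glove is required to capture."""
    return {motion: measured_max.get(motion, 0) >= required
            for motion, required in ROM_SPEC_DEGREES.items()}
```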
  • a first input device embodiment 70 includes bend sensors 71 , 72 , and 73 , which are disposed on the back of the input device 92 on finger portions that are in registration with the patient/user's index, middle, and ring fingers.
  • a sensor on the pinky finger, whose movement generally follows that of the adjacent ring finger very closely, is optional.
  • a bend sensor 74 can also be disposed on the back of the input device 92 in registration with the thumb and a bend sensor 75 can be disposed on the input device 92 at the base of the palm of the hand.
  • the latter bend sensor 75 is adapted to bend as the heel of the thumb crosses the palm to oppose one or more of the fingers, e.g., during a pinch motion.
  • a two-dimensional bend sensor 77 which is disposed on the back of the input device 92 in registration with the wrist and oriented along the axis of the ulna, is provided to capture wrist flexion/extension.
  • a bend sensor 78 is disposed on the ulnar side inside of the hand, generally oriented along the axis of the thumb in a neutral position. At a neutral hand position, the bend sensor 78 for radial and ulnar deviations is slightly bent. When the hand is rotated or turned in an outward direction from the neutral position, the same sensor 78 will appear to remain straight. However, when the hand is rotated or turned inwardly from the neutral position, the sensor 78 will detect and measure the greatest movement.
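The behavior of sensor 78 described above (slightly bent at neutral, straighter on outward rotation, more bent on inward rotation) suggests a simple three-way classification around a calibrated neutral reading. The sketch below is hypothetical; the counts, the dead-band, and the assumption that ADC counts rise with bend are illustration only.

```python
NEUTRAL_COUNTS = 480   # hypothetical calibrated reading at the neutral wrist
DEAD_BAND = 15         # hypothetical tolerance around neutral

def deviation_direction(adc_counts: int) -> str:
    """Classify wrist deviation from the radial/ulnar bend sensor reading,
    assuming ADC counts increase as the sensor bends further."""
    if adc_counts > NEUTRAL_COUNTS + DEAD_BAND:
        return "ulnar (inward)"     # sensor bends beyond its neutral set
    if adc_counts < NEUTRAL_COUNTS - DEAD_BAND:
        return "radial (outward)"   # sensor straightens toward flat
    return "neutral"
```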
  • the glove 70 can be hardwired to a base station that includes the electronics required to communicate to the PC 93 , e.g., wirelessly or via a USB 99 .
  • the base station can be ergonomically shaped and can include a mechanical button for dead reckoning purposes.
  • a plurality of, e.g., two, input buttons can also be provided to facilitate digital YES (or 1) and NO (or 0) input for navigating the software. To prevent false readings, the software will only recognize button input at discrete, pre-established times, e.g., between exercise sets.
  • a feedback system e.g., a haptic system, an audible speaker, and/or light emitting devices, can also be disposed on or within the finger portions of the glove and/or on the back of the glove 70 , e.g., with or in the IMU housing 76 , to provide vibratory or auditory clues and/or visual signals during exercises.
  • electro-luminescent (el) wire 79 can be exposed around each of the index, middle, and ring fingers so that when any of the fingers is moved into a correct position, the el wire 79 disposed about the correctly-positioned finger can be illuminated, e.g., by a sequencer (not shown) electrically coupled to a power source and a power control device (not shown), e.g., an inverter.
  • the glove 60 includes bend sensors 61 and 62 that are disposed, respectively, on the metacarpophalangeal (MCP) joint and the proximal interphalangeal (PIP) joint of the thumb and of each finger, including the pinky finger.
  • the MCP and PIP bend sensors 61 and 62 are adapted to record arcuate bend data associated with the motion or movement of each finger.
  • a third bend sensor 63 is disposed on the back of the hand at the base of the thumb.
  • a bi-directional bend sensor (not shown) is disposed to extend across the wrist on the palm side of the glove 60 .
  • a switch pad 64 , e.g., a capacitive touch sensor, can be disposed on or within the tip of the thumb portion of the glove 60 for providing and recording pinch data.
  • the controller is adapted to use the touch data and bend sensor data generated to differentiate and identify the pinching finger from the non-pinching fingers.
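One way to realize that differentiation is to treat the thumb's touch signal as a gate and then pick the most-flexed finger as the pinching one. The sketch below is a plausible reading of the idea, not the patent's algorithm; the threshold and field names are hypothetical.

```python
PINCH_BEND_THRESHOLD = 60.0  # hypothetical combined flexion angle, degrees

def identify_pinching_finger(thumb_touched: bool, bend_angles: dict):
    """Given the thumb switch-pad state and per-finger flexion angles,
    return the name of the pinching finger, or None."""
    if not thumb_touched:
        return None
    finger, angle = max(bend_angles.items(), key=lambda kv: kv[1])
    return finger if angle >= PINCH_BEND_THRESHOLD else None

print(identify_pinching_finger(
    True, {"index": 72.0, "middle": 35.0, "ring": 20.0, "pinky": 18.0}))
# -> "index"
```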
  • An IMU 65 is provided on the glove 60 , e.g., between the knuckles and the wrist on the portion of the glove 60 corresponding to the back of the patient/user's hand.
  • the inventors used an IMU 65 that includes three integrated circuits: two vibrating beam gyroscope assemblies and one micro-machined accelerometer. Together, the three chips generate data in all three acceleration axes and in all three rotational axes; hence, the IMU provides six degrees of freedom.
  • the IMU 65 is electrically coupled to a processing unit 66 that is removably attached to the patient/user's forearm as shown in FIG. 16 .
  • a Hall effect glove embodiment 50 is shown in FIGS. 17-19 .
  • Single bend sensors covering both the MCP and PIP joints 51 are disposed on each of the finger portions of the glove 50 .
  • a plurality, e.g., three, bend sensors 52 - 54 are disposed on the thumb portion of the glove 50 .
  • a bend sensor 52 is disposed between the thumb portion and the index finger portion of the glove 50 to track relative movement between the fingers and the thumb
  • another bend sensor 53 is disposed along the axis of the thumb portion to track movement of the thumb
  • a third bend sensor 54 is disposed at the base of the thumb portion of the glove 50 to measure roll of the wrist joint as the patient/user's thumb reaches across the palm.
  • Hall effect sensors 56 are disposed on the tips of each glove finger and a magnetic field generating device 57 , e.g., a magnet, is disposed at or near the tip of the glove thumb.
  • when the magnet on the thumb tip comes into proximity with a fingertip, the sensors 56 switch, providing data to the controller.
  • the controller is adapted to use the touch data and bend sensor data generated to differentiate between and to identify the pinching finger from the non-pinching fingers.
  • an IMU 52 is disposed on the back of the hand portion on the glove 50 .
  • Another Hall effect sensor 59 is also disposed in the palm of the glove 50 to enable dead reckoning. More specifically, whenever the patient/user 91 places his/her hand correctly on a base station (described in greater detail below) that is equipped with a magnetic-field generating device (not shown), the Hall effect sensor 59 will generate a signal from which the controller will call and execute an algorithm, software, driver programs, and the like to calibrate the IMU 52 .
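  • Purely as a sketch of the dead-reckoning idea, the following Python fragment re-zeroes an IMU state when the palm Hall effect sensor fires; the data structures and the "home" values are assumptions, not the patent's implementation.

```python
# Assumed dead-reckoning reset: when the palm Hall effect sensor 59 detects
# the base-station magnet, the IMU estimate snaps back to the known "home"
# pose, clearing any drift accumulated since the last reset.

HOME_POSITION = (0.0, 0.0, 0.0)  # base-station location (illustrative)
HOME_ATTITUDE = (0.0, 0.0, 0.0)  # roll, pitch, yaw at "home" (illustrative)

class ImuState:
    def __init__(self):
        self.position = (1.0, 2.0, 0.5)  # some drifted estimate
        self.attitude = (3.0, -1.0, 7.0)
        self.calibrated = False

def on_hall_sensor(imu, hall_active):
    """Re-zero the IMU whenever the hand rests on the base station."""
    if hall_active:
        imu.position = HOME_POSITION
        imu.attitude = HOME_ATTITUDE
        imu.calibrated = True

imu = ImuState()
on_hall_sensor(imu, hall_active=True)
print(imu.position, imu.calibrated)  # (0.0, 0.0, 0.0) True
```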
  • the base can be wireless.
  • a wireless base minimizes the proliferation of wires and cables, which can get tangled and/or hinder movement.
  • a strap 40 can be removably attached to the patient/user's forearm.
  • the strap 40 can include a power source (not shown), e.g., one or more batteries; a controller (not shown), e.g., an Arduino USB board manufactured by Arduino Software of Italy; and an accelerometer 41. Comparison of accelerometer readings from the accelerometer 41 on the forearm and from an accelerometer disposed in the IMU 52 can be used to determine the angle of wrist bending.
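  • One plausible way to derive the wrist angle from the two accelerometers, assuming the arm is held still enough that each reading approximates gravity, is to compute the angle between the two measured gravity vectors; this method and the sample values are illustrative, not taken from the disclosure.

```python
# Illustrative wrist-bend estimate: angle between the gravity vectors seen
# by the forearm accelerometer 41 and the accelerometer in the IMU 52.
# Valid only for quasi-static readings, where each vector ~ gravity.
import math

def wrist_angle_degrees(forearm_accel, hand_accel):
    dot = sum(a * b for a, b in zip(forearm_accel, hand_accel))
    norm = math.sqrt(sum(a * a for a in forearm_accel)) * \
           math.sqrt(sum(b * b for b in hand_accel))
    cos_theta = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_theta))

# Forearm flat, hand flexed 90 degrees about the wrist (values in m/s^2):
print(wrist_angle_degrees((0.0, 0.0, 9.81), (9.81, 0.0, 0.0)))  # ~90.0
```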
  • an arm sleeve embodiment 45 is shown.
  • the embodiment includes a glove portion 42 and a pulley portion 43 .
  • bend sensors 44 and 46 are disposed on the back of the hand portion of the glove portion 42, to be in registration with the MCP and the PIP of each finger, while at least one bend sensor 47 is disposed along the PIP of the thumb.
  • at least one bend sensor is disposed on the glove portion 42 along the crease of the thumb along the palm of the hand. Flexion/extension of the wrist can be measured by a bend sensor (not shown) that is disposed on the glove portion 42 in registration with the posterior side of the patient/user's wrist.
  • the glove portion 42 can be fingerless, permitting better fit across a variety of hand sizes.
  • in the fingerless version, a plurality of rubber caps are attached to the tips of each finger and thumb. Inside each rubber cap is a small push button that is proximate the finger or thumb tip. The operation of the buttons in the fingerless version is the same as that previously described.
  • the pulley portion 43 is provided to measure radial and ulnar deviations.
  • the pulley portion 43 includes a strap 48 that is securely but releasably attachable to the, e.g., medial side of the, glove portion 42 , e.g., using a hook and pile material, at a first end; that runs along the lateral side of the patient/user's wrist; and that passes through a small pulley 49 that is disposed, e.g., in an arm sleeve, above the patient/user's elbow.
  • an exemplary hardware interface, i.e., input device 92, is shown in FIG. 21.
  • the SmartGloveTM 92 was chosen as the input device 92 because it is a low-cost, off-the-shelf device that is suitable for home use. Newer technologies could provide the same kind of interface for patients/users 91 while maintaining the high usability and low cost for the practitioner.
  • FIG. 30 shows top and bottom portions 101 , 102 of a mounting box 110 that, when assembled as shown in FIG. 31 , form a box-like structure that is structured and arranged to rest on the back of a patient/user's hand and to accommodate all of the necessary sensing devices.
  • a depressed area 106 for receiving an IMU 108 is provided in the top portion 101 .
  • a pair of vertical standoffs 109 are provided to orient the IMU 108 .
  • a slot 104 for accommodating a Hall effect sensor 105 is also provided in the top portion 101 .
  • the prototype 110 is releasably attachable to the back of the patient/user's hand using, for example, a hook and pile combination that can be routed through a pair of openings 103 provided for that purpose.
  • bend sensors 107, which will be disposed within the material of the SmartGloveTM 92, are shown optionally coupled to exiting sleeve rings 109 at a first end and, to mimic the structure of the hand, are mechanically attached to a single point at the rear of the mounting box 110 at a second end. As shown in FIG. 35, each bend sensor 107 enters the mounting box 110 via a respective slot 112.
  • Elastic string can be used for attaching the bend sensors 107 to the single point. The string prevents the sensors 107 from rotating about the single point while also allowing the sensors 107 freedom to translate along the axis of each finger as flexion/extension occurs. This enables the sensors 107 to retain the geometry of the patient/user's hand as knuckles are flexed and relaxed. It also enables the bend sensors 107 to remain in a constant position relative to the fingers.
  • a knuckle plate 111 ( FIGS. 34 and 35 ) can be provided. Because signals from the sensors 107 vary with the bend radius independent of the actual angle of the bend, if the radius can be held constant, the angle of the MCP joint's bend can be accurately modeled.
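  • Because the description above implies that, with the radius fixed by the knuckle plate, sensor output maps to joint angle, a simple piecewise-linear calibration can illustrate the point; the voltages and the calibration angles (0, 45, and 90 degrees, echoing the constraint tests of FIGS. 40A-40D) are hypothetical.

```python
# Hypothetical bend-sensor calibration: with the knuckle plate 111 holding
# the bend radius constant, interpolate joint angle from sensor voltage
# using readings taken at known constrained angles. Values are invented.

CAL_POINTS = [(1.10, 0.0), (1.55, 45.0), (2.05, 90.0)]  # (volts, degrees)

def voltage_to_angle(volts):
    """Piecewise-linear interpolation of MCP joint angle from voltage."""
    pts = sorted(CAL_POINTS)
    if volts <= pts[0][0]:
        return pts[0][1]
    for (v0, a0), (v1, a1) in zip(pts, pts[1:]):
        if volts <= v1:
            return a0 + (a1 - a0) * (volts - v0) / (v1 - v0)
    return pts[-1][1]  # clamp above the last calibration point

print(voltage_to_angle(1.80))  # 67.5 with these sample calibration points
```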
  • the SmartGloveTM 92 was also chosen because it is easily connected to a PC 93 , e.g., wirelessly or via a universal serial bus (USB) port 99 , and offers satisfactory control to the patient/user 91 .
  • a USB 2.0 connection is preferred because it provides greater data transfer at a faster rate.
  • the patient/user's hand and/or finger movements are transmitted to the PC 93 , e.g., wirelessly and/or via the USB connection 99 .
  • a device driver 97 for the SmartGloveTM 92, which can be installed in the operating system of the SmartGloveTM 92 or, alternatively, as shown in FIG. 21, in the operating system 98 of the PC 93, interprets the input data and provides the data to the virtual reality software 94.
  • the IMU 115 is incorporated into the input device 92 and disposed on the back of the patient/user's hand. When a USB connection 99 is used, a cable mount 112 can be disposed on the patient/user's forearm to reduce the total weight of the system on the patient/user's hand.
  • the virtual reality software 94 uses the SmartGloveTM 92 input data to generate output signals designed to display appropriate images in a virtual reality on a display device (not shown), e.g., the display device of a PC.
  • the MUVER programming display software includes three different pieces that are illustrated in FIG. 26 : a game engine 82 , 3D models and graphics 83 , and a scripting code 84 .
  • the system can also include multiple input devices, e.g., a pair of SmartGlovesTM for the left and right hands of the patient/user, that can be used simultaneously.
  • Multiple input devices in general and, more specifically, a pair of SmartGlovesTM permit more complex and realistic rehabilitation tasks such as, in virtual reality, simultaneously grasping a jar with a first hand and removing a lid from the jar with a second hand.
  • an illustrative bi-manual system 120 is shown in FIG. 42.
  • the patient/user's hands and forearms are supported in adjustable arm splints 111 for neutral wrist position and support.
  • the end portions 119 of each of the splints are ergonomically curled to make the natural position as comfortable for the patient/user as possible.
  • Each hand is disposed within an input device 113 that can be a Spandex®/cotton blend glove.
  • Bend sensors 114 are positioned (within a pocket in or within the material of the glove 113 ) across the MCP and PIP of the patient/user's index, middle, and ring fingers, to measure finger flexion and extension.
  • a fourth bend sensor 116 is disposed along the back of the hand proximate the patient/user's wrist to measure wrist flexion and extension.
  • a fifth bend sensor 117 is disposed along the back of the thumb to measure the rotation of the thumb with respect to the patient/user's palm and fingers.
  • Hall effect sensors 118 can be wired in the finger tip portions of the glove 113 and are adapted to interact when they come into proximity with a small magnet (not shown) that is disposed in the palm of the glove.
  • the small magnets in each of the gloves are used in combination with the globe base (see below) to calibrate the position of the gloves with respect to the base.
  • An IMU 115 is disposed on the back portion of the glove 113 for monitoring the three-dimensional hand position, i.e., attitude, of the patient/user's hand.
  • a cable housing 112 is wrist-mounted. More preferably, wireless communication of data can be effected.
  • the patient/user must also be able to achieve the same dead reckoning position at the start and/or at the completion of each exercise due to, inter alia, the nature of IMU positioning.
  • the system includes a base structure that accounts for ergonomics and an activation sequence that enables the controller to receive data once the patient/user has placed his/her hands in the appropriate position.
  • FIG. 22A shows a banana grip base 30 having a plurality of RFID tag receivers 31 and FIG. 22B shows the same grip base 30 with a patient/user's hands placed in the appropriate starting (“home”) position.
  • the ergonomics of the banana grip base 30 allows patients/users to place their hands on the base pad 32 without having to strain to cause them to lie flat.
  • the RFID receivers 31 do not require power, allowing the grip base 30 to remain completely passive, i.e., wireless.
  • RFID transceivers 59 are disposed in the input device 50 so that when the patient/user's hands are disposed at the “home” position, the receivers 31 and transceivers 59 are proximate, causing the transceiver 59 to emit a signal to that effect.
  • a drawback of the RFID approach is its limited accuracy. Slight deviations from a true “home” position may skew the results of the exercise.
  • FIG. 23 shows a globe base 35 that includes imprinted hand grooves 33 that define the “home” position.
  • Capacitive touch sensors 34 can be disposed at each of the finger and thumb tips of the hand grooves 33 .
  • the globe base 35 is adapted so that the patient/user's digits must touch each of the touch sensors 34 in both hand grooves 33 for the location data to zero itself. This design is particularly attractive due to its simplicity and, further, it causes little strain on the patient/user's hands.
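  • A minimal sketch of the zeroing condition just described follows; the sensor bookkeeping is an assumption made for illustration.

```python
# Illustrative activation sequence for the globe base 35: location data is
# zeroed only when all ten fingertip touch sensors 34 (five per hand
# groove) report contact simultaneously.

class LocationTracker:
    def __init__(self):
        self.zeroed = False

    def zero(self):
        self.zeroed = True  # report subsequent positions relative to "home"

def try_zero(touch_states, tracker):
    """touch_states: dict mapping (hand, digit) -> bool for the sensors 34."""
    if len(touch_states) == 10 and all(touch_states.values()):
        tracker.zero()
    return tracker.zeroed

tracker = LocationTracker()
touches = {(hand, digit): True
           for hand in ("left", "right")
           for digit in ("thumb", "index", "middle", "ring", "pinky")}
print(try_zero(touches, tracker))  # True: both hands confirmed at "home"
```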
  • the hand grooves 33 and touch sensors 34 at the fingertips of the hand grooves 33 make it intuitive to use.
  • Hall effect sensors and magnets can be integrated into the input device 92 and the globe base 35 .
  • the globe base 35 is a single piece, ergonomic design that requires a power source.
  • the relatively large size makes it possible to house electronics, e.g., a processing board, a Bluetooth® receiver, and other accessories (including the input devices when not in use), within the base 35 itself.
  • Various feedback systems, e.g., speakers for audio feedback, vibration devices for haptic feedback, and LEDs for visual feedback, can also be disposed within or on the outer surface of the globe base 35.
  • Prototype testing by the inventors highlighted the need for providing wrist and/or forearm support for accommodating and supporting the patient/user's hands when properly positioned in the “home” position.
  • the incorporation of medical arm splints with the globe base 35 allows the patient/user to maintain his/her hands in a neutral, “home” position, which is to say: full pronation, no extension or flexion, and no radial or ulnar deviation.
  • touch sensors 34 in the fingertips do not ensure that the patient/user's palms are also touching the base 35 .
  • location data may be incorrect.
  • the touch sensors 34 also require the glove device to be fingerless as only contact with exposed skin at the fingertips will activate the sensors 34 .
  • the size of the globe base 35 is relatively large, requiring additional exercise area.
  • FIGS. 24A and 24B show a teardrop base 39 , one of which is provided for each hand.
  • the teardrop base 39 design creates an ergonomic platform on which patients/users may rest their hands.
  • a tactile switch button 38 is disposed on the base 39 so that when the patient/user's hand is properly positioned over the base 39 , the button 38 will activate the device.
  • the active switch button 38 can be electrically coupled to the controller, e.g., via a USB connection.
  • the main problems with the teardrop design are that the button 38 will not necessarily be activated repeatedly by the same part of the patient/user's hand and that two bases 39 are needed, one for each hand, requiring two dead reckoning signals.
  • FIGS. 25A and 25B show a pyramid base 37 , one of which will also be provided for each hand.
  • the design creates an ergonomic platform for hand placement, which is particularly effective for stroke patients who frequently have difficulty spreading their hands.
  • the pyramid base 37 includes a magnetic field-generating source 36 , e.g., a magnet, that is adapted to activate a Hall effect sensor that is disposed in the palms of the input device gloves. Hall effect activation allows hand placement on the base 37 to be repeatable within an acceptable tolerance proportional to the resolution of the sensor 36 , while allowing the base 37 to be passive, i.e., wireless and not requiring power.
  • the simplicity of design, ergonomic shape and Hall effect activation are advantages of the pyramid base 37 .
  • two bases 37 are needed—one for each hand—requiring two dead reckoning signals and, furthermore, the sensor disposed in the glove input device may cause discomfort.
  • a 3D graphics engine is a library of subroutines for 3D rendering and game development.
  • the 3D models and graphics 83 portion populates the engine code and follows the rules of the game engine 82 .
  • the scripting code 84 in the game engine 82 controls many of the low level features, e.g., physics and display, and, preferably, is written in PythonTM Script. More preferably, the scripting code 84 can overwrite some or all low-level game engine code while also providing unique features to the 3D models 83 .
  • the MUVER 90 includes a software package that is simple both to implement and to change and that, also, is capable of providing the features that the MUVER 90 requires, e.g., network coding, 2D/3D rendering, and so forth.
  • One possible software option for the MUVER 90 is to use a 3D graphics engine for most of the code and to use a scripting language to program the various scenes and the use of the SmartGloveTM 92.
  • the Panda3D graphics engine was originally created by Disney but is currently owned by Carnegie-Mellon University of Pittsburgh, Pa.
  • the software integrated into the graphics engine is open source, which is to say that it is free to the public for download for commercial and non-commercial use and can be freely modified.
  • the Panda3D graphics engine uses the PythonTM scripting (programming) language and is written in object-oriented, C++ libraries and modules. Panda3D has comprehensive support for networking that allows for rich virtual interactions between patient/users.
  • Data input methods for Panda3D advantageously include direct input from Head Mounted Displays (HMDs) and VR trackers.
  • Panda3D remains a preferred platform because of its license agreement as well as its capabilities.
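  • For orientation only, a minimal Panda3D script in the spirit of the scripting layer 84 is sketched below; it is adapted from Panda3D's introductory pattern (load a stock model and drive the camera from a task) and is not MUVER scene code.

```python
# Minimal Panda3D/PythonTM sketch: load a stock sample model and orbit the
# camera each frame via the task manager. In a MUVER scene, glove input
# would drive the scene graph instead of elapsed time.
from math import pi, sin, cos
from direct.showbase.ShowBase import ShowBase
from direct.task import Task

class MinimalScene(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        scene = self.loader.loadModel("models/environment")  # ships with Panda3D
        scene.reparentTo(self.render)
        scene.setScale(0.25)
        scene.setPos(-8, 42, 0)
        self.taskMgr.add(self.spin_camera, "SpinCameraTask")

    def spin_camera(self, task):
        angle_deg = task.time * 6.0
        angle_rad = angle_deg * (pi / 180.0)
        self.camera.setPos(20 * sin(angle_rad), -20 * cos(angle_rad), 3)
        self.camera.setHpr(angle_deg, 0, 0)
        return Task.cont

MinimalScene().run()
```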
  • Panda3D requires additional middleware which increases cost and complexity.
  • Middleware is a term of art used to describe a software “bridge” between hardware, such as an input device 92, and the software of the MUVER.
  • the middleware must be compatible with the input device, i.e., the SmartGloveTM 92 , but must also be able to generate a compatible output signal to the chosen development software.
  • SWIG is a set of software development tools that connects programs written in a first programming language, e.g., C and C++, to high-level, interpreted or compiled programming environments and user interfaces.
  • SWIG is used with different types of languages—including common scripting languages such as PythonTM, which is the scripting language of Panda3D.
  • SWIG is open source and, hence, may be freely used, distributed, and modified for both commercial and non-commercial use.
  • SWIG is a difficult program to work with and very unstable. Notwithstanding, SWIG enables programmers to write programming methods for the SmartGloveTM 92 in C/C++ and, subsequently, to “wrap” them so that they can be read in PythonTM programming code. This feature allows the SmartGloveTM 92 to be usable in the BlenderGE as a set of PythonTM scripts.
  • GlovePIE (Glove Programmable Input Emulator) is another middleware option.
  • GlovePIE emulates movements made with the SmartGloveTM 92 using software macros and, further, binds the movements to an input device such as a keyboard or a joystick.
  • use of the SmartGloveTM 92 as an input device is thereby extended to any program that is traditionally controlled using a keyboard or a joystick.
  • only certain joystick/keyboard movements are emulated by the SmartGloveTM 92, which limits the number of movements that would be available to a practitioner and/or a patient/user.
  • GlovePIE is no longer confined to VR gloves; rather, it now supports emulating a myriad of inputs using a myriad of devices, e.g., Polhemus, Intersense, Ascension, WorldViz, 5DT, and eMagin products.
  • GlovePIE may also control MIDI or OSC output.
  • OpenTracker is manufactured by Argent Data Systems of Santa Maria, Calif.
  • OpenTracker is another open source product that is adapted to create a full-featured tracking software package that can be integrated into any software as a device library.
  • the major advantages of OpenTracker are that it is the most full featured and robust middleware package. It also natively integrates into C/C++.
  • Hardware supported by the OpenTracker includes:
  • VRPN (Virtual-Reality Peripheral Network) is another middleware option.
  • a PC or other host controller is disposed at each VR station to control the peripherals, e.g., tracker, button device, haptic device, analog inputs, sound, and the like.
  • VRPN provides middleware connections between the application(s) and the hardware devices using an appropriate class-of-service for each type of device sharing the link.
  • the application remains unaware of the network topology.
  • VRPN can be used with devices that are directly connected to the system that is executing (running) the application, using separate control programs or running the applications as a single program.
  • VRPN also provides an abstraction layer that makes all devices of the same base class look the same. For example, all tracking devices are made to look like they are of the type vrpn_Tracker. As a result, all trackers will produce the same types of reports. At the same time, it is possible for an application that requires access to specialized features of a certain tracking device, e.g., telling a certain type of tracker how often to generate reports, to derive a class that communicates with this type of tracker. If this specialized class were used with a tracker that did not understand how to set its update rate, the specialized commands would be ignored by that tracker.
  • VRPN system types include: Analog, Button, Dial, ForceDevice, Sound, Text, and Tracker. Each type abstracts a set of semantics for a specific device type. There are one or more servers for each type of device, and a client-side class to read values from the device and control its operation.
  • VRPN is the preferred middleware solution because of its capabilities and robustness, which the other solutions lacked. Historically, VRPN has also been used successfully with Panda3D.
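  • The base-class/derived-class idea described above can be illustrated with a plain-Python sketch; this is not VRPN's actual API, only an analogy to its vrpn_Tracker abstraction.

```python
# Plain-Python analogy to VRPN's abstraction layer: every tracker exposes
# the same report interface, while a derived class adds device-specific
# commands that a generic tracker would simply ignore.

class Tracker:
    """Base class: all trackers deliver the same kind of report."""
    def register_handler(self, handler):
        self._handler = handler

    def deliver(self, position, orientation):
        self._handler({"pos": position, "quat": orientation})

class RateSettableTracker(Tracker):
    """Specialized tracker that also understands an update-rate command."""
    def set_update_rate(self, hz):
        self.update_rate_hz = hz  # ignored by trackers lacking this feature

glove_tracker = RateSettableTracker()
glove_tracker.register_handler(print)
glove_tracker.set_update_rate(120)
glove_tracker.deliver((0.1, 0.0, 0.3), (0.0, 0.0, 0.0, 1.0))
```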
  • Blender is another open source software package manufactured by the Blender Foundation that focuses on digital modeling and animation.
  • An integrated game engine called BlenderGE uses PythonTM as a scripting (programming) language.
  • the main advantages of Blender are that the BlenderGE game engine is included with the software and that, when Blender is used as a digital animation package, it has many features for working with armatures, e.g., a human hand.
  • Source and Hammer are manufactured by the Valve Corporation, a video game company.
  • Source and Hammer constitute a leading character animation graphics engine whose main advantages are an advanced physics engine and multiple patient/user capabilities. Source and Hammer can be used for non-commercial purposes.
  • XNATM is a product of Microsoft® Corporation of Seattle, Wash.
  • XNATM uses C# code and a shared library to facilitate creation of games and simulations.
  • the biggest advantages of XNATM are that the community is very knowledgeable and it is designed for making multi-user games.
  • One caveat, however, is that XNATM does not have a bundled game engine. Accordingly, using XNATM software to create the MUVER would require much more programming than using one of the software packages associated with a game engine.
  • Flash® manufactured by Adobe Systems Incorporated of San Jose, Calif. is primarily used for Web sites and for Internet applications.
  • the biggest advantage of using Flash® is that most applications can be accessed from any Web browser, which means that installation is not mandatory.
  • Flash® does not have a bundled game engine and requires more programming than other software solutions.
  • the multi-user options available for Flash® are comparable to XNATM and Panda3D but better than what is available for Blender.
  • FIG. 27 shows an exemplary virtual environment (VE) training “scene” (or exercise) for competitive virtual interaction between two patient/users.
  • the exemplary scene is designed to allow patients/users to practice both an active grasp and a maintained grasp.
  • An active grasp involves a pinching action using two to four fingers and the thumb and a release action.
  • a maintained grasp involves active supination, which is to say, a hand further rotating toward a palm-up position. These movements are deemed by experts essential to improved hand function in stroke patients.
  • In this VE training “scene”, a patient/user must move his/her hand and fingers, which are operationally coupled to the input device, first, to grasp or grip a virtual lid 67 (FIG. 27A) and then to lift the virtual lid 67 from a virtual pot 68 (FIG. 27B). Once the patient/user has successfully grasped and lifted the virtual lid 67, he/she supinates the virtual lid 67 to a palm-up position, all the while maintaining his/her grasp on the virtual lid 67. In the final stage of the scene, the patient/user pronates the virtual lid 67 to a palm-down position; returns the virtual lid 67 to the virtual pot 68; and releases his/her grasp on the virtual lid 67 (FIG. 27D).
  • the scene application is adapted so that a visual signal is generated upon successful completion of any or all of the movements or motor activity associated with the stages.
  • the virtual lid 67 can change color, e.g., from grey to blue.
  • the virtual lid 67 can change color again, e.g., from blue to green.
  • the virtual lid 67 can change color again, e.g., from green to red.
  • the supination threshold for success can be pre-set, e.g., 45 degrees, and can be further adjusted to require a greater or lesser result.
  • the virtual lid 67 can return to its original color, e.g., grey. Upon completing this cycle of stages, the trial is counted as a “success”.
  • the application can be adapted to cause the virtual lid 67 to return automatically to the original position on the virtual pot 68 and to return to its original color, e.g., grey.
  • each discrete data element can be displayed separately to provide feedback about performance either during the session or after. These discrete elements count the number of successes for each phase; count the number of successful trials, which is to say, trials in which all phases are completed successfully; record the time for each phase; and record the time for each trial. Time can also be displayed as a mean for a block of trials, with the number of trials in the block adjustable.
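  • A hedged Python sketch of the per-phase bookkeeping described above follows; the phase names track the lid scene's stages, the 45 degree supination threshold echoes the adjustable pre-set mentioned earlier, and everything else is an assumption.

```python
# Illustrative phase/scoring bookkeeping for the pot-lid scene: four phases
# per trial, per-phase success counts, and per-phase timing. Not the actual
# MUVER scene code.
import time

PHASES = ["grasp", "lift", "supinate", "return"]
SUPINATION_THRESHOLD_DEG = 45.0  # pre-set and practitioner-adjustable

class LidScene:
    def __init__(self):
        self.phase_index = 0
        self.successes = dict.fromkeys(PHASES, 0)
        self.phase_times = dict.fromkeys(PHASES, 0.0)
        self._t0 = time.monotonic()

    def complete_phase(self, name):
        """Record a successful phase; returns True when a full trial ends."""
        assert name == PHASES[self.phase_index], "phases complete in order"
        now = time.monotonic()
        self.phase_times[name] = now - self._t0
        self.successes[name] += 1
        self._t0 = now
        self.phase_index = (self.phase_index + 1) % len(PHASES)
        return self.phase_index == 0

scene = LidScene()
for phase in PHASES:
    trial_done = scene.complete_phase(phase)
print(scene.successes, trial_done)  # one success per phase; trial_done True
```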
  • FIG. 28 summarizes the results of testing of six healthy subjects (four male, two female; five right-handed, one ambidextrous) who performed competitive interaction using the MUVER scene shown in FIG. 27.
  • Each patient/user donned and calibrated the SmartGloveTM and was instructed in the hand rehabilitation movement depicted in the scene.
  • Each subject was provided with up to five minutes to practice and become accustomed to working in the virtual environment.
  • a more complete MUVER System 90 is shown.
  • the practitioner's interface 96 expands the data collection of the basic system in two major ways: providing real-time feedback and modifying the virtual environment. Indeed, a practitioner can be a spectator and watch the actions and interactions of the patients/users 91. The practitioner also has the ability to privately or openly provide feedback to a patient/user 91 or to multiple patients/users 91 at once. As another form of feedback, the practitioner can change or control the MUVER system 90 based on the actions and interactions of the patients/users 91. Modifying the MUVER 90 has several advantages, including changing the level of difficulty to better suit rehabilitation and directing the MUVER 90 to facilitate certain interactions between patients/users 91.
  • the more complete system 90 includes a Rapid Prototyping feature 85 .
  • Rapid Prototyping is a manufacturing method that allows custom objects to be created from a .STL file quickly and for low cost.
  • the MUVER 90 can be populated by models that can be exported as .STL files. Consequently, a practitioner can borrow an object from the real world; recreate it in the MUVER for virtual rehabilitation; and then create it using RP for use in actual rehabilitation exercises 86 .
  • a first exemplary MUVER scene that can be presented virtually in accordance with the present invention is a simple, Single Degree of Freedom Knob (SK).
  • an SK is an Electro-Rheological Fluid (ERF) based device that can be manipulated by the patient/user in a single degree of freedom, e.g., clockwise or counterclockwise rotation about an axis, by rotating a variably-resistive knob.
  • the MUVER system 90 allows a practitioner to place a patient/user in a virtual environment containing an SK device in the comfort and convenience of the patient/user's home. From a remote site, the practitioner is also able to customize the resistance level of the real world knobs for a particular patient/user.
  • a virtual knob 87 that might be shown in a patient/user's virtual environment is shown in FIG. 36 .
  • the MUVER design for the virtual knob 87 is meant to be for one or two patient/users.
  • a first patient/user manipulates his/her hand and fingers in the real world sufficiently to cause the circle 88 to rotate about an axis in either direction in the virtual environment. If there is a second patient/user, he/she likewise manipulates his/her hand in the real world sufficiently to cause the rounded square 89 to rotate about the axis in the same direction in the virtual environment. If there is no second patient/user, the practitioner or, alternatively, a software program can control and manipulate the rate of rotation of the rounded square 89 . Control of the square 89 by a non-patient/user can be performed using another knob, a rehabilitation device, a keyboard, a mouse, and the like.
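  • For illustration, the “tag” condition between the circle 88 and the rounded square 89 might be detected as below; the tolerance and the angle bookkeeping are hypothetical.

```python
# Hypothetical "tag" detection for the SK scene: each user's knob rotation
# drives a marker's angular position on the circular path; a tag occurs
# when the angular gap closes below a tolerance.

TAG_TOLERANCE_DEG = 5.0  # assumed capture tolerance

def angular_gap(a_deg, b_deg):
    """Smallest angular distance between two positions on the circle."""
    diff = abs(a_deg - b_deg) % 360.0
    return min(diff, 360.0 - diff)

def is_tag(circle_angle_deg, square_angle_deg):
    return angular_gap(circle_angle_deg, square_angle_deg) <= TAG_TOLERANCE_DEG

print(is_tag(358.0, 2.0))   # True: only 4 degrees apart across the wrap
print(is_tag(90.0, 180.0))  # False
```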
  • the simplistic MUVER SK scene can be controlled to provide competitive, versus, or mixed virtual interactions.
  • the time it takes each patient/user to catch, i.e., to “tag”, the square 89 during a pre-established amount of time can be recorded and compared.
  • Another possible competitive interaction is, instead, to record the number of “tags” during a pre-established period of time for each patient/user.
  • An exemplary versus interaction can include awarding points to the first, circle patient/user for every “tag” of the square 89 and awarding points to the rounded square patient/user for avoiding being tagged over a set increment of time.
  • An exemplary mixed interaction can include two patient/users comparing their scores and times with another pair of patient/users or with historical scores of the two.
  • Scene themes can play a very important role in the MUVER design.
  • the customizable knob lends the device to many possible real scenes that can be made into virtual ones.
  • the availability of real life scenes in a virtual environment can be beneficial because a particular patient/user may have performed the task regularly in the real world and, hence, already understands the movements needed to successfully complete it.
  • Possible real life scenes to use in a MUVER for the SK can, for purposes of illustration and not limitation, include:
  • Fictional scene themes that parallel events with which the patient/user is not familiar have a steeper learning curve because the patient/user may not know how to complete the movements successfully. For example, a non-fisherman may not know how to reel in a fish or a teetotaler may not know how to turn a corkscrew to open a bottle of wine. Notwithstanding, fictional scene themes have the advantage of being made expressly for exercising a certain movement or knob design.
  • the prototype MUVER proposed here is an example of a fictional theme, but the idea of tag is a common play mechanic with which patients/users may be familiar.
  • another exemplary device is a Single Degree of Freedom Hand Device (SHD).
  • the single degree of freedom is linear and has a pre-established, e.g., three inch, stroke.
  • the exercise routine the patient/user performs is simply grasp and release.
  • Challenges include noise from the MRI machine, simplicity of the device, and using mirrors for the patient/user to see the computer monitor displaying the MUVER graphics.
  • the real world SHD has many unique design aspects, which to the greatest extent possible are transferred over to the MUVER virtual environment design.
  • the scene theme in FIG. 37 replicates the real world act of inflating a balloon 85 using a hand pump 86 .
  • This particular exemplary scene theme advantageously provides an objective that is simple and intuitive. Additionally, because the progressive inflation of the balloon presents the patient/user with feedback, no other feedback needs to be provided.
  • an advantage of this MUVER scene theme is that it can facilitate all four kinds of multiple patient/user interactions: competitive, cooperative, versus, and mixed.
  • the competitive interaction would be conducted by timing each patient/user for a pre-established number of pumps that will completely inflate the balloon 85 .
  • when multiple pumps 86 control the rate of inflation of the same balloon 85, a cooperative interaction can be accomplished.
  • This type of interaction can be further explored by controlling the rhythm of pumps from each patient/user or having a set order that patients/users must pump in order to inflate the balloon 85 .
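  • One way to sketch the ordered, cooperative pumping described above is shown below; the turn-taking rule and the inflation counts are assumptions.

```python
# Illustrative cooperative balloon interaction: several users share one
# balloon and must pump in a set order; out-of-turn pumps add no air.

class Balloon:
    def __init__(self, full_at=10):
        self.volume = 0
        self.full_at = full_at

    def inflated(self):
        return self.volume >= self.full_at

class OrderedPumping:
    def __init__(self, balloon, order):
        self.balloon = balloon
        self.order = order      # required sequence of user ids
        self.next_idx = 0

    def pump(self, user_id):
        if user_id != self.order[self.next_idx]:
            return False        # not this user's turn
        self.balloon.volume += 1
        self.next_idx = (self.next_idx + 1) % len(self.order)
        return True

balloon = Balloon()
session = OrderedPumping(balloon, order=["user1", "user2"])
for _ in range(5):
    session.pump("user1")
    session.pump("user2")
print(balloon.inflated())  # True after ten in-order pumps
```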
  • an Active Hand Device (AHD) is a two degree of freedom, active ERF device.
  • An exercise the patient/user does with the AHD includes the linear degree of freedom that the patient/user performs with the SHD as described above as well as a supination/pronation rotation similar to that of the SK.
  • the AHD is also an active device, which is to say, that the AHD can provide variable resistance and push back against the patient/user, which can be used to enhance the experience.
  • the major design considerations for such a device include having a customizable real world design so that the practitioner can choose to use both degrees or a single degree of freedom.
  • in FIG. 38, a diagram of the prototype MUVER for an AHD is shown.
  • the patient/user is represented as the circle 69 .
  • the patient/user manipulates the AHD to move the circle 69 around a track 29 that can be designed by the practitioner.
  • the patient/user receives feedback in the form of accuracy in movement and also in the speed with which the circle 69 circumnavigates the track 29 .
  • Adjustable read lines 28 represent the start and the end of the track 29 .
  • the locations of the read lines 28 can be modified depending on what the practitioner wants the patient/user to do for an exercise.
  • the ellipse 27 surrounding the circle 69 is a force gradient, inside of which the patient/user tries to keep the circle 69 as it moves around the track 29 .
  • exercises can be made more or less difficult based on the hand's position relative to the shoulder, and movement tests can be designed to test extremity coordination.
  • the position/orientation feature is a valuable component to the rehabilitation package.
  • the system is designed to work with two gloves simultaneously, to allow patients/users 91 to use their capable hand to interact in exercises along with their disabled one.
  • mapping in which a real world movement corresponds directly to a virtual world movement is referred to as “direct mapping”.
  • because the goal is rehabilitation, whatever virtual “scene” best motivates a patient/user to expedite the rehabilitation process or to reach rehabilitation milestones is the better choice.
  • “abstract mapping” by which real world movement produced by the patient/user differs from the movement of the virtual world object(s) is also possible with the invention as claimed.
  • a “direct mapping” scene may depict a hand waving while an “abstract mapping” scene may equate the flexion/extension to the movement of a cartoon animal.
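  • The contrast between the two mappings can be sketched in a few lines of Python; both mapping functions are invented for illustration.

```python
# Illustrative "direct" versus "abstract" mapping of a measured flexion
# angle onto a virtual scene.

def direct_map(flexion_deg):
    """Direct mapping: the virtual hand flexes exactly as the real one."""
    return {"virtual_finger_deg": flexion_deg}

def abstract_map(flexion_deg):
    """Abstract mapping: flexion drives an unrelated object, e.g., the
    jump height of a cartoon animal (normalized to 0..1 at 90 degrees)."""
    return {"cartoon_jump_height": min(flexion_deg / 90.0, 1.0)}

print(direct_map(30.0))    # {'virtual_finger_deg': 30.0}
print(abstract_map(30.0))  # {'cartoon_jump_height': 0.333...}
```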
  • Performance data, e.g., hand and finger kinematics, can be recorded by the system and reduced to “scores”.
  • “Scoring” connotes reduction of performance data into a format that is easily usable to determine performance and progress.
  • the scoring data is easily formatted into graphics, spreadsheets, summaries, and so forth.
  • FIGS. 40A-D show a patient/user's hand being constrained in 45 degree and 90 degree orientations, for measuring voltage as a function of joint bend of the MCP ( FIGS. 40A and 40B ) and the PIP ( FIGS. 40C and 40D ).
  • Patient/user movement data for FIGS. 40A and 40B are displayed in a bar graph in FIG. 41A , which separates the data by finger, i.e., index finger, middle finger, and ring finger, and by the angle of constraint, i.e., 0, 45 degrees, and 90 degrees.
  • Patient/user movement data for FIGS. 40C and 40D are displayed in a bar graph in FIG.
  • “Scoring” can be performed per “scene” or exercise, per phase within a “scene” or can be compiled over multiple scenes. “Scoring” can measure, for example, a number of repetitions, a magnitude of motion, speed of performance, speed of performance of a phase of a scene, accuracy of movement, and so forth.
  • the system can also include a “teacher model” capability, which provides the ability to record patient/user performance for analysis and later playback. During playback, patient/user errors can be highlighted and made known to the patient/user and correct performance can be demonstrated.
  • U.S. Pat. No. 5,554,033 discloses a virtual teacher and is incorporated herein in its entirety by reference.
  • Another advantage of the present system is a library of “scenes” or exercises, each scene being used for a discrete purpose or functional goal and being adjustable. More specifically, each of the “scenes” can be programmed to adjust the degree of difficulty of the scene. Indeed, defining the values of a set of parameters, e.g., speed of motion, magnitude of motion, smoothness of motion, hand orientation during motion, and the like, enables medical personnel, physical therapists, and the like to tailor scenes for a discrete patient/user. Hence, collectively, the “scene” database provides a variety of purposes and functional goals.
  • another feature is amplified feedback, by which real world movement can be discretionally amplified prior to virtual mapping. For example, if a patient/user only bends his/her fingers ten degrees, the signal can be amplified by a factor of five so that the virtual movement shows a 50 degree flexure. In this manner, amplified feedback can be used as a carrot to encourage patients/users.
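  • As a minimal sketch of the amplification just described (the gain and clamp values are illustrative):

```python
# Illustrative amplified feedback: scale measured flexion by a
# practitioner-chosen gain before virtual mapping, clamped to a plausible
# joint range so the virtual hand never over-rotates.

def amplify(real_deg, gain=5.0, limit=90.0):
    return min(real_deg * gain, limit)

print(amplify(10.0))  # 50.0, matching the ten-degree example above
```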

Abstract

A low-cost, virtual environment rehabilitation system and a glove input device for patients suffering from stroke or other neurological impairments, for independent, in-home use, to improve upper extremity motor function, including hand and finger control. The system includes a low-cost input device for tracking arm, hand, and finger movement; an open source gaming engine; and a processing device. The system is controllable to provide four types of multiple patient/user interactions: competitive, cooperative, counter-operative, and mixed.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • Priority is claimed to U.S. provisional patent application No. 61/145,825, entitled “Multiple User Virtual Environment for Rehabilitation (MUVER)”, which was filed on Jan. 20, 2009, and to U.S. provisional patent application No. 61/266,543, entitled “Low Cost Smart Glove for Virtual Reality Based Rehabilitation”, which was filed on Dec. 4, 2009.
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • N/A
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • A device and system for rehabilitating hand and finger movements of stroke patients with neurological or orthopedic problems is disclosed, and, more specifically, a device and system that are structured and arranged to capture hand and wrist motion for the purpose of guiding a patient/user through rehabilitation exercises.
  • 2. Summary of the Related Art
  • Every year, between 700,000 and 800,000 Americans suffer a new or a recurring stroke in which a sudden variance in blood supply to the brain causes loss of brain function. About 150,000 Americans die from the event, leaving approximately 650,000 who must overcome or learn to live with the effects of any permanent or long-term disability. These effects may include an inability to return to work, which can lead to a loss of income and benefits such as health care, and/or the need for daily living assistance. Thus, stroke remains the leading cause of disability among adults in the United States.
  • Eighty-five percent (85%) of stroke patients, at least initially, suffer from physical disabilities that prevent them from performing normal functions and/or from returning to work. Moreover, six months after a stroke, 55-75% of survivors still experience limited upper extremity (UE) function. Indeed, in instances with initial UE paralysis, complete motor recovery has been reported in less than 15% of cases.
  • Despite these disappointing statistics, studies have shown that patients with chronic stroke have the potential to improve their physical ability and independence through rehabilitation. Indeed, stroke patients generally appear to be highly motivated to make further gains once conventional, therapist-assisted or therapist-supervised rehabilitation (“rehab”) has been completed. Rising health care costs, however, are causing stroke patients to be discharged from the hospital sooner and causing physical therapy sessions to be shorter.
  • As a result, a rehabilitation program that can be performed in a stroke patient's home and performed without a visiting therapist being physically present in the patient's home could further increase rehab participation. Such a program saves time in transportation and cost for clinical fees.
  • A key component of poor or incomplete functional recovery remains the impaired use of the stroke patient's hand and fingers. Critical motions of the hand and wrist have been determined to be gross finger flexion and extension, opposition of the thumb, radial and ulnar deviation, supination and pronation of the wrist, wrist flexion and extension, and the hand's position and orientation in space. Accordingly, there is a compelling need to improve available methods for UE rehabilitation in stroke patients, and in particular, methods that improve hand and finger function. Hand rehabilitation devices that are designed to be low-cost for in-home use remain a viable option for continuing the rehabilitation of the increasing number of stroke patients in the United States. However, hand rehabilitation devices by others tend to be complex, expensive, and/or not readily available to clinicians.
  • There are many products currently available on the market that can achieve these or similar goals and that form the prior art. These include the Cyberglove, P5 Glove™, 5DT Dataglove, Acceleglove, Hand Mentor, and HandTutor. Other products of interest that are not necessarily expressly for rehabilitation but that contain several of the critical factors that can be incorporated into this design include the Nintendo® Wii™. Some of these products, such as the Cyberglove, are very precise, but too costly for in-home purposes ($10,000 per Cyberglove). Others are more affordable, e.g., the P5 Glove™ (approximately $100 per glove), but, without modification, do not deliver data accurately enough for the intended application. Hence there is tension and a necessary trade-off between cost and performance.
  • FIG. 1 shows the P5 Glove™. The P5 Glove™ 10 was released for the home personal computer (PC) video game market in 2002 by Essential Reality, LLC of New York, N.Y. Although it is not wireless, the P5 Glove™ 10 is lightweight and allows for six degrees of tracking freedom including the three translational axes (the x-, y-, z-direction) and the three rotational axes (yaw, pitch, and roll).
  • The P5 Glove™ device 10 uses infrared (IR) technology, e.g., light-emitting diodes (LEDs), and bend sensors, to track movement of the patient/user's hand and/or fingers. Bend sensors 9, which are discussed in greater detail below, are adapted to measure the bend or flexion of a digit, e.g., a finger or toe. The bend sensors 9 are disposed along the back of each finger and thumb, to provide independent finger and thumb measurements. The bend sensors 9 are mechanically coupled to the patient/user's hand by a ring 8 that fits around each fingertip.
  • With the P5 Glove™ 10, IR sensors are structured and arranged to determine three-dimensional (3D) positioning. However, conventionally, the P5 Glove™ device 10 is electrically coupled to an infrared tower (or receptor) (not shown), e.g., via a PS/2 cable (not shown). The infrared tower, in turn, is electrically coupled to a PC (not shown), e.g., via a USB cable (not shown). Although this arrangement makes it easy to use at home, because of the infrared control receptor, the work space is limited, which is to say that the range within which motion can be detected is limited to only about three or four feet between the glove and the receptor.
  • Typically, there are three ways in which IR sensors can be used to determine position. The first way involves a single IR LED that is disposed at a pre-set, known position. A sensor is disposed on the tracking object. Based on the angle and intensity of the sensed light, the position of the sensors relative to the LED can be determined. A second way in which IR light can be used to determine position is by disposing a single, IR light detecting sensor at a known position and by moving objects that emit IR light relative to the sensors. Finally, in a third technique, the LED and the IR detecting sensors are disposed proximate each other. IR light from the LED reflects off of objects within the illumination area. The sensor picks up the reflected light, from which the position of the reflecting object can be determined. Many multi-touch tables such as the Microsoft® Surface utilize reflective infrared technologies.
  • Although infrared positioning is accurate and relatively inexpensive it is not the most useful or most accurate method of determining the position of a P5 Glove™. Indeed, because IR light detection is predicated on beams of light traveling between an emitter and a sensor, obstructions to the beam path limit this capability. Consequently, because a patient/user's hands move in many directions and at many angles there is no guarantee that emitted IR beams will reach the sensor without being obstructed or reflected.
  • FIG. 2 shows a Hand Mentor Rehabilitation Device 11 (“Hand Mentor”). The Hand Mentor 11 is manufactured by Columbia Scientific, LLC of Tucson, Ariz. and has been cleared by the Food and Drug Administration (FDA) for use in rehabilitation clinics.
  • The Hand Mentor 11 encourages patients to restore the range of motion of their wrist and hand using the principles of Repetitive Task Practice (RTP) and Constraint Induced Therapy (CI). As shown in FIG. 2, the hand of the patient fits into a sleeve 12 that is adapted to sense and to generate a signal commensurate with the level of resistance caused by flexor spasticity.
  • The device 11 offers three different program types that are adapted to reduce spasticity, to recruit specific muscle groups, and/or to improve motor control. The resistance signals are transmitted to a processing device 13 that includes software (or, alternatively, is hard wired) that is designed for unsupervised patient use of the device 11. Advantageously, the device 11 can also offer a therapist option, to establish rehabilitation regimens and generate data for documenting and reporting the patient/user's progress.
  • Referring to FIG. 3, SensAble Technologies (ST) of Woburn, Mass. manufactures a line of haptic input devices 14, which are designed to gather motion input and to provide feedback to the patient/user's fingers, hand, and arm. The ST device 14 most suited for use in a rehabilitation virtual environment is a six degree-of-freedom PHANTOM® SensAble model, in which patients/users grasp a pen- or pencil-like portion 15 of the device 14 in either hand and control the x-direction, y-direction, z-direction, roll, pitch, and yaw.
  • The PHANTOM® six degree-of-freedom device 14 interfaces with a PC (not shown) via a parallel port (not shown). The device 14 typically comes bundled with several software demos and with a software development kit specific to the inputs and limitations of the device.
  • The Falcon from Novint Technologies, Inc. of Albuquerque, N. Mex. is shown in FIG. 4. The Falcon device 16 was originally designed as an input device for playing games on a PC. The device 16 has three degrees-of-freedom; however, different grips with a plurality of buttons or dials can be added to provide more degrees of freedom.
  • The Falcon device 16 includes a 4″×4″×4″ workspace and has a two-pound (force) capability. The Falcon device 16 interfaces with a PC, e.g., using a universal serial bus (USB), e.g., USB 2.0. The Falcon device 16 is sold with several games already available for it as well as driver software to play PC games.
  • The Wii™ manufactured by Nintendo Company, Limited of Kyoto, Japan was released in the Fall of 2006 as a personal video gaming console. Referring to FIG. 5, the Wii™ controller 17 has two input device components: a Wii Remote™ controller 18 and a Nunchuk™ controller 19. The Wii™ Remote™ controller 18, when used solely, is normally held in a player's dominant hand. However, players choosing to use both the Nunchuk™ controller 19 and the Wii™ Remote™ controller 18, usually hold the Wii™ Remote™ controller 18 in the right hand and the Nunchuk™ controller 19, which is electrically coupled to the Wii™ Remote™ controller 18, in the player's left hand.
  • The Wii™ gaming console (not shown) connects directly to a power source and to a television or other display device. Each Wii™ Remote™ controller 18 and each Wii™ Nunchuk™ controller 19 communicates with the gaming console wirelessly via a sensor bar (not shown), e.g., using Bluetooth wireless technology. Although a large library of publicly-available Wii™ games exists, presently, there is no licensed software development kit available to the public.
  • Referring to FIG. 6, a Rutgers Master II-ND Force Feedback Glove 20 is shown. The glove device 20 was developed in 2002 at Rutgers University and is structured and arranged to use a plurality of, e.g., four, pneumatic actuators 21 and a plurality of sensors. The direct-drive configuration of the actuators 21 provides force to the tips of the fingers 23 via finger rings 24 that are mechanically coupled to the actuators 21. Sensors are disposed on the patient/user's palm 22, to avoid the presence of wires at the fingertips 23. The Rutgers Master II-ND device 20 is a research only device and there is no indication of software or a software development kit.
  • Referring to FIG. 7, a CyberGlove II device 25 manufactured by Immersion Technologies of San Jose, Calif. is shown. The CyberGlove device 25 is one of the leading products for sensing and capturing motion in the current market. The device 25 is made from lightweight elastic and each sensor is extremely thin and flexible, making the sensors virtually undetectable. The fabric on top of the device 25 is a stretch material that is provided for comfort. The fabric on the bottom of the device 25 is made of an open or mesh material for better ventilation.
  • The device 25 is wireless and has a capacity of making eighteen or twenty-two high-accuracy, joint-angle measurements. The glove 25 uses a proprietary resistive bend-sensing technology to capture real-time digital joint-angle data. The 18-sensor model includes two bend sensors that are disposed on each finger, four abduction sensors, and sensors for monitoring thumb crossover, palm arch, wrist flexion, and wrist abduction. The 22-sensor model includes a third bend sensor for each finger.
  • The CyberGlove II device 25 is electrically coupled to a PC, e.g., using a wireless USB receiver. The software that comes bundled with the glove 25 is for evaluation purposes only and is not for virtual reality. There is no publicly available software development kit for the CyberGlove device 25.
  • The 5DT Data Glove device 26 shown in FIG. 8, is designed for use in motion capture and animation. The device 26 material is stretch Lycra® and the fingertips are exposed to facilitate the grasping function.
  • The device 26 is adapted to sense multiple bends, e.g., finger flexion, but is unable to measure the attitude or orientation of the hand in space or with respect to the patient/user's body. The device 26 features automatic calibration and has an on-board processor (not shown).
  • Two 5DT versions are commercially available: the 5DT Data Glove 5 Ultra device, which includes five bend sensors to measure discrete finger and thumb flexure, and the 5DT Data Glove 14 Ultra device (depicted in FIG. 8), which uses two bend sensors on each finger and thumb and one bend sensor per abduction between adjacent digits.
  • The 5DT Data Glove 5 Ultra device 26 is adapted to include Bluetooth technology to make it wireless and, also, can include a cross-platform SDK. The bundled software that comes with the device 26 has no rehabilitation applications.
  • Combinations of the commercially-available prior art technologies shown in FIGS. 1-8 have also been investigated by others for hand rehabilitation purposes. For example, the Rutgers Hand Master I and Hand Master II (FIG. 6) have been used in combination with a Cyberglove™ (FIG. 7) to improve hand function in stroke patients. The system uses the palm-mounted pneumatic pistons 21 and virtual reality to improve resisted finger flexion and non-resisted finger extension.
  • Robotic devices that train the entire arm, such as the MIT-Manus, have also been shown to benefit stroke patients. More recently, the Bi-Manu-Track robotic arm trainer has been found to be equally as effective as electrical stimulation training. A study utilizing the Howard Hand Robot found greater mobility gains for stroke subjects who exercised with robotic assistance in virtual reality during a relatively longer, e.g., three-week, intervention in comparison with subjects who had robotic assistance only during the last week-and-a-half of training.
  • A reported study by Fischer et al. concluded that there was no difference between three groups of stroke subjects who trained on a reach-to-grasp task in virtual reality with and without two different types of robotic assistance to finger extension during the training. Finally, a pilot study performed with a new Finger Trainer robotic device found some improvements in active movement and less development of spasticity in comparison with a control group that received bimanual therapy. However, this Finger Trainer was designed to perform passive finger movement only.
  • None of these devices, however, meets the need for a low-cost, simple, UE motor training device that patients could use easily in their homes, and, potentially, use with other patients over a network, the Internet, and the like. Presently, few virtual environment-based (VE-based) or robotic systems for hand rehabilitation exist. Those that do exist are prohibitively expensive, and most are not commercially available. Moreover, none is suitable for independent home use by patients and, furthermore, none provides for multiple patient/user interaction over the Internet.
  • Hence, it would be desirable to provide a low-cost device that stroke or other patients could independently use in the home, to improve UE function and especially improved UE function of the hand. Such a device would also be useful as an adjunct to ongoing rehabilitation therapy, providing patients with an interesting and motivating way to perform a home exercise program. If designed appropriately, such a system could be used by a therapist to establish exercise programs that were adjustable in level of difficulty, and tailored to the patient's specific interests. These features would likely increase patient motivation and compliance.
  • It would also be desirable to facilitate interactions with other patient/users over a local or a wide area network, the World Wide Web, the Internet, and so forth to make practice more fun and to enhance motivation. Such virtual interactions may also alleviate feelings of social isolation in patients who remain housebound due to mobility problems.
  • SUMMARY OF THE INVENTION
  • To meet the apparent need, a Multiple-User Virtual Environment for Rehabilitation (MUVER) is disclosed. The MUVER is structured and arranged to enable multiple patients and system users at remote locations to interact with each other in virtual space with activities designed to enhance UE and skilled-hand function. The intended application is for use as a supplemental, in-home rehabilitation tool for people with hand function and coordination disabilities, specifically the type of disability that would result from a stroke. Advantageously, the MUVER will be the first inexpensive, VE-based system that patients could purchase, e.g., for home use, that is specifically designed to enhance finger and thumb movement in addition to arm movement.
  • The MUVER system is flexible enough to include a variety of different rehabilitation devices to control the MUVER software. For example, the MUVER system is structured and arranged to monitor force and torque produced by the hand and fingers during grasping and manipulation tasks and can be extended to control ankle movements.
  • The system includes a virtual reality game-type interface that will have “scenes” developed specifically for patients with stroke who need to practice finger, hand, and arm movements. The activities will be functional movements that involve the whole arm as well as hand, but with specific emphasis on hand and finger motions. Feedback features and training routines, based on principles of motor learning, facilitate motor recovery in patients at different levels of motor ability.
  • The device and system uniquely combine an ability to track hand position and orientation in space with tracking of finger and thumb configuration using an input device. This feature combination is critical to using the device to display a wide variety of hand and upper extremity exercises in virtual reality displays.
  • One such input device is fashioned like a glove for use with a multiple-user virtual environment system for rehabilitation exercise of a human hand and digits. The input device is structured and arranged to generate signals corresponding to at least one of a discrete movement and an attitude of said hand and said digits. The device includes a glove that can be readily donned and doffed on either hand by a user; a first plurality of sensors, each sensor being structured and arranged to provide data on movement and range of movement of at least one of the index finger, the middle finger, and the ring finger; a second plurality of sensors that is structured and arranged to provide data on movement of the thumb; and a positioning and tracking system that is structured and arranged to generate position coordinates in three rotational axes and three translational axes to determine at least one of the attitude and a velocity of said hand.
  • Another unique feature will be feedback lights placed on the back of the hand which will allow the patient to know if they are performing the correct motion while looking at their hand, as opposed to the screen. This will allow patients with impaired perceptual abilities to concentrate on the task while not having to interact as much with a computer interface.
  • The device may also have leisure applications, in particular to gaming. The rehabilitation exercises are essentially mini-games, so the device could easily be adapted for non-rehabilitation-related virtual gaming.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
  • FIG. 1 shows a view of a P5 Glove;
  • FIG. 2 shows a view of a Hand Mentor Rehabilitation Device;
  • FIG. 3 shows a view of SensAble Phantom® devices;
  • FIG. 4 shows a view of a Novint Falcon device;
  • FIG. 5 shows front and side views of a Wii™ remote control and a view of a Wii™ Nunchuck™ device;
  • FIG. 6 shows a view of a Rutgers Master II-ND force-feedback glove device;
  • FIG. 7 shows a view of a Cyberglove™ II device;
  • FIG. 8 shows a view of a 5DT Data Glove 5 Ultra device;
  • FIG. 9 shows a schematic of the multi-user virtual reality rehabilitation (MUVER) system in accordance with the present invention;
  • FIG. 10 shows schematics of four common multi-user virtual interactions;
  • FIG. 11 shows a back side of a glove input device having bend sensors disposed on finger portions of the glove in registration with the index, middle, and ring fingers;
  • FIG. 12 shows bend sensors disposed on the glove of FIG. 11 in registration with the thumb and the base of the wrist;
  • FIG. 13 shows bend sensors for capturing wrist flexion/extension and for tracking radial and ulnar deviations, and an IMU;
  • FIG. 14 shows an arrangement of electroluminescent wire light-emitting devices for providing visual signals to a patient/user;
  • FIG. 15 shows two views of a second glove input device embodiment;
  • FIG. 16 shows an IMU and a processing unit for the glove in FIG. 15;
  • FIG. 17 shows an embodiment of an input glove device using Hall effect sensing;
  • FIG. 18 shows the palm portion of the input glove device shown in FIG. 17;
  • FIG. 19 shows the back of the hand portion of the input glove device shown in FIG. 17;
  • FIG. 20 shows an arm sleeve embodiment of a glove input device;
  • FIG. 21 shows a schematic of a hardware interface in accordance with the present invention;
  • FIGS. 22A and 22B show embodiments of a banana grip base structure;
  • FIG. 23 shows an embodiment of a globe base structure;
  • FIGS. 24A and 24B show embodiments of teardrop-shaped base structures;
  • FIGS. 25A and 25B show embodiments of pyramid base structures;
  • FIG. 26 shows a programming schematic for scripting a virtual reality scene;
  • FIGS. 27A-27D show graphics of four stages of an exemplary virtual environment;
  • FIG. 28 summarizes the mean and standard deviations of various testing sub-phases shown in FIGS. 27A-27D;
  • FIG. 29 shows another schematic of the multi-user virtual reality rehabilitation (MUVER) system in accordance with the present invention;
  • FIG. 30 shows top and bottom portions to a ring prototype;
  • FIG. 31 shows the top and bottom portions of FIG. 30 assembled;
  • FIG. 32 shows an IMU disposed in the assembled ring prototype;
  • FIG. 33 shows the ring prototype with a plurality of bend sensors;
  • FIG. 34 shows a knuckle plate for the prototype of FIG. 33;
  • FIG. 35 shows a ring prototype mechanically coupled to a knuckle plate;
  • FIG. 36 shows an exemplary virtual environment scene for a single degree of freedom knob;
  • FIG. 37 shows an exemplary virtual environment scene for a single degree of freedom hand device;
  • FIG. 38 shows an exemplary virtual environment scene for an active hand device;
  • FIG. 39 shows an embodiment of a SmartGlove input device;
  • FIGS. 40A and 40B show MCP flexion/extension set-ups for 45 degrees and 90 degrees, respectively;
  • FIGS. 40C and 40D show PIP flexion/extension set-ups for 45 degrees and 90 degrees, respectively;
  • FIG. 41A shows an illustrative bar graph of MCP bend data for the input device shown in FIGS. 40A and 40B;
  • FIG. 41B shows an illustrative bar graph of PIP bend data for the input device shown in FIGS. 40C and 40D; and
  • FIG. 42 shows a schematic of a bi-manual SmartGlove system with arm splints for neutral wrist position and support.
  • DETAILED DESCRIPTION OF THE INVENTION
  • U.S. provisional patent application No. 61/145,825 entitled “Multiple User Virtual Environment for Rehabilitation (MUVER)”, which was filed on Jan. 20, 2009, and U.S. provisional patent application No. 61/266,543 entitled “Low Cost Smart Glove for Virtual Reality Based Rehabilitation”, which was filed on Dec. 4, 2009, are incorporated herein by reference in their entirety.
  • As mentioned above, the increasing number of stroke patients who, of necessity, recover in their homes is driving the need for inexpensive, in-home hand rehabilitation devices. The devices described above in the background section provide a basis for a variety of rehabilitation options when combined with the correct software and a therapist-approved regimen. The NU-MUVER (Northeastern University Multiple-User Virtual Environment for Rehabilitation) system has been developed by Northeastern University of Boston, Mass. to meet the need for an inexpensive device that can be used to rehabilitate hand and finger movements of stroke survivors and other patients experiencing neurological or orthopedic problems. The NU-MUVER system is designed to be used at home and/or over a network, e.g., a LAN, a WAN, the World Wide Web, the Internet, and the like, alone or with others, e.g., a therapist, other patients, and so forth.
  • The NU-MUVER consists of three basic components: an input device that generates data on the position, attitude, and orientation of the patient/user's hand in space, as well as on individual finger and thumb movements; commercially-available graphics software that provides object and animation routines that can be used to construct various movement re-training scenes; and a control unit that includes control software enabling networking capability, movement parsing, performance scoring, and the recording, storage, manipulation, and display of data, together with multiple training “scenes” designed to facilitate the practice of a particular movement or movements that are therapeutic for discrete patient populations.
  • Referring to FIG. 9, a MUVER system 90 in accordance with the present invention is shown. The system 90 is structured and arranged to provide a plurality of virtual environments 100 designed for specific rehabilitation exercises and for multiple patients/users 91 to interact with others, and with third parties 96, e.g., medical personnel, physical therapists, and the like. Virtual environments 100, or worlds, that are designed for more than one patient/user 91 are called Multi-User Virtual Environments (MUVE). Because the instant MUVE is for rehabilitation, the system 90 is referred to as a “Multi-User Virtual Environment for Rehabilitation” or MUVER 90. The elements of the MUVER 90 are shown in the figure and are discussed in greater detail below.
  • The MUVER system 90 is designed to be modular, which is to say that the number of patients/users 91 and the size of the virtual environment as a whole, or of each discrete, individual or personal virtual environment 100, can vary and, moreover, can be easily changed. To facilitate modularity further, each patient/user 91 is equipped with his/her own personal computer (PC) 93 on which MUVER software 94 is installed. As shown in FIG. 9, an input device 92, the PC 93, and the software 94 define each personal virtual environment 100. The input device 92 in each individual virtual environment 100 is adapted to enable each patient/user 91 to interact with other patients/users 91, a third party 96, and the like.
  • Communication from and between personal virtual environments 100 takes place over and through a virtual environment network 95, e.g., a LAN, a WAN, the World Wide Web, the Internet, and the like. This approach differs appreciably from other virtual environments in which a dedicated server operates the virtual environment for each of the patients/users. This feature facilitates recording and logging communications between the virtual environment network 95 and a third party's computer 96 for later evaluation.
  • The Virtual Environments
  • The design of a unique virtual environment has several stages. The first stage is to use the nature of the patient/user's disability to determine what rehabilitation exercises or movements would be appropriate and feasible to emulate in a virtual environment. Preferably, the rehabilitation exercises or movements are selected by a physical therapist, a physician, a medical specialist, and the like.
  • Once appropriate rehabilitation exercises have been chosen, the next step is to choose the character of multi-user interaction. Once again, preferably, the character of interaction is determined by a physical therapist, a physician, a medical specialist, and the like. Common types of multiplayer or multi-user virtual interaction, e.g., competitive interaction, counter-operative (versus) interaction, cooperative interaction, mixed interaction, and any other combination of the first three interactions, are illustrated in FIG. 10.
  • “Competitive interaction” occurs where each patient/user 91 of a plurality of patients/users 91, who have no direct interaction between them, completes the same task having the same goals for which a comparative score can be assigned. “Counter-operative” or “versus interactions” occur where a first patient/user 91 works against a second patient(s)/user(s) 91 to achieve competing goals, which only one of the patients/users 91 can obtain. “Cooperative interaction” occurs where two or more patients/users 91 work jointly to complete a common goal or task. “Mixed interactions” occur where patients/users 91 work together to complete a common goal but the performance of each patient/user 91 is scored comparatively.
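  • For the purpose of illustration and not limitation, these four interaction types can be represented in software as a simple enumeration consulted by the scoring logic. The following is a minimal, hypothetical Python sketch; the class and function names are assumptions for illustration and are not part of the MUVER software:

      from enum import Enum, auto

      class Interaction(Enum):
          COMPETITIVE = auto()        # same task, same goals, comparative scoring
          COUNTER_OPERATIVE = auto()  # competing goals; only one user can attain them
          COOPERATIVE = auto()        # joint work toward a common goal
          MIXED = auto()              # common goal, but individually scored

      def score_trial(mode, scores):
          """Return per-user results for one trial (hypothetical scoring rule).

          scores maps user identifiers to their raw trial scores.
          """
          if mode is Interaction.COOPERATIVE:
              shared = sum(scores.values())          # all users receive the joint score
              return {user: shared for user in scores}
          return dict(scores)                        # comparative modes keep individual scores

  • Under this sketch, cooperative trials share a joint score while the comparative modes retain individual scores, mirroring the definitions above.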
  • Once a desired virtual interaction is selected, the setting of the individual virtual environment 100 can be established. Preferably, the setting is simple and appropriate for the discrete patient/user 91 and, moreover, is designed to make the goal(s) clear for each patient/user. After the virtual environment 100 is completely designed, rigorous testing for both interaction and substance is necessary.
  • The MUVER system 90 includes three components: an input device 92, a graphic display device, and a controller. For the purpose of illustration and not limitation, the MUVER system 90 will be described in terms of a SmartGlove™ as the input device, a Panda3D graphics engine for the graphics display device, and specialty software and driver programs for controlling the system 90. These components are discussed in subsequent sections; first, however, brief descriptions of sensors and of positioning and tracking systems are provided.
  • Sensors
  • Sensing devices (“sensors”) are provided to sense movement, e.g., bending, flexion, and so forth. The bend or flexion of a human finger can be measured using various devices, which are collectively referred to as bend sensors.
  • Electronic bend sensors use physical geometries and material properties to alter an electrical signal in proportion to angle or pressure. Bend radius and bend angle affect sensor output voltage. Bend sensors have been used for finger position measurement for quite some time, with the first large-scale commercial application appearing in 1989 with the Nintendo® Power Glove. There remains a wide range of currently-available products that use bend sensors, from very simple to very expensive.
  • Other types of bend sensors include optical fiber sensors and mechanical measurement devices. Electromechanical sensors provided in, for example, the Nintendo® Power Glove use the patented technology of Abrams Gentile Entertainment Inc. (“Abrams”). Abrams defines five different electromechanical methods for changing the resistance of an electrically-conductive construction based on a bend angle. A first economical application of these technologies involves a resistive sensor having a carbon-based, electrically-conductive ink as a stretched part, which changes electrical resistance in response to applied pressure. Using simple baseline calibration routines, reliable measurements of bend angle are attainable.
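  • The baseline calibration mentioned above can be as simple as a two-point linear fit: the sensor output is recorded with the finger held flat (0°) and again at a known bend (e.g., 90°), and intermediate readings are interpolated. A minimal Python sketch, assuming a generic analog-to-digital read-out (the read_adc helper is hypothetical):

      def calibrate(flat_reading, bent_reading, bent_angle=90.0):
          """Return a function mapping raw resistive-sensor readings to bend
          angle, using a two-point linear baseline calibration."""
          scale = bent_angle / (bent_reading - flat_reading)
          def to_angle(raw):
              return (raw - flat_reading) * scale
          return to_angle

      # Usage: capture baselines with the finger held straight, then at 90 degrees.
      # to_angle = calibrate(flat_reading=512.0, bent_reading=790.0)
      # angle = to_angle(read_adc(channel=0))   # read_adc is a hypothetical helper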
  • Other similar electromechanical inventions have been patented, including some that modulate capacitance of the sensor in a similar manner. This class of sensors offers better reliability and accuracy than resistive sensors, and is also able to determine the direction of bending rather than just the magnitude of the bend.
  • Optical bend sensors typically include a light source that is coupled to a light detector using, for example, an optical fiber. As the fiber bends, total internal reflection (TIR) breaks down and less light traverses the length of the fiber. For example, at higher bend angles, relatively few rays strike the detector and more rays exit the fiber at large angles.
  • This particular optical technology was used to develop the Data Glove, one of the earliest hand data recording systems, made by VPL Research, Inc. The optical method provides a repeatable measurement of a bend angle; however, it is less cost effective than either of the previously described technologies. Other optical technologies improve on the concept by using multiple fibers in a bundle or by pre-bending the fiber in a certain direction, and are therefore able to measure direction of bend as well as magnitude.
  • Finally, a class of angle measurement sensors exists that relies on mechanical means, such as the tension of a cable disposed inside a rigid tube or the relative position of members in an armature. These systems are many but, in most cases, are better suited to a particular application; hence, these systems are not inherently flexible. Conventionally, with these systems, changes in the geometry of a mechanical arm assembly are measured. Although such a system could potentially cost less than, for example, a resistive bend sensor, the time spent on design and troubleshooting would likely offset any cost savings. Accuracy can also be quite good; however, tighter mechanical tolerances must be controlled for repeatable measurements.
  • Hall effect sensors are switches that are activated in the presence of a magnetic field, such as that generated by a magnetic field-producing device, e.g., a magnet. The sensor contains a conductor through which an electrical current flows. The moving charges follow a straight line except when in proximity of a magnetic field perpendicular to the current, at which time the path of each charge becomes non-linear, i.e., curves, and charge accumulates on one face of the sensor, producing a measurable voltage. The distance at which the magnetic field causes the sensor to act like a switch is a function of the strength of the magnetic field and, therefore, of the magnet, and of the current density specified by the sensor.
  • Positioning and Tracking Systems
  • The ability to generate position coordinates in six axes (three translational and three rotational) and the ability to continuously track the position coordinates are critical to the operability of the device and system. Several commercially-available positioning systems can produce position coordinates accurately and track multiple points at once. For example, magnetic tracking systems combine very high tracking resolution with high-speed sampling, which makes them well suited to virtual reality simulations. Disadvantageously, magnetic tracking systems are very expensive and, furthermore, the likelihood of successfully integrating magnetic tracking in an inexpensive home system is not very high. The magnetic fields associated with these systems also may experience high interference in home operation, affecting proper and satisfactory system operation.
  • Radio frequency positioning and tracking, e.g., using a few radio frequency identification (RFID) tags in combination with a plurality of receiving units, is a possible alternative. Typically, RFID systems determine the position of an object, e.g., a hand, by triangulation, e.g., measuring the time it takes for the RFID signal to travel to/from the object for each of the plurality of receiving units. However, conventional RFID systems operate at or near the wireless spectrum of most household devices, making interference an issue. RF systems also would not provide the accuracy necessary for the present invention.
  • Infrared positioning, which is discussed above, can be both accurate and inexpensive. However, IR relies on line-of-sight signals, making obstructions a significant problem.
  • Inertial measurement units (IMUs) are adapted to determine the orientation of an object (in space), the velocity of the object, and 3D positions using dead reckoning. IMUs can be structured and arranged to gather data in all six degrees of freedom and avoid the shortcomings of the IR and electro-magnetic options. One reason why IMUs have not been used heretofore has to do with dead reckoning.
  • “Dead reckoning” refers to all object positions being measured relative to a pre-established and known initial starting (“home”) point. As soon as the IMU moves from the initial starting point, sensors, e.g., multi-axis accelerometers, gyroscopes, and the like, provide data to a processing unit that is adapted to calculate the speed of the IMU and the distance traveled from home. Each successive move relates back to all previous movements; hence, positioning is a compilation of a plurality of discrete, smaller movements.
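  • In essence, dead reckoning is a double integration of acceleration relative to the “home” point, re-zeroed whenever the hand returns to that point. The following simplified Python sketch illustrates the idea; gravity removal and drift correction, which any practical IMU pipeline requires, are deliberately omitted:

      class DeadReckoner:
          """Tracks hand position relative to a known 'home' point by
          integrating acceleration twice (simplified; gravity removal and
          drift correction are omitted)."""

          def __init__(self):
              self.reset_home()

          def reset_home(self):
              # Called whenever the hand is placed back on the base station.
              self.velocity = [0.0, 0.0, 0.0]
              self.position = [0.0, 0.0, 0.0]

          def update(self, accel, dt):
              # Integrate one accelerometer sample (m/s^2) over dt seconds.
              for i in range(3):
                  self.velocity[i] += accel[i] * dt
                  self.position[i] += self.velocity[i] * dt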
  • Finally, three-dimensional cameras provide real-time translational axis positional data but do not provide rotational axis positional data. Typically, an illumination source emits IR light having a discrete, pre-established modulation frequency, e.g., 44 MHz. A plurality of sensors, half of which operate at the pre-established frequency and half of which are out-of-phase with that frequency, measures the time it takes for the IR light to be reflected by an object and to return to the sensor, which is to say, the time-of-flight (TOF). TOF data provide accurate depth data of the object, which can be gathered at an acceptable rate of 60 frames per second.
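  • The depth computation itself reduces to halving the measured round-trip time of the modulated light. A simplified Python sketch of the phase-based calculation for a single pixel follows; the two-sample in-phase/quadrature form shown is an assumption that simplifies what commercial TOF sensors actually do:

      from math import atan2, pi

      SPEED_OF_LIGHT = 299_792_458.0  # m/s

      def tof_depth(in_phase, quadrature, mod_freq_hz):
          """Estimate one-way distance from the phase shift of modulated IR
          light measured at a single pixel."""
          phase = atan2(quadrature, in_phase) % (2 * pi)  # phase lag of the return
          round_trip = phase / (2 * pi * mod_freq_hz)     # time-of-flight in seconds
          return SPEED_OF_LIGHT * round_trip / 2          # halve: light travels out and back

      # At a 44 MHz modulation frequency, the unambiguous range is
      # c / (2 * f), roughly 3.4 m, which suits a seated exercise area.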
  • Input Device (SmartGlove™ )
  • An ideal input device 92 is a wearable glove that is sized to be universal, i.e., useable on either hand, or adjustable and, optionally, has its fingertip portions removed to accommodate different hand sizes. Preferably, the input device 92 is structured and arranged to enable a patient/user 91 to don it and doff it using only one hand. A total weight not to exceed one pound and a dorsal weight not to exceed eight ounces are recommended to facilitate use by patients/users 91.
  • Ideally, movement of the hand is possible in all six axes and, more importantly, is recognizable for the purpose of generating and recording movement and orientation data. Additionally, motion of each finger and thumb is not hindered and is individually isolatable.
  • Preferably, the input device 92 is structured and arranged to measure at least one of the following accurately: finger flexion/extension to at least 90°; wrist flexion/extension, i.e., dorsal action, at ±90°; wrist-radial deviation up to 40°; wrist-ulnar deviation up to 50°; and supination/pronation of the forearm up to 180°.
  • Referring to FIG. 11, a first input device embodiment 70 includes bend sensors 71, 72, and 73, which are disposed on the back of the input device 92 on finger portions that are in registration with the patient/user's index, middle, and ring fingers. To reduce weight and cost, a sensor on the pinky finger, whose movement generally follows that of the adjacent ring finger very closely, is optional. As shown in FIG. 12, a bend sensor 74 can also be disposed on the back of the input device 92 in registration with the thumb and a bend sensor 75 can be disposed on the input device 92 at the base of the palm of the hand. The latter bend sensor 75 is adapted to bend as the heel of the thumb crosses the palm to oppose one or more of the fingers, e.g., during a pinch motion.
  • Referring to FIG. 13, a two-dimensional bend sensor 77, which is disposed on the back of the input device 92 in registration with the wrist and oriented along the axis of the ulna, is provided to capture wrist flexion/extension. To track radial and ulnar deviations, a bend sensor 78 is disposed on the ulnar side inside of the hand, generally oriented along the axis of the thumb in a neutral position. At a neutral hand position, the bend sensor 78 for radial and ulnar deviations is slightly bent. When the hand is rotated or turned in an outward direction from the neutral position, the same sensor 78 will appear to remain straight. However, when the hand is rotated or turned inwardly from the neutral position, the sensor 78 will detect and measure the greatest movement.
  • Preferably, the glove 70 can be hardwired to a base station that includes the electronics required to communicate with the PC 93, e.g., wirelessly or via a USB connection 99. This eliminates the need to attach an electronics board to the patient/user's forearm. The base station can be ergonomically shaped and can include a mechanical button for dead reckoning purposes. A plurality of, e.g., two, input buttons can also be provided to facilitate digital YES (or 1) and NO (or 0) input for navigating the software. To prevent false readings, the software will only recognize button input at discrete, pre-established times, e.g., between exercise sets.
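  • The button-gating behavior described above can be stated compactly: presses are forwarded to the navigation software only between exercise sets and are otherwise ignored. A minimal, hypothetical Python sketch (the class and method names are assumptions):

      class ButtonGate:
          """Accept YES/NO button presses only between exercise sets, to
          prevent false readings during movement."""

          def __init__(self):
              self.accepting = False

          def set_between_sets(self, between):
              # The exercise software toggles this at set boundaries.
              self.accepting = between

          def on_press(self, value):
              # value is 1 for YES, 0 for NO.
              if self.accepting:
                  return value   # forwarded to the navigation software
              return None        # ignored mid-exercise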
  • A feedback system, e.g., a haptic system, an audible speaker, and/or light-emitting devices, can also be disposed on or within the finger portions of the glove and/or on the back of the glove 70, e.g., with or in the IMU housing 76, to provide vibratory or auditory cues and/or visual signals during exercises. For example, referring to FIG. 14, electroluminescent (EL) wire 79 can be exposed around each of the index, middle, and ring fingers so that when any of the fingers is moved into a correct position, the EL wire 79 disposed about the correctly-positioned finger can be illuminated, e.g., by a sequencer (not shown) electrically coupled to a power source and a power control device (not shown), e.g., an inverter.
  • Referring to FIG. 15, a second glove embodiment 60 is shown. The glove 60 includes bend sensors 61 and 62 that are disposed, respectively, on the metacarpophalangeal (MCP) joint and the proximal interphalangeal (PIP) joint of the thumb and of each finger, including the pinky finger. The MCP and PIP bend sensors 61 and 62 are adapted to record arcuate bend data associated with the motion or movement of each finger. A third bend sensor 63 is disposed on the back of the hand at the base of the thumb. For measuring wrist flexion/extension, a bi-directional bend sensor (not shown) is disposed to extend across the wrist on the palm side of the glove 60.
  • Optionally, a switch pad 64, e.g., a capacitive touch sensor, can be disposed on or within the tip of the thumb portion of the glove 60 for providing and recording pinch data. In operation, when the tip of one or more of the glove fingers contacts, i.e., activates, the switch pad 64, the controller is adapted to use the generated touch data and bend sensor data to differentiate the pinching finger from the non-pinching fingers.
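  • The differentiation logic can be summarized as follows: when the thumb-tip switch pad reports contact, the finger whose bend sensors read the greatest flexion is taken to be the pinching finger. The same logic applies to the Hall effect variant described below. A minimal, hypothetical Python sketch (the sensor names and read-out format are assumptions):

      def identify_pinching_finger(pad_active, bend_angles):
          """Combine touch-pad and bend-sensor data to name the pinching
          finger; bend_angles maps finger names to current flexion in
          degrees, e.g., {'index': 72.0, 'middle': 35.0, 'ring': 20.0}."""
          if not pad_active:
              return None   # no thumb-tip contact, hence no pinch
          # The most-flexed finger is assumed to be the one touching the pad.
          return max(bend_angles, key=bend_angles.get)

      # identify_pinching_finger(True, {'index': 72.0, 'middle': 35.0, 'ring': 20.0})
      # returns 'index'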
  • An IMU 65 is provided on the glove 60, e.g., between the knuckles and the wrist on the portion of the glove 60 corresponding to the back of the patient/user's hand. During prototype testing, the inventors used an IMU 65 that includes three integrated circuits: two vibrating-beam gyroscope assemblies and one micro-machined accelerometer. Together, the three chips generate data in all three acceleration axes and in all three rotational axes; hence, six degrees of freedom are captured.
  • The IMU 65 is electrically coupled to a processing unit 66 that is removably attached to the patient/user's forearm as shown in FIG. 16. By attaching the processing unit 66 to the patient/user's forearm, the weight of the unit 66 is removed from the hand so as not to hinder or interfere with hand movement while keeping the sensors 61-64 proximate to the unit 66.
  • A Hall effect glove embodiment 50 is shown in FIGS. 17-19. A single bend sensor covering both the MCP and PIP joints 51 is disposed on each of the finger portions of the glove 50. A plurality of, e.g., three, bend sensors 52-54 are disposed on the thumb portion of the glove 50. A bend sensor 52 is disposed between the thumb portion and the index finger portion of the glove 50 to track relative movement between the fingers and the thumb; another bend sensor 53 is disposed along the axis of the thumb portion to track movement of the thumb; and a third bend sensor 54 is disposed at the base of the thumb portion of the glove 50 to measure roll of the wrist joint as the patient/user's thumb reaches across the palm.
  • To measure finger pinch, on the palm side of the glove 50, Hall effect sensors 56 are disposed on the tips of each glove finger and a magnetic field generating device 57, e.g., a magnet, is disposed at or near the tip of the glove thumb. When the magnetic field from the magnetic field generating device 57 approaches and/or contacts any of the sensors 56, the sensors 56 switch, providing data to the controller. The controller is adapted to use the generated touch data and bend sensor data to differentiate between and identify the pinching finger and the non-pinching fingers.
  • To properly measure hand position and orientation, an IMU 52 is disposed on the back of the hand portion on the glove 50. Another Hall effect sensor 59 is also disposed in the palm of the glove 50 to enable dead reckoning. More specifically, whenever the patient/user 91 places his/her hand correctly on a base station (described in greater detail below) that is equipped with a magnetic-field generating device (not shown), the Hall effect sensor 59 will generate a signal from which the controller will call and execute an algorithm, software, driver programs, and the like to calibrate the IMU 52.
  • Advantageously, with Hall effect sensors in the glove 50 to activate a signal, the base can be wireless. Although not required, a wireless base minimizes the proliferation of wires and cables, which can get tangled and/or hinder movement.
  • As shown in FIG. 19, a strap 40 can be removably attached to the patient/user's forearm. The strap 40 can include a power source (not shown), e.g., one or more batteries, a controller (not shown), e.g., an Arduino USB board manufactured by Arduino Software of Italy, and an accelerometer 41. Comparison of accelerometer readings from the accelerometer 41 on the forearm and from an accelerometer disposed in the IMU 52 can be used to determine the angle of wrist bending.
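  • With the arm at rest, each accelerometer measures the gravity vector in its own frame, so the wrist bend angle can be estimated as the angle between the two measured vectors. A simplified Python sketch of that comparison follows (static case only; during movement, additional filtering would be required):

      from math import acos, degrees, sqrt

      def wrist_angle(hand_accel, forearm_accel):
          """Angle in degrees between the gravity vectors measured by the
          hand-mounted IMU and the forearm accelerometer; valid only while
          the arm is at rest."""
          dot = sum(a * b for a, b in zip(hand_accel, forearm_accel))
          norm = (sqrt(sum(a * a for a in hand_accel))
                  * sqrt(sum(b * b for b in forearm_accel)))
          cosine = max(-1.0, min(1.0, dot / norm))   # clamp rounding error
          return degrees(acos(cosine))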
  • Referring to FIG. 20, an arm sleeve embodiment 45 is shown. The embodiment includes a glove portion 42 and a pulley portion 43. To measure the flexion/extension of the patient/user's digits, bend sensors 44 and 46 are disposed on the back of the hand portion of the glove portion 42, to be in registration with the MCP and the PIP of each finger, while at least one bend sensor 47 is disposed along the PIP of the thumb. To measure the roll of the thumb, at least one bend sensor (not shown) is disposed on the glove portion 42 along the crease of the thumb along the palm of the hand. Flexion/extension of the wrist can be measured by a bend sensor (not shown) that is disposed on the glove portion 42 in registration with the posterior side of the patient/user's wrist.
  • Optionally, the glove portion 42 can be fingerless, permitting a better fit across a variety of hand sizes. When fingerless, a plurality of rubber caps are attached to the tips of each finger and thumb. Inside each rubber cap is a small push button that is proximate the finger or thumb tip. The operation of the buttons in the fingerless version is the same as that previously described.
  • The pulley portion 43 is provided to measure radial and ulnar deviations. To that end, the pulley portion 43 includes a strap 48 that, at a first end, is securely but releasably attachable to, e.g., the medial side of, the glove portion 42, e.g., using a hook-and-pile material; that runs along the lateral side of the patient/user's wrist; and that passes through a small pulley 49 that is disposed, e.g., in an arm sleeve, above the patient/user's elbow.
  • Referring to FIG. 21, an exemplary hardware interface, i.e., input device 92, that each patient/user 91 will use is the SmartGlove™ developed at Northeastern University of Boston, Mass. The SmartGlove™ 92 was chosen as an input device 92 because it is a low-cost, off-the-shelf device that is suitable for home use. Newer technologies could provide the same kind of interface for patients/users 91 while maintaining high usability and low cost for the practitioner.
  • An input device mounting box prototype will be described referring to FIGS. 30-33. FIG. 30 shows top and bottom portions 101, 102 of a mounting box 110 that, when assembled as shown in FIG. 31, form a box-like structure that is structured and arranged to rest on the back of a patient/user's hand and to accommodate all of the necessary sensing devices. A depressed area 106 for receiving an IMU 108 is provided in the top portion 101. A pair of vertical standoffs 109 are provided to orient the IMU 108. A slot 104 for accommodating a Hall effect sensor 105 is also provided in the top portion 101.
  • The prototype 110 is releasably attachable to the back of the patient/user's hand using, for example, a hook and pile combination that can be routed through a pair of openings 103 provided for that purpose.
  • Referring to FIG. 33, bend sensors 107, which will be disposed within the material of the SmartGlove™ 92, are shown optionally coupled to exiting sleeve rings 109 at a first end and, to mimic the structure of the hand, are mechanically attached to a single point at the rear of the mounting box 110 at a second end. As shown in FIG. 35, each bend sensor 107 enters the mounting box 110 via a respective slot 112. Elastic string can be used for attaching the bend sensors 107 to the single point. The string prevents the sensors 107 from rotating about the single point while also allowing the sensors 107 freedom to translate along the axis of each finger as flexion/extension occurs. This enables the sensors 107 to retain the geometry of the patient/user's hand as knuckles are flexed and relaxed. It also enables the bend sensors 107 to remain in a constant position relative to the fingers.
  • Optionally, to fix and hold constant the radius over which an MCP bend sensor will flex and to prevent the bend sensors 107 from sliding to the sides of the patient/user's knuckles, a knuckle plate 111 (FIGS. 34 and 35) can be provided. Because signals from the sensors 107 vary with the bend radius independent of the actual angle of the bend, if the radius can be held constant, the angle of the MCP joint's bend can be accurately modeled.
  • The SmartGlove™ 92 was also chosen because it is easily connected to a PC 93, e.g., wirelessly or via a universal serial bus (USB) port 99, and offers satisfactory control to the patient/user 91. For use in connection with the MUVER system 90, a USB 2.0 connection is preferred for its greater data transfer rate.
  • Technical specifications for the SmartGlove™ input device 92 include:
  • Finger Sensor Specifications
    • Five independent finger measurements
    • 60 Hz refresh rate
    • 0.5 degree resolution
    Tracking System Specifications
    • Optical tracking system
    • 3-foot range from “receptor”
    • 45 Hz refresh rate
    • Six DOF
    Yaw/Pitch/Roll Specifications
    • 3 degree resolution
    • 3 degree accuracy
    X/Y/Z Specifications
    • 0.125 inch resolution at 3 foot range from “receptor”
    • 0.5 inch accuracy at 3 foot range from “receptor”
    USB System Specs
    • USB 1.1 compliant
    • HID specification compliant
    • At least two USB interfaces provided, i.e., a native P5 mode and a standard mouse mode
  • Referring again to FIG. 21, during operation, which is to say, during manipulation of the hand rehabilitation (input) device 92, the patient/user's hand and/or finger movements are transmitted to the PC 93, e.g., wirelessly and/or via the USB connection 99. A device driver 97 for the SmartGlove™ 92, which can be installed in the operating system of the SmartGlove™ 92 or, alternatively, as shown in FIG. 21, in the operating system 98 of the PC 93, interprets the input data and provides the data to the virtual reality software 94.
  • As shown in FIG. 39, although the IMU 115 is incorporated into the input device 92 and disposed on the back of the patient/user's hand, when a USB connection 99 is used, a cable mount 112 can be disposed on the patient/user's forearm to reduce the total weight of the system on the patient/user's hand.
  • The virtual reality software 94, in turn, uses the SmartGlove™ 92 input data to generate output signals designed to display appropriate images in a virtual reality on a display device (not shown), e.g., the display device of a PC. Preferably, the MUVER programming display software includes three different pieces that are illustrated in FIG. 26: a game engine 82, 3D models and graphics 83, and a scripting code 84.
  • Bi-Manual System
  • Although the above description describes a single input device, the system can also include multiple input devices, e.g., a pair of SmartGloves™ for the left and right hands of the patient/user, that can be used simultaneously. Multiple input devices in general and, more specifically, a pair of SmartGloves™, permit more complex and realistic rehabilitation tasks such as, in virtual reality, simultaneously grasping a jar with a first hand and removing a lid from the jar with a second hand.
  • An illustrative bi-manual system 120 is shown in FIG. 42. Each of the patient/user's hands and forearms is supported in an adjustable arm splint 111 for neutral wrist position and support. The end portions 119 of each of the splints are ergonomically curled to make the natural position as comfortable for the patient/user as possible. Each hand is disposed within an input device 113 that can be a Spandex®/cotton-blend glove.
  • Bend sensors 114 are positioned (within a pocket in or within the material of the glove 113) across the MCP and PIP of the patient/user's index, middle, and ring fingers, to measure finger flexion and extension. A fourth bend sensor 116 is disposed along the back of the hand proximate the patient/user's wrist to measure wrist flexion and extension. A fifth bend sensor 117 is disposed along the back of the thumb to measure the rotation of the thumb with respect to the patient/user's palm and fingers.
  • Hall effect sensors 118 can be wired in the finger tip portions of the glove 113 and are adapted to interact when they come in proximity of a small magnet (not shown) that is disposed in the palm of the glove. The small magnets in each of the gloves are used in combination with the globe base (see below) to calibrate the position of the gloves with respect to the base.
  • An IMU 115 is disposed on the back portion of the glove 113 for monitoring the three-dimensional hand position, i.e., attitude, of the patient/user's hand. To reduce the weight on the patient/user's hand and to facilitate connection of wiring to the IMU 115, a cable housing 112 is wrist-mounted. More preferably, wireless communication of data can be effected.
  • Base Unit
  • With the present invention, the patient/user must also be able to achieve the same dead reckoning position at the start and/or at the completion of each exercise due to, inter alia, the nature of IMU positioning. To best achieve this, the system includes a base structure that accounts for ergonomics and an activation sequence that enables the controller to receive data once the patient/user has placed his/her hands in the appropriate position.
  • FIG. 22A shows a banana grip base 30 having a plurality of RFID tag receivers 31 and FIG. 22B shows the same grip base 30 with a patient/user's hands placed in the appropriate starting (“home”) position. The ergonomics of the banana grip base 30 allow patients/users to place their hands on the base pad 32 without having to strain to cause them to lie flat. The RFID receivers 31 do not require power, allowing the grip base 30 to remain completely passive, i.e., wireless.
  • According to this approach, RFID transceivers 59 (see FIG. 18) are disposed in the input device 50 so that when the patient/user's hands are disposed at the “home” position, the receivers 31 and transceivers 59 are proximate, causing the transceiver 59 to emit a signal to that effect. A drawback of the RFID approach is its limited accuracy: slight deviations from a true “home” position may skew the results of the exercise.
  • FIG. 23 shows a globe base 35 that includes imprinted hand grooves 33 that define the “home” position. Capacitive touch sensors 34 can be disposed at each of the finger and thumb tips of the hand grooves 33. The globe base 35 is adapted so that the patient/user's digits must touch each of the touch sensors 34 in both hand grooves 33 for the location data to zero itself. This design is particularly attractive due to its simplicity and, further, it causes little strain on the patient/user's hands. The hand grooves 33 and the touch sensors 34 at the fingertips of the hand grooves 33 make it intuitive to use. Alternatively, Hall effect sensors and magnets can be integrated into the input device 92 and the globe base 35.
  • The globe base 35 is a single-piece, ergonomic design that requires a power source. Its relatively large size makes it possible to house electronics, e.g., a processing board, a Bluetooth® receiver, and other accessories (including the input devices when not in use), within the base 35 itself. Various feedback systems, e.g., speakers for audio feedback, vibration devices for haptic feedback, and LEDs for visual feedback, can also be disposed within or on the outer surface of the globe base 35.
  • Prototype testing by the inventors highlighted the need to provide wrist and/or forearm support for accommodating and supporting the patient/user's hands when properly positioned in the “home” position. The incorporation of medical arm splints with the globe base 35 (FIG. 42) allows the patient/user to maintain his/her hands in a neutral, “home” position, which is to say: full pronation, no extension or flexion, and no radial or ulnar deviation. By employing a splint that is not in wrist extension, as most are, patients having, for example, a contracture or spasticity in wrist or finger flexor muscles can more easily achieve the correct, neutral position.
  • Disadvantageously, touch sensors 34 in the fingertips alone do not ensure that the patient/user's palms are also touching the base 35. As a result, location data may be incorrect. The touch sensors 34 also require the glove device to be fingerless as only contact with exposed skin at the fingertips will activate the sensors 34. Finally, the size of the globe base 35 is relatively large, requiring additional exercise area.
  • FIGS. 24A and 24B show a teardrop base 39, one of which is provided for each hand. The teardrop base 39 design creates an ergonomic platform on which a patient/user may rest his/her hands. A tactile switch button 38 is disposed on the base 39 so that when the patient/user's hand is properly positioned over the base 39, the button 38 will activate the device. The active switch button 38 can be electrically coupled to the controller, e.g., via a USB connection. The main problems with the teardrop design are that the button 38 will not necessarily be activated repeatedly by the same part of the patient/user's hand and that two bases 39, one for each hand, are needed, requiring two dead reckoning signals.
  • FIGS. 25A and 25B show a pyramid base 37, one of which will also be provided for each hand. The design creates an ergonomic platform for hand placement, which is particularly effective for stroke patients who frequently have difficulty spreading their hands. The pyramid base 37 includes a magnetic field-generating source 36, e.g., a magnet, that is adapted to activate a Hall effect sensor that is disposed in the palms of the input device gloves. Hall effect activation allows hand placement on the base 37 to be repeatable within an acceptable tolerance proportional to the resolution of the sensor 36, while allowing the base 37 to be passive, i.e., wireless and not requiring power.
  • The simplicity of design, the ergonomic shape, and the Hall effect activation are advantages of the pyramid base 37. Disadvantageously, two bases 37, one for each hand, are needed, requiring two dead reckoning signals; furthermore, the sensor disposed in the glove input device may cause discomfort.
  • Three-dimensional Graphics Engine
  • A 3D graphics engine is a library of subroutines for 3D rendering and game development. The 3D models and graphics 83 portion populates the engine code and follows the rules of the game engine 82. The scripting code 84 in the game engine 82 controls many of the low-level features, e.g., physics and display, and, preferably, is written in the Python™ scripting language. More preferably, the scripting code 84 can overwrite some or all of the low-level game engine code while also providing unique features to the 3D models 83.
  • Software demands for the MUVER 90 are both specific and advanced. Indeed, any development platform selected requires network capabilities and 3D graphics. Preferably, the MUVER 90 includes a software package that is simple both to implement and to change and that, also, is capable of providing the features that the MUVER 90 requires, e.g., network coding, 2D/3D rendering, and so forth. One possible software option for the MUVER 90 is to use a 3D graphics engine for most of the code and a scripting language to program the various scenes and the use of the SmartGlove™ 92.
  • There are many commercially-available game engines that have varying levels of programming sophistication. Five such engine solutions that could be integrated into the MUVER 90 include: Panda3D, Blender, Source and Hammer, XNA®, and Flash®. Each solution has advantages and disadvantages with respect to the other solutions.
  • The Panda3D graphics engine was originally created by Disney but is currently owned by Carnegie-Mellon University of Pittsburgh, Pa. The software integrated into the graphics engine is open source, which is to say that it is free to the public for download for commercial and non-commercial use and can be freely modified. The Panda3D graphics engine uses the Python™ scripting (programming) language and is written in object-oriented C++ libraries and modules. Panda3D has comprehensive support for networking that allows for rich virtual interactions between patients/users. Data input methods for Panda3D advantageously include direct input of Head Mounted Displays (HMD) and VR trackers.
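  • For orientation, a Panda3D application reduces to a ShowBase instance, loaded models, and per-frame tasks. The following is a minimal, hypothetical Python sketch of driving a scene object from glove data; the glove reader object, its pose() method, and the model path are assumptions and are not part of the Panda3D distribution:

      from direct.showbase.ShowBase import ShowBase
      from direct.task import Task

      class RehabScene(ShowBase):
          def __init__(self, glove):
              ShowBase.__init__(self)
              self.glove = glove                              # hypothetical SmartGlove reader
              self.lid = self.loader.loadModel("models/lid")  # hypothetical model file
              self.lid.reparentTo(self.render)
              self.taskMgr.add(self.update_lid, "update-lid")

          def update_lid(self, task):
              # Map the hand's position/orientation onto the virtual lid each frame.
              x, y, z, h, p, r = self.glove.pose()            # assumed six-DOF read-out
              self.lid.setPosHpr(x, y, z, h, p, r)
              return Task.cont

      # RehabScene(glove=SmartGloveReader()).run()   # SmartGloveReader is hypothetical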
  • Panda3D remains a preferred platform because of its license agreement as well as its capabilities. However, Panda3D requires additional middleware, which increases cost and complexity. Middleware is a term of art used to define a software “bridge” between hardware, such as an input device 92, and the software of the MUVER. As a result, the middleware must be compatible with the input device, i.e., the SmartGlove™ 92, but must also be able to generate a compatible output signal to the chosen development software.
  • Potential middleware examples include software development tools, e.g., SWIG, that are adapted to connect programs written in a first programming language, e.g., C and C++, with a variety of high-level programming languages. Typically, SWIG is used to create high-level, interpreted or compiled programming environments and user interfaces. SWIG is used with different types of languages, including common scripting languages such as Python™, which is the scripting language of Panda3D. Advantageously, SWIG is open source and, hence, may be freely used, distributed, and modified for both commercial and non-commercial use.
  • SWIG, however, is a difficult program to work with and very unstable. Notwithstanding, SWIG enables programmers to write programming methods for the SmartGlove™ 92 in C/C++ and, subsequently, to “wrap” them so that they can be read in Python™ programming code. This feature allows the SmartGlove™ 92 to be useable in the BlenderGE as a set of Python™ scripts.
  • A second commercially-available middleware is GlovePIE (Glove Programmable Input Emulator), which was originally developed to emulate joystick and keyboard input. Succinctly, GlovePIE emulates movements made with the SmartGlove™ 92 using software macros and, further, binds the movements to an input device such as a keyboard or a joystick. As a result, use of the SmartGlove™ 92 as an input device is extended to any program that is traditionally controlled using a keyboard or a joystick. Disadvantageously, only certain joystick/keyboard movements are emulated by the SmartGlove™ 92, which limits the number of movements that would be available to a practitioner and/or a patient/user.
  • Advantageously, however, use of GlovePIE is no longer confined to VR gloves; rather, GlovePIE now supports emulating a myriad of inputs using a myriad of devices, e.g., Polhemus, Intersense, Ascension, WorldViz, 5DT, and eMagin products. GlovePIE may also control MIDI or OSC output.
  • Hardware supported by the GlovePIE includes:
    • Nintendo Wii Remote
    • NaturalPoint (or eDimensional) TrackIR, OptiTrack, SmartNav
    • FakeSpace Pinch Gloves (9600 baud by default, but can be changed)
    • Concept 2 PM3 rowing machines (ergo or erg)
    • All joysticks or gamepads recognized by Windows
    • Parallel port gamepads (with PPJoy)
    • All keyboards
    • Mice with up to 5 buttons and 2 scroll wheels
    • Most microphones (don't have to be high quality)
    • Most MIDI input or output devices
    • Essential Reality P5 Glove™
    • 5DT Data Glove (all versions)
    • eMagin 2800 3D Visor HMD
    • Polhemus trackers (must be set to 115200 baud): IsoTrak II, FasTrak, Liberty, Patriot, Liberty Latus
    • Ascension trackers: Flock of Birds, MotionStar, etc.
    • Intersense trackers: InterTrax, InertiaCube, IS-300, IS-600, IS-900, IS-1200, etc.
    • WorldViz PPT trackers (all versions)
    • GameTrak (only as a joystick, no direct support)
  • Yet another commercially-available middleware product is OpenTracker, which is manufactured by Argent Data Systems of Santa Maria, Calif. OpenTracker is another open source product that is adapted to create a full-featured tracking software package that can be integrated into any software as a device library. The major advantages of OpenTracker are that it is the most full-featured and robust middleware package and that it natively integrates into C/C++. Hardware supported by OpenTracker includes:
    • A.R.T. optical tracker (Windows, Linux)
    • ARToolKit (Windows, Linux)
    • ARToolKitPlus (Windows, Linux)
    • Parallel Port (Windows, Linux)
    • CyberMouse(Windows)
    • Origin Instruments DynaSight (Windows, Linux)
    • Ascension Flock of Birds (Windows, Linux)
    • Polhemus FastTrak/IsoTrak (Windows, Linux)
    • Garmin GPS devices like eTrex and similar (Windows)
    • ICube X (Windows)
    • Intersense InterTrax2 USB (Windows)
    • Joysticks (Windows)
    • evdev interface (mouse, keyboard, joysticks) (Linux)
    • Barco MagicY (Windows)
    • Midi (Windows)
    • gTec gMobilab/gMobilab+ (Windows, Linux)
    • P5Glove (Windows)
    • Phantom Omni (Linux requires 3rd party driver, Windows?)
    • 3DConnexion SpaceDevice (Windows; requires 3Dxware SDK)
    • 3DConnexion SpaceMouse (Plus USB) (Windows; requires 3Dxware SDK)
    • Ubisense Tracker (Windows, Linux)
    • Polhemus UltraTrak (Windows, Linux)
    • Wacom Graphire (Windows)
    • Wii (Windows)
    • XSense (Windows, requires MT9 SDK from XSens)
  • Finally, the Virtual-Reality Peripheral Network (VRPN) is a set of classes within a library and a set of servers that are designed to implement a network-transparent interface between application programs and the set of physical devices, e.g., trackers and so forth, used in a virtual-reality system. Conceptually, a PC or other host controller is disposed at each VR station to control the peripherals, e.g., tracker, button device, haptic device, analog inputs, sound, and the like.
  • VRPN provides middleware connections between the application(s) and the hardware devices using an appropriate class-of-service for each type of device sharing the link. The application remains unaware of the network topology. Advantageously, VRPN can be used with devices that are directly connected to the system that is executing (running) the application, using separate control programs or running the applications as a single program.
  • VRPN also provides an abstraction layer that makes all devices of the same base class look the same. For example, all tracking devices are made to look like they are of the type vrpn_Tracker. As a result, all trackers will produce the same types of reports. At the same time, it is possible for an application that requires access to specialized features of a certain tracking device, e.g., telling a certain type of tracker how often to generate reports, to derive a class that communicates with this type of tracker. If this specialized class were used with a tracker that did not understand how to set its update rate, the specialized commands would be ignored by that tracker.
  • Current VRPN system types include: Analog, Button, Dial, ForceDevice, Sound, Text, and Tracker. Each type abstracts a set of semantics for a specific device type. There are one or more servers for each type of device, and a client-side class to read values from the device and control its operation. VRPN is the preferred middleware solution because of its capabilities and robustness, which the other solutions lack. Historically, VRPN has also been used successfully with Panda3D.
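  • By way of illustration, a client that reads any vrpn_Tracker-style device might look like the following sketch, written against the Python bindings distributed with VRPN; the availability of those bindings and the server name "Tracker0@localhost" are assumptions:

      import vrpn  # Python bindings distributed with VRPN (availability assumed)

      def on_position(userdata, report):
          # Every vrpn_Tracker produces the same report format regardless of
          # the underlying hardware, per the abstraction described above.
          print("sensor", report["sensor"], "position", report["position"])

      tracker = vrpn.receiver.Tracker("Tracker0@localhost")   # example server name
      tracker.register_change_handler(None, on_position, "position")

      while True:
          tracker.mainloop()   # pump the connection and fire callbacks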
  • Returning to the game engines, Blender is another open source software package, maintained by the Blender Foundation, that focuses on digital modeling and animation. An integrated game engine called the BlenderGE uses Python™ as a scripting (programming) language. The main advantages of Blender are that the BlenderGE is included in the software and that, because Blender is used as a digital animation package, it has many features for working with armatures, e.g., a model of a human hand.
  • The Source Game Engine and the Hammer Level Editor (“Source and Hammer”) are manufactured by the Valve Corporation, a video game company. Source and Hammer is a leading character animation graphics engine whose main advantage is an advanced physics engine and multiple patient/user capabilities. Source and Hammer can be used for non-commercial purposes.
  • XNA™ is a product of Microsoft® Corporation of Redmond, Wash. XNA™ uses C# code and a shared library to facilitate the creation of games and simulations. The biggest advantages of XNA™ are that the community is very knowledgeable and that it is designed for making multi-user games. One caveat, however, is that XNA™ does not have a bundled game engine. Accordingly, using XNA™ software to create the MUVER would require much more programming than using one of the software packages associated with a game engine.
  • Finally, Flash® manufactured by Adobe Systems Incorporated of San Jose, Calif. is primarily used for Web sites and for Internet applications. The biggest advantage of using Flash® is that most applications can be accessed from any Web browser, which means that installation is not mandatory. Like XNA™, however, Flash® does not have a bundled game engine and requires more programming than other software solutions. The multi-user options available for Flash®, however, are comparable to XNA™ and Panda3D but better than what is available for Blender.
  • Testing Results of Concept
  • FIGS. 27A-27D show an exemplary virtual environment (VE) training “scene” (or exercise) for competitive virtual interaction between two patients/users. The exemplary scene is designed to allow patients/users to practice both an active grasp and a maintained grasp. An active grasp involves a pinching action using two to four fingers and the thumb, followed by a release action. A maintained grasp involves active supination, which is to say, a hand rotating further toward a palm-up position. These movements are deemed by experts to be essential to improved hand function in stroke patients.
  • In this VE training “scene”, a patient/user must move his/her hand and fingers, which are operationally coupled to the input device, first, to grasp or grip a virtual lid 67 (FIG. 27A) and then to lift the virtual lid 67 from a virtual pot 68 (FIG. 27B). Once the patient/user has successfully grasped and lifted the virtual lid 67, he/she supinates the virtual lid 67 to a palm-up position, all the while maintaining his/her grasp on the virtual lid 67 (FIG. 27C). In the final stage of the scene, the patient/user pronates the virtual lid 67 to a palm-down position; returns the virtual lid 67 to the virtual pot 68; and releases his/her grasp on the virtual lid 67 (FIG. 27D).
  • Preferably, the scene application is adapted so that a visual signal is generated upon successful completion of any or all of the movements or motor activities associated with the stages. For example, once the patient/user is satisfactorily able to grip the virtual lid 67 (FIG. 27A), the virtual lid 67 can change color, e.g., from grey to blue. Further, once the virtual lid 67 is lifted from the virtual pot 68 (FIG. 27B), the virtual lid 67 can change color again, e.g., from blue to green. Similarly, upon successful grasp plus supination and transport (FIG. 27C), the virtual lid 67 can change color again, e.g., from green to red. Advantageously, the supination threshold for success can be pre-set, e.g., at 45 degrees, and can be further adjusted to require a greater or lesser result. Finally, following pronation and transport (FIG. 27D), the virtual lid 67 can return to its original color, e.g., grey. Upon completing this cycle of stages, the trial is counted as a “success”.
  • If, however, the patient/user loses his/her grasp while raising, supinating, and/or pronating the virtual lid 67, the application can be adapted to cause the virtual lid 67 to return automatically to the original position on the virtual pot 68 and to return to its original color, e.g., grey.
  • The movement during each of the phases can be timed and counted separately. This provides more feedback to patients/users about their performance if the entire trial was not successful. The ability to count successful phases and trials and to record the time for each can be incorporated in the design for the scoring function described in greater detail below. Thus, each discrete data element can be displayed separately to provide feedback about performance either during the session or after. These discrete elements count the number of successes for each phase; count the number of successful trials, which is to say, trials in which all phases are completed successfully; record the time for each phase; and record the time for each trial. Time can also be displayed as a mean for a block of trials, with the number of trials in a block adjustable.
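  • The phase counting and timing described above amount to a small state machine advanced by each recognized movement. A minimal, hypothetical Python sketch follows; the phase names mirror the sub-phases of the pot-lid scene (grasp, turn, return), and the class and method names are assumptions:

      import time

      PHASES = ("grasp", "turn", "return")   # sub-phases of the pot-lid task

      class TrialScorer:
          """Counts successful phases and trials and records per-phase times."""

          def __init__(self):
              self.phase_index = 0
              self.phase_start = time.monotonic()
              self.phase_times = {}
              self.successful_trials = 0

          def phase_completed(self):
              now = time.monotonic()
              self.phase_times[PHASES[self.phase_index]] = now - self.phase_start
              self.phase_start = now
              self.phase_index += 1
              if self.phase_index == len(PHASES):   # all phases done: a successful trial
                  self.successful_trials += 1
                  self.phase_index = 0

          def grasp_lost(self):
              # Failed trial: the lid snaps back to the pot; restart from the top.
              self.phase_index = 0
              self.phase_start = time.monotonic()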
  • FIG. 28 summarizes the results of testing of six healthy subjects (four males, two females; five right-handed, one ambidextrous) who performed competitive interaction using the MUVER scene shown in FIGS. 27A-27D. Each subject donned and calibrated the SmartGlove™ and was instructed in the hand rehabilitation movement depicted in the scene. Each subject was provided with up to five minutes to practice and become accustomed to working in the virtual environment.
  • Subsequently, subjects were asked to complete ten movements as rapidly as possible. A short rest was given between each of the ten trial movements. The total time and the time for each phase were recorded for each trial and for each subject. The mean durations 4 and associated standard deviations 5 across blocks and subjects are summarized in FIG. 28 for each sub-phase of the movement task (grasp, turn, and return) and for the total task.
  • Start to completion of grasping the lid 67 averaged 1.6±0.7 sec. Grasp with turn and transport to right averaged 1.8±0.6 sec. Pronation, lid return, finger extension to release grasp averaged 2.1±0.9 sec. In summary, the mean time for the total task was 5.5±1.6 sec.
  • Experimenters and subjects noted that the mapping of real world to virtual environment movements was not completely accurate, which probably accounts for times that are somewhat slower than expected in healthy subjects. This inaccuracy, however, could be due to a hardware problem in the glove position detection and/or due to the software that maps position and orientation data to the virtual scene elements.
  • Referring to FIG. 29, a more complete MUVER System 90 is shown. As previously mentioned, the practitioner's interface 96 expands the data collection of the basic system in two major ways: providing real-time feedback and modifying the virtual environment. Indeed, a practitioner can be a spectator and watch the actions and interactions of the patients/users 91. The practitioner also has the ability to provide feedback, privately or openly, to one patient/user 91 or to multiple patients/users 91 at once. As another form of feedback, the practitioner can change or control the MUVER system 90 based on the actions and interactions of the patients/users 91. Modifying the MUVER 90 has several advantages, including changing the level of difficulty to better suit rehabilitation and directing the MUVER 90 to facilitate certain interactions between patients/users 91.
  • The more complete system 90 includes a Rapid Prototyping feature 85. Rapid Prototyping (RP) is a manufacturing method that allows custom objects to be created from an .STL file quickly and at low cost. With this feature, the MUVER 90 can be populated with models that can be exported as .STL files. Consequently, a practitioner can borrow an object from the real world; recreate it in the MUVER for virtual rehabilitation; and then create it using RP for use in actual rehabilitation exercises 86.
  • Object Examples
  • A first exemplary MUVER scene that can be presented virtually in accordance with the present invention is a simple, Single Degree of Freedom Knob (SK). In the real world, an SK is an Electro-Rheological Fluid (ERF) based device that can be manipulated by the patient/user in a single degree of freedom, e.g., clockwise or counterclockwise rotation about an axis, by rotating a variably-resistive knob. The MUVER system 90 allows a practitioner to place a patient/user in a virtual environment containing an SK device in the comfort and convenience of the patient/user's home. From a remote site, the practitioner is also able to customize the resistance level of the real world knobs for a particular patient/user.
  • A virtual knob 87 that might be shown in a patient/user's virtual environment is shown in FIG. 36. The MUVER design for the virtual knob 87 is meant for one or two patients/users. A first patient/user manipulates his/her hand and fingers in the real world sufficiently to cause the circle 88 to rotate about an axis in either direction in the virtual environment. If there is a second patient/user, he/she likewise manipulates his/her hand in the real world sufficiently to cause the rounded square 89 to rotate about the axis in the same direction in the virtual environment. If there is no second patient/user, the practitioner or, alternatively, a software program can control and manipulate the rate of rotation of the rounded square 89. Control of the square 89 by a non-patient/user can be performed using another knob, a rehabilitation device, a keyboard, a mouse, and the like.
  • The simplistic MUVER SK scene can be controlled to provide competitive, versus, or mixed virtual interactions. For the competitive interaction, the time it takes each patient/user to catch, i.e., to “tag”, the square 89 within a pre-established amount of time can be recorded and compared. Another possible competitive interaction is, instead, to record the number of “tags” each patient/user achieves during a pre-established period of time.
  • An exemplary versus interaction can include awarding points to the first, circle patient/user for every “tag” of the square 89 and awarding points to the rounded square patient/user for avoiding being tagged over a set increment of time. An exemplary mixed interaction can include two patients/users comparing their scores and times with another pair of patients/users or with their own historical scores.
  • Scene themes, such as the SK example above, can play a very important role in the MUVER design. The customizable knob lends the device to many possible real scenes that can be made into virtual ones. The availability of real life scenes in a virtual environment can be beneficial because a particular patient/user may have performed the task regularly in the real world and, hence, already understands the movements needed to successfully complete it.
  • Possible real life scenes to use in a MUVER for the SK can, for purposes of illustration and not limitation, include:
    • Opening a door knob
    • Unlocking a door
    • Turning a car's ignition
    • Opening a jar
    • Using a rotary phone
    • Manipulating a thermostat
    • Reeling in a fish
    • Screwing in a light bulb
    • Opening a bottle of wine
    • Turning a stove on and off
    • Juicing a lemon or orange
    • Stirring coffee or soup
  • Fictional scene themes that parallel events with which the patient/user is not familiar have a steeper learning curve because the patient/user may not know how to complete the movements successfully. For example, a non-fisherman may not know how to reel in a fish, or a teetotaler may not know how to turn a corkscrew to open a bottle of wine. Notwithstanding, fictional scene themes have the advantage of being made expressly for exercising a certain movement or knob design. The prototype MUVER proposed here is an example of a fictional theme, but the idea of tag is a common play mechanic with which patients/users may be familiar.
  • Referring to FIG. 37, a diagram of a virtual environment for a Single Degree of Freedom Hand Device (SHD) is shown. An SHD is a passive ERF device and, commonly, a research device meant for use in MRI machines.
  • The single degree of freedom is linear and has a pre-established, e.g., three-inch, stroke. As a result, the exercise routine the patient/user performs is simply grasp and release. Challenges include noise from the MRI machine, the simplicity of the device, and the use of mirrors to allow the patient/user to see the computer monitor displaying the MUVER graphics. The real world SHD has many unique design aspects, which, to the greatest extent possible, are transferred over to the MUVER virtual environment design.
  • The scene theme in FIG. 37 replicates the real world act of inflating a balloon 85 using a hand pump 86. This particular exemplary scene theme advantageously provides an objective that is simple and intuitive. Additionally, because the progressive inflation of the balloon presents the patient/user with feedback, no other feedback needs to be provided.
  • An advantage of this MUVER scene theme is that it can facilitate all four kinds of multiple patient/user interactions: competitive, cooperative, versus (counter-operative), and mixed. The competitive interaction would be conducted by timing each patient/user over a pre-established number of pumps that completely inflates the balloon 85. When multiple pumps 86 control the rate of inflation of the same balloon 85, a cooperative interaction can be accomplished. This type of interaction can be further explored by controlling the rhythm of pumps from each patient/user or by setting an order in which patients/users must pump to inflate the balloon 85.
  • By having the actions of one patient/user inflate the balloon 85 while the actions of a second patient/user concurrently deflate it, one can provide a counter-operative interaction. Finally, a mixed interaction using this MUVER design has the most depth and the greatest possibilities, e.g., by having multiple patients/users inflate the same balloon while evaluating the relative percentage that each inflated. The patients/users could also be required to follow an inflation order or rhythm and could be judged on their respective accuracy in following it.
  • As another example, an Active Hand Device (AHD) is a two degree of freedom, active ERF device. An exercise performed with the AHD combines the linear degree of freedom of the SHD described above with a supination/pronation rotation similar to that of the SK. Advantageously, the AHD is also an active device, which is to say that the AHD can provide variable resistance and push back against the patient/user, which can be used to enhance the experience. The major design considerations for such a device include having a customizable real world design so that the practitioner can choose to use both degrees of freedom or a single degree of freedom.
  • Referring to FIG. 38, a diagram of the prototype MUVER for an AHD is shown. The patient/user is represented as the circle 69. The patient/user manipulates the AHD to move the circle 69 around a track 29 that can be designed by the practitioner. The patient/user receives feedback in the form of accuracy in movement and also in the speed with which the circle 69 circumnavigates the track 29.
  • Adjustable read lines 28 represent the start and the end of the track 29. The locations of the read lines 28 can be modified depending on what the practitioner wants the patient/user to do for an exercise. The ellipse 27 surrounding the circle 69 is a force gradient, inside of which the patient/user tries to keep the circle 69 as it moves around the track 29.
  • For example, exercises can be made more or less difficult based on the hand's position relative to the shoulder, and movement tests can be designed to test extremity coordination. Thus, the position/orientation feature is a valuable component of the rehabilitation package. The system is designed to work with two gloves simultaneously, to allow patients/users 91 to use their capable hand to interact in exercises along with their disabled one.
  • This is not to say that real world movement must necessarily correspond one-to-one to virtual world movement, i.e., “direct mapping”. If the goal is rehabilitation, then whatever virtual “scene” best motivates a patient/user to expedite the rehabilitation process or to reach rehabilitation milestones is the better choice. Accordingly, “abstract mapping”, by which the real world movement produced by the patient/user differs from the movement of the virtual world object(s), is also possible with the invention as claimed. For example, if the desired patient/user movement is wrist flexion and extension, a “direct mapping” scene may depict a hand waving, while an “abstract mapping” scene may equate the flexion/extension to the movement of a cartoon animal.
  • Patient/User Scoring and Teacher Models
  • A patient/user's performance and other data are stored locally and, furthermore, can be transmitted to a third party, e.g., a clinical provider, via the network. Performance data, e.g., hand and finger kinematics, can be used to assess progress over time, the level of impairment, and so forth. “Scoring” connotes the reduction of performance data into a format that is easily usable to determine performance and progress. In particular, the scoring data is easily formatted into graphics, spreadsheets, summaries, and so forth.
  • For example, FIGS. 40A-D show a patient/user's hand being constrained in 45-degree and 90-degree orientations for measuring voltage as a function of joint bend of the MCP (FIGS. 40A and 40B) and the PIP (FIGS. 40C and 40D). Patient/user movement data for FIGS. 40A and 40B are displayed in a bar graph in FIG. 41A, which separates the data by finger, i.e., index finger, middle finger, and ring finger, and by the angle of constraint, i.e., 0, 45, and 90 degrees. Patient/user movement data for FIGS. 40C and 40D are displayed in a bar graph in FIG. 41B, which separates the data in the same manner. In both instances, one can determine at a glance from the bar graphs that each of the joints of the patient/user can be moved.
  • “Scoring” can be performed per “scene” or exercise, per phase within a “scene”, or can be compiled over multiple scenes. “Scoring” can measure, for example, a number of repetitions, a magnitude of motion, speed of performance, speed of performance of a phase of a scene, accuracy of movement, and so forth.
  • Optionally, the system can also include a “teacher model” capability, which provides the ability to record patient/user performance for analysis and later playback. During playback, patient/user errors can be highlighted and made known to the patient/user, and correct performance can be demonstrated. U.S. Pat. No. 5,554,033 discloses a virtual teacher and is incorporated herein by reference in its entirety.
  • Amplified Feedback and Adaptive Design
  • Another advantage of the present system is a library of “scenes” or exercises, each scene being used for a discrete purpose or functional goal and being adjustable. More specifically, each of the “scenes” can be programmed to adjust its degree of difficulty. Indeed, defining the values of a set of parameters, e.g., speed of motion, magnitude of motion, smoothness of motion, hand orientation during motion, and the like, enables medical personnel, physical therapists, and others to tailor scenes for a discrete patient/user. Hence, collectively, the “scene” database serves a variety of purposes and functional goals.
  • Another feature of the present system is amplified feedback, by which real world movement can be discretionally amplified prior to virtual mapping. For example, if a patient/user only bends his/her fingers ten degrees, the signal can be amplified by a factor of five so that the virtual movement shows a 50-degree flexure. In this manner, amplified feedback can be used as a carrot to encourage patients/users.
  • Those skilled in the art can appreciate that the simplistic and rudimentary scene themes described above are for illustrative purposes only. By mixing a real world scene theme with fictional scoring to facilitate multiple patient/user interactions, the MUVER design proves to be very robust.

Claims (41)

1. A multiple-user virtual environment system for rehabilitation exercise of mammalian trunk, extremities, and digits, the system comprising:
a communication network; and
a plurality of individual virtual environments, each of the individual virtual environments including:
at least one input device that is structured and arranged to generate at least one of movement, orientation, velocity, and position data corresponding to discrete movement of a portion of the mammalian trunk or of one or more mammalian extremities or digits,
a processing device that is adapted to receive the at least one movement, orientation, velocity and position data from the input device; to store said movement, orientation, velocity, and position data; and to generate image data therefrom for display on a display device, and
a virtual environment interface that is adapted to enable virtual environment communication and virtual environment data transfer between the processing device and the network.
2. The system as recited in claim 1 further comprising an interface and processing device that enable a third party to observe and to record data from the processing device.
3. The system as recited in claim 2, wherein the third-party processing device is structured and arranged to pre-establish or adapt at least one rehabilitation exercise for each of the plurality of virtual environments or to modify said exercise or virtual environment.
4. The system as recited in claim 3, wherein the third-party processing device is structured and arranged to pre-establish a type of multi-user interaction for each of the at least one rehabilitation exercise and for each of the plurality of virtual environments.
5. The system as recited in claim 4, wherein the type of multi-user interaction is selected from the group consisting of a competitive interaction, a counter-operative interaction, a cooperative interaction, and a mixed interaction.
6. The system as recited in claim 4, wherein the third-party processing device is structured and arranged to adjust a degree of difficulty for each of the at least one rehabilitation exercises.
7. The system as recited in claim 1, wherein the virtual environment interface includes at least one of:
game engine hardware having a programming code and a plurality of rules,
game engine software having a programming code and a plurality of rules,
a scripting programming code that is capable of overwriting low-level game engine programming code,
three-dimensional model software that is adapted to populate the engine programming code and to adhere to the plurality of rules, and
three-dimensional graphic software that is adapted to populate the engine programming code and to adhere to the plurality of rules.
8. The system as recited in claim 1, wherein the network is selected from the group consisting of a local area network, a wide area network, the World Wide Web, and the Internet.
9. The system as recited in claim 1, wherein the network is adapted to generate a virtual reality environment using the virtual environment data.
10. The system as recited in claim 1, wherein the input device is adapted to monitor a position or an attitude of an extremity or of a digit in space.
11. The system as recited in claim 1, wherein the discrete movement is selected from the group consisting of movement in an x-direction, movement in a y-direction, movement in a z-direction, pitch, roll, and yaw, wherein each of the x-direction, the y-direction, and the z-direction is mutually perpendicular.
12. The system as recited in claim 1 further comprising a base unit that, in combination with the at least one input device, is structured and arranged to provide a dead reckoning starting and a dead reckoning ending point-of-reference, to enable the processing device to determine at least one of attitude and velocity of said at least one input device.
13. The system as recited in claim 12, wherein the base unit is selected from the group comprising a banana grip base, a globe base, a pyramidal base or a tear-drop base.
14. The system as recited in claim 13, wherein the globe base includes imprinted grooves that define the dead reckoning starting and the dead reckoning ending point-of-reference.
15. The system as recited in claim 14, wherein the base unit includes adjustable arm splints for neutral wrist position and forearm support.
16. The system as recited in claim 1, wherein the at least one input device comprises a pair of gloves.
17. The system as recited in claim 1, wherein the processing device is adapted to map real world movement directly into similar or abstractly into different virtual world movement.
18. The system as recited in claim 1 further comprising a teacher model capability to highlight shortcomings and errors of the user and to demonstrate how to correct said shortcomings and errors.
19. An input device for use with a multiple-user virtual environment system for rehabilitation exercise of a human hand and digits, the input device being structured and arranged to generate signals corresponding to at least one of a discrete movement and an attitude of said hand and said digits, the device comprising:
a glove that can be readily donned and doffed on either hand by a user, the glove having finger portions for at least a thumb, an index finger, a middle finger, and a ring finger;
a first plurality of sensors, each sensor being structured and arranged to provide data on movement and range of movement of at least one of the index finger, the middle finger, and the ring finger, each of the first plurality of sensors being disposed within the finger portions of said index finger, said middle finger, and said ring finger;
a second plurality of sensors that is structured and arranged to provide data on movement of the thumb; and
a positioning and tracking system that is structured and arranged to generate position coordinates in three rotational axes and three translational axes to determine at least one of the attitude and a velocity of said hand.
20. The input device as recited in claim 19, wherein the first and the second pluralities of sensors are selected from the group comprising electronic bend sensors, resistive bend sensors, capacitive bend sensors, optical fiber sensors, mechanical measurement bend sensors, angle measurement sensors, Hall effect sensors or electromechanical sensors.
21. The input device as recited in claim 19, wherein the positioning and tracking system is selected from the group comprising an inertial measurement unit (IMU), a radio frequency (RF) positioning and tracking system, an infrared positioning and tracking system, three-dimensional cameras or a magnetic tracking system.
22. The input device as recited in claim 21, wherein the positioning and tracking system is an inertial measurement unit (IMU) that employs dead reckoning to determine the attitude of the hand and the velocity of the hand.
23. The input device as recited in claim 22, the input device further including a Hall effect sensor for zeroing the IMU.
24. The input device as recited in claim 19, wherein the device has a total weight that does not exceed sixteen ounces.
25. The input device as recited in claim 24, wherein a dorsal weight on the input device does not exceed eight ounces.
26. The input device as recited in claim 19, wherein the input device is structured and arranged to measure at least one of the following accurately:
finger flexion/extension measured to at least 90°;
wrist flexion/extension or dorsal action at ±90°;
wrist-radial deviation up to 40°;
wrist-ulnar deviation up to 50°; and
forearm supination/pronation up to 180°.
27. The input device as recited in claim 19 further comprising a feedback system that is selected from the group comprising an audible speaker to provide auditory clues, at least one light-emitting device to provide a visual signal, and a haptic device to provide a vibratory signal.
28. The input device as recited in claim 19 further comprising a touch sensor that is adapted to provide and record pinch data, the touch sensor being disposed in a tip portion of the glove thumb and being activated by contact with any of the index finger, the middle finger, the ring finger or a pinky finger.
29. The input device as recited in claim 19 further comprising a communication means for providing hand and finger movement data and attitude data to a processing unit.
30. The input device as recited in claim 19 further comprising a pulley portion to measure radial and ulnar deviations, the pulley portion being disposed above a user's elbow and being releasably attached to a medial side of the glove.
31. A method of providing a virtual environment system for rehabilitation exercise to a plurality of users over a communication network, the method comprising:
providing an individual virtual environment to each of the plurality of users, each of the individual virtual environments including an input device, a processing device, and a virtual environment interface;
generating at least one of movement, orientation, velocity, and position data signals corresponding to discrete movement of one or more mammalian trunk, extremities or digits disposed in the input device;
receiving the data signals from the input device;
generating image data for display on a display device and other data from said input data; and
enabling virtual environment communication and virtual environment data transfer between the processing device and the network.
32. The method as recited in claim 31 further comprising enabling a third party to observe and to record image and other data from the processing device.
33. The method as recited in claim 31 further comprising pre-establishing at least one rehabilitation exercise for each of the plurality of virtual environments.
34. The method as recited in claim 31 further comprising pre-establishing a type of multi-user interaction for each of the at least one rehabilitation exercise and for each of the plurality of virtual environments.
35. The method as recited in claim 34, wherein the type of multi-user interaction is selected from the group consisting of a competitive interaction, a counter-operative interaction, a cooperative interaction, and a mixed interaction.
36. The method as recited in claim 31, wherein the discrete movement is selected from the group consisting of movement in an x-direction, movement in a y-direction, movement in a z-direction, pitch, roll, and yaw, wherein each of the x-direction, the y-direction, and the z-direction is mutually perpendicular.
37. The method as recited in claim 31 further comprising generating a virtual reality environment using the virtual environment data.
38. The method as recited in claim 31 further comprising monitoring a position of an extremity or of a digit in space.
39. The method as recited in claim 31 further comprising:
scoring user performance using said input data; and
displaying scoring results after each exercise, daily, weekly, after each session, after each phase of an exercise and/or immediately.
40. The method as recited in claim 31 further comprising amplifying the data signals corresponding to the at least one of movement, orientation, velocity, and position before displaying image data generated therefrom.
41. The method as recited in claim 34 further comprising adjusting a degree of difficulty of said at least one rehabilitation exercise.
US13/145,436 2009-01-20 2010-01-20 Multi-user smartglove for virtual environment-based rehabilitation Abandoned US20120157263A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/145,436 US20120157263A1 (en) 2009-01-20 2010-01-20 Multi-user smartglove for virtual environment-based rehabilitation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US14582509P 2009-01-20 2009-01-20
US26654309P 2009-12-04 2009-12-04
PCT/US2010/021483 WO2010085476A1 (en) 2009-01-20 2010-01-20 Multi-user smartglove for virtual environment-based rehabilitation
US13/145,436 US20120157263A1 (en) 2009-01-20 2010-01-20 Multi-user smartglove for virtual environment-based rehabilitation

Publications (1)

Publication Number Publication Date
US20120157263A1 true US20120157263A1 (en) 2012-06-21

Family

ID=42356175

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/145,436 Abandoned US20120157263A1 (en) 2009-01-20 2010-01-20 Multi-user smartglove for virtual environment-based rehabilitation

Country Status (3)

Country Link
US (1) US20120157263A1 (en)
EP (1) EP2389152A4 (en)
WO (1) WO2010085476A1 (en)

Cited By (108)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100262047A1 (en) * 2009-04-08 2010-10-14 Drexel University Physical therapy systems and methods
US20120088983A1 (en) * 2010-10-07 2012-04-12 Samsung Electronics Co., Ltd. Implantable medical device and method of controlling the same
US20130036529A1 (en) * 2011-08-08 2013-02-14 Ford Global Technologies, Llc Glove having conductive ink and method of interacting with proximity sensor
US20130060166A1 (en) * 2011-09-01 2013-03-07 The Regents Of The University Of California Device and method for providing hand rehabilitation and assessment of hand function
US20130072836A1 (en) * 2010-04-06 2013-03-21 I2R Medical Limited Therapeutic hand exercise device
US20130143718A1 (en) * 2013-01-30 2013-06-06 Universita Degli Studi Di Cagliari Apparatus, a system and a relating method for local or remote rehabilitation and functional evaluation of the hands
US20130158946A1 (en) * 2010-08-13 2013-06-20 Hansjörg Scherberger Modelling of hand and arm position and orientation
US20130190093A1 (en) * 2010-05-04 2013-07-25 System And Method For Tracking And Mapping An Object To A Target Timoco Ltd. System and method for tracking and mapping an object to a target
US20130303951A1 (en) * 2012-05-11 2013-11-14 University Of Tennessee Research Foundation Portable Hand Rehabilitation Device
US20140121018A1 (en) * 2012-09-21 2014-05-01 Grigore Cristian Burdea Bimanual integrative virtual rehabilitation systems and methods
US20140172166A1 (en) * 2012-08-30 2014-06-19 Snu R&Db Foundation Treatment device for hemiplegia
US20140278830A1 (en) * 2013-03-15 2014-09-18 U.S. Physical Therapy, Inc. Method for injury prevention and job-specific rehabilitation
WO2014176353A1 (en) * 2013-04-24 2014-10-30 Tl Technologies Llc Rehabilitation monitoring device
US20150017623A1 (en) * 2012-01-04 2015-01-15 Gabriele Ceruti Method and apparatus for neuromotor rehabilitation using interactive setting systems
US20150133206A1 (en) * 2012-04-30 2015-05-14 The Regents Of The University Of California Method and apparatus for mobile rehabilitation exergaming
US20150196800A1 (en) * 2014-01-13 2015-07-16 Vincent James Macri Apparatus, method and system for pre-action therapy
KR101541082B1 (en) 2015-01-23 2015-08-03 주식회사 네오펙트 System and method for rehabilitation exercise of the hands
US20150287338A1 (en) * 2013-08-26 2015-10-08 John Andrew Wells Biometric data gathering
WO2015161194A1 (en) * 2014-04-17 2015-10-22 Flint Rehabilitation Devices, Llc. Systems and methods for rehabilitating the hand
WO2015167578A1 (en) * 2014-04-30 2015-11-05 Umm Al-Qura University Tactile feedback gloves
US20150358543A1 (en) * 2014-06-05 2015-12-10 Ali Kord Modular motion capture system
US20150357948A1 (en) * 2014-06-05 2015-12-10 Kevin W. Goldstein Hand Worn Wireless Remote Controller For Motors
US20160011013A1 (en) * 2014-07-11 2016-01-14 Sixense Entertainment, Inc. Method And Apparatus For Synchronizing a Transmitter and Receiver in a Magnetic Tracking System
US20160023046A1 (en) * 2013-03-14 2016-01-28 Jintronix, Inc. Method and system for analysing a virtual rehabilitation activity/exercise
US20160048205A1 (en) * 2014-08-13 2016-02-18 Iron Will Innovations Canada Inc. Sensor Proximity Glove for Control of Electronic Devices
US20160054797A1 (en) * 2014-08-22 2016-02-25 Sony Computer Entertainment Inc. Thumb Controller
US9278255B2 (en) 2012-12-09 2016-03-08 Arris Enterprises, Inc. System and method for activity recognition
US20160067136A1 (en) * 2013-05-16 2016-03-10 New York University Game-based sensorimotor rehabilitator
US20160147303A1 (en) * 2014-10-27 2016-05-26 Cherif Atia Algreatly Nanotechnology Clothing For Human-Computer Interaction
US20160161301A1 (en) * 2014-10-11 2016-06-09 Workaround Ug (Haftungsbeschraenkt) Workwear Unit, Bracelet, Connecting Piece, Glove, Sensor Module and Method of Detecting, Documenting, Analyzing, Monitoring and/or Teaching Processes
US20160193101A1 (en) * 2015-01-05 2016-07-07 National Tsing Hua University Rehabilitation system with stiffness measurement
US20160202755A1 (en) * 2013-09-17 2016-07-14 Medibotics Llc Sensor Array Spanning Multiple Radial Quadrants to Measure Body Joint Movement
US20160224112A1 (en) * 2013-10-11 2016-08-04 Fujitsu Limited Input apparatus
US20160246370A1 (en) * 2015-02-20 2016-08-25 Sony Computer Entertainment Inc. Magnetic tracking of glove fingertips with peripheral devices
US20160271498A1 (en) * 2015-03-20 2016-09-22 Miles Queller Lifton System and method for modifying human behavior through use of gaming applications
CN105999652A (en) * 2016-04-26 2016-10-12 珠海云智医疗科技有限公司 Brain injury rehabilitation training device based on pinching of double finger pulps
EP3098691A1 (en) * 2015-05-29 2016-11-30 Manus Machinae B.V. Flex sensor and instrumented glove
US9529433B2 (en) * 2014-12-30 2016-12-27 Stmicroelectronics Pte Ltd Flexible smart glove
EP2996551A4 (en) * 2013-05-16 2017-01-25 New York University Game-based sensorimotor rehabilitator
USD778531S1 (en) 2015-10-02 2017-02-14 Milwaukee Electric Tool Corporation Glove
WO2017039553A1 (en) * 2015-09-01 2017-03-09 AKSU YLDIRIM, Sibel A personalized rehabilitation system
USD787515S1 (en) * 2015-08-24 2017-05-23 Flint Rehabilitation Devices, LLC Hand-worn user interface device
US20170185142A1 (en) * 2015-12-25 2017-06-29 Le Holdings (Beijing) Co., Ltd. Method, system and smart glove for obtaining immersion in virtual reality system
US20170199560A1 (en) * 2010-07-20 2017-07-13 Empire Technology Development Llc Augmented reality proximity sensing
USD794901S1 (en) 2015-12-10 2017-08-22 Milwaukee Electric Tool Corporation Glove
US9821207B2 (en) * 2016-03-08 2017-11-21 Jess Jewett Golf training apparatus
US20170361217A1 (en) * 2012-09-21 2017-12-21 Bright Cloud International Corp. Bimanual integrative virtual rehabilitation system and methods
US20180005443A1 (en) * 2016-06-30 2018-01-04 Adam Gabriel Poulos Interaction with virtual objects based on determined restrictions
US20180015345A1 (en) * 2015-02-02 2018-01-18 Gn Ip Pty Ltd Frameworks, devices and methodologies configured to enable delivery of interactive skills training content, including content with multiple selectable expert knowledge variations
RU177032U1 (en) * 2017-05-10 2018-02-06 Александр Владимирович Елизаров SIMULATOR FOR FINGERS OF THE HAND
US20180036620A1 (en) * 2015-02-25 2018-02-08 Jabii Group Aps Boxing device for performing a harmless boxing match, method and uses thereof
USD812844S1 (en) 2016-01-20 2018-03-20 Milwaukee Electric Tool Corporation Glove
USD812845S1 (en) 2016-01-20 2018-03-20 Milwaukee Electric Tool Corporation Glove
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US9987554B2 (en) 2014-03-14 2018-06-05 Sony Interactive Entertainment Inc. Gaming device with volumetric sensing
US20180307507A1 (en) * 2015-06-12 2018-10-25 Spheredyne Co., Ltd. Input device and ui configuration and execution method thereof
US10137362B2 (en) 2016-05-04 2018-11-27 Thomas F Buchanan, IV Exo-tendon motion capture glove device with haptic grip response
US10212986B2 (en) 2012-12-09 2019-02-26 Arris Enterprises Llc System, apparel, and method for identifying performance of workout routines
US10220255B2 (en) * 2017-04-12 2019-03-05 Russell Copelan Wearable measuring apparatus for sports training
US10220303B1 (en) * 2013-03-15 2019-03-05 Harmonix Music Systems, Inc. Gesture-based music game
US20190087020A1 (en) * 2016-10-04 2019-03-21 Hewlett-Packard Development Company, L.P. Three-dimensional input device
US10254833B2 (en) 2015-02-20 2019-04-09 Sony Interactive Entertainment Inc. Magnetic tracking of glove interface object
US10304325B2 (en) 2013-03-13 2019-05-28 Arris Enterprises Llc Context health determination system
US20190160338A1 (en) * 2017-11-29 2019-05-30 Seiko Epson Corporation Training assistance device and non-transitory computer-readable storage medium
US10317997B2 (en) * 2016-03-11 2019-06-11 Sony Interactive Entertainment Inc. Selection of optimally positioned sensors in a glove interface object
US20190209086A1 (en) * 2018-01-05 2019-07-11 Rehabotics Medical Technology Corp. Fixed-sensor finger action detecting glove
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US10372213B2 (en) * 2016-09-20 2019-08-06 Facebook Technologies, Llc Composite ribbon in a virtual reality device
CN110275537A (en) * 2019-06-27 2019-09-24 中国电子科技集团公司信息科学研究院 Motion profile cooperative control method and its device, computer readable storage medium
US20190370534A1 (en) * 2013-09-17 2019-12-05 Medibotics Llc Motion Recognition Clothing with Flexible Optical Sensors
US20200016363A1 (en) * 2012-06-27 2020-01-16 Vincent John Macri Digital virtual limb and body interaction
US20200054940A1 (en) * 2017-02-14 2020-02-20 Sony Interactive Entertainment Europe Limited Sensing Apparatus And Method
WO2020053332A1 (en) * 2018-09-14 2020-03-19 Deutsches Zentrum für Luft- und Raumfahrt e.V. Device and method for calibrating at least one input device, and program product
US10599217B1 (en) * 2016-09-26 2020-03-24 Facebook Technologies, Llc Kinematic model for hand position
US10632366B2 (en) 2012-06-27 2020-04-28 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
US10694990B2 (en) 2012-09-21 2020-06-30 Bright Cloud International Corporation Bimanual computer games system for dementia screening
US10736544B2 (en) * 2015-09-09 2020-08-11 The Regents Of The University Of California Systems and methods for facilitating rehabilitation therapy
US10845876B2 (en) * 2017-09-27 2020-11-24 Contact Control Interfaces, LLC Hand interface device utilizing haptic force gradient generation via the alignment of fingertip haptic units
US10856778B2 (en) * 2015-04-14 2020-12-08 Inesc Tec—Instituto De Engenharia De Sistemas E Wrist rigidity assessment device for use in deep brain stimulation surgery
US10894204B2 (en) * 2016-05-04 2021-01-19 Contact Control Interfaces, LLC Exo-tendon motion capture glove device with haptic grip response
US10901506B2 (en) 2018-07-30 2021-01-26 Htc Corporation Finger-gesture detection device for control handle use in virtual reality, control assembly having the same and correction method for virtual reality system
CN112423718A (en) * 2018-07-18 2021-02-26 皇家飞利浦有限公司 Rehabilitation device and method for monitoring hand movement
US10942968B2 (en) 2015-05-08 2021-03-09 Rlt Ip Ltd Frameworks, devices and methodologies configured to enable automated categorisation and/or searching of media data based on user performance attributes derived from performance sensor units
US10950336B2 (en) 2013-05-17 2021-03-16 Vincent J. Macri System and method for pre-action training and control
CN112543978A (en) * 2019-06-17 2021-03-23 森桑姆德有限公司 Software and hardware system for rehabilitation of patients with cognitive impairment after upper limb stroke
US10993489B2 (en) 2015-02-18 2021-05-04 Milwaukee Electric Tool Corporation Glove
US11029757B1 (en) 2015-09-24 2021-06-08 Facebook Technologies, Llc Detecting positions of magnetic flux sensors having particular locations on a device relative to a magnetic field generator located at a predetermined position on the device
US11042219B2 (en) * 2017-11-16 2021-06-22 Zhaosheng Chen Smart wearable apparatus, smart wearable equipment and control method of smart wearable equipment
CN113101134A (en) * 2021-04-02 2021-07-13 上海交通大学医学院附属新华医院 Children lower limb movement auxiliary rehabilitation system based on power exoskeleton
US11074826B2 (en) 2015-12-10 2021-07-27 Rlt Ip Ltd Frameworks and methodologies configured to enable real-time adaptive delivery of skills training data based on monitoring of user performance via performance monitoring hardware
US11094418B2 (en) * 2015-12-31 2021-08-17 Nokia Technologies Oy Optimized biological measurement
US11136234B2 (en) 2007-08-15 2021-10-05 Bright Cloud International Corporation Rehabilitation systems and methods
US20210362004A1 (en) * 2019-07-31 2021-11-25 Medivr, Inc. Rehabilitation support apparatus and rehabilitation supporting method
US11369866B2 (en) * 2017-12-21 2022-06-28 Sony Interactive Entertainment Inc. Position tracking apparatus and method
WO2022156094A1 (en) * 2021-01-22 2022-07-28 上海司羿智能科技有限公司 Hand motion detection device and control method therefor, rehabilitation device and autonomous control system
WO2022157648A1 (en) * 2021-01-20 2022-07-28 Neurolutions, Inc. Systems and methods for remote motor assessment
USD960378S1 (en) * 2020-02-01 2022-08-09 Pankajkumar K Chhatrala Orthopedic thumb splint
US20220249947A1 (en) * 2021-02-10 2022-08-11 StrikerVR Inc. Simulation systems and methods including peripheral devices providing haptic feedback
US11458382B2 (en) * 2018-11-14 2022-10-04 South China University Of Technology Immersive upper limb rehabilitation training system
WO2022235283A1 (en) * 2021-05-07 2022-11-10 Command Gloves Llc Glove comprising scannable code and related system and method for use
US11517788B2 (en) * 2016-06-06 2022-12-06 Maxell, Ltd. Finger exercise training menu generating system, method thereof, and program thereof
US11534358B2 (en) * 2019-10-11 2022-12-27 Neurolutions, Inc. Orthosis systems and rehabilitation of impaired body parts
GB2612988A (en) * 2021-11-18 2023-05-24 Lusio Tech Pty Limited Systems and methods for users with impaired movement
US11673042B2 (en) 2012-06-27 2023-06-13 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
EP3979966A4 (en) * 2019-06-07 2023-07-05 Osind Medi Tech Private Limited A self driven rehabilitation device and method thereof
US20230285836A1 (en) * 2020-06-24 2023-09-14 Nippon Telegraph And Telephone Corporation Position sense correction device, method, and program
US11804148B2 (en) 2012-06-27 2023-10-31 Vincent John Macri Methods and apparatuses for pre-action gaming
US11899838B2 (en) 2017-07-19 2024-02-13 Plexus Immersive Corp Hand worn interface device integreating electronic sensors

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120059290A1 (en) * 2010-09-02 2012-03-08 Yip Joanne Yiu Wan Wearable device for finger rehabilitation
DE102010060592A1 (en) 2010-11-16 2012-05-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Training system, mobile terminal and training method for one person
WO2012165882A2 (en) * 2011-05-31 2012-12-06 주식회사 네오펙트 Apparatus for rehabilitation exercise, wearable communication apparatus, and application system and method for applying same
KR101367801B1 (en) * 2011-05-31 2014-02-27 주식회사 네오펙트 Finger exercising apparatus and method for assisting exercise of finger
US20130113704A1 (en) * 2011-11-04 2013-05-09 The Regents Of The University Of California Data fusion and mutual calibration for a sensor network and a vision system
WO2013086023A1 (en) * 2011-12-05 2013-06-13 Northeastern University Customized, mechanically-assistive rehabilitation apparatus and method for distal extremities of the upper and lower regions
US20140371633A1 (en) * 2011-12-15 2014-12-18 Jintronix, Inc. Method and system for evaluating a patient during a rehabilitation exercise
US10004660B2 (en) 2012-11-23 2018-06-26 Flinders University Of South Australia Method of therapy and haptic gaming system for sensory agnosia
WO2017151142A1 (en) * 2016-03-04 2017-09-08 Hewlett-Packard Development Company, L.P. Generating digital representations using a glove interface
CN105717900B (en) * 2016-04-26 2018-09-14 华南理工大学 Intelligent housing gloves and its home control, self-defined control gesture method
RU2670649C9 (en) * 2017-10-27 2018-12-11 Федоров Александр Владимирович Method of manufacturing virtual reality gloves (options)
RU2685005C1 (en) * 2017-11-07 2019-04-16 Публичное акционерное общество "Газпром нефть" Method and computer system for designing location of cluster sites in deposits
EP3626221A1 (en) * 2018-09-20 2020-03-25 Koninklijke Philips N.V. A rehabilitation device and a method of monitoring hand movement
CN110176162B (en) * 2019-06-25 2021-09-21 范平 Wearable system and teaching method applied to wearable system
CN110264798B (en) * 2019-06-25 2021-09-07 范平 Wearable system and teaching method applied to wearable system
US11696704B1 (en) 2020-08-31 2023-07-11 Barron Associates, Inc. System, device and method for tracking the human hand for upper extremity therapy

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7479967B2 (en) * 2005-04-11 2009-01-20 Systems Technology Inc. System for combining virtual and real-time environments

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4986280A (en) * 1988-07-20 1991-01-22 Arthur D. Little, Inc. Hand position/measurement control system
US20020120362A1 (en) * 2001-02-27 2002-08-29 Corinna E. Lathan Robotic apparatus and wireless communication system
US20060022833A1 (en) * 2004-07-29 2006-02-02 Kevin Ferguson Human movement measurement system
US7292151B2 (en) * 2004-07-29 2007-11-06 Kevin Ferguson Human movement measurement system
US20060063645A1 (en) * 2004-09-17 2006-03-23 Yin-Liang Lai Multifunctional virtual-reality fitness equipment with a detachable interactive manipulator
US20070060445A1 (en) * 2005-08-31 2007-03-15 David Reinkensmeyer Method and apparatus for automating arm and grasping movement training for rehabilitation of patients with motor impairment
US20080281633A1 (en) * 2007-05-10 2008-11-13 Grigore Burdea Periodic evaluation and telerehabilitation systems and methods
US20090319058A1 (en) * 2008-06-20 2009-12-24 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
US20100234182A1 (en) * 2009-01-15 2010-09-16 Saebo, Inc. Neurological device

Cited By (161)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11136234B2 (en) 2007-08-15 2021-10-05 Bright Cloud International Corporation Rehabilitation systems and methods
US20100262047A1 (en) * 2009-04-08 2010-10-14 Drexel University Physical therapy systems and methods
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10421013B2 (en) 2009-10-27 2019-09-24 Harmonix Music Systems, Inc. Gesture-based user interface
US9545356B2 (en) * 2010-04-06 2017-01-17 I2R Medical Limited Therapeutic hand exercise device
US20130072836A1 (en) * 2010-04-06 2013-03-21 I2R Medical Limited Therapeutic hand exercise device
US20130190093A1 (en) * 2010-05-04 2013-07-25 System And Method For Tracking And Mapping An Object To A Target Timoco Ltd. System and method for tracking and mapping an object to a target
US9110557B2 (en) * 2010-05-04 2015-08-18 Timocco Ltd. System and method for tracking and mapping an object to a target
US20170199560A1 (en) * 2010-07-20 2017-07-13 Empire Technology Development Llc Augmented reality proximity sensing
US10437309B2 (en) * 2010-07-20 2019-10-08 Empire Technology Development Llc Augmented reality proximity sensing
US20130158946A1 (en) * 2010-08-13 2013-06-20 Hansjörg Scherberger Modelling of hand and arm position and orientation
US20120088983A1 (en) * 2010-10-07 2012-04-12 Samsung Electronics Co., Ltd. Implantable medical device and method of controlling the same
US10595574B2 (en) 2011-08-08 2020-03-24 Ford Global Technologies, Llc Method of interacting with proximity sensor with a glove
US20130036529A1 (en) * 2011-08-08 2013-02-14 Ford Global Technologies, Llc Glove having conductive ink and method of interacting with proximity sensor
US10004286B2 (en) * 2011-08-08 2018-06-26 Ford Global Technologies, Llc Glove having conductive ink and method of interacting with proximity sensor
US20130060166A1 (en) * 2011-09-01 2013-03-07 The Regents Of The University Of California Device and method for providing hand rehabilitation and assessment of hand function
US20150017623A1 (en) * 2012-01-04 2015-01-15 Gabriele Ceruti Method and apparatus for neuromotor rehabilitation using interactive setting systems
US20150133206A1 (en) * 2012-04-30 2015-05-14 The Regents Of The University Of California Method and apparatus for mobile rehabilitation exergaming
US9326909B2 (en) * 2012-05-11 2016-05-03 University Of Tennessee Research Foundation Portable hand rehabilitation device
US20130303951A1 (en) * 2012-05-11 2013-11-14 University Of Tennessee Research Foundation Portable Hand Rehabilitation Device
US11331565B2 (en) 2012-06-27 2022-05-17 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
US11673042B2 (en) 2012-06-27 2023-06-13 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
US20200016363A1 (en) * 2012-06-27 2020-01-16 Vincent John Macri Digital virtual limb and body interaction
US11904101B2 (en) * 2012-06-27 2024-02-20 Vincent John Macri Digital virtual limb and body interaction
US11804148B2 (en) 2012-06-27 2023-10-31 Vincent John Macri Methods and apparatuses for pre-action gaming
US10632366B2 (en) 2012-06-27 2020-04-28 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
US20140172166A1 (en) * 2012-08-30 2014-06-19 Snu R&Db Foundation Treatment device for hemiplegia
US20140121018A1 (en) * 2012-09-21 2014-05-01 Grigore Cristian Burdea Bimanual integrative virtual rehabilitation systems and methods
US20170361217A1 (en) * 2012-09-21 2017-12-21 Bright Cloud International Corp. Bimanual integrative virtual rehabilitation system and methods
US9724598B2 (en) * 2012-09-21 2017-08-08 Bright Cloud International Corp. Bimanual integrative virtual rehabilitation systems and methods
US10722784B2 (en) * 2012-09-21 2020-07-28 Bright Cloud International Corporation Bimanual integrative virtual rehabilitation system and methods
US10694990B2 (en) 2012-09-21 2020-06-30 Bright Cloud International Corporation Bimanual computer games system for dementia screening
US10212986B2 (en) 2012-12-09 2019-02-26 Arris Enterprises Llc System, apparel, and method for identifying performance of workout routines
US9278255B2 (en) 2012-12-09 2016-03-08 Arris Enterprises, Inc. System and method for activity recognition
US9089734B2 (en) * 2013-01-30 2015-07-28 Universita Degli Studi Di Cagliari Apparatus, a system and a relating method for local or remote rehabilitation and functional evaluation of the hands
US20130143718A1 (en) * 2013-01-30 2013-06-06 Universita Degli Studi Di Cagliari Apparatus, a system and a relating method for local or remote rehabilitation and functional evaluation of the hands
US10304325B2 (en) 2013-03-13 2019-05-28 Arris Enterprises Llc Context health determination system
US20160023046A1 (en) * 2013-03-14 2016-01-28 Jintronix, Inc. Method and system for analysing a virtual rehabilitation activity/exercise
US10220303B1 (en) * 2013-03-15 2019-03-05 Harmonix Music Systems, Inc. Gesture-based music game
US20140278830A1 (en) * 2013-03-15 2014-09-18 U.S. Physical Therapy, Inc. Method for injury prevention and job-specific rehabilitation
WO2014176353A1 (en) * 2013-04-24 2014-10-30 Tl Technologies Llc Rehabilitation monitoring device
US20160067136A1 (en) * 2013-05-16 2016-03-10 New York University Game-based sensorimotor rehabilitator
EP2996551A4 (en) * 2013-05-16 2017-01-25 New York University Game-based sensorimotor rehabilitator
US10299738B2 (en) * 2013-05-16 2019-05-28 New York University Game-based sensorimotor rehabilitator
US11682480B2 (en) 2013-05-17 2023-06-20 Vincent J. Macri System and method for pre-action training and control
US10950336B2 (en) 2013-05-17 2021-03-16 Vincent J. Macri System and method for pre-action training and control
US10417932B2 (en) 2013-08-26 2019-09-17 J. A. Wells and Associates L.L.C Biometric data gathering
US20150287338A1 (en) * 2013-08-26 2015-10-08 John Andrew Wells Biometric data gathering
US9704412B2 (en) * 2013-08-26 2017-07-11 John Andrew Wells Biometric data gathering
US10234934B2 (en) * 2013-09-17 2019-03-19 Medibotics Llc Sensor array spanning multiple radial quadrants to measure body joint movement
US20160202755A1 (en) * 2013-09-17 2016-07-14 Medibotics Llc Sensor Array Spanning Multiple Radial Quadrants to Measure Body Joint Movement
US10839202B2 (en) * 2013-09-17 2020-11-17 Medibotics Motion recognition clothing with flexible optical sensors
US20190370534A1 (en) * 2013-09-17 2019-12-05 Medibotics Llc Motion Recognition Clothing with Flexible Optical Sensors
US10108263B2 (en) * 2013-10-11 2018-10-23 Fujitsu Limited Input apparatus including input cancelling circuit
US20160224112A1 (en) * 2013-10-11 2016-08-04 Fujitsu Limited Input apparatus
US20150196800A1 (en) * 2014-01-13 2015-07-16 Vincent James Macri Apparatus, method and system for pre-action therapy
US11944446B2 (en) 2014-01-13 2024-04-02 Vincent John Macri Apparatus, method, and system for pre-action therapy
US10111603B2 (en) * 2014-01-13 2018-10-30 Vincent James Macri Apparatus, method and system for pre-action therapy
US11116441B2 (en) 2014-01-13 2021-09-14 Vincent John Macri Apparatus, method, and system for pre-action therapy
US9987554B2 (en) 2014-03-14 2018-06-05 Sony Interactive Entertainment Inc. Gaming device with volumetric sensing
WO2015161194A1 (en) * 2014-04-17 2015-10-22 Flint Rehabilitation Devices, Llc. Systems and methods for rehabilitating the hand
US9468847B2 (en) * 2014-04-30 2016-10-18 Umm Al-Qura University Tactile feedback gloves
WO2015167578A1 (en) * 2014-04-30 2015-11-05 Umm Al-Qura University Tactile feedback gloves
US20150314195A1 (en) * 2014-04-30 2015-11-05 Umm Al-Qura University Tactile feedback gloves
US20150358543A1 (en) * 2014-06-05 2015-12-10 Ali Kord Modular motion capture system
US20150357948A1 (en) * 2014-06-05 2015-12-10 Kevin W. Goldstein Hand Worn Wireless Remote Controller For Motors
US20160011013A1 (en) * 2014-07-11 2016-01-14 Sixense Entertainment, Inc. Method And Apparatus For Synchronizing a Transmitter and Receiver in a Magnetic Tracking System
US10234306B2 (en) * 2014-07-11 2019-03-19 Sixense Enterprises Inc. Method and apparatus for synchronizing a transmitter and receiver in a magnetic tracking system
US20160048205A1 (en) * 2014-08-13 2016-02-18 Iron Will Innovations Canada Inc. Sensor Proximity Glove for Control of Electronic Devices
WO2016024166A1 (en) * 2014-08-13 2016-02-18 Iron Will Innovations Canada Inc. Sensor proximity glove for control of electronic devices
US10055018B2 (en) * 2014-08-22 2018-08-21 Sony Interactive Entertainment Inc. Glove interface object with thumb-index controller
US20160054797A1 (en) * 2014-08-22 2016-02-25 Sony Computer Entertainment Inc. Thumb Controller
US11470895B2 (en) 2014-10-11 2022-10-18 Workaround Gmbh Workwear unit having a glove that fastens a control system and functional module to a user's body
US10537143B2 (en) * 2014-10-11 2020-01-21 Workaround Gmbh Workwear unit having a glove that fastens a control system and functional module to a user's body
US20160161301A1 (en) * 2014-10-11 2016-06-09 Workaround Ug (Haftungsbeschraenkt) Workwear Unit, Bracelet, Connecting Piece, Glove, Sensor Module and Method of Detecting, Documenting, Analyzing, Monitoring and/or Teaching Processes
US9727138B2 (en) * 2014-10-27 2017-08-08 Cherif Algreatly Nanotechnology clothing for human-computer interaction
US20160147303A1 (en) * 2014-10-27 2016-05-26 Cherif Atia Algreatly Nanotechnology Clothing For Human-Computer Interaction
US9529433B2 (en) * 2014-12-30 2016-12-27 Stmicroelectronics Pte Ltd Flexible smart glove
US20160193101A1 (en) * 2015-01-05 2016-07-07 National Tsing Hua University Rehabilitation system with stiffness measurement
US10413431B2 (en) * 2015-01-05 2019-09-17 National Tsing Hua University Rehabilitation system with stiffness measurement
US10143403B2 (en) 2015-01-23 2018-12-04 Neofect Co., Ltd. System and method for rehabilitation exercise of the hands
WO2016117758A1 (en) * 2015-01-23 2016-07-28 주식회사 네오펙트 Hand rehabilitation exercise system and method
KR101541082B1 (en) 2015-01-23 2015-08-03 주식회사 네오펙트 System and method for rehabilitation exercise of the hands
US10758158B2 (en) 2015-01-23 2020-09-01 Neofect Co., Ltd. System and method for rehabilitation exercise of the hands
US20180021647A1 (en) * 2015-02-02 2018-01-25 Gn Ip Pty Ltd Frameworks, devices and methodologies configured to provide interactive skills training content, including delivery of adaptive training programs based on analysis of performance sensor data
US10806982B2 (en) * 2015-02-02 2020-10-20 Rlt Ip Ltd Frameworks, devices and methodologies configured to provide interactive skills training content, including delivery of adaptive training programs based on analysis of performance sensor data
US10918924B2 (en) * 2015-02-02 2021-02-16 RLT IP Ltd. Frameworks, devices and methodologies configured to enable delivery of interactive skills training content, including content with multiple selectable expert knowledge variations
US20180015345A1 (en) * 2015-02-02 2018-01-18 Gn Ip Pty Ltd Frameworks, devices and methodologies configured to enable delivery of interactive skills training content, including content with multiple selectable expert knowledge variations
US10993489B2 (en) 2015-02-18 2021-05-04 Milwaukee Electric Tool Corporation Glove
US9665174B2 (en) * 2015-02-20 2017-05-30 Sony Interactive Entertainment Inc. Magnetic tracking of glove fingertips with peripheral devices
US10254833B2 (en) 2015-02-20 2019-04-09 Sony Interactive Entertainment Inc. Magnetic tracking of glove interface object
US20160246370A1 (en) * 2015-02-20 2016-08-25 Sony Computer Entertainment Inc. Magnetic tracking of glove fingertips with peripheral devices
US10046224B2 (en) * 2015-02-25 2018-08-14 Jabii Group Aps Boxing device for performing a harmless boxing match, method and uses thereof
US20180036620A1 (en) * 2015-02-25 2018-02-08 Jabii Group Aps Boxing device for performing a harmless boxing match, method and uses thereof
US20160271498A1 (en) * 2015-03-20 2016-09-22 Miles Queller Lifton System and method for modifying human behavior through use of gaming applications
US10856778B2 (en) * 2015-04-14 2020-12-08 Inesc Tec—Instituto De Engenharia De Sistemas E Wrist rigidity assessment device for use in deep brain stimulation surgery
US10942968B2 (en) 2015-05-08 2021-03-09 Rlt Ip Ltd Frameworks, devices and methodologies configured to enable automated categorisation and/or searching of media data based on user performance attributes derived from performance sensor units
EP3098691A1 (en) * 2015-05-29 2016-11-30 Manus Machinae B.V. Flex sensor and instrumented glove
US10635457B2 (en) * 2015-06-12 2020-04-28 Tyrenn Co., Ltd. Input device and UI configuration and execution method thereof
US20180307507A1 (en) * 2015-06-12 2018-10-25 Spheredyne Co., Ltd. Input device and ui configuration and execution method thereof
USD787515S1 (en) * 2015-08-24 2017-05-23 Flint Rehabilitation Devices, LLC Hand-worn user interface device
WO2017039553A1 (en) * 2015-09-01 2017-03-09 AKSU YILDIRIM, Sibel A personalized rehabilitation system
US10736544B2 (en) * 2015-09-09 2020-08-11 The Regents Of The University Of California Systems and methods for facilitating rehabilitation therapy
US11029757B1 (en) 2015-09-24 2021-06-08 Facebook Technologies, Llc Detecting positions of magnetic flux sensors having particular locations on a device relative to a magnetic field generator located at a predetermined position on the device
USD778531S1 (en) 2015-10-02 2017-02-14 Milwaukee Electric Tool Corporation Glove
USD812843S1 (en) 2015-10-02 2018-03-20 Milwaukee Electric Tool Corporation Glove
USD864519S1 (en) 2015-10-02 2019-10-29 Milwaukee Electric Tool Corporation Glove
USD794901S1 (en) 2015-12-10 2017-08-22 Milwaukee Electric Tool Corporation Glove
US11074826B2 (en) 2015-12-10 2021-07-27 Rlt Ip Ltd Frameworks and methodologies configured to enable real-time adaptive delivery of skills training data based on monitoring of user performance via performance monitoring hardware
US20170185142A1 (en) * 2015-12-25 2017-06-29 Le Holdings (Beijing) Co., Ltd. Method, system and smart glove for achieving immersion in a virtual reality system
US11094418B2 (en) * 2015-12-31 2021-08-17 Nokia Technologies Oy Optimized biological measurement
USD812845S1 (en) 2016-01-20 2018-03-20 Milwaukee Electric Tool Corporation Glove
USD812844S1 (en) 2016-01-20 2018-03-20 Milwaukee Electric Tool Corporation Glove
US9821207B2 (en) * 2016-03-08 2017-11-21 Jess Jewett Golf training apparatus
US10317997B2 (en) * 2016-03-11 2019-06-11 Sony Interactive Entertainment Inc. Selection of optimally positioned sensors in a glove interface object
CN105999652A (en) * 2016-04-26 2016-10-12 珠海云智医疗科技有限公司 Brain injury rehabilitation training device based on two-finger fingertip pinching
US10894204B2 (en) * 2016-05-04 2021-01-19 Contact Control Interfaces, LLC Exo-tendon motion capture glove device with haptic grip response
US10137362B2 (en) 2016-05-04 2018-11-27 Thomas F Buchanan, IV Exo-tendon motion capture glove device with haptic grip response
US11517788B2 (en) * 2016-06-06 2022-12-06 Maxell, Ltd. Finger exercise training menu generating system, method thereof, and program thereof
US10037626B2 (en) * 2016-06-30 2018-07-31 Microsoft Technology Licensing, Llc Interaction with virtual objects based on determined restrictions
US20180005443A1 (en) * 2016-06-30 2018-01-04 Adam Gabriel Poulos Interaction with virtual objects based on determined restrictions
US10372213B2 (en) * 2016-09-20 2019-08-06 Facebook Technologies, Llc Composite ribbon in a virtual reality device
US10599217B1 (en) * 2016-09-26 2020-03-24 Facebook Technologies, Llc Kinematic model for hand position
US10831272B1 (en) 2016-09-26 2020-11-10 Facebook Technologies, Llc Kinematic model for hand position
US20190087020A1 (en) * 2016-10-04 2019-03-21 Hewlett-Packard Development Company, L.P. Three-dimensional input device
US10712836B2 (en) * 2016-10-04 2020-07-14 Hewlett-Packard Development Company, L.P. Three-dimensional input device
US20200054940A1 (en) * 2017-02-14 2020-02-20 Sony Interactive Entertainment Europe Limited Sensing Apparatus And Method
US10933309B2 (en) * 2017-02-14 2021-03-02 Sony Interactive Entertainment Europe Limited Sensing apparatus and method
US10220255B2 (en) * 2017-04-12 2019-03-05 Russell Copelan Wearable measuring apparatus for sports training
RU177032U1 (en) * 2017-05-10 2018-02-06 Александр Владимирович Елизаров Simulator for the fingers of the hand
US11899838B2 (en) 2017-07-19 2024-02-13 Plexus Immersive Corp Hand worn interface device integrating electronic sensors
US10845876B2 (en) * 2017-09-27 2020-11-24 Contact Control Interfaces, LLC Hand interface device utilizing haptic force gradient generation via the alignment of fingertip haptic units
US11422624B2 (en) 2017-09-27 2022-08-23 Contact Control Interfaces, LLC Hand interface device utilizing haptic force gradient generation via the alignment of fingertip haptic units
US11042219B2 (en) * 2017-11-16 2021-06-22 Zhaosheng Chen Smart wearable apparatus, smart wearable equipment and control method of smart wearable equipment
US20190160338A1 (en) * 2017-11-29 2019-05-30 Seiko Epson Corporation Training assistance device and non-transitory computer-readable storage medium
US11369866B2 (en) * 2017-12-21 2022-06-28 Sony Interactive Entertainment Inc. Position tracking apparatus and method
US20190209086A1 (en) * 2018-01-05 2019-07-11 Rehabotics Medical Technology Corp. Fixed-sensor finger action detecting glove
US11464450B2 (en) * 2018-01-05 2022-10-11 Rehabotics Medical Technology Corp. Fixed-sensor finger action detecting glove
CN112423718A (en) * 2018-07-18 2021-02-26 皇家飞利浦有限公司 Rehabilitation device and method for monitoring hand movement
US10901506B2 (en) 2018-07-30 2021-01-26 Htc Corporation Finger-gesture detection device for control handle use in virtual reality, control assembly having the same and correction method for virtual reality system
US11226684B2 (en) * 2018-07-30 2022-01-18 Htc Corporation Finger-gesture detection device for control handle use in virtual reality, control assembly having the same and correction method for virtual reality system
WO2020053332A1 (en) * 2018-09-14 2020-03-19 Deutsches Zentrum für Luft- und Raumfahrt e.V. Device and method for calibrating at least one input device, and program product
US11458382B2 (en) * 2018-11-14 2022-10-04 South China University Of Technology Immersive upper limb rehabilitation training system
EP3979966A4 (en) * 2019-06-07 2023-07-05 Osind Medi Tech Private Limited A self-driven rehabilitation device and method thereof
US20210321909A1 (en) * 2019-06-17 2021-10-21 Limited Liability Company "Sensomed" Hardware/software system for the rehabilitation of patients with cognitive impairments of the upper extremities after stroke
CN112543978A (en) * 2019-06-17 2021-03-23 森桑姆德有限公司 Software and hardware system for the rehabilitation of patients with cognitive impairments of the upper limbs after stroke
CN110275537A (en) * 2019-06-27 2019-09-24 中国电子科技集团公司信息科学研究院 Motion trajectory cooperative control method and device, and computer-readable storage medium
US20210362004A1 (en) * 2019-07-31 2021-11-25 Medivr, Inc. Rehabilitation support apparatus and rehabilitation supporting method
US11690774B2 (en) 2019-10-11 2023-07-04 Neurolutions, Inc. Orthosis systems and rehabilitation of impaired body parts
US11534358B2 (en) * 2019-10-11 2022-12-27 Neurolutions, Inc. Orthosis systems and rehabilitation of impaired body parts
USD960378S1 (en) * 2020-02-01 2022-08-09 Pankajkumar K Chhatrala Orthopedic thumb splint
US20230285836A1 (en) * 2020-06-24 2023-09-14 Nippon Telegraph And Telephone Corporation Position sense correction device, method, and program
WO2022157648A1 (en) * 2021-01-20 2022-07-28 Neurolutions, Inc. Systems and methods for remote motor assessment
WO2022156094A1 (en) * 2021-01-22 2022-07-28 上海司羿智能科技有限公司 Hand motion detection device and control method therefor, rehabilitation device and autonomous control system
US20220249947A1 (en) * 2021-02-10 2022-08-11 StrikerVR Inc. Simulation systems and methods including peripheral devices providing haptic feedback
WO2022173973A1 (en) * 2021-02-10 2022-08-18 Strykervr Inc. Simulation systems and methods including peripheral devices providing haptic feedback
CN113101134A (en) * 2021-04-02 2021-07-13 上海交通大学医学院附属新华医院 Powered-exoskeleton-based auxiliary rehabilitation system for children's lower limb movement
WO2022235283A1 (en) * 2021-05-07 2022-11-10 Command Gloves Llc Glove comprising scannable code and related system and method for use
GB2612988A (en) * 2021-11-18 2023-05-24 Lusio Tech Pty Limited Systems and methods for users with impaired movement
GB2612988B (en) * 2021-11-18 2024-04-17 Lusio Tech Pty Limited Systems and methods for users with impaired movement

Also Published As

Publication number Publication date
EP2389152A1 (en) 2011-11-30
EP2389152A4 (en) 2016-05-11
WO2010085476A1 (en) 2010-07-29

Similar Documents

Publication Title
US20120157263A1 (en) Multi-user smartglove for virtual environment-based rehabilitation
Postolache et al. Remote monitoring of physical rehabilitation of stroke patients using IoT and virtual reality
Webster et al. Systematic review of Kinect applications in elderly care and stroke rehabilitation
US20090098519A1 (en) Device and method for employment of video games to provide physical and occupational therapy and measuring and monitoring motor movements and cognitive stimulation and rehabilitation
AU2012394006A9 (en) Arm exercise device and system
WO2021226445A1 (en) Avatar tracking and rendering in virtual reality
US20150133206A1 (en) Method and apparatus for mobile rehabilitation exergaming
Decker et al. Wiihabilitation: rehabilitation of wrist flexion and extension using a Wiimote-based game system
Tsekleves et al. The use of the Nintendo Wii in motor rehabilitation for virtual reality interventions: a literature review
Karime et al. A fuzzy-based adaptive rehabilitation framework for home-based wrist training
Holmes et al. Usability and performance of Leap Motion and Oculus Rift for upper arm virtual reality stroke rehabilitation
Shigapov et al. Design of digital gloves with feedback for VR
Sheng et al. Commercial device-based hand rehabilitation systems for stroke patients: State of the art and future prospects
WO2020049555A1 (en) System, device and method for fine motor movement training
WO2014174513A1 (en) Kinetic user interface
Gamboa et al. Advantages and limitations of leap motion from a developers', physical therapists', and patients' perspective
Shen et al. A novel approach in rehabilitation of hand-eye coordination and finger dexterity
Menezes et al. Development of a complete game based system for physical therapy with kinect
Chhor et al. Breakout: Design and evaluation of a serious game for health employing Intel RealSense
Batista et al. Surface electromyography for game-based hand motor rehabilitation
Vogiatzaki et al. Telemedicine system for game-based rehabilitation of stroke patients in the FP7-“StrokeBack” project
Bethi Exergames for telerehabilitation
Yasmin Virtual Reality and Assistive Technologies: A Survey
Ong et al. Augmented Reality-Assisted Healthcare Exercising Systems
Ferreira Smartphone Based Tele-Rehabilitation

Legal Events

Date Code Title Description
AS Assignment

Owner name: NORTHEASTERN UNIVERSITY, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIVAK, MARK;HOLDEN, MAUREEN K.;MAVROIDIS, CONSTANTINOS;AND OTHERS;SIGNING DATES FROM 20110719 TO 20111111;REEL/FRAME:027901/0538

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION