WO2019016811A1 - Brain-computer interface rehabilitation system and method - Google Patents

Brain-computer interface rehabilitation system and method

Info

Publication number
WO2019016811A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
computing unit
motor
physical actuator
errp
Prior art date
Application number
PCT/IL2018/050796
Other languages
English (en)
Inventor
Miriam Zacksenhouse
Reuven Katz
Original Assignee
Technion Research & Development Foundation Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Technion Research & Development Foundation Limited filed Critical Technion Research & Development Foundation Limited
Publication of WO2019016811A1 publication Critical patent/WO2019016811A1/fr

Links

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1124Determining motor skills
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4058Detecting, measuring or recording for evaluating the nervous system for evaluating the central nervous system
    • A61B5/4064Evaluating the brain
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7475User input or interface means, e.g. keyboard, pointing device, joystick
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02Stretching or bending or torsioning apparatus for exercising
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02Stretching or bending or torsioning apparatus for exercising
    • A61H1/0237Stretching or bending or torsioning apparatus for exercising for the lower limbs
    • A61H1/0266Foot
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02Stretching or bending or torsioning apparatus for exercising
    • A61H1/0274Stretching or bending or torsioning apparatus for exercising for the upper limbs
    • A61H1/0285Hand
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09Rehabilitation or training
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00Medical imaging apparatus involving image processing or analysis
    • A61B2576/02Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
    • A61B2576/026Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part for the brain
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves 
    • A61B5/0522Magnetic induction tomography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves 
    • A61B5/055Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves  involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121Determining geometric values, e.g. centre of rotation or angular range of movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/22Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B5/224Measuring muscular strength
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • A61B5/375Electroencephalography [EEG] using biofeedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1602Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/1635Hand or arm, e.g. handle
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1602Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/164Feet or leg, e.g. pedal
    • A61H2201/1642Holding means therefor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1602Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/165Wearable interfaces
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1657Movement of interface, i.e. force application means
    • A61H2201/1659Free spatial automatic movement of interface within a working area, e.g. Robot
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50Control means thereof
    • A61H2201/5007Control means thereof computer controlled
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2230/00Measuring physical parameters of the user
    • A61H2230/08Other bio-electrical signals
    • A61H2230/10Electroencephalographic signals
    • A61H2230/105Electroencephalographic signals used as a control parameter for the apparatus
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • the invention relates to the field of motor training and motor rehabilitation after neurological trauma.
  • a person suffering neural damage may lose motor control of one or more body parts. Although some people may recover from strokes, a majority of patients suffer from residual neurological deficits that persistently impair function. Restoration of motor function involves instigating regenerative responses that promote the growth of new neural connections in the brain, a process known to those skilled in the art as brain plasticity. This process can be aided by physical therapy, which typically involves one-on-one attention from a therapist who assists the patient through repetitive physical exercises of the affected body part. However, physical therapy is usually intensive, time consuming and costly. The repetitive nature of physical therapy makes it conducive to at least partial automation through the use of electromechanical devices, such as robotic systems.
  • robotic rehabilitation systems deliver movement therapy (referred to as robotic therapy, RT) that involves performing goal-directed motor tasks in one or more degrees of freedom.
  • An interactive control system, which takes into account any counter forces applied by the user, can adjust the power output of the robot to provide neutral, assistive, or resistive forces.
  • RT needs to actively engage the patients in attempting to move and to challenge them by adapting to their performance.
  • three general approaches have been developed: (1) assist-as-needed approaches, (2) RT triggering based on kinematic and neurophysiological indices, especially those indicating patient intent to move, and (3) virtual reality games for a more immersive experience.
  • EEG-based triggers may include movement-related cortical potentials (MRCP), which reflect movement intent, and motor imagery.
  • BCIs brain-computer interfaces
  • a system which includes an acquisition element configured to acquire brain activity data from a user, a physical actuator configured to interact with movement of a body part of the user, and a computing unit operatively coupled to the acquisition element and the physical actuator, the computing unit being configured to instruct the user to perform a motor exercise, continually monitor user movement parameters, receive and continually process the brain activity data from the user, decode the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjust an operational parameter of the physical actuator based on the ErrP signal.
  • ErrP error-related potential
  • a method for motor training and rehabilitation incorporating brain-machine interface including the steps of acquiring brain activity data from a user, providing a physical actuator capable of interacting with movement of a body part of the user, providing an instruction to perform a motor exercise, the motor exercise including manipulation of the physical actuator by the user, wherein the instruction is provided by a computing unit, continually monitoring user movement parameters, continually processing the brain activity data from the user, decoding the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjusting an operational parameter of the physical actuator based on the ErrP signal.
  • ErrP error-related potential
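The following is a minimal sketch of how the closed-loop method described above could be orchestrated in software. It is an illustration only: the objects and method names (`acquisition`, `actuator`, `ui`, `set_assist_gain`, `decode_errp`, etc.) are assumptions introduced for the example, and `decode_errp` stands in for the decoding step sketched further below; none of this is specified by the disclosure.

```python
import time

def run_motor_exercise(acquisition, actuator, ui, exercise, duration_s=60.0):
    """Illustrative closed loop: instruct, monitor, decode ErrP, adjust actuator.

    `acquisition`, `actuator`, and `ui` are hypothetical objects standing in for
    the acquisition element, physical actuator, and instruction/feedback means.
    """
    ui.show_instruction(exercise.description)      # instruct the user (visual/aural/tactile)
    acquisition.start_stream()                     # begin acquiring brain activity data
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        movement = actuator.read_state()           # position, force, torque, velocity
        eeg_chunk = acquisition.read_chunk()       # continually process brain activity data
        errp = decode_errp(eeg_chunk)              # real-time ErrP extraction (placeholder)
        if errp is not None:
            # adjust an operational parameter of the physical actuator;
            # here, as one example, the level of assistance
            actuator.set_assist_gain(actuator.assist_gain + 0.1)
        ui.show_feedback(movement, errp)           # performance feedback to the user
    acquisition.stop_stream()
```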
  • the acquisition element is configured to acquire the brain activity data from the user using a technique including one of electroencephalography (EEG), magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), functional near-infrared spectroscopy (fNIRS), single-photon emission computed tomography (SPECT), and electrocorticography (ECoG).
  • EEG electroencephalography
  • MEG magnetoencephalography
  • fMRI functional magnetic resonance imaging
  • fNIRS functional near-infrared spectroscopy
  • SPECT single-photon emission computed tomography
  • ECoG electrocorticography
  • the physical actuator includes one of: a movable element attachable to the user's body part, an articulated robotic arm having one or more movable joints, a grip or handle element, a wearable element, means for immovably securing at least one of the user's body parts to limit the movement thereof during the performance of said motor exercise, and means for partially or fully bearing the weight of the user during said motor exercise.
  • the articulated robotic arm has more than one degree of freedom.
  • the grip or handle element is configured for at least one of linear, rotational and spherical movement.
  • the wearable element is a glove including one or more actuators attachable to corresponding digits of a hand of the user.
  • the wearable element is a boot comprising one or more actuators attachable to corresponding digits of a foot of the user.
  • the means for partially or fully bearing the weight of the user is a harness by which the user is suspended.
  • the computing unit is configured to control the application of at least one of passive, pushing, assisting, reminding, responding, and resisting forces by the physical actuator.
  • the computing unit is configured to continually monitor at least one of position, force, torque and velocity of the user's movement.
  • the computing unit is further configured to decode at least one of the P3a, P3b, and error-related negativity (ERN) components of the ErrP signal.
  • ERN error-related negativity
  • the computing unit is further configured to determine whether said ErrP signal corresponds to an execution error or an outcome error.
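One building block of such decoding is measuring the amplitude of each subcomponent in a fixed latency window of an epoch time-locked to the error or feedback event. The sketch below assumes a 256 Hz sampling rate and typical window bounds (ERN shortly after the event, P3a and P3b later); the exact bounds are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate, Hz

# Illustrative latency windows (seconds after the error/feedback event).
WINDOWS = {"ERN": (0.00, 0.10), "P3a": (0.25, 0.35), "P3b": (0.35, 0.50)}

def component_amplitudes(epoch, fs=FS):
    """Mean amplitude of each ErrP subcomponent in a baseline-corrected,
    single-channel epoch whose index 0 is the event onset."""
    amplitudes = {}
    for name, (t0, t1) in WINDOWS.items():
        i0, i1 = int(t0 * fs), int(t1 * fs)
        amplitudes[name] = float(np.mean(epoch[i0:i1]))
    return amplitudes
```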
  • the computing unit further comprises a computer readable storage medium having stored thereon computer readable program instructions for executing a motor exercise program.
  • the computing unit is configured to provide to the user said motor exercise instruction by at least one of visual, aural, and tactile means. In some embodiments, the computing unit is configured to provide to the user a feedback relating to the performance of said motor exercise. In some embodiments, the computing unit further comprises a display device. In some embodiments, the computing unit is configured to present said instruction and feedback as part of a video game. In some embodiments, the computing unit is configured to present said instruction and feedback in virtual reality. In some embodiments, the computing unit comprises a camera-type motion sensor.
  • FIG. 1 is a schematic illustration of an embodiment of a system and method for brain-computer interface (BCI) rehabilitation.
  • FIG. 2 illustrates one exemplary embodiment of a BCI rehabilitation system.

DETAILED DESCRIPTION

  • the present system and method are configured to augment human movement behavior in order to accelerate complex movement skill acquisition and improve outcome.
  • the present system and method relate to the application of specific brain signals, evoked in response to perceived errors in the performance of a motor task, to provide various forms of feedback, including real-time feedback and/or post-performance feedback, for training and rehabilitation.
  • a newly emerging use of BCI technology is in the area of motor training and rehabilitation.
  • BCI rehabilitation devices may incorporate real time closed-loop feedback to enhance the recruitment of selected brain areas by guiding a more focused activation of specific brain signals. This in turn may help to accelerate brain plasticity, and thus reduce the length, difficulty and cost of the recovery process.
  • ERPs event-related potentials
  • ErrP error-related potentials
  • the system 100 of the present disclosure includes a physical actuator 102 and a computing unit 104.
  • the physical actuator 102 comprises a robotic-type movable element configured to interact with a body part of a user, such that said body part can move independently and controllably in concert with physical actuator 102 in an environment of one or more degrees of freedom (DOF), such as one, two, three, four, or more degrees of freedom.
  • DOF degrees-of-freedom
  • the interaction of the physical actuator 102 with the body part of the user may involve providing haptic feedback to the user, i.e., by applying torque and/or force as feedback, or by cancelling out the dynamics of the physical actuator 102 such that it becomes free-moving.
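One way to realize the free-moving case is to command a torque that cancels the actuator's own modeled dynamics (inertia, friction and, where relevant, gravity). The single-joint model and the parameter values below are assumptions made for illustration; the disclosure does not prescribe a particular compensation scheme.

```python
import math

def compensation_torque(theta, omega, alpha,
                        inertia=0.01, viscous=0.005, coulomb=0.02, m_g_l=0.0):
    """Torque (N*m) cancelling the actuator's modeled single-joint dynamics so
    that the interface feels approximately free-moving to the user.
    All parameter values are placeholders."""
    tau = inertia * alpha                            # inertial load
    tau += viscous * omega                           # viscous friction
    if omega != 0.0:
        tau += coulomb * math.copysign(1.0, omega)   # Coulomb friction
    tau += m_g_l * math.sin(theta)                   # gravity term, if any
    return tau
```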
  • Physical actuator 102 may further comprise a grip or handle element; a wearable element; means, such as straps, for immovably securing at least one of the user's body parts to limit the movement thereof during the performance of motor exercises; and means, such as a body harness, for partially or fully bearing the weight of the user during motor exercise.
  • the computing unit 104 is operatively coupled to the physical actuator 102 and is programmed to control operation thereof.
  • the computing unit 104 can be programmed to execute one or more desired exercise routines at the physical actuator 102, selected to improve motor function of an affected body part of a user.
  • the computing unit 104 is configured to (i) command the physical actuator 102 to apply at least one of passive (none), pushing (against the user), assisting (toward the goal), reminding (applied for short duration), responding (applied for short duration toward the goal), and resisting (against the movement of the user) forces; and (ii) continually monitor the user's movement parameters, such as, but not limited to, position, orientation, force, torque and velocity of the physical actuator 102.
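As an illustration of how such force modes might map onto commanded torques, the enumeration and policy below are a minimal sketch; the mode-to-torque mapping and the gain are assumptions, not the disclosed control law.

```python
from enum import Enum

class ForceMode(Enum):
    PASSIVE = "passive"        # no force applied
    PUSHING = "pushing"        # against the user
    ASSISTING = "assisting"    # toward the goal
    REMINDING = "reminding"    # short-duration cue
    RESPONDING = "responding"  # short-duration push toward the goal
    RESISTING = "resisting"    # against the movement of the user

def commanded_torque(mode, error_to_goal, user_velocity, gain=0.5):
    """Map a force mode to a commanded torque (illustrative policy only)."""
    if mode is ForceMode.PASSIVE:
        return 0.0
    if mode in (ForceMode.ASSISTING, ForceMode.RESPONDING, ForceMode.REMINDING):
        return gain * error_to_goal          # push toward the goal
    if mode is ForceMode.PUSHING:
        return -gain * error_to_goal         # push away from the goal
    return -gain * user_velocity             # RESISTING: oppose the user's movement
```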
  • the computing unit is further configured to recognize the application of counter forces by the user to the physical actuator 102 and to respond by adjusting the power output of the physical actuator 102, to permit the user to override the desired exercise path of the physical actuator 102.
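A hypothetical override rule consistent with this behavior: when the measured counter torque opposes the commanded torque and exceeds a threshold, the command is scaled back so that the user's movement prevails. The threshold and back-off values are placeholders.

```python
def apply_user_override(commanded_tau, user_tau, threshold=0.3, backoff=0.5):
    """Reduce the actuator's output when the user pushes against it,
    allowing the user to override the planned exercise path (illustrative)."""
    opposing = user_tau * commanded_tau < 0.0
    if opposing and abs(user_tau) > threshold:
        return commanded_tau * backoff
    return commanded_tau
```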
  • the operation of the computing unit 104 may be overseen by a physical therapist, for example, via an appropriate computer interface.
  • the therapist will be able to select a particular sequence or mix of exercise routines for the user and adjust parameters of the operation of the system 100 in response to user interaction therewith.
  • computing unit 104 may activate the physical actuator 102 to provide assistive or resistive force to the user when an ErrP signal is detected.
  • Acquisition element 108 is configured to enable detection of a signal stream from the user in the course of the performance by the user of each exercise in the set of exercises.
  • Acquisition element 108 comprises, e.g., an electroencephalogram (EEG) electrode array comprising a desired number of electrodes disposed in contact with the user's scalp.
  • EEG electroencephalogram
  • Acquisition element 108 is operatively coupled to computing unit 104, which is further configured to (i) receive and continually process the brain activity data from the user, and (ii) decode the said brain activity data from the user in real time to extract ErrP signals associated with various types of task errors.
  • ErrP signals may include, for example, the P3a and P3b subcomponents of P300, as well as ERN.
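One way to organize this continual processing, sketched below under the assumption of a fixed-rate, single-channel EEG stream: incoming samples are band-pass filtered, buffered, and cut into epochs around task events for subsequent ErrP decoding (for example, with the component amplitudes computed in the earlier sketch). The filter band, epoch bounds, and class layout are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, lfilter, lfilter_zi

FS = 256                                                # assumed sampling rate, Hz
B, A = butter(4, [1.0, 10.0], btype="bandpass", fs=FS)  # assumed ErrP-relevant band

class ErrPStreamProcessor:
    """Continually filter one EEG channel and extract event-locked epochs."""

    def __init__(self, pre_s=0.2, post_s=0.6, history_s=10.0):
        self.zi = lfilter_zi(B, A) * 0.0                # filter state, starting from rest
        self.buffer = np.zeros(0)
        self.pre, self.post = int(pre_s * FS), int(post_s * FS)
        self.max_len = int(history_s * FS)

    def push_chunk(self, chunk):
        """Filter a new chunk of samples and append it to the rolling buffer."""
        filtered, self.zi = lfilter(B, A, chunk, zi=self.zi)
        self.buffer = np.concatenate([self.buffer, filtered])[-self.max_len:]

    def epoch_at(self, samples_since_event):
        """Epoch spanning [-pre, +post] around an event that occurred
        `samples_since_event` samples before the newest sample, or None."""
        event = len(self.buffer) - samples_since_event
        start, end = event - self.pre, event + self.post
        if start < 0 or end > len(self.buffer):
            return None
        return self.buffer[start:end]
```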
  • Computing unit 104 is further configured to adjust one or more parameters of the operation of the physical actuator 102 in response to said ErrP signals.
  • computing unit 104 may activate the physical actuator 102 to provide assistive or resistive force to the user when one or more subcomponents of the ErrP signal are detected.
  • computing unit 104 may modify the sequence or mix of exercises based upon at least one identified ErrP signal. Those of skill in the art will appreciate that either one or both of the aforementioned subcomponents of P300 may be used by the computing unit 104 in evaluating the performance by the user of an exercise in the set of exercises.
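A hypothetical adaptation policy along these lines is sketched below: when the decoder's ErrP score crosses a threshold, assistance is increased, and after repeated errors the exercise queue is made easier. The score keys, threshold, step size, and queue interface are assumptions introduced for illustration only.

```python
def adapt_on_errp(scores, actuator, exercise_queue,
                  threshold=0.7, assist_step=0.1, error_budget=3):
    """Illustrative policy: raise assistance on a detected ErrP and simplify
    the exercise mix after repeated errors. All numbers are placeholders."""
    detected = max(scores.get("ERN", 0.0), scores.get("P3a", 0.0),
                   scores.get("P3b", 0.0)) > threshold
    if detected:
        actuator.set_assist_gain(actuator.assist_gain + assist_step)
        exercise_queue.error_count += 1
        if exercise_queue.error_count >= error_budget:
            exercise_queue.insert_easier_variant()   # modify the sequence or mix
            exercise_queue.error_count = 0
    return detected
```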
  • the system 100 can include one or more audiovisual displays 106 that are operated by the computing unit 104 to provide visual and audio exercise instructions and performance feedback to the user.
  • the system 100 provides a visual instruction, e.g., by displaying the target orientation or position on the display 106.
  • the display 106 may show a video clip or illustration of an arm rotating to the right or left, as appropriate.
  • such instructions may be provided in an audio fashion via a speaker (not depicted).
  • said instructions and feedback may be provided in a video gaming or virtual reality environment as part of the rehabilitation training.
  • the system 100 may further provide haptic feedback to the user through tactile means (e.g., vibration applied at the physical actuator 102).
  • the system 100 may also employ a camera-type motion sensor, such as Kinect® by Microsoft Corp., to recognize user movements.
  • Fig. 2 illustrates one exemplary embodiment of a BCI rehabilitation system 200.
  • the exemplary embodiment of the BCI rehabilitation system 200 as depicted is particularly suited for use with a user's arm. Persons of skill in the art will appreciate that alternative embodiments may be adapted for use with other limbs, such as a user's leg, or other body parts, such as a user's head.
  • the BCI rehabilitation system 200 comprises a physical actuator 202 useful with the systems of the present disclosure, such as, for example, the system 100 of Fig. 1.
  • the physical actuator 202 generally comprises a base 210 to which is coupled motor assembly 212.
  • Motor assembly 212 comprises motor 212a, an encoder (not depicted) that measures the orientation of the motor, and a handle 212b, which is securely coupled to the output shaft of motor 212a.
  • motor assembly 212 further comprises torque sensor 212c.
  • the torque sensor 212c is a transducer that converts a torsional mechanical input into an electrical output signal.
  • Base 210 is generally sized and shaped to ergonomically receive a user's forearm while the user's hand or palm is grasping the handle 212b.
  • other embodiments of the BCI rehabilitation system 200 may be adapted for treating other movements of the arm, e.g., reaching movements, or for treating other limbs, e.g., the leg.
  • the movement of the output shaft of motor 212a establishes a one-DOF environment at the handle 212b, in which the user's wrist can rotate the handle 212b about the pronation/supination (PS) axis of wrist joint rotation.
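One control cycle for such a one-DOF handle could look like the sketch below: read the encoder and torque sensor, compute a bounded torque toward a target orientation, and command the motor. The device interface, gains, and limits are hypothetical; the disclosure does not specify this controller.

```python
def wrist_ps_control_step(motor, torque_sensor, target_angle_rad,
                          kp=2.0, kd=0.05, tau_limit=1.5):
    """One cycle of an illustrative position controller for the
    pronation/supination handle; all gains and limits are placeholders."""
    angle = motor.read_encoder()            # handle orientation (rad), from the encoder
    velocity = motor.read_velocity()        # rad/s, e.g., differentiated encoder readings
    user_tau = torque_sensor.read()         # torsional input applied by the user (N*m)
    tau = kp * (target_angle_rad - angle) - kd * velocity
    tau = max(-tau_limit, min(tau_limit, tau))
    motor.command_torque(tau)
    return angle, velocity, user_tau
```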
  • the handle 212b can assume various forms and is generally configured to promote ergonomic gripping thereof by a user's hand or palm.
  • the computing unit 204 is programmed to control operation of the physical actuator 202. In some embodiments, the computing unit 204 is further programmed to effectuate performance of one or more rehabilitation exercise routines at the physical actuator 202 selected to improve motor function of the user's wrist. In some embodiments, the system 200 can include one or more displays 206 that are operated by the computing unit 204 to display a graphical user interface related to the desired exercise routine.
  • Brain signals including, but not limited to, P300, P3a, and P3b, as discussed above, are acquired using EEG electrode array 208 arranged in a fitted cap, corresponding to the acquisition element 108.
  • MEG magnetoencephalography
  • fMRI functional magnetic resonance imaging
  • fNIRS functional near-infrared spectroscopy
  • SPECT single-photon emission computed tomography
  • ECoG electrocorticography
  • a particular intention of some embodiments of the invention is to interact with the cerebral aspects of rehabilitation, as they relate, for example, to the plasticity and/or training of a human brain.
  • a user's brain which is damaged due to traumatic brain injury or a stroke may be subjected to beneficial therapies as described herein.
  • this invention may also be beneficial when used with patients having other types of physical disabilities and limitations.
  • aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • the system disclosed in the present specification may further be specially constructed for the required purposes or may comprise a general-purpose computer or other device selectively activated or reconfigured by a computer program stored in the computer.
  • the algorithms presented herein are not inherently related to any particular computer or other apparatus.
  • Various general-purpose machines may be used with programs in accordance with the teachings herein.
  • the construction of a more specialized system to perform the required method steps may be appropriate.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Python, Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • LAN local area network
  • WAN wide area network
  • Internet Service Provider (for example, AT&T, MCI, Sprint, EarthLink, MSN, GTE, etc.)
  • the present invention may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Neurology (AREA)
  • Biophysics (AREA)
  • Rehabilitation Therapy (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Neurosurgery (AREA)
  • Psychology (AREA)
  • Physiology (AREA)
  • Dermatology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Psychiatry (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention relates to a system and method for motor training and rehabilitation incorporating a brain-machine interface. The system includes an acquisition element configured to acquire brain activity data from a user, a physical actuator configured to interact with movement of a body part of the user, and a computing unit operatively coupled to the acquisition element and the physical actuator, the computing unit being configured to instruct the user to perform a motor exercise, continually monitor user movement parameters, receive and continually process the brain activity data from the user, decode the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjust an operational parameter of the physical actuator based on the ErrP signal. Related systems, apparatus, and methods are also disclosed.
PCT/IL2018/050796 2017-07-18 2018-07-18 Système et procédé de rééducation d'interface cerveau-ordinateur WO2019016811A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762533721P 2017-07-18 2017-07-18
US62/533,721 2017-07-18

Publications (1)

Publication Number Publication Date
WO2019016811A1 true WO2019016811A1 (fr) 2019-01-24

Family

ID=65015713

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2018/050796 WO2019016811A1 (fr) 2017-07-18 2018-07-18 Système et procédé de rééducation d'interface cerveau-ordinateur

Country Status (1)

Country Link
WO (1) WO2019016811A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111523519A (zh) * 2020-06-09 2020-08-11 福州大学 ErrP自适应共空间模式识别方法
CN112085052A (zh) * 2020-07-28 2020-12-15 中国科学院深圳先进技术研究院 运动想象分类模型的训练方法、运动想象方法及相关设备
WO2021008087A1 (fr) * 2019-07-17 2021-01-21 西安交通大学 Procédé de test de sensibilité de contraste basé sur un potentiel visuel de mouvement évoqué
WO2021062016A1 (fr) * 2019-09-26 2021-04-01 The Regents Of The University Of California Système d'interface cerveau-machine périphérique par commande volitive d'ensembles moteurs individuels
WO2022047377A1 (fr) * 2020-08-31 2022-03-03 Vincent John Macri Interaction de membre virtuel numérique et de corps
CN115349857A (zh) * 2022-07-18 2022-11-18 国家康复辅具研究中心 一种基于fNIRS脑功能图谱的动态康复评估方法和系统
US11673042B2 (en) 2012-06-27 2023-06-13 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
US11682480B2 (en) 2013-05-17 2023-06-20 Vincent J. Macri System and method for pre-action training and control
US11804148B2 (en) 2012-06-27 2023-10-31 Vincent John Macri Methods and apparatuses for pre-action gaming
US11904101B2 (en) 2012-06-27 2024-02-20 Vincent John Macri Digital virtual limb and body interaction
US11944446B2 (en) 2014-01-13 2024-04-02 Vincent John Macri Apparatus, method, and system for pre-action therapy

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050228515A1 (en) * 2004-03-22 2005-10-13 California Institute Of Technology Cognitive control signals for neural prosthetics
US7120486B2 (en) * 2003-12-12 2006-10-10 Washington University Brain computer interface
US20090221928A1 (en) * 2004-08-25 2009-09-03 Motorika Limited Motor training with brain plasticity
US20100137734A1 (en) * 2007-05-02 2010-06-03 Digiovanna John F System and method for brain machine interface (bmi) control using reinforcement learning
WO2014025765A2 (fr) * 2012-08-06 2014-02-13 University Of Miami Systèmes et procédés pour un décodage neural adaptatif
US20150012111A1 (en) * 2013-07-03 2015-01-08 University Of Houston Methods for closed-loop neural-machine interface systems for the control of wearable exoskeletons and prosthetic devices
WO2016094862A2 (fr) * 2014-12-12 2016-06-16 Francis Joseph T Interface cerveau-machine autonome

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7120486B2 (en) * 2003-12-12 2006-10-10 Washington University Brain computer interface
US20050228515A1 (en) * 2004-03-22 2005-10-13 California Institute Of Technology Cognitive control signals for neural prosthetics
US20090221928A1 (en) * 2004-08-25 2009-09-03 Motorika Limited Motor training with brain plasticity
US20100137734A1 (en) * 2007-05-02 2010-06-03 Digiovanna John F System and method for brain machine interface (bmi) control using reinforcement learning
WO2014025765A2 (fr) * 2012-08-06 2014-02-13 University Of Miami Systèmes et procédés pour un décodage neural adaptatif
US20150012111A1 (en) * 2013-07-03 2015-01-08 University Of Houston Methods for closed-loop neural-machine interface systems for the control of wearable exoskeletons and prosthetic devices
WO2016094862A2 (fr) * 2014-12-12 2016-06-16 Francis Joseph T Interface cerveau-machine autonome

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
CHAVARRIAGA, RICARDO ET AL.: "Errare machinale est: the use of error-related potentials in brain-machine interfaces", FRONTIERS IN NEUROSCIENCE, vol. 8, 22 July 2014 (2014-07-22), pages 208, XP055565990, Retrieved from the Internet <URL:https://doi.org/10.3389/fnins.2014.00208> *
DEMCHENKO, IGOR ET AL.: "Distinct electroencephalographic responses to disturbances and distractors during continuous reaching movements", BRAIN RESEARCH, vol. 1652, 28 September 2016 (2016-09-28), pages 178 - 187, XP029793181 *
OMEDES, JASON ET AL.: "Factors that affect error potentials during a grasping task: toward a hybrid natural movement decoding BCI", JOURNAL OF NEURAL ENGINEERING, 6 June 2018 (2018-06-06), XP020329287 *
POLICH, JOHN.: "Updating P300: an integrative theory of P3a and P3b", CLINICAL NEUROPHYSIOLOGY, vol. 118.10, 18 June 2007 (2007-06-18), pages 2128 - 2148, XP022248097 *
SPÜLER, MARTIN ET AL.: "Error-related potentials during continuous feedback: using EEG to detect errors of different type and severity", FRONTIERS IN HUMAN NEUROSCIENCE, vol. 9, 26 March 2015 (2015-03-26), pages 155, XP055565994, Retrieved from the Internet <URL:https://doi.org/10.3389/fnhum.2015.00155> *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11673042B2 (en) 2012-06-27 2023-06-13 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
US11804148B2 (en) 2012-06-27 2023-10-31 Vincent John Macri Methods and apparatuses for pre-action gaming
US11904101B2 (en) 2012-06-27 2024-02-20 Vincent John Macri Digital virtual limb and body interaction
US11682480B2 (en) 2013-05-17 2023-06-20 Vincent J. Macri System and method for pre-action training and control
US11944446B2 (en) 2014-01-13 2024-04-02 Vincent John Macri Apparatus, method, and system for pre-action therapy
WO2021008087A1 (fr) * 2019-07-17 2021-01-21 西安交通大学 Procédé de test de sensibilité de contraste basé sur un potentiel visuel de mouvement évoqué
WO2021062016A1 (fr) * 2019-09-26 2021-04-01 The Regents Of The University Of California Système d'interface cerveau-machine périphérique par commande volitive d'ensembles moteurs individuels
CN111523519A (zh) * 2020-06-09 2020-08-11 福州大学 ErrP自适应共空间模式识别方法
CN111523519B (zh) * 2020-06-09 2022-05-13 福州大学 ErrP自适应共空间模式识别方法
CN112085052A (zh) * 2020-07-28 2020-12-15 中国科学院深圳先进技术研究院 运动想象分类模型的训练方法、运动想象方法及相关设备
WO2022047377A1 (fr) * 2020-08-31 2022-03-03 Vincent John Macri Interaction de membre virtuel numérique et de corps
CN115349857A (zh) * 2022-07-18 2022-11-18 国家康复辅具研究中心 一种基于fNIRS脑功能图谱的动态康复评估方法和系统

Similar Documents

Publication Publication Date Title
WO2019016811A1 (fr) Système et procédé de rééducation d'interface cerveau-ordinateur
US20220338761A1 (en) Remote Training and Practicing Apparatus and System for Upper-Limb Rehabilitation
Spataro et al. Reaching and grasping a glass of water by locked-in ALS patients through a BCI-controlled humanoid robot
Vogel et al. An assistive decision-and-control architecture for force-sensitive hand–arm systems driven by human–machine interfaces
Vourvopoulos et al. Robot navigation using brain-computer interfaces
Ktena et al. A virtual reality platform for safe evaluation and training of natural gaze-based wheelchair driving
Araujo et al. Development of a low-cost EEG-controlled hand exoskeleton 3D printed on textiles
Achic et al. Hybrid BCI system to operate an electric wheelchair and a robotic arm for navigation and manipulation tasks
Lupu et al. Virtual reality based stroke recovery for upper limbs using leap motion
Noronha et al. “Wink to grasp”—comparing eye, voice & EMG gesture control of grasp with soft-robotic gloves
Zhang et al. Combining mental training and physical training with goal-oriented protocols in stroke rehabilitation: a feasibility case study
Rechy-Ramirez et al. Impact of commercial sensors in human computer interaction: a review
JP2021529368A (ja) 理学療法用の仮想環境
D'Auria et al. Human-computer interaction in healthcare: How to support patients during their wrist rehabilitation
Mathew et al. A systematic review of technological advancements in signal sensing, actuation, control and training methods in robotic exoskeletons for rehabilitation
Allison The I of BCIs: next generation interfaces for brain–computer interface systems that adapt to individual users
Zhu et al. Face-computer interface (FCI): Intent recognition based on facial electromyography (fEMG) and online human-computer interface with audiovisual feedback
Guo et al. Human–robot interaction for rehabilitation robotics
Kilmarx et al. Sequence-based manipulation of robotic arm control in brain machine interface
Ma et al. Sensing and force-feedback exoskeleton robotic (SAFER) glove mechanism for hand rehabilitation
Batula et al. Developing an optical brain-computer interface for humanoid robot control
Feng et al. An interactive framework for personalized computer-assisted neurorehabilitation
S Tutak Design of ELISE robot for the paretic upper limb of stroke survivors
Al Nuaimi et al. Real-time Control of UGV Robot in Gazebo Simulator using P300-based Brain-Computer Interface
Kakkos et al. Human–machine interfaces for motor rehabilitation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18834555

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18834555

Country of ref document: EP

Kind code of ref document: A1