EP3068349A1 - Control system for brain-machine interfaces and neural prosthetic systems - Google Patents
Control system for brain-machine interfaces and neural prosthetic systems
- Publication number
- EP3068349A1 (application EP14861977.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- motion
- commands
- prosthetic device
- joint
- motion commands
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/68—Operating or control means
- A61F2/70—Operating or control means electrical
- A61F2/72—Bioelectric control, e.g. myoelectric
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/68—Operating or control means
- A61F2/70—Operating or control means electrical
- A61F2002/701—Operating or control means electrical operated by electrically controlled means, e.g. solenoids or torque motors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/50—Prostheses not implantable in the body
- A61F2/76—Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
- A61F2002/7615—Measuring means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
Definitions
- the present invention relates to a robotic control system and, more particularly, to a system for controlling robotic prosthetic devices given motor intent inferred from neuroimaging data.
- Brain-machine interfaces (BMI) and neural prosthetics offer great hope for restoring function to people with spinal cord injuries and amputations, as well as augmenting and enhancing the abilities of people with full motor function.
- in prosthetic device control there have been fewer advances, with most approaches rooted in conventional robotic control.
- BMI and neural prosthetics research addresses the decoding of cortical signals for the downstream execution of motion commands by an external device (see, for example, the List of Incorporated Literature References, Literature Reference Nos. 1, 9, 11, 12, and 15).
- Most of this work focuses on the neural decoding, with less rigor applied to the prosthesis control. In cases where a robotic arm prosthesis is controlled, the motion trajectory is typically converted into joint commands and executed using joint space control (see, for example,
- Task/posture decomposition has particular advantages over joint space control in BMIs. It is based on an abstraction analogous to the manner in which cortical signals are encoded (e.g., eye-centered, hand-centered Cartesian coordinates). Further, it is generalizable to whole body prosthetic control involving multiple prosthetic limbs and high degree-of-freedom kinematics. Joint space prosthetic control is limited in this capacity due to its reliance on inverse kinematics solutions. Additionally, task/posture decomposition allows for the specification of postural behaviors based on minimizing important objective functions (e.g., power consumption, virtual muscle effort, etc.) consistent with the execution of cortical motion commands.
- the system includes one or more processors and a memory, the memory having executable instructions encoded thereon, such that upon execution of the instructions, the one or more processors perform several operations, such as receiving neuroimaging data of a user from a neuroimaging device.
- the neuroimaging data is then decoded to infer spatial motion intent of the user, where the spatial motion intent includes desired motion commands of the torque controlled prosthetic device represented in a coordinate system.
- the system executes, with a prosthesis controller, the motion commands as torque commands to cause the torque controlled prosthetic device to move according to the spatial motion intent of the user.
- the system includes at least one torque controlled prosthetic device operably connected with the one or more processors.
- the system performs an operation of receiving, in the controller, sensory information regarding a current state of the prosthetic device.
- commands are executed using a task decomposition and posture decomposition, wherein the task decomposition is a task space control and the posture decomposition is formulated as a cost potential which represents a cost function.
- the motion commands are executed as torque commands that generate a desired task space control while minimizing the cost potential.
- the motion commands are executed using a spiking neural network.
- the prosthesis controller is a neuromorphic prosthesis controller and further comprises: a neuromorphic spike encoder to represent the motion command as a set of neural spikes; and a neuromorphic motor mapper to map the neural spikes representing Cartesian motion commands to joint motion.
- the system performs operations of receiving a model of the prosthetic device and a musculoskeletal model.
- the musculoskeletal model includes musculoskeletal dynamics that include steady-state tendon forces; generating, with a sensorimotor controller, simulated neural excitations given the motion commands to drive a set of muscle activations in a musculoskeletal simulation; and generating, with the prosthesis controller, simulated actuator joint torques given the motion commands to drive a simulated prosthetic device.
- the present invention also includes a computer program product.
- the computer program product includes computer-readable instructions stored on a non-transitory computer-readable medium that are executable by a computer having one or more processors, such that upon execution of the instructions, the one or more processors perform the operations listed herein.
- the computer implemented method includes an act of causing a computer to execute such instructions and perform the resulting operations.
- FIG. 1 is a block diagram depicting the components of a system of the present invention;
- FIG. 2 is an illustration of a computer program product embodying an aspect of the present invention
- FIG. 3 is a process flow-chart illustrating how high-level motor intent from the brain is decoded and a task-level motion command is transmitted to the prosthetic device controller;
- FIG. 4A is a simplified model of a human arm actuated by 14 muscles
- FIG. 4B is a table illustrating corresponding maximum isometric forces and optimal fiber lengths;
- FIG. 5A is an illustration of a redundant muscle-actuated model of the human arm used as a physiological template for controlling a prosthetic arm;
- FIG. 5B is a graph illustrating time histories of joint motion while performing a simulation run of the model as illustrated in FIG. 5A;
- FIG. 5C is a graph illustrating time histories of hand motion while performing a simulation run of the model as illustrated in FIG. 5A;
- FIG. 5D is a graph illustrating time histories of muscle effort while performing a simulation run of the model as illustrated in FIG. 5A;
- FIG. 6A is a flow chart illustrating an artificial spiking neural network where motor map redundancy is resolved by training the network only on unique solutions of the inverse mapping;
- FIG. 6B is a flow chart illustrating an artificial spiking neural network where motor map redundancy is resolved by explicitly inhibiting the learning of solutions where the cost is high;
- FIG. 7 is a flow chart illustrating how high-level motor intent from the brain is decoded and a task-level motion command is transmitted to the prosthetic device controller;
- FIG. 8 is a flow chart illustrating how a decoded cortical output is sent to a simulated prosthesis controller and a simulated sensorimotor controller;
- FIG. 9 is a flow chart illustrating the neural and musculoskeletal physiology;
- FIG. 10 is a functional block diagram depicting a system-level view of FIG. 9, showing an abstracted representation of neural and musculoskeletal physiology;
- FIG. 11 is an illustration depicting an active state musculotendon model;
- FIG. 12 is an illustration depicting a neuromuscular
- FIG. 13 is an illustration of a task-level sensorimotor controller
- FIG. 14 is a functional block diagram depicting an abstracted
- FIG. 15 is an illustration depicting a neuromuscular musculoskeletal system feed-forward path, showing the musculoskeletal system dynamics augmented with a set of holonomic constraints and Lagrange multipliers;
- FIG. 16 is an illustration of a constrained task-level sensorimotor controller;
- FIG. 17 is an illustration of a constrained task-level sensorimotor controller which generates motion consistent with the input motion commands in the presence of holonomic system constraints.
- the present invention relates to a robotic control system and, more particularly, to a system for controlling robotic prosthetic devices given motor intent inferred from neuroimaging data.
- the following description is presented to enable one of ordinary skill in the art to make and use the invention and to incorporate it in the context of particular applications.
- Various modifications, as well as a variety of uses in different applications, will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to a wide range of aspects.
- the present invention is not intended to be limited to the aspects presented, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
- Buneo, C. A., Jarvis, M. R., Batista, A. P., & Andersen, R. A. (2002). Direct visuomotor transformations for reaching. Nature, 416(6881), 632-636.
- Khatib, O. (1995). Inertial properties in robotic manipulation: An object-level framework. The International Journal of Robotics Research, 14(1), 19-36.
- the present invention has three "principal" aspects.
- the first is a system for controlling robotic prosthetic devices given motor intent inferred from neuroimaging data.
- the system is typically in the form of a computer system operating software or in the form of a "hard-coded" instruction set.
- This system may be incorporated into a wide variety of devices that provide different functionalities.
- the system is incorporated into a robot having actuators and appendages or other motion operable components and any other components as may be required to provide the functionality as described herein.
- the second principal aspect is a method, typically in the form of software, operated using a data processing system (computer).
- the third principal aspect is a computer program product.
- the computer program product generally represents computer-readable instructions stored on a non-transitory computer-readable medium such as an optical storage device, e.g., a compact disc (CD) or digital versatile disc (DVD), or a magnetic storage device such as a floppy disk or magnetic tape.
- Computer-readable media include hard disks, read-only memory (ROM), and flash-type memories. These aspects will be described in more detail below.
- A block diagram depicting an example of a system (i.e., computer system 100) of the present invention is provided in FIG. 1.
- the computer system 100 is configured to perform calculations, processes, operations, and/or functions associated with a program or algorithm.
- certain processes and steps discussed herein are realized as a series of instructions (e.g., software program) that reside within computer readable memory units and are executed by one or more processors of the computer system 100. When executed, the instructions cause the computer system 100 to perform specific actions and exhibit specific behavior, such as described herein.
- the computer system 100 may include an address/data bus 102 that is configured to communicate information. Additionally, one or more data processing units, such as a processor 104 (or processors), are coupled with the address/data bus 102. The processor 104 is configured to process information and instructions. In an aspect, the processor 104 is a microprocessor; alternatively, the processor 104 may be a different type of processor such as a parallel processor or a field programmable gate array.
- the computer system 100 is configured to utilize one or more data storage units.
- the computer system 100 may include a volatile memory unit 106 (e.g., random access memory ("RAM"), static RAM, dynamic RAM, etc.) coupled with the address/data bus 102, wherein the volatile memory unit 106 is configured to store information and instructions for the processor 104.
- the computer system 100 further may include a non-volatile memory unit 108 (e.g., read-only memory ("ROM"), programmable ROM ("PROM"), erasable programmable ROM ("EPROM")) coupled with the address/data bus 102.
- the computer system 100 may execute instructions retrieved from an online data storage unit such as in "Cloud" computing. In an aspect, the computer system 100 also may include one or more interfaces, such as an interface 110, coupled with the address/data bus 102. The one or more interfaces are configured to enable the computer system 100 to interface with other electronic devices and computer systems.
- the communication interfaces implemented by the one or more interfaces may include wireline (e.g., serial cables, modems, network adaptors, etc.) and/or wireless (e.g., wireless modems, wireless network adaptors, etc.) communication technology.
- the computer system 100 may include an input device 112 coupled with the address/data bus 102, wherein the input device 112 is configured to communicate information and command selections to the processor 104.
- the input device 1 12 is an alphanumeric input device, such as a keyboard, that may include alphanumeric and/or function keys.
- the input device 112 may be an input device other than an alphanumeric input device.
- the computer system 100 may include a cursor control device 114 coupled with the address/data bus 102, wherein the cursor control device 114 is configured to communicate user input information and/or command selections to the processor 104.
- the cursor control device 114 is implemented using a device such as a mouse, a track-ball, a track-pad, an optical tracking device, or a touch screen.
- the cursor control device 114 is directed and/or activated via input from the input device 112, such as in response to the use of special keys and key sequence commands associated with the input device 112.
- the cursor control device 114 is configured to be directed or guided by voice commands.
- the computer system 100 further may include one or more optional computer usable data storage devices, such as a storage device 116, coupled with the address/data bus 102.
- the storage device 116 is configured to store information and/or computer executable instructions. In one aspect, the storage device 116 is a storage device such as a magnetic or optical disk drive (e.g., hard disk drive ("HDD"), floppy diskette, compact disk read-only memory ("CD-ROM"), digital versatile disk ("DVD")).
- a display device 118 is coupled with the address/data bus 102, wherein the display device 118 is configured to display video and/or graphics.
- the display device 118 may include a cathode ray tube ("CRT"), liquid crystal display ("LCD"), field emission display ("FED"), plasma display, or any other display device suitable for displaying video and/or graphic images and alphanumeric characters recognizable to a user.
- the computer system 100 presented herein is an example computing environment in accordance with an aspect.
- the non-limiting example of the computer system 100 is not strictly limited to being a computer system.
- an aspect provides that the computer system 100 represents a type of data processing analysis that may be used in accordance with various aspects described herein.
- other computing systems may also be implemented. Indeed, the spirit and scope of the present technology is not limited to any single data processing environment.
- one or more operations of various aspects of the present technology are controlled or implemented using computer-executable instructions, such as program modules, being executed by a computer.
- program modules include routines, programs, objects, components and/or data structures that are configured to perform particular tasks or implement particular abstract data types.
- an aspect provides that one or more aspects of the present technology are implemented by utilizing one or more distributed computing environments, such as where tasks are performed by remote processing devices that are linked through a communications network, or such as where various program modules are located in both local and remote computer-storage media including memory-storage devices.
- the computer program product is depicted as floppy disk 200 or an optical disk 202 such as a CD or DVD.
- the computer program product generally represents computer-readable instructions stored on any compatible non-transitory computer-readable medium.
- the term "instructions” as used with respect to this invention generally indicates a set of operations to be performed on a computer, and may represent pieces of a whole program or individual, separable, software modules.
- Non-limiting examples of "instruction" include computer program code (source or object code) and "hard-coded" electronics (i.e., computer operations coded into a computer chip).
- the "instruction" is stored on any non-transitory computer-readable medium, such as in the memory of a computer or on a floppy disk, a CD-ROM, and a flash drive. In either event, the instructions are encoded on a non- transitory computer-readable medium.
- Brain machine interfaces (BMI) and neural prosthetics offer great hope for restoring function to people with spinal cord injuries and amputations, as well as augmenting and enhancing the abilities of people with full motor function.
- There have been a number of advances in neural imaging as well as in decoding motor intent from neuroimaging data.
- in prosthetic device control there have been fewer advances, with most approaches rooted in conventional robotic control.
- the system described herein provides a system-level architecture for controlling robotic prosthetic devices given motor intent inferred from neuroimaging data.
- the system also provides a simulation architecture to facilitate virtual testing of the specific controller and prosthetic device design, with respect to the human subject, prior to any clinical testing.
- One controller is based on a bio-inspired task/posture decomposition of motion intended for implementation on conventional digital computing hardware.
- the other controller is based on neuromorphic mapping of task motion to joint motion with a bio-inspired resolution of kinematic redundancy, which can be implemented on neuromorphic computing hardware.
- a unique aspect is that both approaches utilize bio-inspired abstractions for controlling prosthetic devices in a natural human-like manner rather than utilizing conventional inverse kinematic approaches.
- human augmentation technologies can be implemented for enhancing the capabilities of the manufacturing floor worker. These technologies include assistive exoskeletons and other devices to increase worker productivity and reduce injury.
- the invention offers significant capabilities for the development of next generation BMIs, such as user performance enhancement, assistive robotic devices, and neural prosthetic technologies.
- This desired motion command is referred to herein as x_d, a multidimensional vector of task coordinates (e.g., Cartesian coordinates associated with motion of a single hand or coordinated motion of both hands).
- the goal is to control the prosthetic device based on this motor intent.
- inverse kinematics is conventionally used to map a desired Cartesian goal into a set of joint angles, q = f^-1(x_d).
- the prosthetic device is controlled by servoing individual joints or by using computed torque control.
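As a sketch of the conventional inverse-kinematics approach the text contrasts with task/posture control, the following resolves a Cartesian goal into joint angles for a toy two-link planar arm using damped-least-squares Jacobian steps. The arm model, link lengths, damping value, and function names are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def ik_step(q, x_goal, fk, jac, damping=0.05):
    """One damped-least-squares inverse-kinematics step toward x_goal.

    fk(q)  -> task coordinates (e.g., hand position)
    jac(q) -> task Jacobian J = dx/dq
    """
    J = jac(q)
    err = x_goal - fk(q)
    # Damped pseudoinverse step: dq = J^T (J J^T + lambda^2 I)^-1 err
    JJt = J @ J.T
    dq = J.T @ np.linalg.solve(JJt + damping**2 * np.eye(JJt.shape[0]), err)
    return q + dq

# Toy two-link planar arm with unit link lengths (purely illustrative).
def fk(q):
    return np.array([np.cos(q[0]) + np.cos(q[0] + q[1]),
                     np.sin(q[0]) + np.sin(q[0] + q[1])])

def jac(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-s1 - s12, -s12],
                     [ c1 + c12,  c12]])

q = np.array([0.3, 0.8])
goal = np.array([1.2, 0.9])
for _ in range(200):
    q = ik_step(q, goal, fk, jac)
```

Once the joint angles are found this way, the device would be servoed joint by joint, which is exactly the reliance on inverse-kinematics solutions that the task/posture decomposition avoids.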
- a human limb e.g., the human arm
- a task space representation of the dynamics is given by f = Λ(q)ẍ + μ(q, q̇) + p(q), where x is a description of task coordinates that are to be controlled, f are the control forces that act at the task, and Λ(q), μ(q, q̇), and p(q) are the task space mass matrix, centrifugal and Coriolis force vector, and gravity force vector, respectively.
- Task space control was described by Khatib in Literature Reference No. 10.
- control of the prosthetic device can be decomposed into a task-level component and a complementary postural component (due to the kinematic redundancy). Based on the choice of a particular generalized inverse of the Jacobian, these two components of motion control are guaranteed to be dynamically consistent with each other and, as such, control can be synthesized to execute the cortical motion command while simultaneously achieving some postural objective for the prosthetic device.
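A minimal numerical sketch of this dynamically consistent task/posture decomposition, following Khatib's operational space formulation (Literature Reference No. 10); the function name and matrix sizes are illustrative assumptions:

```python
import numpy as np

def task_posture_torque(M, J, f_task, tau_posture):
    """Dynamically consistent task/posture decomposition.

    M            joint-space mass matrix (n x n, symmetric positive definite)
    J            task Jacobian (m x n, m < n for a redundant limb)
    f_task       task-space force command (m,)
    tau_posture  any desired posture-space torque (n,)

    Returns Gamma = J^T f_task + N^T tau_posture, where
    N^T = I - J^T Jbar^T and Jbar = M^-1 J^T Lambda is the
    dynamically consistent generalized inverse of J.
    """
    Minv = np.linalg.inv(M)
    Lambda = np.linalg.inv(J @ Minv @ J.T)   # task-space mass matrix
    Jbar = Minv @ J.T @ Lambda               # dynamically consistent inverse
    N_T = np.eye(M.shape[0]) - J.T @ Jbar.T  # posture (null) space projector
    return J.T @ f_task + N_T @ tau_posture
```

The key property is dynamic consistency: the posture component N^T tau_posture produces zero task-space acceleration (J M^-1 N^T = 0), so the postural objective cannot disturb execution of the cortical motion command.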
- FIG. 3 depicts the system-level architecture associated with this approach. High-level motor intent from the brain 302 is decoded (with a cortical decoder 304) and a task-level motion command 306 is transmitted to the prosthetic device controller 308. High-level motor intent is captured using a neuroimaging device or any other suitable device operable for capturing neuroimaging data 300, non-limiting examples of which include neuroimaging equipment such as electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) systems.
- the cortical decoder 304 uses a decoding algorithm to infer spatial motion intent 306 from the neuroimaging data 300.
- this motor intent is in the form of desired motion commands represented in different Cartesian coordinate systems (eye-centered, hand-centered, etc.) using visuomotor transformations.
- the controller 308 executes the motion command using a task/posture decomposition. Relevant physiological criteria (i.e., postural criteria) or device power consumption criteria can be used to drive the postural component of motion 310, consistent with the cortical command. Sensory information 312 (e.g., joint angles of the current state of the prosthetic device 314) from the prosthetic device 314 is fed back to the controller 308, resulting in closed loop control. The architecture also accommodates sensory information 312 from the prosthetic device 314 being sent to the brain 302 to facilitate afferent proprioception.
- a potential can be specified representing some cost function.
- An example of suitable cost functions was described by De Sapio in Literature Reference No. 6.
- the task-level motion command can be controlled while simultaneously controlling motion in the complementary posture space to minimize the cost potential. This can be accomplished by applying gradient descent in the null/posture space.
- the control torque, which generates the desired task-space control while minimizing the instantaneous cost potential in the posture space, is then:
- a physiological cost criterion, like muscle effort, can be selectively represented.
- De Sapio, et al. (see Literature Reference No. 6), proposed the following muscle effort criterion related to minimizing muscle activation, where g(q) is the gravity torque, M(q) is the matrix of muscle moment arms, and a diagonal matrix maps muscle activation, a, to muscle force.
- the elements of f for a given muscle can be modeled by a force-length relation, where l is the muscle length, and f0 and l0 represent the maximum isometric muscle force and the optimal muscle fiber length, respectively.
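The active force-length relation can be sketched as follows. This is a generic Hill-type Gaussian curve, not the patent's specific model; the parameter values (f0, l0, and the shape factor gamma) are illustrative only.

```python
import numpy as np

def active_muscle_force(a, l, f0=1000.0, l0=0.1, gamma=0.45):
    """Active Hill-type fiber force: activation a scales the maximum
    isometric force f0 through a Gaussian force-length relation
    centered at the optimal fiber length l0 (values are hypothetical)."""
    return a * f0 * np.exp(-((l / l0 - 1.0) ** 2) / gamma)

# Force peaks at the optimal fiber length and falls off on either side.
assert active_muscle_force(1.0, 0.1) == 1000.0
assert active_muscle_force(0.5, 0.1) == 500.0
assert active_muscle_force(1.0, 0.12) < 1000.0
```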
- FIG. 4A illustrates a simplified biomechanical model of the human arm actuated by 14 muscles
- FIG. 4B is a table illustrating the maximum isometric forces, f0, and optimal fiber lengths, l0, for the biomechanical model of FIG. 4A.
- The control law is defined as,
- the control torque is defined such that a dissipative term and a gain are added on the gradient descent term in the posture-space portion of the control torque.
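The posture-space portion of such a control torque can be sketched as below. This is a schematic stand-in, not the patent's control law: the gains kp and kv are illustrative, the cost potential is a toy quadratic, and the projector N is replaced by the identity so the example stays self-contained.

```python
import numpy as np

def posture_torque(q, qdot, grad_U, N, kp=1.0, kv=0.5):
    """Posture-space control torque: gradient descent on a cost
    potential U plus a dissipative (velocity-damping) term, filtered
    through N^T so it cannot disturb the task-level command."""
    return -N.T @ (kp * grad_U(q) + kv * qdot)

# Toy quadratic potential U(q) = 0.5*|q|^2, so grad U(q) = q.
grad_U = lambda q: q
N = np.eye(3)   # stand-in projector (identity: no task constraint here)
tau_p = posture_torque(np.array([1.0, 0.0, -1.0]), np.zeros(3), grad_U, N)
assert np.allclose(tau_p, [-1.0, 0.0, 1.0])   # descends the potential
```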
- FIG. 5A illustrates a redundant muscle-actuated model of the human arm used as a physiological template for controlling a prosthetic arm
- FIGs. 5B, 5C, and 5D are graphs illustrating time histories of joint motion, hand motion, and muscle effort, respectively, for a simulation run using the model of FIG. 5A.
- the model of FIG. 5A depicts initial and final configurations (joint angles), q(t0) and q(tf), associated with gradient descent movement to a target, xf.
- the motion corresponds to gradient descent of the muscle effort, subject to the task requirement.
- the posture control seeks to reduce the muscle effort but is also constrained by the task requirement.
- the controller achieves the final target objective while the posture control simultaneously seeks to reduce the muscle effort (consistent with the task requirement). No compensation for the dynamics (except for gravity) was included in the control. Thus, there is no feedback linearization of the inertial terms present in the control. Normally, perfect feedback linearization would produce straight-line motion to the goal; in the absence of feedback linearization, non-straight-line motion results.
- the biomechanical model of FIG. 4A is used to generate a physiological criterion that is encoded in the prosthesis controller.
- the actuation system of the prosthetic arm need not be analogous to its biological muscle-driven counterpart. Rather, the biological muscle model defines a set of virtual muscles that direct the postural control of the prosthetic device in a bio-inspired manner, even if conventional robot actuators (e.g., DC motors, etc.) are used in the prosthetic device to control joint movements.
- the control architecture depicted in FIG. 3 is intended to be
- kinematic redundancy in the prosthetic device was resolved using a task/posture decomposition. In the case of motor map learning by an artificial spiking neural network, kinematic redundancy also needs to be addressed, since a differential movement in x would result in an infinite space of differential movements in q.
- FIGs. 6A and 6B depict artificial spiking neural networks with two different approaches for resolving kinematic redundancy. Both networks learn the inverse kinematic mapping using trajectory babbling 600, 611 of the prosthetic arm. Kinematic redundancy impacts the training such that for a given configuration (set of joint angles), q 603, and Cartesian displacement, Δx 601, there will be multiple values of configuration (joint) space displacement, Δq.
- this can be resolved by using a specific generalized inverse of J (i.e., the generalized inverse 602).
- the spiking neural network of FIG. 6A trains on the output of the trajectory babbling in Cartesian space 600.
- the signals q 603 and Δx 601 serve as the context for the training and the signal Δq 604 serves as the signal to be learned with respect to that context.
- Signal 603 represents the current joint angle configuration that is being sensed at any given time.
- the spikes representing these signals act as inputs to layer 1 605 of spiking neurons.
- the outputs of layer 1 605 act as spike inputs to layer 2 606.
- An additional layer 3 607 of neurons is employed on the context side (q 603 and Δx 601 input side).
- each neuron codes for a single combined representation of both the configuration, q 603, and Cartesian displacement, Δx 601 (see Literature Reference No. 14).
- the synaptic weights of the spiking neurons in that layer are reinforced consistent with learning the training signals 609.
- Spike timing-dependent plasticity (see Literature Reference Nos. 2 and 14) is utilized to modulate the synaptic weights in the layers. After sufficient training on motor babbling data, the spike timing-dependent plasticity can be turned off, as the neurons in layer 4 will have learned the differential kinematic mapping. Note that x(t+1) and q(t+1) denote a new value for a next time step.
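The weight modulation rule can be sketched as a standard pair-based STDP update. This is a generic textbook form, not the patent's specific rule; the amplitudes and time constant below are illustrative values.

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based spike-timing-dependent plasticity: potentiate when the
    presynaptic spike precedes the postsynaptic spike (dt > 0), depress
    otherwise, with exponentially decaying influence of the spike-time
    difference. Parameters are illustrative, not from the patent."""
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau)
    else:
        dw = -a_minus * np.exp(dt / tau)
    return np.clip(w + dw, 0.0, 1.0)   # keep the weight bounded

w = 0.5
w = stdp_update(w, dt=5.0)    # pre fires 5 ms before post: weight grows
assert w > 0.5
w = stdp_update(w, dt=-5.0)   # post fires before pre: weight shrinks
```

Turning learning off, as described above, simply means no longer applying this update once the mapping has been acquired.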
- the generalized inverse 602 can be chosen based on the minimization of some quadratic form. For example, to find a solution that minimizes |Δq|², the following solution can be used: Δq = J^T (J J^T)^(-1) Δx.
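This least-norm resolution of redundancy can be verified directly: of all joint displacements that produce the commanded Cartesian displacement, the pseudoinverse solution is the shortest. The Jacobian values below are hypothetical.

```python
import numpy as np

# Of all dq satisfying J dq = dx, dq = J^T (J J^T)^(-1) dx has minimal
# Euclidean norm (the Moore-Penrose least-norm solution).
J = np.array([[0.3, 0.5, 0.2],
              [0.1, 0.4, 0.6]])   # hypothetical 2x3 task Jacobian
dx = np.array([0.05, -0.02])

dq = J.T @ np.linalg.solve(J @ J.T, dx)
assert np.allclose(J @ dq, dx)    # reproduces the task-space motion

# Adding any null-space component n only makes the solution longer,
# since dq lies in the row space of J and is orthogonal to null(J).
n = np.array([1.0, -0.4, 0.1])
n -= J.T @ np.linalg.solve(J @ J.T, J @ n)    # project n into null(J)
assert np.linalg.norm(dq) <= np.linalg.norm(dq + n)
```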
- in FIG. 6B, the redundancy problem can also be resolved by adding a layer of neurons 612 that explicitly inhibits the learning of solutions where the cost (physiological or otherwise) is high.
- the structure of FIG. 6B is similar to FIG. 6A. However, rather than using a generalized inverse to compute a unique value of Δq to learn for a given configuration, q, and Cartesian displacement, Δx, forward kinematics 610 is used to compute the unique forward mapping from the configuration (joint) space motor babbling 611 to Cartesian space.
- a layer of neurons 612 is incorporated to inhibit the learning 613 of Δq signals with high cost values. In this way, training will be directed at learning Δq values associated with minimal cost values.
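The role of the inhibitory layer can be illustrated with a non-neural sketch: babble in joint space, compute the resulting Cartesian displacements with forward kinematics, and exclude high-cost samples from the training set. Everything here is a stand-in for illustration (a planar 2-link arm, a toy quadratic cost on Δq, and an arbitrary cost threshold), not the patent's network.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_kinematics(q, lengths=(0.3, 0.25)):
    """Planar 2-link forward kinematics (a stand-in for the arm)."""
    x = lengths[0]*np.cos(q[0]) + lengths[1]*np.cos(q[0] + q[1])
    y = lengths[0]*np.sin(q[0]) + lengths[1]*np.sin(q[0] + q[1])
    return np.array([x, y])

def babble_training_set(n=200, cost_threshold=0.005):
    """Joint-space motor babbling; samples whose (toy) cost exceeds the
    threshold are inhibited, i.e., excluded from the training set."""
    samples = []
    for _ in range(n):
        q = rng.uniform(0, np.pi/2, 2)          # random configuration
        dq = rng.normal(0, 0.05, 2)             # random joint displacement
        dx = forward_kinematics(q + dq) - forward_kinematics(q)
        cost = np.sum(dq**2)                    # toy effort-like cost
        if cost < cost_threshold:               # the inhibitory layer's role
            samples.append((q, dx, dq))
    return samples

train = babble_training_set()
# Only low-cost (q, dx) -> dq examples survive to be learned.
assert all(np.sum(dq**2) < 0.005 for _, _, dq in train)
```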
- FIG. 7 employs the trained networks that map Cartesian displacement, Δx 601, to configuration (joint) space displacement, Δq.
- a feedback loop 700 is specified in FIG. 7 to control the prosthetic arm.
- the overall system consists of output from a cortical decoder, which specifies intentional movement 306 of the arm in Cartesian space. This signal is encoded as spikes 706 by the neuromorphic spike encoder 702. These spikes, as well as joint angle measurements from the prosthetic arm 714 encoded as spikes 720, act as input to the neuromorphic motor mapper 708.
- a spiking output representing the learned mapping to the
- FIG. 7 depicts the system-level architecture associated with the approach depicted in FIG. 6B.
- the prosthetic device controller 308 (i.e., neuromorphic prosthesis controller) includes a neuromorphic spike encoder 702 (to represent the motion command 306 as a set of spikes 706), a neuromorphic motor mapper 708 (as illustrated in FIG. 6A and/or 6B) to map the neural spikes representing Cartesian displacements to neural spikes representing configuration (joint) space displacements, a spike decoder 710 to decode the neural spikes representing configuration (joint) space displacements and generate joint space commands, and a joint servo 712 to execute the joint space commands.
- the sensed joint angles are fed back to the joint servo 712 and to the motor mapper 708 (after spike encoding 720).
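The encode/decode endpoints of this pipeline can be sketched with simple rate coding. The patent does not specify the encoders at this level, so the scheme below (Bernoulli spike trains whose firing rate is proportional to the normalized value) is purely illustrative, as are the value range and train length.

```python
import numpy as np

def encode_rate(value, v_min, v_max, n_steps=1000, seed=0):
    """Encode a scalar as a Bernoulli spike train whose firing
    probability per step is the normalized value (a rate-coding sketch)."""
    p = (value - v_min) / (v_max - v_min)
    rng = np.random.default_rng(seed)
    return rng.random(n_steps) < p      # boolean spike train

def decode_rate(spikes, v_min, v_max):
    """Decode by mapping the mean firing rate back into the value range."""
    return v_min + spikes.mean() * (v_max - v_min)

angle = 0.8                              # hypothetical joint angle (rad)
spikes = encode_rate(angle, 0.0, np.pi)
recovered = decode_rate(spikes, 0.0, np.pi)
assert abs(recovered - angle) < 0.2      # close for long spike trains
```

A round trip through encoder and decoder recovers the value to within the statistical resolution of the spike train; longer trains give finer resolution.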
- Relevant physiological criteria or device power consumption criteria can be incorporated into the neuromorphic motor mapper (as illustrated in FIG. 6A and/or 6B) to drive the postural component of motion, consistent with the cortical command.
- the system can also be modified to allow human-in-the-loop virtual testing of bio-integration issues associated with the prosthetic hardware.
- data from neuroimaging equipment 802 (e.g., EEG, fMRI, etc.) attached to the subject is sent to a simulated prosthesis controller 804 (whereas the prosthesis controller 308 in FIG. 3 is controlling an actual piece of hardware) and a simulated sensorimotor controller 806.
- the simulated prosthesis controller 804 models the behavior of controller designs (and operates in a manner similar to the prosthesis controller 308 described and illustrated with respect to FIG. 3), while the sensorimotor controller 806 simulates the sensorimotor processing and integration of the central nervous system (CNS).
- the simulated neural excitations 808 output from this controller 806 drive a set of muscles in a musculoskeletal simulation 810, while the simulated actuator torques 812 from the prosthesis controller 804 drive the simulated prosthetic device.
- a single physics-based environment incorporates the musculoskeletal dynamics of the subject as well as the dynamics of the prosthetic device. This architecture allows a wealth of simulation-based testing to be performed.
- the overall efficacy of the prosthesis controller 804 can be tested, including stability and
- the architecture will also facilitate subject training with the prosthetic system and subject-specific tuning and customization of the system prior to any physical coupling of the subject with the prosthetic hardware.
- FIG. 9 depicts the neural and musculoskeletal physiology associated with the motor control problem.
- High-level motor intent 902 from the brain 900 is transmitted to the central nervous system (CNS) 904. Sensorimotor integration and control results in the low-level command of motor neurons 906 innervating individual muscles 908 (in the somatic nervous system 909) to generate coordinated musculoskeletal 910 movement.
- Afferent proprioceptive signals 912 transmit sensory data from mechanoreceptors in the joints and tendons back to the CNS 904.
- FIG. 10 depicts a system-level abstraction of FIG. 9.
- the sensorimotor controller block 806 is central to the sensorimotor controller described herein. It receives task-level command input (i.e., motor intent 902) from the human operator (or brain 900) and generates low-level neural excitations 808 for musculoskeletal simulations 1004. This is a closed-loop control process where sensory data (via proprioceptive signals 1006) is fed back to the sensorimotor controller 806, which provides continual updates to the neural excitations 808 driving the musculoskeletal simulation 1004.
- the sensorimotor controller 806 (simulator) operates by running musculoskeletal simulations on a musculoskeletal plant.
- the various aspects of the sensorimotor controller 806 are described in further detail below with respect to elements 1300, 1600, and 1700 as illustrated in FIGs. 13, 16, and 17, respectively.
- FIG. 11 provides an illustration of a musculotendon unit 1100.
- FIG. 12 is an illustration depicting a neuromuscular 1200 and musculoskeletal 1202 system feed-forward path.
- neural excitations 808 provide input to the activation dynamics 1206, and output 1208 of the activation dynamics 1206 provides input to the contraction dynamics 1210.
- output 1212 of the contraction dynamics 1210 provides input to the musculoskeletal 1202 dynamics through the joint torques.
- the musculoskeletal 1202 system dynamics are described by the following system of equations in configuration space.
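The equations themselves appeared as images in the original filing. A standard configuration-space form, consistent with the symbols used later in this section (the mass matrix M(q), the Coriolis/centrifugal vector b(q, q̇), and the gravity vector g(q)), is:

```latex
M(q)\,\ddot{q} + b(q,\dot{q}) + g(q) = \tau
```

where τ is the vector of generalized (joint) torques; the constrained version given further below augments the right-hand side with the constraint reaction term.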
- the neuromuscular 1200 dynamics describe the behavior of a set of r musculotendon actuators spanning the musculoskeletal system. These actuators are modeled as Hill-type active-state force generating units, as described by De Sapio et al. (see Literature Reference No. 18). It is assumed that the vector of r musculotendon lengths, l, can be uniquely determined from the system configuration, q. That is, l = l(q). As a consequence of this assumption, differential variations in l are given by,
- f_T is the vector of r tendon forces.
- the negative sign is due to the convention of taking contractile muscle forces as positive.
- the matrix of moment arms is denoted L. The behavior of the musculotendon actuators (neuromuscular
- Activation dynamics 1206 refers to the process of muscle activation in response to neural excitation. This process can be modeled by the following equation of state, written in terms of the vector of r muscle activations, a.
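A common form of this equation of state is first-order dynamics in which activation chases excitation, faster during activation than deactivation. The sketch below uses typical time constants from the biomechanics literature, not values from the patent, with simple Euler integration.

```python
import numpy as np

def activation_step(a, u, dt=0.001, tau_act=0.015, tau_deact=0.050):
    """One Euler step of first-order activation dynamics: activation a
    tracks neural excitation u with a shorter time constant when
    activating than deactivating (time constants are typical values)."""
    tau = tau_act if u > a else tau_deact
    return a + dt * (u - a) / tau

a, u = 0.0, 1.0
for _ in range(200):          # 200 ms of full excitation
    a = activation_step(a, u)
assert 0.99 < a <= 1.0        # activation has essentially saturated
```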
- the contraction dynamics 1210 of the musculotendon unit (illustrated as element 1100 in FIG. 11) can be modeled as the lumped parameter system.
- the lumped parameter system 1102 describes the configuration of forces.
- the relative angle associated with the muscle fibers 1110, α, is referred to as the pennation angle 1112.
- the muscle actuated task space equations of motion can be expressed as:
- a set of muscle activations, a, is determined, which minimizes the muscle effort.
- a set of neural excitations, u, is computed which causes the actual activations, a, from the forward neuromuscular simulation to track the desired activations, as follows:
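The excitation-level step of this computed muscle control scheme can be sketched by inverting first-order activation dynamics and adding proportional feedback so the actual activation tracks the desired one. The gain k, the time constant, and the single-muscle setting below are all illustrative assumptions, not the patent's formulation.

```python
import numpy as np

def excitation_for_tracking(a, a_des, a_des_dot, tau=0.02, k=50.0):
    """Compute excitation u so that, under a_dot = (u - a)/tau, the
    activation a tracks a_des: feed-forward inversion plus proportional
    feedback (gain k and time constant tau are illustrative)."""
    u = a + tau * (a_des_dot + k * (a_des - a))
    return np.clip(u, 0.0, 1.0)   # excitations are bounded in [0, 1]

# Closed-loop check: a converges to a constant desired activation.
a, a_des = 0.0, 0.6
dt, tau = 0.001, 0.02
for _ in range(1000):
    u = excitation_for_tracking(a, a_des, 0.0, tau=tau)
    a += dt * (u - a) / tau       # forward simulation of the dynamics
assert abs(a - a_des) < 1e-3
```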
- FIG. 13 is an illustration of a task-level sensorimotor controller 1300 according to the principles of the present invention, where motion commands are represented in task space.
- the sensorimotor controller 1300 generates optimal neural excitation commands 808 that minimize muscle activation 1302, and generate motion consistent with the input motion commands.
- holonomic constraints 1400 can be incorporated into the system.
- FIG. 14 illustrates a functional block diagram showing an abstracted representation of neural and musculoskeletal physiology.
- the sensorimotor controller 806 in this aspect is a goal-oriented neuromuscular feedback controller that drives the constrained 1400 musculoskeletal simulation by specifying low-level neural excitations 808 that execute task-level commands.
- the holonomic constraints, φ(q) = 0, can be incorporated into the musculoskeletal dynamics.
- the configuration space dynamics are augmented with Lagrange multipliers, λ: M(q)q̈ + b(q, q̇) + g(q) = τ + Φ^T λ, where Φ is the constraint Jacobian.
- the constrained musculoskeletal system is depicted in FIG. 15. Specifically,
- FIG. 15 is an illustration depicting a neuromuscular and constrained musculoskeletal system (feed-forward path), showing the musculoskeletal system dynamics 1500 augmented with a set of holonomic constraints 1502 and Lagrange multipliers 1504.
- the controller 1600 seeks a 1602, which minimizes
- constrained dynamics can be represented in task space as
- FIG. 17 is an illustration of a constrained task-level sensorimotor controller 1700 based on a reformulation of the computed muscle control approach.
- the system represents motion commands in task space and generates optimal neural excitation commands 808 that minimize muscle activation 1702, and generates motion consistent with the input motion commands in the presence of holonomic system constraints.
Landscapes
- Health & Medical Sciences (AREA)
- Cardiology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Transplantation (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Vascular Medicine (AREA)
- Life Sciences & Earth Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Prostheses (AREA)
Abstract
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361903538P | 2013-11-13 | 2013-11-13 | |
PCT/US2014/065537 WO2015073713A1 (fr) | 2013-11-13 | 2014-11-13 | Système de commande d'interfaces machine-cerveau et systèmes de prothèse nerveuse |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3068349A1 true EP3068349A1 (fr) | 2016-09-21 |
EP3068349A4 EP3068349A4 (fr) | 2017-12-13 |
Family
ID=53058028
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14861977.8A Pending EP3068349A4 (fr) | 2013-11-13 | 2014-11-13 | Système de commande d'interfaces machine-cerveau et systèmes de prothèse nerveuse |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3068349A4 (fr) |
CN (1) | CN105722479B (fr) |
WO (1) | WO2015073713A1 (fr) |
Families Citing this family (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10042422B2 (en) | 2013-11-12 | 2018-08-07 | Thalmic Labs Inc. | Systems, articles, and methods for capacitive electromyography sensors |
US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
WO2015081113A1 (fr) | 2013-11-27 | 2015-06-04 | Cezar Morun | Systèmes, articles et procédés pour capteurs d'électromyographie |
US9880632B2 (en) | 2014-06-19 | 2018-01-30 | Thalmic Labs Inc. | Systems, devices, and methods for gesture identification |
US10489986B2 (en) | 2018-01-25 | 2019-11-26 | Ctrl-Labs Corporation | User-controlled tuning of handstate representation model parameters |
US11331045B1 (en) | 2018-01-25 | 2022-05-17 | Facebook Technologies, Llc | Systems and methods for mitigating neuromuscular signal artifacts |
WO2018022658A1 (fr) | 2016-07-25 | 2018-02-01 | Ctrl-Labs Corporation | Système adaptatif permettant de dériver des signaux de commande à partir de mesures de l'activité neuromusculaire |
US11635736B2 (en) | 2017-10-19 | 2023-04-25 | Meta Platforms Technologies, Llc | Systems and methods for identifying biological structures associated with neuromuscular source signals |
CN110337269B (zh) | 2016-07-25 | 2021-09-21 | 脸谱科技有限责任公司 | 基于神经肌肉信号推断用户意图的方法和装置 |
US10496168B2 (en) | 2018-01-25 | 2019-12-03 | Ctrl-Labs Corporation | Calibration techniques for handstate representation modeling using neuromuscular signals |
EP3487395A4 (fr) | 2016-07-25 | 2020-03-04 | CTRL-Labs Corporation | Procédés et appareil permettant de prédire des informations de position musculo-squelettique à l'aide de capteurs autonomes portables |
WO2020112986A1 (fr) | 2018-11-27 | 2020-06-04 | Facebook Technologies, Inc. | Procédés et appareil d'auto-étalonnage d'un système de capteur à électrode vestimentaire |
US10772519B2 (en) | 2018-05-25 | 2020-09-15 | Facebook Technologies, Llc | Methods and apparatus for providing sub-muscular control |
US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
US11337652B2 (en) | 2016-07-25 | 2022-05-24 | Facebook Technologies, Llc | System and method for measuring the movements of articulated rigid bodies |
US10970936B2 (en) | 2018-10-05 | 2021-04-06 | Facebook Technologies, Llc | Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment |
US11907423B2 (en) | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
WO2019147949A1 (fr) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | Traitement en temps réel d'estimations de modèle de représentation d'état de main |
WO2019147928A1 (fr) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | Reconstruction d'état de main sur la base d'entrées multiples |
CN112074870A (zh) | 2018-01-25 | 2020-12-11 | 脸谱科技有限责任公司 | 重构的手部状态信息的可视化 |
US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
WO2019148002A1 (fr) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | Techniques d'anonymisation de données de signal neuromusculaire |
US11150730B1 (en) | 2019-04-30 | 2021-10-19 | Facebook Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
US11961494B1 (en) | 2019-03-29 | 2024-04-16 | Meta Platforms Technologies, Llc | Electromagnetic interference reduction in extended reality environments |
US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
US10592001B2 (en) | 2018-05-08 | 2020-03-17 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
WO2019231911A1 (fr) | 2018-05-29 | 2019-12-05 | Ctrl-Labs Corporation | Techniques de blindage pour la réduction du bruit dans la mesure de signal d'électromyographie de surface et systèmes et procédés associés |
WO2019241701A1 (fr) | 2018-06-14 | 2019-12-19 | Ctrl-Labs Corporation | Identification et authentification d'utilisateur au moyen de signatures neuromusculaires |
US11045137B2 (en) | 2018-07-19 | 2021-06-29 | Facebook Technologies, Llc | Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device |
CN112236741B (zh) * | 2018-07-31 | 2024-03-08 | 赫尔实验室有限公司 | 利用神经调节的增强型脑机接口的系统、方法和介质 |
WO2020036958A1 (fr) | 2018-08-13 | 2020-02-20 | Ctrl-Labs Corporation | Détection et identification de pointes en temps réel |
EP4241661A1 (fr) | 2018-08-31 | 2023-09-13 | Facebook Technologies, LLC | Interprétation de signaux neuromusculaires guidée par caméra |
WO2020061451A1 (fr) | 2018-09-20 | 2020-03-26 | Ctrl-Labs Corporation | Entrée de texte, écriture et dessin neuromusculaires dans des systèmes de réalité augmentée |
CN112771478A (zh) | 2018-09-26 | 2021-05-07 | 脸谱科技有限责任公司 | 对环境中的物理对象的神经肌肉控制 |
US10905383B2 (en) | 2019-02-28 | 2021-02-02 | Facebook Technologies, Llc | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces |
EP3994554B1 (fr) * | 2019-07-02 | 2024-08-14 | HRL Laboratories, LLC | Système et procédé de décodage continu d'états cérébraux en signaux de commande à degrés de liberté multiples dans des dispositifs mains libres |
US20210018896A1 (en) * | 2019-07-16 | 2021-01-21 | Carnegie Mellon University | Methods and Systems for Noninvasive Mind-Controlled Devices |
US12089953B1 (en) | 2019-12-04 | 2024-09-17 | Meta Platforms Technologies, Llc | Systems and methods for utilizing intrinsic current noise to measure interface impedances |
CN111568615A (zh) * | 2020-04-16 | 2020-08-25 | 南方科技大学 | 电动假肢系统和电动假肢控制方法 |
US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
CN115282431B (zh) * | 2022-07-06 | 2023-06-30 | 电子科技大学 | 一种基于脑电的光调控智能灯刺激方法及其装置 |
-
2014
- 2014-11-13 CN CN201480062290.0A patent/CN105722479B/zh active Active
- 2014-11-13 WO PCT/US2014/065537 patent/WO2015073713A1/fr active Application Filing
- 2014-11-13 EP EP14861977.8A patent/EP3068349A4/fr active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2015073713A1 (fr) | 2015-05-21 |
EP3068349A4 (fr) | 2017-12-13 |
CN105722479B (zh) | 2018-04-13 |
CN105722479A (zh) | 2016-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3068349A1 (fr) | Système de commande d'interfaces machine-cerveau et systèmes de prothèse nerveuse | |
US9566174B1 (en) | System for controlling brain machine interfaces and neural prosthetic systems | |
Zhu et al. | Robot learning from demonstration in robotic assembly: A survey | |
Mick et al. | Reachy, a 3D-printed human-like robotic arm as a testbed for human-robot control strategies | |
Caggiano et al. | MyoSuite--A contact-rich simulation suite for musculoskeletal motor control | |
Calinon | Learning from demonstration (programming by demonstration) | |
Kang et al. | Design, modeling and control of a pneumatically actuated manipulator inspired by biological continuum structures | |
Mataric | Getting humanoids to move and imitate | |
Kane et al. | The use of Kane's dynamical equations in robotics | |
Calinon et al. | Active teaching in robot programming by demonstration | |
Nguiadem et al. | Motion planning of upper-limb exoskeleton robots: a review | |
De Sapio et al. | Simulating the task-level control of human motion: a methodology and framework for implementation | |
US10899017B1 (en) | System for co-adaptation of robot control to human biomechanics | |
Roller et al. | Optimal control of a biomechanical multibody model for the dynamic simulation of working tasks | |
Ingram et al. | Modelling of the human shoulder as a parallel mechanism without constraints | |
Trivedi et al. | Biomimetic approaches for human arm motion generation: literature review and future directions | |
De Sapio | An approach for goal-oriented neuromuscular control of digital humans in physics-based simulations | |
Liu et al. | Computational modeling: Human dynamic model | |
Bong et al. | Standing balance control of a bipedal robot based on behavior cloning | |
Heremans et al. | Bio-inspired balance controller for a humanoid robot | |
US10409928B1 (en) | Goal oriented sensorimotor controller for controlling musculoskeletal simulations with neural excitation commands | |
Schearer et al. | Identifying inverse human arm dynamics using a robotic testbed | |
Liao et al. | Multi-muscle FES control of the human arm for interaction tasks—stabilizing with muscle co-contraction and postural adjustment: a simulation study | |
Fischer et al. | From bot to bot: Using a chat bot to synthesize robot motion | |
Covaciu et al. | VR interface for cooperative robots applied in dynamic environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20160609 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20171115 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61F 2/70 20060101ALI20171109BHEP Ipc: A61F 2/68 20060101AFI20171109BHEP Ipc: A61F 2/76 20060101ALI20171109BHEP Ipc: A61F 2/72 20060101ALI20171109BHEP Ipc: G06N 3/04 20060101ALI20171109BHEP |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20191018 |
|
GRAJ | Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
INTC | Intention to grant announced (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20201109 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230525 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20231211 |
|
GRAJ | Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTC | Intention to grant announced (deleted) | ||
INTG | Intention to grant announced |
Effective date: 20240517 |
|
GRAJ | Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |