EP3836836A1 - Real-time spike detection and identification - Google Patents
Info
- Publication number
- EP3836836A1 (application EP19850130.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- spike
- computer
- neuromuscular
- spike event
- detected
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A61B5/1126 — Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb, using a particular sensing technique
- A61B5/004 — Features or image-related aspects of imaging apparatus adapted for image acquisition of a particular organ or body part
- A61B5/0077 — Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/1128 — Measuring movement of the entire body or parts thereof using image analysis
- A61B5/296 — Bioelectric electrodes specially adapted for electromyography [EMG]
- A61B5/397 — Analysis of electromyograms
- A61B5/486 — Bio-feedback
- A61B5/6824 — Detecting, measuring or recording means specially adapted to be attached to or worn on the arm or wrist
- A61B5/7225 — Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
- A61B5/7267 — Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
- A61B5/7282 — Event detection, e.g. detecting unique waveforms indicative of a medical condition
- A61B5/388 — Nerve conduction study, e.g. detecting action potential of peripheral nerves
- A61B2562/0219 — Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014 — Hand-worn input/output arrangements, e.g. data gloves
- G06F3/015 — Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06N20/20 — Ensemble learning
- G06N3/044 — Recurrent networks, e.g. Hopfield networks
- G06N3/045 — Combinations of networks
- G06N3/08 — Learning methods
- G16H50/70 — ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
Definitions
- Neuromuscular signals arising from the human central nervous system may reflect neural activation that results in the contraction of one or more muscles in the human body.
- Neuromuscular sensors, an example of which is the electromyography (EMG) sensor, are placed on the surface of the human body and record the neuromuscular activity produced when skeletal muscle cells are activated.
- the neuromuscular activity measured by neuromuscular sensors may result from neural activation, muscle excitation, muscle contraction, or a combination of the neural activation and muscle contraction.
- Signals recorded by neuromuscular sensors are routinely used to assess neuromuscular dysfunction in patients with motor control disorders and have been used in some applications as control signals for devices such as prosthetic limbs.
- Coordinated movements of skeletal muscles in the human body that collectively result in the performance of a motor task originate with neural signals arising in the central nervous system.
- the neural signals travel from the central nervous system to muscles via spinal motor neurons, each of which has a cell body in the spinal cord and axon terminals on one or more muscle fibers.
- A spinal motor neuron and the muscle fiber(s) it innervates are collectively referred to as a "motor unit."
- Muscles typically include muscle fibers from hundreds of motor units and simultaneous contraction of muscle fibers in multiple motor units is usually required for muscle contraction that results in movement of a skeletal segment and/or a force to be exerted by a part of the body.
- Muscles exhibit a characteristic pattern of motor unit recruitment in which motor units are activated in sequence, where the number of motor units activated depends on a strength of a desired muscle contraction.
- The electrical activity resulting from activation of a motor unit is referred to as a motor unit action potential (MUAP).
- Neuromuscular sensors such as electromyography (EMG) sensors record electrochemical signals that result in motor activity, such as contraction of a muscle.
- the biological signals recorded relate to the generation of MUAPs in muscle fibers of a motor unit.
- a MUAP only occurs when the corresponding motor unit is triggered by its motor neuron.
- Some embodiments are directed to analyzing neuromuscular signals to identify spike events in a motor neuron of a motor unit that results in the generation of MUAPs in the muscle fibers of the motor unit.
- Control signals determined based on one or more identified spike events may be used in some embodiments to control the operation of a device.
- Some embodiments are directed to a computerized system.
- the computerized system comprises a plurality of neuromuscular sensors configured to record a plurality of neuromuscular signals from a user, wherein the plurality of neuromuscular sensors are arranged on one or more wearable devices, and at least one computer processor.
- the at least one computer processor is programmed to detect, based on the plurality of neuromuscular signals or information derived from the plurality of neuromuscular signals, at least one spike event corresponding to firing of an action potential in at least one motor unit, determine, based on the plurality of neuromuscular signals or the information derived from the plurality of neuromuscular signals, a biological source of the detected at least one spike event, and generate at least one output based, at least in part, on the detected at least one spike event and/or the determined biological source of the detected at least one spike event.
- The at least one computer processor is further programmed to apply one or more criteria for selecting a best biological source from a plurality of biological sources associated with respective detected spike events, select at least one best spike event associated with the best biological source, and subtract a detected waveform for the at least one best spike event from at least one of the neuromuscular signals, thereby generating a residual neuromuscular signal.
- at least one computer processor is further programmed to perform an iterative process for processing detected spike events until no biological source is present within the residual signal that meets a minimum threshold for selection as a biological source for spike events.
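The iterative "peeling" process described above — select the best-matching source, subtract its detected waveform, and repeat on the residual until no source meets the minimum threshold — can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the per-source template dictionary and the normalized-correlation scoring are assumptions.

```python
import numpy as np

def peel_spikes(signal, templates, threshold=0.5):
    """Iteratively detect and subtract spike templates ("peeling").

    Sketch only: `templates` maps a hypothetical source name to its
    waveform; matches are scored by normalized cross-correlation and
    peeling stops when no source meets `threshold`.
    """
    residual = signal.copy()
    events = []
    while True:
        best = None
        for source, tpl in templates.items():
            # correlate the template against the residual signal
            corr = np.correlate(residual, tpl, mode="valid")
            # windowed signal energy, for normalization
            energy = np.convolve(residual ** 2, np.ones(len(tpl)), mode="valid")
            norm = np.linalg.norm(tpl) * np.sqrt(energy)
            score = corr / np.maximum(norm, 1e-12)
            i = int(np.argmax(score))
            if best is None or score[i] > best[0]:
                best = (score[i], source, i)
        score, source, i = best
        if score < threshold:  # no biological source meets the threshold
            break
        residual[i:i + len(templates[source])] -= templates[source]
        events.append((source, i))
    return events, residual
```

With an exact template planted at two positions, the loop recovers both events and leaves a near-zero residual.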
- the at least one computer processor is further programmed to group detected spike events into a muscle-specific group based on co-activations and sort spike events within the muscle-specific group to approximate a recruitment curve.
- The at least one computer processor is further programmed to apply at least one filter to a time-lagged representation of the plurality of neuromuscular signals, and wherein detecting the at least one spike event and determining the biological source of the detected at least one spike event is performed based on the filtered time-lagged representation of the plurality of neuromuscular signals.
- applying at least one filter to a time-lagged representation of the plurality of neuromuscular signals comprises using a beamforming process to apply a plurality of beamforming filters to the time-lagged representation of the plurality of neuromuscular signals, wherein the plurality of beamforming filters are filters generated based on spatiotemporal patterns of one or more spike events.
- the beamforming process comprises using a minimum variance distortionless response technique.
- the beamforming process comprises using a linear constrained minimum variance technique.
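For reference, the minimum variance distortionless response (MVDR) beamformer named above has a standard closed form: given a channel covariance matrix R and a source's spatiotemporal pattern a, the weights minimize output variance subject to the distortionless constraint wᵀa = 1. This is the textbook formula, not the patent's specific filter design:

```python
import numpy as np

def mvdr_weights(R, a):
    """MVDR filter weights: w = R^-1 a / (a^T R^-1 a).

    R is the covariance of the (time-lagged) channel data and `a` is
    the spatiotemporal pattern of a spike source (both assumed given).
    """
    Rinv_a = np.linalg.solve(R, a)   # R^-1 a without explicit inversion
    return Rinv_a / (a @ Rinv_a)     # enforce w @ a == 1
```

Applying `w` to the multichannel data passes the target source with unit gain while suppressing variance from other sources.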
- the at least one computer processor is further programmed to determine the spatiotemporal patterns of the one or more spike events corresponding to the plurality of beamforming filters.
- determining the spatiotemporal patterns of the one or more spike events corresponding to the plurality of beamforming filters comprises detecting a plurality of spike events in recorded neuromuscular signals, clustering the detected plurality of spike events, and determining the spatiotemporal patterns based on the clusters of spike events.
- detecting a plurality of spike events comprises detecting, within the plurality of neuromuscular signals, periods of low activity, and detecting, within the periods of low activity, putative spike events.
- detecting the plurality of spike events further comprises analyzing the detected putative spike events to discard spike events having one or more particular characteristics.
- the one or more particular characteristics include a duration longer than a particular threshold duration.
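One plausible reading of the detection steps above — find candidate events in the signal and discard candidates whose duration exceeds a threshold — is sketched below. The amplitude threshold, the run-length definition of an event, and the parameter names are assumptions for illustration.

```python
import numpy as np

def putative_spikes(x, fs, amp_thresh, max_dur_s):
    """Detect putative spike events, discarding overly long ones.

    An "event" here is a contiguous run of samples whose magnitude
    exceeds `amp_thresh`; runs longer than `max_dur_s` seconds are
    discarded, mirroring the duration criterion in the claims.
    Returns (start, end) sample indices, end exclusive.
    """
    above = np.abs(x) > amp_thresh
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if above[0]:                       # run begins at the first sample
        starts = np.r_[0, starts]
    if above[-1]:                      # run extends to the last sample
        ends = np.r_[ends, len(x)]
    return [(int(s), int(e)) for s, e in zip(starts, ends)
            if (e - s) / fs <= max_dur_s]
```

A 3 ms pulse survives a 10 ms duration cutoff while a 40 ms burst is rejected.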
- the at least one computer processor is further programmed to detect the at least one spike event and/or determine the biological source of the detected at least one spike event using one or more neural networks.
- the one or more neural networks includes a convolutional neural network.
- the one or more neural networks includes a recurrent neural network.
- the at least one computer processor is further programmed to detect the at least one spike event and determine the biological source of the detected at least one spike event using a multi-step iterative technique to decompose a time-lagged representation of the plurality of neuromuscular signals into signal components corresponding to at least one biological source, and detecting the at least one spike event from the at least one biological source.
- the multi-step iterative technique comprises matrix factorization.
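As one concrete instance of decomposition by matrix factorization, non-negative matrix factorization (NMF) with multiplicative updates splits a non-negative (e.g. rectified, time-lagged) signal matrix V into W @ H, where the columns of W can be read as per-source spatial patterns and the rows of H as their activations over time. The patent does not specify NMF; this is a hedged example of the general technique:

```python
import numpy as np

def nmf(V, k, iters=300, seed=0):
    """Plain multiplicative-update NMF: V ≈ W @ H, all non-negative.

    V: (channels x time) non-negative matrix; k: number of components
    (putative sources). Updates preserve non-negativity by construction.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        # alternate multiplicative updates; 1e-12 guards division by zero
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H
```

On a matrix that is exactly rank-k and non-negative, the reconstruction error drops to a small fraction of the signal norm.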
- generating at least one output comprises generating compressed data including an indication of the at least one spike event.
- the indication of the at least one spike event is provided as a control signal to a computer-based system.
- the indication of the at least one spike event is provided as at least one of a group comprising a discrete control signal, a continuous control signal, and a composite control signal.
- generating at least one output comprises generating an indication of the at least one spike event.
- the indication of the at least one spike event includes an indication of a biological source of the at least one spike event and a time of occurrence of the at least one spike event.
- the at least one computer processor is programmed to provide feedback to the user responsive to the at least one output based, at least in part, on the detected at least one spike event and/or the determined biological source of the detected at least one spike event.
- the at least one computer processor is programmed to provide feedback to the user as part of a user training process.
- the feedback includes at least one of a group comprising auditory, visual, haptic, and multi-sensory feedback.
- the system further comprises an inertial sensor configured to determine movement artifacts or shifts in spatial location of muscle fibers of the at least one motor unit relative to one or more of the plurality of neuromuscular sensors.
- the at least one computer processor is programmed to filter or refine an output of an inferential model responsive to an output of the inertial sensor.
- the at least one computer processor is further programmed to transmit the compressed data including an indication of the at least one spike event over one or more wireless networks to an external device.
- the system further comprises at least one storage device, and at least one computer processor is further programmed to store the compressed data on the at least one storage device.
- generating at least one output comprises generating an updated computerized musculoskeletal representation comprising a plurality of rigid body segments connected by joints, wherein generating the updated computerized musculoskeletal representation comprises determining based, at least in part, on the detected at least one spike event and/or the identified biological source of the detected at least one spike event, musculoskeletal position information describing a spatial relationship between two or more connected segments of the plurality of rigid body segments of the computerized musculoskeletal representation and/or force information describing a force between two or more segments of the plurality of rigid body segments of the computerized musculoskeletal representation, and updating the computerized musculoskeletal representation based, at least in part, on the musculoskeletal position information and/or the force information.
- determining the musculoskeletal position information and/or the force information comprises providing as input to a trained inferential model, the detected at least one spike event and/or the identified biological source of the detected at least one spike event, and the musculoskeletal position information and/or the force information is determined based, at least in part, on an output of the trained inferential model.
- generating at least one output comprises generating in substantially real-time, at least one control signal for controlling at least one device.
- detecting at least one spike event comprises detecting a spatiotemporal pattern of the at least one spike event.
- generating at least one control signal comprises generating the at least one control signal based, at least in part, on at least one characteristic of the detected spatiotemporal pattern of the at least one spike event.
- the at least one characteristic comprises a rate of the at least one spike event and/or a spatial distribution of the detected spatiotemporal pattern of the at least one spike event.
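A simple way to turn detected spike events into the rate-based continuous control value described above is a sliding-window count. The window length and function name below are illustrative assumptions, not the patent's parameters:

```python
import numpy as np

def spike_rate(spike_times_s, window_s=0.2, t_now=None):
    """Estimate a firing rate (Hz) from spike times in a sliding window.

    Counts the spikes in the `window_s` seconds up to `t_now` (default:
    the latest spike) and divides by the window length, yielding a
    continuously varying control value.
    """
    t = np.asarray(spike_times_s, dtype=float)
    if t_now is None:
        t_now = t.max() if t.size else 0.0
    recent = t[(t > t_now - window_s) & (t <= t_now)]
    return len(recent) / window_s
```

Four spikes within a 0.2 s window yield a 20 Hz rate, which could be mapped to, e.g., a cursor velocity.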
- the identified biological source comprises a motor unit.
- the identified biological source comprises a plurality of motor units.
- the identified biological source comprises a muscle.
- the identified biological source comprises a plurality of muscles.
- determining the biological source of the at least one spike event comprises determining that the at least one spike event is associated with a motor unit or group of motor units, at least one computer processor is further programmed to determine a muscle to which the motor unit or group of motor units belongs, and wherein generating at least one output comprises generating the at least one output based on the determined muscle to which the motor unit or group of motor units belongs.
- the determined muscle is associated with a motor unit recruitment sequence describing a sequence of activation of motor units for the determined muscle, and wherein the at least one computer processor is further programmed to determine where the motor unit or group of motor units falls within the motor unit recruitment sequence of the determined muscle.
- the system further comprises at least one auxiliary sensor configured to record a plurality of auxiliary signals, and wherein the at least one computer processor is further programmed to generate the at least one output based, at least in part, on the plurality of auxiliary signals.
- the at least one auxiliary sensor comprises at least one inertial measurement unit (IMU) sensor configured to record a plurality of IMU signals, and wherein the at least one computer processor is further programmed to generate the at least one output based, at least in part, on the plurality of IMU signals and/or information derived from the plurality of IMU signals.
- the at least one auxiliary sensor comprises at least one camera configured to record one or more images, and wherein the at least one computer processor is further programmed to generate the at least one output based, at least in part, on the one or more images and/or information derived from the one or more images.
- detecting the at least one spike event is further based on the one or more images and/or the information derived from the one or more images.
- the at least one computer processor is included as a portion of a device separate from and in communication with the plurality of neuromuscular sensors arranged on the one or more wearable devices, and wherein the plurality of neuromuscular sensors are configured to wirelessly stream, in substantially real-time, the plurality of neuromuscular signals and/or the information derived from the plurality of neuromuscular signals to the at least one computer processor.
- the device separate from and in communication with the plurality of neuromuscular sensors is a device selected from the group consisting of a remote server, a desktop computer, a laptop computer, a smartphone, and a wearable electronic device.
- the wearable electronic device is a smartwatch, a health monitoring device, smart glasses, or an augmented reality system.
- at least one computer processor is integrated with the one or more wearable devices on which the plurality of neuromuscular sensors are arranged.
- At least one computer processor comprises at least one first computer processor included as a portion of a device separate from and in communication with the plurality of neuromuscular sensors arranged on the one or more wearable devices and at least one second computer processor integrated with the one or more wearable devices on which the plurality of neuromuscular sensors are arranged.
- the plurality of neuromuscular sensors are configured to transmit at least some of the plurality of neuromuscular signals to the at least one first computer processor, wherein the at least one first computer processor is programmed to train, based on the at least some of the plurality of neuromuscular signals transmitted from the plurality of neuromuscular sensors, at least one spike detection model and/or at least one spike identification model, transmit the trained at least one spike detection model and/or the at least one spike identification model to the at least one second computer processor, and wherein the at least one second computer processor is programmed to detect the at least one spike event and determine the biological source of the detected at least one spike event using the at least one spike detection model and/or the at least one spike identification model transmitted from the at least one first computer processor.
- the at least one spike detection model and/or at least one spike identification model are trained to estimate at least one of a group comprising whether the user is activating a particular motor unit, whether the user is activating a particular motor unit with a particular timing, and whether the user is activating a particular combination of motor units.
- detecting at least one spike event corresponding to firing of an action potential in at least one motor unit comprises detecting at least one spike event corresponding to firing of an action potential in a plurality of motor units.
- at least one computer processor is further programmed to threshold the filtered time-lagged representation of the plurality of neuromuscular signals to detect the at least one spike event.
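As an illustration of thresholding a filtered time-lagged representation, a minimal NumPy sketch follows (not part of the claimed subject matter; the matched-filter choice, function names, and synthetic data are assumptions):

```python
import numpy as np

def time_lagged(x, n_lags):
    """Stack delayed copies of a multichannel signal.

    x: (n_channels, n_samples). Returns (n_channels * n_lags,
    n_samples - n_lags + 1), so each column holds a short window of history.
    """
    n_ch, n_samp = x.shape
    cols = n_samp - n_lags + 1
    out = np.empty((n_ch * n_lags, cols))
    for lag in range(n_lags):
        out[lag * n_ch:(lag + 1) * n_ch] = x[:, lag:lag + cols]
    return out

def detect_spikes(x, template, threshold):
    """Apply a matched filter (a flattened spatiotemporal MUAP template) to
    the time-lagged signal and threshold the output to locate spike events."""
    lagged = time_lagged(x, template.shape[1])
    w = template.T.ravel()  # ordering matches the lag-block layout of `lagged`
    scores = w @ lagged
    return np.flatnonzero(scores > threshold)
```

Here a spike is reported wherever the filter output exceeds the threshold; a practical decoder would additionally enforce a refractory period and calibrate the threshold per motor unit.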
- a computer-implemented method of detecting spike events in neuromuscular data comprises receiving a plurality of neuromuscular signals from a plurality of neuromuscular sensors arranged on one or more wearable devices worn by a user, detecting, based on the plurality of neuromuscular signals or information derived from the plurality of neuromuscular signals, at least one spike event corresponding to firing of an action potential in at least one motor unit, determining, based on the plurality of neuromuscular signals or the information derived from the plurality of neuromuscular signals, a biological source of the detected at least one spike event, and generating at least one output based, at least in part, on the detected at least one spike event and/or the determined biological source of the detected at least one spike event.
- the method further comprises applying one or more criteria for selecting a best biological source from a plurality of biological sources associated with respective detected spike events, selecting at least one best spike event associated with the best biological source, subtracting a detected waveform for the at least one best spike event from at least one of the neuromuscular signals, and generating a residual neuromuscular signal.
- the method further comprises performing an iterative process for processing detected spike events until no biological source is present within the residual signal that meets a minimum threshold for selection as a biological source for spike events.
- the method further comprises grouping detected spike events into a muscle-specific group based on co-activations and sorting spike events within the muscle-specific group to approximate a recruitment curve.
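A hedged sketch of grouping units by co-activation and sorting them toward a recruitment curve (illustrative only; the overlap measure, thresholds, and greedy grouping strategy are assumptions rather than the disclosed method):

```python
import numpy as np

def bin_trains(spike_times, n_bins, t_max):
    """Convert per-unit spike-time lists to binary activity vectors."""
    trains = np.zeros((len(spike_times), n_bins))
    for i, times in enumerate(spike_times):
        idx = (np.asarray(times) / t_max * n_bins).astype(int)
        trains[i, np.clip(idx, 0, n_bins - 1)] = 1
    return trains

def group_and_sort(spike_times, n_bins=50, t_max=10.0, co_thresh=0.3):
    """Greedily group units whose binned activity overlaps (co-activation),
    then sort each group by first spike time to approximate recruitment order
    during a slow effort ramp."""
    trains = bin_trains(spike_times, n_bins, t_max)
    unassigned = list(range(len(spike_times)))
    groups = []
    while unassigned:
        seed = unassigned.pop(0)
        group = [seed]
        for u in unassigned[:]:
            overlap = np.sum(trains[seed] * trains[u])
            denom = min(trains[seed].sum(), trains[u].sum())
            if denom and overlap / denom >= co_thresh:
                group.append(u)
                unassigned.remove(u)
        # earlier-recruited units fire first during a ramp
        group.sort(key=lambda u: min(spike_times[u]))
        groups.append(group)
    return groups
```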
- the method further comprises applying at least one filter to a time-lagged representation of the plurality of neuromuscular signals, and wherein detecting the at least one spike event and determining the biological source of the detected at least one spike event is performed based on the filtered time-lagged representation of the plurality of neuromuscular signals.
- applying at least one filter to a time-lagged representation of the plurality of neuromuscular signals comprises using a beamforming process to apply a plurality of beamforming filters to the time-lagged representation of the plurality of neuromuscular signals, wherein the plurality of beamforming filters are generated based on spatiotemporal patterns of one or more spike events.
- the beamforming process comprises using a minimum variance distortionless response technique.
- the beamforming process comprises using a linear constrained minimum variance technique.
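A minimal MVDR sketch (an assumption-laden illustration, not the disclosed implementation): given a flattened spatiotemporal MUAP template as the steering vector, the weights pass that pattern with unit gain while minimizing total output variance:

```python
import numpy as np

def mvdr_filter(lagged, steering):
    """Minimum variance distortionless response weights for one motor unit.

    lagged:   (d, T) time-lagged neuromuscular data
    steering: (d,) flattened spatiotemporal MUAP template
    Returns weights w satisfying w @ steering == 1 that minimize the
    variance of the filter output w @ lagged.
    """
    d = lagged.shape[0]
    R = lagged @ lagged.T / lagged.shape[1]   # sample covariance
    R += 1e-6 * np.eye(d)                     # diagonal loading for stability
    Rinv_s = np.linalg.solve(R, steering)
    return Rinv_s / (steering @ Rinv_s)
```

The unit-gain constraint is the "distortionless" part of MVDR; diagonal loading keeps the covariance invertible on short records.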
- the method further comprises determining the spatiotemporal patterns of the one or more spike events corresponding to the plurality of beamforming filters.
- determining the spatiotemporal patterns of the one or more spike events corresponding to the plurality of beamforming filters comprises detecting a plurality of spike events in recorded neuromuscular signals, clustering the detected plurality of spike events, and determining the spatiotemporal patterns based on the clusters of spike events.
- detecting a plurality of spike events comprises detecting within the plurality of neuromuscular signals, periods of low activity; and detecting within the periods of low activity, putative spike events.
- detecting the plurality of spike events further comprises analyzing the detected putative spike events to discard spike events having one or more particular characteristics.
- the one or more particular characteristics include a duration longer than a particular threshold duration.
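The low-activity detection and duration-based discarding described above might be sketched as follows (single-channel, with assumed window length, thresholds, and names):

```python
import numpy as np

def putative_spikes(x, quiet_thresh, spike_thresh, max_duration):
    """Find putative spike events during low-activity periods.

    x: single-channel EMG (1-D array). Low activity is defined by a moving
    RMS below `quiet_thresh`; within those periods, excursions above
    `spike_thresh` are putative spikes, and events lasting longer than
    `max_duration` samples are discarded as likely artifacts.
    """
    win = 25
    rms = np.sqrt(np.convolve(x ** 2, np.ones(win) / win, mode="same"))
    quiet = rms < quiet_thresh
    above = (np.abs(x) > spike_thresh) & quiet
    # contiguous runs of supra-threshold samples become candidate events
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    return [(s, e) for s, e in zip(starts, ends) if e - s <= max_duration]
```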
- the method further comprises detecting the at least one spike event and/or determining the biological source of the detected at least one spike event using one or more neural networks.
- the one or more neural networks includes a convolutional neural network.
- the one or more neural networks includes a recurrent neural network.
- the method further comprises detecting the at least one spike event and determining the biological source of the detected at least one spike event using a multi-step iterative technique to decompose a time-lagged representation of the plurality of neuromuscular signals into signal components corresponding to at least one biological source, and detecting the at least one spike event from the at least one biological source.
- the multi-step iterative technique comprises matrix factorization.
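As one concrete instance of matrix factorization for source decomposition, here is a small nonnegative matrix factorization with multiplicative updates (an illustrative stand-in; the disclosure does not specify this particular factorization):

```python
import numpy as np

def nmf(V, k, n_iter=200, seed=0):
    """Factor a nonnegative matrix V (d x T) into W (d x k) and H (k x T)
    using multiplicative updates, so each column of W is a putative source
    signature and each row of H is that source's activation over time."""
    rng = np.random.default_rng(seed)
    d, T = V.shape
    W = rng.random((d, k)) + 0.1
    H = rng.random((k, T)) + 0.1
    eps = 1e-9
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Applied to a rectified time-lagged representation, the rows of H could be searched for spike events attributable to each recovered source.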
- generating at least one output comprises generating compressed data including an indication of the at least one spike event.
- the indication of the at least one spike event is provided as a control signal to a computer-based system.
- the indication of the at least one spike event is provided as at least one of a group comprising a discrete control signal, a continuous control signal, and a composite control signal.
- generating at least one output comprises generating an indication of the at least one spike event.
- the indication of the at least one spike event includes an indication of a biological source of the at least one spike event and a time of occurrence of the at least one spike event.
- the method further comprises providing feedback to the user responsive to the at least one output based, at least in part, on the detected at least one spike event and/or the determined biological source of the detected at least one spike event.
- the method further comprises providing feedback to the user as part of a user training process.
- the feedback includes at least one of a group comprising auditory, visual, haptic, and multi-sensory feedback.
- the wearable device further comprises an inertial sensor, and wherein the method further comprises determining, using the inertial sensor, movement artifacts or shifts in spatial location of muscle fibers of the at least one motor unit relative to one or more of the plurality of neuromuscular sensors.
- the method further comprising filtering or refining an output of an inferential model responsive to an output of the inertial sensor.
- the method further comprises transmitting the compressed data including an indication of the at least one spike event over one or more wireless networks to an external device.
- at least one of the one or more wearable devices includes at least one storage device, and wherein the method further comprises storing the compressed data on the at least one storage device.
- generating at least one output comprises generating an updated computerized musculoskeletal representation comprising a plurality of rigid body segments connected by joints, wherein generating the updated computerized musculoskeletal representation comprises determining based, at least in part, on the detected at least one spike event and/or the identified biological source of the detected at least one spike event, musculoskeletal position information describing a spatial relationship between two or more connected segments of the plurality of rigid body segments of the computerized musculoskeletal representation and/or force information describing a force between two or more segments of the plurality of rigid body segments of the computerized musculoskeletal representation, and updating the computerized musculoskeletal representation based, at least in part, on the musculoskeletal position information and/or the force information.
- determining the musculoskeletal position information and/or the force information comprises providing as input to a trained inferential model, the detected at least one spike event and/or the identified biological source of the detected at least one spike event, and wherein the musculoskeletal position information and/or the force information is determined based, at least in part, on an output of the trained inferential model.
- generating at least one output comprises generating in substantially real-time, at least one control signal for controlling at least one device.
- detecting at least one spike event comprises detecting a spatiotemporal pattern of the at least one spike event.
- generating at least one control signal comprises generating the at least one control signal based, at least in part, on at least one characteristic of the detected spatiotemporal pattern of the at least one spike event.
- at least one characteristic comprises a rate of the at least one spike event and/or a spatial distribution of the detected spatiotemporal pattern of the at least one spike event.
- the identified biological source comprises a motor unit.
- the identified biological source comprises a plurality of motor units.
- the identified biological source comprises a muscle.
- the identified biological source comprises a plurality of muscles.
- determining the biological source of the at least one spike event comprises determining that the at least one spike event is associated with a motor unit or group of motor units, wherein the method further comprises determining a muscle to which the motor unit or group of motor units belongs, and wherein generating at least one output comprises generating the at least one output based on the determined muscle to which the motor unit or group of motor units belongs.
- the determined muscle is associated with a motor unit recruitment sequence describing a sequence of activation of motor units for the determined muscle, and wherein the method further comprises determining where the motor unit or group of motor units fall within the motor unit recruitment sequence of the determined muscle.
- At least one of the one or more wearable devices includes at least one auxiliary sensor configured to record a plurality of auxiliary signals, and wherein the method further comprises generating the at least one output based, at least in part, on the plurality of auxiliary signals.
- the at least one auxiliary sensor comprises at least one inertial measurement unit (IMU) sensor configured to record a plurality of IMU signals, and wherein the method further comprises generating the at least one output based, at least in part, on the plurality of IMU signals and/or information derived from the plurality of IMU signals.
- the at least one auxiliary sensor comprises at least one camera configured to record one or more images, and wherein the method further comprises generating the at least one output based, at least in part, on the one or more images and/or information derived from the one or more images.
- detecting the at least one spike event is further based on the one or more images and/or the information derived from the one or more images.
- At least one computer processor is included as a portion of a device separate from and in communication with the plurality of neuromuscular sensors arranged on the one or more wearable devices, and wherein the method further comprises streaming, from the plurality of neuromuscular sensors in substantially real-time, the plurality of neuromuscular signals and/or the information derived from the plurality of neuromuscular signals to the at least one computer processor.
- the device separate from and in communication with the plurality of neuromuscular sensors is a device selected from the group consisting of a remote server, a desktop computer, a laptop computer, a smartphone, and a wearable electronic device.
- the wearable electronic device is a smartwatch, a health monitoring device, smart glasses, or an augmented reality system.
- the method further comprises integrating at least one computer processor with the one or more wearable devices on which the plurality of neuromuscular sensors are arranged.
- At least one computer processor comprises at least one first computer processor included as a portion of a device separate from and in communication with the plurality of neuromuscular sensors arranged on the one or more wearable devices and at least one second computer processor integrated with the one or more wearable devices on which the plurality of neuromuscular sensors are arranged.
- the method further comprises transmitting, by the plurality of neuromuscular sensors, at least some of the plurality of neuromuscular signals to the at least one first computer processor, and wherein the at least one first computer processor performs acts of training, based on the at least some of the plurality of neuromuscular signals transmitted from the plurality of neuromuscular sensors, at least one spike detection model and/or at least one spike identification model, and transmitting the trained at least one spike detection model and/or the at least one spike identification model to the at least one second computer processor, and wherein the at least one second computer processor performs an act of detecting the at least one spike event and determining the biological source of the detected at least one spike event using the at least one spike detection model and/or the at least one spike identification model transmitted from the at least one first computer processor.
- the method further comprises training the at least one spike detection model and/or at least one spike identification model to estimate at least one of a group comprising whether the user is activating a particular motor unit, whether the user is activating a particular motor unit with a particular timing, and whether the user is activating a particular combination of motor units.
- detecting at least one spike event corresponding to firing of an action potential in at least one motor unit comprises detecting at least one spike event corresponding to firing of an action potential in a plurality of motor units.
- the method further comprises thresholding the filtered time-lagged representation of the plurality of neuromuscular signals to detect the at least one spike event.
- FIG. 1 is a flowchart of a biological process for performing a motor task in accordance with some embodiments of the technology described herein;
- FIG. 2 is a schematic diagram of a computer-based system for detecting spike events in neuromuscular data in accordance with some embodiments of the technology described herein;
- FIG. 3 is a flowchart of a substantially real-time process for detecting spike event information from neuromuscular data in accordance with some embodiments of the technology described herein;
- FIG. 4 is a flowchart of a process for associating spike events with muscles in accordance with some embodiments of the technology described herein;
- FIG. 5 is a flowchart of a process for generating filters for use with a substantially real-time spike event decoder in accordance with some embodiments of the technology described herein;
- FIG. 6 illustrates a wristband having EMG sensors arranged circumferentially thereon, in accordance with some embodiments of the technology described herein;
- FIG. 7 illustrates a user wearing the wristband of FIG. 6 while typing on a keyboard, in accordance with some embodiments of the technology described herein;
- FIG. 8 illustrates a plot for detecting spike events in two channels of recorded neuromuscular data during periods of low activity, in accordance with some embodiments of the technology described herein;
- FIG. 9A illustrates a plot of clustering spike events to identify spike events with similar spatiotemporal profiles, in accordance with some embodiments of the technology described herein;
- FIG. 9B illustrates six spatiotemporal profiles generated for each of six clusters of spike events, in accordance with some embodiments of the technology described herein;
- FIG. 10 illustrates a set of EMG channel waveforms associated with a number of biological sources that may be produced in accordance with some embodiments of the technology described herein;
- FIG. 11 shows output of an MVDR-based spike event decoder configured in accordance with some embodiments of the technology described herein.
- FIG. 12 shows output of an MVDR-based spike event decoder that applies MVDR filters for each of a plurality of motor units, wherein the decoder is configured in accordance with some embodiments of the technology described herein;
- FIG. 13 is a flowchart of a substantially real-time process for detecting spike event information from neuromuscular data in accordance with some embodiments of the technology described herein;
- FIG. 14A illustrates a wearable system with sixteen EMG sensors arranged circumferentially around an elastic band configured to be worn around a user’s lower arm or wrist, in accordance with some embodiments of the technology described herein;
- FIG. 14B is a cross-sectional view through one of the sixteen EMG sensors illustrated in FIG. 14A;
- FIGS. 15A and 15B schematically illustrate components of a computer-based system on which some embodiments are implemented.
- FIG. 15A illustrates a wearable portion of the computer-based system
- FIG. 15B illustrates a dongle portion connected to a computer, wherein the dongle portion is configured to communicate with the wearable portion.
- FIG. 1 illustrates a flowchart of a biological process 100 for initiating a motor task by the coordinated movement of one or more muscles.
- action potentials are generated in one or more efferent spinal motor neurons.
- the motor neurons carry the neuronal signal (also referred to as “spikes” herein) away from the central nervous system and toward skeletal muscles in the periphery.
- the action potential travels along the axon of the motor neuron from its body in the spinal cord where the action potential is generated to the axon terminals of the motor neuron that innervate muscle fibers included in skeletal muscles.
- a motor neuron and the muscle fibers that it innervates are referred to herein as a motor unit.
- Muscle fibers in a motor unit are activated together in response to an action potential generated in the corresponding motor neuron of the motor unit.
- Individual muscles typically include muscle fibers from hundreds of motor units with the simultaneous contraction of muscle fibers in many motor units resulting in muscle contraction evidenced as perceptible muscle movement and/or force.
- a chemical synapse formed at the interface between an axon terminal of a spinal motor neuron and a muscle fiber is called a neuromuscular junction.
- process 100 proceeds to act 104, where an action potential is generated in the muscle fiber as a result of chemical activity at the neuromuscular junction.
- acetylcholine released by the motor neuron diffuses across the neuromuscular junction and binds with receptors on the surface of the muscle fiber triggering a depolarization of the muscle fiber.
- neuromuscular signals sensed on the body surface generated by the depolarization of individual muscle fibers are small (e.g., less than 100 μV)
- the collective action of multiple muscle fibers conducting simultaneously results in a detectable voltage potential that may be recorded by neuromuscular (e.g., EMG) sensors located on the surface of the body.
- the collective conduction of muscle fibers from many motor units results in muscle contraction and perceptible motion. Accordingly, when a user performs a movement or gesture, the corresponding recorded neuromuscular signals include contributions from multiple activated motor units.
- process 100 proceeds to act 106, where the propagation of the action potential in the muscle fiber results in a series of chemical-mediated processes within the muscle fiber. For example, depolarization of a muscle fiber results in an influx of calcium ions into the muscle fiber. Calcium ions inside the muscle fiber bind with troponin complexes causing the troponin complexes to separate from myosin binding sites on actin filaments in the muscle fiber, thereby exposing the myosin binding sites.
- process 100 proceeds to act 108, where the muscle fiber contracts as myosin binds to the exposed binding sites on the actin filaments.
- Process 100 then proceeds to act 110, where the collective contraction of muscle fibers in one or more muscles results in the performance of a motor task.
- As the tension of a muscle increases, the firing rates of active motor neurons increase and additional motor neurons may become active, a process referred to as motor unit recruitment.
- the pattern by which motor neurons innervating a muscle become active and increase their firing rate is, in some cases, stereotyped.
- Some embodiments are directed to analyzing neuromuscular signals to detect and identify/classify spike events corresponding to firing of action potentials in one or more motor units.
- a group of muscles necessary to perform the motor task is activated.
- the motor task is performed while the user is wearing a wearable device that includes neuromuscular sensors (e.g., EMG sensors)
- the neuromuscular signals recorded by the sensors on the surface of the body correspond to superimposed activity of all motor units in the muscles in the group activated during performance of the motor task.
- the neuromuscular signals may be analyzed and mapped to control signals to control a device based on the type of movement or gesture that the user performs.
- the analysis of neuromuscular signals involves the detection and identification of spike events in activated motor units.
- a generative model of an EMG signal x(t) may take the form x(t) = Σᵢ (sᵢ ∗ aᵢ)(t) + η(t), where t is the time, sᵢ is the spatiotemporal waveform of the i-th MUAP observed by an EMG recording device, aᵢ is the spike train of the corresponding motor neuron, ∗ denotes temporal convolution, and η(·) is the EMG measurement noise, where the spike train is represented as a time series of Dirac functions occurring each time the motor neuron fires.
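The generative model above can be exercised numerically; the following NumPy simulation is illustrative only (function and parameter names are assumptions, not part of the disclosure):

```python
import numpy as np

def simulate_emg(templates, spike_trains, n_samples, noise_std=0.0, seed=0):
    """Synthesize EMG as a sum of MUAP templates convolved with spike trains.

    templates:    list of (n_channels, L) MUAP waveforms s_i
    spike_trains: list of spike-time index arrays a_i (discrete Diracs)
    """
    rng = np.random.default_rng(seed)
    n_ch = templates[0].shape[0]
    x = np.zeros((n_ch, n_samples))
    for s_i, times in zip(templates, spike_trains):
        train = np.zeros(n_samples)
        train[np.asarray(times)] = 1.0        # Dirac train of the motor neuron
        for ch in range(n_ch):
            x[ch] += np.convolve(train, s_i[ch])[:n_samples]
    return x + noise_std * rng.standard_normal((n_ch, n_samples))
```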
- a MUAP is an electrical potential generated by activation of muscle fibers in a corresponding motor unit.
- the spatiotemporal waveform of the MUAP as detected by a pair of EMG sensors depends primarily on the position of the motor unit relative to the array of EMG sensors.
- Tissue between the site of the muscle fiber(s) composing the motor unit and an EMG sensor filters the spatiotemporal waveform, so that the same EMG sensor (or EMG sensors) may measure a distinct spatiotemporal pattern due to different locations of the muscle fibers in the underlying tissue and, accordingly, unique filtering caused by tissue between the muscle fibers and an EMG sensor (or EMG sensors).
- the spatiotemporal waveform of the MUAP remains constant as long as the electrode positions and the conductive medium (e.g., the user’s body) do not change.
- small variations in the spatiotemporal waveform for a MUAP may be introduced due to muscle contractions.
- the duration of a MUAP is on the order of 10-20 ms and may have an amplitude on the order of hundreds of microvolts.
- the duration of the MUAP is influenced largely based on the spacing between differential EMG electrodes and the velocity of the action potential wave traveling along the muscle fibers.
- the amplitude of the MUAP is influenced largely based on the distance from the motor unit to the EMG electrode pair and the number of muscle fibers in the motor unit.
- because the spatiotemporal waveform of a MUAP remains substantially constant, and as such encodes little or no information related to user intent, some embodiments are directed to extracting spike event information (e.g., spike train data) from neuromuscular signals as a measure of user intent.
- the extracted spike event information may be used to generate one or more outputs (e.g., one or more control signals, where the control signals may be used to change the state of a computerized system that is configured to receive the control signal).
- a mapping between spike event information and control signals may be implemented, for example, using an inferential model trained to associate particular spike event information with control signal outputs.
- the output of the trained inferential model may be musculoskeletal position information that describes, for example, the positions and/or forces of rigid body segments in a computer-implemented musculoskeletal model.
- the musculoskeletal model may be updated with predictions of the musculoskeletal position information output from the inferential model. Control signals may then be generated based on the updated musculoskeletal position information.
- the output of the trained inferential model may be the control signals themselves, such that a musculoskeletal model is not used.
- spike event information from a plurality of motor units may be combined, for example to enable two-dimensional control.
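As an illustration of combining spike event information from multiple motor units for two-dimensional control, a hedged NumPy sketch (the rate window, normalization, and names are assumptions):

```python
import numpy as np

def firing_rate(spike_times, t, window=0.5):
    """Spikes per second in the trailing window ending at time t."""
    times = np.asarray(spike_times)
    return np.sum((times > t - window) & (times <= t)) / window

def control_2d(unit_a_spikes, unit_b_spikes, t, rate_max=20.0):
    """Map two units' firing rates to a normalized (x, y) control signal."""
    x = min(firing_rate(unit_a_spikes, t) / rate_max, 1.0)
    y = min(firing_rate(unit_b_spikes, t) / rate_max, 1.0)
    return x, y
```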
- some embodiments detect spike events in recorded neuromuscular signals and identify a biological source (e.g., a motor unit or group of motor units) of the detected spike events.
- the output (e.g., a control signal) is then generated based on the detected spike event(s) and/or the identified biological source.
- EMG sensors are used as examples of the type of neuromuscular sensors configured to detect neuromuscular activity.
- other types of neuromuscular sensors including, but not limited to, mechanomyography (MMG) sensors and sonomyography (SMG) sensors may additionally or alternatively be used in combination with EMG sensors to detect neuromuscular activity in accordance with some embodiments.
- the neuromuscular signals recorded by the neuromuscular sensors may be used to identify activation of sub-muscular structures in accordance with the techniques described herein.
- FIG. 2 illustrates a system 200 in accordance with some embodiments.
- the system includes a plurality of sensors 210 configured to record signals resulting from the activation of motor units within portions of a human body.
- Sensors 210 may include a plurality of neuromuscular sensors configured to record signals arising from neuromuscular activity in skeletal muscle of a human body, as described above.
- the term “neuromuscular activity” as used herein refers to neural activation of spinal motor neurons that innervate a muscle, muscle activation, muscle contraction, or any combination of the neural activation, muscle activation, and muscle contraction.
- spike event information describing when an action potential has occurred and/or a biological source of a detected spike event may be determined from the sensed neuromuscular signals.
- Sensors 210 may include one or more Inertial Measurement Units (IMUs), which measure a combination of physical aspects of motion, using, for example, an accelerometer, a gyroscope, a magnetometer, or any combination of one or more accelerometers, gyroscopes and magnetometers.
- IMUs may be used to sense information about movement of the part of the body on which the IMU is attached and information derived from the sensed data (e.g., position and/or orientation information) may be tracked as the user moves over time.
- one or more IMUs may be used to track movements of portions of a user’s body proximal to the user’s torso relative to the sensor (e.g., arms, legs) as the user moves over time.
- signals from an IMU may be used to filter, post-process, or otherwise refine the spike event(s) inferred by an inferential model.
- the IMU(s) and neuromuscular sensors may be arranged to detect movement of different parts of the human body.
- the IMU(s) may be arranged to detect movements of one or more body segments proximal to the torso (e.g., an upper arm), whereas the neuromuscular sensors may be arranged to detect motor unit activity within one or more body segments distal to the torso (e.g., a forearm or wrist).
- the sensors may be arranged in any suitable way, and embodiments of the technology described herein are not limited based on the particular sensor arrangement.
- At least one IMU and a plurality of neuromuscular sensors may be co-located on a body segment to track motor unit activity and/or movements of the body segment using different types of measurements.
- an IMU sensor and a plurality of EMG sensors are arranged on a wearable device configured to be worn around the lower arm or wrist of a user.
- the IMU sensor may be configured to track movement information (e.g., positioning and/or orientation over time) associated with one or more arm segments, to determine, for example, whether the user has raised or lowered their arm, whereas the EMG sensors may be configured to determine sub-muscular information associated with activation of sub-muscular structures in muscles of the wrist or hand.
- an IMU sensor may provide control signals that a user may volitionally control independently from one or more MUAPs.
- Each of the sensors 210 includes one or more sensing components configured to sense information about a user.
- the sensing components may include one or more accelerometers, gyroscopes, magnetometers, or any combination thereof to measure characteristics of body motion, examples of which include, but are not limited to, acceleration, angular velocity, and sensed magnetic field around the body.
- the sensing components may include, but are not limited to, electrodes configured to detect electric potentials on the surface of the body (e.g., for EMG sensors), vibration sensors configured to measure skin surface vibrations (e.g., for MMG sensors), and acoustic sensing components configured to measure ultrasound signals (e.g., for SMG sensors) arising from muscle activity.
- Exemplary sensors 210 that may be used in accordance with some embodiments are described in more detail in U.S. Patent Application No. 15/659,018 entitled “METHODS AND APPARATUS FOR PREDICTING MUSCULO-SKELETAL POSITION INFORMATION USING WEARABLE AUTONOMOUS SENSORS,” incorporated by reference herein in its entirety.
- At least some of the plurality of sensors are arranged as a portion of a wearable device configured to be worn on or around part of a user’s body.
- an IMU sensor and a plurality of neuromuscular sensors are arranged circumferentially around an adjustable and/or elastic band such as a wristband or armband configured to be worn around a user’s wrist or arm.
- at least some of the sensors may be arranged on a wearable patch configured to be affixed to a portion of the user’s body, at least some of the sensors may be implanted EMG sensors, or at least some of the sensors may be included as a portion of an electronic tattoo worn by the user.
- multiple wearable devices each having one or more neuromuscular sensors (and, optionally, one or more IMUs) included thereon may be used to generate control information based on MUAPs, sub-muscular structures, and/or movement that involve multiple parts of the body.
- sixteen EMG sensors are arranged circumferentially around an elastic band configured to be worn around a user’s lower arm.
- FIG. 6 shows EMG sensors 504 arranged circumferentially around elastic band 502.
- the neuromuscular sensors may be placed on (or implanted in) any part of the body.
- a wearable armband or wristband may be used to generate control information for controlling a robot, controlling a vehicle, scrolling through text, controlling a virtual avatar, activating a discrete control (e.g., a button), navigating in a two-dimensional (or higher-dimensional) space, or performing any other suitable control task.
- a user 506 may be wearing elastic band 502 on hand 508.
- EMG sensors 504 may be configured to record EMG signals as a user controls keyboard 530 using fingers 540.
- elastic band 502 may also include, as an option, one or more IMUs (not shown), configured to record movement information, as discussed above.
- the wearable device may be provided without an IMU.
- multiple wearable devices, each having one or more IMUs and/or neuromuscular sensors included thereon, may be used to generate control information based on MUAPs, activation associated with sub-muscular structures, and/or movement that involves multiple parts of the body.
- sensors 210 only include a plurality of neuromuscular sensors (e.g., EMG sensors). In other embodiments, sensors 210 include a plurality of neuromuscular sensors and at least one “auxiliary” sensor configured to continuously record a plurality of auxiliary signals.
- auxiliary sensors include, but are not limited to, IMU sensors, an imaging device (e.g., a camera), a radiation-based sensor for use with a radiation-generation device (e.g., a laser- scanning device), or other types of sensors such as a heart-rate monitor.
- the output of one or more of the sensing components may be optionally processed using hardware signal processing circuitry (e.g., to perform amplification, filtering, and/or rectification).
- at least some signal processing of the output of the sensing components may be performed in software. Accordingly, signal processing of signals recorded by the sensors may be performed in hardware, software, or by any suitable combination of hardware and software, as aspects of the technology described herein are not limited in this respect.
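When signal processing is performed in software, the amplification, filtering, and rectification mentioned above can be sketched as follows. This is a minimal illustration, not the disclosure's implementation; the sampling rate and band-pass cutoffs are invented values.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def condition_emg(raw, fs=2000.0, band=(20.0, 450.0)):
    """Band-pass filter then full-wave rectify one raw EMG channel.

    fs (Hz) and band (Hz) are illustrative, not values from the disclosure.
    """
    nyq = 0.5 * fs
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, raw)   # zero-phase band-pass filtering
    return np.abs(filtered)          # full-wave rectification

# Synthetic example: 1 s of a 50 Hz component plus noise.
t = np.linspace(0.0, 1.0, 2000, endpoint=False)
raw = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(2000)
envelope = condition_emg(raw)
```

The same chain could equally run in hardware; the split between the two is an implementation choice, as the surrounding text notes.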
- the recorded sensor data may be optionally processed to compute additional derived measurements that are then provided as input to a spike event detection process.
- recorded signals from an IMU sensor may be processed to derive an orientation signal that specifies the orientation of a body segment over time.
- recorded signals from an IMU sensor may be processed to determine movement (e.g., high-velocity movement) that may cause sensor movement artifacts or shifts in the spatial location of muscle fibers of a motor unit relative to one or more EMG sensors, each of which may cause spurious spike events to be detected.
- IMU sensor data may be used to filter or otherwise refine the output of an inferential model configured for detecting one or more MUAPs.
- Sensors may implement signal processing using components integrated with the sensing components, or at least a portion of the signal processing may be performed by one or more components in communication with, but not directly integrated with, the sensing components of the sensors 210.
- System 200 also includes one or more computer processors 212 programmed to communicate with sensors 210.
- signals recorded by one or more of the sensors may be provided to the processor(s) 212, which may be programmed to execute one or more machine learning algorithms that process signals output by the sensors 210 to train one or more inferential models (e.g., statistical models 214), and the trained (or retrained) statistical model(s) 214 may be stored for later use in generating control signals, as described in more detail below.
- statistical model 214 may be a neural network and, for example, may be a recurrent neural network.
- the recurrent neural network may be a long short-term memory (LSTM) neural network. It should be appreciated, however, that the recurrent neural network is not limited to being an LSTM neural network and may have any other suitable architecture.
- the recurrent neural network may be a fully recurrent neural network, a gated recurrent neural network, a recursive neural network, a Hopfield neural network, an associative memory neural network, an Elman neural network, a Jordan neural network, an echo state neural network, a second order recurrent neural network, and/or any other suitable type of recurrent neural network.
- neural networks that are not recurrent neural networks may be used.
- deep neural networks, convolutional neural networks, and/or feedforward neural networks may be used.
- the output of an inferential model provides discrete outputs (e.g., classification labels).
- the model may be trained to estimate whether the user is activating a particular motor unit, activating a particular motor unit with a particular timing, activating a particular motor unit with a particular firing pattern, or activating a particular combination of motor units.
- discrete classification is used in some embodiments to estimate whether a particular motor unit fired an action potential within a given amount of time. In such a scenario, these estimates may then be accumulated to obtain an estimated firing rate for that motor unit.
- the neural network may include a softmax layer such that the outputs add up to one and may be interpreted as probabilities.
- the output of the softmax layer may be a set of values corresponding to a respective set of control signals, with each value indicating a probability that the user wants to perform a particular control action.
- the output of the softmax layer may be a set of three probabilities (e.g., 0.92, 0.05, and 0.03) indicating the respective probabilities that the detected pattern of activity is one of three known patterns.
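The softmax behavior described above can be sketched as follows; the logits are hypothetical, and the three outputs stand in for the three known activity patterns.

```python
import numpy as np

def softmax(logits):
    """Map raw scores to probabilities that sum to one."""
    z = np.asarray(logits, dtype=float)
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

# Hypothetical logits for three candidate activity patterns.
probs = softmax([3.0, 0.1, -0.4])
most_likely = probs.argmax()   # index of the most probable pattern
```

Because the outputs sum to one, each value can be read as the probability that the detected activity matches the corresponding known pattern.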
- the neural network is not required to produce outputs that add up to one.
- the output layer of the neural network may be a sigmoid layer (which has no restriction that the probabilities add up to one).
- the neural network may be trained with a sigmoid cross-entropy cost.
- Such an implementation may be advantageous in the case when multiple different control actions may occur within a threshold amount of time and it is not important to distinguish the order in which these actions occur (e.g., a user may activate two patterns of neural activity within the threshold amount of time).
- any other suitable non- probabilistic multi-class classifier may be used, as aspects of the technology described herein are not limited in this respect.
- the output of the statistical model may be a continuous signal rather than a discrete signal.
- the model may output an estimate of the firing rate of each motor unit or the model may output a time- series electrical signal corresponding to each motor unit or sub-muscular structure.
- the statistical model may comprise a hidden Markov model (HMM), a switching HMM with the switching allowing for toggling among different dynamic systems, dynamic Bayesian networks, and/or any other suitable graphical model having a temporal component. Any such statistical model may be trained using recorded sensor signals.
- the statistical model is a classifier taking as input features derived from the recorded sensor signals.
- the classifier may be trained using features extracted from the sensor data.
- the classifier may be a support vector machine, a Gaussian mixture model, a regression based classifier, a decision tree classifier, a Bayesian classifier, and/or any other suitable classifier, as aspects of the technology described herein are not limited in this respect.
- Input features to be provided to the classifier may be derived from the sensor data in any suitable way.
- the sensor data may be analyzed as time series data using wavelet analysis techniques (e.g., continuous wavelet transform, discrete-time wavelet transform, etc.), Fourier-analytic techniques (e.g., short-time Fourier transform, Fourier transform, etc.), and/or any other suitable type of time-frequency analysis technique.
- the sensor data may be transformed using a wavelet transform and the resulting wavelet coefficients may be provided as inputs to the classifier.
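As a self-contained illustration of wavelet coefficients used as classifier inputs, the sketch below computes one level of a Haar discrete wavelet transform. The Haar wavelet and the sample window are purely illustrative; the disclosure does not specify a wavelet family.

```python
import numpy as np

def haar_dwt(x):
    """One level of a Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients for an even-length window;
    any other wavelet family could be substituted.
    """
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass / approximation
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass / detail
    return a, d

# A tiny hypothetical sensor window; concatenated coefficients form
# the feature vector handed to the classifier.
window = np.array([1.0, 1.0, 2.0, 0.0, 3.0, 3.0, 1.0, 1.0])
approx, detail = haar_dwt(window)
features = np.concatenate([approx, detail])
```

The orthonormal scaling preserves signal energy, so the feature vector carries the same information as the raw window in a time-frequency layout.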
- values for parameters of the statistical model may be estimated from training data.
- when the statistical model is a neural network, parameters of the neural network (e.g., weights) may be estimated from the training data.
- parameters of the statistical model may be estimated using gradient descent, stochastic gradient descent, and/or any other suitable iterative optimization technique.
- the statistical model may be trained using stochastic gradient descent and backpropagation through time.
- the training may employ a cross-entropy loss function and/or any other suitable loss function, as aspects of the technology described herein are not limited in this respect.
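A toy sketch of stochastic gradient descent with a cross-entropy loss, using logistic regression as a stand-in for the neural network described above. The data, labels, and hyperparameters are invented for illustration; the task loosely mimics a binary "spike vs. no spike" decision.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented training set: labels follow a linear rule on the features.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(4)
b = 0.0
lr = 0.1

for epoch in range(50):
    for i in rng.permutation(len(X)):   # stochastic (per-sample) updates
        p = sigmoid(X[i] @ w + b)
        grad = p - y[i]                 # d(cross-entropy)/d(logit)
        w -= lr * grad * X[i]
        b -= lr * grad

accuracy = ((sigmoid(X @ w + b) > 0.5) == (y > 0.5)).mean()
```

For a recurrent network the same gradient steps would be computed via backpropagation through time, but the loss and update rule have the same shape.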
- System 200 also optionally includes one or more controllers 216.
- controller 216 may be a display controller configured to display a visual representation (e.g., of a hand) on a display.
- one or more computer processors may implement one or more trained statistical models that receive as input sensor signals and provide as output information that is used to generate control signals.
- a computer application configured to simulate a virtual reality environment may be instructed to display a visual character such as an avatar (e.g., via controller 216). Positioning, movement, and/or forces applied by portions of the visual character within the virtual reality environment may be displayed based on the output of the trained statistical model(s).
- the visual representation may be dynamically updated as continuous signals are recorded by the sensors 210 and processed by the trained statistical model(s) 104 to provide a computer-generated representation of the character’s movement that is updated in real-time.
- Some embodiments are directed to using a statistical model, at least in part, to map spike event information extracted from the neuromuscular signals to control signals.
- the statistical model may receive as input IMU signals, neuromuscular signals (e.g., EMG, MMG, and/or SMG signals), spike event information (e.g., spike train data) extracted from neuromuscular signals, external device signals (e.g., camera or laser-scanning signals), or a combination of IMU signals, neuromuscular signals, and external device signals detected as a user performs one or more muscular activations.
- the statistical model may be used to predict the control information without the user having to make perceptible movements.
- System 200 also optionally includes a user interface 218.
- Feedback determined based on the signals recorded by sensors 210 and processed by processor(s) 212 may be provided via user interface 218 to facilitate a user’s understanding of how the system is interpreting the user’s intended activation.
- User interface 218 may be implemented in any suitable way including, but not limited to, an audio interface, a video interface, a tactile interface, an electrical stimulation interface, or any combination of the foregoing.
- control signals based on a user activating one or more MUAPs may require user training so that the user may effectively and reliably activate the intended one or more MUAPs to create intended control signals.
- systems and methods provide sensory feedback to a user when they have activated a specified (i.e., desired) MUAP (and a model has detected the presence of the specified MUAP), so that the user may become more skillful at reliably activating that MUAP.
- feedback may comprise auditory, visual, haptic, or multi-sensory feedback with sufficiently low latency for the user to learn the mapping between the sensory feedback and the preceding MUAP activation.
- the architecture of system 200 may take any suitable form. Some embodiments employ a thin architecture in which processor 212 is included as a portion of a device separate from and in communication with the plurality of neuromuscular sensors 210 arranged on the one or more wearable devices.
- the neuromuscular sensors may be configured to wirelessly stream, in substantially real-time, the plurality of neuromuscular signals and/or the information derived from the plurality of neuromuscular signals to processor 212 for processing including, but not limited to, spike event detection and biological source identification.
- the device separate from and in communication with the plurality of neuromuscular sensors may be, for example, a remote server, a desktop computer, a laptop computer, a smartphone, or a wearable electronic device such as a smartwatch, a health monitoring device, smart glasses, other wearable system (including head mounted wearable systems), or an augmented reality system.
- Some embodiments employ a thick architecture in which processor 212 is integrated with the one or more wearable devices on which the neuromuscular sensors 210 are arranged.
- the processing for spike event detection and/or biological source identification is divided between multiple processors, at least one of which is integrated with sensors 210 and at least one of which is included as a portion of a device separate from and in communication with the sensors 210.
- the neuromuscular sensors may be configured to transmit at least some of the recorded neuromuscular signals to a first computer processor remotely located from the sensors.
- the first computer processor may be programmed to train, based on the transmitted neuromuscular signals, at least one spike detection model and/or at least one spike identification model.
- the first computer processor may then be programmed to transmit the trained at least one spike detection model and/or the at least one spike identification model to a second computer processor integrated with the one or more wearable devices on which the sensors are arranged.
- the second computer processor may be programmed to detect spike events and determine the biological source of the detected spike events using the at least one spike detection model and/or the at least one spike identification model transmitted from the first computer processor. In this way, the training/fitting process and the real-time process of using the trained model(s) may be separated by being performed by different processors.
- FIG. 3 illustrates a process 300 for generating an output based on one or more spike events detected in recorded neuromuscular signals in accordance with some embodiments.
- act 310 a plurality of neuromuscular signals are recorded by a plurality of neuromuscular sensors worn by a user as the user activates one or more motor units.
- Process 300 then proceeds to act 320 where the recorded neuromuscular signals are optionally processed prior to detection of spike events.
- one or more time-lagged versions of the recorded signals may be generated and the time-lagged versions may subsequently be used for detection of spike events.
- the inventors have recognized that effective time lag values are on the order of the timescale of a motor unit action potential with the particular neuromuscular recording technique employed.
- the motor unit action potentials measured using surface EMG recordings generally exhibit a time lag in a range between 10 and 50 ms. In some embodiments, a time lag of 15 to 25 ms may also be effective.
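The time-lagged representation described in act 320 can be sketched as follows; the lag count and array shapes are illustrative (e.g., a 20 ms window at a 1 kHz sampling rate would correspond to 20 lags).

```python
import numpy as np

def time_lag_embed(signals, n_lags):
    """Stack n_lags time-shifted copies of a (T, C) multichannel signal.

    Returns shape (T - n_lags + 1, C * n_lags), raising the effective
    dimensionality of the recording for downstream spike detection.
    """
    T, C = signals.shape
    lagged = [signals[k : T - n_lags + 1 + k] for k in range(n_lags)]
    return np.concatenate(lagged, axis=1)

# Tiny single-channel example so the arithmetic is visible.
x = np.arange(10, dtype=float).reshape(10, 1)
emb = time_lag_embed(x, n_lags=3)
```

Each output row holds a short history of consecutive samples, so a purely spatial operation applied to it becomes spatiotemporal.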
- Process 300 then proceeds to act 330, where at least one spike event is detected in the recorded neuromuscular signals.
- the recorded neuromuscular signals or information derived from the recorded neuromuscular signals are processed using one or more filters to detect spike events in the recorded neuromuscular signals.
- the one or more filters includes a plurality of filters, each of which is configured to detect spikes generated from a particular biological source (e.g., from a particular motor unit). Example techniques for generating filters for use with some embodiments are described in more detail below.
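One simple way to realize a per-source filter is matched filtering: correlate a motor unit's template waveform against the signal and threshold the output. The template, signal, and threshold below are hypothetical, and the disclosure's filters (e.g., the MVDR formulation discussed later) are more elaborate.

```python
import numpy as np

def matched_filter_detect(signal, template, threshold):
    """Correlate a spike template with a signal; return supra-threshold indices.

    A minimal stand-in for a filter configured to detect spikes from one
    particular biological source.
    """
    score = np.correlate(signal, template, mode="same")
    return np.flatnonzero(score > threshold)

template = np.array([0.0, 1.0, 2.0, 1.0, 0.0])   # hypothetical MUAP shape
signal = np.zeros(100)
signal[40:45] = template                          # one embedded spike
hits = matched_filter_detect(signal, template, threshold=5.0)
```

Running one such filter per motor unit yields both the spike times (act 330) and, via which filter fired, the biological source (act 340).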
- Process 300 then proceeds to act 340, where the biological source of the detected spike event(s) is determined.
- the biological source determination in act 340 may be based on the output of the plurality of filters and their associated biological sources for which they are configured to detect spike events.
- the detection of one or more spike events in act 330 and the determination of a biological source of the spike event(s) in act 340 may be performed sequentially. Any suitable biological source for spike events may be determined in act 340.
- the biological source may be a single motor unit, a group of motor units, a muscle, or a group of muscles.
- the ability of the system to determine a particular biological source for spike events may be based, at least in part, on a spatiotemporal resolution of the system in distinguishing between different spike events. For example, in some instances the system may not be able to determine which of a plurality of motor units a spike originated from, but the system may be able to determine a group of motor units from which the spike originated. In other instances, the system may not be able to determine within a muscle which of the motor units a spike originated from, but the system may be able to determine which muscle the spike originated from, and so on.
- Process 300 then proceeds to act 350, where one or more outputs are generated based on the detected spike event(s) and/or the biological source of the spike event(s). Any suitable output may be generated for a particular application, and embodiments are not limited in this respect.
- the output may be compressed data representing the recorded neuromuscular signals.
- the system may be configured to store only information about the detected spike events such as their timing characteristics and/or their biological source information. Storing such compressed data may be beneficial, for example, for transmission of the data (e.g., over one or more wireless networks) to an external device and/or for logging data for health/fitness/ergonomics monitoring applications without having to store the raw recorded data.
- the output generated in act 350 is information used to update a musculoskeletal model.
- some embodiments employ a musculoskeletal model that is updated with musculoskeletal position information describing, for example, positions and/or forces of rigid body segments in the model.
- Spike event information determined in acts 330 and/or 340 may be provided as input to the musculoskeletal model as part of the updating process. Control signals may then be generated based on the updated musculoskeletal model.
- the output generated in act 350 is a control signal used to control an external device.
- some embodiments map spike event information (e.g., detected spike events and/or biological source information for the spike events) to control signals.
- one or more control signals may be generated based on the identified biological source(s) and a pattern of activation represented in the detected spike event information.
- the spike event information may be provided as input to a trained statistical model and an output of the trained statistical model may be used to generate the one or more control signals.
- the output of the trained statistical model may be a set of one or more control signals.
- control signal(s) may be generated based on the spike event information without the use of a trained statistical model.
- the generated control signal(s) may then be provided to a control interface of a device to control an operation of the device.
- the device may be a display and a control signal may be provided to a display controller of the display.
- the control signal may include instructions to update information displayed on the display.
- the device may be a computer or other computing device (e.g., a smartphone) and the control signal may be provided to a controller of the computing device to change an operation of the device.
- the control signal may be used to control a device (e.g., a musical instrument) to provide an artistic expression. It should be appreciated that any device having a control interface may be controlled using control systems designed in accordance with the techniques described herein.
- the one or more control signals are generated based, at least in part, on the spike event information in substantially real-time.
- substantially real-time means that the spike event information determination process occurs and/or the control signals are generated shortly after the electrical event occurs while the neuromuscular data is being recorded, rather than happening off-line at a time when the neuromuscular signals are not being recorded.
- spike event information is detected within 5 seconds, within 1 second, within 500 ms, within 100 ms, or within 50 ms of the occurrence of the electrical event.
- the spike event information used to generate output in act 350 may include information about the spatiotemporal pattern of detected spike events (e.g., spike rate, spatial distribution of spike events, biological source of spike events).
- one or more control signals may be generated based, at least in part, on at least one characteristic of the spatiotemporal pattern of the detected spike events.
- the one or more control signals may be generated based, at least in part, on a spike rate and/or a spatial distribution of spike events detected from the neuromuscular signals.
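Computing a spike rate and a per-channel spatial distribution from detected spike events might look like the following sketch; the event times, channel assignments, and channel count are all invented.

```python
import numpy as np

# Hypothetical detected spike events over a 1 s recording: event times
# in seconds and the sensor channel nearest each event.
spike_times = np.array([0.05, 0.12, 0.31, 0.48, 0.70, 0.91])
spike_channels = np.array([2, 2, 5, 2, 5, 2])

duration = 1.0
spike_rate = len(spike_times) / duration          # spikes per second

# Spatial distribution: spike counts per sensor channel (8 channels assumed).
spatial = np.bincount(spike_channels, minlength=8)
```

A control mapping could then key off either quantity, e.g., treating a rate above some level, or activity concentrated on particular channels, as a distinct control signal.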
- control signals based on MUAPs from one or more motor units may be used as one or more discrete controls (i.e., a button or set of buttons that, when activated, cause a computing device to change an operation) or one or more continuous controls (i.e., a one-dimensional controller such as to control the volume of a speaker or the temperature of a thermostat, a two-dimensional controller such as to navigate a cursor on a two-dimensional screen, or a higher-dimensional controller such as to control a robotic arm with three or more degrees of freedom).
- control signals based on MUAPs may comprise composite controls based on a particular sequence of activation of one or more MUAPs in order to achieve a greater number of discrete controls (i.e., degrees of freedom (DOF)). For example, a user may simultaneously (or near-simultaneously, within a defined period of time) activate two or more MUAPs to achieve discrete controls distinct from those achieved by the unitary activation of a MUAP.
- continuous controls as described above are generally not truly continuous and represent quantized control across a range of values.
- the motor unit(s) from which a detected spike event(s) originated may be mapped to one or more muscles.
- FIG. 4 illustrates a process 400 for performing muscle classification in accordance with some embodiments.
- spike event information is determined, for example, in accordance with at least a portion of process 300 described above.
- Process 400 then proceeds to act 410, where one or more muscles to which the identified motor unit belongs is determined.
- the muscle(s) may be determined in any suitable way. For example, similarities between spatial profiles and correlations in spiking activity arising from multiple motor units may indicate that the multiple motor units belong to the same muscle.
- the muscular (or sub-muscular) source may be inferred based on the spatial pattern of signals recorded on a plurality of neuromuscular sensors on the skin of a user (or, optionally, implanted in a user).
- Process 400 then proceeds to act 414, where output is generated based, at least in part, on the determined muscle associated with the spike event information.
- the identification of a particular muscle relating to detected spike events may be used to further describe information about the spike events relative to the identified muscle. For example, as described above, each muscle in the human body may be characterized by a particular pattern of motor unit recruitment that describes an order by which additional motor units are recruited when needed.
- the information about a motor unit recruitment pattern of a muscle may be used, at least in part, to determine where the motor unit or group of motor units falls within the motor unit recruitment pattern for the determined muscle.
- FIG. 5 illustrates a process 500 for generating a plurality of filters, each of which represents spike activity within a biological source (e.g., a motor unit).
- the plurality of filters once generated, may be used to process neuromuscular signals and provide outputs in substantially real-time as the neuromuscular signals are recorded. Additionally, in some embodiments, as additional neuromuscular data is recorded, filter parameters may be updated such that the filters are dynamically updated.
- a plurality of spike events are detected in recorded neuromuscular signals.
- neuromuscular signals may be recorded during periods of relatively low activity, and then spike events may be detected using thresholding of the recorded data.
- FIG. 8 shows an example of the detection of putative spike events in two EMG sensor channels during periods of low activity.
- the putative spike events detected in the EMG recordings may be analyzed to eliminate false positives. For example, putative spike events having one or more particular characteristics (e.g., a duration longer than a threshold duration) may be discarded.
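A minimal sketch of threshold-based putative spike detection with duration-based false-positive rejection, as described for act 510. The threshold and maximum duration are illustrative, and the code assumes the signal starts and ends below threshold.

```python
import numpy as np

def detect_putative_spikes(x, threshold, max_duration):
    """Find supra-threshold runs; drop runs longer than max_duration samples.

    Assumes x starts and ends below threshold, so rising and falling
    edges alternate cleanly.
    """
    above = np.abs(x) > threshold
    edges = np.flatnonzero(np.diff(above.astype(int)))
    starts, ends = edges[::2] + 1, edges[1::2] + 1
    return [(s, e) for s, e in zip(starts, ends) if (e - s) <= max_duration]

x = np.zeros(50)
x[10:13] = 5.0    # 3-sample event: plausible spike
x[30:45] = 5.0    # 15-sample event: too long, likely an artifact
events = detect_putative_spikes(x, threshold=1.0, max_duration=8)
```

Here only the short event survives; the long supra-threshold run is discarded as a probable false positive, mirroring the duration criterion above.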
- process 500 proceeds to act 512, where the detected spike events are clustered, based on their spatiotemporal characteristics, to identify spike events likely arising from the same biological source.
- Clustering of spike events may occur in any suitable way.
- a window (e.g., a 10 ms window) of data around each detected spike event may be extracted. Each spike event may then be defined as a vector of values, where each vector includes NxM samples, where N corresponds to the number of samples in the window for the event, and M corresponds to the number of neuromuscular sensors.
- a similarity metric may be used to identify vectors having values that cluster together, and thus are likely to represent spike events generated from a common biological source.
- Principal Component Analysis (PCA) or another suitable dimensionality reduction technique may be used to reduce the dimensionality of the spike event vectors.
- k-means clustering or another suitable clustering technique may be used to cluster the lower-dimensional vectors into clusters of spike waveforms that have similar spatiotemporal characteristics.
- dimensionality reduction techniques include t-Distributed Stochastic Neighbor Embedding, deep auto encoders, and Uniform Manifold Approximation and Projection (UMAP).
- clustering methods include agglomerative clustering, Density-Based Spatial Clustering of Applications with Noise (DBSCAN), and Hierarchical Density-Based Spatial Clustering of Applications with Noise (HDBSCAN).
- the vectors for each of the spike events are used to create an affinity matrix of the different spike events and a measure of similarity of the vectors, for example, using correlations, may be used to identify the clusters of spike waveforms having similar spatiotemporal characteristics.
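The windowing, dimensionality reduction, and clustering steps above might be sketched as follows: synthetic spike-event vectors from two hypothetical sources are projected onto principal components via SVD and grouped with a minimal 2-means loop. All data, shapes, and parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spike-event vectors: two hypothetical biological sources,
# 40 events each, flattened windows of 24 samples.
profile = np.linspace(0.0, 2.0, 24)
source_a = profile + rng.normal(scale=0.3, size=(40, 24))
source_b = -profile + rng.normal(scale=0.3, size=(40, 24))
events = np.vstack([source_a, source_b])

# PCA via SVD: project onto the top two principal components.
centered = events - events.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
proj = centered @ vt[:2].T

# Minimal 2-means clustering with deterministic seeds (one per source).
centers = proj[[0, 40]].copy()
for _ in range(10):
    dists = np.linalg.norm(proj[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centers = np.array([proj[labels == k].mean(axis=0) for k in range(2)])
```

Each resulting cluster then serves as the labeled spike-event data from which a per-source filter (act 514) can be built, e.g., by averaging the member vectors into a spatiotemporal profile.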
- FIG. 9A illustrates the results of a clustering process in which clusters of spike waveforms having similar spatiotemporal characteristics have been identified.
- Each of the clusters of spike event data includes a plurality of spike events that represent a distribution of spike events generated by a particular biological source (e.g., a motor unit).
- the process then proceeds to act 514, where a plurality of filters are generated based on the spike events within each of the clusters, resulting in a set of filters, each of which is configured to detect spike events for its associated biological source (i.e., MUAP spike event).
- Act 512 serves to produce labeled spike event data from the unlabeled data detected in act 510.
- the labeled spike event data from act 512 may then be used, at least in part, to generate filters in act 514.
- the spatiotemporal response function for each cluster may be determined by, for example, calculating the mean of each of the spike event vectors in the cluster.
- FIG. 9B illustrates example spatiotemporal profiles calculated for six clusters using this technique.
- the spatiotemporal profiles may then be used to generate the filters in act 514.
- some embodiments use a beamforming formulation to determine filters for automatic spike decoding.
- An example of a beamforming formulation that may be used to generate filters in accordance with some embodiments is the minimum variance distortionless response (MVDR) filter, described in more detail below.
- multiple filters may be used to disambiguate multiple motor units that are present in one cluster, such as motor units that are located near each other anatomically.
- multiple MVDR filters may be combined to disambiguate multiple motor units that are present in one cluster.
- some embodiments employ neural networks to detect spike event data, and the labeled data output from act 512 in process 500 may be used to train the neural network.
- Any suitable neural network architecture may be used including, but not limited to, convolutional neural networks and recurrent neural networks. When recurrent neural networks are used, a convolutional layer may be used as the first layer of the network.
- Beamforming methods use filters to determine source signals from multidimensional sensor signals. Beamforming methods commonly use purely spatial filters. However, the inventors have recognized that spatial filtering alone is not suitable for detecting spike events in neuromuscular signals due to the large number of similarly localized sources relative to the small number of sensor channels. Accordingly, some embodiments use a time-lagged representation of the recorded neuromuscular signals to increase their effective dimensionality and exploit the consistent spatiotemporal response function for each source. Any suitable beamforming technique may be used, examples of which include, but are not limited to, MVDR, described in more detail below, and linearly constrained minimum variance (LCMV) beamforming.
- The LCMV filter weights are given by W = (Lᵀ C⁻¹ L)⁻¹ Lᵀ C⁻¹, where:
- L is the matrix of spatiotemporal response profiles (i.e., the collection of all h vectors in the MVDR notation below), and
- C is the sensor signal covariance matrix (the signal covariance term in the MVDR notation below).
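A minimal numerical sketch of the LCMV weight computation, using random stand-in values for the profile matrix L and the covariance C (the dimensions and data here are arbitrary, not from the patent). The defining property is that W L equals the identity: each filter passes its own source with unit gain while suppressing the other templates.

```python
import numpy as np

rng = np.random.default_rng(1)
D, S = 24, 3  # time-lagged signal dimension (C*T), number of sources

# L: one unwrapped spatiotemporal response profile (h vector) per column.
L = rng.standard_normal((D, S))

# C: sensor signal covariance matrix (regularized to keep it invertible).
X = rng.standard_normal((D, 500))
C = X @ X.T / 500 + 1e-3 * np.eye(D)

# LCMV weights: W = (L^T C^-1 L)^-1 L^T C^-1, computed via solves
# rather than explicit inverses for numerical stability.
Cinv_L = np.linalg.solve(C, L)               # C^-1 L
W = np.linalg.solve(L.T @ Cinv_L, Cinv_L.T)  # (L^T C^-1 L)^-1 L^T C^-1

# Distortionless constraint: W @ L is (numerically) the identity matrix.
```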
- the outputs of the filtered signals may then be thresholded to determine whether a spike event occurs.
- Each filter corresponds to a biological source (e.g., an individual motor unit), so spike detection and identification of the spike’s biological source occur as a single process.
- Some embodiments are directed to using a plurality of MVDR filters to perform real-time spike detection in neuromuscular signals.
- Similar to matched filtering, the use of MVDR filters preserves a target signal as much as possible while minimizing the contribution of any other signals present in the recorded neuromuscular signals.
- Consider S sources, each of which has a stereotyped signal profile or template across channels and time.
- Let x_s(t) be a binary variable indicating whether source s is triggered at time t.
- x_s(t) will have a value of 1 for only a single time step at a time, not for the duration of the emission.
- Let h_sc(τ) be the stereotyped profile of source s as measured in channel c at lag τ. Then the measured signal is y_c(t) = Σ_s (h_sc * x_s)(t) + n_c(t), where * is the convolution operator and n_c(t) is additional noise on channel c at time t.
- y, x, and h may be unwrapped into time-lagged vectors for each time- step.
- the template, instead of being a 2D matrix of size C×T for C channels and T timesteps, is unwrapped into a CT×1 vector, which is equivalent to concatenating successive time frames of the template on top of each other.
- y (/) h( ⁇ ) .
- y (/) hv ( / ) + n ( / ) .
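The unwrapping can be illustrated as follows, assuming a single synthetic source (template, spike times, and noise level are stand-ins). The time-lagged CT-vector extracted at a spike time approximately equals the unwrapped template h, which is what the beamforming filters exploit.

```python
import numpy as np

rng = np.random.default_rng(2)
C_ch, T = 4, 8  # channels, template timesteps

h2d = rng.standard_normal((C_ch, T))  # stereotyped template h_sc(tau), C x T
h = h2d.ravel()                       # unwrapped CT x 1 vector

# Binary trigger train x_s(t): each spike is a single 1, not a sustained block.
T_total = 100
x = np.zeros(T_total)
x[[20, 55, 80]] = 1.0

# Measured signal per channel: y_c = (h_c * x) + noise (convolution model).
noise = 0.01 * rng.standard_normal((C_ch, T_total + T - 1))
y = np.stack([np.convolve(x, h2d[c]) for c in range(C_ch)]) + noise

# Time-lagged representation: at each t, concatenate the next T frames of
# all channels into one CT-vector; near a spike it matches the template h.
lagged_at = lambda t: y[:, t:t + T].ravel()
```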
- FIG. 10 illustrates a set of EMG channel waveforms associated with a number of biological sources that may be produced in accordance with some embodiments of the technology described herein. As shown, each column reflects a spatiotemporal waveform as detected from one biological source (i.e., one motor unit) and each row is the average (template) waveform produced from one EMG channel.
- FIG. 11 illustrates an exemplary output of an automatic spike detector using an MVDR filter to process streaming recorded neuromuscular data.
- the MVDR filter output is similar to the ground truth spike times, indicating that the accuracy of the automatic spike detection is high.
- spike event information for each motor unit may be determined, as shown in FIG. 12.
- training/fitting the model comprises determining the spatiotemporal response functions.
- the spatiotemporal patterns for each biological source (e.g., each motor unit) may be determined using approaches that work in real time on streaming data or approaches that do not work in real time (e.g., iterative techniques).
- the spatiotemporal response functions for biological sources are determined using a matrix factorization technique. For example, Negro et al., Journal of Neural Engineering (2016), use a multi-step iterative algorithm to decompose the time-lagged sensor signals into components corresponding to each source and to detect the spike events from each source.
- the spatiotemporal response functions can be obtained by any spike decomposition or spike sorting method, non-limiting examples of which include those used by the commercially available spike sorting software packages KiloSort and MountainSort, and combinations of the techniques used therein.
- estimates of the spatio-temporal response functions are updated, for example by repeating these estimation procedures with additional accumulated data or performing reverse correlation based on the timings of detected spikes during real-time operation of the detectors.
- spike events may be detected in the neuromuscular data.
- raw neuromuscular data may be thresholded.
- the recorded neuromuscular data may be whitened or filtered, for example, with wavelets.
- the detected spike events may then be clustered to identify similar spike events.
- the spike events in the cluster may then be used to determine the spatiotemporal patterns for each biological source (e.g. by taking the cluster means) for the beamforming filters. Thresholds on the filter outputs may be set by applying the beamforming filters to the data from which the clusters were determined, and determining which threshold results in an appropriate balance between false negatives (failing to identify a member of the cluster) and false positives (e.g. identifying an event from another cluster).
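The threshold-selection step can be sketched as below, using hypothetical filter-output scores for in-cluster and out-of-cluster events and an equal weighting of false negatives against false positives. The score distributions and the weighting are illustrative assumptions; the actual balance criterion is a design choice.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical beamforming filter outputs: cluster members score high,
# events from other clusters score low.
member_scores = rng.normal(3.0, 0.5, 200)  # events belonging to the cluster
other_scores = rng.normal(0.0, 0.5, 200)   # events from other clusters

def error_rates(threshold):
    fn = np.mean(member_scores < threshold)   # missed cluster members
    fp = np.mean(other_scores >= threshold)   # foreign events let through
    return fn, fp

# Pick the threshold minimizing the (equally weighted) sum of both errors.
candidates = np.linspace(-2.0, 5.0, 300)
errors = [sum(error_rates(th)) for th in candidates]
best = float(candidates[int(np.argmin(errors))])
```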
- the clusters are accepted, rejected, or ranked based on quality criteria.
- quality criteria can reflect intra- and inter-cluster distances, amplitudes of spatiotemporal response functions, biological plausibility of spatiotemporal response functions, and biological plausibility of spike times for events within the cluster (e.g. do multiple events occur within a motor neuron's refractory period).
- the event detection and clustering technique described above is repeated multiple times in an iterative manner.
- the best cluster (e.g., the cluster with the largest-amplitude events or the most tightly grouped cluster) is selected first.
- beamforming is used on recorded neuromuscular data to detect spike events corresponding to that cluster, e.g., for a first biological source.
- the detected spike events are convolved with the spatiotemporal pattern and the result is subtracted from the recorded neuromuscular signals to be used on further iterations for event detection and clustering, essentially eliminating the contribution of the first biological source to the neuromuscular signals.
- a stopping criterion e.g., when there are no more clusters that pass a quality criterion or when the residual signal has variance comparable to that of the noise (which can be estimated from a period of minimal neuromuscular activity), may be used to determine when to stop the iteration.
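An illustrative single-channel version of the iterative detect-and-subtract loop with a stopping criterion. The template, spike times, detection-by-correlation, and the 50%-of-template-energy threshold are all synthetic stand-ins, not the patent's specific beamforming detector.

```python
import numpy as np

rng = np.random.default_rng(4)
T_total, T_tpl = 200, 7
tpl = np.array([0.1, 0.5, 1.0, 0.4, -0.3, -0.1, 0.05])  # stand-in template

# One-channel recording: spikes from a single source plus Gaussian noise.
spike_times = [30, 90, 150]
x = np.zeros(T_total)
x[spike_times] = 1.0
signal = np.convolve(x, tpl) + 0.02 * rng.standard_normal(T_total + T_tpl - 1)

residual = signal.copy()
detected = []
while True:
    # Detect the strongest remaining match to the template.
    score = np.correlate(residual, tpl, mode="valid")
    t = int(np.argmax(score))
    if score[t] < 0.5 * (tpl @ tpl):  # stop when no strong match remains
        break
    detected.append(t)
    # "Peel": subtract the template contribution at the detected time,
    # removing this source's contribution from the residual.
    residual[t:t + T_tpl] -= tpl
```

After the loop the residual contains only noise, matching the stopping criterion described above (residual variance comparable to the noise floor).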
- Synthetic data may be generated by simulating neuromuscular data having spike times drawn from a random point process (e.g., a Poisson or renewal process) and convolving the spike events with spatiotemporal profiles of the spikes (the spatiotemporal profiles can be the cluster centers, samples from within the cluster, interpolations between samples within the clusters, or samples from a model fit to the cluster), and then adding noise.
- the noise can include Gaussian noise and also spatiotemporal profiles from other motor units (e.g., synthetically generated from a mathematical model, obtained from data from other users, or semisynthetic, e.g., scaled or otherwise transformed profile obtained from another user).
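The synthetic-data recipe above can be sketched as follows; the number of units, firing rate, profile shapes, and noise level are arbitrary stand-ins, and the point process is a simple Bernoulli approximation of a Poisson process.

```python
import numpy as np

rng = np.random.default_rng(5)
C_ch, T_tpl, T_total = 4, 9, 1000

# Hypothetical spatiotemporal profiles (e.g., cluster centers) for two units.
profiles = rng.standard_normal((2, C_ch, T_tpl))

# Spike times from a random point process (Bernoulli per-timestep firing,
# approximating a Poisson process at low rate).
spikes = (rng.random((2, T_total)) < 0.01).astype(float)

# Convolve each unit's spike train with its profile, sum over units, then
# add Gaussian noise (profiles from other units/users could be added too).
emg = np.zeros((C_ch, T_total + T_tpl - 1))
for u in range(2):
    for c in range(C_ch):
        emg[c] += np.convolve(spikes[u], profiles[u, c])
emg += 0.1 * rng.standard_normal(emg.shape)
```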
- the first convolutional layer of the networks can be initialized with the linear filters determined from beamforming methods, and connections to the output layer may be skipped so that the rest of the network can act as a correction to the beamforming estimates.
- the filters are saved and re-used across user sessions. Due to changes in sensor placement, reusing the filters may require calibration or registration of the signals across sessions.
- spike peeling comprises identifying one spike action potential (i.e. one biological source, generally a motor unit) at a time and extracting the spatiotemporal pattern of that biological source from the recording, potentially revealing more spikes in the residual.
- the purpose of this technique is to be able to extract spikes from a recording in an unsupervised and potentially online manner.
- spike peeling may be used to generate a session-specific "spike bank" to extract as many spikes as possible from a recording.
- FIG. 13 is a flow chart showing a substantially real-time process for detecting spike event information from neuromuscular data in accordance with some embodiments of the technology described herein.
- FIG. 13 shows an example process 1300 for spike peeling according to some embodiments. Such a process may be performed, for example, using various system or computer-based elements as described herein.
- neuromuscular signals are recorded from a user.
- the neuromuscular signals are processed at block 1304 and one or more spike events are detected at block 1306 (such as, for example, using beamforming techniques as described herein).
- the system determines the best biological source(s) for one or more spike event(s) that are detected.
- a first biological source is selected at block 1310 by applying criteria to determine a "best" biological source (i.e., motor unit) to extract, and the spike events (spike times) are extracted (i.e., saved for use as a control signal or other use) at block 1312.
- at block 1314, the spatiotemporal waveform for that biological source (e.g., a spatiotemporal template for that biological source) is subtracted from the recorded signals.
- the workflow of blocks 1306 through 1314 repeats one or more times until no biological source remains in the signal that meets a minimum threshold for selection as a biological source of spike events.
- One benefit of spike peeling is that it may be possible to group spikes into clusters corresponding to individual biological sources in an unsupervised manner.
- One effective workflow for spike peeling selects the "best" spike to extract next based on the product of its highest amplitude and the log of the number of spikes assigned to it in the clustering process.
- the inventors have recognized that using just the highest amplitude as a criterion for selecting the next spike for extraction tends to pick up artifacts that are localized in a single sample in a single electrode, whereas using a criterion based only on the number of detected spikes tends to pick up common low-amplitude patterns that are not spike-like (i.e. not physiologically plausible).
- the combination of spike amplitude and number of detected spikes is most effective, in at least some instances, to identify spatiotemporal patterns that exhibit physiological characteristics.
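The amplitude × log(count) selection criterion can be illustrated with hypothetical candidates: a single-sample artifact (high amplitude, few assigned spikes) and a common low-amplitude pattern both score below a physiologically plausible unit. The numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical clustering candidates: peak amplitude and assigned spike count.
amplitudes = np.array([5.0, 0.9, 1.8])  # a.u.; first is a 1-sample artifact
counts = np.array([3, 400, 120])        # spikes assigned to each candidate

# Selection score = amplitude * log(count): amplitude alone favors artifacts,
# count alone favors common non-spike-like patterns; the product balances both.
scores = amplitudes * np.log(counts)
best = int(np.argmax(scores))  # the plausible unit (index 2) wins
```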
- FIGs. 14A-14B and 15A-15B show several embodiments of a wearable system in which various embodiments may be practiced.
- FIG. 14A illustrates a wearable system with sixteen neuromuscular sensors 1410 (e.g., EMG sensors) arranged circumferentially around an elastic band 1420 configured to be worn around a user’s lower arm or wrist.
- EMG sensors 1410 are arranged circumferentially around elastic band 1420.
- any suitable number of neuromuscular sensors may be used. The number and arrangement of neuromuscular sensors may depend on the particular application for which the wearable device is used.
- a wearable armband or wristband can be used to generate control information for controlling an augmented reality system, a robot, or a vehicle, scrolling through text, controlling a virtual avatar, or performing any other suitable control task.
- sensors 1410 include a set of neuromuscular sensors (e.g., EMG sensors). In other embodiments, sensors 1410 can include a set of neuromuscular sensors and at least one“auxiliary” sensor configured to continuously record auxiliary signals. Examples of auxiliary sensors include, but are not limited to, other sensors such as IMU sensors, microphones, imaging sensors (e.g., a camera), radiation based sensors for use with a radiation-generation device (e.g., a laser- scanning device), or other types of sensors such as a heart-rate monitor. As shown the sensors 1410 may be coupled together using flexible electronics 1430 incorporated into the wearable device. FIG. 14B illustrates a cross-sectional view through one of the sensors 1410 of the wearable device shown in FIG. 14 A.
- the output of one or more of the sensing components can be optionally processed using hardware signal processing circuitry (e.g., to perform amplification, filtering, and/or rectification).
- at least some signal processing of the output of the sensing components can be performed in software.
- signal processing of signals sampled by the sensors can be performed in hardware, software, or by any suitable combination of hardware and software, as aspects of the technology described herein are not limited in this respect.
- a non-limiting example of a signal processing chain used to process recorded data from sensors 1410 is discussed in more detail below in connection with FIGS. 15A and 15B.
- FIGS. 15A and 15B illustrate a schematic diagram with internal components of a wearable system with sixteen EMG sensors, in accordance with some embodiments of the technology described herein.
- the wearable system includes a wearable portion 1510 (FIG. 15A) and a dongle portion 1520 (FIG. 15B) in communication with the wearable portion 1510 (e.g., via Bluetooth or another suitable short range wireless communication technology).
- the wearable portion 1510 includes the sensors 1410, examples of which are described in connection with FIGS. 14A and 14B.
- the output of the sensors 1410 is provided to analog front end 1530 configured to perform analog processing (e.g., noise reduction, filtering, etc.) on the recorded signals.
- the processed analog signals are then provided to analog-to-digital converter 1532, which converts the analog signals to digital signals that can be processed by one or more computer processors.
- An example of a computer processor that may be used in accordance with some embodiments is microcontroller (MCU) 1534 illustrated in FIG. 15A. As shown, MCU 1534 may also include inputs from other sensors (e.g., IMU sensor 1540), and power and battery module 1542. The output of the processing performed by the MCU may be provided to antenna 1550 for transmission to dongle portion 1520 shown in FIG. 15B.
- Dongle portion 1520 includes antenna 1552 configured to communicate with antenna 1550 included as part of wearable portion 1510. Communication between antenna 1550 and 1552 may occur using any suitable wireless technology and protocol, non-limiting examples of which include radiofrequency signaling and Bluetooth. As shown, the signals received by antenna 1552 of dongle portion 1520 may be provided to a host computer for further processing, display, and/or for effecting control of a particular physical or virtual object or objects.
- Although FIGS. 14A, 14B and FIGS. 15A, 15B are discussed in the context of interfaces with EMG sensors, it is understood that the techniques described herein for reducing electromagnetic interference can also be implemented in wearable interfaces with other types of sensors including, but not limited to, mechanomyography (MMG) sensors, sonomyography (SMG) sensors, and electrical impedance tomography (EIT) sensors.
- the above-described embodiments can be implemented in any of numerous ways.
- the embodiments may be implemented using hardware, software or a combination thereof.
- the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
- any component or collection of components that perform the functions described above can be generically considered as one or more controllers that control the above-discussed functions.
- the one or more controllers can be implemented in numerous ways, such as with dedicated hardware or with one or more processors programmed using microcode or software to perform the functions recited above.
- one implementation of the embodiments of the present invention comprises at least one non-transitory computer-readable storage medium (e.g., a computer memory, a portable memory, a compact disk, etc.) encoded with a computer program (i.e., a plurality of instructions), which, when executed on a processor, performs the above-discussed functions of the embodiments of the present invention.
- the computer-readable storage medium can be transportable such that the program stored thereon can be loaded onto any computer resource to implement the aspects of the present invention discussed herein.
- the reference to a computer program which, when executed, performs the above-discussed functions is not limited to an application program running on a host computer. Rather, the term computer program is used herein in a generic sense to reference any type of computer code (e.g., software or microcode) that can be employed to program a processor to implement the above- discussed aspects of the present invention.
- embodiments of the invention may be implemented as one or more methods, of which an example has been provided.
- the acts performed as part of the method(s) may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862718337P | 2018-08-13 | 2018-08-13 | |
PCT/US2019/046351 WO2020036958A1 (fr) | 2018-08-13 | 2019-08-13 | Détection et identification de pointes en temps réel |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3836836A1 true EP3836836A1 (fr) | 2021-06-23 |
EP3836836A4 EP3836836A4 (fr) | 2021-09-29 |
EP3836836B1 EP3836836B1 (fr) | 2024-03-20 |
Family
ID=69405236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19850130.6A Active EP3836836B1 (fr) | 2018-08-13 | 2019-08-13 | Détection et identification de pointes en temps réel |
Country Status (4)
Country | Link |
---|---|
US (1) | US11179066B2 (fr) |
EP (1) | EP3836836B1 (fr) |
CN (1) | CN112566553A (fr) |
WO (1) | WO2020036958A1 (fr) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021071915A1 (fr) | 2019-10-08 | 2021-04-15 | Unlimited Tomorrow, Inc. | Réseau de capteurs biométriques |
AU2021282171A1 (en) * | 2020-05-27 | 2022-12-15 | The Johns Hopkins University | System and method for implantable muscle interface |
US20210390436A1 (en) * | 2020-06-11 | 2021-12-16 | Sap Se | Determining Categories For Data Objects Based On Machine Learning |
US11589782B2 (en) * | 2020-08-17 | 2023-02-28 | The Trustees of the California State University | Movement analysis and feedback systems, applications, devices, and methods of production thereof |
CN112434630B (zh) * | 2020-12-01 | 2022-08-16 | 深圳先进技术研究院 | 连续运动信息预测模型的训练方法及其训练装置、设备 |
CN112971813A (zh) * | 2021-01-30 | 2021-06-18 | 上海麓联智能科技有限公司 | 一种用于神经信号的峰电位分类系统及分类方法 |
CN113255579B (zh) * | 2021-06-18 | 2021-09-24 | 上海建工集团股份有限公司 | 一种施工监测异常采集数据自动识别与处理的方法 |
EP4423724A1 (fr) * | 2021-10-25 | 2024-09-04 | Magic Leap, Inc. | Cartographie de réponse audio environnementale sur un dispositif de réalité mixte |
CN117825601B (zh) * | 2024-03-05 | 2024-05-24 | 山东润达检测技术有限公司 | 一种食品中二氧化硫的测定方法 |
CN118266949B (zh) * | 2024-06-03 | 2024-09-03 | 之江实验室 | 一种基于深度学习的脑深部锋电位信号检测方法和装置 |
Family Cites Families (250)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4055168A (en) | 1976-09-21 | 1977-10-25 | The Rockefeller University | Posture training device |
IL78244A0 (en) | 1986-03-24 | 1986-07-31 | Zvi Kamil | Instrumentation amplifier arrangement |
US5625577A (en) | 1990-12-25 | 1997-04-29 | Shukyohojin, Kongo Zen Sohonzan Shorinji | Computer-implemented motion analysis method using dynamics |
JP3103427B2 (ja) | 1992-04-01 | 2000-10-30 | ダイヤメディカルシステム株式会社 | 生体電気現象検出装置 |
EP0959444A4 (fr) | 1996-08-14 | 2005-12-07 | Nurakhmed Nurislamovic Latypov | Procede de suivi et de representation de la position et de l'orientation d'un sujet dans l'espace, procede de presentation d'un espace virtuel a ce sujet, et systemes de mise en oeuvre de ces procedes |
US6009210A (en) | 1997-03-05 | 1999-12-28 | Digital Equipment Corporation | Hands-free interface to a virtual reality environment using head tracking |
EP1107693A4 (fr) | 1998-08-24 | 2003-03-19 | Univ Emory | Procede et appareil pour predire l'apparition de crises en fonction de caracteristiques derivees de signaux indiquant une activite du cerveau |
US6745062B1 (en) | 1998-10-05 | 2004-06-01 | Advanced Imaging Systems, Inc. | Emg electrode apparatus and positioning system |
US6244873B1 (en) | 1998-10-16 | 2001-06-12 | At&T Corp. | Wireless myoelectric control apparatus and methods |
US6774885B1 (en) | 1999-01-20 | 2004-08-10 | Motek B.V. | System for dynamic registration, evaluation, and correction of functional human behavior |
US6411843B1 (en) | 1999-05-28 | 2002-06-25 | Respironics, Inc. | Method and apparatus for producing a model EMG signal from a measured EMG signal |
US6720984B1 (en) | 2000-06-13 | 2004-04-13 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Characterization of bioelectric potentials |
AU2001278318A1 (en) | 2000-07-24 | 2002-02-05 | Jean Nicholson Prudent | Modeling human beings by symbol manipulation |
US6820025B2 (en) | 2000-10-30 | 2004-11-16 | The United States Of America As Represented By The Secretary Of The Navy | Method and apparatus for motion tracking of an articulated rigid body |
WO2003063684A2 (fr) | 2002-01-25 | 2003-08-07 | Intellipatch, Inc. | Systeme et procede permettant de detecter et d'evaluer des parametres physiologiques et de modeliser une analyse predictive adaptable pour la gestion de symptomes |
JP2003255993A (ja) | 2002-03-04 | 2003-09-10 | Ntt Docomo Inc | 音声認識システム、音声認識方法、音声認識プログラム、音声合成システム、音声合成方法、音声合成プログラム |
US20040073414A1 (en) * | 2002-06-04 | 2004-04-15 | Brown University Research Foundation | Method and system for inferring hand motion from multi-cell recordings in the motor cortex using a kalman filter or a bayesian model |
US6942621B2 (en) | 2002-07-11 | 2005-09-13 | Ge Medical Systems Information Technologies, Inc. | Method and apparatus for detecting weak physiological signals |
JP3831788B2 (ja) * | 2002-09-11 | 2006-10-11 | 独立行政法人情報通信研究機構 | 活動筋肉表示装置 |
KR100506084B1 (ko) | 2002-10-24 | 2005-08-05 | 삼성전자주식회사 | 경혈점 탐색 장치 및 방법 |
EP1670547B1 (fr) | 2003-08-18 | 2008-11-12 | Cardiac Pacemakers, Inc. | Systeme de surveillance de patient |
CN1838933B (zh) | 2003-08-21 | 2010-12-08 | 国立大学法人筑波大学 | 穿着式动作辅助装置、穿着式动作辅助装置的控制方法和控制用程序 |
JP4178186B2 (ja) | 2003-08-21 | 2008-11-12 | 国立大学法人 筑波大学 | 装着式動作補助装置、装着式動作補助装置の制御方法および制御用プログラム |
US7565295B1 (en) | 2003-08-28 | 2009-07-21 | The George Washington University | Method and apparatus for translating hand gestures |
US7574253B2 (en) | 2003-09-26 | 2009-08-11 | Northwestern University | Signal processing using non-linear regression with a sinusoidal model |
US7961909B2 (en) | 2006-03-08 | 2011-06-14 | Electronic Scripting Products, Inc. | Computer interface employing a manipulated object with absolute pose detection component and a display |
JP4590640B2 (ja) | 2004-06-16 | 2010-12-01 | 国立大学法人 東京大学 | 筋骨格モデルに基づく筋力取得方法及び装置 |
US7901368B2 (en) | 2005-01-06 | 2011-03-08 | Braingate Co., Llc | Neurally controlled patient ambulation system |
WO2006105094A2 (fr) | 2005-03-29 | 2006-10-05 | Duke University | Systeme detecteur permettant l'identification et la poursuite des deplacements de multiples sources |
US7428516B2 (en) | 2005-06-23 | 2008-09-23 | Microsoft Corporation | Handwriting recognition using neural networks |
US8190249B1 (en) | 2005-08-01 | 2012-05-29 | Infinite Biomedical Technologies, Llc | Multi-parametric quantitative analysis of bioelectrical signals |
US7725147B2 (en) | 2005-09-29 | 2010-05-25 | Nellcor Puritan Bennett Llc | System and method for removing artifacts from waveforms |
US8280503B2 (en) | 2008-10-27 | 2012-10-02 | Michael Linderman | EMG measured during controlled hand movement for biometric analysis, medical diagnosis and related analysis |
JP4826459B2 (ja) | 2006-01-12 | 2011-11-30 | 株式会社豊田中央研究所 | 筋骨格モデル作成方法、人体応力/ひずみ推定方法、プログラムおよび記録媒体 |
US8762733B2 (en) | 2006-01-30 | 2014-06-24 | Adidas Ag | System and method for identity confirmation using physiologic biometrics to determine a physiologic fingerprint |
US7580742B2 (en) | 2006-02-07 | 2009-08-25 | Microsoft Corporation | Using electroencephalograph signals for task classification and activity recognition |
US7827000B2 (en) | 2006-03-03 | 2010-11-02 | Garmin Switzerland Gmbh | Method and apparatus for estimating a motion parameter |
WO2007120819A2 (fr) | 2006-04-15 | 2007-10-25 | The Board Of Regents Of The Leland Stanford Junior University | Systemes et procedes pour estimer une electromyographie de surface |
WO2008054511A2 (fr) | 2006-04-21 | 2008-05-08 | Quantum Applied Science & Research, Inc. | Système pour mesurer des signaux électriques |
GB2453263A (en) | 2006-05-16 | 2009-04-01 | Douglas S Greer | System and method for modeling the neocortex and uses therefor |
US7661068B2 (en) | 2006-06-12 | 2010-02-09 | Microsoft Corporation | Extended eraser functions |
US9405372B2 (en) | 2006-07-14 | 2016-08-02 | Ailive, Inc. | Self-contained inertial navigation system for interactive control using movable controllers |
US7848797B2 (en) * | 2006-08-17 | 2010-12-07 | Neurometrix, Inc. | Motor unit number estimation (MUNE) for the assessment of neuromuscular function |
US8437844B2 (en) | 2006-08-21 | 2013-05-07 | Holland Bloorview Kids Rehabilitation Hospital | Method, system and apparatus for real-time classification of muscle signals from self-selected intentional movements |
JP4267648B2 (ja) | 2006-08-25 | 2009-05-27 | 株式会社東芝 | インターフェース装置及びその方法 |
US7885732B2 (en) | 2006-10-25 | 2011-02-08 | The Boeing Company | Systems and methods for haptics-enabled teleoperation of vehicles and other devices |
US20080221487A1 (en) | 2007-03-07 | 2008-09-11 | Motek Bv | Method for real time interactive visualization of muscle forces and joint torques in the human body |
JP5357142B2 (ja) | 2007-04-24 | 2013-12-04 | コーニンクレッカ フィリップス エヌ ヴェ | 生理学的パラメータを測定するセンサー配置構成および方法 |
FR2916069B1 (fr) | 2007-05-11 | 2009-07-31 | Commissariat Energie Atomique | Procede de traitement pour la capture de mouvement d'une structure articulee |
DE102007044555A1 (de) | 2007-07-18 | 2009-01-22 | Siemens Ag | Optische Koppelvorrichtung und Verfahren zu deren Herstellung |
US8726194B2 (en) | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
JP5559691B2 (ja) | 2007-09-24 | 2014-07-23 | クアルコム,インコーポレイテッド | 音声及びビデオ通信のための機能向上したインタフェース |
US20090082692A1 (en) | 2007-09-25 | 2009-03-26 | Hale Kelly S | System And Method For The Real-Time Evaluation Of Time-Locked Physiological Measures |
US7714757B2 (en) | 2007-09-26 | 2010-05-11 | Medtronic, Inc. | Chopper-stabilized analog-to-digital converter |
US8343079B2 (en) | 2007-10-18 | 2013-01-01 | Innovative Surgical Solutions, Llc | Neural monitoring sensor |
FI20075798A0 (fi) | 2007-11-12 | 2007-11-12 | Polar Electro Oy | Elektrodirakenne |
GB0800144D0 (en) | 2008-01-04 | 2008-02-13 | Fitzpatrick Adam P | Electrocardiographic device and method |
US9597015B2 (en) | 2008-02-12 | 2017-03-21 | Portland State University | Joint angle tracking with inertial sensors |
US20100030532A1 (en) | 2008-06-12 | 2010-02-04 | Jasbir Arora | System and methods for digital human model prediction and simulation |
US8447704B2 (en) | 2008-06-26 | 2013-05-21 | Microsoft Corporation | Recognizing gestures from forearm EMG signals |
US8170656B2 (en) | 2008-06-26 | 2012-05-01 | Microsoft Corporation | Wearable electromyography-based controllers for human-computer interface |
US9037530B2 (en) | 2008-06-26 | 2015-05-19 | Microsoft Technology Licensing, Llc | Wearable electromyography-based human-computer interface |
US8444564B2 (en) | 2009-02-02 | 2013-05-21 | Jointvue, Llc | Noninvasive diagnostic system |
WO2010129922A2 (fr) * | 2009-05-07 | 2010-11-11 | Massachusetts Eye & Ear Infirmary | Traitement de signaux dans du bruit physiologique |
US8376968B2 (en) | 2009-05-15 | 2013-02-19 | The Hong Kong Polytechnic University | Method and system for quantifying an intention of movement of a user |
US20100315266A1 (en) | 2009-06-15 | 2010-12-16 | Microsoft Corporation | Predictive interfaces with usability constraints |
EP2459062A4 (fr) | 2009-07-30 | 2017-04-05 | University of Cape Town | Non-invasive deep muscle electromyography |
US8718980B2 (en) | 2009-09-11 | 2014-05-06 | Qualcomm Incorporated | Method and apparatus for artifacts mitigation with multiple wireless sensors |
US20110077484A1 (en) | 2009-09-30 | 2011-03-31 | Nellcor Puritan Bennett Ireland | Systems And Methods For Identifying Non-Corrupted Signal Segments For Use In Determining Physiological Parameters |
TWI496558B (zh) | 2009-10-20 | 2015-08-21 | Tatung Co | System and method for measuring electrocardiogram and respiration signals using two-electrode patches |
US8421634B2 (en) | 2009-12-04 | 2013-04-16 | Microsoft Corporation | Sensing mechanical energy to appropriate the body for data input |
EP2512331A4 (fr) | 2009-12-16 | 2015-01-14 | Ictalcare As | System for the prediction of epileptic seizures |
US9268404B2 (en) | 2010-01-08 | 2016-02-23 | Microsoft Technology Licensing, Llc | Application gesture interpretation |
US8631355B2 (en) | 2010-01-08 | 2014-01-14 | Microsoft Corporation | Assigning gesture dictionaries |
JP5471490B2 (ja) | 2010-01-20 | 2014-04-16 | Omron Healthcare Co., Ltd. | Body motion detection device |
EP2548153A1 (fr) | 2010-03-16 | 2013-01-23 | Carlo Trugenberger | Authentication system, method for authenticating an object, apparatus for producing an identification device, and method for producing an identification device |
US8351651B2 (en) | 2010-04-26 | 2013-01-08 | Microsoft Corporation | Hand-location post-process refinement in a tracking system |
US8754862B2 (en) | 2010-07-11 | 2014-06-17 | Lester F. Ludwig | Sequential classification recognition of gesture primitives and window-based parameter smoothing for high dimensional touchpad (HDTP) user interfaces |
FR2962821B1 (fr) | 2010-07-13 | 2013-02-22 | Commissariat Energie Atomique | Method and system for classifying neural signals, and method for selecting electrodes for direct neural control |
CN103068348B (zh) | 2010-08-02 | 2015-07-15 | Johns Hopkins University | Method for presenting force sensor information using cooperative robot control and audio feedback |
US20120066163A1 (en) | 2010-09-13 | 2012-03-15 | Nottingham Trent University | Time to event data analysis method and system |
US20130123656A1 (en) * | 2010-11-15 | 2013-05-16 | Sandy L. Heck | Control System and Apparatus Utilizing Signals Originating in the Periauricular Neuromuscular System |
WO2012155157A1 (fr) | 2011-05-06 | 2012-11-15 | Azoteq (Pty) Ltd | Multiple media capacitive sensor |
US9251588B2 (en) | 2011-06-20 | 2016-02-02 | Nokia Technologies Oy | Methods, apparatuses and computer program products for performing accurate pose estimation of objects |
US9128521B2 (en) | 2011-07-13 | 2015-09-08 | Lumo Bodytech, Inc. | System and method of biomechanical posture detection and feedback including sensor normalization |
US9707393B2 (en) | 2011-08-26 | 2017-07-18 | National Yunlin University Of Science And Technology | Feedback-control wearable upper-limb electrical stimulation device |
KR20140069124A (ko) | 2011-09-19 | 2014-06-09 | Eyesight Mobile Technologies Ltd. | Touch-free interface for augmented reality systems |
US20130077820A1 (en) | 2011-09-26 | 2013-03-28 | Microsoft Corporation | Machine learning gesture detection |
FR2981561B1 (fr) | 2011-10-21 | 2015-03-20 | Commissariat Energie Atomique | Method for detecting activity with a motion sensor, and corresponding device and computer program |
ITTO20111024A1 (it) | 2011-11-08 | 2013-05-09 | Bitron Spa | Measuring device for high-resolution, high-channel-count electromyographic signals |
US10176299B2 (en) * | 2011-11-11 | 2019-01-08 | Rutgers, The State University Of New Jersey | Methods for the diagnosis and treatment of neurological disorders |
US10430066B2 (en) | 2011-12-06 | 2019-10-01 | Nri R&D Patent Licensing, Llc | Gesteme (gesture primitive) recognition for advanced touch user interfaces |
JP2013206273A (ja) | 2012-03-29 | 2013-10-07 | Sony Corp | Information processing device, information processing method, and information processing system |
US10130298B2 (en) | 2012-04-03 | 2018-11-20 | Carnegie Mellon University | Musculoskeletal activity recognition system and method |
US9867548B2 (en) | 2012-05-25 | 2018-01-16 | Emotiv, Inc. | System and method for providing and aggregating biosignals and action data |
US9278453B2 (en) | 2012-05-25 | 2016-03-08 | California Institute Of Technology | Biosleeve human-machine interface |
US9891718B2 (en) | 2015-04-22 | 2018-02-13 | Medibotics Llc | Devices for measuring finger motion and recognizing hand gestures |
US20150366504A1 (en) | 2014-06-20 | 2015-12-24 | Medibotics Llc | Electromyographic Clothing |
US10921886B2 (en) | 2012-06-14 | 2021-02-16 | Medibotics Llc | Circumferential array of electromyographic (EMG) sensors |
US9814426B2 (en) | 2012-06-14 | 2017-11-14 | Medibotics Llc | Mobile wearable electromagnetic brain activity monitor |
US9582072B2 (en) | 2013-09-17 | 2017-02-28 | Medibotics Llc | Motion recognition clothing [TM] with flexible electromagnetic, light, or sonic energy pathways |
US8484022B1 (en) | 2012-07-27 | 2013-07-09 | Google Inc. | Adaptive auto-encoders |
US20150182165A1 (en) | 2012-08-03 | 2015-07-02 | Neurotopia, Inc. | Neurophysiological training headset |
US10234941B2 (en) | 2012-10-04 | 2019-03-19 | Microsoft Technology Licensing, Llc | Wearable sensor for tracking articulated body-parts |
WO2014107213A2 (fr) | 2012-10-16 | 2014-07-10 | The Florida International University Board Of Trustees | Neural interface activity simulator |
US9351653B1 (en) * | 2012-11-29 | 2016-05-31 | Intan Technologies, LLC | Multi-channel reconfigurable systems and methods for sensing biopotential signals |
WO2014085910A1 (fr) | 2012-12-04 | 2014-06-12 | Interaxon Inc. | System and method for enhancing content using brain-state data |
US20140196131A1 (en) | 2013-01-07 | 2014-07-10 | Salutron, Inc. | User authentication based on a wrist vein pattern |
US10528135B2 (en) | 2013-01-14 | 2020-01-07 | Ctrl-Labs Corporation | Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display |
US9459697B2 (en) | 2013-01-15 | 2016-10-04 | Leap Motion, Inc. | Dynamic, free-space user interactions for machine control |
CN105190578A (zh) | 2013-02-22 | 2015-12-23 | Thalmic Labs Inc. | Methods and devices that combine muscle activity sensor signals and inertial sensor signals for gesture-based control |
US20140245200A1 (en) | 2013-02-25 | 2014-08-28 | Leap Motion, Inc. | Display control with gesture-selectable control paradigms |
US20140249397A1 (en) | 2013-03-01 | 2014-09-04 | Thalmic Labs Inc. | Differential non-contact biopotential sensor |
US20140277622A1 (en) | 2013-03-15 | 2014-09-18 | First Principles, Inc. | System and method for bio-signal control of an electronic device |
US9436287B2 (en) | 2013-03-15 | 2016-09-06 | Qualcomm Incorporated | Systems and methods for switching processing modes using gestures |
US9361411B2 (en) | 2013-03-15 | 2016-06-07 | Honeywell International, Inc. | System and method for selecting a respirator |
US9766709B2 (en) | 2013-03-15 | 2017-09-19 | Leap Motion, Inc. | Dynamic user interactions for display control |
IN2013MU01148A (fr) | 2013-03-26 | 2015-04-24 | Tata Consultancy Services Ltd | |
US10620709B2 (en) | 2013-04-05 | 2020-04-14 | Ultrahaptics IP Two Limited | Customized gesture interpretation |
US9717440B2 (en) | 2013-05-03 | 2017-08-01 | The Florida International University Board Of Trustees | Systems and methods for decoding intended motor commands from recorded neural signals for the control of external devices or to interact in virtual environments |
WO2014186370A1 (fr) | 2013-05-13 | 2014-11-20 | Thalmic Labs Inc. | Systems, articles and methods for wearable electronic devices that accommodate different user forms |
US10314506B2 (en) | 2013-05-15 | 2019-06-11 | Polar Electro Oy | Heart activity sensor structure |
US10620775B2 (en) | 2013-05-17 | 2020-04-14 | Ultrahaptics IP Two Limited | Dynamic interactive objects |
US9218574B2 (en) | 2013-05-29 | 2015-12-22 | Purepredictive, Inc. | User interface for machine learning |
WO2014194257A1 (fr) | 2013-05-31 | 2014-12-04 | President And Fellows Of Harvard College | Soft exosuit for assistance with human motion |
US9383819B2 (en) | 2013-06-03 | 2016-07-05 | Daqri, Llc | Manipulation of virtual object in augmented reality via intent |
WO2014197443A1 (fr) | 2013-06-03 | 2014-12-11 | Kacyvenski Isaiah | Motion sensor and analysis |
KR101933921B1 (ko) | 2013-06-03 | 2018-12-31 | Samsung Electronics Co., Ltd. | Pose estimation method and apparatus |
US11083402B2 (en) * | 2013-06-04 | 2021-08-10 | Medtronic, Inc. | Patient state determination based on one or more spectral characteristics of a bioelectrical brain signal |
KR101501661B1 (ko) | 2013-06-10 | 2015-03-12 | Korea Institute of Science and Technology | Wearable electromyography sensor system |
WO2014204330A1 (fr) | 2013-06-17 | 2014-12-24 | 3Divi Company | Methods and systems for determining the 6-DOF position and orientation of a head-mounted display and associated user movements |
US20140376773A1 (en) | 2013-06-21 | 2014-12-25 | Leap Motion, Inc. | Tunable operational parameters in motion-capture and touchless interface operation |
US10402517B2 (en) | 2013-06-26 | 2019-09-03 | Dassault Systémes Simulia Corp. | Musculo-skeletal modeling using finite element analysis, process integration, and design optimization |
US9408316B2 (en) | 2013-07-22 | 2016-08-02 | Thalmic Labs Inc. | Systems, articles and methods for strain mitigation in wearable electronic devices |
US20150029092A1 (en) | 2013-07-23 | 2015-01-29 | Leap Motion, Inc. | Systems and methods of interpreting complex gestures |
US20150057770A1 (en) | 2013-08-23 | 2015-02-26 | Thalmic Labs Inc. | Systems, articles, and methods for human-electronics interfaces |
US10042422B2 (en) | 2013-11-12 | 2018-08-07 | Thalmic Labs Inc. | Systems, articles, and methods for capacitive electromyography sensors |
US20150124566A1 (en) | 2013-10-04 | 2015-05-07 | Thalmic Labs Inc. | Systems, articles and methods for wearable electronic devices employing contact sensors |
US11426123B2 (en) | 2013-08-16 | 2022-08-30 | Meta Platforms Technologies, Llc | Systems, articles and methods for signal routing in wearable electronic devices that detect muscle activity of a user using a set of discrete and separately enclosed pod structures |
US9788789B2 (en) | 2013-08-30 | 2017-10-17 | Thalmic Labs Inc. | Systems, articles, and methods for stretchable printed circuit boards |
US9372535B2 (en) | 2013-09-06 | 2016-06-21 | Thalmic Labs Inc. | Systems, articles, and methods for electromyography-based human-electronics interfaces |
US9483123B2 (en) | 2013-09-23 | 2016-11-01 | Thalmic Labs Inc. | Systems, articles, and methods for gesture identification in wearable electromyography devices |
CN105578954B (zh) | 2013-09-25 | 2019-03-29 | MindMaze Holding SA | Physiological parameter measurement and feedback system |
US9389694B2 (en) | 2013-10-22 | 2016-07-12 | Thalmic Labs Inc. | Systems, articles, and methods for gesture identification in wearable electromyography devices |
CN103777752A (zh) | 2013-11-02 | 2014-05-07 | Shanghai Weipu Electronic Technology Co., Ltd. | Gesture recognition device based on arm muscle current detection and motion sensors |
GB2519987B (en) | 2013-11-04 | 2021-03-03 | Imperial College Innovations Ltd | Biomechanical activity monitoring |
US9594433B2 (en) | 2013-11-05 | 2017-03-14 | At&T Intellectual Property I, L.P. | Gesture-based controls via bone conduction |
CN105722479B (zh) | 2013-11-13 | 2018-04-13 | HRL Laboratories, LLC | System for controlling brain-machine interfaces and neuroprosthetic systems |
WO2015081113A1 (fr) | 2013-11-27 | 2015-06-04 | Cezar Morun | Systems, articles, and methods for electromyography sensors |
US20150157944A1 (en) | 2013-12-06 | 2015-06-11 | Glenn I. Gottlieb | Software Application for Generating a Virtual Simulation for a Sport-Related Activity |
US9367139B2 (en) | 2013-12-12 | 2016-06-14 | Thalmic Labs Inc. | Systems, articles, and methods for gesture identification in wearable electromyography devices |
US9524580B2 (en) | 2014-01-06 | 2016-12-20 | Oculus Vr, Llc | Calibration of virtual reality systems |
US9659403B1 (en) | 2014-01-06 | 2017-05-23 | Leap Motion, Inc. | Initializing orientation in space for predictive information for free space gesture control and communication |
US9613262B2 (en) | 2014-01-15 | 2017-04-04 | Leap Motion, Inc. | Object detection and tracking for providing a virtual device experience |
US9600030B2 (en) | 2014-02-14 | 2017-03-21 | Thalmic Labs Inc. | Systems, articles, and methods for elastic electrical cables and wearable electronic devices employing same |
JP2017508229A (ja) | 2014-02-28 | 2017-03-23 | Gupta, Vikas | Wrist-mounted camera system operated by gestures |
US10613642B2 (en) | 2014-03-12 | 2020-04-07 | Microsoft Technology Licensing, Llc | Gesture parameter tuning |
US20150261306A1 (en) | 2014-03-17 | 2015-09-17 | Thalmic Labs Inc. | Systems, devices, and methods for selecting between multiple wireless connections |
US10199008B2 (en) | 2014-03-27 | 2019-02-05 | North Inc. | Systems, devices, and methods for wearable electronic devices as state machines |
US10409382B2 (en) | 2014-04-03 | 2019-09-10 | Honda Motor Co., Ltd. | Smart tutorial for gesture control system |
US20150296553A1 (en) | 2014-04-11 | 2015-10-15 | Thalmic Labs Inc. | Systems, devices, and methods that establish proximity-based wireless connections |
US9858391B2 (en) | 2014-04-17 | 2018-01-02 | The Boeing Company | Method and system for tuning a musculoskeletal model |
WO2015164951A1 (fr) | 2014-05-01 | 2015-11-05 | Abbas Mohamad | Methods and systems relating to personalized evolving avatars |
US20150325202A1 (en) | 2014-05-07 | 2015-11-12 | Thalmic Labs Inc. | Systems, devices, and methods for wearable computers with heads-up displays |
US9785247B1 (en) | 2014-05-14 | 2017-10-10 | Leap Motion, Inc. | Systems and methods of tracking moving hands and recognizing gestural interactions |
USD756359S1 (en) | 2014-05-15 | 2016-05-17 | Thalmic Labs Inc. | Expandable armband device |
KR101666399B1 (ko) | 2014-05-15 | 2016-10-14 | Korea Institute of Science and Technology | Method for extracting body joint kinematics from multi-channel surface electromyography, and recording medium and device for performing the same |
USD717685S1 (en) | 2014-05-15 | 2014-11-18 | Thalmic Labs Inc. | Expandable armband |
US9741169B1 (en) | 2014-05-20 | 2017-08-22 | Leap Motion, Inc. | Wearable augmented reality devices with object detection and tracking |
US10782657B2 (en) | 2014-05-27 | 2020-09-22 | Ultrahaptics IP Two Limited | Systems and methods of gestural interaction in a pervasive computing environment |
US9880632B2 (en) | 2014-06-19 | 2018-01-30 | Thalmic Labs Inc. | Systems, devices, and methods for gesture identification |
WO2015199747A1 (fr) | 2014-06-23 | 2015-12-30 | Thalmic Labs Inc. | Systems, articles, and methods for wearable human-electronics interface devices |
US10216274B2 (en) | 2014-06-23 | 2019-02-26 | North Inc. | Systems, articles, and methods for wearable human-electronics interface devices |
US9552069B2 (en) | 2014-07-11 | 2017-01-24 | Microsoft Technology Licensing, Llc | 3D gesture recognition |
US9734704B2 (en) | 2014-08-12 | 2017-08-15 | Dominick S. LEE | Wireless gauntlet for electronic control |
WO2016041088A1 (fr) | 2014-09-19 | 2016-03-24 | Sulon Technologies Inc. | System and method for localizing wearable peripherals in augmented and virtual reality applications |
US9864460B2 (en) | 2014-09-26 | 2018-01-09 | Sensel, Inc. | Systems and methods for manipulating a virtual environment |
US9811555B2 (en) | 2014-09-27 | 2017-11-07 | Intel Corporation | Recognition of free-form gestures from orientation tracking of a handheld or wearable device |
WO2016076376A1 (fr) | 2014-11-12 | 2016-05-19 | Kyocera Corporation | Wearable device |
US9720515B2 (en) | 2015-01-02 | 2017-08-01 | Wearable Devices Ltd. | Method and apparatus for a gesture controlled interface for wearable devices |
US9612661B2 (en) | 2015-01-02 | 2017-04-04 | Wearable Devices Ltd. | Closed loop feedback interface for wearable devices |
US9696795B2 (en) | 2015-02-13 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments |
US20160274758A1 (en) | 2015-03-20 | 2016-09-22 | Thalmic Labs Inc. | Systems, devices, and methods for mitigating false positives in human-electronics interfaces |
US10432842B2 (en) | 2015-04-06 | 2019-10-01 | The Texas A&M University System | Fusion of inertial and depth sensors for movement measurements and recognition |
WO2016168117A2 (fr) | 2015-04-14 | 2016-10-20 | John James Daniels | Wearable, multi-sensory electrical human/human and machine/human interfaces |
US9804733B2 (en) | 2015-04-21 | 2017-10-31 | Dell Products L.P. | Dynamic cursor focus in a multi-display information handling system environment |
US10078435B2 (en) | 2015-04-24 | 2018-09-18 | Thalmic Labs Inc. | Systems, methods, and computer program products for interacting with electronically displayed presentation materials |
GB2537899B (en) | 2015-04-30 | 2018-02-21 | Hy5Pro As | Control of digits for artificial hand |
US9654477B1 (en) | 2015-05-05 | 2017-05-16 | Wells Fargo Bank, N. A. | Adaptive authentication |
US9898864B2 (en) | 2015-05-28 | 2018-02-20 | Microsoft Technology Licensing, Llc | Shared tactile interaction and user safety in shared space multi-person immersive virtual reality |
EP3302688B1 (fr) * | 2015-06-02 | 2020-11-04 | Battelle Memorial Institute | Systems for neural bridging of the nervous system |
EP3302691B1 (fr) | 2015-06-02 | 2019-07-24 | Battelle Memorial Institute | Non-invasive motor impairment rehabilitation system |
EP4406482A3 (fr) | 2015-06-02 | 2024-10-23 | Battelle Memorial Institute | Neural sleeve for neuromuscular recording, sensing and stimulation |
US11589814B2 (en) | 2015-06-26 | 2023-02-28 | Carnegie Mellon University | System for wearable, low-cost electrical impedance tomography for non-invasive gesture recognition |
US9240069B1 (en) | 2015-06-30 | 2016-01-19 | Ariadne's Thread (Usa), Inc. | Low-latency virtual reality display system |
KR101626748B1 (ko) | 2015-08-03 | 2016-06-14 | Soongsil University Industry-Academic Cooperation Foundation | Apparatus and method for measuring movement patterns using EEG and EMG |
US10854104B2 (en) | 2015-08-28 | 2020-12-01 | Icuemotion Llc | System for movement skill analysis and skill augmentation and cueing |
US10387034B2 (en) | 2015-09-03 | 2019-08-20 | Microsoft Technology Licensing, Llc | Modifying captured stroke information into an actionable form |
US9824287B2 (en) | 2015-09-29 | 2017-11-21 | Huami Inc. | Method, apparatus and system for biometric identification |
US10459537B2 (en) | 2015-09-30 | 2019-10-29 | Stmicroelectronics, Inc. | Encapsulated pressure sensor |
WO2017062544A1 (fr) | 2015-10-06 | 2017-04-13 | University Of Pittsburgh-Of The Commonwealth System Of Higher Education | Method, device and system for sensing neuromuscular, physiological, biomechanical, and musculoskeletal activity |
US9881273B2 (en) | 2015-10-28 | 2018-01-30 | Disney Enterprises, Inc. | Automatic object detection and state estimation via electronic emissions sensing |
US10595941B2 (en) | 2015-10-30 | 2020-03-24 | Orthosensor Inc. | Spine measurement system and method therefor |
US11106273B2 (en) | 2015-10-30 | 2021-08-31 | Ostendo Technologies, Inc. | System and methods for on-body gestural interfaces and projection displays |
US10776712B2 (en) | 2015-12-02 | 2020-09-15 | Preferred Networks, Inc. | Generative machine learning systems for drug design |
CN105511615B (zh) | 2015-12-04 | 2019-03-05 | Shenzhen University | EMG-based wearable text input system and method |
US20170188980A1 (en) | 2016-01-06 | 2017-07-06 | Empire Technology Development Llc | Wearable sensor based body modeling |
WO2017120669A1 (fr) | 2016-01-12 | 2017-07-20 | Bigmotion Technologies Inc. | Systems and methods for human body motion capture |
US20170259167A1 (en) | 2016-03-14 | 2017-09-14 | Nathan Sterling Cook | Brainwave virtual reality apparatus and method |
US9864434B2 (en) | 2016-03-30 | 2018-01-09 | Huami Inc. | Gesture control of interactive events using multiple wearable devices |
US10503253B2 (en) | 2016-03-31 | 2019-12-10 | Intel Corporation | Sensor signal processing to determine finger and/or hand position |
JP6728386B2 (ja) | 2016-03-31 | 2020-07-22 | Sensel, Inc. | Human-computer interface system |
US10852835B2 (en) | 2016-04-15 | 2020-12-01 | Board Of Regents, The University Of Texas System | Systems, apparatuses and methods for controlling prosthetic devices by gestures and other modalities |
KR102629723B1 (ko) | 2016-04-19 | 2024-01-30 | Skyworks Solutions, Inc. | Selective shielding of radio-frequency modules |
US10203751B2 (en) | 2016-05-11 | 2019-02-12 | Microsoft Technology Licensing, Llc | Continuous motion controls operable using neurological data |
US9864431B2 (en) | 2016-05-11 | 2018-01-09 | Microsoft Technology Licensing, Llc | Changing an application state using neurological data |
WO2017208167A1 (fr) | 2016-05-31 | 2017-12-07 | Lab Schöpfergeist Ag | Nerve stimulation apparatus and method |
US10426371B2 (en) * | 2016-06-07 | 2019-10-01 | Smk Corporation | Muscle condition measurement sheet |
KR101790147B1 (ko) | 2016-06-22 | 2017-10-25 | Center of Human-Centered Interaction for Coexistence | Virtual object manipulation system and method |
WO2018002722A1 (fr) | 2016-07-01 | 2018-01-04 | L.I.F.E. Corporation S.A. | Biometric identification by garments having a plurality of sensors |
CN117032398A (zh) * | 2016-07-06 | 2023-11-10 | Wearable Devices Ltd. | Method and apparatus for a gesture control interface for wearable devices |
US10362414B2 (en) * | 2016-07-08 | 2019-07-23 | Oticon A/S | Hearing assistance system comprising an EEG-recording and analysis system |
US20190223748A1 (en) | 2018-01-25 | 2019-07-25 | Ctrl-Labs Corporation | Methods and apparatus for mitigating neuromuscular signal artifacts |
US10772519B2 (en) | 2018-05-25 | 2020-09-15 | Facebook Technologies, Llc | Methods and apparatus for providing sub-muscular control |
WO2018022658A1 (fr) * | 2016-07-25 | 2018-02-01 | Ctrl-Labs Corporation | Adaptive system for deriving control signals from measurements of neuromuscular activity |
US11635736B2 (en) | 2017-10-19 | 2023-04-25 | Meta Platforms Technologies, Llc | Systems and methods for identifying biological structures associated with neuromuscular source signals |
EP3487395A4 (fr) | 2016-07-25 | 2020-03-04 | CTRL-Labs Corporation | Methods and apparatus for predicting musculoskeletal position information using wearable autonomous sensors |
US10496168B2 (en) | 2018-01-25 | 2019-12-03 | Ctrl-Labs Corporation | Calibration techniques for handstate representation modeling using neuromuscular signals |
US10489986B2 (en) | 2018-01-25 | 2019-11-26 | Ctrl-Labs Corporation | User-controlled tuning of handstate representation model parameters |
CN110337269B (zh) | 2016-07-25 | 2021-09-21 | Facebook Technologies, LLC | Methods and apparatus for inferring user intent based on neuromuscular signals |
US11337652B2 (en) | 2016-07-25 | 2022-05-24 | Facebook Technologies, Llc | System and method for measuring the movements of articulated rigid bodies |
US10765363B2 (en) | 2016-09-30 | 2020-09-08 | Cognionics, Inc. | Headgear for dry electroencephalogram sensors |
US10162422B2 (en) | 2016-10-10 | 2018-12-25 | Deere & Company | Control of machines through detection of gestures by optical and muscle sensors |
EP3548994B1 (fr) | 2016-12-02 | 2021-08-11 | Pison Technology, Inc. | Detecting and using body tissue electrical signals |
US10646139B2 (en) | 2016-12-05 | 2020-05-12 | Intel Corporation | Body movement tracking |
US20190025919A1 (en) | 2017-01-19 | 2019-01-24 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression in an augmented reality system |
WO2018191755A1 (fr) | 2017-04-14 | 2018-10-18 | REHABILITATION INSTITUTE OF CHICAGO d/b/a Shirley Ryan AbilityLab | Prosthetic virtual reality training interface and related methods |
US11259746B2 (en) * | 2017-07-10 | 2022-03-01 | General Electric Company | Method and system for neuromuscular transmission measurement |
US10481699B2 (en) | 2017-07-27 | 2019-11-19 | Facebook Technologies, Llc | Armband for tracking hand motion using electrical impedance measurement |
US20190076716A1 (en) | 2017-09-12 | 2019-03-14 | Intel Corporation | Activity training system |
US10606620B2 (en) | 2017-11-16 | 2020-03-31 | International Business Machines Corporation | Notification interaction in a touchscreen user interface |
US20190150777A1 (en) | 2017-11-17 | 2019-05-23 | Ctrl-Labs Corporation | Dual-supply analog circuitry for sensing surface emg signals |
US10827942B2 (en) | 2018-01-03 | 2020-11-10 | Intel Corporation | Detecting fatigue based on electroencephalogram (EEG) data |
CN112074870A (zh) | 2018-01-25 | 2020-12-11 | Facebook Technologies, LLC | Visualization of reconstructed handstate information |
WO2019147928A1 (fr) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | Handstate reconstruction based on multiple inputs |
WO2019148002A1 (fr) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | Techniques for anonymizing neuromuscular signal data |
WO2019147949A1 (fr) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | Real-time processing of handstate representation model estimates |
US20190247650A1 (en) * | 2018-02-14 | 2019-08-15 | Bao Tran | Systems and methods for augmenting human muscle controls |
US20190324549A1 (en) | 2018-04-20 | 2019-10-24 | Immersion Corporation | Systems, devices, and methods for providing immersive reality interface modes |
EP3797345A4 (fr) | 2018-05-22 | 2022-03-09 | Magic Leap, Inc. | Transmodal input fusion for a wearable system |
2019
- 2019-08-13 WO PCT/US2019/046351 patent/WO2020036958A1/fr unknown
- 2019-08-13 US US16/539,755 patent/US11179066B2/en active Active
- 2019-08-13 CN CN201980053249.XA patent/CN112566553A/zh active Pending
- 2019-08-13 EP EP19850130.6A patent/EP3836836B1/fr active Active
Also Published As
Publication number | Publication date |
---|---|
EP3836836A4 (fr) | 2021-09-29 |
CN112566553A (zh) | 2021-03-26 |
US11179066B2 (en) | 2021-11-23 |
US20200046265A1 (en) | 2020-02-13 |
EP3836836B1 (fr) | 2024-03-20 |
WO2020036958A1 (fr) | 2020-02-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3836836B1 (fr) | Real-time spike detection and identification | |
EP3801743B1 (fr) | Methods and apparatus for providing sub-muscular control | |
CN110337269B (zh) | Methods and apparatus for inferring user intent based on neuromuscular signals | |
US11361522B2 (en) | User-controlled tuning of handstate representation model parameters | |
US10504286B2 (en) | Techniques for anonymizing neuromuscular signal data | |
US9720515B2 (en) | Method and apparatus for a gesture controlled interface for wearable devices | |
US20190223748A1 (en) | Methods and apparatus for mitigating neuromuscular signal artifacts | |
US10970374B2 (en) | User identification and authentication with neuromuscular signatures | |
US20200275895A1 (en) | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces | |
CN109598219A (zh) | An adaptive electrode registration method for robust myoelectric control | |
Antonius et al. | Electromyography gesture identification using CNN-RNN neural network for controlling quadcopters | |
Koch et al. | Inhomogeneously stacked rnn for recognizing hand gestures from magnetometer data | |
Murugan et al. | EMG signal classification using ANN and ANFIS for neuro-muscular disorders | |
Gupta et al. | Channel selection in multi-channel surface electromyogram based hand activity classifier | |
CN113557069A (zh) | Methods and apparatus for unsupervised machine learning for gesture classification and applied force estimation | |
Wu et al. | A Fast Classification Approach to Upper-Limb Posture Recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210119 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602019048754 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: A61B0005049200 Ipc: G06F0003010000 |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20210901 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 5/397 20210101ALI20210826BHEP Ipc: A61B 5/296 20210101ALI20210826BHEP Ipc: A61B 5/388 20210101ALI20210826BHEP Ipc: A61B 5/11 20060101ALI20210826BHEP Ipc: A61B 5/00 20060101ALI20210826BHEP Ipc: G06F 3/01 20060101AFI20210826BHEP |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20231009 |
|
RAP3 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: META PLATFORMS TECHNOLOGIES, LLC |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20240214 |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602019048754 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240621 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240620 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240620 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1668412 Country of ref document: AT Kind code of ref document: T Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240720 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240830 Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240320
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20240722 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240830 Year of fee payment: 6 |