WO2022251696A1 - Wireless soft scalp electronics and virtual reality system for brain-machine interfaces - Google Patents
- Publication number
- WO2022251696A1 (PCT/US2022/031432)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eeg
- low
- brain
- profile
- flexible
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/25—Bioelectric electrodes therefor
- A61B5/262—Needle electrodes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/25—Bioelectric electrodes therefor
- A61B5/279—Bioelectric electrodes therefor specially adapted for particular uses
- A61B5/291—Bioelectric electrodes therefor specially adapted for particular uses for electroencephalography [EEG]
- A61B5/293—Invasive
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6846—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive
- A61B5/6847—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive mounted on an invasive device
- A61B5/685—Microneedles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- This application claims priority to U.S. Provisional Patent Application No. 63/194,111, filed May 27, 2021, entitled “WIRELESS SOFT SCALP ELECTRONICS AND VIRTUAL REALITY SYSTEM FOR MOTOR IMAGERY-BASED BRAIN-MACHINE INTERFACES,” and U.S. Provisional Application No. 63/311,628, filed February 18, 2022, entitled “VIRTUAL REALITY (VR)-ENABLED BRAIN-COMPUTER INTERFACES VIA WIRELESS SOFT BIOELECTRONICS,” each of which is incorporated by reference herein in its entirety.
- Motor imagery electroencephalography records the mental simulation of body movements, in which a user consciously rehearses aspects of movement without executing it, providing a control mechanism for brain-machine interfaces.
- Conventional electroencephalography (EEG) for motor imagery typically employs a hair cap with multiple wired electrodes and conductive gels, which involve extensive setup time and are uncomfortable to use. While the latest EEG designs are trending toward wireless, wearable EEG for day-to-day mobile monitoring, they nevertheless continue to employ rigid, bulky circuitry and gel-based skin-contact electrodes that are obtrusive, provide low information throughput due to noise-prone brain signal detection, and offer limited recording channels.
- Similar EEG hardware can also be used for the acquisition of steady-state visually evoked potentials (SSVEP), which are brain signals that are natural responses to visual stimulation at specific frequencies.
- When the retina is excited, for example by a visual stimulus flickering at a frequency between 3.5 Hz and 75 Hz, the brain can generate electrical activity at the same frequency as the visual stimulus, or at multiples of it.
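As an illustration of the principle above, the dominant SSVEP stimulus frequency can be estimated from the EEG power spectrum by scoring each candidate frequency and its harmonics. This is a minimal NumPy sketch, not the patent's classification method (which uses a trained neural network); the function name, band width, and harmonic count are assumptions for illustration:

```python
import numpy as np

def detect_ssvep(eeg, fs, candidate_freqs, n_harmonics=2):
    """Pick the candidate stimulus frequency whose fundamental and
    harmonics carry the most spectral power in one EEG channel.
    Illustrative only; practical systems often use CCA or a classifier."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    scores = []
    for f0 in candidate_freqs:
        score = 0.0
        for h in range(1, n_harmonics + 1):
            # sum power in a narrow band around each harmonic of f0
            band = (freqs > h * f0 - 0.5) & (freqs < h * f0 + 0.5)
            score += spectrum[band].sum()
        scores.append(score)
    return candidate_freqs[int(np.argmax(scores))]

# Synthetic check: a 12 Hz response buried in noise, 4 s at 250 Hz
fs = 250
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_ssvep(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

In a real speller, each on-screen target would flicker at one of the candidate frequencies, and the detected frequency selects the target.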
- An exemplary wireless soft scalp electronic system and method are disclosed that can actuate commands for a brain-machine interface (BMI) or brain-computer interface (BCI) by performing real-time, continuous classification, e.g., via a trained neural network, of motor imagery (MI) brain signals or of steady-state visually evoked potential (SSVEP) signals.
- The exemplary system is configured as a low-profile, portable system that includes microneedle electrodes that can acquire EEG signals for a brain-machine interface controller.
- The microneedle electrodes may be configured as soft, imperceptible, gel-less, epidermis-penetrating microneedle electrodes that can provide improved contact surface area and reduced electrode impedance density, e.g., to enhance EEG signals and signal classification accuracy.
- The microneedle electrodes can be further integrated with soft electronics mounted locally in proximity to the electrodes to reduce obtrusive wiring and improve signal acquisition quality.
- The exemplary wireless soft scalp electronic system and method can operate in combination with a virtual reality (VR) or augmented reality (AR) training system comprising a VR/AR environment controller to provide clear, consistent visuals and instant biofeedback to a user in an MI or SSVEP application.
- The VR/AR environment controller can employ the acquired and classified EEG signals to actuate a command that renders an object in the VR/AR scene associated with motor imagery (e.g., one or more body objects that can perform aspects of body movement) to be viewed by the user as feedback during MI training.
- The VR/AR hardware and brain-machine interface hardware can be used to provide visual stimuli for, and to acquire, steady-state visually evoked potentials.
- The VR/AR hardware and associated training can reduce the variance in the detectable EEG response, e.g., in MI and SSVEP applications.
- The scalp electronic system and associated training were observed to provide high classification accuracy for motor imagery applications (93.22 ± 1.33% for four classes), allowing wireless, real-time control of a virtual reality game.
- A system including an electroencephalography-based brain-machine interface is disclosed.
- the system can include a set of low-profile EEG sensors each comprising an array of flexible epidermis-penetrating microneedle electrodes fabricated on a flexible-circuit substrate, the flexible-circuit substrate operatively connected to an analog-to- digital converter circuitry operatively connected to a wireless interface circuitry; and a brain- machine interface operatively connected to the set of low-profile EEG sensors, the brain- machine interface comprising: a processor; and a memory operatively connected to the processor, the memory having instructions stored thereon, where execution of the instructions by the processor causes the processor to: receive EEG signals acquired from the low-profile EEG sensor; continuously classify brain signals as control signals via a trained neural network from the acquired EEG signals; and output the control signals to a virtual reality environment controller to actuate a command (e.g., for training) in a VR scene generated by the virtual reality environment controller be viewed by the subject.
- a command e.g., for training
- The command causes a set of movements of an extremity in the VR scene, and the trained neural network is configured to classify the brain signals for the set of movements.
- The set of low-profile EEG sensors is connected to the brain-machine interface over a set of stretchable, flexible connectors.
- The microneedle electrodes have an expanded contact surface area and reduced electrode impedance density.
- The system further includes a wearable soft headset comprising a low-modulus elastomeric band.
- The trained neural network includes a spatial convolutional neural network.
- The set of low-profile EEG sensors is placed along the scalp for motor imagery.
- The set of low-profile EEG sensors is placed along the scalp for steady-state visually evoked potential (SSVEP) measurements.
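Since the trained network includes a spatial convolutional layer, a toy forward pass can illustrate the idea: a learned spatial filter mixes all electrode channels into "virtual" channels before temporal features are pooled and classified into, e.g., four motor-imagery classes. The NumPy sketch below uses untrained random weights and is purely illustrative; all names, shapes, and the log-power feature are assumptions, not the patent's actual architecture:

```python
import numpy as np

def spatial_cnn_forward(x, w_spatial, w_out):
    """Toy forward pass of a 'spatial' convolutional stage for EEG.

    x         : (n_channels, n_samples) raw EEG window
    w_spatial : (n_filters, n_channels) spatial filters; each row mixes
                all electrodes into one virtual channel
    w_out     : (n_classes, n_filters) linear read-out
    Returns class probabilities (softmax over the logits).
    """
    virtual = w_spatial @ x                                # (n_filters, n_samples)
    feats = np.log(np.mean(virtual ** 2, axis=1) + 1e-8)   # log band power per filter
    logits = w_out @ feats                                 # (n_classes,)
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Shapes only: 6 electrode arrays, a 2 s window at 250 Hz, 4 classes
rng = np.random.default_rng(42)
x = rng.standard_normal((6, 500))
w_spatial = rng.standard_normal((4, 6))
w_out = rng.standard_normal((4, 4))
p = spatial_cnn_forward(x, w_spatial, w_out)
print(p.shape, round(float(p.sum()), 6))  # → (4,) 1.0
```

In a trained system the index of the largest probability would select the MI command (e.g., one of four extremity movements).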
- The virtual reality environment controller is configured to generate a split-eye asynchronous stimulus (SEAS) in the virtual scene for a real-time text speller interface.
- The execution of the instructions by the processor further causes the processor to transmit the acquired EEG signals to a remote or cloud computing device executing a retraining operation of the trained neural network, and to receive, during run-time operation of the virtual reality environment controller, an updated trained neural network from the remote or cloud computing device.
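The retraining flow described above implies that an updated network arriving from the cloud must replace the running one without interrupting classification. A minimal sketch of such a hot swap follows; the wrapper class and its API are hypothetical, not from the patent:

```python
import threading

class HotSwappableClassifier:
    """Hypothetical sketch: the inference loop keeps calling classify()
    while a background thread, on receiving retrained weights from the
    cloud, installs the new model atomically via swap()."""

    def __init__(self, model_fn):
        self._model_fn = model_fn
        self._lock = threading.Lock()

    def classify(self, eeg_window):
        with self._lock:              # snapshot the current model reference
            model_fn = self._model_fn
        return model_fn(eeg_window)   # run inference outside the lock

    def swap(self, new_model_fn):
        with self._lock:              # install the retrained model
            self._model_fn = new_model_fn

# Stand-in models that always emit one MI class label
clf = HotSwappableClassifier(lambda x: "left-hand")
print(clf.classify([0.1, 0.2]))   # → left-hand
clf.swap(lambda x: "right-hand")  # updated network arrives mid-session
print(clf.classify([0.1, 0.2]))   # → right-hand
```

Holding the lock only while reading or writing the model reference keeps classification latency unaffected by the swap.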
- A plurality of the flexible epidermis-penetrating microneedle electrodes of the array are each at least 500 μm in height (e.g., 800 μm) to mount on a hairy scalp, with a base width of about 350 μm; the array has an area of about 36 mm².
- A method can include providing a set of low-profile EEG sensors placed at a scalp of a user, where the set of low-profile EEG sensors each includes an array of flexible epidermis-penetrating microneedle electrodes fabricated on a flexible-circuit substrate, the flexible-circuit substrate operatively connected to analog-to-digital converter circuitry operatively connected to wireless interface circuitry; receiving, by a processor of a brain-machine interface operatively connected to the set of low-profile EEG sensors, EEG signals acquired from the low-profile EEG sensors; continuously classifying, by the processor, brain signals as control signals via a trained neural network from the acquired EEG signals; and outputting, by the processor, the control signals to a virtual reality environment controller to actuate a command in a VR scene generated by the virtual reality environment controller to be viewed by the subject.
- The set of low-profile EEG sensors is placed directly on the scalp without conductive gel or paste.
- The set of low-profile EEG sensors includes i) a reference array of flexible epidermis-penetrating microneedle electrodes placed at an apex position on the scalp and ii) six arrays of flexible epidermis-penetrating microneedle electrodes releasably attached to a low-modulus elastomeric band at a first frontal position, a second back position, and four side positions for motor imagery measurements.
- The set of low-profile EEG sensors includes i) a reference array of flexible epidermis-penetrating microneedle electrodes placed at a back position on the scalp and ii) four arrays of flexible epidermis-penetrating microneedle electrodes releasably attached to a low-modulus elastomeric band at a back region of the scalp for steady-state visually evoked potential (SSVEP) measurements.
- The method can further include transmitting, by the processor, the acquired EEG signals to a remote or cloud computing device executing a retraining operation of the trained neural network, and receiving, by the processor, during run-time operation of the virtual reality environment controller, an updated trained neural network from the remote or cloud computing device.
- A non-transitory computer-readable medium is also disclosed.
- The non-transitory computer-readable medium can have instructions stored thereon, where execution of the instructions by a processor of a brain-machine interface controller causes the processor to: receive EEG signals acquired from a set of low-profile EEG sensors placed at a scalp of a user, where the set of low-profile EEG sensors each includes an array of flexible epidermis-penetrating microneedle electrodes fabricated on a flexible-circuit substrate, the flexible-circuit substrate operatively connected to analog-to-digital converter circuitry operatively connected to wireless interface circuitry, and where the set of low-profile EEG sensors is placed directly on the scalp without conductive gel or paste; continuously classify brain signals as control signals via a trained neural network from the acquired EEG signals; and output the control signals to a virtual reality environment controller to actuate a command in a VR scene generated by the virtual reality environment controller to be viewed by the subject.
- The set of low-profile EEG sensors includes i) a reference array of flexible epidermis-penetrating microneedle electrodes placed at an apex position on the scalp and ii) six arrays of flexible epidermis-penetrating microneedle electrodes releasably attached to a low-modulus elastomeric band at a first frontal position, a second back position, and four side positions for motor imagery measurements.
- The set of low-profile EEG sensors includes i) a reference array of flexible epidermis-penetrating microneedle electrodes placed at a back position on the scalp and ii) four arrays of flexible epidermis-penetrating microneedle electrodes releasably attached to a low-modulus elastomeric band at a back region of the scalp for steady-state visually evoked potential (SSVEP) measurements.
- The execution of the instructions further causes the processor to transmit the acquired EEG signals to a remote or cloud computing device executing a retraining operation of the trained neural network, and to receive, during run-time operation of the virtual reality environment controller, an updated trained neural network from the remote or cloud computing device.
- FIG. 1 shows an example electroencephalography-based brain-machine-interface system in accordance with an illustrative embodiment.
- FIG. 2 shows an example EEG brain-machine-interface system configured as a low-profile EEG-sensor soft-scalp-electronics device for motor imagery (MI) training or operation in accordance with an illustrative embodiment.
- FIGS. 3A, 3B, and 3C each shows aspects of an example EEG brain-machine-interface system configured as a low-profile EEG-sensor soft-scalp-electronics device for SSVEP training or operation in accordance with an illustrative embodiment.
- FIG. 4A illustrates a method of operating the example EEG brain-machine-interface system in accordance with an illustrative embodiment.
- FIG. 4B shows an example method of operation to configure and re-configure the example EEG brain-machine-interface system during run-time operation in accordance with an illustrative embodiment.
- FIGS. 5A, 5B, 5C, 5D, and 5E each illustrates example methods of fabricating components of the example EEG brain-machine-interface system in accordance with illustrative embodiments.
- FIGS. 6A, 6B, 6C, 6D, and 6E each shows aspects of a study to develop virtual reality (VR) implementation for motor imagery training and real-time control using the example EEG brain-machine-interface system in accordance with an illustrative embodiment.
- Figs. 7A and 7B show mechanical characterization results of components of the example EEG brain-machine-interface system in accordance with illustrative embodiments.
- Figs. 8A, 8B, 8C, 8D, 8E, 8F, 8G each shows aspects of a study to develop virtual reality (VR) implementation for SSVEP training and real-time control using the example EEG brain-machine-interface system in accordance with an illustrative embodiment.
- FIG. 1 shows an example electroencephalography-based (EEG) brain-machine- interface system 100 (“EEGBMI” system 100) in accordance with an illustrative embodiment.
- the system 100 includes a set of low-profile EEG sensors 102 (shown as 102a, 102b, 102c, 102d).
- the first EEG sensor 102a is shown as a reference electrode that is connected, via a flexible connector 109, to a flexible front-end electronics assembly 110 that interfaces with a BMI control system 112 that classifies, via a neural network 114 (shown as “Spatial CNN” 114), to generate control signals to a computing device or a machine 116.
- the computing device or machine 116 can include a VR/AR training system 118 and/or a machine computer system 119.
- the VR/AR training system 118 and/or the machine/computer system 119 can be configured to execute a VR/AR application 121.
- the virtual reality application 121 can include a BMI rendering and UI module 122 and a module containing game environment parameters 124.
- the term “VR/AR” refers to a virtual reality system, an augmented reality system, or a system capable of providing both.
- the other EEG sensors (shown as “Sensor Array” 102b, 102c, 102d) are measured, via the hardware or software, in relation to the reference sensor 102a and, in the example of FIG. 1, are connected via the flexible cabling 126 through the reference EEG sensor assembly 102a.
- the system 100 may employ more than one reference sensor assembly (e.g., 102a).
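The reference-relative measurement described above can be sketched as a simple re-referencing step. This is a conceptual illustration only; in the actual system the referencing is performed in the front-end hardware or software, and the shapes and units below are assumptions.

```python
import numpy as np

def reference_channels(raw, ref):
    """Express sensor-array channels relative to a reference electrode.

    raw: (n_channels, n_samples) voltages from the sensor arrays
    (e.g., 102b, 102c, 102d); ref: (n_samples,) voltages from the
    reference array (e.g., 102a).
    """
    raw = np.asarray(raw, dtype=float)
    ref = np.asarray(ref, dtype=float)
    return raw - ref[np.newaxis, :]  # subtract the reference sample-by-sample

# Two channels, two samples, against a constant 1.0 reference
channels = reference_channels([[1.0, 2.0], [3.0, 4.0]], [1.0, 1.0])
```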
- the flexible cabling 126 in the example of FIG. 1, includes a set of laser- machined stretchable and flexible interconnects 128.
- the interconnects 128 can have electrical conductors formed in a meandering or serpentine pattern 130 that allows the interconnects 128 to be stretched or bent.
- the flexible connector 109 connects the flexible assembly of the reference sensor 102a to the flexible front-end electronics assembly 110.
- the flexible front-end electronics assembly 110 can include one or more analog-to-digital converters (ADCs) 132 operably connected to the array 104 of needle electrodes 102b, 102c, 102d through the flexible cable 126.
- the ADCs 132 can convert analog signals from the reference array of needle electrodes 102a and from the sensor array (e.g., 102b, 102c, 102d) to digital signals.
- the digital signals can be transmitted by the network interface 134 to the network interface 135 in the BMI control system 112.
- the flexible front-end electronics assembly 110 can include a controller 136 that can be configured to control the operation of the energy storage 138, ADCs 132, and network interface 134.
- the BMI control system 112 is configured to continuously classify brain signals as control signals via the trained neural network from the acquired EEG signals.
- the BMI control system 112 can provide the control signal to a machine 119, e.g., to operate a vehicle (e.g., power wheelchair) or a robotic limb, or the like.
- the BMI control system 112 can include a trained neural network 114, a network interface 135, a controller 137, a filter module 140, and a scaling module 142.
- the trained neural network 114 is configured to classify the acquired EEG signals to generate control signals to the computing device or machine 116.
- the trained neural network 114 is configured as a spatial CNN.
- the trained neural network can be configured as other CNN and AI systems, e.g., as described or referenced herein.
- the BMI control system 112 is configured to be reconfigured during run-time operation.
- the BMI control system is shown connected to a cloud system 144 configured with a neural network training system 146.
- the cloud system 144 is configured to receive the acquired EEG signals from the BMI control system 112 and re-train a local version of the neural network 114.
- the neural network training system 146 determines if the retrained neural network 148 improves upon the prior neural network 114. Upon such a determination, the neural network training system 146 provides the retrained neural network 148 to the BMI control system 112, which replaces (e.g., via its controller 137) the neural network 114 with the updated version.
- BMI control system 112 includes the network interface 135 to communicate and receive from the network interface 134 of the flexible front-end electronics assembly 110.
- the filter module 140 and scaling module 142 are configured to preprocess the acquired EEG signals prior to the classification operation.
- the filter module 140 is configured to filter the acquired EEG data, e.g., using a Butterworth bandpass filter, and the scaling module 142 is configured to upscale the filtered EEG data, e.g., using a linear upscaling operator.
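As a rough illustration of this preprocessing stage, the sketch below applies a zero-phase Butterworth bandpass filter followed by a linear upscaling gain. The sampling rate, band edges, filter order, and gain value are all assumptions not specified in the text.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def preprocess_eeg(eeg, fs=250.0, band=(2.0, 40.0), order=3, gain=10.0):
    """Bandpass-filter and linearly upscale multichannel EEG.

    eeg: array of shape (n_channels, n_samples). The 2-40 Hz band,
    3rd-order prototype, and x10 gain are illustrative assumptions.
    """
    sos = butter(order, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, eeg, axis=-1)  # zero-phase (bi-directional)
    return gain * filtered                     # linear upscaling operator

# Example: filter 6 channels of 2 seconds of synthetic data
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 500))
y = preprocess_eeg(x)
```

Because the bandpass has zero DC gain, any constant offset on a channel is removed before upscaling.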
- the BMI control system 112 can be configured to operate with a VR/AR training system 118 comprising a VR/AR environment controller (not shown) that can employ the classified control signal to actuate a set of commands in the VR scene to be displayed to the user.
- the VR/AR environment can be implemented using a VR/AR headset and VR/AR software.
- the VR/AR environment may operate with a VR software (e.g., Unity) to configure a computing device to display VR/AR graphics in a VR/AR headset.
- the VR/AR headset can be, e.g., a Samsung Gear VR headset.
- the VR software may render 3D models (e.g., created in Maya) of the hands and feet, or other geometric objects, to facilitate visualization of the MI.
- the animation software, VR/AR software, VR/AR headset, and various computing devices described with reference to this example implementation are all intended as non-limiting examples and that the present disclosure can be implemented using any suitable animation software, smartphone (or other computing devices), VR (or AR) headsets, and/or any AR or VR software package.
- the game described is a non-limiting example and that embodiments of the present disclosure can be used to control and receive output from any computing device.
- the computing device may include a processing unit that may be a standard programmable processor that performs arithmetic and logic operations necessary for the operation of the computing device. Multiple processors may be employed.
- the terms "processing unit" and "processor" refer to a physical hardware device that executes encoded instructions for performing functions on inputs and creating outputs, including, for example, but not limited to, microprocessors, microcontrollers (MCUs), graphical processing units (GPUs), and application-specific integrated circuits (ASICs).
- the computing device may also include a bus or other communication mechanism for communicating information among various components of the computing device.
- the logical operations described above can be implemented (1) as a sequence of computer-implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system.
- the implementation is a matter of choice dependent on the performance and other requirements of the computing system.
- the logical operations described herein are referred to variously as state operations, acts, or modules. These operations, acts, and/or modules can be implemented in software, in firmware, in special purpose digital logic, in hardware, and any combination thereof. It should also be appreciated that more or fewer operations can be performed than shown in the figures and described herein. These operations can also be performed in a different order than those described herein.
- One or more programs may implement or utilize the processes described in connection with the presently disclosed subject matter, e.g., through the use of an application programming interface (API), reusable controls, or the like.
- Such programs may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system.
- the program(s) can be implemented in assembly or machine language, if desired.
- the language may be a compiled or interpreted language, and it may be combined with hardware implementations.
- Example #2 Motor Imagery-based Brain-Machine Interface
- FIG. 2 shows an example EEG BMI system 100 (shown as 100a) that includes the flexible front-end electronics assembly 110 (shown as 110a) configured as a low-profile EEG- sensor soft-scalp-electronics (SSE) device for motor imagery (MI) training or operation that interfaces with a VR/AR headset 202 (shown as 202a) in accordance with an illustrative embodiment.
- the SSE device 110a can be placed along the scalp of a user and includes (i) fully portable signal-acquisition electronics on a flexible substrate and (ii) stretchable interconnectors 128 that connect to a set of flexible microneedle arrays 104 (shown as 104a, 104b, 104c, 104d, 104e, and 104f).
- the soft-scalp-electronic system 110a can be configured for MI brain signal detection for persistent BMI by continuously recording brain signals via a head-worn strap 206.
- the SSE system 110a is configured to provide the acquired EEG signals via a wireless connection (or via a wired connection) to an external computing device that then classifies the acquired EEG signals, e.g., for an MI application or for immersive visualization training.
- the BMI system 100a includes a reduced number of EEG electrodes that are straightforward to set up, reducing setup complexity, e.g., as compared to conventional EEG applications, while not sacrificing classification performance.
- the SSE system 110a includes an array of integrated stretchable interconnectors 128 bonded to flexible microneedle electrodes (FMNEs) (e.g., 104).
- the soft-scalp-electronic system 110a may be fabricated using a flexible membrane circuit to have great mechanical compliance.
- the flexible membrane circuit can be integrated with electronic chips (e.g., front-end acquisition IC and network interface IC) and encapsulated to maintain mechanical compliance.
- Each of the arrays of FMNEs includes a set of high-aspect-ratio needles, e.g., with an aspect ratio greater than 2 (e.g., 800 μm in height with a base width of 350 μm).
- Other needle height-to-base ratios may be employed, e.g., 1, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2.0.
- the needle height-to-base ratios can be greater than 2.
- the SSE device 110a is attached or affixed to the wearable head strap 206, which may be integrated with a set of low-modulus elastomeric bands 208 that can be molded together to secure the multiple FMNEs at the MI positions on the user's scalp.
- the primary band 208 can wrap around the head of the user about an axial plane 210 to secure five FMNEs on the temporal lobes (reference FMNE 104a ("Cz"), first axial FMNE 104b ("C2"), second axial FMNE 104c ("C3"), third axial FMNE 104d ("C4"), and fourth axial FMNE 104e ("C5")).
- the primary band 208 connects, via the flexible interconnects, to an inion FMNE 104f ("Fz") and a nasion FMNE 104g ("POz") to provide 6 channels of EEG measurements with respect to the reference electrode array.
- a schematic of the same is shown in plot 212.
- Plot 212 also shows the FMNEs in relation to a standard EEG cap with 20+ electrodes.
- the primary band 208 also connects to a ground electrode 214 that is configured to be placed behind the ear.
- Other numbers of electrode arrays may be employed, including 7, 8, 9, 10, 11, 12, etc. In some embodiments, the number of electrode arrays can be greater than 12.
- the electrode arrays (e.g., 104) are connected to ADC front-end circuitries (e.g., comprising ADCs 132) of the soft-scalp-electronic system 110a.
- the soft-scalp-electronic system includes the network interface 134 (shown as "Bluetooth controller" 134a) that can communicate the acquired EEG signals to the BMI control system 112 (shown as "Tablet" 112a).
- the BMI control system 112a is configured to process sequences from the EEG recordings via a machine-learning classification algorithm 114 (shown as "convolutional neural network" 114a) to generate MI classifications that can be used as control signals to control VR/AR targets in a VR/AR system environment.
- the machine computer system 120a includes a filter operation 140 (shown as 140a) and rescaling operation 142 (shown as 142a).
- the machine computer system 112a is configured to optimally capture event-related synchronization and desynchronization, e.g., relating to separate hands and both feet, as well as capturing overall alpha rhythm activity.
- the ML model can decompose spatial features from multiple dipolar sources of the motor cortex.
- the output of the classification can be sent as a command 220 to a target shown as a VR target 222.
- FIG. 3A shows an example EEG BMI system 100 (shown as 100b) that includes the flexible front-end electronics assembly 110 (shown as 110b) configured as a low-profile EEG-sensor soft-scalp-electronics (SSE) device for SSVEP training or operation that interfaces with a VR/AR headset 202 (shown as 202b) in accordance with an illustrative embodiment.
- the SSE device 110b can also be placed along the scalp of a user and includes (i) fully portable signal-acquisition electronics on a flexible substrate and (ii) stretchable interconnectors 128 that connect to a set of flexible microneedle arrays 104 (shown as 104', e.g., 104a', 104b', 104c', 104d', and 104e' - see Fig. 3B).
- the EEG BMI system 100b is used to acquire SSVEP signals from different eye-specific stimuli being presented to each eye via split-eye asynchronous stimulus (SEAS) application.
- stimulating each eye separately can produce unique asynchronous stimulus patterns that can provide more encoded channels to improve brain-signal recording throughput.
- the EEG BMI system 100b can be used to provide real-time monitoring of steady-state visually evoked potentials (SSVEP) for portable BCI with over 30 channels, e.g., for text spelling.
- a user interface panel 302 is presented, e.g., in a VR/AR environment, with textual elements 304 (a portion highlighted) that include different steady-state flickering stimuli patterns.
- each textual element 304 (32 elements) can be encoded with a unique steady-state flickering stimulus.
- the steady-state flickering stimuli patterns can be presented differently for the left and right displays (shown as 308 and 310, respectively).
- Fig. 3C shows example stimulus frequencies (312) and targets (shown as "Target").
- the stimulus frequencies 312 are presented as either the same frequencies in both eyes (rows “1” and “3,” shown as 316 and 318) or with different frequencies for the left eye and right eye (rows “2” and “4,” shown as 320, 322).
- rows “2” and “4” (320, 324) each of the first numbers (326) represents the frequency seen by the left eye, and each of the second numbers (328) represents the frequency seen by the right eye.
- Other configurations can be used.
- the same-frequency outputs, e.g., per rows "1" and "3," provide a common reference for the eye to track, against which the different frequencies, e.g., per rows "2" and "4," can be asynchronously presented and consistently detected.
- unique frequencies should be utilized between the left and right eye because signal mixing can occur in the subject, which can affect classification.
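To make the encoding arithmetic concrete, the sketch below enumerates hypothetical SEAS targets from a small base-frequency set, keeping same-frequency reference pairs alongside left/right pairs that never share a frequency. The frequency values and counts are illustrative assumptions, not taken from Fig. 3C.

```python
from itertools import permutations

def seas_targets(freqs, n_targets):
    """Enumerate split-eye asynchronous stimulus (SEAS) targets.

    Each target is a (left_hz, right_hz) pair. Same-frequency pairs act
    as common references; cross pairs use distinct frequencies for the
    left and right eye, per the design discussed above.
    """
    same = [(f, f) for f in freqs]                       # rows "1"/"3" style
    cross = [(l, r) for l, r in permutations(freqs, 2)]  # rows "2"/"4" style
    targets = (same + cross)[:n_targets]
    assert len(set(targets)) == len(targets)  # each target uniquely encoded
    return targets

# e.g., 6 base frequencies -> up to 6 + 6*5 = 36 encodable targets,
# enough for the 33 classes discussed below
table = seas_targets([7.0, 8.2, 9.4, 10.6, 11.8, 13.0], 33)
```

This shows how split-eye pairing multiplies the number of encodable targets roughly quadratically in the number of base frequencies.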
- the EEGBMI system 100b can provide real-time data processing and classification, e.g., for 33 classes of SSVEP inputs.
- the EEG BMI system 100b could provide for 33 identifiable classes with an accuracy of 78.93% within a 0.8-second acquisition window and 91.73% within a 2-second acquisition window.
- the SSE system 110b may also include an array of integrated stretchable interconnectors bonded to flexible microneedle electrodes (FMNEs) 104’.
- the soft-scalp-electronic system 110b may be fabricated using a flexible membrane circuit to have great mechanical compliance.
- the flexible membrane circuit can be integrated with electronic chips (e.g., front-end acquisition IC and network interface IC) and encapsulated to maintain mechanical compliance.
- Each of the arrays of FMNEs 104’ in the example shown in Fig. 3A, includes a set of high-aspect-ratio needles, e.g., greater than 2 (e.g., 800 pm in height with a base width of 350 pm).
- Other needle height-to-base ratios may be employed, e.g., 1, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2.0.
- the needle height-to-base ratios can be greater than 2.
- the soft-scalp-electronic system 110b, in the example shown in Fig. 3A, includes the electrode arrays 104'.
- the electrode arrays 104’ are connected to ADC front-end circuitries (e.g., comprising ADCs 132) of the soft-scalp-electronic system 110b.
- the soft-scalp-electronic system 110b includes the network interface (e.g., 134) that can communicate the acquired EEG signals to a BMI control system (e.g., 112).
- the BMI control system 112a is configured to process sequences from the EEG recordings via a machine-learning classification algorithm (e.g., 114) to generate SSVEP classifications that can be used as control signals to control VR/AR targets in a VR/AR system environment.
- Fig. 4B shows an example method 430 of operation to configure and re-configure the BMI control system (e.g., 112) during run-time operation.
- Method 430 includes acquiring (432) the EEG signals, e.g., via the soft-scalp-electronic system 110b.
- the EEG signals may be acquired during a calibration operation. Multiple training trials may be performed to acquire sufficient data to train a machine-learning model with minimal bias.
- the calibration operation may be performed prior to each session.
- Method 430 then includes transmitting (434) the acquired signals to a training system (e.g., cloud infrastructure).
- the training system can pre-process (436) the acquired signals via segmentation (438) (e.g., segmenting data into windows in the range of 0.8 to 2 seconds), filtering (440) (e.g., using a bi-directional 3rd-order high-pass Butterworth filter with a corner frequency of 2.0 Hz), and a rescaling operation (441) (e.g., linearly rescaling between -0.5 and 0.5).
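The segmentation, filtering, and rescaling steps described above can be sketched as follows. This is a minimal illustration only: the sampling rate is an assumed value, and the window length may be anywhere in the 0.8 to 2-second range.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def segment_filter_rescale(eeg, fs=250.0, window_s=0.8):
    """Pre-process raw EEG per the training pipeline described above.

    eeg: (n_channels, n_samples). Returns (n_segments, n_channels, n_window).
    fs is an assumed sampling rate; window_s may range from 0.8 to 2 s.
    """
    # bi-directional (zero-phase) 3rd-order high-pass Butterworth, 2.0 Hz corner
    sos = butter(3, 2.0, btype="highpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, eeg, axis=-1)
    # linearly rescale the whole recording into [-0.5, 0.5]
    lo, hi = filtered.min(), filtered.max()
    scaled = (filtered - lo) / (hi - lo) - 0.5
    # segment into non-overlapping windows of window_s seconds
    n = int(window_s * fs)
    n_seg = scaled.shape[-1] // n
    trimmed = scaled[:, : n_seg * n]
    return trimmed.reshape(scaled.shape[0], n_seg, n).transpose(1, 0, 2)

# Example: 6 channels, 4 seconds of synthetic data at the assumed 250 Hz
rng = np.random.default_rng(1)
segments = segment_filter_rescale(rng.standard_normal((6, 1000)))
```

With the assumed 250 Hz rate and a 0.8 s window, 1000 samples yield five 200-sample segments per channel, each value within [-0.5, 0.5].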
- the training system may perform a classification operation (438) by training variations of the Spatial-CNN model with hyperparameter adjustments (e.g., size of filters, number of filters, and number of convolution steps).
- the training system determines (440) if performance is improved. If so, the training system then transmits (442) the model parameters to a run-time system (e.g., the BMI control system 112).
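A minimal sketch of this sweep-and-deploy-if-improved logic follows. The training and evaluation callables are placeholders for the Spatial-CNN training and validation steps, and the toy scores are invented for illustration; only the grid keys mirror the hyperparameters named above.

```python
from itertools import product

def select_model(train_fn, eval_fn, baseline_score, grid):
    """Hyperparameter sweep with deploy-if-improved gating.

    train_fn(params) -> model; eval_fn(model) -> accuracy in [0, 1].
    Returns (model, score) only when the sweep beats the deployed
    baseline, else (None, baseline_score) so the run-time model is
    left unchanged.
    """
    best_model, best_score = None, baseline_score
    keys = sorted(grid)
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        model = train_fn(params)
        score = eval_fn(model)
        if score > best_score:  # only transmit improved models
            best_model, best_score = model, score
    return best_model, best_score

# Toy stand-ins: "training" records the params, "evaluation" is scripted.
scores = {(16, 3): 0.71, (32, 3): 0.78, (32, 5): 0.74, (16, 5): 0.69}
grid = {"n_filters": [16, 32], "n_conv_steps": [3, 5]}
model, score = select_model(
    lambda p: (p["n_filters"], p["n_conv_steps"]),
    lambda m: scores[m],
    baseline_score=0.75,
    grid=grid,
)
```

Here only the variant scoring 0.78 beats the 0.75 baseline, so it is the one that would be transmitted to the run-time system.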
- the EEG BMI system 100b for SSVEP can be used in combination with the EEG BMI system 100a for MI.
- FIG. 4A illustrates a method 400 of operating an embodiment of the systems disclosed herein to output control signals.
- Method 400 includes providing (402) a low-profile EEG sensor, e.g., as described in relation to Figs. 1, 2A, 3A, or other BMI configurations described herein.
- Method 400 then includes receiving (404), by a processor of a brain-machine interface operatively connected to the set of low-profile EEG sensors, EEG signals acquired from the low-profile EEG sensors. Examples of the data acquisition are provided in relation to Figs. 1, 2A, and 3A.
- Method 400 then includes continuously classifying (406), by the processor, brain signals as control signals via a trained neural network from the acquired EEG signals.
- Method 400 then includes outputting (408), by the processor, the control signals to a virtual reality environment controller to actuate a command in a VR scene generated by the virtual reality environment controller to be viewed by the subject.
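The continuous-classification loop of method 400 can be sketched as a sliding window over a stream of samples. The window and step sizes below are illustrative assumptions, not values from the disclosure, and the classifier is a placeholder for the trained neural network.

```python
import numpy as np

def classify_stream(samples, classifier, n_channels, window=200, step=50):
    """Continuously classify streamed EEG into control signals.

    samples yields one (n_channels,) sample at a time; classifier maps
    a (n_channels, window) array to a command. A command is emitted
    every `step` samples once the first full window has accumulated.
    """
    buf = np.zeros((n_channels, 0))
    for s in samples:
        buf = np.concatenate([buf, np.reshape(s, (n_channels, 1))], axis=1)
        if buf.shape[1] == window:
            yield classifier(buf)   # emit a control signal
            buf = buf[:, step:]     # slide the window forward

# Toy usage: 300 one-sample frames; the "classifier" just reports the shape
stream = (np.zeros(2) for _ in range(300))
commands = list(classify_stream(stream, lambda w: w.shape, n_channels=2))
```

With 300 samples, a 200-sample window, and a 50-sample step, three commands are emitted (at samples 200, 250, and 300).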
- FIG. 5A illustrates a method 500 of fabricating an array of electrodes on a substrate according to an example embodiment of the present disclosure.
- the area of each electrode array can be about 36 mm², which can improve the EEG spatial resolution over the conventional, large cup electrodes (100 mm²) that typically require conductive gels.
- Other array sizes can be employed, e.g., about 10 mm², about 15 mm², about 20 mm², about 25 mm², about 30 mm², about 35 mm², about 40 mm², about 45 mm², or about 50 mm², in which "about" refers to a range of ±1 mm, ±2 mm, or ±2.5 mm.
- the array size can be between 50 mm² and 100 mm².
- Typical EEG electrodes with conductive gels are about 100 mm² in size.
- Process 500 includes providing (502) a master negative PDMS mold.
- Process 500 includes creating (504) an epoxy positive mold using the negative PDMS mold.
- the positive mold can be formed by an epoxy resin (e.g., the resin EpoxAcast manufactured by Smooth On, Inc).
- Process 500 includes transferring (506) the epoxy positive mold to a glass slide.
- An adhesive layer of PDMS can be used to bond the mold onto the glass slide.
- Image 506’ shows a positive epoxy mold.
- Steps 502 to 506 can be repeated to form additional copies of the epoxy positive mold.
- the PDMS negative mold can be treated with ambient air plasma (e.g., for 2 minutes).
- Process 500 includes positioning (508) the copies of the epoxy positive mold in an array inside a tray.
- Process 500 includes adding (510) additional PDMS to the tray to form a new set of negative molds.
- the mold illustrated in step 508 is a 4x2 array, but it should be understood that any number of epoxy molds created in steps 502 to 506 can be used.
- Process 500 includes releasing (512) the PDMS negative mold from the tray used in steps 508 and 510.
- Image 512 shows a negative silicone mold formed from PDMS.
- Image 522’ shows the final coated electrode.
- the mold components are formed in steps 502 through 512.
- Process 500 first includes adding (514) a dilute polyimide (PI) solution to the PDMS negative mold, which can then be soft-baked (e.g., at 80 °C for 10 minutes).
- Process 500 includes adding (516) a second dilute polyimide solution to the mold, which can then be hard-baked (e.g., at 200 °C for 1 hr).
- the first dilute polyimide solution is a 3:1 ratio solution
- the second dilute polyimide solution is a 2:1 solution.
- Process 500 includes removing (518) the PI needles from the mold, e.g., peeling the polyimide microneedle array (PI MNA) from the PDMS mold.
- Process 500 may include placing (520) the PI needles on a PDMS-coated slide.
- a thin layer of Polyimide (PI) (e.g., PI sold under the trademark PI 2610 by HD Microsystems) can be applied in this step.
- Process 500 then includes coating (522) the PI needles using sputter deposition, e.g., via Cr coating and then Au coating, where the thicknesses of Cr and Au are 5 nm and 200 nm, respectively.
- the sputtering can be performed in multiple steps.
- the top or bottom surface of the PI needles can be sputter coated in one step, and then the remaining surface can be sputter coated in the next step.
- Example #2 Method of Fabrication of Flexible Microneedle Array
- FIG. 5B illustrates another method 530 of fabricating an array of electrodes on a substrate according to an example embodiment of the present disclosure.
- Method 530 includes creating a PDMS negative mold (512), e.g., as described in relation to Fig. 5A.
- Process 530 then further includes adding (532) a thin layer of epoxy (e.g., EP4CL-80MED, Master Bond Inc.) to form the needles 533.
- the epoxy can be a one-part epoxy selected for its high tensile and compressive strength, as well as its biocompatibility.
- A one-part epoxy can be used without a solvent, which prevents the epoxy from bubbling, and it does not require a mixing step, avoiding air being introduced into the epoxy during mixing.
- Process 530 then includes adding (534) a perforated polyimide sheet (535) to the needles.
- Image 534 shows an example design of the perforated polyimide sheet, and image 534' shows an example fabricated perforated polyimide sheet, which has high compliance and flexibility.
- Process 530 then includes performing a low-temperature cure (536) (e.g., 100 °C for 1 hour). Low-temperature curing can be employed to allow the molds to be reused over more replica-molding cycles. In contrast, in some embodiments, a high-temperature PI cure can destroy the PDMS molds in as few as three fabrication cycles.
- Process 530 then includes releasing (538) the needle structure from the mold and placing it on a PDMS-coated slide.
- the needle can be coated (540) by sputtering Cr/Au on both sides of the needle structure.
- Image 538' shows the needle assembly prior to the Cr/Au coating, and image 540' shows the needle assembly after the Cr/Au coating.
- FIGS. 5C and 5D illustrate a method 550 of fabricating an example flexible main circuit in accordance with an illustrative embodiment.
- the flexible main circuit can include a polyimide substrate that can be sufficiently thin to allow electrode flexion to conform to the scalp surface.
- Method 550 includes spin coating (552) the PDMS on a cleaned silicon wafer
- Image 552’ shows the spin-coated PDMS.
- Method 550 then includes spin coating (553) polyimide (e.g., at 4000 rpm for 1 min) and baking it in a vacuum oven (e.g., at 250 °C for 3 hr, including ramping steps).
- Image 553’ shows the first polyimide spin-coated wafer.
- Method 550 then includes sputtering (554) copper (e.g., 500 nm copper) for the bottom circuit layer.
- Image 554’ shows the first copper deposited wafer.
- Method 550 then includes patterning (555) the wafer by spin coating photoresist
- Method 550 then includes etching (556) the exposed copper with a copper etchant
- Image 556’ shows the bottom Cu etched circuit with the Cu etching performed thereon.
- Method 550 then includes spin coating (557) polyimide (e.g., at 850 rpm for 1 min) and baking it in a vacuum oven (e.g., at 250 °C for 3 hr, including ramping steps).
- Image 557’ shows the second polyimide spin-coated wafer.
- Method 550 then includes patterning (558) the wafer by spin coating photoresist
- Method 550 then includes exposing (559) the PI to an oxygen plasma etch using reactive ion etching (Plasma-Therm) and removing the photoresist.
- Image 559' shows the polyimide circuit etched with vias.
- Method 550 then includes depositing (560) a second Cu layer by sputtering (e.g.,
- Image 560’ shows the 2 nd deposition wafer.
- Method 550 then includes patterning (561) the wafer by spin coating photoresist
- Method 550 then includes etching (562) exposed copper with a copper etchant (APS-100, diluted 1:1 with DI water) and then removing the photoresist.
- Image 562’ shows the top Cu etched circuit.
- Method 550 then includes spin coating (563) polyimide (e.g., at 4000 rpm for 1 min) and baking it in a vacuum oven (e.g., at 250 °C for 3 hr, including ramping steps).
- Image 563’ shows the third polyimide spin-coated wafer.
- Method 550 then includes patterning (564) the wafer by spin coating photoresist
- Method 550 then includes performing (565) an oxygen plasma etch on the exposed PI using reactive ion etching (Plasma-Therm) and stripping the photoresist to produce the final flexible circuit.
- Image 565' shows the polyimide-etched top circuit with the exposed Cu deposited layer.
- Method 550 then includes installing (566) ICs on the flexible circuit by transferring the circuit to a glass slide (see images 566’); reflowing solder onto chip components to install the ICs (see image 566”); and encapsulating the full circuit (e.g., 110) in an elastomer (see image 566”’).
- Fig. 5D shows the final fabricated flexible circuit, which is bent over a glass slide. Alternative methods are described in Mahmood et al. 2021; Mahmood et al. 2019; and Zavanelli et al. 2021.
- FIG. 5E illustrates a method 570 of fabricating an example flexible interconnect (e.g., 128).
- Method 570 may employ a femtosecond laser cutter (WS-Flex USP, OPTEC) to fabricate the stretchable interconnect (e.g., 128) using a micro-machining process.
- Method 570 may include three main fabrication processes (PET substrate preparation for the polyimide film, sputtering Cr/Au on the polyimide film, and laser-cut patterning).
- Method 570 may include spin-coating
- Method 570 may then include depositing (573) excess PI 2610 and spin-coating it (e.g., at 3000 rpm for 1 minute); performing a first baking step on a hot plate (e.g., at 70 °C for 30 min); after the first baking step, removing the PI film from the PDMS/PET substrate and taping (576) it directly to a clean hot plate (to prevent curling and contraction from heat); and then proceeding with a second baking operation (578) (e.g., at 100 °C for 15 min, then 150 °C for 15 min, then 200 °C for 15 min, then 250 °C for 15 min).
- Method 570 may first include taking a 0.5 mil sheet of PI film (Kapton HN Film, 0.5 mil, DuPont), cleaning it thoroughly, e.g., first with IPA, then with acetone, and drying after each cleaning. Method 570 may then include cutting the PI film into sheets of size 6 in x 4 in to fit inside the sputter machine. Method 570 may then include sputtering (574) Cr/Au (10 nm / 200 nm) on the PI film.
- Method 570 includes reapplying the PI film sandwich onto the PDMS-on-PET substrate and, using a femtosecond laser cutter (WS-Flex USP, OPTEC), securing the materials to the stage using a vacuum. Method 570 may then include preparing the material by aligning it with the stage and zeroing the laser head so that the masked areas align with the interconnect ends in the design.
- the pattern can be cut, e.g., in IFOV mode with a 60 kHz pulse, 60 movement speed, 60 jump speed, 12% power, and 2 repetitions (3 passes total).
- Method 570 may then include peeling (582) (e.g., manually peeling) the final, cut interconnects from the PDMS substrate using a fine-tipped tweezer.
- Image 582’ shows the patterned interconnectors prior to it being peeled.
- Image 584 shows the patterned interconnectors as it is being peeled.
- Images 584 show example stretchability characteristics of the flexible interconnect (e.g., 128) at 0%, 50%, and 100% stretching (584a, 584b, and 584c, respectively).
- Plot 586 shows a mechanical test result of the flexible interconnect (e.g., 128) over a set of cycles
- plot 588 shows a strain/resistance curve for the flexible interconnect (e.g., 128). The test shows mechanical fracture after 250% tensile stretching.
- a substrate for the interconnector is prepared by an electron-beam evaporating
- FIG. 6A illustrates an overview of the study to develop a virtual reality (VR) implementation for motor imagery training and real-time control of a video game demonstration.
- the study evaluated a fully portable, wireless, soft scalp electronics that includes at least three major components: 1) multiple flexible microneedle electrodes for mounting on the hairy scalp, 2) laser-machined stretchable and flexible interconnects, and 3) a low-profile, flexible circuit.
- the study also included a virtual reality (VR) component that can allow for a convenient and immersive training environment to assist with motor visualization.
- VR virtual reality
- These components were used in the study as a monolithic EEG system optimized for minimizing motion artifacts and maximizing signal quality.
- Epidermis-penetrating electrodes were employed to provide optimal impedance density on the scalp, improve the signal-to-noise ratio, and improve spatial resolution for MI recording.
- embodiments of the exemplary devices and systems provide a feasible approach to a high-performance BMI system that can operate well with a powerful machine-learning algorithm and in a virtual reality environment.
- embodiments of the present disclosure can employ imperceptible, hair-wearable systems with only 6 EEG channels and can achieve high accuracy of 93.22 ± 1.33% for four classes with a peak information transfer rate of 23.02 bits/min with four human subjects.
- the study developed a customized Android application configured to provide real-time, continuous motor imagery classification of 6 channels of MI signals.
- the Android application was used to evaluate the training and testing processes of the VR environment.
- the system presented modified views (630) of VR visuals to a subject with text and animation prompts that are designed for MI response testing.
- VR screen 632 is an example VR scene comprising disembodied hands and feet.
- VR screen 633 is an example VR scene that includes clear color-coded visual cues and text prompt that can be actuated by the user through motor imagery.
- the developed application showed the VR scene 635 along with the associated EEG signals 637 that were observed by the measurement equipment.
- Plots 619 show the acquired EEG signals 637 from one of the interfaces of the Android application.
- Fig. 6B shows the neural network training of the MI classification system used in the study. In the example shown in FIG. 6B, training for a spatial convolutional neural network for motor imagery classification is shown.
- the system acquired six EEG channels (618) and decomposed them into spatial features from multiple dipolar sources of the motor cortex.
- the input (618) included six channels of EEG data having a pre-defined sample size (shown in this example as 1000 samples).
- the spatial convolutional neural network employed in the study included a number of hidden layers (634) (shown as “2D convolution” 634a and “2D Spatial Convolution” 634b, 634c, 634d, 634e).
- a flatten step can be performed to generate a dense output layer.
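- The input-to-dense data flow described above can be sketched as a shape check (a minimal numpy sketch, not the study's implementation: the (10, 1) temporal filter follows the training description given for the channel-selection analysis, while the single channel-spanning spatial kernel, the use of one filter per layer, and the naive convolution are simplifying assumptions):

```python
import numpy as np

def conv2d_valid(x, kernel):
    """Naive 'valid' 2D cross-correlation (no padding, stride 1)."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

# Input 618: six EEG channels with a pre-defined sample size of 1000.
x = np.zeros((6, 1000))

# First hidden layer: a standard temporal convolution within each channel
# (filter size (10, 1), i.e., 10 samples along time).
h1 = conv2d_valid(x, np.ones((1, 10)))   # shape (6, 991)

# Spatial convolution: the kernel spans the channel axis, mixing the six
# channels into spatial features from multiple dipolar sources.
h2 = conv2d_valid(h1, np.ones((6, 1)))   # shape (1, 991)

# Flatten into a vector feeding the dense output layer.
flat = h2.ravel()                        # 991 features
```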
- FIG. 6C includes a comparison plot 640 of spatial-CNN classification accuracy among four analysis bases, including raw data, high-pass filtered data (HPF), band-pass filtered data (Bandpass), and power spectral density analysis (PSDA). The analysis was conducted and is presented over multiple window lengths (1, 2, and 4 seconds). The error bars show a standard error from four subjects.
- FIG. 6C also includes a second comparison plot 642 of spatial-CNN classification accuracy between using conventional Ag/AgCl gel electrodes and the exemplary FMNE. The analysis was also conducted across multiple window lengths (1, 2, and 4 seconds), and the error bars show a standard error from four subjects.
- FIG. 6C also illustrates two confusion matrices (644, 646), indicating results from a real-time accuracy test of motor image brain data acquired by conventional Ag/AgCl electrodes and by the exemplary FMNE.
- FIG. 6C also illustrates two additional confusion matrices (648, 650) indicating results from a real-time accuracy test of motor image brain data, acquired using Spatial-CNN classifier and using a standard-CNN classifier.
- FIG. 6E illustrates a table showing the comparison of the example embodiment to other devices as reported in the literature [15, 20, 25, 26, 27, 21]. Indeed, it can be observed that the exemplary BMI sensor device and system can provide accurate control, e.g., for a virtual reality game using an MI paradigm. In the table, it is shown that the exemplary BMI sensor device and system, in an implementation, can provide 93%+ accuracy and an ITR of about 23 bits/minute using only 6 electrodes. Other performance results for other configurations are also reported herein.
- FIG. 6D shows the results of a preliminary analysis performed in the study to evaluate the optimized number of channels and their selection. Data from 44 channels out of a conventional 128-channel set, from 13 healthy and normal subjects, were considered for the analysis, as performed in a prior work (High-Gamma Dataset) [1]. The analysis determined the 6 optimal channels from the 44 channels that were the most meaningful.
- the 6 channels were then implemented as the sensory array set, e.g., as shown and described in relation to Fig. 2.
- Other channels may be similarly selected depending on the patient or subject that is evaluated, including those that are symptomatic.
- In Fig. 6D, the full dataset (652) employed in the study is shown.
- the data were preprocessed using a 3rd-order Butterworth bandpass filter, with corner frequencies at 4 Hz and 30 Hz, and split into windows of 4 seconds.
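- That preprocessing step can be sketched with SciPy (a hedged sketch: the 250 Hz sampling rate and the non-overlapping windowing are assumptions not stated in this excerpt; only the 3rd-order Butterworth band-pass with 4 Hz and 30 Hz corner frequencies and the 4-second windows come from the text):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250  # Hz; assumed sampling rate (not given in the text)
b, a = butter(N=3, Wn=[4, 30], btype="bandpass", fs=fs)

def preprocess(eeg, win_sec=4):
    """Band-pass each channel (3rd-order Butterworth, 4-30 Hz corner
    frequencies) and split the recording into 4-second windows."""
    filtered = filtfilt(b, a, eeg, axis=-1)
    win = win_sec * fs
    n_win = filtered.shape[-1] // win
    trimmed = filtered[..., : n_win * win]
    return trimmed.reshape(eeg.shape[0], n_win, win).swapaxes(0, 1)

eeg = np.random.randn(6, 10 * fs)  # 6 channels, 10 s of data
windows = preprocess(eeg)          # (n_windows, channels, samples)
```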
- the data were used to train (654) a convolutional neural network (CNN) with standard convolutions on the first layer, with a filter size of (10, 1), followed by four spatial convolutional layers, to generate a trained network.
- the data (652) were also evaluated (656) using a generator that cycled through the data channels (while eliminating the remaining channels) to calculate the output perturbation on the selected channels. The resulting data were fed into the trained network (generated from 654). The output perturbations were compared with the true expected outputs to generate (658) the relative perturbations for that channel. These relative perturbations were summed (662) over the classes to generate a final perturbation value for each of the channels.
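- The channel-screening loop can be sketched as follows (a hypothetical reading of the procedure: the code keeps one channel, zeros the rest, and sums the absolute output deviation from the true expected outputs; the exact perturbation metric and the ranking direction are not specified in this excerpt and are assumptions here):

```python
import numpy as np

def rank_channels(model, X, y_true):
    """Score each channel by the output perturbation of the trained network
    when all other channels are eliminated (zeroed), then rank channels."""
    n_trials, n_channels, n_samples = X.shape
    perturbation = np.zeros(n_channels)
    for c in range(n_channels):
        X_only_c = np.zeros_like(X)
        X_only_c[:, c, :] = X[:, c, :]          # keep only channel c
        out = model(X_only_c)
        # perturbation vs. the true outputs, summed over classes/trials
        perturbation[c] = np.abs(out - y_true).sum()
    # smallest perturbation first: channels that alone best reproduce the
    # true outputs are treated as the most informative (an assumption)
    return np.argsort(perturbation)

# Toy check with a "network" that only reads channel 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 6, 100))
model = lambda data: data[:, 2, :].mean(axis=1)
ranking = rank_channels(model, X, model(X))     # channel 2 ranks first
```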
- the results are then compared and ranked (664).
- the bar chart shows each channel's relative perturbations with the top-6 channels labeled.
- the instant study employed a reduced number of electrodes (i.e., 6), which also reduced the complexity of the setup without significantly reducing classification performance.
- FIG. 7A shows the results from a quantitative mechanical test conducted during the study for the buckling force performance of the exemplary microneedle electrode, e.g., fabricated using processes described in relation to Fig. 5B. Specifically, Fig. 7A shows SEM observations (710) of microneedle electrodes evaluated under a motorized force gauge applying an axial force upon a single microneedle. Plot (712) shows the force versus displacement curve from the buckling force evaluation for five electrodes. It was observed that the five fabricated FMNEs could withstand an average applied force of at least 626 mN, which is well above the skin insertion force (20 - 167 mN) of a single microneedle [17].
- FIG. 7B shows results from a cycle bending mechanical test to evaluate the flexibility of the exemplary FMNEs, e.g., to evaluate mechanical robustness in tissue insertion.
- the exemplary FMNE were configured as gold-based electrodes to be mounted on the skin surface, which are safe to use due to their excellent biocompatibility.
- Diagram 709 shows a schematic of the cross-section of the test specimen. During the test, needle electrodes were continuously bent up to 100 times with a radius of curvature of 5 mm while the change of electrical resistance was measured. The result shows a negligible resistance shift of less than 0.6 Ω.
- Table 1 shows the results of a comparison study of impedance and impedance density of microneedle (MN) electrodes. In the study, different microneedle designs of varying heights were evaluated. The design included a fixed base width of 200 μm and a pitch of 500 μm in a 14 × 14 array.
- BMIs: brain-machine interfaces; EEG: electroencephalography; SSVEPs: steady-state visually evoked potentials
- practical applications are limited because the required array of visual stimuli impedes the operator's view.
- the bright, flickering stimuli can cause eye strain and fatigue when used for extended periods.
- motor imagery is a highly advantageous paradigm for persistent BMI as it does not require the use of external stimuli; its classes are based on imagined motor activities such as opening and closing a hand or moving the feet [14, 15]. With MI, specified motor imagery tasks can result in sensorimotor rhythm fluctuations in the corresponding motor cortex region, which can be measured with EEG.
- FIGS. 8A, 8B, 8C, 8D, 8E, 8F, and 8G each show aspects of a study to develop a virtual reality (VR) implementation for SSVEP training and real-time control using the example EEG brain-machine-interface system in accordance with an illustrative embodiment.
- a platform was configured for split-eye asynchronous stimuli operation and evaluated for information-throughput as a portable brain-computer interface (BCI).
- the study confirmed, among other things, that a VR interface with 33 stimuli classes can be operated with real-time, wireless recording of SSVEP for text spelling.
- the soft wearable platform included a flexible circuit, stretchable interconnectors, and dry needle electrodes; they operated together with a VR headset to provide the fully portable wireless BCI.
- the study also demonstrated that the skin-conformal electrodes provide biocompatible, consistent skin-electrode contact impedance for a high-quality recording of SSVEP.
- the exemplary wireless soft electronic system showed superior performance in the SSVEP recording.
- the Spatial CNN classification method, integrated with the soft electronics, provided real-time data processing and classification, showing accuracy from 78.93% for 0.8 seconds to 91.73% for 2 seconds with 33 classes from nine human subjects.
- the bioelectronic system with only four EEG recording channels demonstrated high ITR performance (243.6 ± 12.5 bits/min) compared to prior works, allowing for a successful demonstration of VR text spelling and navigation in a real-world environment.
- the system could significantly reduce the impedance density while allowing for smaller electrodes than in the conventional setting and improving spatial resolution for MI detection.
- the FMNE achieved superior SNR.
- the study used a soft bioelectronic system with multiple components, including a VR headset, dry needle electrodes (e.g., 102), stretchable interconnectors (e.g., 109), and wireless flexible circuits (e.g., 110).
- the study evaluated the mechanical reliability of the various components.
- the study also evaluated the performance of different electrodes for SSVEP stimulus setups.
- the training setup involved a subject wearing the VR head-mounted display (HMD)
- a subject can wear the soft electronics with dry needle electrodes (hairy site) and wireless circuit (neck), secured by a headband, along with the VR HMD for recording brain signals.
- Fig. 8A shows a VR text speller developed and evaluated during the study.
- Plot 806 shows example measured EEG data from four EEG channels that were transferred via Bluetooth Low Energy (BLE) communication to a central processor (an Android device), where the signals are processed and classified in real-time.
- Computer-rendered output (808) shows the text-speller interface generated in the VR environment.
- Flow diagram (810) shows the operation flow of the Android software developed for the BCI demonstration that was used to generate the text-speller interface (808).
- the Unreal Engine program (further discussed below) was employed to render the text speller software and stimulus overlay to the user.
- the software included operations for a passthrough camera to allow for the navigation of a real-world environment via an augmented reality viewport using an electric wheelchair.
- the study implemented the system on two sets of hardware: a VR-HMD viewport (812) and the augmented reality viewport (814).
- the SSVEP commands can be utilized for navigation control (816).
- the split-eye asymmetric stimuli (SEAS) platform was generated with a widely used cross-platform software development engine (Unreal Engine 4.26, Epic Games Inc.) targeting VR hardware (Oculus Quest 2, Facebook).
- Materials can be animated using "sprites," which are animated raster graphics, where consecutive frames are arranged in an n × n "sheet."
- Using Unreal Engine's built-in texture animation feature, these frames were extracted and rendered. These materials were used to animate most 2D or 3D objects and flat surfaces in the engine environment.
- the first step was to generate the sheets with the relevant frames based on the frame rate.
- a program was devised in MATLAB to generate the sinusoidal waveform, convert that waveform into a tile, with the brightness based on the sine wave's amplitude, then arrange those tiles into a 10x10 "sheet" for Unreal Engine's texture rendering system.
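- The MATLAB program itself appears in Figs. 8F and 8G; the tile-sheet idea can be illustrated with a short sketch (Python here for consistency; the 60 fps frame rate, tile size, and brightness mapping are assumptions not given in the text):

```python
import numpy as np

def sprite_sheet(freq_hz, fps=60, tiles=10, tile_px=32):
    """Build a 10x10 'sheet' of uniform tiles whose brightness follows a
    sinusoid at freq_hz, one tile per animation frame (hypothetical values:
    60 fps display rate and 32x32-pixel tiles)."""
    n_frames = tiles * tiles
    t = np.arange(n_frames) / fps
    brightness = 0.5 + 0.5 * np.sin(2 * np.pi * freq_hz * t)  # in [0, 1]
    sheet = np.zeros((tiles * tile_px, tiles * tile_px))
    for k, b in enumerate(brightness):
        r, c = divmod(k, tiles)
        sheet[r * tile_px:(r + 1) * tile_px,
              c * tile_px:(c + 1) * tile_px] = b
    return sheet

sheet = sprite_sheet(10.0)  # a 10 Hz stimulus sheet, shape (320, 320)
```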
- Figs. 8F and 8G include the MATLAB code and specific instructions for Unreal Engine to generate the VR interface.
- In Fig. 8G, an example of stimulus tile generation is shown, with the waveforms and the corresponding tile layout in 10 × 10 sprites.
- the cross-platform software (Unreal Engine, Epic Games) was used to develop an animated texture that appears differently on the left- and right-hand sides of the VR panel.
- Table 2 shows the left and right eye frequencies and phases.
- each subject's skin was cleaned by gently rubbing with an alcohol wipe, and dead skin cells were removed using an abrasive gel (NuPrep, Weaver and Co.) in order to maintain a contact impedance below 10 kΩ on all electrodes.
- the abrasive gel was removed using an alcohol wipe and the surface dried using a clean paper towel.
- the only skin preparation conducted was a gentle rub of the electrode location with an alcohol wipe.
- the EEG data were recorded using a custom application running on an Android Tablet (Samsung Galaxy Tab S4), using Bluetooth Low Energy wireless communication.
- Fig. 8B shows the results of a preliminary performance evaluation for different electrode positions. Preliminary SSVEP and SEAS tests were performed in order to test the feasibility of using stimuli in the VR environment before running a text speller setup with 32 stimuli.
- Plot 818 shows the average classification accuracies for two SSVEP stimulation sets across multiple time windows (0.8 - 2 seconds).
- For the first set of tests (labeled as 'Set 1', per Table 2), ten standard stimuli between 10.0 and 17.2 Hz were generated to determine the separability of SSVEP stimuli (details in Table S1).
- Another test set ('Set 2', per Table 3) includes left-eye frequencies ranging between 10.0 and 17.7 Hz and right-eye frequencies ranging between 16.9 and 9.2 Hz, respectively.
- Results from Set “1” show high accuracies with short-time samples.
- eight subjects in Set 1 demonstrated 91.25 ± 1.40% accuracy at a window length of only 0.8 seconds. This result shows a high-throughput ITR of 206.7 ± 7.3 bits/min.
- the overall accuracy increases significantly: 93.88 ± 1.11% at 1.0 sec, 95.03 ± 0.97% at 1.2 sec, and 98.50 ± 0.34% at 2.0 sec.
- FIG. 8B also shows the results of the evaluation of two electrode-position configurations: configuration “A” (824) and configuration “B” (826).
- Plot 822 shows the results comparing the classification accuracy between the two electrode arrangements. From the plot (822), it can be observed that configuration “A” demonstrated stronger performance than configuration “B” for the subjects that were evaluated. Error bars in graphs represent the standard error of the mean. In configuration “A” 824, two channels are biased to each one respective hemisphere. In configuration “B” 826, all channels share a central reference.
- Information transfer rate (ITR) is calculated based on the number of targets, the average time required to relay a command, and the classification accuracy per Equation 1.
- N is the number of targets
- A is the accuracy
- w is the total time required to execute a command, including data acquisition time plus processing, classification, and execution latencies.
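- Equation 1 is not reproduced in this excerpt; the standard Wolpaw ITR formula, consistent with the variables N, A, and w above, can be checked against the study's reported numbers (a sketch treating w as the 0.8 s acquisition window alone, ignoring the additional latencies):

```python
import math

def itr_bits_per_min(N, A, w):
    """Wolpaw information transfer rate: N targets, accuracy A,
    and w seconds per executed command."""
    if A >= 1.0:
        bits_per_selection = math.log2(N)
    else:
        bits_per_selection = (math.log2(N)
                              + A * math.log2(A)
                              + (1 - A) * math.log2((1 - A) / (N - 1)))
    return bits_per_selection * 60.0 / w

# 33 classes at 78.93% accuracy with 0.8 s windows -> ~243.6 bits/min,
# matching the peak ITR reported for the soft electronics setup.
itr = itr_bits_per_min(33, 0.7893, 0.8)
```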
- CNN Classification Performance: To train the CNN, testing data were segmented on the initial time the stimulus was presented. For each time frame (0.8 - 1.2 s), only the first period was used, and the rest was discarded. After segmenting, the data were preprocessed using a 3rd-order Butterworth high-pass filter with a corner frequency of 2.0 Hz. No other preprocessing was applied before training and classification. For Sets “1” and “2”, the samples, N, were subdivided into groups of 10 for 10-fold cross-validation. For Set “3”, the samples N were subdivided into groups of 4 for 4-fold cross-validation for faster setup times. The classification was performed using a convolutional neural network (CNN) with spatial convolutions (Bevilacqua et al.).
- FIG. 8C shows the results of the classification performance for the CNN classifier used in the study.
- the CNN classifier employed a spatial-CNN classification.
- In FIG. 8C, an overview (828) of the spatial CNN model is shown, which includes its various hidden layers and their extracted features from a 1-sec segment of 4-channel EEG signals.
- the study used a stimulus setup having a left stimulation frequency of 8.2 Hz and a right stimulation frequency of 13.2 Hz.
- Plots 830 and 832 each show the performance results for the two sets of experiments.
- Plot 830 shows the classification accuracy
- plot 832 shows the average ITR for each of the two sets of experiments.
- the commercial setup showed (via plots 830 and 832, respectively) 74.72 ± 3.03% accuracy from 0.8 sec of data (ITR: 222.4 ± 15.0 bits/min). Longer time lengths were observed to offer slightly higher accuracy, as expected.
- the exemplary soft electronic system showed (via plots 830 and 832, respectively) a substantial increase in the classification accuracy and ITR with 78.93 ± 2.36% and 243.6 ± 12.5 bits/min, respectively.
- this study demonstrates the unique advantage of using the wireless soft platform with dry electrodes over the conventional tethered system with required skin preparation and wired electrodes.
- FIG. 8D shows the results of an evaluation of the effects of stimuli frequency and phase shift. The study was performed using the conventional and exemplary soft electronics setups. Plot 834 shows the left- and right-eye frequency response corresponding to consecutive stimuli visualized, and plot 836 shows the corresponding left- and right-eye phase offsets.
- Plot 838 shows a confusion matrix generated from the results of the soft electronics for the 33-class SEAS stimuli (for nine subjects).
- Plot 840 shows the same results under the same experimental conditions for the conventional setup. It can be observed that for the single-frequency stimuli, most of the confusion is from neighboring frequencies. In contrast, dual-frequency stimuli show various mixing with both single- and other dual-frequency stimuli. This result showed that stimuli from one eye or the other are processed in both hemispheres of the visual cortex. In addition, the study also demonstrated that there are significant hemisphere-related asynchronies and mixing on which classification can be performed. The result shows, at a high level, one of the highest ITRs with as few as 4 EEG channels, compared to prior work.
- Fig. 8E shows a table of comparative performance between the exemplary soft electronics and prior works. As shown in the table, the exemplary soft electronics can achieve an ITR of 243.5 bits/minute for 33 classifications using 4 electrode channels with an accuracy near 80%.
- There are several causes of locked-in syndrome (LIS) in humans, including but not limited to: stroke of the brainstem, traumatic brain injury or hemorrhage, poisoning, or drug overdose.
- Brain activity analysis is typically used to diagnose LIS with instruments such as electroencephalography (EEG) to observe the sleep-wake patterns of the affected individuals.
- BCIs offer a potential solution to subjects with a severe physical disability such as LIS or quadriplegia, restoring some movement and communication to these individuals and improving quality of life.
- EEG design for BCI has trended towards wearables with wireless functionality since the standardization of common wireless protocols such as Bluetooth (Lin et al. 2010).
- Dry electrodes offer excellent, consistent long-term performance compared with gel-based electrodes (Norton et al. 2015; Salvo et al. 2012; Stauffer et al. 2018), provided the skin preparation and the amplifier, shielding, and electrode configurations are adequate (Li et al. 2017; Salvo et al. 2012).
- Lightweight sensors with minimal cabling also greatly reduce the dragging or movement artifacts seen with poorly configured conventional EEG (Tallgren et al. 2005).
- Embodiments of the present disclosure include portable VR-enabled BCIs using a soft bioelectronic system and the SEAS platform to use SSVEP.
- VR can be used to simultaneously present asynchronous SSVEP stimuli — different frequencies to each eye.
- the novel stimuli with VR, along with the soft, wearable wireless device, enable a 33-class high-throughput SSVEP BCI with high accuracy and low control latency. Using only four channels, an accuracy of 78.93 ± 1.05% for 0.8 seconds of data, for a peak information transfer rate of 243.6 ± 12.5 bits/min, was observed.
- the device achieves 91.73 ± 0.68% for two seconds of data at a throughput of 126.6 ± 3.7 bits/min. This performance is demonstrated with a real-time text-speller interface using a full keyboard-type setup.
- a “subject” may be any applicable human, animal, or other organism, living or dead, or other biological or molecular structure or chemical environment, and may relate to particular components of the subject, for instance, specific tissues or fluids of a subject (e.g., human tissue in a particular area of the body of a living subject), which may be in a particular location of the subject, referred to herein as an “area of interest” or a “region of interest.”
- a subject may be a human or any animal. It should be appreciated that an animal may be a variety of any applicable type, including, but not limited thereto, mammal, veterinarian animal, livestock animal or pet type animal, etc. As an example, the animal may be a laboratory animal specifically selected to have certain characteristics similar to humans (e.g., rat, dog, pig, monkey), etc. It should be appreciated that the subject may be any applicable human patient, for example.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22812304.8A EP4351422A1 (en) | 2021-05-27 | 2022-05-27 | Wireless soft scalp electronics and virtual reality system for brain-machine interfaces |
KR1020237044931A KR20240024856A (en) | 2021-05-27 | 2022-05-27 | Wireless soft scalp electronics and virtual reality systems for brain-device interfaces |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163194111P | 2021-05-27 | 2021-05-27 | |
US63/194,111 | 2021-05-27 | ||
US202263311628P | 2022-02-18 | 2022-02-18 | |
US63/311,628 | 2022-02-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022251696A1 true WO2022251696A1 (en) | 2022-12-01 |
Family
ID=84229383
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2022/031432 WO2022251696A1 (en) | 2021-05-27 | 2022-05-27 | Wireless soft scalp electronics and virtual reality system for brain-machine interfaces |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4351422A1 (en) |
KR (1) | KR20240024856A (en) |
WO (1) | WO2022251696A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100106041A1 (en) * | 2008-10-28 | 2010-04-29 | Georgia Tech Research Corporation | Systems and methods for multichannel wireless implantable neural recording |
US20150351690A1 (en) * | 2013-06-06 | 2015-12-10 | Tricord Holdings, Llc | Modular physiologic monitoring systems, kits, and methods |
US20170231490A1 (en) * | 2014-08-10 | 2017-08-17 | Autonomix Medical, Inc. | Ans assessment systems, kits, and methods |
US20190183430A1 (en) * | 2017-12-04 | 2019-06-20 | Advancing Technologies, Llc | Wearable device utilizing flexible electronics |
US20190247650A1 (en) * | 2018-02-14 | 2019-08-15 | Bao Tran | Systems and methods for augmenting human muscle controls |
-
2022
- 2022-05-27 KR KR1020237044931A patent/KR20240024856A/en unknown
- 2022-05-27 WO PCT/US2022/031432 patent/WO2022251696A1/en active Application Filing
- 2022-05-27 EP EP22812304.8A patent/EP4351422A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
KR20240024856A (en) | 2024-02-26 |
EP4351422A1 (en) | 2024-04-17 |