CN114237385B - Man-machine brain control interaction system based on non-invasive brain electrical signals - Google Patents


Info

Publication number
CN114237385B
Authority
CN
China
Prior art keywords
brain
electroencephalogram
control
module
electroencephalogram signal
Prior art date
Legal status
Active
Application number
CN202111385929.7A
Other languages
Chinese (zh)
Other versions
CN114237385A (en)
Inventor
连金岭
周瑾
王常勇
Current Assignee
Academy of Military Medical Sciences AMMS of PLA
Original Assignee
Academy of Military Medical Sciences AMMS of PLA
Priority date
Filing date
Publication date
Application filed by Academy of Military Medical Sciences AMMS of PLA
Priority to CN202111385929.7A
Publication of CN114237385A
Application granted
Publication of CN114237385B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

The invention discloses a man-machine brain-control interaction system based on non-invasive electroencephalogram signals. The system comprises an electroencephalogram signal acquisition module, a brain control mode selection module, a first visual stimulation display module, a second visual stimulation display module, a first electroencephalogram signal analysis module, a second electroencephalogram signal analysis module, a first control end, a second control end, a first execution end and a second execution end. The brain-control interaction interface based on electroencephalogram signals provided by the invention realizes hands-free operation of instructions in a time close to that of manual operation, improving operation efficiency; meanwhile, with 12 brain-controlled instructions, the physical space those instructions would otherwise occupy is saved and made available for other necessary uses.

Description

Man-machine brain control interaction system based on non-invasive brain electrical signals
Technical Field
The invention belongs to the technical field of brain-computer interfaces, and particularly relates to a man-machine brain-control interaction system based on non-invasive electroencephalogram signals.
Background
The cockpit control panel of a manned aircraft is equipped with many buttons, which the pilot must operate manually while performing a flight mission. First, during manual operation of a button, a hand must leave the control lever and the eyes must locate the button and monitor the hand movement. Second, buttons that are used infrequently or are relatively unimportant could be hidden when not in use, avoiding misoperation. Third, if instructions that are used infrequently or are relatively unimportant are presented on a screen display instead, physical space in the cockpit can be saved.
Currently, related work exists in the literature (Kryger M, Wester B, Pohlmeyer E A, et al. Flight simulation using a Brain-Computer Interface: A pilot, pilot study [J]. Experimental Neurology, 2017, 287: 473-478). However, that work uses invasive cortical signals for control, which carries certain risks and is difficult to generalize.
Disclosure of Invention
The invention aims to provide a man-machine brain-control interaction system based on non-invasive electroencephalogram signals, which realizes hands-free operation by the pilot of 12 instructions in the cockpit and saves physical space.
A man-machine brain-control interaction system based on non-invasive electroencephalogram signals comprises an electroencephalogram signal acquisition module, a brain control mode selection module, a first visual stimulus display module, a second visual stimulus display module, a first electroencephalogram signal analysis module, a second electroencephalogram signal analysis module, a first control end, a second control end, a first execution end and a second execution end.
The electroencephalogram signal acquisition module comprises an electroencephalogram amplifier, an electroencephalogram electrode, an electroencephalogram cap and electroencephalogram acquisition software, and is used for acquiring electroencephalogram signals.
The brain control mode selection module realizes selection among three brain control modes, namely a normal-flight mode in which brain control is not used, a first brain control mode with command output, and a second brain control mode with command output.
The first visual stimulus display module and the second visual stimulus display module respectively realize visual stimulus display of a first brain control mode and a second brain control mode.
The first electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the first visual stimulation display module.
The second electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the second visual stimulus display module.
The first control end and the second control end respectively receive the output of the first electroencephalogram signal analysis module and the output of the second electroencephalogram signal analysis module.
The first executing end and the second executing end respectively receive control signals of the first control end and the second control end and execute corresponding tasks.
The invention has the beneficial effects that: the brain-control interaction interface based on electroencephalogram signals provided by the invention realizes hands-free operation of instructions in a time close to that of manual operation, improving operation efficiency; meanwhile, with 12 brain-controlled instructions, the physical space those instructions would otherwise occupy is saved and made available for other necessary uses. The left hand and the right hand control the two brain control modes respectively, increasing the number of brain-controlled output commands; and a head-up display system applied to the windshield of the manned aircraft displays the visual stimulus interface.
Drawings
FIG. 1 is a block diagram of a system of the present invention.
Fig. 2 shows channels (black marks) used for electroencephalogram signal acquisition.
Fig. 3 is a first visual stimulus display module.
Fig. 4 is a second visual stimulus display module.
Detailed Description
The present invention will be described more fully hereinafter in order to facilitate an understanding of the present invention. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
The system comprises an electroencephalogram signal acquisition module, a brain control mode selection module, a first visual stimulation display module, a second visual stimulation display module, a first electroencephalogram signal analysis module, a second electroencephalogram signal analysis module, a first control end, a second control end, a first execution end and a second execution end; the system block diagram is shown in fig. 1.
The electroencephalogram signal acquisition module is used for acquiring electroencephalogram signals. The brain control mode selection module realizes selection among three brain control modes, namely a normal-flight mode in which brain control is not used, a first brain control mode with command output, and a second brain control mode with command output. The first visual stimulus display module and the second visual stimulus display module realize the visual stimulus display of the first brain control mode and the second brain control mode respectively. The first electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the first visual stimulus display module. The second electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the second visual stimulus display module. The first control end receives the output of the first electroencephalogram signal analysis module, and the second control end receives the output of the second electroencephalogram signal analysis module. The first execution end receives the control signal of the first control end and executes the corresponding task, and the second execution end receives the control signal of the second control end and executes the corresponding task.
The electroencephalogram signal acquisition module comprises an electroencephalogram amplifier, electroencephalogram electrodes, an electroencephalogram cap and electroencephalogram signal acquisition software; the electroencephalogram sampling frequency is fs = 1000 Hz. In this embodiment, 11 channels (Pz, P1, P2, P3, P4, POz, PO3, PO4, Oz, O1, O2) are used, distributed mainly in and near the visual cortex area of the brain for identifying steady-state visual evoked potentials; the specific channels are shown in fig. 2.
The brain control mode selection module consists of 2 sub-buttons located at the positions of the two hands on the pilot's operating lever, so that each can be operated directly without the hands leaving the lever. The left hand controls the first sub-button and the right hand controls the second sub-button. Opening the first sub-button (left hand) while the second remains closed outputs the command 'L', and the system switches to the first brain control mode; closing the first sub-button and opening the second (right hand) outputs the command 'R', and the system switches to the second brain control mode; with both sub-buttons closed, the command 'N' is output and the system switches to the normal-flight mode in which brain control is not used; opening both sub-buttons simultaneously is a misoperation, and the system still switches to the normal flight mode. The two brain control modes are mutually exclusive and cannot be active at the same time.
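The button-to-mode mapping described above reduces to a small truth table. The following is an illustrative sketch only; `select_mode` is a hypothetical helper name, not part of the patent.

```python
def select_mode(left_on: bool, right_on: bool) -> str:
    """Return the mode command for the current sub-button states.

    Mapping follows the description: left only -> 'L', right only -> 'R',
    both off -> 'N' (normal flight), both on -> misoperation, also 'N'.
    """
    if left_on and not right_on:
        return "L"   # first brain control mode
    if right_on and not left_on:
        return "R"   # second brain control mode
    return "N"       # normal flight: both off, or both on (misoperation)
```

The both-on case mapping to 'N' is what keeps the two brain control modes mutually exclusive.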
The first visual stimulus display module corresponds to the first brain control mode, and its stimulus interface is shown in fig. 3. Six blocks flash at frequencies and phases f_k/Φ_k (k = 1, 2, ..., 6), corresponding to control commands 1-6. The head-up display of the first visual stimulus module is located at the upper left of the cockpit windshield and corresponds to the command 'L'.
The second visual stimulus display module corresponds to the second brain control mode, and its stimulus interface is shown in fig. 4. Six blocks flash at frequencies and phases f_k/Φ_k (k = 7, 8, ..., 12), corresponding to control commands 7-12. The head-up display of the second visual stimulus module is located at the upper right of the cockpit windshield and corresponds to the command 'R'.
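For illustration, a frequency/phase-coded flicker of the kind that drives both stimulus modules can be sketched as a sinusoidal luminance modulation. The numeric frequencies and phases below are assumptions chosen for the example; the patent does not list values for f_k and Φ_k.

```python
import math

def flicker_luminance(f_k: float, phi_k: float, t: float) -> float:
    """Luminance in [0, 1] of one stimulus block at time t (seconds),
    flickering sinusoidally at f_k Hz with phase phi_k radians."""
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * f_k * t + phi_k))

# Hypothetical frequency/phase assignment for six blocks (not from the patent).
freqs = [8.0, 8.6, 9.2, 9.8, 10.4, 11.0]        # Hz
phases = [k * 0.5 * math.pi for k in range(6)]  # rad
# Render one frame of all six blocks at t = 0.
frame = [flicker_luminance(f, p, 0.0) for f, p in zip(freqs, phases)]
```

In practice each block's luminance would be re-evaluated at the display refresh rate.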
The first electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the first visual stimulus display module to obtain the intention of the person. The electroencephalogram signal corresponding to the acquisition time length T can be expressed as X ∈ R^(N×L), where N = T × fs represents the number of sampling points in the time dimension and L = 11 represents the number of channels. The features of channel i are then extracted, which can be expressed as
Ω_i = [Υ_1, Υ_2, ..., Υ_p, ..., Υ_(P-1), Υ_P]
where P = 6, f_n = Δ × 0.3, f_w = Δ × 0.7, S(f_a, f_b) represents the average power density of the electroencephalogram signal x over the frequency band [f_a, f_b], and Δ = min(diff(sort([f_1, f_2, ..., f_k, ..., f_(K-1), f_K]))) represents the minimum interval between adjacent values of the sequentially arranged f_k (k = 1, 2, ..., 6); sort() represents a sorting function and diff() represents a difference function. The feature values of all channels can be expressed as:
Θ = [Ω_1, Ω_2, ..., Ω_l, ..., Ω_(L-1), Ω_L]
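A sketch of the per-channel feature computation. It assumes Υ_p is the average power density in the band [f_p - f_n, f_p + f_w] around the p-th stimulation frequency; the source defines f_n, f_w, S and Δ but the exact Υ_p formula is not recoverable from the text, so that interpretation, and the helper names `band_power` and `channel_features`, are assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power(x: np.ndarray, fs: float, f_a: float, f_b: float) -> float:
    """Average power spectral density of x over the band [f_a, f_b],
    i.e. the S(f_a, f_b) of the description, estimated with Welch's method."""
    f, pxx = welch(x, fs=fs, nperseg=len(x))
    mask = (f >= f_a) & (f <= f_b)
    return float(pxx[mask].mean())

def channel_features(x_i: np.ndarray, fs: float, stim_freqs) -> np.ndarray:
    """Feature vector Omega_i for one channel x_i.

    Delta is the minimum gap between the sorted stimulation frequencies,
    with f_n = 0.3*Delta and f_w = 0.7*Delta as in the description.
    Treating Upsilon_p as the band power over [f_p - f_n, f_p + f_w]
    is an assumption, since the source gives the formula only as an image."""
    delta = float(np.min(np.diff(np.sort(stim_freqs))))
    f_n, f_w = 0.3 * delta, 0.7 * delta
    return np.array([band_power(x_i, fs, f - f_n, f + f_w) for f in stim_freqs])

# One synthetic channel: T = 2 s at fs = 1000 Hz, dominated by a 9.2 Hz target.
fs, T = 1000.0, 2.0
t = np.arange(int(T * fs)) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 9.2 * t) + 0.1 * rng.standard_normal(t.size)
feats = channel_features(x, fs, [8.0, 8.6, 9.2, 9.8, 10.4, 11.0])
```

With the dominant component at 9.2 Hz, the third feature (index 2) comes out largest, which is exactly what the downstream classifier relies on.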
In this embodiment, SVM classifiers are established with a one-versus-rest strategy, and 6 binary classifiers are established for the 6 commands. For the k-th classifier, M_k represents the number of support vectors, g_(k,1) the kernel function, and b_(k,1) the bias; the decision value y_k is computed from the feature vector Θ, the support vectors and their weights, the kernel function and the bias. After the classifiers are established, the threshold Thr_k of each binary classifier is determined from its ROC (Receiver Operating Characteristic) curve. Finally, the final score of each binary classifier can be expressed as:
Y_k = y_k - Thr_k
The class corresponding to the largest score is the final identification command of the system:
C = argmax_k Y_k
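The one-versus-rest decision scheme (one binary SVM per command, a per-classifier ROC-derived threshold Thr_k, and an argmax over the thresholded scores) can be sketched on synthetic data. Choosing Thr_k at the Youden's J point of the ROC curve is an assumption; the patent only states that the threshold is determined from the ROC curve.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_curve

# Synthetic 6-class data standing in for the EEG feature vectors
# (one well-separated cluster per command; not the patent's data).
rng = np.random.default_rng(0)
K, n_per, dim = 6, 40, 6
centers = np.eye(K) * 4.0
X = np.vstack([c + rng.standard_normal((n_per, dim)) for c in centers])
labels = np.repeat(np.arange(K), n_per)

clfs, thrs = [], []
for k in range(K):
    y_bin = (labels == k).astype(int)          # one-vs-rest relabeling
    clf = SVC(kernel="rbf", gamma="scale").fit(X, y_bin)
    score = clf.decision_function(X)           # training scores, for the sketch
    fpr, tpr, thr = roc_curve(y_bin, score)
    thrs.append(thr[np.argmax(tpr - fpr)])     # Youden's J point (an assumption)
    clfs.append(clf)

def classify(x: np.ndarray) -> int:
    """Final command: the class whose thresholded score Y_k = y_k - Thr_k is largest."""
    scores = [clf.decision_function(x[None, :])[0] - t for clf, t in zip(clfs, thrs)]
    return int(np.argmax(scores))

pred = classify(centers[2])                    # query a point at cluster 2's center
```

A real system would pick the threshold on held-out calibration trials rather than on the training scores used in this sketch.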
The second electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the second visual stimulus display module to obtain the intention of the person. The electroencephalogram signal corresponding to the acquisition time length T can be expressed as X ∈ R^(N×L), where N = T × fs represents the number of sampling points in the time dimension and L = 11 represents the number of channels. The features of channel i are then extracted, which can be expressed as:
Ψ_i = [Υ_7, Υ_8, ..., Υ_q, ..., Υ_(Q-1), Υ_Q]
where Q = 12, f_n = Δ × 0.3, f_w = Δ × 0.7, S(f_a, f_b) represents the average power density of the electroencephalogram signal x over the frequency band [f_a, f_b], and Δ = min(diff(sort([f_1, f_2, ..., f_k, ..., f_(K-1), f_K]))) represents the minimum interval between adjacent values of the sequentially arranged f_k (k = 7, 8, ..., 12); sort() represents a sorting function and diff() represents a difference function. The feature values of all channels can be expressed as:
Γ = [Ψ_1, Ψ_2, ..., Ψ_l, ..., Ψ_(L-1), Ψ_L]
In this embodiment, SVM classifiers are established with a one-versus-rest strategy, and 6 binary classifiers are established for the 6 commands (commands 7-12). For the k-th classifier, N_k represents the number of support vectors, g_(k,2) the kernel function, and b_(k,2) the bias; the decision value z_k is computed from the feature vector Γ, the support vectors and their weights, the kernel function and the bias. After the classifiers are established, the threshold Thr_k of each binary classifier is determined from its ROC (Receiver Operating Characteristic) curve. Finally, the final score of each binary classifier can be expressed as:
Z_k = z_k - Thr_k
The class corresponding to the largest score is the final recognition command:
C = argmax_k Z_k
the first control end receives the output command of the first electroencephalogram signal analysis module, and the second control end receives the output command of the second electroencephalogram signal analysis module. The first execution end receives the control signal of the first control end and executes the corresponding task, and the second execution end receives the control signal of the second control end and executes the corresponding task.
The working process is as follows. During flight, when the pilot does not need the brain-control interaction system to output a preset brain-control command, he performs the flight mission normally, and the two sub-buttons of the brain control mode selection module remain off. When the pilot needs to output a control command of the first brain control mode, he opens the first sub-button with the left hand while the second sub-button stays closed; the first visual stimulus display module starts, the blocks corresponding to commands 1-6 begin to flash, and the pilot gazes at the block corresponding to the desired command for a duration T. The first electroencephalogram signal analysis module analyzes the electroencephalogram signals acquired within time T and outputs the instruction; the first control end receives the output of the first electroencephalogram signal analysis module, the first execution end receives the control signal of the first control end and executes the corresponding task, and finally the first sub-button is closed. When the pilot needs to output a control command of the second brain control mode, he keeps the first sub-button closed and opens the second sub-button with the right hand; the second visual stimulus display module starts, the blocks corresponding to commands 7-12 begin to flash, and the pilot gazes at the block corresponding to the desired command for a duration T. The second electroencephalogram signal analysis module analyzes the electroencephalogram signals acquired within time T and outputs the instruction; the second control end receives the output of the second electroencephalogram signal analysis module, the second execution end receives the control signal of the second control end and executes the corresponding task, and finally the second sub-button is closed.
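The working process above reduces to a single interaction cycle, sketched here with placeholder callables; `read_buttons`, `acquire_eeg`, `analyze` and `execute` are hypothetical names standing in for the system's modules, not identifiers from the patent.

```python
def interaction_cycle(read_buttons, acquire_eeg, analyze, execute, T=2.0):
    """One brain-control interaction cycle: button state -> mode -> gaze
    period of duration T -> analysis -> execution. Returns the decoded
    command, or None when the system stays in normal flight."""
    left, right = read_buttons()
    if left == right:                  # both off, or both on (misoperation):
        return None                    # normal flight, no brain-control output
    mode = "L" if left else "R"        # which brain control mode is active
    eeg = acquire_eeg(T)               # EEG acquired during the gaze period
    command = analyze(mode, eeg)       # first or second analysis module
    execute(command)                   # corresponding execution end
    return command

# Demo with stubs: left button on, the analysis stub "decodes" command 3.
demo = interaction_cycle(
    read_buttons=lambda: (True, False),
    acquire_eeg=lambda T: "eeg-epoch",
    analyze=lambda mode, eeg: 3,
    execute=lambda cmd: None,
)
```

Passing the modules in as callables keeps the cycle testable without real hardware.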
The above examples illustrate only a few embodiments of the invention, which are described in detail and are not to be construed as limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.

Claims (8)

1. A man-machine brain control interaction system based on non-invasive electroencephalogram signals, characterized by comprising an electroencephalogram signal acquisition module, a brain control mode selection module, a first visual stimulation display module, a second visual stimulation display module, a first electroencephalogram signal analysis module, a second electroencephalogram signal analysis module, a first control end, a second control end, a first execution end and a second execution end;
the electroencephalogram signal acquisition module transmits the acquired electroencephalogram signal to the brain control mode selection module; the brain control mode selection module realizes selection among three brain control modes, namely a normal-flight mode in which brain control is not used, a first brain control mode with command output, and a second brain control mode with command output;
the first visual stimulus display module and the second visual stimulus display module respectively realize visual stimulus display of a first brain control mode and a second brain control mode;
the first control end receives the output of the first electroencephalogram signal analysis module, and the second control end receives the output of the second electroencephalogram signal analysis module;
the first execution end receives the control signal of the first control end and executes a corresponding task, and the second execution end receives the control signal of the second control end and executes a corresponding task;
the first electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the first visual stimulation display module to obtain the intention of a person; the electroencephalogram signal corresponding to the acquisition time length T can be expressed as X ∈ R^(N×L), where N = T × fs represents the number of sampling points in the time dimension and L = 11 represents the number of channels; the features of channel i are then extracted, which can be expressed as:
Ω_i = [Υ_1, Υ_2, ..., Υ_p, ..., Υ_(P-1), Υ_P]
where P = 6, f_n = Δ × 0.3, f_w = Δ × 0.7, S(f_a, f_b) represents the average power density of the electroencephalogram signal x over the frequency band [f_a, f_b], Δ = min(diff(sort([f_1, f_2, ..., f_k, ..., f_(K-1), f_K]))) represents the minimum interval between adjacent values of the sequentially arranged f_k (k = 1, 2, ..., 6), sort() represents a sorting function, and diff() represents a difference function;
the second electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the second visual stimulus display module to obtain the intention of a person; the electroencephalogram signal corresponding to the acquisition time length T can be expressed as X ∈ R^(N×L), where N = T × fs represents the number of sampling points in the time dimension and L = 11 represents the number of channels; the features of channel i are then extracted, which can be expressed as:
Ψ_i = [Υ_7, Υ_8, ..., Υ_q, ..., Υ_(Q-1), Υ_Q]
where Q = 12, f_n = Δ × 0.3, f_w = Δ × 0.7, S(f_a, f_b) represents the average power density of the electroencephalogram signal x over the frequency band [f_a, f_b], Δ = min(diff(sort([f_1, f_2, ..., f_k, ..., f_(K-1), f_K]))) represents the minimum interval between adjacent values of the sequentially arranged f_k (k = 7, 8, ..., 12), sort() represents a sorting function, and diff() represents a difference function.
2. The man-machine brain control interaction system based on non-invasive electroencephalogram signals according to claim 1, characterized in that the electroencephalogram signal acquisition module comprises an electroencephalogram amplifier, electroencephalogram electrodes, an electroencephalogram cap and electroencephalogram acquisition software, and is used for acquiring electroencephalogram signals.
3. The man-machine brain control interaction system based on non-invasive electroencephalogram signals according to claim 1, characterized in that the brain control mode selection module realizes selection among three brain control modes, namely a normal-flight mode in which brain control is not used, a first brain control mode with command output, and a second brain control mode with command output.
4. The man-machine brain control interaction system based on non-invasive electroencephalogram signals according to claim 1, characterized in that the first visual stimulation display module and the second visual stimulation display module realize the visual stimulation display of the first brain control mode and the second brain control mode respectively.
5. The man-machine brain control interaction system based on non-invasive electroencephalogram signals according to claim 1, characterized in that the first electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the first visual stimulation display module.
6. The man-machine brain control interaction system based on non-invasive electroencephalogram signals according to claim 1, characterized in that the second electroencephalogram signal analysis module analyzes the electroencephalogram signal induced by the second visual stimulation display module.
7. The man-machine brain control interaction system based on non-invasive electroencephalogram signals according to claim 1, characterized in that the first control end and the second control end receive the outputs of the first electroencephalogram signal analysis module and the second electroencephalogram signal analysis module respectively.
8. The man-machine brain control interaction system based on non-invasive electroencephalogram signals according to claim 1, characterized in that the first execution end and the second execution end receive the control signals of the first control end and the second control end respectively, and execute the corresponding tasks.
CN202111385929.7A 2021-11-22 2021-11-22 Man-machine brain control interaction system based on non-invasive brain electrical signals Active CN114237385B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111385929.7A CN114237385B (en) 2021-11-22 2021-11-22 Man-machine brain control interaction system based on non-invasive brain electrical signals


Publications (2)

Publication Number Publication Date
CN114237385A (en) 2022-03-25
CN114237385B (en) 2024-01-16

Family

ID=80750318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111385929.7A Active CN114237385B (en) 2021-11-22 2021-11-22 Man-machine brain control interaction system based on non-invasive brain electrical signals

Country Status (1)

Country Link
CN (1) CN114237385B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105528072A (en) * 2015-12-02 2016-04-27 天津大学 Brain-computer interface speller by utilization of dynamic stop strategy
WO2018094720A1 (en) * 2016-11-24 2018-05-31 浙江大学 Clinical electroencephalogram signal-based brain-machine interface system for controlling robotic hand movement and application thereof
CN108415554A (en) * 2018-01-18 2018-08-17 大连理工大学 A kind of brain man-controlled mobile robot system and its implementation based on P300
CN109471530A (en) * 2018-10-22 2019-03-15 吉林大学 Brain control input method based on Steady State Visual Evoked Potential and Mental imagery
CN109656356A (en) * 2018-11-13 2019-04-19 天津大学 A kind of asynchronous control system of SSVEP brain-computer interface
CN113180992A (en) * 2021-03-03 2021-07-30 浙江工业大学 Upper limb rehabilitation exoskeleton closed-loop control system and method based on electroencephalogram interaction and myoelectricity detection
CN113625769A (en) * 2021-09-07 2021-11-09 中国人民解放军军事科学院军事医学研究院 Unmanned aerial vehicle formation multi-mode control system based on electroencephalogram signals
CN113625749A (en) * 2021-07-30 2021-11-09 中国人民解放军军事科学院军事医学研究院 Brain-controlled unmanned aerial vehicle formation control method based on steady-state visual evoked potential

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013142051A1 (en) * 2012-03-19 2013-09-26 University Of Florida Research Foundation, Inc. Methods and systems for brain function analysis
US10945864B2 (en) * 2016-08-17 2021-03-16 Teledyne Scientific & Imaging, Llc System and method for noninvasive identification of cognitive and behavioral goals
US10795441B2 (en) * 2017-10-23 2020-10-06 Korea University Research And Business Foundation Method of recognizing user intention by estimating brain signals, and brain-computer interface apparatus based on head mounted display implementing the method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on neurofeedback control of an intelligent car based on electroencephalogram signals; Li Song; Xiong Xin; Fu Yunfa; Journal of Biomedical Engineering (Issue 01), pp. 21-30 *

Also Published As

Publication number Publication date
CN114237385A (en) 2022-03-25

Similar Documents

Publication Publication Date Title
US10345806B2 (en) Autonomous driving system and method for same
DE102015219111B4 (en) Driver assistance system and method for adjusting the lighting of control elements in motor vehicles
CN105625857B (en) Control device and method for window glass and vehicle
CN112051780B (en) Brain-computer interface-based mobile robot formation control system and method
DE102017216837A1 (en) Gesture and facial expression control for a vehicle
CN112114670B (en) Man-machine co-driving system based on hybrid brain-computer interface and control method thereof
CN114237385B (en) Man-machine brain control interaction system based on non-invasive brain electrical signals
CN106362287A (en) Novel MI-SSSEP mixed brain-computer interface method and system thereof
CN106369737A (en) air conditioner control processing method and device
CN103995582A (en) Brain-computer interface character input method and system based on steady-state visual evoked potential (SSVEP)
CN106371451A (en) Unmanned aerial vehicle manipulation method and device based on steady state visual evoked potential
CN110123266B (en) Maneuvering decision modeling method based on multi-modal physiological information
US11281294B2 (en) Smart control device for determining user's intention from color stimulus based on brain-computer interface and control method thereof
WO2018234147A1 (en) Method for operating a display device for a motor vehicle and motor vehicle
CN113778113B (en) Pilot auxiliary driving method and pilot auxiliary driving system based on multi-mode physiological signals
CN113625769B (en) Unmanned aerial vehicle formation multi-mode control system based on electroencephalogram signals
CN106873777A (en) Mobile phone operation method and system based on brain electric control
CN115454238A (en) Human-vehicle interaction control method and device based on SSVEP-MI fusion and automobile
CN108319367B (en) Brain-computer interface method based on motion initiation evoked potential
CN107168313A (en) Control the method and device of vehicle drive
CN207657609U (en) Slip control switch, slip control switch module, car door and vehicle
CN105511622A (en) Thresholdless brain switch method based on P300 electroencephalogram mode
CN109814720A (en) A kind of brain control method and system of equipment
CN114385099B (en) Multi-unmanned aerial vehicle dynamic monitoring interface display method and device based on active push display
CN206856597U (en) Vehicle displays control system and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant