CN114185436A - Navigation system and device based on visual evoked potential brain-computer interface - Google Patents
Navigation system and device based on visual evoked potential brain-computer interface
- Publication number
- CN114185436A (application CN202111527572.1A)
- Authority
- CN
- China
- Prior art keywords
- information
- electroencephalogram
- visual
- instructions
- navigation system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Dermatology (AREA)
- General Health & Medical Sciences (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
- Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
Abstract
The invention discloses a navigation system and device based on a visual evoked potential brain-computer interface. The navigation system comprises: a light stimulus source for providing a visual signal containing pairing information; an electroencephalogram acquisition device at least used for acquiring electroencephalogram information of a target, the electroencephalogram information being generated at least when the target captures the visual signal; a multimedia system for broadcasting, according to a control instruction, multimedia information matched with that control instruction, the multimedia information comprising video information or audio information; a control device for analyzing the electroencephalogram information and matching a corresponding control instruction according to it, the control instruction at least comprising a pairing instruction and/or a behavior instruction, the pairing instruction being verification information adapted to the pairing information, and the behavior instruction at least comprising any one of the following: play, pause, speed change and stop; and a storage device for storing the multimedia information provided with a matching verification key, the verification key being matched with the verification information. The scheme of the invention is simple to operate and offers high human-computer interaction efficiency and good accuracy.
Description
Technical Field
The present invention relates to an information interaction device, and more particularly, to a navigation system and device based on a visual evoked potential brain-computer interface.
Background
The brain-computer interface is a communication technique that has made remarkable progress in brain science research in recent years. By encoding and decoding behavioral intent from brain activity, a brain-computer interface can establish direct communication and control channels between the brain and external devices. According to the signal acquisition mode, brain-computer interfaces can be divided into invasive and non-invasive types; non-invasive interfaces are more widely applied, but because their signal-to-noise ratio is low, more work is required on electroencephalogram encoding and decoding.
The information disclosed in this background section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosure of Invention
The invention aims to provide a navigation system and a navigation device based on a visual evoked potential brain-computer interface that are simple to operate, efficient in human-computer interaction, accurate, and deliver an excellent human-computer interaction experience.
To achieve the above object, an embodiment of the present invention provides a navigation system based on a visual evoked potential brain-computer interface, comprising: a light stimulus source for providing a visual signal containing pairing information; an electroencephalogram acquisition device at least used for acquiring electroencephalogram information of a target, the electroencephalogram information being generated at least when the target captures the visual signal; a multimedia system for broadcasting, according to a control instruction, multimedia information matched with that control instruction, the multimedia information comprising video information or audio information; a control device for analyzing the electroencephalogram information and matching a corresponding control instruction according to it, the control instruction at least comprising a pairing instruction and/or a behavior instruction, the pairing instruction being verification information adapted to the pairing information, and the behavior instruction at least comprising any one of the following: play, pause, speed change and stop; and a storage device for storing the multimedia information provided with a matching verification key, the verification key being matched with the verification information.
In one or more embodiments of the invention, the visual signal comprises image information or video information or strobe-encoded information.
In one or more embodiments of the invention, the pairing information in the visual signal comprises a pattern or a combination of colors or a frequency coding with pairing features.
In one or more embodiments of the invention, the multimedia system includes a playback system that includes a display interface and/or a sound player to broadcast video information or audio information.
In one or more embodiments of the invention, the multimedia system further comprises a memory at least for storing the multimedia information and/or for invoking the multimedia information in accordance with the control instructions.
In one or more embodiments of the invention, the electroencephalogram acquisition device at least comprises a plurality of first electrodes, and the first electrodes are at least used for acquiring electroencephalogram information of a target.
In one or more embodiments of the invention, the electroencephalogram information acquired by the first electrode is visual cortical electroencephalogram information.
In one or more embodiments of the invention, the first electrode is a dry electrode, a wet electrode or a gel electrode. Multiple electrodes of the same type can be arranged on the same module, such as an electroencephalogram cap, or multiple types of electrodes can be arranged at the same time. Each electrode may be a single-claw electrode with only one sensing head, or a multi-claw electrode with several sensing heads. When several types of electrodes are arranged, the different types can be assigned to different electroencephalogram acquisition areas, for example a sponge dry electrode over the visual cortex area and a gel electrode over the auditory cortex area; of course, multiple types of electrodes can also be arranged in the same acquisition area, for example a rubber dry electrode and a wet electrode both over the visual cortex area.
In one or more embodiments of the present invention, the navigation device based on the visual evoked potential brain-computer interface comprises the navigation system as described above.
In one or more embodiments of the invention, the navigation device based on the visual evoked potential brain-computer interface comprises a host, a display and an electroencephalogram acquisition device, wherein the display is in communication connection with the host to receive and broadcast the visual signal and/or the multimedia information, and the electroencephalogram acquisition device is in communication connection with the host to acquire the electroencephalogram information of the target and send it to the host.
Compared with the prior art, the navigation system based on the visual evoked potential brain-computer interface provided by the embodiments of the invention applies visual evoked potential technology to guidance activities, establishing a set of equipment usable for guidance by improving the form and function of existing devices. For convenience of description, the following specific scenario takes museum guidance as an example: a museum exhibit interpreter based on visual evoked potentials is designed so that, simply by gazing at the light stimulus source at the edge of an exhibit, a visitor can trigger the audio or video explanation corresponding to that exhibit, providing an accurate "what you see is what you get" interpretation service. The invention is not limited to museum guidance, however, and is also suitable for guidance activities in any scene, including but not limited to science and technology museums, art museums, exhibition halls, tourist attractions and the like.
Drawings
FIG. 1 is a schematic process diagram of a navigation system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a configuration of an optical stimulus source according to an embodiment of the present invention;
FIG. 3 is a schematic representation of the operation of a navigation system according to an embodiment of the present invention;
FIG. 4 is a flow diagram of a navigation system according to an embodiment of the present invention.
Detailed Description
The following detailed description of the present invention is provided in conjunction with the accompanying drawings, but it should be understood that the scope of the present invention is not limited to the specific embodiments.
Throughout the specification and claims, unless explicitly stated otherwise, the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element or component but not the exclusion of any other element or component.
A visual evoked potential arises when visible light flickering at an artificially coded frequency stimulates the retina: an electroencephalogram signal following the coding frequency is generated in the occipital visual cortex. Taking the steady-state visual evoked potential (SSVEP) as an example, when the visual stimulus is a stable flicker in the range of about 3.7-75 Hz, EEG components at the gazed flicker frequency, or at integer multiples of it, can be detected over the visual cortex of the occipital lobe. VEP technology already enables high-speed text input to a brain-controlled computer through a coded on-screen keyboard, and is an important communication tool for paralyzed patients.
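Purely as an illustration of this principle (not part of the original disclosure), the following minimal Python sketch checks whether an occipital EEG window contains excess power at an assumed flicker frequency and its harmonics; the sampling rate, window length and synthetic test signal are assumptions.

```python
import numpy as np

def ssvep_power_ratio(eeg, fs, stim_freq, n_harmonics=2, band=0.2):
    """Ratio of spectral power at the stimulus frequency (and harmonics)
    to the mean power of neighbouring bins, for one EEG channel."""
    n = len(eeg)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2

    target, neighbour = 0.0, 0.0
    for h in range(1, n_harmonics + 1):
        f = h * stim_freq
        in_band = np.abs(freqs - f) <= band          # bins at the (harmonic) frequency
        near = (np.abs(freqs - f) <= 1.0) & ~in_band # surrounding baseline bins
        target += psd[in_band].mean()
        neighbour += psd[near].mean()
    return target / neighbour

# Synthetic example: a 12 Hz response sampled at 250 Hz for 2 s.
fs, stim = 250, 12.0
t = np.arange(0, 2, 1 / fs)
signal = 2e-6 * np.sin(2 * np.pi * stim * t) + 1e-6 * np.random.randn(t.size)
print(ssvep_power_ratio(signal, fs, stim))  # well above 1 when an SSVEP is present
```

When the gazed stimulus flickers at the tested frequency the ratio rises clearly above 1; for non-matching frequencies it stays near 1.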
As shown in FIGS. 1 to 4, in the navigation system based on the visual evoked potential brain-computer interface according to a preferred embodiment of the present invention, the light stimulus source provides a visual signal carrying pairing information. After a target, such as a person, receives the visual signal visually, the visual cortex produces a corresponding stress-induced physiological response, generating electroencephalogram information characteristically correlated with the pairing information. This electroencephalogram information is captured by the electroencephalogram acquisition device and transmitted to the controller; after a series of operations such as denoising and decoding, it is matched with the control instruction corresponding to the pairing information, the stored multimedia information is retrieved according to the control instruction, and operations such as playing are executed.
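As a purely illustrative sketch of this chain (the helper names, instruction structure and dispatch logic are assumptions introduced here, not the patented implementation), the decoded result can be represented as a pairing or behavior instruction and dispatched to the multimedia system:

```python
# Hypothetical glue for the decode -> instruction -> playback chain described above.
from dataclasses import dataclass
from typing import Callable, Optional
import numpy as np

@dataclass
class ControlInstruction:
    kind: str    # "pairing" or "behavior"
    value: str   # verification info (e.g. "exhibit_002") or "play" / "pause" / "speed" / "stop"

def run_once(
    acquire_epoch: Callable[[], np.ndarray],                      # EEG epoch, (channels, samples)
    decode: Callable[[np.ndarray], Optional[ControlInstruction]], # decoder output, or None
    fetch_media: Callable[[str], str],                            # verification key -> media path
    player: Callable[[str, str], None],                           # (action, media path)
) -> None:
    epoch = acquire_epoch()
    instruction = decode(epoch)
    if instruction is None:
        return                                                    # nothing reliably decoded
    if instruction.kind == "pairing":
        player("play", fetch_media(instruction.value))            # verification key selects the media
    else:
        player(instruction.value, "")                             # play / pause / speed / stop
```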
As one embodiment, the light stimulus presenting the coded information (i.e., the pairing information) may be a display: different types such as an LCD screen, an OLED screen or an electronic-ink screen can be selected as required, the display is controlled by a computer, and a pre-prepared, paired visual signal such as a video or a picture is played to the acquisition target. Alternatively, it may be a plain light source coded by a microcontroller, for example an LED lamp that plays the compiled visual signal as a continuous flicker.
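For the display-based variant, a minimal sketch is given below, assuming a 60 Hz monitor and the pygame library (both assumptions, not requirements of the invention); it renders a square-wave flicker at a chosen coding frequency by toggling the screen colour on a per-frame basis.

```python
import pygame

def flicker(stim_freq_hz=10.0, refresh_hz=60, duration_s=5.0):
    """Full-screen square-wave flicker at stim_freq_hz, locked to the display refresh."""
    pygame.init()
    screen = pygame.display.set_mode((400, 400))
    clock = pygame.time.Clock()
    frames_per_cycle = refresh_hz / stim_freq_hz       # e.g. 6 frames per 10 Hz cycle
    for frame in range(int(duration_s * refresh_hz)):
        pygame.event.pump()                            # keep the window responsive
        phase = (frame % frames_per_cycle) / frames_per_cycle
        colour = (255, 255, 255) if phase < 0.5 else (0, 0, 0)   # on / off halves of the cycle
        screen.fill(colour)
        pygame.display.flip()
        clock.tick(refresh_hz)                         # pace the loop at the refresh rate
    pygame.quit()

flicker()
```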
As one embodiment, the electroencephalogram acquisition device can comprise a first electrode, which is attached to the corresponding region of the scalp for electroencephalogram acquisition; the sensitive region of the first electrode is matched to the brain so as to acquire the electroencephalogram information of a target producing a visual stimulation response.
As one implementation, the electroencephalogram acquisition device may be an occipital visual-cortex electroencephalogram acquisition device, which mainly comprises: an electroencephalogram cap, of any form, covering the occipital lobe area; electroencephalogram electrodes correspondingly arranged on the cap, including but not limited to dry electrodes, wet electrodes and gel electrodes; a signal acquisition circuit whose main component is a high-precision analog-to-digital converter (ADC); and a power supply for powering the device.
As one implementation, the power supply of the electroencephalogram acquisition device may be an external mains supply or an external storage battery (the latter may take the form of a backup power supply that delivers power through a universal interface such as USB), or a rechargeable lithium battery arranged on the electroencephalogram cap.
As one embodiment, the electroencephalogram signal needs to be optimized and decoded to obtain the pairing information corresponding to the coded light stimulation (the visual signal). After the electroencephalogram signal is transmitted to a control device such as a host, preprocessing operations such as filtering and noise elimination are performed according to a preset program, and the corresponding pairing information is read out of the preprocessed signal through canonical correlation analysis (CCA).
The preprocessing of the electroencephalogram signal may generally use spatial filtering, a filter bank and the like; the decoding may use algorithms such as canonical correlation analysis (CCA), possibly combined with machine-learning techniques such as transfer learning. Of course, different processing pipelines place different demands on computing resources. For example, electroencephalogram decoding may run on the CPU of a desktop computer; for a portable device, a simplified algorithm can complete the analysis on a small chip such as a low-power SoC, FPGA or ASIC.
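By way of example only, a minimal decoding sketch is shown below, assuming a multi-channel occipital epoch, SciPy for band-pass filtering and scikit-learn for CCA; the sampling rate, passband and candidate frequencies are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.cross_decomposition import CCA

def bandpass(eeg, fs, lo=6.0, hi=40.0, order=4):
    """Band-pass filter an epoch of shape (channels, samples)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def reference_signals(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine references at the candidate frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.array(refs)                              # (2 * n_harmonics, samples)

def cca_score(eeg, refs):
    """Canonical correlation between the epoch and one set of references."""
    cca = CCA(n_components=1)
    u, v = cca.fit_transform(eeg.T, refs.T)            # both as (samples, features)
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

def classify(eeg, fs, candidate_freqs):
    """Return the candidate flicker frequency with the highest CCA score."""
    eeg = bandpass(eeg, fs)
    scores = {f: cca_score(eeg, reference_signals(f, fs, eeg.shape[1]))
              for f in candidate_freqs}
    return max(scores, key=scores.get), scores
```

The candidate frequency with the highest canonical correlation is taken as the flicker frequency the target was gazing at.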
As one embodiment, when the electroencephalogram acquisition device and the control device are separate, a communication device for transmitting data and instructions may of course also be provided to establish the communication connection between the acquisition device and the control device, or between the display and the control device; the communication device may be wired or wireless. For example, when a desktop computer is used to decode the electroencephalogram signals, the signals from the electroencephalogram cap need to be uploaded to the computer through a Wi-Fi local area network.
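One possible framing for such an upload is sketched below: each packet carries a 4-byte big-endian sample count followed by float32 samples, sent over an ordinary TCP socket on the Wi-Fi LAN. The framing is an assumption for illustration, not a protocol defined by the invention.

```python
import socket
import struct
import numpy as np

def send_epoch(sock: socket.socket, epoch: np.ndarray) -> None:
    """Send one EEG epoch of shape (channels, samples) as length-prefixed float32."""
    payload = epoch.astype(np.float32).tobytes()
    sock.sendall(struct.pack(">I", epoch.size) + payload)

def recv_epoch(sock: socket.socket, n_channels: int) -> np.ndarray:
    """Receive one length-prefixed epoch and restore the (channels, samples) shape."""
    header = sock.recv(4, socket.MSG_WAITALL)          # blocking read of the count field
    (count,) = struct.unpack(">I", header)
    data = sock.recv(4 * count, socket.MSG_WAITALL)
    samples = np.frombuffer(data, dtype=np.float32)
    return samples.reshape(n_channels, -1)
```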
As one implementation, when the electroencephalogram acquisition device and the control device are integrated, a processing circuit chip and the like can be built into the electroencephalogram cap; for example, a portable SoC is integrated into the acquisition device so that the computation is performed directly on it.
FIG. 1 illustrates the flow of generating, acquiring and analyzing the VEP signal in one embodiment of the present invention.
The invention is described below in a specific application:
For example, facing the large number of exhibits in a museum, an LED light stimulus source whose flicker frequency is controlled by a microcontroller can be designed, with a light stimulus source of a specific frequency paired with a specific exhibit. The light stimulus source may be structured as shown in FIG. 2, comprising a light-transmitting plate 10 for displaying a characteristic image 11 representing a specific exhibit, and a circuit board 30 provided with an LED lamp device 31, a control system 32, a power supply system 33 and a communication system 34. The control system 32, such as an MCU chip, can therefore be custom-programmed to generate the LED flicker frequency and other visual signals containing coded information.
Of course, the housing may also be provided with a wired communication interface 21 to allow connection to an external computer. The appearance of the light stimulus source can vary and is not limited here. Instead of an LED lamp, the carrier for displaying the visual signal may be a small LCD screen, an OLED screen, or any other display device capable of flickering at a controllable frequency.
The control system is generally implemented by a single-chip microcontroller, the main component being a microprocessor chip (MCU). Various microcontroller chips and programming methods are suitable for controlling the LED flicker; common development environments include 8051-family microcontrollers, Arduino-compatible boards, the Raspberry Pi and the like.
The power supply system can be powered by a DC power supply, or by a rechargeable or non-rechargeable battery.
To reduce energy consumption, the light stimulus source can be combined with an infrared sensor 41 so that the visual signal is broadcast, for example by emitting flicker, only when a person approaches, and the flicker brightness and mode can adapt to the ambient illumination.
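A minimal sketch of such a stimulus source is given below, assuming a Raspberry Pi (mentioned above as one possible controller), the RPi.GPIO library, an LED on one GPIO pin and an active-high infrared presence sensor on another; the pin numbers and the 12 Hz pairing frequency are assumptions. Precise flicker timing would normally use hardware PWM or a timer rather than sleep-based toggling.

```python
import time
import RPi.GPIO as GPIO

LED_PIN, IR_PIN = 18, 23            # hypothetical BCM pin assignments

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)
GPIO.setup(IR_PIN, GPIO.IN)

def flicker(freq_hz: float, duration_s: float) -> None:
    """Drive the LED as a square wave at the exhibit's pairing frequency."""
    half_period = 1.0 / (2.0 * freq_hz)
    t_end = time.time() + duration_s
    while time.time() < t_end:
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(half_period)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(half_period)

try:
    while True:
        if GPIO.input(IR_PIN):       # flicker only while a visitor is nearby
            flicker(freq_hz=12.0, duration_s=5.0)
        else:
            time.sleep(0.1)          # idle to save energy
finally:
    GPIO.cleanup()
```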
The electroencephalogram acquisition can adopt a portable scheme: the acquisition device can integrate components such as an audio digital-to-analog converter (DAC) chip, a memory and earphones, so that voice explanation of museum exhibits can be delivered as soon as the electroencephalogram acquisition is completed. One possible scheme is shown in FIG. 3, in which the electroencephalogram cap is connected to the earphones and the like.
A high-precision analog-to-digital converter can be used on the electroencephalogram cap as the working circuit connected to the first electrodes, satisfying the acquisition of the electroencephalogram signals. In addition, a higher-performance chip such as an SoC can be provided to handle the tasks of VEP signal analysis and audio decoding; the working process may be as shown in FIG. 4.
The analog audio signal output by the audio decoder drives the earphones connected to the module. The explanation audio for the various exhibits is stored in the memory on the module; each audio file is numbered in advance (the verification key) and paired with the flicker signal (the pairing information) of the light stimulus source corresponding to a specific museum exhibit. When a visitor wearing the module wants to hear the explanation of a particular exhibit, the visitor simply gazes at the light stimulus source flickering at the specific frequency beside that exhibit; the VEP decoding module then analyzes features such as the flicker frequency the visitor is watching, finds in the memory the audio or video whose number corresponds to that stimulation frequency, and plays it.
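A minimal sketch of this pairing lookup is shown below; the frequency table, file naming, storage path and use of the ALSA `aplay` command are all assumptions for illustration.

```python
import subprocess

PAIRING_TABLE = {        # flicker frequency (Hz) -> verification key (audio number)
    9.0: "exhibit_001",
    11.0: "exhibit_002",
    13.0: "exhibit_003",
}

def play_commentary(decoded_freq: float,
                    media_dir: str = "/media/exhibits",
                    tol: float = 0.5) -> None:
    """Map a decoded flicker frequency to its verification key and play the stored audio."""
    nearest = min(PAIRING_TABLE, key=lambda f: abs(f - decoded_freq))
    if abs(nearest - decoded_freq) > tol:
        return                                    # no exhibit paired with this frequency
    audio_path = f"{media_dir}/{PAIRING_TABLE[nearest]}.wav"
    subprocess.run(["aplay", audio_path], check=False)   # play through the wired headphones

play_commentary(11.0)
```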
The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. It is not intended to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable one skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims and their equivalents.
Claims (10)
1. A navigation system based on visual evoked potential brain-computer interface is characterized by comprising
A light stimulus source for providing a visual signal containing pairing information;
the electroencephalogram acquisition equipment is at least used for acquiring electroencephalogram information of a target, and the electroencephalogram information is at least generated when the target captures the visual signals;
the multimedia system is used for broadcasting multimedia information matched with the control instruction according to the control instruction, and the multimedia information comprises video information or audio information;
the control device is used for analyzing the electroencephalogram information and matching a corresponding control instruction according to the electroencephalogram information, wherein the control instruction at least comprises a pairing instruction and/or a behavior instruction, the pairing instruction is verification information adapted to the pairing information, and the behavior instruction at least comprises any one of the following: play, pause, speed change and stop;
the storage device stores multimedia information with a verification key set in a matching manner, and the verification key is matched with the verification information.
2. The visual evoked potential brain-computer interface based navigation system of claim 1, wherein said visual signal comprises image information or video information or strobe coded information.
3. A visual evoked potential brain-computer interface based navigation system as in claim 1 or 2, wherein the pairing information in said visual signal comprises a pattern or color combination or frequency coding with pairing features.
4. The visual evoked potential brain-computer interface based navigation system of claim 1, wherein said multimedia system includes a playback system including a display interface and/or a sound player to broadcast video information or audio information.
5. The visual evoked potential brain-computer interface based navigation system of claim 4, wherein said multimedia system further comprises a memory at least for storing said multimedia information and/or for invoking said multimedia information in accordance with control instructions.
6. The visual evoked potential brain-computer interface based navigation system of claim 1, wherein said brain electrical acquisition device comprises at least a number of first electrodes for at least acquiring brain electrical information of a target.
7. The visual evoked potential brain-computer interface based navigation system of claim 6, wherein the brain electrical information acquired by said first electrode is visual cortical brain electrical information.
8. The visual evoked potential brain-computer interface based navigation system of claim 6 or 7, wherein the first electrode is a dry electrode or a wet electrode or a gel electrode.
9. Navigation device based on a visual evoked potential brain-computer interface comprising a navigation system according to any one of claims 1-8.
10. The visual evoked potential brain-computer interface based navigation apparatus as set forth in claim 9, comprising
a host for storing data;
a display, in communication connection with the host to receive and broadcast the visual signal and/or the multimedia information; and
an electroencephalogram acquisition device, in communication connection with the host to acquire the electroencephalogram information of the target and send the electroencephalogram information to the host.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111527572.1A CN114185436A (en) | 2021-12-14 | 2021-12-14 | Navigation system and device based on visual evoked potential brain-computer interface |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111527572.1A CN114185436A (en) | 2021-12-14 | 2021-12-14 | Navigation system and device based on visual evoked potential brain-computer interface |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114185436A true CN114185436A (en) | 2022-03-15 |
Family
ID=80543758
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111527572.1A Pending CN114185436A (en) | 2021-12-14 | 2021-12-14 | Navigation system and device based on visual evoked potential brain-computer interface |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114185436A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101576772A (en) * | 2009-05-14 | 2009-11-11 | 天津工程师范学院 | Brain-computer interface system based on virtual instrument steady-state visual evoked potentials and control method thereof |
CN105929937A (en) * | 2016-03-11 | 2016-09-07 | 南京邮电大学 | Mobile phone music playing system based on steady-state visual evoked potential (SSVEP) |
CN106200400A (en) * | 2016-08-18 | 2016-12-07 | 南昌大学 | A kind of house control system based on brain electricity APP |
CN108260012A (en) * | 2018-03-14 | 2018-07-06 | 广东欧珀移动通信有限公司 | Electronic device, video playing control method and related product |
CN110286757A (en) * | 2019-06-14 | 2019-09-27 | 长春理工大学 | A kind of wearable brain machine interface system and control method based on mixed reality |
CN111743538A (en) * | 2020-07-06 | 2020-10-09 | 江苏集萃脑机融合智能技术研究所有限公司 | Brain-computer interface alarm method and system |
CN111973178A (en) * | 2020-08-14 | 2020-11-24 | 中国科学院上海微系统与信息技术研究所 | Electroencephalogram signal identification system and method |
CN113190122A (en) * | 2021-05-31 | 2021-07-30 | 江苏集萃脑机融合智能技术研究所有限公司 | Intelligent device and method based on brain signals, intelligent system and application |
CN113220122A (en) * | 2021-05-06 | 2021-08-06 | 西安慧脑智能科技有限公司 | Brain wave audio processing method, equipment and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Liu et al. | Implementation of SSVEP based BCI with Emotiv EPOC | |
KR102333704B1 (en) | Method for processing contents based on biosignals, and thereof device | |
CN106267514B (en) | Feeling control system based on brain electricity feedback | |
CN109964226A (en) | Electronic device and its control method | |
Kapeller et al. | A BCI using VEP for continuous control of a mobile robot | |
WO2004034870A2 (en) | Eeg system for time-scaling presentations | |
CN101464728A (en) | Human-machine interaction method with vision movement related neural signal as carrier | |
CN109637098A (en) | Wearable device and its control method | |
CN110221684A (en) | Apparatus control method, system, electronic device and computer readable storage medium | |
CN103169471A (en) | Portable electroencephalogram detection system | |
CN107390869A (en) | Efficient brain control Chinese character input method based on movement vision Evoked ptential | |
CN109199328A (en) | A kind of health robot control service system | |
CN116382481A (en) | Man-machine interaction glasses based on brain-computer interface technology | |
Hernandez et al. | Inside-out: Reflecting on your inner state | |
CN101339413B (en) | Switching control method based on brain electric activity human face recognition specific wave | |
Edlinger et al. | A hybrid brain-computer interface for improving the usability of a smart home control | |
CN209900391U (en) | Hypnosis system based on virtual reality | |
CN110688013A (en) | English keyboard spelling system and method based on SSVEP | |
Hsieh et al. | Home care by auditory Brain Computer Interface for the blind with severe physical disabilities | |
Mouli et al. | Eliciting higher SSVEP response from LED visual stimulus with varying luminosity levels | |
Lo et al. | Novel non-contact control system of electric bed for medical healthcare | |
CN113101021A (en) | Mechanical arm control method based on MI-SSVEP hybrid brain-computer interface | |
Trivedi et al. | Brainwave enabled multifunctional, communication, controlling and speech signal generating system | |
CN112783314B (en) | Brain-computer interface stimulation paradigm generating and detecting method, system, medium and terminal based on SSVEP | |
CN114185436A (en) | Navigation system and device based on visual evoked potential brain-computer interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20220315 |