CN114371784A - Brain-computer interface decoding method for steady-state visual evoked potential - Google Patents
- Publication number: CN114371784A
- Application number: CN202210046576.6A
- Authority: CN (China)
- Prior art keywords: data, extended, period, frequency, training
- Prior art date: 2022-01-14
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The invention discloses a brain-computer interface decoding method for steady-state visual evoked potentials (SSVEP), comprising the following steps: for an SSVEP data set, acquiring data of different stimulation durations through a sliding time window and band-pass filtering the data; and, exploiting the periodic-oscillation property of the data, designing a novel data expansion technique based on cyclic shifting and combining it with other pattern-recognition methods for EEG decoding. By expanding the sample data, the invention fully exploits the effective information in each sample, improves the robustness of the algorithm, shortens the stimulation time and the number of calibrations, reduces the visual fatigue of the subject, and improves the information transfer rate of the BCI system.
Description
Technical Field
The invention relates to the field of brain-computer interfaces, and in particular to a brain-computer interface decoding method for Steady-State Visual Evoked Potentials (SSVEP).
Background
A Brain-Computer Interface (BCI) is a system that directly converts central-nervous-system activity into artificial output; it can replace, repair, enhance, supplement, or improve the normal output of the central nervous system, thereby enabling direct interaction between the nervous system and external devices. BCI integrates multiple engineering technologies; it can provide a closed-loop interaction channel for people with limited motor ability and an alternative interaction method for healthy people. According to the signal-acquisition mode, BCIs are divided into invasive, partially invasive, and non-invasive systems. Owing to its safety and low cost, the non-invasive BCI is the most widely applied. A BCI system generally comprises signal acquisition, signal decoding, and output control, and ultimately realizes direct communication between the brain and external devices.
SSVEP is the electrical potential produced by the cerebral cortex in response to repetitive visual flicker stimulation above 6 Hz, consisting of oscillatory activity at the fundamental and harmonic frequencies of the stimulus. An SSVEP-based BCI associates flickering visual stimuli of different frequencies with specific commands, and the user selects an output command by fixating on a particular stimulus. Because SSVEP-based BCIs offer a high signal-to-noise ratio and a high information transfer rate, they have attracted wide attention and research, and many decoding methods for SSVEP have been developed. Algorithms based on Canonical Correlation Analysis (CCA) and Task-Related Component Analysis (TRCA) are commonly used in SSVEP-BCI decoding and have greatly improved the identification accuracy and information transfer rate of SSVEP-BCIs. Current SSVEP-based BCI technology has reached an information transfer rate of 376.58 bits/min, laying a solid foundation for translating BCI technology into practical applications.
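As background for the decoding methods mentioned above, a minimal CCA-based SSVEP classifier can be sketched as follows. This is an illustrative standard-literature sketch, not the patent's algorithm; the harmonic count, candidate frequencies, and sampling rate are assumptions:

```python
import numpy as np

def cca_corr(X, Y):
    """Largest canonical correlation between X (T x p) and Y (T x q)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    qx, _ = np.linalg.qr(X)
    qy, _ = np.linalg.qr(Y)
    # Canonical correlations are the singular values of Qx^T Qy.
    return np.linalg.svd(qx.T @ qy, compute_uv=False)[0]

def ssvep_cca_classify(eeg, freqs, fs, n_harmonics=3):
    """eeg: (T, channels). Returns the index of the candidate frequency
    whose sine/cosine reference set correlates best with the EEG."""
    T = eeg.shape[0]
    t = np.arange(T) / fs
    scores = []
    for f in freqs:
        ref = np.column_stack(
            [fn(2 * np.pi * h * f * t)
             for h in range(1, n_harmonics + 1)
             for fn in (np.sin, np.cos)])
        scores.append(cca_corr(eeg, ref))
    return int(np.argmax(scores))
```

For a 1 s trial at a 250 Hz sampling rate, a strong 10 Hz response is classified as the 10 Hz target among, e.g., candidates [8, 10, 12] Hz.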
However, existing decoding methods still suffer from overly long single-stimulation times and many training trials, which easily cause visual fatigue, hinder sustained attention and operation, and reduce the user-friendliness of the system. To increase the usability of SSVEP-BCIs, researchers have proposed the Joint Frequency-Phase Modulation (JFPM) method, which improves the stimulation paradigm. On the algorithmic side, the research trend for SSVEP-BCI decoding is to improve robustness under reduced calibration data and stimulation time. SSVEP exhibits periodic oscillation, which increases the usability and operability of the data.
Disclosure of Invention
The invention provides a brain-computer interface decoding method for steady-state visual evoked potentials. Exploiting the periodic-oscillation property of SSVEP, the invention proposes a Cyclic-Shift single-Trial sample (CST) algorithm. By expanding the sample data and fully exploiting the effective information in each sample, the algorithm improves robustness, shortens the stimulation time and the number of calibrations, reduces the subject's visual fatigue, and improves the information transfer rate of the BCI system. Details are described as follows:
a brain-computer interface decoding method oriented to steady-state visual evoked potentials, the method comprising:
acquiring a public reference data set, acquiring data of different stimulation durations from the reference data set through a sliding time window, and performing band-pass filtering on the data;
according to the periodic oscillation characteristic of the reference data set, a data expansion technology suitable for the reference data set is designed by taking cyclic displacement as a basic means, and electroencephalogram decoding is performed by adopting the data expansion technology;
the data expansion technique applied to the reference data set specifically includes:
1) circularly shifting the training data according to the labeled frequency to obtain extended training data; constructing an original training template and an extended training template from the original and extended training data, respectively;
2) constructing an extended-training spatial filter from the extended training data; circularly shifting the sample to be tested according to each potential target frequency to obtain extended test data;
3) constructing an extended test template and an extended test spatial filter from the extended test data; and performing joint template matching of correlation coefficients.
Wherein the training data are circularly shifted according to the labeled frequency to obtain the extended training data as follows:
the training data are divided into periods according to the currently assumed frequency, and the first period is split into parts 1A and 1B according to the length of the incomplete tail period; cyclic shifts are then performed sequentially in period order.
For the u-th cyclic shift, the original signal is moved from the low-order periods toward the high-order periods; the shift starts at the beginning of the (u+1)-th period, and the shift length is one period of sampling points. The incomplete period left vacant at the tail is filled with part 1B of the first period of the original signal; the overflowing high-order portion, from the second period to the end, wraps around; and finally part 1A of one period is appended to the end of the signal, generating a new signal of length N.
Wherein the extended training template is obtained by averaging the extended training samples:

$$\bar{X}'_k = \frac{1}{c_k q}\sum_{j=1}^{c_k q} X'^{(j)}_k$$

where q is the number of single-trial samples and $c_k$ is the sampling-point length of a complete period;
Further, the extended test template is

$$\bar{X}'_{(k)} = \frac{1}{c_k}\sum_{u=1}^{c_k} X'^{(u)}_{(k)}$$

where $X'^{(u)}_{(k)}$ is the u-th extended sample corresponding to the k-th frequency; task-related component analysis is performed on the extended test data of the k-th frequency to obtain the extended test spatial filter $w'_k$;
joint template matching is then performed on the data, and the correlation between the sample to be tested and the k-th candidate frequency is measured in different ways, yielding four coefficients $r_{k,1},\dots,r_{k,4}$. The final correlation coefficient $r_k$ is their weighted sum:

$$r_k = \sum_{i=1}^{4} a_i\, r_{k,i}$$

and the target frequency is detected as

$$\hat{k} = \arg\max_{k} r_k$$

where $a$ is the weight vector.
The technical scheme provided by the invention has the following beneficial effects:
1. By exploiting the periodic-oscillation property of SSVEP, the invention improves the robustness of the decoding algorithm under reduced calibration data and shortened stimulation time; it reduces the user's visual fatigue, increases the user-friendliness of the system, and promotes the translation of the technology into applications.
2. CST is a data expansion technique that multiplies the effective data through cyclic shifting and fully exploits the effective information in the data; it increases the correlation between the sample under test and the target frequency while decreasing its correlation with non-target frequencies.
3. The decoding algorithm demonstrates that, in SSVEP, the sample under test itself contains effective information that can be exploited to improve the performance of SSVEP-BCI systems; this offers an innovative and feasible research direction for improving SSVEP-BCI performance. Further research could yield a mature brain-computer interface system with considerable social and economic benefits.
Drawings
FIG. 1 is a schematic diagram of the steady-state visual evoked potential oriented brain-computer interface decoding;
the design includes acquisition of the published SSVEP benchmark data set, signal preprocessing, and target detection and identification.
FIG. 2 is a schematic diagram of the u-th CST;
FIG. 3 is a schematic diagram of SSVEP decoding based on cyclic-shift single-trial samples and task-related component analysis.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention are described in further detail below.
Example 1
When the human eye receives a visual stimulus of fixed frequency, the visual cortex produces a continuous response related to the stimulus frequency, called the steady-state visual evoked potential. The method designs a Cyclic-Shift single-Trial sample (CST) algorithm based on the periodic-oscillation property of SSVEP and fuses it with the Task-Related Component Analysis (TRCA) algorithm for SSVEP classification. Offline test results show that the method improves the identification accuracy and information transfer rate of SSVEP-BCIs.
The technical process comprises: obtaining the public SSVEP benchmark data set, performing preprocessing and feature extraction, classifying with the CST-TRCA algorithm, and calculating the decision accuracy and the information transfer rate.
The invention can effectively improve the performance of SSVEP-BCI systems with small samples and short stimulation times, and therefore has broad application scenarios in rehabilitation for the disabled, daily life, entertainment, and other fields, with considerable expected social and economic benefits.
Example 2
The scheme of Embodiment 1 is further described below with reference to specific calculation formulas, examples, and FIGS. 1-3:
first, obtaining reference data
The embodiment of the invention adopts the SSVEP benchmark data set published by Wang et al. in 2016, which contains data from 35 healthy volunteers. In the experiment, the subject sat in a chair 70 cm from the display, on which 40 visual stimuli were presented in a 5 × 8 matrix. The stimulus matrix was coded with the joint frequency-phase modulation method. The stimulus frequencies range over 8-15.8 Hz and the phases over 0-1.5π.
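The stimulus coding just described can be tabulated as follows. This is a hedged reconstruction: the 0.2 Hz frequency step and 0.5π phase step are assumptions consistent with the stated 8-15.8 Hz and 0-1.5π ranges for 40 targets, not values quoted from the patent:

```python
import numpy as np

# 40 targets: frequencies 8.0-15.8 Hz in assumed 0.2 Hz steps,
# phases cycling through 0, 0.5*pi, pi, 1.5*pi (JFPM-style coding).
freqs = 8.0 + 0.2 * np.arange(40)
phases = (0.5 * np.pi * np.arange(40)) % (2 * np.pi)
```

Each stimulus k then flickers as sin(2π·freqs[k]·t + phases[k]) on the 5 × 8 grid.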
Second, signal preprocessing
Data of different stimulation durations are first extracted with a sliding time window. The window length ranges from 0.3 s to 1.0 s with a step of 0.1 s. The purpose of the sliding time window is to explore the effect of CST on the identification accuracy and information transfer rate at different stimulation durations. Appropriate band-pass filtering is then applied to the data.
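The windowing and filtering step might be sketched as follows. The band edges and the reading of the 0.1 s step as a window-length increment are assumptions; the patent only specifies "appropriate band-pass filtering":

```python
import numpy as np
from scipy.signal import butter, filtfilt

def sliding_windows(trial, fs, win_lens=np.arange(0.3, 1.01, 0.1),
                    band=(7.0, 90.0)):
    """trial: (channels, T) single-trial EEG at sampling rate fs.
    Band-pass filters the trial, then returns a dict mapping each
    window length (s) to the corresponding initial segment."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trial, axis=1)  # zero-phase filtering
    return {round(float(w), 1): filtered[:, : int(round(w * fs))]
            for w in win_lens}
```

For the 9-channel, 250 Hz benchmark data this yields eight segments per trial, from 75 to 250 samples.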
Third, decoding step
The target detection and identification of the embodiment is developed based on cyclic-shift single-trial samples (CST) and task-related component analysis (TRCA).
According to the periodic-oscillation property of SSVEP, the embodiment designs the data expansion technique CST with cyclic shifting as the basic operation and adopts the CST-TRCA algorithm for EEG decoding, which mainly comprises the following seven steps:
(1) circularly shifting the training data according to the labeled frequency to obtain extended training data;
(2) respectively constructing an original training template and an expanded training template based on the original training data and the expanded training data;
(3) constructing a TRCA-based extended training spatial filter from the extended training data;
(4) performing cyclic displacement on the sample to be tested according to the potential target frequency to obtain extended data of the test sample;
(5) constructing an extended test template by using extended data of the test sample;
(6) constructing a TRCA-based extended test spatial filter from extended data of the test sample;
(7) and matching the correlation coefficients by combining the templates.
1. Cyclic-Shift single-Trial sample (CST)
Assume the sampling frequency is $f_s$. Ideally, for an SSVEP trial at frequency $f_k$, the period length of the signal is $f_s/f_k$ sampling points. Performing a cyclic shift of length $f_s/f_k$ on the trial therefore yields a new trial with the same SSVEP properties as the original. However, if the trial is circularly shifted by any other length, the original SSVEP properties are broken in the expanded data. On this basis, the embodiment proposes CST to generate a series of new data from a single sample and investigates which shift length is best suited to extracting stable SSVEPs.
Referring to FIG. 2, consider a single-trial signal of N sampling points at sampling frequency $f_s$, with assumed frequency $f_k$; the signal then contains $N f_k / f_s$ periods. If the signal contains an incomplete period, the sampling-point length $c_k$ of the complete periodic signal corresponding to the current number of periods is

$$c_k = \left\lceil \frac{N f_k}{f_s} \right\rceil \frac{f_s}{f_k} \tag{1}$$

where $\lceil\cdot\rceil$ denotes rounding toward infinity. The number of sampling points missing from the incomplete tail period of the current signal is therefore

$$l_k = c_k - N \tag{2}$$
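Equations (1)-(2) can be computed directly, as in the following sketch under the stated definitions:

```python
import math

def period_geometry(N, fs, fk):
    """For an N-sample trial at sampling rate fs and assumed frequency fk:
    ck is the sample length of the signal padded to a whole number of
    periods (period count rounded up), and lk = ck - N is the missing tail."""
    n_periods = math.ceil(N * fk / fs)  # number of (possibly partial) periods
    ck = n_periods * fs / fk            # sample length of n_periods full periods
    lk = ck - N
    return ck, lk
```

For a 250-sample trial at 250 Hz and 10 Hz (exactly 10 periods), the tail is empty; trimming to 240 samples leaves a 10-sample gap.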
The original signal is divided into periods according to the currently assumed frequency, and the first period is split into two parts, 1A and 1B, according to the length of the incomplete tail period. Cyclic shifts are performed sequentially in period order. For the u-th cyclic shift, the original signal is moved from the low-order periods toward the high-order periods (from right to left); the shift starts at the beginning of the (u+1)-th period, and the shift length is one period of sampling points. The incomplete period left vacant at the tail is filled with part 1B of the first period of the original signal; the overflowing high-order portion, from the second period to the end, wraps around; and finally part 1A of one period is appended to the end of the signal, generating a new signal of length N.
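The cyclic shift described above can be sketched for the simple case of a whole number of periods, where a plain roll suffices; the 1A/1B tail-patching for partial periods is omitted, so this is a simplification of the patent's procedure, not its exact implementation:

```python
import numpy as np

def cst_shift(x, fs, fk, u):
    """u-th cyclic shift of a single trial x (channels, N) at assumed
    frequency fk, for the case where fs/fk is an integer and N contains
    a whole number of periods (no 1A/1B patching needed)."""
    P = int(round(fs / fk))            # samples per period
    return np.roll(x, -u * P, axis=1)  # shift by u periods toward lower indices

def cst_expand(x, fs, fk, n_shifts):
    """Stack the original trial with its first n_shifts cyclic shifts."""
    return np.stack([cst_shift(x, fs, fk, u) for u in range(n_shifts + 1)])
```

For an ideally periodic signal, each shifted copy preserves the SSVEP phase structure exactly, which is the property CST exploits.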
2. Decoding process
In the feature-extraction process, the training samples are processed first.
The original training-sample set for the k-th stimulus is denoted $X_k$ and contains q single-trial samples. The extended data obtained by the u-th cyclic shift at the labeled frequency $f_k$ are named extended training data $X'^{(u)}_k$. A set containing the original training samples $X_k$ and, in total, $c_k q$ single-trial samples is thus obtained.
The corresponding trials are superposed and averaged to obtain the original training template and the extended training template of the k-th frequency, denoted $\bar{X}_k$ and $\bar{X}'_k$, respectively.
Task-related component analysis is performed on the extended training data of all stimuli to obtain the extended ensemble spatial filter W.
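A standard TRCA spatial-filter computation can be sketched as follows. This is a textbook-style sketch of TRCA (maximizing inter-trial covariance), not the patent's exact ensemble variant:

```python
import numpy as np

def trca_filter(trials):
    """Task-related component analysis spatial filter.
    trials: (n_trials, n_channels, n_samples). Returns w (n_channels,),
    the leading eigenvector of Q^{-1} S, where S sums cross-trial
    covariances and Q is the covariance of the concatenated data."""
    n_trials, n_ch, _ = trials.shape
    centered = trials - trials.mean(axis=2, keepdims=True)
    S = np.zeros((n_ch, n_ch))
    for i in range(n_trials):
        for j in range(n_trials):
            if i != j:
                S += centered[i] @ centered[j].T   # cross-trial covariance
    concat = np.concatenate(list(centered), axis=1)  # (n_ch, n_trials*n_samples)
    Q = concat @ concat.T
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Q, S))
    return np.real(eigvecs[:, np.argmax(np.real(eigvals))])
```

The filtered signals w @ trial are maximally reproducible across trials, which is what makes the filter useful for SSVEP template matching.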
3. Processing a sample to be tested
The sample to be tested, X, undergoes CST at all potential target frequencies. The u-th extended sample corresponding to the k-th frequency is denoted $X'^{(u)}_{(k)}$, from which the extended test template $\bar{X}'_{(k)}$ corresponding to the k-th frequency is obtained.
Task-related component analysis is performed on the extended test data of the k-th frequency to obtain the extended test spatial filter $w'_k$.
Referring to FIG. 3, joint template matching is then performed on the data, and the correlation between the sample to be tested and the k-th candidate frequency is measured in different ways, yielding four coefficients $r_{k,1},\dots,r_{k,4}$. The final correlation coefficient $r_k$ is their weighted sum:

$$r_k = \sum_{i=1}^{4} a_i\, r_{k,i}$$

where $a$ is the weight vector. The target frequency is detected according to equation (10):

$$\hat{k} = \arg\max_{k} r_k$$

With this decoding algorithm, the EEG features evoked by the target frequency can be decoded.
In the embodiments of the present invention, unless specifically stated, the models of the devices are not limited, provided that the devices can perform the functions described above.
Those skilled in the art will appreciate that the drawings are only schematic illustrations of preferred embodiments, and the above-described embodiments of the present invention are merely provided for description and do not represent the merits of the embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (3)
1. A brain-computer interface decoding method oriented to steady-state visual evoked potentials, the method comprising:
acquiring data of different stimulation durations through a sliding time window from a steady-state visual evoked potential data set, and band-pass filtering the data;
according to the periodic-oscillation property of the data, designing a novel data expansion technique with cyclic shifting as the basic operation, and combining the data expansion technique with other pattern-recognition methods for EEG decoding;
the novel data expansion technique specifically comprises:
1) circularly shifting the training data according to the labeled frequency to obtain extended training data; constructing an original training template and an extended training template from the original and extended training data, respectively;
2) constructing an extended-training spatial filter from the extended training data; circularly shifting the sample to be tested according to each potential target frequency to obtain extended test data;
3) constructing an extended test template and an extended test spatial filter from the extended test data; and performing joint template matching of correlation coefficients.
2. The steady-state visual evoked potential oriented brain-computer interface decoding method according to claim 1, wherein the training data are circularly shifted according to the labeled frequency to obtain the extended training data as follows:
the training data are divided into periods according to the currently assumed frequency, and the first period is split into parts 1A and 1B according to the length of the incomplete tail period; cyclic shifts are performed sequentially in period order;
for the u-th cyclic shift, the original signal is moved from the low-order periods toward the high-order periods; the shift starts at the beginning of the (u+1)-th period, and the shift length is one period of sampling points; the incomplete period left vacant at the tail is filled with part 1B of the first period of the original signal; the overflowing high-order portion, from the second period to the end, wraps around; and finally part 1A of one period is appended to the end of the signal, generating a new signal of length N.
3. The steady-state visual evoked potential oriented brain-computer interface decoding method according to claim 1, wherein
the extended training template is

$$\bar{X}'_k = \frac{1}{c_k q}\sum_{j=1}^{c_k q} X'^{(j)}_k$$

where q is the number of single-trial samples and $c_k$ is the sampling-point length of a complete period;
the extended test template is

$$\bar{X}'_{(k)} = \frac{1}{c_k}\sum_{u=1}^{c_k} X'^{(u)}_{(k)}$$

where $X'^{(u)}_{(k)}$ is the u-th extended sample corresponding to the k-th frequency; task-related component analysis is performed on the extended test data of the k-th frequency to obtain the extended test spatial filter $w'_k$;
joint template matching is performed on the data, and the correlation between the sample to be tested and the k-th candidate frequency is measured in different ways, yielding four coefficients; the final correlation coefficient $r_k$ is their weighted sum:

$$r_k = \sum_{i=1}^{4} a_i\, r_{k,i}$$

and the target frequency is detected as

$$\hat{k} = \arg\max_{k} r_k$$

where $a$ is the weight vector.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210046576.6A CN114371784B (en) | 2022-01-14 | 2022-01-14 | Brain-computer interface decoding method oriented to steady-state visual evoked potential |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210046576.6A CN114371784B (en) | 2022-01-14 | 2022-01-14 | Brain-computer interface decoding method oriented to steady-state visual evoked potential |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114371784A true CN114371784A (en) | 2022-04-19 |
CN114371784B CN114371784B (en) | 2023-11-03 |
Family
ID=81143773
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210046576.6A Active CN114371784B (en) | 2022-01-14 | 2022-01-14 | Brain-computer interface decoding method oriented to steady-state visual evoked potential |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114371784B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115429289A (en) * | 2022-09-01 | 2022-12-06 | 天津大学 | Brain-computer interface training data amplification method, device, medium and electronic equipment |
CN115429289B (en) * | 2022-09-01 | 2024-05-31 | 天津大学 | Brain-computer interface training data amplification method, device, medium and electronic equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101788969B1 (en) * | 2016-04-20 | 2017-11-15 | 국방과학연구소 | Target Selection Method of Augmented Reality System Using Brain-Computer Interface Technic Based on Steady State Visual Evoked Potential |
CN110782005A (en) * | 2019-09-27 | 2020-02-11 | 山东大学 | Image annotation method and system for tracking based on weak annotation data |
CN111105444A (en) * | 2019-12-31 | 2020-05-05 | 哈尔滨工程大学 | Continuous tracking method suitable for underwater robot target grabbing |
CN111580643A (en) * | 2020-04-10 | 2020-08-25 | 天津大学 | Brain-computer interface method based on steady-state asymmetric visual evoked potential |
CN113805694A (en) * | 2021-08-26 | 2021-12-17 | 上海大学 | Auxiliary grabbing system and method based on brain-computer interface and computer vision |
Non-Patent Citations (2)
Title |
---|
- MING Dong; XIAO Xiaolin; TANG Jiabei; XU Minpeng: "Hybrid-paradigm brain-computer interface technology based on an asynchronous parallel evoked strategy", Nanotechnology and Precision Engineering, no. 05, pages 21-26 *
- XIE Songyun; LIU Chang; WU You; ZHANG Juanli; DUAN Xu: "Design of a virtual keyboard-and-mouse brain-computer interface system based on multi-mode EEG", Journal of Northwestern Polytechnical University, no. 02, pages 64-68 *
Also Published As
Publication number | Publication date |
---|---|
CN114371784B (en) | 2023-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105956624B (en) | Mental imagery brain electricity classification method based on empty time-frequency optimization feature rarefaction representation | |
CN109299751B (en) | EMD data enhancement-based SSVEP electroencephalogram classification method of convolutional neural model | |
Vigário et al. | Independent component approach to the analysis of EEG and MEG recordings | |
CN114533086B (en) | Motor imagery brain electrolysis code method based on airspace characteristic time-frequency transformation | |
CN111067514B (en) | Multi-channel electroencephalogram coupling analysis method based on multi-scale multivariable transfer entropy | |
WO2019144776A1 (en) | Coding-decoding method for brain-machine interface system based on asymmetric electroencephalographic features | |
CN108363493B (en) | User characteristic model establishing method and system based on brain-computer interface and storage medium | |
CN111580643B (en) | Brain-computer interface method based on steady-state asymmetric visual evoked potential | |
CN113951900B (en) | Motor imagery intention recognition method based on multi-mode signals | |
CN105824418A (en) | Brain-computer interface communication system based on asymmetric visual evoked potential | |
CN102063180A (en) | HHT-based high-frequency combined coding steady state visual evoked potential brain-computer interface method | |
CN107132915B (en) | Brain-computer interface method based on dynamic brain function network connection | |
CN112137616B (en) | Consciousness detection device for multi-sense brain-body combined stimulation | |
Gao et al. | Multi-ganglion ANN based feature learning with application to P300-BCI signal classification | |
CN113208593A (en) | Multi-modal physiological signal emotion classification method based on correlation dynamic fusion | |
CN111820876A (en) | Dynamic construction method of electroencephalogram spatial filter | |
CN109009098A (en) | A kind of EEG signals characteristic recognition method under Mental imagery state | |
Zhang et al. | An improved method to calculate phase locking value based on Hilbert–Huang transform and its application | |
CN106073767B (en) | Phase synchronization measurement, coupling feature extraction and the signal recognition method of EEG signal | |
CN114557708A (en) | Device and method for detecting somatosensory stimulation consciousness based on electroencephalogram dual-feature fusion | |
CN110472595A (en) | Identification model construction method, device and the recognition methods of EEG signals, device | |
CN106843509B (en) | Brain-computer interface system | |
CN117520891A (en) | Motor imagery electroencephalogram signal classification method and system | |
CN114371784A (en) | Brain-computer interface decoding method for steady-state visual evoked potential | |
CN111543983A (en) | Electroencephalogram signal channel selection method based on neural network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||