CN113133769A - Equipment control method, device and terminal based on motor imagery electroencephalogram signals - Google Patents

Equipment control method, device and terminal based on motor imagery electroencephalogram signals Download PDF

Info

Publication number
CN113133769A
CN113133769A CN202110441392.5A CN202110441392A
Authority
CN
China
Prior art keywords
motor imagery
electroencephalogram
spatial
characteristic
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110441392.5A
Other languages
Chinese (zh)
Inventor
刘京
田亮
陈栋
赵薇
王少华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei Normal University
Original Assignee
Hebei Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei Normal University filed Critical Hebei Normal University
Priority to CN202110441392.5A priority Critical patent/CN113133769A/en
Publication of CN113133769A publication Critical patent/CN113133769A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Fuzzy Systems (AREA)
  • Evolutionary Computation (AREA)
  • Pathology (AREA)
  • Mathematical Physics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention is suitable for the technical field of motor imagery electroencephalogram signal processing, and provides a device control method, a device and a terminal based on motor imagery electroencephalogram signals, wherein the method comprises the following steps: obtaining original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels, and performing convolution to obtain a three-dimensional spatial feature matrix; calculating the similarity among all electroencephalogram signal channels according to the three-dimensional space characteristic matrix to obtain a space self-attention weight matrix; according to the spatial self-attention weight matrix, carrying out feature extraction on the original electroencephalogram data to obtain the spatial features of the motor imagery electroencephalogram signals of the target personnel; calculating the time characteristics of the motor imagery electroencephalogram signals, and classifying the motor imagery electroencephalogram signals based on the time characteristics and the space characteristics; and controlling the equipment connected with the target personnel according to the classification result. The invention can improve the classification precision of the motor imagery electroencephalogram signals, thereby improving the control accuracy of personnel on external equipment.

Description

Equipment control method, device and terminal based on motor imagery electroencephalogram signals
Technical Field
The invention belongs to the technical field of motor imagery electroencephalogram signal processing, and particularly relates to a device control method, a device and a terminal based on motor imagery electroencephalogram signals.
Background
Motor Imagery (MI) electroencephalography (EEG) classification is a research hotspot in the fields of brain science and human-computer interaction.
The main task of MI electroencephalogram signal classification is to classify and identify the MI electroencephalogram signals generated by the four motor imagery tasks of the human brain (left hand, right hand, feet and tongue), and then to encode the identified motor imagery task by relying on Brain-Computer Interface (BCI) technology, thereby realizing control of external equipment. In recent years, with the rapid development of Deep Learning (DL), MI electroencephalogram classification technology based on deep learning has received widespread attention.
However, the inventor of the present application finds that existing deep learning methods suffer from low classification precision in single-subject MI electroencephalogram signal classification, which reduces the control accuracy of external equipment. The reason is that the MI electroencephalogram signal is a very weak time-series signal with the characteristics of continuity, non-stationarity and low signal-to-noise ratio, and the prior art places no emphasis on selecting electroencephalogram signal channels, so that information irrelevant or only weakly relevant to motor imagery is also input into the classification network, causing loss or redundancy of the MI electroencephalogram information.
Disclosure of Invention
In view of this, the embodiment of the present invention provides a device control method, an apparatus and a terminal based on a motor imagery electroencephalogram signal, so as to improve the classification precision of the motor imagery electroencephalogram signal, and further improve the control accuracy of a person on an external device.
The first aspect of the embodiment of the invention provides a device control method based on motor imagery electroencephalogram signals, which comprises the following steps:
acquiring original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels, and performing spatial convolution operation on the original electroencephalogram data to obtain a three-dimensional spatial characteristic matrix;
calculating the similarity between any two electroencephalogram signal channels according to the three-dimensional space characteristic matrix to obtain a space self-attention weight matrix;
according to the spatial self-attention weight matrix, carrying out feature extraction on the original electroencephalogram data to obtain the spatial features of the motor imagery electroencephalogram signals of the target personnel;
and calculating the time characteristic of the motor imagery electroencephalogram signal, classifying the motor imagery electroencephalogram signal based on the time characteristic and the space characteristic, and controlling equipment connected with the target person according to the classification result.
A second aspect of the embodiments of the present invention provides an apparatus control device based on a motor imagery electroencephalogram signal, including:
the spatial feature extraction module is used for acquiring original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels and performing spatial convolution operation on the original electroencephalogram data to obtain a three-dimensional spatial feature matrix; calculating the similarity between any two electroencephalogram signal channels according to the three-dimensional space characteristic matrix to obtain a space self-attention weight matrix; according to the spatial self-attention weight matrix, carrying out feature extraction on the original electroencephalogram data to obtain the spatial features of the motor imagery electroencephalogram signals of the target personnel;
the time characteristic extraction module is used for calculating the time characteristic of the motor imagery electroencephalogram signal;
and the classification control module is used for classifying the motor imagery electroencephalogram signals based on the time characteristics and the space characteristics and controlling equipment connected with the target personnel according to the classification result.
A third aspect of the embodiments of the present invention provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the device control method based on the motor imagery electroencephalogram signal as described above when executing the computer program.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the device control method based on motor imagery electroencephalogram signal as described above.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
in the equipment control method based on motor imagery electroencephalogram signals, in the stage of classifying and identifying the motor imagery electroencephalogram signals, the degree of similarity between the electroencephalogram signal channels is identified through a spatial self-attention mechanism to extract the potential spatial relations between the channel signals, obtaining a spatial self-attention weight matrix; further, feature extraction is performed on the original electroencephalogram data through the spatial self-attention weight matrix, so that higher weight values can be assigned to the electroencephalogram signal channels related to motor imagery, the channels related to motor imagery are selected for extracting spatial features, and spatial-domain feature enhancement is performed; finally, the motor imagery electroencephalogram signals are classified by combining the temporal features of the motor imagery electroencephalogram signals, and external equipment is controlled according to the classification result. The invention can improve the classification precision of motor imagery electroencephalogram signals, thereby improving the control accuracy of personnel over external equipment.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart of an implementation of a device control method based on motor imagery electroencephalogram signals according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a network architecture of a spatial self-attention layer according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a network architecture of a parallel multi-scale TCN layer according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an overall network architecture of a classification model of an electroencephalogram signal based on motor imagery according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of experimental confusion matrix test results provided by embodiments of the present invention;
FIG. 6 is a schematic diagram of experimental confusion matrix test results provided by embodiments of the present invention;
FIG. 7 is a diagram illustrating experimental confusion matrix test results provided by embodiments of the present invention;
FIG. 8 is a schematic structural diagram of a device control apparatus based on motor imagery electroencephalogram signals according to an embodiment of the present invention;
fig. 9 is a schematic diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Motor imagery electroencephalogram signal classification is a fundamental technology of motor-imagery-based brain-computer interfaces and has wide application in fields such as rehabilitation, military operations and entertainment.
In recent years, with the rapid development of deep learning, the Convolutional Neural Network (CNN) has gradually become the core method in motor imagery electroencephalogram classification, and more and more researchers have proposed different CNN-based MI electroencephalogram classification network models. After studying the currently used deep learning methods in depth, the inventor of the present application found that the existing DL methods have low classification precision on the four-class single-trial motor imagery electroencephalogram classification task, which reduces the control accuracy of personnel over external equipment. Aiming at this problem, the invention is designed as follows:
the embodiment of the invention provides a device control method based on motor imagery electroencephalogram signals, and as shown in figure 1, the method comprises the following steps:
s101, acquiring original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels, and performing spatial convolution operation on the original electroencephalogram data to obtain a three-dimensional spatial feature matrix.
In the embodiment of the invention, a classification model of the motor imagery electroencephalogram signal is established, as shown in fig. 2. For single-subject MI electroencephalogram signal classification, before the original electroencephalogram data of a target person are acquired for classification, the model needs to be trained on that target person in advance to learn the weight of each convolution kernel in the model, and the MI electroencephalogram signals of the target person are then classified with the trained classification model.
Specifically, the first step of the classification process is to acquire a plurality of electroencephalogram signal channels from each area of the brain of the target person through electrodes, forming the original electroencephalogram data M ∈ R^{H×W} with height H and width W, where H corresponds to the number of electroencephalogram signal channels. Then, the original electroencephalogram data M are input into the convolutional layer Conv11 for a spatial convolution operation, obtaining a three-dimensional spatial feature map (i.e. the three-dimensional spatial feature matrix) A ∈ R^{D×H×W}, where D = 8 denotes the depth of the feature map.
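A minimal PyTorch sketch of this step is given below; the Conv11 kernel size is an assumption, since the exact layer parameters (Table 1) are provided only as an image in the source.

```python
import torch
import torch.nn as nn

H, W, D = 22, 1125, 8                  # EEG channels, time points, feature-map depth
M = torch.randn(1, 1, H, W)            # one trial of raw EEG data M ∈ R^{H×W}

# Conv11: spatial convolution producing the 3-D spatial feature matrix A ∈ R^{D×H×W}.
conv11 = nn.Conv2d(in_channels=1, out_channels=D, kernel_size=(1, 25), padding="same")
A = conv11(M)                          # shape (1, 8, 22, 1125)
```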
And S102, calculating the similarity between any two electroencephalogram signal channels according to the three-dimensional space characteristic matrix to obtain a space self-attention weight matrix.
Optionally, as a possible implementation manner, according to the three-dimensional spatial feature matrix, the similarity between any two electroencephalogram signal channels is calculated, which may be detailed as:
determining a two-dimensional characteristic matrix corresponding to each electroencephalogram signal channel according to the three-dimensional spatial characteristic matrix;
calculating the similarity between any two EEG signal channels based on the following formula:
P_ij = f(E_i, E_j) = exp(E_i·(E_j)^T) / Σ_{k=1}^{H} exp(E_i·(E_k)^T)
In the formula, P_ij is the similarity between the ith electroencephalogram signal channel and the jth electroencephalogram signal channel, f is the similarity function, H is the number of electroencephalogram signal channels, E_i is the two-dimensional feature matrix corresponding to the ith electroencephalogram signal channel, and (E_j)^T is the transpose of the two-dimensional feature matrix corresponding to the jth electroencephalogram signal channel.
In the embodiment of the invention, referring to fig. 2, the three-dimensional spatial feature map A can first be reshaped by the Reshape and Transpose functions into A1 ∈ R^{H×(D×W)} and A2 ∈ R^{(D×W)×H}, so that matrix multiplication can be performed between them. Then, the softmax function is applied to calculate the similarity between any two electroencephalogram signal channels (the similarity ranges from 0 to 1, where 0 denotes no similarity and 1 denotes complete similarity), obtaining the spatial self-attention weight matrix P ∈ R^{H×H}.
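Continuing the sketch above, the weight matrix can be computed as follows (the softmax dimension is an assumption):

```python
# A: (1, D, H, W) from Conv11 above.
A1 = A.permute(0, 2, 1, 3).reshape(1, H, D * W)   # A1 ∈ R^{H×(D×W)}
A2 = A1.transpose(1, 2)                           # A2 ∈ R^{(D×W)×H}
P = torch.softmax(torch.bmm(A1, A2), dim=-1)      # P ∈ R^{H×H}, each row sums to 1
```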
And S103, extracting the characteristics of the original electroencephalogram data according to the spatial self-attention weight matrix to obtain the spatial characteristics of the motor imagery electroencephalogram signals of the target personnel.
Optionally, as a possible implementation manner, according to the spatial self-attention weight matrix, feature extraction is performed on the original electroencephalogram data to obtain spatial features of the motor imagery electroencephalogram signal of the target person, which may be detailed as:
extracting the influence characteristics of each electroencephalogram signal channel from the original electroencephalogram data according to the spatial self-attention weight matrix to obtain an influence characteristic matrix;
and calculating the spatial characteristics of the motor imagery electroencephalogram signals based on the influence characteristic matrix.
Optionally, as a possible implementation manner, the influence characteristics of each electroencephalogram signal channel may be extracted according to the following formula to obtain an influence characteristic matrix:
F=P×M
in the formula, F is an influence characteristic matrix, P is a spatial self-attention weight matrix, and M is a matrix formed by original electroencephalogram data.
Optionally, as a possible implementation, the spatial feature of the motor imagery electroencephalogram signal may be calculated based on the following formula:
G=α×F+M
in the formula, G is the spatial characteristic of the motor imagery electroencephalogram signal, alpha is a characteristic parameter obtained by pre-training, F is an influence characteristic matrix, and M is a matrix formed by original electroencephalogram data.
In an embodiment of the present invention, as shown in fig. 2, the spatial self-attention weight matrix P ∈ R^{H×H} is multiplied by the original electroencephalogram data M ∈ R^{H×W} to obtain the influence feature matrix F ∈ R^{H×W}; that is, the influence feature of each electroencephalogram signal channel is the weighted sum of the data of the other electroencephalogram signal channels. Finally, F is multiplied by a learnable parameter α to form a residual block, and an element-wise summation with the original electroencephalogram data gives the final spatial feature. Here α is initialized to 0 at the start of training and is gradually updated during the training of the entire deep learning system to a more appropriate value.
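A short continuation of the sketch, following F = P×M and G = α×F + M (the batch handling is an assumption):

```python
alpha = nn.Parameter(torch.zeros(1))   # learnable α, initialised to 0 as described
M2d = M.squeeze(1)                     # raw EEG data, shape (1, H, W)
F = torch.bmm(P, M2d)                  # influence feature matrix: weighted sum over channels
G = alpha * F + M2d                    # residual connection with the raw data
```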
That is to say, the method automatically learns the feature similarity between all electroencephalogram signal channels through a spatial self-attention mechanism, and the features of each electroencephalogram signal channel are updated by aggregating the features of all channels through weighted summation. The mechanism can automatically assign higher weight values to channels related to motor imagery and lower weight values to channels unrelated to motor imagery, thereby selecting the best channels, extracting high-level distinguishable spatial features and defining a more compact, more focused representation in the spatial domain of the original MI-EEG signal data. This verifies the hypothesis that, when a person imagines an action, any channels with similar motion-related characteristics can reinforce each other regardless of their spatial locations in the brain, which solves the problem of feature information loss caused by manually selecting electroencephalogram signal channels.
For example, the parameters of the spatial feature extraction section may be as shown in table 1.
TABLE 1 spatial feature extraction parameters
(table not reproduced in this text)
And S104, calculating the time characteristic of the motor imagery electroencephalogram signal, classifying the motor imagery electroencephalogram signal based on the time characteristic and the space characteristic, and controlling equipment connected with the target person according to the classification result.
Optionally, as a possible implementation, the time characteristic of the motor imagery electroencephalogram signal is calculated, which may be detailed as:
carrying out multi-scale time convolution operation on the original electroencephalogram data to obtain time characteristics corresponding to each time scale;
and fusing the time characteristics corresponding to each time scale to obtain the time characteristics of the motor imagery electroencephalogram signals.
In the embodiment of the present invention, the inventor of the present application notes that, because the electroencephalogram signal is a time-varying, non-stationary time-series signal, the prior art usually uses a Recurrent Neural Network (RNN) to extract how the signal changes over time; however, the RNN suffers severely from vanishing and exploding gradients. The Temporal Convolutional Network (TCN) can give a shallow network a large receptive field through dilated convolution and by changing the convolution kernel size, and the direction of its back-propagation path differs from the temporal direction of the sequence, thereby avoiding the vanishing and exploding gradient problems. In addition, compared with other time-series classification networks, such as the RNN variants LSTM and GRU, the TCN can view both historical and future information through dilated convolution, whereas LSTM and the like can only view historical information, which leads to incomplete context information and reduces network performance. Also, in an RNN the prediction at a later time step must wait for its predecessors to complete, whereas the TCN can compute in parallel. Thus, in electroencephalogram signal training, a long input sequence can be processed by the TCN as a whole, rather than sequentially as in an RNN.
Specifically, the TCN consists of several residual blocks (convolution kernel size K_T, dilation coefficient d ∈ {1, 2}). The differences between the TCN and a conventional CNN mainly include:
(1) Causal convolution: the output of the TCN has the same length as the input. To this end, the TCN uses a 1D fully convolutional network architecture, in which each hidden layer has the same size as the input layer, and the length of each subsequent layer is kept equal to that of the previous layer by means of zero padding. Furthermore, causal convolution guarantees that no information flows from the future to the past; in short, the output at time t depends only on the inputs at time t or earlier.
(2) Dilated convolution: conventional causal convolution can only increase its receptive field linearly with the depth of the network, which means that a very deep network, or a network with very large convolution kernels, is required to obtain a large receptive field; this is one of the most significant drawbacks of conventional causal convolution. To solve this problem, the TCN uses a series of dilated convolutions whose receptive field grows with the depth of the network in proportion to a dilation factor d.
(3) Residual block: the residual block of the TCN consists of two layers of dilated convolution; in addition to the causal convolution layer, each layer has batch normalization, nonlinear activation and Dropout layers. Although the TCN only has one-dimensional convolution, two-dimensional feature maps can still be processed by treating the second dimension of the feature map as the depth dimension. The skip connection adds the input to the output feature map and checks whether the depths of the input and output are the same; if not, a 1x1 convolution is applied.
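A minimal sketch of such a residual block is given below; the activation function (ELU) and Dropout rate are assumptions not fixed by the text.

```python
import torch.nn as nn

class TCNResidualBlock(nn.Module):
    """One TCN residual block: two dilated causal convolutions, each followed by
    batch normalisation, a nonlinearity and Dropout, plus a skip connection with
    a 1x1 convolution when the input and output depths differ."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation, dropout=0.4):
        super().__init__()
        pad = (kernel_size - 1) * dilation          # left-only padding keeps the convolution causal
        def layer(ci, co):
            return nn.Sequential(
                nn.ConstantPad1d((pad, 0), 0.0),
                nn.Conv1d(ci, co, kernel_size, dilation=dilation),
                nn.BatchNorm1d(co), nn.ELU(), nn.Dropout(dropout))
        self.net = nn.Sequential(layer(in_ch, out_ch), layer(out_ch, out_ch))
        self.skip = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):                           # x: (N, in_ch, W)
        return self.net(x) + self.skip(x)           # output length stays W
```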
Based on the advantages of the TCN, the embodiment of the invention adopts the parallel multi-scale TCN network on the basis of the TCN network architecture to solve the interference of noise artifacts existing in the MI-EEG signal on the time domain, the TCN networks with different scales can extract time domain feature information of dynamic changes of the original MI-EEG signal under different scales, and can extract sufficient time domain potential features, thereby effectively reducing the interference of the noise artifacts and improving the classification accuracy.
Referring to fig. 3, the specific time feature extraction steps may be as follows:
making M be belonged to RH×WInputting the convolutional layer Conv21 to obtain a characteristic diagram B e RD×1×WThe convolution kernel size of Conv21 is (22 × 1), and D is the depth of feature map B. And inputting the characteristic diagram B into four TCN convolution layers with different scales in parallel, wherein the scale of each TCN layer is different and is represented by the size of a convolution kernel. In an embodiment of the invention, the convolutions are eachThe kernel sizes are (1 × 25), (1 × 50), (1 × 75) and (1 × 100), respectively, and represent that the MI-EEG features are convolved with time steps of 100ms, 200ms, 300ms and 400ms, respectively, to obtain four sets of temporal feature maps T1,T2,T3,T4∈RD×1×W
For a time-series prediction model, when the receptive field completely covers the input length, the model has the complete history and its prediction is best; that is, the value at time t_n depends only on the values at times t_{n-1}, ..., t_0, and the model works best when the covered length l_0 from time t_0 equals the input length l (in this network, l = W). Then, with dilation base b and convolution kernel size k (k ≥ b), the receptive field size l of the TCN and the number of residual blocks n satisfy:
l = 1 + 2(k − 1)·(b^n − 1)/(b − 1),  i.e.  n = ⌈log_b((l − 1)(b − 1)/(2(k − 1)) + 1)⌉
in this model, b is 2, l is 1125, then:
n = ⌈log_2((1125 − 1)/(2(k − 1)) + 1)⌉
k is 25, 50, 75, 100, the number of residual blocks n corresponding to the TCN layer is 5,4, 4, 3, as shown in fig. 3. Finally, four groups of feature maps T are combined1,T2,T3,T4∈RD×1×WAnd (3) fusing a group of characteristic graphs T epsilon R into a group through a splicing function4D×1×WAnd T is the finally obtained time characteristic diagram which eliminates noise and contains different time scale information.
For example, the parameters of the temporal feature extraction section may be as shown in table 2.
TABLE 2 temporal feature extraction parameters
(table not reproduced in this text)
Optionally, as a possible implementation, classifying the motor imagery electroencephalogram signal based on the temporal feature and the spatial feature includes:
fusing the time characteristic and the space characteristic to obtain a space-time characteristic of the motor imagery electroencephalogram signal;
and inputting the space-time characteristics of the motor imagery electroencephalogram signals into a preset characteristic classification channel for classification to obtain the category of the motor imagery electroencephalogram signals.
In an embodiment of the invention, referring to fig. 4, the output G ∈ R^{H×W} of the spatial self-attention layer is passed through a spatial convolution layer (Conv12) with a convolution kernel size of (22 × 1) to obtain a feature map S ∈ R^{D×1×W}. Then, S ∈ R^{D×1×W} and T ∈ R^{4D×1×W} are spliced and fused along the depth dimension to obtain the enhanced spatio-temporal information feature map N1 ∈ R^{5D×1×W}, where D = 16 and W = 1125.
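Continuing the sketches above, the fusion step reduces to a convolution plus a concatenation:

```python
conv12 = nn.Conv2d(1, 16, kernel_size=(22, 1))   # G: (N, 1, 22, 1125) -> S: (N, 16, 1, 1125)
S = conv12(G.unsqueeze(1))
T = MultiScaleTCN()(M)                           # temporal branch output, (N, 64, 1125)
N1 = torch.cat([S, T.unsqueeze(2)], dim=1)       # N1 ∈ R^{5D×1×W} = (N, 80, 1, 1125)
```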
For example, the parameters of the feature fusion part may be as shown in table 3.
TABLE 3 feature fusion parameters
(table not reproduced in this text)
Then, the fused feature map N1 ∈ R^{5D×1×W} is input into a preset feature classification channel for classification. This part comprises two convolution layers (Conv3 and Conv4, each followed by a batch normalization layer (BN) and a nonlinear activation layer (NL)), two average pooling layers (AvgP1 and AvgP2, each followed by a Dropout layer), a full convolution layer (FC) and a LogSoftmax function.
Specifically, N1 is passed through convolution Conv3 with a kernel size of (1 × 75), using zero padding to obtain a feature map of the same size as N1, and batch normalization and nonlinear activation are applied to the obtained feature map to obtain N2. N2 is then passed through an average pooling layer (AvgP1) with a kernel size of (1 × 8), reducing the input size (80, 1, 1125) to an output of (80, 1, 140), giving N3. N3 is then passed through convolution Conv4 with a kernel size of (1 × 25), using zero padding to obtain a feature map of the same size as N3, and batch normalization and nonlinear activation are applied to obtain N4. N4 is then passed through an average pooling layer (AvgP2) with a kernel size of (1 × 8), reducing the input size (80, 1, 140) to an output of (80, 1, 17), giving N5. Finally, N5 is passed through a full convolution layer (FC) with a kernel size of (1 × 17), giving an output N6 of size (4, 1, 1). N6 is converted into conditional probabilities over the four labels, and the four-class classification is performed using the LogSoftmax function. According to the classification result, the identified motor imagery task is encoded by relying on BCI technology, so that control of external equipment can be realized. For example, a wheelchair can be started, stopped or steered according to the classification result; it should be noted that the range of external devices is wide, and the external devices may be a cart, household appliances, a robot and the like.
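A sketch of this classification channel follows, continuing the PyTorch sketches above; the per-layer channel count (kept at 80) and the activation choice are assumptions, since Table 4 is not reproduced here.

```python
class ClassificationHead(nn.Module):
    """Feature classification channel: Conv3 -> BN/NL -> AvgP1 -> Conv4 -> BN/NL
    -> AvgP2 -> FC (as a (1×17) convolution) -> LogSoftmax."""
    def __init__(self, in_ch=80, n_classes=4, dropout=0.4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, kernel_size=(1, 75), padding="same"),   # Conv3
            nn.BatchNorm2d(in_ch), nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)), nn.Dropout(dropout),          # AvgP1: 1125 -> 140
            nn.Conv2d(in_ch, in_ch, kernel_size=(1, 25), padding="same"),   # Conv4
            nn.BatchNorm2d(in_ch), nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)), nn.Dropout(dropout),          # AvgP2: 140 -> 17
            nn.Conv2d(in_ch, n_classes, kernel_size=(1, 17)),               # FC
            nn.LogSoftmax(dim=1))

    def forward(self, n1):                    # n1: (N, 80, 1, 1125)
        return self.net(n1).flatten(1)        # (N, 4) log-probabilities for the four classes
```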
For example, the parameters of the feature classification section may be as shown in table 4.
TABLE 4 feature classification parameters
(table not reproduced in this text)
According to the equipment control method based on motor imagery electroencephalogram signals, in the spatial domain, the degree of similarity between the electroencephalogram signal channels is identified through a spatial self-attention mechanism, the potential spatial relations between the channel signals are extracted, and a spatial self-attention weight matrix is obtained; further, feature extraction is performed on the original electroencephalogram data through the spatial self-attention weight matrix, the features of each electroencephalogram signal channel are updated by aggregating the features of all channels through weighted summation, higher weight values can be assigned to the channels related to motor imagery, the channels related to motor imagery are selected for extracting spatial features, and spatial feature enhancement is performed. In the temporal domain, multi-scale temporal convolution is adopted, and the temporal features at different time scales are extracted and fused to eliminate noise interference in the motor imagery electroencephalogram signals, obtaining enhanced temporal features. The temporal features and spatial features of the motor imagery electroencephalogram signals are then fused, the motor imagery electroencephalogram signals are classified, and external equipment is controlled according to the classification result. The invention can improve the classification precision of single-trial motor imagery electroencephalogram signals, thereby improving the control accuracy of personnel over external equipment.
In the following, the feasibility of the method of the embodiment of the present invention was verified by experiments.
Details of the experiment:
the experimental data were evaluated using three published MI-EEG datasets. The first data set was the BCICIV2a data set, which recorded four types of motor imagery tasks (left hand, right hand, feet and tongue) performed by 9 different subjects in 25 lead electrode channels (22 brain electrical signal channels and 3 eye electrical signal channels) with a sampling rate of 250 Hz. Each channel is pre-processed with a 0.5-100Hz band pass filter. The electroencephalogram test data of each subject are divided into 2 groups, wherein one group is used as a training set, and the other group is used as a testing set. Each group contained 288 motor imagery trials, with an average of 72 trials per class of motor imagery task. In addition, each experiment uses the same time window [ -0.5,4s ] to extract the motor imagery signals of all 22 brain electrical channels. Thus, in the data set, 9 training sets and 9 test sets are explicitly separated. In the subset, there were 72 experiments per category. Thus, after removing 3 more ocular channel signals, 22x1, 1125 data points were obtained for each experiment.
The second dataset was the BCICIV2b dataset, which recorded two types of motor imagery tasks (left hand, right hand) for 9 different subjects on 6 channels (3 electroencephalogram signal channels and 3 electrooculogram channels). As with the 2a dataset, sampling was performed at 250 Hz and pre-processing used band-pass filtering between 0.5-100 Hz. Unlike the 2a dataset, each group of the 2b dataset contains 480 motor imagery trials, averaging 240 trials per motor imagery task type. The 3 electroencephalogram channels are likewise extracted with a time window of [-0.5, 4 s]. Thus, after removing the 3 electrooculogram channel signals, 3 × 1125 data points were obtained for each trial.
The third dataset is the HGD dataset, which records 14 healthy subjects performing four motor imagery tasks on 44 electroencephalogram signal channels with 4 s of imagination; each subject performed 13 sessions, and each session comprises 80 trials. The four classes of movement are left hand, right hand, feet and rest (no movement). For each subject, the training set consists of approximately 880 trials (all sessions except the last two) and the test set of approximately 160 trials (the last two sessions). The HGD sampling rate was 500 Hz. For a fair comparison with BCICIV2a, the HGD was resampled at 250 Hz and the same 4.5 s time window was used, thus obtaining 44 × 1125 data points per trial.
The most important evaluation index in MI-EEG analysis is Accuracy: the higher the average accuracy over the categories, the better the model performance. The average accuracy is calculated by the formula:
Accuracy = (TP + TN) / (TP + TN + FP + FN)
wherein TP is the number of true positives, TN the number of true negatives, FP the number of false positives and FN the number of false negatives. TP is the number of true positive samples, i.e. the number of positive samples that are correctly predicted. TN is the number of true negative samples, i.e. the number of negative samples that are correctly predicted. FP is the number of false positive samples, i.e. the number of negative samples that are incorrectly predicted as positive. FN is the number of false negative samples, i.e. the number of positive samples that are incorrectly predicted as negative. For the four-class MI classification, the NLLLoss function in PyTorch is used as the loss function, the Xavier algorithm is used to initialize all parameters in the network, and the Adam algorithm is adopted as the optimization algorithm. The learning rate for the BCICIV2a dataset was 0.0001, the learning rate for BCICIV2b was 0.0001, and the learning rate for the HGD dataset was 0.001. The batch size was 32.
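The corresponding PyTorch setup is sketched below; the stand-in model instance is only illustrative, since in practice the full network assembled from the sketches above would be used.

```python
def init_xavier(m):
    if isinstance(m, (nn.Conv1d, nn.Conv2d, nn.Linear)):
        nn.init.xavier_uniform_(m.weight)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

model = ClassificationHead()                              # stand-in for the full network
model.apply(init_xavier)                                  # Xavier initialisation
criterion = nn.NLLLoss()                                  # pairs with the LogSoftmax output
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4) # 1e-4 for BCICIV2a/2b, 1e-3 for HGD
# batch size 32
```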
Since the three datasets clearly separate the training and test data, the invention randomly divides the training data into a training set (80%) and a validation set (20%), and the entire test data is used as the test set. With this partitioning, an early-stopping strategy developed in the field of computer vision can be used. The first phase of training is stopped when the validation accuracy does not improve within a predetermined period. Training then continues on the combined training and validation data, starting from the parameter values that gave the highest validation accuracy. Training ends when the loss function on the validation dataset drops to the value reached by the training dataset at the end of the first training phase. The hyperparameter of the Dropout layers and the constant and weight decay rate of the batch normalization layers are set to 0.4, 10^-5 and 0.1, respectively.
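A compact sketch of this two-phase early-stopping schedule is given below; the loader names, the patience value and the epoch limit are assumptions.

```python
import copy
import torch

def run_epoch(model, opt, loss_fn, loader, train):
    model.train(train)
    correct, total, loss_sum = 0, 0, 0.0
    for x, y in loader:
        with torch.set_grad_enabled(train):
            out = model(x)                                 # (N, 4) log-probabilities
            loss = loss_fn(out, y)
            if train:
                opt.zero_grad(); loss.backward(); opt.step()
        loss_sum += loss.item() * y.size(0)
        correct += (out.argmax(dim=1) == y).sum().item()
        total += y.size(0)
    return correct / total, loss_sum / total

def two_phase_train(model, opt, loss_fn, train_dl, val_dl, trainval_dl,
                    patience=20, max_epochs=500):
    best_acc, best_state, wait = 0.0, copy.deepcopy(model.state_dict()), 0
    # Phase 1: train on the 80% split, stop when validation accuracy stops improving.
    for _ in range(max_epochs):
        run_epoch(model, opt, loss_fn, train_dl, train=True)
        val_acc, _ = run_epoch(model, opt, loss_fn, val_dl, train=False)
        if val_acc > best_acc:
            best_acc, best_state, wait = val_acc, copy.deepcopy(model.state_dict()), 0
        else:
            wait += 1
            if wait >= patience:
                break
    model.load_state_dict(best_state)                      # parameters with best validation accuracy
    _, stop_loss = run_epoch(model, opt, loss_fn, train_dl, train=False)  # phase-1 training loss
    # Phase 2: continue on training + validation data until the validation loss
    # falls to the phase-1 training-set loss.
    for _ in range(max_epochs):
        run_epoch(model, opt, loss_fn, trainval_dl, train=True)
        _, val_loss = run_epoch(model, opt, loss_fn, val_dl, train=False)
        if val_loss <= stop_loss:
            break
    return model
```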
The comparative experiment process comprises the following steps:
single test classification experiments were first performed using the BCICIV2a dataset and the accuracy of this method was compared to other deep learning based methods with the results shown in table 5.
TABLE 5 BCICIV2a accuracy data comparison
(table not reproduced in this text)
As can be seen from table 5, the average accuracies of the other DL-based methods range from 65.43% to 77.35%, while the device control method based on motor imagery electroencephalogram signals provided by the embodiment of the present invention is significantly better, with an average accuracy of 79.26% for single-trial classification. In addition, fig. 5 shows the confusion matrix of the proposed method on the test set for the four MI tasks.
In order to further verify the adaptivity and robustness of the device control method based on motor imagery electroencephalogram signals, the embodiment of the invention also evaluates the two other, more challenging datasets BCICIV2b and HGD with single-trial classification experiments; the corresponding confusion matrix results are shown in fig. 6 and fig. 7. Since the most advanced current methods only report average accuracy values on BCICIV2b and HGD, only the average accuracies are listed in tables 6 and 7, respectively, for comparison.
TABLE 6 BCICIV2b comparison of accuracy data
(table not reproduced in this text)
TABLE 7 HGD accuracy data comparison
(table not reproduced in this text)
The data in table 6 show that on the BCICIV2b dataset this method achieves an accuracy of 85.90%, a clear improvement over the other state-of-the-art methods. The data in table 7 show that on the HGD dataset this method achieves an accuracy of 96.96%, significantly higher than the other methods. The experimental results show that the equipment control method based on motor imagery electroencephalogram signals provided by the embodiment of the invention has good adaptivity and robustness across different electroencephalogram datasets and different classification tasks.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
An embodiment of the present invention provides an apparatus control device based on motor imagery electroencephalogram signals, and as shown in fig. 8, the apparatus 80 includes:
the spatial feature extraction module 81 is configured to acquire original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels, and perform spatial convolution operation on the original electroencephalogram data to obtain a three-dimensional spatial feature matrix; calculating the similarity between any two electroencephalogram signal channels according to the three-dimensional space characteristic matrix to obtain a space self-attention weight matrix; and according to the spatial self-attention weight matrix, performing feature extraction on the original electroencephalogram data to obtain the spatial features of the motor imagery electroencephalogram signals of the target personnel.
And the time characteristic extraction module 82 is used for calculating the time characteristic of the motor imagery electroencephalogram signal.
And the classification control module 83 is used for classifying the motor imagery electroencephalogram signals based on the time characteristics and the space characteristics and controlling equipment connected with the target personnel according to the classification result.
Optionally, as a possible implementation manner, the spatial feature extraction module 81 is configured to:
determining a two-dimensional characteristic matrix corresponding to each electroencephalogram signal channel according to the three-dimensional spatial characteristic matrix;
calculating the similarity between any two EEG signal channels based on the following formula:
P_ij = f(E_i, E_j) = exp(E_i·(E_j)^T) / Σ_{k=1}^{H} exp(E_i·(E_k)^T)
In the formula, P_ij is the similarity between the ith and the jth electroencephalogram signal channels, f is the similarity function, H is the number of electroencephalogram signal channels, E_i is the two-dimensional feature matrix corresponding to the ith electroencephalogram signal channel, and (E_j)^T is the transpose of the two-dimensional feature matrix corresponding to the jth electroencephalogram signal channel.
Optionally, as a possible implementation manner, the spatial feature extraction module 81 is configured to:
extracting the influence characteristics of each electroencephalogram signal channel from the original electroencephalogram data according to the spatial self-attention weight matrix to obtain an influence characteristic matrix;
and calculating the spatial characteristics of the motor imagery electroencephalogram signals based on the influence characteristic matrix.
Optionally, as a possible implementation manner, the spatial feature extraction module 81 is configured to extract an influence feature of each electroencephalogram signal channel according to the following formula to obtain an influence feature matrix:
F=P×M
in the formula, F is an influence characteristic matrix, P is a spatial self-attention weight matrix, and M is a matrix formed by original electroencephalogram data.
Optionally, as a possible implementation manner, the spatial feature extraction module 81 is configured to calculate the spatial feature of the motor imagery electroencephalogram signal based on the following formula:
G=α×F+M
in the formula, G is the spatial characteristic of the motor imagery electroencephalogram signal, alpha is a characteristic parameter obtained by pre-training, F is an influence characteristic matrix, and M is a matrix formed by original electroencephalogram data.
Optionally, as a possible implementation, the temporal feature extraction module 82 is configured to:
carrying out multi-scale time convolution operation on the original electroencephalogram data to obtain time characteristics corresponding to each time scale;
and fusing the time characteristics corresponding to each time scale to obtain the time characteristics of the motor imagery electroencephalogram signals.
Optionally, as a possible implementation, the classification control module 83 is configured to:
fusing the time characteristic and the space characteristic to obtain the space-time characteristic of the motor imagery electroencephalogram signal;
and inputting the space-time characteristics of the motor imagery electroencephalogram signals into a preset characteristic classification channel for classification to obtain the category of the motor imagery electroencephalogram signals.
Fig. 9 is a schematic diagram of a terminal according to an embodiment of the present invention. As shown in fig. 9, the terminal 90 of this embodiment includes: a processor 91, a memory 92 and a computer program 93 stored in the memory 92 and executable on the processor 91. The processor 91 executes the computer program 93 to implement the steps in each of the above-described embodiments of the device control method based on motor imagery electroencephalogram signals, such as steps S101 to S104 shown in fig. 1. Alternatively, the processor 91, when executing the computer program 93, implements the functions of the respective modules in the above-described respective apparatus embodiments, for example, the functions of the modules 81 to 83 shown in fig. 8.
Illustratively, the computer program 93 may be divided into one or more modules/units, which are stored in the memory 92 and executed by the processor 91 to implement the present invention. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 93 in the terminal 90. For example, the computer program 93 may be divided into a spatial feature extraction module 81, a temporal feature extraction module 82, and a classification control module 83 (a module in a virtual device), each of which functions specifically as follows:
the spatial feature extraction module 81 is configured to acquire original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels, and perform spatial convolution operation on the original electroencephalogram data to obtain a three-dimensional spatial feature matrix; calculating the similarity between any two electroencephalogram signal channels according to the three-dimensional space characteristic matrix to obtain a space self-attention weight matrix; and according to the spatial self-attention weight matrix, performing feature extraction on the original electroencephalogram data to obtain the spatial features of the motor imagery electroencephalogram signals of the target personnel.
And the time characteristic extraction module 82 is used for calculating the time characteristic of the motor imagery electroencephalogram signal.
And the classification control module 83 is used for classifying the motor imagery electroencephalogram signals based on the time characteristics and the space characteristics and controlling equipment connected with the target personnel according to the classification result.
The terminal 90 may be a computing device such as a desktop computer, a notebook, a palm top computer, and a cloud server. The terminal 90 may include, but is not limited to, a processor 91, a memory 92. Those skilled in the art will appreciate that fig. 9 is merely an example of a terminal 90 and does not constitute a limitation of the terminal 90, and may include more or less components than those shown, or combine certain components, or different components, e.g., the terminal 90 may also include input-output devices, network access devices, buses, etc.
The Processor 91 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The storage 92 may be an internal storage unit of the terminal 90, such as a hard disk or a memory of the terminal 90. The memory 92 may also be an external storage device of the terminal 90, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), etc. provided on the terminal 90. Further, the memory 92 may also include both internal and external memory storage devices of the terminal 90. The memory 92 is used for storing computer programs and other programs and data required by the terminal 90. The memory 92 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A device control method based on motor imagery electroencephalogram signals is characterized by comprising the following steps:
acquiring original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels, and performing spatial convolution operation on the original electroencephalogram data to obtain a three-dimensional spatial characteristic matrix;
calculating the similarity between any two electroencephalogram signal channels according to the three-dimensional space characteristic matrix to obtain a space self-attention weight matrix;
according to the spatial self-attention weight matrix, carrying out feature extraction on the original electroencephalogram data to obtain the spatial features of the motor imagery electroencephalogram signals of the target person;
calculating the time characteristic of the motor imagery electroencephalogram signal, classifying the motor imagery electroencephalogram signal based on the time characteristic and the space characteristic, and controlling equipment connected with the target person according to a classification result.
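For illustration only (this is not the patented implementation), the Python sketch below runs the flow of claim 1 end to end on random data. The channel count, the dot-product similarity used for the attention weights, the fixed value standing in for the trained parameter, the pooling-based temporal features, the linear classifier and the command table are all assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions: H EEG channels, T samples per trial, C motor imagery classes.
H, T, C = 22, 1000, 4
M = rng.standard_normal((H, T))                      # raw EEG data of the target person (one trial)

# Spatial self-attention over channels (claims 2-5, simplified): similarity of channels
# i and j computed from the raw rows, softmax-normalised per row.
scores = M @ M.T / np.sqrt(T)                        # (H, H) channel-to-channel similarity
P = np.exp(scores - scores.max(axis=1, keepdims=True))
P /= P.sum(axis=1, keepdims=True)                    # spatial self-attention weight matrix

alpha = 0.5                                          # stand-in for the pre-trained parameter
F = P @ M                                            # influence feature matrix (F = P x M)
G = alpha * F + M                                    # spatial feature (G = alpha*F + M)

# Temporal feature (claim 6, simplified): average-pool the trial at two time scales and fuse.
t_fine = M.reshape(H, T // 10, 10).mean(axis=2)      # fine time scale
t_coarse = M.reshape(H, T // 100, 100).mean(axis=2)  # coarse time scale
temporal = np.concatenate([t_fine.ravel(), t_coarse.ravel()])

# Fuse spatial and temporal features and classify with a linear softmax head.
features = np.concatenate([G.ravel(), temporal])
W = rng.standard_normal((C, features.size)) * 0.01   # untrained stand-in weights
logits = W @ features
label = int(np.argmax(logits))

# Map the predicted motor imagery class to a command for the connected equipment.
commands = {0: "left", 1: "right", 2: "forward", 3: "stop"}   # assumed command table
print("predicted class:", label, "-> command:", commands[label])
```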
2. The device control method based on motor imagery electroencephalogram signals of claim 1, wherein calculating a similarity between any two electroencephalogram signal channels according to the three-dimensional spatial feature matrix comprises:
determining a two-dimensional characteristic matrix corresponding to each electroencephalogram signal channel according to the three-dimensional space characteristic matrix;
calculating the similarity between any two EEG signal channels based on the following formula:
P_ij = exp( f(E_i, (E_j)^T) ) / Σ_{k=1}^{H} exp( f(E_i, (E_k)^T) )
in the formula, P_ij is the similarity between the i-th and the j-th EEG signal channels, f is the similarity function, H is the number of EEG signal channels, E_i is the two-dimensional characteristic matrix corresponding to the i-th EEG signal channel, and (E_j)^T is the transpose of the two-dimensional characteristic matrix corresponding to the j-th EEG signal channel.
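One possible reading of this formula in Python follows. The similarity function f is not reproduced in the text above, so the dot-product form used here (equal to trace(E_i E_j^T)) is an assumption; the softmax normalisation over the H channels follows the reconstructed expression.

```python
import numpy as np

def spatial_attention_weights(E):
    """E: (H, K, T) three-dimensional spatial feature matrix, i.e. one
    (K, T) two-dimensional characteristic matrix E_i per EEG signal channel.
    Returns the (H, H) spatial self-attention weight matrix P."""
    H = E.shape[0]
    flat = E.reshape(H, -1)                       # each E_i flattened to one vector
    scores = flat @ flat.T                        # scores[i, j] == trace(E_i @ E_j.T), an assumed f
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability for the softmax
    P = np.exp(scores)
    P /= P.sum(axis=1, keepdims=True)             # normalise over the H channels
    return P

E = np.random.default_rng(1).standard_normal((22, 8, 250))   # assumed toy shape
P = spatial_attention_weights(E)
print(P.shape, P.sum(axis=1)[:3])                             # (22, 22); each row sums to 1
```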
3. The device control method based on motor imagery electroencephalogram signals of claim 1, wherein performing feature extraction on the original electroencephalogram data according to the spatial self-attention weight matrix to obtain the spatial features of the motor imagery electroencephalogram signals of the target person comprises:
extracting the influence characteristics of each electroencephalogram signal channel from the original electroencephalogram data according to the spatial self-attention weight matrix to obtain an influence characteristic matrix;
and calculating the spatial characteristics of the motor imagery electroencephalogram signals based on the influence characteristic matrix.
4. The device control method based on motor imagery electroencephalogram signals of claim 3, wherein the influence characteristics of each electroencephalogram signal channel are extracted according to the following formula to obtain an influence characteristic matrix:
F=P×M
in the formula, F is an influence characteristic matrix, P is a spatial self-attention weight matrix, and M is a matrix formed by original electroencephalogram data.
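Taken literally, the influence characteristic matrix is a single matrix product. In the sketch below the shapes (22 channels, 1000 samples) and the random row-stochastic stand-in for P are assumptions made only so the snippet runs.

```python
import numpy as np

rng = np.random.default_rng(2)
H, T = 22, 1000
M = rng.standard_normal((H, T))         # matrix formed by the original EEG data, one row per channel

P = rng.random((H, H))
P /= P.sum(axis=1, keepdims=True)       # stand-in spatial self-attention weight matrix

F = P @ M                               # influence characteristic matrix F = P x M:
                                        # row i mixes every channel, weighted by its
                                        # similarity to channel i
print(F.shape)                          # (22, 1000)
```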
5. The device control method based on motor imagery electroencephalogram signal of claim 3, wherein the spatial characteristics of the motor imagery electroencephalogram signal are calculated based on the following equation:
G=α×F+M
in the formula, G is the spatial characteristic of the motor imagery electroencephalogram signal, α is a characteristic parameter obtained by pre-training, F is an influence characteristic matrix, and M is a matrix formed by original electroencephalogram data.
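The residual form of the equation maps directly to one line of code; the fixed value of alpha below merely stands in for the pre-trained characteristic parameter and is an assumption.

```python
import numpy as np

rng = np.random.default_rng(3)
H, T = 22, 1000
M = rng.standard_normal((H, T))                         # matrix formed by the original EEG data
P = rng.random((H, H))
P /= P.sum(axis=1, keepdims=True)                       # stand-in attention weight matrix
F = P @ M                                               # influence characteristic matrix (claim 4)

alpha = 0.3                                             # stand-in for the pre-trained parameter
G = alpha * F + M                                       # spatial characteristic: attended features
                                                        # plus a residual copy of the raw data
print(G.shape)                                          # (22, 1000)
```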
6. The device control method based on motor imagery electroencephalogram signals of claim 1, wherein calculating the temporal characteristic of the motor imagery electroencephalogram signal comprises:
performing multi-scale time convolution operation on the original electroencephalogram data to obtain time characteristics corresponding to each time scale;
and fusing the time characteristics corresponding to each time scale to obtain the time characteristics of the motor imagery electroencephalogram signal.
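One plausible realisation of the multi-scale temporal convolution and fusion, sketched with plain NumPy. The kernel lengths, the number of filters per scale and fusion by concatenation are assumptions, not details taken from the claim.

```python
import numpy as np

def multiscale_temporal_features(M, kernel_lengths=(15, 45, 85), n_filters=4, seed=4):
    """M: (H, T) original EEG data. Convolve every channel with n_filters random
    kernels at each assumed time scale, then fuse the scales by concatenation."""
    rng = np.random.default_rng(seed)
    per_scale = []
    for L in kernel_lengths:                                       # one branch per time scale
        kernels = rng.standard_normal((n_filters, L))
        feats = np.stack([
            np.stack([np.convolve(row, k, mode="same") for k in kernels])
            for row in M
        ])                                                         # (H, n_filters, T) at this scale
        per_scale.append(feats)
    return np.concatenate(per_scale, axis=1)                       # fused: (H, 3 * n_filters, T)

M = np.random.default_rng(5).standard_normal((22, 1000))
print(multiscale_temporal_features(M).shape)                        # (22, 12, 1000)
```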
7. The device control method based on motor imagery electroencephalogram signals of any one of claims 1 to 6, wherein classifying the motor imagery electroencephalogram signal based on the temporal feature and the spatial feature comprises:
fusing the time characteristic and the space characteristic to obtain a space-time characteristic of the motor imagery electroencephalogram signal;
and inputting the space-time characteristics of the motor imagery electroencephalogram signals into a preset characteristic classification channel for classification to obtain the category of the motor imagery electroencephalogram signals.
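A minimal sketch of the fuse-then-classify step, assuming the space-time characteristic is the concatenation of the two flattened feature maps and the preset feature classification channel is a linear softmax head; neither detail is specified in the claim.

```python
import numpy as np

def classify(spatial_feat, temporal_feat, W, b):
    """Concatenate the spatial and temporal features into one space-time feature,
    apply a linear softmax head, and return (class index, class probabilities)."""
    x = np.concatenate([spatial_feat.ravel(), temporal_feat.ravel()])
    logits = W @ x + b
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(np.argmax(probs)), probs

rng = np.random.default_rng(6)
G = rng.standard_normal((22, 1000))                 # spatial feature (as in claim 5)
Tf = rng.standard_normal((22, 12, 1000))            # fused temporal feature (as in claim 6)
n_classes = 4
dim = G.size + Tf.size
W = rng.standard_normal((n_classes, dim)) * 0.01    # untrained stand-in classifier weights
b = np.zeros(n_classes)

label, probs = classify(G, Tf, W, b)
print("motor imagery class:", label, "probabilities:", np.round(probs, 3))
```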
8. A device control apparatus based on motor imagery electroencephalogram signals, comprising:
the spatial feature extraction module is used for acquiring original electroencephalogram data of a target person detected by a plurality of electroencephalogram signal channels, and performing a spatial convolution operation on the original electroencephalogram data to obtain a three-dimensional spatial feature matrix; calculating the similarity between any two electroencephalogram signal channels according to the three-dimensional spatial feature matrix to obtain a spatial self-attention weight matrix; and performing feature extraction on the original electroencephalogram data according to the spatial self-attention weight matrix to obtain the spatial features of the motor imagery electroencephalogram signals of the target person;
the time characteristic extraction module is used for calculating the time characteristic of the motor imagery electroencephalogram signal;
and the classification control module is used for classifying the motor imagery electroencephalogram signals based on the time characteristics and the space characteristics, and controlling equipment connected to the target person according to the classification result.
9. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202110441392.5A 2021-04-23 2021-04-23 Equipment control method, device and terminal based on motor imagery electroencephalogram signals Pending CN113133769A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110441392.5A CN113133769A (en) 2021-04-23 2021-04-23 Equipment control method, device and terminal based on motor imagery electroencephalogram signals

Publications (1)

Publication Number Publication Date
CN113133769A 2021-07-20

Family

ID=76811885

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110441392.5A Pending CN113133769A (en) 2021-04-23 2021-04-23 Equipment control method, device and terminal based on motor imagery electroencephalogram signals

Country Status (1)

Country Link
CN (1) CN113133769A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120172743A1 (en) * 2007-12-27 2012-07-05 Teledyne Licensing, Llc Coupling human neural response with computer pattern analysis for single-event detection of significant brain responses for task-relevant stimuli
CN107438398A (en) * 2015-01-06 2017-12-05 大卫·伯顿 Portable wearable monitoring system
CN109924990A (en) * 2019-03-27 2019-06-25 兰州大学 A kind of EEG signals depression identifying system based on EMD algorithm
CN110765920A (en) * 2019-10-18 2020-02-07 西安电子科技大学 Motor imagery classification method based on convolutional neural network
CN111317468A (en) * 2020-02-27 2020-06-23 腾讯科技(深圳)有限公司 Electroencephalogram signal classification method and device, computer equipment and storage medium
CN111616721A (en) * 2020-05-31 2020-09-04 天津大学 Emotion recognition system based on deep learning and brain-computer interface and application

Non-Patent Citations (1)

Title
Xiuling Liu et al., "Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI", Frontiers in Neuroscience *

Cited By (10)

Publication number Priority date Publication date Assignee Title
CN113499524A (en) * 2021-07-23 2021-10-15 华南理工大学 Auxiliary rehabilitation training system using motor imagery electroencephalogram detection
CN113491523A (en) * 2021-07-30 2021-10-12 济南汇医融工科技有限公司 Electrocardiosignal characteristic point detection method and system
CN113655884A (en) * 2021-08-17 2021-11-16 河北师范大学 Equipment control method, terminal and system
CN115251953A (en) * 2022-07-27 2022-11-01 河北师范大学 Motor imagery electroencephalogram signal identification method and device, terminal equipment and storage medium
CN115251953B (en) * 2022-07-27 2024-09-20 河北师范大学 Motor imagery electroencephalogram signal identification method, device, terminal equipment and storage medium
CN115348074A (en) * 2022-08-12 2022-11-15 北京航空航天大学 Deep space-time mixed cloud data center network flow real-time detection method
CN115348074B (en) * 2022-08-12 2024-06-28 北京航空航天大学 Cloud data center network flow real-time detection method for deep space-time mixing
CN117137498A (en) * 2023-09-15 2023-12-01 北京理工大学 Emergency situation detection method based on attention orientation and exercise intention electroencephalogram
CN118044814A (en) * 2024-04-16 2024-05-17 小舟科技有限公司 Attention identification and adjustment method based on electroencephalogram signals and computer equipment
CN118044814B (en) * 2024-04-16 2024-06-18 小舟科技有限公司 Attention identification and adjustment method based on electroencephalogram signals and computer equipment

Similar Documents

Publication Publication Date Title
Altaheri et al. Physics-informed attention temporal convolutional network for EEG-based motor imagery classification
CN113133769A (en) Equipment control method, device and terminal based on motor imagery electroencephalogram signals
CN113627518B (en) Method for realizing neural network brain electricity emotion recognition model by utilizing transfer learning
CN113693613B (en) Electroencephalogram signal classification method, electroencephalogram signal classification device, computer equipment and storage medium
CN111160139B (en) Electrocardiosignal processing method and device and terminal equipment
CN113143295A (en) Equipment control method and terminal based on motor imagery electroencephalogram signals
CN112990008B (en) Emotion recognition method and system based on three-dimensional characteristic diagram and convolutional neural network
CN114469120B (en) Multi-scale Dtw-BiLstm-Gan electrocardiosignal generation method based on similarity threshold migration
CN113712573A (en) Electroencephalogram signal classification method, device, equipment and storage medium
CN111046969A (en) Data screening method and device, storage medium and electronic equipment
Asghar et al. Semi-skipping layered gated unit and efficient network: hybrid deep feature selection method for edge computing in EEG-based emotion classification
Irmak A novel implementation of deep-learning approach on malaria parasite detection from thin blood cell images
CN114595725B (en) Electroencephalogram signal classification method based on addition network and supervised contrast learning
Soltani et al. Improved algorithm for multiple sclerosis diagnosis in MRI using convolutional neural network
CN111860056B (en) Blink-based living body detection method, blink-based living body detection device, readable storage medium and blink-based living body detection equipment
CN114781441A (en) EEG motor imagery classification method and multi-space convolution neural network model
Charisma et al. Transfer learning with Densenet201 architecture model for potato leaf disease classification
CN114027786A (en) Sleep disordered breathing detection method and system based on self-supervision memory network
CN115886833A (en) Electrocardiosignal classification method and device, computer readable medium and electronic equipment
CN115169384A (en) Electroencephalogram classification model training method, intention identification method, equipment and medium
Rajaguru et al. Alcoholic EEG signal classification using multi-heuristic classifiers with stochastic gradient descent technique for tuning the hyperparameters
CN115251953B (en) Motor imagery electroencephalogram signal identification method, device, terminal equipment and storage medium
Zhao et al. Multiscale Global Prompt Transformer for EEG-Based Driver Fatigue Recognition
CN118013352B (en) EEG-fNIRS motor imagery identification method and device based on heterogram network
CN113553816B (en) Method and device for automatically optimizing scale based on small data volume sample of intelligent model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210720