CN114492560A - Electroencephalogram emotion classification method based on transfer learning - Google Patents

Electroencephalogram emotion classification method based on transfer learning

Info

Publication number
CN114492560A
CN114492560A
Authority
CN
China
Prior art keywords
electroencephalogram
emotion
layer
domain
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111513601.9A
Other languages
Chinese (zh)
Inventor
He Juhou (何聚厚)
Zheng Xiaolong (郑晓龙)
Fang Bei (房蓓)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shaanxi Normal University
Original Assignee
Shaanxi Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shaanxi Normal University filed Critical Shaanxi Normal University
Priority to CN202111513601.9A priority Critical patent/CN114492560A/en
Publication of CN114492560A publication Critical patent/CN114492560A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/2155Generating training patterns; Bootstrap methods, e.g. bagging or boosting characterised by the incorporation of unlabelled data, e.g. multiple instance learning [MIL], semi-supervised techniques using expectation-maximisation [EM] or naïve labelling
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • A61B5/372Analysis of electroencephalograms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/047Probabilistic or stochastic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Abstract

The electroencephalogram emotion classification method based on transfer learning comprises the following steps: S100: acquiring electroencephalogram signals as training samples using 62-lead electroencephalogram data acquisition equipment; S200: preprocessing the training samples; S300: constructing an electroencephalogram emotion migration model based on a deep domain-adaptive network, the model consisting of a feature extractor, a class predictor and a domain classifier; S400: inputting the preprocessed training samples into the constructed migration model, and mixing the preprocessed labeled source-domain emotion electroencephalogram data with the unlabeled target-domain emotion electroencephalogram data to iteratively train the migration model; S500: selecting the model with the highest accuracy during training as the final migration model for electroencephalogram emotion recognition; S600: classifying electroencephalogram emotion using the final migration model. The method enables the migration model to effectively solve the problem of cross-subject migration of electroencephalogram emotion.

Description

Electroencephalogram emotion classification method based on transfer learning
Technical Field
The disclosure belongs to the technical field of biological feature recognition and artificial intelligence, and particularly relates to an electroencephalogram emotion classification method based on transfer learning.
Background
Emotions play a vital role in daily life, scientific research, and study. Brain-computer interfaces serve as a bridge for communication and cooperation between humans and machines; if a machine can recognize human emotion, the safety and efficiency of human-machine cooperation can be greatly enhanced. Emotion recognition technology based on electroencephalogram (EEG) signals has therefore received wide attention.
At present, researchers use convolutional neural networks to recognize electroencephalogram features, achieving higher accuracy than traditional SVMs. However, the non-stationarity of EEG signals makes models trained on a single subject difficult to apply to other subjects. Emotional EEG data, particularly labeled data, are difficult to collect, and it is hard to directly train a model that generalizes well across subjects from data containing the EEG characteristics of multiple subjects. These two problems make it difficult to apply emotion recognition models across subjects and hinder the generalization of emotional brain-computer interfaces. To address the cross-subject application problem, some researchers have turned to domain adaptation methods. Domain adaptation is a subfield of transfer learning whose goal is to preserve the discriminative information of different classes while reducing the differences between domains; earlier domain adaptation methods tend to reduce the distance between the source domain and the target domain in a particular feature space. Such methods adapt only the marginal distributions and suffer from low data utilization and poor performance.
Disclosure of Invention
In order to solve the above problems, the present disclosure provides an electroencephalogram emotion classification method based on transfer learning, which comprises the following steps:
S100: acquiring electroencephalogram signals using 62-lead electroencephalogram data acquisition equipment, and obtaining labeled emotion electroencephalogram data from a source domain and unlabeled emotion electroencephalogram data from a target domain as training samples;
S200: preprocessing the training samples;
S300: constructing an electroencephalogram emotion migration model based on a deep domain-adaptive network, the model consisting of a feature extractor, a class predictor and a domain classifier;
S400: inputting the preprocessed training samples into the constructed migration model, and mixing the preprocessed labeled source-domain emotion electroencephalogram data with the unlabeled target-domain emotion electroencephalogram data to iteratively train the migration model;
S500: selecting the model with the highest accuracy during training as the final migration model for electroencephalogram emotion recognition;
S600: classifying electroencephalogram emotion using the final migration model.
Through this technical scheme, the method addresses the difficulty of migrating electroencephalogram emotion models across subjects and reduces the difficulty and training cost of electroencephalogram emotion recognition. Because EEG data differ greatly between individuals, trained models generalize poorly and cannot be deployed rapidly; to address this, an electroencephalogram emotion migration model based on a deep domain-adaptive network is constructed. Differential entropy is computed separately for 5 frequency bands of the raw EEG data and converted into a three-dimensional matrix according to the electrode positions; an attention-based DenseNet network then extracts deep features from the EEG data, which improves feature utilization, lowers the required data volume, and reduces computation. The extracted deep features are fed into the deep domain-adaptive network, and the source domain and target domain are confused through a gradient reversal layer so that the distributions of the two domains become more similar, improving the performance of the migration model and the accuracy of emotion recognition. The migration model thus effectively solves the problem of cross-subject migration of electroencephalogram emotion, improves its effectiveness in cross-subject emotion recognition, and has good application prospects.
Drawings
Fig. 1 is a flowchart of an electroencephalogram emotion classification method based on transfer learning provided in an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of the 62-channel electrode distribution of an electroencephalogram acquisition apparatus provided in one embodiment of the present disclosure;
FIG. 3 is a schematic diagram of the conversion of differential entropy features into a two-dimensional matrix in one embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a process of mapping electroencephalogram electrodes to a two-dimensional plane according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a preprocessing process of electroencephalogram emotion data in one embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a migration model structure in one embodiment of the present disclosure;
FIG. 7 is a schematic diagram of feature extractor network architecture parameters in one embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a network structure of a feature extractor in an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of a Dense Block (Dense Block) structure in one embodiment of the present disclosure;
FIG. 10 is a schematic diagram of a transition layer module configuration in one embodiment of the present disclosure;
FIG. 11 is a schematic structural diagram of a spatial attention module in an embodiment of the present disclosure.
Detailed Description
Specific embodiments of the present invention will be described in more detail below with reference to figs. 1 to 11. In one embodiment, as shown in fig. 1, the present disclosure provides an electroencephalogram emotion classification method based on transfer learning, which comprises the following steps:
S100: acquiring electroencephalogram signals using 62-lead electroencephalogram data acquisition equipment, and obtaining labeled emotion electroencephalogram data from a source domain and unlabeled emotion electroencephalogram data from a target domain as training samples;
S200: preprocessing the training samples;
S300: constructing an electroencephalogram emotion migration model based on a deep domain-adaptive network, the model consisting of a feature extractor, a class predictor and a domain classifier;
S400: inputting the preprocessed training samples into the constructed migration model, and mixing the preprocessed labeled source-domain emotion electroencephalogram data with the unlabeled target-domain emotion electroencephalogram data to iteratively train the migration model;
S500: selecting the model with the highest accuracy during training as the final migration model for electroencephalogram emotion recognition;
S600: classifying electroencephalogram emotion using the final migration model.
For this embodiment, the method constructs an electroencephalogram emotion migration model based on a deep domain-adaptive network: the differential entropy features extracted from 5 frequency bands are mapped onto a two-dimensional plane according to the electrode positions to form two-dimensional matrices, and the 5 two-dimensional matrices are stacked into a three-dimensional matrix. An attention-based DenseNet network serves as the feature extractor to extract deep features from the EEG data, and the source domain and target domain are confused through a gradient reversal layer so that the distributions of the two domains become more similar, improving the performance of the migration model and the accuracy of emotion recognition.
When acquiring electroencephalogram signals with the 62-lead electroencephalogram data acquisition equipment, the subject wears the equipment and sits in the laboratory watching emotion-eliciting materials prepared in advance, producing the corresponding emotional electroencephalogram signals; the sampling frequency is set to 1000 Hz. Labeled emotion electroencephalogram data from the source domain and unlabeled emotion electroencephalogram data from the target domain are collected as training samples. The 62-channel electrode distribution of the acquisition equipment is shown in fig. 2.
After every epoch, the current model parameters are automatically recorded if they achieve the highest accuracy so far, so that when training finishes, the model with the highest classification accuracy is obtained.
In another embodiment, the step S200 further includes:
S201: performing band-pass filtering on the labeled emotion electroencephalogram data from the source domain and the unlabeled emotion electroencephalogram data from the target domain to filter out noise;
S202: down-sampling the filtered data to 200 Hz;
S203: dividing the filtered and down-sampled data into 1-second segments, and computing the differential entropy of each of the 5 frequency bands of the divided data;
S204: mapping the differential entropy onto a two-dimensional plane according to the electrode positions, converting it into 5 two-dimensional matrices;
S205: stacking the 5 two-dimensional matrices into a three-dimensional matrix.
For this embodiment, the preprocessing of the electroencephalogram data is as follows. First, a Butterworth band-pass filter applies 1-50 Hz band-pass filtering to the collected data to filter out noise. The data are then down-sampled to 200 Hz to reduce computation, i.e. the sampling frequency of the raw EEG data is reduced from the original 1000 Hz to 200 Hz. The filtered and down-sampled data are then segmented into 1-second segments, i.e. 62x200 data points per second. Differential entropy is computed separately on 5 frequency bands of the segmented data: delta (1-3 Hz), theta (4-7 Hz), alpha (8-13 Hz), beta (14-30 Hz) and gamma (31-50 Hz). The differential entropy is calculated as:
DE=-∫p(x)log(p(x))dx
where x is a random variable and p(x) is its probability density function. If an EEG segment of a given length approximately follows a Gaussian distribution N(μ, σ²), its differential entropy reduces to:
DE = (1/2)log(2πeσ²)
where μ denotes the mean and σ² the variance. The obtained differential entropy features are then mapped onto a two-dimensional plane by electrode position through a transfer function and converted into 5 two-dimensional matrices, yielding 5 matrices of 9x9, as shown in figs. 3 and 4. The matrices are then interpolated by bicubic interpolation to obtain 5 matrices of 32x32, and the two-dimensional matrices of the 5 frequency bands are stacked to form a three-dimensional 32x32x5 matrix, as shown in fig. 5.
Combining the differential entropy features of all 5 frequency bands yields a better training effect than using the differential entropy features of any single frequency band alone.
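By way of illustration only, the Python sketch below follows the preprocessing just described (1-50 Hz Butterworth filtering, down-sampling from 1000 Hz to 200 Hz, 1-second segmentation, per-band differential entropy under the Gaussian assumption, scattering onto a 9x9 electrode grid, bicubic interpolation to 32x32). The grid index arrays are placeholders, not the patent's layout; the actual 62-electrode mapping follows figs. 3 and 4.

```python
import numpy as np
from scipy.signal import butter, filtfilt, decimate
import cv2  # used here for bicubic interpolation

BANDS = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 13),
         "beta": (14, 30), "gamma": (31, 50)}

# Placeholder electrode-to-grid map (row/column of each of the 62 channels
# in the 9x9 plane); the real layout follows the electrode positions in fig. 4.
GRID_ROWS = np.repeat(np.arange(9), 7)[:62]
GRID_COLS = np.tile(np.arange(1, 8), 9)[:62]

def preprocess_trial(eeg, fs=1000):
    """eeg: (62, n_samples) raw signal -> list of (32, 32, 5) feature cubes."""
    b, a = butter(4, [1, 50], btype="bandpass", fs=fs)  # 1-50 Hz band-pass
    eeg = filtfilt(b, a, eeg, axis=1)
    eeg = decimate(eeg, 5, axis=1)                      # 1000 Hz -> 200 Hz
    fs = 200
    cubes = []
    for start in range(0, eeg.shape[1] - fs + 1, fs):   # 1-second segments (62x200)
        seg = eeg[:, start:start + fs]
        planes = []
        for lo, hi in BANDS.values():
            bb, ba = butter(4, [lo, hi], btype="bandpass", fs=fs)
            band = filtfilt(bb, ba, seg, axis=1)
            # Differential entropy of a Gaussian signal: 0.5 * ln(2*pi*e*variance)
            de = 0.5 * np.log(2 * np.pi * np.e * band.var(axis=1))
            grid = np.zeros((9, 9), dtype=np.float32)
            grid[GRID_ROWS, GRID_COLS] = de             # scatter DE onto the 9x9 plane
            planes.append(cv2.resize(grid, (32, 32), interpolation=cv2.INTER_CUBIC))
        cubes.append(np.stack(planes, axis=-1))         # one (32, 32, 5) cube per second
    return cubes
```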
In another embodiment, the migration model structure is shown in fig. 6, and the step S300 further includes:
S301: extracting deep features from the preprocessed electroencephalogram data using an attention-based DenseNet network;
S302: inputting the extracted deep features into the class predictor and the domain classifier respectively, and computing loss values from the results, wherein a gradient reversal layer placed between the domain classifier and the feature extractor acts as an identity transform in forward propagation and automatically reverses the gradient direction in backward propagation.
In this embodiment, minimizing the class classification loss achieves accurate emotion classification, while maximizing the domain classification loss confuses the source domain data and the target domain data.
In another embodiment, the feature extractor uses an attention-based DenseNet network with the final fully connected layer removed.
For this embodiment, the network architecture parameters of the feature extractor are shown in fig. 7. Thanks to its dense connection mechanism, the attention-based DenseNet network can fully utilize the features of all layers, so feature utilization is higher, which alleviates, to some extent, the problem of limited training data.
In another embodiment, as shown in fig. 8, the DenseNet network consists of 1 input layer, 2 dense blocks, 1 transition layer, 2 spatial attention modules, and 1 output layer.
In another embodiment, the input layer consists of one 3x3 convolutional layer; the output layer consists of a batch normalization layer, a ReLU activation function, and a global average pooling layer.
In another embodiment, as shown in figs. 9 and 10, the dense block consists of a bottleneck layer, a batch normalization layer, a ReLU activation function, and a 3x3 convolutional layer, wherein the bottleneck layer consists of a batch normalization layer, a ReLU activation function, and a 1x1 convolutional layer; the transition layer consists of a batch normalization layer, a ReLU activation function, a 1x1 convolutional layer, and a 2x2 average pooling layer.
For this embodiment, a Dense Block is a module containing many layers; each layer has the same feature map size, there are no pooling layers, and dense connections link the layers. The transition layer connects two adjacent Dense Blocks and reduces the feature map size through its pooling layer, thereby reducing computation.
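As a sketch only, the PyTorch modules below mirror the dense block and transition layer structure just described; the growth rate and layer count are illustrative assumptions, not values from the patent.

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """Bottleneck (BN -> ReLU -> 1x1 conv) followed by BN -> ReLU -> 3x3 conv."""
    def __init__(self, in_ch, growth=12):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_ch), nn.ReLU(inplace=True),
            nn.Conv2d(in_ch, 4 * growth, kernel_size=1, bias=False),  # bottleneck
            nn.BatchNorm2d(4 * growth), nn.ReLU(inplace=True),
            nn.Conv2d(4 * growth, growth, kernel_size=3, padding=1, bias=False),
        )

    def forward(self, x):
        # Dense connectivity: concatenate the new feature maps onto all earlier ones.
        return torch.cat([x, self.body(x)], dim=1)

class DenseBlock(nn.Sequential):
    """A stack of dense layers; the channel count grows by `growth` per layer."""
    def __init__(self, in_ch, num_layers=4, growth=12):
        super().__init__(*[DenseLayer(in_ch + i * growth, growth)
                           for i in range(num_layers)])

class TransitionLayer(nn.Sequential):
    """BN -> ReLU -> 1x1 conv -> 2x2 average pooling, shrinking the feature maps."""
    def __init__(self, in_ch, out_ch):
        super().__init__(
            nn.BatchNorm2d(in_ch), nn.ReLU(inplace=True),
            nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False),
            nn.AvgPool2d(2),
        )
```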
In another embodiment, as shown in FIG. 11, the spatial attention module consists of one global average pooling layer, one global maximum pooling layer, one 3x3 convolutional layer and one Sigmoid activation function.
For this embodiment, the spatial attention module reassigns weights so that the network focuses on the features that contribute most. It first performs channel-wise global maximum pooling and global average pooling on the input features, concatenates the two results along the channel dimension, passes them through a 3x3 convolutional layer, obtains a spatial attention weight via a Sigmoid activation, and finally multiplies this weight with the module's input features to produce the final spatial attention features.
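A minimal PyTorch sketch of this module, under the channel-wise pooling reading above, might look as follows.

```python
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    def __init__(self):
        super().__init__()
        # 2 input channels: the channel-wise max map and the channel-wise mean map.
        self.conv = nn.Conv2d(2, 1, kernel_size=3, padding=1, bias=False)

    def forward(self, x):
        max_map, _ = x.max(dim=1, keepdim=True)   # channel-based global max pooling
        avg_map = x.mean(dim=1, keepdim=True)     # channel-based global average pooling
        attn = torch.sigmoid(self.conv(torch.cat([max_map, avg_map], dim=1)))
        return x * attn                           # re-weight the module's input features
```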
In another embodiment, the class predictor and the domain classifier each use two fully connected layers with 50 units each; the activation function of the class predictor is a Softmax function, and the activation function of the domain classifier is a Sigmoid function.
For this embodiment, the extracted deep features are input into the class predictor and the domain classifier respectively, and loss values are computed from the results; minimizing the class classification loss achieves accurate emotion classification, while maximizing the domain classification loss confuses the source domain data and the target domain data. A gradient reversal layer is placed between the domain classifier and the feature extractor: it acts as an identity transform in forward propagation and automatically reverses the gradient direction in backward propagation, confusing the source domain and target domain so that the distributions of the two domains become more similar. Mathematically:
R_λ(x) = x (1)
dR_λ(x)/dx = −λI (2)
where R_λ(x) is the pseudo-function implemented by the gradient reversal layer, x represents a parameter propagated during the gradient update, I is the identity matrix, and λ is a parameter. Equation (1) expresses the identity transformation in forward propagation; equation (2) expresses the automatic gradient reversal in backward propagation.
In the gradient reversal layer, the parameter λ is not fixed but varies dynamically according to:
λ = 2/(1 + e^(−γp)) − 1
where p is the relative progress of the iteration process, i.e. the ratio of the current iteration count to the total number of iterations, and γ is a constant set to 10.
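For illustration, one common way to realize this behavior (an assumption here, since the patent gives only the formulas) is PyTorch's autograd.Function interface:

```python
import numpy as np
import torch

class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; gradient scaled by -lambda in the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)               # equation (1): R_lambda(x) = x

    @staticmethod
    def backward(ctx, grad_output):
        # Equation (2): dR/dx = -lambda * I, i.e. flip and scale the gradient.
        return -ctx.lam * grad_output, None

def grl_lambda(step, total_steps, gamma=10.0):
    """Dynamic schedule: lambda = 2 / (1 + exp(-gamma * p)) - 1."""
    p = step / total_steps                # relative training progress
    return 2.0 / (1.0 + np.exp(-gamma * p)) - 1.0
```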
In another embodiment, an Adam optimizer is used to optimize the loss function during the model training process.
For this embodiment, the network uses an Adam optimizer to optimize the loss function during training, with an initial learning rate of 0.0001. The training batch size is 64 and the number of epochs is set to 100, where one complete pass over all training samples is called an epoch. After the 100 epochs, the model with the highest accuracy is selected as the final migration model for emotion recognition. The class classification loss is minimized to achieve accurate emotion classification, and the domain classification loss is maximized to further confuse the source domain data and the target domain data.
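Putting the pieces together, a condensed training-loop sketch under the stated settings (Adam, learning rate 0.0001, batch size 64, 100 epochs) might read as below. It builds on the GradientReversal and grl_lambda sketches above; feature_extractor, class_predictor, domain_classifier, the two data loaders, and the evaluate() helper are assumptions for illustration, not named in the patent.

```python
import torch
import torch.nn as nn

params = (list(feature_extractor.parameters())
          + list(class_predictor.parameters())
          + list(domain_classifier.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)   # initial learning rate 0.0001
cls_loss_fn, dom_loss_fn = nn.CrossEntropyLoss(), nn.BCELoss()

best_acc, step, total_steps = 0.0, 0, 100 * len(source_loader)
for epoch in range(100):                         # 100 epochs, batch size 64 per loader
    for (xs, ys), (xt, _) in zip(source_loader, target_loader):
        lam = grl_lambda(step, total_steps)
        feats = feature_extractor(torch.cat([xs, xt]))   # mix source and target batches
        # Class loss on the labelled source half only (minimized).
        cls_loss = cls_loss_fn(class_predictor(feats[:len(xs)]), ys)
        # Domain loss on everything; the GRL flips its gradient, so the
        # feature extractor is pushed to confuse the two domains.
        dom_labels = torch.cat([torch.zeros(len(xs)), torch.ones(len(xt))])
        dom_pred = domain_classifier(GradientReversal.apply(feats, lam)).squeeze(1)
        loss = cls_loss + dom_loss_fn(dom_pred, dom_labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        step += 1
    acc = evaluate(feature_extractor, class_predictor)   # hypothetical accuracy check
    if acc > best_acc:                                   # keep the best epoch's weights
        best_acc = acc
        torch.save({"features": feature_extractor.state_dict(),
                    "classifier": class_predictor.state_dict()}, "best_model.pt")
```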
Although the embodiments of the present invention have been described above with reference to the accompanying drawings, the present invention is not limited to the above-described embodiments and application fields, and the above-described embodiments are illustrative, instructive, and not restrictive. Those skilled in the art, having the benefit of this disclosure, may effect numerous modifications thereto without departing from the scope of the invention as defined by the appended claims.

Claims (10)

1. An electroencephalogram emotion classification method based on transfer learning, comprising the following steps:
S100: acquiring electroencephalogram signals using 62-lead electroencephalogram data acquisition equipment, and obtaining labeled emotion electroencephalogram data from a source domain and unlabeled emotion electroencephalogram data from a target domain as training samples;
S200: preprocessing the training samples;
S300: constructing an electroencephalogram emotion migration model based on a deep domain-adaptive network, wherein the model consists of a feature extractor, a class predictor and a domain classifier;
S400: inputting the preprocessed training samples into the constructed migration model, and mixing the preprocessed labeled source-domain emotion electroencephalogram data with the unlabeled target-domain emotion electroencephalogram data to iteratively train the migration model;
S500: selecting the model with the highest accuracy after training as the final migration model for electroencephalogram emotion recognition;
S600: classifying electroencephalogram emotion using the final migration model.
2. The method according to claim 1, wherein step S200 further comprises:
S201: performing band-pass filtering on the labeled emotion electroencephalogram data from the source domain and the unlabeled emotion electroencephalogram data from the target domain to filter out noise;
S202: down-sampling the filtered data to 200 Hz;
S203: dividing the filtered and down-sampled data into 1-second segments, and computing the differential entropy of each of the 5 frequency bands of the divided data;
S204: mapping the differential entropy onto a two-dimensional plane according to the electrode positions, converting it into 5 two-dimensional matrices;
S205: stacking the 5 two-dimensional matrices into a three-dimensional matrix.
3. The method of claim 1, wherein step S300 further comprises:
S301: extracting deep features from the preprocessed electroencephalogram data using an attention-based DenseNet network;
S302: inputting the extracted deep features into the class predictor and the domain classifier respectively, and computing loss values from the results, wherein a gradient reversal layer placed between the domain classifier and the feature extractor acts as an identity transform in forward propagation and automatically reverses the gradient direction in backward propagation.
4. The method of claim 1, wherein the feature extractor uses an attention-based DenseNet network with the last fully connected layer removed.
5. The method of claim 4, wherein the DenseNet network consists of 1 input layer, 2 dense blocks, 1 transition layer, 2 spatial attention modules, and 1 output layer.
6. The method of claim 4, wherein the input layer consists of one 3x3 convolutional layer, and the output layer consists of a batch normalization layer, a ReLU activation function, and a global average pooling layer.
7. The method of claim 4, wherein the dense block consists of a bottleneck layer, a batch normalization layer, a ReLU activation function, and a 3x3 convolutional layer, the bottleneck layer consisting of a batch normalization layer, a ReLU activation function, and a 1x1 convolutional layer; and the transition layer consists of a batch normalization layer, a ReLU activation function, a 1x1 convolutional layer, and a 2x2 average pooling layer.
8. The method of claim 4, wherein the spatial attention module is comprised of a global average pooling layer, a global maximum pooling layer, a 3x3 convolutional layer, and a Sigmoid activation function.
9. The method of claim 1, wherein the class predictor and the domain classifier each use two fully connected layers with 50 units each, the activation function of the class predictor being a Softmax function and the activation function of the domain classifier being a Sigmoid function.
10. The method of claim 1, wherein an Adam optimizer is used to optimize the loss function during the model training process.
CN202111513601.9A 2021-12-06 2021-12-06 Electroencephalogram emotion classification method based on transfer learning Pending CN114492560A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111513601.9A CN114492560A (en) 2021-12-06 2021-12-06 Electroencephalogram emotion classification method based on transfer learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111513601.9A CN114492560A (en) 2021-12-06 2021-12-06 Electroencephalogram emotion classification method based on transfer learning

Publications (1)

Publication Number Publication Date
CN114492560A true CN114492560A (en) 2022-05-13

Family

ID=81492204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111513601.9A Pending CN114492560A (en) 2021-12-06 2021-12-06 Electroencephalogram emotion classification method based on transfer learning

Country Status (1)

Country Link
CN (1) CN114492560A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115105079A (en) * 2022-07-26 2022-09-27 杭州罗莱迪思科技股份有限公司 Electroencephalogram emotion recognition method based on self-attention mechanism and application thereof


Similar Documents

Publication Publication Date Title
CN110069958B (en) Electroencephalogram signal rapid identification method of dense deep convolutional neural network
CN111062250B (en) Multi-subject motor imagery electroencephalogram signal identification method based on deep feature learning
CN112001306A (en) Electroencephalogram signal decoding method for generating neural network based on deep convolution countermeasure
CN112244873A (en) Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network
CN110059565A (en) A kind of P300 EEG signal identification method based on improvement convolutional neural networks
CN111931656B (en) User independent motor imagery classification model training method based on transfer learning
CN115919330A (en) EEG Emotional State Classification Method Based on Multi-level SE Attention and Graph Convolution
CN113010013A (en) Wasserstein distance-based motor imagery electroencephalogram migration learning method
CN112465069A (en) Electroencephalogram emotion classification method based on multi-scale convolution kernel CNN
CN115105076A (en) Electroencephalogram emotion recognition method and system based on dynamic convolution residual multi-source migration
CN114492560A (en) Electroencephalogram emotion classification method based on transfer learning
CN110033077A (en) Neural network training method and device
CN111428601B (en) P300 signal identification method, device and storage medium based on MS-CNN
CN113052099A (en) SSVEP classification method based on convolutional neural network
CN113842151B (en) Cross-test EEG cognitive state detection method based on efficient multi-source capsule network
Serkan et al. VarioGram–A colorful time-graph representation for time series
CN114757273A (en) Electroencephalogram signal classification method based on collaborative contrast regularization average teacher model
CN110448273B (en) Low-power-consumption epilepsy prediction circuit based on support vector machine
CN113269159A (en) Gesture recognition method fusing electromyographic signals and visual images
CN114358057A (en) Cross-individual electroencephalogram emotion recognition method, system, device and medium
CN117257242B (en) Epilepsy classification method and system
CN115919313B (en) Facial myoelectricity emotion recognition method based on space-time characteristics
CN115982558B (en) Electroencephalogram movement intention classification model building method and application thereof
CN109685031A (en) A kind of brain-computer interface midbrain signal characteristics classification method and system
CN118051831A (en) Underwater sound target identification method based on CNN-transducer cooperative network model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination