CN114469137B - Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model - Google Patents
Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model
- Publication number: CN114469137B
- Application number: CN202111560169.9A
- Authority: CN (China)
- Prior art keywords: emotion; electroencephalogram; space; data; time
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/372—Analysis of electroencephalograms
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
Abstract
The invention discloses a cross-domain electroencephalogram emotion recognition method and system based on a space-time feature fusion model. The method comprises the following steps: (1) acquiring two electroencephalogram emotion databases, one used as a training set and the other as a test set; (2) establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier connected in sequence from front to back, wherein the space-time feature extractor first extracts time-spectrum features of the electroencephalogram emotion data, converts the extracted time-spectrum features into three-dimensional tensors and then extracts space-time features, and the emotion classifier performs emotion classification according to the space-time features; (3) training the electroencephalogram emotion recognition network; (4) acquiring the electroencephalogram emotion data to be recognized and inputting them as test-set samples into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category. The invention achieves higher recognition accuracy.
Description
Technical Field
The invention relates to emotion recognition technology, and in particular to a cross-domain electroencephalogram emotion recognition method and system based on a space-time feature fusion model.
Background
Emotion is critical to the quality and perception of everyday human experience. With the development of brain-computer interfaces and artificial intelligence, emotion recognition based on electroencephalogram (EEG) signals has become a new approach to emotion research. The EEG signal contains a large amount of emotion-related information, offers high temporal resolution, and is difficult to disguise, which gives it great advantages in the field of real-time emotion recognition. Technical means that can judge a person's emotional state accurately and in real time have great application value in fields such as driver fatigue detection, assisted treatment of depression, and evaluation of emotional interaction between teachers and students.
The electroencephalogram is very closely related to emotion. The amygdala participates in emotional perception, automatic emotional responses, and memory. Amygdala activation appears to be more related to negative emotions; relative activation of the right frontal lobe is associated with negative emotions (e.g., fear or disgust), while relative activation of the left frontal lobe is associated with positive emotions (e.g., happiness or joy). EEG features are mainly drawn from the delta, theta, alpha, beta and gamma frequency bands. These features characterize the signal from different aspects, so the various valid features extracted from the signal support better classification.
To build an excellent model, the user typically needs to collect enough labeled data for calibration before using the brain-computer interface. This calibration process is time-consuming and laborious, and is a significant problem facing current affective brain-computer interfaces. In addition, the recognition accuracy of existing methods still needs further improvement.
Disclosure of Invention
The invention aims to: aiming at the problems existing in the prior art, the invention provides a cross-domain electroencephalogram emotion recognition method and system based on a space-time feature fusion model, which achieve higher recognition accuracy and do not require the data to be calibrated in advance.
The technical scheme is as follows: the cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model comprises the following steps:
(1) Acquiring two electroencephalogram emotion databases, one used as a training set and the other as a test set, each comprising a plurality of electroencephalogram emotion data and the corresponding emotion category labels;
(2) Establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier connected in sequence from front to back, wherein the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed electroencephalogram emotion data, the space-time feature extractor first extracts time-spectrum features of the electroencephalogram emotion data, converts them into three-dimensional tensors and then extracts space-time features, and the emotion classifier classifies emotions according to the space-time features;
(3) Training the electroencephalogram emotion recognition network, during which each training-set electroencephalogram emotion datum and its corresponding emotion category label are taken as one sample and input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
(4) Acquiring the electroencephalogram emotion data to be recognized and inputting them as test-set samples into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
Further, the preprocessor specifically comprises a filter and a downsampler connected in sequence, the filter performing finite-impulse-response filtering on the electroencephalogram emotion data and the downsampler downsampling the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
Further, the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\top}, \qquad \tilde{X}_n = \bar{R}^{-1/2} X_n$$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion datum, $X_n \in \mathbb{R}^{C \times t}$ is the n-th electroencephalogram emotion datum, N is the total number of electroencephalogram emotion data, C is the number of EEG electrodes (C = 62), t is the number of time points in each datum, and $\tilde{x}_n^c$ is the time series of the c-th electrode channel in the aligned data.
Further, the space-time feature extractor specifically includes:
a time-spectrum feature extraction module for extracting spectral features from the aligned electroencephalogram emotion data $\tilde{X}_n$ by the following short-time Fourier transform, forming one-dimensional sequence features:

$$S_n^c(t,f) = \int_{-\infty}^{+\infty} \tilde{x}_n^c(\tau)\,\omega(\tau-t)\,e^{-j2\pi f\tau}\,d\tau$$

where $S_n^c(t,f)$ is the spectral feature of the c-th electrode channel, $\tilde{x}_n^c(\tau)$ is the time series of the c-th electrode channel in the aligned electroencephalogram emotion data $\tilde{X}_n$, $\omega(\tau-t)$ is the short-time analysis window, and f represents frequency;
a one-dimensional-to-three-dimensional conversion module for rearranging the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where H is the height, W is the width, and r is the number of frequency features extracted from each channel;
a cascaded deep convolutional recurrent neural network for extracting space-time features from the three-dimensional tensors, the network being a 2D-CNN + BiLSTM architecture in which the 2D-CNN extracts the spatial features of each datum, the spatial features are fed into the BiLSTM, and the BiLSTM extracts temporal features from them to obtain the space-time features.
Further, the classifier is specifically configured to perform emotion classification according to the space-time features using softmax and to output the emotion classification result.
The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model provided by the invention comprises:
a database module, wherein two electroencephalogram emotion databases are used as the training set and the test set respectively, each comprising a plurality of electroencephalogram emotion data and the corresponding emotion category labels;
a network establishing module for establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier connected in sequence from front to back, wherein the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed electroencephalogram emotion data, the space-time feature extractor first extracts time-spectrum features of the electroencephalogram emotion data, converts them into three-dimensional tensors and then extracts space-time features, and the emotion classifier classifies emotions according to the space-time features;
a network training module for training the electroencephalogram emotion recognition network, during which each training-set electroencephalogram emotion datum and its corresponding emotion category label are taken as one sample and input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
and an emotion recognition module for acquiring the electroencephalogram emotion data to be recognized and inputting them as test-set samples into the trained electroencephalogram emotion recognition network to obtain the recognized emotion categories.
Further, the preprocessor specifically comprises a filter and a downsampler connected in sequence, the filter performing finite-impulse-response filtering on the electroencephalogram emotion data and the downsampler downsampling the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
Further, the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\top}, \qquad \tilde{X}_n = \bar{R}^{-1/2} X_n$$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion datum, $X_n \in \mathbb{R}^{C \times t}$ is the n-th electroencephalogram emotion datum, N is the total number of electroencephalogram emotion data, C is the number of EEG electrodes (C = 62), t is the number of time points in each datum, and $\tilde{x}_n^c$ is the time series of the c-th electrode channel in the aligned data.
Further, the space-time feature extractor specifically includes:
a time-spectrum feature extraction module for extracting spectral features from the aligned electroencephalogram emotion data $\tilde{X}_n$ by the following short-time Fourier transform, forming one-dimensional sequence features:

$$S_n^c(t,f) = \int_{-\infty}^{+\infty} \tilde{x}_n^c(\tau)\,\omega(\tau-t)\,e^{-j2\pi f\tau}\,d\tau$$

where $S_n^c(t,f)$ is the spectral feature of the c-th electrode channel, $\tilde{x}_n^c(\tau)$ is the time series of the c-th electrode channel in the aligned electroencephalogram emotion data $\tilde{X}_n$, $\omega(\tau-t)$ is the short-time analysis window, and f represents frequency;
a one-dimensional-to-three-dimensional conversion module for rearranging the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where H is the height, W is the width, and r is the number of frequency features extracted from each channel;
a cascaded deep convolutional recurrent neural network for extracting space-time features from the three-dimensional tensors, the network being a 2D-CNN + BiLSTM architecture in which the 2D-CNN extracts the spatial features of each datum, the spatial features are fed into the BiLSTM, and the BiLSTM extracts temporal features from them to obtain the space-time features.
Further, the classifier is specifically configured to perform emotion classification according to the space-time features using softmax and to output the emotion classification result.
Beneficial effects: compared with the prior art, the invention has the notable advantages that the training samples and the test samples come from different emotion EEG databases, which alleviates the problem caused by inconsistent feature distributions between training and test samples, so the data do not need to be calibrated in advance and the recognition accuracy is improved; furthermore, using the space-time characteristics of the spectral features as the classification basis improves the accuracy further.
Drawings
FIG. 1 is a schematic flow chart of one embodiment of a cross-domain electroencephalogram emotion recognition method based on a space-time feature fusion model;
FIG. 2 is a block diagram of the electroencephalogram emotion recognition network proposed by the invention;
FIG. 3 is a network structure diagram of the electroencephalogram emotion recognition network proposed by the invention;
FIG. 4 is an electroencephalogram topology structure diagram used by the electroencephalogram emotion recognition network proposed by the invention.
Detailed Description
As shown in fig. 1, this embodiment provides a cross-domain electroencephalogram emotion recognition method based on a space-time feature fusion model, which includes:
(1) Two electroencephalogram emotion databases are obtained, one used as a training set and the other as a test set; each database comprises a plurality of electroencephalogram emotion data and the corresponding emotion category labels.
(2) An electroencephalogram emotion recognition network is established, and as shown in fig. 2 and 3, the electroencephalogram emotion recognition network comprises a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier which are sequentially connected from front to back.
The preprocessor specifically comprises a filter and a downsampler connected in sequence: the filter performs finite-impulse-response filtering on the electroencephalogram emotion data, and the downsampler downsamples the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
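By way of illustration only, the following Python sketch shows one way this preprocessing stage could be realized with SciPy; the raw sampling rate of 1000 Hz, the 0.5-50 Hz pass band and the filter length are assumed values not specified by the patent:

```python
import numpy as np
from math import gcd
from scipy.signal import firwin, filtfilt, resample_poly

def preprocess(raw, fs_in=1000, fs_out=200, band=(0.5, 50.0), numtaps=101):
    """raw: (channels, samples) EEG trial. FIR band-pass filter, then
    downsample to fs_out; fs_in, band and numtaps are assumed values."""
    taps = firwin(numtaps, band, pass_zero=False, fs=fs_in)
    filtered = filtfilt(taps, [1.0], raw, axis=-1)   # zero-phase FIR
    g = gcd(fs_in, fs_out)
    return resample_poly(filtered, fs_out // g, fs_in // g, axis=-1)

x = np.random.randn(62, 10000)     # 62 channels, 10 s at 1000 Hz
x200 = preprocess(x)               # -> (62, 2000), i.e. 200 Hz
```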
The data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\top}, \qquad \tilde{X}_n = \bar{R}^{-1/2} X_n$$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion datum, $X_n \in \mathbb{R}^{C \times t}$ is the n-th electroencephalogram emotion datum, N is the total number of electroencephalogram emotion data, C is the number of EEG electrodes (C = 62), t is the number of time points in each datum, and $\tilde{x}_n^c$ is the time series of the c-th electrode channel in the aligned data; c ranges from 1 to 62, which suits data acquired with a 64-lead electrode cap of the 10-20 system.
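A minimal sketch of this Euclidean alignment, following the standard formulation reconstructed above; computing the inverse matrix square root by symmetric eigendecomposition is an implementation choice, not mandated by the patent:

```python
import numpy as np

def euclidean_align(trials):
    """trials: (N, C, t) EEG trials. Returns X~_n = R^{-1/2} X_n, where
    R is the arithmetic mean of the spatial covariances X_n X_n^T."""
    R = np.mean([X @ X.T for X in trials], axis=0)
    w, V = np.linalg.eigh(R)                        # R is symmetric PSD
    R_inv_sqrt = V @ np.diag(1.0 / np.sqrt(np.clip(w, 1e-12, None))) @ V.T
    return np.stack([R_inv_sqrt @ X for X in trials])

aligned = euclidean_align(np.random.randn(100, 62, 2000))
```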
The space-time feature extractor specifically includes:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>ω (τ -t) is the short time analysis window, f represents the frequency;
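A sketch of the per-channel short-time Fourier feature extraction; the use of scipy.signal.stft, the 1 s window and the five classical band edges are illustrative assumptions:

```python
import numpy as np
from scipy.signal import stft

# Assumed band edges in Hz for the five classical EEG rhythms.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def stft_band_features(x, fs=200, nperseg=200):
    """x: (C, t) aligned trial. Returns (T, C, r) with r = len(BANDS):
    per analysis window, the mean STFT magnitude of each channel in
    each band, i.e. a sequence of per-channel frequency features."""
    f, _, Z = stft(x, fs=fs, nperseg=nperseg, axis=-1)   # Z: (C, F, T)
    mag = np.abs(Z)
    feats = np.stack([mag[:, (f >= lo) & (f < hi), :].mean(axis=1)
                      for lo, hi in BANDS.values()], axis=-1)  # (C, T, r)
    return feats.transpose(1, 0, 2)                      # (T, C, r)

seq = stft_band_features(np.random.randn(62, 2000))      # (T, 62, 5)
```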
A one-dimensional-to-three-dimensional conversion module rearranges the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where H is the height, W is the width, and r is the number of frequency features extracted from each channel.
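The conversion formula itself is not reproduced in the text; the sketch below shows one plausible reading, consistent with the electrode topology of fig. 4, in which the C = 62 channel feature vectors are placed on an H × W grid following the scalp layout (the 9 × 9 grid and the partial coordinate map are hypothetical):

```python
import numpy as np

# Assumed mapping from channel index to (row, col) on a 9 x 9 grid
# approximating the 10-20 scalp layout; a real implementation would
# list all 62 electrode positions.
GRID_POS = {0: (0, 3), 1: (0, 4), 2: (0, 5)}   # hypothetical FP1, FPZ, FP2

def to_3d_tensor(feats, H=9, W=9):
    """feats: (C, r) per-channel frequency features for one time step.
    Returns T_n of shape (H, W, r); unmapped grid cells remain zero."""
    T = np.zeros((H, W, feats.shape[1]))
    for ch, (i, j) in GRID_POS.items():
        T[i, j, :] = feats[ch]
    return T

T_n = to_3d_tensor(np.random.randn(62, 5))     # (9, 9, 5)
```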
A cascaded deep convolutional recurrent neural network extracts space-time features from the three-dimensional tensors; as shown in fig. 4, the network is a 2D-CNN + BiLSTM architecture in which the 2D-CNN extracts the spatial features of each datum, the spatial features are fed into the BiLSTM, and the BiLSTM extracts temporal features from them to obtain the space-time features.
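A sketch of the cascaded 2D-CNN + BiLSTM extractor together with the softmax emotion classifier, written in PyTorch; the layer widths, kernel sizes, sequence length and three emotion classes are assumptions, and only the cascade itself (a 2D-CNN applied per step, a BiLSTM over the step sequence, a softmax output) follows the description:

```python
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    """Cascaded 2D-CNN + BiLSTM; all sizes below are assumed."""
    def __init__(self, r=5, hidden=64, n_classes=3):
        super().__init__()
        self.cnn = nn.Sequential(              # spatial features per step
            nn.Conv2d(r, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))           # -> (B*S, 64, 1, 1)
        self.lstm = nn.LSTM(64, hidden, batch_first=True,
                            bidirectional=True)  # temporal features
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                      # x: (B, S, r, H, W)
        B, S = x.shape[:2]
        z = self.cnn(x.flatten(0, 1)).flatten(1)   # (B*S, 64)
        out, _ = self.lstm(z.view(B, S, -1))       # (B, S, 2*hidden)
        return self.fc(out[:, -1])             # class logits

# A batch of 8 samples, each a sequence of 10 tensors of shape (5, 9, 9):
logits = CNNBiLSTM()(torch.randn(8, 10, 5, 9, 9))   # (8, 3)
probs = torch.softmax(logits, dim=1)                # emotion probabilities
```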
The classifier is specifically configured to perform emotion classification according to the space-time features using softmax and to output the emotion classification result.
(3) The electroencephalogram emotion recognition network is trained; during training, each training-set electroencephalogram emotion datum and its corresponding emotion category label are taken as one sample and input into the network, and the network parameters are updated by stochastic gradient descent to complete training.
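A sketch of the corresponding training step; torch.optim.SGD realizes the stochastic gradient descent update, nn.CrossEntropyLoss applies the softmax internally, and the learning rate, momentum and epoch count are assumed values:

```python
import torch
import torch.nn as nn

def train(model, loader, epochs=50, lr=1e-2):
    """loader yields (input batch, emotion-label batch) built from the
    training-set database; all hyperparameters here are assumed."""
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()            # applies softmax internally
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)        # forward pass
            loss.backward()                    # backpropagate gradients
            opt.step()                         # stochastic gradient step
    return model
```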
(4) The electroencephalogram emotion data to be recognized are acquired and input as test-set samples into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
The embodiment also provides a cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model, which includes:
a database module, wherein two electroencephalogram emotion databases are used as the training set and the test set respectively, each comprising a plurality of electroencephalogram emotion data and the corresponding emotion category labels;
a network establishing module for establishing the electroencephalogram emotion recognition network, which comprises a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier connected in sequence from front to back, wherein the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed electroencephalogram emotion data, the space-time feature extractor first extracts time-spectrum features of the electroencephalogram emotion data, converts them into three-dimensional tensors and then extracts space-time features, and the emotion classifier classifies emotions according to the space-time features;
a network training module for training the electroencephalogram emotion recognition network, during which each training-set electroencephalogram emotion datum and its corresponding emotion category label are taken as one sample and input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
and an emotion recognition module for acquiring the electroencephalogram emotion data to be recognized and inputting them as test-set samples into the trained electroencephalogram emotion recognition network to obtain the recognized emotion categories.
The preprocessor specifically comprises a filter and a downsampler connected in sequence: the filter performs finite-impulse-response filtering on the electroencephalogram emotion data, and the downsampler downsamples the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
The data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\top}, \qquad \tilde{X}_n = \bar{R}^{-1/2} X_n$$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion datum, $X_n \in \mathbb{R}^{C \times t}$ is the n-th electroencephalogram emotion datum, N is the total number of electroencephalogram emotion data, C is the number of EEG electrodes (C = 62), t is the number of time points in each datum, and $\tilde{x}_n^c$ is the time series of the c-th electrode channel in the aligned data; c ranges from 1 to 62, which suits data acquired with a 64-lead electrode cap of the 10-20 system.
The space-time feature extractor specifically includes:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>ω (τ -t) is the short time analysis window, f represents the frequency;
A one-dimensional-to-three-dimensional conversion module rearranges the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where H is the height, W is the width, and r is the number of frequency features extracted from each channel.
A cascaded deep convolutional recurrent neural network extracts space-time features from the three-dimensional tensors; the network is a 2D-CNN + BiLSTM architecture in which the 2D-CNN extracts the spatial features of each datum, the spatial features are fed into the BiLSTM, and the BiLSTM extracts temporal features from them to obtain the space-time features.
The classifier is specifically configured to perform emotion classification according to the space-time features using softmax and to output the emotion classification result.
Simulation verification was carried out on the invention. Table 1 shows the recognition results on the two emotion databases, SEED and the 2020 World Robot Contest Emotional BCI competition database. This embodiment uses these two databases and achieves good results under four protocols. Table 2 summarizes current cross-subject recognition algorithms on the SEED database, including the linear support vector machine (SVM), kernel principal component analysis (KPCA), transfer component analysis (TCA), transductive parameter transfer (TPT), the domain adversarial neural network (DANN), the dynamic graph convolutional neural network (DGCNN), the bi-hemisphere domain adversarial neural network (BiDANN), BiDANN-S, the hierarchical spatial-temporal neural network (R2G-STNN), and the instance-adaptive graph (IAG). As the table shows, the invention achieves the highest accuracy and the smallest standard deviation. Unlike these methods, the training set of the invention adds an additional BCI database; enlarging the training set improves the generalization of the trained model, and demonstrates that the method effectively extracts time-frequency-space multi-view features and classifies emotions well both across databases and across subjects.
TABLE 1
TABLE 2
Note that subspace-based methods (e.g., TCA, SA, and GFK) have difficulty handling large volumes of EEG data due to computer memory constraints and computational cost. Therefore, 5000 EEG feature samples were randomly selected from the training dataset to train those methods for comparison. The experimental results show that the electroencephalogram emotion recognition method provided by the invention achieves a higher recognition rate.
The above disclosure is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention, which is defined by the appended claims.
Claims (8)
1. A cross-domain electroencephalogram emotion recognition method based on a space-time feature fusion model, characterized by comprising the following steps:
(1) acquiring two electroencephalogram emotion databases, one used as a training set and the other as a test set, each comprising a plurality of electroencephalogram emotion data and the corresponding emotion category labels;
(2) establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier connected in sequence from front to back, wherein the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed electroencephalogram emotion data, the space-time feature extractor first extracts time-spectrum features of the electroencephalogram emotion data, converts them into three-dimensional tensors and then extracts space-time features, and the emotion classifier classifies emotions according to the space-time features;
wherein the space-time feature extractor specifically comprises:
a time-spectrum feature extraction module for extracting spectral features from the aligned electroencephalogram emotion data $\tilde{X}_n$ by the following short-time Fourier transform, forming one-dimensional sequence features:

$$S_n^c(t,f) = \int_{-\infty}^{+\infty} \tilde{x}_n^c(\tau)\,\omega(\tau-t)\,e^{-j2\pi f\tau}\,d\tau$$

where $S_n^c(t,f)$ is the spectral feature of the c-th electrode channel, $\tilde{x}_n^c(\tau)$ is the time series of the c-th electrode channel in the aligned electroencephalogram emotion data $\tilde{X}_n$, $\omega(\tau-t)$ is the short-time analysis window, and f represents frequency;
a one-dimensional-to-three-dimensional conversion module for rearranging the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where H is the height, W is the width, and r is the number of frequency features extracted from each channel;
a cascaded deep convolutional recurrent neural network for extracting space-time features from the three-dimensional tensors, the network being a 2D-CNN + BiLSTM architecture in which the 2D-CNN extracts the spatial features of each datum, the spatial features are fed into the BiLSTM, and the BiLSTM extracts temporal features from them to obtain the space-time features;
(3) training the electroencephalogram emotion recognition network, during which each training-set electroencephalogram emotion datum and its corresponding emotion category label are taken as one sample and input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
(4) acquiring the electroencephalogram emotion data to be recognized and inputting them as test-set samples into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
2. The cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model according to claim 1, characterized in that the preprocessor specifically comprises a filter and a downsampler connected in sequence, the filter performing finite-impulse-response filtering on the electroencephalogram emotion data and the downsampler downsampling the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
3. The cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model according to claim 1, characterized in that the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\top}, \qquad \tilde{X}_n = \bar{R}^{-1/2} X_n$$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion datum, $X_n \in \mathbb{R}^{C \times t}$ is the n-th electroencephalogram emotion datum, N is the total number of electroencephalogram emotion data, C is the number of EEG electrodes (C = 62), t is the number of time points in each datum, and $\tilde{x}_n^c$ is the time series of the c-th electrode channel in the aligned data.
4. The cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model according to claim 1, characterized in that the classifier is specifically configured to perform emotion classification according to the space-time features using softmax and to output the emotion classification result.
5. A cross-domain electroencephalogram emotion recognition system based on a space-time feature fusion model, characterized by comprising:
a database module, wherein two electroencephalogram emotion databases are used as a training set and a test set respectively, each comprising a plurality of electroencephalogram emotion data and the corresponding emotion category labels;
a network establishing module for establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier connected in sequence from front to back, wherein the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed electroencephalogram emotion data, the space-time feature extractor first extracts time-spectrum features of the electroencephalogram emotion data, converts them into three-dimensional tensors and then extracts space-time features, and the emotion classifier classifies emotions according to the space-time features;
wherein the space-time feature extractor specifically comprises:
a time-spectrum feature extraction module for extracting spectral features from the aligned electroencephalogram emotion data $\tilde{X}_n$ by the following short-time Fourier transform, forming one-dimensional sequence features:

$$S_n^c(t,f) = \int_{-\infty}^{+\infty} \tilde{x}_n^c(\tau)\,\omega(\tau-t)\,e^{-j2\pi f\tau}\,d\tau$$

where $S_n^c(t,f)$ is the spectral feature of the c-th electrode channel, $\tilde{x}_n^c(\tau)$ is the time series of the c-th electrode channel in the aligned electroencephalogram emotion data $\tilde{X}_n$, $\omega(\tau-t)$ is the short-time analysis window, and f represents frequency;
a one-dimensional-to-three-dimensional conversion module for rearranging the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where H is the height, W is the width, and r is the number of frequency features extracted from each channel;
a cascaded deep convolutional recurrent neural network for extracting space-time features from the three-dimensional tensors, the network being a 2D-CNN + BiLSTM architecture in which the 2D-CNN extracts the spatial features of each datum, the spatial features are fed into the BiLSTM, and the BiLSTM extracts temporal features from them to obtain the space-time features;
a network training module for training the electroencephalogram emotion recognition network, during which each training-set electroencephalogram emotion datum and its corresponding emotion category label are taken as one sample and input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
and an emotion recognition module for acquiring the electroencephalogram emotion data to be recognized and inputting them as test-set samples into the trained electroencephalogram emotion recognition network to obtain the recognized emotion categories.
6. The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model according to claim 5, characterized in that the preprocessor specifically comprises a filter and a downsampler connected in sequence, the filter performing finite-impulse-response filtering on the electroencephalogram emotion data and the downsampler downsampling the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
7. The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model according to claim 5, characterized in that the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\top}, \qquad \tilde{X}_n = \bar{R}^{-1/2} X_n$$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion datum, $X_n \in \mathbb{R}^{C \times t}$ is the n-th electroencephalogram emotion datum, N is the total number of electroencephalogram emotion data, C is the number of EEG electrodes (C = 62), t is the number of time points in each datum, and $\tilde{x}_n^c$ is the time series of the c-th electrode channel in the aligned data.
8. The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model according to claim 5, characterized in that the classifier is specifically configured to perform emotion classification according to the space-time features using softmax and to output the emotion classification result.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202111560169.9A | 2021-12-20 | 2021-12-20 | Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202111560169.9A | 2021-12-20 | 2021-12-20 | Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model
Publications (2)

Publication Number | Publication Date
---|---
CN114469137A | 2022-05-13
CN114469137B | 2023-12-26
Family
ID=81494115

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202111560169.9A | Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model | 2021-12-20 | 2021-12-20

Country Status (1)

Country | Link
---|---
CN | CN114469137B (en)
Families Citing this family (1)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN118141391B | 2024-05-08 | 2024-07-26 | 之江实验室 | Cross-database electroencephalogram fatigue recognition method and system based on an entropy-optimized neural network
Patent Citations (4)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
KR20210006486A | 2018-08-13 | 2021-01-18 | 한국과학기술원 | Method for adaptive EEG signal processing using reinforcement learning and system using the same
CN109389059A | 2018-09-26 | 2019-02-26 | 华南理工大学 | A P300 detection method based on a CNN-LSTM network
CN111126263A | 2019-12-24 | 2020-05-08 | 东南大学 | Electroencephalogram emotion recognition method and device based on a double-hemisphere difference model
CN113288146A | 2021-05-26 | 2021-08-24 | 杭州电子科技大学 | Electroencephalogram emotion classification method based on time-space-frequency combined features
Non-Patent Citations (4)

Title
---
Zhang, Dalin, et al. "Cascade and Parallel Convolutional Recurrent Neural Networks on EEG-Based Intention Recognition for Brain Computer Interface." The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18). Full text. *
Phang, Chun-Ren, et al. "Classification of EEG-Based Brain Connectivity Networks in Schizophrenia Using a Multi-Domain Connectome Convolutional Neural Network." arXiv:1903.08858v1 [cs.LG], 21 Mar 2019. Full text. *
Jia, Ziyu, et al. "SST-EmotionNet: Spatial-Spectral-Temporal Based Attention 3D Dense Network for EEG Emotion Recognition." Poster Session D2: Emerging Multimedia Applications & Emotional and Social Signals in Multimedia. Full text. *
Wu, Dongrui, et al. "Transfer Learning for Motor Imagery Based Brain-Computer Interfaces: A Complete Pipeline." arXiv:2007.03746v3 [eess.SP]. Full text. *
Also Published As

Publication number | Publication date
---|---
CN114469137A | 2022-05-13
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |