CN114469137B - Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model


Info

Publication number
CN114469137B
Authority
CN
China
Prior art keywords: emotion, electroencephalogram, space, data, time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111560169.9A
Other languages
Chinese (zh)
Other versions
CN114469137A (en)
Inventor
郑文明
常洪丽
宗源
江星洵
唐传高
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN202111560169.9A priority Critical patent/CN114469137B/en
Publication of CN114469137A publication Critical patent/CN114469137A/en
Application granted granted Critical
Publication of CN114469137B publication Critical patent/CN114469137B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316: Modalities, i.e. specific diagnostic methods
    • A61B5/369: Electroencephalography [EEG]
    • A61B5/372: Analysis of electroencephalograms
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Psychiatry (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Psychology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a cross-domain electroencephalogram emotion recognition method and system based on a space-time feature fusion model. The method comprises the following steps: (1) acquiring two electroencephalogram emotion databases, one used as the training set and the other as the test set; (2) establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor, and an emotion classifier connected in sequence from front to back, wherein the space-time feature extractor first extracts time-spectrum features of the electroencephalogram emotion data, converts them into three-dimensional tensors, and then extracts space-time features, according to which the emotion classifier performs emotion classification; (3) training the electroencephalogram emotion recognition network; (4) acquiring the electroencephalogram emotion data to be recognized and inputting it as a test-set sample into the trained network to obtain the recognized emotion category. The invention achieves higher recognition accuracy.

Description

Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model
Technical Field
The invention relates to emotion recognition technology, and in particular to a cross-domain electroencephalogram emotion recognition method and system based on a space-time feature fusion model.
Background
Emotion is critical to the quality of the human daily experience. With the development of brain-computer interfaces and artificial intelligence, emotion recognition based on electroencephalogram (EEG) signals has become a new approach to emotion research. The EEG signal contains a large amount of emotion-related information and is characterized by high temporal resolution and difficulty of disguise, giving it great advantages in the field of real-time emotion recognition. Technical means that can judge a person's emotional state accurately and in real time have great application value in fields such as driver fatigue detection, assisted treatment of depression, and evaluation of emotional interaction between teachers and students.
The electroencephalogram is very closely related to emotion. The amygdala participates in emotional perception, automatic emotional response, and memory, and its activation appears to be more related to negative emotions; relative activation of the right frontal lobe is associated with negative emotions (e.g., fear or aversion), while relative activation of the left frontal lobe is associated with positive emotions (e.g., happiness or joy). EEG features are mainly drawn from the delta, theta, alpha, beta, and gamma frequency bands. These features characterize the signal from different aspects, so the various valid features extracted from the signal support better classification.
To build an excellent model, a user typically needs to collect enough labeled data for calibration before using a brain-computer interface. This calibration process is often time-consuming and laborious, which is a significant problem facing current affective brain-computer interfaces. In addition, the recognition accuracy of existing methods still needs further improvement.
Disclosure of Invention
Object of the invention: in view of the problems in the prior art, the invention provides a cross-domain electroencephalogram emotion recognition method and system based on a space-time feature fusion model that achieve higher recognition accuracy and require no advance data calibration.
Technical scheme: the cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model of the invention comprises the following steps:
(1) Acquiring two electroencephalogram emotion databases, one used as the training set and the other as the test set, each database comprising a plurality of electroencephalogram emotion data items and corresponding emotion category labels;
(2) Establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor, and an emotion classifier connected in sequence from front to back: the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed data items, the space-time feature extractor first extracts time-spectrum features of the data, converts them into three-dimensional tensors, and then extracts space-time features, and the emotion classifier classifies emotion according to the space-time features;
(3) Training the electroencephalogram emotion recognition network: during training, each training-set data item and its corresponding emotion category label form one sample that is input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
(4) Acquiring the electroencephalogram emotion data to be recognized and inputting it as a test-set sample into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
Further, the preprocessor specifically comprises a filter and a downsampler connected in sequence: the filter performs finite impulse response filtering on the electroencephalogram emotion data, and the downsampler downsamples the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
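By way of illustration, a minimal Python sketch of such a preprocessor is given below; the pass band (0.5 to 50 Hz) and filter order are assumptions for illustration, since the text fixes only FIR filtering and the 200 Hz target rate.

```python
import numpy as np
from math import gcd
from scipy.signal import firwin, filtfilt, resample_poly

def preprocess(eeg, fs_in, fs_out=200, band=(0.5, 50.0), numtaps=101):
    """eeg: (channels, samples) raw EEG sampled at fs_in Hz."""
    # Finite impulse response band-pass filter, applied zero-phase.
    taps = firwin(numtaps, band, pass_zero=False, fs=fs_in)
    filtered = filtfilt(taps, [1.0], eeg, axis=-1)
    # Downsample to the 200 Hz target rate with polyphase resampling.
    g = gcd(int(fs_in), fs_out)
    return resample_poly(filtered, fs_out // g, int(fs_in) // g, axis=-1)
```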
Further, the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\mathsf{T}}$, $\tilde{X}_n = \bar{R}^{-1/2} X_n$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion data, $X_n \in \mathbb{R}^{C \times t}$ is the $n$-th electroencephalogram emotion data item, $N$ is the total number of data items, $C$ is the number of EEG electrodes ($C = 62$), $t$ is the number of time points in each data item, and $\tilde{x}_n^c$ is the time series of the $c$-th electrode channel in the aligned electroencephalogram emotion data.
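A Python sketch of the aligner follows, assuming the standard form of Euclidean alignment reconstructed above: each trial is whitened by the inverse square root of the mean spatial covariance of all trials.

```python
import numpy as np

def euclidean_align(trials):
    """trials: (N, C, t) array of EEG trials; returns aligned trials."""
    # Mean spatial covariance over all N trials (C x C).
    R_bar = np.mean([X @ X.T for X in trials], axis=0)
    # R_bar^{-1/2} via eigendecomposition of the symmetric matrix.
    w, V = np.linalg.eigh(R_bar)
    R_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    # Whiten every trial with the shared reference matrix.
    return np.stack([R_inv_sqrt @ X for X in trials])
```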
Further, the space-time feature extractor specifically includes:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>ω (τ -t) is the short time analysis window, f represents the frequency;
a one-dimensional-to-three-dimensional conversion module for converting the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where $H$ is the height, $W$ is the width, and $r$ is the number of frequency features extracted from each channel (a mapping sketch follows this list);
the cascade deep convolution recurrent neural network is used for extracting space-time characteristics from three-dimensional tensors, the network is a 2dCNN+BiLSTM model framework, the 2dCNN extracts the space characteristics of each data, the space characteristics are input into BiLSTM, and the BiLSTM extracts time characteristics from the space characteristics to obtain the space-time characteristics.
Furthermore, the classifier is specifically configured to perform emotion classification from the space-time features using softmax and to output the emotion classification result.
The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model provided by the invention comprises the following components:
the system comprises a database, a training set and a test set, wherein the two electroencephalogram emotion databases are used as the training set and the test set, and each electroencephalogram emotion database comprises a plurality of electroencephalogram emotion data and corresponding emotion type labels;
the system comprises a network establishing module, a data aligner, a space-time feature extractor and an emotion classifier, wherein the network establishing module is used for establishing an electroencephalogram emotion recognition network, the electroencephalogram emotion recognition network comprises a preprocessor, the data aligner, the space-time feature extractor and the emotion classifier which are sequentially connected from front to back, the preprocessor filters and downsamples electroencephalogram emotion data, the data aligner aligns different preprocessed electroencephalogram emotion data, the space-time feature extractor firstly extracts time spectrum features of the electroencephalogram emotion data, then converts the extracted time spectrum features into three-dimensional tensors and then extracts space-time features, and the emotion classifier classifies the emotions according to the space-time features;
the network training module is also used for training the electroencephalogram emotion recognition network, and during training, emotion data of each training set and corresponding emotion type labels are used as a sample, the electroencephalogram emotion recognition network is input, network parameters are updated through a random gradient descent method, and network training is completed;
and the emotion recognition module is used for acquiring the electroencephalogram emotion data to be recognized, inputting the electroencephalogram emotion data as a test set sample into a trained electroencephalogram emotion recognition network, and obtaining recognized emotion types.
Further, the preprocessor specifically comprises a filter and a downsampler connected in sequence: the filter performs finite impulse response filtering on the electroencephalogram emotion data, and the downsampler downsamples the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
Further, the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\mathsf{T}}$, $\tilde{X}_n = \bar{R}^{-1/2} X_n$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion data, $X_n \in \mathbb{R}^{C \times t}$ is the $n$-th electroencephalogram emotion data item, $N$ is the total number of data items, $C$ is the number of EEG electrodes ($C = 62$), $t$ is the number of time points in each data item, and $\tilde{x}_n^c$ is the time series of the $c$-th electrode channel in the aligned electroencephalogram emotion data.
Further, the space-time feature extractor specifically includes:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>ω (τ -t) is the short time analysis window, f represents the frequency;
a one-dimensional-to-three-dimensional conversion module for converting the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where $H$ is the height, $W$ is the width, and $r$ is the number of frequency features extracted from each channel;
the cascade deep convolution recurrent neural network is used for extracting space-time characteristics from three-dimensional tensors, the network is a 2dCNN+BiLSTM model framework, the 2dCNN extracts the space characteristics of each data, the space characteristics are input into BiLSTM, and the BiLSTM extracts time characteristics from the space characteristics to obtain the space-time characteristics.
Furthermore, the classifier is specifically configured to perform emotion classification from the space-time features using softmax and to output the emotion classification result.
Beneficial effects: compared with the prior art, the invention has notable advantages. The training samples and test samples come from different emotion EEG databases, which addresses the problem caused by inconsistent feature distributions between training and test samples, so the data need not be calibrated in advance and the recognition accuracy is improved; in addition, the space-time features extracted from the spectral features are used as the classification basis, further improving accuracy.
Drawings
FIG. 1 is a schematic flow chart of one embodiment of a cross-domain electroencephalogram emotion recognition method based on a space-time feature fusion model;
FIG. 2 is a block diagram of the proposed electroencephalogram emotion recognition network of the present invention;
FIG. 3 is a network structure diagram of the proposed electroencephalogram emotion recognition network of the present invention;
fig. 4 is an electroencephalogram topology structure diagram of the electroencephalogram emotion recognition network provided by the invention.
Detailed Description
As shown in fig. 1, this embodiment provides a cross-domain electroencephalogram emotion recognition method based on a space-time feature fusion model, which includes:
(1) Two electroencephalogram emotion databases are acquired, one used as the training set and the other as the test set; each database comprises a plurality of electroencephalogram emotion data items and corresponding emotion category labels.
(2) An electroencephalogram emotion recognition network is established, and as shown in fig. 2 and 3, the electroencephalogram emotion recognition network comprises a preprocessor, a data aligner, a space-time feature extractor and an emotion classifier which are sequentially connected from front to back.
The preprocessor specifically comprises a filter and a downsampler connected in sequence: the filter performs finite impulse response filtering on the electroencephalogram emotion data, and the downsampler downsamples the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
The data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\mathsf{T}}$, $\tilde{X}_n = \bar{R}^{-1/2} X_n$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion data, $X_n \in \mathbb{R}^{C \times t}$ is the $n$-th electroencephalogram emotion data item, $N$ is the total number of data items, $C$ is the number of EEG electrodes ($C = 62$), $t$ is the number of time points in each data item, and $\tilde{x}_n^c$ is the time series of the $c$-th electrode channel in the aligned data; the value range of $c$ is 1 to 62, which suits data collected with a 64-channel electrode cap of the 10-20 system.
The space-time feature extractor specifically includes:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>ω (τ -t) is the short time analysis window, f represents the frequency;
a one-dimensional-to-three-dimensional conversion module for converting the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where $H$ is the height, $W$ is the width, and $r$ is the number of frequency features extracted from each channel;
the cascade deep convolution recurrent neural network is used for extracting space-time characteristics from the three-dimensional tensor, the network is a 2dCNN+BiLSTM model architecture, as shown in fig. 4, the 2dCNN extracts the space characteristics of each data, the space characteristics are input into BiLSTM, and the BiLSTM extracts time characteristics from the space characteristics to obtain the space-time characteristics.
The classifier is specifically configured to perform emotion classification from the space-time features using softmax and to output the emotion classification result.
(3) The electroencephalogram emotion recognition network is trained: during training, each training-set data item and its corresponding emotion category label form one sample that is input into the network, and the network parameters are updated by stochastic gradient descent to complete training.
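A Python sketch of this training step, under the assumptions of the model sketch given earlier; the epoch count, learning rate, and momentum are illustrative.

```python
import torch
import torch.nn as nn

def train(model, loader, epochs=50, lr=1e-2):
    """loader yields (x, y): space-time tensors and emotion labels."""
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()       # cross-entropy over softmax outputs
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)   # forward pass on one batch
            loss.backward()
            opt.step()                    # stochastic gradient descent update
```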
(4) The electroencephalogram emotion data to be recognized is acquired and input as a test-set sample into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
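Putting the sketches together, a hypothetical end-to-end pass over one raw trial might look as follows; every helper name and shape comes from the assumed sketches above, not from the patent itself.

```python
import numpy as np
import torch

raw = np.random.randn(62, 5 * 1000)              # 62 channels, 5 s at 1000 Hz
x = preprocess(raw, fs_in=1000)                  # FIR filter + downsample to 200 Hz
x = euclidean_align(x[None])[0]                  # align (single-trial stand-in)
feats = spectral_features(x)                     # (62, r) spectral features
tensor = to_3d_tensor(feats)                     # (H, W, r) scalp tensor
seq = torch.tensor(tensor, dtype=torch.float32)  # build a length-1 sequence
seq = seq.permute(2, 0, 1)[None, None]           # (B=1, T=1, r, H, W)
model = SpatioTemporalNet(r=feats.shape[1])
emotion = model(seq).argmax(dim=1)               # predicted emotion category
```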
The embodiment also provides a cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model, comprising:
a database module, wherein two electroencephalogram emotion databases are used as the training set and the test set respectively, each comprising a plurality of electroencephalogram emotion data items and corresponding emotion category labels;
a network establishing module for establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor, and an emotion classifier connected in sequence from front to back: the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed data items, the space-time feature extractor first extracts time-spectrum features of the data, converts them into three-dimensional tensors, and then extracts space-time features, and the emotion classifier classifies emotion according to the space-time features;
a network training module for training the electroencephalogram emotion recognition network, wherein during training each training-set data item and its corresponding emotion category label form one sample that is input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
and an emotion recognition module for acquiring the electroencephalogram emotion data to be recognized and inputting it as a test-set sample into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
The preprocessor specifically comprises a filter and a downsampler connected in sequence: the filter performs finite impulse response filtering on the electroencephalogram emotion data, and the downsampler downsamples the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
The data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\mathsf{T}}$, $\tilde{X}_n = \bar{R}^{-1/2} X_n$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion data, $X_n \in \mathbb{R}^{C \times t}$ is the $n$-th electroencephalogram emotion data item, $N$ is the total number of data items, $C$ is the number of EEG electrodes ($C = 62$), $t$ is the number of time points in each data item, and $\tilde{x}_n^c$ is the time series of the $c$-th electrode channel in the aligned data; the value range of $c$ is 1 to 62, which suits data collected with a 64-channel electrode cap of the 10-20 system.
The space-time feature extractor specifically includes:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>ω (τ -t) is the short time analysis window, f represents the frequency;
a one-dimensional-to-three-dimensional conversion module for converting the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where $H$ is the height, $W$ is the width, and $r$ is the number of frequency features extracted from each channel;
the cascade deep convolution recurrent neural network is used for extracting space-time characteristics from three-dimensional tensors, the network is a 2dCNN+BiLSTM model framework, the 2dCNN extracts the space characteristics of each data, the space characteristics are input into BiLSTM, and the BiLSTM extracts time characteristics from the space characteristics to obtain the space-time characteristics.
The classifier is specifically configured to perform emotion classification from the space-time features using softmax and to output the emotion classification result.
The invention was verified by simulation. Table 1 shows the recognition results on the two emotion databases used in this embodiment, SEED and the 2020 World Robot Competition Emotional BCI database; good results are achieved on all four protocols. Table 2 summarizes current cross-subject recognition algorithms on the SEED database, including the linear support vector machine (SVM), kernel principal component analysis (KPCA), transfer component analysis (TCA), transductive parameter transfer (TPT), the domain adversarial neural network (DANN), the dynamic graph convolutional neural network (DGCNN), the bi-hemisphere domain adversarial neural network (BiDANN), BiDANN-S, the hierarchical spatial-temporal neural network (R2G-STNN), and the instance-adaptive graph (IAG). As the table shows, the invention achieves the highest accuracy and the smallest standard deviation. Unlike these methods, the training set of the invention adds an additional BCI database; the enlarged training set improves the generalization of the trained model and demonstrates that the method effectively extracts time-frequency-space multi-view features and classifies emotion well both across databases and across subjects.
TABLE 1
TABLE 2
Note that subspace-based methods (e.g., TCA, SA, and GFK) have difficulty handling large volumes of EEG data owing to computer memory constraints and computational cost; thus, 5000 EEG feature samples were randomly selected from the training dataset to train those methods for comparison. The experimental results show that the proposed electroencephalogram emotion recognition method achieves a higher recognition rate.
The above disclosure is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention, which is defined by the appended claims.

Claims (8)

1. A cross-domain electroencephalogram emotion recognition method based on a space-time feature fusion model is characterized by comprising the following steps:
(1) Acquiring two electroencephalogram emotion databases, one used as the training set and the other as the test set, each database comprising a plurality of electroencephalogram emotion data items and corresponding emotion category labels;
(2) Establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor, and an emotion classifier connected in sequence from front to back: the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed data items, the space-time feature extractor first extracts time-spectrum features of the data, converts them into three-dimensional tensors, and then extracts space-time features, and the emotion classifier classifies emotion according to the space-time features;
wherein the space-time feature extractor specifically comprises:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>ω (τ -t) is the short time analysis window, f represents the frequency;
a one-dimensional-to-three-dimensional conversion module for converting the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where $H$ is the height, $W$ is the width, and $r$ is the number of frequency features extracted from each channel;
the cascade deep convolution recurrent neural network is used for extracting space-time characteristics from the three-dimensional tensor, the network is a 2dCNN+BiLSTM model framework, the 2dCNN extracts the space characteristics of each data, the space characteristics are input into BiLSTM, and the BiLSTM extracts time characteristics from the space characteristics to obtain the space-time characteristics;
(3) Training the electroencephalogram emotion recognition network, wherein during training each training-set data item and its corresponding emotion category label form one sample that is input into the network, and the network parameters are updated by stochastic gradient descent to complete training;
(4) Acquiring the electroencephalogram emotion data to be recognized and inputting it as a test-set sample into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
2. The cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model according to claim 1, characterized in that: the preprocessor specifically comprises a filter and a downsampler connected in sequence, the filter performing finite impulse response filtering on the electroencephalogram emotion data, and the downsampler downsampling the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
3. The cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model according to claim 1, characterized in that: the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\mathsf{T}}$, $\tilde{X}_n = \bar{R}^{-1/2} X_n$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion data, $X_n \in \mathbb{R}^{C \times t}$ is the $n$-th electroencephalogram emotion data item, $N$ is the total number of data items, $C$ is the number of EEG electrodes ($C = 62$), $t$ is the number of time points in each data item, and $\tilde{x}_n^c$ is the time series of the $c$-th electrode channel in the aligned electroencephalogram emotion data.
4. The cross-domain electroencephalogram emotion recognition method based on the space-time feature fusion model according to claim 1, characterized in that: the classifier is specifically configured to perform emotion classification from the space-time features using softmax and to output the emotion classification result.
5. A cross-domain electroencephalogram emotion recognition system based on a space-time feature fusion model, characterized by comprising:
a database module, wherein two electroencephalogram emotion databases are used as the training set and the test set respectively, each comprising a plurality of electroencephalogram emotion data items and corresponding emotion category labels;
a network establishing module for establishing an electroencephalogram emotion recognition network comprising a preprocessor, a data aligner, a space-time feature extractor, and an emotion classifier connected in sequence from front to back: the preprocessor filters and downsamples the electroencephalogram emotion data, the data aligner aligns the different preprocessed data items, the space-time feature extractor first extracts time-spectrum features of the data, converts them into three-dimensional tensors, and then extracts space-time features, and the emotion classifier classifies emotion according to the space-time features;
wherein the space-time feature extractor specifically comprises:
the time spectrum feature extraction module is used for extracting the aligned electroencephalogram emotion dataThe spectrum characteristics are extracted by adopting the following short-time Fourier transformation, and one-dimensional sequence characteristics are formed:
in the method, in the process of the invention,is the spectral feature of the c-th electrode channel, < >>Is the brain electrical emotion data after alignment +.>The c-th electrode channel in (3)ω (τ -t) is a short time analysis window, f represents frequency;
a one-dimensional-to-three-dimensional conversion module for converting the one-dimensional sequence features into a three-dimensional tensor $T_n(H, W, r)$, where $H$ is the height, $W$ is the width, and $r$ is the number of frequency features extracted from each channel;
the cascade deep convolution recurrent neural network is used for extracting space-time characteristics from the three-dimensional tensor, the network is a 2dCNN+BiLSTM model framework, the 2dCNN extracts the space characteristics of each data, the space characteristics are input into BiLSTM, and the BiLSTM extracts time characteristics from the space characteristics to obtain the space-time characteristics;
the network training module is also used for training the electroencephalogram emotion recognition network, and during training, emotion data of each training set and corresponding emotion type labels are used as a sample, the electroencephalogram emotion recognition network is input, network parameters are updated through a random gradient descent method, and network training is completed;
and an emotion recognition module for acquiring the electroencephalogram emotion data to be recognized and inputting it as a test-set sample into the trained electroencephalogram emotion recognition network to obtain the recognized emotion category.
6. The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model according to claim 5, characterized in that: the preprocessor specifically comprises a filter and a downsampler connected in sequence, the filter performing finite impulse response filtering on the electroencephalogram emotion data, and the downsampler downsampling the filtered data to 200 Hz to obtain the preprocessed electroencephalogram emotion data.
7. The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model according to claim 5, characterized in that: the data aligner is specifically configured to perform Euclidean alignment on the different electroencephalogram emotion data using the following formulas:

$\bar{R} = \frac{1}{N}\sum_{n=1}^{N} X_n X_n^{\mathsf{T}}$, $\tilde{X}_n = \bar{R}^{-1/2} X_n$

where $\tilde{X}_n$ is the aligned electroencephalogram emotion data, $X_n \in \mathbb{R}^{C \times t}$ is the $n$-th electroencephalogram emotion data item, $N$ is the total number of data items, $C$ is the number of EEG electrodes ($C = 62$), $t$ is the number of time points in each data item, and $\tilde{x}_n^c$ is the time series of the $c$-th electrode channel in the aligned electroencephalogram emotion data.
8. The cross-domain electroencephalogram emotion recognition system based on the space-time feature fusion model according to claim 5, characterized in that: the classifier is specifically configured to perform emotion classification from the space-time features using softmax and to output the emotion classification result.
CN202111560169.9A 2021-12-20 2021-12-20 Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model Active CN114469137B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111560169.9A CN114469137B (en) 2021-12-20 2021-12-20 Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111560169.9A CN114469137B (en) 2021-12-20 2021-12-20 Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model

Publications (2)

Publication Number Publication Date
CN114469137A (en) 2022-05-13
CN114469137B (en) 2023-12-26

Family

ID=81494115

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111560169.9A Active CN114469137B (en) 2021-12-20 2021-12-20 Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model

Country Status (1)

Country Link
CN (1) CN114469137B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118141391B (en) * 2024-05-08 2024-07-26 之江实验室 Cross-library brain electrical fatigue identification method and system based on entropy optimization neural network

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210006486A (en) * 2018-08-13 2021-01-18 한국과학기술원 Method for Adaptive EEG signal processing using reinforcement learning and System Using the same
CN109389059A (en) * 2018-09-26 2019-02-26 华南理工大学 A kind of P300 detection method based on CNN-LSTM network
CN111126263A (en) * 2019-12-24 2020-05-08 东南大学 Electroencephalogram emotion recognition method and device based on double-hemisphere difference model
CN113288146A (en) * 2021-05-26 2021-08-24 杭州电子科技大学 Electroencephalogram emotion classification method based on time-space-frequency combined characteristics

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Dalin Zhang et al., "Cascade and Parallel Convolutional Recurrent Neural Networks on EEG-Based Intention Recognition for Brain Computer Interface," The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18). (full text) *
Chun-Ren Phang et al., "Classification of EEG-Based Brain Connectivity Networks in Schizophrenia Using a Multi-Domain Connectome Convolutional Neural Network," arXiv:1903.08858v1 [cs.LG], 21 Mar 2019. (full text) *
Ziyu Jia et al., "SST-EmotionNet: Spatial-Spectral-Temporal based Attention 3D Dense Network for EEG Emotion Recognition," Poster Session D2: Emerging Multimedia Applications & Emotional and Social Signals in Multimedia. (full text) *
Dongrui Wu et al., "Transfer Learning for Motor Imagery Based Brain-Computer Interfaces: A Complete Pipeline," arXiv:2007.03746v3 [eess.SP]. (full text) *

Also Published As

Publication number Publication date
CN114469137A (en) 2022-05-13

Similar Documents

Publication Title
CN110069958B (en) Electroencephalogram signal rapid identification method of dense deep convolutional neural network
CN108491077B (en) Surface electromyographic signal gesture recognition method based on multi-stream divide-and-conquer convolutional neural network
CN111832416B (en) Motor imagery electroencephalogram signal identification method based on enhanced convolutional neural network
CN110353675B (en) Electroencephalogram signal emotion recognition method and device based on picture generation
CN112381008B (en) Electroencephalogram emotion recognition method based on parallel sequence channel mapping network
CN110353702A (en) A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN108446635B (en) Collaborative filtering recommendation system and method for acquiring preference with assistance of electroencephalogram signals
Lewicki A review of methods for spike sorting: the detection and classification of neural action potentials
CN105841961A (en) Bearing fault diagnosis method based on Morlet wavelet transformation and convolutional neural network
CN105956546A (en) Emotion recognition method based on EEG signals
CN111387974A (en) Electroencephalogram feature optimization and epileptic seizure detection method based on depth self-coding
CN112022153B (en) Electroencephalogram signal detection method based on convolutional neural network
CN114298216A (en) Electroencephalogram vision classification method based on time-frequency domain fusion Transformer
CN108256579A (en) A kind of multi-modal sense of national identity quantization measuring method based on priori
CN110037693A (en) A kind of mood classification method based on facial expression and EEG
CN112884063B (en) P300 signal detection and identification method based on multi-element space-time convolution neural network
CN111832431A (en) Emotional electroencephalogram classification method based on CNN
CN112465069A (en) Electroencephalogram emotion classification method based on multi-scale convolution kernel CNN
CN113729735A (en) Emotional electroencephalogram feature representation method based on multi-domain self-adaptive graph convolution neural network
CN112438741B (en) Driving state detection method and system based on electroencephalogram feature transfer learning
CN114469137B (en) Cross-domain electroencephalogram emotion recognition method and system based on space-time feature fusion model
CN109871831A (en) A kind of emotion identification method and system
CN116842361A (en) Epileptic brain electrical signal identification method based on time-frequency attention mixing depth network
CN113180659A (en) Electroencephalogram emotion recognition system based on three-dimensional features and cavity full convolution network
CN115474899A (en) Basic taste perception identification method based on multi-scale convolution neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant