CN113069117A - Electroencephalogram emotion recognition method and system based on time convolution neural network - Google Patents


Info

Publication number
CN113069117A
CN113069117A
Authority
CN
China
Prior art keywords
data
electroencephalogram
baseline
signal
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110360719.6A
Other languages
Chinese (zh)
Inventor
吴万庆
韦程琳
蒋明哲
张献斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sun Yat Sen University filed Critical Sun Yat Sen University
Priority to CN202110360719.6A priority Critical patent/CN113069117A/en
Publication of CN113069117A publication Critical patent/CN113069117A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 - Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 - Details of waveform analysis
    • A61B5/7264 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 - Classification of physiological signals or data involving training the classification device
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G06N3/047 - Probabilistic or stochastic networks
    • G06N3/049 - Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G06N3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Psychiatry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Physiology (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Probability & Statistics with Applications (AREA)
  • Child & Adolescent Psychology (AREA)
  • Social Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Fuzzy Systems (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides an electroencephalogram emotion recognition method and system based on a time convolution neural network. The method comprises the following steps: acquiring emotion-labeled electroencephalogram data from a plurality of subjects, the electroencephalogram signal data comprising a baseline signal and an emotion-induced signal; performing baseline calibration on the emotion-induced signal according to the baseline average value so as to convert the electroencephalogram signal data into sample data; representing the standardized sample data as a vector sequence and dividing the sequence into vector segments of equal length; inputting the segmented vector segments into a pre-constructed time convolution neural model, which extracts the time feature information of the electroencephalogram signal data; and determining the emotion category with a preset softmax classifier according to the time feature information and outputting the classification result. By implementing the method and system, model complexity can be reduced, hardware resource utilization can be improved, and the efficiency and accuracy of emotion recognition can be further improved.

Description

Electroencephalogram emotion recognition method and system based on time convolution neural network
Technical Field
The invention relates to the technical field of emotion recognition, and in particular to an electroencephalogram emotion recognition method and system based on a time convolution neural network.
Background
Mental health is one of the most neglected areas of public health: people focus on the treatment and care of disease, while the influence of psychological factors on physiological state is relatively overlooked. Studies have found that more than 90% of diseases are induced by negative emotions, which over time can develop into psychological illness and even lead to the tragedy of suicide. Therefore, research on human-computer interaction systems based on electroencephalogram interface technology, which can continuously monitor, identify, and intervene in a person's emotional state at the early stage of negative emotion, has positive significance for preventing disease and improving personal well-being. Electroencephalogram (EEG) signals and their feature values, as explicit expressions of the functional state of the central nervous system, have been shown to quantify and identify the dynamic changes of the corresponding emotional brain regions under different emotional states. However, because of the noise sensitivity of EEG signals and the complexity of brain function and structure, how to perform feature decoding and feature fusion on EEG signals to achieve effective recognition of emotional states, and especially real-time calibration of a multi-dimensional emotional state space over continuous time, remains an urgent problem in the field.
At present, there is a large body of emotion recognition research based on EEG signals. One research goal is to improve model accuracy, and the main idea is to find EEG feature representations and optimized models suitable for emotion recognition through a variety of methods. Deep learning has achieved remarkable results in fields such as computer vision and natural language processing and has received much attention. In recent years, many deep learning methods have been applied to EEG-based emotion recognition and have achieved very high recognition accuracy. Li et al. integrated the spatial, frequency-domain, and temporal characteristics of multi-channel EEG signals, mapped the signals into two-dimensional images, and used this series of multi-dimensional EEG feature images to represent the change of emotion over time. These images were input into a hybrid structure of a convolutional neural network and a recurrent neural network for long-time-span EEG emotion recognition, achieving 75.21% accuracy relative to the baseline model. The method first extracts features, then converts the features into images, and then uses a neural network model to further abstract and recognize the low-level features.
Thus, as in traditional machine learning, a better feature combination must be selected through feature engineering, which depends on domain background knowledge and experience. In the typical process, a classifier for emotion recognition is selected in advance; for each feature of a data set, cross-validation with that classifier yields an average accuracy that serves as the criterion for evaluating feature quality, and the features with the highest accuracy are then combined. This approach is highly complex and requires considerable time and effort. Moreover, the hybrid network structure uses two different neural network models, each with its own complexity, so it cannot capture subtler long-term spatial relationships; in addition, the recurrent structure is difficult to train, which makes the algorithm unfavorable for deployment on mobile portable devices.
The prior art has the following defects:
(1) Electroencephalogram data differ from traditional data such as images and videos and lack their visual regularity, so a traditional convolutional neural network learns them poorly. The feature values of the raw EEG data must first be extracted, or the data converted into images, to serve as input to the convolutional neural network model. This input construction is complex and time-consuming and is unfavorable for the development of online real-time processing in future emotion recognition systems.
(2) A model built from a hybrid structure of a convolutional neural network and a recurrent neural network is difficult to implement in hardware: because the network structures differ, two different hardware acceleration engines must be designed, the utilization of hardware resources per unit time is low, and the time cost increases.
Disclosure of Invention
The invention aims to provide an electroencephalogram emotion recognition method and system based on a time convolution neural network that solve the above technical problems, so that the complexity of the model can be reduced, the utilization of hardware resources can be improved, and the efficiency of emotion recognition can be improved.
In order to solve the technical problem, the invention provides an electroencephalogram emotion recognition method based on a time convolution neural network, comprising the following steps:
acquiring electroencephalogram data with emotion labels of a plurality of subjects; wherein the brain electrical signal data for each subject comprises a baseline signal and a mood-inducing signal;
obtaining a baseline average value of the baseline signal, and performing baseline calibration on the emotion-induced signal according to the baseline average value so as to convert the electroencephalogram signal data into sample data;
representing the sample data after data standardization processing as a vector sequence, and dividing the vector sequence into vector segments with equal length and time interval of L by adopting a preset first sliding window;
inputting the segmented vector segments into a pre-constructed time convolution neural model, and extracting time characteristic information of the electroencephalogram signal data through the time convolution neural model;
and judging the emotion types by using a preset softmax classifier according to the time characteristic information and outputting a classification result.
Further, the obtaining a baseline average value of the baseline signal and performing baseline calibration on the emotion-induced signal according to the baseline average value to convert the electroencephalogram signal data into sample data specifically includes:
dividing the baseline signal into m matrixes by using a preset second sliding window, wherein m is the number of time intervals L in the total length T of the baseline signal;
on the basis of the m matrixes, a preset formula is utilized to obtain a baseline average value of the baseline signal;
and performing baseline calibration on the emotional evoked signals according to the baseline average value so as to convert the electroencephalogram signal data into the sample data.
Further, the data standardization process adopts a Z-score standardization method.
Further, the time convolution neural model includes a causal convolution layer, a dilated (hole) convolution layer, and a residual convolution layer.
In order to solve the same technical problem, the invention also provides an electroencephalogram emotion recognition system based on the time convolution neural network, comprising:
the data acquisition module is used for acquiring electroencephalogram data with emotion labels of a plurality of subjects; wherein the electroencephalogram signal data for each subject includes a baseline signal and an emotional-evoked signal;
the data conversion module is used for obtaining a baseline average value of the baseline signal and carrying out baseline calibration on the emotion-induced signal according to the baseline average value so as to convert the electroencephalogram signal data into sample data;
the vector segmentation module is used for representing the sample data after data standardization processing as a vector sequence and segmenting the vector sequence into vector segments with equal length and time interval of L by adopting a preset first sliding window;
the characteristic extraction module is used for inputting the segmented vector segments into a pre-constructed time convolution neural model and extracting time characteristic information of the electroencephalogram signal data through the time convolution neural model;
and the emotion recognition module is used for judging the emotion types according to the time characteristic information by using a preset softmax classifier and outputting a classification result.
Further, the data conversion module is specifically configured to: dividing the baseline signal into m matrixes by using a preset second sliding window, wherein m is the number of time intervals L in the total length T of the baseline signal; on the basis of the m matrixes, a preset formula is utilized to obtain a baseline average value of the baseline signal; and performing baseline calibration on the emotion induction signal according to the baseline average value so as to convert the electroencephalogram signal data into the sample data.
Further, the data standardization process adopts a Z-score standardization method.
Further, the time convolution neural model includes a causal convolution layer, a dilated (hole) convolution layer, and a residual convolution layer.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides an electroencephalogram emotion recognition method and system based on a time convolution neural network, the method comprising: acquiring emotion-labeled electroencephalogram data from a plurality of subjects, wherein the electroencephalogram signal data of each subject comprise a baseline signal and an emotion-induced signal; obtaining a baseline average value of the baseline signal and performing baseline calibration on the emotion-induced signal according to the baseline average value so as to convert the electroencephalogram signal data into sample data; representing the standardized sample data as a vector sequence and dividing the vector sequence into vector segments of equal length using a preset first sliding window; inputting the segmented vector segments into a pre-constructed time convolution neural model and extracting the time feature information of the electroencephalogram signal data through the model; and determining the emotion category with a preset softmax classifier according to the time feature information and outputting the classification result. By implementing the method, model complexity can be reduced, hardware resource utilization can be improved, and the efficiency and accuracy of emotion recognition can be improved.
Drawings
Fig. 1 is a schematic flow chart of an electroencephalogram emotion recognition method based on a time convolution neural network according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a time convolutional network according to an embodiment of the present invention;
FIG. 3 is another schematic flow chart of an electroencephalogram emotion recognition method based on a time convolution neural network according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electroencephalogram emotion recognition system based on a time convolution neural network according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without any inventive step, are within the scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a method for recognizing electroencephalogram emotion based on a time convolution neural network, including the steps of:
s1, acquiring electroencephalogram data with emotion labels of a plurality of subjects; wherein the electroencephalographic signal data for each subject includes a baseline signal and an emotional-evoked signal.
And S2, obtaining a baseline average value of the baseline signal, and performing baseline calibration on the emotion inducing signal according to the baseline average value so as to convert the electroencephalogram signal data into sample data.
In the embodiment of the present invention, further, step S2 specifically includes:
s201, dividing the baseline signal into m matrixes by using a preset second sliding window, wherein m is the number of time intervals L in the total length T of the baseline signal;
s202, based on the m matrixes, calculating a baseline average value of the baseline signal by using a preset formula;
and S203, performing baseline calibration on the emotion induced signal according to the baseline average value so as to convert the electroencephalogram signal data into the sample data.
S3, representing the sample data after data standardization processing as a vector sequence, and dividing the vector sequence into vector segments with equal length and time interval of L by adopting a preset first sliding window; the data normalization process is performed by using a Z-score normalization method.
S4, inputting the segmented vector segments into a pre-constructed time convolution neural model, and extracting the time feature information of the electroencephalogram signal data through the model; the time convolution neural model includes a causal convolution layer, a dilated (hole) convolution layer, and a residual convolution layer.
And S5, judging the emotion category according to the time characteristic information by using a preset softmax classifier, and outputting a classification result.
Referring to fig. 2-3, based on the above-mentioned schemes, in order to better understand the electroencephalogram emotion recognition method based on the time convolutional neural network provided in the embodiments of the present invention, the following steps of the technical scheme are described in detail:
(1) data pre-processing
Electroencephalogram data with emotion labels are acquired from the DEAP data set, which contains EEG data from 32 subjects (63 s per trial, of which the first 3 s is the EEG signal in a calm state, called the baseline signal, and the remaining 60 s is the EEG signal acquired under emotional induction). It should be noted that the baseline signal is the EEG signal acquired while the subject receives no stimulation (calm state), and the emotion-induced signal is the EEG signal acquired while the subject watches an emotion-inducing video. In this embodiment the emotional state labels are high/low arousal and high/low valence; arousal and valence are the quantitative indicators of emotion defined in the two-dimensional emotion classification model. Since only 3 seconds of baseline signal were collected in the data set, they are averaged. Embodiments of the present invention use the deviation between the original EEG signal generated by a subject under stimulation with emotional material and the baseline signal (calm state) to represent the emotional state over a particular period of time.
First, a multichannel baseline signal of length T seconds is regarded as a C × T matrix mat, where C denotes the number of channels (equal to the number of leads, here 32) and T denotes the number of signal sample points in the T-second period. The baseline signal is then partitioned into m matrices using a sliding window of size 1 second. The baseline length in the DEAP data set is 3 seconds, so m is taken to be 3. Finally, the baseline average base_mean can be calculated by the following formula:

$$\mathrm{base\_mean} = \frac{1}{m}\sum_{i=1}^{m} mat_i$$
The emotion-induced signal is divided into n matrices using a sliding window of the same size, and the deviation between the emotion-induced signal and the baseline average is taken as sample data; this process can be regarded as a calibration of the experimental signal and better represents the emotional state reflected by the EEG. After signal calibration, because differences in scale between evaluation indexes affect the results of data analysis, data standardization is needed to make the data indexes comparable. The invention uses the Z-score standardization method to standardize the experimental data.
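The baseline calibration and Z-score standardization described above can be sketched in numpy as follows. This is an illustrative sketch under stated assumptions: the function names, the `fs` samples-per-second parameter, and the per-channel standardization are choices made here, not details fixed by the text.

```python
import numpy as np

def baseline_mean(baseline, fs):
    """Split a (C, T) baseline signal into m windows of fs samples and average them."""
    C, T = baseline.shape
    m = T // fs                              # number of 1 s windows (m = 3 for DEAP)
    mats = baseline[:, :m * fs].reshape(C, m, fs)
    return mats.mean(axis=1)                 # (C, fs) baseline average, base_mean

def calibrate_and_standardize(trial, baseline, fs):
    """Subtract base_mean from each 1 s window of the trial, then Z-score per channel."""
    C, T = trial.shape
    n = T // fs                              # n windows of the emotion-induced signal
    base = baseline_mean(baseline, fs)
    windows = trial[:, :n * fs].reshape(C, n, fs)
    calibrated = (windows - base[:, None, :]).reshape(C, n * fs)
    # Z-score standardization: zero mean, unit variance for each channel
    mu = calibrated.mean(axis=1, keepdims=True)
    sigma = calibrated.std(axis=1, keepdims=True) + 1e-8
    return (calibrated - mu) / sigma
```

For a DEAP-like trial this would be called with a (32, 3×128) baseline and a (32, 60×128) emotion-induced signal at fs = 128.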
(2) Constructing inputs to a time convolutional neural network
The multi-channel EEG time series obtained by the acquisition equipment at each time point can be regarded as a vector; that is, the signal at time t can be expressed as

$$v_t = \left[ s_t^1, s_t^2, \ldots, s_t^n \right]^{T}$$

where $s_t^i$ is the voltage value of the EEG signal collected by the i-th electrode channel and n is the total number of electrodes. For the interval [t, t + L], the invention uses a sliding window to divide the vector sequence of the one-dimensional signal into vector segments of equal time interval L:

$$V_j = \{ v_t, v_{t+1}, \ldots, v_{t+S-1} \}$$

where S is the size of the sliding window and the subscript j is the sequence number of the signal segment. The time convolution network in the model takes the vector segments $V_j$ as input.
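The segmentation of the vector sequence into windows $V_j$ can be sketched as below. The function name and the non-overlapping default step are assumptions made for illustration; the text itself only fixes the window size S.

```python
import numpy as np

def segment(signal, S, step=None):
    """Split a (C, T) signal into vector segments V_j of S consecutive columns.

    Returns an array of shape (n_segments, C, S); step == S (the default)
    yields non-overlapping segments.
    """
    step = step or S
    C, T = signal.shape
    starts = range(0, T - S + 1, step)
    return np.stack([signal[:, t:t + S] for t in starts])
```

With a 32-channel signal sampled at 128 Hz, `segment(x, S=128)` yields one (32, 128) segment per second of data, matching the input dimension (128, 32) given below up to transposition.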
(3) Time convolution neural network structure design
EEG signals are non-stationary signals that vary with time. For time-series modeling, the temporal convolutional network (TCN) offers better clarity and simplicity than conventional RNNs and their variants, LSTM and GRU.
An initial time convolution neural model is established, in which the TCN is composed of causal convolution, dilated (hole) convolution, and residual convolution. Given an input time series of length N, {x} = {x_0, ..., x_{N-1}}, the causal convolution layer outputs {y} = {y_0, ..., y_{N-1}}, predicting {y} from {x}.
The vector segments constructed from the raw data are input into the TCN for training and learning. The size of the network input layer is the dimension of the EEG sequence (128, 32). A convolutional network of 3 TCN residual blocks extracts the time feature information of each channel from the input, with convolution kernel size k = 3 and dilation coefficient d = 3. The numbers of convolution kernels in the 3 TCN residual blocks are 128, 64, and 32, respectively. Finally, a softmax layer outputs the class probabilities, and the final classification layer outputs the classification result as label 0 or 1: output 0 indicates that the current sample is classified as low arousal/valence, and output 1 indicates that it is classified as high arousal/valence.
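The dilated causal convolution at the heart of a TCN residual block can be illustrated in plain numpy. This is a sketch of the mechanism only, not the patent's PyTorch model: with kernel size k and dilation d, each output y_t depends only on x_t, x_{t-d}, ..., x_{t-(k-1)d}, never on future samples.

```python
import numpy as np

def dilated_causal_conv(x, w, d):
    """1-D dilated causal convolution of sequence x with kernel w and dilation d."""
    k, T = len(w), len(x)
    pad = (k - 1) * d
    # Left zero-padding keeps the output causal and the same length as the input.
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    y = np.zeros(T)
    for t in range(T):
        for i in range(k):
            y[t] += w[i] * xp[pad + t - i * d]
    return y

def receptive_field(k, dilations):
    """Receptive field of a stack of dilated causal conv layers with kernel size k."""
    return 1 + sum((k - 1) * d for d in dilations)
```

Stacking layers with growing dilations is what lets a TCN cover long EEG windows with few layers, e.g. `receptive_field(3, [1, 2, 4])` already spans 15 time steps.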
The softmax classifier is an extension of the logistic regression model. Let the m training samples be {(x^(1), y^(1)), (x^(2), y^(2)), ..., (x^(m), y^(m))} with labels y^(i) ∈ {1, ..., k}. The probability p(y = j | x) represents the probability that, given input x, the sample is judged to belong to class j; the class with the highest probability is chosen. That is, for a k-class task the output vector is k-dimensional, its elements sum to 1, and the output is:

$$p\left(y^{(i)} = j \mid x^{(i)}; \theta\right) = \frac{e^{\theta_j^{T} x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^{T} x^{(i)}}}$$
where θ is the model parameter, obtained by minimizing the cost function J(θ) given by:

$$J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y^{(i)} = j\} \log \frac{e^{\theta_j^{T} x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^{T} x^{(i)}}}\right]$$

where $1\{\cdot\}$ is the indicator function, equal to 1 when its condition holds and 0 otherwise.
A weight decay term is added to the cost function to penalize excessively large weights, so that the parameters converge to the optimum. The modified cost function is as follows:

$$J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y^{(i)} = j\} \log \frac{e^{\theta_j^{T} x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^{T} x^{(i)}}}\right] + \frac{\lambda}{2}\sum_{i}\sum_{j}\theta_{ij}^{2}$$

where λ is the weight decay coefficient and $\frac{\lambda}{2}\sum_{i}\sum_{j}\theta_{ij}^{2}$ is the weight decay term. The schematic diagram of the time convolution network structure is shown in fig. 2.
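The softmax output and the weight-decayed cost above can be sketched in numpy as follows. Shapes and names here (θ as a (k, n) matrix, `lam` for λ, 0-based class labels) are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def softmax_probs(theta, X):
    """theta: (k, n) parameters; X: (m, n) inputs -> (m, k) class probabilities."""
    z = X @ theta.T
    z -= z.max(axis=1, keepdims=True)        # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)  # rows sum to 1

def cost(theta, X, y, lam):
    """Cross-entropy J(theta) plus the (lam / 2) * sum(theta^2) weight decay term."""
    m = X.shape[0]
    p = softmax_probs(theta, X)
    log_likelihood = np.log(p[np.arange(m), y])   # log p of the true class per sample
    return -log_likelihood.mean() + 0.5 * lam * np.sum(theta ** 2)
```

With θ = 0 every class gets probability 1/k, so the cost reduces to log k, which is a convenient sanity check on an implementation.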
The following are the emotion recognition experimental settings:
The scheme performs EEG emotion recognition on short EEG signal segments, so a 1 s time window is used for sample segmentation as a means of data augmentation; related research indicates that 1 s is a suitable window length for emotion recognition. Thus, for the DEAP data set, each 1-minute EEG signal is divided into 60 segments of 1 s, for a total of 2400 samples (40 × 60). Emotional states are classified according to the two-dimensional emotion classification model; since the emotion labels in the data set are continuous values from 1 to 9, 5 is chosen as the threshold to divide them into positive and negative classes, i.e., high/low arousal and high/low valence. The classification effect of the model is evaluated with ten-fold cross-validation on each subject's data. All code was written in Python with the PyTorch deep learning framework; the parameter values set during training were a dropout rate of 0.05 and 1000 iterations. Emotional EEG signals from 32 subjects were tested, and the effectiveness of the method was verified using ten-fold cross-validation.
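The sample bookkeeping described above can be sketched as follows. Whether a rating of exactly 5 counts as high or low is not specified in the text, so a strict greater-than comparison is assumed here.

```python
import numpy as np

# 40 one-minute trials per subject, cut into 1 s windows: 40 * 60 = 2400 samples.
n_trials, trial_len_s, win_s = 40, 60, 1
n_samples = n_trials * (trial_len_s // win_s)

def binarize(ratings, threshold=5.0):
    """Map continuous 1-9 arousal/valence ratings to 0 (low) or 1 (high)."""
    return (np.asarray(ratings, dtype=float) > threshold).astype(int)
```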
The following criteria may be used to measure the performance of the model:
(1) Accuracy:

$$\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}$$
where TP is the number of positive samples predicted as positive, FN is the number of positive samples predicted as negative, FP is the number of negative samples predicted as positive, and TN is the number of negative samples predicted as negative.
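The accuracy formula above translates directly to code; the confusion-count helper added here is an illustrative convenience, not part of the patent text.

```python
def confusion_counts(y_true, y_pred):
    """Derive (TP, TN, FP, FN) from binary label sequences."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def accuracy(tp, tn, fp, fn):
    """Accuracy = (TP + TN) / (TP + TN + FP + FN)."""
    return (tp + tn) / (tp + tn + fp + fn)
```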
(2) Number of model parameters (space complexity): reflected in the size of the model itself.
(3) Model computation amount (time complexity): determines the training/prediction time of the model.
The embodiment of the invention uses 10-fold cross-validation to evaluate the classification accuracy of the model. The classification results of the model for all 32 subjects in DEAP are shown in Table 1 (arousal and valence dimensions).
TABLE 1 summary of the classification results of TCN models on DEAP data sets
As can be seen from the table, the recognition accuracy for the 32 subjects is good (the average recognition rates for the valence and arousal emotional states are approximately 96.28% and 96.85%, respectively), and the standard deviations show that the recognition performance of the time convolution neural network is stable, with no particularly large deviation in any test. The model can accurately distinguish different valence and arousal emotional states. Compared with the model of Yang et al., the recognition model of the embodiment of the invention is slightly better in accuracy with roughly half as many parameters (300,066 vs. 593,411), so the modeling and prediction time is greatly reduced, as shown in the following table.
                     Total parameters   Training time   Prediction time
This scheme          300,066            0.9 s           0.11 s
Yang et al. scheme   593,411            180 s           5 s
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
1. the electroencephalogram signal characteristics related to the emotional task can be quickly and effectively extracted end to end.
2. The model has high calculation speed and small complexity parameters, and is easy to be deployed on mobile equipment with limited calculation resources.
It should be noted that the above method or flow embodiments are described as a series of action combinations for simplicity, but those skilled in the art should understand that the present invention is not limited by the described order of actions, as some steps may be performed in other orders or simultaneously according to the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are preferred embodiments, and the actions involved are not necessarily all required by the embodiments of the invention.
Referring to fig. 4, in order to solve the same technical problem, the present invention further provides an electroencephalogram emotion recognition system based on a time convolution neural network, including:
the data acquisition module 1 is used for acquiring electroencephalogram data with emotion labels of a plurality of subjects; wherein the electroencephalogram signal data of each subject includes a baseline signal and an emotional evoked signal;
the data conversion module 2 is used for obtaining a baseline average value of the baseline signal and performing baseline calibration on the emotion-induced signal according to the baseline average value, so as to convert the electroencephalogram signal data into sample data;
the vector segmentation module 3 is configured to represent the sample data subjected to data normalization processing as a vector sequence, and segment the vector sequence into vector segments with equal length and time intervals of L by using a preset first sliding window;
the feature extraction module 4 is used for inputting the segmented vector segments into a pre-constructed time convolution neural model and extracting time feature information of the electroencephalogram signal data through the time convolution neural model;
and the emotion recognition module 5 is used for judging emotion types according to the time characteristic information by using a preset softmax classifier and outputting a classification result.
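The sliding-window segmentation performed by the vector segmentation module 3 can be sketched as follows. This is a minimal NumPy sketch under assumptions not fixed by the description: the step size defaults to a non-overlapping window, and trailing samples that do not fill a whole window are dropped.

```python
import numpy as np

def segment_with_sliding_window(data, window_len, step=None):
    """Split a (channels, time) EEG array into equal-length segments.

    window_len corresponds to the time interval L in the description.
    step defaults to window_len (non-overlapping windows); remainder
    samples at the end are discarded -- both are assumptions, since the
    description only fixes the window length L.
    """
    if step is None:
        step = window_len
    n_channels, n_samples = data.shape
    starts = range(0, n_samples - window_len + 1, step)
    # Stack one (channels, window_len) matrix per window position.
    return np.stack([data[:, s:s + window_len] for s in starts])
```

Each resulting segment is one input vector segment for the temporal convolutional neural model.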
In the embodiment of the present invention, further, the data conversion module 2 is specifically configured to: dividing the baseline signal into m matrixes by using a preset second sliding window, wherein m is the number of time intervals L in the total length T of the baseline signal; on the basis of the m matrixes, a preset formula is utilized to obtain a baseline average value of the baseline signal; and performing baseline calibration on the emotion induced signal according to the baseline average value so as to convert the electroencephalogram signal data into the sample data.
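The baseline-calibration step performed by the data conversion module 2 can be sketched as follows. This is a minimal NumPy sketch: the baseline of total length T is cut into m = T // L matrices, their element-wise mean is taken as the baseline average, and that average is subtracted from each same-length window of the emotion-induced signal. The subtraction is the usual reading of "baseline calibration"; the publication itself only refers to a preset formula, so treat this as an assumption.

```python
import numpy as np

def baseline_calibrate(baseline, trial, window_len):
    """Baseline-calibrate an emotion-evoked EEG trial.

    baseline and trial are (channels, time) arrays. The baseline is split
    into m = T // window_len matrices of shape (channels, window_len);
    their element-wise mean is the baseline average, which is subtracted
    from every window of the trial (assumed calibration formula).
    """
    n_ch, T = baseline.shape
    m = T // window_len
    # Stack the m baseline matrices and average them element-wise.
    base_mats = baseline[:, :m * window_len].reshape(n_ch, m, window_len)
    base_mean = base_mats.mean(axis=1)          # (channels, window_len)
    # Subtract the baseline average from each window of the trial.
    n_win = trial.shape[1] // window_len
    out = trial[:, :n_win * window_len].copy().reshape(n_ch, n_win, window_len)
    out -= base_mean[:, None, :]
    return out.reshape(n_ch, n_win * window_len)
```

The calibrated output is what the description calls the sample data, ready for standardization and segmentation.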
In the embodiment of the invention, further, the data standardization process adopts a Z-score standardization method.
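The Z-score standardization referred to above has a standard form: subtract the mean and divide by the standard deviation. A minimal NumPy sketch follows; the epsilon guard against division by zero on flat channels is an implementation detail not given in the publication.

```python
import numpy as np

def z_score(x, axis=-1, eps=1e-8):
    """Z-score standardization: zero mean, unit variance along `axis`.

    eps guards against division by zero on constant signals
    (an assumed implementation detail).
    """
    mean = x.mean(axis=axis, keepdims=True)
    std = x.std(axis=axis, keepdims=True)
    return (x - mean) / (std + eps)
```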
In an embodiment of the present invention, further, the temporal convolutional neural model includes a causal convolutional layer, a dilated convolutional layer, and a residual convolutional layer.
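The three layer types named above combine in a standard TCN building block: causal dilated convolutions wrapped in a residual connection. The following is a minimal NumPy sketch of that structure, not the embodiment's actual network: the kernel size, activation, and two-convolution layout are assumptions (published TCN designs typically also use weight normalization and dropout, omitted here), and equal input/output channel counts are assumed so the identity skip connection applies.

```python
import numpy as np

def causal_dilated_conv1d(x, weights, dilation):
    """Causal dilated 1-D convolution over a (channels, time) array.

    weights has shape (out_ch, in_ch, k). The input is left-padded with
    (k - 1) * dilation zeros, so the output at time t depends only on
    inputs at times <= t (causality); the dilation spaces the taps apart.
    """
    out_ch, in_ch, k = weights.shape
    pad = (k - 1) * dilation
    xp = np.pad(x, ((0, 0), (pad, 0)))
    T = x.shape[1]
    out = np.zeros((out_ch, T))
    for t in range(T):
        # Taps at padded positions t, t - d, ..., t - (k-1)d.
        taps = xp[:, [t + pad - j * dilation for j in range(k)]]
        out[:, t] = np.einsum('oik,ik->o', weights, taps[:, ::-1])
    return out

def tcn_residual_block(x, w1, w2, dilation):
    """Minimal TCN residual block: two causal dilated convolutions with
    ReLU activations plus an identity skip connection (equal channel
    counts assumed)."""
    h = np.maximum(causal_dilated_conv1d(x, w1, dilation), 0.0)
    h = np.maximum(causal_dilated_conv1d(h, w2, dilation), 0.0)
    return x + h
```

Stacking such blocks with exponentially increasing dilation is what lets a TCN cover long temporal contexts with few parameters, consistent with the small parameter count reported for this scheme.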
It can be understood that the above system item embodiments correspond to the method item embodiments of the present invention, and the electroencephalogram emotion recognition system based on the time convolution neural network provided by the embodiments of the present invention can implement the electroencephalogram emotion recognition method based on the time convolution neural network provided by any one of the method item embodiments of the present invention.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention.

Claims (8)

1. An electroencephalogram emotion recognition method based on a time convolution neural network, characterized by comprising the following steps:
acquiring electroencephalogram data with emotion labels of a plurality of subjects; wherein the electroencephalogram signal data of each subject includes a baseline signal and an emotional evoked signal;
obtaining a baseline average value of the baseline signal, and performing baseline calibration on the emotion-induced signal according to the baseline average value so as to convert the electroencephalogram signal data into sample data;
representing the sample data after data standardization processing as a vector sequence, and dividing the vector sequence into vector segments with equal length and time interval of L by adopting a preset first sliding window;
inputting the segmented vector segments into a pre-constructed time convolution neural model, and extracting time characteristic information of the electroencephalogram signal data through the time convolution neural model;
and judging the emotion types by using a preset softmax classifier according to the time characteristic information and outputting a classification result.
2. The electroencephalogram emotion recognition method based on the time convolution neural network, according to claim 1, wherein the obtaining of the baseline average value of the baseline signal and the baseline calibration of the emotion-induced signal according to the baseline average value are performed to convert the electroencephalogram signal data into sample data specifically include:
dividing the baseline signal into m matrixes by using a preset second sliding window, wherein m is the number of time intervals L in the total length T of the baseline signal;
on the basis of the m matrixes, a preset formula is utilized to obtain a baseline average value of the baseline signal;
and performing baseline calibration on the emotional evoked signals according to the baseline average value so as to convert the electroencephalogram signal data into the sample data.
3. The electroencephalogram emotion recognition method based on the time convolution neural network, according to claim 1, characterized in that the data standardization process is a Z-score standardization method.
4. The electroencephalogram emotion recognition method based on the time convolutional neural network, as recited in claim 1, wherein the time convolutional neural model comprises a causal convolutional layer, a dilated convolutional layer, and a residual convolutional layer.
5. An electroencephalogram emotion recognition system based on a time convolution neural network, characterized by comprising:
the data acquisition module is used for acquiring electroencephalogram data with emotion labels of a plurality of subjects; wherein the electroencephalogram signal data of each subject includes a baseline signal and an emotional evoked signal;
the data conversion module is used for obtaining a baseline average value of the baseline signal and carrying out baseline calibration on the emotion-induced signal according to the baseline average value so as to convert the electroencephalogram signal data into sample data;
the vector segmentation module is used for representing the sample data subjected to data standardization processing as a vector sequence and segmenting the vector sequence into vector segments with equal length and time interval of L by adopting a preset first sliding window;
the characteristic extraction module is used for inputting the segmented vector segments into a pre-constructed time convolution neural model and extracting time characteristic information of the electroencephalogram signal data through the time convolution neural model;
and the emotion recognition module is used for judging the emotion category according to the time characteristic information by using a preset softmax classifier and outputting a classification result.
6. The electroencephalogram emotion recognition system based on the time convolutional neural network of claim 5, wherein the data conversion module is specifically configured to: dividing the baseline signal into m matrixes by using a preset second sliding window, wherein m is the number of time intervals L in the total length T of the baseline signal; on the basis of the m matrixes, a preset formula is utilized to obtain a baseline average value of the baseline signal; and performing baseline calibration on the emotional evoked signals according to the baseline average value so as to convert the electroencephalogram signal data into the sample data.
7. The electroencephalogram emotion recognition system based on the time-convolutional neural network, as claimed in claim 5, wherein the data normalization process is a Z-score normalization method.
8. The electroencephalogram emotion recognition system based on the time convolutional neural network of claim 5, wherein the time convolutional neural model comprises a causal convolutional layer, a dilated convolutional layer, and a residual convolutional layer.
CN202110360719.6A 2021-04-02 2021-04-02 Electroencephalogram emotion recognition method and system based on time convolution neural network Pending CN113069117A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110360719.6A CN113069117A (en) 2021-04-02 2021-04-02 Electroencephalogram emotion recognition method and system based on time convolution neural network


Publications (1)

Publication Number Publication Date
CN113069117A true CN113069117A (en) 2021-07-06

Family

ID=76614921

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110360719.6A Pending CN113069117A (en) 2021-04-02 2021-04-02 Electroencephalogram emotion recognition method and system based on time convolution neural network

Country Status (1)

Country Link
CN (1) CN113069117A (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107714057A (en) * 2017-10-01 2018-02-23 南京邮电大学盐城大数据研究院有限公司 A kind of three classification Emotion identification model methods based on convolutional neural networks
CN109199414A (en) * 2018-10-30 2019-01-15 武汉理工大学 A kind of audiovisual induction Emotion identification method and system based on EEG signals
CN109805898A (en) * 2019-03-22 2019-05-28 中国科学院重庆绿色智能技术研究院 Critical illness Mortality Prediction method based on attention mechanism timing convolutional network algorithm
CN110353702A (en) * 2019-07-02 2019-10-22 华南理工大学 A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN110472779A (en) * 2019-07-30 2019-11-19 东莞理工学院 A kind of power-system short-term load forecasting method based on time convolutional network
CN110610168A (en) * 2019-09-20 2019-12-24 合肥工业大学 Electroencephalogram emotion recognition method based on attention mechanism
US20200107766A1 (en) * 2018-10-09 2020-04-09 Sony Corporation Electronic device for recognition of mental behavioral attributes based on deep neural networks
CN111027686A (en) * 2019-12-26 2020-04-17 杭州鲁尔物联科技有限公司 Landslide displacement prediction method, device and equipment
CN111317468A (en) * 2020-02-27 2020-06-23 腾讯科技(深圳)有限公司 Electroencephalogram signal classification method and device, computer equipment and storage medium
CN111839506A (en) * 2019-04-30 2020-10-30 清华大学 Mental load detection method and device
CN112244873A (en) * 2020-09-29 2021-01-22 陕西科技大学 Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SONG Zhenzhen et al., "Emotion recognition algorithm based on temporal convolutional networks", Journal of East China University of Science and Technology (Natural Science Edition) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113491523A (en) * 2021-07-30 2021-10-12 济南汇医融工科技有限公司 Electrocardiosignal characteristic point detection method and system
CN113554110A (en) * 2021-07-30 2021-10-26 合肥工业大学 Electroencephalogram emotion recognition method based on binary capsule network
CN113554110B (en) * 2021-07-30 2024-03-01 合肥工业大学 Electroencephalogram emotion recognition method based on binary capsule network
CN114052735A (en) * 2021-11-26 2022-02-18 山东大学 Electroencephalogram emotion recognition method and system based on deep domain adaptation
CN116269386A (en) * 2023-03-13 2023-06-23 中国矿业大学 Multichannel physiological time sequence emotion recognition method based on ordinal division network
CN116269386B (en) * 2023-03-13 2024-06-11 中国矿业大学 Multichannel physiological time sequence emotion recognition method based on ordinal division network

Similar Documents

Publication Publication Date Title
CN114052735B (en) Electroencephalogram emotion recognition method and system based on deep domain adaptation
Safayari et al. Depression diagnosis by deep learning using EEG signals: A systematic review
CN110693493A (en) Epilepsy electroencephalogram prediction method based on convolution and recurrent neural network combined time multiscale
CN113069117A (en) Electroencephalogram emotion recognition method and system based on time convolution neural network
CN112244873A (en) Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network
CN114176607B (en) Electroencephalogram signal classification method based on Vision Transformer
Yu et al. Epileptic seizure prediction using deep neural networks via transfer learning and multi-feature fusion
CN113011330B (en) Electroencephalogram signal classification method based on multi-scale neural network and cavity convolution
Zeng et al. GRP-DNet: A gray recurrence plot-based densely connected convolutional network for classification of epileptiform EEG
Anh-Dao et al. A multistage system for automatic detection of epileptic spikes
CN113974655A (en) Epileptic seizure prediction method based on electroencephalogram signals
Ellis et al. A novel local explainability approach for spectral insight into raw eeg-based deep learning classifiers
CN113076878A (en) Physique identification method based on attention mechanism convolution network structure
CN114595725B (en) Electroencephalogram signal classification method based on addition network and supervised contrast learning
CN107045624B (en) Electroencephalogram signal preprocessing and classifying method based on maximum weighted cluster
CN113796873B (en) Wearable dynamic electrocardiosignal classification method and system
CN116898454B (en) Epileptic classification method and system based on electroencephalogram feature fusion deep learning model
Al-hajjar et al. Epileptic seizure detection using feature importance and ML classifiers
CN113255789A (en) Video quality evaluation method based on confrontation network and multi-tested electroencephalogram signals
CN115700104B (en) Self-interpretable electroencephalogram signal classification method based on multi-scale prototype learning
Liu et al. Automated Machine Learning for Epileptic Seizure Detection Based on EEG Signals.
Jain et al. An efficient feature extraction technique and novel normalization method to improve EMG signal classification
Tang et al. Multi-Domain Based Dynamic Graph Representation Learning for EEG Emotion Recognition
CN117708682B (en) Intelligent brain wave acquisition and analysis system and method
CN114680904B (en) Electroencephalogram signal quality assessment method based on characteristic wave detection and stage algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210706