CN113569997A - Emotion classification method and system based on graph convolutional neural network - Google Patents

Emotion classification method and system based on graph convolutional neural network

Info

Publication number
CN113569997A
CN113569997A
Authority
CN
China
Prior art keywords
wavelet
electroencephalogram
neural network
emotion
entropy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111014567.0A
Other languages
Chinese (zh)
Inventor
郑向伟
高鹏志
张利峰
王涛
陈宣池
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Mass Institute Of Information Technology
Shandong Normal University
Original Assignee
Shandong Mass Institute Of Information Technology
Shandong Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Mass Institute Of Information Technology, Shandong Normal University filed Critical Shandong Mass Institute Of Information Technology
Priority to CN202111014567.0A
Publication of CN113569997A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention provides an emotion classification method and system based on a graph convolutional neural network, comprising the following steps: acquiring a raw electroencephalogram (EEG) signal; obtaining an emotion classification result from a preset emotion classification model and the acquired raw EEG signal. The emotion classification model is trained from an EEG training set and a graph convolutional neural network. Acquiring the EEG training set comprises: acquiring a raw EEG data set, eliminating the basic emotional state from the data set, extracting wavelet coefficients from the data set after the basic emotional state has been eliminated, computing the EEG wavelet energy ratio and wavelet entropy from the wavelet coefficients, and modeling the computed wavelet energy ratio and wavelet entropy features as a graph structure based on the brain function connection network structure. The present disclosure designs a graph convolutional neural network from the brain function connection graph and validates it on the public DEAP data set.

Description

Emotion classification method and system based on graph convolutional neural network
Technical Field
The disclosure belongs to the technical field of emotional state recognition, and particularly relates to an emotion classification method and system based on a graph convolutional neural network.
Background
Emotion is a biological state related to the nervous system that plays an important role in our daily lives, influencing both the small and the large decisions we make. With improvements in computing power and a deepening understanding of emotional states, automatic emotion recognition systems have become widely used. Such systems have succeeded in recognizing emotions from the electroencephalogram (EEG). The EEG is an electrical signal recorded from the cortical surface that reflects the synaptic activation state of neurons in the brain. Recent studies have even shown that the EEG is a suitable signal for biometric authentication. However, EEG signals suffer from temporal asymmetry, nonstationarity, a low signal-to-noise ratio, and the inability to directly localize brain-region responses. Building an EEG-based emotional state recognition system therefore remains a difficult task. Recently, many researchers have proposed EEG emotional state recognition methods based on convolutional neural networks and deep belief networks.
Studies have demonstrated that convolutional neural networks outperform classical machine learning algorithms in emotion recognition from physiological signals. The classification performance of such models is determined by the features the network extracts, and the graph convolutional neural network additionally offers weight sharing, translation invariance, and flexibility with respect to input dimensionality. Research has found that the Graph Convolutional Network (GCN) has clear advantages in the field of graph signal processing. Popsicle et al. applied the GCN to feature extraction for enhanced medical image modeling; Wangxiesong et al. proposed a multi-label image recognition method based on an adaptive multi-scale graph convolutional neural network, constructing the classifier from an adaptive label relation graph and thereby recognizing multiple labels more effectively; Wumengting et al. used a graph convolutional neural network for blind restoration of motion-blurred images; Plum raining et al. applied the GCN to classification studies of Alzheimer's disease, distinguishing four classes: healthy controls (CN), early mild cognitive impairment (EMCI), late mild cognitive impairment (LMCI), and AD.
The inventors of the present disclosure find that existing emotional state recognition systems, and the application of graph convolutional neural networks to EEG-based emotion recognition, suffer from the following problems:
1. Emotion recognition based on a single class of EEG features has low accuracy, and traditional classification models are easily affected by dimensionality, which further lowers recognition accuracy;
2. The Graph Convolutional Network (GCN) has not yet been widely applied to EEG-based emotion recognition, mainly because: (1) the EEG does not directly provide the spatial relative position information between EEG channels; (2) current EEG data sets are relatively small, so overfitting easily occurs during model training; and (3) the EEG features extracted by current methods contribute little to emotion recognition.
Disclosure of Invention
For the raw EEG signal, the dependence of the EEG experiment on the subject is weakened by a basic-emotional-state elimination method; a brain function connection network is constructed from the correlations among all EEG channels; the extracted EEG wavelet entropy and sample entropy features are modeled as a graph structure based on the brain function connection network structure; the graph signals are input into the established graph convolutional neural network model for emotion classification, finally yielding the classification result. The present disclosure designs a graph convolutional neural network from the brain function connection graph and verifies it on the public DEAP (Database for Emotion Analysis using Physiological signals) data set.
In order to achieve the purpose, the invention is realized by the following technical scheme:
In a first aspect, the present disclosure provides an emotion classification method based on a graph convolutional neural network, comprising:
acquiring an original electroencephalogram signal;
obtaining an emotion classification result according to a preset emotion classification model and the obtained original electroencephalogram signal;
The emotion classification model is trained from an EEG training set and a graph convolutional neural network. Acquiring the EEG training set comprises: acquiring a raw EEG data set, eliminating the basic emotional state from the data set, extracting wavelet coefficients from the data set after the basic emotional state has been eliminated, computing the EEG wavelet energy ratio and wavelet entropy from the wavelet coefficients, and modeling the computed wavelet energy ratio and wavelet entropy features as a graph structure based on the brain function connection network structure.
Further, the training of the emotion classification model comprises:
acquiring a raw EEG signal and obtaining the experimental EEG signal under emotional-state stimulation by the basic-emotional-state elimination method;
decomposing the stimulated experimental data into several frequency bands by wavelet packet transform and extracting the wavelet coefficients;
estimating the overall complexity of the EEG with the sample entropy, and computing the wavelet energy ratio and wavelet entropy from the wavelet coefficients of the several frequency bands;
constructing a brain function connection network from a Phase Locking Value (PLV) correlation matrix based on the correlations among all EEG channel signals;
training on the brain function connection network graphs with the graph convolutional neural network to obtain the emotion classification model.
Further, eliminating the basic emotional state of the data set comprises: initializing the input raw EEG data set and, for each subject's raw EEG signal in the data set, intercepting each experimental trial; extracting the first 3 s of calm EEG by the basic-emotional-state elimination method, tiling it to 60 s, extracting the 4-63 s segment of the raw EEG, and subtracting the 60 s tiled calm EEG from it to obtain the emotional-state EEG with the basic emotional state removed.
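The baseline-removal step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes one channel of one trial as a flat sample list starting at t = 0 s, and a hypothetical sampling rate `fs` (DEAP records at 128 Hz).

```python
def remove_baseline(raw, fs=128):
    """Subtract the tiled calm baseline from one channel of one trial.

    raw: flat list of samples for the whole trial, starting at t = 0 s,
         with the first 3 s recorded in the calm state.
    """
    calm = raw[:3 * fs]           # first 3 s: calm (baseline) EEG
    baseline = calm * 20          # tile 3 s -> 60 s
    trial = raw[3 * fs:63 * fs]   # seconds 4-63 of the recording
    return [t - b for t, b in zip(trial, baseline)]
```

The tiled baseline and the trial segment have the same length, so the subtraction is element-wise over the full 60 s.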
Further, extracting the wavelet coefficients from the data set after elimination of the basic emotional state comprises: decomposing the emotional-state EEG signal with an i-level wavelet packet decomposition to obtain 2^i wavelet nodes;
reconstructing the wavelet packet coefficients of the i-th-level nodes to obtain the reconstructed signal of each node, dividing the reconstructed EEG signal into several frequency bands, and extracting the wavelet coefficients of these frequency bands.
Further, computing the EEG wavelet energy ratio and wavelet entropy from the wavelet coefficients comprises:
the sample entropy algorithm comprises the following steps:
(a) let the initial time series be S_i, where i = 1, 2, ..., N;
(b) determine the parameters m and r, where m is the embedding dimension of the vectors to be reconstructed and r is a given non-negative real threshold representing the similarity tolerance between reconstructed vectors;
(c) reconstruct the vector series X_1, X_2, ..., X_{N-m+1} ∈ R^m, where X_i = (S_i, S_{i+1}, ..., S_{i+m-1});
(d) compute the distance between any two reconstructed vectors, D[X_i, X_j] = max_k { |S_{i+k} - S_{j+k}| }, k = 0, 1, ..., m-1, where i and j range from 1 to N-m+1;
(e) given the threshold r (r > 0) and the embedding dimension m, compute the measure of the regularity probability of the sequence X_i,
B_i^m(r) = (number of j ≠ i with D[X_i, X_j] ≤ r) / (N - m - 1),
and average over all i:
B^m(r) = (1 / (N - m)) Σ_{i=1}^{N-m} B_i^m(r);
(f) set the embedding dimension to m + 1;
(g) following steps (a)-(f), compute B^{m+1}(r);
(h) SampEn(m, r, N) = -ln( B^{m+1}(r) / B^m(r) ).
The computation of the wavelet energy ratio and wavelet entropy comprises:
suppose C_i is the wavelet coefficient vector extracted after the wavelet packet transform, where i = Alpha, Theta, Beta1, Beta2, Gamma1, Gamma2. With the band energy E_i = Σ_k |C_i(k)|^2, the total energy of the wavelet coefficients is defined as:
E = Σ_i E_i.
The wavelet energy ratio η_i of the i-th frequency band is defined as:
η_i = E_i / E.
The wavelet entropy of the i-th frequency band is defined as:
Entropy_i = -η_i ln(η_i).
Further, the extracted features are arranged: feature arrangement maps the positions of the electrode channels onto a 2D plane and then arranges the extracted features according to those 2D positions to form a feature cube.
Further, for the design of the classifier, designing the graph convolutional neural network model from the brain function connection network comprises:
Input layer: the input of each sample is the arrangement of the simulated electrodes on the scalp times the number of features extracted per channel;
Graph convolution layer: spatially filters and fuses the input brain function connection network; its connection to the input layer is a local connection;
Graph pooling layer: integrates the new features from the graph convolution layer and reduces their dimensionality;
Graph convolution layer: further reduces the dimensionality of the preceding graph pooling output;
Graph pooling layer: receives the new features computed by the graph convolution layer and performs dimensionality reduction; together with the following layers it forms the classifier;
Fully connected layer: raises the dimensionality of the graph pooling features, providing high-dimensional information for the final classification;
Output layer: outputs the emotional state from the result of the fully connected layer.
In a second aspect, the present disclosure also provides an emotion classification system based on a graph convolutional neural network, comprising a signal acquisition module and an emotion classification module;
the signal acquisition module is configured to: acquire a raw EEG signal;
the emotion classification module is configured to: obtain an emotion classification result from a preset emotion classification model and the acquired raw EEG signal;
The emotion classification model is trained from an EEG training set and a graph convolutional neural network. Acquiring the EEG training set comprises: acquiring a raw EEG data set, eliminating the basic emotional state from the data set, extracting wavelet coefficients from the data set after the basic emotional state has been eliminated, computing the EEG wavelet energy ratio and wavelet entropy from the wavelet coefficients, and modeling the computed wavelet energy ratio and wavelet entropy features as a graph structure based on the brain function connection network structure.
In a third aspect, the present disclosure also provides a computer-readable storage medium on which a computer program is stored which, when executed by a processor, implements the steps of the emotion classification method based on a graph convolutional neural network of the first aspect.
In a fourth aspect, the present disclosure also provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when executing the program, the processor implements the steps of the emotion classification method based on a graph convolutional neural network of the first aspect.
Compared with the prior art, the beneficial effects of the present disclosure are:
1. In the present disclosure, the wavelet energy ratio reflects the energy of each frequency band of the EEG signal; the wavelet entropy, an extension of the wavelet energy ratio, reflects whether the spectral energy distribution of the signal in each subspace is ordered or disordered; the sample entropy reflects the complexity of the time series and is used to measure the complexity of the EEG signal; the brain function connection network graph obtained from the PLV correlation matrix provides spatial information for the GCN emotion recognition model; finally, the brain function connection network graphs are input into the GCN model for classification and prediction, thereby recognizing emotional states. This solves the problems that emotion recognition from a single class of EEG features has low accuracy and that traditional classification models are easily affected by dimensionality, which further lowers recognition accuracy;
2. The emotional state recognition method designed in the present disclosure, based on the brain function connection network and the graph convolutional neural network, provides a basis for the wide application of graph convolutional neural networks in the field of emotion recognition and is of significant importance.
Drawings
The accompanying drawings, which form a part hereof, are included to provide a further understanding of the present embodiments, and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the present embodiments and together with the description serve to explain the present embodiments without unduly limiting the present embodiments.
FIG. 1 is a method framework diagram of example 1 of the present disclosure;
fig. 2 is a structural diagram of a graph convolution neural network according to embodiment 1 of the present disclosure.
Detailed description of embodiments:
the present disclosure is further described with reference to the following drawings and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
Example 1:
As shown in fig. 1, the present embodiment provides an emotion classification method based on a graph convolutional neural network, comprising:
and (3) preprocessing the original electroencephalogram signals by adopting a basic emotional state elimination method to obtain experimental electroencephalogram signals stimulated by emotional states.
For the emotional EEG signals with the basic emotional state removed, the experimental data are decomposed into six frequency bands, Alpha, Theta, Beta1, Beta2, Gamma1 and Gamma2, by the Wavelet Packet Transform (WPT), and the corresponding wavelet coefficients are extracted; in this embodiment, the wavelet coefficients are obtained by wavelet packet decomposition with the db6 wavelet basis. The goal of emotional state recognition is to find the EEG frequency bands most relevant to emotional activity, thereby providing good EEG signal features for EEG-based research.
The sample entropy is used to estimate the complexity of the overall EEG signal, and the wavelet energy ratio and wavelet entropy are computed from the wavelet coefficients of the six frequency bands Alpha, Theta, Beta1, Beta2, Gamma1 and Gamma2.
The labels are set along the high-valence high-arousal (HVHA), high-valence low-arousal (HVLA), low-valence high-arousal (LVHA) and low-valence low-arousal (LVLA) dimensions. A brain function connection network is constructed with the PLV from the correlations among the EEG channel signals, and the brain function connection graph is drawn with the HERMES tool. Specifically, the brain function connection network is constructed from the correlations among all EEG channel signals. The EEG montage consists of 32 electrode positions selected according to the international 10-20 system, namely Fp1, AF3, F3, F7, FC5, FC1, C3, T7, CP5, CP1, P3, P7, PO3, O1, Oz, Pz, Fp2, AF4, Fz, F4, F8, FC6, FC2, Cz, C4, T8, CP6, CP2, P4, P8, PO4 and O2.
A GCN-based emotion recognition model is constructed with 1 input layer, 2 graph convolution layers, 2 graph pooling layers and 1 fully connected layer. The input layer receives the brain function connection network; the graph convolution layers extract the spatial information of the EEG networks and perform a weighted summation of channel-specific features to form classification features that are easy for the model to recognize; the graph pooling layers reduce the dimensionality of the adjacent convolution layers' outputs, compressing the data; the fully connected layer transforms the dimensionality of the pooled data, providing high-dimensional discriminative information for emotion classification; the output layer evaluates the result of the fully connected layer and outputs the emotion recognition result.
In this embodiment, the emotional state recognition method based on the graph convolutional neural network specifically comprises:
Initialize the input raw EEG data set D and, for each subject's EEG signal S_i in D, intercept each experimental trial i.
For each trial i, eliminate the basic emotional state: extract the calm EEG signal S_ci and the post-stimulus EEG signal S_ti, and subtract S_ci from S_ti to obtain the experimental EEG signal S_si under emotional-state stimulation.
Decompose the stimulated experimental EEG signal S_si with an i-level wavelet packet decomposition to obtain 2^i wavelet nodes.
Reconstruct the wavelet packet coefficients of the i-th-level nodes to obtain the reconstructed signal S_{j,m} of each node, where m = 1, 2, ..., 2^j, and extract the wavelet coefficients C_i of the six frequency bands Alpha, Theta, Beta1, Beta2, Gamma1 and Gamma2.
Reconstruct the stimulated EEG signal S_si by wavelet packet and extract the wavelet coefficient C_i of each band's EEG signal. Evaluate the complexity of each channel's EEG signal with the sample entropy algorithm, and use C_i to compute the wavelet energy ratio and wavelet entropy of each frequency band in each channel.
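The wavelet-packet tree structure can be sketched as follows. For brevity this illustration uses the Haar basis (one averaging/differencing step per node) rather than the db6 basis named in this embodiment, so it demonstrates the 2^i-node decomposition structure, not the exact db6 coefficients:

```python
import math

def haar_split(x):
    """One Haar analysis step: (approximation, detail) coefficients at half length."""
    a = [(x[2 * k] + x[2 * k + 1]) / math.sqrt(2) for k in range(len(x) // 2)]
    d = [(x[2 * k] - x[2 * k + 1]) / math.sqrt(2) for k in range(len(x) // 2)]
    return a, d

def wavelet_packet(x, levels):
    """Full wavelet-packet decomposition: level i holds 2**i coefficient nodes."""
    nodes = [list(x)]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            a, d = haar_split(node)
            nxt.append(a)
            nxt.append(d)
        nodes = nxt
    return nodes
```

Unlike the plain wavelet transform, the packet transform splits the detail branches as well, which is what yields the full set of 2^i frequency-band nodes from which the six bands are selected.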
The specific computation steps of the sample entropy are as follows:
(a) let the initial time series be S_i, where i = 1, 2, ..., N;
(b) determine the parameters m and r, where m is the embedding dimension of the vectors to be reconstructed and r is a given non-negative real threshold representing the similarity tolerance between reconstructed vectors;
(c) reconstruct the vector series X_1, X_2, ..., X_{N-m+1} ∈ R^m, where X_i = (S_i, S_{i+1}, ..., S_{i+m-1});
(d) compute the distance between any two reconstructed vectors, D[X_i, X_j] = max_k { |S_{i+k} - S_{j+k}| }, k = 0, 1, ..., m-1, where i and j range from 1 to N-m+1;
(e) given the threshold r (r > 0) and the embedding dimension m, compute the measure of the regularity probability of the sequence X_i,
B_i^m(r) = (number of j ≠ i with D[X_i, X_j] ≤ r) / (N - m - 1),
and average over all i:
B^m(r) = (1 / (N - m)) Σ_{i=1}^{N-m} B_i^m(r);
(f) set the embedding dimension to m + 1;
(g) following steps (a)-(f), compute B^{m+1}(r);
(h) SampEn(m, r, N) = -ln( B^{m+1}(r) / B^m(r) ).
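Steps (a)-(h) can be written out directly in code. This is a straightforward reference implementation of sample entropy under the Chebyshev distance, with illustrative parameter defaults; it is not the patent's optimized routine:

```python
import math

def sample_entropy(s, m=2, r=0.2):
    """SampEn(m, r): -ln(B^{m+1}(r) / B^m(r)) over the series s."""
    n = len(s)

    def matches(dim):
        # Count template pairs whose Chebyshev distance is within r.
        # The same i, j ranges are used for dim = m and dim = m + 1,
        # so the two counts are directly comparable.
        count = 0
        for i in range(n - m):
            for j in range(i + 1, n - m):
                if max(abs(s[i + k] - s[j + k]) for k in range(dim)) <= r:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # no matching templates: entropy is maximal
    return -math.log(a / b)
```

A perfectly regular series yields SampEn = 0 (every length-m match remains a match at length m + 1), while an irregular series yields a larger value.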
The wavelet energy ratio and wavelet entropy are computed as follows:
suppose C_i is the wavelet coefficient vector extracted after the wavelet packet transform, where i = Alpha, Theta, Beta1, Beta2, Gamma1, Gamma2. With the band energy E_i = Σ_k |C_i(k)|^2, the total energy of the wavelet coefficients is defined as:
E = Σ_i E_i.
The wavelet energy ratio η_i of the i-th frequency band is defined as:
η_i = E_i / E.
The wavelet entropy of the i-th frequency band is defined as:
Entropy_i = -η_i ln(η_i).
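The energy-ratio and entropy definitions above map directly onto code. A small sketch, with the band names and coefficient lists as hypothetical inputs:

```python
import math

BANDS = ("Alpha", "Theta", "Beta1", "Beta2", "Gamma1", "Gamma2")

def wavelet_features(coeffs_by_band):
    """Per-band wavelet energy ratio eta_i and wavelet entropy -eta_i * ln(eta_i).

    coeffs_by_band: dict mapping band name -> list of wavelet coefficients C_i.
    """
    energy = {b: sum(c * c for c in cs) for b, cs in coeffs_by_band.items()}
    total = sum(energy.values())                       # E = sum_i E_i
    ratio = {b: e / total for b, e in energy.items()}  # eta_i = E_i / E
    entropy = {b: (-r * math.log(r) if r > 0 else 0.0)
               for b, r in ratio.items()}
    return ratio, entropy
```

The energy ratios sum to 1 by construction, so the per-band entropies behave like terms of a Shannon entropy over the band energy distribution.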
The extracted features are arranged: the positions of the electrode channels are mapped onto a 2D plane, and the extracted features are then arranged according to those 2D positions to form a 6 × 6 × 11 feature cube. The weighted brain network is converted into a binary brain network to facilitate computation of the brain function connection network features.
The brain function connection network features comprise 5 global features and 2 local features. The 5 global network features are the clustering coefficient, the average shortest path length, the assortativity coefficient, the global efficiency and the local efficiency; the 2 local network features are the betweenness centrality and the node degree. The features of the brain function connection network are computed as follows:
(a) Clustering coefficient: the clustering coefficient C represents the degree of aggregation of the nodes in the brain network. For a node of degree k with e edges among its neighbours,
C = 2e / (k (k - 1)),
where C is the node's clustering coefficient and k is the number of edges incident to the node.
(b) Average shortest path length: the average shortest path length measures the connectivity of any two nodes in the brain network; the smaller its value, the more strongly connected the brain function connection network.
L = (1 / (N (N - 1))) Σ_{m ≠ n} L_mn,
where m and n are any two network nodes, L_mn is the shortest path length from node m to node n, and N is the total number of nodes in the network.
(c) Assortativity coefficient: the assortativity coefficient P describes the tendency of nodes in the brain function connection network to connect to nodes similar to themselves; it is the correlation between the degrees d_m and d_n of the two endpoints of each edge, normalized by the variance α of the edge-endpoint degrees.
(d) Global efficiency: the global efficiency measures the overall information transfer and connectivity of the brain function connection network,
E_global = (1 / (N (N - 1))) Σ_{m ≠ n} 1 / L_mn.
(e) Local efficiency: the local efficiency measures the transmission and connectivity of local information; N is the total number of nodes of the brain function connection network, and E_global refers to the global efficiency,
E_local = (1 / N) Σ_m E_global(G_m),
where G_m is the subgraph induced by the neighbours of node m.
(f) Betweenness centrality: the betweenness centrality is computed from the shortest paths between any two nodes m and n and measures how central a node is in the brain function connection network,
B(v) = Σ_{m ≠ v ≠ n} σ_mn(v) / σ_mn,
where σ_mn is the number of shortest paths between m and n and σ_mn(v) is the number of those paths passing through v.
(g) Node degree: the node degree measures the connectivity between a node n and the other nodes of the brain function connection network; it is the sum of the node's out-degree and in-degree,
K_n = Σ_m a_mn.
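Several of these metrics can be sketched in a few lines over a binary adjacency matrix. The following illustration (not the patent's code) implements the node degree, the local clustering coefficient, and the average shortest path length with a plain breadth-first search:

```python
from collections import deque

def node_degree(adj, n):
    """K_n = sum_m a_mn for a binary adjacency matrix."""
    return sum(adj[n])

def clustering_coefficient(adj, n):
    """C = 2e / (k(k - 1)), where e counts edges among node n's neighbours."""
    nbrs = [v for v in range(len(adj)) if adj[n][v]]
    k = len(nbrs)
    if k < 2:
        return 0.0
    e = sum(adj[u][v] for i, u in enumerate(nbrs) for v in nbrs[i + 1:])
    return 2 * e / (k * (k - 1))

def average_shortest_path(adj):
    """L: mean BFS hop distance over ordered reachable node pairs."""
    n = len(adj)
    total = pairs = 0
    for src in range(n):
        dist = [-1] * n
        dist[src] = 0
        q = deque([src])
        while q:  # BFS from src over the unweighted graph
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t in range(n):
            if t != src and dist[t] > 0:
                total += dist[t]
                pairs += 1
    return total / pairs
```

On a triangle graph every node has degree 2, clustering coefficient 1, and all pairwise distances are 1; a three-node path graph lowers the clustering and raises the average path length.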
For the design of the classifier, a model of the GCN is designed according to the brain function connection network nodes. The present disclosure designs the functions of each layer of the GCN model (let li∈W×H×FnIs the size of the ith layer two-dimensional feature map):
(1) input layer (L1): the input for each sample is l1:W×H×FnWherein W × H is the arrangement of the simulation electrodes on the scalp, FnIs the number of features extracted from each channel.
(2) Graph volume layer (L2): the main function of this layer is to spatially filter and fuse the input brain function connection network, so the connection between this layer and the input layer is a local connection. The size and the number of the filters are set to K1∈W×H×Z×FnW is the width of the convolution kernel, H is the height of the convolution kernel, Z is the number of the characteristic diagram of the previous layer, FnIs the number of two-dimensional feature maps (or called filters) to be output, each obtainedThe size of each feature map is
Figure BDA0003239470270000121
The number of the two-dimensional characteristic graphs is
Figure BDA0003239470270000122
The reason for the convolution kernel being arranged as a matrix rather than a vector is that it is necessary to fuse spatial information to form abstract map signal features that are easily recognized by the model.
(3) Graph pooling layer (L3): the main function of this layer is to integrate the new features of the graph convolution layer L2 and achieve the dimension reduction effect. The size and the number of the filters of the third layer are set to K2And the size of each feature map obtained is
Figure BDA0003239470270000123
The number of the characteristic graphs is
Figure BDA0003239470270000124
(4) Graph volume layer (L4): the main function of this layer is to perform dimension reduction on the map convolution layer L3. The size and the number of the filters of the third layer are set to K3And the size of each feature map obtained is
Figure BDA0003239470270000125
The number of the characteristic graphs is
Figure BDA0003239470270000126
(5) Graph pooling layer (L5): this layer will receive the new features computed by the graph convolution layer L4 and perform a dimension reduction process, with layers 6 to 7 constituting the classifier.
(6) Fully connected layer (L6): this layer raises the dimensionality of the features from the graph pooling layer L5 to provide high-dimensional information for the classification in the last layer. The number of neurons in this layer is given in the original drawings.
(7) Output layer (L7): this layer outputs the emotional state from the result computed by the fully connected layer L6.
With each layer of the GCN model designed, the final preparation for emotion-state recognition is complete. After initialization, the model is trained and validated three times with five-fold cross-validation, and the model parameters that perform best are retained.
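The three-times-repeated five-fold cross-validation scheme can be sketched as follows (numpy only; `majority_eval` is a hypothetical stand-in for training and scoring the GCN on one fold, used here only so the loop is runnable):

```python
import numpy as np

def kfold_indices(n_samples, k, rng):
    """Shuffle sample indices and split them into k roughly equal folds."""
    return np.array_split(rng.permutation(n_samples), k)

def repeated_kfold_cv(X, y, train_eval, k=5, repeats=3, seed=0):
    """Run k-fold cross-validation `repeats` times; return the best mean score.

    A real run would also keep the model parameters of the best repeat.
    """
    rng = np.random.default_rng(seed)
    best_score = -np.inf
    for _ in range(repeats):
        folds = kfold_indices(len(y), k, rng)
        scores = []
        for i in range(k):
            val_idx = folds[i]
            tr_idx = np.concatenate([folds[j] for j in range(k) if j != i])
            scores.append(train_eval(X[tr_idx], y[tr_idx], X[val_idx], y[val_idx]))
        best_score = max(best_score, float(np.mean(scores)))
    return best_score

def majority_eval(Xtr, ytr, Xval, yval):
    """Dummy evaluator: predict the majority training class, return accuracy."""
    pred = np.bincount(ytr).argmax()
    return float(np.mean(yval == pred))

X = np.zeros((100, 3))
y = np.array([0] * 70 + [1] * 30)
best_score = repeated_kfold_cv(X, y, majority_eval)
print(round(best_score, 3))   # 0.7, the majority-class share of the labels
```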
Example 2:
this embodiment provides an emotion classification system based on a graph convolutional neural network, comprising a signal acquisition module and an emotion classification module;
the signal acquisition module is configured to acquire a raw electroencephalogram signal;
the emotion classification module is configured to obtain an emotion classification result from a preset emotion classification model and the acquired raw electroencephalogram signal;
the emotion classification model is trained from an electroencephalogram (EEG) signal training set and a graph convolutional neural network; acquiring the EEG training set comprises: obtaining a raw EEG data set; removing the basic emotional state from the data set; extracting wavelet coefficients from the data set after the basic emotional state is removed; calculating the EEG wavelet energy ratio and wavelet entropy from the wavelet coefficients; and modeling the calculated wavelet energy ratio and wavelet entropy features as a graph structure based on the brain function connection network.
Example 3:
the present embodiment provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the emotion classification method based on a graph convolutional neural network described in embodiment 1.
Example 4:
the present embodiment provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor; the processor executes the program to implement the steps of the emotion classification method based on a graph convolutional neural network described in embodiment 1.
The above description covers only preferred embodiments of the present invention and is not intended to limit it; those skilled in the art can make various modifications and variations. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (10)

1. An emotion classification method based on a graph convolutional neural network, characterized by comprising the following steps:
acquiring an original electroencephalogram signal;
obtaining an emotion classification result according to a preset emotion classification model and the obtained original electroencephalogram signal;
the emotion classification model is trained from an electroencephalogram (EEG) signal training set and a graph convolutional neural network; acquiring the EEG training set comprises: obtaining a raw EEG data set; removing the basic emotional state from the data set; extracting wavelet coefficients from the data set after the basic emotional state is removed; calculating the EEG wavelet energy ratio and wavelet entropy from the wavelet coefficients; and modeling the calculated wavelet energy ratio and wavelet entropy features as a graph structure based on the brain function connection network.
2. The emotion classification method based on a graph convolutional neural network according to claim 1, wherein training the emotion classification model comprises:
acquiring a raw electroencephalogram signal, and obtaining the experimental electroencephalogram signal under emotional-state stimulation according to a basic-emotional-state elimination method;
decomposing the experimental data obtained under emotional-state stimulation into a plurality of frequency bands by wavelet packet transform, and extracting wavelet coefficients;
estimating the overall complexity of the electroencephalogram with the sample entropy, and calculating the wavelet energy ratio and wavelet entropy from the wavelet coefficients of the plurality of frequency bands;
constructing a brain function connection network based on a phase-locking-value correlation matrix from the correlations among all electroencephalogram channel signals;
and training a graph convolutional neural network on the brain function connection network graph to obtain the emotion classification model.
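The phase-locking-value (PLV) construction used to build the brain function connection network can be sketched as follows (a numpy-only sketch; the FFT-based analytic signal stands in for `scipy.signal.hilbert`, and the test signals are toy sinusoids):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (equivalent to scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def plv(x, y):
    """Phase locking value: |mean exp(j * (phase_x - phase_y))|, in [0, 1]."""
    px = np.angle(analytic_signal(x))
    py = np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * (px - py))))

t = np.linspace(0, 1, 512, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t + 0.5)   # constant phase lag
print(round(plv(x, y), 3))             # 1.0: a constant phase relation locks fully
```

Computing `plv` for every pair of EEG channels fills the symmetric correlation matrix that serves as the graph's adjacency.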
3. The emotion classification method based on a graph convolutional neural network according to claim 2, wherein eliminating the basic emotional state of the data set comprises: initializing the input raw electroencephalogram data set and intercepting, for each subject, the electroencephalogram signal of one experimental trial; extracting the calm electroencephalogram signal of the first three seconds with the basic-emotional-state elimination method and tiling it to 60 s; extracting seconds 4-63 of the raw electroencephalogram signal; and subtracting the 60 s calm signal from it to obtain the emotional-state electroencephalogram signal with the basic emotional state removed.
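A minimal sketch of the baseline-removal step described in claim 3, for a single channel. The 128 Hz sampling rate is an assumption (common for DEAP-style recordings), not stated in the claim:

```python
import numpy as np

FS = 128  # assumed sampling rate in Hz

def remove_baseline(trial):
    """Subtract the tiled 3 s calm baseline from the 60 s emotional segment.

    `trial` is a 1-D array of one channel, at least 63 s long.
    """
    calm = trial[:3 * FS]               # first 3 s: calm baseline
    calm_60s = np.tile(calm, 20)        # expand 3 s -> 60 s
    emotional = trial[3 * FS:63 * FS]   # seconds 4-63 of the recording
    return emotional - calm_60s

trial = np.random.default_rng(1).standard_normal(63 * FS)
out = remove_baseline(trial)
print(out.shape)   # (7680,) i.e. 60 s at 128 Hz
```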
4. The emotion classification method based on a graph convolutional neural network according to claim 2, wherein extracting wavelet coefficients from the data set after the basic emotional state is removed comprises: decomposing the emotional-state electroencephalogram signals with an i-level wavelet packet decomposition to obtain 2^i wavelet nodes;
and reconstructing the wavelet packet coefficients of the i-th-level nodes to obtain the reconstructed signal of each node, dividing the reconstructed EEG signal into a plurality of frequency bands, and extracting the wavelet coefficients of the plurality of frequency bands.
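An i-level wavelet packet decomposition yields 2^i coefficient nodes, as claim 4 states. A minimal sketch using the Haar wavelet (the claim does not specify the mother wavelet, so Haar is an assumption chosen for brevity; a production pipeline would more likely use PyWavelets):

```python
import numpy as np

def haar_step(x):
    """One Haar analysis step: approximation and detail, each at half length."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavelet_packet(x, levels):
    """Full wavelet-packet tree: both a and d are split again at every level."""
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            a, d = haar_step(node)
            nxt.extend([a, d])
        nodes = nxt
    return nodes

x = np.arange(64, dtype=float)
leaves = wavelet_packet(x, 3)
print(len(leaves))        # 2**3 = 8 nodes at level 3
print(leaves[0].shape)    # (8,): each node holds 64 / 2**3 coefficients
```

Because the Haar steps are orthonormal, the total energy of the leaf coefficients equals the energy of the input signal, which is what makes the wavelet energy ratio below well defined.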
5. The emotion classification method based on a graph convolutional neural network according to claim 2, wherein calculating the electroencephalogram wavelet energy ratio and wavelet entropy from the wavelet coefficients comprises:
the sample entropy algorithm comprises the following steps:
(a) let the initial time series be S_i, where i = 1, 2, ..., N;
(b) determine the parameters m and r, where m is the embedding dimension of the vectors to be reconstructed and r is a given non-negative real threshold representing the similarity tolerance between reconstructed vectors;
(c) reconstruct the series into vectors X_1, X_2, ..., X_(N-m+1), where X_i = (S_i, S_(i+1), ..., S_(i+m-1));
(d) calculate the distance between any two reconstructed vectors, d[X_i, X_j] = max_k |S_(i+k) - S_(j+k)| over k = 0, ..., m-1, where i and j range from 1 to N-m+1 and i ≠ j;
(e) given the threshold r (r > 0) and the embedding dimension m, calculate the measure of the regularity probability of the sequence,
B_i^m(r) = (number of j ≠ i with d[X_i, X_j] ≤ r) / (N - m),
and average it over all i:
B^m(r) = (1/(N-m+1)) Σ_i B_i^m(r);
(f) set the embedding dimension to m + 1;
(g) repeat steps (a)-(f) to calculate B^(m+1)(r);
(h) the sample entropy is then
SampEn(m, r, N) = -ln( B^(m+1)(r) / B^m(r) ).
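Steps (a)-(h) above can be sketched in numpy as follows (the 0.2·std tolerance is a common heuristic, not specified in the claim, and the denominator convention is one of several in the literature):

```python
import numpy as np

def sample_entropy(s, m=2, r=None):
    """Sample entropy of a 1-D series, following steps (a)-(h) above."""
    s = np.asarray(s, dtype=float)
    if r is None:
        r = 0.2 * np.std(s)   # assumed tolerance heuristic
    n = len(s)

    def match_ratio(m):
        # (c) Embed the series into overlapping vectors of length m.
        X = np.array([s[i:i + m] for i in range(n - m + 1)])
        count, total = 0, 0
        for i in range(len(X)):
            # (d) Chebyshev distance from X_i to every other vector.
            d = np.max(np.abs(X - X[i]), axis=1)
            count += int(np.sum(d <= r)) - 1   # (e) exclude the self-match
            total += len(X) - 1
        return count / total

    # (h) Negative log-ratio of the m+1 and m match probabilities.
    return -np.log(match_ratio(m + 1) / match_ratio(m))

sine = np.sin(np.linspace(0, 8 * np.pi, 200))
noise = np.random.default_rng(0).standard_normal(200)
print(sample_entropy(sine) < sample_entropy(noise))   # True: noise is less regular
```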
the calculation of the wavelet energy ratio and the wavelet entropy comprises:
assume C_i is the wavelet coefficient extracted after the wavelet packet transform, where i ∈ {Alpha, Theta, Beta1, Beta2, Gamma1, Gamma2}; the energy of the i-th frequency band and the total energy of the wavelet coefficients are defined as
E_i = Σ_k |C_i(k)|^2,  E = Σ_i E_i;
the wavelet energy ratio η_i of the i-th frequency band is defined as
η_i = E_i / E;
and the wavelet entropy of the i-th frequency band is defined as
Entropy_i = -η_i · ln(η_i).
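The wavelet energy ratio and wavelet entropy formulas above can be sketched as (the six random arrays stand in for the per-band wavelet coefficients C_i):

```python
import numpy as np

def wavelet_energy_features(coeffs):
    """Per-band energy ratio eta_i = E_i / E and entropy -eta_i * ln(eta_i).

    coeffs: list of wavelet-coefficient arrays, one per frequency band.
    """
    energies = np.array([np.sum(np.abs(c) ** 2) for c in coeffs])  # E_i
    eta = energies / energies.sum()                                 # eta_i = E_i / E
    entropy = -eta * np.log(eta)                                    # Entropy_i
    return eta, entropy

# Six bands (Alpha, Theta, Beta1, Beta2, Gamma1, Gamma2) of toy coefficients.
bands = [np.random.default_rng(i).standard_normal(64) for i in range(6)]
eta, ent = wavelet_energy_features(bands)
print(round(float(eta.sum()), 6))   # 1.0: the energy ratios sum to one
```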
6. The emotion classification method based on a graph convolutional neural network according to claim 1, wherein the extracted features are arranged: feature arrangement maps the positions of the electrode channels onto a 2D plane and then arranges the extracted features according to those 2D positions to form a feature cube.
7. The emotion classification method based on a graph convolutional neural network according to claim 1, wherein designing a graph convolutional neural network model according to the brain function connection network, for the design of the classifier, comprises:
an input layer: the input of each sample is the product of the arrangement of the simulated electrodes on the scalp and the number of features extracted per channel;
a graph convolution layer: spatially filtering and fusing the input brain function connection network, the connection with the input layer being a local connection;
a graph pooling layer: integrating the new features of the graph convolution layer and reducing their dimensionality;
a graph convolution layer: further reducing the dimensionality of the graph pooling layer's output;
a graph pooling layer: receiving the new features calculated by the preceding graph convolution layer and performing dimension reduction, the remaining layers forming the classifier;
a fully connected layer: raising the dimensionality of the graph pooling layer's features to provide high-dimensional information for the classification in the last layer; and
an output layer: outputting the emotional state from the result computed by the fully connected layer.
8. An emotion classification system based on a graph convolutional neural network, characterized by comprising a signal acquisition module and an emotion classification module;
the signal acquisition module is configured to acquire a raw electroencephalogram signal;
the emotion classification module is configured to obtain an emotion classification result from a preset emotion classification model and the acquired raw electroencephalogram signal;
the emotion classification model is trained from an electroencephalogram (EEG) signal training set and a graph convolutional neural network; acquiring the EEG training set comprises: obtaining a raw EEG data set; removing the basic emotional state from the data set; extracting wavelet coefficients from the data set after the basic emotional state is removed; calculating the EEG wavelet energy ratio and wavelet entropy from the wavelet coefficients; and modeling the calculated wavelet energy ratio and wavelet entropy features as a graph structure based on the brain function connection network.
9. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the steps of the emotion classification method based on a graph convolutional neural network according to any one of claims 1-7.
10. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the emotion classification method based on a graph convolutional neural network according to any one of claims 1-7.
CN202111014567.0A 2021-08-31 2021-08-31 Emotion classification method and system based on graph convolution neural network Pending CN113569997A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111014567.0A CN113569997A (en) 2021-08-31 2021-08-31 Emotion classification method and system based on graph convolution neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111014567.0A CN113569997A (en) 2021-08-31 2021-08-31 Emotion classification method and system based on graph convolution neural network

Publications (1)

Publication Number Publication Date
CN113569997A true CN113569997A (en) 2021-10-29

Family

ID=78173323

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111014567.0A Pending CN113569997A (en) 2021-08-31 2021-08-31 Emotion classification method and system based on graph convolution neural network

Country Status (1)

Country Link
CN (1) CN113569997A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114431862A (en) * 2021-12-22 2022-05-06 山东师范大学 Multi-modal emotion recognition method and system based on brain function connection network
CN115099311A (en) * 2022-06-06 2022-09-23 陕西师范大学 Multi-modal emotion classification method based on electroencephalogram space-time frequency characteristics and eye movement characteristics
CN115444431A (en) * 2022-09-02 2022-12-09 厦门大学 Electroencephalogram emotion classification model generation method based on mutual information driving



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination