CN112101152A - Electroencephalogram emotion recognition method and system, computer equipment and wearable equipment


Info

Publication number
CN112101152A
CN112101152A (Application CN202010905842.7A; granted publication CN112101152B)
Authority
CN
China
Prior art keywords: electroencephalogram, hypergraph, matrix, emotion recognition, sample
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010905842.7A
Other languages
Chinese (zh)
Other versions
CN112101152B (en)
Inventor
杨利英
秦泽宇
张清杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Application filed by Xidian University
Priority to CN202010905842.7A
Publication of CN112101152A
Application granted
Publication of CN112101152B
Legal status: Active

Classifications

    • G06F 2218/12 — Pattern recognition specially adapted for signal processing; Classification; Matching
    • G06N 3/045 — Computing arrangements based on biological models; Neural networks; Architectures; Combinations of networks
    • G06N 3/08 — Computing arrangements based on biological models; Neural networks; Learning methods
    • G06F 2218/02 — Pattern recognition specially adapted for signal processing; Preprocessing
    • G06F 2218/08 — Pattern recognition specially adapted for signal processing; Feature extraction


Abstract

The invention belongs to the interdisciplinary field of machine learning and emotion recognition, and discloses an electroencephalogram emotion recognition method, a system, computer equipment and wearable equipment. The influence of non-emotion signals on emotion recognition is reduced by removing the electroencephalogram signal generated during the video transition at the beginning of each trial and subtracting its average value from the remaining data. Time-frequency domain features are extracted from the preprocessed electroencephalogram signals using the short-time Fourier transform. The features are put into a convolutional neural network for training, which extracts high-quality features. Hypergraph learning is then performed on the resulting features to construct a hypergraph classifier model and complete emotion classification and recognition. The time-frequency features of the electroencephalogram signals are optimized by a deep learning method, and the hypergraph learning method then samples them for training and classification. On the basis of improving the accuracy of hypergraph-learning classification, the training time is effectively shortened and the operating space is compressed, which is of great significance for the design and development of portable wearable equipment.

Description

Electroencephalogram emotion recognition method and system, computer equipment and wearable equipment
Technical Field
The invention belongs to the interdisciplinary field of machine learning and emotion recognition, and particularly relates to an electroencephalogram emotion recognition method, an electroencephalogram emotion recognition system, computer equipment and wearable equipment.
Background
At present, compared with traditional emotion recognition methods based on facial expressions, text, actions and the like, using the electroencephalogram (EEG) for emotion recognition allows the problem to be studied more directly and objectively. In modern life, a good emotional state has a positive influence on people's physical and mental health, family and work, and good emotion can improve learning and working efficiency. A long-term depressive state, by contrast, often causes serious psychological illness and can even have serious adverse effects on families and society. Many studies have shown that mental illnesses such as depression manifest themselves in EEG signals, so recognition and evaluation of emotion can help people take effective measures in advance to avoid these dangers. The invention acquires EEG signals of multiple channels using electrodes attached to the scalp, and then performs some preprocessing on the acquired signals, such as reducing ocular artifacts and retaining the EEG signal within a certain frequency range.
Deep learning, as an effective method in the field of machine learning, is applied in every aspect of daily life. For a convolutional neural network, more iterations and more layers allow it to reach higher accuracy, but building the model costs more time and space, and once a certain accuracy has been reached, further improvement takes considerable additional time.
Hypergraph learning is a graph learning method, and the properties of hyperedges are well suited to describing the relationships between data. Because the hypergraph problem can be converted into an eigenvalue problem, and the data can be described by preserving the associations between samples without any assumption about the sample distribution, hypergraphs can describe data characteristics well even under complex, nonlinear sample distributions. However, hypergraph learning places high demands on its input: high accuracy can only be achieved with high-quality features.
Through the above analysis, the problems and defects of the prior art are as follows: deep learning spends a lot of time and space constructing a model, and after a certain accuracy is reached, further improvement carries too much time cost; hypergraph learning needs high-quality features to reach high accuracy, so the selection of features needs to be optimized.
The difficulty in solving the above problems and defects is: to obtain higher accuracy, deep learning usually constructs a deeper neural network, but the deeper the network, the harder the optimization problem becomes; memory consumption is huge, computation is complex, a large amount of time is spent searching for optimal parameters, and the final result is often only a local optimum. EEG features extracted by traditional feature extraction methods perform poorly in hypergraph learning, and in unsupervised hypergraph learning in particular the classification accuracy is low, so the optimal selection of input data is a difficulty for hypergraph learning.
The significance of solving the above problems and defects is: once the neural network has been trained to a certain degree, the output of the fully connected layer is used as the feature set and training continues with hypergraph learning, so no large amount of time needs to be spent raising the accuracy further, and sampling-based training can compress the required space while the high-quality features yield higher accuracy. Since hypergraph learning also has good model training capability, only a short period of deep learning on the features is needed, and the subsequent hypergraph learning can reach higher accuracy, thereby compressing both the time and the space of the training process.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an electroencephalogram emotion recognition method and system, computer equipment and wearable equipment.
The invention is realized as follows: the electroencephalogram emotion recognition method comprises the following steps:
removing the electroencephalogram signal generated during the video transition at the beginning, and subtracting its average value from the remaining data;
extracting time-frequency domain characteristics of the preprocessed electroencephalogram signals by using short-time Fourier transform;
putting the features into a convolutional neural network for training;
and carrying out hypergraph learning on the obtained features, constructing a hypergraph classifier model, and finishing emotion classification and identification.
Further, the electroencephalogram emotion recognition method preprocesses the original data: when the emotion-bearing electroencephalogram is collected, the basic electroencephalogram without emotion is also collected; the average value of the emotion-free electroencephalogram signal is recorded, the electroencephalogram signal irrelevant to emotion is removed, and the average value is subtracted from the subsequent emotion signal.
Furthermore, the electroencephalogram emotion recognition method extracts electroencephalogram features: time-frequency features of the multiple channels in each experiment are extracted from the preprocessed electroencephalogram signals by short-time Fourier transform, and each sample contains the features of all channels.
Further, the electroencephalogram emotion recognition method performs deep learning on the features using a convolutional neural network: the time-frequency features are put into a CNN for training, and high-quality sample features are obtained after multiple iterations of the convolutional neural network, wherein the convolutional neural network comprises three one-dimensional convolutional layers, a pooling layer and a fully connected layer.
Further comprising:
(1) selecting samples: n samples are selected each time, each with m-dimensional features, forming an n × m feature matrix; the number of channels is 1;
(2) constructing the convolutional layers, with a total of three one-dimensional convolutional layers (detailed below);
(3) constructing a pooling layer, entering the pooling layer, and performing average pooling;
(4) constructing a fully connected layer, an optimization function and a loss function: after standardization the data enters the fully connected layer, a k-dimensional vector is output for each sample, and then the optimization function and the loss function are set;
(5) inputting all samples: continuing to input samples and repeating the above steps until all samples are used;
(6) carrying out multiple iterations, thus finishing one iteration and starting the next iteration.
The three convolutional layers in step (2) are configured as follows:
1) the first convolutional layer uses convolution kernels of 2 rows and 1 column, with 5 kernels;
2) the second convolutional layer uses convolution kernels of 2 rows and 5 columns, with 5 kernels;
3) the third convolutional layer uses convolution kernels of 2 rows and 5 columns, with 5 kernels.
Further, the electroencephalogram emotion recognition method constructs a classifier using hypergraph learning: the features produced by deep learning training are taken as input for hypergraph learning, a mapping from the features to the labels is learned, and a classifier model is constructed to complete classification;
(1) constructing a hypergraph: the hyperedges of the hypergraph are constructed according to the category of each sample, the adjacency matrix h(v, e) is calculated, and the weight matrix W, the vertex degree matrix D_v and the edge degree matrix D_e are further obtained; the hypergraph Laplacian matrix L_H can then be obtained;
The adjacency matrix is calculated as follows:
h(v, e) = \begin{cases} 1, & v \in e \\ 0, & v \notin e \end{cases}
the calculation method of the weight matrix is as follows:
w(e) = \sum_{\{u, v\} \subseteq e} \exp\left( -\frac{\| x_u - x_v \|^2}{\sigma^2} \right)
the calculation method of the point degree matrix and the edge degree matrix is as follows, wherein V is a vertex set, and E is a super edge set:
d(v) = \sum_{e \in E} w(e)\, h(v, e)
\delta(e) = \sum_{v \in V} h(v, e)
(2) solving the Laplacian matrix of the hypergraph, which is defined as:
L_H = I - D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2}
(3) defining the loss function of hypergraph segmentation. Using the hypergraph Laplacian matrix, the loss function of hypergraph segmentation is defined as:
\Omega(F) = \mathrm{Trace}\left( F^{\top} L_H F \right)
where the partition matrix F = [f_1, …, f_m] represents the class predictions of the samples; each f_i is an n-dimensional column vector in which each component represents the prediction value for one class; F_u and F_v denote the row vectors of the matrix corresponding to vertices u and v, indicating which category the vertices belong to; and Trace denotes the trace of the matrix. The hypergraph segmentation loss function minimizes the loss of the sample category relationships;
(4) defining a prediction error, and measuring the prediction error by using Euclidean distance:
R_{emp}(F) = \| F - Y \|^2 = \sum_{i=1}^{m} \| f_i - y_i \|^2
where Y = [y_1, …, y_m] represents the set of true labels, and the column vector y_i is the label vector of one category: if a sample belongs to that category, the element corresponding to the sample in the vector is set to 1, and otherwise to -1;
(5) constructing the objective function of hypergraph learning:
\arg\min_{F}\; \mathrm{Trace}\left( F^{\top} L_H F \right) + \lambda \| F - Y \|^2
where λ is a positive parameter balancing the two losses; the objective function minimizes the loss of the sample category relationships together with the class prediction error, so that solving the optimization objective preserves the category relationships between samples as far as possible while minimizing the prediction error;
(6) constructing a mapping relation: a mapping F = X^{\top} B from the sample space to the label space is constructed, and substituting it into the objective function yields a regularized least-squares problem; taking the partial derivative with respect to the projection matrix and setting it to zero gives the mapping matrix:
B = \left( X L_H X^{\top} + \lambda X X^{\top} + \eta I \right)^{-1} \lambda X (2H - 1)
wherein B represents a projection matrix of the feature space to the class, and η is a non-negative regularization parameter;
(7) formula substitution, the mapping relation is substituted into the objective function to obtain:
\arg\min_{B}\; \mathrm{Trace}\left( B^{\top} X L_H X^{\top} B \right) + \lambda \| X^{\top} B - Y \|^2 + \eta \| B \|^2
(8) solving a mapping matrix, solving a partial derivative of the projection matrix and assigning zero to the projection matrix so as to obtain the mapping matrix:
B = \left( X L_H X^{\top} + \lambda X X^{\top} + \eta I \right)^{-1} \lambda X (2H - 1)
(9) performing class prediction: in the test stage the samples are projected into the subspace spanned by the projection matrix B to obtain the prediction space:
P = \mathrm{sign}\left( X_{test}^{\top} B \right)
where the function sign(·) returns the sign of each element of the vector: a positive value indicates that the sample belongs to the class, and a negative value indicates that it does not; p_i = [p_{i1}, …, p_{im}] is a row vector representing the confidence that sample i belongs to each class.
It is a further object of the invention to provide a computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of:
removing the electroencephalogram signal generated during the video transition at the beginning, and subtracting its average value from the remaining data;
extracting time-frequency domain characteristics of the preprocessed electroencephalogram signals by using short-time Fourier transform;
putting the features into a convolutional neural network for training;
and carrying out hypergraph learning on the obtained features, constructing a hypergraph classifier model, and finishing emotion classification and identification.
Another object of the present invention is to provide a brain electric emotion recognition system for implementing the brain electric emotion recognition method, the brain electric emotion recognition system including:
the data preprocessing module is used for removing the electroencephalogram signal generated during the video transition at the beginning and then subtracting its average value from the remaining data;
the characteristic extraction module is used for extracting time-frequency domain characteristics from the preprocessed electroencephalogram signals by using short-time Fourier transform;
the characteristic optimization module is used for putting the characteristics into a convolutional neural network for training and extracting high-quality characteristics;
and the emotion recognition module is used for carrying out hypergraph learning on the obtained features, constructing a hypergraph classifier model and finishing emotion classification recognition.
The invention also aims to provide a wearable device, and the wearable device is provided with the electroencephalogram emotion recognition system.
Combining all the technical schemes above, the advantages and positive effects of the invention are as follows. Hypergraph learning is a graph learning method. In traditional graph learning the relationship between samples is a pairwise correspondence, but in reality most relationships are complex one-to-many and many-to-many relationships; a hyperedge contains the relationship among multiple points, and exactly this property can be used to describe the relationships between data. Because the hypergraph problem can be converted into an eigenvalue problem, no assumption about the sample distribution is needed, and the data are described by preserving the associations between samples, so data characteristics can be described well even under complex, nonlinear sample distributions. In addition, the pairwise relations between samples can be fully exploited when the hypergraph is constructed, so high accuracy can be maintained even with few training samples. In practical experiments, combining deep learning with hypergraph learning effectively improves emotion classification accuracy: the average accuracy is about 15% higher than that of plain hypergraph learning, and the training time and space are effectively reduced. For a detailed description of the data, please refer to the experimental results below.
From the technical aspect, the extracted electroencephalogram features are optimized with the convolutional-neural-network method of deep learning to obtain representative electroencephalogram features. In the hypergraph learning process, a hypergraph is constructed from the relationships between samples to obtain the hypergraph Laplacian matrix, and the objective function of the optimal segmentation is used as the loss function of supervised learning, so that a mapping matrix from the feature space to the labels is constructed and the classification problem is solved. From the performance aspect, the training time is shortened while fewer training samples are used and high accuracy is guaranteed, which provides an important reference for the research and development of wearable and somatosensory equipment in the field of artificial intelligence.
Overcoming the defects that the electroencephalogram is complex and easily affected by noise, the invention provides an electroencephalogram emotion recognition method combining deep learning and hypergraph learning from the angle of reducing the dimensionality of electroencephalogram features. While compressing the samples and shortening the training time, the hypergraph learning method makes full use of the associations between samples, thereby achieving higher classification accuracy. The invention optimizes the extracted electroencephalogram features and reduces their dimensionality with a deep learning method, and then samples the training data for the hypergraph learning model, thereby compressing the data and reducing the running time while maintaining high accuracy. This has important theoretical significance and practical value for the design and development of wearable equipment.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments of the present application will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained from the drawings without creative efforts.
FIG. 1 is a flowchart of an electroencephalogram emotion recognition method provided by the embodiment of the invention.
FIG. 2 is a schematic structural diagram of an electroencephalogram emotion recognition system provided by an embodiment of the present invention;
in fig. 2: 1. a data preprocessing module; 2. a feature extraction module; 3. a feature optimization module; 4. and an emotion recognition module.
FIG. 3 is a flowchart of an implementation of the electroencephalogram emotion recognition method provided by the embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Aiming at the problems in the prior art, the invention provides an electroencephalogram emotion recognition method, a system, computer equipment and wearable equipment, and the invention is described in detail in the following with reference to the accompanying drawings.
As shown in FIG. 1, the electroencephalogram emotion recognition method provided by the invention comprises the following steps:
s101: the influence of non-emotion signals on emotion recognition is reduced by removing electroencephalogram signals generated in the process of video conversion at the beginning and then subtracting the average value of the signals from the rest data;
s102: extracting time-frequency domain characteristics of the preprocessed electroencephalogram signals by using short-time Fourier transform;
s103: putting the features into a convolutional neural network for training, and extracting high-quality features;
s104: and carrying out hypergraph learning on the obtained features, constructing a hypergraph classifier model, and finishing emotion classification and identification.
Those of ordinary skill in the art can also implement the electroencephalogram emotion recognition method provided by the invention using other steps; the method of FIG. 1 is only a specific embodiment.
As shown in fig. 2, the electroencephalogram emotion recognition system provided by the present invention includes:
the data preprocessing module 1 is used for removing the electroencephalogram signal generated during the video transition at the beginning and then subtracting its average value from the remaining data, thereby reducing the influence of non-emotion signals on emotion recognition;
the characteristic extraction module 2 is used for extracting time-frequency domain characteristics from the preprocessed electroencephalogram signals by using short-time Fourier transform;
the characteristic optimization module 3 is used for putting the characteristics into a convolutional neural network for training and extracting high-quality characteristics;
and the emotion recognition module 4 is used for carrying out hypergraph learning on the obtained features so as to construct a hypergraph classifier model and finish emotion classification recognition.
The technical solution of the present invention is further described below with reference to the accompanying drawings.
As shown in FIG. 3, the electroencephalogram emotion recognition method of the present invention is specifically implemented as follows:
(1) preprocessing raw data
When the emotion-bearing electroencephalogram is collected, the basic electroencephalogram without emotion is also collected, so the average value of the electroencephalogram signal before emotion is generated is recorded; the electroencephalogram signal irrelevant to emotion is removed, and the average value is subtracted from the subsequent emotion signal. The resulting data are the emotion electroencephalogram free of interference from other electroencephalogram activity.
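A minimal sketch of this baseline-removal step in Python follows; the 128 Hz sampling rate and the 3-second baseline length are assumptions taken from the DEAP experiment described below, not fixed by the method itself.

```python
import numpy as np

def preprocess_trial(eeg, fs=128, baseline_sec=3):
    """Remove the baseline segment recorded during the video transition and
    subtract its per-channel mean from the remaining signal.

    eeg: array of shape (n_channels, n_samples) for one trial.
    fs, baseline_sec: assumed sampling rate and baseline length (DEAP uses
    128 Hz and a 3 s pre-trial segment).
    """
    n_base = fs * baseline_sec
    baseline = eeg[:, :n_base]        # emotion-free segment
    emotion = eeg[:, n_base:]         # remaining emotion-bearing data
    # subtract the baseline mean from every remaining sample, per channel
    return emotion - baseline.mean(axis=1, keepdims=True)
```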
(2) Extraction of electroencephalogram features
And extracting time-frequency characteristics of a plurality of channels in each experiment by using a short-time Fourier transform mode on the preprocessed electroencephalogram signals, wherein each sample contains the characteristics of all the channels.
(3) Deep learning of features using convolutional neural networks
And putting the time-frequency characteristics into a CNN network for training, and obtaining sample characteristics with better quality after multiple iterations of a convolutional neural network. The convolutional neural network comprises three one-dimensional convolutional layers, a pooling layer and a full-connection layer.
(3a) Sample selection. Each time, n samples are selected, each characterized by m dimensions, forming an n × m feature matrix; the number of channels is 1.
(3b) Building the convolutional layers. There are a total of three one-dimensional convolutional layers:
(3b.1) the first convolutional layer uses convolution kernels of 2 rows and 1 column, with 5 kernels.
(3b.2) the second convolutional layer uses convolution kernels of 2 rows and 5 columns, with 5 kernels.
(3b.3) the third convolutional layer uses convolution kernels of 2 rows and 5 columns, with 5 kernels.
(3c) Constructing the pooling layer. The data enters the pooling layer for average pooling.
(3d) Constructing the fully connected layer, the optimization function and the loss function. After standardization the data enters the fully connected layer, a k-dimensional vector is output for each sample, and then the optimization function and the loss function are set.
(3e) Inputting all samples. Samples continue to be input and the above steps repeated until all samples have been used.
(3f) Performing multiple iterations. One iteration is thus finished and the next begins; a code sketch of this network follows below.
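A sketch of this network in PyTorch is given below. The channel progression 1→5→5→5 and the kernel width of 2 follow the 2×1 and 2×5 kernel descriptions above; batch size 100, output dimension k = 4, cross-entropy loss and the Adam optimizer are taken from Experiment 1 below, while the pooling width, activations, normalization placement and learning rate are assumptions rather than the patented configuration.

```python
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    """Three one-dimensional conv layers + average pooling + a fully
    connected layer, following steps (3a)-(3d); exact sizes are assumed."""
    def __init__(self, feat_dim=2560, k=4, n_classes=2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 5, kernel_size=2),   # 2x1 kernels, 5 kernels
            nn.ReLU(),
            nn.Conv1d(5, 5, kernel_size=2),   # width-2 kernels over 5 channels
            nn.ReLU(),
            nn.Conv1d(5, 5, kernel_size=2),   # width-2 kernels over 5 channels
            nn.ReLU(),
            nn.AvgPool1d(kernel_size=4),      # average pooling (assumed width)
        )
        conv_out = 5 * ((feat_dim - 3) // 4)  # length after 3 convs + pooling
        self.norm = nn.BatchNorm1d(conv_out)  # "standardization" before the FC layer
        self.fc = nn.Linear(conv_out, k)      # k-dimensional feature per sample
        self.head = nn.Linear(k, n_classes)

    def forward(self, x):                     # x: (batch, 1, feat_dim)
        z = self.conv(x).flatten(1)
        feat = self.fc(self.norm(z))          # features later fed to hypergraph learning
        return feat, self.head(feat)

model = EmotionCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # Adam, as in Experiment 1
loss_fn = nn.CrossEntropyLoss()               # cross-entropy loss
```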
(4) A classifier is constructed using hypergraph learning. The features produced by the deep learning training are taken as input for hypergraph learning, and a mapping from the features to the labels is learned, so that a classifier model is constructed and classification is completed.
(4a) Constructing the hypergraph. The hyperedges of the hypergraph are constructed according to the category of each sample, the adjacency matrix h(v, e) is calculated, and the weight matrix W, the vertex degree matrix D_v and the edge degree matrix D_e are further obtained; the hypergraph Laplacian matrix L_H can then be obtained.
The adjacency matrix is calculated as follows:
h(v, e) = \begin{cases} 1, & v \in e \\ 0, & v \notin e \end{cases}
the calculation method of the weight matrix is as follows:
w(e) = \sum_{\{u, v\} \subseteq e} \exp\left( -\frac{\| x_u - x_v \|^2}{\sigma^2} \right)
the calculation method of the point degree matrix and the edge degree matrix is as follows, wherein V is a vertex set, and E is a super edge set:
d(v) = \sum_{e \in E} w(e)\, h(v, e)
\delta(e) = \sum_{v \in V} h(v, e)
(4b) Solving the Laplacian matrix of the hypergraph. The hypergraph Laplacian matrix is defined as:
L_H = I - D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2}
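The construction of these matrices can be sketched in NumPy as follows. The heat-kernel edge weight is an assumed choice, consistent with the Euclidean-distance weighting described in Experiment 1 but not necessarily the exact formula of the original:

```python
import numpy as np

def build_hypergraph(X, hyperedges, sigma=1.0):
    """Build the adjacency matrix H = h(v, e), weight matrix W and the
    hypergraph Laplacian L_H. X: (m, n) feature matrix whose columns are
    samples; hyperedges: list of vertex-index lists, one per class."""
    n, k = X.shape[1], len(hyperedges)
    H = np.zeros((n, k))                         # h(v, e) = 1 iff v in e
    w = np.zeros(k)
    for j, e in enumerate(hyperedges):
        H[e, j] = 1.0
        xs = X[:, e]                             # features of the vertices in e
        d2 = ((xs[:, :, None] - xs[:, None, :]) ** 2).sum(axis=0)
        w[j] = np.exp(-d2 / sigma**2).sum()      # assumed heat-kernel weight
    W = np.diag(w)
    dv = H @ w                                   # d(v) = sum_e w(e) h(v, e)
    de = H.sum(axis=0)                           # delta(e) = sum_v h(v, e)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    Theta = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    L_H = np.eye(n) - Theta                      # hypergraph Laplacian
    return H, W, L_H
```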
(4c) Defining the loss function of hypergraph segmentation. Using the hypergraph Laplacian matrix, the loss function of hypergraph segmentation can be defined as:
\Omega(F) = \mathrm{Trace}\left( F^{\top} L_H F \right)
where the partition matrix F = [f_1, …, f_m] represents the class predictions of the samples; each f_i is an n-dimensional column vector in which each component represents the prediction value for one class; F_u and F_v denote the row vectors of the matrix corresponding to vertices u and v, indicating which category the vertices belong to; and Trace denotes the trace of the matrix. The hypergraph segmentation loss function minimizes the loss of the sample category relationships.
(4d) Defining a prediction error, and measuring the prediction error by using Euclidean distance:
R_{emp}(F) = \| F - Y \|^2 = \sum_{i=1}^{m} \| f_i - y_i \|^2
where Y = [y_1, …, y_m] represents the set of true labels, and the column vector y_i is the label vector of one category: if a sample belongs to that category, the element corresponding to the sample in the vector is set to 1, and otherwise to -1.
(4e) Constructing the objective function of hypergraph learning:
\arg\min_{F}\; \mathrm{Trace}\left( F^{\top} L_H F \right) + \lambda \| F - Y \|^2
where λ is a positive parameter balancing the two losses; the objective function minimizes the loss of the sample category relationships together with the class prediction error, so that solving the optimization objective preserves the category relationships between samples as far as possible while minimizing the prediction error.
(4f) Constructing the mapping relation. A mapping F = X^{\top} B from the sample space to the label space is constructed; substituting it into the objective function yields a regularized least-squares problem, and taking the partial derivative with respect to the projection matrix and setting it to zero gives the mapping matrix:
B = \left( X L_H X^{\top} + \lambda X X^{\top} + \eta I \right)^{-1} \lambda X (2H - 1)
where B represents the projection matrix of the feature space into classes, and η is a non-negative regularization parameter.
(4g) Formula substitution. Substituting the mapping relation into the objective function gives:
\arg\min_{B}\; \mathrm{Trace}\left( B^{\top} X L_H X^{\top} B \right) + \lambda \| X^{\top} B - Y \|^2 + \eta \| B \|^2
(4h) Solving the mapping matrix. Taking the partial derivative with respect to the projection matrix and setting it to zero yields the mapping matrix:
B = \left( X L_H X^{\top} + \lambda X X^{\top} + \eta I \right)^{-1} \lambda X (2H - 1)
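The closed form can be checked by differentiating the substituted objective; the following short derivation assumes the label matrix is encoded as Y = 2H - 1 from a 0/1 class-indicator matrix H, an assumption made to reconcile the λX(2H - 1) term with the error term \| X^{\top} B - Y \|^2:

\frac{\partial}{\partial B}\left[ \mathrm{Trace}\left( B^{\top} X L_H X^{\top} B \right) + \lambda \| X^{\top} B - Y \|^2 + \eta \| B \|^2 \right] = 2 X L_H X^{\top} B + 2\lambda X \left( X^{\top} B - Y \right) + 2\eta B = 0,

so that

\left( X L_H X^{\top} + \lambda X X^{\top} + \eta I \right) B = \lambda X Y = \lambda X (2H - 1),

which is solved by the expression for B above.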
(4i) Performing class prediction. In the test stage the samples are projected into the subspace spanned by the projection matrix B to obtain the prediction space:
P = \mathrm{sign}\left( X_{test}^{\top} B \right)
where sign(·) returns the sign of each element of the vector: a positive value indicates that the sample belongs to the class, and a negative value indicates that it does not. p_i = [p_{i1}, …, p_{im}] is a row vector representing the confidence that sample i belongs to each class.
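Taken together, steps (4f) to (4i) reduce to one closed-form solve and one projection; a minimal NumPy sketch, assuming Y holds the ±1 labels defined above, is:

```python
import numpy as np

def train_hypergraph_classifier(X, Y, L_H, lam=1.0, eta=0.1):
    """Solve B = (X L_H X^T + lam X X^T + eta I)^{-1} lam X Y, where Y holds
    +1/-1 labels (equivalently lam X (2H - 1) for a 0/1 indicator H).
    X: (m, n) features, Y: (n, c) labels, L_H: (n, n) hypergraph Laplacian."""
    m = X.shape[0]
    A = X @ L_H @ X.T + lam * (X @ X.T) + eta * np.eye(m)
    return np.linalg.solve(A, lam * (X @ Y))     # projection matrix B, (m, c)

def predict(X_test, B):
    """Project test samples into the label space; signs give class membership,
    magnitudes the confidences p_i."""
    P = X_test.T @ B                             # (n_test, c) confidence matrix
    return np.sign(P)
```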
The technical effects of the present invention will be described in detail with reference to experiments.
Experiment 1: prediction of electroencephalogram emotion data by using the method
The data used in this experiment come from DEAP (Database for Emotion Analysis using Physiological signals), a freely available multimodal dataset for studying human emotional states. The dataset records experimental data of 32 participants; each participant performed 40 trials, in each of which the participant watched a video segment while 32-channel electroencephalogram signals were collected by EEG acquisition equipment. After the participants performed emotion self-assessment, the invention uses the Valence and Arousal values as classification evaluation standards and divides the training set and test set at a ratio of 1:1.
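For concreteness, the preprocessed Python version of DEAP can be loaded as sketched below; the file naming (s01.dat, ...), the array shapes (40 trials × 40 channels × 8064 samples, the first 32 channels being EEG) and the label columns (valence, arousal, dominance, liking on a 1-9 scale) follow the published DEAP format, and the midpoint threshold of 5 for binarization is an assumed convention.

```python
import pickle
import numpy as np

def load_deap_subject(path):
    """Load one participant file from the preprocessed Python version of
    DEAP (e.g. 's01.dat'); returns EEG data and valence/arousal ratings."""
    with open(path, 'rb') as f:
        subject = pickle.load(f, encoding='latin1')  # dict: 'data', 'labels'
    data = subject['data'][:, :32, :]   # 40 trials x 32 EEG channels x 8064 samples
    labels = subject['labels'][:, :2]   # columns: valence, arousal
    return data, labels

data, labels = load_deap_subject('s01.dat')
y_valence = (labels[:, 0] > 5).astype(int)      # binary Valence target
```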
1. The data of one participant are selected for each experiment, and the data are preprocessed first. Each trial contains 63 seconds of data, of which the first 3 seconds are the electroencephalogram signal generated during the video transition; these are removed, and the average value of the first 3 seconds of signal is subtracted from the remaining 60 seconds of data, thereby reducing the influence of non-emotion signals on emotion recognition.
2. Feature extraction is performed on the 60-second electroencephalogram signals: the time-frequency features of all channels of each trial are extracted by short-time Fourier transform, with the parameters of the short-time Fourier function set as follows: FFT window size 2 s, step size 0.125 s, window function boxcar. The number of samples extracted from each trial is 481, and the feature dimension is 2560.
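A sketch of this feature extraction with SciPy follows; the window, step and boxcar parameters are those stated above, while the 128 Hz sampling rate is an assumption matching the preprocessed DEAP data. With SciPy's default zero-padded boundary, a 60 s trial at 128 Hz yields 60·128/16 + 1 = 481 frames, matching the sample count stated above.

```python
import numpy as np
from scipy.signal import stft

def extract_features(trial, fs=128):
    """Time-frequency features for one preprocessed trial of shape
    (n_channels, n_samples): 2 s boxcar window, 0.125 s step."""
    nperseg = 2 * fs                      # 2 s FFT window
    step = int(0.125 * fs)                # 0.125 s step
    feats = []
    for ch in trial:                      # one STFT per channel
        f, t, Z = stft(ch, fs=fs, window='boxcar',
                       nperseg=nperseg, noverlap=nperseg - step)
        feats.append(np.abs(Z))           # magnitude spectrogram
    spec = np.concatenate(feats, axis=0)  # stack all channels' frequencies
    return spec.T                         # (n_frames, feature_dim): one sample per frame
```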
3. The time-frequency features of all trials of each participant are put into the CNN for training; the convolutional neural network comprises three one-dimensional convolutional layers, a pooling layer and a fully connected layer. 100 samples are fed in per iteration, the feature dimension output by the fully connected layer is set to 4, the loss function is the cross-entropy loss, the optimizer selected is Adam, and the number of iterations is 20. Finally the feature set of each participant's 40 trials is obtained, with 19240 samples and feature dimension 4.
4. The features trained by deep learning are taken as input for hypergraph learning, as follows:
(4.1) First, the data are divided into a training set and a test set at a ratio of 1:1, and N samples are selected from the training set by sampling. Hyperedges are constructed according to the known labels of the training set, an adjacency matrix containing the sample connection relations is obtained, and the weight matrix, vertex degree matrix and edge degree matrix of the hypergraph are constructed according to the Euclidean distance.
(4.2) From these matrices the hypergraph Laplacian matrix, a positive semi-definite matrix, is obtained, and with it the hypergraph segmentation loss function; the prediction error of the samples is measured by the Euclidean distance, and combining the two yields the objective function of supervised hypergraph learning.
(4.3) The mapping relation from the feature space to the label space is substituted into the objective function; taking the partial derivative with respect to the mapping matrix and setting it to zero yields an optimal mapping from the feature space to the label space, and this mapping serves as the classifier to complete classification.
Through experiments, the average prediction accuracy of the proposed method and of plain hypergraph learning over all participants is calculated, and the two methods are compared. Table 1 gives two-class prediction on the Valence dimension of the electroencephalogram emotion data, Table 2 gives two-class prediction on the Arousal dimension, and Table 3 gives four-class prediction on the Valence and Arousal dimensions. To set up the comparison experiments, all experiments were performed on the same preprocessed data, with the parameters of the remaining steps kept consistent.
TABLE 1 Two-class prediction accuracy of the two methods on Valence
TABLE 2 Two-class prediction accuracy of the two methods on Arousal
TABLE 3 Four-class prediction accuracy of the two methods on Valence and Arousal
Experiment 2: influence of sampling training of hypergraph learning on accuracy
The scale of the training samples plays an important role in the practical application of the algorithm: fewer training samples reduce memory occupation and shorten the running time. After the high-quality features of one participant are extracted by deep learning, the method divides the training set and test set at a ratio of 1:1; hypergraph learning is carried out after sampling the training set data, while the test set uses all samples. In the experiment, Valence is chosen as the two-class evaluation standard, and Valence together with Arousal as the four-class evaluation standard. Table 4 shows the variation of the average prediction accuracy over all participants under different sampling ratios, and Table 5 compares the hypergraph learning running time under different sampling ratios. To set up the comparison experiments, each run uses the same preprocessed data, and the parameters of the remaining steps are kept consistent. The unified running environment is a personal computer running Windows 7, with 64 GB of memory, 5 TB of disk capacity, and an Intel(R) E5-2630 v2 CPU at 2.60 GHz.
TABLE 4 Recognition rate comparison for different sampling ratios
TABLE 5 Running time comparison for different sampling ratios (unit: seconds)
Experiment 3: comparison of results of the method proposed by the present invention and the existing method
The following experiment records a comparison between the method of the invention and other methods from the literature on the DEAP data. The methods comprise: (1) Support Vector Machine (SVM): in the paper "EEG-Based Emotion Recognition Using a Wrapper-Based Feature Selection Method", the authors use the power spectral density, oscillation characteristics and Shannon entropy of the EEG signal as features and use an SVM as the classifier for prediction. (2) Quadratic Discriminant Analysis (QDA): in the paper "Sparse constrained social evolution enabled feature-channel-sample hybrid selection for day-life EEG observation recognition", the authors use a variety of common electroencephalogram features and apply quadratic discriminant analysis for emotion recognition. (3) K-Nearest Neighbor (KNN): in the paper "The autoregression system based on autoregressive model and sequential forward selection of emotional signatures", the authors use an autoregressive model to extract features and the K-nearest neighbor method to predict emotion. The following table shows the average two-class accuracy of the four methods over all participants.
TABLE 6 Comparison of the results of the invention with existing methods
As can be seen from the results in Tables 1, 2 and 3, the method of the invention clearly improves the prediction accuracy of hypergraph learning. Tables 4 and 5 show that the method maintains high accuracy with fewer training samples and greatly shortens the running time, which benefits real-time performance. Table 6 shows that, compared with conventional emotion recognition methods, the classification accuracy of the proposed method is clearly improved.
It should be noted that the embodiments of the present invention can be realized by hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the apparatus and methods described above may be implemented using computer-executable instructions and/or embodied in processor control code, such code being provided on a carrier medium such as a disk, CD- or DVD-ROM, programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The apparatus and its modules of the present invention may be implemented by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, or by software executed by various types of processors, or by a combination of hardware circuits and software, e.g., firmware.
The above description is only for the purpose of illustrating the present invention and the appended claims are not to be construed as limiting the scope of the invention, which is intended to cover all modifications, equivalents and improvements that are within the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. The electroencephalogram emotion recognition method is characterized by comprising the following steps:
removing the electroencephalogram signal generated during the video transition at the beginning, and subtracting its average value from the remaining data;
extracting time-frequency domain characteristics of the preprocessed electroencephalogram signals by using short-time Fourier transform;
putting the features into a convolutional neural network for training;
and carrying out hypergraph learning on the obtained features, constructing a hypergraph classifier model, and finishing emotion classification and identification.
2. The electroencephalogram emotion recognition method of claim 1, wherein the electroencephalogram emotion recognition method preprocesses original data, collects the base electroencephalogram without emotion when the electroencephalogram with emotion is collected, records the average value of the electroencephalogram signals without emotion, removes the electroencephalogram signals without emotion, and subtracts the average value from the subsequent emotion signals.
3. The electroencephalogram emotion recognition method of claim 1, wherein the electroencephalogram emotion recognition method extracts electroencephalogram features, time-frequency features of a plurality of channels in each experiment are extracted from preprocessed electroencephalogram signals in a short-time Fourier transform mode, and each sample contains features of all channels.
4. The electroencephalogram emotion recognition method of claim 1, wherein the electroencephalogram emotion recognition method utilizes a convolutional neural network to perform deep learning on features, the time-frequency features are placed into a CNN network to be trained, and sample features with good quality are obtained after multiple iterations of the convolutional neural network, wherein the convolutional neural network comprises three one-dimensional convolutional layers, a pooling layer and a full-link layer.
5. The electroencephalogram emotion recognition method of claim 4, further comprising:
(1) selecting samples, wherein n samples are selected each time, the features are m-dimensional, an n × m feature matrix is formed, and the number of channels is 1;
(2) constructing convolutional layers, with a total of three one-dimensional convolutional layers;
(3) constructing a pooling layer, entering the pooling layer, and performing average pooling;
(4) constructing a full-connection layer, an optimization function and a loss function, entering the full-connection layer after standardization, outputting a k-dimensional vector for each sample, and then setting the optimization function and the loss function;
(5) inputting all samples: continuing to input samples and repeating the above steps until all samples are used;
(6) carrying out multiple iterations, thus finishing one iteration and starting the next iteration.
6. The electroencephalogram emotion recognition method of claim 5, wherein (3) further comprises:
1) the first convolutional layer uses convolution kernels of 2 rows and 1 column, with 5 kernels;
2) the second convolutional layer uses convolution kernels of 2 rows and 5 columns, with 5 kernels;
3) the third convolutional layer uses convolution kernels of 2 rows and 5 columns, with 5 kernels.
7. The electroencephalogram emotion recognition method of claim 1, wherein the electroencephalogram emotion recognition method utilizes hypergraph learning to construct a classifier, takes the features subjected to deep learning training as input to perform hypergraph learning, learns a mapping from the features to the labels, and constructs a classifier model to complete classification;
(1) constructing a hypergraph, wherein the hyperedges of the hypergraph are constructed according to the category of each sample, the adjacency matrix h(v, e) is calculated, and the weight matrix W, the vertex degree matrix D_v and the edge degree matrix D_e are further obtained; the hypergraph Laplacian matrix L_H can then be obtained;
The adjacency matrix is calculated as follows:
h(v, e) = \begin{cases} 1, & v \in e \\ 0, & v \notin e \end{cases}
the calculation method of the weight matrix is as follows:
w(e) = \sum_{\{u, v\} \subseteq e} \exp\left( -\frac{\| x_u - x_v \|^2}{\sigma^2} \right)
the calculation method of the point degree matrix and the edge degree matrix is as follows, wherein V is a vertex set, and E is a super edge set:
d(v) = \sum_{e \in E} w(e)\, h(v, e)
\delta(e) = \sum_{v \in V} h(v, e)
(2) obtaining a hypergraph Laplace matrix, wherein the definition of the hypergraph Laplace matrix is as follows:
L_H = I - D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2}
(3) defining a loss function of hypergraph segmentation, wherein the loss function of hypergraph segmentation is defined by a hypergraph Laplace matrix as follows:
\Omega(F) = \mathrm{Trace}\left( F^{\top} L_H F \right)
where the partition matrix F = [f_1, …, f_m] represents the class predictions of the samples; each f_i is an n-dimensional column vector in which each component represents the prediction value for one class; F_u and F_v denote the row vectors of the matrix corresponding to vertices u and v, indicating which category the vertices belong to; and Trace denotes the trace of the matrix; the hypergraph segmentation loss function minimizes the loss of the sample category relationships;
(4) defining a prediction error, and measuring the prediction error by using Euclidean distance:
R_{emp}(F) = \| F - Y \|^2 = \sum_{i=1}^{m} \| f_i - y_i \|^2
where Y = [y_1, …, y_m] represents the set of true labels, and the column vector y_i is the label vector of one category: if a sample belongs to that category, the element corresponding to the sample in the vector is set to 1, and otherwise to -1;
(5) constructing an objective function of hypergraph learning to obtain the objective function:
\arg\min_{F}\; \mathrm{Trace}\left( F^{\top} L_H F \right) + \lambda \| F - Y \|^2
where λ is a positive parameter balancing the two losses; the objective function minimizes the loss of the sample category relationships together with the class prediction error, so that solving the optimization objective preserves the category relationships between samples as far as possible while minimizing the prediction error;
(6) constructing a mapping relation, wherein a mapping F = X^{\top} B from the sample space to the label space is constructed, and substituting it into the objective function yields a regularized least-squares problem; taking the partial derivative with respect to the projection matrix and setting it to zero gives the mapping matrix:
B = \left( X L_H X^{\top} + \lambda X X^{\top} + \eta I \right)^{-1} \lambda X (2H - 1)
wherein B represents a projection matrix of the feature space to the class, and η is a non-negative regularization parameter;
(7) formula substitution, the mapping relation is substituted into the objective function to obtain:
\arg\min_{B}\; \mathrm{Trace}\left( B^{\top} X L_H X^{\top} B \right) + \lambda \| X^{\top} B - Y \|^2 + \eta \| B \|^2
(8) solving a mapping matrix, solving a partial derivative of the projection matrix and assigning zero to the projection matrix so as to obtain the mapping matrix:
B = \left( X L_H X^{\top} + \lambda X X^{\top} + \eta I \right)^{-1} \lambda X (2H - 1)
(9) performing class prediction, wherein in the test stage the samples are projected into the subspace spanned by the projection matrix B to obtain the prediction space:
P = \mathrm{sign}\left( X_{test}^{\top} B \right)
where the function sign(·) returns the sign of each element of the vector: a positive value indicates that the sample belongs to the class, and a negative value indicates that it does not; p_i = [p_{i1}, …, p_{im}] is a row vector representing the confidence that sample i belongs to each class.
8. A computer device, characterized in that the computer device comprises a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to carry out the steps of:
removing the electroencephalogram signal generated during the video transition at the beginning, and subtracting its average value from the remaining data;
extracting time-frequency domain characteristics of the preprocessed electroencephalogram signals by using short-time Fourier transform;
putting the features into a convolutional neural network for training;
and carrying out hypergraph learning on the obtained features, constructing a hypergraph classifier model, and finishing emotion classification and identification.
9. An electroencephalogram emotion recognition system for implementing the electroencephalogram emotion recognition method as claimed in any one of claims 1 to 7, wherein the electroencephalogram emotion recognition system comprises:
the data preprocessing module is used for removing the electroencephalogram signal generated during the video transition at the beginning and then subtracting its average value from the remaining data;
the characteristic extraction module is used for extracting time-frequency domain characteristics from the preprocessed electroencephalogram signals by using short-time Fourier transform;
the characteristic optimization module is used for putting the characteristics into a convolutional neural network for training and extracting high-quality characteristics;
and the emotion recognition module is used for carrying out hypergraph learning on the obtained features, constructing a hypergraph classifier model and finishing emotion classification recognition.
10. A wearable device characterized in that it is equipped with the electroencephalogram emotion recognition system of claim 9.
CN202010905842.7A 2020-09-01 2020-09-01 Electroencephalogram emotion recognition method, electroencephalogram emotion recognition system, computer equipment and wearable equipment Active CN112101152B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010905842.7A CN112101152B (en) 2020-09-01 2020-09-01 Electroencephalogram emotion recognition method, electroencephalogram emotion recognition system, computer equipment and wearable equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010905842.7A CN112101152B (en) 2020-09-01 2020-09-01 Electroencephalogram emotion recognition method, electroencephalogram emotion recognition system, computer equipment and wearable equipment

Publications (2)

Publication Number Publication Date
CN112101152A (en) 2020-12-18
CN112101152B (en) 2024-02-02

Family

ID=73756928

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010905842.7A Active CN112101152B (en) 2020-09-01 2020-09-01 Electroencephalogram emotion recognition method, electroencephalogram emotion recognition system, computer equipment and wearable equipment

Country Status (1)

Country Link
CN (1) CN112101152B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018014436A1 (en) * 2016-07-18 2018-01-25 天津大学 EEG emotion recognition method providing temporal robustness of the emotion recognition model
KR20190128978A (en) * 2018-05-09 2019-11-19 한국과학기술원 Method for estimating human emotions using deep psychological affect network and system therefor
CN110353702A (en) * 2019-07-02 2019-10-22 华南理工大学 Emotion recognition method and system based on a shallow convolutional neural network
CN110399857A (en) * 2019-08-01 2019-11-01 西安邮电大学 Electroencephalogram emotion recognition method based on graph convolutional neural networks

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SUN Zhonggao; XUE Quande; WANG Xinjun; HUANG Xiaoli: "A review of emotion recognition methods based on EEG signals", Beijing Biomedical Engineering (北京生物医学工程), no. 02 *
YIN Wang; LI Huiyuan: "Research progress on the application of deep learning in EEG emotion recognition", Computer Era (计算机时代), no. 08 *
JIA Xiaoyun; WANG Liyan; CHEN Jingxia; ZHANG Pengwei: "An EEG emotion classification algorithm based on combined time-frequency-domain features", Science Technology and Engineering (科学技术与工程), no. 33 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112690793A (en) * 2020-12-28 2021-04-23 中国人民解放军战略支援部队信息工程大学 Emotion electroencephalogram transfer model training method and system, and emotion recognition method and device
CN112690793B (en) * 2020-12-28 2023-05-16 中国人民解放军战略支援部队信息工程大学 Emotion electroencephalogram transfer model training method and system, and emotion recognition method and device
CN112617860A (en) * 2020-12-31 2021-04-09 山东师范大学 Emotion classification method and system based on a brain functional connectivity network constructed from phase locking values
CN112766355A (en) * 2021-01-13 2021-05-07 合肥工业大学 Electroencephalogram signal emotion recognition method under label noise
CN112766355B (en) * 2021-01-13 2022-08-05 合肥工业大学 Electroencephalogram signal emotion recognition method under label noise
CN112773378B (en) * 2021-01-20 2022-05-17 杭州电子科技大学 Electroencephalogram emotion recognition method with adaptive learning of feature weights
CN112773378A (en) * 2021-01-20 2021-05-11 杭州电子科技大学 Electroencephalogram emotion recognition method with adaptive learning of feature weights
CN112800239A (en) * 2021-01-22 2021-05-14 中信银行股份有限公司 Intention recognition model training method, intention recognition method and device
CN112800239B (en) * 2021-01-22 2024-04-12 中信银行股份有限公司 Training method of intention recognition model, and intention recognition method and device
CN113129267A (en) * 2021-03-22 2021-07-16 杭州电子科技大学 OCT image detection method and system based on retinal layering data
CN113553896B (en) * 2021-03-25 2024-02-09 杭州电子科技大学 Electroencephalogram emotion recognition method based on multi-feature deep forest
CN113553896A (en) * 2021-03-25 2021-10-26 杭州电子科技大学 Electroencephalogram emotion recognition method based on multi-feature deep forest
CN113128353B (en) * 2021-03-26 2023-10-24 安徽大学 Emotion perception method and system for natural human-computer interaction
CN113128353A (en) * 2021-03-26 2021-07-16 安徽大学 Emotion perception method and system for natural human-computer interaction
CN113297981B (en) * 2021-05-27 2023-04-07 西北工业大学 End-to-end electroencephalogram emotion recognition method based on attention mechanism
CN113297981A (en) * 2021-05-27 2021-08-24 西北工业大学 End-to-end electroencephalogram emotion recognition method based on attention mechanism
CN114219014A (en) * 2021-11-26 2022-03-22 合肥工业大学 Electroencephalogram-based attention graph pooling method for depressive disorder identification and classification
CN114176607A (en) * 2021-12-27 2022-03-15 杭州电子科技大学 Electroencephalogram signal classification method based on vision Transformer
CN114176607B (en) * 2021-12-27 2024-04-19 杭州电子科技大学 Electroencephalogram signal classification method based on vision Transformer
CN114818786A (en) * 2022-04-06 2022-07-29 五邑大学 Channel screening method, emotion recognition method, system and storage medium
CN114818786B (en) * 2022-04-06 2024-03-01 五邑大学 Channel screening method, emotion recognition system and storage medium
CN116701917B (en) * 2023-07-28 2023-10-20 电子科技大学 Open set emotion recognition method based on physiological signals
CN116701917A (en) * 2023-07-28 2023-09-05 电子科技大学 Open set emotion recognition method based on physiological signals
CN117251969A (en) * 2023-09-28 2023-12-19 深圳技术大学 Brain-like positive and negative feedback echo state network design method for emotion classification
CN117562542A (en) * 2024-01-17 2024-02-20 小舟科技有限公司 Emotion recognition method based on electroencephalogram signals, computer equipment and storage medium
CN117562542B (en) * 2024-01-17 2024-04-30 小舟科技有限公司 Emotion recognition method based on electroencephalogram signals, computer equipment and storage medium
CN117643470A (en) * 2024-01-30 2024-03-05 武汉大学 Fatigue driving detection method and device based on electroencephalogram interpretation
CN117643470B (en) * 2024-01-30 2024-04-26 武汉大学 Fatigue driving detection method and device based on electroencephalogram interpretation

Also Published As

Publication number Publication date
CN112101152B (en) 2024-02-02

Similar Documents

Publication Publication Date Title
CN112101152B (en) Electroencephalogram emotion recognition method, electroencephalogram emotion recognition system, computer equipment and wearable equipment
Nawaz et al. Comparison of different feature extraction methods for EEG-based emotion recognition
Zhao et al. ECG authentication system design incorporating a convolutional neural network and generalized S-Transformation
Li et al. A new ECG signal classification based on WPD and ApEn feature extraction
CN114052735B (en) Electroencephalogram emotion recognition method and system based on deep domain adaptation
CN111310570B (en) Electroencephalogram signal emotion recognition method and system based on VMD and WPD
CN111202517B (en) Sleep automatic staging method, system, medium and electronic equipment
Manjunath et al. A low-power LSTM processor for multi-channel brain EEG artifact detection
CN113191225A (en) Emotional electroencephalogram recognition method and system based on graph attention network
CN113011493A (en) Electroencephalogram emotion classification method, device, medium and equipment based on multi-kernel broad learning
Xie et al. Electroencephalogram emotion recognition based on a stacking classification model
Yao et al. Interpretation of electrocardiogram heartbeat by CNN and GRU
CN115414051A (en) Emotion classification and recognition method for electroencephalogram signals with an adaptive window
CN113768515A (en) Electrocardiosignal classification method based on deep convolutional neural network
Yao et al. A CNN-Transformer deep learning model for real-time sleep stage classification in an energy-constrained wireless device
Mahato et al. Analysis of region of interest (RoI) of brain for detection of depression using EEG signal
Haider et al. EEG-based schizophrenia classification using penalized sequential dictionary learning in the context of mobile healthcare
CN116664956A (en) Image recognition method and system based on multi-task automatic encoder
CN116602676A (en) Electroencephalogram emotion recognition method and system based on multi-feature fusion and CLSTN
CN116186544A (en) Single-channel electroencephalogram sleep staging method based on deep learning
Hassan et al. Review of EEG Signals Classification Using Machine Learning and Deep-Learning Techniques
CN115631371A (en) Method for extracting the core network of electroencephalogram signals
CN114398991A (en) Electroencephalogram emotion recognition method based on Transformer structure search
Pirolla et al. Dimensionality reduction to improve content-based image retrieval: A clustering approach
Lu Human emotion recognition based on multi-channel EEG signals using LSTM neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant