CN111832431A - Emotional electroencephalogram classification method based on CNN - Google Patents

Emotional electroencephalogram classification method based on CNN

Info

Publication number
CN111832431A
CN111832431A (application CN202010582404.1A)
Authority
CN
China
Prior art keywords
emotion
data
layer
samples
convolutional neural
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010582404.1A
Other languages
Chinese (zh)
Inventor
陈林楠
杨涛
马玉良
张启忠
高云园
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202010582404.1A priority Critical patent/CN111832431A/en
Publication of CN111832431A publication Critical patent/CN111832431A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/047Probabilistic or stochastic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/15Biometric patterns based on physiological signals, e.g. heartbeat, blood flow

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Signal Processing (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a CNN-based emotion electroencephalogram classification method. Taking the Russell model among the continuous emotion dimension models as a reference and the DEAP data set as samples, the method removes the baseline from the emotional EEG signals, normalizes the data, extracts the Pearson coefficients of three EEG frequency bands and converts them into a 2D picture format, screens out the experiments valuable for emotion EEG classification with SBS on a per-experiment basis, feeds the screened experimental data into CWGAN-GP for data enhancement to supplement the training set, and inputs the result in frame form into an integrated convolutional neural network. The method classifies emotional EEG signals effectively and delivers considerable classification accuracy, and inputting the signals into the integrated convolutional neural network in frame form effectively prevents overfitting of the network.

Description

Emotional electroencephalogram classification method based on CNN
Technical Field
The invention relates to an emotion electroencephalogram classification method, in particular to a method for classifying emotion electroencephalograms with a convolutional neural network (CNN) after preprocessing and optimization.
Background
A convolutional neural network (CNN) is a feedforward neural network that performs excellently in large-scale image processing and has been widely used in fields such as image classification and localization. Compared with other neural network structures, a CNN requires relatively few parameters, which allows it to be applied widely.
A generative adversarial network (GAN) produces remarkably good output through the mutual game between the two modules in its framework, the generative model and the discriminative model, and can generate realistic new data from an original data set. WGAN-GP was proposed to address the problems of WGAN: in real experiments WGAN still trains with difficulty and converges slowly, and shows no obvious experimental improvement over the traditional GAN. The problem with WGAN is that it enforces the continuity constraint directly by weight clipping: each time the discriminator's parameters are updated, it checks whether the absolute values of all parameters exceed a threshold, e.g. 0.01, and clamps them back to the range [-0.01, 0.01]. Under this scheme the optimal strategy is to push all parameters to the extremes, taking either the maximum value (e.g. 0.01) or the minimum value (e.g. -0.01), so the fitting capability of a deep neural network cannot be fully exploited. Forced weight clipping is also prone to vanishing or exploding gradients, both caused by the choice of the clipping range: if the range is set slightly too small, the gradient shrinks a little through each layer of the network and vanishes after many layers; if slightly too large, it grows a little through each layer and explodes after many layers. To solve this problem and find a suitable way to satisfy the continuity condition, WGAN-GP uses a gradient penalty to enforce it. CWGAN-GP adds a conditional constraint, namely a label, on top of WGAN-GP, so that it can generate data of a specified label.
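As an illustration of the gradient penalty term, here is a minimal PyTorch-style sketch; the discriminator interface, the label conditioning, and the coefficient lam are assumptions for illustration rather than the patent's exact implementation:

```python
import torch

def gradient_penalty(discriminator, real, fake, labels, lam=10.0):
    """WGAN-GP penalty: push the discriminator's gradient norm toward 1
    on random interpolations between real and generated samples."""
    # one interpolation coefficient per sample, broadcast over image dims
    alpha = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    x_hat = (alpha * real + (1.0 - alpha) * fake).requires_grad_(True)
    d_out = discriminator(x_hat, labels)  # CWGAN-GP: the discriminator also sees the label
    grads = torch.autograd.grad(outputs=d_out, inputs=x_hat,
                                grad_outputs=torch.ones_like(d_out),
                                create_graph=True)[0]
    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    return lam * ((grad_norm - 1.0) ** 2).mean()
```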
Sequential Backward Selection (SBS) starts from the complete feature set A and, at each step, removes from A the one feature whose removal makes the evaluation function value optimal.
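A minimal, generic Python sketch of this procedure, assuming a user-supplied evaluate function that scores a candidate subset (in this invention, a combination of experiments scored by classification accuracy):

```python
def sequential_backward_selection(features, evaluate, n_keep):
    """Start from the full set; at each step drop the single feature whose
    removal gives the best evaluation score, until n_keep remain."""
    selected = list(features)
    while len(selected) > n_keep:
        best_score, best_subset = None, None
        for candidate in selected:
            subset = [f for f in selected if f != candidate]
            score = evaluate(subset)  # e.g. cross-validated accuracy on the subset
            if best_score is None or score > best_score:
                best_score, best_subset = score, subset
        selected = best_subset
    return selected
```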
Convolutional neural networks are a highly effective model for the image classification task. For such models, this work converts the DEAP data into a 2D image format so that the CNN model can learn to classify it effectively.
Disclosure of Invention
The method takes the Russell model among the continuous emotion dimension models as a reference, uses the DEAP data set as samples, normalizes the preprocessed data, extracts the Pearson coefficients of the three EEG frequency bands most relevant to emotion recognition, converts them into a 2D picture format, screens out the experiments valuable for emotion EEG classification with SBS on a per-experiment basis, feeds the screened experimental data into CWGAN-GP for data enhancement to supplement the training set, and finally inputs the result into the convolutional neural network for classification.
The invention classifies the preprocessed emotion electroencephalogram by using CNN, and specifically comprises the following steps:
Step 1, establish the emotion dimension model represented by Russell, and label the evaluation indexes (labels) in the data set according to the required categories.
Step 2, extract the baseline data of the 3 seconds before each experiment, segment it with the selected time window, and sum and average the segmented baseline data; segment the remaining EEG data with the same time window and subtract the processed baseline to obtain the EEG difference data, then normalize the data. Extract the 3 frequency bands most relevant to emotion recognition from the EEG difference data, compute the Pearson coefficients of the 3 bands, and convert them into a picture format.
Step 3, for each subject, taking an experiment as the unit, reshape the subject's data into (number of experiments × number of segments × picture), set the number of experiments to retain after screening, and feed the data into SBS to screen out the experiment combination with the highest accuracy; then feed the screened data into CWGAN-GP, a labeled (conditional) generative adversarial network, for data enhancement, supplementing the data so that the emotion data of each category is balanced, since samples of some categories are reduced after screening.
Step 4, split the processed data and labels into training and test samples at a ratio of 8:2, and input them in frame format into the integrated convolutional neural network to classify the processed samples.
In step 1, with the Russell emotion dimension model as reference, each emotional state is placed on a two-dimensional plane whose coordinate axes are arousal and valence. Valence represents the subject's (person's) subjective evaluation of the emotion, varying from negative to positive along the axis; arousal represents the degree to which the subject perceives the emotion, varying from calm to excited along the axis. Both dimensions take values from 1 to 9, with the magnitude representing the intensity of the emotion. For binary emotion classification, the ratings on valence and arousal are split at the median 5 as threshold: ratings greater than 5 are labeled 1, representing high valence or high arousal (HV, HA); ratings less than or equal to 5 are labeled 0, representing low valence or low arousal (LV, LA). For four-class emotion classification, HV, HA, LV, LA are combined to generate the four class labels HVHA, HVLA, LVHA, LVLA.
In the step 2, the electroencephalogram signals are converted into a 2D picture form, and the method specifically comprises the following steps:
2-2. For the 8-12 Hz (alpha), 12-30 Hz (beta), and 30-60 Hz (gamma) bands, the signal is filtered with a fourth-order Butterworth filter.
2-3. For each frequency band, the Pearson coefficient is computed between every pair of the 32 EEG channels, giving a 32 × 32 × 1 matrix per band; the results of the 3 bands are combined into a 32 × 32 × 3 2D picture format.
In step 4, the processed data and labels are split into training and test samples at a ratio of 8:2 and input in frame format into the integrated convolutional neural network to classify the processed samples, specifically as follows:
the time window size is set to 1s, 40 × 60 × 32 ═ 76800 samples are segmented in total, for 60 segments of each experiment, every 3 segments, namely every 3 seconds of samples are combined into one frame, no repeated samples exist between frames, one experiment can generate 20 frames, the total number of samples becomes 40 × 20 × 32 ═ 25600, the sample format is 25600 × 3 × 32 × 3, 3 pictures are input to 3 convolutional neural networks at the same time, and the average of 3 output results is the final result of the input. All samples were run as 8: 2 training samples and testing samples.
The integrated convolutional neural network consists of three convolutional neural networks with identical structures; each network is structured as follows:
The C1 layer is a convolutional layer with ReLU, with 32 filters, a 3 × 3 kernel, and 1 × 1 stride.
The S2 layer is an average pooling layer of size 2 × 2 with 2 × 2 stride.
The B3 layer is a BN (batch normalization) layer.
The C4 layer is also a convolutional layer with ReLU, with 64 filters, a 3 × 3 kernel, and 1 × 1 stride.
The S5 layer is an average pooling layer of size 2 × 2 with 2 × 2 stride.
The B6 layer is a BN layer.
The C7 layer is a convolutional layer with ReLU, with 128 filters, a 3 × 3 kernel, and 1 × 1 stride.
The D8 layer is a fully connected layer with an output dimension of 128.
The DO9 layer is a Dropout layer that randomly turns off some neurons with probability 0.5 during training.
Finally, a Softmax layer with output dimension 2 is appended.
The invention has the following beneficial effects:
the method adopts a Russell emotion dimension model in a continuous emotion dimension model as a reference, uses a DEAP data set as a sample, removes a base line of emotion electroencephalogram in the sample, performs data normalization to extract Pearson coefficients of three frequency bands of the electroencephalogram, converts the Pearson coefficients into a 2D picture format, screens through SBS data, enhances the data, and inputs the data into an integrated convolutional neural network in a frame form, so that the emotion classification accuracy can be effectively improved, and overfitting of the convolutional neural network is prevented.
Drawings
FIG. 1 is a Russell emotional dimension model;
FIG. 2 is a data type of a DEAP data set;
FIG. 3 is a schematic diagram of brain electrical data preprocessing;
fig. 4 is a basic configuration diagram of a CNN network;
FIG. 5 is a flow chart of emotion electroencephalogram classification.
Detailed Description
The present invention is further illustrated by the following specific examples. The following description is exemplary and explanatory only and is not restrictive of the invention in any way.
As shown in fig. 5, a CNN-based emotion electroencephalogram classification method specifically includes the following steps:
step 1, as shown in fig. 1, establishing a Russell emotion dimension model in a continuous emotion dimension model, and performing labeling processing on evaluation indexes labels in a data set according to required categories.
The Russell emotion dimension model is a continuous two-dimensional emotion space with arousal and valence as coordinate axes. Valence represents the subject's (person's) subjective evaluation of the emotion, varying from negative to positive along the axis; arousal represents the degree to which the subject perceives the emotion, varying from calm to excited along the axis. Both dimensions take values from 1 to 9, with the magnitude representing the intensity of the emotion. For binary emotion classification, the ratings on valence and arousal are split at the median 5 as threshold: ratings greater than 5 are labeled 1, representing high valence or high arousal (HV, HA); ratings less than or equal to 5 are labeled 0, representing low valence or low arousal (LV, LA). For four-class emotion classification, HV, HA, LV, LA are combined to generate the four class labels HVHA, HVLA, LVHA, LVLA.
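A small numpy sketch of this labeling rule; the numeric codes used for the four-class case are an assumed encoding, since the text only names the classes:

```python
import numpy as np

def binary_label(ratings):
    """Ratings in [1, 9]; > 5 -> 1 (HV/HA), <= 5 -> 0 (LV/LA)."""
    return (np.asarray(ratings) > 5).astype(int)

def four_class_label(valence, arousal):
    """Combine the two binary labels into one of four classes:
    LVLA=0, LVHA=1, HVLA=2, HVHA=3 (assumed numeric encoding)."""
    return 2 * binary_label(valence) + binary_label(arousal)
```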
Step 2, as shown in FIG. 2 and FIG. 3: the method selects only the EEG data in the DEAP data set. The raw data of each subject has the format 40 × 40 × 8064 (number of experiments × number of channels × data points); the data of the 8 peripheral physiological signal channels are deleted. Each channel has 8064 sampling points (63 s × 128 Hz), divided into 3 seconds of baseline data and 60 seconds of trial data. Using a 1-second time window, each channel is segmented without overlap into 63 samples. The baseline samples of the first 3 seconds are summed and averaged to obtain the EEG of the normal state without emotional fluctuation, which is subtracted from the remaining EEG samples to obtain the EEG difference data; the differences are then z-score standardized. The final number of samples is 76800 = 32 × 40 × 60 (number of subjects × number of experiments × number of segments). The 3 frequency bands most relevant to emotion recognition, namely 8-12 Hz (alpha), 12-30 Hz (beta), and 30-60 Hz (gamma), are extracted from the EEG difference data, each channel of the signal being filtered with a fourth-order Butterworth filter. With the 3 bands extracted from each channel, the Pearson coefficient (PCC) is computed between every pair of the 32 EEG channels for each band, by the formula:
$$\rho_{X,Y} = \frac{\operatorname{cov}(X,Y)}{\sigma_X \sigma_Y} = \frac{E\left[(X-\mu_X)(Y-\mu_Y)\right]}{\sigma_X \sigma_Y}$$
where X and Y are two channels selected from the 32 EEG channels, cov is the covariance, μ the mean, and σ the standard deviation.
The Pearson coefficients of one band form a 32 × 32 × 1 matrix, and the results of the 3 bands are combined to obtain the 32 × 32 × 3 2D picture format.
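To make the pipeline concrete, the following numpy/scipy sketch covers one trial from baseline removal to the 32 × 32 × 3 Pearson picture. The shapes follow the description above; the global z-score, the per-segment filtering, and the filter settings are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 128                                # DEAP sampling rate (Hz)
BANDS = [(8, 12), (12, 30), (30, 60)]   # alpha, beta, gamma

def preprocess_trial(trial):
    """trial: (32, 8064) EEG for one experiment (3 s baseline + 60 s stimulus).
    Returns 60 baseline-corrected, z-scored 1 s segments of shape (60, 32, 128)."""
    segments = trial.reshape(32, 63, FS).transpose(1, 0, 2)  # (63, 32, 128)
    baseline = segments[:3].mean(axis=0)                     # average of the first 3 s
    data = segments[3:] - baseline                           # remove resting baseline
    return (data - data.mean()) / data.std()                 # z-score (assumed global)

def pearson_picture(segment):
    """segment: (32, 128) one second of EEG. For each band: 4th-order
    Butterworth band-pass, then the 32 x 32 Pearson matrix; stack to 32 x 32 x 3."""
    planes = []
    for lo, hi in BANDS:
        sos = butter(4, [lo, hi], btype='bandpass', fs=FS, output='sos')
        filtered = sosfiltfilt(sos, segment, axis=1)
        planes.append(np.corrcoef(filtered))  # PCC between every pair of channels
    return np.stack(planes, axis=-1)          # the 2D "picture" fed to the CNN
```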
Step 3, for each subject, taking an experiment as the unit, the subject's data are reshaped into 40 × 60 × (32 × 32 × 3) (number of experiments × number of segments × picture). With the number of retained experiments set to 30, the data are fed into SBS to screen out the experiment combination with the highest accuracy; the screened experimental data are then fed into CWGAN-GP, a labeled (conditional) generative adversarial network, for data enhancement, generating data of specified labels to supplement and balance the emotion data of each category.
Step 4, for the 60 segments of each experiment, every 3 segments, i.e., every 3 seconds of samples, are combined into one frame, with no repeated samples between frames; one experiment yields 20 frames, the total number of samples becomes 40 × 20 × 32 = 25600, and the sample format is 25600 × 3 × 32 × 32 × 3. All samples are split 8:2 into training samples and test samples. Each time, the 3 pictures of a frame are input simultaneously to 3 convolutional neural networks of identical configuration, and the average of the 3 outputs is the final result for that input.
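A sketch of the frame grouping and the 8:2 split; `pictures` and `labels` are assumed to have been prepared upstream (one label per frame), and scikit-learn's train_test_split is used merely as one convenient way to realize the ratio:

```python
import numpy as np
from sklearn.model_selection import train_test_split

def make_frames(pictures):
    """pictures: (n_trials, 60, 32, 32, 3) per-second Pearson pictures.
    Every 3 consecutive, non-overlapping segments form one frame,
    giving (n_trials * 20, 3, 32, 32, 3)."""
    n_trials = pictures.shape[0]
    return pictures.reshape(n_trials, 20, 3, 32, 32, 3).reshape(-1, 3, 32, 32, 3)

frames = make_frames(pictures)  # pictures assumed prepared upstream
X_train, X_test, y_train, y_test = train_test_split(
    frames, labels, test_size=0.2, stratify=labels)  # the 8:2 split
```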
As shown in FIG. 4, each of the three convolutional neural networks is structured as follows (a sketch in code follows the list):
4-1. The C1 layer is a convolutional layer with ReLU, with 32 filters, a 3 × 3 kernel, and 1 × 1 stride.
4-2. The S2 layer is an average pooling layer of size 2 × 2 with 2 × 2 stride.
4-3. The B3 layer is a BN layer.
4-4. The C4 layer is also a convolutional layer with ReLU, with 64 filters, a 3 × 3 kernel, and 1 × 1 stride.
4-5. The S5 layer is an average pooling layer of size 2 × 2 with 2 × 2 stride.
4-6. The B6 layer is a BN layer.
4-7. The C7 layer is a convolutional layer with ReLU, with 128 filters, a 3 × 3 kernel, and 1 × 1 stride.
4-8. The D8 layer is a fully connected layer with an output dimension of 128.
4-9. The DO9 layer is a Dropout layer that randomly turns off some neurons with probability 0.5 during training.
4-10. Finally, a Softmax layer with output dimension 2 is appended.
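A Keras sketch of one member network and the three-network averaging ensemble described above; the Flatten layer before D8, the ReLU on D8, and the optimizer/loss choices are assumptions the patent does not spell out:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn():
    """One member network, layer-for-layer as listed in 4-1 to 4-10."""
    return models.Sequential([
        layers.Conv2D(32, 3, strides=1, activation='relu',
                      input_shape=(32, 32, 3)),            # C1
        layers.AveragePooling2D(2, strides=2),              # S2
        layers.BatchNormalization(),                        # B3
        layers.Conv2D(64, 3, strides=1, activation='relu'), # C4
        layers.AveragePooling2D(2, strides=2),              # S5
        layers.BatchNormalization(),                        # B6
        layers.Conv2D(128, 3, strides=1, activation='relu'),# C7
        layers.Flatten(),                                   # assumed, not stated
        layers.Dense(128, activation='relu'),               # D8 (activation assumed)
        layers.Dropout(0.5),                                # DO9
        layers.Dense(2, activation='softmax'),              # final Softmax layer
    ])

# Ensemble: three identical CNNs, one per picture in a frame; outputs averaged.
inputs = [layers.Input(shape=(32, 32, 3)) for _ in range(3)]
outputs = [build_cnn()(x) for x in inputs]
ensemble = models.Model(inputs, layers.Average()(outputs))
ensemble.compile(optimizer='adam',                          # training setup assumed
                 loss='sparse_categorical_crossentropy',
                 metrics=['accuracy'])
```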
Table 1 compares the binary classification performance of traditional classification models on the DEAP data set under the Russell emotion dimension model:
TABLE 1 comparison of classification accuracy for each classification model
With the samples extracted in this proportion used as training samples and the rest as test samples, and the data input to the CNN model in frame form, classification under the two criteria of valence and arousal achieved considerable results, with average accuracies of 83.3% and 84.2% respectively, a clear improvement over other classification algorithms.

Claims (2)

1. A CNN-based emotion electroencephalogram classification method is characterized by comprising the following steps:
step 1, establishing the emotion dimension model represented by Russell, and labeling the evaluation indexes (labels) in the data set according to the required categories;
step 2, extracting the baseline data of the 3 seconds before each experiment, segmenting it with the selected time window, and summing and averaging the segmented baseline data; segmenting the remaining EEG data with the same time window and subtracting the processed baseline to obtain the EEG difference data, then performing data normalization; extracting the 3 frequency bands most relevant to emotion recognition from the EEG difference data, calculating the Pearson coefficients of the 3 bands, and converting them into a 32 × 32 × 3 picture format;
step 3, for each subject, taking an experiment as the unit, reshaping the subject's data into (number of experiments × number of segments × 32 × 32 × 3), setting the number of experiments to retain after screening, and feeding the data into sequential backward selection to screen out the experiment combination with the highest accuracy;
step 4, splitting the processed data and labels according to the set ratio of training samples to test samples, and inputting them in frame form into the integrated convolutional neural network to classify the processed samples;
inputting the data obtained in step 3 into the convolutional neural network in frame form, specifically: for the 60 segments of each experiment, every 3 segments, i.e., every 3 seconds of samples, are combined into one frame, with no repeated samples between frames; one experiment generates 20 frames, the total number of samples becomes: number of experiments × 20 × number of subjects, and the sample format is: total number of samples × 3 × 32 × 32 × 3; all samples are split into training and test samples according to the set ratio; each time, 3 pictures are input simultaneously to 3 convolutional neural networks of identical configuration, and the average of the 3 outputs is the final result for that input;
the integrated convolutional neural network consists of three convolutional neural networks with identical structures, each structured as follows:
the C1 layer is a convolutional layer with ReLU, with 32 filters, a 3 × 3 kernel, and 1 × 1 stride;
the S2 layer is an average pooling layer of size 2 × 2 with 2 × 2 stride;
the B3 layer is a BN layer;
the C4 layer is also a convolutional layer with ReLU, with 64 filters, a 3 × 3 kernel, and 1 × 1 stride;
the S5 layer is an average pooling layer of size 2 × 2 with 2 × 2 stride;
the B6 layer is a BN layer;
the C7 layer is a convolutional layer with ReLU, with 128 filters, a 3 × 3 kernel, and 1 × 1 stride;
the D8 layer is a fully connected layer with an output dimension of 128;
the DO9 layer is a Dropout layer that randomly turns off some neurons with probability 0.5 during training;
the output dimension of the finally appended Softmax layer is 2.
2. The CNN-based emotion electroencephalogram classification method according to claim 1, wherein labeling the evaluation indexes (labels) in the data set according to the required categories in step 1 is specifically: with the Russell emotion dimension model as reference, each emotional state is placed on a two-dimensional plane whose coordinate axes are arousal and valence; valence represents the subject's subjective evaluation of the emotion, varying from negative to positive along the axis; arousal represents the degree to which the subject perceives the emotion, varying from calm to excited along the axis; both dimensions take values from 1 to 9, the magnitude representing the intensity of the emotion; for binary emotion classification, the ratings on valence and arousal are split at the median 5 as threshold, ratings greater than 5 being labeled 1, representing high valence HV or high arousal HA, and ratings less than or equal to 5 being labeled 0, representing low valence LV or low arousal LA; for four-class emotion classification, HV, HA, LV, LA are combined to generate the four class labels HVHA, HVLA, LVHA, LVLA.
CN202010582404.1A 2020-06-23 2020-06-23 Emotional electroencephalogram classification method based on CNN Pending CN111832431A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010582404.1A CN111832431A (en) 2020-06-23 2020-06-23 Emotional electroencephalogram classification method based on CNN

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010582404.1A CN111832431A (en) 2020-06-23 2020-06-23 Emotional electroencephalogram classification method based on CNN

Publications (1)

Publication Number Publication Date
CN111832431A true CN111832431A (en) 2020-10-27

Family

ID=72898051

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010582404.1A Pending CN111832431A (en) 2020-06-23 2020-06-23 Emotional electroencephalogram classification method based on CNN

Country Status (1)

Country Link
CN (1) CN111832431A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112633365A (en) * 2020-12-21 2021-04-09 西安理工大学 Mirror convolution neural network model and motor imagery electroencephalogram recognition algorithm
CN112633365B (en) * 2020-12-21 2024-03-19 西安理工大学 Mirror convolution neural network model and motor imagery electroencephalogram recognition algorithm
CN112966566A (en) * 2021-02-05 2021-06-15 武汉中旗生物医疗电子有限公司 Electrocardiosignal baseline filtering method and device
CN112966566B (en) * 2021-02-05 2023-07-07 武汉中旗生物医疗电子有限公司 Electrocardiosignal baseline filtering method and device
CN112990342A (en) * 2021-04-08 2021-06-18 重庆大学 Semi-supervised SAR target recognition method
CN112990342B (en) * 2021-04-08 2023-09-19 重庆大学 Semi-supervised SAR target recognition method
CN113191429A (en) * 2021-04-29 2021-07-30 国网河北省电力有限公司电力科学研究院 Power transformer bushing fault diagnosis method and device
CN114403877A (en) * 2022-01-21 2022-04-29 中山大学 Multi-physiological-signal emotion quantitative evaluation method based on two-dimensional continuous model
CN114745675A (en) * 2022-04-28 2022-07-12 重庆邮电大学 Wi-Fi indoor positioning method based on improved GAN combined hypothesis test

Similar Documents

Publication Publication Date Title
CN111832431A (en) Emotional electroencephalogram classification method based on CNN
CN110069958B (en) Electroencephalogram signal rapid identification method of dense deep convolutional neural network
CN110353702A (en) A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN114052735B (en) Deep field self-adaption-based electroencephalogram emotion recognition method and system
CN111709267B (en) Electroencephalogram signal emotion recognition method of deep convolutional neural network
CN111329474A (en) Electroencephalogram identity recognition method and system based on deep learning and information updating method
CN111832416A (en) Motor imagery electroencephalogram signal identification method based on enhanced convolutional neural network
CN107066514A (en) The Emotion identification method and system of the elderly
Kaziha et al. A convolutional neural network for seizure detection
CN110135244B (en) Expression recognition method based on brain-computer collaborative intelligence
CN113128552A (en) Electroencephalogram emotion recognition method based on depth separable causal graph convolution network
CN114631831A (en) Cross-individual emotion electroencephalogram recognition method and system based on semi-supervised field self-adaption
CN110717423A (en) Training method and device for emotion recognition model of facial expression of old people
CN111920420A (en) Patient behavior multi-modal analysis and prediction system based on statistical learning
CN115221969A (en) Motor imagery electroencephalogram signal identification method based on EMD data enhancement and parallel SCN
CN113180659A (en) Electroencephalogram emotion recognition system based on three-dimensional features and cavity full convolution network
Srinivasan et al. Brain MR image analysis using discrete wavelet transform with fractal feature analysis
Thomas et al. Artificial neural network for diagnosing autism spectrum disorder
Micheli-Tzanakou et al. Neural networks and blood cell identification
CN113128353A (en) Emotion sensing method and system for natural human-computer interaction
CN116421200A (en) Brain electricity emotion analysis method of multi-task mixed model based on parallel training
Immanuel et al. ANALYSIS OF DIFFERENT EMOTIONS WITH BIO-SIGNALS (EEG) USING DEEP CNN
CN112450946A (en) Electroencephalogram artifact restoration method based on loop generation countermeasure network
Chitra et al. Facial expression recognition using local binary pattern and support vector machine
Suttapakti et al. Multi-directional Texture Feature Extraction for Glaucoma Classification from Color Retinal Images

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination