CN113177482A - Cross-individual electroencephalogram signal classification method based on minimum category confusion - Google Patents

Cross-individual electroencephalogram signal classification method based on minimum category confusion

Info

Publication number
CN113177482A
CN113177482A CN202110479311.0A
Authority
CN
China
Prior art keywords
class
sample
individual
cross
confusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110479311.0A
Other languages
Chinese (zh)
Inventor
陈勋
崔恒
刘爱萍
张勇东
吴枫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN202110479311.0A priority Critical patent/CN113177482A/en
Publication of CN113177482A publication Critical patent/CN113177482A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Biology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a cross-individual electroencephalogram signal classification method based on minimum category confusion, which comprises the following steps: 1. extracting the frequency domain features of the EEG signal in each frequency band and standardizing them; 2. establishing a GRU-MCC network model based on minimum category confusion, which consists of a feature extractor and a classifier, and calculating the cross entropy loss and the minimum category confusion loss simultaneously to optimize the model parameters; 3. verifying the model on a public data set with a leave-one-subject-out cross-validation strategy; 4. using the trained model to perform the cross-individual electroencephalogram signal classification task. The method achieves high-accuracy cross-individual electroencephalogram signal classification and is of great significance in fields such as intelligent human-computer interaction and medical health.

Description

Cross-individual electroencephalogram signal classification method based on minimum category confusion
Technical Field
The invention relates to the field of electroencephalogram signal processing, in particular to a cross-individual electroencephalogram signal classification method based on minimum category confusion.
Background
Electroencephalography (EEG) is a powerful tool for recording brain activity and can capture the brain state of a human being objectively and reliably. In addition, with the rapid development of wearable devices and dry electrode technology, the acquisition of electroencephalogram signals has become more convenient. Therefore, classification based on electroencephalogram signals has received increasing attention in recent years, for example in epilepsy detection, motor imagery, emotion recognition and sleep staging. Effective electroencephalogram signal classification methods are of great significance in fields such as intelligent human-computer interaction and medical health.
The classification of electroencephalogram signals by machine learning has been widely studied. The main steps are to design and extract features from the EEG and then train a classifier to perform the recognition task on the extracted features. Common EEG features include time-domain features, frequency-domain features and time-frequency features. Commonly used classifiers include traditional machine learning algorithms such as support vector machines (SVM) and K-nearest neighbors (KNN), as well as deep learning algorithms such as deep belief networks, convolutional neural networks and long short-term memory networks. However, these methods tend to ignore the spatial information embedded between different EEG channels.
Because of the significant differences in emotional response between individuals, existing models do not transfer the knowledge learned from other subjects well to new subjects; a large number of labeled samples must be obtained from the new subjects to retrain the model, which is time-consuming and not applicable to many practical scenarios. In recent years, some researchers have tried to solve the cross-individual electroencephalogram classification problem with domain adaptation techniques such as the domain-adversarial neural network (DANN) and the deep adaptation network (DAN). These methods achieve higher accuracy in the cross-individual electroencephalogram signal classification task by aligning the features of the training individuals (source domain) and the individuals to be predicted (target domain). However, to ensure feature alignment, they pay the cost of losing some class information, thereby reducing the discriminability of the features. Furthermore, when the individual differences are significant, these methods may lead to negative transfer, i.e., the knowledge learned in the source domain may negatively affect the prediction of the target domain.
Disclosure of Invention
The invention provides a cross-individual electroencephalogram signal classification method based on minimum category confusion to overcome the defects of the prior art, so that the discriminability of the features is preserved and negative transfer is avoided, thereby improving the accuracy of cross-individual electroencephalogram signal classification.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention relates to a cross-individual electroencephalogram signal classification method based on minimum category confusion, which is characterized by comprising the following steps:
step 1, acquiring the electroencephalogram signals of a batch of training individuals and the class labels corresponding to the electroencephalogram signals, acquiring the electroencephalogram signals of a batch of individuals to be predicted, extracting the frequency domain features of each frequency band in the electroencephalogram signals of the training individuals and the individuals to be predicted, carrying out standardization processing, and obtaining an input sample sequence, wherein any one sample is denoted x, with x ∈ ℝ^{n×d}, where n represents the number of channels, d represents the number of features per channel, and ℝ represents the real numbers;
step 2, constructing a GRU-MCC network model based on minimum category confusion;
step 2.1, establishing a feature extractor, and inputting the sample x into the feature extractor to obtain a depth feature h_f;
step 2.2, establishing a classifier, and inputting the depth feature h_f into the classifier to obtain an output result z;
step 2.3, inputting the output result z into a SoftMax function layer to obtain the probability value of the sample x for each class of label;
step 2.4, calculating the cross entropy loss of the source domain sample corresponding to the electroencephalogram signal of the training individual with the emotion label in the input sample sequence;
step 2.5, calculating the minimum class confusion loss of the target domain samples X_t corresponding to the electroencephalogram signals of the individuals to be predicted in the input sample sequence;
step 2.6, optimizing parameters of the GRU-MCC model by combining the cross entropy loss and the minimum category confusion loss to obtain a trained GRU-MCC network model;
and 3, classifying the electroencephalogram signals of a batch of individual samples to be predicted by using the trained GRU-MCC model.
The cross-individual electroencephalogram signal classification method is also characterized in that the feature extractor in step 2.1 consists of a gated recurrent unit and a fully connected layer, and processes a sample x according to the following process:
the sample x is passed through the gated recurrent unit to obtain a spatial characterization sequence h = [h_1, h_2, ..., h_i, ..., h_n], wherein h_i is the spatial characterization of the i-th channel;
the spatial characterization sequence h is expanded into a long vector h', and dimension reduction is carried out on it through the fully connected layer, so as to obtain the depth feature h_f.
The minimum class confusion loss in step 2.5 is calculated as follows:
step 2.5.1, probability adjustment:
the target domain samples X_t corresponding to the electroencephalogram signals of the individuals to be predicted in the input sample sequence are input into the feature extractor and the classifier to output a result Z^t;
a temperature adjustment strategy is adopted to calculate the class probability Ŷ^t of the target domain samples X_t, wherein any one element Ŷ_{ij}^t of the class probability Ŷ^t represents the probability that the i-th target domain sample belongs to the j-th class;
step 2.5.2, category correlation:
calculating the class correlation coefficient C_{jj'} of class j and class j', thereby obtaining the class correlation matrix C, wherein ŷ_{·j}^t denotes the j-th column of the class probability Ŷ^t and ŷ_{·j'}^t denotes its j'-th column;
step 2.5.3, uncertainty weighting:
calculating the entropy H(ŷ_{i·}^t) corresponding to the i-th target domain sample, wherein ŷ_{i·}^t denotes the i-th row of the class probability Ŷ^t;
calculating the weight W_{ii} of the i-th target domain sample by using a SoftMax function, and taking the diagonal matrix W formed by the weights W_{ii} as the weight matrix;
calculating C'_{jj'} = (ŷ_{·j}^t)^T W ŷ_{·j'}^t, thereby obtaining the weighted class correlation matrix C';
step 2.5.4, category normalization:
normalizing each column of the weighted class correlation matrix C' by using a normalization strategy to obtain the normalized class correlation matrix C̃;
Step 2.5.5, minimum category confusion:
summing the off-diagonal elements of the normalized class correlation matrix C̃ to obtain the minimum class confusion loss L_MCC(X_t | θ), where θ represents a parameter of the GRU-MCC model.
Compared with the prior art, the invention has the beneficial effects that:
1. The invention proposes optimizing the model parameters with a minimum class confusion loss. This loss quantifies how much a model trained on the source domain confuses different classes when classifying target domain samples; reducing this class confusion increases the transfer gain and thus enables high-accuracy cross-individual electroencephalogram signal classification.
2. The present invention constructs a gated recurrent unit (GRU) based feature extractor to capture spatial information between EEG channels. Specifically, the spatial information among channels is learned with a gated recurrent unit, which can capture long-range spatial dependencies and has the advantages of few parameters and easy training; then, in order not to lose effective information, the output is expanded into a high-dimensional vector and its dimension is reduced through a fully connected layer, obtaining a depth feature with stronger discriminative power and finally improving the classification accuracy of the electroencephalogram signals.
Drawings
FIG. 1 is a network structure diagram of GRU-MCC in accordance with the present invention;
FIG. 2 is a block diagram of a feature extractor of the GRU-MCC of the present invention;
FIG. 3 is a flow chart of the calculation of the minimum class confusion loss according to the present invention.
Detailed Description
In this embodiment, a cross-individual electroencephalogram signal classification method based on minimum category confusion includes the following steps:
Step 1, acquiring the electroencephalogram signals of a batch of training individuals and the class labels corresponding to the electroencephalogram signals, acquiring the electroencephalogram signals of a batch of individuals to be predicted, extracting the frequency domain features of each frequency band in the electroencephalogram signals of the training individuals and the individuals to be predicted, carrying out standardization processing, and obtaining an input sample sequence, wherein any one sample is denoted x, with x ∈ ℝ^{n×d}, where n represents the number of channels, d represents the number of features per channel, and ℝ represents the real numbers.
Step 1.1, the data required for the experiments are obtained from the public data set SEED, which records the electroencephalogram signals of 15 subjects while they watch emotional film clips. Each subject performed the experiment three times, and in each experiment watched 15 film clips of approximately 4 minutes each (5 positive, 5 neutral and 5 negative). The 62-channel EEG signals were acquired according to the international 10-20 system, the acquired signals were down-sampled to 200 Hz, and the EEG data were processed with a 0-75 Hz band-pass filter in order to filter out noise and remove artifacts.
The GRU-MCC model is trained with a leave-one-subject-out (LOSO) cross-validation strategy. The LOSO strategy uses one subject as the individual to be predicted and the other 14 subjects for training each time. This process is repeated so that each subject serves as the individual to be predicted once, as sketched in the example below. The electroencephalogram signals of the training individuals and the corresponding class labels are thus obtained, together with a batch of electroencephalogram signals of the individuals to be predicted.
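As an illustration of the LOSO protocol described above, the following Python sketch (a hypothetical helper, not part of the patent) enumerates the 15 train/predict splits over the SEED subjects.

```python
def loso_splits(subject_ids):
    """Yield (train_ids, test_id) pairs: each subject serves once as the individual to be predicted."""
    for test_id in subject_ids:
        train_ids = [s for s in subject_ids if s != test_id]
        yield train_ids, test_id

# 15 SEED subjects -> 15 folds, each with 14 training subjects and 1 subject to be predicted.
for train_ids, test_id in loso_splits(list(range(1, 16))):
    pass  # train GRU-MCC on train_ids, evaluate on test_id
```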
Step 1.2, differential entropy features of 5 sub-bands, namely delta (1-3 Hz), theta (4-7 Hz), alpha (8-13 Hz), beta (14-30 Hz) and gamma (31-50 Hz), are extracted from the electroencephalogram signals of the training individuals and the individuals to be predicted. The differential entropy features are Z-score standardized per individual to obtain the input sample sequence. For any differential entropy feature DE, the corresponding input sample x is obtained as follows:

x = (DE − μ) / σ ∈ ℝ^{62×5}    (1)

In formula (1), μ and σ represent the mean and standard deviation, respectively, of all differential entropy features of the individual, 62 is the number of channels, and 5 is the number of features per channel.
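A minimal sketch of the per-individual Z-score standardization of formula (1); the function name and the array layout (samples × 62 channels × 5 bands) are assumptions made for illustration.

```python
import numpy as np

def zscore_per_subject(de_features):
    """Standardize the differential entropy features of one individual, as in formula (1).
    de_features: array of shape (num_samples, 62, 5) - 62 channels, 5 frequency bands."""
    mu = de_features.mean()      # mean over all DE features of this individual
    sigma = de_features.std()    # standard deviation over all DE features of this individual
    return (de_features - mu) / sigma
```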
Step 2, a GRU-MCC network model based on Minimum Class Confusion (MCC) is constructed, as shown in FIG. 1. The model consists of a feature extractor and a classifier, and the cross entropy loss and the minimum class confusion loss are calculated simultaneously to optimize the model parameters.
Step 2.1, the feature extractor shown in FIG. 2 is constructed; it consists of a Gated Recurrent Unit (GRU) and a fully connected layer. First, the sample x is input to the GRU layer:

h = [h_1, h_2, ..., h_i, ..., h_n] = GRU(x)    (2)

In formula (2), h_i ∈ ℝ^{16} is the spatial characterization corresponding to the i-th channel; its dimension is 16. In order to make full use of the information of all channels, the spatial characterization sequence h is expanded into a long vector h' and its dimension is reduced through the fully connected layer, so as to obtain the depth feature h_f:

h_f = ReLU(W_f h' + b_f)    (3)

In formula (3), W_f and b_f represent the weight and bias, respectively, and ReLU denotes the rectified linear unit activation function, expressed as:

ReLU(x) = max(0, x)    (4)
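The feature extractor of formulas (2)-(4) can be sketched in PyTorch as follows, treating the 62 channels as the GRU sequence dimension. The hidden size of 16 follows the description; the 64-dimensional output of the fully connected layer is an assumed value, since the patent does not state the reduced dimension.

```python
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """GRU over the channel dimension followed by a fully connected layer, cf. formulas (2)-(4)."""
    def __init__(self, n_channels=62, n_features=5, hidden=16, out_dim=64):
        super().__init__()
        self.gru = nn.GRU(input_size=n_features, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(n_channels * hidden, out_dim)

    def forward(self, x):                    # x: (batch, 62, 5)
        h, _ = self.gru(x)                   # h: (batch, 62, 16), one state h_i per channel
        h_prime = h.reshape(h.size(0), -1)   # expand into the long vector h'
        return torch.relu(self.fc(h_prime))  # depth feature h_f, formula (3)
```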
Step 2.2, a linear transformation is used as the classifier, which is connected to the feature extractor and predicts the category of the electroencephalogram signal. The depth feature h_f is input into the classifier to obtain the output vector z:

z = W_c h_f + b_c ∈ ℝ^R    (5)

In formula (5), W_c and b_c are the weight and bias of the classifier, and R represents the number of categories; in this embodiment, R = 3.
Step 2.3, the output result z is input into a SoftMax function layer to obtain the probability value of the sample x for each class of label, calculated as follows:

P(j | x) = exp(z_j) / Σ_{r=1}^{R} exp(z_r)    (6)

In formula (6), P(j | x) represents the predicted probability that the sample x belongs to the j-th class.
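A matching sketch of the classifier of formulas (5)-(6); the 64-dimensional input mirrors the assumed output size of the feature extractor above.

```python
class Classifier(nn.Module):
    """Linear classifier head; R = 3 emotion classes in this embodiment, cf. formula (5)."""
    def __init__(self, in_dim=64, n_classes=3):
        super().__init__()
        self.linear = nn.Linear(in_dim, n_classes)

    def forward(self, h_f):
        return self.linear(h_f)   # logits z

# Class probabilities P(j | x) of formula (6):
# probs = torch.softmax(classifier(feature_extractor(x)), dim=1)
```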
Step 2.4, the cross entropy loss of the source domain samples, i.e. the samples corresponding to the electroencephalogram signals of the training individuals with category labels in the input sample sequence, is calculated. Given a batch of labeled source domain samples X_s, the cross entropy loss is:

X_s = {(x_i^s, l_i)}_{i=1}^{B_s}    (7)

L_CE(X_s | θ) = −(1/B_s) Σ_{i=1}^{B_s} log P(l_i | x_i^s)    (8)

In formulas (7) and (8), B_s represents the batch size of the source domain, l_i is the true label of the sample x_i^s, and θ is a parameter of the GRU-MCC model. In this embodiment, B_s = 200.
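A short sketch of the source-domain loss of formula (8), using the sketches above; note that nn.CrossEntropyLoss applies log-softmax internally, so it is fed the logits z rather than the SoftMax probabilities.

```python
ce_loss_fn = nn.CrossEntropyLoss()

def source_loss(feature_extractor, classifier, x_s, labels_s):
    """Cross-entropy loss L_CE over a labeled source batch (x_s, labels_s)."""
    z_s = classifier(feature_extractor(x_s))  # logits for the B_s source samples
    return ce_loss_fn(z_s, labels_s)
```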
Step 2.5, the minimum class confusion loss of the target domain samples X_t, i.e. the samples corresponding to the electroencephalogram signals of the individuals to be predicted in the input sample sequence, is calculated. The calculation of the minimum class confusion loss is shown in FIG. 3 and proceeds as follows.
Step 2.5.1, probability adjustment: the target domain samples X_t corresponding to the electroencephalogram signals of the individuals to be predicted in the input sample sequence are input into the feature extractor and the classifier, and the output result is Z^t ∈ ℝ^{B_t×R}, where B_t represents the batch size of the target domain. In this embodiment, B_t = 200.
Deep models tend to make overly confident predictions. To reduce this tendency, a temperature adjustment strategy is adopted to calculate the class probability of the target domain samples X_t:

Ŷ_{ij}^t = exp(Z_{ij}^t / T) / Σ_{j'=1}^{R} exp(Z_{ij'}^t / T)    (9)

In formula (9), Ŷ_{ij}^t represents the probability that the i-th target domain sample belongs to the j-th class, and T is the temperature hyperparameter.
Step 2.5.2, category correlation: the class correlation coefficient of class j and class j' is defined as follows:

C_{jj'} = (ŷ_{·j}^t)^T ŷ_{·j'}^t    (10)

In formula (10), ŷ_{·j}^t denotes the j-th column of the class probability Ŷ^t and ŷ_{·j'}^t denotes its j'-th column. C ∈ ℝ^{R×R} is the class correlation matrix; its element C_{jj'} is a rough estimate of the class confusion between class j and class j', measuring the probability that the same samples fall into both classes.
Step 2.5.3, uncertainty weighting: different samples contribute differently to class confusion. For example, a sample whose prediction is nearly uniformly distributed (without distinct peaks) contributes less, while a sample whose prediction has a few distinct peaks indicates that the classifier hesitates between ambiguous classes and is more important for reflecting class confusion. Therefore, an entropy-based weighting mechanism is introduced, giving higher weight to the more certain samples:

H(ŷ_{i·}^t) = −Σ_{j=1}^{R} Ŷ_{ij}^t log Ŷ_{ij}^t    (11)

W_{ii} = B_t · exp(−H(ŷ_{i·}^t)) / Σ_{i'=1}^{B_t} exp(−H(ŷ_{i'·}^t))    (12)

C'_{jj'} = (ŷ_{·j}^t)^T W ŷ_{·j'}^t    (13)

In formulas (11) to (13), ŷ_{i·}^t denotes the i-th row of the class probability Ŷ^t and ŷ_{i'·}^t denotes its i'-th row, W_{ii} is the weight of the i-th target domain sample, the diagonal matrix W formed by the weights W_{ii} serves as the weight matrix, and C' is the weighted class correlation matrix.
Step 2.5.4, category normalization: each column of the weighted class correlation matrix C' is normalized with a normalization strategy to obtain the normalized class correlation matrix C̃:

C̃_{jj'} = C'_{jj'} / Σ_{j''=1}^{R} C'_{j''j'}    (14)
Step 2.5.5, minimum category confusion: the off-diagonal elements of the matrix C̃ represent cross-class confusion. For the prediction of the target domain samples, the ideal case is that no sample is assigned to two classes at the same time, i.e. C̃ is close to a diagonal matrix. The minimum class confusion loss is therefore defined as:

L_MCC(X_t | θ) = (1/R) Σ_{j=1}^{R} Σ_{j'≠j} |C̃_{jj'}|    (15)
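Steps 2.5.1-2.5.5 can be sketched as a single loss function as follows. The temperature value of 2.0 and the exact softmax form of the certainty weights in formula (12) are assumptions made for illustration: the patent only states that a temperature hyperparameter T is used and that the weights are obtained with a SoftMax function favouring low-entropy samples.

```python
def mcc_loss(z_t, temperature=2.0):
    """Minimum class confusion loss for a batch of target-domain logits z_t of shape (B_t, R)."""
    B_t, R = z_t.shape
    y_hat = torch.softmax(z_t / temperature, dim=1)          # probability rescaling, formula (9)
    entropy = -(y_hat * torch.log(y_hat + 1e-8)).sum(dim=1)  # per-sample entropy H, formula (11)
    w = B_t * torch.softmax(-entropy, dim=0)                 # certainty weights W_ii, formula (12)
    corr = y_hat.t() @ (w.unsqueeze(1) * y_hat)              # weighted class correlation C', formula (13)
    corr = corr / corr.sum(dim=0, keepdim=True)              # column-wise normalization, formula (14)
    off_diag = corr - torch.diag(torch.diag(corr))           # keep only the cross-class confusion terms
    return off_diag.abs().sum() / R                          # minimum class confusion loss, formula (15)
```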
and 2.6, optimizing parameters of the GRU-MCC model by combining cross entropy loss and minimum category confusion loss to obtain a trained GRU-MCC network model. In the training process, a batch of labeled source domain samples X are input into the model at the same timesAnd a collection of unlabeled target domain samples XtSeparately calculating cross entropy loss and minimum class confusion loss, and optimizing model parameters according to the following formulaNumber:
Figure BDA0003048556380000076
in the formula (16), α represents the learning rate, and λ represents the balance between the two types of loss. In this embodiment, α is 0.001 and λ is 1.
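A minimal sketch of one optimization step of formula (16), assuming the FeatureExtractor, Classifier, source_loss and mcc_loss sketches above and plain stochastic gradient descent, which matches the update rule as written.

```python
feature_extractor, classifier = FeatureExtractor(), Classifier()
params = list(feature_extractor.parameters()) + list(classifier.parameters())
optimizer = torch.optim.SGD(params, lr=0.001)  # learning rate alpha = 0.001
lam = 1.0                                      # balance lambda between the two losses

def train_step(x_s, labels_s, x_t):
    """One update on a labeled source batch (x_s, labels_s) and an unlabeled target batch x_t."""
    optimizer.zero_grad()
    loss_ce = source_loss(feature_extractor, classifier, x_s, labels_s)
    z_t = classifier(feature_extractor(x_t))   # target-domain logits Z^t
    loss = loss_ce + lam * mcc_loss(z_t)
    loss.backward()
    optimizer.step()
    return loss.item()
```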
Step 3, the electroencephalogram signals of a batch of individuals to be predicted are classified with the trained GRU-MCC model. The final performance of the model is evaluated by the average accuracy and standard deviation over all individuals to be predicted.
In one implementation, the GRU-MCC model is compared with SVM, DANN, DAN, GRU (removing the minimum class confusion loss) and MCC (replacing the feature extractor with a fully connected layer). The average accuracy and standard deviation over the 15 subjects to be predicted are shown in Table 1.

Table 1. Classification performance of the different methods (average accuracy and standard deviation over the 15 subjects to be predicted).
The results show that the proposed method is superior to the traditional machine learning method SVM and to the feature-alignment-based domain adaptation methods DANN and DAN, improving the accuracy of cross-individual electroencephalogram signal classification. Compared with GRU, introducing the minimum class confusion loss improves the accuracy on the cross-individual electroencephalogram signal classification task by 14.45%. Compared with MCC, the feature extractor designed by the invention improves the accuracy on this task by 2.69%. In addition, the standard deviation of the proposed method is the smallest, 5.27, which indicates that GRU-MCC is more stable and generalizes better across individuals.

Claims (3)

1. A cross-individual electroencephalogram signal classification method based on minimum category confusion is characterized by comprising the following steps:
step 1, acquiring the electroencephalogram signals of a batch of training individuals and the class labels corresponding to the electroencephalogram signals, acquiring the electroencephalogram signals of a batch of individuals to be predicted, extracting the frequency domain features of each frequency band in the electroencephalogram signals of the training individuals and the individuals to be predicted, carrying out standardization processing, and obtaining an input sample sequence, wherein any one sample is denoted x, with x ∈ ℝ^{n×d}, where n represents the number of channels, d represents the number of features per channel, and ℝ represents the real numbers;
step 2, constructing a GRU-MCC network model based on minimum category confusion;
step 2.1, establishing a feature extractor, and inputting the sample x into the feature extractor to obtain a depth feature h_f;
step 2.2, establishing a classifier, and inputting the depth feature h_f into the classifier to obtain an output result z;
step 2.3, inputting the output result z into a SoftMax function layer to obtain the probability value of the sample x for each class of label;
step 2.4, calculating the cross entropy loss of the source domain sample corresponding to the electroencephalogram signal of the training individual with the emotion label in the input sample sequence;
step 2.5, calculating the minimum class confusion loss of the target domain samples X_t corresponding to the electroencephalogram signals of the individuals to be predicted in the input sample sequence;
step 2.6, optimizing parameters of the GRU-MCC model by combining the cross entropy loss and the minimum category confusion loss to obtain a trained GRU-MCC network model;
and 3, classifying the electroencephalogram signals of a batch of individual samples to be predicted by using the trained GRU-MCC model.
2. The cross-individual electroencephalogram signal classification method according to claim 1, wherein the feature extractor in step 2.1 consists of a gated recurrent unit and a fully connected layer and processes the sample x according to the following process:
the sample x is passed through the gated recurrent unit to obtain a spatial characterization sequence h = [h_1, h_2, ..., h_i, ..., h_n], wherein h_i is the spatial characterization of the i-th channel;
the spatial characterization sequence h is expanded into a long vector h', and dimension reduction is carried out on it through the fully connected layer, so as to obtain the depth feature h_f.
3. The cross-individual electroencephalogram signal classification method according to claim 1, wherein the minimum class confusion loss in step 2.5 is calculated as follows:
step 2.5.1, probability adjustment:
the target domain samples X_t corresponding to the electroencephalogram signals of the individuals to be predicted in the input sample sequence are input into the feature extractor and the classifier to output a result Z^t;
a temperature adjustment strategy is adopted to calculate the class probability Ŷ^t of the target domain samples X_t, wherein any one element Ŷ_{ij}^t of the class probability Ŷ^t represents the probability that the i-th target domain sample belongs to the j-th class;
step 2.5.2, category correlation:
calculating the class correlation coefficient C_{jj'} of class j and class j', thereby obtaining the class correlation matrix C, wherein ŷ_{·j}^t denotes the j-th column of the class probability Ŷ^t and ŷ_{·j'}^t denotes its j'-th column;
step 2.5.3, uncertainty weighting:
calculating the entropy H(ŷ_{i·}^t) corresponding to the i-th target domain sample, wherein ŷ_{i·}^t denotes the i-th row of the class probability Ŷ^t;
calculating the weight W_{ii} of the i-th target domain sample by using a SoftMax function, and taking the diagonal matrix W formed by the weights W_{ii} as the weight matrix;
calculating C'_{jj'} = (ŷ_{·j}^t)^T W ŷ_{·j'}^t, thereby obtaining the weighted class correlation matrix C';
step 2.5.4, category normalization:
normalizing each column of the weighted class correlation matrix C' by using a normalization strategy to obtain the normalized class correlation matrix C̃;
step 2.5.5, minimum category confusion:
summing the off-diagonal elements of the normalized class correlation matrix C̃ to obtain the minimum class confusion loss L_MCC(X_t | θ), where θ represents a parameter of the GRU-MCC model.
CN202110479311.0A 2021-04-30 2021-04-30 Cross-individual electroencephalogram signal classification method based on minimum category confusion Pending CN113177482A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110479311.0A CN113177482A (en) 2021-04-30 2021-04-30 Cross-individual electroencephalogram signal classification method based on minimum category confusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110479311.0A CN113177482A (en) 2021-04-30 2021-04-30 Cross-individual electroencephalogram signal classification method based on minimum category confusion

Publications (1)

Publication Number Publication Date
CN113177482A true CN113177482A (en) 2021-07-27

Family

ID=76925833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110479311.0A Pending CN113177482A (en) 2021-04-30 2021-04-30 Cross-individual electroencephalogram signal classification method based on minimum category confusion

Country Status (1)

Country Link
CN (1) CN113177482A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115251845A (en) * 2022-07-28 2022-11-01 纽锐思(苏州)医疗科技有限公司 Sleep monitoring method for processing brain wave signals based on TB-TF-BiGRU model

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105559777A (en) * 2016-03-17 2016-05-11 北京工业大学 Electroencephalographic identification method based on wavelet packet and LSTM-type RNN neural network
CN108985378A (en) * 2018-07-20 2018-12-11 天津师范大学 A kind of domain adaptive approach based on mixing interleaving depth network
CN109583346A (en) * 2018-11-21 2019-04-05 齐鲁工业大学 EEG feature extraction and classifying identification method based on LSTM-FC
CN110135244A (en) * 2019-04-02 2019-08-16 杭州电子科技大学 It is a kind of based on brain-machine cooperative intelligent expression recognition method
CN110517666A (en) * 2019-01-29 2019-11-29 腾讯科技(深圳)有限公司 Audio identification methods, system, machinery equipment and computer-readable medium
CN110575163A (en) * 2019-08-01 2019-12-17 深圳大学 Method and device for detecting driver distraction
CN110890155A (en) * 2019-11-25 2020-03-17 中国科学技术大学 Multi-class arrhythmia detection method based on lead attention mechanism
CN111950455A (en) * 2020-08-12 2020-11-17 重庆邮电大学 Motion imagery electroencephalogram characteristic identification method based on LFFCNN-GRU algorithm model
CN112022619A (en) * 2020-09-07 2020-12-04 西北工业大学 Multi-mode information fusion sensing system of upper limb rehabilitation robot
CN112100387A (en) * 2020-11-13 2020-12-18 支付宝(杭州)信息技术有限公司 Training method and device of neural network system for text classification
CN112274162A (en) * 2020-09-18 2021-01-29 杭州电子科技大学 Cross-tested EEG fatigue state classification method based on generation of anti-domain self-adaption
CN112336354A (en) * 2020-11-06 2021-02-09 山西三友和智慧信息技术股份有限公司 Epilepsy monitoring method based on EEG signal
CN112465069A (en) * 2020-12-15 2021-03-09 杭州电子科技大学 Electroencephalogram emotion classification method based on multi-scale convolution kernel CNN

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105559777A (en) * 2016-03-17 2016-05-11 北京工业大学 Electroencephalographic identification method based on wavelet packet and LSTM-type RNN neural network
CN108985378A (en) * 2018-07-20 2018-12-11 天津师范大学 A kind of domain adaptive approach based on mixing interleaving depth network
CN109583346A (en) * 2018-11-21 2019-04-05 齐鲁工业大学 EEG feature extraction and classifying identification method based on LSTM-FC
CN110517666A (en) * 2019-01-29 2019-11-29 腾讯科技(深圳)有限公司 Audio identification methods, system, machinery equipment and computer-readable medium
CN110135244A (en) * 2019-04-02 2019-08-16 杭州电子科技大学 It is a kind of based on brain-machine cooperative intelligent expression recognition method
CN110575163A (en) * 2019-08-01 2019-12-17 深圳大学 Method and device for detecting driver distraction
CN110890155A (en) * 2019-11-25 2020-03-17 中国科学技术大学 Multi-class arrhythmia detection method based on lead attention mechanism
CN111950455A (en) * 2020-08-12 2020-11-17 重庆邮电大学 Motion imagery electroencephalogram characteristic identification method based on LFFCNN-GRU algorithm model
CN112022619A (en) * 2020-09-07 2020-12-04 西北工业大学 Multi-mode information fusion sensing system of upper limb rehabilitation robot
CN112274162A (en) * 2020-09-18 2021-01-29 杭州电子科技大学 Cross-tested EEG fatigue state classification method based on generation of anti-domain self-adaption
CN112336354A (en) * 2020-11-06 2021-02-09 山西三友和智慧信息技术股份有限公司 Epilepsy monitoring method based on EEG signal
CN112100387A (en) * 2020-11-13 2020-12-18 支付宝(杭州)信息技术有限公司 Training method and device of neural network system for text classification
CN112465069A (en) * 2020-12-15 2021-03-09 杭州电子科技大学 Electroencephalogram emotion classification method based on multi-scale convolution kernel CNN

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
YING JIN ET AL: "Minimum Class Confusion for Versatile Domain Adaptation", arXiv:1912.03699v3 *
张向荣 et al.: "Frontier Technologies in Artificial Intelligence Series: Pattern Recognition" (《人工智能前沿技术丛书 模式识别》), 30 September 2019 *
胡章芳 et al.: "EEG Emotion Recognition Based on 3DC-BGRU" (基于3DC-BGRU的脑电情感识别), Computer Engineering and Applications (《计算机工程与应用》) *
闫美阳 et al.: "Dual-Stream Deep Transfer Learning with Multi-Source Domain Confusion" (多源域混淆的双流深度迁移学习), Journal of Image and Graphics (《中国图象图形学报》) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115251845A (en) * 2022-07-28 2022-11-01 纽锐思(苏州)医疗科技有限公司 Sleep monitoring method for processing brain wave signals based on TB-TF-BiGRU model
CN115251845B (en) * 2022-07-28 2024-05-03 纽锐思(苏州)医疗科技有限公司 Sleep monitoring method for processing brain wave signals based on TB-TF-BiGRU model

Similar Documents

Publication Publication Date Title
CN112766355B (en) Electroencephalogram signal emotion recognition method under label noise
CN111444960A (en) Skin disease image classification system based on multi-mode data input
CN112800998B (en) Multi-mode emotion recognition method and system integrating attention mechanism and DMCCA
Mensch et al. Learning neural representations of human cognition across many fMRI studies
CN112244873A (en) Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network
CN114052735B (en) Deep field self-adaption-based electroencephalogram emotion recognition method and system
CN113011239B (en) Motor imagery classification method based on optimal narrow-band feature fusion
CN114176607B (en) Electroencephalogram signal classification method based on vision transducer
CN110289081B (en) Epilepsia detection method based on deep network stack model self-adaptive weighting feature fusion
CN110598793A (en) Brain function network feature classification method
CN114564990B (en) Electroencephalogram signal classification method based on multichannel feedback capsule network
CN115100709B (en) Feature separation image face recognition and age estimation method
CN113392733B (en) Multi-source domain self-adaptive cross-tested EEG cognitive state evaluation method based on label alignment
CN114595725B (en) Electroencephalogram signal classification method based on addition network and supervised contrast learning
CN111513717A (en) Method for extracting brain functional state
Ning et al. Cross-subject EEG emotion recognition using domain adaptive few-shot learning networks
Zhang et al. Classification of canker on small datasets using improved deep convolutional generative adversarial networks
CN115221969A (en) Motor imagery electroencephalogram signal identification method based on EMD data enhancement and parallel SCN
Deepthi et al. An intelligent Alzheimer’s disease prediction using convolutional neural network (CNN)
CN113177482A (en) Cross-individual electroencephalogram signal classification method based on minimum category confusion
Haroon et al. An ensemble classification and binomial cumulative based PCA for diagnosis of Parkinson’s disease and autism spectrum disorder
Schwabedal et al. Automated classification of sleep stages and EEG artifacts in mice with deep learning
Rajput et al. A transfer learning-based brain tumor classification using magnetic resonance images
CN116746929A (en) Electroencephalogram emotion recognition method based on mixed enhancement and time sequence contrast learning
CN116821764A (en) Knowledge distillation-based multi-source domain adaptive EEG emotion state classification method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20210727)