CN114757235A - Emotion recognition method based on common and specific electroencephalogram feature mining - Google Patents

Emotion recognition method based on common and specific electroencephalogram feature mining

Info

Publication number
CN114757235A
CN114757235A
Authority
CN
China
Prior art keywords
electroencephalogram
emotion
matrix
common
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210512240.4A
Other languages
Chinese (zh)
Inventor
彭勇
刘鸿刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202210512240.4A priority Critical patent/CN114757235A/en
Publication of CN114757235A publication Critical patent/CN114757235A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02Preprocessing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses an emotion recognition method based on common and specific electroencephalogram feature mining, which comprises the following steps. Step 1: acquire electroencephalogram emotion data. Step 2: preprocess the electroencephalogram emotion data. Step 3: establish an objective function for an emotion recognition model based on common and specific electroencephalogram features. Step 4: derive the expressions to be iteratively optimized from the objective function. Step 5: input the electroencephalogram data preprocessed in step 2 into the objective function and iterate with the update expressions obtained in step 4 to obtain the subject's emotional-state label. The invention exploits, in a targeted way, both the specific features associated with each emotional state and the features common to all emotional states, so that the model recognizes emotional states more accurately; it can also uncover the relationship between an emotional state and its specific features, thereby capturing emotional states more efficiently.

Description

Emotion recognition method based on common and specific electroencephalogram feature mining
Technical Field
The invention belongs to the technical field of electroencephalogram signal processing, and particularly relates to an emotion recognition method based on common and specific electroencephalogram feature mining.
Background
Human emotion can be observed externally through factors such as facial expression, tone of voice, gaze, and body movement, or through internal physiological signals such as electroencephalogram (EEG) signals analyzed with instruments; these are the two major data sources in current emotion recognition research. However, external expression can be "reprocessed" by the person: it can be disguised or misleading, may not reflect the emotion actually felt, and can therefore mislead an emotion recognition model into wrong conclusions. Physiological signals, by contrast, are not under conscious control, cannot "lie", and are comparatively trustworthy, so choosing physiological signals for emotion recognition avoids, to some extent, factors that cause recognition errors.
However, electroencephalogram signals are highly sensitive: EEG recorded in the same emotional state may differ between subjects, and even between different time periods of the same subject. On the other hand, a conventional semi-supervised least-squares linear regression model is trained from partly labeled and partly unlabeled sample data and then predicts the unlabeled part. In an ordinary least-squares linear regression model, a weight matrix measures the importance of each feature dimension of the samples within one time period; this importance is corrected continuously during training, and the predicted labels of the unlabeled samples are obtained on the basis of the weight matrix. The invention targets the more difficult cross-time-period recognition task in order to improve model performance.
Disclosure of Invention
The invention aims to provide an emotion recognition method based on common and specific electroencephalogram feature mining.
The emotion recognition method based on common and specific electroencephalogram feature mining is characterized by comprising the following steps:
Step 1, collect electroencephalogram data of subjects to obtain electroencephalogram emotion data of different emotion types.

Step 2, preprocess the electroencephalogram emotion data to obtain a labeled data set.

Step 3, establish the objective function.

Add to the least-squares linear regression model the ℓ2,1-norm and the ℓ1-norm of the weight matrix, applied to its rows and columns respectively, to obtain an extended semi-supervised least-squares linear regression model; meanwhile, obtain an adjacency matrix with the K-nearest-neighbor algorithm from the feature information contained in the samples; the objective function is then obtained.

Step 4, jointly and iteratively optimize the objective function using the labeled data set, updating in turn the expressions of the weight matrix W and the predicted label Ft in the model.

Step 5, input the preprocessed electroencephalogram data of the subject under test into the iteratively optimized objective function to obtain the subject's emotion type.
Preferably, in the step 1, electroencephalogram emotion data of different emotion types are obtained by inducing a subject to generate different emotion changes and collecting electroencephalogram data of the subject.
Preferably, the preprocessing includes sampling the electroencephalogram data at a fixed frequency, and filtering noise and artifacts from the sampled electroencephalogram data by a band-pass filter.
Preferably, the preprocessing further comprises classifying the electroencephalogram data subjected to noise and artifact filtering according to n frequency bands, and respectively calculating differential entropy under each frequency band, wherein the differential entropy is used as electroencephalogram characteristics in the sample matrix. n is the number of divided frequency bands.
Preferably, the band-pass filter is a 1 Hz-75 Hz band-pass filter.
Preferably, the EEG data after noise and artifact filtering are divided into 5 frequency bands: Delta (1-4 Hz), Theta (4-8 Hz), Alpha (8-14 Hz), Beta (14-31 Hz) and Gamma (31-50 Hz).
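As an illustration of this band division, the split into the five classical bands can be sketched as follows; the FFT masking used here is only a stand-in for the band-pass filtering actually described, and the function name is hypothetical:

```python
import numpy as np

# Hypothetical sketch: split a sampled EEG signal into the five classical
# bands with an FFT mask. The patent itself uses a 1-75 Hz band-pass filter;
# the exact filter design is not specified, so this is only an illustration.
BANDS = {"Delta": (1, 4), "Theta": (4, 8), "Alpha": (8, 14),
         "Beta": (14, 31), "Gamma": (31, 50)}

def split_bands(signal, fs=200):
    """Return a dict mapping band name -> band-limited time signal."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)   # keep only this band's bins
        out[name] = np.fft.irfft(spectrum * mask, n=len(signal))
    return out

# Example: a 10 Hz sine should land almost entirely in the Alpha band.
fs = 200
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 10 * t)
bands = split_bands(x, fs)
```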
Preferably, the semi-supervised least-squares linear regression model obtained in step 3 is:

min_{W,F} ||XW − F||_F^2 + λ1·||W||_1 + λ2·||W||_{2,1} + β·tr(Fᵀ L2 F)

wherein X = [Xs; Xt] ∈ R^{n×p} consists of two parts: Xs ∈ R^{ns×p} is the labeled feature matrix and Xt ∈ R^{nt×p} is the feature matrix of the subject under test; p denotes the dimension of the feature matrix; ns and nt are the numbers of samples in the training set and the test set respectively; W ∈ R^{p×c} is the weight matrix. F = [Ys; Ft] ∈ R^{n×c} also consists of two parts: Ys holds the labels of the training-set feature matrix, and Ft is the predicted label of the subject's emotion category; n = ns + nt represents the total number of samples; F = XW. L2 = L* − L, where L is the adjacency matrix obtained by the KNN algorithm on the sample matrix X and L* is the diagonal matrix with elements L*ii = Σj Lij. The ℓ1-norm of the weight matrix W represents the interpretation of the features specific to individual emotion labels; the ℓ2,1-norm of W represents the interpretation of the features common to the emotion labels; λ1, λ2 and β are preset model parameters; tr(·) is the trace operation of the matrix.
Preferably, in step 4, the weight matrix and the prediction label are initialized before the objective function is optimized: each element of the prediction label is initialized to 1/c, where c is the total number of emotion-label categories, and the weight matrix is initialized to (XᵀX + γI)⁻¹XᵀY.
Preferably, the specific method of the joint iteration in step 4 is as follows.

Fix the prediction label Ft and update the weight matrix W; the objective function is:

min_W ||XW − F||_F^2 + λ1·||W||_1 + λ2·||W||_{2,1}

Fix the weight matrix W and update the prediction label Ft; the objective function is:

min_{Ft} ||XtW − Ft||_F^2 + β·tr(Fᵀ L2 F),  s.t. Ft ≥ 0, Ft·1 = 1

Solving the above formulas yields the prediction label, the specific features corresponding to each emotional state, and the features common to all emotional states.
The invention has the beneficial effects that:
1. In a linear regression model, each column of the weight matrix W acts on one label class, and each element of that column measures how much the corresponding feature of the sample matrix contributes to the final emotion; the invention therefore adds the ℓ1-norm of W to the model to reinforce this interpretation. Similarly, because emotional states have not only features that matter specifically to themselves (unique features) but also common features that are meaningful to all emotions, the invention also adds the ℓ2,1-norm of W to the model. Since, over the EEG data as a whole, samples of the same class lie close to each other, the invention lets the model obtain an adjacency matrix from the sample features alone through the K-nearest-neighbor algorithm. The method thus departs from the traditional approach: it studies the specific features in a targeted way while letting the common features benefit all emotional states, which helps improve the model's emotion recognition performance.
2. Through iterative optimization of the model, the method performs more accurate and effective emotion recognition from the specific features of each emotional state in the subject's EEG and the features common to all emotional states, so that more effective follow-up actions can be taken for the subject according to the recognition result.
Drawings
FIG. 1 is a flow chart of the identification process of the present invention.
FIG. 2 is a schematic representation of feature selection via the ℓ2,1-norm and ℓ1-norm of the weight matrix.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
As shown in FIG. 1, an emotion recognition method based on common and specific electroencephalogram feature mining comprises the following steps:
Step 1: acquire the electroencephalogram emotion data.
The subject wears an EEG cap whose leads contact the corresponding brain areas; c film clips with clear emotional tendencies are played at different times to induce changes of emotion in the subject, and the collected EEG data form the original emotion data set. In this embodiment there are four emotion categories: sad, fear, happy and calm. The EEG data collected in one time period are taken as the test data of the subject under test; the EEG data collected in the remaining time periods are annotated with emotion labels and used to determine the objective function.
By playing film clips with clear emotional tendencies at different times, this technical scheme makes the collected EEG data more comprehensive and avoids the situation where different time periods express the same emotion with different EEG data, which would affect later recognition; this lays the foundation for accurately identifying the subject's emotional state from the EEG data at a later stage.
Step 2: preprocess the electroencephalogram emotion data.
Preprocess the EEG data collected in step 1: the EEG data of each subject correspond to a sample matrix X, each sample matrix has a corresponding emotion-label vector y, and there are c emotions in total. Select two different sample matrices and use them respectively as the data Xs with emotion labels and the data Xt without emotion labels.
specifically, in this embodiment, the acquired electroencephalogram data are sampled at a sampling rate of 200Hz, then noise and artifacts are filtered by a band-pass filter of 1Hz to 75Hz, and Differential Entropy (DE) of the acquired electroencephalogram data is calculated as a sample matrix X in 5 frequency bands (Delta (1 to 4Hz), Theta (4 to 8Hz), Alpha (8 to 14Hz), Beta (14 to 31Hz), and Gamma (31 to 50Hz)), respectively, where the calculation formula is as follows:
Figure BDA0003638452990000041
wherein σ is a standard deviation of the probability density function; μ is the expectation of a probability density function.
With the above technical solution, it can be seen that the differential entropy characteristic is essentially a logarithmic form of the power spectral density characteristic, i.e.
Figure BDA0003638452990000042
Therefore, the signal-to-noise ratio can be improved through the preprocessing of the electroencephalogram signals, the preprocessing effect of data is further improved, interference is reduced, and a foundation is laid for accurate later-stage identification.
And the label vector y is an emotion label corresponding to the sample matrix X.
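A minimal sketch of the differential-entropy feature under the Gaussian assumption above (the function name is hypothetical):

```python
import numpy as np

# Differential entropy (DE) of a band-limited signal modeled as Gaussian
# with variance sigma^2: DE = 0.5 * log(2 * pi * e * sigma^2).
def differential_entropy(band_signal):
    sigma2 = np.var(band_signal)
    return 0.5 * np.log(2 * np.pi * np.e * sigma2)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, 100_000)  # sigma = 2, so sigma^2 = 4
de = differential_entropy(x)
```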
Step 3: establish the objective function.
Using the feature information contained in the samples, an adjacency matrix is obtained with the K-nearest-neighbor algorithm; meanwhile, because the rows and columns of the weight matrix of the least-squares linear regression model interpret the sample labels differently (see FIG. 2), the ℓ2,1-norm and the ℓ1-norm of the weight matrix W are added respectively, giving an extended semi-supervised least-squares linear regression model from which the objective function is obtained.
Specifically, the extended semi-supervised least-squares linear regression model is:

min_{W,F} ||XW − F||_F^2 + λ1·||W||_1 + λ2·||W||_{2,1} + β·tr(Fᵀ L2 F)

wherein X = [Xs; Xt] ∈ R^{n×p} consists of two parts: Xs ∈ R^{ns×p} is the labeled feature matrix, and Xt ∈ R^{nt×p} is the feature matrix of the test data of the subject under test (the feature matrix of the unlabeled samples); p denotes the dimension of the feature matrix; ns and nt are the numbers of labeled and unlabeled samples respectively; W ∈ R^{p×c} is the weight matrix. F = [Ys; Ft] ∈ R^{n×c} likewise consists of two parts: Ys holds the labels of the training-set feature matrix in one-hot form, and Ft is the predicted label of the subject's emotion category (i.e. the predicted label of the unlabeled feature matrix); n = ns + nt represents the total number of samples; c represents the total number of emotion-label categories; F = XW. L2 = L* − L, where L is the adjacency matrix obtained by the KNN algorithm on the sample matrix X and L* is the diagonal matrix with elements L*ii = Σj Lij. The ℓ1-norm of the weight matrix W represents the interpretation of the features specific to individual emotion labels; the ℓ2,1-norm of W represents the interpretation of the features common to the emotion labels; λ1, λ2 and β are parameters; tr(·) is the trace operation of the matrix. When the model is used, the preprocessed EEG data of the subject under test are input as the feature matrix Xt, and the model yields the predicted label Ft.
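The ingredients of this model can be sketched numerically as follows; the symmetrisation of the KNN graph and the function names are assumptions, since the text gives only the objective:

```python
import numpy as np

# Hedged sketch: a KNN adjacency matrix L built from the sample matrix X,
# the graph Laplacian L2 = L* - L, and the value of the objective
# ||XW - F||_F^2 + lam1*||W||_1 + lam2*||W||_{2,1} + beta*tr(F^T L2 F).
def knn_adjacency(X, k=3):
    n = X.shape[0]
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)           # a sample is not its own neighbor
    L = np.zeros((n, n))
    for i in range(n):
        L[i, np.argsort(d[i])[:k]] = 1.0  # connect each sample to its k NNs
    return np.maximum(L, L.T)             # symmetrise (an assumption)

def objective(X, W, F, lam1, lam2, beta, k=3):
    L = knn_adjacency(X, k)
    L2 = np.diag(L.sum(axis=1)) - L       # graph Laplacian L* - L
    fit = np.linalg.norm(X @ W - F) ** 2
    l1 = np.abs(W).sum()                  # l1-norm: label-specific features
    l21 = np.linalg.norm(W, axis=1).sum() # l2,1-norm: common features
    graph = np.trace(F.T @ L2 @ F)        # graph smoothness of the labels
    return fit + lam1 * l1 + lam2 * l21 + beta * graph
```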
Step 4: iteratively optimize the objective function.

According to the objective function of step 3, iteratively optimize the label matrix Ft to be predicted and the weight matrix W in the model, multiple times in sequence. Before the objective function is optimized, the prediction label and the weight matrix must be initialized: each label element of the unlabeled sample matrix is initialized to 1/c, and the weight matrix is initialized to (XᵀX + γI)⁻¹XᵀY, where γ is a preset initialization parameter.
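The initialization just described, as a brief sketch (the function name is hypothetical):

```python
import numpy as np

# Initialization from the text: every element of the unlabeled prediction
# matrix Ft starts at 1/c, and W starts at the ridge solution
# (X^T X + gamma*I)^(-1) X^T Y, with gamma a preset parameter.
def initialize(X, Y, n_t, gamma=1.0):
    p = X.shape[1]
    c = Y.shape[1]
    Ft0 = np.full((n_t, c), 1.0 / c)
    W0 = np.linalg.solve(X.T @ X + gamma * np.eye(p), X.T @ Y)
    return W0, Ft0
```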
The specific iterative method is as follows:
Fixing the prediction label matrix Ft, the objective function for updating the weight matrix W is:

min_W ||XW − F||_F^2 + λ1·||W||_1 + λ2·||W||_{2,1}

First, this is a convex optimization problem, solved using the proximal gradient algorithm. Write the objective as:

F(W) = f(W) + g(W)

where f(W) and g(W) are both convex functions on a Hilbert space, and f(W) has a Lipschitz-continuous gradient satisfying:

||∇f(W + ΔW) − ∇f(W)||_F ≤ L_f·||ΔW||_F

where L_f is the Lipschitz constant and ΔW = W_{t+1} − W_t. A second-order expansion of f(W) yields:

f(W) ≤ f(W_t) + ⟨∇f(W_t), W − W_t⟩ + (L_f/2)·||W − W_t||_F^2
Let:

G(W^(t)) = W^(t) − (1/L_f)·∇f(W^(t))

The optimization problem for the weight matrix W can then be written as:

W_{t+1} = argmin_W (L_f/2)·||W − G(W^(t))||_F^2 + g(W)

For the optimization model proposed by the invention, f(W) and g(W) are specifically:

f(W) = ||XW − F||_F^2,  g(W) = λ1·||W||_1 + λ2·||W||_{2,1}

The optimized expression for W can thus be further simplified to the proximal operator of g evaluated at G(W^(t)).
The value of W^(t) is found from the accelerated extrapolation:

W^(t) = W_t + ((b_{t−1} − 1)/b_t)·(W_t − W_{t−1})    (14)

where b_t satisfies:

b_t = (1 + sqrt(1 + 4·b_{t−1}^2))/2

b_0 and b_1 are initialized to 1, and W_t is the result of the t-th iterative optimization. The optimization problem for W thus becomes evaluating the soft-threshold operator S_ε[·]:

S_ε[x] = sign(x)·max(|x| − ε, 0)
where ε = β/L_f. From the initialized W_1 and W_2, the corresponding diagonal matrices are formed with entries:

A_ii = 1/(2·||w^i||_2)

where w^i denotes the i-th row of W.
L_f can thus be derived as:

L_f = 2·σ_max(XᵀX)

where σ_max(·) denotes the maximum-singular-value operation.
Through the above derivation, the iterative optimization of the weight matrix W under the proximal gradient algorithm is obtained as:

W_{t+1} = S_ε[W^(t) − (1/L_f)·∇f(W^(t))]
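One such proximal-gradient step can be sketched as follows; applying the two shrinkages in succession is an illustrative simplification of the combined proximal operator for the ℓ1 and ℓ2,1 penalties, and the function names are hypothetical:

```python
import numpy as np

# Sketch of one proximal-gradient step for W: gradient step on
# f(W) = ||XW - F||_F^2, then the soft-threshold S_eps for the l1 term and
# a row-wise shrinkage for the l2,1 term.
def soft_threshold(x, eps):
    return np.sign(x) * np.maximum(np.abs(x) - eps, 0.0)

def row_shrink(W, eps):
    # Row-wise shrinkage associated with the l2,1-norm penalty.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - eps / np.maximum(norms, 1e-12), 0.0)
    return W * scale

def prox_step(X, F, W, lam1, lam2):
    Lf = 2.0 * np.linalg.norm(X.T @ X, 2)   # Lipschitz constant of grad f
    grad = 2.0 * X.T @ (X @ W - F)          # gradient of ||XW - F||_F^2
    G = W - grad / Lf                       # gradient step
    return row_shrink(soft_threshold(G, lam1 / Lf), lam2 / Lf)
```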
Fixing the weight matrix W, the objective function for updating the prediction label Ft is:

min_{Ft} ||XtW − Ft||_F^2 + β·tr(Fᵀ L2 F),  s.t. Ft ≥ 0, Ft·1 = 1

The above formula is rewritten as an expression operating on Ft row by row, with M = XtW and M_i the i-th row vector of M, and the prediction label Ft is then solved row by row, where v = M_i and x is any unlabeled sample.
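The constraints Ft ≥ 0 and Ft·1 = 1 place each row of Ft on the probability simplex. The closed-form row solution is not reproduced here; as an illustration of how such a constrained row update can be handled, the standard Euclidean projection onto the simplex is sketched below (an assumption, not the exact formula of the text):

```python
import numpy as np

# Euclidean projection of a row v (e.g. v = M_i, the i-th row of Xt @ W)
# onto the probability simplex {w : w >= 0, sum(w) = 1}.
def project_simplex(v):
    u = np.sort(v)[::-1]                       # sort descending
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css)[0][-1]
    theta = css[rho] / (rho + 1.0)             # shift that enforces sum = 1
    return np.maximum(v - theta, 0.0)
```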
And repeating the optimization iteration process of Ft and W for multiple times, thereby completing the optimization iteration of the whole objective function.
Meanwhile, the weight matrix W is processed to obtain the unique features and the common features of the emotional states.
To obtain the features common to the emotional states, the obtained weight matrix W is processed to give an importance score θ_i for each feature dimension of the sample matrix, computed from the row vectors w^i of W. Because the EEG data are collected from every lead in each frequency band and then processed, the frequency band and the lead with the highest weight can be identified. The importance ω(i) of the i-th frequency band is calculated as follows:

ω(i) = θ_{(i−1)×s+1} + θ_{(i−1)×s+2} + … + θ_{i×s}    (24)

where b is the number of frequency bands used when collecting the EEG data and s is the number of leads used when collecting the EEG data.
Similarly, the importance ψ(j) of the j-th lead is calculated as follows:

ψ(j) = θ_j + θ_{j+s} + … + θ_{j+(b−1)×s}    (25)
The specific features of a given emotional state are obtained from the weight matrix W: with w_ji denoting the element in the j-th row and i-th column, the contribution of feature j to emotion class i is measured by w_ji relative to the sum Σj w_ji of the elements in the i-th column, and the corresponding frequency-band and lead importances are obtained in the same way as formulas (24) and (25).
Step 5: input the EEG data preprocessed in step 2 into the iteratively optimized objective function.

Feeding the unlabeled feature matrix (EEG signals) processed in step 2 through the iterative optimization yields the prediction label Ft, i.e. the subject's emotional state at the acquisition moment. The index of the maximum value in each row of the prediction label Ft is the corresponding emotion category, giving the emotion type of the subject under test at the time the EEG data were collected.
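The decision rule of step 5 (the column index of the row maximum of Ft gives the emotion category) can be sketched as follows; the label list follows the four categories of this embodiment, and the function name is hypothetical:

```python
import numpy as np

# Decode the prediction matrix Ft: the argmax of each row is the class.
EMOTIONS = ["sad", "fear", "happy", "calm"]   # the c = 4 classes used here

def decode_emotions(Ft):
    return [EMOTIONS[k] for k in np.argmax(Ft, axis=1)]
```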
The embodiments of the present invention have been described in detail above with reference to the accompanying drawings, but the present invention is not limited to the described embodiments. It will be apparent to those skilled in the art that various changes, modifications, substitutions and alterations can be made in these embodiments, including components thereof, without departing from the principles and spirit of the invention, and still fall within the scope of the invention.

Claims (9)

1. An emotion recognition method based on common and specific electroencephalogram feature mining, characterized by comprising the following steps:
step 1, collecting electroencephalogram data of a subject to obtain electroencephalogram emotion data of different emotion types;
step 2, preprocessing the electroencephalogram emotion data to obtain a labeled data set;
step 3, establishing a target function;
adding to the least-squares linear regression model the ℓ2,1-norm and the ℓ1-norm of the weight matrix, applied to its rows and columns respectively, to obtain an extended semi-supervised least-squares linear regression model; meanwhile, obtaining an adjacency matrix through the K-nearest-neighbor algorithm according to the feature information contained in the samples; and further obtaining the objective function;
step 4, jointly and iteratively optimizing the objective function by using the labeled data set, and sequentially solving the update expressions of the weight matrix W and the predicted label Ft in the model;
and step 5, inputting the preprocessed electroencephalogram data of the subject under test into the iteratively optimized objective function to obtain the emotion type of the subject under test.
2. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 1, wherein: in the step 1, electroencephalogram emotional data of different emotion types are obtained by inducing a subject to generate different emotion changes and collecting electroencephalogram data of the subject.
3. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 1, wherein: the preprocessing comprises sampling the electroencephalogram data according to a fixed frequency, and filtering noise and artifacts of the sampled electroencephalogram data through a band-pass filter.
4. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 3, wherein: the preprocessing further comprises the steps of classifying the electroencephalogram data subjected to noise and artifact filtering according to n frequency bands, and respectively calculating differential entropy under each frequency band, wherein the differential entropy is used as electroencephalogram characteristics in a sample matrix; n is the number of divided frequency bands.
5. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 2, wherein: the band-pass filter is a 1 Hz-75 Hz band-pass filter.
6. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 1, wherein: the EEG data after noise and artifact filtering is divided into 5 frequency bands according to Delta, Theta, Alpha, Beta and Gamma.
7. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 1, wherein: the semi-supervised least-squares linear regression model obtained in step 3 is:

min_{W,F} ||XW − F||_F^2 + λ1·||W||_1 + λ2·||W||_{2,1} + β·tr(Fᵀ L2 F)

wherein X = [Xs; Xt] ∈ R^{n×p}, Xs ∈ R^{ns×p} is the labeled feature matrix and Xt ∈ R^{nt×p} is the feature matrix of the subject under test; p denotes the dimension of the feature matrix; ns and nt are the numbers of samples in the training set and the test set respectively; W ∈ R^{p×c} is the weight matrix; F = [Ys; Ft] ∈ R^{n×c} also consists of two parts, Ys being the labels of the training-set feature matrix and Ft the predicted label of the emotion category of the subject under test; n = ns + nt represents the total number of samples; F = XW; L2 = L* − L, where L is the adjacency matrix obtained by the KNN algorithm on the sample matrix X and L* is the diagonal matrix with elements L*ii = Σj Lij; the ℓ1-norm of the weight matrix W represents the interpretation of the features specific to individual emotion labels; the ℓ2,1-norm of W represents the interpretation of the features common to the emotion labels; λ1, λ2 and β are preset model parameters; tr(·) is the trace operation of the matrix.
8. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 1, wherein: in step 4, the weight matrix and the prediction label are initialized before the objective function is optimized; each element of the prediction label is initialized to 1/c, where c is the total number of emotion-label categories, and the weight matrix is initialized to (XᵀX + γI)⁻¹XᵀY.
9. The emotion recognition method based on common and specific electroencephalogram feature mining, as claimed in claim 1, wherein: the specific method of the joint iteration in step 4 is as follows:

fixing the prediction label Ft and updating the weight matrix W, with the objective function:

min_W ||XW − F||_F^2 + λ1·||W||_1 + λ2·||W||_{2,1}

fixing the weight matrix W and updating the prediction label Ft, with the objective function:

min_{Ft} ||XtW − Ft||_F^2 + β·tr(Fᵀ L2 F)

s.t. Ft ≥ 0, Ft·1 = 1

and solving the above formulas to obtain the prediction label, the specific features corresponding to each emotional state, and the features common to all emotional states.
CN202210512240.4A 2022-05-11 2022-05-11 Emotion recognition method based on common and specific electroencephalogram feature mining Pending CN114757235A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210512240.4A CN114757235A (en) 2022-05-11 2022-05-11 Emotion recognition method based on common and specific electroencephalogram feature mining

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210512240.4A CN114757235A (en) 2022-05-11 2022-05-11 Emotion recognition method based on common and specific electroencephalogram feature mining

Publications (1)

Publication Number Publication Date
CN114757235A true CN114757235A (en) 2022-07-15

Family

ID=82336027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210512240.4A Pending CN114757235A (en) 2022-05-11 2022-05-11 Emotion recognition method based on common and specific electroencephalogram feature mining

Country Status (1)

Country Link
CN (1) CN114757235A (en)

Similar Documents

Publication Publication Date Title
Jiang et al. Self-supervised contrastive learning for EEG-based sleep staging
CN112244873A (en) Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network
CN114533086B (en) Motor imagery brain electrolysis code method based on airspace characteristic time-frequency transformation
CN113052113B (en) Depression identification method and system based on compact convolutional neural network
CN114190944B (en) Robust emotion recognition method based on electroencephalogram signals
CN113057657B (en) Electroencephalogram emotion classification method based on multi-scale connectivity characteristics and element migration learning
CN112185493A (en) Personality preference diagnosis device and project recommendation system based on same
CN116058800A (en) Automatic sleep stage system based on deep neural network and brain-computer interface
Duque et al. Visualizing high dimensional dynamical processes
CN114330422A (en) Cross-test migration learning method for estimating electroencephalogram emotional characteristics in real time
CN113974627A (en) Emotion recognition method based on brain-computer generated confrontation
CN117883082A (en) Abnormal emotion recognition method, system, equipment and medium
CN113988135A (en) Electromyographic signal gesture recognition method based on double-branch multi-stream network
CN114757235A (en) Emotion recognition method based on common and specific electroencephalogram feature mining
Komisaruk et al. Neural network model for artifacts marking in EEG signals
CN116035577A (en) Electroencephalogram emotion recognition method combining attention mechanism and CRNN
CN115736920A (en) Depression state identification method and system based on bimodal fusion
Alessandrini et al. EEG-Based Neurodegenerative Disease Classification using LSTM Neural Networks
CN115399735A (en) Multi-head attention mechanism sleep staging method based on time-frequency double-current enhancement
CN113974625A (en) Emotion recognition method based on brain-computer cross-modal migration
Moradi et al. Deep neural network method for classification of sleep stages using spectrogram of signal based on transfer learning with different domain data
CN117708682B (en) Intelligent brain wave acquisition and analysis system and method
CN112545535B (en) Sleep-wake cycle analysis method based on amplitude integrated electroencephalogram
CN116616800B (en) Scalp electroencephalogram high-frequency oscillation signal identification method and device based on meta-shift learning
CN113598791B (en) Consciousness disturbance classification method based on time-space convolution neural network used by resting state electroencephalogram

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination