CN116881762A - Emotion recognition method based on dynamic brain network characteristics - Google Patents

Emotion recognition method based on dynamic brain network characteristics

Info

Publication number
CN116881762A
CN116881762A (application CN202211583971.4A)
Authority
CN
China
Prior art keywords
brain
dynamic
emotion recognition
network characteristics
wave signals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211583971.4A
Other languages
Chinese (zh)
Inventor
王海玲
方志军
吴彦泽
汪丽珍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangxi Supersun Technology Lighting Co ltd
Shanghai University of Engineering Science
Original Assignee
Jiangxi Supersun Technology Lighting Co ltd
Shanghai University of Engineering Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangxi Supersun Technology Lighting Co ltd, Shanghai University of Engineering Science filed Critical Jiangxi Supersun Technology Lighting Co ltd
Priority to CN202211583971.4A priority Critical patent/CN116881762A/en
Publication of CN116881762A publication Critical patent/CN116881762A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention relates to the technical field of emotion recognition, and discloses an emotion recognition method based on dynamic brain network characteristics, comprising the following steps: preprocessing the acquired brain wave signals; extracting dynamic brain function network characteristics from the preprocessed brain wave signals by using the dyPLM dynamic phase linearity measurement method; and inputting the extracted dynamic brain function network characteristics into a trained neural network model for emotion recognition. The brain network features constructed by the method are not easily affected by noise, capture the dynamic characteristics of emotion more accurately, effectively improve the accuracy of emotion recognition, and have a wider application range.

Description

Emotion recognition method based on dynamic brain network characteristics
Technical Field
The invention relates to the technical field of emotion recognition, in particular to an emotion recognition method based on dynamic brain network characteristics.
Background
Emotion recognition is one of the most important frontier research topics in the field of human-computer interaction (HCI) emotional intelligence, aiming to improve human-computer interaction by measuring human emotion. Scalp electroencephalogram (EEG) signals are used in a large number of emotion recognition studies owing to their high temporal resolution, portability, and non-invasive nature. However, the current emotion recognition rate based on EEG signals is not high, so the approach cannot yet be applied in real human-computer interaction scenarios. The main reason is that the extracted features are static, whereas emotion changes continuously and dynamically; current feature extraction methods cannot mine the dynamic change characteristics of different emotions, which prevents emotion recognition capability from improving further. Therefore, a new dynamic feature extraction method for emotion recognition is needed, which would help advance emotion-recognition-based emotional intelligence in human-computer interaction.
Cognitive neuroscience research shows that the brain's cognitive processing of emotion is accomplished by several different brain areas that cooperate with each other to form distinct brain functional networks. Compared with traditional features such as power spectrum and differential entropy, brain network features contain richer emotion-related information and enable more effective emotion recognition. At present, the brain network features extracted for emotion recognition are static, such as magnitude squared coherence (MSC), phase locking value (PLV) and Pearson correlation analysis; but emotion changes continuously and dynamically, and these feature extraction methods cannot mine the dynamic change characteristics of different emotions, which prevents emotion recognition capability from improving further.
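The static phase locking value (PLV) mentioned above can be sketched in a few lines. This is an illustrative implementation of the standard definition, not code from the patent; the test signals are made up for the demonstration:

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase locking value between two equal-length 1-D signals.

    Instantaneous phases come from the analytic signal (Hilbert
    transform); PLV is the magnitude of the mean phase-difference
    phasor, ranging from 0 (no locking) to 1 (perfect locking).
    """
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 128, endpoint=False)
a = np.sin(2 * np.pi * 10 * t)
b = np.sin(2 * np.pi * 10 * t + 0.5)   # constant phase lag -> PLV near 1
c = rng.standard_normal(t.size)        # unrelated noise -> lower PLV
```

Being a single number per signal pair, the PLV summarizes the whole recording; it is exactly this static character that motivates the dynamic measure introduced below.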
Existing electroencephalogram-based dynamic network estimation methods also have a number of problems and limitations. When constructing a dynamic functional network, the common approach is the sliding window method: a time window of fixed length is selected, functional connectivity is computed from the signals inside the window, the window is then slid backwards by a fixed number of time points, and the functional connectivity of the next window is computed, yielding a functional network that changes over time. If the window length is too large, the dynamic characteristics of emotion are hard to capture, and the parameters must be set and tuned manually. In addition, the connectivity obtained with a fixed window length actually describes the signals over each whole window period, so its time span is dictated by the window length; brain network features of emotion at each individual time point cannot be extracted, and the functional connectivity information is seriously incomplete.
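The sliding-window procedure described above can be sketched as follows; the window length and step are precisely the hand-set parameters the text criticizes (the values and data here are arbitrary, for illustration only):

```python
import numpy as np

def sliding_window_connectivity(data, win_len, step):
    """Sliding-window Pearson functional connectivity.

    data: (C, T) array of C channels over T samples.  For each window
    of `win_len` samples (advanced by `step` samples), compute the
    C x C Pearson correlation matrix; returns shape (W, C, C).
    Note the result describes whole-window, not per-sample,
    connectivity, and depends on the hand-tuned window length.
    """
    C, T = data.shape
    mats = []
    for start in range(0, T - win_len + 1, step):
        mats.append(np.corrcoef(data[:, start:start + win_len]))
    return np.stack(mats)

rng = np.random.default_rng(1)
eeg = rng.standard_normal((4, 256))   # 4 channels, 256 samples (toy data)
fc = sliding_window_connectivity(eeg, win_len=64, step=32)   # 7 windows
```

Shrinking `win_len` improves temporal resolution but makes each correlation estimate noisier, which is the trade-off the dyPLM method below is designed to avoid.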
Disclosure of Invention
The invention provides an emotion recognition method based on dynamic brain network characteristics, which adopts the dyPLM algorithm to extract emotion-related dynamic brain network features and finally inputs the extracted features into a trained CNGRU neural network model for emotion recognition, improving emotion recognition accuracy.
The invention can be realized by the following technical scheme:
a mood recognition method based on dynamic brain network characteristics,
preprocessing the acquired brain wave signals;
extracting dynamic brain function network characteristics from the preprocessed brain wave signals by using the dyPLM dynamic phase linearity measurement method;
and inputting the extracted dynamic brain function network characteristics into a trained neural network model for emotion recognition.
Further, brain wave signals are acquired using an electroencephalogram cap with a plurality of acquisition electrodes. The brain wave signals of each subject are then preprocessed, and features are extracted from the data at each acquisition time point of the preprocessed signals using the dyPLM dynamic phase linearity measurement method to obtain dynamic brain function network characteristics. Finally, the dynamic brain function network characteristics corresponding to each subject are input into a trained neural network model for emotion recognition.
Further, the preprocessed brain wave signals are recorded as X = {x_c(t)}, c = 1, 2, …, C, t = 1, 2, …, T, where C is the total number of lead signals in the electroencephalogram cap and T is the total number of data acquisition time points.
The preprocessed brain wave signals are first Hilbert transformed to obtain the analytic signals x̃_c(t) = x_c(t) + i·H[x_c(t)].
Then, for the signals x̃_i(t) and x̃_j(t) of any two leads i and j, the normalized interferometric signal z(t) = x̃_i(t)·x̃_j*(t) / |x̃_i(t)·x̃_j*(t)| is calculated, where x̃_j*(t) denotes the complex conjugate of x̃_j(t);
the phase Δφ(t) of z(t) is calculated, and z(t) is time-frequency transformed to obtain z(t, f) = ∫₀ᵀ z(τ)·w(τ − t, f)·e^{−i2πfτ} dτ, where the window function w is a Gaussian function and [0, T] is the observation interval of the signal; the corresponding energy spectral density s_z(t, f) = |z(t, f)|² is then calculated;
finally, the functional connection value of any two lead signals at acquisition time point t is calculated as PLM_ij(t) = ∫_{−B}^{B} s_z(t, f) df / ∫_{−∞}^{+∞} s_z(t, f) df, where 2B is a narrow band centered on zero frequency, and a brain function network matrix is constructed;
repeating the above process to obtain the function connection values corresponding to all the acquisition time points t of each subject, thereby obtaining the dynamic brain function network characteristics.
Furthermore, the neural network model adopts a CNGRU model. The CNGRU model has two convolution layers, each using a ReLU activation function followed by pooling and a dropout method to reduce overfitting; the feature map output by the two convolution layers is fed into a two-layer GRU model, and classification is finally performed with a softmax regression model.
Further, the categories identified by emotion recognition are set as high-valence high-arousal, high-valence low-arousal, low-valence high-arousal and low-valence low-arousal.
The beneficial technical effects of the invention are as follows:
the invention provides a dynamic brain network construction method based on dyPLM algorithm for feature extraction, which can construct brain network for each sampling time point signal, thereby forming dynamic brain network in given sampling time period.
Drawings
FIG. 1 is a flow chart of a method of emotion recognition based on electroencephalogram dynamic brain functional network features of the present invention;
FIG. 2 is a schematic diagram of a dynamic brain function network construction process based on dyPLM in accordance with the present invention;
FIG. 3 is a diagram of a network model architecture of a CNGRU of the present invention;
FIG. 4 is a graph of the emotion recognition results on the DEAP dataset for dynamic brain network features extracted by the dyPLM method of the invention and by the sliding-window-based Pearson correlation coefficient method;
FIG. 5 is a diagram illustrating the convergence of loss function values on a DEAP dataset according to an embodiment of the present invention;
fig. 6 is a schematic diagram of an emotion recognition system of an emotion recognition method based on an electroencephalogram dynamic brain function network feature according to an embodiment of the present invention.
Detailed Description
The invention will be described in further detail with reference to the drawings and the specific examples.
As shown in FIG. 1, the phase difference between any two brain wave signal sources is a linear function of time whose slope depends on the difference between the center frequencies of the two signals, and the dyPLM dynamic phase linearity measurement algorithm is a purely phase-based correlation measure. The algorithm is therefore used to estimate brain connectivity from signal phases and to construct dynamic brain network characteristics, which are finally input into the CNGRU neural network model for emotion recognition, improving recognition accuracy.
For ease of understanding, the invention is described with reference to the emotional electroencephalogram data stored in the DEAP public database, which was collected experimentally by Koelstra et al. of Queen Mary University of London, the University of Twente in the Netherlands and the Swiss Federal Institute of Technology to study human emotional states. The DEAP database records the physiological signals of 32 subjects induced by music video stimuli: each subject watched 40 one-minute music videos, and the physiological signals were sampled at 512 Hz and resampled to 128 Hz. Each subject completed 40 trials and, after each trial, performed a self-assessment of emotional state, including Valence and Arousal ratings. In the following embodiments, the invention is described using the 32-lead electroencephalogram data resampled to 128 Hz, and the emotion of each trial is divided into four categories: High Valence High Arousal (HVHA), High Valence Low Arousal (HVLA), Low Valence High Arousal (LVHA) and Low Valence Low Arousal (LVLA).
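Assuming the common practice of thresholding the DEAP 1-9 self-assessment scales at their midpoint (the text does not state the threshold used, so the value 5 here is an assumption), the four-class labels can be derived as:

```python
def deap_label(valence, arousal, threshold=5.0):
    """Map DEAP self-assessment ratings (1-9 scale) to the four
    classes used in the text: HVHA, HVLA, LVHA, LVLA.

    The midpoint threshold of 5 is an assumption, not taken from
    the patent; DEAP studies commonly split the scales there.
    """
    hv = valence > threshold   # high valence?
    ha = arousal > threshold   # high arousal?
    return {(True, True): "HVHA", (True, False): "HVLA",
            (False, True): "LVHA", (False, False): "LVLA"}[(hv, ha)]
```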
The method comprises the following steps:
step one, preprocessing the acquired brain wave signals;
preprocessing all brain wave signal data to remove artifacts such as blinks, eye movements and myoelectricity, yielding clean, noise-free brain wave data;
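A minimal stand-in for this preprocessing step is a zero-phase band-pass filter; real pipelines additionally remove blink/eye-movement/EMG artifacts (e.g. with ICA), which is not shown here. The 4-45 Hz band and the filter order are illustrative choices, not taken from the text:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(eeg, fs=128.0, low=4.0, high=45.0, order=4):
    """Zero-phase band-pass filter applied channel-wise.

    eeg: (C, T) array of C channels over T samples at rate fs.
    Band edges and order are illustrative assumptions.
    """
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    # filtfilt runs the filter forward and backward, cancelling
    # phase distortion (important for later phase-based measures)
    return filtfilt(b, a, eeg, axis=-1)

t = np.arange(512) / 128.0
x = np.sin(2 * np.pi * 1 * t) + np.sin(2 * np.pi * 10 * t)   # drift + alpha-band tone
y = preprocess(x[np.newaxis, :])   # 1 Hz drift removed, 10 Hz kept
```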
step two, extracting dynamic brain function network characteristics from the preprocessed brain wave signals by using a dyPLM dynamic phase linear measurement method, namely extracting characteristics from data of each acquisition time point of the preprocessed brain wave signals by using the dyPLM dynamic phase linear measurement method, so as to obtain the dynamic brain function network characteristics, as shown in fig. 2.
The preprocessed brain wave signals are recorded as X = {x_c(t)}, c = 1, 2, …, C, t = 1, 2, …, T, where C is the total number of lead signals in the electroencephalogram cap and T is the total number of data acquisition time points.
The preprocessed brain wave signals are first Hilbert transformed to obtain the analytic signals x̃_c(t) = x_c(t) + i·H[x_c(t)].
Then, for the signals x̃_i(t) and x̃_j(t) of any two leads i and j, the normalized interferometric signal z(t) = x̃_i(t)·x̃_j*(t) / |x̃_i(t)·x̃_j*(t)| is calculated, where x̃_j*(t) denotes the complex conjugate of x̃_j(t);
then the phase Δφ(t) of z(t) is calculated, and z(t) is time-frequency transformed to obtain z(t, f) = ∫₀ᵀ z(τ)·w(τ − t, f)·e^{−i2πfτ} dτ, where the window function w is a Gaussian function and [0, T] is the observation interval of the signal; the time-frequency transform adopted by the invention is the S-transform; the corresponding energy spectral density s_z(t, f) = |z(t, f)|² is then calculated;
finally, the functional connection value of any two lead signals at acquisition time point t is calculated as PLM_ij(t) = ∫_{−B}^{B} s_z(t, f) df / ∫_{−∞}^{+∞} s_z(t, f) df. The dyPLM algorithm thus computes, at each acquisition time point t, the percentage of spectral energy within the narrow band 2B centered on zero frequency relative to the total signal energy at that time point, forming the 32 × 32 brain function network matrix corresponding to acquisition time point t;
repeating the above process yields the functional connection values for all acquisition time points t of each subject, thereby obtaining the dynamic brain function network characteristics, whose values lie between 0 and 1.
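The per-time-point computation above can be sketched for a single lead pair as follows. This is a sketch, not the patent's implementation: a Gaussian-windowed STFT from scipy stands in for the S-transform named in the text, and the bandwidth, segment length and test signals are illustrative assumptions:

```python
import numpy as np
from scipy.signal import hilbert, stft

def dyplm_pair(x, y, fs=128.0, band=1.0, nperseg=128):
    """Per-time-point phase-linearity connectivity for one lead pair.

    Follows the steps in the text: analytic signals via the Hilbert
    transform, normalized interferometric signal z(t), a Gaussian-
    windowed time-frequency transform of z, then the fraction of
    spectral energy inside the narrow band [-band, band] Hz at each
    time frame.  A Gaussian-windowed STFT approximates the
    S-transform here; `band` and `nperseg` are illustrative.
    """
    xi, yi = hilbert(x), hilbert(y)
    z = xi * np.conj(yi)
    z = z / np.abs(z)                       # unit-modulus interferometric signal
    f, t, Z = stft(z, fs=fs, window=("gaussian", nperseg / 4),
                   nperseg=nperseg, return_onesided=False)
    S = np.abs(Z) ** 2                      # energy spectral density s_z(t, f)
    inband = np.abs(f) <= band
    return S[inband].sum(axis=0) / S.sum(axis=0)   # values in [0, 1]

fs = 128.0
t = np.arange(512) / fs
x10 = np.sin(2 * np.pi * 10 * t)
x10s = np.sin(2 * np.pi * 10 * t + 1.0)    # same frequency, fixed phase lag
x20 = np.sin(2 * np.pi * 20 * t)           # different center frequency
same = dyplm_pair(x10, x10s, fs=fs)        # energy near f = 0 -> values near 1
diff = dyplm_pair(x10, x20, fs=fs)         # energy at f = -10 Hz -> values near 0
```

Equal-frequency sources give a constant-phase z(t), so the energy of z concentrates at zero frequency and the in-band fraction is high; sources at different center frequencies make z rotate, pushing the energy outside the narrow band, which is the phase-linearity intuition stated at the start of the embodiment.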
Step three, inputting the extracted dynamic brain function network characteristics into a trained neural network model for emotion recognition, as shown in fig. 3. The input of the neural network model is a 32 × 32 brain function network matrix, where 32 is the total number of lead signals in the brain wave data;
each brain function network matrix is passed sequentially through two one-dimensional convolution layers, each with convolution kernel size 3, 128 filters, stride 1 and tanh activation; the pooling layer after each convolution layer has size 2, and dropout is set to 0.2. After the two convolution, pooling and dropout layers, a Feature Map of size 6 × 128 is obtained;
then inputting the Feature Map of 6×128 into the GRU model of two layers in time sequence, wherein the first layer has the dimension of 6×256, the second layer has the dimension of 6×32, and the output Feature dimension is 32×1;
finally, the 32 multiplied by 1 features are connected into a full connection layer with the length of 128, and then the softmax layer is accessed to realize four category classification of emotion.
Fig. 4 shows the per-subject classification results on the DEAP dataset for the dyPLM method provided by the embodiment of the invention and for the sliding-window-based Pearson correlation coefficient method. As can be seen from the figure, with the method provided by the invention the four-class accuracy on the emotional electroencephalogram data exceeds 99% for most subjects, and even the lowest accuracy, for subject 22, exceeds 92%; for most subjects, the accuracy of the dyPLM-based dynamic brain network feature extraction method is far higher than that of the sliding-window-based Pearson correlation coefficient method. Fig. 5 shows the convergence of the loss function of the method on the DEAP dataset; as can be seen from fig. 4 and fig. 5, the method provided by the invention performs well and exhibits a certain robustness.
Fig. 6 shows an emotion recognition system based on dynamic brain function network characteristics, which comprises an electroencephalogram acquisition module, an electroencephalogram data storage module, an electroencephalogram data preprocessing module, a dynamic brain network characteristic extraction module and an emotion recognition module, wherein the electroencephalogram data preprocessing module, the dynamic brain network characteristic extraction module and the emotion recognition module are applied to a server. It should be noted that the division of each module is merely a division of a logic function, and may be fully or partially integrated into a physical entity in actual implementation, and each module may be implemented in a form of a software program called by a processing component.
The electroencephalogram acquisition module is connected with the electroencephalogram data storage module and is used for acquiring emotional electroencephalogram data and transmitting it to memory for storage; the emotional electroencephalogram data comprise multiple categories such as positive, neutral and negative emotional electroencephalograms;
the electroencephalogram data preprocessing module is used for preprocessing the emotional electroencephalogram data, removing artifacts such as blinks, eye movements and myoelectricity to obtain clean, noise-free electroencephalogram data;
the dynamic brain function network feature extraction module is used for extracting dynamic brain networks of different emotion types from the preprocessed brain electrical data;
the emotion recognition module is used for predicting and recognizing different emotion types based on the dynamic brain network characteristics of the different emotion types.
While particular embodiments of the present invention have been described above, it will be appreciated by those skilled in the art that these are merely illustrative, and that many changes and modifications may be made to these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims.

Claims (5)

1. An emotion recognition method based on dynamic brain network characteristics, characterized by comprising the following steps:
preprocessing the acquired brain wave signals;
extracting dynamic brain function network characteristics from the preprocessed brain wave signals by using the dyPLM dynamic phase linearity measurement method;
and inputting the extracted dynamic brain function network characteristics into a trained neural network model for emotion recognition.
2. The emotion recognition method based on dynamic brain network characteristics according to claim 1, characterized in that: brain wave signals of different subjects are acquired using a multi-lead electroencephalogram cap; the brain wave signals of each subject are preprocessed; features are extracted from the data at each acquisition time point of the preprocessed brain wave signals using the dyPLM dynamic phase linearity measurement method to obtain dynamic brain function network characteristics; and finally, the dynamic brain function network characteristics corresponding to each subject are input into a trained neural network model for emotion recognition.
3. The emotion recognition method based on dynamic brain network characteristics according to claim 2, characterized in that: the preprocessed brain wave signals are recorded as X = {x_c(t)}, c = 1, 2, …, C, t = 1, 2, …, T, where C is the total number of lead signals in the electroencephalogram cap and T is the total number of data acquisition time points; Hilbert transformation is first performed on the preprocessed brain wave signals to obtain the analytic signals x̃_c(t) = x_c(t) + i·H[x_c(t)];
then, for the signals x̃_i(t) and x̃_j(t) of any two leads i and j, the normalized interferometric signal z(t) = x̃_i(t)·x̃_j*(t) / |x̃_i(t)·x̃_j*(t)| is calculated, where x̃_j*(t) is the complex conjugate of x̃_j(t);
then the phase Δφ(t) of z(t) is calculated, and z(t) is time-frequency transformed to obtain z(t, f) = ∫₀ᵀ z(τ)·w(τ − t, f)·e^{−i2πfτ} dτ, where the window function w is a Gaussian function and [0, T] is the observation interval of the signal; the corresponding energy spectral density s_z(t, f) = |z(t, f)|² is then calculated;
finally, the functional connection value of any two lead signals at acquisition time point t is calculated as PLM_ij(t) = ∫_{−B}^{B} s_z(t, f) df / ∫_{−∞}^{+∞} s_z(t, f) df, where 2B is a narrow band centered on zero frequency, and a brain function network matrix is constructed;
repeating the above process to obtain the function connection values corresponding to all the acquisition time points t of each subject, thereby obtaining the dynamic brain function network characteristics.
4. The emotion recognition method based on dynamic brain network characteristics according to claim 1, characterized in that: the neural network model adopts a CNGRU model; the CNGRU model has two convolution layers, each using a ReLU activation function followed by pooling and a dropout method to reduce overfitting; the feature map output by the two convolution layers is fed into a two-layer GRU model, and classification is finally performed with a softmax regression model.
5. The emotion recognition method based on dynamic brain network characteristics according to claim 4, characterized in that: the emotion recognition categories are set as high-valence high-arousal, high-valence low-arousal, low-valence high-arousal and low-valence low-arousal.
CN202211583971.4A 2022-12-09 2022-12-09 Emotion recognition method based on dynamic brain network characteristics Pending CN116881762A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211583971.4A CN116881762A (en) 2022-12-09 2022-12-09 Emotion recognition method based on dynamic brain network characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211583971.4A CN116881762A (en) 2022-12-09 2022-12-09 Emotion recognition method based on dynamic brain network characteristics

Publications (1)

Publication Number Publication Date
CN116881762A true CN116881762A (en) 2023-10-13

Family

ID=88268674

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211583971.4A Pending CN116881762A (en) 2022-12-09 2022-12-09 Emotion recognition method based on dynamic brain network characteristics

Country Status (1)

Country Link
CN (1) CN116881762A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117617995A (en) * 2024-01-26 2024-03-01 小舟科技有限公司 Method for collecting and identifying brain-computer interface key brain region code and computer equipment
CN117708682A (en) * 2024-02-06 2024-03-15 吉林大学 Intelligent brain wave acquisition and analysis system and method
CN117708682B (en) * 2024-02-06 2024-04-19 吉林大学 Intelligent brain wave acquisition and analysis system and method

Similar Documents

Publication Publication Date Title
Hramov et al. Wavelets in neuroscience
Huang et al. S-EEGNet: Electroencephalogram signal classification based on a separable convolution neural network with bilinear interpolation
CN116881762A (en) Emotion recognition method based on dynamic brain network characteristics
CN111310570B (en) Electroencephalogram signal emotion recognition method and system based on VMD and WPD
CN110781945A (en) Electroencephalogram signal emotion recognition method and system integrating multiple features
CN113729707A (en) FECNN-LSTM-based emotion recognition method based on multi-mode fusion of eye movement and PPG
CN111493822A (en) Sleep electroencephalogram based rapid eye movement period sleep behavior disorder classification method
CN112869711A (en) Automatic sleep staging and migration method based on deep neural network
CN111407243A (en) Pulse signal pressure identification method based on deep learning
CN112587153A (en) End-to-end non-contact atrial fibrillation automatic detection system and method based on vPPG signal
CN114732409A (en) Emotion recognition method based on electroencephalogram signals
Cleatus et al. Epileptic seizure detection using spectral transformation and convolutional neural networks
Xu et al. EEG emotion classification based on baseline strategy
Madanu et al. Depth of anesthesia prediction via EEG signals using convolutional neural network and ensemble empirical mode decomposition
CN114190944A (en) Robust emotion recognition method based on electroencephalogram signals
Agarwal et al. Fusion of pattern-based and statistical features for Schizophrenia detection from EEG signals
Hou et al. Deep feature pyramid network for EEG emotion recognition
CN113317803B (en) Neural disease feature extraction method based on graph theory and machine learning
CN113558644A (en) Emotion classification method, medium and equipment for 3D matrix and multidimensional convolution network
Shah et al. Analysis of EEG for Parkinson’s Disease Detection
CN117407748A (en) Electroencephalogram emotion recognition method based on graph convolution and attention fusion
Anthiyur Aravindan et al. Prediction of arousal and valence state from electrodermal activity using wavelet based resnet50 model
Shilaskar et al. Fusion of eeg, emg, and ecg signals for accurate recognition of pain, happiness, and disgust
Yu et al. Design and implementation of automatic sleep staging based on ECG signals
Tunnell et al. A novel convolutional neural network for emotion recognition using neurophysiological signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination