CN115414051A - Emotion classification and recognition method of electroencephalogram signal self-adaptive window - Google Patents

Emotion classification and recognition method of electroencephalogram signal self-adaptive window

Info

Publication number
CN115414051A
Authority
CN
China
Prior art keywords
window
electroencephalogram
emotion
signal
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110519615.5A
Other languages
Chinese (zh)
Inventor
梁琛
王忠民
王菲
王文浪
范琳
衡霞
贺炎
张�荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an University of Posts and Telecommunications
Original Assignee
Xi'an University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an University of Posts and Telecommunications
Priority to CN202110519615.5A
Publication of CN115414051A
Legal status: Pending

Classifications

    • A – HUMAN NECESSITIES
    • A61 – MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B – DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 – Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 – Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 – Evaluating the state of mind, e.g. depression, anxiety
    • A – HUMAN NECESSITIES
    • A61 – MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B – DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 – Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 – Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203 – Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A – HUMAN NECESSITIES
    • A61 – MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B – DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 – Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 – Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 – Details of waveform analysis
    • A61B 5/7264 – Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G – PHYSICS
    • G06 – COMPUTING; CALCULATING OR COUNTING
    • G06F – ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 – Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 – Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 – Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 – Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Physiology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • General Physics & Mathematics (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Evolutionary Computation (AREA)
  • Dermatology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)

Abstract

The invention belongs to the technical field of signal processing, and particularly relates to an emotion classification and recognition method based on an adaptive window over electroencephalogram (EEG) signals, comprising the following steps: preprocessing EEG data acquired under different emotional states; iteratively comparing EEG segments at different time points and of different lengths with the generalized orthogonal partial directed coherence method, and selecting the key EEG segment that best represents the emotion; extracting features such as fractal dimension, differential entropy and power spectral density from the selected key segment; weighting the extracted features with the ReliefF algorithm to obtain high-quality features; and finally, performing emotion classification and recognition on the valence-arousal two-dimensional emotion model with a support vector machine and a K-nearest-neighbor algorithm, using the selected high-quality features. The method improves the emotion recognition rate while reducing the data volume, the processing time and the computational cost, thereby improving emotion classification and recognition performance.

Description

Emotion classification and recognition method of electroencephalogram signal self-adaptive window
Technical Field
The invention belongs to the technical field of signal processing, and particularly relates to an emotion classification and recognition method for an electroencephalogram signal self-adaptive window.
Background
Among the many physiological electrical signals, electroencephalogram signals are acquired from brain activity, directly reflect the activity state of the brain, and have the advantages of convenient acquisition, high temporal resolution and low cost, so they are widely used in emotion recognition research. At present, emotion recognition research based on electroencephalogram signals involves not only artificial intelligence and computer science but also many interdisciplinary fields such as neuroscience and psychiatry. Studying the brain's emotion-related cognitive activities is of great importance for people to understand their own emotions, for optimizing computer-assisted functions, for developing portable personal health care and monitoring systems, and for advancing the development of psychological science.
Emotional electroencephalogram data are usually acquired through video induction. The functional brain network of the subject is difficult to keep relatively stable at the initial stage of data acquisition and at the stage when the experimental task is about to end; during acquisition, the subject may introduce artifact interference because of drowsiness, fatigue or other reasons; and an overly long recording increases computation time and makes the emotion processing in the subject's brain harder to reflect. Consequently, using the complete electroencephalogram signal for emotion classification and recognition cannot achieve the best experimental effect.
In addition, because the electroencephalogram signal is high-dimensional, the computation load is large, the cost is high, and the signal-to-noise ratio of the data is low, so using the complete electroencephalogram signal makes the experimental process more cumbersome. Furthermore, due to individual differences, different subjects respond differently, and at different times, to the same video stimuli. To solve these problems, the invention provides an emotion classification and recognition method of an electroencephalogram signal adaptive window.
Disclosure of Invention
In view of the above situation, the invention provides an emotion classification and recognition method of an electroencephalogram signal adaptive window: the acquired electroencephalograms in different emotional states are preprocessed, the key electroencephalogram signal that best represents the emotion is selected with the generalized orthogonal partial directed coherence method, the features are then optimized on this basis, and emotion classification and recognition is performed with the optimized features, thereby improving emotion classification and recognition performance.
The invention provides an emotion classification and recognition method of an electroencephalogram signal self-adaptive window, characterized by comprising the following steps:
step one, acquiring electroencephalogram signals of different emotional states, including but not limited to emotional states such as happiness and sadness;
step two, preprocessing the electroencephalogram data: the original electroencephalogram signal contains artifact interference components, and the artifacts in the signal need to be removed;
step three, computing all possible signal combinations within the adaptive window: performing adaptive-window data reduction on the preprocessed electroencephalogram data; denoting the minimum window, the maximum window and the change constant as W_min, W_max and C respectively, first setting the window size to W_min and finding all signal combinations of size W_min, then increasing the window size by the change constant C and likewise finding all signal combinations at the enlarged size, and repeating the previous step until the window size is greater than or equal to W_max; iterating this process ensures that all possible signal time positions are considered;
step four, selecting the signal window with the maximum emotion intensity: computing the generalized orthogonal partial directed coherence value (gOPDC) of all signal combinations in the time dimension, and selecting the window data with the highest gOPDC value among all window data matrices, the selected window data being denoted W_gOPDC;
step five, extracting electroencephalogram (EEG) features such as fractal dimension, differential entropy and power spectral density from the selected window data;
step six, selecting features with the ReliefF algorithm: randomly selecting an instance, finding its K nearest neighbors of the same class and K nearest neighbors of each other class, computing the weight vector of the features, and selecting the highest-quality features according to the weight vectors, ensuring that the number of selected features is smaller than the number of samples;
step seven, performing emotion classification and recognition with classifiers: according to the selected features, performing emotion classification and recognition in the valence and arousal dimensions on all the preprocessed electroencephalogram data and on the data-reduced key electroencephalogram data, using a support vector machine (SVM) and a K-nearest-neighbor (KNN) algorithm.
Compared with the prior art, the technical scheme provided by the invention has the following beneficial effects:
(1) The part of the electroencephalogram signal that best represents the emotion is selected through the adaptive window, which reduces the amount of data, the computational load and the cost while improving the accuracy of emotion classification and recognition, so that the emotion is expressed more accurately;
(2) When the key electroencephalogram signals are selected with the generalized orthogonal partial directed coherence method, the relations between electroencephalogram channels are taken into account and the spatial and functional relations of the data are restored, so that more discriminative emotion information can be provided;
(3) Because the number of data points in the data sets available for electroencephalogram emotion classification and recognition is limited, a number of feature points clearly larger than the number of data points causes the model to overfit; to overcome this overfitting problem, feature selection reduces the number of feature points required to train the model, and the ReliefF algorithm is therefore used to select a new set of features carrying the largest amount of emotion information.
Drawings
FIG. 1 is a flowchart of the implementation of the emotion classification and recognition method of the electroencephalogram signal adaptive window.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the following detailed description of the embodiments of the present invention is provided with reference to the accompanying drawings.
Step 1: Acquire electroencephalogram signals. Using video induction as the stimulus, acquire electroencephalogram data of emotional states including but not limited to happiness and sadness while the user watches different videos;
and 2, step: and (3) preprocessing the electroencephalogram data, namely removing artifacts in the signals by using a Matlab automatic artifact removal tool kit by considering that the original electroencephalogram signals comprise some artifact interference components. Firstly, decomposing an original electroencephalogram signal X into spatial components by using Blind Source Separation (BSS) so as to separate artifacts caused by brain activities; secondly, detecting artifact components; and finally, reconstructing the electroencephalogram data by using the non-artifact components. The data is down-sampled to obtain a sampling frequency of 128Hz, reducing noise-related components from most of the electrode signal;
and step 3: all possible signal combinations in the adaptive window are calculated, all available electroencephalogram data are used, calculation is expensive, and high emotion classification recognition effect cannot be obtained. In addition, the stimuli to evoke the emotions are tedious, and during this time, the subject may experience a variety of emotions of different intensities, so a short time window needs to be selected to extract a signal that can better characterize the emotion;
and carrying out data reduction processing of the self-adaptive window by utilizing the preprocessed electroencephalogram data. Recording the minimum window, the maximum window and the variation constant as W respectively min 、W max And C, firstly setting the window size to W min And find a size of W min All the signals of (1) are combined. Next, the window size is incremented by a change constant C, and likewise, all signal combinations after the size increase C are found. Repeating the previous step until the window size is greater than or equal to W max Iterating this process ensures that all possible signal time positions are considered;
and 4, step 4: selecting the signal with the greatest emotional intensity, assuming a data set with S subjects, each subject having M samples, at each sampleThe length is t seconds, and the number of sample channels is N. Calculating the general orthogonal partial directional coherence value gOPDC between every two channels of all signal combinations of each sample in the time dimension, adding the gOPDC matrixes between the selected window data channels, selecting the window data with the highest general orthogonal partial directional coherence value from all the window data matrixes, and expressing the window data as W gOPDC
Step 5: Extract EEG features such as fractal dimension, differential entropy and power spectral density from the selected window data;
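The sketch below illustrates the three feature families for a single channel of the selected window: differential entropy under a Gaussian assumption, band power from a Welch power spectral density, and the Petrosian fractal dimension as one illustrative fractal measure (the text does not specify which fractal dimension estimator is used).

```python
import numpy as np
from scipy.signal import welch

def differential_entropy(x):
    # DE of a Gaussian signal: 0.5 * log(2 * pi * e * variance)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def band_power(x, fs=128, band=(8, 13)):
    # Integrate the Welch PSD over a frequency band (alpha band by default).
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 256))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def petrosian_fd(x):
    # Petrosian fractal dimension from the number of sign changes of the derivative.
    diff = np.diff(x)
    n_sign_changes = np.sum(diff[:-1] * diff[1:] < 0)
    n = len(x)
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_sign_changes)))
```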
step 6: and selecting features by utilizing a reliefF algorithm, selecting an example feature by utilizing the reliefF algorithm, finding K features corresponding to the same class and K features of different classes, calculating weight vectors corresponding to the features, and selecting the feature with the highest quality according to the weight vectors to ensure that the number of the selected features is less than the number of samples. Because the quantity of data points in a data set capable of recognizing electroencephalogram emotion is limited, and the quantity of characteristic points is obviously higher than that of the data points, the model is over-fitted, in order to overcome the over-fitting problem, the quantity of the characteristic points required by the training model can be reduced during characteristic selection, a new set of characteristics with the largest emotion information quantity are selected by utilizing a reliefF algorithm, and the algorithm has noise resistance and robustness on characteristic interaction;
and 7: and (3) performing emotion classification and identification by using a classifier, and performing emotion classification and identification on all preprocessed electroencephalogram data and key electroencephalogram data subjected to data reduction by using a support vector machine and a K neighbor algorithm in titer and awakening dimensions according to the selected characteristics.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, they are not intended to limit the scope of the present invention; those skilled in the art should understand that various modifications and variations can be made on the basis of the technical solution of the present invention without inventive effort.

Claims (3)

1. An emotion classification and recognition method of an electroencephalogram signal adaptive window, characterized by comprising the following steps:
step one, acquiring electroencephalogram signals of different emotional states, including but not limited to emotional states such as happiness and sadness;
step two, preprocessing the electroencephalogram data: the original electroencephalogram signal contains artifact interference components, and the artifacts in the signal need to be removed;
step three, computing all possible signal combinations within the adaptive window: performing adaptive-window data reduction on the preprocessed electroencephalogram data; denoting the minimum window, the maximum window and the change constant as W_min, W_max and C respectively, first setting the window size to W_min and finding all signal combinations of size W_min, then increasing the window size by the change constant C and likewise finding all signal combinations at the enlarged size, and repeating the previous step until the window size is greater than or equal to W_max; iterating this process ensures that all possible signal time positions are considered;
step four, selecting the signal window with the maximum emotion intensity: computing the generalized orthogonal partial directed coherence (gOPDC) of all signal combinations in the time dimension, and selecting the window data with the highest gOPDC value among all window data matrices, the selected window data being denoted W_gOPDC;
step five, extracting electroencephalogram (EEG) features such as fractal dimension, differential entropy and power spectral density from the selected window data;
step six, selecting features with the ReliefF algorithm: randomly selecting an instance, finding its K nearest neighbors of the same class and K nearest neighbors of each other class, computing the weight vector of the features, and selecting the highest-quality features according to the weight vectors, ensuring that the number of selected features is smaller than the number of samples;
step seven, performing emotion classification and recognition with classifiers: according to the selected features, performing emotion classification and recognition in the valence and arousal dimensions on all the preprocessed electroencephalogram data and on the data-reduced key electroencephalogram data, using a support vector machine (SVM) and a K-nearest-neighbor (KNN) algorithm.
2. The emotion classification and recognition method of the electroencephalogram signal adaptive window according to claim 1, characterized in that the signal with the maximum emotion intensity is selected in step four: considering that, when stimuli are presented to collect electroencephalogram data, the subject may experience several emotions of different intensities because of individual differences and the subject's own physiological and psychological factors, even if the stimulus is designed to evoke a single emotion, a short time window needs to be found to extract the signal that better represents that emotion; the invention computes the generalized orthogonal partial directed coherence value of all signal combinations in the time dimension and selects the window data with the highest value as the signal window that best represents the emotion.
3. The emotion classification and recognition method of the electroencephalogram signal adaptive window according to claim 1, characterized in that the generalized orthogonal partial directed coherence in step four is calculated on the basis of a multivariate autoregressive model, and a multivariate autoregressive model of order p can be expressed as:

X(n) = \sum_{r=1}^{p} A_r X(n-r) + U(n) \qquad (1)

wherein m denotes the number of channels, X(n) = (x_1(n), \ldots, x_m(n))^T is the given time series, U(n) = (u_1(n), \ldots, u_m(n))^T is a normally distributed white noise vector, and A_r is the prediction coefficient matrix, given by equation (2):

A_r = \begin{pmatrix} a_{11}^{(r)} & \cdots & a_{1m}^{(r)} \\ \vdots & \ddots & \vdots \\ a_{m1}^{(r)} & \cdots & a_{mm}^{(r)} \end{pmatrix} \qquad (2)

After the multivariate autoregressive model is established, its coefficient matrices A_r are obtained with a dual extended Kalman filtering algorithm, and a Laplace transform of A_r converts the model to the frequency domain:

A(n, f) = I - \sum_{r=1}^{p} A_r(n)\, e^{-i 2\pi f r} \qquad (3)

In equation (3), I is the identity matrix, r is the model order index, p is the maximum prediction order of the multivariate autoregressive model, and f is the frequency; the multivariate autoregressive model established for the multichannel EEG signal in the time domain according to equation (1) is thereby converted to the frequency domain:

X(n, f) = A^{-1}(n, f)\, U(n, f) \qquad (4)

The partial directed coherence value from channel i to channel j can be expressed as:

P_{ij}(n, f) = \frac{|A_{ij}(n, f)|}{\sqrt{\mathbf{a}_j^{H}(n, f)\, \mathbf{a}_j(n, f)}} \qquad (5)

wherein \mathbf{a}_j(n, f) = (A_{1j}(n, f), \ldots, A_{mj}(n, f))^T is the j-th column of A(n, f), A_{ij}(n, f) is the element of the coefficient matrix A(n, f) in the i-th row and j-th column, \mathbf{a}_j^{H}(n, f) is the conjugate transpose of \mathbf{a}_j(n, f), and P_{ij}(n, f) takes a value between 0 and 1;
the value of the generalized orthogonal partial directed coherence is obtained by normalizing equation (5) with the diagonal covariance matrix of the noise vector ω (original equation image not reproduced), where n is the length of the time series, f is the frequency, and ω is the zero-mean white noise vector with a diagonal covariance matrix.
CN202110519615.5A 2021-05-12 2021-05-12 Emotion classification and recognition method of electroencephalogram signal self-adaptive window Pending CN115414051A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110519615.5A CN115414051A (en) 2021-05-12 2021-05-12 Emotion classification and recognition method of electroencephalogram signal self-adaptive window

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110519615.5A CN115414051A (en) 2021-05-12 2021-05-12 Emotion classification and recognition method of electroencephalogram signal self-adaptive window

Publications (1)

Publication Number Publication Date
CN115414051A true CN115414051A (en) 2022-12-02

Family

ID=84195463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110519615.5A Pending CN115414051A (en) 2021-05-12 2021-05-12 Emotion classification and recognition method of electroencephalogram signal self-adaptive window

Country Status (1)

Country Link
CN (1) CN115414051A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116250837A (en) * 2023-02-14 2023-06-13 天津大学 Depression detection device based on dynamic factor brain network
CN116250837B (en) * 2023-02-14 2024-02-13 天津大学 Depression detection device based on dynamic factor brain network
CN116400800A (en) * 2023-03-13 2023-07-07 中国医学科学院北京协和医院 ALS patient human-computer interaction system and method based on brain-computer interface and artificial intelligence algorithm
CN116400800B (en) * 2023-03-13 2024-01-02 中国医学科学院北京协和医院 ALS patient human-computer interaction system and method based on brain-computer interface and artificial intelligence algorithm
CN117708682A (en) * 2024-02-06 2024-03-15 吉林大学 Intelligent brain wave acquisition and analysis system and method
CN117708682B (en) * 2024-02-06 2024-04-19 吉林大学 Intelligent brain wave acquisition and analysis system and method

Similar Documents

Publication Publication Date Title
CN111329474B (en) Electroencephalogram identity recognition method and system based on deep learning and information updating method
Ince et al. Adapting subject specific motor imagery EEG patterns in space–time–frequency for a brain computer interface
CN115414051A (en) Emotion classification and recognition method of electroencephalogram signal self-adaptive window
CN112656427A (en) Electroencephalogram emotion recognition method based on dimension model
CN112244873A (en) Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network
CN111184509A (en) Emotion-induced electroencephalogram signal classification method based on transfer entropy
EP2416703A2 (en) A method for the real-time identification of seizures in an electroencephalogram (eeg) signal
CN114533086B (en) Motor imagery brain electrolysis code method based on airspace characteristic time-frequency transformation
Carrión-Ojeda et al. Analysis of factors that influence the performance of biometric systems based on EEG signals
CN111310570A (en) Electroencephalogram signal emotion recognition method and system based on VMD and WPD
Agarwal et al. Classification of alcoholic and non-alcoholic EEG signals based on sliding-SSA and independent component analysis
CN115770044B (en) Emotion recognition method and device based on electroencephalogram phase amplitude coupling network
CN110543831A (en) brain print identification method based on convolutional neural network
CN115804602A (en) Electroencephalogram emotion signal detection method, equipment and medium based on attention mechanism and with multi-channel feature fusion
Kauppi et al. Decoding magnetoencephalographic rhythmic activity using spectrospatial information
Carrión-Ojeda et al. A method for studying how much time of EEG recording is needed to have a good user identification
Samal et al. Ensemble median empirical mode decomposition for emotion recognition using EEG signal
Nakra et al. Feature Extraction and Dimensionality Reduction Techniques with Their Advantages and Disadvantages for EEG-Based BCI System: A Review.
CN117883082A (en) Abnormal emotion recognition method, system, equipment and medium
Anderson et al. EEG subspace representations and feature selection for brain-computer interfaces
Wankhade et al. IKKN predictor: An EEG signal based emotion recognition for HCI
Saha et al. Automatic emotion recognition from multi-band EEG data based on a deep learning scheme with effective channel attention
CN109117790B (en) Brain print identification method based on frequency space index
Puri et al. Wavelet packet sub-band based classification of alcoholic and controlled state EEG signals
Saini et al. Imagined object recognition using eeg-based neurological brain signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination