CN115204232A - Residual feature pyramid emotion recognition method and system based on electroencephalogram signals - Google Patents

Info

Publication number: CN115204232A
Application number: CN202210855623.1A
Authority: CN (China)
Prior art keywords: matrix, initial, characteristic, fusion, electroencephalogram
Legal status: Pending (an assumption by Google Patents, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 高强, 侯法政, 宋雨, 刘俊杰, 毛泽民
Current and Original Assignee: Tianjin University of Technology
Application filed by Tianjin University of Technology
Priority to CN202210855623.1A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a residual feature pyramid emotion recognition method and system based on electroencephalogram (EEG) signals, relating to the technical field of neural networks. The method comprises the following steps: acquiring EEG data of a person to be tested, collected by EEG electrodes placed according to the international 10-20 system; constructing feature matrices of the EEG data and fusing them to obtain a fusion matrix; constructing an emotion recognition model based on a residual bidirectional feature pyramid, the model comprising a space-to-depth structure, a residual structure and a bidirectional feature pyramid structure; and inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the person to be tested. The invention can capture spatial EEG representation information and EEG context information, reduce the amount of computation during model training, integrate high-level semantic information with low-level spatial information, and overcome the network degradation problem caused by increasing network depth.

Description

Residual feature pyramid emotion recognition method and system based on electroencephalogram signals
Technical Field
The invention relates to the technical field of artificial intelligence, and in particular to a residual feature pyramid emotion recognition method and system based on electroencephalogram signals.
Background
Emotion recognition aims to create a harmonious human-machine environment by giving computers the ability to recognize, understand and adapt to human emotion. The rapid development of non-invasive sensing technology, machine learning algorithms and computing power has driven cognitive science forward, and emotion recognition, as a frontier of cognitive science, has attracted the attention of many scholars. Although more and more emotion recognition methods have been proposed, two problems remain. First, time-domain, frequency-domain and time-frequency-domain features only consider the contextual correlation of emotion within a certain time span and ignore the spatial representation features of the electroencephalogram (EEG) signal, while spatial-domain features ignore the contextual relevance of emotional states. Second, the amount of computation required by a model grows as the network deepens, and blindly reducing feature dimensionality can discard part of the EEG representation information. Therefore, an emotion recognition method is needed that captures spatial EEG representation information and EEG context information, reduces the amount of computation, integrates high-level semantic information with low-level spatial information, and overcomes the network degradation problem caused by increasing network depth.
Disclosure of Invention
The invention aims to provide a residual feature pyramid emotion recognition method and system based on electroencephalogram signals, which can capture spatial electroencephalogram representation information and electroencephalogram context information, reduce the amount of computation during model training, integrate high-level semantic information with low-level spatial information, and overcome the network degradation problem caused by increasing network depth.
In order to achieve the purpose, the invention provides the following scheme:
a residual error feature pyramid emotion recognition method based on electroencephalogram signals comprises the following steps:
acquiring electroencephalogram data of a testee, which are acquired by electroencephalogram electrodes arranged according to an international 10-20 system;
constructing a feature matrix of the electroencephalogram data of the person to be tested, and fusing the feature matrix to obtain a fusion matrix; the characteristic matrix comprises an initial matrix, a left and right brain region symmetric difference matrix, a left and right brain region symmetric quotient matrix and a differential entropy matrix;
constructing an emotion recognition model based on a residual bidirectional feature pyramid; the emotion recognition model comprises a space-to-depth structure, a residual error structure and a bidirectional characteristic pyramid structure;
inputting the fusion matrix into a trained emotion recognition model to obtain the emotion category of the person to be tested; the emotion categories include happiness, anger, excitement, sadness, calmness, fear.
Optionally, the training process of the emotion recognition model includes:
acquiring electroencephalographic data of a subject acquired by electroencephalographic electrodes placed in accordance with the international 10-20 system;
constructing a feature matrix of electroencephalographic data of the subject;
fusing the feature matrix of the electroencephalogram data of the subject to obtain a fusion matrix of the electroencephalogram data of the subject;
and taking the fusion matrix of the electroencephalogram data of the subject as input, and training the emotion recognition model to obtain the trained emotion recognition model.
Optionally, the constructing a feature matrix of the electroencephalogram data of the subject specifically includes:
constructing an initial one-dimensional vector according to the number of electrode channels and the sampling frequency of the international 10-20 system;
constructing an initial two-dimensional matrix according to the initial one-dimensional vector and the electrode position of the international 10-20 system;
standardizing the initial two-dimensional matrix to obtain an initial matrix;
removing from the initial one-dimensional vector the elements corresponding to electrodes located on the anterior-posterior midline (sagittal line) of the international 10-20 system, to obtain a reduced one-dimensional vector;
renumbering the elements of the reduced one-dimensional vector to obtain a renumbered one-dimensional vector;
determining each element number in the renumbered one-dimensional vector and the symmetric element number symmetric to it;
taking the difference between the element corresponding to each element number and the element corresponding to its symmetric element number, to obtain a symmetric difference one-dimensional vector;
constructing a symmetrical difference two-dimensional matrix according to the symmetrical difference one-dimensional vector and the electrode position of the international 10-20 system;
standardizing the symmetry difference two-dimensional matrix to obtain a left and right brain region symmetry difference matrix;
dividing the element corresponding to each element number by the element corresponding to its symmetric element number to obtain a symmetric quotient one-dimensional vector;
constructing a symmetric quotient two-dimensional matrix according to the symmetric quotient one-dimensional vector and the electrode position of the international 10-20 system;
standardizing the symmetric quotient two-dimensional matrix to obtain a symmetric quotient matrix of the left and right brain areas;
extracting differential entropy characteristics of a set frequency band from the initial one-dimensional vector by adopting a set time window to obtain differential entropy characteristic vectors;
and constructing a differential entropy matrix according to the differential entropy eigenvector and the electrode position of the international 10-20 system.
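The steps above extract differential entropy features of a set frequency band with a set time window, but the patent does not spell out the formula. Under the common Gaussian assumption for band-limited EEG, differential entropy reduces to DE = 0.5 * ln(2 * pi * e * variance); the numpy sketch below (hypothetical function name, illustrative window size) shows that standard computation for one windowed channel.

```python
import numpy as np

def differential_entropy(window: np.ndarray) -> float:
    """Differential entropy of one band-limited EEG window under a
    Gaussian assumption: DE = 0.5 * ln(2 * pi * e * variance)."""
    variance = np.var(window)
    return 0.5 * np.log(2.0 * np.pi * np.e * variance)

# One hypothetical 1-second window at 128 Hz for a single channel.
rng = np.random.default_rng(0)
window = rng.normal(0.0, 2.0, 128)
de = differential_entropy(window)
```

Repeating this per channel and per frequency band, and placing each value at its electrode grid position, would yield the differential entropy matrix described above.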
Optionally, fusing the feature matrices to obtain a fusion matrix specifically includes:
preprocessing the feature matrices with a convolution operation to obtain initial feature maps;
standardizing the initial feature maps to obtain standardized feature maps;
applying an activation function to the standardized feature maps to obtain feature maps;
performing nearest-neighbor up-sampling on the feature maps to obtain their feature matrices;
and fusing the feature matrices of the feature maps to obtain the fusion matrix.
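The fusion steps above can be sketched in numpy. The convolution is replaced by a stand-in scalar weight and the standardization by a z-score; the function names, the 9 x 9 input size, the up-sampling factor of 2 and the ReLU choice are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def nearest_upsample(x: np.ndarray, scale: int = 2) -> np.ndarray:
    # Nearest-neighbour up-sampling by an integer factor on both axes.
    return np.repeat(np.repeat(x, scale, axis=0), scale, axis=1)

def preprocess(feat: np.ndarray) -> np.ndarray:
    # Stand-ins for the patent's convolution, standardisation and
    # activation: a hypothetical 1x1 conv (scalar weight), z-score, ReLU.
    x = 0.5 * feat
    x = (x - x.mean()) / (x.std() + 1e-8)
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
# Four hypothetical 9x9 feature matrices (initial, difference, quotient, DE).
mats = [rng.normal(size=(9, 9)) for _ in range(4)]
# Preprocess, up-sample, then fuse by stacking along a new channel axis.
fused = np.stack([nearest_upsample(preprocess(m)) for m in mats])
```

Stacking is one plausible reading of "fusing"; element-wise addition or channel concatenation after a learned projection would follow the same pipeline shape.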
Optionally, inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the person to be tested specifically includes:
sampling and recombining the fusion matrix at a fixed scale with the space-to-depth structure to obtain a plurality of initial fusion matrices;
unifying the channel numbers of the initial fusion matrices by convolution to obtain a plurality of unified-channel matrices;
extracting emotion category features from the unified-channel matrices with the bidirectional feature pyramid structure and the residual structure;
and classifying the emotion category features with a fully connected layer to output the emotion category.
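The space-to-depth sampling and recombination named above can be sketched as follows; the block size of 2 and the (H, W, C) layout are assumptions for illustration. The operation is lossless: each block x block spatial patch is moved into the channel axis, shrinking spatial resolution without discarding values, which is why it can replace a strided convolution while reducing computation.

```python
import numpy as np

def space_to_depth(x: np.ndarray, block: int) -> np.ndarray:
    """Rearrange an (H, W, C) map into (H/block, W/block, C*block*block):
    each block x block spatial patch is moved into the channel axis."""
    h, w, c = x.shape
    assert h % block == 0 and w % block == 0
    x = x.reshape(h // block, block, w // block, block, c)
    x = x.transpose(0, 2, 1, 3, 4)          # (H/b, W/b, b, b, C)
    return x.reshape(h // block, w // block, c * block * block)

x = np.arange(18 * 18 * 2, dtype=float).reshape(18, 18, 2)
y = space_to_depth(x, 2)                    # shape (9, 9, 8)
```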
A residual feature pyramid emotion recognition system based on electroencephalogram signals, applied to the above residual feature pyramid emotion recognition method based on electroencephalogram signals, the system comprising:
the data acquisition module is used for acquiring electroencephalogram data of a person to be tested, collected by electroencephalogram electrodes placed according to the international 10-20 system;
the feature matrix determination module is used for constructing the feature matrices of the electroencephalogram data of the person to be tested; the feature matrices comprise an initial matrix, a left-right brain region symmetric difference matrix, a left-right brain region symmetric quotient matrix and a differential entropy matrix;
the fusion matrix determination module is used for fusing the feature matrices to obtain a fusion matrix;
the model construction module is used for constructing an emotion recognition model based on a residual bidirectional feature pyramid; the emotion recognition model comprises a space-to-depth structure, a residual structure and a bidirectional feature pyramid structure;
and the emotion category determination module is used for inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the person to be tested; the emotion categories include happiness, anger, excitement, sadness, calmness and fear.
Optionally, the system further comprises a training module;
the training module comprises:
an electroencephalogram data acquisition sub-module for acquiring electroencephalogram data of a subject collected by electroencephalogram electrodes placed in accordance with the international 10-20 system;
a construction sub-module for constructing a feature matrix of the electroencephalographic data of the subject;
the fusion sub-module is used for fusing the feature matrix of the electroencephalogram data of the subject to obtain a fusion matrix of the electroencephalogram data of the subject;
and the training submodule is used for training the emotion recognition model by taking the fusion matrix of the electroencephalogram data of the subject as input to obtain the trained emotion recognition model.
Optionally, the feature matrix determination module includes:
the initial one-dimensional vector construction submodule is used for constructing an initial one-dimensional vector according to the number of electrode channels and the sampling frequency of the international 10-20 system;
the initial two-dimensional matrix construction submodule is used for constructing an initial two-dimensional matrix according to the initial one-dimensional vector and the electrode position of the international 10-20 system;
the initial matrix determining submodule is used for standardizing the initial two-dimensional matrix to obtain an initial matrix;
the eliminating submodule is used for eliminating elements in the initial one-dimensional vector, which correspond to electrodes positioned on front and rear sagittal lines in the international 10-20 system, so as to obtain an eliminated one-dimensional vector;
the renumbering submodule is used for renumbering the elements in the removed one-dimensional vector to obtain a renumbered one-dimensional vector;
a numbering determination submodule for determining the element number in the renumbered one-dimensional vector and a symmetric element number symmetric to the element number;
the symmetrical difference one-dimensional vector determining submodule is used for taking the difference between the element corresponding to the element number and the element corresponding to the symmetrical element number to obtain a symmetrical difference one-dimensional vector;
the symmetrical difference two-dimensional matrix determining submodule is used for constructing a symmetrical difference two-dimensional matrix according to the symmetrical difference one-dimensional vector and the electrode position of the international 10-20 system;
the symmetrical difference matrix determining submodule is used for standardizing the symmetrical difference two-dimensional matrix to obtain a symmetrical difference matrix of the left and right brain areas;
a symmetric quotient one-dimensional vector determining submodule for dividing the element corresponding to the element number by the element corresponding to the symmetric element number to obtain a symmetric quotient one-dimensional vector;
the symmetrical quotient two-dimensional matrix determining submodule is used for constructing a symmetrical quotient two-dimensional matrix according to the symmetrical quotient one-dimensional vector and the electrode position of the international 10-20 system;
the symmetric quotient matrix determining submodule is used for standardizing the symmetric quotient two-dimensional matrix to obtain a symmetric quotient matrix of the left and right brain areas;
the differential entropy characteristic vector determining submodule is used for extracting differential entropy characteristics of a set frequency band from the initial one-dimensional vector by adopting a set time window to obtain a differential entropy characteristic vector;
and the differential entropy matrix determining submodule is used for constructing a differential entropy matrix according to the differential entropy characteristic vector and the electrode position of the international 10-20 system.
Optionally, the fusion matrix determining module includes:
the initial feature map determining submodule is used for preprocessing the feature matrices of the electroencephalogram signal with a convolution operation to obtain initial feature maps;
the standardized feature map determining submodule is used for standardizing the initial feature maps to obtain standardized feature maps;
the feature map determining submodule is used for applying an activation function to the standardized feature maps to obtain feature maps;
the feature matrix determining submodule is used for performing nearest-neighbor up-sampling on the feature maps to obtain their feature matrices;
and the fusion matrix determining submodule is used for fusing the feature matrices of the feature maps to obtain the fusion matrix.
Optionally, the emotion category determination module includes:
the initial fusion matrix determining unit is used for sampling and recombining the fusion matrix at a fixed scale with the space-to-depth structure to obtain a plurality of initial fusion matrices;
the unified-channel matrix determining unit is used for unifying the channel numbers of the initial fusion matrices by convolution to obtain a plurality of unified-channel matrices;
the extraction unit is used for extracting emotion category features from the unified-channel matrices with the bidirectional feature pyramid structure and the residual structure;
and the result output unit is used for classifying the emotion category features with a fully connected layer and outputting the emotion category.
According to the specific embodiments provided by the invention, the invention discloses the following technical effects:
The invention provides a residual feature pyramid emotion recognition method based on electroencephalogram signals, comprising: acquiring electroencephalogram data of a person to be tested, collected by electroencephalogram electrodes placed according to the international 10-20 system; constructing feature matrices of the electroencephalogram data and fusing them to obtain a fusion matrix, the feature matrices comprising an initial matrix, a left-right brain region symmetric difference matrix, a left-right brain region symmetric quotient matrix and a differential entropy matrix; constructing an emotion recognition model based on a residual bidirectional feature pyramid, comprising a space-to-depth structure, a residual structure and a bidirectional feature pyramid structure; and inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the person to be tested, the emotion categories including happiness, anger, excitement, sadness, calmness and fear. Because four different matrices are constructed from the raw signal at each sampling point and their feature layers are fused, the model can capture spatial electroencephalogram representation information and electroencephalogram context information. Adopting the space-to-depth structure in place of a traditional convolutional backbone reduces the amount of computation during model training, and adding the residual structure on top of the bidirectional feature pyramid structure fuses high-level semantic information with low-level spatial information and overcomes the network degradation problem caused by increasing network depth.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flowchart of the residual feature pyramid emotion recognition method based on electroencephalogram signals provided by the invention;
FIG. 2 is an overall framework diagram of the residual feature pyramid emotion recognition method based on electroencephalogram signals provided by the invention;
FIG. 3 is a schematic view of the international 10-20 system electrode placement;
FIG. 4 is a schematic diagram of the space-to-depth (S2D) structure provided by the invention;
FIG. 5 is a schematic diagram of the cross-scale fusion portion of the residual feature pyramid provided by the invention;
FIG. 6 is a block diagram of the residual feature pyramid emotion recognition system based on electroencephalogram signals.
Description of the symbols:
the method comprises the following steps of 1-a data acquisition module, 2-a characteristic matrix determination module, 3-a fusion matrix determination module, 4-a model construction module and 5-an emotion category determination module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The invention aims to provide a residual feature pyramid emotion recognition method and system based on electroencephalogram signals, which can capture spatial electroencephalogram representation information and electroencephalogram context information, reduce the amount of computation during model training, integrate high-level semantic information with low-level spatial information, and overcome the network degradation problem caused by increasing network depth.
In order to make the aforementioned objects, features and advantages of the present invention more comprehensible, the present invention is described in detail with reference to the accompanying drawings and the detailed description thereof.
As shown in fig. 1 and fig. 2, the residual feature pyramid emotion recognition method based on electroencephalogram signals provided by the present invention includes:
step S1: acquiring electroencephalogram data of a testee, which are acquired by electroencephalogram electrodes arranged according to an international 10-20 system; specifically, the international 10-20 system electrode placement method is shown in fig. 3.
Step S2: constructing a feature matrix of electroencephalogram data of the testee, and fusing the feature matrix to obtain a fusion matrix; the characteristic matrix comprises an initial matrix, a left and right brain region symmetric difference matrix, a left and right brain region symmetric quotient matrix and a differential entropy matrix.
S2 specifically comprises the following steps:
step S201: constructing an initial one-dimensional vector according to the number of electrode channels and the sampling frequency of the international 10-20 system; specifically, a one-dimensional vector is constructed according to the number of electrode channels
Figure BDA0003754412970000081
Wherein n is the number of electrode channels,
Figure BDA0003754412970000082
f is the sampling frequency, and the number of one-dimensional vectors per experiment for each subject is duration x sampling frequency size.
Step S202: and constructing an initial two-dimensional matrix according to the initial one-dimensional vector and the electrode position of the international 10-20 system.
Specifically, for an actual electrode cap there are spatial relations between the electrodes, and a one-dimensional vector cannot represent this spatial information; therefore a 9 x 9 matrix is constructed according to the actual electrode positions to represent the spatial information of the electrodes. The initial two-dimensional matrix is given by formula (1) and formula (2) (9 x 9 electrode-position layouts not recoverable from the extracted text); all positions in the matrix not occupied by an electrode are set to 0 to avoid interference from other factors. The number of channels in formula (1) is 62, so when verifying the emotion recognition accuracy of the proposed method, the SEED or HIED data sets (62 channels) are used; the number of channels in formula (2) is 32, so the DEAP data set (32 channels) is used.
When the DEAP data set is used for verification, each subject yields 40 x 60 x 128 raw signal matrices (40 is the number of trials per subject, 60 the duration of each trial in seconds, and 128 the sampling frequency); when the SEED data set is used for verification, each subject yields 15 x 240 x 200 raw signal matrices.
Step S203: normalizing the initial two-dimensional matrix to obtain the initial matrix (OSM).
Specifically, to prevent large numerical differences from affecting subsequent classification, each two-dimensional matrix is normalized by z-score standardization to obtain the initial matrix (OSM). The calculation formula is:

z = (x - mu_l) / sigma_l    (3)

where x is a non-zero entry of the two-dimensional matrix at sampling point l, and mu_l and sigma_l respectively denote the mean and standard deviation of the non-zero values of that matrix.
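The z-score step above restricts the mean and standard deviation to the non-zero entries of the 9 x 9 matrix, since unused grid positions are padded with 0. A minimal numpy sketch (hypothetical function name and example values):

```python
import numpy as np

def zscore_nonzero(m: np.ndarray) -> np.ndarray:
    """z-score a 9x9 electrode matrix over its non-zero entries only,
    so the zero padding at unused grid positions does not bias mu/sigma."""
    nz = m != 0
    mu, sigma = m[nz].mean(), m[nz].std()
    out = np.zeros_like(m, dtype=float)
    out[nz] = (m[nz] - mu) / sigma
    return out

m = np.zeros((9, 9))
m[0, 3], m[0, 5], m[1, 3] = 10.0, 20.0, 30.0   # hypothetical electrode values
z = zscore_nonzero(m)
```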
Step S204: removing from the initial one-dimensional vector the elements corresponding to electrodes located on the anterior-posterior midline (sagittal line) of the international 10-20 system, to obtain a reduced one-dimensional vector.
Specifically, the midline electrodes are removed from the one-dimensional vector X_l; the midline electrodes in the DEAP data set are Fz, Cz, Pz and Oz, and those in the SEED or HIED data sets are FPz, Fz, FCz, Cz, CPz, Pz, POz and Oz.
Step S205: and renumbering the elements in the removed one-dimensional vector to obtain the renumbered one-dimensional vector.
Step S206: and determining the element number in the renumbered one-dimensional vector and a symmetrical element number symmetrical to the element number.
Specifically, n is the number of channels after the midline electrodes are removed: n = 28 in the DEAP data set and n = 54 in the SEED or HIED data sets. Symmetric electrode pairs are found according to the following rules:

When n = 28, x_l^i and x_l^(i+14) form an electrode pair for 0 < i <= 14, i a positive integer; and x_l^(i-14) and x_l^i form an electrode pair for 14 < i <= 28, i a positive integer.

When n = 54, x_l^i and x_l^(i+27) form an electrode pair for 0 < i <= 27, i a positive integer; and x_l^(i-27) and x_l^i form an electrode pair for 27 < i <= 54, i a positive integer.
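The pairing rule above maps channel i to channel i plus or minus n/2 after the midline electrodes are removed. A small helper (hypothetical, 1-based indices) makes the rule concrete:

```python
def symmetric_pair(i: int, n: int) -> int:
    """1-based index of the electrode symmetric to channel i, after the
    midline electrodes are removed; n = 28 (DEAP) or n = 54 (SEED/HIED)."""
    half = n // 2
    return i + half if i <= half else i - half

# Example: in DEAP (n = 28), channel 1 (FP1) pairs with channel 15 (FP2).
partner = symmetric_pair(1, 28)
```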
Step S207: taking the difference between the element corresponding to each element number and the element corresponding to its symmetric element number, to obtain the symmetric difference one-dimensional vector.
Specifically, the calculation follows formula (4) and formula (5):

d_l^i = x_l^i - x_l^(i+14), 0 < i <= 14    (4)
d_l^i = x_l^i - x_l^(i+27), 0 < i <= 27    (5)

where d_l^i denotes the electrode-pair difference corresponding to sampling point l, and i denotes the electrode index after the midline electrodes are removed. Taking verification on the DEAP data set as an example, d_1^1 equals channel one (FP1) minus channel fifteen (FP2) at the first sampling point; taking verification on the SEED data set as an example, d_1^1 equals channel one (FP1) minus channel twenty-eight (FP2) at the first sampling point. A symmetric difference one-dimensional vector D_l is thereby obtained. Formula (4) is the calculation used when verifying on the DEAP data set, and formula (5) is the calculation used when verifying on the SEED or HIED data sets.
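Assuming the renumbered, midline-free channel vector is ordered so that the first half are left-hemisphere electrodes and the second half their right-hemisphere mirrors (consistent with formulas (4) and (5)), the symmetric difference vector is one array subtraction:

```python
import numpy as np

def symmetric_difference(x: np.ndarray) -> np.ndarray:
    """Pairwise left-minus-right differences for a midline-free channel
    vector of even length n; yields n/2 values (14 for DEAP, 27 for SEED)."""
    half = x.size // 2
    return x[:half] - x[half:]

x = np.arange(1.0, 29.0)        # hypothetical 28-channel DEAP-style vector
d = symmetric_difference(x)     # d[0] corresponds to channel 1 - channel 15
```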
Step S208: and constructing a symmetrical difference two-dimensional matrix according to the symmetrical difference one-dimensional vector and the electrode position of the international 10-20 system.
Specifically, the symmetric difference two-dimensional matrix D_2D_l is given by formula (6) and formula (7) (9 x 9 layouts placing each pair difference at its electrode grid position; not recoverable from the extracted text). Formula (6) is the symmetric difference two-dimensional matrix used when verifying the emotion recognition accuracy of the proposed method on the SEED or HIED data sets; formula (7) is the one used when verifying on the DEAP data set.
Step S209: and standardizing the symmetry difference two-dimensional matrix to obtain a left and right brain region Symmetry Difference Matrix (SDM).
Specifically, in order to avoid the influence of excessively large numerical differences on the classification model, z-score standardization is adopted to obtain the left and right brain region Symmetric Difference Matrix (SDM). The calculation formula is shown in formula (3).
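A minimal sketch of the z-score standardization referred to as formula (3); the small epsilon guard against division by zero is an added assumption, not part of the patent.

```python
import numpy as np

# z-score standardization: subtract the mean and divide by the standard
# deviation of the matrix, so values are centered at 0 with unit spread.
def zscore(x, eps=1e-8):
    return (x - x.mean()) / (x.std() + eps)

sdm = zscore(np.array([[1.0, 2.0], [3.0, 4.0]]))
```

The same routine is reused for the SQM and DEM in steps S212 and S214.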
Step S210: and dividing the element corresponding to the element number by the element corresponding to the symmetric element number to obtain a symmetric quotient one-dimensional vector.
Specifically, the calculation is performed according to equations (8) and (9):
[Formulas (8) and (9), equation images not reproduced: the symmetric quotient computed for each electrode pair at each sampling point]

wherein the left-hand term of each formula represents the electrode pair quotient corresponding to sampling point l, and i represents the electrode serial number after the middle electrodes are removed. Taking verification on the DEAP data set as an example, the quotient for the tenth sampling point is equal to channel five (FC5) divided by channel nineteen (FC6); taking verification on the SEED or HIED data set as an example, it is equal to channel five (FC5) divided by channel thirty-two (FC6). A symmetric quotient one-dimensional vector is thereby obtained. Further, formula (8) is the calculation method when the DEAP data set is used for verification, and formula (9) is the calculation method when the SEED data set or the HIED data set is used for verification.
Step S211: and constructing a symmetrical quotient two-dimensional matrix according to the symmetrical quotient one-dimensional vector and the electrode position of the international 10-20 system.
Specifically, the symmetric quotient two-dimensional matrix Q_2D_l is given by formulas (10) and (11).

[Formulas (10) and (11), equation images not reproduced: the symmetric quotient two-dimensional matrices for the two electrode layouts]

Formula (10) is the symmetric quotient two-dimensional matrix obtained when the SEED data set or the HIED data set is used to verify the emotion recognition accuracy of the electroencephalogram-signal-based residual feature pyramid emotion recognition method; formula (11) is the symmetric quotient two-dimensional matrix obtained when the DEAP data set is used for the same verification.
Step S212: standardizing the symmetric quotient two-dimensional matrix to obtain a Symmetric Quotient Matrix (SQM) of the left and right brain regions; specifically, in order to avoid the influence of too large numerical difference on the classification model, the normalization is still performed by adopting z-score standardization to obtain a left and right brain region Symmetric Quotient Matrix (SQM). The calculation formula is shown in formula (3).
Step S213: extracting differential entropy characteristics of a set frequency band from the initial one-dimensional vector by adopting a set time window to obtain differential entropy characteristic vectors; specifically, to ensure that the amount of data is equivalent, differential entropy features (DE) of five frequency bands, which are a δ band, a θ band, an α band, a β band, and a γ band, are extracted using a 1-second time window. Wherein, the delta wave band is 1Hz-4Hz, the theta wave band is 4Hz-7Hz, the alpha wave band is 8Hz-12Hz, the beta wave band is 13Hz-30Hz, and the gamma wave band is 31Hz-45Hz. This resulted in 60 × 5 × 32 features per trial per subject in the DEAP dataset, and 240 × 5 × 62 features in the SEED dataset or 231 × 5 × 62 features in the HIED dataset.
Step S214: and constructing a differential entropy matrix according to the differential entropy eigenvector and the electrode position of the international 10-20 system. Specifically, a differential entropy two-dimensional matrix is constructed according to the differential entropy characteristic vector and the electrode position of the international 10-20 system; and standardizing the differential entropy two-dimensional matrix to obtain a Differential Entropy Matrix (DEM).
The differential entropy two-dimensional matrix O_2D_l is given by formulas (12) and (13).

[Formulas (12) and (13), equation images not reproduced: the differential entropy two-dimensional matrices for the two electrode layouts]

Formula (12) is the differential entropy two-dimensional matrix obtained when the SEED data set or the HIED data set is used to verify the emotion recognition accuracy of the electroencephalogram-signal-based residual feature pyramid emotion recognition method; formula (13) is the differential entropy two-dimensional matrix obtained when the DEAP data set is used for the same verification. In order to avoid the influence of excessively large numerical differences on the classification model, z-score standardization is adopted to obtain the Differential Entropy Matrix (DEM). The calculation formula is shown in formula (3).
Step S215: and preprocessing the characteristic matrix by applying convolution operation to obtain an initial characteristic diagram.
Specifically, the four feature matrices obtained in the above steps are respectively subjected to preliminary feature processing by a 1 × 1 convolution operation with 50 convolution kernels.
Step S216: standardizing the initial characteristic diagram to obtain a standardized characteristic diagram; specifically, the initial feature map is normalized by batch normalization.
Step S217: applying an activation function to the standardized feature map to obtain a feature map; specifically, the activation function is a SiLU activation function; after this step, a 50 × 9 × 9 feature map of each feature matrix is obtained.
Step S218: carrying out nearest neighbor up-sampling on the feature map to obtain a feature matrix of the feature map; specifically, in order to fully capture the spatial information of the electroencephalogram signal, nearest neighbor upsampling is performed on the feature map. Furthermore, nearest neighbor upsampling is carried out on the feature map of 50 × 9 × 9, four feature matrixes obtained through upsampling have the dimensionality of 50 × 62 × 62, and then the four feature matrixes are fused.
Step S219: and fusing the feature matrices of the feature maps to obtain a fusion matrix. Specifically, the four feature matrices obtained through upsampling are fused; the fusion operation is a simple concatenation along the channel dimension, which yields a 200 × 62 × 62 feature map.
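Steps S215 to S219 (1 × 1 convolution, normalization, SiLU, nearest-neighbor upsampling, channel concatenation) can be sketched in plain numpy as below; the shapes are scaled down from the patent's 50 × 9 × 9 → 50 × 62 × 62 → 200 × 62 × 62 for brevity, and batch normalization is omitted for simplicity.

```python
import numpy as np

def silu(x):
    return x / (1.0 + np.exp(-x))        # equivalent to x * sigmoid(x)

def conv1x1(x, w):
    """1x1 convolution: x is (C_in, H, W), w is (C_out, C_in)."""
    return np.einsum("oc,chw->ohw", w, x)

def upsample_nearest(x, factor):
    """Nearest-neighbour upsampling of the two spatial axes."""
    return x.repeat(factor, axis=1).repeat(factor, axis=2)

rng = np.random.default_rng(0)
mats = [rng.standard_normal((2, 3, 3)) for _ in range(4)]  # 4 feature matrices
w = rng.standard_normal((5, 2))                            # 5 output channels
maps = [silu(conv1x1(m, w)) for m in mats]                 # (5, 3, 3) each
ups = [upsample_nearest(m, 2) for m in maps]               # (5, 6, 6) each
fused = np.concatenate(ups, axis=0)                        # (20, 6, 6)
```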
And step S3: constructing an emotion recognition model based on a residual bidirectional feature pyramid; the emotion recognition model comprises a space-to-depth structure, a residual error structure and a bidirectional feature pyramid structure.
Training the emotion recognition model, wherein the training process of the emotion recognition model comprises the following steps:
step S301: electroencephalographic data is acquired from a subject collected by electroencephalographic electrodes placed according to the international 10-20 system.
Step S302: constructing a feature matrix of the electroencephalographic data of the subject.
Step S303: and fusing the feature matrix of the electroencephalogram data of the subject to obtain a fusion matrix of the electroencephalogram data of the subject.
Step S304: and taking the fusion matrix of the electroencephalogram data of the subject as input, and training the emotion recognition model to obtain the trained emotion recognition model.
The hyper-parameters of the trained emotion recognition model are shown in table 1.
TABLE 1 hyper-parameters of trained emotion recognition models
[Table 1, table images not reproduced: hyper-parameter names and values of the trained emotion recognition model]
And step S4: inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the person to be tested; the emotion categories include happiness, anger, excitement, sadness, calmness and fear. Happiness and excitement belong to positive emotions; anger, sadness and fear belong to negative emotions; calmness is a neutral emotion.
S4 specifically comprises the following steps:
step S401: and sampling and recombining the fusion matrix by using a space-to-depth structure and a fixed scale to obtain a plurality of initial fusion matrices.
Specifically, regarding the Space-to-Depth (S2D) structure: since the concept of the feature pyramid was proposed, most researchers have used a convolutional neural network as the backbone portion of the feature pyramid network to obtain feature maps of different scales. However, as convolutional networks have continued to develop, the convolutional networks used as the network backbone have become deeper and deeper, which undoubtedly increases the computational cost of the model. For affective computing, the final goal is to realize real-time human-machine interaction, so reducing the model's computational cost is an indispensable link in achieving this goal. Furthermore, the FPN focuses more on the fusion of high-level semantic information with low-level spatial information. Therefore, the S2D structure replaces the traditional convolutional network backbone. Of course, this does not mean abandoning convolutional computation altogether: the S2D structure is combined with a convolutional network to form the S2D layer as the basis of the backbone network. By transferring spatial dimension information to the depth dimension, the S2D structure samples and recombines the feature map at a fixed scale, replacing the traditional down-sampling operation without adding extra parameters. Then, a 1 × 1 convolution and the SiLU activation function are used, with a fixed number of channels, to output a plurality of initial fusion matrices. Details of the backbone network based on the S2D structure are shown in fig. 4.
In practical application, the fusion matrix is initially processed by a 3 × 3 convolution, batch normalization and the SiLU activation function; this operation is cycled twice, and the spatial information of the matrix is effectively extracted by convolution. Specifically, cycling the operation twice yields a matrix of size 31 × 31, with the convolution stride of the second cycle set to 2, so that the matrix size becomes 1/2 of the original. Then, through four cycles of the S2D layer, feature matrices M_3, M_4, M_5 and M_6 of four different sizes are obtained: 15 × 15, 7 × 7, 3 × 3 and 1 × 1.
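The parameter-free rearrangement at the heart of the S2D structure can be sketched as follows; the block size of 2 and the toy input are illustrative.

```python
import numpy as np

# Space-to-depth: split a (C, H, W) map into b x b spatial blocks and move
# each block into the channel dimension, shrinking H and W by the factor b
# without any learned parameters (unlike strided convolution or pooling).
def space_to_depth(x, b=2):
    c, h, w = x.shape
    assert h % b == 0 and w % b == 0
    x = x.reshape(c, h // b, b, w // b, b)
    return x.transpose(0, 2, 4, 1, 3).reshape(c * b * b, h // b, w // b)

x = np.arange(16, dtype=float).reshape(1, 4, 4)
y = space_to_depth(x)          # (1, 4, 4) -> (4, 2, 2)
```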
Step S402: unifying the channel numbers of the initial fusion matrixes by using convolution operation to obtain a plurality of unified channel number matrixes; specifically, the number of channels of the initial fusion matrices is unified by using 1 × 1 convolution operation, and the optimal number of channels is 256 after multiple verification.
Step S403: extracting the emotion category characteristics of the uniform channel number matrixes by using a bidirectional characteristic pyramid structure and a residual error structure; the bidirectional characteristic pyramid structure and the residual error structure can fully acquire fusion information between layers and between nodes.
Specifically, regarding the Residual Feature Pyramid Network (RFPN): the initial Feature Pyramid Network (FPN) was proposed to better fuse semantic information of different scales and to complete deep semantic feature extraction from multi-scale information. The invention applies a residual structure on the basis of BiFPN to construct the Residual Feature Pyramid Network (RFPN). The RFPN consists of two parts: residual connection and cross-scale fusion. Residual connection modularizes the pyramid units so that simple stacking cycles can be performed. The residual structure, on the one hand, fuses information between feature-map nodes through shortcuts; on the other hand, it avoids the gradient explosion problem caused by increasing the number of cycles of the feature pyramid structure, thereby increasing the feature extraction capability of the pyramid. Cross-scale fusion considers both the feature map of the same scale and the feature maps of adjacent scales, effectively avoiding large-scale changes of the feature information. As shown in figure 5, taking feature map M'_5 as an example, its final output fuses the nearest-neighbor upsampling of the previous layer M_6, the same-scale input M_5, the max-pooled downsampling of the lower layer M_4, and the downsampling of the same-level node M'_4.
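One fusion node of the kind described for M'_5 can be sketched as below; plain addition stands in for the learned fusion weights that a real BiFPN/RFPN node would apply, so this is only a shape-level illustration.

```python
import numpy as np

def up2(x):                       # nearest-neighbour upsample by 2
    return x.repeat(2, axis=0).repeat(2, axis=1)

def maxpool2(x):                  # 2x2 max-pool downsample
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def fuse_node(m_coarse, m_same, m_fine, n_fine):
    """Sum the upsampled coarser map, the same-scale input (residual
    shortcut), and downsampled maps from the finer level and the
    same-level intermediate node."""
    return up2(m_coarse) + m_same + maxpool2(m_fine) + maxpool2(n_fine)

m6 = np.ones((2, 2))              # coarser level
m5 = np.ones((4, 4))              # same-scale input
m4 = np.ones((8, 8))              # finer level
n4 = np.full((8, 8), 2.0)         # same-level intermediate node
m5_out = fuse_node(m6, m5, m4, n4)    # shape (4, 4)
```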
In the invention, the S2D structure is applied to form the backbone network of the emotion recognition model, and the RFPN module is stacked in simple cycles to perform effective cross-scale and cross-layer feature fusion.
Step S404: and classifying the emotion classification features by applying a full connection layer, and outputting emotion classifications. Specifically, a final prediction result of the emotion classification of the testee is obtained through a classification prediction network, namely a full connection layer.
In order to better verify that the feature matrices are complementary for emotion recognition, fourteen experimental paradigms are constructed for experimental verification. Through these model experiments, the influence of the left and right brain region differences on emotion recognition performance can be verified, as well as the fact that fusing the traditional frequency-domain features can make up for the deficiency of the symmetric matrices. The results of the comparison experiments with different features are shown in table 2.
TABLE 2 comparison of the results of the experiments with different characteristics
[Table 2, table images not reproduced: comparison results of the experiments with different feature combinations]
In all experimental results, when the input matrix is a single matrix, the SDM performs best, which further verifies that the left and right brain region differential matrices can effectively represent affective information. In addition, the difference between the left and right brain regions is more representative of the difference in the responses of the left and right brain regions to different emotional states than the quotient. When a plurality of matrixes are used as input, the combination of the SDM, the SQM and the DEM is superior to the combination of other matrixes, and therefore the three matrixes have good complementarity. The fusion method of the four matrixes provided by the invention is superior to other experimental paradigms, and the best performance is achieved.
In addition, in order to prove the superiority of the S2D layer, a classification model is constructed using ResNet18 as the network backbone. Likewise, the number of network layers is unified to increase the confidence of the experimental results. Taking the DEAP data set as an example, the model comparison results for the different backbones are shown in table 3.
TABLE 3 results of model comparison experiments with different trunks
[Table 3, table images not reproduced: comparison results of the models with different backbones]
Obviously, compared with ResNet18 as the network backbone, the S2D layer performs better, and the training time of the model is significantly shortened, which provides a reference for the possibility of real-time human-machine interaction. Here H represents happiness, A represents anger, E represents excitement, S represents sadness, N represents calm, and F represents fear.
In addition, four classification models are constructed based on different FPN structures and verified on the three data sets; the comparison results for the different pyramid models are shown in table 4. In order to increase the persuasiveness of the experimental results, network structures of the same depth are constructed.
TABLE 4 comparison of experimental results for different pyramid models
[Table 4, table images not reproduced: comparison results of the models with different pyramid structures]
Experimental results show that the feature pyramid structure proposed by the invention (namely the RFPN) performs better. The RFPN fully fuses the feature semantic information between layers and between nodes, and reduces the influence of gradient vanishing or gradient explosion as the number of layers increases. Here H represents happiness, A represents anger, E represents excitement, S represents sadness, N represents calm, and F represents fear.
In addition, comparing emotion recognition models using different features and different networks in the existing research, the results of the comparison of related works are shown in table 5.
TABLE 5 correlation work comparison results
[Table 5, table images not reproduced: comparison with related works using different features and networks]
As can be seen from table 5, the electroencephalogram-signal-based residual feature pyramid emotion recognition method achieves satisfactory results in emotion recognition. On DEAP, the subject-dependent two-class experiments reach an average accuracy of 96.89% in the valence dimension and 96.82% in the arousal dimension, and the four-class experiment reaches 93.56%. On the SEED data set, the three-class and two-class accuracies reach 98.59% and 92.84%, respectively.
The method constructs different feature matrices based on the sampling points and fuses them to represent different emotional state information. It not only considers the left and right brain region differences of human emotional states, but also fuses the frequency-domain feature DE, which represents the contextual correlation of the electroencephalogram signal. In addition, the S2D layer replaces the traditional CNN as the backbone of the classification model, and a feature pyramid fusing high-level semantic and low-level spatial information is used, which reduces the computational cost of the model. The RFPN is constructed on the basis of BiFPN and replaces the traditional FPN. The RFPN consists of two parts: cross-layer connection based on a residual structure, and cross-scale fusion. It fully captures the information fusion between layers and between nodes, and reduces the influence of the gradient vanishing or gradient explosion that may be caused by increasing the model depth. The electroencephalogram-signal-based residual feature pyramid emotion recognition method is applied to emotion recognition on the public DEAP and SEED data sets. In addition, the experimental results are compared with other studies based on the public data sets, proving that the emotion recognition model based on the residual bidirectional feature pyramid provided by the invention achieves a satisfactory effect.
As shown in fig. 6, the residual feature pyramid emotion recognition system based on electroencephalogram signals provided by the present invention is applied to the residual feature pyramid emotion recognition method based on electroencephalogram signals, and the system includes:
the data acquisition module 1 is used for acquiring electroencephalogram data of a person to be tested, which are acquired by electroencephalogram electrodes arranged according to the international 10-20 system.
The characteristic matrix determining module 2 is used for constructing a characteristic matrix of electroencephalogram data of the testee; the characteristic matrix comprises an initial matrix, a left and right brain region symmetric difference matrix, a left and right brain region symmetric quotient matrix and a differential entropy matrix.
And the fusion matrix determining module 3 is used for fusing the characteristic matrix to obtain a fusion matrix.
The model construction module 4 is used for constructing an emotion recognition model based on a residual bidirectional feature pyramid; the emotion recognition model comprises a space-to-depth structure, a residual error structure and a bidirectional feature pyramid structure.
The emotion category determining module 5 is used for inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the person to be tested; the emotion categories include happiness, anger, excitement, sadness, calmness, fear.
As a specific embodiment, the system further comprises a training module.
The training module comprises:
and the electroencephalogram data acquisition sub-module is used for acquiring electroencephalogram data of the subject acquired by electroencephalogram electrodes placed according to the international 10-20 system.
A construction sub-module for constructing a feature matrix of the electroencephalographic data of the subject.
And the fusion sub-module is used for fusing the feature matrix of the electroencephalogram data of the subject to obtain a fusion matrix of the electroencephalogram data of the subject.
And the training submodule is used for training the emotion recognition model by taking the fusion matrix of the electroencephalogram data of the subject as input to obtain the trained emotion recognition model.
As a specific embodiment, the feature matrix determining module 2 includes:
and the initial one-dimensional vector construction submodule is used for constructing an initial one-dimensional vector according to the number of electrode channels and the sampling frequency of the international 10-20 system.
And the initial two-dimensional matrix construction submodule is used for constructing an initial two-dimensional matrix according to the initial one-dimensional vector and the electrode position of the international 10-20 system.
And the initial matrix determining submodule is used for standardizing the initial two-dimensional matrix to obtain an initial matrix.
And the eliminating submodule is used for eliminating elements in the initial one-dimensional vector, which correspond to the electrodes on the front and rear sagittal lines in the international 10-20 system, so as to obtain the eliminated one-dimensional vector.
And the renumbering submodule is used for renumbering the elements in the removed one-dimensional vector to obtain the renumbered one-dimensional vector.
And the number determining submodule is used for determining the element number in the renumbered one-dimensional vector and a symmetrical element number symmetrical to the element number.
And the symmetrical difference one-dimensional vector determining submodule is used for taking the difference between the element corresponding to the element number and the element corresponding to the symmetrical element number to obtain a symmetrical difference one-dimensional vector.
And the symmetrical difference two-dimensional matrix determining submodule is used for constructing a symmetrical difference two-dimensional matrix according to the symmetrical difference one-dimensional vector and the electrode position of the international 10-20 system.
And the symmetry difference matrix determining submodule is used for standardizing the symmetry difference two-dimensional matrix to obtain a left and right brain area symmetry difference matrix.
And the symmetric quotient one-dimensional vector determining submodule is used for dividing the element corresponding to the element number with the element corresponding to the symmetric element number to obtain a symmetric quotient one-dimensional vector.
And the symmetric quotient two-dimensional matrix determining submodule is used for constructing a symmetric quotient two-dimensional matrix according to the symmetric quotient one-dimensional vector and the electrode position of the international 10-20 system.
And the symmetric quotient matrix determining submodule is used for standardizing the symmetric quotient two-dimensional matrix to obtain a symmetric quotient matrix of the left and right brain areas.
And the differential entropy characteristic vector determining submodule is used for extracting the differential entropy characteristics of the set frequency band from the initial one-dimensional vector by adopting a set time window to obtain the differential entropy characteristic vector.
And the differential entropy matrix determining submodule is used for constructing a differential entropy matrix according to the differential entropy eigenvector and the electrode position of the international 10-20 system.
As a specific embodiment, the fusion matrix determining module 3 includes:
and the initial characteristic map determining submodule is used for preprocessing the characteristic matrix of the electroencephalogram signal by applying convolution operation to obtain an initial characteristic map.
And the standardized characteristic diagram determining submodule is used for standardizing the initial characteristic diagram to obtain a standardized characteristic diagram.
And the characteristic diagram determining submodule is used for applying an activation function to the standardized characteristic diagram to obtain the characteristic diagram.
And the characteristic matrix determination submodule is used for carrying out nearest neighbor up-sampling on the characteristic diagram to obtain the characteristic matrix of the characteristic diagram.
And the fusion matrix determining submodule is used for fusing the characteristic matrix of the characteristic diagram to obtain a fusion matrix.
Wherein, the emotion classification determination module 5 comprises:
and the initial fusion matrix determining unit is used for sampling and recombining the fusion matrix with a fixed scale by utilizing a space-to-depth structure to obtain a plurality of initial fusion matrices.
And the unified channel number matrix determining unit is used for unifying the channel numbers of the plurality of initial fusion matrixes by using convolution operation to obtain a plurality of unified channel number matrixes.
And the extraction unit is used for extracting the emotion category characteristics of the uniform channel number matrixes by using the bidirectional characteristic pyramid structure and the residual error structure.
And the result output unit is used for applying the full connection layer to classify the emotion classification features and outputting the emotion classification.
The electroencephalogram-signal-based residual feature pyramid emotion recognition method and system are based on a sampling-point feature extraction and fusion strategy, applied to training and classification with a residual feature pyramid network structure. First, an Original Signal Matrix (OSM), a left and right brain region Symmetric Difference Matrix (SDM), a left and right brain region Symmetric Quotient Matrix (SQM) and a Differential Entropy Matrix (DEM) are constructed based on the sampling points, and feature-layer fusion is performed on the four feature matrices. This feature fusion strategy integrates the difference features of the left and right brain regions and the contextual relationship information of the electroencephalogram. The fused features are then put into the classification model for training. The classification model is composed of a Space-to-Depth (S2D) structure, which replaces the convolutional network to form the backbone, and the Residual Feature Pyramid Network (RFPN) provided by the method. The method not only emphasizes the feature pyramid's fusion of high-level semantic information and low-level spatial information, but also reduces the training duration and the possible influence brought by deepening the number of layers.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.

Claims (10)

1. A residual error feature pyramid emotion recognition method based on electroencephalogram signals is characterized by comprising the following steps:
acquiring electroencephalogram data of a testee, which are acquired by electroencephalogram electrodes arranged according to an international 10-20 system;
constructing a feature matrix of electroencephalogram data of the testee, and fusing the feature matrix to obtain a fusion matrix; the characteristic matrix comprises an initial matrix, a left and right brain region symmetric difference matrix, a left and right brain region symmetric quotient matrix and a differential entropy matrix;
constructing an emotion recognition model based on a residual bidirectional feature pyramid; the emotion recognition model comprises a space-to-depth structure, a residual error structure and a bidirectional characteristic pyramid structure;
inputting the fusion matrix into a trained emotion recognition model to obtain the emotion category of the person to be tested; the emotion categories include happiness, anger, excitement, sadness, calmness, fear.
2. The electroencephalogram signal-based residual feature pyramid emotion recognition method of claim 1, wherein the training process of the emotion recognition model comprises:
acquiring electroencephalographic data of a subject acquired by electroencephalographic electrodes placed in accordance with the international 10-20 system;
constructing a feature matrix of electroencephalographic data of the subject;
fusing the feature matrix of the electroencephalogram data of the subject to obtain a fusion matrix of the electroencephalogram data of the subject;
and taking the fusion matrix of the electroencephalogram data of the subject as input, and training the emotion recognition model to obtain the trained emotion recognition model.
3. The method for residual feature pyramid emotion recognition based on electroencephalogram signals of claim 1, wherein the constructing the feature matrix of electroencephalogram data of the subject specifically comprises:
constructing an initial one-dimensional vector according to the number of electrode channels and the sampling frequency of the international 10-20 system;
constructing an initial two-dimensional matrix according to the initial one-dimensional vector and the electrode position of the international 10-20 system;
standardizing the initial two-dimensional matrix to obtain an initial matrix;
elements in the initial one-dimensional vector corresponding to electrodes positioned on front and rear sagittal lines in the international 10-20 system are removed, and the removed one-dimensional vector is obtained;
renumbering elements in the one-dimensional vector after being eliminated to obtain a one-dimensional vector after renumbering;
determining element numbers in the renumbered one-dimensional vectors and symmetrical element numbers symmetrical to the element numbers;
taking the difference between the element corresponding to the element number and the element corresponding to the symmetric element number to obtain a symmetric difference one-dimensional vector;
constructing a symmetric difference two-dimensional matrix according to the symmetric difference one-dimensional vector and the electrode positions of the international 10-20 system;
standardizing the symmetric difference two-dimensional matrix to obtain a left-right brain-region symmetric difference matrix;
dividing the element corresponding to the element number by the element corresponding to the symmetric element number to obtain a symmetric quotient one-dimensional vector;
constructing a symmetric quotient two-dimensional matrix according to the symmetric quotient one-dimensional vector and the electrode positions of the international 10-20 system;
standardizing the symmetric quotient two-dimensional matrix to obtain a left-right brain-region symmetric quotient matrix;
extracting differential entropy features of a set frequency band from the initial one-dimensional vector using a set time window to obtain a differential entropy feature vector;
and constructing a differential entropy matrix according to the differential entropy feature vector and the electrode positions of the international 10-20 system.
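The per-vector computations of claim 3 can be sketched as follows. The electrode-pairing index lists, the z-score standardization, and the Gaussian form of differential entropy (0.5·ln(2πe·σ²), the form commonly used in EEG emotion work) are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np

def symmetric_vectors(x, left_idx, right_idx):
    """Symmetric-difference and symmetric-quotient vectors from a
    one-dimensional per-channel vector `x`; `left_idx`/`right_idx` are
    hypothetical index lists pairing electrodes mirrored across the
    sagittal midline (midline channels already removed)."""
    left, right = x[left_idx], x[right_idx]
    return left - right, left / right

def differential_entropy(window):
    """Differential entropy of a band-limited EEG window under the usual
    Gaussian assumption: 0.5 * ln(2 * pi * e * var)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(window))

def standardize(m):
    """Z-score standardization used to turn each 2-D map into a matrix."""
    return (m - m.mean()) / (m.std() + 1e-8)
```

Each resulting vector is then scattered into a 2-D grid according to the 10-20 electrode layout and standardized, yielding the four feature matrices the claim names.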
4. The residual feature pyramid emotion recognition method based on electroencephalogram signals according to claim 1, wherein fusing the feature matrix to obtain a fusion matrix specifically comprises:
preprocessing the feature matrix with a convolution operation to obtain an initial feature map;
standardizing the initial feature map to obtain a standardized feature map;
applying an activation function to the standardized feature map to obtain a feature map;
performing nearest-neighbor up-sampling on the feature map to obtain a feature matrix of the feature map;
and fusing the feature matrices of the feature maps to obtain the fusion matrix.
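The fusion pipeline of claim 4 can be sketched per matrix as standardize → activation → nearest-neighbor upsample, followed by an elementwise sum across matrices. This is a minimal numpy sketch: the initial convolution is omitted for brevity, ReLU is assumed as the activation, and sum fusion is one plausible reading of "fusing the feature matrices":

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def standardize(x):
    """Z-score standardization of a feature map."""
    return (x - x.mean()) / (x.std() + 1e-8)

def nearest_upsample(x, scale):
    """Nearest-neighbour up-sampling by an integer factor: each value
    is repeated scale times along both spatial axes."""
    return x.repeat(scale, axis=0).repeat(scale, axis=1)

def fuse_features(mats, scale=2):
    """Claim-4 pipeline (convolution elided): standardize -> ReLU ->
    nearest-neighbour upsample each matrix, then sum elementwise."""
    processed = [nearest_upsample(relu(standardize(m)), scale) for m in mats]
    return np.sum(processed, axis=0)
```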
5. The residual feature pyramid emotion recognition method based on electroencephalogram signals according to claim 1, wherein inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the subject specifically comprises:
sampling and regrouping the fusion matrix at a fixed scale using a space-to-depth structure to obtain a plurality of initial fusion matrices;
unifying the channel numbers of the plurality of initial fusion matrices by a convolution operation to obtain a plurality of unified-channel-number matrices;
extracting emotion category features from the plurality of unified-channel-number matrices using a bidirectional feature pyramid structure and a residual structure;
and classifying the emotion category features with a fully connected layer to output the emotion category.
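The space-to-depth sampling and the final fully connected classification of claim 5 can be sketched as follows (a hedged numpy illustration: the `EMOTIONS` ordering and the weights `W`, `b` are hypothetical, and the bidirectional-pyramid and residual stages between the two functions are omitted):

```python
import numpy as np

def space_to_depth(x, block=2):
    """Sample-and-regroup an (H, W, C) map into (H/block, W/block,
    C*block*block): each block x block spatial patch becomes channels."""
    h, w, c = x.shape
    x = x.reshape(h // block, block, w // block, block, c)
    x = x.transpose(0, 2, 1, 3, 4)          # gather each patch together
    return x.reshape(h // block, w // block, c * block * block)

# Hypothetical label ordering for the six categories named in the claims.
EMOTIONS = ["happiness", "anger", "excitement", "sadness", "calmness", "fear"]

def classify(features, W, b):
    """Fully connected layer plus argmax over the six emotion categories
    (W and b stand for trained weights)."""
    logits = features.ravel() @ W + b
    return EMOTIONS[int(np.argmax(logits))]
```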
6. A residual feature pyramid emotion recognition system based on electroencephalogram signals, comprising:
the data acquisition module is used for acquiring electroencephalogram data of a subject collected by electroencephalogram electrodes placed in accordance with the international 10-20 system;
the characteristic matrix determining module is used for constructing a characteristic matrix of electroencephalogram data of the testee; the characteristic matrix comprises an initial matrix, a left and right brain region symmetric difference matrix, a left and right brain region symmetric quotient matrix and a differential entropy matrix;
the fusion matrix determining module is used for fusing the characteristic matrix to obtain a fusion matrix;
the model construction module is used for constructing an emotion recognition model based on a residual bidirectional feature pyramid; the emotion recognition model comprises a space-to-depth structure, a residual error structure and a bidirectional feature pyramid structure;
the emotion classification determining module is used for inputting the fusion matrix into the trained emotion recognition model to obtain the emotion category of the subject; the emotion categories include happiness, anger, excitement, sadness, calmness, and fear.
7. The system of claim 6, further comprising a training module;
the training module comprises:
an electroencephalogram data acquisition sub-module for acquiring electroencephalogram data of a subject collected by electroencephalogram electrodes placed in accordance with the international 10-20 system;
a construction sub-module for constructing a feature matrix of electroencephalographic data of the subject;
the fusion sub-module is used for fusing the feature matrix of the electroencephalogram data of the subject to obtain a fusion matrix of the electroencephalogram data of the subject;
and the training sub-module is used for taking the fusion matrix of the electroencephalogram data of the subject as input and training the emotion recognition model to obtain the trained emotion recognition model.
8. The system of claim 6, wherein the feature matrix determination module comprises:
the initial one-dimensional vector construction submodule is used for constructing an initial one-dimensional vector according to the number of electrode channels and the sampling frequency of the international 10-20 system;
the initial two-dimensional matrix construction submodule is used for constructing an initial two-dimensional matrix according to the initial one-dimensional vector and the electrode position of the international 10-20 system;
the initial matrix determining submodule is used for standardizing the initial two-dimensional matrix to obtain an initial matrix;
the removing submodule is used for removing, from the initial one-dimensional vector, the elements corresponding to electrodes positioned on the front-to-rear sagittal midline of the international 10-20 system to obtain a removed one-dimensional vector;
the renumbering submodule is used for renumbering the elements in the removed one-dimensional vector to obtain a renumbered one-dimensional vector;
a numbering determination submodule for determining the element number in the renumbered one-dimensional vector and a symmetric element number symmetric to the element number;
the symmetrical difference one-dimensional vector determining submodule is used for taking the difference between the element corresponding to the element number and the element corresponding to the symmetrical element number to obtain a symmetrical difference one-dimensional vector;
the symmetrical difference two-dimensional matrix determining submodule is used for constructing a symmetrical difference two-dimensional matrix according to the symmetrical difference one-dimensional vector and the electrode position of the international 10-20 system;
the symmetrical difference matrix determining submodule is used for standardizing the symmetrical difference two-dimensional matrix to obtain a symmetrical difference matrix of the left and right brain areas;
a symmetric quotient one-dimensional vector determining submodule for dividing the element corresponding to the element number by the element corresponding to the symmetric element number to obtain a symmetric quotient one-dimensional vector;
the symmetrical quotient two-dimensional matrix determining submodule is used for constructing a symmetrical quotient two-dimensional matrix according to the symmetrical quotient one-dimensional vector and the electrode position of the international 10-20 system;
the symmetric quotient matrix determining submodule is used for standardizing the symmetric quotient two-dimensional matrix to obtain a symmetric quotient matrix of the left and right brain areas;
the differential entropy characteristic vector determining submodule is used for extracting differential entropy characteristics of a set frequency band from the initial one-dimensional vector by adopting a set time window to obtain a differential entropy characteristic vector;
and the differential entropy matrix determining submodule is used for constructing a differential entropy matrix according to the differential entropy characteristic vector and the electrode position of the international 10-20 system.
9. The system of claim 6, wherein the fusion matrix determination module comprises:
the initial feature map determining submodule is used for preprocessing the feature matrix of the electroencephalogram signal with a convolution operation to obtain an initial feature map;
the standardized feature map determining submodule is used for standardizing the initial feature map to obtain a standardized feature map;
the feature map determining submodule is used for applying an activation function to the standardized feature map to obtain a feature map;
the feature matrix determination submodule is used for performing nearest-neighbor up-sampling on the feature map to obtain a feature matrix of the feature map;
and the fusion matrix determining submodule is used for fusing the feature matrices of the feature maps to obtain the fusion matrix.
10. The system of claim 6, wherein the emotion classification determination module comprises:
the initial fusion matrix determining unit is used for sampling and regrouping the fusion matrix at a fixed scale using a space-to-depth structure to obtain a plurality of initial fusion matrices;
the unified-channel-number matrix determining unit is used for unifying the channel numbers of the plurality of initial fusion matrices by a convolution operation to obtain a plurality of unified-channel-number matrices;
the extraction unit is used for extracting emotion category features from the plurality of unified-channel-number matrices using a bidirectional feature pyramid structure and a residual structure;
and the result output unit is used for classifying the emotion category features with a fully connected layer and outputting the emotion category.
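The residual structure and the bidirectional feature pyramid structure that the extraction unit relies on can be sketched minimally. Everything here is illustrative: the linear transform in `residual_block`, the scalar mixing weights, and the same-shape assumption for pyramid levels (a real bidirectional feature pyramid also resamples levels and normalizes learned fusion weights):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W):
    """Residual connection: the block's output is its transformed input
    plus the identity shortcut, so gradients can bypass the transform."""
    return relu(x @ W) + x

def bidirectional_fuse(levels, w_top, w_bottom):
    """Toy bidirectional fusion over same-shape pyramid levels: a
    top-down pass then a bottom-up pass, each mixing a level with its
    weighted neighbour."""
    td = list(levels)
    for i in range(len(td) - 2, -1, -1):      # top-down pass
        td[i] = td[i] + w_top * td[i + 1]
    for i in range(1, len(td)):               # bottom-up pass
        td[i] = td[i] + w_bottom * td[i - 1]
    return td
```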
CN202210855623.1A 2022-07-20 2022-07-20 Residual error feature pyramid emotion recognition method and system based on electroencephalogram signals Pending CN115204232A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210855623.1A CN115204232A (en) 2022-07-20 2022-07-20 Residual error feature pyramid emotion recognition method and system based on electroencephalogram signals


Publications (1)

Publication Number Publication Date
CN115204232A true CN115204232A (en) 2022-10-18

Family

ID=83581707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210855623.1A Pending CN115204232A (en) 2022-07-20 2022-07-20 Residual error feature pyramid emotion recognition method and system based on electroencephalogram signals

Country Status (1)

Country Link
CN (1) CN115204232A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117494013A (en) * 2023-12-29 2024-02-02 南方医科大学南方医院 Multi-scale weight sharing convolutional neural network and electroencephalogram emotion recognition method thereof
CN117494013B (en) * 2023-12-29 2024-04-16 南方医科大学南方医院 Multi-scale weight sharing convolutional neural network and electroencephalogram emotion recognition method thereof


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination