CN111461204A - Emotion identification method based on electroencephalogram signals and used for game evaluation - Google Patents
- Publication number
- CN111461204A (application CN202010239163.0A)
- Authority
- CN
- China
- Prior art keywords
- layer
- output
- representing
- input
- neural network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/043—Architecture, e.g. interconnection topology based on fuzzy logic, fuzzy membership or fuzzy inference, e.g. adaptive neuro-fuzzy inference systems [ANFIS]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The invention discloses an emotion recognition method based on electroencephalogram signals for game evaluation, which comprises the following steps. First, Gaussian white noise is added to the input electroencephalogram data to form noisy input samples. Then, a convolutional neural network is used to capture the spatial features among the electroencephalogram signals from different channels and to eliminate noise in the signals, which effectively improves the accuracy and noise robustness of game emotion recognition. Next, the membership degrees of the player's different emotional states are extracted by a fuzzy neural network, further improving the accuracy of emotion recognition. Since the player's current emotional state is affected by previous emotional states throughout the game, a recurrent neural network is used to capture the temporal characteristics of the brain wave signals, improving the accuracy of emotion recognition still further. Finally, the parameters and structure of the neural network are adjusted through parameter learning and structure learning.
Description
Technical Field
The invention belongs to the field of emotion recognition, and particularly relates to an emotion recognition method based on electroencephalogram signals and used for game evaluation.
Background
Since computer games can present rich visual and auditory information in an attractive manner, they attract more and more users in many fields such as entertainment, education and training. At the same time, games give the player a rich emotional experience, such as enjoyment and happiness. In recent years, affective computing has played a crucial role in enabling computers to understand human emotion, and can enrich the emotional experience of human-computer interaction. In addition, some games are designed to change the storyline based on the selections made by the player, and the player's emotional state can influence his or her selections. In order to improve the quality and value of a game, it is necessary to evaluate it, and game evaluation requires feedback on the player's emotional state. For this reason, automatic emotion recognition for game users is crucial. A game may evoke rich and complex emotional states in the player, such as anxiety, frustration, engagement and excitement. However, since there are always individual emotional differences among different persons, research on emotion recognition faces certain difficulties.
Many emotion recognition models for game evaluation have been proposed so far and can basically satisfy the needs of emotion recognition, but many problems still need to be solved or deserve further research. Most current emotion recognition methods for game evaluation rely on facial expressions or speech. First, emotion recognition based on facial expressions has a problem: a person's facial expression may be deceptive. Sometimes facial expressions do not express a person's true emotions, which leads to bias and even errors in the game evaluation process. Second, emotion recognition using voice data is less accurate than using physiological data of the central nervous system, such as brain waves, and voice-based emotion recognition fails when the player is silent. Research in cognitive theory shows that human emotion is closely related to the human brain, so using brain wave data greatly improves the accuracy of identifying the player's true emotions during the game.
In some scenarios (e.g., low-quality devices and noisy external environments), the impact of noise on the accuracy of these models has not been well studied, which limits their prediction accuracy. Moreover, the activation function of the FNN spatial-activation layer is not smooth enough, which limits its feature extraction capability. That is, for a conventional FNN, using a larger feature dimension cannot significantly improve the prediction accuracy of the model. Therefore, many fuzzy neural network classifiers have limited prediction accuracy.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a new neural network framework, namely a Convolution Smoothing Feedback Fuzzy Network (CSFFN), for emotion recognition in game evaluation. The CSFFN reasonably combines three subnets: a Convolutional Neural Network (CNN), a Fuzzy Neural Network (FNN) and a Recurrent Neural Network (RNN). The emotional state of the player during the game is detected from electroencephalogram (EEG) signals. The CNN captures the spatial features among the electroencephalogram signals from different channels and eliminates noise in the signals, effectively improving the accuracy and noise robustness of game emotion recognition. The FNN effectively extracts the membership degrees of the player's different emotional states, further improving the accuracy of emotion recognition. Since the player's current emotional state may be affected by previous emotional states throughout the game, the RNN is used to capture the temporal features of the EEG signal, improving the accuracy of emotion recognition still further.
The object of the invention is achieved by at least one of the following solutions.
An emotion recognition method based on electroencephalogram signals for game evaluation comprises the following steps:
S1, adding Gaussian white noise to the data of each brain wave channel to serve as a noisy input sample;
S2, constructing a spatial feature extraction network based on a Convolutional Neural Network (CNN) framework to extract the spatial features of the brain waves, the CNN denoising the spatial features;
S3, inputting the spatial features extracted by the CNN into a fuzzy neural network, and extracting and analyzing the fuzzy features of the emotion;
S4, analyzing the time series information by using a Recurrent Neural Network (RNN) to capture the temporal characteristics of the EEG signal, and finally outputting the emotional state category;
and S5, training the parameters of the fuzzy neural network by using parameter learning, and generating the rules of the fuzzy neural network FNN by using structure learning.
Further, the step S1 specifically includes the following steps:
gaussian white noise is added to 8 channels and 5 frequency band data of EEG signal respectively, and the formula is as follows:
wherein ,ξjRepresenting noisy input data, eijJ-dimension representing the i-th sample of the original noiseless data, d representing the intensity of the noise, is a constant, fN(0,1)Representing a random number, n, obtained from a standard normal distributionsRepresenting the number of samples in the original noiseless EEG data.
Further, the spatial feature extraction network based on the Convolutional Neural Network (CNN) framework constructed in step S2 is as follows:
the CNN comprises a convolutional layer, a pooling layer and three fully-connected layers, wherein,
and (3) rolling layers: the convolution kernel size is set to 5 × 5, the convolution layer output matrix is 9 × 9, and the calculation formula is as follows:
wherein ,ck9 × 9 matrix;denotes ckThe values of the a-th row and b-th column of the matrix;a weight representing the ith row and jth column of the kth convolution kernel; e.g. of the typeijA j-th dimension representing an i-th sample of the electroencephalogram signal; t is tkA threshold value representing a kth convolution kernel; n iskRepresenting the length (or width) of the convolution kernel
A pooling layer: in the pooling layer, the maximum value is selected from each of the 3 x 3 pooling windows, the output matrix of the pooling layer is a 3 x 3 matrix,
Fully-connected layer: the formula for each fully-connected layer is as follows:

xe_i = φ( Σ_j w_{i,j} · h_j − h_i )

where the h_j are the values h_k^{a,b} of the pooling-layer output matrices, flattened into a single vector; xe_i represents the i-th feature value; h_i represents the threshold of the i-th feature value; w_{i,j} represents the j-th weight of the i-th feature value; h_k^{a,b} represents the value in row a, column b of the k-th output matrix of the pooling layer; n_h represents the number of matrices h_k; and p represents the length (or width) of h_k.
Further, in step S3, the fuzzy neural network FNN is divided into 5 layers, specifically:
Input layer: the input is expressed as x = (x_1, x_2, …, x_n); each node in the input layer corresponds to one input variable, and the input value is transmitted directly to the fuzzification layer:

O_i^{(1)} = x_i

wherein O_i^{(1)} represents the output of the i-th node in the input layer and x_i represents the i-th input variable of the input vector x.

Fuzzification layer: each node uses a Gaussian membership function, and the membership value calculated by the fuzzification layer, i.e. the output of the fuzzification layer, is:

O_ij^{(2)} = exp( −(O_i^{(1)} − m_ij)² / σ_ij² )

wherein O_ij^{(2)} refers to the output of the j-th node of the fuzzification layer corresponding to the input of the i-th node of the input layer; m_ij and σ_ij are, respectively, the mean and the standard deviation of the Gaussian membership function through which the input of the i-th node of the input layer is transmitted to the j-th hidden neuron of the fuzzification layer.

Spatial-activation layer: the spatial activation intensity F_j is obtained by calculation as the output of the spatial-activation layer:

F_j = O_j^{(3)} = Π_{i=1}^{n_f} O_ij^{(2)}

wherein O_j^{(3)} represents the output of the j-th node of the spatial-activation layer and n_f is the number of fuzzification-layer nodes connected to the j-th node of the spatial-activation layer.

Result layer: the nodes in the result layer are consequent nodes, and the calculation formula is as follows:

O_j^{(4)} = Σ_i w_ij · x_i + bs_j

wherein w_ij represents the corresponding weight and bs_j is the offset value.

Output layer: the nodes in this layer perform defuzzification, using weighted-average defuzzification, specifically:

y = Σ_{j=1}^{R} O_j^{(t)} · O_j^{(4)} / Σ_{j=1}^{R} O_j^{(t)}

wherein R is the total number of fuzzy rules; y is the output of the model, and O_j^{(t)} is the corresponding output of the loop layer at time t.
Further, the recurrent neural network RNN of step S4 includes a loop layer, which establishes the correlation between the electroencephalogram data at the current time and the electroencephalogram data at the previous time; the calculation formula is as follows:

O_j^{(t)} = f( w_j · O_j^{(t−1)} + F_j(t) ), j = 1, 2, …, R

wherein O_j^{(t)} is the output of the j-th hidden node of the loop layer, i.e. the temporal activation intensity, t represents the time stamp, F_j(t) is the spatial activation intensity of this cycle, O_j^{(t−1)} is the output of the loop layer at the previous time, w_j is the corresponding weight, and R is the total number of fuzzy rules.
Further, the neural network parameters are trained by parameter learning; parameter learning minimizes an error cost function by a gradient descent algorithm, the error cost function being

E(t+1) = (1/2)·[y(t+1) − y_d(t+1)]²

wherein y(t+1) and y_d(t+1) are, respectively, the actual output and the desired output of the output layer. Each free parameter ω_i is updated along the negative gradient:

ω_i(t+1) = ω_i(t) − η·∂E/∂ω_i

wherein η ∈ (0,1), called the learning rate, is a constant; the ω_i are the respective weights; y(t) and y_d(t) are, respectively, the actual output and the desired output of the output layer; F_j(t) is the spatial activation intensity at time t; and O_j^{(t−1)} is the corresponding output of the loop layer at time t−1.

The mean update formula of the Gaussian membership function is as follows:

m_ij(t+1) = m_ij(t) + η·[y_d(t) − y(t)]·(∂y/∂F_i)·F_i·2(x_j − m_ij)/σ_ij²

The standard deviation update formula of the Gaussian membership function is as follows:

σ_ij(t+1) = σ_ij(t) + η·[y_d(t) − y(t)]·(∂y/∂F_i)·F_i·2(x_j − m_ij)²/σ_ij³

wherein y(t) and y_d(t) are, respectively, the actual output and the desired output of the output layer; F_i is the spatial activation intensity; x_j is the input in the j-th dimension; and m_ij and σ_ij on the right-hand side are the mean and standard deviation of the original Gaussian membership function.
Further, the rule for generating the fuzzy neural network FNN by using the structure learning specifically includes:
for the first input dataInitial mean m using a Gaussian membership function1And standard deviation σ1Generating a new fuzzy rule:
wherein ,xi(0) Is inputting dataThe ith dimension of (a); sigmainitIs a pre-defined standard deviation of the measured values,
for subsequent input dataCalculating the maximum spatial activation intensity Fj(t) to determine whether a new fuzzy rule should be generated:
if FI(t)<FthresholdA new fuzzy rule is generated, where I is such that the spatial activation strength Fj(t) rule number when maximum value is obtained, FthresholdIs a spatial activation intensity threshold.
When a new fuzzy rule is generated, the mean and variance are set as follows:
where R (t +1) is the number of rules for the current network at time t +1, and β is the overlap factor.
Further, the emotional state categories include happy, sad, superior and angry.
Compared with the prior art, the invention has the following beneficial effects:
(1) by reasonably combining CNN, FNN and RNN, a novel CSFFN is developed and used for game emotion recognition.
(2) The CSFFN can capture the spatial characteristics among EEG signals from different channels by using the CNN, and can eliminate noise in the EEG signals, so that the accuracy and the anti-noise performance of game emotion recognition can be effectively improved.
(3) The CSFFN can use the FNN to efficiently extract the membership of players to different emotional states, thereby further improving the accuracy of emotion recognition.
(4) CSFFN also takes into account the fact that a player's current emotional state may be affected by previous emotional states throughout the game. Therefore, the CSFFN introduces the time characteristics of the RNN captured electroencephalogram signals, and the emotion recognition accuracy is improved better.
Drawings
FIG. 1 is a flow chart of an emotion recognition method based on electroencephalogram for game evaluation according to the present invention.
FIG. 2 is a diagram of a convolutional neural network structure in the present invention.
Fig. 3 is a diagram showing the structure of the fuzzy neural network and the recurrent neural network according to the present invention.
Detailed Description
The invention will be further described with reference to examples and figures, but the embodiments of the invention are not limited thereto.
The inventive concept of the application is as follows: Gaussian white noise is added to the data of each channel of the input brain waves to serve as noisy input samples; a CNN is then introduced for denoising, reducing the influence of noise on the EEG data, and extracting the spatial features of the EEG data; next, the FNN and the RNN are used, respectively, to extract and analyze the fuzzy and temporal characteristics of the emotion; finally, the emotion category is output, realizing the identification of the emotional state.
As shown in fig. 1, an embodiment of the present invention provides an emotion recognition method based on electroencephalogram signals for game evaluation, which performs emotion recognition by using a convolution smoothing feedback fuzzy network, and includes the following steps:
and S1, adding white Gaussian noise to the data of each brain wave channel to form a noisy input sample.
Specifically, Gaussian white noise is added to each of the 8 channels (AF3, AF4, F3, F4, T7, T8, P7 and P8) and 5 frequency bands (δ, θ, α, β and γ) of the EEG signal; the formula is as follows:

ξ_ij = e_ij + d·f_N(0,1), i = 1, 2, …, n_s

wherein ξ_ij represents the noisy input data, e_ij represents the j-th dimension of the i-th sample of the original noise-free data, d is a constant representing the intensity of the noise, f_N(0,1) represents a random number drawn from a standard normal distribution, and n_s represents the number of samples in the original noise-free EEG data.
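As a concrete illustration of this noise-injection step, the following is a minimal NumPy sketch of the formula above. The function name, the fixed seed and the 8-channel-by-5-band feature layout (40 features per sample) are illustrative assumptions, not part of the patent.

```python
import numpy as np

def add_gaussian_noise(e, d=0.1, seed=0):
    """Step S1 sketch: xi_ij = e_ij + d * f_N(0,1).

    e: (n_s, n_features) array of noise-free EEG features.
    d: a constant giving the noise intensity.
    """
    rng = np.random.default_rng(seed)
    return e + d * rng.standard_normal(e.shape)

# Assumed layout: 8 channels x 5 frequency bands = 40 features per sample.
clean = np.zeros((100, 40))
noisy = add_gaussian_noise(clean, d=0.1)
```

Training the CNN to map `noisy` back to `clean` is what gives the network its denoising behaviour in step S2.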
S2, as shown in fig. 2, a spatial feature extraction network based on a Convolutional Neural Network (CNN) framework is constructed to extract the spatial features of the brain waves, and the CNN denoises these features. Gaussian white noise is added to the original noise-free input samples to generate noisy input samples, which are input into the CNN for training; the trained output is made to be consistent with the original noise-free input samples, so that the noise level of the data is reduced, and the denoised features are obtained through the CNN. Since the convolution processes are similar, fig. 2 shows only two of them.
Specifically, the constructed spatial feature extraction network based on the Convolutional Neural Network (CNN) framework is as follows:
the CNN comprises a convolutional layer, a pooling layer and three fully-connected layers, wherein,
and (3) rolling layers: the convolution kernel size is set to 5 × 5, the convolution layer output matrix is 9 × 9, and the calculation formula is as follows:
wherein ,ck9 × 9 matrix;denotes ckThe values of the a-th row and b-th column of the matrix;a weight representing the ith row and jth column of the kth convolution kernel; e.g. of the typeijA j-th dimension representing an i-th sample of the electroencephalogram signal; t is tkA threshold value representing a kth convolution kernel; n iskRepresenting the length (or width) of the convolution kernel
A pooling layer: in the pooling layer, the maximum value is selected from each of the 3 x 3 pooling windows, the output matrix of the pooling layer is a 3 x 3 matrix,
full connection layer: the formula for each fully connected layer is as follows:
wherein ,xeiRepresenting the ith characteristic value; h isiA threshold value representing a characteristic value; w is ai,jA jth weight representing an ith feature value;a value in row a and column b of a kth output matrix representing a pooling layer; n ishRepresents hkThe number of (2); p represents hkLength or width.
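The convolution and pooling arithmetic above can be sketched in plain NumPy. This is not the patent's implementation; the ReLU activation, the zero threshold and the 13 × 13 input (chosen so that a 5 × 5 kernel yields the stated 9 × 9 map, and 3 × 3 max pooling yields the stated 3 × 3 matrix) are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv2d_valid(img, kernel, t):
    """Single-kernel 'valid' convolution with threshold t:
    c[a, b] = relu(sum_ij w[i, j] * img[a+i, b+j] - t)."""
    n = kernel.shape[0]
    h, w = img.shape[0] - n + 1, img.shape[1] - n + 1
    out = np.empty((h, w))
    for a in range(h):
        for b in range(w):
            out[a, b] = np.sum(kernel * img[a:a + n, b:b + n]) - t
    return relu(out)

def max_pool(c, win=3):
    """Non-overlapping max pooling with a win x win window."""
    h, w = c.shape[0] // win, c.shape[1] // win
    return c.reshape(h, win, w, win).max(axis=(1, 3))

img = np.arange(169, dtype=float).reshape(13, 13)  # assumed 13x13 input
kernel = np.full((5, 5), 0.01)                     # 5x5 kernel, as stated
c = conv2d_valid(img, kernel, t=0.0)               # -> 9x9 feature map
p = max_pool(c, 3)                                 # -> 3x3 pooled matrix
```

The 3 × 3 pooled matrix would then be flattened and passed through the three fully-connected layers described above.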
And S3, inputting the features extracted by the CNN into a fuzzy neural network, and extracting and analyzing the fuzzy features of the emotion.
The fuzzy neural network is used for extracting and analyzing the fuzzy characteristics of human emotion. A fuzzy characteristic means that the emotional states of a human being at a certain moment are mixed: given a set of emotional states (such as excitement, happiness, sadness and despair), a person may be in the excitement and happiness states at the same time rather than in a single emotional state. The fuzzy neural network analyzes the fuzzy characteristics of human emotional states, and obtains the membership degree corresponding to each emotional state through the membership functions of the emotional state set, i.e. the degree of excitement and the degree of happiness.
By adding fuzzy characteristics of emotion, the accuracy of emotion recognition can be further improved. The fuzzy characteristic of the emotion is combined with the time characteristic at a specific moment, so that the emotion can be well reflected, and the accuracy of emotion recognition is improved.
Specifically, the fuzzy neural network FNN is divided into 5 layers. (In the following, O^{(l)} represents the output of each layer, w represents a weight, and b represents an offset value, unless otherwise specified.)
An input layer: the input is expressed asEach node in the input layer corresponds to an input variable, and the input value is directly transmitted to the fuzzy layer;
wherein ,representing the output, x, of the ith node in the input layeriRepresenting input vectorsThe ith input variable of (1).
Blurring layer: each node uses a Gaussian membership function, and the membership value calculated by the fuzzy layer, namely the output of the fuzzy layer, is as follows:
wherein ,the output of the jth node of the fuzzy layer corresponding to the input of the ith node of the input layer is referred to; m isij,Respectively transmitting the input of the ith node of the input layer to the mean value and the variance of a Gaussian membership function of the jth hidden neuron of the fuzzy layer;
a space-active layer: obtaining the spatial activation intensity F through calculationjAsOutput of the spatial activation layer:
wherein ,representing the output of the jth node of the spatially active layer, nfThe number of obscured layer nodes for node i connected to the spatially active layer.
Results layer: the nodes in the result layer are result nodes, and the calculation formula is as follows:
wherein w represents the corresponding weight, bs is the offset value;
an output layer: the nodes in the layer execute defuzzification and adopt weighted average defuzzification, and the method specifically comprises the following steps:
wherein R is the total number of fuzzification rules; y is the output of the model and,is the corresponding output of the loop layer at time t.
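The fuzzification and defuzzification steps above can be sketched as follows. The product form of the firing strength and the toy two-input, two-rule dimensions are assumptions made for illustration; the patent's exact smoothing variant may differ.

```python
import numpy as np

def gaussian_membership(x, m, sigma):
    """Fuzzification layer: mu_ij = exp(-(x_i - m_ij)^2 / sigma_ij^2).

    x: (n,) inputs; m, sigma: (n, R) for n inputs and R rules.
    """
    return np.exp(-((x[:, None] - m) ** 2) / sigma ** 2)

def spatial_activation(mu):
    """Spatial-activation layer: combine one rule's memberships into a
    firing strength F_j (standard product form, assumed here)."""
    return mu.prod(axis=0)

def defuzzify(F, a):
    """Output layer: weighted-average defuzzification over R rules,
    with a_j the result-layer (consequent) value of rule j."""
    return float(np.sum(F * a) / np.sum(F))

x = np.array([0.2, 0.8])
m = np.array([[0.0, 1.0], [1.0, 0.0]])   # means: 2 inputs x 2 rules
sigma = np.ones((2, 2))
mu = gaussian_membership(x, m, sigma)
F = spatial_activation(mu)
y = defuzzify(F, a=np.array([0.0, 1.0]))
```

Each F_j is the degree to which the input matches rule j, i.e. the membership degree of the corresponding emotional state.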
S4, efficiently analyzing the time series information using an RNN to capture the temporal characteristics of the EEG signal, and finally outputting the emotion classification.
RNNs are good at processing data input in the form of a sequence, using feedback to analyze the time series of the EEG signal and capture the temporal features of emotion. Since the player's emotion has rich contextual correlation in time, and the emotional state at the current moment is affected by that at the previous moment, the temporal characteristics of emotion need to be fully analyzed.
The input to the RNN is the output of the spatial-activation layer in the FNN, i.e. the spatial activation intensity. When the outputs of the RNN at different moments tend to converge, the output is transmitted to the result layer and the output layer of the FNN, which output an emotion category such as happy, sad, superior or angry.
Specifically, the RNN includes a loop layer, which can establish the correlation between the electroencephalogram data at the current time and the electroencephalogram data at the previous time; the calculation formula is as follows:

O_j^{(t)} = f( w_j · O_j^{(t−1)} + F_j(t) ), j = 1, 2, …, R

wherein O_j^{(t)} is the output of the j-th hidden node of the loop layer, i.e. the temporal activation intensity, t represents the time stamp, F_j(t) is the spatial activation intensity of this cycle, O_j^{(t−1)} is the output of the loop layer at the previous time, w_j is the corresponding weight, and R is the total number of fuzzy rules.
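The feedback structure of the loop layer can be sketched as below. The sigmoid gating function is an assumption (the patent does not name f); the feedback of each rule's previous temporal activation into its current one follows the description above.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def recurrent_layer(F_seq, w, G0=None):
    """Loop-layer sketch: G_j(t) = f(w_j * G_j(t-1) + F_j(t)).

    F_seq: (T, R) spatial activation intensities over T time steps.
    w:     (R,) feedback weights, one per rule.
    """
    T, R = F_seq.shape
    G = np.zeros(R) if G0 is None else G0
    out = np.empty((T, R))
    for t in range(T):
        G = sigmoid(w * G + F_seq[t])  # temporal activation intensity
        out[t] = G
    return out

F_seq = np.full((5, 3), 0.5)               # constant spatial activations
G_seq = recurrent_layer(F_seq, w=np.full(3, 0.2))
```

With a constant input the temporal activations settle toward a fixed point, which matches the convergence condition under which the RNN output is passed on to the FNN's result and output layers.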
And S5, training the fuzzy neural network parameters by using parameter learning, and training the network by using the rules of the fuzzy neural network FNN generated by structure learning.
(1) Training neural network parameters using parameter learning
Parameter learning is performed after the structure learning phase of each data is completed. All the CSFFN parameters and the free parameters of the fuzzy rules are learned by an online gradient descent algorithm. In other words, the goal of parameter learning is to minimize the error cost function by the gradient descent algorithm. The error cost function is
E(t+1) = (1/2)·[y(t+1) − y_d(t+1)]²

wherein y(t+1) and y_d(t+1) are, respectively, the actual output and the desired output of the output layer. Each free parameter ω_i is updated along the negative gradient:

ω_i(t+1) = ω_i(t) − η·∂E/∂ω_i

wherein η ∈ (0,1), called the learning rate, is a constant; the ω_i are the respective weights; y(t) and y_d(t) are, respectively, the actual output and the desired output of the output layer; F_j(t) is the spatial activation intensity at time t; and O_j^{(t−1)} is the corresponding output of the loop layer at time t−1.

The mean update formula of the Gaussian membership function is as follows:

m_ij(t+1) = m_ij(t) + η·[y_d(t) − y(t)]·(∂y/∂F_i)·F_i·2(x_j − m_ij)/σ_ij²

The standard deviation update formula of the Gaussian membership function is as follows:

σ_ij(t+1) = σ_ij(t) + η·[y_d(t) − y(t)]·(∂y/∂F_i)·F_i·2(x_j − m_ij)²/σ_ij³

wherein y(t) and y_d(t) are, respectively, the actual output and the desired output of the output layer; F_i is the spatial activation intensity; x_j is the input in the j-th dimension; and m_ij and σ_ij on the right-hand side are the mean and standard deviation of the original Gaussian membership function.
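The gradient-descent scheme above reduces to the generic update theta ← theta − η·∂E/∂theta. The following toy sketch applies it to a one-parameter model; the model y = w·x and the chosen learning rate are illustrative assumptions, while the patent derives the specific gradients for the weights and the Gaussian means and standard deviations.

```python
import numpy as np

def sgd_step(params, grads, eta=0.05):
    """One parameter-learning step: minimise E = 0.5*(y - y_d)^2
    by gradient descent, theta <- theta - eta * dE/dtheta."""
    return {k: params[k] - eta * grads[k] for k in params}

# Toy 1-parameter model y = w * x, so dE/dw = (y - y_d) * x.
w, x, y_d, eta = 0.0, 1.0, 1.0, 0.5
for _ in range(20):
    y = w * x
    w = sgd_step({"w": w}, {"w": (y - y_d) * x}, eta)["w"]
```

After the loop, w has converged to the value that drives the error cost E to (nearly) zero.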
(2) Rule for generating fuzzy neural network by structure learning
Initially, the FNN has not established any fuzzy rules. For the first input data x(0), a new fuzzy rule is generated with the initial mean m_1 and standard deviation σ_1 of the Gaussian membership function:

m_i1 = x_i(0), σ_i1 = σ_init

wherein x_i(0) is the i-th dimension of the input data x(0), and σ_init is a predefined standard deviation.

For subsequent input data x(t), the maximum spatial activation intensity is calculated to determine whether a new fuzzy rule should be generated:

F_I(t) = max_{1≤j≤R(t)} F_j(t)

If F_I(t) < F_threshold, a new fuzzy rule is generated, wherein I is the rule number at which the spatial activation intensity F_j(t) attains its maximum value, and F_threshold is the spatial activation intensity threshold.

When a new fuzzy rule is generated, its mean and standard deviation are set as follows:

m_{i,R(t+1)} = x_i(t), σ_{i,R(t+1)} = β·|x_i(t) − m_iI|

wherein R(t+1) is the number of rules of the current network at time t+1, and β is an overlap factor.
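The structure-learning procedure above can be sketched as follows. The product firing strength and the width formula β·|x − m_I| are assumptions consistent with the description; the threshold and overlap-factor values are illustrative.

```python
import numpy as np

def maybe_add_rule(x, means, sigmas, F_threshold=0.3,
                   sigma_init=0.5, beta=1.2):
    """Fire all current rules on input x; if the best spatial activation
    F_I falls below F_threshold, create a new rule centred at x.

    means, sigmas: (n_inputs, R) for the R current rules.
    """
    if means.shape[1] == 0:                 # no rules yet: first rule
        return x[:, None], np.full((x.size, 1), sigma_init)
    mu = np.exp(-((x[:, None] - means) ** 2) / sigmas ** 2)
    F = mu.prod(axis=0)                     # spatial activation per rule
    I = int(np.argmax(F))
    if F[I] < F_threshold:                  # poor coverage -> new rule
        new_sigma = beta * np.abs(x - means[:, I])
        means = np.hstack([means, x[:, None]])
        sigmas = np.hstack([sigmas, new_sigma[:, None]])
    return means, sigmas

m = np.empty((2, 0)); s = np.empty((2, 0))
m, s = maybe_add_rule(np.array([0.0, 0.0]), m, s)  # first rule
m, s = maybe_add_rule(np.array([3.0, 3.0]), m, s)  # far input: new rule
m, s = maybe_add_rule(np.array([0.1, 0.1]), m, s)  # covered: no new rule
```

The rule base thus grows only when the incoming sample is poorly covered by every existing rule, which keeps the network compact.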
The above description is only for the preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto, and any person skilled in the art can substitute or change the technical solution of the present invention and the inventive concept within the scope of the present invention, which is disclosed by the present invention, and the equivalent or change thereof belongs to the protection scope of the present invention.
Claims (8)
1. An emotion recognition method based on electroencephalogram signals and used for game evaluation is characterized by comprising the following steps:
s1, adding white Gaussian noise to each channel data of the brain wave to serve as a noisy input sample;
s2, constructing a spatial feature extraction network based on a Convolutional Neural Network (CNN) framework to extract spatial features of brain waves, and denoising the spatial features by the CNN;
s3, inputting the spatial features extracted by the CNN into a fuzzy neural network, and extracting and analyzing the fuzzy features of the emotion;
s4, analyzing time series information by using a Recurrent Neural Network (RNN) for capturing time characteristics of an EEG signal and finally outputting emotion classes;
and S5, training the fuzzy neural network parameters by parameter learning, and training the network by generating the rules of the fuzzy neural network FNN through structure learning.
2. The electroencephalogram signal-based emotion recognition method for game evaluation as claimed in claim 1, wherein said step S1 specifically comprises the steps of:
Gaussian white noise is added to each of the 8 channels and 5 frequency bands of the EEG signal, according to the following formula:
wherein ξ_j represents the noisy input data; e_ij represents the j-th dimension of the i-th sample of the original noise-free data; d, a constant, represents the intensity of the noise; f_N(0,1) represents a random number drawn from a standard normal distribution; and n_s represents the number of samples in the original noise-free EEG data.
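Step S1 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the noise intensity `d`, the array layout (samples × features), and the seed are all assumptions, since the formula image itself is not reproduced in the extracted text.

```python
import numpy as np

def add_gaussian_noise(eeg, d=0.05, seed=0):
    """Add white Gaussian noise of intensity d to EEG feature data.

    Hedged sketch of step S1: `eeg` is assumed to have shape
    (n_samples, n_channels * n_bands); d is a constant noise intensity,
    and the noise is drawn from a standard normal distribution.
    """
    rng = np.random.default_rng(seed)
    return eeg + d * rng.standard_normal(eeg.shape)

# 8 channels x 5 frequency bands = 40 features per sample (per the claim)
clean = np.zeros((10, 40))
noisy = add_gaussian_noise(clean, d=0.1)
```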
3. The electroencephalogram signal-based emotion recognition method for game evaluation as recited in claim 1, wherein the Convolutional Neural Network (CNN) framework-based spatial feature extraction network constructed in step S2 is as follows:
the CNN comprises a convolutional layer, a pooling layer and three fully-connected layers, wherein,
Convolutional layer: the convolution kernel size is set to 5 × 5, and the convolutional layer output is a 9 × 9 matrix; the calculation formula is as follows:
wherein c_k is a 9 × 9 matrix whose entry in the a-th row and b-th column is the corresponding output value; the k-th convolution kernel has a weight in its i-th row and j-th column; e_ij represents the j-th dimension of the i-th sample of the electroencephalogram signal; t_k represents the threshold of the k-th convolution kernel; and n_k represents the length (or width) of the convolution kernel.
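The convolution described above can be sketched as a plain "valid" 2-D convolution. The 13 × 13 input size is an inference (a 5 × 5 kernel producing a 9 × 9 output implies it), and the linear, activation-free output and the averaging kernel below are illustrative assumptions, not the patent's exact layer.

```python
import numpy as np

def conv2d_valid(x, kernel, threshold=0.0):
    """Valid 2-D convolution: slide the kernel over the input, take the
    weighted sum of each window, and subtract the kernel's threshold."""
    n_k = kernel.shape[0]
    h = x.shape[0] - n_k + 1
    out = np.zeros((h, h))
    for a in range(h):
        for b in range(h):
            out[a, b] = np.sum(x[a:a + n_k, b:b + n_k] * kernel) - threshold
    return out

x = np.arange(13 * 13, dtype=float).reshape(13, 13)  # assumed 13x13 input
k = np.ones((5, 5)) / 25.0                           # averaging kernel, for illustration
c = conv2d_valid(x, k)                               # 9x9 output, as in the patent
```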
Pooling layer: in the pooling layer, the maximum value is selected from each 3 × 3 pooling window, so the output of the pooling layer is a 3 × 3 matrix.
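The pooling step is a standard non-overlapping max pool; a minimal sketch (assuming stride equal to the 3 × 3 window, which the patent implies by the 9 × 9 → 3 × 3 sizes):

```python
import numpy as np

def max_pool(x, win=3):
    """Non-overlapping max pooling: each win x win window is reduced
    to its maximum value, so a 9x9 input yields a 3x3 output."""
    h = x.shape[0] // win
    out = np.zeros((h, h))
    for a in range(h):
        for b in range(h):
            out[a, b] = x[a * win:(a + 1) * win, b * win:(b + 1) * win].max()
    return out

c = np.arange(81, dtype=float).reshape(9, 9)  # stand-in convolution output
p = max_pool(c)
```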
full connection layer: the formula for each fully connected layer is as follows:
wherein x_ei represents the i-th feature value; h_i represents the threshold of the feature value; w_i,j represents the j-th weight of the i-th feature value; the value in row a and column b of the k-th output matrix h_k of the pooling layer serves as input; n_h represents the number of h_k; and p represents the length or width of h_k.
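A fully connected layer of this kind reduces to a weighted sum minus a threshold, passed through an activation. The tanh activation and the layer sizes below are assumptions for illustration; the patent does not name an activation here.

```python
import numpy as np

def dense(x, w, h, activation=np.tanh):
    """One fully connected layer: weighted sum of the flattened input
    minus a per-unit threshold, passed through an activation function."""
    return activation(w @ x - h)

flat = np.ones(9)           # flattened 3x3 pooling output
w = np.full((4, 9), 0.1)    # 4 output features (illustrative size)
h = np.zeros(4)             # thresholds
y = dense(flat, w, h)
```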
4. The electroencephalogram signal-based emotion recognition method for game evaluation according to claim 1, wherein the fuzzy neural network FNN in step S3 is divided into 5 layers, specifically:
Input layer: the input is expressed as a vector; each node in the input layer corresponds to one input variable, and the input value is passed directly to the fuzzification layer;
wherein the output of the i-th node in the input layer equals x_i, the i-th input variable of the input vector,
Fuzzification layer: each node uses a Gaussian membership function; the membership value calculated by the fuzzification layer, i.e., the output of the fuzzification layer, is as follows:
wherein ,the output of the jth node of the fuzzy layer corresponding to the input of the ith node of the input layer is referred to; m isij,Input of ith node of input layer is passed to fuzzificationMean and variance of gaussian membership functions of the jth hidden neuron of the layer;
Spatial activation layer: the spatial activation intensity F_j is obtained by calculation as the output of the spatial activation layer:
wherein the output of the j-th node of the spatial activation layer is F_j, and n_f is the number of fuzzification-layer nodes connected to that node of the spatial activation layer,
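The fuzzification and spatial activation steps can be sketched together. Taking the firing strength as the product of a rule's Gaussian membership values is the usual choice in fuzzy neural networks, but it is an assumption here, since the patent's combination formula is not reproduced in the extracted text.

```python
import numpy as np

def firing_strength(x, m, s):
    """Spatial activation intensity F_j of each rule, assumed to be the
    product of the rule's Gaussian membership values.
    m and s have shape (n_inputs, n_rules)."""
    mu = np.exp(-((x[:, None] - m) ** 2) / (s ** 2))  # membership values u_ij
    return mu.prod(axis=0)                            # F_j, one per rule

x = np.array([0.0, 1.0])                 # a 2-dimensional input
m = np.array([[0.0, 1.0], [1.0, 0.0]])   # means: 2 inputs x 2 rules
s = np.ones((2, 2))                      # standard deviations
F = firing_strength(x, m, s)
```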
Consequent layer: the nodes in the consequent layer are consequent nodes, and the calculation formula is as follows:
wherein w represents the corresponding weight and b_s is the bias value;
Output layer: the nodes in this layer perform defuzzification, using weighted-average defuzzification, specifically:
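The weighted-average defuzzification formula is also lost in extraction; its standard form, with a_j denoting the consequent-layer output of rule j — a hedged reconstruction — is:

```latex
% Weighted-average defuzzification over the R rules' firing strengths.
y = \frac{\sum_{j=1}^{R} F_j \, a_j}{\sum_{j=1}^{R} F_j}
```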
5. The electroencephalogram signal-based emotion recognition method for game evaluation as recited in claim 1, wherein the recurrent neural network RNN of step S4 includes a recurrent layer which establishes a correlation between electroencephalographic data of a current time and a previous time, and the calculation formula is as follows:
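The recurrent layer's formula is not reproduced in the extracted text. A standard Elman-style step linking the current features to the previous time step can serve as a hedged sketch; the tanh activation and all sizes below are assumptions.

```python
import numpy as np

def rnn_step(x_t, h_prev, w_x, w_h, b):
    """One step of a simple recurrent layer: combine the current input
    with the previous hidden state through a tanh nonlinearity."""
    return np.tanh(w_x @ x_t + w_h @ h_prev + b)

x_t = np.ones(3)            # current-time features (illustrative size)
h = np.zeros(2)             # previous hidden state
w_x = np.full((2, 3), 0.1)  # input weights
w_h = np.zeros((2, 2))      # recurrent weights
b = np.zeros(2)             # biases
h = rnn_step(x_t, h, w_x, w_h, b)
```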
6. The method of claim 1, wherein the fuzzy neural network parameters are trained by parameter learning, the parameter learning minimizing an error cost function by a gradient descent algorithm, the error cost function being:
wherein y(t+1) and y_d(t+1) are the actual output and the desired output of the output layer, respectively,
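The cost function itself is lost in extraction; the standard quadratic form consistent with the gradient-descent description — a hedged reconstruction — is:

```latex
E(t+1) = \tfrac{1}{2}\,\bigl(y_d(t+1) - y(t+1)\bigr)^{2}
```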
wherein η ∈ (0,1) is called the learning rate and is a constant; ω_i are the respective weights; y(t) and y_d(t) are the actual and desired outputs of the output layer, respectively; F_j(t) is the spatial activation intensity at time t; and the final term is the corresponding output of the recurrent layer at time t−1,
the mean update formula for the gaussian membership functions is as follows:
the standard deviation update formula of the gaussian membership function is as follows:
7. The electroencephalogram signal-based emotion recognition method for game evaluation according to claim 1, wherein the rule for generating the fuzzy neural network FNN by using structure learning specifically comprises:
for the first input data, the initial mean m_1 and standard deviation σ_1 of a Gaussian membership function are used to generate a new fuzzy rule:
wherein x_i(0) is the i-th dimension of the input data, and σ_init is a predefined standard deviation,
for each subsequent input data, the maximum spatial activation intensity F_j(t) is calculated to determine whether a new fuzzy rule should be generated;
if F_I(t) < F_threshold, a new fuzzy rule is generated, where I is the rule index at which the spatial activation intensity F_j(t) attains its maximum, and F_threshold is the spatial activation intensity threshold,
when a new fuzzy rule is generated, the mean and variance are set as follows:
where R (t +1) is the number of rules for the current network at time t +1, and β is the overlap factor.
8. The electroencephalogram signal-based emotion recognition method for game evaluation as recited in claim 1, wherein the emotional state categories include happy, sad, superior, and angry.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010239163.0A CN111461204B (en) | 2020-03-30 | 2020-03-30 | Emotion recognition method based on electroencephalogram signals for game evaluation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111461204A true CN111461204A (en) | 2020-07-28 |
CN111461204B CN111461204B (en) | 2023-05-26 |
Family
ID=71681768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010239163.0A Active CN111461204B (en) | 2020-03-30 | 2020-03-30 | Emotion recognition method based on electroencephalogram signals for game evaluation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111461204B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101257417A (en) * | 2008-03-25 | 2008-09-03 | 浙江大学 | Method for detecting TCP/IP protocol concealed channel based on fuzzy neural network |
CN103336992A (en) * | 2013-06-27 | 2013-10-02 | 电子科技大学 | FNN learning algorithm |
CN110059565A (en) * | 2019-03-20 | 2019-07-26 | 杭州电子科技大学 | A kind of P300 EEG signal identification method based on improvement convolutional neural networks |
CN110353702A (en) * | 2019-07-02 | 2019-10-22 | 华南理工大学 | A kind of emotion identification method and system based on shallow-layer convolutional neural networks |
CN110866537A (en) * | 2019-09-27 | 2020-03-06 | 华南理工大学 | Brain wave-based emotion recognition method for game evaluation |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112336357A (en) * | 2020-11-06 | 2021-02-09 | 山西三友和智慧信息技术股份有限公司 | RNN-CNN-based EMG signal classification system and method |
CN112381008A (en) * | 2020-11-17 | 2021-02-19 | 天津大学 | Electroencephalogram emotion recognition method based on parallel sequence channel mapping network |
CN112381008B (en) * | 2020-11-17 | 2022-04-29 | 天津大学 | Electroencephalogram emotion recognition method based on parallel sequence channel mapping network |
CN112842342A (en) * | 2021-01-25 | 2021-05-28 | 北京航空航天大学 | Electrocardiogram and magnetic signal classification method combining Hilbert curve and integrated learning |
CN113112017A (en) * | 2021-04-16 | 2021-07-13 | 唐山市工人医院 | Electroencephalogram grading and prognosis FPGA decoding system based on neural manifold |
CN113598789A (en) * | 2021-06-21 | 2021-11-05 | 天津大学 | Cross-individual thermal comfort discrimination method based on electroencephalogram signals |
CN114424940A (en) * | 2022-01-27 | 2022-05-03 | 山东师范大学 | Emotion recognition method and system based on multi-mode spatiotemporal feature fusion |
CN114742116A (en) * | 2022-06-13 | 2022-07-12 | 四川新源生物电子科技有限公司 | Generation method and system for analog acquisition of electroencephalogram signals |
CN114742116B (en) * | 2022-06-13 | 2022-09-02 | 四川新源生物电子科技有限公司 | Generation method and system for analog acquisition of electroencephalogram signals |
CN116687409A (en) * | 2023-07-31 | 2023-09-05 | 武汉纺织大学 | Emotion recognition method and system based on digital twin and deep learning |
CN116687409B (en) * | 2023-07-31 | 2023-12-12 | 武汉纺织大学 | Emotion recognition method and system based on digital twin and deep learning |
CN116943226A (en) * | 2023-09-20 | 2023-10-27 | 小舟科技有限公司 | Game difficulty adjusting method, system, equipment and medium based on emotion recognition |
CN116943226B (en) * | 2023-09-20 | 2024-01-05 | 小舟科技有限公司 | Game difficulty adjusting method, system, equipment and medium based on emotion recognition |
Also Published As
Publication number | Publication date |
---|---|
CN111461204B (en) | 2023-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111461204A (en) | Emotion identification method based on electroencephalogram signals and used for game evaluation | |
CN112784798B (en) | Multi-modal emotion recognition method based on feature-time attention mechanism | |
Golany et al. | SimGANs: Simulator-based generative adversarial networks for ECG synthesis to improve deep ECG classification | |
Ghorbandaei Pour et al. | Human–robot facial expression reciprocal interaction platform: case studies on children with autism | |
Du et al. | Non-contact emotion recognition combining heart rate and facial expression for interactive gaming environments | |
Tognetti et al. | Modeling enjoyment preference from physiological responses in a car racing game | |
CN112766173B (en) | Multi-mode emotion analysis method and system based on AI deep learning | |
Seng et al. | Video analytics for customer emotion and satisfaction at contact centers | |
CN110135244B (en) | Expression recognition method based on brain-computer collaborative intelligence | |
WO2019137538A1 (en) | Emotion representative image to derive health rating | |
CN110866537B (en) | Brain wave-based emotion recognition method for game evaluation | |
Nogueira et al. | A hybrid approach at emotional state detection: Merging theoretical models of emotion with data-driven statistical classifiers | |
CN106909938A (en) | Viewing angle independence Activity recognition method based on deep learning network | |
Du et al. | An emotion recognition method for game evaluation based on electroencephalogram | |
Parvathi et al. | Emotion Analysis Using Deep Learning | |
Leite et al. | Adaptive gaussian fuzzy classifier for real-time emotion recognition in computer games | |
Mohan et al. | Depression detection using facial expression and sentiment analysis | |
Golzadeh et al. | Emotion recognition using spatiotemporal features from facial expression landmarks | |
KR102285482B1 (en) | Method and apparatus for providing content based on machine learning analysis of biometric information | |
Raizada et al. | Organ risk prediction for Parkinson’s disease using deep learning techniques | |
Ono et al. | Emotion estimation using body expression types based on LMA and sensitivity analysis | |
Killedar et al. | Fuzzy Logic for Video Game Engagement Analysis using Facial Emotion Recognition | |
KR102610267B1 (en) | Method for analyzing status of specific user corresponding to specific avatar by referring to interactions between the specific avatar and other avatars in the metaverse world and providing service to the specific user and device using the same | |
Reddy et al. | Brain Waves Computation using ML in Gaming Consoles | |
KR102610262B1 (en) | Method for providing counseling service to specific avatar of the specific real user in the metaverse world and device using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||