CN114209341A - Emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction - Google Patents

Emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction

Info

Publication number
CN114209341A
CN114209341A
Authority
CN
China
Prior art keywords
electroencephalogram
emotion
model
contribution degree
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111608170.4A
Other languages
Chinese (zh)
Other versions
CN114209341B (en)
Inventor
陈子源
段舒哲
沙天慧
彭勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202111608170.4A priority Critical patent/CN114209341B/en
Publication of CN114209341A publication Critical patent/CN114209341A/en
Application granted granted Critical
Publication of CN114209341B publication Critical patent/CN114209341B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/369 Electroencephalography [EEG]
    • A61B5/372 Analysis of electroencephalograms
    • A61B5/374 Detecting the frequency distribution of signals, e.g. detecting delta, theta, alpha, beta or gamma waves
    • A61B5/377 Electroencephalography [EEG] using evoked responses
    • A61B5/378 Visual stimuli
    • A61B5/38 Acoustic or auditory stimuli
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7235 Details of waveform analysis
    • A61B5/725 Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
    • A61B5/7253 Details of waveform analysis characterised by using transforms
    • A61B5/7257 Details of waveform analysis characterised by using transforms using Fourier transforms

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Psychology (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Mathematical Physics (AREA)
  • Acoustics & Sound (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention provides an emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction. The method comprises the following steps: 1. Acquire electroencephalogram data of multiple subjects in emotion-inducing scenarios. 2. Preprocess the electroencephalogram data obtained in step 1, extract features and prepare the related data. 3. Establish a feature contribution degree differentiated data reconstruction model. 4. Solve and train the model established in step 3 and obtain emotion recognition results on the test data. 5. Obtain the key frequency bands and key leads, i.e. the electroencephalogram emotion activation mode, from the differentiated representation factors. By innovatively using the differentiated representation factors, the method improves the prediction accuracy of the electroencephalogram emotion recognition model; at the same time, the activation mode of electroencephalogram emotion can be acquired.

Description

Emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction
Technical Field
The invention belongs to the technical field of electroencephalogram signal processing, and particularly relates to an emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction.
Background
Emotion is the psychological and physiological response of human beings to external or internal stimuli, and is closely related to cognition, decision making, efficiency and other aspects. Emotion recognition is therefore a fundamental and important link in the field of affective computing, and accurate recognition of human emotional states by a computer is a prerequisite for harmonious human-computer interaction. The electroencephalogram (EEG), the scalp potential response of neuronal activity in the central nervous system, is an electrical recording of human brain activity. As a physiological signal originating from the central nervous system, it is hard to disguise, objective and authentic, and has therefore become a gold standard for emotion recognition.
As a high-level activity of the brain, the production of emotion relies on the synergy of multiple brain regions. Emotional responses are sparse and infrequent, and the EEG is a typical multichannel, multi-rhythm signal (multichannel means the EEG is measured with multiple leads; multi-rhythm means the EEG is divided into frequency bands with band-pass filters). How emotional EEG information is distributed across the channels and rhythms of the brain is therefore a key problem in current research.
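As a minimal illustration of this multichannel, multi-rhythm organization (not part of the patent text; the 62-lead, 5-band layout and the function names below are assumptions taken from the embodiment described later), the frequency-domain features of one sample can be flattened band-by-band into a single vector, which is the ordering the later weight formulas index:

```python
import numpy as np

N_LEADS, N_BANDS = 62, 5  # assumed layout from the embodiment (62 leads, 5 rhythms)

def flatten_features(band_by_lead):
    """Flatten a (N_BANDS, N_LEADS) feature array band-by-band into one
    310-dimensional vector, so entry (k-1)*62 + (q-1) is band k of lead q."""
    band_by_lead = np.asarray(band_by_lead)
    assert band_by_lead.shape == (N_BANDS, N_LEADS)
    return band_by_lead.reshape(-1)

def flat_index(k, q):
    """Position of band k (1..N_BANDS), lead q (1..N_LEADS) in that vector."""
    return (k - 1) * N_LEADS + (q - 1)
```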
Traditional methods approach the task purely from the machine learning and pattern recognition perspective, ignoring that the data of each frequency band and each lead express different emotions with different ability. As a result, they perform poorly on the EEG emotion recognition task and cannot meet the demand of human-computer interaction for high-accuracy emotion recognition; moreover, lacking interpretability, they cannot reveal the activation mode of electroencephalogram emotion.
Disclosure of Invention
Aiming at these problems, the invention provides an emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction. By introducing the differentiated representation vector θ and fusing it with the data reconstruction model, the method improves recognition accuracy; at the same time, on the basis of completing the recognition task, the importance of brain regions and frequency bands is quantified by means of the differentiated representation vector, enhancing the interpretability of the method.
The invention provides the following technical scheme:
step 1, collecting electroencephalogram data of a user in C emotional states at different time intervals.
Step 2, perform feature extraction, labelling and preprocessing on the data acquired in step 1. Each sample matrix A consists of the EEG frequency-domain data of one subject, and a label vector s records the emotional state label corresponding to each sample in A. For the same user, different time periods are selected as the dictionary A_s ∈ R^(d×n) and the target data y ∈ R^d respectively.
Step 3, establish the feature contribution degree differentiated data reconstruction model.
Step 4, perform joint iterative optimization of the differentiated representation factor θ and the linear representation coefficient α according to the model established in step 3.
Step 5, using the trained α and θ, compute the reconstruction error of the target data under each category and select the category with the smallest reconstruction error as the recognition result:

r_k = ||Θ^(1/2)(y − A_S(k) α_k)||²

where A_S(k) denotes the samples in the dictionary A_S corresponding to the k-th emotion, α_k is the linear representation coefficient corresponding to A_S(k), and r_k is the reconstruction error under that category.
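A numpy sketch of this minimum-reconstruction-error decision rule (illustrative only; `dicts` holding the per-emotion sub-dictionaries A_S(k) and the other names are assumptions, not the patent's implementation):

```python
import numpy as np

def classify_by_reconstruction(y, dicts, alphas, theta):
    """Return the class index with the smallest theta-weighted reconstruction
    error r_k = ||Theta^(1/2) (y - A_k @ alpha_k)||^2, plus all errors.

    y: (d,) target sample; dicts: per-class dictionaries A_k of shape (d, n_k);
    alphas: per-class coefficients alpha_k of shape (n_k,); theta: (d,) weights.
    """
    errors = []
    for A_k, a_k in zip(dicts, alphas):
        resid = y - A_k @ a_k
        errors.append(float(np.sum(theta * resid ** 2)))
    return int(np.argmin(errors)), errors
```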
Step 6, using the θ obtained in step 5, search for the key EEG frequency bands and key leads under the current task through the following two formulas, and obtain the EEG activation mode. Here γ(q) denotes the importance of the q-th lead and ω(k) the importance of the k-th frequency band; a larger importance value indicates a more important feature (illustrated for 62 leads and 5 bands):

ω(k) = θ_((k−1)·62+1) + θ_((k−1)·62+2) + … + θ_(k·62)    (8)
γ(q) = θ_q + θ_(62+q) + θ_(2·62+q) + θ_(3·62+q) + θ_(4·62+q)    (9)
Preferably, step 3 further comprises the following substeps:
Step 3.1, establish the data reconstruction model

min_α ||Θ^(1/2)(y − A_s α)||² + λ₁||α||²    (1)

In formula (1), y ∈ R^d is a sample of the target data A_t, d is the dimension of the frequency-band features, α ∈ R^n is the linear representation coefficient corresponding to A_s, α_i (i = 1,2,3…n) is the i-th element of α, λ₁ is the parameter of the linear representation coefficient α, and C is the total number of emotion categories. θ ∈ R^d is the vector of differentiated representation factors of the d features, and Θ = diag(θ) ∈ R^(d×d) is the diagonal matrix with θ on its diagonal.
Step 3.2, establish the differentiated feature contribution degree model

min_θ θᵀb + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (2)

In formula (2), θ ∈ R^d is the differentiated representation vector, b = (y − A_s α) ⊙ (y − A_s α), where d is the number of features, "⊙" is defined as the product of corresponding elements of two vectors, and λ₂ is the parameter of θ.
Step 3.3, combine the two models constructed in steps 3.1 and 3.2, placing the data reconstruction model and the differentiated feature contribution degree model under the same framework for iterative optimization; the optimization model is formula (3):

min_{α,θ} ||Θ^(1/2)(y − A_s α)||² + λ₁||α||² + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (3)
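For concreteness, the value of the combined model (3) for a given (α, θ) can be sketched as follows (illustrative numpy, with θ assumed to already satisfy the simplex constraint):

```python
import numpy as np

def joint_objective(y, A, alpha, theta, lam1, lam2):
    """Objective of formula (3): theta-weighted squared reconstruction error
    plus the l2 penalties on alpha and theta.

    theta @ resid**2 equals ||Theta^(1/2)(y - A @ alpha)||^2 for Theta = diag(theta).
    """
    resid = y - A @ alpha
    return float(theta @ resid ** 2 + lam1 * (alpha @ alpha) + lam2 * (theta @ theta))
```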
preferably, step 4 further comprises the following substeps: step 4.1, initializing differential representation factors
Figure BDA0003430076780000032
Setting a threshold value and a maximum iteration number.
And 4.2, calculating to obtain theta by using the obtained theta, and updating the linear expression coefficient alpha by using the model in the step 3.1.
The objective function is:
Figure BDA0003430076780000033
and 4.3, obtaining b by utilizing the alpha obtained by learning in the step 4.1, and updating the differential expression matrix theta according to the model in the step 3.2.
The objective function is:
Figure BDA0003430076780000034
and 4.4, substituting the updated alpha and theta into the overall objective function to solve to obtain an error.
The objective function is:
Figure BDA0003430076780000035
and 4.5, repeating the steps 4.2, 4.3 and 4.4, and performing combined iterative optimization on alpha and theta.
The invention has the following advantages. Experiments show that the joint iterative optimization of α and θ converges in a short time; the complexity is low and results can be obtained in real time. In experiments, the method achieves good recognition accuracy on current public EEG emotion datasets, stable at approximately 85%. While improving recognition accuracy, the method uses the differentiated representation factor θ to obtain the importance of the frequency bands and leads in the EEG frequency-domain data, so its interpretability is improved and it can provide inspiration and evidence for the fields of medicine and cognitive neuroscience.
Drawings
FIG. 1 is a flow chart of the method.
Detailed Description
The present invention is further explained below with reference to the drawings.
Example 1
As shown in fig. 1, an emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction is described taking 62-lead, 5-frequency-band EEG data as an example, although the method is applicable to EEG data of any format. The method specifically comprises the following steps:
step 1, collecting electroencephalogram data
The subject rests for a while under weak interference from the external environment; after the emotion stabilizes, the subject wears an EEG electrode cap and watches designated film clips to induce emotional EEG. Potential data of the corresponding brain regions are collected through the different electrodes of the EEG cap and used as the raw EEG emotion dataset.
Step 2, data cleaning
Set the sampling frequency f_S and sample the EEG data acquired in step 1 in the time domain to obtain discrete EEG samples. Considering that EEG signals are weak and easily disturbed, a band-pass filter is used to filter the raw dataset, removing noise and artifacts, obtaining the data in the required frequency range and dividing the frequency bands. The method is described taking 1 Hz–70 Hz filtering and five frequency bands (Delta (1–4 Hz), Theta (4–8 Hz), Alpha (8–14 Hz), Beta (14–31 Hz) and Gamma (31–50 Hz)) as an example.
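A rough, numpy-only sketch of the five-band division (not the patent's filter: a real implementation would use a band-pass IIR/FIR filter such as a Butterworth design; here the spectrum is simply masked band-by-band for illustration):

```python
import numpy as np

BANDS = {"Delta": (1, 4), "Theta": (4, 8), "Alpha": (8, 14),
         "Beta": (14, 31), "Gamma": (31, 50)}  # Hz, as in the description

def split_bands_fft(eeg, fs):
    """Crude zero-phase band splitter: zero the FFT of each lead outside the
    band and invert. eeg: (n_leads, n_samples) or (n_samples,); fs in Hz."""
    eeg = np.atleast_2d(np.asarray(eeg, dtype=float))
    n = eeg.shape[-1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec = np.fft.rfft(eeg, axis=-1)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = np.fft.irfft(spec * mask, n=n, axis=-1)
    return out
```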
Step 3, data preprocessing and division
From the data obtained in step 2, different methods can be used to extract the EEG features, including the discrete Fourier transform, higher-order spectra, power spectra and so on; the method is applicable to any frequency-domain data. Here the differential entropy feature is taken as an example and computed according to the following formula.
The differential entropy feature (DE) of the data obtained in step 2 is calculated and used as the sample matrix A. Under a Gaussian assumption,

DE = (1/2) log(2πeσ²)

where μ is the expectation of the probability density function and σ is its standard deviation.
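A minimal sketch of this feature (illustrative; it assumes, as the formula does, that the segment is modelled as Gaussian so its DE depends only on the variance):

```python
import numpy as np

def differential_entropy(x):
    """DE of a signal segment under the Gaussian assumption:
    DE = 1/2 * log(2 * pi * e * sigma^2), sigma^2 the segment variance."""
    sigma2 = np.var(np.asarray(x, dtype=float))
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma2)
```

For a unit-variance segment this reduces to ½·log(2πe) ≈ 1.42.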
Meanwhile, the data labels are recorded and the EEG data are divided: for the same user, different time periods are selected as the dictionary A_s ∈ R^(d×n) and the target data y ∈ R^d respectively.
Preprocessing improves the quality of the EEG data, raises the signal-to-noise ratio and reduces interference, thereby guaranteeing the reliability of the subsequent cross-period transfer process.
Step 4, establish the feature contribution degree differentiated data reconstruction model. A differentiation factor θ is introduced to quantify the importance of each dimension of the EEG data, strengthening the features with strong correlation across time periods and suppressing the others during recognition, thereby improving the accuracy of the algorithm.
Step 4.1, establish the data reconstruction model

min_α ||Θ^(1/2)(y − A_s α)||² + λ₁||α||²    (1)

In the model, y ∈ R^d is a sample of the target data A_t, d is the dimension of the frequency-band features, α ∈ R^n is the linear representation coefficient corresponding to A_S, α_i (i = 1,2,3…n) is the i-th element of α, λ₁ is the parameter of the linear representation coefficient α, and C is the total number of emotion categories. θ ∈ R^d is the differentiated representation factor of the d features, and Θ = diag(θ) ∈ R^(d×d) is the diagonal matrix with θ on its diagonal.
Step 4.2, establish the differentiated feature contribution degree model

min_θ θᵀb + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (2)

wherein θ ∈ R^d is the differentiated representation factor, b = (y − A_s α) ⊙ (y − A_s α), d is the number of features, "⊙" is defined as the product of corresponding elements of two vectors, and λ₂ is the parameter corresponding to θ.
Step 4.3, combine the two models constructed in steps 4.1 and 4.2, placing the data reconstruction model and the differentiated feature contribution degree model under the same framework for iterative optimization; the optimization model is:

min_{α,θ} ||Θ^(1/2)(y − A_s α)||² + λ₁||α||² + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (3)
and 5, performing joint iterative optimization on the differential representation factor theta and the linear representation coefficient alpha according to the model established in the step 3.
Step 5.1, initializing differential representation factors
Figure BDA0003430076780000059
Setting a threshold value and a maximum iteration number.
And 5.2, calculating to obtain theta by using the obtained theta, and updating the linear expression coefficient alpha by using the model in the step 4.1.
The objective function is:
Figure BDA00034300767800000510
the objective function is an unconstrained optimization problem and is solved by a method of directly deriving alpha.
Order:
Figure BDA00034300767800000511
Figure BDA00034300767800000512
thereby obtaining:
Figure BDA00034300767800000513
wherein :
Figure BDA0003430076780000061
let the derivative value of formula (17) equal to 0 to obtain
Figure BDA0003430076780000062
The formula is utilized to carry out iterative optimization on alpha to obtain the optimal value of alpha under the current theta
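The closed-form update of α at fixed θ, α = (A_sᵀΘA_s + λ₁I)⁻¹A_sᵀΘy, can be sketched in numpy as follows (illustrative; θ is passed as a vector rather than forming the diagonal matrix):

```python
import numpy as np

def update_alpha(y, A, theta, lam1):
    """Closed-form ridge update of alpha at fixed theta:
    alpha = (A^T Theta A + lam1*I)^(-1) A^T Theta y, with Theta = diag(theta)."""
    AtT = A.T * theta                      # A^T Theta without building diag(theta)
    n = A.shape[1]
    return np.linalg.solve(AtT @ A + lam1 * np.eye(n), AtT @ y)
```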
Step 5.3, use the α learned in step 5.2 to obtain b = (y − A_s α) ⊙ (y − A_s α), where "⊙" is defined as the product of corresponding elements of two vectors, and update the differentiated representation vector θ according to the model of step 4.2.
The objective function is:

min_θ θᵀb + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (5)

This objective function is an optimization problem under inequality constraints and is solved with the method of Lagrange multipliers. Let m = b/(2λ₂); dividing (5) by 2λ₂ and completing the square (neither changes the minimizer) gives the equivalent problem

min_θ (1/2)||θ + m||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (12)

and the Lagrange function is constructed as:

L(θ, γ, β) = (1/2)||θ + m||² − γ(θᵀ1 − 1) − βᵀθ    (13)

wherein the scalar γ and the vector β ≥ 0 are Lagrange multipliers.
Let β*, γ* be the optimal solution of the corresponding equations. Using the KKT conditions, the following equations are obtained:

θ_i* + m_i − γ* − β_i* = 0,  β_i* θ_i* = 0,  β_i* ≥ 0,  θ_i* ≥ 0    (14)

Writing equation (14) in vector form:

θ* + m − γ*1 − β* = 0    (15)

With the constraint θᵀ1 = 1 introduced (left-multiplying (15) by 1ᵀ), we obtain

γ* = (1 + 1ᵀm − 1ᵀβ*)/d    (16)

Combining the stationarity condition with the complementary slackness condition β_i* θ_i* = 0 in (14): if θ_i* > 0 then β_i* = 0 and θ_i* = γ* − m_i; if θ_i* = 0 then β_i* = m_i − γ* ≥ 0. In scalar form this simplifies to

θ_i* = (γ* − m_i)₊    (17)

wherein (f(x))₊ = max(f(x), 0). In vector form:

θ* = (γ*1 − m)₊    (18)

At this time, if the optimal γ* can be determined, the optimal θ* can be obtained from (18), so the constraint θᵀ1 = 1 can be rewritten as

Σ_{i=1}^{d} (γ* − m_i)₊ = 1    (19)

According to constraint (19), define the function:

f(γ) = Σ_{i=1}^{d} (γ − m_i)₊ − 1    (20)

In view of f(γ) being piecewise linear and monotonically non-decreasing, the optimal γ* must satisfy f(γ*) = 0, which yields the iterative formula:

γ^(t+1) = γ^(t) − f(γ^(t)) / f′(γ^(t)),  f′(γ) = |{i : γ > m_i}|    (21)

According to formula (21), γ* can be computed by iterating with Newton's method, after which θ can be obtained according to equation (18).
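The θ-subproblem min_θ θᵀb + λ₂||θ||² s.t. θᵀ1 = 1, θ ≥ 0 is equivalent to the Euclidean projection of −b/(2λ₂) onto the probability simplex. The sketch below (illustrative names; not the patent's code) uses the standard sort-based projection in place of the Newton iteration — both locate the same multiplier γ*:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {t : t >= 0, sum(t) = 1} (sort-based)."""
    v = np.asarray(v, dtype=float)
    u = np.sort(v)[::-1]                    # sort descending
    css = np.cumsum(u) - 1.0
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u - css / idx > 0)[0][-1]
    tau = css[rho] / (rho + 1.0)            # plays the role of -gamma*
    return np.maximum(v - tau, 0.0)

def update_theta(b, lam2):
    """theta minimizing theta^T b + lam2*||theta||^2 on the simplex; equal to
    projecting -b/(2*lam2), the same fixed point the KKT/Newton steps find."""
    return project_simplex(-np.asarray(b, dtype=float) / (2.0 * lam2))
```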
Step 5.4, substitute the updated α and θ into the overall objective function to obtain the current overall error.
The objective function is:

min_{α,θ} ||Θ^(1/2)(y − A_s α)||² + λ₁||α||² + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (6)

Step 5.5, repeat steps 5.2, 5.3 and 5.4 several times, performing joint iterative optimization of α and θ; when the overall objective function reaches the threshold or the preset maximum number of iterations is exceeded, go to step 6.
Step 6, using the trained α and θ, compute the reconstruction error of the target data under each class of the dictionary, and select the class with the smallest reconstruction error as the recognition result:

r_k = ||Θ^(1/2)(y − A_S(k) α_k)||²

where A_S(k) denotes the samples in the dictionary A_S corresponding to the k-th emotion, α_k is the linear representation coefficient corresponding to A_S(k), and r_k is the reconstruction error under that class. The prediction result is the emotion label corresponding to the target sample.
Step 7, using the θ obtained in step 5, search for the key frequency bands and key leads under the current task through the following two formulas to obtain the EEG emotion activation mode. Here γ(q) denotes the importance of the q-th lead and ω(k) the importance of the k-th frequency band; a larger importance value indicates a more important feature. The 62-lead, 5-band case is used for illustration:

ω(k) = θ_((k−1)·62+1) + θ_((k−1)·62+2) + … + θ_(k·62)    (8)
γ(q) = θ_q + θ_(62+q) + θ_(2·62+q) + θ_(3·62+q) + θ_(4·62+q)    (9)
From the importance of each frequency band and each lead, a brain topographic map and a frequency-band weight histogram under the task can be obtained, visualizing their expressive ability.

Claims (5)

1. An emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction, characterized by comprising the following steps:
step 1, collecting electroencephalogram data of a user under C emotional states at different time intervals;
step 2, performing feature marking and preprocessing on the data acquired in step 1, wherein each sample matrix A consists of the electroencephalogram frequency-domain data of a subject, a label vector s records the emotional state label corresponding to each sample in A, and for the same user different time periods are selected as the dictionary A_s ∈ R^(d×n) and the target data y ∈ R^d respectively;
Step 3, establishing a differential feature contribution degree model; obtaining a target function of joint optimization;
step 4, first initializing the differentiated representation factor θ, then, with the joint optimization objective function obtained in step 3, performing joint iterative optimization of the differentiated representation factor θ and the linear representation coefficient α by alternately fixing one variable while updating the other;
step 5, substituting the sample matrix A and the target data y into a reconstruction error calculation function by using the differential expression factor theta and the linear expression coefficient alpha obtained in the step 4 to obtain reconstruction errors under each category, and selecting the category with the minimum reconstruction error as an identification result;
and 6, acquiring importance degrees of each frequency band and lead under the current task by using the differential representation factor theta obtained in the step 4 and using a frequency band and lead importance calculation model, namely acquiring an activation mode of electroencephalogram emotion.
2. The emotion activation pattern mining method for feature contribution degree differentiation electroencephalogram data reconstruction according to claim 1, characterized in that step 3, establishing the feature contribution degree differentiated data reconstruction model, specifically comprises:
step 3.1, establishing the data reconstruction model

min_α ||Θ^(1/2)(y − A_s α)||² + λ₁||α||²    (1)

wherein y ∈ R^d is a sample of the target data A_t, d is the dimension of the frequency-band features, α ∈ R^n is the linear representation coefficient corresponding to A_S, α_i (i = 1,2,3…n) is the i-th element of α, λ₁ is the parameter of the linear representation coefficient α, and C is the total number of emotion categories; θ ∈ R^d is the differentiated representation factor of the d features, and Θ = diag(θ) ∈ R^(d×d) is the diagonal matrix with θ on its diagonal;
step 3.2, establishing the differentiated feature contribution degree model

min_θ θᵀb + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (2)

wherein θ ∈ R^d is the differentiated representation factor of each feature, d represents the number of features, b = (y − A_s α) ⊙ (y − A_s α), "⊙" is defined as the product of corresponding elements of two vectors, and λ₂ is the parameter of θ;
step 3.3, combining the two models constructed in steps 3.1 and 3.2, and performing iterative optimization of the data reconstruction model and the differentiated feature contribution degree model under the same framework, wherein the optimization model is:

min_{α,θ} ||Θ^(1/2)(y − A_s α)||² + λ₁||α||² + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (3).
3. The emotion activation pattern mining method for feature contribution degree differentiation electroencephalogram data reconstruction according to claim 1 or 2, characterized in that in step 4 the differentiated representation factor is initialized uniformly, θ_i = 1/d.
4. The emotion activation pattern mining method for feature contribution degree differentiation electroencephalogram data reconstruction according to claim 2, characterized in that step 4 uses the following optimization strategy:
step 4.1, initializing the differentiated representation factor θ, and setting a threshold value and a maximum number of iterations;
step 4.2, forming Θ = diag(θ) from the current θ, and updating the linear representation coefficient α using the model of step 3.1, the objective function being:

min_α ||Θ^(1/2)(y − A_s α)||² + λ₁||α||²    (4)

step 4.3, obtaining b using the α learned in step 4.2, and updating the differentiated representation vector θ according to the model of step 3.2, the objective function being:

min_θ θᵀb + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (5)

step 4.4, substituting the updated α and θ into the overall objective function to obtain the error, the objective function being:

min_{α,θ} ||Θ^(1/2)(y − A_s α)||² + λ₁||α||² + λ₂||θ||²  s.t. θᵀ1 = 1, θ_i ≥ 0    (6)

step 4.5, repeating steps 4.2, 4.3 and 4.4, performing joint iterative optimization of α and θ.
5. The emotion activation pattern mining method for feature contribution degree differentiation electroencephalogram data reconstruction according to claim 1, characterized in that step 6, by recording the relationship between the EEG frequency bands, the leads and the components of the weight vector, uses the following formula (7) (illustrated for 62 leads and 5 bands):

ω(k) = θ_((k−1)·62+1) + θ_((k−1)·62+2) + … + θ_(k·62)
γ(q) = θ_q + θ_(62+q) + θ_(2·62+q) + θ_(3·62+q) + θ_(4·62+q)    (7)

to search for the key EEG frequency bands and key leads under the current task and acquire the EEG activation mode.
CN202111608170.4A 2021-12-23 2021-12-23 Emotion activation mode mining method for characteristic contribution degree difference electroencephalogram data reconstruction Active CN114209341B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111608170.4A CN114209341B (en) 2021-12-23 2021-12-23 Emotion activation mode mining method for characteristic contribution degree difference electroencephalogram data reconstruction

Publications (2)

Publication Number Publication Date
CN114209341A true CN114209341A (en) 2022-03-22
CN114209341B CN114209341B (en) 2023-06-20

Family

ID=80705913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111608170.4A Active CN114209341B (en) 2021-12-23 2021-12-23 Emotion activation mode mining method for characteristic contribution degree difference electroencephalogram data reconstruction

Country Status (1)

Country Link
CN (1) CN114209341B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106886792A (en) * 2017-01-22 2017-06-23 北京工业大学 EEG emotion recognition method constructing a multi-classifier fusion model based on a hierarchical mechanism
CN110353675A (en) * 2019-08-14 2019-10-22 东南大学 EEG signal emotion recognition method and device based on picture generation
CN110876626A (en) * 2019-11-22 2020-03-13 兰州大学 Depression detection system based on optimal lead selection of multi-lead electroencephalogram
CN111067513A (en) * 2019-12-11 2020-04-28 杭州电子科技大学 Sleep quality detection key brain area judgment method based on characteristic weight self-learning
CN111265214A (en) * 2020-02-25 2020-06-12 杭州电子科技大学 Electroencephalogram signal analysis method based on data structured decomposition
CN111616721A (en) * 2020-05-31 2020-09-04 天津大学 Emotion recognition system based on deep learning and brain-computer interface and application
CN111643077A (en) * 2020-06-19 2020-09-11 北方工业大学 Electroencephalogram data-based identification method for traffic dynamic factor complexity
US20200410890A1 (en) * 2018-03-09 2020-12-31 Advanced Telecommunications Research Institute International Brain activity training apparatus, brain activity training method and brain activity training program
CN112656427A (en) * 2020-11-26 2021-04-16 山西大学 Electroencephalogram emotion recognition method based on dimension model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Long Yanfang: "Research on EEG emotion recognition and reconstruction methods based on neural networks", Master's thesis, Hangzhou Dianzi University, pages 6-51 *

Also Published As

Publication number Publication date
CN114209341B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
Khare et al. PDCNNet: An automatic framework for the detection of Parkinson’s disease using EEG signals
Bhattacharyya et al. A novel multivariate-multiscale approach for computing EEG spectral and temporal complexity for human emotion recognition
Khare et al. SPWVD-CNN for automated detection of schizophrenia patients using EEG signals
Wen et al. Deep convolution neural network and autoencoders-based unsupervised feature learning of EEG signals
CN112656427B (en) Electroencephalogram emotion recognition method based on dimension model
CN106886792B (en) Electroencephalogram emotion recognition method for constructing multi-classifier fusion model based on layering mechanism
CN110070105B (en) Electroencephalogram emotion recognition method and system based on meta-learning example rapid screening
CN114224342B (en) Multichannel electroencephalogram signal emotion recognition method based on space-time fusion feature network
CN112244873A (en) Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network
CN114533086B (en) Motor imagery brain electrolysis code method based on airspace characteristic time-frequency transformation
CN114305452B (en) Cross-task cognitive load identification method based on electroencephalogram and field adaptation
Hurtado-Rincon et al. Motor imagery classification using feature relevance analysis: An Emotiv-based BCI system
Zhang et al. Four-classes human emotion recognition via entropy characteristic and random Forest
CN110338760B (en) Schizophrenia three-classification method based on electroencephalogram frequency domain data
CN117883082A (en) Abnormal emotion recognition method, system, equipment and medium
Kim et al. eRAD-Fe: Emotion recognition-assisted deep learning framework
Alessandrini et al. EEG-Based Neurodegenerative Disease Classification using LSTM Neural Networks
Wang et al. EEG-based emotion identification using 1-D deep residual shrinkage network with microstate features
Gagliardi et al. Improving emotion recognition systems by exploiting the spatial information of EEG sensors
CN114052734B (en) Electroencephalogram emotion recognition method based on progressive graph convolution neural network
CN114209341A (en) Emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction
Chenane et al. EEG Signal Classification for BCI based on Neural Network
Castaño-Candamil et al. Post-hoc labeling of arbitrary M/EEG recordings for data-efficient evaluation of neural decoding methods
Vadivu et al. An Novel Versatile Inspiring Wavelet Transform and Resilient Direct Neural Network Classification Techniques for Monitoring Brain Activity System Based on EEG Signal
Hasan et al. Emotion prediction through EEG recordings using computational intelligence

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant