CN114209341A - Emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction - Google Patents
Emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction
- Publication number: CN114209341A (application CN202111608170.4A)
- Authority
- CN
- China
- Prior art keywords
- electroencephalogram
- emotion
- model
- contribution degree
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
(All under A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION; A61B5/00—Measuring for diagnostic purposes; Identification of persons)
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/372—Analysis of electroencephalograms
- A61B5/374—Detecting the frequency distribution of signals, e.g. detecting delta, theta, alpha, beta or gamma waves
- A61B5/377—Electroencephalography [EEG] using evoked responses
- A61B5/378—Visual stimuli
- A61B5/38—Acoustic or auditory stimuli
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
- A61B5/7253—Details of waveform analysis characterised by using transforms
- A61B5/7257—Details of waveform analysis characterised by using transforms using Fourier transforms
Abstract
The invention provides an emotion activation mode mining method for feature contribution degree differentiation electroencephalogram (EEG) data reconstruction. The method comprises the following steps: 1. Acquire EEG data from multiple subjects in emotion-inducing scenarios. 2. Preprocess the EEG data obtained in step 1, extract features, and prepare the related data. 3. Establish a feature-contribution-degree-differentiated data reconstruction model. 4. Solve and train the model established in step 3 and obtain emotion recognition results on the test data. 5. Obtain key frequency-band and key-lead information, namely the EEG emotion activation mode, from the differentiated representation factors. By innovatively using differentiated representation factors, the method improves the prediction accuracy of the EEG emotion recognition model; at the same time, the activation mode of EEG emotion can be obtained with the method.
Description
Technical Field
The invention belongs to the technical field of electroencephalogram (EEG) signal processing, and in particular relates to an emotion activation mode mining method for feature contribution degree differentiation EEG data reconstruction.
Background
Emotion is the psychological and physiological response of human beings to external or internal stimuli, and is closely related to cognition, decision making, work efficiency, and other aspects of daily life. Emotion recognition is therefore a fundamental and important link in the field of affective computing: a computer's ability to accurately recognize human emotional states is a prerequisite for harmonious human-computer interaction. The electroencephalogram (EEG) is the scalp potential response produced by neuronal activity of the central nervous system; it records human brain activity as an electrical signal. As a physiological signal originating from the central nervous system, it cannot be disguised and is objective and authentic, so the EEG signal has become the gold standard for emotion recognition.
As a high-level activity of the brain, the production of emotion depends on the synergy of multiple brain regions. Emotional events are sparse and infrequent, and the EEG is a typical multi-channel, multi-rhythm signal (multi-channel means the EEG is measured with multiple leads; multi-rhythm means the EEG is divided into frequency bands with band-pass filters). On which channels and rhythms of the brain emotional EEG information is saliently distributed is therefore a key problem in current research.
Traditional methods approach the task purely from the machine-learning and pattern-recognition perspective and ignore the fact that the data of each frequency band and lead express different emotions with different strengths. As a result, they perform poorly on EEG emotion recognition tasks and cannot satisfy the demand of human-computer interaction for high-accuracy emotion recognition; moreover, lacking interpretability, they cannot reveal the activation mode of EEG emotion.
Disclosure of Invention
Aiming at these problems, the invention provides an emotion activation mode mining method for feature contribution degree differentiation EEG data reconstruction. The method introduces a differentiated representation vector θ and fuses it with a data reconstruction model, which improves recognition accuracy; at the same time, once the recognition task is completed, the differentiated representation vector quantitatively characterizes the importance of brain regions and frequency bands, enhancing the interpretability of the method.
The invention provides the following technical scheme:
Step 3, establish the feature-contribution-degree-differentiated data reconstruction model.

Step 4, perform joint iterative optimization of the differentiated representation factor θ and the linear representation coefficient α according to the model established in step 3.

Step 5, with the trained α and θ, compute the reconstruction error of the target data under each category, and select the category with the smallest reconstruction error as the recognition result:

$$r_k=\sum_{i=1}^{d}\theta_i\big(y-A_s(k)\,\alpha_k\big)_i^{2}$$

where $A_s(k)$ denotes the samples in the dictionary $A_s$ corresponding to the $k$-th emotion, $\alpha_k$ is the linear representation coefficient corresponding to $A_s(k)$, and $r_k$ is the reconstruction error under that category.

Step 6, with the θ obtained in step 4, find the key EEG frequency bands and key leads under the current task through the following two formulas and obtain the EEG activation mode. Here $d(q)$ denotes the importance of the $q$-th lead and $\omega(k)$ the importance of the $k$-th frequency band; a larger importance value indicates that the feature is more important:

$$d(q)=\sum_{k=1}^{5}\theta_{(k-1)\times 62+q} \qquad (7)$$

$$\omega(k)=\theta_{(k-1)\times 62+1}+\theta_{(k-1)\times 62+2}+\cdots+\theta_{k\times 62} \qquad (8)$$
Preferably, step 3 further comprises the following substeps:

Step 3.1, establish the data reconstruction model:

$$\min_{\alpha}\;(y-A_s\alpha)^{T}\Theta\,(y-A_s\alpha)+\lambda_1\|\alpha\|_2^{2} \qquad (1)$$

In formula (1), $y\in\mathbb{R}^{d}$ is a sample of the target data $A_t$, $d$ is the dimension of the frequency-band features, $\alpha\in\mathbb{R}^{n}$ is the linear representation coefficient corresponding to $A_s$, $\alpha_i$ is the $i$-th element of α ($i=1,2,3\dots n$), $\lambda_1$ is the regularization parameter of the linear representation coefficient α, and $c$ is the total number of emotion categories. $\theta\in\mathbb{R}^{d}$ is the vector of differentiated representation factors for the $d$ features, and $\Theta=\mathrm{diag}(\theta)$ is the corresponding diagonal matrix.

Step 3.2, establish the differentiated feature contribution degree model:

$$\min_{\theta}\;\theta^{T}b+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (2)$$

In formula (2), θ is the differentiated representation vector, $b=(y-A_s\alpha)\odot(y-A_s\alpha)$, $d$ denotes the number of features, ⊙ is defined as the product of corresponding elements in a vector, and $\lambda_2$ is the parameter of θ.

Step 3.3, combine the two models constructed in steps 3.1 and 3.2, and place the data reconstruction model and the differentiated feature contribution degree model under the same framework for iterative optimization; the optimization model is formula (3):

$$\min_{\alpha,\theta}\;\theta^{T}\big[(y-A_s\alpha)\odot(y-A_s\alpha)\big]+\lambda_1\|\alpha\|_2^{2}+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (3)$$
preferably, step 4 further comprises the following substeps: step 4.1, initializing differential representation factorsSetting a threshold value and a maximum iteration number.
And 4.2, calculating to obtain theta by using the obtained theta, and updating the linear expression coefficient alpha by using the model in the step 3.1.
The objective function is:
and 4.3, obtaining b by utilizing the alpha obtained by learning in the step 4.1, and updating the differential expression matrix theta according to the model in the step 3.2.
The objective function is:
and 4.4, substituting the updated alpha and theta into the overall objective function to solve to obtain an error.
The objective function is:
and 4.5, repeating the steps 4.2, 4.3 and 4.4, and performing combined iterative optimization on alpha and theta.
The invention has the following advantages. Experiments on the joint iterative optimization of α and θ show that the objective function converges in a short time; the complexity is low and results can be obtained in real time. Experiments also show that the method achieves good recognition accuracy on current public EEG datasets, with stable values of approximately 85%. While improving recognition accuracy, the method uses the differentiated representation factor θ to obtain information on the important frequency bands and leads in the EEG frequency-domain data, so the method gains in interpretability and can provide inspiration and evidence for the fields of medicine and cognitive neuroscience.
Drawings
FIG. 1 is a flow chart of the method.
Detailed Description
The present invention is further explained below with reference to the drawings.
Example 1
As shown in FIG. 1, an emotion activation mode mining method for feature contribution degree differentiation EEG data reconstruction is described by taking 62-lead, 5-frequency-band EEG data as an example, although the method is applicable to EEG data of any format. The method specifically comprises the following steps:

Step 1, data acquisition. The subject rests for a while under conditions of weak interference from the external environment. After the subject's emotion has stabilized, the subject wears an EEG electrode cap and watches designated film clips to induce emotional EEG. The potential data of the corresponding brain regions are acquired through the different electrodes of the EEG cap and used as the original emotional EEG dataset.

Step 2, sampling and filtering. Set the sampling frequency $f_S$ and sample the EEG data acquired in step 1 in the time domain to obtain discrete EEG samples. Considering that EEG data are weak and easily disturbed, filter the original dataset with a band-pass filter to remove noise and artifacts, obtain the data in the required frequency range, and divide it into frequency bands. The method is described taking 1 Hz-70 Hz filtering and 5 frequency bands (Delta (1-4 Hz), Theta (4-8 Hz), Alpha (8-14 Hz), Beta (14-31 Hz) and Gamma (31-50 Hz)) as an example.
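The band-splitting step above can be sketched with a Butterworth band-pass design. This is a minimal sketch assuming SciPy; the sampling rate, filter order, and function names are illustrative and not specified by the patent:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Band definitions (Hz) as stated in the text.
BANDS = {"Delta": (1, 4), "Theta": (4, 8), "Alpha": (8, 14),
         "Beta": (14, 31), "Gamma": (31, 50)}

def split_bands(eeg, fs, bands=BANDS, order=4):
    """Band-pass filter each channel of `eeg` (leads x samples) into rhythm bands."""
    out = {}
    for name, (lo, hi) in bands.items():
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="bandpass")
        out[name] = filtfilt(b, a, eeg, axis=-1)  # zero-phase filtering
    return out
```

Zero-phase filtering (`filtfilt`) is used so the band division does not shift the signal in time; any band-pass design with the stated cutoffs would serve the same role.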
Step 3, data preprocessing and division
According to the data obtained in step 2, different methods can be used to extract EEG features, including the discrete Fourier transform, higher-order spectra, power spectra, and so on; the method is applicable to any frequency-domain data. Here it is described taking the differential entropy feature as an example.

Compute the differential entropy (DE) feature of the data obtained in step 2 and use it to form the sample matrix A. For a band-filtered EEG segment approximately following a Gaussian distribution $N(\mu,\sigma^{2})$, the differential entropy is

$$h(X)=\frac{1}{2}\log\big(2\pi e\sigma^{2}\big)$$

where μ is the expectation of the probability density function and σ is the standard deviation of the probability density function.
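Under the Gaussian assumption, the DE feature reduces to a function of the band variance. A minimal sketch assuming NumPy; the band-major stacking into the sample vector is illustrative:

```python
import numpy as np

def differential_entropy(x):
    """DE of a band-filtered EEG segment under the Gaussian assumption:
    h = 0.5 * log(2 * pi * e * sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def de_feature_vector(band_data):
    """Stack per-band, per-lead DE values band-major (band k, lead q maps to
    index (k-1)*n_leads + q), matching the index ordering used for theta.
    `band_data`: list of (n_leads x n_samples) arrays, one per band."""
    return np.array([differential_entropy(ch) for band in band_data for ch in band])
```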
At the same time, record the data labels and divide the EEG data: for the same user, select different time periods to serve respectively as the dictionary $A_s\in\mathbb{R}^{d\times n}$ and the target data $A_t\in\mathbb{R}^{d\times m}$.
Preprocessing effectively improves the quality of the EEG data, raises the signal-to-noise ratio, and reduces interference, thereby ensuring the reliability of the subsequent transfer process.
Step 4, establish the feature-contribution-degree-differentiated data reconstruction model. A differentiation factor θ is introduced to quantitatively express the importance of each dimension of the EEG data, which emphasizes, during recognition, the features that stay strongly correlated across time periods and suppresses the others, thereby improving the accuracy of the algorithm.
Step 4.1, establish the data reconstruction model:

$$\min_{\alpha}\;(y-A_s\alpha)^{T}\Theta\,(y-A_s\alpha)+\lambda_1\|\alpha\|_2^{2} \qquad (1)$$

In the model, $y\in\mathbb{R}^{d}$ is a sample of the target data $A_t$, $d$ is the dimension of the frequency-band features, $\alpha\in\mathbb{R}^{n}$ is the linear representation coefficient corresponding to $A_s$, $\alpha_i$ is the $i$-th element of α ($i=1,2,3\dots n$), $\lambda_1$ is the regularization parameter of the linear representation coefficient α, and $c$ is the total number of emotion categories. $\theta\in\mathbb{R}^{d}$ is the differentiated representation factor of the $d$ features, and $\Theta=\mathrm{diag}(\theta)$ is the corresponding diagonal matrix.

Step 4.2, establish the differentiated feature contribution degree model:

$$\min_{\theta}\;\theta^{T}b+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (2)$$

wherein θ is the differentiated representation factor, $b=(y-A_s\alpha)\odot(y-A_s\alpha)$, $d$ is the number of features, ⊙ is defined as the product of corresponding elements in a vector, and $\lambda_2$ is the parameter corresponding to θ.

Step 4.3, combine the two models constructed in steps 4.1 and 4.2, and place the data reconstruction model and the differentiated feature contribution degree model in the same framework for iterative optimization; the optimization model is:

$$\min_{\alpha,\theta}\;\theta^{T}\big[(y-A_s\alpha)\odot(y-A_s\alpha)\big]+\lambda_1\|\alpha\|_2^{2}+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (3)$$
and 5, performing joint iterative optimization on the differential representation factor theta and the linear representation coefficient alpha according to the model established in the step 3.
Step 5.1, initializing differential representation factorsSetting a threshold value and a maximum iteration number.
And 5.2, calculating to obtain theta by using the obtained theta, and updating the linear expression coefficient alpha by using the model in the step 4.1.
The objective function is:
the objective function is an unconstrained optimization problem and is solved by a method of directly deriving alpha.
Order:
thereby obtaining:
let the derivative value of formula (17) equal to 0 to obtain
The formula is utilized to carry out iterative optimization on alpha to obtain the optimal value of alpha under the current theta
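The closed-form update derived by setting the derivative to zero can be implemented in a few lines. A sketch assuming NumPy (the function name is illustrative):

```python
import numpy as np

def update_alpha(y, As, theta, lam1):
    """Ridge-style closed form from the zero-derivative condition:
    alpha = (As^T Theta As + lam1 * I)^(-1) As^T Theta y, Theta = diag(theta)."""
    n = As.shape[1]
    AtT = As.T * theta  # equals As.T @ diag(theta) without forming the diagonal matrix
    return np.linalg.solve(AtT @ As + lam1 * np.eye(n), AtT @ y)
```

Solving the linear system with `np.linalg.solve` avoids the explicit matrix inverse, which is both faster and numerically safer.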
Step 5.3, compute $b=(y-A_s\alpha)\odot(y-A_s\alpha)$ from the α learned in step 5.2, where ⊙ is defined as the product of corresponding elements in a vector, and update the differentiated representation vector θ by learning the model of step 4.2.

The objective function is:

$$\min_{\theta}\;\theta^{T}b+\lambda_2\theta^{T}\theta,\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (19)$$

This objective function is an optimization problem under inequality constraints and is solved with the Lagrange multiplier method. The Lagrange function is constructed as:

$$L(\theta,\beta,\gamma)=\theta^{T}b+\lambda_2\theta^{T}\theta-\beta\big(\theta^{T}\mathbf{1}-1\big)-\gamma^{T}\theta \qquad (20)$$

wherein β and γ are Lagrange multipliers.

Let $\beta^{*}$, $\gamma^{*}$ be the optimal solution of the corresponding equations. Using the KKT conditions, the following equations are obtained:

$$b_i+2\lambda_2\theta_i^{*}-\beta^{*}-\gamma_i^{*}=0,\qquad \gamma_i^{*}\ge 0,\qquad \gamma_i^{*}\theta_i^{*}=0 \qquad (21)$$

Dividing the first equation in (21) by $2\lambda_2$, letting $m=b/(2\lambda_2)$ and absorbing the factor $1/(2\lambda_2)$ into the multipliers, it can be written in vector form as:

$$\theta^{*}+m-\beta^{*}\mathbf{1}-\gamma^{*}=0 \qquad (22)$$

In scalar form, $\theta_i^{*}=\beta^{*}-m_i+\gamma_i^{*}$. Combining this with the complementary slackness condition $\gamma_i^{*}\theta_i^{*}=0$ in (21): if $\theta_i^{*}>0$, then $\gamma_i^{*}=0$ and $\theta_i^{*}=\beta^{*}-m_i$; otherwise $\theta_i^{*}=0$. Hence:

$$\theta_i^{*}=\big(\beta^{*}-m_i\big)_{+},\qquad \text{where }(f(x))_{+}=\max(f(x),0) \qquad (23)$$

At this time, if the optimal $\beta^{*}$ can be determined, the optimal $\theta^{*}$ is obtained from (23). Introducing the constraint $\theta^{T}\mathbf{1}=1$, define the function:

$$f(\beta)=\sum_{i=1}^{d}\big(\beta-m_i\big)_{+}-1 \qquad (24)$$

The root of $f(\beta)=0$ can be computed by iterating with Newton's method, after which θ is obtained from equation (23).
Step 5.4, substitute the updated α and θ into the overall objective function and solve for the overall error at this point.

The objective function is:

$$\min_{\alpha,\theta}\;\theta^{T}\big[(y-A_s\alpha)\odot(y-A_s\alpha)\big]+\lambda_1\|\alpha\|_2^{2}+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0$$

Step 5.5, repeat steps 5.2, 5.3 and 5.4 multiple times, performing joint iterative optimization of α and θ; when the overall objective function reaches the threshold or the preset maximum number of iterations is exceeded, proceed to step 6.
Step 6, with the trained α and θ, compute the reconstruction error of the target data under each class of data in the dictionary, and select the class with the smallest reconstruction error as the recognition result:

$$r_k=\sum_{i=1}^{d}\theta_i\big(y-A_s(k)\,\alpha_k\big)_i^{2}$$

where $A_s(k)$ denotes the samples in the dictionary $A_s$ corresponding to the $k$-th emotion, $\alpha_k$ is the linear representation coefficient corresponding to $A_s(k)$, and $r_k$ is the reconstruction error under that category. The prediction result is the emotion label corresponding to the target sample.
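The per-class reconstruction-error decision can be sketched as follows (NumPy assumed; `labels` marks the emotion class of each dictionary column, and the function name is illustrative):

```python
import numpy as np

def classify(y, As, labels, alpha, theta):
    """For each class k, reconstruct y from only the class-k dictionary columns
    and their coefficients; r_k = sum_i theta_i * residual_i^2.
    Predict the class with the smallest weighted reconstruction error."""
    errs = {}
    for k in np.unique(labels):
        idx = (labels == k)
        r = y - As[:, idx] @ alpha[idx]
        errs[k] = theta @ (r * r)
    return min(errs, key=errs.get), errs
```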
Step 7, with the θ obtained in step 5, find the key frequency bands and key leads under the current task through the following two formulas and obtain the EEG emotion activation mode. Here $d(q)$ denotes the importance of the $q$-th lead and $\omega(k)$ the importance of the $k$-th frequency band; a larger importance value indicates that the feature is more important. The 62-lead, 5-band case is used as the illustration:

$$d(q)=\sum_{k=1}^{5}\theta_{(k-1)\times 62+q}$$

$$\omega(k)=\theta_{(k-1)\times 62+1}+\theta_{(k-1)\times 62+2}+\cdots+\theta_{k\times 62}$$

From the importance of each frequency band and each lead, a brain topographic map and a frequency-band weight histogram under the task can be obtained, visualizing their expressive power.
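With θ stored band-major (band k occupies indices $(k-1)\times 62+1$ through $k\times 62$), both importance measures are row and column sums of a $5\times 62$ reshape. A sketch (NumPy assumed, names illustrative):

```python
import numpy as np

def band_lead_importance(theta, n_bands=5, n_leads=62):
    """omega[k]: importance of frequency band k (row sum).
    d[q]: importance of lead q, summed over all bands (column sum)."""
    T = theta.reshape(n_bands, n_leads)  # band-major layout
    omega = T.sum(axis=1)
    d = T.sum(axis=0)
    return omega, d
```

The returned vectors are what a frequency-band weight histogram and a brain topographic map would be plotted from.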
Claims (5)
1. An emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction, characterized in that the method specifically comprises the following steps:
step 1, collecting electroencephalogram data of a user under C emotional states at different time intervals;
step 2, performing feature marking and preprocessing on the data acquired in the step 1, wherein each sample matrix consists of electroencephalogram frequency domain data of a testee, a label vector s records an emotional state label corresponding to each sample in the A, and different time periods are selected for the same user to be used as dictionaries respectivelyAnd target data
Step 3, establish the data reconstruction model and the differentiated feature contribution degree model, and obtain the objective function for joint optimization;
Step 4, first initialize the differentiated representation factor $\theta^{(0)}$; then, with the joint optimization objective function obtained in step 3, perform joint iterative optimization of the differentiated representation factor θ and the linear representation coefficient α by fixing one variable and updating the other;
Step 5, with the differentiated representation factor θ and the linear representation coefficient α obtained in step 4, substitute the sample matrix A and the target data y into the reconstruction error calculation function to obtain the reconstruction error under each category, and select the category with the smallest reconstruction error as the recognition result;
and 6, acquiring importance degrees of each frequency band and lead under the current task by using the differential representation factor theta obtained in the step 4 and using a frequency band and lead importance calculation model, namely acquiring an activation mode of electroencephalogram emotion.
2. The emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction according to claim 1, characterized in that step 3, establishing the feature contribution degree differentiated data reconstruction model, specifically comprises:

Step 3.1, establish the data reconstruction model:

$$\min_{\alpha}\;(y-A_s\alpha)^{T}\Theta\,(y-A_s\alpha)+\lambda_1\|\alpha\|_2^{2} \qquad (1)$$

In the model, $y\in\mathbb{R}^{d}$ is a sample of the target data $A_t$, $d$ is the dimension of the frequency-band features, $\alpha\in\mathbb{R}^{n}$ is the linear representation coefficient corresponding to $A_s$, $\alpha_i$ is the $i$-th element of α ($i=1,2,3\dots n$), $\lambda_1$ is the parameter of the linear representation coefficient α, and $c$ is the total number of emotion categories; $\theta\in\mathbb{R}^{d}$ is the differentiated representation factor of the $d$ features, and $\Theta=\mathrm{diag}(\theta)$ is the corresponding diagonal matrix;

Step 3.2, establish the differentiated feature contribution degree model:

$$\min_{\theta}\;\theta^{T}b+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (2)$$

wherein θ is the differentiated representation factor of each feature, $d$ represents the number of features, $b=(y-A_s\alpha)\odot(y-A_s\alpha)$, ⊙ is defined as the product of corresponding elements in a vector, and $\lambda_2$ is the parameter of θ;

Step 3.3, combine the two models constructed in steps 3.1 and 3.2, and perform iterative optimization of the data reconstruction model and the differentiated feature contribution degree model under the same framework; the optimization model is:

$$\min_{\alpha,\theta}\;\theta^{T}\big[(y-A_s\alpha)\odot(y-A_s\alpha)\big]+\lambda_1\|\alpha\|_2^{2}+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (3)$$
4. The emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction according to claim 2, characterized in that step 4 uses the following optimization strategy:

Step 4.1, initialize the differentiated representation factor $\theta^{(0)}$, and set a threshold and a maximum number of iterations;

Step 4.2, compute $\Theta=\mathrm{diag}(\theta)$ from the current θ, and update the linear representation coefficient α using the model of step 3.1; the objective function is:

$$\min_{\alpha}\;(y-A_s\alpha)^{T}\Theta\,(y-A_s\alpha)+\lambda_1\|\alpha\|_2^{2} \qquad (4)$$

Step 4.3, compute b from the α learned in step 4.2, and update the differentiated representation vector θ according to the model of step 3.2; the objective function is:

$$\min_{\theta}\;\theta^{T}b+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (5)$$

Step 4.4, substitute the updated α and θ into the overall objective function and solve for the error; the objective function is:

$$\min_{\alpha,\theta}\;\theta^{T}\big[(y-A_s\alpha)\odot(y-A_s\alpha)\big]+\lambda_1\|\alpha\|_2^{2}+\lambda_2\|\theta\|_2^{2},\quad \text{s.t. }\theta^{T}\mathbf{1}=1,\;\theta_i\ge 0 \qquad (6)$$

Step 4.5, repeat steps 4.2, 4.3 and 4.4, performing joint iterative optimization of α and θ.
5. The emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction according to claim 1, characterized in that in step 6, by recording the relationship between the EEG frequency bands, the leads and the components of the weight vector θ, the following formula (7) is used:

$$\omega(k)=\theta_{(k-1)\times 62+1}+\theta_{(k-1)\times 62+2}+\cdots+\theta_{k\times 62} \qquad (7)$$

to find the key EEG frequency bands and key leads under the current task and acquire the EEG activation mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111608170.4A CN114209341B (en) | 2021-12-23 | 2021-12-23 | Emotion activation mode mining method for characteristic contribution degree difference electroencephalogram data reconstruction |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111608170.4A CN114209341B (en) | 2021-12-23 | 2021-12-23 | Emotion activation mode mining method for characteristic contribution degree difference electroencephalogram data reconstruction |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114209341A true CN114209341A (en) | 2022-03-22 |
CN114209341B CN114209341B (en) | 2023-06-20 |
Family
ID=80705913
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111608170.4A Active CN114209341B (en) | 2021-12-23 | 2021-12-23 | Emotion activation mode mining method for characteristic contribution degree difference electroencephalogram data reconstruction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114209341B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106886792A (en) * | 2017-01-22 | 2017-06-23 | Beijing University of Technology | EEG emotion recognition method that constructs a multi-classifier fusion model based on a layering mechanism
CN110353675A (en) * | 2019-08-14 | 2019-10-22 | Southeast University | EEG signal emotion recognition method and device based on image generation
CN110876626A (en) * | 2019-11-22 | 2020-03-13 | Lanzhou University | Depression detection system based on optimal lead selection of multi-lead electroencephalogram
CN111067513A (en) * | 2019-12-11 | 2020-04-28 | Hangzhou Dianzi University | Method for identifying key brain regions for sleep quality detection based on feature-weight self-learning
CN111265214A (en) * | 2020-02-25 | 2020-06-12 | Hangzhou Dianzi University | Electroencephalogram signal analysis method based on structured data decomposition
CN111616721A (en) * | 2020-05-31 | 2020-09-04 | Tianjin University | Emotion recognition system based on deep learning and brain-computer interface, and its application
CN111643077A (en) * | 2020-06-19 | 2020-09-11 | North China University of Technology | Method for identifying the complexity of dynamic traffic factors based on electroencephalogram data
US20200410890A1 (en) * | 2018-03-09 | 2020-12-31 | Advanced Telecommunications Research Institute International | Brain activity training apparatus, brain activity training method and brain activity training program
CN112656427A (en) * | 2020-11-26 | 2021-04-16 | Shanxi University | Electroencephalogram emotion recognition method based on a dimensional model
Non-Patent Citations (1)
Title |
---|
Long Yanfang: "Research on EEG Emotion Recognition and Reconstruction Methods Based on Neural Networks", Master's thesis, Hangzhou Dianzi University, pages 6-51 *
Also Published As
Publication number | Publication date |
---|---|
CN114209341B (en) | 2023-06-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Khare et al. | PDCNNet: An automatic framework for the detection of Parkinson’s disease using EEG signals | |
Bhattacharyya et al. | A novel multivariate-multiscale approach for computing EEG spectral and temporal complexity for human emotion recognition | |
Khare et al. | SPWVD-CNN for automated detection of schizophrenia patients using EEG signals | |
Wen et al. | Deep convolution neural network and autoencoders-based unsupervised feature learning of EEG signals | |
CN112656427B (en) | Electroencephalogram emotion recognition method based on dimension model | |
CN106886792B (en) | Electroencephalogram emotion recognition method for constructing multi-classifier fusion model based on layering mechanism | |
CN110070105B (en) | Electroencephalogram emotion recognition method and system based on meta-learning example rapid screening | |
CN114224342B (en) | Multichannel electroencephalogram signal emotion recognition method based on space-time fusion feature network | |
CN112244873A (en) | Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network | |
CN114533086B (en) | Motor imagery EEG decoding method based on time-frequency transformation of spatial-domain features | |
CN114305452B (en) | Cross-task cognitive load identification method based on electroencephalogram and field adaptation | |
Hurtado-Rincon et al. | Motor imagery classification using feature relevance analysis: An Emotiv-based BCI system | |
Zhang et al. | Four-classes human emotion recognition via entropy characteristic and random Forest | |
CN110338760B (en) | Schizophrenia three-classification method based on electroencephalogram frequency domain data | |
CN117883082A (en) | Abnormal emotion recognition method, system, equipment and medium | |
Kim et al. | eRAD-Fe: Emotion recognition-assisted deep learning framework | |
Alessandrini et al. | EEG-Based Neurodegenerative Disease Classification using LSTM Neural Networks | |
Wang et al. | EEG-based emotion identification using 1-D deep residual shrinkage network with microstate features | |
Gagliardi et al. | Improving emotion recognition systems by exploiting the spatial information of EEG sensors | |
CN114052734B (en) | Electroencephalogram emotion recognition method based on progressive graph convolution neural network | |
CN114209341A (en) | Emotion activation mode mining method for feature contribution degree differentiation electroencephalogram data reconstruction | |
Chenane et al. | EEG Signal Classification for BCI based on Neural Network | |
Castaño-Candamil et al. | Post-hoc labeling of arbitrary M/EEG recordings for data-efficient evaluation of neural decoding methods | |
Vadivu et al. | An Novel Versatile Inspiring Wavelet Transform and Resilient Direct Neural Network Classification Techniques for Monitoring Brain Activity System Based on EEG Signal | |
Hasan et al. | Emotion prediction through EEG recordings using computational intelligence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||