CN107411737A - Emotion cross-time recognition method based on resting electroencephalogram similarity - Google Patents
Emotion cross-time recognition method based on resting electroencephalogram similarity
- Publication number
- CN107411737A (application CN201710251610.2A / CN201710251610A)
- Authority
- CN
- China
- Prior art keywords
- electroencephalogram
- resting
- emotion
- time
- frequency band
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION; A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing for noise prevention, reduction or removal
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/20—Workers
- A61B2503/22—Motor vehicles operators, e.g. drivers, pilots, captains
Abstract
The invention discloses an emotion cross-time recognition method based on resting electroencephalogram similarity, comprising the following steps: record the user's resting electroencephalogram (EEG) and emotion EEG for n days, and find each user's optimal separable frequency band with a separable-frequency-band adaptive tracking algorithm; calculate the power spectral density of the optimal separable band for each lead of the resting EEG and the emotion EEG of every day, forming a resting EEG feature matrix and an emotion EEG feature matrix; calculate the Euclidean distance between the detection-day resting EEG and the resting EEG of each of the previous n days, select the day corresponding to the minimum Euclidean distance, and use that day's emotion EEG as the training set of a support vector machine to build an emotion detection model, so that cross-time emotion EEG recognition can be carried out accurately and objectively. The invention can effectively solve the bottleneck of low cross-time recognition accuracy in current emotion recognition, push recognition models toward application, and obtain considerable social and economic benefits.
Description
Technical Field
The invention relates to the field of emotion recognition based on electroencephalogram, in particular to an emotion cross-time recognition method based on resting electroencephalogram similarity.
Background
Emotion is a person's overall attitude toward whether objective things satisfy his or her own needs. As a high-level function of the human brain, it ensures the survival and adaptation of the organism and influences learning, memory and decision-making to varying degrees. Emotion plays a ubiquitous role in people's daily work and life. Negative emotions impair physical and mental health and reduce the quality and efficiency of work; in severe cases they can cause psychological disorders (such as depression and autism) or lead to serious work errors. Studies have shown that the long-term accumulation of negative emotions can impair immune system function and make people more susceptible to infection by viruses in their surroundings. It is therefore necessary to detect negative emotions in good time and provide appropriate intervention and regulation, especially for drivers, astronauts and other special workers. On the other hand, in a human-computer interaction system, if the system can capture a person's emotional state, the interaction becomes friendlier, more natural and more efficient. Emotion analysis and recognition has become an important interdisciplinary research topic in neuroscience, psychology, cognitive science, computer science, artificial intelligence and other fields.
With the development of neurophysiology and the rise of brain-imaging technology, the electroencephalogram (EEG) has attracted researchers' attention and been introduced into the field of emotion recognition because of its high temporal resolution and its ability to reflect a person's emotional state objectively and truly, without being subject to deliberate human control. New theoretical methods have improved the accuracy of EEG-based emotion recognition to a certain extent. However, as soon as practical application is approached, the recognition rate drops sharply and can hardly meet application requirements, so building a high-precision emotion recognition model still faces great challenges. One of the difficulties is how to eliminate or reduce the time effect of EEG signals and thereby achieve cross-time emotion EEG recognition. It is well known that hormone levels, the external environment (such as temperature and humidity), diet and sleep all cause differences in physiological signals, so EEG signals recorded at different times differ even in the same emotional state. In practical applications there is inevitably a time interval between building the emotion recognition model and recognizing emotional states, and test data cannot participate in building the model. Building a recognition model on the same day and then immediately putting it into use is impractical.
In conclusion, it is necessary to realize cross-time emotional EEG recognition, yet among existing studies there are few on this topic. In 2001, Picard et al. [1] removed the influence of the time effect on the emotion recognition model by subtracting a calm state from the other emotional states. With this method, however, neutral emotion cannot be recognized and the number of emotion categories is reduced; since neutral emotion is an important index of emotional stability, recognizing the neutral state is essential. In 2012, Chueh, Tung-Hung et al. [2] removed the influence of the time effect with multivariate analysis of variance and improved classifier performance. However, the test-set data were still not independent: the classifier was built by mixing them with data from other times, which is likewise impractical in real applications. The difference between the EEG baselines (resting EEG) at different times is an important reason for the low accuracy of cross-time emotional EEG recognition.
Disclosure of Invention
The invention provides an emotion cross-time recognition method based on resting electroencephalogram similarity, which can effectively solve the bottleneck of low cross-time recognition accuracy in current emotion recognition, push the recognition model toward application, and obtain considerable social and economic benefits, as described in detail below:
an emotion cross-time recognition method based on resting electroencephalogram similarity, comprising the following steps of:
recording the resting electroencephalogram and the emotion electroencephalogram of the user for n days, and finding the optimal separable frequency band of each user through a separable-frequency-band adaptive tracking algorithm;
calculating the power spectral density of the optimal separable frequency band of each lead of the resting electroencephalogram and the emotion electroencephalogram of each day to form a resting electroencephalogram characteristic matrix and an emotion electroencephalogram characteristic matrix;
and respectively calculating Euclidean distances between the resting electroencephalograms of the day and the resting electroencephalograms of the previous n days, selecting the time corresponding to the minimum Euclidean distance, and constructing an emotion detection model by using the emotional electroencephalograms of the day as a training set of a support vector machine, thereby accurately and objectively carrying out cross-time emotion electroencephalogram recognition.
The step of finding the optimal separable frequency band of each user through the separable-frequency-band adaptive tracking algorithm specifically comprises the following steps:
calculating a time-frequency matrix of each lead by using short-time Fourier transform;
calculating a Fisher ratio for measuring energy differences within the same pattern and between different patterns;
after obtaining DW(f), calculating the DFCs by a band-iteration selection method, where the number of iterations equals the number of frequency bands to be obtained.
The step of calculating the power spectral density of the optimal separable frequency band of each lead of the daily resting and emotion electroencephalograms and constructing the resting and emotion EEG feature matrices specifically comprises:
applying the separable-frequency-band adaptive tracking algorithm to the preprocessed signals to find the optimal separable band of each lead of each subject, calculating the power spectral density of that band for the resting EEG and the emotion EEG of each lead respectively, and establishing an initial resting feature matrix and emotion feature matrix.
The beneficial effects of the technical scheme provided by the invention are as follows: the method finds the time point whose resting electroencephalogram is highly similar to the resting electroencephalogram of the test day and uses the emotion electroencephalogram collected at that time point to build the emotion model, thereby realizing cross-time emotional electroencephalogram recognition. The method can effectively solve the current bottleneck of cross-time emotional electroencephalogram recognition and obtain considerable social and economic benefits. The preferred embodiment is intended for patent assignment, technology collaboration or product development.
Drawings
FIG. 1 is a flow chart of a method for emotion cross-time recognition based on resting electroencephalogram similarity;
FIG. 2 is the 60-lead EEG electrode layout map;
FIG. 3 is a flow chart of a separable band adaptive tracking calculation;
FIG. 4 is a flow chart of the DFCs algorithm;
fig. 5 is a scatter diagram of euclidean distance of resting electroencephalogram and cross-time emotion recognition accuracy.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention are described in further detail below.
In order to solve the problems in the background art, the embodiment of the present invention proposes an assumption: the higher the similarity of the resting electroencephalograms at different times, the higher the recognition rate of the emotional electroencephalograms across the time.
The embodiment of the invention provides a cross-time emotional electroencephalogram recognition method based on resting electroencephalogram similarity: find the time point whose resting electroencephalogram is highly similar to that of the test day, and use the emotion electroencephalogram collected at that time point to establish the emotion model, thereby realizing cross-time emotional electroencephalogram recognition. The method overcomes the two problems in the background art: it does not reduce the number of recognized emotion categories, and test-set data do not participate in building the emotion recognition model, which meets the requirements of practical application.
Example 1
The embodiment of the invention provides a cross-time emotion electroencephalogram identification method based on resting electroencephalogram similarity, and with reference to fig. 1, the method comprises the following steps:
101: recording the resting electroencephalogram and the emotion electroencephalogram of the user for n days, and finding the optimal separable frequency band of each user through a separable-frequency-band adaptive tracking algorithm;
102: calculating the power spectral density of the optimal separable frequency band of each lead of the resting electroencephalogram and the emotion electroencephalogram of each day to form a resting electroencephalogram characteristic matrix and an emotion electroencephalogram characteristic matrix;
In the feature-matrix extraction stage, the separable-frequency-band adaptive tracking algorithm is first applied to the preprocessed signals to find the optimal separable band of each lead of each subject; the power spectral densities of that band are then calculated for the resting and emotion electroencephalograms of each lead, and a preliminary resting feature matrix and emotion feature matrix are established.
103: and respectively calculating Euclidean distances between the resting electroencephalograms of the day and the resting electroencephalograms of the previous n days, selecting the time corresponding to the minimum Euclidean distance, and constructing an emotion detection model by using the emotional electroencephalograms of the day as a training set of a support vector machine, thereby accurately and objectively carrying out cross-time emotion electroencephalogram recognition.
The smaller the Euclidean distance, the greater the similarity between the two.
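As a toy illustration of this similarity criterion, the following NumPy sketch compares a detection-day feature vector with two hypothetical previous-day vectors (all values are invented, not real EEG features):

```python
import numpy as np

def euclidean_distance(a, b):
    """Euclidean distance between two feature vectors."""
    return float(np.sqrt(np.sum((np.asarray(a) - np.asarray(b)) ** 2)))

# Toy resting-EEG feature vectors (hypothetical values, not real data):
# "today" is much closer to day 2 than to day 1.
today = np.array([1.0, 2.0, 3.0])
day1 = np.array([4.0, 0.0, 7.0])
day2 = np.array([1.1, 2.1, 2.9])

d1 = euclidean_distance(today, day1)
d2 = euclidean_distance(today, day2)
# The smaller distance indicates the more similar resting state.
most_similar_day = 1 if d1 < d2 else 2
```

Here day 2 wins because its distance to today's vector is smaller, which is exactly the rule used to pick the training day.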
Prior to step 101, the method further comprises:
1. signal acquisition;
In the data acquisition stage, 64-lead electroencephalogram signals of the subjects are acquired in the resting state and in different emotional states (positive, neutral and negative) during different time periods.
2. And (4) preprocessing data.
The acquired 64-lead electroencephalogram signals are preprocessed in four steps: re-referencing to the average of the two mastoid electrodes, down-sampling to 500 Hz, 1-100 Hz band-pass filtering, and removal of ocular interference with the independent component analysis (ICA) algorithm, finally yielding 60-lead preprocessed electroencephalogram signals.
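The re-referencing, down-sampling and band-pass steps can be sketched with NumPy/SciPy on synthetic data; the channel count, filter order and random signals are illustrative assumptions, and the ICA step is left for the artifact-removal discussion:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs_raw = 1000                            # acquisition rate (Hz)
n_ch, n_s = 4, 10 * fs_raw               # 4 hypothetical channels, 10 s
eeg = rng.standard_normal((n_ch, n_s))
mastoids = rng.standard_normal((2, n_s)) # M1/M2 reference channels

# 1) Re-reference to the average of the two mastoid leads.
eeg_ref = eeg - mastoids.mean(axis=0)

# 2) Down-sample 1000 Hz -> 500 Hz (decimate applies an anti-alias filter).
eeg_ds = signal.decimate(eeg_ref, 2, axis=1)
fs = 500

# 3) 1-100 Hz band-pass (4th-order Butterworth, zero-phase filtfilt).
b, a = signal.butter(4, [1, 100], btype="bandpass", fs=fs)
eeg_bp = signal.filtfilt(b, a, eeg_ds, axis=1)
# 4) Ocular-artifact removal with ICA would follow here.
```

Zero-phase filtering (`filtfilt`) is used so the band-pass step does not shift the EEG waveforms in time.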
In summary, through steps 101 to 103, the embodiment finds the time point whose resting electroencephalogram is highly similar to the resting electroencephalogram of the test day and uses the emotion electroencephalogram collected at that time point to establish the emotion model, thereby realizing cross-time emotional electroencephalogram recognition.
Example 2
The scheme of Example 1 is further described below with reference to FIGS. 2-5, the calculation formulas and specific examples:
201: a data acquisition stage;
the electroencephalogram acquisition device adopted in the embodiment of the invention is a 64-lead amplifier and a Scan4.5 acquisition system of Neuroscan company, the electrodes are placed according to a standard 10-20 system specified by the International electroencephalogram Association, and the lead distribution of the 60 conductive electrodes except the electrooculogram and the reference electrode is shown in figure 2. The right mastoid is used as a reference electrode during collection, the center of the top of the forehead of the brain is grounded, the impedance of all the electrodes is kept below 5k ohms, and the sampling frequency is 1000 Hz.
Each user performs n data acquisitions (n ≥ 5) within one month, with the interval between acquisitions randomly set to one day, three days, one week or two weeks. Each subject comes to the laboratory at the same time of day on each experimental day. First, 2 minutes of the user's resting electroencephalogram are acquired; then videos are used to induce the subject's positive, neutral and negative emotional states while the user's 64-lead electroencephalogram signals are recorded.
The specific operation process of the data acquisition stage is well known to those skilled in the art, and is not described in detail in the embodiments of the present invention.
The embodiment of the invention does not limit the number or timing of the data acquisitions; these are set according to the needs of the practical application.
202: preprocessing data;
The acquired 64-lead electroencephalogram signals are preprocessed in four steps: re-referencing to the average of the two mastoids, down-sampling to 500 Hz, 1-100 Hz band-pass filtering, and ICA removal of ocular interference.
1. The reference potential at acquisition time is the right mastoid, which results in low signal amplitudes for the leads over the right hemisphere. The reference is therefore first converted to the M1 and M2 leads located at the mastoids on both sides, which facilitates subsequent data processing.
2. The system's 1000 Hz sampling frequency mainly serves to capture rapid changes in the electroencephalogram. However, 1000 Hz is far above the rate required by the Nyquist theorem, and an excessive sampling rate inflates the data volume and reduces the efficiency of subsequent processing. The acquired data are therefore down-sampled, reducing the sampling frequency of the electroencephalogram signal from 1000 Hz to 500 Hz.
3. The embodiment of the invention applies 1-100 Hz band-pass filtering to remove direct-current drift and high-frequency noise.
4. The collected electroencephalogram inevitably contains interference from the electrooculogram (up-down and left-right eye movements and blinking) and from electromyographic signals. Ocular artifacts, especially blink artifacts, are particularly strong and are largest in the forehead leads. The embodiment of the invention filters out the ocular and myoelectric components mixed into the electroencephalogram with an independent component analysis (ICA) filtering method.
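The idea of ICA artifact removal can be sketched with scikit-learn's FastICA on two synthetic sources; the mixing matrix, the blink waveform and the kurtosis rule for picking the artifact component are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n = 2000
t = np.linspace(0, 8, n)
neural = np.sin(2 * np.pi * 10 * t)            # stand-in 10 Hz neural rhythm
blink = (np.abs(t % 2.0 - 1.0) < 0.05) * 5.0   # periodic blink-like spikes

S = np.c_[neural, blink]                       # true sources, shape (n, 2)
A = np.array([[1.0, 0.8],                      # hypothetical mixing: blinks
              [0.6, 0.2]])                     # leak strongly into lead 1
X = S @ A.T                                    # observed "leads", shape (n, 2)

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)                   # estimated independent components

# Identify the blink component as the one with the largest kurtosis
# (blinks are spiky), zero it, and reconstruct the cleaned leads.
kurt = ((S_est - S_est.mean(0)) ** 4).mean(0) / (S_est.var(0) ** 2)
S_est[:, np.argmax(kurt)] = 0.0
X_clean = ica.inverse_transform(S_est)
```

In practice the artifact components are usually identified from their frontal topography and time course rather than kurtosis alone, but the zero-and-reconstruct step is the same.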
203: a self-adaptive tracking method capable of dividing frequency bands;
Because the optimal frequency band differs between users, the embodiment of the invention finds the optimal band that distinguishes the different emotion categories with an Adaptive Tracking of Discrete Frequency Components (ATDFCs) method, which is very important for accurately extracting features and improving classification accuracy. The calculation procedure of the DFC (Discrete Frequency Components) adaptive tracking method is shown in FIG. 3.
1) The time-frequency matrix of each lead is calculated with a short-time Fourier transform, so that each lead has a discrete time-frequency matrix I_n(f, t);
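This step can be sketched with SciPy's STFT; the 500 Hz rate, the 1 s window and the random signal standing in for a real lead are illustrative assumptions:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs = 500                                 # assumed post-preprocessing rate
x = rng.standard_normal(5 * fs)          # one lead, 5 s of synthetic "EEG"

# Short-time Fourier transform -> discrete time-frequency matrix I(f, t).
f, t, Z = signal.stft(x, fs=fs, nperseg=fs)  # 1 s window => 1 Hz bins
I_ft = np.abs(Z) ** 2                    # time-frequency energy density
```

With a 1 s window the frequency axis has 1 Hz resolution (251 one-sided bins up to 250 Hz), which matches the per-hertz band selection described below.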
2) The Fisher ratio is calculated to measure the energy differences within a class (within the same pattern; in the embodiment of the invention, within the same emotion category, for example all positive samples) and between classes (between different patterns; here, between different emotion categories, for example between the positive and negative categories), as in equations (1), (2) and (3):
S_W(f,t) = Σ_{k=1}^{C} Σ_{i=1}^{n_k} (I_i^(k)(f,t) − m_k(f,t))² (1)
S_B(f,t) = Σ_{k=1}^{C} n_k (m_k(f,t) − m(f,t))² (2)
F_R(f,t) = S_B(f,t) / S_W(f,t) (3)
where S_W(f,t), S_B(f,t), m_k(f,t), m(f,t) and F_R(f,t) are two-dimensional matrices. S_W(f,t) and S_B(f,t) represent the intra-class and inter-class differences respectively, F_R(f,t) is the Fisher ratio, m_k(f,t) is the average time-frequency density of the k-th class, m(f,t) is the average time-frequency density of all classes, C is the number of classes (C = 3 in the embodiment of the invention), and n_k is the number of samples in the k-th class.
3) DW(f) is obtained from the Fisher ratio, aggregated over time as in equation (4):
DW(f) = Σ_{t∈T_t} F_R(f,t) (4)
where DW(f) is the weight coefficient at frequency f, and T_t is the time period over which the STFT is calculated.
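A numeric sketch of the Fisher-ratio weighting in NumPy; the class layout and the contrivance that only frequency bin 2 separates the classes are test assumptions made so the expected answer is known:

```python
import numpy as np

rng = np.random.default_rng(3)
C, n_k = 3, 20                 # 3 emotion classes, 20 samples each
F, T = 8, 6                    # small time-frequency grid for illustration

# Synthetic per-sample time-frequency energy I_i(f, t); class means differ
# only at frequency bin 2, so that bin should get the largest weight.
means = np.zeros((C, F, T))
means[0, 2, :], means[1, 2, :], means[2, 2, :] = 0.0, 2.0, 4.0
samples = means[:, None] + 0.3 * rng.standard_normal((C, n_k, F, T))

m_k = samples.mean(axis=1)                          # class mean densities
m = samples.reshape(C * n_k, F, T).mean(axis=0)     # grand mean density
S_W = ((samples - m_k[:, None]) ** 2).sum(axis=(0, 1))  # intra-class
S_B = (n_k * (m_k - m) ** 2).sum(axis=0)                # inter-class
F_R = S_B / S_W                                         # Fisher ratio
DW = F_R.sum(axis=1)                                # weight per frequency f
best_f = int(np.argmax(DW))
```

Because only bin 2 carries between-class differences, the weight curve DW peaks there, which is the behavior the band-selection steps below rely on.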
After obtaining DW(f), the DFCs are calculated by a band-iteration selection method, where the number of iterations equals the number of frequency bands to be obtained. In a specific implementation, the five steps Step 1 to Step 5 below can be used to calculate the most separable frequency band, as shown in FIG. 4.
Then the weights DW(f) within the most separable band are set to zero, and the second most separable band is calculated. For example, when the most separable band is 9-14 Hz, the DW(f) values at 9, 10, 11, 12 and 13 Hz are set to zero before the second separable band is computed. This process is repeated until the desired number of frequency bands is obtained.
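The zero-out-and-repeat selection can be sketched as a greedy loop over a weight curve DW(f); the bin indices and peak positions below are invented purely for illustration:

```python
import numpy as np

def top_bands(dw, width, n_bands):
    """Greedily pick n_bands windows of `width` bins with maximal total
    weight, zeroing the weights of each chosen band before the next pass."""
    dw = np.asarray(dw, dtype=float).copy()
    bands = []
    for _ in range(n_bands):
        # energy of every sliding window of the given width
        alpha = np.convolve(dw, np.ones(width), mode="valid")
        start = int(np.argmax(alpha))
        bands.append((start, start + width))
        dw[start:start + width] = 0.0   # suppress, then find the next band
    return bands

# Toy weight curve: peaks around bins 10-12 and 30-32 (hypothetical).
dw = np.zeros(50)
dw[10:13] = 5.0
dw[30:33] = 3.0
bands = top_bands(dw, width=3, n_bands=2)
```

The first pass finds the strongest band, zeroing it forces the second pass onto the next-strongest band, mirroring the description above.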
Step 1: the candidate frequency range is set to 1-100 Hz, and the width of the sliding frequency window is varied in 1 Hz steps from 3 to 7 Hz (as shown in FIG. 5). This yields 5 different bandwidth parameters, denoted BW_j (j = 1, 2, 3, 4, 5).
Step 2: as the frequency window moves along the frequency axis of DW(f), the energy distribution α is calculated according to equation (5):
α(F_i, BW_j) = Σ_{f ∈ [F_i − BW_j/2, F_i + BW_j/2]} DW(f) (5)
where F_i is the center frequency of the i-th band as the window shifts along the frequency axis. For example, with a 3 Hz window width, 97 frequency bands can be obtained: 1-4 Hz, 2-5 Hz, 3-6 Hz, 4-7 Hz, …, 97-100 Hz.
Step 3: according to the maximum energy distribution α over all F_i, the optimal center frequency F*_j is selected as in equation (6):
F*_j = argmax_{F_i} α(F_i, BW_j) (6)
One such optimum is obtained for each BW_j; thus every j corresponds to an optimal center frequency F*_j and an optimal energy distribution α*_j = α(F*_j, BW_j).
Step 4: to compare the separability of the different BW_j, the relative change ε_j of α*_j is calculated by equation (7) for j = 2, 3, 4, 5:
ε_j = (α*_j − α*_{j−1}) / α*_{j−1} (7)
where ε_j is the relative change of α*_j.
Step 5: after all ε_j have been computed, a threshold ε_min is set.
Experiments show that for different thresholds (e.g. 10%, 20%, 30%, 40%, …), the smaller the threshold, the closer the algorithm stays to the band with a 3 Hz frequency window. ε_2 is compared with ε_min; if ε_2 is greater than ε_min, ε_3 is compared with ε_min, and so on, until some ε_j is smaller than ε_min. The band at position j − 1 is the most separable frequency band.
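Steps 1-5 can be sketched as a bandwidth sweep with a relative-gain stop rule; the weight curve, the window energies and the 10% threshold are illustrative assumptions chosen so the expected outcome (a 4-bin-wide band) is known:

```python
import numpy as np

def best_alpha(dw, width):
    """Maximal window energy alpha* for one bandwidth."""
    alpha = np.convolve(dw, np.ones(width), mode="valid")
    return float(alpha.max())

# Hypothetical weight curve with one 4-bin-wide informative band.
dw = np.zeros(100)
dw[40:44] = 1.0

widths = [3, 4, 5, 6, 7]                      # BW_j, j = 1..5
alphas = [best_alpha(dw, w) for w in widths]  # alpha*_j per bandwidth

eps_min = 0.10                                # assumed threshold (10 %)
chosen = widths[-1]
for j in range(1, len(widths)):               # j = 2..5 in the text
    eps_j = (alphas[j] - alphas[j - 1]) / alphas[j - 1]
    if eps_j < eps_min:                       # widening gains too little: stop
        chosen = widths[j - 1]
        break
```

Widening from 3 to 4 bins still captures more weight (large ε), but widening further adds nothing, so the sweep stops and returns the 4-bin bandwidth.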
In the embodiment of the invention, the power spectral values (one sample per 5 seconds) of the first and second separable frequency bands of each lead are selected to build each day's resting electroencephalogram feature matrix PR_i (24 × 120) and emotion feature matrix PE_i (N_i × 120), where N_i is the number of emotion samples on day i. With 60 leads and 2 bands per lead, the features have 120 dimensions.
PR_i(24×120) = (R_1, R_2, …, R_120) (8)
PE_i(N_i×120) = (P_1, P_2, …, P_120) (9)
where R_1, R_2, …, R_120 are the first, second, …, 120th dimensions of the resting electroencephalogram feature matrix, and (P_1, P_2, …, P_120) are the first, second, …, 120th dimensions of the emotion feature matrix.
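The 120-dimensional feature construction (60 leads × 2 bands) can be sketched with Welch power-spectral-density estimates; the band limits 9-14 Hz and 18-22 Hz and the synthetic epochs are assumptions for illustration, not the bands the algorithm would actually select:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
fs, n_leads, n_samples = 500, 60, 24           # 24 five-second resting samples
bands = [(9, 14), (18, 22)]                    # two hypothetical optimal bands

def band_powers(epoch):
    """120-dim feature vector: PSD power in 2 bands for each of 60 leads."""
    f, pxx = signal.welch(epoch, fs=fs, nperseg=fs, axis=1)  # pxx: (60, 251)
    feats = []
    for lo, hi in bands:
        mask = (f >= lo) & (f < hi)
        feats.append(pxx[:, mask].mean(axis=1))              # (60,) per band
    return np.concatenate(feats)                             # (120,)

epochs = rng.standard_normal((n_samples, n_leads, 5 * fs))   # synthetic data
PR = np.vstack([band_powers(e) for e in epochs])             # (24, 120)
```

Stacking one such vector per 5 s sample gives exactly the 24 × 120 resting matrix PR_i of equation (8); the emotion matrix PE_i is built the same way from the emotion epochs.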
204: calculating the Euclidean distance between the resting electroencephalograms;
Distance is a measure of the similarity or dissimilarity between observations or variables. The Euclidean distance can be regarded as a measure of signal similarity: the smaller the distance, the more similar the signals.
The Euclidean distance between the resting electroencephalogram feature matrix PR_i (i = 1, 2, …, n) of day i and the resting electroencephalogram feature matrix TPR of the detection day is calculated as
D_i = sqrt( Σ_{h=1}^{120} (x_h^i − y_h)^2 )
where MPR_i is the feature vector obtained by averaging the 24 samples of day i's resting electroencephalogram feature matrix, MTPR is the corresponding averaged feature vector of the detection day, D_i is the Euclidean distance between the resting electroencephalogram of day i and that of the detection day, x_h^i is the h-th feature value of the i-th subject's resting electroencephalogram in the emotion database, y_h is the h-th feature value of the current user's resting electroencephalogram, and h = 1, 2, …, 120.
Then the time point corresponding to the minimum Euclidean distance, i.e. the time point whose resting electroencephalogram is most similar to that of the detection day, is found:
FT = find(min(D_i)), i = 1, 2, …, n  (13)
where FT is the time point whose resting electroencephalogram has the highest similarity to the resting electroencephalogram of the detection day.
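Step 204 and equation (13) can be sketched as follows; the function name is illustrative:

```python
import numpy as np

def most_similar_time_point(daily_rest, test_rest):
    """daily_rest: (n_days, 120) averaged resting features MPR_i, one row
    per day i; test_rest: (120,) averaged resting features MTPR of the
    detection day. Returns FT = argmin_i D_i as in equation (13)."""
    d = np.sqrt(((np.asarray(daily_rest) - np.asarray(test_rest)) ** 2).sum(axis=1))
    return int(np.argmin(d))
```

The returned day index selects which day's emotion electroencephalogram matrix PE_i is used to train the model in step 205.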
205: establishing the emotion detection model.
After the time point whose resting electroencephalogram is most similar to that of the detection day has been found, an emotion detection model is established with a Support Vector Machine (SVM) [3] using the emotion electroencephalogram feature matrix of that day, and the current emotional state of the user is identified.
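A minimal sketch of step 205 with scikit-learn's SVC; the kernel and parameters are assumptions, since the patent does not specify them:

```python
from sklearn.svm import SVC

def build_emotion_model(emotion_features, emotion_labels):
    """emotion_features: the emotion electroencephalogram feature matrix
    PE of day FT (N x 120); emotion_labels: the emotional state of each
    sample. Returns a trained classifier for the user's current emotion."""
    model = SVC(kernel="linear")  # kernel choice is an assumption
    model.fit(emotion_features, emotion_labels)
    return model
```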
In summary, in the embodiment of the present invention, steps 201 to 204 find the time point whose resting electroencephalogram is most similar to that of the test day, and the emotion electroencephalogram acquired at that time point is used to establish the emotion model, thereby realizing cross-time emotion electroencephalogram recognition.
Example 3
The feasibility of the schemes of Embodiments 1 and 2 is verified below with reference to specific experimental data and Fig. 5:
Fig. 5 is a scatter plot of resting electroencephalogram Euclidean distance against cross-time emotion recognition accuracy; each subject's data were normalized to avoid individual differences. The figure shows that the similarity of resting electroencephalograms predicts the cross-time emotion recognition rate: the higher the similarity of the resting electroencephalograms at different times, the higher the cross-time recognition accuracy. Correlation analysis of the two gives p = 0.003, which is statistically significant.
According to the experimental data and the simulation waveforms, the embodiment of the invention can effectively solve the bottleneck of low cross-time recognition accuracy in conventional emotion recognition, and provides technical support for moving emotion recognition from the laboratory to practical application.
The method effectively addresses the current bottleneck of cross-time emotion electroencephalogram recognition and can yield considerable social and economic benefits. The preferred embodiment is intended for patent assignment, technology collaboration or product development.
References
[1] PICARD R W, VYZAS E, HEALEY J. Toward machine emotional intelligence: Analysis of affective physiological state [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001, 23(10): 1175-91.
[2] CHUEH T-H, CHEN T-B, LU H H-S, et al. Statistical Prediction of Emotional States by Physiological Signals with Manova and Machine Learning [J]. International Journal of Pattern Recognition and Artificial Intelligence, 2012, 26(04).
[3] HIDALGO-MUÑOZ A R, LÓPEZ M M, SANTOS I M, et al. Application of SVM-RFE on EEG signals for detecting the most relevant scalp regions linked to affective valence processing [J]. Expert Systems with Applications, 2013, 40(6): 2102-8.
[4] Bai Yanru. A study of individual differences in multi-paradigm evoked EEG for biometric recognition [D]. Tianjin University, 2012.
Those skilled in the art will appreciate that the drawings are only schematic illustrations of preferred embodiments, and the above-described embodiments of the present invention are merely provided for description and do not represent the merits of the embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (3)
1. A cross-time emotion recognition method based on resting electroencephalogram similarity, characterized by comprising the following steps:
recording the resting electroencephalogram and the emotion electroencephalogram of the user for n days, and finding each user's optimal separable frequency band through a separable-frequency-band adaptive tracking algorithm;
calculating the power spectral density of the optimal separable frequency band of each lead of each day's resting electroencephalogram and emotion electroencephalogram to form a resting electroencephalogram feature matrix and an emotion electroencephalogram feature matrix;
calculating the Euclidean distances between the resting electroencephalogram of the current day and the resting electroencephalograms of the previous n days, selecting the time corresponding to the minimum Euclidean distance, and using that day's emotion electroencephalogram as the training set of a support vector machine to construct an emotion detection model, thereby performing cross-time emotion electroencephalogram recognition accurately and objectively.
2. The cross-time emotion recognition method based on resting electroencephalogram similarity as claimed in claim 1, wherein the step of finding each user's optimal separable frequency band through the separable-frequency-band adaptive tracking algorithm specifically comprises:
calculating a time-frequency matrix of each lead by using short-time Fourier transform;
calculating a Fisher ratio for measuring energy differences within the same pattern and between different patterns;
after obtaining DW (f), calculating DFC by a waveband iteration selection method; the number of iterations is equal to the number of frequency segments to be obtained.
3. The cross-time emotion recognition method based on resting electroencephalogram similarity as claimed in claim 1, wherein the step of calculating the power spectral density of the optimal separable frequency band of each lead of each day's resting electroencephalogram and emotion electroencephalogram to form a resting electroencephalogram feature matrix and an emotion electroencephalogram feature matrix specifically comprises:
finding the optimal separable frequency band of each lead of each subject's preprocessed signal with the separable-frequency-band adaptive tracking algorithm, calculating the power spectral density of the optimal separable frequency band of each lead of the resting electroencephalogram and the emotion electroencephalogram respectively, and establishing a preliminary resting feature matrix and emotion feature matrix.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710251610.2A CN107411737A (en) | 2017-04-18 | 2017-04-18 | A kind of across the time recognition methods of mood based on resting electroencephalogramidentification similitude |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107411737A true CN107411737A (en) | 2017-12-01 |
Family
ID=60424193
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710251610.2A Pending CN107411737A (en) | 2017-04-18 | 2017-04-18 | A kind of across the time recognition methods of mood based on resting electroencephalogramidentification similitude |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107411737A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020010785A1 (en) * | 2018-07-13 | 2020-01-16 | 华中师范大学 | Classroom teaching cognitive load measuring system |
WO2024042732A1 (en) * | 2022-08-24 | 2024-02-29 | 日本電信電話株式会社 | Perception visualization device, perception visualization method, and perception visualization program |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070173733A1 (en) * | 2005-09-12 | 2007-07-26 | Emotiv Systems Pty Ltd | Detection of and Interaction Using Mental States |
CN102715903A (en) * | 2012-07-09 | 2012-10-10 | 天津市人民医院 | Method for extracting electroencephalogram characteristic based on quantitative electroencephalogram |
CN102722728A (en) * | 2012-06-11 | 2012-10-10 | 杭州电子科技大学 | Motion image electroencephalogram classification method based on channel weighting supporting vector |
CN106108894A (en) * | 2016-07-18 | 2016-11-16 | 天津大学 | A kind of emotion electroencephalogramrecognition recognition method improving Emotion identification model time robustness |
CN106559625A (en) * | 2016-11-24 | 2017-04-05 | 天津大学 | Based on EEG to three-dimensional video-frequency difference parallax position captions Comfort Evaluation method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106108894A (en) | A kind of emotion electroencephalogramrecognition recognition method improving Emotion identification model time robustness | |
CN107361766B (en) | Emotion electroencephalogram signal identification method based on EMD domain multi-dimensional information | |
Abo-Zahhad et al. | A new EEG acquisition protocol for biometric identification using eye blinking signals | |
Yang et al. | Subject-based feature extraction by using fisher WPD-CSP in brain–computer interfaces | |
Aboalayon et al. | Efficient sleep stage classification based on EEG signals | |
Ong et al. | Power spectral density analysis for human EEG-based biometric identification | |
US20130096453A1 (en) | Brain-computer interface devices and methods for precise control | |
Dornhege et al. | Optimizing spatio-temporal filters for improving brain-computer interfacing | |
Tong et al. | Emotion recognition based on photoplethysmogram and electroencephalogram | |
CN101828921A (en) | Identity identification method based on visual evoked potential (VEP) | |
CN110135285B (en) | Electroencephalogram resting state identity authentication method and device using single-lead equipment | |
Mousa et al. | A novel brain computer interface based on principle component analysis | |
Ramos-Aguilar et al. | Analysis of EEG signal processing techniques based on spectrograms | |
CN109009098B (en) | Electroencephalogram signal feature identification method under motor imagery state | |
Hindarto et al. | Feature extraction of electroencephalography signals using fast fourier transform | |
Shioji et al. | Personal authentication based on wrist EMG analysis by a convolutional neural network | |
Aydemir | Odor and Subject Identification Using Electroencephalography Reaction to Olfactory. | |
CN107411737A (en) | A kind of across the time recognition methods of mood based on resting electroencephalogramidentification similitude | |
Al‐Canaan et al. | BCI‐control and monitoring system for smart home automation using wavelet classifiers | |
Akhanda et al. | Detection of cognitive state for brain-computer interfaces | |
Awang et al. | Analysis of EEG signals by eigenvector methods | |
Huang et al. | Automatic artifact removal in EEG using independent component analysis and one-class classification strategy | |
Baziyad et al. | A study and performance analysis of three paradigms of wavelet coefficients combinations in three-class motor imagery based BCI | |
Neubig et al. | Recognition of imagined speech using electroencephalogram signals | |
Nawas et al. | K-NN classification of brain dominance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20171201 |