CN102512160B - Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands - Google Patents

Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands Download PDF

Info

Publication number
CN102512160B
CN102512160B (application CN201110425021A)
Authority
CN
China
Prior art keywords
frequency
time
emotional state
emotional
pretreatment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN 201110425021
Other languages
Chinese (zh)
Other versions
CN102512160A (en)
Inventor
綦宏志
曾红梅
许敏鹏
张迪
明东
万柏坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN 201110425021 priority Critical patent/CN102512160B/en
Publication of CN102512160A publication Critical patent/CN102512160A/en
Application granted granted Critical
Publication of CN102512160B publication Critical patent/CN102512160B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention belongs to emotional state recognition technology and provides an electroencephalogram (EEG) emotional state feature extraction method based on adaptive tracking in different frequency bands, which offers a more objective way to recognize emotional states and a more objective evaluation method for assessing the treatment of mental diseases. The technical scheme is as follows: the method comprises the steps of data acquisition and preprocessing, time-frequency feature extraction, adaptive frequency band selection, and emotional state recognition. In the data acquisition and preprocessing step, emotional pictures are used to induce emotion in the subject, and the acquired raw EEG signals are preprocessed by changing the reference potential, downsampling, band-pass filtering, and removing the electrooculogram. The preprocessed EEG signals are transformed into the time-frequency domain with the short-time Fourier transform. Adaptive frequency band selection is realized by a discriminative frequency band adaptive tracking method, and an SVM (support vector machine) is used to classify the selected feature bands. The method is mainly applied to recognizing emotional states.

Description

Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands
Technical field
The invention belongs to emotional state recognition technology and relates to an electroencephalogram (EEG) emotional state feature extraction method based on adaptive tracking in different frequency bands.
Background technology
In 1872, Darwin pointed out in The Expression of the Emotions in Man and Animals that emotion is an adaptive instrument of advanced stages of the phylogenetic scale, and experimental and theoretical research on emotion has been carried out ever since. After more than 100 years, emotion research flourished in the late 20th century and became combined with research on cognition, neuroscience, brain science and so on; its research means are also varied, including electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and functional near-infrared imaging (fNIRI).
With the development of society, emotional disturbances are becoming more frequent and more serious among people of every age and every field, and the incidence of emotion-related diseases such as depression, mania, anxiety, obsessive-compulsive disorder, and affective disorders keeps rising. According to World Health Organization statistics in 2010, the global incidence of depression was about 11%, there were roughly 340 million patients with depression worldwide, and the number is still growing. Depression has become the fourth-largest disease in the world; by 2020 it is expected to become the second-largest, second only to heart disease, and a major killer of humankind in the 21st century. This seriously affects people's quality of life and at the same time provides a powerful driving force for the development of emotion research. Emotion research has developed broadly abroad: an international society for emotion research was established in 1985, and the field has come to cover many aspects of human society and mental life, from basic theory, methodology, and content, and from simple emotion categories and their effects, to the relation between emotion and cognition and to complex socialized emotions. Since its release, the International Affective Picture System (IAPS) has been widely used in research on emotional problems, such as the physiological mechanisms of emotion, emotion regulation, and the relation between emotion and cognitive activities such as attention and memory. At present, many research groups at home and abroad are carrying out research on visually induced emotion based on IAPS pictures and have obtained many results; according to the signal acquisition method, these results can be divided into emotional-experience self-report, brain imaging, biofeedback, and combinations of the above methods.
Most prior-art methods, however, suffer from deficiencies such as technical complexity and poor recognition accuracy.
Summary of the invention
The present invention aims to overcome the deficiencies of the prior art by proposing a more objective emotional state recognition method, which can also provide a more objective evaluation method for assessing the treatment of mental illnesses such as depression. To achieve this, the technical scheme adopted by the present invention is an electroencephalogram (EEG) emotional state feature extraction method based on adaptive tracking in different frequency bands, comprising the following steps: data acquisition and preprocessing, time-frequency feature extraction, adaptive frequency band selection, and emotional state recognition;
The data acquisition and preprocessing step comprises: selecting emotional pictures whose valence range is divided into eight grades, where the higher the grade, the more positive the induced emotion and the lower the grade, the more negative the induced emotion; using the emotional pictures to induce emotion in the subject while recording the subject's EEG signals; and preprocessing the acquired raw EEG signals in four steps: changing the reference potential, downsampling, band-pass filtering, and removing the electrooculogram (EOG);
Time-frequency feature extraction uses the short-time Fourier transform to perform a time-frequency transformation on the preprocessed EEG signals;
Adaptive frequency band selection and emotional state recognition use the discriminative frequency band adaptive tracking method to analyze each channel of each subject separately and obtain the most discriminative frequency band of each channel; an SVM is then used to classify the extracted features and thereby recognize the emotional state. SVM is the abbreviation of Support Vector Machine.
The time-frequency transformation of the preprocessed EEG signals with the short-time Fourier transform is carried out as follows:
1. The signal x(t) is first observed through an observation window W(t) of finite width, and the windowed signal is then Fourier-transformed:
STFT(t, \omega) = \int_{-\infty}^{+\infty} x(\tau)\, W^{*}(\tau - t)\, e^{-j\omega\tau}\, d\tau \qquad (1)
Here ω is the angular frequency and W*(τ − t) is the complex conjugate of W(τ − t). Translating the finite-length observation window along the time axis yields the time-varying spectral distribution of the signal on the two-dimensional time-frequency plane, giving the two-dimensional time-frequency matrix I_n(f, t) of the EEG signal;
2. The Fisher ratio is calculated to measure the difference between the within-class energy (same emotion grade) and the between-class energy (different emotion grades):
S_W(f, t) = \sum_{k=1}^{C} \sum_{n=1}^{n_k} \left( I_n(f, t) - m_k(f, t) \right) \left( I_n(f, t) - m_k(f, t) \right)^{T} \qquad (2)

S_B(f, t) = \sum_{k=1}^{C} n_k \left( m(f, t) - m_k(f, t) \right) \left( m(f, t) - m_k(f, t) \right)^{T} \qquad (3)

F_R(f, t) = \frac{S_B(f, t)}{S_W(f, t)} \qquad (4)
Here S_W, S_B, m_k, m and F_R are two-dimensional matrices; S_W(f, t) and S_B(f, t) represent the within-class and between-class differences respectively; m_k(f, t) is the mean time-frequency value of class k; m(f, t) is the mean time-frequency value over all classes; C is the number of classes; n_k is the number of samples in class k; and T denotes transposition;
3. DW(f) is obtained from the Fisher ratio:
DW(f) = \sum_{t} F_R(f, t) \qquad (5)
where the sum runs over the time segments t of the STFT computation;
4. After DW(f) is obtained, the DFC (Discriminative Frequency Components) are calculated by an iterative band selection method, where the number of iterations equals the number of bands to be obtained. The most discriminative frequency band is computed with Steps 1 to 5 below; the weights DW(f) within that band are then set to zero, the second most discriminative band is computed, and this process is repeated until the required number of bands is obtained:
Step 1: the frequency range to be searched is set to 1-45 Hz, and the width of the sliding frequency window is varied from 3 Hz to 7 Hz in steps of 1 Hz, giving 5 different bandwidth parameters denoted BW_h, h = 1, 2, 3, 4, 5;
Step 2: as the frequency window moves along the frequency axis of DW(f), the energy distribution α is calculated according to formula (6):
\alpha(F_i, BW_h) = \sum_{f = F_i - BW_h/2}^{F_i + BW_h/2} DW(f) \qquad (6)
where F_i is the center frequency of the i-th band as the frequency window moves along the frequency axis;
Step 3: according to the maximum of the energy distribution α, the best center frequency F_h^{opt} is selected among all F_i:

F_h^{opt} = \arg\max_{F_i} \alpha(F_i, BW_h) \qquad (7)

One F_h^{opt} is obtained for each BW_h; therefore every h corresponds to a best center frequency F_h^{opt} and an optimal energy distribution \alpha_h^{opt} = \alpha(F_h^{opt}, BW_h);
Step 4: to compare the discriminative capability of each BW_h, the relative change of \alpha_h^{opt} is computed as \delta_h using the following formula:

\delta_h = \frac{\alpha_h^{opt} - \alpha_{h-1}^{opt}}{\alpha_h^{opt}} \times 100\% \qquad (8)
Step 5: after δ_h has been calculated, a threshold δ_min is set, and the same threshold is used for every subject. δ_2 is compared with δ_min; if δ_2 is greater than δ_min, δ_3 is compared with δ_min, and so on, until some δ_h smaller than δ_min is found; the band at position h − 1 is then the most discriminative frequency band.
The present invention has the following technical effect:
The invention uses the short-time Fourier transform to extract time-frequency features and the discriminative frequency band adaptive tracking method to perform adaptive frequency band selection. This can effectively improve the accuracy of emotional state recognition and can therefore provide a more objective evaluation method for assessing the treatment of mental illnesses.
Description of drawings
Fig. 1 is the flow chart of the technique of the present invention.
Fig. 2 shows the DFC calculation procedure.
The specific embodiment
Fig. 1 is the flow chart of the technique of the present invention. It comprises four main parts: data acquisition and preprocessing, time-frequency feature extraction, adaptive frequency band selection, and emotional state recognition.
The IAPS (International Affective Picture System) used in the present invention consists of a large number of colour pictures that have an emotion-arousing effect, are internationally accessible, and cover a wide semantic range, each scored on three dimensions. It is a standardized emotion-eliciting picture system developed over several years by the Center for Emotion and Attention of the U.S. National Institute of Mental Health (NIMH). The present invention selects from the IAPS pictures whose valence range is divided into eight grades: the higher the grade, the more positive the induced emotion; the lower the grade, the more negative the induced emotion. These pictures are used to induce emotion in the subject while the subject's EEG signals are recorded. The acquired raw EEG signals are preprocessed in four steps: changing the reference potential, downsampling, band-pass filtering, and removing the electrooculogram (EOG).
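As an illustration of this preprocessing chain, a minimal sketch in Python (using NumPy and SciPy) is given below. The function name, the sampling rates, the filter order, and the regression-based EOG removal are assumptions made for the sketch; the specification does not disclose these parameters or the exact artifact-removal algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, decimate

def preprocess_eeg(raw, fs_in=1000, fs_out=250, band=(1.0, 45.0), eog=None):
    """Illustrative preprocessing: re-reference, downsample, band-pass filter, EOG removal.

    raw : array (n_channels, n_samples) of the original EEG recording
    eog : optional array (n_eog_channels, n_samples) used for regression-based
          ocular-artifact removal (an assumed method; the specification does not
          name the removal algorithm).
    """
    # 1. Change the reference potential (here: common average reference).
    x = raw - raw.mean(axis=0, keepdims=True)

    # 2. Downsample (decimate applies an anti-aliasing filter internally).
    factor = fs_in // fs_out
    x = decimate(x, factor, axis=1, zero_phase=True)
    if eog is not None:
        eog = decimate(eog, factor, axis=1, zero_phase=True)

    # 3. Band-pass filter; 1-45 Hz matches the range later searched by the DFC procedure.
    b, a = butter(4, [band[0] / (fs_out / 2), band[1] / (fs_out / 2)], btype="band")
    x = filtfilt(b, a, x, axis=1)

    # 4. Remove the eye (EOG) component by least-squares regression on the EOG channels.
    if eog is not None:
        eog = filtfilt(b, a, eog, axis=1)
        coef, _, _, _ = np.linalg.lstsq(eog.T, x.T, rcond=None)
        x = x - (eog.T @ coef).T
    return x
```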
The short-time Fourier transform is then used to transform the preprocessed EEG signals into the time-frequency domain. The STFT first observes the signal x(t) through an observation window W(t) of finite width and then Fourier-transforms the windowed signal:
STFT(t, \omega) = \int_{-\infty}^{+\infty} x(\tau)\, W^{*}(\tau - t)\, e^{-j\omega\tau}\, d\tau \qquad (1)
Here ω is the angular frequency and W*(τ − t) is the complex conjugate of W(τ − t).
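For each trial and channel, the two-dimensional time-frequency matrix I_n(f, t) introduced in the next paragraph can, for example, be obtained with the STFT routine of SciPy. The window type, window length, overlap, and the use of the squared magnitude as the time-frequency energy are assumptions made for this sketch rather than values fixed by the specification.

```python
import numpy as np
from scipy.signal import stft

def time_frequency_matrix(trial, fs=250, win_sec=1.0):
    """Return |STFT|^2 of a single-channel trial as a (frequency, time) matrix I_n(f, t)."""
    nperseg = int(win_sec * fs)                      # window length (assumed: 1 s)
    f, t, Z = stft(trial, fs=fs, window="hann",      # Hann window is an assumption
                   nperseg=nperseg, noverlap=nperseg // 2)
    return f, t, np.abs(Z) ** 2                      # time-frequency energy
```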
When the finite-length observation window is translated along the time axis, the time-varying spectral distribution of the signal is obtained on the two-dimensional time-frequency plane, yielding the two-dimensional time-frequency matrix I_n(f, t) of the EEG signal. The Fisher ratio F_R is then calculated to measure the energy difference within classes and between classes:
S_W(f, t) = \sum_{k=1}^{C} \sum_{n=1}^{n_k} \left( I_n(f, t) - m_k(f, t) \right) \left( I_n(f, t) - m_k(f, t) \right)^{T} \qquad (2)

S_B(f, t) = \sum_{k=1}^{C} n_k \left( m(f, t) - m_k(f, t) \right) \left( m(f, t) - m_k(f, t) \right)^{T} \qquad (3)

F_R(f, t) = \frac{S_B(f, t)}{S_W(f, t)} \qquad (4)
Here S_W, S_B, m_k, m and F_R are two-dimensional matrices; S_W(f, t) and S_B(f, t) represent the within-class and between-class differences respectively; m_k(f, t) is the mean time-frequency value of class k; m(f, t) is the mean time-frequency value over all classes; C is the number of classes; n_k is the number of samples in class k; and T denotes transposition.
From the Fisher ratio one obtains
DW(f) = \sum_{t} F_R(f, t) \qquad (5)
where the sum runs over the time segments t of the STFT computation.
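A sketch of equations (2)-(5) for one channel is shown below. It assumes that the time-frequency matrices of all trials have been stacked into an array of shape (n_trials, n_freqs, n_times) with one emotion-grade label per trial, and it evaluates the within-class and between-class differences element-wise on the time-frequency plane; the small regularizing constant in the ratio is an added safeguard against division by zero, not part of the specification.

```python
import numpy as np

def discriminative_weights(I, labels):
    """Compute the Fisher ratio F_R(f, t) (eqs. 2-4) and the weights DW(f) (eq. 5).

    I      : array (n_trials, n_freqs, n_times) of time-frequency energies I_n(f, t)
    labels : array (n_trials,) of emotion-grade labels
    """
    classes = np.unique(labels)
    m = I.mean(axis=0)                            # mean over all classes, m(f, t)
    S_W = np.zeros(I.shape[1:])
    S_B = np.zeros(I.shape[1:])
    for k in classes:
        I_k = I[labels == k]
        m_k = I_k.mean(axis=0)                    # class mean, m_k(f, t)
        S_W += ((I_k - m_k) ** 2).sum(axis=0)     # within-class difference (eq. 2)
        S_B += I_k.shape[0] * (m - m_k) ** 2      # between-class difference (eq. 3)
    F_R = S_B / (S_W + 1e-12)                     # Fisher ratio (eq. 4)
    DW = F_R.sum(axis=1)                          # sum over time segments (eq. 5)
    return F_R, DW
```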
Because the frequency band with the most prominent spectral features is not necessarily the most discriminative one, and different subjects may have different optimally discriminative bands, the present invention uses the discriminative frequency band adaptive tracking method to analyze the frequency bands of each subject separately. Discriminative Frequency Component (DFC) tracking is very important for extracting accurate features and improving the classification accuracy. The main procedure of DFC is shown in Fig. 2.
Step 1: the frequency range to be searched is 1-45 Hz, and the width of the sliding frequency window is varied from 3 Hz to 7 Hz in steps of 1 Hz; 5 different bandwidth parameters are therefore obtained and denoted BW_h (h = 1, 2, 3, 4, 5).
Step 2: as the frequency window moves along the frequency axis of DW(f), the energy distribution α is calculated according to formula (6):
\alpha(F_i, BW_h) = \sum_{f = F_i - BW_h/2}^{F_i + BW_h/2} DW(f) \qquad (6)
where F_i is the center frequency of the i-th band as the frequency window moves along the frequency axis.
Step 3: according to the maximum of the energy distribution α, the best center frequency F_h^{opt} is selected among all F_i:

F_h^{opt} = \arg\max_{F_i} \alpha(F_i, BW_h) \qquad (7)

One F_h^{opt} is obtained for each BW_h; therefore every h corresponds to a best center frequency F_h^{opt} and an optimal energy distribution \alpha_h^{opt} = \alpha(F_h^{opt}, BW_h).
Step 4: to compare the discriminative capability of each BW_h, the relative change of \alpha_h^{opt} is computed as \delta_h using the following formula:

\delta_h = \frac{\alpha_h^{opt} - \alpha_{h-1}^{opt}}{\alpha_h^{opt}} \times 100\% \qquad (8)
Step 5: after δ_h has been calculated, a threshold δ_min is set, and the same threshold is used for every subject. Experiments showed that the smaller the threshold (for example 10%, 20%, 30%, 40%, ...), the more the algorithm tends to select the band whose frequency window is 3 Hz. δ_2 is compared with δ_min; if δ_2 is greater than δ_min, δ_3 is compared with δ_min, and so on, until some δ_h smaller than δ_min is found; the band at position h − 1 is then the most discriminative frequency band. A sketch of the complete band-selection procedure (Steps 1-5) is given below.
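The following sketch combines Steps 1-5 into a single band-selection routine operating on DW(f) sampled over 1-45 Hz. The threshold value, the handling of window edges, the fallback when no δ_h drops below the threshold, and the return format are assumptions made for the sketch.

```python
import numpy as np

def select_discriminative_band(DW, freqs, bandwidths=(3, 4, 5, 6, 7), delta_min=0.20):
    """Steps 1-5 of the DFC procedure: pick the most discriminative band from DW(f).

    DW        : array of discriminative weights, one value per frequency in `freqs`
    freqs     : array of frequencies (assumed: 1-45 Hz at 1 Hz resolution)
    delta_min : threshold on the relative change delta_h of eq. (8); the same value
                is used for every subject (0.20 is only an illustrative choice)
    """
    alpha_opt, F_opt = [], []
    for BW in bandwidths:                            # Step 1: candidate window widths
        best_alpha, best_F = -np.inf, None
        for F_i in freqs:                            # Step 2: slide the window along DW(f)
            mask = (freqs >= F_i - BW / 2) & (freqs <= F_i + BW / 2)
            alpha = DW[mask].sum()                   # energy distribution, eq. (6)
            if alpha > best_alpha:                   # Step 3: best center frequency, eq. (7)
                best_alpha, best_F = alpha, F_i
        alpha_opt.append(best_alpha)
        F_opt.append(best_F)

    # Steps 4-5: relative change delta_h (eq. 8) compared against the threshold.
    for h in range(1, len(bandwidths)):
        delta_h = (alpha_opt[h] - alpha_opt[h - 1]) / alpha_opt[h]
        if delta_h < delta_min:
            BW, Fc = bandwidths[h - 1], F_opt[h - 1]   # position h-1 is the chosen band
            return Fc - BW / 2, Fc + BW / 2
    BW, Fc = bandwidths[-1], F_opt[-1]               # fallback: widest window (assumption)
    return Fc - BW / 2, Fc + BW / 2
```

To obtain several bands, the weights DW(f) inside the band returned by this routine are set to zero and the routine is applied again, as described in the summary of the invention.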
An SVM (Support Vector Machine) is used to classify the selected feature bands; the emotional state recognition rate obtained is up to 82.8%, higher than the accuracy obtained without adaptive band tracking (79.3%).
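As an illustration, the classification stage can be realized with the SVM implementation of scikit-learn. The band-power feature, the RBF kernel, the regularization constant, and the five-fold cross-validation below are assumptions; the specification only states that an SVM classifies the features of the selected bands.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def band_power_features(I, freqs, band):
    """Mean time-frequency energy inside the selected band: one feature per trial (single channel)."""
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return I[:, mask, :].mean(axis=(1, 2))[:, None]

def classify_emotional_state(I, freqs, labels, band):
    """Cross-validated SVM classification of the band-power features."""
    X = band_power_features(I, freqs, band)          # features from the DFC-selected band
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    return cross_val_score(clf, X, labels, cv=5).mean()
```

In a full implementation the features from all channels (each with its own selected band) would presumably be combined into one feature vector per trial; the single-channel version above is kept minimal for clarity.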
The present invention proposes a new emotional state feature extraction method that identifies a person's emotional state by means of time-frequency analysis. It can thereby provide a more objective evaluation method for assessing the treatment of mental illnesses, or provide information and feedback for emotion regulation. The invention can effectively improve the accuracy of emotional state recognition and can yield considerable social and economic benefits. The preferred mode of exploitation is patent transfer, patent licensing, technical cooperation, or product development.

Claims (1)

1. An electroencephalogram (EEG) emotional state feature extraction method based on adaptive tracking in different frequency bands, characterized in that it comprises the following steps: data acquisition and preprocessing, time-frequency feature extraction, adaptive frequency band selection, and emotional state recognition;
the data acquisition and preprocessing step comprises: selecting emotional pictures whose valence range is divided into eight grades, where the higher the grade, the more positive the induced emotion and the lower the grade, the more negative the induced emotion; using the emotional pictures to induce emotion in the subject while recording the subject's EEG signals; and preprocessing the acquired raw EEG signals in four steps: changing the reference potential, downsampling, band-pass filtering, and removing the electrooculogram (EOG);
the time-frequency feature extraction uses the short-time Fourier transform to perform a time-frequency transformation on the preprocessed EEG signals;
the adaptive frequency band selection and emotional state recognition use the discriminative frequency band adaptive tracking method to analyze each channel of each subject separately and obtain the most discriminative frequency band of each channel; an SVM is then used to classify the extracted features and thereby recognize the emotional state; SVM is the abbreviation of Support Vector Machine;
the time-frequency transformation of the preprocessed EEG signals with the short-time Fourier transform is carried out as follows:
1. The signal x(t) is first observed through an observation window W(t) of finite width, and the windowed signal is then Fourier-transformed:
STFT(t, \omega) = \int_{-\infty}^{+\infty} x(\tau)\, W^{*}(\tau - t)\, e^{-j\omega\tau}\, d\tau \qquad (1)
Here ω is the angular frequency, t denotes the time segment of the STFT computation, and W*(τ − t) is the complex conjugate of W(τ − t); translating the finite-length observation window along the time axis yields the time-varying spectral distribution of the signal on the two-dimensional time-frequency plane, giving the two-dimensional time-frequency matrix I_n(f, t) of the EEG signal;
2. The Fisher ratio is calculated to measure the difference between the within-class energy (same emotion grade) and the between-class energy (different emotion grades):
S_W(f, t) = \sum_{k=1}^{C} \sum_{n=1}^{n_k} \left( I_n(f, t) - m_k(f, t) \right) \left( I_n(f, t) - m_k(f, t) \right)^{T} \qquad (2)

S_B(f, t) = \sum_{k=1}^{C} n_k \left( m(f, t) - m_k(f, t) \right) \left( m(f, t) - m_k(f, t) \right)^{T} \qquad (3)

F_R(f, t) = \frac{S_B(f, t)}{S_W(f, t)} \qquad (4)
Here S_W, S_B, m_k, m and F_R are two-dimensional matrices; S_W(f, t) and S_B(f, t) represent the within-class and between-class differences respectively; m_k(f, t) is the mean time-frequency value of class k; m(f, t) is the mean time-frequency value over all classes; C is the number of classes; n_k is the number of samples in class k; and T denotes transposition;
3. DW(f) is obtained from the Fisher ratio:
DW(f) = \sum_{t} F_R(f, t);
4. After DW(f) is obtained, the DFC are calculated by an iterative band selection method, where the number of iterations equals the number of bands to be obtained;
DFC is the abbreviation of Discriminative Frequency Components; the most discriminative frequency band is computed with Steps 1 to 5 below, the weights DW(f) within that band are then set to zero, the second most discriminative band is computed, and this process is repeated until the required number of bands is obtained:
Step 1: the frequency range to be searched is set to 1-45 Hz, and the width of the sliding frequency window is varied from 3 Hz to 7 Hz in steps of 1 Hz, giving 5 different bandwidth parameters denoted BW_h, h = 1, 2, 3, 4, 5;
Step 2: as the frequency window moves along the frequency axis of DW(f), the energy distribution α is calculated according to formula (6):
\alpha(F_i, BW_h) = \sum_{f = F_i - BW_h/2}^{F_i + BW_h/2} DW(f) \qquad (6)
where F_i is the center frequency of the i-th band as the frequency window moves along the frequency axis;
Step 3: according to the maximum of the energy distribution α, the best center frequency F_h^{opt} is selected among all F_i:

F_h^{opt} = \arg\max_{F_i} \alpha(F_i, BW_h) \qquad (7)

One F_h^{opt} is obtained for each BW_h; therefore every h corresponds to a best center frequency F_h^{opt} and an optimal energy distribution \alpha_h^{opt} = \alpha(F_h^{opt}, BW_h);
Step 4: to compare the discriminative capability of each BW_h, the relative change of \alpha_h^{opt} is computed as \delta_h using the following formula:

\delta_h = \frac{\alpha_h^{opt} - \alpha_{h-1}^{opt}}{\alpha_h^{opt}} \times 100\% \qquad (8)
Step 5: after δ_h has been calculated, a threshold δ_min is set, and the same threshold is used for every subject. δ_2 is compared with δ_min; if δ_2 is greater than δ_min, δ_3 is compared with δ_min, and so on, until some δ_h smaller than δ_min is found; the band at position h − 1 is then the most discriminative frequency band.
CN 201110425021 2011-12-16 2011-12-16 Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands Active CN102512160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110425021 CN102512160B (en) 2011-12-16 2011-12-16 Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110425021 CN102512160B (en) 2011-12-16 2011-12-16 Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands

Publications (2)

Publication Number Publication Date
CN102512160A CN102512160A (en) 2012-06-27
CN102512160B true CN102512160B (en) 2013-05-15

Family

ID=46283459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110425021 Active CN102512160B (en) 2011-12-16 2011-12-16 Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands

Country Status (1)

Country Link
CN (1) CN102512160B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106175752A (en) * 2015-04-30 2016-12-07 深圳市前海览岳科技有限公司 EEG signal acquisition apparatus and method, and state assessment system and method

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102715903B (en) * 2012-07-09 2014-04-16 天津市人民医院 Method for extracting electroencephalogram characteristic based on quantitative electroencephalogram
CN103263274B (en) * 2013-05-24 2014-12-17 桂林电子科技大学 Expression display device based on FNIRI and ERP
CN103654799B (en) * 2013-12-13 2016-08-17 Tcl集团股份有限公司 Infant emotion detection method and device based on brain waves
CN105701439A (en) * 2014-12-11 2016-06-22 赵化宾 Device and method for recognizing emotion, feeling and physiological need by adopting EEG, EMG and ECG signals
CN106175799A (en) * 2015-04-30 2016-12-07 深圳市前海览岳科技有限公司 Method and system for assessing human emotion and fatigue state based on brain waves
CN107085464B (en) * 2016-09-13 2019-11-26 天津大学 Emotion recognition method based on the P300 character-spelling task
CN107007290B (en) * 2017-03-27 2019-09-20 广州视源电子科技股份有限公司 Brain electricity allowance recognition method and device based on time domain and phase space
CN107157476B (en) * 2017-05-22 2018-04-10 西安科技大学 Miner anxiety-level recognition method for an intelligent mining helmet
WO2020096621A1 (en) * 2018-11-09 2020-05-14 Hewlett-Packard Development Company, L.P. Classification of subject-independent emotion factors
CN113143272A (en) * 2021-03-15 2021-07-23 华南理工大学 Shape programmable system for assisting emotional expression of autistic patient
CN113778228A (en) * 2021-09-10 2021-12-10 哈尔滨工业大学(深圳) Brain-computer interface system based on multifunctional emotion recognition and self-adaptive adjustment
CN114391846B (en) * 2022-01-21 2023-12-01 中山大学 Emotion recognition method and system based on filtering type feature selection

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070135728A1 (en) * 2005-12-01 2007-06-14 Lexicor Medical Technology, Llc Systems and Methods for Analyzing and Assessing Depression and Other Mood Disorders Using Electroencephalographic (EEG) Measurements
CN101690659A (en) * 2009-09-29 2010-04-07 华东理工大学 Brain wave analysis method
CN101853070A (en) * 2010-05-13 2010-10-06 天津大学 Man-machine interaction device for information fusion of forehead EEG and blood oxygen
KR20100128023A (en) * 2009-05-27 2010-12-07 세종대학교산학협력단 The emotion recognition system based on biometric signals
CN102156541A (en) * 2010-05-13 2011-08-17 天津大学 Prefrontal electroencephalogram information and blood oxygen information fused human-computer interaction method
CN102200833A (en) * 2011-05-13 2011-09-28 天津大学 Speller brain-computer interface (SCI) system and control method thereof

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070135728A1 (en) * 2005-12-01 2007-06-14 Lexicor Medical Technology, Llc Systems and Methods for Analyzing and Assessing Depression and Other Mood Disorders Using Electroencephalographic (EEG) Measurements
KR20100128023A (en) * 2009-05-27 2010-12-07 세종대학교산학협력단 The emotion recognition system based on biometric signals
CN101690659A (en) * 2009-09-29 2010-04-07 华东理工大学 Brain wave analysis method
CN101853070A (en) * 2010-05-13 2010-10-06 天津大学 Man-machine interaction device for information fusion of forehead EEG and blood oxygen
CN102156541A (en) * 2010-05-13 2011-08-17 天津大学 Prefrontal electroencephalogram information and blood oxygen information fused human-computer interaction method
CN102200833A (en) * 2011-05-13 2011-09-28 天津大学 Speller brain-computer interface (SCI) system and control method thereof

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
彭代辉 et al., "Emotional bias induced by IAPS pictures in patients with first-episode depression" (IAPS图片诱导首发抑郁症患者的情绪偏向性), 《中国心理卫生杂志》 (Chinese Mental Health Journal), Nov. 2007, Vol. 21, No. 11, pp. 753-755. *
钟铭恩 et al., "Driver emotional state recognition based on EEG signals" (基于脑电信号的驾驶员情绪状态识别研究), 《中国安全科学学报》 (China Safety Science Journal), Sep. 2011, Vol. 21, No. 9, pp. 64-69. *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106175752A (en) * 2015-04-30 2016-12-07 深圳市前海览岳科技有限公司 EEG signal acquisition apparatus and method, and state assessment system and method
CN106175752B (en) * 2015-04-30 2020-12-01 浙江美迪克医疗科技有限公司 Brain wave signal acquisition device and method, and state evaluation system and method

Also Published As

Publication number Publication date
CN102512160A (en) 2012-06-27

Similar Documents

Publication Publication Date Title
CN102512160B (en) Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands
CN102499677B (en) Emotional state identification method based on electroencephalogram nonlinear features
CN105956624B (en) Mental imagery brain electricity classification method based on empty time-frequency optimization feature rarefaction representation
CN105411565B (en) Heart rate variability tagsort method based on broad sense multi-scale wavelet entropy
CN101690659B (en) Brain wave analysis method
CN107361766B (en) Emotion electroencephalogram signal identification method based on EMD domain multi-dimensional information
CN102715911B (en) Brain electric features based emotional state recognition method
WO2017016086A1 (en) Depression evaluating system and method based on physiological information
CN106709469B (en) Automatic sleep staging method based on electroencephalogram and myoelectricity multiple characteristics
CN102697493B (en) Method for rapidly and automatically identifying and removing ocular artifacts in electroencephalogram signal
CN105877766A (en) Mental state detection system and method based on multiple physiological signal fusion
CN204931634U (en) Based on the depression evaluating system of physiologic information
CN108742660A (en) A kind of Emotion identification method based on wearable device
Li et al. An EEG-based method for detecting drowsy driving state
Wang et al. Driving fatigue detection based on EEG signal
Podgorelec Analyzing EEG signals with machine learning for diagnosing Alzheimer’s disease
CN110059564B (en) Feature extraction method based on power spectral density and cross-correlation entropy spectral density fusion
Du et al. A method for detecting high-frequency oscillations using semi-supervised k-means and mean shift clustering
Fattah et al. Identification of motor neuron disease using wavelet domain features extracted from EMG signal
CN109009098B (en) Electroencephalogram signal feature identification method under motor imagery state
Chen et al. Two-dimensional phase lag index image representation of electroencephalography for automated recognition of driver fatigue using convolutional neural network
CN114578963B (en) Electroencephalogram identity recognition method based on feature visualization and multi-mode fusion
Avdakovic et al. Diagnosis of epilepsy from EEG signals using global wavelet power spectrum
Qiao et al. Feature extraction and classifier evaluation of EEG for imaginary hand movements
Liu et al. Epileptic EEG identification based on hybrid feature extraction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant