CN107085464B - Emotion recognition method based on a P300 character-spelling task - Google Patents

Emotion recognition method based on a P300 character-spelling task Download PDF

Info

Publication number
CN107085464B
CN107085464B · CN201610819373.0A · CN201610819373A
Authority
CN
China
Prior art keywords
mood
erp
task
signal
lead
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610819373.0A
Other languages
Chinese (zh)
Other versions
CN107085464A (en
Inventor
仝晶晶
赵欣
刘爽
许敏鹏
汤佳贝
安兴伟
何峰
明东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University
Priority to CN201610819373.0A
Publication of CN107085464A
Application granted
Publication of CN107085464B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns


Abstract

The present invention relates to emotion detection technology for automatic emotional-state detection; it proposes a new method for detecting emotional state that can effectively perform emotion recognition during P300-based brain-computer interface tasks and yield considerable social and economic benefits. The technical solution adopted by the present invention is an emotion recognition method based on the P300 character-spelling task: first, positive, neutral, and negative emotional states are induced by viewing emotional pictures, after which a P300-based character-spelling task is performed while EEG signals are acquired. After signal preprocessing, the 700 ms of EEG data following each label are extracted and downsampled to 100 Hz as features; the most separable time segments are selected and the N2 and P3 amplitudes of the ERP signal are computed, yielding a feature vector that serves as the input to subsequent pattern recognition for automatic emotion recognition. The present invention is mainly applied to automatic emotional-state detection.

Description

Emotion recognition method based on a P300 character-spelling task
Technical field
The present invention relates to emotion detection technology, and in particular to an emotion recognition method based on the P300 character-spelling task.
Background art
Emotion is the attitudinal experience that arises from whether objective things or situations (external or internal stimuli) satisfy a person's wishes or needs; it is a comprehensive psychological and physiological state produced by the interplay of perception, thought, and behavioral reaction. As a high-level function of the human brain, emotion is an important part of human intelligence and influences learning, memory, and decision-making to varying degrees. The detection, recognition, and regulation of emotion have long been hot topics in scientific research.
Emotion recognition is an important component of affective computing, a field combining computer science with psychology and cognitive science; it is an important guarantee for exploring the affective characteristics of human-computer interaction and for improving harmony between people and computers. Emotion has three components: subjective experience (the individual's own feeling of different emotional states), external expression (the quantified movements of the body when an emotional state occurs, i.e., expressions), and physiological arousal (the physiological reactions produced by emotion). Correspondingly, there are three recognition approaches. Subjective-report methods based on subjective experience include emotion rating scales and self-report questionnaires; they are simple to administer, but individual differences in interpreting the scale items reduce precision and reliability. External-behavior measurement based on external expression identifies emotional states by analyzing the relationship between emotion and behaviors such as facial expression, posture, and vocal intonation; however, external behavior is easily controlled by personal will and can be deliberately masked or faked, so misidentification occurs readily. Physiological-signal measurement based on physiological arousal uses indicators such as cortisol level, heart rate, blood pressure, respiration, electrodermal activity, palm sweating, pupil diameter, event-related potentials, and EEG; physiological signals are governed by the nervous and endocrine systems, arise spontaneously, cannot be faked, and are rarely controlled by subjective will, so they reflect a person's emotional state more objectively and truthfully. These emotion recognition methods each have their own characteristics and can be freely combined.
A brain-computer interface (BCI) is a communication system that transmits information between the brain and peripheral devices without depending on peripheral nerves or the normal muscular output pathways. With the rapid development of electronic information technology, BCI applications have expanded greatly: beyond neural rehabilitation, they show good prospects in entertainment, disease diagnosis and detection, artificial intelligence, and machine learning. Brain-machine interaction, a novel mode of communication that fully frees the hands, may well change how people work. However, emotional fluctuations during brain-controlled operation can affect the stability and reliability of BCI performance, so there is an urgent need to monitor and identify emotional state during brain control and to feed it back to the user in real time. Since EEG must be acquired and processed during BCI use anyway, and EEG signals offer high temporal resolution and good real-time performance, EEG signals can be used for emotion recognition during BCI tasks.
The BCI based on P300 character spelling is a typical and important example of a brain-computer interface. P300 is a classic endogenous component of the event-related potential (ERP); its amplitude is related to the predictability of the stimulus, and it is named for typically appearing about 300 ms after the event. The ERP is a special kind of evoked brain potential, elicited by multiple or varied stimuli, that reflects the electrophysiological changes in the brain's cognitive process when a stimulus is endowed with a special psychological meaning. Research on emotion-related ERPs began to flourish in the 1960s. Many studies have shown that, compared with neutral pictures, emotional pictures with high arousal induce a larger P300 amplitude, and positive emotional stimuli elicit a larger P3b amplitude than neutral or negative stimuli. Other studies have shown that negative emotional stimuli induce a larger P1 (117 ms) amplitude over the occipital region than positive stimuli. It is therefore feasible to perform emotion recognition with ERP signals. Since ERP signals are generated naturally during execution of the P300 character-spelling task, emotion recognition with these signals is convenient, efficient, and real-time.
Summary of the invention
To overcome the deficiencies of the prior art, the present invention proposes a new method for detecting emotional state that performs automatic detection, can effectively carry out emotion recognition during P300-based BCI tasks, and can yield considerable social and economic benefits. The technical solution adopted is an emotion recognition method based on the P300 character-spelling task: positive, neutral, and negative emotional states are first induced by viewing emotional pictures, after which a P300-based character-spelling task is performed while EEG is acquired; after signal preprocessing, the 700 ms of EEG data following each label are extracted and downsampled to 100 Hz as features; the most separable time segments are selected and the N2 and P3 amplitudes of the ERP signal are computed, yielding a feature vector that serves as the input to subsequent pattern recognition for automatic emotion recognition.
Emotion induction and character-spelling task
IAPS pictures matched in arousal are chosen to induce positive, neutral, and negative emotion. Because the universality of the IAPS is challenged when it is applied across cultures, ethnicities, and countries, an applicability evaluation must be carried out before the experiment: the pictures of the three emotion classes are rated for arousal and valence, and the pictures with relatively high arousal are chosen as emotion-induction material;
After the corresponding emotion is induced with the emotional pictures, the P300-based character-spelling task is performed and the subject rates his or her current emotional state. The spelling interface is a 6*6 character matrix; spelling one character requires 10 rounds of row/column flashes, with the corresponding row or column lit for 100 ms and dimmed for 75 ms on each flash, so spelling one character takes (100+75)*12*10/1000 = 21 s. To ensure the emotional state is maintained throughout the spelling task, the experiment alternates spelling one character with one emotion induction. For the subjective rating, the subject scores his or her own valence on a 9-point scale: pleasantness increases with the grade, and 5 is a neutral, relaxed state.
Feature extraction
The acquired EEG signals are first preprocessed, including removal of power-line noise, baseline removal, band-pass filtering, downsampling to 100 Hz, and ICA-based ocular artifact removal; feature extraction is then performed on the preprocessed signals;
The feature extraction procedure first checks the subject's self-rated valence: scores of 1 to 3 are regarded as a negative emotional state, 4 to 6 as neutral, and 7 to 9 as positive. The data are split into the character-spelling segments under the three emotional states; each segment lasts 21 s and contains 120 row/column labels plus 1 target-character label. For each segment, the 700 ms of data after each row/column label are extracted, together with the N2 and P3 amplitudes of those 700 ms of data;
1) Amplitudes of the ERP signal. The post-label 700 ms segments under a given emotional state are superposition-averaged to obtain the average ERP signal, which contains the P1, N1, P2, N2, and P3 components. Attention is focused on the N2 and P3 components: the latencies are first determined on the average ERP signal, and the N2 and P3 amplitudes of each stimulus's ERP signal are then computed, giving 60 leads * 2 = 120 feature dimensions, denoted feature vector V1, V2, ..., V120;
2) The post-label 700 ms of data also serve as the features for character recognition; at a sampling rate of 100 Hz each lead has 700*100/1000 = 70 data points. Current EEG acquisition equipment offers many leads; the emotion-related prefrontal leads FP1 and FP2 are added to the usual choices, for 8 leads in total. The Fisher ratio is used to find the time segments that best distinguish the emotion types, computed with the adaptive separable-segment tracking method.
Adaptive tracking of separable time segments
Spelling one character generates 120 ERP signals, so each lead has a discrete ERP matrix Ik(n, t), where n is the index of the ERP signal and t is time. The Fisher ratio of each lead's ERP matrix is computed first; iterative selection is then applied with a 100 ms window and 25 ms overlap, and the average separability ratio of each window is computed. The window with the largest separability ratio contains the most features distinguishing the three emotion classes, so the three windows with the largest ratios are selected as the pattern-recognition features. The specific algorithm is as follows:
The Fisher ratio F_R(t) is computed as in formula (1), where F_R(t) is the three-class emotion separability coefficient of a given lead: the ratio of the between-class scatter S_B(t) (between the three emotion types) to the within-class scatter S_W(t) (within one emotion type):

F_R(t) = S_B(t) / S_W(t), with
S_B(t) = Σ_{k=1..C} n_k (m_k(t) − m(t))²,
S_W(t) = Σ_{k=1..C} Σ_{n=1..n_k} (I_k(n, t) − m_k(t))²    (1)

Here I_k(n, t) is the ERP value of the n-th sample of emotion class k at time t, m_k(t) is the average ERP value at time t over all samples of class k, m(t) is the average ERP value at time t over all samples of all classes, C is the number of classes (C = 3 in the present invention), and n_k is the number of samples in class k;
The Fisher ratio so obtained is a per-lead value at each time point. Because the best separable segments differ across leads, the time segments are screened per lead: with iterative selection, using a 100 ms window and 25 ms overlap, the average Fisher ratio of each of the 9 windows, i.e., its separability score, is obtained.
The separability score F(i) of each window is computed as in formula (2), where F(i) is the average Fisher ratio of the i-th window and t_i are the time points contained in the i-th window:

F(i) = (1 / |t_i|) Σ_{t ∈ t_i} F_R(t)    (2)
The 9 windows of each lead are sorted by separability score from high to low, and the top three are selected as classification features, giving 8 leads * 300 ms * 100 Hz / 1000 = 240 feature dimensions, denoted feature vector V121, V122, ..., V360.
Pattern recognition
After feature extraction, a 360-dimensional feature vector (V1, V2, ..., V360) is obtained: the first 120 dimensions V1, ..., V120 are the N2 and P3 amplitudes of the ERP signal, and the last 240 dimensions V121, ..., V360 are the data of the three selected ERP time segments. A support vector machine (SVM) classifier identifies these features to perform emotion classification.
Features and benefits of the present invention:
The present invention addresses the problem that BCI use is susceptible to mood fluctuation. It takes the ERP signals generated during BCI use as feature input, identifying emotion more simply and quickly and alerting the user. Compared with emotion recognition based on peripheral physiological signals, it is simpler and more portable; in a P300-based BCI, the features used are identical to those used for character recognition, which greatly simplifies processing and improves real-time performance. It provides core technical support for friendly and harmonious human-computer interaction, brings convenience to practical applications, and can be applied in a variety of operational scenarios.
Brief description of the drawings:
Fig. 1: flow chart of the automatic emotion recognition method based on task-evoked ERP signals.
Fig. 2: experiment flow chart of automatic emotion recognition based on task-evoked ERP signals.
Fig. 3: feature extraction flow chart of automatic emotion recognition based on task-evoked ERP signals.
Fig. 4: computation flow of the adaptive tracking of separable time segments.
Fig. 5: schematic of the iterative selection of separable time segments.
Specific embodiment
The present invention proposes a new method for detecting emotional state. Its technical flow is: record the EEG during the P300 character-spelling task; extract the ERP signal by superposition averaging; determine the N2 and P3 latencies of the ERP signal under the different emotional states and compute their amplitudes; select the best separable time segments of the ERP signal with the adaptive separable-segment tracking method; and form the feature matrix used as the input to subsequent pattern recognition, thereby detecting emotional state automatically.
The purport of the invention is to propose a new method for detecting emotional state. By recording the EEG during the P300 character-spelling task, extracting the ERP signal, computing the ERP amplitudes under the different emotional states, and selecting the best separable time segments of the ERP signal with the adaptive tracking method, a feature vector is obtained as the input to subsequent pattern recognition for automatic emotional-state detection. The invention can effectively perform emotion recognition during P300-based BCI tasks and yield considerable social and economic benefits.
The flow of the automatic emotion recognition method based on task-evoked ERP signals is shown in Fig. 1. The overall recognition process is: first induce positive, neutral, and negative emotional states by viewing emotional pictures; then perform the P300-based character-spelling task while acquiring EEG; after signal preprocessing, extract the 700 ms of EEG data after each label and downsample to 100 Hz as features; select the best separable segments and compute the N2 and P3 amplitudes of the ERP signal, obtaining the feature vector used as the input to subsequent pattern recognition for automatic emotion recognition.
1 Emotion induction and character-spelling task
IAPS pictures matched in arousal are chosen to induce positive, neutral, and negative emotion. Because the universality of the IAPS is challenged when it is applied across cultures, ethnicities, and countries, an applicability evaluation is carried out before the experiment: the pictures of the three emotion classes are rated for arousal and valence. The pictures with relatively high arousal are chosen as emotion-induction material.
After the corresponding emotion is induced with the emotional pictures, the P300-based character-spelling task is performed and the subject rates his or her current emotional state. The spelling interface is a 6*6 character matrix; spelling one character requires 10 rounds of row/column flashes, with the corresponding row or column lit for 100 ms and dimmed for 75 ms on each flash, so spelling one character takes (100+75)*12*10/1000 = 21 s. To ensure the emotional state is maintained throughout the spelling task, the experiment alternates spelling one character with one emotion induction. For the subjective rating, the subject scores his or her own valence on a 9-point scale: pleasantness increases with the grade, and 5 is a neutral, relaxed state. The specific experiment flow chart is shown in Fig. 2.
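The per-character timing stated above can be sanity-checked with a few lines of plain Python; every constant comes directly from the text:

```python
# Timing of one spelled character in the 6*6 P300 speller:
# each of the 12 rows/columns flashes once per round (lit 100 ms,
# dimmed 75 ms), and 10 rounds are run per character.
FLASH_ON_MS, FLASH_OFF_MS = 100, 75
FLASHES_PER_ROUND = 12            # 6 rows + 6 columns
ROUNDS_PER_CHAR = 10

seconds_per_char = (FLASH_ON_MS + FLASH_OFF_MS) * FLASHES_PER_ROUND * ROUNDS_PER_CHAR / 1000
labels_per_char = FLASHES_PER_ROUND * ROUNDS_PER_CHAR

print(seconds_per_char)   # 21.0 s per character
print(labels_per_char)    # 120 row/column labels per character
```

The 120 labels per character match the 120 ERP signals per spelled character used later in the segmentation step.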
2 Signal acquisition
EEG signals contain a large amount of physiological information and are the most direct reflection of the brain's bioelectrical activity; after analysis and processing they allow emotion to be classified and recognized. Compared with other physiological parameters, the EEG response to emotion is sensitive and fast, has high temporal resolution and strong functional specificity, and gives relatively high emotion recognition accuracy, so EEG-based emotion recognition has attracted much attention in recent years. ERP signals have millisecond resolution and are an important tool for assessing higher brain functions such as emotion and cognition. In the P300-based character-spelling task, classification on ERP features determines the character the user intends to spell. The emotion recognition features used by this method are identical to the features of the spelling task, which greatly simplifies computation, reduces data-processing time, and facilitates real-time operation.
3 Feature extraction
The acquired EEG signals are first preprocessed, including removal of power-line noise, baseline removal, band-pass filtering, downsampling to 100 Hz, and ICA-based ocular artifact removal. Feature extraction is then performed on the preprocessed signals.
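A minimal preprocessing sketch with SciPy is shown below. The 1 kHz acquisition rate, 50 Hz power line, and 0.5-40 Hz pass band are assumptions for illustration (the patent states none of them), and the ICA-based ocular artifact removal step is omitted:

```python
import numpy as np
from scipy.signal import butter, decimate, filtfilt, iirnotch

def preprocess_eeg(eeg, fs=1000, notch_hz=50.0, band=(0.5, 40.0), target_fs=100):
    """Preprocess one recording of shape (n_channels, n_samples):
    notch out line noise, remove baseline, band-pass, downsample."""
    # 1. Notch filter against power-line interference.
    b, a = iirnotch(notch_hz, Q=30.0, fs=fs)
    eeg = filtfilt(b, a, eeg, axis=-1)
    # 2. Baseline removal: subtract each channel's mean.
    eeg = eeg - eeg.mean(axis=-1, keepdims=True)
    # 3. Band-pass filter (band is an assumed choice, not from the patent).
    b, a = butter(4, band, btype="bandpass", fs=fs)
    eeg = filtfilt(b, a, eeg, axis=-1)
    # 4. Downsample to 100 Hz; decimate applies an anti-aliasing filter.
    return decimate(eeg, fs // target_fs, axis=-1)

rng = np.random.default_rng(0)
raw = rng.standard_normal((8, 10_000))   # 8 leads, 10 s at 1 kHz
clean = preprocess_eeg(raw)
print(clean.shape)                       # (8, 1000): 10 s at 100 Hz
```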
The feature extraction procedure is shown in Fig. 3. Since three emotional states are to be recognized, the subject's self-rated valence is checked first: scores of 1 to 3 are regarded as a negative emotional state, 4 to 6 as neutral, and 7 to 9 as positive. The data are split into the character-spelling segments under the three emotional states; each segment lasts 21 s and contains 120 row/column labels plus 1 target-character label. For each segment, the 700 ms of data after each row/column label and the N2 and P3 amplitudes of those 700 ms of data are extracted.
1) Amplitudes of the ERP signal. The post-label 700 ms segments under a given emotional state are superposition-averaged to obtain the average ERP signal. The average ERP signal contains the components P1 (a positive wave appearing around 100 ms after the stimulus), N1 (a negative wave around 100 ms), P2 (a positive wave around 200 ms), N2 (a negative wave around 200 ms), and P3. P1, N1, and P2 are exogenous, physiological ERP components influenced by the physical characteristics of the stimulus; N2 and P3 are endogenous, psychological components, not influenced by the stimulus's physical characteristics but related to the subject's mental state and attention. We therefore focus on the N2 and P3 components: the latencies are first determined on the average ERP signal, and the N2 and P3 amplitudes of each stimulus's ERP signal are then computed, giving 60 leads * 2 = 120 feature dimensions, denoted feature vector V1, V2, ..., V120. ERP components are named by polarity and latency: for example, a positive wave with a latency of 100 ms is named P100 or P1, and a negative wave with a latency of 200 ms is named N200 or N2.
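As a sketch of step 1), the superposition average and the N2/P3 read-out for one lead might look as follows. The 150-350 ms (N2) and 250-500 ms (P3) search windows are illustrative assumptions; the patent determines the latencies on the averaged ERP without stating explicit windows:

```python
import numpy as np

def n2_p3_amplitudes(epochs, fs=100):
    """epochs: (n_trials, n_samples) segments of one lead, 0-700 ms
    after the row/column label. Returns (N2 amplitude, P3 amplitude)."""
    avg = epochs.mean(axis=0)                        # superposition average
    t_ms = np.arange(avg.size) * 1000.0 / fs         # time axis in ms
    n2 = avg[(t_ms >= 150) & (t_ms <= 350)].min()    # N2: negative deflection
    p3 = avg[(t_ms >= 250) & (t_ms <= 500)].max()    # P3: positive deflection
    return n2, p3

# Synthetic check: a fixed waveform with a dip at 250 ms and a peak at 350 ms,
# repeated over the 120 flashes of one spelled character.
wave = np.zeros(70)
wave[25], wave[35] = -5.0, 7.0
epochs = np.tile(wave, (120, 1))
n2, p3 = n2_p3_amplitudes(epochs)
print(n2, p3)   # -5.0 7.0
```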
2) The post-label 700 ms of data also serve as the features for character recognition; at a sampling rate of 100 Hz each lead has 700*100/1000 = 70 data points. Current EEG acquisition equipment offers many leads; with reference to previous research we choose leads FZ, CZ, PZ, OZ, PO7, and PO8 and add the emotion-related prefrontal leads FP1 and FP2, for 8 leads in total. Since the best separable time segments differ across users, to improve computation speed without degrading recognition, the Fisher ratio is used to find the time segments that best distinguish the emotion types. The computation flow of the adaptive separable-segment tracking method is shown in Fig. 4.
Adaptive tracking of separable time segments
Taking the spelling of one character as an example: each spelled character generates 120 ERP signals, so each lead has a discrete ERP matrix Ik(n, t), where n is the index of the ERP signal and t is time. The Fisher ratio of each lead's ERP matrix is computed first; iterative selection is then applied with a 100 ms window and 25 ms overlap, and the average separability ratio of each window is computed. The window with the largest ratio contains the most features distinguishing the three emotion classes, and the three windows with the largest ratios are selected as the pattern-recognition features. The specific algorithm is as follows.
The Fisher ratio F_R(t) is computed as in formula (1). F_R(t) is the three-class emotion separability coefficient of a given lead: the ratio of the between-class scatter S_B(t) (between modes, here between the three emotion types) to the within-class scatter S_W(t) (within one mode, here within one emotion type):

F_R(t) = S_B(t) / S_W(t), with
S_B(t) = Σ_{k=1..C} n_k (m_k(t) − m(t))²,
S_W(t) = Σ_{k=1..C} Σ_{n=1..n_k} (I_k(n, t) − m_k(t))²    (1)

Here I_k(n, t) is the ERP value of the n-th sample of emotion class k at time t, m_k(t) is the average ERP value at time t over all samples of class k, m(t) is the average ERP value at time t over all samples of all classes, C is the number of classes (C = 3 in the present invention), and n_k is the number of samples in class k.
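Formula (1) can be sketched directly in NumPy. The helper `fisher_ratio` below is hypothetical, and the two-class toy data are only to keep the example small (the patent uses C = 3):

```python
import numpy as np

def fisher_ratio(epochs_by_class):
    """Pointwise Fisher ratio F_R(t) = S_B(t) / S_W(t) for one lead:
    between-class scatter of the class-mean ERPs over within-class
    scatter, computed independently at every time point.
    epochs_by_class: list of C arrays I_k of shape (n_k, n_samples)."""
    m = np.concatenate(epochs_by_class).mean(axis=0)   # grand mean m(t)
    s_b = np.zeros_like(m)
    s_w = np.zeros_like(m)
    for i_k in epochs_by_class:
        m_k = i_k.mean(axis=0)                         # class mean m_k(t)
        s_b += len(i_k) * (m_k - m) ** 2               # between-class scatter
        s_w += ((i_k - m_k) ** 2).sum(axis=0)          # within-class scatter
    return s_b / s_w

# Two time points: the classes differ at t=0 and coincide at t=1,
# so F_R should be large at t=0 and zero at t=1.
c0 = np.array([[0.0, 1.0], [0.1, 1.1]])
c1 = np.array([[1.0, 1.0], [1.1, 1.1]])
fr = fisher_ratio([c0, c1])
print(fr[0] > fr[1])   # True
```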
The Fisher ratio so obtained is a per-lead value at each time point. Because the best separable segments differ across leads, the time segments are screened per lead: with iterative selection, using a 100 ms window and 25 ms overlap, the average Fisher ratio of each of the 9 windows, i.e., its separability score, is obtained.
The separability score F(i) of each window is computed as in formula (2), where F(i) is the average Fisher ratio of the i-th window and t_i are the time points contained in the i-th window:

F(i) = (1 / |t_i|) Σ_{t ∈ t_i} F_R(t)    (2)
The 9 windows of each lead are sorted by separability score from high to low, and the top three are selected as classification features. This gives 8 leads * 300 ms * 100 Hz / 1000 = 240 feature dimensions, denoted feature vector V121, V122, ..., V360.
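Formula (2) and the top-three selection can be sketched as follows. The helper `select_segments` is hypothetical, and rounding window starts to whole samples is an assumed detail, since a 75 ms step is 7.5 samples at 100 Hz:

```python
import numpy as np

def select_segments(fr, fs=100, total_ms=700, win_ms=100, step_ms=75, top=3):
    """Average F_R(t) over 100 ms windows stepped every 75 ms
    (25 ms overlap), per formula (2), and return the start times
    and scores of the top-scoring windows."""
    starts_ms = np.arange(0, total_ms - win_ms + 1, step_ms)   # 9 windows
    win = int(round(win_ms * fs / 1000))                       # 10 samples
    scores = []
    for s in starts_ms:
        i0 = int(round(float(s) * fs / 1000))                  # round to sample
        scores.append(fr[i0:i0 + win].mean())                  # F(i): mean F_R
    scores = np.asarray(scores)
    order = np.argsort(scores)[::-1][:top]                     # highest F(i) first
    return starts_ms[order], scores[order]

# Toy F_R(t) with separability concentrated in the 300-400 ms window.
fr = np.zeros(70)
fr[30:40] = 1.0
starts, scores = select_segments(fr)
print(starts[0])   # 300 (ms): the best-scoring window start
```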
4 Pattern recognition
After feature extraction, a 360-dimensional feature vector (V1, V2, ..., V360) is obtained: the first 120 dimensions V1, ..., V120 are the N2 and P3 amplitudes of the ERP signal, and the last 240 dimensions V121, ..., V360 are the data of the three selected ERP time segments. Because the sample dataset is small, a support vector machine (SVM) classifier is used to identify the features and perform emotion classification.
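A minimal sketch of the classification stage, assuming scikit-learn's `SVC`. The RBF kernel, regularization constant, and standardization step are assumed choices (the patent specifies only that an SVM is used), and the random, well-separated clusters stand in for real 360-dimensional feature vectors:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data: one 360-dimensional feature vector per spelled character,
# 60 characters per emotion class (negative=0, neutral=1, positive=2).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=3 * c, size=(60, 360)) for c in range(3)])
y = np.repeat([0, 1, 2], 60)

# Standardize, then fit an RBF-kernel SVM on the three emotion classes.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.score(X, y) > 0.9)   # True on these well-separated clusters
```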
The purport of the invention is to propose an emotion recognition method based on the P300 character-spelling task, taking the ERP signals generated during the task as feature input so that emotion recognition is accurate, objective, and simple. The invention can effectively improve the accuracy and simplicity of emotion recognition during BCI use and yield considerable social and economic benefits. The preferred mode of exploitation is patent transfer, technical cooperation, or product development. Since the technique is simple to operate and highly sensitive, products developed from it can be applied in many scenarios such as human-computer interaction and brain-computer interfaces.

Claims (6)

1. An emotion recognition method based on the P300 character-spelling task, characterized in that positive, neutral, and negative emotional states are first induced by viewing emotional pictures, after which a P300-based character-spelling task is performed while EEG is acquired; after signal preprocessing, the 700 ms of EEG data after each label are extracted and downsampled to 100 Hz as features; the most separable time segments are selected and the N2 and P3 amplitudes of the ERP signal are computed, yielding a feature vector as the input to subsequent pattern recognition, thereby performing automatic emotion recognition.
2. The emotion recognition method based on the P300 character-spelling task as claimed in claim 1, characterized in that the emotion induction and character-spelling task comprise: choosing IAPS pictures matched in arousal to induce positive, neutral, and negative emotion; because the universality of the IAPS is challenged when it is applied across cultures, ethnicities, and countries, an applicability evaluation is carried out before the experiment, i.e., the pictures of the three emotion classes are rated for arousal and valence, and the pictures with relatively high arousal are chosen as emotion-induction material;
after the corresponding emotion is induced with the emotional pictures, the P300-based character-spelling task is performed and the subject rates his or her current emotional state; the spelling interface is a 6*6 character matrix; spelling one character requires 10 rounds of row/column flashes, with the corresponding row or column lit for 100 ms and dimmed for 75 ms on each flash, so spelling one character takes (100+75)*12*10/1000 = 21 s; to ensure the emotional state is maintained throughout the spelling task, the experiment alternates spelling one character with one emotion induction; for the subjective rating, the subject scores his or her own valence on a 9-point scale, where pleasantness increases with the grade and 5 is a neutral, relaxed state.
3. The emotion recognition method based on a P300 character-spelling task according to claim 1, characterized in that the feature vector is obtained by the following process:
The acquired EEG signals are first preprocessed, including removal of power-line interference, baseline removal, band-pass filtering, downsampling to 100 Hz, and ICA-based removal of ocular artifacts; feature extraction is then performed on the preprocessed signals;
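The preprocessing chain can be sketched with SciPy. The 1000 Hz raw rate, the 50 Hz notch frequency, and the 0.5–40 Hz pass-band are assumptions not stated in the claim, and the ICA ocular-artifact step is only indicated by a comment, since it needs a dedicated component-rejection procedure:

```python
import numpy as np
from scipy import signal

def preprocess(eeg, fs=1000, fs_out=100):
    """eeg: (n_leads, n_samples) raw EEG. fs, fs_out, and band edges assumed."""
    # 1) remove power-line interference (50 Hz notch assumed; 60 Hz in some regions)
    b, a = signal.iirnotch(w0=50, Q=30, fs=fs)
    eeg = signal.filtfilt(b, a, eeg, axis=-1)
    # 2) remove baseline by subtracting each lead's mean
    eeg = eeg - eeg.mean(axis=-1, keepdims=True)
    # 3) band-pass filter (0.5-40 Hz is a common ERP choice, assumed here)
    b, a = signal.butter(4, [0.5, 40], btype="bandpass", fs=fs)
    eeg = signal.filtfilt(b, a, eeg, axis=-1)
    # 4) downsample to 100 Hz
    eeg = signal.decimate(eeg, q=fs // fs_out, axis=-1, zero_phase=True)
    # 5) ICA removal of ocular artifacts would follow here (omitted; e.g.
    #    sklearn.decomposition.FastICA plus manual rejection of eye components)
    return eeg

demo = np.random.randn(8, 5000)   # 8 leads, 5 s at 1000 Hz
clean = preprocess(demo)
print(clean.shape)                # -> (8, 500): 5 s at 100 Hz
```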
First, the subject's self-rating of pleasure is examined: ratings of 1 to 3 are regarded as the negative emotional state, 4 to 6 as the neutral emotional state, and 7 to 9 as the positive emotional state, and the data are divided into character-spelling segments under the three emotional states. Each segment is 21 s long and contains 120 row/column labels and 1 target-character label. For each segment, the 700 ms of data following each row/column label and the N2 and P3 amplitudes of those 700 ms must be extracted;
1) ERP amplitudes: the 700 ms post-label data under a given emotional state are superposition-averaged to obtain the average ERP signal, which contains the P1, N1, P2, N2, and P3 components; attention is focused on the N2 and P3 components. The latencies are first determined from the average ERP signal, and the N2 and P3 amplitudes of each stimulus's ERP are then computed at those latencies, giving 60 leads × 2 = 120 feature dimensions, denoted the feature vector V1, V2, …, V120;
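Step 1) can be sketched as follows. The N2/P3 latency search windows (roughly 200–350 ms and 300–600 ms) and the array shapes are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def n2_p3_amplitudes(epochs, fs=100, n2_win=(0.20, 0.35), p3_win=(0.30, 0.60)):
    """epochs: (n_stimuli, n_samples) 700 ms post-label data for one lead.

    Superposition-average the epochs, fix the N2/P3 latencies on the average
    ERP, then read each single-stimulus amplitude at those latencies. The
    search windows are common ERP conventions, assumed here."""
    avg_erp = epochs.mean(axis=0)                 # average ERP signal
    t = np.arange(epochs.shape[1]) / fs
    n2_mask = (t >= n2_win[0]) & (t <= n2_win[1])
    p3_mask = (t >= p3_win[0]) & (t <= p3_win[1])
    # N2 is a negative deflection, P3 a positive one
    n2_idx = np.flatnonzero(n2_mask)[np.argmin(avg_erp[n2_mask])]
    p3_idx = np.flatnonzero(p3_mask)[np.argmax(avg_erp[p3_mask])]
    return epochs[:, n2_idx], epochs[:, p3_idx]

# 120 stimuli x 70 samples (700 ms at 100 Hz) for one lead
epochs = np.random.randn(120, 70)
n2, p3 = n2_p3_amplitudes(epochs)
print(n2.shape, p3.shape)   # 2 amplitude features per lead; 60 leads -> 120 dims
```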
2) The 700 ms of data after each label are used as character-recognition features; at a sampling frequency of 100 Hz, each lead has 700×100/1000 = 70 data points. The leads FZ, CZ, PZ, OZ, PO7, and PO8 are chosen, plus the emotion-related prefrontal leads FP1 and FP2, for a total of 8 leads. The Fisher ratio is used to find the separable time segments that best distinguish the emotion types, and a segment-wise adaptive tracking computation is carried out.
4. The emotion recognition method based on a P300 character-spelling task according to claim 3, characterized in that segment-wise adaptive tracking specifically means: spelling each character generates 120 ERP signals, so each lead has a discrete ERP matrix Ik(n, t), where n is the index of the ERP signal and t is time. The Fisher ratio of each lead's ERP matrix is computed first; an iterative selection method is then applied, with a 100 ms window and a 25 ms overlap, to compute the average separability ratio of each time segment. The segment with the largest separability ratio contains the most features distinguishing the three emotion classes, so the three segments with the largest separability ratios are selected as pattern-recognition features. The specific algorithm is as follows:
The Fisher ratio FR(t) is computed as shown in formula (1), where FR(t) is the separability coefficient of the three emotion classes for a given lead, i.e. the ratio of the between-class difference SB(t) (between the three emotion types) to the within-class difference SW(t) (within the same emotion type); Ik(n, t) is the ERP value of the n-th sample of the k-th emotion class at time t; mk(t) is the average ERP value of all samples of the k-th emotion class at time t; m(t) is the average ERP value of all samples of all classes at time t; C is the number of classes, C = 3 in the present invention; and nk is the number of samples in the k-th class;
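The expression of formula (1) itself does not survive in this extracted text; a reconstruction consistent with the symbols defined in the claim — the standard Fisher-ratio form, offered only as an assumption — would be:

```latex
F_R(t) = \frac{S_B(t)}{S_W(t)}
       = \frac{\sum_{k=1}^{C} n_k \bigl( m_k(t) - m(t) \bigr)^2}
              {\sum_{k=1}^{C} \sum_{n=1}^{n_k} \bigl( I_k(n,t) - m_k(t) \bigr)^2}
\tag{1}
```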
The Fisher ratio so obtained is a value for each lead at each time point; since the optimal time segment may differ from lead to lead, segment screening must be carried out lead by lead. Using the iterative selection method with a 100 ms window and a 25 ms overlap, the average Fisher ratios of 9 segments, i.e. their separability scores, are obtained.
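The segment screening above can be sketched as follows. Note that at 100 Hz a 25 ms overlap falls between samples, so this sketch works in millisecond ticks, giving exactly 10 samples per 100 ms window and the 9 windows stated in the claim; selecting the top three windows then yields 3 × 10 samples per lead, i.e. 8 × 30 = 240 dimensions:

```python
import numpy as np

def window_scores(FR, fs=100, epoch_ms=700, win_ms=100, overlap_ms=25):
    """FR: (n_samples,) Fisher ratio F_R(t) of one lead over the 700 ms epoch.

    Returns the separability score (mean Fisher ratio) of each 100 ms window;
    with a 25 ms overlap the step is 75 ms, giving 9 windows over 700 ms."""
    t_ms = np.arange(FR.size) * 1000 // fs                  # 0, 10, ..., 690 ms
    step_ms = win_ms - overlap_ms                           # 75 ms
    starts = np.arange(0, epoch_ms - win_ms + 1, step_ms)   # 0, 75, ..., 600
    return np.array([FR[(t_ms >= s) & (t_ms < s + win_ms)].mean()
                     for s in starts])

FR = np.random.rand(70)               # Fisher ratio, 70 samples (700 ms @ 100 Hz)
scores = window_scores(FR)
top3 = np.argsort(scores)[::-1][:3]   # the three most separable windows
print(scores.shape, top3)             # 9 windows; 3 kept per lead
```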
5. The emotion recognition method based on a P300 character-spelling task according to claim 4, characterized in that the separability score F(i) of each time segment is computed as shown in formula (2), where F(i) is the average Fisher ratio of the i-th segment and ti denotes the time points contained in the i-th segment;
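Formula (2) is likewise missing from this extracted text; under the definitions above, a consistent reconstruction (assumed, not quoted from the patent) is the average of the Fisher ratio over the segment's time points:

```latex
F(i) = \frac{1}{\lvert t_i \rvert} \sum_{t \in t_i} F_R(t) \tag{2}
```

where |t_i| is the number of time points in the i-th segment.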
The separability scores of each lead's 9 segments are then sorted from high to low, and the top three are selected as classification features; hence there are 8 leads × 300 ms × 100 Hz/1000 = 240 feature dimensions, denoted the feature vector V121, V122, …, V360.
6. The emotion recognition method based on a P300 character-spelling task according to claim 1, characterized in that the specific pattern-recognition steps are: after the above feature extraction, a 360-dimensional feature vector (V1, V2, …, V360) is obtained, in which the first 120 dimensions V1, …, V120 are the N2 and P3 amplitudes of the ERP signal and the last 240 dimensions V121, …, V360 are the data of the three ERP segments; the features are classified with a support vector machine (SVM) classifier to perform emotion classification and recognition.
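Claim 6 amounts to a standard multi-class SVM over the 360-dimensional vectors; a minimal sketch with scikit-learn, in which the RBF kernel, the standardization step, and the synthetic data are assumptions not stated in the patent:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# synthetic stand-ins: 90 spelled characters x 360-dim vectors
# (V1..V120 = N2/P3 amplitudes, V121..V360 = three selected ERP segments)
X = rng.normal(size=(90, 360))
y = rng.integers(0, 3, size=90)   # 0 = negative, 1 = neutral, 2 = positive

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)                      # SVC handles 3 classes via one-vs-one
pred = clf.predict(X)
print(pred.shape)                  # one emotion label per spelled character
```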
CN201610819373.0A 2016-09-13 2016-09-13 Emotion identification method based on P300 characters spells task Active CN107085464B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610819373.0A CN107085464B (en) 2016-09-13 2016-09-13 Emotion identification method based on P300 characters spells task

Publications (2)

Publication Number Publication Date
CN107085464A CN107085464A (en) 2017-08-22
CN107085464B true CN107085464B (en) 2019-11-26

Family

ID=59615049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610819373.0A Active CN107085464B (en) 2016-09-13 2016-09-13 Emotion identification method based on P300 characters spells task

Country Status (1)

Country Link
CN (1) CN107085464B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109770889B (en) * 2017-11-15 2022-03-11 深圳市理邦精密仪器股份有限公司 Electrocardiogram data section selection method and device
JP6705611B2 (en) * 2018-03-09 2020-06-03 三菱電機株式会社 Discomfort condition determination device
CN109589493A (en) * 2018-09-30 2019-04-09 天津大学 It is a kind of based on the attentional regulation method through cranium galvanic current stimulation
CN111413874B (en) * 2019-01-08 2021-02-26 北京京东尚科信息技术有限公司 Method, device and system for controlling intelligent equipment
CN111414835B (en) * 2020-03-16 2022-04-05 西南大学 Detection and determination method for electroencephalogram signals caused by love impulsion

Citations (4)

Publication number Priority date Publication date Assignee Title
CN102499677A (en) * 2011-12-16 2012-06-20 天津大学 Emotional state identification method based on electroencephalogram nonlinear features
CN102508545A (en) * 2011-10-24 2012-06-20 天津大学 Visual P300-Speller brain-computer interface method
CN102512160A (en) * 2011-12-16 2012-06-27 天津大学 Electroencephalogram emotional state feature extraction method based on adaptive tracking in different frequency bands
CN103690165A (en) * 2013-12-12 2014-04-02 天津大学 Cross-inducing-mode emotion electroencephalogram recognition and modeling method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20100249538A1 (en) * 2009-03-24 2010-09-30 Neurofocus, Inc. Presentation measure using neurographics

Non-Patent Citations (2)

Title
Decreasing the interference of visual-based P300 BCI using facial expression changes; Jing Jin, Yu Zhang and Xingyu; Proceeding of the 11th World Congress on Intelligent Control and Automation; 2014-07-04; pp. 2407-2411 *
Emotional Face Retrieval with P300 Signals of Multiple Subjects; Junwei Fan, Hideaki Touyama; 2016 Joint 8th International Conference on Soft Computing and Intelligent Systems and 2016 17th International Symposium; 2016-08-28; pp. 490-495 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant