CN107944473A - Physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion - Google Patents
Physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion
- Publication number
- CN107944473A CN107944473A CN201711077252.4A CN201711077252A CN107944473A CN 107944473 A CN107944473 A CN 107944473A CN 201711077252 A CN201711077252 A CN 201711077252A CN 107944473 A CN107944473 A CN 107944473A
- Authority
- CN
- China
- Prior art keywords
- particle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Physiology (AREA)
- Cardiology (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Psychiatry (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Fuzzy Systems (AREA)
- Signal Processing (AREA)
- Child & Adolescent Psychology (AREA)
- Developmental Disabilities (AREA)
- Educational Technology (AREA)
- Hospice & Palliative Care (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Mathematical Physics (AREA)
- Pulmonology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The invention discloses a physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion. The user first interacts with a product and fills in a Chinese-language PAD emotion scale questionnaire. The user's heart rate and skin-conductance signals are recorded during the product experience, and both objective physiological signals are processed and their features extracted. The extracted heart-rate features and skin-conductance features are each used to train and run an SVM classifier. Each classifier's recognition result is expressed as a probability over the target emotion classes and normalized. Each classifier is then assigned a weight, and the weights are optimized with a particle swarm algorithm. Finally, the recognition results for the different emotion classes are fused, and the emotion class with the highest fused score is taken as the final emotional state. By combining multiple classifiers, the invention balances the subjective, objective, and per-signal decision results, making the final recognition more accurate and reliable.
Description
Technical field
The invention belongs to the field of affective computing, and relates to a physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion.
Background technology
With the rapid development of science and computer technology, people depend on computers ever more heavily and expect them to be increasingly intelligent. Ideally, a computer should be able to reason as people do, and to understand and express human emotion, so that its users can study, work, and live within a harmonious human-computer interaction environment.

With the arrival of the user-experience economy, people no longer focus only on a product's features, stability, and safety, but above all on their satisfaction while using it, that is, on a good user experience. User experience is largely subjective: different products evoke different emotional experiences during interaction, and the user's emotional state at that moment best reflects how the product actually feels in use.

Sentiment analysis comprises subjective and objective approaches. Objective analysis identifies the user's affective characteristics mainly from objective data such as physiological indices, facial expression, and speech features; subjective analysis mainly relies on self-report or questionnaires. At present, research on the correspondence between emotion and the objective data that reflects it mostly analyzes a single physiological index, speech signal, or expression feature, and ignores the important factor of the individual's subjective feeling, so the emotion recognition rate may be low. Moreover, human physiological signals are governed mainly by the autonomic nervous and endocrine systems and are free of subjective control, whereas an individual's subjective feeling can be expressed directly by that individual. Fusing the objective and subjective decision results therefore allows the subject's current emotional state to be recognized more effectively.
Summary of the invention
The object of the present invention is to provide a physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion that can effectively recognize a user's emotional state during a product experience. Unlike most existing work, which recognizes emotion from a single index, this method applies an SVM classifier to the features of each of two physiological signals (heart rate and skin conductance), and on this basis additionally fuses the subjective factor of self-expression captured by a Chinese-language PAD emotion scale questionnaire. The weights of these three decision results are then optimized with a particle swarm algorithm and assigned, and the three weighted decision results are fused to obtain the final emotion recognition result. The proposed method can recognize emotional states effectively.
To achieve the above object, the technical solution adopted by the present invention is a physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion, comprising the following steps:

Step 1: The user interacts with the product and, after the experience, fills in a Chinese-language PAD emotion scale questionnaire according to the emotional state felt during the experience;

Step 2: The user's heart rate and skin-conductance signals are recorded during the product experience, and both objective physiological signals are processed and their features extracted;

Step 3: The extracted heart-rate features and skin-conductance features are each used to train and run a support vector machine classifier; meanwhile, the values of the three PAD dimensions obtained from the user's questionnaire are used to train the PAD-model classifier;

Step 4: The recognition result of each classifier in Step 3 is expressed as a probability over the target emotion classes, and the per-class recognition results of each classifier are normalized;

Step 5: Each classifier is assigned a weight; the weights are treated as a particle and optimized with a particle swarm algorithm;

Step 6: The three classifiers' recognition results for the different emotion classes are fused, and the emotion class with the highest fused score is taken as the final emotional state.
Further, in Step 4 above, the normalized result of classifier $E_k$ is $E_k(x) = (P_k(C_1|x), P_k(C_2|x), \ldots, P_k(C_M|x))$, where $k \in \{1, 2, \ldots, K\}$ and $P_k(C_t|x)$ denotes the probability, with values in $[0,1]$, that classifier $E_k$ assigns sample $x$ (a physiological signal or a subjective expression) to emotion class $C_t$. There are $M = 4$ emotion classes: happiness, sadness, anger, and fear. The decision matrix of the multi-classifier system is therefore

$$A_{K\times M}(x) = \begin{bmatrix} P_1(C_1|x) & P_1(C_2|x) & \cdots & P_1(C_M|x) \\ P_2(C_1|x) & P_2(C_2|x) & \cdots & P_2(C_M|x) \\ \vdots & \vdots & & \vdots \\ P_K(C_1|x) & P_K(C_2|x) & \cdots & P_K(C_M|x) \end{bmatrix}.$$
Further, optimizing the weights with the particle swarm algorithm in Step 5 above means finding the optimal weight vector $\omega = (\omega_{11}, \omega_{21}, \ldots, \omega_{K1}, \ldots, \omega_{kt}, \ldots, \omega_{1M}, \ldots, \omega_{KM})$ that maximizes the fitness

$$J(\omega) = \frac{1}{N}\sum_{i=1}^{N} I(\hat{c}_i = c_i),$$

where $c_i$ is the true emotion class of the $i$-th training sample, $\hat{c}_i$ is the emotion class predicted by the fused classifier for the $i$-th training sample, and $N$ is the number of training samples. The inputs of the particle swarm algorithm are the decision matrix $A_{K\times M}(x)$, the true emotion classes of the training samples, the particle inertia weight $\alpha$, the learning factors $\beta_1, \beta_2$, the number of particles $S$, the maximum number of iterations $Q$, and the recognition-rate threshold $\eta$; the output is the optimal weight coefficient $\omega$. The specific steps are as follows:

(1) Initialize the particles: randomly generate $S$ particles $\omega^1, \ldots, \omega^S$ in the $D$-dimensional space to form the swarm, and randomly generate the flight velocities $v^1, \ldots, v^S$ of the $S$ particles. Let $p^j$ be the best solution found so far by the $j$-th particle, and $g = (g_{11}, g_{21}, \ldots, g_{K1}, \ldots, g_{kt}, \ldots, g_{KM})$ the best solution found so far by the whole swarm;

(2) Update the particles: $v^j \leftarrow \alpha v^j + \beta_1 r_1 (p^j - \omega^j) + \beta_2 r_2 (g - \omega^j)$, $\omega^j \leftarrow \omega^j + v^j$, where $r_1, r_2$ are random numbers drawn from $[0,1]$ to increase the randomness of the search;

(3) Normalize the particles: each particle is normalized separately for each emotion class, $\omega_{kt} \leftarrow \omega_{kt} / \sum_{k=1}^{K} \omega_{kt}$, where $1 \le k \le K$, $1 \le t \le M$;

(4) Search globally: evaluate the quality of each particle with the fitness function $J(\omega)$, compare its fitness with the best position it has passed through, and update each particle's best position $p^j$ ($1 \le j \le S$), the swarm's best position $g$, and the current best recognition rate $\eta'$:

$$\begin{cases} p^j \leftarrow \omega^j, & J(p^j) < J(\omega^j) \\ g \leftarrow p^j, & J(g) < J(p^j) \\ \eta' \leftarrow J(g); \end{cases}$$

(5) Termination: stop when the iteration count $q > Q$ or $\eta' > \eta$; otherwise set $q \leftarrow q + 1$ and go to step (2).
Preferably, the particle inertia weight $\alpha$ uses the fixed value 0.5, and the learning factors take the values $\beta_1 = \beta_2 = 2$.
Further, the fusion in Step 6 above combines the recognition results of the multiple classifiers as

$$P(C_t|x) = \sum_{k=1}^{K} \omega_{kt} P_k(C_t|x), \quad t \in \{1, 2, \ldots, M\},$$

where $\omega_{kt}$ denotes the weight of the $k$-th classifier on emotion class $C_t$, with $\sum_{k=1}^{K} \omega_{kt} = 1$. The final class of sample $x$ is $c(x) = \arg\max_t P(C_t|x)$.
Compared with the prior art, the present invention has the following beneficial effects:

(1) The proposed method goes beyond the study of the relationship between a single physiological index and emotion by fusing the recognition results of two physiological signals; the two signals complement each other during emotion recognition, so different emotional states can be identified effectively;

(2) The proposed method fuses the objective and subjective decision results: heart-rate and skin-conductance signals are objective data produced as emotion changes and objectively reflect the user's emotional state, while the user's self-expression is a subjective reflection of that state;

(3) By combining multiple classifiers, the invention balances the subjective, objective, and per-signal decision results, making the final recognition more accurate and reliable.
Brief description of the drawings
Fig. 1 is a flow diagram of the method of the present invention.
Fig. 2 is a schematic diagram of the particle swarm weight optimization.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings.
The key point of the proposed physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion is that it fuses the recognition results of two physiological signals and incorporates the subjective expression result into the emotion recognition model as an auxiliary factor, going beyond studies that consider only the relationship between a single physiological signal and emotion, i.e., only objective factors. The method mainly comprises: the user interacts with the product and, after the experience, fills in a Chinese-language PAD emotion scale questionnaire according to his or her current emotional state; meanwhile, the user's heart rate and skin-conductance signals are recorded during the product experience, and both objective physiological signals are processed and their features extracted; the extracted heart-rate and skin-conductance features are each used to train and run a support vector machine (SVM) classifier, and the values of the three PAD dimensions obtained from the questionnaire are used to train the PAD-model classifier; each classifier's recognition probabilities over the target classes are obtained and normalized; each classifier is then assigned a weight, the weights are treated as a particle, and the weights are optimized with a particle swarm algorithm; finally, the three classifiers' results for the different emotion classes are fused, and the emotion class with the highest fused score is taken as the final emotional state.
As shown in Fig. 1, an embodiment of the present invention is illustrated with a user experiencing product A:

Step 1: The user handles product A (to ensure validity, the user should be unfamiliar with product A) and tries out its functions; after the experience, the user fills in a PAD emotion scale questionnaire according to the emotional state felt during the experience;

Step 2: While the user experiences product A, the user's heart rate and skin-conductance signals are recorded, and both objective physiological signals are processed and their features extracted;

Step 3: The extracted heart-rate features and skin-conductance features are each used to train and run a support vector machine (SVM) classifier; meanwhile, the values of the three PAD dimensions obtained from the user's Chinese-language PAD questionnaire are used to train the PAD-model classifier;
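The per-signal training in Step 3 can be sketched as follows. This is an illustrative sketch, not the patent's implementation: it assumes scikit-learn's `SVC` with `probability=True` as the SVM, and uses synthetic feature arrays as stand-ins for the extracted heart-rate, skin-conductance, and PAD-questionnaire features.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N, M = 120, 4                       # samples, emotion classes (happy/sad/angry/fear)
y = rng.integers(0, M, size=N)      # ground-truth emotion labels (synthetic)

def train_channel_svm(features, labels):
    """Train an SVM whose predict_proba yields P_k(C_t | x)."""
    clf = SVC(kernel="rbf", probability=True, random_state=0)
    clf.fit(features, labels)
    return clf

# One classifier per channel: heart rate, skin conductance, PAD self-report.
channels = {name: rng.normal(size=(N, 6)) + 0.5 * y[:, None]
            for name in ("heart_rate", "skin_conductance", "pad")}
classifiers = {name: train_channel_svm(X, y) for name, X in channels.items()}

# Each classifier outputs one probability per emotion class for a sample.
probs = classifiers["heart_rate"].predict_proba(channels["heart_rate"][:1])
print(probs.shape)
```

With `probability=True`, scikit-learn calibrates the SVM margins into class probabilities, which is exactly the posterior form the next step's decision matrix needs.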
Step 4: The recognition result of each classifier in Step 3 is expressed as a probability over the target emotion classes, and the per-class recognition results of each classifier are normalized (Method 1):

Training samples $x$ of the four known emotion classes (happiness 1, sadness 2, anger 3, fear 4) are used to train the different classifiers $E_k$, $k \in \{1, 2, 3\}$. Each classifier's recognition result is expressed as posterior probabilities and normalized: $E_k(x) = (P_k(C_1|x), P_k(C_2|x), P_k(C_3|x), P_k(C_4|x))$, where $P_k(C_t|x)$ denotes the probability, with values in $[0,1]$, that classifier $E_k$ assigns sample $x$ (a physiological signal or a subjective expression) to emotion class $C_t$; the $M = 4$ emotion classes are happiness 1, sadness 2, anger 3, and fear 4. The decision matrix of the multi-classifier system is therefore

$$A_{3\times 4}(x) = \begin{bmatrix} P_1(C_1|x) & P_1(C_2|x) & P_1(C_3|x) & P_1(C_4|x) \\ P_2(C_1|x) & P_2(C_2|x) & P_2(C_3|x) & P_2(C_4|x) \\ P_3(C_1|x) & P_3(C_2|x) & P_3(C_3|x) & P_3(C_4|x) \end{bmatrix}.$$
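As a concrete illustration of the decision matrix in Method 1, the following sketch assembles and row-normalizes a 3 x 4 matrix so that each row is the normalized result of one classifier. The raw scores are made up for demonstration; they are not data from the patent.

```python
import numpy as np

K, M = 3, 4  # classifiers (heart rate, skin conductance, PAD) x emotion classes
raw_scores = np.array([
    [2.0, 1.0, 0.5, 0.5],   # classifier 1 scores for (happy, sad, angry, fear)
    [1.0, 3.0, 1.0, 1.0],   # classifier 2
    [0.2, 0.4, 0.2, 0.2],   # classifier 3
])

# Normalize each classifier's row into probabilities P_k(C_t | x),
# so row k is E_k(x) = (P_k(C1|x), ..., P_k(C4|x)).
A = raw_scores / raw_scores.sum(axis=1, keepdims=True)
print(A[0].tolist())  # prints [0.5, 0.25, 0.125, 0.125]
```

Each row now sums to 1, so the matrix can be fed directly to the weight optimization of Method 2.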
Step 5: Each classifier is assigned a weight; the weights are treated as a particle and optimized with a particle swarm algorithm (Method 2).

Particle swarm weight optimization (process shown in Fig. 2): for the proposed method, we search for the optimal weight vector $\omega = (\omega_{11}, \omega_{21}, \ldots, \omega_{k1}, \ldots, \omega_{kt}, \ldots, \omega_{1M}, \ldots, \omega_{KM})$ that maximizes

$$J(\omega) = \frac{1}{N}\sum_{i=1}^{N} I(\hat{c}_i = c_i),$$

where $c_i$ is the true emotion class of the $i$-th training sample, $\hat{c}_i$ is the emotion class predicted by the fused classifier for the $i$-th training sample, and $N$ is the number of training samples.

Particle swarm inputs: the decision matrix $A_{3\times 4}(x)$; the true emotion classes of the training samples; the particle inertia weight $\alpha$ (fixed at 0.5); the learning factors $\beta_1, \beta_2$ (usually $\beta_1 = \beta_2 = 2$); the number of particles $S$; the maximum number of iterations $Q$; and the recognition-rate threshold $\eta$.

Particle swarm output: the optimal weight coefficient $\omega$.

The specific steps are as follows:

(1) Initialize the particles: randomly generate $S$ particles $\omega^1, \ldots, \omega^S$ in the $D$-dimensional space to form the swarm, and randomly generate the flight velocities $v^1, \ldots, v^S$ of the $S$ particles. Let $p^j$ be the best solution found so far by the $j$-th particle, and $g = (g_{11}, g_{21}, g_{31}, \ldots, g_{kt}, \ldots, g_{34})$ the best solution found so far by the whole swarm.

(2) Update the particles: $v^j \leftarrow \alpha v^j + \beta_1 r_1 (p^j - \omega^j) + \beta_2 r_2 (g - \omega^j)$, $\omega^j \leftarrow \omega^j + v^j$, where $r_1, r_2$ are random numbers drawn from $[0,1]$ to increase the randomness of the search.

(3) Normalize the particles: each particle is normalized separately for each emotion class, $\omega_{kt} \leftarrow \omega_{kt} / \sum_{k=1}^{3} \omega_{kt}$, where $1 \le k \le 3$, $1 \le t \le 4$.

(4) Search globally: evaluate the quality of each particle with the fitness function $J(\omega)$, compare its fitness with the best position it has passed through, and update each particle's best position $p^j$ ($1 \le j \le S$), the swarm's best position $g$, and the current best recognition rate $\eta'$:

$$\begin{cases} p^j \leftarrow \omega^j, & J(p^j) < J(\omega^j) \\ g \leftarrow p^j, & J(g) < J(p^j) \\ \eta' \leftarrow J(g). \end{cases}$$

(5) Termination: stop when the iteration count $q > Q$ or $\eta' > \eta$ ($\eta$ is set to 95%); otherwise set $q \leftarrow q + 1$ and go to step (2).
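A minimal sketch of the Method 2 weight search, under stated assumptions: the decision matrices and labels are synthetic stand-ins, the fitness is the recognition rate of the fused decision on the training samples, and weights are kept non-negative by taking absolute values after the position update (the patent does not specify a bound, so this is an illustrative choice).

```python
import numpy as np

rng = np.random.default_rng(1)
K, M, N = 3, 4, 60                    # classifiers, emotion classes, samples
S, Q = 20, 50                         # particles, max iterations
alpha, beta1, beta2 = 0.5, 2.0, 2.0   # inertia weight and learning factors
eta = 0.95                            # recognition-rate threshold

y = rng.integers(0, M, size=N)        # true emotion class per training sample
# Synthetic decision matrices A[i, k, t] = P_k(C_t | x_i), mildly biased
# toward the true class so that fusion has something to exploit.
A = rng.random((N, K, M))
A[np.arange(N), :, y] += 0.3
A /= A.sum(axis=2, keepdims=True)

def normalize(w):
    """Step (3): normalize weights over classifiers, per emotion class."""
    return w / w.sum(axis=-2, keepdims=True)

def J(w):
    """Fitness: fraction of samples whose fused argmax matches the true class."""
    fused = (w[None] * A).sum(axis=1)   # P(C_t|x_i) = sum_k w_kt * P_k(C_t|x_i)
    return float(np.mean(fused.argmax(axis=1) == y))

w = normalize(rng.random((S, K, M)))    # particle positions
v = rng.uniform(-0.1, 0.1, (S, K, M))   # particle flight velocities
p = w.copy()                            # per-particle best positions
g = p[int(np.argmax([J(x) for x in p]))].copy()  # swarm best position

for q in range(Q):
    # Step (2): velocity and position update with random r1, r2 in [0, 1].
    r1, r2 = rng.random((S, K, M)), rng.random((S, K, M))
    v = alpha * v + beta1 * r1 * (p - w) + beta2 * r2 * (g - w)
    w = normalize(np.abs(w + v))        # abs() keeps the weights non-negative
    # Step (4): update personal bests, swarm best, and best rate eta'.
    for j in range(S):
        if J(w[j]) > J(p[j]):
            p[j] = w[j]
        if J(p[j]) > J(g):
            g = p[j].copy()
    if J(g) > eta:                      # step (5): early termination
        break

print(round(J(g), 3))
```

Note that the fitness calls the fusion rule of Method 3 internally: the weights are only meaningful through the fused decision they produce.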
Step 6: The three classifiers' recognition results for the different emotion classes are fused, and the emotion class with the highest fused score is taken as the final emotional state (Method 3).

For each emotion class, the recognition results of the multiple classifiers are fused:

$$P(C_t|x) = \sum_{k=1}^{3} \omega_{kt} P_k(C_t|x), \quad t \in \{1, 2, 3, 4\},$$

where $\omega_{kt}$ denotes the weight of the $k$-th classifier on emotion class $C_t$, with $\sum_{k=1}^{3} \omega_{kt} = 1$. The final class of sample $x$ is $c(x) = \arg\max_t P(C_t|x)$.
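The fusion rule of Method 3 reduces to a weighted sum followed by an argmax. A small worked sketch with made-up probabilities and weights (not values from the patent):

```python
import numpy as np

emotions = ["happy", "sad", "angry", "fear"]

# Decision matrix A[k, t] = P_k(C_t | x) for K=3 classifiers, M=4 emotions.
A = np.array([
    [0.50, 0.20, 0.20, 0.10],   # heart-rate SVM
    [0.30, 0.40, 0.20, 0.10],   # skin-conductance SVM
    [0.60, 0.10, 0.20, 0.10],   # PAD self-report model
])
# Optimized weights w[k, t]; each column sums to 1 (normalized per emotion).
w = np.array([
    [0.4, 0.3, 0.3, 0.3],
    [0.3, 0.4, 0.4, 0.3],
    [0.3, 0.3, 0.3, 0.4],
])

fused = (w * A).sum(axis=0)             # P(C_t | x) for each emotion class t
print(emotions[int(fused.argmax())])    # prints: happy
```

Here the fused scores are (0.47, 0.25, 0.20, 0.10), so the final emotional state is "happy" even though the individual classifiers disagree on the runner-up classes.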
Claims (5)
1. A physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion, characterized by comprising the following steps:
Step 1: the user interacts with the product and, after the experience, fills in a Chinese-language PAD emotion scale questionnaire according to the emotional state felt during the experience;
Step 2: the user's heart rate and skin-conductance signals are recorded during the product experience, and both objective physiological signals are processed and their features extracted;
Step 3: the extracted heart-rate features and skin-conductance features are each used to train and run a support vector machine classifier, and the values of the three PAD dimensions obtained from the user's questionnaire are used to train the PAD-model classifier;
Step 4: the recognition result of each classifier in Step 3 is expressed as a probability over the target emotion classes, and the per-class recognition results of each classifier are normalized;
Step 5: each classifier is assigned a weight; the weights are treated as a particle and optimized with a particle swarm algorithm;
Step 6: the three classifiers' recognition results for the different emotion classes are fused, and the emotion class with the highest fused score is taken as the final emotional state.
2. The physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion according to claim 1, characterized in that the normalization in Step 4 gives $E_k(x) = (P_k(C_1|x), P_k(C_2|x), \ldots, P_k(C_M|x))$, where $k \in \{1, 2, \ldots, K\}$ and $P_k(C_t|x)$ denotes the probability, with values in $[0,1]$, that classifier $E_k$ assigns sample $x$ (a physiological signal or a subjective expression) to emotion class $C_t$; there are $M = 4$ emotion classes: happiness, sadness, anger, and fear; the decision matrix of the multi-classifier system is therefore
$$A_{K\times M}(x) = \begin{bmatrix} P_1(C_1|x) & P_1(C_2|x) & \cdots & P_1(C_M|x) \\ P_2(C_1|x) & P_2(C_2|x) & \cdots & P_2(C_M|x) \\ \vdots & \vdots & & \vdots \\ P_K(C_1|x) & P_K(C_2|x) & \cdots & P_K(C_M|x) \end{bmatrix}.$$
3. The physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion according to claim 1, characterized in that optimizing the weights with the particle swarm algorithm in Step 5 means finding the optimal weight vector $\omega = (\omega_{11}, \omega_{21}, \ldots, \omega_{K1}, \ldots, \omega_{kt}, \ldots, \omega_{1M}, \ldots, \omega_{KM})$ that maximizes $J(\omega) = \frac{1}{N}\sum_{i=1}^{N} I(\hat{c}_i = c_i)$, where $c_i$ is the true emotion class of the $i$-th training sample, $\hat{c}_i$ is the emotion class predicted by the fused classifier, and $N$ is the number of training samples; the inputs of the particle swarm algorithm are the decision matrix $A_{K\times M}(x)$, the true emotion classes of the training samples, the particle inertia weight $\alpha$, the learning factors $\beta_1, \beta_2$, the number of particles $S$, the maximum number of iterations $Q$, and the recognition-rate threshold $\eta$; the output is the optimal weight coefficient $\omega$; the specific steps are as follows:
(1) initialize the particles: randomly generate $S$ particles $\omega^1, \ldots, \omega^S$ in the $D$-dimensional space to form the swarm, and randomly generate the flight velocities $v^1, \ldots, v^S$ of the $S$ particles; let $p^j$ be the best solution found so far by the $j$-th particle, and $g = (g_{11}, g_{21}, \ldots, g_{K1}, \ldots, g_{kt}, \ldots, g_{KM})$ the best solution found so far by the whole swarm;
(2) update the particles: $v^j \leftarrow \alpha v^j + \beta_1 r_1 (p^j - \omega^j) + \beta_2 r_2 (g - \omega^j)$, $\omega^j \leftarrow \omega^j + v^j$, where $r_1, r_2$ are random numbers drawn from $[0,1]$ to increase the randomness of the search;
(3) normalize the particles: each particle is normalized separately for each emotion class, $\omega_{kt} \leftarrow \omega_{kt} / \sum_{k=1}^{K} \omega_{kt}$, where $1 \le k \le K$, $1 \le t \le M$;
(4) search globally: evaluate the quality of each particle with the fitness function $J(\omega)$, compare its fitness with the best position it has passed through, and update each particle's best position $p^j$ ($1 \le j \le S$), the swarm's best position $g$, and the current best recognition rate $\eta'$:
$$\begin{cases} p^j \leftarrow \omega^j, & J(p^j) < J(\omega^j) \\ g \leftarrow p^j, & J(g) < J(p^j) \\ \eta' \leftarrow J(g); \end{cases}$$
(5) termination: stop when the iteration count $q > Q$ or $\eta' > \eta$; otherwise set $q \leftarrow q + 1$ and go to step (2).
4. The physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion according to claim 3, characterized in that the particle inertia weight $\alpha$ uses the fixed value 0.5 and the learning factors take the values $\beta_1 = \beta_2 = 2$.
5. The physiological-signal emotion recognition method based on subjective-objective multi-classifier fusion according to claim 1, characterized in that the fusion in Step 6 combines the recognition results of the multiple classifiers as $P(C_t|x) = \sum_{k=1}^{K} \omega_{kt} P_k(C_t|x)$, $t \in \{1, 2, \ldots, M\}$, where $\omega_{kt}$ denotes the weight of the $k$-th classifier on emotion class $C_t$, with $\sum_{k=1}^{K} \omega_{kt} = 1$; the final class of sample $x$ is $c(x) = \arg\max_t P(C_t|x)$.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711077252.4A CN107944473A (en) | 2017-11-06 | 2017-11-06 | A kind of physiological signal emotion identification method based on the subjective and objective fusion of multi-categorizer |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711077252.4A CN107944473A (en) | 2017-11-06 | 2017-11-06 | A kind of physiological signal emotion identification method based on the subjective and objective fusion of multi-categorizer |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107944473A true CN107944473A (en) | 2018-04-20 |
Family
ID=61934319
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711077252.4A Pending CN107944473A (en) | 2017-11-06 | 2017-11-06 | A kind of physiological signal emotion identification method based on the subjective and objective fusion of multi-categorizer |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107944473A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103584872A (en) * | 2013-10-29 | 2014-02-19 | 燕山大学 | Psychological stress assessment method based on multi-physiological-parameter integration |
CN104856704A (en) * | 2015-03-31 | 2015-08-26 | 鲍崇智 | Method and system for objective-subjective combined psychological assessment |
CN104887198A (en) * | 2014-03-06 | 2015-09-09 | 中国科学院沈阳自动化研究所 | Pain quantitative analysis system and method based on human body physiological signal multi-parameter fusion |
CN106250855A (en) * | 2016-08-02 | 2016-12-21 | 南京邮电大学 | A kind of multi-modal emotion identification method based on Multiple Kernel Learning |
CN106803098A (en) * | 2016-12-28 | 2017-06-06 | 南京邮电大学 | A kind of three mode emotion identification methods based on voice, expression and attitude |
CN106886792A (en) * | 2017-01-22 | 2017-06-23 | 北京工业大学 | A kind of brain electricity emotion identification method that Multiple Classifiers Combination Model Based is built based on layering |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108630299A (en) * | 2018-04-27 | 2018-10-09 | 合肥工业大学 | Personality analysis method and system, storage medium based on skin resistance feature |
CN109325402A (en) * | 2018-08-06 | 2019-02-12 | 高维度(深圳)生物信息智能应用有限公司 | A kind of signal processing method, system and computer storage medium |
CN109192196A (en) * | 2018-08-22 | 2019-01-11 | 昆明理工大学 | A kind of audio frequency characteristics selection method of the SVM classifier of anti-noise |
CN109472290A (en) * | 2018-10-11 | 2019-03-15 | 南京邮电大学 | Mood swing model analysis method based on finite state machine |
CN109223005A (en) * | 2018-10-19 | 2019-01-18 | 内江天海科技有限公司 | A kind of psychological condition remote diagnosis system |
CN109259745A (en) * | 2018-10-25 | 2019-01-25 | 贵州医科大学附属医院 | A kind of wearable cardiovascular and cerebrovascular disease intelligent monitor system and method |
CN109508653A (en) * | 2018-10-26 | 2019-03-22 | 南京邮电大学 | A kind of subjective and objective individual combat Emotion identification method merged based on EEG signals with psychology |
WO2020096621A1 (en) * | 2018-11-09 | 2020-05-14 | Hewlett-Packard Development Company, L.P. | Classification of subject-independent emotion factors |
CN110084263A (en) * | 2019-03-05 | 2019-08-02 | 西北工业大学 | A kind of more frame isomeric data fusion identification methods based on trust |
CN110084263B (en) * | 2019-03-05 | 2021-04-30 | 西北工业大学 | Trust-based multi-frame heterogeneous data fusion identification method |
CN111134667A (en) * | 2020-01-19 | 2020-05-12 | 中国人民解放军战略支援部队信息工程大学 | Electroencephalogram signal-based time migration emotion recognition method and system |
CN111134667B (en) * | 2020-01-19 | 2024-01-26 | 中国人民解放军战略支援部队信息工程大学 | Time migration emotion recognition method and system based on electroencephalogram signals |
CN111738302A (en) * | 2020-05-28 | 2020-10-02 | 华南理工大学 | System for classifying and diagnosing Alzheimer disease based on multi-modal data |
CN113749656A (en) * | 2021-08-20 | 2021-12-07 | 杭州回车电子科技有限公司 | Emotion identification method and device based on multi-dimensional physiological signals |
CN113749656B (en) * | 2021-08-20 | 2023-12-26 | 杭州回车电子科技有限公司 | Emotion recognition method and device based on multidimensional physiological signals |
Similar Documents
Publication | Title |
---|---|
CN107944473A (en) | A kind of physiological signal emotion identification method based on the subjective and objective fusion of multi-categorizer |
CN108363753B (en) | Comment text emotion classification model training and emotion classification method, device and equipment |
CN106250855B (en) | Multi-core learning based multi-modal emotion recognition method |
CN106529503A (en) | Method for recognizing face emotion by using integrated convolutional neural network |
Chen et al. | K-means clustering-based kernel canonical correlation analysis for multimodal emotion recognition in human–robot interaction |
CN104217226B (en) | Conversation activity recognition method based on deep neural network and conditional random field |
CN107256392A (en) | A kind of comprehensive emotion identification method jointly using image and voice |
CN107038480A (en) | A kind of text sentiment classification method based on convolutional neural networks |
Ren et al. | Deep sequential image features on acoustic scene classification |
CN106202952A (en) | A kind of Parkinson disease diagnostic method based on machine learning |
CN107609572A (en) | Multi-modal emotion identification method and system based on neural network and transfer learning |
CN106886792A (en) | A kind of brain electricity emotion identification method that Multiple Classifiers Combination Model Based is built based on layering |
CN110516696A (en) | A kind of adaptive weighting bimodal emotion identification method based on the fusion of voice and expression |
CN106919951A (en) | A kind of weakly supervised bilinearity deep learning method based on the fusion of click and vision |
CN107247703A (en) | Microblog emotional analysis method based on convolutional neural networks and integrated study |
CN109409433B (en) | Personality recognition system and method for social network users |
CN108228569 (en) | A kind of Chinese microblog emotional analysis method based on collaborative learning under loose conditions |
CN110674483B (en) | Identity recognition method based on multi-mode information |
CN109101584A (en) | A kind of sentence classification improved method combining deep learning with mathematical analysis |
CN113887643B (en) | New dialogue intention recognition method based on pseudo tag self-training and source domain retraining |
CN107239738A (en) | A kind of sentiment analysis method merging eye movement technique and heart rate detection technology |
CN105912525A (en) | Sentiment classification method for semi-supervised learning based on theme characteristics |
CN109086794B (en) | Driving behavior pattern recognition method based on T-LDA topic model |
CN106959946A (en) | A kind of text semantic feature generation optimization method based on deep learning |
KR20210044017A (en) | Product review multidimensional analysis method and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180420 |