KR20130006030A - Construction method of classification model for emotion recognition and apparatus thereof - Google Patents

Construction method of classification model for emotion recognition and apparatus thereof Download PDF

Info

Publication number
KR20130006030A
KR20130006030A (application KR1020110067823A)
Authority
KR
South Korea
Prior art keywords
class
classification
class group
emotion
group
Prior art date
Application number
KR1020110067823A
Other languages
Korean (ko)
Inventor
이상국
오나래
Original Assignee
가톨릭대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 가톨릭대학교 산학협력단 filed Critical 가톨릭대학교 산학협력단
Priority to KR1020110067823A priority Critical patent/KR20130006030A/en
Publication of KR20130006030A publication Critical patent/KR20130006030A/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Algebra (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present invention provides a method of constructing a classification model, comprising: extracting multi-dimensional feature values for each emotion class from biosignal data corresponding to N emotion classes, where N is a natural number of two or more; grouping the N emotion classes into a first class group consisting of M emotion classes (M is a natural number smaller than N) and a second class group consisting of the remaining N-M emotion classes; mapping the first class group and the second class group into a Fisher's Space; performing binary classification by iteratively training on the mapped first class group and second class group with the AdaBoost algorithm; individually verifying the degree of classification achieved by the binary classification; and selecting, as the class groups for the current hierarchy, the first class group and second class group with the largest degree of classification.
According to the present invention, the emotion classes from which feature values have been extracted are grouped into two groups in every possible way, mapped into a Fisher space, and trained with the AdaBoost algorithm; the two groups with the largest degree of classification are then selected as the class groups for the current hierarchy to form the classification model. This makes it possible to binary-classify many kinds of emotions from biosignal data in stages, which simplifies the classification and increases emotion classification performance.

Description

Construction method of classification model for emotion recognition and apparatus

The present invention relates to a method and apparatus for constructing a classification model for emotion recognition, and more particularly, to a method and apparatus for constructing a classification model for emotion recognition that can classify various emotions step by step using biosignals.

Emotional intelligence computing means giving computers the ability to recognize and process human emotions through learning and adaptation. This enables more efficient human-computer interaction (HCI).

Methods using image information and methods using voice information have mainly been studied for recognizing human emotion. These studies present image or audio stimuli to a subject and then measure the subject's biosignals as his or her psychological state changes. Detecting the emotional state from biosignals rests on research showing that all human emotional states are expressed through biosignals and that emotional states can be distinguished using biosignals alone.

The main types of biosignals studied so far include the electrocardiogram (ECG), electroencephalogram (EEG), electromyogram (EMG), galvanic skin response (GSR), skin temperature (SKT), pulse wave / photoplethysmogram (PPG), blood volume pulse (BVP), respiration rate (RES), and heart rate (HR).

Early research on emotion recognition focused mainly on confirming that human emotions are accompanied by strong physiological responses and can be distinguished by specific patterns of physical change. This early work classified many kinds of emotions at the same time. Such simultaneous classification, however, suffers from degraded recognition performance and increased classification error because of the complexity of the data.

SUMMARY OF THE INVENTION An object of the present invention is to provide a method and apparatus for constructing a classification model for emotion recognition that can binary classify various kinds of emotions from biological signal data in stages.

The present invention provides a method of constructing a classification model, comprising: extracting multi-dimensional feature values for each emotion class from biosignal data corresponding to N emotion classes, where N is a natural number of two or more; grouping the N emotion classes into a first class group consisting of M emotion classes (M is a natural number smaller than N) and a second class group consisting of the remaining N-M emotion classes; mapping the first class group and the second class group into a Fisher's Space; performing binary classification by iteratively training on the mapped first class group and second class group with the AdaBoost algorithm; individually verifying the degree of classification achieved by the binary classification; and selecting, as the class groups for the current hierarchy, the first class group and second class group with the largest degree of classification.

Here, the class grouping step and the classification-degree verification step are performed repeatedly on each selected class group until the number of emotion classes in the selected first class group and second class group is one, so that the class groups are layered into a hierarchy.

The method of constructing a classification model for emotion recognition may further include receiving biosignal data for an arbitrary emotion and applying it to the layered class groups to classify that emotion.

The extracting of the multi-dimensional feature values may include extracting them from a plurality of biosignal data corresponding to the emotion class, and the biosignal data may include at least one of respiration rate data, skin conductivity data, blood pressure data, and electromyogram data corresponding to the emotion class.

The grouping of the N emotion classes may include grouping while varying the number or the kinds of emotion classes belonging to the first class group and the second class group, and the verifying of the degree of classification may compute the degree of classification separately for each grouping of the first class group and the second class group.

In addition, the step of individually verifying the degree of classification for the binary classification may use the jackknife (Jack-Knife) verification method.

The present invention also provides an apparatus comprising: a feature value extracting unit that extracts multi-dimensional feature values for each emotion class from biosignal data corresponding to N emotion classes (N is a natural number of two or more); a class grouping unit that groups the N emotion classes into a first class group consisting of M emotion classes (M is a natural number smaller than N) and a second class group consisting of the remaining N-M emotion classes; a Fisher space mapping unit that maps the first class group and the second class group into a Fisher's Space; a binary classification unit that performs binary classification by iteratively training on the mapped first class group and second class group with the AdaBoost algorithm; a classification degree verification unit that individually verifies the degree of classification for the binary classification; and a class group selection unit that selects the first class group and second class group with the largest degree of classification as the class groups for the current hierarchy.

According to the method and apparatus for constructing a classification model for emotion recognition according to the present invention, the emotion classes from which feature values have been extracted are grouped into two groups in every possible way, mapped into a Fisher space, and trained with the AdaBoost algorithm, after which the two groups with the largest degree of classification are selected as the class groups for the current hierarchy to form the classification model. This makes it possible to binary-classify many kinds of emotions from biosignal data in stages, which simplifies the classification and increases emotion classification performance.

FIG. 1 is a block diagram of an apparatus for constructing a classification model for emotion recognition according to an embodiment of the present invention.
FIG. 2 is a flowchart of a method of constructing a classification model for emotion recognition using the apparatus of FIG. 1.
FIG. 3 shows an example of the biosignals of the four sensors for the 'anger' emotion in an embodiment of the present invention.
FIG. 4 illustrates an embodiment in which a first class group and a second class group are mapped into a Fisher space.
FIG. 5 illustrates another embodiment in which the first class group and the second class group are mapped into a Fisher space.
FIG. 6 shows pseudocode of AdaBoost according to an embodiment of the present invention.
FIG. 7 shows an example of how the decision boundary changes with the number of AdaBoost iterations in an embodiment of the present invention.
FIG. 8 illustrates a conceptual example of a hierarchical classification model according to an embodiment of the present invention.
FIG. 9 shows the structure of a hierarchical classification model obtained without using the Fisher space, for comparison with the present invention.
FIG. 10 shows the results of a performance test according to the number of AdaBoost iterations for FIG. 9.
FIG. 11 is a block diagram of a hierarchical classification model according to an embodiment of the present invention.
FIG. 12 shows the classification rate according to the number of AdaBoost iterations and the number of Fisher space principal axes for FIG. 11.
FIG. 13 is a block diagram of a classification model for emotion classes 2, 7, and 8 built with the classification method according to an embodiment of the present invention.
FIG. 14 shows the classification rate according to the number of AdaBoost iterations and the number of Fisher space principal axes for FIG. 13.
FIG. 15 is a block diagram of a classification model for emotion classes 2, 4, and 8 built with the classification method according to an embodiment of the present invention.
FIG. 16 shows the classification rate according to the number of AdaBoost iterations and the number of Fisher space principal axes for FIG. 15.
FIG. 17 shows the results of classification experiments for the emotion classes versus the no-emotion class according to an embodiment of the present invention.
FIG. 18 compares the performance results of FIGS. 14 and 16 with conventional performance results.

DETAILED DESCRIPTION Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art may easily implement the present invention.

The present invention relates to a method and apparatus for constructing a classification model for emotion recognition that can binary classify a variety of emotions from biological signal data in stages.

FIG. 1 is a block diagram of an apparatus for constructing a classification model for emotion recognition according to an embodiment of the present invention. The apparatus 100 includes a feature value extractor 110, a class grouping unit 120, a Fisher space mapping unit 130, a binary classification unit 140, a classification degree verification unit 150, a class group selection unit 160, and an emotion classification unit 170.

FIG. 2 is a flowchart of a method of constructing a classification model for emotion recognition using the apparatus of FIG. 1. Hereinafter, the method of constructing a classification model for emotion recognition will be described in detail with reference to FIG. 2.

First, the feature value extractor 110 extracts a multi-dimensional feature value for each emotion class from biosignal data corresponding to N emotion classes (N is a natural number of 2 or more) (S210).

In this embodiment, N = 8 is used; that is, there are eight kinds of emotion classes: no emotion, anger, hate, grief, platonic love, romantic love, joy, and reverence. Hereinafter, for convenience of description, the case of N = 8 will be described as an embodiment.

The biosignal data correspond to the bodily responses exhibited for each emotion. This embodiment uses the MIT biosignal data provided by Picard et al., which were obtained by showing a subject one picture for each of eight different emotions and sampling the signals of four sensors attached to the subject at 20 Hz. The four sensors acquire four kinds of biosignal data corresponding to one emotion class: respiration rate (RES) data, skin conductivity (GSR) data, blood pressure (BVP) data, and electromyogram (EMG) data. An example of each sensor signal is shown below.

FIG. 3 shows an example of the biosignals of the four sensors for the 'anger' emotion in this embodiment. Parts (a), (b), (c), and (d) of FIG. 3 show the EMG, blood pressure, skin conductivity, and respiration rate data for the anger emotion, respectively. Each biosignal is profile data in the form of a continuous sequence.

The MIT biosignal data are divided into DATAII, in which the initially acquired data are stored as they are, and DATAI, in which each biosignal is separated and stored at regular intervals with the emotion segments distinguished. This embodiment uses the more reliable DATAI, which contains biosignal data for each emotion class acquired daily on 20 days selected from the 32 days of the experiment. Therefore, 20 data samples are included for each emotion class.

Step S210 extracts multi-dimensional feature values based on statistical techniques for each emotion class from the biosignal data. That is, a total of six statistical values, as in Equation 1 below, are extracted from one biosignal data sequence.

$$\mu_X = \frac{1}{N}\sum_{n=1}^{N} X_n \qquad (1\text{-}1)$$
$$\sigma_X = \left(\frac{1}{N-1}\sum_{n=1}^{N}\left(X_n-\mu_X\right)^{2}\right)^{1/2} \qquad (1\text{-}2)$$
$$\delta_X = \frac{1}{N-1}\sum_{n=1}^{N-1}\left|X_{n+1}-X_n\right| \qquad (1\text{-}3)$$
$$\bar{\delta}_X = \frac{1}{N-1}\sum_{n=1}^{N-1}\left|X^{*}_{n+1}-X^{*}_n\right| \qquad (1\text{-}4)$$
$$\gamma_X = \frac{1}{N-2}\sum_{n=1}^{N-2}\left|X_{n+2}-X_n\right| \qquad (1\text{-}5)$$
$$\bar{\gamma}_X = \frac{1}{N-2}\sum_{n=1}^{N-2}\left|X^{*}_{n+2}-X^{*}_n\right| \qquad (1\text{-}6)$$

Here, X means one biosignal data sequence and N means the length of X. Equation (1-1) is the mean of the biosignal data, Equation (1-2) is its standard deviation, Equations (1-3) and (1-4) are the mean absolute first differences of the raw and normalized signal, and Equations (1-5) and (1-6) are the mean absolute second differences, i.e., the temporal variation values. X* means the biosignal data normalized by Equation 2 below.

$$X^{*}_n = \frac{X_n - \mu_X}{\sigma_X}, \qquad n = 1, \dots, N \qquad (2)$$

Here, a total of four biosignal data sequences are collected for one emotion class, and six statistical values are extracted from each sequence, giving a total of 24 (= 4 × 6) statistical values per emotion class. Through this method, therefore, a 24-dimensional feature value can be extracted for each emotion class sample.

In addition, DATAI contains biosignal data for each emotion class acquired daily over 20 days, i.e., 20 data samples per emotion class. Therefore, when DATAI is used, a total of 20 24-dimensional feature values are generated per emotion class, and a total of 160 (= 20 × 8) 24-dimensional feature values across the eight emotion classes.

The above embodiment computes six statistical values per biosignal sequence; if fewer statistical values are computed, the dimensionality of the feature values is obviously lower.
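For illustration, the feature extraction of step S210 can be sketched in Python with NumPy as follows. This is a minimal sketch under the assumptions above (six statistics per signal, four signals per sample); the function names are illustrative, not part of the disclosed apparatus.

```python
import numpy as np

def statistical_features(X):
    """Six statistics of one biosignal sequence X (Equation 1): mean,
    standard deviation, and mean absolute first and second differences
    of the raw and the normalized (Equation 2) signal."""
    X = np.asarray(X, dtype=float)
    mu = X.mean()
    sigma = X.std(ddof=1)
    Xn = (X - mu) / sigma                    # Equation 2: normalized signal
    d1 = np.mean(np.abs(np.diff(X)))         # mean |X[n+1] - X[n]|
    d1n = np.mean(np.abs(np.diff(Xn)))
    d2 = np.mean(np.abs(X[2:] - X[:-2]))     # mean |X[n+2] - X[n]|
    d2n = np.mean(np.abs(Xn[2:] - Xn[:-2]))
    return np.array([mu, sigma, d1, d1n, d2, d2n])

def feature_vector(signals):
    """24-dimensional feature value for one sample: four biosignals
    (EMG, BVP, GSR, RES), six statistics each."""
    return np.concatenate([statistical_features(s) for s in signals])
```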

After step S210, the class grouping unit 120 groups the 8 (= N) emotion classes into a 'first class group' consisting of M emotion classes (M is a natural number smaller than 8) and a 'second class group' consisting of the remaining 8-M emotion classes (S220).

Step S220 groups the classes while varying the 'number of emotion classes' or the 'kinds of emotion classes' belonging to the first class group and the second class group.

First, an example of varying the number of emotion classes is as follows. Out of the total of eight emotion classes, the numbers of emotion classes in the first and second class groups can be split as '4:4', '3:5', '2:6', or '1:7'. The number of ways to divide the eight emotions into two groups of four is 8C4, and the number of ways to divide them into groups of three and five is 8C3.

An example of varying the 'kinds of emotion classes' is as follows. For convenience of explanation, the eight emotions are numbered 1 through 8. For example, when dividing the eight emotions into two groups of four, emotions '1, 3, 5, 7' can be assigned to the first class group and emotions '2, 4, 6, 8' to the second class group. Alternatively, the first class group may contain emotions '2, 4, 5, 8' and the second class group emotions '1, 3, 6, 7'.

As described above, when eight emotion classes are divided into two class groups while the number and kinds of emotion classes are varied, the number of possible groupings is clearly very large, as the enumeration sketch below illustrates.
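A split of 8 classes into two unordered, non-empty groups has 2^7 - 1 = 127 cases in total. The helper below is an illustrative sketch for enumerating them, not part of the patent.

```python
from itertools import combinations

def binary_partitions(classes):
    """Yield every split of `classes` into two non-empty groups.
    Each unordered split appears exactly once because the first
    class is always kept in group 1."""
    classes = list(classes)
    first, rest = classes[0], classes[1:]
    for r in range(len(rest)):               # group 1 sizes 1 .. len-1
        for combo in combinations(rest, r):
            group1 = [first, *combo]
            group2 = [c for c in classes if c not in group1]
            yield group1, group2

print(len(list(binary_partitions(range(1, 9)))))   # 127 splits for 8 classes
```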

After the grouping step, the Fisher space mapping unit 130 maps the first class group and the second class group into a Fisher's Space (S230).

The Fisher space is also referred to as Fisher's Projection or Fisher's Linear Discriminant. The Fisher space is constructed using the following equations.

$$m_i = \frac{1}{N_i}\sum_{x \in C_i} x \qquad (3)$$

$$m = \frac{1}{N}\sum_{i=1}^{c}\,\sum_{x \in C_i} x \qquad (4)$$

In Equations 3 and 4, C_i denotes the i-th emotion class, and N_i denotes the number of samples in class C_i; for example, with one sample acquired on each of the 20 days, N_i = 20 per emotion class. N denotes the total number of samples over all emotion classes, and c the total number of emotion classes. Thus, Equation 3 is the mean of the samples of each emotion class, and Equation 4 is the mean over all samples of all emotion classes.

$$S_B = \sum_{i=1}^{c} N_i\,(m_i - m)(m_i - m)^{T} \qquad (5)$$

$$S_W = \sum_{i=1}^{c}\,\sum_{x \in C_i} (x - m_i)(x - m_i)^{T} \qquad (6)$$

Equation 5 computes the scatter between the mean of all samples and the mean of each class and is called the between-class scatter matrix. Equation 6 computes the sum of the covariance matrices of the individual classes and is called the within-class scatter matrix.

$$W_{opt} = \arg\max_{W}\;\frac{\left|W^{T} S_B\, W\right|}{\left|W^{T} S_W\, W\right|} \qquad (7)$$

Equation 7 is the objective function for finding the Fisher space: the goal of the Fisher search is to find the W that maximizes the between-class scatter while minimizing the within-class scatter.

$$S_W^{-1} S_B\, w_k = \lambda_k\, w_k \qquad (8)$$

The Fisher space satisfying the above objective function is obtained, as in Equation 8, from the eigenvectors of the product of the inverse of the within-class scatter matrix and the between-class scatter matrix.

The eigenvectors obtained by Equation 8 are interpreted as the principal axes of the Fisher space, and the space is transformed by mapping each sample onto these axes. As many axes are obtained as the dimensionality of the input matrix, and they can be used selectively in descending order of eigenvalue magnitude.
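A compact NumPy sketch of this construction (Equations 3-8) follows. The pseudo-inverse used to guard against a singular S_W is an implementation choice assumed here, not something the patent specifies.

```python
import numpy as np

def fisher_space(X, y, n_axes=3):
    """Fisher projection axes from samples X (n x d) and labels y.
    Builds the between-class scatter S_B (Eq. 5) and within-class
    scatter S_W (Eq. 6), then eigen-decomposes inv(S_W) S_B (Eq. 8)."""
    m = X.mean(axis=0)                        # Eq. 4: overall mean
    d = X.shape[1]
    S_B = np.zeros((d, d))
    S_W = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)                  # Eq. 3: class mean
        diff = (mc - m)[:, None]
        S_B += len(Xc) * diff @ diff.T        # Eq. 5
        S_W += (Xc - mc).T @ (Xc - mc)        # Eq. 6
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)   # Eq. 8
    order = np.argsort(eigvals.real)[::-1]    # descending eigenvalues
    return eigvecs[:, order[:n_axes]].real    # project samples with X @ W
```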

FIG. 4 illustrates an embodiment in which a first class group and a second class group are mapped into a Fisher space. Here the eight emotion classes are divided into a first class group consisting of 'anger, grief, romantic love, joy' and a second class group consisting of 'no emotion, hate, platonic love, reverence', with 20 samples per emotion class. In FIG. 4, although some overlap is visible at the boundary of the two class groups, the two groups show a high degree of separation.

FIG. 5 illustrates another embodiment, in which the eight emotion classes are divided into a first class group consisting of the 'anger' emotion alone and a second class group consisting of the remaining seven emotions, again with 20 samples per emotion. The separation between the groups can be confirmed in FIG. 5. The form of this class grouping is an important factor that can determine the performance of the whole classification model.

After the two groups are mapped into the Fisher space, the binary classification unit 140 performs binary classification by iteratively training on the mapped first class group and second class group with the AdaBoost (Adaptive Boosting) algorithm (S240).

FIG. 6 shows the pseudocode of AdaBoost according to an embodiment of the present invention. In line 3, the same weight w_t is assigned to every training sample. Then, at each iteration, a weak classifier c_t is chosen that minimizes the error under the current sample weights; the weights of the training samples misclassified by c_t are increased so that they are classified correctly in the next iteration. Finally, T weak classifiers are generated, and the classification model is completed as a weighted sum of the weak classifiers.
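The steps just described correspond to the following sketch (the FIG. 6 pseudocode itself is not reproduced in this text). A depth-1 decision tree stands in for the weak classifier, which is an assumption of the sketch; the patent does not fix the weak learner.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, T=200):
    """AdaBoost per the steps of FIG. 6: labels y in {-1, +1},
    equal initial weights, T weak classifiers, weights raised on
    the samples each weak classifier gets wrong."""
    n = len(y)
    w = np.full(n, 1.0 / n)                   # equal initial weights
    models, alphas = [], []
    for _ in range(T):
        c_t = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = c_t.predict(X)
        err = np.clip(w[pred != y].sum() / w.sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(-alpha * y * pred)     # raise weight of misclassified samples
        w /= w.sum()
        models.append(c_t)
        alphas.append(alpha)
    return models, alphas

def adaboost_predict(models, alphas, X):
    """Final model: sign of the weighted sum of the weak classifiers."""
    return np.sign(sum(a * m.predict(X) for m, a in zip(models, alphas)))
```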

This AdaBoost training is performed separately for every way in which the eight emotion classes can be divided into two class groups. The AdaBoost algorithm achieves high classification performance even in a complex feature space with many kinds of emotion classes and many parameters, and can express both linear and nonlinear boundaries.

The granularity of the decision boundary is determined by the number of AdaBoost training iterations. FIG. 7 shows an example of how the decision boundary changes with the number of AdaBoost iterations in an embodiment of the present invention. It can be seen that AdaBoost refines the decision boundary as the number of iterations grows: the boundary becomes more finely subdivided and the classification performance on the training samples, i.e., the degree of classification, increases. However, when the decision boundary is subdivided beyond a certain level, over-subdivision (overfitting) may occur and the classification performance may deteriorate. It is therefore important to determine the optimal number of iterations by analyzing the training results as a function of the iteration count.

The AdaBoost algorithm builds a strong classifier by combining weak classifiers, and the weak classifiers themselves are very simple to implement. The detailed principles of AdaBoost are well known and will not be described further.

Subsequently, the classification degree verification unit 150 individually verifies the degree of classification of the binary classification for the two class groups in every possible case at the current hierarchy, based on the result of applying the AdaBoost algorithm (S250).

Here, the degree of classification means the binary classification performance obtained with AdaBoost. Step S250 uses the jackknife (Jack-Knife) verification scheme; since jackknife verification is a conventional method for measuring classification performance, a detailed description is omitted.
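Jackknife (leave-one-out) verification can be sketched as follows; `train_fn` and `predict_fn` are illustrative callbacks, not names used in the patent.

```python
import numpy as np

def jackknife_score(X, y, train_fn, predict_fn):
    """Leave-one-out estimate of the degree of classification: hold
    each sample out once, train on the rest, test on the held-out one."""
    n = len(y)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i
        model = train_fn(X[mask], y[mask])
        correct += int(predict_fn(model, X[i:i + 1])[0] == y[i])
    return correct / n
```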

After the verification, the class group selection unit 160 selects, from the individual verification results of step S250, the first class group and the second class group with the largest degree of classification (i.e., the best classification performance) as the class groups for the current hierarchy (S260). In other words, among all the ways in which the eight emotion classes can be divided into two class groups, the two class groups for which the AdaBoost classification performance is best are selected as the class groups for the current hierarchy.

These processes, from the grouping of the classes (S220) to the verification of the degree of classification (S250), are repeated on each selected class group until the number of emotion classes in the selected first class group and second class group is one, so that the class groups are layered into a hierarchy.

That is, after the class groups for the current layer are determined, it is checked whether a classification model for a lower layer is required; if so, steps S220 to S260 are executed again to divide each class group in turn. When the two class groups of every hierarchy level have been determined, construction of the classification model ends.

The overall procedure of the present invention is as follows. On the biosignals acquired for each emotion, the present invention performs feature extraction, construction of two emotion class groups, construction of the Fisher space, AdaBoost training per layer, testing of the binary classification performance, and confirmation of the two class groups per layer. In other words, feature values representing the characteristics of each signal are extracted from the biosignals, the emotion classes are divided into two groups, and a Fisher space is formed. The samples mapped into the Fisher space are trained with AdaBoost, and jackknife verification evaluates the performance of the class grouping. All class groups that can be formed within one hierarchy level are trained and verified, and the grouping showing the best performance is fixed as the grouping for that level. Each class group produced at one level is then treated as a separate sub-problem at the next level, where the group construction and training process is repeated, as sketched below.
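Putting the pieces together, the hierarchy construction of steps S220-S260 can be sketched by composing the earlier sketches. `evaluate_split` and the node-dictionary layout are illustrative assumptions, not structures defined by the patent.

```python
import numpy as np

def evaluate_split(X, y, split):
    """Degree of classification of one grouping: relabel to +/-1,
    map into the Fisher space, train AdaBoost, verify by jackknife
    (composes fisher_space, adaboost_*, jackknife_score above)."""
    g1, _ = split
    labels = np.where(np.isin(y, g1), 1, -1)
    Xf = X @ fisher_space(X, labels, n_axes=3)
    return jackknife_score(
        Xf, labels,
        train_fn=lambda Xt, yt: adaboost_train(Xt, yt, T=200),
        predict_fn=lambda m, Xs: adaboost_predict(*m, Xs))

def build_hierarchy(X, y, classes):
    """S220-S260 applied recursively: keep the best-scoring split of
    `classes`, then treat each group as a separate sub-problem."""
    if len(classes) == 1:
        return classes[0]                     # leaf: a single emotion class
    g1, g2 = max(binary_partitions(classes),
                 key=lambda split: evaluate_split(X, y, split))
    in_g1 = np.isin(y, g1)
    return {"groups": (g1, g2),
            "left": build_hierarchy(X[in_g1], y[in_g1], g1),
            "right": build_hierarchy(X[~in_g1], y[~in_g1], g2)}
```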

Hereinafter, the test results of the emotion classification model to which the embodiment of the present invention is applied will be described.

The test conditions are as follows. For class group separation, experiments are performed for every way the target classes can be separated at each step (hierarchy level). The number of AdaBoost iterations is set to 100, 200, and 300, and the number of principal axes of the Fisher space to 2 and 3, for comparison across experiments. Jackknife verification is performed on each binary classification training run and on the final combined hierarchical classification model to evaluate the classification performance. The binary classification performance of each stage is verified with 20 jackknife runs per class involved (20 × the number of classes), and the final classification performance with 160 jackknife runs.

First, Table 1 below shows the labels assigned to the emotion classes for convenience of notation.

Label   Emotion
1       No Emotion
2       Anger
3       Hate
4       Grief
5       Platonic Love
6       Romantic Love
7       Joy
8       Reverence

Referring to Table 1, a number is assigned to each emotion to simplify the notation.

FIG. 8 illustrates a conceptual example of a hierarchical classification model according to an embodiment of the present invention. First, the eight emotion classes are divided into two class groups; as described above, many forms of separation are possible. Each separation is trained with AdaBoost and goes through the jackknife verification process. The classification performance (degree of classification) is measured after training and verifying every way the eight kinds of emotions can be divided into two class groups, and the grouping showing the highest performance is fixed as the grouping for the current stage. In the example of FIG. 8, the group consisting of classes [1, 3, 5, 8] and the group consisting of classes [2, 4, 6, 7] are selected.

Each selected group then repeats the same process. In the example of FIG. 8, the [1, 3, 5, 8] group is further divided into two groups, the [1, 8] class group and the [3, 5] class group. This process is repeated until the eight emotion classes are separated into independent classes, that is, until each group contains a single class.

In the following, for comparison with the embodiment of the present invention, the performance of a hierarchical classification model constructed without mapping the 24-dimensional statistical feature values into a Fisher space is described. Table 2 shows the experimental results.

Step   Group 1   Group 2               Classification performance
1      2         1, 3, 4, 5, 6, 7, 8   90.6%
2      8         1, 3, 4, 5, 6, 7      86.4%
3      1         3, 4, 5, 6, 7         84.2%
4      3         4, 5, 6, 7            77.0%
5      6         4, 5, 7               81.3%
6      5         4, 7                  78.3%
7      4         7                     77.5%

In the highly complex space based on the 24-dimensional statistical features, the highest classification performance at each step was obtained by assigning one class to one group and all the remaining classes to the other group.

FIG. 9 shows the structure of the hierarchical classification model obtained without using the Fisher space, for comparison with the present invention. This is the structure of the classification model finally determined from the training results of each stage.

FIG. 10 shows the results of a performance test according to the number of AdaBoost iterations for FIG. 9. The final classification performance for the configuration of FIG. 9 is best at 200 AdaBoost iterations. Overall, however, the classification performance for the eight emotion classes remained at about 48-51%, indicating that classifying the emotion classes is very difficult in the 24-dimensional space of statistical features alone.

In contrast, Table 3 below shows the result of constructing the hierarchical classification model after mapping the 24-dimensional statistical feature values into the Fisher space according to an embodiment of the present invention.

Step   Group 1      Group 2      Classification performance
1      2, 4, 6, 7   1, 3, 5, 8   93.5%
2-1    2            4, 6, 7      91.3%
2-2    1, 8         3, 5         86.3%
3-1    4            6, 7         86.7%
3-2    1            8            95.0%
3-3    3            5            95.0%
4-1    6            7            90.0%

Table 3 shows the composition and verification results of the class groups at each stage. After mapping the two class groups into the Fisher space and training stage by stage, the classification performance within each stage was about 86-95%.

FIG. 11 is a block diagram of the hierarchical classification model according to an embodiment of the present invention, corresponding to Table 3; it shows that a hierarchical classification model of up to four levels is generated. Also, in all iterations, class 2 (anger) and class 8 (reverence) always show high classification performance when they belong to different groups.

When the classification model of FIG. 11 is used, the emotion classification unit 170 receives biosignal data for an arbitrary emotion and applies it to the layered class groups of FIG. 11 to classify that emotion. That is, the final classification of the input emotion is performed while moving from the upper layer down to the lower layers.
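Classification of a new sample then walks the layered model from the top. In the sketch below, each internal node is assumed to also store its trained projection and AdaBoost model under a "model" key, an illustrative extension of the earlier sketches.

```python
def classify(node, x):
    """Walk the hierarchy: at each layer the node's binary classifier
    decides which group the sample x falls into, until one class is left.
    Assumes node["model"] = (W, models, alphas) was stored at training time."""
    while isinstance(node, dict):
        W, models, alphas = node["model"]
        side = adaboost_predict(models, alphas, (x @ W).reshape(1, -1))[0]
        node = node["left"] if side > 0 else node["right"]
    return node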

Based on the hierarchical classification model determined above, the final classification performance for the eight emotion classes was tested as follows. FIG. 12 shows the classification rate according to the number of AdaBoost iterations and the number of Fisher space principal axes for FIG. 11. The best classification performance is obtained with 200 AdaBoost iterations and three Fisher space axes.

Here, the classification performance for the eight emotion classes is about 64-72%. Using three Fisher space axes gives higher performance than using two. It is also confirmed that raising the number of AdaBoost iterations beyond an appropriate level adversely affects the classification performance.

Next, the classification performance for specific emotion classes is examined. In the existing study by Picard et al., the classification performance for emotion classes 2 (anger), 7 (joy), and 8 (reverence) was the highest, at 88.3%.

FIG. 13 is a block diagram of the classification model for emotion classes 2, 7, and 8 built with the classification method according to an embodiment of the present invention, and FIG. 14 shows the classification rate according to the number of AdaBoost iterations and the number of Fisher space principal axes for FIG. 13.

For comparison with the results of FIGS. 11 and 12, experiments on emotion classes 2, 7, and 8 with the classification method according to the embodiment of the present invention produced a model with two levels of training and verification.

The best classification performance is obtained with 300 AdaBoost iterations and three Fisher space axes. Overall, the classification performance for the three specific emotions (2, 7, 8) was about 88-93%.

As another example, the existing study by Picard et al. showed excellent classification performance for emotion classes 2 (anger), 4 (grief), and 8 (reverence). For comparison, experiments on emotion classes 2, 4, and 8 with the classification method according to the embodiment of the present invention were performed as follows.

FIG. 15 is a block diagram of the classification model for emotion classes 2, 4, and 8 built with the classification method according to the embodiment of the present invention, and FIG. 16 shows the classification rate according to the number of AdaBoost iterations and the number of Fisher space principal axes for FIG. 15. The model was built in two stages of training and verification. AdaBoost showed excellent classification performance with 200 and 300 iterations and three Fisher space axes.

In the field of emotion recognition, research has also been conducted on classifying the emotional state versus the no-emotion state. The classification performance for the emotional and no-emotion states using the classification method according to the embodiment of the present invention is as follows.

FIG. 17 shows the results of the classification experiments for the emotion classes versus the no-emotion class according to the embodiment of the present invention. Following Table 1, the emotional state comprises classes 2 through 8, which are assigned to one group, and the no-emotion state corresponds to class 1.

As a result, the overall classification performance was about 91% or more, so the emotion class group and the no-emotion class can be judged to be quite well separated. Only a small number of samples in the overlapping region of the two states were in error; in particular, some samples of class 3 (hate) in the emotion group were classified as the no-emotion class.

As described above, using the Fisher space according to the embodiment of the present invention improves the classification performance by about 20%.

Table 4 compares the best eight-class classification performance reported by Picard et al. with the experimental results of FIGS. 9 and 11.

                             Picard et al.   Result of FIG. 9   Result of FIG. 11
Classification target        8 emotions      8 emotions         8 emotions
Classification performance   46.30%          50.60%             71.90%

Accordingly, when the embodiment of the present invention is applied, the eight-class emotion classification performance is about 25 percentage points higher than conventional techniques such as that of Picard et al.

FIG. 18 compares the performance results of FIGS. 14 and 16 with conventional results, for the classification problems of classes 2, 4, 8 (anger, grief, reverence) and classes 2, 7, 8 (anger, joy, reverence) that showed excellent classification performance in the results of Picard et al. The embodiment of the present invention improves the classification performance by about 5-9% over the technique of Picard et al. for these specific emotions. Thus, the present invention shows excellent performance not only in the highly complex eight-class configuration but also in the relatively low-complexity three-class configurations.

When a hierarchical classification model is constructed for each emotion using the present invention, the composition of the class groups formed at each stage can serve as a basis for inferring the similarity between emotion classes or the distribution patterns of the emotion data. Since data complexity and distribution patterns affect classification performance, grasping the similarity and distribution patterns between emotion classes suggests a way to further improve the classification performance of highly complex emotion data.

According to the present invention, the complexity of the feature space can be reduced by converting a highly complex classification problem over many emotions into binary classification problems over two emotion groups. In addition, emotion class groups can be defined stage by stage to generate individual binary classification models, and these binary classification models can be combined into a hierarchically structured multi-class emotion classification model, improving classification efficiency and performance.

In other words, the present invention groups a large set of highly complex emotion classes, forms a Fisher space, trains a classification model, and combines the generated classification models stage by stage into a hierarchical classification model. Therefore, unlike methods that classify all the emotion classes simultaneously, the present invention simplifies the classification structure through binary class grouping and classifies in stages, which improves the final classification performance over the various emotion classes. Classification of a limited number of emotion classes also shows high performance.

While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.

100: classification model composition device for emotion recognition
110: feature value extraction unit 120: class grouping unit
130: Fisher space mapping unit 140: Binary classification unit
150: classification degree verification unit 160: class group selection unit
170: emotion classification unit

Claims (12)

Extracting a multidimensional feature value for each emotion class from biosignal data corresponding to N emotion classes (N is a natural number of two or more);
Grouping the N emotion classes into a first class group consisting of M emotion classes (M is a natural number smaller than N) and a second class group consisting of N-M emotion classes;
Mapping the first class group and the second class group to a Fisher's Space, respectively;
Iteratively learning and binary classifying the mapped first class group and the second class group through an Adaboost algorithm;
Individually verifying a degree of classification for the binary classification; And
And selecting the first class group and the second class group having the largest degree of classification as the class groups for the current hierarchy.
The method according to claim 1,
A method of constructing a classification model for emotion recognition, wherein the class grouping step and the classification degree verification step are performed repeatedly on each selected class group until the number of emotion classes in the selected first class group and second class group is one, thereby layering the class groups.
The method according to claim 2,
And receiving biosignal data for an arbitrary emotion and applying it to the layered class groups to perform emotion classification for the arbitrary emotion.
The method according to claim 1,
Extracting the multi-dimensional feature value,
Extracting the multi-dimensional feature value from a plurality of biosignal data corresponding to the emotion class,
Wherein the biosignal data includes at least one of respiration rate data, skin conductivity data, blood pressure data, and electromyogram data corresponding to the emotion class.
The method according to claim 1,
Grouping the N emotion classes,
Includes grouping while varying the number or the kinds of emotion classes belonging to the first class group and the second class group,
Individually verifying the classification degree for the binary classification,
A method of constructing a classification model for emotion recognition, wherein the degree of classification is individually verified for each grouped first class group and second class group.
The method according to claim 1,
Individually verifying the classification degree for the binary classification,
A method of constructing a classification model for emotion recognition, wherein the jackknife (Jack-Knife) verification method is used.
A feature value extracting unit for extracting a multi-dimensional feature value for each emotion class from biological signal data corresponding to N emotion classes (N is a natural number of two or more);
A class grouping unit for grouping the N emotion classes into a first class group consisting of M emotion classes (M is a natural number smaller than N) and a second class group consisting of N-M emotion classes;
A Fisher space mapping unit for mapping the first class group and the second class group to a Fisher's Space, respectively;
A binary classification unit configured to perform binary classification by iteratively training on the mapped first class group and the second class group through an Adaboost algorithm;
A classification degree verification unit which individually verifies a classification degree for the binary classification; And
And a class group selector configured to select the first class group and the second class group having the largest classification degree as class groups for the current hierarchy.
The apparatus of claim 7,
An apparatus for constructing a classification model for emotion recognition, wherein the class grouping, the Fisher space mapping, the binary classification, and the classification degree verification are performed repeatedly on each selected class group until the number of emotion classes in the selected first class group and second class group is one, thereby layering the class groups.
The apparatus according to claim 8,
And an emotion classification unit configured to receive biosignal data for an arbitrary emotion and apply it to the layered class group to perform an emotion classification for the arbitrary emotion.
The apparatus of claim 7,
Extracting the multi-dimensional feature value,
Extracting the multi-dimensional feature value from a plurality of biosignal data corresponding to the emotion class,
Wherein the biosignal data includes at least one of respiration rate data, skin conductivity data, blood pressure data, and electromyogram data corresponding to the emotion class.
The apparatus of claim 7,
The class grouping unit,
Groups the classes while varying the number or the kinds of emotion classes belonging to the first class group and the second class group,
The classification degree verification unit,
An apparatus for constructing a classification model for emotion recognition, wherein the degree of classification is individually verified for each grouped first class group and second class group.
The apparatus of claim 7,
The classification degree verification unit,
An apparatus for constructing a classification model for emotion recognition, wherein the jackknife (Jack-Knife) verification method is used.
KR1020110067823A 2011-07-08 2011-07-08 Construction method of classification model for emotion recognition and apparatus thereof KR20130006030A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110067823A KR20130006030A (en) 2011-07-08 2011-07-08 Construction method of classification model for emotion recognition and apparatus thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110067823A KR20130006030A (en) 2011-07-08 2011-07-08 Construction method of classification model for emotion recognition and apparatus thereof

Publications (1)

Publication Number Publication Date
KR20130006030A true KR20130006030A (en) 2013-01-16

Family

ID=47837203

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110067823A KR20130006030A (en) 2011-07-08 2011-07-08 Construction method of classification model for emotion recognition and apparatus thereof

Country Status (1)

Country Link
KR (1) KR20130006030A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110769777A (en) * 2017-06-16 2020-02-07 阿莱恩技术有限公司 Automatic detection of tooth type and eruption status
US11996181B2 (en) 2017-06-16 2024-05-28 Align Technology, Inc. Automatic detection of tooth type and eruption status
KR20200106121A (en) * 2019-02-28 2020-09-11 한양대학교 산학협력단 Learning method and apparatus for facial expression recognition, facial expression recognition method using electromyogram data

Similar Documents

Publication Publication Date Title
CN111553295B (en) Multi-mode emotion recognition method based on self-attention mechanism
Mohammadpour et al. Classification of EEG-based emotion for BCI applications
Abbas et al. DeepMI: Deep learning for multiclass motor imagery classification
CN106778657A (en) Neonatal pain expression classification method based on convolutional neural networks
CN108596069A (en) Neonatal pain expression recognition method and system based on depth 3D residual error networks
Rejer EEG feature selection for BCI based on motor imaginary task
Li et al. EEG signal classification method based on feature priority analysis and CNN
KR102646257B1 (en) Deep Learning Method and Apparatus for Emotion Recognition based on Efficient Multimodal Feature Groups and Model Selection
Deepthi et al. An intelligent Alzheimer’s disease prediction using convolutional neural network (CNN)
Mzurikwao et al. A channel selection approach based on convolutional neural network for multi-channel EEG motor imagery decoding
Qin et al. Deep multi-scale feature fusion convolutional neural network for automatic epilepsy detection using EEG signals
CN117338313B (en) Multi-dimensional characteristic electroencephalogram signal identification method based on stacking integration technology
Bardak et al. EEG based emotion prediction with neural network models
Hamdi et al. Biomarker detection from fmri-based complete functional connectivity networks
Radha et al. Enhancing upper limb prosthetic control in amputees using non-invasive EEG and EMG signals with machine learning techniques
KR20130006030A (en) Construction method of classification model for emotion recognition and apparatus thereof
CN111783669B (en) Surface electromyogram signal classification and identification method for individual user
Dissanayake et al. DConv-LSTM-Net: A Novel Architecture for Single and 12-Lead ECG Anomaly Detection.
CN109800651B (en) Multiclass electroencephalogram classification method based on double-rule active overrun learning machine
Lu Human emotion recognition based on multi-channel EEG signals using LSTM neural network
Gurve et al. Motor Imagery Classification with Covariance Matrices and Non-Negative Matrix Factorization
Lee et al. CUR+ NMF for learning spectral features from large data matrix
Pramudita et al. EEG motor imagery signal classification using firefly support vector machine
Fouad et al. Attempts towards the first brain-computer interface system in INAYA Medical College
Hong et al. A deep learning framework based on dynamic channel selection for early classification of left and right hand motor imagery tasks

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application