CN109840513B - Face micro-expression recognition method and recognition device - Google Patents


Info

Publication number
CN109840513B
Authority
CN
China
Prior art keywords
expression
test
AUs
sequence
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910149809.3A
Other languages
Chinese (zh)
Other versions
CN109840513A (en)
Inventor
支瑞聪
李婷婷
刘梦祎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology Beijing USTB
Original Assignee
University of Science and Technology Beijing USTB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology Beijing USTB filed Critical University of Science and Technology Beijing USTB
Priority to CN201910149809.3A priority Critical patent/CN109840513B/en
Publication of CN109840513A publication Critical patent/CN109840513A/en
Application granted granted Critical
Publication of CN109840513B publication Critical patent/CN109840513B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a method and a device for recognizing facial micro-expressions, which can improve expression recognition accuracy while reducing the amount of computation. The method comprises the following steps: determining the associations between AUs in an expression database and the associations between AUs and expressions, where an AU denotes a facial action unit; obtaining an AU template sequence for each expression according to the determined associations between AUs and expressions in the expression database; acquiring the AU sequence of a test sample; and calculating the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample. The present invention relates to the field of image processing and pattern recognition.

Description

Face micro-expression recognition method and recognition device
Technical Field
The invention relates to the field of image processing and pattern recognition, and in particular to a face micro-expression recognition method and recognition device.
Background
With the growing demand for automatic emotion recognition, facial expressions, as an important mode of human emotional expression, are receiving more and more attention. The psychologists Ekman and Friesen developed the Facial Action Coding System (FACS), which describes the state of a face as a combination of facial Action Units (AUs), i.e., movements of individual facial muscles. Automatic AU detection greatly helps to identify facial expressions, for example those of depression patients, and has wide applications in human-computer interaction, online learning, market research, multimedia and mental health.
According to FACS, the human face has 43 muscles in total, which can form some 10,000 facial states, at least 3,000 of which carry a specific emotion. The facial action coding system established by Ekman and Friesen divides the face into a number of mutually independent action units. Defined on the skeletal and physical structure of the face, these action units can describe facial expressions effectively: although appearance differs between individuals, the physical architecture of the face is similar.
In the prior art, robust methods for mapping AUs to emotions remain largely unexplored; only a few techniques based on deterministic association rules map AUs to emotion classes to identify facial expressions.
Disclosure of Invention
The invention aims to provide a face micro-expression recognition method and recognition device that recognize facial expressions based on the associations between facial Action Units (AUs) and between AUs and expressions.
In order to solve the above technical problem, an embodiment of the present invention provides a face micro expression recognition method, including:
determining an association between AUs in an expression database and an association between an AU and an expression, wherein the AU represents a facial action unit;
obtaining an AU template sequence of each expression according to the determined correlation between AUs and expressions in the expression database;
acquiring an AU sequence of a test sample;
and calculating the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample.
Further, the determining of the associations between AUs in the expression database and between AUs and expressions includes:
counting AU labels in the expression database, and determining the relevance between AUs in the expression database through a discrimination coefficient;
and carrying out statistics on the AU labels and the expression labels in the expression database, and determining the relevance between the AU and the expressions through the discrimination coefficient.
Further, the counting of the AU tags in the expression database and the determining of the association between AUs in the expression database by the discrimination coefficient includes:
determining a correlation coefficient between any two AUs using a first discrimination coefficient formula, wherein the first discrimination coefficient formula is expressed as:
P(X_j | X_i) = P(X_j X_i) / P(X_i)

where X_i and X_j represent AU_i and AU_j respectively, AU_i and AU_j denote facial action units i and j, P(X_j | X_i) is the probability that AU_j occurs given that AU_i has occurred, P(X_i) is the probability that AU_i occurs, and P(X_j X_i) is the probability that AU_i and AU_j occur together;

P(X_i) = H_i / S

P(X_j X_i) = A_i / S

where S is the total number of images in the expression database used to count the associations between AUs, H_i is the number of image samples in the database carrying label AU_i, and A_i is the number of image samples carrying both labels AU_i and AU_j.
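As an illustration, the first discrimination coefficient can be computed directly from the database's binary AU annotations. Below is a minimal sketch assuming the annotations are given as an S x L 0/1 matrix (one row per image, one column per AU type); the function and variable names are illustrative, not part of the patent.

```python
import numpy as np

def au_correlation_matrix(au_labels: np.ndarray) -> np.ndarray:
    """First discrimination coefficient: r[i, j] = P(X_j | X_i).

    au_labels: (S, L) binary matrix; au_labels[s, i] == 1 if the s-th of
    the S database images carries label AU_i.
    """
    S = au_labels.shape[0]
    H = au_labels.sum(axis=0)      # H_i: images containing AU_i
    A = au_labels.T @ au_labels    # A[i, j]: images containing both AU_i and AU_j
    P_i = H / S                    # P(X_i) = H_i / S
    P_ji = A / S                   # P(X_j X_i) = A[i, j] / S
    # P(X_j | X_i) = P(X_j X_i) / P(X_i), guarding against AUs that never occur.
    return np.where(P_i[:, None] > 0, P_ji / np.maximum(P_i[:, None], 1e-12), 0.0)
```

An entry r[i, j] close to 1 means that AU_j almost always accompanies AU_i, which is the kind of association visualized in the matrices of fig. 2 and fig. 3.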
Further, the counting of the AU tags and the expression tags in the expression database, and the determining of the association between the AU and the expression through the discrimination coefficient includes:
obtaining a correlation coefficient between any AU and an expression by using a second discrimination coefficient formula, wherein the second discrimination coefficient formula is expressed as:
P(X_j | Y_i) = P(X_j Y_i) / P(Y_i)

where Y_i represents expression T_i, X_j represents AU_j, P(X_j | Y_i) is the probability that AU_j occurs given that expression T_i has occurred, P(Y_i) is the probability that expression T_i occurs, and P(X_j Y_i) is the probability that T_i and AU_j occur together;

P(Y_i) = F_i / N

P(X_j Y_i) = G_i / N

where N is the total number of images in the expression database used to count the associations between AUs and expressions, F_i is the number of image samples in the database carrying label T_i, and G_i is the number of image samples carrying both labels T_i and AU_j.
Further, the obtaining of the AU template sequence of each expression according to the determined association between the AU and the expression in the expression database includes:
sorting the AUs by their relevance to each expression from high to low, and taking a preset number of the most relevant AUs as the sequence length to obtain the AU template sequence of each expression.
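To make the template construction concrete, the sketch below derives the second discrimination coefficient from AU and expression annotations and keeps the J most relevant AUs per expression. It continues the assumptions of the previous sketch (binary AU matrix; one integer expression label per image), and all names are again illustrative.

```python
import numpy as np

def au_expression_correlation(au_labels: np.ndarray, expr_labels: np.ndarray,
                              n_expr: int) -> np.ndarray:
    """Second discrimination coefficient: c[t, j] = P(X_j | Y_t) = G_tj / F_t."""
    c = np.zeros((n_expr, au_labels.shape[1]))
    for t in range(n_expr):
        mask = expr_labels == t                # images labelled with expression T_t
        F_t = mask.sum()                       # F_t: number of such images
        if F_t > 0:
            G_t = au_labels[mask].sum(axis=0)  # G_tj: images with both T_t and AU_j
            c[t] = G_t / F_t                   # (G_tj / N) / (F_t / N)
    return c

def build_templates(c: np.ndarray, J: int = 5) -> list:
    """AU template sequence of each expression: its J most relevant AUs,
    sorted by relevance from high to low."""
    return [[int(a) for a in np.argsort(-c[t])[:J]] for t in range(c.shape[0])]
```

The default J = 5 mirrors the preset sequence length used in the embodiment below.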
Further, the test sample AU sequence is defined as Test_AU = ∪{AU_k}, k ∈ {1, 2, …, L}, where k is the AU type subscript, L is the number of AU types, and the length of Test_AU is K; the template sequence of expression T_i is Temp_AU^(i) = ∪{AU_j}, i = 1, 2, 3, … R, j ∈ {1, 2, …, L}, where j is the AU type subscript, R is the number of expression types, and the length of Temp_AU^(i) is J;
the calculating of the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and the taking of the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample, comprise the following steps:
a11, determining whether the length K of the test sample AU sequence Test_AU and the length J of the AU template sequence Temp_AU^(i) of the i-th expression are the same;
a12, if they are the same, i.e., K equals J, taking the Temp_AU^(i) that shares the largest number of AUs with Test_AU as the optimal solution; if the optimal solution is unique, the expression to which the corresponding AU template sequence belongs is the expression type of the test sample; if there is more than one optimal solution, computing the similarity between the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of each such expression, and taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample.
Further, the similarity Sim_i between the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of the i-th expression is expressed as:

Sim_i = Σ_{p=1}^{K} Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)

where Test_AU_p represents an element of Test_AU, Temp_AU_q^(i) represents an element of the AU template sequence Temp_AU^(i) of the i-th expression, and P(· | ·) is the correlation coefficient between the two AUs given by the first discrimination coefficient formula;
the obtaining of the expression type to which the AU template sequence corresponding to the maximum similarity belongs as the expression type of the test sample includes:
obtaining the expression template sequence with the highest similarity as the optimal matching template, namely

i* = argmax_{1≤i≤R} Sim_i

and taking the expression type to which the optimal matching template Temp_AU^(i*) belongs as the expression type of the test sample.
Further, the method further comprises:
if they are different and K > J, for each AU label Test_AU_p in the test sample AU sequence Test_AU, calculating the correlation index

S_p = Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)

obtaining the element Test_AU_p with the smallest value of S_p and deleting that Test_AU_p, until Test_AU and Temp_AU^(i) have the same length, and then continuing with step a12.
Further, the method further comprises:
if they are different and K < J, when calculating the similarity between Test_AU and Temp_AU^(i), randomly selecting AU elements from the AU template sequence Temp_AU^(i) of the i-th expression and adding them to the test sample AU sequence Test_AU, so that Test_AU and Temp_AU^(i) have the same length, and then continuing with step a12.
The embodiment of the invention also provides a face micro-expression recognition device, which comprises:
a first determination module for determining a correlation between AUs in an expression database and a correlation between an AU and an expression, wherein the AU represents a facial action unit;
the second determining module is used for obtaining an AU template sequence of each expression according to the determined association between the AU and the expression in the expression database;
the acquisition module is used for acquiring an AU sequence of the test sample;
and the third determining module is used for calculating the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and for taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample.
The technical scheme of the invention has the following beneficial effects:
In this scheme, the associations between AUs in the expression database and between AUs and expressions are determined, so that the relationship between facial features and expressions is expressed more comprehensively and accurately. An AU template sequence is obtained for each expression from the determined associations between AUs and expressions, which fixes the templates for the subsequent AU-based expression recognition and thereby simplifies the recognition process. The AU sequence of a test sample is acquired, the similarity between it and the AU template sequence of each expression is calculated by an adaptive common subsequence matching method according to the determined associations between AUs, and the expression type to which the AU template sequence with the maximum similarity belongs is taken as the expression type of the test sample, thereby realizing automatic expression recognition. In this way, recognition accuracy can be improved while the amount of computation is reduced.
Drawings
Fig. 1 is a schematic flow chart of a human face micro-expression recognition method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of the association matrix and association graph between AUs in DISFA according to an embodiment of the present invention;
fig. 3 is a schematic diagram of the association matrix and association graph between AUs in BP4D according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a facial micro-expression recognition device according to an embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
Example one
As shown in fig. 1, the method for recognizing a micro expression of a human face according to an embodiment of the present invention includes:
s101, determining the relevance between AUs in an expression database and the relevance between AUs and expressions, wherein AUs represent facial action units;
s102, obtaining an AU template sequence of each expression according to the determined correlation between AUs and expressions in the expression database;
s103, acquiring an AU sequence of the test sample;
and S104, calculating the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample.
The face micro-expression recognition method of the embodiment of the invention determines the associations between AUs in the expression database and between AUs and expressions, so that the relationship between facial features and expressions is expressed more comprehensively and accurately. It obtains an AU template sequence for each expression from the determined associations between AUs and expressions, which fixes the templates for the subsequent AU-based expression recognition and thereby simplifies the recognition process. It then acquires the AU sequence of a test sample, calculates the similarity between that sequence and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs, and takes the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample, thereby realizing automatic expression recognition. In this way, recognition accuracy can be improved while the amount of computation is reduced.
In an embodiment of the foregoing facial micro expression recognition method, further, the determining of the associations between AUs in the expression database and between AUs and expressions includes:
counting AU labels in the expression database, and determining the relevance between AUs in the expression database through a discriminant coefficient (discriminant power);
and carrying out statistics on the AU labels and the expression labels in the expression database, and determining the relevance between the AU and the expressions through the discrimination coefficient.
In this embodiment, the associations between AUs and between AUs and expressions in the expression database are mined through the discrimination coefficient, so that the relationship between facial features and expressions can be expressed more comprehensively and accurately.
In an embodiment of the foregoing method for recognizing facial micro expressions, further, the counting of AU labels in the expression database and the determining of the associations between AUs in the expression database through the discrimination coefficient includes:
determining a correlation coefficient between any two AUs using a first discrimination coefficient formula, wherein the first discrimination coefficient formula is expressed as:
P(X_j | X_i) = P(X_j X_i) / P(X_i)

where X_i and X_j represent AU_i and AU_j respectively, AU_i and AU_j denote facial action units i and j, P(X_j | X_i) is the probability that AU_j occurs given that AU_i has occurred, P(X_i) is the probability that AU_i occurs, and P(X_j X_i) is the probability that AU_i and AU_j occur together;

P(X_i) = H_i / S

P(X_j X_i) = A_i / S

where S is the total number of images in the expression database used to count the associations between AUs, H_i is the number of image samples in the database carrying label AU_i, and A_i is the number of image samples carrying both labels AU_i and AU_j.
In this embodiment, the conditional occurrence probability between any two AUs is counted using the conditional probability formula above, and the obtained conditional probability represents the association between the two AUs.
In this embodiment, the expression database may be the DISFA expression database or the BP4D expression database. The DISFA expression database contains 12 labeled AUs, namely AU1 (inner brow raised), AU2 (outer brow raised), AU4 (brows lowered and drawn together), AU5 (upper eyelid raised), AU6 (cheeks raised), AU9 (nose wrinkled), AU12 (lip corners pulled up), AU15 (lip corners pulled down), AU17 (lower lip pushed up), AU20 (mouth corners stretched), AU25 (lips parted) and AU26 (jaw relaxed and parted). The BP4D expression database contains the 12 labeled AUs AU1 (inner brow raised), AU2 (outer brow raised), AU4 (brows lowered and drawn together), AU6 (cheeks raised), AU7 (inner ring of the orbicularis oculi contracted), AU10 (upper lip raised), AU12 (lip corners pulled up), AU14 (mouth corners tightened), AU15 (lip corners pulled down), AU17 (lower lip pushed up), AU23 (lips tightened) and AU24 (lips pressed together).
In this embodiment, statistics are compiled on the AU labels of the image samples in the DISFA and BP4D expression databases, and the discrimination coefficient is used to obtain the association matrix and association graph between AUs: fig. 2 shows the association matrix and association graph between AUs in DISFA, and fig. 3 shows those for BP4D.
In a specific implementation manner of the foregoing facial micro expression recognition method, further, the counting of the AU labels and expression labels in the expression database and the determining of the association between AUs and expressions through the discrimination coefficient includes:
obtaining a correlation coefficient between any AU and an expression by using a second discrimination coefficient formula, wherein the second discrimination coefficient formula is expressed as:
P(X_j | Y_i) = P(X_j Y_i) / P(Y_i)

where Y_i represents expression T_i, X_j represents AU_j, P(X_j | Y_i) is the probability that AU_j occurs given that expression T_i has occurred, P(Y_i) is the probability that expression T_i occurs, and P(X_j Y_i) is the probability that T_i and AU_j occur together;

P(Y_i) = F_i / N

P(X_j Y_i) = G_i / N

where N is the total number of images in the expression database used to count the associations between AUs and expressions, F_i is the number of image samples in the database carrying label T_i, and G_i is the number of image samples carrying both labels T_i and AU_j.
In this embodiment, taking the BP4D expression database as an example, statistics are compiled on the AU labels and expression labels in the BP4D expression database, and the association matrix between AUs and expressions is obtained with the discrimination coefficient. The BP4D expression database contains the 12 labeled AUs AU1, AU2, AU4, AU6, AU7, AU10, AU12, AU14, AU15, AU17, AU23 and AU24, and 8 expression labels (also called expression types) corresponding to the tasks used to trigger the different expressions: T1 (happy), T2 (sad), T3 (surprised or startled), T4 (embarrassed), T5 (afraid or nervous), T6 (pain), T7 (angry or upset) and T8 (disgust).
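For readability, the BP4D labels used in this embodiment can be kept in simple lookup tables; the sketch below merely restates the descriptions given above in Python form.

```python
# BP4D action units and expression labels as described in this embodiment.
BP4D_AUS = {
    1: "inner brow raised", 2: "outer brow raised", 4: "brows lowered and drawn together",
    6: "cheeks raised", 7: "inner ring of the orbicularis oculi contracted",
    10: "upper lip raised", 12: "lip corners pulled up", 14: "mouth corners tightened",
    15: "lip corners pulled down", 17: "lower lip pushed up",
    23: "lips tightened", 24: "lips pressed together",
}
BP4D_EXPRESSIONS = {
    1: "happy", 2: "sad", 3: "surprised or startled", 4: "embarrassed",
    5: "afraid or nervous", 6: "pain", 7: "angry or upset", 8: "disgust",
}
```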
In this embodiment, the conditional probability between any AU and an expression is counted using the conditional probability formula above, and the obtained conditional probability represents the association between the expression and the occurrence of the AU.
In a specific implementation manner of the foregoing facial micro expression recognition method, further, the obtaining of the AU template sequence of each expression according to the determined association between AUs and expressions in the expression database includes:
sorting the AUs by their relevance to each expression from high to low, and taking a preset number of the most relevant AUs as the sequence length to obtain the AU template sequence of each expression.
In this embodiment, the conditional probabilities between AUs and expressions represent their associations. The AUs are sorted by this relevance from high to low, and a preset number (J = 5) of the AUs with the largest conditional probabilities are selected as the sequence to obtain the AU template sequence of each expression; this fixes the templates for the subsequent AU-based expression recognition and simplifies the recognition process.
In an embodiment of the foregoing face micro expression recognition method, further, the test sample AU sequence is defined as Test_AU = ∪{AU_k}, k ∈ {1, 2, …, L}, where k is the AU type subscript, L is the number of AU types, and the length of Test_AU is K; the template sequence of expression T_i is Temp_AU^(i) = ∪{AU_j}, i = 1, 2, 3, … R, j ∈ {1, 2, …, L}, where j is the AU type subscript, R is the number of expression types, and the length of Temp_AU^(i) is J;
the calculating of the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and the taking of the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample, comprise the following steps:
a11, determining whether the length K of the test sample AU sequence Test_AU and the length J of the AU template sequence Temp_AU^(i) of the i-th expression are the same;
a12, if they are the same, i.e., K equals J, taking the Temp_AU^(i) that shares the largest number of AUs with Test_AU as the optimal solution; if the optimal solution is unique, the expression to which the corresponding AU template sequence belongs is the expression type of the test sample; if there is more than one optimal solution, computing the similarity between the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of each such expression, and taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample.
In an embodiment of the above-mentioned method, further, the similarity Sim_i between the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of the i-th expression is expressed as:

Sim_i = Σ_{p=1}^{K} Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)

where Test_AU_p represents an element of Test_AU, Temp_AU_q^(i) represents an element of the AU template sequence Temp_AU^(i) of the i-th expression, and P(· | ·) is the correlation coefficient between the two AUs given by the first discrimination coefficient formula;
the obtaining of the expression type to which the AU template sequence corresponding to the maximum similarity belongs as the expression type of the test sample includes:
obtaining the expression template sequence with the highest similarity as the optimal matching template, namely

i* = argmax_{1≤i≤R} Sim_i

and taking the expression type to which the optimal matching template Temp_AU^(i*) belongs as the expression type of the test sample.
In a specific implementation manner of the foregoing human face micro expression recognition method, the method further includes:
if they are different and K > J, for each AU label Test_AU_p in the test sample AU sequence Test_AU, calculating the correlation index

S_p = Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)

obtaining the element Test_AU_p with the smallest value of S_p and deleting that Test_AU_p, until Test_AU and Temp_AU^(i) have the same length, and then continuing with step a12.
In a specific implementation manner of the foregoing human face micro expression recognition method, the method further includes:
if they are different and K < J, when calculating the similarity between Test_AU and Temp_AU^(i), randomly selecting AU elements from the AU template sequence Temp_AU^(i) of the i-th expression and adding them to the test sample AU sequence Test_AU, so that Test_AU and Temp_AU^(i) have the same length, and then continuing with step a12.
In this embodiment, for the test sample, AU detection is performed on the image data using computer vision techniques, and the AU labels obtained from the detection form the AU sequence of the test sample.
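As a sketch of this step, assume a hypothetical AU detector that returns one confidence score per AU type for the test image; thresholding those scores yields the test AU sequence. The function name and the threshold value are illustrative assumptions, not specified by the patent.

```python
def test_au_sequence(au_scores, threshold=0.5):
    """Form Test_AU from per-AU detector confidences: every AU type whose
    score clears the threshold contributes its label to the sequence."""
    return [k for k, score in enumerate(au_scores) if score >= threshold]
```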
In this embodiment, the AU sequence of the test sample is compared with the obtained AU template sequence of each expression, the similarity between the two is calculated, and the expression type of the test sample is determined as the expression type to which the AU template sequence with the maximum similarity belongs. Specifically:
First, assume the test sample AU sequence is Test_AU = ∪{AU_k}, k ∈ {1, 2, …, L}, of length K, and the template sequence of expression T_i is Temp_AU^(i) = ∪{AU_j}, i = 1, 2, 3, … 8, of length J.
Then, it is judged whether the lengths of the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of the i-th expression are the same;
if K > J, redundant labels are deleted from the test sample AU sequence. Specifically, for each AU label Test_AU_p in the test sample AU sequence Test_AU, the correlation index

S_p = Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)

is calculated, and the element Test_AU_p with the smallest value of S_p is deleted, until Test_AU and Temp_AU^(i) have the same length;
if K < J, the test sample AU sequence is extended: when calculating the similarity between Test_AU and Temp_AU^(i), AU elements are randomly selected from the AU template sequence Temp_AU^(i) of the i-th expression and added to the test sample AU sequence Test_AU, so that Test_AU and Temp_AU^(i) have the same length;
if the lengths are the same, namely K equals J, the Temp_AU^(i) that shares the largest number of AUs with Test_AU is taken as the optimal solution. If the optimal solution is unique, the expression to which the corresponding AU template sequence belongs is the expression type of the test sample; if there is more than one optimal solution, the similarity between the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of the i-th expression is computed as:

Sim_i = Σ_{p=1}^{K} Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)
The expression template sequence with the highest similarity is then obtained as the optimal matching template, namely

i* = argmax_{1≤i≤R} Sim_i

and the expression type to which the optimal matching template Temp_AU^(i*) belongs is taken as the expression type of the test sample.
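Putting the pieces together, the following sketch implements the matching procedure just described, using the AU-AU correlation matrix from the first discrimination coefficient (au_correlation_matrix above) and the templates from build_templates. The tie-breaking similarity is taken here as the sum of pairwise correlation coefficients, consistent with the correlation index S_p; this is an illustrative reading of the procedure, not a verbatim implementation.

```python
import numpy as np

def match_expression(test_au: list, templates: list, r: np.ndarray,
                     rng=np.random) -> int:
    """Adaptive common subsequence matching (steps a11/a12).

    test_au:   AU type indices detected in the test sample (Test_AU).
    templates: R template sequences Temp_AU^(i), each of length J.
    r:         (L, L) matrix of AU-AU correlation coefficients P(X_j | X_i).
    Returns the index of the best-matching expression.
    """
    scores = []
    for temp in templates:
        seq = list(test_au)
        # K > J: repeatedly delete the test label least correlated with the template.
        while len(seq) > len(temp):
            s = [sum(r[p, q] for q in temp) for p in seq]  # correlation index S_p
            seq.pop(int(np.argmin(s)))
        # K < J: pad the test sequence with randomly selected template AUs.
        while len(seq) < len(temp):
            seq.append(int(rng.choice(temp)))
        common = len(set(seq) & set(temp))             # number of shared AUs
        sim = sum(r[p, q] for p in seq for q in temp)  # tie-break similarity Sim_i
        scores.append((common, sim))
    # The template sharing the most AUs wins; ties go to the larger similarity.
    return max(range(len(templates)), key=lambda i: scores[i])
```

With r = au_correlation_matrix(au_labels) and templates = build_templates(c), calling match_expression(test_au_sequence(au_scores), templates, r) returns the predicted expression index for a test sample.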
In this embodiment, the expression type of a test sample is determined by calculating the similarity between its AU sequence and the AU template sequence of each expression, which reduces the amount of calculation and improves the accuracy of expression recognition.
Example two
The present invention also provides a specific embodiment of a facial micro-expression recognition device corresponding to the specific embodiment of the facial micro-expression recognition method. Since the device achieves the object of the present invention by executing the flow steps of the above method embodiment, the explanations given for the method embodiment also apply to the device embodiment provided below and will not be repeated in detail here.
As shown in fig. 4, an embodiment of the present invention further provides a device for recognizing a micro expression of a human face, including:
a first determination module 11, configured to determine the associations between AUs in the expression database and between AUs and expressions, where an AU represents a facial action unit;
the second determining module 12 is configured to obtain an AU template sequence of each expression according to the determined association between an AU and an expression in the expression database;
an obtaining module 13, configured to obtain an AU sequence of a test sample;
and a third determining module 14, configured to calculate the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and to take the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample.
The facial micro-expression recognition device of the embodiment of the invention determines the associations between AUs in the expression database and between AUs and expressions, so that the relationship between facial features and expressions is expressed more comprehensively and accurately. It obtains an AU template sequence for each expression from the determined associations between AUs and expressions, which fixes the templates for the subsequent AU-based expression recognition and thereby simplifies the recognition process. It then acquires the AU sequence of a test sample, calculates the similarity between that sequence and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs, and takes the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample, thereby realizing automatic expression recognition. In this way, recognition accuracy can be improved while the amount of computation is reduced.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (3)

1. A face micro-expression recognition method is characterized by comprising the following steps:
determining an association between AUs in an expression database and an association between an AU and an expression, wherein the AU represents a facial action unit;
obtaining an AU template sequence of each expression according to the determined correlation between AUs and expressions in the expression database;
acquiring an AU sequence of a test sample;
calculating the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample;
wherein the determining of the association between AUs and the association between an AU and an expression in the expression database comprises:
counting AU labels in the expression database, and determining the relevance between AUs in the expression database through a discrimination coefficient;
counting AU labels and expression labels in the expression database, and determining the relevance between AU and expressions through a discrimination coefficient;
wherein, the counting AU labels in the expression database, and determining the relevance between AUs in the expression database through the discrimination coefficient includes:
determining a correlation coefficient between any two AUs using a first discrimination coefficient formula, wherein the first discrimination coefficient formula is expressed as:
P(X_j | X_i) = P(X_j X_i) / P(X_i)

where X_i and X_j represent AU_i and AU_j respectively, AU_i and AU_j denote facial action units i and j, P(X_j | X_i) is the probability that AU_j occurs given that AU_i has occurred, P(X_i) is the probability that AU_i occurs, and P(X_j X_i) is the probability that AU_i and AU_j occur together;

P(X_i) = H_i / S

P(X_j X_i) = A_i / S

where S is the total number of images in the expression database used to count the associations between AUs, H_i is the number of image samples in the database carrying label AU_i, and A_i is the number of image samples carrying both labels AU_i and AU_j;
wherein the test sample AU sequence is defined as Test_AU = ∪{AU_k}, k ∈ {1, 2, …, L}, where k is the AU type subscript, L is the number of AU types, and the length of Test_AU is K; the template sequence of expression T_i is Temp_AU^(i) = ∪{AU_j}, i = 1, 2, 3, … R, j ∈ {1, 2, …, L}, where j is the AU type subscript, R is the number of expression types, and the length of Temp_AU^(i) is J;
the calculating of the similarity between the AU sequence of the test sample and the AU template sequence of each expression by an adaptive common subsequence matching method according to the determined associations between AUs in the expression database, and the taking of the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample, comprise the following steps:
a11, determining whether the length K of the test sample AU sequence Test_AU and the length J of the AU template sequence Temp_AU^(i) of the i-th expression are the same;
a12, if they are the same, i.e., K equals J, taking the Temp_AU^(i) that shares the largest number of AUs with Test_AU as the optimal solution; if the optimal solution is unique, the expression to which the corresponding AU template sequence belongs is the expression type of the test sample; if there is more than one optimal solution, computing the similarity between the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of each such expression, and taking the expression type to which the AU template sequence with the maximum similarity belongs as the expression type of the test sample;
wherein the similarity Sim_i between the test sample AU sequence Test_AU and the AU template sequence Temp_AU^(i) of the i-th expression is expressed as:

Sim_i = Σ_{p=1}^{K} Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)

where Test_AU_p represents an element of Test_AU, Temp_AU_q^(i) represents an element of the AU template sequence Temp_AU^(i) of the i-th expression, and P(· | ·) is the correlation coefficient between the two AUs given by the first discrimination coefficient formula;
the obtaining of the expression type to which the AU template sequence corresponding to the maximum similarity belongs as the expression type of the test sample includes:
obtaining the expression template sequence with the highest similarity as the optimal matching template, namely

i* = argmax_{1≤i≤R} Sim_i

and taking the expression type to which the optimal matching template Temp_AU^(i*) belongs as the expression type of the test sample;
wherein the method further comprises:
if they are different and K > J, for each AU label Test_AU_p in the test sample AU sequence Test_AU, calculating the correlation index

S_p = Σ_{q=1}^{J} P(Temp_AU_q^(i) | Test_AU_p)

obtaining the element Test_AU_p with the smallest value of S_p and deleting that Test_AU_p, until Test_AU and Temp_AU^(i) have the same length, and then continuing with step a12;
wherein the method further comprises:
if they are different and K < J, when calculating the similarity between Test_AU and Temp_AU^(i), randomly selecting AU elements from the AU template sequence Temp_AU^(i) of the i-th expression and adding them to the test sample AU sequence Test_AU, so that Test_AU and Temp_AU^(i) have the same length, and then continuing with step a12.
2. The method of claim 1, wherein the counting of the AU tags and the expression tags in the expression database and the determining of the association between the AU and the expression by the discrimination coefficient comprises:
obtaining a correlation coefficient between any AU and an expression by using a second discrimination coefficient formula, wherein the second discrimination coefficient formula is expressed as:
P(X_j | Y_i) = P(X_j Y_i) / P(Y_i)

where Y_i represents expression T_i, X_j represents AU_j, P(X_j | Y_i) is the probability that AU_j occurs given that expression T_i has occurred, P(Y_i) is the probability that expression T_i occurs, and P(X_j Y_i) is the probability that T_i and AU_j occur together;

P(Y_i) = F_i / N

P(X_j Y_i) = G_i / N

where N is the total number of images in the expression database used to count the associations between AUs and expressions, F_i is the number of image samples in the database carrying label T_i, and G_i is the number of image samples carrying both labels T_i and AU_j.
3. The method of claim 1, wherein the obtaining of the AU template sequence of each expression according to the determined association between the AU and the expression in the expression database comprises:
sorting the AUs by their relevance to each expression from high to low, and taking a preset number of the most relevant AUs as the sequence length to obtain the AU template sequence of each expression.
CN201910149809.3A 2019-02-28 2019-02-28 Face micro-expression recognition method and recognition device Active CN109840513B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910149809.3A CN109840513B (en) 2019-02-28 2019-02-28 Face micro-expression recognition method and recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910149809.3A CN109840513B (en) 2019-02-28 2019-02-28 Face micro-expression recognition method and recognition device

Publications (2)

Publication Number Publication Date
CN109840513A CN109840513A (en) 2019-06-04
CN109840513B true CN109840513B (en) 2020-12-01

Family

ID=66885066

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910149809.3A Active CN109840513B (en) 2019-02-28 2019-02-28 Face micro-expression recognition method and recognition device

Country Status (1)

Country Link
CN (1) CN109840513B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110427802A (en) * 2019-06-18 2019-11-08 平安科技(深圳)有限公司 AU detection method, device, electronic equipment and storage medium
CN113158788B (en) * 2021-03-12 2024-03-08 中国平安人寿保险股份有限公司 Facial expression recognition method and device, terminal equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170040693A (en) * 2015-10-05 2017-04-13 (주)감성과학연구센터 Method for extracting Emotional Expression information based on Action Unit

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012053867A1 (en) * 2010-10-21 2012-04-26 Samsung Electronics Co., Ltd. Method and apparatus for recognizing an emotion of an individual based on facial action units
CN103065122A (en) * 2012-12-21 2013-04-24 西北工业大学 Facial expression recognition method based on facial motion unit combination features
CN106169073A (en) * 2016-07-11 2016-11-30 北京科技大学 A kind of expression recognition method and system
CN107194347A (en) * 2017-05-19 2017-09-22 深圳市唯特视科技有限公司 A kind of method that micro- expression detection is carried out based on Facial Action Coding System
CN107862292B (en) * 2017-11-15 2019-04-12 平安科技(深圳)有限公司 Personage's mood analysis method, device and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170040693A (en) * 2015-10-05 2017-04-13 (주)감성과학연구센터 Method for extracting Emotional Expression information based on Action Unit

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
AU-inspired Deep Networks for Facial Expression Feature Learning; Mengyi Liu et al.; Neurocomputing; 2015-07-02; vol. 159; pp. 126-136 *
Expression-assisted facial action unit recognition under incomplete AU annotation; Shangfei Wang et al.; Pattern Recognition; 2017-01-31; vol. 61; pp. 78-91 *

Also Published As

Publication number Publication date
CN109840513A (en) 2019-06-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant