CN102254189A - Method for identifying face expression based on cloud model - Google Patents

Method for identifying face expression based on cloud model

Info

Publication number
CN102254189A
Authority
CN
China
Prior art keywords
human face
cloud
face expression
image
expression image
Prior art date
Legal status
Pending
Application number
CN2011102347906A
Other languages
Chinese (zh)
Inventor
王树良 (Wang Shuliang)
池荷花 (Chi Hehua)
池莲花 (Chi Lianhua)
Current Assignee
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN2011102347906A priority Critical patent/CN102254189A/en
Publication of CN102254189A publication Critical patent/CN102254189A/en
Priority to CN201210293381.8A priority patent/CN102880855B/en
Pending legal-status Critical Current


Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a method for identifying facial expressions based on a cloud model, comprising the following steps: (1) training sample sets from the facial expression images in an image library and obtaining the cloud numerical characteristics of each group of facial expression images with a backward cloud generator; (2) reading the facial expression image to be identified into a data matrix; (3) inserting the facial expression image to be identified into any group of facial expression images A in the sample sets to obtain a new group of facial expression images A1, and obtaining the cloud numerical characteristics of the new group A1 with the backward cloud generator; and (4) comparing the cloud numerical characteristics of group A and of the new group A1, and identifying the category of the facial expression image to be identified according to the difference between the two sets of cloud numerical characteristics. The method uses the cloud model to extract the cloud numerical characteristics of facial expression images and realizes facial expression recognition by comparing and analysing the extracted characteristics.

Description

Facial expression recognition method based on a cloud model
Technical field
The present invention relates to the technical field of facial expression recognition, and in particular to a facial expression recognition method based on a cloud model.
Background technology
The cloud model is macroscopically precise yet microscopically fuzzy, and macroscopically controllable yet microscopically uncontrollable; its basic unit is the concept cloud composed of cloud drops, and its central idea is to take both randomness and fuzziness into account. It organically combines the randomness and fuzziness of natural language and establishes a mutual mapping between the qualitative and the quantitative. It not only breaks through the limitation of "hard computing" in probability statistics, but also overcomes the inherent shortcomings of the membership function that underpins fuzzy set theory and removes the limitation of the boundary region of rough sets, providing a new set of methods and techniques for handling uncertainty in data mining. As a general mathematical theory, the cloud model flexibly realizes the conversion between the qualitative and the quantitative. Its methods and techniques have been widely and successfully applied to knowledge discovery, spatial data mining, intelligent control and the effectiveness assessment of large systems, solving or explaining problems and phenomena of nature and society with remarkable results.
A cloud generator (Cloud Generator, abbreviated CG) is the generating algorithm of a cloud model. Cloud generators establish the interdependent mapping between the qualitative and the quantitative, i.e. between a concept and its numerical realizations; they mainly include the forward cloud generator, the backward (reverse) cloud generator, the X-condition cloud generator and the Y-condition cloud generator.
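For illustration only (not part of the original disclosure), a minimal Matlab sketch of the forward normal cloud generator is given below; it generates cloud drops Drop(x_i, CT(x_i)) from given numerical characteristics (Ex, En, He), and the function name forwardCloud is an assumed one.

function [x, ct] = forwardCloud(Ex, En, He, n)
% Generate n cloud drops of a normal cloud with numerical characteristics (Ex, En, He).
    Enp = En + He .* randn(n, 1);                % per-drop entropy sample En_i' ~ N(En, He^2)
    x   = Ex + Enp .* randn(n, 1);               % drop position x_i ~ N(Ex, En_i'^2)
    ct  = exp(-(x - Ex).^2 ./ (2 * Enp.^2));     % degree of certainty CT(x_i)
end

For example, [x, ct] = forwardCloud(0, 1, 0.1, 1000); produces one thousand drops of the concept (Ex = 0, En = 1, He = 0.1).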
The backward cloud generator (Backward Cloud Generator) is the uncertainty transformation model from numerical values to their linguistic value, i.e. the mapping from the quantitative to the qualitative. It effectively converts a given set of precise data into the qualitative concept represented by {Ex, En, He}, together with the cloud-drop ensemble that these data reflect. The more precise data correspond to the cloud drops, the more definite the reflected concept. Backward cloud generation is a reverse, indirect generation process: it takes as samples a group of cloud drops Drop(x_i, CT(x_i)) that obey some distribution, where x_i is the quantitative position of the i-th drop in the universe of discourse and CT(x_i) is the degree of certainty with which x_i represents the concept, and it produces the three numerical characteristics (Ex, En, He) of the qualitative concept described by the cloud model, as shown in Figure 1. Through the forward and backward cloud generators, the cloud model establishes the interdependent mapping between the qualitative and the quantitative.
The backward cloud generator takes as input the cloud drops Drop(x_1, CT(x_1)), Drop(x_2, CT(x_2)), ..., Drop(x_n, CT(x_n)) and outputs (Ex, En, He). The fitting-based backward cloud generation algorithm is as follows:
(1) Input Drop(x_1, CT(x_1)), Drop(x_2, CT(x_2)), ..., Drop(x_n, CT(x_n));
(2) Linearize the cloud expectation curve equation $CT(x)=\exp\!\left(-\frac{(x-Ex)^{2}}{2En^{2}}\right)$, take the cloud drops as observations and the expectation and entropy as the unknown parameters of the observation equations, form the error equations of data adjustment, and solve them by the indirect adjustment (least-squares) method to obtain the least-squares fitting value $\hat{Ex}$ of the expectation;
(3) Using $\hat{Ex}$ obtained in step (2), calculate an entropy sample for each drop according to $En_i' = \dfrac{|x_i-\hat{Ex}|}{\sqrt{-2\ln CT(x_i)}}$;
(4) Calculate the entropy $\hat{En} = \dfrac{1}{n}\sum_{i=1}^{n} En_i'$;
(5) Calculate the hyper-entropy $\hat{He} = \sqrt{\dfrac{1}{n-1}\sum_{i=1}^{n}\left(En_i'-\hat{En}\right)^{2}}$;
(6) Output $(\hat{Ex}, \hat{En}, \hat{He})$ from the results of steps (2), (4) and (5).
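A minimal Matlab sketch of this fitting-based backward cloud generator follows, for illustration only. Since ln CT(x) = -(x-Ex)^2/(2En^2) is quadratic in x, the sketch performs the least-squares (indirect adjustment) step as a quadratic fit of ln(ct) against x via polyfit; the function name backwardCloudFit is an assumption, and drops with CT(x_i) equal to 0 or 1 would need special handling.

function [Ex, En, He] = backwardCloudFit(x, ct)
% Recover (Ex, En, He) from cloud drops x(i) with certainty degrees ct(i) in (0,1).
    x  = x(:);  ct = ct(:);
    p  = polyfit(x, log(ct), 2);                   % least-squares fit: ln ct ~ a*x^2 + b*x + c
    Ex = -p(2) / (2 * p(1));                       % fitted expectation (vertex of the parabola)
    EnSamples = abs(x - Ex) ./ sqrt(-2 * log(ct)); % entropy sample En_i' for each drop
    En = mean(EnSamples);                          % entropy: mean of the entropy samples
    He = std(EnSamples);                           % hyper-entropy: std with the 1/(n-1) factor
end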
Facial expression recognition is a challenging problem at the intersection of biometric recognition and affective computing, and the technology has developed rapidly under the impetus of various applications. A facial expression recognition system mainly comprises the following links: facial expression image acquisition, face detection, facial expression feature extraction, and facial expression classification; its structure is shown in Figure 2. An automatic facial expression recognition system first acquires still facial expression images or image sequences. Second, it pre-processes the images, including face detection and image normalization. Third, it extracts facial features from the pre-processed facial expression images; feature extraction covers obtaining primitive features, feature dimensionality reduction and extraction, and feature decomposition. Fourth, it performs expression recognition, i.e. it classifies the expression under a certain criterion according to the extracted features.
Facial expression recognition is a hot research topic in computer vision, affective computing, image processing and related fields, and can be widely applied to human-computer interaction, multimedia production, lie detection, security, medical assistance, human behaviour science and other areas. Scholars at home and abroad have studied facial expression recognition in depth. The main research results include: analysing facial expressions with an expert system and then classifying emotions on the basis of the expressions; facial expression recognition based on facial action behaviour; and facial expression recognition based on wavelet analysis and support vector machines. At present, facial expression recognition, like other biometric technologies, still has inherent defects, which mainly show up as follows:
(1) Facial features are relatively unstable
Although the face usually does not change in essence (deliberate cosmetic surgery aside), it is a highly plastic three-dimensional flexible skin surface that changes with expression, age and so on, and the characteristics of the skin can also change greatly with age, make-up, cosmetic surgery and even accidental injury.
(2) Reliability and security are relatively low
Although the faces of different individuals differ from one another, human faces are similar on the whole, and the population of the earth is so large that the differences between many people's faces are very subtle, so achieving safe and reliable authentication technically is quite difficult.
(3) Image acquisition is strongly affected by external conditions, so recognition performance is on the low side
The imaging process means that a face image recognition system has to cope with difficult vision problems such as varying illumination, viewing angle and distance; these imaging factors greatly affect the appearance of face images and therefore make recognition performance insufficiently stable.
These shortcomings make face recognition a very challenging problem; in particular, face recognition with uncooperative users under imperfect acquisition conditions has become a current hot issue. At present, even the best face recognition systems in the world can only basically meet the requirements of general applications when the user is relatively cooperative and the acquisition conditions are fairly ideal. Of course, with the development of technology, these problems should be solved step by step, so that face recognition technology can better meet the public's expectations.
Summary of the invention
In view of the deficiencies of the prior art, the present invention provides a facial expression recognition method based on a cloud model. The method uses the cloud model to mine the knowledge implicit in facial expression images and extracts facial expression features based on the cloud model, thereby realizing facial expression recognition.
In order to solve the above technical problems, the present invention adopts the following technical scheme:
A facial expression recognition method based on a cloud model, comprising the following steps:
Step 1: train sample sets from the existing facial expression images in an image library. The training sample sets fall into two classes: the same expression of different people and different expressions of the same person, i.e. the expression class and the face class. Each class of sample sets contains several groups of facial expression images. A backward cloud generator is used to extract the cloud numerical characteristics {Ex, En, He} of every group of facial expression images in the sample sets;
Step 2: read a facial expression image to be identified into a data matrix;
Step 3: insert the facial expression image to be identified into any group of facial expression images A in the sample sets obtained in Step 1 to obtain a new group of facial expression images A1, and use the backward cloud generator to obtain the cloud numerical characteristics {Ex, En, He} of the new group A1;
Step 4: compare the cloud numerical characteristics of the group of facial expression images A with those of the new group A1 obtained in Step 3, and judge the category of the facial expression image to be identified according to the difference between the two sets of cloud numerical characteristics. If the differences of the expectation Ex and of the hyper-entropy He are each not greater than 0.34+0.0010, and the difference of the entropy En is not greater than 0.34+0.0020, the facial expression image to be identified matches the group of facial expression images A; otherwise, insert the facial expression image to be identified into another group of facial expression images A' in the sample sets obtained in Step 1, and repeat Steps 3 to 4 until a group of facial expression images matching the image to be identified is found.
The image library in Step 1 is the JAFFE database.
The backward cloud generators in Step 1 and Step 3 use a fitting-based backward cloud generation algorithm.
Using the backward cloud generator to extract the cloud numerical characteristics {Ex, En, He} of every group of facial expression images in the sample sets in Step 1 further comprises the following sub-steps:
1.1 read every group of facial expression images in the expression class and the face class into data matrices respectively;
1.2 apply the backward cloud generator to each group of data matrices obtained in sub-step 1.1 to extract image features, and obtain the cloud numerical characteristics {Ex, En, He} of the facial expression images corresponding to each group of data matrices.
In Step 2, the facial expression image to be identified is preferably denoised before being read into the data matrix.
Compared with the prior art, the method of the invention has the following advantages and beneficial effects:
1. The method studies facial expression recognition from a brand-new angle, namely the cloud model algorithm: it extracts the three cloud numerical characteristics {Ex, En, He} of facial expressions and realizes facial expression recognition on the basis of these three characteristics;
2. The fitting-based backward cloud generation algorithm fully mines the cloud numerical characteristics {Ex, En, He} of facial expression images and reveals the knowledge implicit in them, namely: Ex represents the commonality of the face images; En represents the degree to which individual knowledge deviates from the general commonality; and He represents the dispersion of that deviation. Through the statement of this knowledge, facial expressions can be analysed at a deeper level, which provides a certain theoretical basis for research on facial expression recognition;
3. The method adopts a fitting-based backward cloud generation algorithm, which is more effective for recognition than the averaging-based backward cloud generator.
Description of drawings
Fig. 1 is the input-output diagram of the backward cloud generator;
Fig. 2 is a diagram of a facial expression recognition system;
Fig. 3 is the flow chart of the method of the invention;
Fig. 4 is the facial expression image to be identified.
Embodiment
As a mathematical transformation model of uncertain knowledge, the cloud model integrates fuzziness and randomness and constitutes a mutual mapping between the qualitative and the quantitative. Facial expressions are also a kind of uncertain data, and using cloud model technology to realize facial expression recognition is the core of the present invention.
The backward cloud generation algorithm of the cloud model fully mines the cloud numerical characteristics {Ex, En, He} of facial expression images. By means of these characteristics, the present invention reveals the knowledge implicit in facial expression images: Ex represents the commonality of the face images; En represents the degree to which individual knowledge deviates from the general commonality; and He represents the dispersion of that deviation. While analysing the knowledge implicit in face images, the method extracts the cloud numerical characteristics {Ex, En, He} of the facial expression images by means of the cloud model and uses this set of characteristics to realize facial expression recognition.
The present invention is a facial expression recognition method based on a cloud model: it uses the cloud model to mine the knowledge implicit in facial expression images and extracts facial expression features based on the cloud model, thereby realizing facial expression recognition. The method is described in detail below; the specific steps are as follows:
Step 1: train sample sets from the existing facial expression images in the image library. The training sample sets fall into two classes: the same expression of different people and different expressions of the same person, i.e. the expression class and the face class. The images of one and the same expression of different people in the library form one group and constitute the expression class of the sample sets; the different expression images of one and the same person form one group and constitute the face class of the sample sets. Each class of sample sets therefore contains several groups of facial expression images. The backward cloud generator is used to extract the cloud numerical characteristics {Ex, En, He} of every group of facial expression images in the sample sets;
Using the above backward cloud generation algorithm to extract the cloud numerical characteristics {Ex, En, He} of every group of facial expression images in the sample sets further comprises the following sub-steps:
1.1 read every group of facial expression images in the expression class and the face class into data matrices respectively;
1.2 apply the backward cloud generator to each group of data matrices obtained in sub-step 1.1 to extract image features, and obtain the cloud numerical characteristics {Ex, En, He} of the facial expression images corresponding to each group of data matrices.
The backward cloud generator adopted in sub-step 1.2 takes as input a series of cloud drops Drop(x_1, CT(x_1)), Drop(x_2, CT(x_2)), ..., Drop(x_n, CT(x_n)) and outputs (Ex, En, He); it is the fitting-based backward cloud generation algorithm, specifically as follows:
(1) Input Drop(x_1, CT(x_1)), Drop(x_2, CT(x_2)), ..., Drop(x_n, CT(x_n));
(2) Linearize the cloud expectation curve equation $CT(x)=\exp\!\left(-\frac{(x-Ex)^{2}}{2En^{2}}\right)$, take the cloud drops as observations and the expectation and entropy as the unknown parameters of the observation equations, form the error equations of data adjustment, and solve them by the indirect adjustment (least-squares) method to obtain the least-squares fitting value $\hat{Ex}$ of the expectation;
(3) Using $\hat{Ex}$ obtained in step (2), calculate an entropy sample for each drop according to $En_i' = \dfrac{|x_i-\hat{Ex}|}{\sqrt{-2\ln CT(x_i)}}$;
(4) Calculate the entropy $\hat{En} = \dfrac{1}{n}\sum_{i=1}^{n} En_i'$;
(5) Calculate the hyper-entropy $\hat{He} = \sqrt{\dfrac{1}{n-1}\sum_{i=1}^{n}\left(En_i'-\hat{En}\right)^{2}}$;
(6) Output $(\hat{Ex}, \hat{En}, \hat{He})$ from the results of steps (2), (4) and (5).
Sample program statements corresponding to the above algorithm are as follows:
The sample statement that generates the expectation of the input facial expression images is:
AveImage1=SelectAveImage(Images,Num,1);
The sample statement that generates the entropy of the input facial expression images is:
stdImage1=CalculateStdImage(Images,AveImage1,Num,1,1);
The sample statement that generates the hyper-entropy of the input facial expression images is:
HeImage1=CalculateReVarianceImage(1,stdImage1,1);
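The helper routines SelectAveImage, CalculateStdImage and CalculateReVarianceImage are not defined in the patent text. Purely as an illustrative sketch of what per-pixel cloud characteristic images over a group of expression images could look like, the following Matlab function uses the certainty-degree-free backward cloud estimators (sample mean, mean absolute deviation and variance) instead of the fitting-based algorithm above; the function name, the image-stack layout and this choice of estimator are assumptions.

function [ExImg, EnImg, HeImg] = groupCloudFeatures(Images)
% Images: H x W x N stack of grayscale expression images of one group (double values).
    ExImg = mean(Images, 3);                              % per-pixel expectation image
    dev   = abs(bsxfun(@minus, Images, ExImg));           % |x_i - Ex| for every drop (image)
    EnImg = sqrt(pi/2) * mean(dev, 3);                    % per-pixel entropy image
    HeImg = sqrt(max(var(Images, 0, 3) - EnImg.^2, 0));   % per-pixel hyper-entropy image
end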
Step 2: read a facial expression image to be identified into a data matrix; the sample statement adopted in this step is [Images, Num]=ExtractMatrixFromImage(). Preferably, the facial expression image to be identified is denoised before being read into the data matrix, which improves the accuracy of facial expression recognition.
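For illustration only (ExtractMatrixFromImage is not defined in the patent, and the file name and the 3x3 mean filter used for denoising below are assumptions), reading a grayscale expression image into a data matrix and denoising it could look like:

img = double(imread('test_expression.tiff')) / 255;  % read a grayscale image into a data matrix (file name assumed)
k   = ones(3) / 9;                                   % simple 3x3 mean filter as the denoising step
imgDenoised = conv2(img, k, 'same');                 % denoised data matrix passed on to Step 3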
Step 3: insert the facial expression image to be identified into any group of facial expression images A in the sample sets obtained in Step 1 to obtain a new group of facial expression images A1, and use the backward cloud generator to obtain the cloud numerical characteristics {Ex, En, He} of the new group A1. The backward cloud generator is the same fitting-based backward cloud generator as used in Step 1.
Step 4: compare the cloud numerical characteristics of the group of facial expression images A with those of the new group A1 obtained in Step 3, and judge the category of the facial expression image to be identified according to the difference between the two sets of cloud numerical characteristics. If the differences of the expectation Ex and of the hyper-entropy He are each not greater than 0.34+0.0010, and the difference of the entropy En is not greater than 0.34+0.0020, the facial expression image to be identified matches the group of facial expression images A; otherwise, insert the facial expression image to be identified into another group of facial expression images A' in the sample sets obtained in Step 1, and repeat Steps 3 to 4 until a group of facial expression images matching the image to be identified is found.
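A hedged Matlab sketch of Steps 3 and 4 follows. It reuses the illustrative groupCloudFeatures function above, collapses each difference image to a single number with a mean absolute difference (the patent's Tonumber routine is not specified, so this scalarization is an assumption), and applies the thresholds stated in the text.

function groupName = matchExpression(testImg, groups, names)
% groups: cell array of H x W x N image stacks (the sample groups from Step 1)
% names : cell array of labels for the groups
    tEx = 0.34 + 0.0010;  tEn = 0.34 + 0.0020;  tHe = 0.34 + 0.0010;  % thresholds from Step 4
    groupName = '';
    for g = 1:numel(groups)
        A  = groups{g};
        [ExA, EnA, HeA] = groupCloudFeatures(A);          % characteristics of group A
        A1 = cat(3, A, testImg);                          % Step 3: insert the image to be identified
        [ExB, EnB, HeB] = groupCloudFeatures(A1);         % characteristics of the new group A1
        dEx = mean(abs(ExB(:) - ExA(:)));                 % scalar differences of the feature images
        dEn = mean(abs(EnB(:) - EnA(:)));
        dHe = mean(abs(HeB(:) - HeA(:)));
        if dEx <= tEx && dEn <= tEn && dHe <= tHe         % Step 4: matching criterion
            groupName = names{g};
            return;
        end
    end
end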
The application of the method is explained below with reference to an embodiment.
Most of the facial expression images used in this embodiment come from the JAFFE (Japanese Female Facial Expression) database. The JAFFE database is an open facial expression image database (http://www.kasrl.org/jaffe_download.html). It contains 10 different Japanese women, KA, KL, KM, KR, MK, NA, NM, TM, UY and YM; each of them has 7 different expressions, AN, DI, FE, HA, NE, SA and SU, and each expression has 3 or 4 sample images, for a total of 216 facial expression images.
First, train the sample sets from the facial expression images in the JAFFE database. The training sample sets fall into two classes: the same expression of different people and different expressions of the same person, i.e. the expression class and the face class. This step is as follows:
1-1. Cloud numerical characteristic extraction for different expressions of the same person
Select the images of KA in the JAFFE database as one group and read her 7 expression images (AN, DI, FE, HA, NE, SA, SU) into data matrices. With the data matrices as input, apply the fitting-based backward cloud generator to the facial expression images corresponding to this group of data matrices to extract image features, and obtain the cloud numerical characteristics {Ex, En, He} of KA's 7 expression images; the results are shown in row 1, columns 9-11 of Table 1.
Using the same method, obtain the cloud numerical characteristics {Ex, En, He} of the 7 expression images of each of the other 9 Japanese women in the JAFFE database (KL, KM, KR, MK, NA, NM, TM, UY, YM); the results are shown in columns 9-11 of rows 2 to 10 of Table 1.
1-2. Cloud numerical characteristic extraction for the same expression of different people
Select the AN expression in the JAFFE database as one group and read the AN expression images of the 10 Japanese women (KA, KL, KM, KR, MK, NA, NM, TM, UY, YM) into data matrices. With the data matrices as input, apply the fitting-based backward cloud generator to the facial expression images corresponding to this group of data matrices to extract image features, and obtain the cloud numerical characteristics {Ex, En, He} of this group of facial expression images; the results are shown in column 1, rows 12-14 of Table 1.
Using the same method, obtain the cloud numerical characteristics {Ex, En, He} of the other 6 expressions (DI, FE, HA, NE, SA, SU) of the 10 Japanese women; the results are shown in rows 12-14 of columns 2 to 7 of Table 1.
Table 1 shows the result of training the sample sets from the facial expression images in the JAFFE database. Each row means: the input is the seven different expressions of one and the same person, and the output is the cloud numerical characteristics {Ex, En, He} of this group of input images. Each column means: the input is one and the same expression of the ten people, and the output is the cloud numerical characteristics {Ex, En, He} of this group of input images.
Table 1 Facial expression sample set training
(Table 1 is reproduced as an image in the original publication.)
Next, the facial expression image to be identified is shown in Figure 4; read it into a data matrix. Before it is read into the data matrix, the facial expression image to be identified can be denoised, which makes the recognition result more accurate.
Then, insert the facial expression image to be identified into any group of facial expression images A in the sample sets obtained in Step 1 to obtain a new group of facial expression images A1, and use the fitting-based backward cloud generator to obtain the cloud numerical characteristics {Ex, En, He} of the new group A1.
Finally, in Step 4, compare the cloud numerical characteristics of the group of facial expression images A with those of the new group A1 obtained in Step 3, and judge the category of the facial expression image to be identified according to the difference between the two sets of cloud numerical characteristics. The decision method is as follows: if the differences of the expectation Ex and of the hyper-entropy He are each not greater than 0.34+0.0010, and the difference of the entropy En is not greater than 0.34+0.0020, the facial expression image to be identified matches the group of facial expression images A; otherwise, insert the facial expression image to be identified into another group of facial expression images A' in the sample sets obtained in Step 1, and repeat Steps 3 to 4 until a group of facial expression images matching the image to be identified is found.
Step 4 is further illustrated below with the two groups of facial expression images of the 7 expressions of KA and KL chosen from the facial expression sample sets shown in Table 1, as shown in Table 2.
Table 2 The two groups of original facial expression images
Denote the group of images in the first row of Table 2 as the first group of facial expression images, KA; the images of its cloud numerical characteristics {Ex, En, He} are shown in row 1, columns 9-11 of Table 1. Insert the facial expression image to be identified shown in Figure 4 into the first group KA and denote the result as the second group of facial expression images, KA1. Use the fitting-based backward cloud generator to obtain the cloud numerical characteristics {Ex, En, He} of KA1; the corresponding cloud characteristic images are shown in row 2 of Table 3. Compare the cloud numerical characteristics {Ex, En, He} of the first group KA and the second group KA1; row 4 of Table 3 shows the difference images of the cloud numerical characteristics of these two groups and their numerical differences.
Denote the group of images in the second row of Table 2 as the third group of facial expression images, KL; the images of its cloud numerical characteristics {Ex, En, He} are shown in row 2, columns 9-11 of Table 1. Insert the facial expression image to be identified shown in Figure 4 into the third group KL and denote the result as the fourth group of facial expression images, KL1. Use the fitting-based backward cloud generator to obtain the cloud numerical characteristics {Ex, En, He} of KL1; the corresponding cloud characteristic images are shown in row 3 of Table 3. Compare the cloud numerical characteristics {Ex, En, He} of the third group KL and the fourth group KL1; row 5 of Table 3 shows the difference images of the cloud numerical characteristics of these two groups and their numerical differences.
Table 3 Comparison results
(Table 3 is reproduced as an image in the original publication.)
The difference images in rows 4 and 5 of Table 3 are quantified with Matlab; the sample statement is Number=Tonumber(Images); and the results are recorded after the equals sign of the corresponding images. In this experiment the thresholds are set, according to the arithmetic precision, as follows: if the numerical difference of the expectation lies in (0, 0.34+0.0010), the difference image of the expectation is judged to meet the matching requirement, otherwise it does not match; if the numerical difference of the entropy lies in (0, 0.34+0.0020), the difference image of the entropy meets the matching requirement, otherwise it does not match; if the numerical difference of the hyper-entropy lies in (0, 0.34+0.0010), the difference image of the hyper-entropy meets the matching requirement, otherwise it does not match. The facial expression image to be identified meets the matching requirement if and only if the numerical differences of the expectation, the entropy and the hyper-entropy all meet the matching requirement.
As can be seen from rows 4 and 5 of Table 3, the numerical differences of the cloud numerical characteristics {Ex, En, He} of the first and second groups of facial expression images are respectively less than 0.34+0.0010, 0.34+0.0020 and 0.34+0.0010, whereas the numerical differences of the cloud numerical characteristics {Ex, En, He} of the third and fourth groups are respectively greater than 0.34+0.0010, 0.34+0.0020 and 0.34+0.0010. Therefore, the facial expression image to be identified matches KA better, which also agrees with the facts.
This embodiment verifies to a certain extent the feasibility of the method of the invention and shows that cloud model technology can effectively realize facial expression image recognition. The method deepens the understanding of facial expression recognition and, at the same time, further extends the application of the cloud model.

Claims (5)

1. A facial expression recognition method based on a cloud model, characterized by comprising the following steps:
Step 1: train sample sets from the existing facial expression images in an image library, the training sample sets falling into two classes: the same expression of different people and different expressions of the same person, i.e. the expression class and the face class; and use a backward cloud generator to extract the cloud numerical characteristics {Ex, En, He} of every group of facial expression images in the sample sets;
Step 2: read a facial expression image to be identified into a data matrix;
Step 3: insert the facial expression image to be identified into any group of facial expression images A in the sample sets obtained in Step 1 to obtain a new group of facial expression images A1, and use the backward cloud generator to obtain the cloud numerical characteristics {Ex, En, He} of the new group A1;
Step 4: compare the cloud numerical characteristics of the group of facial expression images A with those of the new group A1 obtained in Step 3, and judge the category of the facial expression image to be identified according to the difference between the two sets of cloud numerical characteristics: if the differences of the expectation Ex and of the hyper-entropy He are each not greater than 0.34+0.0010, and the difference of the entropy En is not greater than 0.34+0.0020, the facial expression image to be identified matches the group of facial expression images A; otherwise, insert the facial expression image to be identified into another group of facial expression images A' in the sample sets obtained in Step 1, and repeat Steps 3 to 4 until a group of facial expression images matching the image to be identified is found.
2. The facial expression recognition method based on a cloud model according to claim 1, characterized in that:
the image library in Step 1 is the JAFFE database.
3. The facial expression recognition method based on a cloud model according to claim 1 or 2, characterized in that:
the backward cloud generator uses a fitting-based backward cloud generation algorithm.
4. The facial expression recognition method based on a cloud model according to claim 1 or 2, characterized in that:
extracting the cloud numerical characteristics {Ex, En, He} of every group of facial expression images in the sample sets in Step 1 further comprises the following sub-steps:
1.1 read every group of facial expression images in the expression class and the face class into data matrices respectively;
1.2 apply the backward cloud generator to each group of data matrices obtained in sub-step 1.1 to extract image features, and obtain the cloud numerical characteristics {Ex, En, He} of the facial expression images corresponding to each group of data matrices.
5. The facial expression recognition method based on a cloud model according to claim 1 or 2, characterized in that:
in Step 2, the facial expression image to be identified is denoised before being read into the data matrix.
CN2011102347906A 2011-08-16 2011-08-16 Method for identifying face expression based on cloud model Pending CN102254189A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN2011102347906A CN102254189A (en) 2011-08-16 2011-08-16 Method for identifying face expression based on cloud model
CN201210293381.8A CN102880855B (en) 2011-08-16 2012-08-16 Cloud-model-based facial expression recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011102347906A CN102254189A (en) 2011-08-16 2011-08-16 Method for identifying face expression based on cloud model

Publications (1)

Publication Number Publication Date
CN102254189A true CN102254189A (en) 2011-11-23

Family

ID=44981441

Family Applications (2)

Application Number Title Priority Date Filing Date
CN2011102347906A Pending CN102254189A (en) 2011-08-16 2011-08-16 Method for identifying face expression based on cloud model
CN201210293381.8A Active CN102880855B (en) 2011-08-16 2012-08-16 Cloud-model-based facial expression recognition method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201210293381.8A Active CN102880855B (en) 2011-08-16 2012-08-16 Cloud-model-based facial expression recognition method

Country Status (1)

Country Link
CN (2) CN102254189A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324947A (en) * 2012-03-19 2013-09-25 联想(北京)有限公司 Certification method and authentication method
CN105160318A (en) * 2015-08-31 2015-12-16 北京旷视科技有限公司 Facial expression based lie detection method and system
CN105389821A (en) * 2015-11-20 2016-03-09 重庆邮电大学 Medical image segmentation method based on combination of cloud module and image segmentation
CN106101541A (en) * 2016-06-29 2016-11-09 捷开通讯(深圳)有限公司 A kind of terminal, photographing device and image pickup method based on personage's emotion thereof
CN108288048A (en) * 2018-02-09 2018-07-17 中国矿业大学 Based on the facial emotions identification feature selection method for improving brainstorming optimization algorithm
CN110263755A (en) * 2019-06-28 2019-09-20 上海鹰瞳医疗科技有限公司 Eye fundus image identification model training method, eye fundus image recognition methods and equipment
CN110619364A (en) * 2019-09-18 2019-12-27 哈尔滨理工大学 Wavelet neural network three-dimensional model classification method based on cloud model

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104077579B (en) * 2014-07-14 2017-07-04 上海工程技术大学 Facial expression recognition method based on expert system
CN106056054B (en) * 2016-05-24 2019-08-09 青岛海信移动通信技术股份有限公司 A kind of method and terminal carrying out fingerprint recognition
CN107799120A (en) * 2017-11-10 2018-03-13 北京康力优蓝机器人科技有限公司 Service robot identifies awakening method and device
CN109816893B (en) * 2019-01-23 2022-11-04 深圳壹账通智能科技有限公司 Information transmission method, information transmission device, server, and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010074786A2 (en) * 2008-12-04 2010-07-01 Total Immersion Software, Inc. System and methods for dynamically injecting expression information into an animated facial mesh
CN101777131B (en) * 2010-02-05 2012-05-09 西安电子科技大学 Method and device for identifying human face through double models
CN101872424B (en) * 2010-07-01 2013-03-27 重庆大学 Facial expression recognizing method based on Gabor transform optimal channel blur fusion

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324947A (en) * 2012-03-19 2013-09-25 联想(北京)有限公司 Certification method and authentication method
CN105160318A (en) * 2015-08-31 2015-12-16 北京旷视科技有限公司 Facial expression based lie detection method and system
CN105160318B (en) * 2015-08-31 2018-11-09 北京旷视科技有限公司 Lie detecting method based on facial expression and system
CN105389821A (en) * 2015-11-20 2016-03-09 重庆邮电大学 Medical image segmentation method based on combination of cloud module and image segmentation
CN105389821B (en) * 2015-11-20 2018-02-27 重庆邮电大学 It is a kind of that the medical image cutting method being combined is cut based on cloud model and figure
CN106101541A (en) * 2016-06-29 2016-11-09 捷开通讯(深圳)有限公司 A kind of terminal, photographing device and image pickup method based on personage's emotion thereof
CN108288048A (en) * 2018-02-09 2018-07-17 中国矿业大学 Based on the facial emotions identification feature selection method for improving brainstorming optimization algorithm
CN108288048B (en) * 2018-02-09 2021-11-23 中国矿业大学 Facial emotion recognition feature selection method based on improved brainstorming optimization algorithm
CN110263755A (en) * 2019-06-28 2019-09-20 上海鹰瞳医疗科技有限公司 Eye fundus image identification model training method, eye fundus image recognition methods and equipment
CN110263755B (en) * 2019-06-28 2021-04-27 上海鹰瞳医疗科技有限公司 Eye ground image recognition model training method, eye ground image recognition method and eye ground image recognition device
US11893831B2 (en) 2019-06-28 2024-02-06 Shanghai Eaglevision Medical Technology Co., Ltd. Identity information processing method and device based on fundus image
CN110619364A (en) * 2019-09-18 2019-12-27 哈尔滨理工大学 Wavelet neural network three-dimensional model classification method based on cloud model

Also Published As

Publication number Publication date
CN102880855A (en) 2013-01-16
CN102880855B (en) 2015-01-28


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20111123