CN100461204C - Method for recognizing facial expression based on 2D partial least square method - Google Patents


Info

Publication number
CN100461204C
CN100461204C CNB200710019405XA CN200710019405A CN 100461204 C
Authority
CN
China
Prior art keywords
sub-block
LBP
expression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CNB200710019405XA
Other languages
Chinese (zh)
Other versions
CN101004791A (en)
Inventor
孙宁
吴倩
冀贞海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CNB200710019405XA priority Critical patent/CN100461204C/en
Publication of CN101004791A publication Critical patent/CN101004791A/en
Application granted granted Critical
Publication of CN100461204C publication Critical patent/CN100461204C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

A method for recognizing facial expressions based on the two-dimensional partial least squares (2DPLS) method. The training sample images are sorted into seven expression classes, and each image is divided into a number of equal-sized sub-blocks. The texture features of each sub-block are extracted with the LBP operator to form a local texture feature matrix. Statistical features are then extracted from these matrices with the 2DPLS method to form template data for the seven expression classes, completing the training process. A new input image is assigned the expression of the nearest template.

Description

A facial expression recognition method based on the two-dimensional partial least squares method
Technical field
The present invention relates to a facial expression recognition method, and in particular to an image local-feature extraction method based on the two-dimensional partial least squares (2DPLS) method.
Background technology
The local binary pattern (Local Binary Pattern, LBP) operator works on a fixed 3 × 3 rectangular block, corresponding to 9 gray values in total. The 8 surrounding gray values are compared with the center gray value: a position greater than or equal to the center gray value is represented by 1, otherwise by 0. The 8 binary values, read in clockwise order, form the feature value of the 3 × 3 block, and the final code is the LBP value at that point. This operator can measure and extract the texture information of a local neighborhood in a grayscale image, and has therefore been widely applied in fields such as texture classification, image retrieval, and face image analysis.
Traditional LBP-based facial expression recognition methods fall roughly into two kinds. The first divides the sample image into several sub-blocks, extracts texture features from each sub-block with the LBP operator, concatenates the LBP features of all sub-blocks into one feature vector characterizing the image, and finally obtains the classification result with a simple nearest-neighbor classifier. This method extracts features quickly and has low computational complexity, but its defects are also obvious: first, directly concatenating the per-sub-block LBP features to characterize the sample image destroys the structural information each sub-block carries in the original image; second, classifying the raw LBP texture features with a nearest-neighbor classifier is too coarse, making a satisfactory recognition rate hard to achieve. The second kind uses sub-blocks of variable size together with learning algorithms such as AdaBoost: during training, the AdaBoost algorithm automatically selects the sub-blocks most critical for expression recognition and combines them into a final strong classifier. Thanks to this state-of-the-art classification algorithm, the method achieves a very high recognition rate, but training AdaBoost is extremely time-consuming: training one robust classifier takes at least several days and sometimes several weeks, which severely restricts the widespread use of such methods.
Summary of the invention
The present invention overcomes the above defects by developing a facial expression recognition method based on the two-dimensional partial least squares method.
Technical scheme of the present invention is:
A facial expression recognition method based on the two-dimensional partial least squares method, whose main technical steps are as follows:
a) classify the training sample image set into seven kinds of expressions, namely anger, disgust, fear, happiness, neutral, sadness, and surprise;
b) divide each sample image into m equal-sized sub-blocks;
c) extract the texture features of each sub-block with the LBP operator;
d) arrange the LBP texture features extracted from each sample image into a local texture feature matrix;
e) adopt an adaptive weighting mechanism, i.e. assign a weight to each of the m sub-blocks. The weights are computed as follows: average all neutral expression images in the training set to obtain a neutral average face and extract its local texture feature matrix $A = [a_1, \dots, a_m]$; then extract the corresponding local texture feature matrix $H^i = [h_1^i, \dots, h_m^i]$ for every expressive image; finally use the chi-square distance to compute the difference between the LBP histograms of each expressive image and of the neutral average face on each sub-block, and take the accumulated difference on a sub-block as that sub-block's adaptive weight:

$$e_j = \sum_i \chi^2(a_j, h_j^i), \quad j = 1, \dots, m$$

where $\chi^2$ denotes the chi-square distance;
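The adaptive weight computation of step e) can be sketched directly. The following is an illustrative NumPy version, not the patent's own code: function and variable names are hypothetical, and the small epsilon in the denominator is an implementation choice to avoid division by zero, not part of the original formula.

```python
import numpy as np

def chi_square(s, m, eps=1e-10):
    """Chi-square distance between two histograms; eps avoids 0/0."""
    s, m = np.asarray(s, dtype=float), np.asarray(m, dtype=float)
    return float(np.sum((s - m) ** 2 / (s + m + eps)))

def subblock_weights(A, expressive):
    """Adaptive weight e_j for each of the m sub-blocks.

    A          : k x m matrix whose column j is the LBP histogram of
                 sub-block j of the neutral average face.
    expressive : iterable of k x m matrices H^i, one per expressive image.
    """
    m = A.shape[1]
    return np.array([sum(chi_square(A[:, j], H[:, j]) for H in expressive)
                     for j in range(m)])
```

For example, with one expressive image whose sub-block histograms are the neutral ones permuted, every sub-block receives the same weight.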
f) extract statistical features from the local texture feature matrices of all sample images with the two-dimensional partial least squares method, forming the template data of the seven expression classes and completing the training process;
g) when a new image is input, apply steps b) to f) to extract the texture and statistical features of the new image, then measure them against the template data of the seven expressions formed in step f); the nearest template gives the expression attribution of the input image.
The advantage and effect of the present invention are that it extends the traditional partial least squares method to a two-dimensional partial least squares method (Two-dimensional Partial Least Squares, hereinafter 2DPLS) and combines it with the local binary pattern operator for local expression feature extraction, thereby proposing an improved image local-feature extraction method.
Specifically:
One. The present invention divides the face image into several sub-blocks, extracts the texture features within each sub-block with the LBP operator, and combines the texture features obtained from all sub-blocks of an image into one local texture feature matrix that characterizes the face image. This construction not only extracts the local texture features of the face image effectively but also keeps the spatial information of each part within the whole. At the same time, an adaptive weighting mechanism is designed from the difference between the LBP features of corresponding sub-blocks in the expressive images and the neutral face image, making the weight obtained for each sub-block more objective and accurate than manual setting.
Two. The present invention adopts a more accurate yet computationally simple dimensionality reduction method to extract discriminant information from the local texture feature matrices. Since the face image is converted into a local texture feature matrix, extending the conventional PLS method to 2DPLS makes it possible to handle samples in matrix form. In the 2DPLS method the construction of the class membership matrix is modified accordingly: the scalar that represents a sample in the classical method is expanded into a diagonal matrix so that it matches the matrix form of the samples, and the weights obtained by the adaptive weighting method express the differing importance of the information contained in different sub-blocks. At the same time, for the singularity of the class membership matrix, an analytic generalized inverse of its covariance matrix is derived.
Three. The present invention is clearly better than the traditional second kind of LBP method in training time. With a training set of 70 images it finishes the training process in only 4 seconds, whereas the traditional second kind of LBP method needs several hours, while the recognition rates of the two differ little.
Description of drawings
Fig. 1---the basic LBP operator of the present invention.
Fig. 2---the extended LBP operators of scales (8, 2) and (16, 2) in the present invention.
Fig. 3---schematic diagram of the 2DPLS-based local image feature extraction method of the present invention.
Fig. 4---expression recognition rates obtained under different sub-block divisions in the present invention.
Fig. 5---expression recognition accuracies obtained in the person-dependent experiment.
Fig. 6---expression recognition accuracies obtained in the person-independent experiment.
Embodiment
One. First establish the two-dimensional partial least squares method
1.1 The partial least squares method (Partial Least Squares, PLS)
Let two sets of zero-mean random variables be given: $\{r_1, r_2, \dots, r_n\}$, $r_i \in R^p$, and $\{s_1, s_2, \dots, s_n\}$, $s_i \in R^q$. The goal of the PLS method is to find a pair of projection directions (weight vectors) α and β such that the projections $\tilde{r} = \alpha^T r$ and $\tilde{s} = \beta^T s$ satisfy:

1) $\tilde{r}$ and $\tilde{s}$ each contain as much of the variation of their respective variables as possible, i.e. $\mathrm{Var}(\tilde{r}) \to \max$ and $\mathrm{Var}(\tilde{s}) \to \max$;

2) the correlation between $\tilde{r}$ and $\tilde{s}$ reaches its maximum, i.e. $\rho(\tilde{r}, \tilde{s}) \to \max$.

Combining the two constraints 1) and 2), the PLS method finds the best projection directions by maximizing the covariance of $\tilde{r}$ and $\tilde{s}$, that is:

$$\arg\max_{\alpha^T\alpha=1,\,\beta^T\beta=1} \{\mathrm{var}(\alpha^T r)\,\rho(\alpha^T r, \beta^T s)^2\,\mathrm{var}(\beta^T s)\} = \arg\max_{\alpha^T\alpha=1,\,\beta^T\beta=1} \{[\mathrm{cov}(\alpha^T r, \beta^T s)]^2\} \qquad (2)$$
From the modeling process of PLS above it can be seen that constraints 1) and 2) embody the ideas of principal component analysis (PCA) and canonical correlation analysis (CCA) respectively, so the PLS method can be regarded as a CCA carried out after PCA in the r and s spaces. Considering that in pattern classification problems s is an artificially constructed class membership variable, formula (2) is revised accordingly to:

$$\arg\max_{\alpha^T\alpha=1,\,\beta^T\beta=1} \left\{ \frac{[\mathrm{cov}(\alpha^T r, \beta^T s)]^2}{\mathrm{var}(\beta^T s)} \right\} \qquad (3)$$
The above is a constrained extreme-value problem that can be solved by the method of Lagrange multipliers. The calculation yields the following eigenvalue equations:

$$\Sigma_{rs}\Sigma_{ss}^{-1}\Sigma_{sr}\,\alpha = \lambda\alpha, \qquad \Sigma_{sr}\Sigma_{rr}^{-1}\Sigma_{rs}\,\beta = \lambda\beta \qquad (4)$$

where $\Sigma_{rr}$ and $\Sigma_{ss}$ are the auto-covariance matrices of the random variables r and s, and $\Sigma_{rs} = \Sigma_{sr}^T$ is their cross-covariance matrix.
Clearly, projection directions satisfying constraints 1) and 2) are obtained by solving the eigenvalue equations (4). Note that formula (4) yields α and β separately, but for pattern recognition problems only α is significant.
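As a concrete illustration, the eigenvalue problem (4) for α can be solved numerically. The sketch below is a non-authoritative NumPy illustration with hypothetical function and variable names; a pseudo-inverse stands in for $\Sigma_{ss}^{-1}$ to tolerate a singular $\Sigma_{ss}$, an implementation choice not prescribed by the text.

```python
import numpy as np

def pls_direction(R, S):
    """Leading projection direction alpha of the eigenvalue equation (4).

    R : p x N array, columns are the zero-mean samples r_i.
    S : q x N array, columns are the zero-mean samples s_i.
    Returns the unit eigenvector of S_rs S_ss^{-1} S_sr with the
    largest eigenvalue.
    """
    sigma_rs = R @ S.T                     # cross-covariance (unnormalized)
    sigma_ss = S @ S.T                     # auto-covariance of s
    # pseudo-inverse guards against a singular sigma_ss
    m = sigma_rs @ np.linalg.pinv(sigma_ss) @ sigma_rs.T
    # m is symmetric positive semi-definite; symmetrize for stability
    vals, vecs = np.linalg.eigh((m + m.T) / 2)
    return vecs[:, np.argmax(vals)]
```

The eigenvector's sign is arbitrary, as usual for eigenvalue problems.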
1.2 The two-dimensional partial least squares method (Two-dimensional PLS, 2DPLS)
The biggest difference between the two-dimensional partial least squares method (2DPLS) and the conventional PLS method is that sample images are characterized in matrix form. Let $X = [x_{11}, x_{12}, \dots, x_{1n_1}, x_{21}, x_{22}, \dots, x_{2n_2}, \dots, x_{Cn_C}]$ be the sample data, where each $x_{ij}$ ($i = 1, \dots, C$; $j = 1, \dots, n_i$) is a matrix of size h × l and $n_i$ ($i = 1, \dots, C$) is the number of samples in class i, so the sample set contains $N = n_1 + n_2 + \dots + n_C$ sample images in total. The sample mean is therefore $\bar{X} = (1/N)\sum_{i=1}^{C}\sum_{j=1}^{n_i} x_{ij}$. For the pattern classification problem, the sample data set serves as one group of random variables of the 2DPLS method, called the sample data matrix; the other group of random variables must be constructed artificially.
Next, another group of random variables, called the class membership matrix, is constructed from the class information of the sample images. In the traditional CCA method, the class membership matrix usually has two equivalent constructions of the following form:

$$S = \begin{bmatrix} \mathbf{1}_{n_1} & \mathbf{0}_{n_2} & \cdots & \mathbf{0}_{n_C} \\ \mathbf{0}_{n_1} & \mathbf{1}_{n_2} & \cdots & \mathbf{0}_{n_C} \\ \vdots & \vdots & \ddots & \vdots \\ \mathbf{0}_{n_1} & \mathbf{0}_{n_2} & \cdots & \mathbf{1}_{n_C} \end{bmatrix} \qquad (5)$$

where $\mathbf{1}_{n_i}$ is an $n_i \times 1$ column vector of all ones, indicating the $n_i$ samples contained in class i, and $\mathbf{0}_{n_i}$ is an $n_i \times 1$ column vector of all zeros, indicating that samples of the other classes are not contained in class i. In other words, formula (5) is a class membership table: when a sample belongs to some class, a 1 is placed at the corresponding position of the class membership matrix, and 0 elsewhere. The class membership matrix so constructed clearly shows the attachment of each sample to each class.
For the 2DPLS method, since samples are characterized by matrices rather than scalars, the class membership matrix is likewise modified on the basis of formula (5) according to the form of the sample data matrix X:

$$Y = \begin{bmatrix} P_1 & \mathbf{0} & \cdots & \mathbf{0} \\ \mathbf{0} & P_2 & \cdots & \mathbf{0} \\ \vdots & \vdots & \ddots & \vdots \\ \mathbf{0} & \mathbf{0} & \cdots & P_C \end{bmatrix} \qquad (6)$$

where $P_i$ represents the samples belonging to class i. Each sample now corresponds not to the number 1 but to an identity matrix I of size l × l. $P_i$ can therefore be expressed in terms of I as:

$$P_i = [\,\underbrace{I, I, \cdots, I}_{n_i}\,] \qquad (7)$$

Such a construction lets the class membership matrix not only express the subordination of each sample to each class but also keep the spatial information of the sample data. To obtain the mean of the class membership matrix in the two-dimensional sense, Y is rewritten as $Y = [y_{11}, \dots, y_{1n_1}, \dots, y_{Cn_C}]$, where each $y_{ij}$ is an $(l \times C) \times l$ matrix. The mean of the class membership matrix is then $\bar{Y} = (1/N)\sum_{i=1}^{C}\sum_{j=1}^{n_i} y_{ij}$.
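To make the block structure of the modified class membership matrix concrete, the sketch below is an illustrative NumPy construction with hypothetical names, not the patent's own code. It builds Y from a list of class labels, placing one l × l identity block per sample in the row band of its class; the optional weights argument anticipates replacing the identity diagonal with the adaptive sub-block weights described later.

```python
import numpy as np

def class_membership_matrix(labels, l, weights=None):
    """(l*C) x (l*N) class membership matrix in block form.

    labels  : class label of each of the N samples, in sample order.
    l       : side length of the per-sample identity block.
    weights : optional length-l diagonal replacing the identity block.
    """
    classes = sorted(set(labels))
    block = np.eye(l) if weights is None else np.diag(weights)
    Y = np.zeros((l * len(classes), l * len(labels)))
    for j, label in enumerate(labels):
        i = classes.index(label)          # row band of this sample's class
        Y[i * l:(i + 1) * l, j * l:(j + 1) * l] = block
    return Y
```

Each column band of Y is zero except for the one block in its class's row band, matching the block-diagonal layout of formula (6).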
After the construction of the sample data matrix X and the class membership matrix Y is finished, the best projection directions $w_X$ and $w_Y$ of 2DPLS are obtained, by analogy with formula (3), from:

$$\arg\max \left\{ \frac{[\mathrm{cov}(w_X^T X, w_Y^T Y)]^2}{\mathrm{var}(w_Y^T Y)} \right\} \qquad (8)$$

Formula (8) is again a constrained extreme-value problem. The method of Lagrange multipliers yields the following eigenvalue equations:

$$\Sigma_{XY}\Sigma_{YY}^{-1}\Sigma_{YX}\, w_X = \lambda w_X, \qquad \Sigma_{YX}\Sigma_{XX}^{-1}\Sigma_{XY}\, w_Y = \lambda w_Y \qquad (9)$$

where $\Sigma_{XX} = \sum_{i=1}^{C}\sum_{j=1}^{n_i}(x_{ij}-\bar{X})(x_{ij}-\bar{X})^T$ and $\Sigma_{YY} = \sum_{i=1}^{C}\sum_{j=1}^{n_i}(y_{ij}-\bar{Y})(y_{ij}-\bar{Y})^T$ are the auto-covariance matrices of the sample data matrix X and the class membership matrix Y, and $\Sigma_{XY} = \Sigma_{YX}^T = \sum_{i=1}^{C}\sum_{j=1}^{n_i}(x_{ij}-\bar{X})(y_{ij}-\bar{Y})^T$ is the cross-covariance matrix of the two.
As the derivation above shows, the 2DPLS method still faces the problem that the class membership matrix Y and its auto-covariance matrix $\Sigma_{YY}$ are not of full rank, so the generalized inverse of $\Sigma_{YY}$ must be found. From the singular structure of the matrices Y and $\Sigma_{YY}$, the generalized inverse $\Sigma_{YY}^g$ is obtained as:

$$\Sigma_{YY}^g = \begin{bmatrix} \Sigma_{ZZ}^g & 0_{\{h\times(C-1)\}\times h} \\ 0_{h\times\{h\times(C-1)\}} & 0_{h\times h} \end{bmatrix}_{(l\times C)\times(l\times C)} \qquad (10)$$

where $\Sigma_{ZZ}^g = ((1/n_C)\, R_{C-1} R_{C-1}^T + M)^g$ is the generalized inverse of the auto-covariance matrix of the matrix z in formula (10).
Two. Selection of the number of sub-blocks per sample image
The number of sub-blocks into which the face image is divided directly determines the scope over which the LBP operator extracts local texture features, and choosing it is a trade-off between the local and the whole. This experiment draws on experience with divisions of the face image at different sizes to determine the number of sub-blocks. Considering both the effective expression of local features and algorithmic feasibility, the face image is divided into 16, 36, 64, 144, and 200 sub-blocks respectively, i.e. sub-block sizes of 30 × 30, 20 × 20, 15 × 15, 10 × 10, and 6 × 6. The training set consists of 70 images, one picked at random from each expression of each person, with the rest as the test set. The average expression recognition accuracies of 5 runs of this protocol are shown in Fig. 4, from which it can be seen that the recognition rate is highest when the number of sub-blocks is 64, i.e. when each sub-block is 15 × 15. This result shows that the choice of the number of sub-blocks must balance the local and the whole: if the sub-blocks are too large the local meaning is lost, and if they are too small sufficient statistical significance cannot be guaranteed. Therefore, in the following experiments the images are equally divided into 15 × 15 sub-blocks.
Three. The local binary pattern operator (Local Binary Pattern, LBP)
The LBP operator works on a fixed 3 × 3 rectangular block corresponding to 9 gray values. The 8 surrounding gray values are compared with the center gray value: a position greater than or equal to the center is represented by 1, otherwise by 0; the 8 binary values read clockwise form the feature value of the 3 × 3 block, and the final code is the LBP value at that point, as shown in Fig. 1. The mathematical description is as follows. Let $T = t(g_c, g_0, \dots, g_{P-1})$ denote the texture window to be processed, where $g_c$ is the center of the window and $g_0, \dots, g_{P-1}$ are the P points around it. Subtracting the gray value of the center gives the texture $T = t(g_c, g_0 - g_c, \dots, g_{P-1} - g_c)$. If the gray value of the center is taken to be independent of the surrounding gray values, the texture unit can be written $T \approx t(g_0 - g_c, \dots, g_{P-1} - g_c)$, and further as $T = t(s(g_0 - g_c), s(g_1 - g_c), \dots, s(g_{P-1} - g_c))$, where $s(x) = 1$ for $x > 0$ and $s(x) = 0$ for $x \le 0$. Finally, assigning each position a binomial factor $2^p$ gives the LBP value of the window: $LBP_{P,R} = \sum_{p=0}^{P-1} s(g_p - g_c)\, 2^p$, where P and R denote the number of sampling points and the radius respectively.
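As an illustration of the windowed comparison just described, the sketch below computes the basic 3 × 3 LBP code of a single block. The function name is hypothetical, and the clockwise read order starting at the top-left neighbor with the first neighbor as the most significant bit is a convention of this sketch; implementations differ on bit order.

```python
import numpy as np

def lbp_value(block):
    """Basic LBP code of a 3x3 grayscale block.

    Neighbors >= center are coded 1, others 0; the 8 bits are read
    clockwise from the top-left neighbor, first neighbor most significant.
    """
    center = block[1, 1]
    clockwise = [(0, 0), (0, 1), (0, 2), (1, 2),
                 (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (i, j) in enumerate(clockwise):
        if block[i, j] >= center:
            code |= 1 << (7 - bit)
    return code
```

A flat block, where every neighbor equals the center, maps to the all-ones code 255.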
To overcome the limitation that the original LBP operator cannot extract texture structures at larger scales, the operator is mainly extended by using neighborhoods with different numbers of sampling points and regions of different sizes. Fig. 2 shows two examples of extended LBP operators, where (P, R) denotes P sampling points on a circle of radius R. Windows of different sizes and with different numbers of sampling points differ in their ability to extract local texture and in the texture content they describe.
In addition, when the number of neighbors and the radius are large, much of what is extracted contributes very little to the description of texture. Ojala therefore proposed another extension of the LBP operator called the "uniform pattern" (Uniform Pattern). Here the LBP value of a window is no longer the final binary code itself; instead, the number of bit transitions in the binary code of each window is counted, and different transition counts represent different local features. For example, 00000000 has zero bit transitions while 00011110 has two. The concrete mathematical description is:
$$LBP_{P,R}^{riu2} = \begin{cases} \sum_{p=0}^{P-1} s(g_p - g_c) & \text{if } U(LBP_{P,R}) \le 2 \\ P + 1 & \text{otherwise} \end{cases} \qquad (11)$$

where

$$U(LBP_{P,R}) = |s(g_{P-1} - g_c) - s(g_0 - g_c)| + \sum_{p=1}^{P-1} |s(g_p - g_c) - s(g_{p-1} - g_c)| \qquad (12)$$

The superscript riu2 marks the use of the "uniform pattern". Timo Ahonen's studies show that the 59 uniform LBP codes occur with a probability of 79.3% on the FERET database. Experimental results prove that uniform patterns can effectively describe most texture features in an image while greatly reducing the number of features.
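The transition count that defines the "uniform pattern" can be checked with a few lines of plain Python (hypothetical function name; the code treats the 8-bit pattern circularly, as formula (12) does):

```python
def is_uniform(code, bits=8):
    """True if the binary code has at most 2 circular bit transitions."""
    transitions = 0
    for p in range(bits):
        current = (code >> p) & 1
        nxt = (code >> ((p + 1) % bits)) & 1
        if current != nxt:
            transitions += 1
    return transitions <= 2
```

Counting over all 256 codes confirms the standard figure of 58 uniform 8-bit patterns, which together with the single catch-all bin for non-uniform codes gives the 59 codes mentioned above.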
For this kind of pattern classification problem in face analysis, the difference between test data and training data must be measured to determine the class attribution of the test data. Since LBP features are characterized by histograms, a nearest-neighbor classifier is usually used for this purpose. The present invention measures the difference between LBP histograms with the chi-square statistic ($\chi^2$), as follows:

$$\chi^2(S, M) = \sum_i \frac{(S_i - M_i)^2}{S_i + M_i} \qquad (13)$$

where S and M denote the two LBP histograms respectively.
Four. Establishing the local texture feature matrix corresponding to a sample image
The sample image is divided into m sub-blocks, the LBP histogram $h_i \in R^k$ of each sub-block is extracted with the LBP operator, and the m LBP histograms of the image are assembled into a local texture feature matrix of dimension k × m, as shown in Fig. 3. This construction not only extracts the local texture features of the face image effectively but also keeps the spatial information of the local features within the image.
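Assembling the k × m local texture feature matrix can be sketched as follows, assuming the image has already been LBP-coded pixel by pixel. This is an illustrative NumPy sketch with hypothetical names, not the patent's own code.

```python
import numpy as np

def local_texture_matrix(codes, grid, bins=256):
    """k x m local texture feature matrix of one LBP-coded image.

    codes : 2-D array of per-pixel LBP codes.
    grid  : (rows, cols), dividing the image into m = rows*cols sub-blocks.
    Column j of the result is the LBP histogram of sub-block j.
    """
    columns = []
    for band in np.array_split(codes, grid[0], axis=0):
        for block in np.array_split(band, grid[1], axis=1):
            hist, _ = np.histogram(block, bins=bins, range=(0, bins))
            columns.append(hist)
    return np.stack(columns, axis=1)   # shape (bins, rows*cols)
```

Reading the sub-blocks row by row fixes a consistent column order, so corresponding columns of two images always describe the same spatial region.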
Five. Adaptive weighting mechanism and statistical feature extraction
The characteristic information contained in the m sub-blocks of an image obviously differs in importance for expression classification, so an adaptive weighting mechanism is designed to assign reasonable weights to the m sub-blocks. First, all neutral expression images in the training set are averaged into a neutral average face and its local texture feature matrix $A = [a_1, \dots, a_m]$ is extracted; then the corresponding local texture feature matrix $H^i = [h_1^i, \dots, h_m^i]$ is extracted for every expressive image. The chi-square distance is then used to compute, on each sub-block, the difference between the LBP histograms (i.e. the corresponding column of the local texture feature matrix) of each expressive image and of the neutral average face, and the sum of these differences serves as the adaptive weight of that sub-block, as follows:

$$e_j = \sum_i \chi^2(a_j, h_j^i), \quad j = 1, \dots, m \qquad (1)$$
The 2DPLS method is used to reduce the dimensionality of the sample images characterized in local texture feature matrix form. When the class membership matrix is constructed, the diagonal entries of the identity matrices in formula (6) are replaced by the adaptive weights obtained from formula (1), so as to express the differing importance of different sub-blocks for expression recognition. From the eigenvalue equations (9) the eigenvector matrix $w_X$ is obtained; the eigenvectors corresponding to its largest several eigenvalues are chosen from this matrix to form the projection matrix W used for dimensionality reduction.
Six. Deciding the expression attribution of an input image
Suppose the local texture feature matrix H of the input image has size h × l. The dimension-reduced matrix $\tilde{H}$ is obtained by the following equation:

$$\tilde{H} = W^T H \qquad (14)$$

If d vectors are chosen from the matrix $w_X$, the projection matrix W and the matrix $\tilde{H}$ have sizes l × d and d × l respectively. At decision time, the chi-square distance between the matrix $\tilde{H}$ and the dimension-reduced training samples is measured, and the expression of the input image is determined by the nearest-neighbor rule.
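The decision procedure above can be sketched end to end. The version below is illustrative, with hypothetical names: it projects both the input matrix and stored per-class templates with the same projection matrix W and picks the nearest template by the chi-square statistic. The epsilon in the denominator is an implementation choice; if projected values can be negative, a Frobenius norm would be a safer metric than chi-square, which the patent defines on histograms.

```python
import numpy as np

def chi_square(a, b, eps=1e-10):
    """Elementwise chi-square statistic between two equal-sized matrices."""
    return float(np.sum((a - b) ** 2 / (a + b + eps)))

def classify_expression(H, W, templates, labels):
    """Nearest-template decision on the dimension-reduced matrix W^T H."""
    projected = W.T @ H
    distances = [chi_square(projected, W.T @ T) for T in templates]
    return labels[int(np.argmin(distances))]
```

With an identity projection, the decision reduces to plain nearest-template matching on the raw feature matrices.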
Seven. Experimental results and analysis
Two kinds of experiments were run: person-dependent and person-independent. The emphasis of the person-dependent experiment is the discriminability of the test expressions themselves, because if the face of the same person is accurately localized, the remaining facial differences are mainly differences of expression. The person-independent experiment, by contrast, mainly tests how well the expression recognition problem generalizes between a standard expression database and a non-standard one.
The experiments are based on the Japanese Female Facial Expression (JAFFE) database. The "uniform pattern" LBP(8, 1) operator is used for all local texture extraction. All experiments were implemented on the MATLAB 7.0 software platform on a P4 3.0 GHz CPU with 1 GB of memory. To reduce the influence of factors such as face position, face size, and illumination variation in the images, the following preprocessing was applied before the experiments:
1. The face region is segmented out of each image and resized to a uniform 120 × 120.
2. The face position is normalized based on the center coordinates of the eyes and mouth. To guarantee accuracy, the center coordinates are all marked manually.
3. Illumination compensation and histogram equalization are applied to the images.
7.1 The person-dependent experiment
The purpose of the person-dependent experiment is to test the discriminability of different expressions of the same person. The training set consists of 70 images, one picked at random from each expression of each person, with the rest as the test set. Five runs with the present method and the traditional method give average recognition rates of 87.86% and 78.41% respectively. Fig. 5 shows the recognition rate obtained for each expression class in this experiment; the rate obtained by the present method is clearly higher than that of the traditional LBP method. The improvement comes from using 2DPLS to apply discrimination-oriented dimensionality reduction to the local texture feature matrices, which yields more expression-discriminant information than the direct LBP-histogram template matching of the traditional LBP method. The figure also shows that anger, happiness, and surprise are recognized quite well, because these expressions vary greatly in local detail and the LBP operator easily captures the texture differences between them. Recognition of disgust, fear, and sadness is lower, because the facial appearance of these three expressions is quite similar and even humans easily confuse them. In addition, on the training set of 70 face images the present method finishes training in only 4 seconds, which shows that the method markedly improves the recognition rate without obviously increasing the computational cost.
7.2 The person-independent experiment
The person-independent experiment tests the generalization ability of the present method in expression recognition: the images in the test set and those in the training set come from different subjects. Accordingly, all expression images of 4 randomly chosen people serve as training samples and the rest as test samples. Five runs with the present method and the traditional method give the results shown in Fig. 6. Three things can be found from Fig. 6. First, compared with the person-dependent experiment, all recognition rates drop somewhat, showing that differences in face shape between people strongly affect expression recognition. Second, the recognition rate of the present method still comprehensively exceeds that of the traditional LBP method in this experiment, further proving the validity of the 2DPLS-based local feature extraction for expression features and its superiority over the traditional LBP method. Third, anger, happiness, and surprise again obtain the highest recognition rates, showing that the larger the facial deformation of an expression, the better it generalizes.

Claims (3)

1. A facial expression recognition method based on the two-dimensional partial least squares (2DPLS) method, comprising the following steps:
a) classifying the training sample image set into seven expression classes, namely anger, disgust, fear, happiness, neutral, sadness, and surprise;
b) dividing each sample image into m sub-blocks of equal size;
c) extracting the texture feature of each sub-block with the LBP operator;
d) assembling the LBP texture features extracted from each sample image into a local texture feature matrix;
e) applying an adaptive weighting mechanism, i.e., assigning a weight to each of the m sub-blocks, the weights being computed as follows: a neutral average face is obtained from all neutral expression images in the training samples, and the local texture feature matrix A = [a_1, ..., a_m] of this neutral average face is extracted; the corresponding local texture feature matrix H^i = [h_1^i, ..., h_m^i] is then obtained for every expressive image; the chi-square distance is used to compute the LBP histogram difference between every expressive image and the neutral average face on each sub-block, and this difference is taken as the adaptive weight of that sub-block:

e_j = Σ_i χ²(a_j, h_j^i), (j = 1, ..., m)

where χ² denotes the chi-square distance;
f) performing statistical feature extraction on the local texture feature matrices of all sample images with the 2DPLS method, forming the template data corresponding to the seven expression classes and completing the training process;
g) when a new image is input, performing steps b) to f) to extract its texture and statistical features, then matching them against the template data of the seven expressions formed in step f); the nearest template determines the expression class of the input image.
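The adaptive weighting of step e) can be illustrated with a short sketch. This is not part of the patent; it is a minimal illustration assuming the LBP histograms of the sub-blocks have already been computed, with the smoothing constant `eps` and all function names being hypothetical:

```python
import numpy as np

def chi_square(a, h, eps=1e-10):
    """Chi-square distance between two LBP histograms a and h.
    eps avoids division by zero on empty bins (an assumption,
    not specified in the patent)."""
    return np.sum((a - h) ** 2 / (a + h + eps))

def adaptive_weights(neutral_feats, expressive_feats):
    """Step e): the weight e_j of sub-block j is the summed chi-square
    distance between the neutral-average-face histogram a_j and the
    corresponding histogram h_j^i of every expressive image i.

    neutral_feats:    (k, m) matrix A = [a_1, ..., a_m]
    expressive_feats: list of (k, m) matrices H^i = [h_1^i, ..., h_m^i]
    """
    k, m = neutral_feats.shape
    e = np.zeros(m)
    for j in range(m):
        e[j] = sum(chi_square(neutral_feats[:, j], H[:, j])
                   for H in expressive_feats)
    return e
```

Sub-blocks whose texture differs most from the neutral average face (typically mouth and eye regions) thus receive the largest weights, which matches the patent's observation that strongly deforming regions carry the most expression information.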
2. The facial expression recognition method based on the 2DPLS method according to claim 1, characterized in that, as the choice of the number of sub-blocks in step b), the face image is divided into 64 sub-blocks, each of size 15 × 15.
3. The facial expression recognition method based on the 2DPLS method according to claim 1, characterized in that the local texture feature matrix in step d) is built as follows: the sample image is divided into m sub-blocks, the LBP histogram of each sub-block is extracted with the LBP operator, and the m LBP histograms of the sample image are assembled into a local texture feature matrix of dimension k × m.
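The construction of the k × m local texture feature matrix in claim 3 can be sketched as follows. This is not the patented implementation; it is a minimal illustration assuming the basic 8-neighbour LBP operator with k = 256 histogram bins (the patent does not fix the LBP variant in this claim), and all function names are hypothetical:

```python
import numpy as np

def lbp_histogram(block, bins=256):
    """Basic 8-neighbour LBP code for every interior pixel of a
    sub-block, returned as a normalised histogram of length bins."""
    c = block[1:-1, 1:-1]
    # Neighbour offsets, one per bit of the 8-bit LBP code.
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.int32)
    h, w = block.shape
    for bit, (dy, dx) in enumerate(shifts):
        n = block[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes += (n >= c).astype(np.int32) << bit
    hist = np.bincount(codes.ravel(), minlength=bins).astype(float)
    return hist / hist.sum()

def local_texture_matrix(image, grid=(8, 8)):
    """Claim 3: split the image into m = grid[0] * grid[1] sub-blocks
    and stack their LBP histograms column-wise into a k x m matrix."""
    rows = np.array_split(image, grid[0], axis=0)
    blocks = [b for r in rows for b in np.array_split(r, grid[1], axis=1)]
    return np.stack([lbp_histogram(b) for b in blocks], axis=1)
```

With a 120 × 120 face image and an 8 × 8 grid this yields the 64 sub-blocks of size 15 × 15 named in claim 2, and a 256 × 64 feature matrix per image.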
CNB200710019405XA 2007-01-19 2007-01-19 Method for recognizing facial expression based on 2D partial least square method Expired - Fee Related CN100461204C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB200710019405XA CN100461204C (en) 2007-01-19 2007-01-19 Method for recognizing facial expression based on 2D partial least square method


Publications (2)

Publication Number Publication Date
CN101004791A CN101004791A (en) 2007-07-25
CN100461204C true CN100461204C (en) 2009-02-11

Family

ID=38703916

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB200710019405XA Expired - Fee Related CN100461204C (en) 2007-01-19 2007-01-19 Method for recognizing facial expression based on 2D partial least square method

Country Status (1)

Country Link
CN (1) CN100461204C (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101587186B (en) * 2008-05-22 2011-08-24 赵力 Characteristic extraction method of radar in-pulse modulation signals
CN101620669B (en) * 2008-07-01 2011-12-07 邹采荣 Method for synchronously recognizing identities and expressions of human faces
US20120023135A1 (en) * 2009-11-11 2012-01-26 Erik Dahlkvist Method for using virtual facial expressions
CN101719223B (en) * 2009-12-29 2011-09-14 西北工业大学 Identification method for stranger facial expression in static image
TWI411969B (en) * 2010-12-24 2013-10-11 Ind Tech Res Inst Method and system for matching texture feature points in images
CN102663400B (en) * 2012-04-16 2014-06-04 北京博研新创数码科技有限公司 LBP (length between perpendiculars) characteristic extraction method combined with preprocessing
CN102663436B (en) * 2012-05-03 2014-04-16 武汉大学 Self-adapting characteristic extracting method for optical texture images and synthetic aperture radar (SAR) images
CN103034858B (en) * 2012-11-30 2016-04-13 宁波大学 A kind of secondary cluster segmentation method of satellite cloud picture
CN103116765B (en) * 2013-03-18 2015-12-23 山东大学 A kind of facial expression recognizing method of local binary of odd, even grouping
CN103501428A (en) * 2013-10-22 2014-01-08 北京博威康技术有限公司 Multimedia monitoring method and multimedia monitoring device
CN103914693A (en) * 2014-04-22 2014-07-09 江西科技师范大学 Far-infrared face recognition method
CN103971131A (en) * 2014-05-13 2014-08-06 华为技术有限公司 Preset facial expression recognition method and device
CN104021395B (en) * 2014-06-20 2017-05-03 华侨大学 Target tracing algorithm based on high-order partial least square method
CN104036255B (en) * 2014-06-21 2017-07-07 电子科技大学 A kind of facial expression recognizing method
CN104657709B (en) * 2015-02-05 2019-03-15 小米科技有限责任公司 Facial image recognition method, device and server
CN104778472B (en) * 2015-04-24 2017-11-21 南京工程学院 Human face expression feature extracting method
CN105405147B (en) * 2015-11-17 2018-07-20 西北工业大学 A kind of Algorism of Matching Line Segments method based on fusion LBP and gray feature description
CN105975921B (en) * 2016-04-29 2019-06-07 厦门大学 Pedestrian detection method based on local feature symbiosis and Partial Least Squares
CN108133166B (en) * 2016-11-30 2023-03-14 中兴通讯股份有限公司 Method and device for displaying personnel state
CN108664850B (en) * 2017-03-30 2021-07-13 展讯通信(上海)有限公司 Human face posture classification method and device
CN107316015B (en) * 2017-06-19 2020-06-30 南京邮电大学 High-precision facial expression recognition method based on deep space-time characteristics
CN107644455B (en) * 2017-10-12 2022-02-22 北京旷视科技有限公司 Face image synthesis method and device
CN108537194A (en) * 2018-04-17 2018-09-14 谭红春 A kind of expression recognition method of the hepatolenticular degeneration patient based on deep learning and SVM
CN108765397A (en) * 2018-05-22 2018-11-06 内蒙古农业大学 A kind of timber image-recognizing method and device constructed based on dimensionality reduction and feature space
CN109753924A (en) * 2018-12-29 2019-05-14 上海乂学教育科技有限公司 It is a kind of for the face identification system of online education, method and application
CN110514924B (en) * 2019-08-12 2021-04-27 武汉大学 Power transformer winding fault positioning method based on deep convolutional neural network fusion visual identification
CN110543833B (en) * 2019-08-15 2020-09-22 平安国际智慧城市科技股份有限公司 Face recognition method, device and equipment based on data dimension reduction and storage medium
CN110781798B (en) * 2019-10-22 2022-08-12 浙江工业大学 Vehicle-mounted suspect locking system based on raspberry group and face recognition
CN117253279A (en) * 2023-11-17 2023-12-19 蓝色火焰科技成都有限公司 Vehicle remote control method, device and medium based on vehicle-mounted information service unit

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1790374A (en) * 2004-12-14 2006-06-21 中国科学院计算技术研究所 Face recognition method based on template matching


Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Lianghua He, Cairong Zou, Li Zhao, Die Hu. "An Enhanced LBP Feature Based on Facial Expression Recognition." Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, 2005. *
Wenming Zheng, Xiaoyan Zhou, Cairong Zou, Li Zhao. "Facial Expression Recognition Using Kernel Canonical Correlation Analysis (KCCA)." IEEE Transactions on Neural Networks, Vol. 17, No. 1, 2006. *
楼安平, 杨新, 周大可. "Face Recognition Based on the Partial Least Squares Method" (基于偏最小二乘法的人脸识别). Microcomputer Applications (微型电脑应用), Vol. 25, No. 1, 2005. *
金辉, 高文. "Facial Expression Motion Analysis and Application Based on Feature Flow" (基于特征流的面部表情运动分析及应用). Journal of Software (软件学报), Vol. 14, No. 12, 2003. *

Also Published As

Publication number Publication date
CN101004791A (en) 2007-07-25

Similar Documents

Publication Publication Date Title
CN100461204C (en) Method for recognizing facial expression based on 2D partial least square method
CN102663391B (en) Image multifeature extraction and fusion method and system
CN102831447B (en) Method for identifying multi-class facial expressions at high precision
CN108596154B (en) Remote sensing image classification method based on high-dimensional feature selection and multilevel fusion
CN102194114B (en) Method for recognizing iris based on edge gradient direction pyramid histogram
CN104392246B (en) It is a kind of based between class in class changes in faces dictionary single sample face recognition method
CN106023065A (en) Tensor hyperspectral image spectrum-space dimensionality reduction method based on deep convolutional neural network
CN101930537B (en) Method and system for identifying three-dimensional face based on bending invariant related features
CN104751191A (en) Sparse self-adaptive semi-supervised manifold learning hyperspectral image classification method
CN101630364A (en) Method for gait information processing and identity identification based on fusion feature
CN104299003A (en) Gait recognition method based on similar rule Gaussian kernel function classifier
CN109325507A (en) A kind of image classification algorithms and system of combination super-pixel significant characteristics and HOG feature
CN105678261B (en) Based on the direct-push Method of Data with Adding Windows for having supervision figure
CN103077378B (en) Contactless face recognition algorithms based on extension eight neighborhood Local textural feature and system of registering
CN108108760A (en) A kind of fast human face recognition
CN111339930A (en) Face recognition method combining mask attribute loss function
CN106055653A (en) Video synopsis object retrieval method based on image semantic annotation
CN104778472B (en) Human face expression feature extracting method
CN103400154A (en) Human body movement recognition method based on surveillance isometric mapping
CN104268507A (en) Manual alphabet identification method based on RGB-D image
CN102880870A (en) Method and system for extracting facial features
CN103714340A (en) Self-adaptation feature extracting method based on image partitioning
CN103745242A (en) Cross-equipment biometric feature recognition method
CN103942572A (en) Method and device for extracting facial expression features based on bidirectional compressed data space dimension reduction
CN111127407B (en) Fourier transform-based style migration forged image detection device and method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090211

Termination date: 20100219