CN109918998B - Large-gesture face recognition method - Google Patents


Info

Publication number
CN109918998B
Authority
CN
China
Prior art keywords
face
gabor
block
affine transformation
parameter
Prior art date
Legal status
Active
Application number
CN201910058322.4A
Other languages
Chinese (zh)
Other versions
CN109918998A (en)
Inventor
王辰星
程超
达飞鹏
Current Assignee
Southeast University
Original Assignee
Southeast University
Priority date
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201910058322.4A priority Critical patent/CN109918998B/en
Publication of CN109918998A publication Critical patent/CN109918998A/en
Application granted
Publication of CN109918998B publication Critical patent/CN109918998B/en


Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a face recognition method suitable for large poses. The method first divides a picture into blocks and obtains candidate sets of affine transformation parameters mapping each side-face block to its corresponding frontal block using a weighted LK algorithm; it then searches each candidate set for the optimal parameter; each side-face block is rectified to a frontal view with its optimal affine transformation parameters, and all blocks are stitched together to obtain the rectified frontal face; finally, a weighted local gabor binary pattern histogram sequence algorithm performs authentication and identification, using the average gabor similarity obtained under each block's optimal parameters as that block's recognition weight. The method is simple and efficient, robust to illumination, and obtains globally optimal affine transformation parameters efficiently and accurately, greatly improving the side-face recognition rate; it has good application prospects in large-pose face detection and recognition.

Description

Large-gesture face recognition method
Technical Field
The invention belongs to the technical field of computer vision and artificial intelligence, relates to face recognition methods, and in particular relates to a large-pose face recognition method.
Background
Face recognition technology is one of the hot research subjects in modern artificial intelligence, pattern recognition and computer vision. It is widely applied in daily life and scientific research, and large-angle head pose is one of the main factors affecting recognition results. When the input image is a side face with a large deflection angle, the performance of many conventional face recognition algorithms degrades sharply, and the recognition rate drops markedly. Experiments have shown that, even for two images of the same person, different poses can produce a difference between the images far larger than the difference between two persons of different identities in the same pose. Among current approaches, corrective normalization of the pose has proven effective.
At present, mainstream research on large-angle face poses focuses on three-dimensional face templates and two-dimensional planar techniques. On the three-dimensional side, a template can be constructed from a single image, but 3D face reconstruction is complex, computationally heavy and hard to run in real time, and constructing the template requires accurately calibrated feature points. On the two-dimensional side, the Lucas-Kanade algorithm is a very effective image alignment algorithm, but conventional face pose recovery based on the LK algorithm easily falls into a local minimum when processing a single image, failing to obtain globally optimal parameters and yielding blurred recovered faces.
Therefore, a new solution is needed to solve this problem.
Disclosure of Invention
The invention aims to: solve the problem of the low recognition rate of large-pose faces in the prior art. A large-pose face recognition method with maximum gabor similarity based on a weighted LK algorithm is provided, in which the optimal affine transformation parameters of each face block are found through gabor features, so that a rectified, sharp frontal picture is obtained and the recognition rate is effectively improved.
The technical scheme is as follows: to achieve the above purpose, the invention provides a large-pose face recognition method with maximum gabor similarity based on a weighted LK algorithm, which comprises the following steps:
1) Carrying out corresponding block division on each side face and the corresponding front face area in the training set;
2) Obtaining affine transformation parameters of each block of area divided on each face in the training set by using a weighted LK algorithm, and taking affine transformation parameters of all corresponding face areas in the training set as a candidate set;
3) Finding out the optimal affine transformation parameters corresponding to each divided face area according to the criterion that the similarity between each corrected face block and the corresponding face block gabor in the training set is maximum;
4) Using the optimal affine transformation parameter of each area to calculate the average gabor similarity of the face area under the parameter;
5) Performing angle measurement and side face region block division on a face picture to be identified;
6) Finding out optimal affine transformation parameters of all the block areas of the side face area in the step 5 according to the step 3, and synthesizing a front face;
7) Identifying the frontal face synthesized in step 6 by a weighted local gabor binary pattern histogram sequence method, wherein the recognition weight of each face area is the average gabor similarity of that face area.
In step 1, a side face refers to a face within a certain angle range; the side-face and frontal blocks used for rectification need only correspond approximately to the same local area rather than precisely. Specifically, pictures of several faces at a given angle and their frontal views are divided so that each side-face block roughly corresponds to a frontal block, giving N block areas in total.
Further, in step 2, let p denote the affine transformation parameters, I the side-face image, T the corresponding frontal image, and the subscript r the block number, giving the following objective function:
E = ||[g_1*I_r(W(x,p)), ……, g_M*I_r(W(x,p))] - [g_1*T_r(x), ……, g_M*T_r(x)]||^2
where g_i denotes the i-th gabor transform and M the total number of gabor transforms selected; the weighted LK algorithm is used to solve this function, with the specific steps of:
2.1) Transforming into the Fourier domain and applying the Parseval theorem gives:
E = [Î_r(W(x,p)) - T̂_r(x)]^T S [Î_r(W(x,p)) - T̂_r(x)], S = Σ_{i=1}^{M} diag(Ĝ_i)^T diag(Ĝ_i)
where Ĝ_i, Î_r, T̂_r are the Fourier transforms of g_i, I_r, T_r respectively;
2.2) Replacing the Fourier transform with a matrix F containing the Fourier basis vectors:
E = [I_r(W(x,p)) - T_r(x)]^T F^T S F [I_r(W(x,p)) - T_r(x)]
where Q = F^T S F is the weighting matrix;
2.3) Applying a first-order Taylor expansion to the above and setting the partial derivative with respect to Δp to zero yields:
Δp = H^(-1) J^T Q [T_r(x) - I_r(W(x,p))]
where J = ∇I_r·∂W/∂p is the Jacobian and H = J^T Q J;
2.4) The parameter p is iterated from the zero vector, p = p + Δp, until convergence; the converged value is taken as the affine transformation parameter p of the block.
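Steps 2.1-2.4 amount to an iterative Gauss-Newton update under the weighting matrix Q. The following Python sketch illustrates the same update loop on a deliberately simplified case: a 1-D signal, a translation-only warp, and diagonal weights standing in for Q. The signal values and helper names are illustrative assumptions, not part of the patent's 2-D affine formulation:

```python
import math

def sample(signal, x):
    """Linearly interpolate a discrete signal at a real-valued position x."""
    i = max(0, min(len(signal) - 2, int(math.floor(x))))
    frac = x - i
    return signal[i] * (1.0 - frac) + signal[i + 1] * frac

def weighted_lk_1d(I, T, weights, num_iter=50):
    """Estimate the translation p aligning I(x + p) with T(x) via the
    weighted Gauss-Newton update dp = (g'Qg)^-1 g'Q r, with Q diagonal."""
    p = 0.0  # step 2.4: iterate from the zero "vector"
    for _ in range(num_iter):
        r = [T[x] - sample(I, x + p) for x in range(len(T))]   # residual
        g = [sample(I, x + p + 0.5) - sample(I, x + p - 0.5)   # gradient
             for x in range(len(T))]
        den = sum(w * gi * gi for w, gi in zip(weights, g))
        if den == 0.0:
            break
        dp = sum(w * gi * ri for w, gi, ri in zip(weights, g, r)) / den
        p += dp                      # p = p + dp, as in step 2.4
        if abs(dp) < 1e-8:           # convergence criterion
            break
    return p

# T is I shifted by 2 samples, so the recovered translation should be about 2.
I = [0, 0, 1, 4, 9, 16, 9, 4, 1, 0, 0, 0]
T = [sample(I, x + 2.0) for x in range(len(I))]
p_hat = weighted_lk_1d(I, T, [1.0] * len(T))
```

Because the residual vanishes exactly at the true shift, the iteration converges there, mirroring the convergence criterion of step 2.4.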
Further, determining the optimal affine transformation parameters in step 3 comprises the steps of:
3.1) The affine transformation parameters of all corresponding face areas in the training set form the candidate set P; for the i-th block area of the k-th side face in the library, the gabor features under a parameter p_i give the feature vector b_{k,i}:
b_{k,i}(p_i) = [g_1*I_{k,i}(W(x,p_i)); ……; g_M*I_{k,i}(W(x,p_i))]
3.2) The gabor feature a_{k,i} of the i-th block of the k-th frontal face picture in the library is
a_{k,i} = [g_1*T_{k,i}(x); ……; g_M*T_{k,i}(x)]
3.3) For each parameter p_i of the i-th block area, the gabor similarity gabor_S(p_i) is the cosine similarity accumulated over the library:
gabor_S(p_i) = Σ_k <a_{k,i}, b_{k,i}(p_i)> / (||a_{k,i}||·||b_{k,i}(p_i)||)
3.4) The globally optimal parameter of the i-th block area is
p_i* = argmax_{p_i ∈ P} gabor_S(p_i)
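Steps 3.1-3.4 reduce to an arg-max of cosine similarities summed over the training library. A minimal Python sketch, with plain lists standing in for gabor feature vectors; the candidate names and toy numbers are made up for illustration:

```python
import math

def cosine(u, v):
    """Cosine similarity of two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu > 0 and nv > 0 else 0.0

def best_parameter(candidates, side_features, front_features):
    """gabor_S(p): cosine similarity summed over the K training faces;
    p* maximizes gabor_S, and gabor_S(p*)/K is the block's weight omega."""
    K = len(front_features)
    scores = {p: sum(cosine(side_features[p][k], front_features[k])
                     for k in range(K))
              for p in candidates}
    p_star = max(scores, key=scores.get)
    return p_star, scores[p_star] / K

# Toy data: two candidate parameters, two training faces, 3-D "gabor" features.
front = [[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
side = {
    "p0": [[1.0, 0.1, 0.9], [0.1, 1.0, 1.1]],  # rectification fits well
    "p1": [[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]],  # rectification fits poorly
}
p_star, omega = best_parameter(["p0", "p1"], side, front)
```

The returned average similarity omega is exactly the quantity later reused as the block's recognition weight.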
Further, the specific face recognition process in the step 7 is as follows:
4.1) Block-weighted identification is performed according to the face division; assume the gray-scale interval is [0, L-1], and let H_{μ,ν,i} denote the histogram of the i-th face block under the gabor transform g_{μ,ν}:
H_{μ,ν,i} = (h_{μ,ν,i,0}, h_{μ,ν,i,1}, ……, h_{μ,ν,i,L-1})
where h_{μ,ν,i,r} = Σ_z I{g_{μ,ν}(z) = r} and r denotes a gray level;
the final histogram of each face concatenates the block histograms over all scales μ, directions ν and blocks i: H = (H_{μ,ν,i});
4.2) The weighted LGBPHS measures the similarity of two faces as
S(H^s, H^f) = Σ_i ω_i Σ_{r=0}^{L-1} min(h^s_{i,r}, h^f_{i,r})
where H^s is the histogram representation of the side face, H^f the histogram representation of a frontal face, ω_i the weight of the i-th face block, h^s_{i,r} and h^f_{i,r} the r-th entries of the two histograms, and L the number of entries in a histogram;
4.3) The weight of the i-th block is set to the average gabor similarity obtained under the optimal affine transformation parameter p_i* of the face block, i.e. ω_i = gabor_S(p_i*)/K, K being the number of training faces;
4.4) Finally, the frontal identity maximizing S(H^s, H^f) is taken as the identity of this side-face picture.
In summary, the method first divides the side-face pictures and corresponding frontal pictures of the training data set into blocks, and obtains candidate sets of affine transformation parameters between corresponding side-face and frontal blocks using a weighted LK (Lucas-Kanade) algorithm; it then finds, in each candidate set, the optimal parameter maximizing the gabor similarity between each rectified face block and the corresponding frontal blocks of the training set; each side-face block is rectified into a frontal view with its optimal affine transformation parameters, and all blocks are stitched to obtain the rectified frontal face; finally, a weighted local gabor binary pattern histogram sequence algorithm performs authentication and identification, using the average gabor similarity under each block's optimal parameters as that block's recognition weight.
The beneficial effects are that, compared with the prior art, the invention has the following advantages:
1. The pictures in the training set need no calibrated feature points, so little preparation is required; the side-face and frontal blocks used for rectification need only correspond to approximately the same local areas, not precisely.
2. Training requires a small data set and little time, and the face synthesis and recognition stage is also fast.
3. The method obtains globally optimal affine transformation parameters efficiently and accurately, greatly improving the side-face recognition rate, with a degree of robustness to pose and illumination.
Drawings
FIG. 1 is an overall flow chart of the present invention;
FIG. 2 is a schematic diagram of the block division of a side face and the corresponding frontal face;
FIG. 3 is a schematic diagram showing the gabor similarity of the obtained global optimum parameters;
FIG. 4 is a face image of each face region corrected by gabor optimal parameters;
fig. 5 is a schematic diagram of the LBP algorithm.
Detailed Description
The invention is further elucidated below in connection with the drawings and the specific embodiments.
As shown in fig. 1, the invention provides a weighted LK algorithm-based large-pose face recognition method with maximum gabor similarity, which specifically comprises the following steps:
First), the training pictures are segmented, i.e. the side-face and frontal pictures in the training set are divided into corresponding blocks; a schematic diagram after segmentation is shown in fig. 2, and the specific segmentation steps are as follows:
1.1) Frontal-face segmentation: each frontal block is divided by a rectangle, with N = 27 blocks in total, arranged from top to bottom in rows of 5+5+5+5+4+3 block areas; the length and width of each rectangle are set to 12 (the face picture is 80 x 80), and since the length and width are fixed, the upper-left corner coordinates of each area are recorded in an array pt;
1.2) Side-face segmentation: the positions on the side face of the vertices of each corresponding rectangle are found and the blocks divided manually, and the positions of the four vertices of each area are recorded in a matrix.
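The frontal layout of step 1.1 (27 blocks of side 12 on an 80 x 80 image, in rows of 5+5+5+5+4+3) can be generated programmatically. The sketch below assumes left-aligned rows with blocks spaced 12 px apart; the patent records the upper-left corners in an array pt but does not give the exact offsets, so this spacing is an illustrative assumption:

```python
def frontal_block_corners(rows=(5, 5, 5, 5, 4, 3), block=12, img=80):
    """Return upper-left (x, y) corners for the N = sum(rows) frontal blocks,
    assuming a left-aligned grid with 12-px spacing."""
    pt = []
    for r, ncols in enumerate(rows):
        y = r * block
        for c in range(ncols):
            pt.append((c * block, y))
    return pt

pt = frontal_block_corners()
# Every 12 x 12 block must lie inside the 80 x 80 face picture.
in_bounds = all(x + 12 <= 80 and y + 12 <= 80 for x, y in pt)
```

The row structure (fewer blocks toward the chin) follows the 5+5+5+5+4+3 division stated in the description.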
Secondly), for each side face and its corresponding frontal face in the training set, the affine transformation parameters of each face area are obtained by the weighted LK algorithm, and the parameters of each side-face area over the training set are taken as a candidate set; the specific steps are:
2.1) gabor transforms of 5 scales and 8 directions are prepared and stored in a two-dimensional cell array gaborArray{i,j}, where i indexes the scale and j the direction;
2.2 Defining a class of LK for processing steps associated with the LK algorithm, which is specifically as follows:
2.2.1 In this class the following variables are defined:
img1 represents a front face image;
fx represents the kernel function of the Sobel filter in the x direction;
fy represents the kernel function of the Sobel filter in the y direction;
x represents a coordinate in the x direction;
y represents a coordinate in the y direction;
t_0 represents a face region;
dsize represents the size of the face.
2.2.2 In this class, an initialization method is defined, including the following:
fx and fy are initialized to fx = [-1, 1] and fy = fx'. T_0 is initialized to a frontal face area; the initialization can be completed through the imcrop function, namely:
T_0=imcrop(img1,[pt(1),pt(2),11,11])
where imcrop is a function provided in the matlab toolbox for cropping a designated area of an image, pt designates the upper-left corner coordinates, and 11 sets the cropped extent in pixels (giving a 12 x 12 block);
initializing x and y, first:
[x1,y1]=meshgrid(0:size(T_0,2)-1,0:size(T_0,1)-1)
where meshgrid is a function provided in the matlab toolbox, x1, y1 are all vectorized to initialize x and y, x=x1 (:), y=y1 (:); initialization of dsize is: dsize=size (t_0), resulting in a matrix dimension of the same size as t_0.
2.2.3) The affine transformation parameters from a side-face block to the frontal block are obtained by a fit method; bimg denotes the side-face image area, and Ix and Iy denote the gradient images of the side face in the x and y directions. The parameters are obtained as follows:
2.2.3.1) First Ix and Iy are computed: Ix=imfilter(bimg,fx), Iy=imfilter(bimg,fy), where imfilter is a function provided in matlab to filter an image; the first parameter specifies the image to be filtered and the second parameter the kernel function used in the filtering.
2.2.3.2) p denotes the affine transformation parameters, iterated from 0 for num_iter iterations; p is a matrix of 6 parameters controlling rotation, scaling and translation in the x and y directions, arranged in the standard LK affine warp
W(x;p) = [1+p1, p3, p5; p2, 1+p4, p6]
The image obtained after the side-face region has undergone the affine transformation p is denoted patch, with size specified by dsize.
A structure for the affine transformation is formed by tform=maketform('affine',p), where maketform is a built-in matlab function that creates a TFORM structure which cooperates with the imtransform function to effect the image transformation. Then patch=imtransform(bimg,tform,'bilinear','xdata',[0,dsize(2)-1],'ydata',[0,dsize(1)-1]), where imtransform is a built-in matlab function; the first parameter specifies the image to be transformed, the second parameter tform the transformation structure, the third parameter the interpolation scheme ('bilinear' denotes bilinear interpolation), and xdata and ydata specify the position of the output image in the two-dimensional output space X, Y. The gradient images Ix, Iy are similarly transformed with the affine parameters p to obtain Ix_p, Iy_p.
2.2.3.3) A jacobian matrix is constructed from the current p. All-ones and all-zeros arrays v1, v0 of the same size as x are constructed, i.e. v1=ones(size(x)), v0=zeros(size(x)), and the derivatives of the affine warp are assembled as dWx=[-x, y, v1, v0, v0, v0] and dWy=[v0, v0, v0, x, -y, v1].
The jacobian matrix J can be obtained by vectorizing the gradient images ix_p, iy_p in the x, y directions and performing point multiplication (corresponding multiplication of each element) with dWx, dWy, respectively, that is:
J=dWx.*repmat(Ix_p(:),[1,6])+dWy.*repmat(Iy_p(:),[1,6])
the repmat is a matlab self-contained function that can replicate the element represented by the first parameter, the dimension of replication being specified by the second parameter.
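The assembly in 2.2.3.3 can be written out per pixel: row j of the Jacobian is dWx·Ix_p + dWy·Iy_p evaluated at pixel j. A stdlib-Python sketch of the same construction, using per-pixel lists instead of MATLAB's vectorized repmat; the 2 x 2 toy gradients are made up:

```python
def affine_jacobian(xs, ys, ix, iy):
    """Row j of J is the gradient of the warped intensity at pixel j with
    respect to the six affine parameters: J = dWx .* Ix_p + dWy .* Iy_p,
    with dWx = [-x, y, 1, 0, 0, 0] and dWy = [0, 0, 0, x, -y, 1]
    as written in the description."""
    J = []
    for x, y, gx, gy in zip(xs, ys, ix, iy):
        dwx = [-x, y, 1.0, 0.0, 0.0, 0.0]
        dwy = [0.0, 0.0, 0.0, x, -y, 1.0]
        J.append([a * gx + b * gy for a, b in zip(dwx, dwy)])
    return J

# Toy 2 x 2 patch flattened to 4 pixels, with made-up constant gradients.
xs = [0.0, 1.0, 0.0, 1.0]
ys = [0.0, 0.0, 1.0, 1.0]
ix = [0.5, 0.5, 0.5, 0.5]
iy = [0.2, 0.2, 0.2, 0.2]
J = affine_jacobian(xs, ys, ix, iy)
```

Each of the four rows has six entries, one per affine parameter, matching the [N x 6] shape produced by the repmat expression above.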
2.2.3.4) dp is found using the first-order Taylor expansion. First the side-face region patch undergoes gabor filtering: p_feature=imfilter(patch,gaborArray{i,j}), and the frontal region T_0 likewise: T_feature=imfilter(T_0,gaborArray{i,j}). Using the jacobian matrix J as the best linear approximation between the side-face region patch and the frontal region T_0, dp is found as:
dp=J\(T_feature(:)-p_feature(:))
where \ denotes matrix left division; left-dividing the feature difference T_feature(:)-p_feature(:) by J yields dp as a least-squares solution.
2.2.3.5 Update affine transformation parameter p with the obtained dp, i.e.
p=p+dp
2.2.3.6) If the iteration count is smaller than the designated number of iterations, return to step 2.2.3.2 and continue; once it exceeds the designated number, exit the loop, and the obtained p is the affine transformation parameter of this block's side-face region. The number selected in the tests is 30, and the iteration generally converges around 20.
2.3) Through step 2.2, all 60 affine transformation parameters (one per training subject) of each side-face block are obtained for each side-face picture, and the parameters obtained for corresponding face areas are stored in the corresponding candidate set P.
Thirdly), the optimal affine transformation parameters of each block are selected by the criterion of maximizing the gabor similarity between the block's rectified face and the corresponding frontal blocks over all training sets; the specific steps are as follows:
3.1) The gabor features of the frontal area T_0 are measured by filtering T_0 with each gabor transform in gaborArray, namely:
gabor_front_feature=imfilter(T_0,gaborArray{i,j})
the obtained feature information is vectorized, and the features obtained through all gabor transforms are concatenated to obtain gabor_front, namely:
gabor_front=[gabor_front;gabor_front_feature(:)]
3.2) Similarly to step 2.2.3.2, the synthesized frontal region finalpatch is obtained; its gabor features are measured by filtering finalpatch with each gabor transform in gaborArray, i.e.
gabor_pose_feature=imfilter(finalpatch,gaborArray{i,j})
the obtained feature information is vectorized, and the features obtained through all gabor transforms are concatenated to obtain gabor_pose, namely
gabor_pose=[gabor_pose;gabor_pose_feature(:)]
3.3) The fit cos_similar of an affine parameter on this face area is measured through cosine similarity, namely
cos_similar=(1-pdist2(gabor_front',gabor_pose','cosine'))
where pdist2 is a matlab function; the first and second parameters specify the samples, and the third parameter specifies the distance required. The cosine similarities obtained after transformation by the affine parameter p of this block's face region are added up over all training sets (60 people are selected in this embodiment) to obtain the total similarity of parameter p, namely:
gabor_S(p) = Σ_{k=1}^{60} cos_similar_k
The parameter p* maximizing gabor_S(p) in the candidate set P, i.e. p* = argmax_{p∈P} gabor_S(p), is the globally optimal parameter of this face area; the average gabor similarity under p*, ω = gabor_S(p*)/60, is calculated as the weight value ω used in the later face recognition.
In this embodiment, fig. 3 lists, for blocks 1, 7, 9 and 15, the gabor cosine similarity between the frontal blocks of the 60 persons and the side-face block (40°) rectified by the optimal parameters at the same angle; the horizontal dotted line for each face area in the figure is the average gabor cosine similarity obtained with the optimal parameters of that block.
In this embodiment, as shown in fig. 4, side faces marked as 40° in the face library (including cases under different illumination conditions) are rectified. Even under uneven illumination, the optimal transformation parameters obtained from the gabor features remain well adapted, showing that the obtained globally optimal parameters have a degree of robustness to illumination.
Fourth), face recognition: first, angle measurement and side-face region block division are performed on the side-face picture to be identified, and a frontal picture is synthesized from the optimal affine transformation parameters of all block regions, so that identity can be judged from the synthesized frontal picture. Because the method is based on local blocks, each local block contributes a different amount to the overall recognition, and blocks fitted with the optimal rectification parameters also differ in quality; a recognition weight ω is therefore set for each face region. The specific steps are:
4.1) Since the pose rectification adopts a block-division, gabor-feature-based method, recognition is carried out with the weighted LGBPHS algorithm;
4.2 Gabor transformation is carried out on each front face region T_0 in the face library to obtain gabor_front_feature, namely
gabor_front_feature=imfilter(T_0,gaborArray{i,j})
The LBP algorithm is performed on gabor_front_feature. Roughly, each pixel of the image is processed with a 3*3 template: the current pixel is compared with its surrounding pixels, those not smaller than it being set to 1 and those smaller set to 0; the eight surrounding 0/1 values are then packed in a fixed order into an 8-bit unsigned number; finally, this number is assigned to the current pixel. The algorithm is illustrated in fig. 5, and the whole process is denoted LBP, namely:
LBP_front_feature=LBP(gabor_front_feature)
the obtained image result LBP_front_feature is passed through the imhist function to obtain a gray histogram, i.e.
[counts_front,~]=imhist(LBP_front_feature)
where imhist is a built-in matlab function that counts the occurrence frequency of each gray value in the frontal image area and records it in the counts_front array; processing each face area with all gabor transforms and the imhist function gives the histogram representation of each frontal face.
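The LBP step and its histogram can be sketched directly: each interior pixel is compared with its eight neighbours, the bits are packed into an unsigned byte, and a 256-bin gray histogram plays the role of imhist. Pure-Python sketch; the clockwise bit order starting from the top-left neighbour is an assumption, as implementations vary:

```python
def lbp(img):
    """Basic 3x3 LBP: neighbours >= centre contribute a 1-bit, packed
    clockwise from the top-left into an 8-bit code per interior pixel."""
    h, w = len(img), len(img[0])
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if img[i + di][j + dj] >= img[i][j]:
                    code |= 1 << (7 - bit)
            out[i][j] = code
    return out

def gray_histogram(img, levels=256):
    """Counterpart of imhist: frequency of each gray level."""
    counts = [0] * levels
    for row in img:
        for v in row:
            counts[v] += 1
    return counts

# A single 3x3 patch: only the centre pixel gets a non-trivial code.
img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
codes = lbp(img)
hist = gray_histogram(codes)
```

For the centre pixel (50), only the right, bottom-right, bottom and bottom-left neighbours are not smaller, giving the bit pattern 00011110, i.e. the code 30.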
4.3 After correcting the side face area in the test set by the corresponding global optimal parameters, performing gabor transformation on the side face area, namely:
gabor_pose_feature=imfilter(finalpatch,gaborArray{i,j})
The LBP algorithm is applied to gabor_pose_feature to obtain LBP_pose_feature, namely:
LBP_pose_feature=LBP(gabor_pose_feature)
and the obtained image result LBP_pose_feature is passed through the imhist function to obtain a gray histogram:
[counts_pose,~]=imhist(LBP_pose_feature)
where the occurrence frequency of each gray value in the side-face image area is counted and recorded in the counts_pose array; processing each face area of the test set with all gabor transforms and the imhist function gives the side-face histogram representation.
4.4) The similarity between the side-face histogram H^s and the histogram H^f of each frontal face in the library is measured by taking, for each gray level, the minimum of the corresponding entries (their common part) as the similarity of that entry, and weighting each face region by its weight ω_i; the fit between the side face and a frontal face is then
S(H^s, H^f) = Σ_i ω_i Σ_r min(h^s_{i,r}, h^f_{i,r})
and the identity giving the maximum fit is taken as the identity of the side-face picture.
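The matching rule of 4.4 is a weighted histogram intersection: each block contributes the overlap of its two histograms, scaled by that block's weight. A minimal Python sketch; the toy histograms and weights are made up for illustration:

```python
def weighted_lgbphs_similarity(side_hists, front_hists, weights):
    """S = sum_i omega_i * sum_r min(h_side[i][r], h_front[i][r]):
    per-block histogram intersection, weighted by the block's average
    gabor similarity omega_i."""
    total = 0.0
    for h_s, h_f, w in zip(side_hists, front_hists, weights):
        total += w * sum(min(a, b) for a, b in zip(h_s, h_f))
    return total

# Two blocks, 4-bin toy histograms.
side = [[3, 1, 0, 2], [1, 1, 1, 1]]
gallery_a = [[3, 1, 0, 2], [1, 1, 1, 1]]   # identical -> maximal overlap
gallery_b = [[0, 0, 6, 0], [4, 0, 0, 0]]   # little overlap
w = [0.9, 0.5]
s_a = weighted_lgbphs_similarity(side, gallery_a, w)
s_b = weighted_lgbphs_similarity(side, gallery_b, w)
```

The identity whose gallery histogram yields the largest S is returned, so gallery_a would be chosen here.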
in the embodiment, 120 side face pictures (angles include +/-30 degrees, +/-45 degrees and +/-60 degrees) are respectively identified and compared through the method and other face recognition methods.
Other face recognition methods are a LLR (Local Linear Regression) method based on linear transformation from a side face block to a front face block and a posture correction method based on minimum pixel difference, and the final average recognition rates are 68.62% and 75.75% respectively; the average recognition rate of the method of the invention is 98.15%. Therefore, the method can effectively improve the recognition rate of the side face picture.

Claims (1)

1. A large-pose face recognition method, characterized in that the method comprises the following steps:
1) Carrying out corresponding block division on each side face and the corresponding front face area in the training set;
2) Obtaining affine transformation parameters of each block of area divided on each face in the training set by using a weighted LK algorithm, and taking affine transformation parameters of all corresponding face areas in the training set as a candidate set;
3) Finding out the optimal affine transformation parameters corresponding to each divided face area according to the criterion that the similarity between each corrected face block and the corresponding face block gabor in the training set is maximum;
4) Calculating the average gabor similarity of the face region under the optimal affine transformation parameters of each region by utilizing the optimal affine transformation parameters;
5) Performing angle measurement and side face region block division on a face picture to be identified;
6) Finding out optimal affine transformation parameters of all the block areas of the side face area in the step 5 according to the step 3, and synthesizing a front face;
7) The face synthesized in the step 6 is identified by adopting a weighted local gabor binary pattern histogram sequence method, wherein the identification weight value of each face area is the average gabor similarity of the face areas;
in step 2, p denotes the affine transformation parameters, I the side-face image, T the corresponding frontal image, and the subscript r the block number, giving the following function:
E = ||[g_1*I_r(W(x,p)), ……, g_M*I_r(W(x,p))] - [g_1*T_r(x), ……, g_M*T_r(x)]||^2
where g_i denotes the i-th gabor transform, i = 1 … M, and M is the total number of gabor transforms selected; the above function is solved using the weighted LK algorithm, with the specific steps of:
2.1) Transforming into the Fourier domain and applying the Parseval theorem gives:
E = [Î_r(W(x,p)) - T̂_r(x)]^T S [Î_r(W(x,p)) - T̂_r(x)], S = Σ_{i=1}^{M} diag(Ĝ_i)^T diag(Ĝ_i)
where Ĝ_i, Î_r, T̂_r are the Fourier transforms of g_i, I_r, T_r respectively;
2.2) Replacing the Fourier transform with a matrix F containing the Fourier basis vectors:
E = [I_r(W(x,p)) - T_r(x)]^T F^T S F [I_r(W(x,p)) - T_r(x)]
where Q = F^T S F is the weighting matrix;
2.3) Applying a first-order Taylor expansion to the above and setting the partial derivative with respect to Δp to zero yields:
Δp = H^(-1) J^T Q [T_r(x) - I_r(W(x,p))]
where J = ∇I_r·∂W/∂p is the Jacobian and H = J^T Q J;
2.4) The p parameter is iterated from the zero vector, p = p + Δp, until convergence; the converged value is taken as the affine transformation parameter p;
the determination of the optimal affine transformation parameters in step 3 comprises the steps of:
3.1 Affine transformation parameters of all corresponding face areas in the training set are used as candidate sets P, and the ith block area in the kth side face in the library is in the parameter P i The following gabor feature, a feature vector b is obtained k,i
b k,i (p i )=[g 1 *I k,i (W(x,p i ));……;g M *I k,i (W(x,p i ))]
3.2 Gabor feature a of the ith block of the kth frontal face picture in the library) k,i
a k,i =[g 1 *T k,i (x);……;g M *T k,i (x)]
3.3 For each parameter p of the i-th block area i The gabor similarity gabor_s (p i )
3.4 I-th block region global optimum parameterIs that
The specific face recognition process in step 7 is as follows:
4.1 Block weighting identification is performed according to the division of each face, and the gray scale interval is assumed to be 0, L-1],H μ,ν,i Representing the face of the ith block in gabor transformation g μ,ν The histogram obtained below
H μ,ν,i =(h μ,ν,i,0 ,h μ,ν,i,1 ,……,h μ,ν,i,L-1 )
Wherein h is μ,ν,i,r =∑I{g μ,ν (z) =r } r represents a gray level, and
the final histogram for each face is:
4.2) The weighted LGBPHS is used to measure the similarity of two faces:
S(H^s, H^f) = Σ_i ω_i Σ_{μ,ν} s(H^s_{μ,ν,i}, H^f_{μ,ν,i})
where H^s is the histogram representation of the profile face, H^f is that of the frontal face, and ω_i is the weight of the i-th face block. The similarity of two sub-histograms is the histogram intersection
s(H¹, H²) = Σ_{r=0…L−1} min(h¹_r, h²_r)
where h¹_r and h²_r are the r-th entries of the first and second histograms and L is the number of entries in the histogram;
4.3) The weight of the i-th block is set to the average Gabor similarity obtained at the optimal affine transformation parameter p̂_i of that face block, i.e. ω_i = (1/K) Σ_{k=1…K} gabor_s_k(p̂_i);
4.4) Finally, the frontal face picture yielding the maximum weighted similarity S is found; its identity is taken as the identity of this profile face picture.
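Steps 4.2–4.4 reduce to a weighted histogram-intersection nearest-neighbour search. A self-contained sketch with synthetic histograms; the weights would come from step 4.3, but here they are fixed numbers for illustration:

```python
import numpy as np

def hist_intersection(h1, h2):
    """Histogram intersection: sum_r min(h1_r, h2_r)."""
    return int(np.minimum(h1, h2).sum())

def weighted_similarity(probe_blocks, gallery_blocks, weights):
    """Weighted LGBPHS score: sum_i w_i * intersection of block-i histograms."""
    return sum(w * hist_intersection(p, g)
               for w, p, g in zip(weights, probe_blocks, gallery_blocks))

rng = np.random.default_rng(1)
n_blocks, L = 4, 16
# Gallery of 3 identities, each with one histogram per face block (synthetic).
gallery = {k: [rng.integers(0, 50, L) for _ in range(n_blocks)] for k in range(3)}
weights = [0.9, 0.7, 0.8, 0.5]          # stand-ins for the step-4.3 weights w_i

probe = [h.copy() for h in gallery[2]]  # probe: the profile face of identity 2
scores = {k: weighted_similarity(probe, blocks, weights)
          for k, blocks in gallery.items()}
identity = max(scores, key=scores.get)  # step 4.4: argmax over the gallery
```

A histogram intersected with itself attains its own mass, so the matching identity scores highest; with real data the probe and gallery histograms differ and the block weights suppress unreliable regions.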
CN201910058322.4A 2019-01-22 2019-01-22 Large-gesture face recognition method Active CN109918998B (en)


Publications (2)

Publication Number Publication Date
CN109918998A CN109918998A (en) 2019-06-21
CN109918998B true CN109918998B (en) 2023-08-01

Family

ID=66960557


Country Status (1)

Country Link
CN (1) CN109918998B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110427843B (en) * 2019-07-18 2021-07-13 广州利科科技有限公司 Intelligent face recognition method
CN112990047B (en) * 2021-03-26 2024-03-12 南京大学 Multi-pose face verification method combining face angle information

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1885310A (en) * 2006-06-01 2006-12-27 北京中星微电子有限公司 Human face model training module and method, human face real-time certification system and method
CN102024141A (en) * 2010-06-29 2011-04-20 上海大学 Face recognition method based on Gabor wavelet transform and local binary pattern (LBP) optimization
US8385687B1 (en) * 2006-12-06 2013-02-26 Matrox Electronic Systems, Ltd. Methods for determining a transformation between images




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant