CN110796022A - Low-resolution face recognition method based on multi-manifold coupling mapping - Google Patents


Publication number
CN110796022A
CN110796022A (application CN201910954656.XA)
Authority
CN
China
Prior art keywords
face
resolution
mapping
image
coupling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910954656.XA
Other languages
Chinese (zh)
Other versions
CN110796022B (en)
Inventor
张凯兵
郑冬冬
李敏奇
景军锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aoyuan Smart Life Service Guangzhou Group Co ltd
Shenzhen Wanzhida Technology Co ltd
Original Assignee
Xian Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Polytechnic University
Priority to CN201910954656.XA
Publication of CN110796022A
Application granted
Publication of CN110796022B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172: Classification, e.g. identification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a low-resolution face recognition method based on multi-manifold coupling mapping, which comprises two stages: a training stage and a testing stage. In the training stage, HR-LR face features are transformed into a common feature subspace through supervised learning of two linear coupled mappings, and the discrimination capability and separability of the coupled mapping matrices are enhanced by jointly exploiting the local manifold geometric structure information and the label information of the high-resolution and low-resolution face images. In the testing stage, for a given LR test face image, the two learned coupled mappings project the LR test-set face features and the HR image-set face features into the common feature subspace for matching and recognition. By exploiting the local manifold geometry and label information of the HR-LR face images, the method enhances the discrimination capability and separability of the coupled mapping matrices and improves LR face recognition performance.

Description

Low-resolution face recognition method based on multi-manifold coupling mapping
Technical Field
The invention belongs to the technical field of face recognition methods, and relates to a low-resolution face recognition method based on multi-manifold coupling mapping.
Background
Face recognition is one of the most important research topics in the field of computer vision. At present, High-Resolution (HR) face recognition under controlled conditions is mature and is being widely deployed in production practice. Under uncontrolled conditions, however, the performance of face recognition systems degrades drastically under adverse factors such as pose, illumination, expression, occlusion and resolution, and cannot meet the requirements of practical applications. Low-Resolution (LR) face recognition has therefore received wide attention from researchers.
Over the past several decades, many different low-resolution face recognition methods have been proposed. According to the recognition principle, they fall into three general categories: LR face recognition based on reconstructed Super-Resolution (SR) images, LR face recognition based on a common feature subspace, and LR face recognition based on deep learning.
LR face recognition methods based on reconstructed SR images have developed rapidly. They mainly use image SR reconstruction to obtain an HR face image with good visual quality and then perform face similarity matching. Although SR-based methods can obtain HR face images with high visual quality, they easily introduce artifacts at key facial feature points, which seriously harms recognition performance; moreover, as surveillance networks reach large-area coverage, the computational complexity of these methods is too high to meet practical requirements.
In recent years, LR face recognition based on a common feature subspace has become an effective way to solve the mismatched feature dimensions of HR-LR face images, owing to its relatively simple algorithms and low time consumption. These methods first map HR-LR face images of different dimensions into a common feature subspace by learning coupled HR-LR mappings, and then complete similarity matching of the HR-LR face images in this subspace of equal dimension. Two common solutions exist for the LR face problem in a common feature subspace. The first is LR face recognition based on dictionary learning and sparse representation, which sparsely codes the local structural features of the face and then transforms them into a low-dimensional feature space to match LR faces. The second is LR face recognition based on coupled mappings, which generally adopts one of 3 mapping modes: 1) down-sampling the HR face image to the same feature dimension as the LR face image for matching; 2) up-sampling the LR face image to the same feature dimension as the HR face image for matching; 3) mapping the HR and LR face images simultaneously into a common feature subspace for matching. Such methods transform the HR-LR training face features into a common feature subspace to learn the HR-LR coupled mapping matrices, and then use the learned HR and LR coupled mapping matrices to transform and recognize the LR test face features in that subspace.
With the rapid development of deep learning, deep-learning-based LR face recognition methods have been proposed in succession; compared with traditional machine learning algorithms, deep learning is more advantageous when processing large numbers of training samples. These methods mainly extract face features through a convolutional neural network and optimize the network parameters with effective activation and loss functions, realizing end-to-end HR-LR face recognition.
Disclosure of Invention
The invention aims to provide a low-resolution face recognition method based on multi-manifold coupling mapping, which utilizes local manifold geometric structure information and label information of an HR-LR face image, enhances the discrimination capability and separability of a coupling mapping relation matrix, and improves the recognition performance of an LR face.
The technical scheme adopted by the invention is that a low-resolution face identification method based on multi-manifold coupling mapping is implemented according to the following steps:
step 1, selecting N HR face images from a standard face database to form an HR image set, randomly selecting $N_t$ images, namely half of the face images of each person, from the HR image set as an HR training set, performing smooth down-sampling on the HR training set to generate an LR training set, and constructing class labels $L=\{l_i\}_{i=1}^{N_t}$ of the training face image samples;
step 2, based on a coupling mapping learning method, simultaneously mapping the face images in the HR training set and the LR training set to a public characteristic subspace to obtain a formula based on coupling mapping and perform matrixing;
step 3, adding the local geometric structure information and the discrimination information of the samples into the matrixized formula of step 2, and solving the HR coupled mapping matrix $P_H$ and the LR coupled mapping matrix $P_L$;
Step 4, performing smooth down-sampling on the other half of the face images in the HR image set to generate an LR test set, the total number of test-set images being $N_p$;
And 5, transforming the HR image set and the LR test set to the common feature subspace to obtain the HR-LR face mapping features $\tilde{X}^G$ and $\tilde{X}^P$;
step 6, applying the nearest neighbor classifier to the HR-LR face mapping features projected in the common feature subspace, and classifying to obtain the class labels of the face mapping features $\tilde{X}^P$.
The present invention is also characterized in that,
the step 1 specifically comprises the following steps: selecting N HR facial images from standard facial database to form HR image set
Figure BDA0002226882530000037
Randomly selecting a half of face images containing each person from an HR image set as an HR training set:and (3) performing smooth downsampling on the HR training set to generate an LR training set:
Figure BDA0002226882530000039
wherein
Figure BDA00022268825300000310
Representing the ith low-resolution face image,showing the ith high-resolution face image,
Figure BDA0002226882530000041
representing the total number of the images of the training set;
the class labels of the training face image samples are as follows:
Figure BDA0002226882530000042
the standard face database comprises an FERET and CMU PIE face database, and the generation of the LR training set by smoothly downsampling the HR training set specifically comprises the following steps: for the high-resolution FERET and CMU PIE face libraries, the resolution is respectively as follows: the resolution of the FERET face library is 72 multiplied by 72, the resolution of the CMU PIE face library is 32 multiplied by 28, the FERET face library is multiplied by 4 and 9, the CMU PIE face library is subjected to 2 and 4 times of smooth down-sampling, the resolution of the FERET face library in the generated LR training set is 18 multiplied by 18 and 8 multiplied by 8, and the resolution of the CMU PIE face library is 16 multiplied by 14 and 8 multiplied by 7.
The step 2 specifically comprises the following steps:
Step 2.1, based on the coupled mapping learning method, simultaneously mapping the face images in the HR training set and the LR training set to a common feature subspace, expressed as:
$$J(f_H,f_L)=\sum_{i=1}^{N_t}\left\|f_H(x_i^H)-f_L(x_i^L)\right\|^2$$
wherein the HR feature vectors are $\{x_i^H\}_{i=1}^{N_t}$ and the corresponding LR feature vectors are $\{x_i^L\}_{i=1}^{N_t}$; $f_H$ represents the mapping function from the HR face images to the common feature subspace, $f_L$ correspondingly represents the mapping function from the LR face images to the common feature subspace, and d represents the dimension of the common feature subspace;
Step 2.2, setting $f_H(x)=P_H^T x$ and $f_L(x)=P_L^T x$, the formula of step 2.1 is matrixized as:
$$J(P_H,P_L)=\sum_{i=1}^{N_t}\left\|P_H^T x_i^H-P_L^T x_i^L\right\|^2$$
wherein $P_H$ is the HR coupled mapping matrix and $P_L$ is the LR coupled mapping matrix.
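The matrixized objective of step 2.2 can be evaluated directly; the sketch below (illustrative dimensions and random data, not the patent's code) computes $J(P_H,P_L)$ for given mapping matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
d_H, d_L, d, N_t = 324, 64, 30, 100    # illustrative dims: 18x18 HR, 8x8 LR features

X_H = rng.standard_normal((d_H, N_t))  # HR training features, one column per image
X_L = rng.standard_normal((d_L, N_t))  # LR training features (same column order)
P_H = rng.standard_normal((d_H, d))    # HR coupled mapping matrix
P_L = rng.standard_normal((d_L, d))    # LR coupled mapping matrix

def coupled_objective(P_H, P_L, X_H, X_L):
    """J(P_H, P_L) = sum_i || P_H^T x_i^H - P_L^T x_i^L ||^2"""
    diff = P_H.T @ X_H - P_L.T @ X_L   # d x N_t matrix of per-sample residuals
    return np.sum(diff ** 2)

J = coupled_objective(P_H, P_L, X_H, X_L)
print(J >= 0)  # True: a sum of squares
```

Minimizing this quantity drives the projected HR and LR features of the same person toward the same point in the common subspace.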
The step 3 specifically comprises the following steps:
Step 3.1, adding the local geometric structure information and the discrimination information of the samples into the matrixized formula of step 2, expressed as:
$$J(P_H,P_L)=\sum_{i,j}\left\|P_H^T x_i^H-P_L^T x_j^L\right\|^2\left(W_{ij}^{s}+a\,W_{ij}^{w}-b\,W_{ij}^{b}\right)\qquad(1)$$
wherein a and b respectively represent the weight factors of the intra-class and inter-class discrimination terms, $W^{s}$ is the local similarity matrix, $W^{w}$ represents the intra-class discrimination matrix, $W^{b}$ represents the inter-class discrimination matrix, and $i=1,2,\dots,N_t$, $j=1,2,\dots,N_t$;
Step 3.2, setting $W=W^{s}+a\,W^{w}-b\,W^{b}$, $D=D_s+a\,D_w-b\,D_b$ and $D'=D'_s+a\,D'_w-b\,D'_b$, equation (1) is then expressed as:
$$J(P_H,P_L)=\operatorname{tr}\!\left(P_H^T X^H D X^{H\,T}P_H+P_L^T X^L D' X^{L\,T}P_L-2\,P_H^T X^H W X^{L\,T}P_L\right)\qquad(2)$$
wherein $D_s$, $D_w$ and $D_b$ are diagonal matrices, respectively defined by the row sums $D_{s,ii}=\sum_j W_{ij}^{s}$, $D_{w,ii}=\sum_j W_{ij}^{w}$ and $D_{b,ii}=\sum_j W_{ij}^{b}$, and $D'_s$, $D'_w$ and $D'_b$ by the corresponding column sums;
Step 3.3, setting
$$Y=\begin{pmatrix}X^H&0\\0&X^L\end{pmatrix},\qquad P=\begin{pmatrix}P_H\\P_L\end{pmatrix},\qquad G=\begin{pmatrix}D&-W\\-W^T&D'\end{pmatrix},$$
equation (2) can be simplified to $J(P_H,P_L)=\operatorname{tr}(P^T Y G Y^T P)$;
Step 3.4, minimizing the objective function of equation (2) amounts to solving the optimization problem: $\min J(P_H,P_L)$ s.t. $P^T Y Y^T P=I$ and $P^T Y\mathbf{1}=0$, where $I$ is the identity matrix of size d×d and $\mathbf{1}=[1,1,\dots,1]^T$ is the all-ones vector with $2N_t$ entries. Setting $E=YGY^T$ and $F=YY^T$, the solution P of the optimization problem is obtained from the 2nd to (d+1)th generalized eigenvectors of
$$E\,p=\lambda F\,p.\qquad(3)$$
Expanding equation (3) yields:
$$\begin{pmatrix}X^H D X^{H\,T}&-X^H W X^{L\,T}\\-X^L W^T X^{H\,T}&X^L D' X^{L\,T}\end{pmatrix}\begin{pmatrix}p_H\\p_L\end{pmatrix}=\lambda\begin{pmatrix}X^H X^{H\,T}&0\\0&X^L X^{L\,T}\end{pmatrix}\begin{pmatrix}p_H\\p_L\end{pmatrix}.\qquad(4)$$
Simplifying equation (4) gives the coupled pair of equations:
$$X^H D X^{H\,T}p_H-X^H W X^{L\,T}p_L=\lambda X^H X^{H\,T}p_H,\qquad X^L D' X^{L\,T}p_L-X^L W^T X^{H\,T}p_H=\lambda X^L X^{L\,T}p_L,\qquad(5)$$
and the two coupled mapping matrices $P_H$ and $P_L$ are obtained by jointly solving equation (5).
The local similarity matrix $W^{s}$ is represented as follows:
$$W_{ij}^{s}=\begin{cases}\exp\!\left(-\dfrac{\left\|x_i^H-x_j^H\right\|^2}{c\,\sigma^2}\right),&x_j^H\text{ is among the }n\text{ same-class nearest neighbors of }x_i^H,\\0,&\text{otherwise,}\end{cases}$$
wherein n is the number of neighborhood samples belonging to HR training face images of the same class, $\sigma$ represents the Gaussian kernel width, and c is a scale factor;
the intra-class discrimination matrix $W^{w}$ is represented as follows:
$$W_{ij}^{w}=\begin{cases}1,&x_j^H\text{ is among the }k_w\text{ intra-class nearest neighbors of }x_i^H,\\0,&\text{otherwise,}\end{cases}$$
wherein $k_w$ represents the number of intra-class neighborhood samples of the HR training face images;
the inter-class discrimination matrix $W^{b}$ is represented as follows:
$$W_{ij}^{b}=\begin{cases}1,&x_j^H\text{ is among the }k_b\text{ inter-class nearest neighbors of }x_i^H,\\0,&\text{otherwise,}\end{cases}$$
wherein $k_b$ represents the number of inter-class neighborhood samples of the HR training face images.
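A sketch of building the three weight matrices from HR training features. The Gaussian same-class form and the 0/1 neighbor indicators follow the descriptions above, but the helper name, the neighbor-search details, and the kernel-width estimate are illustrative assumptions:

```python
import numpy as np

def build_weight_matrices(X_H, labels, n=3, k_w=3, k_b=3, c=1.0):
    """Local similarity W_s (Gaussian over n same-class neighbors) and 0/1
    intra-class W_w / inter-class W_b neighbor indicators (assumed forms)."""
    N = X_H.shape[1]
    D2 = ((X_H[:, :, None] - X_H[:, None, :]) ** 2).sum(axis=0)  # pairwise sq. dists
    sigma2 = D2[D2 > 0].mean()                                   # kernel width estimate
    W_s = np.zeros((N, N)); W_w = np.zeros((N, N)); W_b = np.zeros((N, N))
    for i in range(N):
        order = np.argsort(D2[i])          # nearest first (self at position 0)
        same = [j for j in order if j != i and labels[j] == labels[i]]
        diff = [j for j in order if labels[j] != labels[i]]
        for j in same[:n]:                 # n same-class neighbors: Gaussian weight
            W_s[i, j] = np.exp(-D2[i, j] / (c * sigma2))
        for j in same[:k_w]:               # k_w intra-class neighbors
            W_w[i, j] = 1.0
        for j in diff[:k_b]:               # k_b inter-class neighbors
            W_b[i, j] = 1.0
    return W_s, W_w, W_b

X_H = np.random.rand(16, 12)               # 12 HR faces: 4 classes of 3 images
labels = np.repeat(np.arange(4), 3)
W_s, W_w, W_b = build_weight_matrices(X_H, labels, n=2, k_w=2, k_b=3)
```

With classes of 3 images, each row of `W_s` and `W_w` has exactly 2 nonzero entries and each row of `W_b` exactly 3, matching the chosen n, k_w and k_b.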
The step 4 specifically comprises the following steps: performing smooth down-sampling on the other half of the high-resolution face images in the HR image set to generate the LR test set $X^P=\{x_i^P\}_{i=1}^{N_p}$, wherein $x_i^P$ represents the ith low-resolution test face image and $N_p$ represents the total number of test-set images;
wherein high resolution means a FERET face library resolution of 72×72 and a CMU PIE face library resolution of 32×28; the smooth down-sampling factors are 4 and 9 for the FERET face library and 2 and 4 for the CMU PIE face library; and low resolution correspondingly means FERET face library resolutions of 18×18 and 8×8 and CMU PIE face library resolutions of 16×14 and 8×7.
The step 5 specifically comprises the following steps: according to the two coupled mapping matrices $P_H$ and $P_L$ solved in the training stage, respectively transforming the HR image set $X^G$ and the LR test set $X^P$ to the common feature subspace to obtain the HR-LR face mapping features $\tilde{X}^G=P_H^T X^G$ and $\tilde{X}^P=P_L^T X^P$.
the step 6 specifically comprises the following steps: applying nearest neighbor classifier to HR-LR face mapping features projected in common feature subspace
Figure BDA0002226882530000077
Classifying to obtain the face mapping characteristics
Figure BDA0002226882530000078
Class label of
Figure BDA0002226882530000079
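Steps 5 and 6 together amount to projecting the gallery and the probes with the two mappings and matching by nearest neighbor; a minimal sketch (illustrative names and toy data, with identity mappings standing in for the learned matrices):

```python
import numpy as np

def recognize(P_H, P_L, X_G, X_P, gallery_labels):
    """Project the HR gallery and LR probes into the common subspace, then
    label each probe by its nearest gallery neighbor (Euclidean distance)."""
    F_G = P_H.T @ X_G                      # d x N_G gallery mapping features
    F_P = P_L.T @ X_P                      # d x N_p probe mapping features
    # Squared distances between every probe column and every gallery column.
    d2 = (F_P ** 2).sum(0)[:, None] + (F_G ** 2).sum(0)[None, :] - 2 * F_P.T @ F_G
    return gallery_labels[np.argmin(d2, axis=1)]

# Toy check: identity mappings and probes identical to the gallery.
rng = np.random.default_rng(0)
dim = 8
X_G = rng.standard_normal((dim, 5))
labels = np.arange(5)
pred = recognize(np.eye(dim), np.eye(dim), X_G, X_G, labels)
print(pred)  # [0 1 2 3 4]: each probe matches itself
```

In the method itself, `P_H` and `P_L` would be the matrices solved in step 3 and `X_P` the smoothly down-sampled test faces.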
The invention has the beneficial effects that:
(1) by adding the local geometric structure information of the samples into the objective function, the method effectively preserves the local neighborhood relationships of the HR-LR face images in their respective original feature spaces, which greatly enhances the separability of the samples in the projected common feature subspace;
(2) the method simultaneously considers the intra-class and inter-class discrimination information of the samples, effectively improving their discriminability in the projected common feature subspace;
(3) the invention fully considers the discrimination information and the label information of the samples: by exploiting the local manifold geometric structure information and label information of the HR-LR face images, same-class HR-LR face images are kept as close as possible in the obtained common feature subspace while different-class HR-LR face images are kept as far apart as possible, thereby enhancing the discrimination capability and separability of the coupled mapping matrices and improving LR face recognition performance.
Drawings
FIG. 1 is a system framework diagram of the low-resolution face recognition method based on multi-manifold coupling mapping;
FIG. 2 compares the effect of feature dimension on recognition performance on the FERET face library between the present invention and existing methods;
FIG. 3 compares the effect of feature dimension on recognition performance on the CMU PIE face library between the present invention and existing methods;
FIG. 4 compares the effect of rank level on recognition performance on the FERET face library between the present invention and existing methods;
FIG. 5 compares the effect of rank level on recognition performance on the CMU PIE face library between the present invention and existing methods;
FIG. 6 compares the effect of resolution on recognition performance on the FERET face library between the present invention and existing methods;
FIG. 7 compares the effect of resolution on recognition performance on the CMU PIE face library between the present invention and existing methods.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
The invention relates to a low-resolution face recognition method based on multi-manifold coupling mapping, which comprises a training stage and a testing stage, and is implemented according to the following steps:
first, training phase
Step 1, selecting N HR face images from a standard face database (the FERET and CMU PIE face databases) to form an HR image set $X=\{x_i\}_{i=1}^{N}$; randomly selecting half of the face images of each person from the HR image set as the HR training set $X^H=\{x_i^H\}_{i=1}^{N_t}$; and performing smooth down-sampling on the high-resolution HR training set (FERET face library resolution 72×72, CMU PIE face library resolution 32×28; down-sampling factors 4 and 9 for the FERET face library, 2 and 4 for the CMU PIE face library) to generate the low-resolution training set $X^L=\{x_i^L\}_{i=1}^{N_t}$ (FERET resolutions 18×18 and 8×8, CMU PIE resolutions 16×14 and 8×7), wherein $x_i^L$ represents the ith low-resolution face image, $x_i^H$ represents the ith high-resolution face image, and $N_t$ represents the total number of training-set images; the class labels of the training face image samples are $L=\{l_i\}_{i=1}^{N_t}$.
as shown in fig. 1, step 2, based on the coupling mapping learning method, simultaneously mapping the face images in the HR training set and the LR training set to the common feature subspace, obtaining a formula based on the coupling mapping and performing matrixing, specifically:
step 2.1, based on the coupling mapping learning method, simultaneously mapping the face images in the HR training set and the LR training set to a public feature subspace, and expressing that:
Figure BDA0002226882530000098
wherein the HR feature vector is:
Figure BDA0002226882530000099
corresponding LR feature vectors:
Figure BDA00022268825300000910
fHmapping function representing HR face image to common feature subspace, corresponding fLRepresenting a mapping function of the LR face image to a public feature subspace, and d representing the dimension of the public feature subspace;
step 2.2, set fH(x)=PH Tx and fL(x)=PL Tx, the formula of step 2.1 is expressed in a matrixing way, and is expressed as:
Figure BDA00022268825300000911
wherein P isHMapping matrices and P for HR couplingLMapping a matrix for LR coupling;
step 3, adding the local geometric structure information and the discrimination information of the sample into the formula subjected to matrixing in the step 2, and solving an HR coupling mapping matrix PHAnd LR coupling mapping matrix PLAnd setting a dieType parameters a, b, c, n, kwAnd kbThe method specifically comprises the following steps:
step 3.1, adding the local geometric structure information and the discrimination information of the sample into the formula subjected to matrixing in the step 2, and expressing as follows:
Figure BDA0002226882530000101
wherein a and b respectively represent the weight factors of the discrimination items in the class and the discrimination items between the classes,
Figure BDA0002226882530000102
is a matrix of local similarities which is a matrix of local similarities,
Figure BDA0002226882530000103
representing an intra-class discrimination matrix;
Figure BDA0002226882530000104
represents an inter-class discrimination matrix, i is 1,2, 3 … Nt,j=1、2、 3…Nt
Step 3.2, setting
Figure BDA0002226882530000105
And
Figure BDA0002226882530000106
equation (1) is then expressed as:
wherein D iss、DwAnd DbAre diagonal matrices, respectively defined as
Figure BDA0002226882530000108
Figure BDA0002226882530000109
Step 3.3, setting
Figure RE-GDA00022986898900001011
Andequation (2) can be simplified to J (P)H,PL)=tr(PTYGYTP);
Step 3.4, the objective function of the formula (2) is minimized to solve the following optimization problem: j (P)H,PL)s.t.PTYYTP=Iand PTY1 ═ 0, where I is a unit array of size d × d, 1 ═ 1,1]TIs composed of 2N t1 vector of term, set
Figure BDA0002226882530000111
And
Figure BDA0002226882530000112
the solution P to the optimization problem can be obtained by solving the 2 nd to (d +1) th generalized eigenvectors Ep ═ λ Fp (3) of P,
expanding equation (3) yields:
Figure BDA0002226882530000113
the formula (4) is simplified to obtain:
two coupling mapping matrixes P can be obtained by jointly solving the formula (5)HAnd PL
The local similarity matrix $W^{s}$ is represented as follows:
$$W_{ij}^{s}=\begin{cases}\exp\!\left(-\dfrac{\left\|x_i^H-x_j^H\right\|^2}{c\,\sigma^2}\right),&x_j^H\text{ is among the }n\text{ same-class nearest neighbors of }x_i^H,\\0,&\text{otherwise,}\end{cases}$$
wherein n is the number of neighborhood samples belonging to HR training face images of the same class, $\sigma$ represents the Gaussian kernel width, and c is a scale factor;
the intra-class discrimination matrix $W^{w}$ is represented as follows:
$$W_{ij}^{w}=\begin{cases}1,&x_j^H\text{ is among the }k_w\text{ intra-class nearest neighbors of }x_i^H,\\0,&\text{otherwise,}\end{cases}$$
wherein $k_w$ represents the number of intra-class neighborhood samples of the HR training face images;
the inter-class discrimination matrix $W^{b}$ is represented as follows:
$$W_{ij}^{b}=\begin{cases}1,&x_j^H\text{ is among the }k_b\text{ inter-class nearest neighbors of }x_i^H,\\0,&\text{otherwise,}\end{cases}$$
wherein $k_b$ represents the number of inter-class neighborhood samples of the HR training face images.
Second, testing stage
And 4, performing smooth down-sampling on the other half of the high-resolution face images in the HR image set to generate the LR test set $X^P=\{x_i^P\}_{i=1}^{N_p}$, wherein $x_i^P$ represents the ith low-resolution test face image and $N_p$ represents the total number of test-set images;
wherein high resolution means a FERET face library resolution of 72×72 and a CMU PIE face library resolution of 32×28; the smooth down-sampling factors are 4 and 9 for the FERET face library and 2 and 4 for the CMU PIE face library; and low resolution correspondingly means FERET face library resolutions of 18×18 and 8×8 and CMU PIE face library resolutions of 16×14 and 8×7;
and 5, transforming the HR image set and the LR test set to a public feature subspace to obtain an HR-LR face mapping feature
Figure BDA0002226882530000129
And
Figure BDA00022268825300001210
the method specifically comprises the following steps:
two coupling matrixes P solved according to the training stageHAnd PLRespectively collecting HR images XGAnd LR test set XPTransforming to common feature subspace to obtain HR-LR face mapping features
Figure BDA00022268825300001211
And
Figure BDA0002226882530000131
step 6, applying the nearest neighbor classifier to the HR-LR face mapping feature projected in the public feature subspace
Figure BDA0002226882530000132
Classifying to obtain the face mapping characteristicsClass label of
Figure BDA0002226882530000134
To verify the effectiveness of the present invention, the following simulations were performed:
on the same training set and test set, some benchmark methods for extracting face features by using Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are selected in the form of comparison experiments, such as HR-PCA (extracting features directly using PCA for HR face images), HR-LDA (extracting features directly using LDA for HR face images), customer-PCA (performing customer interpolation for LR face images and extracting features using PCA), customer-LDA (performing customer interpolation for LR face images and extracting features using PCA), and a Coupled Local Preserving Mapping (CLPMs) method for comparison to verify the effectiveness of the present invention, wherein the CLPMs method specifically refers to LI B, IEEE change, scan G, low-preserving method [ process ] for example, 2010,17(1):20-23.".
Experiment one: the invention conducts experiments at Rank-1 with LR resolutions of 8×8 (FERET face library) and 8×7 (CMU PIE face library), and analyzes the influence of feature dimension on recognition performance. As can be seen from the simulation results of fig. 2 and 3, the recognition performance of the invention on the 2 standard face libraries approaches or exceeds the HR-LDA benchmark method, is far superior to the other methods, and reaches its best at comparatively low feature dimensions. Because the method considers not only the local geometric structure information of the samples but also their intra-class and inter-class discrimination information, the learned pair of mappings effectively improves the discriminability and separability of the samples.
Experiment two: in pattern recognition, Rank-n is an important index for evaluating recognition performance; it measures the probability that the first n face images in the matching result contain a correct match. After the candidate faces are sorted from high to low similarity, the higher the correctly matched face is ranked, the better the algorithm. Rank-n is adopted here to evaluate the invention, and figures 4 and 5 show its recognition performance at different rank levels. As can be seen from the simulation results of fig. 4, taking the FERET face library as an example, the probability of the target face appearing among the n (n = 1, 2, ..., 10) most similar faces reaches about 94% already at Rank-1; on the 2 standard face libraries, the recognition rate of the method is clearly superior to the other methods at every rank level, and it rises slowly and finally flattens as the rank level increases. This experiment fully shows that the invention has good stability.
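Rank-n as described above can be computed from a probe-gallery distance matrix; a minimal sketch with a hypothetical toy example:

```python
import numpy as np

def rank_n_accuracy(dist, probe_labels, gallery_labels, n):
    """Fraction of probes whose correct identity appears among the n
    closest gallery matches (the Rank-n point of a CMC curve)."""
    order = np.argsort(dist, axis=1)                 # gallery indices, nearest first
    top_n = gallery_labels[order[:, :n]]             # labels of the n best matches
    return np.mean([probe_labels[i] in top_n[i] for i in range(len(probe_labels))])

# Toy distance matrix: probe 0 matches at rank 1, probe 1 only at rank 2.
dist = np.array([[0.1, 0.9, 0.8],
                 [0.3, 0.4, 0.5]])
g_labels = np.array([0, 1, 2])
p_labels = np.array([0, 1])
print(rank_n_accuracy(dist, p_labels, g_labels, 1))  # 0.5
print(rank_n_accuracy(dist, p_labels, g_labels, 2))  # 1.0
```

Sweeping n from 1 to the gallery size yields the full CMC curve plotted in figures 4 and 5.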
Experiment three: two resolutions are set for each standard face library to evaluate the recognition performance of the invention and analyze the influence of resolution, namely 8×8 and 18×18 for the FERET face library and 8×7 and 16×14 for the CMU PIE face library. As can be seen from the simulation results of fig. 6 and 7, the recognition performance of the invention on the 2 standard face libraries is better than that of the other methods at both resolutions, and it does not fall below the other methods as the resolution changes, which fully shows that the invention is robust to face image resolution.
The results of the three experiments show that, compared with existing coupling-mapping-based low-resolution face recognition methods, the method achieves stronger discriminability and separability of the samples, and its recognition performance is far better than that of other similar methods.

Claims (9)

1. A low-resolution face recognition method based on multi-manifold coupling mapping is characterized by being implemented according to the following steps:
step 1, selecting N HR facial images from a standard facial database to form an HR image set, and randomly selecting N HR facial images from the HR image settTaking half of each human face image as an HR training set, carrying out smooth downsampling on the HR training set to generate an LR training set, and constructing a class label of a training human face image sample, wherein,
Figure FDA0002226882520000011
step 2, based on a coupling mapping learning method, simultaneously mapping the face images in the HR training set and the LR training set to a public characteristic subspace to obtain a formula based on coupling mapping and perform matrixing;
step 3, adding the local geometric structure information and the discrimination information of the samples into the matrix-form formula of step 2, and solving the HR coupling mapping matrix P_H and the LR coupling mapping matrix P_L;
step 4, performing smooth downsampling on the other half of the face images in the HR image set to generate an LR test set, the total number of test-set images being N_p = N/2;
step 5, transforming the HR image set X_G and the LR test set X_P to the common feature subspace to obtain the HR-LR face mapping features Y_G = P_H^T X_G and Y_P = P_L^T X_P;
step 6, applying a nearest neighbor classifier to the HR-LR face mapping features projected in the common feature subspace to classify each feature in Y_P and obtain its class label.
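The six steps of claim 1 can be sketched end to end on synthetic stand-in data. This is a hedged illustration only: random vectors replace real face images, block averaging replaces the smooth downsampling, and only the basic coupling term of the objective is used, without the discriminant terms introduced later in the claims:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Step 1: synthetic "HR faces" (16-dim) for 3 identities, 8 samples each;
# "LR" versions are block averages over groups of 4 dimensions
centers = 3.0 * rng.normal(size=(3, 16))
X_H = np.vstack([c + 0.3 * rng.normal(size=(8, 16)) for c in centers]).T  # 16 x 24
labels = np.repeat(np.arange(3), 8)
X_L = X_H.reshape(4, 4, 24).mean(axis=1)                                  # 4 x 24

# Steps 2-3 (simplified): couple each HR/LR pair (weight matrix W = I) and
# solve the generalized eigenproblem E p = lambda F p
N = X_H.shape[1]
W = np.eye(N)
D = np.diag(W.sum(axis=1))
Y = np.block([[X_H, np.zeros((16, N))], [np.zeros((4, N)), X_L]])  # 20 x 48
G = np.block([[D, -W], [-W.T, D]])
E, F = Y @ G @ Y.T, Y @ Y.T + 1e-6 * np.eye(20)   # small ridge keeps F invertible
_, vecs = eigh(E, F)                   # ascending generalized eigenvalues
P = vecs[:, 1:4]                       # 2nd..(d+1)th eigenvectors, d = 3
P_H, P_L = P[:16], P[16:]

# Steps 4-6: project gallery (HR) and probes (LR), nearest neighbor classify
Y_G, Y_P = P_H.T @ X_H, P_L.T @ X_L
dists = np.linalg.norm(Y_P[:, :, None] - Y_G[:, None, :], axis=0)  # 24 x 24
preds = labels[np.argmin(dists, axis=1)]
acc = (preds == labels).mean()
```

The key design point is that HR and LR images are never compared directly: both are projected through their own learned mapping into one subspace where distances become comparable.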
2. The method for low-resolution face recognition based on multi-manifold coupling mapping according to claim 1, wherein step 1 specifically comprises: selecting N HR face images from a standard face database to form an HR image set X = {x_i}, i = 1, 2, …, N; randomly selecting half of the face images of each person from the HR image set as an HR training set X_H = {x_i^H}, i = 1, 2, …, N_t; performing smooth downsampling on the HR training set to generate an LR training set X_L = {x_i^L}, i = 1, 2, …, N_t; wherein x_i^L represents the ith low-resolution face image, x_i^H represents the ith high-resolution face image, and N_t = N/2 represents the total number of training-set images;
the class labels of the training face image samples are l_i, i = 1, 2, …, N_t.
3. The method according to claim 2, wherein the standard face database comprises the FERET and CMU PIE face databases, and generating the LR training set by smooth downsampling of the HR training set specifically comprises: the high-resolution FERET face library has a resolution of 72 × 72 and the high-resolution CMU PIE face library a resolution of 32 × 28; the FERET face library is smoothly downsampled by factors of 4 and 9, and the CMU PIE face library by factors of 2 and 4, so that in the generated LR training set the FERET face library has resolutions of 18 × 18 and 8 × 8, and the CMU PIE face library has resolutions of 16 × 14 and 8 × 7.
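One plausible reading of the smooth downsampling in claim 3 (an assumption — the patent does not specify the filter) is block averaging, which smooths and decimates in one step and reproduces the stated resolutions:

```python
import numpy as np

def smooth_downsample(img, factor):
    """Average each factor x factor block: a simple anti-aliasing
    (smoothing) filter combined with subsampling by `factor`."""
    h, w = img.shape
    h2, w2 = h // factor, w // factor
    # group pixels into blocks, then average within each block
    return img[:h2 * factor, :w2 * factor].reshape(h2, factor, w2, factor).mean(axis=(1, 3))

feret = np.random.rand(72, 72)       # FERET-size HR face
pie = np.random.rand(32, 28)         # CMU PIE-size HR face
lr_shapes = [smooth_downsample(feret, 4).shape,   # (18, 18)
             smooth_downsample(feret, 9).shape,   # (8, 8)
             smooth_downsample(pie, 2).shape,     # (16, 14)
             smooth_downsample(pie, 4).shape]     # (8, 7)
```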
4. The method for low-resolution face recognition based on multi-manifold coupling mapping according to claim 2, wherein step 2 specifically comprises:
step 2.1, based on the coupled mapping learning method, simultaneously mapping the face images in the HR training set and the LR training set to a common feature subspace by minimizing the distance between paired features:
J(f_H, f_L) = Σ_{i=1}^{N_t} ||f_H(x_i^H) − f_L(x_i^L)||²,
wherein f_H represents the mapping function from an HR face image to the common feature subspace, f_L correspondingly represents the mapping function from an LR face image to the common feature subspace, and d represents the dimension of the common feature subspace;
step 2.2, setting f_H(x) = P_H^T x and f_L(x) = P_L^T x, the formula of step 2.1 is expressed in matrix form as:
J(P_H, P_L) = Σ_{i=1}^{N_t} ||P_H^T x_i^H − P_L^T x_i^L||²,
wherein P_H is the HR coupling mapping matrix and P_L is the LR coupling mapping matrix.
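The matrix form of step 2.2 can be illustrated as follows (a sketch with hypothetical toy matrices; columns of X_H and X_L hold the paired HR/LR training images, and the degradation map A is invented for the example):

```python
import numpy as np

def coupled_objective(P_H, P_L, X_H, X_L):
    """J(P_H, P_L) = sum_i ||P_H^T x_i^H - P_L^T x_i^L||^2, with the
    training images stored column-wise in X_H (d_H x N_t) and X_L (d_L x N_t)."""
    diff = P_H.T @ X_H - P_L.T @ X_L   # d x N_t paired differences in the subspace
    return float(np.sum(diff ** 2))

rng = np.random.default_rng(1)
X_H = rng.normal(size=(4, 5))          # 5 HR samples, 4-dim
A = rng.normal(size=(2, 4))            # hypothetical degradation: LR = A @ HR
X_L = A @ X_H
# choosing P_H = A^T and P_L = I aligns every pair exactly, so J = 0
J = coupled_objective(A.T, np.eye(2), X_H, X_L)
```

The objective is zero exactly when the two mappings send each HR/LR pair to the same point of the common subspace, which is the alignment the claim describes.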
5. The method according to claim 4, wherein step 3 specifically comprises:
step 3.1, adding the local geometric structure information and the discrimination information of the samples into the matrix-form formula of step 2, expressed as:
J(P_H, P_L) = Σ_{i,j} ||P_H^T x_i^H − P_L^T x_j^L||² W_ij^s + a Σ_{i,j} ||P_H^T x_i^H − P_L^T x_j^L||² W_ij^w − b Σ_{i,j} ||P_H^T x_i^H − P_L^T x_j^L||² W_ij^b    (1)
wherein a and b respectively represent the weight factors of the intra-class discrimination term and the inter-class discrimination term, W^s is the local similarity matrix, W^w represents the intra-class discrimination matrix, W^b represents the inter-class discrimination matrix, and i = 1, 2, 3, …, N_t, j = 1, 2, 3, …, N_t;
step 3.2, setting W = W^s + aW^w − bW^b, equation (1) is then expressed as:
J(P_H, P_L) = tr(P_H^T X_H D X_H^T P_H) + tr(P_L^T X_L D X_L^T P_L) − 2 tr(P_H^T X_H W X_L^T P_L)    (2)
wherein D = D_s + aD_w − bD_b, and D_s, D_w and D_b are diagonal matrices, respectively defined as (D_s)_ii = Σ_j W_ij^s, (D_w)_ii = Σ_j W_ij^w and (D_b)_ii = Σ_j W_ij^b;
step 3.3, setting Y = [X_H, 0; 0, X_L], P = [P_H; P_L] and G = [D, −W; −W^T, D], equation (2) can be simplified to J(P_H, P_L) = tr(P^T Y G Y^T P);
step 3.4, minimizing the objective function of equation (2) amounts to solving the following optimization problem: min J(P_H, P_L) s.t. P^T Y Y^T P = I and P^T Y 1 = 0, where I is an identity matrix of size d × d and 1 = [1, 1, …, 1]^T is a vector of 2N_t ones; setting E = Y G Y^T and F = Y Y^T, the solution P of the optimization problem is obtained from the 2nd to (d+1)th generalized eigenvectors p of
E p = λ F p    (3)
expanding equation (3) yields:
[X_H D X_H^T, −X_H W X_L^T; −X_L W^T X_H^T, X_L D X_L^T][p_H; p_L] = λ [X_H X_H^T, 0; 0, X_L X_L^T][p_H; p_L]    (4)
simplifying equation (4) gives:
X_H D X_H^T p_H − X_H W X_L^T p_L = λ X_H X_H^T p_H,
X_L D X_L^T p_L − X_L W^T X_H^T p_H = λ X_L X_L^T p_L    (5)
the two coupling mapping matrices P_H and P_L are obtained by jointly solving equation (5).
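Step 3.4 reduces to a generalized eigenproblem. The following is a minimal sketch of solving it (hedged: W is assumed to be the already-combined weight matrix, the row/column-sum choice for the diagonal blocks is an assumption, and a small ridge is added so F is positive definite):

```python
import numpy as np
from scipy.linalg import eigh

def solve_coupled_mappings(X_H, X_L, W, d, eps=1e-6):
    """Build E = Y G Y^T and F = Y Y^T, then keep the 2nd..(d+1)th
    generalized eigenvectors of E p = lambda F p."""
    Dr = np.diag(W.sum(axis=1))            # row sums weight the HR block
    Dc = np.diag(W.sum(axis=0))            # column sums weight the LR block
    dH, dL = X_H.shape[0], X_L.shape[0]
    Y = np.block([[X_H, np.zeros((dH, X_L.shape[1]))],
                  [np.zeros((dL, X_H.shape[1])), X_L]])
    G = np.block([[Dr, -W], [-W.T, Dc]])
    E = Y @ G @ Y.T
    F = Y @ Y.T + eps * np.eye(dH + dL)    # regularized to be positive definite
    _, vecs = eigh(E, F)                   # eigenvalues in ascending order
    P = vecs[:, 1:d + 1]                   # skip the first (trivial) eigenvector
    return P[:dH], P[dH:]                  # P_H (d_H x d), P_L (d_L x d)

rng = np.random.default_rng(2)
X_H, X_L = rng.normal(size=(8, 20)), rng.normal(size=(4, 20))
P_H, P_L = solve_coupled_mappings(X_H, X_L, np.eye(20), d=3)
```

Stacking P_H over P_L into one vector p is what lets both mappings be recovered from a single eigen-decomposition rather than an alternating optimization.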
6. The method for low-resolution face recognition based on multi-manifold coupling mapping according to claim 5, wherein the local similarity matrix W^s is represented as follows:
W_ij^s = exp(−||x_i^H − x_j^H||² / (c σ²)) if x_j^H is among the n nearest same-class neighbors of x_i^H, and W_ij^s = 0 otherwise,
wherein n is the number of neighborhood samples belonging to the same class of HR training face images, σ represents the Gaussian kernel width, and c is a scale factor;
the intra-class discrimination matrix W^w is represented as follows:
W_ij^w = 1 if x_j^H is among the k_w nearest intra-class neighbors of x_i^H, and W_ij^w = 0 otherwise,
wherein k_w represents the number of neighborhood samples within an HR training face image class;
the inter-class discrimination matrix W^b is represented as follows:
W_ij^b = 1 if x_j^H is among the k_b nearest inter-class neighbors of x_i^H, and W_ij^b = 0 otherwise,
wherein k_b represents the number of neighborhood samples between HR training face image classes.
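The three weight matrices of claim 6 might be constructed as follows (a sketch under assumptions: binary intra/inter-class neighbor weights and a mean-distance kernel width, since the patent leaves these details to formulas rendered as images):

```python
import numpy as np

def build_weight_matrices(X_H, labels, n=3, k_w=3, k_b=3, c=1.0):
    """Local similarity W_s (Gaussian weights over same-class n-neighborhoods)
    and binary intra-class (k_w) / inter-class (k_b) discriminant matrices,
    all computed on HR training images stored as columns of X_H."""
    N = X_H.shape[1]
    D2 = np.sum((X_H[:, :, None] - X_H[:, None, :]) ** 2, axis=0)  # squared distances
    sigma2 = D2[D2 > 0].mean()          # Gaussian kernel width (an assumed choice)
    Ws, Ww, Wb = np.zeros((N, N)), np.zeros((N, N)), np.zeros((N, N))
    for i in range(N):
        order = np.argsort(D2[i])                      # neighbors, closest first
        same = [j for j in order if j != i and labels[j] == labels[i]]
        diff = [j for j in order if labels[j] != labels[i]]
        for j in same[:n]:
            Ws[i, j] = np.exp(-D2[i, j] / (c * sigma2))   # local similarity
        Ww[i, same[:k_w]] = 1.0                           # intra-class neighbors
        Wb[i, diff[:k_b]] = 1.0                           # inter-class neighbors
    return Ws, Ww, Wb

rng = np.random.default_rng(3)
X_H = rng.normal(size=(5, 12))          # 12 HR samples, 5-dim
labels = np.repeat([0, 1, 2], 4)
Ws, Ww, Wb = build_weight_matrices(X_H, labels)
```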
7. The method for low-resolution face recognition based on multi-manifold coupling mapping according to claim 5, wherein step 4 specifically comprises:
performing smooth downsampling on the other half of the high-resolution face images in the HR image set to generate an LR test set X_P = {x_i^P}, i = 1, 2, …, N_p, wherein x_i^P represents the ith low-resolution test face image and N_p = N/2 represents the total number of test-set images;
wherein high resolution means a resolution of 72 × 72 for the FERET face library and 32 × 28 for the CMU PIE face library;
the smooth downsampling is respectively by factors of 4 and 9 for the FERET face library and by factors of 2 and 4 for the CMU PIE face library;
low resolution accordingly means resolutions of 18 × 18 and 8 × 8 for the FERET face library and of 16 × 14 and 8 × 7 for the CMU PIE face library.
8. The method according to claim 7, wherein step 5 specifically comprises:
transforming the HR image set and the LR test set to the common feature subspace to obtain the HR-LR face mapping features, specifically: according to the two coupling matrices P_H and P_L solved in the training stage, respectively transforming the HR image set X_G and the LR test set X_P to the common feature subspace to obtain the HR-LR face mapping features Y_G = P_H^T X_G and Y_P = P_L^T X_P.
9. The method according to claim 8, wherein step 6 specifically comprises: applying a nearest neighbor classifier to the HR-LR face mapping features Y_P projected in the common feature subspace to classify them and obtain the class label of each face mapping feature in Y_P.
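The nearest neighbor classification of claim 9 can be sketched as follows (toy projected features; in practice Y_G and Y_P would come from the learned coupling mappings):

```python
import numpy as np

def nn_classify(Y_G, g_labels, Y_P):
    """Assign each probe column of Y_P (d x N_p) the label of its nearest
    gallery column in Y_G (d x N), by Euclidean distance in the subspace."""
    dists = np.linalg.norm(Y_P[:, :, None] - Y_G[:, None, :], axis=0)  # N_p x N
    return g_labels[np.argmin(dists, axis=1)]

Y_G = np.array([[0.0, 5.0, 0.0],
                [0.0, 0.0, 5.0]])          # 3 gallery features in a 2-D subspace
g_labels = np.array([10, 20, 30])
Y_P = np.array([[0.2, 4.8],
                [0.1, 0.3]])               # 2 probe features
preds = nn_classify(Y_G, g_labels, Y_P)    # → [10, 20]
```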
CN201910954656.XA 2019-10-09 2019-10-09 Low-resolution face recognition method based on multi-manifold coupling mapping Active CN110796022B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910954656.XA CN110796022B (en) 2019-10-09 2019-10-09 Low-resolution face recognition method based on multi-manifold coupling mapping

Publications (2)

Publication Number Publication Date
CN110796022A true CN110796022A (en) 2020-02-14
CN110796022B CN110796022B (en) 2023-07-21

Family

ID=69438877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910954656.XA Active CN110796022B (en) 2019-10-09 2019-10-09 Low-resolution face recognition method based on multi-manifold coupling mapping

Country Status (1)

Country Link
CN (1) CN110796022B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101697197A (en) * 2009-10-20 2010-04-21 西安交通大学 Method for recognizing human face based on typical correlation analysis spatial super-resolution
CN102693419A (en) * 2012-05-24 2012-09-26 武汉大学 Super-resolution face recognition method based on multi-manifold discrimination and analysis
CN105550649A (en) * 2015-12-09 2016-05-04 武汉工程大学 Extremely low resolution human face recognition method and system based on unity coupling local constraint expression
CN106056067A (en) * 2016-05-27 2016-10-26 南京邮电大学 Corresponding relationship prediction-based low-resolution face recognition method
WO2017219391A1 (en) * 2016-06-24 2017-12-28 深圳市唯特视科技有限公司 Face recognition system based on three-dimensional data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LU, TAO et al.: "Very low-resolution face recognition based on image super-resolution extreme learning machine", Journal of Computer Applications *
WANG, YING et al.: "Multi-modality face recognition based on deep networks", Computer Science *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111695455A (en) * 2020-05-28 2020-09-22 西安工程大学 Low-resolution face recognition method based on coupling discrimination manifold alignment
CN111695455B (en) * 2020-05-28 2023-11-10 广西申能达智能技术有限公司 Low-resolution face recognition method based on coupling discrimination manifold alignment
CN112287995A (en) * 2020-10-26 2021-01-29 深圳大学 Low-resolution image identification method based on multilayer coupling mapping
WO2022087778A1 (en) * 2020-10-26 2022-05-05 深圳大学 Low-resolution image recognition method based on multi-layer coupled mapping
CN112287995B (en) * 2020-10-26 2023-08-15 深圳大学 Low-resolution image recognition method based on multi-layer coupling mapping
CN115564808A (en) * 2022-09-01 2023-01-03 宁波大学 Multi-resolution hyperspectral/SAR image registration method based on public space-spectrum subspace
CN115564808B (en) * 2022-09-01 2023-08-25 宁波大学 Multi-resolution hyperspectral/SAR image registration method based on public space-spectrum subspace

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230625

Address after: 511400 No.192-198, No. 66, Hanxingzhi Street, Zhongcun Street, Panyu District, Guangzhou, Guangdong

Applicant after: Aoyuan Smart Life Service (Guangzhou) Group Co.,Ltd.

Address before: 518000 1002, Building A, Zhiyun Industrial Park, No. 13, Huaxing Road, Henglang Community, Longhua District, Shenzhen, Guangdong Province

Applicant before: Shenzhen Wanzhida Technology Co.,Ltd.

Effective date of registration: 20230625

Address after: 518000 1002, Building A, Zhiyun Industrial Park, No. 13, Huaxing Road, Henglang Community, Longhua District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Wanzhida Technology Co.,Ltd.

Address before: 710048 Shaanxi province Xi'an Beilin District Jinhua Road No. 19

Applicant before: XI'AN POLYTECHNIC University

GR01 Patent grant