CN103246874B - Face identification method based on JSM (joint sparsity model) and sparsity preserving projection - Google Patents


Info

Publication number: CN103246874B (granted; application CN201310162456.3A)
Other versions: CN103246874A (Chinese, zh)
Authority: CN (China)
Inventors: 杨新武, 牛文杰, 赵晓
Original assignee: Beijing University of Technology
Current assignee: Panoramic Zhilian Wuhan Technology Co., Ltd.
Legal status: Active
Prior art keywords: image, class, dimensionality reduction, training, common part

Abstract

The invention relates to a face recognition method based on the joint sparsity model (JSM) and sparsity preserving projection. In this method, a transform basis formed from all training images replaces the random matrix conventionally used in the JSM algorithm, and the JSM algorithm is used to extract the common part and the private parts of each class of training face images. Images in the face database are grouped by person, all images of one person forming one class; the common part expresses the facial features shared by a class of face images, while the private parts express variations in detail such as expression and illumination. The image reconstructed from the sparse common and private parts approximates the original training image, and a dimensionality reduction matrix is obtained by solving the optimization problem that minimizes the reconstruction error. The resulting matrix is applied to the test images, the reduced common and private parts of each class of training images are used to reconstruct each test image, and the test image is assigned to the class of training images with the smallest reconstruction error.

Description

A face recognition method based on the joint sparse model and sparsity preserving projection
Technical field
The invention belongs to the fields of biometric identification and pattern recognition, and in particular relates to sparsity preserving projection and the joint sparse model.
Background technology
With the development of society, the demand for fast and effective automatic identity authentication has become increasingly urgent in every field; identification and verification have important application value in the fields of national security, public safety and military security. Because biometric identification exhibits strong intrinsic stability and individual distinctiveness, it has become the preferred basis for identity recognition.
In recent years face recognition has developed rapidly worldwide as a computer security technique and has attracted increasing attention. Its application background is quite broad: criminal identification in public security systems, certificate verification at customs, surveillance at banks, customs, airports and classified departments, automatic access control systems, video conferencing, and so on. These many applications make face recognition an extremely meaningful and challenging research problem.
Owing to these broad application prospects, face recognition has become one of the research hotspots of computer vision in recent years. After years of development it has made great progress, and researchers have proposed a series of face recognition methods, such as principal component analysis (PCA), independent component analysis (ICA) and linear discriminant analysis (LDA). Wright et al. introduced sparse representation into the field of face recognition and proposed the sparse representation-based classification (SRC) algorithm. This algorithm takes all training face images as a dictionary (each face image is an atom); sparsely representing a test face image amounts to finding its linear combination over these dictionary atoms.
Face recognition methods based on the joint sparse model (Joint Sparsity Model, JSM) have also been proposed, but they suffer from low recognition rates.
Summary of the invention
The purpose of the present invention is to address the low recognition rate of the prior art by providing a face recognition method based on the joint sparse model and sparsity preserving projection.
The present invention is achieved by the following technical solutions:
The face recognition method based on the joint sparse model and sparsity preserving projection is characterised in that a transform basis formed from all training images replaces the random matrix conventionally used in the JSM algorithm, and the JSM algorithm is used to extract the common part and the private parts of each class of training face images (images in the face database are grouped by person, all images of one person forming one class; the common part expresses the facial features shared by a class of face images, while the private parts express variations in detail such as expression and illumination). The image reconstructed from the sparse common and private parts approximates the original training image, and the dimensionality reduction matrix is obtained by solving the optimization problem that minimizes the reconstruction error. The resulting dimensionality reduction matrix is applied to the test image, the reduced common and private parts of each class of training images are used to reconstruct the test image, and the test image is assigned to the class with the smallest reconstruction error. The concrete steps are as follows:
Step 1: preprocessing
1.1) All images in the face database are normalized. Normalization means first resizing each image to a 32 × 32 gray-value matrix and then appending each row of the matrix, in order, after the first row, giving a 1024 × 1 gray-value matrix; all subsequent operations are carried out on these normalized gray-value matrices. The face database contains images of different people with different expressions; the images of one person form one class, several of which are taken as training images and the rest as test images;
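As a minimal sketch of this normalization step (assuming NumPy; the resize itself is taken as done, and the row-stacking order follows the text's description), the 32 × 32 gray-value matrix can be flattened into a 1024 × 1 column like this:

```python
import numpy as np

def normalize(img):
    """Flatten a 32x32 gray-value matrix into a 1024x1 column by
    concatenating its rows in order (sketch of step 1.1)."""
    assert img.shape == (32, 32)
    return img.reshape(-1, 1)  # row-major: row 2 follows row 1, etc.

img = np.arange(1024, dtype=float).reshape(32, 32)  # stand-in for a face image
vec = normalize(img)
print(vec.shape)  # (1024, 1)
```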
Step 2: JSM feature extraction based on the training sample set
The joint sparse model (JSM) algorithm, using the transform basis formed from all training images, is used to extract the common part and the sum of the private parts of each class of training images in the spatial domain, as follows:
2.1) Compute the feature set W_k of the k-th class of training images.
W_k denotes the set consisting of the common part of the k-th class and the private parts of its training images; it is a matrix containing the common part of the k-th class and the private part of every training image in the class. The common part expresses the facial features shared by the face images of one person, and the private parts express variations in detail such as expression and illumination.
Concrete computation: first the training images of all classes are pre-reduced with the principal component analysis (PCA) algorithm, giving the reduced training image set g_k, where g_{k,j} denotes the PCA feature of the j-th face image of the k-th person and J_k is the number of training images in each class. The reduced image set g_k is then taken as the input of the JSM algorithm to obtain W_k. The JSM algorithm is as follows:
where A = [A_1, A_2, ... A_i ..., A_M], M is the total number of training images over all classes, the training images are numbered 1 to M, A_i is the gray-value matrix of the i-th training image after PCA reduction and represents a reduced training image, and T denotes the transpose operation; solving yields W_k.
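The PCA pre-reduction described above can be sketched as follows (assuming NumPy, one image per column, and an SVD-based PCA; the function name is illustrative):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the images (columns of X) onto the top-k principal
    components, giving the reduced training image set (sketch of step 2.1)."""
    Xc = X - X.mean(axis=1, keepdims=True)         # center the images
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k].T @ Xc                         # k-dim PCA features

rng = np.random.default_rng(0)
X = rng.standard_normal((1024, 75))                # 75 stand-in image vectors
G = pca_reduce(X, 75)
print(G.shape)  # (75, 75)
```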
2.2) The common part of the k-th class of training images and the private part of the j-th training image of the k-th class are obtained by solving the minimum ℓ1-norm problem of sparsity preserving projection; the sparse representation of W_k is:
W_k = arg min ||W_k||_1
In the optimal solution, J_k denotes the number of training images of the k-th class; the solution contains the common part of the k-th class of training images and the private part of each training image in the class;
2.3) From the parts extracted in 2.2), compute the common part of the k-th class of training images in the spatial domain and the sum of the private parts, calculated as follows:
where A is the A defined in step 2.1) and the remaining term denotes the private part, in the spatial domain, of the j-th training image of the k-th class.
Step 2.2) obtains the sparse representation of W_k by solving a minimum ℓ1-norm problem, so the parts obtained there are representations in the sparse domain; an inverse transform must therefore be applied to obtain the common and private parts of the face images in the spatial domain.
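The JSM extraction of step 2 can be sketched as follows (assuming NumPy; the stacked dictionary Ψ = [B, C], with B repeating the basis A once per image and C block-diagonal, follows the usual JSM formulation, and the ISTA solver here merely stands in for whichever ℓ1 solver the authors used):

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def jsm_parts(G, A, lam=0.05, iters=500):
    """Split the J images of one class (columns of G) into a shared common
    part and per-image private parts over the basis A (illustrative sketch)."""
    d, J = G.shape
    M = A.shape[1]
    g = G.reshape(-1, 1, order="F")        # stack the J observations
    B = np.tile(A, (J, 1))                 # common-part blocks
    C = np.kron(np.eye(J), A)              # private-part blocks (block-diagonal)
    Psi = np.hstack([B, C])
    # ISTA for min 0.5*||g - Psi w||^2 + lam*||w||_1
    step = 1.0 / np.linalg.norm(Psi, 2) ** 2
    w = np.zeros((Psi.shape[1], 1))
    for _ in range(iters):
        w = soft(w + step * Psi.T @ (g - Psi @ w), lam * step)
    common = A @ w[:M]                     # spatial-domain common part
    privates = [A @ w[M + j * M : M + (j + 1) * M] for j in range(J)]
    return common, privates

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 6))            # toy basis
G = rng.standard_normal((8, 3))            # three images of one class
common, privates = jsm_parts(G, A)
```

Multiplying the sparse codes by A at the end is the inverse transform back to the spatial domain mentioned above.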
Step 3: computing the dimensionality reduction matrix
The images reconstructed from the sparse common and private parts extracted in step 2 are required to approximate the original training images so that the reconstruction error is minimized; the dimensionality reduction matrix is obtained by solving this optimization problem. The steps for computing the dimensionality reduction matrix are as follows:
3.1) According to the joint sparse model, all training images of all classes are reconstructed using the spatial-domain common part and the sum of the private parts of each class extracted in step 2. The gray-value matrix t_kj of the j-th training image of the k-th class after PCA reduction is reconstructed to give the reconstructed image matrix, as follows:
where the first term is the spatial-domain common part of the k-th class of training images obtained in step 2.3), and, following the JSM algorithm, f_k is computed from the sum of the private parts of the k-th class together with the common part obtained with the JSM algorithm in step 2.1), as follows:
r_kj = Ψ′ W_kj
where r_kj denotes the training image matrix t_kj minus the common part of its class, Ψ′ = [B′, C′], B′ = [A, A]^T, A = [A_1, A_2, ... A_i ..., A_M], M is the total number of training images over all classes, the training images are numbered 1 to M, A_i is the gray-value matrix of the i-th training image after PCA reduction, and T denotes the transpose operation; solving yields W_kj.
3.2) The dimensionality reduction matrix is solved using the joint sparse model and the reconstructed image matrices obtained in step 3.1). Requiring the reconstructions of all classes of training images to approximate the original training signals as closely as possible, the dimensionality reduction matrix can be solved from the following formula:
For convenience of derivation the formula is rewritten: i replaces k as the class index, x_ij replaces t_kj as the gray-value matrix of the j-th training image of the i-th class, and the corresponding reconstructed image matrix is renamed accordingly. Then, following the reconstruction method of 3.1), with the common part of the i-th class, and with the common part after reconstruction of the j-th training image of the i-th class replacing f_k of step 3.1), we have
where A is the set of gray-value matrices of all training images, J_i is the number of training samples in the i-th class, K denotes the total number of classes, and w denotes the dimensionality reduction matrix to be solved.
A simple transformation of the above formula gives
Let e_ij be the label vector of the j-th training image of the i-th class, of size M × 1, where M is the number of all training samples; its ((i−1)J_i + j)-th element is 1 and the others are 0. The above formula is then derived as follows:
where the respective terms are equal.
θ_C is formed from the common parts of the classes: each of its columns is the column-vector form of the common part of the class to which the corresponding training image belongs, i.e., the column for the j-th training image of the i-th class holds the common part after the reconstruction of that image. The derivation proceeds as follows, using the above formula:
Using the method of sparsity preserving projection, with the constraint w^T A A^T w = 1, the objective function can be converted into the following optimization problem:
To simplify the expression, this minimization problem can be converted into an equivalent maximization problem, namely:
Finally it can be derived that w consists of the eigenvectors corresponding to the d largest eigenvalues of the generalized eigenvalue problem shown below:
β A^T w = λ A A^T w
where β is defined from the quantities above and λ is the eigenvalue of the equation (computed in the program); the required dimensionality reduction matrix can be obtained in this way.
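The final eigenvector extraction can be sketched as follows (assuming SciPy; `S` stands in for the β Aᵀ-type left-hand matrix, symmetrized here so `eigh` applies, which is an assumption since the text does not give β explicitly):

```python
import numpy as np
from scipy.linalg import eigh

def reduction_matrix(S, AAt, d):
    """Return the eigenvectors of the d largest eigenvalues of the
    generalized problem S w = lambda * AAt w (sketch of step 3.2)."""
    vals, vecs = eigh(S, AAt)        # eigenvalues in ascending order
    return vecs[:, ::-1][:, :d]      # columns ordered by decreasing eigenvalue

rng = np.random.default_rng(2)
R = rng.standard_normal((10, 10))
AAt = R @ R.T + 10 * np.eye(10)      # positive definite, like A A^T
Q = rng.standard_normal((10, 10))
S = Q + Q.T                          # symmetric stand-in for the beta A^T term
w = reduction_matrix(S, AAt, 3)
print(w.shape)  # (10, 3)
```

`eigh(S, AAt)` returns eigenvectors that are AAt-orthonormal, which matches the w^T A A^T w = 1 constraint above.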
Step 4: classification and identification
Using the dimensionality reduction matrix obtained, the test image and the common and private feature images of each class are all reduced with the matrix; the test sample is then reconstructed with the common and private features extracted for each class, and the class with the smallest reconstruction error is the class to which the test sample belongs. The steps of classification and identification are as follows:
4.1) Using the dimensionality reduction matrix w, the test image y and the common part and the sum of the private parts of each class of training images obtained in step 2.3) are all reduced with w; the reductions of y, of the common part of the k-th class of training images, and of the sum of the private parts of the k-th class are calculated as follows:
4.2) Using the reduced common part and reduced sum of private parts of each class obtained in 4.1), the reduced test image is reconstructed class by class, giving the set of reconstructed test image matrices; K denotes the number of classes, and the reconstruction for the i-th class is computed from the reduced common part of the i-th class and the reduced sum of its private parts. Here f_i is computed, using the common part obtained with the JSM algorithm in step 2.1), from the difference between the reduced test image and the common part of the i-th class together with the reduced sum of the private parts of the i-th class, as follows:
ri=Ψ′Wi
where the quantities are those obtained in step 4.1), Ψ′ = [B′, C′], w is the dimensionality reduction matrix obtained above, A = [A_1, A_2, ... A_i ..., A_M], M is the total number of training images over all classes, the training images are numbered 1 to M, A_i is the gray-value matrix of the i-th training image after PCA reduction, and T denotes the transpose operation; solving yields W_i.
4.3) Using the reconstructions, compute the reconstruction error l of the test image with respect to each class, namely:
The test image y is assigned to the class with the smallest l.
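The decision rule of step 4 is a nearest-reconstruction classifier; a minimal sketch (NumPy, hypothetical names, 0-based class labels):

```python
import numpy as np

def classify(y_red, reconstructions):
    """Assign the reduced test image to the class whose reconstruction
    has the smallest error l (sketch of steps 4.2 and 4.3)."""
    errors = [np.linalg.norm(y_red - r) for r in reconstructions]
    return int(np.argmin(errors)), errors

y = np.array([1.0, 0.0, 0.0])
recons = [np.array([0.9, 0.1, 0.0]),   # class 0: close to y
          np.array([0.0, 1.0, 0.0])]   # class 1: far from y
label, errs = classify(y, recons)
print(label)  # 0
```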
Beneficial effects
The face recognition method based on the joint sparse model and sparsity preserving projection not only occupies little storage space but also takes both within-class and between-class correlation into account, improving recognition accuracy.
Brief description of the drawings
Fig. 1 is the flow chart of the method of the present invention.
Fig. 2 illustrates the JSM feature extraction based on the training sample set.
Detailed description
Fig. 1 is the flow chart of the face recognition method based on the joint sparse model and sparsity preserving projection proposed by the present invention. The whole flow is divided into a training module and a recognition module. The training module mainly preprocesses the training images, extracts the common part and the private parts of each class of training images, and computes the dimensionality reduction matrix by reconstructing the training samples so that their error with respect to the original training samples is minimized. The recognition module preprocesses an unknown test image and reconstructs it with the common and private features of each class; the class with the smallest reconstruction error is the class to which the test image belongs.
The implementation process of the present invention is described in detail with reference to Fig. 1. The embodiments of the invention are implemented on the premise of the technical solution of the present invention; a detailed embodiment and concrete operating process are given, but the protection scope of the present invention is not limited to the following embodiments.
The embodiment uses a public face database, the Yale face database. It contains 15 people with 11 pictures each, mainly covering changes in illumination condition and expression. The 15 people are treated as 15 classes, each with 11 face images. In the experiment, 5 face images of each class are randomly selected as training images and the remainder serve as test images, so the database provides 75 training images in total.
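The embodiment's sampling protocol (5 of the 11 images of each class for training) can be sketched like this (standard library only; the seed and function name are illustrative):

```python
import random

def split_class(image_ids, n_train=5, seed=0):
    """Randomly pick n_train images of one class as training images;
    the remainder become test images (sketch of the Yale protocol)."""
    rng = random.Random(seed)
    train = sorted(rng.sample(image_ids, n_train))
    test = [i for i in image_ids if i not in train]
    return train, test

train, test = split_class(list(range(11)))
print(len(train), len(test))  # 5 6
```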
A detailed explanation of each issue involved in the technical solution of this invention follows:
Step 1: preprocessing
1.1) All 165 images in the face database are normalized: each image is first compressed to 32 × 32 pixels, then each row of the gray-value matrix is appended, in order, after the first row, giving a 1024 × 1 gray-value matrix; all subsequent operations are carried out on these normalized gray-value matrices.
Step 2: JSM feature extraction based on the training sample set (Fig. 2)
In this step the transform basis formed from all training images is substituted into the JSM algorithm to extract features.
2.1) First the training images of all classes are pre-reduced with the principal component analysis (PCA) algorithm, with the reduced feature dimension set to 75, giving the reduced training image set g_k, where g_{k,j} denotes the PCA feature of the j-th face image of the k-th person, J_k denotes the number of training images in each class, k = 1, 2, ..., 15, and j = 1, 2, ..., 5. The reduced image set g_k is then taken as the input of the JSM algorithm to obtain W_k. The JSM algorithm is as follows:
where A = [A_1, A_2, ... A_i ..., A_M], M is the total number of training images over all classes, M = 75, the training images are numbered 1 to M, and A_i is the gray-value matrix of the i-th training image after PCA reduction. There are 75 training images in total and each is reduced to dimension 75 by PCA, so A is of size 75 × 75; T denotes the transpose operation, and solving yields W_k.
2.2) The common part of the k-th class of training images and the private part of the j-th training image of the k-th class are obtained by solving the minimum ℓ1-norm problem of sparsity preserving projection; the sparse representation of W_k is:
W_k = arg min ||W_k||_1
In the optimal solution, J_k denotes the number of training images of the k-th class; the solution contains the common part of the k-th class of training images and the private part of each training image in the class;
2.3) From the parts extracted in 2.2), compute the common part of the k-th class of training images in the spatial domain and the sum of the private parts, calculated as follows:
where A is the A defined in step 2.1) and the remaining term denotes the private part, in the spatial domain, of the j-th training image of the k-th class.
Step 3: computing the dimensionality reduction matrix
3.1) According to the joint sparse model, all training images of all classes are reconstructed using the spatial-domain common part and the sum of the private parts of each class extracted in step 2. The gray-value matrix t_kj of the j-th training image of the k-th class after PCA reduction is reconstructed to give the reconstructed image matrix, as follows:
where the first term is the spatial-domain common part of the k-th class of training images obtained in step 2.3), and, following the JSM algorithm, f_k is computed from the sum of the private parts of the k-th class together with the common part obtained with the JSM algorithm in step 2.1), as follows:
r_kj = Ψ′ W_kj
where r_kj denotes the training image matrix t_kj minus the common part of its class, Ψ′ = [B′, C′], B′ = [A, A]^T, A = [A_1, A_2, ... A_i ..., A_M], M is the total number of training images over all classes, the training images are numbered 1 to M, A_i is the gray-value matrix of the i-th training image after PCA reduction, and T denotes the transpose operation; solving yields W_kj.
3.2) The dimensionality reduction matrix is solved using the joint sparse model and the reconstructed image matrices obtained in step 3.1). Requiring the reconstructions of all classes of training images to approximate the original training signals as closely as possible, the dimensionality reduction matrix can be solved from the following formula:
For convenience of derivation the formula is rewritten: i replaces k as the class index, x_ij replaces t_kj as the gray-value matrix of the j-th training image of the i-th class, and the corresponding reconstructed image matrix is renamed accordingly. Then, following the reconstruction method of 3.1), with the common part of the i-th class, and with the common part after reconstruction of the j-th training image of the i-th class replacing f_k of step 3.1), we have
where A is the set of gray-value matrices of all training images, J_i is the number of training samples in the i-th class, K denotes the total number of classes, and w denotes the dimensionality reduction matrix to be solved.
A simple transformation of the above formula gives
Let e_ij be the label vector of the j-th training image of the i-th class, of size M × 1, where M is the number of all training samples; its ((i−1)J_i + j)-th element is 1 and the others are 0. The above formula is then derived as follows:
Using the method of sparsity preserving projection, with the constraint w^T A A^T w = 1, the objective function can be converted into the following optimization problem:
To simplify the expression, this minimization problem can be converted into an equivalent maximization problem, namely:
Finally it can be derived that w consists of the eigenvectors corresponding to the d largest eigenvalues of the generalized eigenvalue problem shown below:
β A^T w = λ A A^T w
where θ_C is formed from the common parts of the classes, each column corresponding to the column-vector form of the common part of the class to which that training image belongs, i.e., the common part after the reconstruction of the j-th training image of the i-th class; λ denotes the eigenvalue of the equation. In the program implementation, the above variables are substituted into the formula in turn to obtain the required dimensionality reduction matrix.
Step 4: classification and identification
4.1) Choose a test image y. Using the dimensionality reduction matrix w, the test image y and the common part and the sum of the private parts of each class of training images obtained in step 2.3) are all reduced with w; the reductions of y, of the common part of the k-th class of training images, and of the sum of the private parts of the k-th class are calculated as follows:
4.2) Using the reduced common part and reduced sum of private parts of each class obtained in 4.1), the reduced test image is reconstructed class by class, giving the set of reconstructed test image matrices; K denotes the number of classes, and the reconstruction for the i-th class is computed from the reduced common part of the i-th class and the reduced sum of its private parts. Here f_i is computed, using the common part obtained with the JSM algorithm in step 2.1), from the difference between the reduced test image and the common part of the i-th class together with the reduced sum of the private parts of the i-th class, as follows:
ri=Ψ′Wi
where the quantities are those obtained in step 4.1), Ψ′ = [B′, C′], w is the dimensionality reduction matrix obtained above, A = [A_1, A_2, ... A_i ..., A_M], M is the total number of training images over all classes, the training images are numbered 1 to M, A_i is the gray-value matrix of the i-th training image after PCA reduction, and T denotes the transpose operation; solving yields W_i.
4.3) Using the reconstructions, compute the reconstruction error l of the test image with respect to each class, namely:
The test image y is assigned to the class with the smallest l.
The experimental results of the present invention are explained in detail below:
The Yale and CMU-AMP face databases are used in the experiments. The Yale face database contains 15 volunteers with 11 pictures each, 165 pictures in total, mainly covering changes in illumination condition and expression. In the experiments each image is normalized to 32 × 32 pixels. For each class of face images, 5 different images are randomly selected as training samples and the remainder serve as test samples; 5 runs are carried out and the results averaged. The experimental results are shown in Table 1. Random selection of the training samples and repeated experiments ensure the stability of the results.
Table 1. Comparison of recognition rates
The CMU-AMP face database contains 13 volunteers with 75 pictures each, 975 pictures in total, mainly covering happy, angry and surprised expressions. In the experiments each image is normalized to 32 × 32 pixels. For each class of face images, 5 different images are randomly selected as training samples and the remainder serve as test samples; 5 runs are carried out and the results averaged. The experimental results are shown in Table 2.
Table 2. Comparison of recognition rates
'Dimension' in Table 1 denotes the dimension to which the test images are reduced with the obtained dimensionality reduction matrix. In the tables, 'JSM' denotes the method based on the joint sparse model, and the inventive method is the proposed method of computing the dimensionality reduction matrix. The experimental results on the Yale and CMU-AMP databases show that the method presented herein is superior to the JSM method at low dimensions.
Because the idea of sparsity preserving projection is introduced into the joint sparse model, the common and private features extracted by sparse representation over the transform basis formed from all training samples take both within-class and between-class correlation into account. Combined with sparse projection, a computation formula for the dimensionality reduction matrix is derived, ensuring that the signal reconstructed from the common and private parts of the training samples approximates the original signal as closely as possible; the data in the original high-dimensional space are projected to a low-dimensional space while the main features of the original images are retained. Experiments on the face databases show that the recognition rate is indeed improved.

Claims (1)

1. A face recognition method based on the joint sparse model and sparsity preserving projection, characterised in that it comprises the following steps:
S1: preprocessing:
S1.1: all images in the face database are normalized. Normalization means first representing each image as a gray-value matrix and then appending each row of the matrix, in order, after the first row to obtain a column matrix, namely the normalized gray-value matrix; all subsequent operations are carried out on these normalized gray-value matrices. The face database contains images of different people with different expressions; the images of one person form one class, several of which are taken as training images and the rest as test images;
S2: the joint sparse model (JSM) algorithm, using the transform basis formed from all training images, is used to extract the spatial-domain common part and private parts of each class of training images, where the common part and the private part of the k-th class of training images are computed as follows:
S2.1: compute the feature set W_k of the k-th class of training images.
First the normalized gray-value matrices of the training images of the k-th class are reduced with the principal component analysis (PCA) algorithm, giving the reduced training image set g_k; the reduced image set g_k is then taken as the input of the JSM algorithm to obtain W_k. The JSM algorithm is as follows:
where the transform basis A = [A_1, A_2, ... A_i ..., A_M], M is the total number of training images over all classes, A_i is the gray-value matrix of the i-th training image after PCA reduction and represents a reduced training image, and T denotes the transpose operation; solving yields W_k.
S2.2: the common part of the k-th class of training images and the private part of the j-th training image of the k-th class are obtained by solving the minimum ℓ1-norm problem of sparsity preserving projection; the sparse representation of W_k is:
W_k = arg min ||W_k||_1
In the optimal solution, J_k denotes the number of training images of the k-th class; the solution contains the common part of the k-th class of training images and the private part of each training image in the class;
S2.3: from the parts extracted in S2.2, compute the spatial-domain common part of the k-th class of training images and the private parts, calculated as follows:
where A is the A defined in step S2.1 and the remaining term denotes the spatial-domain private part of the j-th training image of the k-th class;
S3: Compute the dimensionality-reduction matrix:
S3.1 According to the joint sparse model, reconstruct all training images of all classes from the spatial-domain common parts x_C^k and private parts x_j^k extracted in S2. The grey-value matrix t_kj of the jth training image of the kth class after PCA dimensionality reduction is reconstructed into the matrix t*_kj as follows:
t*_kj = x_C^k + f_k,

where x_C^k is the spatial-domain common part of the kth class obtained in step S2.3, and f_k is the private component of t_kj, recovered from the common part with the JSM algorithm of step S2.1 by sparse-coding the residual:

r_kj = Ψ' W_kj,

where r_kj = t_kj − x_C^k is the training image matrix minus the common part of its class, Ψ' = [B', C'] with B' = [A, A, ..., A]^T built by repeating the transform base, A = [A_1, A_2, ..., A_i, ..., A_M], M is the total number of training images over all classes, A_i is the grey-value matrix of the ith training image after PCA dimensionality reduction, and T denotes the transpose operation. Solving this system for W_kj yields f_k.
S3.2 Solve for the dimensionality-reduction matrix using the joint sparse model and the reconstructed image matrices t*_kj obtained in step S3.1.
Following sparsity preserving projection, the dimensionality-reduction matrix w is obtained by solving the minimization problem with the smallest total reconstruction error,

w = arg min Σ_{k=1..K} Σ_{j=1..J_k} ||w^T t_kj − w^T t*_kj||^2,

where t_kj is the grey-value matrix of the jth training image of the kth class after PCA dimensionality reduction, t*_kj is its reconstruction from step S3.1, J_k is the number of training images in the kth class, and K is the total number of classes;
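One plausible reading of the S3.2 minimization (an assumption on my part, since the exact objective appears only as an image in the source) is to pick orthonormal projection directions minimizing the total reconstruction error, which reduces to the eigenvectors of the residual scatter matrix with the smallest eigenvalues:

```python
import numpy as np

def solve_projection(T: np.ndarray, T_hat: np.ndarray, d: int) -> np.ndarray:
    """T, T_hat: one image per column (originals and JSM reconstructions).
    Return the d x D projection w minimizing sum ||w @ t - w @ t_hat||^2
    over orthonormal rows: the eigenvectors of the residual scatter
    matrix E @ E.T with the smallest eigenvalues."""
    E = T - T_hat
    S = E @ E.T
    vals, vecs = np.linalg.eigh(S)     # eigenvalues in ascending order
    return vecs[:, :d].T               # d x D projection matrix

# residuals confined to the first coordinate: the learned projection
# should annihilate them while staying orthonormal
D, N = 4, 5
E = np.zeros((D, N)); E[0] = 1.0
T_hat = np.random.default_rng(1).normal(size=(D, N))
T = T_hat + E
w = solve_projection(T, T_hat, 2)
```

The design intuition matches the abstract: directions along which the JSM reconstruction already agrees with the original images are the ones worth keeping after reduction.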
S4: Classification and identification:
S4.1 Using the dimensionality-reduction matrix w obtained above, reduce the dimensionality of the test image y and of the common and private parts of each class of training images obtained in S2.3: the test image y becomes y~ = w^T y; the common part x_C^k of the kth class becomes w^T x_C^k; and the private parts x_j^k of the kth class become w^T x_j^k;
S4.2 Using the reduced common part and the sum of the reduced private parts of each class obtained in S4.1, reconstruct the reduced test image y~ for each class, obtaining the set of reconstructed test image matrices y~_i, i = 1, ..., K, where K is the total number of classes. Here y~_i is the reconstruction of y~ computed from the reduced common part of the ith class and the sum of the reduced private parts of the ith class, that is, y~_i = w^T x_C^i + f_i, where f_i is recovered from the difference between the reduced test image y~ and the reduced common part of the ith class using the JSM algorithm of S2.1:

r_i = Ψ' W_i,

where r_i = y~ − w^T x_C^i is obtained from S4.1, Ψ' = [B', C'] is as defined in step S3.1, w is the dimensionality-reduction matrix, A = [A_1, A_2, ..., A_i, ..., A_M], M is the total number of training images over all classes (the training images are numbered 1 to M), A_i is the grey-value matrix of the ith training image after PCA dimensionality reduction, and T denotes the transpose operation. Solving this system for W_i yields f_i;
S4.3 Using the reduced test image y~ and its per-class reconstructions y~_i from S4.2, compute the reconstruction error of each class, l_i = ||y~ − y~_i||^2.
The test image y is assigned to the class with the smallest reconstruction error l_i.
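Steps S4.2 and S4.3 amount to nearest-reconstruction classification. A minimal sketch with hypothetical names, assuming the per-class reconstructions have already been computed as in S4.2:

```python
import numpy as np

def classify(y_red: np.ndarray, recons: list[np.ndarray]) -> int:
    """Assign the reduced test image y_red to the class whose
    reconstruction has the smallest error l_i = ||y_red - y_i||^2."""
    errs = [float(np.sum((y_red - r) ** 2)) for r in recons]
    return int(np.argmin(errs))

# three candidate class reconstructions; the second is closest
y = np.array([1.0, 2.0, 3.0])
recons = [np.array([0.0, 0.0, 0.0]),
          np.array([1.1, 2.0, 2.9]),
          np.array([3.0, 2.0, 1.0])]
label = classify(y, recons)
```

The returned index identifies the class whose common-plus-private reconstruction best explains the test image.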
CN201310162456.3A 2013-05-03 2013-05-03 Face identification method based on JSM (joint sparsity model) and sparsity preserving projection Active CN103246874B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310162456.3A CN103246874B (en) 2013-05-03 2013-05-03 Face identification method based on JSM (joint sparsity model) and sparsity preserving projection


Publications (2)

Publication Number Publication Date
CN103246874A CN103246874A (en) 2013-08-14
CN103246874B true CN103246874B (en) 2017-02-15

Family

ID=48926387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310162456.3A Active CN103246874B (en) 2013-05-03 2013-05-03 Face identification method based on JSM (joint sparsity model) and sparsity preserving projection

Country Status (1)

Country Link
CN (1) CN103246874B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103440504B (en) * 2013-09-13 2016-08-17 中国科学院自动化研究所 A kind of robust biological feather recognition method based on structure priori
CN104484890B (en) * 2014-12-18 2017-02-22 上海交通大学 Video target tracking method based on compound sparse model
CN106548454A (en) * 2016-09-08 2017-03-29 清华大学 The method and apparatus for processing medical image
CN107133648B (en) * 2017-05-05 2020-08-04 南京航空航天大学 One-dimensional range profile identification method based on adaptive multi-scale fusion sparse preserving projection
CN107273842B (en) * 2017-06-09 2020-07-03 北京工业大学 Selective integrated face recognition method based on CSJOGA algorithm
CN107368803A (en) * 2017-07-14 2017-11-21 广州智慧城市发展研究院 A kind of face identification method and system based on classification rarefaction representation
CN109919052B (en) * 2019-02-22 2021-05-14 武汉捷丰天泽信息科技有限公司 Criminal investigation simulation image model generation method, criminal investigation simulation image method and device
CN111767906B (en) * 2020-09-01 2020-11-27 腾讯科技(深圳)有限公司 Face detection model training method, face detection device and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102521609A (en) * 2011-12-02 2012-06-27 湖南大学 Near-infrared and visible light face image recognition method based on distributed compression sensing theory
CN102737234A (en) * 2012-06-21 2012-10-17 北京工业大学 Gabor filtering and joint sparsity model-based face recognition method
CN102768732A (en) * 2012-06-13 2012-11-07 北京工业大学 Face recognition method integrating sparse preserving mapping and multi-class property Bagging


Also Published As

Publication number Publication date
CN103246874A (en) 2013-08-14

Similar Documents

Publication Publication Date Title
CN103246874B (en) Face identification method based on JSM (joint sparsity model) and sparsity preserving projection
CN106503687B (en) Merge the monitor video system for identifying figures and its method of face multi-angle feature
CN102737234B (en) Based on the face identification method of Gabor filtering and joint sparse model
Perez et al. Methodological improvement on local Gabor face recognition based on feature selection and enhanced Borda count
Abdullah et al. Optimizing face recognition using PCA
Bereta et al. Local descriptors in application to the aging problem in face recognition
CN102982322A (en) Face recognition method based on PCA (principal component analysis) image reconstruction and LDA (linear discriminant analysis)
CN103714326B (en) One-sample face identification method
CN102637251A (en) Face recognition method based on reference features
CN106934359A (en) Various visual angles gait recognition method and system based on high order tensor sub-space learning
CN103164689A (en) Face recognition method and face recognition system
Imani et al. Principal component discriminant analysis for feature extraction and classification of hyperspectral images
CN101976360A (en) Sparse characteristic face recognition method based on multilevel classification
CN101976352A (en) Various illumination face identification method based on small sample emulating and sparse expression
CN106022223A (en) High-dimensional local-binary-pattern face identification algorithm and system
Nimbarte et al. Age Invariant Face Recognition using Convolutional Neural Network.
Yousaf et al. A robust and efficient convolutional deep learning framework for age‐invariant face recognition
CN103714340A (en) Self-adaptation feature extracting method based on image partitioning
CN103605993B (en) Image-to-video face identification method based on distinguish analysis oriented to scenes
CN111259780A (en) Single-sample face recognition method based on block linear reconstruction discriminant analysis
CN114937298A (en) Micro-expression recognition method based on feature decoupling
CN114241564A (en) Facial expression recognition method based on inter-class difference strengthening network
Niu et al. Discriminative video representation with temporal order for micro-expression recognition
CN101216878A (en) Face identification method based on general non-linear discriminating analysis
CN102289679B (en) Method for identifying super-resolution of face in fixed visual angle based on related characteristics and nonlinear mapping

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200114

Address after: 430014 room 1201, building 5, contemporary science and Technology Park, guannanyuan 1st Road, Donghu New Technology Development Zone, Wuhan City, Hubei Province

Patentee after: Panoramic Zhilian (Wuhan) Technology Co.,Ltd.

Address before: 100124 Chaoyang District, Beijing Ping Park, No. 100

Patentee before: Beijing University of Technology

TR01 Transfer of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A face recognition method based on joint sparse model and sparse preserving mapping

Effective date of registration: 20231204

Granted publication date: 20170215

Pledgee: Wuhan area branch of Hubei pilot free trade zone of Bank of China Ltd.

Pledgor: Panoramic Zhilian (Wuhan) Technology Co.,Ltd.

Registration number: Y2023980068905

PE01 Entry into force of the registration of the contract for pledge of patent right