CN109977807B - Face feature template protection method and system based on complex matrix - Google Patents

Face feature template protection method and system based on complex matrix

Info

Publication number
CN109977807B
Authority
CN
China
Prior art keywords
matrix
gray
scrambling
image
face image
Prior art date
Legal status
Active
Application number
CN201910180893.5A
Other languages
Chinese (zh)
Other versions
CN109977807A (en)
Inventor
邵珠宏
孙浩浩
徐子涵
尚媛园
赵晓旭
丁辉
刘铁
Current Assignee
Dongguan Pengbo Information Technology Co ltd
Hunan Zunyi Electronic Technology Co.,Ltd.
Original Assignee
Capital Normal University
Priority date
Filing date
Publication date
Application filed by Capital Normal University
Priority to CN201910180893.5A
Publication of CN109977807A
Application granted
Publication of CN109977807B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/172 Classification, e.g. identification
    • G06V40/50 Maintenance of biometric data or enrolment thereof
    • G06V40/53 Measures to keep reference information secret, e.g. cancellable biometrics

Abstract

The invention discloses a face feature template protection method and system based on a complex matrix. The method comprises the following steps: acquiring a plurality of gray-scale face images and calculating the local variance map corresponding to each gray-scale face image; constructing a plurality of complex matrices from the gray-scale face images and the local variance maps; calculating a covariance matrix from the training images among the gray-scale face images and the complex matrices, and generating an eigenvector matrix from the covariance matrix; scrambling the eigenvector matrix to generate a scrambling matrix, and projecting the complex matrices with the scrambling matrix to generate the features corresponding to the face images; and comparing the test images with the training images among the gray-scale face images and recognizing the features corresponding to the face images through a classifier. The method is based on the image local variance map and two-dimensional principal component analysis, can protect the privacy of face images and the security of the data, and is applicable to the field of identity recognition and authentication.

Description

Face feature template protection method and system based on complex matrix
Technical Field
The invention relates to the technical field of image processing, in particular to a face feature template protection method and system based on a complex matrix.
Background
Traditional identity authentication systems are easy to forge and tamper with, whereas face acquisition is non-contact and non-intrusive and offers universality, ease of collection and high security. Face recognition research has therefore attracted particular attention, has become one of the important means of identity authentication, and is applied in banks, railway stations, entry-exit management and other fields. However, if a face image is leaked, an attacker can deceive a face recognition and authentication system by imitating the face, for example by using 3D printing to make an ultra-realistic skin mask, which may cause serious economic losses and security risks. Therefore, the security and privacy of face features have become one of the important technical problems to be solved urgently in practical applications.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, the invention aims to provide a face feature template protection method based on a complex matrix, which can protect the privacy of a face image and the safety of data based on an image local variance map and two-dimensional principal component analysis.
The invention also aims to provide a face feature template protection system based on a complex matrix.
In order to achieve the above object, an embodiment of the present invention provides a face feature template protection method based on a complex matrix, including: S1, acquiring a plurality of gray-scale face images, and calculating a local variance map corresponding to each gray-scale face image; S2, constructing a plurality of complex matrices according to the plurality of gray-scale face images and the plurality of local variance maps; S3, calculating a covariance matrix according to the training images in the gray-scale face images and the complex matrices, and generating an eigenvector matrix according to the covariance matrix; S4, scrambling the eigenvector matrix to generate a scrambling matrix, and projecting the plurality of complex matrices by using the scrambling matrix to generate the features corresponding to the face images; and S5, comparing the test images with the training images in the plurality of gray-scale face images, and identifying the corresponding features of the face images through a classifier.
The face feature template protection method based on the complex matrix according to the embodiment of the invention not only uses the gray information of the face image but also makes full use of its detail information, thereby effectively improving the recognition accuracy. Meanwhile, the eigenvector matrix is scrambled, so the security and privacy of the face image are fully taken into account. When the face image feature template is attacked or threatened, a new feature template can be reissued by modifying the scrambling matrix to replace the original compromised template, and the method can be applied to the field of identity recognition and authentication.
In addition, the face feature template protection method based on the complex matrix according to the above embodiment of the present invention may further have the following additional technical features:
further, in an embodiment of the present invention, the calculation formula of the local variance map corresponding to each gray-level face image is as follows:
Figure GDA0003215297880000021
wherein f isi V(x, y) is a local variance map of the gray-scale face image, i is 1, 2, …, M is the number of the gray-scale face image, and the size of the gray-scale face image is W1×W2L represents the total number of pixels in the neighborhood,
Figure GDA0003215297880000022
representing the mean gray value of the neighborhood.
Further, in an embodiment of the present invention, the S2 further includes:
taking the gray-scale face image f_i(x, y) as the real component and the local variance map f_i^V(x, y) as the imaginary component, and constructing the plurality of complex matrices f_i^c(x, y), specifically expressed as:

$$f_i^c(x,y)=f_i(x,y)+f_i^V(x,y)\,\mathrm{i}$$

where f_i(x, y) is the gray-scale face image and f_i^V(x, y) is its local variance map.
Further, in an embodiment of the present invention, the S3 specifically includes:
S31, calculating the average value f̄ of the training images in the plurality of gray-scale face images, and generating the covariance matrix according to f̄ and the plurality of complex matrices f_i^c, wherein the covariance matrix is:

$$\frac{1}{M}\sum_{i=1}^{M}\left(f_i^c-\bar{f}\right)^{T}\left(f_i^c-\bar{f}\right)$$

where f̄ is the average of the training images in the plurality of gray-scale face images and T denotes the matrix transpose;
S32, performing eigenvalue decomposition on the covariance matrix and selecting the eigenvectors corresponding to the first d largest eigenvalues to generate the eigenvector matrix W = [X_1, X_2, …, X_d].
Further, in an embodiment of the present invention, scrambling the eigenvector matrix to generate a scrambling matrix specifically includes:
obtaining an identity matrix of dimension W_2 × W_2, and performing row transformation and/or column transformation on the identity matrix to generate a new matrix R;
transforming the eigenvector matrix W with the new matrix R to obtain the scrambling matrix, wherein the scrambling matrix W' is:

$$W' = R^T W$$
in order to achieve the above object, an embodiment of another aspect of the present invention provides a face feature template protection system based on a complex matrix, including:
the calculation module is used for acquiring a plurality of gray-scale face images and calculating a local variance map corresponding to each gray-scale face image;
the construction module is used for constructing a plurality of complex matrixes according to the plurality of gray-scale face images and the plurality of local variance maps;
a generating module, configured to calculate a covariance matrix according to a training image in the multiple gray-scale face images and the multiple complex matrices, and generate an eigenvector matrix according to the covariance matrix;
the projection module is used for scrambling the characteristic vector matrix to generate a scrambling matrix, and projecting the plurality of complex matrixes by using the scrambling matrix to generate characteristics corresponding to the face image;
and the recognition protection module is used for comparing the test image in the multiple gray-scale face images with the training image and recognizing the characteristics corresponding to the face images through a classifier.
The face feature template protection system based on the complex matrix according to the embodiment of the invention not only uses the gray information of the face image but also makes full use of its detail information, thereby effectively improving the recognition accuracy. Meanwhile, the eigenvector matrix is scrambled, so the security and privacy of the face image are fully taken into account. When the face image feature template is attacked or threatened, a new feature template can be reissued by modifying the scrambling matrix to replace the original compromised template, and the system can be applied to the field of identity recognition and authentication.
In addition, the face feature template protection system based on the complex matrix according to the above embodiment of the present invention may further have the following additional technical features:
further, in an embodiment of the present invention, the calculation formula of the local variance map corresponding to each gray-level face image is as follows:
Figure GDA0003215297880000041
wherein f isi V(x,y)The local variance atlas of the gray level face image is shown, i is 1, 2, …, M is the number of the gray level face image, and the size of the gray level face image is W1×W2L represents the total number of pixels in the neighborhood,
Figure GDA0003215297880000042
representing the mean gray value of the neighborhood.
Further, in one embodiment of the invention, the construction module is specifically configured to
take the gray-scale face image f_i(x, y) as the real component and the local variance map f_i^V(x, y) as the imaginary component, and to construct the plurality of complex matrices f_i^c(x, y), specifically expressed as:

$$f_i^c(x,y)=f_i(x,y)+f_i^V(x,y)\,\mathrm{i}$$

where f_i(x, y) is the gray-scale face image and f_i^V(x, y) is its local variance map.
Further, in an embodiment of the present invention, the generating module further includes: a first generation unit and a decomposition unit;
the first generation unit is used for calculating the average value f̄ of the training images in the plurality of gray-scale face images and for generating the covariance matrix according to f̄ and the plurality of complex matrices f_i^c, wherein the covariance matrix is:

$$\frac{1}{M}\sum_{i=1}^{M}\left(f_i^c-\bar{f}\right)^{T}\left(f_i^c-\bar{f}\right)$$

where f̄ is the average of the training images in the plurality of gray-scale face images and T denotes the matrix transpose;
the decomposition unit is used for performing eigenvalue decomposition on the covariance matrix and selecting the eigenvectors corresponding to the first d largest eigenvalues to generate the eigenvector matrix W = [X_1, X_2, …, X_d].
Further, in an embodiment of the present invention, scrambling the eigenvector matrix to generate a scrambling matrix specifically includes:
obtaining an identity matrix of dimension W_2 × W_2, and performing row transformation and/or column transformation on the identity matrix to generate a new matrix R;
transforming the eigenvector matrix W with the new matrix R to obtain the scrambling matrix, wherein the scrambling matrix W' is:

$$W' = R^T W$$
additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a face feature template protection method based on a complex matrix according to an embodiment of the present invention;
FIG. 2 is an original face image and corresponding local variance map according to one embodiment of the present invention;
fig. 3 is a schematic structural diagram of a face feature template protection system based on a complex matrix according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
The following describes a face feature template protection method and system based on a complex matrix according to an embodiment of the present invention with reference to the accompanying drawings.
First, a face feature template protection method based on a complex matrix proposed according to an embodiment of the present invention will be described with reference to the accompanying drawings.
Fig. 1 is a flowchart of a face feature template protection method based on a complex matrix according to an embodiment of the present invention.
As shown in fig. 1, the face feature template protection method based on complex matrix includes the following steps:
in step S1, a plurality of gray-scale face images are obtained, and a local variance map corresponding to each gray-scale face image is calculated.
Specifically, assume that there are M gray-scale face images f_i(x, y) (i = 1, 2, …, M) of size W_1 × W_2. First, the M local variance maps corresponding to the M gray-scale face images are calculated, with the neighborhood of each pixel taken as m × m; the local variance is then computed as:

$$f_i^V(x,y)=\frac{1}{L}\sum_{(s,t)\in N(x,y)}\left[f_i(s,t)-\bar{f}_i(x,y)\right]^2$$

where L = m^2 denotes the total number of pixels in the neighborhood N(x, y) and f̄_i(x, y), the mean gray value of the neighborhood, is the average of f_i(s, t) over the same m × m window. It should be noted that the local variance at the image boundary is obtained by symmetric padding of the pixel intensities.
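As an illustration of step S1, the following sketch computes the local variance map with symmetric boundary padding; it assumes NumPy and SciPy are available, and the helper name local_variance_map is illustrative rather than taken from the patent.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance_map(gray, m=3):
    """Per-pixel variance over an m x m neighborhood.

    mode='reflect' gives symmetric padding at the image boundary,
    matching the boundary handling described above.
    """
    gray = gray.astype(np.float64)
    mean = uniform_filter(gray, size=m, mode='reflect')          # neighborhood mean
    mean_sq = uniform_filter(gray ** 2, size=m, mode='reflect')  # neighborhood mean of squares
    # Var = E[x^2] - (E[x])^2; clamp tiny negatives caused by rounding.
    return np.maximum(mean_sq - mean ** 2, 0.0)
```

Using two box filters avoids an explicit loop over pixels while giving the same result as the definition above.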
In step S2, a plurality of complex matrices are constructed from the plurality of grayscale face images and the plurality of local variance maps.
Further, S2 further includes: taking the gray-scale face image f_i(x, y) as the real component and the local variance map f_i^V(x, y) as the imaginary component, and constructing the plurality of complex matrices f_i^c(x, y), specifically expressed as:

$$f_i^c(x,y)=f_i(x,y)+f_i^V(x,y)\,\mathrm{i}$$

where f_i(x, y) is the gray-scale face image and f_i^V(x, y) is its local variance map.
Specifically, the original gray image is used as the real component and the local variance map as the imaginary component to construct the complex matrix f_i^c(x, y) given above.
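Step S2 then reduces to a single complex-valued assignment; the sketch below reuses local_variance_map from the previous snippet, an assumed composition rather than code from the patent.

```python
import numpy as np

def complex_matrix(gray, m=3):
    # Real part: gray-scale image; imaginary part: its local variance map.
    return gray.astype(np.float64) + 1j * local_variance_map(gray, m)
```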
in step S3, a covariance matrix is calculated from the training images of the plurality of gray-scale face images and the plurality of complex matrices, and an eigenvector matrix is generated from the covariance matrix.
Further, S3 further includes:
S31, calculating the average value f̄ of the training images in the plurality of gray-scale face images, and generating the covariance matrix according to f̄ and the plurality of complex matrices f_i^c, wherein the covariance matrix is:

$$\frac{1}{M}\sum_{i=1}^{M}\left(f_i^c-\bar{f}\right)^{T}\left(f_i^c-\bar{f}\right)$$

where f̄ is the average of the training images in the plurality of gray-scale face images and T denotes the matrix transpose;
S32, performing eigenvalue decomposition on the covariance matrix and selecting the eigenvectors corresponding to the first d largest eigenvalues to generate the eigenvector matrix W = [X_1, X_2, …, X_d].
Specifically, the average value f̄ of the training-set images is calculated first and the covariance matrix above is formed; eigenvalue decomposition is then performed on the covariance matrix, and the eigenvectors corresponding to the first d largest eigenvalues are selected to obtain the eigenvector matrix W = [X_1, X_2, …, X_d].
In step S4, the eigenvector matrix is scrambled to generate a scrambling matrix, and the plurality of complex matrices are projected with the scrambling matrix to generate the features corresponding to the face images.
Further, scrambling the eigenvector matrix to generate the scrambling matrix specifically includes:
obtaining an identity matrix of dimension W_2 × W_2, and performing row transformation and/or column transformation on the identity matrix to generate a new matrix R;
transforming the eigenvector matrix W with the new matrix R to obtain the scrambling matrix, wherein the scrambling matrix W' is:

$$W' = R^T W$$

Specifically, a W_2 × W_2 identity matrix is subjected to one or more row and/or column transformations to obtain the new matrix R, and the eigenvector matrix W obtained in the previous step is transformed by the above formula. The resulting scrambling matrix W' is then used to project the complex matrices f_i^c(x, y), yielding the features corresponding to the face images.
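The scrambling and projection of step S4 might be sketched as follows. Using a key-seeded random row permutation of the identity as the row transformation is an illustrative choice (any invertible row and/or column transformation of the identity fits the description above), and the function names are not from the patent.

```python
import numpy as np

def scrambling_matrix(W, key=12345):
    """W: (W2, d) eigenvector matrix. Returns the scrambling matrix W' = R^T W."""
    rng = np.random.default_rng(key)     # the key plays the role of a user-specific secret
    n = W.shape[0]
    R = np.eye(n)[rng.permutation(n)]    # identity matrix with its rows permuted
    return R.T @ W

def project(complex_mat, W_scrambled):
    """Protected feature of one face image: a (W1, d) complex matrix."""
    return complex_mat @ W_scrambled
```

Revoking a compromised template then amounts to choosing a new key, recomputing W' and re-projecting, which mirrors the reissuing described in this document.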
In step S5, the test images in the plurality of gray-scale face images are compared with the training images, and the features corresponding to the face images are identified by the classifier.
Specifically, each test image is compared with the training images, and a nearest neighbor classifier based on the Euclidean distance is used for classification to obtain the recognition rate.
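For step S5, a minimal nearest-neighbour matcher over the protected feature matrices could look like the following; the Frobenius norm of the difference is used as the Euclidean distance between feature matrices, and the helper names are illustrative.

```python
import numpy as np

def classify(test_feat, train_feats, train_labels):
    """Return the label of the enrolled feature closest to test_feat."""
    dists = [np.linalg.norm(test_feat - f) for f in train_feats]  # Frobenius distance
    return train_labels[int(np.argmin(dists))]
```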
Principal component analysis has been successfully applied to face recognition, but face recognition methods based on principal component analysis are not revocable. A biometric recognition method based on scrambled principal component analysis protects the security and privacy of the biometric features without affecting the recognition accuracy; when a face image feature template is attacked or threatened, a new feature template can be reissued. The human eye is usually sensitive to high-frequency information, because the high-frequency part of an image usually reflects its structural information, and the local variance of an image characterizes this structural information well.
The method of the embodiment of the invention is based on the image local variance map and two-dimensional principal component analysis: it not only uses the gray information of the face image but also makes full use of its detail information, effectively improving the recognition accuracy. Meanwhile, the eigenvector matrix is scrambled, so the security and privacy of the face image are fully taken into account. When the face image feature template is attacked or threatened, a new feature template can be reissued by modifying the scrambling matrix to replace the original compromised template, and the method can be applied to the field of identity recognition and authentication.
The face feature template protection method based on a complex matrix according to the embodiment of the present invention is described below with a specific example.
The RaFD face image library was used, with 67 subjects in total and 8 images per subject; 6 images of each subject were selected for training and 2 for testing. The image size is 64 × 64, and neighborhood sizes of 3 × 3, 5 × 5 and 7 × 7 were used.
After the above steps are performed, the results are shown in Fig. 2: Fig. 2(a) is the original gray-scale face image, Fig. 2(b) is the local variance map for a 3 × 3 neighborhood, Fig. 2(c) for a 5 × 5 neighborhood, and Fig. 2(d) for a 7 × 7 neighborhood. Table 1 compares the recognition rates (%) of the method according to the embodiment of the invention with those of the method based on scrambled two-dimensional principal component analysis (RP-2DPCA); it can be seen that using the image local variance map effectively improves the recognition rate.
Table 1: recognition rate (%) comparison between the method of the embodiment and RP-2DPCA.
According to the face feature template protection method based on the complex matrix of the embodiment of the invention, compared with the traditional principal component analysis method, image structure information is incorporated into the principal component analysis, which increases the amount of information carried and further improves the recognition accuracy; and since the matrix R is random, the feature template can be reissued when the registered biometric data are lost or stolen.
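Tying the earlier sketches together, a hypothetical enrolment/identification driver for a setup like the one described (64 × 64 gray images, 3 × 3 neighborhood) might look as follows; it uses random stand-in images rather than the RaFD data, the parameter values are illustrative, and it assumes the helper functions defined above, so it only illustrates the data flow, not the reported recognition rates.

```python
import numpy as np

def enroll(train_images, train_labels, d=10, m=3, key=12345):
    """Build protected templates for the training images."""
    mats = [complex_matrix(img, m) for img in train_images]
    W = eigenvector_matrix(mats, d)
    Ws = scrambling_matrix(W, key)                   # key-dependent, hence revocable
    feats = [project(f, Ws) for f in mats]
    return Ws, feats, list(train_labels)

def identify(test_image, Ws, feats, labels, m=3):
    """Match a probe image against the enrolled protected templates."""
    return classify(project(complex_matrix(test_image, m), Ws), feats, labels)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = rng.integers(0, 256, size=(6, 64, 64)).astype(np.float64)  # stand-in images
    Ws, feats, labels = enroll(gallery, list(range(6)))
    print(identify(gallery[0], Ws, feats, labels))   # expected to print 0
```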
Next, a face feature template protection system based on a complex matrix proposed according to an embodiment of the present invention is described with reference to the drawings.
Fig. 3 is a schematic structural diagram of a face feature template protection system based on a complex matrix according to an embodiment of the present invention.
As shown in fig. 3, the protection system includes: a computing module 100, a building module 200, a generating module 300, a projection module 400, and an identification protection module 500.
The calculating module 100 is configured to obtain a plurality of gray-scale face images, and calculate a local variance map corresponding to each gray-scale face image.
The construction module 200 is configured to construct a plurality of complex matrices according to the plurality of gray-scale face images and the plurality of local variance maps.
The generating module 300 is configured to calculate a covariance matrix according to a training image of the plurality of gray-level face images and the plurality of complex matrices, and generate an eigenvector matrix according to the covariance matrix.
The projection module 400 is configured to scramble the eigenvector matrix to generate a scrambling matrix, and to project the plurality of complex matrices with the scrambling matrix to generate the features corresponding to the face images.
The recognition protection module 500 is configured to compare a test image with a training image in a plurality of gray-scale face images, and recognize features corresponding to the face images through a classifier.
The protection system 10 can protect the privacy of the face image and the security of data.
Further, in an embodiment of the present invention, the calculation formula of the local variance map corresponding to each gray-scale face image is as follows:

$$f_i^V(x,y)=\frac{1}{L}\sum_{(s,t)\in N(x,y)}\left[f_i(s,t)-\bar{f}_i(x,y)\right]^2$$

where f_i^V(x, y) is the local variance map of the gray-scale face image, i = 1, 2, …, M, M is the number of gray-scale face images, the size of each gray-scale face image is W_1 × W_2, L denotes the total number of pixels in the neighborhood N(x, y), and f̄_i(x, y) denotes the mean gray value of the neighborhood.
Further, in one embodiment of the invention, the construction module is specifically configured to
take the gray-scale face image f_i(x, y) as the real component and the local variance map f_i^V(x, y) as the imaginary component, and to construct the plurality of complex matrices f_i^c(x, y), specifically expressed as:

$$f_i^c(x,y)=f_i(x,y)+f_i^V(x,y)\,\mathrm{i}$$

where f_i(x, y) is the gray-scale face image and f_i^V(x, y) is its local variance map.
Further, in an embodiment of the present invention, the generating module further includes: a first generation unit and a decomposition unit;
the first generation unit is used for calculating the average value f̄ of the training images in the plurality of gray-scale face images and for generating the covariance matrix according to f̄ and the plurality of complex matrices f_i^c, wherein the covariance matrix is:

$$\frac{1}{M}\sum_{i=1}^{M}\left(f_i^c-\bar{f}\right)^{T}\left(f_i^c-\bar{f}\right)$$

where f̄ is the average of the training images in the plurality of gray-scale face images and T denotes the matrix transpose;
the decomposition unit is used for performing eigenvalue decomposition on the covariance matrix and selecting the eigenvectors corresponding to the first d largest eigenvalues to generate the eigenvector matrix W = [X_1, X_2, …, X_d].
Further, in an embodiment of the present invention, scrambling the eigenvector matrix to generate a scrambling matrix specifically includes:
obtaining an identity matrix of dimension W_2 × W_2, and performing row transformation and/or column transformation on the identity matrix to generate a new matrix R; and transforming the eigenvector matrix W with the new matrix R to obtain the scrambling matrix, wherein the scrambling matrix W' is:

$$W' = R^T W$$
it should be noted that the foregoing explanation of the embodiment of the face feature template protection method based on the complex matrix is also applicable to the apparatus of this embodiment, and is not repeated herein.
According to the face feature template protection system based on the complex matrix provided by the embodiment of the invention, the gray information of the face image is used and its detail information is fully utilized at the same time, so that the recognition accuracy is effectively improved. Meanwhile, the eigenvector matrix is scrambled, and the security and privacy of the face image are fully taken into account. When the face image feature template is attacked or threatened, a new feature template can be reissued by modifying the scrambling matrix to replace the original compromised template, and the system can be applied to the field of identity recognition and authentication.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (4)

1. A face feature template protection method based on a complex matrix is characterized by comprising the following steps:
S1, acquiring a plurality of gray-scale face images, and calculating a local variance map corresponding to each gray-scale face image;
S2, constructing a plurality of complex matrices according to the plurality of gray-scale face images and the plurality of local variance maps;
S3, calculating a covariance matrix according to the training images in the gray-scale face images and the complex matrices, and generating an eigenvector matrix according to the covariance matrix;
S4, scrambling the eigenvector matrix to generate a scrambling matrix, and projecting the plurality of complex matrices by using the scrambling matrix to generate the features corresponding to the face images;
S5, comparing the test images with the training images in the plurality of gray-scale face images, and identifying the corresponding features of the face images through a classifier;
wherein the local variance map corresponding to each gray-scale face image is calculated as:

$$f_i^V(x,y)=\frac{1}{L}\sum_{(s,t)\in N(x,y)}\left[f_i(s,t)-\bar{f}_i(x,y)\right]^2$$

where f_i^V(x, y) is the local variance map of the gray-scale face image, i = 1, 2, …, M, M is the number of gray-scale face images, the size of each gray-scale face image is W_1 × W_2, L denotes the total number of pixels in the neighborhood N(x, y), and f̄_i(x, y) denotes the mean gray value of the neighborhood;
the S2 further comprises:
taking the gray-scale face image f_i(x, y) as the real component and the local variance map f_i^V(x, y) as the imaginary component, and constructing the plurality of complex matrices f_i^c(x, y), specifically expressed as:

$$f_i^c(x,y)=f_i(x,y)+f_i^V(x,y)\,\mathrm{i}$$

where f_i(x, y) is the gray-scale face image and f_i^V(x, y) is its local variance map;
the S3 further comprises:
S31, calculating the average value f̄ of the training images in the plurality of gray-scale face images, and generating the covariance matrix according to f̄ and the plurality of complex matrices f_i^c, wherein the covariance matrix is:

$$\frac{1}{M}\sum_{i=1}^{M}\left(f_i^c-\bar{f}\right)^{T}\left(f_i^c-\bar{f}\right)$$

where f̄ is the average of the training images in the plurality of gray-scale face images and T denotes the matrix transpose;
S32, performing eigenvalue decomposition on the covariance matrix and selecting the eigenvectors corresponding to the first d largest eigenvalues to generate the eigenvector matrix W = [X_1, X_2, …, X_d].
2. The method according to claim 1, wherein scrambling the eigenvector matrix to generate a scrambling matrix specifically comprises:
obtaining an identity matrix of dimension W_2 × W_2, and performing row transformation and/or column transformation on the identity matrix to generate a new matrix R;
transforming the eigenvector matrix W with the new matrix R to obtain the scrambling matrix, wherein the scrambling matrix W' is:

$$W' = R^T W$$
3. A face feature template protection system based on a complex matrix is characterized by comprising:
the calculation module is used for acquiring a plurality of gray-scale face images and calculating a local variance map corresponding to each gray-scale face image;
the construction module is used for constructing a plurality of complex matrices according to the plurality of gray-scale face images and the plurality of local variance maps;
the generating module is used for calculating a covariance matrix according to the training images in the plurality of gray-scale face images and the plurality of complex matrices, and for generating an eigenvector matrix according to the covariance matrix;
the projection module is used for scrambling the eigenvector matrix to generate a scrambling matrix, and for projecting the plurality of complex matrices by using the scrambling matrix to generate the features corresponding to the face images;
the recognition protection module is used for comparing the test images in the plurality of gray-scale face images with the training images and recognizing the features corresponding to the face images through a classifier;
wherein the calculation formula of the local variance map corresponding to each gray-scale face image in the calculation module is:

$$f_i^V(x,y)=\frac{1}{L}\sum_{(s,t)\in N(x,y)}\left[f_i(s,t)-\bar{f}_i(x,y)\right]^2$$

where f_i^V(x, y) is the local variance map of the gray-scale face image, i = 1, 2, …, M, M is the number of gray-scale face images, the size of each gray-scale face image is W_1 × W_2, L denotes the total number of pixels in the neighborhood N(x, y), and f̄_i(x, y) denotes the mean gray value of the neighborhood;
the construction module is specifically used for taking the gray-scale face image f_i(x, y) as the real component and the local variance map f_i^V(x, y) as the imaginary component, and constructing the plurality of complex matrices f_i^c(x, y), specifically expressed as:

$$f_i^c(x,y)=f_i(x,y)+f_i^V(x,y)\,\mathrm{i}$$

where f_i(x, y) is the gray-scale face image and f_i^V(x, y) is its local variance map;
the generating module further comprises: a first generation unit and a decomposition unit;
the first generation unit is used for calculating the average value f̄ of the training images in the plurality of gray-scale face images and for generating the covariance matrix according to f̄ and the plurality of complex matrices f_i^c, wherein the covariance matrix is:

$$\frac{1}{M}\sum_{i=1}^{M}\left(f_i^c-\bar{f}\right)^{T}\left(f_i^c-\bar{f}\right)$$

where f̄ is the average of the training images in the plurality of gray-scale face images and T denotes the matrix transpose;
the decomposition unit is used for performing eigenvalue decomposition on the covariance matrix and selecting the eigenvectors corresponding to the first d largest eigenvalues to generate the eigenvector matrix W = [X_1, X_2, …, X_d].
4. The system of claim 3, wherein scrambling the eigenvector matrix to generate a scrambling matrix specifically comprises:
obtaining an identity matrix of dimension W_2 × W_2, and performing row transformation and/or column transformation on the identity matrix to generate a new matrix R;
transforming the eigenvector matrix W with the new matrix R to obtain the scrambling matrix, wherein the scrambling matrix W' is:

$$W' = R^T W$$
CN201910180893.5A 2019-03-11 2019-03-11 Face feature template protection method and system based on complex matrix Active CN109977807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910180893.5A CN109977807B (en) 2019-03-11 2019-03-11 Face feature template protection method and system based on complex matrix

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910180893.5A CN109977807B (en) 2019-03-11 2019-03-11 Face feature template protection method and system based on complex matrix

Publications (2)

Publication Number Publication Date
CN109977807A CN109977807A (en) 2019-07-05
CN109977807B true CN109977807B (en) 2021-10-12

Family

ID=67078405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910180893.5A Active CN109977807B (en) 2019-03-11 2019-03-11 Face feature template protection method and system based on complex matrix

Country Status (1)

Country Link
CN (1) CN109977807B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110633650A (en) * 2019-08-22 2019-12-31 首都师范大学 Convolutional neural network face recognition method and device based on privacy protection
CN110610144B (en) * 2019-08-28 2022-04-22 首都师范大学 Expression recognition method and system for privacy protection

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101610396A (en) * 2008-06-16 2009-12-23 北京智安邦科技有限公司 Intellective video monitoring device module and system and method for supervising thereof with secret protection
CN103731271A (en) * 2013-12-30 2014-04-16 北京工业大学 On-line face identity authentication method based on homomorphic encrypting and chaotic scrambling
CN105719224A (en) * 2016-01-18 2016-06-29 济南大学 Biological characteristic image encryption method based on dual-tree complex wavelet transformation
CN106778711A (en) * 2017-02-22 2017-05-31 安徽创世科技股份有限公司 A kind of face identification method based on information fusion

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6826300B2 (en) * 2001-05-31 2004-11-30 George Mason University Feature based classification
KR102460069B1 (en) * 2015-09-30 2022-10-28 삼성전자주식회사 Security certification apparatus using biometric information and security certification method
US20180108019A1 (en) * 2016-10-15 2018-04-19 Systems Imagination, Inc. Secure Encryption Using Genomic Information

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101610396A (en) * 2008-06-16 2009-12-23 北京智安邦科技有限公司 Intellective video monitoring device module and system and method for supervising thereof with secret protection
CN103731271A (en) * 2013-12-30 2014-04-16 北京工业大学 On-line face identity authentication method based on homomorphic encrypting and chaotic scrambling
CN105719224A (en) * 2016-01-18 2016-06-29 济南大学 Biological characteristic image encryption method based on dual-tree complex wavelet transformation
CN106778711A (en) * 2017-02-22 2017-05-31 安徽创世科技股份有限公司 A kind of face identification method based on information fusion

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Complex-domain face recognition method based on a three-dimensional face imaging system; 叶长明 et al.; 《电子测量与仪器学报》; 2011-05-31; Vol. 25, No. 5; pp. 422-423 *
Gray-scale image quality assessment method based on complex representation of image structure information and singular value decomposition; 王宇庆; 《光电子激光》; 2012-09-30; Vol. 23, No. 9; p. 1828 *
Research on image encryption based on chaotic symbolic dynamics; 王金铭 et al.; 《计算机工程》; 2011-01-20; Vol. 37, No. 2; p. 133 *
Face feature template encryption algorithm integrating chaos theory and RSA; 赵福东; 《中国优秀硕士学位论文全文数据库 信息科技辑》; 2017-06-15, No. 6; pp. 22-26 *

Also Published As

Publication number Publication date
CN109977807A (en) 2019-07-05

Similar Documents

Publication Publication Date Title
CN106503655B (en) A kind of electric endorsement method and sign test method based on face recognition technology
US20070122009A1 (en) Face recognition method and apparatus
US20080065900A1 (en) Method and apparatus for biometrics
CN108369785A (en) Activity determination
CN104823203A (en) Biometric template security and key generation
CN110633650A (en) Convolutional neural network face recognition method and device based on privacy protection
CN106778613A (en) A kind of auth method and device based on the matching of face cut zone
CN109977807B (en) Face feature template protection method and system based on complex matrix
Mahalingam et al. Face verification with aging using AdaBoost and local binary patterns
Kar et al. A multi-algorithmic face recognition system
Guetta et al. Dodging attack using carefully crafted natural makeup
Ng et al. Multi-layer age regression for face age estimation
CN106650657A (en) Authentication method and device based on full face binary matching
Chen et al. Iris recognition using 3D co-occurrence matrix
CN113435264A (en) Face recognition attack resisting method and device based on black box substitution model searching
CN108921080A (en) Image-recognizing method, device and electronic equipment
Bharadi et al. Multi-instance iris recognition
Zhou et al. Orientation analysis for rotated human face detection
Guo et al. Integrating diversity into neural-network-based face deidentification
Luong et al. Reconstructing a fragmented face from a cryptographic identification protocol
CN114612991A (en) Conversion method and device for attacking face picture, electronic equipment and storage medium
Palanikumar et al. Advanced palmprint recognition using unsharp masking and histogram equalization
CN1987890A (en) Humanface image matching method for general active snape changing mode
Dang et al. Practical construction of face-based authentication systems with template protection using secure sketch
CN109801072B (en) Private key generation method and system of block chain electronic wallet based on facial features

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20230726

Address after: Room 1109, No. 31, Nancheng Section, Guantai Road, Nancheng Street, Dongguan City, Guangdong Province 523071

Patentee after: Dongguan Pengbo Information Technology Co.,Ltd.

Address before: 100037 No. 105 West Third Ring Road North, Beijing, Haidian District

Patentee before: Capital Normal University

Effective date of registration: 20230726

Address after: Room 205, Building 1, Xiangdi Yajing Homeland, No. 458, Renmin East Road, Dongtundu, Furong District, Changsha, 410000, Hunan Province

Patentee after: Hunan Zunyi Electronic Technology Co.,Ltd.

Address before: Room 1109, No. 31, Nancheng Section, Guantai Road, Nancheng Street, Dongguan City, Guangdong Province 523071

Patentee before: Dongguan Pengbo Information Technology Co.,Ltd.