CN104899578A - Method and device for face identification - Google Patents

Method and device for face identification

Info

Publication number
CN104899578A
Authority
CN
China
Prior art keywords
sample
class
training
matrix
inter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510363785.3A
Other languages
Chinese (zh)
Other versions
CN104899578B (en)
Inventor
张莉
周伟达
王邦军
张召
李凡长
杨季文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhangjiagang Institute of Industrial Technologies Soochow University
Original Assignee
Zhangjiagang Institute of Industrial Technologies Soochow University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhangjiagang Institute of Industrial Technologies Soochow University filed Critical Zhangjiagang Institute of Industrial Technologies Soochow University
Priority to CN201510363785.3A priority Critical patent/CN104899578B/en
Publication of CN104899578A publication Critical patent/CN104899578A/en
Application granted granted Critical
Publication of CN104899578B publication Critical patent/CN104899578B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133 - Distances to prototypes
    • G06F18/24143 - Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for face identification. The method comprises: taking obtained face image data as a sample to be tested; mapping the sample to be tested into a low-dimensional feature space using a projection transformation matrix to obtain a projected test sample; searching, in a training sample set, for the standard sample closest to the test sample as a target sample; and determining the class of the target sample as the class of the test sample. The projection transformation matrix is obtained by training multiple samples in the training sample set through a constructed intra-class adjacency matrix and a constructed inter-class adjacency matrix, so as to maximize the inter-class distance and minimize the intra-class distance. The method and device for face identification construct two adjacency matrices for orthogonal discriminant projection, namely the intra-class adjacency matrix and the inter-class adjacency matrix, which represent the intra-class information and the inter-class information separately; balanced information is thereby obtained, and the goal of minimum intra-class distance and maximum inter-class distance is achieved.

Description

Face recognition method and device
Technical Field
The invention relates to the field of computer vision, in particular to a method and a device for face recognition.
Background
Face recognition has become a very popular research topic in computer vision and is one of the most successful applications in the field of image analysis. Face data are typical high-dimensional, small-sample data, so dimension reduction of the face data is a necessary preprocessing step. Over recent decades, a series of dimension reduction techniques have accordingly been proposed.
The currently proposed orthogonal discriminant projection methods construct only one adjacency graph, which contains both intra-class and inter-class information. When the data distribution is unbalanced, the intra-class and inter-class information in this single graph is also unbalanced, which may prevent the goal of minimum intra-class distance and maximum inter-class distance from being achieved.
Disclosure of Invention
The invention aims to provide a method and a device for face recognition, so as to solve the problem that the prior art cannot achieve the minimum intra-class distance and the maximum inter-class distance.
In order to solve the above technical problem, the present invention provides a method for face recognition, comprising:
taking the obtained face image data as a sample to be detected;
mapping the sample to be tested to a low-dimensional feature space by using a projection transformation matrix to obtain a projected test sample;
in a training sample set, searching a standard sample closest to the test sample as a target sample;
determining a category of the target sample as a category of the test sample;
the projective transformation matrix is a transformation matrix obtained by training a plurality of samples in the training sample set through a constructed intra-class adjacency matrix and an inter-class adjacency matrix, so that the inter-class distance is maximum and the intra-class distance is minimum.
Optionally, the projective transformation matrix is a transformation matrix obtained by training a plurality of samples in the training sample set through a constructed intra-class adjacency matrix and an inter-class adjacency matrix, and includes:
calculating, from the constructed intra-class adjacency matrix F_w and the inter-class adjacency matrix F_b, the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b according to S_w = X(D_w - F_w)X^T and S_b = X(D_b - F_b)X^T;
calculating the projection transformation matrix P from the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b, so that the inter-class distance is maximum and the intra-class distance is minimum;
wherein t > 0 is a parameter, N_w(x_i) and N_b(x_i) are respectively the set of same-class (homogeneous) neighbors and the set of different-class (heterogeneous) neighbors of x_i, and D_w and D_b are both diagonal matrices.
Optionally, determining the projection transformation matrix P from the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b comprises:
performing generalized eigendecomposition on the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b, sorting the obtained eigenvalues in descending order, and taking the eigenvectors corresponding to the first d eigenvalues as the projection transformation matrix P, wherein d is the dimension of the space after projection transformation.
Optionally, the training sample set is a pre-established set, and the pre-established process includes:
taking the obtained multiple face images as training samples;
mapping the training sample to a low-dimensional feature space by using the projective transformation matrix to obtain a projected standard sample;
and storing the standard sample and the known class of the face image as the training sample set.
Optionally, the searching, in the training sample set, a standard sample closest to the test sample as a target sample includes:
searching, in the training sample set and by using a nearest neighbor classifier, for the standard sample closest to the test sample as the target sample.
The invention also provides a face recognition device, which comprises:
the acquisition module is used for taking the acquired face image data as a sample to be detected;
the mapping module is used for mapping the sample to be tested to a low-dimensional feature space by using a projection transformation matrix to obtain a projected test sample;
the searching module is used for searching a standard sample closest to the test sample in the training sample set as a target sample;
a determining module for determining the category of the target sample as the category of the test sample;
the projection transformation matrix is a transformation matrix obtained by a training module by training a plurality of samples in the training sample set through a constructed intra-class adjacency matrix and a constructed inter-class adjacency matrix, so that the inter-class distance is maximum and the intra-class distance is minimum.
Optionally, the training module comprises:
the training acquisition unit is used for taking the acquired multiple face images as training samples;
the training mapping unit is used for mapping the training sample to a low-dimensional feature space by using the projection transformation matrix to obtain a projected standard sample;
and the training storage unit is used for storing the standard sample and the known class of the face image as the training sample set.
Optionally, the searching module is configured to search, in the training sample set, a standard sample closest to the test sample as a target sample, where the searching includes:
the searching module is specifically configured to search, in the training sample set, a standard sample closest to the test sample as a target sample by using a nearest neighbor classifier.
The method and device for face recognition provided by the invention map the obtained sample to be tested into a low-dimensional feature space using the projection transformation matrix, obtaining a projected test sample. Then, in the training sample set, the standard sample closest to the test sample is found as the target sample, and the class of the target sample is determined as the class of the test sample, thereby achieving face recognition. The method and device construct two separate adjacency matrices for orthogonal discriminant projection, an intra-class adjacency matrix and an inter-class adjacency matrix, which represent the intra-class and inter-class information separately, so that balanced information is obtained and the goal of minimum intra-class distance and maximum inter-class distance is achieved.
Drawings
FIG. 1 is a flowchart of a method of an embodiment of a face recognition method according to the present invention;
FIG. 2 is a flow chart of a projective transformation matrix determination process in another embodiment of the face recognition method provided by the present invention;
FIG. 3 is a flow chart of a process of pre-establishing a training sample set in another embodiment of the method for face recognition provided by the present invention;
FIG. 4 is a graph of classification accuracy versus dimensionality for three algorithms;
fig. 5 is a block diagram of a specific embodiment of a face recognition apparatus according to the present invention.
Detailed Description
In order that those skilled in the art will better understand the disclosure, the invention will be described in further detail with reference to the accompanying drawings and specific embodiments. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
A method flowchart of a specific embodiment of the method for face recognition provided by the present invention is shown in fig. 1, and the method includes:
step S101: taking the obtained face image data as a sample to be detected;
step S102: mapping the sample to be tested to a low-dimensional feature space by using a projection transformation matrix to obtain a projected test sample;
step S103: in a training sample set, searching a standard sample closest to the test sample as a target sample;
step S104: determining a category of the target sample as a category of the test sample;
the projective transformation matrix is a transformation matrix obtained by training a plurality of samples in the training sample set through a constructed intra-class adjacency matrix and an inter-class adjacency matrix, so that the inter-class distance is maximum and the intra-class distance is minimum.
The face recognition method provided by the invention maps the obtained sample to be tested into a low-dimensional feature space using the projection transformation matrix, obtaining a projected test sample. Then, in the training sample set, the standard sample closest to the test sample is found as the target sample, and the class of the target sample is determined as the class of the test sample, thereby achieving face recognition. The method constructs two separate adjacency matrices for orthogonal discriminant projection, an intra-class adjacency matrix and an inter-class adjacency matrix, which represent the intra-class and inter-class information separately, so that balanced information is obtained and the goal of minimum intra-class distance and maximum inter-class distance is achieved.
It should be noted that, in the present invention, the intra-class refers to the relationship between samples of the same class; inter-class refers to the relationship between samples of different classes.
The present invention provides another specific implementation manner of the face recognition method, and compared with the previous embodiment, this embodiment adds a determination process of a projective transformation matrix, as shown in fig. 2:
step S201: by constructed intra-class adjacency matrix FwAnd inter-class adjacency matrix FbAccording to Sw=X(Dw-Fw)XTAnd Sb=X(Db-Fb)XTCalculating to obtain an intra-class local divergence matrix SwAnd inter-class local divergence matrix Sb
Step S202: by said intra-class local divergence matrix SwAnd inter-class local divergence matrix SbCalculating to obtain a projection transformation matrix P so as to enable the inter-class distance to be maximum and the intra-class distance to be minimum;
wherein,
t>0,andare each xiSet of homogeneous and heterogeneous neighbors of DwAnd DbAre all diagonal matrices.
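The construction of the two adjacency matrices and the two local divergence matrices can be sketched in NumPy as below. The heat-kernel weights exp(-||x_i - x_j||^2 / t) and the k-nearest-neighbor construction of the same-class and different-class neighbor sets are illustrative assumptions; the text above does not reproduce the exact weight formulas.

```python
import numpy as np

def local_divergence_matrices(X, y, k=3, t=8.0):
    """Build intra-class (S_w) and inter-class (S_b) local divergence matrices.

    X: (D, N) data matrix, one sample per column; y: (N,) class labels.
    Heat-kernel weights restricted to k same-class / different-class
    neighbors are an illustrative assumption.
    """
    N = X.shape[1]
    # Pairwise squared Euclidean distances between columns of X.
    sq = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    Fw = np.zeros((N, N))
    Fb = np.zeros((N, N))
    for i in range(N):
        same = np.where(y == y[i])[0]
        diff = np.where(y != y[i])[0]
        same = same[same != i]
        # k nearest same-class and different-class neighbors of x_i,
        # weighted by the (assumed) heat kernel and kept symmetric.
        for j in same[np.argsort(sq[i, same])][:k]:
            Fw[i, j] = Fw[j, i] = np.exp(-sq[i, j] / t)
        for j in diff[np.argsort(sq[i, diff])][:k]:
            Fb[i, j] = Fb[j, i] = np.exp(-sq[i, j] / t)
    Dw = np.diag(Fw.sum(axis=1))   # diagonal row-sum matrices
    Db = np.diag(Fb.sum(axis=1))
    Sw = X @ (Dw - Fw) @ X.T       # S_w = X(D_w - F_w)X^T
    Sb = X @ (Db - Fb) @ X.T       # S_b = X(D_b - F_b)X^T
    return Sw, Sb
```

Because F_w and F_b are symmetric and D_w, D_b are diagonal, both resulting divergence matrices are symmetric D x D matrices.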
As a preferred embodiment, determining the projection transformation matrix P from the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b may further specifically be:
performing generalized eigendecomposition on the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b, sorting the obtained eigenvalues in descending order, and taking the eigenvectors corresponding to the first d eigenvalues as the projection transformation matrix P, wherein d is the dimension of the space after projection transformation.
After determining the projective transformation matrix, the embodiment further provides a process of pre-establishing a training sample set, as shown in fig. 3:
step S301: taking the obtained multiple face images as training samples;
step S302: mapping the training sample to a low-dimensional feature space by using the projective transformation matrix to obtain a projected standard sample;
step S303: and storing the standard sample and the known class of the face image as the training sample set.
The present invention further provides another specific implementation of the face recognition method. In this embodiment, the ORL face database is used: it contains 400 face images of 40 persons, 10 images per person. Some of the face images were taken at different times. Facial expressions and facial details vary to different degrees, such as eyes open or closed, glasses worn or not, smiling or not; the face pose also varies considerably, with depth rotation and in-plane rotation of up to 20°. Each image is 32 × 32 pixels in size, and each pixel has 256 gray levels. 50% of the database is randomly selected as training samples and the remaining 50% is used as test samples; the random sampling is repeated 10 times and the average result is reported.
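The repeated random 50/50 sampling described above can be sketched as follows (the function name and seed handling are illustrative):

```python
import numpy as np

def random_splits(n_samples, n_repeats=10, seed=0):
    """Yield index pairs for repeated random 50/50 train/test splits,
    as in the ORL experiment (50% train, 50% test, repeated 10 times)."""
    rng = np.random.default_rng(seed)
    half = n_samples // 2
    for _ in range(n_repeats):
        perm = rng.permutation(n_samples)
        yield perm[:half], perm[half:]
```

Each of the 10 splits is then used to train the projection and evaluate classification accuracy, and the 10 accuracies are averaged.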
Specifically, the present embodiment includes a process of establishing a face training data set through training and classifying images through the face training data set.
Let the existing face training data set be {(x_i, y_i)}_{i=1}^{N}, wherein x_i ∈ R^D is a face sample, y_i ∈ {1, 2, …, c} is the class label of x_i, c is the number of classes, N is the total number of training samples, and D is the dimension of the training samples.
In this embodiment, N is 200, c is 40, and D is 1024. Other values are of course possible, all without affecting the implementation of the invention.
In order to find an optimal transformation P that simultaneously preserves the geometric characteristics of the low-dimensional coordinates and the information of the training points, the data set is mapped into a relatively low-dimensional feature space, namely a d-dimensional space with d < D. In this low-dimensional feature space, the inter-class distance is maximized and the intra-class distance is minimized, i.e.:

max_P trace( (P^T S_b P) / (P^T S_w P) )
wherein trace(·) is the matrix trace function, S_b is the inter-class local divergence matrix, and S_w is the intra-class local divergence matrix. To compute these two local divergence matrices, two adjacency matrices are constructed: the intra-class adjacency matrix F_w and the inter-class adjacency matrix F_b. Then S_w = X(D_w - F_w)X^T and S_b = X(D_b - F_b)X^T, wherein D_w and D_b are both diagonal matrices whose diagonal entries are the corresponding row sums of F_w and F_b. F_w and F_b are defined on the neighbor sets of the training samples, where t > 0 is a parameter of the weighting function and N_w(x_i) and N_b(x_i) are respectively the set of same-class neighbors and the set of different-class neighbors of x_i. In this embodiment, t = 8.
To obtain P, generalized eigendecomposition is performed on S_b and S_w. The obtained eigenvalues are sorted in descending order, and the eigenvectors corresponding to the first d eigenvalues form the matrix P = [p_1, p_2, …, p_d], wherein p_i is an eigenvector obtained from the eigendecomposition.
After the projection matrix P is obtained, each sample of the original sample space is projected into the low-dimensional feature space: z_i = P^T x_i, wherein z_i ∈ R^d is the projection of x_i in the low-dimensional space. Let {(z_i, y_i)}_{i=1}^{N} be the set of projected training samples. In this example, the value of d varies from 1 to 50.
A sample to be tested x ∈ R^D is mapped into the low-dimensional feature space using the projection transformation P to obtain the projected test sample z = P^T x ∈ R^d.
The projected test sample z is classified in the low-dimensional feature space using a nearest neighbor classifier: in the projected training sample set, the sample closest to the test sample is found, and its class is assigned to the projected test sample z. This completes the classification of x. In this embodiment, there are 200 samples to be tested, so the classification step is repeated 200 times.
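The nearest-neighbor classification step can be sketched as follows (a plain 1-NN search over the projected standard samples; names are illustrative):

```python
import numpy as np

def classify(P, x, Z, labels):
    """Project a test sample (z = P^T x) and assign it the class of the
    closest projected standard sample (1-nearest-neighbor rule)."""
    z = P.T @ x                                   # projected test sample
    dists = np.linalg.norm(Z - z[:, None], axis=0)  # distance to each column of Z
    return labels[np.argmin(dists)]
```

For 200 test samples this function is simply called 200 times, once per sample.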
Figure 4 shows a graph of classification accuracy versus dimension for three algorithms: Orthogonal Discriminant Projection (ODP), Discriminant Neighbor Embedding (DNE), and the method of the present invention. It can be seen that the recognition rate of the present invention is higher than that of the other two methods. Table 1 compares the best performance of the three methods at reduced dimensions between 1 and 50, with the corresponding best dimension given in parentheses. The invention achieves the best performance at a lower dimension.
TABLE 1
A block diagram of a specific embodiment of a face recognition apparatus provided in the present invention is shown in fig. 5, and the apparatus includes:
an obtaining module 100, configured to use the obtained face image data as a sample to be detected;
the mapping module 200 is configured to map the sample to be tested to a low-dimensional feature space by using a projection transformation matrix to obtain a projected test sample;
the searching module 300 is configured to search, in the training sample set, a standard sample closest to the test sample as a target sample;
a determining module 400 for determining the category of the target sample as the category of the test sample;
the projective transformation matrix is a transformation matrix obtained by the training module 500 through the constructed intra-class adjacency matrix and inter-class adjacency matrix to train a plurality of samples in the training sample set, so as to maximize the inter-class distance and minimize the intra-class distance.
The face recognition device provided by the invention maps the obtained sample to be tested into a low-dimensional feature space using the projection transformation matrix, obtaining a projected test sample. Then, in the training sample set, the standard sample closest to the test sample is found as the target sample, and the class of the target sample is determined as the class of the test sample, thereby achieving face recognition. The device constructs two separate adjacency matrices for orthogonal discriminant projection, an intra-class adjacency matrix and an inter-class adjacency matrix, which represent the intra-class and inter-class information separately, so that balanced information is obtained and the goal of minimum intra-class distance and maximum inter-class distance is achieved.
The training module 500 in the apparatus for face recognition provided by the present invention further comprises:
a training acquisition unit 501, configured to use the acquired multiple face images as training samples;
a training mapping unit 502, configured to map the training sample to a low-dimensional feature space by using the projective transformation matrix, so as to obtain a projected standard sample;
a training storage unit 503, configured to store the standard sample and the known class of the face image as the training sample set.
As a specific embodiment, the searching module is configured to search, in the training sample set, a standard sample closest to the test sample as a target sample, and includes:
and the searching module searches the standard sample closest to the test sample in the training sample set as a target sample by using a nearest neighbor classifier.
Other specific settings and methods of the face recognition device provided by the invention are similar, and are not described herein again.
Compared with the orthogonal discriminant projection algorithm, the face recognition device provided by the invention can cope with an unbalanced distribution of the data samples and achieves a high recognition rate.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A method of face recognition, comprising:
taking the obtained face image data as a sample to be detected;
mapping the sample to be tested to a low-dimensional feature space by using a projection transformation matrix to obtain a projected test sample;
in a training sample set, searching a standard sample closest to the test sample as a target sample;
determining a category of the target sample as a category of the test sample;
the projective transformation matrix is a transformation matrix obtained by training a plurality of samples in the training sample set through a constructed intra-class adjacency matrix and an inter-class adjacency matrix, so that the inter-class distance is maximum and the intra-class distance is minimum.
2. The method of claim 1, wherein the projective transformation matrix is a transformation matrix obtained by training a plurality of samples in the training sample set through a constructed intra-class adjacency matrix and an inter-class adjacency matrix, and comprises:
calculating, from the constructed intra-class adjacency matrix F_w and the inter-class adjacency matrix F_b, the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b according to S_w = X(D_w - F_w)X^T and S_b = X(D_b - F_b)X^T;
calculating the projection transformation matrix P from the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b, so that the inter-class distance is maximum and the intra-class distance is minimum;
wherein t > 0 is a parameter, N_w(x_i) and N_b(x_i) are respectively the set of same-class neighbors and the set of different-class neighbors of x_i, and D_w and D_b are both diagonal matrices.
3. The method of claim 2, wherein determining the projection transformation matrix P from the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b comprises:
performing generalized eigendecomposition on the intra-class local divergence matrix S_w and the inter-class local divergence matrix S_b, sorting the obtained eigenvalues in descending order, and taking the eigenvectors corresponding to the first d eigenvalues as the projection transformation matrix P, wherein d is the dimension of the space after projection transformation.
4. A method for face recognition as claimed in any one of claims 1 to 3, wherein the set of training samples is a pre-established set, and the pre-established process comprises:
taking the obtained multiple face images as training samples;
mapping the training sample to a low-dimensional feature space by using the projective transformation matrix to obtain a projected standard sample;
and storing the standard sample and the known class of the face image as the training sample set.
5. The method of any one of claims 1 to 3, wherein the searching for the standard sample closest to the test sample as the target sample in the training sample set comprises:
searching, in the training sample set and by using a nearest neighbor classifier, for the standard sample closest to the test sample as the target sample.
6. An apparatus for face recognition, comprising:
the acquisition module is used for taking the acquired face image data as a sample to be detected;
the mapping module is used for mapping the sample to be tested to a low-dimensional feature space by using a projection transformation matrix to obtain a projected test sample;
the searching module is used for searching a standard sample closest to the test sample in the training sample set as a target sample;
a determining module for determining the category of the target sample as the category of the test sample;
the projection transformation matrix is obtained by training a plurality of samples in the training sample set through a constructed intra-class adjacent matrix and an inter-class adjacent matrix by a training module so as to enable the inter-class distance to be maximum and the intra-class distance to be minimum.
7. The apparatus for face recognition as defined in claim 6, wherein the training module comprises:
the training acquisition unit is used for taking the acquired multiple face images as training samples;
the training mapping unit is used for mapping the training sample to a low-dimensional feature space by using the projection transformation matrix to obtain a projected standard sample;
and the training storage unit is used for storing the standard sample and the known class of the face image as the training sample set.
8. The apparatus for face recognition according to claim 6, wherein the searching module is configured to search, as the target sample, a standard sample closest to the test sample in the training sample set, and includes:
the searching module is specifically configured to search, in the training sample set, a standard sample closest to the test sample as a target sample by using a nearest neighbor classifier.
CN201510363785.3A 2015-06-26 2015-06-26 A kind of method and device of recognition of face Active CN104899578B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510363785.3A CN104899578B (en) 2015-06-26 2015-06-26 A kind of method and device of recognition of face

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510363785.3A CN104899578B (en) 2015-06-26 2015-06-26 A kind of method and device of recognition of face

Publications (2)

Publication Number Publication Date
CN104899578A true CN104899578A (en) 2015-09-09
CN104899578B CN104899578B (en) 2019-02-12

Family

ID=54032232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510363785.3A Active CN104899578B (en) 2015-06-26 2015-06-26 A kind of method and device of recognition of face

Country Status (1)

Country Link
CN (1) CN104899578B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105825236A (en) * 2016-03-18 2016-08-03 Soochow University Method and system for building a sample detection model
CN106257488A (en) * 2016-07-07 2016-12-28 University of Electronic Science and Technology of China Radar target recognition method based on neighborhood feature space discriminant analysis
CN107203786A (en) * 2017-06-06 2017-09-26 Soochow University Image recognition method and device based on the sparse marginal Fisher algorithm
CN107480623A (en) * 2017-08-07 2017-12-15 Xidian University Neighborhood-preserving face recognition method based on collaborative representation
CN110738248A (en) * 2019-09-30 2020-01-31 Shuohuang Railway Development Co., Ltd. State-awareness data feature extraction method and device and system performance evaluation method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103679162A (en) * 2014-01-03 2014-03-26 Soochow University Face recognition method and system
CN103679161A (en) * 2014-01-03 2014-03-26 Soochow University Face recognition method and device
US8687880B2 (en) * 2012-03-20 2014-04-01 Microsoft Corporation Real time head pose estimation
CN103870848A (en) * 2014-04-01 2014-06-18 Soochow University Method for obtaining a projection transformation matrix and classifying samples

Similar Documents

Publication Publication Date Title
Simonyan et al. Fisher vector faces in the wild.
Harandi et al. Graph embedding discriminant analysis on Grassmannian manifolds for improved image set matching
Zhao et al. Person re-identification by salience matching
Phillips et al. The good, the bad, and the ugly face challenge problem
Zhou et al. Double shrinking sparse dimension reduction
CN104899578B (en) Method and device for face recognition
CN103824269B (en) Face effect processing method and system
US20080137917A1 (en) Information Processing Apparatus and Information Processing Method, Recognition Apparatus and Information Recognition Method, and Program
CN106446754A (en) Image identification method, metric learning method, image source identification method and devices
CN105868716A (en) Face recognition method based on geometric facial features
CN103020640A (en) Facial image dimensionality reduction classification method based on two-dimensional principal component analysis
CN106096517A (en) Face recognition method based on low-rank matrix and eigenfaces
CN105469117B (en) Image recognition method and device based on robust feature extraction
Beksi et al. Object classification using dictionary learning and rgb-d covariance descriptors
CN104966075B (en) Face recognition method and system based on two-dimensional discriminative features
Gonzalez-Diaz et al. Neighborhood matching for image retrieval
CN104715266B (en) Image feature extraction method combining SRC-DP with LDA
CN106980809A (en) Facial feature point detection method based on ASM
Huang et al. Complete local Fisher discriminant analysis with Laplacian score ranking for face recognition
CN103345621B (en) Face classification method based on sparse CI
Zaeemzadeh et al. Iterative projection and matching: Finding structure-preserving representatives and its application to computer vision
CN116612324A (en) Small-sample image classification method and device based on a semantic adaptive fusion mechanism
Hong et al. Variant grassmann manifolds: A representation augmentation method for action recognition
CN104318224A (en) Face recognition method and monitoring equipment
TW201828156A (en) Image recognition method, metric learning method, and image source identification method and device that effectively handle asymmetric object image recognition with improved robustness and accuracy

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant