CN103207993B - Face recognition method based on kernel discriminative stochastic neighbor embedding analysis - Google Patents

Face recognition method based on kernel discriminative stochastic neighbor embedding analysis

Info

Publication number
CN103207993B
CN103207993B (application CN201310125325.8A / CN201310125325A)
Authority
CN
China
Prior art keywords
sample
matrix
identification method
face identification
class
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201310125325.8A
Other languages
Chinese (zh)
Other versions
CN103207993A (en)
Inventor
Zheng Jianwei (郑建炜)
Huang Qiongfang (黄琼芳)
Qiu Hong (邱虹)
Wang Wanliang (王万良)
Jiang Yibo (蒋一波)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hailiang Information Technology Co ltd
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN201310125325.8A priority Critical patent/CN103207993B/en
Publication of CN103207993A publication Critical patent/CN103207993A/en
Application granted granted Critical
Publication of CN103207993B publication Critical patent/CN103207993B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

Disclosed is a face recognition method based on kernel discriminative stochastic neighbor embedding analysis. It relates to the field of pattern recognition and aims to extract nonlinear discriminant information effectively and obtain a higher recognition rate. The method comprises a training process and a test process, with the following steps: a) randomly select l samples of each subject for model training to obtain the corresponding projection matrix B, with all remaining data used as test samples; b) project all training samples and test samples into a low-dimensional manifold space; c) measure the recognition rate with a nearest-neighbor classifier. On the basis of existing techniques, the face recognition method provided by the invention effectively improves the recognition rate and preserves the within-class and between-class structure of the samples well. Beyond face recognition, the invention is applicable across machine learning and pattern recognition, for example to image recognition and target recognition.

Description

Face recognition method based on kernel discriminative stochastic neighbor embedding analysis
Technical field
The present invention is a face recognition method; specifically, it relates to a face recognition method based on kernel discriminative stochastic neighbor embedding analysis, usable for face recognition, image recognition, target recognition, and similar tasks.
Background technology
Identity verification is of great value in modern society. In recent years human biometric features have been applied ever more widely to personal identification; compared with traditional methods they are safe and reliable, unique, highly stable, and hard to steal or forge. The principal intrinsic human biometric features are DNA, fingerprints, the iris, voice, gait, palm prints, and the face. Building on the understanding of individual characteristics, and combining advanced computer technology with pattern recognition theory, technologies such as DNA identification, fingerprint recognition, and face recognition have developed one after another. At the current level of research, DNA identification and fingerprint recognition achieve high recognition rates and strong reliability, but the strong constraints on their use still limit the application of both approaches. Compared with other biometric recognition methods, face recognition has the following notable advantages: (1) it requires little user participation, the acquisition is contactless, and it is non-invasive; (2) it imposes no obvious stimulus on the user and can operate unobtrusively; (3) the equipment is inexpensive, mainly a camera that captures the face. Face recognition is therefore a special biometric identification technology with many distinctive application environments, such as apprehending criminals, door access control systems, customs inspection, and credit card verification.
Face recognition has long been a research hotspot in pattern recognition and image processing, and the current mainstream approaches are subspace-based face recognition algorithms, such as marginal Fisher analysis, local Fisher discriminant analysis, minimax distance analysis, and maximum margin graph embedding. In recent years many nonlinear dimensionality reduction techniques have been proposed for input data with a nonlinear distribution structure; the two families attracting the most attention are kernel-based methods and geometry-based methods, for example isometric mapping, locally linear embedding, Laplacian eigenmaps, and local tangent space alignment. The method proposed by the present invention belongs to the kernel-based family of face recognition methods: it produces a nonlinear mapping, represents the manifold structure of the sample data well, and achieves a more satisfactory dimensionality reduction effect.
A patent search shows that many face recognition patents already exist at home and abroad, for example: a face recognition method based on supervised neighborhood preserving embedding and support vector machines (200710114882.4), a face recognition method based on general nonlinear discriminant analysis (200710300730.3), and a face recognition method (200810030577.1).
Summary of the invention
The present invention solves the technical problem that existing linear dimensionality reduction techniques cannot handle input data with a nonlinear distribution structure well, and remedies their inability to effectively improve the recognition rate and to preserve the within-class and between-class structure of the samples, by providing a face recognition method based on kernel discriminative stochastic neighbor embedding analysis.
The technical solution adopted by the present invention to solve this problem is:
A face recognition method based on kernel discriminative stochastic neighbor embedding analysis, comprising the following steps:
a) Randomly select l samples of each subject for model training to obtain the corresponding projection matrix B; all remaining data serve as test samples;
b) Project all training samples and test samples into a low-dimensional manifold space;
c) Measure the recognition rate with a nearest-neighbor classifier.
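As an illustrative sketch only (not the patented implementation), steps a) to c) can be exercised end to end in Python; the toy data, the per-class split helper, and the random projection standing in for the learned matrix B are all assumptions:

```python
import numpy as np

def split_per_class(X, y, l, rng):
    """Step a): randomly pick l samples of each class for training; the rest are test data."""
    train_idx, test_idx = [], []
    for c in np.unique(y):
        idx = rng.permutation(np.where(y == c)[0])
        train_idx.extend(idx[:l])
        test_idx.extend(idx[l:])
    return np.array(train_idx), np.array(test_idx)

def nn_recognition_rate(Z_train, y_train, Z_test, y_test):
    """Step c): nearest-neighbor classification in the low-dimensional space."""
    d = ((Z_test[:, None, :] - Z_train[None, :, :]) ** 2).sum(-1)
    return float((y_train[d.argmin(axis=1)] == y_test).mean())

rng = np.random.default_rng(0)
# toy data: 3 classes of 10 samples each in 5 dimensions, separated class centers
centers = 5.0 * rng.normal(size=(3, 5))
X = np.vstack([centers[c] + rng.normal(size=(10, 5)) for c in range(3)])
y = np.repeat(np.arange(3), 10)

tr, te = split_per_class(X, y, l=5, rng=rng)
B = rng.normal(size=(2, 5))          # placeholder projection; step a) would learn this
rate = nn_recognition_rate(X[tr] @ B.T, y[tr], X[te] @ B.T, y[te])
```

The split and classifier are generic; only the learned projection B differs between this sketch and the patented method.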
Specifically, the face recognition method of the present invention comprises two parts, a training part and a testing part, wherein
the training process comprises the following steps:
a1. Determine the training sample matrix X = [x_1, x_2, ..., x_N] and the class labels; choose a kernel function; set the variance parameter λ and the maximum number of iterations Mt;
a2. From the sample matrix X of step a1, compute the pairwise Euclidean distances between input samples, and from the sample similarities in the original space and the class labels compute the joint probability p_ij:
$$p_{ij}=\begin{cases}\dfrac{\exp\left(-(K_{ii}+K_{jj}-2K_{ij})/2\lambda^{2}\right)}{\sum_{c_t=c_l}\exp\left(-(K_{tt}+K_{ll}-2K_{tl})/2\lambda^{2}\right)} & \text{if } c_i=c_j\\[2ex]\dfrac{\exp\left(-(K_{ii}+K_{jj}-2K_{ij})/2\lambda^{2}\right)}{\sum_{c_t\neq c_m}\exp\left(-(K_{tt}+K_{mm}-2K_{tm})/2\lambda^{2}\right)} & \text{otherwise}\end{cases}$$
The joint probability p_ij introduces the Gaussian RBF kernel function κ(x, x') = exp(−σ‖x − x'‖₂²). Given labeled n-dimensional samples x_1^1, x_2^1, ..., x_{N_1}^1, x_1^2, x_2^2, ..., x_{N_2}^2, ..., x_1^C, x_2^C, ..., x_{N_C}^C, where x_i^c denotes the i-th sample of class c, C is the total number of classes, N_i is the number of samples in class i, and K_i = [κ(x_1, x_i), ..., κ(x_N, x_i)]^T is a column vector;
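The kernel quantities of steps a1-a2 can be sketched in NumPy as follows; this is a hedged illustration where the kernel width, toy data, and function names are assumptions, and the normalization mirrors the two cases of the p_ij formula above (same-class pairs normalized together, different-class pairs normalized together):

```python
import numpy as np

def gram_matrix(X, sigma=1.0):
    """Gaussian RBF Gram matrix: K[i, j] = exp(-sigma * ||x_i - x_j||^2)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sigma * sq)

def joint_p(K, y, lam=0.1):
    """Label-aware joint probability p_ij of step a2 (a sketch).

    The kernel-space squared distance is K_ii + K_jj - 2 K_ij; same-class
    pairs are normalized over all same-class pairs, different-class pairs
    over all different-class pairs (self-pairs excluded)."""
    d2 = np.diag(K)[:, None] + np.diag(K)[None, :] - 2 * K
    E = np.exp(-d2 / (2 * lam ** 2))
    same = (y[:, None] == y[None, :])
    np.fill_diagonal(same, False)
    diff = (y[:, None] != y[None, :])
    P = np.zeros_like(E)
    P[same] = E[same] / E[same].sum()
    P[diff] = E[diff] / E[diff].sum()
    return P

rng = np.random.default_rng(1)
X = rng.normal(size=(12, 4))          # 12 toy samples in 4 dimensions
y = np.repeat(np.arange(3), 4)        # 3 classes of 4 samples
K = gram_matrix(X)
P = joint_p(K, y)
```

By construction each of the two blocks of P sums to one, matching the two normalizations in the formula.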
a3. Initialize the transformation matrix B_0 so that its elements follow the standard normal distribution N(0, 1);
a4. Compute the joint probability q_ij from the sample similarities and class labels in the subspace; through the KL divergence, preserve the similarity between same-class samples as much as possible while reducing the similarity between different-class samples; finally, update the transformation matrix B_t with the conjugate gradient method:
a41. The joint probability q_ij is:
$$q_{ij}=\begin{cases}\dfrac{\left(1+(K_i-K_j)^{T}B^{T}B(K_i-K_j)\right)^{-1}}{\sum_{c_t=c_l}\left(1+(K_t-K_l)^{T}B^{T}B(K_t-K_l)\right)^{-1}} & \text{if } c_i=c_j\\[2ex]\dfrac{\left(1+(K_i-K_j)^{T}B^{T}B(K_i-K_j)\right)^{-1}}{\sum_{c_t\neq c_m}\left(1+(K_t-K_m)^{T}B^{T}B(K_t-K_m)\right)^{-1}} & \text{otherwise}\end{cases}$$
a42. The objective cost function is:
$$\min\, C(B)=\sum_{c_i=c_j}p_{ij}\log\frac{p_{ij}}{q_{ij}}+\sum_{c_i\neq c_k}p_{ik}\log\frac{p_{ik}}{q_{ik}}$$
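A minimal sketch of steps a41-a42 in NumPy, assuming toy data; `joint_q` uses the Student-t style similarity of the q_ij formula and `kl_cost` the KL objective, both normalized separately over same-class and different-class pairs:

```python
import numpy as np

def blockwise(W, y):
    """Normalize W separately over same-class and different-class pairs
    (self-pairs excluded), mirroring the two cases of the formulas."""
    same = (y[:, None] == y[None, :])
    np.fill_diagonal(same, False)
    diff = (y[:, None] != y[None, :])
    P = np.zeros_like(W)
    P[same] = W[same] / W[same].sum()
    P[diff] = W[diff] / W[diff].sum()
    return P

def joint_q(K, B, y):
    """Step a41 (sketch): Student-t style similarity (1 + ||B K_i - B K_j||^2)^(-1)."""
    Z = K @ B.T                                    # row i is (B K_i)^T
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return blockwise(1.0 / (1.0 + d2), y)

def kl_cost(P, Q, eps=1e-12):
    """Step a42: sum of p_ij log(p_ij / q_ij) over all nonzero pairs."""
    m = P > 0
    return float((P[m] * np.log(P[m] / (Q[m] + eps))).sum())

rng = np.random.default_rng(1)
X = rng.normal(size=(9, 3))
y = np.repeat(np.arange(3), 3)
K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))   # Gaussian Gram matrix
d2 = np.diag(K)[:, None] + np.diag(K)[None, :] - 2 * K        # kernel-space distances
P = blockwise(np.exp(-d2 / (2 * 0.1 ** 2)), y)                # p_ij of step a2
B = rng.normal(size=(2, 9))                                   # an r x N transformation
Q = joint_q(K, B, y)
cost = kl_cost(P, Q)
```

Because P and Q are normalized over the same blocks, the cost is a sum of two KL divergences and is therefore nonnegative.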
a43. Under this objective function, the transformation matrix B_t is updated iteratively by the classical conjugate gradient method, where the cost functional can be parametrized in two ways:
a431. Parametrize the cost functional by the projection matrix B:
$$\frac{dC(B)}{dB}=\sum_{c_i=c_j}\frac{p_{ij}}{q_{ij}}(q_{ij})'+\sum_{c_i\neq c_t}\frac{p_{it}}{q_{it}}(q_{it})'=2B\left[\sum_{c_i=c_j}u_{ij}(K_i-K_j)(K_i-K_j)^{T}+\sum_{c_i\neq c_t}u_{it}(K_i-K_t)(K_i-K_t)^{T}\right]$$
For convenience of expression, define the following auxiliary variables:
$$w_{ij}=\left[1+(K_i-K_j)^{T}B^{T}B(K_i-K_j)\right]^{-1},\qquad u_{ij}=(p_{ij}-q_{ij})\,w_{ij}$$
$$u_{ij}^{in}=\begin{cases}u_{ij} & \text{if } c_i=c_j\\0 & \text{otherwise}\end{cases},\qquad u_{ij}^{ou}=\begin{cases}u_{ij} & \text{if } c_i\neq c_j\\0 & \text{otherwise}\end{cases}$$
With these auxiliary variables, the gradient formula above can be reduced to:
$$\frac{dC(B)}{dB}=2B\left[\sum_{c_i=c_j}u_{ij}(K_i-K_j)(K_i-K_j)^{T}+\sum_{c_i\neq c_t}u_{it}(K_i-K_t)(K_i-K_t)^{T}\right]=4B\left[K\left(D^{in}-U^{in}+D^{ou}-U^{ou}\right)K^{T}\right]$$
where the elements of the diagonal matrices D^{in} and D^{ou} are the column sums of the corresponding U^{in} and U^{ou} (equivalently the row sums, since U^{in} and U^{ou} are symmetric), namely D^{in}_{ii} = Σ_j u^{in}_{ji} and D^{ou}_{ii} = Σ_j u^{ou}_{ji};
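The algebraic simplification above, from the pairwise sum to 4B[K(D^{in} − U^{in} + D^{ou} − U^{ou})K^T], can be checked numerically; in this sketch U is a random symmetric stand-in for the u_ij values, so all matrices involved are assumptions rather than quantities from a trained model:

```python
import numpy as np

rng = np.random.default_rng(3)
N, r = 8, 2
y = np.repeat(np.arange(2), 4)
K = rng.normal(size=(N, N)); K = K @ K.T           # a symmetric stand-in Gram matrix
B = rng.normal(size=(r, N))
U = rng.normal(size=(N, N)); U = (U + U.T) / 2     # stand-in for u_ij = (p_ij - q_ij) w_ij
np.fill_diagonal(U, 0.0)

same = (y[:, None] == y[None, :])
U_in = np.where(same, U, 0.0)                      # u^in: same-class entries only
U_ou = np.where(~same, U, 0.0)                     # u^ou: different-class entries only
D_in = np.diag(U_in.sum(axis=0))                   # column sums (= row sums by symmetry)
D_ou = np.diag(U_ou.sum(axis=0))

# closed form of the simplified gradient
grad_fast = 4 * B @ (K @ (D_in - U_in + D_ou - U_ou) @ K.T)

# direct pairwise sum 2B * sum_ij u_ij (K_i - K_j)(K_i - K_j)^T,
# with K_i the i-th column of K
S = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        d = K[:, i] - K[:, j]
        S += U[i, j] * np.outer(d, d)
grad_slow = 2 * B @ S
```

The two expressions agree because expanding the outer products and using the symmetry of U collapses the double sum into 2K(D − U)K^T.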
a432. Parametrize the cost functional in feature space by the projection matrix A: the linear projection transformation matrix A in the feature space F can be expressed through the nonlinear mapping function φ (with Φ used as shorthand below), where A = BΦ and B = [b^(1), ..., b^(r)]^T:
$$\frac{dC^{F}(A)}{dA}=\sum_{c_i=c_j}\frac{p_{ij}}{q_{ij}}(q_{ij})'+\sum_{c_i\neq c_t}\frac{p_{it}}{q_{it}}(q_{it})'=2\left[\sum_{c_i=c_j}u_{ij}BQ_{ij}(K_i-K_j)+\sum_{c_i\neq c_t}u_{it}BQ_{it}(K_i-K_t)\right]\Phi$$
where Q_{ij} is an N × N matrix whose i-th column is the vector K_i − K_j, whose j-th column is the vector K_j − K_i, and whose remaining columns are zero vectors;
a5. Output the final projection matrix B_t.
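Putting steps a1-a5 together, a rough training loop might look as follows; plain gradient descent with backtracking stands in for the conjugate gradient method of step a43, and all names, sizes, and parameters are illustrative assumptions:

```python
import numpy as np

def blockwise(W, y):
    # normalize separately over same-class and different-class pairs (self-pairs excluded)
    same = (y[:, None] == y[None, :])
    np.fill_diagonal(same, False)
    diff = (y[:, None] != y[None, :])
    P = np.zeros_like(W)
    P[same] = W[same] / W[same].sum()
    P[diff] = W[diff] / W[diff].sum()
    return P

rng = np.random.default_rng(2)
N, r = 10, 2
y = np.repeat(np.arange(2), 5)
X = rng.normal(size=(N, 3)) + 3.0 * y[:, None]                 # two separated toy classes
K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))    # Gram matrix (step a1)
d2 = np.diag(K)[:, None] + np.diag(K)[None, :] - 2 * K
P = blockwise(np.exp(-d2 / (2 * 0.1 ** 2)), y)                 # p_ij (step a2)

def cost_and_grad(B):
    Z = K @ B.T
    dz = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    W = 1.0 / (1.0 + dz)
    Q = blockwise(W, y)                                        # q_ij (step a41)
    m = P > 0
    cost = float((P[m] * np.log(P[m] / Q[m])).sum())           # KL objective (step a42)
    U = (P - Q) * W                                            # u_ij = (p_ij - q_ij) w_ij
    np.fill_diagonal(U, 0.0)
    D = np.diag(U.sum(axis=0))
    return cost, 4 * B @ (K @ (D - U) @ K.T)                   # simplified gradient

B = 0.01 * rng.normal(size=(r, N))     # step a3: Gaussian-initialized B_0
costs = []
for t in range(40):                    # backtracking descent stands in for CG (step a43)
    c, g = cost_and_grad(B)
    costs.append(c)
    step = 1e-2
    while step > 1e-10 and cost_and_grad(B - step * g)[0] > c:
        step /= 2
    B -= step * g                      # B after the loop plays the role of B_t (step a5)
```

The backtracking line search guarantees the KL cost is non-increasing, which is what the conjugate gradient iteration also achieves in the patented method.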
The testing process comprises the following steps:
a51. Determine the sample matrix X = [x_1, x_2, ..., x_N] and the class labels;
a52. Project the training samples into the low-dimensional manifold space using the projection matrix B_t;
a53. Project the test samples into the low-dimensional manifold space using the projection matrix B_t.
The technical idea of the present invention: a kernel-based improvement of discriminative stochastic neighbor embedding (DSNE), a dimensionality reduction analysis method recently proposed by Zheng Jianwei et al. DSNE builds on stochastic neighbor embedding (SNE), proposed by Hinton et al., and on its improvement t-distributed SNE (t-SNE), proposed by Laurens van der Maaten et al., introducing a linear projective transformation and class label information. SNE converts Euclidean distances between high-dimensional data into probabilities, and its cost functional requires the subspace to have the same probability distribution as the original input space; t-SNE replaces the conditional probabilities of SNE with symmetric joint probabilities and introduces a t-distribution in the subspace to express pairwise similarity between samples. Because SNE and t-SNE are both nonlinear unsupervised dimensionality reduction methods, they suffer from the out-of-sample problem and are ill-suited to pattern discrimination tasks. In 2011, manifold-oriented stochastic neighbor projection (MSNP), proposed by Wu et al., solved the out-of-sample problem well, but MSNP is a linear unsupervised dimensionality reduction method and is still not suited to pattern recognition tasks. The linear, supervised DSNE cleverly solves both problems, but its linearity prevents it from effectively extracting nonlinear features, and its probability density model for samples of different classes still leaves room for improvement.
The present invention applies the idea of kernel methods to propose a face recognition method based on kernel discriminative stochastic neighbor embedding analysis (kernel DSNE, KDSNE), which overcomes the drawbacks of DSNE well.
The advantages of the invention: it handles input data with a nonlinear distribution structure well, effectively improves the recognition rate, and preserves the within-class and between-class structure of the samples well.
Accompanying drawing explanation
Fig. 1 shows sample face images from the ORL face database;
Fig. 2 shows sample face images from the Yale face database;
Fig. 3 shows the recognition rate under different subspace dimensions on the ORL face database;
Fig. 4 shows the recognition rate under different subspace dimensions on the Yale face database;
Fig. 5 is the flow chart of the present invention.
Detailed description of the invention
The invention is further described below with reference to Figs. 1-4.
The face recognition method based on kernel discriminative stochastic neighbor embedding analysis is carried out exactly as set out above: steps a) to c), with the training process of steps a1 to a5 and the projection steps a51 to a53.
Recognition rate tests were performed on two classical face databases, ORL and Yale. In the experiments the face images were uniformly resized to 32 × 32 pixels, with per-pixel grayscale values in the range 0-255. On the ORL face database, 3 and 5 samples of each class were randomly chosen for the recognition rate tests; on the Yale face database, 4 and 6 samples of each class. The invention was compared against the two linear dimensionality reduction algorithms DSNE and MSNP; Wu et al. have already verified in their paper that manifold-oriented stochastic neighbor projection (MSNP) outperforms common dimensionality reduction algorithms such as SNE, t-SNE, LLTSA, and LPP in discrimination ability. The algorithms were configured as follows: for KDSNE1, KDSNE2, and DSNE, the variance parameter λ = 0.1 and the maximum number of iterations is 300; for MSNP, the degree-of-freedom parameter of the Cauchy distribution γ = 4 and the maximum number of iterations is 1000.
Table 1 lists each algorithm's best recognition rate on the two face databases and the corresponding subspace dimension (in parentheses); bold entries mark the highest recognition rate for the same number of training samples. As Table 1 shows, KDSNE1 achieves the best recognition rate on the ORL face database and KDSNE2 on the Yale face database, so the relative performance of KDSNE1 and KDSNE2 varies with the database. Comparing KDSNE1 and KDSNE2 with the other algorithms, the figures and table show that on Yale, KDSNE2 improves on the second-best DSNE by more than 3%; on ORL, although KDSNE2's average improvement is below 2% and DSNE comes very close to KDSNE2 in the l = 5 test, DSNE needs a higher subspace dimension to reach that comparable rate, and for face recognition it is essentially inferior to KDSNE1 and KDSNE2.
Table 1: Best recognition rates obtained by each algorithm on the ORL and Yale databases, with the corresponding dimensions

Claims (5)

1. A face recognition method based on kernel discriminative stochastic neighbor embedding analysis, comprising a training process and a test process, characterized by the following steps:
a) Randomly select l samples of each subject for model training to obtain the corresponding projection matrix B ∈ R^{r×N}, where N is the number of training samples and r is the sample dimension after projection; all remaining data serve as test samples. In step a), training the model on the l randomly selected samples of each subject comprises the following five steps:
a1. Determine the sample matrix X = [x_1, x_2, ..., x_N] and the class labels; choose a kernel function; set the variance parameter λ and the maximum number of iterations Mt, where x_i ∈ R^d is the i-th input sample, λ is the variance parameter of the corresponding Gaussian function, and Mt is the maximum number of iterations;
a2. From the sample matrix X of step a1, compute the pairwise Euclidean distances between input samples, and from the sample similarities in the original space and the class labels compute the joint probability p_ij;
a3. Initialize the transformation matrix B_0 so that its elements follow the standard normal distribution N(0, 1);
a4. Compute the joint probability q_ij from the sample similarities and class labels in the subspace; through the KL divergence, preserve the similarity between same-class samples as much as possible while reducing the similarity between different-class samples; finally, update the transformation matrix B_t with the conjugate gradient method;
a5. Output the final projection matrix B_t;
b) Project all training samples and test samples into a low-dimensional manifold space;
c) Measure the recognition rate with a nearest-neighbor classifier.
2. The face recognition method according to claim 1, characterized in that the Gaussian RBF kernel function κ(x, x') = exp(−λ‖x − x'‖₂²) is introduced when computing the joint probability p_ij in step a2. Given labeled n-dimensional samples x_1^1, x_2^1, ..., x_{N_1}^1, x_1^2, x_2^2, ..., x_{N_2}^2, ..., x_1^C, x_2^C, ..., x_{N_C}^C, where x_i^c denotes the i-th sample of class c, C is the total number of classes, and N_i is the number of samples in class i, the joint probability of the samples in the original space after introducing the kernel function is:
where K_i = [κ(x_1, x_i), ..., κ(x_N, x_i)]^T is a column vector composed of kernel function values.
3. The face recognition method according to claim 2, characterized in that the Gaussian RBF kernel function κ(x, x') = exp(−λ‖x − x'‖₂²) is also introduced when computing the subspace joint probability q_ij in step a4, namely:
4. The face recognition method according to claim 3, characterized in that in step a4 the objective cost function is obtained by minimizing the respective KL divergences within same-class samples and between different-class samples:
5. The face recognition method according to claim 4, characterized in that under the objective function the cost functional is parametrized in two ways:
a41. Parametrize the cost functional by the projection matrix B:
a42. Parametrize the cost functional in feature space by the projection matrix A ∈ R^{r×d}: the linear projection transformation matrix A in the feature space can be expressed through the nonlinear mapping function φ (with Φ used as shorthand below), where A = BΦ and B = [b^(1), ..., b^(r)]^T:
where Q_{ij} is an N × N matrix whose i-th column is the vector K_i − K_j, whose j-th column is the vector K_j − K_i, and whose remaining columns are zero vectors.
CN201310125325.8A 2013-04-10 2013-04-10 Face recognition method based on kernel discriminative stochastic neighbor embedding analysis Expired - Fee Related CN103207993B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310125325.8A CN103207993B (en) 2013-04-10 2013-04-10 Face recognition method based on kernel discriminative stochastic neighbor embedding analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310125325.8A CN103207993B (en) 2013-04-10 2013-04-10 Face recognition method based on kernel discriminative stochastic neighbor embedding analysis

Publications (2)

Publication Number Publication Date
CN103207993A CN103207993A (en) 2013-07-17
CN103207993B true CN103207993B (en) 2016-06-15

Family

ID=48755210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310125325.8A Expired - Fee Related CN103207993B (en) 2013-04-10 2013-04-10 Face recognition method based on kernel discriminative stochastic neighbor embedding analysis

Country Status (1)

Country Link
CN (1) CN103207993B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103500345A (en) * 2013-09-29 2014-01-08 华南理工大学 Method for learning person re-identification based on distance measure
CN103953490A (en) * 2014-04-23 2014-07-30 浙江工业大学 Implementation method for monitoring status of hydraulic turbine set based on HLSNE
CN105893954B (en) * 2016-03-30 2019-04-23 深圳大学 A kind of Non-negative Matrix Factorization face identification method and system based on nuclear machine learning
CN108427923B (en) * 2018-03-08 2022-03-25 广东工业大学 Palm print identification method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142082A (en) * 2011-04-08 2011-08-03 南京邮电大学 Virtual sample based kernel discrimination method for face recognition
CN102693419A (en) * 2012-05-24 2012-09-26 武汉大学 Super-resolution face recognition method based on multi-manifold discrimination and analysis
CN102831389A (en) * 2012-06-28 2012-12-19 北京工业大学 Facial expression recognition algorithm based on discriminative component analysis


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Discriminative stochastic neighbor embedding analysis (判别随机近邻嵌入分析方法); Zheng Jianwei et al.; Journal of Computer-Aided Design & Computer Graphics; Nov. 30, 2012; vol. 24, no. 11; p. 1481, col. 2, last paragraph *
Kernel-based nearest feature centroid classifier and its application to face recognition (基于核的最近邻特征重心分类器及人脸识别应用); He Yunhui et al.; Journal of Circuits and Systems; Apr. 2007; vol. 12, no. 2; pp. 5-10 *

Also Published As

Publication number Publication date
CN103207993A (en) 2013-07-17

Similar Documents

Publication Publication Date Title
CN107609497B (en) Real-time video face recognition method and system based on visual tracking technology
Salimi et al. Visual-based trash detection and classification system for smart trash bin robot
Alapati et al. Combining clustering with classification: a technique to improve classification accuracy
Zhang Off‐line signature verification and identification by pyramid histogram of oriented gradients
CN101976360B (en) Sparse characteristic face recognition method based on multilevel classification
CN102156887A (en) Human face recognition method based on local feature learning
CN105825176A (en) Identification method based on multi-mode non-contact identity characteristics
Huang et al. DeepDiff: Learning deep difference features on human body parts for person re-identification
CN103605972A (en) Non-restricted environment face verification method based on block depth neural network
CN106295694A (en) A kind of face identification method of iteration weight set of constraints rarefaction representation classification
CN104008395A (en) Intelligent bad video detection method based on face retrieval
CN106096517A (en) A kind of face identification method based on low-rank matrix Yu eigenface
CN103207993B (en) Face recognition method based on kernel discriminative stochastic neighbor embedding analysis
CN103745205A (en) Gait recognition method based on multi-linear mean component analysis
CN103077378A (en) Non-contact human face identifying algorithm based on expanded eight-domain local texture features and attendance system
Ramya et al. Certain investigation on iris image recognition using hybrid approach of Fourier transform and Bernstein polynomials
CN103714340A (en) Self-adaptation feature extracting method based on image partitioning
Travieso et al. Bimodal biometric verification based on face and lips
Ngxande et al. Detecting inter-sectional accuracy differences in driver drowsiness detection algorithms
CN117218707B (en) Deep face detection method based on positive disturbance
CN114241564A (en) Facial expression recognition method based on inter-class difference strengthening network
CN103903017A (en) Face recognition method based on self-adaption soft histogram local binary patterns
Abboud et al. Biometric templates selection and update using quality measures
CN101216878A (en) Face identification method based on general non-linear discriminating analysis
CN100416592C (en) Human face automatic identifying method based on data flow shape

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CB03 Change of inventor or designer information

Inventor after: Zheng Jianwei

Inventor after: Qiu Hong

Inventor after: Yang Ping

Inventor after: Huang Qiongfang

Inventor after: Wang Wanliang

Inventor after: Jiang Yibo

Inventor before: Zheng Jianwei

Inventor before: Huang Qiongfang

Inventor before: Qiu Hong

Inventor before: Wang Wanliang

Inventor before: Jiang Yibo

COR Change of bibliographic data
TR01 Transfer of patent right

Effective date of registration: 20180604

Address after: 310012 530, room fifth, No. 20, West Dou Men Road, Xihu District, Hangzhou, Zhejiang.

Patentee after: HANGZHOU HAILIANG INFORMATION TECHNOLOGY Co.,Ltd.

Address before: 310014 Chao Wang Road, Xiacheng City, Hangzhou, Zhejiang Province, No. 18

Patentee before: Zhejiang University of Technology

TR01 Transfer of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160615