CN105787430A - Method for identifying second level human face with weighted collaborative representation and linear representation classification combined


Info

Publication number
CN105787430A
CN105787430A (application CN201610018508.3A)
Authority
CN
China
Prior art keywords
training sample
class
sample
matrix
class training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610018508.3A
Other languages
Chinese (zh)
Inventor
施志刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Shipping College
Original Assignee
Nantong Shipping College
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Shipping College filed Critical Nantong Shipping College
Priority to CN201610018508.3A
Publication of CN105787430A
Legal status: Pending

Links

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 — Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/172 — Classification, e.g. identification
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 — Pattern recognition
    • G06F 18/20 — Analysing
    • G06F 18/21 — Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 — Generating training patterns; Bootstrap methods, e.g. bagging or boosting


Abstract

The invention discloses a two-stage face recognition method that combines weighted collaborative representation classification (CRC) with linear representation classification (LRC). The method proceeds as follows: first, PCA reduces the dimensionality of all image samples to lower computational complexity. Then, because the local similarity between each training sample and the test sample contributes differently, as prior information, to classification, a weighting matrix is constructed and embedded into CRC, yielding a weighted CRC. Based on the weighted CRC, classes are ranked by reconstruction residual and the several most similar classes of training samples are retained for LRC, which performs a second-stage classification. By narrowing the set of candidate classes, the method classifies more accurately and substantially reduces recognition time.

Description

Two-stage face recognition method combining weighted collaborative representation and linear representation classification
Technical field
The present invention relates to a face recognition method, and in particular to a two-stage face recognition method that combines weighted collaborative representation classification with linear representation classification.
Background technology
In the digital information age, with the rapid development of the Internet, the security of information resources is a critical problem, and access-control measures are deployed in many settings. Among them, identity verification based on physiological features, such as face recognition and fingerprint recognition, is widely used for its real-time performance and convenience. In transportation, and especially at high-security locations such as airports and stations, computer vision systems now perform real-time security monitoring, and the extraction and analysis of relevant features from video are based on faces. Detecting and recognizing faces in complex environments is therefore a central research topic in the face recognition field.
Feature-based face recognition methods seek low-dimensional features of the target image that correlate with its class, but there is still no authoritative criterion for transforming high-dimensional images into a low-dimensional space. With the rise of compressed sensing and coding theory, some researchers found that the choice of features is not critical: if a facial image can be estimated accurately enough in a high-dimensional space, a high recognition rate can still be obtained. Face recognition methods based on sparse coding models have therefore attracted wide attention in recent years. Wright et al. first proposed the sparse representation classification (SRC) algorithm, which solves, under a sparsity constraint, for a linear combination of training samples that represents the target sample, and assigns the target sample to the class carrying the largest nonzero coefficients. Because it is insensitive to noise in the image itself, the method is robust in recognition.
Although SRC-based methods perform well in applications, researchers also found that SRC solves an l1-norm problem derived from maximum-likelihood estimation, and the required iterations make its computational complexity high. Zhang et al. argued that in face recognition the classical SRC model, with its l1-norm sparse solution, is built on the assumption that the training samples of every class form an overcomplete basis; in practice, however, the face databases used for training often contain too few samples. Since the samples within a class are incomplete, and samples across classes also share a certain similarity, samples from other classes can help represent the target collaboratively. They therefore proposed collaborative representation classification (CRC), which replaces the l1-norm sparse solution with an l2-norm one. CRC is less sparse than l1-norm SRC, but during classification it keeps the error against within-class samples sufficiently small while the error against between-class samples remains sufficiently large, so recognition is more effective and computation is greatly accelerated.
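As a concrete illustration of the CRC scheme described above (the plain, unweighted form; the patent's weighted variant is developed later), here is a minimal NumPy sketch with toy data: the l2-regularized coding has the closed form α = (XᵀX + λI)⁻¹Xᵀy, solved once over all training samples, and the query is assigned by the class-wise regularized residual. All sizes, seeds, and data here are illustrative assumptions, not from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dictionary: 3 classes, 5 training samples each, 20-dim features.
C, n_per, d = 3, 5, 20
X = rng.normal(size=(d, C * n_per))          # columns are training samples
labels = np.repeat(np.arange(C), n_per)

y = X[:, 7] + 0.01 * rng.normal(size=d)      # query close to a class-1 sample
lam = 0.01                                   # regularization parameter

# CRC: one closed-form ridge solve over ALL samples (l2-norm, not SRC's l1).
alpha = np.linalg.solve(X.T @ X + lam * np.eye(C * n_per), X.T @ y)

# Class-wise regularized residuals ||y - X_i alpha_i||_2 / ||alpha_i||_2.
residuals = []
for i in range(C):
    idx = labels == i
    r = np.linalg.norm(y - X[:, idx] @ alpha[idx]) / np.linalg.norm(alpha[idx])
    residuals.append(r)

pred = int(np.argmin(residuals))
```

The single closed-form solve is what makes CRC far cheaper than the iterative l1-norm solvers required by SRC.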
In addition, Naseem et al. proposed linear representation classification (LRC), which can likewise estimate the distribution of facial images in a high-dimensional space accurately. It differs from CRC, which applies a constraint over all training samples jointly and decides the class of the test sample from the per-class reconstruction residuals: LRC assigns the test sample to the class with the minimum reconstruction error among per-class least-squares fits. The algorithm is simple and easy to implement, and in face recognition both its computational efficiency and its recognition rate are good, but it places higher demands on image quality and is susceptible to noise.
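A minimal sketch of plain LRC as just described, on hypothetical toy data: each class fits the query by ordinary least squares over its own samples only, and the class with the smallest reconstruction error wins. Sizes and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

C, n_per, d = 3, 4, 30
X = [rng.normal(size=(d, n_per)) for _ in range(C)]   # per-class sample matrices

# Query constructed to lie exactly in class 2's column span.
y = X[2] @ np.array([0.5, -0.2, 0.1, 0.3])

errors = []
for Xi in X:
    # LRC: least-squares coefficients for class i, then reconstruct y.
    beta, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    errors.append(np.linalg.norm(y - Xi @ beta))

pred = int(np.argmin(errors))
```

Because y lies in class 2's subspace, its least-squares reconstruction error is essentially zero, while random subspaces of the other classes leave large residuals.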
Summary of the invention
The technical problem to be solved by the present invention is to provide a two-stage face recognition method, combining weighted collaborative representation with linear representation classification, that places low demands on the images themselves while achieving high recognition accuracy and efficiency.
To solve the above technical problem, the present invention adopts the following technical solution:
A two-stage face recognition method combining weighted collaborative representation and linear representation classification, characterized by comprising the following steps:
Step 1: the face database contains images of C people, each with n_i images of size m × n; define the training sample set X = {X_ij | i = 1, …, C; j = 1, …, n_i} and the test sample Y ∈ R^(m×n); vectorize each training sample X_ij into x_ij ∈ R^(mn×1); the matrix formed by the i-th class of training samples is then X_i = [x_i1, …, x_i,n_i], the matrix formed by all C classes is X = [X_1, …, X_C] ∈ R^(mn×N) (N = n_1 + … + n_C), and the test sample Y is vectorized into y ∈ R^(mn×1);
Step 2: compute the error between each training sample x_ij and y, and build the diagonal weight matrix M;
Step 3: apply principal component analysis to reduce the dimensionality of the x_ij and y, obtaining new training and test samples, vectorized as a_ij and B respectively; the matrix formed by the i-th class of new training samples is then A_i = [a_i1, …, a_i,n_i], and the matrix formed by all C classes is A = [A_1, …, A_C];
Step 4: embed the diagonal weight matrix M into the collaborative representation model, with A_i as the coding dictionary for collaborative representation classification, i.e. α̂_i = argmin over α_i of { ||B − A_i α_i||_2^2 + λ ||M α_i||_2^2 };
Step 5: solve the weighted collaborative representation model by least squares for the sparse vector α_i corresponding to the coding dictionary A_i, i.e. α_i = (A_i^T A_i + λ M^T M)^(-1) A_i^T B;
Step 6: compute the reconstruction residual e_σi between the new test sample B and each class of training samples A_i, i.e. e_σi = ||B − A_i α_i||_2, where A_i α_i is the reconstruction from the i-th class of training samples A_i;
Step 7: sort the elements of the reconstruction-residual matrix e_σ = [e_σ1, e_σ2, …, e_σC] in ascending order to obtain the ordered residual matrix e_ε = [e_ε1, e_ε2, …, e_εC], and select the sample sets corresponding to the first S classes whose residuals fall below the threshold; the matrix formed by the S selected classes of training samples is then Ā = [Ā_1, …, Ā_S], where the matrix formed by the i-th selected class is Ā_i = [ā_i1, …, ā_i,n_i];
Step 8: supposing the test sample B belongs to the i-th class, represent B linearly by each class of retained training samples Ā_i, i.e. B = Ā_i μ_i;
Step 9: solve by least squares for the coefficient vector μ_i of the i-th class of training samples Ā_i, i.e. μ_i = (Ā_i^T Ā_i)^(-1) Ā_i^T B;
Step 10: reconstruct from the i-th class of training samples Ā_i, i.e. B_i = Ā_i μ_i, where B_i is the reconstruction of B from the i-th class;
Step 11: compute the error β_i between the test sample B and each class reconstruction B_i, i.e. β_i = ||B − B_i||_2;
Step 12: assign the test sample B to a class according to the reconstruction errors β_i.
Further, the similarity between each training sample x_ij and the test sample y is computed by the Euclidean distance d_ij = ||x_ij − y||_2, from which the diagonal weight matrix M = diag(d_11, d_12, …, d_C,n_C) is built.
Further, in weighted collaborative classification, the S classes of training samples with the smallest residuals e_σi are selected, according to the reconstruction residuals, for the second-stage representation; simulation experiments show that retaining S = C × 15% classes gives good recognition results.
Further, in weighted collaborative classification, the reconstruction error e_σi between the test sample B and each class of training samples A_i is computed by the Euclidean distance e_σi = ||B − A_i α_i||_2, rather than the conventional e_σi = ||B − A_i α_i||_2 / ||α_i||_2; when the training samples are chosen at random, computing the reconstruction error by plain Euclidean distance gives better recognition results.
Further, the class of the test sample B is determined by identity(B) = argmin over i of β_i.
Compared with the prior art, the present invention has the following advantages and effects:
1. Using the l2-norm as the constraint greatly improves computational efficiency, and the sparse solution obtained through the cooperation of all samples still classifies well at reconstruction time;
2. Since each training sample shares a certain local similarity with the test sample, these priors contribute differently to classification; a weighting matrix built from sample weights is therefore embedded into CRC, and principal component analysis (PCA) reduces the dimensionality of the original samples before collaborative classification, which lowers computational complexity while improving the recognition rate.
3. Exploiting the different characteristics of collaborative representation and linear representation in sample selection compensates for the recognition jitter that collaborative representation alone can exhibit under certain low-dimensional features.
4. Selecting the most correlated target classes and narrowing the class range through a second-stage classification makes classification more accurate and the recognition rate higher.
Detailed description of the invention
The present invention is described in further detail through the following examples, which illustrate the invention without limiting it.
The two-stage face recognition method of the present invention builds on the relatively low time complexity of the CRC algorithm, which, by sacrificing some sparsity, still achieves a high recognition rate through collaboration; the new method is therefore built around CRC. Considering that each training sample shares a certain local similarity with the test sample, their contributions to classification depend on that degree of similarity, which can be computed by Euclidean distance, i.e. from the error between each training sample and the test sample. To avoid the recognition jitter that can appear under certain low-dimensional features after PCA dimensionality reduction in collaborative classification, linear representation classification (LRC) is fused in. The algorithm comprises the following steps:
Step 1: suppose the face database contains images of C people, each with n_i (i = 1, 2, …, C) images, giving the training sample set X = {X_ij | i = 1, …, C; j = 1, …, n_i}, where X_ij is the j-th image sample of the i-th class. Each image has size m × n, and the test sample is Y ∈ R^(m×n). Vectorize each training sample X_ij into x_ij ∈ R^(mn×1); the matrix formed by the i-th class of training samples is then X_i = [x_i1, …, x_i,n_i], the matrix formed by all C classes is X = [X_1, …, X_C] ∈ R^(mn×N) (N = n_1 + … + n_C), and the test sample Y is vectorized into y ∈ R^(mn×1).
Step 2: compute the error between each training sample x_ij and y by Euclidean distance, i.e. d_ij = ||x_ij − y||_2, where d_ij denotes the local similarity between the j-th training sample of the i-th class and the test sample y, and from these build the diagonal weight matrix M = diag(d_11, d_12, …, d_C,n_C).
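A small sketch of this weighting step under one plausible reading (the d_ij formula and the layout of M are image placeholders in the source, so the exact expressions here are assumptions): d_ij is the Euclidean distance between each vectorized training sample and the query, and M carries those distances on its diagonal.

```python
import numpy as np

rng = np.random.default_rng(2)

N, d = 6, 10                      # 6 training samples, 10-dim (toy sizes)
X = rng.normal(size=(d, N))       # vectorised training samples as columns
y = rng.normal(size=d)            # vectorised test sample

# d_ij: Euclidean distance between each training sample and the query
# (assumed form of the patent's similarity measure).
dists = np.linalg.norm(X - y[:, None], axis=0)

# Diagonal weight matrix M with the distances on its diagonal.
M = np.diag(dists)
```

A larger distance (lower similarity) thus yields a larger diagonal entry, which later penalizes that sample's coefficient more heavily in the weighted CRC solve.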
Step 3: apply principal component analysis (PCA) to reduce the dimensionality of the x_ij and y, obtaining new training and test samples, vectorized as a_ij and B respectively. The matrix formed by the i-th class of new training samples is then A_i = [a_i1, …, a_i,n_i], and the matrix formed by all C classes is A = [A_1, …, A_C].
Step 4: embed the diagonal weight matrix M into the collaborative representation model, with A_i as the coding dictionary for collaborative representation classification:
α̂_i = argmin over α_i of { ||B − A_i α_i||_2^2 + λ ||M α_i||_2^2 }
where α̂_i denotes the sparse solution, α_i is the sparse vector for the i-th class of training samples A_i, and λ is the regularization parameter.
Step 5: solve the above weighted collaborative representation model by least squares for the sparse vector α_i corresponding to each coding dictionary A_i (i = 1, 2, …, C):
α_i = (A_i^T A_i + λ M^T M)^(-1) A_i^T B
The overall sparse solution is then α̂ = [α_1, α_2, …, α_C].
Step 6: compute the reconstruction residual e_σi between the new test sample B and each class of training samples A_i:
e_σi = ||B − A_i α_i||_2, i = 1, 2, …, C
where A_i α_i is the reconstruction from the i-th class of training samples A_i. In weighted collaborative classification this reconstruction error is the plain Euclidean distance e_σi = ||B − A_i α_i||_2, not the conventional e_σi = ||B − A_i α_i||_2 / ||α_i||_2; when the training samples are chosen at random, the plain Euclidean distance gives better recognition results.
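Putting Steps 4–6 together, a hedged NumPy sketch of the first (weighted-CRC) stage on toy data. The patent writes α_i = (A_i^T A_i + λ M^T M)^(-1) A_i^T B per class while M is indexed over all training samples, so this sketch assumes the class-i diagonal block of M is what enters the per-class solve; that reading, like all the toy data below, is an assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

C, n_per, d = 4, 5, 25
A = [rng.normal(size=(d, n_per)) for _ in range(C)]   # per-class dictionaries
B = A[1] @ rng.normal(size=n_per)                     # query from class 1's span
lam = 0.01                                            # regularization parameter

residuals = []
for Ai in A:
    # Assumed reading: the class-i diagonal block of M holds the Euclidean
    # distances between that class's samples and the query.
    Mi = np.diag(np.linalg.norm(Ai - B[:, None], axis=0))
    # Weighted-CRC closed form: alpha_i = (Ai^T Ai + lam Mi^T Mi)^-1 Ai^T B
    alpha_i = np.linalg.solve(Ai.T @ Ai + lam * (Mi.T @ Mi), Ai.T @ B)
    # Step 6: plain Euclidean reconstruction residual e_i = ||B - Ai alpha_i||_2
    residuals.append(np.linalg.norm(B - Ai @ alpha_i))

pred = int(np.argmin(residuals))
```

The class whose regularized reconstruction comes closest to B gets the smallest residual, which is exactly what Step 7 then ranks.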
Step 7: sort the elements of the reconstruction-residual matrix e_σ = [e_σ1, e_σ2, …, e_σC] in ascending order to obtain the ordered residual matrix e_ε = [e_ε1, e_ε2, …, e_εC], and select the sample sets corresponding to the first S classes whose residuals fall below the threshold, with S = C × 15%. The matrix formed by the S selected classes of training samples is Ā = [Ā_1, …, Ā_S], where the matrix formed by the i-th selected class is Ā_i = [ā_i1, …, ā_i,n_i]. In weighted collaborative classification, the S classes with the smallest e_σi are retained, according to the reconstruction residuals, for the second-stage representation; simulation experiments show that retaining S = C × 15% classes gives good recognition results.
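The S = C × 15% selection in Step 7 can be sketched with integer arithmetic; the residual values below are hypothetical, chosen only to exercise the ranking.

```python
# Reconstruction residuals e_sigma_i for C = 20 classes (hypothetical values).
C = 20
residuals = [0.9, 0.2, 1.4, 0.05, 0.7, 1.1, 0.3, 0.8, 0.6, 1.0,
             0.4, 1.2, 0.15, 0.95, 0.55, 1.3, 0.25, 0.85, 0.65, 1.05]

S = max(1, C * 15 // 100)                     # S = C * 15%, at least one class
# Ascending sort of the residuals; keep the class indices of the S smallest.
kept = sorted(range(C), key=lambda i: residuals[i])[:S]
```

Here S = 3, and `kept` lists the three classes with the smallest weighted-CRC residuals, which become the candidate set for the LRC stage.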
Step 8: supposing the test sample B belongs to the i-th class (i = 1, 2, …, S), represent B linearly by each class of retained training samples Ā_i, i.e. B = Ā_i μ_i, where μ_i denotes the coefficient vector of the i-th class of training samples Ā_i.
Step 9: solve by least squares for the coefficient vector μ_i of the i-th class of training samples Ā_i: μ_i = (Ā_i^T Ā_i)^(-1) Ā_i^T B.
Step 10: reconstruct from the i-th class of training samples Ā_i, i.e. B_i = Ā_i μ_i, where B_i is the reconstruction of B from the i-th class.
Step 11: compute, by Euclidean distance, the error β_i between the test sample B and each class reconstruction B_i: β_i = ||B − B_i||_2 (i = 1, 2, …, S).
Step 12: determine the class of the test sample B by identity(B) = argmin over i of β_i (i = 1, 2, …, S).
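An end-to-end sketch of the two-stage procedure (Steps 4–12) on toy data; the function name, all sizes, and the per-class weighting follow the same assumed reading as above and are mine, not the patent's. PCA (Step 3) is omitted since the toy features are already low-dimensional.

```python
import numpy as np

def two_stage_classify(A, B, lam=0.01, keep_percent=15):
    """Two-stage scheme: weighted CRC shortlists classes, LRC decides.
    A: list of per-class matrices (columns are reduced training samples).
    B: reduced, vectorised test sample."""
    C = len(A)
    # Stage 1: weighted-CRC residual per class (assumed per-class weights).
    stage1 = []
    for Ai in A:
        Mi = np.diag(np.linalg.norm(Ai - B[:, None], axis=0))
        alpha = np.linalg.solve(Ai.T @ Ai + lam * (Mi.T @ Mi), Ai.T @ B)
        stage1.append(np.linalg.norm(B - Ai @ alpha))
    # Keep the S = C * 15% classes with the smallest residuals (at least one).
    S = max(1, C * keep_percent // 100)
    shortlist = np.argsort(stage1)[:S]
    # Stage 2: plain LRC over the shortlisted classes only.
    best, best_err = -1, np.inf
    for i in shortlist:
        mu, *_ = np.linalg.lstsq(A[i], B, rcond=None)
        err = np.linalg.norm(B - A[i] @ mu)
        if err < best_err:
            best, best_err = int(i), err
    return best

rng = np.random.default_rng(4)
C, n_per, d = 10, 5, 40
A = [rng.normal(size=(d, n_per)) for _ in range(C)]
B = A[6] @ rng.normal(size=n_per)   # query drawn from class 6's subspace
pred = two_stage_classify(A, B)
```

Restricting the LRC stage to the shortlist is what shrinks the classification scope and, per the patent's claim, cuts recognition time while improving accuracy.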
Simulation experiments verify that the face recognition method of the present invention greatly improves recognition performance. Compared with sparse representation classification (SRC), collaborative representation (CRC) uses the l2-norm as its constraint, which greatly improves computational efficiency, and the sparse solution obtained through the cooperation of all samples still classifies well at reconstruction time. Linear representation classification (LRC) classifies directly by the reconstruction errors between the test sample and each class of samples, so the algorithm is simple to implement. Considering both effectiveness and time complexity, this technique starts from the local similarity between each training sample and the test sample: since these priors contribute differently to classification, a weighting matrix built from sample weights is embedded into CRC, and PCA dimensionality reduction of the original samples further lowers computational complexity. Moreover, by exploiting the different characteristics of CRC and LRC in sample selection and adopting a combined strategy, the proposed fusion of weighted CRC and LRC resolves the recognition jitter that can appear under certain low-dimensional features after PCA dimensionality reduction in collaborative classification. The method first solves the sparse representation over all training samples and, according to the reconstruction residuals, selects the most correlated target classes; it then uses LRC to compute the reconstruction error of each remaining class and thereby determines the class of the test sample. This second-stage classification, by narrowing the class range, makes classification more accurate and the recognition rate higher.
The content of this specification is only an illustration of the present invention. Those skilled in the art may modify or supplement the described embodiments, or substitute them in similar ways; as long as such changes do not depart from the content of this specification or exceed the scope defined by the claims, they fall within the protection scope of the present invention.

Claims (5)

1. A two-stage face recognition method combining weighted collaborative representation and linear representation classification, characterized by comprising the following steps:
Step 1: the face database contains images of C people, each with n_i images of size m × n; define the training sample set X = {X_ij | i = 1, …, C; j = 1, …, n_i} and the test sample Y ∈ R^(m×n); vectorize each training sample X_ij into x_ij ∈ R^(mn×1); the matrix formed by the i-th class of training samples is X_i = [x_i1, …, x_i,n_i], the matrix formed by all C classes is X = [X_1, …, X_C] ∈ R^(mn×N), and the test sample Y is vectorized into y ∈ R^(mn×1);
Step 2: compute the error between each training sample x_ij and y, and build the diagonal weight matrix M;
Step 3: apply principal component analysis to reduce the dimensionality of the x_ij and y, obtaining new training and test samples, vectorized as a_ij and B respectively; the matrix formed by the i-th class of new training samples is A_i = [a_i1, …, a_i,n_i], and the matrix formed by all C classes is A = [A_1, …, A_C];
Step 4: embed the diagonal weight matrix M into the collaborative representation model, with A_i as the coding dictionary for collaborative representation classification, i.e. α̂_i = argmin over α_i of { ||B − A_i α_i||_2^2 + λ ||M α_i||_2^2 };
Step 5: solve the weighted collaborative representation model by least squares for the sparse vector α_i corresponding to the coding dictionary A_i, i.e. α_i = (A_i^T A_i + λ M^T M)^(-1) A_i^T B;
Step 6: compute the reconstruction residual e_σi between the new test sample B and each class of training samples A_i, i.e. e_σi = ||B − A_i α_i||_2, where A_i α_i is the reconstruction from the i-th class of training samples A_i;
Step 7: sort the elements of the reconstruction-residual matrix e_σ = [e_σ1, e_σ2, …, e_σC] in ascending order to obtain the ordered residual matrix e_ε = [e_ε1, e_ε2, …, e_εC], and select the sample sets corresponding to the first S classes whose residuals fall below the threshold; the matrix formed by the S selected classes of training samples is Ā = [Ā_1, …, Ā_S], where the matrix formed by the i-th selected class is Ā_i = [ā_i1, …, ā_i,n_i] (i = 1, 2, …, S; j = 1, 2, …, n_i);
Step 8: supposing the test sample B belongs to the i-th class, represent B linearly by each class of retained training samples Ā_i, i.e. B = Ā_i μ_i (i = 1, 2, …, S);
Step 9: solve by least squares for the coefficient vector μ_i of the i-th class of training samples Ā_i, i.e. μ_i = (Ā_i^T Ā_i)^(-1) Ā_i^T B;
Step 10: reconstruct from the i-th class of training samples Ā_i, i.e. B_i = Ā_i μ_i, where B_i is the reconstruction of B from the i-th class;
Step 11: compute the error β_i between the test sample B and each class reconstruction B_i, i.e. β_i = ||B − B_i||_2;
Step 12: assign the test sample B to a class according to the reconstruction errors β_i.
2. The two-stage face recognition method combining weighted collaborative representation and linear representation classification of claim 1, characterized in that: the similarity between each training sample x_ij and the test sample y is computed by the Euclidean distance d_ij = ||x_ij − y||_2, from which the diagonal weight matrix M = diag(d_11, d_12, …, d_C,n_C) is built.
3. The two-stage face recognition method combining weighted collaborative representation and linear representation classification of claim 1, characterized in that: in weighted collaborative classification, the S classes of training samples with the smallest residuals e_σi are selected, according to the reconstruction residuals, for the second-stage representation, and retaining S = C × 15% classes gives good recognition results.
4. The two-stage face recognition method combining weighted collaborative representation and linear representation classification of claim 1, characterized in that: in weighted collaborative classification, the reconstruction error e_σi between the test sample B and each class of training samples A_i is computed by the Euclidean distance e_σi = ||B − A_i α_i||_2.
5. The two-stage face recognition method combining weighted collaborative representation and linear representation classification of claim 1, characterized in that: the class of the test sample B is determined by identity(B) = argmin over i of β_i.
CN201610018508.3A 2016-01-12 2016-01-12 Method for identifying second level human face with weighted collaborative representation and linear representation classification combined Pending CN105787430A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610018508.3A CN105787430A (en) 2016-01-12 2016-01-12 Method for identifying second level human face with weighted collaborative representation and linear representation classification combined

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610018508.3A CN105787430A (en) 2016-01-12 2016-01-12 Method for identifying second level human face with weighted collaborative representation and linear representation classification combined

Publications (1)

Publication Number Publication Date
CN105787430A (en) 2016-07-20

Family

ID=56403019

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610018508.3A Pending CN105787430A (en) 2016-01-12 2016-01-12 Method for identifying second level human face with weighted collaborative representation and linear representation classification combined

Country Status (1)

Country Link
CN (1) CN105787430A (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104166860A (en) * 2014-07-25 2014-11-26 哈尔滨工业大学深圳研究生院 Constraint-based face identification method for single test sample
CN104182734A (en) * 2014-08-18 2014-12-03 桂林电子科技大学 Linear-regression based classification (LRC) and collaborative representation based two-stage face identification method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
RADU TIMOFTE: "Weighted Collaborative Representation and Classification of Images", 《21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION》 *
WEI LI et al.: "Nearest Regularized Subspace for Hyperspectral Classification", 《IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING》 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250929A (en) * 2016-07-29 2016-12-21 中国石油大学(华东) The method for designing of elastomeric network constraint self-explanatory rarefaction representation grader
CN106407982A (en) * 2016-09-23 2017-02-15 厦门中控生物识别信息技术有限公司 Data processing method and equipment
CN106407982B (en) * 2016-09-23 2019-05-14 厦门中控智慧信息技术有限公司 A kind of data processing method and equipment
CN109478228A (en) * 2016-09-30 2019-03-15 富士通株式会社 Fusion method, device and the electronic equipment of classification results
CN108197573A (en) * 2018-01-03 2018-06-22 南京信息工程大学 The face identification method that LRC and CRC deviations based on mirror image combine
CN109766810A (en) * 2018-12-31 2019-05-17 陕西师范大学 Recognition of face classification method based on collaboration expression and pond and fusion
CN109766810B (en) * 2018-12-31 2023-02-28 陕西师范大学 Face recognition classification method based on collaborative representation, pooling and fusion
CN110096992B (en) * 2019-04-26 2022-12-16 兰州大学 Face recognition method based on collaborative representation nonlinear fusion Bhattacharyya coefficient
CN110096992A (en) * 2019-04-26 2019-08-06 兰州大学 A kind of face identification method indicating non-linear fusion Pasteur coefficient based on collaboration
CN110232317A (en) * 2019-05-05 2019-09-13 五邑大学 Hyperspectral image classification method based on super-pixel segmentation and two phase classification strategy
CN110232317B (en) * 2019-05-05 2023-01-03 五邑大学 Hyper-spectral image classification method, device and medium based on super-pixel segmentation and two-stage classification strategy
CN110956113A (en) * 2019-11-25 2020-04-03 南京审计大学 Robust face recognition method based on secondary cooperation representation identification projection
CN110956113B (en) * 2019-11-25 2022-05-24 南京审计大学 Robust face recognition method based on secondary cooperation representation identification projection
CN111950429A (en) * 2020-08-07 2020-11-17 南京审计大学 Face recognition method based on weighted collaborative representation
CN111950429B (en) * 2020-08-07 2023-11-14 南京审计大学 Face recognition method based on weighted collaborative representation
CN113505717A (en) * 2021-07-17 2021-10-15 桂林理工大学 Online passing system based on face and facial feature recognition technology


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20160720