CN108537206B - Face verification method based on convolutional neural network - Google Patents

Face verification method based on convolutional neural network

Info

Publication number
CN108537206B
CN108537206B CN201810366071.1A
Authority
CN
China
Prior art keywords
positive
sample
template
neural network
convolutional neural
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810366071.1A
Other languages
Chinese (zh)
Other versions
CN108537206A (en)
Inventor
袭肖明
于治楼
陈祥
吴永健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Inspur Scientific Research Institute Co Ltd
Original Assignee
Shandong Inspur Scientific Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Inspur Scientific Research Institute Co Ltd filed Critical Shandong Inspur Scientific Research Institute Co Ltd
Priority to CN201810366071.1A priority Critical patent/CN108537206B/en
Publication of CN108537206A publication Critical patent/CN108537206A/en
Application granted granted Critical
Publication of CN108537206B publication Critical patent/CN108537206B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a face verification method based on a convolutional neural network, realized by the following steps: first, collecting training images, selecting positive samples, and generating a positive template; constructing a training set by comparing the collected training images with the positive template; designing a loss function and training a convolutional neural network model on the training set by optimizing this function, so that the model accepts input in sample-pair form; and constructing a sample pair from the sample to be verified and the generated positive template, inputting it into the convolutional neural network, and determining the sample as positive or negative according to the verification result, where the positive class passes verification and the negative class does not. Compared with the prior art, the method changes single-sample input into sample-pair input, learns the similarity between a sample and the positive template, and directly outputs their comparison result, so no one-by-one comparison with other positive samples is needed and verification time is effectively reduced.

Description

Face verification method based on convolutional neural network
Technical Field
The invention relates to the technical field of computer application, in particular to a face verification method based on a convolutional neural network.
Background
With the continuous development of society and the economy, market demand for identity authentication keeps growing, and identity authentication is becoming increasingly important. Face verification is currently the mainstream method of identity authentication. Existing approaches focus on reducing the recognition error rate, which implicitly assumes that every user's identity is equally important. However, for scenarios that require a low rejection rate or a low false-recognition rate, this assumption is problematic. For example, in a financial institution, only a few senior employees may hold important privileges such as payment and transfer. In this scenario, the loss from misidentifying an outsider as a senior employee is far higher than the loss from misidentifying a senior employee as an outsider. For another example, in a system for identifying wanted criminals, the loss from misclassifying a wanted criminal in the database as an ordinary citizen is far higher than the loss from misclassifying an ordinary citizen as a wanted criminal. Therefore, for scenarios requiring a low rejection rate or a low false-recognition rate, a face verification method designed for such special scenarios, able to recognize the faces of specific users efficiently and correctly, is of great significance for improving the product competitiveness of enterprises.
In order to solve the problem, the patent provides a face verification method based on a convolutional neural network.
Disclosure of Invention
The technical task of the invention is to provide a face verification method based on a convolutional neural network that addresses the above defects.
A face verification method based on a convolutional neural network is realized by the following steps,
firstly, collecting training images, selecting a positive sample, and generating a positive template;
secondly, constructing a training set: comparing the collected training images with the positive template and dividing them into samples of the same class as the template and samples of a different class from the template;
thirdly, designing a loss function and training the convolutional neural network model on the training set by optimizing this function, so that the model accepts input in sample-pair form;
and fourthly, constructing a sample pair from the sample to be verified and the generated positive template, inputting it into the convolutional neural network, and determining the sample as positive or negative according to the verification result, where the positive class passes verification and the negative class does not.
In the first step, the positive samples are the samples of concern in the specified application scenario. When generating the positive template, each positive sample is first weighted according to its importance, and the positive template T is then obtained from the existing positive samples and their weights using the following formula:
T = ∑_{i=1}^{N} R_i · U_i
where N is the number of existing positive samples, R_i is the weight of the i-th user, measuring the importance of that user's identity, and U_i is the data of the i-th user.
In the training-set construction of the second step, after the collected training images are compared with the positive template, samples of the same class as the template are labeled 1, and samples of a different class from the template are labeled -1.
The convolutional neural network in the third step is trained through the following loss function and optimization process:
min_{w,b} ∑_i [ C1·I(y_i = 1) + C2·I(y_i = -1) ] · max(0, 1 - y_i·L_i)
s.t. L_i = w·z_i + b;
s.t. z_i = x_i - T;
In the above formula, C1 is the cost weight for misclassifying the positive class as the negative class, and C2 is the cost weight for misclassifying the negative class as the positive class; Q_i is the class indicator of the i-th sample: if the model classifies the i-th sample as positive, Q_i = 1, otherwise Q_i = -1; L_i is the predicted class score of the i-th sample; w is the weight of the trained network and b is its bias; z_i is the input sample pair, formed from sample x_i and the generated positive template T, with label y_i = 1 if the two samples are of the same class and y_i = -1 otherwise.
In the optimization of the objective function, the cost weights C1 and C2 and the model parameters w and b are obtained by the following algorithm:
1) fix the two variables C1 and C2 and give them initial values such that C1 ≥ C2;
2) substitute C1 and C2 into the loss function of the third step and solve for the model parameters w and b by stochastic gradient descent;
3) substitute the computed w and b back into the loss function, set its derivatives with respect to C1 and C2 to 0, and solve for the values of C1 and C2;
4) repeating the step 2) and the step 3) until a convergence condition is reached.
In the fourth step, after the sample pair is input into the trained convolutional neural network, if the classification output is 1, the sample belongs to the positive class and passes verification; if the classification output is -1, the sample belongs to the negative class and does not pass verification.
Compared with the prior art, the face verification method based on the convolutional neural network has the following beneficial effects:
according to the face verification method based on the convolutional neural network, the input of a single sample is changed into the input of a sample pair, so that the similarity information between the sample and the positive template can be better learned.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
FIG. 1 is an exemplary diagram of an implementation of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific embodiments in order to make the technical field better understand the scheme of the present invention. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in FIG. 1, the face verification method based on a convolutional neural network treats, for a specific scene, a small number of samples of concern as the positive class. First, a positive-template learning method is proposed: the positive template is generated by integrating the importance information of user identities. Then, a face verification model based on a cost-sensitive pairwise convolutional neural network is proposed: a new cost-sensitive loss function is designed and optimized, so that the model can learn the importance information of user identities more effectively and produce the verification result with the minimum loss cost.
The realization process is as follows,
firstly, collecting training images, selecting a positive sample, and generating a positive template;
secondly, constructing a training set: comparing the collected training images with the positive template and dividing them into samples of the same class as the template and samples of a different class from the template;
thirdly, designing a loss function and training the convolutional neural network model on the training set by optimizing this function, so that the model accepts input in sample-pair form;
and fourthly, constructing a sample pair from the sample to be verified and the generated positive template, inputting it into the convolutional neural network, and determining the sample as positive or negative according to the verification result, where the positive class passes verification and the negative class does not.
In the first step, the positive samples are the samples of greater concern in the application scenario, for example, the few employees with high authority in the financial institutions mentioned in the background of the invention, or the wanted criminals in a pursuit identification system.
Each positive sample is then weighted according to its importance. For example, for the financial institution mentioned above, the bank president may be weighted 0.6 and the vice president 0.3, and so on: the more important the identity of the user, the higher the weight.
The positive template T is obtained from the existing positive samples and their weights using the following formula:
T = ∑_{i=1}^{N} R_i · U_i
where N is the number of existing positive samples, R_i is the weight of the i-th user, measuring the importance of that user's identity, and U_i is the data of the i-th user.
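The template generation above can be sketched as follows; this is a minimal sketch assuming the template is a plain weighted sum of the positive users' feature data (the function name, and any normalization, are this sketch's own choices, not specified by the patent):

```python
import numpy as np

def make_positive_template(samples, weights):
    """Generate the positive template T as a weighted combination of the
    positive users' data U_i with importance weights R_i."""
    samples = np.asarray(samples, dtype=float)   # shape (N, D): U_1..U_N
    weights = np.asarray(weights, dtype=float)   # shape (N,):  R_1..R_N
    # T = sum_i R_i * U_i  (plain weighted sum; normalization is an assumption)
    return (weights[:, None] * samples).sum(axis=0)
```

With the example weights from the description (president 0.6, vice president 0.3), the template is pulled toward the features of the more important users.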
In the training-set construction of the second step, after the collected training images are compared with the positive template, samples of the same class as the template are labeled 1, and samples of a different class from the template are labeled -1.
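The labeling rule above, together with the pair construction used later in the constraint z_i = x_i - T, can be sketched as follows (representing a pair by the difference x_i - T is an assumption taken from that constraint):

```python
import numpy as np

def make_training_pairs(images, user_ids, template, positive_ids):
    """Pair each training image x_i with the positive template T and label it.

    Returns z_i = x_i - T for every image, plus labels: 1 for images
    belonging to template (positive) users, -1 otherwise."""
    z = np.asarray(images, dtype=float) - np.asarray(template, dtype=float)
    labels = np.where(np.isin(user_ids, positive_ids), 1, -1)
    return z, labels
```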
The convolutional neural network in the third step is trained through the following loss function and optimization process:
min_{w,b} ∑_i [ C1·I(y_i = 1) + C2·I(y_i = -1) ] · max(0, 1 - y_i·L_i)
s.t. L_i = w·z_i + b;
s.t. z_i = x_i - T;
In the above formula, C1 is the cost weight for misclassifying the positive class as the negative class, and C2 is the cost weight for misclassifying the negative class as the positive class; Q_i is the class indicator of the i-th sample: if the model classifies the i-th sample as positive, Q_i = 1, otherwise Q_i = -1; L_i is the predicted class score of the i-th sample; w is the weight of the trained network and b is its bias; z_i is the input sample pair, formed from sample x_i and the generated positive template T, with label y_i = 1 if the two samples are of the same class and y_i = -1 otherwise.
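A minimal sketch of such a cost-sensitive loss, assuming a hinge form on the scores L_i = w·z_i + b with per-class costs C1 and C2 (the hinge form is this sketch's assumption; the patent's exact loss may differ):

```python
import numpy as np

def cost_sensitive_loss(w, b, Z, y, C1, C2):
    """Cost-weighted hinge loss over sample pairs z_i with labels y_i.

    C1 penalizes a positive pair pushed toward the negative side, and
    C2 penalizes the reverse error."""
    Z, y = np.asarray(Z, dtype=float), np.asarray(y)
    scores = Z @ np.asarray(w, dtype=float) + b   # L_i = w . z_i + b
    margins = np.maximum(0.0, 1.0 - y * scores)   # active when misranked
    costs = np.where(y == 1, C1, C2)              # per-class cost weight
    return float(np.sum(costs * margins))
```

With C1 ≥ C2, an error on a positive (important-user) pair contributes more to the loss than the same error on a negative pair, which is the intended cost sensitivity.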
In the optimization of the objective function, the cost weights C1 and C2 and the model parameters w and b are obtained by the following algorithm:
1) fix the two variables C1 and C2 and give them initial values such that C1 ≥ C2;
2) substitute C1 and C2 into the loss function of the third step and solve for the model parameters w and b by stochastic gradient descent;
3) substitute the computed w and b back into the loss function, set its derivatives with respect to C1 and C2 to 0, and solve for the values of C1 and C2;
4) repeating the step 2) and the step 3) until a convergence condition is reached.
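Step 2) of this procedure can be sketched as follows, with C1 and C2 held fixed; the closed-form C1/C2 update of step 3) is not recoverable from the text and is omitted here (an assumption of this sketch, which also assumes the hinge-form loss):

```python
import numpy as np

def fit_wb(Z, y, C1=2.0, C2=1.0, lr=0.01, steps=1000, seed=0):
    """Solve for w and b by stochastic (sub)gradient descent on a
    cost-weighted hinge loss, with C1 >= C2 held fixed."""
    Z, y = np.asarray(Z, dtype=float), np.asarray(y)
    rng = np.random.default_rng(seed)
    w, b = np.zeros(Z.shape[1]), 0.0
    for _ in range(steps):
        i = int(rng.integers(len(y)))             # pick one sample pair
        if 1.0 - y[i] * (Z[i] @ w + b) > 0.0:     # hinge term is active
            c = C1 if y[i] == 1 else C2           # per-class cost weight
            w = w + lr * c * y[i] * Z[i]          # subgradient step on w
            b = b + lr * c * y[i]                 # subgradient step on b
    return w, b
```

Because updates on positive pairs are scaled by the larger cost C1, the decision boundary is pushed harder to keep important users on the correct side.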
In the fourth step, after the sample pair is input into the trained convolutional neural network, if the classification output is 1, the sample belongs to the positive class and passes verification; if the classification output is -1, the sample belongs to the negative class and does not pass verification.
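The verification step can then be sketched as follows, assuming the linear score from the loss above stands in for the full convolutional network:

```python
import numpy as np

def verify(x, template, w, b):
    """Pair the probe x with the positive template, score the pair with the
    trained parameters, and pass verification only for a positive output."""
    z = np.asarray(x, dtype=float) - np.asarray(template, dtype=float)
    output = 1 if float(z @ np.asarray(w, dtype=float) + b) > 0.0 else -1
    return output == 1   # True: verification passes; False: rejected
```

Note that only one comparison is made, against the template, rather than one comparison per enrolled positive sample.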
In the cost-sensitive pairwise convolutional neural network model provided by the invention, a new cost weight is introduced; the learned cost weight reflects the importance of positive users' identities, so the model can produce the verification result with the minimum loss cost during identity verification. A traditional convolutional neural network takes a single sample as input and, during verification, compares it one by one with the positive samples in the database, which prolongs verification. The proposed model changes single-sample input into sample-pair input, learns the similarity between the sample and the positive template, and directly outputs their comparison result, with no need for one-by-one comparison against other positive samples, effectively reducing verification time.
The present invention can be easily implemented by those skilled in the art from the above detailed description. It should be understood, however, that the intention is not to limit the invention to the particular embodiments described. On the basis of the disclosed embodiments, a person skilled in the art can combine different technical features at will, thereby implementing different technical solutions.
In addition to the technical features described in the specification, the technology is known to those skilled in the art.

Claims (4)

1. A face verification method based on a convolutional neural network is characterized in that the realization process is as follows,
firstly, collecting training images, selecting a positive sample, and generating a positive template;
step two, constructing a training set: comparing the collected training images with the positive template and dividing them into samples of the same class as the template and samples of a different class from the template;
step three, designing a loss function and training the convolutional neural network model on the training set by optimizing this function, so that the model accepts sample-pair input, where sample-pair input refers to the joint input of the sample to be verified and the generated positive template;
step four, constructing a sample pair from the sample to be verified and the generated positive template, inputting it into the convolutional neural network, and determining the sample as positive or negative according to the verification result, where the positive class passes verification and the negative class does not;
in the first step, the positive samples refer to attention samples in an application scene specified by people, when the positive template is generated, firstly, weighting setting is carried out on each positive sample according to the importance of the positive sample, and then, the positive template T is obtained by using the following formula through the existing positive template and the weight thereof:
Figure FDA0003100168960000011
in the formula, N is the number of the existing positive samples, RiIs the weight of the ith user, and measures the importance of the identity of the ith user, UiData of the ith user;
the convolutional neural network in the third step is trained through the following loss function and optimization process:
min_{w,b} ∑_i [ C1·I(y_i = 1) + C2·I(y_i = -1) ] · max(0, 1 - y_i·L_i)
s.t. L_i = w·z_i + b;
s.t. z_i = x_i - T;
In the above formula, C1 is the cost weight for misclassifying the positive class as the negative class, and C2 is the cost weight for misclassifying the negative class as the positive class; Q_i is the class indicator of the i-th sample: if the model classifies the i-th sample as positive, Q_i = 1, otherwise Q_i = -1; L_i is the predicted class score of the i-th sample; w is the weight of the trained network and b is its bias; z_i is the input sample pair, formed from sample x_i and the generated positive template T, with label y_i = 1 if the two samples are of the same class and y_i = -1 otherwise.
2. The face verification method based on the convolutional neural network as claimed in claim 1, wherein in the training set construction process in the second step, after the collected training image is compared with the positive template, the sample of the same kind as the template is marked as 1; samples that are not in the same class as the template are labeled-1.
3. The face verification method based on the convolutional neural network as claimed in claim 1, wherein in the optimization of the objective function, the cost weights C1 and C2 and the model parameters w and b are obtained by the following algorithm:
1) fix the two variables C1 and C2 and give them initial values such that C1 ≥ C2;
2) substitute C1 and C2 into the loss function of the third step and solve for the model parameters w and b by stochastic gradient descent;
3) substitute the computed w and b back into the loss function, set its derivatives with respect to C1 and C2 to 0, and solve for the values of C1 and C2;
4) repeating the step 2) and the step 3) until a convergence condition is reached.
4. The face verification method based on the convolutional neural network as claimed in claim 1, wherein in the fourth step, after the sample pair is input into the trained convolutional neural network, if the classification output is 1, the sample belongs to the positive class and passes verification; if the classification output is -1, the sample belongs to the negative class and does not pass verification.
CN201810366071.1A 2018-04-23 2018-04-23 Face verification method based on convolutional neural network Active CN108537206B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810366071.1A CN108537206B (en) 2018-04-23 2018-04-23 Face verification method based on convolutional neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810366071.1A CN108537206B (en) 2018-04-23 2018-04-23 Face verification method based on convolutional neural network

Publications (2)

Publication Number Publication Date
CN108537206A CN108537206A (en) 2018-09-14
CN108537206B true CN108537206B (en) 2021-08-10

Family

ID=63479159

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810366071.1A Active CN108537206B (en) 2018-04-23 2018-04-23 Face verification method based on convolutional neural network

Country Status (1)

Country Link
CN (1) CN108537206B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109558827A (en) * 2018-11-26 2019-04-02 济南浪潮高新科技投资发展有限公司 A kind of finger vein identification method and system based on personalized convolutional neural networks
CN110610082A (en) * 2019-09-04 2019-12-24 笵成科技南京有限公司 DNN-based system and method for passport to resist fuzzy attack

Citations (2)

Publication number Priority date Publication date Assignee Title
CN102467564A (en) * 2010-11-12 2012-05-23 中国科学院烟台海岸带研究所 Remote sensing image retrieval method based on improved support vector machine relevance feedback
CN105005774A (en) * 2015-07-28 2015-10-28 中国科学院自动化研究所 Face relative relation recognition method based on convolutional neural network and device thereof

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
WO2016065534A1 (en) * 2014-10-28 2016-05-06 中国科学院自动化研究所 Deep learning-based gait recognition method

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN102467564A (en) * 2010-11-12 2012-05-23 中国科学院烟台海岸带研究所 Remote sensing image retrieval method based on improved support vector machine relevance feedback
CN105005774A (en) * 2015-07-28 2015-10-28 中国科学院自动化研究所 Face relative relation recognition method based on convolutional neural network and device thereof

Non-Patent Citations (3)

Title
Accelerate Convolutional Neural Networks For Binary Classification Via Cascading Cost-sensitive Feature; Junbiao Pang et al.; 2016 IEEE International Conference on Image Processing (ICIP); 20160819; full text *
Facial Expression Recognition Using Weighted Mixture Deep Neural Network Based on Double-Channel Facial Images; Biao Yang et al.; IEEE Access; 20171215; full text *
Imbalanced image classification method based on convolutional neural network and cost sensitivity; Tan Jiefan et al.; Journal of Computer Applications; 20180411; full text *

Also Published As

Publication number Publication date
CN108537206A (en) 2018-09-14

Similar Documents

Publication Publication Date Title
US11417147B2 (en) Angle interference resistant and occlusion interference resistant fast face recognition method
CN103824055B (en) A kind of face identification method based on cascade neural network
CN109447099B (en) PCA (principal component analysis) dimension reduction-based multi-classifier fusion method
CN106022317A (en) Face identification method and apparatus
CN102902980B (en) A kind of biometric image analysis based on linear programming model and recognition methods
US20140133743A1 (en) Method, Apparatus and Computer Readable Recording Medium for Detecting a Location of a Face Feature Point Using an Adaboost Learning Algorithm
CN111104852B (en) Face recognition technology based on heuristic Gaussian cloud transformation
CN110020868B (en) Anti-fraud module decision fusion method based on online transaction characteristics
EP3779775A1 (en) Media processing method and related apparatus
CN108537206B (en) Face verification method based on convolutional neural network
CN108446687A (en) A kind of adaptive face vision authentication method based on mobile terminal and backstage interconnection
Amaro et al. Evaluation of machine learning techniques for face detection and recognition
CN114612968A (en) Convolutional neural network-based lip print identification method
Jha et al. Automation of cheque transaction using deep learning and optical character recognition
CN112395901A (en) Improved face detection, positioning and recognition method in complex environment
JP2020170496A (en) Age privacy protection method and system for face recognition
Wang et al. Face detection based on template matching and neural network
Abilash et al. Currency recognition for the visually impaired people
Wang et al. Integration of heterogeneous classifiers based on choquet fuzzy integral
Khalid et al. Fusion of multi-classifiers for online signature verification using fuzzy logic inference
Sharma et al. Face recognition using haar cascade and local binary pattern histogram in opencv
Kangwanwatana et al. Improve face verification rate using image pre-processing and facenet
Charishma et al. Smart Attendance System with and Without Mask using Face Recognition
Ho et al. Real Time Face Recognition using Raspberry Pi for Class Attendance System
Diaz et al. Explainable offline automatic signature verifier to support forensic handwriting examiners

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210721

Address after: Building S02, 1036 Gaoxin Langchao Road, Jinan, Shandong 250100

Applicant after: Shandong Inspur Scientific Research Institute Co.,Ltd.

Address before: 250100 First Floor of R&D Building 2877 Kehang Road, Sun Village Town, Jinan High-tech Zone, Shandong Province

Applicant before: JINAN INSPUR HI-TECH INVESTMENT AND DEVELOPMENT Co.,Ltd.

GR01 Patent grant