CN112668362B - Human evidence comparison model training method for dynamic optimization class proxy - Google Patents
Human evidence comparison model training method for dynamic optimization class proxy
- Publication number
- CN112668362B (application CN201910979497.9A)
- Authority
- CN
- China
- Prior art keywords
- class
- data set
- human
- training
- proxy
- Prior art date
- Legal status
- Active
Classifications
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention provides a human evidence comparison model training method with a dynamically optimized class proxy. By dynamically optimizing the class proxies, the method solves the problem that the human evidence comparison model cannot converge effectively, and at the same time improves the training accuracy of the model.
Description
[ field of technology ]
The invention relates to the technical field of biometric recognition, and in particular to a human evidence comparison model training method with a dynamically optimized class proxy.
[ background Art ]
In recent years, with the rapid development of face recognition, person-to-certificate comparison technology based on face recognition has come to play an important role wherever the consistency between a person and their identity document must be verified. Put simply, human evidence comparison captures the face image of the certificate holder and verifies it against the portrait contained in the certificate (identity card, second-generation ID card, passport, driving license, Hong Kong and Macau travel permit, and the like), so as to judge whether the holder is the person shown on the certificate.
The current mainstream approach to training a human evidence comparison model is deep learning. Softmax is an important loss function in deep learning, and in a softmax without a bias term the weight vector corresponding to a class is called the class proxy of that class. The loss used in algorithms for training human evidence comparison models is a classification loss with margins; each class proxy receives attractive signals from samples of its own class and repulsive signals from the other classes, and the model is optimized accordingly.
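The role of the class proxy described above can be illustrated with a minimal sketch (not from the patent; helper names are illustrative): in a bias-free softmax with L2-normalized features and weight vectors, each logit reduces to the cosine similarity between the sample feature and that class's proxy.

```python
import math

def normalize(v):
    """Scale a vector to unit L2 length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine_logits(feature, class_proxies):
    """Bias-free softmax logits: with unit-length features and weight
    vectors, the logit for each class is the cosine similarity between
    the sample feature and that class's proxy (its weight vector)."""
    f = normalize(feature)
    return [sum(a * b for a, b in zip(f, normalize(w)))
            for w in class_proxies]

# toy example: two classes in a 2-D feature space
proxies = [[1.0, 0.0], [0.0, 1.0]]
logits = cosine_logits([2.0, 0.0], proxies)  # sample aligned with class 0
```

A margin-based classification loss then pushes the logit of the true class up (attraction) and the logits of other classes down (repulsion).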
[ invention ]
Aiming at the problem in the prior art that the class proxy cannot effectively converge to the class center, the invention provides a human evidence comparison model training method that dynamically optimizes the class proxies. Training on a fixed human evidence data set typically reuses the data set over many rounds, and the class proxies are updated at the end of each round; within each round, the data set is consumed in batches, and the network parameters before the class proxies are updated at the end of each batch.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a person and evidence comparison model training method of a dynamic optimization class agent comprises the following steps:
step 1: performing convolutional neural network training on the portrait data set to obtain a trained portrait model;
step 2: and (3) training the portrait model trained in the step (1) on a portrait data set by adopting a dynamic optimization proxy method to finally obtain a portrait comparison model.
In step 2, training with the dynamic class-proxy optimization method comprises the following steps:
s0: dividing the human evidence data set into a plurality of classes;
s1: initializing all network parameters before the class proxy of the human evidence comparison model with the portrait model trained in step 1;
s2: initializing class agents, comprising the steps of:
a. extracting the features of all samples of the human evidence data set with the portrait model trained in step 1 and normalizing the features;
b. calculating the average value of the sample characteristics of each class according to the data obtained in the step a;
c. taking the calculation result of step b as the class proxy initialization value of each class;
s3: reading a batch of samples, the batch comprising a plurality of samples;
s4: on the batch in the step S3, forward propagation is carried out on the convolutional neural network to obtain the characteristics and the loss values of all samples of the batch;
s5: carrying out normalization processing on the characteristics obtained in the step S4;
s6: accumulating the result obtained in the step S5 into a corresponding class proxy cache;
s7: back propagation is carried out on the convolutional neural network to obtain gradients and update all network parameters before the class proxy;
s8: judging whether the human evidence data set has been fully traversed; if so, executing the next step, otherwise jumping back to step S3;
s9: calculating the mean sample feature of each class in the human evidence data set from the class proxy cache obtained in step S6;
s10: updating the corresponding class proxy by using the calculation result in the step S9 and zeroing the class proxy cache;
s11: judging whether the termination condition of the human evidence comparison model training is met, ending the training if the termination condition is met, otherwise, jumping to the step S3 to continue the training.
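The loop S3–S11 can be sketched as follows. This is a minimal pure-Python illustration, not the patent's implementation: the CNN forward pass is replaced by a user-supplied `extract_feature` stub, and backpropagation (S7) is only indicated by a comment.

```python
import math

def l2norm(v):
    """Scale a vector to unit L2 length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def train_proxies(batches, extract_feature, num_classes, dim, epochs=1):
    """Skeleton of steps S3-S11: for each batch, normalized sample
    features (S4-S5) are accumulated into a per-class cache (S6);
    once the whole data set has been traversed (S8), every class proxy
    is reset to the mean cached feature of its class and the cache is
    zeroed (S9-S10). The backward pass (S7) is stubbed out."""
    cache = [[0.0] * dim for _ in range(num_classes)]
    counts = [0] * num_classes
    proxies = [[0.0] * dim for _ in range(num_classes)]
    for _ in range(epochs):                        # S11: termination condition
        for batch in batches:                      # S3: read a batch
            for label, sample in batch:
                feat = l2norm(extract_feature(sample))  # S4-S5
                cache[label] = [c + f for c, f in zip(cache[label], feat)]
                counts[label] += 1                 # S6: accumulate into cache
            # S7: backpropagation would update the CNN parameters here
        for i in range(num_classes):               # S9: per-class feature mean
            if counts[i]:
                proxies[i] = [c / counts[i] for c in cache[i]]
            cache[i] = [0.0] * dim                 # S10: zero the cache
        counts = [0] * num_classes
    return proxies

# toy usage: identity "model", two classes in a 2-D feature space
batches = [[(0, [2.0, 0.0]), (1, [0.0, 3.0])], [(0, [4.0, 0.0])]]
proxies = train_proxies(batches, lambda s: s, num_classes=2, dim=2)
```

The key design point is that proxies move only once per full pass over the data set, toward the mean of the (current) normalized features of their class, rather than drifting batch by batch.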
The beneficial effect of the invention is that the class proxies are updated once after each full pass over the human evidence data set, and during the update each class proxy is simultaneously subject to attractive signals from its own class and repulsive signals from the other classes, so that the human evidence comparison model converges effectively.
[ detailed description ] of the invention
Example 1:
This embodiment provides a human evidence comparison model training method with a dynamically optimized class proxy, comprising the following steps:
(1) Performing convolutional neural network training on the portrait data set to obtain a trained portrait model;
(2) Preprocessing the human evidence data set, the preprocessed data set having N samples divided into M classes; preprocessing detects the key points of each face with a face detection algorithm and then aligns them to standard key points via a similarity transformation;
(3) Because the human evidence comparison model and the portrait model share the same network structure, the portrait model trained in step (1) is used to initialize all network parameters before the class proxy of the human evidence comparison model;
(4) Initializing class agents, comprising the steps of:
A. Extracting the features of all samples of the human evidence data set with the portrait model trained in step (1):
$x_1, x_2, \ldots, x_i, \ldots, x_N$
B. Normalizing all the features of step A:
$\hat{x}_i = x_i / \lVert x_i \rVert_2, \quad i = 1, \ldots, N$
C. Calculating the mean sample feature of each class from the data of step B and the number of samples each class contains:
$\bar{x}_i = \frac{1}{N_i} \sum_{j=1}^{N_i} \hat{x}_j^{(i)}$
where $\bar{x}_i$ is the mean of the sample features contained in the $i$-th class, $N_i$ is the total number of samples of class $i$, and $\hat{x}_j^{(i)}$ is the normalized feature of the $j$-th sample of the $i$-th class.
D. Taking the result of step C, i.e. the mean of the sample features contained in the class, as the initialization value of each class proxy:
$w_i = \bar{x}_i, \quad i = 1, \ldots, M$
(5) Reading a batch of samples, wherein the number of samples in the batch is B;
(6) On the batch in the step (5), forward propagation is carried out on the convolutional neural network to obtain the characteristics and loss values of all samples of the batch;
The features of all samples of the batch are:
$f_j^{(i)}, \quad j = 1, \ldots, b_i$
where $f_j^{(i)}$ is the feature of the $j$-th sample of the $i$-th class in the batch, $b_i$ is the total number of samples of class $i$ in the batch, and $\sum_i b_i = B$.
(7) Normalizing the features obtained in step (6); the normalized features are:
$\hat{f}_j^{(i)} = f_j^{(i)} / \lVert f_j^{(i)} \rVert_2$
(8) Updating the corresponding class proxy cache $c_i$ with the data of step (7), in the following way:
$c_i \leftarrow c_i + \sum_{j=1}^{b_i} \hat{f}_j^{(i)}$
(9) Back propagation is carried out on the convolutional neural network to obtain gradients and update all network parameters before the class proxy;
(10) Judging whether the person certificate data set is traversed or not, executing the next step if the person certificate data set is traversed, and jumping to the step (5) if the person certificate data set is not traversed;
(11) According to the class proxy cache obtained in step (8) and the number of samples each class contains in the human evidence data set, calculating the mean sample feature of each class:
$\bar{f}_i = c_i / N_i$
(12) Updating the corresponding class proxy with the result of step (11):
$w_i \leftarrow \bar{f}_i$
(13) Zeroing the class proxy cache;
(14) Judging whether the termination condition of the human evidence comparison model training is met, ending the training if the termination condition is met, otherwise, jumping to the step (5) to continue the training.
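The class-proxy initialization of step (4) above can be sketched as follows; this is a simplified pure-Python illustration (`init_class_proxies` and its signature are not from the patent), with precomputed feature vectors standing in for the portrait model's output.

```python
import math

def init_class_proxies(features, labels, num_classes):
    """Steps (4)A-D: L2-normalize every sample feature extracted by the
    pretrained portrait model, then initialize each class proxy as the
    mean normalized feature of its class."""
    dim = len(features[0])
    sums = [[0.0] * dim for _ in range(num_classes)]
    counts = [0] * num_classes
    for feat, label in zip(features, labels):
        n = math.sqrt(sum(x * x for x in feat))        # B: normalization
        sums[label] = [s + x / n for s, x in zip(sums[label], feat)]
        counts[label] += 1
    # C-D: the per-class mean of normalized features is the proxy's
    # initialization value
    return [[s / counts[i] for s in sums[i]] for i in range(num_classes)]

# toy usage: three samples, two classes, 2-D features
proxies = init_class_proxies(
    [[2.0, 0.0], [4.0, 0.0], [0.0, 3.0]], [0, 0, 1], num_classes=2)
```

Starting each proxy at its class's feature mean means the first round of margin-based training begins from a point already near the class center, which is what the method credits for the improved convergence.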
Through dynamic optimization of the class proxies, the model is optimized and its training accuracy improved: on an internal test set, at a false recognition rate of 0.01%, the recall rate is 2.1% higher than with the conventional class-proxy training method.
Claims (2)
1. A human evidence comparison model training method with dynamically optimized class proxies, characterized by comprising the following steps:
step 1: performing convolutional neural network training on the portrait data set to obtain a trained portrait model;
step 2: training the portrait model trained in the step 1 by adopting a dynamic optimization proxy method on a portrait data set to finally obtain a portrait comparison model;
the training method adopting the dynamic optimization class agent comprises the following steps:
s0: dividing the person's evidence data set into a plurality of classes;
s1: initializing all network parameters before the class proxy of the human evidence comparison model with the portrait model trained in step 1;
s2: initializing a class proxy;
s3: reading a batch of samples, the batch comprising a plurality of samples;
s4: on the batch in the step S3, forward propagation is carried out on the convolutional neural network to obtain the characteristics and the loss values of all samples of the batch;
s5: carrying out normalization processing on the characteristics obtained in the step S4;
s6: accumulating the result obtained in the step S5 into a corresponding class proxy cache;
s7: back propagation is carried out on the convolutional neural network to obtain gradients and update all network parameters before the class proxy;
s8: judging whether the human evidence data set has been fully traversed; if so, executing the next step, otherwise jumping back to step S3;
s9: calculating the mean sample feature of each class in the human evidence data set from the class proxy cache obtained in step S6;
s10: updating the corresponding class proxy by using the calculation result in the step S9 and zeroing the class proxy cache;
s11: judging whether the termination condition of the human evidence comparison model training is met; if so, ending the training, otherwise jumping to step S3 to continue the training.
2. The human evidence comparison model training method with a dynamically optimized class proxy according to claim 1, wherein initializing the class proxy comprises the following steps:
a. extracting the features of all samples of the human evidence data set with the portrait model trained in step 1 and normalizing the features;
b. calculating the average value of the sample characteristics of each class according to the data obtained in the step a;
c. the class proxy initialization value of each class is represented by the calculation result of step b.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910979497.9A CN112668362B (en) | 2019-10-15 | 2019-10-15 | Human evidence comparison model training method for dynamic optimization class proxy |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112668362A CN112668362A (en) | 2021-04-16 |
CN112668362B true CN112668362B (en) | 2023-06-16 |
Family
ID=75399979
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112668362B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116386108B (en) * | 2023-03-27 | 2023-09-19 | 南京理工大学 | Fairness face recognition method based on instance consistency |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106599797A (en) * | 2016-11-24 | 2017-04-26 | 北京航空航天大学 | Infrared face identification method based on local parallel nerve network |
CN106845357A (en) * | 2016-12-26 | 2017-06-13 | 银江股份有限公司 | A kind of video human face detection and recognition methods based on multichannel network |
CN108564029A (en) * | 2018-04-12 | 2018-09-21 | 厦门大学 | Face character recognition methods based on cascade multi-task learning deep neural network |
EP3543917A1 (en) * | 2018-03-19 | 2019-09-25 | SRI International Inc. | Dynamic adaptation of deep neural networks |
Non-Patent Citations (1)
Title |
---|
DocFace+: ID Document to Selfie Matching; Yichun Shi, Anil K. Jain; Computer Vision and Pattern Recognition; 2018-09-18; pp. 1-11. *
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||