CN110705402A - Face recognition confidence value mapping algorithm - Google Patents

Face recognition confidence value mapping algorithm

Info

Publication number
CN110705402A
CN110705402A (application number CN201910887638.4A)
Authority
CN
China
Prior art keywords
score
recognition rate
face recognition
recognition
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910887638.4A
Other languages
Chinese (zh)
Inventor
王汝杰
王志保
陈澎祥
李森
慈红斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Tiandi Weiye Robot Technology Co Ltd
Original Assignee
Tianjin Tiandi Weiye Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Tiandi Weiye Robot Technology Co Ltd filed Critical Tianjin Tiandi Weiye Robot Technology Co Ltd
Priority to CN201910887638.4A priority Critical patent/CN110705402A/en
Publication of CN110705402A publication Critical patent/CN110705402A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a face recognition confidence value mapping algorithm, which comprises the following steps: evaluating a test sample with the face recognition model to obtain recognition rate and false recognition rate data; performing data fitting on the recognition rate and the false recognition rate; selecting a segmented data mapping; and mapping the output value according to the fitted formulas. The invention has the beneficial effects that, after a product switches to a different algorithm, no external parameters need to be modified; the product's recognition output only needs to be mapped with this algorithm to reach the expected effect of the algorithm and improve the accuracy of face recognition.

Description

Face recognition confidence value mapping algorithm
Technical Field
The invention belongs to the field of face recognition, and particularly relates to a face recognition confidence value mapping algorithm.
Background
With the development of deep learning technology, the recognition rate of face recognition algorithms now exceeds human performance in many scenarios, and such algorithms are widely used in products. Because algorithm vendors differ and models iterate, the default confidence values output by the recognition algorithm keep changing, so the upper-layer system must be continually re-tuned to suit each algorithm and achieve the best effect. This makes adaptation cumbersome and demands considerable expertise.
Disclosure of Invention
In view of the above, the present invention is directed to a face recognition confidence value mapping algorithm to solve the above-mentioned problems.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
the face recognition confidence value mapping algorithm comprises the following steps:
A. evaluating a test sample with the face recognition model to obtain recognition rate and false recognition rate data;
B. performing data fitting on the recognition rate and false recognition rate obtained in step A;
C. selecting a segmented data mapping;
D. mapping the output value according to the fitted formulas.
Further, in step A, the face recognition model is used to evaluate the recognition rate and false recognition rate of the test sample, so as to obtain the recognition rate and false recognition rate corresponding to each score in the test sample, where the score represents the probability that a sample is recognized.
Further, in step B, the data obtained in step A are fitted separately, with the score as the abscissa and the recognition rate and false recognition rate as the ordinate, to obtain data curves.
Further, the process of step C is as follows:
C1. taking the minimum score point in step B at which the false recognition rate is 0 as the reference segmentation point, and recording the corresponding score as a;
C2. mapping scores in the interval a-b to the interval b-1.0, and regarding scores greater than b before mapping as 1.0, where b is the minimum score at which the recognition accuracy levels off;
C3. performing polynomial fitting on the false recognition rate data corresponding to the interval a-b;
C4. equally dividing the interval b-1.0 and performing polynomial fitting on the false recognition rate over it.
Further, in step D, when the output score of the face recognition model falls within the interval a-b, the output score is substituted into the polynomial obtained in step C3; the resulting false recognition rate is then set equal to the polynomial fitted in step C4, yielding a set of solutions x for the score parameter. Solutions greater than 1.0 are discarded, and the remaining solution, which is smaller than 1.0, is taken according to MAX(x, b).
Compared with the prior art, the face recognition confidence value mapping algorithm has the following advantages:
With the face recognition confidence value mapping algorithm, no external parameters need to be modified after a product switches to a different algorithm; the product's recognition output only needs to be mapped with this algorithm, so that the algorithm's output is mapped into the score interval with a high recognition rate, the accuracy of the face recognition model is improved, and the best effect of the model is achieved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a flowchart illustrating the steps of a face recognition confidence value mapping algorithm according to an embodiment of the present invention;
FIG. 2 is a recognition rate polynomial fit curve in an embodiment of the present invention;
FIG. 3 is a false recognition rate polynomial fitting curve according to an embodiment of the present invention;
FIG. 4 is a 0.6-0.8 segmented data fitting curve according to an embodiment of the present invention;
FIG. 5 is a curve fitted to 0.8-1.0 segmented data in an embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
As shown in fig. 1, the face recognition confidence value mapping algorithm includes the following steps:
A. evaluating a test sample with the face recognition model to obtain recognition rate and false recognition rate data;
B. performing data fitting on the recognition rate and false recognition rate obtained in step A;
C. selecting a segmented data mapping;
D. mapping the output value according to the fitted formulas.
In step A, the face recognition model is used to evaluate the recognition rate and false recognition rate of the test sample, so as to obtain the recognition rate and false recognition rate corresponding to each score in the test sample, where the score represents the probability that a sample is recognized.
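As a concrete illustration of step A, a minimal sketch follows; it is not taken from the patent, and the arrays genuine_scores and impostor_scores as well as the 0.01 step are assumptions. It computes the recognition rate and false recognition rate at each score threshold from genuine and impostor comparison scores:

    # Sketch only: assumes genuine_scores / impostor_scores are NumPy arrays of
    # comparison scores in [0, 1] produced by the face recognition model.
    import numpy as np

    def rates_per_score(genuine_scores, impostor_scores, step=0.01):
        # For every threshold, the recognition rate is the fraction of genuine
        # pairs accepted and the false recognition rate is the fraction of
        # impostor pairs accepted.
        thresholds = np.arange(0.0, 1.0 + step, step)
        rows = [(t,
                 float(np.mean(genuine_scores >= t)),
                 float(np.mean(impostor_scores >= t)))
                for t in thresholds]
        return np.array(rows)  # columns: score, recognition rate, false recognition rate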
In step B, the data obtained in step A are fitted separately, with the score as the abscissa and the recognition rate and false recognition rate as the ordinate, to obtain data curves.
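Continuing the sketch above under the same assumptions, step B can be expressed with NumPy polynomial fitting; the degree-6 fit mirrors the recognition rate polynomial given later in this embodiment, but the degree itself is a design choice:

    # rows is the array returned by rates_per_score above.
    scores = rows[:, 0]
    recognition_fit = np.poly1d(np.polyfit(scores, rows[:, 1], 6))  # recognition rate curve y(score)
    false_fit = np.poly1d(np.polyfit(scores, rows[:, 2], 6))        # false recognition rate curve y(score)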
The process of step C is as follows (see the sketch after this list):
C1. taking the minimum score point in step B at which the false recognition rate is 0 as the reference segmentation point, and recording the corresponding score as a;
C2. mapping scores in the interval a-b to the interval b-1.0, and regarding scores greater than b before mapping as 1.0, where b is the minimum score at which the recognition accuracy levels off; because the recognition rates are essentially the same once the score exceeds 0.8, mapping those scores is meaningless. In general, the higher the score, the higher the recognition accuracy but the lower the recognition rate; the lower the score, the higher the recognition rate but the lower the recognition accuracy;
C3. performing polynomial fitting on the false recognition rate data corresponding to the interval a-b;
C4. equally dividing the interval b-1.0 and performing polynomial fitting on the false recognition rate over it.
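A sketch of step C under the same assumptions follows. Reading C4 as re-indexing the same false recognition rate data over equal steps of the target interval b-1.0 is an interpretation of the text, and the quadratic degree is chosen to match the segment fits of this embodiment:

    # a: smallest score whose false recognition rate is 0 (0.6 in this embodiment);
    # b: smallest score at which the recognition rate curve flattens (0.8 here).
    # Assumes at least one threshold with a zero false recognition rate exists.
    a = float(scores[np.argmax(rows[:, 2] == 0)])
    b = 0.8

    in_ab = (scores >= a) & (scores <= b)
    fit_ab = np.poly1d(np.polyfit(scores[in_ab], rows[in_ab, 2], 2))  # false-rate polynomial on a-b

    x1 = np.linspace(b, 1.0, int(in_ab.sum()))                        # interval b-1.0 divided equally
    fit_b1 = np.poly1d(np.polyfit(x1, rows[in_ab, 2], 2))             # false-rate polynomial on b-1.0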
In step D, when the output score of the face recognition model falls within the interval a-b, the output score is substituted into the polynomial obtained in step C3; the resulting false recognition rate is then set equal to the polynomial fitted in step C4, yielding a set of solutions x for the score parameter. Solutions greater than 1.0 are discarded, and the remaining solution, which is smaller than 1.0, is taken according to MAX(x, b).
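A corresponding sketch of step D, again an illustration under the assumptions above rather than the patent's reference implementation, evaluates the a-b polynomial at the raw score, solves the b-1.0 polynomial for the same false recognition rate, discards roots above 1.0 and applies MAX(x, b):

    def map_score(s, fit_ab, fit_b1, a=0.6, b=0.8):
        if s >= b:
            return 1.0                    # scores above b are regarded as 1.0
        if s < a:
            return s                      # below the segmentation point: left unmapped
        target = fit_ab(s)                # false recognition rate at the raw score
        roots = (fit_b1 - target).roots   # solve fit_b1(x) == target
        real = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real <= 1.0]
        x = min(real) if real else b      # keep the admissible (smaller) solution
        return max(x, b)                  # MAX(x, b), as in the patent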
In this embodiment, a given face recognition model evaluates a test sample to obtain recognition rate and false recognition rate data at score intervals of 0.01, see Table 1:
Table 1. Recognition rate and false recognition rate data (provided as an image in the original publication; the values are not reproduced here)
As shown in FIG. 2 and FIG. 3, the data are fitted with polynomials, taking the score as the abscissa and the recognition rate and false recognition rate as the ordinate, to obtain the data curves; the fitted polynomial for the recognition rate is:
y = -159.44x^6 + 470.04x^5 - 503.86x^4 + 238.05x^3 - 49.631x^2 + 3.875x + 0.9258
In FIG. 3, the point at which the false recognition rate reaches 0 is taken as the reference segmentation point; for the test model in this embodiment it is 0.6;
As shown in FIG. 2, when the model score exceeds 0.8 the recognition rate is essentially constant and mapping those scores is meaningless, so scores in the interval 0.6-0.8 are mapped to the interval 0.8-1.0, and scores greater than 0.8 before mapping are all regarded as 1.0;
As shown in FIG. 4, the data for scores 0.6-0.8 are fitted with a polynomial, giving:
y = 14.806x^2 - 24.047x + 9.7656;
As shown in FIG. 5, the interval 0.8-1.0 is divided equally and a polynomial is fitted, giving:
y = 16.337x1^2 - 32.735x1 + 16.4, where x1 is the score variable on the interval 0.8-1.0;
When the model output score lies in 0.6-0.8 (taking 0.6 as an example), the two fitted curves give:
16.337*x1^2 - 32.735*x1 + 16.4 = 14.806*0.6^2 - 24.047*0.6 + 9.7656,
Solving the equation gives x = 0.75131 or 1.252424; since the mapped data must lie in 0.8-1.0, the value greater than 1.0 is discarded and the smaller solution 0.75131 is selected. Because 0.75131 does not fully meet the expected lower bound of 0.8, the final value is taken as MAX(x, 0.8), which also avoids the complexity of an overly elaborate fitting function.
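The root-finding step of this example can be checked numerically with the sketch below; because the printed coefficients are rounded, the computed roots need not reproduce 0.75131 and 1.252424 exactly:

    import numpy as np

    rhs = 14.806 * 0.6**2 - 24.047 * 0.6 + 9.7656       # value of the 0.6-0.8 fit at score 0.6
    roots = np.roots([16.337, -32.735, 16.4 - rhs])      # solve the 0.8-1.0 quadratic for x1
    x = max(min(r for r in roots.real if r <= 1.0), 0.8) # drop roots above 1.0, then MAX(x, 0.8)
    print(roots, x)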
Finally, the algorithm maps output scores into the score interval with a high recognition rate, improving the accuracy of the face recognition model.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (5)

1. The face recognition confidence value mapping algorithm is characterized by comprising the following steps of:
A. evaluating a test sample with the face recognition model to obtain recognition rate and false recognition rate data;
B. performing data fitting on the recognition rate and false recognition rate obtained in step A;
C. selecting a segmented data mapping;
D. mapping the output value according to the fitted formulas.
2. The face recognition confidence value mapping algorithm of claim 1, wherein: in step A, the face recognition model is used to evaluate the recognition rate and false recognition rate of the test sample, so as to obtain the recognition rate and false recognition rate corresponding to each score in the test sample, where the score represents the probability that a sample is recognized.
3. The face recognition confidence value mapping algorithm of claim 2, wherein: in step B, the data obtained in step A are fitted separately, with the score as the abscissa and the recognition rate and false recognition rate as the ordinate, to obtain data curves.
4. The face recognition confidence value mapping algorithm of claim 3, wherein the step C process is as follows:
C1. taking the minimum score point in step B at which the false recognition rate is 0 as the reference segmentation point, and recording the corresponding score as a;
C2. mapping scores in the interval a-b to the interval b-1.0, and regarding scores greater than b before mapping as 1.0, where b is the minimum score at which the recognition accuracy levels off;
C3. performing polynomial fitting on the false recognition rate data corresponding to the interval a-b;
C4. equally dividing the interval b-1.0 and performing polynomial fitting on the false recognition rate over it.
5. The face recognition confidence value mapping algorithm of claim 4, wherein: in step D, when the output score of the face recognition model falls within the interval a-b, the output score is substituted into the polynomial obtained in step C3; the resulting false recognition rate is then set equal to the polynomial fitted in step C4, yielding a set of solutions x for the score parameter; solutions greater than 1.0 are discarded, and the remaining solution, which is smaller than 1.0, is taken according to MAX(x, b).
CN201910887638.4A 2019-09-19 2019-09-19 Face recognition confidence value mapping algorithm Pending CN110705402A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910887638.4A CN110705402A (en) 2019-09-19 2019-09-19 Face recognition confidence value mapping algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910887638.4A CN110705402A (en) 2019-09-19 2019-09-19 Face recognition confidence value mapping algorithm

Publications (1)

Publication Number Publication Date
CN110705402A true CN110705402A (en) 2020-01-17

Family

ID=69195810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910887638.4A Pending CN110705402A (en) 2019-09-19 2019-09-19 Face recognition confidence value mapping algorithm

Country Status (1)

Country Link
CN (1) CN110705402A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106096547A (en) * 2016-06-11 2016-11-09 北京工业大学 A kind of towards the low-resolution face image feature super resolution ratio reconstruction method identified
CN108648239A (en) * 2018-05-04 2018-10-12 苏州富强科技有限公司 The scaling method of phase height mapping system based on segmented fitting of a polynomial
CN108921123A (en) * 2018-07-17 2018-11-30 重庆科技学院 A kind of face identification method based on double data enhancing
CN109494716A (en) * 2018-11-15 2019-03-19 沈阳工业大学 Wind power output power confidence interval prediction technique based on Bootstrap
US10346693B1 (en) * 2019-01-22 2019-07-09 StradVision, Inc. Method and device for attention-based lane detection without post-processing by using lane mask and testing method and testing device using the same
CN110083517A (en) * 2019-04-29 2019-08-02 秒针信息技术有限公司 A kind of optimization method and device of user's portrait confidence level

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Wu Weijie: "Research and Implementation of a Quantitative Model of the Relationship Between Image Quality and Face Recognition", China Masters' Theses Full-text Database (Information Science and Technology) *
Zhou Lifang et al.: "Huffman-LBP Multi-pose Face Recognition Based on a Divide-and-Conquer Strategy", Journal of Chinese Computer Systems *
Chen Xiaofang et al.: "Identification of Dynamic Adjustment Intervals for Hydrocracking Based on Principal-Component Derivative Feature Clustering", Journal of Tsinghua University (Science and Technology) *

Similar Documents

Publication Publication Date Title
WO2018028546A1 (en) Key point positioning method, terminal, and computer storage medium
CN106683680B (en) Speaker recognition method and device, computer equipment and computer readable medium
WO2018108129A1 (en) Method and apparatus for use in identifying object type, and electronic device
TWI752455B (en) Image classification model training method, image processing method, data classification model training method, data processing method, computer device, and storage medium
WO2021120834A1 (en) Biometrics-based gesture recognition method and apparatus, computer device, and medium
CN107423773B (en) Automatic registration method and device for three-dimensional skull
CN109101481A (en) A kind of name entity recognition method, device and electronic equipment
CN110147732A (en) Refer to vein identification method, device, computer equipment and storage medium
CN107945791B (en) Voice recognition method based on deep learning target detection
CN111832462B (en) Frequency hopping signal detection and parameter estimation method based on deep neural network
WO2020082734A1 (en) Text emotion recognition method and apparatus, electronic device, and computer non-volatile readable storage medium
JP2018500645A (en) System and method for tracking objects
CN109034095A (en) A kind of face alignment detection method, apparatus and storage medium
CN109656366B (en) Emotional state identification method and device, computer equipment and storage medium
WO2022027913A1 (en) Target detection model generating method and apparatus, device and storage medium
CN111274955A (en) Emotion recognition method and system based on audio-visual feature correlation fusion
CN113539501A (en) Breathing machine man-machine asynchronous classification method, system, terminal and storage medium
CN112131322B (en) Time sequence classification method and device
CN114343577B (en) Cognitive function evaluation method, terminal device, and computer-readable storage medium
CN107729947A (en) A kind of Face datection model training method, device and medium
CN110705402A (en) Face recognition confidence value mapping algorithm
CN108009157B (en) Statement classification method and device
CN111126566A (en) Abnormal furniture layout data detection method based on GAN model
CN109976555A (en) A kind of frequency calibrating method and device
CN114996466A (en) Method and system for establishing medical standard mapping model and using method

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200117)