CN109871762B - Face recognition model evaluation method and device


Info

Publication number: CN109871762B (application CN201910039128.1A; earlier publication CN109871762A)
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: similarity, base image, test, on-site photo
Inventor: 翟彬彬
Assignee (original and current): Ping An Technology Shenzhen Co Ltd
Priority application: CN201910039128.1A
Related application: PCT/CN2019/118257 (WO2020147408A1)
Legal status: Active (application granted)

Classifications

    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02P: Climate change mitigation technologies in the production or processing of goods
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

An embodiment of the invention provides a method and a device for evaluating a face recognition model, in the technical field of artificial intelligence. The method comprises the following steps: constructing n positive test samples; constructing n negative test samples; inputting the n negative test samples into the face recognition model; receiving the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model, calculating the cosine similarity between the two feature vectors, and determining, from the calculated cosine similarity, the similarity between the on-site photo and the base image of each of the n negative test samples; sorting the n negative test samples; determining a target similarity; calculating the pass rate of the n positive test samples; generating an evaluation result for the face recognition model; and outputting the evaluation result of the face recognition model. The technical solution provided by the embodiment of the invention solves the problem in the prior art that a face recognition model cannot be evaluated according to its recognition accuracy.

Description

Face recognition model evaluation method and device
[Technical Field]
The invention relates to the technical field of artificial intelligence, in particular to a method and a device for evaluating a face recognition model.
[Background Art]
Face recognition models are widely used in scenarios where attendance of employees, examinees, and the like must be recorded: attendees are identified with the face as the medium. A good face recognition model should have high recognition accuracy.
At present, however, a face recognition model cannot be evaluated according to its recognition accuracy.
[Summary of the Invention]
In view of the above, the embodiments of the invention provide a method and a device for evaluating a face recognition model, so as to solve the problem in the prior art that a face recognition model cannot be evaluated according to its recognition accuracy.
In one aspect, an embodiment of the present invention provides a method for evaluating a face recognition model, the method comprising: constructing n positive test samples, where each of the n positive test samples comprises an on-site photo and a base image, the on-site photo and the base image in the same positive test sample correspond to the same user, the base image is an image stored in a base image library, the base image library stores n base images, each base image corresponds to one user, and n is a natural number greater than 1; constructing n negative test samples, where each of the n negative test samples comprises an on-site photo and a base image, and the on-site photo and the base image in the same negative test sample correspond to different users; inputting the n negative test samples into the face recognition model; receiving the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model, calculating the cosine similarity between the two feature vectors, and determining, from the calculated cosine similarity, the similarity between the on-site photo and the base image of each of the n negative test samples; sorting the n negative test samples by the similarity between the on-site photo and the base image of each negative test sample to obtain a sorting result; obtaining a preset false recognition rate; determining a target similarity according to the preset false recognition rate and the sorting result; calculating the pass rate of the n positive test samples according to the target similarity; generating an evaluation result for the face recognition model according to the pass rate of the n positive test samples, where the evaluation result is used to evaluate the recognition accuracy of the face recognition model; and outputting the evaluation result of the face recognition model.
Further, determining the target similarity according to the preset false recognition rate and the sorting result comprises: calculating the number of false recognitions according to the formula h = n × L, where L is the preset false recognition rate and h is the number of false recognitions; selecting h negative test samples according to the sorting result, where the similarity between the on-site photo and the base image of each of the selected h negative test samples is greater than that of the remaining n-h negative test samples; taking the negative test sample with the lowest similarity between its on-site photo and base image among the h negative test samples as the target negative test sample; and determining the target similarity from the similarity between the on-site photo and the base image of the target negative test sample.
Further, calculating the pass rate of the n positive test samples according to the target similarity comprises: calculating the similarity between the on-site photo and the base image of each of the n positive test samples to obtain n similarities; counting the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity; and calculating the pass rate of the n positive test samples according to the formula P = n1 / n, where P is the pass rate of the n positive test samples and n1 is the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity.
Further, constructing the n negative test samples comprises: obtaining the on-site photo of the i-th user, where i is a natural number and 1 ≤ i ≤ n; obtaining the n base images stored in the base image library; filtering out the base image corresponding to the i-th user; and calculating the similarity between the on-site photo of the i-th user and each of the remaining n-1 base images, taking the base image with the maximum similarity as the base image of the i-th negative test sample, and taking the on-site photo of the i-th user as the on-site photo of the i-th negative test sample.
Further, calculating the similarity between the on-site photo of the i-th user and the remaining n-1 base images comprises: computing the feature vector of the on-site photo of the i-th user and applying L2-norm normalization to it; obtaining the feature vectors of the remaining n-1 base images and applying L2-norm normalization to them; calculating the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of each of the remaining n-1 base images; and taking the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of the k-th of the remaining n-1 base images as the similarity between the on-site photo of the i-th user and the k-th of the remaining n-1 base images, where 1 ≤ k ≤ n-1.
In another aspect, an embodiment of the present invention provides a device for evaluating a face recognition model, the device comprising: a first construction unit, configured to construct n positive test samples, where each of the n positive test samples comprises an on-site photo and a base image, the on-site photo and the base image in the same positive test sample correspond to the same user, the base image is an image stored in a base image library, the base image library stores n base images, each base image corresponds to one user, and n is a natural number greater than 1; a second construction unit, configured to construct n negative test samples, where each of the n negative test samples comprises an on-site photo and a base image, and the on-site photo and the base image in the same negative test sample correspond to different users; an input unit, configured to input the n negative test samples into a face recognition model; a first determining unit, configured to receive the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model, calculate the cosine similarity between the two feature vectors, and determine, from the calculated cosine similarity, the similarity between the on-site photo and the base image of each of the n negative test samples; a sorting unit, configured to sort the n negative test samples by the similarity between the on-site photo and the base image of each negative test sample to obtain a sorting result; an obtaining unit, configured to obtain a preset false recognition rate; a second determining unit, configured to determine a target similarity according to the preset false recognition rate and the sorting result; a calculating unit, configured to calculate the pass rate of the n positive test samples according to the target similarity; a generating unit, configured to generate an evaluation result for the face recognition model according to the pass rate of the n positive test samples, where the evaluation result is used to evaluate the recognition accuracy of the face recognition model; and an output unit, configured to output the evaluation result of the face recognition model.
Further, the second determining unit comprises: a first calculating subunit, configured to calculate the number of false recognitions according to the formula h = n × L, where L is the preset false recognition rate and h is the number of false recognitions; a screening subunit, configured to select h negative test samples according to the sorting result, where the similarity between the on-site photo and the base image of each of the selected h negative test samples is greater than that of the remaining n-h negative test samples; a first determining subunit, configured to take the negative test sample with the lowest similarity between its on-site photo and base image among the h negative test samples as the target negative test sample; and a second determining subunit, configured to determine the target similarity from the similarity between the on-site photo and the base image of the target negative test sample.
Further, the calculating unit comprises: a second calculating subunit, configured to calculate the similarity between the on-site photo and the base image of each of the n positive test samples to obtain n similarities; a statistics subunit, configured to count the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity; and a third calculating subunit, configured to calculate the pass rate of the n positive test samples according to the formula P = n1 / n, where P is the pass rate of the n positive test samples and n1 is the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity.
Further, the second construction unit comprises: a first obtaining subunit, configured to obtain the on-site photo of the i-th user, where i is a natural number and 1 ≤ i ≤ n; a second obtaining subunit, configured to obtain the n base images stored in the base image library; a filtering subunit, configured to filter out the base image corresponding to the i-th user; and a fourth calculating subunit, configured to calculate the similarity between the on-site photo of the i-th user and each of the remaining n-1 base images, take the base image with the maximum similarity as the base image of the i-th negative test sample, and take the on-site photo of the i-th user as the on-site photo of the i-th negative test sample.
Further, the fourth calculating subunit comprises: a first calculating module, configured to compute the feature vector of the on-site photo of the i-th user and apply L2-norm normalization to it; an obtaining module, configured to obtain the feature vectors of the remaining n-1 base images and apply L2-norm normalization to them; a second calculating module, configured to calculate the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of each of the remaining n-1 base images; and a determining module, configured to take the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of the k-th of the remaining n-1 base images as the similarity between the on-site photo of the i-th user and the k-th of the remaining n-1 base images, where 1 ≤ k ≤ n-1.
In another aspect, an embodiment of the present invention provides a storage medium comprising a stored program, where, when the program runs, the device on which the storage medium is located is controlled to execute the above method for evaluating a face recognition model.
In another aspect, an embodiment of the present invention provides a computer device comprising a memory and a processor, where the memory is configured to store information including program instructions, the processor is configured to control the execution of the program instructions, and the program instructions, when loaded and executed by the processor, implement the steps of the above method for evaluating a face recognition model.
In the embodiments of the present invention, the negative test samples are input into the face recognition model; the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model are received, the cosine similarity between the two feature vectors is calculated, and the similarity between the on-site photo and the base image of each negative test sample is determined from the calculated cosine similarity; the n negative test samples are sorted by the similarity between the on-site photo and the base image to obtain a sorting result, and the target similarity is determined according to a preset false recognition rate and the sorting result; the pass rate of the n positive test samples is calculated according to the target similarity; and an evaluation result of the face recognition model is generated according to the pass rate of the n positive test samples and is used to evaluate the recognition accuracy of the face recognition model. This solves the problem in the prior art that a face recognition model cannot be evaluated according to its recognition accuracy, and achieves the technical effect of evaluating a face recognition model according to its recognition accuracy.
[Description of the Drawings]
To describe the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a flowchart of an optional method for evaluating a face recognition model according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an optional device for evaluating a face recognition model according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an optional computer device according to an embodiment of the present invention.
[Detailed Description of the Invention]
For a better understanding of the technical solution of the present invention, the following detailed description of the embodiments of the present invention refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" used herein merely describes an association between the associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following associated objects.
FIG. 1 is a flowchart of an optional method for evaluating a face recognition model according to an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
step S102, constructing n test positive samples, wherein each test positive sample in the n test positive samples comprises a site photo and a base map, the site photo and the base map included in the same test positive sample correspond to the same user, the base map is an image stored in a base map library, n base maps are stored in the base map library, each base map has a corresponding relation with one user, and n is a natural number larger than 1.
In step S104, n test negative samples are constructed, where each test negative sample in the n test negative samples includes a site photograph and a base map, and the site photograph and the base map included in the same test negative sample correspond to different users.
Step S106, n test negative samples are input into the face recognition model.
Step S108, receiving the feature vector corresponding to the site illumination and the feature vector corresponding to the base map output by the face recognition model, calculating cosine similarity between the feature vector corresponding to the site illumination and the feature vector corresponding to the base map, and determining similarity between the site illumination and the base map of each test negative sample in the n test negative samples according to the calculated cosine similarity.
Step S110, sorting the n test negative samples according to the similarity between the site photograph and the base map of each test negative sample in the n test negative samples, and obtaining a sorting result.
Step S112, obtaining a preset false recognition rate.
Step S114, determining the target similarity according to the preset false recognition rate and the sequencing result.
And step S116, calculating the passing rate of n test positive samples according to the target similarity.
And S118, generating an evaluation result of the face recognition model according to the passing rate of the n test positive samples, wherein the evaluation result is used for evaluating the recognition accuracy of the face recognition model.
Step S120, outputting an evaluation result of the face recognition model.
When an employee, an examinee, or another person needs to enter a workplace or an examination room, a camera captures a picture of the person in front of the lens. In the embodiments of the present invention, this picture is referred to as the on-site photo.
The pictures in the base image library are collected in advance.
For example, 10000 base images are stored in the base image library. The base images are collected in advance, and each base image has a one-to-one correspondence with one user; for example, a frontal face photo of user 0890 is collected in advance to obtain the base image of user 0890, this base image is stored in the library, and it corresponds one-to-one to user 0890. There are 10000 positive test samples in total, namely positive test sample S1, positive test sample S2, ..., positive test sample S10000. A positive test sample includes an on-site photo and a base image that correspond to the same user: positive test sample Si includes the on-site photo of user i and the base image of user i. For example, positive test sample S0055 includes the on-site photo of user 0055 and the base image of user 0055, and positive test sample S0036 includes the on-site photo of user 0036 and the base image of user 0036. A negative test sample includes an on-site photo and a base image that correspond to different users. For example, negative test sample S0055 includes the on-site photo of user 0055 and a base image that is not the base image of user 0055, and negative test sample S0036 includes the on-site photo of user 0036 and a base image that is not the base image of user 0036. The 10000 negative test samples are input into the face recognition model, which outputs the feature vectors of the on-site photos and the feature vectors of the base images of the 10000 negative test samples. The cosine similarity between the feature vector of the on-site photo and the feature vector of the base image is calculated, and the similarity between the on-site photo and the base image of each of the 10000 negative test samples is determined from the calculated cosine similarity. The 10000 negative test samples are sorted from high to low by the similarity between the on-site photo and the base image to obtain a sorting result, and the target similarity is determined according to the preset false recognition rate and the sorting result. The pass rate of the 10000 positive test samples is then calculated according to the target similarity, and an evaluation result of the face recognition model is generated according to this pass rate; the evaluation result is used to evaluate the recognition accuracy of the face recognition model.
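As a rough illustration of steps S106 to S110, the following Python sketch computes the cosine similarity between the two feature vectors of each negative test sample and sorts the samples by that similarity. It is a minimal sketch, not part of the patent: the model interface (a hypothetical `model.extract` returning a 1-D feature vector) and all names are illustrative assumptions.

```python
import numpy as np

def sorted_negative_similarities(model, negative_pairs):
    """negative_pairs: list of (on_site_photo, base_image) arrays for different users."""
    sims = []
    for photo, base in negative_pairs:
        a = model.extract(photo)   # feature vector of the on-site photo (assumed interface)
        b = model.extract(base)    # feature vector of the base image
        # Cosine similarity between the two feature vectors.
        sims.append(float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))))
    # Sorting result: similarities ordered from high to low.
    return sorted(sims, reverse=True)
```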
Since the recognition accuracy of a face recognition model generally cannot reach 100%, false recognitions may occur: the on-site photo and the base image of the same person may fail to be matched, or the on-site photo and the base image of different persons may be recognized as the same person. There can be many reasons for a false recognition; for example, the on-site photo was captured in a side-face pose, the user wears glasses in the on-site photo but not in the base image, or the lighting conditions of the on-site photo and the base image differ. A high-quality face recognition model keeps the false recognition rate low.
In the embodiments of the present invention, the negative test samples are input into the face recognition model; the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model are received, the cosine similarity between the two feature vectors is calculated, and the similarity between the on-site photo and the base image of each negative test sample is determined from the calculated cosine similarity; the n negative test samples are sorted by the similarity between the on-site photo and the base image to obtain a sorting result, and the target similarity is determined according to a preset false recognition rate and the sorting result; the pass rate of the n positive test samples is calculated according to the target similarity; and an evaluation result of the face recognition model is generated according to the pass rate of the n positive test samples and is used to evaluate the recognition accuracy of the face recognition model. This solves the problem in the prior art that a face recognition model cannot be evaluated according to its recognition accuracy, and achieves the technical effect of evaluating a face recognition model according to its recognition accuracy.
The process of constructing a negative test sample is explained by taking the construction of the i-th negative test sample as an example. The specific process is as follows: obtain the on-site photo of the i-th user, where i is a natural number and 1 ≤ i ≤ n; obtain the n base images stored in the base image library; filter out the base image corresponding to the i-th user; and calculate the similarity between the on-site photo of the i-th user and each of the remaining n-1 base images, take the base image with the maximum similarity as the base image of the i-th negative test sample, and take the on-site photo of the i-th user as the on-site photo of the i-th negative test sample.
For example, 10000 base images are stored in the base image library. The 0055-th negative test sample is constructed as follows: obtain the on-site photo of user 0055; obtain the 10000 base images stored in the base image library; filter out the base image corresponding to user 0055; and calculate the similarity between the on-site photo of user 0055 and each of the remaining 9999 base images. The base image with the maximum similarity is base image P560, so base image P560 is taken as the base image of the 0055-th negative test sample and the on-site photo of user 0055 is taken as its on-site photo. The 0036-th negative test sample is constructed in the same way: obtain the on-site photo of user 0036; obtain the 10000 base images stored in the base image library; filter out the base image corresponding to user 0036; and calculate the similarity between the on-site photo of user 0036 and each of the remaining 9999 base images. The base image with the maximum similarity is base image P9923, so base image P9923 is taken as the base image of the 0036-th negative test sample and the on-site photo of user 0036 is taken as its on-site photo.
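This construction (picking, for each user, the most similar base image belonging to a different user) could be sketched as follows, assuming the L2-normalized feature vectors of the on-site photos and base images are already available as rows of NumPy arrays; the data layout and names are assumptions for illustration, not from the patent.

```python
import numpy as np

def build_negative_samples(photo_feats, base_feats):
    """photo_feats, base_feats: arrays of shape (n, m) whose rows are L2-normalized;
    row i of each array belongs to user i."""
    n = photo_feats.shape[0]
    negatives = []
    for i in range(n):
        sims = base_feats @ photo_feats[i]   # cosine similarity to every base image
        sims[i] = -np.inf                    # filter out the base image of user i
        k = int(np.argmax(sims))             # most similar remaining base image
        # The i-th negative test sample: on-site photo of user i, base image of user k.
        negatives.append((i, k, float(sims[k])))
    return negatives
```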
Optionally, determining the target similarity according to the preset false recognition rate and the sorting result includes: calculating the number of false recognitions according to the formula h = n × L, where L is the preset false recognition rate and h is the number of false recognitions; selecting h negative test samples according to the sorting result, where the similarity between the on-site photo and the base image of each of the selected h negative test samples is greater than that of the remaining n-h negative test samples; taking the negative test sample with the lowest similarity between its on-site photo and base image among the h negative test samples as the target negative test sample; and determining the target similarity from the similarity between the on-site photo and the base image of the target negative test sample.
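A minimal sketch of this threshold selection, assuming `sorted_neg_sims` holds the negative-sample similarities in descending order; the rounding of h and the treatment of h = 0 are illustrative choices the patent does not specify.

```python
def target_similarity(sorted_neg_sims, false_recognition_rate):
    n = len(sorted_neg_sims)
    h = max(1, int(n * false_recognition_rate))  # number of false recognitions, h = n × L
    # The h most similar negative samples are the tolerated false recognitions;
    # the lowest similarity among them is taken as the target similarity.
    return sorted_neg_sims[h - 1]
```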
The preset false recognition rate is a preset value, for example 1%, 2%, 5%, or 10%, and can be set according to the actual situation.
The preset false recognition rate and the target similarity are negatively correlated: the higher the preset false recognition rate, the lower the target similarity; the lower the preset false recognition rate, the higher the target similarity.
Optionally, calculating the pass rate of the n positive test samples according to the target similarity includes: calculating the similarity between the on-site photo and the base image of each of the n positive test samples to obtain n similarities; counting the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity; and calculating the pass rate of the n positive test samples according to the formula P = n1 / n, where P is the pass rate of the n positive test samples and n1 is the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity.
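A corresponding sketch of the pass-rate formula P = n1 / n, where `positive_sims` is assumed to hold the on-site-photo/base-image similarity of each positive test sample:

```python
def pass_rate(positive_sims, target_sim):
    # n1: positive test samples whose similarity reaches the target similarity.
    n1 = sum(s >= target_sim for s in positive_sims)
    return n1 / len(positive_sims)  # P = n1 / n
```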
The pass rate of the n positive test samples evaluates the recognition accuracy of the face recognition model: the higher the pass rate, the higher the recognition accuracy; the lower the pass rate, the lower the recognition accuracy.
The specific process of calculating the similarity between the on-site photo of the i-th user and the remaining n-1 base images is as follows: compute the feature vector of the on-site photo of the i-th user and apply L2-norm normalization to it; obtain the feature vectors of the remaining n-1 base images and apply L2-norm normalization to them; calculate the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of each of the remaining n-1 base images; and take the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of the k-th of the remaining n-1 base images as the similarity between the on-site photo of the i-th user and the k-th of the remaining n-1 base images, where 1 ≤ k ≤ n-1.
The cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of each of the remaining n-1 base images can be calculated as follows: transpose the L2-normalized feature vector of the on-site photo of the i-th user to obtain the first vector, a matrix of 1 row and m columns, where m is the dimension of the feature vector; take the L2-normalized feature vectors of the remaining n-1 base images as second vectors, each a matrix of m rows and 1 column, and arrange the n-1 second vectors into a first matrix of m rows and n-1 columns; multiply the first vector by the first matrix to obtain a result vector, a matrix of 1 row and n-1 columns, whose k-th element is the cosine similarity between the feature vector of the on-site photo of the i-th user and the feature vector of the k-th base image, where 1 ≤ k ≤ n-1.
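In NumPy, the single matrix multiplication described above might look like the following sketch; the shapes and names are illustrative assumptions, and the patent does not prescribe a particular implementation.

```python
import numpy as np

def batched_cosine_similarity(photo_feat, base_feats):
    """photo_feat: feature vector of the i-th user's on-site photo, shape (m,).
    base_feats: feature vectors of the remaining n-1 base images, shape (n-1, m)."""
    # L2-norm normalization: divide every vector by its L2 norm.
    p = photo_feat / np.linalg.norm(photo_feat)                         # the first vector (1 x m)
    B = base_feats / np.linalg.norm(base_feats, axis=1, keepdims=True)  # the n-1 second vectors
    # One matrix multiplication replaces n-1 separate dot products; element k
    # is the cosine similarity to the k-th remaining base image.
    return p @ B.T                                                      # 1 x (n-1) result vector
```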
Vector L2-norm normalization divides each element of a vector by the L2 norm of the vector. The L2 norm is calculated according to the following formula:
‖x‖₂ = √(x₁² + x₂² + … + xₙ²)
where ‖x‖₂ denotes the L2 norm of the vector x and n denotes the number of elements contained in the vector x.
Because L2-norm normalization is applied to the feature vectors before the cosine similarity is calculated, there is no need to divide by the vector norms each time a cosine similarity is computed. Multiple vector multiplications are thus converted into a single matrix multiplication, which enables parallel computation and makes the similarity calculation efficient.
The evaluation result is used to evaluate the recognition accuracy of the face recognition model. If the evaluation result indicates that the recognition accuracy of the face recognition model is lower than a preset accuracy level, prompt information is also output after the evaluation result of the face recognition model is output, and the prompt information prompts that the face recognition model needs to be optimized.
An embodiment of the present invention provides a device for evaluating a face recognition model, which is configured to execute the above method for evaluating a face recognition model. As shown in FIG. 2, the device includes: a first construction unit 10, a second construction unit 11, an input unit 12, a first determining unit 13, a sorting unit 14, an obtaining unit 15, a second determining unit 16, a calculating unit 17, a generating unit 18, and an output unit 19.
The first construction unit 10 is configured to construct n positive test samples, where each of the n positive test samples includes an on-site photo and a base image, the on-site photo and the base image in the same positive test sample correspond to the same user, the base image is an image stored in a base image library, the base image library stores n base images, each base image corresponds to one user, and n is a natural number greater than 1.
The second construction unit 11 is configured to construct n negative test samples, where each of the n negative test samples includes an on-site photo and a base image, and the on-site photo and the base image in the same negative test sample correspond to different users.
The input unit 12 is configured to input the n negative test samples into the face recognition model.
The first determining unit 13 is configured to receive the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model, calculate the cosine similarity between the two feature vectors, and determine, from the calculated cosine similarity, the similarity between the on-site photo and the base image of each of the n negative test samples.
The sorting unit 14 is configured to sort the n negative test samples by the similarity between the on-site photo and the base image of each negative test sample to obtain a sorting result.
The obtaining unit 15 is configured to obtain a preset false recognition rate.
The second determining unit 16 is configured to determine a target similarity according to the preset false recognition rate and the sorting result.
The calculating unit 17 is configured to calculate the pass rate of the n positive test samples according to the target similarity.
The generating unit 18 is configured to generate an evaluation result for the face recognition model according to the pass rate of the n positive test samples, where the evaluation result is used to evaluate the recognition accuracy of the face recognition model.
The output unit 19 is configured to output the evaluation result of the face recognition model.
When an employee, an examinee, or another person needs to enter a workplace or an examination room, a camera captures a picture of the person in front of the lens. In the embodiments of the present invention, this picture is referred to as the on-site photo.
The pictures in the base image library are collected in advance.
For example, 10000 base images are stored in the base image library. The base images are collected in advance, and each base image has a one-to-one correspondence with one user; for example, a frontal face photo of user 0890 is collected in advance to obtain the base image of user 0890, this base image is stored in the library, and it corresponds one-to-one to user 0890. There are 10000 positive test samples in total, namely positive test sample S1, positive test sample S2, ..., positive test sample S10000. A positive test sample includes an on-site photo and a base image that correspond to the same user: positive test sample Si includes the on-site photo of user i and the base image of user i. For example, positive test sample S0055 includes the on-site photo of user 0055 and the base image of user 0055, and positive test sample S0036 includes the on-site photo of user 0036 and the base image of user 0036. A negative test sample includes an on-site photo and a base image that correspond to different users. For example, negative test sample S0055 includes the on-site photo of user 0055 and a base image that is not the base image of user 0055, and negative test sample S0036 includes the on-site photo of user 0036 and a base image that is not the base image of user 0036. The 10000 negative test samples are input into the face recognition model, which outputs the feature vectors of the on-site photos and the feature vectors of the base images of the 10000 negative test samples. The cosine similarity between the feature vector of the on-site photo and the feature vector of the base image is calculated, and the similarity between the on-site photo and the base image of each of the 10000 negative test samples is determined from the calculated cosine similarity. The 10000 negative test samples are sorted from high to low by the similarity between the on-site photo and the base image to obtain a sorting result, and the target similarity is determined according to the preset false recognition rate and the sorting result. The pass rate of the 10000 positive test samples is then calculated according to the target similarity, and an evaluation result of the face recognition model is generated according to this pass rate; the evaluation result is used to evaluate the recognition accuracy of the face recognition model.
Since the recognition accuracy of a face recognition model generally cannot reach 100%, false recognitions may occur: the on-site photo and the base image of the same person may fail to be matched, or the on-site photo and the base image of different persons may be recognized as the same person. There can be many reasons for a false recognition; for example, the on-site photo was captured in a side-face pose, the user wears glasses in the on-site photo but not in the base image, or the lighting conditions of the on-site photo and the base image differ. A high-quality face recognition model keeps the false recognition rate low.
In the embodiments of the present invention, the negative test samples are input into the face recognition model; the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model are received, the cosine similarity between the two feature vectors is calculated, and the similarity between the on-site photo and the base image of each negative test sample is determined from the calculated cosine similarity; the n negative test samples are sorted by the similarity between the on-site photo and the base image to obtain a sorting result, and the target similarity is determined according to a preset false recognition rate and the sorting result; the pass rate of the n positive test samples is calculated according to the target similarity; and an evaluation result of the face recognition model is generated according to the pass rate of the n positive test samples and is used to evaluate the recognition accuracy of the face recognition model. This solves the problem in the prior art that a face recognition model cannot be evaluated according to its recognition accuracy, and achieves the technical effect of evaluating a face recognition model according to its recognition accuracy.
Optionally, the second determining unit 16 includes a first calculating subunit, a screening subunit, a first determining subunit, and a second determining subunit. The first calculating subunit is configured to calculate the number of false recognitions according to the formula h = n × L, where L is the preset false recognition rate and h is the number of false recognitions. The screening subunit is configured to select h negative test samples according to the sorting result, where the similarity between the on-site photo and the base image of each of the selected h negative test samples is greater than that of the remaining n-h negative test samples. The first determining subunit is configured to take the negative test sample with the lowest similarity between its on-site photo and base image among the h negative test samples as the target negative test sample. The second determining subunit is configured to determine the target similarity from the similarity between the on-site photo and the base image of the target negative test sample.
Optionally, the calculating unit 17 includes a second calculating subunit, a statistics subunit, and a third calculating subunit. The second calculating subunit is configured to calculate the similarity between the on-site photo and the base image of each of the n positive test samples to obtain n similarities. The statistics subunit is configured to count the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity. The third calculating subunit is configured to calculate the pass rate of the n positive test samples according to the formula P = n1 / n, where P is the pass rate of the n positive test samples and n1 is the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity.
Optionally, the second construction unit 11 includes a first obtaining subunit, a second obtaining subunit, a filtering subunit, and a fourth calculating subunit. The first obtaining subunit is configured to obtain the on-site photo of the i-th user, where i is a natural number and 1 ≤ i ≤ n. The second obtaining subunit is configured to obtain the n base images stored in the base image library. The filtering subunit is configured to filter out the base image corresponding to the i-th user. The fourth calculating subunit is configured to calculate the similarity between the on-site photo of the i-th user and each of the remaining n-1 base images, take the base image with the maximum similarity as the base image of the i-th negative test sample, and take the on-site photo of the i-th user as the on-site photo of the i-th negative test sample.
Optionally, the fourth calculating subunit includes a first calculating module, an obtaining module, a second calculating module, and a determining module. The first calculating module is configured to compute the feature vector of the on-site photo of the i-th user and apply L2-norm normalization to it. The obtaining module is configured to obtain the feature vectors of the remaining n-1 base images and apply L2-norm normalization to them. The second calculating module is configured to calculate the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of each of the remaining n-1 base images. The determining module is configured to take the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of the k-th of the remaining n-1 base images as the similarity between the on-site photo of the i-th user and the k-th of the remaining n-1 base images, where 1 ≤ k ≤ n-1.
An embodiment of the present invention provides a storage medium. The storage medium includes a stored program, and when the program runs, the device on which the storage medium is located is controlled to perform the following steps: constructing n positive test samples, where each of the n positive test samples includes an on-site photo and a base image, the on-site photo and the base image in the same positive test sample correspond to the same user, the base image is an image stored in a base image library, the base image library stores n base images, each base image corresponds to one user, and n is a natural number greater than 1; constructing n negative test samples, where each of the n negative test samples includes an on-site photo and a base image, and the on-site photo and the base image in the same negative test sample correspond to different users; inputting the n negative test samples into a face recognition model; receiving the feature vector of the on-site photo and the feature vector of the base image output by the face recognition model, calculating the cosine similarity between the two feature vectors, and determining, from the calculated cosine similarity, the similarity between the on-site photo and the base image of each of the n negative test samples; sorting the n negative test samples by the similarity between the on-site photo and the base image of each negative test sample to obtain a sorting result; obtaining a preset false recognition rate; determining a target similarity according to the preset false recognition rate and the sorting result; calculating the pass rate of the n positive test samples according to the target similarity; generating an evaluation result for the face recognition model according to the pass rate of the n positive test samples, where the evaluation result is used to evaluate the recognition accuracy of the face recognition model; and outputting the evaluation result of the face recognition model.
Optionally, when the program runs, the device on which the storage medium is located is further controlled to perform the following steps: calculating the number of false recognitions according to the formula h = n × L, where L is the preset false recognition rate and h is the number of false recognitions; selecting h negative test samples according to the sorting result, where the similarity between the on-site photo and the base image of each of the selected h negative test samples is greater than that of the remaining n-h negative test samples; taking the negative test sample with the lowest similarity between its on-site photo and base image among the h negative test samples as the target negative test sample; and determining the target similarity from the similarity between the on-site photo and the base image of the target negative test sample.
Optionally, when the program runs, the device on which the storage medium is located is further controlled to perform the following steps: calculating the similarity between the on-site photo and the base image of each of the n positive test samples to obtain n similarities; counting the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity; and calculating the pass rate of the n positive test samples according to the formula P = n1 / n, where P is the pass rate of the n positive test samples and n1 is the number of positive test samples whose similarity between the on-site photo and the base image is greater than or equal to the target similarity.
Optionally, when the program runs, the device on which the storage medium is located is further controlled to perform the following steps: obtaining the on-site photo of the i-th user, where i is a natural number and 1 ≤ i ≤ n; obtaining the n base images stored in the base image library; filtering out the base image corresponding to the i-th user; and calculating the similarity between the on-site photo of the i-th user and each of the remaining n-1 base images, taking the base image with the maximum similarity as the base image of the i-th negative test sample, and taking the on-site photo of the i-th user as the on-site photo of the i-th negative test sample.
Optionally, when the program runs, the device on which the storage medium is located is further controlled to perform the following steps: computing the feature vector of the on-site photo of the i-th user and applying L2-norm normalization to it; obtaining the feature vectors of the remaining n-1 base images and applying L2-norm normalization to them; calculating the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of each of the remaining n-1 base images; and taking the cosine similarity between the L2-normalized feature vector of the on-site photo of the i-th user and the L2-normalized feature vector of the k-th of the remaining n-1 base images as the similarity between the on-site photo of the i-th user and the k-th of the remaining n-1 base images, where 1 ≤ k ≤ n-1.
The embodiment of the invention provides a computer device, which comprises a memory and a processor, wherein the memory is used for storing information comprising program instructions, the processor is used for controlling the execution of the program instructions, and the program instructions realize the following steps when loaded and executed by the processor: constructing n test positive samples, wherein each test positive sample in the n test positive samples comprises a site photo and a base map, the site photo and the base map which are included in the same test positive sample correspond to the same user, the base map is an image stored in a base map library, n base maps are stored in the base map library, each base map has a corresponding relation with one user, and n is a natural number larger than 1; constructing n test negative samples, wherein each test negative sample in the n test negative samples comprises a site photo and a base map, and the site photo and the base map of the same test negative sample correspond to different users; inputting n test negative samples into a face recognition model; receiving a feature vector corresponding to the site illumination and a feature vector corresponding to the base map output by the face recognition model, calculating cosine similarity between the feature vector corresponding to the site illumination and the feature vector corresponding to the base map, and determining similarity between the site illumination and the base map of each test negative sample in the n test negative samples according to the calculated cosine similarity; sorting the n test negative samples according to the similarity between the site photograph and the base map of each test negative sample in the n test negative samples to obtain a sorting result; acquiring a preset false recognition rate; determining target similarity according to a preset false recognition rate and a sequencing result; calculating the passing rate of n test positive samples according to the target similarity; generating evaluation results of the face recognition model according to the passing rate of the n test positive samples, wherein the evaluation results are used for evaluating the recognition accuracy of the face recognition model; and outputting an evaluation result of the face recognition model.
Optionally, the program instructions when loaded and executed by the processor further implement the steps of: calculating the number of misrecognitions according to a formula h=n×l, wherein L is a preset misrecognition rate, and h is the number of misrecognitions; h test negative samples are screened out according to the sequencing result, wherein the similarity between the site shots and the base graphs of the screened h test negative samples is larger than the similarity between the site shots and the base graphs of the rest n-h test negative samples; taking the test negative sample with the lowest similarity between the site photograph and the base map in the h test negative samples as a target test negative sample; and determining the target similarity according to the similarity between the site photograph and the base map of the target test negative sample.
Optionally, the program instructions when loaded and executed by the processor further implement the steps of: calculating the similarity between the site photo and the base map of each of the n test positive samples to obtain n similarities; counting the number of test positive samples whose similarity between the site photo and the base map is greater than or equal to the target similarity; and calculating the passing rate of the n test positive samples according to the following formula: P = n1/n, wherein P is the passing rate of the n test positive samples and n1 is the number of test positive samples whose similarity between the site photo and the base map is greater than or equal to the target similarity.
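The passing-rate computation above is a threshold count over the positive-sample similarities; a sketch under the assumption that those similarities have already been computed:

```python
def passing_rate(positive_similarities, target_similarity):
    # n1: test positive samples whose site-photo/base-map similarity reaches the target similarity.
    n1 = sum(1 for s in positive_similarities if s >= target_similarity)
    return n1 / len(positive_similarities)     # P = n1 / n
```

Note that a higher preset false recognition rate lowers the target similarity, so the passing rate can only stay the same or increase.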
Optionally, the program instructions when loaded and executed by the processor further implement the steps of: acquiring a site photo of the ith user, wherein i is a natural number and i is greater than or equal to 1 and less than or equal to n; acquiring the n base maps stored in the base map library; filtering out the base map corresponding to the ith user; and performing similarity calculation between the site photo of the ith user and the remaining n-1 base maps, taking the base map corresponding to the maximum similarity as the base map of the ith test negative sample, and taking the site photo of the ith user as the site photo of the ith test negative sample.
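The negative-sample construction can be sketched as follows, assuming the feature vectors of all site photos and base maps are precomputed, L2-normalized, and indexed per user (0-based here, unlike the 1-based i used in the text); the function name is an illustrative assumption:

```python
import numpy as np

def build_negative_sample(i, site_photo_vecs, base_map_vecs):
    # Filter out user i's own base map, then pick the base map of another user
    # that is most similar to user i's site photo (dot product of L2-normalized
    # vectors equals cosine similarity).
    candidates = [k for k in range(len(base_map_vecs)) if k != i]
    best_k = max(candidates, key=lambda k: float(np.dot(site_photo_vecs[i], base_map_vecs[k])))
    # The ith test negative sample: user i's site photo paired with the most
    # similar base map belonging to a different user.
    return site_photo_vecs[i], base_map_vecs[best_k]
```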
Optionally, the program instructions when loaded and executed by the processor further implement the steps of: calculating a feature vector corresponding to the site photo of the ith user, and performing L2 norm normalization on the feature vector corresponding to the site photo of the ith user; acquiring the feature vectors corresponding to the remaining n-1 base maps, and performing L2 norm normalization on the feature vectors corresponding to the remaining n-1 base maps; calculating the cosine similarity between the L2-normalized feature vector corresponding to the site photo of the ith user and the L2-normalized feature vector corresponding to each of the remaining n-1 base maps; and taking the cosine similarity between the L2-normalized feature vector corresponding to the site photo of the ith user and the L2-normalized feature vector corresponding to the kth base map among the remaining n-1 base maps as the similarity between the site photo of the ith user and the kth base map among the remaining n-1 base maps, wherein k is greater than or equal to 1 and less than or equal to n-1.
FIG. 3 is a schematic diagram of an optional computer device provided by an embodiment of the present invention. As shown in FIG. 3, the computer device 50 of this embodiment includes: a processor 51, a memory 52, and a computer program 53 stored in the memory 52 and executable on the processor 51. When executed by the processor 51, the computer program 53 implements the method for evaluating a face recognition model in the above embodiment, which is not repeated here. Alternatively, when executed by the processor 51, the computer program implements the functions of each module/unit in the device for evaluating a face recognition model in the above embodiment, which, to avoid repetition, is not described in detail here.
The computer device 50 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or the like. The computer device may include, but is not limited to, the processor 51 and the memory 52. It will be appreciated by those skilled in the art that FIG. 3 is merely an example of the computer device 50 and does not limit the computer device 50, which may include more or fewer components than shown, combine certain components, or have different components; for example, the computer device may also include an input-output device, a network access device, a bus, and the like.
The processor 51 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 52 may be an internal storage unit of the computer device 50, such as a hard disk or an internal memory of the computer device 50. The memory 52 may also be an external storage device of the computer device 50, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) equipped on the computer device 50. Further, the memory 52 may also include both an internal storage unit and an external storage device of the computer device 50. The memory 52 is used to store the computer program and other programs and data required by the computer device 50. The memory 52 may also be used to temporarily store data that has been output or is to be output.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present invention, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a Processor (Processor) to perform part of the steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing description of the preferred embodiments is not intended to limit the invention; any modification, equivalent replacement, improvement or the like made within the spirit and principles of the invention shall fall within the scope of protection of the invention.

Claims (10)

1. A method for evaluating a face recognition model, the method comprising:
constructing n test positive samples, wherein each test positive sample in the n test positive samples comprises a site photo and a base map, the site photo and the base map included in the same test positive sample correspond to the same user, the base map is an image stored in a base map library, n base maps are stored in the base map library, each base map has a corresponding relation with one user, and n is a natural number larger than 1;
constructing n test negative samples, wherein each test negative sample in the n test negative samples comprises a site photo and a base map, and the site photo and the base map included in the same test negative sample correspond to different users;
inputting the n test negative samples into a face recognition model;
receiving the feature vector corresponding to the site photo and the feature vector corresponding to the base map output by the face recognition model, calculating the cosine similarity between the feature vector corresponding to the site photo and the feature vector corresponding to the base map, and determining the similarity between the site photo and the base map of each test negative sample in the n test negative samples according to the calculated cosine similarity;
sorting the n test negative samples according to the similarity between the site photo and the base map of each test negative sample in the n test negative samples to obtain a sorting result;
acquiring a preset false recognition rate;
determining a target similarity according to the preset false recognition rate and the sorting result;
calculating the passing rate of the n test positive samples according to the target similarity;
generating an evaluation result of the face recognition model according to the passing rate of the n test positive samples, wherein the evaluation result is used for evaluating the recognition accuracy of the face recognition model;
and outputting an evaluation result of the face recognition model.
2. The method of claim 1, wherein the determining a target similarity according to the preset false recognition rate and the sorting result comprises:
calculating the number of false recognitions according to the formula h = n × L, wherein L is the preset false recognition rate and h is the number of false recognitions;
screening out h test negative samples according to the sorting result, wherein the similarity between the site photo and the base map of each of the screened h test negative samples is greater than the similarity between the site photo and the base map of each of the remaining n-h test negative samples;
taking the test negative sample with the lowest similarity between the site photo and the base map among the h test negative samples as a target test negative sample;
and determining the target similarity according to the similarity between the site photo and the base map of the target test negative sample.
3. The method of claim 1, wherein the calculating the passing rate of the n test positive samples according to the target similarity comprises:
calculating the similarity between the site photo and the base map of each of the n test positive samples to obtain n similarities;
counting the number of test positive samples whose similarity between the site photo and the base map is greater than or equal to the target similarity;
and calculating the passing rate of the n test positive samples according to the following formula: P = n1/n, wherein P is the passing rate of the n test positive samples, and n1 is the number of test positive samples whose similarity between the site photo and the base map is greater than or equal to the target similarity.
4. The method according to any one of claims 1 to 3, wherein the constructing n test negative samples comprises:
acquiring a site photo of the ith user, wherein i is a natural number and i is greater than or equal to 1 and less than or equal to n;
acquiring the n base maps stored in the base map library;
filtering out the base map corresponding to the ith user;
and performing similarity calculation between the site photo of the ith user and the remaining n-1 base maps, taking the base map corresponding to the maximum similarity as the base map of the ith test negative sample, and taking the site photo of the ith user as the site photo of the ith test negative sample.
5. The method of claim 4, wherein the performing similarity calculation between the site photo of the ith user and the remaining n-1 base maps comprises:
calculating a feature vector corresponding to the site photo of the ith user, and performing L2 norm normalization on the feature vector corresponding to the site photo of the ith user;
acquiring the feature vectors corresponding to the remaining n-1 base maps, and performing L2 norm normalization on the feature vectors corresponding to the remaining n-1 base maps;
calculating the cosine similarity between the L2-normalized feature vector corresponding to the site photo of the ith user and the L2-normalized feature vector corresponding to each of the remaining n-1 base maps;
and taking the cosine similarity between the L2-normalized feature vector corresponding to the site photo of the ith user and the L2-normalized feature vector corresponding to the kth base map among the remaining n-1 base maps as the similarity between the site photo of the ith user and the kth base map among the remaining n-1 base maps, wherein k is greater than or equal to 1 and less than or equal to n-1.
6. An apparatus for evaluating a face recognition model, the apparatus comprising:
the first construction unit is used for constructing n test positive samples, each test positive sample in the n test positive samples comprises a site photo and a base map, the site photo and the base map which are included in the same test positive sample correspond to the same user, the base map is an image stored in a base map library, n base maps are stored in the base map library, each base map has a corresponding relation with one user, and n is a natural number which is larger than 1;
the second construction unit is used for constructing n test negative samples, each test negative sample in the n test negative samples comprises a site photo and a base map, and the site photo and the base map which are included in the same test negative sample correspond to different users;
the input unit is used for inputting the n test negative samples into a face recognition model;
the first determining unit is used for receiving the feature vector corresponding to the site photo and the feature vector corresponding to the base map output by the face recognition model, calculating the cosine similarity between the feature vector corresponding to the site photo and the feature vector corresponding to the base map, and determining the similarity between the site photo and the base map of each test negative sample in the n test negative samples according to the calculated cosine similarity;
the sorting unit is used for sorting the n test negative samples according to the similarity between the site photo and the base map of each test negative sample in the n test negative samples to obtain a sorting result;
the acquisition unit is used for acquiring a preset false recognition rate;
the second determining unit is used for determining the target similarity according to the preset false recognition rate and the sorting result;
the calculating unit is used for calculating the passing rate of the n test positive samples according to the target similarity;
the generation unit is used for generating an evaluation result of the face recognition model according to the passing rate of the n test positive samples, wherein the evaluation result is used for evaluating the recognition accuracy of the face recognition model;
and the output unit is used for outputting the evaluation result of the face recognition model.
7. The apparatus of claim 6, wherein the second determining unit comprises:
a first calculating subunit, used for calculating the number of false recognitions according to the formula h = n × L, wherein L is the preset false recognition rate and h is the number of false recognitions;
a screening subunit, used for screening out h test negative samples according to the sorting result, wherein the similarity between the site photo and the base map of each of the screened h test negative samples is greater than the similarity between the site photo and the base map of each of the remaining n-h test negative samples;
a first determining subunit, used for taking the test negative sample with the lowest similarity between the site photo and the base map among the h test negative samples as a target test negative sample;
and a second determining subunit, used for determining the target similarity according to the similarity between the site photo and the base map of the target test negative sample.
8. The apparatus of claim 6, wherein the calculating unit comprises:
a second calculating subunit, used for calculating the similarity between the site photo and the base map of each of the n test positive samples to obtain n similarities;
a statistics subunit, used for counting the number of test positive samples whose similarity between the site photo and the base map is greater than or equal to the target similarity;
and a third calculating subunit, used for calculating the passing rate of the n test positive samples according to the following formula: P = n1/n, wherein P is the passing rate of the n test positive samples, and n1 is the number of test positive samples whose similarity between the site photo and the base map is greater than or equal to the target similarity.
9. A storage medium comprising a stored program, wherein the program, when run, controls a device in which the storage medium is located to perform the method for evaluating a face recognition model according to any one of claims 1 to 5.
10. A computer device comprising a memory for storing information including program instructions and a processor for controlling execution of the program instructions, characterized in that the program instructions, when loaded and executed by the processor, implement the steps of the method for evaluating a face recognition model according to any one of claims 1 to 5.
CN201910039128.1A 2019-01-16 2019-01-16 Face recognition model evaluation method and device Active CN109871762B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910039128.1A CN109871762B (en) 2019-01-16 2019-01-16 Face recognition model evaluation method and device
PCT/CN2019/118257 WO2020147408A1 (en) 2019-01-16 2019-11-14 Facial recognition model evaluation method and apparatus, and storage medium and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910039128.1A CN109871762B (en) 2019-01-16 2019-01-16 Face recognition model evaluation method and device

Publications (2)

Publication Number Publication Date
CN109871762A CN109871762A (en) 2019-06-11
CN109871762B true CN109871762B (en) 2023-08-08

Family

ID=66917594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910039128.1A Active CN109871762B (en) 2019-01-16 2019-01-16 Face recognition model evaluation method and device

Country Status (2)

Country Link
CN (1) CN109871762B (en)
WO (1) WO2020147408A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109871762B (en) * 2019-01-16 2023-08-08 平安科技(深圳)有限公司 Face recognition model evaluation method and device
CN110765903A (en) * 2019-10-10 2020-02-07 浙江大华技术股份有限公司 Pedestrian re-identification method and device and storage medium
CN110991314B (en) * 2019-11-28 2023-11-10 以萨技术股份有限公司 Face clustering accuracy-based test method and system
CN111738349B (en) * 2020-06-29 2023-05-02 重庆紫光华山智安科技有限公司 Detection effect evaluation method and device of target detection algorithm, storage medium and equipment
CN118447268A (en) * 2023-09-22 2024-08-06 荣耀终端有限公司 Identity recognition method, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017181769A1 (en) * 2016-04-21 2017-10-26 腾讯科技(深圳)有限公司 Facial recognition method, apparatus and system, device, and storage medium
CN108776768A (en) * 2018-04-19 2018-11-09 广州视源电子科技股份有限公司 Image recognition method and device
CN109189961A (en) * 2018-07-23 2019-01-11 上海斐讯数据通信技术有限公司 A kind of calculation method and system of recognition of face confidence level

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150235073A1 (en) * 2014-01-28 2015-08-20 The Trustees Of The Stevens Institute Of Technology Flexible part-based representation for real-world face recognition apparatus and methods
CN108009528B (en) * 2017-12-26 2020-04-07 广州广电运通金融电子股份有限公司 Triple Loss-based face authentication method and device, computer equipment and storage medium
CN108805048B (en) * 2018-05-25 2020-01-31 腾讯科技(深圳)有限公司 face recognition model adjusting method, device and storage medium
CN109871762B (en) * 2019-01-16 2023-08-08 平安科技(深圳)有限公司 Face recognition model evaluation method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017181769A1 (en) * 2016-04-21 2017-10-26 腾讯科技(深圳)有限公司 Facial recognition method, apparatus and system, device, and storage medium
CN108776768A (en) * 2018-04-19 2018-11-09 广州视源电子科技股份有限公司 Image recognition method and device
CN109189961A (en) * 2018-07-23 2019-01-11 上海斐讯数据通信技术有限公司 A kind of calculation method and system of recognition of face confidence level

Also Published As

Publication number Publication date
WO2020147408A1 (en) 2020-07-23
CN109871762A (en) 2019-06-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant