CN109918992B - Model evaluation method and device based on face attendance scene and computer equipment - Google Patents


Info

Publication number
CN109918992B
CN109918992B (application CN201910019147.8A)
Authority
CN
China
Prior art keywords
similarity
model
site
maximum similarity
evaluated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910019147.8A
Other languages
Chinese (zh)
Other versions
CN109918992A (en)
Inventor
翟彬彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN201910019147.8A
Publication of CN109918992A
Application granted
Publication of CN109918992B

Landscapes

  • Image Analysis (AREA)

Abstract

The application provides a model evaluation method, a model evaluation device and computer equipment based on a face attendance scene, wherein the method comprises the following steps: combining each user's site photo with non-corresponding base maps in pairs to obtain n negative samples, wherein the non-corresponding base maps include the base map corresponding to the site photo of the other user with the greatest similarity to that user's site photo; sorting the n negative samples according to the maximum similarity and obtaining the mth maximum similarity; and detecting n positive samples under the mth maximum similarity using the model to be evaluated to complete the evaluation of the model to be evaluated, wherein n and m are any positive integers, n ≥ 2 and n > m. The application avoids the problem that model evaluation in the artificial intelligence field cannot be expressed intuitively, allows the positive samples to be evaluated using only the mth maximum similarity of the highest precision, and fully combines model evaluation with production practice.

Description

Model evaluation method and device based on face attendance scene and computer equipment
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a model evaluation method, device and computer equipment based on a face attendance scene.
Background
In the prior art, model capability is evaluated using the receiver operating characteristic curve (Receiver Operating Characteristic; hereinafter abbreviated ROC) and the evaluation index F1 score (hereinafter abbreviated F1-score).
The ROC curve is a comprehensive index reflecting the continuous variables of sensitivity and specificity, and reveals the relationship between them by a plotting method: the continuous variable is set to a series of different critical values, a series of sensitivity and specificity pairs is calculated, and a curve is drawn with sensitivity as the ordinate and (1 − specificity) as the abscissa. The larger the area under the curve, the higher the diagnostic accuracy. On the ROC curve, the point closest to the upper-left corner of the graph is the critical value at which both sensitivity and specificity are highest.
F1-score is an index used in statistics to measure the accuracy of a binary classification model. It considers both the precision and the recall of the classification model: F1-score can be seen as the harmonic mean of precision and recall, with a maximum of 1 and a minimum of 0.
However, in the above model evaluation methods, whether the ROC curve or the F1 score is adopted, the negative-sample test set is constructed from the full set of negative samples, which greatly increases the workload of model evaluation.
Disclosure of Invention
The embodiment of the application provides a model evaluation method, a model evaluation device and computer equipment based on a face attendance scene, which avoid the situation that model evaluation cannot be expressed intuitively and, through a brand-new construction of the negative samples, fully combine model evaluation with production practice.
In a first aspect, an embodiment of the present application provides a model evaluation method based on a face attendance scene, including the following steps:
acquiring the site photos of n users and the base map corresponding to each site photo, and combining the same user's site photo with the corresponding base map in pairs to obtain n positive samples;
combining each user's site photo with non-corresponding base maps in pairs to obtain n negative samples, wherein the non-corresponding base maps include the base map corresponding to the site photo of the other user with the greatest similarity to that user's site photo;
sorting the n negative samples according to the maximum similarity, and acquiring the mth maximum similarity, wherein the mth maximum similarity is the maximum similarity between the site photo and the non-corresponding base map in the negative sample ranked at the mth position;
and detecting n positive samples under the mth maximum similarity using the model to be evaluated to complete the evaluation of the model to be evaluated, wherein n and m are any positive integers, n ≥ 2, and n > m.
Wherein, in one possible implementation, the sorting of the n negative samples according to the magnitude of the maximum similarity includes:
performing a bubble sort on the n negative samples according to the maximum similarity; or
performing a selection sort on the n negative samples according to the maximum similarity.
In one possible implementation, before the step of combining the same user's site photo with the corresponding base map in pairs, the method further includes:
calculating the similarity between each user's site photo and the corresponding base map based on a similarity algorithm;
and before combining each user's site photo with the non-corresponding base maps in pairs to obtain n negative samples, the method further includes:
calculating the similarity between each user's site photo and the non-corresponding base maps based on the similarity algorithm.
Wherein, in one possible implementation, before detecting the n positive samples under the mth maximum similarity using the model to be evaluated, the method further includes:
setting the evaluation threshold of the model to be evaluated to the mth maximum similarity;
and detecting the n positive samples under the mth maximum similarity using the model to be evaluated to complete the evaluation of the model to be evaluated includes:
determining that a positive sample passes detection when the similarity obtained for it by the model to be evaluated is greater than or equal to the mth maximum similarity, and determining that it fails detection when that similarity is less than the mth maximum similarity;
and obtaining an evaluation result that the model to be evaluated is good when the proportion of positive samples passing detection is greater than or equal to a preset threshold, and an evaluation result that the model to be evaluated is bad when that proportion is less than the preset threshold.
In a second aspect, an embodiment of the present application provides a model evaluation device based on a face attendance scene, including:
the acquisition module is used for acquiring the site shots of n users and the base map corresponding to each site shot;
the positive sample construction module is used for carrying out pairwise combination on the site photograph and the corresponding base map of the same user so as to obtain n positive samples;
the negative sample construction module is used for combining each user's site photo with non-corresponding base maps in pairs to obtain n negative samples, wherein the non-corresponding base maps include the base map corresponding to the site photo of the other user with the greatest similarity to that user's site photo;
The sorting module is used for sorting the n negative samples according to the maximum similarity;
the acquisition module is further used for acquiring the mth maximum similarity according to the sorting of the sorting module, wherein the mth maximum similarity is the maximum similarity between the site photo and the non-corresponding base map in the negative sample ranked at the mth position;
the evaluation module is used for detecting n positive samples under the mth maximum similarity by using the model to be evaluated so as to finish the evaluation of the model to be evaluated, wherein n and m are any positive integers, n is more than or equal to 2, and n is more than m.
In one possible implementation, the sorting module is specifically configured to perform a bubble sort on the n negative samples according to the maximum similarity, or to perform a selection sort on the n negative samples according to the maximum similarity.
In one possible implementation, the device further comprises a similarity calculation module, configured to calculate, based on a similarity algorithm, the similarity between each user's site photo and the corresponding base map, and the similarity between each user's site photo and the non-corresponding base maps.
In one possible implementation manner, the apparatus further includes an evaluation threshold setting module, where the evaluation threshold setting module is configured to set an evaluation threshold of the model to be evaluated to the mth maximum similarity;
the evaluation module is specifically configured to determine that a positive sample passes detection when the similarity obtained for it by the model to be evaluated is greater than or equal to the mth maximum similarity, and that it fails detection when that similarity is less than the mth maximum similarity; and
to obtain an evaluation result that the model to be evaluated is good when the proportion of positive samples passing detection is greater than or equal to a preset threshold, and an evaluation result that the model to be evaluated is bad when that proportion is less than the preset threshold.
In a third aspect, an embodiment of the present application further provides a computer device, where the computer device includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the computer program, the foregoing model evaluation method based on a face attendance scene is implemented.
In a fourth aspect, an embodiment of the present application further provides a non-transitory computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the foregoing model evaluation method based on the face attendance scene.
In the above technical scheme, after the site photos of n users and the base map corresponding to each site photo are obtained, the same user's site photo and corresponding base map are combined in pairs to obtain n positive samples, and each user's site photo and the non-corresponding base maps are combined in pairs to obtain n negative samples; the n negative samples are sorted according to the similarity so as to obtain the mth maximum similarity, and the n positive samples are detected under the mth maximum similarity using the model to be evaluated so as to complete its evaluation. In this way, when the model is evaluated, the positive samples can be evaluated using only the mth maximum similarity, of top-1 precision, and the evaluation efficiency is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions of the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of one embodiment of a model evaluation method based on a face attendance scene of the present application;
FIG. 2 is a flowchart of another embodiment of a model evaluation method based on a face attendance scene of the present application;
FIG. 3 is a schematic structural diagram of an embodiment of a model evaluation device based on a face attendance scene according to the present application;
FIG. 4 is a schematic structural diagram of another embodiment of a model evaluation device based on a face attendance scene according to the present application;
FIG. 5 is a schematic structural diagram of a model evaluation device based on a face attendance scene according to another embodiment of the present application;
FIG. 6 is a schematic diagram of a computer device according to an embodiment of the present application.
Detailed Description
For a better understanding of the technical solution of the present application, the following detailed description of the embodiments of the present application refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Fig. 1 is a flowchart of an embodiment of a model evaluation method based on a face attendance scene according to the present application, as shown in fig. 1, the method may include:
step 101: and acquiring the site photos of n users and the base map corresponding to each site photo.
Specifically, the site photo and the base map corresponding to each site photo may be obtained by an acquisition module, and the acquisition module may be any camera that converts an optical image into electronic data using an electronic sensor, including but not limited to a single-lens reflex camera, a mirrorless camera, and the like.
As an alternative embodiment, the base map is typically electronic data of the user's image that is captured by a camera and stored in advance, before the user's site photo is taken.
Step 102: and combining the site photographs of the same user and the corresponding base map in pairs to obtain n positive samples.
Further, before the step 102, the method may further include:
and calculating the similarity between the site photograph of each user and the corresponding base map based on a similarity algorithm.
(1) Specifically, the similarity algorithm scores the degree of similarity between the content of two images, and the degree of similarity of the image content is judged from the score. Such algorithms include, but are not limited to, the scale-invariant feature transform (Scale-invariant feature transform; hereinafter SIFT) algorithm, hash algorithms, and the like. In this embodiment, a cosine similarity algorithm is used to calculate the similarity between each user's site photo and the corresponding base map; specifically, this embodiment measures that similarity by the cosine of the angle between the two feature vectors in the inner-product space. For example: first, image features are extracted from each user's site photo and the corresponding base map to obtain the corresponding feature vectors;
(2) The cosine similarity of the feature vectors of each user's site photo and the corresponding base map is calculated by formula (1):

cos θ = (a · b) / (‖a‖ ‖b‖)    (1)

wherein, in formula (1), a represents the feature vector corresponding to the user's site photo, and b represents the feature vector corresponding to the corresponding base map.

The closer the value of cos θ is to 1, the closer each user's site photo is to the corresponding base map.
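Formula (1) can be sketched directly in code. This is a minimal illustration, not part of the patent: the vector values are hypothetical stand-ins for the feature vectors a feature-extraction model would produce.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity per formula (1): cos θ = (a · b) / (‖a‖ ‖b‖)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical feature vectors for a user's site photo and base map.
site_photo_vec = [0.6, 0.8]
base_map_vec = [0.8, 0.6]
print(cosine_similarity(site_photo_vec, base_map_vec))  # ≈ 0.96
```

Identical vectors score 1 and orthogonal vectors score 0, matching the observation that a cos θ close to 1 indicates a close match.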
Step 103: and combining the site shots of each user with non-corresponding base pictures in pairs to obtain n negative samples, wherein the non-corresponding base pictures comprise base pictures corresponding to site shots of another user with the greatest similarity with the site shots of each user.
Further, before the step 103, the method further includes:
and calculating the similarity between the site photograph and the non-corresponding base map of each user based on a similarity algorithm.
Likewise, in this embodiment, a cosine similarity algorithm may be used to calculate the similarity between each user's site photo and the non-corresponding base maps; specifically, this embodiment measures that similarity by the cosine of the angle between the two feature vectors in the inner-product space. For the specific implementation, refer to the related description in step 102, which is not repeated here.
Step 104: and sequencing the n negative samples according to the maximum similarity, and acquiring the mth maximum similarity, wherein the mth maximum similarity is the maximum similarity between the site illumination and the non-corresponding base map arranged in the negative sample at the mth position.
Further, the value of m changes with the preset false recognition rate of the model: if the model has 100 negative samples and the preset false recognition rate is 5%, the value of m is 5.
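The relationship between m and the preset false recognition rate can be sketched as follows. Note this is an assumption generalized from the single example in the text (100 negatives at 5% giving m = 5); the function name and rounding behaviour are not from the patent.

```python
def m_from_false_recognition_rate(num_negative_samples, preset_rate):
    """m tracks the preset false recognition rate: with 100 negative
    samples and a preset rate of 5%, m = 5. Rounding to the nearest
    integer (with a floor of 1) is an assumed generalization."""
    return max(1, round(num_negative_samples * preset_rate))

print(m_from_false_recognition_rate(100, 0.05))  # 5
```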
Step 105: and detecting n positive samples under the m-th maximum similarity by using the model to be evaluated to finish the evaluation of the model to be evaluated, wherein n and m are any positive integers, n is more than or equal to 2, and n is more than m.
In the above model evaluation method based on the face attendance scene, after the site photos of n users and the base map corresponding to each site photo are acquired, the same user's site photo and corresponding base map are combined in pairs to obtain n positive samples, and each user's site photo and the non-corresponding base maps are combined in pairs to obtain n negative samples; the n negative samples are sorted according to the similarity so as to acquire the mth maximum similarity, and the n positive samples are detected under the mth maximum similarity using the model to be evaluated so as to complete its evaluation. Thus, when the model is evaluated, the positive samples can be evaluated using only the mth maximum similarity, of the highest precision, and the evaluation efficiency is improved.
Referring again to fig. 1, in the embodiment of fig. 1 of the present application, step 104 may include:
performing a bubble sort on the n negative samples according to the maximum similarity; or performing a selection sort on the n negative samples according to the maximum similarity.
The bubble sort algorithm and the selection sort algorithm are described below in turn.
Specifically, the bubble sort algorithm proceeds as follows:
(11) Compare adjacent elements; if the first is larger than the second, exchange them;
(12) Do the same for each pair of adjacent elements, from the first pair at the head of the sequence to the last pair at the end. After this pass, the last element is the largest (or smallest) value;
(13) Repeat the above steps for all elements except the last one, which is already sorted;
(14) Continue repeating the above steps over fewer and fewer (unsorted) elements each time, until no pair of values needs to be compared; the sequence is then sorted.
In this embodiment, the elements in the bubble sort algorithm are the maximum similarities. For example:
assume there are 7 negative samples in total, and the maximum similarities corresponding to the 7 negative samples are: 37%, 67%, 47%, 27%, 97%, 87%, 57%. Then:
(i) Initial state: 37%, 67%, 47%, 27%, 97%, 87%, 57%;
(ii) First pass: 37%, 47%, 27%, 67%, 87%, 57%, 97% (6 comparisons; 97% sinks to the end of the unsorted sequence);
(iii) Second pass: 37%, 27%, 47%, 67%, 57%, 87%, 97% (5 comparisons; 87% sinks to the end of the unsorted sequence);
(iv) Third pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (4 comparisons; 67% sinks to the end of the unsorted sequence);
(v) Fourth pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (3 comparisons; 57% sinks to the end of the unsorted sequence);
(vi) Fifth pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (2 comparisons; 47% sinks to the end of the unsorted sequence);
(vii) Sixth pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (1 comparison; 37% sinks to the end of the unsorted sequence).
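The passes above can be reproduced with a short bubble sort sketch (ascending, as in the walkthrough — the similarities are written as decimals rather than percentages):

```python
def bubble_sort(sims):
    """Ascending bubble sort over the maximum similarities, mirroring steps
    (11)-(14): each pass compares adjacent pairs and sinks the largest
    remaining value to the end of the unsorted sequence."""
    a = list(sims)
    n = len(a)
    for i in range(n - 1):             # one pass per line (ii)-(vii)
        for j in range(n - 1 - i):     # compare adjacent elements
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

sims = [0.37, 0.67, 0.47, 0.27, 0.97, 0.87, 0.57]  # the 7 maximum similarities
print(bubble_sort(sims))  # [0.27, 0.37, 0.47, 0.57, 0.67, 0.87, 0.97]
```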
Specifically, the principle of selection sort is that each pass selects the smallest (or largest) element from the data elements still to be sorted and appends it to the end of the already-sorted sequence, until all the data elements to be sorted have been placed. For example: there are 7 negative samples in total, and the maximum similarities corresponding to the 7 negative samples are: 37%, 67%, 47%, 27%, 97%, 87%, 57%. Then:
(i) Initial state: the unsorted region is R[37%, 67%, 47%, 27%, 97%, 87%, 57%], and the sorted region is empty;
(ii) First pass: the smallest record R[27%] is selected from the unsorted region R[37%, 67%, 47%, 27%, 97%, 87%, 57%] and placed in the 1st position of the sorted region, so that the unsorted region shrinks by one record and the sorted region grows by one record;
(iii) Second pass: the smallest record R[37%] is selected from R[37%, 67%, 47%, 97%, 87%, 57%] and placed in the 2nd position of the sorted region, so that the unsorted region shrinks by one record and the sorted region grows by one record;
(iv) Third pass: the smallest record R[47%] is selected from R[67%, 47%, 97%, 87%, 57%] and placed in the 3rd position of the sorted region, likewise shrinking the unsorted region and growing the sorted region by one record;
(v) Fourth pass: the smallest record R[57%] is selected from R[67%, 97%, 87%, 57%] and placed in the 4th position of the sorted region;
(vi) Fifth pass: the smallest record R[67%] is selected from R[67%, 97%, 87%] and placed in the 5th position of the sorted region;
(vii) Sixth pass: the smallest record R[87%] is selected from R[97%, 87%] and placed in the 6th position of the sorted region;
(viii) Seventh pass: R[97%] is selected and placed in the 7th position of the sorted region; at this point the sorted region is R[27%, 37%, 47%, 57%, 67%, 87%, 97%].
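The selection-sort passes above can be sketched with the same unsorted/sorted-region bookkeeping the text uses (removing each selected minimum from the unsorted region and appending it to the sorted region):

```python
def selection_sort(sims):
    """Ascending selection sort over the maximum similarities: each pass
    selects the smallest record from the unsorted region and moves it to
    the end of the sorted region, as in passes (i)-(viii) above."""
    unsorted_region = list(sims)
    sorted_region = []
    while unsorted_region:
        smallest = min(unsorted_region)
        unsorted_region.remove(smallest)  # unsorted region shrinks by one record
        sorted_region.append(smallest)    # sorted region grows by one record
    return sorted_region

sims = [0.37, 0.67, 0.47, 0.27, 0.97, 0.87, 0.57]
print(selection_sort(sims))  # [0.27, 0.37, 0.47, 0.57, 0.67, 0.87, 0.97]
```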
Fig. 2 is a flowchart of still another embodiment of the model evaluation method based on the face attendance scene according to the present application, as shown in fig. 2, in the embodiment of fig. 1 of the present application, before step 105, the method may further include:
step 201, setting an evaluation threshold of the model to be evaluated as the mth maximum similarity.
Specifically, by setting the evaluation threshold to the mth maximum similarity, model evaluation can be performed by comparing only the similarity corresponding to each positive sample with the mth maximum similarity, which has the highest precision: if the similarity of a positive sample is higher than the mth maximum similarity, the positive sample is regarded as passing detection; otherwise, it fails detection. Using the mth maximum similarity of the highest precision thus ensures the evaluation efficiency of the model.
Thus, step 105 may include:
and 202, determining that the positive sample passes detection when the similarity corresponding to the positive sample obtained by the detection of the model to be evaluated is greater than or equal to the mth maximum similarity, and determining that the positive sample does not pass detection when the similarity corresponding to the positive sample obtained by the detection of the model to be evaluated is less than the mth maximum similarity.
And 203, obtaining an evaluation result of the model to be evaluated according to the probability of passing the detection of the positive sample.
Specifically, when the proportion of positive samples passing detection is greater than or equal to a preset threshold, the evaluation result of the model to be evaluated may be determined to be good; when that proportion is less than the preset threshold, the evaluation result may be determined to be bad.
The preset threshold may be set according to system performance and/or implementation requirements; its size is not limited in this embodiment. For example, the preset threshold may be 80%.
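Steps 202 and 203 can be sketched together. This is an illustration under assumed names; 0.80 is only the example threshold value from the text, and the positive-sample similarities are hypothetical.

```python
def rate_model(positive_sims, mth_max_similarity, preset_threshold=0.80):
    """Steps 202-203 (sketch): a positive sample passes when its similarity
    is >= the mth maximum similarity; the model is rated good when the
    pass rate reaches the preset threshold."""
    passed = sum(1 for s in positive_sims if s >= mth_max_similarity)
    pass_rate = passed / len(positive_sims)
    return "good" if pass_rate >= preset_threshold else "bad"

# Hypothetical positive-sample similarities against an mth maximum similarity of 0.87:
# 4 of 5 samples pass, a pass rate of 0.80, which meets the 80% example threshold.
print(rate_model([0.95, 0.91, 0.88, 0.90, 0.60], 0.87))
```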
Fig. 3 is a schematic structural diagram of an embodiment of a model evaluation device based on a face attendance scene according to the present application, where, as shown in fig. 3, the device may include:
the obtaining module 31 is configured to obtain the site shots of the n users and the base map corresponding to each site shot.
Specifically, the site photo and the base map corresponding to each site photo may be obtained by the obtaining module, and the obtaining module may be any camera that converts an optical image into electronic data using an electronic sensor, including but not limited to a single-lens reflex camera, a mirrorless camera, and the like.
As an alternative embodiment, the base map is typically electronic data of the user's image that is captured by a camera and stored in advance, before the user's site photo is taken.
The positive sample construction module 32 is configured to combine the site photograph and the corresponding base map of the same user two by two to obtain n positive samples.
Further, the apparatus may further include:
A similarity calculation module 33 (see FIG. 4) is configured to calculate, based on a similarity algorithm, the similarity between each user's site photo and the corresponding base map, and the similarity between each user's site photo and the non-corresponding base maps, as follows:
(1) The similarity algorithm scores the degree of similarity between the content of two images, and the degree of similarity of the image content is judged from the score. Such algorithms include, but are not limited to, the scale-invariant feature transform (Scale-invariant feature transform; hereinafter SIFT) algorithm, hash algorithms, and the like. In this embodiment, a cosine similarity algorithm is used to calculate the similarity between each user's site photo and the corresponding base map; specifically, this embodiment measures that similarity by the cosine of the angle between the two feature vectors in the inner-product space. For example: first, image features are extracted from each user's site photo and the corresponding base map to obtain the corresponding feature vectors;
(2) The cosine similarity of the feature vectors of each user's site photo and the corresponding base map is calculated by formula (1):

cos θ = (a · b) / (‖a‖ ‖b‖)    (1)

wherein, in formula (1), a represents the feature vector corresponding to the user's site photo, and b represents the feature vector corresponding to the corresponding base map.

The closer the value of cos θ is to 1, the closer each user's site photo is to the corresponding base map.
The negative sample construction module 34 is configured to combine the site shots of each user with non-corresponding base graphs to obtain n negative samples, where the non-corresponding base graphs include base graphs corresponding to site shots of another user having a maximum similarity with the site shots of each user.
Further, before the negative sample construction module 34 constructs a negative sample, the similarity calculation module 33 calculates the similarity between the site photograph and the non-corresponding base map of each user based on a similarity algorithm.
Likewise, in this embodiment, a cosine similarity algorithm may be used to calculate the similarity between each user's site photo and the non-corresponding base maps; specifically, this embodiment measures that similarity by the cosine of the angle between the two feature vectors in the inner-product space. For the specific implementation, refer to the description of the positive sample construction module 32, which is not repeated here.
The sorting module 35 is configured to sort the n negative samples by their maximum similarity. The obtaining module 31 may then obtain the mth maximum similarity, where the mth maximum similarity is the maximum similarity between the site photo and the non-corresponding base map of the negative sample ranked at the mth position.
Further, the value of m changes with the model's preset false recognition rate; for example, if there are 100 negative samples and the preset false recognition rate is 5%, then the value of m is 5.
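Under the assumption that m is simply the number of false accepts tolerated among the n negative samples (i.e., m = n × false recognition rate, as in the 100-sample example above), the relationship can be sketched as follows (the function name is illustrative):

```python
def index_m(n_negative, preset_far):
    """Position m of the threshold similarity among the sorted negative samples.

    Assumption: m equals the number of negative samples allowed to be
    falsely accepted, n * FAR, rounded to the nearest integer.
    """
    return round(n_negative * preset_far)

print(index_m(100, 0.05))  # prints 5, matching the example in the text
```

With 100 negative samples and a 5% preset false recognition rate, the mth (here 5th) maximum similarity becomes the evaluation threshold.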
The evaluation module 36 is configured to use the model to be evaluated to detect the n positive samples under the mth maximum similarity, so as to complete the evaluation of the model to be evaluated, where n and m are any positive integers, n is greater than or equal to 2, and n is greater than m.
In the above model evaluation device based on the face attendance scene, after the obtaining module 31 obtains the site photos of n users and the base map corresponding to each site photo, the positive sample construction module 32 combines each same user's site photo with the corresponding base map in pairs to obtain n positive samples, and the negative sample construction module 34 combines each user's site photo with non-corresponding base maps in pairs to obtain n negative samples. The sorting module 35 sorts the n negative samples by similarity, the obtaining module 31 obtains the mth maximum similarity, and the evaluation module 36 uses the model to be evaluated to detect the n positive samples under the mth maximum similarity. Thus, when evaluating the model, the positive samples need only be checked against the mth maximum similarity, which has the highest precision, improving evaluation efficiency.
Referring to fig. 3 and 4, in the model evaluation device based on the face attendance scene of the present application,
the sorting module 35 is specifically configured to perform a bubble sort on the n negative samples according to the magnitude of the maximum similarity, or to perform a selection sort on the n negative samples according to the magnitude of the maximum similarity.
The bubble sort algorithm and the selection sort algorithm are described below.

Specifically, the bubble sort algorithm proceeds as follows:
(11) Compare adjacent elements; if the first is larger than the second, swap them;

(12) Do the same for each pair of adjacent elements, from the first pair at the beginning of the sequence to the last pair at the end. After this pass, the last element is the largest;

(13) Repeat the above steps for all elements except the last one, which is already in place;

(14) Continue repeating, with fewer and fewer unsorted elements each pass, until no pair of numbers needs to be compared, at which point the sequence is sorted.
In this embodiment, the elements in the bubble sort algorithm are the maximum similarities. For example:

assume there are 7 negative samples in total, with maximum similarities of 37%, 67%, 47%, 27%, 97%, 87%, 57%; then:
(i) Initial state: 37%, 67%, 47%, 27%, 97%, 87%, 57%;
(ii) First pass: 37%, 47%, 27%, 67%, 87%, 57%, 97% (6 comparisons; 97% sinks to the end of the unordered sequence);

(iii) Second pass: 37%, 27%, 47%, 67%, 57%, 87%, 97% (5 comparisons; 87% sinks to the end of the unordered sequence);

(iv) Third pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (4 comparisons; 67% sinks to the end of the unordered sequence);

(v) Fourth pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (3 comparisons; 57% sinks to the end of the unordered sequence);

(vi) Fifth pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (2 comparisons; 47% sinks to the end of the unordered sequence);

(vii) Sixth pass: 27%, 37%, 47%, 57%, 67%, 87%, 97% (1 comparison; 37% sinks to the end of the unordered sequence).
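The bubble sort steps above can be sketched in Python (a minimal illustration; the function name and the use of fractional values instead of percentages are assumptions):

```python
def bubble_sort(sims):
    """Ascending bubble sort of the negative samples' maximum similarities.

    Matches the trace above: each pass compares adjacent pairs and the
    largest remaining value sinks to the end of the unordered sequence.
    """
    s = list(sims)
    n = len(s)
    for i in range(n - 1):            # one pass per element to place
        for j in range(n - 1 - i):    # compare adjacent pairs
            if s[j] > s[j + 1]:
                s[j], s[j + 1] = s[j + 1], s[j]
    return s

print(bubble_sort([0.37, 0.67, 0.47, 0.27, 0.97, 0.87, 0.57]))
# -> [0.27, 0.37, 0.47, 0.57, 0.67, 0.87, 0.97]
```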
Specifically, the principle of selection sort is that each pass selects the smallest (or largest) element from the data elements yet to be sorted and appends it to the end of the sorted sequence, until all data elements have been sorted. For example: there are 7 negative samples in total, with maximum similarities of 37%, 67%, 47%, 27%, 97%, 87%, 57%; then:
(i) Initial state: the unordered region is R[37%, 67%, 47%, 27%, 97%, 87%, 57%]; the ordered region is empty;

(ii) First pass: the smallest record, 27%, is selected from the unordered region R[37%, 67%, 47%, 27%, 97%, 87%, 57%] and moved to the end of the ordered region, so the unordered region shrinks by one record and the ordered region grows by one;

(iii) Second pass: the smallest record of R[37%, 67%, 47%, 97%, 87%, 57%], namely 37%, is moved to the end of the ordered region;

(iv) Third pass: the smallest record of R[67%, 47%, 97%, 87%, 57%], namely 47%, is moved to the end of the ordered region;

(v) Fourth pass: the smallest record of R[67%, 97%, 87%, 57%], namely 57%, is moved to the end of the ordered region;

(vi) Fifth pass: the smallest record of R[67%, 97%, 87%], namely 67%, is moved to the end of the ordered region;

(vii) Sixth pass: the smallest record of R[97%, 87%], namely 87%, is moved to the end of the ordered region;

(viii) Seventh pass: the remaining record, 97%, is moved to the end of the ordered region, which at this point becomes R[27%, 37%, 47%, 57%, 67%, 87%, 97%].
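The selection sort passes above can be sketched in Python (again a minimal illustration; the function name and fractional values are assumptions):

```python
def selection_sort(sims):
    """Ascending selection sort of the negative samples' maximum similarities.

    Each pass finds the smallest remaining value and places it at the end
    of the ordered region, matching the trace above.
    """
    s = list(sims)
    for i in range(len(s) - 1):
        # Index of the smallest record in the unordered region s[i:].
        smallest = min(range(i, len(s)), key=s.__getitem__)
        s[i], s[smallest] = s[smallest], s[i]
    return s

print(selection_sort([0.37, 0.67, 0.47, 0.27, 0.97, 0.87, 0.57]))
# -> [0.27, 0.37, 0.47, 0.57, 0.67, 0.87, 0.97]
```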
Fig. 5 is a schematic structural diagram of another embodiment of a model evaluation device based on a face attendance scene according to the present application, where, as shown in fig. 5, the device may further include:
the evaluation threshold setting module 51 is configured to set an evaluation threshold of the model to be evaluated as the mth maximum similarity.
Specifically, the evaluation threshold setting module 51 may set the evaluation threshold to the mth maximum similarity. During model evaluation, the similarity of each positive sample is then simply compared against the mth maximum similarity, which has the highest precision: if a positive sample's similarity is higher than the mth maximum similarity, the positive sample passes detection; otherwise it fails. Using the mth maximum similarity in this way ensures the evaluation efficiency of the model.
Thus, the evaluation module 36 may perform the following:
(1) When the similarity corresponding to a positive sample detected by the model to be evaluated is greater than or equal to the mth maximum similarity, determining that the positive sample passes detection; when it is less than the mth maximum similarity, determining that the positive sample fails detection; and

(2) Obtaining an evaluation result of the model to be evaluated according to the probability of the positive samples passing detection.
Specifically, when the probability of passing the detection of the positive sample is greater than or equal to a preset threshold, the evaluation result of the model to be evaluated may be determined to be superior, and when the probability of passing the detection of the positive sample is less than the preset threshold, the evaluation result of the model to be evaluated may be determined to be inferior.
The preset threshold may be set as needed according to system performance and/or implementation requirements; this embodiment does not limit its size. For example, the preset threshold may be 80%.
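A minimal sketch of this evaluation flow under stated assumptions (the function name, the sample values, and the use of an 80% pass threshold are illustrative; the mth maximum similarity is taken from the descending order of the negative samples' similarities):

```python
def evaluate_model(positive_sims, negative_sims, m, pass_threshold=0.8):
    """Evaluate a model: threshold on the m-th maximum negative similarity.

    A positive sample passes when its similarity >= the threshold; the
    model is judged 'superior' when the pass rate >= pass_threshold,
    otherwise 'inferior'.
    """
    # The m-th maximum similarity: m-th entry of the descending order.
    threshold = sorted(negative_sims, reverse=True)[m - 1]
    passed = sum(1 for s in positive_sims if s >= threshold)
    pass_rate = passed / len(positive_sims)
    verdict = "superior" if pass_rate >= pass_threshold else "inferior"
    return threshold, pass_rate, verdict

neg = [0.37, 0.67, 0.47, 0.27, 0.97, 0.87, 0.57]  # negative-sample similarities
pos = [0.91, 0.88, 0.95, 0.62, 0.99]              # positive-sample similarities
print(evaluate_model(pos, neg, m=2))
# -> (0.87, 0.8, 'superior'): threshold is the 2nd-largest negative similarity,
#    4 of 5 positives meet it, and 0.8 reaches the 80% pass threshold.
```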
Fig. 6 is a schematic structural diagram of an embodiment of a computer device according to the present application, where the computer device may include a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the computer program, the model evaluation method based on a face attendance scene provided by the embodiment of the present application may be implemented.
The computer device may be a server, for example a cloud server, or an electronic device such as a smart phone, a smart watch, a personal computer (hereinafter PC), a notebook computer, or a tablet computer; the embodiment of the present application does not limit the specific form of the computer device.
Fig. 6 shows a block diagram of an exemplary computer device 52 suitable for use in implementing embodiments of the present application. The computer device 52 shown in fig. 6 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present application.
As shown in fig. 6, the computer device 52 is in the form of a general purpose computing device. Components of computer device 52 may include, but are not limited to: one or more processors or processing units 56, a system memory 78, a bus 58 that connects the various system components, including the system memory 78 and the processing units 56.
Bus 58 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (hereinafter ISA) bus, the Micro Channel Architecture (hereinafter MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (hereinafter VESA) local bus, and the Peripheral Component Interconnect (hereinafter PCI) bus.
Computer device 52 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by computer device 52 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 78 may include computer system readable media in the form of volatile memory, such as random access memory (Random Access Memory; hereinafter: RAM) 70 and/or cache memory 72. The computer device 52 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, the storage system 74 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, commonly referred to as a "hard disk drive"). Although not shown in fig. 6, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a compact disk read only memory (Compact Disc Read Only Memory; hereinafter CD-ROM), digital versatile read only optical disk (Digital Video Disc Read Only Memory; hereinafter DVD-ROM), or other optical media) may be provided. In such cases, each drive may be coupled to bus 58 through one or more data media interfaces. Memory 78 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the application.
A program/utility 80 having a set (at least one) of program modules 82 may be stored, for example, in the memory 78, such program modules 82 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 82 generally perform the functions and/or methods of the embodiments described herein.
The computer device 52 can also communicate with one or more external devices 54 (e.g., keyboard, pointing device, display 64, etc.), one or more devices that enable a user to interact with the computer device 52, and/or any device (e.g., network card, modem, etc.) that enables the computer device 52 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 62. Also, the computer device 52 may communicate with one or more networks such as a local area network (Local Area Network; hereinafter: LAN), a wide area network (Wide Area Network; hereinafter: WAN) and/or a public network such as the Internet via the network adapter 60. As shown in fig. 6, the network adapter 60 communicates with other modules of the computer device 52 via the bus 58. It should be appreciated that although not shown in fig. 6, other hardware and/or software modules may be used in connection with computer device 52, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 56 executes programs stored in the system memory 78 to perform various functional applications and data processing, for example, to implement the model evaluation method based on the face attendance scene provided by the embodiment of the present application.
The embodiment of the application also provides a non-transitory computer readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the model evaluation method based on the face attendance scene provided by the embodiment of the application can be realized.
The non-transitory computer readable storage media described above may employ any combination of one or more computer readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an erasable programmable Read-Only Memory (Erasable Programmable Read Only Memory; EPROM) or flash Memory, an optical fiber, a portable compact disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, smalltalk, C ++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (Local Area Network; hereinafter: LAN) or a wide area network (Wide Area Network; hereinafter: WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order from that shown or discussed, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Depending on the context, the word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined" or "in response to determining" or "when (a stated condition or event) is detected" or "in response to detecting (a stated condition or event)", depending on the context.
It should be noted that, the terminal according to the embodiment of the present application may include, but is not limited to, a personal Computer (Personal Computer; hereinafter abbreviated as PC), a personal digital assistant (Personal Digital Assistant; hereinafter abbreviated as PDA), a wireless handheld device, a Tablet Computer (Tablet Computer), a mobile phone, an MP3 player, an MP4 player, and the like.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a Processor (Processor) to perform part of the steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a Read-Only Memory (hereinafter referred to as ROM), a random access Memory (Random Access Memory) and various media capable of storing program codes such as a magnetic disk or an optical disk.
The foregoing description of the preferred embodiments of the application is not intended to be limiting, but rather to enable any modification, equivalent replacement, improvement or the like to be made within the spirit and principles of the application.

Claims (9)

1. A model evaluation method based on a face attendance scene, characterized in that the method comprises the following steps:
acquiring site photos of n users and a base map corresponding to each site photo, and combining the same user's site photo and the corresponding base map in pairs to obtain n positive samples;

combining each user's site photo with non-corresponding base maps in pairs to obtain n negative samples, wherein the non-corresponding base maps include the base map corresponding to the site photo of another user having the maximum similarity with each user's site photo;
sorting the n negative samples according to the magnitude of the maximum similarity, and obtaining the mth maximum similarity, wherein the mth maximum similarity is the maximum similarity between the site photo and the non-corresponding base map of the negative sample ranked at the mth position;
detecting n positive samples under the mth maximum similarity by using a model to be evaluated to finish the evaluation of the model to be evaluated, wherein n and m are any positive integers, n is more than or equal to 2, and n is more than m;
when the similarity corresponding to a positive sample detected by the model to be evaluated is greater than or equal to the mth maximum similarity, determining that the positive sample passes detection, and when the similarity corresponding to a positive sample detected by the model to be evaluated is less than the mth maximum similarity, determining that the positive sample fails detection; and

when the probability of the positive samples passing detection is greater than or equal to a preset threshold, obtaining an evaluation result that the model to be evaluated is superior, and when the probability of the positive samples passing detection is less than the preset threshold, obtaining an evaluation result that the model to be evaluated is inferior;
the value of m is changed along with the change of the false recognition rate preset by the model.
2. The method of claim 1, wherein said ordering said n negative samples by said magnitude of maximum similarity comprises:
performing a bubble sort on the n negative samples according to the magnitude of the maximum similarity; or

performing a selection sort on the n negative samples according to the magnitude of the maximum similarity.
3. The method of claim 1, wherein the combining the site photograph and the corresponding base map of the same user two by two, before obtaining n positive samples, further comprises:
Calculating the similarity between the site photograph of each user and the corresponding base map based on a similarity algorithm;
wherein before combining each user's site photo with the non-corresponding base maps in pairs to obtain n negative samples, the method further comprises:
and calculating the similarity between each user site photograph and the non-corresponding base map based on a similarity algorithm.
4. A method according to claim 3, wherein said using the model to be evaluated further comprises, prior to detecting n positive samples at said mth maximum similarity:
and setting an evaluation threshold value of the model to be evaluated as the mth maximum similarity.
5. A model evaluation device based on a face attendance scene, characterized in that the device comprises:
the acquisition module is used for acquiring the site shots of n users and the base map corresponding to each site shot;
the positive sample construction module is used for carrying out pairwise combination on the site photograph and the corresponding base map of the same user so as to obtain n positive samples;
the negative sample construction module is used for combining each user's site photo with non-corresponding base maps to obtain n negative samples, wherein the non-corresponding base maps include the base map corresponding to the site photo of another user having the maximum similarity with each user's site photo;
The sorting module is used for sorting the n negative samples according to the maximum similarity;
the acquisition module is further used for acquiring the mth maximum similarity according to the sorting by the sorting module, wherein the mth maximum similarity is the maximum similarity between the site photo and the non-corresponding base map of the negative sample ranked at the mth position;
the evaluation module is used for detecting n positive samples under the mth maximum similarity by using the model to be evaluated so as to finish the evaluation of the model to be evaluated, wherein n and m are any positive integers, n is more than or equal to 2, and n is more than m;
the device also comprises an evaluation threshold setting module, wherein the evaluation threshold setting module is used for setting the evaluation threshold of the model to be evaluated as the mth maximum similarity;
the evaluation module is specifically configured to determine that the positive sample passes detection when the similarity corresponding to the positive sample obtained by the detection of the model to be evaluated is greater than or equal to the mth maximum similarity, and determine that the positive sample fails detection when the similarity corresponding to the positive sample obtained by the detection of the model to be evaluated is less than the mth maximum similarity; and
when the probability of the positive samples passing detection is greater than or equal to a preset threshold, obtain an evaluation result that the model to be evaluated is superior, and when the probability of the positive samples passing detection is less than the preset threshold, obtain an evaluation result that the model to be evaluated is inferior;
The value of m is changed along with the change of the false recognition rate preset by the model.
6. The device according to claim 5, wherein
the sorting module is specifically configured to perform bubbling sorting on the n negative samples according to the maximum similarity, or perform selective sorting on the n negative samples according to the maximum similarity.
7. The device according to claim 5, wherein
the device further comprises a similarity calculation module, wherein the similarity calculation module is used for calculating the similarity between each user's site photo and the corresponding base map based on a similarity algorithm, and calculating the similarity between each user's site photo and the non-corresponding base maps based on the similarity algorithm.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to any one of claims 1 to 4 when the computer program is executed.
9. A non-transitory computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 4.
CN201910019147.8A 2019-01-09 2019-01-09 Model evaluation method and device based on face attendance scene and computer equipment Active CN109918992B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910019147.8A CN109918992B (en) 2019-01-09 2019-01-09 Model evaluation method and device based on face attendance scene and computer equipment

Publications (2)

Publication Number Publication Date
CN109918992A CN109918992A (en) 2019-06-21
CN109918992B true CN109918992B (en) 2023-11-03


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114519520A (en) * 2022-02-17 2022-05-20 深圳集智数字科技有限公司 Model evaluation method, model evaluation device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014082496A1 (en) * 2012-11-27 2014-06-05 腾讯科技(深圳)有限公司 Method and apparatus for identifying client characteristic and storage medium
KR101415309B1 (en) * 2014-04-10 2014-07-04 한국지질자원연구원 Determination method and apparatus of the optimum distance of upward continuation for stripping
CN108763277A (en) * 2018-04-10 2018-11-06 平安科技(深圳)有限公司 A kind of data analysing method, computer readable storage medium and terminal device
CN108805048A (en) * 2018-05-25 2018-11-13 腾讯科技(深圳)有限公司 A kind of method of adjustment of human face recognition model, device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Text extraction based on a maximum-minimum similarity learning method; Fu Hui et al.; Journal of Software (软件学报); Vol. 19, No. 03; pp. 621-629 *

Also Published As

Publication number Publication date
CN109918992A (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN109086709B (en) Feature extraction model training method and device and storage medium
CN107545241B (en) Neural network model training and living body detection method, device and storage medium
CN111680551B (en) Method, device, computer equipment and storage medium for monitoring livestock quantity
CN109671020B (en) Image processing method, device, electronic equipment and computer storage medium
US9064171B2 (en) Detection device and method for transition area in space
US20140324888A1 (en) Method and Apparatus for Identifying a Gesture Based Upon Fusion of Multiple Sensor Signals
CN108875487B (en) Training of pedestrian re-recognition network and pedestrian re-recognition based on training
CN111062871A (en) Image processing method and device, computer equipment and readable storage medium
WO2018090937A1 (en) Image processing method, terminal and storage medium
CN108734106B (en) Rapid riot and terrorist video identification method based on comparison
CN115311730B (en) Face key point detection method and system and electronic equipment
CN115062186B (en) Video content retrieval method, device, equipment and storage medium
CN111340213B (en) Neural network training method, electronic device, and storage medium
CN109829383B (en) Palmprint recognition method, palmprint recognition device and computer equipment
CN109961103B (en) Training method of feature extraction model, and image feature extraction method and device
CN109918992B (en) Model evaluation method and device based on face attendance scene and computer equipment
JP2014178857A (en) Image search system and image search method
CN110313001A (en) Photo processing method, device and computer equipment
CN115393755A (en) Visual target tracking method, device, equipment and storage medium
CN110046632A (en) Model training method and device
CN116048682A (en) Terminal system interface layout comparison method and electronic equipment
CN111444319B (en) Text matching method and device and electronic equipment
CN115004245A (en) Target detection method, target detection device, electronic equipment and computer storage medium
CN114419525A (en) Harmful video detection method and system
JP6244887B2 (en) Information processing apparatus, image search method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant