CN112912889B - Image template updating method, device and storage medium - Google Patents

Image template updating method, device and storage medium

Info

Publication number
CN112912889B
CN112912889B (application CN201980002050.4A)
Authority
CN
China
Prior art keywords: image, target object, template, uniformity, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201980002050.4A
Other languages
Chinese (zh)
Other versions
CN112912889A (en)
Inventor
肖梦佳
王波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd
Publication of CN112912889A
Application granted
Publication of CN112912889B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition


Abstract

An image template updating method, a device, and a storage medium. The updating method comprises the following steps: acquiring an image of a target object to obtain a target sample image of the target object (101); determining that the similarity between the target sample image and a first image template of the target object is within a first preset range (102); pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replacement template set (103); calculating a uniformity parameter for each of the at least one pre-replacement template set (104); and if the uniformity parameter of the pre-replacement template set obtained by replacing a second image template of the target object is smaller than a preset uniformity value, taking the target sample image as an image template of the target object and replacing the second image template (105). The method ensures that the image templates of the target object are uniformly distributed, improving the accuracy and recognition rate of image recognition.

Description

Image template updating method, device and storage medium
Technical Field
Embodiments of the present application relate to the technical field of image recognition, and in particular to an image template updating method, device, and storage medium.
Background
As image recognition technology has matured, its applications have broadened. Taking face recognition as an example, it has been widely applied to security inspection, identity authentication, electronic payment, and other scenarios. During face recognition, face images are first sampled at registration, and an image template is established from the features of the registered face images; in later use, whether the person currently being identified is a registered user is determined against that image template. In practice, however, the angle at which the registered face images are captured is random, so the recognition rate is low when a face image to be recognized is captured at a different angle.
Disclosure of Invention
In view of the above, one of the technical problems to be solved by the embodiments of the present application is to provide an image template updating method, device, and storage medium that overcome the above-mentioned drawback.
In a first aspect, an embodiment of the present application provides a method for updating an image template, including:
acquiring an image of the target object to obtain a target sample image of the target object;
determining that the similarity between the target sample image and a first image template of the target object is within a first preset range, wherein the target object has at least one image template and the first image template belongs to the at least one image template of the target object;
pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replacement template set;
calculating a uniformity parameter for each of the at least one pre-replacement template set, wherein the uniformity parameter indicates the uniformity of the distribution of the image templates of the target object in the corresponding pre-replacement template set;
and if the uniformity parameter of the pre-replacement template set obtained by replacing a second image template is smaller than a preset uniformity value, taking the target sample image as an image template of the target object and replacing the second image template, wherein the second image template belongs to the at least one image template of the target object.
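The five steps of the first aspect can be sketched in a short, self-contained example. This is a minimal illustration, not the patent's implementation: the 4x4 angular grid, the dictionary template representation, and the `sim` callback are assumptions, and the standard deviation of per-region template counts serves as the uniformity parameter, as in the region-based example described later.

```python
import statistics

def region_index(template, bins=4):
    """Map a template's (yaw, pitch) in [-90, 90] degrees to one of
    bins*bins uniformly divided angular regions (grid size is an assumption)."""
    step = 180 / bins
    col = min(int((template["yaw"] + 90) // step), bins - 1)
    row = min(int((template["pitch"] + 90) // step), bins - 1)
    return row * bins + col

def uniformity(templates, bins=4):
    """Standard deviation of per-region template counts; lower = more uniform."""
    counts = [0] * (bins * bins)
    for t in templates:
        counts[region_index(t, bins)] += 1
    return statistics.pstdev(counts)

def try_update(templates, sample, preset_uniformity, sim, lo, hi):
    """Steps 102-105: admit `sample` only if its similarity to the first
    template lies in (lo, hi], then pre-replace each template in turn and
    keep the replacement that yields the smallest uniformity parameter,
    provided it beats the preset uniformity value."""
    if not (lo < sim(sample, templates[0]) <= hi):   # step 102
        return templates
    best_i, best_u = None, preset_uniformity
    for i in range(len(templates)):                  # steps 103-104
        candidate = templates[:i] + [sample] + templates[i + 1:]
        u = uniformity(candidate)
        if u < best_u:                               # step 105
            best_i, best_u = i, u
    if best_i is not None:
        templates = templates[:best_i] + [sample] + templates[best_i + 1:]
    return templates
```

For instance, if all existing templates cluster at the frontal pose, a sample at a large angle lowers the standard deviation of region counts and is accepted as a replacement.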
Optionally, in one embodiment of the present application, replacing each image template of the target object with the target sample image and obtaining the corresponding uniformity parameter includes:
uniformly dividing the value ranges of the transverse rotation angle and the longitudinal rotation angle of the target object into at least one region; for each image template of the target object, after that image template is replaced with the target sample image, determining the number of image templates distributed in each region, and calculating the standard deviation/variance of the per-region template counts as the uniformity parameter.
Optionally, in one embodiment of the present application, the method further comprises: for the image templates of the target object before replacement, the number of the image templates distributed in each region is determined, and the standard deviation/variance of the number of the image templates of all regions is calculated as a preset uniformity value.
Optionally, in one embodiment of the present application, calculating the standard deviation/variance of the number of image templates of the at least one region as the uniformity parameter includes:
dividing the at least one region into a first portion and a second portion; calculating the standard deviation/variance of the per-region template counts over all regions in the first portion, divided by a first factor, as a first parameter of the uniformity parameters; and calculating the standard deviation/variance of the per-region template counts over all regions in the second portion, divided by the first factor, as a second parameter of the uniformity parameters.
Optionally, in one embodiment of the present application, the method further comprises:
for the image templates of the target object before replacement, determining the number of image templates distributed in each region; calculating the standard deviation/variance of the per-region template counts over all regions in the first portion as a first uniformity value of the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the first parameter is smaller than the first uniformity value; and/or calculating the standard deviation/variance of the per-region template counts over all regions in the second portion as a second uniformity value of the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the second parameter is smaller than the second uniformity value.
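The two-part variant above can be sketched as follows; the particular split of regions into an inner (near-frontal) and outer (large-angle) portion and the value of the first factor are assumptions for illustration.

```python
import statistics

def two_part_uniformity(counts, inner_idx, outer_idx, factor=2.0):
    """Split the per-region template counts into two portions (e.g. an
    assumed near-frontal vs. large-angle split) and compute the scaled
    standard deviation of each, giving the first and second parameters."""
    first = statistics.pstdev([counts[i] for i in inner_idx]) / factor
    second = statistics.pstdev([counts[i] for i in outer_idx]) / factor
    return first, second

def below_preset(first, second, first_value, second_value):
    """The 'and/or' condition: the uniformity parameter counts as smaller
    than the preset value when either portion improves on its preset value."""
    return first < first_value or second < second_value
```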
Optionally, in one embodiment of the present application, the method further comprises: calculating a first similarity average value of each image in the at least one image and an image template of the target object; and determining an image of which the first similarity average value is within a second preset range as a target sample image of the target object.
Optionally, in one embodiment of the present application, the method further comprises: when the similarity is larger than a first threshold and smaller than or equal to a second threshold, determining that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range.
Optionally, in one embodiment of the present application, the method further comprises: and calculating a second similarity average value of each sample image in the sample image set of the non-target object and the image template of the target object, and determining the second similarity average value with the largest value as a first threshold value.
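As a sketch of this embodiment, the first threshold can be computed as the largest mean similarity between any non-target sample and the target object's templates; the `similarity` callback stands in for the neural network model the description mentions and is an assumption here.

```python
def second_similarity_average(sample, templates, similarity):
    """Mean similarity of one non-target sample to all n image templates."""
    return sum(similarity(sample, t) for t in templates) / len(templates)

def first_threshold(non_target_samples, templates, similarity):
    """The largest second-similarity average over the non-target sample set."""
    return max(second_similarity_average(s, templates, similarity)
               for s in non_target_samples)
```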
Optionally, in one embodiment of the present application, the method further comprises: determining a transverse rotation angle and a longitudinal rotation angle of a target object in a target sample image; and when the sum of squares of the transverse rotation angle and the longitudinal rotation angle is in a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is in the first preset range.
Optionally, in one embodiment of the present application, pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replaced template set includes:
when the number of image templates of the target object is determined to be equal to a preset number, pre-replacing each image template of the target object with the target sample image to obtain the at least one pre-replacement template set.
Optionally, in one embodiment of the present application, the method further comprises: and when the number of the image templates of the target object is smaller than the preset number, determining the target sample image of the target object as the image template of the target object.
Optionally, in one embodiment of the present application, the method further comprises: when at least one of the uniformity parameters of the target object is smaller than the preset uniformity value, replacing, with the target sample image, the image template whose pre-replacement yields the smallest uniformity parameter.
Optionally, in one embodiment of the present application, the method further comprises: among the image templates whose pre-replacement yields a uniformity parameter smaller than the preset uniformity value, replacing with the target sample image the image template with the fewest hits matched to the target object during image recognition.
Optionally, in one embodiment of the present application, the method further comprises: among the image templates whose pre-replacement yields a uniformity parameter smaller than the preset uniformity value, replacing with the target sample image the image template that has been in service the longest.
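The three replacement strategies just described (smallest resulting uniformity parameter, fewest matching hits, longest service time) can be combined as successive tie-breakers, as in this hedged sketch; the candidate field names are assumed for illustration, not the patent's.

```python
def choose_replacement(candidates):
    """Pick which template to replace among those whose pre-replacement
    uniformity parameter beats the preset value. Each candidate is a dict
    with assumed keys: 'uniformity' (parameter after pre-replacement),
    'hits' (matches during recognition), 'age' (service time). Sort so the
    smallest uniformity wins, ties broken by fewest hits, then by oldest."""
    return min(candidates, key=lambda c: (c["uniformity"], c["hits"], -c["age"]))
```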
Optionally, in one embodiment of the present application, the method further comprises: matching the target sample image against the at least one image template of the target object; and when the target sample image successfully matches any image template of the target object, determining that authentication of the target object succeeds.
Optionally, in one embodiment of the present application, the method further comprises: and determining the target sample image as an image template of the target object, storing the image template of the target object, and finishing registration of the target object.
Optionally, in one embodiment of the present application, the method further comprises: converting the two-dimensional target sample image into three-dimensional point cloud information, and calculating, from the three-dimensional point cloud information, the transverse rotation angle and the longitudinal rotation angle of the target object in the target sample image through an extrinsic rotation matrix.
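As an illustrative sketch of the last step: once an extrinsic rotation matrix has been estimated from the three-dimensional point cloud (for instance by rigid registration against a frontal reference), the two rotation angles can be read off the matrix. The Ry(yaw)·Rx(pitch) decomposition convention chosen here is an assumption, not stated in the patent.

```python
import math

def rot_y(a):
    """Rotation about the y-axis (lateral/yaw), angle in radians."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_x(b):
    """Rotation about the x-axis (longitudinal/pitch), angle in radians."""
    c, s = math.cos(b), math.sin(b)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def yaw_pitch_from_rotation(R):
    """For R = rot_y(yaw) @ rot_x(pitch): R[2][0] = -sin(yaw),
    R[0][0] = cos(yaw), R[1][2] = -sin(pitch), R[1][1] = cos(pitch),
    so both angles (in degrees) follow from atan2."""
    yaw = math.degrees(math.atan2(-R[2][0], R[0][0]))
    pitch = math.degrees(math.atan2(-R[1][2], R[1][1]))
    return yaw, pitch
```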
In a second aspect, an image template updating device includes: a permission learning module, a uniformity parameter module, and an image template management module;
the permission learning module is configured to determine that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range;
the uniformity parameter module is configured to, when the number of image templates of the target object is equal to the preset number, pre-replace each image template of the target object with the target sample image and obtain the corresponding uniformity parameters, wherein a uniformity parameter indicates the uniformity of the distribution of the image templates of the target object after an image template of the target object is replaced with the target sample image;
and the image template management module is configured to, if the uniformity parameter obtained after the second image template of the target object is replaced is smaller than the preset uniformity value, take the target sample image as an image template of the target object and replace the second image template.
In a third aspect, an electronic device includes: at least one processor, a memory, and an image acquisition module, which are interconnected through a bus and communicate with one another; the image acquisition module comprises an infrared camera and a speckle projector and is configured to acquire images; the memory stores at least one program, and when executing the program stored on the memory, the at least one processor implements the method described in any one of the embodiments of the first aspect.
In a fourth aspect, a storage medium has a computer program stored thereon which, when executed by at least one processor, implements the method described in any one of the embodiments of the first aspect.
During recognition of the target object, the similarity between the target sample image and the first image template is determined, and the target sample image is admitted as a learning sample only when that similarity is within the first preset range, which reduces duplicate image templates of the target object. Whether the target sample image may replace an image template of the target object is then decided from the uniformity parameters obtained by pre-replacing each image template with the target sample image. This ensures that the image templates of the target object are uniformly distributed, so that the target object can be recognized at all angles, improving the accuracy and recognition rate of image recognition.
Drawings
Some specific embodiments of the present application will be described in detail below by way of example and not by way of limitation with reference to the accompanying drawings. The same reference numbers will be used throughout the drawings to refer to the same or like parts or portions. It will be appreciated by those skilled in the art that the drawings are not necessarily drawn to scale. In the accompanying drawings:
Fig. 1 is a flowchart of an updating method of an image template according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a second similarity average provided in an embodiment of the present application;
fig. 3 is a schematic diagram of correspondence between similarity and radius of a distribution circle according to an embodiment of the present application;
fig. 4 is a schematic diagram of a third preset range provided in an embodiment of the present application;
FIG. 5 is a logic block diagram of an updating method of an image template according to an embodiment of the present application;
fig. 6 is a block diagram of an image template updating device according to an embodiment of the present application;
fig. 7 is a schematic hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
Any particular embodiment of the present application need not achieve all of the advantages described above.
In order to better understand the technical solutions in the embodiments of the present application, the following descriptions will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the embodiments of the present application shall fall within the scope of protection of the embodiments of the present application.
Embodiments of the present application are further described below with reference to the accompanying drawings of embodiments of the present application.
Embodiment 1
Fig. 1 is a flowchart of an image template updating method according to an embodiment of the present application. The method may be applied during image recognition registration of the target object, or during image recognition after registration of the target object is completed; it is not limited in this respect. As shown in fig. 1, the method comprises the following steps:
Step 101, acquiring an image of the target object to obtain a target sample image of the target object.
The target object is the object to be identified in image recognition. For example, in a face recognition scenario the target object may be a user's face; in a license plate recognition scenario it may be a vehicle's license plate. These are merely examples; in different application scenarios the target object may be a different object.
In this application, "target" is used to denote a specific singular item, for example the target object; "target" is not a limitation, but is used to indicate the implementation of the present solution more clearly in logic. Likewise, the terms "first", "second", and the like are used for distinguishing, not for limitation: they do not describe a sequence, and the presence of a "second" does not require a "first", nor vice versa.
Step 102, determining that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
The target object has at least one image template, and the first image template belongs to the at least one image template of the target object. When the similarity between the target sample image of the target object and the first image template is within the first preset range, the target sample image can serve as a permitted learning sample image; that is, it meets the preliminary condition for becoming an image template of the target object and can be added to the learning queue.
Typically, the target object has a plurality of image templates, and the first image template is only one of them. In one embodiment, during the registration phase, the plurality of image templates are created from frontal images acquired of the front of the target object, and one of these is used as the first image template. For example, in a face recognition scenario, the image template established at registration from a face image captured from the front is the first image template. Of course, the first image template may be any one of the image templates of the target object; this is merely an example and does not limit the present application.
Two specific examples are given here to illustrate how it is determined that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range; of course, these are merely illustrative and do not limit the present application.
Optionally, in a first example, the method further comprises: and when the similarity is larger than the first threshold value and smaller than or equal to the second threshold value, determining that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
It should be noted that the first preset range is bounded below by the first threshold and above by the second threshold. Setting the first threshold ensures that the selected target sample image is a sample image of the target object rather than of a non-target object, removing interference from non-target sample images; setting the second threshold removes target sample images whose similarity to the first image template is extremely high, avoiding duplicate image templates.
Based on the first example, further, in one embodiment of the present application, the method further comprises: and calculating a second similarity average value of each sample image in the sample image set of the non-target object and the image template of the target object, and determining the second similarity average value with the largest value as a first threshold value. For example, for one sample image a, the similarity of the sample image a to each image template of the target object is calculated, and the average value of the similarities is calculated to obtain a second similarity average value of the sample image a, the second similarity average values of all sample images are compared, and the second similarity average value with the largest value is determined as the first threshold value.
Here is how to calculate the second similarity average value between each sample image in the sample image set of the non-target object and the image templates of the target object. A sample image set of the non-target object is obtained; the sample images it contains are not sample images of the target object but of other objects. For example, suppose the sample image set of the non-target object has M sample images, M being an integer greater than 1, and the target object has n image templates, n being an integer greater than 1. For the m-th sample image, with m an integer in [1, M], the second similarity average value $\bar{s}_m$ between the m-th sample image and the image templates of the target object is calculated as in Equation 1:

$$\bar{s}_m = \frac{1}{n}\sum_{i=1}^{n} s_{mi} \tag{1}$$

where $\bar{s}_m$ represents the second similarity average value between the m-th sample image and the image templates of the target object, and $s_{mi}$ represents the similarity between the m-th sample image and the i-th image template of the target object. It should be noted that, in this embodiment, the similarity between a sample image and an image template may be calculated with a neural network model: inputting the sample image and the image template into the model yields the similarity it outputs.
As shown in fig. 2, a schematic diagram of the second similarity average value provided in an embodiment of the present application: the horizontal axis represents the second similarity average value, the left vertical axis the number of sample images, and the right vertical axis the proportion of sample images. The bar graph shows the number of sample images at each second similarity average value, and the line graph shows the cumulative proportion of non-target sample images whose second similarity average value is less than or equal to that value. In the sample image set of the non-target object, the proportion of sample images with a second similarity average value within 0.8 is close to 1; that is, the similarity between almost all non-target sample images and the image templates of the target object is less than 0.8. A sample image with similarity greater than 0.8 can therefore be considered a sample image of the target object, so 0.8 can be regarded as the largest second similarity average value, i.e., 0.8 is taken as the first threshold.
Optionally, in a second example, the method further comprises: determining a transverse rotation angle and a longitudinal rotation angle of a target object in a target sample image; and when the sum of squares of the transverse rotation angle and the longitudinal rotation angle is in a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is in the first preset range.
Here is how the third preset range is specified. For example, in a rectangular coordinate system, the longitudinal rotation angle of the target object (i.e., the angle by which the target object rotates about the x-axis) is represented on the x-axis, and the lateral rotation angle (i.e., the angle by which it rotates about the y-axis) on the y-axis; the longitudinal and lateral rotation angles each lie within 90°, i.e., in [-90, 90]. A sample image set of the target object is acquired, which may comprise, say, 1000 sample images of the target object, and the similarity between each sample image and the frontal image template of the target object is calculated; for example, the similarity between the m-th sample image and the frontal image template may be written $sim_m$. The R value (distribution circle radius) of each sample image is calculated from its coordinates in the rectangular coordinate system: if the m-th sample image has coordinates $(x_m, y_m)$, meaning the target object in the m-th sample image is rotated by $x_m$ about the x-axis and by $y_m$ about the y-axis, then its R value is

$$R_m = \sqrt{x_m^2 + y_m^2}.$$

A sample image with similarity greater than 0.95 is determined to be a duplicate of the frontal image template of the target object. A sample image with similarity less than or equal to 0.8 is determined to be an invalid sample image: its similarity to the image templates of the target object is too low, so it can be regarded as a sample image of a non-target object. A sample image with similarity greater than 0.8 is determined to be a valid sample image: its similarity to the image templates of the target object is high, so it can be regarded as a sample image of the target object. Therefore, the similarity between the target sample image and the first image template of the target object should lie in (0.8, 0.95]. As shown in fig. 3, a schematic diagram of the correspondence between similarity and distribution circle radius provided in an embodiment of the present application, the R value corresponding to a similarity of 0.95 is 5 and the R value corresponding to a similarity of 0.8 is 60, so the similarity interval (0.8, 0.95] corresponds to the R value interval [5, 60), which is the third preset range. Hence, when the R value of the target sample image is within the third preset range, i.e., the [5, 60) interval, it is determined that the similarity between the target sample image and the first image template of the target object is within the first preset range. As shown in fig. 4, a schematic diagram of the third preset range provided in an embodiment of the present application, in the rectangular coordinate system the shaded ring represents the third preset range, i.e., the distribution area where the R value is greater than or equal to 5 and less than 60; the areas outside the shaded ring are where the R value of a sample image is less than 5 (similarity to the frontal image template greater than 0.95) or greater than or equal to 60 (similarity less than or equal to 0.8).
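The admission test of this second example reduces to a radius check. A minimal sketch, assuming the bounds 5 and 60 read off fig. 3:

```python
import math

def in_third_preset_range(x_deg, y_deg, r_min=5.0, r_max=60.0):
    """Admit a target sample image when the distribution-circle radius
    R = sqrt(x^2 + y^2), computed from the longitudinal (x) and lateral
    (y) rotation angles in degrees, lies in [r_min, r_max)."""
    return r_min <= math.hypot(x_deg, y_deg) < r_max
```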
Step 103, pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replacement template set.
It should be noted that, in step 103, replacing each image template with the target sample image is only a pre-replacement used to calculate the uniformity parameters; the target sample image is not yet directly taken as an image template.
Optionally, in one embodiment of the present application, pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replaced template set includes:
when the number of image templates of the target object is determined to be equal to the preset number, pre-replacing each image template of the target object with the target sample image to obtain the at least one pre-replacement template set.
Optionally, in one embodiment of the present application, the method further comprises: and when the number of the image templates of the target object is smaller than the preset number, determining the target sample image of the target object as the image template of the target object.
The upper limit on the number of image templates of the target object may be a preset number. If the number of image templates of the target object is smaller than the preset number, more image templates can still be added, and when the similarity between the target sample image and the first image template is within the first preset range, the target sample image is directly taken as an image template of the target object. If the number of image templates equals the preset number, no more image templates can be added; at this point, it must be determined whether the target sample image can replace some existing image template of the target object to better effect.
Step 104, calculating uniformity parameters of each of the at least one pre-replacement template set.
The uniformity parameter is used to indicate uniformity of image template distribution of the target object in the corresponding pre-replacement template set.
The uniformity parameter is used for indicating the uniformity of distribution of the image templates in the pre-replacement template set obtained after an image template of the target object is replaced by the target sample image. For example, the target object has n image templates, where n is an integer greater than 1. The target sample image is substituted for the i-th image template to obtain the i-th pre-replacement template set of the target object, and the uniformity parameter of the i-th pre-replacement template set is calculated as the uniformity parameter of the i-th template. Substituting the target sample image for each image template in turn yields the corresponding uniformity parameters, giving n pre-replacement template sets and n uniformity parameters in total.
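A minimal sketch of how the n pre-replacement template sets can be built without touching the stored templates (the function and variable names are illustrative, not from the patent):

```python
def pre_replacement_sets(templates, sample):
    """Return the n candidate sets: the i-th set is the template list with
    its i-th entry swapped for the new target sample image. These are
    pre-replacements used only to evaluate uniformity; nothing is committed."""
    return [templates[:i] + [sample] + templates[i + 1:]
            for i in range(len(templates))]
```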
Taking a rectangular coordinate system as an example, the longitudinal rotation angle of the target object (i.e., the angle by which the target object rotates about the x-axis) is represented by the x-axis, and the lateral rotation angle of the target object (i.e., the angle by which the target object rotates about the y-axis) is represented by the y-axis. Here, two specific examples are listed to illustrate how the uniformity parameter is calculated.
Alternatively, in a first example, an overall uniformity parameter may be calculated, replacing each image template of the target object with the target sample image and obtaining a corresponding uniformity parameter, including:
uniformly dividing the value range of the transverse rotation angle and the longitudinal rotation angle of the target object into at least one region; for each image template of the target object, after the target sample image replaces that image template, determining the number of image templates distributed in each region, and calculating the standard deviation/variance of the numbers of image templates over all regions as the uniformity parameter.
Optionally, in one embodiment of the present application, the method further comprises: for the image templates of the target object before replacement, the number of the image templates distributed in each region is determined, and the standard deviation/variance of the number of the image templates of all regions is calculated as a preset uniformity value.
Based on the first example, as shown in fig. 4, the annular shadow area is the value range of the transverse rotation angle and the longitudinal rotation angle of the target object, that is, the range where the R value lies in [5, 60), and the annular shadow area is equally divided into 8 sector areas (that is, the at least one region); of course, it may also be divided into 6 or 4 sector areas, which may be freely set according to the specific situation and is not limited in the present application. Here, taking a face as an example of the target object, in the 8 sector areas shown in fig. 4, a point in the positive x-axis direction indicates an angle of upward rotation of the face, a point in the negative x-axis direction indicates an angle of downward rotation of the face, a point in the positive y-axis direction indicates an angle of rightward rotation of the face, a point in the negative y-axis direction indicates an angle of leftward rotation of the face, a point in the first quadrant indicates an angle of upward-rightward rotation of the face, a point in the second quadrant indicates an angle of downward-rightward rotation of the face, a point in the third quadrant indicates an angle of downward-leftward rotation of the face, and a point in the fourth quadrant indicates an angle of upward-leftward rotation of the face. Of course, this is only an exemplary illustration; the positive direction may instead indicate an angle of downward or leftward rotation, and the negative direction an angle of upward or rightward rotation. The present application is not limited in this regard.
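One way to realize the 8-sector partition described above is to convert a template's (longitudinal, lateral) rotation-angle pair into polar form. The exact placement of the sector boundaries is an assumption, since the text only requires 8 equal sectors:

```python
import math

def radius(x_angle, y_angle):
    """R value of a template: distance of its angle pair from the origin."""
    return math.hypot(x_angle, y_angle)

def sector_index(x_angle, y_angle, num_sectors=8):
    """Index (0..num_sectors-1) of the equal sector containing the angle pair.
    Starting sector 0 at the positive x-axis is an illustrative choice."""
    theta = math.atan2(y_angle, x_angle) % (2 * math.pi)
    return int(theta // (2 * math.pi / num_sectors))
```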
Taking n image templates of the target object as an example, the preset uniformity value is the standard deviation calculated from the original image templates of the target object. That is, before the target sample image replaces any image template of the target object, the number of image templates of the target object distributed in each sector area is determined; the number of image templates in the t-th sector area is x_0t (t = 1, 2, ..., 8), and the average value of the numbers of image templates over the 8 sector areas is x̄_0 = (1/8) Σ_{t=1}^{8} x_0t. Then, the standard deviation std_0 of the numbers of image templates of the 8 sector areas is calculated according to equation 2 as the preset uniformity value:

std_0 = sqrt( (1/T) Σ_{t=1}^{T} (x_0t − x̄_0)² )    (equation 2)

where T is the number of the at least one region (here T = 8). For the i-th image template, the target sample image replaces the i-th image template (i.e., the target sample image is taken as the i-th image template), the number of image templates in each sector area is counted, the number in the t-th sector area being x_it (t = 1, 2, ..., 8), and the average value over the 8 sector areas is x̄_i = (1/8) Σ_{t=1}^{8} x_it. Then, the standard deviation std_i of the numbers of image templates of the 8 sector areas is calculated according to equation 3:

std_i = sqrt( (1/T) Σ_{t=1}^{T} (x_it − x̄_i)² )    (equation 3)

where T is the number of the at least one region. std_i (i.e., the overall uniformity parameter) represents the uniformity of distribution of the image templates of the target object after the target sample image replaces the i-th image template of the target object; the smaller std_i is, the better the uniformity of the image template distribution. If std_i is less than std_0, the image templates are distributed more uniformly after the i-th image template of the target object is replaced by the target sample image, and the target sample image can replace the original i-th image template as the i-th image template.
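The std_0 / std_i comparison above can be sketched as follows, with each template reduced to the index of the sector it falls in (helper names are illustrative):

```python
import math

def uniformity_std(sector_ids, num_sectors=8):
    """Standard deviation of per-sector template counts (equations 2 and 3).
    A smaller value means a more even angular spread of the templates."""
    counts = [sector_ids.count(t) for t in range(num_sectors)]
    mean = sum(counts) / num_sectors
    return math.sqrt(sum((c - mean) ** 2 for c in counts) / num_sectors)

def improves_uniformity(sector_ids, i, sample_sector, num_sectors=8):
    """True when std_i < std_0, i.e. swapping template i for the target
    sample image makes the distribution more uniform."""
    std0 = uniformity_std(sector_ids, num_sectors)
    swapped = sector_ids[:i] + [sample_sector] + sector_ids[i + 1:]
    return uniformity_std(swapped, num_sectors) < std0
```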
Alternatively, in a second example, a local uniformity parameter may be calculated, a standard deviation/variance of the number of image templates of at least one region is calculated as the uniformity parameter, including:
dividing the at least one region into a first portion and a second portion; calculating the standard deviation/variance of the numbers of image templates of all the areas in the first portion divided by a first factor, as a first parameter among the uniformity parameters; and calculating the standard deviation/variance of the numbers of image templates of all the areas in the second portion divided by a second factor, as a second parameter among the uniformity parameters.
It should be noted that all the areas in the first portion are the union of the areas belonging to the first portion, and all the areas in the first portion together with all the areas in the second portion make up the value range of the transverse rotation angle and the longitudinal rotation angle of the target object.
Of course, the at least one region may also be divided into more parts; for example, divide the at least one region into S parts, where S is an integer greater than 1, and calculate the standard deviation/variance of the numbers of image templates of all areas in the s-th part divided by the s-th factor as the s-th parameter among the uniformity parameters, obtaining S parameters, where s is an integer in [1, S].
Optionally, in one embodiment of the present application, the method further comprises:
for the image templates of the target object before replacement, determining the number of the image templates distributed in each area; calculating standard deviation/variance of the number of the image templates of all areas in the first part as a first uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the first parameter is smaller than the first uniformity value; and/or calculating the standard deviation/variance of the number of the image templates of all areas in the second part as a second uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the second parameter is smaller than the second uniformity value.
For example, based on the annular shadow region shown in fig. 4, the annular region is divided into two annular parts, a first part and a second part, with R = 25 as the boundary. Following the standard-deviation calculation of the first example, for the original image templates of the target object, before any image template of the target object is replaced with the target sample image, the standard deviations std_01 and std_02 of the numbers of image templates in the first part and the second part are calculated according to equation 4 and equation 5, respectively:

std_01 = sqrt( (1/T) Σ_{t=1}^{T} (x_01t − x̄_01)² )    (equation 4)

std_02 = sqrt( (1/T) Σ_{t=1}^{T} (x_02t − x̄_02)² )    (equation 5)

where x_01t (t = 1, 2, ..., 8) is the number of image templates in the t-th sector area of the first part and x̄_01 is the average of the numbers of image templates over the 8 sector areas of the first part; x_02t (t = 1, 2, ..., 8) is the number of image templates in the t-th sector area of the second part and x̄_02 is the average of the numbers of image templates over the 8 sector areas of the second part.
std_01 is taken as the first uniformity value, and std_02 as the second uniformity value.
For the i-th image template, the target sample image replaces the i-th image template (i.e., the target sample image is taken as the i-th image template), and, following the standard-deviation calculation of the first example, the standard deviations std_i1 and std_i2 of the numbers of image templates in the first part and the second part are calculated according to equation 6 and equation 7, respectively:

std_i1 = sqrt( (1/T) Σ_{t=1}^{T} (x_i1t − x̄_i1)² )    (equation 6)

std_i2 = sqrt( (1/T) Σ_{t=1}^{T} (x_i2t − x̄_i2)² )    (equation 7)

where x_i1t (t = 1, 2, ..., 8) is the number of image templates in the t-th sector area of the first part and x̄_i1 is the average of the numbers of image templates over the 8 sector areas of the first part; x_i2t (t = 1, 2, ..., 8) is the number of image templates in the t-th sector area of the second part and x̄_i2 is the average of the numbers of image templates over the 8 sector areas of the second part.
std_i1 is taken as the first parameter and std_i2 as the second parameter. Alternatively, the first parameter and the second parameter may be weighted, e.g., the first parameter P_i1 = std_i1 / w_i1, where w_i1 represents the first factor; the larger w_i1 is, the more important the uniformity of the first portion. Similarly, the second parameter P_i2 = std_i2 / w_i2, where w_i2 represents the second factor. The first parameter and the second parameter are the local uniformity parameters.
std_i1 represents the distribution uniformity of the image templates of the target object in the first part after the target sample image replaces the i-th image template of the target object; the smaller std_i1 is, the better the distribution uniformity of the image templates of the target object in the first part. Likewise, std_i2 represents the distribution uniformity of the image templates of the target object in the second part after the replacement; the smaller std_i2 is, the better the distribution uniformity in the second part. If std_i1 is less than std_01, the image templates of the target object are distributed more uniformly in the first part after the i-th image template of the target object is replaced by the target sample image, and the target sample image can replace the original i-th image template; and/or, if std_i2 is less than std_02, the image templates of the target object are distributed more uniformly in the second part after the i-th image template of the target object is replaced by the target sample image.
Of course, the local uniformity parameters divided by the factors can also be used for the determination. For example, if P_i1 is less than std_01, the image templates of the target object are distributed more uniformly in the first part after the i-th image template of the target object is replaced by the target sample image, and the target sample image can replace the original i-th image template; and/or, if P_i2 is less than std_02, the image templates of the target object are distributed more uniformly in the second part after the replacement.
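A sketch of the weighted local parameters P_i1 = std_i1 / w_i1 and P_i2 = std_i2 / w_i2. The factor values are placeholders — the patent does not fix them — and, following the text, a larger factor marks that part's uniformity as more important:

```python
import math

def uniformity_std(sector_ids, num_sectors=8):
    """Per-part standard deviation of sector counts (equations 4-7)."""
    counts = [sector_ids.count(t) for t in range(num_sectors)]
    mean = sum(counts) / num_sectors
    return math.sqrt(sum((c - mean) ** 2 for c in counts) / num_sectors)

def local_parameters(part1_sectors, part2_sectors, w1=1.0, w2=1.0):
    """Return (P1, P2): each part's standard deviation divided by its factor."""
    return (uniformity_std(part1_sectors) / w1,
            uniformity_std(part2_sectors) / w2)
```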
Combining the first example and the second example, the overall uniformity parameter may be judged first and the local uniformity parameters afterwards: if, after the i-th image template of the target object is replaced by the target sample image, the overall uniformity parameter is equal to the preset uniformity value, it is further judged whether a local uniformity parameter is smaller than the first uniformity value or the second uniformity value. If the overall uniformity parameter is smaller than the preset uniformity value, the target sample image can directly replace the image template; if the overall uniformity parameter is greater than the preset uniformity value, the target sample image cannot be used as an image template of the target object.
Step 105, if the uniformity parameter of the pre-replacement template set obtained after the target sample image replaces the second image template is smaller than the preset uniformity value, taking the target sample image as an image template of the target object and replacing the second image template.
It should be noted that there are a plurality of image templates of the target object, the second image template belongs to the at least one image template of the target object, the second image template represents an image template that can be replaced, and there may be a plurality of second image templates. Combined with the description in step 104, there may be a plurality of image templates satisfying std_i < std_0. For example, among the n image templates of the target object, the 1st image template and the 2nd image template respectively satisfy std_1 < std_0 and std_2 < std_0; that is, the image templates of the target object are distributed more uniformly both after the 1st image template is replaced by the target sample image and after the 2nd image template is replaced by the target sample image, so both the 1st and the 2nd image template can serve as the second image template. At this time, the 1st and the 2nd image template are added to a replacement queue, and one image template is selected from the replacement queue to be replaced.
Here, three specific examples are listed to illustrate how the target sample image is replaced as an image template of the target object.
Optionally, in a first example, the method further comprises: when there is a uniformity parameter smaller than the preset uniformity value among the uniformity parameters of the target object, replacing, among the image templates of the target object, the image template whose pre-replacement by the target sample image yields the smallest uniformity parameter.
For example, in the replacement queue, std_1 < std_0 and std_2 < std_0, and at the same time std_2 < std_1; that is, after the target sample image replaces the 2nd image template, the image templates are distributed more uniformly than after it replaces the 1st image template, so the 2nd image template is replaced with the target sample image. Of course, using the overall uniformity here is merely illustrative; the local uniformity may also be used for the determination, which is not limited in this application.
Optionally, in a second example, the method further comprises: among the image templates whose uniformity parameters are smaller than the preset uniformity value, replacing with the target sample image the image template with the smallest number of matching hits for the target object during image recognition.
Optionally, in a third example, the method further comprises: among the image templates whose uniformity parameters after pre-replacement are smaller than the preset uniformity value, replacing with the target sample image the image template that has served as an image template of the target object for the longest time.
For example, in the replacement queue, when the time for which the 1 st image template is the image template of the target object is 5 days and the time for which the 2 nd image template is the image template of the target object is 1 day, the 1 st image template is replaced with the target sample image.
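The three selection strategies above can be combined into a single tie-breaking rule; chaining them lexicographically (uniformity, then hit count, then age) is an assumption — the patent presents them as alternatives:

```python
def choose_replacement(candidates):
    """candidates: list of dicts with keys 'index', 'uniformity' (the
    post-replacement uniformity parameter), 'hits' (matching hits), and
    'days' (time spent as a template). Prefer the smallest uniformity
    parameter, then the fewest matching hits, then the longest-serving."""
    best = min(candidates,
               key=lambda c: (c["uniformity"], c["hits"], -c["days"]))
    return best["index"]
```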
Optionally, in one embodiment of the present application, the method further comprises: converting the two-dimensional image of the target sample image into three-dimensional point cloud information, and calculating, according to the three-dimensional point cloud information, the transverse rotation angle and the longitudinal rotation angle of the target object in the target sample image through the external parameter rotation matrix.
The three-dimensional point cloud information is used for indicating position coordinates and depth information of each pixel point in the target sample image, and the three-dimensional point cloud information can represent the spatial position of each pixel point. The external parameter rotation matrix is a parameter of a shooting device for shooting a target sample image, and the transverse rotation angle and the longitudinal rotation angle of a target object in the target sample image can be calculated according to the three-dimensional point cloud information and the external parameter rotation matrix of the shooting device. Similarly, the horizontal rotation angle and the vertical rotation angle of the target object in the image corresponding to each image template of the target object can be calculated, and the description is omitted here.
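As an illustrative sketch only — the patent does not specify the pose-estimation algorithm — one could rotate an estimated face-plane normal by the camera's extrinsic rotation matrix and read the lateral/longitudinal angles off the result. All names, the normal-based approach, and the sign conventions here are assumptions:

```python
import math

def rotation_angles(normal, r_ext):
    """Lateral (about y) and longitudinal (about x) rotation angles, in
    degrees, of a face whose plane normal is `normal`, after applying the
    3x3 extrinsic rotation matrix `r_ext` (row-major nested lists)."""
    # rotate the normal into the camera's reference frame
    n = [sum(r_ext[i][j] * normal[j] for j in range(3)) for i in range(3)]
    lateral = math.degrees(math.atan2(n[0], n[2]))        # left/right turn
    longitudinal = math.degrees(math.atan2(-n[1], n[2]))  # up/down tilt
    return lateral, longitudinal
```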
Optionally, before step 101, a plurality of acquired images may be determined, and a target sample image of the target object is selected from the plurality of images, for example, in an embodiment of the present application, the method further includes: calculating a first similarity average value of each image in the at least one image and an image template of the target object; and determining an image of which the first similarity average value is within a second preset range as a target sample image of the target object.
For example, the similarity value range may be [0, 1], where a similarity of 0 indicates that the images are completely different and a similarity of 1 indicates that they are completely the same. In combination with the example corresponding to fig. 2, when the similarity value is greater than 0.8 the image can be determined to be a sample image of the target object, so the second preset range may be (0.8, 1]; of course, it may also be any other suitable range, which is not described in detail herein.
According to the method of the present application, the number of image templates of the target object can be increased continuously during identification of the target object until it reaches the upper limit. After the upper limit is reached, image templates can still be replaced to achieve better uniformity, improving identification accuracy. Furthermore, because the image templates are updated continuously, the image templates of the target object stay current, and after updating, the angular diversity of the image templates is greatly improved. For the same sample set, after the image templates are updated using the method of the present application, the false rejection rate (FRR) is significantly reduced compared with before updating, and the identification rate is significantly improved.
The updating of the image template may occur while the target object is being registered (for example, the user performs face registration), or while the target object is being identified during identity authentication (for example, the user performs identity verification). Two specific application scenarios are described here; of course, this is only an exemplary illustration and does not limit the present application.
Optionally, in the first application scenario, the target object may be authenticated, and the method further comprises: after image acquisition is performed on the target object to obtain a target sample image of the target object, matching the target sample image with at least one image template of the target object; after the target sample image is successfully matched with any image template of the target object, determining that the authentication of the target object is successful. When the identity of the target object is verified, image acquisition is performed on the target object to obtain a target sample image, the target sample image is then matched, and if it matches a certain image template of the target object, the verification succeeds. Of course, verification may also fail: for example, when an object to be verified undergoes identity verification, image acquisition is performed on it to obtain an acquired image, which is matched against the image templates of the target object; if all image templates of the target object fail to match the acquired image of the object to be verified, the object to be verified is not the target object, and the identity verification fails. Of course, this description is intended to be illustrative only and does not limit the present application.
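The verification flow of this scenario can be sketched as follows; the matcher and the threshold are placeholders, since the patent does not specify a particular matching algorithm:

```python
def authenticate(sample, templates, match_score, threshold=0.8):
    """Authentication succeeds when the target sample image matches any
    stored image template of the target object (score above threshold)."""
    return any(match_score(sample, t) > threshold for t in templates)
```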
Optionally, in the second application scenario, the target object may be registered, and the method further comprises: after image acquisition is performed on the target object to obtain a target sample image of the target object, determining the target sample image as an image template of the target object, storing the image template of the target object, and completing the registration of the target object. Here, during the registration of the target object, an image of the target object is acquired to obtain a target sample image, and the target sample image is determined as an image template of the target object. Of course, this is only illustrative: if multiple sample images of the target object have already been acquired during registration and acquisition continues, it is necessary to determine, according to steps 102-105, whether a newly acquired sample image can replace an existing image template.
Here, the first application scenario and the second application scenario are specific application scenarios of the method for updating the image template of the present application, and the updating of the image template may be performed during the authentication process of the target object (for example, face recognition), or may be performed during the registration process of the target object (for example, face registration), and the image template is automatically replaced during the identification or registration process, so as to improve the uniformity of the distribution of the image template.
In the process of identifying the target object, the similarity between the target sample image of the target object and the first image template is determined, the target sample image is determined to be a standard learning sample image when the similarity is within the first preset range, repeated image templates in the image templates of the target object are reduced, and whether the target sample image can replace the image templates in the target object or not is determined according to the corresponding uniformity parameters obtained by respectively replacing each image template of the target object by the target sample image, so that the uniform distribution of the image templates in the target object is ensured, the target object can be identified in all angles, and the accuracy and the identification rate of image identification are improved.
Second embodiment,
Based on the image template updating method shown in fig. 1, fig. 5 is a logic block diagram of an image template updating method according to an embodiment of the present application, and as shown in fig. 5, the method includes the following steps:
step 501, the current target sample image is identified by using the image template of the target object.
If the identification is successful, step 502 is performed indicating that the target sample image is a sample image of the target object, and if the identification is unsuccessful, the process ends. In the corresponding embodiment of fig. 5, the target object may be a human face.
When the current target sample image is acquired, image acquisition can be performed using an infrared camera device, a speckle projector, and a light supplementing device. In the identification process, when an infrared image is shot by the infrared camera device, the infrared camera device and the light supplementing device are turned on and the speckle projector is turned off, and the infrared image is acquired; when a speckle image is collected with the speckle projector, the speckle projector is turned on and the infrared camera device and the light supplementing device are turned off, and the speckle image is collected. Depth information can be obtained from the collected speckle image, and the two-dimensional infrared image is converted into three-dimensional point cloud information. The angle of the current point cloud is then calculated through the camera's external parameter rotation matrix.
Step 502, adding 1 to the number of times of matching hits of the image template successfully identified.
Step 503, determining whether the target sample image permits learning.
If the similarity between the target sample image and the face image template of the target object (in this embodiment, the first image template is the face image template) is within a first preset range, the target sample image permits learning.
Step 504, for each image template of the target object, calculating the overall uniformity parameter obtained when that image template is replaced by the target sample image.
Step 505, judging whether the overall uniformity is improved.
Whether the overall uniformity improves may be determined based on the first example in step 104: if std_i is less than std_0, the overall uniformity is better after the i-th image template of the target object is replaced by the target sample image. If the overall uniformity is not improved, step 506 is performed; if the overall uniformity is improved, step 508 is performed.
Step 506, for each image template of the target object, calculating the local uniformity parameters obtained when that image template is replaced by the target sample image.
Step 507, judging whether the local uniformity parameter is improved.
Whether the local uniformity improves may be determined based on the second example in step 104: if P_i1 is less than std_01 and/or P_i2 is less than std_02, the local uniformity is better after the i-th image template of the target object is replaced by the target sample image. If the local uniformity is not improved, the process ends; if the local uniformity is improved, step 508 is performed.
Step 508, determining an image template with the smallest number of matching hits among at least one replaceable image template of the target object.
Step 509, among the at least one image template of the target object with the smallest number of matching hits, taking the image template with the longest use time as the second image template.
It should be noted that the at least one replaceable image template forms a replacement queue. If there is only one image template in the replacement queue, it is directly replaced by the target sample image; if there are multiple image templates in the replacement queue, the selection is made according to step 508; and if, after the selection according to step 508, two image templates in the replacement queue have the same, minimum number of matching hits, the selection between those two image templates is further made according to step 509.
Step 510, taking the target sample image as an image template of the target object and replacing the second image template.
In the process of identifying the target object, the similarity between the target sample image of the target object and the first image template is determined, the target sample image is determined to be a standard learning sample image when the similarity is within the first preset range, repeated image templates in the image templates of the target object are reduced, and whether the target sample image can replace the image templates in the target object or not is determined according to the corresponding uniformity parameters obtained by respectively replacing each image template of the target object by the target sample image, so that the uniform distribution of the image templates in the target object is ensured, the target object can be identified in all angles, and the accuracy and the identification rate of image identification are improved.
Third embodiment,
Fig. 6 is a block diagram of an image template updating apparatus according to an embodiment of the present application, where the image template updating apparatus 60 includes: a permit learning module 601, a uniformity parameter module 602, and an image template management module 603;
the permission learning module 601 is configured to determine that a similarity between a target sample image of a target object and a first image template of the target object is within a first preset range;
the uniformity parameter module 602 is configured to, when the number of image templates of the target object is equal to a preset number, replace each image template of the target object with a target sample image respectively and obtain a corresponding uniformity parameter, where the uniformity parameter is used to indicate uniformity of distribution of the image templates of the target object after the target sample image replaces the image templates of the target object;
the image template management module 603 is configured to take the target sample image as the image template of the target object and replace the second image template if the uniformity parameter obtained after the target object replaces the second image template is smaller than the preset uniformity value.
Fourth embodiment,
Fig. 7 is a schematic hardware structure of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic device 70 includes: at least one processor 701, a memory 702, and a bus 703, the at least one processor 701 and the memory 702 being connected to each other by the bus and completing communication;
The memory 702, as a non-volatile computer-readable storage medium, is used for storing non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the image template updating method in the embodiments of the present application. The at least one processor 701 executes various functional applications and data processing of the server by running the non-volatile software programs, instructions, and modules stored in the memory 702, thereby implementing the image template updating method of the above method embodiments.
Memory 702 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created from the use of (determining capacitive screen touch location means), etc. In addition, the memory 702 may include high-speed random access memory 702, and may also include non-volatile memory 702, such as at least one disk memory 702 piece, flash memory device, or other non-volatile solid-state memory 702 piece. In some embodiments, the memory 702 optionally includes memory 702 remotely located with respect to the at least one processor 701, the remote memory 702 being connectable (determining capacitive screen touch location means) via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The memory 702 stores at least one program, and the at least one processor 701 implements the methods described in the embodiments of the present application when executing the program stored on the memory 702.
Fifth embodiment,
The present embodiment provides a storage medium having a computer program stored thereon, which, when executed by at least one processor, implements the methods described in the embodiments of the present application.
The product can execute the method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method. Technical details not described in detail in this embodiment may be found in the methods provided in the embodiments of the present application.
The electronic device of the embodiments of the present application exists in a variety of forms, including but not limited to:
(1) Mobile communication devices, which are characterized by mobile communication functionality and are aimed at providing voice and data communication. Such terminals include smart phones (e.g., iPhone), multimedia phones, feature phones, and low-end phones, among others.
(2) Ultra-mobile personal computer devices, which belong to the category of personal computers, have computing and processing functions, and generally also have mobile internet access. Such terminals include PDA, MID, and UMPC devices, such as the iPad.
(3) Portable entertainment devices, which can display and play multimedia content. Such devices include audio and video players (e.g., iPod), handheld game consoles, e-book readers, smart toys, and portable car navigation devices.
(4) Servers, i.e., devices providing computing services. A server is composed of a processor, a hard disk, memory, a system bus, etc.; its architecture is similar to that of a general-purpose computer, but because highly reliable services must be provided, the requirements on processing capability, stability, reliability, security, scalability, manageability, etc. are high.
(5) Other electronic devices with data interaction functions.
Thus, particular embodiments of the present subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may be advantageous.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). However, as technology has developed, many improvements to method flows today can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a single PLD without requiring a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays, instead of manually manufacturing integrated circuit chips, such programming is mostly implemented with "logic compiler" software, which is similar to the software compilers used in program development: the source code to be compiled must be written in a specific programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing a given logic method flow can be readily obtained by simply programming the method flow in one of the hardware description languages described above and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of at least one processor, or at least one processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be regarded as a hardware component, and the means included therein for performing various functions may also be regarded as structures within the hardware component. Or even the means for performing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function. Of course, when implementing the present application, the functions of the units may be implemented in software and/or hardware.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on at least one computer-usable storage medium (including but not limited to disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to at least one processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the at least one processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes at least one processor (CPU), input/output interfaces, a network interface, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on at least one computer-usable storage medium (including but not limited to disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are connected through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant parts, reference may be made to the description of the method embodiments.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (20)

1. A method of updating an image template, comprising:
image acquisition is carried out on a target object to obtain a target sample image of the target object;
determining that the similarity between the target sample image and a first image template of the target object is within a first preset range, wherein the target object is provided with at least one image template, and the first image template belongs to the at least one image template of the target object;
pre-replacing each image template of the target object with the target sample image, respectively, to obtain at least one pre-replacement template set;
calculating a uniformity parameter of each of the at least one pre-replacement template set, the uniformity parameter being used to indicate uniformity of image template distribution of the target object in the corresponding pre-replacement template set;
and if the uniformity parameter of the pre-replacement template set obtained after the target object replaces the second image template is smaller than a preset uniformity value, taking the target sample image as the image template of the target object and replacing the second image template, wherein the second image template belongs to at least one image template of the target object.
2. The method of claim 1, wherein pre-replacing each image template of the target object with the target sample image and obtaining the corresponding uniformity parameter comprises:
uniformly dividing the value ranges of the transverse rotation angle and the longitudinal rotation angle of the target object into at least one region;
for each image template of the target object, after the target sample image is substituted for that image template, determining the number of image templates distributed in each region, and calculating the standard deviation/variance of the numbers of image templates of all regions as the uniformity parameter.
3. The method according to claim 2, wherein the method further comprises:
for the image templates of the target object before replacement, determining the number of the image templates distributed in each area, and calculating the standard deviation/variance of the number of the image templates of all areas as the preset uniformity value.
4. The method according to claim 2, wherein calculating the standard deviation/variance of the number of image templates of the at least one region as the uniformity parameter comprises:
dividing the at least one region into a first portion and a second portion;
calculating the numerical value obtained by dividing the standard deviation/variance of the number of the image templates of all the areas in the first part by a first factor as a first parameter in the uniformity parameters;
and calculating the numerical value obtained by dividing the standard deviation/variance of the number of the image templates of all the areas in the second part by the first factor as a second parameter in the uniformity parameters.
5. The method according to claim 4, wherein the method further comprises:
for the image templates of the target object before replacement, determining the number of the image templates distributed in each area;
calculating standard deviation/variance of the number of image templates of all areas in the first part as a first uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the first parameter is smaller than the first uniformity value;
and/or calculating standard deviation/variance of the number of image templates of all areas in the second part as a second uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the second parameter is smaller than the second uniformity value.
6. The method according to claim 1, wherein the method further comprises:
calculating a first similarity average value between each of at least one image and the image templates of the target object; and determining an image whose first similarity average value is within a second preset range as the target sample image of the target object.
7. The method according to claim 1, wherein the method further comprises:
and when the similarity is larger than a first threshold value and smaller than or equal to a second threshold value, determining that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range.
8. The method of claim 7, wherein the method further comprises:
and calculating a second similarity average value of each sample image in the sample image set of the non-target object and the image template of the target object, and determining the second similarity average value with the largest value as the first threshold value.
9. The method according to claim 1, wherein the method further comprises:
determining a transverse rotation angle and a longitudinal rotation angle of a target object in the target sample image; and when the square sum of the transverse rotation angle and the longitudinal rotation angle is in a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is in the first preset range.
10. The method of claim 1, wherein pre-replacing each image template of the target object with the target sample image, respectively, results in at least one pre-replacement template set, comprising:
when it is determined that the number of the image templates of the target object is equal to the preset number, pre-replacing each image template of the target object with the target sample image to obtain the at least one pre-replacement template set.
11. The method according to claim 1, wherein the method further comprises:
and when the number of the image templates of the target object is smaller than a preset number, determining the target sample image of the target object as the image template of the target object.
12. The method according to claim 1, wherein the method further comprises:
and when a uniformity parameter smaller than a preset uniformity value exists among the uniformity parameters of the target object, replacing, with the target sample image, the image template of the target object whose replacement yields the smallest uniformity parameter.
13. The method according to claim 1, wherein the method further comprises:
and replacing, with the target sample image, the image template that has the smallest number of hits matching the target object in the image recognition process, among the image templates whose replacement yields a uniformity parameter smaller than the preset uniformity value.
14. The method according to claim 1, wherein the method further comprises:
and replacing, with the target sample image, the image template that has been in service for the longest time, among the image templates whose replacement yields a uniformity parameter smaller than the preset uniformity value.
15. The method according to claim 1, wherein the method further comprises:
matching the target sample image with at least one image template of the target object;
and after the target sample image is successfully matched with any one image template of the target object, determining that the identity verification of the target object is successful.
16. The method according to claim 1, wherein the method further comprises:
and determining the target sample image as an image template of the target object, storing the image template of the target object, and finishing registration of the target object.
17. The method according to any one of claims 1-16, further comprising:
converting the two-dimensional image of the target sample image into three-dimensional point cloud information, and calculating the transverse rotation angle and the longitudinal rotation angle of the target object in the target sample image through an external reference rotation matrix according to the three-dimensional point cloud information.
18. An image template updating apparatus, comprising: a permission learning module, a uniformity parameter module and an image template management module;
the permission learning module is configured to determine that a similarity between a target sample image of a target object and a first image template of the target object is within a first preset range;
the uniformity parameter module is configured to, when the number of image templates of the target object is equal to a preset number, replace each image template of the target object in turn with the target sample image and obtain a corresponding uniformity parameter, where the uniformity parameter is used to indicate the uniformity of distribution of the image templates of the target object after the target sample image replaces an image template of the target object;
and the image template management module is configured to, if the uniformity parameter obtained after the target sample image replaces a second image template is smaller than a preset uniformity value, take the target sample image as an image template of the target object and replace the second image template.
19. An electronic device, comprising: the system comprises at least one processor, a memory and an image acquisition module, wherein the at least one processor, the memory and the image acquisition module are mutually connected through a bus and complete communication; the image acquisition module comprises an infrared camera and a speckle projector;
the memory having at least one program stored thereon, the at least one processor implementing the method of any of claims 1-17 when executing the program stored on the memory.
20. A storage medium, comprising: the storage medium having stored thereon a computer program which, when executed by at least one processor, implements the method according to any of claims 1-17.
CN201980002050.4A 2019-09-27 2019-09-27 Image template updating method, device and storage medium Active CN112912889B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/108656 WO2021056450A1 (en) 2019-09-27 2019-09-27 Method for updating image template, device, and storage medium

Publications (2)

Publication Number Publication Date
CN112912889A CN112912889A (en) 2021-06-04
CN112912889B true CN112912889B (en) 2024-03-26

Family

ID=75165050

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980002050.4A Active CN112912889B (en) 2019-09-27 2019-09-27 Image template updating method, device and storage medium

Country Status (2)

Country Link
CN (1) CN112912889B (en)
WO (1) WO2021056450A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139612A (en) * 2021-05-07 2021-07-20 上海商汤临港智能科技有限公司 Image classification method, training method of classification network and related products
CN114401440A (en) * 2021-12-14 2022-04-26 北京达佳互联信息技术有限公司 Video clip and clip model generation method, device, apparatus, program, and medium
CN115937629B (en) * 2022-12-02 2023-08-29 北京小米移动软件有限公司 Template image updating method, updating device, readable storage medium and chip

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106021606A (en) * 2016-06-21 2016-10-12 广东欧珀移动通信有限公司 Fingerprint template updating method and terminal equipment
CN106372609A (en) * 2016-09-05 2017-02-01 广东欧珀移动通信有限公司 Fingerprint template update method, fingerprint template update device and terminal equipment
KR20170046436A (en) * 2015-10-21 2017-05-02 삼성전자주식회사 Biometric authentication method and biometrics authentication apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016206081A1 (en) * 2015-06-26 2016-12-29 宇龙计算机通信科技(深圳)有限公司 Method and device for updating fingerprint template
US10769255B2 (en) * 2015-11-11 2020-09-08 Samsung Electronics Co., Ltd. Methods and apparatuses for adaptively updating enrollment database for user authentication

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170046436A (en) * 2015-10-21 2017-05-02 삼성전자주식회사 Biometric authentication method and biometrics authentication apparatus
CN106021606A (en) * 2016-06-21 2016-10-12 广东欧珀移动通信有限公司 Fingerprint template updating method and terminal equipment
CN106372609A (en) * 2016-09-05 2017-02-01 广东欧珀移动通信有限公司 Fingerprint template update method, fingerprint template update device and terminal equipment

Also Published As

Publication number Publication date
CN112912889A (en) 2021-06-04
WO2021056450A1 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
CN112912889B (en) Image template updating method, device and storage medium
CN110363091B (en) Face recognition method, device and equipment under side face condition and storage medium
CN112052789A (en) Face recognition method and device, electronic equipment and storage medium
KR20190072563A (en) Method and apparatus for detecting facial live varnish, and electronic device
EP3113114A1 (en) Image processing method and device
KR102316230B1 (en) Image processing method and device
CN111523431B (en) Face recognition method, device and equipment
CN108875487B (en) Training of pedestrian re-recognition network and pedestrian re-recognition based on training
CN109740415A (en) Vehicle attribute recognition methods and Related product
CN108875492B (en) Face detection and key point positioning method, device, system and storage medium
CN110555428B (en) Pedestrian re-identification method, device, server and storage medium
CN110414550B (en) Training method, device and system of face recognition model and computer readable medium
TW202006630A (en) Payment method, apparatus, and system
CN105005593A (en) Scenario identification method and apparatus for multi-user shared device
US20140232748A1 (en) Device, method and computer readable recording medium for operating the same
CN111382738A (en) Reading method and device of pointer type instrument
CN110335139A (en) Appraisal procedure, device, equipment and readable storage medium storing program for executing based on similarity
KR20220098312A (en) Method, apparatus, device and recording medium for detecting related objects in an image
CN111160251B (en) Living body identification method and device
CN111199169A (en) Image processing method and device
CN112965592A (en) Equipment interaction method, device and system
CN112991441A (en) Camera positioning method and device, electronic equipment and storage medium
CN110909817A (en) Distributed clustering method and system, processor, electronic device and storage medium
US10551195B2 (en) Portable device with improved sensor position change detection
CN110322139B (en) Policy recommendation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant