CN112912889A - Image template updating method, device and storage medium - Google Patents


Info

Publication number
CN112912889A
CN112912889A (application number CN201980002050.4A)
Authority
CN
China
Prior art keywords
image
target object
template
uniformity
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201980002050.4A
Other languages
Chinese (zh)
Other versions
CN112912889B (en)
Inventor
肖梦佳
王波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd filed Critical Shenzhen Goodix Technology Co Ltd
Publication of CN112912889A publication Critical patent/CN112912889A/en
Application granted granted Critical
Publication of CN112912889B publication Critical patent/CN112912889B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)

Abstract

An image template updating method, a device, and a storage medium are provided. The image template updating method includes the following steps: acquiring an image of a target object to obtain a target sample image of the target object (101); determining that a similarity between the target sample image of the target object and a first image template of the target object is within a first preset range (102); pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replacement template set (103); calculating a uniformity parameter for each of the at least one pre-replacement template set (104); and if the uniformity parameter of the pre-replacement template set obtained after replacing a second image template of the target object is smaller than a preset uniformity value, taking the target sample image as an image template of the target object and replacing the second image template (105). The method ensures that the image templates of the target object are uniformly distributed, improving the accuracy and recognition rate of image recognition.

Description

Image template updating method, device and storage medium Technical Field
The embodiments of the present application relate to the technical field of image recognition, and in particular to an image template updating method, an image template updating device, and a storage medium.
Background
As image recognition technology has matured, its applications have become increasingly widespread. Face recognition, for example, is now widely applied in security inspection, identity authentication, electronic payment, and other scenarios. In the process of face recognition, a face image is first sampled and registered, and an image template of the face image is established according to the features of the sampled, registered face image; in subsequent applications, whether the currently recognized person is a registered user is determined according to that image template. In practical applications, however, the angle at which the registration images are acquired is random, so the recognition rate is low when the face image to be recognized is recognized.
Disclosure of Invention
In view of the above, an object of the present application is to provide an image template updating method, a device, and a storage medium that overcome the above-mentioned drawbacks.
In a first aspect, an embodiment of the present application provides an image template updating method, including:
acquiring an image of a target object to obtain a target sample image of the target object;
determining that the similarity between a target sample image and a first image template of a target object is within a first preset range, wherein the target object has at least one image template, and the first image template belongs to the at least one image template of the target object;
respectively pre-replacing each image template of the target object by using the target sample image to obtain at least one pre-replaced template set;
calculating a uniformity parameter of each pre-replacement template set in at least one pre-replacement template set, wherein the uniformity parameter is used for indicating the uniformity of the image template distribution of the target object in the corresponding pre-replacement template set;
and if the uniformity parameter of the pre-replacement template set obtained after the target object replaces the second image template is smaller than the preset uniformity value, taking the target sample image as the image template of the target object and replacing the second image template, wherein the second image template belongs to at least one image template of the target object.
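The five steps above can be sketched as a short Python routine. Here `similarity` and `uniformity` are hypothetical placeholder callables standing in for the similarity measure and the uniformity parameter defined later in this disclosure; the sketch follows the optional embodiment that replaces the template whose replacement yields the minimum uniformity parameter, and is an illustration, not the claimed implementation.

```python
def try_update_templates(sample, templates, sim_lo, sim_hi,
                         preset_uniformity, similarity, uniformity):
    """Replace one template with `sample` if doing so makes the set more
    uniform; otherwise return the template list unchanged."""
    # Step 102: the sample must be similar enough to the first template to
    # belong to the target object, but not a near-duplicate of it.
    s = similarity(sample, templates[0])
    if not (sim_lo < s <= sim_hi):
        return templates

    # Steps 103-104: pre-replace each template in turn and score the
    # uniformity of the resulting pre-replacement template set.
    best_i, best_u = None, preset_uniformity
    for i in range(len(templates)):
        pre_replaced = templates[:i] + [sample] + templates[i + 1:]
        u = uniformity(pre_replaced)  # smaller value = more uniform
        if u < best_u:
            best_i, best_u = i, u

    # Step 105: commit only if some replacement beats the preset value.
    if best_i is not None:
        return templates[:best_i] + [sample] + templates[best_i + 1:]
    return templates
```

The routine never mutates its input list, so a rejected sample leaves the registered templates untouched.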
Optionally, in an embodiment of the present application, replacing each image template of the target object with the target sample image and obtaining the corresponding uniformity parameter includes:
uniformly dividing at least one region in the value ranges of the transverse rotation angle and the longitudinal rotation angle of the target object; and for each image template of the target object, replacing the image template of the target object with the target sample image, determining the number of the image templates distributed in each area, and calculating the standard deviation/variance of the number of the image templates of all the areas as uniformity parameters.
Optionally, in an embodiment of the present application, the method further includes: for the image templates of the target object before replacement, the number of image templates distributed in each region is determined, and the standard deviation/variance of the number of image templates of all regions is calculated as a preset uniformity value.
Optionally, in an embodiment of the present application, calculating a standard deviation/variance of the number of image templates of the at least one region as the uniformity parameter includes:
dividing at least one region into a first portion and a second portion; calculating a numerical value obtained by dividing the standard deviation/variance of the number of the image templates of all the areas in the first part by a first factor to serve as a first parameter in the uniformity parameters; and calculating the value of the standard deviation/variance of the number of the image templates of all the areas in the second part divided by the first factor as a second parameter in the uniformity parameters.
Optionally, in an embodiment of the present application, the method further includes:
determining the number of image templates distributed in each area for the image templates of the target object before replacement; calculating the standard deviation/variance of the number of the image templates in all the areas in the first part as a first uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the first parameter is smaller than the first uniformity value; and/or calculating the standard deviation/variance of the number of the image templates of all the areas in the second part as a second uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the second parameter is smaller than the second uniformity value.
Optionally, in an embodiment of the present application, the method further includes: calculating a first similarity average value of each image in the at least one image and the image template of the target object; and determining the image with the first similarity average value within a second preset range as a target sample image of the target object.
Optionally, in an embodiment of the present application, the method further includes: and when the similarity is greater than a first threshold and less than or equal to a second threshold, determining that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
Optionally, in an embodiment of the present application, the method further includes: and calculating a second similarity average value of each sample image in the sample image set of the non-target object and the image template of the target object, and determining the second similarity average value with the maximum value as a first threshold value.
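The determination of the first threshold described above can be sketched as follows; `similarity` is a placeholder for the neural-network similarity measure used in the embodiment, and the function simply takes the maximum of the per-sample mean similarities (formula 1).

```python
def first_threshold(non_target_samples, target_templates, similarity):
    """First threshold = largest second similarity average value between any
    non-target sample image and the target object's image templates."""
    def mean_sim(sample):
        # Second similarity average value of one sample image (formula 1).
        sims = [similarity(sample, t) for t in target_templates]
        return sum(sims) / len(sims)
    return max(mean_sim(s) for s in non_target_samples)
```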
Optionally, in an embodiment of the present application, the method further includes: determining a transverse rotation angle and a longitudinal rotation angle of a target object in a target sample image; and when the sum of the squares of the transverse rotation angle and the longitudinal rotation angle is within a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
Optionally, in an embodiment of the present application, the pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replacement template set includes:
and when the number of the image templates of the target object is determined to be equal to the preset number, pre-replacing each image template of the target object by using the target sample image to obtain at least one pre-replaced template set.
Optionally, in an embodiment of the present application, the method further includes: and when the number of the image templates of the target object is less than the preset number, determining the target sample image of the target object as the image template of the target object.
Optionally, in an embodiment of the present application, the method further includes: when at least one of the uniformity parameters of the target object is smaller than the preset uniformity value, replacing, among the image templates of the target object, the image template whose replacement yields the minimum uniformity parameter with the target sample image.
Optionally, in an embodiment of the present application, the method further includes: among the image templates whose replacement yields a uniformity parameter smaller than the preset uniformity value, replacing the image template with the fewest match hits during image recognition with the target sample image.
Optionally, in an embodiment of the present application, the method further includes: among the image templates whose replacement yields a uniformity parameter smaller than the preset uniformity value, replacing the image template with the longest service time with the target sample image.
Optionally, in an embodiment of the present application, the method further includes: matching the target sample image with at least one image template of the target object; and after the target sample image is successfully matched with any image template of the target object, determining that the identity verification of the target object is successful.
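The identity verification step above reduces to a match against any one template. A minimal sketch, with `similarity` and `match_threshold` as assumed placeholders:

```python
def authenticate(sample, templates, similarity, match_threshold):
    """Identity verification succeeds if the target sample image matches
    any image template of the target object."""
    return any(similarity(sample, t) >= match_threshold for t in templates)
```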
Optionally, in an embodiment of the present application, the method further includes: and determining the target sample image as an image template of the target object, storing the image template of the target object, and finishing the registration of the target object.
Optionally, in an embodiment of the present application, the method further includes: and converting the two-dimensional image of the target sample image into three-dimensional point cloud information, and calculating the transverse rotation angle and the longitudinal rotation angle of the target object in the target sample image through the external reference rotation matrix according to the three-dimensional point cloud information.
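Once the extrinsic rotation matrix has been estimated from the three-dimensional point cloud, the two rotation angles can be read off the matrix. The fragment below assumes a Z-Y-X Euler decomposition; the embodiment does not specify a convention, so this is purely an illustrative assumption.

```python
import math

def rotation_angles_from_matrix(R):
    """Read the longitudinal (about x) and lateral (about y) rotation
    angles, in degrees, out of a 3x3 rotation matrix R (nested lists).
    Assumes a Z-Y-X Euler decomposition (illustrative assumption)."""
    # Clamp guards against tiny numerical overshoot outside [-1, 1].
    lateral = math.degrees(-math.asin(max(-1.0, min(1.0, R[2][0]))))
    longitudinal = math.degrees(math.atan2(R[2][1], R[2][2]))
    return longitudinal, lateral
```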
In a second aspect, an apparatus for updating an image template includes: a learning permission module, a uniformity parameter module, and an image template management module;
the learning permission module is configured to determine that the similarity between a target sample image of a target object and a first image template of the target object is within a first preset range;
the uniformity parameter module is configured to, when the number of image templates of the target object equals a preset number, replace each image template of the target object with the target sample image and obtain the corresponding uniformity parameter, the uniformity parameter indicating the uniformity of the distribution of the image templates of the target object after the replacement;
and the image template management module is configured to, if the uniformity parameter obtained after replacing a second image template of the target object is smaller than the preset uniformity value, take the target sample image as an image template of the target object and replace the second image template.
In a third aspect, an image recognition apparatus includes: at least one processor, a memory, and an image acquisition module, which are connected to one another through a bus and communicate via the bus; the image acquisition module includes an infrared camera and a speckle projector and is used for acquiring images; the memory stores at least one program which, when executed by the at least one processor, performs the method described in any one of the embodiments of the first aspect.
In a fourth aspect, a storage medium has stored thereon a computer program which, when executed by at least one processor, performs a method as described in any one of the embodiments of the first aspect.
According to the method and the device, in the process of recognizing the target object, the similarity between the target sample image of the target object and the first image template is determined; when the similarity is within the first preset range, the target sample image is determined to be a sample image permitted for learning, which reduces duplicate image templates among the image templates of the target object. By replacing each image template of the target object with the target sample image in turn and obtaining the corresponding uniformity parameters, it is determined whether the target sample image can replace an image template of the target object. This ensures that the image templates of the target object are uniformly distributed, so that the target object can be recognized at all angles, improving the accuracy and recognition rate of image recognition.
Drawings
Some specific embodiments of the present application will be described in detail hereinafter by way of illustration and not limitation with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
fig. 1 is a flowchart of an image template updating method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a second similarity average value according to an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a correspondence relationship between similarity and a radius of a distribution circle according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating a third preset range according to an embodiment of the present disclosure;
fig. 5 is a logic block diagram of an image template updating method according to an embodiment of the present application;
fig. 6 is a structural diagram of an image template updating apparatus according to an embodiment of the present application;
fig. 7 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
It is not necessary for any particular embodiment of the invention to achieve all of the above advantages at the same time.
In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present application, these solutions are described below clearly and completely with reference to the drawings of the embodiments. The described embodiments are obviously only some of the embodiments of the present application, not all of them; all other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application shall fall within the protection scope of the embodiments of the present application.
The following further describes specific implementations of embodiments of the present application with reference to the drawings of the embodiments of the present application.
The first embodiment
Fig. 1 is a flowchart of an image template updating method according to an embodiment of the present disclosure; the method may be applied to the process of image recognition and registration of a target object, and may also be applied to the process of image recognition after the target object is registered, which is not limited in this application, as shown in fig. 1, and includes the following steps:
step 101, acquiring an image of a target object to obtain a target sample image of the target object.
The target object refers to an object to be recognized in image recognition, for example, in an application scenario of face recognition, the target object may be a face of a user, and for example, in an application scenario of license plate recognition, the target object may be a license plate of a vehicle.
In this application, "object" is used in the singular: one object refers to one thing. It does not limit the solution to any particular thing, but is used to present the implementation of the solution more clearly in logical terms. In this application, the terms "first", "second", and the like are used for distinction, not for limitation, and do not indicate any order or grouping (a "second" need not follow a "first", nor need a "first" precede a "second").
Step 102, determining that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
The target object has at least one image template, the first image template belongs to the at least one image template of the target object, and when the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range, the target sample image can be used as a sample image for permitting learning, namely, the preliminary condition of the image template used as the target object is met, and the target sample image can be added into a learning queue.
In general, the target object has a plurality of image templates, and the first image template is only one of the image templates of the target object. For example, in an application scenario of face recognition, in a registration stage, an image template of a face image created by using a face image acquired from the front side is a first image template. Of course, the first image template may be any one of the image templates of the target object, which is only an exemplary illustration and does not represent that the present application is limited thereto.
Here, two specific examples are listed to illustrate how to determine that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range, which is, of course, only an exemplary description here, and does not represent that the present application is limited thereto.
Optionally, in the first example, the method further comprises: and when the similarity is greater than a first threshold and less than or equal to a second threshold, determining that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
It should be noted that the first preset range is bounded below by the first threshold and above by the second threshold. The first threshold is set to ensure that the selected target sample image is a sample image of the target object rather than of a non-target object, removing interference from sample images of non-target objects; the second threshold is set to exclude target sample images with a very high similarity to the first image template, avoiding duplicate image templates.
Based on the first example, further, in an embodiment of the present application, the method further includes: and calculating a second similarity average value of each sample image in the sample image set of the non-target object and the image template of the target object, and determining the second similarity average value with the maximum value as a first threshold value. For example, for one sample image a, the similarity of the sample image a and each image template of the target object is calculated, the average value of the similarities is calculated to obtain the second similarity average value of the sample image a, the second similarity average values of all the sample images are compared, and the second similarity average value with the largest value is determined as the first threshold.
Here, the calculation of the second similarity average value between each sample image in the sample image set of the non-target object and the image templates of the target object is described in further detail. A sample image set of the non-target object is obtained; none of its sample images is a sample image of the target object, all being sample images of other objects. For example, suppose the sample image set of the non-target object has M sample images, M being an integer greater than 1, and the target object has n image templates, n being an integer greater than 1. For the m-th sample image, m being an integer in [1, M], the second similarity average value of the m-th sample image and the image templates of the target object, denoted s̄_m, is calculated as in formula 1:

s̄_m = (1/n) · Σ_{i=1}^{n} s_{mi}    (formula 1)

where s̄_m represents the second similarity average value of the m-th sample image and the image templates of the target object, and s_{mi} represents the similarity between the m-th sample image and the i-th image template of the target object. It should be noted that in this embodiment, the similarity between a sample image and an image template may be calculated by a neural network model: the sample image and the image template are input into the model, and the model outputs their similarity.
As shown in fig. 2, fig. 2 is a schematic diagram of the second similarity average value provided by an embodiment of the present application. In fig. 2, the horizontal axis represents the second similarity average value, the left vertical axis represents the number of sample images, and the right vertical axis represents the proportion of sample images; the histogram shows the number of sample images at each second similarity average value, and the line graph shows the proportion of sample images with a second similarity average value less than or equal to the corresponding value. In the sample image set of the non-target object, the proportion of sample images with a second similarity average value within 0.8 is close to 1; that is, the similarity between almost every sample image of a non-target object and the image templates of the target object is less than 0.8, and a sample image with a similarity greater than 0.8 can be considered a sample image of the target object. Therefore, 0.8 may be used as the maximum second similarity average value, i.e., as the first threshold.
Optionally, in a second example, the method further comprises: determining a transverse rotation angle and a longitudinal rotation angle of a target object in a target sample image; and when the sum of the squares of the transverse rotation angle and the longitudinal rotation angle is within a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
Here, how to determine the third preset range is described in detail. For example, in a rectangular coordinate system, the longitudinal rotation angle of the target object (i.e., the angle of rotation of the target object about the x-axis) is represented on the x-axis, the lateral rotation angle of the target object (i.e., the angle of rotation about the y-axis) is represented on the y-axis, and both angles range within 90°, i.e., [-90, 90]. A sample image set of the target object is obtained; it may contain, for example, sample images of 1000 target objects. The similarity between each sample image and the front image template of the target object is calculated; the similarity between the m-th sample image and the front image template may be expressed as sim_m. The R value (the distribution circle radius) of each sample image in the rectangular coordinate system is calculated from its coordinates. For example, if the coordinates of the m-th sample image are (x_m, y_m), where x_m is the rotation angle of the target object about the x-axis in the m-th sample image and y_m is the rotation angle about the y-axis, then the R value R_m of the m-th sample image is:

R_m = √(x_m² + y_m²)

A sample image with a similarity greater than 0.95 is determined to be a duplicate of the front image template of the target object; a sample image with a similarity less than or equal to 0.8 is determined to be an invalid sample image, i.e., a sample image of a non-target object; and a sample image with a similarity greater than 0.8 and not greater than 0.95 is determined to be a valid sample image. Therefore, the similarity between the target sample image and the first image template of the target object should be in (0.8, 0.95]. Meanwhile, as shown in fig. 3, fig. 3 is a schematic diagram of the correspondence between the similarity and the distribution circle radius provided in an embodiment of the present application: the R value corresponding to a similarity of 0.95 is 5, and the R value corresponding to a similarity of 0.8 is 60, so the similarity interval (0.8, 0.95] corresponds to the R value interval [5, 60), which is the third preset range. Thus, when the R value of the target sample image is within the third preset range, i.e., [5, 60), it is determined that the similarity between the target sample image and the first image template of the target object is within the first preset range. As shown in fig. 4, fig. 4 is a schematic diagram of the third preset range provided by an embodiment of the present application: in the rectangular coordinate system of fig. 4, the annular shaded region represents the third preset range, i.e., the distribution region with an R value greater than or equal to 5 and less than or equal to 60; the region outside it represents the distribution region of sample images with an R value less than 5 (similarity to the front image template greater than 0.95) or greater than 60 (similarity to the front image template less than 0.8).
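The annular third-preset-range test reduces to computing R from the two rotation angles and checking it against the bounds. A minimal sketch using the example values 5 and 60 from the embodiment:

```python
import math

def in_third_preset_range(x_angle, y_angle, r_min=5.0, r_max=60.0):
    """Check whether the distribution circle radius R = sqrt(x^2 + y^2) of a
    sample image falls in the annular region [r_min, r_max). The default
    bounds 5 and 60 are the example values from the embodiment."""
    r = math.hypot(x_angle, y_angle)
    return r_min <= r < r_max
```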
And 103, pre-replacing each image template of the target object by using the target sample image to obtain at least one pre-replaced template set.
It should be noted that, in step 103, replacing each image template with the target sample image is a pre-replacement: it is used only for calculating the uniformity parameter, and does not yet make the target sample image an image template.
Optionally, in an embodiment of the present application, the pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replacement template set includes:
and when the number of the image templates of the target object is determined to be equal to the preset number, pre-replacing each image template of the target object by using the target sample image to obtain at least one pre-replaced template set.
Optionally, in an embodiment of the present application, the method further includes: and when the number of the image templates of the target object is less than the preset number, determining the target sample image of the target object as the image template of the target object.
The upper limit of the number of image templates of the target object may be a preset number. If the number of image templates of the target object is smaller than the preset number, image templates can still be added; in that case, when the similarity between the target sample image and the first image template of the target object is within the first preset range, the target sample image is directly used as an image template of the target object. If the number of image templates of the target object equals the preset number, no further image template can be added; it must then be determined whether replacing one image template of the target object with the target sample image yields a better, i.e., more uniformly distributed, template set.
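The add-versus-replace decision above can be sketched as a small dispatcher; `try_replace` is a hypothetical placeholder for the uniformity-based pre-replacement check of steps 103 to 105.

```python
def admit_sample(sample, templates, max_templates, try_replace):
    """If the template set is not yet full, add the sample directly as a new
    image template; otherwise delegate to the pre-replacement/uniformity
    check (`try_replace` is a placeholder callable)."""
    if len(templates) < max_templates:
        return templates + [sample]
    return try_replace(sample, templates)
```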
And 104, calculating the uniformity parameter of each pre-replacement template set in at least one pre-replacement template set.
The uniformity parameter is used to indicate the uniformity of the image template distribution of the target object in the corresponding set of pre-replacement templates.
The uniformity parameter is used for indicating the uniformity of the distribution of the image templates in the pre-replacement template set obtained after the target sample image replaces an image template of the target object. For example, the target object has n image templates, where n is an integer greater than 1. The target sample image is substituted for the i-th image template to obtain the i-th pre-replacement template set of the target object, and the uniformity parameter of the i-th pre-replacement template set is calculated as the i-th template uniformity parameter. Substituting the target sample image for each image template in turn yields the corresponding uniformity parameters, so that n pre-replacement template sets and n uniformity parameters are obtained altogether.
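The construction of the n pre-replacement template sets can be illustrated with a short sketch. Representing a template only by its (longitudinal, lateral) rotation-angle pair and the function name `pre_replacement_sets` are assumptions for illustration:

```python
from typing import List, Tuple

Angles = Tuple[float, float]  # (longitudinal, lateral) rotation angles

def pre_replacement_sets(templates: List[Angles],
                         sample: Angles) -> List[List[Angles]]:
    """For each of the n templates, build the set obtained by
    substituting the target sample image for the i-th template."""
    sets = []
    for i in range(len(templates)):
        replaced = list(templates)
        replaced[i] = sample  # pre-replace the i-th template only
        sets.append(replaced)
    return sets
```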
Taking a rectangular coordinate system as an example, the longitudinal rotation angle of the target object (i.e., the angle of rotation of the target object around the x-axis) is represented by the x-axis, and the lateral rotation angle of the target object (i.e., the angle of rotation of the target object around the y-axis) is represented by the y-axis. Here, two specific examples are presented to illustrate how the uniformity parameter is calculated.
Optionally, in the first example, an overall uniformity parameter is calculated; the replacing each image template of the target object with the target sample image respectively and obtaining the corresponding uniformity parameter may include:
uniformly dividing the value ranges of the transverse rotation angle and the longitudinal rotation angle of the target object into at least one region; and, for each image template of the target object, replacing that image template with the target sample image, determining the number of image templates distributed in each region, and calculating the standard deviation/variance of the numbers of image templates of all regions as the uniformity parameter.
Optionally, in an embodiment of the present application, the method further includes: for the image templates of the target object before replacement, the number of image templates distributed in each region is determined, and the standard deviation/variance of the number of image templates of all regions is calculated as a preset uniformity value.
Based on the first example, as shown in fig. 4, the annular shaded region covers the value ranges of the transverse rotation angle and the longitudinal rotation angle of the target object, that is, the R value range [5, 60], and the annular shaded region is equally divided into 8 sector areas (that is, the at least one region). Of course, the annular shaded region may also be divided into 6 or 4 sector areas, which may be freely set according to the specific situation; the present application does not limit this. In the 8 sector areas shown in fig. 4, a point in the positive x-axis direction represents an angle at which the face is turned upward, a point in the negative x-axis direction represents an angle at which the face is turned downward, a point in the positive y-axis direction represents an angle at which the face is turned rightward, a point in the negative y-axis direction represents an angle at which the face is turned leftward, a diagonal line in the first quadrant represents an angle at which the face is turned upward to the right, a diagonal line in the second quadrant represents an angle at which the face is turned downward to the right, a diagonal line in the third quadrant represents an angle at which the face is turned downward to the left, and a diagonal line in the fourth quadrant represents an angle at which the face is turned upward to the left. Of course, this illustration is only exemplary; a positive direction may instead indicate an angle of rotation downward or rightward, and a negative direction an angle of rotation upward or leftward. This is not limited by the present application.
Taking the n image templates of the target object as an example, the preset uniformity value is the standard deviation calculated from the original image templates of the target object. That is, before the target sample image replaces any image template of the target object, the number of image templates distributed in each sector area is determined; the number of image templates in the t-th sector area is x_{0t} (t = 1, 2, ..., 8), and the average of the numbers of image templates in the 8 sector areas is calculated as

$$\bar{x}_0 = \frac{1}{T}\sum_{t=1}^{T} x_{0t}$$

Then, the standard deviation std_0 of the numbers of image templates in the 8 sector areas is calculated according to Equation 2 as the preset uniformity value:

$$\mathrm{std}_0 = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(x_{0t} - \bar{x}_0\right)^2} \qquad \text{(Equation 2)}$$
wherein T is the number of the at least one region. For the i-th image template, the target sample image replaces the i-th image template (that is, the target sample image is used as the i-th image template), and the number of image templates in each sector area is calculated; the number of image templates in the t-th sector area is x_{it} (t = 1, 2, ..., 8), and the average of the numbers of image templates in the 8 sector areas is calculated as

$$\bar{x}_i = \frac{1}{T}\sum_{t=1}^{T} x_{it}$$

Then, the standard deviation std_i of the numbers of image templates in the 8 sector areas is calculated according to Equation 3:

$$\mathrm{std}_i = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(x_{it} - \bar{x}_i\right)^2} \qquad \text{(Equation 3)}$$
Wherein T is the number of the at least one region. std_i (i.e., the overall uniformity parameter) indicates the uniformity of the distribution of the image templates of the target object after the i-th image template of the target object is replaced by the target sample image; the smaller std_i is, the better the uniformity of the distribution of the image templates. If std_i is less than std_0, the image templates are distributed more uniformly after the target sample image replaces the i-th image template of the target object, and the target sample image can be used as the i-th image template to replace the original image template.
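As a hedged illustration of the first example, the following sketch assigns each template's (longitudinal, lateral) rotation-angle pair to one of 8 equal sectors of the annular region and computes the per-sector-count population standard deviation of Equations 2/3. The sector-assignment rule and function names are assumptions; the embodiments do not prescribe a particular implementation:

```python
import math
from typing import List, Tuple

def sector_index(x_deg: float, y_deg: float, sectors: int = 8) -> int:
    """Map a (longitudinal, lateral) rotation-angle pair to one of
    `sectors` equal angular sectors of the annular region."""
    ang = math.atan2(y_deg, x_deg) % (2 * math.pi)
    return int(ang / (2 * math.pi / sectors)) % sectors

def uniformity_std(templates: List[Tuple[float, float]],
                   sectors: int = 8) -> float:
    """Population standard deviation of the per-sector template counts
    (Equations 2/3): smaller means a more even angular spread."""
    counts = [0] * sectors
    for x, y in templates:
        counts[sector_index(x, y, sectors)] += 1
    mean = sum(counts) / sectors
    return math.sqrt(sum((c - mean) ** 2 for c in counts) / sectors)
```

Computing `uniformity_std` once for the original templates gives std_0, and once per pre-replacement set gives each std_i.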
Optionally, in the second example, a local uniformity parameter may be calculated, and the calculating of the standard deviation/variance of the numbers of image templates of the at least one region as the uniformity parameter includes:
dividing the at least one region into a first portion and a second portion; calculating the value obtained by dividing the standard deviation/variance of the numbers of image templates of all regions in the first portion by a first factor, as the first parameter among the uniformity parameters; and calculating the value obtained by dividing the standard deviation/variance of the numbers of image templates of all regions in the second portion by a second factor, as the second parameter among the uniformity parameters.
It should be noted that the first portion is the union of all regions belonging to it, and the first portion plus the second portion together cover the value ranges of the transverse rotation angle and the longitudinal rotation angle of the target object.
Of course, the at least one region may also be divided into a plurality of portions. For example, the at least one region is divided into S portions, S being an integer greater than 1; the value obtained by dividing the standard deviation/variance of the numbers of image templates of all regions in the s-th portion by the s-th factor is calculated as the s-th parameter among the uniformity parameters, so that S parameters are obtained, where s is an integer within [1, S].
Optionally, in an embodiment of the present application, the method further includes:
determining the number of image templates distributed in each area for the image templates of the target object before replacement; calculating the standard deviation/variance of the number of the image templates in all the areas in the first part as a first uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the first parameter is smaller than the first uniformity value; and/or calculating the standard deviation/variance of the number of the image templates of all the areas in the second part as a second uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the second parameter is smaller than the second uniformity value.
For example, based on the annular shaded region shown in fig. 4, the annular shaded region is divided at R = 25 into two annular regions, namely a first portion and a second portion. In combination with the calculation process of the standard deviation in the first example, before the target sample image replaces any image template of the target object, the standard deviations std_01 and std_02 of the numbers of image templates in the first portion and the second portion are calculated for the original image templates of the target object according to Equation 4 and Equation 5, respectively:
$$\mathrm{std}_{01} = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(x_{01t} - \bar{x}_{01}\right)^2} \qquad \text{(Equation 4)}$$

wherein x_{01t} (t = 1, 2, ..., 8) is the number of image templates in the t-th sector of the first portion, and

$$\bar{x}_{01} = \frac{1}{T}\sum_{t=1}^{T} x_{01t}$$

is the average of the numbers of image templates within the 8 sectors of the first portion.
$$\mathrm{std}_{02} = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(x_{02t} - \bar{x}_{02}\right)^2} \qquad \text{(Equation 5)}$$

wherein x_{02t} (t = 1, 2, ..., 8) is the number of image templates in the t-th sector of the second portion, and

$$\bar{x}_{02} = \frac{1}{T}\sum_{t=1}^{T} x_{02t}$$

is the average of the numbers of image templates within the 8 sectors of the second portion.
std_01 is taken as the first uniformity value, and std_02 as the second uniformity value.
For the i-th image template, the target sample image replaces the i-th image template (i.e., the target sample image is used as the i-th image template), and, in combination with the calculation process of the standard deviation in the first example, the standard deviations std_i1 and std_i2 of the numbers of image templates in the first portion and the second portion are calculated according to Equation 6 and Equation 7, respectively:
$$\mathrm{std}_{i1} = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(x_{i1t} - \bar{x}_{i1}\right)^2} \qquad \text{(Equation 6)}$$

wherein x_{i1t} (t = 1, 2, ..., 8) is the number of image templates in the t-th sector of the first portion, and

$$\bar{x}_{i1} = \frac{1}{T}\sum_{t=1}^{T} x_{i1t}$$

is the average of the numbers of image templates within the 8 sectors of the first portion.
$$\mathrm{std}_{i2} = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(x_{i2t} - \bar{x}_{i2}\right)^2} \qquad \text{(Equation 7)}$$

wherein x_{i2t} (t = 1, 2, ..., 8) is the number of image templates in the t-th sector of the second portion, and

$$\bar{x}_{i2} = \frac{1}{T}\sum_{t=1}^{T} x_{i2t}$$

is the average of the numbers of image templates within the 8 sectors of the second portion.
std_i1 is taken as the first parameter and std_i2 as the second parameter. Alternatively, the first parameter and the second parameter may be weighted; for example, the first parameter P_i1 = std_i1 / w_i1, where w_i1 denotes the first factor, and the larger w_i1 is, the more important the uniformity of the first portion. Similarly, the second parameter P_i2 = std_i2 / w_i2, where w_i2 denotes the second factor. The first parameter and the second parameter are the local uniformity parameters.
std_i1 indicates the uniformity with which the image templates of the target object are distributed in the first portion after the i-th image template of the target object is replaced by the target sample image; the smaller std_i1 is, the better that uniformity. Likewise, std_i2 indicates the uniformity with which the image templates of the target object are distributed in the second portion after the i-th image template is replaced; the smaller std_i2 is, the better that uniformity. If std_i1 is less than std_01, the image templates of the target object are distributed more uniformly in the first portion after the target sample image replaces the i-th image template, and the target sample image can be used as the i-th image template to replace the original one; and/or, if std_i2 is less than std_02, the image templates of the target object are distributed more uniformly in the second portion after the replacement.
Of course, the determination can also be made using the local uniformity parameters after division by the factors: for example, if P_i1 is less than std_01, the image templates of the target object are distributed more uniformly in the first portion after the target sample image replaces the i-th image template, and the target sample image can be used as the i-th image template to replace the original one; and/or, if P_i2 is less than std_02, the image templates of the target object are distributed more uniformly in the second portion after the replacement.
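A sketch of the second example: split the annulus at R = 25 into a first and a second portion, compute each portion's sector-count standard deviation, and divide by the weight factors w_i1 and w_i2. The helper below is illustrative only; the default split radius and the 8-sector layout are the example values from the text, and the function name is an assumption:

```python
import math
from typing import List, Tuple

def local_uniformity(templates: List[Tuple[float, float]],
                     r_split: float = 25.0, sectors: int = 8,
                     w1: float = 1.0, w2: float = 1.0) -> Tuple[float, float]:
    """Return (P_i1, P_i2) = (std_first/w1, std_second/w2): the weighted
    per-ring sector-count standard deviations of Equations 6 and 7."""
    inner, outer = [], []
    for x, y in templates:
        r = math.hypot(x, y)
        (inner if r < r_split else outer).append((x, y))

    def std_of(part: List[Tuple[float, float]]) -> float:
        counts = [0] * sectors
        for x, y in part:
            ang = math.atan2(y, x) % (2 * math.pi)
            counts[int(ang / (2 * math.pi / sectors)) % sectors] += 1
        mean = sum(counts) / sectors
        return math.sqrt(sum((c - mean) ** 2 for c in counts) / sectors)

    return std_of(inner) / w1, std_of(outer) / w2
```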
With reference to the first example and the second example, the overall uniformity parameter may be determined first and the local uniformity parameters afterwards: if the overall uniformity parameter equals the preset uniformity value after the i-th image template of the target object is replaced by the target sample image, it may be further determined whether the local uniformity parameters are smaller than the first uniformity value or the second uniformity value. If the overall uniformity parameter is smaller than the preset uniformity value, the replacement can be made directly; if the overall uniformity parameter is larger than the preset uniformity value, the target sample image cannot be used as an image template of the target object.
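The combined global-then-local decision flow can be condensed into a few lines. This is an interpretive sketch of the paragraph above, not a verbatim embodiment; the function name and argument order are assumptions:

```python
def should_replace(global_std: float, preset_std: float,
                   p1: float, std01: float,
                   p2: float, std02: float) -> bool:
    """Replace when the overall uniformity strictly improves; when it is
    unchanged, fall back to the local parameters; reject when it worsens."""
    if global_std < preset_std:
        return True
    if global_std == preset_std:  # tie on the overall parameter
        return p1 < std01 or p2 < std02
    return False
```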
And 105, if the uniformity parameter of the pre-replacement template set obtained after the target sample image replaces the second image template of the target object is smaller than the preset uniformity value, taking the target sample image as an image template of the target object in place of the second image template.
It should be noted that the target object has multiple image templates; the second image template belongs to the at least one image template of the target object and represents an image template that can be replaced, and there may be more than one such template. In conjunction with the description in step 102, multiple image templates may satisfy std_i < std_0. For example, among the n image templates of the target object, the 1st and 2nd image templates may satisfy std_1 < std_0 and std_2 < std_0; that is, the image templates of the target object are distributed more uniformly both after the target sample image replaces the 1st image template and after it replaces the 2nd image template, so both can serve as the second image template. In this case, the 1st and 2nd image templates are added to a replacement queue, and one image template must be selected from the queue for replacement.
Here, three specific examples are listed to explain how to replace the target sample image as the image template of the target object.
Optionally, in the first example, the method further includes: when a uniformity parameter smaller than the preset uniformity value exists among the uniformity parameters of the target object, replacing, with the target sample image, the image template whose replacement yields the minimum uniformity parameter.
For example, in the replacement queue, std_1 < std_0 and std_2 < std_0, and at the same time std_2 < std_1; that is, after the target sample image replaces the 2nd image template, the image templates are distributed more uniformly than after it replaces the 1st image template, so the target sample image replaces the 2nd image template. Of course, overall uniformity is used here only for illustration; local uniformity may also be used for the judgment, which is not limited in the present application.
Optionally, in the second example, the method further includes: among the image templates whose replacement yields a uniformity parameter smaller than the preset uniformity value, replacing with the target sample image the image template with the minimum number of matching hits during image identification of the target object.
Optionally, in the third example, the method further includes: among the image templates whose replacement yields a uniformity parameter smaller than the preset uniformity value, replacing with the target sample image the image template with the longest service time.
For example, in the replacement queue, if the time when the 1 st image template is the image template of the target object is 5 days and the time when the 2 nd image template is the image template of the target object is 1 day, the 1 st image template is replaced with the target sample image.
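The selection from the replacement queue (fewest matching hits, then longest service time as the tie-breaker) can be sketched as follows; the data-structure choices and function name are assumptions for illustration:

```python
from typing import Dict, List

def choose_template_to_replace(queue: List[str],
                               hits: Dict[str, int],
                               age_days: Dict[str, float]) -> str:
    """From templates whose pre-replacement improved uniformity, pick
    the one with the fewest match hits; break ties by choosing the
    template that has served as an image template the longest."""
    return min(queue, key=lambda t: (hits[t], -age_days[t]))
```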
Optionally, in an embodiment of the present application, the method further includes: and converting the two-dimensional image of the target sample image into three-dimensional point cloud information, and calculating the transverse rotation angle and the longitudinal rotation angle of the target object in the target sample image through the external reference rotation matrix according to the three-dimensional point cloud information.
The three-dimensional point cloud information is used for indicating the position coordinates and the depth information of each pixel point in the target sample image, and the three-dimensional point cloud information can represent the spatial position of each pixel point. The external reference rotation matrix is a parameter of a shooting device for shooting a target sample image, and the transverse rotation angle and the longitudinal rotation angle of a target object in the target sample image can be calculated according to the three-dimensional point cloud information and the external reference rotation matrix of the shooting device. Similarly, the horizontal rotation angle and the vertical rotation angle of the target object in the image corresponding to each image template of the target object may be calculated, which is not described herein again.
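For illustration, once a rotation matrix relating the captured point cloud to a frontal reference is available, the two head-pose angles can be read off it. The ZYX Euler convention and the function name below are assumptions; the embodiments only state that the angles are computed from the three-dimensional point cloud and the extrinsic rotation matrix:

```python
import math
from typing import List, Tuple

def head_pose_angles(r: List[List[float]]) -> Tuple[float, float]:
    """Recover the longitudinal (about-x) and lateral (about-y) rotation
    angles in degrees from a 3x3 rotation matrix R = Rz*Ry*Rx (ZYX
    convention). The matrix itself would come from registering the face
    point cloud against a frontal reference."""
    pitch = math.degrees(math.atan2(r[2][1], r[2][2]))  # rotation about x
    yaw = math.degrees(math.asin(-r[2][0]))             # rotation about y
    return pitch, yaw
```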
Optionally, before step 101, the acquired multiple images may be judged and a target sample image of the target object selected from them. For example, in an embodiment of the present application, the method further includes: calculating, for each image in the at least one image, a first similarity average value with respect to the image templates of the target object; and determining an image whose first similarity average value is within a second preset range as the target sample image of the target object.
For example, the similarity value range may be [0, 1], where a similarity of 0 indicates that two images are completely different and a similarity of 1 indicates that they are identical. In combination with the example corresponding to fig. 2, when the similarity is greater than 0.8 the image can be determined to be a sample image of the target object, so the second preset range may be (0.8, 1]; of course, it may also take other values, which is not limited in this application.
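The gating on the first similarity average can be sketched as follows, using the example range (0.8, 1] from the text; the function name and default bounds are illustrative:

```python
from typing import List

def is_target_sample(sim_to_templates: List[float],
                     low: float = 0.8, high: float = 1.0) -> bool:
    """An image qualifies as a target sample when the average of its
    similarities to all stored templates of the target object falls
    inside the second preset range (low, high]."""
    avg = sum(sim_to_templates) / len(sim_to_templates)
    return low < avg <= high
```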
In the process of identifying the target object, the number of image templates of the target object can be increased continuously until it reaches the upper limit. After the upper limit is reached, image templates can still be replaced to obtain better uniformity, which improves identification accuracy; and because the image templates are continuously updated, they remain up to date. After the image templates are updated, their angular diversity is greatly improved: for the same sample, the FRR (False Rejection Rate) is significantly reduced and the identification rate significantly improved compared with before the update.
The updating of the image template may be that the target object is in a registration process, for example, a user performs face registration; or the target object is in the process of identity verification, for example, the user performs face recognition. Two specific application scenarios are listed here for illustration, but of course, this is only an exemplary one, and does not represent that the present application is limited thereto.
Optionally, in the first application scenario, the target object may be authenticated, and the method further includes: after image acquisition is performed on the target object to obtain a target sample image of the target object, matching the target sample image with at least one image template of the target object; and after the target sample image is successfully matched with any image template of the target object, determining that the identity verification of the target object is successful. When the identity of the target object is verified, image acquisition is performed on the target object to obtain a target sample image, the target sample image is then matched, and if it matches one image template of the target object, the verification succeeds. Of course, authentication may also fail: when an object to be authenticated performs authentication, an image of the object is acquired, the acquired image is matched against the image templates of the target object, and if all image templates of the target object fail to match the acquired image, the object to be authenticated is not the target object and the authentication fails. Of course, this is merely an example and does not represent a limitation of the present application.
Optionally, in the second application scenario, the target object may be registered, and the method further includes: after image acquisition is performed on the target object to obtain a target sample image of the target object, determining the target sample image as an image template of the target object, storing the image template of the target object, and completing registration of the target object. Here, in the registration process of the target object, image acquisition is performed on the target object to obtain a target sample image, and the target sample image is determined as an image template of the target object. Of course, this is only an exemplary illustration: if multiple sample images of the target object have already been acquired during registration and acquisition continues, it is necessary to determine, according to steps 102 to 105, whether a newly acquired sample image can replace an existing image template.
Here, the first application scenario and the second application scenario are specific application scenarios of the image template updating method of the present application, and the updating of the image template may be performed in an authentication process of the target object (for example, face recognition), or may be performed in a registration process of the target object (for example, face registration), and the image template is automatically replaced in the recognition or registration process, so as to improve uniformity of distribution of the image template.
According to the method and the device of the present application, in the process of identifying the target object, the similarity between the target sample image of the target object and the first image template is determined; when the similarity is within the first preset range, the target sample image is determined to be a sample image permitted for learning, which reduces duplicate image templates among the image templates of the target object. By replacing each image template of the target object with the target sample image respectively and obtaining the corresponding uniformity parameters, it is determined whether the target sample image can replace an image template of the target object, which ensures that the image templates of the target object are uniformly distributed, allows the target object to be identified at all angles, and improves the accuracy and the identification rate of image identification.
Example II,
Based on the method for updating an image template shown in fig. 1, fig. 5 is a logic block diagram of an image template updating method provided in an embodiment of the present application, and as shown in fig. 5, the method includes the following steps:
step 501, identifying the current target sample image by using the image template of the target object.
If the identification is successful, it indicates that the target sample image is a sample image of the target object, and step 502 is performed; if the identification is unsuccessful, the process ends. In the embodiment corresponding to fig. 5, the target object may be a human face.
It should be noted that, when acquiring an image of a current target, an infrared camera device, a speckle projector, and a light supplement device may be used to acquire the image. In the identification process, when an infrared image is shot by using an infrared camera device, the infrared camera device and the light supplementing device are turned on, the speckle projector is turned off, and the infrared image is collected; when the speckle projector is used for collecting speckle images, the speckle projector is turned on, the infrared camera device and the light supplementing device are turned off, the speckle images are collected, depth information can be obtained through the collected speckle images, and two-dimensional infrared images are converted into three-dimensional point cloud information. And calculating the angle of the current point cloud through the camera external parameter rotation matrix.
Step 502, adding 1 to the number of matching hits of the successfully identified image template.
Step 503, judging whether the target sample image is permitted for learning.
The target sample image is permitted for learning if the similarity between the target sample image and the front-face image template of the target object (in the present embodiment, the first image template is the front-face image template) is within the first preset range.
Step 504, replacing each image template of the target object with the target sample image respectively and calculating the corresponding overall uniformity parameters.
And 505, judging whether the overall uniformity is improved.
Whether the overall uniformity improves may be determined according to the first example in step 102. If std_i is less than std_0, the overall uniformity is better after the target sample image replaces the i-th image template of the target object. If the overall uniformity is not improved, step 506 is performed; if it is improved, step 508 is performed.
Step 506, replacing each image template of the target object with the target sample image respectively and calculating the corresponding local uniformity parameters.
And 507, judging whether the local uniformity parameters are improved.
Whether the local uniformity improves may be determined according to the second example in step 102. If P_i1 is less than std_01 and/or P_i2 is less than std_02, the local uniformity is better after the target sample image replaces the i-th image template of the target object. If the local uniformity is not improved, the process ends; if it is improved, step 508 is performed.
Step 508, determining the image template with the smallest number of matching hits among the at least one replaceable image template of the target object.
Step 509, taking, among the at least one image template of the target object with the smallest number of matching hits, the image template with the longest service time as the second image template.
It should be noted that the at least one replaceable image template constitutes the replacement queue. If there is only one image template in the replacement queue, it is directly replaced with the target sample image; if there are multiple image templates in the replacement queue, the selection is performed according to step 508; and if, after the selection according to step 508, two image templates remain in the replacement queue whose matching hit counts are the same and both minimal, the selection further proceeds according to step 509.
Step 510, taking the target sample image as an image template of the target object and replacing the second image template.
According to the method and the device of the present application, in the process of identifying the target object, the similarity between the target sample image of the target object and the first image template is determined; when the similarity is within the first preset range, the target sample image is determined to be a sample image permitted for learning, which reduces duplicate image templates among the image templates of the target object. By replacing each image template of the target object with the target sample image respectively and obtaining the corresponding uniformity parameters, it is determined whether the target sample image can replace an image template of the target object, which ensures that the image templates of the target object are uniformly distributed, allows the target object to be identified at all angles, and improves the accuracy and the identification rate of image identification.
Example III,
Fig. 6 is a structural diagram of an image template updating apparatus according to an embodiment of the present application, where the image template updating apparatus 60 includes: a permit learning module 601, a uniformity parameter module 602, and an image template management module 603;
the permission learning module 601 is configured to determine that a similarity between a target sample image of a target object and a first image template of the target object is within a first preset range;
a uniformity parameter module 602, configured to, when the number of the image templates of the target object is equal to a preset number, respectively replace each image template of the target object with the target sample image and obtain a corresponding uniformity parameter, where the uniformity parameter is used to indicate uniformity of distribution of the image templates of the target object after the target sample image replaces the image template of the target object;
the image template management module 603 is configured to, if the uniformity parameter obtained after the target sample image replaces the second image template is smaller than the preset uniformity value, take the target sample image as an image template of the target object in place of the second image template.
Example IV,
Fig. 7 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic device 70 includes: at least one processor 701, a memory 702, and a bus 703, wherein the at least one processor 701 and the memory 702 are connected to each other through the bus 703 and communicate with each other;
the memory 702, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the image template updating method in the embodiments of the present application. The at least one processor 701 executes the various functional applications and data processing of the device by running the non-volatile software programs, instructions, and modules stored in the memory 702, thereby implementing the image template updating method of the above method embodiments.
The memory 702 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the image template updating apparatus, and the like. Further, the memory 702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 702 optionally includes memory located remotely from the at least one processor 701, and such remote memory may be connected to the image template updating apparatus over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The memory 702 stores at least one program, and the at least one processor 701 implements the method described in the embodiments of the present application when executing the program stored in the memory 702.
Example five,
The embodiment of the present application provides a storage medium, on which a computer program is stored, and when the computer program is executed by at least one processor, the method described in the embodiment of the present application is implemented.
The above product can execute the method provided by the embodiments of the present application, and has functional modules and beneficial effects corresponding to the method. For technical details not described in detail in this embodiment, reference may be made to the methods provided in the embodiments of the present application.
The electronic device of the embodiments of the present application exists in various forms, including but not limited to:
(1) Mobile communication devices, which are characterized by mobile communication capability and are mainly aimed at providing voice and data communication. Such terminals include smart phones (e.g., iPhone), multimedia phones, feature phones, and low-end phones.
(2) Ultra-mobile personal computer devices, which belong to the category of personal computers, have computing and processing functions, and generally also have mobile internet access. Such terminals include PDA, MID, and UMPC devices, such as the iPad.
(3) Portable entertainment devices, which can display and play multimedia content. Such devices include audio and video players (e.g., iPod), handheld game consoles, electronic books, smart toys, and portable in-vehicle navigation devices.
(4) Servers, which are similar in architecture to general-purpose computers but have higher requirements for processing capability, stability, reliability, security, scalability, manageability, and the like, because they need to provide highly reliable services.
(5) And other electronic devices with data interaction functions.
Thus, particular embodiments of the present subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may be advantageous.
In the 1990s, an improvement of a technology could be clearly distinguished as an improvement in hardware (for example, an improvement in a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement in a method flow). However, as technology develops, many of today's improvements in method flows can be regarded as direct improvements in hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement in a method flow cannot be implemented by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic function is determined by a user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, this programming is now mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, and the source code before compilation must also be written in a specific programming language, called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that hardware circuitry that implements the logical method flows can be readily obtained by merely slightly programming the method flows into an integrated circuit using the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of at least one processor, or of at least one processor and a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, the method steps can be logically programmed so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for implementing various functions may also be regarded as structures within the hardware component. Or even, the means for implementing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above apparatus is described as being divided into various units by function, each described separately. Of course, when implementing the present application, the functions of the units may be implemented in one or more pieces of software and/or hardware.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on at least one computer-usable storage medium (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to at least one processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the at least one processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes at least one processor (CPU), an input/output interface, a network interface, and memory.
The memory may include volatile memory, Random Access Memory (RAM), and/or non-volatile memory in a computer-readable medium, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on at least one computer-usable storage medium (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (20)

  1. An image template updating method, comprising:
    acquiring an image of a target object to obtain a target sample image of the target object;
    determining that the similarity between the target sample image and a first image template of the target object is within a first preset range, wherein the target object has at least one image template, and the first image template belongs to the at least one image template of the target object;
    pre-replacing each image template of the target object by using the target sample image to obtain at least one pre-replaced template set;
    calculating a uniformity parameter of each pre-replacement template set in the at least one pre-replacement template set, wherein the uniformity parameter is used for indicating uniformity of distribution of image templates of the target object in the corresponding pre-replacement template set;
and if the uniformity parameter of the pre-replacement template set obtained after a second image template of the target object is replaced is smaller than a preset uniformity value, taking the target sample image as an image template of the target object and replacing the second image template, wherein the second image template belongs to the at least one image template of the target object.
  2. The method of claim 1, wherein replacing each image template of the target object with the target sample image and obtaining a corresponding uniformity parameter comprises:
    uniformly dividing at least one region in the value ranges of the transverse rotation angle and the longitudinal rotation angle of the target object;
    and for each image template of the target object, replacing the image template of the target object with the target sample image, determining the number of the image templates distributed in each area, and calculating the standard deviation/variance of the number of the image templates of all the areas as the uniformity parameter.
  3. The method of claim 2, further comprising:
    and for the image templates of the target object before replacement, determining the number of the image templates distributed in each area, and calculating the standard deviation/variance of the number of the image templates of all the areas as the preset uniformity value.
  4. The method of claim 2, wherein calculating a standard deviation/variance of the number of image templates for the at least one region as the uniformity parameter comprises:
    dividing the at least one region into a first portion and a second portion;
    calculating a value obtained by dividing the standard deviation/variance of the number of the image templates of all the areas in the first part by a first factor to be used as a first parameter in the uniformity parameters;
    and calculating the value of the standard deviation/variance of the number of the image templates of all the areas in the second part divided by the first factor as a second parameter in the uniformity parameters.
  5. The method of claim 4, further comprising:
    determining the number of image templates distributed in each area for the image templates of the target object before replacement;
    calculating the standard deviation/variance of the number of the image templates of all the areas in the first part as a first uniformity value in the preset uniformity values, and when the first parameter is smaller than the first uniformity value, determining that the uniformity parameter is smaller than the preset uniformity value;
    and/or calculating the standard deviation/variance of the number of the image templates of all the areas in the second part as a second uniformity value in the preset uniformity values, and determining that the uniformity parameter is smaller than the preset uniformity value when the second parameter is smaller than the second uniformity value.
  6. The method of claim 1, further comprising:
    calculating a first similarity average value of each image in at least one image and an image template of the target object; and determining an image with the first similarity average value within a second preset range as a target sample image of the target object.
  7. The method of claim 1, further comprising:
    determining that the similarity between a target sample image of the target object and a first image template of the target object is within the first preset range when the similarity is greater than a first threshold and less than or equal to a second threshold.
  8. The method of claim 7, further comprising:
    and calculating a second similarity average value of each sample image in the sample image set of the non-target object and the image template of the target object, and determining the second similarity average value with the largest value as the first threshold value.
  9. The method of claim 1, further comprising:
    determining a transverse rotation angle and a longitudinal rotation angle of a target object in the target sample image; and when the square sum of the transverse rotation angle and the longitudinal rotation angle is within a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range.
  10. The method of claim 1, wherein pre-replacing each image template of the target object with the target sample image to obtain at least one pre-replacement template set comprises:
    and when the number of the image templates of the target object is determined to be equal to the preset number, pre-replacing each image template of the target object by using the target sample image to obtain at least one pre-replaced template set.
  11. The method of claim 1, further comprising:
    and when the number of the image templates of the target object is less than the preset number, determining the target sample image of the target object as the image template of the target object.
  12. The method of claim 1, further comprising:
and when, among the uniformity parameters obtained for the target object, there is a uniformity parameter smaller than the preset uniformity value, replacing with the target sample image the image template of the target object whose replacement yields the minimum uniformity parameter.
  13. The method of claim 1, further comprising:
and among the image templates of the target object whose replacement yields a uniformity parameter smaller than the preset uniformity value, replacing with the target sample image the image template that has the minimum number of matching hits during image identification of the target object.
  14. The method of claim 1, further comprising:
and among the image templates of the target object whose replacement yields a uniformity parameter smaller than the preset uniformity value, replacing with the target sample image the image template that has been in use for the longest time.
  15. The method of claim 1, further comprising:
    matching the target sample image with at least one image template of the target object;
    and after the target sample image is successfully matched with any image template of the target object, determining that the identity verification of the target object is successful.
  16. The method of claim 1, further comprising:
    and determining the target sample image as an image template of the target object, storing the image template of the target object, and finishing the registration of the target object.
  17. The method according to any one of claims 1-16, further comprising:
    and converting the two-dimensional image of the target sample image into three-dimensional point cloud information, and calculating the transverse rotation angle and the longitudinal rotation angle of the target object in the target sample image through an external reference rotation matrix according to the three-dimensional point cloud information.
  18. An apparatus for updating an image template, comprising: the device comprises a study permitting module, a uniformity parameter module and an image template management module;
    the permitted learning module is configured to determine that a similarity between a target sample image of a target object and a first image template of the target object is within a first preset range;
    the uniformity parameter module is configured to, when the number of the image templates of the target object is equal to a preset number, replace each image template of the target object with the target sample image and obtain a corresponding uniformity parameter, where the uniformity parameter is used to indicate uniformity of distribution of the image templates of the target object after the target sample image replaces the image template of the target object;
    and the image template management module is used for taking the target sample image as the image template of the target object and replacing the second image template if the uniformity parameter obtained after the target object replaces the second image template is smaller than a preset uniformity value.
  19. An image recognition apparatus, characterized by comprising: at least one processor, a memory, and an image acquisition module, wherein the at least one processor, the memory, and the image acquisition module are connected to each other through a bus and communicate with each other; the image acquisition module comprises an infrared camera and a speckle projector;
    the memory has stored thereon at least one program that, when executed by the at least one processor, performs the method of any of claims 1-17.
  20. A storage medium, comprising: the storage medium has stored thereon a computer program which, when executed by at least one processor, implements the method of any one of claims 1-17.
CN201980002050.4A 2019-09-27 2019-09-27 Image template updating method, device and storage medium Active CN112912889B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/108656 WO2021056450A1 (en) 2019-09-27 2019-09-27 Method for updating image template, device, and storage medium

Publications (2)

Publication Number Publication Date
CN112912889A true CN112912889A (en) 2021-06-04
CN112912889B CN112912889B (en) 2024-03-26

Family

ID=75165050

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980002050.4A Active CN112912889B (en) 2019-09-27 2019-09-27 Image template updating method, device and storage medium

Country Status (2)

Country Link
CN (1) CN112912889B (en)
WO (1) WO2021056450A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139612A (en) * 2021-05-07 2021-07-20 上海商汤临港智能科技有限公司 Image classification method, training method of classification network and related products
CN114401440A (en) * 2021-12-14 2022-04-26 北京达佳互联信息技术有限公司 Video clip and clip model generation method, device, apparatus, program, and medium
CN115937629B (en) * 2022-12-02 2023-08-29 北京小米移动软件有限公司 Template image updating method, updating device, readable storage medium and chip

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106021606A (en) * 2016-06-21 2016-10-12 广东欧珀移动通信有限公司 Fingerprint template updating method and terminal equipment
CN106372609A (en) * 2016-09-05 2017-02-01 广东欧珀移动通信有限公司 Fingerprint template update method, fingerprint template update device and terminal equipment
KR20170046436A (en) * 2015-10-21 2017-05-02 삼성전자주식회사 Biometric authentication method and biometrics authentication apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016206081A1 (en) * 2015-06-26 2016-12-29 宇龙计算机通信科技(深圳)有限公司 Method and device for updating fingerprint template
US10769255B2 (en) * 2015-11-11 2020-09-08 Samsung Electronics Co., Ltd. Methods and apparatuses for adaptively updating enrollment database for user authentication

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170046436A (en) * 2015-10-21 2017-05-02 삼성전자주식회사 Biometric authentication method and biometrics authentication apparatus
CN106021606A (en) * 2016-06-21 2016-10-12 广东欧珀移动通信有限公司 Fingerprint template updating method and terminal equipment
CN106372609A (en) * 2016-09-05 2017-02-01 广东欧珀移动通信有限公司 Fingerprint template update method, fingerprint template update device and terminal equipment

Also Published As

Publication number Publication date
CN112912889B (en) 2024-03-26
WO2021056450A1 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
CN108225341B (en) Vehicle positioning method
CN112528831B (en) Multi-target attitude estimation method, multi-target attitude estimation device and terminal equipment
CN109086734B (en) Method and device for positioning pupil image in human eye image
CN112912889B (en) Image template updating method, device and storage medium
KR102316230B1 (en) Image processing method and device
US20200125876A1 (en) Method and Device for License Plate Positioning
CN112016475B (en) Human body detection and identification method and device
CN109740415A (en) Vehicle attribute recognition methods and Related product
CN111104825A (en) Face registry updating method, device, equipment and medium
US20140232748A1 (en) Device, method and computer readable recording medium for operating the same
CN110738110A (en) Human face key point detection method, device, system and storage medium based on anchor point
CN108052869B (en) Lane line recognition method, lane line recognition device and computer-readable storage medium
CN111199169A (en) Image processing method and device
KR20220098312A (en) Method, apparatus, device and recording medium for detecting related objects in an image
CN115600157A (en) Data processing method and device, storage medium and electronic equipment
CN111160251A (en) Living body identification method and device
CN110135428B (en) Image segmentation processing method and device
CN111353325A (en) Key point detection model training method and device
WO2020001016A1 (en) Moving image generation method and apparatus, and electronic device and computer-readable storage medium
CN108875901B (en) Neural network training method and universal object detection method, device and system
CN110222576B (en) Boxing action recognition method and device and electronic equipment
CN109214271B (en) Method and device for determining loss function for re-identification
CN116309643A (en) Face shielding score determining method, electronic equipment and medium
CN110852261A (en) Target detection method and device, electronic equipment and readable storage medium
CN115578796A (en) Training method, device, equipment and medium for living body detection model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant