WO2021056450A1 - Image template updating method, device and storage medium - Google Patents

Image template updating method, device and storage medium

Info

Publication number
WO2021056450A1
WO2021056450A1 (PCT/CN2019/108656, CN2019108656W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
target object
template
uniformity
target
Prior art date
Application number
PCT/CN2019/108656
Other languages
English (en)
Chinese (zh)
Inventor
肖梦佳
王波
Original Assignee
深圳市汇顶科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市汇顶科技股份有限公司 filed Critical 深圳市汇顶科技股份有限公司
Priority to CN201980002050.4A priority Critical patent/CN112912889B/zh
Priority to PCT/CN2019/108656 priority patent/WO2021056450A1/fr
Publication of WO2021056450A1 publication Critical patent/WO2021056450A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition

Definitions

  • the embodiments of the present application relate to the field of image recognition technology, and in particular, to an image template updating method, device, and storage medium.
  • as image recognition technology matures, its applications become increasingly widespread. Taking face recognition as an example, it has been widely used in scenarios such as security inspection, identity verification, and electronic payment. In face recognition, a face image is first sampled and registered, and an image template is established from the features of the registered face image; subsequent recognition then checks against this image template to determine whether the currently identified person is a registered user. In practical applications, however, the angle at which the registered face image is collected is random, which results in a low recognition rate when recognizing the face image to be recognized.
  • one of the technical problems solved by the embodiments of the present application is to provide an image template updating method, device, and storage medium to overcome the above-mentioned drawback.
  • an embodiment of the present application provides a method for updating an image template, including:
  • the target object has at least one image template, and the first image template belongs to at least one image template of the target object;
  • the target sample image is used as an image template of the target object, replacing the second image template, where the second image template belongs to the at least one image template of the target object.
  • replacing each image template of the target object with the target sample image to obtain the corresponding uniformity parameter includes:
  • the method further includes: for the image templates of the target object before replacement, determining the number of image templates distributed in each area, and calculating the standard deviation/variance of the number of image templates across all areas as the preset uniformity value.
  • calculating the standard deviation/variance of the number of image templates in at least one region as the uniformity parameter includes:
  • the method further includes:
  • the method further includes: calculating a first similarity average value between each image in the at least one image and the image templates of the target object; and determining an image whose first similarity average value is within a second preset range as the target sample image of the target object.
  • the method further includes: when the similarity is greater than the first threshold and less than or equal to the second threshold, determining that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range.
  • the method further includes: calculating a second similarity average value between each sample image in a sample image set of non-target objects and the image templates of the target object, and determining the second similarity average value with the largest value as the first threshold.
  • the method further includes: determining the horizontal rotation angle and the vertical rotation angle of the target object in the target sample image; when the sum of squares of the horizontal rotation angle and the vertical rotation angle is within a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range.
  • each image template of the target object is respectively pre-replaced using the target sample image to obtain at least one pre-replacement template set.
  • the method further includes: when the number of image templates of the target object is less than a preset number, determining the target sample image of the target object as the image template of the target object.
  • the method further includes: when there is a uniformity parameter less than the preset uniformity value among the uniformity parameters of the target object, replacing, among the image templates of the target object, the image template whose replacement yields the smallest uniformity parameter with the target sample image.
  • the method further includes: among the image templates whose uniformity parameter obtained after replacement is less than the preset uniformity value, replacing the image template with the smallest number of hits when matching the target object during image recognition with the target sample image.
  • the method further includes: among the image templates whose uniformity parameter obtained after replacement is less than the preset uniformity value, replacing the image template with the longest use time with the target sample image.
  • the method further includes: matching the target sample image with the at least one image template of the target object; after the target sample image is successfully matched with any image template of the target object, determining that the authentication of the target object is successful.
  • the method further includes: determining the target sample image as the image template of the target object, storing the image template of the target object, and completing the registration of the target object.
  • the method further includes: converting the two-dimensional image of the target sample image into three-dimensional point cloud information, and calculating, according to the three-dimensional point cloud information and through an extrinsic rotation matrix, the horizontal rotation angle and the vertical rotation angle of the target object in the target sample image.
  • an image template updating device is characterized by comprising: a permission learning module, a uniformity parameter module, and an image template management module;
  • the permission learning module is used to determine that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range
  • the uniformity parameter module is used to replace each image template of the target object with the target sample image and obtain the corresponding uniformity parameter when the number of image templates of the target object is equal to the preset number.
  • the uniformity parameter is used to indicate the uniformity of the distribution of the target object's image templates after the target sample image replaces an image template of the target object;
  • the image template management module is configured to use the target sample image as the image template of the target object and replace the second image template if the uniformity parameter obtained after the target object replaces the second image template is less than the preset uniformity value.
  • an image recognition device includes: at least one processor, a memory, and an image acquisition module, which are connected to each other through a bus to complete communication; the image acquisition module includes an infrared camera and a speckle projector and is used to acquire images; at least one program is stored in the memory, and when the at least one processor executes the program stored in the memory, the method described in any one of the embodiments of the first aspect is implemented.
  • a storage medium has a computer program stored on the storage medium, and when at least one processor executes the computer program, the method described in any one of the embodiments of the first aspect is implemented.
  • this application determines the similarity between the target sample image of the target object and the first image template. When the similarity is within the first preset range, the target sample image is determined to be a permitted learning sample image, which reduces repetitive image templates among the target object's image templates. Each image template of the target object is then replaced with the target sample image to obtain the corresponding uniformity parameters, which determine whether the target sample image can replace an image template of the target object. This ensures that the target object's image templates are evenly distributed, so the target object can be recognized at all angles, improving the accuracy and recognition rate of image recognition.
  • FIG. 1 is a flowchart of an image template updating method provided by an embodiment of the application
  • FIG. 2 is a schematic diagram of a second similarity average value provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of a corresponding relationship between similarity and radius of distribution circle provided by an embodiment of this application;
  • FIG. 4 is a schematic diagram of a third preset range provided by an embodiment of this application.
  • FIG. 5 is a logical block diagram of a method for updating an image template provided by an embodiment of the application
  • FIG. 6 is a structural diagram of an image template updating device provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the application.
  • Figure 1 is a flowchart of an image template update method provided by an embodiment of the application; this method can be applied to the process of image recognition registration of a target object, and can also be applied to the process of image recognition after the target object is registered. This application does not limit this, as shown in Figure 1, which includes the following steps:
  • Step 101 Perform image collection on a target object to obtain a target sample image of the target object.
  • the target object refers to the object to be recognized in image recognition. In the application scenario of face recognition, the target object can be the user's face; in the application scenario of license plate recognition, the target object can be the vehicle's license plate. These are, of course, only examples; in different application scenarios, the target object can be a different object.
  • "target" here indicates a singular number and refers to one thing: a target object refers to one object. The "target" does not impose any limitation; it is only used to express the scheme more clearly. In this application, words such as "first" and "second" are likewise used only for distinction; they impose no limitation, indicate no sequence (no order relationship), and imply no set (the second need not imply the first, nor the first the second).
  • Step 102 Determine that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range.
  • the target object has at least one image template, and the first image template belongs to the at least one image template of the target object. When the similarity between the target sample image of the target object and the first image template is within the first preset range, the target sample image can be used as a permitted learning sample image; that is, it satisfies the preliminary condition for becoming an image template of the target object, and can be added to the learning queue.
  • for example, the target object has multiple image templates, and the first image template is only one of them. For instance, multiple image templates may be created from the frontal image collected from the front of the target object, and one of them used as the first image template: in face recognition, the image template created from the face image collected from the front serves as the first image template. Alternatively, the first image template may be any image template among the image templates of the target object. This is only an exemplary description, and does not mean that the application is limited to it.
  • the method further includes: when the similarity is greater than the first threshold and less than or equal to the second threshold, determining that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range.
  • the first preset range is bounded below by the first threshold and above by the second threshold. Setting the first threshold ensures that the selected target sample image is a sample image of the target object rather than of a non-target object, removing the interference of non-target sample images. Setting the second threshold ensures that target sample images with too high a similarity to the first image template are removed, avoiding duplicate image templates.
  • the method further includes: calculating a second similarity average value between each sample image in the sample image set of non-target objects and the image templates of the target object, and determining the second similarity average value with the largest value as the first threshold. For example, for a sample image A, calculate the similarity between sample image A and each image template of the target object and average them to obtain the second similarity average value of sample image A; then compare the second similarity average values of all sample images and determine the largest as the first threshold.
  • the sample images contained in the set are not sample images of the target object, but sample images of other objects. The sample image set of non-target objects has M sample images, where M is an integer greater than 1; the target object has n image templates, where n is an integer greater than 1; and m is an integer within [1, M].
  • the second similarity average value of the m-th sample image over the image templates of the target object is calculated as formula 1:

    avg_m = (sim_m1 + sim_m2 + ... + sim_mn) / n

  • sim_mi represents the similarity between the m-th sample image and the i-th image template of the target object.
  • the similarity between the sample image and the image template in this embodiment can be calculated by a neural network model: the sample image and the image template are input into the neural network model to obtain the similarity output by the model.
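As a minimal sketch of this step (the application does not specify the network or the metric, so the cosine of the angle between two embedding vectors is assumed here, and the feature values below are made-up illustrative numbers):

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors, in [-1, 1];
    # stands in for the similarity the neural network model outputs
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical embeddings of a sample image and one image template
sample_feat = [0.2, 0.9, 0.4]
template_feat = [0.25, 0.85, 0.45]
sim = cosine_similarity(sample_feat, template_feat)
```

In this sketch a value of `sim` near 1 plays the role of a high similarity between the sample image and the image template.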
  • Figure 2 is a schematic diagram of the second similarity average value provided by an embodiment of the application. The horizontal axis represents the second similarity average value, the left vertical axis represents the number of sample images, and the right vertical axis represents the proportion of the number of sample images. The bar graph represents the number of sample images at each second similarity average value, and the line graph represents the cumulative proportion of sample images whose second similarity average value is less than or equal to the corresponding value.
  • as shown in Figure 2, the proportion of sample images whose second similarity average value is within 0.8 is close to 1; that is, in the sample image set of non-target objects, almost all sample images have a similarity to the target object's image templates of less than 0.8, and a sample image with a similarity greater than 0.8 can be considered a sample image of the target object. Therefore, 0.8 can be taken as the largest second similarity average value, i.e., 0.8 is used as the first threshold.
  • the method further includes: determining the lateral rotation angle and the longitudinal rotation angle of the target object in the target sample image; when the sum of squares of the lateral rotation angle and the longitudinal rotation angle is within a third preset range, determining that the similarity between the target sample image of the target object and the first image template of the target object is within the first preset range.
  • the x-axis represents the longitudinal rotation angle of the target object (that is, the angle at which the target object rotates around the x-axis), and the y-axis represents the lateral rotation angle of the target object (that is, the angle at which the target object rotates around the y-axis).
  • the sample image set of the target object can contain, for example, 1000 sample images of the target object. Calculate the similarity between each sample image and the frontal image template of the target object; for example, the similarity between the m-th sample image and the frontal image template can be expressed as sim_m. The coordinates of the m-th sample image are (x_m, y_m), meaning that the rotation angle of the target object in the m-th sample image around the x-axis is x_m and around the y-axis is y_m; the R value of the m-th sample image is then R_m = sqrt(x_m^2 + y_m^2).
  • a sample image with a similarity greater than 0.95 is determined to be a duplicate of the frontal image template of the target object, and a sample image with a similarity less than or equal to 0.8 is determined to be an invalid sample image: its similarity to the target object's image template is too low, so it can be identified as a sample image of a non-target object. A sample image with a similarity greater than 0.8 is determined to be an effective sample image: its similarity to the target object's image template is high, so it can be identified as a sample image of the target object. Therefore, the similarity between the target sample image and the first image template of the target object should lie within (0.8, 1), as shown in FIG. 3, a schematic diagram of the correspondence between similarity and the radius of the distribution circle provided by an embodiment of the application.
  • FIG. 4 is a schematic diagram of the third preset range provided by an embodiment of the application. In the rectangular coordinate system shown in FIG. 4, the ring of the shaded area represents the third preset range: the shaded area indicates the distribution area with an R value greater than or equal to 5 and less than or equal to 60, while the area outside the shaded area covers sample images with an R value less than 5 (similarity to the frontal image template of the target object greater than 0.95) and sample images with an R value greater than 60 (similarity to the frontal image template of the target object less than 0.8).
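A small sketch of the R value check; the bounds 5 and 60 come from the description (both ends are treated as inclusive here, an assumption since the text gives both [5, 60] and [5, 60)):

```python
import math

def r_value(x_deg, y_deg):
    # R = sqrt(x^2 + y^2): combined magnitude of the rotation around
    # the x-axis (longitudinal) and the y-axis (lateral)
    return math.hypot(x_deg, y_deg)

def in_third_preset_range(x_deg, y_deg, low=5.0, high=60.0):
    # True when the sample is neither a near-duplicate of the frontal
    # template (R < 5) nor too dissimilar from it (R > 60)
    return low <= r_value(x_deg, y_deg) <= high
```

For example, a face turned 30 degrees around one axis and 10 around the other qualifies, while a near-frontal sample (2, 1) is rejected as a duplicate of the frontal template.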
  • Step 103 Pre-replace each image template of the target object by using the target sample image to obtain at least one pre-replacement template set.
  • the replacement of each image template with the target sample image in step 103 is a pre-replacement: it is only used to calculate the uniformity parameter, and does not directly make the target sample image an image template.
  • each image template of the target object is respectively pre-replaced using the target sample image to obtain at least one pre-replacement template set.
  • the method further includes: when the number of image templates of the target object is less than a preset number, determining the target sample image of the target object as the image template of the target object.
  • the upper limit of the number of image templates of the target object can be a preset number. If the number of image templates of the target object is less than the preset number, the image templates of the target object can continue to increase: when the similarity is within the first preset range, the target sample image is directly used as an image template of the target object. If the number of image templates equals the preset number, the image templates cannot continue to increase, and it must be determined whether replacing some image template of the target object with the target sample image would produce a better result.
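This branching can be sketched as follows, with `PRESET_NUMBER` an assumed placeholder value for the preset number:

```python
PRESET_NUMBER = 8  # assumed upper limit on templates per target object

def admit_sample(templates, target_sample):
    # A permitted-learning sample is appended directly while the
    # template set is not yet full; once full, the pre-replacement
    # uniformity check of steps 103-105 decides instead
    if len(templates) < PRESET_NUMBER:
        templates.append(target_sample)
        return "added"
    return "evaluate-uniformity"
```

So registration starts by simply accumulating templates, and the uniformity machinery below only activates once the set is full.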
  • Step 104 Calculate the uniformity parameter of each pre-replacement template set in the at least one pre-replacement template set.
  • the uniformity parameter is used to indicate the uniformity of the image template distribution of the target object in the corresponding pre-replacement template set.
  • the uniformity parameter is used to indicate the uniformity of the image template distribution in the pre-replacement template set obtained after the target sample image replaces the image template of the target object.
  • the target object has n image templates, and n is an integer greater than 1.
  • the target sample image replaces each image template respectively, and the corresponding uniformity parameter can be obtained.
  • a total of n pre-replacement template sets and n uniformity parameters can be obtained.
  • the x-axis represents the longitudinal rotation angle of the target object (that is, the angle at which the target object rotates around the x-axis), and the y-axis represents the lateral rotation angle of the target object (that is, the angle at which the target object rotates around the y-axis).
  • two specific examples are given to illustrate how to calculate the uniformity parameter.
  • as a first example, the overall uniformity parameter can be calculated. Replacing each image template of the target object with the target sample image to obtain the corresponding uniformity parameter includes:
  • the method further includes: for the image templates of the target object before replacement, determining the number of image templates distributed in each area, and calculating the standard deviation/variance of the number of image templates across all areas as the preset uniformity value.
  • the ring-shaped shaded area is the range of the horizontal and vertical rotation angles of the target object, that is, the R value range [5, 60). The ring-shaped shaded area is equally divided into 8 fan-shaped areas (i.e., the at least one area); of course, it can also be divided into 6 or 4 fan-shaped areas, which can be set freely according to specific conditions and is not limited by this application.
  • the point on the positive x-axis represents the upward rotation angle of the human face, and the point on the negative x-axis represents the downward rotation angle of the face. The point on the positive y-axis represents the angle at which the face turns to the right, and the point on the negative y-axis represents the angle at which the face turns to the left. The oblique line in the first quadrant indicates the angle of the face turning to the upper right, the oblique line in the second quadrant the angle of turning to the lower right, the oblique line in the third quadrant the angle of turning to the lower left, and the oblique line in the fourth quadrant the angle of turning to the upper left. Alternatively, the positive direction may indicate the angle of downward and rightward rotation and the negative direction the angle of upward and leftward rotation; this application does not restrict this.
  • the preset uniformity value is the standard deviation calculated from the original image templates of the target object, that is, it is determined before the target sample image replaces any image template. Denoting the number of image templates in the t-th fan-shaped area as c_0t, the preset uniformity value std_0 is calculated according to formula 2:

    std_0 = sqrt( ((c_01 - avg_0)^2 + ... + (c_0T - avg_0)^2) / T )

  • T is the number of the at least one area, and avg_0 is the average number of image templates per area.
  • for the i-th image template: replace the i-th image template with the target sample image (that is, use the target sample image as the i-th image template), count the image templates in each fan-shaped area, and denote the number of image templates in the t-th fan-shaped area as c_it.
  • calculate the standard deviation std_i of the number of image templates over the 8 fan-shaped areas according to formula 3:

    std_i = sqrt( ((c_i1 - avg_i)^2 + ... + (c_iT - avg_i)^2) / T )

  • T is the number of the at least one area, and avg_i is the average number of image templates per area after the pre-replacement.
  • std_i (that is, the overall uniformity parameter) represents the uniformity of the image template distribution of the target object after the target sample image replaces the i-th image template of the target object. The smaller std_i, the better the uniformity of the image template distribution. If std_i is less than std_0, it means that after the target sample image replaces the i-th image template of the target object, the image template distribution is more uniform, and the target sample image can be used as the i-th image template in place of the original.
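The pre-replacement computation of std_i might look like the following sketch. The sector assignment via `atan2` and all names are assumptions; the description only fixes that the annulus is split into 8 equal sectors and that the standard deviation of per-sector counts is used:

```python
import math

def region_counts(angles, n_regions=8):
    # Assign each (x, y) rotation-angle pair to one of n_regions equal
    # sectors of the annulus and count the templates per sector
    counts = [0] * n_regions
    sector = 2 * math.pi / n_regions
    for x, y in angles:
        theta = math.atan2(y, x) % (2 * math.pi)
        counts[int(theta / sector) % n_regions] += 1
    return counts

def uniformity_std(counts):
    # Standard deviation of the per-sector template counts (formula 3)
    t = len(counts)
    mean = sum(counts) / t
    return math.sqrt(sum((c - mean) ** 2 for c in counts) / t)

def pre_replacement_stds(template_angles, sample_angle):
    # std_i for each pre-replacement set: swap template i for the
    # candidate sample and recompute the sector-count deviation
    stds = []
    for i in range(len(template_angles)):
        trial = list(template_angles)
        trial[i] = sample_angle
        stds.append(uniformity_std(region_counts(trial)))
    return stds
```

A perfectly even spread (one template per sector) gives a deviation of 0, and piling every template into one sector maximizes it, matching the "smaller is more uniform" reading above.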
  • as a second example, a local uniformity parameter can be calculated. Calculating the standard deviation/variance of the number of image templates in the at least one area as the uniformity parameter includes:
  • all areas in the first part sum to the first part, and all areas in the first part plus all areas in the second part cover the full value range of the horizontal and vertical rotation angles of the target object. The at least one area can also be divided into multiple parts: for example, into S parts, where S is an integer greater than 1. The standard deviation/variance of the number of image templates across all areas in the s-th part is calculated, and that value divided by the s-th factor is used as the s-th parameter of the uniformity parameter, yielding S parameters, where s is an integer in [1, S].
  • for example, the ring-shaped shaded area is divided into two ring areas, a first part and a second part. Combined with the standard deviation calculation process in the first example: before the target sample image replaces any image template, calculate, for the original image templates of the target object, the standard deviations std_01 and std_02 of the number of image templates over the areas of the first part and of the second part according to formula 4 and formula 5, respectively (each computed in the same way as std_0, restricted to the areas of that part).
  • for the i-th image template: replace the i-th image template with the target sample image (that is, use the target sample image as the i-th image template) and, combined with the calculation process of the standard deviation in the first example, calculate std_i1 and std_i2 for the first part and the second part according to formula 6 and formula 7, respectively.
  • the first parameter and the second parameter can be weighted: the first parameter is P_i1 = std_i1 / w_i1, where w_i1 represents the first factor; the larger w_i1 is, the more important the uniformity of the first part. Correspondingly, the second parameter is P_i2 = std_i2 / w_i2, where w_i2 represents the second factor.
  • the first parameter and the second parameter are the local uniformity parameters. std_i1 represents the uniformity of the distribution of the target object's image templates in the first part after the target sample image replaces the i-th image template of the target object; the smaller std_i1, the more uniform the distribution in the first part. std_i2 represents the same for the second part; the smaller std_i2, the more uniform the distribution of the target object's image templates in the second part.
  • if std_i1 is less than std_01, it means that after the target sample image replaces the i-th image template of the target object, the image templates of the target object are more evenly distributed in the first part, and the target sample image can be used as the i-th image template in place of the original; and/or, if std_i2 is less than std_02, the image templates of the target object are more evenly distributed in the second part after the replacement.
  • the local uniformity parameter divided by the factor can also be used for the judgment. For example, if P_i1 is less than std_01, it means that after the target sample image replaces the i-th image template of the target object, the image templates of the target object are more evenly distributed in the first part, and the target sample image can be used as the i-th image template in place of the original; and/or, if P_i2 is less than std_02, the image templates of the target object are more evenly distributed in the second part after the replacement.
  • optionally, the overall uniformity parameter can be judged first and then the local uniformity parameter: if, after the target sample image replaces the i-th image template of the target object, the overall uniformity parameter equals the preset uniformity value, it is further judged whether the local uniformity parameter is less than the first uniformity value or the second uniformity value. If the overall uniformity parameter is less than the preset uniformity value, the replacement can be made directly; if it is greater than the preset uniformity value, the target sample image cannot be used as an image template of the target object.
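The weighted local parameters and the overall-then-local decision order can be sketched as follows (function names and the `local_ok` flag are illustrative, not from the application):

```python
def local_params(std_i1, std_i2, w_i1, w_i2):
    # P_i1 = std_i1 / w_i1 and P_i2 = std_i2 / w_i2; per the
    # description, a larger factor marks that part as more important
    return std_i1 / w_i1, std_i2 / w_i2

def decide(overall_std, preset_std, local_ok=False):
    # Decision order from the description: the overall parameter is
    # judged first; the local parameters act only as a tie-breaker
    # when the overall parameter equals the preset value
    if overall_std < preset_std:
        return "replace"
    if overall_std == preset_std and local_ok:
        return "replace"
    return "reject"
```

`local_ok` summarizes "the local uniformity parameter is less than the first or second uniformity value" as a single boolean for brevity.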
  • Step 105 If the uniformity parameter of the pre-replacement template set obtained after the target object replaces the second image template is less than the preset uniformity value, use the target sample image as the image template of the target object and replace the second image template.
  • the second image template belongs to at least one image template of the target object.
  • the second image template represents an image template that can be used for replacement.
  • Suppose both the first image template and the second image template satisfy std_1 < std_0 and std_2 < std_0; that is, the image templates of the target object are distributed more evenly after the target sample image replaces either the first image template or the second image template of the target object. Both the first image template and the second image template can then serve as the template to be replaced. In this case, the first image template and the second image template are added to a replacement queue, and one image template must be selected from the multiple image templates in the queue for replacement.
  • Optionally, the method further includes: when at least one of the uniformity parameters is less than the preset uniformity value, replacing with the target sample image the image template of the target object whose replacement yields the smallest uniformity parameter. For example, if the template distribution is more uniform after the target sample image replaces the second image template than after it replaces the first image template, the second image template is replaced with the target sample image. Note that the overall uniformity is used here only for illustration; the local uniformity can also be used for the judgment, and this application does not limit it.
  • Optionally, the method further includes: among the image templates whose replacement by the target sample image yields a uniformity parameter less than the preset uniformity value, replacing with the target sample image the image template with the smallest number of matching hits with the target object during image recognition.
  • Optionally, the method further includes: among the image templates whose replacement by the target sample image yields a uniformity parameter less than the preset uniformity value, replacing with the target sample image the image template that has been in use the longest. For example, if the first image template has served as an image template of the target object for 5 days and the second image template for 1 day, the first image template is replaced with the target sample image.
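The two tie-breaking rules above (fewest matching hits first, then longest time in service) can be sketched as follows; the field names `hits` and `age_days` are illustrative assumptions, not terms from the patent:

```python
def pick_template_to_replace(candidates):
    """From the replaceable templates in the queue, prefer the one with
    the fewest matching hits; break ties by the longest time in service."""
    min_hits = min(c['hits'] for c in candidates)
    fewest = [c for c in candidates if c['hits'] == min_hits]
    return max(fewest, key=lambda c: c['age_days'])
```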
  • Optionally, the method further includes: converting the two-dimensional target sample image into three-dimensional point cloud information, and calculating, from the three-dimensional point cloud information and through the extrinsic rotation matrix, the horizontal rotation angle and the vertical rotation angle of the target object in the target sample image.
  • The three-dimensional point cloud information indicates the position coordinates and depth information of each pixel in the target sample image, and can therefore indicate the spatial position of each pixel.
  • The extrinsic rotation matrix is a parameter of the device that captures the target sample image. From the three-dimensional point cloud information and the extrinsic rotation matrix of the capture device, the horizontal and vertical rotation angles of the target object in the target sample image can be calculated. In the same way, the horizontal and vertical rotation angles of the target object in the image corresponding to each image template of the target object can be calculated, which is not repeated here.
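A minimal sketch of this step, assuming a pinhole camera model for the back-projection and a Z-Y-X Euler decomposition of the rotation matrix (the patent specifies neither convention, so both are assumptions):

```python
import math

def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with its depth into a 3-D camera-frame
    point; fx, fy, cx, cy are assumed pinhole calibration values."""
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

def yaw_pitch_from_rotation(R):
    """Extract the horizontal (yaw) and vertical (pitch) rotation angles,
    in degrees, from a 3x3 rotation matrix via the Z-Y-X Euler convention."""
    yaw = math.degrees(math.atan2(R[1][0], R[0][0]))
    pitch = math.degrees(math.asin(-R[2][0]))
    return yaw, pitch
```

The point cloud gives every pixel a spatial position, and the angle pair summarizes the pose of the target object in that frame.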
  • Optionally, a judgment may be made on multiple collected images, and the target sample image of the target object may be selected from among them.
  • Optionally, the method further includes: calculating, for each of the at least one image, the first average value of its similarity to the image templates of the target object; and determining the image whose first average similarity falls within the second preset range as the target sample image of the target object.
  • For example, the similarity may take values in [0, 1], where a similarity of 0 means completely different and a similarity of 1 means identical. Combined with the corresponding example in Figure 2, when the similarity is greater than 0.8 the sample image can be determined to be a sample image of the target object; the second preset range may therefore be (0.8, 1). Of course, other values may be used, and this application does not limit it.
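The selection rule above can be sketched as follows; `similarity` stands in for whatever matcher produces the [0, 1] score, and the default (0.8, 1) bounds are the assumed second preset range:

```python
def select_target_sample(images, templates, similarity, low=0.8, high=1.0):
    """Return the first image whose mean similarity to the object's
    templates (the 'first average value') lies inside (low, high)."""
    for img in images:
        mean_sim = sum(similarity(img, t) for t in templates) / len(templates)
        if low < mean_sim < high:
            return img
    return None
```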
  • In this way, this application can continuously increase the number of image templates of the target object until the number reaches its upper limit; after the upper limit is reached, image templates can still be replaced, so that the template distribution becomes more uniform and the recognition accuracy is improved.
  • In addition, the image templates of the target object have better real-time performance.
  • The angle diversity of the image templates is also greatly improved: after updating with the image templates of this application, the FRR (False Rejection Rate) for the same sample is significantly reduced and the recognition rate is significantly improved, compared with before using the image templates of this application.
  • The image template can be updated during the registration process of the target object (for example, when the user performs face registration) or during the identity verification process (for example, when the user performs face recognition).
  • two specific application scenarios are listed for description. Of course, this is only an exemplary description, which does not mean that the application is limited to this.
  • In the first application scenario, identity verification of the target object can be performed, and the method further includes: after obtaining the target sample image by image collection of the target object, matching the target sample image against the at least one image template of the target object; and after the target sample image successfully matches any image template of the target object, determining that the identity verification of the target object succeeds.
  • That is, the target sample image is obtained by image collection of the target object and is then matched; if it matches an image template of the target object, the verification succeeds. Of course, verification may also fail.
  • When an object to be verified is authenticated, the object to be verified is imaged to obtain a collected image, and the collected image is matched against the image templates of the target object. If every image template of the target object fails to match the collected image of the object to be verified, the object to be verified is not the target object, and the identity verification fails.
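In code, the verification decision reduces to an any-match test; the `match` callable and the 0.8 cutoff are illustrative assumptions, not values fixed by the patent:

```python
def verify_identity(sample, templates, match, threshold=0.8):
    """Identity verification succeeds if the collected sample matches
    any stored image template of the target object."""
    return any(match(sample, t) >= threshold for t in templates)
```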
  • this is only an exemplary description, which does not mean that the application is limited to this.
  • In the second application scenario, the target object may be registered, and the method further includes: after obtaining the target sample image by image collection of the target object, determining the target sample image as an image template of the target object, storing the image template of the target object, and completing the registration of the target object.
  • That is, the target sample image is obtained by image acquisition of the target object, and the target sample image is determined as an image template of the target object.
  • During registration, multiple sample images of the target object are collected; if collection continues, it is necessary to determine, according to steps 102-105, whether a newly collected sample image can replace an existing image template.
  • the first application scenario and the second application scenario are specific application scenarios of the image template update method of this application.
  • That is, the image template update may be performed during the identity verification process of the target object (for example, face recognition) or during the registration process of the target object (for example, face registration); the image template is updated automatically during recognition or registration, improving the uniformity of the image template distribution.
  • In summary, this application determines the similarity between the target sample image of the target object and the first image template. If the similarity is within the first preset range, the target sample image is determined to be a sample image permitted for learning, which reduces repetitive image templates among the image templates of the target object. The target sample image is then used to pre-replace each image template of the target object and the corresponding uniformity parameters are obtained, to determine whether the target sample image can replace an image template of the target object. This ensures that the image templates of the target object are evenly distributed, so that the target object can be recognized from all angles, improving the accuracy and recognition rate of image recognition.
  • FIG. 5 is a logical block diagram of an image template update method provided by an embodiment of the application. As shown in FIG. 5, the method includes the following steps:
  • Step 501 Use the image template of the target object to recognize the current target sample image.
  • If recognition succeeds, the target sample image is a sample image of the target object and step 502 is executed; if recognition fails, the process ends.
  • the target object may be a human face.
  • During image collection, an infrared camera, a speckle projector, and a fill light can be used.
  • When the infrared camera is used to capture infrared images, turn on the infrared camera and the fill light, turn off the speckle projector, and collect the infrared images; when the speckle projector is used to collect speckle images, turn on the speckle projector, turn off the infrared camera and the fill light, and collect the speckle images. Depth information is obtained from the collected speckle images, the two-dimensional infrared image is converted into three-dimensional point cloud information, and the angle of the current point cloud is calculated through the camera's extrinsic rotation matrix.
  • Step 502: Increment by 1 the matching-hit count of the image template that was successfully recognized.
  • Step 503: Determine whether the target sample image is allowed to learn; if the target sample image is allowed to learn, execute step 504.
  • Step 504: Use the target sample image to pre-replace each image template of the target object and obtain the corresponding overall uniformity parameter.
  • Step 505 Determine whether the overall uniformity is improved.
  • Whether the global uniformity is improved can be judged according to the first example in step 102: if std_i is less than std_0, the global uniformity is better after the target sample image replaces the i-th image template of the target object. If the global uniformity is not improved, execute step 506; if the global uniformity is improved, execute step 508.
  • Step 506: Use the target sample image to pre-replace each image template of the target object and obtain the corresponding local uniformity parameter.
  • Step 507 Determine whether the local uniformity parameter is improved.
  • Whether the local uniformity is improved can be judged according to the second example in step 102: if P_i1 is less than std_01 and/or P_i2 is less than std_02, the local uniformity is better after the target sample image replaces the i-th image template of the target object. If the local uniformity is not improved, the process ends; if the local uniformity is improved, execute step 508.
  • Step 508: Determine the image templates with the smallest number of matching hits among the at least one replaceable image template of the target object.
  • Step 509: Among the image templates of the target object with the smallest number of matching hits, take the image template with the longest use time as the second image template.
  • The at least one replaceable image template forms a replacement queue. If there is only one image template in the replacement queue, it is directly replaced with the target sample image. If there are multiple image templates in the replacement queue, they are filtered according to step 508; if, after the selection in step 508, two image templates in the replacement queue have the same, smallest number of matching hits, a further selection between the two is made according to step 509.
  • Step 510: Use the target sample image as an image template of the target object and replace the second image template.
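Steps 501-510 above can be condensed into the following sketch; `rec` bundles the recognizer state, and every helper name (`recognize`, `allowed_to_learn`, `global_std`, `local_std_ok`, `replace`) is an assumption standing in for the corresponding step, not an API defined by the patent:

```python
def update_templates(sample, rec):
    """One pass of the Figure-5 update flow over a recognized sample."""
    hit = rec.recognize(sample)                      # step 501
    if hit is None:
        return                                       # recognition failed: end
    hit.hits += 1                                    # step 502
    if not rec.allowed_to_learn(sample):             # step 503
        return
    replace_queue = []
    for i, tpl in enumerate(rec.templates):          # steps 504-507
        if rec.global_std(sample, i) < rec.std0 or rec.local_std_ok(sample, i):
            replace_queue.append(tpl)
    if not replace_queue:
        return                                       # no uniformity gain: end
    min_hits = min(t.hits for t in replace_queue)    # step 508
    fewest = [t for t in replace_queue if t.hits == min_hits]
    victim = max(fewest, key=lambda t: t.age)        # step 509
    rec.replace(victim, sample)                      # step 510
```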
  • In this way, this application determines the similarity between the target sample image of the target object and the first image template. If the similarity is within the first preset range, the target sample image is determined to be a sample image permitted for learning, which reduces repetitive image templates among the image templates of the target object. The target sample image is then used to pre-replace each image template of the target object and the corresponding uniformity parameters are obtained, to determine whether the target sample image can replace an image template of the target object. This ensures that the image templates of the target object are evenly distributed, so that the target object can be recognized from all angles, improving the accuracy and recognition rate of image recognition.
  • the image template updating device 60 includes: a permission learning module 601, a uniformity parameter module 602, and an image template management module 603;
  • the permission learning module 601 is used to determine that the similarity between the target sample image of the target object and the first image template of the target object is within a first preset range
  • The uniformity parameter module 602 is used to, when the number of image templates of the target object is equal to the preset number, pre-replace each image template of the target object with the target sample image and obtain the corresponding uniformity parameter; the uniformity parameter is used to indicate how uniformly the image templates of the target object are distributed after the target sample image replaces an image template.
  • The image template management module 603 is configured to, if the uniformity parameter obtained after the target sample image replaces the second image template is less than the preset uniformity value, use the target sample image as an image template of the target object and replace the second image template.
  • FIG. 7 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the application.
  • The electronic device 70 includes: at least one processor 701, a memory 702, and a bus 703; the at least one processor 701 and the memory 702 are connected through the bus 703 and communicate with each other.
  • The memory 702, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the image template update method in the embodiments of the present application.
  • The at least one processor 701 executes the various functional applications and data processing of the server by running the non-volatile software programs, instructions, and modules stored in the memory 702, that is, it implements the image template update method of the above method embodiments.
  • The memory 702 may include a program storage area and a data storage area, where the program storage area can store an operating system and an application program required by at least one function, and the data storage area can store data created according to the use of the image template updating device, etc.
  • The memory 702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • The memory 702 may optionally include memory located remotely from the at least one processor 701, and these remote memories may be connected to the image template updating device via a network.
  • networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • At least one program is stored on the memory 702, and when at least one processor 701 executes the program stored on the memory 702, the method described in the embodiment of the present application is implemented.
  • the embodiment of the present application provides a storage medium with a computer program stored on the storage medium, and when at least one processor executes the computer program, the method described in the embodiment of the present application is implemented.
  • the electronic devices of the embodiments of the present application exist in various forms, including but not limited to:
  • Mobile communication equipment: this type of equipment is characterized by mobile communication functions, and its main goal is to provide voice and data communications.
  • Such terminals include: smart phones (such as iPhone), multimedia phones, functional phones, and low-end phones.
  • Ultra-mobile personal computer equipment: this type of equipment belongs to the category of personal computers, has computing and processing functions, and generally also has mobile Internet access.
  • Such terminals include: PDA, MID and UMPC devices, such as iPad.
  • Portable entertainment equipment: this type of equipment can display and play multimedia content.
  • Such devices include: audio, video players (such as iPod), handheld game consoles, e-books, as well as smart toys and portable car navigation devices.
  • Server: a device that provides computing services.
  • the structure of a server includes at least one processor 810, hard disk, memory, system bus, etc.
  • A server is similar in architecture to a general-purpose computer, but because it must provide highly reliable services, it has higher requirements in terms of processing capability, stability, reliability, security, scalability, and manageability.
  • The improvement of a technology can be clearly distinguished as either a hardware improvement (for example, an improvement in circuit structures such as diodes, transistors, and switches) or a software improvement (an improvement in a method flow). However, with the development of technology, the improvement of many of today's method flows can be regarded as a direct improvement of the hardware circuit structure. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module.
  • For example, a programmable logic device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic function is determined by the user's programming of the device. That programming is written in a specific language called a hardware description language (HDL), of which there is not just one kind but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), HDCal, JHDL, Lava, Lola, MyHDL, PALASM, and RHDL; at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used.
  • The controller can be implemented in any suitable manner. For example, the controller can take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by that (micro)processor, logic gates, switches, application-specific integrated circuits (ASICs), programmable logic controllers, or embedded microcontrollers. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller can also be implemented as part of the memory's control logic.
  • In addition to implementing the controller purely in computer-readable program code, it is entirely possible to program the method steps so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be regarded as a hardware component, and the devices included in it for realizing various functions can also be regarded as structures within the hardware component. Or, the devices for realizing various functions can even be regarded both as software modules implementing the method and as structures within the hardware component.
  • a typical implementation device is a computer.
  • the computer may be, for example, a personal computer, a laptop computer, a cell phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or Any combination of these devices.
  • This application can be provided as a method, a system, or a computer program product. Therefore, this application may take the form of a pure hardware embodiment, a pure software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on at least one computer-usable storage medium (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
  • These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing, such that the instructions executed on the computer or other programmable equipment provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • the computing device includes at least one processor (CPU), input/output interface, network interface, and memory.
  • The memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology.
  • the information can be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media does not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
  • This application may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform specific tasks or implement specific abstract data types. This application can also be practiced in distributed computing environments, in which tasks are executed by remote processing devices connected through a communication network, and program modules can be located in local and remote computer storage media including storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed are an image template update method, a device, and a storage medium. The image template update method comprises: collecting an image of a target object to obtain a target sample image of the target object (101); determining that the similarity between the target sample image of the target object and a first image template of the target object is within a first preset range (102); using the target sample image to pre-replace each image template of the target object so as to obtain at least one pre-replacement template set (103); calculating a uniformity parameter of each pre-replacement template set in the at least one pre-replacement template set (104); and, if the uniformity parameter of the pre-replacement template set obtained after the second image template is replaced is less than a preset uniformity value, using the target sample image as an image template of the target object and replacing the second image template. The described method ensures a uniform distribution of the image templates of a target object and improves the accuracy and recognition rate of image recognition.
PCT/CN2019/108656 2019-09-27 2019-09-27 Procédé de mise à jour de modèle d'image, dispositif et support de stockage WO2021056450A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980002050.4A CN112912889B (zh) 2019-09-27 2019-09-27 图像模板的更新方法、设备及存储介质
PCT/CN2019/108656 WO2021056450A1 (fr) 2019-09-27 2019-09-27 Procédé de mise à jour de modèle d'image, dispositif et support de stockage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/108656 WO2021056450A1 (fr) 2019-09-27 2019-09-27 Procédé de mise à jour de modèle d'image, dispositif et support de stockage

Publications (1)

Publication Number Publication Date
WO2021056450A1 true WO2021056450A1 (fr) 2021-04-01

Family

ID=75165050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/108656 WO2021056450A1 (fr) 2019-09-27 2019-09-27 Procédé de mise à jour de modèle d'image, dispositif et support de stockage

Country Status (2)

Country Link
CN (1) CN112912889B (fr)
WO (1) WO2021056450A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139612A (zh) * 2021-05-07 2021-07-20 上海商汤临港智能科技有限公司 图像分类方法、分类网络的训练方法和相关产品
CN114401440A (zh) * 2021-12-14 2022-04-26 北京达佳互联信息技术有限公司 视频剪辑及剪辑模型生成方法、装置、设备、程序和介质
CN115937629A (zh) * 2022-12-02 2023-04-07 北京小米移动软件有限公司 模板图像的更新方法、更新装置、可读存储介质及芯片

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106021606A (zh) * 2016-06-21 2016-10-12 广东欧珀移动通信有限公司 一种指纹模板更新方法及终端设备
WO2016206081A1 (fr) * 2015-06-26 2016-12-29 宇龙计算机通信科技(深圳)有限公司 Procédé et dispositif pour mettre à jour un modèle d'empreinte digitale
CN106372609A (zh) * 2016-09-05 2017-02-01 广东欧珀移动通信有限公司 指纹模板更新方法、装置及终端设备
US20170116401A1 (en) * 2015-10-21 2017-04-27 Samsung Electronics Co., Ltd. Biometric authentication method and apparatus
CN106682068A (zh) * 2015-11-11 2017-05-17 三星电子株式会社 用于适应性更新用于用户认证的注册数据库的方法和设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016206081A1 (fr) * 2015-06-26 2016-12-29 宇龙计算机通信科技(深圳)有限公司 Procédé et dispositif pour mettre à jour un modèle d'empreinte digitale
US20170116401A1 (en) * 2015-10-21 2017-04-27 Samsung Electronics Co., Ltd. Biometric authentication method and apparatus
CN106682068A (zh) * 2015-11-11 2017-05-17 三星电子株式会社 用于适应性更新用于用户认证的注册数据库的方法和设备
CN106021606A (zh) * 2016-06-21 2016-10-12 广东欧珀移动通信有限公司 一种指纹模板更新方法及终端设备
CN106372609A (zh) * 2016-09-05 2017-02-01 广东欧珀移动通信有限公司 指纹模板更新方法、装置及终端设备

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139612A (zh) * 2021-05-07 2021-07-20 上海商汤临港智能科技有限公司 图像分类方法、分类网络的训练方法和相关产品
CN114401440A (zh) * 2021-12-14 2022-04-26 北京达佳互联信息技术有限公司 视频剪辑及剪辑模型生成方法、装置、设备、程序和介质
CN115937629A (zh) * 2022-12-02 2023-04-07 北京小米移动软件有限公司 模板图像的更新方法、更新装置、可读存储介质及芯片
CN115937629B (zh) * 2022-12-02 2023-08-29 北京小米移动软件有限公司 模板图像的更新方法、更新装置、可读存储介质及芯片

Also Published As

Publication number Publication date
CN112912889B (zh) 2024-03-26
CN112912889A (zh) 2021-06-04

Similar Documents

Publication Publication Date Title
CN111179311B (zh) 多目标跟踪方法、装置及电子设备
TWI753271B (zh) 資源轉移方法、裝置及系統
US10679146B2 (en) Touch classification
US10043308B2 (en) Image processing method and apparatus for three-dimensional reconstruction
US9811721B2 (en) Three-dimensional hand tracking using depth sequences
WO2020078017A1 (fr) Procédé et appareil de reconnaissance d'écriture manuscrite dans l'air ainsi que dispositif et support d'informations lisible par ordinateur
KR102316230B1 (ko) 이미지 처리 방법 및 장치
US10747990B2 (en) Payment method, apparatus, and system
WO2017063530A1 (fr) Procédé et système de reconnaissance d'informations de mouvement
CN103455794B (zh) 一种基于帧融合技术的动态手势识别方法
JP2018505495A (ja) 指紋重複領域の面積算出方法、それを行う電子機器、コンピュータプログラム、及び、記録媒体
TW202011264A (zh) 資訊的檢測方法、裝置及設備
WO2021147113A1 (fr) Procédé d'identification de catégorie sémantique de plan et appareil de traitement de données d'image
WO2021056450A1 (fr) Procédé de mise à jour de modèle d'image, dispositif et support de stockage
Meus et al. Embedded vision system for pedestrian detection based on HOG+ SVM and use of motion information implemented in Zynq heterogeneous device
KR102421604B1 (ko) 이미지 처리 방법, 장치 및 전자 기기
KR20220098312A (ko) 이미지 내 관련 대상 검출 방법, 장치, 디바이스 및 기록 매체
US11034028B2 (en) Pose determining method for mobile robot and apparatus and mobile robot thereof
CN111753583A (zh) 一种识别方法及装置
US10551195B2 (en) Portable device with improved sensor position change detection
US20230281867A1 (en) Methods performed by electronic devices, electronic devices, and storage media
CN111967365A (zh) 影像连接点的提取方法和装置
Zhang et al. Real-time dynamic SLAM using moving probability based on IMU and segmentation
CN103514434A (zh) 一种图像识别方法及装置
CN115657925B (zh) 触摸输入识别系统及方法、设备、计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19946689

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19946689

Country of ref document: EP

Kind code of ref document: A1