CN112597978A - Fingerprint matching method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112597978A
CN112597978A (application CN202110232239.1A)
Authority
CN
China
Prior art keywords
template
fingerprint
sift feature
image
feature points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110232239.1A
Other languages
Chinese (zh)
Other versions
CN112597978B (en)
Inventor
杨浩
徐洪伟
涂强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fushi Technology Co Ltd
Original Assignee
Shenzhen Fushi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fushi Technology Co Ltd filed Critical Shenzhen Fushi Technology Co Ltd
Priority to CN202110232239.1A priority Critical patent/CN112597978B/en
Publication of CN112597978A publication Critical patent/CN112597978A/en
Application granted granted Critical
Publication of CN112597978B publication Critical patent/CN112597978B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1365 - Matching; Classification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1347 - Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The application provides a fingerprint matching method and device, electronic equipment and a storage medium. The fingerprint matching method comprises: acquiring a sample image; roughly matching the sample image with each template image in each pre-stored fingerprint template, and determining the number of template images in each fingerprint template that satisfy a preset similarity condition with the sample image; taking the fingerprint template with the largest number of such images as the fingerprint template to be matched; and performing fingerprint matching between the sample image and the template images in the fingerprint template to be matched. Because the fingerprint template to be matched is first determined through rough matching, and the sample image is then matched only against the template images in that fingerprint template, fewer template images take part in fingerprint matching, that is, the number of fingerprint matching operations is reduced, so the fingerprint matching speed can be improved.

Description

Fingerprint matching method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to a fingerprint matching method and apparatus, an electronic device, and a storage medium.
Background
Fingerprint identification technology performs identification on collected fingerprint image information. The rough process of fingerprint identification is as follows: a fingerprint image is first collected, the collected fingerprint image is then calibrated, and the fingerprint is enhanced so that fingerprint features can be extracted conveniently. After the features are extracted, the similarity between the collected fingerprint image and a pre-stored template image is judged by a fingerprint matching method; if they are similar, the collected fingerprint image and the template image are judged to be fingerprint images of the same person, otherwise the two fingerprints are judged not to come from the same person.
The existing fingerprint matching method generally matches the collected fingerprint image (i.e. the sample image) with the template images in each template one by one. However, as the number of templates increases, the number of matching operations also increases, which makes the fingerprint matching process too time-consuming and reduces the fingerprint matching speed.
Disclosure of Invention
Therefore, the embodiments of the application provide a fingerprint matching method, a fingerprint matching device, an electronic device and a storage medium that can improve the fingerprint matching speed.
In a first aspect, a fingerprint matching method is provided, including:
acquiring a sample image;
roughly matching the sample image with the template images in each pre-stored fingerprint template, and determining the number of template images in each fingerprint template that satisfy a preset similarity condition with the sample image;
taking the fingerprint template with the largest number of images as a fingerprint template to be matched;
and performing fingerprint matching on the sample image and a template image in the fingerprint template to be matched.
Preferably, the roughly matching the sample image with a template image in a pre-stored fingerprint template, and determining the number of template images in each fingerprint template that satisfy a preset similarity condition with the sample image, includes:
determining first feature points in the sample image and second feature points in the template image, wherein the first feature points and the second feature points in the template image are in one-to-one correspondence;
determining the similarity of the sample image and the template image according to a first Hamming distance between the first characteristic point and a second characteristic point in the template image;
and determining the number of the template images meeting the preset similar condition with the sample image in the fingerprint template according to the template images with the similarity larger than the preset similarity threshold.
Preferably, the determining the first feature point in the sample image and the second feature point in each template image includes:
extracting first SIFT feature points from the sample image, and determining the first feature points from the first SIFT feature points according to a preset rule;
and extracting second SIFT feature points from the template image, and determining second feature points from the second SIFT feature points according to a second Hamming distance between the first feature points and the second SIFT feature points.
Preferably, the fingerprint matching of the sample image and the template image in the fingerprint template to be matched includes:
extracting a third SIFT feature point from the sample image, and extracting a fourth SIFT feature point from the template image in the fingerprint template to be matched;
establishing a corresponding relation between the third SIFT feature point and a fourth SIFT feature point according to a third Hamming distance between the third SIFT feature point and the fourth SIFT feature point;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity between the third SIFT feature point and the fourth SIFT feature point with the corresponding relationship.
Preferably, the establishing a corresponding relationship between the third SIFT feature point and the fourth SIFT feature point according to a third hamming distance between the third SIFT feature point and the fourth SIFT feature point includes:
calculating a first hash value of the third SIFT feature point according to a preset hash function;
grouping the third SIFT feature points according to the first hash value to obtain at least two first groups;
calculating a second hash value of the fourth SIFT feature point according to the hash function;
grouping the fourth SIFT feature points according to the second hash value to obtain at least two second groups, wherein one first group corresponds to one second group;
and establishing a corresponding relation between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping according to a third Hamming distance between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping.
Preferably, before the establishing of the corresponding relationship between the third SIFT feature point and the fourth SIFT feature point according to the third hamming distance between the third SIFT feature point and the fourth SIFT feature point, the fingerprint matching method further includes:
establishing a data structure of the sample image according to the first hash value, and establishing a data structure of the template image according to the second hash value;
and determining the corresponding relation between the first group and the second group according to the data structure of the sample image and the data structure of the template image.
Preferably, the fingerprint matching of the sample image and the template image in the fingerprint template to be matched includes:
taking the pixel points with the gray values larger than the preset value in the sample image as white points, and taking the pixel points with the gray values smaller than the preset value in the sample image as black points;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity of the white point and the corresponding point in the template image and the similarity of the black point and the corresponding point in the template image.
In a second aspect, there is provided a fingerprint matching apparatus, including:
the acquisition module is used for acquiring a sample image;
the determining module is used for roughly matching the sample image with each template image in each pre-stored fingerprint template and determining the number of the template images meeting the preset similar conditions with the sample image in each fingerprint template;
the computing module is used for taking the fingerprint template with the largest number of images as a fingerprint template to be matched;
and the matching module is used for performing fingerprint matching on the sample image and the template image in the fingerprint template to be matched.
Preferably, the determining module is specifically configured to:
determining first feature points in the sample images and second feature points in each template image, wherein the first feature points correspond to the second feature points in each template image one to one;
determining the similarity of the sample image and each template image according to a first Hamming distance between the first characteristic point and a second characteristic point in each template image;
and determining the number of the template images meeting preset similar conditions with the sample image in each fingerprint template according to the template images with the similarity larger than a preset similar threshold.
Preferably, the determining module is further specifically configured to:
extracting first SIFT feature points from the sample image, and determining the first feature points from the first SIFT feature points according to a preset rule;
and extracting second SIFT feature points from each template image, and determining second feature points from the second SIFT feature points according to a second Hamming distance between the first feature points and the second SIFT feature points.
Preferably, the matching module is specifically configured to:
extracting a third SIFT feature point from the sample image, and extracting a fourth SIFT feature point from the template image in the fingerprint template to be matched;
establishing a corresponding relation between the third SIFT feature point and a fourth SIFT feature point according to a third Hamming distance between the third SIFT feature point and the fourth SIFT feature point;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity between the third SIFT feature point and the fourth SIFT feature point with the corresponding relationship.
Preferably, the matching module is further specifically configured to:
calculating a first hash value of the third SIFT feature point according to a preset hash function;
grouping the third SIFT feature points according to the first hash value to obtain at least two first groups;
calculating a second hash value of the fourth SIFT feature point according to the hash function;
grouping the fourth SIFT feature points according to the second hash value to obtain at least two second groups, wherein one first group corresponds to one second group;
and establishing a corresponding relation between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping according to a third Hamming distance between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping.
Preferably, the matching module is further specifically configured to:
establishing a data structure of the sample image according to the first hash value, and establishing a data structure of the template image according to the second hash value;
and determining the corresponding relation between the first group and the second group according to the data structure of the sample image and the data structure of the template image.
Preferably, the matching module is further specifically configured to:
taking the pixel points with the gray values larger than the preset value in the sample image as white points, and taking the pixel points with the gray values smaller than the preset value in the sample image as black points;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity of the white point and the corresponding point in the template image and the similarity of the black point and the corresponding point in the template image.
In a third aspect, an electronic device is provided, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the fingerprint matching method according to the first aspect when executing the computer program.
In a fourth aspect, a computer-readable storage medium is provided, the computer-readable storage medium storing a computer program which, when executed by a processor, implements the fingerprint matching method according to the first aspect as described above.
Compared with the prior art, the embodiment of the application has the advantages that: acquiring a sample image, roughly matching the sample image with each template image in each pre-stored fingerprint template, and determining the number of template images meeting preset similar conditions with the sample image in each fingerprint template; taking the fingerprint template with the largest number of images as a fingerprint template to be matched; and fingerprint matching is carried out on the sample image and the template image in the fingerprint template to be matched. Because the fingerprint template to be matched is determined through rough matching, and then the sample image is subjected to fingerprint matching with the template image in the fingerprint template to be matched, the template image for fingerprint matching can be reduced, namely the number of times of fingerprint matching is reduced, so that the fingerprint matching speed can be improved.
Drawings
Fig. 1 is a schematic flowchart illustrating an implementation process of a fingerprint matching method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of feature point matching provided by an embodiment of the present application;
FIG. 3 is a diagram of a data structure provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a coarse matching process provided in an embodiment of the present application;
FIG. 5 is a flow chart of similarity calculation provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a fingerprint matching process provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a fingerprint matching apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The following detailed description of the preferred embodiments of the present invention, taken in conjunction with the accompanying drawings, will make the advantages and features of the invention easier for those skilled in the art to understand, and will thus define the scope of the invention more clearly. It is to be understood that the described embodiments are merely some embodiments of the invention, not all of them.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
The existing fingerprint matching method generally matches a sample image with the template images in each template one by one; as the number of templates increases, the number of matching operations increases accordingly, so the fingerprint matching process takes too much time and the fingerprint matching speed is reduced.
Therefore, the application provides a fingerprint matching method. A sample image is first acquired and roughly matched with the template images in each pre-stored fingerprint template, and the number of template images in each fingerprint template that satisfy a preset similarity condition with the sample image is determined. The fingerprint template with the largest number of such images is taken as the fingerprint template to be matched, and the sample image is then fingerprint-matched against the template images in that fingerprint template. In this way, fewer template images take part in fingerprint matching, that is, the number of fingerprint matching operations is reduced, and the fingerprint matching speed is improved.
The fingerprint matching method provided in the present application is exemplified below.
The fingerprint matching method provided by the embodiment of the application is applied to electronic equipment, and the electronic equipment can be a mobile phone, a notebook, a palm computer, wearable equipment and the like.
Referring to fig. 1, a fingerprint matching method according to an embodiment of the present application includes:
s101: a sample image is acquired.
Specifically, the sample image is a fingerprint image to be verified input by a user, and the sample image may be collected by a fingerprint collector on the electronic device.
S102: and performing rough matching on the sample image and template images in pre-stored fingerprint templates, and determining the number of the template images meeting preset similar conditions with the sample image in each fingerprint template.
The template images in each fingerprint template are collected when a user registers a fingerprint and are stored in a fingerprint database of the electronic device. The fingerprint templates may belong to the same user or to different users. Each fingerprint template corresponds to one finger; for example, the fingerprint templates may include a fingerprint template of finger 1, a fingerprint template of finger 2, a fingerprint template of finger 3 and a fingerprint template of finger 4. Each fingerprint template comprises a plurality of template images acquired in separate acquisitions.
In a possible implementation manner, first Scale-Invariant Feature Transform (SIFT) feature points are extracted from the sample image, and the first feature points are determined from the first SIFT feature points according to a preset rule. The first SIFT feature points are pixel points in the sample image. The preset rule may be to take every N-th point from the first SIFT feature points in sequence; for example, if the number of first SIFT feature points is 350 and every 5th point is taken, 70 first feature points are obtained. The preset rule may also be to take points randomly from the first SIFT feature points.
After the first feature points are determined, second SIFT feature points are extracted from each template image of each fingerprint template; the second SIFT feature points are pixel points in the template images. For each first feature point, the second Hamming distance to each second SIFT feature point in each template image is calculated in turn. For each template image, the second SIFT feature point with the minimum second Hamming distance is taken as the second feature point corresponding to the first feature point; this second SIFT feature point is also the feature point in the template image closest to the first feature point. The second Hamming distance is calculated from the feature descriptor of the first feature point and the feature descriptor of the second SIFT feature point. Because SIFT feature points have good stability, the first feature points determined from the first SIFT feature points reflect the image features of the sample image more accurately, and the second feature points determined from the second SIFT feature points reflect the image features of the template image more accurately, which further improves the accuracy of the determined fingerprint template to be matched.
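As a non-authoritative illustration of the coarse-matching step above, the following Python sketch pairs each first feature point with its nearest second SIFT feature point by Hamming distance. It assumes the SIFT descriptors have already been binarized (the patent does not specify the descriptor encoding), and all function names are hypothetical.

```python
import numpy as np

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    # Hamming distance between two binary descriptor vectors.
    return int(np.count_nonzero(a != b))

def select_first_feature_points(first_sift_descs: np.ndarray, n: int = 5) -> np.ndarray:
    # Preset rule: take every n-th first SIFT feature point in sequence
    # (e.g. 350 points sampled every 5 give 70 first feature points).
    return first_sift_descs[::n]

def find_second_feature_points(first_descs: np.ndarray,
                               second_sift_descs: np.ndarray) -> list:
    # For each first feature point, the second SIFT feature point at the
    # smallest (second) Hamming distance is its corresponding second feature point.
    matches = []
    for f in first_descs:
        dists = [hamming(f, s) for s in second_sift_descs]
        matches.append(int(np.argmin(dists)))
    return matches
```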
In other possible implementation manners, image recognition may be performed on the sample image to extract image feature points and determine the first feature points from them, and image recognition may likewise be performed on the template image to extract image feature points and determine the second feature points from them.
After the first feature points and the second feature points are obtained, the first Hamming distance between each first feature point and its corresponding second feature point is calculated for each template image, and the similarity between the sample image and each template image is determined according to the first Hamming distances. The first Hamming distance is calculated from the feature descriptor of the first feature point and the feature descriptor of the second feature point. Determining the similarity between the sample image and the template image according to the first Hamming distances improves the accuracy of the calculated similarity.
In a possible implementation manner, for each template image, the first Hamming distances corresponding to all second feature points of the template image are sorted in descending order, the first 30% of them are averaged, and the average value is used as the similarity between the template image and the sample image. In another possible implementation manner, a histogram of second-feature-point index versus first Hamming distance may be established, the first Hamming distances in the top 30% are determined from the histogram, and their average is taken as the similarity between the template image and the sample image.
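A minimal sketch of the similarity rule just described, assuming the first Hamming distances for one template image have already been computed; the function name and the 30% default follow the text above but remain illustrative.

```python
import numpy as np

def coarse_similarity(first_hamming_dists, top_frac: float = 0.3) -> float:
    # Sort the first Hamming distances in descending order, keep the first 30%,
    # and use their mean as the similarity between template image and sample image.
    dists = sorted(first_hamming_dists, reverse=True)
    k = max(1, int(len(dists) * top_frac))
    return float(np.mean(dists[:k]))
```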
In other possible implementation manners, for each template image, the first hamming distances corresponding to all the second feature points of the template image may be averaged to obtain the similarity between the template image and the sample image, or the maximum first hamming distance may be determined from all the first hamming distances corresponding to the second feature points, and the maximum first hamming distance is used as the similarity between the template image and the sample image.
After the similarities are determined, the template images whose similarity is greater than a preset similarity threshold are determined, and the number of template images in each fingerprint template that satisfy the preset similarity condition with the sample image is determined from them. Specifically, the similarities are sorted in descending order and a preset number of similarities are taken in that order; these are the similarities greater than the preset similarity threshold, and the template images corresponding to them are taken as the template images satisfying the preset similarity condition with the sample image, i.e. the candidate template images. After the candidate template images are determined, the fingerprint template corresponding to each candidate template image is determined, and the number of candidate template images corresponding to each fingerprint template is counted. In another possible implementation manner, after the similarities are determined, the Top-k template images are selected for each fingerprint template in descending order of similarity. The Top-k template images of all fingerprint templates are then sorted together, the Top-2k template images are selected in descending order of similarity, and the number of template images corresponding to each fingerprint template among the Top-2k template images is counted.
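The candidate-counting logic of the second implementation above might look like the following sketch; the data layout (a dict from fingerprint-template id to per-image similarities) and the function name are assumptions.

```python
from collections import Counter

def count_candidate_images(per_template_sims: dict, k: int) -> Counter:
    # Keep the Top-k most similar template images of each fingerprint template,
    # pool them, keep the Top-2k overall, and count how many of those
    # candidate images each fingerprint template contributes.
    pooled = []
    for template_id, sims in per_template_sims.items():
        for s in sorted(sims, reverse=True)[:k]:
            pooled.append((s, template_id))
    pooled.sort(key=lambda item: item[0], reverse=True)
    return Counter(template_id for _, template_id in pooled[:2 * k])
```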
S103: and taking the fingerprint template with the largest number of images as a fingerprint template to be matched.
Continuing with the possible implementation manner, the fingerprint template with the largest number of images of the candidate template images is taken as the fingerprint template to be matched.
S104: and performing fingerprint matching on the sample image and a template image in the fingerprint template to be matched.
Specifically, the similarity between the sample image and the template image in the fingerprint template to be matched is calculated, and the matching degree between the sample image and the template image in the fingerprint template to be matched is determined according to the similarity. For example, according to the identifier of the template image in the fingerprint template to be matched, the similarity between each template image and the sample image is sequentially calculated, and if the template image with the similarity larger than a preset value exists, it is indicated that the sample image and the fingerprint template to be matched correspond to the same finger, and the sample image passes verification. For another example, the similarity between the sample image and each template image is calculated, the similarity corresponding to each template image is averaged to obtain the similarity between the sample image and the fingerprint template to be matched, if the similarity is greater than a preset value, it is indicated that the sample image and the fingerprint template to be matched correspond to the same finger, and the sample image passes verification.
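The two verification strategies mentioned above (any per-image similarity above the preset value, or the mean similarity above it) can be sketched as follows; the function names and the argument layout are illustrative.

```python
def verify_any(image_similarities, preset_value: float) -> bool:
    # Strategy 1: pass if any template image of the fingerprint template to be
    # matched is similar enough to the sample image.
    return any(s > preset_value for s in image_similarities)

def verify_mean(image_similarities, preset_value: float) -> bool:
    # Strategy 2: pass if the average similarity over all template images of the
    # fingerprint template to be matched exceeds the preset value.
    return sum(image_similarities) / len(image_similarities) > preset_value
```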
In a possible implementation manner, third SIFT feature points are extracted from the sample image and fourth SIFT feature points are extracted from the template image in the fingerprint template to be matched; the third SIFT feature points are pixel points in the sample image, and the fourth SIFT feature points are pixel points in the template image. For each template image, the third Hamming distance between each fourth SIFT feature point and each third SIFT feature point is calculated, and the third SIFT feature point with the minimum third Hamming distance is taken as the point corresponding to that fourth SIFT feature point, thereby establishing the correspondence between the third SIFT feature points and the fourth SIFT feature points.
In a possible implementation manner, after the third SIFT feature points and the fourth SIFT feature points are extracted, a first hash value of the feature descriptor of each third SIFT feature point, i.e. the first hash value of the third SIFT feature point, is calculated according to a preset hash function, and the third SIFT feature points are then grouped according to the first hash values to obtain at least two first groups, where the difference between the first hash values of the third SIFT feature points within each first group is within a preset range. Likewise, a second hash value of the feature descriptor of each fourth SIFT feature point, i.e. the second hash value of the fourth SIFT feature point, is calculated according to the preset hash function, and the fourth SIFT feature points are grouped according to the second hash values to obtain at least two second groups, where the difference between the second hash values of the fourth SIFT feature points within each second group is within a preset range.
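A sketch of the grouping step, assuming a hash function that maps a descriptor to an integer bucket is available (one illustrative choice is shown after the LSH properties below); the helper name is hypothetical.

```python
from collections import defaultdict

def group_by_hash(descriptors, hash_fn) -> dict:
    # Feature points whose hash values fall into the same bucket form one group;
    # the same routine yields the first groups (sample image) and the
    # second groups (template image).
    groups = defaultdict(list)
    for idx, desc in enumerate(descriptors):
        groups[hash_fn(desc)].append(idx)
    return dict(groups)
```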
The hash function satisfies the following properties. For any two feature points x and y in the high-dimensional space:
if d(x, y) ≤ r1, then the probability that h(x) = h(y) is not less than p1;
if d(x, y) ≥ r2, then the probability that h(x) = h(y) is not more than p2;
where d(x, y) is the Hamming distance between feature point x and feature point y, h is the hash function, h(x) and h(y) are the hash values of feature point x and feature point y, r1 < r2, and p1 > p2.
for example, the hash function may be a Local Sensitive Hashing (LSH) hash function of hamming distance, an LSH hash function of cosine of included angle of vector, an LSH hash function of euclidean distance, or a random hyperplane hash function.
After the first groups and the second groups are obtained, correspondences are established, according to the first hash value of each first group and the second hash value of each second group, between the first groups and second groups whose hash values differ within a preset range. For each template image, the third Hamming distance between each fourth SIFT feature point in a second group and each third SIFT feature point in the corresponding first group is calculated, and the third SIFT feature point with the minimum third Hamming distance, i.e. the point closest to the fourth SIFT feature point, is taken as the point corresponding to that fourth SIFT feature point, thereby establishing the correspondence between the third SIFT feature points in the first group and the fourth SIFT feature points in the second group. Processing each first group and its corresponding second group in this way yields the correspondences between all third SIFT feature points and all fourth SIFT feature points.
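The within-group matching just described could be sketched as follows, reusing the grouping output of the earlier sketch; descriptors are again assumed to be binary so that Hamming distance applies, and the names are hypothetical.

```python
import numpy as np

def hamming(a, b) -> int:
    # Hamming distance between two binary descriptor vectors.
    return int(np.count_nonzero(np.asarray(a) != np.asarray(b)))

def match_within_groups(first_groups: dict, second_groups: dict,
                        sample_descs, template_descs) -> dict:
    # For every fourth SIFT feature point, search only the corresponding first
    # group and link it to the third SIFT feature point at the smallest
    # (third) Hamming distance.
    correspondences = {}
    for bucket, fourth_idxs in second_groups.items():
        third_idxs = first_groups.get(bucket, [])
        if not third_idxs:
            continue
        for t_idx in fourth_idxs:
            dists = [hamming(template_descs[t_idx], sample_descs[s_idx])
                     for s_idx in third_idxs]
            correspondences[t_idx] = third_idxs[int(np.argmin(dists))]
    return correspondences
```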
In other possible implementation manners, for each template image, a third hamming distance between each third SIFT feature point and all fourth SIFT feature points in the template image may also be calculated, a fourth SIFT feature point with the third hamming distance being the minimum value is determined, and the fourth SIFT feature point is taken as the fourth SIFT feature point closest to the third SIFT feature point, that is, the fourth SIFT feature point corresponding to the third SIFT feature point. Compared with the method that the fourth SIFT feature points corresponding to each third SIFT feature point are directly determined from the template image, the corresponding relation between the third SIFT feature points and the fourth SIFT feature points is determined in a grouping mode, and the feature point matching speed is improved.
In another possible implementation manner, there are multiple preset hash functions, and correspondingly there are multiple grouping manners for the first groups and the second groups. As shown in Fig. 2, with L hash functions, the first hash value of each third SIFT feature point is calculated for each hash function, and the third SIFT feature points are divided into n first groups according to that first hash value. The L hash functions correspond to L grouping manners: the grouping result of the first hash function is Table1, that of the second hash function is Table2, ..., and that of the L-th hash function is TableL. After the L grouping manners are obtained, the first groups in each grouping manner are numbered 1 to n according to a preset rule, where the number of a first group is determined from its hash value. The third SIFT feature points in the first groups with the same number are merged by taking their union, i.e. the union of each column of first groups from Table1 to TableL in Fig. 2 is taken, yielding n merged first groups. The fourth SIFT feature points in each template image are processed in the same way to obtain n merged second groups. The third Hamming distance between each third SIFT feature point in each merged first group and each fourth SIFT feature point in the corresponding merged second group is calculated, the third Hamming distances are sorted in descending order, and the fourth SIFT feature points corresponding to the Top-k third Hamming distances are determined. The fourth SIFT feature point closest to the third SIFT feature point is then determined from these Top-k fourth SIFT feature points by a nearest-neighbour algorithm, thereby establishing the correspondence between the third SIFT feature points in the merged first groups and the fourth SIFT feature points in the merged second groups. Grouping the third and fourth SIFT feature points with different hash functions and taking the union of the different groupings reduces grouping errors and improves the accuracy of the established correspondence between the third SIFT feature points and the fourth SIFT feature points.
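A rough sketch of the multi-table union step of Fig. 2. How the groups are numbered 1 to n from their hash values is not spelled out in the text, so the modulo rule below is only an assumption.

```python
def union_groups_across_tables(tables: list, n: int) -> list:
    # tables[i] maps hash value -> feature-point indices for the i-th hash
    # function; the groups with the same number across Table1..TableL are merged
    # by taking their union.
    merged = [set() for _ in range(n)]
    for table in tables:
        for hash_value, members in table.items():
            merged[hash_value % n].update(members)  # assumed numbering rule
    return merged
```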
In a possible implementation manner, after the first hash values and the second hash values are calculated, the data structure of the sample image is established according to the first hash values and the data structure of the template image is established according to the second hash values. According to the first and second hash values in these data structures, the second group corresponding to each first group can be found conveniently and the correspondence between the first groups and the second groups can be determined, which speeds up the search for the fourth SIFT feature point corresponding to each third SIFT feature point.
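One possible shape for these data structures is an index keyed by (table id, hash value) so that the second group corresponding to a given first group can be looked up directly; the key layout is an assumption.

```python
def build_hash_index(groups_per_table: list) -> dict:
    # groups_per_table[i] maps hash value -> feature-point indices for the i-th
    # hash function; the returned index makes the corresponding-group lookup direct.
    index = {}
    for table_id, groups in enumerate(groups_per_table):
        for hash_value, members in groups.items():
            index[(table_id, hash_value)] = members
    return index
```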
In one possible implementation, the pixel points of the sample image are divided into white points and black points according to the ridge lines and valley lines of the fingerprint, and the pixel points of the template image are divided in the same way. Specifically, in the sample image the gray value of a pixel point located on a ridge line is greater than a preset value and the gray value of a pixel point located on a valley line is less than the preset value; a pixel point whose gray value is greater than the preset value is taken as a white point, and a pixel point whose gray value is less than the preset value is taken as a black point.
After the sample image is obtained, the ridge lines and valley lines in the sample image are identified and the pixel points of the sample image are divided into black points and white points according to their positions; the pixel points of the template image are divided into black points and white points in the same way. Third SIFT feature points are extracted from the black points of the sample image and fourth SIFT feature points are extracted from the black points of the template image; the corresponding first and second hash values are calculated with the hash function, the third SIFT feature points are grouped according to the first hash values, and the fourth SIFT feature points are grouped according to the second hash values. After the black points are grouped, third SIFT feature points are extracted from the white points of the sample image and fourth SIFT feature points are extracted from the white points of the template image, and they are grouped in the same way. Dividing both the sample image and the template image into black points and white points and grouping each class separately improves the matching speed of the third and fourth SIFT feature points.
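A minimal sketch of the black/white split, assuming an 8-bit grayscale image and a hypothetical preset value of 128.

```python
import numpy as np

def split_black_white(gray_image: np.ndarray, preset_value: int = 128):
    # Pixels on ridge lines (gray value above the preset value) are white points;
    # pixels on valley lines (gray value below the preset value) are black points.
    white_mask = gray_image > preset_value
    black_mask = gray_image < preset_value
    return black_mask, white_mask
```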
After the sample image and the template image are divided into the two classes, black points and white points, the data structure established for the sample image is as shown in Fig. 3. It comprises the type of each third SIFT feature point (black point or white point), the grouping manners of the third SIFT feature point (Table1, Table2 and Table3), the group in which the third SIFT feature point falls under each grouping manner, and the first hash value of the third SIFT feature point, where Ptype = 0 means the feature point is a black point and Ptype = 1 means it is a white point. After the data structure is established, the index of each third SIFT feature point is determined from the grouping manner and the corresponding first hash value. The data structure of the template image and the index of each fourth SIFT feature point are established in the same way. Once the indexes are established, the group of fourth SIFT feature points corresponding to a third SIFT feature point can be determined quickly from the index, and the fourth SIFT feature point corresponding to the third SIFT feature point can then be determined.
After the corresponding relationship between the third SIFT feature point and the fourth SIFT feature point is established, the similarity between the third SIFT feature point and the fourth SIFT feature point with the corresponding relationship is calculated, the matching degree between the sample image and the template image in the fingerprint template to be matched is determined according to the similarity, and whether the sample image and the fingerprint template to be matched correspond to the same finger or not is judged.
In a possible implementation manner, after the number of template images in each fingerprint template that satisfy the preset similarity condition with the sample image is determined, the fingerprint templates are sorted in descending order of that number. The first fingerprint template is taken as the fingerprint template to be matched and the sample image is fingerprint-matched with its template images; if this matching fails, the second fingerprint template is taken as the fingerprint template to be matched, and so on until matching succeeds. If matching fails for all fingerprint templates to be matched, the fingerprint matching is judged to have failed. This improves the accuracy of fingerprint matching.
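The retry logic above can be sketched as follows; try_match stands in for the per-template fingerprint matching described earlier and is a hypothetical callback.

```python
from typing import Callable, Dict, Optional

def match_templates_in_order(image_counts: Dict[str, int],
                             try_match: Callable[[str], bool]) -> Optional[str]:
    # Try the fingerprint templates in descending order of candidate-image count;
    # return the first template that matches, or None if all of them fail.
    for template_id in sorted(image_counts, key=image_counts.get, reverse=True):
        if try_match(template_id):
            return template_id
    return None
```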
In the embodiment, the sample image is obtained and is roughly matched with each template image in each pre-stored fingerprint template, so that the number of the template images meeting the preset similar condition with the sample image in each fingerprint template is determined; taking the fingerprint template with the largest number of images as a fingerprint template to be matched; and fingerprint matching is carried out on the sample image and the template image in the fingerprint template to be matched. Because the fingerprint template to be matched is determined through rough matching, and then the sample image is subjected to fingerprint matching with the template image in the fingerprint template to be matched, the template image for fingerprint matching can be reduced, namely the number of times of fingerprint matching is reduced, so that the fingerprint matching speed can be improved.
The fingerprint matching method provided by the embodiment of the present application is described below with reference to specific scenarios.
As shown in fig. 4, the number of the pre-stored fingerprint templates is set to 4, including a template T1 of the finger 1, a template T2 of the finger 2, a template T3 of the finger 3, and a template T4 of the finger 4. After the sample image is acquired, the similarity of the sample image and each fingerprint template is calculated.
For any template image in any fingerprint template, the similarity calculation process is as shown in Fig. 5. First SIFT feature points of the sample image are extracted and sampled at intervals of N to obtain the first feature points. Second SIFT feature points of the template image are extracted, the second Hamming distance between each first feature point and each second SIFT feature point is calculated, and the second SIFT feature point with the minimum second Hamming distance is taken as the second feature point corresponding to the first feature point. The first Hamming distance between each corresponding pair of first and second feature points is calculated, a histogram of the first Hamming distances is established, the largest 30% of the first Hamming distances are averaged according to the histogram, and the average value is taken as the similarity between the sample image and the template image. When the first and second Hamming distances are calculated, the dimension of the corresponding feature descriptors is 40.
As shown in Fig. 4, after the similarity between the sample image and each template image is calculated, the Top-k template images are selected for each fingerprint template in descending order of similarity, i.e. the Top-k template images in T1, the Top-k template images in T2, the Top-k template images in T3 and the Top-k template images in T4 are determined. From these, the 2k template images with the highest similarity are selected, the number of template images among these Top-2k images that belong to each fingerprint template is counted, the fingerprint templates are taken in turn as the fingerprint template to be matched according to that count, and the sample image is fingerprint-matched with the template images in the fingerprint template to be matched.
For any template image, the fingerprint matching process between the sample image and the template image is as shown in fig. 6, a third SIFT feature point is extracted from the sample image, a fourth SIFT feature point is extracted from the template image, a first hash value of the third SIFT feature point is calculated according to a preset hash function, and a second hash value of the fourth SIFT feature point is calculated according to the preset hash function. The dimension of the feature descriptor corresponding to the third SIFT feature point and the fourth SIFT feature point may be 128, and the first hash value and the second hash value are both integers from 0 to 16. And after the first hash value and the second hash value are calculated, grouping the third SIFT feature points according to the first hash value, and grouping the fourth SIFT feature points according to the second hash value. After the third SIFT feature points and the fourth SIFT feature points are grouped, establishing a data structure of the sample image according to the group where the third SIFT feature points are located and the type of the third SIFT feature points, and determining the index of each third SIFT feature point in the data structure according to the first hash value; and establishing a data structure of the template image according to the grouping of the fourth SIFT feature points and the type of the fourth SIFT feature points, and determining the index of each fourth SIFT feature point in the data structure according to the second hash value. And determining fourth SIFT feature points corresponding to each third SIFT feature point according to the data structure and the index, and establishing a corresponding relation between the third SIFT feature points and the fourth SIFT feature points. And determining the matching degree of the sample image and the template image according to the third SIFT feature points and the fourth SIFT feature points with corresponding relations.
By adopting the fingerprint matching method, the matching process of the third SIFT feature point and the fourth SIFT feature point is reduced from 500 times to 150 times, the fingerprint matching times are reduced from 41 times to 5 times, and the false passing rate is reduced.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 7 shows a block diagram of a fingerprint matching apparatus provided in the embodiment of the present application, which corresponds to the fingerprint matching method described in the above embodiment, and only shows portions related to the embodiment of the present application for convenience of description.
As shown in fig. 7, the fingerprint matching apparatus includes,
an obtaining module 10, configured to obtain a sample image;
a determining module 20, configured to perform rough matching on the sample image and each template image in each pre-stored fingerprint template, and determine the number of template images in each fingerprint template, which satisfy a preset similar condition with the sample image;
the calculation module 30 is configured to use the fingerprint template with the largest number of images as a fingerprint template to be matched;
and the matching module 40 is used for performing fingerprint matching on the sample image and the template image in the fingerprint template to be matched.
Preferably, the determining module 20 is specifically configured to:
determining first feature points in the sample images and second feature points in each template image, wherein the first feature points correspond to the second feature points in each template image one to one;
determining the similarity of the sample image and each template image according to a first Hamming distance between the first characteristic point and a second characteristic point in each template image;
and determining the number of the template images meeting preset similar conditions with the sample image in each fingerprint template according to the template images with the similarity larger than a preset similar threshold.
Preferably, the determining module 20 is further configured to:
extracting first SIFT feature points from the sample image, and determining the first feature points from the first SIFT feature points according to a preset rule;
and extracting second SIFT feature points from each template image, and determining second feature points from the second SIFT feature points according to a second Hamming distance between the first feature points and the second SIFT feature points.
Preferably, the matching module 40 is specifically configured to:
extracting a third SIFT feature point from the sample image, and extracting a fourth SIFT feature point from the template image in the fingerprint template to be matched;
establishing a corresponding relation between the third SIFT feature point and a fourth SIFT feature point according to a third Hamming distance between the third SIFT feature point and the fourth SIFT feature point;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity between the third SIFT feature point and the fourth SIFT feature point with the corresponding relationship.
Preferably, the matching module 40 is further specifically configured to:
calculating a first hash value of the third SIFT feature point according to a preset hash function;
grouping the third SIFT feature points according to the first hash value to obtain at least two first groups;
calculating a second hash value of the fourth SIFT feature point according to the hash function;
grouping the fourth SIFT feature points according to the second hash value to obtain at least two second groups, wherein one first group corresponds to one second group;
and establishing a corresponding relation between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping according to a third Hamming distance between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping.
Preferably, the matching module 40 is further specifically configured to:
establishing a data structure of the sample image according to the first hash value, and establishing a data structure of the template image according to the second hash value;
and determining the corresponding relation between the first group and the second group according to the data structure of the sample image and the data structure of the template image.
Preferably, the matching module 40 is further specifically configured to:
taking the pixel points with the gray values larger than the preset value in the sample image as white points, and taking the pixel points with the gray values smaller than the preset value in the sample image as black points;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity of the white point and the corresponding point in the template image and the similarity of the black point and the corresponding point in the template image.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Fig. 8 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 8, the electronic apparatus includes: a processor 11, a memory 12 and a computer program 13 stored in said memory 12 and executable on said processor 11. The processor 11 implements the steps in the above-described embodiment of the fingerprint matching method when executing the computer program 13.
Illustratively, the computer program 13 may be partitioned into one or more modules/units, which are stored in the memory 12 and executed by the processor 11 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 13 in the terminal device.
Those skilled in the art will appreciate that fig. 8 is merely an example of an electronic device and is not limiting and may include more or fewer components than those shown, or some components may be combined, or different components, e.g., the electronic device may also include input output devices, network access devices, buses, etc.
The Processor 11 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 12 may be an internal storage unit of the electronic device, such as a hard disk or a memory of the electronic device. The memory 12 may also be an external storage device of the electronic device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the electronic device. Further, the memory 12 may also include both an internal storage unit and an external storage device of the electronic device. The memory 12 is used for storing the computer program and other programs and data required by the electronic device. The memory 12 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
If the integrated modules/units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods in the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the protection scope of the present application.

Claims (9)

1. A fingerprint matching method, comprising:
acquiring a sample image;
roughly matching the sample image with template images in a pre-stored fingerprint template, and determining first feature points in the sample image and second feature points in each template image, wherein the first feature points are in one-to-one correspondence with the second feature points in each template image;
determining the similarity between the sample image and the template image according to a first Hamming distance between the first feature point and a second feature point in the template image;
determining, according to the template images whose similarity is greater than a preset similarity threshold, the number of template images in the fingerprint template that meet a preset similarity condition with the sample image; taking the fingerprint template with the largest number of such images as a fingerprint template to be matched;
and performing fingerprint matching on the sample image and a template image in the fingerprint template to be matched.
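As an illustrative sketch only, and not the claimed implementation, the coarse-matching selection of claim 1 could be arranged roughly as follows, assuming each fingerprint template is stored as a list of binary descriptor arrays (one array per template image); the helper image_similarity and the 0.6 threshold are placeholders standing in for the Hamming-distance similarity of the later claims:

```python
import numpy as np

def image_similarity(sample_desc, template_desc):
    # Hypothetical stand-in: mean Hamming similarity between index-paired
    # binary descriptors (the actual pairing is detailed in claim 2).
    n = min(len(sample_desc), len(template_desc))
    if n == 0:
        return 0.0
    bits = sample_desc.shape[1]
    dists = [np.count_nonzero(sample_desc[i] != template_desc[i]) for i in range(n)]
    return 1.0 - (sum(dists) / n) / bits

def select_template_to_match(sample_desc, fingerprint_templates, sim_threshold=0.6):
    """Coarse matching: count, per fingerprint template, how many of its
    template images exceed the similarity threshold, and keep the template
    with the largest count as the template to be matched."""
    best_template, best_count = None, -1
    for template in fingerprint_templates:        # template = list of descriptor arrays
        count = sum(1 for t_desc in template
                    if image_similarity(sample_desc, t_desc) > sim_threshold)
        if count > best_count:
            best_template, best_count = template, count
    return best_template
```

In this reading, only the selected template proceeds to full fingerprint matching, which is what reduces the number of matching passes.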
2. The fingerprint matching method of claim 1, wherein said determining first feature points in the sample image and second feature points in each of the template images comprises:
extracting first SIFT feature points from the sample image, and determining the first feature points from the first SIFT feature points according to a preset rule;
and extracting second SIFT feature points from the template image, and determining second feature points from the second SIFT feature points according to a second Hamming distance between the first feature points and the second SIFT feature points.
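A minimal sketch of this step, under the assumption that SIFT descriptors are first binarized so that a Hamming distance applies; binarize_sift, the response-based selection rule, and the nearest-neighbour pairing are illustrative choices rather than the patented ones:

```python
import numpy as np

def binarize_sift(descriptors):
    # Assumed binarization: threshold each 128-dimensional SIFT descriptor
    # at its own median so that Hamming distances can be taken.
    med = np.median(descriptors, axis=1, keepdims=True)
    return (descriptors > med).astype(np.uint8)

def pick_first_feature_points(bin_desc, responses, k=32):
    # Hypothetical "preset rule": keep the k keypoints with the strongest
    # detector response.
    order = np.argsort(responses)[::-1][:k]
    return bin_desc[order]

def pick_second_feature_points(first_pts, template_bin_desc):
    # For each first feature point, take the template SIFT point with the
    # smallest Hamming distance as its counterpart (uniqueness of the
    # pairing is not enforced in this sketch).
    second = []
    for p in first_pts:
        d = np.count_nonzero(template_bin_desc != p, axis=1)
        second.append(template_bin_desc[np.argmin(d)])
    return np.array(second)
```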
3. The fingerprint matching method of claim 1, wherein the fingerprint matching of the sample image with a template image in the fingerprint template to be matched comprises:
extracting a third SIFT feature point from the sample image, and extracting a fourth SIFT feature point from the template image in the fingerprint template to be matched;
establishing a corresponding relation between the third SIFT feature point and a fourth SIFT feature point according to a third Hamming distance between the third SIFT feature point and the fourth SIFT feature point;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity between the third SIFT feature point and the fourth SIFT feature point with the corresponding relationship.
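For illustration, one simple way to score the matching degree once correspondences exist is the mean Hamming similarity over the paired points; this scoring rule is an assumption, not taken from the claims:

```python
import numpy as np

def matching_degree(third_pts, fourth_pts):
    # Both arrays are binary and aligned row by row, i.e. row i of each
    # array holds a pair of corresponding SIFT feature points.
    bits = third_pts.shape[1]
    dists = np.count_nonzero(third_pts != fourth_pts, axis=1)
    return float(np.mean(1.0 - dists / bits))
```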
4. The fingerprint matching method of claim 3, wherein the establishing a correspondence between the third SIFT feature points and the fourth SIFT feature points according to a third Hamming distance between the third SIFT feature points and the fourth SIFT feature points comprises:
calculating a first hash value of the third SIFT feature point according to a preset hash function;
grouping the third SIFT feature points according to the first hash value to obtain at least two first groups;
calculating a second hash value of the fourth SIFT feature point according to the hash function;
grouping the fourth SIFT feature points according to the second hash value to obtain at least two second groups, wherein one first group corresponds to one second group;
and establishing a corresponding relation between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping according to a third Hamming distance between the third SIFT feature points in the first grouping and the corresponding fourth SIFT feature points in the second grouping.
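A rough sketch of the hash-based grouping, assuming binarized descriptors and a deliberately simple placeholder hash (the leading bits of each descriptor); the number of hash bits and the distance threshold are hypothetical:

```python
from collections import defaultdict
import numpy as np

def bucket_by_hash(bin_desc, n_bits=8):
    # Placeholder hash function: interpret the first n_bits of each binary
    # descriptor as an integer bucket key.
    buckets = defaultdict(list)
    for idx, d in enumerate(bin_desc):
        key = int("".join(map(str, d[:n_bits])), 2)
        buckets[key].append(idx)
    return buckets

def match_within_buckets(sample_desc, template_desc, max_dist=24):
    # Only descriptors falling into corresponding buckets (same hash value)
    # are compared; a pair is kept when its Hamming distance is small.
    sample_buckets = bucket_by_hash(sample_desc)
    template_buckets = bucket_by_hash(template_desc)
    pairs = []
    for key, s_indices in sample_buckets.items():
        for i in s_indices:
            for j in template_buckets.get(key, []):
                if np.count_nonzero(sample_desc[i] != template_desc[j]) <= max_dist:
                    pairs.append((i, j))
    return pairs
```

Restricting comparisons to corresponding buckets is what keeps the number of Hamming-distance evaluations below that of an all-pairs search.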
5. The fingerprint matching method of claim 3, wherein before said establishing a correspondence between the third SIFT feature points and the fourth SIFT feature points according to a third Hamming distance between the third SIFT feature points and the fourth SIFT feature points, the fingerprint matching method further comprises:
establishing a data structure of the sample image according to the first hash value, and establishing a data structure of the template image according to the second hash value;
and determining the corresponding relation between the first group and the second group according to the data structure of the sample image and the data structure of the template image.
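The per-image data structure of claim 5 can be pictured as a dictionary keyed by hash value, with a shared key acting as the correspondence between a first group and a second group; this is an illustrative reading that reuses the placeholder hash from the previous sketch, not the claimed structure:

```python
def build_hash_index(bin_desc, n_bits=8):
    # Map each hash value to the indices of the descriptors that share it.
    index = {}
    for idx, d in enumerate(bin_desc):
        key = int("".join(map(str, d[:n_bits])), 2)
        index.setdefault(key, []).append(idx)
    return index

def corresponding_groups(sample_index, template_index):
    # A first group and a second group correspond when the two indexes
    # contain the same hash key.
    return {key: (sample_index[key], template_index[key])
            for key in sample_index.keys() & template_index.keys()}
```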
6. The fingerprint matching method of any one of claims 1 to 5, wherein the fingerprint matching of the sample image with the template image in the fingerprint template to be matched comprises:
taking the pixel points in the sample image whose gray values are larger than a preset value as white points, and taking the pixel points whose gray values are smaller than the preset value as black points;
and determining the matching degree of the sample image and the template image in the fingerprint template to be matched according to the similarity of the white point and the corresponding point in the template image and the similarity of the black point and the corresponding point in the template image.
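An illustrative reading of this pixel-level matching, with a hypothetical preset gray value of 128 and a simple agreement ratio as the matching degree:

```python
import numpy as np

def binarize_image(gray, preset=128):
    # Pixels brighter than the preset value become white points (1),
    # darker pixels become black points (0).
    return (gray > preset).astype(np.uint8)

def pixel_matching_degree(sample_gray, template_gray, preset=128):
    # Matching degree from how often the sample's white points and black
    # points agree with the corresponding points of the template image.
    s = binarize_image(sample_gray, preset)
    t = binarize_image(template_gray, preset)
    white_agree = np.count_nonzero((s == 1) & (t == 1))
    black_agree = np.count_nonzero((s == 0) & (t == 0))
    return (white_agree + black_agree) / s.size
```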
7. A fingerprint matching apparatus, comprising:
the acquisition module is used for acquiring a sample image;
the determining module is used for performing rough matching on the sample image and each template image in each pre-stored fingerprint template, and determining first feature points in the sample image and second feature points in each template image, wherein the first feature points are in one-to-one correspondence with the second feature points in each template image;
determining the similarity between the sample image and the template image according to a first Hamming distance between the first feature point and a second feature point in the template image;
determining, according to the template images whose similarity is greater than a preset similarity threshold, the number of template images in the fingerprint template that meet a preset similarity condition with the sample image;
the computing module is used for taking the fingerprint template with the largest number of images as a fingerprint template to be matched;
and the matching module is used for performing fingerprint matching on the sample image and the template image in the fingerprint template to be matched.
8. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the fingerprint matching method according to any one of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the fingerprint matching method according to any one of claims 1 to 6.
CN202110232239.1A 2021-03-03 2021-03-03 Fingerprint matching method and device, electronic equipment and storage medium Active CN112597978B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110232239.1A CN112597978B (en) 2021-03-03 2021-03-03 Fingerprint matching method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112597978A true CN112597978A (en) 2021-04-02
CN112597978B CN112597978B (en) 2021-06-22

Family

ID=75208076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110232239.1A Active CN112597978B (en) 2021-03-03 2021-03-03 Fingerprint matching method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112597978B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1439997A (en) * 2002-02-22 2003-09-03 杭州中正生物认证技术有限公司 Fingerprint identifying method and system
CN1506903A (en) * 2002-12-06 2004-06-23 中国科学院自动化研究所 Automatic fingerprint distinguishing system and method based on template learning
US20070248249A1 (en) * 2006-04-20 2007-10-25 Bioscrypt Inc. Fingerprint identification system for access control
CN105760738A (en) * 2016-03-15 2016-07-13 广东欧珀移动通信有限公司 Fingerprint unlocking method and terminal
CN109902700A (en) * 2017-12-11 2019-06-18 北京航天泰坦科技股份有限公司 A kind of extensive image matching method based on hash algorithm
CN111429359A (en) * 2020-06-11 2020-07-17 深圳市诺赛特系统有限公司 Small-area fingerprint image splicing method, device, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李飚: "自动指纹识别系统中匹配算法研究与应用", 《万方学位论文全文数据库》 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113033512A (en) * 2021-05-21 2021-06-25 深圳阜时科技有限公司 Narrow-strip-shaped fingerprint identification method, storage medium and electronic equipment
CN113269125A (en) * 2021-06-10 2021-08-17 北京中科闻歌科技股份有限公司 Face recognition method, device, equipment and storage medium
CN113269125B (en) * 2021-06-10 2024-05-14 北京中科闻歌科技股份有限公司 Face recognition method, device, equipment and storage medium
CN116386091A (en) * 2022-11-18 2023-07-04 荣耀终端有限公司 Fingerprint identification method and device
CN116386091B (en) * 2022-11-18 2024-04-02 荣耀终端有限公司 Fingerprint identification method and device

Also Published As

Publication number Publication date
CN112597978B (en) 2021-06-22

Similar Documents

Publication Publication Date Title
CN112597978B (en) Fingerprint matching method and device, electronic equipment and storage medium
CN107944020B (en) Face image searching method and device, computer device and storage medium
WO2021072885A1 (en) Method and apparatus for recognizing text, device and storage medium
US9600730B2 (en) Biometric authentication technology
US20120042171A1 (en) Method and system for biometric authentication
US10528844B2 (en) Method and apparatus for distance measurement
US11062120B2 (en) High speed reference point independent database filtering for fingerprint identification
CN110503682B (en) Rectangular control identification method and device, terminal and storage medium
WO2021072876A1 (en) Identification image classification method and apparatus, computer device, and readable storage medium
CN110738222B (en) Image matching method and device, computer equipment and storage medium
CN114972817A (en) Image similarity matching method, device and storage medium
CN111311593B (en) Multi-ellipse detection and evaluation algorithm, device and terminal
CN111553241A (en) Method, device and equipment for rejecting mismatching points of palm print and storage medium
CN110083731B (en) Image retrieval method, device, computer equipment and storage medium
CN113157962B (en) Image retrieval method, electronic device, and storage medium
TWI714321B (en) Method, apparatus and electronic device for database updating and computer storage medium thereof
CN112036362A (en) Image processing method, image processing device, computer equipment and readable storage medium
CN109871779B (en) Palm print identification method and electronic equipment
WO2021151359A1 (en) Palm print image recognition method, apparatus and device, and computer readable storage medium
CN112200004B (en) Training method and device for image detection model and terminal equipment
CN110287943B (en) Image object recognition method and device, electronic equipment and storage medium
CN112308027A (en) Image matching method, biological recognition chip and electronic device
JP5791666B2 (en) Dynamic generation device for visual keywords
JP2012008979A (en) Character string search method, character string search device and recording medium
JP4575356B2 (en) Identity checking system, computer readable recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant