CN116311393A - Fingerprint matching method, fingerprint matching device, terminal equipment and computer readable storage medium - Google Patents


Info

Publication number: CN116311393A
Application number: CN202310359911.2A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 杨盼 (Yang Pan), 于泽 (Yu Ze)
Current assignee: Shenzhen Chipsailing Technology Co., Ltd. (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Shenzhen Chipsailing Technology Co., Ltd.
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Prior art keywords: image, matching, fingerprint, template, target
Application filed by Shenzhen Chipsailing Technology Co., Ltd.
Priority to CN202310359911.2A (the priority date is an assumption and is not a legal conclusion)
Publication of CN116311393A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12: Fingerprints or palmprints
    • G06V40/1365: Matching; Classification

Abstract

The application belongs to the technical field of feature comparison and provides a fingerprint matching method, a fingerprint matching device, a terminal device, and a computer-readable storage medium. The method includes the following steps: performing a first matching process on a target fingerprint image and the template fingerprint images in a template library to obtain an initial matching result; screening an image matching group from the template library according to the initial matching result, where the template library contains W template fingerprint images, the image matching group contains L template fingerprint images, and L is a positive integer smaller than W; and performing a second matching process on the target fingerprint image and each template fingerprint image in the image matching group to obtain a final matching result, where the data granularity of the first matching process is lower than that of the second matching process. The method can greatly reduce comparison time and improve fingerprint matching efficiency.

Description

Fingerprint matching method, fingerprint matching device, terminal equipment and computer readable storage medium
Technical Field
The application belongs to the technical field of feature comparison, and particularly relates to a fingerprint matching method, a fingerprint matching device, terminal equipment and a computer readable storage medium.
Background
Fingerprint recognition is the most important and most widely used of the biometric identification technologies. It authenticates a person's identity by exploiting the uniqueness and lifelong invariance of fingerprint features, and offers both high security and high usability. With the rapid improvement of computer hardware performance, fingerprint recognition has been applied in many fields.
However, in the related art, when the fingerprint image has a large sensor area, the fingerprint library is large, and hardware resources are limited, fingerprint feature matching is performed by sequentially matching feature points against every template. This approach is both time-consuming and inefficient.
Disclosure of Invention
The embodiments of the present application provide a fingerprint matching method, a fingerprint matching device, a terminal device, and a computer-readable storage medium, which can greatly reduce comparison time and improve fingerprint matching efficiency.
In a first aspect, an embodiment of the present application provides a fingerprint matching method, including:
performing first matching processing on the target fingerprint image and the template fingerprint image in the template library to obtain an initial matching result;
screening an image matching group from the template library according to the initial matching result, wherein the template library comprises W template fingerprint images, the image matching group comprises L template fingerprint images, and L is a positive integer smaller than W;
And respectively carrying out second matching processing on the target fingerprint image and each template fingerprint image in the image matching group to obtain a final matching result, wherein the data granularity of the first matching processing is lower than that of the second matching processing.
In the embodiments of the present application, the target fingerprint image is first coarsely matched against all template fingerprint images in the database to obtain a group of possibly matching images, this image matching group is then matched a second time, and the final matched image is determined from it. In other words, a coarse matching process first screens out, from all template fingerprint images in the database, the image matching group with the highest matching degree, and a detailed comparison of that group then yields the final matching result. When the fingerprint library is large or hardware resources are limited, this method greatly reduces comparison time and improves fingerprint matching efficiency.
In a possible implementation manner of the first aspect, the performing a first matching process on the target fingerprint image and the template fingerprint image in the template library to obtain an initial matching result includes:
acquiring a first feature point set of the target fingerprint image and a second feature point set of each template fingerprint image in the template library, where the first feature point set includes the pixel points contained in n1 first image areas, the second feature point set includes the pixel points contained in n2 second image areas, and both n1 and n2 are integers larger than 2;
and determining target area groups between the target fingerprint image and each template fingerprint image according to the first characteristic point set and the second characteristic point set, wherein each target area group comprises a first image area and a second image area, and the determined W target area groups serve as initial matching results.
In a possible implementation manner of the first aspect, the step of acquiring the first feature point set of the target fingerprint image includes:
acquiring n1 first pixel points from the target fingerprint image, wherein the distance between every two first pixel points is larger than a first preset distance;
determining n1 first image areas corresponding to the first pixel points respectively;
and generating the first characteristic point set according to the pixel points included in the first image area.
In a possible implementation manner of the first aspect, the generating the first feature point set according to the pixel points included in the first image area includes:
for each first image area, acquiring a center point of the first image area;
taking the center point as a rotation center, and carrying out rotation calibration on pixel points included in the first image area to obtain calibration points;
and generating the first characteristic point set according to the calibration points contained in each of the n1 first image areas.
In a possible implementation manner of the first aspect, the determining a target region group between the target fingerprint image and each of the template fingerprint images according to the first feature point set and the second feature point set includes:
for each template fingerprint image, calculating the area distance between each first image area and each second image area in the template fingerprint image according to the first feature point set and the second feature point set of the template fingerprint image;
determining a first target area matched with each first image area from the second image areas of the template fingerprint image according to the area distance;
And determining a target region group between the target fingerprint image and the template fingerprint image according to the first matching scores corresponding to the n1 first candidate region groups, wherein one first image region and the first target region matched with the first image region form one first candidate region group.
In a possible implementation manner of the first aspect, the determining, according to the first matching scores corresponding to the n1 first candidate region groups, a target region group between the target fingerprint image and each of the template fingerprint images includes:
screening k second candidate region groups from the determined n1 first candidate region groups according to the region distance, wherein k is a positive integer less than or equal to n 1;
calculating first matching scores corresponding to the k second candidate region groups respectively;
and determining a second candidate region group corresponding to the largest first matching score as a target region group between the target fingerprint image and the template fingerprint image.
In a possible implementation manner of the first aspect, the screening the image matching group from the template library according to the initial matching result includes:
and screening the image matching groups from the template library according to the first matching scores corresponding to the W target region groups.
In a second aspect, an embodiment of the present application provides a fingerprint comparison device, including:
the acquisition module is used for carrying out first matching processing on the target fingerprint image and the template fingerprint image in the template library to obtain an initial matching result;
the screening module is used for screening an image matching group from the template library according to the initial matching result, wherein the template library comprises W template fingerprint images, the image matching group comprises L template fingerprint images, and L is a positive integer smaller than W;
and the matching module is used for respectively performing second matching processing on the target fingerprint image and each template fingerprint image in the image matching group to obtain a final matching result, where the data granularity of the first matching processing is lower than that of the second matching processing.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the fingerprint matching method according to any one of the first aspects when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which, when executed by a processor, implements a fingerprint matching method as in any one of the first aspects above.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on a terminal device, causes the terminal device to perform the fingerprint matching method according to any one of the first aspects above.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic system flow diagram of a fingerprint identification method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of performing a first matching process according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of forming a first feature point set according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of determining a zone distance according to an embodiment of the present application;
Fig. 5 is a block diagram of a fingerprint matching device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "once", "in response to determining", or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise.
In today's network era, everyone has a large number of authentication passwords, such as startup passwords, mailbox passwords, bank passwords, and forum login passwords, and carries various keys, such as door keys, car keys, and safe keys. These are the means adopted by conventional security systems, and as society develops, their security grows weaker. People need to confirm personal identity and verify authority at any time; especially in the information society, security requirements keep rising while authentication is expected to remain simple and fast. To solve this problem, attention has turned to biometric recognition technology, in particular fingerprint recognition, a biometric technology that identifies a person by comparing the minutiae feature points of different fingerprints.
The development of fingerprint recognition technology benefits from modern electronic integrated manufacturing technology and from research into fast, reliable algorithms. Although a fingerprint covers only a small part of the human skin, the amount of data used for identification is quite large, and comparing these data is not simply a question of equal or unequal; a large amount of matching computation is required. At present, the fingerprint features extracted during fingerprint identification are mainly ridge endings and bifurcation points, and matching them sequentially is time-consuming when the fingerprint image is large, the fingerprint library is large, and hardware conditions are limited.
To solve the above problems, the embodiments of the present application provide a fingerprint matching method. In this method, the target fingerprint image is first coarsely matched against all template fingerprint images in the database to obtain a group of possibly matching images, this image matching group is then matched a second time, and the final matched image is determined from it. In other words, a coarse matching process first screens out, from all template fingerprint images in the database, the image matching group with the highest matching degree, and a detailed comparison of that group then yields the final matching result. When the fingerprint library is large or hardware resources are limited, this method greatly reduces comparison time and improves fingerprint matching efficiency.
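The coarse-to-fine flow described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the function names and the two scoring callables are placeholders standing in for the first (low-granularity) and second (high-granularity) matching processes.

```python
# Minimal sketch of the two-stage matching flow: a cheap coarse score screens
# the template library down to an image matching group of L candidates, and an
# expensive fine score is then computed only on that group.

def two_stage_match(target, templates, coarse_score, fine_score, L):
    """Return the index of the best-matching template, or None if there are no templates."""
    # First matching process: low-granularity score against every template.
    coarse = [(coarse_score(target, t), i) for i, t in enumerate(templates)]
    # Screen the image matching group: keep only the L best coarse candidates.
    group = sorted(coarse, reverse=True)[:L]
    # Second matching process: fine-granularity score on the group only.
    fine = [(fine_score(target, templates[i]), i) for _, i in group]
    return max(fine)[1] if fine else None
```

With W templates and L much smaller than W, the costly fine comparison runs L times instead of W times, which is where the claimed time saving comes from.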
Referring to fig. 1, a system flow diagram of a fingerprint identification method according to an embodiment of the present application is shown. By way of example, and not limitation, the method includes the steps of:
step S101, performing first matching processing on the target fingerprint image and the template fingerprint image in the template library to obtain an initial matching result.
In the embodiment of the present application, a fingerprint is the pattern of ridges formed by the uneven skin surface on the front of the fingertip. Although a fingerprint covers only a small part of the human skin, it contains a great deal of information. Because fingerprint features are unique and invariant over a lifetime, fingerprint recognition based on them has been applied in many fields.
In this embodiment of the present application, the target fingerprint image is the fingerprint image to be compared, and the template fingerprint images in the template library are pre-stored fingerprint images used for comparison. Matching the target fingerprint image one by one against the fingerprint features of every template fingerprint image would consume a great deal of time. Therefore, a coarse comparison (the first matching process) is first performed between the target fingerprint image and the template fingerprint images, and the template fingerprint images with a higher matching rate are screened out.
In one embodiment, referring to fig. 2, a flowchart of performing a first matching process according to an embodiment of the present application is shown in fig. 2, and one implementation manner of step S101 includes:
step S201, acquiring a first feature point set of the target fingerprint image and a second feature point set of each template fingerprint image in the template library, where the first feature point set includes n1 pixel points included in each of the first image areas, the second feature set includes n2 pixel points included in each of the second image areas, and both n1 and n2 are integers greater than 2.
In the embodiment of the present application, the main idea of the first matching process between the target fingerprint image and the template fingerprint images is region matching. In the region matching method, the target fingerprint image is divided into n1 regions according to the principal components of its pixels; each region contains a different number of pixel points, i.e., fingerprint feature points, and the n1 regions together with the feature points they contain are called the first feature point set. Similarly, each template fingerprint image in the template library is divided into n2 regions by the same method, and the n2 regions together with the feature points they contain are called the second feature point set. The n1 regions of the target fingerprint image are then coarsely compared with the n2 regions of the template fingerprint image, and the regions with the highest matching degree are screened out; this is the first matching process.
By this method, the target fingerprint image and the template fingerprint images can each be divided into regions, and performing coarse matching on regions further saves comparison time.
In one embodiment, referring to fig. 3, a schematic flow chart of forming a first feature point set according to an embodiment of the present application, as shown in fig. 3, an implementation manner of step S201 includes:
step S301, acquiring n1 first pixel points from the target fingerprint image, where a distance between every two first pixel points is greater than a first preset distance.
In the embodiment of the present application, let one template from the template library be T, and let the image to be compared, i.e., the target fingerprint image, be F. Feature points are first extracted from the template fingerprint image and the target fingerprint image, and the feature points in each fingerprint image are then sorted by quality score to facilitate the subsequent region division.
For example, the quality score is calculated as follows. Assume the fingerprint image is A pixels wide and B pixels high. The image is first divided into blocks of size C×C, each block being C pixels long and C pixels wide, so the fingerprint image is divided into (A/C)×(B/C) blocks. For each block, the sum of the absolute values of the horizontal and vertical gradients (dx, dy) of every pixel in the block is computed, and this sum is divided by the total number of pixels in the block (C×C) to obtain the mean value Vmean. Vmean represents the quality score of the current block, and the score of the block in which a feature point lies is taken as that feature point's score.
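The block quality score just described can be sketched as below. The choice of simple pixel differences as the gradient operator is an assumption; the patent does not fix how (dx, dy) are computed.

```python
import numpy as np

# Sketch of the block quality score: split the image into C x C blocks and take
# each block's Vmean = mean of (|dx| + |dy|) over its pixels. Blocky, high-
# contrast regions score high; flat regions score zero.

def block_quality(img, C):
    img = np.asarray(img, dtype=float)
    dx = np.zeros_like(img)
    dy = np.zeros_like(img)
    dx[:, 1:] = np.abs(np.diff(img, axis=1))   # horizontal gradient magnitude |dx|
    dy[1:, :] = np.abs(np.diff(img, axis=0))   # vertical gradient magnitude |dy|
    g = dx + dy
    B, A = img.shape                           # B = height, A = width
    scores = np.zeros((B // C, A // C))
    for by in range(B // C):
        for bx in range(A // C):
            block = g[by * C:(by + 1) * C, bx * C:(bx + 1) * C]
            scores[by, bx] = block.sum() / (C * C)   # Vmean for this block
    return scores
```

A feature point then inherits the Vmean of the block it falls into, which is the score used to sort feature points before region selection.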
In this embodiment of the present application, a displayed fingerprint image is composed of pixels: the pixel is the smallest light-emitting unit of a display screen, so fingerprint feature points can be obtained from pixels, and the fingerprint image is accordingly divided into regions by its pixels. First, n1 first pixel points Pt(n1, x, y, dir) are selected from the target fingerprint image F in descending order of quality score, where n1 indexes the selected center points of the target fingerprint image, x and y denote the position of the pixel point in a two-dimensional coordinate system, and dir denotes the direction of the pixel point. Before determining the first pixel points, the first preset distance D1, i.e., the minimum distance required between any two selected pixel points, must be determined; for example, D1 may be set to 40-60 px (pixels) and can be adjusted according to the actual image size. (Note: due to image differences, the number of points actually obtained may be slightly more or fewer than n1; n1 is generally set to 8-10.)
By calculating the quality scores of the feature points in this way, the quality of different regions can be compared effectively, and regions with higher quality scores can be selected preferentially during comparison.
Step S302, determining a first image area corresponding to each of the n1 first pixel points.
In this embodiment of the present application, the first image areas are the areas obtained by dividing the image around each acquired first pixel point. For example, each first pixel point may be used as a center and a circle of a specified radius drawn around it to obtain the first image areas; the radius may be the same for all areas or differ between them, and may be set according to the actual target fingerprint image.
Step S303, generating the first feature point set according to the pixel points included in the first image area.
In this embodiment of the present application, the first image areas are determined by setting a second preset distance D2: with the n1 first pixel points as center points, all pixel points within radius D2 of each center are found, and each center together with the pixel points within D2 of it is determined as a first image area. All the pixel points in the first image areas form the first feature point set. The radius D2 is set according to the size of D1; for example, if D1 is set to 40-60 px, D2 is also set to 40-60 px.
In one embodiment, one implementation of step S303 includes:
For each first image area, acquiring a center point of the first image area;
taking the center point as a rotation center, and carrying out rotation calibration on pixel points included in the first image area to obtain calibration points;
and generating the first feature point set according to the calibration points contained in each of the n1 first image areas. In the embodiment of the present application, the center point of each first image area, i.e., the first pixel point Pt(n1, x, y, dir), is acquired, and all feature points of the n1 areas are rotationally calibrated about the origin. Specifically, for each point with coordinates (x(n1,i), y(n1,i)), the coordinates x(n1), y(n1) of the respective center point are subtracted, translating the points into a coordinate system centered on the origin (0, 0) and yielding new coordinates (x1(n1,i), y1(n1,i)). The points are then rotated by the angle (-dir(n1)) about the origin; the rotation is calculated as follows:
x2(n1,i)=cos(-dir(n1))*x1(n1,i)-sin(-dir(n1))*y1(n1,i)
y2(n1,i)=sin(-dir(n1))*x1(n1,i)+cos(-dir(n1))*y1(n1,i)
dir2(n1,i)=dir1(n1,i)+(-dir(n1))
The calibrated first feature point set is obtained from the first image areas, the first pixel points, and the above calibration method.
By calibrating all feature points of the first image areas in this way, all pixel points are placed in the same coordinate system, and fingerprint feature points can be compared more accurately.
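The translate-then-rotate calibration above can be written out directly; this sketch mirrors the x2/y2/dir2 formulas, with each point and each center represented as an (x, y, dir) triple.

```python
import math

# Sketch of rotation calibration: move each area's center point Pt(n1,x,y,dir)
# to the origin, then rotate the area by -dir so all areas share a canonical
# orientation and can be compared in one coordinate system.

def calibrate_area(points, center):
    """points: list of (x, y, dir); center: (x, y, dir). Returns calibrated points."""
    cx, cy, cdir = center
    c, s = math.cos(-cdir), math.sin(-cdir)
    out = []
    for x, y, d in points:
        x1, y1 = x - cx, y - cy                # translate center to origin
        x2 = c * x1 - s * y1                   # x2 = cos(-dir)*x1 - sin(-dir)*y1
        y2 = s * x1 + c * y1                   # y2 = sin(-dir)*x1 + cos(-dir)*y1
        out.append((x2, y2, d - cdir))         # dir2 = dir1 + (-dir)
    return out
```

After calibration, the center itself maps to (0, 0, 0), and a point one pixel to the right of a center whose direction is 90° ends up one pixel below the origin.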
Step S202, determining a target region group between the target fingerprint image and each template fingerprint image according to the first feature point set and the second feature point set, where each target region group includes one first image region and one second image region, and the determined W target region groups are used as the initial matching result.
In this embodiment of the present application, the second image areas and the second feature point sets of all template fingerprint images in the template library can be acquired by the same method used to acquire the first feature point set. The region of each template fingerprint image with the highest matching degree to the target fingerprint image can then be obtained by the coarse matching method, i.e., the first matching process, and the region pairs with the highest matching degree obtained by coarse matching are taken as the target region groups.
For example, assume the number of template fingerprint images in the template library is W, and each of the W template fingerprint images contains several second image areas. By comparing these second image areas with the first image areas of the target fingerprint image using the coarse comparison method, the second image area with the highest matching degree to each first image area can be obtained; a matched pair of first and second image areas is called a target region group. Comparing the W template fingerprint images with the target fingerprint image in this way yields W target region groups, which constitute the matching result of the first matching process.
Through the method, the target area group with high matching degree with the target fingerprint image can be determined from the template library through the rough comparison method, and comparison data is provided for detailed comparison of subsequent fingerprints.
In one embodiment, referring to fig. 4, a flowchart of determining a region distance according to an embodiment of the present application is shown in fig. 4, and one implementation of step S202 includes:
step S401, for each of the template fingerprint patterns, calculating a region distance between each of the first image regions and each of the second image regions in the template fingerprint pattern according to the first feature point set and the second feature point set of the template fingerprint pattern.
In the embodiment of the present application, the area distance refers to the absolute value of the difference in pixel coordinate positions between the first image area and the second image area. Illustratively, if x (F, 0, i, dir) is the x-coordinate of the ith feature point in the first image region center point set in the target fingerprint image F, y (T, 0, j, dir) is the y-coordinate of the j feature points in the second image region center point set in the template fingerprint image T, and the other is the same. The region distance between the first image region and the second image region is calculated as follows:
Δdx=|x(F,0,i,dir)-x(T,0,j,dir)|
Δdy=|y(F,0,i,dir)-y(T,0,j,dir)|
The region distance between the two image regions can be obtained from the above formulas.
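As a minimal illustration of the formulas above, the per-axis region distance can be sketched as follows; the patent does not prescribe an implementation, so the function and variable names here are hypothetical:

```python
# Illustrative sketch (not from the patent text): computing the region
# distance between the centre of the i-th first image region of the target
# fingerprint F and the centre of the j-th second image region of the
# template fingerprint T. `centers_F` and `centers_T` are assumed to be
# lists of (x, y) pixel coordinates of the region centre points.

def region_distance(centers_F, centers_T, i, j):
    """Return (delta_dx, delta_dy), the per-axis absolute coordinate gaps."""
    x_f, y_f = centers_F[i]
    x_t, y_t = centers_T[j]
    delta_dx = abs(x_f - x_t)
    delta_dy = abs(y_f - y_t)
    return delta_dx, delta_dy

# Example: region centre (120, 80) in F against region centre (115, 95) in T.
print(region_distance([(120, 80)], [(115, 95)], 0, 0))  # (5, 15)
```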
Step S402, determining a first target area matched with each first image area from the second image areas of the template fingerprint image according to the area distance.
In this embodiment of the present application, the area distance between each first image area in the target fingerprint image F and each second image area in the template fingerprint image T is calculated, and according to the area distances, the second image area with the highest matching degree for each first image area can be determined and recorded. The second image area that matches a first image area is referred to as a first target area.
By this method, the best match between two image areas can be obtained according to the area distance, and the image areas of each template fingerprint image in the template library with the higher matching degree to the target fingerprint image can be further determined.
Step S403, determining a target region group between the target fingerprint image and the template fingerprint image according to the first matching scores corresponding to the n1 first candidate region groups, where one first image region and the first target region matched with the first image region form one first candidate region group.
In this embodiment of the present application, since the target fingerprint image is divided into n1 regions, n1 first target regions are obtained by the region distance comparison. Each first image region and the first target region matched with it form one first candidate region group, so the target fingerprint image and each template fingerprint image yield n1 first candidate region groups. A first matching score is then calculated for each first candidate region group, and the second image region with the highest matching score, i.e. the highest matching degree, can be determined from the scores.
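The pairing step described above can be sketched as follows, assuming the summed per-axis distance is used to rank candidates (the function and variable names are illustrative, not from the patent):

```python
# Hedged sketch: for each first image region centre in the target fingerprint,
# pick the second image region centre in the template with the smallest summed
# region distance, forming one first candidate region group per first region
# (n1 groups in total).

def best_region_pairs(centers_F, centers_T):
    """Return [(i, j), ...]: each first region i paired with its best second region j."""
    pairs = []
    for i, (xf, yf) in enumerate(centers_F):
        dists = [abs(xf - xt) + abs(yf - yt) for xt, yt in centers_T]
        j = min(range(len(dists)), key=dists.__getitem__)
        pairs.append((i, j))
    return pairs

print(best_region_pairs([(0, 0), (50, 50)], [(60, 40), (5, 5)]))  # [(0, 1), (1, 0)]
```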
In one embodiment, one implementation of step S403 includes:
and screening k second candidate region groups from the determined n1 first candidate region groups according to the region distance, wherein k is a positive integer less than or equal to n 1.
In the embodiment of the present application, since the n1 first candidate region groups contain region groups of differing matching degrees, the candidate region groups with the higher matching degrees need to be screened from the n1 first candidate region groups for further comparison. For example, the k first candidate region groups with the highest matching degrees can be screened out and set as the second candidate region groups for further comparison, where k is a positive integer whose value cannot be larger than n1.
For example, the above screening may be performed by sorting the matching scores. Let the score between a first image region of the target fingerprint and one of the second image regions in the template fingerprint be f, and set a distance threshold t (which may be chosen empirically). The distance sum f1 = Δdx + Δdy is computed, and if f1 < t, the score is obtained by the formula f = t − f1.
By this method, the best matching areas can be further determined through the matching scores, and the matching degree between each image area and the target fingerprint image can be seen more intuitively from the scores.
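The scoring rule just described — converting the distance sum f1 = Δdx + Δdy into a score f = t − f1 when it falls below the empirical threshold t — can be sketched as follows; the threshold value used here is an arbitrary illustration:

```python
# Hedged sketch of the score computation described above. The threshold t
# would be set empirically per the patent text; t=100 is an assumption.

def match_score(delta_dx, delta_dy, t=100):
    f1 = delta_dx + delta_dy   # summed region distance
    if f1 < t:
        return t - f1          # smaller distance -> larger score
    return 0                   # regions too far apart: no score

print(match_score(5, 15))   # 80
print(match_score(70, 60))  # 0 (f1 = 130 >= t)
```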
Calculating the first matching scores corresponding to the k second candidate region groups respectively.
In the embodiment of the application, after the k second candidate region groups are selected, the offset and the rotation angle are calculated from the center points of the respective areas according to the above method, and the template fingerprint image is calibrated into the same coordinate system as the target fingerprint image according to the translation distance and the rotation angle. The first matching score is then recalculated for each calibrated second candidate region group according to the score calculation method above.
And determining a second candidate region group corresponding to the largest first matching score as a target region group between the target fingerprint image and the template fingerprint image.
In the embodiment of the application, the k matching scores obtained by the calculation are ranked, and the second candidate region group corresponding to the maximum score is determined to be the target region group with the highest matching degree. The target region group identifies the image region in the template fingerprint image with the highest matching degree to the target fingerprint image.
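The calibration step described above — translating and rotating the template's points into the target's coordinate system before re-scoring — might be sketched as follows. In practice the offset and angle would be derived from the matched region center points; all names here are assumptions:

```python
import math

# Hedged sketch: translate a set of template points by (dx, dy), then rotate
# them about a shared centre point, so they can be re-scored in the target
# fingerprint's coordinate system.

def calibrate_points(points, dx, dy, angle_rad, center):
    """Translate points by (dx, dy), then rotate them by angle_rad about `center`."""
    cx, cy = center
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    out = []
    for x, y in points:
        x, y = x + dx, y + dy                       # apply the translation first
        rx = cx + (x - cx) * cos_a - (y - cy) * sin_a  # then rotate about the centre
        ry = cy + (x - cx) * sin_a + (y - cy) * cos_a
        out.append((rx, ry))
    return out

# Zero offset and zero angle leave the points unchanged.
print(calibrate_points([(10, 20)], 0, 0, 0.0, (0, 0)))  # [(10.0, 20.0)]
```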
In one embodiment, the method further comprises:
and screening the image matching groups from the template library according to the first matching scores corresponding to the W target region groups.
In this embodiment, the matching score between each template fingerprint image in the template library and the target fingerprint image can be obtained by the above method: if there are W template fingerprint images in the template library, W target region groups are obtained, and the target region groups with the higher matching degrees can then be selected by screening.
And step S103, respectively carrying out second matching processing on the target fingerprint image and each template fingerprint image in the image matching group to obtain a final matching result, wherein the data granularity of the first matching processing is lower than that of the second matching processing.
In the embodiment of the application, the image matching group whose matching degree with the target fingerprint image is relatively high is roughly obtained from the template library through the first matching processing, namely the determined L image matching groups. The L image matching groups then need to be subjected to detailed comparison, that is, the second matching processing (the detailed comparison algorithm differs depending on the sensor model, and its threshold is likewise determined by algorithm evaluation), so as to obtain the final matching result. The first matching processing yields a rough result, while the second matching processing is a detailed matching process with high accuracy.
The application provides a fingerprint rough comparison method. When the template library is large, rough comparison scores are computed between the current fingerprint and all templates in the library; all rough comparison scores are then sorted from large to small, and the first H template fingerprint images with the largest scores are selected (for example, H may take a value of 10 to 20) for detailed comparison (the detailed comparison algorithm differs depending on the sensor model, and its threshold is likewise determined by algorithm evaluation). The rough comparison thus only plays the role of scoring and sorting the large template library in advance to obtain the fingerprint templates most likely to match successfully, so that the detailed comparison is performed with few templates and the matching time is greatly reduced.
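The top-H shortlist step described above can be sketched as a simple sort over the rough comparison scores (the scores and the value of H below are illustrative):

```python
# Hedged sketch of the coarse-filter step: sort all rough comparison scores
# in descending order and keep only the top H templates for the detailed
# comparison stage.

def select_top_templates(coarse_scores, H):
    """coarse_scores: {template_id: score}; return the H best-scoring template ids."""
    ranked = sorted(coarse_scores, key=coarse_scores.get, reverse=True)
    return ranked[:H]

scores = {"W1": 80, "W2": 35, "W3": 92, "W4": 61}
print(select_top_templates(scores, 2))  # ['W3', 'W1']
```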
By way of example, assume that the template library includes 100 template fingerprint patterns W1, W2, ..., W100 and that the target fingerprint pattern to be compared is F. According to the method of the present application, the target fingerprint pattern F is first divided into 3 first image areas F1, F2 and F3; likewise, the first template fingerprint image W1 is divided into 3 second image areas T1, T2 and T3, the second template fingerprint image W2 is divided into 2 second image areas L1 and L2, and all the remaining template fingerprint images are divided into areas in turn.
According to the method, the area distances between F1, F2 and F3 in the target fingerprint image and T1, T2 and T3 in the first template fingerprint image are first calculated, giving the area distances corresponding to F1T1, F1T2, F1T3, F2T1, F2T2, F2T3, F3T1, F3T2 and F3T3. Score matching can be carried out with these area distances: the area distances are compared and scores are calculated, then the maximum score corresponding to each F area is found in turn, and the indexes of the corresponding T area and F area are recorded. For example, if the maximum score for F1 corresponds to T1, the maximum score for F2 corresponds to T3, and the maximum score for F3 corresponds to T2, then the 3 first candidate area groups for the first template fingerprint image are F1T1, F2T3 and F3T2. From these 3 first candidate region groups, the 2 (assuming k = 2) with the top scores are selected as second candidate region groups. The first matching scores of the 2 second candidate region groups are then calculated respectively, and the second candidate region group corresponding to the largest first matching score is taken as the target region group between the first template fingerprint image and the target fingerprint image.
And processing each template fingerprint image in the template library sequentially according to the steps to obtain a target region group between each template fingerprint image and the target fingerprint image. I.e. a total of 100 target area groups.
Then the 20 (assuming L = 20) target region groups with the highest first matching scores are selected from the 100 target region groups, and the template fingerprint patterns corresponding to these 20 target region groups are determined as the image matching group. Second matching processing is then performed between the target fingerprint image and each of the 20 template fingerprint images in the image matching group to obtain the final matching result.
Therefore, through the above method, 20 image matching groups are screened out of the 100 template fingerprint images by rough comparison with the target fingerprint image before the detailed comparison is performed, so that the comparison is reduced from 100 groups of fingerprint images to 20 groups, and the matching time is greatly reduced. Thus, when the fingerprint library is large or the hardware resources are limited, this algorithm is first used for rough matching, and a subset of templates is then selected for detailed comparison, which can greatly reduce the comparison time while still achieving high accuracy.
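Putting the worked example together, a minimal two-stage pipeline under the above assumptions might look like this; `coarse_score` and `fine_compare` are placeholders standing in for the patent's rough and detailed comparisons, not its actual algorithms:

```python
# Hedged sketch of the overall coarse-then-fine flow: score every template
# roughly, shortlist the L best, and run the expensive fine comparison only
# on the shortlist.

def two_stage_match(target_regions, template_library, L, coarse_score, fine_compare):
    # Stage 1: rough-score every template against the target regions.
    coarse = {tid: coarse_score(target_regions, regions)
              for tid, regions in template_library.items()}
    # Keep only the L best-scoring templates (the "image matching group").
    shortlist = sorted(coarse, key=coarse.get, reverse=True)[:L]
    # Stage 2: fine comparison on the shortlist only; return the best match.
    return max(shortlist,
               key=lambda tid: fine_compare(target_regions, template_library[tid]))

# Toy usage: one "region" per fingerprint, score = negative distance.
lib = {"W1": [1], "W2": [9], "W3": [5]}
best = two_stage_match([9], lib, 2,
                       coarse_score=lambda f, t: -abs(f[0] - t[0]),
                       fine_compare=lambda f, t: -abs(f[0] - t[0]))
print(best)  # W2
```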
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and shall not limit the implementation process of the embodiments of the present application in any way.
Corresponding to the fingerprint matching method described in the above embodiments, fig. 5 is a block diagram of the fingerprint matching device provided in the embodiment of the present application; for convenience of explanation, only the portion relevant to the embodiment of the present application is shown.
Referring to fig. 5, the apparatus includes:
the obtaining module 51 is configured to perform a first matching process on the target fingerprint image and the template fingerprint image in the template library, so as to obtain an initial matching result;
the screening module 52 is configured to screen an image matching group from the template library according to the initial matching result, where the template library includes W template fingerprint images, the image matching group includes L template fingerprint images, and L is a positive integer less than W;
and a matching module 53, configured to perform a second matching process on the target fingerprint image and each template fingerprint image in the image matching group, so as to obtain a final matching result, where the data granularity of the first matching process is lower than the data granularity of the second matching process.
Optionally, the matching module 53 is further configured to:
acquiring a first characteristic point set of the target fingerprint image and a second characteristic point set of each template fingerprint image in the template library, wherein the first characteristic point set comprises pixel points contained in n1 first image areas, the second characteristic point set comprises pixel points contained in n2 second image areas, and both n1 and n2 are integers larger than 2;
And determining target area groups between the target fingerprint image and each template fingerprint image according to the first characteristic point set and the second characteristic point set, wherein each target area group comprises a first image area and a second image area, and the determined W target area groups serve as initial matching results.
Optionally, the matching module 53 is further configured to:
acquiring n1 first pixel points from the target fingerprint image, wherein the distance between every two first pixel points is larger than a first preset distance;
determining n1 first image areas corresponding to the first pixel points respectively;
and generating the first characteristic point set according to the pixel points included in the first image area.
Optionally, the matching module 53 is further configured to:
for each first image area, acquiring a center point of the first image area;
taking the center point as a rotation center, and carrying out rotation calibration on pixel points included in the first image area to obtain calibration points;
and generating the first characteristic point set according to the calibration points contained in each of the n1 first image areas.
Optionally, the matching module 53 is further configured to:
For each template fingerprint image, calculating the area distance between each first image area and each second image area in the template fingerprint image according to the first feature point set and the second feature point set of the template fingerprint image;
determining a first target area matched with each first image area from the second image areas of the template fingerprint image according to the area distance;
and determining a target region group between the target fingerprint image and the template fingerprint image according to the first matching scores corresponding to the n1 first candidate region groups, wherein one first image region and the first target region matched with the first image region form one first candidate region group.
Optionally, the matching module 53 is further configured to:
screening k second candidate region groups from the determined n1 first candidate region groups according to the region distance, wherein k is a positive integer less than or equal to n 1;
calculating first matching scores corresponding to the k second candidate region groups respectively;
and determining a second candidate region group corresponding to the largest first matching score as a target region group between the target fingerprint image and the template fingerprint image.
Optionally, the matching module 53 is further configured to:
and screening the image matching groups from the template library according to the first matching scores corresponding to the W target region groups.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
In addition, the fingerprint matching device shown in fig. 5 may be a software unit, a hardware unit, or a unit combining software and hardware built into an existing terminal device, may be integrated into the terminal device as an independent plug-in, or may exist as an independent terminal device.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Fig. 6 is a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in fig. 6, the terminal device 6 of this embodiment includes: at least one processor 60 (only one shown in fig. 6), a memory 61 and a computer program 62 stored in the memory 61 and executable on the at least one processor 60, the processor 60 implementing the steps in any of the various fingerprint matching method embodiments described above when executing the computer program 62.
The terminal device may be a computing device such as a desktop computer, a notebook computer, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, a processor and a memory. It will be appreciated by those skilled in the art that fig. 6 is merely an example of the terminal device 6 and does not constitute a limitation of the terminal device 6, which may include more or fewer components than shown, combine certain components, or have different components; for example, it may also include input-output devices, network access devices, etc.
The processor 60 may be a central processing unit (Central Processing Unit, CPU), the processor 60 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may in some embodiments be an internal storage unit of the terminal device 6, such as a hard disk or a memory of the terminal device 6. The memory 61 may in other embodiments also be an external storage device of the terminal device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing an operating system, an application program, a Boot Loader (Boot Loader), data, other programs, etc., such as program codes of the computer program. The memory 61 may also be used for temporarily storing data that has been output or is to be output.
Embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, may implement the steps in the above-described method embodiments.
Embodiments of the present application also provide a computer program product which, when run on a terminal device, causes the terminal device to perform the steps of the respective method embodiments described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application implements all or part of the flow of the method of the above embodiments, and may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, where the computer program, when executed by a processor, may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to an apparatus/terminal device, recording medium, computer Memory, read-Only Memory (ROM), random access Memory (RAM, random Access Memory), electrical carrier signals, telecommunications signals, and software distribution media. Such as a U-disk, removable hard disk, magnetic or optical disk, etc. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A fingerprint matching method, comprising:
performing first matching processing on the target fingerprint image and the template fingerprint image in the template library to obtain an initial matching result;
screening an image matching group from the template library according to the initial matching result, wherein the template library comprises W template fingerprint images, the image matching group comprises L template fingerprint images, and L is a positive integer smaller than W;
And respectively carrying out second matching processing on the target fingerprint image and each template fingerprint image in the image matching group to obtain a final matching result, wherein the data granularity of the first matching processing is lower than that of the second matching processing.
2. The fingerprint matching method as set forth in claim 1, wherein the first matching process between the target fingerprint pattern and the template fingerprint patterns in the template library to obtain an initial matching result includes:
acquiring a first characteristic point set of the target fingerprint image and a second characteristic point set of each template fingerprint image in the template library, wherein the first characteristic point set comprises pixel points contained in n1 first image areas, the second characteristic point set comprises pixel points contained in n2 second image areas, and both n1 and n2 are integers larger than 2;
and determining target area groups between the target fingerprint image and each template fingerprint image according to the first characteristic point set and the second characteristic point set, wherein each target area group comprises a first image area and a second image area, and the determined W target area groups serve as initial matching results.
3. The fingerprint matching method of claim 2, wherein the step of obtaining the first set of feature points of the target fingerprint pattern comprises:
acquiring n1 first pixel points from the target fingerprint image, wherein the distance between every two first pixel points is larger than a first preset distance;
determining n1 first image areas corresponding to the first pixel points respectively;
and generating the first characteristic point set according to the pixel points included in the first image area.
4. The fingerprint matching method of claim 3, wherein the generating the first set of feature points from the pixel points included in the first image region comprises:
for each first image area, acquiring a center point of the first image area;
taking the center point as a rotation center, and carrying out rotation calibration on pixel points included in the first image area to obtain calibration points;
and generating the first characteristic point set according to the calibration points contained in each of the n1 first image areas.
5. The fingerprint matching method of claim 2, wherein said determining a set of target regions between the target fingerprint pattern and each of the template fingerprint patterns from the first set of feature points and the second set of feature points comprises:
For each template fingerprint map, calculating the region distance between each first image region and each second image region in the template fingerprint map according to the first feature point set and the second feature point set of the template fingerprint map;
determining a first target area matched with each first image area from the second image areas of the template fingerprint image according to the area distance;
and determining a target region group between the target fingerprint image and the template fingerprint image according to the first matching scores corresponding to the n1 first candidate region groups, wherein one first candidate region group is formed by one first image region and a first target region matched with the first image region.
6. The fingerprint matching method of claim 5, wherein determining a set of target regions between the target fingerprint pattern and each of the template fingerprint patterns based on the respective first matching scores of the n1 first candidate region sets comprises:
screening k second candidate region groups from the determined n1 first candidate region groups according to the region distance, wherein k is a positive integer less than or equal to n 1;
Calculating first matching scores corresponding to the k second candidate region groups respectively;
and determining a second candidate region group corresponding to the largest first matching score as a target region group between the target fingerprint image and the template fingerprint image.
7. The fingerprint matching method of claim 1, wherein said screening the image matching group from the template library based on the initial matching result comprises:
and screening the image matching groups from the template library according to the first matching scores corresponding to the W target region groups.
8. A fingerprint matching device, comprising:
the acquisition module is used for carrying out first matching processing on the target fingerprint image and the template fingerprint image in the template library to obtain an initial matching result;
the screening module is used for screening an image matching group from the template library according to the initial matching result, wherein the template library comprises W template fingerprint images, the image matching group comprises L template fingerprint images, and L is a positive integer smaller than W;
and the matching module is used for respectively carrying out second matching processing on the target fingerprint image and each template fingerprint image in the image matching group to obtain a final matching result, wherein the data granularity of the first matching processing is lower than that of the second matching processing.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 7.
CN202310359911.2A 2023-03-28 2023-03-28 Fingerprint matching method, fingerprint matching device, terminal equipment and computer readable storage medium Pending CN116311393A (en)

Publication of CN116311393A: 2023-06-23

Mohammadi et al. New approaches to fingerprint authentication using software methods based on fingerprint texture
Szczepanik et al. Security lock system for mobile devices based on fingerprint recognition algorithm
WO2016005759A1 (en) Access management system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination