CN108416342B - Fingerprint identification method combining thin node and thin line structure - Google Patents


Info

Publication number
CN108416342B
CN108416342B (application CN201810524261.1A)
Authority
CN
China
Prior art keywords
fingerprint
image
minutiae
line structure
fine line
Prior art date
Legal status
Active
Application number
CN201810524261.1A
Other languages
Chinese (zh)
Other versions
CN108416342A (en)
Inventor
沈雷
汤正刚
吕葛梁
Current Assignee
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN201810524261.1A
Publication of CN108416342A
Application granted
Publication of CN108416342B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/12: Fingerprints or palmprints
    • G06V 40/1347: Preprocessing; feature extraction
    • G06V 40/1353: Extracting features related to minutiae or pores
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/28: Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/30: Noise filtering
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/12: Fingerprints or palmprints
    • G06V 40/1365: Matching; classification
    • G06V 40/1371: Matching features related to minutiae or pores

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a fingerprint identification method combining minutiae and fine-line structure. The method comprises the following steps: step 1, collect the finger fingerprint images of all registered users, preprocess each image and extract its features to obtain minutiae and fine-line structural features, store both in a database, and thereby establish a finger fingerprint database; step 2, collect the finger fingerprint image of the user to be identified, preprocess it and extract its features to obtain the minutiae features and fine-line structural features of that image; step 3, match the finger fingerprint image of the user to be identified. The invention makes full use of the direction-field information around each minutia, so that the minutiae have stronger discriminating capability and the influence of false minutiae on matching is reduced; the minutiae features also allow the matching reference points of two fingerprints to be extracted quickly, shortening fingerprint matching time.

Description

Fingerprint identification method combining thin node and thin line structure
Technical Field
The invention belongs to the technical field of biological feature identification and information security, and particularly relates to a fingerprint identification method combining a minutiae point structure and a fine line structure.
Background
Fingerprint identification is a biometric identification technology with high security, high stability, strong universality and strong compatibility, and its acquisition equipment is convenient, so it has become a research hotspot for scholars at home and abroad. Fingerprint identification mainly comprises collecting finger fingerprint images, preprocessing the images, extracting features, and matching for identification. For high-quality fingerprint images, current fingerprint identification systems perform nearly perfectly, with high precision, high efficiency and stable performance. In actual collection, however, owing to the acquisition equipment and the environment, high-quality images cannot always be obtained; some fingerprints suffer from image defects and broken ridges of poor quality. Low-quality fingerprint images contain a large amount of interference noise and little effective information, so identification based on a single feature has low accuracy. How to improve the accuracy and efficiency of fingerprint identification for low-quality fingerprints, small-area fingerprints, distorted fingerprints and latent fingerprints has therefore become a difficult point of current research.
The most basic and most widely applied matching method in existing automatic fingerprint identification systems is matching based on fingerprint minutiae, i.e. point-pattern matching. The minutiae feature pattern, however, has several disadvantages. First, minutiae are unevenly distributed: some regions of a fingerprint contain few minutiae, and when such a region occupies the main area of the image, identifying the fingerprint by minutiae often leads to wrong matches. Second, many false minutiae are easily extracted in low-quality regions; when the false minutiae in such a region far outnumber the correct ones, existing fingerprint preprocessing algorithms have difficulty distinguishing them accurately, so either many false minutiae are retained or some correct minutiae are removed, lowering identification accuracy. Third, in a severely distorted fingerprint the positions and directions of the minutiae change greatly; although some matching algorithms can adapt to fingerprint deformation, they raise the false acceptance rate while lowering the false rejection rate.
In summary, because of these defects of minutiae features, the recognition rate is difficult to bring to a satisfactory level under conditions of low quality, small area, severe distortion and the like. In the fingerprint identification process, the thin-line (ridge skeleton) structure is also an important feature of the fingerprint. From the viewpoint of the thin-line structure, a minutia represents an abrupt change of that structure, namely a sudden interruption or bifurcation of a line; this feature is salient and convenient for distinguishing different fingerprints, but it is not stable. On the one hand, regions without line changes contain no minutiae; on the other hand, the image collection process can produce regions that differ from the actual fingerprint and generate spurious changes, from which false minutiae are extracted. The basic shape, ridge flow and similar characteristics of a fingerprint image are determined by the thin-line structure, which therefore has higher reliability. However, if thin-line structure matching is used alone, the reference point pair of the thin-line structures is difficult to determine, which hampers alignment of the thin-line structures and increases fingerprint matching time.
Disclosure of Invention
The invention aims to provide a fingerprint identification method combining minutiae and fine-line structure, addressing the defect of the prior art that matching on a single minutiae pattern can hardly meet the requirements of an automatic fingerprint identification system for low-quality fingerprints, small-area fingerprints, distorted fingerprints and latent fingerprints.
The technical scheme adopted by the invention for solving the technical problem comprises the following steps:
step 1, collecting finger fingerprint images of all registered users, respectively preprocessing and extracting characteristics of the fingerprint images to obtain minutiae and fine line structural characteristics, respectively storing the minutiae and the fine line structural characteristics in a database, and establishing a finger fingerprint database;
step 2, collecting a finger fingerprint image of a user to be identified, and preprocessing and extracting features of the fingerprint image to obtain minutiae features and fine line structural features of the finger fingerprint image of the user to be identified;
step 3, matching the finger fingerprint image of the user to be identified:
the step 1 is specifically realized as follows:
1-1, collecting a finger fingerprint image of a registered user, and carrying out gray level processing on the fingerprint image to obtain an original gray level image of the fingerprint image;
1-2, obtaining a direction field of an original gray level image by a gradient-based direction field calculation method, and segmenting the original gray level image to obtain a fingerprint ridge area image;
1-3, carrying out image enhancement on the fingerprint ridge area image;
1-4, converting the enhanced fingerprint ridge area image into a binary image;
1-5, filtering noise in the binary image, then carrying out skeletonization, and then carrying out deburring treatment on the skeletonized image to obtain a fingerprint thin-line structure image;
1-6, searching minutiae of the fingerprint fine line structure image by a fine line tracking method to obtain the minutiae of the fingerprint fine line structure image, and storing the obtained fingerprint fine line structure image and the minutiae corresponding to the fingerprint fine line structure image as a pair in a database;
1-7, repeating the steps 1-1 to 1-6, finishing the acquisition of all registered users, and establishing a fingerprint database.
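Steps 1-5 and 1-6 reduce the fingerprint to a one-pixel-wide skeleton and then locate minutiae on it. The patent uses a fine-line tracking method; the sketch below instead uses the standard crossing-number test, a common equivalent for finding ridge endings and bifurcations on a skeleton image (function and variable names are illustrative, not from the patent):

```python
import numpy as np

def minutiae_from_skeleton(skel):
    """Locate ridge endings and bifurcations on a 1-pixel-wide skeleton.

    `skel` is a 2-D 0/1 array (1 = ridge pixel).  The crossing number
    CN is half the sum of absolute differences around the 8-neighbourhood:
    CN == 1 -> ridge ending, CN == 3 -> bifurcation.
    """
    endings, bifurcations = [], []
    h, w = skel.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if not skel[y, x]:
                continue
            # 8 neighbours in clockwise order; the modulo closes the loop
            n = [skel[y-1, x], skel[y-1, x+1], skel[y, x+1], skel[y+1, x+1],
                 skel[y+1, x], skel[y+1, x-1], skel[y, x-1], skel[y-1, x-1]]
            cn = sum(abs(int(n[i]) - int(n[(i + 1) % 8])) for i in range(8)) // 2
            if cn == 1:
                endings.append((x, y))
            elif cn == 3:
                bifurcations.append((x, y))
    return endings, bifurcations
```

A single straight skeleton segment, for example, yields exactly two ridge endings and no bifurcations.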
The step 2 is specifically realized as follows:
2-1, collecting a finger fingerprint image of a user to be identified, and carrying out gray level processing on the fingerprint image to obtain an original gray level image of the fingerprint image;
2-2, obtaining a direction field of the original gray level image by a direction field calculation method based on gradient, and segmenting the original gray level image to obtain a fingerprint ridge area image;
2-3, carrying out image enhancement on the fingerprint ridge area image;
2-4, converting the enhanced fingerprint ridge area image into a binary image;
2-5, filtering noise in the binary image, then carrying out skeletonization, and then carrying out deburring treatment on the skeletonized image to obtain a fingerprint thin-line structure image;
2-6, searching minutiae of the fingerprint fine line structure image by a fine line tracking method to obtain the minutiae of the fingerprint fine line structure image and obtain minutiae characteristics of the fingerprint image;
step 3, matching the finger fingerprint image of the user to be identified:
3-1, acquiring an optimal reference point pair by using the direction field information around the minutiae according to the minutiae descriptor:
3-1-1. Input two fingerprint images A and B, where A is the fingerprint image to be identified, with extracted minutiae set A = {a_1, a_2, ..., a_n, ..., a_N}, and B is any fingerprint image in the fingerprint library, with extracted minutiae set B = {b_1, b_2, ..., b_m, ..., b_M}.
3-1-2. Select any minutia a_n in fingerprint image A and traverse all minutiae in fingerprint image B. If there exists a minutia b_m in B such that a_n and b_m are of the same type and their position translation lies within (±Δx_0, ±Δy_0), go to step 3-1-4; if no corresponding minutia b_m is found after traversing all minutiae in B, discard the minutia a_n.
3-1-3. Continue selecting the next minutia from fingerprint image A and repeat step 3-1-2 until all minutiae in fingerprint image A have been traversed.
3-1-4, constructing a minutiae descriptor, wherein the descriptor comprises three auxiliary points in a direction field around the minutiae; calculating the relative angle difference between every two points in the three auxiliary points:
Δθ_k = |θ_i − θ_j|  (i, j = 0, 1, 2, 3; i < j)    (1)
In formula (1), k is the index of the relative angle difference within the minutia descriptor, with 1 ≤ k ≤ 6.
The k-th angular deviation between two minutiae points is denoted as G (k):
G(k) = |Δθ_k(a_n) − Δθ_k(b_m)|    (2)
In formula (2), Δθ_k(a_n) is the k-th relative angle difference of minutia a_n, and Δθ_k(b_m) is the k-th relative angle difference of minutia b_m.
3-1-5. Judge whether the two minutiae form a preliminary reference point pair: if any G(k) in formula (2) is greater than the threshold T_1, the two points do not match; return to step 3-1-2. Otherwise, the two minutiae form a preliminary reference point pair. Record the set of preliminary reference point pairs Q, the number of matched pairs L, and the position offset (Δx_i, Δy_i, Δθ_i) between each pair of preliminary reference points.
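Steps 3-1-4 and 3-1-5 can be sketched as follows, assuming the descriptor is simply the list of direction-field angles θ_0..θ_3 at the minutia and its three auxiliary points (how the auxiliary points are sampled from the direction field is not reproduced here):

```python
from itertools import combinations

def descriptor(angles_deg):
    """Six relative angle differences Delta-theta_k of formula (1),
    computed from the four direction-field angles theta_0..theta_3
    (the minutia plus its three auxiliary points)."""
    assert len(angles_deg) == 4
    return [abs(angles_deg[i] - angles_deg[j])
            for i, j in combinations(range(4), 2)]   # k = 1..6

def is_preliminary_pair(desc_a, desc_b, t1):
    """Step 3-1-5: the pair survives only if every deviation G(k)
    of formula (2) is within the threshold T_1."""
    return all(abs(da - db) <= t1 for da, db in zip(desc_a, desc_b))
```

A pair whose descriptors differ slightly passes, while one badly off in even a single G(k) is rejected.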
3-1-6, finding the best reference point from the set Q of the preliminary reference points.
For any pair of matching points (Δx_i, Δy_i, Δθ_i) (i = 1, 2, ..., L) in the set of preliminary reference point pairs, calculate its deviation d_i from each of the remaining matching points (Δx, Δy, Δθ) in Q:
d_i = sqrt((Δx_i − Δx)^2 + (Δy_i − Δy)^2 + (Δθ_i − Δθ)^2)    (3)
For each pair, the number of its deviations d_i that are less than the best-reference-point threshold T_2 is counted; these counts form a set D. The maximum value C_m in D is selected, and the matching point pair corresponding to C_m is the best reference point pair, i.e. a_i and b_j, with corresponding position deviation parameters (Δx_ij, Δy_ij, Δθ_ij).
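A minimal sketch of step 3-1-6. Since formula (3) survives in the source only as an image, the deviation metric below (Euclidean over the offset triples, with an adjustable angle weight) is an assumption; the consensus-counting logic follows the text:

```python
import math

def best_reference_pair(offsets, t2, w_theta=1.0):
    """Pick from the preliminary pairs the one whose offset
    (dx_i, dy_i, dtheta_i) is consistent with the most other pairs.

    `offsets` is a list of (dx, dy, dtheta) tuples, one per preliminary
    reference pair in Q.  The deviation metric is an assumed form
    (the patent's eq. 3 is only an image): Euclidean over the offsets,
    with the angle term weighted by `w_theta`.
    """
    best_idx, best_count = -1, -1
    for i, (dx, dy, dth) in enumerate(offsets):
        count = 0
        for j, (dx2, dy2, dth2) in enumerate(offsets):
            if i == j:
                continue
            d = math.sqrt((dx - dx2) ** 2 + (dy - dy2) ** 2
                          + (w_theta * (dth - dth2)) ** 2)
            if d < t2:          # consistent with pair j
                count += 1
        if count > best_count:
            best_idx, best_count = i, count
    return best_idx, best_count
```

Three mutually consistent offsets plus one outlier, for example, elect one of the consistent pairs with a consensus count of 2.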
Step 3-2, feature matching of detail points
3-2-1, calibrating the positions of all the points in the detail point set, wherein the formula is as follows:
x' = S_0 · (x · cos Δθ_ij − y · sin Δθ_ij) + Δx_ij,  y' = S_0 · (x · sin Δθ_ij + y · cos Δθ_ij) + Δy_ij    (4)
θ' = θ + Δθ_ij    (5)
where S_0 is the scale transformation parameter. The minutiae set A = {a_1, a_2, ..., a_N} of the fingerprint to be identified is converted by the translation and rotation (Δx_ij, Δy_ij, Δθ_ij) into A' = {a'_1, a'_2, ..., a'_N}.
3-2-2. Whether two minutiae match is measured by two "distances". First calculate the position distance d(a'_i, b_j) between a'_i and b_j and their direction-field angle difference d_θ(a'_i, b_j). If d(a'_i, b_j) is less than the empirical position-distance threshold T_d and d_θ(a'_i, b_j) is less than the empirical direction-field angle-difference threshold T_θ, the two minutiae match.
d(a'_i, b_j) = sqrt((x'_i − x_j)^2 + (y'_i − y_j)^2)    (6)
d_θ(a'_i, b_j) = min(|θ'_i − θ_j|, 360° − |θ'_i − θ_j|)    (7)
The similarity s of two matched minutiae is calculated from the two distances of formulas (6) and (7):
s = k · exp(−(d + d_θ)/σ)    (8)
where k and σ are constants, d is the position distance between the two minutiae, and d_θ is the difference of their direction-field angles.
3-2-3, counting the number of all matching points according to the formula (8), and calculating the matching score of the minutiae according to the following formula:
S_N = 2N_M/(N + M)    (9)
where N_M is the number of matched minutiae pairs, and N and M are respectively the number of minutiae of the fingerprint to be identified and of the template fingerprint.
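Steps 3-2-1 to 3-2-3 can be sketched as follows. The operand order of formula (4) is not reproduced in the source, so the similarity transform here is an assumed but standard form; the circular angle difference and the 2N_M/(N + M) score follow formulas (7) and (9):

```python
import math

def align(points, dx, dy, dtheta_deg, s0=1.0):
    """Formulas (4)-(5): rotate by dtheta, scale by s0, translate by
    (dx, dy).  Each point is (x, y, theta_deg).  The exact operand
    order of formula (4) is an image in the source, so this
    rigid/similarity transform is an assumed but standard form."""
    c = math.cos(math.radians(dtheta_deg))
    s = math.sin(math.radians(dtheta_deg))
    return [(s0 * (c * x - s * y) + dx,
             s0 * (s * x + c * y) + dy,
             (th + dtheta_deg) % 360.0) for x, y, th in points]

def angle_diff(a, b):
    """Formula (7): circular direction-field difference in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def minutiae_score(a_aligned, b, t_d, t_theta):
    """Greedy count of matched pairs under the two thresholds of
    step 3-2-2, then the score 2*N_M / (N + M) of formula (9)."""
    used = set()
    n_match = 0
    for x, y, th in a_aligned:
        for j, (xb, yb, thb) in enumerate(b):
            if j in used:
                continue
            if (math.hypot(x - xb, y - yb) < t_d
                    and angle_diff(th, thb) < t_theta):
                used.add(j)
                n_match += 1
                break
    return 2.0 * n_match / (len(a_aligned) + len(b))
```

Aligning a set onto itself with a zero offset yields the maximum score of 1.0.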
3-3. fingerprint fine line structure matching
3-3-1. The fine-line structure point set extracted from fingerprint image A is denoted Ā, and the fine-line structure point set extracted from fingerprint image B is denoted B̄. Each point in Ā and B̄ contains only position information (x, y), where x and y are respectively the abscissa and the ordinate of the point.
3-3-2. According to the transformation relation of formula (4), the fine-line structure point set Ā of the fingerprint to be identified is converted by the translation and rotation (Δx_ij, Δy_ij, Δθ_ij) into Ā'. The transformed set Ā' is compared with the library fingerprint's set B̄ to obtain the ratio T' of the overlapping region of the two fingerprints. If T' is larger than the preset threshold T_3 on the area ratio of the fine-line overlap region, the fine-line structures in the overlap region are matched and step 3-3-3 judges whether the fingerprints match; otherwise, the overlapping region of the two fingerprints is too small for them to be the same fingerprint, and they are judged not to match.
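The patent does not spell out how the overlap ratio T' is measured; the sketch below approximates it by bounding-box intersection over the smaller bounding-box area, which is one simple, assumed realization:

```python
def overlap_ratio(a_pts, b_pts):
    """Ratio T' of the overlapping region of two fingerprints,
    approximated by bounding-box intersection area over the smaller
    bounding-box area (an assumption; the patent does not define the
    exact measure).  Points are (x, y) tuples."""
    def bbox(pts):
        xs = [p[0] for p in pts]
        ys = [p[1] for p in pts]
        return min(xs), min(ys), max(xs), max(ys)
    ax0, ay0, ax1, ay1 = bbox(a_pts)
    bx0, by0, bx1, by1 = bbox(b_pts)
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))   # intersection width
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))   # intersection height
    smaller = min((ax1 - ax0) * (ay1 - ay0), (bx1 - bx0) * (by1 - by0))
    return (iw * ih) / smaller if smaller > 0 else 0.0
```

Two 10x10 regions shifted by half their width overlap with ratio 0.5; identical regions give 1.0.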
3-3-3. The improved Hausdorff distance is often used as the similarity measure in image matching. The invention adopts a neighborhood-search fine-line-structure distance method, using the fine-line structure distance H(Ā', B̄) as the similarity index of the two fingerprints' fine-line structures, calculated specifically as follows:
The distance from fine-line structure point set Ā' to B̄ is defined as follows: compute the distance from each of the p points of Ā' to the point set B̄, sort these p distances in ascending order, and take the mean of the first i distance values as the distance from Ā' to B̄:
h(Ā', B̄) = (1/i) · (d_(1) + d_(2) + ··· + d_(i))    (10)
In formula (10), 1 ≤ i ≤ p, p is the number of points in Ā', d_(l) denotes the l-th value in the ascending sort, and the distance from a point to the set B̄ is the minimum Euclidean distance between that point and the points of B̄.
In the same way, the distance from fine-line structure point set B̄ to Ā' is defined: compute the distance from each of the q points of B̄ to the point set Ā', sort these q distances in ascending order, and take the mean of the first j distance values as the distance from B̄ to Ā':
h(B̄, Ā') = (1/j) · (d_(1) + d_(2) + ··· + d_(j))    (11)
where 1 ≤ j ≤ q, q is the number of points in B̄, d_(l) denotes the l-th value in the ascending sort, and the distance from a point to the set Ā' is the minimum Euclidean distance between that point and the points of Ā'.
The matching score between the fine-line structure point sets Ā' and B̄ is computed from the combined distance
H(Ā', B̄) = max(h(Ā', B̄), h(B̄, Ā'))    (12)
from which the fine-line matching score S_H is obtained.
The number of points in a fingerprint's fine-line structure far exceeds the number of minutiae, so matching the fine-line structures by searching every point against every point would consume a large amount of computation time. To reduce this time, the neighborhood-search fine-line-structure distance method is adopted.
As can be seen from formula (10), the distance from each point of the fine-line set Ā' to the point set B̄ must be calculated and the minimum taken, which consumes considerable time. If instead the search is restricted to the points of B̄ lying within a region (Δx, Δy) around the corresponding position, the computed distance value is unchanged on the one hand, and the computation time is greatly reduced on the other.
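A sketch of the directed fine-line distance of formulas (10)-(11) together with the neighborhood-search restriction described above. The fraction of retained distances, the window size, and the max() combination of the two directed distances are assumptions (formula (12) survives in the source only as an image):

```python
import math

def directed_partial_distance(src, dst, frac=0.8, window=None):
    """Directed fine-line distance of formulas (10)/(11): the
    nearest-neighbour distance from every point of `src` to `dst`,
    sorted ascending, averaged over the first i = ceil(frac*len(src))
    values.

    If `window = (dx, dy)` is given, each nearest-neighbour search is
    restricted to dst points inside that box around the source point,
    i.e. the neighborhood-search speed-up described in the text.
    `frac` and `window` are illustrative parameters, not patent values.
    """
    dists = []
    for (x, y) in src:
        if window:
            cand = [(xb, yb) for (xb, yb) in dst
                    if abs(xb - x) <= window[0] and abs(yb - y) <= window[1]]
        else:
            cand = dst
        if cand:
            dists.append(min(math.hypot(x - xb, y - yb) for xb, yb in cand))
    dists.sort()
    i = max(1, math.ceil(frac * len(dists)))
    return sum(dists[:i]) / i

def fine_line_distance(a, b, frac=0.8, window=None):
    """Symmetric, modified-Hausdorff-style combination of the two
    directed distances; the exact combination of formula (12) is an
    image in the source, so max() here is an assumption."""
    return max(directed_partial_distance(a, b, frac, window),
               directed_partial_distance(b, a, frac, window))
```

Two parallel rows of points one unit apart give a distance of exactly 1.0, with or without the window restriction.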
3-4. Decision-level fusion
Because the minutiae features and the fine-line structure features are essentially different kinds of features, their thresholds are computed in different ways; the features are therefore fused at the decision level using the score-level information:
S = α · S_N + (1 − α) · S_H    (13)
where α is a weight, S_N is the matching score computed from the minutiae features, and S_H is the matching score computed from the fine-line structure features;
the matching result is determined as follows:
result = same fingerprint if S ≥ c, different fingerprints if S < c    (14)
where c is the similarity threshold of two fingerprint images obtained by experiment: if the fused score is smaller than c, the images are regarded as different fingerprints; otherwise, they are regarded as the same fingerprint.
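The fusion and decision of formulas (13)-(14) can be sketched as follows; the weighted-sum form and the sample values of α and c are illustrative assumptions (the patent gives both formulas only as images):

```python
def fuse_and_decide(s_minutiae, s_fine_line, alpha=0.5, c=0.6):
    """Formulas (13)-(14): weighted score-level fusion of the minutiae
    score S_N and the fine-line score S_H, then a threshold decision.
    `alpha` and `c` are illustrative sample values, not patent values."""
    s = alpha * s_minutiae + (1.0 - alpha) * s_fine_line
    return s, (s >= c)          # True -> judged as the same finger
```

For example, scores (0.9, 0.7) fuse to 0.8 and accept, while (0.2, 0.3) fuse to 0.25 and reject under the same threshold.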
The invention has the following beneficial effects:
1. The direction-field information around the minutiae is fully utilized, giving the minutiae strong discriminating capability and reducing the influence of false minutiae on matching; the minutiae features also allow the matching reference points of two fingerprints to be extracted quickly, shortening fingerprint matching time.
2. Under a given false acceptance rate, the identification performance of the proposed method is markedly higher than that of the traditional point-pattern method. In particular, for low-quality, small-area and distorted fingerprint images, the performance of point-pattern-based recognition drops noticeably, while the performance of the algorithm combining minutiae and fine-line structure shows no obvious decline.
3. Describing the fingerprint at the ridge-line level as well as the minutiae level makes the comparison more consistent with the way a human expert compares fingerprints; it provides more features for matching, avoids the shortcomings of minutiae, better distinguishes fingerprints that minutiae alone cannot match correctly, and improves the accuracy of fingerprint identification.
4. The invention facilitates later modification and upgrading of the algorithm: the database stores both the minutiae and the fine-line structure, so when the minutiae algorithm needs to be changed or upgraded later, the fine-line structure in the database can be read directly and processed by the improved minutiae matching algorithm to build new minutiae data.
Drawings
FIG. 1 is a flow chart of the fingerprint identification method combining minutiae and fine-line structures of the present invention;
FIG. 2 is a flow chart of creating a fingerprint database according to the present invention;
FIG. 3 is an original fingerprint image;
FIG. 4 is a fingerprint direction-field diagram;
FIG. 5 is a fingerprint segmentation image;
FIG. 6 is a fingerprint enhancement image;
FIG. 7 is a fingerprint binary image;
FIG. 8 is a fingerprint thin-line (skeleton) structure feature diagram;
FIG. 9 is a fingerprint minutiae feature map;
FIG. 10 is a minutia descriptor diagram.
Detailed Description
The following further describes an embodiment of the present invention with reference to the drawings.
As shown in FIGS. 1 to 10, a fingerprint identification method combining minutiae and fine-line structure is implemented as follows:
step 1, collecting finger fingerprint images of all registered users, respectively preprocessing and extracting characteristics of the fingerprint images to obtain minutiae and fine line structural characteristics, respectively storing the minutiae and the fine line structural characteristics in a database, and establishing a finger fingerprint database;
step 2, collecting a finger fingerprint image of a user to be identified, and preprocessing and extracting features of the fingerprint image to obtain minutiae features and fine line structural features of the finger fingerprint image of the user to be identified;
step 3, matching the finger fingerprint image of the user to be identified:
the step 1 is specifically realized as follows:
1-1, collecting finger fingerprint images of registered users, and carrying out gray level processing on the fingerprint images to obtain original gray level images of the fingerprint images, as shown in FIG. 3;
1-2, obtaining the direction field of the original gray-level image by a gradient-based direction-field calculation method, as shown in FIG. 4, and segmenting the original gray-level image to obtain the fingerprint ridge region image, as shown in FIG. 5;
1-3, carrying out image enhancement on the fingerprint ridge area image, as shown in FIG. 6;
1-4, converting the enhanced fingerprint ridge area image into a binary image, as shown in FIG. 7;
1-5, filtering noise in the binary image, then carrying out skeletonization, and then carrying out deburring treatment on the skeletonized image to obtain a fingerprint thin-line structure image, as shown in fig. 8;
1-6, searching minutiae of the fingerprint fine line structure image by a line tracing method to obtain minutiae of the fingerprint fine line structure image, as shown in FIG. 9, and storing the obtained fingerprint fine line structure image and minutiae corresponding to the fingerprint fine line structure image as a pair in a database;
1-7, repeating the steps 1-1 to 1-6, finishing the acquisition of all registered users, and establishing a fingerprint database.
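Step 1-2 above computes the direction field by a gradient-based method. The patent does not spell out the exact variant; the following is a minimal sketch of the classic block-wise least-squares orientation estimator such methods typically use (block size and function name are illustrative):

```python
import numpy as np

def orientation_field(img, block=16):
    """Block-wise ridge orientation by the classic gradient method:
    theta = 0.5 * atan2(sum 2*Gx*Gy, sum (Gx^2 - Gy^2)) + 90 degrees.
    `img` is a 2-D float array; returns orientations in radians in
    [0, pi) per block.  A standard textbook estimator; the patent does
    not specify its exact variant."""
    gy, gx = np.gradient(img.astype(float))     # axis 0 -> y, axis 1 -> x
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            sl = (slice(by * block, (by + 1) * block),
                  slice(bx * block, (bx + 1) * block))
            gxx = np.sum(gx[sl] ** 2 - gy[sl] ** 2)
            gxy = np.sum(2.0 * gx[sl] * gy[sl])
            # dominant gradient direction + 90 degrees = ridge direction
            theta[by, bx] = (0.5 * np.arctan2(gxy, gxx) + np.pi / 2) % np.pi
    return theta
```

Vertical stripes (intensity varying along x only) yield an orientation of about π/2; horizontal stripes yield about 0.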
The step 2 is specifically realized as follows:
2-1, collecting a finger fingerprint image of a user to be identified, and carrying out gray level processing on the fingerprint image to obtain an original gray level image of the fingerprint image;
2-2, obtaining a direction field of the original gray level image by a direction field calculation method based on gradient, and segmenting the original gray level image to obtain a fingerprint ridge area image;
2-3, carrying out image enhancement on the fingerprint ridge area image;
2-4, converting the enhanced fingerprint ridge area image into a binary image;
2-5, filtering noise in the binary image, then carrying out skeletonization, and then carrying out deburring treatment on the skeletonized image to obtain a fingerprint thin-line structure image;
2-6, searching minutiae of the fingerprint fine line structure image by a fine line tracking method to obtain the minutiae of the fingerprint fine line structure image and obtain minutiae characteristics of the fingerprint image;
step 3, matching the finger fingerprint image of the user to be identified:
3-1. as shown in FIG. 10, according to the minutiae descriptor, using the directional field information around the minutiae, the best reference point pair is obtained:
3-1-1. Input two fingerprint images A and B, where A is the fingerprint image to be identified, with extracted minutiae set A = {a_1, a_2, ..., a_n, ..., a_N}, and B is any fingerprint image in the fingerprint library, with extracted minutiae set B = {b_1, b_2, ..., b_m, ..., b_M}.
3-1-2. Select any minutia a_n in fingerprint image A and traverse all minutiae in fingerprint image B. If there exists a minutia b_m in B such that a_n and b_m are of the same type and their position translation lies within (±Δx_0, ±Δy_0), go to step 3-1-4; if no corresponding minutia b_m is found after traversing all minutiae in B, discard the minutia a_n.
3-1-3. Continue selecting the next minutia from fingerprint image A and repeat step 3-1-2 until all minutiae in fingerprint image A have been traversed.
3-1-4, constructing a minutiae descriptor, wherein the descriptor comprises three auxiliary points in a direction field around the minutiae; calculating the relative angle difference between every two points in the three auxiliary points:
Δθ_k = |θ_i − θ_j|  (i, j = 0, 1, 2, 3; i < j)    (1)
In formula (1), k is the index of the relative angle difference within the minutia descriptor, with 1 ≤ k ≤ 6.
The k-th angular deviation between two minutiae points is denoted as G (k):
G(k) = |Δθ_k(a_n) − Δθ_k(b_m)|    (2)
In formula (2), Δθ_k(a_n) is the k-th relative angle difference of minutia a_n, and Δθ_k(b_m) is the k-th relative angle difference of minutia b_m.
3-1-5. Judge whether the two minutiae form a preliminary reference point pair: if any G(k) in formula (2) is greater than the threshold T_1, the two points do not match; return to step 3-1-2. Otherwise, the two minutiae form a preliminary reference point pair. Record the set of preliminary reference point pairs Q, the number of matched pairs L, and the position offset (Δx_i, Δy_i, Δθ_i) between each pair of preliminary reference points.
3-1-6, finding the best reference point from the set Q of the preliminary reference points.
For any pair of matching points (Δx_i, Δy_i, Δθ_i) (i = 1, 2, ..., L) in the set of preliminary reference point pairs, calculate its deviation d_i from each of the remaining matching points (Δx, Δy, Δθ) in Q:
d_i = sqrt((Δx_i − Δx)^2 + (Δy_i − Δy)^2 + (Δθ_i − Δθ)^2)    (3)
For each pair, the number of its deviations d_i that are less than the best-reference-point threshold T_2 is counted; these counts form a set D. The maximum value C_m in D is selected, and the matching point pair corresponding to C_m is the best reference point pair, i.e. a_i and b_j, with corresponding position deviation parameters (Δx_ij, Δy_ij, Δθ_ij).
Step 3-2, feature matching of minutiae
3-2-1, calibrating the positions of all the points in the minutiae set, with the formula:

x' = S0·(x·cos Δθij − y·sin Δθij) + Δxij
y' = S0·(x·sin Δθij + y·cos Δθij) + Δyij (4)
θ' = θ + Δθij (5)

wherein S0 is the scale transformation parameter. The minutiae set A = {a1, a2, …, aN} of the fingerprint to be identified is transformed by the translation and rotation (Δxij, Δyij, Δθij) into A' = {a'1, a'2, …, a'N}.
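The calibration of formulas (4) and (5) is an ordinary two-dimensional similarity transform; a minimal sketch (the scale S0 and the offsets are assumed inputs, and the function name is illustrative):

```python
import math

def calibrate(minutia, dx, dy, dtheta_deg, s0=1.0):
    """Apply formulas (4)-(5): rotate by Δθ, scale by S0, then translate.
    minutia is (x, y, theta_deg); returns the calibrated (x', y', θ')."""
    x, y, theta = minutia
    r = math.radians(dtheta_deg)
    xp = s0 * (x * math.cos(r) - y * math.sin(r)) + dx
    yp = s0 * (x * math.sin(r) + y * math.cos(r)) + dy
    return xp, yp, (theta + dtheta_deg) % 360.0
```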
3-2-2. Whether two minutiae match is generally measured by two "distances". Calculate the position distance d(a'i, bj) and the direction-field angle difference dθ(a'i, bj) between a'i and bj. If d(a'i, bj) is smaller than the empirical position-distance threshold Td, and dθ(a'i, bj) is smaller than the empirical direction-field angle-difference threshold Tθ, then the two minutiae match.

d(a'i, bj) = sqrt((x'i − xj)² + (y'i − yj)²) (6)
dθ(a'i, bj) = min(|θ'i − θj|, 360° − |θ'i − θj|) (7)
Calculating the similarity s of two matched minutiae according to the formulas (6) and (7):

s = k·e^(−(d + dθ)/σ) (8)

where k and σ are constants, d is the distance between the two minutiae, and dθ is the direction-field angle difference of the two minutiae.
3-2-3, counting the number of all matching points according to the formula (8), and calculating the matching score of the minutiae according to the following formula:

SM = 2·NM/(N + M) (9)

wherein NM is the number of matching points, and N and M are respectively the number of minutiae of the fingerprint to be identified and the number of minutiae of the template.
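Steps 3-2-2 and 3-2-3 — the threshold test of formulas (6)-(7) and the score of formula (9) — can be sketched as follows; the greedy one-to-one pairing and the example thresholds are illustrative assumptions:

```python
import math

def minutiae_score(set_a, set_b, td=8.0, ttheta=20.0):
    """set_a, set_b: lists of (x, y, theta_deg); set_a is already calibrated.
    Two minutiae match when their position distance is below Td and their
    direction-field angle difference below Tθ (formulas (6)-(7))."""
    used = set()
    n_match = 0
    for (xa, ya, ta) in set_a:
        for j, (xb, yb, tb) in enumerate(set_b):
            if j in used:
                continue
            d = math.hypot(xa - xb, ya - yb)        # formula (6)
            dt = abs(ta - tb) % 360.0
            dt = min(dt, 360.0 - dt)                # formula (7)
            if d < td and dt < ttheta:
                used.add(j)
                n_match += 1
                break
    # formula (9): score from the match count and the two set sizes
    return 2.0 * n_match / (len(set_a) + len(set_b))
```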
3-3. Fingerprint fine line structure matching
3-3-1. The fine line structure point set extracted from the fingerprint image A is AH = {aH1, aH2, …, aHp}, and the fine line structure point set extracted from the fingerprint image B is BH = {bH1, bH2, …, bHq}. Each point in the two sets contains only position information (x, y), where x and y respectively represent the abscissa and the ordinate of the point.
3-3-2. According to the transformation relation of the formula (4), the fine line structure point set AH of the fingerprint to be identified is transformed by the translation and rotation (Δxij, Δyij, Δθij) into A'H. The transformed A'H is compared with the library fingerprint set BH to obtain the ratio T' of the overlapping area of the two fingerprints. If T' is larger than the preset fine-line-structure overlapping-area ratio threshold T3, the fine line structure features within the overlapping area are matched, and the method proceeds to the step 3-3-3 to judge whether the fingerprints match; otherwise, the overlapping area of the two fingerprints is too small, they are not the same fingerprint, and the result is judged as not matched.
3-3-3. The improved Hausdorff distance is often used as a similarity measure for image matching. The invention adopts a neighborhood-search fine-line-structure-distance recognition method, and uses the fine line structure distance D(A'H, BH) as the similarity index of the two fingerprint fine line structures, calculated as follows.
The distance from the fine line structure point set A'H to BH is defined as follows: for each of the p points in A'H, compute its distance d(aH, BH) to the point set BH, sort these distances in ascending order, and take the mean of the first i distance values as the distance from A'H to BH:

D(A'H, BH) = (1/i)·Σ(t=1..i) Ith[d(aH, BH)] (10)

In the formula (10), 1 ≤ i ≤ p, p is the number of points in the fine line structure point set A'H, Ith represents the ascending sort, and d(aH, BH) is the minimum Euclidean distance between the point aH and the points of BH.
In the same way, the distance from the fine line structure point set BH to A'H is defined as follows: for each of the q points in BH, compute its distance d(bH, A'H) to the point set A'H, sort these distances in ascending order, and take the mean of the first j distance values as the distance from BH to A'H:

D(BH, A'H) = (1/j)·Σ(t=1..j) Ith[d(bH, A'H)] (11)

wherein 1 ≤ j ≤ q, q is the number of points in the point set BH, Ith represents the ascending sort, and d(bH, A'H) is the minimum Euclidean distance between the point bH and the points of A'H.
The matching score of the fine line structure point sets A'H and BH is:

SH = 1/(1 + max(D(A'H, BH), D(BH, A'H))) (12)
The number of points in a fingerprint fine line structure far exceeds the number of fingerprint feature points, so matching the fine line structures by searching every point would consume a large amount of computation time. To reduce the computation time, the neighborhood-search fine-line-structure-distance recognition method is adopted.
As can be seen from the formula (10), the distance from every point in the fine line structure point set A'H to the point set BH must be calculated and the minimum taken, which consumes a lot of time. If instead only the distances to points of BH within a (Δx, Δy) neighborhood of the corresponding position are searched, the calculated distance value is unchanged on the one hand, and the computation time is saved on the other.
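A sketch of the directed fine-line-structure distance of formula (10), including the neighborhood restriction described above; the window size, the fallback for empty neighborhoods, and the function name are illustrative assumptions:

```python
import math

def directed_distance(points_a, points_b, i, window=16.0):
    """Formula (10): for every point of A', take its minimum Euclidean
    distance to B, restricted to points of B inside a (±window) square
    around the corresponding position; sort the distances ascending and
    average the first i of them."""
    dists = []
    for (xa, ya) in points_a:
        # neighborhood search: only points of B near (xa, ya) are candidates
        cand = [(xb, yb) for (xb, yb) in points_b
                if abs(xb - xa) <= window and abs(yb - ya) <= window]
        if cand:
            dists.append(min(math.hypot(xa - xb, ya - yb) for (xb, yb) in cand))
        else:
            dists.append(window)  # no nearby point: cap the distance at the window
    dists.sort()
    i = min(i, len(dists))
    return sum(dists[:i]) / i
```

Averaging only the first i sorted distances makes the measure a partial (trimmed) Hausdorff-style distance, so a few unmatched ridge points do not dominate the score.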
3-4. Fusion decision
Because the minutiae features and the fine line structure features are essentially different features, their thresholds are calculated in different ways; feature fusion is therefore performed at the decision layer using the score-layer information, with the expression:

S = α·SM + (1 − α)·SH (13)

wherein α is a weight, SM is the matching score calculated from the minutiae features, and SH is the matching score calculated from the fine line structure features.
The matching result is determined as follows:

result = matched, if S ≥ c; not matched, if S < c (14)

wherein c is the similarity threshold of the two fingerprint images obtained by experiment; if the score is smaller than c, the two images are regarded as different fingerprints, otherwise they are regarded as the same fingerprint.
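The decision-layer fusion of formulas (13)-(14) in a few lines; the weight α and threshold c are experiment-determined parameters, and the values below are illustrative:

```python
def fuse_and_decide(s_m, s_h, alpha=0.5, c=0.6):
    """Formula (13): weighted sum of the minutiae score S_M and the fine
    line structure score S_H; formula (14): accept when the fused score
    reaches the similarity threshold c."""
    s = alpha * s_m + (1.0 - alpha) * s_h
    return s, s >= c
```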
Embodiment:
In the experiment, fingerprint images of 2000 registered users were collected with an acquisition device, and a mixed database of abnormal and normal fingerprint images was established. Abnormal fingerprint images account for about 20% of the database and include low-quality fingerprints, small-area fingerprints, distorted fingerprints and latent fingerprints. The collected fingerprint images were preprocessed, and the fingerprint feature information was extracted and stored.
The fingerprint images of the registered users were then collected in turn as users to be identified. The reference points of the two fingerprints are found with the minutiae descriptors; the minutiae and the fine line structure of the fingerprint to be identified are aligned according to the position offset of the reference points; the matches in the point mode and in the fine line structure are then computed separately, and finally the two results are fused to judge whether the two fingerprints match. Of the 2000 recognition results, 4 were not recognized as registered users, a recognition rate of 99.84%; on the VS2017 platform, a single match took 4.65 milliseconds. For comparison, with a database storing only minutiae features, 124 registered users were not recognized, a recognition rate of 95.04%; with a database storing only fine line structures, the average matching time was 10.12 milliseconds.
The embodiment results show that the fingerprint identification algorithm of this patent, combining minutiae and fine line structure, saves time while maintaining a high recognition rate.
Finally, it is noted that the disclosed embodiments are intended to aid in further understanding of the invention, but those skilled in the art will appreciate that: various substitutions and modifications are possible without departing from the spirit and scope of the invention and the appended claims. Therefore, the invention should not be limited to the embodiments disclosed, but the scope of the invention is defined by the appended claims.

Claims (1)

1. A fingerprint identification method combining a fine node structure and a fine line structure is characterized by comprising the following steps:
step 1, collecting finger fingerprint images of all registered users, respectively preprocessing and extracting characteristics of the fingerprint images to obtain minutiae and fine line structural characteristics, respectively storing the minutiae and the fine line structural characteristics in a database, and establishing a finger fingerprint database;
step 2, collecting a finger fingerprint image of a user to be identified, and preprocessing and extracting features of the fingerprint image to obtain minutiae features and fine line structural features of the finger fingerprint image of the user to be identified;
step 3, matching the finger fingerprint images of the user to be identified;
the step 1 is specifically realized as follows:
1-1, collecting a finger fingerprint image of a registered user, and carrying out gray level processing on the fingerprint image to obtain an original gray level image of the fingerprint image;
1-2, obtaining a direction field of an original gray level image by a gradient-based direction field calculation method, and segmenting the original gray level image to obtain a fingerprint ridge area image;
1-3, carrying out image enhancement on the fingerprint ridge area image;
1-4, converting the enhanced fingerprint ridge area image into a binary image;
1-5, filtering noise in the binary image, then carrying out skeletonization, and then carrying out deburring treatment on the skeletonized image to obtain a fingerprint thin-line structure image;
1-6, searching minutiae of the fingerprint fine line structure image by a fine line tracking method to obtain the minutiae of the fingerprint fine line structure image, and storing the obtained fingerprint fine line structure image and the minutiae corresponding to the fingerprint fine line structure image as a pair in a database;
1-7, repeating the steps 1-1 to 1-6, finishing the acquisition of all registered users, and establishing a fingerprint database;
the step 2 is realized as follows:
2-1, collecting a finger fingerprint image of a user to be identified, and carrying out gray level processing on the fingerprint image to obtain an original gray level image of the fingerprint image;
2-2, obtaining a direction field of the original gray level image by a direction field calculation method based on gradient, and segmenting the original gray level image to obtain a fingerprint ridge area image;
2-3, carrying out image enhancement on the fingerprint ridge area image;
2-4, converting the enhanced fingerprint ridge area image into a binary image;
2-5, filtering noise in the binary image, then carrying out skeletonization, and then carrying out deburring treatment on the skeletonized image to obtain a fingerprint thin-line structure image;
2-6, searching minutiae of the fingerprint fine line structure image by a fine line tracking method to obtain the minutiae of the fingerprint fine line structure image and obtain minutiae characteristics of the fingerprint image;
the step 3 is specifically realized as follows:
3-1, acquiring an optimal reference point pair by using the direction field information around the minutiae according to the minutiae descriptor:
3-2, matching the detail point characteristics of the optimal reference point pair;
3-3, matching the fingerprint fine line structure of the optimal reference point pair;
3-4, performing feature fusion on the detail point features and the thin line structure features by using the fractional layer information on a decision layer;
the step 3-1 is specifically realized as follows:
3-1-1, inputting two fingerprint images A and B, wherein A is the fingerprint image to be identified, with extracted minutiae set A = {a1, a2, …, an, …, aN}; B is any fingerprint image in the fingerprint library, with extracted minutiae set B = {b1, b2, …, bm, …, bM};
3-1-2, selecting any minutia an in the fingerprint image A and traversing all minutiae in the fingerprint image B; if there exists a minutia bm in the fingerprint image B such that the two minutiae an and bm are of the same type and their position translation is within the range (±Δx0, ±Δy0), entering the step 3-1-4; if no minutia bm corresponding to the minutia an is found after traversing all minutiae in the fingerprint image B, discarding the minutia an in the fingerprint image A;
3-1-3, continuously selecting a detail point from the fingerprint image A, and repeating the step 3-1-2 until all detail points in the fingerprint image A are traversed;
3-1-4, constructing a minutiae descriptor, wherein the descriptor comprises three auxiliary points in the direction field around the minutia; calculating the relative angle difference between every two of the four points (the minutia and its three auxiliary points):
Δθk = |θi − θj|, i, j = 0, 1, 2, 3; i < j (1)
in the formula (1), k is the number of the relative angle difference within the minutiae descriptor, with 1 ≤ k ≤ 6;
the k-th angular deviation between two minutiae an and bm is denoted as G(k):

G(k) = |Δθk(an) − Δθk(bm)| (2)

in the formula (2), Δθk(an) is the k-th relative angle difference of minutia an, and Δθk(bm) is the k-th relative angle difference of minutia bm;
3-1-5, judging whether the two minutiae form a preliminary reference pair: if any G(k) in the formula (2) is larger than the threshold T1, the two points do not match; return to the step 3-1-2; otherwise, the two minutiae are a preliminary reference pair; record the set Q of preliminary reference pairs, the number of matched pairs L, and the position offset (Δxl, Δyl, Δθl) between each pair of preliminary reference points;
3-1-6, searching for the best reference point pair in the set Q of preliminary reference pairs;
for any pair of matching points (Δxl, Δyl, Δθl), l = 1, 2, …, L, in the set of preliminary reference pairs, calculate its deviation dl from each of the remaining matching points (Δx, Δy, Δθ) in the set Q:

dl = sqrt((Δxl − Δx)² + (Δyl − Δy)² + (Δθl − Δθ)²) (3)

in formula (3), for each candidate pair the deviations dl smaller than the best-reference-point threshold T2 form a set D; the maximum count Cm over the candidate pairs is selected, and the matching pair corresponding to Cm is the best reference point pair, i.e. au and bv, whose corresponding position deviation parameters are (Δxuv, Δyuv, Δθuv);
The step 3-2 is specifically realized as follows:
3-2-1, calibrating the positions of all the points in the minutiae set, with the formula:

x' = S0·(x·cos Δθuv − y·sin Δθuv) + Δxuv
y' = S0·(x·sin Δθuv + y·cos Δθuv) + Δyuv (4)
θ' = θ + Δθuv (5)

wherein S0 is the scale transformation parameter; the minutiae set A = {a1, a2, …, an, …, aN} of the fingerprint to be identified is transformed by the translation and rotation (Δxuv, Δyuv, Δθuv) into A' = {a'1, a'2, …, a'n, …, a'N};
3-2-2. calculate the position distance d(a'n, bm) and the direction-field angle difference dθ(a'n, bm) between a'n and bm; if d(a'n, bm) is smaller than the empirical position-distance threshold Td and dθ(a'n, bm) is smaller than the empirical direction-field angle-difference threshold Tθ, the two minutiae match;

d(a'n, bm) = sqrt((x'n − xm)² + (y'n − ym)²) (6)
dθ(a'n, bm) = min(|θ'n − θm|, 360° − |θ'n − θm|) (7)
calculating the similarity s of two matched minutiae according to the formulas (6) and (7):

s = h·e^(−(d + dθ)/σ) (8)

where h and σ are constants, d is the distance between the two minutiae, and dθ is the direction-field angle difference of the two minutiae;
3-2-3, counting the number of all matching points according to the formula (8), and calculating the matching score of the minutiae according to the following formula:

SM = 2·NM/(N + M) (9)

wherein NM is the number of matching points, and N and M are respectively the number of minutiae of the fingerprint to be identified and the number of minutiae of the template;
the step 3-3 is specifically realized as follows:
3-3-1. the fine line structure point set extracted from the fingerprint image A is AH = {aH1, aH2, …, aHp}, and the fine line structure point set extracted from the fingerprint image B is BH = {bH1, bH2, …, bHq}; each point in the two sets contains only position information (x, y), where x and y respectively represent the abscissa and the ordinate of the point;
3-3-2, according to the transformation relation of the formula (4), the fine line structure point set AH of the fingerprint to be identified is transformed by the translation and rotation (Δxuv, Δyuv, Δθuv) into A'H; the transformed A'H is compared with the library fingerprint set BH to obtain the ratio T' of the overlapping area of the two fingerprints; if T' is larger than the preset fine-line-structure overlapping-area ratio threshold T3, the fine line structure features within the overlapping area are matched, and the method proceeds to the step 3-3-3 to judge whether the fingerprints match; otherwise, the overlapping area of the two fingerprints is too small, they are not fingerprints of the same finger, and the two fingerprints are judged as not matched;
3-3-3. adopting the neighborhood-search fine-line-structure-distance recognition method, the fine line structure distance D(A'H, BH) is used as the similarity index of the two fingerprint fine line structures, calculated as follows:
the distance from the fine line structure point set A'H to BH is defined as follows: for each of the p points in A'H, compute its distance d(aH, BH) to the point set BH, sort these distances in ascending order, and take the mean of the first i distance values as the distance from A'H to BH:

D(A'H, BH) = (1/i)·Σ(t=1..i) Ith[d(aH, BH)] (10)

in the formula (10), 1 ≤ i ≤ p, p is the number of points in the fine line structure point set A'H, Ith represents the ascending sort, and d(aH, BH) is the minimum Euclidean distance between the point aH and the points of BH;
in the same way, the distance from the fine line structure point set BH to A'H is defined as follows: for each of the q points in BH, compute its distance d(bH, A'H) to the point set A'H, sort these distances in ascending order, and take the mean of the first j distance values as the distance from BH to A'H:

D(BH, A'H) = (1/j)·Σ(t=1..j) Ith[d(bH, A'H)] (11)

wherein 1 ≤ j ≤ q, q is the number of points in the point set BH, Ith represents the ascending sort, and d(bH, A'H) is the minimum Euclidean distance between the point bH and the points of A'H;
the matching score of the fine line structure point sets A'H and BH is:

SH = 1/(1 + max(D(A'H, BH), D(BH, A'H))) (12)
the steps 3-4 are specifically realized as follows:
feature fusion of the minutiae features and the fine line structure features is performed at the decision layer using the score-layer information, with the expression:

S = α·SM + (1 − α)·SH (13)

wherein α is a weight, SM is the matching score calculated from the minutiae features, and SH is the matching score calculated from the fine line structure features;
the matching result is determined as follows:

result = matched, if S ≥ c; not matched, if S < c (14)

wherein c is the similarity threshold of the 2 fingerprint images obtained by experiments; if the score is smaller than c, they are regarded as different fingerprint images, otherwise they are regarded as the same fingerprint image.
CN201810524261.1A 2018-05-28 2018-05-28 Fingerprint identification method combining thin node and thin line structure Active CN108416342B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810524261.1A CN108416342B (en) 2018-05-28 2018-05-28 Fingerprint identification method combining thin node and thin line structure


Publications (2)

Publication Number Publication Date
CN108416342A CN108416342A (en) 2018-08-17
CN108416342B true CN108416342B (en) 2022-02-18

Family

ID=63140647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810524261.1A Active CN108416342B (en) 2018-05-28 2018-05-28 Fingerprint identification method combining thin node and thin line structure

Country Status (1)

Country Link
CN (1) CN108416342B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109196523A (en) * 2018-09-03 2019-01-11 深圳市汇顶科技股份有限公司 Multiple light courcess Fingerprint enhancement and synthetic method and related fingerprint sensor
CN109376688B (en) * 2018-11-20 2021-10-22 连云港市公安局 Fingerprint feature editing, inquiring, combining and comparing method
CN109657579B (en) * 2018-12-07 2023-06-09 上海航芯电子科技股份有限公司 Fingerprint crack detection and repair method
CN109657657A (en) * 2019-02-19 2019-04-19 武汉芯盈科技有限公司 A kind of mobile phone small area high-precision fingerprint identification method and system based on image procossing
CN109886212A (en) * 2019-02-25 2019-06-14 清华大学 From the method and apparatus of rolling fingerprint synthesis fingerprint on site
CN109934180B (en) * 2019-03-18 2021-06-01 Oppo广东移动通信有限公司 Fingerprint identification method and related device
CN110163123B (en) * 2019-04-30 2021-02-26 杭州电子科技大学 Fingerprint finger vein fusion identification method based on single near-infrared finger image
CN110956468B (en) * 2019-11-15 2023-05-23 西安电子科技大学 Fingerprint payment system
CN112949361B (en) * 2019-12-11 2023-07-25 杭州萤石软件有限公司 Fingerprint identification method and device
CN112434658A (en) * 2020-12-10 2021-03-02 上海金智晟东电力科技有限公司 Smart city fingerprint image multi-feature fast matching algorithm
CN112784809A (en) * 2021-02-05 2021-05-11 三星(中国)半导体有限公司 Fingerprint identification method and fingerprint identification device
CN114863493B (en) * 2022-07-06 2022-09-13 北京圣点云信息技术有限公司 Detection method and detection device for low-quality fingerprint image and non-fingerprint image
CN115497125B (en) * 2022-11-17 2023-03-10 上海海栎创科技股份有限公司 Fingerprint identification method, system, computer equipment and computer readable storage medium
CN116258842B (en) * 2023-05-16 2023-07-25 上海海栎创科技股份有限公司 Fingerprint template dynamic splicing optimization system and method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101329727A (en) * 2008-06-27 2008-12-24 哈尔滨工业大学 Fingerprint identification method combining point with line
CN103714159A (en) * 2013-12-27 2014-04-09 中国人民公安大学 Coarse-to-fine fingerprint identification method fusing second-level and third-level features
CN107392847A (en) * 2017-06-07 2017-11-24 西安电子科技大学 A kind of fingerprint image joining method based on minutiae point and range image
CN107748877A (en) * 2017-11-10 2018-03-02 杭州晟元数据安全技术股份有限公司 A kind of Fingerprint recognition method based on minutiae point and textural characteristics
CN107909031A (en) * 2017-11-15 2018-04-13 张威 A kind of scene of a crime fingerprint ridge leaves region frequency dynamic reconstruction method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9734384B2 (en) * 2015-02-19 2017-08-15 Savitribai Phule Pune University Method and a system for matching fingerprint images obtained from different fingerprint image capturing devices


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Fast and accurate fingerprint matching using expanded delaunay triangulation;Ghaddab M H et al.;《IEEE/ACS 14th International Conference on Computer Systems and Applications》;20171231;第751-758页 *
指纹识别技术的新进展;田捷 等;《自然科学进展》;20060430;第16卷(第4期);第1.1.1节 *
自动指纹识别算法研究;田俊青;《中国优秀硕士学位论文全文数据库 信息科技辑》;20120415;I138-1928 *


Similar Documents

Publication Publication Date Title
CN108416342B (en) Fingerprint identification method combining thin node and thin line structure
CN110097093B (en) Method for accurately matching heterogeneous images
CN107748877B (en) Fingerprint image identification method based on minutiae and textural features
CN106355577B (en) Rapid image matching method and system based on significant condition and global coherency
CN100414558C (en) Automatic fingerprint distinguishing system and method based on template learning
Dibeklioglu et al. 3D facial landmarking under expression, pose, and occlusion variations
CN103577815B (en) A kind of face alignment method and system
CN108491838B (en) Pointer type instrument indicating number reading method based on SIFT and HOUGH
CN107958443B (en) Fingerprint image splicing method based on ridge line characteristics and TPS deformation model
CN101539993B (en) Multi-acquisition-instrument fingerprint crossing-matching method based on size scaling estimation
CN101620677A (en) Fingerprint identification method based on triangulation and LOD technology
CN101777128A (en) Fingerprint minutiae matching method syncretized to global information and system thereof
CN111507206B (en) Finger vein identification method based on multi-scale local feature fusion
CN109190460B (en) Hand-shaped arm vein fusion identification method based on cumulative matching and equal error rate
CN112597812A (en) Finger vein identification method and system based on convolutional neural network and SIFT algorithm
CN114120378A (en) Three-level classification fingerprint identification method
CN114782715B (en) Vein recognition method based on statistical information
CN107862319A (en) A kind of heterologous high score optical image matching error elimination method based on neighborhood ballot
CN110246165B (en) Method and system for improving registration speed of visible light image and SAR image
CN114648445B (en) Multi-view high-resolution point cloud splicing method based on feature point extraction and fine registration optimization
CN116704557A (en) Low-quality fingerprint matching method based on texture information
CN114358166B (en) Multi-target positioning method based on self-adaptive k-means clustering
CN105701473B (en) A kind of matched method of palmprint image minutiae feature
CN107729863B (en) Human finger vein recognition method
CN113723314A (en) Sugarcane stem node identification method based on YOLOv3 algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant