WO1993007584A1 - Method and system for detecting features of fingerprint in gray level image (WIPO PCT)
Classifications

 G—PHYSICS
 G06—COMPUTING; CALCULATING; COUNTING
 G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
 G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
 G06K9/03—Detection or correction of errors, e.g. by rescanning the pattern
 G06K9/036—Evaluation of quality of acquired pattern

 G—PHYSICS
 G06—COMPUTING; CALCULATING; COUNTING
 G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
 G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
 G06K9/00006—Acquiring or recognising fingerprints or palmprints
 G06K9/00067—Preprocessing; Feature extraction (minutiae)

 G—PHYSICS
 G07—CHECKING DEVICES
 G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
 G07C9/00—Individual entry or exit registers
 G07C9/00126—Access control not involving the use of a pass
 G07C9/00134—Access control not involving the use of a pass in combination with an identity check
 G07C9/00158—Access control not involving the use of a pass in combination with an identity check by means of personal physical data
METHOD AND SYSTEM FOR DETECTING FEATURES OF FINGERPRINT
IN GRAY LEVEL IMAGE
1. BACKGROUND OF THE INVENTION
1.1 Field Of The Invention
This invention relates to the automatic detection of both the common features (i.e. cores, deltas and minutiae) and the unique features (shape and global features) of a fingerprint by processing a gray level image of the fingerprint.
1.2 Description Of The Prior Art
The history of identifying and verifying individuals by dermatoglyphic features is very long. Chinese people printed their palms and fingers on documents and contracts as a mark of authenticity as early as the seventh century A.D. Although there are many other methods for identifying individuals today, fingerprint identification is still the most widespread and credible. However, since fingerprints became a legal identifier of persons about one hundred years ago, the number of fingerprint records has grown very quickly and manual management of the files has become very difficult. As a result, many automatic and semiautomatic systems for processing, recognizing, searching, and identifying fingerprints have been proposed.
The most widely used method for detecting features in many present automatic fingerprint identification systems is based upon binary image processing. A binary image is one in which each image element has one of only two binary values, e.g. 0 or 1. The key procedures in such processing are image enhancing, binarizing, thinning, smoothing and modifying. The minutiae of a fingerprint are detected by scanning the thinned binary image with a 3×3 window. Usually, the cores and deltas of fingerprints are also detected by scanning a binary image. A core exists where one or more lines of a fingerprint form a closed path, frequently a circle, or undergo an abrupt 180° direction change. A delta, as used herein, exists when three ridge lines meet at a common point. A delta may be more accurately referred to as a Y.
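The 3×3 scan over a thinned binary image described above is typically a crossing-number test: count the 0→1 transitions around a ridge point; one transition marks an ending and three mark a bifurcation. A minimal sketch of this prior-art style of detector (the function name and image layout are illustrative, not taken from the patents discussed):

```c
/* Crossing number at (x, y) of a thinned binary image (values 0 or 1),
 * stored row-major with width w. cn == 1 marks a ridge ending and
 * cn == 3 marks a bifurcation; illustrative sketch only. */
static int crossing_number(const unsigned char *img, int w, int x, int y)
{
    /* the 8 neighbors in clockwise order, the first repeated to close the cycle */
    static const int dx[9] = { 1, 1, 0, -1, -1, -1, 0, 1, 1 };
    static const int dy[9] = { 0, 1, 1, 1, 0, -1, -1, -1, 0 };
    int cn = 0;
    for (int i = 0; i < 8; i++)
        cn += img[(y + dy[i]) * w + x + dx[i]] !=
              img[(y + dy[i + 1]) * w + x + dx[i + 1]];
    return cn / 2;   /* pairs of transitions around the point */
}
```

Scanning every interior ridge pixel with such a test yields the minutiae list; its accuracy is limited by the quality of the binarized and thinned image, which motivates the gray-level approach of the invention.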
In U.S. Pat. No. 4,083,035, an apparatus is provided for detecting the position (X and Y) and orientation angle (θ) of minutiae in a binary data bit stream of a 256×256 thinned image. The minutia orientation detector obtains an 8-bit vector average of all local angles present in each of a plurality of 8×8 bit windows across the image. This vector average of all of the local angles within a given 8×8 bit window is the orientation angle θ for each minutia positioned within that window. There are 32×32 such windows on the image, i.e. a 32×32 ridge orientation array will be generated.
In U.S. Pat. No. 4,310,827, the minutia direction of an ending is defined as the direction of a single direction vector drawn from the ending to an arrival point, i.e. a skeleton point located by tracing a predetermined length from the ending. The direction of a bifurcation is defined by a direction symmetrical to the average vector direction of three arrival points.
In U.S. Pat. No. 4,156,230, a 7×7 template scanning window is passed electronically over a 29×29 subarray of the 32×32 ridge contour data as in U.S. Pat. No. 4,083,035 to generate a set of correlation values corresponding to each contour data element and to a plurality of reference angle vectors. The correlation values are processed for determination of peaks and valleys. The resultant data, representing the number of correlation peaks and the direction of each, provides 32 values which define the location and angular orientation of the cores and deltas of a fingerprint.
In U.S. Pat. No. 4,151,512, the topological data, identifying singularity points such as triradii (i.e. deltas) and cores, as well as ridge flow line tracings related to those points, are extracted from a 32×32 ridge contour array as in U.S. Pat. No. 4,083,035. Subsequent to making the first cell tracing in any one direction from a triradius or core point, the information from the ridge contour array is used to supply additional angle data to continue each trace. Some logic circuits determine the next row and column address incremental values according to a specification chart. The maximum length of a trace is 48 cells on the ridge contour array. Based upon the number of singularities located, an initial classification can be made wherein an arch is identified if no triradii are located, a whorl may be identified if two triradii are located and a general loop type may be identified if one triradius is located. The loop type pattern is classified according to the direction and size of the flow tracings by comparing them with a set of prestored references.
The following publications are also of relevance to the present invention:
Shen, "Several local properties of digital picture and their applications to the extraction of descriptive information of fingerprints", Acta Scientiarum Naturalium, Universitatis Pekinensis, No. 3, 1986, pp. 38-51. (In Chinese).
Shen, "The digital pseudocurvature and its applications", Applied Mathematics, Sept. 1988, No. 3, Vol. 3, pp. 382-391. (In Chinese).
Shen et al., "A Similarity Measurement and Classification of Fingerprint", Proc. of 4th Chinese Conf. on Pattern Recognition and Machine Intelligence, 1984. (In Chinese).
1.3 Problems In The Prior Art
The problems listed below relate to the manner of identifying, or designating, cores, deltas and the shapes of fingerprints based upon ridge directions in the patents cited above.
There are two problems in calculating the ridge direction array: (1) Using the same ridge direction value for every point in an 8×8 window produces serious errors when the window lies in a region where ridges curve significantly. (2) No measurement is provided to represent the accuracy of each average ridge direction.
In U.S. Pat. No. 4,310,827, the direction of a minutia depends on the arrival points, so the direction will be affected if any arrival point cannot be found or if the skeleton ridges are not smooth enough.
There are four problems in analyzing ridge trends: (1) A 7×7 window in a ridge contour array, i.e. a 56×56 window in the original image, is too large to find the cores and deltas of small whorls or loops. (2) The fixed window size is not suitable for the various types of cores and deltas. (3) An angular orientation with only 32 values, and core and delta positions on a 29×29 grid, are not accurate enough. (4) As many as 841 (=29×29) elements have to be analyzed for every fingerprint.
There are three problems in ridge flow tracing: (1) Each step in ridge flow tracing passes 8 points, because every element in the ridge contour array refers to an 8×8 region of the image. This is too large for tracing in regions where ridges curve significantly. (2) The errors of position and direction are not accumulated to correct the trace. (3) The next step may be wrong when a core, delta or noise region is touched during tracing.
There are three problems in classification: (1) The initial classification based upon the number of deltas may be wrong if an existing delta cannot be found. (2) The loop subclassification, made by comparing the rough flow tracings, is sensitive to the initial fingerprint impression. (3) There is no subclassification for whorls.
Finally, the main factor which affects the accuracy of fingerprint features extracted by binary processing is that much of the original information in a gray level image of the fingerprint may be lost after binarizing.
2. SUMMARY OF THE INVENTION
2.1 Objects Of The Invention
It is therefore a main object of the present invention to extract the cores, deltas, minutiae, and shape and global features of a fingerprint from a gray level image, using as much of the original information as possible.
It is a more specific object of this invention to provide a quick algorithm for calculating the average direction of local ridges at every point of a fingerprint and for generating a precise direction array.
Another object of the invention is to provide a measurement, termed local curvature, that represents the accuracy of each local direction and is easy to calculate.
Another object of the invention is to provide a method for separating a region of clear ridges from background and noise in the image.
Another object of the invention is to provide a method for analyzing the ridge flow trends around any point in the image to decide its trend directions and forkedness.
Another object of the invention is to find the cores and deltas of a fingerprint by analyzing the trends only for each singularity of the image rather than analyzing the trends for every point of a direction array.
Another object of the invention is to locate the position of the center and central orientation of a plain arch of a fingerprint.
Another object of the invention is to establish a coordinate axis of any fingerprint that is consistent for various types and shapes of fingerprints.
Another object of the invention is to accurately trace shape lines, contour lines and normal lines of a fingerprint.
Another object of the invention is to classify fingerprints according to the structural relations among shape lines.
Another object of the invention is to extract shape features from the shape lines that are consistent for both whorls and loops, and to further classify fingerprints according to the shape features.
Another object of the invention is to extract global features of any fingerprint, including plain arch, however imperfect or partial it is and whatever type or shape it has.
Another object of the invention is to calculate the global difference between two fingerprints to finely classify and distinguish them.
Another object of the invention is to detect minutiae and their attributes from gray level images of fingerprints.
Another object of the invention is to calculate both the quality level and vector of fingerprints with regard to several aspects, for example noise level, area of clear region, position of center, number of minutiae, etc.
2.2 BRIEF DESCRIPTION OF THE DRAWING
FIG. 1 is a processing flow diagram showing the sequence of basic steps characterizing the present invention.
FIG. 2 shows a point of an image and its neighborhood used to explain the calculation of local ridge direction and curvature according to the invention.
FIG. 3 shows a point and its four adjacent points for calculating four gradient models.
FIGS. 4a-4e show a symmetric convex region and its four subsets for calculating four average gradient models.
FIGS. 5a and 5b show the neighborhoods of two adjacent points and their common area as well as parts of the neighborhoods of two adjacent points and their common points.
FIGS. 6a-6d show various octagonal regions of clear ridges in a fingerprint image.
FIGS. 7a-7d show various types of fingerprint core patterns.
FIG. 7e shows a typical fingerprint delta pattern.
FIGS. 8a-8e show the ridge trends of various singularities and analysis circles.
FIGS. 9a-9e show the difference values used in trend analyzing.
FIGS. 10a-10r show the shape lines of various fingerprints for 18 classes of shapes.
FIG. 11 shows the macroscopic structure of peripheral ridges of a fingerprint.
FIG. 12 shows a vault line and normal lines for locating the center of the coordinate axes of a fingerprint.
FIG. 13 shows a manner of determining the central orientation of a plain arch.
FIG. 14 illustrates the extraction of shape features from shape lines.
FIG. 15 illustrates the extraction of global features of a fingerprint.
FIGS. 16a-16e show various minutiae.
FIGS. 17a-17d show the basic features of minutiae in terms of a ridge or valley respectively near a core or delta.
FIGS. 18a and 18b show two neighboring points in a tracing.
FIGS. 19-24 are pictorial views illustrating various tracing operations according to the invention on gray level fingerprint images, where FIGS. 19, 20 and 21 depict the tracing of lines for locating the center and central orientation of fingerprint patterns containing a whorl, a loop and an arch, respectively; and FIGS. 22, 23 and 24 depict the detection of global features for an arch, a loop and a whorl, respectively, according to the invention on the basis of the local directions of ridge lines of the fingerprint image patterns at points arranged along concentric circles.
FIG. 25 is a pictorial view illustrating extraction of minutiae from the region of an arch in a gray level fingerprint image.
2.3 GENERAL DEFINITIONS
There are many constants, parameters, variables and functions used herein. Some of these are defined in the C programming language as follows:
x[ ] means an array named x; it can be defined as a set in which
x[ ] = {x[0], ..., x[m-1]},
where m > 0.
x[ ][ ] means a two dimensional array, or matrix; it can be defined as a set in which
x[ ][ ] = {x[0][0], ..., x[m-1][n-1]},
where m > 0, n > 0. When each member of such a matrix is of the type x[i][j], then, for each member, i is called the line, or row, number and j is called the column number.
x = C ? y : z; means that if condition C is true, then x = y; else x = z.
sign(x) = (x < 0) ? -1 : 1;
int(x) means the largest integer that is not greater than x. Therefore, if x > 0, then int(x+0.5) is the nearest integer to x.
x%y means the remainder, which is not smaller than 0, when x is divided by y.
(x,y) means a digital point at coordinates x and y of a plane. It can also mean, in the appropriate context, a vector from origin point (0,0) to (x,y).
dv(X) means the direction of a vector X, which direction is in the range [0, 2π).
#A means the cardinality, or number of members, of set A.
Σ(y(X), X, A) means the sum of the values y(X) for every member X in set A.
|x| means the modulus, or absolute value, of x. If x is a number, then
|x| = (x < 0) ? -x : x;
if x = (x_1, ..., x_n) is a vector, then
|x| = (x_1² + ... + x_n²)^(1/2).
p_x means a parameter which may be predetermined or calculated for use in performing computations according to the invention. The range of each predetermined parameter that may be used in preferred embodiments of the invention is listed below.
π = 3.14159;
p_π = 252;
3. DESCRIPTION OF THE PREFERRED EMBODIMENTS
3.1 General Procedure
Referring to FIG. 1, the general procedure of this invention for extracting as many features as possible of a fingerprint from a gray level image without binary processing is shown. The significant steps in an exemplary method according to the invention are:
1) Input and Preprocessing
The input digital image has L rows and K columns of image elements, as shown in FIG. 1. The intensity, or brightness level, of each image element has from 3 to 1024, and preferably 256, gray levels, and the image element density is 500 dpi (dots, or image elements, per inch) in each coordinate direction. For further processing, the original range of gray levels of the fingerprint image is transformed into a uniform range.
2) Calculating Direction Array and Curvature Array
A direction array and a curvature array are calculated for each image point by a quick recurrent algorithm, possibly with the aid of some tables. The direction array and curvature array values for an image represent the average ridge direction and its accuracy or variance at each point of the image.
3) Cleaning Background
An equiangular, or regular, octagonal region of clear ridges of the fingerprint is segmented from the background by eight straight lines according to the curvature array, and all large connected regions of noise points within the octagon are excavated. All further processing is confined to this region.
4) Trend Analyzing
The ridge trends and forkedness of any point can be determined by analyzing the distribution of ridge directions around it on circles with different radii.
5) Finding Cores and Deltas
All singularities are found by scanning the curvature array with a 3×3 window to find all maximum curvature points. All cores and deltas are located by analyzing the trends for each singularity.
6) Line Tracing
Contour lines, shape lines and normal lines of a fingerprint are traced accurately, based on the direction array, by accumulating the errors of coordinates and directions to correct the trace point by point.
7) Deciding the Coordinate Axis
By macroscopically analyzing the structure of a direction array, the fingerprint is centered on a coordinate axis system; meanwhile the central orientation of the fingerprint is selected on the basis of the trend at the center.
8) Classification
A fingerprint is classified into one of 18 classes according to the structural relations among the shape lines.
9) Extracting Shape and Global Features
For classifying and fast searching, a few shape features are extracted consistently from the shape lines for the various fingerprint classes except plain arch. Furthermore, a plurality of global features are extracted consistently for the various fingerprint classes with reference to the coordinate axes.
10) Detecting Minutiae
Based on the description of minutiae when the shape lines are valleys, all minutiae are detected by tracing each valley of the fingerprint in the gray level image. Each minutia is represented by its x, y coordinates and direction θ.
11) Quality Checking
A synthetic quality level and quality vector of the fingerprint are presented, based on the position of the center, number of minutiae, noise level, area of the clear region, etc., in order to decide automatically, or to suggest to the operator, whether to accept, reject or reevaluate the fingerprint.
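Step 5 above, scanning the curvature array with a 3×3 window for maximum-curvature points, can be sketched as a strict local-maximum test. A simplified illustration: the 255 background marker follows the document, while the function name and layout are assumptions:

```c
/* Return 1 if the curvature array c (row-major, width w) has a strict
 * 3x3 local maximum at (x, y); 255 marks background/noise points as in
 * the document. Points that pass are candidate singularities. */
static int is_singularity(const unsigned char *c, int w, int x, int y)
{
    int v = c[y * w + x];
    if (v == 255) return 0;                   /* background or noise point */
    for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++) {
            int nv = c[(y + dy) * w + x + dx];
            if ((dx || dy) && nv != 255 && nv >= v)
                return 0;                     /* not a strict local maximum */
        }
    return 1;
}
```

Only the points passing this test then undergo the (more expensive) trend analysis to decide whether each is a core, a delta, or noise.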
3.2 Direction Array and Curvature Array
All features of the fingerprint, including ordinary features (cores, deltas, minutiae) and novel features (shape and global features), are referenced to the average direction of ridges in a small region of the fingerprint. The calculation of local direction is therefore very important for image processing and feature extraction of fingerprints. In this invention, an array whose every element represents an average direction of the textures in a small region of the image is called a direction array.
There are three features of the method provided herein for calculating a precise direction array:
First, the direction array is calculated directly from the gray level image of the fingerprint, so the original information will be used as much as possible.
Second, the direction array is calculated point by point in the image, i.e. every element in the array is the local average direction at just one point of the image, calculated on the basis of a neighborhood of that point. This is especially necessary for regions of the fingerprint where the directions of ridges change greatly or the curvatures are very high.
Third, to represent the accuracy of the local direction at each point, a local average curvature of the textures in the neighborhood of the same point is also calculated, and a curvature array of the image is generated therefrom. The use of the local direction at a point should refer to the local curvature, as an average value refers to its variance. The local curvature in some regions of a fingerprint, for example at a core, delta, scar or noise, will have a very high value, signifying that the local direction there is meaningless due to inconsistency of ridge directions and/or indistinct textures in that region. Generally, lower curvature values mean better accuracy of the direction value at a point.
For a given image area S, which may include the entire fingerprint image or any selected portion thereof, there are derived four summation gradient values v_i (i=1, 2, 3, 4), each representing the sum of the absolute values of the differences, g_i(X), in gray scale image point values, f(X), between each pair of points, X and X-Q_i, in S, where Q_i is a selected vector. Thus, as will be seen from the example to be described, depending on the value of Q_i, v_i is representative of the degree of change in the image across the image area in direction Q_i. Moreover, v_i will be particularly relevant to the point P at the center of S, because it is most probably the gradient condition at P that is described by v_i.
Table 1, below, provides an exemplary gray scale value matrix representing the gray values f(X) at respective points, X, of an image area S. Here, each point X has a horizontal coordinate k and a vertical coordinate l. As is apparent from Table 1, the origin of the coordinate system lies somewhere above and to the left of the illustrated image area.
Each point X is represented by a pair of coordinates k, l. In Table 1, the k and l coordinates at the center of S are n and m.
Table 1

S    k:  n-5 n-4 n-3 n-2 n-1  n  n+1 n+2 n+3 n+4 n+5
l
m-5      13  14  15  15  15  14  12  11   9   7   6
m-4      14  14  14  13  12  10   6   4   2   2   3
m-3      13  11   9   6   6   4   3   3   4   6   8
m-2      10   7   4   1   2   2   3   4   6   9  11
m-1       5   4   3   0   2   4   6   8  11  13  14
m         3   4   5   6   8   9  11  12  14  15  15
m+1       4   6   9  10  12  14  15  15  15  14  12
m+2       7   9  12  13  14  14  15  14  13  10   8
m+3      13  14  14  14  14  13  10   8   8   6   4
m+4      15  15  14  13  11   8   3   2   5   5   5
m+5      14  12  10   8   7   4   0   1   5   7   8
For each vector Q_i there is produced a set S_i containing all points X for which g_i(X) = f(X) - f(X-Q_i) is calculated. The points in each set S_i are obtained by using the corresponding value of Q_i, i.e., S_1 is obtained by using Q_1, S_2 by using Q_2, etc. The values of g_i(X) in each set S_i are given in Tables 2-5, below, where the Q_i have the following k,l coordinate values:
Q_1 = (1,0);
Q_2 = (1,1);
Q_3 = (0,1);
Q_4 = (-1,1).

Table 2
S_1  k:  n-4 n-3 n-2 n-1  n  n+1 n+2 n+3 n+4 n+5
l
m-5       1   1   0   0   1   2   1   2   2   1
m-4       0   0   1   1   2   4   2   2   0   1
m-3       2   2   3   0   2   1   0   1   2   2
m-2       3   3   3   1   0   1   1   2   3   2
m-1       1   1   3   2   2   2   2   3   2   1
m         1   1   1   2   1   2   1   2   1   0
m+1       2   3   1   2   2   1   0   0   1   2
m+2       2   3   1   1   0   1   1   1   3   2
m+3       1   0   0   0   1   3   2   0   2   2
m+4       0   1   1   2   3   5   1   3   0   0
m+5       2   2   2   1   3   4   1   4   2   1
Table 3

S_2  k:  n-4 n-3 n-2 n-1  n  n+1 n+2 n+3 n+4 n+5
l
m-4       1   0   2   3   5   8   8   9   7   4
m-3       3   5   8   7   8   7   3   0   4   6
m-2       6   7   8   4   4   1   1   3   5   5
m-1       6   4   4   1   2   4   5   7   7   5
m         1   1   3   8   7   7   6   6   4   2
m+1       3   5   5   6   6   6   4   3   0   3
m+2       5   6   4   4   2   1   1   2   5   6
m+3       7   5   2   1   1   4   7   6   7   6
m+4       2   0   1   3   6  10   8   3   3   1
m+5       3   5   6   6   7   8   2   3   2   3

Table 4
S_3  k:  n-5 n-4 n-3 n-2 n-1  n  n+1 n+2 n+3 n+4 n+5
l
m-4       1   0   1   2   3   4   6   7   7   5   3
m-3       1   3   5   7   6   6   3   1   2   4   5
m-2       3   4   5   5   4   2   0   1   2   3   3
m-1       5   3   1   1   0   2   3   4   5   4   3
m         2   0   2   6   6   5   5   4   3   2   1
m+1       1   2   4   4   4   5   4   3   1   1   3
m+2       3   3   3   3   2   0   0   1   2   4   4
m+3       6   5   2   1   0   1   5   6   5   4   4
m+4       2   1   0   1   3   5   7   6   3   1   1
m+5       1   3   4   5   4   4   3   1   0   2   3
Table 5
S_4  k:  n-5 n-4 n-3 n-2 n-1  n  n+1 n+2 n+3 n+4
l
m-4       0   1   1   2   2   2   5   5   5   4
m-3       1   3   4   6   4   2   1   1   2   3
m-2       1   2   2   5   2   1   0   0   0   1
m-1       2   0   2   2   0   1   2   2   2   2
m         1   1   5   4   4   3   3   1   1   1
m+1       0   1   3   2   3   3   3   1   0   1
m+2       1   0   2   1   0   1   0   1   1   2
m+3       4   2   1   0   0   2   4   5   2   2
m+4       1   1   0   1   2   2   5   6   1   1
m+5       1   2   3   3   1   1   2   4   0   2
The number of values of g_i(X) in each set S_i is less than the number of points in area S, because each value g_i(X) is calculated only for a pair of values f(X) and f(X-Q_i) which are both in area S. Thus, for example, in set S_1 there will be no value of g_1(X) for which the k coordinate of X is n-5, because the k coordinate of X-Q_1 would then be n-6, which is outside of area S. Then, for each set S_i, there is derived a value v_i where v_i = Σ|g_i(X)| over all values of g_i(X) in set S_i. For i = 1, 2, 3, 4, the values of v_i in the example shown in Tables 1-5 are v_1=167; v_2=437; v_3=337 and v_4=194.
From these values for v_i, there are derived four further values u_i, as follows:
u_1 = max(v_1, v_3);
u_2 = max(v_2, v_4);
u_3 = min(v_1, v_3);
u_4 = min(v_2, v_4).
In the case of the example in Tables 1-5, u_1=337; u_2=437; u_3=167; u_4=194.
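The derivation of the v_i and u_i can be sketched as follows. grad_sum is an illustrative helper, not the patent's own code, and the area here is a plain square rather than the convex sets S_i of FIG. 4:

```c
#include <stdlib.h>

/* Sum of |f(X) - f(X - Q_i)| over an n x n area S for one offset
 * Q_i = (qk, ql); pairs whose second point falls outside S are skipped,
 * as explained in the text. f is row-major with stride n. */
static int grad_sum(const int *f, int n, int qk, int ql)
{
    int v = 0;
    for (int l = 0; l < n; l++)
        for (int k = 0; k < n; k++) {
            int k2 = k - qk, l2 = l - ql;          /* the point X - Q_i */
            if (k2 >= 0 && k2 < n && l2 >= 0 && l2 < n)
                v += abs(f[l * n + k] - f[l2 * n + k2]);
        }
    return v;
}
```

With Q_1=(1,0), Q_2=(1,1), Q_3=(0,1) and Q_4=(-1,1) this yields v_1..v_4, from which u_1=max(v_1,v_3), u_2=max(v_2,v_4), u_3=min(v_1,v_3) and u_4=min(v_2,v_4) follow directly.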
These values are then used to derive a further value, e. For the example shown in Tables 1-5, e = 36.148865.
The derived values for v, u and e are then used to derive a local direction value, d(f, P, S), and a local curvature value, c(f, P, S).
Referring to FIGS. 2, 3 and 4 of the accompanying drawing, according to the 1986 and 1988 publications of Shen, the formulas for calculating the local direction d and curvature c at each point P at the center of a set S in image f are as follows:
d(f,P,S) = sign(v_4 - v_2) · arctan((v_1 - e)/(v_3 - e)). (1)
In the example shown in Tables 1-5, which example is associated with P(n,m), d = -23.5° and c = 0.0927. The normal values of direction d are limited between -π/2 (upward in FIG. 2) and +π/2 (downward in FIG. 2). Here d=0 represents the horizontal direction, which points toward the right in Table 1 and FIG. 2. According to Equation (2), the normal values of curvature c are all between 0 and 1.
Specifically, c=0 represents texture with a plain local curvature and c=1 represents texture with an abrupt local curvature. Additionally, for each background point or noisy point P, both c and d are set to a special value: c(P)=255 and d(P)=255.
S(P) is a neighborhood of P; it is convex and symmetric with respect to P and with respect to the directions 0° and 45°, respectively. For example, a digital square, disc or octagon with P as its center are all neighborhoods of this kind. Each S_i is a subset of S obtained by deleting some border points, as shown in FIG. 4, where '.' marks a point in both S and S_i, and '*' marks a point not in S_i (i=1,2,3,4).
Now referring to FIG. 5: because calculating the direction array and curvature array point by point is tremendously complex, a quick recurrent algorithm with some tables is proposed to reduce the complexity, on the basis of three key points:
First of all, because each gradient model f(P) - f(P-Q_i) will be used as many times as the number of points whose neighborhood includes both P and P-Q_i, four arrays of gradient models in the four directions of FIG. 3 are calculated first and denoted g_i respectively:
g_i(P) = f(P) - f(P-Q_i), where i=1, 2, 3, 4. (3)
Second, there are many common points in the neighborhoods of P and its adjacent point P~. So v_i(P~) can be calculated recurrently from v_i(P) by subtracting g_i(X) for each X on the left side L(S_i(P)) of S_i(P) and adding g_i(X) for each X on the right side R(S_i(P~)) of S_i(P~), as shown in FIG. 5a, where each '*' is in S_i(P), each '.' is in S_i(P~) and each 'o' is in both sets:
v_i(f,P~,S_i(P~)) = v_i(f,P,S_i(P)) - Σ(g_i(X),X,L(S_i(P))) + Σ(g_i(X),X,R(S_i(P~))). (4)
For example, after v_i has been calculated for P(n,m), v_i may be calculated for P~(n+1,m) by subtracting the values of g_i(n-4,l) and adding the values of g_i(n+6,l), in each case with l taking on each value from m-5 to m+5.
Furthermore, as shown in FIG. 5b (where '*' is in L, '.' is in L~ and 'o' is in both of them), Σ(g_i(X),X,L~) can also be calculated recurrently:
Σ(g_i(X),X,L~) = Σ(g_i(X),X,L) - g_i(P') + g_i(P"). (5)
where P' is the top point of L and P" is the bottom point of L~, as shown in FIG. 5b.
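Equation (5) is a one-dimensional sliding update on a single column. A sketch; the function name and interface are illustrative:

```c
/* Equation (5): update the sum s of g_i over a vertical segment of one
 * column when the segment slides down one row; drop the old top point P'
 * and add the new bottom point P''. g_col holds one column of g_i values. */
static int slide_column_sum(int s, const int *g_col, int old_top, int new_bottom)
{
    return s - g_col[old_top] + g_col[new_bottom];
}
```

Each column sum is thus maintained in O(1) per step instead of resumming all 2·r+1 points of the segment.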
Third, some lookup tables are used instead of a series of calculations for direction d and curvature c.
To calculate the value of d, a table
Td = {Td[0], Td[1], ..., Td[Md]};
is created, where each term Td[k] in Td is defined as
Td[k] = arctan(k/Md), for k = 0, 1, ..., Md, (6)
where Md is a predetermined integer. So the values of the terms in Td are all between 0 and π/4. For any group of v_i and e with the condition v_1 ≤ v_3, let
diff_d = Td[int(Md·((v_1-e)/(v_3-e))+0.5)] - arctan((v_1-e)/(v_3-e)).
Then, by the continuity of the function arctan(), the value diff_d will be very small if Md is large enough. So, by means of the table Td, Equation (1) can be transformed approximately to:
d(P) = sign(v_4-v_2)·((v_1 ≤ v_3) ?
Td[int(Md·((v_1-e)/(v_3-e))+0.5)] :
(π/2 - Td[int(Md·((v_3-e)/(v_1-e))+0.5)])). (1')
The integer Md can be selected large enough to assure sufficient accuracy for d.
To calculate the value of c, another table
Tc = {Tc[0][0], Tc[0][1], ..., Tc[Mc][Mc]};
is created, where each term Tc[i][j] in Tc is defined by evaluating the function of Equation (2) at u_3/u_1 = i/Mc and u_4/u_2 = j/Mc, and where Mc is a predetermined integer. So the values of the terms in Tc are all between 0 and 1. For any group of u_i, let
diff_c = Tc[int(Mc·u_3/u_1+0.5)][int(Mc·u_4/u_2+0.5)] - c.
Then, by the continuity of the function in Equation (2), the value diff_c will be very small if Mc is large enough. So, by means of the table Tc, Equation (2) can be transformed approximately to:
c(P) = Tc[int(Mc*u_{3}/u_{1}+0.5)] [int(Mc*u_{4}/u_{2}+0.5)].
The integer Mc can be selected large enough to assure sufficient accuracy of the formula.
The lookup tables may be provided, e.g. in a nonvolatile addressable memory, where each entry corresponds to a respective value of Td or Tc.
If the neighborhood for calculating the local direction and curvature is a square window with 2·r+1 points as both its length and width, and L and K represent the numbers of rows and columns, respectively, of the image array, then the algorithm is as follows:
<1> Calculate g_i:
g_1[l][k]=f[l][k]-f[l][k-1], for l=0,...,L-1, k=1,...,K-1;
g_2[l][k]=f[l][k]-f[l-1][k-1], for l=1,...,L-1, k=1,...,K-1;
g_3[l][k]=f[l][k]-f[l-1][k], for l=1,...,L-1, k=0,...,K-1;
g_4[l][k]=f[l][k]-f[l-1][k+1], for l=1,...,L-1, k=0,...,K-2;
goto <2>;
<2> Accumulate g_i of each column k on a vertical line {(0,k), ..., (2·r,k)}:
w_1[k]=∑(g_1[l][k], l, {0,1,...,2·r}), for k=1,...,K-1;
w_2[k]=∑(g_2[l][k], l, {1,2,...,2·r}), for k=1,...,K-1;
w_3[k]=∑(g_3[l][k], l, {1,2,...,2·r}), for k=0,...,K-1;
w_4[k]=∑(g_4[l][k], l, {1,2,...,2·r}), for k=0,...,K-2;
where k is the number of a column, l is the number of a row varying over the rows of the window, and each w_i[k] is a summation of g_i over those rows, indexed by k.
goto <3>;
<3> Accumulate w_i (i=1, 2, 3, 4) over the current window in each g_i to obtain v_i:
v_1=∑(w_1[j], j, {1,...,2·r});
v_2=∑(w_2[j], j, {1,...,2·r});
v_3=∑(w_3[j], j, {0,...,2·r});
v_4=∑(w_4[j], j, {0,...,2·r-1});
goto <4>;
<4> Calculate the curvature and direction of the point (k,l) of the current window, then shift the window to the right:
if (v_1+v_2+v_3+v_4 < p_v) then
{c[l][k]=255; d[l][k]=255;}
else {c[l][k]=Tc[int(Mc·u_3/u_1+.5)][int(Mc·u_4/u_2+.5)];
if (v_3 == v_1) then d[l][k]=sign(v_4-v_2)·π/4;
else {d[l][k] = sign(v_4-v_2)·((v_1<v_3) ?
Td[int(Md·(v_1-e)/(v_3-e)+.5)] :
(π/2 - Td[int(Md·(v_3-e)/(v_1-e)+.5)]));
}
}
k=k+1;
if (k ≥ K-r) goto <5>;
v_1=v_1+w_1[k+r]-w_1[k-r];
v_2=v_2+w_2[k+r]-w_2[k-r];
v_3=v_3+w_3[k+r]-w_3[k-r-1];
v_4=v_4+w_4[k+r-1]-w_4[k-r-1];
goto <4>;
<5> Put the current window at the beginning of the next line and recalculate w_i:
l=l+1;
if (l ≥ L-r) then return;
w_1[k]=w_1[k]+g_1[l+r][k]-g_1[l-r-1][k], for k=1,...,K-1;
w_2[k]=w_2[k]+g_2[l+r][k]-g_2[l-r][k], for k=1,...,K-1;
w_3[k]=w_3[k]+g_3[l+r][k]-g_3[l-r][k], for k=0,...,K-1;
w_4[k]=w_4[k]+g_4[l+r][k]-g_4[l-r][k], for k=0,...,K-2;
p_v is a threshold directly proportional to r², and the condition v_1+v_2+v_3+v_4 < p_v at a point means that the contrast of gray level in its neighborhood is very low.
The directions at such points are ignored, and the curvatures are assigned the special value 255.
Here each term w_i[k] is associated with a column number k. For example, referring again to the values in Tables 1 and 2 above, let r=5; then
w_1[n]=∑(g_1[l][n], l, {m-5,m-4,...,m+5})=17, and similarly w_1[n+1]=26.
Now an example is provided in Table 6, which is derived from Table 4, for r=2. The values of w_3[k] (k=n-5,n-4,...,n+5) are calculated as in Table 6, while Table 7 shows the values of v_3 at the points from line m-3 to line m+3 in Table 4.
Table 6

 w_3    k:  n-5  n-4  n-3  n-2  n-1   n   n+1  n+2  n+3  n+4  n+5
 l = m-3     10   10   12   15   13   14   12   13   16   16   14
     m-2     11   10   13   19   16   15   11   10   12   13   12
     m-1     11    9   12   16   14   14   12   12   11   10   10
     m       11    8   10   14   12   12   12   12   11   11   11
     m+1     12   10   11   14   12   11   14   14   11   11   12
     m+2     12   11    9    9    9   11   16   16   11   10   12
     m+3     12   12    9   10    9   10   15   14   10   11   12
For example,
w_3[m+2][n]=w_3[m+1][n]-g_3[m-2][n]+g_3[m+2][n]=14-2+0=12.
Table 7

 v_3    k:  n-3  n-2  n-1   n   n+1  n+2  n+3
 l = m-3     60   64   66   67   68   71   71
     m-2     69   73   74   71   64   61   58
     m-1     62   65   68   68   63   59   55
     m       55   56   60   62   59   58   57
     m+1     59   58   62   65   62   61   62
     m+2     50   49   54   61   63   64   65
     m+3     52   50   53   58   58   60   62
For example,
v_3[m-2][n-1]=v_3[m-2][n-2]-w_3[m-2][n-4]+w_3[m-2][n+1]=73-10+11=74.
Thus, with the above algorithm, only 4 additions and subtractions are needed on average to calculate v_3 at each point.
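The column recurrence can be checked directly against the Table 6 data. The following Python sketch (illustrative only) reproduces the 73-10+11=74 example for row m-2 with r=2, using column offsets relative to n:

```python
# w_3 values of Table 6, row m-2, for columns n-5 ... n+5
w3 = {k: v for k, v in zip(range(-5, 6),
                           [11, 10, 13, 19, 16, 15, 11, 10, 12, 13, 12])}
r = 2

def v3(k):
    """Direct (2r+1)-term sum of w_3 over the window centred at column k."""
    return sum(w3[j] for j in range(k - r, k + r + 1))

v_prev = v3(-2)                              # direct sum at column n-2: 73
v_next = v_prev - w3[-2 - r] + w3[-2 + r + 1]  # recurrence: 73 - 10 + 11 = 74
```

The recurrent value v_next agrees with the direct sum v3(-1), at the cost of one subtraction and one addition.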
3.3 Removing Background
In most fingerprint images there are noisy textures and other features in the background. Before extracting the salient features of the fingerprint, it is necessary to segment a region of clear ridges, or valleys, from background and noise, i.e. to decide the position and boundary of the region of clear ridges. The curvature value of a point in the background is always very high, due to low contrast or noise, so the clear region can be obtained by cutting off the points with high curvature values.
Referring to FIGS. 6, the method proposed here locates an equiangular octagonal region of clear ridges of the fingerprint. The eight edges of the octagon are all straight lines with predetermined equispaced angular orientations, e.g. 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315°, respectively.
Referring to FIG. 6a, the edges of an equiangular octagon are labeled E_1, ..., E_8 respectively, and the corners are assigned the coordinates (x_1,y_1), ..., (x_8,y_8) respectively. Each edge can thus be represented by the equation of a straight line as follows:
E_{1} : x+y=x_{1}+y_{1} ; or x+y=x_{2}+y_{2};
E_{2}: y=y_{2};
E_{3} : xy=x_{3}y_{3}; or xy=x_{4}y_{4};
E_{4} : x=x_{4} ;
E_{5} : x+y=x_{5}+y_{5}; or x+y=x_{6}+y_{6};
E_{6}: y=y_{6} ;
E_{7}: xy=x_{7}y_{7}; or xy=x_{8}y_{8};
E_8: x=x_8.
Any equiangular octagon is determined by these eight equations, i.e. by the eight parameters {x_2, y_2, x_4, y_4, x_6, y_6, x_8, y_8}. The octagon is obtained by cutting the image with eight lines sequentially according to the curvature array, and is described by only 8 bytes giving the positions of the eight lines.
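Since each edge is the boundary of a half-plane, deciding whether a point lies in the clear region reduces to eight inequalities on the eight parameters. A hedged Python sketch (the function name, parameter ordering and inequality directions are illustrative assumptions, chosen so that y_2 > y_6 and x_8 > x_4):

```python
def in_octagon(x, y, params):
    """Membership test for the equiangular octagon described by the eight
    parameters (x2, y2, x4, y4, x6, y6, x8, y8) of the text; orientation
    of each inequality is an assumption for illustration."""
    x2, y2, x4, y4, x6, y6, x8, y8 = params
    return (y6 <= y <= y2 and              # horizontal edges E2, E6
            x4 <= x <= x8 and              # vertical edges E4, E8
            x6 + y6 <= x + y <= x2 + y2 and  # diagonal edges E1, E5
            x4 - y4 <= x - y <= x8 - y8)     # diagonal edges E3, E7
```

For example, with an octagon obtained by cutting the corners of the square [0,10]x[0,10], interior points pass all eight tests while the cut corners fail exactly one.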
Several shapes of octagon are shown in FIGS. 6. The algorithm to locate the eight edges is as follows:
<1> Set the original edges of the clear region such that
x_4=r; x_8=L-r-1; y_6=r; y_2=K-r-1;
Calculate the average curvature of every row and column in the current clear region, and store them in the two arrays acl[ ] and ack[ ] respectively, i.e.
acl[y]=∑(c[y][x], x, {x_4,x_4+1,...,x_8}), for y=y_6, y_6+1, ..., y_2;
ack[x]=∑(c[y][x], y, {y_6,y_6+1,...,y_2}), for x=x_4, x_4+1, ..., x_8;
let n=0;
<2> if (n > p_n_1) then goto <5>;
where p_n_1 is a predetermined limit, i.e. the image is cut no more than p_n_1 times.
Let y_0 be the number, or coordinate, of the row of the current array with the minimum average curvature value, i.e.
acl[y_0]=min(acl[y_6], ..., acl[y_2]);
and similarly let x_0 satisfy
ack[x_0]=min(ack[x_4], ..., ack[x_8]).
if (ack[x_0]/(y_2-y_6+1) ≥ acl[y_0]/(x_8-x_4+1)) then goto <3>;
else goto <4>;
<3> Cut the area by horizontal lines, i.e. determine edges E_2 and E_6. Let
p_c_y = p_c_p·∑(acl[y], y, {y_6,...,y_2})/(y_2-y_6+1);
l_1 = max{y | (acl[y] > p_c_y) & (y≤y_0) & (y≥y_6)};
l_2 = min{y | (acl[y] > p_c_y) & (y≥y_0) & (y≤y_2)};
i.e. l_1 is the largest row number satisfying l_1≤y_0, l_1≥y_6 and acl[l_1]>p_c_y; and l_2 is the smallest row number satisfying l_2≥y_0, l_2≤y_2 and acl[l_2]>p_c_y. Then recalculate y_2, y_6 and ack[ ] in the current clear region as follows:
y_2=l_2; y_6=l_1;
ack[x]=∑(c[y][x], y, {y_6,y_6+1,...,y_2}), for x=x_4, x_4+1, ..., x_8;
let n=n+1;
goto <2>;
where p_c_p is a predetermined parameter for calculating the parameters p_c_x and p_c_y.
<4> Cut the area by vertical lines, i.e. determine edges E_4 and E_8. Let
p_c_x = p_c_p·∑(ack[i], i, {x_4,...,x_8})/(x_8-x_4+1);
k_1 = max{x | ack[x]>p_c_x & x≤x_0 & x≥x_4};
k_2 = min{x | ack[x]>p_c_x & x≥x_0 & x≤x_8};
i.e. k_1 is the largest column number satisfying k_1≤x_0, k_1≥x_4 and ack[k_1]>p_c_x; and k_2 is the smallest column number satisfying k_2≥x_0, k_2≤x_8 and ack[k_2]>p_c_x. Then recalculate x_8, x_4 and acl[ ] in the current clear region as follows:
x_8=k_2; x_4=k_1;
acl[y]=∑(c[y][x], x, {x_4,x_4+1,...,x_8}), for y=y_6, y_6+1, ..., y_2;
let n=n+1;
goto <2>;
<5> Cut the area by hypotenuse lines, i.e. determine the edges E_1, E_3, E_5 and E_7 of the octagon enclosing the clear region, i.e. determine the numbers x_2, y_4, x_6 and y_8. Let
p_c_z=(p_c_x + p_c_y)/2;
then the four numbers can be calculated as follows:
x_2=min{z | (ac3[z]>p_c_z) & (z≤x_8) & (z>(x_4+x_8)/2)};
where for z=x_8, x_8-1, ..., (x_4+x_8)/2,
ac3[z]=∑(c[l][k], (k,l), A_2(z));
where A_2(z) = {(y_2,z), (y_2-1,z+1), ..., (y_2-x_8+z,x_8)}.
x_6 = max{z | (ac3[z]>p_c_z) & (z≥x_4) & (z≤(x_4+x_8)/2)};
where for z=x_4, x_4+1, ..., (x_4+x_8)/2,
ac3[z]=∑(c[l][k], (k,l), A_6(z));
where A_6(z) = {(y_6,z), (y_6+1,z-1), ..., (y_6-x_4+z,x_4)}.
y_4 = min{z | (ac3[z]>p_c_z) & (z≤y_2) & (z>(y_2+y_6)/2)};
where for z=y_2, y_2-1, ..., (y_2+y_6)/2,
ac3[z]=∑(c[l][k], (k,l), A_4(z));
where A_4(z) = {(y_2,z), (y_2-1,z-1), ..., (y_6+x_4-z,x_4)}.
y_8=max{z | (ac3[z]>p_c_z) & (z≥y_6) & (z<(y_2+y_6)/2)};
where for z=y_6, y_6+1, ..., (y_2+y_6)/2,
ac3[z]=∑(c[l][k], (k,l), A_8(z));
and A_8(z)={(y_6,z), (y_6+1,z+1), ..., (y_6+x_8-z,x_8)}.
<6> After the octagon is obtained, the curvatures of points in the background are set to 255 to distinguish them from the points in the clear area.

3.4 Locating Singularities And Analyzing Trends
Now referring to FIGS. 7, there are three types of core in a fingerprint, according to the structure of the ridges around them, named 'o' (FIG. 7a), 'n' (FIG. 7b) and 'u' (FIG. 7c), respectively. An 'o' core may appear in a whorl, an 'n' core may appear in a whorl, a double loop or a loop, while a 'u' core may appear in a whorl, a double loop or a nodding loop. However, any core or delta is a point at which the directions of the surrounding ridges are very inconsistent, so its curvature is very high and may be higher than the curvatures of its neighboring points. Visually, there are always many bright points on the curvature array of a fingerprint; some of them indicate the position of a core or delta, while others indicate a scar, fold or noise.
According to the publications of Shen in 1986, a point on a digital image is called a singularity if its curvature is a maximum, greater than a threshold, among its 8 neighboring points, which form the corners and edge midpoints of a square centered on the point.
A point P=(k,l) is called a singularity if its curvature is not less than those of its 8 neighboring points and not less than a predetermined threshold p_c1, i.e.
1. (c(P) ≥ c(P+Q_i)) & (c(P) ≥ c(P-Q_i)), (i=1,2,3,4);
2. c(P) ≥ p_c1;
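The singularity test above can be sketched as follows (illustrative Python; the function name and array layout are assumptions, and background handling is omitted):

```python
def singularities(c, p_c1):
    """Return all (row, col) whose curvature is >= the threshold p_c1
    and >= the curvatures of all 8 neighbouring points."""
    rows, cols = len(c), len(c[0])
    found = []
    for l in range(1, rows - 1):
        for k in range(1, cols - 1):
            centre = c[l][k]
            if centre < p_c1:
                continue  # condition 2 fails
            neigh = [c[l + dl][k + dk]
                     for dl in (-1, 0, 1) for dk in (-1, 0, 1)
                     if (dl, dk) != (0, 0)]
            if all(centre >= v for v in neigh):  # condition 1
                found.append((l, k))
    return found
```

In the patent's setting, background points carry the sentinel curvature 255, so a practical implementation would also exclude them before applying this test.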
There will be some singularities appearing in the region near a core or delta. But usually there will also be some singularities appearing in the region near a scar, fold or noise. To recognize a core or a delta among the singularities, it is therefore necessary to analyze the structure around each singularity.
Now referring to FIGS. 8, the ridge flow is different around a core, a delta or an ordinary point. The distinction can be described easily using the concept of a ridge trend. A ridge trend of a point is defined as the direction of ridges that are near the point and run off from the point. There are three ridge trends for a delta, two trends for an 'a' core or an ordinary point, one trend for an 'n' or 'u' core, and no ridge trend for an 'o' core. The number of ridge trends of a point is called the forkedness of the point. To find the ridge trends of a point P, a series of digital circles {O_s} with P_c as the center and various radii s is used. For every point P_s on a circle O_s, the difference between the local direction at P_s and the direction of the vector P_cP_s is calculated and stored in an array dd[ ]. The difference will be very small when P_cP_s extends substantially in the direction of a ridge trend, as shown by the curve minima in FIGS. 9, so the ridge trends can be found by locating all minima in dd[ ]. FIG. 9a shows the pattern of the array dd[ ] around an 'o' core, FIG. 9b around an 'n' core, FIG. 9c around an 'a' core, and FIG. 9d around a delta. FIG. 8e represents a noise point or a scar, and FIG. 9e shows the pattern of dd[ ] around the point in FIG. 8e.
The Fourier transform and inverse transform are applied to dd[ ] to reduce the effect of noise. The forkedness can be decided from the power spectrum, while the trends can be found on the filtered dd[ ].
The method for determining the forkedness of a point will be described with reference to the power spectrum of a Fourier transform FT: ω[0], ..., ω[j], where j is the order of the Fourier transform. First, if ω[0] is the maximum of all ω[j] and ω[0] is not smaller than a predetermined threshold p_ω_0, then the point is an 'o' core. Second, if ω[1] is the maximum of all ω[j] except ω[0], and ω[1]≥p_ω_1, then it is an 'n' or 'u' core. Else if ω[2] is the maximum of all ω[j] except ω[0] and ω[2]>p_ω_2, then it is an 'a' core. Else if ω[3] is the maximum except ω[0] and ω[3]>p_ω_3, then it is a delta point. Here p_ω_0, p_ω_1, p_ω_2 and p_ω_3 are all predetermined parameters.
In the case of FIG. 8e, ω[2] may be the maximum, but the point is not an 'a' core. Whether it is an 'a' core depends on its trends: if the difference between the two trends is very large, for example greater than 2π/3, then it is not an 'a' core and is ignored.
It is notable, however, that both the trend directions and the forkedness at a point depend on the radius of the analysis circle. The algorithm for analyzing the trends of a point P is as follows:
<1> Let s = S_min,
where S_min is the minimum radius of the digital circles for trend analysis and S_max is the maximum.
<2> Calculate dd[ ].
if (s > S_max) then reject and return;
else {for every point P_si on the digital circle O_s do:
{dd[i] = (d(P_si) - dv(P_si P)) % π;
if (dd[i] > π/2) then dd[i] = π - dd[i];}
nz = #{X | (X on O_s) & (c(X) > p_c_2)};
if (nz > p_n_2) then {s = s+1; goto <2>;}
else goto <3>;}
i.e. if the number of points P_si on O_s with c(P_si) > p_c_2 is more than p_n_2, then increase the radius s and recalculate dd[ ].
<3> Perform a Fourier transform on the array dd[ ]:
a[j] = ∑(dd[i]·cos(2·π·i·j/m), i, {0,...,m-1})/m;
b[j] = ∑(dd[i]·sin(2·π·i·j/m), i, {0,...,m-1})/m;
for j = 0, ..., n;
where m=#O_s is the number of points on O_s.
The power spectrum is
ω[j] = (a[j])² + (b[j])²; for j=0, ..., n;
where n is the order of the Fourier transform.
if (ω[0] > p_ω_0) {P is an 'o' core; return;}
else {let k satisfy
ω[k] = max{ω[1], ω[2], ω[3]};
if (ω[k] < p_ω_k) then goto <2>;
else if (k == 1) then
{calculate the trend direction,
td = arctan(b[1]/a[1]) (-π < td ≤ π);
if (td > 0) then P is an 'n' core;
else P is a 'u' core;
return;
}
else perform the inverse Fourier transform:
{dd[i] = ∑(a[j]·cos(2·π·i·j/m) + b[j]·sin(2·π·i·j/m), j, {0,1,...,n});
}
}
where p_ω_j (j=0,...,3) are predetermined parameters.
<4> Find each minimum value dd[i_j] in dd[ ], i.e. each dd[i_j] that satisfies
dd[i_j]<dd[(i_j-1)%m] & dd[i_j]<dd[(i_j+1)%m] & dd[i_j]<p_dd, for j=1,...,l.
if (l is not equal to k) then goto <2>;
else {the forkedness at P is k;
the trends at P are dv(P_sij P), j=1,...,k.
}
return;
where p_dd is a predetermined threshold for finding minimum points; dd[(i_j-1)%m] and dd[(i_j+1)%m] are the values of dd[ ] at the neighboring points of P_sij on the circle.
In the equations presented above, n is a constant that is much smaller than m; the parameters S_min and S_max are predetermined; and p_c_2, p_n_2 and p_ω_j (j=0,...,3) are thresholds. More specifically, p_n_2 is selected to eliminate cases in which high curvature values are so numerous that the associated direction values are unreliable. The threshold p_c_2 is selected to obtain points with a suitably high curvature value.
According to preferred embodiments of the invention, S_min may be equal to 5 and S_max may be equal to 20.
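The power-spectrum classification of steps <3> and <4> can be sketched as follows (illustrative Python; the threshold list p_omega stands in for the patent's parameters p_ω_j, and the 'n'/'u' and trend-refinement branches are omitted for brevity):

```python
import math

def forkedness(dd, p_omega):
    """Classify a candidate point from its circle samples dd[].
    Returns 0 for an 'o' core (dominant, large DC term), otherwise the
    dominant harmonic k in {1, 2, 3} (one, two or three trends), or None
    if no component passes its threshold p_omega[k]."""
    m = len(dd)
    n = 3  # analysis order, much smaller than m
    a = [sum(dd[i] * math.cos(2 * math.pi * i * j / m) for i in range(m)) / m
         for j in range(n + 1)]
    b = [sum(dd[i] * math.sin(2 * math.pi * i * j / m) for i in range(m)) / m
         for j in range(n + 1)]
    w = [a[j] ** 2 + b[j] ** 2 for j in range(n + 1)]  # power spectrum
    if w[0] > p_omega[0]:
        return 0  # dd[] is uniformly large: ridges circle the point
    k = max((1, 2, 3), key=lambda j: w[j])
    return k if w[k] >= p_omega[k] else None
```

For instance, dd[] around a two-trend point oscillates twice per revolution, so the second harmonic dominates and the sketch reports forkedness 2.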
The procedure for finding all cores and deltas of a fingerprint, except the 'a' core, is as follows:
<1> All singularities, i.e. maxima of the curvature array in the clear region, are located;
<2> Every singularity is analyzed by the above algorithm to find its trends;
<3> In every set of cores or deltas of the same kind located together, one point is selected as the representative. The selection criterion for an 'o' core is ω[0], for an 'n' or 'u' core is ω[1], and for a delta is ω[3].
In section 3.7, a method will be provided for finding the 'a' core.

3.5 Line Tracing
A digital curve in a fingerprint is called a contour line if it is in keeping with the local ridge direction at every point on it. Usually, a contour line can be obtained by starting from a point on the fingerprint, with a trend of the point as the initial direction, and extending or tracing progressively according to the direction array. In particular, if the initial point is a core or delta of the fingerprint, then the extended trace is called a shape line of the fingerprint.
In the method provided below for accurately tracing contour lines, every tracing step moves by just one point; meanwhile the errors of each coordinate and of the direction are accumulated for correcting the tracing, and the tracing stops at the right place. Where k0 and l0 are the column and row with minimum curvature in the clear region produced as described previously, and cl, ck and cd are the current row, column and direction values, the algorithm for tracing a contour line starting from point (k0,l0) with initial direction d0 is as follows:
dl, dk, dd are the accumulated differences of the coordinates and direction during tracing, respectively; tll[ ], tlk[ ], tld[ ] are the arrays of coordinates and direction of the reference points of the trace; di is a parameter; ac is the average curvature in the current segment of the line.
<1> Initialize the variables:
dl=dk=dd=0; cl=l0; ck=k0; cd=d0;
i=1; ac=0;
tll[0]=l0; tlk[0]=k0; tld[0]=d0;
goto <2>;
<2> Step to the next point:
while (cd-tld[i-1] < -π) cd = cd+π;
while (cd-tld[i-1] > π) cd = cd-π;
if (|cd-d0| > p_d_3) then goto <3>;
where p_d_3 is the limit of the difference between the current direction cd and the initial direction d0.
ac = ac+c[cl][ck];
if (i ≥ p_l) then
{ac=ac-c[tll[i-di]][tlk[i-di]];
if (ac > p_ac·p_l) then goto <3>;
}
where p_ac is a threshold.
dd = cd-tld[i-1];
if (|dd| > p_d_2) then goto <3>;
where p_d_2 is the limit of the accumulated difference of direction.
if (dd > p_d_1) then
{dd = dd-p_d_1;
cd = cd-dd;
}
else if (dd < -p_d_1) then
{dd = dd+p_d_1;
cd = cd-dd;
}
else dd = 0;
where p_d_1 is the maximum value for correcting the direction. The increments of cl and ck depend on sin(cd) and cos(cd):
if (|sin(cd)| < |cos(cd)|) then
{cl = cl+sign(cos(cd));
dk = dk+tan(cd);
if (|dk| > 1) then
{ck = ck+sign(dk);
dk = dk-sign(dk);
}
}
else
{ck = ck+sign(sin(cd));
dl = dl+cot(cd);
if (|dl| > 1) then
{cl = cl+sign(dl);
dl = dl-sign(dl);
}
}
if the point (ck,cl) is out of the clear region, then goto <3>;
else, save the current coordinates and direction:
{tll[i]=cl; tlk[i]=ck; tld[i]=cd; i=i+1;
cd = d[cl][ck];
goto <2>;
}
<3> Determine the length of the traced line:
i = i-1;
if (c[tll[i]][tlk[i]] < p_ac) then goto <4>;
else {i = i-1;
if (i > 0) repeat <3>;
else goto <4>;
}
<4> The traced line is {tll[j], tlk[j], tld[j]}, j=0,1,...,i;
return;
where di, p_ac, p_d_1, p_d_2 and p_d_3 are all predetermined parameters.
A contour tracing will be stopped if one or more of following conditions is true:
(1) Current point is out of the clear region.
(2) Average curvature of last several points is too high.
(3) The angle rotated from the initial direction is too large.
A similar algorithm is used to trace any normal line of a fingerprint; a normal line is defined as a digital curve on the image that is everywhere perpendicular to the local ridge direction. That algorithm is obtained from the one above by replacing the assignment cd=d[cl][ck] in step <2> with cd=d[cl][ck]+π/2 or cd=d[cl][ck]-π/2. The normal lines are used in a novel method, described in section 3.6 below, for macroscopically locating the center of a fingerprint.
Line tracing is a basic algorithm in this fingerprint processing system. It is important in locating the coordinate axes, extracting the shape features, detecting minutiae, etc., as will be described in the following sections. The line tracing operation described in this section is also used to trace shape lines from the center of a delta.
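A single step of the tracing loop in step <2> can be sketched as follows (illustrative Python; sign() is rendered with math.copysign and ctan with 1/tan, and the direction-correction and stop tests are omitted):

```python
import math

def step(cl, ck, dl, dk, cd):
    """Advance (cl, ck) exactly one point along direction cd, carrying the
    fractional offset (dk or dl) on the other axis; returns updated state."""
    if abs(math.sin(cd)) < abs(math.cos(cd)):
        cl += int(math.copysign(1, math.cos(cd)))  # unit step on the row axis
        dk += math.tan(cd)                          # accumulate column error
        if abs(dk) > 1:
            ck += int(math.copysign(1, dk))         # carry into the column
            dk -= math.copysign(1, dk)
    else:
        ck += int(math.copysign(1, math.sin(cd)))  # unit step on the column axis
        dl += 1 / math.tan(cd)                      # accumulate row error
        if abs(dl) > 1:
            cl += int(math.copysign(1, dl))         # carry into the row
            dl -= math.copysign(1, dl)
    return cl, ck, dl, dk
```

Repeating the step keeps the traced points within one point of the ideal straight line for a fixed cd, which is why the accumulated errors dk and dl must be carried rather than discarded.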
3.6 Macroscopic Method For Locating Coordinate Axes
The type or shape of a fingerprint, especially if it is characterized by a small whorl or loop, may be ambiguous due to distortion and noise produced when the impression is taken. Some impressions of a small whorl look like a loop or tent arch, while some impressions of a small loop look like a plain arch. Therefore, there must be some consistency in the rules for deciding the coordinate axes, i.e. the center and central orientation, of various types of fingerprints.
In section 3.4, there was described a method for
locating cores and deltas, except the 'a' core, of
fingerprints by analyzing all singularities of the
fingerprint. However, to find an 'a' core and to determine the center and central orientation of a plain arch that are consistent with other similar fingerprints, a method which involves analyzing the macroscopic structure of fingerprint ridges is needed.
Referring to FIGS.10 again, many fingerprint types or shapes, for example whorls, double loops, loops, plain arch, tent arch etc., are shown. In fact, the principal
distinctions among them are always at the central parts of the fingerprints, while the peripheries of fingerprints are all very similar. Generally, at the central area of any fingerprint, the ridges at the upper part will form a vault, the ridges at left and right sides will run off from the central part, and the ridges at the lower part will be always plain, as shown in FIG.11.
Referring to FIG. 12, any core of a fingerprint, except a 'u' core, is always at the most curved region of the ridges below the vault formed by the upper ridges. Generally, for any fingerprint except a nodding loop, the normal lines, starting from the upper part and going down, all concentrate at a central region containing the most curved ridges of the fingerprint, i.e. where there is an 'o' core or an 'n' core for a whorl or loop, or an 'a' core for a plain arch. Some of the normal lines may end at the central region, while others may curve markedly there, so the core can be located by analyzing the singularities near each end point and the most curved point on the normal lines. This method is very important for a plain arch, because it is usually difficult to determine, or locate, the center of a plain arch. The algorithm for macroscopically locating any core except a 'u' core of a fingerprint is:
Let l1 be the upper line border, l2 the lower line border, k1 the left column border and k2 the right column border.
<1> Initialize: let the initial start point (k0,l0) for tracing be
l0 = l1; k0 = (k1+k2)/2;
<2> Select the current start point (k0,l0):
l0 = l0+1;
if (c[l0][k0] > p_c3) then goto <2>;
dk = p_k·sin(d[l0][k0]);
if (|dk| > p_d4) then {k0 = k0-dk; goto <2>;}
else, the start point is (k0,l0);
d0 = d[l0][k0];
goto <3>;
where p_c3 and p_k are parameters.
<3> Finding a vault line.
A vault line can be considered as a combination of two contour lines, i.e. a right contour line and a left contour line, both starting from the middle of the print. So first the two contour lines are traced: starting from (k0,l0), they are obtained by tracing with directions d0 and d0+π respectively. These two contour lines are then combined into a vault line.
If the vault is not acceptable, i.e. if its length is too short, its chord is too slanted, or its curvature is too high, then goto <2>;
else let {vl[j], vk[j], vd[j], j=0,...,lv} be the vault line, where vl[j], vk[j], vd[j] are the y coordinate, the x coordinate and the local direction, respectively, at the jth point on the line, and lv is the length of the vault line. Let i=0; goto <4>;
<4> Trace a normal line by the previous algorithm with starting point (k0,l0) and direction d0, where
k0 = vk[i]; l0 = vl[i]; d0 = vd[i]+π/2;
if (i ≤ lv) then
{i = i+p_g;
goto <4>;
}
else goto <5>;
where p_d4 is a threshold limiting the local direction of the starting point, and p_g is the gap between the starting points of two successive normal lines.
<5> Determine the areas of concentration of the normal lines (see FIG. 12), then analyze the singularities in each area to find the 'o', 'n' and 'a' cores, or other points, by the forkedness, using the algorithm described in section 3.4.
In the above algorithm, p_c3 is a threshold and p_k is a constant. In the trend analysis of the above points, if the forkedness is 0 or 1, then the point is an 'o' or 'n' core, while if the forkedness is 2, then it is an 'a' core.
By singularity analysis, there may be more than one 'a' core in the central region of the fingerprint. The criterion for selecting the most representative one among them is the angle difference dd between the two trends (d1 and d2) of a singularity, i.e.
dd = min{|d2-d1|, 2·π-|d2-d1|}.
For example, if d1=π/4 and d2=3π/4, then dd=π/2; and if d1=2π and d2=3π/4, then dd=3π/4.
The 'a' core which has the smallest angle difference will be selected as the most representative one.
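The angle-difference criterion can be written directly (illustrative Python; the modulo merely extends the text's formula to arguments outside [0, 2π)):

```python
import math

def trend_gap(d1, d2):
    """Angular difference between two trend directions, measured on the
    circle, per dd = min{|d2-d1|, 2*pi - |d2-d1|}."""
    diff = abs(d2 - d1) % (2 * math.pi)
    return min(diff, 2 * math.pi - diff)
```

Both worked examples from the text follow: trend_gap(π/4, 3π/4) is π/2, and trend_gap(2π, 3π/4) is 3π/4.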
Referring to FIG. 13, there are two trends of an 'a' core, one towards the left and the other towards the right. The main trend of an 'a' core is defined as the trend at the side of the core where the gap between two adjacent contour lines is wider than at the other side. By this rule, the central orientation of a plain arch is consistent with loops.
After the trend analysis of each singularity and the macroscopic location of the 'a' core, the center and central orientation of a fingerprint can be decided sequentially as follows:
<1> If there is an 'o' core in the central region, then the pattern must be a whorl, the position of the center is the center of the core, the central orientation is π/2, i.e. pointing straight down, and 0° is horizontal to the right.
<2> If, in the central region, there is an 'n' core and a 'u' core, the pattern is a whorl; if there is an 'n' core, no 'u' core and more than one delta, the pattern is a whorl; and if there is an 'n' core, no 'u' core and not more than one delta, the pattern is a loop. The position of the center is the same as the center of the core, and the central orientation is the trend of the core.
<3> Else if there is a 'u' core, then the pattern is a nodding loop, the position of the center is the same as the center of the core, and the central orientation is the trend of the core.
<4> Else if there is an 'a' core in the central region, then the pattern is a plain arch, the position of the center is the same as the center of the core, and the central
orientation is the main trend of the core.
The center and central orientation of a whorl, a loop and an arch, decided by the macroscopic method, are shown in FIGS. 19, 20 and 21, respectively. These figures depict tracing lines generated to be perpendicular to the local ridge directions in the vicinity of the center of the fingerprint pattern.
3.7 Shape Features And Classification
All of the shape lines of a fingerprint can be obtained by the above algorithm, starting from the center of the delta of the fingerprint with each trend as the initial direction.
Line tracing is performed as described in Section 3.5, above. Various shape lines of fingerprints, much as shown in FIGS.10, can serve to describe the shapes of fingerprints accurately. According to the structural relations of the shape lines, the fingerprints can be classified into 18 types each with a respective topological structure. A finer classification may be based on the shape features defined below.
FIG. 14 illustrates a technique for extracting the shape features of a left loop, where C is the center, P_o is the delta, and sl_1, sl_2 and sl_3 are shape lines starting from P_o. P_1,...,P_7 are points selected on sl_2 or sl_3. The algorithm for extracting the shape features of a loop is as follows:
<1> Determine center C=(kc,lc) by the algorithm in Section 3.6;
<2> Determine delta centered at P_{o} =(k_{o},l_{o}) by the algorithm in Section 3.4;
<3> Trace three shape lines sl_1, sl_2 and sl_3 starting from P_o, with the three trends of the delta as initial directions respectively.
<4> Select seven points P_i=(k_i,l_i), (i=1,...,7) on sl_2 and sl_3, such that
dv(P_iC) = d_o+i·π/4, where d_o=dv(P_oC). Because P_6 is always very far from C, or beyond the border of the image, no feature is defined by referring to P_6.
<5> Calculate the distances between C and P_i, i.e. |P_iC|.
<6> Count ridges between C and P_o:
Let {(x_0,y_0), ..., (x_n,y_n)} be the straight line from C to P_o, where (x_0,y_0)=(kc,lc) and (x_n,y_n)=(k_o,l_o); and let
g[i] = f[y_i][x_i], (i=0,...,n);
then the ridge count is defined as
rc(C,P_o) = #{g[i] | g[i-1]>g[i]<g[i+1], (i=1,...,n-1)}.
<7> A total of 18 shape features are defined: 7 distances |P_iC|, one ridge count rc(C,P_o), and 10 direction values referred to d_o, including the central orientation, the 3 trends of P_o and the 6 local directions d(P_i), (i=1,...,5,7).
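The ridge count of step <6> can be sketched as follows (illustrative Python; it counts strict local minima of the gray levels sampled along the line, each dark ridge crossed producing one minimum):

```python
def ridge_count(g):
    """Count indices i with g[i-1] > g[i] < g[i+1] along the sampled
    straight line from C to P_o, per the definition of rc(C, P_o)."""
    return sum(1 for i in range(1, len(g) - 1)
               if g[i - 1] > g[i] < g[i + 1])
```

In practice the gray-level samples would first be smoothed, since the strict inequalities make the raw count sensitive to pixel noise.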
The 18 shape features of a right loop are extracted in a manner similar to the left loop. A whorl or double loop can be considered to be composed of two loops, i.e. one left loop formed by the left delta and the center, and one right loop formed by the right delta and the center; so it has both the 18 left-loop shape features and the 18 right-loop shape features, i.e. 36 shape features in all. For a tent arch, the features referring to point P_7 are not extracted, because it may appear at either left or right. However, for consistency between whorls and other shapes, a total of 36 shape features are assumed for any fingerprint. If some of the 36 shape features cannot be obtained, due to noise, imperfections in, or the shape of, the fingerprint, those features are each assigned a value of -1. There is no meaningful shape feature for a plain arch; in other words, all shape features of a plain arch are equal to -1.
According to the structural relations of position, surroundings, etc. among the shape lines, fingerprints are classified into 18 classes, each with a respectively different topological structure, as shown in FIGS. 10, which show 11 whorls, 4 loops, one accidental, one tent arch and one plain arch. Every class of whorl, loop and tent arch can be further classified according to the shape features.

3.8 Global Features And Global Difference
The shape features describing the pattern of a fingerprint are all based on both the center and the delta, so they may be affected by imperfections of the fingerprint or by noise distorting the center, delta or shape lines. In particular, no shape feature is defined for a plain arch, so for the practicality and consistency of the fingerprint system, features describing the pattern of a partial, noisy or plain-arch fingerprint should also be considered.
One of the most important parts in this invention is a method for defining and extracting the global features of various fingerprints to represent their pattern naturally and consistently by referring to the local ridge directions.
Generally, the global features of a fingerprint provide a basic method for representing the ridge direction array of the fingerprint. These features must be obtainable for any kind of fingerprint, and be effective in the pattern matching of fingerprints.
The simplest method for defining global features is to select some points on the direction array and take the local ridge direction at each point as a feature. If the number of points is large enough, then the accuracy of the representation will be fine enough. In particular, as shown in FIGS. 15, the points can be selected to form a circular, or polar, array or a rectangular array.
In the first method, the points can be selected on several circles with a common center. Referring to FIG. 15A, C=(k_0,l_0) is the center of a fingerprint and d_0 is its central orientation, or direction. There are n circles O_i (i=0,...,n−1) with common center C and different radii r_i (i=0,...,n−1). There are m_i selected points P_ij=(k_ij,l_ij) (j=0,...,m_i−1) on O_i, segmenting each circle equally (i=0,...,n−1). The global features gf of a fingerprint are defined as the set:
gf = {gff((k_ij,l_ij)) | j=0,...,m_i−1; i=0,...,n−1};
where
k_ij = int(k_0 + r_i·cos(d_0 + j·2·π/m_i) + 0.5);
l_ij = int(l_0 + r_i·sin(d_0 + j·2·π/m_i) + 0.5);
          255,                         if c(P) > p_c4;
gff(P) = {
          int(d(P)·p_π/π+0.5) % p_π,   elsewhere;
i.e., gff(P) is equal to 255 when the ridge directions around P are not clear or the curvature c(P) is greater than a predetermined threshold p_c4; elsewhere gff(P) represents the direction d(P) in one byte with a value between 0 and p_π−1. The parameter p_π is selected to transform the range of angular values from [0,π] to [0,p_π] for storing a direction in one byte while reserving enough accuracy.
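The circular sampling and the byte encoding gff() described above can be sketched as follows. The function name, the default parameter values (p_pi=250, p_c4=0.8) and the callback-style access to the direction and curvature fields are illustrative assumptions, not part of the disclosed embodiment:

```python
import math

def sample_circular_features(direction, curvature, center, d0, radii, m,
                             p_pi=250, p_c4=0.8):
    """Sample local ridge directions on concentric circles around `center`.

    `direction(k, l)` returns the local ridge direction in [0, pi);
    `curvature(k, l)` returns the local curvature, or None where the
    ridge directions are unclear. Thresholds are placeholder values.
    """
    k0, l0 = center
    gf = []
    for r in radii:
        row = []
        for j in range(m):
            a = d0 + j * 2 * math.pi / m
            k = int(k0 + r * math.cos(a) + 0.5)   # k_ij
            l = int(l0 + r * math.sin(a) + 0.5)   # l_ij
            c = curvature(k, l)
            if c is None or c > p_c4:
                row.append(255)                    # unclear or high curvature
            else:
                d = direction(k, l)
                row.append(int(d * p_pi / math.pi + 0.5) % p_pi)
        gf.append(row)
    return gf
```

With n=9 radii and m=64 points per circle this yields the 576-byte feature array of the embodiment.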
In an embodiment, the parameter values are n=9 and m_i=64 (i=0,...,n−1), so a total of 9·64=576 points are selected, and the global features of a fingerprint are composed of 576 bytes. If the number of points selected on each circle O_i is the same for every circle, then gf can simply be stored in an array of bytes:
gf = {gf[i][j] (=gff(P_ij)) | j=0,...,m−1; i=0,...,n−1};
For example, the global features of an arch, a loop and a whorl are shown in FIGS. 22, 23 and 24, respectively, where the center of each pattern is at the common center of the concentric circles and the central orientation of each pattern is represented by a short line extending from the center. There are 9 circles in each image, and each circle is composed of 32 points (for clearer display than with 64 points). For each point selected, if it is not in the background and its curvature is not high, then the local direction is represented by a line centered on the selected point; the remaining selected points are each represented by a dot, as is particularly apparent at the bottom and lower portions of the left-hand and right-hand edges of FIG. 22.
The difference between the shapes of two fingerprints will always be reflected in their direction arrays, and therefore also in the global features which represent those direction arrays. For this purpose, an important measurement, called the global difference between two sets of global features, is necessary.
In the case of Equation (9), the global difference gd1 between two fingerprints with global features gf1 and gf2 is defined as:
gd1(gf1,gf2) = min{Σ(f_dg(d1−gf1[i][j], d2−gf2[i][(j+r)%m])/#M1(r), (i,j), M1(r)) | r=0,...,m−1; #M1(r)>0};
where d1 and d2 are the central orientations of the two fingerprints respectively, and the set M1(r) means
M1(r) = {(i,j) | gf1[i][j]<p_π & gf2[i][(j+r)%m]<p_π};
the function f_dg() means
f_dg(x,y) = f_i(min(|x−y| % p_π, p_π − (|x−y| % p_π)));
where f_i(z) is an increasing function of z. In the embodiment, f_i(z) = z². So the global difference between two fingerprints is calculated by matching their global features under various rotations r to find the minimum difference.
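Under the definition above, gd1 amounts to an exhaustive search over the rotation offset r. The sketch below is an illustrative reading of the formula with f_i(z)=z², treating d1 and d2 as already expressed in the same encoded units as the features (an assumption):

```python
def global_difference(gf1, gf2, d1=0, d2=0, p_pi=250):
    """Minimum average squared angular difference over all rotations r (gd1).

    Entries >= p_pi (e.g. 255) mark undefined directions and are skipped.
    Returns None if no rotation yields any comparable pair.
    """
    n, m = len(gf1), len(gf1[0])
    best = None
    for r in range(m):
        total, count = 0, 0
        for i in range(n):
            for j in range(m):
                a, b = gf1[i][j], gf2[i][(j + r) % m]
                if a < p_pi and b < p_pi:          # (i, j) in M1(r)
                    diff = abs((a - d1) - (b - d2)) % p_pi
                    diff = min(diff, p_pi - diff)  # angular wrap-around
                    total += diff * diff           # f_i(z) = z^2
                    count += 1
        if count > 0:
            avg = total / count
            if best is None or avg < best:
                best = avg
    return best
```

A rotated copy of the same feature array yields a difference of zero at the matching offset.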
In the second method, the points can be selected on a grid with n rows and m columns, as in FIG. 15B, so the global features can be stored in an array gf[][] such that:
gf[i][j] = d[y_00 + i·dy][x_00 + j·dx];
i=0,...,n−1; j=0,...,m−1;
where x_00 and y_00 are the coordinates of the upper left corner point of the array, and dy and dx are the row and column increments respectively.
The global difference gd2 between two fingerprints with global features gf1 and gf2 is defined as:
gd2(gf1,gf2) = min{Σ(f_dg(d1−gf1[i][j], d2−gf2[(i+l)%n][(j+k)%m])/#M2(l,k), (i,j), M2(l,k)) | l=0,...,n−1; k=0,...,m−1; #M2(l,k)>0};
where the set M2(l,k) contains, analogously to M1(r), the pairs (i,j) for which both features compared are defined.
The global difference can be used for finely classifying fingerprints in a database, or selecting similar fingerprints in a database to reduce the difficulty of minutia matching during a searching procedure.
3.9 Detecting Minutia From Gray Level Image
Minutiae are very important traditional features of a fingerprint, and are used in final verification of identity of two fingerprints. Usually minutiae are described with respect to the pattern of fingerprint ridges. There are many types of minutiae on a fingerprint, for example as shown in FIGS. 16, endings (a), bifurcations (b), islands (c), eyes (d), bridges (e), etc. In brief, minutiae are singularities of ridges.
However, ridges always coexist with valleys on a fingerprint, and each feature or minutia of the ridges always corresponds to a change in the valleys, so that minutiae can be described in terms of valleys, too. In general, an ending of a ridge is a bifurcation of valleys while a bifurcation of ridges is an ending of a valley; an island of a ridge is an eye of a valley and an eye of a ridge is an island of a valley. Referring to FIGS. 17, exceptions may appear at cores and deltas. The description of a minutia that is just at a core or delta in terms of ridges is different from its description in terms of valleys. However, the descriptions of minutiae should be consistent, by being all in terms of valleys or all in terms of ridges.
For automatic detection of minutiae, the novel method provided here is based upon tracing the valleys rather than the ordinary method which is based upon binarizing, thinning and smoothing the ridges. Generally in a fingerprint image, the quality of valleys is much better than that of ridges, primarily because of the following reasons: firstly, there are no sweat glands in valleys; secondly, the widths of valleys are more even than those of ridges; and thirdly, the gray levels in valleys are more even than in ridges. Although there will be incipient ridges in the valleys of some
fingerprints that may affect valley tracing, all ridges of every fingerprint have sweat glands that may affect ridge tracing. So in general, the result of valley tracing should be much better than ridge tracing.
The algorithm for tracing a valley with an initial point (k0,l0) and direction d0 is similar to tracing a line, except that it uses a key technique that keeps the step points in the valley.
Let f[l][k] be an image array. Its element f[l][k] equals the gray scale value of a point (k,l); it will be set to −1 after the point has been traced. ag is the summation of the gray scale values of the last p_l points in a tracing line. The array tlg[] is used to store the gray scale values of traced points. The definitions of the other variables are the same as for the line-tracing algorithm in Section 3.5.
<1> Initialize the variables.
dl=dk=dd=0;
cl=l0; ck=k0; cd=d0;
i=1; ac=0; ag=0;
tll[0]=l0; tlk[0]=k0; tld[0]=d0;
tlg[0]=f[l0][k0];
goto <2>.
<2> Step to the next point.
Accumulate the curvatures of every point in the valley:
ac=ac+c[cl][ck];
if (i>p_l) then
{ac=ac−c[tll[i−p_l]][tlk[i−p_l]];
if (ac > p_ac·p_l) then goto <3>;
}
Accumulate the gray scale values of every point in the valley:
ag=ag+f[cl][ck];
if (i > p_l) then
{ag=ag−tlg[i−p_l];
if (ag < p_ag·p_l) then goto <3>;
}
if (cd−tld[i−1] ≤ −π) then cd = cd+π;
if (cd−tld[i−1] ≥ π) then cd = cd−π;
if (|cd−d0| > p_d3) then goto <3>,
where p_d3 is the limitation for changing the current direction cd per step in tracing.
dd=cd−tld[i−1];
if (|dd| > p_d2) then goto <3>,
where p_d2 is the limitation of the accumulative difference of direction.
if (dd > p_d1) then
{dd=dd−p_d1;
cd=cd−dd;
}
else if (dd < −p_d1) then
{dd=dd+p_d1;
cd=cd−dd;
}
else dd=0;
where p_d1 means the maximum value for correcting the direction. The increments of cl and ck depend on sin(cd) and cos(cd):
if (|sin(cd)| < |cos(cd)|) then
{cl=cl+sign(cos(cd));
dk=dk+tan(cd);
if (|dk| ≥ 1) then
{ck=ck+sign(dk);
dk=dk−sign(dk);
}
}
else
{ck=ck+sign(sin(cd));
dl=dl+cot(cd);
if (|dl| ≥ 1) then
{cl=cl+sign(dl);
dl=dl−sign(dl);
}
}
if any point (x,y) of {(ck,cl), (kl,ll), (kr,lr)} is out of the clear region or f[y][x]<0, then goto <3>;
else save the current coordinate, direction and gray level values:
{tll[i]=cl; tlk[i]=ck; tld[i]=cd; i=i+1; f[cl][ck]=−1;
cd=d[cl][ck];
dd=dd+(f[ll][kl]−f[lr][kr])·p_ga;
goto <2>;
}
where (kl,ll) and (kr,lr), shown in FIGS. 18, are called the left point and right point of the current point (ck,cl) respectively. Both of them are 4-neighboring points of the current point and 8-neighboring points of the previous point (tlk[i−1],tll[i−1]). p_ga is a predetermined parameter for modifying the direction by the difference of gray scale values.
<3> Determine the length of the traced valley.
i=i−1;
if ((c[tll[i]][tlk[i]]<p_ac) & (tlg[i]>p_ag)) then goto <4>,
else {f[tll[i]][tlk[i]]=tlg[i];
i=i−1;
if (i>0) then repeat <3>,
else goto <4>.
}
<4> The traced valley is {tll[j], tlk[j], tld[j] | j=0,1,...,i};
return.
Where p_l is a constant, and p_ac, p_d1 and p_d2 are all thresholds.
Referring to FIGS. 18, both the left point (kl,ll) and the right point (kr,lr) are 4-neighboring to the current point (ck,cl) and 8-neighboring to the previous point (tlk[i−1],tll[i−1]). (ck,cl) may be replaced by its 4-neighboring point (kl,ll) or (kr,lr) according to their gray levels. This algorithm is similar to that for line tracing, with gray level as an additional factor.
In detail, the tracing line will first step from the prior point to the current point temporarily; then the gray scale values of the two 4-neighboring points of the current point, i.e. (kl,ll) and (kr,lr), are considered. A point is selected as the valley point if its gray scale value is higher than or equal to those of the other points in the neighborhood.
A valley tracing will be stopped if one or more of following conditions is true:
(1) The current point is out of the clear region.
(2) The average curvature ac of the last p_l points in the tracing is very high, i.e. greater than p_ac.
(3) Any previous valley trace is touched.
(4) The average gray level ag of the last p_l points in the tracing is very low, i.e. less than p_ag.
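Stopping conditions (2) and (4) amount to moving-window tests over the last p_l traced points; the running sums ac and ag can be maintained incrementally, as in the following sketch (the class name and the default threshold values are assumptions, not the disclosed constants):

```python
from collections import deque

class TraceMonitor:
    """Running curvature/gray-level windows for stopping tests (2) and (4).

    An illustrative sketch: p_l is the window length, p_ac the average
    curvature limit, p_ag the average gray-level floor (placeholder values).
    """
    def __init__(self, p_l=9, p_ac=0.8, p_ag=40):
        self.p_l, self.p_ac, self.p_ag = p_l, p_ac, p_ag
        self.curv = deque(maxlen=p_l)   # last p_l curvature values (ac)
        self.gray = deque(maxlen=p_l)   # last p_l gray-scale values (ag)

    def step(self, c, g):
        """Record one traced point; return True if the trace must stop."""
        self.curv.append(c)
        self.gray.append(g)
        if len(self.curv) < self.p_l:
            return False                        # window not yet full
        if sum(self.curv) > self.p_ac * self.p_l:
            return True                         # condition (2): too curved
        if sum(self.gray) < self.p_ag * self.p_l:
            return True                         # condition (4): valley ends
        return False
```

The deque with `maxlen=p_l` discards the oldest point automatically, matching the subtraction of tlg[i−p_l] and c[...] in the algorithm above.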
The algorithm for detecting minutiae from a gray level image by valley tracing is as follows:
<1> Let gp=gap2;
<2> The start point P for valley tracing should satisfy each of the following conditions:
(1) P is a maximum point in a 3×3 neighborhood of the gray scale image F and f(P)>p_f. A maximum point in a gray scale image is one whose gray scale value is not less than that of each of its 8 neighboring points.
(2) P is in the clear region of ridges and c(P)<p_c. This means the curvature at P is smaller than p_c.
(3) There is no traced line at the directions d(P)+π/2 and d(P)−π/2 within distance gp.
For every such point P, the valley is traced in the two initial directions d(P) and d(P)+π, respectively. The minutiae are detected as a result of the conditions for stopping a trace: if the trace is stopped due to condition (3), then a valley bifurcation has been found, while if the trace is stopped due to condition (4), then a valley ending has been found.
<3> Connect any two terminals of traces (a terminal being the start or end of a trace) if:
(1) The two last directions of the traces are opposite to one another;
(2) The positions of two terminals of traces are very close;
(3) The average gray level between them is higher than p_ag.
<4> gp=gp−1; if gp>gap1, then goto <2>;
where the parameters p_f, t, gap1 and gap2 are all experimentally determined constants. Each minutia found is described by three attributes, i.e. x, y and θ. The coordinates x and y are the same as the position of a trace terminal, and the direction θ is equal to whichever of d[y][x] and d[y][x]+π is closer to the last direction in the trace.
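The terminal-connection test of step <3> can be sketched as follows. The distance and angle thresholds are illustrative assumptions, and the average-gray-level condition (3) is omitted here because it needs access to the image between the terminals:

```python
import math

def should_connect(t1, t2, p_dist=6.0, p_dang=0.5):
    """Decide whether two trace terminals should be joined.

    Each terminal is an (x, y, theta) tuple, theta being the last trace
    direction in radians. The terminals qualify when they are close
    together and their last directions are nearly opposite.
    Thresholds p_dist (pixels) and p_dang (radians) are placeholders.
    """
    (x1, y1, a1), (x2, y2, a2) = t1, t2
    close = math.hypot(x1 - x2, y1 - y2) < p_dist
    # Opposite directions: angular difference near pi (mod 2*pi).
    d = abs((a1 - a2) % (2 * math.pi) - math.pi)
    opposite = d < p_dang
    return close and opposite
```

Joining such terminal pairs removes spurious breaks in a valley before the surviving terminals are reported as minutiae.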
An example of extracting minutiae by tracing valleys on a gray level image of a fingerprint is shown in FIG. 25, where the fingerprint is the same as in FIGS. 21 and 22 and the gray level is reversed.
3.10 Quality Level And Vector
The features of a fingerprint may be affected by many factors, for example the noise level, the effective area of the clear region, the position of the center, the number of minutiae, and so on. Sometimes the factors are due to the quality of the finger itself, while at other times they are due to the impression or the input device. A quality level q_l should be provided after image processing in order to make possible an automated or operator-controlled decision as to whether to accept, reject, or re-input the fingerprint image, or if possible to take a new fingerprint impression. The quality level can be described in detail by a quality vector:
q_v=(q_n, q_a, q_p, q_m, q_h);
where each factor is calculated as follows:
<1> The noise level q_n refers to the average curvature in the whole clear region, i.e.
q_n=f_q_n(a_c);
where
a_c=Σ(c(X), X, {X | c(X)<1})/#{X | c(X)<1};
and f_q_n(z) is an increasing function of z in the range 0 to 1. In the embodiment f_q_n is defined so that
q_n = 0, when a_c ≤ c1;
q_n = 1, when a_c > c1 & a_c ≤ c2;
q_n = 2, when a_c > c2 & a_c ≤ c3;
q_n = 3, when a_c > c3;
where c1, c2, and c3 are all predetermined experimental values. If the average curvature of an image is very small, then q_n=0, i.e. the image's quality is good; while if the average curvature is large, then the noise in the image will affect the processing, and q_n will equal 1, 2 or 3 depending on the noise.
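The quantization of the average curvature a_c into q_n can be sketched directly; the threshold values c1, c2, c3 below are placeholders, since the patent leaves them as predetermined experimental values:

```python
def noise_level(a_c, c1=0.2, c2=0.4, c3=0.6):
    """Quantize average curvature a_c into q_n in {0, 1, 2, 3}.

    Thresholds c1 < c2 < c3 are illustrative assumptions; in practice
    they are determined experimentally.
    """
    if a_c <= c1:
        return 0          # good quality
    if a_c <= c2:
        return 1
    if a_c <= c3:
        return 2
    return 3              # heavy noise
```

Higher average curvature in the clear region indicates noisier directions and thus a larger q_n.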
<2> The effective area q_a represents the number of global features which define a direction; in the case of Equation (9),
q_a=f_q_a(#{x | (x in gf) & (x<p_π)});
where the function f_q_a(z) is increasing for z between 0 and 255, to ensure that q_a is 0, 1, 2 or 3. When q_a equals 0, the quality of the image is good; otherwise, the larger the value, the worse the quality.
<3> q_p depends on whether the position of the center (k0,l0) is in the central region CR of the image, i.e.
q_p = (is (k0,l0) in CR) ? 0 : 1;
In the embodiment,
CR = {(k,l) | (L/4 < l < 2·L/3) & (K/3 < k < 2·K/3)};
<4> q_m depends on the number nm and the average quality a_mq of the minutiae, i.e.
       0,             if nm < p_nm;
q_m = {
       f_q_m(a_mq),   otherwise;
where p_nm is a predetermined threshold and f_q_m(z) is an increasing function of z. In the embodiment, p_nm=18 and f_q_m(z)=z/4.
<5> q_h means a help level that represents the reliability of the center C via the average curvature around C:
q_h = f_q_h(Σ(c(X), X, NC)/#NC);
where
NC = {X | |X−C| ≥ p_r1 & |X−C| < p_r2};
and f_q_h(z) is an increasing function of z.
Finally,
q_l = q_n + q_a + q_p + q_m + q_h;
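The center-position factor q_p and the final summation can be sketched as follows, using the central region CR of the embodiment for a K-by-L image (the function names are assumptions):

```python
def center_position_quality(k0, l0, K, L):
    """q_p: 0 if the detected center (k0, l0) lies in the central region CR
    of a K-by-L image, else 1. Region bounds follow the embodiment:
    CR = {(k, l) | L/4 < l < 2L/3 and K/3 < k < 2K/3}."""
    in_cr = (L / 4 < l0 < 2 * L / 3) and (K / 3 < k0 < 2 * K / 3)
    return 0 if in_cr else 1

def quality_level(q_n, q_a, q_p, q_m, q_h):
    """Overall quality level q_l: the sum of the quality vector q_v."""
    return q_n + q_a + q_p + q_m + q_h
```

A small q_l accepts the image automatically; a large q_l suggests rejecting or re-inputting the fingerprint.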
It is anticipated that the invention will be implemented by means of a general purpose digital computer system which is programmed in accordance with the algorithms described above, is provided with an appropriate graphics input device capable of scanning a fingerprint image and inputting gray level image point brightness values, and displays, prints or writes the results of the image processing procedures on an output device.
In an embodiment of the above method, the following parameters may have values in the ranges specified below:
Section   Parameter   Range
3.2       r           [5, 30]
          p_v         [4·r·r, 60·r·r]
3.3       p_n1        [1, 5]
          p_c_p       [0.5, 1]
3.4       p_c1        [0.6, 1]
          p_ω_0       [1.5, 2.5]
          p_ω_1       [0.5, 1.5]
          p_ω_2       [0.5, 1.5]
          p_ω_3       [0.5, 1.5]
          p_c2        [0.6, 1]
          p_n2        [1, 10]
          p_dd        [0.1, 0.6]
3.5       p_d1        [0.2, 0.8]
          p_d2        [0.5, 1.5]
          p_d3        greater than 0
          p_l         [5, 15]
          p_ac        [0.6, 1]
3.6       p_c3        [0.6, 1]
          p_k         [1, 10]
          p_d4        [10, 100]
3.8       p_c4        [0.6, 1]
3.9       p_ag        [0, 7]
          p_ga        [0.01, 0.1]
3.10      p_nm        [8, 20]
          p_l_2       [5, 20]
          p_r1        [1, 10]
          p_r2        [10, 20]
The invention thus provides a method for calculating the global difference between two stripe patterns by means of their global features used in finer classification and search.
While the description above refers to particular
embodiments of the present invention, it will be understood that many modifications may be made without departing from the spirit thereof. The accompanying claims are intended to cover such modifications as would fall within the true scope and spirit of the present invention.
The presently disclosed embodiments are therefore to be considered in all respects as illustrative and not
restrictive, the scope of the invention being indicated by the appended claims, rather than the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.