CA2296353A1 - Method for determining an identification code from fingerprint images - Google Patents

Method for determining an identification code from fingerprint images

Info

Publication number
CA2296353A1
CA2296353A1 CA002296353A CA2296353A CA2296353A1 CA 2296353 A1 CA2296353 A1 CA 2296353A1 CA 002296353 A CA002296353 A CA 002296353A CA 2296353 A CA2296353 A CA 2296353A CA 2296353 A1 CA2296353 A1 CA 2296353A1
Authority
CA
Canada
Prior art keywords
code
image
determined
bifurcations
features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002296353A
Other languages
French (fr)
Inventor
Rudolf Hauke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dormakaba Schweiz AG
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2296353A1 publication Critical patent/CA2296353A1/en
Abandoned legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for determining an identification code from fingerprint images, wherein at least two of the following independent characteristics: line spacing L, gradients G, curvatures K and bifurcations B are detected on the image area and a frequency distribution H is determined. On the basis of said frequency distribution, the characteristic values (mean value, variance, maximum value and classification value) and the characteristic values (Ci) of the selected bifurcations are determined, which form the vectorial components of the identification code C. The inventive method can be used to establish a short identification code which is relatively easy to determine and displays high recognition reliability for various applications.

Description

METHOD FOR DETERMINATION OF AN IDENTIFICATION CODE FROM FINGERPRINTS

The invention concerns a method and a device for determination of identification codes from fingerprints or from digital gray scale value images respectively, according to the generic terms of claims 1 and 25. These serve the automatic identification of fingerprints of a person in real time or from an image document by means of an electronic image recording device, e.g. a video camera. With this a code can be generated from fingerprint images in digital form, with which code the appertaining persons can be identified. Mostly the finger is pressed onto a suitable optical device for creating contrast, which device generates an image of the skin lines by means of prisms and the principle of impeded total reflection. Skin line images can, however, also be directly recorded electronically, e.g. capacitively. Up to now, minutiae of all kinds (bifurcations, ridge ends, inclusions, islands, interruptions, junctions, trifurcations etc.) have been used for the characterization of a fingerprint. This is especially valid for the widely used forensic applications, which require a very accurate analysis of these minutiae characteristics (position and kind of minutiae as well as their orientation) and which in turn require a correspondingly large amount of memory and complicated programs.
Furthermore the characterization by means of known minutiae methods has a series of additional drawbacks: on the one hand, errors and inaccuracies of the optical image recording can lead to confusion of the minutiae, i.e. an error of the image recording generates apparent minutiae which do not really exist, and on the other hand existing minutiae cannot be recognized due to poor image recording.
Additionally the actual finger line image of a person can contain minutiae errors, e.g. due to injury of the skin, pollution or poor recordability of the skin lines, such that e.g. interruptions appear in the image. Due to e.g. a simple injury in the form of a cut, a large set of different minutiae, i.e. apparent ridge ends, can be formed along the edge of the cut. Thus the recorded minutiae image of a person is not always identical, which again requires complicated evaluation programs. For these reasons the determination of fingerprints by means of known minutiae methods requires much effort concerning calculation and memory.
Other known methods for determination of identification codes from distances between ridges or from gradients have not yet been capable of a sufficient security of recognition with short codes.
On the other hand there is considerable demand for the identification and verification of persons with simple means for use in a large number of everyday applications, e.g. selective access, payment by means of credit cards, identification for legal or social purposes, e.g. passport control, or inspection of personal documents, e.g. for social programs etc. For all these non-forensic applications it would be necessary to find a simpler and more secure biometric identification code which requires very little memory and thus can also be used for inexpensive data carriers: especially for inexpensive magnetic cards, for documents with one- or two-dimensional bar-codes or on other inexpensive data carriers, especially also for chips with EEPROM memories of smart cards and contactless data carrier systems.
This is also absolutely necessary for all applications which concern the handling of business dealings with relatively low sums of money, e.g. in the field of everyday consumer goods or vending machines with a relatively large number of users.
For this kind of application the identification carrier must be very inexpensive, i.e. it must be applicable securely with a small memory capacity and a relatively simple evaluation in the small local computers of testing stations.
The object of the present invention is thus to create a method with a better ratio of necessary code length and computing effort to the precision of determination, and especially to generate a shorter and simpler code at a sufficiently high precision of determination, which code can consist of less than 100 bytes, e.g. only 36 bytes or even less. The code is also to be less sensitive concerning image errors and recording errors as well as concerning the choice of the image section. Furthermore the generation of this code is also to be possible in local stations with simple, inexpensive computers.
This object is inventively achieved by means of a method according to claim 1 and a device according to claim 25.
By using at least two of the independent or orthogonal features ridge distance L, gradient G, curvature K and bifurcation B, a multiplication of the determination precision of the two features is substantially achieved, and with the determination of compressed characteristic values from the frequency distributions of the features, code features are determined in a simple manner which is additionally less dependent on recording and image errors. With the inventive method it is also possible to achieve a higher determination precision on the basis of few characteristic values, or very short identification codes for simple applications respectively; if required, the precision can be increased by increasing the number of features or the code length. Advantageous further developments of the invention are stated in the dependent claims.
In the following description of the different method steps as well as in connection with figures and examples the invention is described further, whereby:

Fig. 1 shows a determination of the ridge distances L in x-direction
Fig. 2 shows a determination of ridge distances from gray scale value images
Fig. 3 shows a determination of L in x- and y-direction
Fig. 4 shows frequency distributions HL as a function of distance length L
Fig. 5 shows a classification with determination of class values
Fig. 6 shows a determination of gradients G
Fig. 7 shows a determination of curvatures K
Fig. 8 shows a representation of class values of gradient distributions HG of different images
Figs. 9 to 11 show different fingerprint images I
Fig. 12 shows a determination of bifurcations B in a skeletated image
Fig. 13 shows a determination of bifurcation distances LB
Fig. 14 shows a determination of bifurcation areas F
Fig. 15 shows a further example of a determination of bifurcations B
Fig. 16 shows a segmentation and the covering of an image with a grid
Fig. 17 shows a representation of possible image sections
Fig. 18 shows a diagrammatic representation of the inventive method
Fig. 19a, b show examples of defined bifurcations
Fig. 20 shows an inventive device for carrying out the method
Fig. 21 shows an illustration concerning the classification of bifurcations.
For image determination a digital gray scale value image of a fingerprint with a suitable grid of e.g. 512 x 512 pixels is recorded by means of known methods (Figs. 9 to 11). This digital image can either be used directly for determination of the characteristics or a finger line image can be created from it by means of image pre-processing, especially by means of binarization and of skeletation (Figs. 12 and 15).
Different characteristics are then drawn from this image, once created, and are compressed into an identification code C in further processing steps, which code corresponds to the desired application concerning its length and determination precision.
Fingerprint features

The following four substantially independent or respectively orthogonal features (characteristics) are used and their frequency distributions are determined:

L ridge distances
G gradients
K curvatures
B bifurcations

Feature ridge distances L
The ridge distances (distance lengths) L are, as illustrated in Figs. 1 and 2, defined as distances between two succeeding finger lines (ridges) 5, whereby the finger line has a width 0, i.e. corresponding to the distances between the middles of two successive finger lines which are recorded in the direction of a projecting ray X. For pre-processed, skeletated images this corresponds to a ridge distance (with ridge width 0, Fig. 1), while for digital gray scale value images according to Fig. 2 the distance L between two successive finger lines is to be calculated as follows:

L = L2 + 1/2 x (L1 + L3),

where L1 and L3 correspond to the finger line widths with a suitably chosen gray scale threshold value 10. With this threshold value 10 a binarization can also be carried out, by terming gray scale values De above the threshold value as 1 and gray scale values below the threshold value as 0. For elimination of errors of individual pixels it can be set as a condition that the ridge widths L1, L3 and also the distance L2 must extend over e.g. at least three successive pixels. The determination of the occurring distances Lx1, Lx2 etc. in the direction of a recording ray, e.g. the abscissa x, is carried out as shown in Fig. 1. The complete image is then recorded by variation of y in suitable steps dy according to Fig. 3, such that the quantity of all recorded occurring distances HL,x (x, y) can be plotted as a function of the distance lengths Lx according to Fig. 4. This shows the frequency distribution or histogram of all occurring distances Lx in x-direction over the recorded image region.
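As an illustration (not part of the patent text), the following Python sketch shows how the distances L = L2 + 1/2 x (L1 + L3) could be extracted along one projection ray from a row of gray scale values; the function name, the threshold handling and the minimum run length of three pixels are assumptions based on the description above.

import numpy as np

def ridge_distances_along_row(row, threshold, min_run=3):
    """Extract ridge distances L = L2 + (L1 + L3) / 2 along one scan line.

    row       -- 1-D array of gray scale values along the projection ray
    threshold -- gray scale threshold value; values above it count as ridge (1)
    min_run   -- minimum run length in pixels, to suppress single-pixel errors
    """
    binary = (row > threshold).astype(np.int8)           # binarization with threshold
    # find runs of equal values (ridge runs and valley runs)
    change = np.flatnonzero(np.diff(binary)) + 1
    starts = np.concatenate(([0], change))
    ends = np.concatenate((change, [len(binary)]))
    runs = [(binary[s], e - s) for s, e in zip(starts, ends) if e - s >= min_run]

    distances = []
    # a distance needs ridge (L1), valley (L2), ridge (L3) in succession
    for (v1, l1), (v2, l2), (v3, l3) in zip(runs, runs[1:], runs[2:]):
        if v1 == 1 and v2 == 0 and v3 == 1:
            distances.append(l2 + 0.5 * (l1 + l3))        # L = L2 + 1/2 (L1 + L3)
    return distances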
In analogy to this the distances Ly (x, y) are determined in the orthogonal direction (i.e. in the direction of the ordinate y) and recorded over the whole image region by means of variation of x with the chosen distances dx. This leads to a histogram HLy, again over the whole image region.
For determination of the distance lengths Lx and Ly in x-direction and in y-direction the grid must be orientated in a defined manner: here with the y-axis corresponding to the longitudinal axis of the finger in order to achieve defined histograms.
The histograms HLx and HLy are completely different as seen in Fig. 4.
Fig. 5 shows a classification of a histogram HLx, whereby for each class He = 1, 2, 3..., e.g. the mean values Hq, maximum values Hmax and standard deviations or variance Hvar are determined and used as characteristic values Ci of the identification code C. The histograms HLx and HLy can e.g. each be classified into 8, 12 or 16 classes and then one, two or three values (Hq, Hmax, Hvar) can be determined for each class.
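A minimal Python sketch of one possible reading of this classification step: a fine histogram H is built from the measured distances, its bins are grouped into classes He, and mean Hq, maximum Hmax and variance Hvar are taken per class as candidate feature values Ci. Function and parameter names are illustrative, not taken from the patent.

import numpy as np

def class_values(samples, n_bins=64, n_classes=8, value_range=None):
    """Build a fine histogram of the samples, group its bins into n_classes
    classes He, and return per class the mean Hq, maximum Hmax and variance
    Hvar of the bin frequencies as feature values Ci."""
    counts, _ = np.histogram(samples, bins=n_bins, range=value_range)
    classes = np.array_split(counts, n_classes)           # classes He = 1, 2, 3, ...
    ci = []
    for h in classes:
        ci.extend((h.mean(), h.max(), h.var()))           # Hq, Hmax, Hvar per class
    return np.asarray(ci, dtype=float)                    # components of code C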
Feature gradients G

In analogy to the feature distance lengths, the histogram of the gradients G, i.e. of the first directional derivatives, is also registered regularly over the whole image region and for this purpose again recorded e.g. per projection direction. The gradients are determined as the tangent to skin line 5 in the point of intersection of the projection ray (e.g. in x-direction) with the skin line. As shown in Fig. 6 the gradients are registered via the projection direction and their histogram HGx is recorded for projection direction x over the whole image region, and in analogy to this the histogram HGy of the gradients in projection direction y, in order to regularly record the gradient directions G(x, y) over the whole image region.

This histogram of G can also be recorded in an image-covering manner by determination of a gradient value for each mesh of a grid (30, Fig. 16) in order to register a gradient grid. This is explained in more detail further below.
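As a brief, hedged illustration (not part of the patent text) of a mesh-wise gradient value and of a 16-class gradient histogram as used in the example below, consider the following Python sketch; the doubled-angle orientation estimate per mesh and the class count are assumptions consistent with the description, not a prescribed implementation.

import numpy as np

def gradient_grid(image, mesh=5):
    """One dominant gradient orientation (0-180 deg) per mesh of size
    mesh x mesh pixels (the ridge direction is perpendicular to it)."""
    img = np.asarray(image, dtype=float)
    gy, gx = np.gradient(img)                              # simple finite differences
    h, w = img.shape
    grid = np.zeros((h // mesh, w // mesh))
    for i in range(h // mesh):
        for j in range(w // mesh):
            sx = gx[i*mesh:(i+1)*mesh, j*mesh:(j+1)*mesh]
            sy = gy[i*mesh:(i+1)*mesh, j*mesh:(j+1)*mesh]
            # doubled-angle averaging gives a stable orientation per mesh
            angle = 0.5 * np.arctan2(2 * (sx * sy).sum(), (sx**2 - sy**2).sum())
            grid[i, j] = np.degrees(angle) % 180.0
    return grid

def gradient_histogram(grid, n_classes=16):
    """Histogram HG of the grid orientations, classified into 16 classes of 11.25 deg."""
    counts, _ = np.histogram(grid, bins=n_classes, range=(0.0, 180.0))
    return counts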
As an illustrating example the histograms HG of the gradients for the three fingerprint images of Figs. 9, 10 and 11 were determined and are shown in Fig. 8. The image I1 of Fig. 9 shows the fingerprint of a person 1 and the images I2a and I2b show fingerprints of a second person, whereby Fig. 10 shows fingerprint I2a without errors and Fig. 11 shows the same finger of the same person in image I2b with errors or injuries 20 respectively. The histograms HG of these images were recorded over the whole image and are shown as a function of the gradient angle from 0 to 180°. Based on this, a classification into 16 classes, i.e. each class covering an angle region of 180° : 16 = 11.25°, was carried out and the mean values Hq of each class were determined and are shown in Fig. 8. The result is a graph with 16 feature values for each image. As can be seen clearly, the graphs of the images I2a and I2b according to Figs. 10 and 11 are nearly identical; i.e. with a correspondingly defined threshold value S both images are classified as identical. Thus the identification of this person 2 is still possible although the images I2a and I2b differ, firstly due to the errors 20 from injury, secondly due to not identically recorded image regions. As shown in Fig. 11, the recorded image region I2a on its edges does not correspond to the image region I2b (differing definition of image region).
In the case of a known minutiae evaluation many new pseudo-minutiae (ridge ends) would occur in image I2b in the injured areas 20, and thus identification would be connected with much effort or would even be impossible respectively. The graph of image I1 of person 1 of Fig. 9 visibly differs considerably from the graphs I2a and I2b of person 2. Thus the 16 feature values Ci provide a relatively good contribution to the security of recognition of the identification code C as partial code C1 of feature G.
This example also illustrates that the inventive determination of histograms of the named features over a large image region, and from this the determination of compressed feature values, results in an identification code which is only affected by local image errors to a relatively small degree, such that correspondingly a relatively increased security of recognition is achieved. This is in contrast to known minutiae evaluation.
As previously explained in connection with the determination of distances L, the feature gradients G can also be determined directly from digital gray scale value images, e.g. by means of determination of a gradient value for each mesh of a grid 30 and thus of a gradient grid, which is explained in connection with Fig. 16.
Feature curvatures K
The curvatures K according to Fig. 7 are determined in the intersection points of the skin lines or finger lines 5 with the projection directions x and y as second directional derivatives of the finger lines. The curvature is e.g. determined as the inverse radius R of the approximation circle to finger line 5 in the concerned intersection point. In an analogous manner to the feature determination described so far, here the histograms of the curvatures are again determined over the whole image region in the two orthogonal directions x and y, i.e. HKx and HKy (or by determination of the K-value for each mesh 30). In order to exclude irrelevant very small curvature radii which can be formed due to irregularities of a skeletated image, choice rules can be applied, e.g. Rmin = 0.3 - 0.5 mm, such that the narrowest curvature radius in the center of the image is included, but not, however, even narrower curvatures, e.g. at the bifurcation B5.
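A hedged Python sketch of such a curvature estimate: K = 1/R, with R taken as the radius of the circle through three successive skeleton points and Rmin applied as a choice rule. The three-point circumcircle construction is an assumption; the patent only specifies the approximation circle.

import numpy as np

def curvature_at(p_prev, p, p_next, r_min_px=None):
    """Curvature K = 1/R at skeleton point p, with R the radius of the circle
    through three successive skeleton points (approximation circle of Fig. 7).
    Points with R below r_min_px are discarded per the Rmin choice rule."""
    a = np.linalg.norm(np.subtract(p, p_prev))
    b = np.linalg.norm(np.subtract(p_next, p))
    c = np.linalg.norm(np.subtract(p_next, p_prev))
    # twice the triangle area; zero means the points are collinear (K = 0)
    area2 = abs((p[0] - p_prev[0]) * (p_next[1] - p_prev[1])
                - (p[1] - p_prev[1]) * (p_next[0] - p_prev[0]))
    if area2 == 0:
        return 0.0
    r = a * b * c / (2.0 * area2)                          # circumradius
    if r_min_px is not None and r < r_min_px:
        return None                                        # excluded by Rmin rule
    return 1.0 / r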

As explained previously in connection with the determination of distances L, the feature gradients G and possibly also the curvatures K can also be derived from digital gray scale value images or from binarized but not yet skeletated images.
Inventively only the independent features ridge distances L, gradients G and bifurcations B (skin line bifurcations) are used, of which at least two must be used for code determination.
The features L, G, K are not minutiae features; they are characterized by their histograms. The special feature bifurcations B is also chosen and included in a manner that image errors, e.g. the absence of one single bifurcation e.g. in the margin - which can be recorded once and not recorded another time - have no or no substantial effect.
It is important that the sought-after short identification code C is as independent of individual image errors or of one single feature (e.g. of differing individual minutiae) as possible. For this reason the sole minutiae characteristic which is used as a feature can only be clearly defined bifurcations B, whereby these bifurcations B, in contrast to other minutiae, are relatively insensitive to image errors or to apparent mistakes due to disturbances of the finger lines, e.g. due to injury from cutting.
Injuries due to cutting can create new ridge ends as apparent minutiae, but not, however bifurcations.
It is important that the clearly defined feature bifurcations B applied here is used in a different manner than in the conventional evaluation of minutiae. The conventional evaluation registers different kinds of minutiae and, of each minutia, its position, its kind and its orientation, whereby for evaluation and code determination the relative positions of these different individual minutiae are brought in. Inventively, however, only one kind of minutiae is chosen, clearly defined bifurcations, which are additionally used in a completely different manner than up to now. This is explained in what follows: as a histogram or as individually selected bifurcations.
Feature bifurcations B
Fig. 12 shows a binarized and skeletated representation of the gray scale value image of Fig. 10, which is used as an example for the determination of bifurcations.
In this image the bifurcations Bi = B1 to B12 are determined; this results in a quantity of N = 12 bifurcations B. For this purpose suited choice rules or definition criteria respectively are set up, such that small errors and other kinds of minutiae (e.g. islands) are not counted as bifurcations. The rules are e.g.: a registered bifurcation must have a minimum length of 0.5 - 1 mm for all three branches and one branch must have a length of at least 1 - 1.5 mm. Furthermore a minimum distance between two bifurcations of e.g. 0.7 - 1 mm can be prescribed. According to such definition criteria e.g. (B13), (B14) in Fig. 12 and (B7), (B8), (B9) in Fig. 15 are not counted as bifurcations.
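The choice rules could be applied as in the following Python sketch; the candidate representation and the concrete millimetre values (picked from within the ranges quoted above) are assumptions.

import numpy as np

def select_bifurcations(candidates, min_branch=0.75, min_long_branch=1.25, min_dist=0.85):
    """Apply the choice rules to bifurcation candidates (units in mm).

    Each candidate is (position (x, y), branch_lengths (l1, l2, l3))."""
    accepted = []
    for pos, branches in candidates:
        if min(branches) < min_branch:                     # all three branches long enough
            continue
        if max(branches) < min_long_branch:                # at least one clearly long branch
            continue
        # minimum distance to every already accepted bifurcation
        if all(np.hypot(pos[0] - q[0], pos[1] - q[1]) >= min_dist for q, _ in accepted):
            accepted.append((pos, branches))
    return accepted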
Bifurcations B can be used as defined individual or selected bifurcations for generating a code, as is explained in connection with Figs. 19 and 21, or, according to Figs. 13, 14 histograms e.g. of bifurcation distances as well as of adjacent triangular areas F between the bifurcations can be determined.
According to Fig. 13 the bifurcation distances LB are determined as follows: the distance LBi-j between each of the N bifurcations Bi and every other bifurcation Bj is determined. This results in N(N-1) bifurcation distances LBi-j. In the example with N = 12 this results in 132 distances, which are drawn up in a histogram: as frequency distribution HLB as a function of distance LB.
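A short Python sketch of this step: all N(N-1) ordered pairwise distances LBi-j are computed and collected into the histogram HLB. Bin count and range are illustrative.

import numpy as np

def bifurcation_distance_histogram(points, n_bins=16, max_dist=None):
    """Bifurcation distances LBi-j for every ordered pair (i, j), i != j,
    giving N(N-1) values (132 for N = 12), collected into the histogram HLB."""
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    lb = dist[~np.eye(len(pts), dtype=bool)]               # drop i == j, keep both orders
    counts, edges = np.histogram(lb, bins=n_bins,
                                 range=(0.0, max_dist) if max_dist else None)
    return counts, edges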

In analogy to this, triangular areas Fi-j between bifurcations Bi and Bj are determined according to Fig. 14. A first side of the triangle is defined, which reaches from each bifurcation Bi to each other bifurcation Bj, whereby the third point of the triangle is determined to be the bifurcation Bk (with k not = j) closest to bifurcation Bi; e.g. starting from B1, the areas F1-2-3 (with i = 1, j = 2, k = 3) and F1-3-2 (with i = 1, j = 3, k = 2) as well as F1-4-2 up to F1-12-2. With this choice rule it is guaranteed that only one area is registered per bifurcation; e.g. in the case that the two closest bifurcations are Bk-bifurcations, only the one closest to Bj is used as Bk. Starting from B10 to B1 the area is F10-1-12 (i = 10, j = 1, k = 12); i.e. starting with each Bj as a base line, this results in exactly one triangle, thus again a sum of N(N-1) = 132 areas Fi-j, which again form a histogram with the frequency distribution as a function of area F.
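The same choice rule can be sketched in Python as follows; the nearest-neighbour selection of Bk (excluding Bi and Bj) is taken from the description, everything else is illustrative.

import numpy as np

def triangle_areas(points):
    """Triangle areas Fi-j: for each ordered pair (Bi, Bj) the third corner Bk
    is the bifurcation closest to Bi with k != i and k != j, giving N(N-1) areas."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    areas = []
    for i in range(n):
        d = np.linalg.norm(pts - pts[i], axis=1)
        for j in range(n):
            if j == i:
                continue
            d_k = d.copy()
            d_k[[i, j]] = np.inf                           # exclude Bi and Bj as third point
            k = int(np.argmin(d_k))
            v1, v2 = pts[j] - pts[i], pts[k] - pts[i]
            areas.append(0.5 * abs(v1[0] * v2[1] - v1[1] * v2[0]))
    return areas                                           # histogram these as H(F)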
Fig. 15 shows a further example of a determination of bifurcations from another fingerprint image, whereby corresponding choice rules are applied, e.g. that the very closely situated bifurcations B7, B8, B9 are not counted as bifurcations for evaluation, such that here only the bifurcations B1 to B6 remain and thus the quantity of bifurcation distances LB as well as that of the closest triangular areas F is in each case N(N-1) = 30. With other definition criteria, e.g. only B9 could be defined as a bifurcation but not B7 and B8.
Determination of feature values

From the histograms, simple feature values Ci which characterize the histogram are determined, e.g. mean values Hq, maximum values Hmax and variances Hvar. Furthermore a histogram can also be classified into classes He, and e.g. mean value and variance can be determined as characteristic values (Fig. 5).

Selected bifurcation features

The feature bifurcations can additionally be used in a non-statistical manner, by using individual clearly defined bifurcations with determined bifurcation features for the determination of code features Ci and identification code C. Hereby the bifurcations are, as illustrated in the examples of Figs. 19a and 19b, determined via their position, i.e. the local coordinates x1, y1 of point P1 are determined as well as the orientation angle Wa and the opening angle Wb. A clearly defined opening angle Wb can e.g. be defined as the angle between the connecting lines of the points P1, P2 and P1, P3, whereby points P2 and P3 are chosen at a suited distance r1 to point P1, e.g. r1 = 0.5 - 1 mm. A point P4 with a minimal distance r2 of e.g. 1 to 1.5 mm from point P1 on the other hand forms a definition criterion for a bifurcation, so that small image errors are not wrongly counted as bifurcations. In the example of Fig. 19a a relatively small opening angle Wb is shown, e.g. corresponding to bifurcation B6 in Fig. 21, while the example of Fig. 19b shows a large opening angle Wb, e.g. similar to the one in bifurcation B8 in Fig. 21. Identified bifurcations can now be classified for code determination, i.e. sorting criteria such as e.g. a valence for the bifurcations are introduced. A first classification criterion is the opening angle Wb, whereby large opening angles have higher valences. The classification criterion can e.g. be: a valence in proportion to opening angle Wb. A further sorting criterion is the distance RBi of the bifurcation to a central reference point, here e.g. the distance RBi to the curvature center Km (Fig. 21), whereby the central bifurcations have higher valences than those further away. As a further criterion of selection, bifurcations which are close to the margin, e.g. B2 in Fig. 21, which are near the outer contour A of the finger line image, can be left away. From the bifurcation features and the classification criteria e.g. a bifurcation priority can be defined as BP = Wb/RB. With this all bifurcations can be sorted in descending order according to BP. For generating a short code length the first entries of the list sorted according to bifurcation priority can be used. A possible order for Fig. 21 can e.g. be the following: B8, B6, B5, B7, B4.
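A possible Python sketch of this sorting by bifurcation priority BP = Wb/RBi, with margin bifurcations left away; the data layout and the number of kept entries are assumptions.

import numpy as np

def prioritized_bifurcations(bifurcations, centre, margin_mask=None, keep=5):
    """Sort bifurcations by priority BP = Wb / RBi (opening angle over distance
    to the central reference point, e.g. Km) and keep the first entries.

    bifurcations -- list of (position (x, y), opening angle Wb in degrees)
    margin_mask  -- optional list of booleans marking bifurcations near contour A
    """
    scored = []
    for idx, (pos, wb) in enumerate(bifurcations):
        if margin_mask is not None and margin_mask[idx]:
            continue                                       # leave out margin bifurcations
        rb = np.hypot(pos[0] - centre[0], pos[1] - centre[1])
        bp = wb / rb if rb > 0 else np.inf                 # BP = Wb / RBi
        scored.append((bp, idx))
    scored.sort(reverse=True)                              # large angle, central first
    return [idx for _, idx in scored[:keep]]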
By means of combination of the feature gradients and bifurcations especially concentrated codes with relatively high security of recognition can be created. An identification code C which on the one hand consists of gradient features for each segment IS of a gradient grid and on the other hand of classified bifurcations B is particularly advantageous. For this purpose a central reference point and the orientation of the finger line image are required.
Image pre-processing
The features L and G can be determined directly from the digital gray scale value image without further image pre-processing.
The determination of features B and also of K is mostly carried out by means of a line thinning as image pre-processing, e.g. a binarization and skeletation.
Then the other features L, G, K are determined from the skeletated image.
In an advantageous variant the bifurcations B can, however, also be determined directly from the digital gray scale value image by means of neural networks.
The features can be registered in the following manners, as is illustrated in Fig. 16 with segmentation and mesh classification:

a) in the whole image region (also for the features L, G, K, B)
b) in each of a few relatively large segments, e.g. divided into 3 x 5 to 5 x 7 segments IS, whereby histograms are determined for each segment IS and feature values are derived from them
c) covering an image I with a grid 30, whereby for each mesh only one value of the feature is determined; the size of the mesh is chosen suitably (e.g. 5 x 5 pixels)

With these kinds of registration a, b, c the following features are registered:

a: L, G, K, B
b: G, K
c: G, K, (B)

When comparing the codes, attention must be paid to the fact that the division into segments IS depends on translation and rotation of the recorded images. For comparison of codes this must be taken into consideration.
The feature curvatures K is independent of rotation; the same is valid for the bifurcation distances LB and bifurcation areas F derived from the bifurcations B and for the relative positions of selected bifurcations. The feature gradients G requires a determination of the image orientation axis y. The feature gradients G, on the other hand, is independent of image dilatations (magnification and diminution of the image).
Because at least two features (L, G, K, B) are contained in the identification code C = (C1, C2) as partial codes (C1, C2), these partial codes can also be used correspondingly differently for the code evaluation, i.e. corresponding to the dependence on the image definition and to the dependencies of the concerned feature and the histogram evaluation or the selected bifurcation features respectively.

Partial codes can be formed at choice from any feature values Ci.

Image section

Fig. 17 illustrates the dependence on the image definition, whereby two images were recorded of the same finger, first an image I3 and then an image I4, whereby these images I3, I4 show different regions and possibly also different orientations of the longitudinal axis y of the fingerprint. In order to always be able to compare similar image regions it is also possible to cut a central portion out of a given fingerprint image, which portion can then be used as image section IA for the determination of features and generation of a code. Hereby regions from the center Z of a fingerprint and its surroundings ZU are advantageously used for registration of features and code generation, because the information content is higher here than in the marginal regions.
With a well defined position and orientation of the images, by means of corresponding image recording devices, e.g. a guidance for two fingers, segmented images can also be compared at higher security.
Orientation and central reference point

For the determination of the orientation and a central reference point of a finger line image, or for the definition of a grid with an origin of coordinates, e.g. a so-called core point can be defined in a known manner. This, however, is connected with large effort, and often no clear core point can be determined. A simpler method consists in determining a curvature radius gravity center Km as central reference point from the feature curvature K. For this purpose the centers of the approximation circles (see R in Fig. 7) to the curvatures are determined, and of these regularly recorded circle centers of the approximation circles the gravity center Km with coordinates xm, ym is determined. A central reference point can, however, also first be determined by approximation, e.g. as the gravity center of the image area with outer contour A (in Fig. 21). Similarly the recording of a finger image recorded in a defined manner by means of guiding means 32, as described in connection with Fig. 20, can be used. A further possibility is to determine a central reference point from the variance Hvar of the gradient distribution in image segments IS, whereby the central reference point is determined by the segment which has a maximum of variance. For this purpose the image segmentation can also be varied.
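The variance-based variant could look like the following Python sketch, which picks the segment IS with the largest gradient variance Hvar as central reference point; the segment layout is an assumption.

import numpy as np

def reference_point_from_variance(orientation_grid, seg_shape=(5, 7)):
    """Pick a central reference point as the centre of the image segment IS
    whose gradient distribution has the largest variance Hvar (one of the
    variants described for determining a central reference point)."""
    g = np.asarray(orientation_grid, dtype=float)
    rows = np.array_split(np.arange(g.shape[0]), seg_shape[0])
    cols = np.array_split(np.arange(g.shape[1]), seg_shape[1])
    best, best_var = None, -1.0
    for r in rows:
        for c in cols:
            var = g[np.ix_(r, c)].var()                    # Hvar of this segment
            if var > best_var:
                best_var = var
                best = (r.mean(), c.mean())                # segment centre (row, col)
    return best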
With a poorly defined position of an image to be recorded, the features, manners of determination and code determination must be chosen such that they are relatively independent of position, e.g. the features G, K according to determination manner a, whereby, however, in most cases the direction should be determined for support. For code comparison the image can also be rotated over a small angle region or shifted in x- and y-direction.
As realization variants of the inventive method, depending on image definition and the objective to be achieved, e.g. the following combinations of the features for determination of an identification code C can be used:
1. the features ridge distances L, gradients G and bifurcations B. These can e.g. also be determined from gray scale value images and be used especially for simpler, shorter codes (L and G), e.g. also with a classification of histograms.

From skeletated images the following features can be used:

2. the features gradients G and curvatures K
3. the features gradients G and bifurcations B
4. the features curvatures K and bifurcations B, which are both invariant to rotation
5. the features gradients G, curvatures K and bifurcations B, this especially for higher precision, whereby corresponding to the higher precision longer codes are formed (also with selected bifurcations).
After suitable choice of the combined features, the precision with which the features and their histograms are determined, as well as the determination of the feature values Ci from them, can be chosen such that finally an identification code C with a code length meeting the desired demands on precision is formed, i.e. the code length can be enlarged as far as demanded, by narrower segmentation and classification and by determination of more feature values Ci.
The identification code C can be determined not only sequentially but also in an iterative manner.
Code comparison

With these inventive, relatively compact identification codes C (with a minor demand on memory) all known identification, verification and authentication tasks can also be carried out in a local, simple and rational manner. Hereby, for the comparison of two codes Ca and Cb, e.g. a code of a recent fingerprint image Ca with a reference code Cb from an allocated memory (which can be a databank or also an identification medium IM), a code difference D = Ca - Cb is determined by means of mathematical methods and compared to a pre-determinable threshold value (an acceptance value) S. In the case that D < S the two codes Ca and Cb, and therefore the appertaining persons, are found to be identical.

P1118 December 23, 1999 Hereby individual threshold values specific to persons can be given. E.g.
dependent on how well the identification code C of a certain person is to be determined and on the other hand also on how important this identification task for this person is.
In similar manner a threshold value S can be predetermined, corresponding to the concerned application, i.e. which security of recognition is desired, i.e.
which FAR
false acceptance rate and which FFR false rejection rate is permissible.
For the determination of this correspondence, e.g. the Euclidean distance D of the feature vectors C can be determined together with a predetermined correspondence threshold S. This means, for a code formed from the feature values Ci, that a fingerprint 1 is identical to a fingerprint 2 when:

D = Σ (C1i - C2i)² < S
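In Python this comparison reduces to a few lines; returning both the decision and the distance D is an illustrative choice.

import numpy as np

def codes_match(ca, cb, threshold):
    """Identification comparison: D = sum_i (C1i - C2i)^2 < S means the two
    fingerprints are taken to belong to the same person."""
    d = float(((np.asarray(ca, dtype=float) - np.asarray(cb, dtype=float)) ** 2).sum())
    return d < threshold, d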
As an alternative the identification codes can also be correlated according to the following method: a correlation Kor between the features of a finger ai with i = 1,...,N and a second finger bi with i = 1,...,N is formed according to the formula

Kor = Σ (ai - aq)(bi - bq) / (Σ (ai - aq)² Σ (bi - bq)²)^(1/2), for i = 1,...,N,

whereby aq is the mean value of the values ai and bq is the mean value of the values bi.
For ai = bi the fingers are identical, i.e. the correlation is 1. At Kor = 0 the features are not correlated, at Kor = -1 they are anti-correlated.
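Read as a normalized correlation, the formula can be sketched as follows; the reconstruction of the denominator as the product of both sums of squares is an assumption consistent with the stated range of -1 to 1.

import numpy as np

def code_correlation(a, b):
    """Correlation Kor of two feature vectors; 1 for identical fingers,
    0 for uncorrelated, -1 for anti-correlated features."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    da, db = a - a.mean(), b - b.mean()                    # ai - aq, bi - bq
    return float((da * db).sum() / np.sqrt((da ** 2).sum() * (db ** 2).sum()))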
Further possible methods of comparison are, depending on the type of code C, regression analysis or moment analysis. Furthermore the different features or feature values Ci respectively, and the corresponding partial codes C1, C2 of an identification code C = (C1, C2), can also be treated differently.
Fig. 18 diagrammatically illustrates the inventive method for the determination of a relatively short identification code C. The choice 52 of at least two of the orthogonal features L, G, K, B, as well as the choice of the following method steps, is based on the one hand on the image definition 51 and on the other hand on the task 60, i.e. on the required security of recognition of the desired applications, and from this on the required false acceptance rate FAR and false rejection rate FRR. Depending on the image definition 51 (i.e. quality of orientation, section, image recording and image quality), correspondingly less sensitive features and kinds of registration are chosen. Likewise a possible image pre-processing 53 can be carried out for a direct feature determination from the gray scale image, from a binarized image or from a thinned, skeletated line image. In a next step 54 a possible image orientation, a central reference point or an origin of coordinates, e.g. Km, as well as the choice of an image section IA or a segmentation IS respectively can follow. From this, histograms H with a correspondingly fine screening or quantity of data respectively are determined in step 55. In addition to the histograms, individually defined feature bifurcations B with position, orientation angles Wa and opening angles Wb can be determined in step 56, and in step 57 a classification or selection of bifurcations, or valences according to a bifurcation priority BP respectively, can be carried out. In step 58 the determination of the feature values Ci is carried out and in step 59 the composition of the identification code C is carried out, i.e. a correspondingly longer or shorter code length is composed until the desired security of recognition FAR and FRR according to step 60 is achieved. Corresponding to the composition of code C, the methods of evaluation 61 and the threshold values S for the identification evaluation can be chosen. The identification evaluation (comparison) Ca - Cb = D < S is carried out in step 62.
Thus the inventive method is universally applicable for a wide range of applications and is also optimally matchable to the desired tasks concerning computing effort and length of code.
Fig. 20 shows a device for carrying out the method, with an electronic image recording device 31 and a station with evaluation electronics 34 and with evaluation algorithms 35 for the determination of feature values Ci and from these of the identification code C or Ca respectively, i.e. of the actual identification code corresponding to the image recording of the fingers 1, 2 to be recorded, and for carrying out a comparison of codes Ca - Cb between a saved reference code Cb and the actual code Ca. This code comparison can also be carried out in a corresponding reading station WR. Corresponding to the code comparison, i.e. to the verification of the actual person, access functions for control of corresponding function stations can be exercised. The image recording device 31 here comprises guiding and positioning means 32 with limit stops for orientation and positioning of one or of two adjacent fingers 1, 2 to be recorded. Additionally a lateral limit stop for positioning of the longitudinal axis of the finger and a front limit stop for positioning of the finger image, and thus for determination of a central reference point or a center of the finger image respectively, can be provided. This kind of two-finger recording device is e.g. disclosed in PCT CH97/00241 = WO 98/09246. With an inventive device a code comparison Ca - Cb can be carried out locally, such that no central databank with reference codes Cb is required. The inventive device can also comprise corresponding identification media IM for authorized persons, whereby the reference code Cb of the person and further information is contained in the identification medium IM or in its memory 43, and whereby an encoded communication 40 with a corresponding reading station WR can be carried out. Hereby the reference code Cb of the authorized person is only stored on the identification medium and not in the verification station or in a databank respectively. This makes better data protection possible.

In a further variant a code comparison Ca - Cb, i.e. of the actual code Ca with the reference code Cb of the person, can also be carried out in the identification medium itself by means of the processor 42. In this case only the actual code Ca must be transmitted from the reading station WR to the identification medium, while no code must be transmitted from the identification medium. In another variant the device can also be connected with a master system 47 with a main processor and with a databank 48 for reference codes Cb. The identification medium IM can contain access and function authorizations for further corresponding function stations 46 of a facility.
Pre-selected databank search

In applications in which a fast databank search is required for positive identification of individual persons from an inhabitant databank with a large quantity of reference codes Cb, solely by means of biometric recognition features, the identification can be carried out in steps. Hereby the comparison is not carried out with the complete biometric identification code Ca but only with partial codes or with individual feature values Ci respectively, such that the search can be carried out considerably faster. In a first step e.g. only a partial code C1 is compared, and the remaining reduced quantity is then compared to a second partial code C2 in a second step etc. This continues up to the complete code evaluation with a very small remaining quantity from the databank. This kind of search is considerably more rapid than a search over the complete identification code C. Advantageously the search for primary biometric data is classified, i.e. e.g. gradients and selected bifurcation features are used in steps for this purpose. A further method consists in the forming of gradient segments according to the method of quadrant segmentation (quad trees). Hereby the image is divided into four segments, from each of which a gradient histogram is formed. In the next step each quadrant is again divided into four quadrants etc., until finally the gradient histogram converges to the main line tangent in the considered segment.
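A schematic Python sketch of such a stepwise search over partial codes; the data structures, thresholds and the stopping criterion are illustrative assumptions.

def stepwise_search(ca_partials, database, thresholds):
    """Pre-selected databank search: compare partial codes C1, C2, ... in steps,
    shrinking the candidate set before the full code is evaluated.

    ca_partials -- list of partial codes of the actual print, coarse to fine
    database    -- dict person_id -> list of reference partial codes (same order)
    thresholds  -- acceptance value per step
    """
    candidates = set(database)
    for step, (ca, s) in enumerate(zip(ca_partials, thresholds)):
        candidates = {
            pid for pid in candidates
            if sum((x - y) ** 2 for x, y in zip(ca, database[pid][step])) < s
        }
        if len(candidates) <= 1:                           # remaining quantity small enough
            break
    return candidates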
The following terms are used in the description and in the figures:

L ridge distance
G gradients
K curvatures
B bifurcations
LB bifurcation distances
F bifurcation areas
H frequency distributions, histograms H(L), H(G), H(K), H(B: LB, F)
He classification of H
Hq mean values
Hmax maximum values
Hvar variance of H
De gray scale value
I1, I2 fingerprint images
IA image sections
IS image segments
C identification code (feature vector Ci)
Ca actual code
Cb reference code
C1, C2 partial codes
Ci feature values
D code difference
S threshold value, acceptance value
Z center
ZU surrounding
x abscissa
y ordinate, longitudinal axis of finger
dx, dy screening steps
FAR false acceptance rate
FRR false rejection rate
IM identification medium, data carrier
WR reading station
BP bifurcation priority
P1 - P4 points for the definition of Wb
r1, r2 distances
Wa orientation angle
Wb opening angle of B
P1(x1, y1) local coordinates of B
Km curvature radius gravity center
xm, ym coordinates of Km
RBi distance from central reference point to Bi
A outer contour of finger image I
1, 2 fingers
5 finger lines, skin lines
10 gray scale threshold value
20 image error, injury
30 grid
31 image recording device
32 guiding and positioning means
34 evaluation electronics, evaluation station
35 evaluation algorithms
40 encoded communication
42 processor
43 memory
46 access function (function stations)
47 master system, main processor
48 data bank
51 image definition
52 choice of features L, G, K, B
53 image pre-processing, skeletation
54 image section, segmentation
55 forming histograms
56 determining bifurcations
57 classification, choice of valences
58 determining feature values Ci
59 determining identification code C
60 applications, demands (FAR, FRR)
61 evaluation method, choice of threshold value S
62 identification comparison Ca - Cb

Claims (32)

1. Method for determination of an identification code from fingerprint images or from digital gray scale value images, characterized in that at least two of the following independent features: ridge distances L, gradients G, bifurcations B are registered and that from at least one of the features histograms H are determined from which characteristic values (mean value, variance, maximum) are determined as compressed feature values Ci which as vector components form the identification code C.
2. Method according to claim 1, characterized in that the features gradients G are used.
3. Method according to claim 1 or 2, characterized in that the histograms H are determined in at least two orthogonal directions (x, y).
4. Method according to one of the preceding claims, characterized in that an image pre-processing is carried out, e.g. by means of binarization and skeletation.
5. Method according to one of the preceding claims, characterized in that a classification He of a histogram is carried out, whereby at least one characteristic value for each class is determined as a feature value Ci.
6. Method according to one of the preceding claims, characterized in that the recording image I is subdivided into several segments IS and that the histograms H of features (L, G, K, B) are determined for each segment IS.
7. Method according to one of the preceding claims, characterized in that the identification code C is derived from an image section IA which comprises the center Z of the fingerprint image and its surrounding ZU.
8. Method according to one of the preceding claims, characterized in that the length of the identification code C is chosen such that the demand for precision or the recognition security respectively (FAR, FRR) are fulfilled for a desired application.
9. Method according to one of the preceding claims, characterized in that partial codes C1, C2 of the features are determined differently and/or used differently for the code evaluation.
10. Method according to one of the preceding claims, characterized in that defined individual bifurcations B are used as features.
11. Method according to claim 10, characterized in that the features bifurcations B are determined directly from the digital gray scale image by means of neural networks.
12. Method according to claim 10, characterized in that the features bifurcations B are classified according to their opening angle Wb and/or according to their position in the fingerprint.
13. Method according to claim 10, characterized in that for classification a bifurcation priority BP is defined, whereby the bifurcation priority increases as the opening angle Wb increases and the distance RBi to the image center decreases.
14. Method according to claim 10, characterized in that the bifurcation priority BP is defined as BP = Wb/RBi.
15. Method according to one of the preceding claims, characterized in that a central reference point is defined as origin of coordinates of the fingerprint image.
16. Method according to claim 15, characterized in that a curvature radius gravity center Km is defined as a central reference point.
17. Method according to claim 15, characterized in that the variance Hvar of the gradient distribution in image segments IS is determined and the central reference point is determined by the segment which has a maximum variance.
18. Method according to one of the preceding claims, characterized in that by means of guiding and positioning means (32) the longitudinal axis (y) of the finger and a central reference point are determined at least approximately when recording the image.
19. Method according to one of the preceding claims, characterized in that the features gradients G and bifurcations B are used.
20. Method according to claim 19, characterized in that the identification code C is formed from gradient characteristics for each segment IS of a gradient grid and of classified bifurcations B.
21. Method according to one of the preceding claims, characterized in that partial codes are formed and that a comparison Ca - Cb with a large quantity of reference codes Cb of a data bank (48) is carried out in steps corresponding to the partial codes.
22. Method according to claim 21, characterized in that the evaluation in steps is carried out by means of gradient segments and the method of quadrant-sub-division (quad trees).
23. Method according to one of the preceding claims, characterized in that an actual identification code Ca is recorded and determined and compared locally, by means of a corresponding reading station WR, to a personal reference code Cb stored in a corresponding identification medium IM.
24. Method according to one of the preceding claims, characterized in that the identification code C is determined in a sequential and/or iterative manner.
25. Device for carrying out the method according to one of the preceding claims, characterized by an electronic image recording device (31), a station with evaluation electronics (34) and with evaluation algorithms (35) for determination of an identification code C, Ca and for carrying out a code comparison Ca - Cb with a stored reference code Cb as well as with an access function (46) for the control of corresponding function stations.
26. Device according to claim 25, characterized by guiding and positioning means (32) with a lateral and a front limit stop for orientation and positioning of one or two adjacent fingers (1, 2) to be recorded.
27. Device according to claim 25 or 26, characterized in that the code comparison Ca - Cb is carried out locally.
28. Device according to one of claims 25 to 27, characterized by a corresponding identification medium IM of an authorized person which contains the reference code Cb of the person and further information for encoded communication (40) with a corresponding reading station WR.
29. Device according to claim 28, characterized in that the actual code Ca is transmitted to the identification medium IM and that the code comparison with the reference code Cb is carried out by means of the processor (42) of the identification medium IM.
30. Device according to claim 25, characterized by a master system (47) with a main processor and with a data bank (48) for reference codes Cb and with local function stations (46).
31. Identification medium IM for an authorized person with a stored reference code Cb of the person, which code is determined with a method according to one of claims 1 to 24, for encoded communication (40) with a corresponding reading station WR.
32. Identification medium according to claim 31 with access and function authorizations for corresponding function stations (46).
CA002296353A 1997-07-18 1998-07-16 Method for determining an identification code from fingerprint images Abandoned CA2296353A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CH1768/97 1997-07-18
CH176897 1997-07-18
PCT/CH1998/000312 WO1999004358A1 (en) 1997-07-18 1998-07-16 Method for determining an identification code from fingerprint images

Publications (1)

Publication Number Publication Date
CA2296353A1 true CA2296353A1 (en) 1999-01-28

Family

ID=4218141

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002296353A Abandoned CA2296353A1 (en) 1997-07-18 1998-07-16 Method for determining an identification code from fingerprint images

Country Status (10)

Country Link
EP (1) EP0996924B1 (en)
JP (1) JP2001510920A (en)
KR (1) KR20010021988A (en)
CN (1) CN1271446A (en)
AT (1) ATE237163T1 (en)
AU (1) AU761123B2 (en)
BR (1) BR9811511A (en)
CA (1) CA2296353A1 (en)
DE (1) DE59807883D1 (en)
WO (1) WO1999004358A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999044021A1 (en) * 1998-02-27 1999-09-02 Ideas To Market, L.P. Time temperature indicator
KR100374695B1 (en) * 2000-07-10 2003-03-03 주식회사 디토정보기술 Automatic Fingerprint Identification System using Direct Ridge Extraction
KR100430054B1 (en) * 2001-05-25 2004-05-03 주식회사 씨크롭 Method for combining fingerprint by digital linear image sensor
SE524727C2 (en) * 2002-05-07 2004-09-21 Precise Biometrics Ab Generation of frequency codes in conjunction with transformation and compression of fingerprint data
KR100456463B1 (en) * 2002-11-01 2004-11-10 한국전자통신연구원 A Hybrid Fingerprint Verification Method using Global and Local Features
KR100601453B1 (en) * 2004-03-10 2006-07-14 엘지전자 주식회사 Fingerprint recognition method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3668633A (en) * 1970-01-29 1972-06-06 Dactylog Co Orientation and linear scan device for use in an apparatus for individual recognition
EP0339527B1 (en) * 1988-04-23 1997-07-09 Nec Corporation Fingerprint processing system capable of detecting a core of a fingerprint image by curvature parameters
JPH0465776A (en) * 1990-07-05 1992-03-02 Ezel Inc Method for comparing image
JPH04332089A (en) * 1991-05-07 1992-11-19 Takayama:Kk Method for registering finger print data
WO1993007584A1 (en) * 1991-10-07 1993-04-15 Cogent Systems, Inc. Method and system for detecting features of fingerprint in gray level image
JP2725599B2 (en) * 1994-06-21 1998-03-11 日本電気株式会社 Ridge direction extraction device
AU735527B2 (en) 1996-08-27 2001-07-12 Kaba Schliesssysteme Ag Method and apparatus for the indentification of not-enrolled fingerprints

Also Published As

Publication number Publication date
DE59807883D1 (en) 2003-05-15
EP0996924A1 (en) 2000-05-03
WO1999004358A1 (en) 1999-01-28
EP0996924B1 (en) 2003-04-09
BR9811511A (en) 2000-09-12
AU8203998A (en) 1999-02-10
KR20010021988A (en) 2001-03-15
JP2001510920A (en) 2001-08-07
ATE237163T1 (en) 2003-04-15
AU761123B2 (en) 2003-05-29
CN1271446A (en) 2000-10-25

Similar Documents

Publication Publication Date Title
US7162058B2 (en) Authentication system by fingerprint
US7151846B1 (en) Apparatus and method for matching fingerprint
US5974163A (en) Fingerprint classification system
CN110222687B (en) Complex background card surface information identification method and system
EP0650137A2 (en) An apparatus for fingerprint verification
EP0780781B1 (en) Feature extraction for fingerprint recognition
US7079670B2 (en) Apparatus and method for authenticating a user by employing feature points of a fingerprint image of the user
KR20010021850A (en) System and method for automatically verifying identity of a subject
EP0466161A2 (en) Image positioning method
US20090169072A1 (en) Method and system for comparing prints using a reconstructed direction image
US20200110932A1 (en) Method for detecting document fraud
US10002285B2 (en) Fast, high-accuracy, large-scale fingerprint verification system
US7035441B2 (en) Check for fingerprints
KR101237148B1 (en) Hierarchical fingerprint verification apparatus and method using similarity distribution
AU761123B2 (en) Method for determining an identification code from fingerprint images
JP3494388B2 (en) Fingerprint matching method and fingerprint matching device
EP1295242B1 (en) Check of fingerprints
CN114742188A (en) Data identification system for automatically identifying social security card and implementation method thereof
Srinivasu et al. Aadhaar card voting system
MXPA00000492A (en) Method for determining an identification code from fingerprint images
Szczepanik et al. Security lock system for mobile devices based on fingerprint recognition algorithm
JP2600680B2 (en) Personal verification device
JP3033595B2 (en) Fingerprint image registration method
EP0466039A2 (en) Image comparison method
Bursikov et al. Building an optimal document authentication system

Legal Events

Date Code Title Description
FZDE Discontinued