AU8203998A - Method for determining an identification code from fingerprint images - Google Patents

Method for determining an identification code from fingerprint images Download PDF

Info

Publication number
AU8203998A
AU8203998A AU82039/98A
Authority
AU
Australia
Prior art keywords
image
code
determined
bifurcations
features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU82039/98A
Other versions
AU761123B2 (en)
Inventor
Rudolf Hauke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dormakaba Schweiz AG
Original Assignee
Kaba Schliessysteme AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kaba Schliessysteme AG filed Critical Kaba Schliessysteme AG
Publication of AU8203998A publication Critical patent/AU8203998A/en
Application granted granted Critical
Publication of AU761123B2 publication Critical patent/AU761123B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/469Contour-based spatial representations, e.g. vector-coding

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for determining an identification code from fingerprint images, wherein at least two of the following independent characteristics: line spacing L, gradients G, curvatures K and bifurcations B, are detected over the image area and a frequency distribution H is determined. On the basis of said frequency distribution, characteristic values (mean value, variance, maximum value and classification value) and the characteristic values (Ci) of the selected bifurcations are determined, which form the vectorial components of the identification code C. The inventive method can be used to establish a short identification code which is relatively easy to determine and displays high recognition reliability for various applications.

Description

METHOD FOR DETERMINATION OF AN IDENTIFICATION CODE FROM FINGERPRINTS

The invention concerns a method and a device for the determination of identification codes from fingerprints or from digital gray scale value images respectively, according to the generic terms of claims 1 and 25. These serve the automatic identification of the fingerprints of a person, in real time or from an image document, by means of an electronic image recording device, e.g. a video camera. With this, a code can be generated in digital form from fingerprint images, with which code the appertaining persons can be identified. Mostly the finger is pressed onto a suitable optical device for creating contrast, which device generates an image of the skin lines by means of prisms and the principle of frustrated total reflection. Skin line images can, however, also be recorded electronically in a direct manner, e.g. capacitively.

Up to now, minutiae of all kinds (bifurcations, ridge ends, inclusions, islands, interruptions, junctions, trifurcations etc.) have been used for the characterization of a fingerprint. This is especially valid for the widely used forensic applications, which require a very accurate analysis of these minutiae characteristics (position, kind and orientation of the minutiae) and which in turn require a correspondingly large amount of memory and complicated programs.

Furthermore, the characterization by means of known minutiae methods has a series of additional drawbacks: on the one hand, errors and inaccuracies of the optical image recording can lead to confusion of the minutiae, i.e. an error of the image recording generates apparent minutiae which do not really exist; on the other hand, existing minutiae may not be recognized due to poor image recording. Additionally, the actual finger line image of a person can contain minutiae errors, e.g. due to injury of the skin, pollution or poor recordability of the skin lines, such that e.g. interruptions appear in the image. Due to a simple injury in the form of a cut, a large set of different minutiae, i.e. apparent ridge ends, can be formed along the edge of the cut. The recorded minutiae image of a person is thus not always identical, which again requires complicated evaluation programs. For these reasons the determination of fingerprints by means of known minutiae methods requires much effort concerning calculation and memory. Other known methods for the determination of identification codes from distances between ridges or from gradients have not yet achieved a sufficient security of recognition with short codes.

On the other hand, there is considerable demand for the identification and verification of persons with simple means in a large number of everyday applications, e.g. selective access, payment by means of credit cards, identification for legal or social purposes such as passport control, or the inspection of personal documents, e.g. for social programs. For all these non-forensic applications it would be necessary to find a simpler and more secure biometric identification code which requires very little memory and can thus also be used on inexpensive data carriers: especially on inexpensive magnetic cards, on documents with one- or two-dimensional bar codes or on other inexpensive data carriers, and in particular also on chips with EEPROM memories of smart cards and on contactless data carrier systems.
This is also absolutely necessary for all applications which concern the handling of business dealings with relatively low sums of money, e.g. in the field of everyday consumer goods or for vending machines with a relatively large number of users. For this kind of application the identification carrier must be very inexpensive, i.e. it must be securely applicable with a small memory capacity and a relatively simple evaluation in the small local computers of testing stations.

The object of the present invention is thus to create a method with a better ratio of necessary code length and computing effort to the precision of determination, and especially to generate a shorter and simpler code at a sufficiently high precision of determination, which code can consist of less than 100 bytes, e.g. only 36 bytes or even less. The code is also to be less sensitive concerning image errors and recording errors as well as concerning the choice of the image section. Furthermore, the generation of this code is also to be possible in local stations with simple, inexpensive computers. This object is inventively achieved by means of a method according to claim 1 and a device according to claim 25.

By using at least two of the independent or orthogonal features ridge distance L, gradient G, curvature K and bifurcation B, a multiplication of the determination precision of the two features is substantially achieved, and with the determination of compressed characteristic values from the frequency distributions of the features, code features are determined in a simple manner which are additionally less dependent on recording and image errors. With the inventive method it is also possible to achieve a higher determination precision based on few characteristic values, or respectively very short identification codes for simple applications, which can be achieved if required by increasing the number of features or the code length. Advantageous further developments of the invention are stated in the dependent claims.

In the following description of the different method steps as well as in connection with figures and examples the invention is described further, whereby:

Fig. 1 shows a determination of the ridge distances L in x-direction
Fig. 2 shows a determination of ridge distances from gray scale value images
Fig. 3 shows a determination of L in x- and y-direction
Fig. 4 shows frequency distributions HL as a function of distance length L
Fig. 5 shows a classification with determination of class values
Fig. 6 shows a determination of gradients G
Fig. 7 shows a determination of curvatures K
Fig. 8 shows a representation of class values of gradient distributions HG of different images
Figs. 9 to 11 show different fingerprint images I
Fig. 12 shows a determination of bifurcations B in a skeletonized image
Fig. 13 shows a determination of bifurcation distances LB
Fig. 14 shows a determination of bifurcation areas F
Fig. 15 shows a further example of a determination of bifurcations B
Fig. 16 shows a segmentation and the covering of an image with a grid
Fig. 17 shows a representation of possible image sections
Fig. 18 shows a diagrammatic representation of the inventive method
Figs. 19a, b show examples of defined bifurcations
Fig. 20 shows an inventive device for carrying out the method
Fig. 21 shows an illustration concerning the classification of bifurcations.
For image determination, a digital gray scale value image of a fingerprint with a suitable grid of e.g. 512 x 512 pixels is recorded by means of known methods (Figs. 9 to 11). This digital image can either be used directly for the determination of the characteristics, or a finger line image can be created from it by means of image pre-processing, especially by means of binarization and skeletonization (Figs. 12 and 15). Different characteristics are then drawn from this once-created image and are compressed into an identification code C in further processing steps, which code corresponds to the desired application concerning its length and determination precision.

Fingerprint features

The following four substantially independent or respectively orthogonal features (characteristics) are used and their frequency distributions are determined:

L ridge distances
G gradients
K curvatures
B bifurcations

Feature ridge distances L

The ridge distances (distance lengths) L are, as illustrated in Figs. 1 and 2, defined as distances between two succeeding finger lines (ridges) 5, whereby the finger line has a width 0, i.e. corresponding to the distances between the middles of two successive finger lines which are recorded in the direction of a projecting ray X. For pre-processed, skeletonized images this corresponds directly to a ridge distance (with ridge width 0, Fig. 1), while for digital gray scale value images according to Fig. 2 the distance L between two successive finger lines is to be calculated as L = L2 + 1/2 x (L1 + L3), where L1 and L3 correspond to the finger line widths at a suitably chosen gray scale threshold value 10. With this threshold value 10 a binarization can also be carried out, by setting gray scale values De above the threshold value to 1 and gray scale values below the threshold value to 0. For the elimination of errors of individual pixels it can be predetermined as a condition that the ridge widths L1, L3 and also the distance L2 must e.g. extend over three successive pixels.

The determination of the occurring distances Lx1, Lx2 etc. in the direction of a recording ray, e.g. the abscissa x, is carried out as shown in Fig. 1. The complete image is then covered by variation of y in suitable steps dy according to Fig. 3, such that the quantity of all recorded occurring distances HLx(x, y) can be plotted as a function of the distance lengths Lx according to Fig. 4. This yields the frequency distribution or histogram of all occurring distances Lx in x-direction over the recorded image region. In analogy to this, the distances Ly(x, y) are determined in the orthogonal direction (i.e. in the direction of the ordinate y) and recorded over the whole image region by means of variation of x with the chosen steps dx. This leads to a histogram HLy, again over the whole image region.

For the determination of the distance lengths Lx and Ly in x-direction and y-direction, the grid must be oriented in a defined manner, here with the y-axis corresponding to the longitudinal axis of the finger, in order to obtain defined histograms. The histograms HLx and HLy are completely different, as can be seen in Fig. 4.

Fig. 5 shows a classification of a histogram HLx, whereby for each class Hc = 1, 2, 3..., e.g. the mean values Hq, maximum values Hmax and standard deviations or variances Hvar are determined and used as characteristic values Ci of the identification code C. Histograms HLx and HLy can e.g. each be classified into 8, 12 or 16 classes, and then one, two or three values (Hq, Hmax, Hvar) can be determined for each class.
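A minimal sketch of the ridge-distance step, assuming a skeletonized binary image held as a NumPy array; all function and parameter names are illustrative rather than taken from the patent, and the reading of Hq, Hmax and Hvar as statistics of the histogram counts within each class is one plausible interpretation of Fig. 5, not a confirmed specification:

```python
# Sketch only: distances Lx between successive ridge crossings along horizontal scan
# rays, pooled into a histogram HLx and compressed into per-class values (cf. Figs. 1-5).
import numpy as np

def ridge_distances_x(skeleton, dy=4):
    """skeleton: 2D array, nonzero = ridge pixel. Collects all distances Lx between
    successive ridge crossings along every dy-th row (scan rays in x-direction)."""
    distances = []
    for y in range(0, skeleton.shape[0], dy):
        xs = np.flatnonzero(skeleton[y])              # ridge crossings on this scan ray
        if xs.size >= 2:
            distances.extend(np.diff(xs).tolist())    # Lx = spacing of successive ridges
    return np.asarray(distances, dtype=float)

def class_values(distances, n_bins=48, n_classes=12, l_max=30):
    """Histogram HLx with n_bins bins over [0, l_max); the bins are grouped into
    n_classes classes Hc and the mean Hq, maximum Hmax and variance Hvar of the bin
    counts are kept per class as candidate feature values Ci."""
    counts, _ = np.histogram(distances, bins=n_bins, range=(0, l_max))
    per_class = counts.reshape(n_classes, n_bins // n_classes)
    return np.concatenate([per_class.mean(1), per_class.max(1), per_class.var(1)])
```

The same two helpers, applied to the transposed image, would yield the corresponding values for HLy.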
Feature gradients G

In analogy to the feature distance lengths, the histogram of the gradients G, i.e. of the first direction derivatives, is also registered regularly over the whole image region and for this purpose again recorded e.g. per projection direction. The gradients are determined as the tangent to the skin line 5 in the point of intersection of the projection ray (e.g. in x-direction) with the skin line. As shown in Fig. 6, the gradients are registered via the projection direction and their histogram HGx is recorded for projection direction x over the whole image region; in analogy to this, the histogram HGy of the gradients in projection direction y is recorded, in order to regularly record the gradient directions G(x, y) over the whole image region.
This histogram of G can also be recorded in an image-covering manner by determination of a gradient value for each mesh of a grid (30, Fig. 16) in order to register a gradient grid. This is explained in more detail further below.

As an illustrating example, the histograms HG of the gradients for the three fingerprint images of Figs. 9, 10 and 11 were determined and are shown in Fig. 8. The image I1 of Fig. 9 shows the fingerprint of a person 1, and the images I2a and I2b show fingerprints of a second person, whereby Fig. 10 shows fingerprint I2a without errors and Fig. 11 shows the same finger of the same person in image I2b with errors or injuries 20 respectively. The histograms HG of these images were recorded over the whole image and are shown as a function of the gradient angle from 0° to 180°. Based on this, a classification into 16 classes, i.e. each class with an angle region of 180° / 16 = 11.25°, was carried out and the mean values Hq of each class were determined and are shown in Fig. 8. The result is a graph with 16 feature values for each image. As can be seen clearly, the graphs of the images I2a and I2b according to Figures 10 and 11 are nearly identical; i.e. with a correspondingly defined threshold value S both images are classified as identical. Thus the identification of this person 2 is still possible although the images I2a and I2b differ, firstly due to the errors 20 from injury and secondly due to not identically recorded image regions. As shown in Fig. 11, the recorded image region I2a at its edges does not correspond to the image region I2b (differing definition of the image region).

In the case of a known minutiae evaluation, many new pseudo-minutiae (ridge ends) would occur in image I2b in the injured areas 20, and thus identification would require much effort or even be impossible. The graph of image I1 of person 1 of Fig. 9 visibly differs considerably from the graphs of I2a and I2b of person 2. Thus the 16 feature values Ci provide a relatively good contribution to the security of recognition of the identification code C as partial code C1 of feature G. This example also illustrates that the inventive determination of histograms of the named features over a large image region, and from these the determination of compressed feature values, leads to an identification code which is only affected by local image errors to a relatively small degree, such that a correspondingly increased security of recognition is achieved. This is in contrast to known minutiae evaluation.

As previously explained in connection with the determination of the distances L, the feature gradients G can also be determined directly from digital gray scale value images, e.g. by means of the determination of a gradient value for each mesh of a grid 30 and thus of a gradient grid, which is explained in connection with Fig. 16.
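A sketch of the gradient feature in the spirit of the Fig. 8 example. The patent defines G as the tangent direction of a skin line where a projection ray crosses it; here a per-mesh orientation estimated from image derivatives (a structure-tensor average, a common substitute technique and not the patent's definition) stands in for it, and the 16 class means Hq are compared against a threshold S. Names and parameters are assumptions:

```python
# Sketch only: orientation histogram HG over 0..180 deg from per-mesh orientation
# estimates, compressed to 16 class means Hq and compared between two prints.
import numpy as np

def gradient_class_means(gray, mesh=5, n_classes=16):
    """One orientation value per mesh of a grid (registration manner c), pooled into
    a histogram over 0..180 deg and reduced to the n_classes class means Hq."""
    gy, gx = np.gradient(gray.astype(float))
    h, w = gray.shape
    angles = []
    for y in range(0, h - mesh + 1, mesh):
        for x in range(0, w - mesh + 1, mesh):
            gxm = gx[y:y + mesh, x:x + mesh]
            gym = gy[y:y + mesh, x:x + mesh]
            # dominant gradient direction of the mesh (structure-tensor average);
            # the tangent to the skin lines is perpendicular to it, hence +90 deg
            theta = 0.5 * np.arctan2(2.0 * (gxm * gym).sum(),
                                     (gxm ** 2 - gym ** 2).sum())
            angles.append((np.degrees(theta) + 90.0) % 180.0)
    counts, _ = np.histogram(angles, bins=n_classes * 4, range=(0, 180))
    return counts.reshape(n_classes, 4).mean(axis=1)      # 16 feature values Ci

def same_finger(hq_a, hq_b, threshold_s):
    """Two prints are taken as identical when their class-value graphs (cf. Fig. 8)
    differ by less than a chosen threshold S; S governs the FAR/FRR trade-off."""
    return float(np.abs(np.asarray(hq_a) - np.asarray(hq_b)).sum()) < threshold_s
```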
Feature curvatures K

The curvatures K according to Fig. 7 are determined in the intersection points of the skin lines or finger lines 5 with the projection directions x and y, as second direction derivatives of the finger lines. The curvature is e.g. determined as the inverse radius R of the approximation circle to the finger line 5 in the concerned intersection point. In an analogous manner to the feature determination described so far, the histograms of the curvatures are again determined over the whole image region in the two orthogonal directions x and y, i.e. HKx and HKy (or by determination of the K-value for each mesh 30). In order to exclude irrelevant, very small curvature radii which can be formed due to irregularities of a skeletonized image, choice rules can be applied, e.g. Rmin = 0.3 - 0.5 mm, such that the narrowest curvature radius in the center of the image is still included, but not even narrower curvatures, e.g. at the bifurcation B5.
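A sketch of one way to obtain curvature values for the histogram HK, assuming ordered point samples along a thinned finger line; the curvature is approximated as the inverse radius of the circle through three neighbouring samples, and radii below Rmin are discarded as described above. The sampling step and the pixel value of Rmin are illustrative assumptions:

```python
# Sketch only: K = 1/R from the approximation circle through three samples of a
# thinned finger line, with the Rmin selection rule applied.
import numpy as np

def circumradius(p1, p2, p3):
    """Radius R of the circle through three 2D points (np.inf for collinear points)."""
    a = np.linalg.norm(p2 - p3)
    b = np.linalg.norm(p1 - p3)
    c = np.linalg.norm(p1 - p2)
    area2 = abs((p2[0] - p1[0]) * (p3[1] - p1[1]) -
                (p2[1] - p1[1]) * (p3[0] - p1[0]))      # twice the triangle area
    return np.inf if area2 == 0.0 else (a * b * c) / (2.0 * area2)

def curvatures_along_line(points, step=5, r_min_px=12):
    """points: ordered (x, y) samples along one thinned finger line. Returns curvature
    values K = 1/R; radii narrower than r_min_px (Rmin, roughly 0.3-0.5 mm at typical
    resolutions) are skipped, straight pieces contribute K = 0."""
    pts = np.asarray(points, dtype=float)
    ks = []
    for i in range(step, len(pts) - step):
        r = circumradius(pts[i - step], pts[i], pts[i + step])
        if not np.isfinite(r):
            ks.append(0.0)
        elif r >= r_min_px:
            ks.append(1.0 / r)
    return np.asarray(ks)            # values for the curvature histogram HK
```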
As explained previously in connection with the determination of the distances L, the feature gradients G and possibly also the curvatures K can likewise be derived from digital gray scale value images or from binarized but not yet skeletonized images.

Inventively, only the independent features ridge distances L, gradients G and bifurcations B (skin line bifurcations) are used, of which at least two must be used for the code determination. The features L, G, K are not minutiae features; they are characterized by their histograms. The special feature bifurcations B is also chosen and included in such a manner that image errors, e.g. the absence of one single bifurcation, e.g. at the margin - which can be recorded once and not recorded another time - have no or no substantial effect.

It is important that the sought-after short identification code C is as independent as possible of individual image errors or of one single feature (e.g. of differing individual minutiae). For this reason the sole minutiae characteristic which is used as a feature can only be clearly defined bifurcations B, since these bifurcations B, in contrast to other minutiae, are relatively insensitive to image errors or to apparent mistakes due to disturbances of the finger lines, e.g. due to injury from cutting. Injuries due to cutting can create new ridge ends as apparent minutiae, but not bifurcations.

It is important that the clearly defined feature bifurcations B applied here is used in a different manner than in the conventional evaluation of minutiae. The conventional evaluation registers different kinds of minutiae and, of each minutia, its position, its kind and its orientation, whereby for the evaluation and code determination the relative positions of these different individual minutiae are brought in. Inventively, however, only one kind of minutiae is chosen, namely clearly defined bifurcations, which are additionally used in a completely different manner than up to now. This is explained in what follows: as a histogram or as individually selected bifurcations.

Feature bifurcations B

Fig. 12 shows a binarized and skeletonized representation of the gray scale value image of Fig. 10 which is used as an example for the determination of bifurcations. In this image the bifurcations Bi = B1 - B12 are determined, which results in a quantity of N = 12 bifurcations B. For this purpose, suited choice rules or definition criteria respectively are set up, such that small errors and other kinds of minutiae (e.g. islands) are not counted as bifurcations. The rules are e.g.: a registered bifurcation must have a minimum length of 0.5 - 1 mm for all three branches, and one branch must have a length of at least 1 - 1.5 mm. Furthermore a minimum distance between two bifurcations of e.g. 0.7 - 1 mm can be prescribed. According to such definition criteria, e.g. (B13), (B14) in Fig. 12 and (B7), (B8), (B9) in Fig. 15 are not counted as bifurcations.
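A sketch of how bifurcation candidates might be located in a skeletonized image; the three-ridge-neighbour test used here is a standard technique and is not claimed to be the patent's own procedure, and only the minimum-distance rule is shown (the branch-length rules would be applied on top of these candidates). Names and the pixel value of the distance are assumptions:

```python
# Sketch only: branch-point candidates in a one-pixel-wide skeleton (a ridge pixel
# with exactly three ridge neighbours), followed by the minimum-distance rule.
import numpy as np

def bifurcation_candidates(skeleton):
    """skeleton: 2D array, nonzero = one-pixel-wide ridge. Returns (y, x) of ridge
    pixels whose 8-neighbourhood contains exactly three ridge pixels."""
    s = (skeleton > 0).astype(np.uint8)
    nb = np.zeros_like(s)                                  # ridge-neighbour count
    nb[1:-1, 1:-1] = (s[:-2, :-2] + s[:-2, 1:-1] + s[:-2, 2:] +
                      s[1:-1, :-2]                + s[1:-1, 2:] +
                      s[2:, :-2]  + s[2:, 1:-1]   + s[2:, 2:])
    ys, xs = np.nonzero((s == 1) & (nb == 3))
    return list(zip(ys.tolist(), xs.tolist()))

def enforce_min_distance(points, d_min_px):
    """Discard candidates closer than d_min_px to an already accepted bifurcation
    (the minimum-distance rule of roughly 0.7-1 mm)."""
    accepted = []
    for p in points:
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= d_min_px ** 2
               for q in accepted):
            accepted.append(p)
    return accepted
```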
Bifurcations B can be used as defined individual or selected bifurcations for generating a code, as is explained in connection with Figs. 19 and 21, or, according to Figs. 13 and 14, histograms e.g. of the bifurcation distances as well as of adjacent triangular areas F between the bifurcations can be determined.

According to Fig. 13 the bifurcation distances LB are determined as follows: the distance LBi-j between each of the N bifurcations Bi and every other bifurcation Bj is determined. This results in N(N-1) bifurcation distances LBi-j. In the example with N = 12 this results in 132 distances, which are drawn up in a histogram: as frequency distribution HLB as a function of the distance LB.
In analogy to this, triangular areas Fi-j between the bifurcations Bi and Bj are determined according to Fig. 14. A first side of the triangle is defined, which reaches from each bifurcation Bi to each other bifurcation Bj, whereby the third point of the triangle is determined to be the bifurcation Bk (with k ≠ j) closest to the bifurcation Bi; e.g. starting from B1, the areas F1-2-3 (with i = 1, j = 2, k = 3) and F1-3-2 (with i = 1, j = 3, k = 2) as well as F1-4-2 up to F1-12-2 are obtained. With this choice rule it is guaranteed that only one area is registered per base line; e.g. in the case that the two closest bifurcations are both Bk candidates, only the one closest to Bj is used as Bk. Starting from B10 towards B1 the area is F10-1-12 (i = 10, j = 1, k = 12), i.e. starting with each side Bi-Bj as a base line this results in exactly one triangle, thus again a total of N(N-1) = 132 areas Fi-j, which again form a histogram with a frequency distribution as a function of the area F.

Fig. 15 shows a further example of the determination of bifurcations from another fingerprint image, whereby corresponding choice rules are applied, e.g. that the very closely situated bifurcations B7, B8, B9 are not counted as bifurcations for the evaluation, such that here only the bifurcations B1 to B6 remain and thus the quantity of bifurcation distances LB as well as that of the closest triangular areas F is each N(N-1) = 30. With other definition criteria, e.g. only B9 could be defined as a bifurcation but not B7 and B8.

Determination of feature values

From the histograms, simple feature values Ci which characterize the histogram are determined, e.g. mean values Hq, maximum values Hmax and variances Hvar. Furthermore, a histogram can also be classified into classes Hc, and e.g. the mean value and variance can be determined as characteristic values for each class (Fig. 5).
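A sketch of the bifurcation-derived histograms and their compression into feature values, given the list of bifurcation coordinates: all N(N-1) ordered distances LBi-j and, for every ordered pair (Bi, Bj), the triangle area with the bifurcation nearest to Bi as third corner (a simplified reading of the rule of Fig. 14, without the tie-breaking towards Bj). Names and the bin count are assumptions:

```python
# Sketch only: distance histogram HLB and area histogram HF from bifurcation
# coordinates, each compressed to mean, maximum and variance as feature values Ci.
import numpy as np

def bifurcation_histogram_features(bifs, n_bins=16):
    """bifs: array of shape (N, 2) with bifurcation coordinates, N >= 3."""
    b = np.asarray(bifs, dtype=float)
    n = len(b)
    assert n >= 3, "need at least three bifurcations for the triangle rule"
    dists, areas = [], []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dists.append(np.linalg.norm(b[i] - b[j]))               # LBi-j
            # third corner: the bifurcation closest to Bi, excluding Bi and Bj
            others = [m for m in range(n) if m not in (i, j)]
            k = min(others, key=lambda m: np.linalg.norm(b[i] - b[m]))
            v1, v2 = b[j] - b[i], b[k] - b[i]
            areas.append(0.5 * abs(v1[0] * v2[1] - v1[1] * v2[0]))  # area Fi-j-k
    feats = []
    for values in (np.asarray(dists), np.asarray(areas)):
        counts, _ = np.histogram(values, bins=n_bins)
        feats.extend([counts.mean(), counts.max(), counts.var()])   # Hq, Hmax, Hvar
    return np.asarray(feats)
```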
Selected bifurcation features

The feature bifurcations can additionally be used in a non-statistical manner, by using individual, clearly defined bifurcations with determined bifurcation features for the determination of code features Ci and of the identification code C. Hereby the bifurcations are, as illustrated in the examples of Figs. 19a and 19b, determined via their position, i.e. the local coordinates x1, y1 of the point P1 are determined as well as the orientation angle Wa and the opening angle Wb. A clearly defined opening angle Wb can e.g. be defined as the angle between the connecting lines of the points P1, P2 and P1, P3, whereby the points P2 and P3 are chosen at a suited distance r1 from the point P1, e.g. r1 = 0.5 - 1 mm. A point P4 with a minimal distance r2 of e.g. 1 to 1.5 mm from the point P1, on the other hand, forms a definition criterion for a bifurcation, so that small image errors are not wrongly counted as bifurcations. In the example of Fig. 19a a relatively small bifurcation angle Wb is shown, e.g. corresponding to the bifurcation B6 in Fig. 21, while the example of Fig. 19b shows a large opening angle Wb, e.g. similar to the one of the bifurcation B8 in Fig. 21.

Identified bifurcations can now be classified for the code determination, i.e. sorting criteria, such as e.g. a valence for the bifurcations, are introduced. A first classification criterion is the opening angle Wb, whereby large opening angles have higher valences; the classification criterion can e.g. be a valence in proportion to the opening angle Wb. A further sorting criterion is the distance RBi of the bifurcation to a central reference point, here e.g. the distance RBi to the curvature center Km (Fig. 21), whereby the central bifurcations have higher valences than those further away. As a further criterion of selection, bifurcations which are close to the margin, e.g. B2 in Fig. 21, i.e. near the outer contour A of the finger line image, can be left out. From the bifurcation features and the classification criteria, e.g. a bifurcation priority can be defined as BP = Wb/RB. With this, all bifurcations can be sorted in descending order according to BP. For generating a short code length, the first entries of the list sorted according to the bifurcation priority can be used. A possible order for Fig. 21 can e.g. be the following: B8, B6, B5, B7, B4.

By means of a combination of the features gradients and bifurcations, especially concentrated codes with a relatively high security of recognition can be created. An identification code C which consists on the one hand of gradient features for each segment IS of a gradient grid and on the other hand of classified bifurcations B is particularly advantageous. For this purpose a central reference point and the orientation of the finger line image are required.
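A sketch of the classification by bifurcation priority BP = Wb/RB described above: bifurcations with a large opening angle Wb and a small distance RB to the central reference point come first, and the first entries of the sorted list yield a short partial code. The data-structure fields and the encoding of position and angles into code values are illustrative assumptions:

```python
# Sketch only: sort selected bifurcations by BP = Wb / RB and keep the top entries.
from dataclasses import dataclass
import math

@dataclass
class Bifurcation:
    x: float          # position P1 (x1, y1)
    y: float
    wa: float         # orientation angle Wa in degrees
    wb: float         # opening angle Wb in degrees

def priority(b, ref_xy):
    """BP = Wb / RB, with RB the distance of the bifurcation to the central
    reference point (e.g. the curvature radius gravity center Km)."""
    rb = math.hypot(b.x - ref_xy[0], b.y - ref_xy[1])
    return b.wb / max(rb, 1e-6)          # guard against a bifurcation exactly at Km

def selected_bifurcation_code(bifurcations, ref_xy, n_keep=5):
    """Sort by descending BP and keep the n_keep highest-priority bifurcations;
    their positions (relative to the reference point) and angles become feature
    values Ci of a partial code."""
    ranked = sorted(bifurcations, key=lambda b: priority(b, ref_xy), reverse=True)
    code = []
    for b in ranked[:n_keep]:
        code.extend([b.x - ref_xy[0], b.y - ref_xy[1], b.wa, b.wb])
    return code
```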
Image pre-processing

The features L and G can be determined directly from the digital gray scale value image without further image pre-processing. The determination of the feature B, and also of K, is mostly carried out by means of a line thinning as image pre-processing, e.g. a binarization and skeletonization; the other features L, G, K are then also determined from the skeletonized image. In an advantageous variant the bifurcations B can, however, also be determined directly from the digital gray scale value image by means of neural networks.

The features can be registered in the following manners, as illustrated in Fig. 16 with segmentation and mesh classification:

a) in the whole image region (also for the features L, G, K, B)

b) in each of a few relatively large segments, e.g. with a division into 3 x 5 to 5 x 7 segments IS, whereby histograms are determined for each segment IS and feature values are derived from them
c) by covering an image I with a grid 30, whereby for each mesh only one value of the feature is determined; the size of the mesh is chosen suitably (e.g. 5 x 5 pixels).

With these kinds of registration a, b, c the following features are registered:

a (whole image region): L, G, K, B
b (per segment IS): G, K, (B)
c (per grid mesh): G, K

When comparing the codes, attention must be paid to the fact that the division into segments IS is dependent on the translation and rotation of the recorded images. For the comparison of codes this must be taken into consideration.

The feature curvatures K is independent of rotation; the same is valid for the bifurcation distances LB and bifurcation areas F derived from the bifurcations B, and for the relative positions of selected bifurcations. The feature gradients G requires a determination of the image orientation axis y. The feature gradients G, on the other hand, is independent of image dilatations (magnification and diminution of the image).

Because at least two features (L, G, K, B) are contained in the identification code C = (C1, C2) as partial codes (C1, C2), these partial codes can also be used correspondingly differently for the code evaluation, i.e. corresponding to the dependence on the image definition and to the dependencies of the concerned feature and of the histogram evaluation or the selected bifurcation features respectively.
Partial codes can be formed at choice from any feature values Ci.

Image section

Fig. 17 illustrates the dependence on the image definition, whereby two images were recorded of the same finger, first an image I3 and then an image I4, and these images I3, I4 show different regions and possibly also different orientations of the longitudinal axis y of the fingerprint. In order to always be able to compare similar image regions, it is also possible to cut a central portion out of a given fingerprint image, which portion can then be used as the image section IA for the determination of features and the generation of a code. Hereby regions around the center Z of a fingerprint and its surroundings ZU are advantageously used for the registration of features and the code generation, because the information content is higher here than in the marginal regions. With a well defined position and orientation of the images, by means of corresponding image recording devices, e.g. a guidance for two fingers, segmented images can also be compared at a higher security.

Orientation and central reference point

For the determination of the orientation and of a central reference point of a finger line image, or for the definition of a grid with an origin of coordinates, e.g. a so-called core point can be defined in a known manner. This, however, involves a large effort, and often no clear core point can be determined. A simpler method consists in determining a curvature radius gravity center Km as the central reference point from the feature curvatures K. For this purpose the centers of the approximation circles (see R in Fig. 7) to the curvatures are determined, and of these regularly recorded circle centers the gravity center Km with coordinates xm, ym is determined. A central reference point can, however, also first be determined by approximation, e.g. as the gravity center of the image area with the outer contour A (in Fig. 21). Similarly, the recording of a finger image in a defined manner by means of guiding means 32, as described in connection with Fig. 20, can be used. A further possibility is to determine a central reference point from the variance Hvar of the gradient distribution in image segments IS, whereby the central reference point is determined by the segment which has a maximum of variance. For this purpose the image segmentation can also be varied.

With a poorly defined position of an image to be recorded, the features, manners of determination and code determination must be chosen such that they are relatively independent of position, e.g. the features G, K according to determination manner a, whereby, however, in most cases the direction should be determined for support. For the code comparison the image can also be rotated over a small angle region or shifted in x- and y-direction.
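A sketch of one of the options described above for a central reference point: divide the image into segments IS, evaluate the variance of the orientation distribution per segment and take the center of the segment with the maximum variance. The per-pixel angle estimate and the segment counts are simplifying assumptions, not the patent's prescription:

```python
# Sketch only: central reference point as the center of the segment IS whose
# orientation histogram has the largest variance Hvar.
import numpy as np

def reference_point_by_variance(gray, rows=5, cols=5):
    """Returns (x, y) of the center of the segment with maximum orientation variance."""
    gy, gx = np.gradient(gray.astype(float))
    h, w = gray.shape
    best, best_var = (w // 2, h // 2), -1.0
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            ang = (np.degrees(np.arctan2(gy[y0:y1, x0:x1],
                                         gx[y0:y1, x0:x1])) % 180.0).ravel()
            counts, _ = np.histogram(ang, bins=16, range=(0, 180))
            if counts.var() > best_var:
                best_var = counts.var()
                best = ((x0 + x1) // 2, (y0 + y1) // 2)   # segment center as reference
    return best
```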
As realization variants of the inventive method, depending on the image definition and the objective to be achieved, e.g. the following combinations of the features for the determination of an identification code C can be used:

1. the features ridge distances L, gradients G and bifurcations B; these can e.g. also be determined from gray scale value images and be used especially for simpler, shorter codes (L and G), e.g. also with a classification of histograms.

From skeletonized images the following features can be used:

2. the features gradients G and curvatures K
3. the features gradients G and bifurcations B
4. the features curvatures K and bifurcations B, which are both invariant to rotation
5. the features gradients G, curvatures K and bifurcations B, this especially for a higher precision, whereby corresponding to the higher precision longer codes are formed (also with selected bifurcations).

After a suitable choice of the combined features, the precision with which the features and their histograms are determined, as well as the determination of the feature values Ci from them, can be chosen such that finally an identification code C with a code length meeting the desired demands on precision is formed, i.e. the code length can be enlarged as far as demanded, by a narrower segmentation and classification and by the determination of more feature values Ci.

The identification code C can not only be determined sequentially but also in an iterative manner.

Code comparison

With these inventive, relatively compact identification codes C (with a minor demand of memory), all in fact known identification, verification and authentication tasks can also be carried out in a local, simple and rational manner. Hereby, for the comparison of two codes Ca and Cb, e.g. of a code Ca of a recent fingerprint image with a reference code Cb from an allocated memory (which can be a databank or also an identification medium IM), a code difference D = Ca - Cb is determined by means of mathematical methods and is compared to a pre-determinable threshold value (an acceptance value) S. In the case that D < S, the two codes Ca and Cb, and therefore the appertaining persons, are found to be identical.
Hereby individual threshold values specific to persons can be given, e.g. dependent on how well the identification code C of a certain person can be determined and, on the other hand, on how important this identification task is for this person. In a similar manner a threshold value S can be predetermined corresponding to the concerned application, i.e. according to which security of recognition is desired, i.e. which false acceptance rate FAR and which false rejection rate FRR is permissible.

For the determination of this correspondence, e.g. the Euclidean distance D of the feature vectors C with a predetermined correspondence threshold S can be determined. This means, for a code formed from the feature values Ci, that a fingerprint 1 is identical to a fingerprint 2 when:

D = Σi (C1i - C2i)² < S

As an alternative, the identification codes can also be correlated according to the following method: a correlation Kor between the features ai, i = 1,...,N, of a first finger and the features bi, i = 1,...,N, of a second finger is formed according to the formula

Kor = Σi (ai - aq)(bi - bq) / (Σi (ai - aq)² · Σi (bi - bq)²)^(1/2)

whereby aq is the mean value of the values ai and bq is the mean value of the values bi. For ai = bi the fingers are identical, i.e. the correlation is 1. At Kor = 0 the features are not correlated, at Kor = -1 they are anti-correlated.

Further possible methods of comparison are, depending on the type of the code C, regression analysis or moment analysis. Furthermore, the different features or feature values Ci respectively, and the corresponding partial codes C1, C2 of an identification code C = (C1, C2), can also be treated differently.
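A direct transcription of the two comparison rules above as a sketch; the threshold value S is an application parameter chosen per person and per task (FAR/FRR), not a value given by the patent:

```python
# Sketch only: squared-difference comparison against a threshold S and the
# normalized correlation Kor of two feature vectors.
import numpy as np

def is_same_person(code_a, code_b, threshold_s):
    """D = sum_i (C1i - C2i)^2 < S  ->  the two codes are taken to belong to the
    same person."""
    a, b = np.asarray(code_a, float), np.asarray(code_b, float)
    return float(np.sum((a - b) ** 2)) < threshold_s

def correlation(code_a, code_b):
    """Kor = sum_i (ai - aq)(bi - bq) / sqrt(sum_i (ai - aq)^2 * sum_i (bi - bq)^2);
    1 for identical, 0 for uncorrelated, -1 for anti-correlated codes."""
    a, b = np.asarray(code_a, float), np.asarray(code_b, float)
    da, db = a - a.mean(), b - b.mean()
    return float(np.sum(da * db) / np.sqrt(np.sum(da ** 2) * np.sum(db ** 2)))
```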
Fig. 18 diagrammatically illustrates the inventive method for the determination of a relatively short identification code C. The choice 52 of at least two of the orthogonal features L, G, K, B, as well as the choice of the following method steps, is based on the one hand on the image definition 51 and on the other hand on the task 60, i.e. on the required security of recognition of the desired applications and, from this, on the required false acceptance rate FAR and the required false rejection rate FRR. Depending on the image definition 51 (i.e. the quality of orientation, section, image recording and image quality), correspondingly less sensitive features and kinds of registration are chosen. Likewise, a possible image pre-processing 53 can be carried out for a direct feature determination from the gray scale image, from a binarized image or from a thinned, skeletonized line image. In a next step 54, a possible image orientation, a central reference point or an origin of coordinates, e.g. Km, as well as the choice of an image section IA or a segmentation IS respectively can follow. From this, histograms H with a correspondingly fine screening or quantity of data respectively are determined in step 55. In addition to the histograms, individually defined feature bifurcations B with position, orientation angle Wa and opening angle Wb can be determined in step 56, and in step 57 a classification of bifurcations, a selection or valences according to a bifurcation priority BP respectively can be carried out. In step 58 the determination of the feature values Ci is carried out, and in step 59 the composition of the identification code C, i.e. a correspondingly longer or shorter code length is composed until the desired security of recognition FAR and FRR according to step 60 is achieved.

Corresponding to the composition of the code C, the methods of evaluation and the threshold values S for the identification evaluation can be chosen. The identification evaluation (comparison) Ca - Cb = D < S is carried out in step 62. Thus the inventive method is universally applicable for a wide range of applications and is also optimally matchable to the desired tasks concerning computing effort and length of code.

Fig. 20 shows a device for carrying out the method with an electronic image recording device 31 and a station with evaluation electronics 34 and with evaluation algorithms 35 for the determination of feature values Ci and, from these, of the identification code C or Ca respectively, i.e. of the actual identification code corresponding to the image recording of the fingers 1, 2 to be recorded, and for carrying out a comparison of codes Ca - Cb between a saved reference code Cb and the actual code Ca. This code comparison can also be carried out in a corresponding reading station WR. Corresponding to the code comparison, i.e. to the verification of the actual person, access functions for the control of corresponding function stations can be executed. The image recording device 31 here comprises guiding and positioning means 32 with limit stops for the orientation and positioning of one or of two adjacent fingers 1, 2 to be recorded. Additionally, a lateral limit stop for the positioning of the longitudinal axis of the finger and a front limit stop for the positioning of the finger image, and thus for the determination of a central reference point or a center of the finger image respectively, can be provided. This kind of two-finger recording device is e.g. disclosed in PCT CH97/00241 = WO 98/09246. With an inventive device a code comparison Ca - Cb can be carried out locally, such that no central databank with reference codes Cb is required. The inventive device can also comprise corresponding identification media IM for authorized persons, whereby the reference code Cb of the person and further information is contained in the identification medium IM or in its memory 43, and whereby an encoded communication 40 with a corresponding reading station WR can be carried out. Hereby the reference code Cb of the authorized person is only stored on the identification medium and not in the verification station or in a databank respectively. This makes a better data protection possible.
In a further variant, a code comparison Ca - Cb, i.e. of the actual code Ca with the reference code Cb of the person, can also be carried out in the identification medium itself by means of the processor 42. In this case the actual code Ca must merely be transmitted from the reading station WR to the identification medium, while no code must be transmitted from the identification medium. In another variant the device can also be connected with a master system 47 with a main processor and with a databank 48 for reference codes Cb. The identification medium IM can contain access and function authorizations for further corresponding function stations 46 of a facility.

Pre-selected databank search

In applications in which a fast databank search is required for the positive identification of individual persons from an inhabitant databank with a large quantity of reference codes Cb, solely by means of biometric recognition features, the identification can be carried out in steps. Hereby the comparison is not carried out with the complete biometric identification code Ca, but only with partial codes or with individual feature values Ci respectively, such that the search can be carried out considerably faster. In a first step, e.g. only a partial code C1 is compared, and the remaining reduced quantity is then compared to a second partial code C2 in a second step etc. This continues up to the complete code evaluation with a very small remaining quantity from the databank. This kind of search is considerably more rapid than a search with the complete identification code C. Advantageously the search for primary biometric data is classified, i.e. e.g. gradients and selected bifurcation features are used in steps for this purpose. A further method consists in the forming of gradient segments according to the method of quadrant subdivision (quad trees). Hereby the image is divided into four segments, from each of which a gradient histogram is formed. In the next step each quadrant is again divided into four quadrants etc., until finally the gradient histogram converges to the main line tangent in the considered segment.
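A sketch of the stepwise database search just described: the candidate set is first reduced with a cheap partial code C1, then with C2, and only the small remainder is compared with the complete code. The use of a squared-difference measure per step and the per-step thresholds are assumptions for illustration:

```python
# Sketch only: stepwise pre-selection with partial codes, then a full comparison.
import numpy as np

def stepwise_search(query_parts, database, step_thresholds, final_threshold):
    """query_parts: list of partial codes [C1, C2, ...] of the live fingerprint.
    database: list of (person_id, [C1, C2, ...]) reference entries.
    Returns the ids whose complete code matches within final_threshold."""
    candidates = database
    for step, (q_part, s) in enumerate(zip(query_parts, step_thresholds)):
        q = np.asarray(q_part, float)
        candidates = [                                  # keep only close partial codes
            (pid, parts) for pid, parts in candidates
            if np.sum((np.asarray(parts[step], float) - q) ** 2) < s
        ]
    full_q = np.concatenate([np.asarray(p, float) for p in query_parts])
    return [                                            # final full-code comparison
        pid for pid, parts in candidates
        if np.sum((np.concatenate([np.asarray(p, float) for p in parts]) - full_q) ** 2)
        < final_threshold
    ]
```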
The following terms are used in the description and in the figures:

L ridge distance
G gradients
K curvatures
B bifurcations
LB bifurcation distances
F bifurcation areas
H frequency distributions, histograms H(L), H(G), H(K), H(B: LB, F)
Hc classification of H
Hq mean values
Hmax maximum values
Hvar variance of H
De gray scale value
I1, I2 fingerprint images
IA image sections
IS image segments
C identification code (feature vector Ci)
Ca actual code
Cb reference code
C1, C2 partial codes
Ci feature values
D code difference
S threshold value, acceptance value
Z center
ZU surrounding
x abscissa
y ordinate, longitudinal axis of finger
dx, dy screening
FAR false acceptance rate
FRR false rejection rate
IM identification medium, data carrier
WR reading station
BP bifurcation priority
P1 - P4 definition of Wb
r1, r2 distances
Wa orientation angle
Wb opening angle of B
P1 (x1, y1) local coordinates of B
Km curvature radius gravity center
xm, ym coordinates of Km
RBi distance from central reference point to Bi
A outer contour of finger image
1, 2 finger
5 finger lines, skin lines
10 gray scale value
20 image error, injury
30 grid
31 image recording device
32 guiding and positioning means
34 evaluation electronics, evaluation station
35 evaluation algorithms
40 encoded communication
42 processor
43 memory
46 access function (function stations)
47 master system, main processor
48 data bank
51 image definition
52 choice of features L, G, K, B
53 image pre-processing, skeletonization
54 image section, segmentation
55 forming histograms
56 determining bifurcations
57 classification, choice of B
58 determining feature values Ci
59 determining identification code C
60 applications, demands (FAR, FRR)
61 evaluation method, choice of threshold value S
62 identification comparison Ca - Cb

Claims (32)

1. Method for determination of an identification code from fingerprint images or from digital gray scale value images, characterized in that at least two of the following independent features: ridge distances L, gradients G, bifurcations B are registered and that from at least one of the features histograms H are determined, from which characteristic values (mean value, variance, maximum) are determined as compressed feature values Ci which as vector components form the identification code C.
2. Method according to claim 1, characterized in that the features gradients G are used.
3. Method according to claim 1 or 2, characterized in that the histograms H are determined in at least two orthogonal directions (x, y).
4. Method according to one of the preceding claims, characterized in that an image pre-processing is carried out, e.g. by means of binarization and skeletonization.
5. Method according to one of the preceding claims, characterized in that a classification Hc of a histogram is carried out, whereby at least one characteristic value for each class is determined as a feature value Ci.
6. Method according to one of the preceding claims, characterized in that the recorded image I is subdivided into several segments IS and that the histograms H of features (L, G, K, B) are determined for each segment IS.
7. Method according to one of the preceding claims, characterized in that the identification code C is derived from an image section IA which comprises the center Z of the fingerprint image and its surrounding ZU.
8. Method according to one of the preceding claims, characterized in that the length of the identification code C is chosen such that the demand for precision or the recognition security respectively (FAR, FRR) is fulfilled for a desired application.
9. Method according to one of the preceding claims, characterized in that partial codes C1, C2 of the features are determined differently and/or used differently for the code evaluation.
10. Method according to one of the preceding claims, characterized in that defined individual bifurcations B are used as features.
11. Method according to claim 10, characterized in that the features bifurcations B are determined directly from the digital gray scale image by means of neural networks.
12. Method according to claim 10, characterized in that the features bifurcations B are classified according to their opening angle Wb and/or according to their position in the fingerprint.
13. Method according to claim 10, characterized in that for classification a bifurcation priority BP is defined, whereby the bifurcation priority increases as the opening angle Wb increases and the distance RBi to the image center decreases.
14. Method according to claim 10, characterized in that the bifurcation priority BP is defined as BP = Wb/RBi.
15. Method according to one of the preceding claims, characterized in that a central reference point is defined as origin of coordinates of the fingerprint image.
16. Method according to claim 15, characterized in that a curvature radius gravity center Km is defined as a central reference point.
17. Method according to claim 15, characterized in that the variance Hvar of the gradient distribution in image segments IS is determined and the central reference point is determined by the segment which has a maximum variance.
18. Method according to one of the preceding claims, characterized in that by means of guiding and positioning means (32) the longitudinal axis (y) of the finger and a central reference point are determined at least approximately when recording the image.
19. Method according to one of the preceding claims, characterized in that the features gradients G and bifurcations B are used.
20. Method according to claim 19, characterized in that the identification code C is formed from gradient characteristics for each segment IS of a gradient grid and of classified bifurcations B.
21. Method according to one of the preceding claims, characterized in that partial codes are formed and that a comparison Ca - Cb with a large quantity of reference codes Cb of a data bank (48) is carried out in steps corresponding to the partial codes.
22. Method according to claim 21, characterized in that the evaluation in steps is carried out by means of gradient segments and the method of quadrant subdivision (quad trees).
23. Method according to one of the preceding claims, characterized in that an actual identification code Ca is recorded and determined and compared locally, by means of a corresponding reading station WR, to a personal reference code Cb stored in a corresponding identification medium IM.
24. Method according to one of the preceding claims, characterized in that the identification code C is determined in a sequential and/or iterative manner.
25. Device for carrying out the method according to one of the preceding claims, characterized by an electronic image recording device (31), a station with evaluation electronics (34) and with evaluation algorithms (35) for determination of an identification code C, Ca and for carrying out a code comparison Ca - Cb with a stored reference code Cb, as well as with an access function (46) for the control of corresponding function stations.
26. Device according to claim 25, characterized by guiding and positioning means (32) with a lateral and a front limit stop for orientation and positioning of one or two adjacent fingers (1, 2) to be recorded.
27. Device according to claim 25 or 26, characterized in that the code comparison Ca - Cb is carried out locally.
28. Device according to one of claims 25 to 27, characterized by a corresponding identification medium IM of an authorized person which contains the reference code Cb of the person and further information for encoded communication (40) with a corresponding reading station WR.
29. Device according to claim 28, characterized in that the actual code Ca is transmitted to the identification medium IM and that the code comparison with the reference code Cb is carried out by means of the processor (42) of the identification medium IM.
30. Device according to claim 25, characterized by a master system (47) with a main processor and with a data bank (48) for reference codes Cb and with local function stations (46).
31. Identification medium IM for an authorized person with a stored reference code Cb of the person, which code is determined with a method according to one of claims 1 to 24, for encoded communication (40) with a corresponding reading station WR.
32. Identification medium according to claim 31 with access and function authorizations for corresponding function stations (46).
AU82039/98A 1997-07-18 1998-07-16 Method for determining an identification code from fingerprint images Expired - Fee Related AU761123B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CH1768/97 1997-07-18
CH176897 1997-07-18
PCT/CH1998/000312 WO1999004358A1 (en) 1997-07-18 1998-07-16 Method for determining an identification code from fingerprint images

Publications (2)

Publication Number Publication Date
AU8203998A true AU8203998A (en) 1999-02-10
AU761123B2 AU761123B2 (en) 2003-05-29

Family

ID=4218141

Family Applications (1)

Application Number Title Priority Date Filing Date
AU82039/98A Expired - Fee Related AU761123B2 (en) 1997-07-18 1998-07-16 Method for determining an identification code from fingerprint images

Country Status (10)

Country Link
EP (1) EP0996924B1 (en)
JP (1) JP2001510920A (en)
KR (1) KR20010021988A (en)
CN (1) CN1271446A (en)
AT (1) ATE237163T1 (en)
AU (1) AU761123B2 (en)
BR (1) BR9811511A (en)
CA (1) CA2296353A1 (en)
DE (1) DE59807883D1 (en)
WO (1) WO1999004358A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2882499A (en) * 1998-02-27 1999-09-15 Alan V. Bray Time temperature indicator
KR100374695B1 (en) * 2000-07-10 2003-03-03 주식회사 디토정보기술 Automatic Fingerprint Identification System using Direct Ridge Extraction
KR100430054B1 (en) * 2001-05-25 2004-05-03 주식회사 씨크롭 Method for combining fingerprint by digital linear image sensor
SE524727C2 (en) 2002-05-07 2004-09-21 Precise Biometrics Ab Generation of frequency codes in conjunction with transformation and compression of fingerprint data
KR100456463B1 (en) * 2002-11-01 2004-11-10 한국전자통신연구원 A Hybrid Fingerprint Verification Method using Global and Local Features
KR100601453B1 (en) * 2004-03-10 2006-07-14 엘지전자 주식회사 Fingerprint recognition method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3668633A (en) * 1970-01-29 1972-06-06 Dactylog Co Orientation and linear scan device for use in an apparatus for individual recognition
DE68928154T2 (en) * 1988-04-23 1997-10-30 Nippon Electric Co Fingerprint processing system, suitable for determining the core of a fingerprint image by means of curvature parameters
JPH0465776A (en) * 1990-07-05 1992-03-02 Ezel Inc Method for comparing image
JPH04332089A (en) * 1991-05-07 1992-11-19 Takayama:Kk Method for registering finger print data
AU2779092A (en) * 1991-10-07 1993-05-03 Cogent Systems, Inc. Method and system for detecting features of fingerprint in gray level image
JP2725599B2 (en) * 1994-06-21 1998-03-11 日本電気株式会社 Ridge direction extraction device
CA2236048A1 (en) 1996-08-27 1998-03-05 Kaba Schliesssysteme Ag Method and apparatus for the identification of not-enrolled fingerprints

Also Published As

Publication number Publication date
CA2296353A1 (en) 1999-01-28
JP2001510920A (en) 2001-08-07
CN1271446A (en) 2000-10-25
AU761123B2 (en) 2003-05-29
EP0996924B1 (en) 2003-04-09
ATE237163T1 (en) 2003-04-15
BR9811511A (en) 2000-09-12
KR20010021988A (en) 2001-03-15
DE59807883D1 (en) 2003-05-15
EP0996924A1 (en) 2000-05-03
WO1999004358A1 (en) 1999-01-28

Similar Documents

Publication Publication Date Title
US5745598A (en) Statistics based segmentation and parameterization method for dynamic processing, identification, and verification of binary contour image
US7162058B2 (en) Authentication system by fingerprint
US5974163A (en) Fingerprint classification system
US7151846B1 (en) Apparatus and method for matching fingerprint
EP0780781B1 (en) Feature extraction for fingerprint recognition
EP0650137A2 (en) An apparatus for fingerprint verification
US20090169072A1 (en) Method and system for comparing prints using a reconstructed direction image
EP0466161A2 (en) Image positioning method
WO2015196084A1 (en) A self-learning system and methods for automatic document recognition, authentication, and information extraction
US7079670B2 (en) Apparatus and method for authenticating a user by employing feature points of a fingerprint image of the user
US10002285B2 (en) Fast, high-accuracy, large-scale fingerprint verification system
EP0612035B1 (en) Neural net for the comparison of image pattern features
KR100299858B1 (en) fingerprint matching method
CN108665603B (en) Method and device for identifying currency type of paper money and electronic equipment
AU761123B2 (en) Method for determining an identification code from fingerprint images
Sanchez-Reillo et al. Fingerprint verification using smart cards for access control systems
US6785408B1 (en) Fingerprint segment area processing method and associated apparatus
JP3494388B2 (en) Fingerprint matching method and fingerprint matching device
JPS59151265A (en) Fingerprint collating method
JP4103056B2 (en) Method of using feature quantity for image identification and recording medium storing the program
Srinivasu et al. Aadhaar card voting system
MXPA00000492A (en) Method for determining an identification code from fingerprint images
JP2600680B2 (en) Personal verification device
EP0466039A2 (en) Image comparison method
Dass Classification of Fingerprints

Legal Events

Date Code Title Description
MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application