GB2278478A - Fingerprint classification system - Google Patents

Fingerprint classification system

Info

Publication number
GB2278478A
Authority
GB
United Kingdom
Prior art keywords
classification result
classification
neural network
skin print
identifier
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9310995A
Other versions
GB2278478B (en)
GB9310995D0 (en)
Inventor
Christopher Robert Gent
Colin Paul Sheppard
Steven Bryant
Brian David Stubbington
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
EDS SCICON DEFENCE Ltd
HP Enterprise Services LLC
Original Assignee
EDS SCICON DEFENCE Ltd
Electronic Data Systems LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by EDS SCICON DEFENCE Ltd, Electronic Data Systems LLC filed Critical EDS SCICON DEFENCE Ltd
Priority to GB9310995A priority Critical patent/GB2278478B/en
Publication of GB9310995D0 publication Critical patent/GB9310995D0/en
Publication of GB2278478A publication Critical patent/GB2278478A/en
Application granted granted Critical
Publication of GB2278478B publication Critical patent/GB2278478B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 - Individual registration on entry or exit
    • G07C9/20 - Individual registration on entry or exit involving the use of a pass
    • G07C9/22 - Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G07C9/25 - Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G07C9/257 - Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition electronically
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1365 - Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

A skin print sensor 1 produces electronic signals representative of a skin print 7 and electronic processing means 3 processes the signals to produce a classification result.

Description

ELECTRONIC CLASSIFICATION SYSTEM

The present invention relates to an apparatus and method responsive to a skin imprint, especially of human skin, and in particular a fingerprint or thumbprint.
It is well known that fingerprints are highly individual-specific and are commonly used as a means of identification, for example in forensic investigations.
This has led the applicants to investigate the possibilities for electronic skin print recognition.
Skin print recognition could in principle be used not only in the aforementioned forensic applications but also, for example, as a means of identifying individuals to permit or deny them access to a secure premises or to confirm lawful ownership of a credit card, debit card or the like.
Although in practice it is envisaged that such skin print recognition systems embodying the present invention will be adapted to respond to fingerprints or thumbprints, they could equally be adapted to be responsive to other skin prints such as palm-prints or the print of a bare foot (sole).
Thus, the present invention provides a skin print classification apparatus comprising a skin print sensor for producing electronic signals representing a skin print sensed by the sensor and electronic processing means for processing the signals to produce a classification result.
The present invention also provides a method of skin print classification comprising using a skin print sensor to produce electronic signals representing a skin print sensed by the sensor and electronically processing the signals to produce a classification result.
It is a principle of the present invention that the classification effected by the present invention need not be an absolute recognition of a particular individual. It is sufficient to determine that the skin print belongs to a predetermined classification group. One or more levels of classification may be used as appropriate. Of course, in the limit, using many sub-levels of classification may be tantamount to performing individual print recognition.
In forensic fingerprint analysis, a skilled practitioner will first classify a fingerprint into one of a number of predetermined groups. The print need then only be compared with existing fingerprint records already classified within the relevant group. This considerably reduces the number of fingerprint records which have to be searched.
However, the manual classification is laborious and the practitioner may make mistakes. Moreover, manual classification is not practicable for automatic credit card/debit card or security control or other such applications.
The invention is not limited to classifying finger or thumb prints since other skin prints (palm, foot, etc.) may be equally susceptible of classification.
Preferably a memory means is used for storing an initial classification result and an identifier for indicating the identity of the relevant individual.
Later, these stored parameters can be used to verify that a user is authorised to perform a certain function (e.g. a financial transaction). The initial recording may be effected using a different physical apparatus from that carrying out the verification at a later time.
A preferred method of performing the initial classification is to use a neural network configured to perform a cluster analysis. Subsequent comparison of a print obtained from a user with the pre-stored classifications is preferably effected using a pattern matching algorithm. A number of different forms of pattern matching algorithm are possible, but a neural network may be configured for this function also. A preferred form of neural network for the purposes of the present invention is a Kohonen neural network.
A neural network is essentially an electronic or software equivalent of the network of neurons in the human brain. It consists of "artificial neurons" which receive various inputs and apply weighting factors to each before combining them into a function to produce a required output result. A multi-layer neural network consists of at least an input layer and an output layer of artificial neurons, separated by hidden layers. The neural network computes errors and uses these continuously to adjust the weighting factors and/or the operative functions to minimise the errors and optimise the result. The theory and implementation of neural networks are well documented, for example in "Neural Computing: An Introduction", R. Beale and T. Jackson, Adam Hilger, 1990.
A neural net can receive all available inputs and through its internal "learning process", apply appropriate weighting factors so that it takes as much or as little (including zero) account of each to find an optimum estimate of the parameter to be predicted.
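By way of illustration only, the following minimal sketch (Python; not part of the patent text, and the tanh activation, learning rate and input size are assumptions made for the example) shows a single artificial neuron applying weighting factors to its inputs and adjusting those weights to reduce the error between its output and a target value:

    import numpy as np

    def neuron_output(inputs, weights, bias):
        # Weighted combination of the inputs, passed through a squashing function.
        return np.tanh(np.dot(weights, inputs) + bias)

    def train_step(inputs, target, weights, bias, learning_rate=0.05):
        # Compare the output with the target and nudge the weights so as to
        # reduce the error, mirroring the continuous adjustment described above.
        output = neuron_output(inputs, weights, bias)
        error = target - output
        gradient = error * (1.0 - output ** 2)   # derivative of tanh
        weights = weights + learning_rate * gradient * inputs
        bias = bias + learning_rate * gradient
        return weights, bias, error

    rng = np.random.default_rng(0)
    weights, bias = rng.normal(size=3), 0.0
    inputs, target = np.array([0.2, -0.5, 0.9]), 0.8
    for _ in range(200):
        weights, bias, error = train_step(inputs, target, weights, bias)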
Any appropriate sensor may be used, for example a fingerprint camera. The latter usually consists of a screen against which a finger or thumb is pressed. A video camera (solid state or otherwise) behind the screen produces video signals corresponding to an image of the print. If necessary, electronic contrast enhancement may also be employed. However, other sensors such as thermal transducers with spatial resolution (e.g. comprising an array of thermally sensitive "pixels") may also be used.
The present invention will now be explained in more detail by the following description of a preferred embodiment and with reference to the accompanying drawing, in which Figure 1 shows a basic schematic diagram for verification of electronic financial transactions using fingerprint classification.
Figure 1 shows a device for the verification of electronic financial transactions using a plastic card, such as payment for goods at an EPOS (Electronic Point of Sale) terminal or the withdrawal of funds at an ATM (Automatic Teller Machine). The purpose of the device is to verify that the person carrying out an electronic financial transaction with a plastic card is the genuine owner of that card and to issue a warning should that person not appear to be the owner of the card. The term "plastic card" is used to denote any card or other information-holding device issued to individuals for the purposes of performing electronic financial transactions. The term "plastic card reader" denotes any device that can read the information held on a plastic card and transfer that information to a processor or other electronic device.
The device consists of: a fingerprint sensing device 1 such as a fingerprint camera; a processor 3; and either a plastic card reader 5 or an interface into an existing plastic card reader.
The device will operate with plastic cards which, along with the normal information held on the card, contain information relating to the fingerprints of the owner of the card.
The device operates using the following basic steps:
1. One or more fingerprints 7 are sensed from the person wishing to perform the transaction, typically via a fingerprint camera.
2. The processor analyses the fingerprint(s) sensed at step 1 and derives a classification for each fingerprint. This classification relates to the structure of ridges on the fingerprint and places the fingerprint into one of a small number of classes (typically between 20 and 50). Some of these classes are related to the classes used by fingerprint experts, such as "arch", "loop left", "loop right" and "whorl".
3. Either before, after or simultaneously with steps 1 and 2, the plastic card to be used in the transaction is read via the plastic card reader, typically by "swiping" the card through the reader. The information from the card pertaining to the owner's fingerprint is transferred to the processor.
4. The processor compares the classification(s) of the person's fingerprint(s) derived at step 2 with the information from the card derived at step 3. It should be noted that the system does not attempt to make a detailed match between the fingerprint of the actual person wishing to perform the transaction and that of the registered card owner.
5. If the classification(s) agree with the data from the card, then the financial transaction is allowed to proceed. If the classification(s) do not agree with the data from the card, then the financial transaction is stopped and an operator alerted or other form of warning issued.
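The following short sketch (Python; the sensor, classifier, card reader and alert objects and their method names are hypothetical placeholders, not devices or interfaces defined by the patent) summarises how steps 1 to 5 might fit together:

    def verify_transaction(sensor, classifier, card_reader, alert):
        # Step 1: sense one or more fingerprints from the person at the terminal.
        images = sensor.capture_prints()
        # Step 2: derive the likely classes for each sensed print.
        live_classes = set()
        for image in images:
            live_classes.update(classifier.classify(image))
        # Step 3: read the fingerprint class information held on the plastic card.
        card_classes = set(card_reader.read_fingerprint_classes())
        # Step 4: compare class lists only; no detailed print-to-print match is made.
        if live_classes & card_classes:
            return True          # step 5: classifications agree, allow the transaction
        alert("Fingerprint classification does not match card data")
        return False             # step 5: stop the transaction and warn the operator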
The fingerprint-related information that is stored on the card is derived using steps 1 and 2 above, typically at the time that the card is issued.
Each fingerprint is sensed (step 1 above) using a fingerprint camera. The image from this camera is digitised (typically to 512 x 512 pixels) and input to a computer processor. From the digitised image, the computer processor derives a pair of arrays:
- an array of data (normally much smaller than the array corresponding to the digitised image) in which each data element indicates the mean direction of the fingerprint ridges within an area of the fingerprint image. This array is known as the Direction Map and is typically 20 x 20 elements in size. Each element of the Direction Map corresponds to a rectangular area in the fingerprint image in such a way that the Direction Map covers the complete fingerprint image;
- an array of data in which each element indicates the vorticity of the fingerprint ridge directions. This is known as the Vorticity Map. It is the same size as the Direction Map and bears the same relationship to the fingerprint image.
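A minimal sketch of how such maps might be computed is given below (Python with NumPy). The gradient-based orientation estimate and the curl-like vorticity measure are illustrative assumptions; the patent does not prescribe particular formulae.

    import numpy as np

    def direction_map(image, grid=20):
        # Estimate the mean ridge direction in each cell of a grid x grid map,
        # using the standard gradient-based orientation estimate (an assumption).
        gy, gx = np.gradient(image.astype(float))
        h, w = image.shape
        dirs = np.zeros((grid, grid))
        for i in range(grid):
            for j in range(grid):
                ys = slice(i * h // grid, (i + 1) * h // grid)
                xs = slice(j * w // grid, (j + 1) * w // grid)
                gxx = np.sum(gx[ys, xs] ** 2 - gy[ys, xs] ** 2)
                gxy = np.sum(2.0 * gx[ys, xs] * gy[ys, xs])
                # Ridge orientation is perpendicular to the dominant gradient direction.
                dirs[i, j] = 0.5 * np.arctan2(gxy, gxx) + np.pi / 2
        return dirs

    def vorticity_map(dirs):
        # Illustrative "vorticity": local circulation of the doubled-angle
        # direction field (a curl-like measure, assumed for this sketch).
        u, v = np.cos(2.0 * dirs), np.sin(2.0 * dirs)
        return np.gradient(v, axis=1) - np.gradient(u, axis=0)

    # Usage on a synthetic 512 x 512 image:
    img = np.random.rand(512, 512)
    d = direction_map(img)        # 20 x 20 Direction Map
    vort = vorticity_map(d)       # 20 x 20 Vorticity Map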
To derive the classification, the processor compares the Direction Map and Vorticity Map pair derived from the fingerprint image against a number of prototype Direction Map and Vorticity Map pairs (one pair for each of the possible fingerprint classes) and for each class computes a "degree of match" or score. The class assigned to the incoming fingerprint is the class that yields the highest score in the matching process.
In order to cope with the variability inherent in the capture of the fingerprint image, the matching process provides not a single "best match" class but a short list (currently 5 elements) of the best matches. Similarly, a list of the most likely fingerprint classes is held on the plastic card. To decide whether or not to approve the transaction (step 4 of the basic steps above), the two lists are compared.
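The per-class scoring and short-listing described above could be sketched as follows (Python with NumPy; the cosine-of-doubled-angle score and the relative weighting of the Vorticity Map are assumptions made for illustration, since the patent only requires that some "degree of match" be computed for each class):

    import numpy as np

    def match_score(dir_map, vort_map, proto_dir, proto_vort, vort_weight=0.5):
        # Ridge directions are 180-degree periodic, so agreement is measured on
        # doubled angles; vorticity agreement is a mean absolute difference.
        direction_term = np.mean(np.cos(2.0 * (dir_map - proto_dir)))
        vorticity_term = -np.mean(np.abs(vort_map - proto_vort))
        return direction_term + vort_weight * vorticity_term

    def shortlist_classes(dir_map, vort_map, prototypes, length=5):
        # Score the incoming map pair against every prototype pair and return
        # the classes giving the highest scores, best first.
        scores = {cls: match_score(dir_map, vort_map, pd, pv)
                  for cls, (pd, pv) in prototypes.items()}
        return sorted(scores, key=scores.get, reverse=True)[:length]

    def approve(shortlist_from_print, shortlist_from_card):
        # Step 4: the transaction is approved if the two short lists overlap.
        return bool(set(shortlist_from_print) & set(shortlist_from_card))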
The prototype Direction Maps and Vorticity Maps are previously derived using a Kohonen neural network to cluster sample fingerprint data by the following process:
(a) a database containing a large number of fingerprint images is selected. The contents are selected so as to be representative of the overall population;
(b) each image is processed to derive the Direction Map and Vorticity Map using the same process as described above;
(c) a Kohonen neural network is trained using unsupervised training techniques to cluster the images in the database into a number of classes. This network has a number of output neurons equal to the number of possible fingerprint classes. During this training process, the Direction Map and Vorticity Map pairs are randomly shifted by small amounts. The size of these shifts is increased during the process of training the neural network. This has the effect of making the network ignore the effects of small registration errors in the fingerprint image;
(d) the prototype Direction Map and Vorticity Map pairs corresponding to each Kohonen output neuron are derived, and it is these pairs that are used in the matching process described above.
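As an illustration of steps (a) to (d), a minimal Kohonen (self-organising map) training loop is sketched below in Python with NumPy. The learning-rate and neighbourhood schedules, and the use of a simple circular shift as the random-shift augmentation, are assumptions for the purposes of the sketch rather than details taken from the patent.

    import numpy as np

    def train_kohonen(samples, n_classes, epochs=50, seed=0):
        # Minimal 1-D Kohonen network over flattened Direction/Vorticity Map pairs;
        # one output neuron (and hence one prototype) per fingerprint class.
        rng = np.random.default_rng(seed)
        dim = samples.shape[1]
        weights = rng.normal(size=(n_classes, dim))
        for epoch in range(epochs):
            lr = 0.5 * (1.0 - epoch / epochs)                 # decaying learning rate
            radius = max(1.0, n_classes / 2 * (1.0 - epoch / epochs))
            max_shift = 2.0 * epoch / epochs                  # random shifts grow during training
            for x in rng.permutation(samples):
                shift = rng.integers(-int(max_shift), int(max_shift) + 1)
                x = np.roll(x, shift)                         # crude registration-error augmentation
                winner = np.argmin(np.linalg.norm(weights - x, axis=1))
                # Pull the winning neuron and its neighbours towards the sample.
                dist = np.abs(np.arange(n_classes) - winner)
                influence = np.exp(-(dist ** 2) / (2.0 * radius ** 2))
                weights += lr * influence[:, None] * (x - weights)
        return weights                                        # one prototype per class

    # Usage: cluster 200 synthetic 800-element map pairs (20x20 + 20x20) into 30 classes.
    data = np.random.rand(200, 800)
    prototypes = train_kohonen(data, n_classes=30, epochs=10)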
In the light of this disclosure, modifications of the described embodiment, as well as other embodiments, all within the scope of the invention as defined by the appended claims, will now become apparent to persons skilled in the art.

Claims (23)

1. A skin print classification apparatus comprising a skin print sensor for producing electronic signals representing a skin print sensed by the sensor and electronic processing means for processing the signals to produce a classification result.
2. An apparatus according to claim 1, further comprising a memory means for storing both the classification result and an identifier indicating the identity of the individual producing said skin print.
3. An apparatus according to claim 2, wherein the processing means is adapted to input the classification result and the identifier into the memory means for future use.
4. An apparatus according to claim 2, wherein the processing means is adapted to compare the classification result and the identifier respectively with a reference classification result and reference identifier previously stored in the memory means and to produce an authorisation result based on said comparison.
5. An apparatus according to claim 4, further comprising a card reader or an interface for a card reader, for providing the identifier.
6. An apparatus according to claim 3, wherein the processing means is adapted to utilise a cluster analysis to produce the classification result.
7. An apparatus according to claim 6, wherein the cluster analysis is performed using a neural network.
8. An apparatus according to claim 4 or claim 5, wherein the processing means is adapted to use a pattern matching algorithm to produce the classification result.
9. An apparatus according to claim 8, wherein a neural network is configured to perform as the pattern matching algorithm.
10. An apparatus according to claim 7 or claim 9, wherein the neural network is a Kohonen neural network.
11. A method of skin print classification comprising using a skin print sensor to produce electronic signals representing a skin print sensed by the sensor and electronically processing the signals to produce a classification result.
12. A method according to claim 11, wherein the skin print is a human fingerprint or thumbprint.
13. A method according to claim 11 or claim 12, further comprising storing in a memory means, both the classification result and an identifier indicating the identity of the individual producing said skin print.
14. A method according to claim 13, further comprising using the processing means for inputting the classification result and the identifier into the memory means for future use.
15. A method according to claim 13, further comprising using the processing means to compare the classification result and the identifier respectively with a reference classification result and reference identifier previously stored in the memory means to produce an authorisation result based on said comparison.
16. A method according to claim 15, further comprising obtaining the identifier from a card reader or from an interface for a card reader.
17. A method according to claim 14, wherein the classification result is produced using a cluster analysis.
18. A method according to claim 17, wherein the cluster analysis is performed using a neural network.
19. A method according to claim 15 or claim 16, wherein the classification result is obtained by using a pattern matching algorithm.
20. A method according to claim 19, wherein a neural network is configured to perform as the pattern matching algorithm.
21. A method according to claim 18 or claim 20, wherein the neural network is a Kohonen neural network.
22. A skin print classification apparatus substantially as hereinbefore described with reference to the accompanying drawing.
23. A method of skin print classification substantially as hereinbefore described with reference to the accompanying drawing.
GB9310995A 1993-05-27 1993-05-27 Electronic classification system Expired - Fee Related GB2278478B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9310995A GB2278478B (en) 1993-05-27 1993-05-27 Electronic classification system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9310995A GB2278478B (en) 1993-05-27 1993-05-27 Electronic classification system

Publications (3)

Publication Number Publication Date
GB9310995D0 GB9310995D0 (en) 1993-07-14
GB2278478A true GB2278478A (en) 1994-11-30
GB2278478B GB2278478B (en) 1997-06-11

Family

ID=10736250

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9310995A Expired - Fee Related GB2278478B (en) 1993-05-27 1993-05-27 Electronic classification system

Country Status (1)

Country Link
GB (1) GB2278478B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1548667A (en) * 1975-06-23 1979-07-18 Calspan Corp Fingerprint based access control and identification apparaus
GB1590755A (en) * 1976-09-10 1981-06-10 Rockwell International Corp Automatic pattern processing system
GB2050026A (en) * 1979-04-02 1980-12-31 Nippon Electric Co Device for extracting a density as one of a number of pattern features extracted for each feature point of a streaked pattern
EP0159037A2 (en) * 1984-04-18 1985-10-23 Nec Corporation Identification system employing verification of fingerprints
GB2174831A (en) * 1985-04-22 1986-11-12 Quantum Fund Ltd The Skin-pattern recognition
WO1987007058A1 (en) * 1986-05-07 1987-11-19 Brendan David Costello Method and apparatus for verifying identity

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2820533A1 (en) * 2001-02-07 2002-08-09 Sagem BIOMETRIC IDENTIFICATION OR AUTHENTICATION SYSTEM
WO2002063548A2 (en) * 2001-02-07 2002-08-15 Sagem S.A. Biometric identification or authentication system
WO2002063548A3 (en) * 2001-02-07 2003-01-03 Sagem Biometric identification or authentication system
CN100380269C (en) * 2001-02-07 2008-04-09 萨甘股份有限公司 Biometric identification or authentication system
US7587610B2 (en) 2001-02-07 2009-09-08 Sagem Securite Biometric identification or authentication system

Also Published As

Publication number Publication date
GB2278478B (en) 1997-06-11
GB9310995D0 (en) 1993-07-14

Similar Documents

Publication Publication Date Title
Jain et al. Introduction to biometrics
KR100447023B1 (en) Biometric recognition using a classification neural network
Ross et al. Information fusion in biometrics
US10552698B2 (en) System for multiple algorithm processing of biometric data
DE60313105T2 (en) Fingerprint-based authentication device
US7319779B1 (en) Classification of humans into multiple age categories from digital images
US5815252A (en) Biometric identification process and system utilizing multiple parameters scans for reduction of false negatives
Verlinde et al. Multi-modal identity verification using expert fusion
CN100382093C (en) Registration method for biometrics authentication system, biometrics authentication system, and program for same
US4805223A (en) Skin-pattern recognition method and device
Shah et al. FEATURE ANALYSIS
US7006671B2 (en) Personal identification apparatus and method
JPH05266173A (en) Face classification system
Labati et al. Touchless fingerprint biometrics
US20220277311A1 (en) A transaction processing system and a transaction method based on facial recognition
Kroeker Graphics and security: Exploring visual biometrics
Thakur et al. Social impact of biometric technology: myth and implications of biometrics: issues and challenges
EP1295242B2 (en) Check of fingerprints
Sharifi Score-level-based face anti-spoofing system using handcrafted and deep learned characteristics
GB2278478A (en) Fingerprint classification system
Dewan et al. Offline Signature Verification Using Neural Network
Rai et al. DyFFPAD: Dynamic Fusion of Convolutional and Handcrafted Features for Fingerprint Presentation Attack Detection
Sehgal Palm recognition using LBP and SVM
Sanchez-Reillo et al. Improving access control security using iris identification
Choudhary et al. Secured Automated Certificate Creation Based on Multimodal Biometric Verification

Legal Events

Date Code Title Description
730A Proceeding under section 30 patents act 1977
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20110527