WO2000070543A1 - Biometric system for biometric input, comparison, authentication and access control and method therefor biometric - Google Patents

Biometric system for biometric input, comparison, authentication and access control and method therefor biometric Download PDF

Info

Publication number
WO2000070543A1
Authority
WO
WIPO (PCT)
Prior art keywords
biometric
input device
biometric input
disposed
prism
Prior art date
Application number
PCT/US2000/013323
Other languages
French (fr)
Inventor
Nikolai Alexejevich Khitsenko
Boris Ivanovich Kotenev
Igor Vladimirovich Matveev
Original Assignee
Biolink Technologies International, Inc.
Priority date
Filing date
Publication date
Priority claimed from US09/312,002 external-priority patent/US6282304B1/en
Application filed by Biolink Technologies International, Inc. filed Critical Biolink Technologies International, Inc.
Priority to AU47140/00A priority Critical patent/AU4714000A/en
Publication of WO2000070543A1 publication Critical patent/WO2000070543A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70 Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82 Protecting input, output or interconnection devices
    • G06F21/83 Protecting input, output or interconnection devices input devices, e.g. keyboards, mice or controllers thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543 Mice or pucks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/13 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0336 Mouse integrated fingerprint sensor

Definitions

  • the present invention relates to a system for biometric input, comparison, and authentication and, more particularly, to a biometric input device having a scanning window with an illuminated prism, image detector and scanning electronics operable in conjunction with a biometric data comparison system for comparing directional and minutia data.
  • the biometric input device provides a compact, yet highly functional configuration and the biometric data comparison system provides for controlled access to a computing system based upon comparison of inputted biometric data with biometric data stored in a database.
  • Biometric input devices are known for use with computing systems. Such biometric input devices include computer mouse designs. Existing designs for such biometric input devices have scanning windows lacking efficient positioning structure for scanning positioning and protection from ambient light, and do not provide mechanical integration of a position sensing ball assembly with an optical scanning assembly maximizing reliability of position sensing ball operation.
  • Biometric data comparison methods and systems are known. Such known systems and methods suffer from various drawbacks including intensive computing power requirements, intensive memory requirements, slow data transfer, slow comparison, and comparison reliability reduction due to environmental and physiological factors. Known systems also fail to provide for secure communication of biometric data over public lines.
  • An object of the present invention is also to provide a biometric based access control system for use on computers which permits a user to graphically apply biometric access control features to data and applications by the use of a user manipulated biometric protection icon.
  • the present invention provides a biometric input device, system and method which includes a biometric input device having a scanning window surrounded by a ridge for ensuring positive positioning of a biometric sample such as a thumb.
  • the biometric input device includes an optical assembly having a prism with a focusing lens disposed on a side thereof and optionally integrally formed therewith.
  • a biometric comparison method is provided for comparing data from said biometric input device with data from a database using both directional image comparison and clusterized minutia location and direction comparison.
  • a further system is provided for allowing access to computer functions based on the outcome of the comparison method.
  • the present invention also provides a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body wall defining an aperture and an optical assembly for scanning the fingerprint disposed in the device body.
  • the optical assembly has a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly.
  • a ridge surrounds a portion of a periphery of the aperture such that the ridge engages the opposing tip sides and tip end such as to position the fingerprint on the scanning surface and block ambient light.
  • a further feature of the present invention includes the aforesaid biometric input device having a device body with a bottom surface opposing a substrate upon which the device body is placed, a device body length and a front portion, a middle portion and a heel portion.
  • a movement detection device for detecting movement of the device body relative the substrate is provided and the bottom surface defines a bottom surface aperture through which the movement detection device detects movement of the device body relative the substrate.
  • the bottom surface aperture is disposed in the heel portion of the device body and the optical assembly is disposed in the middle portion of the device body.
  • the movement detection device has a ball protruding through the bottom surface aperture for engaging the substrate to register the movement of the device body relative the substrate.
  • a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body side wall defining an aperture, and an optical assembly for scanning the fingerprint disposed in the device body.
  • the optical assembly includes an imaging component for converting a light image into pixel output and a lens for focusing the light image into the imaging component.
  • the optical assembly includes a prism with first, second and third sides and a top side wherein the first side forms a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly, the second side has the lens for focusing the light image into the imaging component disposed thereon, and the third side has a light absorbing layer.
  • the present invention also includes the above embodiment wherein, in the alternative or in combination with one another, the lens is formed integrally with the prism and a light emitting device is disposed to emit light into the prism from the top side of the prism to illuminate the fingerprint when disposed at the scanning surface .
  • a biometric comparison method comprising a series of steps beginning with (a) scanning in a fingerprint and digitizing the scanning signals to produce a matrix of print image data representing pixels.
  • the method proceeds with (b) dividing the print image data into cells, each including a number of pixel data for contiguous pixels, and (c) calculating a matrix of directional image data DI using gradient statistics applied to the cells wherein the directional image data DI includes, for each of the cells, a cell position indicator and one of a cell vector indicative of a direction of ridge lines and an unidirectional flag indicative of a nondirectional calculation result. Processing then continues with (d) skeletonizing the print image data, and (e) extracting minutia from the print image data and producing a minutia data set comprised of data triplets for each minutia extracted, including minutia position data and minutia direction data.
  • a comparing process is initiated by (f) providing reference fingerprint data from a database wherein the reference fingerprint data includes reference directional image data DI and a reference minutia data set, and (g) performing successive comparisons of the directional image data DI with the reference directional image data DI and determining a directional difference DifDI for each of the successive comparisons wherein for each of the successive comparisons one of the directional image data DI and the reference directional image data DI is positional shifted by adding position shift data.
  • In a next step (h) it is determined for which of the successive comparisons the directional difference DifDI is the least, and the position shift data thereof is selected as initial minutia shift data.
  • a next stage of the comparison process proceeds with (i) positional shifting minutia data by applying the initial minutia shift data to one of the minutia data sets and the reference minutia data set to initially positionally shift the minutia position data and the minutia orientation data, then (j) performing successive comparisons of the minutia data set with the reference minutia data set following the positional shifting minutia data and determining matching minutia based on a minutia distance criteria, a number of matching minutia, and a similarity measure indicative of correspondence of the matching minutia for each of the successive comparisons wherein, for each of the successive comparisons, one of the minutia data set and the reference minutia data set is positional shifted within a minutia shift range R by adding minutia position shift data, and finally (k) determining a maximum similarity measure of the similarity measures of the successive comparisons.
  • the comparison method concludes with (l) determining whether the maximum similarity measure is above a similarity threshold and indicating a match between the reference fingerprint data and the fingerprint data when the maximum similarity measure is above the similarity threshold.
  • the present invention also includes the above method wherein, as an alternative, the calculation of the directional image data includes (c1) identifying a directional group of cells comprising all cells of the cells that do not have the unidirectional flag associated therewith; and then excluding from the successive comparisons of minutia data sets, one of the minutia data sets and the reference minutia data set located in or positionally aligned with the cells that have the unidirectional flag associated therewith.
  • the present invention further provides a feature for use in conducting the successive comparisons of minutia comprising dividing the minutia data set into minutia data set clusters formed on contiguous ones of the cells and each including a predetermined number of the minutia before conducting the successive comparisons, conducting the successive comparisons for each of the minutia data set clusters and determining for each of the minutia data set clusters a maximum similarity measure, and finally determining the maximum similarity measure as a sum of the maximum similarity measures of each of the minutia data set clusters.
  • the present invention also provides for the above comparison method excluding from further processing pairs of the minutia located within a minutia exclusion distance of one another and having minutia direction data within a direction exclusion limit of being in opposite directions.
  • the present invention further provides a feature wherein in the above comparison method the minutia extraction step extracts minutia limited to ends and bifurcations. Still further there is provided a feature wherein the minutia data set excludes data distinguishing ends and bifurcations.
  • biometric comparison system comprising a computer having a memory including a reference fingerprint data and at least one of file data and application software, a display, an apparatus for representing at least one of file data and application software as icons on the display, and a biometric input device for scanning a fingerprint and storing fingerprint data representing the fingerprint into the memory.
  • a comparison engine is provided for comparing the fingerprint data with the reference fingerprint data and determining a match if a similarity threshold is satisfied.
  • An access control icon generator permits a user to move an access control icon on the display and an access control means is provided for controlling access to the at least one of file data and application software when a user moves the access control icon onto the icon representing the at least one of file data and application software, whereby access to the at least one of file data and application software is permitted only if a user scans a fingerprint producing fingerprint data which the comparison engine determines matches the reference fingerprint data.
  • Figure 1a is a block diagram of a system of the present invention;
  • Figure 1b is a block diagram of an alternative system of the present invention;
  • Figure 2a is a top plan simplified view of a biometric input device of the present invention;
  • Figure 2b is a side elevation view of the biometric input device of Figure 2a showing internal components in dashed lines;
  • Figure 3a is a side elevation view of the biometric input device of Figure 2a showing surface contours;
  • Figure 3b is a bottom perspective view of the biometric input device of Figure 2a showing surface contours and dimensional disposition of features;
  • Figure 4 is a block schematic of the biometric input device of Figure 2a;
  • Figure 5 is a flow chart for operation of the biometric input device of Figure 2a;
  • Figure 6 is a flow chart of the comparison method of the present invention;
  • Figure 7 is an illustration of a directional image analysis;
  • Figure 8a is an image of the fingerprint based on data received from an optical scanning assembly;
  • Figure 8b is an image of the fingerprint of Figure 8a following low pass filtering;
  • Figure 8c is an image of the fingerprint of Figure 8a following directional filtering and binarization;
  • Figure 8d is an image of the fingerprint of Figure 8a following skeletonization;
  • Figure 9a is a depiction of a bifurcation;
  • Figure 9b is a depiction of an end;
  • Figure 10 is a depiction of an analysis of two minutia for exclusion purposes;
  • Figure 11 is a simplified depiction of fingerprint image data FP1 divided into clusters;
  • Figure 12 is a simplified depiction of the clusters of Figure 11 applied, individually shifted, to print image data FP2.
  • a computer 50 has a keyboard 52 and a biometric input device 54 with a scanning window 56 for accepting biometric input.
  • the computer 50 may take the form of a personal computer, a dedicated device such as an ATM machine, a dumb terminal, or a computer on the order of a workstation, minicomputer or mainframe.
  • the computer 50 is connected to a remote computer 51 via a link 53 which may be a direct link via phone lines or direct cabling, or via a network such as a LAN, WAN, intranet or Internet.
  • In order to gain access to use of the computer 50, or the remote computer 51, for all or only specified functions, a user must provide a biometric input to the biometric input device 54 via the scanning window 56.
  • the computer 50 will be referred to, however, it is understood that the remote computer 51 may optionally perform the functions ascribed to the computer 50 with the computer 50 functioning as a terminal.
  • reference to gaining access to use of the computer 50 is understood to include the alternative of access to use of the remote computer 51.
  • the computer 50 compares biometric data, representing the biometric input, with stored biometric data and determines if the biometric data corresponds to any stored biometric data held in a data base. If a correspondence exists, the user is given authorization, that is, the user is allowed access to the computer 50 for performance of the specified functions or for use of the computer 50 in general.
  • the biometric input device 54 is connected to the computer 50 via an input cord 72.
  • an embodiment of the present invention has a port adaptor connector 57 connecting the input cord 72 to a corresponding port on the computer 50.
  • a stand-alone adaptor unit 58 channels data via the input cord 72 and a cable 59 to and from the computer 50.
  • an infrared or other remote and/or wireless data communication structure could be provided. Referring to Figure 1b, an alternative configuration is shown wherein the scanning window 56 and associated structure are incorporated in either the computer 50 or the keyboard 52.
  • the stand-alone biometric input device 54 is omitted and functions thereof are performed by the computer 50 or by circuitry incorporated in the keyboard 52. It is understood that functions discussed herein with respect to the biometric input device 54 and the computer 50 may optionally be distributed between the biometric input device 54 and the computer 50 as is practical .
  • the biometric input device 54 is shown in the form of a computer mouse 60. Alternatively, the biometric input device may take the form of another type of input device such as a track ball, joystick, touch pad or other variety of input device.
  • the computer mouse 60 preferably includes a left button 62, a right button 64, a ball 66, an X direction sensor 68, and a Y direction sensor 70.
  • Various means may be used to effect input from these devices including mechanical, optical or other.
  • optical means may be substituted for the ball 66 to detect mouse movement.
  • the input cord 72 connects to the computer 50 for effecting data transfer.
  • the input cord 72 is replaced by wireless means for effecting data transfer which operate using optical or electromagnetic transmission.
  • the present invention further includes an optical assembly 80.
  • the optical assembly 80 preferably includes a prism 82, a first lens 84, a mirror 86, a CCD assembly 88, and LEDs 89.
  • the prism 82 has first, second and third sides, 90, 92 and 94, respectively.
  • the first side 90 generally defines the surface of the scanning window 56.
  • a coating (s) or a transparent plate may optionally be used to protect the first side 90.
  • the second side 92 preferably includes the first lens 84 disposed thereon or formed integrally with the prism 82.
  • the prism 82 is molded integrally with the first lens 84 which provides for reducing part count and simplifying the assembly of the biometric input device 54.
  • the third side 94 includes a light absorbing coating 96.
  • the CCD assembly 88 includes a CCD sensor 102 and a second lens 104 which functions as an object lens.
  • the first and second lenses 84 and 104 preferably function in conjunction with the mirror 86, as shown by light ray tracings, to focus an image at the first surface 90 onto the CCD sensor 102.
  • Various other lens assemblies and configurations may optionally be realized by those of ordinary skill in the art and are considered to be within the scope and spirit of the present invention.
  • In order to input biometric data, a user holds the computer mouse 60 with the index, middle or third finger preferably extended to operate the left and right buttons, 62 and 64, and with the thumb contacting the scanning window 56 to permit an image of a thumb print to be focused onto the CCD sensor 102. The user then operates any of the left and right buttons, 62 or 64, or other input device, to initiate scanning of the thumb print. Alternatively, scanning may be automatically initiated by circuitry in the biometric input device 54 or the computer 50.
  • a front portion 109 of the computer mouse 60 generally refers to an end portion of the computer mouse 60 from where the input cord 72 preferably extends and where the left and right buttons, 62 and 64, are situated, a heel portion 110 which comprises a rear end portion where a user's palm typically rests, and a middle portion 111 which is an area where the balls of the user's hand typically are situated.
  • the front portion 109, the heel portion 110, and the middle portion 111 are situated to define three sections of a length L of the computer mouse 60 extending from a front end of the front portion 109 to a rear end of the heel portion 110.
  • the scanning window 56 is preferably situated generally on a side of the middle portion 111 and preferably has a ridge 120 framing at least three sides of the scanning window 56.
  • the ridge 120 is configured to accept a perimeter of a user's thumb, thereby defining a scanning position of the user's thumb in the scanning window 56. Furthermore, the ridge 120 serves to shield the scanning window 56 from ambient light during the scanning process and also to protect the scanning window 56 from damage.
  • the ball 66 is preferably disposed with a center thereof within the heel portion 110 of the computer mouse 60. Such disposition of the ball 66 provides advantageous situation of the ball 66 under the palm of the user's hand so that pressure from the palm during operation ensures positive contact of the ball 66 with a substrate upon which the computer mouse 60 is used.
  • the ball 66 is optionally disposed rearward of a mid-position in the computer mouse 60 wherein the mid-position is a middle of the length L of the computer mouse 60. In conventional configurations the ball 66 is situated either in the middle portion, forward of the mid-position in the computer mouse, or in the front portion. Such a construction is prone to intermittent contact of the ball with the substrate due to the user applying excessive downward force to the heel portion of the mouse resulting in the front and middle portions rising from the substrate.
  • a circuit board 140 contains circuitry for effecting scanning operation of the optical assembly 80.
  • a contact detection assembly may be realized wherein the scanning window 56 takes the form of a silicon contact sensor.
  • a thumb print of the user is represented by data of an array of pixels.
  • the LEDs 89 are mounted on the circuit board 140 in a position above a top surface of the prism 82 to radiate light into the prism 82 for scanning the thumb print.
  • the embodiment shown has two LEDs, but it is realized a single LED may be used or alternative light generating devices may be substituted therefor.
  • the LEDs 89 may alternatively be mounted on the prism 82 or molded into the prism 82, at the top side, in the same operation wherein the first lens 84 is molded integrally with the prism 82.
  • perspective depictions of the computer mouse 60 illustrate the length L of the computer mouse 60, the disposition of the ball 66 and the structure of the ridge 120.
  • the ridge 120 has an outer surface 122 extending outwardly from a side surface 126 of the computer mouse 60 and an inner surface 124 extending from a peak of the ridge structure to the scanning surface 56.
  • the ridge 120 is raised from the side surface 126 preferably on at least three sides of the scanning window 56, that is, front, top and bottom sides.
  • a rise of the ridge 120 from the side surface 126 is optionally omitted to permit ease of insertion of the thumb against the scanning window 56.
  • the location of the ridge 120 on the three sides of the scanning window 56 ensures positive location of the thumb for scanning purposes to minimize scan to scan variations in positioning of the thumb print thereby facilitating thumb print comparisons.
  • the center of the ball 66 is shown rearward of the mid-position, the middle portion 111 which includes the middle section of the computer mouse 60, and the three quarter length position.
  • the outer surface 122 is concave but may optionally be flat or convex.
  • the inner surface 124 is concave but may optionally be flat or convex.
  • the outer surface 122 may be omitted with the inner surface 124 serving alone to position the thumb wherein the inner surface 124 defines a recess in the side surface 126.
  • the rising of the outer surface 122 from the side surface 126 provides for the side surface 126 protruding less outwardly from a mouse body centerline CL1 of the computer mouse 60, shown in Figure 2a, thereby providing for a functionally less cumbersome device.
  • a surface of the scanning window 56 is preferably inclined with respect to the mouse body centerline CL1 to define an acute angle with respect thereto in the range of 5° to 25°, and preferably in the range of 10° to 20°.
  • a front edge of the scanning surface 56 is recessed inwardly toward the mouse body centerline CL1 from a position of the side wall 126 relative to the mouse body centerline CL1. Such positioning provides for an ergonomically advantageous positioning of the thumb when the computer mouse 60 is held.
  • the scanning window 56 has a length of about 30mm and a width of about 18mm.
  • the scanning window 56 is inclined in the vertical plane with respect to the substrate upon which the computer mouse 60 rests such that a longitudinal center line CL2 of the scanning surface defines an acute angle with respect to the substrate in the range of 0° to 25°, and preferably in the range of 5° to 15°.
  • the prism 82 is a right angle prism with a forward acute angle in the range of 40° to 60° and preferably in the range of 45° to 55°.
  • the mirror 86 serves to redirect light to the CCD assembly 88 thereby providing for a compact arrangement of the optical assembly 80.
  • the forward angle is about 50°.
  • a microcontroller 150 is preferably interfaced with a CCD controller 152, a ROM 154, a RAM 156, and an A/D converter 158. Output from the CCD sensor 102 is input to the A/D converter 158 where it is digitized.
  • the CCD controller 152 effects scanning of the CCD sensor 102 to transfer sensed levels of the pixels of the CCD sensor 102.
  • the microcontroller 150 further controls the intensity of light produced by the LED 89.
  • An interface controller 160 is interfaced with the microcontroller 150 to effect communication with a serial port of the computer 50. Other interfaces may be employed permitting data communication with the computer 50.
  • the microcontroller 150 may optionally receive mouse input from the left and right mouse buttons, 62 and 64, and the x and y sensors, 68 and 70, and transmit the mouse input to the computer 50 to effect combined functions of thumb print scanning and mouse control .
  • the microcontroller 150 is optionally in the form of a programmable logic device (PLD) .
  • the microcontroller 150 controls the CCD controller 152, determines a size and position of a frame, records image data of the frame into the RAM 156, and supports communication protocol with the interface controller 160, such as the RS-232 interface, the PS-2 interface, or the USB interface.
  • the ROM 154 stores program codes for the microcontroller 150 and may be programmed to effect operations over various interfaces. While discrete IC's are shown, it is realized that the functions of the IC's may be integrated in a single IC.
  • the CCD controller 152 effects reading of successive pixels and lines of the CCD sensor 102.
  • a matrix of data from the pixel array of the CCD sensor 102 forms the frame and is stored in the RAM 156.
  • the frame consists of data representative of the thumb print image and preferably excludes data from pixels not representative of the thumb print image.
  • the frame represents a subset of data from a complete scanning of the CCD sensor 102. Accordingly, the amount of data to be processed and sent to the computer 50 is optionally reduced from that of an entire scan of the CCD sensor 102.
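The frame handling above can be illustrated with a short sketch. This is not the patent's firmware; the intensity threshold and the function name are assumptions introduced here, but the effect shown is the same data reduction the paragraph describes: only the sub-array that actually contains the print is kept.

```python
import numpy as np

def extract_frame(ccd_pixels: np.ndarray, threshold: int = 40) -> np.ndarray:
    """Crop a full CCD scan to the bounding box of the finger image (illustrative only)."""
    mask = ccd_pixels > threshold                  # pixels bright enough to carry print signal
    rows, cols = np.any(mask, axis=1), np.any(mask, axis=0)
    if not rows.any():
        return ccd_pixels[:0, :0]                  # no finger detected: empty frame
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return ccd_pixels[r0:r1 + 1, c0:c1 + 1]        # only this frame is stored in RAM and sent
```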
  • the interface controller 160 may be incorporated into an interface unit 162 for connecting the input cord 72 to the computer to permit operation over various interfaces by substitution of the interface unit 162 having the desired interface controller 160.
  • the interface unit 162 may be in a separate housing connectable to a desired input port, as shown in Figure 1a as the stand-alone adapter unit 58, or a connector housing itself as shown in Figure 1a as the port adapter connector 57. Implementation of the interface unit 162 is dictated by the type of port to be interfaced.
  • a parallel printer port interface, that is, a PS2 port interface, may be effected using a microcontroller and a PLD, for example, a ZILOG Corp. Z86E02 in conjunction with a FLEX8K PLD from Altera Corp.
  • the interface connector 162 is a separate housing which is connected to the computer's printer port with a cable and has a connector for the input cord 72 and for a parallel printer cable through which a printer may be interfaced to the computer 50. Power is supplied to the interface connector 162 and the computer mouse 60 via the PS2 port from the computer 50.
  • Data exchange for the usual mouse input of the computer mouse 60, that is, input from the left and right buttons, 62 and 64, and the x and y sensors, 68 and 70, is preferably effected using the standard protocol for the PS2 mouse interface and the PLD based on output from the microcontroller 150 of the computer mouse 60.
  • a full speed USB interface at 12 MBaud may be effected using a processor in the interface unit 162, such as an Intel Corp. 930, which has built-in USB functions.
  • the interface unit 162 is optionally a separate housing in the form of a stand-alone adapter unit 58 which is connected to the computer's USB port with a cable 59, as shown in Figure 1a, and has a connector for the input cord 72. Power is supplied from the computer 50 for the interface unit 162 and the computer mouse 60 via the USB port.
  • a serial port interface that is, a COM port interface, functioning at 115.2 KB may be effected using a processor in the interface unit 162, such as an Atmel AT29C2051, and an RS232 voltage converter.
  • the interface unit 162 is optionally incorporated in a connector for connecting the input cord 72 to the computer's 50 serial port. Power is supplied from the computer 50 via a further connector and is processed by the voltage converter to drive the computer mouse 60.
  • a flow chart is shown of operation of the computer mouse 60. Operation begins at a start point 200 and proceeds to decision step 205 to determine whether a read print command is received from the computer 50, referred to as "PC" in the flow chart, to read in a thumb print. If a "read print" command is received, the LED 89 is lit to a maximum level in step 210. Next, in step 215, data from the CCD sensor 102 is read. Following reading of CCD data, a decision step 220 is executed to determine whether a finger is detected. When a finger is detected operation proceeds to a decision step 225 to determine whether the light level is acceptable, and if it is not the level is adjusted and operation returns to step 215. If the light level is acceptable, operation proceeds to transmission step 230 wherein a message is sent to the computer 50 indicating that print data is to be sent. In another transmission step 235 a line of print data from the CCD sensor 102 is sent to the computer 50.
  • Operation then proceeds to a decision step 240 wherein it is determined whether the end of the image data has been sent to the computer 50. If transmission of the image data is not complete, a check is made in a status verification step 245 to see whether there is any mouse input, such as data from any of the left button 62, right button 64, X sensor 68, or Y sensor 70 input by the user and, if such data has been input, it is sent to the computer 50 in a transmission step 250. Operation returns to the transmission step 235 wherein a next line of CCD data is sent to the computer 50 after the mouse input is sent to the computer 50 or if no mouse input is detected. If it is determined in the decision step 240 that transmission of image data is complete, operation returns to the beginning of the flow chart below the start step 200.
  • In step 205, if no read print command is received, operation proceeds to a status verification step 255 to see whether any mouse input has been inputted by the user and, if such data has been inputted, it is sent to the computer 50 in transmission step 260.
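A minimal sketch of the control flow of Figure 5 as described above. All of the I/O helpers (read_command, read_ccd, set_led, mouse_events, send_to_pc) are hypothetical stand-ins for the microcontroller's hardware routines; only the branching follows steps 200 through 260.

```python
import random

# Hypothetical stand-ins for the firmware I/O; the real device talks to the
# CCD controller, the LEDs and the PC interface instead of these stubs.
def read_command():        return random.choice(["read print", None])
def set_led(level):        pass
def read_ccd():            return [[random.randint(0, 255)] * 8 for _ in range(8)]
def finger_present(frame): return max(max(row) for row in frame) > 40
def light_ok(frame):       return 60 < sum(map(sum, frame)) / 64 < 200
def mouse_events():        return []              # button / x-y sensor input, if any
def send_to_pc(message):   print(message)

def service_once():
    """One pass of the loop in Figure 5 (steps 200-260), simplified."""
    if read_command() == "read print":             # step 205
        set_led("max")                             # step 210
        while True:
            frame = read_ccd()                     # step 215
            if not finger_present(frame):          # step 220: wait for a finger
                continue
            if light_ok(frame):                    # step 225: light level acceptable?
                break
            set_led("adjust")                      # adjust the level and rescan
        send_to_pc("print data follows")           # step 230
        for line in frame:                         # steps 235-250
            send_to_pc(line)
            for event in mouse_events():           # interleave any pending mouse input
                send_to_pc(event)
    else:                                          # steps 255-260: ordinary mouse traffic
        for event in mouse_events():
            send_to_pc(event)
```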
  • image data is also referred to as print data in reference to the input of a thumb print.
  • other types of biometric input may be used and the present invention may optionally be used to process such other data.
  • examples of such other data include a print image of any of the other digits or images of other unique biometric data such as retinal images.
  • the entire operation of the present invention can be contained within the mouse itself, with only an authorization and/or restriction command being passed on to the computer itself.
  • the image data is then processed and added to a database of print image data or used to gain access to use of the computer 50 by comparison to previously stored print image data in the database.
  • an authorization process while entering print image data into the database is referred to as a registration process.
  • Finger print image analysis may effect comparison of images.
  • the present invention further provides an analysis algorithm that effects comparison of special point maps which indicate where special points, also known as minutia, of a fingerprint are located.
  • the fingerprint analysis algorithm considers a fingerprint not as a determined object but as a stochastic object. There is a philosophical analogy, like Laplace's determinism and the stochastic picture of the world. Another analogy is that the first practically significant results in speech recognition appeared as soon as the first stochastic models of human speech had appeared.
  • a discussion of standard approaches is found in the paper "A Real-Time Matching System for Large Fingerprint Databases," N.K. Ratha, K. Karu, S. Chen, and A.K. Jain, IEEE Trans. on PAMI, Aug. 1996, vol. 18, no. 8, pp. 799-813, which is incorporated herein by reference for its teaching relating to fingerprint analysis and modeling.
  • Factors that randomize print image data include elasticity of skin, humidity, level of impurity, skin temperature, individual characteristics of the user's finger-touch, among many other factors
  • the basic generation of a special points map optionally includes multiple finger touches of the same finger, that is, a user's thumb print is optionally scanned multiple times.
  • Each image data from each scanning is referred to herein as a "standard.”
  • the term "reliability, " as used above, relates a probability of recognizing a registered user, that is, matching a user's thumb print data with thumb print data m the data base after one touch.
  • UnDir (a value greater than Pi) is a mask value used to detect the absence of fingerprint data in a current cell, for the n-th FP.
  • In imaging step 300, the user's thumb print is scanned by the CCD sensor 102 and then digitized at step 305, wherein analog levels for each pixel of the CCD sensor 102 are digitized to form one byte per pixel.
  • the analog levels of the pixels are successively digitized by the A/D converter 158 and stored in the RAM 156.
  • a sequence of filtering and contrasting transformations is executed on the initial matrix of intensity data. The aim is to obtain a more "stable" image of the fingerprint from touch to touch.
  • the print image data FP is optionally transferred to the computer 50 as indicated in Figure 5.
  • the filtering and contrasting transformations may be executed by the microcontroller 150 in the computer mouse 60.
  • the matrix of intensity data from the CCD sensor 102 that is, the print image data FP, includes the fingerprint and surrounding "garbage” .
  • a border between the print image and the "garbage” is defined and the "garbage” is excluded so that only the internal part of the print image, that is the portion which includes ridge lines, takes part in the further analysis.
  • preprocessing of the print image data FP is carried out beginning with a scale normalization step 310 in which the scale of the print image data FP is normalized using standard routines.
  • the print image data FP is then used to calculate directional image data DI using gradient statistics in directional calculation step 315, wherein the print image is divided into cells having a size defined by Fx and Fy.
  • the print image data FP is divided into cells as shown by a grid superimposed on the print image and a vector normal to the direction of ridge lines in each cell is calculated. These vectors form the directional image data DI .
  • an array of directional image data F(i,j) is generated where i and j denote the cell and the value of F(i,j) is between 0 and Pi for directional cells or is set to UnDir for cells wherein a directional gradient cannot be determined such as for isolated pixels or pixel groups lacking directionality.
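One conventional way to realize this gradient-statistics step is the doubled-angle orientation estimator sketched below. It is consistent with, but not necessarily identical to, the patent's calculation; the cell size and the coherence threshold used to set UnDir are assumed values.

```python
import numpy as np

UNDIR = -1.0  # flag for cells with no reliable ridge direction

def directional_image(image: np.ndarray, fx: int = 16, fy: int = 16,
                      coherence_min: float = 0.1) -> np.ndarray:
    """Return F(i, j): a direction in [0, Pi) per cell, or UNDIR for non-directional cells."""
    gy, gx = np.gradient(image.astype(float))
    rows, cols = image.shape[0] // fy, image.shape[1] // fx
    f = np.full((rows, cols), UNDIR)
    for i in range(rows):
        for j in range(cols):
            cgx = gx[i * fy:(i + 1) * fy, j * fx:(j + 1) * fx]
            cgy = gy[i * fy:(i + 1) * fy, j * fx:(j + 1) * fx]
            vx = np.sum(2 * cgx * cgy)             # doubled-angle vector components
            vy = np.sum(cgx ** 2 - cgy ** 2)
            energy = np.sum(cgx ** 2 + cgy ** 2)
            if energy == 0 or np.hypot(vx, vy) / energy < coherence_min:
                continue                           # isolated or non-directional cell stays UNDIR
            # dominant gradient orientation, i.e. the normal to the ridge lines
            f[i, j] = (0.5 * np.arctan2(vx, vy)) % np.pi
    return f
```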
  • the directional image data DI is then subjected to a smoothing process and its quality factor Q is determined in a smoothing and quality processing step 320.
  • the smoothing process includes first applying a low-pass filter and then a low-cut filter, after which a directional smoothing along the directions defined for each cell is effected.
  • Scale normalization, low-pass filtering, low-cut filtering, directional image calculation, and smoothing are processes that are realizable by those of ordinary skill in the art. Accordingly, detailed discussions thereof are omitted.
  • the quality Q of a print image data FP is then calculated by determining a ratio of cells that remain substantially unchanged following the smoothing and quality processing step 320 to the total number of cells. This ratio is then squared and multiplied by the area of the print image data FP divided by the area of the entire scanned image. Thus, both the quality of the print image data FP and absence of image data corresponding to a fingerprint are taken into consideration.
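Restated compactly (the function and argument names below are illustrative, not the patent's):

```python
def image_quality(unchanged_cells: int, total_cells: int,
                  print_area: float, scanned_area: float) -> float:
    """Q = (unchanged cells / total cells)^2 * (print area / scanned area), as described above."""
    return (unchanged_cells / total_cells) ** 2 * (print_area / scanned_area)
```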
  • Quality decision step 325 is then executed to determine whether the quality Q of the print image FP is above a given quality threshold. When the quality Q is below the given quality threshold, the process returns to the imaging step 300 for input of new data. This is because it is determined that the quality of the fingerprint is insufficient to base matching upon. If the quality is above the given threshold, processing proceeds to a binarization step 330.
  • the image data FP shown in Figure 8(a) is subjected to preliminary binarization using subtraction of low-pass filtering resulting in the image data FP producing the image shown in Figure 8 (b) , followed by directional filtering and binarization resulting in the image of Figure 8 (c) .
  • Processing continues with execution of a skeletonization step 335 wherein the image data FP is subjected to a thinning and skeletonization processing wherein all ridge lines are reduced to a width of one pixel which results in the image shown in Figure 8(d) .
  • visible ridge lines that are several pixels in width are reduced to lines one pixel in width.
  • the values on the ridge lines are 1 and for all other areas the values are 0.
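Off-the-shelf morphological thinning produces exactly this kind of one-pixel-wide 1/0 ridge map. The sketch below uses scikit-image as one possible implementation; the patent does not name a particular thinning routine.

```python
import numpy as np
from skimage.morphology import skeletonize

def skeleton_map(binarized: np.ndarray) -> np.ndarray:
    """Thin binarized ridge lines to one-pixel width: 1 on ridge lines, 0 elsewhere."""
    return skeletonize(binarized > 0).astype(np.uint8)
```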
  • a minutia extraction step 340 is next executed upon the image data FP that has been skeletonized.
  • Fingerprints are characterized by various minutia which are particular patterns of the ridges.
  • Two basic types of minutia are a bifurcation 400, or branch, shown in Figure 9a, wherein a ridge line 402 divides into two ridge lines, 403 and 405, and an end 410, shown in Figure 9b, wherein a ridge line 412 ends.
  • Each minutia is characterized as a vector represented by a minutia data triplet X, Y, and A wherein X and Y represent the location of the minutia and A is an angle of a vector of the direction of the minutia as shown in Figures 9a and 9b.
  • In an embodiment, distinctions between end minutia 410 and bifurcation minutia 400 are not made. It is found that exclusion of such distinction results in a reduction of data, and reduced processing needs and time, while still providing acceptable reliability of fingerprint comparison. Alternatively, the distinction may be made with an associated increase in processing.
  • the minutia extraction step 340 further proceeds with exclusion of minutia that are too closely located.
  • two end minutia at (x1, y1) and (x2, y2), respectively, and represented by vectors (p1, q1) and (p2, q2), respectively, are shown.
  • determination is made as to whether the two minutia are within a threshold distance.
  • This threshold distance is optionally a distance r used to determine matching minutia and discussed below, a fixed distance, or another distance based on mean ridge line separation distance.
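The triplet representation and the close-pair exclusion rule can be sketched as follows. The exclusion distance and the angular tolerance are illustrative placeholders, since the text leaves the exact limits open.

```python
import math
from typing import List, NamedTuple

class Minutia(NamedTuple):
    x: float   # position
    y: float
    a: float   # direction angle in radians (the triplet X, Y, A)

def exclude_close_pairs(minutiae: List[Minutia],
                        exclusion_dist: float = 8.0,
                        opposite_tol: float = math.pi / 6) -> List[Minutia]:
    """Drop pairs of minutiae lying within exclusion_dist of one another whose
    directions are within opposite_tol of being exactly opposed."""
    drop = set()
    for i, m1 in enumerate(minutiae):
        for j in range(i + 1, len(minutiae)):
            m2 = minutiae[j]
            if math.hypot(m1.x - m2.x, m1.y - m2.y) > exclusion_dist:
                continue
            # angular difference folded into [0, pi]; a value near pi means opposite directions
            diff = abs((m1.a - m2.a + math.pi) % (2 * math.pi) - math.pi)
            if abs(diff - math.pi) <= opposite_tol:
                drop.update((i, j))
    return [m for k, m in enumerate(minutiae) if k not in drop]
```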
  • the minutia extraction is advantageous in reducing the amount of data to be processed and thereby reducing the processing time and requirements.
  • FP1 now refers to the image data of the input fingerprint and FP2 refers to print image data of a fingerprint retrieved from the database in database retrieval step 347.
  • other variables are appended with 1 or 2 to represent the respective fingerprint .
  • the values of fa, fdx, and fdy are iteratively varied and, for each permutation thereof, the transformation F1(fa, fdx, fdy)(i,j) is made and compared with F2(i,j) to find a DifDI for each set of fa, fdx, fdy values.
  • a set of fa, fdx, fdy values is then chosen for which DifDI is minimal.
  • the chosen set of fa, fdx, fdy represents the best shifting parameters for shifting the directional image to effect the best matching directional alignment of the directional images of FP1 and FP2.
  • BI is determined as the number of cells (i,j) of either directional image that are not UnDir.
  • a directional difference decision step 350 is next executed wherein the minimal DifDI for the chosen set of fa, fdx, fdy is compared against a threshold DifDI_TH which may be a set threshold or a threshold based on BI. If DifDI exceeds the threshold DifDI_TH, then it is determined that the correspondence level, or matching level, between the directional images is insufficient to warrant further comparison of FP1 and FP2; a different fingerprint image data is chosen for FP2 and processing returns to the beginning of the matching process step 345. If DifDI is less than the threshold, operation proceeds to similarity measure calculation step 355.
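A brute-force sketch of that alignment search. It assumes the angle offset fa is simply added to each cell direction and that only whole-cell translations are tried; the ranges, step sizes and function names are assumptions introduced for illustration.

```python
from itertools import product
import numpy as np

UNDIR = -1.0

def directional_difference(f1, f2, fa, fdx, fdy):
    """Mean orientation difference between F1 shifted by (fdx, fdy, fa) and F2,
    over cells that are directional (not UNDIR) in both images."""
    total, count = 0.0, 0
    for i, j in product(range(f2.shape[0]), range(f2.shape[1])):
        si, sj = i + fdy, j + fdx
        if not (0 <= si < f1.shape[0] and 0 <= sj < f1.shape[1]):
            continue
        if f1[si, sj] == UNDIR or f2[i, j] == UNDIR:
            continue
        total += abs((f1[si, sj] + fa - f2[i, j] + np.pi / 2) % np.pi - np.pi / 2)
        count += 1
    return total / count if count else np.inf

def best_directional_shift(f1, f2, max_shift=4, angles=np.linspace(-0.3, 0.3, 7)):
    """Return the (fa, fdx, fdy) set minimizing DifDI, together with that minimal DifDI."""
    best_params, best_dif = None, np.inf
    for fa, fdx, fdy in product(angles, range(-max_shift, max_shift + 1),
                                range(-max_shift, max_shift + 1)):
        dif = directional_difference(f1, f2, fa, fdx, fdy)
        if dif < best_dif:
            best_params, best_dif = (fa, fdx, fdy), dif
    return best_params, best_dif
```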
  • the chosen set of fa, fdx, fdy for orthogonal transformation is applied as (fdx*Fstepx, fdy*Fstepy and fa) to the minutia data triplets X1(k), Y1(k), and A1(k) of FP1, where k represents a k-th minutia.
  • the transformed minutia data triplets of print image data FP1 are then grouped into clusters each containing not less than a given number of minutia, preferably seven.
  • FP1 is illustrated as being divided into four clusters CS1, CS2, CS3, and CS4, which each contain the given number of minutia (not shown).
  • Figure 11 is a simplified depiction of the process in that the clusters do not necessarily cover square regions of the print image and the number of clusters is not limited to four. The clusters may be thought of as regional groupings of minutia.
  • X1(k), Y1(k) of the minutia of the given cluster are all iteratively shifted in the x and y directions by values dr, wherein dr is varied within a shift range.
  • the BI grouping of FP2 is the group of cells in FP2 that are not UnDir.
  • Three empirical values are used: the first is 150; the second is set equal to R1, where R1 equals 30, and R2, where R2 equals 20, R1 and R2 being discussed below; and the third is set equal to 4.
  • the maximum similarity measure Smt(FP1, FP2) is generated for the best comparisons of all clusters of FP1 with FP2, along with a number Nmat of matched minutia, and a number Ntot which is the total number of minutia within the BI grouping of FP1.
  • An overall similarity measure for the comparison of FP1 with FP2 is calculated as Nmt(R, r, BI, Ntot) = Smt(FP1, FP2) - DifDI, where Smt(FP1, FP2) is the sum of the best Smt of each cluster.
  • Nmt(R, r, BI, Ntot) is compared with a threshold Thr(R, r, BI, Ntot). If Nmt(R, r, BI, Ntot) is greater than the threshold Thr(R, r, BI, Ntot), it is determined that FP1 matches FP2 and a match is indicated in match indication step 365.
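The cluster-wise search and the overall measure can be sketched as below. The per-cluster similarity here is simply the count of matched minutiae, which is one plausible reading of the text rather than the patent's exact formula; the shift range follows the R1 value of 30 given above, and the match radius r is an assumed placeholder.

```python
import math
from itertools import product

def count_matches(cluster, reference, r):
    """Number of cluster minutiae having some reference minutia within distance r."""
    return sum(1 for (x, y, a) in cluster
               if any(math.hypot(x - rx, y - ry) <= r for (rx, ry, ra) in reference))

def cluster_similarity(cluster, reference, shift_range, r):
    """Best match count for one cluster over all (dx, dy) shifts within the shift range."""
    best = 0
    for dx, dy in product(range(-shift_range, shift_range + 1), repeat=2):
        shifted = [(x + dx, y + dy, a) for (x, y, a) in cluster]
        best = max(best, count_matches(shifted, reference, r))
    return best

def overall_measure(clusters, reference, dif_di, shift_range=30, r=8):
    """Nmt = Smt(FP1, FP2) - DifDI, where Smt sums the best per-cluster similarities;
    a match is declared when this exceeds the trained threshold Thr(R, r, BI, Ntot)."""
    smt = sum(cluster_similarity(c, reference, shift_range, r) for c in clusters)
    return smt - dif_di
```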
  • the threshold Thr(R, r, BI, Ntot) is determined on the basis of threshold training using a sample pool of fingerprints from a number of individuals.
  • the sample pool is composed of a number of samples, or standards, from each individual in the pool.
  • the number of samples, from each individual in the pool is 4 and the number of individuals is in a range of 100 to 1000.
  • the number of samples and individuals may be varied from the exemplary values and range without departing from the scope and spirit of the present invention.
  • the process steps 305 through 355 of Figure 6 are then executed for each print with every print being compared to every other print. Since the sample pool is known, comparisons of prints from a same individual and comparisons of prints from different individuals are known.
  • MID is the mean inter-ridge distance of the prints in the sample pool. The following values are found: NmtS(R1, r1, BI, Ntot), NmtA(R1, r1, BI, Ntot), NmtS(R2, r2, BI, Ntot), and NmtA(R2, r2, BI, Ntot), where NmtS is the number of matched minutia for prints compared from the same individual while NmtA is the number of matching minutia resulting from the comparison of fingerprints from different individuals.
  • BestA(n, BI, Nmat) is set to the maximum NmtA(Rn, rn, BI, Ntot) of all the comparisons of fingerprints from different individuals.
  • the thresholds are then calculated as follows:
  • the similarity decision step 360 produces a positive match indication if, for the current BI and Ntot:
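The data-collection part of that threshold training can be sketched as follows. How the final Thr values are derived from BestA is not reproduced, since the text leaves the formula open; the pairing logic shown is an assumption consistent with the description.

```python
from collections import defaultdict
from itertools import combinations

def collect_best_impostor_scores(sample_pool, match_count, params):
    """Scan every pair of standards from *different* individuals and record the best
    impostor score per parameter set.

    sample_pool : list of (person_id, fingerprint) standards, several per individual
    match_count : callable (fp1, fp2, R, r) -> matched-minutia count (an Nmt-style value)
    params      : list of (R, r) range pairs, e.g. [(30, r1), (20, r2)]

    The text additionally bins BestA by BI and Ntot; that bookkeeping is omitted here.
    """
    best_a = defaultdict(int)
    for (pid1, fp1), (pid2, fp2) in combinations(sample_pool, 2):
        if pid1 == pid2:
            continue                     # same-individual pairs feed NmtS instead
        for n, (R, r) in enumerate(params, start=1):
            best_a[n] = max(best_a[n], match_count(fp1, fp2, R, r))
    return best_a
```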
  • the complete description to be stored in the database is a multilevel structure of 4 (or more) FP data sets taken from the different applications of the same FP .
  • Each level of the structure corresponds to minutia appearance frequencies for all FP codes .
  • As an alternative to the thresholds for the similarity comparison as discussed above, fixed values may be chosen and used as threshold values.
  • the data base of fingerprints of individuals for whom identification is required is created by a registration process.
  • the registration process entails a given individual having their fingerprints scanned a number of times, for example four. Of the four scans, the scanning producing the greatest number of minutia is then selected for the database.
  • the present invention further includes use of the above fingerprint minutia extraction and comparison process in conjunction with a cryptographic protection process.
  • the cryptographic protection process involves the computer 50, also referred to as the client, the remote computer 51, also referred to as the server, and the link 53, which may be, for example, a link over the Internet.
  • security protection for data sent over the link 53 is required.
  • In order to use the cryptographic process, the user must first register his fingerprint with the server. In order to maintain security, the fingerprint data must be encrypted to prevent unauthorized interception thereof. The following steps are used:
  • This data set is also referred to herein as a passport.
  • components of the data set may be omitted, such as F(i,j), so the passport may be shortened to about 1.2 KB.
  • the client, the computer 50, then sends a request for the public key to the server via the link 53.
  • the server sends its public key K_E via the link 53.
  • the client encrypts its passport and his UserID using the RSA algorithm and the public key K_E.
  • the length of the key is 512 bits.
  • the computer 50 sends C to the remote computer 51.
  • the remote computer 51 decrypts the message using its secret key K_D.
  • the remote computer 51 then adds the UserID and passport to the database.
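The registration exchange can be sketched as the message flow below. The rsa_encrypt and rsa_decrypt helpers are deliberately trivial stubs: a real 512-bit RSA key cannot encrypt a roughly 1.2 KB passport in a single block, so a practical implementation would chunk the data or wrap a symmetric key. Only the sequence of steps follows the text; all names are illustrative.

```python
# Registration flow: client = computer 50, server = remote computer 51, link 53 between them.

def rsa_encrypt(public_key: bytes, data: bytes) -> bytes:
    return data[::-1]          # stub standing in for RSA encryption with the public key K_E

def rsa_decrypt(secret_key: bytes, blob: bytes) -> bytes:
    return blob[::-1]          # stub standing in for RSA decryption with the secret key K_D

class Server:
    """Remote computer 51: holds the key pair and the passport database."""
    def __init__(self):
        self.public_key, self.secret_key = b"K_E", b"K_D"
        self.database = {}

    def register(self, ciphertext: bytes) -> None:
        user_id, _, passport = rsa_decrypt(self.secret_key, ciphertext).partition(b":")
        self.database[user_id] = passport          # UserID and passport added to the database

def client_register(server: Server, user_id: bytes, passport: bytes) -> None:
    key = server.public_key                              # request and receive the public key K_E
    c = rsa_encrypt(key, user_id + b":" + passport)      # encrypt the passport and UserID
    server.register(c)                                   # send C over the link; the server stores it

# Example: client_register(Server(), b"user-1", b"<passport: F(i,j) data and minutia triplets>")
```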
  • the user authorization process is used where a user wishes to gain access to the remote computer on the basis of his fingerprint matching one in the database. 1. The user scans his fingerprint image data into the computer 50.
  • 2. The computer 50 converts the image of the finger to the passport using processing steps 310 through 340 shown in Figure 6.
  • 3. The computer 50 sends a request over the link 53 to the remote computer 51, the server, for the server's public key.
  • 4. The remote computer 51 sends its public key K_E to the computer 50.
  • 5. The computer 50 encrypts the passport and UserID using the RSA algorithm with the public key K_E.
  • 6. The computer 50 sends C to the remote computer 51 via the link 53.
  • 7. The remote computer 51 searches the database for the UserID, finds the corresponding passport, and executes steps 345 through 365 of Figure 6 using the passport retrieved from the database as FP2.
  • In this authorization process, step 350 is omitted. If the comparison of step 360 is positive, access is authorized. If the UserID does not exist or the comparison result of step 360 is negative, authorization for access is refused.
  • the server takes its UserID and passport and encrypts them with the administrator's public key.
  • Usage of two different keys makes it more difficult to corrupt fingerprint data since an intruder must obtain both public and private keys to complete his attack.
  • Different servers will have different keys to ensure that corrupted fingerprint data (i.e. stolen from some server) could not be used on other servers.
  • the 512-bit RSA keys are extremely difficult to crack. In fact, keys of that length are not known to have been broken, so current cryptography regards them as suitable for long-term secret information (30-50 years or longer). The average time of encryption of a passport (client side) is less than a second.
  • a further aspect of the present invention provides software for working in the Windows environment.
  • a protection icon is provided which an authorized user, one whose passport has produced a positive comparison, may move and drop on a file or program object to require that future access thereto be permitted only when a positive fingerprint comparison has been executed.
  • the user may input a list of UserIDs for whom access will be allowed.

Abstract

A biometric input device (54), system and method includes a biometric input device having a scanning window (56) disposed in a side of a device body so as to facilitate positioning of a biometric sample such as a thumb. The biometric input device (54) includes an optical assembly (80) having a prism (82) with a focusing lens (84) disposed on a side thereof and optionally integrally formed therewith and associated with a scanning surface (92) of the prism disposed at the scanning window such that a light image may be generated therethrough and focused onto an imaging component (102) for conversion into a pixel output. An LED (89) structured to emit light through a top surface of the prism functions to generate the light image of the biometric identifier such as the fingerprint.

Description

Description BIOMETRIC SYSTEM FOR BIOMETRIC INPUT, COMPARISON, AUTHENTICATION AND ACCESS CONTROL AND METHOD THEREFOR BIOMETRIC
BACKGROUND OF THE INVENTION
The present application is a Continuation-in-Part of U.S. Patent Application Serial No. 09/312,002, filed on May 14, 1999, for a Biometric System for Biometric Input, Comparison, Authentication and Access Control and Method Therefor, the contents of which are incorporated herein by reference.
Field of the Invention
The present invention relates to a system for biometric input, comparison, and authentication and, more particularly, to a biometric input device having a scanning window with an illuminated prism, image detector and scanning electronics operable in conjunction with a biometric data comparison system for comparing directional and minutia data. The biometric input device provides a compact, yet highly functional configuration and the biometric data comparison system provides for controlled access to a computing system based upon comparison of inputted biometric data with biometric data stored in a database.
Description of the Related Art
Biometric input devices are known for use with computing systems. Such biometric input devices include computer mouse designs. Existing designs for such biometric input devices have scanning windows lacking efficient positioning structure for scanning positioning and protection from ambient light, and do not provide mechanical integration of a position sensing ball assembly with an optical scanning assembly maximizing reliability of position sensing ball operation.
Biometric data comparison methods and systems are known. Such known systems and methods suffer from various drawbacks including intensive computing power requirements, intensive memory requirements, slow data transfer, slow comparison, and comparison reliability reduction due to environmental and physiological factors. Known systems also fail to provide for secure communication of biometric data over public lines.
Summary of the Invention
Accordingly, it is an object of the invention to provide a system and method for biometric input and comparison which overcomes the drawbacks of the prior art .
It is a further object of the invention to provide an ergonomically advantageous biometric input device which ensures increased precision in sampling biometric data.
It is still a further object of the invention to provide a biometric data comparison method which controls access to computers or data networks. It is yet another object of the invention to provide a fingerprint comparison method which provides for accurate and rapid comparison of fingerprints while compensating for environmental and physiological factors.
An object of the present invention is also to provide a biometric based access control system for use on computers which permits a user to graphically apply biometric access control features to data and applications by the use of a user manipulated biometric protection icon.
Briefly stated, the present invention provides a biometric input device, system and method which includes a biometric input device having a scanning window surrounded by a ridge for ensuring positive positioning of a biometric sample such as a thumb. The biometric input device includes an optical assembly having a prism with a focusing lens disposed on a side thereof and optionally integrally formed therewith. A biometric comparison method is provided for comparing data from said biometric input device with data from a database using both directional image comparison and clusterized minutia location and direction comparison. A further system is provided for allowing access to computer functions based on the outcome of the comparison method.
The present invention also provides a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body wall defining an aperture and an optical assembly for scanning the fingerprint disposed in the device body. The optical assembly has a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly. A ridge surrounds a portion of a periphery of the aperture such that the ridge engages the opposing tip sides and tip end such as to position the fingerprint on the scanning surface and block ambient light.
A further feature of the present invention includes the aforesaid biometric input device having a device body with a bottom surface opposing a substrate upon which the device body is placed, a device body length and a front portion, a middle portion and a heel portion. A movement detection device for detecting movement of the device body relative the substrate is provided and the bottom surface defines a bottom surface aperture through which the movement detection device detects movement of the device body relative the substrate. The bottom surface aperture is disposed in the heel portion of the device body and the optical assembly is disposed in the middle portion of the device body. In an embodiment of the present invention the movement detection device has a ball protruding through the bottom surface aperture for engaging the substrate to register the movement of the device body relative the substrate.
According to a feature of the invention, there is further provided a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body side wall defining an aperture, and an optical assembly for scanning the fingerprint disposed in the device body. The optical assembly includes an imaging component for converting a light image into pixel output and a lens for focusing the light image into the imaging component. The optical assembly includes a prism with first, second and third sides and a top side wherein the first side forms a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly, the second side has the lens for focusing the light image into the imaging component disposed thereon, and the third side has a light absorbing layer.
The present invention also includes the above embodiment wherein, in the alternative or in combination with one another, the lens is formed integrally with the prism and a light emitting device is disposed to emit light into the prism from the top side of the prism to illuminate the fingerprint when disposed at the scanning surface . According to a still further feature of the invention, there is provided a biometric comparison method comprising a series of steps beginning with (a) scanning in a fingerprint and digitizing the scanning signals to produce a matrix of print image data representing pixels. Next the method proceeds with (b) dividing the print image data into cells, each including a number of pixel data for contiguous pixels, and (c) calculating a matrix of directional image data DI using gradient statistics applied to the cells wherein the directional image data DI includes, for each of the cells, a cell position indicator and one of a cell vector indicative of a direction of ridge lines and an unidirectional flag indicative of a nondirectional calculation result. Processing then continues with (d) skeletonizing the print image data, and (e) extracting minutia from the print image data and producing a minutia data set comprised of data triplets for each minutia extracted, including minutia position data and minutia direction data.
Next, a comparing process is initiated by (f) providing reference fingerprint data from a database wherein the reference fingerprint data includes reference directional image data DI and a reference minutia data set, and (g) performing successive comparisons of the directional image data DI with the reference directional image data DI and determining a directional difference DifDI for each of the successive comparisons wherein for each of the successive comparisons one of the directional image data DI and the reference directional image data DI is positionally shifted by adding position shift data. In a next step (h) it is determined for which of the successive comparisons the directional difference DifDI is the least and the position shift data thereof is selected as initial minutia shift data. A next stage of the comparison process proceeds with (i) positionally shifting minutia data by applying the initial minutia shift data to one of the minutia data sets and the reference minutia data set to initially positionally shift the minutia position data and the minutia orientation data, then (j) performing successive comparisons of the minutia data set with the reference minutia data set following the positional shifting of minutia data and determining matching minutia based on a minutia distance criteria, a number of matching minutia, and a similarity measure indicative of correspondence of the matching minutia for each of the successive comparisons wherein, for each of the successive comparisons, one of the minutia data set and the reference minutia data set is positionally shifted within a minutia shift range R by adding minutia position shift data, and finally (k) determining a maximum similarity measure of the similarity measures of the successive comparisons. The comparison method concludes with (l) determining whether the maximum similarity measure is above a similarity threshold and indicating the reference fingerprint data and the fingerprint data are from the same fingerprint when the maximum similarity measure is above the similarity threshold.
The present invention also includes the above method wherein, as an alternative, the calculation of the directional image data includes (c1) identifying a directional group of cells comprising all cells of the cells that do not have the unidirectional flag associated therewith; and then excluding from the successive comparisons of minutia data sets, one of the minutia data sets and the reference minutia data set located in or positionally aligned with the cells that have the unidirectional flag associated therewith.
The present invention further provides a feature for use in conducting the successive comparisons of minutia comprising dividing the minutia data set into the minutia data set clusters formed on contiguous ones of the cells and each including a predetermined number of the minutia before conducting the successive comparisons, conducting the successive comparisons for each of the minutia data set clusters and determining for each of the minutia data set clusters a maximum similarity measure, and finally determining the maximum similarity measure as a sum of the maximum similarity measures of each of the minutia data set clusters.
The present invention also provides for the above comparison method excluding from further processing pairs of the minutia located within a minutia exclusion distance of one another and having minutia direction data within a direction exclusion limit of being in opposite directions.
The present invention further provides a feature wherein in the above comparison method the minutia extraction step extracts minutia limited to ends and bifurcations. Still further there is provided a feature wherein the minutia data set excludes data distinguishing ends and bifurcations.
Yet another feature of the present invention is a biometric comparison system comprising a computer having a memory including a reference fingerprint data and at least one of file data and application software, a display, an apparatus for representing at least one of file data and application software as icons on the display, and a biometric input device for scanning a fingerprint and storing fingerprint data representing the fingerprint into the memory. A comparison engine is provided for comparing the fingerprint data with the reference fingerprint data and determining a match if a similarity threshold is satisfied. An access control icon generator permits a user to move an access control icon on the display and an access control means is provided for controlling access to the at least one of file data and application software when a user moves the access control icon onto the icon representing the at least one of file data and application software whereby access to the at least one of file data and application software is permitted only if a user scans a fingerprint producing fingerprint data which the comparison means determines matches the reference fingerprint data. The above, and other objects, features and advantages of the present invention will become apparent from the following description read in conjunction with the accompanying drawings, in which like reference numerals designate the same elements.
Brief Description of the Drawings
For a fuller understanding of the nature of the present invention, reference should be had to the following detailed description taken in connection with the accompanying drawings in which:
Figure 1a is a block diagram of a system of the present invention;
Figure 1b is a block diagram of an alternative system of the present invention;
Figure 2a is a top plan simplified view of a biometric input device of the present invention;
Figure 2b is a side elevation view of the biometric input device of Figure 2a showing internal components in dashed lines;
Figure 3a is a side elevation view of the biometric input device of Figure 2a showing surface contours;
Figure 3b is a bottom perspective view of the biometric input device of Figure 2a showing surface contours and dimensional disposition of features;
Figure 4 is a block schematic of the biometric input device of Figure 2a;
Figure 5 is a flow chart for operation of the biometric input device of Figure 2a;
Figure 6 is a flow chart of the comparison method of the present invention;
Figure 7 is an illustration of a directional image analysis;
Figure 8a is an image of the fingerprint based on data received from an optical scanning assembly;
Figure 8b is an image of the fingerprint of Figure 8(a) following low pass filtering;
Figure 8c is an image of the fingerprint of Figure 8(a) following directional filtering and binarization;
Figure 8d is an image of the fingerprint of Figure 8(a) following skeletonization;
Figure 9a is a depiction of a bifurcation;
Figure 9b is a depiction of an end;
Figure 10 is a depiction of an analysis of two minutia for exclusion purposes;
Figure 11 is a simplified depiction of a fingerprint image data FP1 divided into clusters; and
Figure 12 is a simplified depiction of the clusters of Figure 11 applied with individual shifts to print image data FP2.
Like reference numerals refer to like parts throughout the several views of the drawings.
Detailed Description of the Preferred Embodiment Referring to Figure 1A, a computer 50 has a keyboard 52 and a biometric input device 54 with a scanning window 56 for accepting biometric input. The computer 50 may take the form of a personal computer, a dedicated device such as an ATM machine, a dumb terminal, or a computer on the order of a workstation, minicomputer or mainframe. Optionally, the computer 50 is connected to a remote computer 51 via a link 53 which may be a direct link via phone lines or direct cabling, or via a network such as a LAN, WAN, intranet or Internet. In order to gain access to use of the computer 50, or remote computer 51, for all or only specified functions, a user must provide a biometric input to the biometric input device 54 via the scanning window 56. Hereinafter the computer 50 will be referred to, however, it is understood that the remote computer 51 may optionally perform the functions ascribed to the computer 50 with the computer 50 functioning as a terminal. Likewise, reference to gaining access to use of the computer 50 is understood to include the alternative of access to use of the remote computer 51.
The computer 50 compares biometric data, representing the biometric input, with stored biometric data and determines if the biometric data corresponds to any stored biometric data held in a data base. If a correspondence exists, the user is given authorization, that is, the user is allowed access to the computer 50 for performance of the specified functions or for use of the computer 50 in general.
The biometric input device 54 is connected to the computer 50 via an input cord 72. Alternatively, depending upon the type of port the biometric input device 54 uses to communicate with the computer 50, an embodiment of the present invention has a port adaptor connector 57 connecting the input cord 72 to a corresponding port on the computer 50. A still further alternative provides an embodiment of the present invention wherein a stand-alone adaptor unit 58 channels data via the input cord 72 and a cable 59 to and from the computer 50. Moreover, if desired, an infrared or other remote and/or wireless data communication structure could be provided.
Referring to Figure 1B, an alternative configuration is shown wherein the scanning window 56 and associated structure is incorporated in either the computer 50 or the keyboard 52. In such instances, the stand-alone biometric input device 54 is omitted and functions thereof are performed by the computer 50 or by circuitry incorporated in the keyboard 52. It is understood that functions discussed herein with respect to the biometric input device 54 and the computer 50 may optionally be distributed between the biometric input device 54 and the computer 50 as is practical.
Referring to Figures 2A and 2B, the biometric input device 54 is shown in the form of a computer mouse 60. Alternatively, the biometric input device may take the form of another type of input device such as a track ball, joystick, touch pad or other variety of input device. The computer mouse 60 preferably includes a left button 62, a right button 64, a ball 66, an X direction sensor 68, and a Y direction sensor 70. Various means may be used to effect input from these devices including mechanical, optical or other. For example, optical means may be substituted for the ball 66 to detect mouse movement. The input cord 72 connects to the computer 50 for effecting data transfer. Optionally, the input cord 72 is replaced by wireless means for effecting data transfer which operate using optical or electromagnetic transmission.
The present invention further includes an optical assembly 80. The optical assembly 80 preferably includes a prism 82, a first lens 84, a mirror 86, a CCD assembly 88, and LEDs 89. In particular, the prism 82 has first, second and third sides, 90, 92 and 94, respectively. The first side 90 generally defines the surface of the scanning window 56. Moreover, a coating(s) or a transparent plate may optionally be used to protect the first side 90. The second side 92 preferably includes the first lens 84 disposed thereon or formed integrally with the prism 82. Preferably, the prism 82 is molded integrally with the first lens 84 which provides for reducing part count and simplifying the assembly of the biometric input device 54. The third side 94 includes a light absorbing coating 96. The CCD assembly 88 includes a CCD sensor 102 and a second lens 104 which functions as an object lens. The first and second lenses 84 and 104 preferably function in conjunction with the mirror 86, as shown by light ray tracings, to focus an image at the first surface 90 onto the CCD sensor 102. Various other lens assemblies and configurations may optionally be realized by those of ordinary skill in the art and are considered to be within the scope and spirit of the present invention.
In order to input biometric data, a user holds the computer mouse 60 with the index, middle or third finger preferably extended to operate the left and right buttons, 62 and 64, and with the thumb contacting the scanning window 56 to permit an image of a thumb print to be focussed onto the CCD sensor 102. The user then operates any of the left and right buttons, 62 or 64, or other input device, to initiate scanning of the thumb print. Alternatively, scanning may be automatically initiated by circuitry in the biometric input device 54 or the computer 50.
The structural configuration of an illustrated embodiment of the computer mouse 60 is detailed below wherein a front portion 109 of the computer mouse 60 generally refers to an end portion of the computer mouse 60 from where the input cord 72 preferably extends and where the left and right buttons, 62 and 64, are situated, a heel portion 110 which comprises a rear end portion where a user's palm typically rests, and a middle portion 111 which is an area where the balls of the user's hand typically are situated. The front portion 109, the heel portion 110, and the middle portion 111 are situated to define three sections of a length L of the computer mouse 60 extending from a front end of the front portion 109 to a rear end of the heel portion 110.
The scanning window 56 is preferably situated generally on a side of the middle portion 111 and preferably has a ridge 120 framing at least three sides of the scanning window 56. The ridge 120 is configured to accept a perimeter of a user's thumb, thereby defining a scanning position of the user's thumb in the scanning window 56. Furthermore, the ridge 120 serves to shield the scanning window 56 from ambient light during the scanning process and also to protect the scanning window 56 from damage.
The ball 66 is preferably disposed with a center thereof within the heel portion 110 of the computer mouse 60. Such disposition of the ball 66 provides advantageous situation of the ball 66 under the palm of the user's hand so that pressure from the palm during operation ensures positive contact of the ball 66 with a substrate upon which the computer mouse 60 is used. The ball 66 is optionally disposed rearward of a mid-position in the computer mouse 60 wherein the mid-position is a middle of the length L of the computer mouse 60. In conventional configurations the ball 66 is situated either in the middle portion, forward of the mid-position in the computer mouse, or in the front portion. Such a construction is prone to intermittent contact of the ball with the substrate due to the user applying excessive downward force to the heel portion of the mouse resulting in the front and middle portions rising from the substrate.
A circuit board 140 contains circuitry for effecting scanning operation of the optical assembly 80. As an alternative to the optical assembly 80, a contact detection assembly may be realized wherein the scanning window 56 takes the form of a silicon contact sensor. In either configuration, a thumb print of the user is represented by data of an array of pixels. The LEDs 89 are mounted on the circuit board 140 in a position above a top surface of the prism 82 to radiate light into the prism 82 for scanning the thumb print. The embodiment shown has two LEDs, but it is realized a single LED may be used or alternative light generating devices may be substituted therefor. Furthermore, although the embodiment shown provides the LEDs 89 mounted on the circuit board 140, the LEDs 89 may alternatively be mounted on the prism 82 or molded into the prism 82, at the top side, in the same operation wherein the first lens 84 is molded integrally with the prism 82.
Referring to Figures 3A and 3B, perspective depictions of the computer mouse 60 illustrate the length L of the computer mouse 60, the disposition of the ball 66 and the structure of the ridge 120. The ridge 120 has an outer surface 122 extending outwardly from a side surface 126 of the computer mouse 60 and an inner surface 124 extending from a peak of the ridge structure to the scanning surface 56. The ridge 120 is raised from the side surface 126 preferably on at least three sides of the scanning window 56, that is, front, top and bottom sides. On a fourth or rear side, a rise of the ridge 120 from the side surface 126 is optionally omitted to permit ease of insertion of the thumb against the scanning window 56. The location of the ridge 120 on the three sides of the scanning window 56 ensures positive location of the thumb for scanning purposes to minimize scan to scan variations in positioning of the thumb print thereby facilitating thumb print comparisons. The center of the ball 66 is shown rearward of the mid-position, the middle portion 111 which includes the middle section of the computer mouse 60, and the three quarter length position. The outer surface 122 is concave but may optionally be flat or convex. Likewise, the inner surface 124 is concave but may optionally be flat or convex. Furthermore, the outer surface 122 may be omitted with the inner surface 124 serving alone to position the thumb wherein the inner surface 124 defines a recess in the side surface 126. However, the rising of the outer surface 122 from the side surface 126 provides for the side surface 126 protruding less outwardly from a mouse body centerline CL1 of the computer mouse 60, shown in Figure 2a, thereby providing for a functionally less cumbersome device.
Referring again to Figure 2a, a surface of the scanning window 56 is preferably inclined with respect to the mouse body centerline CL1 to define an acute angle with respect thereto in the range of 5° to 25°, and preferably in the range of 10° to 20°. A front edge of the scanning surface 56 is recessed inwardly toward the mouse body centerline CL1 from a position of the side wall 126 relative to the mouse body centerline CL1. Such positioning provides for an ergonomically advantageous positioning of the thumb when the computer mouse 60 is held. In one embodiment of the invention the scanning window 56 has a length of about 30mm and a width of about 18mm. Referring again to Figure 2b, the scanning window 56 is inclined in the vertical plane with respect to the substrate upon which the computer mouse 60 rests such that a longitudinal center line CL2 of the scanning surface defines an acute angle with respect to the substrate in the range of 0° to 25°, and preferably in the range of 5° to 15°. Such positioning provides for a further ergonomically advantageous positioning of the thumb when the computer mouse 60 is held.
The prism 82 is a right angle prism with a forward acute angle in the range of 40° to 60° and preferably in the range of 45° to 55°. The mirror 86 serves to redirect light to the CCD assembly 88 thereby providing for a compact arrangement of the optical assembly 80. In one embodiment the forward angle is about 50°.
Referring to Figure 4, an embodiment of circuitry provided on board 140 is shown. A microcontroller 150 is preferably interfaced with a CCD controller 152, a ROM 154, a RAM 156, and an A/D converter 158. Output from the CCD sensor 102 is input to the A/D converter 158 where it is digitized. The CCD controller 152 effects scanning of the CCD sensor 102 to transfer sensed levels of the pixels of the CCD sensor 102. The microcontroller 150 further controls the intensity of light produced by the LED 89. An interface controller 160 is interfaced with the microcontroller 150 to effect communication with a serial port of the computer 50. Other interfaces may be employed permitting data communication with the computer 50. Furthermore, the microcontroller 150 may optionally receive mouse input from the left and right mouse buttons, 62 and 64, and the x and y sensors, 68 and 70, and transmit the mouse input to the computer 50 to effect combined functions of thumb print scanning and mouse control .
The microcontroller 150 is optionally in the form of a programmable logic device (PLD) . One such device is the FLEX10K from Altera. The microcontroller 150 controls the CCD controller 152, determines a size and position of a frame, records image data of the frame into the RAM 156, and supports communication protocol with the interface controller 160, such as the RS-232 interface, the PS-2 interface, or the USB interface. The ROM 154 stores program codes for the microcontroller 150 and may be programmed to effect operations over various interfaces. While discrete IC's are shown, it is realized that the functions of the IC's may be integrated in a single IC. The CCD controller 152 effects reading of successive pixels and lines of the CCD sensor 102. A matrix of data from the pixel array of the CCD sensor 102 forms the frame and is stored in the RAM 156. The frame consists of data representative of the thumb print image and preferably excludes data from pixels not representative of the thumb print image. Thus, the frame represents a subset of data from a complete scanning of the CCD sensor 102. Accordingly, the amount of data to be processed and sent to the computer 50 is optionally reduced from that of an entire scan of the CCD sensor 102.
In an embodiment of the invention, the interface controller 160 may be incorporated into an interface unit 162 for connecting the input cord 72 to the computer to permit operation over various interfaces by substitution of the interface unit 162 having the desired interface controller 160. The interface unit 162 may be in a separate housing connectable to a desired input port, as shown in Figure 1a as the stand-alone adapter unit 58, or a connector housing itself as shown in Figure 1a as the port adapter connector 57. Implementation of the interface unit 162 is dictated by the type of port to be interfaced.
A parallel printer port interface (LPT), that is, a PS2 port interface, may be effected using a microcontroller and a PLD, for example, a ZILOG Corp. Z86E02 in conjunction with a FLEX8K PLD from Altera Corp. In such instance the interface connector 162 is a separate housing which is connected to the computer's printer port with a cable and has a connector for the input cord 72 and for a parallel printer cable through which a printer may be interfaced to the computer 50. Power is supplied to the interface connector 162 and the computer mouse 60 via the PS2 port from the computer 50. Data exchange for the computer mouse 60's usual mouse input, that is, input from the left and right buttons, 62 and 64, and the x and y sensors, 68 and 70, is preferably effected using standard protocol for the PS2 mouse interface and the PLD based on output from the microcontroller 150 of the computer mouse 60.
A full speed USB interface at 12 MBaud may be effected using a processor in the interface unit 162, such as an Intel Corp. 930, which has built-in USB functions. In such an instance the interface unit 162 is optionally a separate housing in the form of a stand-alone adapter unit 58 which is connected to the computer's USB port with a cable 59, as shown in Figure 1a, and has a connector for the input cord 72. Power is supplied from the computer 50 for the interface unit 162 and the computer mouse 60 via the USB port.
A serial port interface, that is, a COM port interface, functioning at 115.2 KB may be effected using a processor in the interface unit 162, such as an Atmel AT29C2051, and an RS232 voltage converter. In such an instance the interface unit 162 is optionally incorporated in a connector for connecting the input cord 72 to the computer's 50 serial port. Power is supplied from the computer 50 via a further connector and is processed by the voltage converter to drive the computer mouse 60.
Referring to Figure 5, a flow chart is shown of operation of the computer mouse 60. Operation begins at a start point 200 and proceeds to decision step 205 to determine whether a read print command is received from the computer 50, referred to as "PC" in the flow chart, to read in a thumb print. If a "read print" command is received, the LED 89 is lit to a maximum level in step 210. Next, in step 215, data from the CCD sensor 102 is read. Following reading CCD data, a decision step 220 is executed to determine whether a finger is detected. When a finger is detected operation proceeds to a decision step 225 to determine whether the light level is acceptable, and if it is not the level is adjusted and operation returns to step 215. If the light level is acceptable, operation proceeds to transmission step 230 wherein a message is sent to the computer 50 indicating that print data is to be sent. In another transmission step 235 a line of print data from the CCD sensor 102 is sent to the computer 50.
Operation then proceeds to a decision step 240 wherein it is determined whether the end of the image data has been sent to the computer 50. If transmission of the image data is not complete, a check is made in a status verification step 245 to see whether there is any mouse input, such as data from any of the left button 62, right button 64, X sensor 68, or Y sensor 70 input by the user and, if such data has been input, it is sent to the computer 50 in a transmission step 250. Operation returns to the transmission step 235 wherein a next line of CCD data is sent to the computer 50 after the mouse input is sent to the computer 50 or if no mouse input is detected. If it is determined in the decision step 240 that transmission of image data is complete, operation returns to the beginning of the flow chart below the start step 200.
In step 205, if no read print command is received, operation proceeds to a status verification step 255 to see whether any mouse input has been inputted by the user and, if such data has been inputted, it is sent to the computer 50 in transmission step 260.
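The Figure 5 flow can be summarized in rough pseudocode. The sketch below is illustrative only; the helper names (read_print_command_pending, read_ccd, send, and so on) are hypothetical stand-ins for the operations described above, and the behavior of the branch where no finger is detected is not fully specified by the flow chart.

    # Illustrative sketch of the Figure 5 loop; every helper name is hypothetical.
    def mouse_loop(device, pc):
        while True:
            if pc.read_print_command_pending():              # decision step 205
                device.set_led_maximum()                     # step 210
                frame = device.read_ccd()                    # step 215
                while frame.finger_detected() and not frame.light_level_ok():
                    device.adjust_led_level(frame)           # step 225: adjust and re-read
                    frame = device.read_ccd()
                if frame.finger_detected():
                    pc.send("PRINT_DATA_FOLLOWS")            # step 230
                    for line in frame.lines():               # steps 235 through 250
                        pc.send(line)
                        if device.mouse_input_pending():     # buttons or X/Y sensors
                            pc.send(device.read_mouse_input())
            elif device.mouse_input_pending():               # steps 255 and 260
                pc.send(device.read_mouse_input())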
Once a complete set of image, or print data, is sent to the computer 50, the computer 50 then proceeds to process the data. In the present description, image data is also referred to as print data in reference to the input of a thumb print. However, it is realized that other types of biometric input may be used and that the present invention may optionally be used to process such other data. Examples of such other data include a print image of any of the other digits or images of other unique biometric data such as retinal images. Thus, such applications are considered to be within the scope and spirit of the present invention. Indeed, the entire operation of the present invention can be contained within the mouse itself, with only an authorization and/or restriction command being passed on to the computer itself.
After the thumb print image is scanned in and the image data thereof transferred to the computer 50, the image data is then processed and added to a database of print image data or used to gain access to use of the computer 50 by comparison to previously stored print image data in the database. Hereinafter, using image data to gain access is referred to as an authorization process while entering print image data into the database is referred to as a registration process.
Finger print image analysis may effect comparison of images. Alternatively, the present invention further provides an analysis algorithm that effects comparison of special point maps which indicate where special points, also known as minutia, of a fingerprint are located. The fingerprint analysis algorithm considers a fingerprint not as a determined object but as a stochastic object. There is a philosophical analogy, like Laplace's determinism and the stochastic picture of the world. Another analogy is that the first practically significant results in speech recognition appeared as soon as the first stochastic models of human speech had appeared. A discussion of standard approaches is found in the paper A real-time matching system for large fingerprint databases, N.K. Ratha, K. Karu, S. Chen, and A.K. Jain, IEEE Trans. on PAMI, Aug. 1996, vol. 18, no. 8, pp. 799-813, which is incorporated herein by reference for its teaching relating to fingerprint analysis and modeling.
Factors that randomize print image data include elasticity of skin, humidity, level of impurity, skin temperature, and individual characteristics of the user's finger-touch, among many other factors. The basic generation of a special points map optionally includes multiple finger touches of the same finger, that is, a user's thumb print is optionally scanned multiple times. Each image data from each scanning is referred to herein as a "standard." The greater the number of standards of a user stored in the database, the higher the reliability of the recognition is. The shorter the process of studying multiple standards, the less the reliability of recognition is.
Applicants have conducted experiments showing that the reliability of recognition and the quantity of the standards exhibit the following relationship:
Quantity of Standards    Reliability
1                        89%
3                        92%
5                        95%
7
12                       99.5%
20                       99.9%
The term "reliability, " as used above, relates a probability of recognizing a registered user, that is, matching a user's thumb print data with thumb print data m the data base after one touch.
Referring to Figure 6, a flow chart of a fingerprint analyzing algorithm of the present invention is shown. The algorithm is described below wherein the following definitions apply:
VARIABLE                              DEFINITION
Xn(i), Yn(i), An(i)                   i-th minutia description, wherein Xn is an x coordinate of the i-th minutia, Yn is a y coordinate of the i-th minutia, and An is an angle of the i-th minutia
FP                                    fingerprint
N                                     number of minutia of the fingerprint after extraction
FPn                                   n-th fingerprint
MID                                   mean inter-ridge distance
DI                                    directional image
Xmax, Ymax                            linear sizes of an input image
Fx, Fy                                linear sizes (numbers of cells) in the directional image
Fstepx = Xmax/Fx, Fstepy = Ymax/Fy    linear sizes of the cells onto which the initial image is distributed to get the directional image
Fn(i,j)                               directional image for the n-th fingerprint
Pi                                    discrete upper bound for 180 degrees
BI                                    number of cells of the directional image that are not UnDir
UnDir (>Pi)                           mask value to detect the absence of FP in a current cell, for the n-th FP
In imaging step 300, the user's thumb print is scanned by the CCD sensor 102 and then digitized at step 305, wherein analog levels for each pixel of the CCD sensor 102 are digitized to form one byte per pixel. Although depicted as separate operations, it is understood from the schematic of Figure 4 that the analog levels of the pixels are successively digitized by the A/D converter 158 and stored in the RAM 156. Next, a sequence of filtering and contrasting transformations is executed on the initial matrix of intensity data. The aim is to get a more "stable" image of the fingerprint (while touching).
Following storage in the RAM 156, the print image data FP is optionally transferred to the computer 50 as indicated in Figure 5. However, in an alternative embodiment of the invention the filtering and contrasting transformations may be executed by the microcontroller 150 in the computer mouse 60.
The matrix of intensity data from the CCD sensor 102, that is, the print image data FP, includes the fingerprint and surrounding "garbage". In an optional process a border between the print image and the "garbage" is defined and the "garbage" is excluded so that only the internal part of the print image, that is the portion which includes ridge lines, takes part in the further analysis.
After the print image data FP is acquired, preprocessing of the print image data FP is carried out beginning with a scale normalization step 310 in which the scale of the print image data FP is normalized using standard routines. After the scale normalization step 310 the print image data FP is then used to calculate directional image data DI using gradient statistics in directional calculation step 315, wherein the print image is divided into cells having a size defined by Fx and Fy. Referring to Figure 7, the print image data FP is divided into cells as shown by a grid superimposed on the print image and a vector normal to the direction of ridge lines in each cell is calculated. These vectors form the directional image data DI. Thus, an array of directional image data F(i,j) is generated where i and j denote the cell and the value of F(i,j) is between 0 and Pi for directional cells or is set to UnDir for cells wherein a directional gradient cannot be determined such as for isolated pixels or pixel groups lacking directionality. The directional image data DI is then subjected to a smoothing process and its quality factor Q is determined in a smoothing and quality processing step 320. The smoothing process includes first applying a low-pass filter and then a low-cut filter, after which a directional smoothing along the directions defined for each cell is effected. Scale normalization, low-pass filtering, low-cut filtering, directional image calculation and smoothing are processes that are realizable by those of ordinary skill in the art. Accordingly, detailed discussions thereof are omitted.
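To make the gradient-statistics calculation of step 315 concrete, the following minimal sketch (Python with NumPy) estimates one ridge-normal direction per cell and marks cells without a reliable direction as UnDir. The doubled-angle estimator, the coherence threshold and the UnDir sentinel value are generic assumptions, not necessarily the exact filters and constants used by the invention.

    import numpy as np

    UNDIR = -1.0  # sentinel standing in for the UnDir mask value described above

    def directional_image(fp, cell=16, coherence_threshold=0.1):
        """Per-cell ridge-normal angle in [0, pi), or UNDIR when no direction is found."""
        gy, gx = np.gradient(fp.astype(float))               # pixel gradients
        fy, fx = fp.shape[0] // cell, fp.shape[1] // cell     # numbers of cells (Fy, Fx)
        F = np.full((fy, fx), UNDIR)
        for i in range(fy):
            for j in range(fx):
                sx = gx[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
                sy = gy[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
                vx = 2.0 * np.sum(sx * sy)                    # doubled-angle gradient statistics
                vy = np.sum(sx * sx - sy * sy)
                if np.hypot(vx, vy) / sx.size < coherence_threshold:
                    continue                                  # isolated or non-directional cell
                F[i, j] = (0.5 * np.arctan2(vx, vy)) % np.pi
        return F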
The quality Q of the print image data FP is then calculated by determining a ratio of cells that remain substantially unchanged following the smoothing and quality processing step 320 to the total number of cells. This ratio is then squared and multiplied by the area of the print image data FP divided by the area of the entire scanned image. Thus, both the quality of the print image data FP and absence of image data corresponding to a fingerprint are taken into consideration. Quality decision step 325 is then executed to determine whether the quality Q of the print image FP is above a given quality threshold. When the quality Q is below the given quality threshold, the process returns to the imaging step 300 for input of new data. This is because it is determined that the quality of the fingerprint is insufficient to base matching upon. If the quality is above the given threshold, processing proceeds to a binarization step 330.
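A minimal sketch of the quality factor just described, assuming "substantially unchanged" is tested with a small angular tolerance and that cells marked UnDir are ignored:

    import numpy as np

    def quality_factor(F_before, F_after, print_area, scan_area, tol=0.1, undir=-1.0):
        """Q = (unchanged cells / total cells)^2 * (print area / scanned area)."""
        valid = (F_before != undir) & (F_after != undir)
        unchanged = np.sum(np.abs(F_before - F_after)[valid] < tol)
        ratio = unchanged / F_before.size
        return (ratio ** 2) * (print_area / scan_area)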
In the binarization step 330, the image data FP shown in Figure 8(a) is subjected to preliminary binarization using subtraction of low-pass filtering resulting in the image data FP producing the image shown in Figure 8(b), followed by directional filtering and binarization resulting in the image of Figure 8(c). Processing continues with execution of a skeletonization step 335 wherein the image data FP is subjected to a thinning and skeletonization processing wherein all ridge lines are reduced to a width of one pixel which results in the image shown in Figure 8(d). In this stage visible ridge lines that are several pixels in width are changed to lines one pixel in width. The values on the ridge lines are 1 and for all other areas the values are 0. Now the matrix consists of only two values. Detailed discussions of the filtering and skeletonization processes are omitted as such are realizable by those of ordinary skill in the art given the present disclosure.
A minutia extraction step 340 is next executed upon the image data FP that has been skeletonized. Fingerprints are characterized by various minutia which are particular patterns of the ridges. Two basic types of minutia are a bifurcation 400, or branch, shown in Figure 9a, wherein a ridge line 402 divides into two ridge lines, 403 and 405, and an end 410, shown in Figure 9b, wherein a ridge line 412 ends. Each minutia is characterized as a vector represented by a minutia data triplet X, Y, and A wherein X and Y represent the location of the minutia and A is an angle of a vector indicating the direction of the minutia as shown in Figures 9a and 9b.
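Although the extraction details are left to the skilled reader, one standard way to find ends and bifurcations on the one-pixel-wide skeleton of Figure 8(d) is the crossing-number test sketched below. It is offered only as a generic illustration, not as the invention's specific extraction step, and the direction angle A of each minutia (taken from the local ridge direction) is omitted here.

    import numpy as np

    def crossing_number(skel, i, j):
        """Half the number of 0/1 transitions around pixel (i, j) of a binary skeleton."""
        n = [skel[i-1, j], skel[i-1, j+1], skel[i, j+1], skel[i+1, j+1],
             skel[i+1, j], skel[i+1, j-1], skel[i, j-1], skel[i-1, j-1]]
        return sum(abs(int(n[k]) - int(n[(k + 1) % 8])) for k in range(8)) // 2

    def extract_minutia(skel):
        """Return (x, y, type) candidates: crossing number 1 is an end, 3 a bifurcation."""
        out = []
        for i in range(1, skel.shape[0] - 1):
            for j in range(1, skel.shape[1] - 1):
                if skel[i, j]:
                    cn = crossing_number(skel, i, j)
                    if cn == 1:
                        out.append((j, i, "end"))
                    elif cn == 3:
                        out.append((j, i, "bifurcation"))
        return out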
In a preferred embodiment of the present invention, distinction between end minutia 410 and bifurcation minutia 400 is not made. It is found that exclusion of such distinction results in reduction of data, reduced processing needs and time, while still providing acceptable reliability of fingerprint comparison. Alternatively, distinction may be made with associated increase in processing.
The minutia extraction step 340 further proceeds with exclusion of minutia that are too closely located. Referring to Figure 10, two end minutia at (x1, y1) and (x2, y2), respectively, and represented by vectors (p1,q1) and (p2,q2), respectively, are shown. First, determination is made as to whether the two minutia are within a threshold distance. This threshold distance is optionally a distance r used to determine matching minutia and discussed below, a fixed distance, or another distance based on mean ridge line separation distance. When two minutia are within the given threshold distance, a determination is made whether the angle between the two vectors (p1,q1) and (p2,q2) is within a given threshold of 180° and the angle between (p2,q2) and (x2-x1, y2-y1) is within a given threshold of 0. If two minutia satisfy the aforesaid criteria they are excluded because they are too close and aligned in a nearly straight line. As a result of the minutia extraction process, the print image FP is now represented by a data set defined as FP={Q, N, F(i,j), X(k), Y(k), A(k)} wherein N is the total number of minutia for the fingerprint FP, and X(k), Y(k) and A(k) are the data triplet representing the k-th minutia. The minutia extraction is advantageous in reducing the amount of data to be processed and thereby reducing the processing time and requirements.
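The exclusion test of Figure 10 can be sketched as follows; the distance and angular tolerances are placeholders for the thresholds discussed in the text.

    import math

    def too_close_and_aligned(m1, m2, dist_thresh, angle_tol=0.35):
        """m1 and m2 are (x, y, a) triplets with a in radians; returns True when the
        pair should be excluded: close together, pointing in nearly opposite
        directions, and aligned along the line joining them."""
        x1, y1, a1 = m1
        x2, y2, a2 = m2
        if math.hypot(x2 - x1, y2 - y1) >= dist_thresh:
            return False
        d12 = abs((a1 - a2 + math.pi) % (2 * math.pi) - math.pi)    # in [0, pi]
        opposite = abs(d12 - math.pi) < angle_tol                    # near 180 degrees
        joining = math.atan2(y2 - y1, x2 - x1)
        d2j = abs((a2 - joining + math.pi) % (2 * math.pi) - math.pi)
        aligned = d2j < angle_tol                                    # near 0 degrees
        return opposite and aligned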
Processing next proceeds to a matching process step 345 wherein the print image data FP is compared to image data in the database. FP1 now refers to the image data of the input fingerprint and FP2 refers to print image data of a fingerprint retrieved from the database in database retrieval step 347. Likewise in this description, other variables are appended with 1 or 2 to represent the respective fingerprint .
It is necessary to find the best alignment of the directional images DI1 and DI2 of F1(i,j) and F2(i,j). Data F1(fa, fdx, fdy)(i,j) is now calculated wherein rotation by angle fa and shift by distances fdx and fdy is effected in an orthogonal transformation of F1(i,j). After the transformation of F1, a comparison of F1(fa, fdx, fdy)(i,j) with F2(i,j) is then made wherein the difference in orientations of corresponding cells of the directional images DI1 and DI2 is calculated as DifDI. DifDI is calculated as the sum of all angular differences between corresponding cells. The values of fa, fdx, fdy are iteratively varied and for each permutation thereof the transformation of F1(fa, fdx, fdy)(i,j) is made and compared with F2(i,j) to find a DifDI for each set of fa, fdx, fdy values. A set of fa, fdx, fdy values is then chosen for which DifDI is minimal. The chosen set of fa, fdx, fdy represents the best shifting parameters for shifting the directional image DI1 to effect the best matching directional alignment of DI1 and DI2. Subsequent alignment of minutia for matching purposes uses the chosen set of fa, fdx, fdy as a starting point for adjustments. Additionally, BI is determined as the number of cells (i,j) of either DI1 or DI2 that are not UnDir.
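The search over (fa, fdx, fdy) can be illustrated as a brute-force loop. The ranges and step sizes below are assumptions, and the shift routine only moves the cell grid by whole cells and rotates the stored directions; a full implementation would also rotate the grid geometry.

    import numpy as np

    UNDIR = -1.0

    def angular_diff(a, b):
        """Angular difference between two ridge-normal directions, taken modulo pi."""
        d = np.abs(a - b) % np.pi
        return np.minimum(d, np.pi - d)

    def shift_cells(F, fa, fdx, fdy):
        """Crude stand-in for the orthogonal transformation F1(fa, fdx, fdy)(i, j):
        shifts the cell grid by whole cells and adds fa to each stored direction."""
        out = np.full_like(F, UNDIR)
        rows, cols = F.shape
        for i in range(rows):
            for j in range(cols):
                si, sj = i + fdy, j + fdx
                if 0 <= si < rows and 0 <= sj < cols and F[i, j] != UNDIR:
                    out[si, sj] = (F[i, j] + fa) % np.pi
        return out

    def best_alignment(F1, F2, angles, shifts):
        """Return the (fa, fdx, fdy) set whose transformed F1 gives the smallest DifDI."""
        best_params, best_dif = None, np.inf
        for fa in angles:
            for fdx in shifts:
                for fdy in shifts:
                    F1t = shift_cells(F1, fa, fdx, fdy)
                    valid = (F1t != UNDIR) & (F2 != UNDIR)
                    if not valid.any():
                        continue
                    dif = float(np.sum(angular_diff(F1t[valid], F2[valid])))
                    if dif < best_dif:
                        best_params, best_dif = (fa, fdx, fdy), dif
        return best_params, best_dif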
A directional difference decision step 350 is next executed wherein the minimal DifDI for the chosen set of fa, fdx, fdy is compared against a threshold DifDITH which may be a set threshold or a threshold based on BI. If DifDI exceeds the threshold DifDITH, then it is determined that the correspondence level, or matching level, between the directional images is insufficient to warrant further comparison of FP1 and FP2 and a different fingerprint image data is chosen for FP2 and processing returns to the beginning of the matching process step 345. If DifDI is less than the threshold, operation proceeds to similarity measure calculation step 355.
Next, the chosen set of fa, fdx, fdy for orthogonal transformation is applied as (fdx*Fstepx, fdy*Fstepy and fa) to the minutia data triplets X1(k), Y1(k), and A1(k) of FP1, where k represents a k-th minutia. The transformed minutia data triplets of print image data FP1 are then grouped into clusters each containing not less than a given number of minutia, preferably seven. Referring to Figure 11(a), FP1 is illustrated as being divided into four clusters CS1, CS2, CS3, and CS4, which each contain the given number of minutia (not shown). Figure 11(a) is a simplified depiction of the process in that the clusters do not necessarily cover square regions of the print image and the number of clusters is not limited to four. The clusters may be thought of as regional groupings of minutia.
Referring now to Figure 11(b), for each of the clusters CS1, CS2, CS3, and CS4 on a cluster by cluster basis, X1(k), Y1(k) of the minutia of the given cluster are all iteratively shifted in x and y directions by values dr, wherein dr is varied within a range R, such that abs(dr) < R, and a comparison of the shifted X1(k), Y1(k), A1(k) is made against all minutia in a BI grouping of FP2 for each set of dr values to identify minutia of FP1 matching those of FP2. A pair of minutia are considered matched when a distance between them is less than a threshold r discussed below.
The BI grouping of FP2 is the group of cells in FP2 that are not UnDir. For each shift of a cluster, a similarity measure Smt is taken, which is the sum of the following term for each set of matched minutia in the cluster:

m(x1,y1; x2,y2) = a ∫ exp(-z²/2) dz + δ,

where the integral is taken over limits determined by the distance value d and the empirical value σ, with

d = (x1-x2)² + (y1-y2)²,

and a, δ and σ are empirical values. In an embodiment of the invention, a is 150, δ is set equal to R1, where R1 equals 30, and R2, where R2 equals 20, R1 and R2 being discussed below, and σ is set equal to 4. These values are exemplary and alterable without departing from the scope and spirit of the present invention. For each cluster, the set of dr values yielding the greatest similarity measure Smt is selected and the total sum of the greatest similarity measure of each cluster is taken to find a similarity measure Smt(FP1, FP2) for the comparison of FP1 to FP2.
As noted above, comparison of fingerprints is often hampered by various environmental and physiological factors. The division of FP1 into clusters provides compensation in part for such factors as stretching and shrinking of the skin. For a given cluster, the total distance difference due to stretching or shrinkage between two minutia is limited due to the limited size of the cluster area. Thus, adverse effects of shrinking and stretching are minimized. Accordingly, individual cluster shifting and comparison are a preferred embodiment of the present invention. Alternatively, division of FP1 into clusters may be omitted and shifting and comparison of FP1 as a whole effected.
The maximum similarity measure Smt(FP1, FP2) is generated for the best comparisons of all clusters of FP1 with FP2, along with a number Nmat of matched minutia, and a number Ntot which is the total number of minutia within the BI grouping of FP1. An overall similarity measure for the comparison of FP1 with FP2 is calculated as follows:
Nmt(R, r, BI, Ntot) = Smt(FP1, FP2) - DifDI, where Smt(FP1, FP2) is a sum of the best Smt of each cluster. Thus, this takes into account the maximal number of matched minutia, DifDI and statistical peculiarities of the distances distribution.
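Putting the cluster shifting, the per-pair similarity term and the overall measure together, a simplified sketch follows. The function pair_score stands in for the m(x1,y1; x2,y2) term above; the exact limits of that integral are not recoverable from the text, so an upper Gaussian tail beyond dist/σ is assumed, and matching is reduced to a nearest-neighbour test within the distance r.

    import math

    def pair_score(dist, a=150.0, delta=30.0, sigma=4.0):
        """Stand-in for the per-pair term m(x1,y1; x2,y2); the Gaussian-integral
        limits (upper tail beyond dist/sigma) are an assumption."""
        z = dist / sigma
        return a * math.sqrt(math.pi / 2.0) * math.erfc(z / math.sqrt(2.0)) + delta

    def match_cluster(cluster, reference, r, R):
        """Shift a cluster of (x, y, a) minutia over [-R, R] in x and y and return
        the best similarity measure Smt found for that cluster."""
        best = 0.0
        for dx in range(-R, R + 1):
            for dy in range(-R, R + 1):
                smt = 0.0
                for (x, y, a) in cluster:
                    dist = min(math.hypot(x + dx - xr, y + dy - yr)
                               for (xr, yr, ar) in reference)
                    if dist < r:                       # the pair counts as matched
                        smt += pair_score(dist)
                best = max(best, smt)
        return best

    def overall_measure(clusters, reference, dif_di, r, R):
        """Nmt = Smt(FP1, FP2) - DifDI, with Smt the sum of each cluster's best Smt."""
        return sum(match_cluster(c, reference, r, R) for c in clusters) - dif_di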
Processing then proceeds to similarity decision step 360 wherein Nmt(R, r, BI, Ntot) is compared with a threshold Thr(R, r, BI, Ntot). If Nmt(R, r, BI, Ntot) is greater than the threshold Thr(R, r, BI, Ntot), it is determined that FP1 matches FP2 and a match is indicated in match indication step 365. If Nmt(R, r, BI, Ntot) is less than or equal to the threshold Thr(R, r, BI, Ntot), it is determined that FP1 does not match FP2 and execution proceeds to the database retrieval step 347 for the selection of another set of print data from the database for use as FP2 in the process which returns to the matching process step 345. Indication of a match is then used to permit access to the computer 50 in general or specific functions thereof.
In a preferred embodiment of the invention, the threshold Thr(R, r, BI, Ntot) is determined on the basis of threshold training using a sample pool of fingerprints from a number of individuals. The sample pool is composed of a number of samples, or standards, from each individual in the pool. The number of samples from each individual in one example is 4 and the number of individuals is in a range of 100 to 1000. The number of samples and individuals may be varied from the exemplary values and range without departing from the scope and spirit of the present invention. The process steps 305 through 355 of Figure 6 are then executed for each print with every print being compared to every other print. Since the sample pool is known, comparisons of prints from a same individual and comparisons of prints from different individuals are known. In performing the threshold training, n number of variations of R and r are used and are shown below as R1, R2 and r1, r2 for an example where n=2. For example, values are set such that R1 < R2 and r1 < r2 where R1=2*MID, r1=MID, R2=3.5-4*MID, and r2=2*MID. MID is the mean inter-ridge distance of the prints in the sample pool. The following values are found: NmtS(R1, r1, BI, Ntot), NmtA(R1, r1, BI, Ntot), and
NmtS(R2, r2, BI, Ntot), NmtA(R2, r2, BI, Ntot), where NmtS is the number of matched minutia for prints compared from the same individual while NmtA is the number of matching minutia resulting from the comparison of fingerprints from different individuals.
For a given BI, Ntot (within a subrange of appropriate quantization), BestA(n, BI, Ntot) is set to the maximum NmtA(Rn, rn, BI, Ntot) of all the comparisons of fingerprints from different individuals, and MinNmtS(Rn, rn, BI, Ntot) is set to the minimum NmtS(Rn, rn, BI, Ntot) of all comparisons of fingerprints from the same individual for n=1,2, etc. The thresholds are then calculated as follows:
Thr(n, BI, Ntot) = (BestA(n, ...) + MinNmtS(Rn, rn, ...))/2, where
NmtS(Rn, ...) > BestA(Rn, ...)/2.
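Assuming the midpoint reading of the threshold formula above, threshold training for one (Rn, rn) setting and one (BI, Ntot) subrange can be sketched as follows; nmt_same holds the Nmt scores of same-finger comparisons and nmt_diff those of different-finger comparisons.

    def train_threshold(nmt_same, nmt_diff):
        """Place a threshold between the best impostor score (BestA) and the worst
        retained genuine score (MinNmtS); the midpoint rule is an assumption."""
        best_a = max(nmt_diff)                               # BestA
        kept = [s for s in nmt_same if s > best_a / 2.0]     # keep NmtS > BestA/2, as above
        min_nmt_s = min(kept) if kept else best_a            # MinNmtS
        return (best_a + min_nmt_s) / 2.0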
In conjunction with the above discussion of threshold calculations, the similarity decision step 360 produces a positive match indication if for the current BI, Ntot:
Nmt(R1, r1, BI, Ntot) > Thr(1, BI, Ntot), or Nmt(R2, r2, BI, Ntot) > Thr(2, BI, Ntot). If this condition is not found, then the dichotomy analysis gives some correction. The results of identical and not identical matchings are considered as two classes of patterns and the pairs of values Nmt(R1, r1, ...), Nmt(R2, r2, ...) as feature coordinates. The dichotomies are performed by second order threshold functions which are calculated according to chapter 2.3 in the classical book by J. Tou and R. Gonzalez, "Pattern Recognition Principles", Addison-Wesley Publ., 1974, which is incorporated herein by reference for its relevant dichotomy teachings.
The complete description to be stored in the database is a multilevel structure of 4 (or more) FP data sets taken from the different applications of the same FP. Each level of the structure corresponds to minutia appearance frequencies for all FP codes.
Optionally, instead of using thresholds for the similarity comparison as discussed above, fixed values may be chosen and used as threshold values. The data base of fingerprints of individuals for whom identification is required is created by a registration process. The registration process entails a given individual having their fingerprints scanned a number of times, for example four. Of the four scans, the scanning producing the greatest number of minutia is then selected for the database.
The present invention further includes use of the above fingerprint minutia extraction and comparison process in conjunction with a cryptographic protection process. For this aspect of the invention, the computer 50, also referred to as the client, will send fingerprint data to the remote computer 51, also referred to as the server, over the link 53 which may be, for example, a link over the Internet. Thus, security protection for data sent over the link 53 is required.
There are three different cryptographic procedures used in the cryptographic process. As they are not used simultaneously, they are described below separately. All cryptographic parts are written in italic font. The cryptographic method employed is RSA encryption.
I . User registration
In order to use the cryptographic process, the user must first register his fingerprint with the server. In order to maintain security, the fingerprint data must be encrypted to prevent unauthorized interception thereof. The following steps are used:
1. User fills in a registration form including a
UserID. Other information such as Name, E-mail address, etc. may be included.
2. User scans his fingerprint into the computer 50 via the biometric input device where it is stored as image data. The image data is typically on the order of 64 KB.
3. The computer 50 then converts the image data of the finger to the data set defined as FP={Q, N, F(i,j), X(k), Y(k), A(k)} using processing steps 310 through 340 shown in Figure 6. This data set is also referred to herein as a passport. Optionally, components of the data set may be omitted, such as F(i,j), so the passport may be shortened to about 1.2 KB.
4. The client, computer 50, then sends a request for the public key to the server via the link 53.
5. Server sends its public key KE via the link 53.
6. Client encrypts its passport and his UserID using the RSA algorithm and the public key KE. In a preferred embodiment the length of the key is 512 bits:
C = RSA.EncodePublic(KE, passport, UserID)
7. The computer 50 sends C to the remote computer 51.
8. The remote computer 51 decrypts the message using its secret key KD:
M = passport + UserID = RSA.EncodeSecret(KD, C)
9. The remote computer 51 then adds the UserID and passport to the database.
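For illustration, the registration exchange can be traced with textbook RSA. The sketch below is purely didactic: it uses tiny keys, no padding and an integer message, whereas the invention specifies 512-bit keys, and the helper names do not correspond to any particular library.

    # Didactic textbook RSA; not the production 512-bit scheme described above.
    def extended_gcd(a, b):
        if b == 0:
            return a, 1, 0
        g, x, y = extended_gcd(b, a % b)
        return g, y, x - (a // b) * y

    def modinv(a, m):
        """Modular inverse of a modulo m via the extended Euclidean algorithm."""
        g, x, _ = extended_gcd(a, m)
        if g != 1:
            raise ValueError("no inverse")
        return x % m

    def make_keys(p, q, e=17):
        """Return (public key KE, secret key KD) built from two primes p and q."""
        n = p * q
        d = modinv(e, (p - 1) * (q - 1))
        return (e, n), (d, n)

    def encode_public(key, m):          # C = RSA.EncodePublic(KE, M)
        e, n = key
        return pow(m, e, n)

    def encode_secret(key, c):          # M = RSA.EncodeSecret(KD, C)
        d, n = key
        return pow(c, d, n)

    # Steps 4 through 9: the client encrypts the passport and UserID with the
    # server's public key KE; the server recovers them with its secret key KD.
    KE, KD = make_keys(61, 53)          # toy primes; real keys are 512 bits
    message = 42                        # stands in for the encoded passport + UserID
    C = encode_public(KE, message)
    assert encode_secret(KD, C) == message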
II. User authorization
The user authorization process is used where a user wishes to gain access to the remote computer on the basis of his fingerprint matching one in the database.
1. User scans his fingerprint image data into the computer 50.
2. The computer converts the image of the finger to the passport using processing steps 310 through 340 shown in Figure 6.
3. The computer 50 sends a request over the link 53 to the remote computer 51, the server, for the public key.
4. The remote computer 51 sends its public key KE to the computer 50.
5. The computer 50 encrypts the passport and UserID using the RSA algorithm and the public key KE:
C = RSA.EncodePublic(KE, passport, UserID)
6. The computer 50 sends C to the remote computer 51 via the link 53.
7. The remote computer 51 decrypts the message using its secret key KD:
M = passport + UserID = RSA.EncodeSecret(KD, C)
9. The remote computer 51 then searches the database for the UserID, finds the corresponding passport, and executes steps 345 through 365 of Figure 6 using the passport retrieved from the database as FP2. Optionally, step 350 is omitted. If the comparison of step 360 is positive, access is authorized. If the UserID does not exist or the comparison result of step 360 is negative, authorization for access is refused.
III. Installation of the server and addition of new users is effected by the following steps: 1. Installation of normal Web-server components.
2. Generation of the public and secret keys for the administrator of the server: first of all a random integer is generated, possibly based on the administrator's fingerprint, which is in part random, then the deterministic algorithm is started to determine the public and secret keys.
3. When the new user is being registered, the server takes its UserID and passport and encrypts them with the administrator's public key. Usage of two different keys makes it more difficult to corrupt fingerprint data since an intruder must obtain both public and private keys to complete his attack. Different servers will have different keys to ensure that corrupted fingerprint data (i.e. stolen from some server) could not be used on other servers. The 512-bit RSA keys are extremely difficult to crack. In fact, keys of that length are not known to have been broken, so current cryptography declares them as keys for long-term secret information (30-50 years or longer). Average time of encryption of a passport (client side) is less than a second. Average time of decryption of a passport (server side) is about 2 seconds, so it is reasonable to predict that network delays would be more significant. Besides, servers are usually more powerful than the client computers.
A further aspect of the present invention provides software for working in the Windows environment. In particular, a protection icon is provided which an authorized user, one whose passport has produced a positive comparison, may move and drop on a file or program object to require that future access thereto be permitted only when a positive fingerprint comparison has been executed. Optionally, the user may input a list of UserIDs for whom access will be allowed.
Having described preferred embodiments of the invention with reference to the accompanying drawings, it is to be understood that the invention is not limited to those precise embodiments, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the invention as defined in the appended claims. Now that the invention has been described, what is claimed is set forth below.

Claims

1. A biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising: a device body having a body side wall defining an aperture; an optical assembly for scanning the fingerprint disposed in said device body; said optical assembly including an imaging component for converting a light image into pixel output and a lens for focusing said light image into said imaging component; and said optical assembly including a prism with first, second and third sides and a top side wherein: said first side forms a scanning surface at said aperture upon which the finger tip is placed for scanning of the fingerprint by said optical assembly; said second side has said lens for focusing said light image into said imaging component disposed thereon; and said third side has a light absorbing layer.
2. The biometric input device of claim 1 wherein said lens is formed integrally with said prism.
3. The biometric input device of claim 1 wherein said lens is adhered to said second side of said prism.
4. The biometric input device of claim 1 further comprising a light emitting device disposed to emit light into said prism from said top side of said prism to illuminate the fingerprint when disposed at said scanning surface.
5. The biometric input device of claim 4 wherein said light emitting device is disposed above said top side.
6. The biometric input device of claim 4 wherein said light emitting device is disposed on said top side.
7. A biometric input device for accepting a biometric identifier from a corresponding portion of a user, said biometric input device comprising: a device body; an optical assembly for scanning the biometric identifier, said optical assembly including an imaging component for converting a light image into pixel output; said optical assembly including at least a first side and a second side disposed at least partially in said device body; said first side of said optical assembly defining an exteriorly accessible scanning surface structured to be disposed in close scanning proximity to the portion of the user containing the biometric identifier so as to receive said light image; and said second side of said optical assembly including a lens, said lens structured to focus said light image into said imaging component.
8. A biometric input device as recited in claim 7 wherein said first side and said second side comprise portions of a prism, said prism further comprising a third side and a top side.
9. A biometric input device as recited in claim 8 wherein said optical assembly further comprises a light emitting device operatively disposed to illuminate the biometric identifier disposed at said scanning surface so as to generate said light image.
10. A biometric input device as recited in claim 9 wherein said light emitting device is disposed to emit light through said top side of said prism.
11. A biometric input device as recited in claim 8 wherein said lens is integrally formed at said second side of said prism.
12. A biometric input device as recited in claim 8 wherein said third side includes a light absorbing layer.
13. A biometric input device as recited in claim 8 wherein said prism comprises a right angle prism with a forward acute angle generally between about 40° and 60°.
14. A biometric input device as recited in claim 7 wherein said device body is substantially compact and further includes a mirror disposed therein, said mirror structured to direct said focused light image into said imaging component.
15. A biometric input device as recited in claim 14 wherein said imaging component comprises a second lens.
16. A biometric input device as recited in claim 7 wherein said device body comprises a computer mouse and further includes operative mouse selector and directional input structures defined therein.
17. A biometric input device as recited in claim 16 wherein said scanning surface is disposed in a middle portion of said device body.
18. A biometric input device as recited in claim 17 wherein a directional input structure of the computer mouse is disposed in a rear section of said device body so as to facilitate operative containment of said prism and said imaging component within said device body.
19. A biometric input device as recited in claim 7 wherein said scanning surface is disposed in a side surface of said device body so as to facilitate operative engagement with a user's thumb as the portion of the user containing the biometric identifier.
20. A biometric input device as recited in claim 7 wherein said scanning surface comprises a contact detection assembly structured to identify positioning of said biometric identifier in operative proximity thereto.
21. A computer mouse comprising: a device body; said device body including a front portion, a rear portion and a middle portion, said middle portion including a center line of said device body; at least one input selector control; a directional input structure configured to identify directional movement of said device body; and said directional input structure at least partially disposed rear of said center line such that a base of a user's hand manipulating said device body tends to maintain downward pressure on said directional input structure, and such that additional functional components may be operatively contained by said device body.
22. A computer mouse as recited in claim 21 wherein said additional functional components contained by said device body comprise an optical assembly for scanning a fingerprint, said optical assembly including an imaging component for converting a light image into pixel output and a lens for focusing said light image into said imaging component.
PCT/US2000/013323 1999-05-14 2000-05-12 Biometric system for biometric input, comparison, authentication and access control and method therefor biometric WO2000070543A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU47140/00A AU4714000A (en) 1999-05-14 2000-05-12 Biometric system for biometric input, comparison, authentication and access control and method therefor biometric

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US09/312,002 1999-05-14
US09/312,002 US6282304B1 (en) 1999-05-14 1999-05-14 Biometric system for biometric input, comparison, authentication and access control and method therefor
US43329499A 1999-11-03 1999-11-03
US09/433,294 1999-11-03

Publications (1)

Publication Number Publication Date
WO2000070543A1 true WO2000070543A1 (en) 2000-11-23

Family

ID=26978175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/013323 WO2000070543A1 (en) 1999-05-14 2000-05-12 Biometric system for biometric input, comparison, authentication and access control and method therefor biometric

Country Status (2)

Country Link
AU (1) AU4714000A (en)
WO (1) WO2000070543A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100433041C (en) * 2003-11-14 2008-11-12 萨热姆防务安全公司 Optical imaging device suited for forming an image of fingerprints

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4924085A (en) * 1988-06-23 1990-05-08 Fujitsu Limited Uneven-surface data detection apparatus
JPH02307175A (en) * 1989-05-22 1990-12-20 Omron Corp Fingerprint input device
US5146102A (en) * 1990-02-22 1992-09-08 Kabushiki Kaisha Toshiba Fingerprint image input apparatus including a cylindrical lens
US5222152A (en) * 1991-11-19 1993-06-22 Digital Biometrics, Inc. Portable fingerprint scanning apparatus for identification verification
US5341421A (en) * 1990-11-06 1994-08-23 Bull Cp8 Security device, including a memory and/or a microcomputer for data processing machines
USD369593S (en) * 1994-11-07 1996-05-07 Silitek Corporation Computer mouse
US5546471A (en) * 1994-10-28 1996-08-13 The National Registry, Inc. Ergonomic fingerprint reader apparatus
US5781651A (en) * 1996-04-15 1998-07-14 Aetex Biometric Corporation Compact fingerprint recognizing apparatus illuminated with electroluminescent device
US5848231A (en) * 1996-02-12 1998-12-08 Teitelbaum; Neil System configuration contingent upon secure input
US5991431A (en) * 1996-02-12 1999-11-23 Dew Engineering And Development Limited Mouse adapted to scan biometric data

Also Published As

Publication number Publication date
AU4714000A (en) 2000-12-05

Similar Documents

Publication Publication Date Title
US6282304B1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor
US20020031245A1 (en) Biometric authentification method
EP0976087B1 (en) Biometric recognition using a master pattern set
US7853054B2 (en) Fingerprint template generation, verification and identification system
Rowe et al. A multispectral whole-hand biometric authentication system
US6941001B1 (en) To a combined fingerprint acquisition and control device
EP1825418B1 (en) Fingerprint biometric machine
US7136514B1 (en) Method for authenticating an individual by use of fingerprint data
EP1114394B1 (en) A configurable multi-function touchpad device
EP0905646A1 (en) Pointing and fingerprint identifier mechanism for a computer system
US20040042645A1 (en) Fingerprint recognition method, and fingerprint control method and system
CA2230279A1 (en) Method of gathering biometric information
CN1353844A (en) Method and apparatus for creating composite finger print image
JP2007206991A (en) Bioinformation processor and bioinformation processing program
EP1399874A1 (en) Method and system for transforming an image of a biological surface
US20020031244A1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor
WO2007018545A2 (en) Protometric authentication system
JPH10275233A (en) Information processing system, pointing device and information processor
WO2000070545A1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor
WO2000070543A1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor biometric
WO1998011501A2 (en) Embeddable module for fingerprint capture and matching
JP2002279413A (en) Device for identifying dummy fingerprint and device for collating fingerprint
EP1208528B1 (en) Method and arrangement for registering and verifying fingerprint information
CN111709312A (en) Local feature face recognition method based on joint main mode
Leghari et al. Analyzing the effects of data augmentation on single and multimodal biometrics

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP