WO2016099389A1 - Guided fingerprint enrolment based on center of attention point - Google Patents


Info

Publication number
WO2016099389A1
WO2016099389A1 (application PCT/SE2015/051344)
Authority
WO
WIPO (PCT)
Prior art keywords
fingerprint
finger
point
sensor
user
Prior art date
Application number
PCT/SE2015/051344
Other languages
French (fr)
Inventor
Hamid SARVE
David TINGDAHL
Carsten Juncker
Niels MØRCH
Original Assignee
Fingerprint Cards Ab
Priority date
Filing date
Publication date
Application filed by Fingerprint Cards Ab filed Critical Fingerprint Cards Ab
Priority to JP2017530224A (published as JP2017538224A)
Priority to CN201580006912.2A (published as CN105981043B)
Priority to KR1020177014268A (published as KR101872367B1)
Publication of WO2016099389A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/60 - Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V40/67 - Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1335 - Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993 - Evaluation of the quality of the acquired pattern
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1347 - Preprocessing; Feature extraction
    • G06V40/1359 - Extracting features related to ridge properties; Determining the fingerprint type, e.g. whorl or loop
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1365 - Matching; Classification
    • G06V40/1376 - Matching features related to ridge properties or fingerprint texture
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/50 - Maintenance of biometric data or enrolment thereof
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/117 - Identification of persons
    • A61B5/1171 - Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1172 - Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting

Definitions

  • the fingerprint sensing system 100 may comprise individual components as illustrated schematically in figure 1a and the system may also be implemented by way of combining functionalities of the processor 102 and the memory 106 in a single unit. It is also possible to have an implementation where the sensor 104 comprises the necessary processor and memory capabilities.
  • With regard to the guidance information provider 108, it is an arrangement that is capable of providing feedback to a user when the user interacts with the fingerprint sensing system 100.
  • Although feedback will be exemplified with visual output in the form of graphics in the following, it is to be noted that the guidance information provider 108 may be an arrangement that is capable of providing sensory output that is any of visual, sound and touch.
  • Figure 1b illustrates schematically in the form of function blocks a mobile communication device 110 such as a mobile phone, a smartphone, a tablet, a personal computer, a laptop computer or any similar type of device.
  • The mobile communication device 110 comprises the functionalities of the fingerprint sensing system 100 of figure 1a, including the sensor 104.
  • the mobile communication device 110 comprises a processor 112, a memory 114, radio circuitry 116 and a touch sensitive display 118.
  • the fingerprint sensing system 100 forms part of the processor 112 and the memory 114.
  • the touch sensitive display 118 is configured to act as the guidance information provider 108 by providing graphical output for a user during operation of the fingerprint sensing system 100.
  • the processor 112 is configured to
  • Yet another arrangement in which a fingerprint sensing system may be implemented is a smart card 140, as schematically illustrated in a functional block diagram in figure 1c.
  • the smart card 140 comprises the functionalities of the fingerprint sensing system 100 of figure 1a.
  • the smart card 140 comprises a processor 142, a memory 144 and radio circuitry 146, which may be of any appropriate type such as near field communication, NFC, circuitry, Bluetooth ® circuitry etc.
  • the fingerprint sensing system 100 forms part of the processor 142 and the memory 144. That is, the processor 142 controls by means of software instructions the fingerprint sensing system 100.
  • the smart card is not equipped with a display, although variations of the smart card 140 may be equipped with a guidance information provider in the form of, e.g. light emitting diodes (LED) or audio providing means.
  • the processor 142 in the smart card 140 is configured to control the smart card 140 to operate in a communication system, e.g. in a payment scenario in case the smart card is a bank card or credit card, via the radio circuitry 146 in a manner that is outside the scope of the present disclosure.
  • a method in a fingerprint sensing system, e.g. the fingerprint sensing system 100 of figures 1a, 1b and 1c, will be described in some detail.
  • the method comprises a number of actions that will be described below.
  • the actions of the method in figures 2a and 2b are realized by means of software instructions being executed in a processor, e.g.
  • any of the processors 102, 112 or the processor 142 which interacts with a sensor such as the sensor 104 and controls provision of guidance information, e.g. via a guidance information provider 108.
  • Memory such as the memory 106 or the memory 114 is utilized during the execution of the method.
  • a center of attention, COA, point is determined, the COA point being a point on a finger of the user that is likely to be in a proximity of the center of a fingerprint image of the finger obtained by the sensor.
  • the user is guided in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
  • Embodiments of the method illustrated in figure 2a may comprise actions as illustrated in figure 2b. Although the actions are illustrated in a sequential order, it is to be understood that any number of the actions may be performed in parallel, as will become clear from the detailed description of the actions.
  • the determination of the COA point, as described in action 201 comprises the following actions 211 to 215.
  • Action 211: a first plurality of fingerprint images of the finger of the user is obtained from the fingerprint sensor.
  • Action 213: during the obtaining, the first plurality of fingerprint images are stitched into a first two-dimensional stitched image.
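As an illustration only (the patent does not prescribe a particular stitching algorithm), partial images with known translation offsets, e.g. produced by an image-registration step, can be pasted into a common canvas; the offsets and sizes below are made-up values:

```python
# Illustrative sketch, not the patented algorithm: stitch partial
# fingerprint images into one 2-D image, assuming each partial image
# already has a translation offset from an image-registration step.

def stitch(partials, canvas_w, canvas_h):
    """partials: list of (image, (off_x, off_y)); image is a list of
    rows of grey-scale values. Later images overwrite earlier ones
    where they overlap; pixels never covered stay None."""
    canvas = [[None] * canvas_w for _ in range(canvas_h)]
    for image, (off_x, off_y) in partials:
        for y, row in enumerate(image):
            for x, value in enumerate(row):
                cy, cx = y + off_y, x + off_x
                if 0 <= cy < canvas_h and 0 <= cx < canvas_w:
                    canvas[cy][cx] = value
    return canvas

# Two overlapping 2x2 partial images placed on a 4x3 canvas:
a = [[10, 10], [10, 10]]
b = [[20, 20], [20, 20]]
stitched = stitch([(a, (0, 0)), (b, (1, 1))], 4, 3)
```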
  • Action 215: the COA point is determined by calculating a center of gravity (COG) point of the first stitched image and assigning the center of gravity point to the COA point.
  • the COG in the x and y direction, CoG_x and CoG_y, may be calculated as:

    CoG_x = Σ_{∀x,y} x · M(x, y) / Σ_{∀x,y} M(x, y)
    CoG_y = Σ_{∀x,y} y · M(x, y) / Σ_{∀x,y} M(x, y)

    where M(x, y) is a binary value coverage mask corresponding to the first stitched image, with x and y being the pixel position in the first stitched image.
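The centre-of-gravity computation can be sketched as follows, assuming the binary coverage mask is stored as a list of rows of 0/1 values (the mask itself is an illustrative example):

```python
# Sketch of the COG of a binary coverage mask M(x, y):
# 1 = fingerprint pixel, 0 = background.

def center_of_gravity(mask):
    total = sum_x = sum_y = 0
    for y, row in enumerate(mask):
        for x, m in enumerate(row):
            total += m
            sum_x += x * m
            sum_y += y * m
    if total == 0:
        raise ValueError("empty coverage mask")
    return sum_x / total, sum_y / total

mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
cog_x, cog_y = center_of_gravity(mask)  # (1.5, 0.5)
```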
  • Figure 3 illustrates an example where a plurality of fingerprint images, exemplified by reference numeral 304, have been stitched into a first stitched image 302.
  • a calculated COA 306 is also indicated in figure 3.
  • the guiding of the user in the fingerprint enrolment procedure comprises the following actions 217 to 223.
  • Action 217: a second plurality of fingerprint images of the finger of the user is obtained from the fingerprint sensor.
  • Action 219: during the obtaining, the second plurality of fingerprint images are stitched into a second two-dimensional stitched image.
  • Action 221: a calculation is made of a desired position of the finger in relation to the sensor that, when a fingerprint image in the second plurality of fingerprint images is obtained of the finger at the desired position, provides an amount of additional fingerprint area in the second stitched image in the proximity of the COA point that has a maximum value.
  • Action 223: guidance information is provided for the user, where this guidance information is indicative of the calculated desired position.
  • the guidance information may be any of a matrix of blocks that illustrates the fingerprint coverage of the second stitched image, a binary map of actual coverage of the second stitched image, and a displayed image of a pseudo-finger that represents a position of the finger in relation to the sensor.
  • instructions may be provided for the user to repeatedly touch the sensor while moving the finger between each touch. Such instructions may be as simple as an instructive message or graphic displayed on a display.
  • any obtained fingerprint image that corresponds to the finger being asymmetrically located with respect to the sensor is discarded.
  • the continuation, in action 215, with the determination of the COA is then done when the number of fingerprint images in the first plurality is larger than a first threshold.
  • feedback information may be provided for the user that indicates that the finger is asymmetrically located with respect to the sensor.
  • an advantage of such embodiments can be illustrated by considering a user who has little experience with fingerprint enrolment procedures. Such an inexperienced user might not be aware of how the amount of movement between each time the finger touches the sensor maps to the guidance information that is fed back to the user, e.g. in terms of fingerprint coverage growth etc. as mentioned above.
  • This approach may be considered a "training mode approach", since the user is informed when the user has placed the finger in an undesired asymmetric position in relation to the sensor.
  • a determination may be made whether or not a fingerprint image corresponds to the finger being asymmetrically located with respect to the sensor. This determination may comprise analysing data of the fingerprint image that correspond to data obtained from a sensor border and determining that the finger is asymmetrically located with respect to the sensor if fingerprint image data is missing from the sensor border.
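The border test described above can be sketched as follows; the coverage mask of a single captured image, the border definition and the 0.25 tolerance are all illustrative assumptions, not values from the patent:

```python
# Hedged sketch: if the coverage mask of one captured image is missing
# fingerprint data along a large fraction of the sensor border, the
# finger is taken to be asymmetrically placed on the sensor.

def is_asymmetric(mask, max_missing_fraction=0.25):  # assumed tolerance
    h, w = len(mask), len(mask[0])
    border = [(x, 0) for x in range(w)] + [(x, h - 1) for x in range(w)]
    border += [(0, y) for y in range(1, h - 1)]
    border += [(w - 1, y) for y in range(1, h - 1)]
    missing = sum(1 for x, y in border if mask[y][x] == 0)
    return missing / len(border) > max_missing_fraction

full = [[1] * 4 for _ in range(3)]   # finger covers the whole sensor
shifted = [[0, 0, 1, 1]] * 3         # finger only on the right half
```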
  • instructions may be provided for the user to repeatedly touch the sensor while moving the finger between each touch.
  • Such instructions may be as simple as an instructive message or graphic displayed on a display.
  • the continuation, in action 215, with the determination of the COA is then done when the number of fingerprint images in the first plurality is larger than a second threshold.
  • The use of such a second threshold may be seen as a "training-free" approach, where the COA is estimated once there is a minimum number of fingerprint images that can be stitched together.
  • the guiding of the user in the fingerprint enrolment procedure, i.e. in action 203 may comprise calculating an updated COA point by using the second stitched image.
  • a search is made, in the second stitched image, for a location of a singular point and, if the search is positive, the location of the singular point is used in the calculation of an updated COA point.
  • examples of singular points include a core, a loop, a whorl center, a delta and a tented arch.
  • Figure 5 illustrates a fingerprint 510 where singular points 507, 508 are illustrated.
  • Singular point 507 is a delta and singular point 508 is a core point.
  • the embodiments where the COA is updated by using a location of a singular point may involve the following.
  • the initial COA point might be sub-optimal since it is possible that the region in which the COA is located contains few recognizable fingerprint patterns.
  • Patterns that comprise global points, e.g. the location of a core, delta or loop, include high-informative regions and are hence useful to include as enrolment data.
  • the initial COA is shifted towards the singular point, i.e. COA_upd = k·A + (1 - k)·B, with 0 ≤ k ≤ 1 as a weighting parameter, where A is the initial COA point and B is the location of the singular point.
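Reading A as the initial COA point and B as the singular-point location, the shift is a convex combination of the two points; a minimal sketch, with k = 0.5 as an illustrative weight:

```python
# Sketch of the COA update as a convex combination of the initial COA
# point A and the detected singular-point location B. k = 1 keeps the
# initial COA, k = 0 moves fully to the singular point.

def update_coa(a, b, k=0.5):
    ax, ay = a
    bx, by = b
    return (k * ax + (1 - k) * bx, k * ay + (1 - k) * by)

coa_upd = update_coa((10.0, 20.0), (30.0, 40.0), k=0.5)  # (20.0, 30.0)
```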
  • the amount of additional fingerprint area in the second stitched image is determined in an algorithm that comprises calculation of a coverage score S:

    S = Σ_{∀x,y} G(COA, σ) · M(x, y)

    where G is a Gaussian kernel, σ is the standard deviation of the Gaussian, and M(x, y) is a binary value coverage mask corresponding to the second stitched image, with x and y being the pixel position in the second stitched image.
  • the termination criterion may be any of a number of different criteria, including: S is above a coverage score threshold; the increase of S over a number N of last fingerprint images obtained and stitched into the second stitched image is below an increase threshold; the number of fingerprint images in the second stitched image is above a third threshold; the number of consecutively obtained fingerprint images that are not possible to stitch into the second stitched image is above a fourth threshold; and the number of consecutively obtained fingerprint images that are found to have an image quality that is lower than a quality threshold is above a fifth threshold.
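Two of the listed criteria can be sketched over a history of coverage scores; the window size and thresholds below are assumed example values, not taken from the patent:

```python
# Illustrative check of two termination criteria: S above a
# coverage-score threshold, or the growth of S over the last n
# stitched images below an increase threshold.

def should_terminate(s_history, n=5, score_threshold=50.0,
                     increase_threshold=0.5):  # assumed example values
    if not s_history:
        return False
    if s_history[-1] >= score_threshold:
        return True  # enough coverage near the COA
    if len(s_history) > n and s_history[-1] - s_history[-1 - n] < increase_threshold:
        return True  # coverage has stopped growing
    return False
```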
  • the coverage score S summation is used in the following way: the Gaussian kernel, having the COA as expected value and standard deviation σ, multiplied with the coverage mask M provides a measure of progress for the enrolment procedure.
  • the key point is that the covered surface is weighted with a Gaussian kernel such that regions that are close to the COA are emphasized. This assures that the enrolment covers a region of the finger that will be used for subsequent verification and hence improves the biometric performance of the system.
  • the coverage mask M(x,y) is a binary value mask that, for each point within the mask, shows whether the intensity of that pixel represents fingerprint or background information.
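The coverage score can be sketched as below; the unnormalised Gaussian and the toy mask are illustrative assumptions:

```python
# Sketch of the coverage score S: the binary coverage mask M is
# weighted by a 2-D Gaussian kernel centred on the COA with standard
# deviation sigma, so covered pixels close to the COA contribute most.
import math

def coverage_score(mask, coa, sigma):
    cx, cy = coa
    s = 0.0
    for y, row in enumerate(mask):
        for x, m in enumerate(row):
            if m:
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                s += math.exp(-d2 / (2 * sigma ** 2))
    return s

mask = [[1, 1],
        [1, 0]]
score = coverage_score(mask, coa=(0.0, 0.0), sigma=1.0)
```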
  • Figure 1a illustrates a fingerprint sensing system 100 that comprises a fingerprint sensor 104, a processor 102 and a memory 106, said memory 106 containing instructions executable by said processor 102 whereby said processor 102 is operative to control the fingerprint sensing system 100 by: determining a center of attention, COA, point and guiding the user in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
  • the instructions that are executable by the processor 102 may be software in the form of a computer program 141.
  • the computer program 141 may be contained in or by a carrier 142, which may provide the computer program 141 to the memory 106 and processor 102.
  • the carrier 142 may be in any suitable form including an electronic signal, an optical signal, a radio signal or a computer readable storage medium.
  • the processor 102 is operative to control the fingerprint sensing system 100 by:
  • the processor 102 is operative to control the fingerprint sensing system 100 by, prior to the obtaining of the first plurality of fingerprint images:
  • the processor 102 is operative to control the fingerprint sensing system 100 such that the guiding of the user in the fingerprint enrolment procedure comprises:
  • the singular point is any of: a core, a loop, a whorl center, a delta and a tented arch.
  • the processor 102 is operative to control the fingerprint sensing system 100 such that the amount of additional fingerprint area in the second stitched image is determined in an algorithm that comprises calculation of a coverage score S:
  • G is a Gaussian kernel
  • σ is the standard deviation of the Gaussian kernel
  • M(x,y) is a binary value coverage mask corresponding to the second stitched image with x and y being the pixel position in the second stitched image
  • the processor 102 is operative to control the fingerprint sensing system 100 such that the termination criterion is any of:
  • a number of consecutively obtained fingerprint images that are found to have an image quality that is lower than a quality threshold is above a fifth threshold.
  • the processor 102 is operative to control the fingerprint sensing system 100 such that the guidance information is any of: a matrix of blocks that illustrates the fingerprint coverage of the second stitched image, a binary map of actual coverage of the second stitched image, and a displayed image of a pseudo-finger that represents a position of the finger in relation to the sensor.


Abstract

In a fingerprint sensing system a determination is made of a center of attention, COA, point. The COA point is a point on a finger of a user that is likely to be in proximity of the center of a fingerprint image of the finger obtained by a fingerprint sensor. This COA determination is used in guiding the user in a fingerprint enrolment procedure.

Description

Guided fingerprint enrolment based on center of attention point
TECHNICAL FIELD
Embodiments herein relate to methods and arrangements relating to enrolment of fingerprints in a fingerprint sensing system.

BACKGROUND
In the field of biometric sensing, the use of fingerprints has evolved to be one of the most widely used technologies. This fact can be illustrated and exemplified by considering the field of mobile communication technology, e.g. the use of intelligent mobile devices such as smartphones. In this field there is an increased demand for providing increased security for accessing the devices themselves and also for providing secure access to remote services such as banking services that are available via data communication networks.
In order to enable such secure access by way of fingerprint sensing, a user has to take part in a so-called enrolment procedure where information directly connected to a user's fingerprint is registered for later use in a verification procedure when actual access is to be determined. During such an enrolment procedure, the user is typically prompted to apply a finger to a fingerprint sensor several times until a complete fingerprint, or at least a large part of a fingerprint, has been recorded.
Examples of prior art fingerprint enrolment are described in US patent application publications 2014/0003677 and 2014/0003679. In the systems described in these publications, during the enrolment procedure, a user is provided with feedback in the form of information that tells the user which part of the fingerprint is still to be recorded.
However, there are drawbacks with prior art enrolment procedures. For example, previously known enrolment methods, including the systems described in the publications cited above, typically apply generalized enrolment schemes aiming to enrol the same fingerprint area for all users, without taking into account what feels natural and convenient for the user who is to enrol a fingerprint. These approaches are thus sub-optimal as they, for some users, guide the users to enrol a part of their fingerprint which they will never use for subsequent verification procedures. Such prior art methods and systems typically suffer from a fairly large number of false rejections during verification procedures, because a user by instinct might use another part of the fingerprint for verification than the part the user was instructed to use during the enrolment.
SUMMARY
In order to mitigate at least some of the drawbacks as discussed above, there is provided in a first aspect of embodiments herein a method in a fingerprint sensing system. The fingerprint sensing system comprises a fingerprint sensor and the method comprises a determination of a center of attention, COA, point. The COA point is a point on a finger of a user that is likely to be in a proximity of the center of a fingerprint image of the finger obtained by the sensor. This COA determination is followed by guiding the user in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
In other words, by making an initial estimation of the COA point for a given user and using that COA point as a reference point around which enrolment data is of interest during the subsequent guided fingerprint enrolment, the overall user experience of the fingerprint sensor is improved. This is the case, as the user will be using the part of the finger that feels natural for the user for both the enrolment and any subsequent verification procedure when the fingerprint is to be verified.
In various embodiments, the determination of the COA point comprises obtaining a first plurality of fingerprint images of the finger of the user from the fingerprint sensor. During the obtaining of the first plurality of fingerprint images, the first plurality of fingerprint images are stitched into a first two-dimensional stitched image. The COA point is determined by calculating a center of gravity point of the first stitched image and assigning the center of gravity point to the COA point. Furthermore, in these embodiments, the guiding of the user in the fingerprint enrolment procedure comprises obtaining a second plurality of fingerprint images of the finger of the user from the fingerprint sensor. During the obtaining of the second plurality of fingerprint images, the second plurality of fingerprint images are stitched into a second two-dimensional stitched image. During the obtaining and stitching of the second plurality of fingerprint images, a calculation is made of a desired position of the finger in relation to the sensor that, when a fingerprint image in the second plurality of fingerprint images is obtained of the finger at the desired position, provides an amount of additional fingerprint area in the second stitched image in the proximity of the COA point that has a maximum value. Moreover, during the obtaining and stitching of the second plurality of fingerprint images, guidance information is provided for the user, where this guidance information is indicative of the calculated desired position.
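The desired-position calculation above can be illustrated with a toy sketch, not the patented calculation: among a set of candidate finger placements, pick the one whose newly covered cells, weighted toward the COA, gain the most. The candidate names, cell sets and sigma are made-up values:

```python
# Hedged sketch of choosing the next desired finger position: maximise
# the Gaussian-weighted amount of *new* fingerprint area near the COA.
import math

def gained_weighted_area(covered, candidate_cells, coa, sigma=3.0):
    cx, cy = coa
    gain = 0.0
    for (x, y) in candidate_cells:
        if (x, y) not in covered:  # only area not yet in the stitched image
            gain += math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
    return gain

def best_position(covered, candidates, coa):
    # candidates: {position_name: set of cells the sensor would capture}
    return max(candidates, key=lambda p: gained_weighted_area(covered, candidates[p], coa))

covered = {(0, 0), (1, 0)}
candidates = {
    "left":  {(0, 0), (1, 0)},   # fully covered already -> no gain
    "right": {(2, 0), (3, 0)},   # new cells near the COA
}
pos = best_position(covered, candidates, (2.0, 0.0))  # "right"
```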
The guiding of the user in the fingerprint enrolment procedure may in some embodiments comprise a calculation of an updated COA point by using the second stitched image. For example, a search can be made for a location of a singular point in the second stitched image and, if the search is positive, using this location of the singular point in the calculation of an updated COA point.
The amount of additional fingerprint area in the second stitched image may in some embodiments be determined in an algorithm that comprises calculation of a coverage score S:
S = Σ_{∀x,y} G(COA, σ) · M(x, y)
where G is a Gaussian kernel, σ is the standard deviation of the Gaussian and M(x,y) is a binary value coverage mask corresponding to the second stitched image with x and y being the pixel position in the second stitched image, and where the enrolment procedure is terminated when a termination criterion based on S is reached.
In a second aspect there is provided a fingerprint sensing system that comprises a fingerprint sensor, a processor and a memory. The memory contains instructions executable by the processor whereby the processor is operative to control the fingerprint sensing system by determining a center of attention, COA, point, the COA point being a point on a finger of a user that is likely to be in a proximity of the center of a fingerprint image of the finger obtained by the sensor, and guiding the user in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
In further aspects there are provided: a communication device comprising the fingerprint sensing system of the second aspect; a computer program comprising instructions which, when executed on at least one processor in a fingerprint sensing system, cause the fingerprint sensing system to carry out the method according to the first aspect; and, in a final aspect, a carrier comprising the computer program.
Effects and advantages of these further aspects correspond to those summarized above in connection with the first aspect.

BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1a schematically illustrates a block diagram of a fingerprint sensing system, figure 1b schematically illustrates a block diagram of a mobile communication device, figure 1c schematically illustrates a block diagram of a smart card,
figure 2a is a flowchart of a method,
figure 2b is a flowchart of a method,
figure 3 schematically illustrates a stitched image and a COA point,
figure 4 schematically illustrates finger locations in relation to a sensor, and
figure 5 schematically illustrates singular points in a fingerprint image.

DETAILED DESCRIPTION
Figure 1a illustrates schematically in the form of function blocks a fingerprint sensing system 100. The function blocks comprise a processor 102, a two-dimensional fingerprint sensor 104 and a memory 106, and the system 100 is in connection with a guidance information provider 108. The processor 102 is operable to control the fingerprint sensing system 100 and is connected to the memory 106, which comprises an appropriate computer program 141 comprising software instructions and data that enable the processor 102 to control the system 100 as will be exemplified below. The fingerprint sensor 104 may be of any suitable type, such as optical, capacitive, ultrasonic etc., as the skilled person will realize. The fingerprint sensor 104 may comprise a square or rectangular shaped matrix of pixels, for example a capacitive sensor having a size of 208x80 pixels, each pixel having a resolution of 256 grey scales. The fingerprint sensor typically comprises a readout circuit (not shown in the drawings) allowing the image data, i.e. fingerprint data, to be read out to the processor 102 at various speeds.
The fingerprint sensing system 100 may comprise individual components as illustrated schematically in figure 1a and the system may also be implemented by way of combining functionalities of the processor 102 and the memory 106 in a single unit. It is also possible to have an implementation where the sensor 104 comprises the necessary processor and memory capabilities.
With regard to the guidance information provider 108, it is an arrangement that is capable of providing feedback to a user when the user interacts with the fingerprint sensing system 100. Although feedback will be exemplified with visual output in the form of graphics in the following, it is to be noted that the guidance information provider 108 may be any arrangement capable of providing sensory output that is any of visual, audible and tactile.
Figure 1b illustrates schematically in the form of function blocks a mobile communication device 110 such as a mobile phone, a smartphone, a tablet, a personal computer, a laptop computer or any similar type of device. The mobile communication device 110 comprises the functionalities of the fingerprint sensing system 100 of figure 1a including the sensor 104. The mobile communication device 110 comprises a processor 112, a memory 114, radio circuitry 116 and a touch sensitive display 118. As indicated in figure 1b, the fingerprint sensing system 100 forms part of the processor 112 and the memory 114 and is connected to the touch sensitive display 118. That is, the processor 112 controls by means of software instructions the fingerprint sensing system 100 as will be exemplified below. The touch sensitive display 118 is configured to act as the guidance information provider 108 by providing graphical output for a user during operation of the fingerprint sensing system 100. Needless to say, the processor 112 is configured to control the mobile communication device to operate in a mobile communication system via the radio circuitry 116 in a manner that is outside the scope of the present disclosure.

Yet another arrangement in which a fingerprint sensing system may be implemented is a smart card 140, as schematically illustrated in a functional block diagram in figure 1c. The smart card 140 comprises the functionalities of the fingerprint sensing system 100 of figure 1a including the sensor 104. The smart card 140 comprises a processor 142, a memory 144 and radio circuitry 146, which may be of any appropriate type such as near field communication, NFC, circuitry, Bluetooth® circuitry etc. As indicated in figure 1c, the fingerprint sensing system 100 forms part of the processor 142 and the memory 144. That is, the processor 142 controls by means of software instructions the fingerprint sensing system 100 as will be exemplified below. In contrast to the communication device 110 in figure 1b, the smart card is not equipped with a display, although variations of the smart card 140 may be equipped with a guidance information provider in the form of, e.g., light emitting diodes (LEDs) or audio providing means. Similar to the communication device 110 in figure 1b, the processor 142 in the smart card 140 is configured to control the smart card 140 to operate in a communication system, e.g. in a payment scenario in case the smart card is a bank card or credit card, via the radio circuitry 146 in a manner that is outside the scope of the present disclosure.

Turning now to figure 2a and with continued reference to figures 1a, 1b and 1c, a method in a fingerprint sensing system, e.g. the fingerprint sensing system 100 of figures 1a, 1b and 1c, will be described in some detail. The method comprises a number of actions that will be described below. The actions of the method in figure 2a are realized by means of software instructions being executed in a processor, e.g. any of the processors 102, 112 or 142, which interacts with a sensor such as the sensor 104 and controls provision of guidance information, e.g. via a guidance information provider 108. Memory such as the memory 106 or the memory 114 is utilized during the execution of the method.

Action 201
A determination is made of a center of attention, COA, point. The COA point is a point on a finger of a user that is likely to be in a proximity of the center of a fingerprint image of the finger obtained by the sensor.
Action 203
The user is guided in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
Embodiments of the method illustrated in figure 2a may comprise actions as illustrated in figure 2b. Although the actions are illustrated in a sequential order, it is to be understood that any number of the actions may be performed in parallel, as will become clear from the detailed description of the actions. In these embodiments, the determination of the COA point, as described in action 201, comprises the following actions 211 to 215.
Action 211
A first plurality of fingerprint images of the finger of the user is obtained from the fingerprint sensor.

Action 213
During the obtaining of the first plurality of fingerprint images, the first plurality of fingerprint images are stitched into a first two-dimensional stitched image.
Action 215
The COA point is determined by calculating a center of gravity (COG) point of the first stitched image and assigning the center of gravity point to the COA point. The COG in the x and y direction, CoGx and CoGy, may be calculated as:

CoGx = ∑_{∀x,y} x·M(x, y) / ∑_{∀x,y} M(x, y),    CoGy = ∑_{∀x,y} y·M(x, y) / ∑_{∀x,y} M(x, y)

where M(x, y) is a binary value coverage mask corresponding to the first stitched image with x and y being the pixel position in the first stitched image.
Figure 3 illustrates an example where a plurality of fingerprint images, exemplified by reference numeral 304, have been stitched into a first stitched image 302. A calculated COA point 306 is also indicated in figure 3.
In these embodiments, the guiding of the user in the fingerprint enrolment procedure, as described in action 203, comprises the following actions 217 to 223.
Action 217
A second plurality of fingerprint images of the finger of the user is obtained from the fingerprint sensor.
Action 219
During the obtaining of the second plurality of fingerprint images, the second plurality of fingerprint images are stitched into a second two-dimensional stitched image.

Action 221
During the obtaining and stitching of the second plurality of fingerprint images, a calculation is made of a desired position of the finger in relation to the sensor. This desired position is such that, when a fingerprint image in the second plurality of fingerprint images is obtained of the finger at the desired position, it provides an amount of additional fingerprint area in the second stitched image in the proximity of the COA point that has a maximum value.
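One way to picture such a calculation is an exhaustive scan over candidate sensor-sized windows on the stitched-image grid, scoring the uncovered area each window would add, weighted by a Gaussian centred on the COA. This is only a sketch under assumed names; the disclosure does not prescribe any particular search strategy:

```python
import numpy as np

def desired_position(mask, coa, sensor_h, sensor_w, sigma=8.0):
    """Return the top-left (row, col) of the sensor-sized window whose
    *uncovered* pixels, weighted by a Gaussian centred on the COA,
    would add the most new fingerprint area near the COA.

    mask : 2-D array, 1 where the stitched image already holds data.
    coa  : (x, y) centre-of-attention point in pixel coordinates.
    """
    h, w = mask.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gauss = np.exp(-((xs - coa[0]) ** 2 + (ys - coa[1]) ** 2) / (2 * sigma ** 2))
    gain = gauss * (1 - mask)            # an already-covered pixel adds nothing
    best, best_pos = -1.0, (0, 0)
    for r in range(h - sensor_h + 1):
        for c in range(w - sensor_w + 1):
            s = gain[r:r + sensor_h, c:c + sensor_w].sum()
            if s > best:                 # keep the first maximiser on ties
                best, best_pos = s, (r, c)
    return best_pos
```

The returned window position would then be translated into guidance information, e.g. which way the user should shift the finger relative to the sensor.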
Action 223
During the obtaining and stitching of the second plurality of fingerprint images, guidance information is provided for the user, where this guidance information is indicative of the calculated desired position. For example, the guidance information may be any of a matrix of blocks that illustrates the fingerprint coverage of the second stitched image, a binary map of actual coverage of the second stitched image, and a displayed image of a pseudo-finger that represents a position of the finger in relation to the sensor.

In some embodiments, prior to the obtaining of the first plurality of fingerprint images, i.e. before actions 211 to 215, instructions may be provided for the user to repeatedly touch the sensor while moving the finger between each touch. Such instructions may be as simple as an instructive message or graphic displayed on a display. In these embodiments, during the obtaining and stitching of the first plurality of fingerprint images, any obtained fingerprint image that corresponds to the finger being asymmetrically located with respect to the sensor is discarded. The continuation, in action 215, with the determination of the COA point is then done when the first plurality of fingerprint images is numerically larger than a first threshold. In these embodiments, if an obtained fingerprint image is discarded, feedback information may be provided for the user that indicates that the finger is asymmetrically located with respect to the sensor. An advantage of such embodiments can be illustrated by considering a user who has little experience with fingerprint enrolment procedures. Such an inexperienced user might not be aware of how the amount of movement between each time the finger touches the sensor maps to the guidance information that is fed back to the user, e.g. in terms of fingerprint coverage growth as mentioned above. This approach may be considered a "training mode" approach, since the user is informed when the user has placed the finger in an undesired asymmetric position in relation to the sensor and that a corresponding fingerprint image has been discarded.

In some embodiments, a determination may be made whether or not a fingerprint image corresponds to the finger being asymmetrically located with respect to the sensor. This determination may comprise analysing data of the fingerprint image that correspond to data obtained from a sensor border and determining that the finger is asymmetrically located with respect to the sensor if fingerprint image data is missing from the sensor border.
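The border-based asymmetry check described here can be sketched as follows. This is a minimal illustration; the function name, and the simplifying assumption that background pixels take one known intensity value, are ours:

```python
import numpy as np

def finger_asymmetric(image: np.ndarray, background_value: int = 0) -> bool:
    """Heuristic from the description: flag the finger as asymmetrically
    located if any sensor border (top, bottom, left or right edge row/column)
    contains only background, i.e. fingerprint data is missing there."""
    borders = [image[0, :], image[-1, :], image[:, 0], image[:, -1]]
    return any(np.all(edge == background_value) for edge in borders)
```

A discarded image would then trigger the feedback described above, telling the user to re-centre the finger.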
In some embodiments, prior to the obtaining of the first plurality of fingerprint images, i.e. before actions 211 to 215, instructions may be provided for the user to repeatedly touch the sensor while moving the finger between each touch. Such instructions may be as simple as an instructive message or graphic displayed on a display. In these embodiments, the continuation, in action 215, with the determination of the COA point is then done when the first plurality of fingerprint images is numerically larger than a second threshold. In other words, such embodiments may be seen as a "training-free" approach where the COA is estimated once there is a minimum number of fingerprint images that can be stitched together.

In some embodiments, the guiding of the user in the fingerprint enrolment procedure, i.e. in action 203, may comprise calculating an updated COA point by using the second stitched image. In these embodiments, a search is made, in the second stitched image, for a location of a singular point and, if the search is positive, the location of the singular point is used in the calculation of an updated COA point. Examples of singular points include a core, a loop, a whorl center, a delta and a tented arch. Figure 5 illustrates a fingerprint 510 where singular points 507, 508 are illustrated; singular point 507 is a delta and singular point 508 is a core point.
The embodiments where the COA is updated by using a location of a singular point may involve the following. The initial COA point might be sub-optimal, since it is possible that the region in which the COA is located contains few recognizable fingerprint patterns. To incorporate a region that conveys a larger amount of recognizable patterns, global points (e.g. the location of a core, delta or loop) can be included in the estimation of the COA. The patterns that comprise global points include highly informative regions and are hence useful to include as enrolment data. In order to include them, the initial COA is shifted, i.e. updated, to a point COAupd between the center of gravity (A) and a global point (B) according to the following expression:

COAupd = kA + (1 − k)B

with 0 ≤ k ≤ 1 as a weighting parameter.
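Reading the weighting as the standard convex combination COAupd = kA + (1 − k)B for 0 ≤ k ≤ 1 (so k = 1 keeps the center of gravity and k = 0 moves fully to the global point), the update can be sketched as:

```python
def update_coa(cog, singular, k=0.5):
    """COA_upd = k*A + (1-k)*B: a convex combination of the center of
    gravity A and a detected singular (global) point B, with 0 <= k <= 1.
    Points are (x, y) tuples in stitched-image pixel coordinates."""
    return tuple(k * a + (1 - k) * b for a, b in zip(cog, singular))

# Midpoint between a CoG at the origin and a core point at (10, 20):
update_coa((0.0, 0.0), (10.0, 20.0), k=0.5)  # (5.0, 10.0)
```

The choice of k trades off the geometric coverage center against the highly informative region around the singular point.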
In some embodiments, the amount of additional fingerprint area in the second stitched image is determined in an algorithm that comprises calculation of a coverage score S:
S = ∑_{∀x,y} G(CoA, σ) M(x, y)
where G is a Gaussian kernel, σ is the standard deviation of the Gaussian and M(x, y) is a binary value coverage mask corresponding to the second stitched image with x and y being the pixel position in the second stitched image, and where the enrolment procedure is terminated when a termination criterion based on S is reached. The termination criterion may be any of a number of different criteria, including: S is above a coverage score threshold; the increase of S over a number N of last fingerprint images obtained and stitched into the second stitched image is below a threshold; a number of fingerprint images in the second stitched image is above a third threshold; a number of consecutively obtained fingerprint images that are not possible to stitch into the second stitched image is above a fourth threshold; and a number of consecutively obtained fingerprint images that are found to have an image quality lower than a quality threshold is above a fifth threshold.

In other words, in these embodiments, during the calculation in action 221 of a desired finger position, the coverage score S is used in the following way: the Gaussian kernel, having the COA as expected value and standard deviation σ, multiplied with the coverage mask M, provides a measure of progress for the enrolment procedure. The key point is that the covered surface is weighted with a Gaussian kernel such that regions that are close to the COA are emphasized. This assures that the enrolment covers a region of the finger that will be used for subsequent verification and hence improves the biometric performance of the system. The coverage mask M(x, y) is a binary value mask that, for each point within the mask, shows whether the intensity of that pixel represents fingerprint or background information.
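A sketch of the coverage score under these definitions is given below. Normalising the kernel so that full coverage yields S = 1 is our choice for readability; the disclosure leaves the kernel scaling open:

```python
import numpy as np

def coverage_score(mask, coa, sigma):
    """S = sum over all (x, y) of G(CoA, sigma) * M(x, y): covered pixels
    weighted by a Gaussian kernel centred on the COA, so coverage close
    to the COA counts more than coverage at the periphery."""
    h, w = mask.shape
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-((xs - coa[0]) ** 2 + (ys - coa[1]) ** 2) / (2 * sigma ** 2))
    g /= g.sum()                  # normalise: a fully covered mask gives S = 1
    return float((g * mask).sum())
```

With such a normalisation, "S is above a coverage score threshold" becomes a threshold on the fraction of COA-weighted area already enrolled.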
Returning now to figure 1a, embodiments of a fingerprint sensing system 100 will be described in some more detail. Figure 1a illustrates a fingerprint sensing system 100 that comprises a fingerprint sensor 104, a processor 102 and a memory 106, said memory 106 containing instructions executable by said processor 102 whereby said processor 102 is operative to control the fingerprint sensing system 100 by:
- determining a center of attention, COA, point, the COA point being a point on a finger of a user that is likely to be in a proximity of the center of a fingerprint image of the finger obtained by the sensor, and
- guiding the user in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
The instructions that are executable by the processor 102 may be software in the form of a computer program 141. The computer program 141 may be contained in or by a carrier 142, which may provide the computer program 141 to the memory 106 and processor 102. The carrier 142 may be in any suitable form including an electronic signal, an optical signal, a radio signal or a computer readable storage medium.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 by:
- obtaining a first plurality of fingerprint images of the finger of the user from the fingerprint sensor,
- stitching, during the obtaining of the first plurality of fingerprint images, the first plurality of fingerprint images into a first two-dimensional stitched image,
- determining the COA point by calculating a center of gravity point of the first stitched image and assigning the center of gravity point to the COA point, and wherein the guiding of the user in the fingerprint enrolment procedure comprises:
- obtaining a second plurality of fingerprint images of the finger of the user from the fingerprint sensor,
- stitching, during the obtaining of the second plurality of fingerprint images, the second plurality of fingerprint images into a second two-dimensional stitched image,
- calculating, during the obtaining and stitching of the second plurality of fingerprint images, a desired position of the finger in relation to the sensor such that, when a fingerprint image in the second plurality of fingerprint images is obtained of the finger at the desired position, it provides an amount of additional fingerprint area in the second stitched image in the proximity of the COA point that has a maximum value, and
- providing, during the obtaining and stitching of the second plurality of fingerprint images, guidance information for the user, said guidance information being indicative of the calculated desired position.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 by, prior to the obtaining of the first plurality of fingerprint images:
- providing instructions for the user to repeatedly touch the sensor while moving the finger between each touch,
- discarding, during the obtaining and stitching of the first plurality of fingerprint images, any obtained fingerprint image that corresponds to the finger being
asymmetrically located with respect to the sensor, and
- continuing with the determination of the COA point when the first plurality of fingerprint images is numerically larger than a first threshold.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 by:
- if an obtained fingerprint image is discarded, providing feedback information for the user that indicates that the finger is asymmetrically located with respect to the sensor.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 by:
- determining whether or not a fingerprint image corresponds to the finger being asymmetrically located with respect to the sensor, comprising:
- analysing data of the fingerprint image that correspond to data obtained from a sensor border and determining that the finger is asymmetrically located with respect to the sensor if fingerprint image data is missing from the sensor border. In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 by, prior to the obtaining of the first plurality of fingerprint images:
- providing instructions for the user to touch the sensor while moving the finger between each touch, and
- continuing with the determination of the COA point when the first plurality of fingerprint images is numerically larger than a second threshold.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 such that the guiding of the user in the fingerprint enrolment procedure comprises:
- calculating an updated COA point by using the second stitched image.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 by:
- searching for, in the second stitched image, a location of a singular point and, if the search is positive, using the location of the singular point in the calculation of an updated COA point.
In some embodiments, the singular point is any of:
- a core,
- a loop,
- a whorl center,
- a delta, and
- a tented arch.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 such that the amount of additional fingerprint area in the second stitched image is determined in an algorithm that comprises calculation of a coverage score S:
S = ∑_{∀x,y} G(CoA, σ) M(x, y)
where G is a Gaussian kernel, σ is the standard deviation of the Gaussian kernel and M(x,y) is a binary value coverage mask corresponding to the second stitched image with x and y being the pixel position in the second stitched image, and where the enrolment procedure is terminated when a termination criterion based on S is reached.

In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 such that the termination criterion is any of:
- S is above a coverage score threshold,
- the increase of S over a number N of last fingerprint images obtained and stitched into the second stitched image is below a threshold,
- a number of fingerprint images in the second stitched image is above a third threshold,
- a number of consecutively obtained fingerprint images that are not possible to stitch into the second stitched image is above a fourth threshold, and
- a number of consecutively obtained fingerprint images that are found to have an image quality that is lower than a quality threshold is above a fifth threshold.
In some embodiments, the processor 102 is operative to control the fingerprint sensing system 100 such that the guidance information is any of:
- a matrix of blocks that illustrates the fingerprint coverage of the second stitched image,
- a binary map of actual coverage of the second stitched image, and
- a displayed image of a pseudo-finger that represents a position of the finger in relation to the sensor.

Claims

1. A method in a fingerprint sensing system (100), the fingerprint sensing system comprising a fingerprint sensor (104), the method comprising:
- determining (201) a center of attention, COA, point, the COA point being a point on a finger of a user that is likely to be in a proximity of the center of a fingerprint image of the finger obtained by the sensor, and
- guiding (203) the user in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
2. The method of claim 1, wherein the determination of the COA point comprises:
- obtaining (211) a first plurality of fingerprint images (304) of the finger of the user from the fingerprint sensor,
- stitching (213), during the obtaining of the first plurality of fingerprint images, the first plurality of fingerprint images into a first two-dimensional stitched image (302),
- determining (215) the COA point by calculating a center of gravity point (306) of the first stitched image and assigning the center of gravity point to the COA point, and wherein the guiding of the user in the fingerprint enrolment procedure comprises:
- obtaining (217) a second plurality of fingerprint images of the finger of the user from the fingerprint sensor,
- stitching (219), during the obtaining of the second plurality of fingerprint images, the second plurality of fingerprint images into a second two-dimensional stitched image,
- calculating (221), during the obtaining and stitching of the second plurality of fingerprint images, a desired position of the finger in relation to the sensor such that, when a fingerprint image in the second plurality of fingerprint images is obtained of the finger at the desired position, it provides an amount of additional fingerprint area in the second stitched image in the proximity of the COA point that has a maximum value, and
- providing (223), during the obtaining and stitching of the second plurality of fingerprint images, guidance information for the user, said guidance information being indicative of the calculated desired position.
3. The method of claim 2, comprising, prior to the obtaining of the first plurality of fingerprint images:
- providing instructions for the user to repeatedly touch the sensor while moving the finger between each touch,
- discarding, during the obtaining and stitching of the first plurality of fingerprint images, any obtained fingerprint image that corresponds to the finger being
asymmetrically located with respect to the sensor, and
- continuing with the determination of the COA point when the first plurality of fingerprint images is numerically larger than a first threshold.
4. The method of claim 3, comprising:
- if an obtained fingerprint image is discarded, providing feedback information for the user that indicates that the finger is asymmetrically located with respect to the sensor.
5. The method of claim 3 or claim 4, comprising:
- determining whether or not a fingerprint image corresponds to the finger being asymmetrically located with respect to the sensor, comprising:
- analysing data of the fingerprint image that correspond to data obtained from a sensor border and determining that the finger is asymmetrically located with respect to the sensor if fingerprint image data is missing from the sensor border.
6. The method of claim 2, comprising, prior to the obtaining of the first plurality of fingerprint images:
- providing instructions for the user to touch the sensor while moving the finger between each touch, and
- continuing with the determination of the COA point when the first plurality of fingerprint images is numerically larger than a second threshold.
7. The method of any of claims 2 to 6, wherein the guiding of the user in the fingerprint enrolment procedure comprises:
- calculating an updated COA point by using the second stitched image.
8. The method of claim 7, comprising:
- searching for, in the second stitched image, a location of a singular point (507, 508) and, if the search is positive, using the location of the singular point in the calculation of an updated COA point.
9. The method of claim 8, wherein the singular point is any of:
- a core,
- a loop,
- a whorl center,
- a delta, and
- a tented arch.
10. The method of any of claims 2 to 9, wherein said amount of additional fingerprint area in the second stitched image is determined in an algorithm that comprises calculation of a coverage score S:
S = ∑_{∀x,y} G(CoA, σ) M(x, y)
where G is a Gaussian kernel, σ is the standard deviation of the Gaussian kernel and M(x,y) is a binary value coverage mask corresponding to the second stitched image with x and y being the pixel position in the second stitched image, and where the enrolment procedure is terminated when a termination criterion based on S is reached.
11. The method of claim 10, wherein the termination criterion is any of:
- S is above a coverage score threshold,
- the increase of S over a number N of last fingerprint images obtained and stitched into the second stitched image is below a threshold,
- a number of fingerprint images in the second stitched image is above a third threshold,
- a number of consecutively obtained fingerprint images that are not possible to stitch into the second stitched image is above a fourth threshold, and
- a number of consecutively obtained fingerprint images that are found to have an image quality that is lower than a quality threshold is above a fifth threshold.
12. The method of any of claims 2 to 11, wherein the guidance information is any of:
- a matrix of blocks that illustrates the fingerprint coverage of the second stitched image,
- a binary map of actual coverage of the second stitched image, and
- a displayed image of a pseudo-finger that represents a position of the finger in relation to the sensor.
13. A fingerprint sensing system (100), comprising a fingerprint sensor (104), a processor (102, 112, 142) and a memory (106, 114, 144), said memory containing instructions executable by said processor whereby said processor is operative to control the fingerprint sensing system by:
- determining a center of attention, COA, point, the COA point being a point on a finger of a user that is likely to be in a proximity of the center of a fingerprint image of the finger obtained by the sensor, and
- guiding the user in a fingerprint enrolment procedure, using the determined COA point for providing finger position guidance information to the user.
14. A communication device (110) comprising the fingerprint sensing system (100) of claim 13 and a touch sensitive display (118).
15. A computer program (141), comprising instructions which, when executed on at least one processor (102, 112, 142) in a fingerprint sensing system (100), cause the fingerprint sensing system to carry out the method according to any one of claims 1 to 12.
16. A carrier (142) comprising the computer program of claim 15, wherein the carrier is one of an electronic signal, an optical signal, a radio signal and a computer readable storage medium.
PCT/SE2015/051344 2014-12-19 2015-12-15 Guided fingerprint enrolment based on center of attention point WO2016099389A1 (en)

KR20220030474A (en) * 2020-09-01 2022-03-11 삼성디스플레이 주식회사 Fingerprint authentication device, display device including the same, and method for authenticatiing fingerprint of display device
US11784956B2 (en) 2021-09-20 2023-10-10 Apple Inc. Requests to add assets to an asset account

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5828773A (en) * 1996-01-26 1998-10-27 Harris Corporation Fingerprint sensing method with finger position indication
GB2331613A (en) * 1997-11-20 1999-05-26 Ibm Apparatus for capturing a fingerprint
US20020141622A1 (en) * 2001-03-28 2002-10-03 Nec Corporation Fingerprint impressing position guiding method and fingerprint identification system
EP1645989A2 (en) * 2004-10-08 2006-04-12 Fujitsu Limited Collecting biometric information
US20100303311A1 (en) * 2009-05-26 2010-12-02 Union Community Co., Ltd. Fingerprint recognition apparatus and method thereof of acquiring fingerprint data
US20140003677A1 (en) * 2012-06-29 2014-01-02 Apple Inc. Fingerprint Sensing and Enrollment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7116805B2 (en) * 2003-01-07 2006-10-03 Avago Technologies ECBU IP (Singapore) Pte. Ltd. Fingerprint verification device
WO2005034021A1 (en) * 2003-10-01 2005-04-14 Authentec, Inc. Methods for finger biometric processing and associated finger biometric sensors
EP1779064A4 (en) * 2004-08-09 2009-11-04 Classifeye Ltd Non-contact optical means and method for 3d fingerprint recognition
US20100232659A1 (en) * 2009-03-12 2010-09-16 Harris Corporation Method for fingerprint template synthesis and fingerprint mosaicing using a point matching algorithm
KR101436884B1 (en) * 2012-06-20 2014-09-02 동국대학교 산학협력단 Method for recognizing user by using footprint energy image and Apparatus thereof
US8913801B2 (en) 2012-06-29 2014-12-16 Apple Inc. Enrollment using synthetic fingerprint image and fingerprint sensing systems
US9342725B2 (en) * 2012-06-29 2016-05-17 Apple Inc. Image manipulation utilizing edge detection and stitching for fingerprint recognition
DE102012108838A1 (en) * 2012-09-19 2014-05-28 Cross Match Technologies Gmbh Method and device for recording fingerprints based on fingerprint scanners in reliably high quality
EP3172696A1 (en) * 2014-07-25 2017-05-31 Qualcomm Incorporated Enrollment and authentication on a mobile device
SE1451598A1 (en) * 2014-12-19 2016-06-20 Fingerprint Cards Ab Improved guided fingerprint enrolment

Also Published As

Publication number Publication date
JP2017538224A (en) 2017-12-21
US20160180141A1 (en) 2016-06-23
CN105981043B (en) 2017-11-24
KR101872367B1 (en) 2018-06-28
CN105981043A (en) 2016-09-28
US9477872B2 (en) 2016-10-25
SE1451598A1 (en) 2016-06-20
KR20170097638A (en) 2017-08-28

Similar Documents

Publication Publication Date Title
US9477872B2 (en) Guided fingerprint enrolment
RU2714096C1 (en) Method, equipment and electronic device for detecting a face vitality
US9734379B2 (en) Guided fingerprint enrollment
CN107665485B (en) Electronic device and computer-readable recording medium for displaying graphic objects
KR102224721B1 (en) Systems and methods for authenticating a user based on a biometric model associated with the user
JP6011938B2 (en) Sensor-based mobile search, related methods and systems
CA2792336C (en) Intuitive computing methods and systems
US10917552B2 (en) Photographing method using external electronic device and electronic device supporting the same
US10678342B2 (en) Method of virtual user interface interaction based on gesture recognition and related device
CN111310705A (en) Image recognition method and device, computer equipment and storage medium
KR102355039B1 (en) Lock screen output controlling method and electronic device supporting the same
CN111414119A (en) Method, system and apparatus for biometric authentication system
US11061468B2 (en) Method and device for inputting password in virtual reality scene
KR20170097884A (en) Method for processing image and electronic device thereof
CN107784268B (en) Method and electronic device for measuring heart rate based on infrared sensor
US11216067B2 (en) Method for eye-tracking and terminal for executing the same
CN111327888A (en) Camera control method and device, computer equipment and storage medium
CN111160251A (en) Living body identification method and device
CN111405175B (en) Camera control method, device, computer equipment and storage medium
CN113343951A (en) Face recognition countermeasure sample generation method and related equipment
CN117521044A (en) Biometric authentication method, device, computer apparatus, and storage medium
KR20160055406A (en) Game Service system and Method for measuring the cognitive

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15870462

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20177014268

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2017530224

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15870462

Country of ref document: EP

Kind code of ref document: A1