GB2600401A - Methods, systems and computer program products, for use in biometric authentication - Google Patents


Info

Publication number
GB2600401A
GB2600401A (application GB2016818.3, also published as GB202016818D0)
Authority
GB
United Kingdom
Prior art keywords
biometric data
fist
images
body part
features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2016818.3A
Other versions
GB202016818D0 (en)
Inventor
Chiu Chan Tai
Chiu Hung Chow Alexander
Ohashi Monika
Visande Segovia Yves
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Polydigi Tech Ltd
Original Assignee
Polydigi Tech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Polydigi Tech Ltd
Priority to GB2016818.3A
Publication of GB202016818D0
Priority to PCT/EP2021/079205 (published as WO2022084444A1)
Publication of GB2600401A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G06V40/117 Biometrics derived from hands
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints

Abstract

The invention relates to a method of authenticating the identity of a person using biometric data (412, 422) derived from images of the bottom of a fist 202. A plurality of images of the fist are captured (406) while illuminating it using one or more light sources (404). New biometric data (412) derived from the captured images is compared (432) directly or indirectly with reference biometric data (422) obtained previously from one or more individuals. Based on the result of said comparing step, an authentication result is provided (436, 438). The biometric data are derived by recognition of features characteristic of the bottom of a fist, such as connected lines 232 corresponding to the folds of the little finger. Additional checks (440, 442) for liveness and presence of a real fist are included. The required images may be captured under illumination from the display screen of a portable personal device (200), and the fist may be presented in clenched or relaxed states.

Description

METHODS, SYSTEMS AND COMPUTER PROGRAM PRODUCTS, FOR USE IN BIOMETRIC AUTHENTICATION
FIELD OF THE INVENTION
[0001] The invention relates to methods and systems, and computer program products, all for obtaining and/or using biometric data for authenticating the identity of individual persons.
BACKGROUND
[0002] Many aspects of personal and/or business life nowadays depend on the digital verification of a person's identity. A service provider may provide social media functions or banking functions, with security protection based on digital keys such as passwords and/or hardware keys such as possession of a particular mobile phone for receipt of a one-time password (SMS OTP). Biometrics are physical or behavioural human characteristics that can be used to digitally identify a person to grant access to systems, premises, devices or data. Examples of these biometric identifiers are fingerprints, palm prints, facial patterns, voice or typing cadence. Biometrics are regarded as signifying what the person is, as opposed to what they have or what they know.
[0003] Faces and fingerprints and handprints are human body parts commonly adopted for the biometric authentication of important documents such as passports and ID cards.
Modern portable personal devices such as smartphones and tablets include cameras used for face recognition, and/or often dedicated fingerprint sensors, so that biometric authentication is familiar and widely used. A weakness of the known systems is that both face and fingerprints are typically recognised in a passive mode and are based on body parts that are commonly displayed in everyday life. This means that, even without the owner's knowledge, their biometric data can be easily captured and collected for use in fraud. For example, they can be obtained from other sources such as social media or high-resolution images. Known ways to make them more secure and reliable require a more sophisticated sensor to detect the depth map of the image, for example, or to look beneath the skin to vein patterns. Also, the fingerprint is very sensitive to ambient environmental changes, and facial features can be tampered with by facial reconstructive surgery or faked by 3D-printed masks which imitate a person's facial features.
[0004] Usually, the face and fingerprint information are stored only in the user's personal device, and the decision whether to verify identity is made by the device. This is done partly as the "solution" to privacy and leakage concerns, but it means that service provider applications cannot have any other information source for verification, and must depend on the device's detection accuracy and integrity. iPhone and Android devices do not share face or fingerprint data with applications or app service providers. Also, stored biometric information cannot be transferred cross-platform, because it depends on the specific sensor hardware and specific operating system. Although this seems to provide a solution, it does not address the basic issue that the smartphone then becomes a single point of failure and can be targeted for exploitation.
SUMMARY OF THE INVENTION
[0005] The invention in a first aspect provides a method of obtaining biometric data for a person, the method comprising: capturing a plurality of images of a body part of the person using one or more image acquisition devices; illuminating the body part during the capture of said images using one or more light sources; and processing the captured images of the body part to derive biometric data for use in identifying the user, wherein the body part is the bottom of a fist of the user, said biometric data being derived by recognition of features characteristic of the bottom of a fist.
[0006] The inventors have recognised that the bottom of the fist provides a new set of features for biometric authentication, that can be used to provide real-time and reliable identity verification. The typical features of the bottom of the fist are not easily affected by ambient environmental changes. Natural human behaviour tends to protect privacy of this part of the body.
[0007] The invention in a second aspect provides a method of authenticating the identity of a person automatically using biometric data, the method comprising: capturing a plurality of images of a body part of the person using one or more image acquisition devices; illuminating the body part during the capture of said images using one or more light sources; and comparing new biometric data derived from the captured images directly or indirectly with reference biometric data obtained previously from one or more individual persons; and based at least in part on the result of said comparing step, providing an authentication result, wherein the body part is the bottom of a fist of the user, and wherein at least one of said new biometric data and said reference biometric data has been derived by recognition of features characteristic of the bottom of a reference fist by a method according to the first aspect of the invention, as set out above.
[0008] In some embodiments, said biometric data includes first state biometric data derived from one or more images of the fist captured with the fist in a first state, and second state biometric data derived from one or more images of the fist captured with the fist in a second state. Active biometric measurement provides protection against spoofing using images or fake hands.
[0009] In some embodiments, the reference biometric data represents at least some of said characteristic features in three dimensions, the reference biometric data having been captured while the fist is in different orientations relative to an image capture device.
[0010] Some embodiments further comprise varying one or more characteristics of illumination of the body part by control of said light source(s) while capturing a further plurality of images, and the authentication result is based at least in part on observation of differences between images captured under different characteristics of illumination.
[0011] In some such embodiments, directional characteristics of said illumination are varied, and images captured with different directional characteristics of illumination are processed together such that the authentication result depends at least in part on shadow variations confirming the presence of depth features characteristic of a fist.
[0012] In some embodiments, a carrier network mechanism is used together with the biometric data. The carrier network and user device act as a hardware key, while the biometric authentication acts as a password. Without both the user's biometric characteristics and the mobile device (SIM card) a person will not pass the verification process.
[0013] Many optional features can be added, as mentioned above, as mentioned in the dependent claims appended hereto, and as illustrated in the following embodiments.
[0014] In particular, several embodiments described herein and defined in the dependent claims may be referred to as an active biometric bottom-of-fist authentication method having high convenience and high accuracy, and especially resistant to hacking or spoofing. The active biometric features of these embodiments require live movement of the body part to be performed by the user, guarding against fake or dead body parts being exploited to gain unauthorised access. Other features of the embodiments ensure that the three-dimensional body part is truly present, and not being simulated by two-dimensional images or videos.
[0015] The invention in a third aspect provides a system for obtaining biometric data for a person, comprising: one or more image acquisition devices arranged to capture a plurality of images of a body part of the person; one or more light sources for illuminating the body part during the capture of said images; and one or more processors configured to process the captured images of the body part to derive biometric data for use in identifying the user, wherein the body part is the bottom of a fist of the user, said biometric data being derived by recognition of features characteristic of the bottom of a fist.
[0016] The invention in a fourth aspect provides a system for authenticating the identity of a person automatically using biometric data, comprising: one or more image acquisition devices arranged to capture a plurality of images of a body part of the person; one or more light sources for illuminating the body part during the capture of said images; and one or more processors configured to compare new biometric data derived from the captured images directly or indirectly with reference biometric data obtained previously from one or more individual persons and, based at least in part on the result of said comparing step, to provide an authentication result, wherein the body part is the bottom of a fist of the user, and wherein at least one of said new biometric data and said reference biometric data has been derived by recognition of features characteristic of the bottom of a reference fist.
[0017] In some embodiments, said one or more processors are further configured to derive said new biometric data from the captured images by recognition of features characteristic of the bottom of a reference fist.
[0018] The invention further provides one or more computer program products, comprising instructions for configuring one or more processors to perform the processing steps of a method of the first aspect and/or the second aspect of the invention as set forth above.
[0019] The computer program product may further comprise instructions for configuring one or more processors to control the image acquisition device and the light source(s) so as to perform the capturing and illuminating steps of the method.
[0020] The invention further provides one or more computer program products, comprising instructions for configuring one or more processors to function as the one or more processors in a system according to the third aspect or the fourth aspect of the invention as set forth above.
[0021] Embodiments of the computer program products provide many optional features, as mentioned above, as mentioned in the dependent claims appended hereto, and as illustrated in the following embodiments.
[0022] These and other aspects and optional features of the invention will be understood from a consideration of the description of examples that follows.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] Embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 illustrates the capturing of images of a body part using a light source and an image acquisition device in an embodiment of the present invention;
Figure 2 illustrates a portable personal device including a light-emitting device and an image acquisition device usable in the capturing step of Figure 1;
Figure 3 illustrates the extraction of biometric data from images of a body part in an embodiment of the present invention;
Figure 4 illustrates schematically a biometric authentication process based on biometric data extracted by the method of Figure 3 in a first example method;
Figure 5 illustrates the capture of images of the body part (a) in a first state and (b) in a second state in certain embodiments of the methods of Figure 4;
Figure 6 illustrates the generation of illumination having different characteristics (a) to (d) using the portable personal device of Figure 2 in certain embodiments of the method of Figure 4;
Figure 7 shows partially processed images of the body part captured using the different illumination characteristics (a) to (d) of Figure 6;
Figure 8 illustrates a registration process including the creation of reference biometric data for use in the methods of Figure 4 and/or Figure 9;
Figure 9 illustrates schematically a biometric authentication process based on biometric data extracted by the method of Figure 3, in a second example method;
Figure 10 illustrates an authentication process incorporating the biometric authentication process of Figure 4 and/or Figure 9;
Figure 11 shows the hardware and functional structure of the portable personal device of Figure 2 in an example embodiment;
Figure 12 shows the hardware and functional structure of a biometric authentication server usable in embodiments of the methods of Figures 1 to 10; and
Figure 13 illustrates an example method for obtaining three-dimensional biometric data for use in some embodiments of the authentication processes described above.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
Bottom fist biometrics - introduction
[0024] Figure 1 shows schematically a biometric measurement device 100 which includes an image acquisition device 102 and a light source 104. Also seen is a body part 106 of a user who is presenting the bottom of their fist in a field of view 108 of image acquisition device 102. Light-emitting device 104 is directing illumination 110 in a range of directions towards the body part.
[0025] Figure 2 shows an example of biometric measurement device 100 implemented using built-in features of a portable personal device (PPD) 120, such as a smartphone or tablet using for example the familiar iOS or Android operating system. Image acquisition device 102 is provided by one of the built-in cameras typically provided in such a device, more specifically a front-facing camera of the smartphone. Light source 104 is provided in the form of the regular display screen of PPD 120. A large outer region 122 of the display is shown emitting white light towards the body part 106. A central region 124 of the display is provided to show a live camera image, to help position the hand in a consistent manner. It may be noted that the display screen of a smartphone may comprise many thousands of individual light-emitting elements (or, equivalently, light-transmitting elements with backlighting). For the purposes of the present disclosure, these may be considered individual elements with their own state of illumination, their own position, and their own colour. On the other hand, for simplicity the display screen as a whole may be considered as a single light-emitting device controllable to vary the spatial distribution of intensity and colour across its surface. Needless to say, the technical teaching is the same, however the device is described.
[0026] The inventors have recognised that choosing the bottom of the fist of the user as a body part for biometric authentication brings new opportunities and advantages over conventional biometric techniques. In particular:
* Unlike face and fingerprint patterns, by virtue of natural human behaviour, the bottom of the fist is rarely displayed in public, making it difficult for "bad actors" to capture and misuse the features.
* Unlike the face, the bottom of the fist is rarely covered with masks, glasses, make-up, hair, etc. The bottom of the fist is unlikely to be modified by cosmetic procedures in the normal course of events.
* Unlike fingerprint patterns, the bottom of the fist is less sensitive to ambient environmental changes.
* By exploiting the three-dimensional character of the bottom of the fist, an authentication method can guard against "spoofing" using photographs or video.
* By exploiting actions performed by the user with the fist, active biometric authentication can be implemented as a safeguard against spoofing using 3D models of the fist, without requiring the user to learn and reproduce elaborate facial expressions or hand gestures.
[0027] These and other advantages will become apparent from a consideration of the implementation examples below. As with other biometric techniques, key steps involve feature recognition and extraction, registration of reference biometric data, and the acquisition and comparison of new biometric data with reference data to confirm or deny the identity of an individual.
[0028] Figure 3 illustrates schematically a process of obtaining biometric data from a captured image of the bottom of a fist, in accordance with one example implementation. Different stages, including intermediate results and final biometric data are illustrated schematically at 202, 204, 206, 208, 210, and 212. The method to be described can be implemented using image processing techniques and data manipulation techniques in one or more processors, which may be processors in the biometric measurement device 100, and/or in one or more servers or specialised processes which receive raw images and/or intermediate results from the biometric measurement device. The method will be described in a simplified form, it being understood that a practical implementation is well within the capabilities of the skilled person or persons, using widely available processing hardware and software, based on the teaching herein.
[0029] Starting with the captured image 202, a pre-processing step 220 provides enhanced image data 204 in which boundaries, lines 214 and texture are emphasised and smooth regions such as the background of image 202 are suppressed. In the drawing, this is shown as an image comprising black lines and features on a white background, purely to aid understanding. It will be understood that these images are processed simply as digital data after this point. From the enhanced image data 204, a first recognition step 222 identifies first features characteristic of the bottom of the fist. In particular, step 222 identifies the series of folds of the little finger and palm which appear to radiate from roughly a "star point". Image 206 shows a large dot 230 identifying the star point. From this point, a series of connected lines 232 can be traced which are continuous from that star point, radiating from it with or without a degree of branching. The paths of these lines are encoded by a series of control points indicated by smaller dots 234. Control points may for example include end control points, junction control points and intermediate control points where the path curves.
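By way of illustration only (the patent does not prescribe any particular algorithm), the pre-processing step 220 could be realised with standard image processing tools. The following minimal Python sketch, assuming OpenCV, emphasises crease lines and texture while suppressing smooth regions; all parameter values are illustrative.

    import cv2
    import numpy as np

    def enhance_fist_image(image_bgr: np.ndarray) -> np.ndarray:
        """Return enhanced image data (204): creases dark, smooth areas light."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        # Normalise local contrast so illumination gradients do not dominate.
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        gray = clahe.apply(gray)
        # Creases are darker than the surrounding skin: adaptive thresholding
        # picks them out as black lines on a white background, as in image 204.
        enhanced = cv2.adaptiveThreshold(
            gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
            cv2.THRESH_BINARY, blockSize=21, C=8)
        # Remove isolated speckle noise while keeping connected line features.
        return cv2.medianBlur(enhanced, 3)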
[0030] In step 224, data defining the relative positions of the star point and the control points of the radiating lines may be saved separately from the captured image 202 or the enhanced image data 204. The set of control points and connections defining the number and shape of these lines may be regarded as a first example of primary biometric data for the fist in the image, as illustrated at 208. It will be understood that the set of control points and their connections can be stored in a very compact format, compared with either of the images as a whole.
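A sketch of one possible compact encoding for this primary biometric data follows; the patent describes only the concepts of a star point, connected lines and control points, so the structure and field names below are illustrative assumptions.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    Point = Tuple[float, float]  # (x, y) in normalised image coordinates

    @dataclass
    class TracedLine:
        control_points: List[Point]   # end, junction and curve points (234)
        branches_from: int = -1       # index of parent line; -1 = star point

    @dataclass
    class PrimaryBiometricData:
        star_point: Point                                  # star point (230)
        connected_lines: List[TracedLine] = field(default_factory=list)

A handful of coordinates per line replaces megapixels of image data, which is why the control-point representation is so much more compact than storing the images themselves.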
[0031] In a second recognition step 226, the enhanced image data 204 is used to identify a number of second features characteristic of the bottom of the fist. In particular, a number of individual lines are identified, that are disconnected from the first features and from one another. Image 210 shows a series of disconnected lines 240 which are identified by tracing along the second features. The paths of these lines are encoded by a series of control points indicated by the smallest dots 242. Control points may for example include end control points, junction control points and intermediate control points where the path curves. The level of detail recorded, in terms of the number of lines of varying degrees of fineness and/or length, is a matter of implementation, and can also be set in an adaptive manner, as described further below.
[0032] In a step 228, data defining the relative positions of the control points 242 of the second features may be saved separately from the captured image 202 or the enhanced image data 204. The set of control points and connections defining the shapes and relative positions of these lines may be regarded as a first example of secondary biometric data for the body part in the image, or they may be included as a further part of the primary biometric data. Positions of the second features may be encoded relative to the positions defined by the first features, or to some other system. The primary biometric data and/or secondary biometric data can be stored, forwarded, processed together as part of combined biometric data illustrated at 212.
[0033] Note that, in this example, the second features (lines 240) in a lower portion 244 of the image are highly variable, depending on the wrist posture of the person at the time of taking the image. In a simplified implementation, these features may be disregarded for the purposes of identification, although they can be exploited in other ways, for example in confirmation of liveness. In a further example, described below with reference to Figure 9, data representing the lines 240 that lie close to the connected lines 232 are treated as part of the primary biometric data, while lines 240 that fall in the lower portion 244 of the image are treated as secondary biometric features. Dividing the biometric data in this way provides different levels of data for efficient and secure search and verification.
[0034] In some embodiments, supplementary biometric data is also obtained by recognition of other characteristics or features of the body part, for use in combination with the primary biometric data and/or the secondary biometric data in identifying the person. In the illustrated example based on the bottom fist image, supplementary biometric data is obtained by recognising the general outline of the fist. This outline can be traced automatically, for example as a curved or polygonal outline shape in the enhanced image data 204. The outline shape 250 is shown overlaid on the image together with the primary biometric data illustrated at 206. In a step 252, data defining the positions of control points 254 of the outline shape 250 are added to the other types of biometric data obtained from the fist image 202. Data encoding the positions and/or directions of the control points 254 of this outline shape 250 relative to the star point and/or other points in the primary biometric data are stored as supplementary biometric data, as indicated by various arrows in the combined biometric data illustrated at 212. For example, vectors 256 shown in single dot-dash lines encode the distance and direction from the star point to each control point 254 of the outline shape.
[0035] Additionally, vectors 258 shown in double dot-dash lines encode the distance and direction from the star point to the ends of the connected lines 232. Using these vectors, not only can the position and orientation of the star point within the outline shape be recorded, but also the relative size (scale) of the pattern of connected lines 232 relative to the overall size of the fist. These characteristics can be used as additional criteria for identifying an individual person and guarding against fakes. It should be understood that vectors 258 may already be recorded as part of the primary biometric data, in which case the vectors 256 alone may be sufficient as supplementary biometric data. The supplementary biometric data can be considered as part of the primary biometric data, if preferred. The number of control points can be more or fewer than shown. The number of control points and vectors can be determined adaptively, as explained below.
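A minimal sketch of encoding such vectors, assuming numpy; the function name and the (distance, angle) representation are illustrative, not from the patent.

    import numpy as np

    def encode_vectors(star_point, outline_points):
        """Return (distance, angle) pairs from the star point 230 to each
        outline control point 254, usable as supplementary biometric data."""
        star = np.asarray(star_point, dtype=float)
        out = []
        for p in outline_points:
            d = np.asarray(p, dtype=float) - star
            out.append((float(np.hypot(d[0], d[1])),     # distance
                        float(np.arctan2(d[1], d[0]))))  # direction, radians
        return out

The ratio of the line-pattern extent (vectors 258) to the fist outline size (vectors 256) then gives the scale cue described above, which is difficult to fake.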
[0036] The types of biometric data obtained from the image 202 are not limited to examples of primary biometric data and secondary biometric data and supplementary biometric data illustrated and described above in this example. There may be fewer or more types of biometric data. The designation of primary biometric data, secondary biometric data and supplementary biometric data can of course be given to different sets of features, as desired. What is here referred to as supplementary biometric data may in some embodiments be treated as part of the primary biometric data, and/or part of the secondary biometric data.
[0037] From a single image 202, these various biometric features can be positioned only in two dimensions (for example defined by axes X and Y as shown). As explained further below, information of these features can be obtained and stored in three dimensions, when desired. Three-dimensional information can be obtained in particular by capturing a series of images of the body part in varying orientations and/or positions, and processing the data together. Three-dimensional information can also be obtained by recognition of supplementary features, based on knowledge of the general structure of a fist.
[0038] In the following examples, it will be described how biometric data such as the primary biometric data and secondary biometric data illustrated in Figure 3 can be used as the basis of a bottom fist biometric authentication system. It should be understood that the forms of primary biometric data and secondary biometric data described above are only examples. Alternative characteristics and/or alternative representations of the same characteristics may be envisaged, without departing from the principles of the present disclosure.
Biometric authentication using bottom fist image data (first example method)
[0039] Figure 4 illustrates schematically a first example of a method of authenticating the identity of a person automatically using biometric data extracted from bottom fist images according to the principles described above with reference to Figures 1 to 3 of the drawings. On the left-hand side in Figure 4, an image acquisition device, or camera 402 for short, is provided, which may conveniently be the camera of a smartphone as described already above. A light source 404 is provided, which may conveniently be the display screen of the smartphone as already described.
[0040] A processor 406 controls the light source and the camera to acquire images, including images of the bottom of a fist of a person for authentication. Processor 406 in this example performs steps similar to those described with reference to Figure 3 to extract biometric data from one or more of the images. The image data during recognition of characteristic features is represented in the drawing at 410, while the extracted biometric data is represented in the drawing at 412. As explained in relation to Figure 3, the biometric data in some embodiments comprises primary biometric data and secondary biometric data.
As in the example of Figure 3, the extracted biometric data in Figure 4 includes (i) primary biometric data representing observed paths of a number of connected lines corresponding to folds in the little finger and palm and (ii) secondary biometric data representing numerous disconnected lines. When there is only one captured image, or a series of images in which the orientation of the body part relative to the camera does not change, the biometric data is necessarily limited to a two-dimensional format. However, by capturing a series of images in which the body part changes orientation, the extracted biometric data can be generated in three-dimensional form. (This is similar to what will be described further below with reference to Figure 8, in relation to the registration process for a new user.)
[0041] On the right-hand side in Figure 4, a database 420 stores preregistered biometric data 422 for a number of individuals, for example for individuals A, B, C, D. Biometric data preregistered for person B is shown enlarged at 422B. In this embodiment, the preregistered biometric data includes primary biometric data and secondary biometric data similar to that described above, but stored in a three-dimensional representation, indicated by axes X, Y, Z. The acquisition of three-dimensional biometric data will be described further below, with reference to Figure 8. For the moment, it is sufficient to understand that the storage of the biometric data in three dimensions allows the system to adapt to variations in the orientation and position of the body part as it is presented to the camera 402 for authentication. Within the database, biometric data for each person is defined in a standard position and orientation relative to axes X, Y, Z.
[0042] A comparison process 430 is shown, which performs matching between, on the one hand, the new biometric data 412 extracted from a newly acquired image or images and, on the other hand, reference biometric data 422B corresponding to a candidate ID. The matching is performed firstly by applying geometric transformations to the three-dimensional representation of the reference biometric data 422B, stretching and rotating it until it best matches the two-dimensional representation of the new biometric data 412 extracted from the image captured by camera 402. Once the best fit has been obtained by geometric transformation, a matching score for the candidate ID (person B) is calculated, for example by summing (or averaging) the residual distances between each control point in the new biometric data and a corresponding control point in the candidate reference biometric data.
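A sketch of such a fit-then-score comparison follows, assuming scipy, known point correspondences, and a simple scaled-orthographic projection model; the patent leaves the transformation model and optimiser open, so these are assumptions.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def match_score(ref_points_3d: np.ndarray,   # (N, 3) reference data 422B
                    new_points_2d: np.ndarray    # (N, 2) new data 412
                    ) -> float:
        def residuals(params):
            rotvec, scale, t = params[:3], params[3], params[4:6]
            # Rotate the 3D reference and project orthographically to 2D.
            proj = Rotation.from_rotvec(rotvec).apply(ref_points_3d)[:, :2]
            return (scale * proj + t - new_points_2d).ravel()

        x0 = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.0])   # identity pose start
        fit = least_squares(residuals, x0)
        # Mean residual distance over all control points (lower = better).
        return float(np.mean(np.linalg.norm(fit.fun.reshape(-1, 2), axis=1)))

Authentication then reduces to taking the minimum of match_score over the candidates and testing it against the threshold of step 434.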
[0043] In the same way, the new biometric data can be compared against several or all of the individual reference biometric data examples in the database, to identify the closest match. Comparison with these other candidates A, C, D is illustrated in broken lines in the drawing. In one embodiment, data from the new images is compared with reference biometric data for all individuals in an authentication server database. In this way, it is confirmed that the individual presenting their fist to the camera is a particular one, and is only one, of all the individuals registered in the system. Of course, to compare a new image or set of images with a large number of candidates requires sufficient computing power and efficient comparison and search strategies. The types of biometric data and feature recognition disclosed herein work towards that goal.
[0044] Database 420 may be stored centrally in a secure authentication server operated by a trusted third-party authentication service provider. Alternatively, for applications with only local effect, database 420 may be stored locally on the same device as the processor 406 with camera and light source. Where the database is stored on a device used by only one user, it will be understood that the local database may comprise only one set of biometric data for that user. In further variations, part of the database 420 may be stored locally and part on a central server, or a number of servers. The implementer can choose whether to perform some or all of the extraction of biometric data using the local processor 406, sending only the biometric data in a compact format to the server, or whether to send the raw images, or partially processed images, to the server for extraction of biometric data.
[0045] In a case where the new biometric data is three-dimensional in character, the comparing step can be performed by first rotating the new biometric data into a standard position, which can then be compared much more quickly to all of the records 422 in the database 420. For higher levels of security, the matching with the reference biometric data can be performed in three-dimensional space, rather than being projected to a particular two-dimensional plane. Three-dimensional characteristics of the new biometric data may be used as an example of secondary biometric data in a modified method of the type described below with reference to Figure 9. An example method for obtaining three-dimensional biometric data is described further below with reference to Figure 13.
[0046] An evaluation process 432 evaluates the best matching scores achieved between the new biometric data and all the candidates A, B, C, D. The best matching of these (i.e. the one with the least distance, summed or averaged over all points) is selected and its matching score is subjected to a threshold test 434. Provided the matching is closer than a predetermined threshold, processing can proceed towards an eventual positive authentication result being provided in step 436. If the matching to the best candidate does not meet the threshold in step 434, a negative authentication result is provided in step 438.
Additional checks for anti-spoofing
[0047] Note that the positive authentication result in this embodiment depends not only on matching of the characteristics of the new fist image and the reference biometric data, but also on a number of other processes and tests, which are designed for example to exclude spoofing of different kinds. These additional processes are labelled 440 in the drawing, with the understanding that any number of additional processes 440 and tests 442 may be provided as requirements to achieve a positive authentication result in step 436. Moreover, although the flowchart in the lower part of Figure 4 indicates a series of yes/no decisions based on the success of tests, it will be understood that the results of a number of tests can be combined in different ways. For example, if there are three anti-spoofing checks, a positive overall result might require all three, or only a majority, of them to pass. Depending on the application, avoiding too many false negative results may be as important as avoiding false positives.
[0048] As a first example of an additional check, one of the steps 440 may obtain and process images of the body part presented to camera 402 in different states. This will be described further below, with reference to Figure 5. As a second example of an additional check, one of the steps 440 may obtain images of the body part with camera 402 while controlling light source 404 to illuminate the body part from different directions, as described below with reference to Figure 6. As a third example of an additional check, one of the steps 440 may obtain images of the body part with camera 402 while controlling light source 404 to illuminate the body part with different characteristics of illumination, for example by a sequence of colours.
[0049] As a fourth example of an additional check, one of the steps 440 may record presentation characteristics from an image or a sequence of images of the body part presented to camera 402 on each occasion authentication or registration is requested. It will be understood that the real, human user cannot always hold the hand perfectly stationary, and cannot always hold a hand in exactly the same relationship to the camera at every authentication. By recording presentation characteristics such as the position, orientation and movements of the body part as seen by the camera on one occasion, the system can compare presentation characteristics as seen by the camera on a new occasion, when authentication is requested. If the presentation characteristics of the new images match too closely presentation characteristics recorded on a previous occasion, it may be judged that the new presentation is a recording and not the real human user.
For this method, it may help if the comparison is performed on extracted biometric data of the type shown in Figure 3, rather than trying to compare raw image data or even preprocessed image data. A sequence of movements and positions can be recorded, based on only the primary biometric data, for example, which can reduce computational complexity and storage requirements.
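A rough sketch of such a replay check follows, assuming numpy and using the trajectory of the star point across frames as the presentation characteristic; the function name, tolerance and similarity measure are illustrative assumptions.

    import numpy as np

    def looks_like_replay(new_track: np.ndarray,   # (T, 2) star-point path
                          old_tracks: list,        # previously recorded paths
                          tol: float = 1e-3) -> bool:
        for old in old_tracks:
            if old.shape != new_track.shape:
                continue
            # Mean per-frame displacement between the two presentations.
            drift = np.mean(np.linalg.norm(new_track - old, axis=1))
            if drift < tol:   # humans never repeat a pose this exactly
                return True
        return False

A live user always differs slightly from any recorded presentation, so an implausibly exact match is treated as evidence of a recording.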
[0050] These and other additional checks are just examples, and they may be used individually or in combination. In addition, while these additional checks are shown being performed after a successful match of biometric data, some or all of these checks can be made before the comparison of biometric data, and/or in parallel therewith.
[0051] Comparison process 430 and all the processes shown in Figure 4 may be implemented by hardware and software elements separate from or shared with the processor 406.
Comparison process 430 and any or all of the other processes may be implemented locally in the same device as the camera and light source, or may be implemented wholly or partially in a remote server. In a particular embodiment, a central and trusted authentication server holds the database of biometric data for a large number of users, and performs the comparison and evaluation steps 430, 432, 434.
First example additional check (active biometrics)
[0052] To illustrate the first example of an additional check, Figure 5 shows the capture of images of the body part (a) in a first state and (b) in a second state in certain embodiments of the methods of Figure 4. On the left-hand side in Figure 5 (a) we see the captured image 202 and on the right-hand side we see the enhanced (pre-processed) image data 204 with lines and other texture such as indicated at 214, the same as described in relation to Figure 3. In this example, the fist is in a first state, namely closed but relatively relaxed. In Figure 5 (b) we see another captured image 202' and enhanced image data 204'. The fist in these images is tightly clenched, i.e. tensed. This can be seen by the deformation and additional texture in the captured image 202'. When pre-processed to highlight boundaries and texture, it can be seen in the enhanced image data 204' that the strength, length and number of features has increased, as shown at 214'.
[0053] Using both of these states in the biometric authentication makes the authentication process an active biometric process, rather than a passive biometric process. Supposing that this active biometric test is to be performed as one of the additional processing steps 440, the process 440 involves issuing one or more instructions to the user, either at the time or in advance, to change their fist from a relaxed to a tensed state and/or vice versa. Such instructions can be issued on screen and/or by verbal instructions through a loudspeaker, by other sounds and/or haptic (vibration) prompts.
[0054] In a simple embodiment, an estimate of the amount of texture in each image data 204 and 204' can be compared to confirm that the texture has increased. This can be done for example by counting the number of line features recognised in the new biometric data before and after tensing the fist and/or comparing the lengths of corresponding lines, or the average length of lines traced in the new biometric data. Alternatively, it may be preferred to obtain a different measure of the greater texture in image data 204' compared with enhanced image data 204. Recognising line features as such is not necessary to obtain a measure of relative texture content. Analysis of spatial frequencies and/or contrast levels in the enhanced image data 204' or even in the captured image 202' may be used to confirm a greater texture content than in the corresponding first state image data 204 or 202. In the corresponding test 442, a quantitative measure confirming that there is a greater texture content in the second state image than in the first state image can be applied against a threshold. If the difference is above a certain threshold, whether in absolute or relative terms, it is judged that a real human is present in the images, and flow continues towards a positive authentication result at step 436. If the difference does not exceed the threshold, flow passes to step 438, where a negative authentication result is given.
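A minimal sketch of such a texture test, assuming OpenCV: edge density serves as the texture proxy, and tensing the fist must raise it by a minimum relative margin. The 1.2 factor and the Canny thresholds are illustrative, not from the patent.

    import cv2
    import numpy as np

    def texture_measure(gray: np.ndarray) -> float:
        """Fraction of pixels lying on Canny edges: a cheap texture proxy."""
        edges = cv2.Canny(gray, 50, 150)
        return float(np.count_nonzero(edges)) / edges.size

    def fist_tensed(relaxed_gray: np.ndarray, tensed_gray: np.ndarray,
                    min_ratio: float = 1.2) -> bool:
        # Pass test 442 only if the second state shows clearly more texture.
        return (texture_measure(tensed_gray)
                >= min_ratio * texture_measure(relaxed_gray))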
[0055] In an alternative embodiment, process 440 may involve repeating all or part of the authentication process of steps 406 to 432 with the body part in the second state. This is illustrated schematically in Figure 4 in broken lines: second state image data during recognition of characteristic features in the second state is represented in the drawing at 410', while the extracted second state biometric data is represented at 412'. The database 420 in that case may include reference biometric data for every individual not only in the first state, but also in the second state. This second state reference biometric data for candidate B is represented at 422B'. This second state reference biometric data may be stored entirely independently of the first state reference biometric data, or it may be stored as a supplement to the first state reference biometric data, for example to save time and/or to save storage space. It will be appreciated that a range of alternative embodiments are available which may increase security, but at the expense of increased processing burden and database size. The second state reference data may be used as an example of secondary reference data in a modified method of the type described below with reference to Figure 9.
[0056] In an alternative embodiment, the difference between the first state and the second state may be greater than simply relaxing or tensing the fist. For example, the hand could be unfolded into a flat shape and then folded into the fist. However, by choosing relaxed and tense states of the body part while it remains generally in the same shape, framing of the image and comparison between first state and second state becomes easier, and concealment of sensitive features such as fingerprints is also maintained.
Second example additional check (varying direction of illumination)
[0057] To illustrate the second example of an additional process mentioned above, Figure 6 shows the generation of illumination having different characteristics (a) to (d) and Figure 7 shows partially processed images of the bottom of the fist captured using the different illumination characteristics (a) to (d) of Figure 6. As explained in relation to Figure 2 above, in the case where the light source is formed by a display screen of a portable personal device such as a smartphone, characteristics of illumination of the body part can be varied very easily by causing the display of different patterns of light and dark, colour and shade, on the display screen 104. As shown in Figure 6(a), a display pattern has a central portion 124 and an illuminated portion 122' which is strongly asymmetrical, giving a particular directional characteristic to the illumination of the body part.
[0058] Whereas the light-emitting portion of the display covered a large outer region 122 in Figure 2, a first display pattern shown in Figure 6(a) has an illuminated portion 122' confined to a small region at the left-hand side of the screen. Similarly, Figures 6 (b), (c) and (d) show display patterns with illuminated portion 122' confined to a small region at the right-hand side of the screen, at the top of the screen, and at the bottom of the screen, respectively. Consequently, as the screen is controlled to display these different patterns during the capturing of images by camera 402, the body part whose image is being captured receives illumination from a different direction. The speed of operation of the display screen and the video camera means that all of the images can be captured in a short time, for example a fraction of a second, minimising any movement of the body part. Capturing all the images quickly also reduces the possibility for a real hand to be faked using a recorded or synthesised video signal. The sequence of display patterns, and their relative durations, can be varied in an unpredictable way, further reducing the possibility of the system being fooled by recorded and/or synthesised images.
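A sketch of generating such directional display patterns, assuming numpy image buffers of shape (H, W, 3); in practice the patterns would be drawn by the device's UI toolkit, and the band width and resolution here are illustrative. The random ordering mirrors the unpredictable sequencing described above.

    import numpy as np
    import random

    def directional_pattern(h: int, w: int, side: str) -> np.ndarray:
        frame = np.zeros((h, w, 3), dtype=np.uint8)   # dark screen
        band = w // 5                                 # illuminated strip 122'
        if side == "left":
            frame[:, :band] = 255
        elif side == "right":
            frame[:, -band:] = 255
        elif side == "top":
            frame[:h // 5, :] = 255
        elif side == "bottom":
            frame[-h // 5:, :] = 255
        return frame

    # Unpredictable order of the four patterns of Figure 6 (a) to (d).
    sequence = random.sample(["left", "right", "top", "bottom"], 4)
    frames = [directional_pattern(1920, 1080, s) for s in sequence]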
[0059] Partially processed images 204" of the body part are shown in Figure 7 when illuminated by the different display patterns (a) to (d). Recall that the pre-processed (enhanced) image data appears darker where detailed features and texture are present in the captured image, not where the captured image is darker. Due to the directional component of the illumination, coming from the right of the image as seen from the camera in Figure 7(a), shadowing causes a loss of illumination and a loss of detail in the left-hand part of the image. Similarly, illumination coming from the left of the image as viewed from the camera causes enhanced detail (texture) to be captured at the left-hand side of the image, and reduced detail at the right-hand side in Figure 7(b). Similarly, the image captured at (c) has reduced detail in both upper and lower parts (due to a lack of shadowing in the upper part and a lack of illumination in the lower part). Finally, the image captured at (d) has enhanced texture in central parts of the image, but reduced detail elsewhere, due to shadowing.
[0060] By comparing the images captured by the camera under a variety of known illumination conditions, comparisons and contrasts can be made between the level of detail found in each part of the image. In some embodiments, the change in detail between different parts of the image can be evaluated by first tracing characteristic features such as the "star point" and connected lines, and/or the disconnected lines in the texture of the body part. In other embodiments, the level of detail can be evaluated directly from the captured images and/or the pre-processed images such as are seen in Figure 7. Analysis of brightness, contrast and/or spatial frequency content can be used, for example. In a test 442 according to the second example additional check, a quantitative measure is applied, confirming that the regions of greater and/or lesser textural content change with the variation in the direction of illumination. The analysis may be specific as to the expected differences in different regions, bearing in mind the known illumination directions, the roughly known 3D form of the body part and consequently the expected shadowing effects. If the differences between the four images are present above a certain threshold, it is judged that a real human is present in the images, and control passes towards a positive authentication result at step 436. Such a threshold can be based on a combination of image regions, and may be expressed in absolute and/or relative terms. If the pattern of differences does not exceed the threshold, flow passes to step 438, where a negative authentication result is given. This may occur, for example, when a two-dimensional photograph of the body part is presented to the camera 402, or when a video is presented which does not have the correct sequence and timing of variation in the direction of illumination.
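A rough sketch of such a region-by-region test, again assuming OpenCV and the edge-density proxy used earlier; a real implementation would use finer regions and a model of the expected shadowing, so the halves, margin and thresholds here are illustrative.

    import cv2
    import numpy as np

    def texture_measure(gray):
        """Edge density as a cheap proxy for local detail (same as above)."""
        edges = cv2.Canny(gray, 50, 150)
        return float(np.count_nonzero(edges)) / edges.size

    def illumination_check(img_lit_left, img_lit_right, margin=1.1):
        """Grayscale captures with the light falling on the left/right side
        of the image as seen by the camera."""
        mid = img_lit_left.shape[1] // 2
        ll = texture_measure(img_lit_left[:, :mid])
        lr = texture_measure(img_lit_left[:, mid:])
        rl = texture_measure(img_lit_right[:, :mid])
        rr = texture_measure(img_lit_right[:, mid:])
        # A real 3D fist shows more detail on the illuminated side in each
        # capture; a flat photograph barely changes between the two.
        return ll >= margin * lr and rr >= margin * rl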
[0061] Of course, the specific display patterns of Figure 6 are only examples, and different patterns, and patterns fewer or greater in number may be used. Variations of colour may be used instead of or in addition to variations in the illumination direction. In other embodiments, the light source does not need to be a display screen, but could be an array of dedicated light sources, individually controlled. The types of processing disclosed herein are less dependent on the particular hardware, compared with the familiar fingerprint sensors mentioned in the introduction.
[0062] As a fourth example of an additional process 440 and test 442, the manner of presentation of the body part, for example its orientation and/or sequence of movements, may be compared with characteristics stored from previous times that the body part has been presented for authentication. A real human could never replicate exactly the same presentation, and therefore a match with historic presentation data may indicate that the biometric data is being faked with a recording of a previous presentation. The previous presentation may, for example, include the original presentation made at the time of registration. In this way, the system may guard against misuse of images "stolen" at the time of registration.
[0063] As mentioned above, a large variety of techniques can be considered for use as one of the additional processes 440 and tests 442. The ones explained in more detail above are merely examples that have been found practical.
Registration process example
[0064] Figure 8 illustrates a registration method 800, including the generation of three-dimensional reference biometric data of the type used as reference biometric data 422 in the method of Figure 4. The concept of bottom fist biometrics will by now be clear, and will not be explained again in detail. Each individual person wishing their identity to be verified by bottom fist biometrics goes through a registration or enrolment process, in which the biometric data of their individual body part is captured for use later in an authentication method such as that described above.
[0065] Starting at the top left, the user uses an image acquisition device 802 and a light source 804 to capture images of the desired body part, in this case the bottom of their fist. The user can use either their right fist or their left fist. A sequence of capture steps 812 to 820 is performed to capture images of the fist in different orientations. Although shown as a series of capture steps, it will be appreciated that a series of images may be captured as a single video file, with individual images extracted later. The different orientations may be performed in a predetermined sequence, while video is recorded, or while individual photographs are taken. Alternatively, without imposing any sequence on the user, the user can move their fist freely in front of the camera until a sufficient variety of poses has been captured. The system can issue on-screen and/or audible guidance as required.
Alternatively, the body part can be held stationary while the camera (e.g. a smartphone) is moved in the user's other hand.
[0066] Example captured images are shown at 822 to 830 (or rather enhanced image data is shown after pre-processing similar to that described above to highlight the texture and boundaries). From each of these images, two-dimensional biometric data 832 to 840 can be obtained, by a method similar to that described above with reference to Figure 3. Knowing that the images relate to different orientations of the same body part, points in the first set of biometric data 832 can be mapped onto points in the second set 834, and so on, so as to track these points as the body part moves. Tracking the points over a sequence of images may reduce the computational burden of subsequent steps. At least approximate measures of orientation can be derived in a variety of simplified processes, for example by observing the position of the "star point" 230 relative to supplementary features such as the outline shape 250. An approximate measure of orientation can be derived by observing the lines 232 connected to the "star point" 230 within the obtained biometric data. An approximate measure of orientation can also be obtained by observing the proportions of the outline shape 250 as they vary between images. Any or all of these measures can be combined with more detailed data to obtain a refined 3D representation of the biometric data from the captured images.
[0067] In step 850 the positions of the recognised features over the series of images are combined into a 3D representation of the biometric data, including primary biometric data, secondary biometric data and any supplementary biometric data. At 852, a biometric identity record for the user is created and stored in database 420. For this purpose, additional information is obtained from an authentication service provider 854 and/or a service provider 856 such as a financial institution with whom the user has a relationship.
Additional information may also be obtained from the user's portable personal device 200, including the subscriber identification module (SIM) and/or from a telecom service provider 858.
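Purely as an illustration of how step 850 might combine the tracked 2D observations into 3D positions, the following sketch assumes numpy, known (estimated) fist orientations R_i per frame, and a simple orthographic model, obs_i ≈ (R_i @ X)[:2]; the patent does not specify the reconstruction method (see Figure 13).

    import numpy as np

    def triangulate_point(rotations, observations):
        """rotations: list of (3, 3) arrays; observations: list of (x, y).
        Returns the least-squares 3D position X of one tracked feature."""
        rows, rhs = [], []
        for R, (x, y) in zip(rotations, observations):
            rows.append(R[0]); rhs.append(x)   # first row of R gives x
            rows.append(R[1]); rhs.append(y)   # second row of R gives y
        X, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
        return X

Applying this to every tracked control point yields the three-dimensional representation stored in the biometric identity record.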
[0068] As mentioned already, some embodiments disclosed herein use biometric information of the body part in two or more different states (e.g. relaxed fist and tensed fist).
Depending how the second state biometric data is to be used, second state reference biometric data may be generated as part of the registration process 800. This is illustrated schematically in broken lines at 850'. The second state reference data may be abbreviated reference data, relative to the detail in the first state reference data. Alternatively, as mentioned above, the entire registration process may be replicated with the body part in the second state, so that the second state reference biometric data has the same form and detail as the first state reference biometric data. This is illustrated schematically in broken lines by steps 812' to 820'.
[0069] An important performance criterion for any biometric system is that the biometric data and the authentication processes in which it is used must enable reliable discrimination between all the registered individuals. In practice, a first person's biometric data may be very different to that of any other person in the database, or it may be rather similar to another person in the database. Accordingly, the preparation of biometric data for use in registering a new user includes a check 860 to confirm whether the data for the new user is sufficiently distinct to identify that user uniquely among all the registered users. If the biometric data of the new user is close to that of one or more existing users, enhanced biometric data may be stored. For example, an enhanced level of detail may be obtained and recorded in the primary and/or secondary biometric data (e.g. more lines 240 and/or more control points (242) per line.). An enhanced level of detail may be recorded in the supplementary data (e.g. more control points 254 in the outline shape 250 and/or more vectors 256 linking the supplementary reference data with the primary biometric data).
Conversely, if the biometric data of the new user is very different to that of previously registered users, less detailed biometric data can be stored. Of course, another approach would be to store a maximum level of detail for every user, but this would greatly increase the processing and storage burden in all operations of the system.
Therefore, the example systems described herein take an adaptive approach to the level of detail and/or the types of biometric data included in each individual record. Also, since the maximum level of detail may depend on the image capture hardware, illumination and so forth, an objective standard is preferred.
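A minimal sketch of how the distinctiveness check 860 and the adaptive level of detail might interact is given below in Python. The template representation (a dict keyed by detail level), the `distance` callable and the margin value are all assumptions made for illustration.

```python
def choose_detail_level(new_template, registered_templates, distance,
                        levels=(0, 1, 2), distinct_margin=0.25):
    """Return the lowest detail level (0 = basic, 1 = enhanced,
    2 = maximum) at which the new user's data is clearly separated
    from every registered user's data, or None if even maximum
    detail remains ambiguous.

    Templates are assumed to map each detail level to biometric data
    at that level; `distance` is an assumed callable returning a
    dissimilarity score in [0, 1]; `distinct_margin` is an arbitrary
    example threshold.
    """
    for level in levels:
        nearest = min(
            (distance(new_template[level], ref[level])
             for ref in registered_templates),
            default=1.0,
        )
        if nearest >= distinct_margin:
            return level
    return None  # cf. paragraph [0070]: ask for the other hand
```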
[0070] In a case where, even with enhanced detail, it proves difficult to discriminate the new user from one or more existing users, an alternative solution would be to ask the new user to use their other hand. Most users of course have a choice of left and right hand, and the bottom fist print is unique to every hand.
[0071] A major benefit of the biometric methods disclosed herein, compared with, for example, familiar fingerprint sensor technology, is that the biometric data can be made device-independent, as mentioned already. In an optional step 862, the biometric data for one or more individuals can be transferred to a new portable personal device 200' and/or to a further database 420'. It goes without saying that any of the databases and services described herein can be implemented in a distributed fashion, rather than being tied to specific and unique hardware. Additional security against theft of authentication credentials can be provided by storing different parts of each user's reference biometric data on different servers. It will be understood that, in a high-volume application such as banking, the infrastructure should be provided for many thousands or even millions of authentication requests per day.
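Purely as an illustration of the split-storage idea mentioned above, the sketch below separates the parts of a reference record across two stores so that neither holds a complete template. The in-memory store class and all names are assumptions; a real deployment would use remote, replicated database servers.

```python
class KeyValueStore:
    """Stand-in for one remote database server (sketch only)."""
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data.get(key)

def store_split_record(user_id, primary, secondary, supplementary,
                       primary_store, secondary_store):
    """Store different parts of the user's reference biometric data
    on different servers: compromising either store alone yields
    only part of the authentication credential."""
    primary_store.put(user_id, {"primary": primary,
                                "supplementary": supplementary})
    secondary_store.put(user_id, {"secondary": secondary})
```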
Biometric authentication using bottom fist image data (second example method)
[0072] Figure 9 illustrates schematically a second example method 900 of authenticating the identity of a person automatically, using biometric data extracted from bottom fist images according to the principles described above with reference to Figures 1 to 3 and Figure 4 of the drawings. Reference signs in Figure 9 correspond closely to those in the method 400 of Figure 4, but with prefix '9' in place of '4'. For brevity, features which are the same as in the method 400 of Figure 4 will not be repeated here. This description will focus on the differences.
[0073] The method 900 illustrates the possibility of applying processing differently to the primary biometric data (represented at 912-1) and the secondary biometric data (represented at 912-2). One motivation for this is to speed up the search for matching individuals, by processing only the primary biometric data until a potentially matching candidate or shortlist of candidates is found in the database, and then using the secondary biometric data to confirm or reject the candidate.
[0074] To this end, instead of the single comparing step 430 in the method 400, a first comparing step 930-1 in the method 900 compares only the primary biometric data with primary reference biometric data 922-1 of the candidates from a database 920-1. Optionally some supplementary biometric data such as the outline shape 250 may be used in conjunction with the primary biometric data. An evaluating step 932-1 then selects the one or more candidates which appear to have a good chance of matching the new biometric data, based on the primary reference data and supplementary data. As in the example of Figure 4, we assume that the user B is the best match, and that this is confirmed using the first reference biometric data 922-1B. Then a second comparing step 930-2 is performed using the full set of biometric data 922-2B from a second database 920-2. This full set of biometric data includes not only the primary reference biometric data and supplementary reference biometric data, but also the secondary reference biometric data and optionally some further supplementary reference biometric data for user B. Alternatively, the second database might contain only the secondary reference biometric data, and the second comparing step compares only the secondary biometric data obtained from the captured images, so that the complete set of biometric data is never present or processed in one place.
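The two-stage search of steps 930-1 and 930-2 might be organised as in the following Python sketch. The `match_score` callable, the databases (plain dicts keyed by user ID) and both thresholds are assumptions for illustration, not part of the disclosure.

```python
def authenticate_two_stage(new_primary, new_secondary,
                           primary_db, secondary_db, match_score,
                           shortlist_threshold=0.8,
                           confirm_threshold=0.9):
    """First comparing step 930-1: shortlist candidates using only
    primary reference data. Second comparing step 930-2: confirm a
    candidate against the secondary reference data held separately.

    Returns the matched user ID, or None for a negative result.
    """
    # Step 930-1 / 932-1: search and evaluate on primary data only.
    shortlist = [uid for uid, ref in primary_db.items()
                 if match_score(new_primary, ref) >= shortlist_threshold]
    # Step 930-2: confirm using secondary data for the shortlist only.
    for uid in shortlist:
        if match_score(new_secondary,
                       secondary_db[uid]) >= confirm_threshold:
            return uid   # towards positive result, step 936
    return None          # negative result, step 938
```

Because the secondary database is consulted only for shortlisted candidates, the bulk of the search touches only a fraction of the stored data.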
[0075] Based on the example of biometric data captured in Figure 3, the second comparing step 930-2 may use full reference biometric data including the disconnected lines from the lower portion 244 of the fist image. The second comparing step 930-2 can seek to match features additional to the ones matched in step 930-1, or more detailed representations of the same features. As mentioned above, the second comparing step 930-2 can use more detailed supplementary data, such as a more detailed representation of outline shape 250, and/or more numerous connecting vectors between the supplementary biometric data and the primary biometric data. As another example, the second comparing step may use three-dimensional new biometric data where the first comparing step uses two-dimensional new biometric data. The second comparing step may evaluate matching between new biometric data and biometric reference data obtained from the body part in a second state (e.g. tensed fist), where the first comparing step uses only first state biometric data (e.g. relaxed fist).
[0076] In this way, a great deal of processing effort may be saved, by conducting a search first using only a subset of the available biometric data. As in the method 400 of Figure 4, the method 900 includes additional checks 940 and tests 942, for example to confirm the captured images do come from a live human fist. All the variations described above for the examples of Figures 4 to 8 can be applied equally in the embodiment of Figure 9. As before, the end result is either a positive authentication result given at step 936, or a negative authentication result given at step 938.
[0077] The first reference biometric data 922-1 is stored in a database 920-1 which may be the same as or separate from a database 920-2 storing the second reference biometric data 922-2. By separating parts of the data from other parts, additional security can be provided in a system so that a single attack or point of weakness does not expose the full set of data. In some examples, only the first comparing step 930-1 may be performed when low-level authentication is required, while both the first comparing step and the second comparing step 930-2 are performed for higher-value and/or more sensitive transactions. For example, the first comparing step 930-1 may be performed using reference data stored on a mobile device as part of an authentication process for unlocking that device, while the second comparing step is required for executing banking transactions or the like.
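The choice between single-stage and two-stage comparison might be driven by a simple policy, as sketched below; the function name, the 50-unit limit and the sensitivity flag are arbitrary examples, not part of the disclosure.

```python
def required_comparisons(transaction_value, sensitive=False,
                         value_limit=50.0):
    """Low-risk operations use only the first comparing step 930-1;
    higher-value and/or sensitive operations additionally require
    the second comparing step 930-2."""
    if sensitive or transaction_value > value_limit:
        return ("930-1", "930-2")
    return ("930-1",)
```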
Application example
[0078] Figure 10 illustrates the disclosed biometric authentication techniques as part of a larger secure authentication infrastructure involving several service providers in remote cooperation with the user and their portable personal device 200. It will be appreciated that the procedures are presented in summary form only; the detailed implementation would be within the capabilities of the skilled person to design. At 1002 a request for authentication of an individual is generated, for example by a merchant 856 such as a content provider or financial institution. This request is issued to the server of an authentication service provider 854, identifying the user by, for example, a full or partial telephone number or email address. The authentication service provider 854 interrogates an authentication application running on the portable personal device at step 1004. Two levels of protection are provided in this simple example: one based on the identity of the portable personal device and subscriber identification module (SIM), and the other based on bottom fist biometric authentication.
[0079] At step 1004, the presence of the SIM is confirmed, for example by the method described in US patent US10230721B2 (Chan), the contents of which are incorporated herein by reference. The interaction between authentication service provider 854 and merchant 856 may include exchanging partial or full telephone number details, for example, to identify the user and their smartphone. At step 1006, authentication service provider 854 provides either a positive or negative result. A positive result allows processing to proceed towards a final positive authentication outcome at step 1036. A negative result leads to rejection of the authentication request at step 1038.
[0080] Assuming the SIM is present and correct, at step 1008 the biometric authentication method is applied, for example by the method 400 or the method 900, described above. Biometric data captured by the portable personal device 200 is sent to the authentication service provider 854 in order to confirm that the individual operating the portable personal device is the authorised individual. At 1010, authentication service provider 854 provides either a positive or a negative result to the biometric authentication. A positive result allows processing to proceed towards a final positive authentication outcome at step 1036. A negative result leads to rejection of the authentication request at step 1038. Only if the user is in possession of both the authorised SIM and the matching fist will the verification process be successfully completed, and the merchant or financial institution 856 be informed.
[0081] Of course the different authentication steps 1004 and 1008 can be performed in a different order, and/or in parallel. Additional authentication steps based on biometrics, secret knowledge and/or hardware keys can be included, if desired. The authentication service provider for the SIM identity check need not be the same as the authentication service provider for the biometric authentication.
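A sequential rendering of the Figure 10 flow is sketched below in Python; as paragraph [0081] notes, the two checks could equally run in the other order or in parallel. The method names on `device` and `auth_service` are assumptions made for illustration only.

```python
def authenticate_request(device, auth_service):
    """Both the SIM check and the bottom fist biometric check must
    pass before a positive outcome is reported to the merchant."""
    # Steps 1004/1006: confirm the expected SIM is present.
    if not auth_service.confirm_sim(device):
        return "rejected"                           # step 1038
    # Steps 1008/1010: bottom fist biometric authentication.
    images = device.capture_fist_images()
    if not auth_service.match_biometrics(images):
        return "rejected"                           # step 1038
    return "authenticated"                          # step 1036
```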
Hardware implementation examples
[0082] Figure 11 shows in schematic form one implementation of a portable personal device (PPD) 1100, suitable for use as the portable personal device 200 in the system of Figures 1 to 10. Internal components of PPD 1100 include a processor 1120 and various interfaces and other hardware typically found in smartphone devices. Not shown but implicitly present are other hardware elements such as a battery, microphone, loudspeaker, vibrator and real-time clock. Specifically shown are: the user input/output interfaces UIO for communicating with the user via buttons and a touchscreen; a display driver and screen for use as the light source 104 when capturing biometric information; camera module CAM for use as the image acquisition device 102 when capturing biometric information; GPS interface for satellite location services; interfaces DAT, TEL and SMS for mobile data, telephony and short messages (texts), respectively; WLAN interface WFI; Bluetooth interface BLT for short-range wireless communication; and near-field communications interface NFC (also known as RFID). Also provided is the holder for a SIM card, that is the subscriber identification module used in authentication methods such as the ones described above with reference to Figure 10.
[0083] Functions of the portable personal device for the purposes of the present disclosure are defined by program instructions of the user authentication application (UAAPP) 1150.
These instructions and associated data structures are stored in the storage STO associated with processor 1120, and they configure the processor to implement the functions of the PPD described and claimed herein. Instructions may be delivered on a hardware data carrier (for example a USB memory device, not shown separately), and/or delivered over one of the interfaces DAT, WFI etc. listed above. Program instructions define several modules, according to the desired functionality of the portable personal device. Example modules illustrated in Figure 11 are a registration module REG, for controlling the device in registering biometric data by method 800 or similar, and an authentication module, for controlling the device in subsequent authentication operations, for example by the method 400 or 900 described above. Implicitly present but not shown are software modules of the operating system, including basic user interface functions common to computing devices. The operating system may be iOS or Android, for example, in which case the app could be downloadable from the appropriate app store (Google or Apple). Alternatively, implementation may be in the form of a Responsive Web application, and it may be provided in the form of an SDK, API or library for integration into a merchant or banking app, for example. Also implicitly present are other applications APP2, APP3 as commonly found on a smartphone or similar device.
[0084] Data structures (UDAT) created and/or used by the PPD user authentication application 1150 are indicated at 1152. For supporting the biometric authentication, data fields may include a user ID UID relating to the authentication service provider 854, a user phone number UPHNO and user preferences. In practice, the user ID can be the phone number or email address (a universal and open ID), so that the two fields may be the same. Optionally, primary biometric data PRIBD and/or secondary biometric data SECBD and/or supplementary biometric data SUPBD may be stored, for use as reference biometric data in case local comparison between newly acquired images and reference biometric data is required. Also optionally stored is presentation history data PRESHIST for use in the third example of the additional checks (440/442) discussed above. The presentation history data may record one or more previous presentations by the user, for comparison with a newly acquired series of images to guard against recorded images being used to attack the system. The presentation history data may record characteristics of the originally registered reference biometric data, to guard against theft of that data.
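One illustrative use of the presentation history PRESHIST is a replay check of the following form; the `similarity` callable and the threshold are assumed for the sketch, the point being that a genuine live presentation is never near-identical to an earlier one.

```python
def looks_like_replay(new_presentation, presentation_history,
                      similarity, replay_threshold=0.999):
    """Flag a new presentation that is (near-)identical to one
    recorded on a previous occasion, suggesting that captured
    images are being replayed rather than a live fist presented."""
    return any(similarity(new_presentation, old) >= replay_threshold
               for old in presentation_history)
```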
[0085] It will be appreciated that these fields are only examples, and different data may be stored and/or different formats used, in a practical implementation.
[0086] Figure 12 shows hardware and functional elements of a server computer 1200 for implementing the functionality of the authentication service provider 854 in the methods described above. Although a single server computer is shown, it will be understood that a network of servers and/or cloud-based computing services may be involved in the practical implementation of the functions described. Within server 1200, a processor 1220 operates with storage STO, network interfaces NIF and user input/output functions UIO for communication with operators.
[0087] Functions of the server computer 1200 for the purposes of the present disclosure are defined by program instructions of a server application (SVRAPP) 1250. These instructions and associated data structures are stored in the storage STO associated with processor 1220, and they configure the processor to implement the functions of the server described and claimed herein. Instructions may be delivered on a hardware data carrier (for example a USB memory device, not shown separately), and/or delivered over one of the interfaces NIF. Program instructions define several modules, according to the desired functionality of the server computer. Example modules illustrated in Figure 12 are: a system administration module SYSADMIN; a user registration module for conducting the registration of new users and/or new biometric data via methods such as method 800 described above; a registered-user management module REGMGR for managing data and records for the many users; an authentication module AUTH for implementing the server functions of the biometric authentication methods such as methods 400, 900, 1000 described above; and a transaction logging module LOG. An alarm module ALRM is provided for reporting suspicious activity detected, for example, on the basis of failed authentication. Implicitly present, but not shown, are software modules of the operating system, including basic user interface functions common to server computers, as well as other applications, if desired.
[0088] Data structures (UDAT) created and/or used by the server application 1250 are indicated at 1252. For each user, data fields may include a user ID UID identifying the user uniquely to the authentication service provider, a user phone number UPHNO and user preferences UPREF. Additionally, primary biometric data PRIBD and/or secondary biometric data SECBD and/or supplementary biometric data SUPBD may be stored, for use as reference biometric data in authenticating the user based on newly acquired images received from the portable personal device or other image acquisition device. Also optionally stored is presentation history data PRESHIST for use in one of the additional checks discussed above. The presentation history data may record one or more previous presentations by the user, for comparison with a newly acquired series of images to guard against recorded images being used to attack the system. Also provided for each user may be records of one or more institutions identified by institution ID INSID and account information UDAT, previously registered with the authentication service provider.
[0089] Transaction logs LOG recording details of the authentication activity of each user are also provided. The presentation history data PRESHIST may be stored as part of the transaction log instead of the user record. The transaction log and/or the presentation history data may be stored in a separate server from the other user data.
[0090] For each institution and other partners, there may also be provided data INSDAT concerning the accounts and the relations between the institutions and the authentication service provider.
Three-dimensional biometric data
[0091] Figure 13 illustrates a method of quickly transforming sets of two-dimensional image data into an approximate three-dimensional model of a body part such as a fist. The method can be used in generating three-dimensional reference biometric data, as an example of the method of Figure 8. The method of Figure 13 can alternatively or in addition be used in generating three-dimensional new biometric data, for use in variations of the methods of Figure 4 and/or Figure 9. At Figure 13 (a) we see a first captured image of the fist, seen in a "neutral" or "base" orientation. At (b) we see first enhanced image data, derived from the first captured image in a manner similar to enhanced image data 822 in Figure 8. Based on the recognition of the star point 1330 and connected lines in the manner illustrated in Figure 3, a pair of crossed lines 1332 are identified, which measure the maximum extent of the connected lines (232) in two dimensions. At (c) we see a first outline shape 1350 that is obtained from the first enhanced image data in the same manner as outline shape 250 described above, with the crossed lines and star point superimposed. Similarly, at Figure 13 (d) we see a second captured image of the fist, seen in a rotated position, and at (e) we see the corresponding second enhanced image data, similar to enhanced image data 824 in Figure 8. Again the star point 1330' and connected lines are recognised, and the pair of crossed lines 1332' are identified, which measure the maximum extent of the connected lines (232) in two dimensions. A corresponding second outline shape 1350' is obtained at (f).
[0092] In a practical embodiment, more than two positions and more than two extruded shapes would be generated. The coordinates of the crossed lines 1332, 1332' etc. are used, either alone or in combination with other features such as outline shapes 1350, 1350' and/or star points 1330, 1330', to determine the changes in orientation that relate each two-dimensional image to the real three-dimensional fist. From among a number of views, the one corresponding to the base orientation can be identified by finding the one with the longest pair of crossed lines in two dimensions. That is to say, the length of one or both of the crossed lines 1332' will be less than the length of the crossed lines 1332 in the image corresponding to the base orientation.
[0093] Using the relative orientations determined for the individual images, the images can be combined by a simple algorithm into an approximate three-dimensional model of the fist. As illustrated schematically at Figure 13 (g), each outline shape 1350, 1350' can be converted into a shape 1352, 1352' in three dimensions, by extrusion in a direction 1360, 1360' perpendicular to the plane of the outline shape. Then, these shapes can be oriented relative to one another based on the orientation derived from analysis of the crossed lines 1332, 1332', and superimposed as shown in Figure 13(g). By superimposing the two or more extruded shapes 1352, 1352' and taking the Boolean intersection of the first and second etc. extruded shapes, an approximate three-dimensional form of the fist can be generated. First features and/or second features identified in the various images can be mapped to this three-dimensional shape.
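A voxel-based sketch of this construction is given below in Python; the grid resolution, the use of scipy's `rotate`, and the assumption that each view's outline is available as a binary mask with a known rotation angle about the vertical axis are implementation choices made purely for illustration.

```python
import numpy as np
from scipy.ndimage import rotate

def base_view_index(crossed_line_lengths):
    """Pick the view whose crossed lines (1332) are longest in two
    dimensions; per paragraph [0092] this identifies the view in the
    base orientation. Input: a list of (length_1, length_2) pairs."""
    return int(np.argmax([l1 + l2 for l1, l2 in crossed_line_lengths]))

def fist_volume_from_outlines(outline_masks, angles_deg, size=64):
    """Approximate the 3D form of the fist as the Boolean
    intersection of extruded outline shapes (cf. 1352, 1352').

    outline_masks: list of (size, size) boolean arrays, each a
    filled outline shape (1350, 1350', ...) for one view.
    angles_deg: each view's rotation about the vertical image axis
    relative to the base orientation, e.g. derived from the crossed
    lines 1332, 1332'.
    Returns a (size, size, size) boolean occupancy grid.
    """
    volume = np.ones((size, size, size), dtype=bool)
    for mask, angle in zip(outline_masks, angles_deg):
        # Extrude the 2D outline perpendicular to its image plane.
        extruded = np.repeat(mask[:, :, np.newaxis], size, axis=2)
        # Rotate this view's extrusion into the base orientation;
        # nearest-neighbour interpolation keeps the mask binary.
        aligned = rotate(extruded.astype(np.uint8), angle,
                         axes=(1, 2), reshape=False, order=0) > 0
        # The intersection keeps only space consistent with every view.
        volume &= aligned
    return volume
```

With two views at 0 and 30 degrees, for example, the call would be `fist_volume_from_outlines([mask_a, mask_b], [0.0, 30.0])`, and recognised first and second features can then be mapped onto the resulting occupancy grid.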
[0094] As mentioned above, three-dimensional new biometric data can be compared with three-dimensional reference biometric data directly, without projection to two dimensions.
Alternatively, or in addition, three-dimensional new biometric data can be rotated to a "standard" orientation, allowing rapid matching against the candidates whose reference biometric data is stored in that standard orientation. The standard orientation may for example be the same as the base or neutral orientation mentioned above, or it could be different.
Conclusion
[0095] By the above disclosure there is enabled the provision of authentication methods and systems that avoid drawbacks of known biometric authentication methods. The methods in the disclosed embodiments use biometric data extracted from images of the bottom of the human fist as an example of a body part which is unique to each person, yet not readily "stolen" in the same way as fingerprints or facial features. Such features may be referred to as "deep" or "hidden" biometric features. The embodiments disclosed provide in effect an active biometric fist print authentication having high convenience and high accuracy.
[0096] The above description illustrates only some examples by which the principles disclosed and claimed herein may be implemented. These examples should not be regarded as limiting on the scope of the claims. Various modifications and variations are possible, including but not limited to the ones mentioned above. Accordingly, the scope of protection is defined only by the appended claims, with due account being taken of equivalents.

Claims (36)

1. A method of obtaining biometric data for a person, the method comprising: capturing a plurality of images of a body part of the person using one or more image acquisition devices; illuminating the body part during the capture of said images using one or more light sources; and processing the captured images of the body part to derive biometric data for use in identifying the user, wherein the body part is the bottom of a fist of the user, said biometric data being derived by recognition of features characteristic of the bottom of a fist.
2. A method as claimed in claim 1 wherein said biometric data comprises primary biometric data relating to at least first features of the body part and secondary biometric data relating to further features of the body part.
3. A method as claimed in claim 2 wherein said first features of the body part comprise a connected set of lines radiating from a central region and corresponding to folds of the little finger and palm.
4. A method as claimed in claim 2 or 3 wherein said primary biometric data further relates to second features of the body part, said second features comprising a plurality of disconnected lines corresponding to creases in the bottom of the fist in the vicinity of the first features.
5. A method as claimed in claim 2, 3 or 4 wherein said further features of the body part comprise a (further) plurality of disconnected lines corresponding to creases in the bottom of the fist, being optionally lines located away from the first features.
6. A method as claimed in any preceding claim, wherein said biometric data includes first state biometric data derived from one or more images of the fist captured with the fist in a first state and second state biometric data derived from one or more images of the fist captured with the fist in a second state.
7. A method as claimed in claim 6 wherein said first state of the fist is a relaxed state and said second state of the fist is a tensed state.
8. A method as claimed in any preceding claim wherein said display screen is the display screen of a portable personal device, and wherein the capture of said images is performed using one or more cameras included within the portable personal device.
9. A method as claimed in any preceding claim further comprising storing said biometric data as reference biometric data in a database for use in authenticating the identity of the person at a later date.
10. A method as claimed in claim 9 wherein said biometric data comprises primary biometric data relating to at least first features of the body part and secondary biometric data relating to further features of the body part, and wherein said primary biometric data and said secondary biometric data are stored as primary reference biometric data and secondary reference biometric data in separate databases.
11. A method as claimed in claim 9 or 10 wherein at least a copy of said reference biometric data is stored in a database at a server remote from a portable personal device that is used to capture the images of the body part.
12. A method as claimed in any preceding claim further comprising storing presentation history data representing characteristics of the presentation of the fist in the captured images.
13. A method as claimed in any of claims 9 to 12 wherein said reference biometric data for a person is registered by an authenticating service provider by reference to identifying information stored within the portable personal device.
14. A method of authenticating the identity of a person automatically using biometric data, the method comprising: capturing a plurality of images of a body part of the person using one or more image acquisition devices; illuminating the body part during the capture of said images using one or more light sources; and comparing new biometric data derived from the captured images directly or indirectly with reference biometric data obtained previously from one or more individual persons; and based at least in part on the result of said comparing step, providing an authentication result, wherein the body part is the bottom of a fist of the user, and wherein at least one of said new biometric data and said reference biometric data has been derived by recognition of features characteristic of the bottom of a reference fist by a method as claimed in any of claims 1 to 13.
15. A method as claimed in claim 14 wherein the comparing step includes deriving new biometric data by recognition of features characteristic of the bottom of a fist in one or more of the new captured images, by a method as claimed in any of claims 1 to 13, and comparing the new biometric data with the reference biometric data.
16. A method as claimed in claim 15 wherein the comparing step includes performing a geometric transformation of the reference biometric data and/or of data derived from the new captured images.
17. A method as claimed in claim 15 or 16 wherein the new biometric data represents said characteristic features in two dimensions, while the reference biometric data represents at least some of said characteristic features in three dimensions, the reference biometric data having been captured while the fist is in different orientations relative to the image capture device.
18. A method as claimed in claim 17 wherein the comparing step includes performing a transformation of the reference data in three dimensions to match features in data derived from one or more of the new captured images.
19. A method as claimed in claim 16 wherein the new biometric data represents at least some of said characteristic features in three dimensions, the reference biometric data having been captured while the fist is in different orientations relative to the image capture device and wherein the comparing step includes performing a transformation of the reference data in three dimensions to match a common orientation of the reference biometric data in the database.
20. A method as claimed in any of claims 14 to 19 wherein the reference biometric data comprises primary reference biometric data relating to at least first features of the reference fist and secondary biometric reference data relating to further features of the reference fist.
21. A method as claimed in claim 20 wherein said first features comprise a connected set of lines radiating from a central region and corresponding to folds of the little finger and palm.
22. A method as claimed in claim 20 or 21 wherein said primary reference biometric data further relates to second features of the reference fist, said second features comprising a plurality of disconnected lines corresponding to creases in the bottom of the fist in the vicinity of the first features.
23. A method as claimed in any of claims 20 to 22 wherein the comparing step comprises a first comparing step in which data derived from said new captured images is compared with said primary reference biometric data and a second comparing step in which data derived from said new captured images is compared with said secondary reference biometric data.
24. A method as claimed in claim 23 wherein said second comparing step is performed only in the event of a successful result of the first comparing step.
25. A method as claimed in claim 23 or 24 wherein said first comparing step is performed in respect of primary reference biometric data for a number of different reference fists, and said second comparing step is performed in respect of one of said reference fists in response to a successful result of the primary comparing step.
26. A method as claimed in any of claims 14 to 25, wherein said capturing and illuminating steps are performed to capture first state images with the fist in a first state and second state images with the fist in a second state, and wherein the authentication result is based at least in part on observation of differences between the first state images and the second state images.
27. A method as claimed in claim 26 wherein said reference biometric data includes first state reference biometric data corresponding to the reference fist in the first state and second state reference biometric data corresponding to the reference fist in a second state, and the comparing step is performed both between said first state images and said first state reference biometric data and between said second state images and said second state reference biometric data.
28. A method as claimed in claim 26 or 27 wherein said first state of the fist is a relaxed state and said second state of the fist is a tensed state.
29. A method as claimed in any of claims 14 to 28, further comprising varying one or more characteristics of illumination of the body part by control of said light source(s) while capturing a further plurality of images, and wherein the authentication result is based at least in part on observation of differences between images captured under different characteristics of illumination.
30. A method as claimed in claim 29 wherein directional characteristics of said illumination are varied, and wherein images captured with different directional characteristics of illumination are processed together such that the authentication result depends at least in part on shadow variations confirming the presence of depth features characteristic of a fist.
31. A method as claimed in claim 29 or 30 wherein colour and/or intensity characteristics of said illumination are varied in a known sequence, and wherein images captured with different colour and/or intensity characteristics are processed together such that the authentication result depends at least in part on confirming the presence of said sequence in light reflected from the body part.
32. A method as claimed in any of claims 14 to 31 wherein characteristics of the presentation of the fist in the captured images are compared with presentation characteristics recorded on one or more previous occasions, and wherein the authentication result depends at least in part on confirming that the presentation of the fist is not identical to a previous presentation.
33. A method as claimed in claim 32 wherein said light sources are part of a display screen, and wherein the characteristics of illumination are varied by causing display of images with different spatial distribution and/or colour and/or intensity.
34. A system for obtaining biometric data for a person, comprising: one or more image acquisition devices arranged to capture a plurality of images of a body part of the person; one or more light sources for illuminating the body part during the capture of said images; and one or more processors configured to process the captured images of the body part to derive biometric data for use in identifying the user, wherein the body part is the bottom of a fist of the user, said biometric data being derived by recognition of features characteristic of the bottom of a fist.
35. A system for authenticating the identity of a person automatically using biometric data, comprising: one or more image acquisition devices arranged to capture a plurality of images of a body part of the person; one or more light sources for illuminating the body part during the capture of said images; and one or more processors configured to compare new biometric data derived from the captured images directly or indirectly with reference biometric data obtained previously from one or more individual persons and, based at least in part on the result of said comparing step, to provide an authentication result, wherein the body part is the bottom of a fist of the user, and wherein at least one of said new biometric data and said reference biometric data has been derived by recognition of features characteristic of the bottom of a reference fist.
36. A system as claimed in claim 35 wherein said one or more processors are further configured to derive said new biometric data from the captured images by recognition of features characteristic of the bottom of a reference fist.
GB2016818.3A 2020-10-23 2020-10-23 Methods, systems and computer program products, for use in biometric authentication Pending GB2600401A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2016818.3A GB2600401A (en) 2020-10-23 2020-10-23 Methods, systems and computer program products, for use in biometric authentication
PCT/EP2021/079205 WO2022084444A1 (en) 2020-10-23 2021-10-21 Methods, systems and computer program products, for use in biometric authentication

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2016818.3A GB2600401A (en) 2020-10-23 2020-10-23 Methods, systems and computer program products, for use in biometric authentication

Publications (2)

Publication Number Publication Date
GB202016818D0 GB202016818D0 (en) 2020-12-09
GB2600401A true GB2600401A (en) 2022-05-04

Family

ID=73727069

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2016818.3A Pending GB2600401A (en) 2020-10-23 2020-10-23 Methods, systems and computer program products, for use in biometric authentication

Country Status (2)

Country Link
GB (1) GB2600401A (en)
WO (1) WO2022084444A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5862246A (en) * 1994-06-20 1999-01-19 Personal Information & Entry Access Control, Incorporated Knuckle profile identity verification system
WO2004052173A2 (en) * 2002-12-06 2004-06-24 Cross Match Technologies, Inc. System and a non-planar prism that are used to obtain print and other hand characteristic information
EP2688050A1 (en) * 2012-07-18 2014-01-22 Gemalto SA Method for authenticating a user of a contactless chip card
CN107977600A (en) * 2017-09-11 2018-05-01 江苏国光信息产业股份有限公司 A kind of signature false distinguishing method based on side palm feature

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3579495A4 (en) 2017-02-01 2020-06-03 Chan, Tai Chiu Authentication server, authentication system, and authentication method


Also Published As

Publication number Publication date
WO2022084444A1 (en) 2022-04-28
GB202016818D0 (en) 2020-12-09
