GB2598016A - Biometric authentication apparatus and biometric authentication method - Google Patents
- Publication number
- GB2598016A · GB2103638.9A · GB202103638A
- Authority
- GB
- United Kingdom
- Prior art keywords
- finger
- image
- authentication
- shape
- section
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1347—Preprocessing; Feature extraction
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Security & Cryptography (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Computational Biology (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Collating Specific Patterns (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Input (AREA)
Abstract
High-precision authentication is realized even when the posture of the living body part fluctuates while its image is captured. The system includes an image capturing section that captures an image of the living body part; a computation section that computes plural positions specifying the shape of a prescribed portion of the living body part in the captured image; a transformation section that magnifies or reduces the shape of the prescribed portion based on the ratio between the interval defined by one and another of the plural positions at registration and the interval defined by the corresponding computed positions; and an authentication section that performs biometric authentication using an image of the prescribed portion whose shape has been magnified or reduced.
Description
BIOMETRIC AUTHENTICATION APPARATUS AND BIOMETRIC AUTHENTICATION METHOD
BACKGROUND OF THE INVENTION
[0001] The present invention relates to a biometric authentication apparatus that authenticates an individual, and to a biometric authentication method using biometric information.
[0002] Among various biometric authentication techniques, finger vein authentication is known to realize high-precision authentication. Because it uses the blood vessel patterns found within the fingers, finger vein authentication offers not only excellent authentication precision but also high security, being less prone to forgery and alteration than fingerprint authentication.
[0003] In recent years, mobile terminals such as smartphones and tablet computers have spread across the worldwide market, and many of them carry a general-purpose color camera as standard equipment. A finger authentication technique that uses such a camera to photograph the fingers requires no special hardware, so it is expected to gain wide acceptance from now on. Further, by displaying on the terminal screen how the pictures of the living body parts are being taken, for instance, it becomes easy to photograph plural fingers at the same time, permitting high-precision multimodal finger authentication with a lower false non-match rate.
[0004] When finger authentication is introduced on each individual's mobile terminal, the users themselves must understand how to photograph their fingers correctly. However, when the required manipulation is, for instance, to hold the fingers motionless in the air, countless finger positions and postures are possible, and users sometimes fail to be authenticated because they cannot hold up their fingers appropriately. The system therefore needs to guide the users so that they photograph their fingers in the right positions and postures, and to feed back whether such pictures are being taken adequately. In addition, for slight displacements that are hard to address by guidance, it is also important that the system itself rectify unfavorable posture fluctuations. That is to say, a finger posture guidance technique, a finger posture adequacy determination technique and a finger posture rectification technique are essential for the system. In particular, the finger posture rectification technique reduces the manipulation burden on the users while keeping the authentication precision consistently high.
[0005] As regards the prior art on detecting or rectifying the finger posture, a technique for detecting the length and thickness of the finger is disclosed in Japanese Unexamined Patent Application Publication No. 2018-128785; a technique for detecting the position of the fingertip and the size of the hand is disclosed in Japanese Unexamined Patent Application Publication No. 2009-301094; and a technique for detecting the size and shape of the hand and providing guidance accordingly is disclosed in Japanese Unexamined Patent Application Publication No. 2013-205931.
SUMMARY OF THE INVENTION
[0007] In order to enhance the reliability of the finger information input during finger authentication using plural fingers, and to perform authentication processing with high usability, three techniques are essential: one to guide the presented finger to a position or posture appropriate for authentication, one to determine whether the presented condition is appropriate for authentication, and one to rectify a condition inappropriate for authentication. Especially when the fingers are photographed with the front camera of, e.g., a tablet or notebook computer, the fingers end up held in the air without contact; the three-dimensional angle between the center axis of the finger and the optical axis of the camera is then prone to change, and it is hard to guide the users so that this angle becomes completely identical to the one registered beforehand. It is therefore important to provide an authentication technique able to reduce the influence of finger posture change by rectifying such posture displacement within the system. Regarding this posture displacement rectification, the conventional method has been to generate images and patterns imitating several combinations of posture displacement and to adopt the result closest to the registered data. However, this approach has stalled: the processing time increases because plural patterns must be generated; the similarity between strangers also rises because, over plural collations, the most similar result is adopted; and the improvement is poor when the imitated images or patterns do not coincide with the actual degree of displacement, to name some examples. Besides the above, the disclosures of the prior art documents relevant to the present technical problem are as follows.
[0008] Japanese Unexamined Patent Application Publication No. 2018-128785 concerns slide-type palm vein authentication using a tablet computer. To conduct guidance on the size of a user's hand taking person-to-person differences into consideration, it discloses a technique of providing sensors and a touch panel, finding the length and size of the user's hand by detecting the contact positions at plural places on the touch panel, and scaling the displayed guide according to such contact positions. However, this prior invention makes no mention of a technique to rectify the finger features according to the size of the user's hand.
[0009] Japanese Unexamined Patent Application Publication No. 2009-301094 concerns an input device that generates input signals by discerning contact at plural places on the input surface, in order to facilitate input operation on a manipulation panel. It discloses a technique of providing a means to detect finger touch and a means to generate input signals by detecting the transition from the contact to the non-contact condition of the finger, and displaying the manipulation button at a position corresponding to the fingertip's location and the finger length. This reference discloses standardizing the hand size in order to improve the precision of determining the right hand or the left hand, but this is merely a uniform magnification or reduction of the detected finger size, and there is no mention of a technique to rectify specific three-dimensional angles.
[0010] Japanese Unexamined Patent Application Publication No. 2013-205931 concerns guidance on the posture of, e.g., the hand or palm held up without contact. It discloses a technique of providing an optimal posture estimation unit that transforms sample hand images based on the position information of specific portions which are obtained from the finger and palm images and hard to be influenced by the posture change of the hand, and estimating an optimal hand posture for the user through this transformation of the hand sample images. This reference is characterized by preliminarily preparing hand sample images showing the ideal way of holding up the hand, and by determining whether the hand held up by the user is close to the ideal condition by transforming the sample images in accordance with the size of the photographed hand. However, there is no technical disclosure in this prior invention on rectifying the magnification rate of the user's finger features.
[0011] The abovementioned problems relate not only to the fingers but also to such various living body parts as the palm, the back of the hand and the face. Thus, according to the prior art, biometric authentication using various living body parts, principally the plural fingers, has faced the technical problem that fluctuations in how the living body parts are held up cannot be rectified, which ends up deteriorating the authentication precision.
[0012] The present invention provides a biometric authentication apparatus and a biometric authentication method able to realize high-precision authentication even when the posture fluctuates as the images of the living body parts are captured.
[0013] The biometric authentication apparatus according to the present invention includes an image capturing section that captures an image of a living body part; a computation section that computes plural positions specifying the shape of a prescribed portion of the living body part in the captured image; a transformation section that magnifies or reduces the shape of the prescribed portion based on the ratio between the interval defined by one and another of the plural positions at registration and the interval defined by the corresponding computed positions; and an authentication section that performs biometric authentication using an image of the prescribed portion whose shape has been magnified or reduced.
[0014] According to the present invention, high-precision authentication is realized even when the posture of the living body part fluctuates as its image is captured.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] Fig. 1A is a view illustrating the arrangement of the biometric authentication system as a whole according to a first embodiment;
Fig. 1B is a view illustrating one example of the functional units of the program stored in the memory according to the first embodiment;
Fig. 2 is a view illustrating one example of the registration processing flow of the biometric authentication system according to the first embodiment;
Fig. 3 is a view illustrating one example of the authentication processing flow of the biometric authentication system according to the first embodiment;
Figs. 4A and 4B depict explanatory views illustrating the presence or absence of the pitching angle of the finger according to the first embodiment;
Figs. 5A to 5C depict explanatory views illustrating the change of appearance of the finger shape on the image according to the first embodiment;
Fig. 6 illustrates one mode of the standardization processing of the finger shape according to the first embodiment;
Figs. 7A to 7D depict exemplary views upon the shapes of the finger being standardized according to the first embodiment;
Figs. 8A and 8B illustrate one mode of standardizing the palm shape according to a second embodiment;
Fig. 9 illustrates one guidance example to suppress the finger pitching according to a third embodiment;
Fig. 10 illustrates one guidance displaying example by the multimodal authentication in which the face and the fingers are held up according to a fourth embodiment;
Figs. 11A and 11B illustrate explanatory views representing the relationship between the finger contour guide, the face and the visual direction according to the fourth embodiment;
Figs. 12A and 12B illustrate a screen arrangement example displaying the finger guide in an easy-to-view manner according to the fourth embodiment; and
Fig. 13 is a view illustrating one example of the standardization processing flow of the palm shape depicted in Figs. 8A and 8B.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0016] Hereafter, the embodiments for carrying out the present invention are explained with reference to the accompanying drawings. The following description and the accompanying drawings are intended as an exemplary explanation of the present invention, and omissions and simplifications are made where appropriate to clarify the explanation. The present invention can be carried out in various manners other than those disclosed herein. Each characteristic feature may be present singly or plurally unless otherwise specified.
[0017] To facilitate understanding of the present invention, the position, size, shape, range and so forth of the respective characteristic features illustrated in the drawings in some cases do not represent their actual position, size, shape, range and so forth. Thus, the present invention is not necessarily limited to the positions, sizes, shapes, ranges and so forth disclosed in the accompanying drawings.
[0018] In the following explanation, each piece of information is in some cases explained with expressions such as 'table' and 'list', but such information may be expressed with data structures other than these. To indicate independence from any particular data structure, expressions such as 'XX table' and 'XX list' are in some cases referred to as 'XX information'. Upon explaining identification data, where expressions such as 'identification information', 'identifier', 'name', 'ID' and 'No.' are used, they are replaceable with one another.
[0019] Where there are plural characteristic features having identical or similar functions, explanation is in some cases given with a different subscript or superscript added to the same reference sign. However, where it is unnecessary to discriminate such plural characteristic features from one another, the subscript or superscript is in some cases omitted.
[0020] Further, in the following explanation, steps taken by executing a program are in some cases described. The program is executed by a processor (e.g., a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit)), thereby taking the prescribed steps using a storage resource (e.g., memory) and/or interface devices (e.g., communication ports) where appropriate, so that it can be said that the steps taken by executing the program are mainly processed by the processor. Likewise, the steps taken by executing the program may well be mainly processed by a controller, a device, a system, a computer or a node, each of which is provided with a processor. The steps taken by executing the program can be mainly processed by any arithmetic unit, which may well include a dedicated circuit for taking specific steps (e.g., an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit)).
[0021] The program may well be installed in such a device as a computer from the program source. The program source may well be e.g., a program distribution server or a storage medium which is readable by the computer. In the case of the program source being the program distribution server, the program distribution server includes a processor and a storage resource to store the program subject to distribution, in which the processor of the program distribution server may well distribute the program subject to distribution to another computer. Further, in the following explanation, two or more programs may well be realized as one program or one program may well be realized as two or more programs.
[0022] To note, in the present specification, 'biometric features' denotes features anatomically different from one another, such as finger veins, fingerprints, joint patterns, skin patterns, finger contour shapes, body fat patterns, the ratio in length between the respective fingers, finger widths, surface areas of fingers, melanin patterns, palm veins, palmprints, veins on the back of the hand, face veins, ear veins, the face, the ears, and the irises.
(First Embodiment)
[0023] A first embodiment is one mode for carrying out a biometric authentication system using the biometric features. That is to say, it includes an image capturing section to capture an image of a living body part; an authentication processing section, which is a processing device to process the image captured by the image capturing section and authenticate the living body part; and a display section, which is a display device to display, principally, the processing result of the authentication processing section and various other information on the present system. The authentication processing section is provided with a posture determination unit to acquire posture information of the living body part shown in the image and determine the posture of the living body part; a posture rectification unit (computation section, transformation section) to rectify the posture of the living body part; a features extraction unit to extract the biometric features; and a collation unit (authentication section) to calculate the similarity of the biometric features and determine whether or not authentication is successful. The display section displays the acquired posture information, the ideal posture information and guidance to the users.
[0024] Fig. 1A is a view illustrating one arrangement example of the biometric authentication system 1000 as a whole using the biometric features of the finger according to the present embodiment. To note, it is needless to say that the configuration according to the present embodiment need not be arranged as a system, but may well be arranged as an apparatus in which all or some of the components are carried in one housing. The apparatus may well be a personal authentication apparatus performing the authentication processing itself, or a finger image acquisition apparatus dedicated to the acquisition of the finger image and/or a finger features image extraction apparatus, with the authentication processing carried out outside the apparatus. In addition, the present configuration may well be regarded as a finger image acquisition apparatus using a general-purpose color camera carried on smartphones or tablet computers. Moreover, it may well be in the form of a terminal. The configuration provided with at least the image capturing section to capture the image of the living body part and the authentication processing section to process the captured image and authenticate the living body part is referred to as the biometric authentication apparatus herein.
[0025] The biometric authentication system 1000 according to the present embodiment illustrated in Fig. 1A includes an input device 2, which is the image capturing section; the authentication processing section 10; a storage device 14; a display section 15; an input section 16; a speaker 17; and an image input section 18. The input device 2 includes an image capturing device 9 installed in a housing and may well include a light source 3 installed in the housing. The authentication processing section 10 is provided with an image processing function.
[0026] The light source 3 is, for example, a light emitting element such as an LED (Light Emitting Diode) and irradiates light onto a finger 1 presented in front of the input device 2. The light source 3 may well be of a kind able to irradiate light of various wavelengths according to certain embodiments, or of a kind able to irradiate light transmitted through the living body part; the system may well also be configured without the light source 3. The image capturing device 9 captures the image of the finger 1 presented before the input device 2. To note, the images of other living body parts such as the face, the iris, the back of the hand and the palm may well be captured at the same time. The image capturing device 9 may well be a color camera, an infrared camera or a multispectral camera able to take pictures under visible, ultraviolet and infrared light at the same time. Further, it may well be a distance measuring camera able to measure the distance to an object whose image is captured, or a stereo camera in which plural identical cameras are combined. The input device 2 may well include a plurality of such image capturing devices. Furthermore, the finger 1 may well be read as plural fingers, and the plural fingers of both hands may well be covered at the same time. The image input section 18 acquires the image captured by the image capturing device 9 of the input device 2 and inputs the acquired image to the authentication processing section 10. Various kinds of reading devices can be adopted for the image input section 18.
[0027] The authentication processing section 10 is composed of a computer including a central processing unit (CPU) 11, a memory 12 and various Interfaces (IFs) 13. The CPU 11 performs authentication processing by executing a program stored in the memory 12. Fig. 1B is a view illustrating one example of the functional units of the program stored in the memory 12 in order to realize the functions of the authentication processing section 10. As illustrated in Fig. 1B, the authentication processing section 10 is composed of such various processing blocks as the image processing unit 20 to remove noise from the input image and detect the living body part; the posture determination unit 21 to determine whether or not the posture of the living body part is appropriate for authentication; the posture guidance unit 22 to guide a user such that the living body part takes a posture appropriate for authentication; the posture rectification unit 23 to rectify the posture fluctuation of the living body part; the features extraction unit 24 to extract the biometric features upon registration processing and authentication processing; and the collation unit 25 to compare the similarity of the biometric features. Such various processing steps are explained in detail below. The memory 12 stores the program executed by the CPU 11. Further, the memory 12 temporarily stores e.g., the images input from the image input section 18.
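As an illustrative composition only (all class and method names below are assumptions made for explanation; the patent defines functional units, not an API), the units of Fig. 1B might be wired together as follows:

```python
# Illustrative sketch: a minimal composition of the functional units named in
# Fig. 1B. Every name here is an assumption; the patent prescribes no API.
import numpy as np

class AuthenticationProcessingSection:
    def __init__(self, image_processing, posture_determination,
                 posture_guidance, posture_rectification,
                 features_extraction, collation):
        self.image_processing = image_processing            # noise removal, finger detection
        self.posture_determination = posture_determination  # is the posture usable?
        self.posture_guidance = posture_guidance            # on-screen user guidance
        self.posture_rectification = posture_rectification  # fix posture fluctuation
        self.features_extraction = features_extraction      # veins, prints, joints
        self.collation = collation                          # similarity / accept-reject

    def authenticate(self, frame: np.ndarray, registered_data) -> bool:
        finger_region = self.image_processing.detect_finger(frame)
        posture = self.posture_determination.estimate(finger_region)
        if not self.posture_determination.is_acceptable(posture):
            self.posture_guidance.show_hint(posture)   # loop back for a retry
            return False
        roi = self.posture_rectification.standardize(finger_region, posture)
        features = self.features_extraction.extract(roi)
        return self.collation.matches(features, registered_data)
```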
[0028] The interfaces 13 interconnect the authentication processing section 10 with external devices. Specifically speaking, the interfaces 13 are equipment provided with ports and the like to connect with, e.g., the input device 2, the storage device 14, the display section 15, the input section 16, the speaker 17 and the image input section 18.
[0029] The storage device 14 is composed of, e.g., an HDD (Hard Disk Drive) or an SSD (Solid State Drive) and stores, e.g., the registered data of a user. The registered data are pieces of information obtained upon registration processing, against which a user is collated; they include, for example, image or feature data such as finger vein patterns. The images of the finger vein patterns are mainly those of the finger veins, the blood vessels distributed under the skin of the finger on the palm side, captured as dark shaded or slightly blue-hued patterns. The feature data of the finger vein patterns are the vein-portion images converted into binary or 8-bit images, or features generated from the coordinates of feature points such as the bends, branches and end points of the veins, or from the luminance data around such feature points.
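A minimal sketch of how the registered data described above might be represented, with either a binarized pattern image or feature points plus surrounding luminance patches (field names and sizes are assumptions, not the patent's data format):

```python
from dataclasses import dataclass, field
from typing import List, Optional
import numpy as np

@dataclass
class VeinFeaturePoint:
    x: int                            # image coordinates of the feature point
    y: int
    kind: str                         # 'branch', 'bend' or 'endpoint'
    patch: Optional[np.ndarray] = None  # luminance data around the point, e.g. 16x16

@dataclass
class RegisteredFingerData:
    pattern: np.ndarray               # binary or 8-bit vein pattern image
    points: List[VeinFeaturePoint] = field(default_factory=list)
```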
[0030] The display section 15 is an output device, for instance a liquid crystal display, to display the information received from the authentication processing section 10, the posture guidance information of the living body part and the posture determination result. The input section 16 is, for example, a keyboard or touch panel and transmits information input by a user to the authentication processing section 10. To note, the display section 15 may well be provided with an input function means such as a touch panel. The speaker 17 is an output device to transmit information received from the authentication processing section 10 as acoustic signals such as voices.
[0031] Figs. 2 and 3 respectively are views illustrating one example of the schematic flow of the registration processing and the authentication processing according to the biometric authentication technique using the blood vessels of the finger which is explained in the present embodiment. Such registration processing and authentication processing are realized, for instance, by the program executed by the CPU 11 of the aforementioned authentication processing section 10.
[0032] To begin with, explanation is given on the flow of the registration processing with reference to Fig. 2. Firstly, the posture guidance unit 22 of the authentication processing section 10 displays guidance to prompt a user to present his/her finger on the display section 15, in accordance with which the user holds it up (S201). In the case of such smart devices as smartphones and tablet computers being used as the authentication apparatus, the user holds up his/her hand finger in the air.
[0033] Further, in the case where the input device 2 or the smart device is provided with a finger placement table over which a user holds up his/her finger, the user may well hold it up thereover. At this time, as the guidance to prompt the user to present his/her finger, the posture guidance unit 22 may well display a finger guide imitating the finger contour, overlapped with the video image actually being captured by a camera such as the image capturing device 9, in addition to a guiding message saying, for example, 'Please hold up your finger'. This allows the user to visually understand where and how he/she has to hold up his/her finger, which improves usability.
[0034] Further, at this time, the posture guidance unit 22 may well display a circular guide to prompt the user to hold up his/her face. By providing the guide prompting the user to hold up the finger and the guide prompting the user to hold up the face in different places, the biometric features of the face and finger can be used for biometric authentication at the same time. In addition, the posture guidance unit 22 may well display the guide for the face and that for the finger in the same place, in this order with a time lag, in which case the images of the face and the finger may well be captured sequentially in time. At this time, where the camera in use is a front camera, the display direction of the display section 15 and the capturing direction of the camera face the same way; the video image of the camera and the finger guide are therefore displayed mirror-inverted, so that the user's right and left correspond to those of the video image, which improves operability.
[0035] Then, the image processing unit 20 of the authentication processing section 10 carries out the exposure adjustment, the white balance adjustment and the focus adjustment of the camera while the video image of the camera is being captured (S202). With the exposure adjustment, the image processing unit 20 adjusts the exposure set value of the camera such that the average pixel value of the finger portion becomes a constant value.
To note, as the method for determining in which pixels the finger portion is shown, the inner area of the aforementioned finger guide may well be regarded as the finger portion, or the finger region specified through the background separation step described below may well be utilized. Additionally, as the method for calculating the optimal exposure set value, since the exposure time is in proportion to the average luminance value of the image, the exposure time predicted to yield the desired average luminance value can be calculated from the currently set exposure time and the resulting average luminance value of the image, as sketched below.
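By way of illustration only (the patent states the proportionality, not this code; the function name, target value and clamp limits are assumptions), the update rule might look like:

```python
# Sketch of the exposure update rule, assuming (as the text does) that the
# average image luminance is proportional to the exposure time.
def next_exposure_time(current_time_s: float,
                       measured_mean_luminance: float,
                       target_mean_luminance: float = 128.0,
                       min_s: float = 1e-4, max_s: float = 1 / 15) -> float:
    """Predict the exposure time expected to yield the target luminance."""
    if measured_mean_luminance <= 0:
        return max_s  # scene is black; try the longest permitted exposure
    predicted = current_time_s * target_mean_luminance / measured_mean_luminance
    return min(max(predicted, min_s), max_s)  # clamp to the camera's range
```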
[0036] With the white balance adjustment, the image processing unit 20 rectifies the color of the finger to a predetermined hue. Further, as the optimal focus adjustment method, the image processing unit 20 may well, for example, capture two differently focused images, evaluate the intensity of the edges and the contrast within the finger region in each, and fix the focus to the setting whose image has the more intense edges and contrast; alternatively, it may fix the focus setting to the distance over which the finger is expected to be held up, provided that such distance is assumed to be substantially constant.
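As a minimal sketch of the two-image focus comparison, assuming the variance of the Laplacian as the edge/contrast measure (a common sharpness proxy, not one named by this patent):

```python
import cv2
import numpy as np

def sharpness(gray_roi: np.ndarray) -> float:
    """Edge intensity inside the finger region (higher means sharper)."""
    return float(cv2.Laplacian(gray_roi, cv2.CV_64F).var())

def pick_sharper_focus(roi_at_focus_a: np.ndarray,
                       roi_at_focus_b: np.ndarray) -> str:
    """Keep the focus setting whose finger region shows stronger edges."""
    return 'a' if sharpness(roi_at_focus_a) >= sharpness(roi_at_focus_b) else 'b'
```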
[0037] Moreover, at this time, the image processing unit 20 may well expand the dynamic range of the image (HDR: High Dynamic Range), reduce noise in the image, or generate ultrahigh-resolution images, range images and omnifocal images using a series of images captured at high speed at plural different timings, a series of images captured under plural exposure settings, or a series of images captured under plural focus settings. Such preliminary processing brings such effects as being able to capture the image of the living body part with high image quality and vividly even under dark surroundings.
[0038] Then, the image processing unit 20 performs the surroundings determination processing of the image (S203). Although the image is rectified to appropriate brightness through the exposure rectification, there are assumed to be cases where it cannot be satisfactorily rectified, especially when the surroundings are too dark or too bright. Thus, when the average luminance of the image remains poor even though the exposure time has been made longer than the prescribed value to enhance brightness, the image processing unit 20 warns the user that the surroundings are too dark and prompts him/her to utilize the apparatus under brighter surroundings. To note, in the case of a mobile terminal carrying a light source such as a flashlight, it may well be turned on to brighten up the surroundings. Likewise, when the image is still bright even though the exposure time has been made shorter than the prescribed value to reduce brightness, the unit warns the user that the surroundings are too bright.
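A minimal sketch of this determination, with illustrative thresholds and exposure limits (all numeric values are assumptions):

```python
# Sketch of the surroundings check: too dark / too bright / usable.
def judge_surroundings(mean_luminance: float, exposure_s: float,
                       max_exposure_s: float = 1 / 15,
                       min_exposure_s: float = 1e-4,
                       dark_level: float = 40.0,
                       bright_level: float = 215.0) -> str:
    """Return 'too_dark', 'too_bright' or 'ok' for the current frame."""
    if exposure_s >= max_exposure_s and mean_luminance < dark_level:
        return 'too_dark'    # warn the user or turn on the terminal flashlight
    if exposure_s <= min_exposure_s and mean_luminance > bright_level:
        return 'too_bright'  # warn the user to move somewhere dimmer
    return 'ok'
```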
[0039] To note, at the surroundings determination processing, the image processing unit 20 may well not only determine the average brightness and darkness of the image but also take the following steps. For example, the image processing unit 20 may well detect image capturing environments inappropriate for authentication, such as surroundings where intense sunlight penetrates the background; surroundings where shades of intense dark-and-bright contrast, such as a spotlight or sunlight leaking through the gaps of interior blinds, are projected onto the object whose image is captured; surroundings where smear noise, flare or ghosting overlaps the video image; and surroundings where the camera and the object have been largely misaligned, so as to issue a warning as an error. This allows the quality of the captured image to improve and corrective measures to be presented to the user, which improves authentication precision and convenience of use. In the case of this processing resulting in an error, the flow returns to the guide displaying step (S201) to prompt the user to present his/her finger.
[0040] Then, the image processing unit 20 of the authentication processing section 10 performs the background separation step to separate the image of the held-up finger from the background (S204). Various objects other than the hand finger subject to authentication are visualized in the image. Thus, in order to specify in which position of the image the finger is present, the image processing unit 20 separates the finger portion from the background portion based on the color and edge data of the image. As one example of such a separation method, machine learning using teaching data can be adopted. Specifically speaking, the image processing unit 20 preliminarily captures, with a camera such as the image capturing device 9 and under various surroundings, images of the hand fingers of various examinees, to the extent that such images form a population larger than the prescribed number; it then creates, for each image, a per-pixel labeling result defining the finger portion and the background portion, accumulating the pairs of images and labeling results as the teaching data. Then, the image processing unit 20 learns parameters, principally by machine learning exemplified by a DCNN (Deep Convolutional Neural Network), such that upon an image of the teaching data being input, the corresponding labeling result is output as correctly as possible. This method permits the finger region alone to be obtained even when an unknown image is input.
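A minimal sketch of the per-pixel labeling idea, assuming a small PyTorch network (the architecture, loss and optimizer are illustrative assumptions; the patent only specifies learning a DCNN from image/label pairs):

```python
import torch
import torch.nn as nn

class FingerSegmenter(nn.Module):
    """Tiny stand-in for a DCNN that labels each pixel finger/background."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),   # one logit per pixel
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        return self.net(rgb)       # (N, 1, H, W) logits

model = FingerSegmenter()
loss_fn = nn.BCEWithLogitsLoss()   # per-pixel binary labeling loss
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(image: torch.Tensor, label: torch.Tensor) -> float:
    """One update on a (camera image, hand-made per-pixel label) pair."""
    optimizer.zero_grad()
    loss = loss_fn(model(image), label)
    loss.backward()
    optimizer.step()
    return loss.item()
```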
[0041] To note, there are some cases where plural hand fingers appear in the captured image, in which cases the image processing unit 20 may well select the hand finger most conspicuous in size among them, proceed with authentication for all the hand fingers, or issue a warning as an error. Further, as another method to obtain the finger region, the image processing unit 20 may well, on the premise that the finger contour line can be described with a simple closed curve and on the assumption that such a curve can be approximated with, e.g., a spline curve or a Bézier curve, estimate the control parameters of the approximate curve by means of, e.g., machine learning. According to this method, the finger contour line is assured to result in a smoothly connected closed curve, so that in comparison with the per-pixel estimation method it is less likely that the background portion is mistaken for the finger region.
[0042] Then, the posture determination unit 21 of the authentication processing section 10 performs the finger posture detection in order to detect in which portion of the image obtained through the processing of the image processing unit 20 the held-up finger lies and how it is being held up (S205). The definitions of the finger posture include, e.g., the fingertip positions of plural fingers; the positions of the interdigital webs, which correspond to the interconnected portions of neighboring fingers; the length, width and surface area of each finger; the coordinates and orientation of the center axis of each finger; the positions of the joint patterns seen on the palm side of each finger; the curving of the joints; the warping of the joints; the open/close state of the fingers indicating the degree of proximity between neighboring fingers; and the folding or stretching state of each finger. As one method to detect such pieces of information, based on the finger contour obtained along the outer circumference of the finger region separated from the background as mentioned above, the posture determination unit 21 may define a portion of high contour-line curvature as the fingertip, define a portion of similarly high curvature but protruding inversely to the fingertip as the interdigital web, and thereby specify the position, size and orientation of each finger, as sketched below. As another method, upon preparing a high volume of teaching data as described above, the posture determination unit 21 may well make up teaching data that also include such pieces of posture information as the fingertip positions, the interdigital web positions and the joint positions, and conduct machine learning along with such posture information so as to estimate and output posture information for an unknown input finger image.
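A minimal sketch of the curvature heuristic, assuming an ordered contour and a turning-angle threshold (the step size, threshold and sign convention are illustrative assumptions, not the patent's values):

```python
import numpy as np

def classify_contour_points(contour: np.ndarray, step: int = 15,
                            angle_thresh_deg: float = 110.0):
    """contour: (N, 2) array of (x, y) points ordered counter-clockwise
    around the finger region. Returns candidate indices for fingertips
    (high-curvature convex turns) and interdigital webs (concave turns)."""
    tips, webs = [], []
    n = len(contour)
    for i in range(n):
        p_prev = contour[(i - step) % n].astype(float)
        p = contour[i].astype(float)
        p_next = contour[(i + step) % n].astype(float)
        v1, v2 = p_prev - p, p_next - p
        cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        if angle < angle_thresh_deg:        # sharp turn = high curvature
            cross = v1[0] * v2[1] - v1[1] * v2[0]
            # For a counter-clockwise contour, the cross-product sign
            # separates convex turns (fingertips) from concave ones (webs).
            (tips if cross > 0 else webs).append(i)
    return tips, webs
```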
[0043] Subsequently, the posture determination unit 21 of the authentication processing section 10 performs the posture determination step to determine whether or not the held-up finger is appropriate for authentication (S206).
The user holds up his/her finger in accordance with the finger guide shown on the display section 15, but the finger is not always held up in a manner appropriate for authentication. Thus, the posture determination unit 21, taking the information on the fingertips, the interdigital webs and the surface areas of the fingers into consideration, determines whether or not the finger posture is contained within the range of the predetermined permissible values. If it is determined to be within such range, the flow proceeds to the next step.
Otherwise, the authentication processing section 10 issues a warning, outputs how to make such finger posture appropriate for authentication and returns to the guide displaying step (S201).
[0044] To note, if the finger is held up appropriately, the authentication processing section 10 changes the color and brightness of the finger guide and makes sounds, thereby, allowing the user to manipulate the apparatus without trouble. Further, the authentication processing section 10 may well change the color and brightness of the finger guide for each finger to show whether or not each finger takes an appropriate posture, which leads to specifically showing which finger is held up inappropriately so as to facilitate the user to adjust the positions of his/her finger in a simpler manner.
[0045] Then, the posture rectification unit 23 of the authentication processing section 10 performs the posture rectification of the living body part (S220). The posture rectification unit 23, based on the result of the abovementioned finger posture determination, cuts out an ROI (Region of Interest) image for each finger which corresponds to the partial image area of each finger so as to constantly align e.g., the orientation and magnification rate of each finger with one another. At this time, by applying the standardization of the finger shape described below, the apparent shape of each finger is made constant even when e.g., the angle by which the finger is held up might change, which leads to the generation of the finger images robust against the fluctuation of the presented angle.
[0046] Next, the features extraction unit 24 of the authentication processing section 10 performs the features extraction step (S207), in which the features extraction unit 24 extracts the biometric features included in the finger image subjected to the posture rectification. The features that the features extraction unit 24 extracts include, as described above, such plural biometric features as the finger veins, the fingerprints and the joint patterns.
[0047] Thereafter, the collation unit 25 of the authentication processing section 10 performs the pattern normality determination for the extracted biometric features (S208). The pattern normality determination is the step to determine that the extracted patterns have no anomaly. For instance, when image information which does not belong to the finger is taken out as that of the finger by mistake during the finger detection step, there are cases where patterns different in characteristics from those obtained in the normal course of action are gained upon extracting the biometric features from such image information. The collation unit 25 detects such an error through the pattern normality determination step, lest such mistakenly extracted patterns be put to use for registration or authentication. As examples of pattern characteristics, for linear patterns such as veins, they include, e.g., the length and the number of branches thereof, the ratio or volume by which the veins occupy the inner area of the finger, and the pattern density. For such characteristics, the collation unit 25 converts into numerical values the pattern characteristics obtained from correct vein images so as to predetermine the normal ranges, and it can determine whether the patterns are normal by confirming that the numerical values obtained from the actually extracted patterns fall within such normal ranges. In turn, utilizing the general nature of images that frames closer in time sequence are higher in similarity, the collation unit 25 may well compare the currently extracted biometric features with those extracted immediately before, or with those extracted at several nearby points in the time sequence, and determine that there is some anomaly if their similarity is low. If it is determined that there is an anomaly, the result is a registration error, in which case the unit performs the step from scratch.
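A minimal sketch of the range check, using only the vein-occupancy ratio as the characteristic (the range and the choice of a single characteristic are illustrative assumptions; the text above names several others, such as branch count and pattern density):

```python
import numpy as np

def is_pattern_normal(vein_binary: np.ndarray, finger_mask: np.ndarray,
                      occupancy_range=(0.02, 0.35)) -> bool:
    """vein_binary / finger_mask: boolean images of the same shape.
    Accept the pattern only if the ratio of vein pixels within the finger
    region falls inside a range predetermined from known-good images."""
    area = max(int(finger_mask.sum()), 1)
    occupancy = float(np.logical_and(vein_binary, finger_mask).sum()) / area
    lo, hi = occupancy_range
    return lo <= occupancy <= hi
```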
[0048] Lastly, the collation unit 25 accumulates registration candidates from which to choose the patterns adequate for registration (S209). Since the images of the living body part are captured on video, the collation unit 25 can acquire patterns one after another from the video images if it is determined that there is no particular problem during the aforementioned determination steps. Considering the storage volume and the restriction on processing time, not all patterns can be registered. However, when the collation unit 25 time-sequentially accumulates such patterns as groups of candidates for the registration data, it can perform the registration selection processing to select which patterns among them should be registered. Because registration data of higher quality can be singled out when only those appropriate for registration are chosen from among a number of candidates, this allows the registration data volume to be reduced and the authentication precision to be enhanced.
[0049] Then, the collation unit 25 determines whether or not the groups of registration candidates are sufficiently accumulated (S210). The collation unit 25 temporarily stores the patterns determined to be free from problems at the various determination stages of the processing flow so far, continuously for a prescribed duration of time or until they reach a prescribed number. If the collation unit 25 determines that such candidates are sufficiently accumulated (S210, Y), one batch of registration candidates has been acquired.
[0050] Next, the image processing unit 20 of the authentication processing section 10 instructs the user to have the video image of the finger captured once again so as to acquire new registration candidates through capturing similar to step S202. According to the present embodiment, this retake is repeated until the image processing unit 20 has acquired three batches of registration candidates (S211).
[0051] When the image processing unit 20 determines that three batches of registration candidates have been acquired (S211, Y), it performs the registration selection step among them to determine the registration data (S212). As one mode of such registration selection, the image processing unit 20 first makes an all-play-all collation among the patterns within one of the registration candidate batches and calculates the pattern similarity over all combinations of patterns. Herein, when attention is paid to a certain pattern and the total value of its similarity to the other patterns is calculated, the pattern can be said to be similar to the others on average when such a total value is relatively large. On the other hand, when the total value is relatively small, the pattern in question is considered to be an exceptional pattern. Because registering exceptionally captured patterns as registration candidates leads to deterioration of precision, it is preferred to select a pattern whose total similarity to the other patterns is as large as possible. Thus, the image processing unit 20 defines the pattern whose total similarity to the other patterns is the largest as the representative pattern of that registration candidate batch. Since there are three batches of registration candidates herein, three representative patterns are obtained. Lastly, the image processing unit 20 makes an all-play-all collation among those three representative patterns, and registers them if they attain a similarity higher than the prescribed standard value at which all of them can be judged as similar to one another; a sketch of this selection follows.
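A minimal sketch of this selection, where `collate` stands for the system's pattern-similarity function (an assumed interface) and the acceptance threshold is illustrative:

```python
import numpy as np

def pick_representative(patterns, collate) -> int:
    """Index of the representative pattern of one batch: the one whose total
    similarity to the other candidates is the largest (least exceptional)."""
    n = len(patterns)
    totals = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                totals[i] += collate(patterns[i], patterns[j])
    return int(np.argmax(totals))

def select_registration(batches, collate, accept_threshold: float):
    """One representative per batch; register all three only if every pair of
    representatives is mutually similar enough, otherwise fail."""
    reps = [batch[pick_representative(batch, collate)] for batch in batches]
    for i in range(len(reps)):
        for j in range(i + 1, len(reps)):
            if collate(reps[i], reps[j]) < accept_threshold:
                return None   # registration fails
    return reps
```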
[0052] Further, as another method, the image processing unit 20 first defines the one having the highest total similarity among the three representative patterns as the first registration data. Then, the image processing unit 20 collates all the candidate data of the two non-selected batches against the first registration data and determines as the second registration data the data resulting in the lowest similarity among those still high enough to be regarded as the same pattern. Lastly, the image processing unit 20 may well collate all the candidate data of the remaining non-selected batch against the first and second registration data and determine as the third registration data the data resulting in the lowest similarity among those still high enough to be regarded as the same pattern. This method brings the effect of absorbing various fluctuations upon authentication, because the three pieces of registration data involve fluctuations with respect to one another while it is guaranteed that they are data extracted from the same person, so that robust collation against pattern fluctuations upon authentication can be realized.
[0053] The image processing unit 20, at this moment, determines whether or not three pieces of registration data have been decided (S213), and preserves such data (S214) when it determines this was successful (S213, Y), so as to end the registration step. On the other hand, if it is determined that three pieces of registration data have not been decided (S213, N), it ends the registration step in failure (S215).
[0054] Performing such registration determination prevents abnormal patterns acquired unexpectedly from being registered by mistake, and since a plurality of patterns having slightly different finger postures is registered while the capturing is repeated about three times, the robustness against posture fluctuations upon authentication improves.
[0055] To note, upon acquiring the three batches of registration candidates, the posture guidance unit 22 may well change the place where the finger is held up and its posture for the capturing of each batch of candidates: the hand guide position may be displayed in a different location for each batch; the hand guide shape may be changed for each batch so that mutually different finger angles (three-dimensional rotational angles) and postures (e.g., the manner in which the fingers are separated from each other) are demanded; or the place where the finger is held up may be dynamically displaced for each batch. In this case, the capturing of each batch of registration candidates entails a different tendency in the finger posture and in the distortion resulting from the camera lens. This makes the three pieces of registration data different in seeming variation from one another so as to complement one another, which contributes to the improvement of the authentication precision.
[0056] Now, the authentication processing is explained with reference to Fig. 3. To begin with, the processing flow from the guide displaying (S301) to the pattern normality determination (S308) is equivalent to that of the aforementioned registration step, so its explanation is omitted herein. Subsequently, the collation unit 25 takes the collation step (S309) with the registration data and lastly determines the authentication result (S310). At this time, according to the present embodiment, three pieces of registration data are present. As one example of the collation processing method, the collation unit 25 may well collate the respective registration data with the respective authentication data and adopt the result highest in similarity as the definitive similarity, or adopt as the definitive similarity the average of the individually calculated similarities. Further, the collation unit 25 may well transform the three similarities into, e.g., personal probabilities and calculate the definitive personal probability by combining such personal probabilities. The collation unit 25 determines whether or not the similarity obtained through such a collation step indicates a match with the registration data (S311), and ends the authentication step in success (S312) if it determines a match (S311, Y). On the other hand, when the collation unit 25 determines that they are not similar (S311, N), it ends the authentication step in failure (S313).
[0057] Figs. 4A and 4B are views illustrating the presence or absence of the pitching angle of the held-up finger. According to the present embodiment, it is principally assumed that the finger is held up in the air; especially in the case of the finger being held up before a front camera, it is kept stationary in the air with the palm side of the finger faced to the camera or the guide screen. At this time, fluctuations occur in the angles of the wrist and elbow or, if the equipment at hand is a notebook computer, in how far the lid of the liquid crystal screen carrying the camera is opened. Therefore, the distance between the camera and the fingertip and that between the camera and the finger root are not always the same; fluctuations occur such that only the fingertip or only the finger root comes near to the camera. Herein, such angle fluctuation toward the camera or the guide screen is called pitching, in which the state where the optical axis of the camera runs crosswise to the center axis of the finger is defined as the pitching-free state and the angular inclination from that state is defined as the pitching angle.
[0058] Fig. 4A represents the state where there is no pitching, i.e., the pitching angle is 0. To note, the user holds up his/her finger toward the camera 9 at the left side from the right side with respect to the drawing sheet. At this time, the optical axis 80 of the camera runs crosswise to the center axis 81 of the finger, so that the distance between the camera and the fingertip and that between the camera and the finger root are substantially the same. In turn, Fig. 4B illustrates an example in which pitching arises. The angle that the optical axis 80 of the camera and the center axis 81 of the finger make is not orthogonal; they intersect with an inclination of +17° in this example. It should be noted that the pitching angle made when the fingertip comes near to the camera while the finger root moves away from the camera is defined as the plus direction. When the pitching angle of the finger is displaced in the plus direction, the fingertip comes near to the camera, so the image of the fingertip portion is captured with magnification, whereas the finger root moves away from the camera, so its image is captured with reduction. The magnification and reduction rates change continuously between the vicinity of the fingertip, that of the finger middle and that of the finger root, and the change of the magnification rate varies according to the magnitude of the pitching angle and the average distance between the finger and the camera. In this way, the magnification rate differs portion by portion according to the change of the finger pitching angle, so that the seeming shape of the finger also differs when its image is captured with the camera.
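To make the portion-by-portion magnification concrete, a simple pinhole-camera model can be used (an illustrative assumption for explanation, not a formulation given in this patent): let f be the focal length, d the distance from the camera to the middle of the finger, L the finger length, θ the pitching angle, and s ∈ [-L/2, +L/2] the signed position along the finger measured from its middle toward the fingertip. A point at position s then lies at depth d - s sin θ, so the local lateral magnification is

```latex
m(s) = \frac{f}{d - s\,\sin\theta},
\qquad
\frac{m(+L/2)}{m(-L/2)} = \frac{d + \tfrac{L}{2}\sin\theta}{d - \tfrac{L}{2}\sin\theta}
```

For θ > 0 the fingertip (s = +L/2) is magnified and the finger root reduced, and the distortion grows with both the pitching angle and the ratio L/d, matching the tendency described above.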
[0059] Figs. 5A to 5C depict explanatory views illustrating how the finger shape looks on the image according to the change of the pitching angle. These drawings represent how a finger having a typical shape looks on the image in the case of there being no pitching and in the cases of plus and minus pitching angles. As illustrated in the drawings, the fingertip 101, the finger root 102 and the finger contour 103 are shown in the ROI image 100 of the finger whose image is captured. Further, according to the present embodiment, the fingertip side width 104 and the finger root side width 105 are defined as the width at the joint wrinkle 106 of the first joint and that at the joint wrinkle 107 of the second joint respectively. However, the way of deciding the finger width is not limited to the foregoing; by way of one example, the finger length from the fingertip 101 to the finger root 102 may be calculated, and the finger widths at the positions corresponding to one-fifth and four-fifths of the finger length from the fingertip position may be defined as the fingertip side width and the finger root side width respectively.
The latter case is advantageous in that the finger width can be defined without detecting the finger joint wrinkles, just by detecting the fingertip and the finger length. In turn, the thinnest finger width and the thickest finger width in the whole finger region, excepting the vicinity of the fingertip, may well be defined as the fingertip side width and the finger root side width respectively. This method is advantageous in that such widths can be defined just by measuring the finger width at the thinnest and thickest positions based on the finger contour 103, and it is also more robust against the seeming deformation of the finger.
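Purely as an illustrative sketch (the patent discloses no code), the alternative width definition above — measuring the width at one-fifth and four-fifths of the finger length — might be implemented like this, assuming the finger axis runs horizontally in the ROI with the fingertip at column 0:

```python
def fractional_widths(top_edge, bottom_edge, tip_frac=0.2, root_frac=0.8):
    """top_edge/bottom_edge: per-column y-coordinates of the two sides of
    the finger contour, column 0 at the fingertip.  Returns the finger
    widths at one-fifth and four-fifths of the finger length."""
    n = len(top_edge)
    i_tip, i_root = int(tip_frac * (n - 1)), int(root_frac * (n - 1))
    tip_width = abs(bottom_edge[i_tip] - top_edge[i_tip])
    root_width = abs(bottom_edge[i_root] - top_edge[i_root])
    return tip_width, root_width
```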
[0060] Fig. 5A shows the image of the finger captured from the front in the case of the pitching angle being 0. As the image of the finger is captured from the front with the camera, how the finger looks on the image reflects the original shape of the finger, in which the fingertip side width 104 is slightly thinner than the finger root side width 105. On the other hand, as illustrated in Fig. 5B, in the case of the pitching angle taking a plus value, the fingertip side width looks somewhat thicker, as the fingertip comes near to the camera and the vicinity of the fingertip is observed with magnification. Further, since the finger root side moves away from the camera, the finger root side width becomes relatively thinner, resulting in a state where there is no large difference between the fingertip side width and the finger root side width as a whole. Further, the finger length also turns out shorter in comparison with the case where there is no pitching. Moreover, as illustrated in Fig. 5C, in the case of the pitching angle taking a minus value, the seeming tendency of the fingertip and the finger root is inverted relative to that for the plus angle: the fingertip side width looks much thinner and the finger root side width looks much thicker.
[0061] In this way, when the pitching angle of the finger changes, the seeming finger shape in the image changes accordingly. Especially in the case of the collation processing being performed through template matching, in which the two-dimensional shapes are regarded as the biometric features as they are, the correspondence rate deteriorates because the featured shapes of even the same living body part differ between the registration stage and the authentication stage, thereby inviting incorrect authentication in some cases. Thus, according to the present embodiment, the standardization of the finger shape is carried out at the posture rectification steps (S220 and S320) illustrated in Figs. 2 and 3, so that the seeming finger shape remains constantly the same regardless of the change in pitching angle of the finger.
[0062] Fig. 6 illustrates one example of the standardization processing of the finger shape according to the present embodiment. Herein, it is provided that the posture rectification unit 23 of the authentication processing section 10 has already acquired, at the time of acquiring the finger ROI image, the positional information related to various finger postures such as the finger contour line, the fingertip, the finger root, and the straight line of the center axis in the longitudinal direction of the finger. Further, it is provided that the posture rectification unit 23 has carried out the rotational rectification for the rotation of the finger within the two-dimensional image plane (referred to as the yawing rotation of the finger), that is, has acquired the finger ROI image such that the center axis of the finger is parallel with the longitudinal direction of the finger ROI image. Further, before carrying out the present processing, the posture rectification unit 23 preliminarily derives information on the average values of the fingertip side and finger root side widths and the finger length from a volume of learning data large enough to form a population exceeding the prescribed number, and defines such information as the standard shape of the finger. To note, the standard shape may well be defined as the shape that maximizes authentication precision when such parameters as the finger width and length representing it are varied comprehensively, or it may well be defined for each registration ID by calculating, every time the user's authentication succeeds during operation, the average value of the finger posture data in the time-sequential data accumulated so far for that registration ID. Further, in the case of plural fingers being used for authentication, such a standard shape may well be defined for each detected finger, or a common standard shape may well be defined for all the fingers.
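As an illustrative sketch only, and not a disclosure of the patent, the preliminary derivation of the standard shape from learning data might be expressed as simple averaging; the minimum population size is an assumed value:

```python
import numpy as np

def learn_standard_shape(samples, min_population=1000):
    """samples: (N, 3) array of [tip_width, root_width, finger_length]
    measured over a learning population; the averages define the
    standard finger shape.  min_population stands in for the
    'prescribed number' mentioned in [0062]."""
    samples = np.asarray(samples, dtype=float)
    if len(samples) < min_population:
        raise ValueError("learning population below the prescribed number")
    tip_w, root_w, length = samples.mean(axis=0)
    return {"tip_width": tip_w, "root_width": root_w, "length": length}
```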
[0063] To begin with, the posture rectification unit 23 acquires the finger ROI image derived from the foregoing posture determination processing (S601). Then, the posture rectification unit 23 calculates the position of the first joint wrinkle of the finger in order to measure the fingertip side width (S602). In this calculation, the posture rectification unit 23 can take the position of the highest peak as the first joint wrinkle position of the finger by, e.g., enhancing the finger joint wrinkle with an image enhancement filter and calculating the projected image of luminance toward the main direction in which the finger joint wrinkle runs. Then, the posture rectification unit 23 calculates the finger width at that position and defines it as the fingertip side width (S603). Subsequently, likewise, the posture rectification unit 23 calculates the second joint wrinkle position of the finger (S604) so as to acquire the finger root side width (S605). However, if such values have already been calculated during the foregoing posture determination processing, those values should simply be referenced.
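The patent does not fix a particular enhancement filter; as a hedged sketch of the peak-finding idea in S602/S604, one might enhance transverse wrinkles with a derivative filter and locate the strongest luminance-projection peak, assuming the finger axis lies horizontally so that the joint wrinkles run vertically:

```python
import cv2
import numpy as np

def joint_wrinkle_position(finger_roi_bgr):
    """Return the column (x-position) of the strongest joint wrinkle.
    A horizontal Sobel derivative stands in for the 'image enhancement
    filter' of [0063]; the column sums form the projected luminance
    image, and its highest peak marks the wrinkle position."""
    gray = cv2.cvtColor(finger_roi_bgr, cv2.COLOR_BGR2GRAY)
    enhanced = np.abs(cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=5))
    profile = enhanced.sum(axis=0)   # projection along the wrinkle direction
    return int(np.argmax(profile))
```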
[0064] Thereafter, the posture rectification unit 23 acquires a trapezoid approximating the finger shape (S606), obtaining the coordinates of the four points of a trapezoid that takes the fingertip side width and the finger root side width as its upper base and lower base respectively and the distance between such bases as its height. Lastly, the posture rectification unit 23 standardizes the finger ROI image as a whole by perspective projection transformation such that the image matches the aforementioned standard shape (S607), whereby an arbitrary finger is transformed so as to correspond to the standard shape.
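For illustration only, steps S606-S607 might be sketched with OpenCV as below; the horizontal finger axis, the vertical centering of the finger, and the ordering of the standard-shape corners are all assumptions made for the sketch:

```python
import cv2
import numpy as np

def standardize_finger(roi, x_joint1, x_joint2, tip_w, root_w, std_corners):
    """Build the approximate trapezoid from the measured joint positions
    and widths, then warp the whole ROI so that the trapezoid coincides
    with the predefined standard shape.  std_corners must list the four
    standard-shape corners in the same order as `src`."""
    cy = roi.shape[0] / 2.0                        # assumed finger center line
    src = np.float32([[x_joint1, cy - tip_w / 2],  # upper base (fingertip side)
                      [x_joint1, cy + tip_w / 2],
                      [x_joint2, cy - root_w / 2], # lower base (finger root side)
                      [x_joint2, cy + root_w / 2]])
    M = cv2.getPerspectiveTransform(src, np.float32(std_corners))
    h, w = roi.shape[:2]
    return cv2.warpPerspective(roi, M, (w, h))
```

This is also the "matrix from eight known points" construction referred to in paragraph [0067] below: four measured corners paired with four standard corners determine one perspective projection matrix.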
[0065] Figs. 7A to 7D illustrate exemplary views showing the captured finger shape being standardized into the standard shape through perspective projection transformation. The perspective projection transformation is an image transformation that reproduces how a three-dimensional object looks when projected onto a two-dimensional plane, and is also called projective transformation. Through this transformation, the change in the seeming finger shape can be expressed when the image of the finger present in three-dimensional space is captured from an arbitrary visual point. When pitching displaces the visual point between the finger and the camera, the seeming shape of the finger changes; however, carrying out the perspective projection transformation on the premise that the finger always takes the standard shape converts the visual point displaced by the fluctuation of the pitching angle into the pitching-free visual point, so that the shape of the finger as captured from the pitching-free visual point can be obtained.
[0066] Fig. 7A illustrates a trapezoid representing the predefined standard shape of the finger. Herein, the fingertip side width 104, the finger root side width 105, and the distance 140 between the finger joints are defined, and the coordinates of the four apexes of the approximate trapezoid 141 to the finger formed by such definitions are defined at the same time. Further, for comparison purposes, the position 142 in the short-axis direction of the finger middle is indicated with a dotted line. Fig. 7B illustrates a figure which approximates the shape of the finger captured from the pitching-free visual point to a trapezoid. The finger 143 before transformation reflects its original shape because of its pitching-free condition, but the posture rectification unit 23 carries out perspective projection transformation thereon such that it takes the standard shape, thereby obtaining the transformed finger 144 whose width, length, and shape have been standardized. The finger is transformed into a shape different from its original shape, but it is always transformed into the same shape, so that the collation step proceeds correctly.
[0067] Now, Fig. 7C illustrates the finger shape at a plus pitching angle, where the fingertip comes near to the camera. The seeming shape of the fingertip side portion is observed conspicuously large in size, and the dotted line indicating the finger middle is slightly displaced toward the finger root side. Then, upon the posture rectification unit 23 carrying out perspective projection transformation such that the four points of the approximate trapezoid 141 to the finger take exactly the same coordinates as those of the standard shape, the fingertip observed nearer is transformed so as to look smaller while the finger root observed farther is transformed with magnification, and the seeming length of the finger is elongated. At this time, the finger middle portion also shifts to the vertically equivalent position, whereby the seeming deformation of the finger seen from the visual point of the camera is standardized into the appearance of the standard shape, so that the deformation of the finger caused by the pitching angle that the camera and the finger make is removed. Likewise, Fig. 7D illustrates a trapezoid approximate to the finger captured from a visual point having a minus pitching angle; this trapezoid looks slightly thicker on the finger root side, but its seeming shape is standardized by the similar transformation treatment. To note, as one concrete method for carrying out such transformation, the following can be exemplified: the posture rectification unit 23 calculates a matrix required for the corresponding perspective projection transformation from the coordinates of the eight already-known points, namely the four points of the trapezoid having the standard shape and the four points of the trapezoid approximate to the actually measured finger, and transforms the actually measured finger ROI image as a whole using such matrix.
[0068] In view of the foregoing, the present system has the effect of being able to transform and rectify the finger image into the same appearance even when it is captured from different visual points caused by the fluctuation of the pitching angle, that is, of being able to proceed with correct collation even when the collation processing is performed through template matching. Further, the above standardization processing is also advantageous for feature-point matching, which is said to be comparatively robust against deformation, because in some cases the extracted features still fluctuate with the seeming deformation. Moreover, even when authentication is performed through machine learning represented by a DCNN, a system in which the input images are standardized is higher in learning efficiency, which leads to the improvement of authentication precision.

[0069] On the other hand, a system according to one of the prior systems, in which the whole image is magnified or reduced such that the finger width or length is made
constant, or in which the latitude and longitude of the finger are independently magnified or reduced such that first the finger width and then the finger length is made constant, keeps the magnification rate in the two-dimensional plane of an image constant, so that it cannot respond to the partial change of magnification rate caused by an object receding or approaching. According to the present embodiment, rectification into substantially the same shape is feasible even with a change in finger pitching angle and with the distortion of the camera lens, so that high-precision authentication can be realized even under surroundings where a pitching angle is present or likely to arise, such as when the finger is roughly held up or there is a difference in lens performance among cameras for general use.
[0070] To note, actual fingers are three-dimensional, and especially the fingertips are three-dimensionally spheric, so that the fingertip portions entail a larger fluctuation of magnification rate against the fluctuation of the pitching angle in comparison with the case in which the finger is assumed to be a planar trapezoid according to the present embodiment. Thus, it is preferred to adopt a method for calculating the fingertip width which does not depend too heavily on the change of magnification rate of the finger. In particular, the method of detecting the first joint wrinkle position of the finger does not depend on magnification rate, so that it allows the finger width to be measured robustly against the pitching angle fluctuation, which leads to improving authentication precision. Further, according to the present embodiment, the finger shape is approximated to a planar trapezoid, but it may well be approximated to such three-dimensional structures as a cone or cylinder so as to be transformed into the standard shape through the perspective projection transformation, whereby more accurate standardization is realizable.
[0071] Further, according to the present embodiment, the explanation above subjects to the standardization of the finger shape the oblong trapezoidal region defined by the finger widths at the first and second joint positions of the finger and the distance between both joints; however, in the case of the detection precision of the fingertip and the finger root being higher than that of the joints, a rectangle defined by the finger length from the fingertip to the finger root and the finger width at the middle point of the finger length may well be subjected to such standardization. Moreover, the finger length may well be the distance between a fingertip position and a finger root position each shifted toward the center of the finger by a certain distance from the fingertip and the finger root respectively, and the finger width in this case is not limited to that at the middle point of the finger length; using the finger widths at the shifted fingertip and finger root positions, a trapezoidal region for standardization may well be formed.
Furthermore, as the standardization method in the case of the rectangle being subjected to standardization, the posture rectification unit 23 may first uniformly magnify or reduce the latitude and longitude of the image such that the finger width becomes constant and then magnify or reduce only the finger length direction such that the finger length becomes constant, that is, prolong or compress the image in the finger length direction, whereby the whole image is standardized such that the ratio between the finger length and the finger width always becomes constant irrespective of personal differences in, e.g., finger shape. This method is advantageous in facilitating the authentication processing because the perspective projection transformation is dispensed with.
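A minimal sketch of this two-step rectangle standardization, assuming the finger length runs along the image x-axis and assuming target values for the constant width and length:

```python
import cv2

def standardize_rectangle(roi, finger_len, finger_w, std_len=320, std_w=80):
    """First scale the whole image uniformly so the finger width matches
    std_w, then stretch or compress only the length direction so the
    finger length matches std_len (no perspective transform needed).
    std_len and std_w are assumed constants for illustration."""
    s = std_w / finger_w                                   # uniform scale
    h, w = roi.shape[:2]
    uniform = cv2.resize(roi, (int(w * s), int(h * s)))
    sx = std_len / (finger_len * s)                        # length-only scale
    uh, uw = uniform.shape[:2]
    return cv2.resize(uniform, (int(uw * sx), uh))
```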
(Second Embodiment)
[0072] Figs. 13, 8A, and 8B illustrate one embodiment of the processing flow and its exemplary views, respectively, to standardize the palm shape into the standard shape based on the perspective projection transformation. According to the first embodiment, the finger shape associated with the pitching angle change is standardized; the palm posture can be standardized by a similar method. In particular, the palm is a living body part whose image can be captured together with the fingers when the fingers are held up before the front camera, and utilizing it along with the fingers for authentication leads to improved precision. Because the palm features fluctuate according to the change in the angle at which it is held up, in the same way as the finger, the standardization of its seeming shape is advantageous.
[0073] Firstly, the posture rectification unit 23 acquires the hand image including the palm and the fingers (S801) and then detects the positions of the joint wrinkles 160 at the finger roots on the palm side (S802). The detection method is the same as that used for detecting the abovementioned first and second joint wrinkles. Next, the posture rectification unit 23 calculates the center positions of the joint wrinkles 160 at the finger roots in the finger width direction (S803). To note, the posture rectification unit 23 may well calculate the interdigital webs between the fingers and define the middle points between the webs as the center positions.
[0074] Subsequently, the posture rectification unit 23 calculates a pentagon 161 for the palm rectification which takes such center positions as its apexes (S804). Then, the posture rectification unit 23 calculates a matrix for subjecting this pentagon to the perspective projection transformation into the predetermined standard shape (S805) and lastly subjects the hand image 162 as a whole to the perspective projection transformation by the matrix (S806), whereby the posture of the palm image is standardized.
In other words, the posture rectification unit 23 rectifies the palm shape by subjecting the calculated pentagon 161 to the perspective projection transformation into the polygon having the prescribed standard shape mentioned above, thereby permitting the palm shape to be rectified in terms of the pitching angle, the rolling angle, and the yawing angle, and realizing authentication using the biometric features of the palm that is robust against the deformation caused by such angles.
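Since a pentagon supplies five point correspondences, one matrix cannot in general map them exactly; as a hedged sketch (the patent does not specify the fitting method), a least-squares homography is one natural choice:

```python
import cv2
import numpy as np

def standardize_palm(hand_image, root_centers, std_pentagon):
    """Fit a perspective transform mapping the pentagon of finger-root
    wrinkle centers onto an assumed standard pentagon, then warp the
    whole hand image with it (steps S804-S806)."""
    src = np.float32(root_centers).reshape(-1, 1, 2)   # 5 measured apexes
    dst = np.float32(std_pentagon).reshape(-1, 1, 2)   # 5 standard apexes
    M, _ = cv2.findHomography(src, dst, method=0)      # least-squares over all points
    h, w = hand_image.shape[:2]
    return cv2.warpPerspective(hand_image, M, (w, h))
```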
[0075] To note, the biometric features extracted from the palm include, e.g., the palm veins, the palmprints, the palm shapes, and such dermal features as melanin pigment, the red pigment of hemoglobin, and fat. To note, according to the present embodiment, the finger image is rectified together with the rectification of the palm image, but it may well be arranged such that an ROI image for the palm portion is prepared separately and rectification is limited to that image, while the rectification of the fingers is carried out separately according to the first embodiment above.
[0076] To note, according to the present embodiment, standardization is carried out with the pentagon, but, for instance, a square taking as its apexes the four points at the positions of the interdigital webs, which correspond to the roots between the neighboring fingers, may well be subjected to the perspective projection transformation into the standard shape. Especially, the positions of the interdigital webs can be extracted comparatively stably from the bending points of the finger contour, so that robust detection of the palm is feasible. Further, for instance, when the thumb image is not captured, a square having as its apexes the four points excepting the root of the thumb may well be subjected to the perspective projection transformation into the standard shape, or, when the image can be captured inclusive of the wrist, a polygon of six or seven points which contains the middle point or both ends of the linear pattern of the skin present on the wrist may well be transformed into the standard shape. Extracting a region whose surface area is as large as possible leads to carrying out the shape standardization more accurately.
[0077] Even when biometric authentication is carried out using images of, e.g., the face, the pinna, and the iris, a similar transformation to that mentioned above is adoptable. As one embodiment of the standardization of the face shape, the posture rectification unit 23 may well calculate the center positions of both eyes and the positions of both mouth corners and subject a square taking these four points as its apexes to the perspective projection transformation into the standard shape, thereby permitting the face image to be rectified in terms of the pitching angle, the rolling angle, and the yawing angle. To note, it is needless to say that the perspective projection transformation is also feasible using such feature points as the eye corners, both ears, the chin tip, and the eyebrows. Further, in the case of using the pinna, a square having four points including the tip end of the earlobe, the ear hole, and the outer fringe edges of the pinna can be subjected to the perspective projection transformation into the standard shape, while in the case of using the iris, its circular cornea is subjected to the perspective projection transformation such that it takes a perfect circle.
(Third Embodiment)
[0078] Fig. 9 illustrates one guidance example to suppress the pitching fluctuation of the finger.
[0079] As described in the first embodiment, the held-up finger is approximated to a trapezoid and compared with the predefined standard shape. In the case of the ratio of the finger width on the fingertip side to that on the finger root side being larger, the finger tends to be pitched in the plus direction, whereas in the reverse case it tends to be pitched in the minus direction. Thus, the posture guidance unit 22 calculates such ratio and can judge that the fingertip is possibly coming too near to the camera when the average value of the calculated ratio of each finger is larger than the average ratio in terms of the standard shape. Utilizing this pitching tendency, the posture guidance unit 22 guides the hand held up in the three-dimensional air.
[0080] To begin with, the user holds up his/her finger 1 over the finger contour guide 180 shown on the display section 15. At this time, a preview screen, in which the preview video image currently being captured by the camera is overlapped with the guide, is also displayed on the screen in real time. Then, the posture guidance unit 22 calculates a trapezoid approximate to the finger according to the abovementioned embodiment while calculating, at the same time, the ratio of the finger width on the fingertip side to the finger width on the finger root side. In the case of this value being higher, the finger highly possibly takes a plus pitching angle, whereas in the case of it being lower, the finger highly possibly takes a minus pitching angle. At this time, the posture guidance unit 22 preliminarily calculates the upper limit threshold to determine the plus pitching angle and the lower limit threshold to determine the minus pitching angle, and judges that the finger is highly likely to take the plus pitching angle in the case of such value going beyond the upper limit.
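A hedged sketch of this threshold judgment; the two threshold values are assumptions standing in for the limits that, per [0080], would be derived beforehand from the standard shape:

```python
def pitching_guidance(tip_width, root_width, upper=1.05, lower=0.80):
    """A high fingertip/root width ratio suggests a plus pitching angle
    (fingertip too near the camera); a low ratio suggests a minus angle.
    Returns the guiding message to display, or None when within limits."""
    ratio = tip_width / root_width
    if ratio > upper:
        return "Please bend your elbow to keep your hand away from the screen"
    if ratio < lower:
        return "Please stretch your elbow to let your fingertip come closer to the screen"
    return None
```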
[0081] Then, the posture guidance unit 22 displays the guiding message 181 saying 'Please bend your elbow to keep your hand away from the screen' while transforming, through the perspective projection transformation, the shape itself of the finger contour guide 180 such that only the fingertip side is displayed thinner. Further, the posture guidance unit 22 displays the body posture guide 182, in which the body is seen transversely with respect to the screen, and shows how the user should keep his/her hand. The user who looks at this guide keeps his/her hand away from the screen while bending his/her elbow. At this time, his/her hand moves around the elbow, so that the palm slightly turns up and the fingertip moves away from the camera. Likewise, when the aforesaid ratio is lower than its lower limit, the fingertip is likely to be too distant from the camera, so that the posture guidance unit 22 displays a guiding message such as 'Please stretch your elbow to let your fingertip come closer to the screen'.
[0082] Through the presentation of such guiding messages, the pitching angle of the object whose image is captured can itself be guided to a certain angle, so that, besides the rectification by transformation through image processing, the pitching angle of the object itself can be controlled. Moreover, the user can rectify his/her finger position through a manipulation step that is easy to understand and effortless to handle thanks to the body posture guide and the transformed display of the finger contour guide, which leads to the improvement of authentication precision without ruining convenience of use.
(Fourth Embodiment)
[0083] A fourth embodiment exemplifies the authentication system using information on the living body part whose image can be captured with the front camera carried on such mobile terminals as notebook and tablet computers.
[0084] Fig. 10 illustrates an example of the guidance display for multimodal authentication in which the face and the hand fingers are held up. As described in the aforementioned embodiments, making the user hold up his/her fingers after the finger contour guide 180 has been displayed is effective for suppressing the posture fluctuation of the finger. However, there are some cases where the held-up fingers occlude the field of vision on the part of the user so that he/she cannot see the finger contour guide 180, in which case the user aligns his/her hand position while peeping at the finger contour guide; this complicates the manipulation step and makes the video image of the face tilt, deteriorating the precision of the face authentication. Thus, the present embodiment exemplifies one mode of the guidance system that finely adjusts the display position of the preview screen of the finger image based on the face position and the display position of the finger guide with respect to the display section 15.
[0085] To begin with, when the user comes before such an authentication terminal as a tablet computer, the image of his/her face 200 is captured by the camera, and the position of the face in the captured image is detected through the face detection processing performed by the image processing unit 20. At this time, the collation unit 25 may well perform face authentication besides the face detection. Then, the posture guidance unit 22 displays the finger contour guide 180 on the screen and simultaneously shows the guiding message 181, such as 'Please place your right hand along the finger guide'. At this time, the posture guidance unit 22 displays guidance to make the user hold up his/her right hand when the face position is on the left side with respect to the image center, and to hold up his/her left hand when the face position is on the right side with respect to the image center. Through this guidance, the image of the hand that is easy to hold up alongside the face image can be captured without effort, which leads to an effortless manipulation step and improves convenience of use.
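The left/right selection rule above reduces to a one-line comparison; a minimal illustrative sketch:

```python
def choose_hand(face_center_x, image_width):
    """Guide the right hand when the detected face lies left of the
    image center, and the left hand otherwise, so that the face and
    the held-up hand fit side by side in the camera's field of view."""
    return "right" if face_center_x < image_width / 2 else "left"
```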
[0086] The user holds up his/her finger 1 in the air while looking at the finger contour guide 180 according to its guidance. At this time, the position of the front camera varies according to the authentication terminal in use; provided that the optical axis of the camera corresponds to the image center, it can be determined, from the relationship between the face position and the display position of the finger contour guide, whether or not the held-up fingers occlude the user's visual direction and hide the finger contour guide when he/she places his/her fingers along it.
[0087] Figs. 11A and 11B illustrate exemplary views representing the relationship between the finger contour guide, the face, and the user's visual direction. As illustrated in Fig. 11A, when the user's face 200 is positioned slightly to the left side with respect to the image center and the posture guidance unit 22 displays the finger contour guide 180 for the right hand on the right side of the image, the position at which the hand is held up and the position of the finger contour guide 180 both lie on the right side. At this time, it is hard for the user to see the finger contour guide 180 due to his/her held-up hand 220. Thus, as illustrated in Fig. 11B, the posture guidance unit 22 shifts the image preview screen itself to the left side so as to display the finger contour guide 180 at the detection position of the face 200. That is to say, the posture guidance unit 22 shifts the preview screen for display such that the user's face and the finger contour guide 180 on the preview screen directly oppose each other, as a result of which the finger guide for the right hand is displayed in front of the user, making it easy for him/her to see. In this way, an authentication system in which the held-up finger does not block the finger guide and which makes it easy for the finger to be held up is realizable.
[0088] However, when the face detection position becomes unstable, the image sometimes flickers, being shifted at one moment and not shifted at the next. Therefore, after the image processing unit 20 has detected over plural frames that the face lies on the left side, the posture guidance unit 22 keeps the image shifted for a certain time so that it is displayed stably. Further, when shifting the image, the posture guidance unit 22 may well displace it smoothly like a video animation, making it easy for the user to understand that the image has shifted, which leads to the improvement of convenience of use.
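As a hedged sketch of this flicker suppression, one might confirm the face side over several consecutive frames and then hold the shift for a fixed period; both frame counts are assumed values, not figures from the patent:

```python
class StableShift:
    """Shift the preview only after the face has been detected on the
    same side for `confirm_frames` consecutive frames, then hold the
    shifted state for at least `hold_frames` frames."""
    def __init__(self, confirm_frames=5, hold_frames=60):
        self.confirm = confirm_frames
        self.hold = hold_frames
        self.count = 0
        self.timer = 0

    def update(self, face_on_left):
        """Call once per frame; returns True while the preview should stay shifted."""
        self.count = self.count + 1 if face_on_left else 0
        if self.count >= self.confirm:
            self.timer = self.hold        # (re)arm the hold period
        if self.timer > 0:
            self.timer -= 1
            return True
        return False
```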
[0089] Figs. 12A and 12B exemplify screen arrangements to display the finger guide illustrated in Fig. 11B so as to make it easy to view. Fig. 12A illustrates the image display when the captured images and the guide image are reduced in size and displayed lopsidedly to the left side. The face 200 and the finger contour guide 180 are both displayed on the preview screen 240 reduced in size; because those images are displayed lopsidedly to the left side, the preview screen appears substantially in front of the face, so that the user's visual direction is in no way occluded. However, due to such images being reduced in size, there are some cases where they become slightly hard to see.
[0090] On the other hand, Fig. 12B illustrates the screen where the face ROI image 241, resulting from only the face portion being cut out through the face detection, is displayed in a separate window at a small scale. On this screen, the above separate window is displayed, and the hand ROI image 242 cut out around the finger contour guide 180 is displayed lopsidedly to the left side at a normal scale. Because the position alignment of the fingers has comparatively higher latitude than the position alignment of the face, displaying the finger images larger than the face facilitates the position alignment of the fingers.
[0091] To note, according to the present embodiment, explanation has been given on how to capture the images of the face and one hand at the same time, but prompting the user to hold up both hands can enhance authentication precision, for which guidance the ROI images of both hands may well be juxtaposed at positions easy to view, as mentioned above. Further, for example, when the screen is narrow or the installation space for the authentication terminal is cramped, it may well be arranged such that plural living body parts are held up in an appropriate order, e.g., the face image is captured first and then the hand is held up, in which case different living body parts can be held up in the same place, making it optimal for multimodal authentication performed in a small space.
[0092] Moreover, according to the above embodiment, the right or left hand to be held up is automatically selected according to the face position, but the right or left hand to be held up may also be designated by the user in advance, in which case the posture guidance unit 22 may well change the hand guide display position as mentioned above according to the designated hand side.
Specifically speaking, when the input section 16 receives the designation that the left hand is to be held up, it is assumed that the hand and the face are held up on the left side and the right side respectively. Therefore, the posture guidance unit 22 displays the hand guide lopsidedly to the right side where the face lies, thereby realizing a guide display that is easy to view.
[0093] In addition, with terminals such as smartphones which are intended to be gripped by hand, it often happens that they are used vertically, with the camera often provided on the upper portion with respect to the vertical direction. When such a terminal is laid transversely, the camera position moves to either the left side or the right side with respect to the front of the user, in which case it often happens that the hand is not shown within the angle of view of the camera when the hand distant from the camera is held up. Therefore, when such a terminal is laid transversely, the guide display is automatically switched over, utilizing the inclination detection function of the terminal, such that the hand nearer to the camera is held up.
[0094] For example, when the terminal is laid transversely with the camera tilted to the left side, the image processing unit 20 automatically detects through the acceleration sensor of the smartphone that the camera lies on the left side, and the posture guidance unit 22 automatically displays the guide for the left hand, which is easier to hold up, thereby allowing the hand image to be optimally and effortlessly captured without the user finding it hard to hold up his/her hand even when the smartphone is laid transversely.
[0095] Furthermore, when a vertical-type terminal is used as it is laid vertically, it is assumed that the face and the hand held up side by side go beyond the angle of view because the screen is narrow, making them hard to hold up, in which case the posture guidance unit 22 may well display the hand guide at a position slightly lower than the face, or right above or below the face, thereby permitting the face and the hand both to be held up at the same time even when the terminal is laid vertically.
[0096] To note, the present invention is not limited to the embodiments described above, but can be modified in various manners. For example, the above embodiments are intended to explain the present invention in detail for persons skilled in the art to better understand it, so that the present invention is not necessarily limited to what has all the characteristic features disclosed in the explanation. Further, some of the features of certain embodiments can be replaced with those of the other embodiments, while the features of the other embodiments can be added to those of certain embodiments. Additionally, as for some of the features of each embodiment, other features can be added, or they can be deleted or replaced.
[0097] Further, such embodiments have been explained as making up a program realizing some or all of the abovementioned respective features, the relevant functions, the authentication processing unit, and the like, but, as explained in the preface, it is needless to say that some or all of them may well be realized with hardware, e.g., designed as integrated circuits by way of one example. That is to say, all or some of the functions covered by the authentication processing unit may well be realized with such integrated circuits as an ASIC or FPGA instead of a program.
[0098] In this way, since the present invention explained according to the above embodiments includes an image capturing section to capture an image having a picture of a living body part (e.g., fingers and palm) taken; a computation section (e.g., the posture rectification unit) to compute the plural positions to specify the shape of the prescribed portion (e.g., finger length and width, finger root portion on the palm side) of the living body part in the captured image; a transformation section (e.g., the above posture rectification unit) to magnify or reduce the shape of the prescribed portion based on a ratio between one of the plural positions and the other of the plural positions upon registration and the one of the computed plural positions and the other of the computed plural positions; and an authentication section (e.g., the collation unit) to make biometric authentication using an image of the prescribed portion whose shape has been magnified or reduced, it can provide an authentication system so high in operability that the user himself/herself is able to perform its registration operation, and a high-precision biometric authentication apparatus which is convenient to use and excellent in usability, so as to be optimal for a personal authentication apparatus. Further, it relates to the authentication system that authenticates an individual using biometric information so as to enhance the versatility and utility of the authentication apparatus and be able to provide a high-precision authentication technique. For instance, in order to achieve biometric authentication using video images of one finger or plural fingers captured with a camera for general use carried on smartphones or tablet computers, the posture fluctuation of the user's finger or hand, which is presented in accordance with the guiding means displayed by the system, is rectified, whereby a high-precision biometric authentication apparatus convenient to use is realizable.
Claims (7)
- What is claimed is: 1. A biometric authentication apparatus comprising: an image capturing section to capture an image having a picture of a living body part taken; a computation section to compute a plurality of positions to specify a shape of a prescribed portion of the living body part in the captured image; a transformation section to magnify or reduce the shape of the prescribed portion based on a ratio between one of the plurality of positions and another of the plurality of positions upon registration and the one of the computed plural positions and the other of the computed plural positions; and an authentication section to make biometric authentication using an image of the prescribed portion whose shape has been magnified or reduced.
- 2. The biometric authentication apparatus according to claim 1, wherein the image capturing section captures an image of a finger as the living body part; the computation section computes a length of the finger and a width of the finger in the captured image as the shapes of the prescribed portions; the transformation section magnifies or reduces the finger length based on a ratio between the finger length and the finger width upon registration and the computed finger length and the computed finger width; and the authentication section makes biometric authentication using an image of the finger whose length has been magnified or reduced.
- 3. The biometric authentication apparatus according to claim 2, wherein the transformation section uniformly magnifies or reduces latitude and longitude of the image such that the finger width becomes constant.
- 4. The biometric authentication apparatus according to claim 3, wherein the transformation section further prolongs or compresses the image in a direction of the finger length such that the ratio between the finger length and the finger width becomes constant.
- 5. The biometric authentication apparatus according to claim 2, wherein the transformation section magnifies or reduces the finger length by defining a distance between a fingertip position and a finger root position shifted to a center direction of the finger by a certain distance from the fingertip and the finger root respectively as the finger length.
- 6. The biometric authentication apparatus according to claim 1, wherein the image capturing section captures an image of a palm; the computation section detects joint wrinkle positions at finger roots on the palm side as the shapes of the prescribed portions and calculates a polygon taking the detected joint wrinkle positions as its apexes; the transformation section rectifies a shape of the palm by subjecting the calculated polygon to perspective transformation into a polygon having a prescribed standard shape; and the authentication section makes biometric authentication using an image of the palm whose shape has been rectified.
- 7. A biometric authentication method executed by a computer comprising the steps of: capturing an image having a picture of a living body part taken; computing a plurality of positions to specify a shape of a prescribed portion of the living body part in the captured image; magnifying or reducing the shape of the prescribed portion based on a ratio between one of the plurality of positions and another of the plurality of positions upon registration and the one of the computed plural positions and the other of the computed plural positions; and making biometric authentication using an image of the prescribed portion whose shape has been magnified or reduced.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020125159A JP7513451B2 (en) | 2020-07-22 | 2020-07-22 | Biometric authentication device and method |
Publications (3)
Publication Number | Publication Date |
---|---|
GB202103638D0 GB202103638D0 (en) | 2021-04-28 |
GB2598016A true GB2598016A (en) | 2022-02-16 |
GB2598016B GB2598016B (en) | 2022-12-21 |
Family
ID=75439117
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2103638.9A Active GB2598016B (en) | 2020-07-22 | 2021-03-16 | Biometric authentication apparatus and biometric authentication method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7513451B2 (en) |
GB (1) | GB2598016B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11354924B1 (en) * | 2021-05-17 | 2022-06-07 | Vr Media Technology, Inc. | Hand recognition system that compares narrow band ultraviolet-absorbing skin chromophores |
CN115082972B (en) * | 2022-07-27 | 2022-11-22 | 山东圣点世纪科技有限公司 | Living body detection method based on RGB image and vein gray level image |
CN115631514B (en) * | 2022-10-12 | 2023-09-12 | 中海银河科技(北京)有限公司 | User identification method, device, equipment and medium based on palm vein fingerprint |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009301094A (en) | 2008-06-10 | 2009-12-24 | Sharp Corp | Input device and control method for input device |
JP2013205931A (en) | 2012-03-27 | 2013-10-07 | Fujitsu Ltd | Biological information acquisition device, biological information acquisition method, biological information acquisition control program |
CN104598870A (en) * | 2014-07-25 | 2015-05-06 | 北京智膜科技有限公司 | Living fingerprint detection method based on intelligent mobile information equipment |
JP2018128785A (en) | 2017-02-07 | 2018-08-16 | 富士通株式会社 | Biometric authentication apparatus, biometric authentication method, and biometric authentication program |
WO2020071008A1 (en) * | 2018-10-03 | 2020-04-09 | 株式会社日立製作所 | Biometric authentication system, biometric authentication method, and program |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003038461A (en) | 2001-07-13 | 2003-02-12 | Ritsushin Chin | Automatic palm print diagnostic device characterized in that image of palm is analyzed by computer and disease can be diagnosed from the analysis result |
JP2003263640A (en) | 2002-03-08 | 2003-09-19 | Toyota Motor Corp | Personal identification device using biological information |
JP4640582B2 (en) | 2005-03-09 | 2011-03-02 | ソニー株式会社 | Collation device, registration device, image correction method, and program |
JP4845203B2 (en) | 2006-11-24 | 2011-12-28 | ソニー・エリクソン・モバイルコミュニケーションズ株式会社 | Biometric authentication apparatus and deviation detection method |
JP5053889B2 (en) | 2008-02-29 | 2012-10-24 | グローリー株式会社 | Image collation device, personal authentication device, corresponding point search device, corresponding point search method, and corresponding point search program |
JP5293950B2 (en) | 2008-03-04 | 2013-09-18 | 株式会社リコー | Personal authentication device and electronic device |
JP5618267B2 (en) | 2010-06-02 | 2014-11-05 | 国立大学法人名古屋工業大学 | Vein authentication system |
JP6044403B2 (en) | 2013-03-18 | 2016-12-14 | 富士通株式会社 | Imaging apparatus, imaging method, and imaging program |
JP2016057850A (en) | 2014-09-10 | 2016-04-21 | 日立オムロンターミナルソリューションズ株式会社 | Biometric authentication device |
JP6467852B2 (en) | 2014-10-10 | 2019-02-13 | 富士通株式会社 | Biological information correction apparatus, biological information correction method, and biological information correction computer program |
- 2020-07-22: JP application JP2020125159A (patent JP7513451B2), active
- 2021-03-16: GB application GB2103638.9A (patent GB2598016B), active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009301094A (en) | 2008-06-10 | 2009-12-24 | Sharp Corp | Input device and control method for input device |
JP2013205931A (en) | 2012-03-27 | 2013-10-07 | Fujitsu Ltd | Biological information acquisition device, biological information acquisition method, biological information acquisition control program |
CN104598870A (en) * | 2014-07-25 | 2015-05-06 | 北京智膜科技有限公司 | Living fingerprint detection method based on intelligent mobile information equipment |
JP2018128785A (en) | 2017-02-07 | 2018-08-16 | 富士通株式会社 | Biometric authentication apparatus, biometric authentication method, and biometric authentication program |
WO2020071008A1 (en) * | 2018-10-03 | 2020-04-09 | 株式会社日立製作所 | Biometric authentication system, biometric authentication method, and program |
Non-Patent Citations (1)
Title |
---|
LIANG XU ET AL: "A Novel Multicamera System for High-Speed Touchless Palm Recognition", IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS: SYSTEMS, IEEE, PISCATAWAY, NJ, USA, vol. 51, no. 3, 12 March 2019 (2019-03-12), pages 1534 - 1548, XP011838236, ISSN: 2168-2216, [retrieved on 20210216], DOI: 10.1109/TSMC.2019.2898684 * |
Also Published As
Publication number | Publication date |
---|---|
GB2598016B (en) | 2022-12-21 |
JP7513451B2 (en) | 2024-07-09 |
GB202103638D0 (en) | 2021-04-28 |
JP2022021537A (en) | 2022-02-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11775056B2 (en) | System and method using machine learning for iris tracking, measurement, and simulation | |
US11188734B2 (en) | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices | |
WO2017082100A1 (en) | Authentication device and authentication method employing biometric information | |
KR102561723B1 (en) | System and method for performing fingerprint-based user authentication using images captured using a mobile device | |
GB2598016A (en) | Biometric authentication apparatus and biometric authentication method | |
US12067095B2 (en) | Biometric authentication system, biometric authentication method, and storage medium | |
WO2020108225A1 (en) | Fingerprint acquisition method and related apparatus | |
US8798329B2 (en) | Authentication apparatus, authentication method, registration apparatus and registration method | |
WO2022244357A1 (en) | Body part authentication system and authentication method | |
JP4507679B2 (en) | Image recognition apparatus, image extraction apparatus, image extraction method, and program | |
JP5254897B2 (en) | Hand image recognition device | |
JPWO2022074865A5 (en) | LIFE DETECTION DEVICE, CONTROL METHOD, AND PROGRAM | |
JP4654434B2 (en) | Gaze direction identification system | |
JP2023075984A (en) | Biological information measurement device, biological information measurement method and biological information measurement program | |
JP2023161524A (en) | Biometric authentication system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40068769; Country of ref document: HK |