WO2020207947A1 - Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality


Info

Publication number
WO2020207947A1
Authority
WO
WIPO (PCT)
Application number
PCT/EP2020/059718
Other languages
French (fr)
Inventor
Johan Bergqvist
Arnold Herp
Original Assignee
Smart Secure Id Ag
Application filed by Smart Secure Id Ag filed Critical Smart Secure Id Ag
Priority to EP20717828.6A priority Critical patent/EP3953858A1/en
Priority to US17/602,464 priority patent/US11900731B2/en
Priority to CN202080026328.4A priority patent/CN113678138A/en
Priority to AU2020271607A priority patent/AU2020271607A1/en
Publication of WO2020207947A1 publication Critical patent/WO2020207947A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143 Sensing or illuminating at different wavelengths
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G06V40/11 Hand-related biometrics; Hand pose recognition
    • G06V40/14 Vascular patterns
    • G06V40/60 Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V40/67 Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user

Definitions

  • The present disclosure relates to a biometrics imaging device and a biometrics imaging method for capturing image data of a body part of a person, and in particular to such a device and method which enable improved image data quality.
  • Biometric authentication devices, which include biometric imaging devices for capturing image data of a body part of a person, are widely used for the authentication of persons, such as in the context of access control to resources, for example buildings, rooms, computers, smartphones, electronic bank accounts, voting systems, school or university exam papers, frontiers, company registers, etc.
  • biometrics imaging devices are configured to capture image data of a body part of a person, such as of a hand of the person, such that individual and typical biometric features may be determined from the captured image data.
  • Captured image data of the hand or body part of the person may relate to image data captured with a near infrared light sensor (e.g. 700nm to 900nm), to image data captured with a visible light sensor (e.g. 400nm to 600nm), or a combination thereof.
  • Biometric features determined from image data may relate to vein patterns, palm prints, lifelines, etc. of the hand.
  • Image data captured in the near infrared light enable determining features relating to vein patterns of the hand.
  • Image data captured in the visible light enable determining features relating to palm prints and lifelines of the hand.
  • Authentication of persons is based on pre-stored biometric features which were registered under control of an entitled and trustworthy authority.
  • This authority verifies the identity of a person on the basis of an identification card such as passport, for example.
  • Respective image data of the hand of the person is captured and biometric features of the hand or body part of the person are determined from the captured image data.
  • the determined biometric features are stored in a database as pre-stored biometric features.
  • the pre-stored biometric features may include partially or fully the captured image data.
  • the determined biometric features of the hand or body part, the captured image data of the hand, or a combination thereof may relate to vein patterns, palm prints, lifelines, etc.
  • a biometric authentication device captures image data of the hand or body part of the person. Biometric features of the hand or body part of the person are determined and compared with pre-stored biometric features and/or pre-stored image data. Authentication is approved for the person if a match is found within the pre-stored biometric features, otherwise authentication is rejected.
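The match-and-approve flow above can be sketched as follows; the cosine-similarity metric, the `authenticate` name, and the 0.9 threshold are illustrative assumptions, since the disclosure does not prescribe a particular matching algorithm.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def authenticate(captured_features, enrolled_db, threshold=0.9):
    """Compare captured biometric features against pre-stored templates.

    enrolled_db maps a person identifier to a pre-stored feature vector
    registered under control of a trustworthy authority. The metric and
    threshold are illustrative, not prescribed by the disclosure.
    """
    for person_id, template in enrolled_db.items():
        if cosine_similarity(captured_features, template) >= threshold:
            return person_id      # match found: authentication approved
    return None                   # no match: authentication rejected
```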
  • In order to achieve reproducible results and sufficient authentication accuracy, the biometrics imaging device must capture image data of the body part or hand of the person with high quality, both as regards the visible light spectrum and the near infrared light spectrum. In particular, illumination with visible light and near infrared light is required to have high homogeneity and uniform intensity. Capturing image data of body parts of persons is required to provide high quality image data in various environmental conditions. For example, in retrofit installations, previously installed lighting may not be optimally designed for the installation of biometrics imaging devices. In order to achieve a desired ease of use, biometric imaging devices are often installed such that the person may move the body part into a comfortable posture.
  • the person may desire to move the flat hand into a horizontal posture with regard to the floor, or into a posture having an inclination of maximally 45° with regard to the floor.
  • backlight may severely impact quality of image data captured with biometric imaging devices which have been installed to provide a desired level of ease of use.
  • biometrics imaging devices relate to laptop computers or smartphones, which are used in daylight, sun, rainy conditions, during the night, etc.
  • it is required that captured image data is of high quality also under such heavily varying environmental conditions.
  • US2005286744A1 discloses capturing images of a palm.
  • a front face guide is provided for supporting a wrist.
  • the front face guide enables guiding the palm naturally to an image capturing region of the sensor unit.
  • the palm can be correctly positioned.
  • US2006023919A1 discloses providing guidance such that image capture of biometrics information is performed appropriately.
  • An image capture device is caused to perform a plurality of image capture operations (including distance measurement) at short intervals.
  • a guidance screen is displayed according to analysis results.
  • Guidance includes the messages "Please place your hand over the authentication device again", "Your hand is too far away", "Spread palm", and "Please left hand parallel to device".
  • Guidance is disclosed in connection with a mechanical guide.
  • It is an objective of the invention to provide a biometrics imaging device and a biometrics imaging method which do not have at least some of the disadvantages of the prior art.
  • it is an objective of the invention to provide a biometrics imaging device and a biometrics imaging method which enable improved image data quality.
  • it is an objective of the invention to provide a biometrics imaging device and a biometrics imaging method which enable improved image data quality under difficult backlight conditions and without any mechanical auxiliary devices such as a mechanical hand guidance.
  • At least one objective of the invention is achieved by the biometrics imaging device and the biometrics imaging method defined in the enclosed independent claims.
  • the dependent claims set forth further embodiments of the invention.
  • Disclosed is a biometrics imaging device for capturing image data of a body part of a person, which comprises at least one of a visible light sensor for capturing image data of the body part in the visible light spectrum and a near infrared light sensor for capturing image data of the body part in the near infrared light spectrum.
  • the biometrics imaging device comprises a time of flight camera configured for capturing three dimensional image data of the body part of the person.
  • the biometrics imaging device is configured to execute an imaging procedure which includes the steps of: capturing three dimensional image data of a current body part posture; determining on the basis of the three dimensional image data a difference between a desired body part posture and the current body part posture; providing on the basis of the determined difference user guidance to the person enabling the person to adapt the body part posture in direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
  • the desired body part posture may take into account difficult backlight conditions.
  • the desired body part posture may be determined dynamically, for example in accordance to captured three dimensional image data.
  • the desired body part posture may take into account particular factors such as the location of light sources, the brightness of light sources, etc. Without the need of further aids such as a mechanical guide, the user is enabled to adapt the body part posture in accordance to the desired body part posture and quality of captured image data is improved.
  • Capturing image data in the visible light spectrum and/or capturing image data in the near infrared spectrum may be optimized on the basis of three dimensional image data captured by the time of flight camera.
  • optimization may relate to focal length and/or depth of field of the visible light sensor.
  • optimization may relate to focal length and/or depth of field of the near infrared light sensor.
  • Optimization may enable, for example, that wrinkles or skin folds are captured with detailed resolution.
  • optimization on the basis of three dimensional image data may relate to determining a desired body part posture having a predefined distance from the visible light sensor and/or near infrared light sensor, thereby taking into account focal length and/or depth of field. Accordingly, no expensive lenses such as sufficiently fast electrically adaptable liquid lenses are required. Furthermore, no real-time computationally complex frequency analysis of the image data in the visible light or near infrared light spectrum is required. Moreover, three dimensional image data captured by the time of flight camera enables measuring precisely the absolute dimensions of the body part, such as a hand.
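A desired distance that keeps the whole hand within the depth of field of a fixed-focus sensor can be checked with the standard thin-lens relations; the focal length, f-number, and circle of confusion below are illustrative values, not parameters from the disclosure.

```python
def dof_limits(focus_mm, focal_mm, f_number, coc_mm=0.01):
    """Near/far limits of acceptable sharpness for a fixed-focus sensor.

    Standard thin-lens depth-of-field formulas via the hyperfocal
    distance H. All parameter values here are illustrative assumptions.
    """
    H = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = focus_mm * (H - focal_mm) / (H + focus_mm - 2 * focal_mm)
    if H - focus_mm <= 0:
        far = float("inf")          # beyond hyperfocal: sharp to infinity
    else:
        far = focus_mm * (H - focal_mm) / (H - focus_mm)
    return near, far

def hand_in_focus(hand_depths_mm, focus_mm, focal_mm=8.0, f_number=2.8):
    """True if the full depth extent of the hand lies within the DoF."""
    near, far = dof_limits(focus_mm, focal_mm, f_number)
    return min(hand_depths_mm) >= near and max(hand_depths_mm) <= far
```

A desired body part posture could then be chosen at a distance for which `hand_in_focus` holds, avoiding the need for fast adaptable liquid lenses mentioned above.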
  • the current and desired body part posture relate to one or more of a relative distance, a relative orientation, and a gesture.
  • relative distance and/or relative orientation are defined with respect to the visible light sensor and/or with respect to the near infrared light sensor.
  • user guidance relates to one or more of adapting a relative distance, a relative orientation, and a gesture of the current body part posture.
  • the relative distance may relate to the distance between the biometrics imaging device and the body part.
  • the relative orientation may relate to the relative orientation between the biometrics imaging device and the body part posture.
  • the relative distance and/or orientation may depend on properties of the environment of the biometrics imaging device, such as backlight, reflecting surfaces, etc.
  • the gesture may relate to movements of the body part, such as the stretching or spreading of fingers of a hand.
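A minimal sketch of how relative distance and relative orientation could be derived from the three dimensional image data, assuming the time of flight camera delivers hand-surface points in millimetres; the plane-fit approach and the 250 mm target distance are assumptions, not details from the disclosure.

```python
import math

def posture_difference(points, desired_distance=250.0):
    """Distance and tilt error of a hand posture from ToF surface points.

    points: list of (x, y, z) hand-surface points in sensor coordinates
    (millimetres, z along the optical axis). A plane z = a*x + b*y + c is
    least-squares fitted; tilt is the angle between the plane normal and
    the optical axis. Target distance is an illustrative placeholder.
    """
    n = len(points)
    xm = sum(p[0] for p in points) / n
    ym = sum(p[1] for p in points) / n
    zm = sum(p[2] for p in points) / n
    # Centered second moments for the closed-form least-squares plane fit.
    sxx = sum((p[0] - xm) ** 2 for p in points)
    syy = sum((p[1] - ym) ** 2 for p in points)
    sxy = sum((p[0] - xm) * (p[1] - ym) for p in points)
    sxz = sum((p[0] - xm) * (p[2] - zm) for p in points)
    syz = sum((p[1] - ym) * (p[2] - zm) for p in points)
    d = sxx * syy - sxy ** 2
    a = (syy * sxz - sxy * syz) / d
    b = (sxx * syz - sxy * sxz) / d
    # Plane normal is proportional to (-a, -b, 1); angle to (0, 0, 1):
    tilt_deg = math.degrees(math.acos(1.0 / math.sqrt(a * a + b * b + 1.0)))
    return zm - desired_distance, tilt_deg
```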
  • user guidance includes one or more of a visual guidance displayed on a display and acoustic guidance played back on a loudspeaker.
  • Acoustic guidance may be less preferable because the language of the person may not be known.
  • Acoustic guidance may include generic signals such as warning signals, information signals, etc.
  • Visual guidance may include representations of the body part, which may include accentuations such as coloured parts.
  • user guidance includes displaying on a display a representation of the desired body part posture and a representation of the current body part posture.
  • the person is enabled to adapt the body part posture more precisely.
  • user guidance includes displaying on a display a representation of a spirit level indicating the difference between the current body part posture and the desired body part posture. The person is enabled to adapt the body part posture more precisely.
  • the biometrics imaging device is further configured to repeat one or more steps of the imaging procedure more than once.
  • the person is enabled to adapt the body part stepwise, thereby increasing precision.
  • the biometrics imaging device is further configured such that, in case the determined difference is within the predefined range, a delay of less than 100 milliseconds, preferably less than 10 milliseconds, is maintained between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum. Image data relevant for biometric features is thus captured as soon as the body part posture is in the desired posture.
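The timing requirement can be sketched as a tight check-then-capture loop; `get_difference` and `capture_image` are hypothetical callbacks standing in for the posture comparison and the visible/NIR sensors.

```python
import time

def guided_capture(get_difference, capture_image,
                   tolerance=5.0, max_delay_s=0.1):
    """Trigger image capture as soon as the posture difference is in range.

    get_difference() returns the current posture error; capture_image()
    grabs a visible or near infrared frame. Capture is issued immediately
    after the in-range check so the elapsed time between determining the
    difference and capturing stays below max_delay_s (100 ms, preferably
    10 ms, per the disclosure). Names and tolerance are assumptions.
    """
    while True:
        diff = get_difference()
        t_checked = time.monotonic()
        if abs(diff) <= tolerance:
            frame = capture_image()
            delay = time.monotonic() - t_checked
            return frame, delay
```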
  • the biometrics imaging device is further configured to determine a region of interest of the body part on the basis of at least one of three dimensional image data, image data in the visible light spectrum and image data in the near infrared light spectrum, and to adapt in accordance to the region of interest at least one of the desired body part posture and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
  • For example, image data captured in the near infrared light spectrum may not include a discernible vein pattern in the case of the palm of a hand of a woman; in this case the desired posture may be changed to the back of the hand.
  • a rectangular region of interest of a part of the palm of a hand may not include sufficient biometric features and in this case the desired body part posture and/or capturing image data may be changed to enable capturing image data of a larger area of the palm of the hand, for example also including all the fingers.
  • At least one objective of the invention is also achieved by a biometrics imaging method for capturing image data of a body part of a person, wherein at least one of a visible light sensor for capturing image data of the body part in the visible light spectrum and a near infrared light sensor for capturing image data of the body part in the near infrared light spectrum is provided.
  • the method comprises: providing a time of flight camera configured for capturing three dimensional image data of the body part of the person; and executing an imaging procedure which includes the steps of: capturing three dimensional image data of a current body part posture; determining on the basis of the three dimensional image data a difference between a desired body part posture and the current body part posture; providing on the basis of the determined difference user guidance to the person enabling the person to adapt the body part posture in direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
  • user guidance relates to one or more of adapting a relative distance, a relative orientation, and a gesture of the current body part posture.
  • user guidance includes displaying on a display a representation of the desired body part posture and a representation of the current body part posture.
  • the biometrics imaging method further comprises: repeating one or more steps of the imaging procedure more than once.
  • the biometrics imaging method further comprises maintaining in case the determined difference is within the predefined range a delay of less than 100 milliseconds, preferably less than 10 milliseconds, between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum.
  • the biometrics imaging method further comprises determining a region of interest of the body part on the basis of at least one of three dimensional image data, image data in the visible light spectrum and image data in the near infrared light spectrum, and adapting in accordance to the region of interest at least one of the desired body part posture and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
  • Fig. 1 illustrates schematically the palm of the left hand of a first person
  • Fig. 2 illustrates schematically the palm of the right hand of a second person
  • Fig. 3 illustrates schematically the venous network of the back of the right hand 3 of a third person
  • Fig. 4 illustrates schematically the hand of a person and a biometrics authentication device
  • Fig. 5 illustrates schematically a time of flight camera
  • Fig. 6 illustrates schematically a biometrics imaging device installed in a building
  • Fig. 7a, 7b, 7c, 7d illustrate schematically user guidance relating to an imaging procedure
  • Fig. 8 illustrates schematically a method of capturing image data of a hand of a user.
  • Figure 1 illustrates schematically the palm of the left hand 1 of a first person.
  • the left hand 1 has a thumb t, an index finger i, a middle finger m, a ring finger r, and a little finger l.
  • Figure 2 illustrates schematically the palm of the right hand 2 of a second person.
  • the right hand 2 has a thumb t, an index finger i, a middle finger m, a ring finger r, and a little finger l.
  • Figure 1 and Figure 2 illustrate schematically images of the palms of the left and the right hand 1, 2 captured with a visible light sensor (e.g. 400nm to 600nm).
  • the hands 1, 2 have palm prints P or lifelines, which can be identified in visible light.
  • Additionally or alternatively, vein patterns of the hands 1, 2 can be determined from image data captured in near infrared light (e.g. 700nm to 900nm). Figure 1 and Figure 2 do not illustrate vein patterns.
  • the palm prints P or lifelines of the hands 1, 2 of these two persons include individual biometric features, such as particular lengths, positions, curvatures, etc.
  • biometric features such as particular lengths, positions, curvatures, etc.
  • authentication of a particular person is enabled, in particular in combination with biometric features determined from respective vein patterns.
  • authentication of a person may also be based on biometric features of the back of the hand determined from image data captured in visible light, in near infrared light, or a combination thereof.
  • biometric features of the back of the hand determined from image data captured with a visible light sensor sufficiently enable authentication of a person.
  • FIG 3 illustrates schematically the venous network of the back of the right hand 3 of a third person.
  • the right hand 3 has a thumb t, an index finger i, a middle finger m, a ring finger r, and a little finger l.
  • the back of the hand 3 includes veins, which include the dorsal venous network 31 (rete venosum dorsale manus) and the dorsal metacarpal veins 32 (Vv. metacarpales dorsales).
  • Vein patterns can be determined from image data captured with a near infrared light sensor, and individual biometric features can be determined from the image data captured in near infrared light.
  • FIG 4 illustrates schematically a biometrics imaging device 80, which may be part of or provide a biometrics authentication device.
  • the biometrics imaging device 80 includes a biometric sensor 10 and a processing unit 20.
  • the biometrics imaging device 80 may be connected to a user display 40, for example, for providing user guidance.
  • the processing unit 20 may be attached to the biometric sensor 10 as illustrated in Figure 4.
  • the processing unit 20 may be located remotely within a computing infrastructure such as a host computer, server, cloud, etc.
  • the processing unit 20 may include one or more processors and may have stored computer instructions which may be executed by the one or more processors in order to enable the functions as described in the present disclosure.
  • the user display 40 may be fixedly installed close to the biometric sensor 10.
  • the user display 40 may relate to a user device such as a notebook, smartphone, smartwatch, etc., wherein the processing unit 20 may communicate via a wireless connection such as Bluetooth with the user display 40, for example.
  • the biometrics imaging device 80 may be included in a user device such as a notebook, smartphone, smartwatch, etc. As illustrated in Figure 4, a current posture of the user’s hand 401 and a desired posture of the user’s hand 402 may be displayed on display 40.
  • the biometric sensor 10 enables capturing image data of the hand 4 of a person.
  • the biometric sensor 10 includes a visible light sensor 101 for capturing image data in the visible light spectrum, a near infrared light sensor 102 for capturing image data in the near infrared light spectrum, and a time of flight camera 103 for capturing image data having three dimensions.
  • One or more of the visible light sensor 101, the near infrared light sensor 102 and the time of flight camera 103 may be included in a single sensor.
  • the biometric sensor 10 includes light sources 104.
  • Figure 4 illustrates eight light sources 104 arranged on a circle around the sensors 101, 102 and the time of flight camera 103.
  • the light sources 104 may include a different number of light sources and/or may be arranged in a different manner.
  • the light sources 104 may include customized lenses in order to achieve a homogeneous light distribution.
  • the light sources 104 may include one or more light sources providing illumination in the visible light spectrum and enabling capturing image data with the visible light sensor 101 in the visible light spectrum.
  • the light sources 104 may include one or more light sources providing illumination in the near infrared light and enabling capturing image data with the near infrared light sensor 102 in the near infrared light.
  • Calibration may be provided in particular as regards the geometric location of the visible light sensor 101, the near infrared light sensor 102 and the time of flight camera 103, such as the translational displacement between the visible light sensor 101, the near infrared light sensor 102 and the time of flight camera 103. Moreover, calibration may be provided as regards a scaling factor of image data captured by the time of flight camera 103, such as the absolute size of objects in the captured image data. Calibration may be provided within the biometric sensor 10, by post-processing in a dedicated computer such as the processing unit 20, or a combination thereof. Calibration may provide that the objects in the image data captured by the visible light sensor 101, the near infrared light sensor 102 and the time of flight camera 103 align to each other.
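A minimal sketch of applying such a calibration, assuming it reduces to a translational displacement and a scaling factor (rotation omitted for brevity); the function name and the calibration values are illustrative.

```python
def align_to_visible(points_tof, translation, scale=1.0):
    """Map ToF-camera coordinates into the visible sensor's frame.

    translation is the calibrated translational displacement between the
    two sensors; scale corrects the absolute size of objects in the ToF
    data. Both would come from a one-time calibration. Rotation between
    the sensors is omitted in this sketch.
    """
    return [tuple(scale * p_i + t_i for p_i, t_i in zip(p, translation))
            for p in points_tof]
```

After this mapping, features located in the depth image can be looked up at aligned positions in the visible and near infrared images.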
  • the visible light sensor 101 may include a visible light sensitive chip providing 2D image data (2D: two dimensional) in accordance to a visible light intensity distribution generated by a 3D scene (3D: three dimensional).
  • the near infrared light sensor 102 may include a near infrared light sensitive chip providing 2D image data (2D: two dimensional) in accordance to a near infrared light intensity distribution generated by a 3D scene (3D: three dimensional).
  • the visible light sensor 101 and the near infrared light sensor 102 may include lenses, buffers, controllers, processing electronics, etc.
  • the visible light sensor 101 and the near infrared light sensor 102 may relate to commercially available sensors such as the e2v semiconductors SAS EV76C570 CMOS image sensor, equipped with a blocking optical filter of <500nm wavelength for the visible light sensor 101 and with a blocking optical filter of >700nm for the near infrared light sensor 102.
  • the light sources 104 may include a visible light and/or near infrared light generator such as an LED (LED: light emitting diode).
  • the light sources 104 may relate to commercially available light sources such as high power LEDs SMB1 N series from Roithner Lasertechnik GmbH, Vienna.
  • FIG. 5 illustrates schematically a time of flight camera 103.
  • the time of flight camera 103 includes a sequence controller 1031, a modulation controller 1032, a pixel matrix 1033, an A/D converter 1034 (A/D: analogue to digital), an LED or VCSEL 1035 (LED: light emitting diode; VCSEL: vertical-cavity surface-emitting laser), and a lens 1036.
  • the sequence controller 1031 controls the modulation controller 1032 and the A/D converter 1034.
  • the modulation controller 1032 controls the LED or VCSEL 1035 and the pixel matrix 1033.
  • the pixel matrix 1033 provides signals to the A/D converter 1034.
  • the sequence controller 1031 interacts with a host controller 1037, for example via an I2C bus (I2C: I-squared-C serial data bus).
  • the LED or VCSEL 1035 illuminates a 3D scene 1038.
  • the lens 1036 receives light reflected by the 3D scene 1038.
  • the A/D converter 1034 provides raw 3D image data (3D: three dimensional) to the host controller 1037, for example via MIPI CSI-2 or PIF (MIPI: Mobile Industry Processor Interface; CSI: Camera Serial Interface; PIF: Parallel InterFace).
  • the host controller 1037 performs a depth map calculation and provides an amplitude image 103a of the 3D scene 1038 and a depth image 103d of the 3D scene.
  • the background of the amplitude image 103a includes shades of light of a wall behind a person, for example, while the background of the depth image 103d has a single value, such as black, because the wall behind the person is arranged at a specific distance from the time of flight camera 103.
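For context, a continuous-wave time of flight pixel derives the depth value from the phase delay between emitted and received modulated infrared light; the relation below is the standard CW-ToF formula, not one quoted from the disclosure.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(phase_shift_rad, f_mod_hz):
    """Depth measured by a continuous-wave ToF pixel.

    depth = c * phi / (4 * pi * f_mod), where phi is the measured phase
    shift and f_mod the modulation frequency of the active infrared
    light. The unambiguous range is c / (2 * f_mod), reached at a full
    2*pi phase shift. Standard CW-ToF relation; values are illustrative.
    """
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)
```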
  • the time of flight camera 103 may relate to a REAL3™ sensor of the company Infineon™, and may include the following specifications: direct measurement of depth and amplitude in every pixel; high accuracy; lean computational load; active modulated infrared light and patented Suppression of Background Illumination (SBI) circuitry in every pixel; full operation in any light condition, from darkness to bright sunlight; monocular system architecture having no mechanical baseline; small size and high design flexibility; no limitation in close range operation; no special requirements on mechanical stability; no mechanical alignment and angle correction; no recalibration or risk of de-calibration due to drops, vibrations or thermal bending; easy and very fast once-in-a-lifetime calibration; and cost efficient manufacturing.
  • Figure 6 illustrates schematically a biometrics imaging device 80 installed in a building 6 having an entrance 61. Access to the building 6 is controlled by a biometrics authentication device.
  • the biometrics imaging device 80 is installed at a location close to the entrance 61.
  • the biometrics imaging device 80 is configured to capture image data from a body part such as a hand 4 of a person requesting access to the building 6.
  • Backlight, such as that caused by light sources 62 installed for the purpose of providing comfortable lighting conditions, may severely deteriorate the quality of the image data captured by the biometrics imaging device 80.
  • the biometrics imaging device 80 is configured to execute an imaging procedure which includes the steps of: capturing three dimensional image data of a current posture of the hand 4; determining on the basis of the three dimensional image data a difference between a desired posture of the hand 4 and the current posture of the hand 4; providing on the basis of the determined difference user guidance 401, 402 to the person enabling the person to adapt the posture of the hand 4 in direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum of the hand 4.
  • Figures 7a, 7b, 7c, 7d illustrate schematically user guidance relating to an imaging procedure executed by the biometrics imaging device 80.
  • User guidance is displayed on a display 40.
  • the display 40 may be fixedly installed at a location close to the biometrics device 80.
  • the display 40 may relate to a user device such as a tablet computer, a smartphone, etc.
  • a representation of a desired posture 402 of the hand 4 is displayed. Displaying the representation of the desired posture 402 is based on three dimensional image data captured with the time of flight camera 103 and may be calibrated in accordance to the hand 4. Calibration in accordance to the hand 4 is important for enabling proper guidance of hands of different sizes, such as a hand of a man, of a woman, or of a child.
  • a representation of the current posture 401 of the hand 4 is displayed. Displaying the representation of the current posture 401 is based on three dimensional image data captured with the time of flight camera 103.
  • the current posture of the hand 4 is too far away from the biometrics imaging device 80, which is indicated by displaying the representation of the current posture 401 at a smaller size than the representation of the desired posture 402. Accordingly, the person is enabled to adapt the posture of the hand 4 in direction of the desired posture, namely to move the hand closer to the biometrics imaging device 80.
  • the current posture of the hand 4 is still too far away from the biometrics imaging device 80, which is indicated by displaying the representation of the current posture 401 at a smaller size than the representation of the desired posture 402.
  • the distance is reduced and further information may be displayed such as a spirit level 403 in order to provide further guidance to adapt the posture of the hand 4.
  • the spirit level 403, which is displayed on the back of the representation of the hand at the current posture 401 provides user guidance as regards a difference between the current inclination of the hand and the desired inclination of the hand.
  • the person is enabled to adapt the posture of the hand 4 in direction of the desired posture, namely to move the hand even closer to the biometrics imaging device 80 and to adapt inclination of the hand.
  • the current posture of the hand 4 is approximately at the desired distance from the biometrics imaging device 80, which is indicated by displaying the representation of the current posture 401 at approximately the same size as the representation of the desired posture 402.
  • the posture of the hand 4 does not have yet the correct inclination, which is indicated by the spirit level 403 displayed together with the representation of the hand at the current posture 401.
  • the person is enabled to adapt the posture of the hand 4 in direction of the desired posture, namely to adapt inclination of the hand even more.
In the example according to Figure 7d, the difference between the current posture of the hand 4 and the desired posture of the hand 4 is at a sufficiently small level. Accordingly, the biometrics imaging device 80 captures at least one of image data in the visible light spectrum and image data in the infrared light spectrum. Because the posture of the hand 4 is in conformance with the desired posture, which may in particular take into account backlight conditions, optimal focus conditions, etc., the captured image data has improved quality.
Figure 8 illustrates schematically a method of capturing image data of a hand 4 of a person. The method includes an imaging procedure which includes the following steps. In step S1, three dimensional image data of a current posture of the hand 4 is captured using a time of flight camera 103. In step S2, a difference between a desired posture of the hand 4 and the current posture of the hand 4 is determined on the basis of the three dimensional image data. In step S3, user guidance 401, 402 is provided to the person on the basis of the determined difference, enabling the person to adapt the posture of the hand 4 in direction of the desired posture. In step S4, at least one of image data in the visible light spectrum and image data in the infrared light spectrum of the hand 4 is captured. For example, steps S1 to S3 are repeated continuously until in step S2 it is determined that the difference is smaller than a predefined threshold, and step S2 is then followed by step S4.
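The stepwise imaging procedure of steps S1 to S4 can be sketched as a simple control loop. The following Python sketch is purely illustrative and not part of the disclosure; the callback interfaces, the threshold value, and the iteration limit are assumptions:

```python
def imaging_procedure(capture_3d, posture_difference, show_guidance, capture_images,
                      threshold=0.05, max_iterations=100):
    """Steps S1-S4 as a loop: capture the 3D posture (S1), compute the
    difference to the desired posture (S2), guide the user (S3), and
    capture visible/near infrared image data once close enough (S4)."""
    for _ in range(max_iterations):
        posture = capture_3d()                    # S1: time of flight data
        difference = posture_difference(posture)  # S2: scalar posture difference
        if difference < threshold:
            return capture_images()               # S4: biometric image data
        show_guidance(posture, difference)        # S3: e.g. update the display
    return None  # the desired posture was never reached
```

The loop mirrors the claimed repetition of S1 to S3 until the difference falls below a predefined threshold, after which S4 follows.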


Abstract

A biometrics imaging device (10) for capturing image data of a body part (4) of a person comprises at least one of a visible light sensor (101) for capturing image data of the body part in the visible light spectrum and a near infrared light sensor (102) for capturing image data of the body part in the near infrared light spectrum. The biometrics imaging device (10) comprises a time of flight camera (103) configured for capturing three dimensional image data of the body part (4) of the person. The biometrics imaging device (10) is configured to execute an imaging procedure which includes the steps of: capturing three dimensional image data of a current body part posture; determining on the basis of the three dimensional image data a difference between a desired body part posture and the current body part posture; providing on the basis of the determined difference user guidance to the person enabling the person to adapt the body part posture in direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.

Description

Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality

Technical Field
The present disclosure relates to a biometrics imaging device and a biometrics imaging method for capturing image data of a body part of a person. In particular, the present disclosure relates to a biometrics imaging device and a biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality.
Prior Art
Biometric authentication devices, which include biometric imaging devices for capturing image data of a body part of a person, are widely used for the authentication of persons, such as in the context of access control to resources as for example buildings, rooms, computers, smartphones, electronic bank accounts, voting systems, school or university exam papers, frontiers, company registers, etc.
In some embodiments, biometrics imaging devices are configured to capture image data of a body part of a person, such as of a hand of the person, such that individual and typical biometric features may be determined from the captured image data. Captured image data of the hand or body part of the person may relate to image data captured with a near infrared light sensor (e.g. 700nm to 900nm), to image data captured with a visible light sensor (e.g. 400nm to 600nm), or a combination thereof. Biometric features determined from image data may relate to vein patterns, palm prints, lifelines, etc. of the hand. Image data captured in the near infrared light enable determining features relating to vein patterns of the hand. Image data captured in the visible light enable determining features relating to palm prints and lifelines of the hand.
Authentication of persons is based on pre-stored biometric features which were registered under control of an entitled and trustworthy authority. This authority verifies the identity of a person on the basis of an identification card such as a passport, for example. Respective image data of the hand of the person is captured and biometric features of the hand or body part of the person are determined from the captured image data. The determined biometric features are stored in a database as pre-stored biometric features. In some embodiments, the pre-stored biometric features may partially or fully include the captured image data. The determined biometric features of the hand or body part, the captured image data of the hand, or a combination thereof, may relate to vein patterns, palm prints, lifelines, etc.
Later on, in case the authentication of a person is required, a biometric authentication device captures image data of the hand or body part of the person. Biometric features of the hand or body part of the person are determined and compared with pre-stored biometric features and/or pre-stored image data. Authentication is approved for the person if a match is found within the pre-stored biometric features, otherwise authentication is rejected.
In order to achieve reproducible results and sufficient authentication accuracy, the biometrics imaging device must capture image data of the body part or hand of the person with high quality, both as regards the visible light spectrum and the near infrared light spectrum. In particular, illumination with visible light and near infrared light is required to have a high homogeneity and uniform intensity. Capturing image data of body parts of persons is required to provide high quality image data in various environmental conditions. For example, in retrofit installations, previously installed lighting may not be optimally designed for the installation of biometrics imaging devices. In order to achieve a desired ease of use, biometric imaging devices are often installed such that the person may move the body part into a comfortable posture. For example, the person may desire to move the flat hand into a horizontal posture with regard to the floor, or into a posture having an inclination of maximally 45° with regard to the floor. However, backlight may severely impact the quality of image data captured with biometric imaging devices which have been installed to provide a desired level of ease of use. Other applications of biometrics imaging devices relate to laptop computers or smartphones, which are used in daylight, sun, rainy conditions, during the night, etc. However, it is required that captured image data is of high quality also under such heavily varying environmental conditions. In case biometric features from the whole hand and the fingers need to be determined, it becomes particularly difficult to capture image data of high quality because the backlight is not covered by the hand, as it is in the case when biometric features relating only to a sub-region of the palm of the hand need to be determined. An important quality feature of captured image data relates to the so-called “regions of interest”, which must be clearly identified.
This results in the requirement that the image data must have a high contrast at the edges of the hand in order to determine the outline of the hand unambiguously and reproducibly.
US2005286744A1 discloses capturing images of a palm. A front face guide is provided for supporting a wrist. The front face guide enables guiding the palm naturally to an image capturing region of the sensor unit. The palm can be correctly positioned. US2006023919A1 discloses providing guidance such that image capture of biometrics information is performed appropriately. An image capture device is caused to perform a plurality of image capture operations (including distance measurement) at short intervals. A guidance screen is displayed according to analysis results. Guidance includes messages “Please place your hand over the authentication device again”, “Your hand is too far away”, “Spread palm”, “Please left hand parallel to device”. Guidance is disclosed in connection with a mechanical guide.
Summary of the Invention
It is an objective of the invention to provide a biometrics imaging device and a biometrics imaging method which do not have at least some of the disadvantages of the prior art. In particular, it is an objective of the invention to provide a biometrics imaging device and a biometrics imaging method which enable improved image data quality. In particular, it is an objective of the invention to provide a biometrics imaging device and a biometrics imaging method which enable improved image data quality under difficult backlight conditions and without any mechanical auxiliary devices such as a mechanical hand guidance.
At least one objective of the invention is achieved by the biometrics imaging device and the biometrics imaging method defined in the enclosed independent claims. The dependent claims set forth further embodiments of the invention.
At least one objective of the invention is achieved by a biometrics imaging device for capturing image data of a body part of a person, which comprises at least one of a visible light sensor for capturing image data of the body part in the visible light spectrum and a near infrared light sensor for capturing image data of the body part in the near infrared light spectrum. The biometrics imaging device comprises a time of flight camera configured for capturing three dimensional image data of the body part of the person. The biometrics imaging device is configured to execute an imaging procedure which includes the steps of: capturing three dimensional image data of a current body part posture;
determining on the basis of the three dimensional image data a difference between a desired body part posture and the current body part posture; providing on the basis of the determined difference user guidance to the person enabling the person to adapt the body part posture in direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum. The desired body part posture may take into account difficult backlight conditions. The desired body part posture may be determined dynamically, for example in accordance to captured three dimensional image data. The desired body part posture may take into account particular factors such as the location of light sources, the brightness of light sources, etc. Without the need of further aids such as a mechanical guide, the user is enabled to adapt the body part posture in accordance to the desired body part posture and quality of captured image data is improved.
Capturing image data in the visible light spectrum and/or capturing image data in the near infrared spectrum may be optimized on the basis of three dimensional image data captured by the time of flight camera. For example, optimization may relate to focal length and/or depth of field of the visible light sensor. For example, optimization may relate to focal length and/or depth of field of the near infrared light sensor. For example, in case of capturing image data of a palm side or back side of a hand, optimization may enable that wrinkles or skin folds may be captured with detailed resolution, for example complementing the lifelines of the palm side of the hand with further details. Optimization on the basis of three dimensional image data may relate to determining a desired body part posture having a predefined distance from the visible light sensor and/or near infrared light sensor, thereby taking into account focal length and/or depth of field. Accordingly, no expensive lenses such as sufficiently fast electrically adaptable liquid lenses are required. Furthermore, no real-time computationally complex frequency analysis of the image data in the visible light or near infrared light spectrum is required. Moreover, three dimensional image data captured by the time of flight camera enables measuring precisely the absolute dimensions of the body part, such as a hand.
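As a hedged illustration of such an optimization, the offset between the measured hand distance and a focal sweet spot can be estimated directly from the time of flight depth map. The target distance, the tolerance, and the convention that background pixels are reported as 0 are assumptions of this sketch, not values from the disclosure:

```python
import statistics

def distance_adjustment(depth_map_mm, target_mm=250, tolerance_mm=10):
    """Estimate how far the hand is from the focal sweet spot of the 2D
    sensors using the time of flight depth map (millimetres); background
    pixels are assumed to be reported as 0 and are ignored."""
    hand_pixels = [d for row in depth_map_mm for d in row if d > 0]
    current_mm = statistics.median(hand_pixels)  # robust against depth noise
    offset_mm = current_mm - target_mm
    if abs(offset_mm) <= tolerance_mm:
        return "in focus", offset_mm
    return ("move closer", offset_mm) if offset_mm > 0 else ("move away", offset_mm)
```

The returned direction can drive the user guidance of the imaging procedure; the median keeps single noisy depth pixels from flipping the advice.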
In an embodiment, the current and desired body part posture relate to one or more of a relative distance, a relative orientation, and a gesture. In some embodiments, relative distance and/or relative orientation are defined with respect to the visible light sensor and/or with respect to the near infrared light sensor.
In an embodiment, user guidance relates to one or more of adapting a relative distance, a relative orientation, and a gesture of the current body part posture. The relative distance may relate to the distance between the biometrics imaging device and the body part. The relative orientation may relate to the relative orientation between the biometrics imaging device and the body part posture. The relative distance and/or orientation may depend on properties of the environment of the biometrics imaging device, such as backlight, reflecting surfaces, etc. The gesture may relate to movements of the body part such as the stretching or spreading of fingers of a hand.
In an embodiment, user guidance includes one or more of a visual guidance displayed on a display and acoustic guidance played back on a loudspeaker. Acoustic guidance may be less preferable because the language of the person may not be known. Acoustic guidance may include generic signals such as warning signals, information signals, etc. Visual guidance may include representations of the body part, which may include accentuations such as coloured parts.
In an embodiment, user guidance includes displaying on a display a representation of the desired body part posture and a representation of the current body part posture. The person is enabled to adapt the body part posture more precisely.
In an embodiment, user guidance includes displaying on a display a representation of a spirit level indicating the difference between the current body part posture and the desired body part posture. The person is enabled to adapt the body part posture more precisely.
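One possible way to derive the inclination indicated by such a spirit level is to compute the normal of the hand plane from three reference points taken from the three dimensional image data. This is a sketch under assumptions: the choice of reference points (e.g. wrist and two fingertips) and the sign convention are not specified by the disclosure:

```python
import math

def tilt_angles(p0, p1, p2):
    """Inclination of the plane through three 3D points on the hand,
    e.g. wrist and two fingertips, as two angles in degrees; (0.0, 0.0)
    means the hand plane is parallel to the sensor plane."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    # normal vector of the hand plane via the cross product u x v
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    if n[2] < 0:  # orient the normal towards the camera
        n = [-c for c in n]
    pitch = math.degrees(math.atan2(n[1], n[2]))  # tilt around the x axis
    roll = math.degrees(math.atan2(n[0], n[2]))   # tilt around the y axis
    return pitch, roll
```

The two angles can be rendered as the bubble position of the displayed spirit level 403.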
In an embodiment, the biometrics imaging device is further configured to repeat one or more steps of the imaging procedure more than once. The person is enabled to adapt the body part stepwise, thereby increasing precision.
In an embodiment, the biometrics imaging device is further configured that in case the determined difference is within the predefined range a delay of less than 100 milliseconds, preferably less than 10 milliseconds, is maintained between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum. Image data relevant for biometrics features is captured as soon as the body part posture is in the desired posture.
In an embodiment, the biometrics imaging device is further configured to determine a region of interest of the body part on the basis of at least one of three dimensional image data, image data in the visible light spectrum and image data in the near infrared light spectrum, and to adapt in accordance to the region of interest at least one of the desired body part posture and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum. For example, the near infrared light spectrum may not include a vein pattern in case of a palm of a hand of a woman and in this case the desired posture may be changed to the back of the hand of the woman. For example, a rectangular region of interest of a part of the palm of a hand may not include sufficient biometric features and in this case the desired body part posture and/or capturing image data may be changed to enable capturing image data of a larger area of the palm of the hand, for example also including all the fingers.
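A minimal stand-in for determining such a region of interest is an axis-aligned bounding box over a segmented body part mask; the binary-mask input format is an assumption made for illustration:

```python
def region_of_interest(mask):
    """Axis-aligned bounding box (row_min, row_max, col_min, col_max) of
    the body part in a binary mask (1 = body part pixel), as a simple
    stand-in for the region of interest determination described above."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for c in range(len(mask[0])) if any(row[c] for row in mask)]
    if not rows:
        return None  # no body part visible in the mask
    return rows[0], rows[-1], cols[0], cols[-1]
```

If the resulting region contains too few biometric features, the desired body part posture or the capture parameters can be adapted to cover a larger area, as described above.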
At least one objective of the invention is also achieved by a biometrics imaging method for capturing image data of a body part of a person, wherein at least one of a visible light sensor for capturing image data of the body part in the visible light spectrum and a near infrared light sensor for capturing image data of the body part in the near infrared light spectrum is provided. The method comprises: providing a time of flight camera configured for capturing three dimensional image data of the body part of the person; and executing an imaging procedure which includes the steps of: capturing three dimensional image data of a current body part posture; determining on the basis of the three dimensional image data a difference between a desired body part posture and the current body part posture; providing on the basis of the determined difference user guidance to the person enabling the person to adapt the body part posture in direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
In an embodiment, user guidance relates to one or more of adapting a relative distance, a relative orientation, and a gesture of the current body part posture.
In an embodiment, user guidance includes displaying on a display a representation of the desired body part posture and a representation of the current body part posture.
In an embodiment, the biometrics imaging method further comprises: repeating one or more steps of the imaging procedure more than once.
In an embodiment, the biometrics imaging method further comprises maintaining in case the determined difference is within the predefined range a delay of less than 100 milliseconds, preferably less than 10 milliseconds, between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum.
In an embodiment, the biometrics imaging method further comprises determining a region of interest of the body part on the basis of at least one of three dimensional image data, image data in the visible light spectrum and image data in the near infrared light spectrum, and adapting in accordance to the region of interest at least one of the desired body part posture and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
Brief Explanation of the Figures
The invention is described in greater detail below with reference to embodiments that are illustrated in the figures. The figures show:
Fig. 1 illustrates schematically the palm of the left hand of a first person;
Fig. 2 illustrates schematically the palm of the right hand of a second person;
Fig. 3 illustrates schematically the venous network of the back of the right hand 3 of a third person;
Fig. 4 illustrates schematically the hand of a person and a biometrics authentication device;
Fig. 5 illustrates schematically a time of flight camera;
Fig. 6 illustrates schematically a biometrics imaging device installed in a building;
Fig. 7a, 7b, 7c, 7d illustrate schematically user guidance relating to an imaging procedure executed by the biometrics imaging device; and
Fig. 8 illustrates schematically a method of capturing image data of a hand of a user.
Embodiments of the Invention
Figure 1 illustrates schematically the palm of the left hand 1 of a first person. The left hand 1 has a thumb t, an index finger i, a middle finger m, a ring finger r, and a little finger l. Figure 2 illustrates schematically the palm of the right hand 2 of a second person. The right hand 2 has a thumb t, an index finger i, a middle finger m, a ring finger r, and a little finger l.
Figure 1 and Figure 2 illustrate schematically images of the palms of the left and the right hand 1, 2 captured with a visible light sensor (e.g. 400nm to 600nm). The hands 1, 2 have palm prints P or lifelines, which can be identified in visible light. Additionally or alternatively, vein patterns of the hands 1, 2 can be determined from image data captured in near infrared light (e.g. 700nm to 900nm). Figure 1 and Figure 2 do not illustrate vein patterns.
As is illustrated in Figure 1 and Figure 2, the palm prints P or lifelines of the hands 1, 2 of these two persons include individual biometric features, such as particular lengths, positions, curvatures, etc. By comparison with biometric features which have been pre-stored from body parts of registered persons, authentication of a particular person is enabled, in particular in combination with biometric features determined from respective vein patterns. Furthermore, authentication of a person may also be based on biometric features of the back of the hand determined from image data captured in visible light, in near infrared light, or a combination thereof. However, it is presently not known if biometric features of the back of the hand determined from image data captured with a visible light sensor sufficiently enable authentication of a person. In case of relying on the back of a hand, it is presently believed that image data captured with a near infrared light sensor is necessary for sufficiently enabling authentication of a person. Figure 3 illustrates schematically the venous network of the back of the right hand 3 of a third person. The right hand 3 has a thumb t, an index finger i, a middle finger m, a ring finger r, and a little finger l. As illustrated in Figure 3, the back of the hand 3 includes veins, which include the dorsal venous network 31 (rete venosum dorsale manus) and the dorsal metacarpal veins 32 (Vv. metacarpales dorsales). Vein patterns can be determined from image data captured with a near infrared light sensor, and individual biometric features can be determined from the image data captured in near infrared light.
Figure 4 illustrates schematically a biometrics imaging device 80, which may be part of or provide a biometrics authentication device. The biometrics imaging device 80 includes a biometric sensor 10 and a processing unit 20. The biometrics imaging device 80 may be connected to a user display 40, for example, for providing user guidance. The processing unit 20 may be attached to the biometric sensor 10 as illustrated in Figure 4. The processing unit 20 may be located remotely within a computing infrastructure such as a host computer, server, cloud, etc. The processing unit 20 may include one or more processors and may have stored computer instructions which may be executed by the one or more processors in order to enable the functions as described in the present disclosure. The user display 40 may be fixedly installed close to the biometric sensor 10. The user display 40 may relate to a user device such as a notebook, smartphone, smartwatch, etc., wherein the processing unit 20 may communicate via a wireless connection such as Bluetooth with the user display 40, for example. The biometrics imaging device 80 may be included in a user device such as a notebook, smartphone, smartwatch, etc. As illustrated in Figure 4, a current posture of the user’s hand 401 and a desired posture of the user’s hand 402 may be displayed on display 40.
The biometric sensor 10 enables capturing image data of the hand 4 of a person. The biometric sensor 10 includes a visible light sensor 101 for capturing image data in the visible light spectrum, a near infrared light sensor 102 for capturing image data in the near infrared light spectrum, and a time of flight camera 103 for capturing image data having three dimensions. One or more of the visible light sensor 101 , the near infrared light sensor 102 and the time of flight camera 103 may be included into a single sensor.
Furthermore, the biometric sensor 10 includes light sources 104. Figure 4 illustrates eight light sources 104 arranged on a circle around the sensors 101 , 102 and the time of flight camera 103. The light sources 104 may include a different number of light sources and/or may be arranged in a different manner. The light sources 104 may include customized lenses in order to achieve a homogeneous light distribution. The light sources 104 may include one or more light sources providing illumination in the visible light spectrum and enabling capturing image data with the visible light sensor 101 in the visible light spectrum. The light sources 104 may include one or more light sources providing illumination in the near infrared light and enabling capturing image data with the near infrared light sensor 102 in the near infrared light. Calibration may be provided in particular as regards the geometric location of the visible light sensor 101 , the near infrared light sensor 102 and the time of flight camera 103, such as the translational displacement between the visible light sensor 101 , the near infrared light sensor 102 and the time of flight camera 103. Moreover, calibration may be provided as regards a scaling factor of image data captured by the time of flight camera 103, such as the absolute size of objects in the captured image data. Calibration may be provided within the biometric sensor 10, by post-processing in a dedicated computer such as the processing unit 20, or a combination thereof. Calibration may provide that the objects in the image data captured by the visible light sensor 101 , the near infrared light sensor 102 and the time of flight camera 103 align to each other.
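The effect of such a calibration can be illustrated with a minimal coordinate mapping; treating the inter-sensor relation as a pure translation plus scaling (i.e. assuming negligible rotation between the rigidly mounted sensors) is a simplification made for this sketch:

```python
def align_point(tof_point, displacement, scale=1.0):
    """Map a 3D point from the time of flight camera's coordinate frame
    into a 2D sensor's frame using a calibrated translational displacement
    and scaling factor; rotation between the rigidly mounted sensors is
    assumed to be negligible in this sketch."""
    return tuple(scale * c + d for c, d in zip(tof_point, displacement))
```

Applying such a mapping per point is what lets objects in the image data of the visible light sensor 101, the near infrared light sensor 102 and the time of flight camera 103 align to each other.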
The visible light sensor 101 may include a visible light sensitive chip providing 2D image data (2D: two dimensional) in accordance to a visible light intensity distribution generated by a 3D scene (3D: three dimensional). The near infrared light sensor 102 may include a near infrared light sensitive chip providing 2D image data in accordance to a near infrared light intensity distribution generated by a 3D scene. The visible light sensor 101 and the near infrared light sensor 102 may include lenses, buffers, controllers, processing electronics, etc. The visible light sensor 101 and the near infrared light sensor 102 may relate to commercially available sensors such as the e2v semiconductors SAS EV76C570 CMOS image sensor, equipped with a blocking optical filter <500nm wavelength for the visible light sensor 101 and with a blocking optical filter of >700nm for the near infrared light sensor 102, or such as the OmniVision OV4686 RGB-Ir sensor, with the visible light sensor 101 and the near infrared light sensor 102 combined in one chip and having included an RGB-Ir filter. The light sources 104 may include a visible light and/or near infrared light generator such as an LED (LED: light emitting diode). The light sources 104 may relate to commercially available light sources such as high power LEDs of the SMB1N series from Roithner Laser Technik GmbH, Vienna.
Figure 5 illustrates schematically a time of flight camera 103. The time of flight camera 103 includes a sequence controller 1031, a modulation controller 1032, a pixel matrix 1033, an A/D converter 1034 (A/D: analogue to digital), an LED or VCSEL 1035 (LED: light emitting diode; VCSEL: vertical-cavity surface-emitting laser), and a lens 1036. The sequence controller 1031 controls the modulation controller 1032 and the A/D converter 1034. The modulation controller 1032 controls the LED or VCSEL 1035 and the pixel matrix 1033. The pixel matrix 1033 provides signals to the A/D converter 1034. The sequence controller 1031 interacts with a host controller 1037, for example via an I2C bus (I2C: I-Squared-C serial data bus). The LED or VCSEL 1035 illuminates a 3D scene 1038. After a time of flight, the lens 1036 receives light reflected by the 3D scene 1038. The A/D converter 1034 provides raw 3D image data (3D: three dimensional) to the host controller 1037, for example via MIPI CSI-2 or PIF (MIPI: Mobile Industry Processor Interface; CSI: Camera Serial Interface; PIF: Parallel InterFace). The host controller 1037 performs a depth map calculation and provides an amplitude image 103a of the 3D scene 1038 and a depth image 103d of the 3D scene. As illustrated in Figure 5, the background of the amplitude image 103a includes shades of light of a wall behind a person, for example, while the background of the depth image 103d has a single value, such as black, because the wall behind the person is arranged at a specific distance from the time of flight camera 103.
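The depth map calculation of a continuous-wave time of flight pixel is commonly based on four phase-shifted correlation samples. The following generic four-bucket demodulation formula is shown for illustration only; the 20 MHz modulation frequency is an assumption, not a specification of any particular sensor:

```python
import math

C_LIGHT = 299_792_458.0  # speed of light in m/s

def tof_distance(a0, a90, a180, a270, f_mod_hz=20e6):
    """Distance for one pixel of a continuous-wave time of flight sensor
    from four phase-shifted correlation samples (generic four-bucket
    demodulation; the 20 MHz modulation frequency is an assumption)."""
    phase = math.atan2(a270 - a90, a0 - a180)  # demodulated phase offset
    if phase < 0:
        phase += 2 * math.pi
    # the light travels to the scene and back, hence the factor 4*pi
    return C_LIGHT * phase / (4 * math.pi * f_mod_hz)
```

At 20 MHz modulation the unambiguous range is c/(2f) ≈ 7.5 m, which comfortably covers the close-range hand postures at issue here.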
The time of flight camera 103 may relate to a REAL3™ of the company Infineon™, and may include the specifications: direct measurement of depth and amplitude in every pixel; highest accuracy; lean computational load; active modulated infra-red light and patented Suppression of Background Illumination (SBI) circuitry in every pixel; full operation in any light condition: darkness and bright sunlight; monocular system architecture having no mechanical baseline; smallest size and high design flexibility; no limitation in close range operation; no special requirements on mechanical stability; no mechanical alignment and angle correction; no recalibration or risk of de-calibration due to drops, vibrations or thermal bending; easy and very fast once-in-a-lifetime calibration; cost efficient manufacturing.
Figure 6 illustrates schematically a biometrics imaging device 80 installed in a building 6 having an entrance 61. Access to the building 6 is controlled by a biometrics authentication device, which includes a biometrics imaging device 80. As illustrated in Figure 6, the biometrics imaging device 80 is installed at a location close to the entrance 61. The biometrics imaging device 80 is configured to capture image data from a body part such as a hand 4 of a person requesting access to the building 6. Backlight, such as effected by light sources 62 installed for the purpose of providing comfortable lighting conditions, may severely deteriorate the quality of the image data captured by the biometrics imaging device 80.
In order to capture image data with improved quality, the biometrics imaging device 80 is configured to execute an imaging procedure which includes the steps of: capturing three dimensional image data of a current posture of the hand 4; determining on the basis of the three dimensional image data a difference between a desired posture of the hand 4 and the current posture of the hand 4; providing on the basis of the determined difference user guidance 401 , 402 to the person enabling the person to adapt the posture of the hand 4 in direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum of the hand 4.
Figures 7a, 7b, 7c, 7d illustrate schematically user guidance relating to an imaging procedure executed by the biometrics imaging device 80. User guidance is displayed on a display 40. The display 40 may be fixedly installed at a location close to the biometrics device 80. The display 40 may relate to a user device such as a tablet computer, a smartphone, etc.
As illustrated in Figure 7a, a representation of a desired posture 402 of the hand 4 is displayed. Displaying the representation of the desired posture 402 is based on three dimensional image data captured with the time of flight camera 103 and may be calibrated in accordance to the hand 4. Calibration in accordance to the hand 4 is important for enabling proper guidance of hands of different sizes, such as a hand of a man, of a woman, or of a child. As illustrated in Figure 7a, a representation of the current posture 401 of the hand 4 is displayed. Displaying the representation of the current posture 401 is based on three dimensional image data captured with the time of flight camera 103.
In the example according to Figure 7a, the current posture of the hand 4 is too far away from the biometrics imaging device 80, which is indicated by displaying the representation of the current posture 401 at a smaller size than the representation of the desired posture 402. Accordingly, the person is enabled to adapt the posture of the hand 4 in direction of the desired posture, namely to move the hand closer to the biometrics imaging device 80.
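The size cue used here follows from perspective projection: the apparent size of the hand is inversely proportional to its distance from the camera. A minimal sketch of how the display size of the current-posture representation could be derived (the function name and pixel/millimetre units are illustrative assumptions):

```python
def display_size(reference_size_px, desired_distance_mm, current_distance_mm):
    """Pixel size at which to draw the current-posture representation: equal to
    the desired-posture representation when the hand is at the desired distance,
    smaller when the hand is farther away (pinhole-camera size/distance law)."""
    return reference_size_px * desired_distance_mm / current_distance_mm
```

For example, a hand at twice the desired distance is drawn at half the reference size, signalling the user to move the hand closer.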
In the example according to Figure 7b, the current posture of the hand 4 is still too far away from the biometrics imaging device 80, which is indicated by displaying the representation of the current posture 401 at a smaller size than the representation of the desired posture 402. However, compared to the example according to Figure 7a, the distance is reduced and further information may be displayed, such as a spirit level 403, in order to provide further guidance for adapting the posture of the hand 4. The spirit level 403, which is displayed on the back of the representation of the hand at the current posture 401, provides user guidance as regards the difference between the current inclination of the hand and the desired inclination of the hand. Accordingly, the person is enabled to adapt the posture of the hand 4 in the direction of the desired posture, namely to move the hand even closer to the biometrics imaging device 80 and to adapt the inclination of the hand.

In the example according to Figure 7c, the current posture of the hand 4 is approximately at the desired distance from the biometrics imaging device 80, which is indicated by displaying the representation of the current posture 401 at approximately the same size as the representation of the desired posture 402. However, the posture of the hand 4 does not yet have the correct inclination, which is indicated by the spirit level 403 displayed together with the representation of the hand at the current posture 401. Accordingly, the person is enabled to adapt the posture of the hand 4 in the direction of the desired posture, namely to adapt the inclination of the hand further.
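The inclination indicated by the spirit level 403 could be estimated from the time of flight point cloud by fitting a plane to the palm points and measuring the angle between the plane normal and the optical axis. The patent does not prescribe a particular algorithm, so the following is only one plausible sketch:

```python
import numpy as np

def palm_tilt_degrees(points):
    """Inclination of the hand relative to the sensor plane, estimated by a
    least-squares plane fit to the 3D palm points: the plane normal is the
    right singular vector with the smallest singular value, and the tilt is
    the angle between that normal and the optical (z) axis."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]  # unit vector along the smallest-variance direction
    cos_angle = np.clip(abs(normal[2]), 0.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))
```

A tilt of zero then corresponds to a centred bubble; the magnitude of the tilt (and, using the full normal vector, its direction) can drive the bubble offset shown to the user.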
In the example according to Figure 7d, the difference between the current posture of the hand 4 and the desired posture of the hand 4 is at a sufficiently small level. The biometrics imaging device 80 captures at least one of image data in the visible light spectrum and image data in the infrared light spectrum. Because the posture of the hand 4 is in conformance with a desired posture, which may in particular take into account backlight conditions, optimal focus conditions, etc., captured image data has improved quality.
Figure 8 illustrates schematically a method of capturing image data of a hand 4 of a person. The method includes an imaging procedure which comprises the following steps. In step S1, three dimensional image data of a current posture of the hand 4 is captured using a time of flight camera 103. In step S2, a difference between a desired posture of the hand 4 and the current posture of the hand 4 is determined on the basis of the three dimensional image data. In step S3, user guidance 401, 402 is provided to the person on the basis of the determined difference, enabling the person to adapt the posture of the hand 4 in the direction of the desired posture. In step S4, at least one of image data in the visible light spectrum and image data in the infrared light spectrum of the hand 4 is captured. For example, steps S1 to S3 are repeated continuously until it is determined in step S2 that the difference is smaller than a predefined threshold, whereupon step S2 is followed by step S4.
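The repetition of steps S1 to S3 until the threshold condition in step S2 is met can be sketched as a simple loop, with sensor access and guidance rendering abstracted into caller-supplied functions (all names here are illustrative, not from the disclosure):

```python
def imaging_procedure(capture_3d, posture_difference_fn, show_guidance,
                      capture_image, threshold, max_iterations=1000):
    """Repeat S1-S3 until the posture difference falls below the threshold,
    then perform the biometric capture S4. Returns None if the desired
    posture is never reached within max_iterations."""
    for _ in range(max_iterations):
        current = capture_3d()                       # S1: 3D posture from ToF camera
        difference = posture_difference_fn(current)  # S2: compare with desired posture
        if difference < threshold:
            return capture_image()                   # S4: visible and/or NIR capture
        show_guidance(difference)                    # S3: guide the user
    return None
```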
Reference Signs
1, 2, 3 hand of first, second and third person
t, i, m, r, l thumb, index finger, middle finger, ring finger, little finger
P palm print or lifelines
31, 32 dorsal venous network, dorsal metacarpal veins
4 hand of a person
10 biometric sensor
101 visible light sensor
102 near infrared light sensor
103 time of flight camera
104 light sources
20 processing unit
80 biometrics imaging device
40 display
401, 402 current posture of user’s hand, desired posture of user’s hand
403 spirit level

Claims
1. A biometrics imaging device (10) for capturing image data of a body part (4) of a person, the biometrics imaging device (10) comprising at least one of a visible light sensor (101) for capturing image data of the body part in the visible light spectrum and a near infrared light sensor (102) for capturing image data of the body part in the near infrared light spectrum, wherein: the biometrics imaging device (10) comprises a time of flight camera (103) configured for capturing three dimensional image data of the body part (4) of the person; and the biometrics imaging device (10) is configured to execute an imaging procedure which includes the steps of: capturing three dimensional image data of a current body part posture; determining on the basis of the three dimensional image data a difference between a desired body part posture and the current body part posture; providing on the basis of the determined difference user guidance (401, 402) to the person enabling the person to adapt the body part posture in the direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
2. The biometrics imaging device (10) of the preceding claim, wherein the current and desired body part posture relate to one or more of a relative distance, a relative orientation, and a gesture.
3. The biometrics imaging device (10) of one of the preceding claims, wherein user guidance (401, 402) relates to one or more of adapting a relative distance, a relative orientation, and a gesture of the current body part posture.
4. The biometrics imaging device (10) of one of the preceding claims, wherein user guidance (401, 402) includes one or more of visual guidance displayed on a display (40) and acoustic guidance played back on a loudspeaker.
5. The biometrics imaging device (10) of one of the preceding claims, wherein user guidance (401, 402) includes displaying on a display (40) a representation of the desired body part posture (402) and a representation of the current body part posture (401).
6. The biometrics imaging device (10) of one of the preceding claims, wherein user guidance (402) includes displaying on a display (40) a representation of a spirit level (403) indicating the difference between the current body part posture and the desired body part posture.
7. The biometrics imaging device (10) of one of the preceding claims, further configured to repeat one or more steps of the imaging procedure more than once.
8. The biometrics imaging device (10) of one of the preceding claims, further configured such that, in case the determined difference is within a predefined range, a delay of less than 100 milliseconds, preferably less than 10 milliseconds, is maintained between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum.
9. The biometrics imaging device (10) of one of the preceding claims, further configured to determine a region of interest of the body part on the basis of at least one of three dimensional image data, image data in the visible light spectrum and image data in the near infrared light spectrum, and to adapt in accordance with the region of interest at least one of the desired body part posture and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
10. A biometrics imaging method for capturing image data of a body part (4) of a person, wherein at least one of a visible light sensor (101) for capturing image data of the body part in the visible light spectrum and a near infrared light sensor (102) for capturing image data of the body part in the near infrared light spectrum is provided, the method comprising: providing a time of flight camera (103) configured for capturing three dimensional image data of the body part (4) of the person; and executing an imaging procedure which includes the steps of: capturing three dimensional image data of a current body part posture; determining on the basis of the three dimensional image data a difference between a desired body part posture and the current body part posture; providing on the basis of the determined difference user guidance to the person enabling the person to adapt the body part posture in the direction of the desired posture; and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
11. The biometrics imaging method of claim 10, wherein user guidance (401, 402) relates to one or more of adapting a relative distance, a relative orientation, and a gesture of the current body part posture.
12. The biometrics imaging method of claim 10 or 11, wherein user guidance (401, 402) includes displaying on a display (40) a representation of the desired body part posture (402) and a representation of the current body part posture (401).
13. The biometrics imaging method of one of the claims 10 to 12, further comprising repeating one or more steps of the imaging procedure more than once.
14. The biometrics imaging method of one of the claims 10 to 13, further comprising maintaining, in case the determined difference is within a predefined range, a delay of less than 100 milliseconds, preferably less than 10 milliseconds, between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum.
15. The biometrics imaging method of one of the claims 10 to 14, further comprising determining a region of interest of the body part on the basis of at least one of three dimensional image data, image data in the visible light spectrum and image data in the near infrared light spectrum, and adapting in accordance with the region of interest at least one of the desired body part posture and capturing at least one of image data in the visible light spectrum and image data in the infrared light spectrum.
PCT/EP2020/059718 2019-04-10 2020-04-06 Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality WO2020207947A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP20717828.6A EP3953858A1 (en) 2019-04-10 2020-04-06 Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality
US17/602,464 US11900731B2 (en) 2019-04-10 2020-04-06 Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality
CN202080026328.4A CN113678138A (en) 2019-04-10 2020-04-06 Biometric imaging apparatus and biometric imaging method for capturing image data of a body part of a person capable of improving image data quality
AU2020271607A AU2020271607A1 (en) 2019-04-10 2020-04-06 Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CH00486/19 2019-04-10
CH00486/19A CH716053A1 (en) 2019-04-10 2019-04-10 Biometric formation device and biometric formation method for acquiring image data of a body part of a person with user guidance.

Publications (1)

Publication Number Publication Date
WO2020207947A1 true WO2020207947A1 (en) 2020-10-15

Family

ID=67909234

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/059718 WO2020207947A1 (en) 2019-04-10 2020-04-06 Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality

Country Status (6)

Country Link
US (1) US11900731B2 (en)
EP (1) EP3953858A1 (en)
CN (1) CN113678138A (en)
AU (1) AU2020271607A1 (en)
CH (1) CH716053A1 (en)
WO (1) WO2020207947A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050286744A1 (en) 2004-06-28 2005-12-29 Yoshio Yoshizu Image capturing apparatus for palm authentication
US20060023919A1 (en) 2004-07-30 2006-02-02 Fujitsu Limited Guidance screen control method of biometrics authentication device, biometrics authentication device, and program for same

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060018523A1 (en) 2004-07-23 2006-01-26 Sanyo Electric Co., Ltd. Enrollment apparatus and enrollment method, and authentication apparatus and authentication method
JP4786483B2 (en) 2006-09-14 2011-10-05 富士通株式会社 Biometric guidance control method for biometric authentication device and biometric authentication device
JP5810581B2 (en) 2011-03-29 2015-11-11 富士通株式会社 Biological information processing apparatus, biological information processing method, and biological information processing program
US9679215B2 (en) * 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
JP6089610B2 (en) * 2012-11-13 2017-03-08 富士通株式会社 Biometric authentication apparatus, biometric authentication method, and biometric authentication computer program
JP6075069B2 (en) * 2013-01-15 2017-02-08 富士通株式会社 Biological information imaging apparatus, biometric authentication apparatus, and manufacturing method of biometric information imaging apparatus
ES2809249T3 (en) * 2013-02-25 2021-03-03 Commw Scient Ind Res Org 3D imaging method and system
JP6160148B2 (en) * 2013-03-19 2017-07-12 富士通株式会社 Biological information input device, biometric information input program, and biometric information input method
KR102332320B1 (en) * 2014-02-21 2021-11-29 삼성전자주식회사 Multi-band biometric camera system having iris color recognition
US10248192B2 (en) * 2014-12-03 2019-04-02 Microsoft Technology Licensing, Llc Gaze target application launcher
US11410458B2 (en) * 2018-04-12 2022-08-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Face identification method and apparatus, mobile terminal and storage medium
CH716052A1 (en) 2019-04-10 2020-10-15 Smart Secure Id Ag Biometric authentication device and biometric authentication method for authenticating a person with reduced computing complexity.

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SAMOIL STEVEN ET AL: "Multispectral Hand Biometrics", 2014 FIFTH INTERNATIONAL CONFERENCE ON EMERGING SECURITY TECHNOLOGIES, IEEE, 10 September 2014 (2014-09-10), pages 24 - 29, XP032703228, DOI: 10.1109/EST.2014.10 *
SVOBODA JAN ET AL: "Contactless biometric hand geometry recognition using a low-cost 3D camera", 2015 INTERNATIONAL CONFERENCE ON BIOMETRICS (ICB), IEEE, 19 May 2015 (2015-05-19), pages 452 - 457, XP033166138, DOI: 10.1109/ICB.2015.7139109 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022100971A1 (en) 2020-11-11 2022-05-19 Palmpay Ag Method and system for biometric authentication for large numbers of enrolled persons
EP4002166A1 (en) 2020-11-11 2022-05-25 PalmPay AG Method and system for biometric authentication for large numbers of enrolled persons

Also Published As

Publication number Publication date
CN113678138A (en) 2021-11-19
US20220207922A1 (en) 2022-06-30
AU2020271607A1 (en) 2021-10-28
US11900731B2 (en) 2024-02-13
CH716053A1 (en) 2020-10-15
EP3953858A1 (en) 2022-02-16

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20717828; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 2020271607; Country of ref document: AU; Date of ref document: 20200406; Kind code of ref document: A
ENP Entry into the national phase
    Ref document number: 2020717828; Country of ref document: EP; Effective date: 20211110