WO2018073335A1 - System and method for contactless biometric authentication - Google Patents

System and method for contactless biometric authentication Download PDF

Info

Publication number
WO2018073335A1
Authority
WO
WIPO (PCT)
Prior art keywords
body part
biometric
unit
characteristic data
pattern
Prior art date
Application number
PCT/EP2017/076685
Other languages
English (en)
French (fr)
Inventor
Johan Bergqvist
Original Assignee
Smart Secure Id In Sweden Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smart Secure Id In Sweden Ab filed Critical Smart Secure Id In Sweden Ab
Publication of WO2018073335A1 publication Critical patent/WO2018073335A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147Details of sensors, e.g. sensor lenses
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • G06V40/1312Sensors therefor direct reading, e.g. contactless acquisition
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/557Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145Illumination specially adapted for pattern recognition, e.g. using gratings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1382Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1382Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
    • G06V40/1394Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/60Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V40/67Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/12Acquisition of 3D measurements of objects
    • G06V2201/121Acquisition of 3D measurements of objects using special illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14Vascular patterns

Definitions

  • the present invention relates to the field of biometric authentication and, more specifically, relates to a system and method for contactless biometric authentication, wherein the method of authentication also checks for the liveliness of the individual to be authenticated.
  • Biometric authentication methods use the inherent and unique characteristics of a given body part. Human beings consist of complex organic systems with a high degree of uniqueness. The organic feature most commonly used for authentication is the fingerprint; other methods use the iris of the eye, the blood-vessel patterns hidden under the skin of the hand, or the face, and are typically based on image processing and recognition. Yet other methods use the voice.
  • One of the most commonly used biometric authentication methods is based on fingerprints, but it carries a high risk of circumvention by dummy fingerprints. A further issue with fingerprint authentication is that the user must touch a surface, which raises hygiene concerns when the same authentication device is used by many people, and accumulated dirt or grease on the surface may cause the device to malfunction.
  • the biometric authentication system for detecting biometric characteristic data from a body part of an individual and verifying the detected biometric characteristic data against registered biometric characteristic data already obtained from said body part to authenticate the individual, comprises: an illumination unit for illuminating the body part; an image capturing unit to capture an image of the illuminated body part; a processing unit for extracting the biometric characteristic data from the image captured by the image capturing unit; and an authentication unit for verifying the extracted biometric data against registered characteristic data to authenticate the individual.
  • the illumination unit of the system is provided with at least one pattern mask in order to project a mask pattern onto the body part such that the projected mask pattern is included in the captured image.
  • The projected mask pattern, which preferably consists of identically shaped polygons of the same area, can be used to find the optimal distance between the body part and the illumination or capturing unit in an entirely contact-free manner, by comparing the average size of the individual pattern shapes projected onto the body part with the size of the pattern shapes obtained during a previous calibration step.
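A minimal sketch of this distance check, assuming a simple divergent-projection model in which the projected polygon area grows quadratically with distance from the illumination unit; the patent does not specify the optical model, and the function and parameter names are illustrative:

```python
import math

def estimate_distance(mean_area, calib_area, calib_distance):
    """Estimate the body-part distance from the mean area of the projected
    polygons, assuming the area scales as A(d) = A_calib * (d / d_calib)**2."""
    return calib_distance * math.sqrt(mean_area / calib_area)

def at_optimal_distance(mean_area, calib_area, tolerance=0.1):
    """The body part is at the calibrated distance when the measured mean
    polygon area matches the calibration area within a relative tolerance."""
    return abs(mean_area - calib_area) / calib_area <= tolerance
```

Under this model, if the mean polygon area on the palm is four times the calibrated area, the palm is estimated to be at twice the calibration distance.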
  • the same mask pattern projected onto the body part and therefore included in the captured image can also be used to define a region of interest of the body part and to extract the characteristic biometric data in that individual pattern shapes are identified, which include structures of interest of the body part (e.g. palm print and palm vein).
  • the same mask pattern can be used to perform a liveliness test based on the unevenness of the body part. Unevenness of the body part leads to a distortion of the mask pattern. This distortion can hardly be reconstructed by a dummy body part and can therefore be used as an indication of liveliness of an individual to be authenticated.
  • The pattern mask may have a regular pattern of polygons, preferably triangles, tetragons, squares, pentagons or hexagons, i.e. polygons of identical shape and area.
  • The illumination unit may comprise at least one light source of visible light and/or at least one light source of near-infrared (NIR) light, wherein the light sources preferably comprise a plurality of light-emitting diodes (LEDs).
  • the light sources or LEDs may be placed in a circular manner around the capturing unit. Different light sources may be placed alternately.
  • The illumination unit comprises light sources for visible and NIR light to detect e.g. palm prints/creases and palm veins, respectively. That way the authentication analysis can be based on polygons in which both palm print and palm vein are present.
  • the biometric authentication system further may comprise a visual guidance unit being able to determine the optimal distance between body part and capture unit based on the mask pattern projected onto the body part.
  • The visual guidance unit may indicate the optimal distance by e.g. a green light. If the distance to the illumination/capturing unit is too small or too large, a red light may be lit. Additionally, there may be a gradual change from red to yellow to green when approaching the optimal distance, so the user knows whether the hand is moving in the right direction.
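The red/yellow/green indication could be driven directly by the estimated distance error; the band widths below are hypothetical values for illustration, as the patent only specifies the red-to-yellow-to-green progression:

```python
def guidance_colour(distance_mm, optimal_mm, green_band_mm=10.0, yellow_band_mm=30.0):
    """Map the estimated palm distance to a guidance-light colour.

    Band widths are hypothetical; the patent only describes the
    red -> yellow -> green progression towards the optimal distance."""
    error = abs(distance_mm - optimal_mm)
    if error <= green_band_mm:
        return "green"   # optimal distance reached
    if error <= yellow_band_mm:
        return "yellow"  # approaching the optimal distance
    return "red"         # too close or too far
```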
  • the capturing unit may comprise a micro lens array for obtaining depth information of the body part.
  • The micro lens array is an optional feature, as the mask pattern projected onto the body part already includes certain depth information due to the distortion of the pattern.
  • the invention further refers to a method for detecting biometric characteristic data from a body part of an individual and verifying the detected biometric characteristic data against registered characteristic data already obtained from said body part to authenticate the individual, comprising the steps of: illuminating the body part using an illumination unit provided with at least one pattern mask in order to project a mask pattern onto the body part; capturing an image of the illuminated body part including the projected mask pattern using an image capturing unit; extracting the biometric characteristic data from the image captured by the image capturing unit using a processing unit; verifying the extracted biometric data against registered characteristic data to authenticate the individual using an authentication unit.
  • the pattern mask may have a regular pattern of polygons, preferably triangle, tetragon, square, pentagon or hexagon, and is used to obtain the optimal distance between body part and capturing unit.
  • the mask pattern is used for guidance of the body part in order to obtain an optimal distance between body part and capturing unit. This can be done by comparing the average size/area of the polygons of the projected mask pattern to the size/area of the polygons obtained during a calibration step.
  • The illumination unit may illuminate the body part with visible light and NIR light in order to obtain combined images including the mask pattern and the structures of the body part detectable under visible light (e.g. palm print) and NIR light (e.g. palm vein), respectively.
  • The combined images are then used for extraction of the characteristic biometric data.
  • The mask pattern may also be used for extracting the characteristic biometric data. For example, the mask pattern divides the image into several small areas, which can be analysed for the presence of characteristic biometric data (e.g. palm print and/or palm vein).
  • the invention further refers to a biometric authentication device for capturing images to perform the above described method.
  • the device comprises an illumination unit for illuminating the body part; an image capturing unit to capture an image of the illuminated body part; wherein the illumination unit is provided with at least one pattern mask in order to project a mask pattern onto the body part such that the projected mask pattern is included in the captured image.
  • the device further comprises a processing unit or means for connecting the device to a processing unit.
  • the processing unit extracts the biometric characteristic data from the image captured by the image capturing unit.
  • the device further comprises an authentication unit or means for connecting the device to an authentication unit.
  • the authentication unit verifies the extracted biometric data against registered characteristic data to authenticate the individual .
  • The authentication may be started or signalled by the palm moving up and down, down and up, forward/backward or sideways, or by any standalone yaw, pitch or roll motion.
  • The region of interest is defined as those areas where both palm crease and palm vein are present, which are captured under visible and NIR light, respectively, according to an embodiment of the invention.
  • A 3D model of the palm is generated by the stereoscopic effect when the hand moves up and down or in any yaw, roll or pitch motion, resulting in a series of images with different LED patterns at varying focal lengths, from which a 3D image of the palm and veins is generated.
  • This may also be used for liveness test.
  • A region of interest may be identified by the following steps. The first step is to identify the wrist position of the hand to be authenticated; the regions are then outlined radially from the wrist. In another scenario, after identifying the wrist position, the next step is to identify the thumb position and to radially outline the regions, connecting the thumb with the other finger joints and continuing through all the joints.
  • The palm is first positioned such that the side of the palm is visible to the sensor, i.e. the palmar and dorsal surfaces are perpendicular to the sensor.
  • The palmar surface is then rotated such that it is parallel to and facing the surface of the sensor.
  • While this rotation gesture is performed, the outermost visible points of the bending palmar area become the region of interest. This may also be used to perform a liveness test.
  • For a liveness test, the user is asked to close his palm into a fist, and the region-of-interest extraction is marked along the closing finger ridges on top to draw a square.
  • This action is used for liveness test whenever there is a risk of fake palm authentication.
  • a user can be asked to perform this gesture for liveness test randomly.
  • For a liveness test, the user is asked to close and open his fingers, and the region-of-interest extraction is marked along the ridges of the active fingers on top to draw a polygon.
  • This action is used for liveness test whenever there is a risk of fake palm authentication.
  • a user can be asked to perform this gesture for liveness test randomly.
  • The present invention provides a contactless biometric authentication system.
  • The present invention does not require the user to place his hand on a contact surface, which ensures better hygiene, as the user does not have to touch anything during the authentication process.
  • The invention further discloses various methods to verify the liveness of the authenticating subject. This helps to reduce fraudulent practices in biometric authentication using the hand or fingers.
  • Figure 1 illustrates a block diagram of a system for contactless biometric authentication.
  • Figure 2 illustrates a device including an illumination unit and a capturing unit.
  • Figure 3a illustrates an image captured using NIR light source without pattern mask.
  • Figure 3b illustrates an image captured using NIR light source with pattern mask.
  • Figure 4 illustrates a calibration system in which the optimal position of the hand is determined.
  • Figure 5a illustrates a manner in which a palm is positioned in front of the illumination unit and also the manner in which the light is scattered by the palm.
  • Figure 5b illustrates a function of a visual guidance unit.
  • Figure 6 illustrates different positions in which the palm may be moved.
  • Figure 7 illustrates a method of identification of region of interest of a palm.
  • Figure 8 illustrates a method in which features of a palm are extracted, in accordance with an embodiment of the invention.
  • Figure 9 illustrates various feature spots on the palmar area.
  • Figure 10 illustrates a neural network of plurality of neurons.
  • Figure 11 illustrates a mask pattern of hexagons (left) and a distorted mask pattern due to uneven body parts (right).
  • Figure 12 illustrates a possible way of defining pattern blocks using the pattern mask.
  • The present invention overcomes the drawbacks of the prior art by providing a contactless biometric authentication system, which authenticates and also checks the liveliness of the subject to be authenticated without using a 3D camera and projector system.
  • FIG. 1 illustrates a block diagram of a system for contactless biometric authentication, in accordance with an embodiment of the invention.
  • The system includes at least one illumination unit 110 comprising at least one light source of visible light and/or at least one light source of near-infrared (NIR) light.
  • the light source may be a plurality of light emitting diodes (LEDs).
  • The LEDs may be placed in a circular manner. Visible-light LEDs and NIR LEDs may be placed alternately.
  • the illumination unit 110 or its individual light sources are masked with at least one pattern 112.
  • the pattern preferably is defined as a regular group of polygons of the same area (see Figure 11, left).
  • The size of a polygon is in the range of 1 mm, or it has an area of approx. 1 mm².
  • Polygon patterns include e.g. triangles, tetragons, squares, pentagons or hexagons; other forms are also possible. Pattern masks on different LEDs may be placed at different angles to each other.
  • For example, when a pattern mask is placed on each of the two farthest LEDs in a circular arrangement, the known distance between them forms a triangulation base that can be used to calculate the optimal distance of the body part from the illumination or capturing unit, or to identify the surface area of the body part.
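One way to read this triangulation, assuming the two farthest pattern-masked LEDs act like a stereo baseline whose projected marks appear with a pixel disparity in the image, is the classic triangulation formula. This mapping is a simplifying assumption; the patent gives no formula:

```python
def triangulate_distance(baseline_mm, focal_px, disparity_px):
    """Classic triangulation distance: d = b * f / disparity.

    baseline_mm:  separation of the two farthest LEDs (the triangulation base),
    focal_px:     camera focal length in pixels,
    disparity_px: pixel offset between the two projected pattern marks."""
    return baseline_mm * focal_px / disparity_px
```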
  • The LED lights with different pattern masks are used to indicate at what optimal distance the hand must be placed above the sensor of the capturing unit. This arrangement also helps to define the region of interest by clearly distinguishing the hand from the background.
  • The system further includes an image capturing unit 114 to capture the image of the body part, e.g. a hand of the individual to be authenticated.
  • the image capturing unit 114 includes a camera and one or more lenses 116.
  • The capturing unit may include a micro lens array to capture depth information of the body part, which provides pictures with different focal lengths or surface deformity and thereby provides 3D information.
  • The system further includes a processing unit 118 to process the images and to extract the various biometric characteristic data, such as palm crease/print and/or palm vein, from the captured image.
  • the system also includes an authentication unit 120, wherein the authentication unit 120 verifies the biometric characteristic data extracted from the captured image against previously extracted and registered characteristic data to authenticate the individual .
  • the registered data may be stored in a storage unit of the authentication unit or in a separate storage unit.
  • FIG. 2 illustrates a device including an illumination unit 110 along with an image capturing unit 114.
  • the light sources are arranged in a circular ring .
  • the LEDs with NIR and visible light source are placed alternately.
  • the invention further discloses a method of contactless biometric authentication.
  • the authentication is performed by capturing images by means of image capturing unit 114.
  • The image capturing unit 114 may utilize a 2D camera (hereinafter referred to as the camera).
  • the camera takes the masked image of the hand from the masked light source.
  • When provided with a micro lens array, the camera may also take a depth image by reconstructing the image formed by the micro lens array (Figure 11). The micro lens array is used to extract different focal lengths of the same image and therefrom extract the depth image or surface-deformity information of the body part.
  • Figure 3a illustrates an image captured with a NIR light source without a pattern mask
  • Figure 3b illustrates the image captured with a pattern mask
  • The mask pattern appears as a geometric pattern, which can be pre-determined, e.g. on a plane surface.
  • a manual or auto calibration test may be performed for the camera to optimally visualise the region of interest without the presence of the body part by placing a plane surface at the desired distance from the illumination unit and capturing unit.
  • A visual guidance tool or unit indicates at what distance the hand should be placed, based on the predetermined geometric pattern size on the hand. The guidance unit is explained in further detail below.
  • Figure 4 illustrates a calibration system in which the optimal position of palm is determined based on the mask pattern projected onto the palm.
  • a contactless biometric authentication device 100 having at least the illumination unit 110 with a pattern mask 112 and the capturing unit is placed at a predetermined position against a stand 511 of a calibration device 500.
  • the stand 511 holds a prefixed plane plate 512 at a known distance in relation to the pre-determined position of the authentication device.
  • the image processing unit detects the mask pattern on a prefixed plane plate 512 at a known distance.
  • the image processing unit 118 detects the mask pattern and stores the pattern information (pattern area, pattern size, etc.) for future detection of optimal distance of a palm for authentication.
  • the image processing unit 118 detects the mask pattern on the palm, extracts the pattern information and compares it to the pattern information previously stored during calibration.
  • The visual guidance unit indicates at what height the palm needs to be placed based on the change in size of the mask pattern, which is compared to the calibration mask pattern previously stored.
  • If the distance is too small or too large, a red colour light appears, preferably projected onto the palm.
  • At the optimal distance, a green colour light appears (Figure 5b, right), preferably projected onto the palm. The user notices the colour reflected around the fingers, when there is a gap between the fingers, or around the entire palm.
  • The user notices the colour of the light reflected from the palm surface back towards the sensor area, viewing it from the side or profile view. Additionally, the colour may change gradually from red over yellow to green, indicating whether the palm is being moved in the right direction, away from or towards the capturing unit. In this way the correct position is easily found; e.g. when the colour changes from yellow to red, the user recognizes that he is moving the palm the wrong way.
  • Figure 5 illustrates on the left the manner in which a palm is positioned in front of the authentication device 100, and on the right the manner in which the guidance light 611 of the visual guidance unit is scattered by the palm.
  • This figure illustrates the scattered guidance light 613 from the palm.
  • The reflected light is seen around the fingers by the user. The user has to change the position of the hand until a green light is reflected on the palm; the green light indicates that the distance of the palm to the authentication device is optimal. Once the optimal position is identified, the illumination unit 110 is activated for the authentication.
  • The calibration based on the mask pattern allows the authentication device to function without any support for placing the hand at the optimal position. It is therefore completely contact-free.
  • a user has to enrol in the authentication system before use.
  • For enrolment, the user initially finds the optimal distance at which to hold his palm from the device such that the green light appears. Once the green light appears, the user moves the hand in different directions, such as left, right, front and back. The palm is moved in the plane across the various directions to capture various parts of the palm, so that any part of the palm may later be used for authentication. By doing this, a threshold on the minimum area for authentication may be set; the threshold indicates the minimum area of the palm required for an authentication to be accepted.
  • Figure 6 illustrates the different positions in which the palm is moved in accordance with an embodiment of the invention.
  • the image comprises pattern information and depth information due to the unevenness of the palm (explained in further detail below) captured by using micro lens array.
  • the image further comprises palm vein information taken under NIR light and palm print information taken under visible light.
  • the palm in Figure 6 shows only palm prints.
  • the palm in Figure 7 shows palm prints and palm veins.
  • Figure 7 illustrates the identification of a region of interest (ROI) for authentication, in accordance with an embodiment of the invention.
  • the palm vein ending points are identified for the entire palm.
  • The identified palm vein endings are tracked to the outermost points so as to form the region of interest.
  • The outermost vein endings, such as 201a, 201b, 201c, 201d, 201e, 201f and 201g, are marked.
  • The palm has many more vein endings, but in order to identify the region of interest only the outermost vein endings are considered.
  • the outermost end points of the vein 201a, 201b, 201c, 201d, 201e, 201f and 201g are tracked to form the region of interest 202.
  • the region of interest for obtaining the biometric characteristic data may be identified by the boundaries of the palm vein.
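One plausible realisation of "tracking the outermost vein endings to form the region of interest" is a convex hull of the ending points. The monotone-chain sketch below is an assumption about the tracing method, not the patent's stated algorithm:

```python
def roi_from_vein_endings(points):
    """Convex hull (Andrew's monotone chain) of vein-ending points (x, y).

    Returns the hull vertices in counter-clockwise order; interior vein
    endings are discarded, keeping only the outermost ones."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

For example, a square of outermost endings with one interior ending yields just the four corner points as the region-of-interest boundary.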
  • the mask pattern may help to define the region of interest by clearly identifying the hand from the rest of the background .
  • the size of the mask pattern on the palm varies depending on the deformity of the palm. The palm does not have a plane surface, such that the size of the mask pattern is slightly different in different regions of the palm depending on the distance to the illumination unit. The size of the mask pattern in the captured image can therefore also be used as depth information.
  • The mask pattern size in the centre of the palm is slightly larger than that in the area around the centre, meaning that the centre is slightly further away from the illumination unit than the surrounding areas.
  • the unevenness of the palm leads to a distortion of the mask pattern (see Figure 11, right).
  • This unevenness of the palm and the information obtained therein can hardly be reconstructed with a palm dummy and can therefore be used as an indication of liveliness of an individual to be authenticated .
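A simple liveness indicator along these lines is the spread of projected polygon areas across the palm: a flat dummy keeps the polygons uniform, while a real, uneven palm distorts them. The coefficient-of-variation metric and threshold below are illustrative assumptions, not the patent's stated test:

```python
from statistics import mean, pstdev

def pattern_distortion(polygon_areas):
    """Coefficient of variation of the projected polygon areas.

    A flat (dummy) surface yields ~0; an uneven real palm yields a
    noticeably larger value."""
    m = mean(polygon_areas)
    return pstdev(polygon_areas) / m

def looks_live(polygon_areas, threshold=0.05):
    """Hypothetical liveness decision: enough pattern distortion to
    suggest an uneven, real palm rather than a flat dummy."""
    return pattern_distortion(polygon_areas) > threshold
```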
  • Figure 8 illustrates a method, in which the biometric characteristic data of the palm is extracted, in accordance with an embodiment of the invention.
  • the pattern block may refer to one or a pre-defined number of polygons of the pattern mask.
  • One possibility to define pattern blocks is described further below.
  • The presence of the palm vein and palm print is identified, as in pattern block 301, where a palm vein and a palm print overlap, i.e. cross each other. The system identifies such overlaps and stores the details, such as the palm print overlapping the palm vein, with its polar coordinates along with the size/area of the polygon or pattern block.
  • the polygon or pattern block size, polar coordinates and/or depth information captured by using micro lens array, palm vein and palm print features are stored as a weighted mathematical formula, which is named as a neuron node.
  • This neuron node identifies in which direction to hop to the next connected neuron for pattern matching. After capture of the palm image, the complete list of neuron nodes with this information is stored as a mathematical formula.
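The quantities stored per neuron node could be grouped as follows. The field names and types are illustrative: the patent lists the stored data (crossing coordinates, block size/area, optional depth, hop directions) but does not define a schema:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class NeuronNode:
    """A feature node: a pattern block where palm print and palm vein cross."""
    block_id: str                         # e.g. the hexagon label of the block
    crossing_polar: Tuple[float, float]   # (r, theta) of the print/vein crossing
    block_area: float                     # size/area of the polygon or block
    depth: float = 0.0                    # optional depth from the micro lens array
    hops: List[str] = field(default_factory=list)  # next connected nodes to hop to
```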
  • Figure 9 illustrates various pattern blocks in the previously defined region of interest, in accordance with an embodiment of the invention.
  • The system only considers pattern blocks in the region of interest that contain both vein and print information. When a block has only palm print or only palm vein present, that block is ignored.
  • Figure 10 illustrates a neural network of plurality of neuron nodes, in accordance with an embodiment of the invention.
  • neuron 1 401
  • neuron 2 402
  • neuron 3 403
  • neuron 4 404
  • neuron 5 405
  • the palm print information of neuron 1 (401) is connected to neuron 2 (402) by palm print line 410a, wherein the neuron 2 (402) is further connected to neuron 3 (403) by palm print line 410c, to neuron 4 (404) by palm print line 410a and to neuron 5 (405) by palm print line 410b.
  • the neuron 3 (403) is only connected to neuron 2 (402), wherein neuron 4 (404) is connected to neuron 2 (402).
  • the neuron 5 (405) is connected to neuron 2 (402) and not neuron 1 (401) as the angle of the palm print line is facing towards neuron 2 (402).
  • This information is read as 124, meaning neuron 1 (401) can hop via neuron 2 (402) to neuron 4 (404), and as 123, meaning neuron 1 (401) can hop via neuron 2 (402) to neuron 3 (403). This is repeated for the entire feature-neuron list in the region of interest of a palm and represented as a mathematical model. The same process is performed for the palm vein lines, and the result is stored.
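Hop codes such as 124 and 123 can be generated by enumerating two-hop paths through the node connections; the adjacency-dictionary encoding below is an assumption for the sketch:

```python
def two_hop_codes(adjacency, start):
    """Enumerate codes like '124': start node, intermediate node, end node,
    following the stored palm-print (or palm-vein) connections."""
    codes = []
    for mid in sorted(adjacency.get(start, ())):
        for nxt in sorted(adjacency.get(mid, ())):
            if nxt != start:  # do not hop straight back to the start node
                codes.append(f"{start}{mid}{nxt}")
    return codes
```

With the connectivity described for Figure 10 (neuron 1 linked to neuron 2, and neuron 2 linked to neurons 3, 4 and 5), neuron 1 yields the codes 123, 124 and 125.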
  • the features extracted may be stored as weighted neural nodes in a network.
  • the features extracted also include the presence of the creases on the palm with respect to the pattern of the mask.
  • the features extracted includes the creases of the palm with respect to the patterns present in the mask.
  • Each node is defined by the presence of a pattern, how the palm crease and vein overlap within that pattern, the coordinates of the features, depth information captured using the micro-lens array, and the size of that predetermined geometric pattern.
  • the network is extended stage-wise, hopping along the vein or the crease wherever a pattern contains both a palm vein feature and a palm crease feature. This is repeated until all the features are extracted, and the result is represented as a mathematical weighted neural network.
  • For comparison of two palm images in the authentication process, the system performs a multi-level neuromorphic algorithm to determine the matching level with respect to the mathematical model stored in the database.
  • the system utilizes multi-stage object recognition.
  • This multi-stage object recognition system utilizes large-scale arrays of information-rich neurons to build a biologically plausible model of visual information processing.
  • Figure 12 illustrates a possible way of defining pattern blocks 704 by using the pattern mask of the illumination unit.
  • the captured image of the palm includes palm veins 702, palm prints 703 and the mask pattern 701 projected onto the palm.
  • the mask pattern 701 shown consists of regular hexagons a to s.
  • polygons containing both palm vein 702 and palm print 703 are identified.
  • the pattern block 704 is then defined by the polygon g and its surrounding polygons b, c, f, h, l and m (Figure 12, right).
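The hop-sequence encoding described in the embodiments above (reading connected neurons as digit strings such as 124 and 123) can be sketched as a depth-limited walk over an adjacency list. This is an illustrative reconstruction only, not the patented implementation; the `adjacency` mapping and the `hop_sequences` helper are hypothetical names, and the example graph mirrors the connections shown in Figure 10.

```python
# Illustrative sketch: encode hop sequences ("123", "124", ...) from a
# neuron adjacency graph built over palm print feature nodes.
# Mirrors Figure 10: neuron 1 connects to neuron 2; neuron 2 connects
# to neurons 3, 4 and 5; neurons 3-5 connect back only to neuron 2.

adjacency = {
    1: [2],
    2: [3, 4, 5],
    3: [2],
    4: [2],
    5: [2],
}

def hop_sequences(graph, start, depth=3):
    """Enumerate fixed-depth hop paths from `start`, encoded as digit strings."""
    paths = []

    def walk(node, path):
        if len(path) == depth:
            paths.append("".join(str(n) for n in path))
            return
        for nxt in graph.get(node, []):
            if nxt not in path:  # never revisit a neuron within one path
                walk(nxt, path + [nxt])

    walk(start, [start])
    return paths

print(hop_sequences(adjacency, 1))  # -> ['123', '124', '125']
```

Starting at neuron 1, the only hop is to neuron 2, from which the walk branches to neurons 3, 4 and 5, yielding the sequences 123, 124 and 125; the full list of such sequences over all feature neurons would form the stored mathematical model.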
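The pattern-block selection of Figure 12 can likewise be sketched: a mask hexagon qualifies as a block centre only when it contains both palm-vein and palm-print features (cells with only one feature are ignored, as stated above), and the block is the centre hexagon plus its six neighbours, as polygon g with b, c, f, h, l and m. The axial hexagon coordinates and the `pattern_blocks` helper are assumptions for illustration, not part of the disclosure.

```python
# Illustrative sketch: select pattern blocks from a hexagonal mask grid.
# Cells are addressed with axial hex coordinates (q, r); each cell records
# whether palm-vein and palm-print features were detected inside it.

HEX_NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def pattern_blocks(cells):
    """cells: {(q, r): {'vein': bool, 'print': bool}} -> list of blocks.

    A block is the centre cell plus its six axial neighbours, and is kept
    only when the centre contains BOTH vein and print features.
    """
    blocks = []
    for (q, r), feats in cells.items():
        if feats.get('vein') and feats.get('print'):
            block = [(q, r)] + [(q + dq, r + dr) for dq, dr in HEX_NEIGHBOURS]
            blocks.append(block)
    return blocks

cells = {
    (0, 0): {'vein': True, 'print': True},   # like hexagon g: both features
    (1, 0): {'vein': True, 'print': False},  # vein only: not a block centre
}
print(len(pattern_blocks(cells)))  # -> 1, the block centred on (0, 0)
```

The single qualifying cell produces one seven-hexagon block, matching the centre-plus-neighbours construction of pattern block 704.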
PCT/EP2017/076685 2016-10-19 2017-10-19 System and method for contactless biometric authentication WO2018073335A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CH01404/16 2016-10-19
CH01404/16A CH713061B1 (de) 2016-10-19 2016-10-19 System and method for contactless biometric authentication

Publications (1)

Publication Number Publication Date
WO2018073335A1 true WO2018073335A1 (en) 2018-04-26

Family

ID=58158734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/076685 WO2018073335A1 (en) 2016-10-19 2017-10-19 System and method for contactless biometric authentication

Country Status (2)

Country Link
CH (1) CH713061B1 (de)
WO (1) WO2018073335A1 (de)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6813010B2 (en) 2000-09-20 2004-11-02 Hitachi, Ltd Personal identification system
EP1387309A2 (de) 2002-07-31 2004-02-04 Fujitsu Limited Processor with personal identity verification function and operating device
US7359531B2 (en) 2002-07-31 2008-04-15 Fujitsu Limited Processor with personal verification function and operating device
EP1612718A2 (de) 2004-06-28 2006-01-04 Fujitsu Limited Registration method for a biometric authentication system, corresponding biometric authentication system and program
EP1612717A2 (de) 2004-06-28 2006-01-04 Fujitsu Limited Biometric authentication system and registration method
US20060120576A1 (en) * 2004-11-08 2006-06-08 Biomagnetic Imaging Llc 3D Fingerprint and palm print data model and capture devices using multi structured lights and cameras
US20080107309A1 (en) 2006-11-03 2008-05-08 Cerni Consulting, Llc Method and apparatus for biometric identification
US20080211628A1 (en) * 2007-03-01 2008-09-04 Sony Corporation Biometric authentic device
EP2244224A1 (de) 2008-02-15 2010-10-27 Fujitsu Limited Photographic device for biometrics and biometric device
CN201302723Y (zh) * 2008-10-29 2009-09-02 北京市新技术应用研究所 Online multispectral palm image acquisition instrument
WO2012041826A1 (de) * 2010-09-28 2012-04-05 Icognize Gmbh Method and device for contactless acquisition of biometric features
EP2854097A1 (de) * 2012-05-22 2015-04-01 Fujitsu Limited Bioinformation processing device, bioinformation processing method, and program
US9223955B2 (en) 2014-01-30 2015-12-29 Microsoft Corporation User-authentication gestures
US9355236B1 (en) 2014-04-03 2016-05-31 Fuji Xerox Co., Ltd. System and method for biometric user authentication using 3D in-air hand gestures

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WEI LI ET AL: "A Novel 3-D Palmprint Acquisition System", IEEE TRANSACTIONS ON SYSTEMS, MAN AND CYBERNETICS. PART A:SYSTEMS AND HUMANS, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 42, no. 2, March 2012 (2012-03-01), pages 443 - 452, XP011416650, ISSN: 1083-4427, DOI: 10.1109/TSMCA.2011.2164066 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11301664B2 (en) 2018-09-12 2022-04-12 Fingerprint Cards Anacatum Ip Ab Reconstruction of fingerprint subimages
WO2020136883A1 (ja) * 2018-12-28 2020-07-02 JCB Co., Ltd. Authentication system
JPWO2020136883A1 (ja) * 2018-12-28 2021-02-18 JCB Co., Ltd. Authentication system
CN109447052A (zh) * 2019-01-09 2019-03-08 东浓智能科技(上海)有限公司 Palm vein recognition device capable of accurately locating palm position, and implementation method thereof
CN111310699A (zh) * 2020-02-27 2020-06-19 浙江光珀智能科技有限公司 Identity authentication method and system based on palm features
DE102021111422A1 (de) 2021-05-04 2022-11-10 IDloop GmbH Device and method for contactless capture of fingerprints and handprints
WO2022233964A1 (de) * 2021-05-04 2022-11-10 IDloop GmbH Device and method for contactless capture of fingerprints and handprints
WO2023247462A1 (de) * 2022-06-24 2023-12-28 IDloop GmbH Device for contactless capture of biometric data from skin areas

Also Published As

Publication number Publication date
CH713061B1 (de) 2021-03-31
CH713061A1 (de) 2018-04-30

Similar Documents

Publication Publication Date Title
WO2018073335A1 (en) System and method for contactless biometric authentication
Lee et al. Finger vein recognition using minutia‐based alignment and local binary pattern‐based feature extraction
KR101700595B1 (ko) 얼굴 인식 장치 및 그 방법
US7508960B1 (en) Projection of light patterns for liveness verification of biometrics
JP6005750B2 (ja) 認証装置、及び認証方法
Raghavendra et al. A low-cost multimodal biometric sensor to capture finger vein and fingerprint
US20090268951A1 (en) Method and system for personal identification using 3d palmprint imaging
JP5998922B2 (ja) マルチバイオメトリック認証装置、マルチバイオメトリック認証システム及びマルチバイオメトリック認証用プログラム
KR101626837B1 (ko) 손가락 마디 및 지정맥 기반의 융합형 생체 인증 방법 및 그 장치
Connell et al. Fake iris detection using structured light
Zhang et al. 3D Biometrics
JP2016157420A (ja) 画像テンプレートマスキング
CN111344703A (zh) 基于虹膜识别的用户认证设备和方法
KR101582467B1 (ko) 인접 합산 이진화를 이용한 동공 추출 방법 및 이를 이용한 동공 추출 제어장치
Dixit et al. Iris recognition by daugman’s method
Van et al. Palm vein recognition using enhanced symmetry local binary pattern and sift features
JP2008198083A (ja) 個人識別装置
Basit et al. Iris localization via intensity gradient and recognition through bit planes
Sano et al. Fingerprint authentication device based on optical characteristics inside a finger
Ogane et al. Biometric Jammer: Preventing surreptitious fingerprint photography without inconveniencing users
KR102316587B1 (ko) 홍채들로부터의 생체정보 인식 방법
Li et al. Multi-feature based score fusion method for fingerprint recognition accuracy boosting
Manjunath et al. Analysis of unimodal and multimodal biometric system using iris and fingerprint
KR101548625B1 (ko) 글린트 제거방법과 이를 이용한 홍채 인식 제어장치
KR101951692B1 (ko) 3d 구조물 인식을 위한 줄무늬 패턴 조명생성장치 및 이를 이용한 얼굴 인식장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17794909

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17794909

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15/07/2019)
