WO2021148844A1 - Biometric method and system for hand analysis - Google Patents

Biometric method and system for hand analysis

Info

Publication number
WO2021148844A1
Authority
WO
WIPO (PCT)
Prior art keywords
individual
fingers
hand
image
features
Prior art date
Application number
PCT/IB2020/050530
Other languages
English (en)
Inventor
Zeev Zohar
Original Assignee
Four Ace Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Four Ace Ltd. filed Critical Four Ace Ltd.
Priority to PCT/IB2020/050530 priority Critical patent/WO2021148844A1/fr
Publication of WO2021148844A1 publication Critical patent/WO2021148844A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G06F21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12: Fingerprints or palmprints
    • G06V40/1382: Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger

Definitions

  • the present invention is directed to multimodal-biometric identification and authentication.
  • biometric identification is conventionally performed by analyzing a single physical biometric identity (unimodal biometric), such as fingerprints, face, iris, retina, or voice signature.
  • the present invention is directed to multimodal-biometric identification, authentication and management systems and in particular, to biometric identification of the hand combined with three dimensional (3D) face recognition and/or document identity data recognition.
  • the present invention provides a multimodal-biometric identification of a person (human) using a continuous, live identity capturing process in a 3D space.
  • the acquired biometric identity includes at least two independent biometric identities, such as a 3D face identity that represents a matching grid in three dimensional space (rather than two dimensional space), and the person’s hand peripheral and internal features, which represent the individual fingers’ unique characteristics such as: DIP (Distal Interphalangeal) creases, PIP (Proximal Interphalangeal) creases, and MCP (MetaCarpoPhalangeal, Palmar Digital) creases. Unlike fingerprints and thumbprints, these finger characteristics represent an individual’s unique whole-finger biometric identity, a human finger biometric barcode. Other visual characteristics, such as scars, moles, and beauty marks, enrich the overall biometric characteristics of an individual’s fingers’ biometric identity.
  • the peripheral fingers’ contour, which in this document includes the finger lines along the palm side of the hand, provides additional uniqueness of an individual’s fingers by identifying finger lines, finger thickness and length, finger phalanx segment lengths, and thumb convex curvature, the last analyzed using light and shadow structure.
  • the present invention includes a multimodal biometric system that is based on two biometric modalities: four fingers-lines of a given hand and three dimensional face recognition.
  • a multimodal biometric system increases security and ensures confidentiality of user data.
  • a multimodal biometric system realizes the merger of decisions taken under individual modalities. If one of the modalities is eliminated, the system can still ensure security using the remaining modalities.
  • a multimodal-biometric system provides numerous biometric features, when compared to unimodal-biometric systems. In a multimodal system, a fusion of feature vectors or features and/or decisions developed by each subsystem is carried out, and then the final decision on identification is made on the basis of the vector of features thus obtained.
  • the present invention classifies and analyzes numerous aspects of the lines’ characteristics of a given three fingers, selected out of the four fingers (index, middle, ring, and pinky) randomly or as the best-matching three, and in various finger combinations, in order to provide robust, feature-rich, and accurate biometric identification of an individual.
  • Each matching is classified with a varying level of confidence and accuracy.
  • the varying level of confidence is used for profiling the individual’s authentication profile and, appropriately, for authorizing different levels of authorization; e.g., low-volume financial transactions or low-security-level areas may be authorized with lower levels of confidence.
  • Each biometric authentication measures the level of match with a level of error probability that represents the level of confidence in each biometric authentication transaction.
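As an illustration of the tiered authorization described in the two items above, the following minimal sketch maps a match-confidence level to an authorization tier. The tier names and threshold values are illustrative assumptions; the patent does not specify them.

```python
# Hypothetical mapping of match confidence to authorization tiers.
# Tier names and thresholds are assumptions for illustration only.
AUTHORIZATION_TIERS = [
    (0.99, "high"),    # e.g., high-value transactions, high-security areas
    (0.90, "medium"),  # e.g., routine account access
    (0.75, "low"),     # e.g., low-volume transactions, low-security areas
]

def authorization_level(confidence: float) -> str:
    """Return the highest tier whose threshold the confidence meets."""
    for threshold, tier in AUTHORIZATION_TIERS:
        if confidence >= threshold:
            return tier
    return "denied"
```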
  • the hand and fingers’ biometric identification, along with the 3D face recognition, may be used with additional individual visual information presented, such as an identity document: a passport document image or personal ID document.
  • This information is analyzed using image processing of different data representations, for example, barcodes using a bar code reader, the attached individual’s picture, and text data using OCR (Optical Character Recognition), as in the sketch below.
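A minimal sketch of this document-analysis step, assuming the commonly used pytesseract (OCR) and pyzbar (barcode) Python libraries; the patent names neither, only generic OCR and bar code reader functions.

```python
import cv2
import pytesseract                 # OCR engine wrapper (assumed library choice)
from pyzbar.pyzbar import decode   # barcode decoding (assumed library choice)

def read_identity_document(image_path: str) -> dict:
    """Extract text and barcode payloads from a scanned identity document."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # OCR the visible text (name, dates, document numbers).
    text = pytesseract.image_to_string(gray)
    # Decode any barcodes present on the document.
    barcodes = [b.data.decode("utf-8") for b in decode(gray)]
    return {"text": text, "barcodes": barcodes}
```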
  • Embodiments of the invention are directed to a method for identification of an individual.
  • the method comprises: obtaining, from at least one first image, a plurality of features from the fingers of an individual, from the palm side of the hand of the individual, the features based on the lines of the fingers; obtaining, from at least one second image, a plurality of features from the fingers of an individual, from the palm side of the hand of the individual, the features based on the lines of the fingers; comparing corresponding features from the at least one first image and the at least one second image for matches, and, based on the number of matches meeting a predetermined threshold number of matches, identifying the individual.
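The threshold-matching decision at the core of this method might be sketched as follows; the dictionary representation of features is an illustrative assumption.

```python
def identify(reference: dict, probe: dict, threshold: int) -> bool:
    """Count matching features between two feature maps (e.g., crease
    lengths, widths, spacings keyed by name); identify the individual
    when the count meets the predetermined threshold number of matches.
    Simple equality stands in for 'exact match or correspondence'."""
    matches = sum(
        1 for name, value in probe.items()
        if name in reference and reference[name] == value
    )
    return matches >= threshold
```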
  • the identifying the individual may be one or more of verifying and/or authenticating the identity of the individual.
  • the method is such that the features based on the lines of the fingers include one or more relationships of the lines of the fingers to each other.
  • the method is such that the features obtained from the at least one first image are stored as a reference template for the individual.
  • the method is such that the comparing the corresponding features from the at least one first image and the at least one second image for matches includes, comparing the features from the at least one second image against the features of the reference template for matches, and, based on the number of matches meeting a predetermined threshold number of matches, identifying the individual.
  • the method is such that the predetermined threshold includes a confidence level based on numbers of matches.
  • the method is such that the lines of the fingers include lines from one or more of: 1) Distal Interphalangeal creases (DIP); 2) Proximal Interphalangeal creases (PIP); and, 3) MetaCarpoPhalangeal (MCP) - Palmar Digital creases, of the fingers.
  • each of the matches includes at least one of an exact match of the features and/or a correspondence (including congruencies) of the features.
  • the method is such that the fingers include one or more of: the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the matches are performed on at least three of the fingers including the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the features additionally include one or more of scars, moles, and beauty marks.
  • Embodiments of the invention are directed to a system for identifying an individual.
  • the system comprises: a processor in communication with a memory; an image processor in communication with the processor, the image processor configured for processing images; a feature extractor in communication with the image processor for extracting a plurality of features from the fingers of an individual, from the palm side of the hand of the individual, from processed images of the fingers from the palm side of the hand of the individual, the features based on the lines of the fingers and one or more relationships of the lines to each other; a rules and policies database for providing threshold numbers of matches for use in identifying the individual; and, a comparator in communication with the feature extractor for comparing corresponding features from the processed images for matches, and, based on the number of matches meeting a predetermined threshold number of matches, identifying the individual.
  • the system is such that it additionally comprises: a templater for creating a reference template for the individual of the features obtained from an image of the fingers from the palm side of the hand of the individual.
  • the system is such that the comparator additionally compares the features from the reference template against the features from an image of one or more of the fingers of the palm side of the hand of an individual for matches, and, based on the number of matches meeting a predetermined threshold, identifying the individual.
  • the system is such that the feature extractor extracts the lines of the fingers from one or more of: 1) Distal Interphalangeal creases (DIP); 2) Proximal Interphalangeal creases (PIP); and, 3) MetaCarpoPhalangeal (MCP) - Palmar Digital creases, of the fingers.
  • the system is such that the comparator determines a match of features based on at least one of an exact match of the features and/or a correspondence of the features.
  • Embodiments of the invention are directed to a method for identifying an individual.
  • the method comprises: a) building a reference template for an individual based on data from an identification document for the individual and hand biometric data from an image of the fingers of the individual, from the palm side of the hand of the individual, the hand biometric data including features based on lines of the fingers; b) obtaining, from at least one subsequent image of the palm side of the hand of the individual, hand biometric data including features based on the lines of the fingers; and, c) comparing corresponding features of the hand biometric data from the reference template, and from the at least one subsequent image, for matches, and based on the number of matches meeting a predetermined threshold, identifying the individual.
  • the method is such that it additionally comprises: d) updating the reference template with the hand biometric data from the at least one subsequent image, whereby the updated reference template becomes the reference template.
  • the method is such that it additionally comprises: repeating the steps of, building a reference template, obtaining hand biometric data, and, comparing features of the hand biometric data.
  • the method is such that the hand biometric data additionally includes one or more relationships of the lines of the fingers to each other.
  • the method is such that the predetermined threshold includes a confidence level based on numbers of matches.
  • the method is such that the lines of the fingers include lines from one or more of: 1) Distal Interphalangeal creases (DIP); 2) Proximal Interphalangeal creases (PIP); and, 3) MetaCarpoPhalangeal (MCP) - Palmar Digital creases, of the fingers.
  • each of the matches includes at least one of an exact match of the features and/or a correspondence of the features.
  • the method is such that the fingers include one or more of: the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the matches are performed on at least three of the fingers including the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the identification document includes at least one of: a passport, a driver’s license, a government issued document, and, a photo-identification.
  • the method is such that the image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • the method is such that the at least one subsequent image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • the method is such that the data from the identification document is based on one or more of: text, photographs, and/or bar codes, from the identification document.
  • Embodiments of the invention are directed to a method for identifying an individual.
  • the method comprises: a) building a reference template for an individual based on: 1) data from an identification document for the individual; 2) hand biometric data from an image of the fingers of the individual, from the palm side of the hand of the individual, the hand biometric data including features based on lines of the fingers; and, 3) face biometric data from an image of the face of the individual including facial features; b) obtaining the hand biometric data including features based on the lines of the fingers, from at least one subsequent image of the palm side of the hand of the individual, and, obtaining the face biometric data including facial features, from at least one subsequent image of the face of the individual; and, c) comparing corresponding features of the hand biometric data and the face biometric data from the reference template against image data from: 1) the at least one subsequent image of the palm side of the hand of the individual, and, 2) the at least one subsequent image of the face of the individual, for matches, and, based on the number of matches meeting a predetermined threshold number of matches, identifying the individual.
  • the method is such that it additionally comprises: d) updating the reference template with the hand biometric data from the at least one subsequent image of the palm side of the hand of the individual, and the face biometric data from the at least one subsequent image of the face of the individual, whereby the updated reference template becomes the reference template.
  • the method is such that the hand biometric data additionally includes one or more relationships of the lines of the fingers to each other.
  • the method is such that the predetermined threshold includes a confidence level based on numbers of matches.
  • the method is such that the lines of the fingers include lines from one or more of: 1) Distal Interphalangeal creases (DIP); 2) Proximal Interphalangeal creases (PIP); and, 3) MetaCarpoPhalangeal (MCP) - Palmar Digital creases, of the fingers.
  • each of the matches includes at least one of an exact match of the features and/or a correspondence of the features.
  • the method is such that the fingers include one or more of: the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the matches are performed on at least three of the fingers including the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the identification document includes at least one of: a passport, a driver’s license, a government issued document, and, a photo-identification.
  • the method is such that the image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • the method is such that the image of the face of the individual is from at least one of: 1) a video of the face of the individual; or, 2) at least one still image of the face of the individual.
  • the method is such that the at least one subsequent image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • the method is such that the at least one subsequent image of the face of the individual is from at least one of: 1) a video of the face of the individual; or, 2) at least one still image of the face of the individual.
  • Embodiments of the invention are directed to a method for identifying an individual.
  • the method comprises: a) building a reference template for an individual based on hand biometric data from an image of the fingers of the individual, from the palm side of the hand of the individual, the hand biometric data including features based on lines of the fingers; b) obtaining, from at least one subsequent image of the palm side of the hand of the individual, hand biometric data including features based on the lines of the fingers; and, c) comparing corresponding features of the hand biometric data from the reference template and from the at least one subsequent image for matches, and based on the number of matches meeting a predetermined threshold number of matches, identifying the individual.
  • the method is such that it additionally comprises: d) updating the reference template with the hand biometric data from the at least one subsequent image, whereby the updated reference template becomes the reference template.
  • the method is such that the hand biometric data additionally includes one or more relationships of the lines of the fingers to each other.
  • the method is such that the predetermined threshold includes a confidence level based on numbers of matches.
  • the method is such that the lines of the fingers include lines from one or more of: 1) Distal Interphalangeal creases (DIP); 2) Proximal Interphalangeal creases (PIP); and, 3) MetaCarpoPhalangeal (MCP) - Palmar Digital creases, of the fingers.
  • the method is such that the features additionally include one or more of scars, moles, and beauty marks.
  • each of the matches includes at least one of an exact match of the features and/or a correspondence of the features.
  • the method is such that the fingers include one or more of: the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the matches are performed on at least three of the fingers including the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • the method is such that the at least one subsequent image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • Embodiments of the invention are directed to a method for identifying an individual.
  • the method comprises: a) building a reference template for an individual based on: 1) hand biometric data from an image of the fingers of the individual, from the palm side of the hand of the individual, the hand biometric data including features based on lines of the fingers; and, 2) face biometric data from an image of the face of the individual including facial features; b) obtaining, 1) from at least one subsequent image of the palm side of the hand of the individual, the hand biometric data including features based on the lines of the fingers, and, 2) from at least one subsequent image of the face of the individual, face biometric data including facial features; and, c) comparing corresponding features of the hand biometric data and the face biometric data from the reference template against image data from: 1) the at least one subsequent image of the palm side of the hand of the individual, and, 2) the at least one subsequent image of the face of the individual, for matches, and based on the number of matches of the hand biometric data and face biometric data meeting a predetermined threshold number of matches, identifying the individual.
  • the method is such that it additionally comprises: d) updating the reference template with the hand biometric data from the at least one subsequent image of the palm side of the hand of the individual, and the face biometric data from the at least one subsequent image of the face of the individual, whereby the updated reference template becomes the reference template.
  • the method is such that the hand biometric data additionally includes one or more relationships of the lines of the fingers to each other.
  • the method is such that the predetermined threshold includes a predetermined confidence level.
  • the method is such that the lines of the fingers include lines from one or more of: 1) Distal Interphalangeal creases (DIP); 2) Proximal Interphalangeal creases (PIP); and, 3) MetaCarpoPhalangeal (MCP) - Palmar Digital creases, of the fingers.
  • the method is such that the hand biometric data includes features additionally comprising one or more of scars, moles, and beauty marks.
  • each of the matches includes at least one of an exact match of the features and/or a correspondence of the features.
  • the method is such that the fingers include one or more of: the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the matches are performed on at least three of the fingers including the index finger, the middle finger, the ring finger, and, the pinky.
  • the method is such that the image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • the method is such that the image of the face of the individual is from at least one of: 1) a video of the face of the individual; or, 2) at least one still image of the face of the individual.
  • the method is such that the at least one subsequent image of the fingers of the palm side of the hand of the individual is from at least one of: 1) a video of the fingers of the palm side of the hand of the individual; or, 2) at least one still image of the fingers of the palm side of the hand of the individual.
  • the method is such that the at least one subsequent image of the face of the individual is from at least one of: 1) a video of the face of the individual; or, 2) at least one still image of the face of the individual.
  • FIG. 1A is a diagram of an exemplary environment for the system in which embodiments of the disclosed subject matter are performed;
  • FIG. 1B is a block diagram of a system architecture in accordance with an embodiment of the invention;
  • FIG. 2A is a diagram of the fingers on the palm side of the hand, on which curvatures based features extraction and analytic processes of the invention are performed;
  • FIG. 2B is the diagram of FIG. 2A showing regions of the fingers in boxes;
  • FIG. 3A is a main flow diagram detailing a first process in accordance with the invention.
  • FIG. 3B is a flow diagram of the subprocess of identification associated with verification from FIG. 3A;
  • FIGs. 4A-1 and 4A-2 are a flow diagram detailing the main steps in the enrollment and the authentication/verification processes in accordance with the present invention;
  • FIGs. 4B-1 and 4B-2 are a flow diagram detailing the authentication/verification processes in accordance with the present invention.
  • FIG. 5 is a diagram illustrating the identification process of FIG. 4A
  • FIG. 6 is a diagram illustrating the enrollment process of FIG. 4A.
  • FIG. 7 is a diagram illustrating the verification/authorization process of FIG. 4B.
  • aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more non-transitory computer readable (storage) medium(s) having computer readable program code embodied thereon. Throughout this document, numerous textual and graphical references are made to trademarks and domain names. These trademarks and domain names are the property of their respective owners, and are referenced only for explanation purposes herein.
  • FIG. 1A shows an exemplary operating environment along an enterprise or local area network (LAN) 50, which is linked to a wide area network (WAN) 55, such as a public network, such as the Internet and/or cellular networks and the like.
  • “Linked” includes both wired and wireless links, either direct or indirect, and placing networks and computers, including servers, components and the like, in electronic and/or data communication with each other.
  • the LAN 50 is linked to a home server (HS) 100, also known as a home computer, a main server or a main computer, these terms being used interchangeably herein.
  • the home server 100 also supports a system 100’, either alone or with other computers, including servers, components, and applications, e.g., client applications, associated with the home server 100, as detailed below.
  • the home server 100 operates as an organization services server, where, for example, users are enrolled, their enrollment data is stored, for example as templates, and is compared against identification data from user computers 120a-120c, collectively referred to as element 120, for user identification and verification.
  • These user computers 120a-120c, which typically include cameras 120a’ or other imaging devices, are linked to the WAN 55. These user computers 120a-120c are used to obtain images and text from the user.
  • An organization enrollment computer/server 104 is also linked to the LAN 50.
  • An office enrollment computer/server 106 is linked to the LAN 50.
  • This computer 106 includes a camera 106a or other imaging device, for obtaining an image, text and/or barcode from a user during the enrollment process, as detailed below.
  • the home server (HS) 100 is of an architecture that includes one or more components, engines, modules and the like, for providing numerous additional server functions and operations.
  • the home server (HS) 100 may be associated with additional storage, memory, caches and databases, both internal and external thereto.
  • the home server (HS) 100 may have a uniform resource locator (URL) of, for example, www.hs.com. While a single home server (HS) 100 is shown, the home server (HS) 100 may be formed of multiple servers and/or components.
  • FIG. 1B shows an example architecture of a system 100’ in accordance with the present invention.
  • the system 100’ is shown, for example, within the home server 100, but may be distributed among multiple servers, computers, components and the like along the LAN 50 and/or the WAN 55.
  • This architecture of the system 100' includes a central processing unit (CPU) 152 formed of one or more processors.
  • the CPU 152 is electronically linked, and in electronic and/or data communication, directly or indirectly, with computer components including: a storage/memory 154, a communications interface 156, a confidence level module 158, a feature extraction module 161, a reference template forming module 162, a reference template database 163, a received data module 164, a comparison module 165, an image processor 171, devices including an optical character reader (OCR) 172 and a bar code reader 173, and an adaptive learning and retraining engine 180.
  • a “module”, for example, includes a component for storing instructions (e.g., machine readable instructions) for performing one or more processes, and including or associated with processors, e.g., the CPU 152, for executing the instructions.
  • the Central Processing Unit (CPU) 152 is formed of one or more processors, including microprocessors, for performing the home server 100 functions and operations detailed herein, including controlling the components 154, 156, 158, 161, 162, 163, 164, 165, 171, 172, 173 and 180, executing the instructions provided and/or obtained therefrom.
  • the Central Processing Unit (CPU) 152 processors are, for example, conventional processors, such as those used in servers, computers, and other computerized devices, including data processors and hardware processors, for performing the home server 100 and system 100’ functions and operations detailed herein.
  • the processors may include x86 processors from AMD and Intel, Xeon® and Pentium® processors from Intel, as well as any combinations thereof.
  • the storage/memory 154 is associated with the CPU 152, and is any conventional storage media.
  • the storage/memory 154 also includes machine executable instructions associated with the operation of the CPU 152 and the components 154, 156, 158, 161, 162, 163, 164, 165, 171, 172, 173 and 180, and along with the processes and subprocesses shown in FIGs. 2A-4B, detailed herein.
  • the storage/memory 154 also, for example, stores rules and policies, such as security policies, for the system 100’ and the home server 100, as well as rules and policies for correspondences and/or congruencies during feature matching.
  • the communications interface 156 serves to both transmit and receive data from the LAN 50 and/or WAN 55.
  • the confidence level module 158 applies system rules and policies, such as security policies (e.g., stored in the storage 154 and/or the confidence level module 158), to determine the amount of feature matching between features which is needed for verification, authorization, and identification, including, for example, biometric identity verification, authorization, and identification.
  • feature matching or matching includes one or more of exact matches between features, and/or correspondences (or congruencies) between features, the correspondences (or congruencies) in accordance with preprogrammed thresholds, rules and policies, and the like.
  • the confidence level may be, for example, a number of matches (e.g., exact matches and/or correspondences or congruencies), such as a minimum number of matches of features.
  • the feature extraction module or feature extractor 161 extracts features from images, as obtained from the image processor 171, from text, as obtained from the optical character reader (OCR) 172, and from barcodes, as obtained from the barcode reader 173.
  • the features extracted from images are the features from the palm side of the fingers (detailed below), also referred to herein as the contour of the fingers, contour, contour image, and from the face of the user, both from a real time image, or a prerecorded image such as a photograph on an identification document, such as a passport, or the like.
  • a reference template forming module 162 receives and/or obtains the extracted features from the feature extraction module 161 and forms these extracted features into a reference template for the particular user.
  • the reference template forming module 162 also augments or updates each respective reference template for a user, when the user is photographed, presents an image on a new document, presents a new barcode, and has new text associated with the document.
  • the augmenting or updating of the reference template is performed by the adaptive learning and retraining engine 180, detailed below, in conjunction with the reference template forming module 162.
  • the now created and augmented or updated reference templates are stored in the reference template database(s) 163.
  • a data receiving module or data receiver 164 serves to receive data when a user is photographed and the identification document is scanned, for example, when verification and/or authentication processes are being performed.
  • the module 164 receives the user data from the image processor 171, optical character reader (OCR) 172, and a bar code reader 173, selects the requisite features, which are extracted and, for example, temporarily stored.
  • the image processor 171 includes software and hardware for processing video, still images, and the like, both in normal light spectra, and infra-red and ultraviolet spectra.
  • the image processor 171 typically processes images associated with various regions of interest (ROIs). There are typically five ROIs, which include, for example, 1) the fingers (from the palm side of the hand), 2) the face or facial, and, from a document, 3) the photograph, 4) the text, and, 5) the barcode.
  • the OCR 172 is a standard optical character reader capable of reading text in multiple languages.
  • the bar code reader 173 is, for example, a standard bar code reader.
  • a comparison module 165 functions to compare the extracted features for a user, stored in the database(s) 163, with the temporarily stored extracted features, to perform comparisons of features to verify, authorize, and/or otherwise identify the user.
  • the comparison typically includes matching features, whereby a match may be one or more of an exact match between the features and/or a correspondence or congruence of the features, the correspondence or congruence based on levels and tolerances preprogrammed into the comparison module 165, or in accordance with rules and policies (e.g., stored in the storage 154 or confidence level module 158).
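A sketch of the correspondence (congruence) test described above, where a match need not be exact but must fall within a preprogrammed tolerance; the flat feature-map representation and numeric tolerance are assumptions.

```python
def corresponds(ref_value: float, probe_value: float, tolerance: float) -> bool:
    """A correspondence (congruence): values within a preprogrammed
    tolerance count as a match, per the rules and policies."""
    return abs(ref_value - probe_value) <= tolerance

def count_matches(reference: dict, probe: dict, tolerance: float) -> int:
    """Count exact matches and correspondences between two feature maps."""
    return sum(
        1 for name, value in probe.items()
        if name in reference and corresponds(reference[name], value, tolerance)
    )
```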
  • the comparison module 165 may also perform the comparisons based on confidence levels as received from the confidence level module 158.
  • the adaptive learning and retraining engine 180 is constantly updating (including augmenting) the reference templates by iterative processes each time a new facial image or fingers contour image is taken of an individual, who has a reference template in the system 100’.
  • the reference template for the individual is subsequently updated, including the features of the facial and finger contour images. Additionally, this engine 180 continuously develops acceptable variances between features which are to be compared.
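The patent does not specify the update rule used by the adaptive learning and retraining engine 180; an exponential moving average over feature vectors is one plausible realization, sketched below.

```python
import numpy as np

def update_template(template: np.ndarray, new_features: np.ndarray,
                    learning_rate: float = 0.1) -> np.ndarray:
    """Blend newly extracted features into the stored reference template
    so it tracks gradual changes in the individual's features. The
    exponential-moving-average rule and the learning rate are assumptions."""
    return (1.0 - learning_rate) * template + learning_rate * new_features
```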
  • FIGs. 2A and 2B show the same image of the fingers of the palm side of the hand 200, used for the sequence-of-images analysis performed by the computerized image analysis processes of the invention.
  • the lines in the fingers form the aforementioned biometric or human bar code, and the system 100’ uses this human bar code to identify individuals, as well as to verify, authenticate, and otherwise identify individuals.
  • the implementation of a stream of images maximizes the probability of obtaining high-quality images (out of the video frame sequence) and using them to construct a super-resolution image, with all of the required details to support high accuracy with maximum TPR and minimum FPR and FNR (TPR - True Positive Rate, FPR - False Positive Rate, FNR - False Negative Rate). One way to select such frames is sketched below.
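One way to select high-quality frames from the video stream, sketched below with OpenCV, scores each frame by the variance of its Laplacian, a common sharpness proxy; both the metric and the implementation are assumptions, since the process is algorithm-agnostic.

```python
import cv2

def sharpest_frames(video_path: str, keep: int = 5) -> list:
    """Return the `keep` sharpest frames of a video, scored by variance
    of the Laplacian; these could then feed a super-resolution step."""
    capture = cv2.VideoCapture(video_path)
    scored = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scored.append((cv2.Laplacian(gray, cv2.CV_64F).var(), frame))
    capture.release()
    scored.sort(key=lambda pair: pair[0], reverse=True)  # sharpest first
    return [frame for _, frame in scored[:keep]]
```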
  • the process is not limited to a specific/customized image processing algorithm; ultimately, the process comes down to optimizing the tradeoff between quality and resource consumption. For example, the images on which FIGs. 2A and 2B are based were scanned in high resolution, at 1080 x 1920. Initially, the image may be subjected to various preprocessing steps, such as one or more of: contrast stretching, denoising, blur removal, and histogram equalization of the original image, as sketched below.
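The preprocessing steps just listed might be realized with standard OpenCV operations, as in the following sketch; the specific functions and parameter values are assumptions.

```python
import cv2
import numpy as np

def preprocess(gray: np.ndarray) -> np.ndarray:
    """Apply the four named preprocessing steps to an 8-bit greyscale image."""
    # Contrast stretching: rescale intensities to the full 0-255 range.
    stretched = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)
    # Denoising.
    denoised = cv2.fastNlMeansDenoising(stretched, None, h=10)
    # Blur removal via unsharp masking.
    blurred = cv2.GaussianBlur(denoised, (0, 0), sigmaX=3)
    sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)
    # Histogram equalization.
    return cv2.equalizeHist(sharpened)
```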
  • the overall image processing of the fingers or finger contour image, from the palm side of the hand, as a Region of Interest (ROI), may use image foreground and background subtraction, salient object detection using machine learning (ML) models, binary segmentation, edge detection, object zooming to verify the ROI selection, smoothing and low resolution processing.
  • the aforementioned processes are performed, for example, by the feature selection module 161 and image processor 171, along with the reference template forming module 162 and the database(s) 163.
  • image analysis of the ROI including feature extraction, is performed.
  • the four fingers, index 201, middle 202, ring 203 and pinkie (or little) 204 are divided into areas in accordance with the broken lines. These areas are unique to each individual, and accordingly, are used for biometric identification of the individual.
  • the areas of the image are processed by image processing techniques. Image processing by the image processor 171 and the feature selection module 161 is as follows.
  • W-ux: DIP - Distal Interphalangeal creases
  • W-mx: PIP - Proximal Interphalangeal creases
  • W-Bx: MCP - MetaCarpoPhalangeal, Palmar Digital creases
  • the lines are grouped in accordance with the respective area (i.e., DIP, PIP, and MCP for each finger), as shown by boxes 201a-201c, 202a-202c, 203a-203c, and 204a-204c of FIG. 2B.
  • the structure of the fingers includes internal and peripheral lines, and is used for generating a rich high-dimensional feature space that provides the unique four fingers’ identity of an individual.
  • Each of the lines’ lengths and widths is measured and stored, and the distance (spacing) with respect to each other line in the area is also measured and stored, as part of a biometric reference template for the individual (see the illustrative structure below).
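An illustrative (hypothetical) structure for storing these per-line measurements as part of the reference template; the patent does not prescribe a storage format.

```python
from dataclasses import dataclass, field

@dataclass
class CreaseLine:
    """One detected finger line (crease) in a DIP, PIP, or MCP area."""
    length: float
    width: float
    spacings: list = field(default_factory=list)  # distances to other lines in the area

@dataclass
class FingerArea:
    """All measured lines of one area (DIP, PIP, or MCP) of one finger."""
    finger: str   # "index", "middle", "ring", or "pinky"
    area: str     # "DIP", "PIP", or "MCP"
    lines: list = field(default_factory=list)     # list of CreaseLine
```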
  • any combination of one or more fingers and the lines thereon may be used for identification of the individual.
  • the matching process fits the features map, as represented in the individual identity reference template, against an inspected features map of an individual’s (person’s) given four fingers. This process is similar to evaluating the match between two map images of the same place (similar to the process of comparing overlapping satellite images).
  • the peripheral fingers’ contour provides additional uniqueness of an individual’s fingers’ identity, with features such as finger thickness, finger length, finger phalanx segment lengths, and, optionally, the thumb’s convex curvature, analyzed using light and shadow structure. These features are typically also stored for the individual, as part of the biometric reference template.
  • the aforementioned “matches”, as well as the “matches” listed below, may be exact matches, or correspondences or congruencies, the correspondences or congruencies based on levels and tolerances preprogrammed into the comparison module 165, or in accordance with rules and policies. All of the aforementioned measurements and relationships of finger lines, as well as finger measurements, and markings on the fingers, such as scars, moles, beauty marks and the like, are considered to be features, and these features are, for example, stored as part of the biometric reference template(s), for the individual.
  • the person to be identified has his hand imaged on the palm side, and the image is analyzed against one or more of the features, including, for example, lines, line patterns, and line relationships for each area, line pattern relationships and line distances between the areas, finger lengths and finger widths, finger, hand and palm outlines, from the biometric reference template for the individual.
  • the verifications and authentications of the finger images are compared against the biometric reference template for the individual, for matches, as performed, for example, by the comparison module 165.
  • the aforementioned features may be taken from any combination of one or more fingers of the four fingers, index 201, middle 202, ring 203, or pinky 204.
  • the image analysis performed by the system is such that numerous aspects of the lines’ characteristics of, for example, a given three fingers (selected out of the four fingers, randomly or as the best-matching three), index, middle, ring and pinky, and in various finger combinations, are classified and analyzed in order to provide robust and accurate biometric identification of an individual.
  • Each matching for example, exact matching or correspondence, is classified with a varying level of confidence and accuracy.
  • the varying level of confidence is used for the individual authentication profile and, appropriately, for authorizing different levels of authorization; i.e., low-volume financial transactions/low-security-level areas can be authorized with a lower level of confidence.
  • Each biometric authentication measures the level of the match with a level of error probability that represents the level of confidence in each biometric authentication (identification) transaction.
  • the same amount of features may be compared, with a lower threshold (number or predetermined number) for matches, e.g., exact matches or correspondences, when less accuracy is required, and a higher threshold for matches, e.g., exact matches or correspondences, when greater accuracy is required.
  • a higher number of features may be selected, with a higher threshold (number or predetermined number) of matches, e.g., exact matches or correspondences, when high accuracy is desired, as opposed to when lower accuracy is sufficient for the requisite operation.
  • the features of the current authentication transaction are used to update the reference biometric template (for example, by the adaptive learning and retraining engine 180).
  • the fingers’ biometric identification may be performed alone, or with one or both of three dimensional (3D) face recognition and additional individual visual information presented, such as the individual’s identity document, e.g., a passport, driver’s license or other government-issued identification, passport or document data, and/or a passport bar code or other document bar code.
  • This information is analyzed using image processing of different data representations, such as a barcode reader, image processing of the attached individual photograph, and text data identification using OCR (Optical Character Recognition), which is described for FIGs. 3A, 3B, 4A and 4B, below.
  • FIG. 3A is a flow diagram detailing a process including the finger analysis of FIGs. 2A and 2B and as described above.
  • This process is typically performed automatically, for example, by computers and computerized devices, but portions thereof may be performed manually, and the process is, for example, performed in real time.
  • the process of FIG. 3A emphasizes the use of generic image processing, which scans a stream of video frames/images of the individual (who acts according to the system guidelines) and detects the required Regions of Interest (ROIs): the presented identity document ROI and, therefrom, the photograph, text and barcode ROIs; the individual face ROI; and the finger contour ROI, also known as the four fingers ROI.
  • the process begins at the START block 302 by a third-party application that initiates an authentication request for authorizing a specific electronic transaction. Identifying the individual is essential in order to verify that this individual is authorized to conduct that specific electronic transaction remotely using his smartphone/desktop personal computer (PC)/laptop/tablet, or the like (without having the individual physically present at the organization desk).
  • the process moves to block 304, where image acquisition, for example, in the form of a video (video stream) or series of one or more still images, (the video and/or the still images collectively known as an “image”) is performed using a camera or other imaging device.
  • the image acquired includes an image of the user’s (individual’s) face, palm side facing fingers (contour), and passport or other identification document.
  • the image is preprocessed.
  • This preprocessing includes for example, contrast stretching, denoising, blur removal and histogram equalization of the original video stream frames/images.
  • Each image of the sequenced images is pre-processed.
  • the selected high-quality image is inverted, and the normal and inverted images are processed to maximize the quality of feature extraction due to lighting considerations.
  • the process moves to block 310, an optional subprocess, where additional preprocessing of the peripheral fingers contour image, i.e., the lines of the four fingers on the palm side of the hand, as detected in the requisite image, is performed.
  • the optional subprocesses may include, for example, image foreground and background subtraction, salient object detection using machine learning (ML) models, binary segmentation, edge detection, object zooming to verify the ROI selection, smoothing and low resolution processing.
  • the aforementioned processes are performed, for example, by the feature selection module 161 and image processor 171, along with the reference template forming module 162 and the database(s) 163.
  • the hand region is defined as the foreground object image (peripheral and internal), and isolated from the background by using a background reduction process, for example, by the image processor 171.
  • an edge detection process is conducted using a zoom-in process.
  • a salient object image processing is used to create a binary mask.
  • the masked images are preprocessed into super-resolution images and then, later, for structural feature extraction: a distances map, corner detection, curvature, and a grid, which represent the biometric characteristics of a given hand or fingers scan (two of these are sketched below).
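Two of the named structural features, the distances map and corner detection, sketched with OpenCV; the function choices are assumptions.

```python
import cv2
import numpy as np

def structural_features(mask: np.ndarray, gray: np.ndarray):
    """From an 8-bit binary hand mask and the greyscale image, compute a
    distance map and corner points restricted to the hand region."""
    # Distance map: each foreground pixel's distance to the background.
    distance_map = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    # Corner detection within the masked (hand) region only.
    corners = cv2.goodFeaturesToTrack(
        gray, maxCorners=200, qualityLevel=0.01, minDistance=5, mask=mask)
    return distance_map, corners
```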
  • ROIs are, for example, subject to processes, such as, detection, contrast stretching, image denoising using sparse representation, greyscale inverse and feature extractions.
  • the process moves to blocks 314a and 314b contemporaneously, and, for example, simultaneously.
  • features are extracted from each ROI (for example, five ROIs).
  • features are extracted based on face recognition, for example, by face recognition software, including that from Visage Technologies AB of Linköping, Sweden.
  • features are extracted in accordance with the description of FIGs. 2A and 2B above, including the finger lines in each area as well as the relation of finger lines between each of the areas. Additional features of the fingers or hand, such as scars, moles, beauty marks and the like, may be extracted in the same or similar manner as that for the finger lines.
  • from the document image or document ROIs (for example, a passport), features are extracted for the text, e.g., name and dates, for the image of the passport holder, and for the numbers and bar code (if present) on the passport.
  • a set of sequenced images is processed.
  • the extracted features, represented by data are, for example, normalized to reduce any redundancies.
  • a visual selective attention (VSA) algorithm is used, such as color thresholding and/or finger contour extraction and/or Haar cascades and/or HOG (Histogram of Oriented Gradients) with a Linear SVM (Support Vector Machine), SSDs (Single Shot Detectors), and/or Faster R-CNNs (Region-based Convolutional Neural Networks), which can provide the recognized ID per frame in the video (one such detector is sketched below).
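As one example of the listed detectors, a Haar cascade can supply a face ROI per frame, as sketched below; the cascade file ships with OpenCV, while detecting the finger-contour ROI would require a model trained for hands (not shown).

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face_rois(frame):
    """Return face bounding boxes (x, y, w, h) for one video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```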
  • the ROIs are tracked by using a video tracking process which implements an algorithm, such as computing the Euclidean distances between new ROI bounding boxes and existing tracked ID objects, that analyzes sequential video frames and outputs the movement of targeted objects between the frames (see the sketch below).
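A minimal sketch of the Euclidean-distance association step described above; production trackers also handle the appearance and disappearance of objects, which are omitted here.

```python
import numpy as np

def associate_rois(tracked: dict, new_boxes: list, max_distance: float = 50.0) -> dict:
    """Greedily assign new ROI bounding boxes (x, y, w, h) to existing
    tracked IDs by Euclidean distance between box centroids."""
    def centroid(box):
        x, y, w, h = box
        return np.array([x + w / 2.0, y + h / 2.0])

    assignments = {}
    for object_id, old_box in tracked.items():
        if not new_boxes:
            break
        distances = [np.linalg.norm(centroid(old_box) - centroid(box))
                     for box in new_boxes]
        best = int(np.argmin(distances))
        if distances[best] <= max_distance:
            assignments[object_id] = new_boxes.pop(best)
    return assignments
```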
  • Algorithms such as Haar and Kalman filters support such functionality and capabilities.
  • Video tracking typically involves target representation and localization, as well as filtering and data association.
  • Target representation and localization is mostly a bottom-up process. These methods give a variety of tools for identifying the moving object. Locating and tracking the target object successfully depends on the algorithm. For example, blob tracking is useful for identifying movements/changes of objects with varying profiles.
  • Common object tracking, representation and localization algorithms are: 1) kernel-based tracking (mean-shift tracking), an iterative localization procedure based on the maximization of a similarity measure (Bhattacharyya coefficient); and, 2) contour tracking, i.e., detection of object boundaries (e.g., active contours or a Condensation algorithm). Contour tracking methods iteratively evolve an initial contour, initialized from the previous frame, to its new position in the current frame, directly evolving the contour by minimizing the contour energy using gradient descent. A mean-shift sketch follows.
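Kernel-based (mean-shift) tracking of a single ROI, per item 1 above, sketched with OpenCV's standard histogram back-projection formulation; the hue-histogram model is an assumption.

```python
import cv2

def track_roi_meanshift(video_path: str, initial_box: tuple):
    """Yield the tracked window (x, y, w, h) per frame by iteratively
    relocating it to the mode of a hue-histogram back-projection."""
    capture = cv2.VideoCapture(video_path)
    ok, frame = capture.read()          # assumes the first frame is readable
    x, y, w, h = initial_box
    hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

    track_window = initial_box
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_projection = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        # Mean-shift: locally maximize the similarity measure per frame.
        _, track_window = cv2.meanShift(back_projection, track_window, criteria)
        yield track_window
    capture.release()
```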
  • the process moves to block 316, where one or more templates are constructed or otherwise made, for the extracted features of each of the five ROIs, in any combination of ROIs.
  • the process moves to block 318, where the template(s) are sent for enrollment and/or verification by an entity such as a bank, government agency, or the like.
  • An identification process, detailed in FIG. 3B and blocks 332 to 346, is typically employed with the aforementioned enrollment and/or verification process. Once the individual is verified and enrolled, authentication transactions, such as bank account withdrawals and transfers, and entry permissions, such as border crossings, and the like, are granted from organizations to the individual once the individual is authenticated. Additionally, with a proper identification, resulting in an enrollment or verification, the process moves to block 320, where the template(s) are updated, should there be changes in the features extracted from the respective ROIs.
  • ROIs are selected, such as one or more ROIs, including hand, face, document text, document photograph (image), and document bar code.
  • when the hand (fingers) ROI is a selected ROI, the number of fingers to be analyzed is typically also selected. For example, three of the four fingers is a typical selection (the possible selections are enumerated below).
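The four possible three-finger selections out of the four fingers can be enumerated directly; choosing among them at random, or keeping the best-matching combination, follows the text above.

```python
from itertools import combinations

FINGERS = ("index", "middle", "ring", "pinky")

def three_finger_selections():
    """Return the four possible three-finger combinations."""
    return list(combinations(FINGERS, 3))
```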
  • a confidence level is assigned to each of the selected ROIs, or to all of the selected ROIs as a group, for the comparison, which will be used in making the identification.
  • weights may be applied to the comparison, such as to a certain group of finger lines, or to certain areas of finger lines relative to each other, as well as other rules and policies.
  • the process moves to block 340, where template(s) are obtained for the selected ROIs.
  • images corresponding to the selected ROIs are obtained for a comparison with the template(s) obtained for the selected ROIs.
  • the comparison of the template(s) and the images is performed at block 342, at the selected confidence levels, and weights and/or rules and policies, if selected.
  • the process then moves to block 344, where it is determined whether the ID process was successful for the enrollment or the verification, based on rules and policies of the enrollment or verification. If yes, the process moves to block 320. If no, the process moves to block 346, where it ends, and the template(s) is not updated.
  • FIGs. 4A and 4B are diagrams, with references to the machines that perform the processes, showing processes for identification and enrollment of an individual (FIG. 4A), by the home server 100, computer 104, and computer 106, and for authentication (FIG. 4B), by the home server 100 and one of the computers 120, which obtain data from the individual or user.
  • In FIG. 4A, the process begins at a START block 400. Only after the inspected individual’s identity is successfully verified is the multimodal-biometric identity verified. The process moves to block 402, where a verification of the individual is made.
  • a video, photograph and/or other image is taken of the individual 502.
  • the individual also brings an identification document 504, such as a passport or other government-issued identification, which is reviewed and photographed, copied, or the like.
  • the identification document 504, e.g., passport data, photograph, text and/or bar code data, for the individual, is stored in the database(s) 163.
  • the aforementioned identification is used to manually identify the individual, and once the individual is manually identified and entered into the system 100’, the system 100’, via the feature extraction module 161, builds a reference template (or template(s)), via module 162, from the extracted features of the face and of the contour of the hand, as well as the passport text, photograph and barcode, for the individual, at block 404.
  • the entity identifying the individual is typically a human, but could also be a digital know your client (KYC) evaluator, who positively identifies the individual via his document(s). The individual is now verified and the data for the individual is in the system 100’ (e.g., stored in the database 163).
  • the process moves to block 406, where an enrollment process begins.
  • the enrollment process is shown in FIG. 6, to which attention is also directed.
  • the Enrollment process is required prior to authenticating the individual 502 with a multimodal-biometric identity.
  • the enrollment process is conducted at the organization enrollment office (by the enrollment computer 106), which is equipped with an optimized camera device 106a in a minimally noisy environment, for example, a plain/flat background that emphasizes the individual’s face 602, his hand/four fingers contour 604, i.e., the fingers as viewed on the palm side of the hand, and a document 606 of the individual.
  • the individual is physically present at the enrollment office.
  • the individual’s document 606 is scanned and verified by the organization representative to ensure that the acquired identity document information is valid and correct, e.g., that the passport number, issue date, full name, birth date and the like are all correct as written in the individual’s identity document.
  • the enrollment process starts, for example, by initiating an enrollment transaction request, which is generated using the organization’s enrollment services servers (which may be remote), e.g., home server 100, over a secured connection.
  • This enrollment process is, for example, performed remotely, via a remote computer link or connection.
  • a camera 106a or other imaging device acquires an image, typically a sequence of images, e.g., a video, of the individual’s face, hand (fingers of the palm side of the hand), and the individual’s passport, e.g., photograph, and information and/or signature page.
  • This sequence of images is transmitted (sent) to the organization’s enrollment services server 100 over the aforementioned secured connection.
  • one or more of the acquired images is processed in accordance with the process detailed for FIGs. 2A and 2B, with numerous features extracted from each of, for example, five ROIs: the facial image, the hand image, the document image, the document text and the document bar code.
  • ROIs are processed for contour detection (e.g., finger lines and the relationships between them), contrast stretching, greyscale inverse, and feature extraction.
  • the extracted features are used to build a multi-biometric reference template, with a minimal confidence level.
  • this minimal confidence level, for example, represents the proportional probability minus the mismatch/error level between the inspected biometric identity and the reference biometric identity, and is used for an identification of the individual, as sketched below.
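  • read literally, this definition is a single subtraction; the minimal Python sketch below assumes, as an assumption not stated in this description, that both quantities are normalized to the range 0.0..1.0:

        def confidence_level(match_probability, mismatch_error):
            """Proportional match probability minus the mismatch/error level
            between the inspected and the reference biometric identity."""
            return max(0.0, match_probability - mismatch_error)

        # e.g., a 0.93 proportional probability with a 0.05 mismatch/error level:
        print(confidence_level(0.93, 0.05))  # 0.88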
  • the minimum confidence biometric reference template of the individual is either generated (created) by the system 100’ or refused. If the minimum confidence biometric reference template was generated, it is stored by the organization’s enrollment services servers 100 (e.g., in the database(s) 163), for future identifications of the individual. The process moves to block 416, where it is determined whether the minimum confidence biometric reference template was generated (built or created). If no at block 416, the process moves to block 424, where it ends. If yes at block 416, where a biometric reference template(s) was generated (built or created) for the individual, the individual is now enrolled in the system 100’.
  • the reference template(s) from the initial verification process of block 402 is compared against the newly generated biometric reference template(s), at block 418.
  • the comparison may yield new, changed or modified features in the various ROIs of the present biometric reference template(s), as determined by the adaptive learning and retraining engine 180, at block 420.
  • these data changes and/or modifications may be used to update the present biometric template(s), at block 422, with this updated biometric reference template(s) now being the present biometric reference template(s) (stored in the database(s) 163 for the individual).
  • the process moves to block 424, where it ends.
  • in FIG. 4B, the process of authenticating or identifying the individual occurs.
  • the process is also illustrated at FIG. 7.
  • the process begins at a START block 450, where a third-party application initiates an authentication request for authorizing a specific electronic transaction.
  • the process moves to block 452, where an authentication or identification request is generated by the organization’s server(s) 100, which are, for example, remote. This request reaches the organization’s computer system via a secured connection.
  • the individual’s 502 face 702, finger contour 704, and, document, e.g., passport 706, are imaged, for example, locally in real time, by a laptop or other camera type device 120, in a sequence of one or more images, including, for example, video, as well as camera still images.
  • ROIs are evaluated, which include, for example, the passport photo 706a, read by the image processing apparatus 171, indicia on the passport 706b1, 706b2, read by optical character recognition (OCR) scanning or the like, by the apparatus 172, and the bar code 706c on the passport 706, read by the bar code reader 173.
  • This data is transmitted (sent) and received by the organization’s server(s) 100, for example, over a secured connection.
  • the received images from the individual’s response to the request, for example, the facial image(s), finger contour image(s), and passport (document) photo image(s), are subject to image processing, to prepare the acquired images for verification by feature extraction.
  • image processing includes, for example, contrast stretching, denoising, unblurring, histogram equalization, and edge detection, as in the sketch below.
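  • a minimal sketch of such an image-processing chain, using the OpenCV library as one possible choice; the library and all parameter values are assumptions for illustration, not part of this description:

        import cv2

        def prepare_for_feature_extraction(image_bgr):
            """Contrast stretching, denoising, sharpening (a simple stand-in for
            unblurring), histogram equalization, and edge detection."""
            gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
            # Contrast stretching: rescale intensities to the full 0..255 range.
            stretched = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)
            # Denoising (the strength value 10 is a placeholder).
            denoised = cv2.fastNlMeansDenoising(stretched, None, 10)
            # Unsharp masking, a common stand-in for "unblurring".
            blurred = cv2.GaussianBlur(denoised, (0, 0), 3)
            sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)
            # Histogram equalization on the 8-bit greyscale image.
            equalized = cv2.equalizeHist(sharpened)
            # Edge detection (the Canny thresholds are placeholders).
            return cv2.Canny(equalized, 50, 150)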
  • the process moves to block 458, where three ROIs from the individual’s identification document (e.g., passport 706), for example, the photograph 706a, the indicia or text 706b1, 706b2, and the barcode 706c, are subjected to image analysis (by the image analyzer 171), optical character reading or scanning (by the OCR 172), and bar code reading (by the bar code reader 173), and are used to verify the identity of the individual.
  • the verification is performed at block 460, where the acquired document (e.g., passport 706) information (data) is verified against the previously obtained passport data, from the verification process of block 402, detailed above, stored in the database(s) 163.
  • the comparison module 165, programmed with a threshold level (a number or predetermined number) for matches of document (e.g., passport) data, defined by exact matches and correspondences of compared document features, determines whether the threshold level for verification has been reached; one possible form of this check is sketched below.
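  • one illustrative reading of this threshold check follows; the field names and the threshold value are hypothetical, standing in for whatever document fields and policy an organization actually uses:

        REQUIRED_MATCHES = 4  # placeholder threshold set by organizational policy

        def verify_document(acquired, enrolled):
            """Count exact matches of acquired document fields against the
            enrolled record and compare the count to the policy threshold."""
            fields = ("passport_number", "full_name", "birth_date",
                      "issue_date", "barcode")
            hits = sum(1 for f in fields if acquired.get(f) == enrolled.get(f))
            return hits >= REQUIRED_MATCHES  # True -> block 464, False -> block 470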
  • the process moves to block 462, where the system 100’, e.g., the comparison module 165, determines whether there was verification.
  • if no at block 462, the process moves to block 470, as discussed below. If yes at block 462, the process moves to block 464.
  • the biometric data sent by the individual, i.e., the face image(s) and finger contour image(s), is subjected to feature extraction, at block 464.
  • the process moves to block 466, where the extracted features for the face and finger contour of the individual, are compared against those of the reference template for the individual, for example, stored in the database(s) 163.
  • the process moves to block 468 where, based on an organizational security policy, for example, from the confidence level module 158, a comparison based on feature matching is performed, for example, by the comparison module 165, for matching multiple biometric identities, facial features and finger contour features, between those of the present template (from the database(s) 163) and the image(s) acquired from the individual (e.g., at block 454).
  • the individual is authenticated when a predetermined threshold number (or predetermined number) of matches is reached.
  • confidence levels vary based on the transaction. Assuming a bank transaction of $100, confidence levels may be minimal, requiring matching (exact matches and/or correspondences) of only a few extracted features, and in particular, a few features of the finger contour and/or face. However, for a large transaction, such as $500,000, confidence levels may be maximized, requiring matching (exact matches and/or correspondences) of numerous extracted features, and in particular, numerous features of the fingers and face; one possible tiered policy is sketched below.
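  • such a tiered policy might be sketched as follows; the tiers and feature counts are purely illustrative, echoing the $100 versus $500,000 example above:

        def required_feature_matches(amount_usd):
            """Map the transaction value to the number of matched features required."""
            if amount_usd < 1_000:
                return 5     # minimal confidence: a few finger contour and/or face features
            if amount_usd < 100_000:
                return 25
            return 100       # maximal confidence: numerous finger and face features

        def authenticated(matched_features, amount_usd):
            return matched_features >= required_feature_matches(amount_usd)

        print(authenticated(8, 100))      # True: enough for a small transaction
        print(authenticated(8, 500_000))  # False: a large transaction needs many more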
  • the process moves to block 470, where the authentication results, e.g., approval or disapproval of the individual being authenticated, are returned to the system server(s) 100 which requested the authentication transaction (of block 452).
  • the process moves to block 472, where the process ends.
  • the comparison at block 468 may yield new, changed or modified features in the various ROIs of the present biometric reference template, as determined by the adaptive learning and retraining engine 180. These data changes and/or modifications may be used to update the present biometric template, with this updated biometric reference template now being the present biometric reference template (stored in the database(s) 163) for the individual.
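  • a minimal sketch of such an adaptive template update, assuming a set-based feature representation; the adaptive learning and retraining engine 180 is not described at this level of detail, so this is illustrative only:

        def update_reference_template(template, observed, confidence, min_confidence):
            """Merge newly observed features into the stored reference template
            only when the comparison met the required confidence level, so that
            low-quality captures do not pollute the reference."""
            if confidence < min_confidence:
                return template            # present template left unchanged
            return template | observed     # add the new/changed/modified features

        template = {"face:eye_distance", "finger:pip_crease_2"}
        observed = {"face:eye_distance", "finger:pip_crease_2", "finger:scar_left_index"}
        print(update_reference_template(template, observed, 0.92, 0.85))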
  • while the processes of FIGs. 4A and 4B include an identification document, face biometric data and hand biometric data, these processes may also be performed with the identification document and hand biometric data only, or with the identification document and face biometric data only.
  • the processes of FIGs. 3A, 3B, 4A and 4B, and/or portions thereof, may be repeated if desired or necessary, for as long as is desired or necessary.
  • the aforementioned processes are, for example, repeated, when new or subsequent data, including, image data, or new document data, or the like, is obtained.
  • Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
  • for example, several selected tasks could be performed by a data processor, such as a computing platform, executing a plurality of instructions.
  • the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, non-transitory storage media such as a magnetic hard-disk and/or removable media, for storing instructions and/or data.
  • a network connection is provided as well.
  • a display and/or a user input device such as a keyboard or mouse are optionally provided as well.
  • non-transitory computer readable (storage) medium(s) may be utilized in accordance with the above-listed embodiments of the present invention.
  • a non-transitory computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable non-transitory storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • processes and portions thereof can be performed by software, hardware and combinations thereof. These processes and portions thereof can be performed by computers, computer-type devices, workstations, processors, micro-processors, other electronic searching tools and memory and other non-transitory storage-type devices associated therewith.
  • the processes and portions thereof can also be embodied in programmable non-transitory storage media, for example, compact discs (CDs) or other discs including magnetic, optical, etc., readable by a machine or the like, or other computer usable storage media, including magnetic, optical, or semiconductor storage, or other source of electronic signals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

Disclosed are multimodal-biometric identification, authentication and management systems, providing biometric identification of the hand combined with three dimensional (3D) face recognition and/or document identity data recognition. The multimodal-biometric identification of a person (human) is provided using a continuous, live identity capturing process in a three dimensional space.
PCT/IB2020/050530 2020-01-23 2020-01-23 Procédé et système biométriques permettant une analyse de la main WO2021148844A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2020/050530 WO2021148844A1 (fr) 2020-01-23 2020-01-23 Procédé et système biométriques permettant une analyse de la main

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2020/050530 WO2021148844A1 (fr) 2020-01-23 2020-01-23 Procédé et système biométriques permettant une analyse de la main

Publications (1)

Publication Number Publication Date
WO2021148844A1 true WO2021148844A1 (fr) 2021-07-29

Family

ID=76992843

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/050530 WO2021148844A1 (fr) 2020-01-23 2020-01-23 Procédé et système biométriques permettant une analyse de la main

Country Status (1)

Country Link
WO (1) WO2021148844A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022245783A3 (fr) * 2021-05-17 2023-01-12 Vr Media Technology, Inc. Hand recognition system that compares narrow-band ultraviolet-absorbing skin chromophores
WO2023063940A1 (fr) * 2021-10-13 2023-04-20 Hewlett-Packard Development Company, L.P. Region of interest cropped images

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090174526A1 (en) * 2002-10-11 2009-07-09 Howard James V Systems and Methods for Recognition of Individuals Using Multiple Biometric Searches
US20090116706A1 (en) * 2005-05-13 2009-05-07 Rudolf Hauke Device and procedure for the recording of lines of the fingers respectively the hand
US20080302870A1 (en) * 2006-10-30 2008-12-11 Cryptometrics, Inc. Computerized biometric passenger identification system and method
US20120293642A1 (en) * 2011-05-18 2012-11-22 Nextgenid, Inc. Multi-biometric enrollment kiosk including biometric enrollment and verification, face recognition and fingerprint matching systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FENG JIANJIANG; ZHOU HUAPENG; ZHOU JIE: "A Preliminary Study of Handprint Synthesis", BIOMETRIC RECOGNITION : 6TH CHINESE CONFERENCE, CCBR 2011, BEIJING, CHINA, DECEMBER 3-4, 2011 ; PROCEEDINGS, SPRINGER, BERLIN, HEIDELBERG, vol. 7098 Chap.17, 3 December 2011 (2011-12-03) - 4 December 2011 (2011-12-04), Berlin, Heidelberg, pages 133 - 140, XP047439203, ISBN: 978-3-642-25448-2, DOI: 10.1007/978-3-642-25449-9_17 *

Similar Documents

Publication Publication Date Title
US20220165087A1 (en) Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices
CN110326001B (zh) 使用利用移动设备捕捉的图像执行基于指纹的用户认证的系统和方法
Verma et al. Daughman’s algorithm method for iris recognition—a biometric approach
Jaswal et al. DeepKnuckle: revealing the human identity
US20190392129A1 (en) Identity authentication method
Ross Information fusion in fingerprint authentication
Aravindan et al. Robust partial fingerprint recognition using wavelet SIFT descriptors
Krishneswari et al. A review on palm print verification system
Ilankumaran et al. Multi-biometric authentication system using finger vein and iris in cloud computing
Haji et al. Real time face recognition system (RTFRS)
WO2021148844A1 (fr) Procédé et système biométriques permettant une analyse de la main
Aguilar et al. Fingerprint recognition
Agarwal et al. Human identification and verification based on signature, fingerprint and iris integration
Jose et al. Towards building a better biometric system based on vein patterns in human beings
Lin et al. Accuracy enhanced thermal face recognition
Methani Camera based palmprint recognition
Pathak Image compression algorithms for fingerprint system
Ribarić et al. A biometric identification system based on the fusion of hand and palm features
Mansour Iris recognition using gauss laplace filter
Abdulla et al. Exploring Human Biometrics: A Focus on Security Concerns and Deep Neural Networks
Bala et al. An effective multimodal biometric system based on textural feature descriptor
Kumar et al. Robust palm vein recognition using LMKNCN classification
Patil et al. Multimodal biometric identification system: Fusion of Iris and fingerprint
Verma et al. Personal palm print identification using KNN classifier
Supriya et al. Efficient iris recognition by fusion of matching scores obtained by lifting DWT and Log-Gabor methods of feature extraction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20916198

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.12.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20916198

Country of ref document: EP

Kind code of ref document: A1