WO2022084039A1 - Method and system for updating a user identification system - Google Patents


Info

Publication number
WO2022084039A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
feature extractor
image data
database
feature
Prior art date
Application number
PCT/EP2021/077554
Other languages
English (en)
Inventor
Marc FARESSE
Original Assignee
Dormakaba Schweiz Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dormakaba Schweiz Ag filed Critical Dormakaba Schweiz Ag
Publication of WO2022084039A1 publication Critical patent/WO2022084039A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/761 Proximity, similarity or dissimilarity measures
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 Detection; Localisation; Normalisation
    • G06V 40/168 Feature extraction; Face representation
    • G06V 40/172 Classification, e.g. identification
    • G06V 40/50 Maintenance of biometric data or enrolment thereof
    • G06V 40/53 Measures to keep reference information secret, e.g. cancellable biometrics
    • G06V 40/55 Performing matching on a personal external card, e.g. to avoid submitting reference information

Definitions

  • the present disclosure generally relates to the field of neural network-based facial recognition systems for user enrollment and identification.
  • Facial recognition is broadly used in many industries. Some example applications are in law enforcement, retail, healthcare, and access and security. While recognizing faces is a trivial task for humans, it remains a complex problem for machines. Neural networks are used to recognize faces by mapping facial features from a photograph or video to databases with known faces to find a match. Deep learning can leverage very large datasets of faces to provide highly performing systems.
  • a method for updating a user identification system comprises pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.
  • a user identification system comprising at least one processor and at least one non-transitory computer readable medium having stored thereon program instructions.
  • the program instructions are executable by the processor for pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.
  • FIG. 1 is a block diagram of an example user enrollment and identification system;
  • FIGs. 2A-2C are block diagrams of example embodiments for updating the user identification system of FIG. 1;
  • FIG. 3 is a schematic diagram of multiple instances of algorithms running in parallel;
  • FIGs. 4A-4C are flowcharts of an example method for updating a user identification system;
  • FIG. 5 is a block diagram of an example computing device.
  • Fig. 1 shows an example of an identification system 100.
  • the identification is based on facial recognition provided from image data acquired by an image capture device 101.
  • the image capture device 101 is a sensor, such as a camera, that can acquire an image directly from a person standing in front of it.
  • alternatively, the image capture device 101 acquires an image of an existing image, such as the picture on a badge. Any device capable of acquiring image data may be used.
  • the image data is received for pre-processing 102, which generally includes face detection and alignment steps. Face detection segments the face areas from the background and provides coarse estimates of the location and scale of a detected face. Face alignment then refines localization and normalization of the detected face.
  • Pre-processing 102 may include cropping, rotation, and/or scaling of an image. If the image data is taken from a video stream, the detected face may need to be tracked using a face tracking component. Any known or other pre-processing steps for facial recognition may be used.
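The crop-and-scale portion of the pre-processing stage can be sketched with plain NumPy. The bounding-box convention, the 112-pixel output size, and the nearest-neighbour resize are illustrative assumptions, not values from the disclosure; a real system would also rotate the crop so the eyes lie on a horizontal line (alignment):

```python
import numpy as np

def preprocess_face(image: np.ndarray, bbox: tuple, out_size: int = 112) -> np.ndarray:
    """Crop the detected face region and rescale it to a fixed size.

    `bbox` is (top, left, height, width) as produced by a hypothetical
    face detector; the detector itself is outside this sketch.
    """
    top, left, h, w = bbox
    face = image[top:top + h, left:left + w]
    # Nearest-neighbour resize to out_size x out_size (illustrative only).
    rows = np.arange(out_size) * h // out_size
    cols = np.arange(out_size) * w // out_size
    return face[rows][:, cols]

# Example: a synthetic 200x200 grayscale "image" with a face at (50, 60).
img = np.random.rand(200, 200)
aligned = preprocess_face(img, (50, 60, 80, 80))
print(aligned.shape)  # (112, 112)
```

The fixed output size reflects the fact that downstream feature extractors typically expect a normalized input resolution.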
  • the pre-processed image data is transmitted to a feature extractor 104, where feature vectors are extracted.
  • the feature vectors are used for feature matching 108, by comparing the extracted feature vectors to user data stored in a database 106.
  • the user data generally comprises previously extracted feature vectors associated with a given user, for example with a user identification number. When a match is found with sufficient confidence, for example using a Euclidean distance calculation between the extracted feature vector and the stored feature vectors, identification of the user is confirmed. If a match is not found with sufficient confidence, identification is denied. Other techniques for feature matching may also be used for user identification.
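The Euclidean-distance matching described above can be sketched as follows; the database contents, the three-dimensional vectors, and the 0.5 threshold are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Hypothetical enrolled database: user ID -> previously extracted feature vector.
DATABASE = {
    "user-001": np.array([0.1, 0.9, 0.3]),
    "user-002": np.array([0.8, 0.2, 0.5]),
}

def identify(feature: np.ndarray, threshold: float = 0.5):
    """Return the user ID of the closest stored vector, or None when no
    stored vector is within `threshold` Euclidean distance (identification
    denied for lack of sufficient confidence)."""
    best_id, best_dist = None, float("inf")
    for user_id, stored in DATABASE.items():
        dist = float(np.linalg.norm(feature - stored))
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    return best_id if best_dist <= threshold else None

print(identify(np.array([0.12, 0.88, 0.31])))  # user-001
print(identify(np.array([9.0, 9.0, 9.0])))     # None
```

Production systems would typically use cosine similarity or an approximate nearest-neighbour index for large databases, but the thresholded-distance logic is the same.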
  • Enrollment, or registration, of users into the system 100 may be performed at any time by storing the extracted feature vector in the database 106 and associating the feature vector with a user identifier.
  • the user identifier and any other data stored in the database 106 is devoid of any details allowing users to be identified, such as pictures, names, addresses, etc. This may be done, for example, to respect various privacy laws in certain jurisdictions, such as the General Data Protection Regulation (GDPR) in the European Union.
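Enrollment under this privacy constraint can be sketched as storing the feature vector under an opaque identifier; the use of a random UUID here is an illustrative assumption (the disclosure only requires that the stored data be devoid of identifying details):

```python
import uuid
import numpy as np

second_database: dict[str, np.ndarray] = {}

def enroll(feature_vector: np.ndarray) -> str:
    """Store a feature vector under a freshly generated anonymous ID.
    No name, picture, or other personal detail is persisted."""
    anonymous_id = uuid.uuid4().hex  # opaque identifier, not derived from the user
    second_database[anonymous_id] = feature_vector
    return anonymous_id

uid = enroll(np.array([0.4, 0.6]))
print(uid in second_database)  # True
```

Because the identifier carries no personal information, the database alone cannot be used to recover who the user is, which is the property the GDPR-style constraint targets.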
  • an update to the system 100 is seamlessly performed by running a parallel instance 202 while also performing user identification concurrently.
  • the parallel instance comprises an updated or modified feature extractor 204 that receives the pre-processed image data in parallel with the feature extractor 104.
  • “updated” and “modified” may be used interchangeably to refer to any changes having been applied to an original version.
  • the expression “updating the user identification system” encompasses any changes to be made to the software and/or hardware of the system.
  • Feature vectors extracted by the feature extractor 104 are used to perform user identification through comparison with feature vectors stored in the database 106.
  • Feature vectors extracted by the updated feature extractor 204 are used to populate a new database 206, for enrollment purposes. Therefore, the update to the system 100 is performed concurrently with continued use of the system 100 for identification purposes, and there is no downtime to the system 100 required for the update. Moreover, the update is performed without having to store any images or other personal information related to the users.
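The identify-with-the-old, enroll-with-the-new flow above can be sketched as a single per-frame routine. The two extractor stand-ins, the embedding sizes, and the matching threshold are illustrative assumptions; real extractors would be neural networks:

```python
import numpy as np

def extractor_v1(image: np.ndarray) -> np.ndarray:
    """Stand-in for the original feature extractor 104."""
    return image.flatten()[:4]

def extractor_v2(image: np.ndarray) -> np.ndarray:
    """Stand-in for the updated feature extractor 204 (longer embedding)."""
    return image.flatten()[:8]

db_v1 = {"user-001": np.zeros(4)}  # existing database 106
db_v2: dict[str, np.ndarray] = {}  # new database 206, populated during the update

def process_frame(image: np.ndarray, user_id: str) -> bool:
    """Identify against db_v1 while concurrently enrolling into db_v2."""
    v1 = extractor_v1(image)
    identified = any(np.linalg.norm(v1 - stored) < 0.5 for stored in db_v1.values())
    if identified:
        # Enrollment rides along with normal identification: no downtime,
        # and no image or personal data is stored, only the new vector.
        db_v2[user_id] = extractor_v2(image)
    return identified

ok = process_frame(np.zeros((2, 4)), "user-001")
print(ok, "user-001" in db_v2)  # True True
```

Each successful identification opportunistically grows the new database, so the migration completes as users naturally pass through the system.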
  • Fig. 2B illustrates an example where the update to the system 100 concerns not only the algorithm used for extracting features through the feature extractor 104, but also the pre-processing steps 102.
  • a parallel instance 212 includes an updated pre-processing stage 208.
  • the image data is sent to both pre-processing stages 102, 208, and each feature extractor 104, 204 receives pre-processed image data from a respective pre-processing stage 102, 208.
  • the system 100 may also be used to transition from an identification system based on credentials to an identification system based on facial recognition.
  • an ID reader 210 may scan the credentials of the user, such as from a card, badge, or mobile device, and extract therefrom a user ID.
  • the image capture device 101 acquires the image data for the user.
  • the database 206 of a parallel instance 222 can then be built without any external intervention as the user ID from the credentials is automatically associated with the feature vectors extracted from the pre-processed image data by the updated feature extractor 204.
  • the update to the system 100 is completed when one or more update criteria has been met.
  • the update criteria may be a total number of users to enroll in the new database 206.
  • the update criteria may be a finite list of users to be enrolled in the new database 206, and the update is completed when each user from the list has been enrolled in the new database 206.
  • the update criteria may be the expiry of a timer or a combination of a number of user IDs and a timer.
  • Various criteria may be used to determine that the update has been completed, depending on practical implementation.
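The three update criteria listed above (target count, finite user list, timer) can be sketched as a small predicate over the new database; the field names and the combination logic are illustrative assumptions:

```python
import time

class UpdateCriteria:
    """Declares when the parallel-instance update is complete: a target
    enrollment count is reached, a required user list is fully enrolled,
    or a deadline expires."""

    def __init__(self, target_count=None, required_users=None, deadline=None):
        self.target_count = target_count
        self.required_users = set(required_users or [])
        self.deadline = deadline  # absolute time, seconds since epoch

    def is_complete(self, new_database: dict) -> bool:
        if self.target_count is not None and len(new_database) >= self.target_count:
            return True
        if self.required_users and self.required_users <= new_database.keys():
            return True
        if self.deadline is not None and time.time() >= self.deadline:
            return True
        return False

criteria = UpdateCriteria(required_users=["u1", "u2"])
print(criteria.is_complete({"u1": b"...", "u2": b"..."}))  # True
print(criteria.is_complete({"u1": b"..."}))                # False
```

A combined criterion (e.g. "N users enrolled or the timer expired, whichever comes first") falls out naturally from evaluating the conditions in sequence.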
  • operation of the system 100 is transitioned to the second instance 202, 212, 222 for user identification when the update is completed.
  • the first instance, namely the original feature extractor 104 and original database 106, and in some cases the original pre-processing 102, may be removed entirely from the system 100 or simply deactivated (i.e. taken offline).
  • the image data is then received only in the second instance 202, 212, 222 which would become the only instance of the algorithm. Transition to the second instance 202, 212, 222 may occur automatically or manually.
  • both the first instance and the second instance 202, 212, 222 are used for user identification once the update is completed.
  • feature matching 108 is performed by searching for a match in both the first database 106 and the second database 206.
  • feature matching 108 may be based on feature vectors received from the feature extractor 104 and from the updated feature extractor 204.
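Searching both databases after the update can be sketched as a two-stage lookup, where each extractor's vector is only ever compared against the database built with that extractor (embeddings from different models are not directly comparable). The database contents and the threshold are illustrative assumptions:

```python
import numpy as np

db_old = {"user-001": np.array([0.1, 0.9])}  # database 106 (original extractor)
db_new = {"user-002": np.array([0.7, 0.3])}  # database 206 (updated extractor)

def match_anywhere(vec_old, vec_new, threshold=0.5):
    """Try the original database with the original extractor's vector,
    then the new database with the updated extractor's vector."""
    for user_id, stored in db_old.items():
        if np.linalg.norm(vec_old - stored) <= threshold:
            return user_id
    for user_id, stored in db_new.items():
        if np.linalg.norm(vec_new - stored) <= threshold:
            return user_id
    return None

print(match_anywhere(np.array([0.1, 0.9]), np.array([9.0, 9.0])))  # user-001
print(match_anywhere(np.array([5.0, 5.0]), np.array([0.7, 0.3])))  # user-002
```

This lets users enrolled only in the old database and users enrolled only in the new one both be identified during the transition period.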
  • Subsequent updates to the system 100 may result in multiple instances running in parallel, as illustrated in Fig. 3.
  • An original version 300 may be supplemented and/or replaced by subsequent instances 302, 304, 306, which may be updated versions of an algorithm and/or modified versions of the algorithm.
  • the multiple instances may be associated with distinct geographical locations. For example, a company having different locations around the world may have different requirements with regards to the algorithms that they can run at each location. There may also be restrictions with regards to the data stored in each of the databases associated with each algorithm as a function of the location.
  • the system 100 thus offers interoperability between locations and more flexibility. A user enrolled in an instance running at a location in the United States who travels to China would not need to re-enroll in the system in China; the system running in China could access the database of the instance located in the United States to verify the identity of the user.
  • Fig. 4A illustrates an example of a method 400 for updating a user identification system, such as the system 100.
  • image data acquired by an image capture device for a given user is pre-processed for face detection and alignment.
  • the pre-processed image data is transmitted to a first feature extractor associated with a first database and a second feature extractor associated with a second database.
  • the second feature extractor is an updated and/or alternate version of the first feature extractor.
  • user identification 404 and user enrollment 406 are performed concurrently.
  • User identification 404 is performed with the former version of the feature extractor, whereas user enrollment is performed with the updated or alternate version of the feature extractor.
  • User identification 404 comprises extracting feature vector(s) with the first feature extractor at step 408, and comparing the first feature vector(s) to user data stored in the first database to identify the user.
  • User enrollment 406 comprises extracting second feature vector(s) with the second feature extractor at step 412, and storing the second feature vector(s) with user data in the second database at step 414.
  • user enrollment 406 is performed in compliance with privacy regulations by updating the user identification system without storing image data that would allow a user to be identified, such as face images.
  • step 414 of enrolling users in the second database comprises associating an anonymous user ID to the second feature vector and storing the anonymous user ID and second feature vector(s) in the second database.
  • the user ID is obtained from an ID reader concurrently with the acquisition of the image data for facial recognition. While this may be done to facilitate the update to the user identification system, it may also allow a transition from a credentials-based system that only uses an ID reader to a facial recognition system based on image data.
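The credentials-to-facial-recognition transition can be sketched as keying the new database by the ID read from the badge; the badge ID format and the extractor stand-in are illustrative assumptions:

```python
import numpy as np

new_database: dict[str, np.ndarray] = {}

def enroll_from_credentials(badge_user_id: str, image: np.ndarray) -> None:
    """Associate the ID scanned by the ID reader with the feature vector
    extracted from the concurrently captured image, so the facial
    recognition database is built without manual enrollment."""
    feature_vector = image.flatten()[:4]  # stand-in for the updated extractor 204
    new_database[badge_user_id] = feature_vector

# Each badge swipe both grants access (via the existing credential check,
# not shown) and silently enrolls the user's face.
enroll_from_credentials("badge-7421", np.random.rand(8, 8))
print("badge-7421" in new_database)  # True
```

Once enough badge IDs have been paired with vectors, the system can switch to face-only identification while still reporting the familiar credential IDs downstream.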
  • the method 400 further comprises completing the update to the user identification system when one or more update criteria has been met.
  • the second feature extractor and second database are transitioned to user identification, which may involve taking the first feature extractor and the first database offline. Alternatively, both feature extractors and both databases are used concurrently to perform user identification.
  • the update is not only for the feature extractor, but also for the pre-processing steps.
  • pre-processing of the image data 402 is performed separately for each feature extractor, as illustrated in the embodiment of Fig. 4C.
  • the user identification system 100 and the method 400 are implemented in one or more computing devices 500, as illustrated in Fig. 5.
  • the user identification system 100 may include multiple computing devices 500 operable to exchange data.
  • each instance 300, 302, 304, 306 may be implemented in a separate computing device 500.
  • the computing devices 500 may be the same or different types of devices.
  • the computing device 500 comprises a processing unit 502 and a memory 504 which has stored therein computer-executable instructions 506.
  • the processing unit 502 may comprise any suitable devices configured to implement a method such that instructions 506, when executed by the computing device 500 or other programmable apparatus, may cause the functions/acts/steps to be executed.
  • the processing unit 502 may comprise, for example, any type of general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, a central processing unit (CPU), a graphics processing unit (GPU), AI-enabled chips, an integrated circuit, a field programmable gate array (FPGA), a reconfigurable processor, other suitably programmed or programmable logic circuits, or any combination thereof.
  • the memory 504 may comprise any suitable known or other machine-readable storage medium.
  • the memory 504 may comprise non-transitory computer readable storage medium, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • the memory 504 may include a suitable combination of any type of computer memory that is located either internally or externally to device, for example random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CDROM), electro- optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), and electrically-erasable programmable read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like.
  • Memory 504 may comprise any storage means (e.g., devices) suitable for retrievably storing machine-readable instructions 506 executable by processing unit 502.
  • the methods and systems described herein may be implemented in a high-level procedural or object-oriented programming or scripting language, or a combination thereof, to communicate with or assist in the operation of a computer system, for example the computing device 500.
  • the methods and systems may be implemented in assembly or machine language.
  • the language may be a compiled or interpreted language.
  • Program code for implementing the methods and systems may be stored on a storage media or a device, for example a ROM, a magnetic disk, an optical disc, a flash drive, or any other suitable storage media or device.
  • the program code may be readable by a general or special-purpose programmable computer for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.
  • Embodiments of the methods and systems may also be considered to be implemented by way of a non-transitory computer-readable storage medium having a computer program stored thereon.
  • the computer program may comprise computer-readable instructions which cause a computer, or more specifically the processing unit 502 of the computing device 500, to operate in a specific and predefined manner to perform the functions described herein, for example those described in the method 400.
  • Computer-executable instructions may be in many forms, including program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
  • functionality of the program modules may be combined or distributed as desired in various embodiments.
  • the embodiments described herein provide useful physical machines and particularly configured computer hardware arrangements.
  • the embodiments described herein are directed to electronic machines and methods implemented by electronic machines adapted for processing and transforming electromagnetic signals which represent various types of information.
  • the embodiments described herein pervasively and integrally relate to machines, and their uses; and the embodiments described herein have no meaning or practical applicability outside their use with computer hardware, machines, and various hardware components.
  • Substituting the physical hardware particularly configured to implement various acts for non-physical hardware, using mental steps for example, may substantially affect the way the embodiments work.
  • Such computer hardware limitations are clearly essential elements of the embodiments described herein, and they cannot be omitted or substituted for mental means without having a material effect on the operation and structure of the embodiments described herein.
  • the computer hardware is essential to implement the various embodiments described herein and is not merely used to perform steps expeditiously and in an efficient manner.
  • “connected to” or “coupled to” may include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements).
  • the technical solution of embodiments may be in the form of a software product.
  • the software product may be stored in a non-volatile or non-transitory storage medium, which can be a compact disk read-only memory (CD-ROM), a USB flash disk, or a removable hard disk.
  • the software product includes a number of instructions that enable a computer device (personal computer, server, or network device) to execute the methods provided by the embodiments.

Abstract

A user identification system and a method for updating it are disclosed. The method comprises pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.
PCT/EP2021/077554 2020-10-23 2021-10-06 Method and system for updating a user identification system WO2022084039A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063104892P 2020-10-23 2020-10-23
US63/104,892 2020-10-23

Publications (1)

Publication Number Publication Date
WO2022084039A1 2022-04-28

Family

ID=78085702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/077554 2020-10-23 2021-10-06 WO2022084039A1 Method and system for updating a user identification system

Country Status (1)

Country Link
WO (1) WO2022084039A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115510089A * 2022-11-15 2022-12-23 以萨技术股份有限公司 Vector feature comparison method, electronic device and storage medium
CN115510089B * 2022-11-15 2023-03-10 以萨技术股份有限公司 Vector feature comparison method, electronic device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020112177A1 (en) * 2001-02-12 2002-08-15 Voltmer William H. Anonymous biometric authentication
US9418214B1 (en) * 2011-12-06 2016-08-16 Imageware Systems, Inc. Anonymous biometric enrollment
US20180365402A1 (en) * 2017-06-20 2018-12-20 Samsung Electronics Co., Ltd. User authentication method and apparatus with adaptively updated enrollment database (db)
US20190213394A1 (en) * 2017-08-01 2019-07-11 Apple Inc. Multiple enrollments in facial recognition
WO2019173562A1 (fr) * 2018-03-07 2019-09-12 Open Inference Holdings LLC Systèmes et procédés de traitement biométrique respectant la confidentialité



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 21787426
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 21787426
    Country of ref document: EP
    Kind code of ref document: A1