WO2022046120A1 - User authentication using event cameras - Google Patents

User authentication using event cameras

Info

Publication number
WO2022046120A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
event
pattern
event stream
processor
Prior art date
Application number
PCT/US2020/048810
Other languages
French (fr)
Inventor
Srikanth KUTHURU
Madhu Sudan ATHREYA
Manu RASTOGI
Mithra VANKIPURAM
M Anthony Lewis
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2020/048810
Publication of WO2022046120A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/165Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • Computing devices may often include security features that may prevent unauthorized access to the computing devices.
  • Some of the security features may employ facial recognition for identification verification.
  • an image of a user’s face may be captured and characteristics of the user’s face may be analyzed against reference facial image data. Access to a computing device may be permitted when there is a sufficient match between the facial characteristics and the reference facial image data and may be denied when a sufficient match does not exist.
  • FIG. 1 shows a block diagram of an example apparatus that may authenticate a user based on an event stream received from an event camera;
  • FIG. 2 shows a block diagram of an example system in which the apparatus depicted in FIG. 1 may be implemented
  • FIGS. 3 and 4 respectively, show block diagrams of example systems, in which a processor may authenticate a user based on an event stream received from an event camera;
  • FIGS. 5-7 respectively, show flow diagrams of example methods for authenticating a user based on a facial image in an event stream received from an event camera;
  • FIG. 8 shows a block diagram of an example computer-readable medium that may have stored thereon computer-readable instructions for authenticating a user based on a facial image in an event stream received from an event camera.
  • the terms “a” and “an” are intended to denote at least one of a particular element.
  • the term “includes” means includes but not limited to, the term “including” means including but not limited to.
  • the term “based on” means based at least in part on.
  • a processor may authenticate a user based on a facial image in an event stream received from an event camera. That is, instead of a camera that may include a shutter to capture images of a user, such that the images of the user may be used to authenticate the user, the apparatuses disclosed herein may receive an event stream from an event camera.
  • an event camera may include an imaging sensor that may respond to local changes in brightness. That is, the event camera may be a camera that may include pixels that may each operate independently and asynchronously with respect to each other and may report changes in brightness as they are detected.
  • event cameras may be efficient in energy and processing resource usage and may have a higher dynamic range and sampling rate than cameras that use shutters.
  • event cameras may be beneficial over cameras that use shutters.
  • an issue with the use of event cameras is that they may only detect changes in brightness, which may be caused by motion, and thus, if a user is stationary after a first movement, the event cameras may not detect the user's face.
  • a processor may implement various techniques to enable the use of event cameras in the detection of a user’s face.
  • the processor may identify features of the user’s face and may use the identified features in determining whether the user is authentic.
  • the processor may use an event simulator that may produce images that may be similar to the event camera’s output. In order to simulate such images for training, the processor may use an event simulator on normal face videos as discussed in greater detail herein below.
  • an IR light source may output pulses of IR illumination onto a user such that the event camera may generate events responsive to the changes in illumination.
  • the IR light source may output the pulses according to a predefined pattern, in which the predefined pattern may be a secret or otherwise secure pattern, which the processor may use to authenticate an event stream received from the event camera.
  • an event camera may be implemented for facial recognition and authentication purposes.
  • the amount of processing and energy resources associated with the facial recognition and authentication purposes may be reduced and/or minimized as compared with use of other types of cameras.
  • FIG. 1 shows a block diagram of an example apparatus 100 that may authenticate a user based on a facial image in an event stream received from an event camera.
  • FIG. 2 shows a block diagram of an example system 200 in which the apparatus 100 depicted in FIG. 1 may be implemented. It should be understood that the example apparatus 100 depicted in FIG. 1 and/or the example system 200 depicted in FIG. 2 may include additional features and that some of the features described herein may be removed and/or modified without departing from the scopes of the apparatus 100 and/or the system 200.
  • the apparatus 100 may be a computing system such as a laptop computer, a tablet computer, a desktop computer, a smartphone, or the like.
  • the apparatus 100 may include a processor 102, which may be a semiconductor-based microprocessor, a central processing unit (CPU), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and/or other suitable hardware device.
  • the apparatus 100 may also include a memory 110 that may have stored thereon machine-readable instructions (which may equivalently be termed computer-readable instructions) that the processor 102 may execute.
  • the memory 110 may be an electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions.
  • the memory 110 may be, for example, Random-Access memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like.
  • the memory 110, which may also be referred to as a computer-readable storage medium, may be a non-transitory machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals.
  • the memory 110 may have stored thereon machine-readable instructions 112-118 that the processor 102 may execute.
  • although the instructions 112-118 are described herein as being stored on the memory 110 and may thus include a set of machine-readable instructions, the apparatus 100 may include hardware logic blocks that may perform functions similar to the instructions 112-118.
  • the processor 102 may include hardware components that may execute the instructions 112-118.
  • the apparatus 100 may include a combination of instructions and hardware logic blocks to implement or execute functions corresponding to the instructions 112-118.
  • the processor 102 may implement the hardware logic blocks and/or execute the instructions 112-118.
  • the apparatus 100 may also include additional instructions and/or hardware logic blocks such that the processor 102 may execute operations in addition to or in place of those discussed above with respect to FIG. 1.
  • the processor 102 may execute the instructions 112 to receive an event stream 202 from an event camera 204, in which the event stream 202 may correspond to a user 206 in a field of view of the event camera 204.
  • the event camera 204 may capture images 208 of the user 206 and may generate a plurality of events corresponding to the captured images 208.
  • the event camera 204 may also generate and send the event stream 202 to the apparatus 100.
  • the event camera 204 may be integrated with the apparatus 100.
  • the event camera 204 may be mounted within and/or on a housing of the apparatus 100. In other examples, the event camera 204 may be separate from the apparatus 100.
  • the event camera 204 may be an external camera, such as an external webcam, a security camera, or the like, that may be connected to the apparatus 100 through a wired or a wireless connection.
  • the event camera 204, which may equivalently be termed an event-based camera, a dynamic vision sensor camera, a neuromorphic camera, and/or the like, may include an imaging sensor that may respond to local changes in brightness.
  • the event camera 204 may be a camera that may include pixels that may each operate independently and asynchronously with respect to each other and may report changes in brightness as they are detected. That is, the pixels in the event camera 204 may compare brightness levels to a reference brightness level and may generate an event when a difference between a detected brightness and the reference brightness level exceeds a preset threshold.
  • the event camera 204 may generate an event when there is a change in a scene, e.g., motion within the field of view of the event camera 204, and may not generate an event when there is no change in a scene, as may occur when the user 206 within the field of view of the event camera 204 is stationary.
  • the event camera 204 may therefore be distinguished from other types of cameras that may capture images using shutters.
  • the pixels of the event camera 204 may capture events corresponding to an image 208 of the user 206 when the user 206 positions the user’s face in front of the event camera 204.
  • the user 206 may position their face in front of the event camera 204 when the user 206 seeks to be authenticated to access the apparatus 100 and/or an application executing on the apparatus 100.
  • the event camera 204 may be activated, e.g., powered on, when the apparatus 100 is activated from a power off state or a standby state.
  • the event camera 204 may be activated when an application that may use facial recognition-based authentication is opened or launched on the apparatus 100.
  • the event camera 204 may communicate the events in the event stream 202 to the apparatus 100.
  • the processor 102 may execute the instructions 114 to apply an image recognition operation 210 on the event stream 202 to identify features of the user 206.
  • the processor 102 may identify biometric features of the user from the event stream 202, e.g., a facial image in the event stream 202. That is, the processor 102 may identify features of the user 206, such as patterns, based on the user’s facial textures and shape.
  • the processor 102 may apply the image recognition operation 210 to analyze the relative positions, sizes, and/or shapes of the user’s 206 eyes, nose, cheekbones, jaw, and/or the like.
  • the processor 102 may execute the instructions 116 to determine whether the identified features of the user 206 match features of a reference image 212.
  • the reference image 212 may be an image of the user 206 or of another user that may have previously been captured and registered with the apparatus 100.
  • the reference image 212 may be an image of a user who is authorized to access the apparatus 100, to access a particular application, and/or the like.
  • the features of the reference image 212 may be similar to the types of features that were identified from the event stream 202.
  • the features of the reference image 212 may be feature vectors that may be determined through application of a neural network on a facial image in the reference image 212.
  • the reference image 212 and/or the features of the reference image 212 may be stored in a data store 214 and the processor 102 may access the reference image 212 and/or the features of the reference image 212 from the data store 214.
  • the data store 214 may be an electronic, magnetic, optical, or other physical storage device.
  • the data store 214 may be a Random-Access memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like.
  • the data store 214 and the memory 110 may be the same component, while in other examples, the data store 214 may be a separate component from the memory 110.
  • the processor 102 may execute the instructions 118 to authenticate or reject 216 the user 206 based on a determination as to whether the identified features of the user 206 match the features of the reference image 212. That is, the processor 102 may determine that the user 206 is authentic based on the identified features of the user 206 matching the features of the reference image 212. Particularly, the processor 102 may determine that the user 206 is authentic when the identified features of the user 206 exactly match the features of the reference image 212. In other examples, the processor 102 may determine that the user 206 is authentic when a difference between the identified features of the user 206 and the features of the reference image 212 is below a predefined threshold. The predefined threshold may be defined based on the level of security to be applied. That is, the predefined threshold may be lower for higher levels of security.
  • the processor 102 may determine that the user 206 is inauthentic when the identified features of the user 206 do not exactly match the features of the reference image 212. In other examples, the processor 102 may determine that the user 206 is inauthentic when a difference between the identified features of the user 206 and the features of the reference image 212 exceeds the predefined threshold.
  • the processor 102 may allow authenticated access to the apparatus 100 or an application executing on the apparatus 100. That is, the processor 102 may unlock the apparatus 100, may allow secure access to the application, etc. However, based on a determination that the user 206 is inauthentic, the processor 102 may prevent authenticated access to the apparatus 100 or the application. That is, the processor 102 may keep the apparatus 100 locked, may not enable access to features of the application, and/or the like.
  • FIGS. 3 and 4 respectively show block diagrams of example systems 300 and 400, in which a processor 102 may authenticate a user 206 based on a facial image in an event stream 202 received from an event camera 204.
  • it should be understood that the example systems 300 and 400 depicted in FIGS. 3 and 4 may include additional features and that some of the features described herein may be removed and/or modified without departing from the scopes of the systems 300 and 400.
  • the systems 300 and 400 may include the apparatus 100 with which a user 206 may interact.
  • the apparatus 100 may also include features similar to those discussed herein with respect to the system 200 depicted in FIG. 2. Descriptions of the common features will not be repeated with respect to the systems 300 and 400, but instead, the descriptions provided above with respect to the system 200 are intended to provide descriptions of the features in the systems 300 and 400.
  • event cameras may output only the changes in the input video stream in a binary format. As such, for a person who is moving slightly in front of an event camera, the event camera may output the edges of the user's face as there may not be much variation on smooth surfaces of the face.
  • the processor 102 may use an event simulator that may produce images that may be similar to the event camera’s 204 output. In order to simulate such images for training, the processor 102 may use an event simulator on normal face videos as discussed in greater detail herein below.
  • the processor 102 may execute an event integrator 302 that may integrate/concatenate events in the event stream 202 over a set time period, e.g., every T-seconds, which may be a user-defined parameter.
  • the execution of the event integrator 302 may result in binary image frames 304 that may look like an edge detector’s output.
  • the event integrator 302 may be a module, e.g., a set of instructions, a hardware component, or the like, that the processor 102 may execute.
  • each of the events in the event stream 202 may have an “x, y” value that may specify a location (e.g., a location of the pixel that generated the event), a “t” value that may specify a time instant at which the event was generated, and a “p” value that may specify the polarity of the event.
  • the event capture rate may be high, e.g., in the range of about 1 million events/second. As a result, the event timing precision may be as low as about 1 microsecond.
  • the processor 102 may apply a neural network on the frames 304 generated through execution of the event integrator 302 to generate feature vectors of images included in the frames 304.
  • the processor 102 may apply a machine learning operation on the frames 304 to generate feature vectors that may represent biometric features of the user 206 as identified in the images of the user 206 in the frames 304.
  • the frames 304 may be a video stream that may include a frame every T-seconds and each binary frame may include facial edge information.
  • the processor 102 may apply the neural network on the frames 304 to classify each of the binary frames containing the facial edge information into a valid/invalid face verification.
  • the processor 102 may apply an artificial neural network, a spiking neural network classifier, or the like, on the frames 304, which may relatively efficiently process sparse data in the binary image frames 304.
  • the compute efficiency of the spiking neural network classifier (neural network 306) combined with the use of the event camera 204 may result in significant system 300 power efficiency.
  • the processor 102 may also identify feature vectors of the reference image 212.
  • the feature vectors of the reference image 212 may have previously been determined and may have been stored, for instance, in the data store 214.
  • the processor 102 may identify the feature vectors through access of the feature vectors from the data store 214.
  • the processor 102 may apply a neural network, such as an artificial neural network, a spiking neural network, or the like, on the reference image 212 to generate the feature vectors of the reference image 212.
  • the output of the application of the neural network 306 on the frames 304 may be framewise probabilities 308.
  • the neural network may predict a probability value, e.g., “p”, that may range from 0 to 1.
  • This value “p” may represent the probability of the frame 304 belonging to the reference image 212, which may be a true label.
  • higher p values may be an indication that the frames 304 include images that match the reference image 212 and the lower p values may be an indication that the frames 304 do not include images that match the reference image 212.
  • the probability value ‘p’ is the probability that the frames 304 and reference image 212 belong to the same user 206.
  • the processor 102 may input the feature vectors of the images in the frames 304 and the feature vectors of the reference image 212 into a decision maker 310.
  • the decision maker 310 may be a module, e.g., a set of instructions, that the processor 102 may execute to determine whether the feature vectors of the images in the frames 304 match the feature vectors of the reference image 212.
  • the decision maker 310 may be a memory-based decision maker in that the decision maker 310 may have a memory aspect to it, such as a simple recurrent network. That is, the processor 102 may look at the probability output value of a present frame along with the values from a few previous frames and may make a binary decision for the present frame. A minimal sketch of such a memory-based decision appears after this list.
  • the processor 102 may apply a Siamese neural network to determine whether the facial image of the user 206 is valid or invalid.
  • a Siamese neural network, or equivalently a twin neural network, may be a network that may have two artificial neural network models with the same weights that may act as feature extractors.
  • the processor 102 may pass the facial image of the user 206 included in the frames 304 and the facial image of the reference image 212 through the two artificial neural networks.
  • the processor 102 may pass the facial image of the user 206 included in the frames 304 through a first artificial neural network and the facial image included in the reference image 212 through a second artificial neural network, in which the processor 102 may apply the same weights on both of the artificial neural networks.
  • the outputs of the final layers of the artificial neural networks may be the extracted feature vectors of the facial image of the user 206 included in the frames 304 and the feature vectors of the facial image in the reference image 212.
  • the processor 102 may determine a similarity metric between the feature vectors of the facial image of the user 206 included in the frames 304 and the feature vectors of the facial image in the reference image 212. For instance, the processor 102 may use a distance metric, such as a Euclidean distance, a Manhattan distance, and/or the like, to measure the similarity between the feature vectors.
  • the processor 102 may also determine whether the similarity metric exceeds a similarity threshold, which may be user-defined. Based on a determination that the similarity metric exceeds the similarity threshold, the processor 102 may determine that both of the facial images in the frames 304 and the reference image 212 belong to the same person. In this case, the processor 102 may determine that the user 206 is authentic and may grant access to the apparatus 100 and/or an application. However, based on a determination that the similarity metric falls below the similarity threshold, the processor 102 may determine that the facial images in the frames 304 and the reference image 212 belong to different people. In this case, the processor 102 may determine that the user 206 is inauthentic and may reject access by the user 206 to the apparatus 100 and/or the application. A sketch of this twin-network comparison and distance check appears after this list.
  • the apparatus 100 may include an infrared (IR) light source 402 that may output pulses 404 of IR illumination onto the user 206.
  • the processor 102 may cause the IR light source 402 to output the IR pulses 404 following the detection of motion by the event camera 204.
  • the event camera 204 may detect changes in an illumination on the user 206 caused by the pulses 404 of IR illumination by the IR light source 402. In response to the detected changes, the event camera 204 may generate portions of the event stream 202 corresponding to detected changes in the illumination of the user 206.
  • the event camera 204 may generate the events to be included in the event stream 202 when the pixels in the event camera 204 detect changes in brightness levels. In some instances, such as when the user 206 and the event camera 204 remain mainly stationary with respect to each other, the detected brightness levels may not change. As a result, the pixels in the event camera 204 may not generate sufficient events for the features of the user 206 to be identified. However, the output of the IR pulses 404 onto the user 206 may cause changes in the brightness levels that the pixels in the event camera 204 may detect. The event camera 204 may thus generate the event stream 202 with sufficient events for the processor 102 to identify features of the user 206 even in instances in which there may otherwise not be changes in brightness levels of the user 206.
  • the IR light source 402 may output the IR pulses 404 in a particular pattern 406.
  • the particular pattern 406 may be a random or a pseudorandom pattern in some examples.
  • the IR pulses 404 may be output onto the user 206 to enable the event camera 204 to detect the user 206.
  • the particular pattern 406 may correspond to a predefined IR pattern 408.
  • the predefined IR pattern 408 may be stored in the data store 214 and the processor 102 may access the predefined IR pattern 408 from the data store 214.
  • the processor 102 may also communicate the predefined IR pattern 408 for use by the IR light source 402.
  • the IR light source 402 may include a dedicated data storage into which the predefined IR pattern 408 may be stored.
  • the processor 102 may communicate the predefined IR pattern 408 during bootup of the apparatus 100.
  • the predefined IR pattern 408 may change over time, and the processor 102 may communicate the different predefined IR patterns 408 to, for instance, enhance security of the apparatus 100.
  • the predefined IR pattern 408 may be a secret or otherwise secure pattern and the processor 102 may implement the predefined IR pattern 408 to add a layer of security to the authentication of the user 206. That is, for instance, the processor 102 may determine the pattern at which the IR pulses 404 have been outputted onto the user 206 from the event stream 202 and may determine whether the determined pattern matches the predefined IR pattern 408.
  • based on a determination that the determined pattern does not match the predefined IR pattern 408, the processor 102 may determine that the event stream 202 may not be from the event camera 204. Instead, the processor 102 may determine that the event stream 202 may be from another source, which may be a malicious source. For instance, a hacker may bypass the event camera 204 and may inject a user’s video stream from another source to the processor 102, which may cause the processor 102 to incorrectly determine that the user 206 is authentic. However, because the hacker may not know the predefined IR pattern 408, the hacker may not be able to cause the injected video stream to have the predefined IR pattern 408. As a result, the processor 102 may prevent the user 206 from being authenticated based on the injected video stream. In addition, use of the predefined IR pattern 408 may prevent the event camera 204 from being used on another apparatus.
  • the predefined IR pattern 408 may be defined in any of a number of suitable manners.
  • the predefined IR pattern 408 may be user- defined, e.g., a user may define the predefined IR pattern 408 through a byte stream and a duty cycle value.
  • randomness collected during a computing operation may be used as an initializer for the predefined IR pattern 408.
  • the predefined IR pattern 408 may be defined through hardware-based and/or software-based techniques.
  • Examples of such techniques may include: using the least significant bits of a sound card or a video card output, using CPU hardware entropy generation, using a random number from a server (in which the event camera 204, at the time of initialization or at periodic intervals, may request a random number from a centralized server), and/or the like.
  • the event camera 204 may send the random number to the server at the boot-up time or initialization of the apparatus 100.
  • the processor 102 may determine an IR pattern in the event stream 202, in which the IR pattern in the event stream 202 may correspond to a pattern at which the IR light source 402 outputted the pulses 404 of IR illumination.
  • the pattern from the event camera 204 may include a combination of the user 206 image and the event stream 202 resulting from the IR pulses 404.
  • the processor 102 may apply an IR pattern detector on the event stream 202 to determine the IR pattern. That is, the processor 102 may apply a detector on the event stream 202 to detect the IR pattern and a face recognition algorithm on the event stream 202 to detect the features of the user's 206 face.
  • the IR pattern detector may be a hardware component and/or a set of instructions that the processor 102 may execute.
  • the processor 102 may use the event stream 202 for two separate tasks, IR pattern detection as well as face recognition.
  • the IR pattern detector and the face recognition may be part of the same machine learning algorithm/network or may be parts of separate machine learning algorithms/networks.
  • the processor 102 may also determine whether the IR pattern in the event stream 202 matches the predefined IR pattern 408. In addition, the processor 102 may, based on a determination that the IR pattern in the event stream 202 does not match the predefined IR pattern 408, deny authentication of the user 206. The processor 102 may also deny authentication of the user 206 based on a determination that the identified features of the user 206 do not match the features of the reference image 212. However, the processor 102 may determine that authentication of the user 206 may proceed based on a determination that the IR pattern in the event stream 202 matches the predefined IR pattern 408. A sketch of this pattern check appears after this list.
  • in FIGS. 5-7, there are respectively shown flow diagrams of example methods 500-700 for authenticating a user 206 based on a facial image in an event stream 202 received from an event camera 204.
  • it should be understood that the methods 500-700 depicted in FIGS. 5-7 may include additional operations and that some of the operations described therein may be removed and/or modified without departing from the scopes of the methods 500-700.
  • the descriptions of the methods 500-700 are also made with reference to the features depicted in FIGS. 1-4 for purposes of illustration.
  • the processor 102 depicted in FIGS. 1 and 2 may execute some or all of the operations included in the methods 500-700.
  • the processor 102 may receive an event stream 202 from an event camera 204.
  • the event stream 202 may correspond to individually captured spikes of an image of a user 206 by pixels in the event camera 204, in which the event camera 204 may generate a portion of the event stream 202 corresponding to a detection of motion of the user 206.
  • the processor 102 may apply a neural network on the event stream 202 to identify features, e.g., biometric features, a feature vector of the biometric features, and/or of a facial image in the event stream 202. That is, for instance, the processor 102 may apply a facial or image recognition operation 210 on the event stream 202 to identify the features of the facial image in the event stream 202.
  • the processor 102 may determine whether the identified features of the facial image match features of a reference image 212.
  • the processor 102 may determine whether the user 206 is authentic based on a determination that the identified features of the facial image in the event stream 202 match features of the reference image.
  • the processor 102 may permit access to the apparatus 100 or an application based on the determination that the user 206 is authentic.
  • the processor 102 may communicate a predefined infrared (IR) pattern 408 for use by an IR light source 402.
  • the IR light source 402 may output pulses 404 of IR illumination at the predefined IR pattern 408, and the event camera 204 may detect changes in an illumination on the user 206 caused by the pulses 404 of IR illumination by the IR light source 402 and may generate a portion of the event stream 202 corresponding to detected changes in the illumination of the user 206.
  • the portion of the event stream 202 may be another portion of the event stream 202 that may follow an earlier portion of the event stream 202, e.g., an initial event in the event stream 202.
  • the processor 102 may receive an event stream 202 from an event camera 204 as discussed herein.
  • the processor 102 may determine an IR pattern in the event stream 202, in which the IR pattern may correspond to a pattern at which the IR light source 402 outputted the pulses 404 of IR illumination.
  • the processor 102 may determine whether the determined IR pattern in the event stream 202 matches the predefined IR pattern 408. Based on a determination that the determined IR pattern does not match the predefined IR pattern 408, at block 610, the processor 102 may reject access by the user 206 to the apparatus 100 or an application.
  • the processor 102 may, at block 612, determine whether the user 206 is authentic. In addition, at block 614, based on a determination that the user 206 is authentic, the processor 102 may permit access to an apparatus 100 or an application. However, based on a determination at block 612 that the user 206 is inauthentic, the processor 102 may reject access to the apparatus 100 or the application at block 610.
  • the processor 102 may receive an event stream 202 from an event camera 204.
  • the processor 102 may apply a neural network 306 on the event stream 202 to generate feature vectors of a facial image in the event stream 202.
  • the processor 102 may apply an event integrator 302 on the event stream 202 to generate frames 304, and may apply the neural network 306 on the frames 304.
  • the processor 102 may apply a Siamese neural network on the feature vectors of the facial image in the event stream 202 and feature vectors of a facial image in the reference image 212.
  • the processor 102 may determine a similarity metric between the feature vectors of the facial image in the event stream 202 and the feature vectors of the facial image in the reference image 212. In addition, at block 710, the processor 102 may determine whether the similarity metric exceeds a similarity threshold, which may be user-defined.
  • the processor 102 may, at block 712, reject access by the user 206 to the apparatus 100 and/or the application. However, based on a determination that the similarity metric exceeds the similarity threshold, the processor 102 may, at block 714, determine whether the user 206 is authentic. In addition, at block 716, the processor 102 may permit access to the apparatus 100 and/or the application based on a determination that the user 206 is authentic. However, based on a determination at block 714 that the user 206 is inauthentic, the processor 102 may reject access to the apparatus 100 or the application at block 712.
  • Some or all of the operations set forth in the methods 500-700 may be contained as utilities, programs, or subprograms, in any desired computer accessible medium.
  • the methods 500-700 may be embodied by computer programs, which may exist in a variety of forms.
  • the methods 500-700 may exist as machine-readable instructions, including source code, object code, executable code or other formats. Any of the above may be embodied on a non-transitory computer-readable storage medium.
  • non-transitory computer-readable storage media include computer system RAM, ROM, EPROM, EEPROM, and magnetic or optical disks or tapes. It is therefore to be understood that any electronic device capable of executing the above-described functions may perform those functions enumerated above.
  • in FIG. 8, there is shown a block diagram of an example computer-readable medium 800 that may have stored thereon computer-readable instructions for authenticating a user 206 based on a facial image in an event stream 202 received from an event camera 204.
  • the example computer-readable medium 800 depicted in FIG. 8 may include additional instructions and that some of the instructions described herein may be removed and/or modified without departing from the scope of the computer-readable medium 800 disclosed herein.
  • the computer-readable medium 800 may be a non-transitory computer-readable medium, in which the term “non-transitory” does not encompass transitory propagating signals.
  • the computer-readable medium 800 may have stored thereon machine-readable instructions 802-810 that a processor, such as the processor 102 depicted in FIGS. 1 and 2, may execute.
  • the computer-readable medium 800 may be an electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions.
  • the computer-readable medium 800 may be, for example, Random Access memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like.
  • the processor may fetch, decode, and execute the instructions 802 to receive an event stream 202 from an event camera 204.
  • the event stream 202 may correspond to individually captured spikes of an image of a user by pixels in the event camera 204, in which the event camera 204 may generate a portion of the event stream 202 corresponding to a detection of motion of the user 206.
  • the processor may fetch, decode, and execute the instructions 804 to apply a neural network 306 on the event stream 202 to identify features of a facial image in the event stream 202.
  • the processor may fetch, decode, and execute the instructions 806 to determine whether the identified features of the facial image match features of a facial image in a reference image 212.
  • the processor may apply a Siamese network on the features, e.g., feature vectors, of the facial image in the event stream 202 and features, e.g., feature vectors, of a facial image in the reference image 212.
  • the processor may fetch, decode, and execute the instructions 808 to determine whether the user 206 is authentic based on a determination that the identified features of the facial image match features of the reference image 212.
  • the processor may fetch, decode, and execute the instructions 810 to permit or reject access to an apparatus 100 and/or an application based on the determination as to whether the user 206 is authentic.
  • the processor may fetch, decode, and execute instructions to communicate a predefined infrared (IR) pattern 408 for use by an IR light source 402.
  • the IR light source 402 may output pulses 404 of IR illumination at the predefined IR pattern 408, and the event camera 204 may detect changes in an illumination on the user 206 caused by the pulses 404 of IR illumination by the IR light source 402 and may generate another portion of the event stream 202 corresponding to detected changes in the illumination of the user 206.
  • the processor may fetch, decode, and execute instructions to determine an IR pattern in the event stream 202, in which the IR pattern may correspond to a pattern at which the IR light source 402 outputted the pulses 404 of IR illumination as discussed herein.
  • the processor may fetch, decode, and execute instructions to determine whether the determined IR pattern in the event stream 202 matches the predefined IR pattern 408. In addition, the processor may fetch, decode, and execute instructions to, based on a determination that the determined IR pattern does not match the predefined IR pattern, reject authentication of the user. However, based on a determination that the determined IR pattern matches the predefined IR pattern, the processor may grant authentication of the user.
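The following is a minimal sketch of the memory-based decision described above for the framewise probabilities 308. The patent mentions, for example, a simple recurrent network; this stand-in simply averages the per-frame probabilities over a short history before thresholding, and the history length, threshold value, and class name are illustrative assumptions rather than anything specified in the application.

```python
from collections import deque

class MemoryDecisionMaker:
    """Smooth per-frame match probabilities over a short history and make a
    binary authentic/not-authentic call for the present frame."""

    def __init__(self, history: int = 5, threshold: float = 0.8):
        self.probs = deque(maxlen=history)
        self.threshold = threshold

    def decide(self, frame_probability: float) -> bool:
        self.probs.append(frame_probability)
        smoothed = sum(self.probs) / len(self.probs)
        return smoothed >= self.threshold

# Example: decisions reflect recent history, not just the present frame.
decision = MemoryDecisionMaker()
print([decision.decide(p) for p in (0.95, 0.4, 0.9, 0.92, 0.97)])
```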
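Next, a minimal sketch of the twin-network comparison and distance check described above. The shared-weight feature extractor is reduced to a single linear map purely for brevity, and the function names and distance threshold are illustrative assumptions; the application frames the test as a similarity metric exceeding a threshold, which with a distance metric corresponds to the distance falling below a threshold.

```python
import numpy as np

def extract_features(image: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Stand-in for one branch of the twin network: both branches apply the
    same `weights`, here a single linear map for brevity."""
    return weights @ image.ravel()

def same_person(event_face: np.ndarray, reference_face: np.ndarray,
                weights: np.ndarray, distance_threshold: float = 1.0) -> bool:
    """Extract features from both facial images with shared weights and compare
    them with a Euclidean distance; a small distance means high similarity."""
    f_event = extract_features(event_face, weights)
    f_reference = extract_features(reference_face, weights)
    return float(np.linalg.norm(f_event - f_reference)) <= distance_threshold
```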
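Finally, a sketch of how the predefined IR pattern 408 might be derived and checked. The application leaves the pattern source open (a user-supplied byte stream and duty cycle, hardware entropy, or a server-provided random number); the hash-based expansion, the bit-error tolerance, and the function names here are illustrative assumptions, and the duty-cycle aspect of the pulses is omitted.

```python
import hashlib
import secrets

def make_ir_pattern(seed: bytes, length: int = 64) -> list:
    """Derive a pseudorandom on/off pulse pattern from a secret seed, e.g.
    entropy gathered at boot-up or a random number obtained from a server."""
    bits, digest = [], hashlib.sha256(seed).digest()
    while len(bits) < length:
        for byte in digest:
            bits.extend((byte >> i) & 1 for i in range(8))
        digest = hashlib.sha256(digest).digest()
    return bits[:length]

def verify_ir_pattern(detected: list, expected: list, max_bit_errors: int = 2) -> bool:
    """Accept the event stream only if the pulse pattern recovered from it
    matches the expected pattern, tolerating a few detection errors."""
    if len(detected) != len(expected):
        return False
    return sum(d != e for d, e in zip(detected, expected)) <= max_bit_errors

# An injected stream that does not carry the secret pattern is rejected.
expected = make_ir_pattern(secrets.token_bytes(16))
print(verify_ir_pattern(expected, expected))                   # True
print(verify_ir_pattern([1 - b for b in expected], expected))  # False
```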

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Collating Specific Patterns (AREA)

Abstract

According to examples, an apparatus may include a memory on which are stored instructions that, when executed by a processor, cause the processor to receive an event stream from an event camera, in which the event stream may correspond to a user in a field of view of the event camera. The processor may apply an image recognition operation on the event stream to identify features of a facial image of the user in the event stream and may determine whether the identified features of the facial image match features of a reference image. The processor may also authenticate the user based on the determination as to whether the identified features of the facial image match the features of the reference image.

Description

USER AUTHENTICATION USING EVENT CAMERAS
BACKGROUND
[0001] Computing devices may often include security features that may prevent unauthorized access to the computing devices. Some of the security features may employ facial recognition for identification verification. In these types of security features, an image of a user’s face may be captured and characteristics of the user’s face may be analyzed against reference facial image data. Access to a computing device may be permitted when there is a sufficient match between the facial characteristics and the reference facial image data and may be denied when a sufficient match does not exist.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Features of the present disclosure are illustrated by way of example and not limited in the following figure(s), in which like numerals indicate like elements, in which:
[0003] FIG. 1 shows a block diagram of an example apparatus that may authenticate a user based on an event stream received from an event camera;
[0004] FIG. 2 shows a block diagram of an example system in which the apparatus depicted in FIG. 1 may be implemented;
[0005] FIGS. 3 and 4, respectively, show block diagrams of example systems, in which a processor may authenticate a user based on an event stream received from an event camera;
[0006] FIGS. 5-7, respectively, show flow diagrams of example methods for authenticating a user based on a facial image in an event stream received from an event camera; and
[0007] FIG. 8 shows a block diagram of an example computer-readable medium that may have stored thereon computer-readable instructions for authenticating a user based on a facial image in an event stream received from an event camera.
DETAILED DESCRIPTION
[0008] For simplicity and illustrative purposes, the present disclosure is described by referring mainly to examples. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be readily apparent, however, that the present disclosure may be practiced without limitation to these specific details. In other instances, some methods and structures have not been described in detail so as not to unnecessarily obscure the present disclosure.
[0009] Throughout the present disclosure, the terms “a” and “an” are intended to denote at least one of a particular element. As used herein, the term “includes” means includes but not limited to, the term “including” means including but not limited to. The term “based on” means based at least in part on.
[0010] Disclosed herein are apparatuses, methods, and computer-readable mediums, in which a processor may authenticate a user based on a facial image in an event stream received from an event camera. That is, instead of a camera that may include a shutter to capture images of a user, such that the images of the user may be used to authenticate the user, the apparatuses disclosed herein may receive an event stream from an event camera. As discussed herein, an event camera may include an imaging sensor that may respond to local changes in brightness. That is, the event camera may be a camera that may include pixels that may each operate independently and asynchronously with respect to each other and may report changes in brightness as they are detected.
[0011] Generally speaking, event cameras may be efficient in energy and processing resource usage and may have a higher dynamic range and sampling rate than cameras that use shutters. In this regard, event cameras may be beneficial over cameras that use shutters. However, an issue with the use of event cameras is that they may only detect changes in brightness, which may be caused by motion, and thus, if a user is stationary after a first movement, the event cameras may not detect the user's face. Through implementation of the features disclosed herein, a processor may implement various techniques to enable the use of event cameras in the detection of a user’s face. In addition, the processor may identify features of the user’s face and may use the identified features in determining whether the user is authentic.
[0012] In some examples, instead of collecting real world data from the event camera, the processor may use an event simulator that may produce images that may be similar to the event camera’s output. In order to simulate such images for training, the processor may use an event simulator on normal face videos as discussed in greater detail herein below. In addition, or in other examples, an IR light source may output pulses of IR illumination onto a user such that the event camera may generate events responsive to the changes in illumination. The IR light source may output the pulses according to a predefined pattern, in which the predefined pattern may be a secret or otherwise secure pattern, which the processor may use to authenticate an event stream received from the event camera.
[0013] Through implementation of the features of the present disclosure, an event camera may be implemented for facial recognition and authentication purposes. As a result, the amount of processing and energy resources associated with the facial recognition and authentication purposes may be reduced and/or minimized as compared with use of other types of cameras.
[0014] Reference is first made to FIGS. 1 and 2. FIG. 1 shows a block diagram of an example apparatus 100 that may authenticate a user based on a facial image in an event stream received from an event camera. FIG. 2 shows a block diagram of an example system 200 in which the apparatus 100 depicted in FIG. 1 may be implemented. It should be understood that the example apparatus 100 depicted in FIG. 1 and/or the example system 200 depicted in FIG. 2 may include additional features and that some of the features described herein may be removed and/or modified without departing from the scopes of the apparatus 100 and/or the system 200.
[0015] The apparatus 100 may be a computing system such as a laptop computer, a tablet computer, a desktop computer, a smartphone, or the like. As shown, the apparatus 100 may include a processor 102, which may be a semiconductor-based microprocessor, a central processing unit (CPU), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and/or other suitable hardware device. The apparatus 100 may also include a memory 110 that may have stored thereon machine-readable instructions (which may equivalently be termed computer-readable instructions) that the processor 102 may execute. The memory 110 may be an electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. The memory 110 may be, for example, Random-Access memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like. The memory 110, which may also be referred to as a computer-readable storage medium, may be a non-transitory machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals.
[0016] As shown in FIG. 1, the memory 110 may have stored thereon machine-readable instructions 112-118 that the processor 102 may execute. Although the instructions 112-118 are described herein as being stored on the memory 110 and may thus include a set of machine-readable instructions, the apparatus 100 may include hardware logic blocks that may perform functions similar to the instructions 112-118. For instance, the processor 102 may include hardware components that may execute the instructions 112-118. In other examples, the apparatus 100 may include a combination of instructions and hardware logic blocks to implement or execute functions corresponding to the instructions 112-118. In any of these examples, the processor 102 may implement the hardware logic blocks and/or execute the instructions 112-118. As discussed herein, the apparatus 100 may also include additional instructions and/or hardware logic blocks such that the processor 102 may execute operations in addition to or in place of those discussed above with respect to FIG. 1.
[0017] With reference to FIGS. 1 and 2, the processor 102 may execute the instructions 112 to receive an event stream 202 from an event camera 204, in which the event stream 202 may correspond to a user 206 in a field of view of the event camera 204. The event camera 204 may capture images 208 of the user 206 and may generate a plurality of events corresponding to the captured images 208. The event camera 204 may also generate and send the event stream 202 to the apparatus 100. In some examples, and as shown in FIG. 2, the event camera 204 may be integrated with the apparatus 100. For instance, the event camera 204 may be mounted within and/or on a housing of the apparatus 100. In other examples, the event camera 204 may be separate from the apparatus 100. For instance, the event camera 204 may be an external camera, such as an external webcam, a security camera, or the like, that may be connected to the apparatus 100 through a wired or a wireless connection.
[0018] The event camera 204, which may equivalently be termed an event-based camera, a dynamic vision sensor camera, a neuromorphic camera, and/or the like, may include an imaging sensor that may respond to local changes in brightness. The event camera 204 may be a camera that may include pixels that may each operate independently and asynchronously with respect to each other and may report changes in brightness as they are detected. That is, the pixels in the event camera 204 may compare brightness levels to a reference brightness level and may generate an event when a difference between a detected brightness and the reference brightness level exceeds a preset threshold. Thus, for instance, the event camera 204 may generate an event when there is a change in a scene, e.g., motion within the field of view of the event camera 204, and may not generate an event when there is no change in a scene, as may occur when the user 206 within the field of view of the event camera 204 is stationary. The event camera 204 may therefore be distinguished from other types of cameras that may capture images using shutters.
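As a rough illustration of the per-pixel behaviour described in this paragraph, the sketch below (a simplification, not the application's implementation) emits an event whenever a pixel's brightness differs from its stored reference level by more than a preset threshold; the array names and the threshold value are illustrative assumptions.

```python
import numpy as np

def generate_events(curr_frame, references, threshold=0.15, timestamp=0.0):
    """Emit (x, y, t, polarity) events for pixels whose brightness differs from
    a per-pixel reference level by more than `threshold`; unchanged pixels,
    as with a stationary scene, produce no events."""
    events = []
    diff = curr_frame - references
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    for x, y in zip(xs, ys):
        polarity = 1 if diff[y, x] > 0 else -1     # brighter = +1, darker = -1
        events.append((int(x), int(y), timestamp, polarity))
        references[y, x] = curr_frame[y, x]        # reset the reference after firing
    return events
```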
[0019] According to examples, the pixels of the event camera 204 may capture events corresponding to an image 208 of the user 206 when the user 206 positions the user’s face in front of the event camera 204. The user 206 may position their face in front of the event camera 204 when the user 206 seeks to be authenticated to access the apparatus 100 and/or an application executing on the apparatus 100. Thus, for instance, the event camera 204 may be activated, e.g., powered on, when the apparatus 100 is activated from a power off state or a standby state. In addition, the event camera 204 may be activated when an application that may use facial recognition-based authentication is opened or launched on the apparatus 100. In any regard, the event camera 204 may communicate the events in the event stream 202 to the apparatus 100.
[0020] The processor 102 may execute the instructions 114 to apply an image recognition operation 210 on the event stream 202 to identify features of the user 206. In one regard, the processor 102 may identify biometric features of the user from the event stream 202, e.g., a facial image in the event stream 202. That is, the processor 102 may identify features of the user 206, such as patterns, based on the user’s facial textures and shape. By way of example, the processor 102 may apply the image recognition operation 210 to analyze the relative positions, sizes, and/or shapes of the user’s 206 eyes, nose, cheekbones, jaw, and/or the like.
[0021] The processor 102 may execute the instructions 116 to determine whether the identified features of the user 206 match features of a reference image 212. The reference image 212 may be an image of the user 206 or of another user that may have previously been captured and registered with the apparatus 100. In addition, the reference image 212 may be an image of a user who is authorized to access the apparatus 100, to access a particular application, and/or the like. The features of the reference image 212 may be similar to the types of features that were identified from the event stream 202. In some examples, the features of the reference image 212 may be feature vectors that may be determined through application of a neural network on a facial image in the reference image 212.
[0022] In any regard, the reference image 212 and/or the features of the reference image 212 may be stored in a data store 214 and the processor 102 may access the reference image 212 and/or the features of the reference image 212 from the data store 214. The data store 214 may be an electronic, magnetic, optical, or other physical storage device. For example, the data store 214 may be a Random-Access memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like. In some examples, the data store 214 and the memory 110 may be the same component, while in other examples, the data store 214 may be a separate component from the memory 110.
[0023] The processor 102 may execute the instructions 118 to authenticate or reject 216 the user 206 based on a determination as to whether the identified features of the user 206 match the features of the reference image 212. That is, the processor 102 may determine that the user 206 is authentic based on the identified features of the user 206 matching the features of the reference image 212. Particularly, the processor 102 may determine that the user 206 is authentic when the identified features of the user 206 exactly match the features of the reference image 212. In other examples, the processor 102 may determine that the user 206 is authentic when a difference between the identified features of the user 206 and the features of the reference image 212 is below a predefined threshold. The predefined threshold may be defined based on the level of security to be applied. That is, the predefined threshold may be lower for higher levels of security.
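A minimal sketch of the threshold-based match decision described in this paragraph follows; the security levels, threshold values, and function name are illustrative assumptions, and the identified and reference features are assumed to be numeric feature vectors.

```python
import numpy as np

# Hypothetical mapping from security level to feature-difference threshold;
# higher security tolerates a smaller difference.
THRESHOLDS = {"low": 0.8, "medium": 0.5, "high": 0.2}

def features_match(identified: np.ndarray, reference: np.ndarray,
                   security: str = "medium") -> bool:
    """Authentic when the difference between the identified features and the
    reference features falls below the threshold for the security level."""
    difference = float(np.linalg.norm(identified - reference))
    return difference < THRESHOLDS[security]
```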
[0024] The processor 102 may determine that the user 206 is inauthentic when the identified features of the user 206 do not exactly match the features of the reference image 212. In other examples, the processor 102 may determine that the user 206 is inauthentic when a difference between the identified features of the user 206 and the features of the reference image 212 exceeds the predefined threshold.
[0025] Based on a determination that the user 206 is authentic, the processor 102 may allow authenticated access to the apparatus 100 or an application executing on the apparatus 100. That is, the processor 102 may unlock the apparatus 100, may allow secure access to the application, etc. However, based on a determination that the user 206 is inauthentic, the processor 102 may prevent authenticated access to the apparatus 100 or the application. That is, the processor 102 may keep the apparatus 100 locked, may not enable access to features of the application, and/or the like.
[0026] Reference is now made to FIGS. 3 and 4, which respectively show block diagrams of example systems 300 and 400, in which a processor 102 may authenticate a user 206 based on a facial image in an event stream 202 received from an event camera 204. It should be understood that the example systems 300 and 400 depicted in FIGS. 3 and 4 may include additional features and that some of the features described herein may be removed and/or modified without departing from the scopes of the systems 300 and 400.
[0027] As shown in FIGS. 3 and 4, the systems 300 and 400 may include the apparatus 100 with which a user 206 may interact. The apparatus 100 may also include features similar to those discussed herein with respect to the system 200 depicted in FIG. 2. Descriptions of the common features will not be repeated with respect to the systems 300 and 400, but instead, the descriptions provided above with respect to the system 200 are intended to provide descriptions of the features in the systems 300 and 400.
[0028] As discussed herein, event cameras may output only the changes in the input video stream in a binary format. As such, for a person who is moving slightly in front of an event camera, the event camera may output mainly the edges of the user's face, as there may not be much variation on the smooth surfaces of the face. Instead of collecting real-world training data from the event camera 204, the processor 102 may use an event simulator that may produce images similar to the event camera’s 204 output. That is, the processor 102 may apply the event simulator to normal face videos to simulate such images for training, as discussed in greater detail herein below.
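One possible form of such an event simulator is sketched below, assuming grayscale video frames and a log-intensity contrast threshold; the threshold value and function name are illustrative assumptions rather than the simulator used in the disclosure:

```python
import numpy as np

def simulate_events(prev_frame: np.ndarray,
                    curr_frame: np.ndarray,
                    contrast_threshold: float = 0.2) -> np.ndarray:
    """Approximate event-camera output from two consecutive grayscale frames.

    Returns a map with +1 where the log intensity rose by more than the
    threshold, -1 where it fell by more than the threshold, and 0 elsewhere,
    which resembles an edge map for a slightly moving face.
    """
    eps = 1e-6  # avoid log(0)
    delta = (np.log(curr_frame.astype(np.float32) + eps)
             - np.log(prev_frame.astype(np.float32) + eps))
    events = np.zeros_like(delta, dtype=np.int8)
    events[delta > contrast_threshold] = 1
    events[delta < -contrast_threshold] = -1
    return events
```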
[0029] As shown in FIG. 3, the processor 102 may execute an event integrator 302 that may integrate/concatenate events in the event stream 202 over a set time period, e.g., every T-seconds, which may be a user-defined parameter. The execution of the event integrator 302 may result in binary image frames 304 that may look like an edge detector’s output. In any regard, the event integrator 302 may be a module, e.g., a set of instructions, a hardware component, or the like, that the processor 102 may execute.
[0030] According to examples, each of the events in the event stream 202 may have an “x, y” value that may specify a location (e.g., a location of the pixel that generated the event), a “t” value that may specify a time instant at which the event was generated, and a “p” value that may specify the polarity of the event. In some instances, the event capture rate may be high, e.g., in the range of about 1 million events/second. As a result, the event timing precision may be as low as about 1 microsecond. In addition, an Address Event Representation (AER) data stream, e.g., the event stream 202, may be fed into the event integrator 302.
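A minimal sketch of the event integrator 302 is shown below, assuming the AER events arrive as (x, y, t, p) rows of a NumPy array with t in seconds; the array layout and parameter names are assumptions made for illustration:

```python
import numpy as np

def integrate_events(events: np.ndarray,
                     height: int,
                     width: int,
                     t_window: float) -> list:
    """Concatenate AER events into binary frames, one frame per t_window seconds.

    `events` is an (N, 4) array of (x, y, t, p) rows. A pixel is set to 1 if it
    produced at least one event in the window, which yields edge-like frames.
    """
    frames = []
    if len(events) == 0:
        return frames
    t_start, t_end = events[:, 2].min(), events[:, 2].max()
    while t_start <= t_end:
        in_window = (events[:, 2] >= t_start) & (events[:, 2] < t_start + t_window)
        frame = np.zeros((height, width), dtype=np.uint8)
        xs = events[in_window, 0].astype(int)
        ys = events[in_window, 1].astype(int)
        frame[ys, xs] = 1  # mark pixels that generated events in this window
        frames.append(frame)
        t_start += t_window
    return frames
```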
[0031] In addition, to apply the image recognition operation 210 (FIG. 2), the processor 102 may apply a neural network on the frames 304 generated through execution of the event integrator 302 to generate feature vectors of images included in the frames 304. In other words, the processor 102 may apply a machine learning operation on the frames 304 to generate feature vectors that may represent biometric features of the user 206 as identified in the images of the user 206 in the frames 304. The frames 304 may form a video stream that may include a frame every T-seconds, and each binary frame may include facial edge information. The processor 102 may apply the neural network on the frames 304 to classify each of the binary frames containing the facial edge information as a valid or an invalid face verification.
[0032] By way of example, the processor 102 may apply an artificial neural network, a spiking neural network classifier, or the like (neural network 306), on the frames 304, which may process the sparse data in the binary image frames 304 relatively efficiently. In addition, the compute efficiency of the spiking neural network classifier combined with the use of the event camera 204 may result in significant power efficiency for the system 300.
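A minimal convolutional classifier over the binary frames 304 is sketched below as a stand-in for the neural network 306; a spiking neural network could be substituted, and the layer sizes are illustrative assumptions:

```python
import torch
import torch.nn as nn

class BinaryFrameClassifier(nn.Module):
    """Maps a single-channel binary edge frame to a probability p in [0, 1]."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        # frame: (batch, 1, height, width) binary tensor
        x = self.features(frame.float()).flatten(1)
        return torch.sigmoid(self.head(x)).squeeze(1)  # framewise probability p
```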
[0033] The processor 102 may also identify feature vectors of the reference image 212. In some examples, the feature vectors of the reference image 212 may have previously been determined and may have been stored, for instance, in the data store 214. In these examples, the processor 102 may identify the feature vectors by accessing the feature vectors from the data store 214. In other examples, the processor 102 may apply a neural network, such as an artificial neural network, a spiking neural network, or the like, on the reference image 212 to generate the feature vectors of the reference image 212. The output of the application of the neural network 306 on the frames 304 may be framewise probabilities 308. That is, for instance, for each input frame, the neural network may predict a probability value, e.g., "p", that may range from 0 to 1. This value "p" may represent the probability of the frame 304 belonging to the reference image 212, which may be a true label. In these examples, higher p values may be an indication that the frames 304 include images that match the reference image 212 and lower p values may be an indication that the frames 304 do not include images that match the reference image 212. In other words, the probability value "p" may represent the probability that the frames 304 and the reference image 212 belong to the same user 206.
[0034] The processor 102 may input the feature vectors of the images in the frames 304 and the feature vectors of the reference image 212 into a decision maker 310. The decision maker 310 may be a module, e.g., a set of instructions, that the processor 102 may execute to determine whether the feature vectors of the images in the frames 304 match the feature vectors of the reference image 212. In some examples, the decision maker 310 may be a memory-based decision maker in that the decision maker 310 may have a memory aspect to it, such as a simple recurrent network. That is, the processor 102 may look at the probability output value of a present frame along with the values from a few previous frames and may make a binary decision for the present frame.
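A sketch of such a memory-based decision maker is shown below, using a short buffer of framewise probabilities in place of a recurrent network; the history length and threshold are illustrative assumptions:

```python
from collections import deque

class MemoryDecisionMaker:
    """Smooths framewise probabilities over a short history before deciding."""

    def __init__(self, history: int = 5, threshold: float = 0.5):
        self.probs = deque(maxlen=history)
        self.threshold = threshold

    def decide(self, p_current: float) -> bool:
        """Return a binary decision for the present frame, taking the few
        previous framewise probabilities into account."""
        self.probs.append(p_current)
        return sum(self.probs) / len(self.probs) > self.threshold
```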
[0035] In some examples, the processor 102 may apply a Siamese neural network to determine whether the facial image of the user 206 is valid or invalid. A Siamese neural network, or equivalently, a twin neural network, may be a network that may have two artificial neural network models with the same weights that may act as feature extractors. In these examples, the processor 102 may pass the facial image of the user 206 included in the frames 304 and the facial image of the reference image 212 through the two artificial neural networks. That is, the processor 102 may pass the facial image of the user 206 included in the frames 304 through a first artificial neural network and the facial image included in the reference image 212 through a second artificial neural network, in which the processor 102 may apply the same weights on both of the artificial neural networks. The outputs of the final layers of the artificial neural networks may be the extracted feature vectors of the facial image of the user 206 included in the frames 304 and the feature vectors of the facial image in the reference image 212.
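The shared-weight arrangement may be sketched as follows; applying one encoder module to both images is equivalent to two branches with identical weights. The backbone layers and embedding size are illustrative assumptions:

```python
import torch
import torch.nn as nn

class SiameseEncoder(nn.Module):
    """One feature extractor applied to both inputs, so the branches share weights."""

    def __init__(self, embedding_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embedding_dim),
        )

    def forward(self, probe: torch.Tensor, reference: torch.Tensor):
        # The same module (same weights) extracts both feature vectors.
        return self.backbone(probe.float()), self.backbone(reference.float())
```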
[0036] The processor 102 may determine a similarity metric between the feature vectors of the facial image of the user 206 included in the frames 304 and the feature vectors of the facial image in the reference image 212. For instance, the processor 102 may use a distance metric, such as a Euclidean distance, a Manhattan distance, and/or the like, to measure the similarity between the feature vectors.
[0037] The processor 102 may also determine whether the similarity metric exceeds a similarity threshold, which may be user-defined. Based on a determination that the similarity metric exceeds the similarity threshold, the processor 102 may determine that both of the facial images in the frames 304 and the reference image 212 belong to the same person. In this case, the processor 102 may determine that the user 206 is authentic and may grant access to the apparatus 100 and/or an application. However, based on a determination that the similarity metric falls below the similarity threshold, the processor 102 may determine that the facial images in the frames 304 and the reference image 212 belong to different people. In this case, the processor 102 may determine that the user 206 is inauthentic and may reject access by the user 206 to the apparatus 100 and/or the application.
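A sketch of this comparison is shown below. Because a Euclidean or Manhattan distance shrinks as the vectors become more alike, the sketch maps the distance to a similarity value so that exceeding the threshold indicates the same person; the 1/(1 + d) mapping and the threshold value are assumptions made for illustration:

```python
import numpy as np

def similarity(probe_vec: np.ndarray, reference_vec: np.ndarray) -> float:
    """Convert a Euclidean distance between feature vectors into a similarity in (0, 1]."""
    distance = np.linalg.norm(probe_vec - reference_vec)
    return 1.0 / (1.0 + distance)  # smaller distance -> higher similarity

def same_person(probe_vec: np.ndarray,
                reference_vec: np.ndarray,
                similarity_threshold: float = 0.7) -> bool:
    """True when the similarity metric exceeds the user-defined threshold."""
    return similarity(probe_vec, reference_vec) > similarity_threshold
```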
[0038] Reference is now made to FIG. 4. In the system 400, the apparatus 100 may include an infrared (IR) light source 402 that may output pulses 404 of IR illumination onto the user 206. In some examples, the processor 102 may cause the IR light source 402 to output the IR pulses 404 following the detection of motion by the event camera 204. In the system 400, the event camera 204 may detect changes in an illumination on the user 206 caused by the pulses 404 of IR illumination by the IR light source 402. In response to the detected changes, the event camera 204 may generate portions of the event stream 202 corresponding to detected changes in the illumination of the user 206.
[0039] As discussed herein, the event camera 204 may generate the events to be included in the event stream 202 when the pixels in the event camera 204 detect changes in brightness levels. In some instances, such as when the user 206 and the event camera 204 remain mainly stationary with respect to each other, the detected brightness levels may not change. As a result, the pixels in the event camera 204 may not generate sufficient events for the features of the user 206 to be identified. However, the output of the IR pulses 404 onto the user 206 may cause changes in the brightness levels that the pixels in the event camera 204 may detect. The event camera 204 may thus generate the event stream 202 with sufficient events for the processor 102 to identify features of the user 206 even in instances in which there may otherwise not be changes in brightness levels of the user 206.
[0040] According to examples, the IR light source 402 may output the IR pulses 404 in a particular pattern 406. The particular pattern 406 may be a random or a pseudorandom pattern in some examples. In these examples, the IR pulses 404 may be output onto the user 206 to enable the event camera 204 to detect the user 206.
[0041] However, in other examples, the particular pattern 406 may correspond to a predefined IR pattern 408. In these examples, the predefined IR pattern 408 may be stored in the data store 214 and the processor 102 may access the predefined IR pattern 408 from the data store 214. The processor 102 may also communicate the predefined IR pattern 408 for use by the IR light source 402. By way of example, the IR light source 402 may include a dedicated data storage into which the predefined IR pattern 408 may be stored. In some examples, the processor 102 may communicate the predefined IR pattern 408 during bootup of the apparatus 100. In these examples, the predefined IR pattern 408 may change over time, and the processor 102 may communicate the different predefined IR patterns 408 to, for instance, enhance security of the apparatus 100. [0042] In some examples, the predefined IR pattern 408 may be a secret or otherwise secure pattern and the processor 102 may implement the predefined IR pattern 408 to add a layer of security to the authentication of the user 206. That is, for instance, the processor 102 may determine the pattern at which the IR pulses 404 have been outputted onto the user 206 from the event stream 202 and may determine whether the determined pattern matches the predefined IR pattern 408. In instances in which the determined pattern does not match the predefined IR pattern 408, the processor 102 may determine that the event stream 202 may not be from the event camera 204. Instead, the processor 102 may determine that the event stream 202 may be from another source, which may be a malicious source. For instance, a hacker may bypass the event camera 204 and may inject a user’s video stream from another source to the processor 102, which may cause the processor 102 to incorrectly determine that the user 206 is authentic. However, because the hacker may not know the predefined IR pattern 408, the hacker may not be able to cause the injected video stream to have the predefined IR pattern 408. As a result, the processor 102 may prevent the user 206 from being authenticated based on the injected video stream. In addition, use of the predefined IR pattern 408 may prevent the event camera 204 from being used on another apparatus.
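The added layer of security may be sketched as a simple gate, assuming the IR pattern has already been recovered from the event stream as an on/off sequence and the facial comparison has already produced a match decision; the function and parameter names are illustrative assumptions:

```python
def gate_authentication(detected_pattern: list,
                        predefined_pattern: list,
                        features_match: bool) -> bool:
    """Authenticate only when the IR pattern in the event stream matches the
    stored predefined pattern and the facial features match the reference.

    An injected video stream from another source is unlikely to carry the
    secret pattern, so it fails the first check regardless of the face match.
    """
    if detected_pattern != predefined_pattern:
        return False  # stream likely did not come from the trusted event camera
    return features_match
```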
[0043] The predefined IR pattern 408 may be defined in any of a number of suitable manners. For instance, the predefined IR pattern 408 may be user-defined, e.g., a user may define the predefined IR pattern 408 through a byte stream and a duty cycle value. As another example, randomness collected during a computing operation may be used as an initializer for the predefined IR pattern 408. The predefined IR pattern 408 may be defined through hardware-based and/or software-based techniques. Examples of such techniques may include: using the least significant bits of a sound card or a video card output, using CPU hardware entropy generation, using a random number from a server (in which the event camera 204, at the time of initialization or at periodic intervals, may request a random number from a centralized server), and/or the like. In some examples, the event camera 204 may send the random number to the server at the boot-up time or initialization of the apparatus 100. [0044] According to examples, the processor 102 may determine an IR pattern in the event stream 202, in which the IR pattern in the event stream 202 may correspond to a pattern at which the IR light source 402 outputted the pulses 404 of IR illumination. For instance, the event stream 202 from the event camera 204 may include a combination of events resulting from the image of the user 206 and events resulting from the IR pulses 404, and the processor 102 may determine the IR pattern from those events.
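As a sketch of the pattern-definition approaches described in paragraph [0043], the following derives an on/off pulse pattern from operating-system entropy and a duty-cycle value; the slot count, entropy source, and function name are illustrative assumptions:

```python
import os

def make_ir_pattern(num_slots: int = 32, duty_cycle: float = 0.5) -> list:
    """Derive an on/off pulse pattern from collected randomness.

    Each random byte is compared against the duty cycle, so roughly
    duty_cycle * num_slots of the slots end up "on".
    """
    raw = os.urandom(num_slots)  # one byte of entropy per pulse slot
    return [1 if byte < int(duty_cycle * 256) else 0 for byte in raw]
```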
[0045] In other examples, the processor 102 may apply an IR pattern detector on the event stream 202 to determine the IR pattern. That is, the processor 102 may apply a detector on the event stream 202 to detect the IR pattern and a face recognition algorithm on the event stream 202 to detect the features of the user's 206 face. The IR pattern detector may be a hardware component and/or a set of instructions that the processor 102 may execute. In these examples, the processor 102 may use the event stream 202 for two separate tasks, IR pattern detection as well as face recognition. The IR pattern detector and the face recognition may be part of the same machine learning algorithm/network or may be parts of separate machine learning algorithms/networks.
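One possible IR pattern detector is sketched below: an IR pulse brightens the whole scene, so a time slot containing a pulse produces far more events than a slot without one. The slot alignment, array layout, and thresholding rule are assumptions made for illustration:

```python
import numpy as np

def detect_ir_pattern(events: np.ndarray,
                      slot_duration: float,
                      num_slots: int) -> list:
    """Recover an on/off pulse pattern from the global event rate.

    `events` is an (N, 4) array of (x, y, t, p) rows with t measured in
    seconds from the start of the stream.
    """
    counts = np.zeros(num_slots)
    slots = np.minimum((events[:, 2] // slot_duration).astype(int), num_slots - 1)
    np.add.at(counts, slots, 1)  # events per time slot
    threshold = counts.mean()    # crude split between "pulse" and "no pulse" slots
    return [1 if c > threshold else 0 for c in counts]
```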
[0046] The processor 102 may also determine whether the IR pattern in the event stream 202 matches the predefined IR pattern 408. In addition, the processor 102 may, based on a determination that the IR pattern in the event stream 202 does not match the predefined IR pattern 408, deny authentication of the user 206. The processor 102 may also deny authentication of the user 206 based on a determination that the identified features of the user 206 do not match the features of the reference image 212. However, the processor 102 may determine that authentication of the user 206 may proceed based on a determination that the IR pattern in the event stream 202 matches the predefined IR pattern 408. That is, based on a determination that the determined IR pattern in the event stream 202 matches the predefined IR pattern 408 and that the identified features of the user 206 match the features of the reference image 212, the processor 102 may determine that the user 206 is authentic and may thus be provided access to the apparatus 100 and/or an application. [0047] Turning now to FIGS. 5-7, there are respectively shown flow diagrams of example methods 500-700 for authenticating a user 206 based on a facial image in an event stream 202 received from an event camera 204. It should be understood that the methods 500-700 depicted in FIGS. 5-7 may include additional operations and that some of the operations described therein may be removed and/or modified without departing from the scopes of the methods 500-700. The descriptions of the methods 500-700 are also made with reference to the features depicted in FIGS. 1-4 for purposes of illustration. Particularly, the processor 102 depicted in FIGS. 1 and 2 may execute some or all of the operations included in the methods 500-700.
[0048] With reference first to FIG. 5, at block 502, the processor 102 may receive an event stream 202 from an event camera 204. As discussed herein, the event stream 202 may correspond to individually captured spikes of an image of a user 206 by pixels in the event camera 204, in which the event camera 204 may generate a portion of the event stream 202 corresponding to a detection of motion of the user 206. At block 504, the processor 102 may apply a neural network on the event stream 202 to identify features, e.g., biometric features, a feature vector of the biometric features, and/or the like, of a facial image in the event stream 202. That is, for instance, the processor 102 may apply a facial or image recognition operation 210 on the event stream 202 to identify the features of the facial image in the event stream 202.
[0049] At block 506, the processor 102 may determine whether the identified features of the facial image match features of a reference image 212. At block 508, the processor 102 may determine whether the user 206 is authentic based on a determination that the identified features of the facial image in the event stream 202 match features of the reference image. In addition, at block 510, the processor 102 may permit access to the apparatus 100 or an application based on the determination that the user 206 is authentic.
[0050] Turning now to FIG. 6, at block 602, the processor 102 may communicate a predefined infrared (IR) pattern 408 for use by an IR light source 402. The IR light source 402 may output pulses 404 of IR illumination at the predefined IR pattern 408 and the event camera 204 may detect changes in an illumination on the user 206 caused by the pulses 404 of IR illumination by the IR light source 402 and may generate a portion of the event stream 202 corresponding to detected changes in the illumination of the user 206. The portion of the event stream 202 may be another portion of the event stream 202 that may follow an earlier portion of the event stream 202, e.g., an initial event in the event stream 202.
[0051] At block 604, the processor 102 may receive an event stream 202 from an event camera 204 as discussed herein. At block 606, the processor 102 may determine an IR pattern in the event stream 202, in which the IR pattern may correspond to a pattern at which the IR light source 402 outputted the pulses 404 of IR illumination. At block 608, the processor 102 may determine whether the determined IR pattern in the event stream 202 matches the predefined IR pattern 408. Based on a determination that the determined IR pattern does not match the predefined IR pattern 408, at block 610, the processor 102 may reject access by the user 206 to the apparatus 100 or an application. However, based on a determination that the determined IR pattern matches the predefined IR pattern 408, the processor 102 may, at block 612, determine whether the user 206 is authentic. In addition, at block 614, based on a determination that the user 206 is authentic, the processor 102 may permit access to an apparatus 100 or an application. However, based on a determination at block 612 that the user 206 is inauthentic, the processor 102 may reject access to the apparatus 100 or the application at block 610.
[0052] Turning now to FIG. 7, at block 702, the processor 102 may receive an event stream 202 from an event camera 204. At block 704, the processor 102 may apply a neural network 306 on the event stream 202 to generate feature vectors of a facial image in the event stream 202. As discussed herein with respect to FIG. 3, the processor 102 may apply an event integrator 302 on the event stream 202 to generate frames 304, and may apply the neural network 306 on the frames 304. At block 706, the processor 102 may apply a Siamese neural network on the feature vectors of the facial image in the event stream 202 and feature vectors of a facial image in the reference image 212. At block 708, the processor 102 may determine a similarity metric between the feature vectors of the facial image in the event stream 202 and the feature vectors of the facial image in the reference image 212. In addition, at block 710, the processor 102 may determine whether the similarity metric exceeds a similarity threshold, which may be user-defined.
[0053] Based on a determination that the similarity metric does not exceed the similarity threshold, the processor 102 may, at block 712, reject access by the user 206 to the apparatus 100 and/or the application. However, based on a determination that the similarity metric exceeds the similarity threshold, the processor 102 may, at block 714, determine whether the user 206 is authentic. In addition, at block 716, the processor 102 may permit access to the apparatus 100 and/or the application based on a determination that the user 206 is authentic. However, based on a determination at block 714 that the user 206 is inauthentic, the processor 102 may reject access to the apparatus 100 or the application at block 712.
[0054] Some or all of the operations set forth in the methods 500-700 may be contained as utilities, programs, or subprograms, in any desired computer accessible medium. In addition, the methods 500-700 may be embodied by computer programs, which may exist in a variety of forms. For example, the methods 500-700 may exist as machine-readable instructions, including source code, object code, executable code or other formats. Any of the above may be embodied on a non-transitory computer-readable storage medium.
[0055] Examples of non-transitory computer-readable storage media include computer system RAM, ROM, EPROM, EEPROM, and magnetic or optical disks or tapes. It is therefore to be understood that any electronic device capable of executing the above-described functions may perform those functions enumerated above.
[0056] Turning now to FIG. 8, there is shown a block diagram of an example computer-readable medium 800 that may have stored thereon computer-readable instructions for authenticating a user 206 based on a facial image in an event stream 202 received from an event camera 204. It should be understood that the example computer-readable medium 800 depicted in FIG. 8 may include additional instructions and that some of the instructions described herein may be removed and/or modified without departing from the scope of the computer-readable medium 800 disclosed herein. The computer-readable medium 800 may be a non-transitory computer-readable medium, in which the term “non-transitory” does not encompass transitory propagating signals.
[0057] The computer-readable medium 800 may have stored thereon machine-readable instructions 802-810 that a processor, such as the processor 102 depicted in FIGS. 1 and 2, may execute. The computer-readable medium 800 may be an electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. The computer-readable medium 800 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like.
[0058] The processor may fetch, decode, and execute the instructions 802 to receive an event stream 202 from an event camera 204. The event stream 202 may correspond to individually captured spikes of an image of a user by pixels in the event camera 204, in which the event camera 204 may generate a portion of the event stream 202 corresponding to a detection of motion of the user 206. The processor may fetch, decode, and execute the instructions 804 to apply a neural network 306 on the event stream 202 to identify features of a facial image in the event stream 202. The processor may fetch, decode, and execute the instructions 806 to determine whether the identified features of the facial image match features of a facial image in a reference image 212. As discussed herein, the processor may apply a Siamese network on the features, e.g., feature vectors, of the facial image in the event stream 202 and features, e.g., feature vectors, of a facial image in the reference image 212.
[0059] The processor may fetch, decode, and execute the instructions 808 to determine whether the user 206 is authentic based on a determination that the identified features of the facial image match features of the reference image 212. In addition, the processor may fetch, decode, and execute the instructions 810 to permit or reject access to an apparatus 100 and/or an application based on the determination as to whether the user 206 is authentic.
[0060] The processor may fetch, decode, and execute instructions to communicate a predefined infrared (IR) pattern 408 for use by an IR light source 402. The IR light source 402 may output pulses 404 of IR illumination at the predefined IR pattern 408 and the event camera 204 may detect changes in an illumination on the user 206 caused by the pulses 404 of IR illumination by the IR light source 402 and to generate another portion of the event stream 202 corresponding to detected changes in the illumination of the user 206. The processor may fetch, decode, and execute instructions to determine an IR pattern in the event stream 202, in which the IR pattern may correspond to a pattern at which the IR light source 402 outputted the pulses 404 of IR illumination as discussed herein. The processor may fetch, decode, and execute instructions to determine whether the determined IR pattern in the event stream 202 matches the predefined IR pattern 408. In addition, the processor may fetch, decode, and execute instructions to, based on a determination that the determined IR pattern does not match the predefined IR pattern, reject authentication of the user. However, based on a determination that the determined IR pattern matches the predefined IR pattern, the processor may grant authentication of the user.
[0061] Although described specifically throughout the entirety of the instant disclosure, representative examples of the present disclosure have utility over a wide range of applications, and the above discussion is not intended and should not be construed to be limiting, but is offered as an illustrative discussion of aspects of the disclosure.
[0062] What has been described and illustrated herein is an example of the disclosure along with some of its variations. The terms, descriptions and figures used herein are set forth by way of illustration and are not meant as limitations. Many variations are possible within the scope of the disclosure, which is intended to be defined by the following claims - and their equivalents - in which all terms are meant in their broadest reasonable sense unless otherwise indicated.

Claims

What is claimed is:
1. An apparatus comprising: a processor; and a memory on which is stored machine-readable instructions that when executed by the processor, cause the processor to: receive an event stream from an event camera, the event stream corresponding to a user in a field of view of the event camera; apply an image recognition operation on the event stream to identify features of a facial image of the user in the event stream; determine whether the identified features of the facial image match features of a reference image; and authenticate the user based on the determination as to whether the identified features of the facial image match the features of the reference image.
2. The apparatus of claim 1 , wherein the instructions further cause the processor to: based on the user being determined to be authentic, allow authenticated access to the apparatus or an application; and based on the user being determined to be inauthentic, prevent authenticated access to the apparatus or the application.
3. The apparatus of claim 1 , wherein the event camera is to generate a portion of the event stream corresponding to a detection of motion in a field of view of the event camera, wherein an infrared (IR) light source is to output pulses of IR illumination onto the user following the detection of motion by the event camera, and wherein the event camera is to detect changes in an illumination on the user caused by the pulses of IR illumination by the IR light source and to generate another portion of the event stream corresponding to detected changes in the illumination of the user.
4. The apparatus of claim 3, wherein the instructions further cause the processor to: access a predefined IR pattern; and communicate the predefined IR pattern for use by the IR light source, wherein the IR light source is to output the pulses of IR illumination at the predefined IR pattern.
5. The apparatus of claim 4, wherein the instructions further cause the processor to: determine an IR pattern in the event stream, the IR pattern corresponding to a pattern at which the IR light source outputted the pulses of IR illumination; determine whether the IR pattern in the event stream matches the predefined IR pattern; and based on a determination that the IR pattern does not match the predefined IR pattern, deny authentication of the user.
6. The apparatus of claim 1 , wherein the instructions further cause the processor to: apply a neural network on the event stream to generate feature vectors of the facial image in the event stream; apply a Siamese neural network on the feature vectors of the facial image in the event stream and feature vectors of a facial image in the reference image; determine a similarity metric between the feature vectors of the facial image in the event stream and the feature vectors of the facial image in the reference image; determine whether the similarity metric exceeds a similarity threshold; and based on a determination that the similarity metric exceeds the similarity threshold, determine that the user is authentic.
7. The apparatus of claim 6, wherein the instructions further cause the processor to: integrate events in the event stream received from the event camera into binary image frames; and apply the neural network on the binary image frames to generate feature vectors of the facial image in the event stream.
8. The apparatus of claim 7, wherein the instructions further cause the processor to: make a binary decision as to whether the feature vectors of the facial image in the event stream match the feature vectors of the facial image in the reference image through a comparison of a probability output value of a present frame with values from a number of previous frames.
9. A method comprising: receiving, by a processor, an event stream from an event camera, the event stream corresponding to individually captured spikes of an image of a user by pixels in the event camera, wherein the event camera is to generate a portion of the event stream corresponding to a detection of motion of the user; applying, by the processor, a neural network on the event stream to identify features of a facial image in the event stream; determining, by the processor, whether the identified features of the user match features of a reference image; determining, by the processor, whether the user is authentic based on a determination that the identified features of the user match features of the reference image; and permitting, by the processor, access to an apparatus or an application based on the determination that the user is authentic.
10. The method of claim 9, further comprising: communicating a predefined infrared (IR) pattern for use by an IR light source, wherein the IR light source is to output pulses of IR illumination at the predefined IR pattern and the event camera is to detect changes in an illumination on the user caused by the pulses of IR illumination by the IR light source and to generate another portion of the event stream corresponding to detected changes in the illumination of the user.
11. The method of claim 10, further comprising: determining an IR pattern in the event stream, the IR pattern corresponding to a pattern at which the IR light source outputted the pulses of IR illumination; determining whether the determined IR pattern in the event stream matches the predefined IR pattern; and based on a determination that the determined IR pattern does not match the predefined IR pattern, rejecting authentication of the user.
12. The method of claim 9, further comprising: applying a neural network on the event stream to generate feature vectors of a facial image in the event stream; applying a Siamese neural network on the feature vectors of the facial image in the event stream and feature vectors of a facial image in the reference image; determining a similarity metric between the feature vectors of the facial image in the event stream and the feature vectors of the facial image in the reference image; determining whether the similarity metric exceeds a similarity threshold; and based on a determination that the similarity metric exceeds the similarity threshold, determining that the user is authentic.
13. The method of claim 12, further comprising: integrating events in the event stream received from the event camera into binary image frames; and applying the neural network on the binary image frames to generate feature vectors of the facial image in the event stream.
14. A non-transitory computer-readable medium on which is stored computer- readable instructions that when executed by a processor, cause the processor to: receive an event stream from an event camera, the event stream corresponding to individually captured spikes of an image of a user by pixels in the event camera, wherein the event camera is to generate a portion of the event stream corresponding to a detection of motion of the user; apply a neural network on the event stream to identify features of a facial image in the event stream; determine whether the identified features of the facial image match features of a facial image in a reference image; determine whether the user is authentic based on a determination that the identified features of the facial image match features of the reference image; and permit access to an apparatus and/or an application based on the determination that the user is authentic.
15. The non-transitory computer-readable medium of claim 14, wherein the instructions further cause the processor to: communicate a predefined infrared (IR) pattern for use by an IR light source, wherein the IR light source is to output pulses of IR illumination at the predefined IR pattern and the event camera is to detect changes in an illumination on the user caused by the pulses of IR illumination by the IR light source and to generate another portion of the event stream corresponding to detected changes in the illumination of the user; determine an IR pattern in the event stream, the IR pattern corresponding to a pattern at which the IR light source outputted the pulses of IR illumination; determine whether the determined IR pattern in the event stream matches the predefined IR pattern; and based on a determination that the determined IR pattern does not match the predefined IR pattern, reject authentication of the user.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2020/048810 WO2022046120A1 (en) 2020-08-31 2020-08-31 User authentication using event cameras


Publications (1)

Publication Number Publication Date
WO2022046120A1

Family ID: 80354246


Country Status (1)

Country Link
WO (1) WO2022046120A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150113636A1 (en) * 2013-02-15 2015-04-23 Microsoft Corporation Managed Biometric Identity
US20170235931A1 (en) * 2014-05-09 2017-08-17 Google Inc. Systems and methods for discerning eye signals and continuous biometric identification
US20180307895A1 (en) * 2014-08-12 2018-10-25 Microsoft Technology Licensing, Llc False face representation identification
US20180082304A1 (en) * 2016-09-21 2018-03-22 PINN Technologies System for user identification and authentication
US20190102528A1 (en) * 2017-09-29 2019-04-04 General Electric Company Automatic authentification for MES system using facial recognition


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20951817

Country of ref document: EP

Kind code of ref document: A1