WO2021138342A1 - Methods and apparatus for facial recognition - Google Patents
Methods and apparatus for facial recognition
- Publication number
- WO2021138342A1 (PCT/US2020/067338)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- requester
- ecd
- data
- face
- macroblock
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/2148—Generating training patterns; Bootstrap methods, e.g. bagging or boosting characterised by the process organisation or structure, e.g. boosting cascade
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
Definitions
- a requester may be a user/person who requests access to access-controlled assets and/or infrastructure.
- a requester may carry and use a key, which is designed to fit a lock, allowing the holder of the key to open the lock and gain entry.
- a requester may use a key fob to remotely lock or unlock the doors of a vehicle by, e.g., pressing a button on the fob to generate an infrared ("IR") or radio frequency (“RF”) signal, which is detected by a sensor in the vehicle, which controls the doors.
- Such vehicle keyless access systems may require the requester to operate the ignition system.
- Other similar keyless access implementations may involve inserting or presenting a magnetic card or the like to a slot or a card reader/detector, or enabling an authorized requester to key in a numeric or alphanumeric code on a provided keypad.
- while biometric access control systems may mitigate some shortcomings of key/card-based access control systems, they may have limitations as well.
- Traditional biometric sensors, such as iris detection sensors, may be limited to specific light conditions, significantly reducing both the effectiveness of the biometric sensor and the range of environments in which it can be applied. The performance of biometric sensors may be compromised in direct sunlight due to glare, shadows, and other artifacts.
- Some aspects of the present disclosure include methods for generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, identifying one or more macroblocks each including a subset of the plurality of sampling points, selecting a local pattern value, calculating a number of occurrences of the local pattern value within each subset of the plurality of sampling points for each of the one or more macroblocks, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, assigning a unique index to each of the plurality of weighted values, generating a second array of the unique indices by ranking the plurality of weighted values, and generating a third array including a plurality of ranking distances (a code sketch of this pipeline follows the parallel summaries below).
- an edge capture device having an illumination source configured to emit an incident non-visible light, an optical sensor configured to detect a detected non-visible light, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light, and one or more processors operatively coupled to the illumination source and the optical sensor, the one or more processors being configured to construct a biometric template of a requester requesting access to an entry point by generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, identifying one or more macroblocks each including a subset of the plurality of sampling points, selecting a local pattern value, calculating a number of occurrences of the local pattern value within each subset of the plurality of sampling points for each of the one or more macroblocks, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, assigning a unique index to each of the plurality of weighted values, generating a second array of the unique indices by ranking the plurality of weighted values, and generating a third array including a plurality of ranking distances.
- aspects of the present disclosure include a computer readable medium having code stored therein that, when executed by one or more processors, causes the one or more processors to execute code for generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, code for identifying one or more macroblocks each including a subset of the plurality of sampling points, code for selecting a local pattern value, code for calculating a number of occurrences of the local pattern value within each subset of the plurality of sampling points for each of the one or more macroblocks, code for generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, code for assigning a unique index to each of the plurality of weighted values, code for generating a second array of the unique indices by ranking the plurality of weighted values, and code for generating a third array including a plurality of ranking distances.
- An aspect of the present disclosure includes a system having means for generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, means for identifying one or more macroblocks each including a subset of the plurality of sampling points, means for selecting a local pattern value, means for calculating a number of occurrences of the local pattern value within each subset of the plurality of sampling points for each of the one or more macroblocks, means for generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, means for assigning a unique index to each of the plurality of weighted values, means for generating a second array of the unique indices by ranking the plurality of weighted values, and means for generating a third array including a plurality of ranking distances.
- aspects of the present disclosure include an infrastructure having an access-controlled entry point and an ECD configured to emit an incident non-visible light onto a face of a requester, detect a detected non-visible light from the face of the requester, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light, and generate a biometric template of the requester by generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, identifying one or more macroblocks each including a subset of the plurality of sampling points, selecting a local pattern value, calculating a number of occurrences of the local pattern value within each subset of the plurality of sampling points for each of the one or more macroblocks, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, assigning a unique index to each of the plurality of weighted values, generating a second array of the unique indices by ranking the plurality of weighted values, and generating a third array including a plurality of ranking distances.
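The parallel summaries above all describe one template pipeline: sample the detected non-visible light, count a chosen local pattern value per macroblock, weight the counts by macroblock size, rank the weighted values, and derive ranking distances. The following is a minimal Python sketch of one plausible reading of that pipeline, assuming the sampled profile is a grayscale 2D NumPy array and macroblocks are (x, y, w, h) rectangles; all names are illustrative, not taken from the patent.

```python
# Hedged sketch of the claimed template pipeline; names are illustrative.
import numpy as np

def lbp_codes(img):
    """8-neighbor local binary pattern code for each interior pixel."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(c.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        n = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= (n >= c).astype(np.uint8) << bit
    return codes

def template_arrays(profile, macroblocks, pattern_value):
    codes = lbp_codes(profile)
    # First array: occurrences of the pattern value, weighted by block size.
    weighted = np.array([
        np.count_nonzero(codes[y:y + h, x:x + w] == pattern_value) / float(w * h)
        for (x, y, w, h) in macroblocks])
    order = np.argsort(-weighted)              # second array: unique indices by rank
    ranks = np.empty(len(order), dtype=int)
    ranks[order] = np.arange(len(order))       # rank position of each index
    distances = ranks - np.arange(len(order))  # third array: ranking distances
    return weighted, order, distances

profile = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in profile
print(template_arrays(profile, [(0, 0, 16, 16), (16, 16, 32, 32)], pattern_value=20))
```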
- FIG. 1 is an example concurrent real-time identity verification and authentication system, in accordance with some aspects of the present disclosure
- FIG. 2 shows a perspective view of an example of a concurrent real-time identity verification and authentication device, in accordance with some aspects of the present disclosure
- FIG. 3 shows a frontal view of an example of a concurrent real-time identity verification and authentication device, in accordance with some aspects of the present disclosure
- FIG. 4 shows another perspective view of an example of a concurrent real-time identity verification and authentication device, in accordance with some aspects of the present disclosure
- FIG. 5 is a block diagram of an example processing component of a concurrent real- time identity verification and authentication device, in accordance with some aspects of the present disclosure
- FIG. 6 shows a flow diagram of a facial recognition method, in accordance with some aspects of the present disclosure
- FIG. 7(a) shows a facial image for a person, in accordance with some aspects of the present disclosure
- FIG. 7(b) shows a different facial image for the same person, in accordance with some aspects of the present disclosure
- FIG. 8 shows an example process for calculating local binary pattern (LBP) feature, in accordance with some aspects of the present disclosure
- FIG. 9 shows an example process for calculating local ternary pattern (LTP) feature, in accordance with some aspects of the present disclosure
- FIG. 10 shows positions of three example key features selected among one or more face images, in accordance with some aspects of the present disclosure
- FIG. 11 shows an example of a receiver operating characteristic (ROC) curve for testing a face database, in accordance with some aspects of the present disclosure
- FIG. 12 illustrates an example of biometric, asymmetric encryption for confidentiality, in accordance with some aspects of the present disclosure
- FIG. 13 illustrates another example of biometric, asymmetric encryption for authentication, in accordance with some aspects of the present disclosure
- FIG. 14 illustrates a schematic view of an example of an environment for implementing one or more gateways for access control
- FIG. 15 illustrates an example of a computer system for implementing a method of managing data in accordance with aspects of the present disclosure
- FIG. 16 illustrates a block diagram of various exemplary system components, in accordance with aspects of the present disclosure
- FIG. 17 illustrates an example of an ECD for identifying biometric templates, in accordance with aspects of the present disclosure
- FIG. 18 illustrates an example of the components of the ECD of FIG. 17, in accordance with aspects of the present disclosure
- FIG. 19 illustrates another example of the components of the ECD of FIG. 17, in accordance with aspects of the present disclosure
- FIG. 20 illustrates an example of a sampled profile, in accordance with aspects of the present disclosure
- FIG. 21 illustrates an example of LBP operation on measurement points of the sampled profile of FIG. 20, in accordance with aspects of the present disclosure
- FIG. 22 illustrates examples of sub-matrices
- FIG. 23 illustrates an example of a table of results for sequence conversion, in accordance with aspects of the present disclosure
- FIG. 24 illustrates an example of a flow chart for converting a sequence, in accordance with aspects of the present disclosure
- FIG. 25 illustrates an example of a table of verification calculations, in accordance with aspects of the present disclosure
- FIG. 26 illustrates an example of a flow chart of deep learning, in accordance with aspects of the present disclosure
- FIG. 27 illustrates another example of a flow chart of deep learning, in accordance with aspects of the present disclosure
- FIG. 28 illustrates a flow chart of a method for identifying biometric templates, in accordance with aspects of the present disclosure
- FIG. 29 illustrates examples of binary trees based on sampled profiles of biometric templates, in accordance with aspects of the present disclosure
- FIG. 30 illustrates an example of time-domain analysis, in accordance with aspects of the present disclosure
- FIG. 31 illustrates an example of facial recognition using auto-alignment, in accordance with aspects of the present disclosure
- FIG. 32 illustrates an example of yaw, roll, and pitch, in accordance with aspects of the present disclosure
- FIG. 33 illustrates examples of techniques for computing yaw, roll, and pitch, in accordance with aspects of the present disclosure
- FIG. 34 illustrates an example of aligning a macroblock to a face using landmarks, in accordance with aspects of the present disclosure
- FIG. 35 illustrates examples of aligning a macroblock to a face including yaw, in accordance with aspects of the present disclosure
- FIG. 36 illustrates examples of aligning a macroblock to a face including pitch, in accordance with aspects of the present disclosure.
DETAILED DESCRIPTION
- The following includes definitions of selected terms employed herein. The definitions include various examples and/or forms of components that fall within the scope of a term and that may be used for implementation.
- the bus may also be a vehicle bus that interconnects components inside a vehicle using protocols, such as Controller Area Network (CAN), Local Interconnect Network (LIN), among others.
- a “memory,” as used herein may include volatile memory and/or non-volatile memory.
- Non-volatile memory may include, for example, ROM (read only memory), PROM (programmable read only memory), EPROM (erasable PROM) and EEPROM (electrically erasable PROM).
- Volatile memory may include, for example, RAM (random access memory), static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), and/or direct Rambus RAM (DRRAM).
- Biometric identification techniques generally refer to pattern recognition techniques that perform a requester identification process by determining the authenticity of a specific physiological or behavioral characteristic possessed by the requester.
- biometric identification may be preferred over traditional methods involving passwords and personal identification numbers (PINs) for various reasons.
- the person (e.g., requester) to be identified is typically required to be physically present at the point-of-identification.
- identification based on biometric techniques obviates the need to remember a password or carry a token (i.e., a security device used to gain access to an access controlled entry point).
- One kind of texture-based feature, the local binary pattern ("LBP") feature, describes facial information and produces desirable recognition results.
- the improved local ternary pattern (“LTP") feature may be a further improvement over conventional LBP methods.
- LBP and LTP features may not be sensitive to light and expression variations and are computationally efficient, but they also have shortcomings, such as information redundancy due to correlation between the positive histogram and the negative histogram.
- the biometric signature data may be interchangeable across a wide variety of applications.
- the same biometric signature data for a person may be used to authenticate that person at one or more locations and for one or more applications.
- an example of a biometric system in the present disclosure allows the biometric signature data to be altered based on a desired security level.
- the type of biometric signature data that may be used for a particular application and/or relating to a particular requester may vary depending on the security level desired for that particular application and/or requester.
- the biometric data associated with the intended recipient may be obtained via a biometric sensor of a biometric-based access control system.
- variations in light, temperature, and distance of the biometric sensor from a target may impact the quantity and quality of the biometric data obtained via the biometric sensor.
- the biometric sensor may utilize either near infrared (IR) or ultraviolet (UV) light or a combination of both IR and UV at desired intensities.
- the method uses near IR light.
- An Infrared light emitting diode (LED) array may be utilized in the facial recognition device or biometric sensor to minimize the impact of the surrounding lighting on capturing the facial uniqueness.
- an access control system may utilize IR or near IR illumination and detection to identify facial features.
- IR or near IR lighting may penetrate into the dermis of the face.
- the IR or near IR lighting may penetrate into the dermis by 10 micrometers, 20 micrometers, 50 micrometers, 100 micrometers, 200 micrometers, 500 micrometers, 1 millimeter, 2 millimeters, 5 millimeters, and/or 10 millimeters. Other penetration depths are possible.
- the penetration depths may depend on the location of the body, wavelength of the infrared lighting, and/or intensity of the infrared lighting.
- the penetration may expose characteristics of the skin that may be difficult to see in visible light, including age spots, spider veins, hyperpigmentation, rosacea, acne, and porphyrins.
- the identification of these subdermal features may be used to adjust/supplement the unique identification of the requester. These features on the face of the requester may be unique because they are based on the requester's exposure to nature and the sun over the life of the requester. Facial recognition based on subdermal features may identify the uniqueness of the face at the time of capture to provide opportunities for identification analysis. The number of subdermal features may increase over time, and even on a daily basis, with exposure to the sun.
- an access control system may utilize ultraviolet illumination and detection to identify facial features.
- Ultraviolet lighting may penetrate into the dermis of the face.
- the UV lighting may penetrate into the dermis by 10 micrometers, 20 micrometers, 50 micrometers, 100 micrometers, 200 micrometers, 500 micrometers, 1 millimeter, 2 millimeters, 5 millimeters, and/or 10 millimeters.
- Other penetration depths are possible.
- the penetration depths may depend on the location of the body, wavelength of the ultraviolet lighting, and/or intensity of the ultraviolet lighting.
- the penetration may expose characteristics of the skin that may be difficult to see in visible light, including age spots, spider veins, hyperpigmentation, rosacea, acne, and porphyrins.
- the identification of these subdermal features may be used to adjust/supplement the unique identification of the requester.
- These features on the face of the requester may be unique because they are based on the requester's exposure to nature and the sun over the life of the requester.
- Facial recognition based on subdermal features may identify the uniqueness of the face at the time of capture to provide opportunities for identification analysis.
- the number of subdermal features may increase over time, and even on a daily basis, with exposure to the sun.
- the facial recognition system of the present disclosure may estimate the age of a person based on the quantity and nature of the subdermal features.
- the access control system may also track the change in these features over time to confirm the individual’s identity and establish lifestyle and daily routines based on interpretations of the subdermal features.
- Subdermal facial recognition may also increase the difficulty of creating a duplicate (e.g., duplicate of a biometric template) of the face due to its elimination of dependency on facial features capable of being captured by standard visible wavelength photography and camera technology.
- the access control system may also further obfuscate the content of the ultraviolet capture by introducing time-sequenced cross-polarization filters to the capturing process that further eliminates the ability to present an artificial duplicate of the face to the access control system.
- a benefit of the system of the present disclosure is that it allows a single credential system, seamless to the requester, to replace PINs, passwords, and multi-factor authentication. With this architecture in place, the requester(s) of the system may rely on a single credential management solution.
- the system of the present disclosure may support both logical and physical gateways. In some implementations of the present disclosure, the system may provide protection at home and at work.
- Aspects of the present disclosure may include a method referred to as “layered reinforcement.”
- the method comprises taking the image of the face from the biometric sensor and overlaying several layers of different-sized pixel boxes on the image. This layering of pixel boxes of different sizes has an amplifying impact on the analysis of the uniqueness of the face. Areas that are more unique to the face are amplified. Areas that are more common among faces are deemphasized.
- layered reinforcement may improve the algorithm performance while allowing the method to handle a large number of users at multiple sites where the biometric sensor ECD is deployed.
- the “layered reinforcement” of the method may allow for the processing of the same number of requesters on a local Advanced RISC Machine (ARM) processor at the biometric sensor ECD where the image is first captured. This reduces hardware and processing requirements and contributes to the accuracy and reliability of the method, as a network failure cannot prevent the biometric sensor ECD from processing a face verification.
- Some aspects of this embodiment of the invention cover the use of a gateway (described below) to manage the data analyzed by the various algorithms, increasing performance by decreasing false negative and false positive results through the following processes: pixel box hierarchical analysis to create a binary tree of dominant features (i.e., determining the most distinctive feature); pixel box time domain analysis with heat maps (i.e., determining over time which features are problematic due to overlap among subjects); and binary tree collision (flagging overlap of biometric signature data for two subjects that may cause a false positive and addressing it proactively).
- Benefits to the system of the present disclosure include improved performance when accuracy requires reduction in false negative and false positive results.
- Referring to FIG. 1, an example of an identification system 100 for concurrent real-time identity verification and authentication, for use in, e.g., allowing access by an authorized requester to a vehicle, building, or the like, is illustrated in accordance with aspects of the present disclosure.
- FIG. 1 is intended to describe aspects of the disclosure sufficiently to enable those skilled in the art. Other implementations may be utilized, and changes may be made, without departing from the scope of the present disclosure.
- the identification system 100 comprises a concurrent real-time identity verification and authentication device 102 including at least one biometric sensor 104, a processor 106, memory 108, a display 110, and input/output mechanism 112.
- the identification system 100 may be used to secure or control access to a secured area, device, or information, such as an airport boarding area, building, stadium, database, locked door, vehicle, or other access controlled assets/infrastructure.
- the biometric sensor(s) 104 may include a camera, a fingerprint reader, retinal scanner, facial recognition scanner, weight sensor, height sensor, body temperature sensor, gait sensor, heartbeat sensor, or any other sensor or device capable of sensing a biometric characteristic of a person.
- as shown in FIGS. 2-4, the biometric sensor(s) 104 may be an optical sensor, such as a camera.
- the biometric sensor(s) 104 may include an optical sensor that captures visual data.
- the biometric sensor(s) 104 may be a camera that senses visual information of a requester, such as the facial features of the person.
- the facial features of the person may include the textures, complexions, bone structures, moles, birthmarks, contours, and coloring of the face of the person.
- the biometric sensor(s) 104 may capture the facial features of the person and convert the visual information into digital sensed information (as discussed below).
- the processor 106 may be configured for comparing the sensed information via biometric sensor(s) 104 with known characteristics of a person in an attempt to identify the person via biometric signature data.
- the processor 106 may include any number of processors, controllers, integrated circuits, programmable logic devices, or other computing devices.
- the processor 106 may be communicatively coupled with the biometric sensor(s) 104 and other components of the system 100 through wired or wireless connections to enable information to be exchanged between the device 102 and external devices 114 or systems (e.g., network 116) to allow for comparison of the stored biometric signature data with the sensed information obtained from the biometric sensor(s) 104.
- the processor 106 may implement a computer program and/or code segments stored on memory 108 to perform some of the functions described herein.
- the computer program may include an ordered listing of executable instructions for implementing logical functions in the device 102.
- the computer program can be embodied in any computer-readable medium (e.g., memory 108) for use by or in connection with an instruction execution system, apparatus, or device that can execute the instructions.
- the memory 108 may contain, store, communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- Examples of memory 108 may include an electrical connection having one or more wires, a random access memory (RAM), a read-only memory (ROM), an erasable, programmable, read-only memory (EPROM or Flash memory), a portable computer diskette, or a portable compact disk read-only memory (CDROM).
- the memory 108 may be integral with the device 102, a stand-alone memory, or a combination of both.
- the memory 108 may include, for example, removable and non-removable memory elements such as RAM, ROM, Flash, magnetic, optical, USB memory devices, and/or other conventional memory elements.
- the memory 108 may store the known characteristics of a number of people and various other data associated with operation of the system 100, such as the computer program and code segments mentioned above, or other data for instructing the device 102 and other device elements to perform the aspects described herein.
- the various data stored within the memory 108 may be associated within one or more databases (not shown) to facilitate retrieval of the information, e.g., via the external devices 114 or the network 116.
- although the memory 108 as shown in FIG. 1 is integrated into the device 102, it should be appreciated that the memory 108 may be stand-alone memory positioned in the same enclosure as the device 102, or may be external memory accessible by the device 102.
- the display 110 may be configured to display various information relating to the system 100 and its underlying operations.
- a notification device (not shown) may be included for indicating that the sensed biometric characteristic or the sensed signal fails to match the known characteristics of the person, and may include an audible alarm, a visual alarm, and/or any other notification device.
- the device 102 may also include input/output mechanism 112 to facilitate exchanging data and other information among different components within the device 102, or with the various external devices 114 or systems via the network 116.
- various I/O ports may be contemplated including a Secure Digital (SD) card slot, Mini SD card slot, Micro SD card slot, or the like for receiving removable SD cards, Mini SD cards, Micro SD cards, or the like, and a USB port for coupling with a USB cable communicatively coupled with another computing device such as a personal computer.
- the input/output mechanism 112 may include an input device (not shown) for receiving identification information about a person-to-be-identified.
- the input device may include a ticket reader, a credit card reader, an identification reader, a keypad, a touch-screen display, or any other device.
- the input/output mechanism 112 may be configured to enable the device 102 to communicate with other electronic devices through the network 116, such as the Internet, a local area network, a wide area network, an ad hoc or peer to peer network, or a direct connection such as a USB, Firewire, or Bluetooth TM connection, etc.
- known characteristics about persons may be stored and retrievable in remote databases or memory via the network 116.
- the input/output mechanism 112 may thus communicate with the network 116 utilizing wired data transfer methods or wireless data transfer methods such as WiFi (802.11), WiMax, Bluetooth™, ANT®, ultra-wideband, infrared, cellular telephony, radio frequency, etc.
- the input/output mechanism 112 may include a cellular transceiver for transmitting and receiving communications over a communications network operable with GSM (Global System for Mobile communications), CDMA (Code Division Multiple Access), or any other known standards.
- the device 102 may also include a power source (not shown) for providing electrical power to the various components contained therein.
- the power source may include batteries, battery packs, power conduits, connectors, and receptacles operable to receive batteries, battery connectors, or power cables.
- the device 102 may be installed and positioned on an access control entry point (not shown) such as a gate, locked door, etc.
- the device 102 may be a stand-alone, compact, handheld, and portable device. In one example, one may use such a stand-alone, compact, handheld, and portable device to protect sensitive documents or information that are electronically stored and accessed on the Internet and/or an intranet.
- a concurrent realtime identity verification facility access unit may use biometric signature data to create interchangeable authentication for a variety of uses (e.g., office, home, smart phone, computer, facilities).
- the processor 106 in FIG. 1 may be configured to include, among other features, a detection module 502 and a recognition module 508 for providing concurrent real-time or near real-time identity verification and authentication with keyless access to authorized requesters to secured facilities or information.
- the detection module 502 may include a face detection module 504 for detecting facial features of a requester.
- the detection module 502 may include an eye detection module 506 for identifying the locations of the eyes of a requester.
- the detection module 502 may include one or both of the face detection module 504 and/or the eye detection module 506.
- the processor 106 may receive inputs (digital or analog) from the sensor(s) 104.
- FIG. 6 describes an example procedure for selecting key features from a database containing a large amount of facial information and building a classifier that can distinguish different faces accordingly.
- LBP and LTP may be used to provide a full description of face information, and then with the use of an adaptive boosting ("adaboost") learning algorithm, one may select key features and build a classifier to distinguish different faces by creating biometric signature data.
- This biometric signature data may be used to create universal verification and authentication that can be used for a variety of applications (e.g., computer, building access, smartphone, automobile, data encryption) with varying degrees of access and security (e.g., access to network, but heightened security for requester computer).
- the processor 106 and the recognition module 508 may create a face sample database using unrecognized face samples.
- the processor 106 and the recognition module 508 may store, into the memory 108, face samples of, e.g., 1,000 different persons, with each person showing 10 different postures and/or expressions.
- extract LBP and LTP features: for example, the detection module 502 and/or the face detection module 504 may extract LBP and LTP features from different blocks in different positions of each face sample.
- calculate positive samples and negative samples: for example, at least one of the detection module 502, the face detection module 504, and the eye detection module 506 may calculate the feature absolute value distance for the same position of any two different images from one person and set this distance as the positive sample feature database.
- the detection module 502 and the face detection module 504 may jointly or separately calculate the feature absolute value distance for the same position of any two different images from different persons and set this distance as the negative sample feature database.
- build adaboost classifier: for example, the detection module 502 and the face detection module 504 may select the most distinguishable key features from the candidate feature database with adaboost and create a face classifier.
- generate recognition result: for example, the recognition module 508 may generate a recognition result. Once there is a fixed dataset of macroblocks and the specific LBP value, ranging from 1 to 255, is determined, a value is assigned to that unit of the dataset based upon the number of pixels within the block that satisfy that specific LBP.
- for an LBP value of, e.g., 20, the method 600 determines the number of pixels in the histogram that fall within that LBP value as a scalar value.
- the method 600 calculates the scalar value and then normalizes the value in a second array to address the problem of determining values within various sized macroblocks.
- under the known method, the scalar value was based on the size of the macroblock, where the maximum value could range from 100 to 1600 depending upon the size of the macroblock.
- the scalar value in this second array may now be a percentage of the total pixels available in that macroblock, normalizing the data for the subsequent assessment. Normalization prevents the data from being skewed by the size of the macroblock.
- each unit of the data set in this second array has the same weight.
- This normalized data may then be sorted to establish and assign a value from 1 to 2165, where the highest normalized value goes to the top of the sort. For example, if dataset 2000 had the highest value in the array, it would be assigned a value of 1, with descending values reflecting the datasets that have lower values.
- the second normalized array may then be converted to a third, simulated DNA sequencing array, where each position within this third array is established based upon its value in the previous sort. The third array assesses the positions and calculates the differences between where the datasets appear in the sequence (e.g., ranking distance).
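One plausible way to use the third array for verification, sketched below, is to score two templates by how far each macroblock index drifts between their rank orders; the specific distance rule is an assumption, since the exact metric is not spelled out here.

```python
# Illustrative ranking-distance comparison between two ranked sequences
# (the "second array" of each template); the scoring rule is assumed.
import numpy as np

def ranking_distance(ranked_a, ranked_b):
    pos_b = np.empty_like(ranked_b)
    pos_b[ranked_b] = np.arange(len(ranked_b))   # index -> position in B's order
    # Sum of |position in A's order - position in B's order| over all indices.
    return int(np.abs(np.arange(len(ranked_a)) - pos_b[ranked_a]).sum())

print(ranking_distance(np.array([2, 0, 1, 3]), np.array([2, 0, 1, 3])))  # 0: same order
print(ranking_distance(np.array([2, 0, 1, 3]), np.array([3, 1, 0, 2])))  # 8: far apart
```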
- test face sample: for example, the detection module 502 may optionally test face samples.
- extract LBP and LTP features: for example, the detection module 502 and/or the face detection module 504 may extract LBP and LTP features from different blocks in different positions of each face sample.
- online recognition may include the following steps: (1) calculate the offline-stage extracted key features of different blocks in different positions for the face sample to be identified; and (2) compare the key features from step (1) with those of each face sample in the database and determine whether they belong to the same person or not.
- an example process starts with creating a face database with different postures and different expressions. For example, one may include the images of, e.g., 1,000 different persons, with each person showing, e.g., 10 different images.
- FIG. 7(a) shows different face images of the same person, and FIG. 7(b) shows face images of a different person.
- LBP and LTP may be used to describe the face.
- FIG. 8 shows a calculating process of LBP features
- FIG. 9 shows a calculating process of LTP features.
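For concreteness, the sketch below evaluates an LTP code on a single 3x3 neighborhood, using the conventional split into positive and negative half-patterns with a tolerance t; the threshold value and the split follow common LTP practice rather than details confirmed by this document.

```python
# Local ternary pattern on one 3x3 patch, split into two 8-bit half-patterns.
def ltp_3x3(patch, t=5):
    c = patch[1][1]
    ring = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
            patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    upper = lower = 0
    for bit, p in enumerate(ring):
        if p >= c + t:
            upper |= 1 << bit   # neighbor clearly brighter than center
        elif p <= c - t:
            lower |= 1 << bit   # neighbor clearly darker than center
    return upper, lower         # positive and negative half-patterns

print(ltp_3x3([[54, 57, 60], [51, 55, 58], [49, 52, 61]]))  # -> (20, 64)
```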
- different block sizes may be divided on different positions of the face sample. For example, the face size can be 100 x 100 and the block size may be w x h, where the w and h values can range from 2 to 100; 7837 blocks may be selected as a result.
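A rough sketch of the block enumeration follows; the stride and size grid here are assumptions, and the patent's figure of 7837 blocks reflects its own selection rule rather than these parameters.

```python
# Enumerate candidate blocks over a 100 x 100 face at assumed sizes/stride.
def candidate_blocks(face=100, sizes=range(10, 101, 10), stride=10):
    blocks = []
    for w in sizes:
        for h in sizes:
            for x in range(0, face - w + 1, stride):
                for y in range(0, face - h + 1, stride):
                    blocks.append((x, y, w, h))
    return blocks

print(len(candidate_blocks()))  # 3025 under these assumed parameters
```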
- the identification system 100 may select the bin features of LBP and LTP over the different block sizes and use them as the final candidate feature database.
- the next step is to calculate positive samples and negative samples.
- the bin feature absolute value distance of the same position for different images from a same person may be calculated and set as the positive sample. Additionally, the bin feature absolute value distance of same position for different persons may be calculated and set as the negative sample.
- the result may involve calculating 32356 positive samples and 58747698 negative samples.
- the key bin features that can distinguish all positive and negative samples may be selected from the large candidate feature database with a learning algorithm.
- for example, one may choose the discrete adaboost learning algorithm to select features and build a classifier.
- An example algorithm of using adaboost to classify may include the following computational steps: 1. given f as the maximum negative sample error rate, d as the minimum positive sample correct rate, F_tar as the target negative sample error rate, and D_tar as the target positive sample correct rate that the cascade classifier has to achieve, where P and N are the positive and negative databases, respectively; 2.
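To make the selection step concrete, here is a minimal discrete AdaBoost round over threshold stumps, assuming each candidate feature is a scalar distance where positive (same-person) pairs score low; the cascade targets f, d, F_tar, and D_tar would wrap stage loops around this inner routine.

```python
# Minimal discrete AdaBoost feature selection with threshold stumps.
import numpy as np

def adaboost_select(X, y, rounds=3):
    """X: (n_samples, n_features) distances; y: +1 positive pair, -1 negative."""
    n, m = X.shape
    w = np.full(n, 1.0 / n)
    chosen = []
    for _ in range(rounds):
        best = None
        for j in range(m):                        # search feature x threshold
            for thr in np.percentile(X[:, j], [25, 50, 75]):
                pred = np.where(X[:, j] <= thr, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, thr, pred)
        err, j, thr, pred = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)            # upweight mistakes
        w /= w.sum()
        chosen.append((j, thr, alpha))            # selected key features
    return chosen
```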
- FIG. 10 shows the positions of the first three key features selected among face images, obtained by online testing on a face database of 100 persons using the offline-selected features and classifier.
- FIG. 11 shows recognition results for 100 persons, wherein the X axis represents the false accept rate (the rate at which face samples are wrongly identified) and the Y axis represents the verification rate (the rate at which face samples are correctly recognized). As shown in FIG. 11, when the false accept rate is below 10^-4, the system may achieve a 95% recognition rate.
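The quoted operating point can be read off score distributions directly; the sketch below computes the verification rate at a fixed false accept rate from genuine and impostor comparison scores (smaller meaning more similar), with stand-in data.

```python
# Verification rate at a fixed false accept rate, as in the ROC of FIG. 11.
import numpy as np

def verification_rate_at_far(genuine, impostor, far=1e-4):
    thr = np.quantile(impostor, far)       # threshold admitting `far` impostors
    return float((genuine <= thr).mean())  # fraction of genuine pairs accepted

rng = np.random.default_rng(0)
genuine = rng.normal(0.3, 0.1, 10_000)     # stand-in genuine-pair scores
impostor = rng.normal(0.8, 0.1, 100_000)   # stand-in impostor-pair scores
print(verification_rate_at_far(genuine, impostor))
```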
- the face recognition in this example not only improves robustness to face sample variation, but also reduces computational complexity, thus improving face recognition significantly.
- the detection module 502 may be configured to use, among other features, a face detection module 504 and an eyes detection module 506 for processing the acquired image of the person-to-be-identified as follows.
- Face Detection Module 504. Inputs: acquired frontal face image (grey image), face classifier. Outputs: face frame positions and the number of faces.
- Flow: a. reduce the acquired frontal face image to a user-defined size; b. calculate an integral image of the reduced image; c. initialize a traverse window based on the size defined by the face classifier, e.g., 20x20; d.
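Step b relies on the standard integral image trick: each entry stores the sum of all pixels above and to the left, so any rectangular window sum costs four lookups. A minimal sketch:

```python
# Integral image and constant-time rectangle sums (step b of the flow).
import numpy as np

def integral_image(img):
    return img.cumsum(axis=0).cumsum(axis=1)

def block_sum(ii, x, y, w, h):
    total = ii[y + h - 1, x + w - 1]
    if x > 0:
        total -= ii[y + h - 1, x - 1]
    if y > 0:
        total -= ii[y - 1, x + w - 1]
    if x > 0 and y > 0:
        total += ii[y - 1, x - 1]
    return total
```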
- Eyes Detection Module 506. Inputs: acquired frontal face image (grey image), face frame positions, classifier for both left and right eyes, left eye classifier, right eye classifier, left eye coarse detection classifier, right eye coarse detection classifier. Outputs: frame position for both eyes, frame position of the left eye, and frame position of the right eye. Flow: a. obtain the face image from the acquired frontal face image; b. if a user-defined classifier for both left and right eyes is available, use the correspondingly defined face detection function to detect both the left and right eyes of the obtained face image; if not, estimate the positions of both the left and right eyes based on experience;
- c. if a user-defined left/right eye coarse detection classifier is available, detect the left/right eye on the corresponding half of the obtained face image. Further, based on the coarse detection result, determine whether the detected human subject is wearing glasses. If glasses are present, detect the obtained face image and return with results. If no glasses are present, continue to detect the obtained face image based on the coarse detection result and return the detection result without considering the presence of glasses. (If a user-defined classifier for glasses-wearing subjects is not available, detect the obtained face image without considering the presence of glasses.) d. if user-defined coarse detection classifiers are not available, determine whether glasses are present by directly detecting the left/right half of the obtained face image. If glasses are present, detect the obtained face image and return with results.
- the processor 106 may further use, e.g., a recognition module 508, to extract pertinent facial features obtained from the detection module 502 for comparing against known characteristics and/or information of a number of authorized people as follows.
- Normalization. Inputs: to-be-normalized image (grey image), and the coordinates of the centers of both the left and right eyes on the image axis (the origin is located at the left top corner of the image). Output: output image.
- Feature Extraction. Inputs: normalized image (grey image) and feature types. Outputs: if the output buffer is NULL, return the feature dimensional degrees; otherwise, assume the size of the output buffer equals the feature dimensional degrees, write the features of the image into the buffer, and return the feature dimensional degrees. Certain features are associated with certain image sizes; for example, the #6 feature may require an image size of 100 by 100. Therefore, when the input image fails the correspondingly defined image size requirement, a result of zero can be returned.
- Feature Comparison. Inputs: two features to be compared and the comparison method. Output: the smaller the comparison result (a floating point), the higher the similarity.
- Examples of the present disclosure can generate asymmetric keys based on one or more biometric signature data (BSD) files in such a way that, by utilizing a biometric sensor, a person's biometric measurement can act as the person's private key.
- Implementations of the present disclosure may also incorporate BSD files into digital rights management (DRM) security in such a way that files cannot be decrypted or accessed by anyone other than the requester or group of requesters intended, or encrypted in a way that the original owners, such as a business, can no longer access the files. Accordingly, by using implementations of the present disclosure employing BSD files, when a file is accessed, there can be assurance of the identity of the requester who accessed the file.
- BSD files can be generated by the algorithmic analysis of data from an A/D IR and/or UV sensor. Accordingly, many of these elements can be considered when constructing the private key of the asymmetrical pair (i.e., analog and/or digital values). Thus, in some implementations of the present disclosure, multiple elements of a sensor can contribute real-time data or real-time analog data related to a recognition event in order to de-encrypt, thus ensuring a real-time event (i.e., the actual measurement of the intended person) has triggered the authentication. As shown in FIGS. 12-13, in accordance with some implementations of the present disclosure, messages can be sent as follows.
- a requester can register, e.g., on a computer, and create a public key for the requester. The requester can then publish the public key so that the key is publicly known. Other people, systems, or entities can use the requester's public key to encrypt messages for the requester and send those messages to the requester. The requester can decrypt the message using her private key created from one or more live BSD files associated with the requester. Accordingly, the sender of the message is assured that the requester is actually the person decrypting the message, because the private key used to decrypt the message can be generated from the requester's live biometric data.
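One hedged sketch of pairing a live biometric with asymmetric keys: a stabilized (bit-exact) template digest seeds an X25519 private key, so the key only exists while the requester's biometric is presented. A real system would need a fuzzy extractor or similar error correction, since raw biometric captures differ from session to session; the template bytes below are a placeholder.

```python
# Derive a deterministic key pair from a stabilized biometric template.
import hashlib
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def key_from_template(template_bytes: bytes) -> X25519PrivateKey:
    seed = hashlib.sha256(template_bytes).digest()    # 32-byte deterministic seed
    return X25519PrivateKey.from_private_bytes(seed)

private_key = key_from_template(b"stabilized-BSD-template")  # placeholder bytes
public_key = private_key.public_key()  # publishable, as in FIG. 12
```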
- DRM rules can allow for additional content to be added to a file and additional rules to be required.
- DRM rules can be expressed in many rights management languages known in the art, including but not limited to, XrML (extensible rights markup language), XMCL (extensible media commerce language), ODRL (open digital rights language), and the like.
- Rules can specify the actions that are permitted (e.g., decrypting, encrypting, transferring, copying, editing, etc.).
- the rules can also specify the people authorized to perform actions and the conditions under which these actions are permitted.
- BSD files can be used to authenticate a requester to determine whether the requester is one of the people specified in the rules.
- Various systems and methods for biometric encryption and authentication can also find application in corporate settings where, e.g., employees use corporate devices for personal use as well as business, or when, e.g., an employee uses a personal device and the corporate digital assets are transferred to and from the personal device.
- by applying rules to documents that have certain digital signatures, there may be controllable segmentation between private and business concerns. Both parties may have access to the parts they are entitled to access but can be prevented from accessing parts they are not entitled to access.
- possible applications include, but are not limited to, providing remote access, making purchases, and conditional security.
- biometric authentication techniques of the present disclosure can be used to make authenticated online purchases/transactions. For example, spending limits can be based on requester or group profile for an account. In order for a requester to make a purchase, a system can use the biometric authentication techniques of the present disclosure to authenticate the true identity of that requester to verify the requester is entitled to make the desired purchase.
- Biometric authentication techniques can also be used to provide conditional security to various digital files.
- Biometric encryption and authentication techniques of the present disclosure find many applications in the digital cinema industry. Movies are popular commodities, especially pre-DVD release. In order to maximize both production efficiencies and distribution opportunities, movies need to be accessed and handled by many different strata of requesters. Persons skilled in the art appreciate that techniques capable of protecting digital assets in the digital cinema industry can be used to protect digital assets in almost any industry. Accordingly, the principles described herein are not limited to application in the digital cinema industry, but may instead be applied to any industry for a similar purpose.
- SMPTE DC28, the body responsible for digital cinema standards, has identified five separate areas of digital cinema: (1) capture; (2) production; (3) master (cinema, home, video, trailers, test screenings); (4) distribution (satellite, fiber, packaged); and (5) exhibition (digital projector security).
- At each of these stages, a movie is vulnerable to theft.
- movies can be encrypted prior to distribution. Movies are then typically stored in their encrypted state in the theater until showtime. At showtime, the movie is decrypted and decompressed. This decryption/decompression may take place in a server or in a projector.
- DC28.4 can represent the conditional access portions of the cinema delivery system.
- Modern DRM encryption methods have proven sufficient to withstand unwarranted deciphering attempts, but securing the keys has become a problem. From capture to exhibition to distribution, a movie is encrypted and decrypted multiple times. Accordingly, various biometric encryption and authentication techniques discussed herein can be applied to one or more of the encryption, decryption, and authentication steps, in accordance with various implementations of the present disclosure.
- an example of an environment 1400 for managing data may rely on a first gateway 1402a, a second gateway 1402b, and a third gateway 1402c to route data via wired and/or wireless communication links.
- the first gateway 1402a may be implemented as a software-based gateway virtualized in a computer system.
- the second gateway 1402b may be a standalone device that routes data as described below.
- the third gateway 1402c may be a cloud-based gateway.
- Other architectures are possible.
- the gateways 1402 may perform several functions including managing the movement of data to and from the biometric sensor as described below, providing a networked solution that efficiently moves binary facial data between devices, and when clustered together (physical and virtual), providing a high availability solution for security designs.
- the gateway 1402 may receive credentialing data through an XML file structure.
- the gateway 1402 may be able to utilize a standardized universal interface agnostic to the programming language, operating system and type of connectivity of the data source, support physical and logical access control requirements within the same method, offer an interface that supports simultaneous connectivity from different system types, and/or support either live or batch processing of credentials, including the immediate recovery of a credential system through file replacement.
- data stored in the gateways 1402 may be stored in files based on the JavaScript Object Notation (JSON) format.
- the environment 1400 may include one or more ECDs 1404, e.g., ECD-a 1404a, ECD-b 1404b, ECD-c 1404c, ECD-d 1404d, ECD-e 1404e, and ECD-f 1404f.
- the ECDs 1404 may also be referred to as edge capture devices.
- the ECDs 1404 may transmit data to the gateways 1402 to be routed to another device within the environment 1400.
- the ECD-a 1404a may send a biometric template and/or identity information of a person to the third gateway 1402c.
- the third gateway 1402c may send one or more biometric templates to the ECD-a 1404a for performing matching operations.
- the environment 1400 may include an access control server 1406a, an enterprise server 1406b, and a third party server 1406c.
- the access control server 1406a may be communicatively coupled to one or more access controlled entry points 1408 via a wired or wireless communication link.
- the enterprise server 1406b may be communicatively coupled to a storage device 1410.
- the storage device 1410 may be a network drive, local hard drive, flash drive, tape drive, or other suitable storage media.
- the ECD-b 1404b may receive a biometric template of a first requester 1450a.
- the first requester 1450a may attempt to access one or more access controlled entry points 1408 controlled by the access control server 1406a.
- the one or more access controlled entry points 1408 may include a vault, lock, secure door, secure gate, equipment or machinery, computing device, digital storage device, database, or file, for example.
- the one or more access controlled entry points 1408 may be an access controlled door or gate of an infrastructure, such as a warehouse, office building, restricted area, etc.
- the biometric template of the first requester 1450a may include one or more of the fingerprints, voice patterns, iris patterns, facial features, signature patterns, shapes of the ears, retinal patterns, gait, and/or hand geometry of the first requester 1450a.
- the ECD-b 1404b may extract the facial features of the first requester 1450a, and compare the facial features with the facial features of authorized personnel. If the facial features of the first requester 1450a match one of the facial features of the authorized personnel, the ECD-b 1404b may send a first positive match signal to the third gateway 1402c. The third gateway 1402c may route the first positive match signal to the access control server 1406a. Upon receiving the first positive match signal, the access control server 1406a may unlock one of the one or more access controlled entry points 1408 associated with the ECD-b 1404b to allow the first requester 1450a access.
- the ECD-d 1404d may transmit a second positive match signal of a second requester 1450b at a first time and a third positive match signal at a second time of the second requester 1450b to the second gateway 1402b.
- the second gateway 1402b may route the second positive match signal and the third positive match signal to the first gateway 1402a, which may route the second and third positive match signals to the enterprise server 1406b.
- the enterprise server 1406b may use the second positive match signal and the third positive match signal to log access information associated with the second requester 1450b.
- the enterprise server 1406b may record, into the storage device 1410, the first time as the arrival time of the second requester 1450b and the second time as the departure time on a work day.
- the enterprise server 1406b may record the first and second times to track the number of accesses to the one or more access controlled entry points 1408 by the second requester 1450b.
- the enterprise server 1406b may also log, based on the information in the second and third positive match signals, the premises, equipment, files, locations, and information accessed by the second requester 1450b.
- the ECD-f 1404f may transmit a fourth positive match signal of the third requester 1450c to the first gateway 1402a.
- the first gateway 1402a may route the fourth positive match signal to the third party server 1406c through a firewall 1412.
- the firewall 1412 may filter information transmitted through the firewall 1412 and prevent malicious requesters from gaining unauthorized access.
- the fourth positive match signal may indicate to the third party server 1406c that the third requester 1450c gained access to the one or more access controlled entry points 1408.
- the fourth positive match signal may indicate to the third party server 1406c that the third requester 1450c accessed software that requires payment to the owner of the third party server 1406c.
- a network administrator 1452 may install, manage, update, maintain, and/or control the software in the ECDs 1404, the gateways 1402, the access control server 1406a, the one or more access controlled entry points 1408, the enterprise server 1406b, the storage device 1410, and/or the firewall 1412 via a workstation 1414.
- the workstation 1414 may be a desktop computer, laptop computer, tablet computer, handheld computer, smartphone, or other suitable computer devices communicatively coupled via a wired or wireless connection to the third gateway 1402c.
- the network administrator 1452 may transmit software commands from the workstation 1414 to the third gateway 1402c to be routed to a destination. In some examples, the network administrator 1452 may upgrade the firmware in the ECDs 1404.
- the network administrator 1452 may install new software onto the access control server 1406a. In another example, the network administrator 1452 may perform maintenance operations, such as disk error check and defragmentation, on the storage device 1410. In yet another example, the network administrator 1452 may lock down or open the one or more access controlled entry points 1408 in an emergency.
- an employee 1454, such as a supervisor, may utilize a requester terminal 1416 to access information through the second gateway 1402b.
- the requester terminal 1416 may be a desktop computer, laptop computer, tablet computer, handheld computer, smartphone, or other suitable computer devices communicatively coupled via a wired or wireless connection to the second gateway 1402b.
- the employee 1454 may download information, such as work hours, arrival time, access history, and utilization frequencies, using the requester terminal 1416.
- data exchanges within the environment 1400 may be encrypted.
- Data transmissions between the gateways 1402 and the ECDs 1404 may use advanced TLS v1.2 communications with a proprietary key management framework.
- Data transmitted via TLS v1.2 communications may be fully encrypted to remove the threat of exposure of the data to unwanted parties.
- the encryption of the data may be further protected by protecting the generation of the encryption keys through the use of the biometric data as the seed for the generation of the keys. As such, data exchanged within the environment may be difficult to access by unauthorized requesters.
- the gateways 1402 may be used to manage the data and the creation of a blockchain for the requesters.
- the first gateway 1402a, the second gateway 1402b, and the third gateway 1402c may each include a blockchain wallet.
- the wallets will contain the requester's personal credentials required to authenticate to any device or application.
- the wallets may be tied to the gateways 1402 to provide cybersecurity monitoring and to provide the interaction between the wallet and the facial recognition devices.
- the ECDs 1404 linked to the personal blockchain may be able to enable a transaction in the blockchain. Communications between blockchain wallets, a ledger for the blockchain transactions, and the gateways 1402 may use the blockchain protocol.
- the personal blockchain will also extend to devices that verify more than one requester, like a bank ATM (not shown).
- the gateways 1402 will utilize location tracking to move the linked binary facial data and the link to the blockchain ledger between shared devices to improve the security of the transactions and to manage the number of requesters held within each ECD 1404.
- Aspects of the present disclosure may be implemented using hardware, software, a cloud network, or a combination thereof and may be implemented in one or more computer systems or other processing systems. In an aspect of the present disclosure, features are directed toward one or more computer systems capable of carrying out the functionality described herein. An example of such a computer system 1500 is shown in FIG. 15.
- One or more of the gateways 1402, the servers 1406, the firewall 1412, the workstation 1414, and/or the requester terminal 1416 may be implemented based on the computer system 1500.
- the computer system 1500 includes one or more processors, such as the processor 1504.
- the processor 1504 is communicatively coupled to a communication infrastructure 1506 (e.g., a communications bus, cross-over bar, or network).
- the computer system 1500 may include a display interface 1502 that forwards graphics, text, and other data from the communication infrastructure 1506 (or from a frame buffer not shown) for display on a display unit 1530.
- Computer system 1500 also includes a main memory 1508, preferably random access memory (RAM), and may also include a secondary memory 1510.
- the secondary memory 1510 may include, for example, a hard disk drive 1512, and/or a removable storage drive 1514, representing a floppy disk drive, magnetic tape drive, optical disk drive, universal serial bus (USB) flash drive, etc.
- the removable storage drive 1514 reads from and/or writes to a removable storage unit 1518 in a well-known manner.
- Removable storage unit 1518 represents a floppy disk, magnetic tape, optical disk, USB flash drive etc., which is read by and written to removable storage drive 1514.
- the removable storage unit 1518 includes a computer usable storage medium having stored therein computer software and/or data.
- Secondary memory 1510 may include other similar devices for allowing computer programs or other instructions to be loaded into computer system 1500.
- Such devices may include, for example, a removable storage unit 1522 and an interface 1520. Examples of such may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an erasable programmable read only memory (EPROM), or programmable read only memory (PROM)) and associated socket, and other removable storage units 1522 and interfaces 1520, which allow software and data to be transferred from the removable storage unit 1522 to computer system 1500.
- Computer system 1500 may also include a communications interface 1524. Communications interface 1524 allows software and data to be transferred between computer system 1500 and external devices.
- communications interface 1524 may include a modem, a network interface (such as an Ethernet card), a communications port, a Personal Computer Memory Card International Association (PCMCIA) slot and card, etc.
- Software and data transferred via communications interface 1524 are in the form of signals 1528, which may be electronic, electromagnetic, optical or other signals capable of being received by communications interface 1524.
- signals 1528 are provided to communications interface 1524 via a communications path (e.g., channel) 1526.
- This path 1526 carries signals 1528 and may be implemented using one or more of a wire or cable, fiber optics, telephone line, cellular link, RF link and/or other communications channels.
- The terms “computer program medium” and “computer usable medium” are used to refer generally to media such as the removable storage drive 1514, a hard disk installed in hard disk drive 1512, and signals 1528. These computer program products provide software to the computer system 1500. Aspects of the present disclosure are directed to such computer program products.
- Computer programs are also referred to as computer control logic.
- Computer programs may also be received via communications interface 1524. Such computer programs, when executed, enable the computer system 1500 to perform the features in accordance with aspects of the present disclosure, as discussed herein.
- the computer programs when executed, enable the processor 1504 to perform the features in accordance with aspects of the present disclosure.
- FIG. 16 illustrates a block diagram of various example system components, in accordance with an aspect of the present disclosure.
- FIG. 16 shows a communication system 1600 usable in accordance with aspects of the present disclosure.
- the communication system 1600 includes one or more accessors 1660, 1662 and one or more terminals 1642, 1666.
- data for use in accordance with aspects of the present disclosure is, for example, input and/or accessed by accessors 1660, 1662 via terminals 1642, 1666, such as personal computers (PCs), minicomputers, mainframe computers, microcomputers, telephonic devices, or wireless devices, such as personal digital assistants (“PDAs”) or hand-held wireless devices, coupled to a server 1643, such as a PC, minicomputer, mainframe computer, microcomputer, or other device having a processor and a repository for data and/or a connection to a repository for data, via, for example, a network 1644, such as the Internet or an intranet, and couplings 1645, 1646, 1664.
- the couplings 1645, 1646, 1664 include, for example, wired, wireless, and/or fiberoptic links.
- the method and system in accordance with aspects of the present disclosure operate in a stand-alone environment, such as on a single terminal.
- an example of an ECD 1404 that is configured to perform access control may analyze a biometric template of a requester 1450 to determine whether the requester 1450 is authorized to gain access to an entry point (not shown) associated with the ECD 1404.
- the ECD 1404 may include an optical sensor 1702, an illumination source 1704, a display 1706, a keypad 1708, and a scanner 1710.
- the optical sensor 1702 may be configured to capture still or moving images.
- the optical sensor 1702 may capture the fingerprints, the iris patterns, the facial features, the signature patterns, the shapes of the ears, the retinal patterns, the gait, and/or the hand geometry of the requester 1450.
- the optical sensor 1702 may be a broadband camera configured to detect electromagnetic radiation having wavelengths ranging from 200 nanometers (e.g., soft UV) to 2000 nanometers (e.g., near infra-red (NIR)).
- the optical sensor 1702 is configured to detect radiation between 700 to 900 nanometers.
- the optical sensor 1702 may include a motion sensor configured to detect people approaching the ECD 1404.
- the optical sensor 1702 may include a wide angle lens (e.g., a fisheye lens) used to provide both vertical area coverage to capture faces across the full range of human heights (as well as addressing Americans with Disabilities Act requirements) and to provide horizontal coverage of the complete area of an access point to address more than one person accessing the secure area on one authentication.
- the optical sensor 1702 may employ a high resolution (e.g., megapixel per square inch) charge coupled device (CCD) array.
- the high resolution array may provide the ability to identify faces at a greater distance from the sensor.
- the ECD 1404 may take advantage of the increased distance of identification to pre-identify requesters in queuing situations.
- the illumination source 1704 may emit electromagnetic radiation having wavelengths ranging from 200 nanometers to 2000 nanometers. In certain examples, the illumination source 1704 may emit non-visible radiation between 700 to 900 nanometers and/or 200-300 nanometers. The illumination source 1704 may emit radiation to illuminate bodily features and patterns of the requester 1450 used for biometric analysis (analyzing a biometric template to determine access rights).
- the emitted radiation may impinge on a portion of a body of the requester 1450, and reflect off of the portion of the body.
- the reflected radiation may be captured by the optical sensor 1702 for biometric analysis.
- the display 1706 may present useful information to the requester 1450.
- the display 1706 may show one or more images of a face 1730 of the requester 1450, captured by the optical sensor 1702, to assist the requester 1450 in aligning the face 1730 during biometric analysis.
- the display 1706 may notify the requester 1450 of a status of the entry point associated with the ECD 1404 (e.g., locked down, temporarily unavailable, normal operations, under maintenance).
- the display 1706 may display information such as time, date, weather, current location, etc.
- the keypad 1708 may allow the requester 1450 to enter numbers, symbols, and letters into the ECD 1404. In an example, the requester 1450 may enter a password in addition to the biometric analysis to gain access to the entry point.
- the scanner 1710 may be a radio frequency identification (RFID) scanner, a proximity card scanner (e.g., HID TM card scanner), a contact card scanner, or a magnetic card scanner.
- the scanner 1710 may send an interrogatory signal to a proximity card (not shown) having a coil and an integrated circuit with a programmable or non-programmable identification sequence.
- the interrogatory signal may be “absorbed” by the coil and may energize the integrated circuit.
- the energized integrated circuit sends a response signal including the identification sequence back to the scanner 1710 via the coil.
- the scanner 1710 analyzes the identification sequence to determine whether or not to grant access.
- the identification sequence may be one or more numbers, letters, symbols, and/or a combination thereof.
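- The interrogate/energize/respond cycle described above can be sketched as follows; the class names and the authorized identification sequences are hypothetical:

```python
AUTHORIZED_SEQUENCES = {"A1B2-0042", "C3D4-0777"}  # hypothetical identification sequences

class ProximityCard:
    """Toy model of a card whose coil absorbs the interrogatory signal
    and energizes an IC that replies with its identification sequence."""
    def __init__(self, identification_sequence):
        self.identification_sequence = identification_sequence

    def respond(self, interrogatory_signal):
        # The energized integrated circuit replies via the coil.
        return self.identification_sequence

def scan(card):
    """Sketch of the scanner 1710's interrogate/analyze cycle."""
    response = card.respond("interrogate")
    return response in AUTHORIZED_SEQUENCES  # grant or deny access

print(scan(ProximityCard("A1B2-0042")))  # True -> access granted
```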
- the requester 1450 may approach the ECD 1404 during operations.
- the optical sensor 1702 may detect the requester 1450.
- the illumination source 1704 may emit incident NIR radiation 1760 toward the requester 1450.
- the incident NIR radiation 1760 may impinge on the face 1730 of the requester 1450, and reflect off of the face 1730 of the requester 1450.
- the optical sensor 1702 may detect detected NIR radiation 1762 originating from the surface of the face 1730.
- the detected NIR radiation 1762 may include reflected incident NIR radiation 1760 and/or NIR radiation emitted from the requester 1450 due to thermal heating (i.e., black body radiation).
- the intensity and distribution of the detected NIR radiation 1762 may depend on the intensity and angle of the incident NIR radiation 1760, the contour of the face 1730, angle of detection by the optical sensor 1702, and other factors.
- the ECD 1404 may use the detected NIR radiation 1762 to construct a facial template (the “NIR sampled profile”) of the requester 1450.
- the ECD 1404 may compare the constructed NIR sampled profile with existing templates stored therein (details described below). If the ECD 1404 detects a match, the ECD 1404 may allow the requester 1450 access to the entry point (as described above).
- a NIR sampled profile generated via NIR radiation detection may be resistant to changes in ambient lighting. As ambient lighting fluctuates (e.g., changes in luminance, color, color temperature, lighting angle), a NIR sampled profile constructed using NIR radiation detection may remain sufficiently constant to prevent a false acceptance or a false rejection.
- the NIR sampled profile of the requester 1450 constructed via NIR radiation detection under a “bright” condition (e.g., 1000 lux) may be substantially identical to the NIR sampled profile constructed via NIR radiation detection under a “dark” condition (e.g., 100 lux).
- a NIR sampled profile generated via NIR radiation detection may improve the privacy of the owner.
- the storage device 1410 may store NIR sampled profiles of employees and associated confidential information (e.g., birthdates, email account passwords). If an unauthorized person gains access to the NIR sampled profiles and the associated confidential information, the unauthorized person may not be able to exploit the stolen confidential information because it may be difficult to identify the employees based on the NIR sampled profiles.
- Because the NIR sampled profiles are constructed using, for example, detected NIR radiation, they may be unrecognizable: the NIR sampled profile of a person may be drastically different from the visual image of the face of the same person.
- the requester 1450 may approach the ECD 1404 during operations.
- the optical sensor 1702 may detect the requester 1450.
- the illumination source 1704 may emit incident UV radiation 1760 (e.g., electromagnetic radiation having wavelengths between 315-390 nanometers) toward the requester 1450.
- the incident UV radiation 1760 may impinge on the face 1730 of the requester 1450, and reflect off of the face 1730 of the requester 1450.
- the incident UV radiation 1760 may penetrate the surface of the face 1730 and reflect off of subdermal features of the face 1730 (e.g., dermis, subcutaneous tissue, muscles, imperfections).
- the optical sensor 1702 may detect detected UV radiation 1762 originating from the surface and/or subdermal features of the face 1730.
- the detected UV radiation 1762 may include reflected incident UV radiation 1760.
- the intensity and distribution of the detected UV radiation 1762 may depend on the intensity and angle of the incident UV radiation 1760, the contour of the face 1730, angle of detection by the optical sensor 1702, and other factors.
- the ECD 1404 may use the detected UV radiation 1762 to construct a facial template (the “UV sampled profile”) of the requester 1450.
- the ECD 1404 may compare the constructed UV sampled profile with existing templates stored therein (details described below).
- a UV sampled profile generated via UV radiation detection may be resistant to changes in ambient lighting. As ambient lighting fluctuates (e.g., changes in luminance, color, color temperature, lighting angle), a UV sampled profile constructed using UV radiation detection may remain sufficiently constant to prevent a false acceptance or a false rejection.
- the UV sampled profile of the requester 1450 constructed via UV radiation detection under a “bright” condition (e.g., 1000 lux) may be substantially identical to the UV sampled profile constructed via UV radiation detection under a “dark” condition (e.g., 100 lux).
- a UV sampled profile generated via UV radiation detection may improve the privacy of the owner.
- the storage device 1410 may store UV sampled profiles of employees and associated confidential information (e.g., birthdates, email account passwords).
- Because the UV sampled profiles are constructed using, for example, detected UV radiation, they may be unrecognizable: the UV sampled profile of a person may be drastically different from the visual image of the face of the same person.
- a sampled profile may be a rendering of the dataset that visualizes the consistency of position within the three arrays. If the sampled profile is compromised, it may be more difficult to obtain biometric information used to identify the individual.
- the ECD 1404 may generate a visible-light sampled profile based on detected visible-light reflected from the face 1730 of the requester 1450.
- the requester 1450 may be asked to provide a password, a personal identification number (PIN), and/or a valid HID TM card to be used in conjunction with the constructed sampled profile to gain access to the entry point associated with the ECD 1404.
- the ECD 1404 may include a microphone (not shown in FIG. 17) to perform voice recognition.
- the display 1706 may show the face 1730 of the requester 1450 as imaged by the optical sensor 1702.
- the display 1706 may include alignment marks (not shown) to assist the requester 1450 in aligning the face 1730 with respect to the optical sensor 1702. This alignment process may minimize false acceptances and false rejections due to misalignment.
- the ECD 1404 may be a pocket-sized mobile device powered by rechargeable batteries. [197] Referring to FIG. 18, in some implementations, the ECD 1404 may include a processor 1802 having a communication module 1852 configured to communicate with the gateways 1402 and other ECDs 1404 as described in this disclosure.
- the communication module 1852 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof.
- the processor 1802 may also include a security module 1854 configured to encrypt and/or decrypt data.
- the security module 1854 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof.
- the processor 1802 further includes an algorithm module 1856 for constructing and comparing biometric templates as described throughout this disclosure. Alternatively, the communications with the Gateway may be facilitated by a processor in the control panel.
- the algorithm module 1856 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof.
- the processor 1802 may further include a parallel computation module 1858 for performing distributed processing.
- the parallel computation module 1858 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof.
- the processor 1802 may include one or more processors or cores, and may be implemented as a semiconductor processor, graphical processing unit, a field programmable gate array, a programmable logic device, a processing cluster, an application specific integrated circuit, or other suitable architectures.
- the ECD 1404 includes a memory 1804.
- the memory 1804 may be static or dynamic memory such as flash memory, random access memory, magnetic memory, or semiconductor memory.
- the memory 1804 may include external memory such as a cloud storage.
- the memory 1804 may include or store applications and/or computer executable code.
- the ECD 1404 further includes a modem 1808 for communicating with the gateways 1402 and other ECDs 1404, and may operate in cooperation with the communication module 1852.
- the ECD 1404 also includes a RAM 1806, which may be static or dynamic random access memory.
- the ECD 1404 may also include an Input/Output (I/O) device 1810 communicatively coupled to the display 1706, the optical sensor 1702, the keypad 1708, and the scanner 1710.
- the components within the ECD 1404 may be interconnected by an internal bus 1812a.
- the processor 1802, the memory 1804, the RAM 1806, and the internal bus 1812a may be disposed on a processing board 1820.
- Referring to FIG. 19, another example of the ECD 1404 may include a number of processing boards 1820 interconnected by an external bus 1812b. Each processing board 1820 may include the processor 1802, the memory 1804, the RAM 1806, and the internal bus 1812a.
- Data distributed among the processing boards 1820 may be distributed by a controller 1814 via the external bus 1812b.
- the ECD 1404 may download (from the gateways 1402) 240,000 biometric templates of potential requesters.
- the controller 1814 may distribute the 240,000 biometric templates evenly or unevenly among the processing boards 1820 such that each processing board 1820 may store, in the respective memory 1804, 30,000 biometric templates.
- each processing board 1820 may store the same or different number of biometric templates.
- the ECD 1404 may construct a sampled profile (e.g., UV or NIR) of the requester 1450 based on the detected radiation 1762.
- the controller 1814 may distribute copies of the constructed sampled profile to each of the processing boards 1820.
- the processing boards 1820 may simultaneously compare the duplicated sampled profiles with the locally stored biometric templates (e.g., 30,000 stored in each processing board 1820). While the current example of the ECD 1404 shown in FIG. 19 includes eight processing boards 1820, other numbers of processing boards 1820 may also be used.
- the ECD 1404 may include 2, 4, 6, 8, 12, 16, 32 or 64 processing boards 1820. [201] In an implementation, the ECD 1404 may rely on remote processing boards (not shown) to perform the distributed computing described above.
- the ECD 1404 may send the duplicated sampled profiles to the remote processing boards to jointly and simultaneously implement the algorithm (described below) for matching the duplicated sampled profiles (or the numerical representation of the duplicated sampled profiles) to known biometric templates.
- the processing boards may be within other ECDs 1404 within the network.
- the ECD 1404 may send the duplicated sampled profiles to a Beowulf cluster.
- the clustering design may employ inter-process and inter-processor protocols to share processing tasks of the same application between both processors and the processing cores within those processors.
- processor-intensive operations like facial verification may utilize multiple processors or multiple cores to complete the operation within an acceptable period of time. In some cases, large multi-core processors may be used for this type of operation.
- the operation may be spread across several advanced RISC machine (“ARM”) processors to accomplish the same performance as the single multi-core processor without the high hardware cost and potential for a single point of failure.
- the first gateway 1402a may download 600,000 biometric templates of potential requesters (e.g., employees and contractors) from the enterprise server 1406b.
- the first gateway 1402a may distribute 100,000 biometric templates to each of the ECD-e 1404e and the ECD-f 1404f, and 200,000 biometric templates to each of the second gateway 1402b and the third gateway 1402c.
- the second gateway 1402b may distribute 100,000 biometric templates to each of the ECD-c 1404c and the ECD-d 1404d
- the third gateway 1402c may distribute 100,000 biometric templates to each of the ECD-a 1404a and the ECD-b 1404b. Consequently, the 600,000 biometric templates downloaded from the enterprise server 1406b may be evenly distributed among the ECDs 1404 (i.e., 100,000 non-overlapping templates each).
- the ECD-b 1404b may duplicate the constructed sampled profile and distribute the duplicated sampled profiles to the ECD-a 1404a, ECD-c 1404c, ECD-d 1404d, ECD-e 1404e, and ECD-f 1404f (via one or more gateways 1402).
- the ECDs 1404 may compare the duplicated sampled profile of the first requester 1450a with the 100,000 templates stored locally to determine a match.
- the ECD-a 1404a, ECD-c 1404c, ECD-d 1404d, ECD-e 1404e, and ECD-f 1404f may send (via one or more gateways 1402) the result of the comparison back to the ECD-b 1404b.
- the ECD-b 1404b may gather the results and determine whether to grant access to the first requester 1450a.
- the biometric templates may be unevenly distributed among the ECDs 1404.
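- The distributed matching flow above (partition the templates, broadcast the duplicated sampled profile, compare in parallel, gather results) can be sketched as below, with thread workers standing in for processing boards or peer ECDs and a Euclidean distance standing in for the disclosure's comparison method:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def best_match(probe, templates):
    """Compare the duplicated sampled profile against one board's local
    templates; a smaller distance means a better match."""
    dists = np.linalg.norm(templates - probe, axis=1)
    i = int(np.argmin(dists))
    return i, float(dists[i])

# Toy stand-ins: six 'boards' each holding an even share of the templates,
# mirroring the even split of 600,000 templates across six ECDs.
rng = np.random.default_rng(0)
shards = [rng.normal(size=(1000, 128)) for _ in range(6)]
probe = shards[2][417]  # pretend the requester is enrolled on board 2

with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda t: best_match(probe, t), shards))

board, (idx, dist) = min(enumerate(results), key=lambda r: r[1][1])
print(board, idx, dist)  # the gathering ECD decides whether to grant access
```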
- the ECD 1404 may generate a sampled profile 2000 based on measuring the intensity of the detected radiation 1762 from the face 1730 of the requester 1450.
- the sampled profile 2000 may include a base matrix 2010 of one or more measurement points 2002.
- the measurement points 2002 may include a value indicative of the intensity of the detected radiation 1762 at a location of the particular measurement point.
- the measurement point-a 2002a may indicate an intensity scale value of 2 corresponding to a region 2004a (e.g., black hair) around the face 1730 of the requester 1450.
- the measurement point-b 2002b may indicate an intensity scale value of 50 that corresponds to a background color.
- the measurement point-c 2002c may indicate an intensity scale value of 21 that corresponds to a region 2004c (e.g., cheeks) on the face 1730 of the requester 1450.
- the intensity scale may range from 0 (no reflection) to 100 (maximum reflection able to be detected by the optical sensor 1702 of the ECD 1404).
- the intensity scale may measure an absolute intensity (e.g., brightness) of the measurement points 2002.
- the sampled profile 2000 may include measurement points 2002 for detected radiation 1762 of different wavelengths (e.g., UV, NIR, red, green, blue).
- while the sampled profile 2000 in FIG. 20 shows the base matrix 2010 of 10 x 12 measurement points 2002 across the face 1730 of the requester 1450, other measurement point densities may be possible for the base matrix 2010.
- the ECD 1404 may generate another sampled profile using 100 x 100 measurement points across the face 1730 of the requester 1450.
- the ECD 1404 may generate a sampled profile using 500 x 500 measurement points.
- the ECD 1404 may remove measurement points 2002 indicating the background.
- the ECD 1404 (or the processor 1802) may apply the LBP operation to one or more of the measurement points 2002 within the base matrix 2010. For example, the ECD 1404 may perform LBP on the measurement point-d 2002d, which includes an intensity value of 41.
- the LBP string for the measurement point-d 2002d is 01110100, based on the span of 1 (i.e., performing LBP using the immediate neighbors, having a distance of 1 cell, of the measurement point-d 2002d).
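- A minimal sketch of the LBP string computation follows; the clockwise bit ordering starting at the top-left neighbor and the >= threshold convention are assumptions, since the disclosure fixes neither:

```python
import numpy as np

def lbp_string(matrix, row, col, span=1):
    """Compute an 8-bit local binary pattern (LBP) for one measurement point.

    Each of the 8 neighbors at the given span is compared against the
    center value: 1 if the neighbor is >= the center, else 0. The bit
    ordering here is an illustrative assumption.
    """
    center = matrix[row, col]
    # Offsets of the 8 neighbors, clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    bits = ""
    for dr, dc in offsets:
        r, c = row + dr * span, col + dc * span
        bits += "1" if matrix[r, c] >= center else "0"
    return bits

# Toy base matrix of intensity values on the 0-100 scale.
base = np.array([[10, 20, 30],
                 [40, 41, 50],
                 [60, 70, 80]])
print(lbp_string(base, 1, 1))  # '00011110' for this toy matrix
```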
- the ECD 1404 may track the numbers of LBP strings for the remaining measurement points 2002r as shown in a table 2100.
- the ECD 1404 may compute 24 of the measurement points 2002 as having the LBP string of 00000000, 3 having the LBP string of 00000001, 11 having the LBP string of 00000010, and 0 having the LBP string of 00000011, etc., as indicated in a table 2100.
- the table 2100 may include 256 entries for the possible strings (i.e., 8 bits). In some examples, the table 2100 may include 255 entries by eliminating the entry for the “all white” string (11111111) or the “all black” string (00000000).
- the data in the table 2100 may be plotted as a histogram indicating the number of occurrences (e.g., measurement points 2002) for the possible LBP strings.
- the ECD 1404 may determine one or more unique features.
- the one or more unique features may be non-zero LBP strings occurring fewer than a threshold frequency (e.g., 5), such as LBP strings 00000001, 01110100, and 11111100.
- the one or more unique features may be the non-zero LBP strings occurring the least, such as the LBP string 01110100.
- the ECD 1404 may track the locations of the measurement points 2002 having the one or more unique features.
- the ECD 1404 may track the coordinates of the LBP strings.
- the one or more unique features may be the non-zero LBP strings occurring the most.
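- The histogram-and-rarity selection described above might be sketched as follows, treating "unique features" as non-zero LBP strings occurring fewer than a threshold number of times and recording their coordinates:

```python
from collections import Counter

def unique_features(lbp_strings_by_location, threshold=5):
    """Build the table-2100-style histogram, then pick out 'unique
    features': non-zero LBP strings occurring fewer than `threshold`
    times, along with where they occur on the base matrix."""
    histogram = Counter(lbp_strings_by_location.values())
    rare = {s for s, n in histogram.items()
            if 0 < n < threshold and s != "00000000"}
    locations = {s: [loc for loc, v in lbp_strings_by_location.items() if v == s]
                 for s in rare}
    return histogram, locations

lbp = {(0, 0): "00000000", (0, 1): "01110100", (1, 0): "00000010",
       (1, 1): "01110100", (2, 0): "11111100"}
hist, locs = unique_features(lbp)
print(locs)  # rare strings mapped to their coordinates on the base matrix
```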
- the ECD 1404 may divide the base matrix 2010 into one or more sub-matrices. Each sub-matrix may include a macroblock of measurement points.
- the ECD 1404 may divide the base matrix 2010 into a 5 x 5 sub-matrix 2202, a 3 x 3 sub-matrix 2204, and a 6 x 6 sub-matrix 2206.
- the ECD 1404 may calculate the LBP strings of the measurement points 2002 within the 5 x 5 sub-matrix 2202 using a span of 3 (i.e., calculating LBP strings using neighbors 3 cells away). Similarly, when computing the LBP strings for the measurement points 2002 within the 3 x 3 sub-matrix 2204, the ECD 1404 may calculate the LBP strings of the measurement points 2002 within the 3 x 3 sub-matrix 2204 using a span of 2.
- the ECD 1404 may calculate the LBP strings of the measurement points 2002 within the 6 x 6 sub-matrix 2206 using a span of 4.
- Other sizes for the base matrix, the sub-matrices, and spans are possible, as determined by the ECD 1404.
- a 100 x 100 base matrix may be divided into sub-matrices of six different sizes: 10 x 10, 15 x 15, 20 x 20, 30 x 30, 35 x 35 and 40 x 40.
- the LBP string may be calculated with a different span around the measurement point being calculated to characterize the texture/slope of the area surrounding the cell at different coverage areas.
- the span in pixels from the measurement point 2002 being calculated to each of the neighboring cells may be 3, 9, 15, 21, 27, and 33 for the respective sub-matrix sizes.
- a 200 x 200 base matrix may include sub-matrices having sizes of 10 x 10, 20 x 20, 35 x 35, 50 x 50, 65 x 65, 80 x 80, and 100 x 100.
- the span in pixels from the measurement point 2002 being calculated to each of the neighboring cells may be, for example, 3, 5, 7, 10, 25, and 40 for the respective sub-matrix sizes.
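- The size-to-span schedule for the 100 x 100 case can be tabulated and the macroblock positions enumerated as in the sketch below; a non-overlapping tiling stride is assumed here for simplicity, while the disclosure notes sub-matrices may also overlap:

```python
import numpy as np

# Sub-matrix sizes paired with LBP spans for a 100 x 100 base matrix,
# per the schedule described above.
SCALES_100 = [(10, 3), (15, 9), (20, 15), (30, 21), (35, 27), (40, 33)]

def macroblocks(base, size, stride=None):
    """Yield (top_left, block) for every macroblock of `size` in the base
    matrix. A stride equal to the block size (non-overlapping tiling) is
    an assumption for this sketch."""
    stride = stride or size
    n = base.shape[0]
    for r in range(0, n - size + 1, stride):
        for c in range(0, n - size + 1, stride):
            yield (r, c), base[r:r + size, c:c + size]

base = np.zeros((100, 100))
for size, span in SCALES_100:
    count = sum(1 for _ in macroblocks(base, size))
    print(f"{size} x {size} blocks (LBP span {span}): {count} positions")
```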
- the sub-matrices may overlap in some instances.
- the ECD 1404 may apply local ternary pattern (LTP) computations onto the measurement points 2002.
- the ECD 1404 may convert the list of binary features derived from the sampled profile into a data sequence that is extensible and capable of being tailored to the unique traits of each requester 1450 while still providing a methodology of object comparison and matching/verification.
- a binary feature may contain three characteristics: the size of the sub-matrix/macroblock, the location of the sub-matrix/macroblock on the sampled profile, and the Uniform Local Binary Pattern (ULBP) assigned to the binary feature.
- a macroblock may be a “sub-image” of the image of the face 1730 of the requester 1450.
- a macroblock of size 10 is a 10 x 10 (i.e., 100 measurement points 2002) sub-image of the image of the face 1730.
- the location of the macroblock on the image may be defined by the position of the first pixel in the macroblock, located in the top, left corner of the macroblock. The position of this pixel is defined by two values: its distance from the left boundary of the face image (x-dimension) and its distance from the top boundary of the face image (y-dimension).
- the final characteristic is the ULBP, which is a mathematical methodology for establishing a scalar value for the edge and texture characteristics of a pixel starting with the top left of the face 1730. For each pixel within the macroblock, a ULBP calculation is performed.
- a table 2300 may illustrate an example of the results of the sequence conversion process for macroblocks having sizes of 10 x 10, 35 x 35, 20 x 20, 30 x 30, 15 x 15, 40 x 40, and 15 x 15. Three arrays may be used to generate a biometric template.
- The first array may hold the scalar value for each binary feature, normalized by macroblock size.
- The second array may be sorted by magnitude so that binary feature 2000, which in this example has the highest value, goes to the top of the second array in position 1.
- The third array records the change in the sort position of a binary feature rather than its scalar value. If the position within the array for binary feature 2000 remains the same, the results will reflect a value of 0, representing the least possible change in position and therefore the highest value for significance in the authentication method.
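- A sketch of the three-array construction is given below; the normalization by squared block size and the 1-based rank convention are illustrative assumptions:

```python
import numpy as np

def build_sequence(feature_ids, scalars, block_sizes):
    """Sketch of the three-array template construction described above.

    first:  scalar value per binary feature, normalized by macroblock size;
    second: feature ids sorted by descending normalized scalar;
    third:  for each feature id, its position (rank) in the sorted order.
    Naming and normalization are illustrative; the disclosure defines the
    arrays only at the level sketched here.
    """
    first = np.asarray(scalars, dtype=float) / (np.asarray(block_sizes) ** 2)
    order = np.argsort(-first)                 # descending sort by magnitude
    second = [feature_ids[i] for i in order]   # most unique features on top
    third = {fid: rank + 1 for rank, fid in enumerate(second)}
    return first, second, third

ids = [1998, 1999, 2000, 2001]
first, second, third = build_sequence(ids, [90, 380, 990, 120], [10, 30, 10, 20])
print(second[0], third[2000])  # binary feature 2000 sorts to position 1
```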
- a method 2400 of converting a sequence may identify the most unique features used for identification.
- Obtain the sampled profile of the face. For example, the processor 1802 and/or the communication module 1852 may obtain the sampled profile of the face 1730.
- Assign a unique index to each macroblock/ULBP combination.
- For example, the algorithm module 1856 may assign a unique index (e.g., 4) to each macroblock/ULBP combination (e.g., a 30 x 30 macroblock) and analyze based upon the uniqueness of traits within the face rather than on scalar values.
- Construct a first array of scalar values for each macroblock/ULBP combination in the master schema referenced by the schema. For example, the algorithm module 1856 may construct a first array of scalar values for each macroblock/ULBP combination in the master schema referenced by the schema. The first array includes the number of pixels that fall within that ULBP within that macroblock.
- the ECD 1404 may perform a normal LBP calculation (the standard LBP formula takes a pixel and the 8 pixels around it, with each neighboring pixel contributing a different binary value) to get a histogram over the values 0-255; if the result is the highest value, the result receives a value of 1, and if some other LBP value is higher, that pixel receives a value of 0.
- Weigh each scalar value by the size of the macroblock. For example, the algorithm module 1856 may weigh each scalar value (e.g., 380) by the size of the macroblock (e.g., 30 x 30).
- the algorithm module 1856 may convert the sequence into a second array with a scalar value for each unique index that is the position of the primary index from the beginning of the sequence array. The sequence array currently has 2165 elements but may be extensible depending upon the data for each individual.
- deep learning may be performed on the full data set to establish the minimum data set required to accurately perform facial recognition.
- the result of the deep learning may be a reduced data set of binary features that is a fraction of the total data set (e.g., 1%, 2%, 5%, 10%, 20%, or 50%).
- Each element of data in this data set may be defined by a sub-matrix of a specific size and location and a single value within the full LBP base matrix.
- Each data element may be assigned a unique numeric identifier. For the enrollment process, this static data set is used for each requester. As each requester continues to verify their faces, the verification data may be collected and through deep learning a new set of binary features may be created for each requester. The introduction of the sequence to the verification algorithm may allow the introduction of this individualized set of binary features and still allow for comparison between different sets. The matching of two sequences may be achieved by calculating the difference (e.g., the difference in position of a data set) of the two arrays. The greater the sum the lower the quality of the match. A perfect object match may have a sum of substantially zero. [221] Still referring to FIG. 25, a table 2500 illustrates an example of the verification calculations.
- the first step in creating the sequence may be the sorting of the scalar array by the magnitude of the scalar value.
- the sort may be performed in descending order.
- the binary feature unique identification with the largest value is at the top of the sort.
- the binary features providing the most uniqueness are at the top of the array.
- the new sort position is transferred to that third array and the location on the new array is based upon the binary feature unique identification.
- For example, if the binary feature with the unique identifier 2000 sorts to position 1, then in the third array the 2000 position will receive a value of 1.
- This array may be a unique sequence for the object and is the new basis for object verification and/or matching.
- the sequence matching algorithm determines the quality of the match by the distance of each binary feature unique identification from the beginning of the sequence.
- the distance is the index value of the binary feature unique identification within the sequence array where the index values are sequentially assigned.
- another array is created that contains the distance of each binary feature from the beginning of the sequence.
- For example, if the binary feature with the unique identification of one is in position ten of the sorted sequence, then position one of the new array will have a value of ten.
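- Under the description above, matching two sequences reduces to summing the per-feature position differences, as in this sketch:

```python
def match_score(positions_a, positions_b):
    """Sum of per-feature position differences between two sequences:
    the greater the sum, the lower the quality of the match; a perfect
    match sums to (substantially) zero."""
    common = positions_a.keys() & positions_b.keys()
    return sum(abs(positions_a[f] - positions_b[f]) for f in common)

enrolled = {2000: 1, 1998: 2, 1999: 3, 2001: 4}   # position-by-feature arrays
probe    = {2000: 1, 1998: 3, 1999: 2, 2001: 4}   # features 1998/1999 swapped

print(match_score(enrolled, enrolled))  # 0 -> perfect match
print(match_score(enrolled, probe))     # 2 -> small positional drift
```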
- The sequence array is the second array, and the scalar array is the first array (normalized for block size). "Extensible" may indicate that the fixed list of binary features may now be expanded and contracted as required to optimize the process of matching objects while also improving the matching process integrity.
- the sequence array may be dynamically adjustable based on two separate deep learning functions, depending upon the environment of the edge device and the specific individual face being authorized. The first function may seek to continuously optimize the default list of binary features applied to some or all object matching attempts prior to the development of an individual list of binary features.
- the second function may aggregate image data on individual objects and over time develop an optimized list of binary features for the face. Once the individualized list of binary features exceeds the matching performance of the default list of binary features through parallel object matching trials on incoming object image data, the default list of binary features may be replaced with the individualized list of binary features for that object in the live object matching functions.
- the deep learning engine for both processes may receive sample data in parallel with the live object/face matching functions. As image data is received, two lists of binary features may be applied to it. The first list may be either the default or the individualized list of reduced binary features (e.g., the three most important blocks out of 100 in a 10 x 10 macroblock). The output of the application of this list may be used in the live matching process.
- the second list may be the full list of binary features, which comprises all possible macroblock sizes (e.g., all 100 positions within the 10 x 10 macroblocks), applied to all possible image positions for each ULBP.
- the complete list of binary features may be, for example, twenty times the size of the default or individualized list of binary features.
- the deep learning engine may receive a sequence derived from the complete list of binary features for each object image received into the system. In the case of the default list of binary features deep learning process, the images may be categorized into the training and validating sets. An independent default test set may be created including objects different than the object being learned. In the case of the individualized list of binary features, each object identified by the default or existing individualized list of binary features will be placed in the training, validation and test sets.
- All other objects' lists may be placed in the verification and test sets.
- a deep learning training session for both the default object list of binary features and for each individual list of binary features may be executed for each new live entry from the object detection system.
- the face matrix of data is transferred to the facial recognition algorithm. If the deep learning algorithm has developed a specific list of binary features, then an aggregate list of these binary features for all users in the system may be created and used to generate the sequence. Otherwise, the sequence may be generated using the default set of binary features. The sequence may be transferred to the verification algorithm where the matching process may determine the identity of the sequence.
- a full list of binary features (all sub-matrices with full LBP histograms) may be generated and transferred to that identity's data set in the deep learning algorithm.
- the deep learning algorithm may process the available data set for that identity (the set may continue to grow with each verification) and generate a revised optimum data set for that identity.
- the optimized data set may be converted to a sequence and used in the next verification process for that user.
- the aggregate set of binary features may be updated if necessary for converting the next face matrix. While the verification process will occur within the local cluster of devices, the deep learning algorithm may occur within both the local and global clusters as a background task.
- the respective list may be updated in the live verification/matching process.
- the result may be a matching system capable of automatically adjusting to both the overall object population and the specific, unique traits of each object.
- the evolution of the object data will allow for large-scale object matching solutions in excess of 100,000 objects capable of the same precision as a small solution (<1,000 objects).
- an example of the deep learning process 2600 may rely on feedback loops and machine learning to refine the identification process.
- Obtain an image matrix. For example, the processor 1802 and/or the communication module 1852 may obtain the sampled profile of the face 1730.
- Determine if the face specific macroblock/ULBP list is available. For example, the algorithm module 1856 may determine if the face specific macroblock/ULBP list is available. [231] At block 2606, if the custom list of binary features is not available, convert the image to the face generic detection list of binary features. For example, the algorithm module 1856 may convert the image to the face generic/default detection list of binary features. [232] At block 2608, obtain the face detection default sequence. For example, the algorithm module 1856 may obtain the face detection default sequence. [233] At block 2610, if the custom list of binary features is available, convert the image to the face detection specific list of binary features.
- the algorithm module 1856 may convert the image to the face detection specific list of binary features if the custom list of binary features is available (the custom list could be the original 2165 or could be all new ones based upon that individual, e.g., 10 additional 10 x 10s that are more distinctive for that particular individual). [234] At block 2612, perform face detection. For example, the algorithm module 1856 may perform face detection. [235] At block 2614, perform eye detection and location. For example, the algorithm module 1856 may perform eye detection and location. [236] At block 2616, convert the image to the full list of binary features (not just 57, but all 100 of the 10 x 10 macroblocks, including all values). For example, the algorithm module 1856 may convert the image to the full list of binary features.
- TensorFlow is a deep learning engine framework provided by Google TM . The equations and algorithms for TensorFlow are well known in the art.
- Develop the face detection list of binary features using TensorFlow deep learning. For example, the algorithm module 1856 may develop the face detection list of binary features using TensorFlow deep learning.
- Feed back the face detection refined sequence. For example, the algorithm module 1856 may feed back the face detection refined sequence.
- the ECD 1404 may take the entire feature set, collect all data for the face, and derive a new refined set that replaces the default set. The distance calculation between the positions of a binary feature within two sets may be used as the metric. Every feature would be in the same position every time.
- the example of the deep learning process 2600 shown in FIG. 26 may rely on feedback loops and machine learning to refine the identification process. Rather than centralizing the deep learning data, which is subject to hacking if stored in a central server (a single TensorFlow application operating on a single central server), particularly on the cloud, the present disclosure may use TensorFlow in a distributed architecture that evaluates at each reader, so that data is privacy protected on all individual devices rather than stored in one central server.
- Perform eye detection and location. For example, the algorithm module 1856 may perform eye detection and location.
- the algorithm module 1856 may obtain the face detection specific list of binary features if the macroblock is available.
- Obtain the facial recognition default list of binary features. For example, the algorithm module 1856 may convert the image to the face generic detection list of binary features if the specific list of binary features for that individual is not available.
- Determine if the face specific list of binary features is available. For example, the algorithm module 1856 may determine if the face specific list of binary features is available.
- The custom list of binary features may include the original 2165 or could be all new ones that are based upon that individual (e.g., 10 additional 10 x 10s that are more distinctive for that particular individual).
- the algorithm module 1856 may convert the image to the face generic list of binary features if the face specific list of binary features is not available.
- Convert the image to the face specific list of binary features if the face specific list of binary features is available.
- Convert the list of binary features to sequences. For example, the algorithm module 1856 may convert the list of binary features to sequences.
- Perform verification. For example, the algorithm module 1856 may perform verification.
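- The branch between the face specific and the generic/default lists, followed by conversion to a sequence and verification, can be sketched as pure control flow; every helper below is a hypothetical stub standing in for the modules described above:

```python
def verify(image, custom_features=None):
    """Control-flow sketch of the pipeline above: use the face specific
    list of binary features when one exists, otherwise fall back to the
    default list; then convert to a sequence and verify."""
    features = custom_features if custom_features else default_feature_list()
    face = detect_face(image)
    locate_eyes(face)                      # eye detection and location
    sequence = to_sequence(convert(face, features))
    return verify_sequence(sequence)

# Hypothetical stubs standing in for the modules in the disclosure.
def default_feature_list(): return ["default"]
def detect_face(image): return image
def locate_eyes(face): return (0, 0), (1, 0)
def convert(face, features): return [(f, 0) for f in features]
def to_sequence(binary_features): return list(range(len(binary_features)))
def verify_sequence(seq): return sum(seq) == 0  # placeholder match test

print(verify("image"))
```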
- the algorithm module 1856 may convert the image to the full list of binary features.
- the algorithm module 1856 may develop the face specific list of binary features using TensorFlow deep learning (not just 57 as is typical with LBP, but all 100 of the 10 x 10 macroblocks, including all values).
- TensorFlow is a deep learning framework provided by Google™. The equations and algorithms underlying TensorFlow are well known in the art.
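As a rough, hedged illustration of how such on-device refinement could look in TensorFlow, the sketch below trains a small Keras classifier on binary-feature vectors captured at a single reader and reads the first layer's weights as a per-feature saliency estimate for selecting the most distinctive macroblocks. The network shape, sample counts, and every variable name here are assumptions for illustration, not the disclosure's specified implementation.

```python
# Minimal sketch: learn which of the 2165 macroblock features are most
# distinctive for one individual, entirely on the local device (ECD).
import numpy as np
import tensorflow as tf

N_FEATURES = 2165  # one weighted value per macroblock, as described above

# Hypothetical training data: feature vectors for the enrolled person vs. others.
enrolled_vectors = np.random.rand(50, N_FEATURES).astype("float32")
impostor_vectors = np.random.rand(50, N_FEATURES).astype("float32")
x = np.concatenate([enrolled_vectors, impostor_vectors])
y = np.concatenate([np.ones(50), np.zeros(50)]).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(N_FEATURES,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x, y, epochs=10, batch_size=16, verbose=0)  # trains locally, no cloud

# Read the first layer's weight magnitudes as a saliency estimate and keep
# the 100 most distinctive macroblocks as the face specific feature list.
saliency = np.abs(model.layers[0].get_weights()[0]).sum(axis=1)
face_specific = np.argsort(saliency)[::-1][:100]
```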
- the gateway 1402 and/or the ECD 1404 may include proactive algorithms to identify and reduce “consistency-collisions” during the verification process.
- a “consistency-collision” may occur when one or more requesters have facial data that causes a mistake (i.e., a false acceptance or a false rejection) in the verification.
- the gateway 1402 and/or the ECD 1404 may seek out potential ‘consistency-collisions’ proactively using two methods of the present disclosure described below.
- the ‘Layered Reinforcement’ algorithm may allow the requester to create a binary tree structure for the facial data.
- the binary data may create a hierarchy of data points (e.g., LBP strings) based on the uniqueness of the data points. The greater the uniqueness, the higher up on the tree. Every verification transaction may result in the transfer of the binary facial data from the ECD 1404 to the gateway 1402.
- when the gateway 1402 receives the binary facial data, it may begin checking the uniqueness of the new data's binary tree structure against the stored data of the other requesters 1450. When it identifies a potential collision, the gateway 1402 may notify the ECD 1404 to escalate verification transactions between the two identified requesters 1450 to the gateway 1402. When the gateway 1402 receives an escalated transaction, it may perform two advanced algorithms (i.e., the Proactive Collision Identification Algorithm and the Time-Domain Trending Algorithm) on the data. The binary tree may take the most distinctive features and establish a hierarchy from most to least distinctive. The prioritized tree may reduce the analysis to, for example, 20 features rather than 2165, saving time and computing/processing.
- the ECD 1404 may determine whether this is or is not the person without going through all 2165 features. Thresholds may be empirical and derived from testing.
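A minimal sketch of this early-exit comparison, assuming a precomputed priority ordering of features by distinctiveness; the function name, tolerance, and mismatch budget below are hypothetical, since the disclosure says real thresholds would be empirical:

```python
# Check only the most distinctive features, in priority order, and stop as
# soon as the mismatch budget is exceeded instead of scanning all 2165.
from typing import Sequence

def prioritized_match(probe: Sequence[float],
                      enrolled: Sequence[float],
                      priority: Sequence[int],
                      top_k: int = 20,
                      per_feature_tol: float = 0.05,
                      max_mismatches: int = 3) -> bool:
    mismatches = 0
    for idx in priority[:top_k]:
        if abs(probe[idx] - enrolled[idx]) > per_feature_tol:
            mismatches += 1
            if mismatches > max_mismatches:
                return False  # reject early
    return True
```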
- the first advanced algorithm, the Proactive Collision Identification Algorithm, may take the binary tree data and analyze the facial data based on its location in the binary tree. The binary tree data may be weighted. If the weighting of the binary data does not yield a sufficient differentiation of the data, the gateway 1402 may extend the verification process to find adequate differentiation data.
- the facial characteristics of requesters may continuously change.
- as these characteristics change, the biometric data of a face, such as the face 1730 of the requester 1450, may no longer match the enrolled data, and the identification of the face may ultimately result in false rejections, requiring the re-enrollment of the face 1730.
- dynamic adjustments to the biometric data may be continuously applied, and algorithms may determine that a proposed change is not the introduction of a different face into the biometric data, which would result in a false acceptance on the face 1730. With every iteration of the biometric data, the algorithm must determine the amount of biometric data to retain to ensure the next successful identification of the requester 1450 and the amount to change to ensure the long-term identification of the requester 1450.
- the determination may be made by a combination of a time-domain analysis, in which the changes are regressed to ensure linearity, and a facial regional analysis to determine whether the area of change is rational.
- the use of this dynamic data capability leads to the second algorithm, the Time-Domain Trending Algorithm.
- the gateway 1402 may maintain the history of the binary data of the requester 1450 and perform a time-domain analysis of the data to assess which features are changing when the history of the binary data of the requester 1450 is compared with the face 1730, and the speed with which these features are changing.
- the gateway 1402 may track the evolution of the face and use this data to extrapolate and reinforce the unique differences identified in the face 1730 of the requester 1450, to establish and later emphasize core or baseline features, and to assess the rate of change in these core features for the face 1730. The gateway 1402 may then determine, using time-domain analysis, whether these changes are taking place over a time period that suggests natural changes in the face 1730, artificial changes in the face 1730 warranting further analysis, or a mismatch between the history of binary data of the requester 1450 and the face 1730. The gateway 1402 may identify the unique differences and verify that the differences identified in the verification request are consistent with the trending over time.
- the gateway 1402 may factor in lifestyle and daily routines of the requester 1450 in the Time-Domain Trending Algorithm. For example, if the requester 1450 enjoys outdoor activities, the Algorithm may factor in increased tanning during the summer season.
- the Proactive Collision Identification Algorithm combined with the Time-Domain Trending Algorithm allows the method of this embodiment of the invention to maintain its high performance in a large population ‘1:N’ solution.
- the profiles may be refined by additional replacement profiles. For example, a first-in first-out scheme may be implemented, where a new replacement profile may replace the oldest profile.
- the profiles may be divided into a first group of “fast” learners and a second group of “slow” learners.
- the profiles in the “fast” group may be replaced after every use, for example.
- the profiles in the “slow” group may be replaced every day, every week, every month, etc.
- history may be maintained without stale profiles.
- some profiles may be locked after replacement. For example, if a requester has 20 profiles (e.g., profile #1, profile #2, ... profile #20), a replacement profile, such as profile #21, may replace profile #5.
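A minimal sketch of this replacement scheme, assuming a simple split into fast and slow groups; the class shape, group sizes, and lock set are illustrative, not taken from the disclosure:

```python
# FIFO profile pool with "fast" learners (replaced after every use) and
# "slow" learners (replaced on a fixed schedule); locked profiles are exempt.
from collections import deque

class ProfilePool:
    def __init__(self, fast_size: int = 5, slow_size: int = 15):
        self.fast = deque(maxlen=fast_size)   # oldest entry evicted automatically
        self.slow = deque(maxlen=slow_size)
        self.locked = set()                   # profile ids exempt from replacement

    def add_fast(self, profile):
        self.fast.append(profile)             # called after every use

    def add_slow(self, profile):
        self.slow.append(profile)             # called e.g. daily/weekly/monthly
```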
- the Time-Domain Trending Algorithm may reduce the change over time in the biometric data into one or more equations that characterize that change (e.g., using curve-fitting algorithms). The linearity of the one or more equations over time may determine the integrity of the changes. If a particular area of the change is represented as a discrete function, the change may be flagged as a potential threat (e.g., disguise, incorrect match). The changes may also be evaluated based on the physical location of the pixel box on the face.
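One hedged way to picture this curve fitting: regress each feature's history against time and flag features whose history a straight line explains poorly, since a step-like change suggests a disguise or an incorrect match. The R² threshold and array shapes below are assumptions:

```python
# Fit a line per feature over its verification history and flag poor fits.
import numpy as np

def trend_flags(history: np.ndarray, r2_threshold: float = 0.6) -> np.ndarray:
    """history: (n_samples, n_features), one feature value per verification."""
    t = np.arange(history.shape[0])
    flags = np.zeros(history.shape[1], dtype=bool)
    for j in range(history.shape[1]):
        y = history[:, j]
        slope, intercept = np.polyfit(t, y, 1)          # linear regression
        resid = y - (slope * t + intercept)
        ss_res = float(np.sum(resid ** 2))
        ss_tot = float(np.sum((y - y.mean()) ** 2))
        r2 = 1.0 - ss_res / ss_tot if ss_tot > 0 else 1.0
        flags[j] = r2 < r2_threshold                    # non-linear change
    return flags
```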
- a method 2800 of constructing a biometric template may be performed by the ECD 1404.
- the method 2800 may perform an enrollment process.
- the enrollment process may include the ECD 1404 capturing a plurality of images (e.g., 5, 10, 15, 20, 30, 50) of the face 1730.
- the ECD 1404 may convert the plurality of images to biometric data as described above.
- emitting an incident non-visible light. For example, the illumination source 1704 may emit an incident non-visible light.
- the non-visible light may include near IR or UV. In some implementations, the illumination source 1704 may emit a visible light.
- detecting a detected non-visible light, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light. For example, the optical sensor 1702 may detect a detected non-visible light, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light.
- the optical sensor 1702 may detect IR light reflected off of the face 1730 of the requester 1450 and/or IR light radiated due to the heat of the requester 1450.
- generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light. For example, the algorithm module 1856 may generate a sampled profile, such as the sampled profile 2000, including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light.
- identifying one or more macroblocks, each including a subset of the plurality of sampling points. For example, the algorithm module 1856 may identify one or more macroblocks each including a subset of the plurality of sampling points.
- the one or more macroblocks may be a 10 x 10 macroblock, a 15 x 15 macroblock, a 20 x 20 macroblock, a 30 x 30 macroblock, a 35 x 35 macroblock, or a 40 x 40 macroblock. Other sizes are possible.
- the ECD 1404 may identify 2165 macroblocks having 2165 associated dimensions.
- selecting a local pattern value. For example, the algorithm module 1856 may select a local binary pattern value of 20. In other examples, the algorithm module 1856 may select a local ternary pattern value.
- calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks. For example, the algorithm module 1856 may calculate a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks.
- generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks.
- the algorithm module 1856 may generate a first array including the weighted values shown in the table 2300 (i.e., [0.30 0.54 0.15 0.42 0.62 0.46 0.53]).
- assigning a unique index to each of the plurality of weighted values. For example, the algorithm module 1856 may assign unique indices (i.e., 1, 2, 3, 4, 5, 6, and 7) to each of the plurality of weighted values shown in the table 2300.
- the unique index 1 is assigned to the weighted value of 0.30, 2 to 0.54, 3 to 0.15, and so forth.
- generating a second array of the unique index by ranking the plurality of weighted values.
- the algorithm module 1856 may generate a second array of the unique index by ranking the plurality of weighted values, such as the sequence 5, 2, 7, 6, 4, 1, 3 in the table 2300.
- the second array/sequence may indicate a ranking of the weighted values from the highest to the lowest.
- the first number in the sequence is 5 because the weighted value (i.e., 0.62) associated with the unique index of 5 is the highest among the elements of the first array.
- generating a third array including a plurality of ranking distances. For example, the algorithm module 1856 may generate a third array including a plurality of ranking distances, such as the stored array [5 1 6 4 0 3 2] in the table 2300.
- a ranking distance may indicate the numerical difference between the rank of the highest weighted value (e.g., 0.62, rank 1) and the rank of the current weighted value (e.g., 0.30, rank 6). Therefore, the ranking distance between 0.62 and 0.30 may be 5 (i.e., 6 - 1).
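The three arrays from the table 2300 can be reproduced with a short sketch of the ranking logic described above; the variable names are chosen here for illustration:

```python
# First array -> second array (indices ranked by weight) -> third array
# (each index's distance, in ranks, from the top-ranked weight).
weighted = [0.30, 0.54, 0.15, 0.42, 0.62, 0.46, 0.53]   # first array
indices = list(range(1, len(weighted) + 1))             # unique indices 1..7

sequence = [i for _, i in sorted(zip(weighted, indices), reverse=True)]
assert sequence == [5, 2, 7, 6, 4, 1, 3]                # second array

rank_of = {idx: rank for rank, idx in enumerate(sequence)}  # rank 0 = highest
distances = [rank_of[i] for i in indices]
assert distances == [5, 1, 6, 4, 0, 3, 2]               # third (stored) array
```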
- the biometric data based on the sampled profile (the “requestor biometric data”) may be compared to the biometric data of the plurality of images captured during the enrollment process (the “enrollment biometric data”).
- if the requestor biometric data matches the enrollment biometric data within a threshold percentage (e.g., 20, 30, 40, 50, 60, 70, or 80), the ECD 1404 and/or one of the gateways 1402 may determine that the requestor biometric data is a positive match and the verification is successful.
- the ECD 1404 and/or one of the gateways 1402 may adjust the enrollment biometric data over time to accommodate any changes to the face 1730 due to, for example, sun tan, aging, injuries, mood change, weight change, facial hair change, cosmetics usage, accessories, or other causes.
- the ECD 1404 and/or one of the gateways 1402 may adjust a portion (e.g., 20%, 30%, 40%, 50%) of the enrollment biometric data.
- the ECD 1404 and/or one of the gateways 1402 may obtain 20 sampled profiles of a person (e.g., IR or UV images of a person). Each profile of the 20 sampled profiles may be compared to the other profiles. Distance data may be calculated between each measurement point (or local binary or ternary pattern) of a profile and the corresponding measurement point (or local binary or ternary pattern) of the other profiles. The profiles with the lower standard deviations (e.g., lowest 10) from the mean of each calculated measurement point (or local binary or ternary pattern) may be kept, and the profiles with the higher standard deviations (e.g., highest 10) may be replaceable.
- the ECD 1404 and/or one of the gateways 1402 may replace some or all of the replaceable profiles (i.e., ones with higher standard deviations) with updated profiles.
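A compact sketch of this keep/replace ranking, assuming each profile is stored as a vector of measurement points; the shapes follow the 20-profile example above, while the mean-absolute-deviation score is an illustrative simplification:

```python
# Score each profile by its deviation from the per-point mean; the half with
# the larger deviations becomes replaceable.
import numpy as np

profiles = np.random.rand(20, 2165)          # hypothetical: 20 profiles x 2165 points
mean_per_point = profiles.mean(axis=0)       # mean of each measurement point
deviation = np.abs(profiles - mean_per_point)

scores = deviation.mean(axis=1)              # one scalar per profile
order = np.argsort(scores)                   # lowest deviation first
keep = order[:10]                            # kept profiles (lowest 10)
replaceable = order[10:]                     # candidates for replacement
```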
- each measurement point may include an associated composite value calculated from at least one of an average of the corresponding measurement points (or local binary or ternary patterns) of the 20 sampled profiles, a standard deviation of the average of the corresponding measurement points (or local binary or ternary patterns) of the 20 sampled profiles, and/or the corresponding measurement points (or local binary or ternary patterns) of the 20 sampled profiles.
- the associated composite value may be the sum of the corresponding measurement points of the 20 sampled profiles.
- the associated composite value may be the average of the corresponding measurement points of the 20 sampled profiles.
- the associated composite value may be proportional or inversely proportional to the standard deviation of the average of the corresponding measurement points of the 20 sampled profiles. Other ways of generating the associated composite value of a measurement point may also be used.
- the ECD 1404 and/or one of the gateways 1402 may use an averaging equation to calculate an average pixel value, e.g., $\bar{a} = \frac{1}{n}\sum_{i=1}^{n} a_i$, where $n$ denotes the number of samples and $a_i$ denotes the value of each pixel.
- an example of a binary tree 2900 may include a root node 2902, a first left node 2904, a first right node 2906, a second left node 2908, a second right node 2910, a third left node 2912, a third right node 2914, and nodes 29n, 29n+1, 29n+2, 29n+3, 29n+4, etc.
- the binary tree 2900 may be a data structure that includes the features of a sampled profile.
- the root node 2902 may include the most unique feature of the sampled profile of a person
- the first left node 2904 and the first right node 2906 may include the next most unique features, and so forth.
- a sampled profile may be divided into multiple regions (e.g., 4, 9, 16, 25).
- the sampled profile may include nine regions: Left Eye, Nose Bridge, Right Eye, Left Cheek, Nose, Right Cheek, Left Mouth, Middle Mouth, and Right Mouth.
- the sampled profile may include four regions: top left, top right, bottom left, and bottom right. The sizes of the regions may be the same or different.
- the root node 2902 and/or nodes near the root node 2902 may include measurement points (or local binary or ternary patterns) around the eyes due to these being the more unique features.
- the ECD 1404 and/or one of the gateways 1402 may compare the binary tree 2900 with a stored binary tree 2950 of a plurality of binary trees.
- the ECD 1404 and/or one of the gateways 1402 may calculate a difference between a value of the first node 2902 of the binary tree 2900 and a value of a first node 2952 of the binary tree 2950, a difference between a value of the first left node 2904 of the binary tree 2900 and a value of a first left node 2954 of the binary tree 2950, a difference between a value of the first right node 2906 of the binary tree 2900 and a value of a first right node 2956 of the binary tree 2950... to compute an aggregated difference value.
- extra nodes from one of the binary trees 2900, 2950 may be truncated by the ECD 1404 and/or one of the gateways 1402.
- the ECD 1404 and/or one of the gateways 1402 may determine a number of nodes to compare for each tree. If the aggregated difference value is lower than the aggregated difference values between the binary tree 2900 and other binary trees stored in the ECD 1404 and/or one of the gateways 1402, the ECD 1404 and/or one of the gateways 1402 may determine a positive match.
- the ECD 1404 and/or one of the gateways 1402 may determine a positive match when the aggregated difference values between the binary tree 2900 and other binary trees (belonging to the same person) are lower than other aggregated difference values.
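For illustration only, the sketch below stores each uniqueness-ordered tree as a flat level-order array, truncates the longer tree, and picks the enrolled tree with the smallest aggregated node-wise difference; the representation and function names are assumptions:

```python
# Compare uniqueness-ordered trees node by node and pick the closest match.
from typing import Dict, Sequence

def aggregated_difference(a: Sequence[float], b: Sequence[float]) -> float:
    n = min(len(a), len(b))                  # truncate extra nodes
    return sum(abs(a[i] - b[i]) for i in range(n))

def best_match(probe: Sequence[float],
               enrolled: Dict[str, Sequence[float]]) -> str:
    # the stored tree with the smallest aggregated difference wins
    return min(enrolled, key=lambda k: aggregated_difference(probe, enrolled[k]))
```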
- the binary tree 2900 may be restructured periodically.
- FIG. 30 shows an example of a time-domain analysis 3000, which may begin with the ECD 1404 capturing n sampled profiles 3002-1, 3002-2, ... 3002-n of an enroller during the enrollment process.
- Each of the n sampled profiles 3002-1, 3002-2,... 3002-n may be divided into two or more zones (e.g., 9 zones)
- the two or more zones may include the upper left, upper middle, upper right, middle left, center, middle right, lower left, lower middle, and lower right zones.
- the two or more zones may have equal dimensions or may have different dimensions.
- the two or more zones may include the same or different number of sampling points.
- the n sampled profiles 3002-1, 3002-2,... 3002-n may be transmitted to one or more of the gateways 1402.
- the one or more biometric templates (described above) for the n sampled profiles 3002-1, 3002-2,... 3002-n may be transmitted to one or more of the gateways 1402.
- the n sampled profiles 3002-1, 3002-2,... 3002-n may include information such as timestamps (i.e., time the sampled profile was captured), light used (i.e., exposure under UV light, IR light, and/or visible light), resolutions of the camera used for capturing the image associated with the sampled profile, or other information related to the sampled profiles 3002.
- the one or more gateways 1402 and/or the ECD 1404 may attempt to update one or more of the n sampled profiles to accommodate for changes in the appearance of the enroller.
- the enroller may grow a beard or mustache, put on make-up, get a darker complexion from sun-tan, receive a scar on his/her face from injuries, add piercings onto his/her face, wear glasses, or experience other events that may alter the face of the enroller.
- the one or more gateways 1402 and/or the ECD 1404 may designate certain sampled profiles as fixed (remains as part of the biometric template of the enroller) and other sampled profiles as updateable (may be replaced).
- the one or more gateways 1402 and/or the ECD 1404 may replace the oldest sampled profile with a new sampled profile obtained during verification and/or re-enrollment.
- n matrices 3004-1, 3004-2,... 3004-n may show the numbers of occurrences of a local pattern value (e.g., local binary pattern value or local ternary pattern value).
- the first matrix 3004-1 shows that the upper left zone of first sampled profile 3002-1 includes 39 occurrences of the local pattern value.
- the upper middle zone of the first sampled profile 3002-1 includes 121 occurrences of the local pattern value, and so forth and so on.
- the upper middle zone of second sampled profile 3002-2 includes 125 occurrences of the local pattern value.
- the upper right zone of second sampled profile 3002-2 includes 99 occurrences of the local pattern value, and so forth and so on.
- the one or more gateways 1402 and/or the ECD 1404 may generate an average matrix 3010 of the numbers of occurrences in each zone for the n sampled profiles 3002.
- the average value for the upper left zone may be 39.33 (rounded to the nearest hundredth digit).
- the average value for the upper middle zone may be 122.33 (rounded to the nearest hundredth digit).
- the average value for the upper right zone may be 98.67 (rounded to the nearest hundredth digit), and so forth and so on.
- the one or more gateways 1402 and/or the ECD 1404 may generate absolute deviation matrices 3020 including absolute deviations between the average value of the numbers of occurrences in a zone and each of the number of occurrences.
- the first absolute deviation matrix 3020-1 may include, for the upper left zone, a number of 0.33 (rounded to the nearest hundredth digit).
- the number 0.33 indicates the absolute deviation between the average value 39.33 and the number 39 in the first matrix 3004-1.
- the second absolute deviation matrix 3020-2 may include, for the upper middle zone, a number of 2.67 (rounded to the nearest hundredth digit).
- the number 2.67 indicates the absolute deviation between the average value 122.33 and the number 125 in the second matrix 3004-2.
- the n th absolute deviation matrix 3020-n may include, for the lower left zone, a number of 5.33 (rounded to the nearest hundredth digit).
- the number 5.33 indicates the absolute deviation between the average value 105.67 and the number 111 in the n th matrix 3004-n, and so forth and so on.
- the one or more gateways 1402 and/or the ECD 1404 may compute the effective absolute deviations for each sampled profile.
- the effective absolute deviation may be the highest absolute deviation, the highest median of the absolute deviations for a sampled profile, or the sum of the absolute deviations for a sampled profile.
- the one or more gateways 1402 and/or the ECD 1404 may compute the effective absolute deviation (e.g., sum) for the first sampled profile 3002-1 based on the first absolute deviation matrix 3020-1, and obtain the value of 8 (i.e., the sum of 0.33, 1.33, 0.33, 0.33, 0, 1.33, 2.67, 9.33, and 1.33).
- the effective absolute deviation (sum) of the second sampled profile 3002-2 is 11 (i.e., sum of 0.33, 2.67, 0.33, 0.33, 0, 1.33, 2.67, 0.67, and 2.67).
- the effective absolute deviation (sum) of the n th sampled profile 3002-n is 13.
- the one or more gateways 1402 and/or the ECD 1404 may replace the n th sampled profile 3002-n with a new sampled profile because the n th sampled profile 3002-n has the highest absolute deviation (sum).
- the one or more gateways 1402 and/or the ECD 1404 may replace the first sampled profile 3002-1 with a new sampled profile because the first sampled profile 3002-1 has the highest absolute deviation.
- the one or more gateways 1402 and/or the ECD 1404 may update the biometric template based on the new sampled profile.
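The bookkeeping in this example can be sketched in a few lines of NumPy; the zone counts below are illustrative stand-ins (a few echo the numbers quoted above), and the sum is used as the effective absolute deviation:

```python
# Per-profile zone counts -> average matrix -> absolute-deviation matrices
# -> one effective deviation per profile; the largest becomes replaceable.
import numpy as np

counts = np.array([                # hypothetical: 3 profiles x 9 zones
    [39, 121, 98, 77, 60, 88, 102, 95, 104],
    [40, 125, 99, 78, 60, 87, 101, 96, 106],
    [39, 121, 99, 76, 60, 89, 104, 94, 111],
])

average = counts.mean(axis=0)                # average matrix (like 3010)
abs_dev = np.abs(counts - average)           # deviation matrices (like 3020)
effective = abs_dev.sum(axis=1)              # effective deviation per profile
replace_idx = int(np.argmax(effective))      # profile to replace
```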
- an example of a facial recognition process may begin with the ECD 1404 capturing an image 3100 of a face of a user.
- the image 3100 may be captured under one or more of UV light, IR light, or visible light.
- the performance of facial recognition algorithms may be dependent on the positioning of the face in the image 3100.
- Accurate recognition may require a full frontal image, or a near full frontal image, e.g., within one or more predetermined thresholds.
- a full frontal image may be an image having no pitch, yaw, or roll of the face (described below).
- a full frontal image may allow for maximum exposure of all facial detail.
- a person interacting with a camera may not inherently present his/her face in a full frontal position.
- the face is typically presented with varying degrees of yaw, pitch, and roll.
- the presentation of a full frontal image may be time-consuming and may weaken an advantage of facial recognition.
- the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to perform image processing. During the image processing, the ECD 1404 may identify one or more facial landmarks 3102. The one or more facial landmarks 3102 may be points at predetermined locations on the face of the user.
- the ECD 1404 may identify a landmark 3102-1 near the right corner of the right eye of the user, a landmark 3102-2 near the left corner of the right eye of the user, a landmark 3102-3 near the right corner of the left eye of the user, a landmark 3102-4 near the left corner of the left eye of the user, a landmark 3102-5 near the tip of the nose of the user, and a landmark 3102-6 near the tip of the chin of the user.
- Other landmarks may also be used, e.g., corners of the mouth.
- the landmarks 3102 may be used by the processor 1802, the processing board 1820, and/or the image processing algorithm stored in the memory 1804 to compute spatial deviations of the image 3100 from a full frontal image of the user, e.g., the yaw, roll, and/or pitch of the captured image 3100 as described below.
- an example 3200 illustrating the yaw, roll, and pitch of a captured image may align a head 3202 of the user to a roll axis 3210, a pitch axis 3220, and a yaw axis 3230.
- the head 3202 may be in the “neutral” position (no roll, pitch or yaw) with respect to the ECD 1404 when the ECD 1404 is able to take a full frontal image of the head 3202 of the user without further adjustment.
- the head 3202 may tilt to one side or another when the ECD 1404 captures the image. Tilting the head 3202 may cause the image of the head 3202 to include a roll (i.e., rotating about the roll axis 3210). In other examples, the head 3202 may raise up or lower when the ECD 1404 captures the image.
- Raising or lowering the head 3202 may cause the image of the head 3202 to include a pitch (i.e., rotating about the pitch axis 3220).
- the head 3202 may turn to the left or the right when the ECD 1404 captures the image.
- Turning the head 3202 may cause the image of the head 3202 to include a yaw (i.e., rotating about the yaw axis 3230).
- the head 3202 may tilt, raise/lower, and/or turn, causing the image of the head 3202 to include any one or any combination of a roll, pitch, or yaw.
- a yaw measurement technique 3300 may include measuring a first distance 3302 between the landmark 3102-1 (i.e., the right corner of the right eye of the user) and the landmark 3102-5 (i.e., the tip of the nose of the user) and a second distance 3304 between the landmark 3102-4 (i.e., the left corner of the left eye of the user) and the landmark 3102-5.
- a pitch measurement technique 3310 may include measuring a first distance 3312 between the landmark 3102-2 (i.e., the left corner of the right eye of the user) and the landmark 3102-5 (i.e., the tip of the nose of the user) and a second distance 3314 between the landmark 3102-5 and the landmark 3102-6 (i.e., the tip of the chin of the user).
- the ratio between the first distance 3302 and the second distance 3304 may represent a yaw angle, which may quantify an amount of yaw in the captured image. Likewise, the ratio between the first distance 3312 and the second distance 3314 may represent a pitch angle, which may quantify an amount of pitch in the captured image.
- a roll measurement technique 3320 may include measuring a vector 3322 between the landmark 3102-1 (i.e., the right corner of the right eye of the user) and the landmark 3102-4 (i.e., the left corner of the left eye of the user).
- the vector 3322 may be measured against a horizontal vector.
- the angle between the vector 3322 and the horizontal vector may represent a roll angle, which may quantify an amount of roll in the captured image.
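As a hedged sketch of these three measurements, using the landmark numbering above; the function and parameter names are mine, not the disclosure's:

```python
# Yaw and pitch as landmark-distance ratios; roll as the eye-line angle.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def yaw_ratio(right_eye_outer, left_eye_outer, nose_tip):
    # distances 3302 and 3304: each eye's outer corner to the nose tip
    return dist(right_eye_outer, nose_tip) / dist(left_eye_outer, nose_tip)

def pitch_ratio(right_eye_inner, nose_tip, chin_tip):
    # distances 3312 and 3314: eye-to-nose versus nose-to-chin
    return dist(right_eye_inner, nose_tip) / dist(nose_tip, chin_tip)

def roll_angle(right_eye_outer, left_eye_outer):
    # angle of the eye-corner vector 3322 against the horizontal
    dx = left_eye_outer[0] - right_eye_outer[0]
    dy = left_eye_outer[1] - right_eye_outer[1]
    return math.degrees(math.atan2(dy, dx))
```

Under these assumptions, ratios near 1 and a roll angle near 0 would indicate a near-frontal presentation.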
- the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to rotate the captured image until the vector 3322 is parallel with the horizontal vector. The corrections for the yaw and the pitch will be described in detail below.
- the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to overlay a macroblock 3410 (e.g., 10 x 10 blocks) onto the image 3100 captured by the ECD 1404.
- the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to align the macroblock 3410 to the landmarks 3102.
- the ECD 1404 and/or components of ECD 1404 may align certain blocks of the macroblock 3410 to the landmarks 3102.
- the blocks in the top-most row of the macroblock 3410 may be numbered from 1 to 10 from the top left of the face of the user to the top right of the face of the user.
- the second top-most row of the macroblock 3410 may be numbered from 11 to 20 from the top left of the face of the user to the top right of the face of the user, and so forth.
- the landmark 3102-1 may be aligned such that the landmark 3102-1 overlays a boundary between block 20 and block 30.
- the landmark 3102-3 may be aligned such that the landmark 3102-3 is entirely within block 24.
- the landmark 3102-5 may be aligned to the center line of the macroblock 3410. Other coordinate systems may also be used to align the landmarks 3102 to the blocks of the macroblock 3410.
- an example of a technique 3500 for correcting yaw may divide the macroblock 3410 into two 10 x 5 sub-macroblocks. For example, if a head 3202-1 (as seen from above) turns to the right, one or more portions of the macroblock 3410 overlaying the face of the user may be misaligned due to the yaw (e.g., the landmark 3102-5 is no longer aligned to the center line of the macroblock 3410). To adjust for the yaw, the ECD 1404 may split the macroblock 3410-1 into a first sub-macroblock 3410-1A and a second sub-macroblock 3410-1B.
- Both the first sub-macroblock 3410-1A and the second sub-macroblock 3410-1B may each include 10 x 5 blocks.
- the first sub-macroblock 3410-1A and the second sub-macroblock 3410-1B may share a center line 3412 that intersects the landmarks 3102-5 and/or 3102-6. Due to the head 3202-1 turning to the right (i.e., yaw), the first sub-macroblock 3410-1A may be compressed horizontally and the second sub-macroblock 3410-1B may be expanded horizontally.
- the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by expanding the first sub-macroblock 3410-1A and compressing the second sub-macroblock 3410-1B.
- the ECD 1404 (or one or more of the subcomponents) may adjust the first sub-macroblock 3410-1A and/or the second sub-macroblock 3410-1B to have identical areas.
- the ECD 1404 may split the macroblock 3410-2 into a third sub-macroblock 3410-2A and a fourth sub-macroblock 3410-2B. Both the third sub-macroblock 3410-2A and the fourth sub-macroblock 3410-2B may each include 10 x 5 blocks.
- the third sub-macroblock 3410-2A and the fourth sub-macroblock 3410-2B may share the center line 3412 that intersects the landmarks 3102-5 and/or 3102-6. Due to the head 3202-2 turning to the left (i.e., yaw), the third sub-macroblock 3410-2A may be expanded horizontally and the fourth sub-macroblock 3410-2B may be compressed horizontally. To correct for the yaw, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by compressing the third sub-macroblock 3410-2A and expanding the fourth sub-macroblock 3410-2B.
- the ECD 1404 may adjust the third sub-macroblock 3410-2A and/or the fourth sub-macroblock 3410-2B to have identical areas.
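A toy version of this yaw correction, assuming OpenCV and a known nose-landmark x coordinate: split the image at the center line and rescale the two halves to equal widths, approximating the expand/compress step (a sketch, not the disclosure's exact reconstruction):

```python
# Rescale the two halves of a yawed face image to equal widths.
import cv2
import numpy as np

def correct_yaw(image: np.ndarray, nose_x: int) -> np.ndarray:
    h, w = image.shape[:2]
    left, right = image[:, :nose_x], image[:, nose_x:]
    half = w // 2
    left = cv2.resize(left, (half, h))        # expand the compressed side...
    right = cv2.resize(right, (w - half, h))  # ...and compress the expanded side
    return np.hstack([left, right])
```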
- an example of a technique 3600 for correcting pitch may divide the macroblock 3410 into a 2 x 10 sub-macroblock and an 8 x 10 sub-macroblock. For example, if a head 3202-3 (as seen from the right) raises, one or more portions of the macroblock 3410 overlaying the face of the user may be misaligned due to the pitch (e.g., the landmark 3102-1 is no longer aligned to a boundary between block 20 and block 30).
- the ECD 1404 may split the macroblock 3410-3 into a fifth sub-macroblock 3410-3A and a sixth sub-macroblock 3410-3B.
- the fifth sub-macroblock 3410-3A may include 2 x 10 blocks and the sixth sub-macroblock 3410-3B may include 8 x 10 blocks.
- the fifth sub-macroblock 3410-3A and the sixth sub-macroblock 3410-3B may share a dividing line 3414 that intersects the landmark 3102-1 and/or the landmark 3102-2. Due to the head 3202-3 raising (i.e., pitch), the fifth sub-macroblock 3410-3A and the sixth sub-macroblock 3410-3B may be compressed vertically.
- the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by expanding the fifth sub-macroblock 3410-3A and the sixth sub-macroblock 3410-3B.
- the ECD 1404 (or one or more of the subcomponents) may adjust the fifth sub-macroblock 3410-3A to occupy 20% of the area of the macroblock 3410-3 and adjust the sixth sub-macroblock 3410-3B to occupy 80% of the area of the macroblock 3410-3.
- the ECD 1404 may split the macroblock 3410-4 into a seventh sub-macroblock 3410-4A and an eighth sub-macroblock 3410-4B.
- the seventh sub-macroblock 3410-4A may include 2 x 10 blocks and the eighth sub-macroblock 3410-4B may include 8 x 10 blocks.
- the seventh sub-macroblock 3410-4A and the eighth sub-macroblock 3410-4B may share the dividing line 3414 that intersects the landmark 3102-1 and/or the landmark 3102-2. Due to the head 3202-4 lowering (i.e., pitch), the seventh sub-macroblock 3410-4A may be compressed vertically and the eighth sub-macroblock 3410-4B may be expanded vertically. To correct for the pitch, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by expanding the seventh sub-macroblock 3410-4A and compressing the eighth sub-macroblock 3410-4B.
- the ECD 1404 may adjust the seventh sub-macroblock 3410-4A to occupy 20% of the area of the macroblock 3410-4 and adjust the eighth sub-macroblock 3410-4B to occupy 80% of the area of the macroblock 3410-4.
- a given macroblock may capture the same area of the face regardless of the yaw and pitch of the face. This given area of the face may be smaller or larger based on the type of movement but the boundary may remain the same.
- a macroblock may include the right corner of the right eye, which may include a landmark, in the full frontal position.
- the macroblock will continue to include the right corner of the right eye regardless of the pitch and yaw angle.
- the area that it covers in the full frontal position will be 10 x 10 blocks, but this area will change depending on the pitch and yaw angles.
- the macroblock may become 4 x 6 blocks or 14 x 8 blocks, but it will always cover the same portion of the face. By digitally aligning the orientation of the face to the camera, the same region of the face may be aligned for each macroblock.
- the macroblock 3410 may be converted into a ULBP histogram.
- the histogram may be constructed by calculating the ULBP for each pixel within the macroblock and then aggregating the resulting ULBPs into a histogram.
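A hedged sketch of this histogram construction, together with the per-area normalization described in the next paragraph, assuming scikit-image's uniform LBP (with P = 8 neighbors the "uniform" method yields P + 2 = 10 bins):

```python
# Normalized ULBP histogram for one macroblock.
import numpy as np
from skimage.feature import local_binary_pattern

def ulbp_histogram(macroblock: np.ndarray, points: int = 8, radius: int = 1):
    codes = local_binary_pattern(macroblock, points, radius, method="uniform")
    hist = np.bincount(codes.astype(int).ravel(), minlength=points + 2)
    return hist / macroblock.size   # divide by macroblock area in pixels
```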
- the amplitude of the histogram may be the number of pixels with that ULBP value within the macroblock. To normalize the histogram across macroblocks with varying dimensions, the histogram amplitude values may be divided by the area of the macroblock in pixels. The amplitude then becomes a percentage of the macroblock containing the given ULBP value. These normalized values may be used in the remaining algorithm to characterize the face (as described above). [302] In some implementations, a collision between a first biometric template of a first user and a second biometric template of a second user may occur when the overlap of features between the two users causes the biometric data matching algorithm to mistake one user for the other and grant a false accept or a false reject.
- in such a collision, the gateways 1402 and/or the ECD 1404 may identify a positive match between the first user and the second biometric template.
- the collision may be identified proactively (i.e., the collision is identified independent of any actions).
- the collision may be identified by the gateways 1402 and/or the ECD 1404 when the environment 1400 is experiencing a few (e.g., less than 1 per hour) or zero access requests.
- the collision may be identified due to an access request by the first user or the second user.
- the collision may be identified independent of any access request.
- the gateways 1402 and/or the ECD 1404 may identify overlapping features between the first biometric template and the second biometric template.
- the gateways 1402 and/or the ECD 1404 may determine a collision when 50% or more of the features in the first biometric template overlaps with the features in the second biometric template.
- the gateways 1402 and/or the ECD 1404 may notify an administrator, such as the network administrator 1452, a security personnel, and/or other relevant employees, about the collision.
- the gateways 1402 and/or the ECD 1404 may notify the administrator via automatically generated email and/or text messages.
- the gateways 1402 and/or the ECD 1404 may notify the first user and/or the second user to approach any of the ECDs 1404a, 1404b, 1404c, 1404d, 1404e, 1404f to re-enroll.
- the word “example,” when used in this description, means “serving as an example, instance, or illustration,” and not “preferred” or “advantageous over other examples.”
- the detailed description includes specific details for the purpose of providing an understanding of the described techniques. These techniques, however, may be practiced without these specific details. For example, changes may be made in the function and arrangement of elements discussed without departing from the scope of the disclosure. Also, various examples may omit, substitute, or add various procedures or components as appropriate. For instance, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. Also, features described with respect to some examples may be combined in other examples.
- a specially-programmed device, such as but not limited to a processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic, a discrete hardware component, or any combination thereof designed to perform the functions described herein.
- a specially-programmed processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- a specially-programmed processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- the functions described herein may be implemented in hardware, software executed by a processor, firmware, or any combination thereof. If implemented in software executed by a processor, the functions may be stored on or transmitted over as one or more instructions or code on a non-transitory computer-readable medium. Other examples and implementations are within the scope and spirit of the disclosure and appended claims.
- Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
- a storage medium may be any available medium that can be accessed by a general purpose or special purpose computer.
- computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code means in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor.
- any connection is properly termed a computer-readable medium.
- Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of computer-readable media.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Ophthalmology & Optometry (AREA)
- Collating Specific Patterns (AREA)
- Image Processing (AREA)
Abstract
Aspects of the present disclosure include methods for generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, identifying one or more macroblocks each includes a subset of the plurality of sampling points, calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, assigning a unique index to each of the plurality of weighted values, generating a second array of the unique index by ranking the plurality of weighted values, and generating a third array including a plurality of ranking distances.
Description
METHODS AND APPARATUS FOR FACIAL RECOGNITION CROSS-REFERENCE TO RELATED APPLICATIONS [1] This application claims priority to and the benefit of United States Patent Application No. 16/730,578, filed on December 30, 2019, which is a Continuation of United States Patent Application No. 16/297,351, filed on March 8, 2019, which is a Continuation-in-Part of United States Patent Application No. 16/104,826, filed on August 17, 2018, which is a Continuation-in-Part of United States Patent Application No. 15/649,144, filed on July 13, 2017, which is a Continuation of United States Patent Application No. 14/022,080, filed on September 9, 2013, now United States Patent No. 9,740,917, issued August 22, 2017, which claims the benefit of United States Provisional Application No. 61/792,922, filed on March 15, 2013, and United States Provisional Application No. 61/698,347, filed on September 7, 2012, the contents of which are incorporated by reference in their entireties. BACKGROUND [2] There has been a growing need for stronger identity verification to protect personal property, both physical and electronic. For example, it is important to control access to premises, vehicles, and personal property so that only authorized requesters are allowed access. A requester may be a user/person that requests access to access-controlled assets and/or infrastructure. In a traditional example, a requester may carry and use a key, which is designed to fit a lock to allow the requester of the key to open the lock and gain entry. A loss of or damage to the key, however, can render access impossible. In another example, a requester may use a key fob to remotely lock or unlock the doors of a vehicle by, e.g., pressing a button on the fob to generate an infrared ("IR") or radio frequency ("RF") signal, which is detected by a sensor in the vehicle, which controls the doors. Such vehicle keyless access systems may require the requester to operate the ignition system. Other similar keyless access implementations may involve inserting and presenting a magnetic card or the like in a slot or a card reader/detector, or enabling an authorized requester to key in a numeric or alphanumeric code on a provided keypad. In each of these conventional techniques, however, it is very difficult to determine if the person holding the key/card is the actual authorized requester. An imposter may steal or duplicate a valid key and gain unauthorized access to the premises, vehicle, and/or personal property.
[3] While traditional biometrics access control systems may mitigate some shortcomings of keys/cards-based access control systems, there may be limitations as well. Traditional biometric sensors, such as iris detection sensors, may be limited to specific light conditions significantly reducing both the effectiveness of the biometric sensor as well as the possible environments to apply same. The performance of biometric sensors may be compromised in direct sunlight due to glares, shadows, and other artifacts. Even with the emergence of mega-pixel camera technology, the features of each face may be obscured by ambient lighting, the position of the face, changes to the face, the background behind the face and the quality of the camera. Motion blur, insufficient resolution, environmental impacts, lighting, background, and camera angles collude to obscure subject details, making heterogeneous facial recognition (the matching of video and other probe images to large databases of frontal photographs) difficult. [4] Other factors may also increase the false acceptance and/or false recognition rates of traditional biometric sensors. For example, biometric sensors also have difficulties obtaining the necessary data in the absence of light. Light source shadowing and other changes in intensity may create contrasts on the face that may be misinterpreted as facial features, and/or slightly distort the measurement of the real facial features. Another major source of inaccuracy is the increased probability of similar measured features between faces in a growing population. Further, the problem of capturing the features of each face may be compounded by the desire for low maintenance and/or low complexity facial recognition systems. Therefore, improvement in access control may be desired. SUMMARY [5] The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later. [6] Some aspects of the present disclosure include methods for generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, identifying one or more macroblocks each includes a subset of the plurality of sampling points, selecting a local pattern value,
calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, assigning a unique index to each of the plurality of weighted values, generating a second array of the unique index by ranking the plurality of weighted values, and generating a third array including a plurality of ranking distances. [7] Certain aspects of the present disclosure include an edge capture device (ECD) having an illumination source configured to emit an incident non-visible light, an optical sensor configured to detect a detected non-visible light, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light, one or more processors operatively coupled to the illumination source and the optical sensor, the one or more processors are configured to construct a biometric template of a requester requesting access to an entry point by generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, identifying one or more macroblocks each includes a subset of the plurality of sampling points, selecting a local pattern value, calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, assigning a unique index to each of the plurality of weighted values, generating a second array of the unique index by ranking the plurality of weighted values, and generating a third array including a plurality of ranking distances. [8] Aspects of the present disclosure include a computer readable medium having code stored therein that, when executed by one or more processors, cause the one or more processors to execute code for generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, code for identifying one or more macroblocks each includes a subset of the plurality of sampling points, code for selecting a local pattern value, code for calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks, code for generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the
numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, code for assigning a unique index to each of the plurality of weighted values, code for generating a second array of the unique index by ranking the plurality of weighted values, and code for generating a third array including a plurality of ranking distances. [9] An aspect of the present disclosure includes a system having means for generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, means for identifying one or more macroblocks each includes a subset of the plurality of sampling points, means for selecting a local pattern value, means for calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks, means for generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, means for assigning a unique index to each of the plurality of weighted values, means for generating a second array of the unique index by ranking the plurality of weighted values, and means for generating a third array including a plurality of ranking distances. [10] Aspects of the present disclosure include an infrastructure having an access-controlled entry point, an ECD configured to emit an incident non-visible light onto a face of a requester, detect a detected non-visible light from the face of the requester, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light, generate a biometric template of the requester by generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light, identifying one or more macroblocks each includes a subset of the plurality of sampling points, selecting a local pattern value, calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks, assigning a unique index to each of the plurality of weighted values, generating a second array of the unique index by ranking the plurality of weighted values, and generating a third array including a plurality of ranking distances, store a plurality of biometric templates of authorized personnel, compare the biometric template of the requester with the plurality of biometric templates of authorized personnel, generate a positive match signal in response to identifying
a match between the biometric template of the requester and one of the plurality of biometric templates of authorized personnel, and transmit the positive match signal to a gateway to grant the requester access to the entry point. BRIEF DESCRIPTION OF THE DRAWINGS [11] The features believed to be characteristic of aspects of the disclosure are set forth in the appended claims. In the description that follows, like parts are marked throughout the specification and drawings with the same numerals, respectively. The drawing figures are not necessarily drawn to scale and certain figures may be shown in exaggerated or generalized form in the interest of clarity and conciseness. The disclosure itself, however, as well as a preferred mode of use, further objects and advantages thereof, will be best understood by reference to the following detailed description of illustrative aspects of the disclosure when read in conjunction with the accompanying drawings, wherein: [12] FIG. 1 is an example concurrent real-time identity verification and authentication system, in accordance with some aspects of the present disclosure; [13] FIG. 2 shows a perspective view of an example of a concurrent real-time identity verification and authentication device, in accordance with some aspects of the present disclosure; [14] FIG. 3 shows a frontal view of an example of a concurrent real-time identity verification and authentication device, in accordance with some aspects of the present disclosure; [15] FIG. 4 shows another perspective view of an example of a concurrent real-time identity verification and authentication device, in accordance with some aspects of the present disclosure; [16] FIG. 5 is a block diagram of an example processing component of a concurrent real- time identity verification and authentication device, in accordance with some aspects of the present disclosure; [17] FIG.6 shows a flow diagram of a facial recognition method, in accordance with some aspects of the present disclosure; [18] FIG. 7(a) shows a facial image for a person, in accordance with some aspects of the present disclosure; [19] FIG.7(b) shows a different facial image for the same person, in accordance with some aspects of the present disclosure;
[20] FIG. 8 shows an example process for calculating local binary pattern (LBP) feature, in accordance with some aspects of the present disclosure; [21] FIG. 9 shows an example process for calculating local ternary pattern (LTP) feature, in accordance with some aspects of the present disclosure; [22] FIG. 10 shows positions of three example key features selected among one or more face images, in accordance with some aspects of the present disclosure; [23] FIG. 11 shows an example of a receiver operating characteristic (ROC) curve for testing a face database, in accordance with some aspects of the present disclosure; [24] FIG. 12 illustrates an example of biometric, asymmetric encryption for confidentiality, in accordance with some aspects of the present disclosure; [25] FIG. 13 illustrates another example of biometric, asymmetric encryption for authentication, in accordance with some aspects of the present disclosure; [26] FIG. 14 illustrates a schematic view of an example of an environment for implementing one or more gateways for access control; [27] FIG. 15 illustrates an example of a computer system for implementing a method of managing data in accordance with aspects of the present disclosure; [28] FIG. 16 illustrates a block diagram of various exemplary system components, in accordance with aspects of the present disclosure; [29] FIG. 17 illustrates an example of an ECD for identifying biometric templates, in accordance with aspects of the present disclosure; [30] FIG. 18 illustrates an example of the components of the ECD of FIG. 17, in accordance with aspects of the present disclosure; [31] FIG. 19 illustrates another example of the components of the ECD of FIG. 17, in accordance with aspects of the present disclosure; [32] FIG. 20 illustrates an example of a sampled profile, in accordance with aspects of the present disclosure; [33] FIG. 21 illustrates an example of LBP operation on measurement points of the sampled profile of FIG. 20, in accordance with aspects of the present disclosure; [34] FIG. 22 illustrates examples of sub-matrices; [35] FIG. 23 illustrates an example of a table of results for sequence conversion, in accordance with aspects of the present disclosure; [36] FIG. 24 illustrates an example of a flow chart for converting a sequence, in accordance with aspects of the present disclosure;
[37] FIG. 25 illustrates an example of a table of verification calculations, in accordance with aspects of the present disclosure;

[38] FIG. 26 illustrates an example of a flow chart of deep learning, in accordance with aspects of the present disclosure;

[39] FIG. 27 illustrates another example of a flow chart of deep learning, in accordance with aspects of the present disclosure;

[40] FIG. 28 illustrates a flow chart of a method for identifying biometric templates, in accordance with aspects of the present disclosure;

[41] FIG. 29 illustrates examples of binary trees based on sampled profiles of biometric templates, in accordance with aspects of the present disclosure;

[42] FIG. 30 illustrates an example of time-domain analysis, in accordance with aspects of the present disclosure;

[43] FIG. 31 illustrates an example of facial recognition using auto-alignment, in accordance with aspects of the present disclosure;

[44] FIG. 32 illustrates an example of yaw, roll, and pitch, in accordance with aspects of the present disclosure;

[45] FIG. 33 illustrates examples of techniques for computing yaw, roll, and pitch, in accordance with aspects of the present disclosure;

[46] FIG. 34 illustrates an example of aligning a macroblock to a face using landmarks, in accordance with aspects of the present disclosure;

[47] FIG. 35 illustrates examples of aligning a macroblock to a face including yaw, in accordance with aspects of the present disclosure; and

[48] FIG. 36 illustrates examples of aligning a macroblock to a face including pitch, in accordance with aspects of the present disclosure.

DETAILED DESCRIPTION

[49] The following includes definitions of selected terms employed herein. The definitions include various examples and/or forms of components that fall within the scope of a term and that may be used for implementation. The examples are not intended to be limiting.

[50] A "processor," as used herein, processes signals and performs general computing and arithmetic functions. Signals processed by the processor may include digital signals, data signals, computer instructions, processor instructions, messages, a bit, a bit stream, or other computing data that may be received, transmitted and/or detected.
[51] A "bus," as used herein, refers to an interconnected architecture that is communicatively coupled to transfer data between computer components within a singular or multiple systems. The bus may be a memory bus, a memory controller, a peripheral bus, an external bus, a crossbar switch, and/or a local bus, among others. The bus may also be a vehicle bus that interconnects components inside a vehicle using protocols, such as Controller Area Network (CAN), Local Interconnect Network (LIN), among others.

[52] A "memory," as used herein, may include volatile memory and/or non-volatile memory. Non-volatile memory may include, for example, ROM (read only memory), PROM (programmable read only memory), EPROM (erasable PROM) and EEPROM (electrically erasable PROM). Volatile memory may include, for example, RAM (random access memory), static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), and/or direct Rambus RAM (DRRAM).

[53] As used in the specification and the appended claims, the singular forms "a," "an" and "the" include plural referents unless the context clearly dictates otherwise.

[54] Ranges may be expressed herein as from "about," "substantially," or "approximately" one particular value and/or to "about," "substantially," or "approximately" another particular value. When such a range is expressed, another implementation includes from the one particular value and/or to the other particular value.

[55] By "comprising" or "containing" or "including" is meant that at least the named compound, element, particle, or method step is present in the composition or article or method, but the term does not exclude the presence of other compounds, materials, particles, or method steps, even if the other such compounds, materials, particles, or method steps have the same function as what is named.

[56] It is also to be understood that the mention of one or more method steps does not preclude the presence of additional method steps or intervening method steps between those steps expressly identified. Similarly, it is also to be understood that the mention of one or more components in a device or system does not preclude the presence of additional components or intervening components between those components expressly identified.

[57] Biometric identification techniques generally refer to pattern recognition techniques that perform a requester identification process by determining the authenticity of a specific physiological or behavioral characteristic possessed by the requester. In some instances, biometric identification may be preferred over traditional methods involving passwords and
personal identification numbers (PINs) for various reasons. For example, with biometric identification, the person (e.g., requester) to be identified is typically required to be physically present at the point-of-identification. Additionally, identification based on biometric techniques obviates the need to remember a password or carry a token (i.e., a security device used to gain access to an access controlled entry point).

[58] One kind of texture-based feature, the local binary pattern ("LBP"), describes facial information in a way that produces desirable recognition results. The local ternary pattern ("LTP") feature may be a further improvement over conventional LBP methods. LBP and LTP features may not be sensitive to light and expression variations and are computationally efficient, but they also have shortcomings, such as information redundancy due to correlation between the positive histogram and the negative histogram.

[59] It is therefore desirable to contemplate concurrent real-time identity verification and authentication techniques that create biometric signature data for providing authorized requesters keyless access to a vehicle, building, or the like with varying degrees of security by utilizing various types of biometric data of authorized requesters. As discussed above, in some implementations of the present disclosure, the biometric signature data may be interchangeable across a wide variety of applications. Accordingly, in some examples of the present disclosure, the same biometric signature data for a person may be used to authenticate that person at one or more locations and for one or more applications. Additionally, an example of a biometric system in the present disclosure allows the biometric signature data to be altered based on a desired security level. Thus, the type of biometric signature data that may be used for a particular application and/or relating to a particular requester may vary depending on the security level desired for that particular application and/or requester. While some implementations discussed herein are discussed in the context of facial biometric data, those skilled in the art would understand that various implementations of the present disclosure may employ many types of biometric data, including, but not limited to, fingerprint data, iris and retinal scan data, speech data, facial thermograms, hand geometry data, and the like.

[60] In some implementations of the present disclosure, the biometric data associated with the intended recipient (e.g., a biometric template) may be obtained via a biometric sensor of a biometric-based access control system. As will be discussed below, variations in light, temperature, and distance of the biometric sensor from a target may impact the quantity and quality of the biometric data obtained via the biometric sensor. For example, variations in
light intensity and angle may create shadows on the face of a requester, making facial recognition more difficult. If the biometric data for identifying a requester is obscured, more templates may be needed to properly authenticate the requester, thus increasing the quantity of the biometric data necessary. To reduce the undesirable impact of these environmental factors, the biometric sensor may utilize either near infrared (IR) or ultraviolet (UV) light, or a combination of both IR and UV at desired intensities. In an implementation, the method uses near IR light. An infrared light-emitting diode (LED) array may be utilized in the facial recognition device or biometric sensor to minimize the impact of the surrounding lighting on capturing the facial uniqueness. The camera and the LED array are packaged into a dedicated edge device (e.g., an ECD or a faceplate) mounted at a location requiring verification and/or identification/analysis, such as a door requiring access control.

[61] In some implementations, an access control system may utilize IR or near IR illumination and detection to identify facial features. IR or near IR lighting may penetrate into the dermis of the face. The IR or near IR lighting may penetrate into the dermis by 10 micrometers, 20 micrometers, 50 micrometers, 100 micrometers, 200 micrometers, 500 micrometers, 1 millimeter, 2 millimeters, 5 millimeters, and/or 10 millimeters. Other penetration depths are possible. The penetration depths may depend on the location on the body, the wavelength of the infrared lighting, and/or the intensity of the infrared lighting. The penetration may expose characteristics of the skin that may be difficult to see in visible light, including age spots, spider veins, hyperpigmentation, rosacea, acne, and porphyrins. The identification of these subdermal features may be used to adjust/supplement the unique identification of the requester. These features on the face of the requester may be unique because they are based on the requester's exposure to nature and the sun over the life of the requester. Facial recognition based on subdermal features may identify the uniqueness of the face at the time of capture to provide opportunities for identification analysis. The number of subdermal features may increase over time with daily exposure to the sun.

[62] In another example, an access control system may utilize ultraviolet illumination and detection to identify facial features. Ultraviolet lighting may penetrate into the dermis of the face. The UV lighting may penetrate into the dermis by 10 micrometers, 20 micrometers, 50 micrometers, 100 micrometers, 200 micrometers, 500 micrometers, 1 millimeter, 2 millimeters, 5 millimeters, and/or 10 millimeters. Other penetration depths are possible. The penetration depths may depend on the location on the body, the wavelength of the ultraviolet lighting, and/or the intensity of the ultraviolet lighting. The penetration may expose
characteristics of the skin that may be difficult to see in visible light, including age spots, spider veins, hyperpigmentation, rosacea, acne, and porphyrins. The identification of these subdermal features may be used to adjust/supplement the unique identification of the requester. These features on the face of the requester may be unique because they are based on the requester's exposure to nature and the sun over the life of the requester. Facial recognition based on subdermal features may identify the uniqueness of the face at the time of capture to provide opportunities for identification analysis. The number of subdermal features may increase over time with daily exposure to the sun. The facial recognition system of the present disclosure may estimate the age of a person based on the quantity and nature of the subdermal features. The access control system may also track the change in these features over time to confirm the individual's identity and establish lifestyle and daily routines based on interpretations of the subdermal features. Subdermal facial recognition may also increase the difficulty of creating a duplicate (e.g., a duplicate of a biometric template) of the face because it eliminates the dependency on facial features capable of being captured by standard visible wavelength photography and camera technology. The access control system may also further obfuscate the content of the ultraviolet capture by introducing time-sequenced cross-polarization filters to the capturing process, which further eliminates the ability to present an artificial duplicate of the face to the access control system.

[63] A benefit of the system in the present disclosure includes allowing a single credential system to replace PINs, passwords, and multi-factor authentication in a manner that is seamless to the requester. With this architecture in place, the requester(s) of the system may rely on a single credential management solution. The system of the present disclosure may support both logical and physical gateways. In some implementations of the present disclosure, the system may provide protection at home and at work.

[64] Aspects of the present disclosure may include a method referred to as "layered reinforcement." The method comprises taking the image of the face from the biometric sensor and overlaying several layers of pixel boxes of different sizes on the image. This layering of pixel boxes of different sizes has an amplifying impact on the analysis of the uniqueness of the face. Areas that are more unique to the face are amplified. Areas that are more common among faces are deemphasized. As a result, layered reinforcement may improve the algorithm performance while allowing the method to handle a large number of users at multiple sites where the biometric sensor ECD is deployed. The "layered reinforcement" of the method may allow for the processing of the same number of requesters on a local Advanced
Reduced Instruction Set Computing Machine (ARM) processor at the biometric sensor ECD where the image is first captured, thus reducing hardware and processing requirements and contributing to the accuracy and reliability of the method, as a network failure cannot prevent the biometric sensor ECD from processing a face verification.

[65] Some aspects of this embodiment of the invention cover the use of a gateway (described below) to manage the data analyzed by the various algorithms to increase performance by decreasing false negative and false positive results through the following processes: pixel box hierarchical analysis to create a binary tree of dominant features (i.e., determining the most distinctive feature); pixel box time-domain analysis with heat maps (i.e., determining over time the features that are problematic due to overlap among subjects); and binary tree collision (flagging overlap of the biometric signature data for two subjects that may cause a false positive and addressing it in a proactive fashion).

[66] Benefits of the system of the present disclosure include improved performance when accuracy requires a reduction in false negative and false positive results. The improvement also allows for the benefits of 1:1 comparison in a 1:N environment as a potential replacement for video surveillance and comparison, thereby opening up the massive surveillance market to significantly increased accuracy.

[67] Referring to FIG. 1, an example of an identification system 100 for concurrent real-time identity verification and authentication for use in, e.g., allowing access by an authorized requester to a vehicle, building, or the like, is illustrated in accordance with aspects of the present disclosure.

[68] It should be appreciated that FIG. 1 is intended to describe aspects of the disclosure in sufficient detail to enable those skilled in the art to practice them. Other implementations may be utilized and changes may be made without departing from the scope of the present disclosure.

[69] The identification system 100 comprises a concurrent real-time identity verification and authentication device 102 including at least one biometric sensor 104, a processor 106, memory 108, a display 110, and an input/output mechanism 112. The identification system 100 may be used to secure or control access to a secured area, device, or information, such as an airport boarding area, building, stadium, database, locked door, vehicle, or other access controlled assets/infrastructure.

[70] The biometric sensor(s) 104 may include a camera, a fingerprint reader, retinal scanner, facial recognition scanner, weight sensor, height sensor, body temperature sensor, gait sensor, heartbeat sensor, or any other sensor or device capable of sensing a biometric
characteristic of a person. As shown in FIGS. 2-4, in an exemplary implementation of the present disclosure, the biometric sensor(s) 104 may be an optical sensor, such as a camera.

[71] In some aspects, the biometric sensor(s) 104 may include an optical sensor that captures visual data. For example, the biometric sensor(s) 104 may be a camera that senses visual information of a requester, such as the facial features of the person. The facial features of the person may include the textures, complexions, bone structures, moles, birthmarks, contours, and coloring of the face of the person. The biometric sensor(s) 104 may capture the facial features of the person and convert the visual information into digital sensed information (as discussed below).

[72] The processor 106 may be configured for comparing the information sensed via the biometric sensor(s) 104 with known characteristics of a person in an attempt to identify the person via biometric signature data. The processor 106 may include any number of processors, controllers, integrated circuits, programmable logic devices, or other computing devices. The processor 106 may be communicatively coupled with the biometric sensor(s) 104 and other components of the system 100 through wired or wireless connections to enable information to be exchanged between the device 102 and external devices 114 or systems (e.g., network 116) to allow for comparison of the stored biometric signature data with the sensed information obtained from the biometric sensor(s) 104.

[73] The processor 106 may implement a computer program and/or code segments stored on memory 108 to perform some of the functions described herein. The computer program may include an ordered listing of executable instructions for implementing logical functions in the device 102. The computer program can be embodied in any computer-readable medium (e.g., memory 108) for use by or in connection with an instruction execution system, apparatus, or device that can execute the instructions. The memory 108 may contain, store, communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device. Examples of memory 108 may include an electrical connection having one or more wires, a random access memory (RAM), a read-only memory (ROM), an erasable, programmable, read-only memory (EPROM or Flash memory), a portable computer diskette, or a portable compact disk read-only memory (CDROM). The memory 108 may be integral with the device 102, a stand-alone memory, or a combination of both. The memory 108 may include, for example, removable and non-removable memory elements such as RAM, ROM, Flash, magnetic, optical, USB memory devices, and/or other conventional memory elements.
[74] In some aspects, the memory 108 may store the known characteristics of a number of people and various other data associated with operation of the system 100, such as the computer program and code segments mentioned above, or other data for instructing the device 102 and other device elements to perform the aspects described herein. The various data stored within the memory 108 may be associated with one or more databases (not shown) to facilitate retrieval of the information, e.g., via the external devices 114 or the network 116. Although the memory 108 as shown in FIG. 1 is integrated into the device 102, it should be appreciated that memory 108 may be stand-alone memory positioned in the same enclosure as the device 102, or may be external memory accessible by the device 102.

[75] In an aspect, the display 110 may be configured to display various information relating to the system 100 and its underlying operations. For example, a notification device (not shown) may be included for indicating that the sensed biometric characteristic or the sensed signal fails to match the known characteristics of the person, and may include an audible alarm, a visual alarm, and/or any other notification device.

[76] In an aspect, the device 102 may also include an input/output mechanism 112 to facilitate exchanging data and other information among different components within the device 102, or with the various external devices 114 or systems via the network 116.

[77] For example, various I/O ports may be contemplated, including a Secure Digital (SD) card slot, Mini SD card slot, Micro SD card slot or the like for receiving removable SD cards, Mini SD cards, Micro SD cards, or the like, and a USB port for coupling with a USB cable communicatively coupled with another computing device such as a personal computer. In some aspects, the input/output mechanism 112 may include an input device (not shown) for receiving identification information about a person-to-be-identified. The input device may include a ticket reader, a credit card reader, an identification reader, a keypad, a touch-screen display, or any other device. In some other aspects, as described above, the input/output mechanism 112 may be configured to enable the device 102 to communicate with other electronic devices through the network 116, such as the Internet, a local area network, a wide area network, an ad hoc or peer-to-peer network, or a direct connection such as a USB, FireWire, or Bluetooth™ connection, etc. In one example, known characteristics about persons may be stored and retrievable in remote databases or memory via the network 116. The input/output mechanism 112 may thus communicate with the network 116 utilizing wired data transfer methods or wireless data transfer methods such as WiFi (802.11), WiMax, Bluetooth™, ANT®, ultra-wideband, infrared, cellular telephony, radio frequency, etc.
In an aspect, the input/output mechanism 112 may include a cellular transceiver for transmitting and receiving communications over a communications network operable with GSM (Global System for Mobile Communications), CDMA (Code Division Multiple Access), or any other known standards.

[78] The device 102 may also include a power source (not shown) for providing electrical power to the various components contained therein. The power source may include batteries, battery packs, power conduits, connectors, and receptacles operable to receive batteries, battery connectors, or power cables.

[79] In an aspect, the device 102 may be installed and positioned on an access control entry point (not shown) such as a gate, locked door, etc. for preventing persons from accessing certain areas until the device 102 determines that the sensed biometric characteristic and/or signal match the known characteristics. In some other aspects, as shown in FIGS. 2-4, the device 102 may be a stand-alone, compact, handheld, and portable device. In one example, one may use such a stand-alone, compact, handheld, and portable device to protect sensitive documents or information that are electronically stored and accessed on the Internet and/or an intranet. In some aspects, a concurrent real-time identity verification facility access unit may use biometric signature data to create interchangeable authentication for a variety of uses (e.g., office, home, smartphone, computer, facilities).

[80] Referring to FIG. 5, the processor 106 in FIG. 1 may be configured to include, among other features, a detection module 502 and a recognition module 508 for providing concurrent real-time or near real-time identity verification and authentication with keyless access for authorized requesters to secured facilities or information. The detection module 502 may include a face detection module 504 for detecting facial features of a requester. The detection module 502 may include an eye detection module 506 for identifying the locations of the eyes of a requester. In some implementations, the detection module 502 may include one or both of the face detection module 504 and the eye detection module 506. In some aspects, the processor 106 may receive inputs (digital or analog) from the sensor(s) 104.

[81] FIG. 6 describes an example procedure of selecting key features from a database with a large amount of facial information and building one classifier that can distinguish different faces accordingly. LBP and LTP may be used to provide a full description of face information, and then, with the use of an adaptive boosting ("adaboost") learning algorithm, one may select key features and build a classifier to distinguish different faces by creating biometric signature data. This biometric signature data may be used to create universal
verification and authentication that can be used for a variety of applications (e.g., computer, building access, smartphone, automobile, data encryption) with varying degrees of access and security (e.g., access to a network, but heightened security for the requester's computer). At block 602, create a face sample database. For example, the processor 106 and the recognition module 508 may create a face sample database using unrecognized face samples. In one implementation, the processor 106 and the recognition module 508 may store, into the memory 108, face samples of 1,000 different persons, with each person showing 10 different postures and/or expressions.

[82] At block 604, extract LBP and LTP features. For example, the detection module 502 and/or the face detection module 504 may extract LBP and LTP features from different blocks in different positions of each face sample.

[83] At block 606, calculate positive samples and negative samples. For example, at least one of the detection module 502, the face detection module 504, and the eye detection module 506 may calculate the feature absolute value distance for the same position of any two different images from one person and set this distance as the positive sample feature database. Further, the detection module 502 and the face detection module 504 may jointly or separately calculate the feature absolute value distance for the same position of any two different images from different persons and set this distance as the negative sample feature database.

[84] At block 608, build the adaboost classifier. For example, the detection module 502 and the face detection module 504 may select the most distinguishable key features from the candidate feature database with adaboost and create a face classifier.

[85] At block 610, generate the recognition result. For example, the recognition module 508 may generate a recognition result. Once there is a fixed dataset of macroblocks and the specific LBP value, ranging from 1 to 255, is determined, a value is assigned to that unit of the dataset based upon the number of pixels within the block that satisfy that specific LBP. For example, assuming a 10 x 10 macroblock in unit number 1 of 255 and an LBP value of 20, the method 600 determines, as a scalar value, the number of pixels in the histogram that fall within that LBP value of 20. The method 600 calculates the scalar value and then normalizes the value in a second array to address the problem of determining values within various sized macroblocks. Under the known method, the scalar value was based on the size of the macroblock, where the maximum value could be from 100 to 1600 depending upon the size of the macroblock. The scalar value in this second array may now be a percentage of the total pixels available in that
macroblock, to normalize the data for the subsequent assessment. Normalization causes the data to not be skewed based on the size of the macroblock. After normalization under the improved method, each unit of the dataset in this second array has the same weight. This normalized data may then be sorted to establish and assign a value from 1 to 2165, where the scale reflects the highest normalized value going to the top of the sort. For example, if dataset 2000 had the highest value in the array, it would be assigned a value of 1, with descending values reflecting the datasets that have lower values. The second normalized array may then be converted to a third simulated DNA sequencing array, where the position is established within this third array based upon its value in the previous sort. The third array assesses the positions and calculates the differences between where the datasets appear in the sequence (e.g., ranking distance). This improved method thus analyzes traits, based upon the uniqueness of traits within the face, as opposed to relying merely on scalar values.

[86] At block 620, test face samples. For example, the detection module 502 may optionally test face samples.

[87] At block 622, extract LBP and LTP features. For example, the detection module 502 and/or the face detection module 504 may extract LBP and LTP features from different blocks in different positions of each face sample.

[88] Further, online recognition may include the following steps:

[89] (1) Calculate the key features, extracted in the offline stage, of different blocks in different positions for the face sample to be identified.

[90] (2) Compare the key features selected from step (1) with those of each face sample in the database and determine whether they belong to the same person or not. If the calculated distance is less than the set threshold, it may be determined that they are the same person; otherwise, it may be determined that they are not.

[91] As shown in FIG. 8, an example process starts with creating a face database with different postures and different expressions. For example, one may include the images of, e.g., 1,000 different persons, with each person showing, e.g., 10 different images. FIG. 7(a) shows a face image of a person, and FIG. 7(b) shows a different face image of the same person.

[92] LBP and LTP may be used to describe the face. FIG. 8 shows the calculating process for LBP features, and FIG. 9 shows the calculating process for LTP features. In order to obtain as many features as possible to describe the face information, blocks of different sizes may be placed at different positions of the face sample. For example, the face size can be 100 x 100, the block size may be w x h,
w and h values can range from 2 to 100, and 7837 blocks may be selected as a result. The identification system 100 may select the bin features of LBP and LTP on the different block sizes and use these as the final candidate feature database.

[93] The next step is to calculate positive samples and negative samples. The bin feature absolute value distance of the same position for different images from the same person may be calculated and set as a positive sample. Additionally, the bin feature absolute value distance of the same position for different persons may be calculated and set as a negative sample. For example, the result may involve calculating 32356 positive samples and 58747698 negative samples.

[94] Thereafter, the key bin features that can distinguish all positive and negative samples among the large candidate feature database may be selected with a learning algorithm. For example, one may choose the discrete adaboost learning algorithm to select features and build a classifier.

[95] An example algorithm of using adaboost to classify may include the following computational steps:

[96] 1. Given $f$ as the maximum negative sample error rate, $d$ as the minimum positive sample correct rate, $F_{tar}$ as the target negative sample error rate, and $D_{tar}$ as the target positive sample correct rate that the cascade classifier has to achieve. $P$ and $N$ are the positive and negative databases, respectively.

[97] 2. Set $F_0 = 1.0$, $D_0 = 1.0$, and $i = 0$.

[98] 3. While $F_i > F_{tar}$: set $i = i + 1$, $n_i = 0$, and $F_i = F_{i-1}$; then, while $F_i > f \times F_{i-1}$, set $n_i = n_i + 1$ and perform step 4.

[99] 4. Compute the strong classifier with $n_i$ features via adaboost on databases $P$ and $N$; calculate $F_i$ and $D_i$ for the current cascade classifier, and adjust the threshold value of the current strong classifier until the positive sample correct rate is no less than $d \times D_{i-1}$; then set $N$ to the empty set.

[100] 5. If $F_i > F_{tar}$, run the currently obtained cascade classifier on other negative sample images and put any wrongly classified images into $N$.

The discrete adaboost procedure used in step 4 may proceed as follows:

[101] 1) Given $n$ training samples $(x_1, y_1), \ldots, (x_n, y_n)$, where $y_i = 0$ and $y_i = 1$ denote the negative and positive sample labels, respectively.

[102] 2) Initialize the weights
$$w_{1,i} = \frac{1}{2m} \text{ for } y_i = 0, \qquad w_{1,i} = \frac{1}{2l} \text{ for } y_i = 1,$$

where the number of positive samples is $l$ and the number of negative samples is $m$.

[103] 3) For $t$ from 1 to $T$, run the below steps repeatedly:

[104] a) Normalize the weights:

$$w_{t,i} \leftarrow \frac{w_{t,i}}{\sum_{j=1}^{n} w_{t,j}}$$

[105] b) Compute a weak classifier $h_j$ for each feature $f_j$ and mark the error rate of this classifier:

$$\epsilon_j = \sum_{i} w_i \, \lvert h_j(x_i) - y_i \rvert$$

[106] c) Find the classifier $h_t$ with the lowest error rate $\epsilon_t$ among all weak classifiers computed in the last step.

[107] d) Update the weights:

$$w_{t+1,i} = w_{t,i} \, \beta_t^{1 - e_i},$$

among which

$$\beta_t = \frac{\epsilon_t}{1 - \epsilon_t}.$$

If $x_i$ is correctly classified, $e_i = 0$; otherwise, $e_i = 1$.

[108] Lastly, obtain the strong classifier: if

$$\sum_{t=1}^{T} \alpha_t h_t(x) \geq \frac{1}{2} \sum_{t=1}^{T} \alpha_t,$$

then $h(x) = 1$; otherwise $h(x) = 0$. There,

$$\alpha_t = \log \frac{1}{\beta_t}.$$
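As an illustration of the weight-update mechanics in paragraphs [101]-[108], the following is a minimal sketch of discrete adaboost over simple threshold stumps. It is illustrative only and not the patented implementation: the stump form, the single mean-based threshold candidate per feature, and all names are simplifying assumptions standing in for the LBP/LTP bin-distance features described above.

```python
import numpy as np

def train_adaboost(features, labels, num_rounds):
    """Discrete adaboost over threshold stumps (illustrative sketch).

    features: (n_samples, n_features) array of scalar feature values
              (hypothetical stand-ins for LBP/LTP bin distances).
    labels:   array of 0 (negative sample) or 1 (positive sample).
    Returns a list of weak classifiers (feature_index, threshold,
    polarity, alpha) that together form the strong classifier h(x).
    """
    _, d = features.shape
    m = np.sum(labels == 0)  # number of negative samples
    l = np.sum(labels == 1)  # number of positive samples
    # 2) Initialize weights: 1/(2m) for negatives, 1/(2l) for positives.
    w = np.where(labels == 0, 1.0 / (2 * m), 1.0 / (2 * l))
    strong = []
    for _ in range(num_rounds):
        w = w / w.sum()  # a) normalize the weights
        best = None
        for j in range(d):  # b) one crude stump per feature
            threshold = features[:, j].mean()  # single candidate threshold
            for polarity in (1, -1):
                pred = polarity * features[:, j] < polarity * threshold
                err = np.sum(w * np.abs(pred.astype(float) - labels))
                if best is None or err < best[0]:
                    best = (err, j, threshold, polarity, pred)
        eps, j, threshold, polarity, pred = best  # c) lowest-error classifier
        eps = min(max(eps, 1e-10), 1 - 1e-10)     # guard against divide-by-zero
        beta = eps / (1.0 - eps)
        e = (pred != labels).astype(float)        # e_i = 0 if correct, else 1
        w = w * beta ** (1 - e)                   # d) update the weights
        strong.append((j, threshold, polarity, np.log(1.0 / beta)))
    return strong

def classify(strong, x):
    """h(x) = 1 if sum_t alpha_t * h_t(x) >= 0.5 * sum_t alpha_t."""
    total = sum(a * float(p * x[j] < p * th) for j, th, p, a in strong)
    return int(total >= 0.5 * sum(a for _, _, _, a in strong))
```

Under these assumptions, `train_adaboost` mirrors steps a) through d) above, and `classify` applies the strong-classifier inequality of paragraph [108]; a production feature selector would search many candidate thresholds per feature rather than the single mean used here.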
[109] FIG. 10 shows the positions of the first three key features selected among face images, obtained by online testing on a face database of 100 persons based on the offline-selected features and classifier.

[110] FIG. 11 shows recognition results for the 100 persons, wherein the X axis represents the false accept rate, which means the rate of face samples wrongly identified. The Y axis represents the verification rate, which means the rate of face samples correctly recognized. As shown in FIG. 11, when the false accept rate is below $10^{-4}$, the system may achieve a 95% recognition rate. The face recognition in this example not only improves the robustness to face sample variation, but also reduces the computational complexity, thus improving the face recognition significantly.

[111] Referring back to FIG. 5, in some aspects, the detection module 502 may be configured to use, among other features, a face detection module 504 and an eye detection module 506 for processing the acquired image of the person-to-be-identified as follows.

[112] Face Detection Module 504

[113] Inputs: Acquired frontal face image (grey image), face classifier

[114] Outputs: Face frame positions, and the number of faces

[115] Flow:

[116] a. Reduce the acquired frontal face image to a user-defined size

[117] b. Calculate an integral image of the reduced image

[118] c. Initialize a traverse window based on the size defined by the face classifier, e.g., 20 x 20

[119] d. Move the traverse window on the integral image from left to right and then from top to bottom, with each move distance corresponding to a user-defined distance. However, if
the user-defined distance is zero, set the move distance to 1/20 of the width of the traverse window.

[120] e. Use the face classifier to determine whether the current position of the traverse window defines a valid portion of a face. If so, save the current rectangular frame position of the traverse window as a result.

[121] f. After traversing the entire integral image, increase the width and the length of the traverse window by 1.1 times and repeat step e until the size of the traverse window exceeds the size of the image, or the buffer allocated for saving the results is used up.

[122] g. Return the face frame positions and the number of faces

[123] Eye Detection Module 506

[124] Inputs: Acquired frontal face image (grey image), face frame positions, classifier for both left and right eyes, left eye classifier, right eye classifier, left eye coarse detection classifier, right eye coarse detection classifier

[125] Outputs: Frame position for both eyes, frame position of the left eye, and frame position of the right eye

[126] Flow:

[127] a. Obtain the face image from the acquired frontal face image

[128] b. If a user-defined classifier for both left and right eyes is available, use the correspondingly defined face detection function to detect both the left and right eyes of the obtained face image. If not, estimate the positions of both the left and right eyes based on experience.

[129] c. If a user-defined left/right eye coarse detection classifier is available, detect the left/right eye on the corresponding half of the obtained face image. Further, based on the coarse detection result, determine whether the detected human subject is wearing glasses or not. If glasses are present, detect the obtained face image and return with results. If no glasses are present, continue to detect the obtained face image based on the coarse detection result and
return the detection result without considering the presence of glasses. (If a user-defined classifier for glasses-wearing subjects is not available, detect the obtained face image without considering the presence of glasses.)

[130] d. If user-defined coarse detection classifiers are not available, determine whether glasses are present by directly detecting the left/right half of the obtained face image. If glasses are present, detect the obtained face image and return with results. If no glasses are present, continue to detect the obtained face image based on the coarse detection result and return the detection result without considering the presence of glasses. (If a user-defined classifier for glasses-wearing subjects is not available, detect the obtained face image without considering the presence of glasses.)

[131] e. Return

[132] In some aspects, the processor 106 may further use, e.g., a recognition module 508, to extract pertinent facial features obtained from the detection module 502 for comparing against the known characteristics and/or information of a number of authorized people, as follows.

[133] Recognition Module 508

[134] Normalization

[135] Inputs: To-be-normalized image (grey image), and the coordinates of the centers of both the left and right eyes on the image axis (the origin is located at the top left corner of the image). The meanings of the parameters: lx refers to the x coordinate of the center point of the left eye (horizontal direction) in the output image divided by the width of the output image, and ly refers to the y coordinate of the center point of the left eye (vertical direction) in the output image divided by the height of the output image.

[136] Output: Output image

[137] Feature Extraction

[138] Inputs: Normalized image (grey image) and feature types

[139] Outputs: If the output buffer is NULL, return the feature dimensional degrees. Otherwise, assume the size of the output buffer equals the feature dimensional degrees, write the features of the image into the buffer, and return the feature dimensional degrees. Certain features are associated with certain image sizes. For example, feature #6 may require an image size of 100 by 100. Therefore, when the input image fails the corresponding defined image size requirement, a result of zero can be returned.

[140] Feature Comparison

[141] Inputs: Two features to be compared and the comparison method

[142] Output: The smaller the comparison result (a floating point), the higher the similarity.

[143] Obtaining Algorithm Information

[144] Function: Instruct the requester to correctly assign parameters for the algorithm

[145] Input: Algorithm type based on the usage context
[146] Outputs: Parameter information of the algorithm, including feature type, feature dimensional degrees, normalized image size, the minimum distance, suggested range, and distance type.

[147] Many of the systems and methods described above can be used to create Biometric Signature Data ("BSD") files that allow a system to identify and distinguish requesters with a high degree of accuracy. Various implementations of the present disclosure may employ the BSD files to create an encryption/decryption key, thus increasing the security of such keys. Examples of the present disclosure can generate asymmetric keys based on one or more BSD files in such a way that, by utilizing a biometric sensor, a person's biometric measurement can act as the person's private key. Implementations of the present disclosure may also incorporate BSD files into digital rights management (DRM) security in such a way that files cannot be decrypted or accessed by anyone other than the requester or group of requesters intended, or encrypted in a way that the original owners, such as a business, can no longer access the files. Accordingly, by using implementations of the present disclosure employing BSD files, when a file is accessed, there can be assurance of the identity of the requester who accessed the file.

[148] BSD files can be generated by the algorithmic analysis of data from an A/D IR and/or UV sensor. Accordingly, many of these elements can be considered when constructing the private key of the asymmetrical pair (i.e., analog and/or digital values). Thus, in some implementations of the present disclosure, multiple elements of a sensor can contribute real-time data or real-time analog data related to a recognition event in order to decrypt, thus ensuring that a real-time event (i.e., the actual measurement of the intended person) has triggered the authentication.

[149] As shown in FIGS. 12-13, in accordance with some implementations of the present disclosure, messages can be sent as follows. A requester can register, e.g., on a computer, and create a public key for the requester. The requester can then publish the public key so that the key is publicly known. Other people, systems, or entities can use the requester's public key to encrypt messages for the requester and send those messages to the requester. The requester can decrypt the message using her private key created by one or more live BSD files associated with the requester. Accordingly, the sender of the message is ensured that the requester is actually the person decrypting the message because the private key used to decrypt the message can be generated by the requester's live biometric data. These systems and methods for encryption provide substantial advantages over conventional systems and
methods. For example, instead of simply matching anonymous asymmetrical codes, by using BSD files in the encryption process, authentication becomes inherent in the key itself.

[150] Various implementations of the present disclosure can also improve DRM. For example, DRM rules can allow for additional content to be added to a file and additional rules to be required. DRM rules can be expressed in many rights management languages known in the art, including, but not limited to, XrML (extensible rights markup language), XMCL (extensible media commerce language), ODRL (open digital rights language), and the like. Rules can specify the actions that are permitted (e.g., decrypting, encrypting, transferring, copying, editing, etc.). The rules can also specify the people authorized to perform actions and the conditions under which these actions are permitted. BSD files can be used to authenticate a requester to determine whether the requester is one of the people specified in the rules.

[151] Various systems and methods for biometric encryption and authentication can also find application in corporate settings where, e.g., employees use corporate devices for personal use as well as business, or when, e.g., an employee uses a personal device and the corporate digital assets are transferred to and from the personal device. By applying rules to documents that have certain digital signatures, there may be controllable segmentation between private and business concerns. Both parties may have access to the parts they are entitled to access but can be prevented from accessing the parts they are not entitled to access. For example, possible applications include, but are not limited to, providing remote access, making purchases, and conditional security.

[152] In the case of remote access, various implementations of the present disclosure can generate BSD files used to authenticate a requester, thus providing secure access for any remote network connection, e.g., a VPN server, secure access to network email, and/or company proprietary information, from a remote device.

[153] Additionally, the biometric authentication techniques of the present invention can be used to make authenticated online purchases/transactions. For example, spending limits can be based on a requester or group profile for an account. In order for a requester to make a purchase, a system can use the biometric authentication techniques of the present disclosure to authenticate the true identity of that requester to verify the requester is entitled to make the desired purchase.

[154] Biometric authentication techniques can also be used to provide conditional security for various digital files. For example, files that contain sensitive information can only be
accessed by authorized requesters, who can be authenticated using their live BSD files.

[155] Biometric Encryption and Authentication Application to Digital Cinema

[156] The biometric encryption and authentication techniques described herein find many applications in the digital cinema industry. Movies are popular commodities, especially pre-DVD release. In order to maximize both production efficiencies and distribution opportunities, movies need to be accessed and handled by many different strata of requesters. Persons skilled in the art appreciate that techniques capable of protecting digital assets in the digital cinema industry can be used to protect digital assets in almost any industry. Accordingly, the principles described herein are not limited to application in the digital cinema industry, but may instead be applied to any industry for a similar purpose.

[157] Digital cinema security may be viewed as an end-to-end process from production via distribution to consumption. SMPTE DC28, the body responsible for digital cinema standards, has identified five separate areas of digital cinema: (1) capture; (2) production; (3) master (cinema, home, video, trailers, test screenings); (4) distribution (satellite, fiber, packaged); and (5) exhibition (digital projector security). In each area identified by SMPTE DC28, a movie is vulnerable to theft. In order to discourage theft, movies can be encrypted prior to distribution. Movies are then typically stored in their encrypted state in the theater until showtime. At showtime, the movie is decrypted and decompressed. This decryption/decompression may take place in a server or in a projector.

[158] In an exemplary SMPTE DC28 process, DC28.4 can represent the conditional access portions of the cinema delivery system. Modern DRM encryption methods have proven sufficient to withstand unwarranted deciphering attempts, but securing the keys has become a problem. From capture to exhibition to distribution, a movie is encrypted and decrypted multiple times. Accordingly, the various biometric encryption and authentication techniques discussed herein can be applied to one or more of the encryption, decryption, and authentication steps, in accordance with various implementations of the present disclosure.

[159] Referring to FIG. 14, an example of an environment 1400 for managing data may rely on a first gateway 1402a, a second gateway 1402b, and a third gateway 1402c to route data via wired and/or wireless communication links. The first gateway 1402a may be implemented as a software-based gateway virtualized in a computer system. The second gateway 1402b may be a standalone device that routes data as described below. The third gateway 1402c may be a cloud-based gateway. Other architectures are possible.
[160] In certain implementations, the gateways 1402 may perform several functions, including managing the movement of data to and from the biometric sensor as described below, providing a networked solution that efficiently moves binary facial data between devices, and, when clustered together (physical and virtual), providing a high availability solution for security designs. The gateway 1402 may receive credentialing data through an XML file structure. By monitoring and actively consuming the XML files, the gateway 1402 may be able to utilize a standardized universal interface agnostic to the programming language, operating system, and type of connectivity of the data source, support physical and logical access control requirements within the same method, offer an interface that supports simultaneous connectivity from different system types, and/or support either live or batch processing of credentials, including the immediate recovery of a credential system through file replacement.

[161] In certain aspects, data stored in the gateways 1402 may be stored in files based on the JavaScript Object Notation (JSON) format.

[162] The environment 1400 may include one or more ECDs 1404, e.g., ECD-a 1404a, ECD-b 1404b, ECD-c 1404c, ECD-d 1404d, ECD-e 1404e, and ECD-f 1404f. The ECDs 1404 may also be referred to as edge capture devices. The ECDs 1404 may transmit data to the gateways 1402 to be routed to another device within the environment 1400. For example, the ECD-a 1404a may send a biometric template and/or identity information of a person to the third gateway 1402c. The third gateway 1402c may send one or more biometric templates to the ECD-a 1404a for performing matching operations.

[163] Still referring to FIG. 14, in certain implementations, the environment 1400 may include an access control server 1406a, an enterprise server 1406b, and a third party server 1406c. The access control server 1406a may be communicatively coupled to one or more access controlled entry points 1408 via a wired or wireless communication link. The enterprise server 1406b may be communicatively coupled to a storage device 1410. The storage device 1410 may be a network drive, local hard drive, flash drive, tape drive, or other suitable storage media.

[164] Still referring to FIG. 14, during operations, for example, the ECD-b 1404b may receive a biometric template of a first requester 1450a. The first requester 1450a may attempt to access one or more access controlled entry points 1408 controlled by the access control server 1406a. The one or more access controlled entry points 1408 may include a vault, lock, secure door, secure gate, equipment or machinery, computing device, digital storage device,
database, or file, for example. In some examples, the one or more access controlled entry points 1408 may be an access controlled door or gate of an infrastructure, such as a warehouse, office building, restricted area, etc. The biometric template of the first requester 1450a may include one or more of the fingerprints, voice patterns, iris patterns, facial features, signature patterns, shapes of the ears, retinal patterns, gait, or hand geometry of the first requester 1450a.

[165] In certain implementations, the ECD-b 1404b may extract the facial features of the first requester 1450a, and compare the facial features with the facial features of authorized personnel. If the facial features of the first requester 1450a match those of one of the authorized personnel, the ECD-b 1404b may send a first positive match signal to the third gateway 1402c. The third gateway 1402c may route the first positive match signal to the access control server 1406a. Upon receiving the first positive match signal, the access control server 1406a may unlock one of the one or more access controlled entry points 1408 associated with the ECD-b 1404b to allow the first requester 1450a access. If the facial features of the first requester 1450a do not match those of any of the authorized personnel, the first requester 1450a may not gain access to the entry point 1408 associated with the ECD-b 1404b.

[166] Still referring to FIG. 14, in certain examples, the ECD-d 1404d may transmit a second positive match signal of a second requester 1450b at a first time and a third positive match signal of the second requester 1450b at a second time to the second gateway 1402b. The second gateway 1402b may route the second positive match signal and the third positive match signal to the first gateway 1402a, which may route the second and third positive match signals to the enterprise server 1406b. The enterprise server 1406b may use the second positive match signal and the third positive match signal to log access information associated with the second requester 1450b. For example, the enterprise server 1406b may record, into the storage device 1410, the first time as the arrival time of the second requester 1450b and the second time as the departure time on a work day. In another example, the enterprise server 1406b may count the first and second times toward the number of accesses to the one or more access controlled entry points 1408 by the second requester 1450b. The enterprise server 1406b may also log, based on the information in the second and third positive match signals, the premises, equipment, files, locations, and information accessed by the second requester 1450b.
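As a concrete illustration of the flow just described, the sketch below shows one way a positive match signal might be serialized as JSON (consistent with the JSON-based storage noted in paragraph [161]) and used to log arrival and departure times. It is a minimal sketch under stated assumptions: every field name, the in-memory stand-in for the storage device 1410, and the first/last-match logging rule are hypothetical, not the patented message format or gateway protocol.

```python
import json
from datetime import datetime, timezone

def make_match_signal(ecd_id, requester_id, entry_point_id):
    """Serialize a positive match signal as JSON; all field names are hypothetical."""
    return json.dumps({
        "ecd": ecd_id,                    # e.g., "ECD-d 1404d"
        "requester": requester_id,        # e.g., "requester 1450b"
        "entry_point": entry_point_id,    # e.g., "entry point 1408"
        "match": True,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

access_log = {}  # in-memory stand-in for the storage device 1410

def log_access(signal_json):
    """Treat the first match of a day as arrival and the latest as departure."""
    signal = json.loads(signal_json)
    day = signal["timestamp"][:10]  # ISO date portion, e.g., "2020-12-29"
    record = access_log.setdefault((signal["requester"], day), {"accesses": 0})
    record.setdefault("arrival", signal["timestamp"])  # keep the first match
    record["departure"] = signal["timestamp"]          # overwrite with the latest
    record["accesses"] += 1

# Example: two matches for the same requester routed from one ECD on one day.
log_access(make_match_signal("ECD-d 1404d", "requester 1450b", "entry point 1408"))
log_access(make_match_signal("ECD-d 1404d", "requester 1450b", "entry point 1408"))
```

In an actual deployment, the gateways 1402, not the logger itself, would route such signals between the ECDs, servers, and firewall, as the surrounding paragraphs describe.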
[167] Referring still to FIG. 14, in certain implementations, the ECD-f 1404f may transmit a fourth positive match signal of the third requester 1450c to the first gateway 1402a. The first gateway 1402a may route the fourth positive match signal to the third party server 1406c through a firewall 1412. The firewall 1412 may filter information transmitted through the firewall 1412 and prevent malicious requesters from gaining unauthorized access. The fourth positive match signal may indicate to the third party server 1406c that the third requester 1450c gained access to the one or more access controlled entry points 1408. For example, the fourth positive match signal may indicate to the third party server 1406c that the third requester 1450c accessed software that requires payment to the owner of the third party server 1406c.

[168] Still referring to FIG. 14, in certain examples, a network administrator 1452 may install, manage, update, maintain, and/or control the software in the ECDs 1404, the gateways 1402, the access control server 1406a, the one or more access controlled entry points 1408, the enterprise server 1406b, the storage device 1410, and/or the firewall 1412 via a workstation 1414. The workstation 1414 may be a desktop computer, laptop computer, tablet computer, handheld computer, smartphone, or other suitable computer device communicatively coupled via a wired or wireless connection to the third gateway 1402c. The network administrator 1452 may transmit software commands from the workstation 1414 to the third gateway 1402c to be routed to a destination. In some examples, the network administrator 1452 may upgrade the firmware in the ECDs 1404. In other examples, the network administrator 1452 may install new software onto the access control server 1406a. In another example, the network administrator 1452 may perform maintenance operations, such as disk error checks and defragmentation, on the storage device 1410. In yet another example, the network administrator 1452 may lock down or open the one or more access controlled entry points 1408 in an emergency.

[169] Referring to FIG. 14, in some implementations, an employee 1454, such as a supervisor, may utilize a requester terminal 1416 to access information through the second gateway 1402b. The requester terminal 1416 may be a desktop computer, laptop computer, tablet computer, handheld computer, smartphone, or other suitable computer device communicatively coupled via a wired or wireless connection to the second gateway 1402b. In some examples, the employee 1454 may download information, such as work hours, arrival times, access history, and utilization frequencies, using the requester terminal 1416.
[170] In certain examples, data exchanges within the environment 1400 may be encrypted. Data transmissions between the gateways 1402 and the ECDs 1404 may use advanced TLS v1.2 communications with a proprietary key management framework. Data transmitted via TLS v1.2 communications may be fully encrypted to remove the threat of exposure of the data to unwanted parties. The encryption may be further protected by using the biometric data as the seed for the generation of the encryption keys. As such, data exchanged within the environment may be difficult for unauthorized requesters to access.

[171] Further, the gateways 1402 may be used to manage the data and the creation of a blockchain for the requesters. The first gateway 1402a, the second gateway 1402b, and the third gateway 1402c may each include a blockchain wallet. The wallets will contain the requester's personal credentials required to authenticate to any device or application. The wallets may be tied to the gateways 1402 to provide cybersecurity monitoring and to provide the interaction between the wallet and the facial recognition devices. The ECDs 1404 linked to the personal blockchain may be able to enable a transaction in the blockchain. Communications between the blockchain wallets, a ledger for the blockchain transactions, and the gateways 1402 may use the blockchain protocol. The personal blockchain will also extend to devices that verify more than one requester, like a bank ATM (not shown). The gateways 1402 will utilize location tracking to move the linked binary facial data and the link to the blockchain ledger between shared devices to improve the security of the transactions and to manage the number of requesters held within each ECD 1404.

[172] Aspects of the present disclosure may be implemented using hardware, software, a cloud network, or a combination thereof, and may be implemented in one or more computer systems or other processing systems. In an aspect of the present disclosure, features are directed toward one or more computer systems capable of carrying out the functionality described herein. An example of such a computer system 1500 is shown in FIG. 15. One or more of the gateways 1402, the servers 1406, the firewall 1412, the workstation 1414, and/or the requester terminal 1416 may be implemented based on the computer system 1500.

[173] Referring now to FIG. 15, the computer system 1500 includes one or more processors, such as the processor 1504. The processor 1504 is communicatively coupled to a communication infrastructure 1506 (e.g., a communications bus, cross-over bar, or network). Various software aspects are described in terms of this example computer system. After
reading this description, it will become apparent to a person skilled in the relevant art(s) how to implement aspects of the disclosure using other computer systems and/or architectures.

[174] The computer system 1500 may include a display interface 1502 that forwards graphics, text, and other data from the communication infrastructure 1506 (or from a frame buffer, not shown) for display on a display unit 1530. The computer system 1500 also includes a main memory 1508, preferably random access memory (RAM), and may also include a secondary memory 1510. The secondary memory 1510 may include, for example, a hard disk drive 1512 and/or a removable storage drive 1514, representing a floppy disk drive, magnetic tape drive, optical disk drive, universal serial bus (USB) flash drive, etc. The removable storage drive 1514 reads from and/or writes to a removable storage unit 1518 in a well-known manner. The removable storage unit 1518 represents a floppy disk, magnetic tape, optical disk, USB flash drive, etc., which is read by and written to by the removable storage drive 1514. As will be appreciated, the removable storage unit 1518 includes a computer usable storage medium having stored therein computer software and/or data.

[175] Alternative aspects of the present disclosure may include secondary memory 1510 and may include other similar devices for allowing computer programs or other instructions to be loaded into the computer system 1500. Such devices may include, for example, a removable storage unit 1522 and an interface 1520. Examples of such may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an erasable programmable read only memory (EPROM) or programmable read only memory (PROM)) and associated socket, and other removable storage units 1522 and interfaces 1520, which allow software and data to be transferred from the removable storage unit 1522 to the computer system 1500.

[176] The computer system 1500 may also include a communications interface 1524. The communications interface 1524 allows software and data to be transferred between the computer system 1500 and external devices. Examples of the communications interface 1524 may include a modem, a network interface (such as an Ethernet card), a communications port, a Personal Computer Memory Card International Association (PCMCIA) slot and card, etc. Software and data transferred via the communications interface 1524 are in the form of signals 1528, which may be electronic, electromagnetic, optical, or other signals capable of being received by the communications interface 1524. These signals 1528 are provided to the communications interface 1524 via a communications path (e.g., channel) 1526. This path 1526 carries the signals 1528 and may be implemented using one or more of a wire or cable, fiber optics, telephone
line, cellular link, RF link, and/or other communications channels. In this document, the terms “computer program medium” and “computer usable medium” are used to refer generally to media such as the removable storage unit 1518, a hard disk installed in the hard disk drive 1512, and the signals 1528. These computer program products provide software to the computer system 1500. Aspects of the present disclosure are directed to such computer program products.

[177] Computer programs (also referred to as computer control logic) are stored in the main memory 1508 and/or the secondary memory 1510. Computer programs may also be received via the communications interface 1524. Such computer programs, when executed, enable the computer system 1500 to perform the features in accordance with aspects of the present disclosure, as discussed herein. In particular, the computer programs, when executed, enable the processor 1504 to perform the features in accordance with aspects of the present disclosure. Accordingly, such computer programs represent controllers of the computer system 1500.

[178] In an aspect of the present disclosure where the method is implemented using software, the software may be stored in a computer program product and loaded into the computer system 1500 using the removable storage drive 1514, the hard disk drive 1512, or the communications interface 1524. The control logic (software), when executed by the processor 1504, causes the processor 1504 to perform the functions described herein. In another aspect of the present disclosure, the system is implemented primarily in hardware using, for example, hardware components such as application specific integrated circuits (ASICs). Implementation of the hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s).

[179] FIG. 16 illustrates a block diagram of various example system components, in accordance with an aspect of the present disclosure. FIG. 16 shows a communication system 1600 usable in accordance with aspects of the present disclosure. The communication system 1600 includes one or more accessors 1660, 1662 and one or more terminals 1642, 1666. In one aspect, data for use in accordance with aspects of the present disclosure is, for example, input and/or accessed by the accessors 1660, 1662 via the terminals 1642, 1666, such as personal computers (PCs), minicomputers, mainframe computers, microcomputers, telephonic devices, or wireless devices, such as personal digital assistants (“PDAs”) or hand-held wireless devices, coupled to a server 1643, such as a PC, minicomputer, mainframe computer, microcomputer, or other device having a processor and a repository for data and/or
connection to a repository for data, via, for example, a network 1644, such as the Internet or an intranet, and couplings 1645, 1646, 1664. The couplings 1645, 1646, 1664 include, for example, wired, wireless, and/or fiber-optic links. In another example variation, the method and system in accordance with aspects of the present disclosure operate in a stand-alone environment, such as on a single terminal.

[180] Turning now to FIG. 17, an example of an ECD 1404 that is configured to perform access control may analyze a biometric template of a requester 1450 to determine whether the requester 1450 is authorized to gain access to an entry point (not shown) associated with the ECD 1404. The ECD 1404 may include an optical sensor 1702, an illumination source 1704, a display 1706, a keypad 1708, and a scanner 1710.

[181] In some implementations, the optical sensor 1702 may be configured to capture still or moving images. For example, the optical sensor 1702 may capture the fingerprints, the iris patterns, the facial features, the signature patterns, the shapes of the ears, the retinal patterns, the gait, and/or the hand geometry of the requester 1450. In some examples, the optical sensor 1702 may be a broadband camera configured to detect electromagnetic radiation having wavelengths ranging from 200 nanometers (e.g., soft UV) to 2000 nanometers (e.g., near infra-red (NIR)). In a particular example, the optical sensor 1702 is configured to detect radiation between 700 and 900 nanometers. In certain implementations, the optical sensor 1702 may include a motion sensor configured to detect people approaching the ECD 1404.

[182] In certain examples, the optical sensor 1702 may include a wide angle lens (e.g., a fisheye lens) used to provide both vertical coverage to capture faces across the full range of human heights (as well as addressing Americans with Disabilities Act requirements) and horizontal coverage of the complete area of an access point, to address more than one person accessing the secure area on one authentication. In other examples, the optical sensor 1702 may employ a high resolution (e.g., megapixel per square inch) charge coupled device (CCD) array. The high resolution array may provide the ability to identify faces at a greater distance from the sensor. The ECD 1404 may take advantage of the increased identification distance to pre-identify requesters in queuing situations. Potential requesters may be identified as they enter a queuing area and placed in a priority list to accelerate the confirmation at the access controlled entry point and the discharge of the queue. The pre-identification and prioritizing of individuals at the access point enables high volume throughput at the entry point by reducing the identification time during the identification and authentication process.
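The pre-identification mechanism just described amounts to a lookup structure populated ahead of the access decision. The following is a minimal, hypothetical sketch (the class and method names are illustrative, not from the disclosure) of such a priority list: requesters recognized at a distance while still in the queuing area are stored so that confirmation at the entry point reduces to a fast lookup, with full verification as the fallback.

```python
from collections import OrderedDict

class PreIdentificationQueue:
    """Hypothetical priority list for requesters pre-identified in a queue."""

    def __init__(self):
        # requester_id -> pre-computed match data, in arrival order
        self._pending = OrderedDict()

    def pre_identify(self, requester_id, match_data):
        """Called as a face is identified at a distance in the queuing area."""
        self._pending[requester_id] = match_data

    def confirm(self, requester_id):
        """Fast confirmation at the access point; returns None when the
        requester was not pre-identified, in which case full verification
        would run instead."""
        return self._pending.pop(requester_id, None)
```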
[183] In certain implementations, the illumination source 1704 may emit electromagnetic radiation having wavelengths ranging from 200 nanometers to 2000 nanometers. In certain examples, the illumination source 1704 may emit non-visible radiation between 700 and 900 nanometers and/or between 200 and 300 nanometers. The illumination source 1704 may emit radiation to illuminate bodily features and patterns of the requester 1450 used for biometric analysis (analyzing a biometric template to determine access rights). The emitted radiation may impinge on a portion of a body of the requester 1450 and reflect off of the portion of the body. The reflected radiation may be captured by the optical sensor 1702 for biometric analysis.

[184] In some implementations, the display 1706 may present useful information to the requester 1450. For example, the display 1706 may show one or more images of a face 1730 of the requester 1450, captured by the optical sensor 1702, to assist the requester 1450 in aligning the face 1730 during biometric analysis. In another example, the display 1706 may notify the requester 1450 of a status of the entry point associated with the ECD 1404 (e.g., locked down, temporarily unavailable, normal operations, under maintenance). In yet another example, the display 1706 may display information such as time, date, weather, current location, etc.

[185] Still referring to FIG. 17, the keypad 1708 may allow the requester 1450 to enter numbers, letters, and symbols into the ECD 1404. In an example, the requester 1450 may enter a password in addition to the biometric analysis to gain access to the entry point.

[186] In some implementations, the scanner 1710 may be a radio frequency identification (RFID) scanner, a proximity card scanner (e.g., an HID™ card scanner), a contact card scanner, or a magnetic card scanner. In an example, the scanner 1710 may send an interrogatory signal to a proximity card (not shown) having a coil and an integrated circuit with a programmable or non-programmable identification sequence. The interrogatory signal may be “absorbed” by the coil and may energize the integrated circuit. In response, the energized integrated circuit sends a response signal including the identification sequence back to the scanner 1710 via the coil. The scanner 1710, in turn, analyzes the identification sequence to determine whether or not to grant access. The identification sequence may be one or more numbers, letters, symbols, and/or a combination thereof.

[187] Still referring to FIG. 17, in an implementation, the requester 1450 may approach the ECD 1404 during operations. The optical sensor 1702 may detect the requester 1450. In response to the detection, the illumination source 1704 may emit incident NIR radiation 1760
toward the requester 1450. The incident NIR radiation 1760 may impinge on the face 1730 of the requester 1450 and reflect off of the face 1730 of the requester 1450. In some implementations, the optical sensor 1702 may detect detected NIR radiation 1762 originating from the surface of the face 1730. The detected NIR radiation 1762 may include reflected incident NIR radiation 1760 and/or NIR radiation emitted from the requester 1450 due to thermal heating (i.e., black body radiation). The intensity and distribution of the detected NIR radiation 1762 may depend on the intensity and angle of the incident NIR radiation 1760, the contour of the face 1730, the angle of detection by the optical sensor 1702, and other factors. The ECD 1404 may use the detected NIR radiation 1762 to construct a facial template (the “NIR sampled profile”) of the requester 1450. The ECD 1404 may compare the constructed NIR sampled profile with existing templates stored therein (details described below). If the ECD 1404 detects a match, the ECD 1404 may allow the requester 1450 access to the entry point (as described above).

[188] In some examples, a NIR sampled profile generated via NIR radiation detection may be resistant to changes in ambient lighting. As ambient lighting fluctuates (e.g., changes in luminance, color, color temperature, lighting angle), a NIR sampled profile constructed using NIR radiation detection may remain sufficiently constant to prevent a false acceptance or a false rejection. For example, the NIR sampled profile of the requester 1450 constructed via NIR radiation detection under a “bright” condition (e.g., 1000 lux) may be substantially identical to the NIR sampled profile constructed via NIR radiation detection under a “dark” condition (e.g., 100 lux). In another example, the NIR sampled profile of the requester 1450 constructed via NIR radiation detection under substantially blue light (e.g., CIE coordinates x=0.153, y=0.100) may be substantially identical to the NIR sampled profile constructed via NIR radiation detection under substantially white light (e.g., CIE coordinates x=0.30, y=0.33).

[189] Still referring to FIGs. 14 and 17, in other implementations, a NIR sampled profile generated via NIR radiation detection may improve the privacy of the owner. For example, the storage device 1410 may store NIR sampled profiles of employees and associated confidential information (e.g., birthdates, email account passwords). If an unauthorized person gains access to the NIR sampled profiles and the associated confidential information, the unauthorized person may not be able to exploit the stolen confidential information because it may be difficult to identify the employees based on the NIR sampled profiles. Given that the NIR sampled profiles are constructed using, for example, detected NIR
radiation, they may be unrecognizable because the NIR sampled profile of a person may be drastically different from the visual image of the face of the same person.

[190] In another implementation, the requester 1450 may approach the ECD 1404 during operations. The optical sensor 1702 may detect the requester 1450. In response to the detection, the illumination source 1704 may emit incident UV radiation 1760 (e.g., electromagnetic radiation having wavelengths between 315 and 390 nanometers) toward the requester 1450. The incident UV radiation 1760 may impinge on the face 1730 of the requester 1450 and reflect off of the face 1730 of the requester 1450. In certain examples, the incident UV radiation 1760 may penetrate the surface of the face 1730 and reflect off of subdermal features of the face 1730 (e.g., dermis, subcutaneous tissue, muscles, imperfections). In some implementations, the optical sensor 1702 may detect detected UV radiation 1762 originating from the surface and/or subdermal features of the face 1730. The detected UV radiation 1762 may include reflected incident UV radiation 1760. The intensity and distribution of the detected UV radiation 1762 may depend on the intensity and angle of the incident UV radiation 1760, the contour of the face 1730, the angle of detection by the optical sensor 1702, and other factors. The ECD 1404 may use the detected UV radiation 1762 to construct a facial template (the “UV sampled profile”) of the requester 1450. The ECD 1404 may compare the constructed UV sampled profile with existing templates stored therein (details described below). If the ECD 1404 detects a match, the ECD 1404 may allow the requester 1450 access to the entry point (as described above).

[191] In some examples, a UV sampled profile generated via UV radiation detection may be resistant to changes in ambient lighting. As ambient lighting fluctuates (e.g., changes in luminance, color, color temperature, lighting angle), a UV sampled profile constructed using UV radiation detection may remain sufficiently constant to prevent a false acceptance or a false rejection. For example, the UV sampled profile of the requester 1450 constructed via UV radiation detection under a “bright” condition (e.g., 1000 lux) may be substantially identical to the UV sampled profile constructed via UV radiation detection under a “dark” condition (e.g., 100 lux). In another example, the UV sampled profile of the requester 1450 constructed via UV radiation detection under substantially blue light (e.g., CIE coordinates x=0.153, y=0.100) may be substantially identical to the UV sampled profile constructed via UV radiation detection under substantially white light (e.g., CIE coordinates x=0.30, y=0.33).
[192] Still referring to FIGs. 14 and 17, in other implementations, a UV sampled profile generated via UV radiation detection may improve the privacy of the owner. For example, the storage device 1410 may store UV sampled profiles of employees and associated confidential information (e.g., birthdates, email account passwords). If an unauthorized person gains access to the UV sampled profiles and the associated confidential information, the unauthorized person may not be able to exploit the stolen confidential information because it may be difficult to identify the employees based on the UV sampled profiles. Given that the UV sampled profiles are constructed using, for example, detected UV radiation, they may be unrecognizable because the UV sampled profile of a person may be drastically different from the visual image of the face of the same person. A sampled profile may be a rendering of the dataset that visualizes the consistency of position within the three arrays (described below). If the sampled profile is compromised, it may be more difficult to obtain the biometric information used to identify the individual.

[193] In some examples, the ECD 1404 may generate a visible-light sampled profile based on detected visible light reflected from the face 1730 of the requester 1450.

[194] In certain implementations, the requester 1450 may be asked to provide a password, a personal identification number (PIN), and/or a valid HID™ card to be used in conjunction with the constructed sampled profile to gain access to the entry point associated with the ECD 1404.

[195] In other implementations, the ECD 1404 may include a microphone (not shown in FIG. 17) to perform voice recognition.

[196] In some examples, the display 1706 may show the face 1730 of the requester 1450 as imaged by the optical sensor 1702. The display 1706 may include alignment marks (not shown) to assist the requester 1450 in aligning the face 1730 with respect to the optical sensor 1702. This alignment process may minimize false acceptances and false rejections due to misalignment. In certain implementations, the ECD 1404 may be a pocket-sized mobile device powered by rechargeable batteries.

[197] Referring to FIG. 18, in some implementations, the ECD 1404 may include a processor 1802 having a communication module 1852 configured to communicate with the gateways 1402 and other ECDs 1404 as described in this disclosure. The communication module 1852 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof. The processor 1802 may also include a security module 1854 configured to encrypt and/or
decrypt data. The security module 1854 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof. The processor 1802 further includes an algorithm module 1856 for constructing and comparing biometric templates as described throughout this disclosure. Alternatively, the communications with the gateway may be facilitated by a processor in the control panel. The algorithm module 1856 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof. The processor 1802 may further include a parallel computation module 1858 for performing distributed processing. The parallel computation module 1858 may be implemented, for example, as hardware in the processor 1802, as software code executed by the processor 1802, or a combination thereof. The processor 1802 may include one or more processors or cores, and may be implemented as a semiconductor processor, a graphical processing unit, a field programmable gate array, a programmable logic device, a processing cluster, an application specific integrated circuit, or other suitable architectures.

[198] The ECD 1404 includes a memory 1804. The memory 1804 may be static or dynamic memory such as flash memory, random access memory, magnetic memory, or semiconductor memory. The memory 1804 may include external memory such as cloud storage. The memory 1804 may include or store applications and/or computer executable code. The ECD 1404 further includes a modem 1808 for communicating with the gateways 1402 and other ECDs 1404, and may operate in cooperation with the communication module 1852. The ECD 1404 also includes a RAM 1806, such as static or dynamic random access memory. The ECD 1404 may also include an Input/Output (I/O) device 1810 communicatively coupled to the display 1706, the optical sensor 1702, the keypad 1708, and the scanner 1710. The components within the ECD 1404 may be interconnected by an internal bus 1812a. The processor 1802, the memory 1804, the RAM 1806, and the internal bus 1812a may be disposed on a processing board 1820.

[199] Referring now to FIG. 19, another example of the ECD 1404 may include a number of processing boards 1820 interconnected by an external bus 1812b. Each processing board 1820 may include the processor 1802, the memory 1804, the RAM 1806, and the internal bus 1812a. Data distributed among the processing boards 1820 may be distributed by a controller 1814 via the external bus 1812b. In a non-limiting example, the ECD 1404 may download (from the gateways 1402) 240,000 biometric templates of potential requesters. The controller 1814 may distribute the 240,000 biometric templates evenly or unevenly among the
processing boards 1820 such that each processing board 1820 may store, in the respective memory 1804, 30,000 biometric templates. In some implementations, each processing board 1820 may store the same or a different number of biometric templates.

[200] Still referring to FIG. 19, in an example, during operations, the ECD 1404 may construct a sampled profile (e.g., UV or NIR) of the requester 1450 based on the detected NIR radiation 1762. The controller 1814 may distribute copies of the constructed sampled profile to each of the processing boards 1820. In some implementations, the processing boards 1820 may simultaneously compare the duplicated sampled profiles with the locally stored biometric templates (e.g., 30,000 stored in each processing board 1820). While the current example of the ECD 1404 shown in FIG. 19 includes eight processing boards 1820, other numbers of processing boards 1820 may also be used. For example, the ECD 1404 may include 2, 4, 6, 8, 12, 16, 32, or 64 processing boards 1820.

[201] In an implementation, the ECD 1404 may rely on remote processing boards (not shown) to perform the distributed computing described above. For example, the ECD 1404 may send the duplicated sampled profiles to the remote processing boards to jointly and simultaneously implement the algorithm (described below) for matching the duplicated sampled profiles (or the numerical representations of the duplicated sampled profiles) to known biometric templates. The processing boards may be within other ECDs within the network. In certain examples, the ECD 1404 may send the duplicated sampled profiles to a Beowulf cluster. The clustering design may employ inter-process and inter-processor protocols to share processing tasks of the same application between both processors and the processing cores within those processors. In the case of facial recognition, processor-intensive operations like facial verification may utilize multiple processors or multiple cores to complete the operation within an acceptable period of time. In some cases, large multi-core processors may be used for this type of operation. In other cases, the operation may be spread across several advanced RISC machine (“ARM”) processors to accomplish the same performance as a single multi-core processor without the high hardware cost and the potential for a single point of failure.

[202] Referring back to FIG. 14, in a non-limiting example of distributed parallel comparison, the first gateway 1402a may download 600,000 biometric templates of potential requesters (e.g., employees and contractors) from the enterprise server 1406b. The first gateway 1402a may distribute 100,000 biometric templates to each of the ECD-e 1404e and the ECD-f 1404f, and 200,000 biometric templates to each of the second gateway 1402b and
the third gateway 1402c. Next, the second gateway 1402b may distribute 100,000 biometric templates to each of the ECD-c 1404c and the ECD-d 1404d, and the third gateway 1402c may distribute 100,000 biometric templates to each of the ECD-a 1404a and the ECD-b 1404b. Consequently, the 600,000 biometric templates downloaded from the enterprise server 1406b may be evenly distributed among the ECDs 1404 (i.e., 100,000 non-overlapping templates each). In some implementations, when the ECD-b 1404b constructs a sampled profile (UV or NIR) of the first requester 1450a, the ECD-b 1404b may duplicate the constructed sampled profile and distribute the duplicated sampled profiles to the ECD-a 1404a, ECD-c 1404c, ECD-d 1404d, ECD-e 1404e, and ECD-f 1404f (via one or more gateways 1402). The ECDs 1404 may compare the duplicated sampled profile of the first requester 1450a with the 100,000 templates stored locally to determine a match. The ECD-a 1404a, ECD-c 1404c, ECD-d 1404d, ECD-e 1404e, and ECD-f 1404f may send (via one or more gateways 1402) the results of the comparison back to the ECD-b 1404b. The ECD-b 1404b may gather the results and determine whether to grant access to the first requester 1450a. In other implementations, the biometric templates may be unevenly distributed among the ECDs 1404.

[203] Referring now to FIG. 20, the ECD 1404 may generate a sampled profile 2000 based on measuring the intensity of the detected radiation 1762 from the face 1730 of the requester 1450. The sampled profile 2000 may include a base matrix 2010 of one or more measurement points 2002. The measurement points 2002 may include a value indicative of the intensity of the detected radiation 1762 at the location of the particular measurement point. For example, the measurement point-a 2002a may indicate an intensity scale value of 2 corresponding to a region 2004a (e.g., black hair) around the face 1730 of the requester 1450. The measurement point-b 2002b may indicate an intensity scale value of 50 that corresponds to a background color. The measurement point-c 2002c may indicate an intensity scale value of 21 that corresponds to a region 2004c (e.g., cheeks) on the face 1730 of the requester 1450. In certain examples, the intensity scale may range from 0 (no reflection) to 100 (maximum reflection able to be detected by the optical sensor 1702 of the ECD 1404). In an implementation, the intensity scale may measure an absolute intensity (e.g., brightness) of the measurement points 2002. In other examples, the sampled profile 2000 may include measurement points 2002 for detected radiation 1762 of different wavelengths (e.g., UV, NIR, red, green, blue).
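As a rough illustration of how a base matrix of measurement points might be derived from a captured image, the following sketch averages image intensity over a coarse grid and scales it to the 0-100 intensity scale described above. The grid shape, the block averaging, and the 0-255 input range are assumptions made for illustration; the disclosure does not prescribe a particular construction.

```python
import numpy as np

def build_base_matrix(image: np.ndarray, rows: int = 10, cols: int = 12) -> np.ndarray:
    """Average a grayscale image (values 0-255, larger than rows x cols) over
    a rows x cols grid of measurement points, scaled to the 0-100 intensity
    scale (0 = no reflection, 100 = maximum detectable reflection)."""
    h, w = image.shape
    base = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = image[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            base[r, c] = block.mean() / 255.0 * 100.0
    return base
```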
[204] While the sampled profile 2000 in FIG. 20 shows the base matrix 2010 of 10 x 12 measurement points 2002 across the face 1730 of the requester 1450, other measurement point densities are possible for the base matrix 2010. For example, the ECD 1404 may generate another sampled profile using 100 x 100 measurement points across the face 1730 of the requester 1450. In another example, the ECD 1404 may generate a sampled profile using 500 x 500 measurement points. Other measurement point densities are also possible, and may depend on the desired accuracy, computational constraints, amount of data storage, etc.

[205] In some implementations, the ECD 1404 may remove measurement points 2002 indicating the background.

[206] Turning now to FIG. 21, in one implementation, the ECD 1404 (or the processor 1802) may apply the LBP operation to one or more of the measurement points 2002 within the base matrix 2010. For example, the ECD 1404 may perform LBP on the measurement point-d 2002d, which includes an intensity value of 41. The LBP string for the measurement point-d 2002d is 01110100, based on a span of 1 (i.e., performing LBP using the immediate neighbors, at a distance of 1 cell, of the measurement point-d 2002d). The ECD 1404 may track the numbers of LBP strings for the remaining measurement points 2002r as shown in a table 2100. In some examples, the ECD 1404 may compute 24 of the measurement points 2002 as having the LBP string 00000000, 3 as having the LBP string 00000001, 11 as having the LBP string 00000010, and 0 as having the LBP string 00000011, etc., as indicated in the table 2100. The table 2100 may include 256 entries for the possible strings (i.e., 8 bits). In some examples, the table 2100 may include 255 entries by eliminating the entry for the “all white” string (11111111) or the “all black” string (00000000). In some implementations, the data in the table 2100 may be plotted as a histogram indicating the number of occurrences (e.g., measurement points 2002) for the possible LBP strings.

[207] In some implementations, based on the distribution of the LBP strings in the table 2100, the ECD 1404 may determine one or more unique features. The one or more unique features may be non-zero LBP strings occurring fewer than a threshold frequency (e.g., 5), such as the LBP strings 00000001, 01110100, and 11111100. In other implementations, the one or more unique features may be the non-zero LBP strings occurring the least, such as the LBP string 01110100. In some examples, the ECD 1404 may track the locations of the measurement points 2002 having the one or more unique features. For example, the ECD 1404 may track the coordinates of the LBP strings. In some implementations, the one or more unique features may be the non-zero LBP strings occurring the most.
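A minimal sketch of the LBP tabulation described above, assuming a base matrix like the one from the previous sketch: each measurement point is compared with its eight neighbors at a given span to form an 8-bit string, the counts of the 256 possible strings are tallied as in table 2100, and strings occurring fewer than a threshold number of times are flagged as unique features. The neighbor ordering, and hence the bit order of the string, is an assumption; the disclosure does not fix one.

```python
import numpy as np
from collections import Counter

# Eight neighbor offsets, clockwise from the top-left (illustrative ordering).
NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def lbp_string(matrix: np.ndarray, r: int, c: int, span: int = 1) -> int:
    """8-bit LBP value: a bit is set when the neighbor at the given span
    exceeds the center measurement point."""
    value = 0
    for bit, (dr, dc) in enumerate(NEIGHBORS):
        if matrix[r + dr * span, c + dc * span] > matrix[r, c]:
            value |= 1 << bit
    return value

def lbp_histogram(matrix: np.ndarray, span: int = 1) -> Counter:
    """Counts of each of the 256 possible LBP strings over all interior
    measurement points (those with a full neighborhood)."""
    rows, cols = matrix.shape
    counts = Counter()
    for r in range(span, rows - span):
        for c in range(span, cols - span):
            counts[lbp_string(matrix, r, c, span)] += 1
    return counts

def unique_features(counts: Counter, threshold: int = 5) -> list:
    """Non-zero LBP strings occurring fewer than `threshold` times."""
    return [s for s, n in counts.items() if s != 0 and 0 < n < threshold]
```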
[208] Turning now to FIG. 22, in certain implementations, the ECD 1404 may divide the base matrix 2010 into one or more sub-matrices. Each sub-matrix may include a macroblock of measurement points. For example, the ECD 1404 may divide the base matrix 2010 into a 5 x 5 sub-matrix 2202, a 3 x 3 sub-matrix 2204, and a 6 x 6 sub-matrix 2206. When computing the LBP strings for the measurement points 2002 within the 5 x 5 sub-matrix 2202, the ECD 1404 may calculate the LBP strings of the measurement points 2002 within the 5 x 5 sub-matrix 2202 using a span of 3 (i.e., calculating LBP strings using neighbors 3 cells away). Similarly, when computing the LBP strings for the measurement points 2002 within the 3 x 3 sub-matrix 2204, the ECD 1404 may calculate the LBP strings of the measurement points 2002 within the 3 x 3 sub-matrix 2204 using a span of 2. When computing the LBP strings for the measurement points 2002 within the 6 x 6 sub-matrix 2206, the ECD 1404 may calculate the LBP strings of the measurement points 2002 within the 6 x 6 sub-matrix 2206 using a span of 4. Other sizes for the base matrix, the sub-matrices, and the spans are possible, as determined by the ECD 1404.

[209] In one example, a 100 x 100 base matrix may be divided into sub-matrices of six different sizes: 10 x 10, 15 x 15, 20 x 20, 30 x 30, 35 x 35, and 40 x 40. Within each of these sub-matrices, the LBP string may be calculated with a different span around the measurement point being calculated, to characterize the texture/slope of the area surrounding the cell at different coverage areas. The span in pixels from the measurement point 2002 being calculated to each of the neighboring cells for each of the sub-matrix sizes may be 3, 9, 15, 21, 27, and 33, respectively. In another example, a 200 x 200 base matrix may include sub-matrices having sizes of 10 x 10, 20 x 20, 35 x 35, 50 x 50, 65 x 65, 80 x 80, and 100 x 100. The span in pixels from the measurement point 2002 being calculated to each of the neighboring cells for each of the sub-matrix sizes may be 3, 5, 7, 10, 25, and 40. The sub-matrices may overlap in some instances. In certain implementations, the ECD 1404 may apply LTP computations to the measurement points 2002.
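The multi-scale analysis above pairs each sub-matrix size with its own span. The following minimal sketch uses the size-to-span pairing of the 100 x 100 example; because spans larger than the sub-matrix necessarily reach beyond its borders, the sketch takes neighbors from the full base matrix and skips those falling outside it, a handling choice the disclosure leaves open.

```python
import numpy as np
from collections import Counter

# Size-to-span pairing from the 100 x 100 base matrix example above.
SIZE_TO_SPAN = {10: 3, 15: 9, 20: 15, 30: 21, 35: 27, 40: 33}

# Eight neighbor offsets; the ordering is an illustrative assumption.
NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def macroblock_histogram(matrix: np.ndarray, r0: int, c0: int,
                         size: int, span: int) -> Counter:
    """LBP counts for the points inside one size x size macroblock whose
    top-left corner is at (r0, c0); neighbors at the given span are read
    from the full base matrix, and out-of-range neighbors are skipped."""
    rows, cols = matrix.shape
    counts = Counter()
    for r in range(r0, r0 + size):
        for c in range(c0, c0 + size):
            value = 0
            for bit, (dr, dc) in enumerate(NEIGHBORS):
                nr, nc = r + dr * span, c + dc * span
                if 0 <= nr < rows and 0 <= nc < cols and matrix[nr, nc] > matrix[r, c]:
                    value |= 1 << bit
            counts[value] += 1
    return counts
```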
[210] Referring now to FIG. 23, in certain examples, the ECD 1404 may convert the list of binary features derived from the sampled profile into a data sequence that is extensible and capable of being tailored to the unique traits of each requester 1450 while still providing a methodology of object comparison and matching/verification. A binary feature may contain three characteristics: the size of the sub-matrix/macroblock, the location of the sub-matrix/macroblock on the sampled profile, and the Uniform Local Binary Pattern (ULBP) assigned to the binary feature. A macroblock may be a “sub-image” of the image of the face 1730 of the requester 1450. For instance, a macroblock of size 10 is a 10 x 10 (i.e., 100 measurement points 2002) sub-image of the image of the face 1730. In a non-limiting example, the location of the macroblock on the image may be defined by the position of the first pixel in the macroblock, located in the top, left corner of the macroblock. The position of this pixel is defined by two values: its distance from the left boundary of the face image (x-dimension) and its distance from the top boundary of the face image (y-dimension). The final characteristic is the ULBP, which is a mathematical methodology for establishing a scalar value for the edge and texture characteristics of a pixel, starting with the top left of the face 1730. For each pixel within the macroblock, a ULBP calculation is performed. If the calculated ULBP matches the defined ULBP for that binary feature, the value of the binary feature is incremented by one. Therefore, the maximum value of a binary feature is the size of the defined macroblock (100 for a 10 x 10 macroblock) and the minimum value is zero. A macroblock may be used in more than one binary feature with different ULBP definitions. The location of a macroblock may also be used in more than one binary feature with a different macroblock size and ULBP definition.

[211] Still referring to FIG. 23, a table 2300 may illustrate an example of the results of the sequence conversion process for macroblocks having sizes of 10 x 10, 35 x 35, 20 x 20, 30 x 30, 15 x 15, 40 x 40, and 15 x 15. Three arrays may be used to generate a biometric template. The first array may hold the scalar value of each binary feature, normalized for macroblock size. The second array may be sorted so that binary feature 2000, which in this example has the highest value, goes to the top of the second array in position 1. The third array captures the difference in the position of binary feature 2000 within the sort, rather than its scalar value. If the position within the array for binary feature 2000 remains the same, the results will reflect a value of 0, representing the least possible change in position and therefore the highest value for significance in the authentication method.
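A minimal sketch of evaluating one binary feature as just described: the feature is the triple (macroblock size, location, ULBP), and its value is the count of pixels in the macroblock whose computed ULBP equals the feature's defined ULBP, so a 10 x 10 feature ranges from 0 to 100. The `ulbp_at` argument is a hypothetical helper standing in for the per-pixel ULBP computation, which the disclosure describes only at this level.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BinaryFeature:
    size: int   # macroblock edge length, e.g., 10 for a 10 x 10 block
    x: int      # distance of the top-left pixel from the left image boundary
    y: int      # distance of the top-left pixel from the top image boundary
    ulbp: int   # the ULBP value this feature counts

def feature_value(image, feature: BinaryFeature, ulbp_at) -> int:
    """Count the pixels in the macroblock whose computed ULBP matches the
    feature's defined ULBP; the result ranges from 0 to size * size."""
    count = 0
    for dy in range(feature.size):
        for dx in range(feature.size):
            if ulbp_at(image, feature.y + dy, feature.x + dx) == feature.ulbp:
                count += 1
    return count
```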
[212] Turning now to FIG. 24, a method 2400 of converting a sequence may identify the most unique features used for identification.

[213] At block 2402, obtain the sampled profile of the face. For example, the processor 1802 and/or the communication module 1852 may obtain the sampled profile of the face 1730.

[214] At block 2404, assign a unique index to each macroblock/ULBP combination. For example, the algorithm module 1856 may assign a unique index (e.g., 4) to each macroblock/ULBP combination (e.g., a 30 x 30 macroblock) and analyze based upon the uniqueness of traits within the face rather than on scalar values.

[215] At block 2406, construct a first array of scalar values for each macroblock/ULBP combination in the master schema referenced by the schema. For example, the algorithm module 1856 may construct a first array of scalar values for each macroblock/ULBP combination in the master schema referenced by the schema. The first array includes the number of pixels that fall within that ULBP within that macroblock. When analyzing the LBP of a pixel, the ECD 1404 may perform the normal LBP calculation (the standard LBP formula takes a pixel and the 8 pixels around it, with each neighboring pixel contributing a different binary value) to obtain a histogram value from 0 to 255; if the result is the highest value, the pixel receives a value of 1. If some other LBP is higher, that pixel has a 0 value.

[216] At block 2408, weigh each scalar value by the size of the macroblock. For example, the algorithm module 1856 may weigh each scalar value (e.g., 380) by the size of the macroblock (e.g., 30 x 30).

[217] At block 2410, sort the first array of scalar values in descending order. For example, the algorithm module 1856 may sort the elements of the first array of scalar values in descending order across, for example, indices 1 to 2165.

[218] At block 2412, convert the associated indices into a sequence. For example, the algorithm module 1856 may convert the associated indices into a sequence, i.e., the sort order of the data set instead of the scalar values.

[219] At block 2414, convert the sequence into a second array with a scalar value for each unique index that is the distance (the difference in position in the array, rather than a measurement in mm) of the primary index from the beginning of the sequence array. For example, the algorithm module 1856 may convert the sequence into a second array with a scalar value for each unique index that is the position of the primary index from the beginning of the sequence array, with 2165 elements currently, although the array may be extensible depending upon the data for each individual.
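Blocks 2406-2414 amount to a rank transform. The following minimal sketch, with hypothetical inputs, normalizes each feature's count by its macroblock area (block 2408), sorts the features in descending order (blocks 2410-2412), and produces the position array of block 2414, in which entry i holds the sorted position of feature i.

```python
def to_sequence(counts, sizes):
    """counts[i]: ULBP match count for feature i; sizes[i]: macroblock edge."""
    # Block 2408: weigh each scalar by macroblock size (normalize by area).
    weighted = [n / (s * s) for n, s in zip(counts, sizes)]
    # Blocks 2410-2412: feature indices sorted by descending weighted value.
    order = sorted(range(len(weighted)), key=lambda i: weighted[i], reverse=True)
    # Block 2414: position[i] is where feature i landed in the sorted sequence.
    position = [0] * len(order)
    for rank, idx in enumerate(order):
        position[idx] = rank
    return position

# Hypothetical usage with three features:
# to_sequence([450, 80, 90], [30, 10, 15]) returns [1, 0, 2]
```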
[220] Referring to FIG. 25, in certain implementations, deep learning may be performed on the full data set to establish the minimum data set required to accurately perform facial recognition. The result of the deep learning may be a reduced data set of binary features that is a fraction of the total data set (e.g., 1%, 2%, 5%, 10%, 20%, 50%). Each element of data in this data set may be defined by a sub-matrix of a specific size and location and a single value within the full LBP base matrix. Each data element may be assigned a unique numeric identifier. For the enrollment process, this static data set is used for each requester. As each requester continues to verify their face, the verification data may be collected, and through deep learning a new set of binary features may be created for each requester. The introduction of the sequence to the verification algorithm may allow the introduction of this individualized set of binary features and still allow for comparison between different sets. The matching of two sequences may be achieved by calculating the difference (e.g., the difference in position of a data set) of the two arrays. The greater the sum, the lower the quality of the match. A perfect object match may have a sum of substantially zero.

[221] Still referring to FIG. 25, a table 2500 illustrates an example of the verification calculations. The first step in creating the sequence may be the sorting of the scalar array by the magnitude of the scalar value. The sort may be performed in descending order. The binary feature unique identification with the largest value is at the top of the sort. By sorting in this manner, the binary features providing the most uniqueness (greatest scalar magnitude) are at the top of the array. After the sort, the new sort position is transferred to the third array, and the location in the new array is based upon the binary feature unique identification. For example, if the binary feature with unique identifier 2000 is sorted to position 1, then position 2000 in the third array will receive a value of 1. This array may be a unique sequence for the object and is the new basis for object verification and/or matching.

[222] The sequence matching algorithm determines the quality of the match by the distance of each binary feature unique identification from the beginning of the sequence. The distance is the index value of the binary feature unique identification within the sequence array, where the index values are sequentially assigned. To efficiently complete this matching calculation, another array is created that contains the distance of each binary feature from the beginning of the sequence. As an example, after the sort, the binary feature with the unique identification of one is in position ten of the sorted sequence. In the new array, which is zero-indexed, position one will have a value of ten. By structuring the array in this manner, the algorithm to perform a match between two of these arrays is the same as the original algorithm. The absolute value of the difference of the values with the matching index is aggregated to create a single value for the array match. A lower value equates to a closer match. A perfect match may be substantially a zero value.
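A minimal sketch of the matching rule just described: the position arrays of two sequences are compared index by index, the absolute differences are aggregated, and a lower sum means a closer match, with a perfect match summing to substantially zero. The acceptance threshold is a deployment-specific, empirically derived value, not one given by the disclosure.

```python
def match_score(positions_a, positions_b) -> int:
    """Aggregate positional distance between two equal-length sequences."""
    return sum(abs(a - b) for a, b in zip(positions_a, positions_b))

def is_match(positions_a, positions_b, threshold: int) -> bool:
    """Accept when the aggregate distance falls below an empirical threshold."""
    return match_score(positions_a, positions_b) < threshold

# A perfect self-match scores zero:
# match_score([1, 0, 2], [1, 0, 2]) == 0
```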
[223] By introducing the sequence array, the matching algorithm may be extensible (the sequence array is the second array, and the scalar array is the first array, normalized for block size). Extensible may indicate that the fixed list of binary features may now be expanded and contracted as required to optimize the process of matching objects while also improving the matching process integrity. In this implementation, the sequence array may be dynamically adjustable based on two separate deep learning functions, based upon the environment of the edge device and the specific individual face being authorized. The first function may seek to continuously optimize the default list of binary features applied to some or all object matching attempts prior to the development of an individual list of binary features. The second function may aggregate image data on individual objects and over time develop an optimized list of binary features for the face. Once the individualized list of binary features exceeds the matching performance of the default list of binary features through parallel object matching trials on incoming object image data, the default list of binary features may be replaced with the individualized list of binary features for that object in the live object matching functions.

[224] In certain implementations, the deep learning engine for both processes may receive sample data in parallel with the live object/face matching functions. As image data is received, two lists of binary features may be applied to it. The first list may be either the default or the individualized list of reduced binary features (e.g., the three most important blocks out of 100 in a 10 x 10 macroblock). The output of the application of this list may be used in the live matching process. The second list may be the full list of binary features, which comprises all possible macroblock sizes (e.g., all 100 within the 10 x 10 macroblock), applied to the possible image positions for each ULBP. The complete list of binary features may be, for example, twenty times the size of the default or individualized list of binary features. The deep learning engine may receive a sequence derived from the complete list of binary features for each object image received into the system. In the case of the default list of binary features deep learning process, the images may be categorized into the training and validating sets. An independent default test set may be created including objects different than the object being learned. In the case of the individualized list of binary features, each object identified by the default or existing individualized list of binary features will be placed in the training, validation, and test sets. All other objects' lists may be placed in the verification and test sets.

[225] A deep learning training session for both the default object list of binary features and for each individual list of binary features may be executed for each new live entry from the object detection system.
[226] Once the eye detection algorithm locates the eyes, the face matrix of data is transferred to the facial recognition algorithm. If the deep learning algorithm has developed a specific list of binary features, then an aggregate list of these binary features for all users in the system may be created and used to generate the sequence. Otherwise, the sequence may be generated using the default set of binary features. The sequence may be transferred to the verification algorithm, where the matching process may determine the identity of the sequence. With the identity established, a full list of binary features (all sub-matrices with full LBP histograms) may be generated and transferred to that identity's data set in the deep learning algorithm. The deep learning algorithm may process the available data set for that identity (the set may continue to grow with each verification) and generate a revised optimum data set for that identity. The optimized data set may be converted to a sequence and used in the next verification process for that user. At the same time, the aggregate set of binary features may be updated if necessary for converting the next face matrix. While the verification process will be within the local cluster of devices, the deep learning algorithm may occur within both the local and global clusters as a background task.

[227] As the results of the training session refine the list of binary features, the respective list may be updated in the live verification/matching process. The result may be a matching system capable of automatically adjusting to both the overall object population and the specific, unique traits of each object. The evolution of the object data will allow for large scale object matching solutions in excess of 100,000 objects capable of the same precision as a small solution (<1,000 objects).
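The list-promotion logic described above can be summarized in a few lines. The following hypothetical sketch (the names and the accuracy inputs are illustrative, not from the disclosure) selects which list of binary features drives live matching: the individualized list replaces the default once its measured performance in the parallel trials exceeds the default's.

```python
def select_live_list(default_list, individual_list,
                     default_accuracy: float, individual_accuracy: float):
    """Promote the individualized list of binary features once it outperforms
    the default list in parallel object-matching trials on incoming image
    data; otherwise keep the default list in the live matching functions."""
    if individual_list is not None and individual_accuracy > default_accuracy:
        return individual_list
    return default_list
```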
[228] Referring now to FIG. 26, an example of the deep learning process 2600 may rely on feedback loops and machine learning to refine the identification process.

[229] At block 2602, obtain an image matrix. For example, the processor 1802 and/or the communication module 1852 may obtain the sampled profile of the face 1730.

[230] At block 2604, determine if the face specific macroblock/ULBP is available. For example, the algorithm module 1856 may determine if the face specific macroblock/ULBP is available.

[231] At block 2606, if the custom list of binary features is not available, convert the image to the face generic detection list of binary features. For example, the algorithm module 1856 may convert the image to the face generic/default detection list of binary features if the custom list is not available.

[232] At block 2608, obtain the face detection default sequence. For example, the algorithm module 1856 may obtain the face detection default sequence if the macroblock is not available.

[233] At block 2610, if the custom list of binary features is available, convert the image to the face detection specific list of binary features. For example, the algorithm module 1856 may convert the image to the face detection specific list of binary features if the custom list of binary features is available. The custom list may include the original 2165 binary features, or all new ones based upon that individual (e.g., 10 additional 10 x 10 macroblocks that are more distinctive for that particular individual).

[234] At block 2612, perform face detection. For example, the algorithm module 1856 may perform face detection.

[235] At block 2614, perform eye detection and location. For example, the algorithm module 1856 may perform eye detection and location.

[236] At block 2616, convert the image to the full list of binary features (not just 57 values, as is typical with LBP, but all 100 positions of each 10 x 10 macroblock, including all values). For example, the algorithm module 1856 may convert the image to the full list of binary features. TensorFlow is the deep learning engine framework provided by Google™; the equations and algorithms for TensorFlow are well known in the art.

[237] At block 2618, develop the face detection list of binary features using TensorFlow deep learning. For example, the algorithm module 1856 may develop the face detection list of binary features using TensorFlow deep learning.

[238] At block 2620, feed back the face detection refined sequence. For example, the algorithm module 1856 may feed back the face detection refined sequence. To refine the list of features, the ECD 1404 may take the entire feature set, collect all data for the face, and derive a new refined set that replaces the default set. The distance calculation between the positions of a binary feature within two sets may be used as the metric; ideally, every feature would be in the same position every time. As the face data improve, the face-to-face comparison distance will get smaller, and the difference between this face and all others will get larger.

[239] Turning now to FIG. 27, the example of the deep learning process 2600 shown in FIG. 26 may rely on feedback loops and machine learning to refine the identification process. Rather than centralizing the deep learning data, which is subject to hacking if stored in a central server (e.g., TensorFlow running as a single application on a single central server), particularly in the cloud, the present disclosure may use TensorFlow in a distributed architecture to
evaluate at each reader, so that the data are privacy protected on the individual devices rather than stored in one central server.

[240] At block 2614, perform eye detection and location. For example, the algorithm module 1856 may perform eye detection and location.

[241] At block 2702, obtain the face detection specific list of binary features. For example, the algorithm module 1856 may obtain the face detection specific list of binary features if the macroblock is available.

[242] At block 2704, obtain the facial recognition default list of binary features. For example, the algorithm module 1856 may convert the image to the face generic detection list of binary features if the specific list of binary features for that individual is not available.

[243] At block 2706, determine if the face specific list of binary features is available. For example, the algorithm module 1856 may determine if the face specific list of binary features is available. The custom list of binary features may include the original 2165 binary features, or all new ones that are based upon that individual (e.g., 10 additional 10 x 10 macroblocks that are more distinctive for that particular individual).

[244] At block 2708, if the face specific list of binary features is not available, convert the image to the face generic list of binary features. For example, the algorithm module 1856 may convert the image to the face generic list of binary features if the face specific list of binary features is not available.

[245] At block 2710, if the face specific list of binary features is available, convert the image to the face specific list of binary features. For example, the algorithm module 1856 may convert the image to the face specific list of binary features if the face specific list of binary features is available.

[246] At block 2712, convert the lists of binary features to sequences. For example, the algorithm module 1856 may convert the lists of binary features to sequences.

[247] At block 2714, perform verification. For example, the algorithm module 1856 may perform verification.

[248] At block 2716, convert the image to the full list of binary features. For example, the algorithm module 1856 may convert the image to the full list of binary features.

[249] At block 2718, develop the face specific list of binary features using TensorFlow deep learning. For example, the algorithm module 1856 may develop the face specific list of binary features using TensorFlow deep learning (not just 57 values, as is typical with LBP, but all 100 positions of each 10 x 10 macroblock, including all values). TensorFlow is the deep learning engine
framework provided by Google™; the equations and algorithms for TensorFlow are well known in the art.

[250] In some examples, the gateway 1402 and/or the ECD 1404 may include proactive algorithms to identify and reduce “consistency-collisions” during the verification process. A “consistency-collision” may occur when one or more requesters have unique facial data that causes a mistake (i.e., a false acceptance or a false rejection) in the verification. The gateway 1402 and/or the ECD 1404 may seek out potential “consistency-collisions” proactively using the two methods of the present disclosure described below.

[251] The “Layered Reinforcement” algorithm may allow the requester to create a binary tree structure for the facial data. The binary data may create a hierarchy of data points (e.g., LBP strings) based on the uniqueness of the data points: the greater the uniqueness, the higher up on the tree. Every verification transaction may result in the transfer of the binary facial data from the ECD 1404 to the gateway 1402. When the gateway 1402 receives the binary facial data, it may begin checking the uniqueness of the new data's binary tree structure against the stored data of the other requesters 1450. When it identifies a potential collision, the gateway 1402 may notify the ECD 1404 to escalate verification transactions between the two identified requesters 1450 to the gateway 1402. When the gateway 1402 receives an escalated transaction, it may perform two advanced algorithms (i.e., the Proactive Collision Identification Algorithm and the Time-Domain Trending Algorithm) on the data. The binary tree may take the most distinctive features and establish a hierarchy based upon the most and least distinctive features. The prioritized tree may reduce the analysis to, for example, 20 features rather than 2165, saving time and computing/processing. Based upon the branches of the tree off the most distinctive features, the ECD 1404 may determine whether this is or is not the person, rather than going through all 2165 features. Thresholds may be empirical and derived from testing.

[252] The first advanced algorithm, the Proactive Collision Identification Algorithm, may take the binary tree data and analyze the facial data based on its location in the binary tree. The binary tree data may be weighted. If the weighting of the binary data does not yield a sufficient differentiation of the data, the gateway 1402 may extend the verification process to find adequate differentiation data.

[253] In some implementations, the facial characteristics of requesters may continuously change. If the biometric data of a face, such as the face 1730 of the requester 1450, remains static, the identification of the face may ultimately result in false rejections, requiring the re-enrollment of the face 1730. Dynamic adjustments to the biometric data may be continuously
applied, and algorithms may determine whether a proposed change is the introduction of a different face into the biometric data, which would result in a false acceptance on the face 1730. With every iteration of the biometric data, the algorithm must determine the amount of biometric data to retain to ensure the next successful identification of the requester 1450 and the amount to change to ensure the long-term identification of the requester 1450. The determination may be made by a combination of a time-domain analysis, where the changes are regressed to ensure linearity, and a facial regional analysis to determine whether the area of change is rational.

[254] The use of this dynamic data capability leads to the second algorithm, the Time-Domain Trending Algorithm. The gateway 1402 may maintain the history of the binary data of the requester 1450 and perform a time-domain based analysis of the data to assess which features are changing, and the speed with which these features are changing, when the history of the binary data of the requester 1450 is compared with the face 1730. The gateway 1402 may track the evolution of the face and use this data to extrapolate and reinforce the unique differences identified in the face 1730 of the requester 1450, to establish and later emphasize core features or baseline features, and to assess the rate of change in these core features for the face 1730. The gateway 1402 then determines, using time-domain based analysis, whether these changes are taking place over a time period that suggests natural changes in the face 1730, artificial changes in the face 1730 warranting further analysis, or a mismatch of the history of the binary data of the requester 1450 with the face 1730. The gateway 1402 may identify the unique differences and verify that the differences identified in the verification request are consistent with the trending over time. In some implementations, the gateway 1402 may factor in the lifestyle and daily routines of the requester 1450 in the Time-Domain Trending Algorithm. For example, if the requester 1450 enjoys outdoor activities, the algorithm may factor in increased tanning during the summer season. The Proactive Collision Identification Algorithm combined with the Time-Domain Trending Algorithm allows the method of this embodiment to maintain its high performance in a large population “1:N” solution.

[255] In some implementations, the profiles may be refined by additional replacement profiles. For example, a first-in first-out scheme may be implemented, where a new replacement profile may replace the oldest profile. In another example, the profiles may be divided into a first group of “fast” learners and a second group of “slow” learners. The profiles in the “fast” learners group may be replaced after every use, for example. The profiles in
[255] In some implementations, the profiles may be refined by additional replacement profiles. For example, a first-in, first-out scheme may be implemented, where a new replacement profile replaces the oldest profile. In another example, the profiles may be divided into a first group of “fast” learners and a second group of “slow” learners. The profiles in the “fast” learners may be replaced after every use, for example. The profiles in the “slow” learners may be replaced every day, every week, every month, etc. By simultaneously maintaining the “fast” and “slow” learners, history may be maintained without stale profiles. [256] In certain instances, some profiles may be locked after replacement. For example, if a requester has 20 profiles (e.g., profile #1, profile #2, … profile #20), a replacement profile, such as profile #21, may replace profile #5. After the replacement, profile #21 may be locked for a predetermined amount of time (e.g., 1 day, 1 week, 1 month, 3 months…) to prevent replacement. [257] In some implementations, the Time-Domain Trending Algorithm may reduce the change over time in the biometric data into one or more equations that characterize the change over time (e.g., curve-fitting algorithms). The linearity of the one or more equations over time may determine the integrity of the changes. If a particular area of the change is represented as a discrete function, the change may be flagged as a potential threat (e.g., disguise, incorrect match). The changes may also be evaluated based on the physical location of the pixel box on the face. Locations may be weighted based on the probability of change in that area of the face. Significant changes in low-probability change areas will be flagged as a potential threat. [258] In some aspects of the present disclosure, the ECD 1404 and/or one of the gateways 1402 may flag, for a false positive, or chatter, for a false negative, a profile that does not match the remaining profiles. [259] Turning now to FIG. 28, and referencing the figures above, in some implementations, a method 2800 of constructing a biometric template may be performed by the ECD 1404. [260] In optional implementations, the method 2800 may perform an enrollment process. The enrollment process may include the ECD 1404 capturing a plurality of images (e.g., 5, 10, 15, 20, 30, 50) of the face 1730. The ECD 1404 may convert the plurality of images to biometric data as described above. [261] At block 2802, emitting an incident non-visible light. For example, the illumination source 1704 may emit an incident non-visible light. The non-visible light may include near IR or UV. In some implementations, the illumination source 1704 may emit a visible light. [262] At block 2804, detecting a detected non-visible light, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light. For example, the optical sensor 1702 may detect a detected non-visible light, wherein the detected non-visible light includes a reflected non-visible light and a radiated non-visible light. In some
implementations, the optical sensor 1702 may detect IR light reflected off of the face 1730 of the requester 1450 and/or IR light radiated due to the heat of the requester 1450. [263] At block 2806, generating a sampled profile including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light. For example, the algorithm module 1856 may generate a sampled profile, such as the sampled profile 2000, including a plurality of sampling points having a plurality of characteristic values associated with the detected non-visible light. [264] At block 2808, identifying one or more macroblocks, each including a subset of the plurality of sampling points. For example, the algorithm module 1856 may identify one or more macroblocks, each including a subset of the plurality of sampling points. The one or more macroblocks may be a 10 x 10 macroblock, a 15 x 15 macroblock, a 20 x 20 macroblock, a 30 x 30 macroblock, a 35 x 35 macroblock, or a 40 x 40 macroblock. Other sizes are possible. In some implementations, the ECD 1404 may identify 2165 macroblocks having 2165 associated dimensions. [265] At block 2810, selecting a local pattern value. For example, the algorithm module 1856 may select a local binary pattern value of 20. In other examples, the algorithm module 1856 may select a local ternary pattern value. [266] At block 2812, calculating a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks. For example, the algorithm module 1856 may calculate a number of occurrences of the local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks. [267] At block 2814, generating a first array including a plurality of weighted values by calculating the plurality of weighted values based on the numbers of occurrences of the local pattern value and corresponding sizes of the one or more macroblocks. For example, the algorithm module 1856 may generate a first array including the weighted values shown in the table 2300 (i.e., [0.30 0.54 0.15 0.42 0.62 0.46 0.53]). [268] At block 2816, assigning a unique index to each of the plurality of weighted values. For example, the algorithm module 1856 may assign unique indices (i.e., 1, 2, 3, 4, 5, 6, and 7) to each of the plurality of weighted values shown in the table 2300. The unique index 1 is assigned to the weighted value of 0.30, 2 to 0.54, 3 to 0.15, and so on. [269] At block 2818, generating a second array of the unique index by ranking the plurality of weighted values. For example, the algorithm module 1856 may generate a second array of
the unique index by ranking the plurality of weighted values, such as the sequence 5, 2, 7, 6, 4, 1, 3 in the table 2300. In some implementations, the second array/sequence may indicate a ranking of the weighted values from the highest to the lowest. The first number in the sequence is 5 because the weighted value (i.e., 0.62) associated with the unique index of 5 is the highest among the elements of the first array. [270] At block 2820, generating a third array including a plurality of ranking distances. For example, the algorithm module 1856 may generate a third array including a plurality of ranking distances, such as the stored array [5 1 6 4 0 3 2] in the table 2300. A ranking distance may indicate the numerical difference between the rank of the highest weighted value (e.g., 0.62, rank 1) and the rank of the current weighted value (e.g., 0.30, rank 6). Therefore, the ranking distance between 0.62 and 0.30 may be 5 (i.e., 6 − 1).
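For clarity, the following Python sketch reproduces the array construction of blocks 2814 through 2820 using the weighted values from the table 2300; it is a worked illustration of the ranking logic rather than device firmware.

```python
# Reproduces the second array (5, 2, 7, 6, 4, 1, 3) and the third array of
# ranking distances ([5 1 6 4 0 3 2]) from the weighted values above.
first_array = [0.30, 0.54, 0.15, 0.42, 0.62, 0.46, 0.53]
indices = list(range(1, len(first_array) + 1))      # unique indices 1..7

# Second array: unique indices ordered by weighted value, highest first.
second_array = sorted(indices, key=lambda i: first_array[i - 1], reverse=True)

# Rank of each unique index (rank 1 = highest weighted value).
rank = {idx: pos + 1 for pos, idx in enumerate(second_array)}

# Third array: distance between each value's rank and the top rank (1).
third_array = [rank[i] - 1 for i in indices]

print(second_array)  # [5, 2, 7, 6, 4, 1, 3]
print(third_array)   # [5, 1, 6, 4, 0, 3, 2]
```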
[271] In some optional implementations, during the verification process, the biometric data based on the sampled profile (the “requester biometric data”) may be compared to the biometric data of the plurality of images captured during the enrollment process (the “enrollment biometric data”). If the matching percentage exceeds a threshold percentage (e.g., 20, 30, 40, 50, 60, 70, or 80), the ECD 1404 and/or one of the gateways 1402 may determine that the requester biometric data is a positive match and the verification is successful. [272] In certain aspects, the ECD 1404 and/or one of the gateways 1402 may adjust the enrollment biometric data over time to accommodate any changes to the face 1730 due to, for example, sun tan, aging, injuries, mood change, weight change, facial hair change, cosmetics usage, accessories, or other causes. During each verification, the ECD 1404 and/or one of the gateways 1402 may adjust a portion (e.g., 20%, 30%, 40%, 50%) of the enrollment biometric data. Another portion of the biometric data may remain unchanged. [273] In one non-limiting example, the ECD 1404 and/or one of the gateways 1402 may obtain 20 sampled profiles of a person (e.g., IR or UV images of the person). Each profile of the 20 sampled profiles may be compared to the other profiles of the sampled profiles. Distance data may be calculated between each measurement point (or local binary or ternary pattern) of a profile and the corresponding measurement point (or local binary or ternary pattern) of the other profiles. The profiles with the lower standard deviations (e.g., lowest 10) from the mean of each calculated measurement point (or local binary or ternary pattern) may be kept, and the profiles with the higher standard deviations (e.g., highest 10) may be replaceable. If the ECD 1404 and/or one of the gateways 1402 detects a change of the person’s profile (e.g., aging, facial hair change, cosmetics, injuries…), the ECD 1404 and/or one of the gateways 1402 may replace some or all of the replaceable profiles (i.e., the ones with higher standard deviations) with updated profiles. [274] In some aspects, each measurement point (or local binary or ternary pattern) may include an associated composite value calculated from at least one of an average of the corresponding measurement points (or local binary or ternary patterns) of the 20 sampled profiles, a standard deviation of the average of the corresponding measurement points (or local binary or ternary patterns) of the 20 sampled profiles, and/or the corresponding measurement points (or local binary or ternary patterns) of the 20 sampled profiles. For example, the associated composite value may be the sum of the corresponding measurement points of the 20 sampled profiles. In another example, the associated composite value may be the average of the corresponding measurement points of the 20 sampled profiles. In a non-limiting example, the associated composite value may be proportional or inversely proportional to the standard deviation of the average of the corresponding measurement points of the 20 sampled profiles. Other ways of generating the associated composite value of a measurement point may also be used. [275] In some implementations, the ECD 1404 and/or one of the gateways 1402 may use the equation \(\bar{a} = \frac{1}{n}\sum_{i=1}^{n} a_i\) to calculate an average pixel value, where \(n\) may denote the number of samples and \(a_i\) may denote the value of each pixel.
[276] Turning now to FIG. 29, an example of a binary tree 2900 may include a root node 2902, a first left node 2904, a first right node 2906, a second left node 2908, a second right node 2910, a third left node 2912, a third right node 2914, and nodes 29n, 29n+1, 29n+2, 29n+3, 29n+4, etc. The binary tree 2900 may be a data structure that includes the features of a sampled profile. The root node 2902 may include the most unique feature of the sampled profile of a person, the first left node 2904 and the first right node 2906 may include the next most unique features, and so forth. The nodes closer to the root node 2902 may be weighted more heavily than the nodes farther away from the root node 2902. [277] In some implementations, a sampled profile may be divided into multiple regions (e.g., 4, 9, 16, 25…). In one example, the sampled profile may include nine regions: Left Eye, Nose Bridge, Right Eye, Left Cheek, Nose, Right Cheek, Left Mouth, Middle Mouth, and Right Mouth. In another example, the sampled profile may include four regions: top left, top right, bottom left, and bottom right. The sizes of the regions may be the same or
different. The root node 2902 and/or nodes near the root node 2902 may include measurement points (or local binary or ternary patterns) around the eyes, due to these being the more unique features. [278] Still referring to FIG. 29, during detection, the ECD 1404 and/or one of the gateways 1402 may compare the binary tree 2900 with a stored binary tree 2950 of a plurality of binary trees. The ECD 1404 and/or one of the gateways 1402 may calculate a difference between a value of the root node 2902 of the binary tree 2900 and a value of a root node 2952 of the binary tree 2950, a difference between a value of the first left node 2904 of the binary tree 2900 and a value of a first left node 2954 of the binary tree 2950, a difference between a value of the first right node 2906 of the binary tree 2900 and a value of a first right node 2956 of the binary tree 2950, and so on, to compute an aggregated difference value. If the binary tree 2900 and the binary tree 2950 include different numbers of nodes, extra nodes from one of the binary trees 2900, 2950 may be truncated by the ECD 1404 and/or one of the gateways 1402. Alternatively, the ECD 1404 and/or one of the gateways 1402 may determine a number of nodes to compare for each tree. If the aggregated difference value is lower than the aggregated difference values between the binary tree 2900 and the other binary trees stored in the ECD 1404 and/or one of the gateways 1402, the ECD 1404 and/or one of the gateways 1402 may determine a positive match, as sketched in the example below. [279] In some examples, the ECD 1404 and/or one of the gateways 1402 may determine a positive match when the aggregated difference values between the binary tree 2900 and other binary trees (belonging to the same person) are lower than the other aggregated difference values. [280] In some aspects, the binary tree 2900 may be restructured periodically. [281] Turning now to FIG. 30, an example of a time-domain analysis 3000 may begin with the ECD 1404 capturing n sampled profiles 3002-1, 3002-2, … 3002-n of an enroller during the enrollment process. Each of the n sampled profiles 3002-1, 3002-2, … 3002-n may be divided into two or more zones (e.g., 9 zones). In one non-limiting example, the two or more zones may include the upper left, upper middle, upper right, middle left, center, middle right, lower left, lower middle, and lower right zones. The two or more zones may have equal dimensions or may have different dimensions. The two or more zones may include the same or different numbers of sampling points. In some implementations, the n sampled profiles 3002-1, 3002-2, … 3002-n may be transmitted to one or more of the gateways 1402. In other examples, the one or more biometric templates (described above) for the n sampled profiles 3002-1, 3002-2, … 3002-n may be transmitted to one or more of the gateways 1402.
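A minimal sketch of that comparison follows, assuming each tree is serialized in breadth-first order and that nodes nearer the root are weighted more heavily; the 1/(1 + position) weighting is an illustrative assumption.

```python
# Compare a probe tree against a gallery of stored trees, truncating the
# longer tree, and return the stored tree with the lowest aggregated
# difference (the candidate positive match). Values are hypothetical.
from typing import List, Optional

def aggregated_difference(probe: List[float], stored: List[float]) -> float:
    n = min(len(probe), len(stored))        # truncate extra nodes
    total = 0.0
    for i in range(n):
        weight = 1.0 / (1 + i)              # root-adjacent nodes weigh more
        total += weight * abs(probe[i] - stored[i])
    return total

def best_match(probe: List[float],
               gallery: List[List[float]]) -> Optional[int]:
    if not gallery:
        return None
    diffs = [aggregated_difference(probe, g) for g in gallery]
    return diffs.index(min(diffs))
```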
[282] In some instances, the n sampled profiles 3002-1, 3002-2, … 3002-n may include information such as timestamps (i.e., the time the sampled profile was captured), the light used (i.e., exposure under UV light, IR light, and/or visible light), the resolution of the camera used for capturing the image associated with the sampled profile, or other information related to the sampled profiles 3002. [283] In certain aspects, when implementing the time-domain analysis, the one or more gateways 1402 and/or the ECD 1404 may attempt to update one or more of the n sampled profiles to accommodate changes in the appearance of the enroller. For example, the enroller may grow a beard or mustache, put on make-up, get a darker complexion from sun tan, receive a scar on his/her face from injuries, add piercings onto his/her face, wear glasses, or experience other events that may alter the face of the enroller. In certain aspects of the time-domain analysis, the one or more gateways 1402 and/or the ECD 1404 may designate certain sampled profiles as fixed (remaining part of the biometric template of the enroller) and other sampled profiles as updateable (replaceable). In one aspect of the time-domain analysis, the one or more gateways 1402 and/or the ECD 1404 may replace the oldest sampled profile with a new sampled profile obtained during verification and/or re-enrollment. The one or more gateways 1402 and/or the ECD 1404 may rely on the timestamps of the n sampled profiles 3002 to identify the oldest sampled profile. In another aspect of the time-domain analysis, the one or more gateways 1402 and/or the ECD 1404 may replace the sampled profile having the largest average deviation (as explained below) with a new sampled profile. [284] In some implementations, n matrices 3004-1, 3004-2, … 3004-n may show the numbers of occurrences of a local pattern value (e.g., a local binary pattern value or a local ternary pattern value). For example, the first matrix 3004-1 shows that the upper left zone of the first sampled profile 3002-1 includes 39 occurrences of the local pattern value, and the upper middle zone of the first sampled profile 3002-1 includes 121 occurrences of the local pattern value. Similarly, as shown in the second matrix 3004-2, the upper middle zone of the second sampled profile 3002-2 includes 125 occurrences of the local pattern value, and the upper right zone of the second sampled profile 3002-2 includes 99 occurrences of the local pattern value. [285] In some implementations, the one or more gateways 1402 and/or the ECD 1404 may generate an average matrix 3010 of the numbers of occurrences in each zone for the n sampled profiles 3002. For example, the average value for the upper left zone may be 39.33
(rounded to the nearest hundredth digit). The average value for the upper middle zone may be 122.33 (rounded to the nearest hundredth digit). The average value for the upper right zone may be 98.67 (rounded to the nearest hundredth digit), and so on. [286] In some implementations, the one or more gateways 1402 and/or the ECD 1404 may generate absolute deviation matrices 3020 including the absolute deviations between the average value of the numbers of occurrences in a zone and each of the numbers of occurrences. For example, the first absolute deviation matrix 3020-1 may include, for the upper left zone, a value of 0.33 (rounded to the nearest hundredth digit). The value 0.33 indicates the absolute deviation between the average value 39.33 and the number 39 in the first matrix 3004-1. The second absolute deviation matrix 3020-2 may include, for the upper middle zone, a value of 2.67 (rounded to the nearest hundredth digit). The value 2.67 indicates the absolute deviation between the average value 122.33 and the number 125 in the second matrix 3004-2. The nth absolute deviation matrix 3020-n may include, for the lower left zone, a value of 5.33 (rounded to the nearest hundredth digit). The value 5.33 indicates the absolute deviation between the average value 105.67 and the number 111 in the nth matrix 3004-n, and so on. [287] In some implementations, the one or more gateways 1402 and/or the ECD 1404 may compute an effective absolute deviation for each sampled profile. The effective absolute deviation may be the highest absolute deviation, the highest median of the absolute deviations for a sampled profile, or the sum of the absolute deviations for a sampled profile. In one implementation, the one or more gateways 1402 and/or the ECD 1404 may compute the effective absolute deviation (e.g., the sum) for the first sampled profile 3002-1 based on the first absolute deviation matrix 3020-1, and obtain the value of 8 (i.e., the sum of 0.33, 1.33, 0.33, 0.33, 0, 1.33, 2.67, 9.33, and 1.33). The effective absolute deviation (sum) of the second sampled profile 3002-2, as calculated based on the second absolute deviation matrix 3020-2, is 11 (i.e., the sum of 0.33, 2.67, 0.33, 0.33, 0, 1.33, 2.67, 0.67, and 2.67). The effective absolute deviation (sum) of the nth sampled profile 3002-n is 13. During the time-domain analysis, the one or more gateways 1402 and/or the ECD 1404 may replace the nth sampled profile 3002-n with a new sampled profile because the nth sampled profile 3002-n has the highest effective absolute deviation (sum). In other examples of the time-domain analysis, where the effective absolute deviation is instead the highest single absolute deviation, the one or more gateways 1402 and/or the ECD 1404 may replace the first sampled profile 3002-1 with a new sampled profile because the first sampled profile 3002-1 includes the highest absolute deviation (i.e., 9.33).
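The bookkeeping above might be sketched as follows; the 3 x 3 occurrence matrices are hypothetical stand-ins chosen to be consistent with the zone values quoted in the text (39, 121, 125, 99, 111 and the averages 39.33, 122.33, 98.67, 105.67), while the remaining cells are invented for illustration.

```python
# numpy sketch of FIG. 30's zone statistics: average matrix, absolute
# deviation matrices, effective (summed) deviations, and replacement of
# the profile that deviates most from the average.
import numpy as np

profiles = np.array([
    [[39, 121,  99], [110, 87, 93], [103, 118, 106]],  # profile 3002-1
    [[40, 125,  99], [109, 88, 92], [103, 117, 105]],  # profile 3002-2
    [[39, 121,  98], [111, 86, 94], [111, 119, 106]],  # profile 3002-n
], dtype=float)

average_matrix = profiles.mean(axis=0)            # matrix 3010; [0,0] = 39.33
abs_deviation = np.abs(profiles - average_matrix) # matrices 3020
effective = abs_deviation.sum(axis=(1, 2))        # one sum per profile

replace_index = int(np.argmax(effective))         # most-deviating profile
print(average_matrix[0, 0], effective, replace_index)
```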
[288] In some implementations, the one or more gateways 1402 and/or the ECD 1404 may update the biometric template based on the new sampled profile. [289] Turning to FIG. 31, an example of a facial recognition process may begin with the ECD 1404 capturing an image 3100 of a face of a user. The image 3100 may be captured under one or more of UV light, IR light, or visible light. The performance of facial recognition algorithms may depend on the positioning of the face in the image 3100. Accurate recognition may require a full frontal image, or a near full frontal image, e.g., within one or more predetermined thresholds. A full frontal image may be an image having no pitch, yaw, or roll of the face (described below). A full frontal image may allow for maximum exposure of all facial detail. However, a person interacting with a camera may not inherently present his/her face in a full frontal position. The face is typically presented with varying degrees of yaw, pitch, and roll. The presentation of a full frontal image may be time-consuming and may weaken an advantage of facial recognition. [290] In some implementations, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to perform image processing. During the image processing, the ECD 1404 may identify one or more facial landmarks 3102. The one or more facial landmarks 3102 may be points at predetermined locations on the face of the user. For example, the ECD 1404 may identify a landmark 3102-1 near the right corner of the right eye of the user, a landmark 3102-2 near the left corner of the right eye of the user, a landmark 3102-3 near the right corner of the left eye of the user, a landmark 3102-4 near the left corner of the left eye of the user, a landmark 3102-5 near the tip of the nose of the user, and a landmark 3102-6 near the tip of the chin of the user. Other landmarks may also be used, e.g., the corners of the mouth. The landmarks 3102 may be used by the processor 1802, the processing board 1820, and/or the image processing algorithm stored in the memory 1804 to compute spatial deviations of the image 3100 from a full frontal image of the user, e.g., the yaw, roll, and/or pitch of the captured image 3100, as described below. [291] Turning now to FIG. 32, an example 3200 illustrating the yaw, roll, and pitch of a captured image may align a head 3202 of the user to a roll axis 3210, a pitch axis 3220, and a yaw axis 3230. In some implementations, the head 3202 may be in the “neutral” position (no roll, pitch, or yaw) with respect to the ECD 1404 when the ECD 1404 is able to take a full frontal image of the head 3202 of the user without further adjustment. In some examples, the head 3202 may tilt to one side or another when the ECD 1404 captures the image. Tilting the
head 3202 may cause the image of the head 3202 to include a roll (i.e., rotation about the roll axis 3210). In other examples, the head 3202 may raise up or lower when the ECD 1404 captures the image. Raising or lowering the head 3202 may cause the image of the head 3202 to include a pitch (i.e., rotation about the pitch axis 3220). In certain examples, the head 3202 may turn to the left or the right when the ECD 1404 captures the image. Turning the head 3202 may cause the image of the head 3202 to include a yaw (i.e., rotation about the yaw axis 3230). In some implementations, the head 3202 may tilt, raise/lower, and/or turn, causing the image of the head 3202 to include any one or any combination of a roll, pitch, or yaw. [292] Referring to FIG. 33, in some implementations, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to calculate the amounts of roll, pitch, and/or yaw. In one implementation, a yaw measurement technique 3300 may include measuring a first distance 3302 between the landmark 3102-1 (i.e., the right corner of the right eye of the user) and the landmark 3102-5 (i.e., the tip of the nose of the user) and a second distance 3304 between the landmark 3102-4 (i.e., the left corner of the left eye of the user) and the landmark 3102-5. The ratio between the first distance 3302 and the second distance 3304 may represent a yaw angle, which may quantify an amount of yaw in the captured image. [293] In some aspects, a pitch measurement technique 3310 may include measuring a first distance 3312 between the landmark 3102-2 (i.e., the left corner of the right eye of the user) and the landmark 3102-5 (i.e., the tip of the nose of the user) and a second distance 3314 between the landmark 3102-5 and the landmark 3102-6 (i.e., the tip of the chin of the user). The ratio between the first distance 3312 and the second distance 3314 may represent a pitch angle, which may quantify an amount of pitch in the captured image. [294] In some aspects, a roll measurement technique 3320 may include measuring a vector 3322 between the landmark 3102-1 (i.e., the right corner of the right eye of the user) and the landmark 3102-4 (i.e., the left corner of the left eye of the user). The vector 3322 may be measured against a horizontal vector. The angle between the vector 3322 and the horizontal vector may represent a roll angle, which may quantify an amount of roll in the captured image. To correct for the roll, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to rotate the captured image until the vector 3322 is parallel with the horizontal vector. The corrections for the yaw and the pitch will be described in detail below.
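A sketch of the three measurement techniques follows, assuming the landmarks 3102 are available as (x, y) pixel coordinates; the coordinates below are hypothetical detections.

```python
# Techniques 3300 (yaw), 3310 (pitch), and 3320 (roll) from the landmark
# geometry described above.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def yaw_ratio(right_eye_outer, left_eye_outer, nose_tip):
    # Ratio of the two eye-corner-to-nose distances; ~1.0 means no yaw.
    return distance(right_eye_outer, nose_tip) / distance(left_eye_outer, nose_tip)

def pitch_ratio(right_eye_inner, nose_tip, chin_tip):
    # Eye-to-nose distance over nose-to-chin distance.
    return distance(right_eye_inner, nose_tip) / distance(nose_tip, chin_tip)

def roll_angle_degrees(right_eye_outer, left_eye_outer):
    # Angle between the inter-eye vector 3322 and the horizontal.
    dx = left_eye_outer[0] - right_eye_outer[0]
    dy = left_eye_outer[1] - right_eye_outer[1]
    return math.degrees(math.atan2(dy, dx))

lm = {  # hypothetical detections (the user's right eye appears on the image's left)
    "3102-1": (180, 200), "3102-2": (220, 201), "3102-4": (300, 206),
    "3102-5": (242, 280), "3102-6": (245, 400),
}
print(yaw_ratio(lm["3102-1"], lm["3102-4"], lm["3102-5"]))
print(pitch_ratio(lm["3102-2"], lm["3102-5"], lm["3102-6"]))
print(roll_angle_degrees(lm["3102-1"], lm["3102-4"]))
```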
[295] Referring to FIG. 34, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to overlay a macroblock 3410 (e.g., 10 x 10 blocks) onto the image 3100 captured by the ECD 1404. The ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to align the macroblock 3410 to the landmarks 3102. In one aspect, the ECD 1404 and/or components of the ECD 1404 may align certain blocks of the macroblock 3410 to the landmarks 3102. In a non-limiting example, the blocks in the top-most row of the macroblock 3410 may be numbered from 1 to 10 from the top left of the face of the user to the top right of the face of the user. The second top-most row of the macroblock 3410 may be numbered from 11 to 20 from the top left of the face of the user to the top right of the face of the user, and so forth. Based on the coordinate system described, the landmark 3102-1 may be aligned such that the landmark 3102-1 overlays a boundary between block 20 and block 30. The landmark 3102-3 may be aligned such that the landmark 3102-3 is entirely within block 24. The landmark 3102-5 may be aligned to the center line of the macroblock 3410. Other coordinate systems may also be used to align the landmarks 3102 to the blocks of the macroblock 3410. [296] Turning now to FIG. 35 and referencing FIG. 34, an example of a technique 3500 for correcting yaw may divide the macroblock 3410 into two 10 x 5 sub-macroblocks. For example, if a head 3202-1 (as seen from above) turns to the right, one or more portions of the macroblock 3410 overlaying the face of the user may be misaligned due to the yaw (e.g., the landmark 3102-5 is no longer aligned to the center line of the macroblock 3410). To adjust for the yaw, the ECD 1404 may split the macroblock 3410-1 into a first sub-macroblock 3410-1A and a second sub-macroblock 3410-1B. The first sub-macroblock 3410-1A and the second sub-macroblock 3410-1B may each include 10 x 5 blocks. The first sub-macroblock 3410-1A and the second sub-macroblock 3410-1B may share a center line 3412 that intersects the landmarks 3102-5 and/or 3102-6. Due to the head 3202-1 turning to the right (i.e., yaw), the first sub-macroblock 3410-1A may be compressed horizontally and the second sub-macroblock 3410-1B may be expanded horizontally. To correct for the yaw, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by expanding the first sub-macroblock 3410-1A and compressing the second sub-macroblock 3410-1B. In one example, the ECD 1404 (or one or more of its subcomponents) may adjust the first sub-macroblock 3410-1A and/or the second sub-macroblock 3410-1B to have identical areas.
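A hedged OpenCV sketch of this correction follows, assuming a cropped face image and the pixel column of the landmark 3102-5 are available; the output size and interpolation mode are illustrative assumptions, not disclosed parameters.

```python
# Split the face at the vertical center line through the nose landmark and
# resize the two halves to identical areas, approximating a frontal view.
import cv2
import numpy as np

def correct_yaw(face: np.ndarray, nose_x: int,
                out_w: int = 100, out_h: int = 100) -> np.ndarray:
    """face: cropped face image; nose_x: column of landmark 3102-5.
    Assumes 0 < nose_x < face width."""
    left_half = face[:, :nose_x]      # sub-macroblock A (10 x 5 blocks)
    right_half = face[:, nose_x:]     # sub-macroblock B (10 x 5 blocks)
    half_w = out_w // 2
    # The compressed half is expanded and the expanded half compressed so
    # that both end up with identical areas.
    left = cv2.resize(left_half, (half_w, out_h), interpolation=cv2.INTER_LINEAR)
    right = cv2.resize(right_half, (half_w, out_h), interpolation=cv2.INTER_LINEAR)
    return np.hstack([left, right])
```

The pitch correction of FIG. 36, described below, could be sketched analogously by splitting on the horizontal dividing line through the eye landmarks and resizing the two bands to a 20%/80% split.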
[297] In another example, if a head 3202-2 (as seen from above) turns to the left, one or more portions of the macroblock 3410 overlaying the face of the user may be misaligned due to the yaw (e.g., the landmark 3102-5 is no longer aligned to the center line of the macroblock 3410). To adjust for the yaw, the ECD 1404 may split the macroblock 3410-2 into a third sub-macroblock 3410-2A and a fourth sub-macroblock 3410-2B. The third sub-macroblock 3410-2A and the fourth sub-macroblock 3410-2B may each include 10 x 5 blocks. The third sub-macroblock 3410-2A and the fourth sub-macroblock 3410-2B may share the center line 3412 that intersects the landmarks 3102-5 and/or 3102-6. Due to the head 3202-2 turning to the left (i.e., yaw), the third sub-macroblock 3410-2A may be expanded horizontally and the fourth sub-macroblock 3410-2B may be compressed horizontally. To correct for the yaw, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by compressing the third sub-macroblock 3410-2A and expanding the fourth sub-macroblock 3410-2B. In one example, the ECD 1404 (or one or more of its subcomponents) may adjust the third sub-macroblock 3410-2A and/or the fourth sub-macroblock 3410-2B to have identical areas. [298] Turning now to FIG. 36 and referencing FIG. 34, an example of a technique 3600 for correcting pitch may divide the macroblock 3410 into a 2 x 10 sub-macroblock and an 8 x 10 sub-macroblock. For example, if a head 3202-3 (as seen from the right) raises, one or more portions of the macroblock 3410 overlaying the face of the user may be misaligned due to the pitch (e.g., the landmark 3102-1 is no longer aligned to a boundary between block 20 and block 30). To adjust for the pitch, the ECD 1404 may split the macroblock 3410-3 into a fifth sub-macroblock 3410-3A and a sixth sub-macroblock 3410-3B. The fifth sub-macroblock 3410-3A may include 2 x 10 blocks and the sixth sub-macroblock 3410-3B may include 8 x 10 blocks. The fifth sub-macroblock 3410-3A and the sixth sub-macroblock 3410-3B may share a dividing line 3414 that intersects the landmark 3102-1 and/or the landmark 3102-2. Due to the head 3202-3 raising (i.e., pitch), the fifth sub-macroblock 3410-3A and the sixth sub-macroblock 3410-3B may be compressed vertically. To correct for the pitch, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by expanding the fifth sub-macroblock 3410-3A and the sixth sub-macroblock 3410-3B. In one example, the ECD 1404 (or one or more of its subcomponents) may adjust the fifth sub-macroblock 3410-3A to occupy
20% of the area of the macroblock 3410-3 and adjust the sixth sub-macroblock 3410-3B to occupy 80% of the area of the macroblock 3410-3. [299] In another example, if a head 3202-4 (as seen from the right) lowers, one or more portions of the macroblock 3410 overlaying the face of the user may be misaligned due to the pitch (e.g., the landmark 3102-1 is no longer aligned to a boundary between block 20 and block 30). To adjust for the pitch, the ECD 1404 may split the macroblock 3410-4 into a seventh sub-macroblock 3410-4A and an eighth sub-macroblock 3410-4B. The seventh sub-macroblock 3410-4A may include 2 x 10 blocks and the eighth sub-macroblock 3410-4B may include 8 x 10 blocks. The seventh sub-macroblock 3410-4A and the eighth sub-macroblock 3410-4B may share the dividing line 3414 that intersects the landmark 3102-1 and/or the landmark 3102-2. Due to the head 3202-4 lowering (i.e., pitch), the seventh sub-macroblock 3410-4A may be compressed vertically and the eighth sub-macroblock 3410-4B may be expanded vertically. To correct for the pitch, the ECD 1404 may utilize the processor 1802, the processing board 1820, and/or applications stored in the memory 1804 to reconstruct a frontal image of the face of the user by expanding the seventh sub-macroblock 3410-4A and compressing the eighth sub-macroblock 3410-4B. In one example, the ECD 1404 (or one or more of its subcomponents) may adjust the seventh sub-macroblock 3410-4A to occupy 20% of the area of the macroblock 3410-4 and adjust the eighth sub-macroblock 3410-4B to occupy 80% of the area of the macroblock 3410-4. [300] By adjusting the macroblock location (top and left) and the resulting size in this manner, a given macroblock may capture the same area of the face regardless of the yaw and pitch of the face. This given area of the face may be smaller or larger based on the type of movement, but the boundary may remain the same. For instance, a macroblock may include the right corner of the right eye, which may include a landmark, in the full frontal position. The macroblock will continue to include the right corner of the right eye regardless of the pitch and yaw angles. The area that it covers in the full frontal position will be 10 x 10 blocks, but this area will change depending on the pitch and yaw angles. The macroblock may become 4 x 6 blocks or 14 x 8 blocks, but it will always cover the same portion of the face. By digitally aligning the orientation of the face to the camera, the same region of the face may be aligned for each macroblock. [301] In some implementations, the macroblock 3410 may be converted into a ULBP histogram. The histogram may be constructed by calculating the ULBP for each pixel within the macroblock and then aggregating the resulting ULBPs into a histogram. The amplitude of the histogram may be the number of pixels with that ULBP value within the macroblock. To normalize the histogram across macroblocks with varying dimensions, the histogram amplitude values may be divided by the area of the macroblock in pixels. The amplitude then represents the percentage of the macroblock containing the given ULBP value. These normalized values may be used in the remainder of the algorithm to characterize the face (as described above).
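This conversion might be sketched with scikit-image's uniform LBP as a stand-in for the ULBP described above (an assumption; with P = 8 neighbors the 'uniform' method yields P + 2 = 10 bins):

```python
# Build a ULBP histogram for one macroblock and normalize it by the
# macroblock area so that histograms from macroblocks of different sizes
# (e.g., 4 x 6 versus 14 x 8) remain directly comparable.
import numpy as np
from skimage.feature import local_binary_pattern

def ulbp_histogram(macroblock: np.ndarray, points: int = 8,
                   radius: float = 1.0) -> np.ndarray:
    codes = local_binary_pattern(macroblock, P=points, R=radius,
                                 method="uniform")
    bins = points + 2
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / macroblock.size     # amplitudes become percentages

block = (np.random.rand(14, 8) * 255).astype(np.uint8)
print(ulbp_histogram(block).sum())    # ~1.0
```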
[302] In some implementations, a collision between a first biometric template of a first user and a second biometric template of a second user may occur when the overlap of features between the two users causes the biometric data matching algorithm to mistake one user for the other and grant a false accept or a false reject. For example, the gateways 1402 and/or the ECD 1404 may identify a positive match between the first user and the second biometric template. The collision may be identified proactively (i.e., the collision is identified independent of any actions). In a non-limiting example, the collision may be identified by the gateways 1402 and/or the ECD 1404 when the environment 1400 is experiencing few (e.g., less than 1 per hour) or zero access requests. The collision may be identified due to an access request by the first user or the second user. The collision may be identified independent of any access request. The gateways 1402 and/or the ECD 1404 may identify overlapping features between the first biometric template and the second biometric template. In some aspects, the gateways 1402 and/or the ECD 1404 may determine a collision when 50% or more of the features in the first biometric template overlap with the features in the second biometric template. [303] In response to detecting a collision, the gateways 1402 and/or the ECD 1404 may notify an administrator, such as the network administrator 1452, security personnel, and/or other relevant employees, about the collision. The gateways 1402 and/or the ECD 1404 may notify the administrator via automatically generated email and/or text messages. The gateways 1402 and/or the ECD 1404 may notify the first user and/or the second user to approach any of the ECDs 1404a, 1404b, 1404c, 1404d, 1404e, 1404f to re-enroll.
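A minimal sketch of the 50% overlap test follows; representing each template as a flat feature vector and counting agreement within a tolerance are illustrative assumptions.

```python
# Count how many corresponding features of two templates agree within a
# tolerance and flag a collision at 50% or more overlap.
from typing import Sequence

def overlap_fraction(a: Sequence[float], b: Sequence[float],
                     tol: float = 0.02) -> float:
    n = min(len(a), len(b))
    matches = sum(1 for x, y in zip(a[:n], b[:n]) if abs(x - y) <= tol)
    return matches / n if n else 0.0

def is_collision(template_a: Sequence[float],
                 template_b: Sequence[float]) -> bool:
    return overlap_fraction(template_a, template_b) >= 0.50

# During idle periods (e.g., fewer than one access request per hour), the
# gateway could sweep template pairs and notify an administrator whenever
# is_collision(...) returns True.
```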
[304] It will be appreciated that various implementations of the above-disclosed and other features and functions, or alternatives or varieties thereof, may be desirably combined into many other different systems or applications by one ordinarily skilled in the art. Also, various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may be subsequently made by those skilled in the art, which are also intended to be encompassed by the following claims. [305] The detailed description set forth above in connection with the appended drawings describes examples and does not represent the only examples that may be implemented or that are within the scope of the claims. The term “example,” when used in this description, means “serving as an example, instance, or illustration,” and not “preferred” or “advantageous over other examples.” The detailed description includes specific details for the purpose of providing an understanding of the described techniques. These techniques, however, may be practiced without these specific details. For example, changes may be made in the function and arrangement of elements discussed without departing from the scope of the disclosure. Also, various examples may omit, substitute, or add various procedures or components as appropriate. For instance, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. Also, features described with respect to some examples may be combined in other examples. In some instances, well-known structures and apparatuses are shown in block diagram form in order to avoid obscuring the concepts of the described examples. [306] Information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, computer-executable code or instructions stored on a computer-readable medium, or any combination thereof. [307] The various illustrative blocks and components described in connection with the disclosure herein may be implemented or performed with a specially-programmed device, such as but not limited to a processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic, a discrete hardware component, or any combination thereof designed to perform the functions described herein. A specially-programmed processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A specially-programmed processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. [308] The functions described herein may be implemented in hardware, software executed by a processor, firmware, or any combination thereof. If implemented in software executed by a processor, the functions may be stored on or transmitted over as one or more instructions or code on a non-transitory computer-readable medium. Other examples and implementations are within the scope and spirit of the disclosure and appended claims. For
example, due to the nature of software, functions described above can be implemented using software executed by a specially programmed processor, hardware, firmware, hardwiring, or combinations of any of these. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations. Also, as used herein, including in the claims, “or” as used in a list of items prefaced by “at least one of” indicates a disjunctive list such that, for example, a list of “at least one of A, B, or C” means A or B or C or AB or AC or BC or ABC (i.e., A and B and C). [309] Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, computer- readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code means in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of computer-readable media.
Claims
CLAIMS
What we claim is:
1. A method of generating a biometric template, comprising:
emitting an incident non-visible light toward a requester;
detecting a detected non-visible light from the requester;
generating a sampled profile including a plurality of sampling points having a plurality of intensities of the detected non-visible light;
identifying one or more macroblocks, wherein each macroblock includes a coordinate and a subset of the plurality of sampling points;
calculating a number of occurrences of a local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks;
generating a first array including a plurality of weighted values, wherein each of the plurality of weighted values is calculated by dividing the corresponding number of occurrences of the local pattern value by a size of a corresponding macroblock of the one or more macroblocks;
assigning a unique index to each of the plurality of weighted values;
generating a second array of the unique indices by ranking the unique indices based on an associated weighted value;
generating a third array including a plurality of ranking distances between a highest weighted value of the plurality of weighted values and each weighted value of the plurality of weighted values;
constructing a biometric template of the requester based on the third array and the coordinates; and
wherein the sampled profile is different than a visible profile including a plurality of visible sampling points having a plurality of intensities of detected visible light.
2. The method of claim 1, further comprising:
storing the sampled profile of the requester in a storage device;
receiving confidential information associated with the requester; and
associating the confidential information with the sampled profile.
3. The method of claim 1, further comprising:
generating an encryption key based on the biometric template;
encrypting data using the encryption key to generate encrypted data; and
storing the encrypted data.
4. The method of claim 3, further comprising:
generating a decryption key based on the biometric template;
decrypting the encrypted data using the decryption key to recover the data; and
accessing content of the data.
5. The method of claim 4, wherein the encryption key is a public key and the decryption key is a private key.
6. The method of claim 3, wherein the data includes at least one of confidential information, the sampled profile, a document, or a movie.
7. The method of claim 1, further comprising associating at least one of a fingerprint of the requester, a voice pattern of the requester, an iris pattern of the requester, a facial feature of the requester, a signature pattern of the requester, a shape of an ear of the requester, a retinal pattern of the requester, a gait of the requester, or a hand geometry of the requester with the biometric template for identification.
8. An edge capture device (ECD), comprising:
a memory that stores a plurality of biometric templates;
a communication interface; and
a processor communicatively coupled with the memory and the communication interface, wherein the processor is configured to:
cause a light source to emit an incident non-visible light toward a requester;
cause an optical sensor to detect a detected non-visible light from the requester;
generate a sampled profile including a plurality of sampling points having a plurality of intensities of the detected non-visible light;
identify one or more macroblocks, wherein each macroblock includes a coordinate and a subset of the plurality of sampling points;
calculate a number of occurrences of a local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks;
generate a first array including a plurality of weighted values, wherein each of the plurality of weighted values is calculated by dividing the corresponding number of occurrences of the local pattern value by a size of a corresponding macroblock of the one or more macroblocks;
assign a unique index to each of the plurality of weighted values;
generate a second array of the unique indices by ranking the unique indices based on an associated weighted value;
generate a third array including a plurality of ranking distances between a highest weighted value of the plurality of weighted values and each weighted value of the plurality of weighted values;
construct a biometric template of the requester based on the third array and the coordinates; and
wherein the sampled profile is different than a visible profile including a plurality of visible sampling points having a plurality of intensities of detected visible light.
9. The ECD of claim 8, wherein the processor is further configured to:
store the sampled profile of the requester in a storage device;
receive confidential information associated with the requester; and
associate the confidential information with the sampled profile.
10. The ECD of claim 8, wherein the processor is further configured to:
generate an encryption key based on the biometric template;
encrypt data using the encryption key to generate encrypted data; and
store the encrypted data in the memory.
11. The ECD of claim 10, wherein the processor is further configured to:
generate a decryption key based on the biometric template;
decrypt the encrypted data using the decryption key to recover the data; and
access content of the data.
12. The ECD of claim 11, wherein the encryption key is a public key and the decryption key is a private key.
13. The ECD of claim 10, wherein the data includes at least one of confidential information, the sampled profile, a document, or a movie.
14. The ECD of claim 8, wherein the processor is further configured to associate at least one of a fingerprint of the requester, a voice pattern of the requester, an iris pattern of the requester, a facial feature of the requester, a signature pattern of the requester, a shape of an ear of the requester, a retinal pattern of the requester, a gait of the requester, or a hand geometry of the requester with the biometric template.
15. A non-transitory computer readable medium having instructions stored therein that, when executed by one or more processors of an edge capture device (ECD), cause the one or more processors to:
cause a light source to emit an incident non-visible light toward a requester;
cause an optical sensor to detect a detected non-visible light from the requester;
generate a sampled profile including a plurality of sampling points having a plurality of intensities of the detected non-visible light;
identify one or more macroblocks, wherein each macroblock includes a coordinate and a subset of the plurality of sampling points;
calculate a number of occurrences of a local pattern value within each subset of the plurality of the sampling points for each of the one or more macroblocks;
generate a first array including a plurality of weighted values, wherein each of the plurality of weighted values is calculated by dividing the corresponding number of occurrences of the local pattern value by a size of a corresponding macroblock of the one or more macroblocks;
assign a unique index to each of the plurality of weighted values;
generate a second array of the unique indices by ranking the unique indices based on an associated weighted value;
generate a third array including a plurality of ranking distances between a highest weighted value of the plurality of weighted values and each weighted value of the plurality of weighted values;
construct a biometric template of the requester based on the third array and the coordinates; and
wherein the sampled profile is different than a visible profile including a plurality of visible sampling points having a plurality of intensities of detected visible light.
16. The non-transitory computer readable medium of claim 15, further comprising instructions that, when executed by the one or more processors, cause the one or more processors to:
store the sampled profile of the requester in a storage device;
receive confidential information associated with the requester; and
associate the confidential information with the sampled profile.
17. The non-transitory computer readable medium of claim 15, further comprising instructions that, when executed by the one or more processors, cause the one or more processors to:
generate an encryption key based on the biometric template;
encrypt data using the encryption key to generate encrypted data; and
store the encrypted data.
18. The non-transitory computer readable medium of claim 17, further comprising instructions that, when executed by the one or more processors, cause the one or more processors to:
generate a decryption key based on the biometric template;
decrypt the encrypted data using the decryption key to recover the data; and
access content of the data.
19. The non-transitory computer readable medium of claim 18, wherein the encryption key is a public key and the decryption key is a private key.
20. The non-transitory computer readable medium of claim 15, further comprising instructions that, when executed by the one or more processors, cause the one or more processors to associate at least one of a fingerprint of the requester, a voice pattern of the requester, an iris pattern of the requester, a facial feature of the requester, a signature pattern
of the requester, a shape of an ear of the requester, a retinal pattern of the requester, a gait of the requester, or a hand geometry of the requester with the biometric template.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20911109.5A EP4085372A4 (en) | 2019-12-30 | 2020-12-29 | Methods and apparatus for facial recognition |
CA3163432A CA3163432A1 (en) | 2019-12-30 | 2020-12-29 | Methods and apparatus for facial recognition |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/730,578 US11275929B2 (en) | 2012-09-07 | 2019-12-30 | Methods and apparatus for privacy protection during biometric verification |
US16/730,578 | 2019-12-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021138342A1 true WO2021138342A1 (en) | 2021-07-08 |
Family
ID=76687265
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2020/067338 WO2021138342A1 (en) | 2019-12-30 | 2020-12-29 | Methods and apparatus for facial recognition |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4085372A4 (en) |
CA (1) | CA3163432A1 (en) |
WO (1) | WO2021138342A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11625665B1 (en) * | 2022-03-29 | 2023-04-11 | Todd Martin | Contactless authorized event entry and item delivery system and method |
US20230154233A1 (en) * | 2021-11-16 | 2023-05-18 | Deep Et | Apparatus and method for face recognition using user terminal |
CN117896546A (en) * | 2024-03-14 | 2024-04-16 | 浙江华创视讯科技有限公司 | Data transmission method, system, electronic equipment and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160328600A1 (en) * | 2014-12-01 | 2016-11-10 | Xiamen ZKTeco Electronic Biometric Identification Technology Co., Ltd. | System and method for personal identification based on multimodal biometric information |
US20180211092A9 (en) * | 2013-09-16 | 2018-07-26 | EyeVerify Inc. | Biometric template security and key generation |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6154559A (en) * | 1998-10-01 | 2000-11-28 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | System for classifying an individual's gaze direction |
2020
- 2020-12-29 EP EP20911109.5A patent/EP4085372A4/en active Pending
- 2020-12-29 WO PCT/US2020/067338 patent/WO2021138342A1/en unknown
- 2020-12-29 CA CA3163432A patent/CA3163432A1/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180211092A9 (en) * | 2013-09-16 | 2018-07-26 | EyeVerify Inc. | Biometric template security and key generation |
US20160328600A1 (en) * | 2014-12-01 | 2016-11-10 | Xiamen ZKTeco Electronic Biometric Identification Technology Co., Ltd. | System and method for personal identification based on multimodal biometric information |
Non-Patent Citations (1)
Title |
---|
See also references of EP4085372A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230154233A1 (en) * | 2021-11-16 | 2023-05-18 | Deep Et | Apparatus and method for face recognition using user terminal |
US11625665B1 (en) * | 2022-03-29 | 2023-04-11 | Todd Martin | Contactless authorized event entry and item delivery system and method |
US11755986B1 (en) | 2022-03-29 | 2023-09-12 | Todd Martin | Combined flow-thru facial recognition for mass spectator event entry and item fulfillment system and method |
CN117896546A (en) * | 2024-03-14 | 2024-04-16 | 浙江华创视讯科技有限公司 | Data transmission method, system, electronic equipment and storage medium |
CN117896546B (en) * | 2024-03-14 | 2024-06-07 | 浙江华创视讯科技有限公司 | Data transmission method, system, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
EP4085372A1 (en) | 2022-11-09 |
EP4085372A4 (en) | 2024-08-07 |
CA3163432A1 (en) | 2021-07-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10853630B2 | | Methods and apparatus for biometric verification |
US11163984B2 | | Methods and apparatus for constructing biometrical templates using facial profiles of users |
US11163983B2 | | Methods and apparatus for aligning sampling points of facial profiles of users |
CA3109748A1 | | Methods and apparatus for facial recognition |
US10438053B2 | | Biometric identification systems and methods |
Del Rio et al. | | Automated border control e-gates and facial recognition systems |
WO2021138342A1 (en) | | Methods and apparatus for facial recognition |
US9076048B2 | | Biometric identification, authentication and verification using near-infrared structured illumination combined with 3D imaging of the human ear |
US20150015365A1 | | Point of entry authorization utilizing rfid enabled profile and biometric data |
US11017212B2 | | Methods and apparatus for biometric verification |
Das | | Biometric technology: authentication, biocryptography, and cloud-based architecture |
US11275929B2 | | Methods and apparatus for privacy protection during biometric verification |
US11301670B2 | | Methods and apparatus for collision detection in biometric verification |
US11594072B1 | | Methods and apparatus for access control using biometric verification |
US11017213B1 | | Methods and apparatus for biometric verification |
US11017214B1 | | Methods and apparatus for biometric verification |
CA3132721A1 | | Methods and apparatus for facial recognition |
CN117501265A | | Method for authenticating a user of a mobile device |
Kumari | | ENHANCING PAYMENT SECURITY THROUGH THE IMPLEMENTATION OF DEEP LEARNING-BASED FACIAL RECOGNITION SYSTEMS IN MOBILE BANKING APPLICATIONS |
Phillips et al. | | A cancellable and privacy-preserving facial biometric authentication scheme |
Ammaarah et al. | | Photographic Methods in Enhancing Biometric Security Systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20911109; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 3163432; Country of ref document: CA |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | ENP | Entry into the national phase | Ref document number: 2020911109; Country of ref document: EP; Effective date: 20220801 |
Ref document number: 2020911109 Country of ref document: EP Effective date: 20220801 |