US20210365535A1 - Eye scanner for user identification and security in an eyewear device - Google Patents
- Publication number
- US20210365535A1 (U.S. application Ser. No. 17/397,790)
- Authority
- US
- United States
- Prior art keywords
- infrared
- user
- eye
- frame
- eyewear device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/0861—Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G06K9/00617—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
Definitions
- the present subject matter relates to eye scanners for an eyewear device, e.g., smart glasses, for user identification and security.
- Portable eyewear devices such as smartglasses, headwear, and headgear available today integrate cameras and displays. Users of such portable eyewear devices may share such eyewear devices with friends and family members so that any user can borrow the eyewear device to capture images with the integrated camera.
- Verifying the identity of the specific user of the portable eyewear device can be useful. For example, as augmented reality becomes more prevalent in such eyewear devices, applications may be developed that need to verify the identity of the user for security purposes.
- FIG. 1 is a rear view of an example hardware configuration of an eyewear device, which includes an eye scanner on a frame, for use in a system for identifying a user of the eyewear device.
- FIG. 2 is a rear view of an example hardware configuration of another eyewear device, which includes an eye scanner on a chunk, for use in a system for identifying a user of the eyewear device.
- FIG. 3 shows a rear perspective sectional view of the eyewear device of FIG. 1 depicting an infrared camera, a frame front, a frame back, and a circuit board.
- FIG. 4 is a cross-sectional view taken through the infrared camera and the frame of the eyewear device of FIG. 3 .
- FIG. 5 shows a rear perspective view of the eyewear device of FIG. 1 depicting an infrared emitter, an infrared camera, a frame front, a frame back, and a circuit board.
- FIG. 6 is a cross-sectional view taken through the infrared emitter and the frame of the eyewear device of FIG. 5 .
- FIG. 7 is a top cross-sectional view of the chunk of the eyewear device of FIG. 2 depicting the infrared emitter, the infrared camera, and a circuit board.
- FIG. 8A depicts an example of a pattern of infrared light emitted by an infrared emitter of the eyewear device and reflection variations of the emitted pattern of infrared light captured by the infrared camera of the eyewear device.
- FIG. 8B depicts the emitted pattern of infrared light being emitted by the infrared emitter of the eyewear device in an inwards facing field of view towards an eye of a user.
- FIG. 9 is a high-level functional block diagram of an example user identification system including the eyewear device, a mobile device, and a server system connected via various networks.
- FIG. 10 shows an example of a hardware configuration for the mobile device of the user identification system of FIG. 9 , in simplified block diagram form.
- FIG. 11A shows various alternate locations for the eye scanner on the eyewear device, which can be used individually or in combination.
- FIGS. 11B, 11C, and 11D illustrate the effects of the various alternate locations on the eyewear device with respect to different orientations of the eye of the user.
- FIG. 12 is a flowchart of the operation of the eyewear device and other components of the user identification system.
- Coupled refers to any logical, optical, physical or electrical connection, link or the like by which signals or light produced or supplied by one system element are imparted to another coupled element. Unless described otherwise, coupled elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements or communication media that may modify, manipulate or carry the light or signals.
- the orientations of the eyewear device, associated components and any complete devices incorporating an eye scanner such as shown in any of the drawings, are given by way of example only, for illustration and discussion purposes.
- the eyewear device may be oriented in any other direction suitable to the particular application of the eyewear device, for example up, down, sideways, or any other orientation.
- any directional term such as front, rear, inwards, outwards, towards, left, right, lateral, longitudinal, up, down, upper, lower, top, bottom and side, are used by way of example only, and are not limiting as to direction or orientation of any optic or component of an optic constructed as otherwise described herein.
- a system in an example, includes an eyewear device.
- the eyewear device includes a frame and a temple connected to a lateral side of the frame.
- the eyewear device further includes an infrared emitter connected to the frame or the temple to emit a pattern of infrared light.
- the eyewear device further includes an infrared camera connected to the frame or the temple to capture reflection variations in the emitted pattern of infrared light.
- the system further includes a processor coupled to the eyewear device, a memory accessible to the processor, and programming in the memory.
- Execution of the programming by the processor configures the system to perform functions, including functions to emit, via the infrared emitter, the pattern of infrared light on an eye of a user of the eyewear device.
- the execution of the programming by the processor further configures the system to capture, via the infrared camera, the reflection variations in the emitted pattern of infrared light on the eye of the user.
- the execution of the programming by the processor further configures the system to identify a user, or an account, of the eyewear device based on the reflection variations of the emitted pattern of infrared light on the eye of the user.
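The three programmed functions described above (emit a pattern, capture its reflection variations, identify the user) can be illustrated with a minimal Python sketch. All function names, the simulated optics, and the exact-match lookup are illustrative assumptions, not the claimed implementation:

```python
def emit_pattern(rows=4, cols=4):
    """Return a standardized grid of emitter points in normalized coordinates."""
    return [(r / (rows - 1), c / (cols - 1))
            for r in range(rows) for c in range(cols)]

def capture_reflections(pattern, eye_absorption):
    """Simulate per-point reflectance: tissue that absorbs more IR reflects less."""
    return [1.0 - eye_absorption(x, y) for (x, y) in pattern]

def encode_template(reflections, threshold=0.5):
    """Binarize the captured reflectances into a compact biometric template."""
    return tuple(1 if r >= threshold else 0 for r in reflections)

def identify(template, enrolled):
    """Return the enrolled user whose stored template matches, else None."""
    for user, stored in enrolled.items():
        if stored == template:
            return user
    return None

# Usage: enroll a simulated eye, then identify it from a fresh capture.
eye = lambda x, y: 0.9 if x < 0.5 else 0.1  # hypothetical absorption map
enrolled = {"alice": encode_template(capture_reflections(emit_pattern(), eye))}
captured = encode_template(capture_reflections(emit_pattern(), eye))
```

In practice the comparison would tolerate noise rather than require an exact match; the sketch only shows the emit/capture/identify data flow.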
- FIG. 1 is a rear view of an example hardware configuration of an eyewear device 100 , which includes an eye scanner 113 on a frame 105 , for use in a system for identifying a user of the eyewear device 100 .
- the eyewear device 100 is in a form configured for wearing by a user, which is eyeglasses in the example of FIG. 1 .
- the eyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet.
- eyewear device 100 includes a frame 105 which includes a left rim 107 A connected to a right rim 107 B via a bridge 106 adapted for a nose of the user.
- the left and right rims 107 A-B include respective apertures 175 A-B which hold a respective optical element 180 A-B, such as a lens and a display device.
- the term lens is meant to cover transparent or translucent pieces of glass or plastic having curved and flat surfaces that cause light to converge/diverge or that cause little or no convergence/divergence.
- eyewear device 100 can include other arrangements, such as a single optical element or may not include any optical element 180 A-B depending on the application or intended user of the eyewear device 100 .
- eyewear device 100 includes a left chunk 110 A adjacent the left lateral side 170 A of the frame 105 and a right chunk 110 B adjacent the right lateral side 170 B of the frame 105 .
- the chunks 110 A-B may be integrated into the frame 105 on the respective sides 170 A-B (as illustrated) or implemented as separate components attached to the frame 105 on the respective sides 170 A-B.
- the chunks 110 A-B may be integrated into temples (not shown) attached to the frame 105 .
- the eye scanner 113 includes an infrared emitter 115 and an infrared camera 120 .
- Visible light cameras typically include a blue light filter to block infrared light detection.
- the infrared camera 120 is a visible light camera, such as a low-resolution video graphic array (VGA) camera (e.g., 640×480 pixels for a total of 0.3 megapixels), with the blue filter removed.
- the infrared emitter 115 and the infrared camera 120 are co-located on the frame 105 , for example, both are shown as connected to the upper portion of the left rim 107 A.
- the frame 105 or one or more of the left and right chunks 110 A-B include a circuit board that includes the infrared emitter 115 and the infrared camera 120 .
- the infrared emitter 115 and the infrared camera 120 can be connected to the circuit board by soldering, for example.
- Other arrangements of the infrared emitter 115 and infrared camera 120 can be implemented, including arrangements in which the infrared emitter 115 and infrared camera 120 are both on the right rim 107 B, or in different locations on the frame 105 , for example, the infrared emitter 115 on the left rim 107 A and the infrared camera 120 on the right rim 107 B. In another example, the infrared emitter 115 is on the frame 105 and the infrared camera 120 is on one of the chunks 110 A-B, or vice versa.
- the infrared emitter 115 can be connected essentially anywhere on the frame 105 , left chunk 110 A, or right chunk 110 B to emit a pattern of infrared light.
- the infrared camera 120 can be connected essentially anywhere on the frame 105 , left chunk 110 A, or right chunk 110 B to capture at least one reflection variation in the emitted pattern of infrared light.
- the infrared emitter 115 and infrared camera 120 are arranged to face inwards towards the eye of the user with a partial or full field of view of the eye in order to pick up an infrared image of the eye for identity verification.
- the infrared emitter 115 and infrared camera 120 are positioned directly in front of the eye, in the upper part of the frame 105 or in the chunks 110 A-B at either ends of the frame 105 .
- the identification establishes globally unique and unambiguous identifiers of the eye, which serve to distinguish a discrete individual from other like and unlike users.
- the eye scanner 113 is a retina scanner which uses infrared light illumination (e.g., near-infrared, short-wavelength infrared, mid-wavelength infrared, long-wavelength infrared, or far infrared) to identify the unique blood vessel configuration in the eye, for example, an unperceived beam of low-energy infrared light is cast on the person's eye.
- because the pattern of infrared light emitted by the infrared emitter 115 is more easily absorbed by the blood vessels than the surrounding tissue, the amount of light reflected back to the infrared camera 120 varies, which can then be used to uniquely identify the user.
- Such retinal scanning is an ocular-based biometric technology that uses the unique patterns on a person's retina blood vessels.
- a user identification algorithm using digital templates, encoded from these patterns by mathematical and statistical algorithms, allows for pattern recognition of the retina blood vessels or iris and hence identification of the user.
- the identification can be unique; in other embodiments, the identification establishes that the user is part of a group of users. In response to being identified as part of a group, the user can be provided permissions to access, control, or utilize one or more executable software applications or hardware features (e.g., a visible light camera) of the eyewear device 100 .
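The group-based permission scheme described above can be sketched briefly; the group names, feature names, and data layout below are hypothetical illustrations, not part of the disclosure:

```python
# Hypothetical mapping from user groups to eyewear-device features.
GROUP_PERMISSIONS = {
    "owners": {"visible_light_camera", "display", "app_install"},
    "guests": {"visible_light_camera"},
}

def permissions_for(user, group_membership):
    """Return the union of feature permissions over the identified user's groups."""
    granted = set()
    for group in group_membership.get(user, []):
        granted |= GROUP_PERMISSIONS.get(group, set())
    return granted

# Usage: an identified guest is granted only the camera feature.
membership = {"alice": ["owners"], "bob": ["guests"]}
```

An unidentified user maps to no groups and therefore receives an empty permission set, which corresponds to the device denying access to its applications and hardware features.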
- the eyewear device 100 is coupled to a processor and a memory, for example in the eyewear device 100 itself or another part of the system. Subsequent processing by the eyewear device 100 or the system, for example, using a coupled memory and processor in the system to process the captured image of reflection variations of infrared light from the retina, identifies the unique pattern of the user's eye and thus the particular user of the eyewear device 100 .
- the eye scanner 113 may include an emitter that emits other wavelengths of light besides infrared and the eye scanner 113 further includes a camera sensitive to that wavelength that receives and captures images with that wavelength.
- the eye scanner 113 may comprise a visible light camera that captures light in the visible light range from the iris.
- iris recognition can use infrared illumination by the infrared emitter 115 and the infrared camera 120 or a video camera to capture images of the detail-rich, intricate structures of the iris which are visible externally.
- the eyewear device 100 or the system can subsequently process images captured of the iris using, for example, a coupled memory and processor in the system to process the captured image of visible light from the iris. Such processing of the captured images can identify the unique pattern of the user's eye and thus the particular user of the eyewear device 100 .
- FIG. 2 is a rear view of an example hardware configuration of another eyewear device 200 .
- the eyewear device 200 is depicted as including an eye scanner 213 on a right chunk 210 B.
- the infrared emitter 215 and the infrared camera 220 are co-located on the right chunk 210 B.
- the eye scanner 213 or one or more components of the eye scanner 213 can be located on the left chunk 210 A and other locations of the eyewear device 200 , for example, the frame 205 .
- Eye scanner 213 has an infrared emitter 215 and infrared camera 220 like that of FIG. 1 , but the eye scanner 213 can be varied to be sensitive to different light wavelengths as described previously in FIG. 1 .
- the eyewear device 200 includes a frame 105 which includes a left rim 107 A which is connected to a right rim 107 B via a bridge 106 ; and the left and right rims 107 A-B include respective apertures which hold a respective optical element 180 A-B.
- FIG. 3 shows a rear perspective view of the eyewear device of FIG. 1 depicting an infrared camera 120 , a frame front 330 , a frame back 335 , and a circuit board.
- the upper portion of the left rim 107 A of the frame 105 of the eyewear device 100 includes a frame front 330 and a frame back 335 .
- the frame front 330 includes a front-facing side configured to face outwards away from the eye of the user.
- the frame back 335 includes a rear-facing side configured to face inwards towards the eye of the user.
- An opening for the infrared camera 120 is formed on the frame back 335 .
- a circuit board which is a flexible printed circuit board (PCB) 340 , is sandwiched between the frame front 330 and the frame back 335 . Also shown in further detail is the attachment of the left chunk 110 A to the left temple 325 A via a left hinge 326 A.
- components of the eye scanner including the infrared camera 120 , the flexible PCB 340 , or other electrical connectors or contacts may be located on the left temple 325 A or the left hinge 326 A.
- the left chunk 110 A includes a chunk body 311 , a chunk cap 312 , an inwards facing surface 391 and an outwards facing surface 392 (labeled, but not visible). Disposed inside the left chunk 110 A are various interconnected circuit boards, such as PCBs or flexible PCBs, that include controller circuits for charging, a battery, inwards facing light emitting diodes (LEDs), and outwards (forward) facing LEDs.
- FIG. 4 is a cross-sectional view through the infrared camera 120 and the frame corresponding to the encircled cross-section 4 - 4 of the eyewear device of FIG. 3 .
- Various layers of the eyewear device 100 are visible in the cross-section of FIG. 4 .
- the flexible PCB 340 is disposed on the frame front 330 and connected to the frame back 335 .
- the infrared camera 120 is disposed on the flexible PCB 340 and covered by an infrared camera cover lens 445 .
- the infrared camera 120 is reflowed to the back of the flexible PCB 340 .
- Reflowing attaches the infrared camera 120 to electrical contact pad(s) formed on the back of the flexible PCB 340 by subjecting the flexible PCB 340 to controlled heat which melts a solder paste to connect the two components.
- reflowing is used to surface mount the infrared camera 120 on the flexible PCB 340 and electrically connect the two components.
- through-holes can be used to connect leads from the infrared camera 120 to the flexible PCB 340 via interconnects, for example.
- the frame back 335 includes an infrared camera opening 450 for the infrared camera cover lens 445 .
- the infrared camera opening 450 is formed on a rear-facing side of the frame back 335 that is configured to face inwards towards the eye of the user.
- the flexible PCB 340 can be connected to the frame front 330 via a flexible PCB adhesive 460 .
- the infrared camera cover lens 445 can be connected to the frame back 335 via infrared camera cover lens adhesive 455 .
- the connection can be indirect via intervening components.
- FIG. 5 shows a rear perspective view of the eyewear device of FIG. 1 .
- the eyewear device 100 includes an infrared emitter 115 , infrared camera 120 , a frame front 330 , a frame back 335 , and a circuit board 340 .
- the upper portion of the left rim of the frame of the eyewear device 100 includes the frame front 330 and the frame back 335 .
- An opening for the infrared emitter 115 is formed on the frame back 335 .
- a circuit board which is a flexible PCB 340 , is sandwiched between the frame front 330 and the frame back 335 . Also shown in further detail is the attachment of the left chunk 110 A to the left temple 325 A via the left hinge 326 A.
- components of the eye scanner including the infrared emitter 115 , the flexible PCB 340 , or other electrical connectors or contacts may be located on the left temple 325 A or the left hinge 326 A.
- FIG. 6 is a cross-sectional view through the infrared emitter 115 and the frame corresponding to the encircled cross-section 6 - 6 of the eyewear device of FIG. 5 .
- Multiple layers of the eyewear device 100 are illustrated in the cross-section of FIG. 6 . As shown, the frame includes the frame front 330 and the frame back 335 .
- the flexible PCB 340 is disposed on the frame front 330 and connected to the frame back 335 .
- the infrared emitter 115 is disposed on the flexible PCB 340 and covered by an infrared emitter cover lens 645 .
- the infrared emitter 115 is reflowed to the back of the flexible PCB 340 .
- Reflowing attaches the infrared emitter 115 to contact pad(s) formed on the back of the flexible PCB 340 by subjecting the flexible PCB 340 to controlled heat which melts a solder paste to connect the two components.
- reflowing is used to surface mount the infrared emitter 115 on the flexible PCB 340 and electrically connect the two components.
- through-holes can be used to connect leads from the infrared emitter 115 to the flexible PCB 340 via interconnects, for example.
- the frame back 335 includes an infrared emitter opening 650 for the infrared emitter cover lens 645 .
- the infrared emitter opening 650 is formed on a rear-facing side of the frame back 335 that is configured to face inwards towards the eye of the user.
- the flexible PCB 340 can be connected to the frame front 330 via the flexible PCB adhesive 460 .
- the infrared emitter cover lens 645 can be connected to the frame back 335 via infrared emitter cover lens adhesive 655 .
- the coupling can also be indirect via intervening components.
- FIG. 7 is a top cross-sectional view of the right chunk 210 B of the eyewear device of FIG. 2 .
- the eyewear device 200 includes the infrared emitter 215 , the infrared camera 220 , and a circuit board, which may be a flexible PCB 740 .
- the right chunk 210 B is connected to a right temple 725 B of the eyewear device 200 via the right hinge 726 B.
- components of the eye scanner including the infrared emitter 215 and the infrared camera 220 , the flexible PCB 740 , or other electrical connectors or contacts may be located on the right temple 725 B or the right hinge 726 B.
- the right chunk 210 B includes chunk body 711 , an inwards facing surface 791 , and an outwards facing surface 792 .
- the right chunk 210 B also includes a chunk cap (not shown) like the chunk cap 312 for the left chunk in FIG. 3 , but the chunk cap is removed in the cross-section of FIG. 7 .
- Disposed inside the right chunk 210 B are various interconnected circuit boards, such as PCBs or flexible PCBs, that include controller circuits for a visible light camera 714 , microphone(s), low-power wireless circuitry (e.g., for wireless short range network communication via Bluetooth™), and high-speed wireless circuitry (e.g., for wireless local area network communication via Wi-Fi).
- the visible light camera 714 is disposed on a circuit board and covered by a visible camera cover lens and has an outwards facing field of view.
- the frame front which is connected to the right chunk 210 B, and the right chunk 210 B can include opening(s) for the visible light camera cover lens.
- the frame front includes a front-facing side configured to face outwards away from the eye of the user.
- the opening for the visible light camera cover lens is formed on and through the front-facing side.
- the infrared emitter 215 and infrared camera 220 have an inwards facing field of view relative to the visible light camera 714 having the outwards facing field of view.
- the infrared emitter 215 and the infrared camera 220 are co-located on the inwards facing surface 791 of the right chunk 210 B to point inwards towards the eye of the user.
- the inwards facing surface 791 can be sloped such that it curves away from the upper portion of the right rim of the frame where the inwards facing surface 791 intersects the right rim and towards the right temple 725 B to orient the infrared emitter 215 and infrared camera 220 with an inwards facing field of view and a line of sight of the eye of the user.
- the infrared emitter 215 and the infrared camera 220 are coupled to the flexible PCB 740 in a manner that is similar to that shown and described with reference to FIGS. 3-6 .
- the flexible PCB 740 is disposed inside the right chunk 210 B between inwards facing surface 791 and the outwards facing surface 792 of the right chunk 210 B.
- Flexible PCB 740 is coupled to one or more other components housed in the right chunk 210 B.
- the infrared emitter 215 is disposed on the flexible PCB 740 and an infrared emitter cover lens covers the infrared emitter 215 .
- the infrared camera 220 is also disposed on the flexible PCB 740 and an infrared camera cover lens covers the infrared camera 220 .
- the eye scanner including the infrared emitter 215 and the infrared camera 220 , can be formed on the circuit boards of the left chunk as shown in FIG. 3 .
- An infrared camera opening and infrared emitter opening are both formed on the inwards facing surface 791 of the right chunk 210 B that are configured to face inwards towards the eye of the user.
- the flexible PCB 740 can be connected to the inwards facing surface 791 and outwards facing surface 792 via a flexible PCB adhesive.
- the infrared emitter cover lens and infrared camera cover lens can be connected to the inwards facing surface 791 via a cover lens adhesive.
- the coupling can also be indirect via intervening components.
- FIG. 8A depicts an example of a pattern of infrared light emitted by an infrared emitter 815 of the eyewear device and reflection variations of the emitted pattern of infrared light captured by the infrared camera 820 of the eyewear device.
- FIG. 8B depicts the emitted pattern of infrared light 881 emitted by the infrared emitter 815 of the eyewear device in an inwards facing field of view towards an eye of a user 880 .
- the pattern of infrared light 881 can be a standardized matrix or beam of pixels that will outline a uniform light trace on the eye of the user 880 (e.g., retina or iris).
- the eye of each user 880 is unique, for example both the retina and iris portions uniquely identify a user.
- the retina is a thin tissue composed of neural cells located in the posterior portion of the eye. Capillaries that supply the retina with blood form a complex structure that makes each user's retina unique to that person. The intricate structures forming the iris are also unique to each person and thus also uniquely identify each user.
- the infrared camera 820 captures the reflection variations of the emitted pattern of infrared light 882 , which can then be used to uniquely identify the user.
- the emitted pattern of infrared light 881 is an unperceived low-energy infrared beam that shines on the eye with a standardized path.
- the amount of reflection of the emitted pattern of infrared light 881 varies in different parts of the retina (e.g., retinal blood vessels absorb light more than surrounding tissue) and the iris.
- Infrared camera 820 captures these reflection variations of the emitted pattern of infrared light 882 , which is digitized by the components of the system.
- the wearable device includes or is coupled to image processor, memory, and processor for digitizing the reflection variations of the emitted pattern of infrared light 882 .
- the reflection variations of the emitted pattern of infrared light 882 can then be compared to a database of captured infrared images of eyes of multiple users to identify the user.
- the reflection variations of the emitted pattern of infrared light 882 from the user's eye can be stored in the database of captured infrared images, which includes images of eyes of multiple users.
- the system may then subsequently compare received reflection variations to this database to uniquely identify the user.
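The digitize-then-compare flow described above can be sketched in code. The block below is only an illustrative reduction of an infrared frame to a compact signature — the block size, normalization, and function name are assumptions, not the patent's algorithm:

```python
import numpy as np

def digitize_reflection_variations(ir_frame):
    """Reduce a captured infrared frame to a fixed-length signature.

    `ir_frame` is a 2-D array of infrared intensities. The normalization
    and 8x8 block averaging here are illustrative choices, not the
    patent's prescribed processing.
    """
    frame = np.asarray(ir_frame, dtype=np.float64)
    # Normalize illumination so only relative reflection variations remain
    # (e.g., retinal blood vessels absorb more light than surrounding tissue).
    frame = (frame - frame.mean()) / (frame.std() + 1e-9)
    # Average into coarse 8x8 blocks to get a compact, comparable signature.
    h, w = frame.shape
    bh, bw = h // 8, w // 8
    blocks = frame[: bh * 8, : bw * 8].reshape(8, bh, 8, bw).mean(axis=(1, 3))
    return blocks.ravel()  # 64-element signature

# A VGA-sized (480x640) frame, as the low-resolution camera described later
# might produce; random data stands in for a real capture.
signature = digitize_reflection_variations(np.random.rand(480, 640))
```

Signatures like this, rather than raw frames, would be what the database stores and compares.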
- the infrared emitter 815 emits the emitted pattern of infrared light 881 and the infrared camera 820 captures one, two, three, or more images of the reflection variations of the emitted pattern of infrared light 882 in different parts of the user's eye(s).
- the system will find that no previously captured infrared image in the database matches the currently captured reflection variations of the emitted pattern of infrared light 882 .
- the system updates the database to store digitized images of the currently captured reflection variations of the emitted pattern of infrared light 882 .
- the updated database, including the newly digitized reflection variations and those previously stored, is analyzed using algorithms.
- the algorithms employ mathematical and statistical techniques for pattern recognition to determine whether at least one subsequently captured image of reflection variations of that same user or a different user of the eyewear device matches one or more of the previously captured digitized images that are stored and exist in the database. If a match is found, the identity of the user is verified (e.g., known) and corresponding user account information is retrieved.
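One simple mathematical technique of the kind referred to above is normalized correlation between a captured signature and each enrolled signature. The patent does not prescribe a specific matcher, so the function, threshold, and names below are illustrative assumptions:

```python
import numpy as np

def match_user(captured, database, threshold=0.9):
    """Compare a captured eye signature against enrolled signatures.

    `database` maps user IDs to previously stored signatures. Normalized
    correlation is one simple pattern-recognition statistic; the choice
    of metric and the 0.9 threshold are assumptions for illustration.
    """
    best_id, best_score = None, -1.0
    for user_id, enrolled in database.items():
        a = captured - captured.mean()
        b = enrolled - enrolled.mean()
        denom = float(np.linalg.norm(a) * np.linalg.norm(b))
        score = float(a @ b) / denom if denom else 0.0
        if score > best_score:
            best_id, best_score = user_id, score
    # The identity is verified only if the best match clears the threshold.
    if best_score >= threshold:
        return best_id, best_score
    return None, best_score

rng = np.random.default_rng(0)
alice_scan = rng.normal(size=64)
database = {"alice": alice_scan, "bob": rng.normal(size=64)}
# A fresh scan of the same eye differs slightly (small additive noise here).
user, score = match_user(alice_scan + rng.normal(scale=0.05, size=64), database)
```

On a match, the returned user ID would key the lookup of the corresponding account information.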
- a chat application stored on a mobile device may be executed by a processor of the mobile device and utilize the corresponding user account information to post or send images and videos captured by a visible light camera of the eyewear device to the user's account and deliver the images and videos captured by the visible light camera to contacts or associated groups of the verified user in the chat application.
- some embodiments can include determining that the same person has used the eyewear device before without specifically knowing the identity or account information of the user. It should be understood that the foregoing functionality can be embodied in programming instructions of a user identification application found in one or more components of the system.
- FIG. 9 is a high-level functional block diagram of an example user identification system.
- the system 900 includes eyewear device 910 , mobile device 990 , and server system 998 .
- Mobile device 990 may be a smartphone, tablet, laptop computer, access point, or any other such device capable of connecting with eyewear device 910 using both a low-power wireless connection 925 and a high-speed wireless connection 937 .
- Mobile device 990 is connected to server system 998 and network 995 .
- the network 995 may include any combination of wired and wireless connections.
- Server system 998 may be one or more computing devices as part of a service or network computing system, for example, that include a processor, a memory, and network communication interface to communicate over the network 995 with the mobile device 990 and eyewear device 910 .
- the memory of the server system 998 can include digital images of the reflection variations of the emitted pattern of infrared light as captured by the eyewear device 910 and transmitted via the depicted networks 925 , 937 , 995 .
- the memory of the server system 998 can also include a database of captured infrared images of eyes of multiple users and a user identification application to perform functions of the programming described herein. Execution of the programming by the processor of the server system 998 can cause the server system 998 to perform some or all of the functions described herein, for example, to uniquely identify the user of the eyewear device 910 based on the reflection variations.
- Mobile device 990 and elements of network 995 , low-power wireless connection 925 , and high-speed wireless connection 937 may be implemented using details of the architecture of mobile device 990 , for example utilizing the short range XCVRs and WWAN XCVRs of mobile device 990 described in FIG. 10 .
- System 900 may optionally include additional peripheral device elements 919 and a display 911 integrated with eyewear device 910 .
- peripheral device elements 919 may include biometric sensors, additional sensors, or display elements integrated with eyewear device 910 .
- peripheral device elements 919 may include any I/O components including output components, motion components, position components, or any other such elements described herein.
- Output components include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), or a projector), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth.
- the input components include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
- the biometric components include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like.
- the motion components include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth.
- the position components include location sensor components to generate location coordinates (e.g., a Global Positioning System (GPS) receiver component), WiFi or Bluetooth™ transceivers to generate positioning system coordinates, altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
- location coordinates can also be received over wireless connections 925 and 937 from the mobile device 990 via the low-power wireless circuitry 924 or high-speed wireless circuitry 936 .
- Eyewear device 910 includes a visible light camera 914 , infrared emitter 915 , infrared camera 920 , image processor 912 , interface 916 , low-power circuitry 920 , and high-speed circuitry 930 .
- the components shown in FIG. 9 for the eyewear device 910 are located on one or more circuit boards, for example a PCB or flexible PCB, in the chunks or frames. Alternatively or additionally, the depicted components can be located in the temples, hinges, or bridge of the eyewear device 910 .
- the infrared camera 920 may be a low resolution camera, such as VGA (640×480 resolution), which can provide for low power consumption since fewer pixels require less power, and also allows the camera module package to be small enough to fit into the design of the eyewear device 910 , including the frame and chunks.
- Infrared camera 920 and visible light camera 914 can include digital camera elements such as a charge coupled device, a lens, or any other respective visible or infrared light capturing elements that may be used to capture data.
- Interface 916 refers to any source of a user command that is provided to eyewear device 910 .
- interface 916 is a respective physical button on a visible light camera 914 , infrared emitter 915 , or infrared camera 920 that, when depressed, sends a user input signal from interface 916 to low power processor 922 .
- the interface 916 is located on different portions of the eyewear device 910 , such as on a different chunk or the frame, but is electrically connected via a circuit board to the visible light camera 914 , infrared emitter 915 , or infrared camera 920 .
- Interaction with the interface by the user (e.g., tactile input or a depression of a button followed by an immediate release) may be processed by the low power processor 922 as a request to capture a single image.
- a depression of such a camera button for a first period of time may be processed by low-power processor 922 as a request to capture video data while the button is depressed, and to cease video capture when the button is released, with the video captured while the button was depressed stored as a single video file.
- the low-power processor 922 may have a threshold time period between the press of a button and a release, such as 500 milliseconds or one second, below which the button press and release is processed as an image request, and above which the button press and release is interpreted as a video request.
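The threshold logic above can be sketched compactly. The 500-millisecond value comes from the description; the function and parameter names are illustrative:

```python
IMAGE_VIDEO_THRESHOLD_MS = 500  # example threshold from the description

def classify_button_press(press_ms, release_ms,
                          threshold_ms=IMAGE_VIDEO_THRESHOLD_MS):
    """Classify a press/release pair as an image or video request.

    Mirrors the low-power processor behavior described above: a press
    held for less than the threshold becomes an image request, a longer
    hold becomes a video request.
    """
    held_for = release_ms - press_ms
    return "image_request" if held_for < threshold_ms else "video_request"

tap = classify_button_press(0, 200)    # quick tap, below threshold
hold = classify_button_press(0, 3000)  # sustained hold, above threshold
```

Running such trivial classification on the low-power processor 922 lets the device defer booting the image processor until a capture is actually requested.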
- Use of the interface 916 on the eyewear device 910 can be immediately followed by user identification via an eye scanner.
- the infrared emitter 915 emits a pattern of infrared light and the infrared camera 920 captures reflection variations in the emitted pattern of infrared light by capturing various infrared images.
- Such user identification can occur prior to each image request or video request via the interface 916 or after a predetermined time interval of usage of the eyewear device 910 has elapsed since the user was previously identified via the eye scanner.
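The "identify before each request, or after a predetermined interval" policy can be expressed as a small session gate. The 300-second interval and the injectable clock are assumptions for illustration; the description only says a predetermined usage interval elapses:

```python
import time

class EyeScanSession:
    """Gate camera requests behind a recent successful eye scan.

    The interval and clock source are illustrative; the description only
    states that re-identification occurs after a predetermined interval
    of usage has elapsed since the user was previously identified.
    """

    def __init__(self, reauth_interval_s=300.0, clock=time.monotonic):
        self.reauth_interval_s = reauth_interval_s
        self.clock = clock
        self.last_verified_at = None  # no identification yet this session

    def record_verification(self):
        """Call after the eye scanner successfully identifies the user."""
        self.last_verified_at = self.clock()

    def needs_rescan(self):
        """True if an eye scan must precede the next image/video request."""
        if self.last_verified_at is None:
            return True
        return self.clock() - self.last_verified_at > self.reauth_interval_s

# Deterministic fake clock to demonstrate the policy.
now = [0.0]
session = EyeScanSession(reauth_interval_s=300.0, clock=lambda: now[0])
first = session.needs_rescan()    # never identified yet
session.record_verification()
within = session.needs_rescan()   # freshly verified
now[0] = 400.0
expired = session.needs_rescan()  # interval has elapsed
```

Because this check is a single clock comparison, it is cheap enough to run on the low-power processor while the image processor boots, as the next line describes.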
- the low power processor 922 may make this user identification determination while the video or image processor 912 is booting.
- the interface 916 may be a touch screen device, capacitive or resistive strip or array on a circuit board, or any mechanical switch or physical interface capable of accepting user inputs associated with a request for data from the visible light camera 914 , infrared emitter 915 , or infrared camera 920 .
- the interface 916 may have a software component, or may be associated with a command received wirelessly from another source.
- Image processor 912 includes circuitry to receive signals from the visible light camera 914 and infrared camera 920 and process those signals from the visible light camera 914 and infrared camera 920 into a format suitable for storage in the memory 934 .
- the memory 934 includes various images containing reflection variations 960 of the emitted pattern of infrared light of the eye of the user as captured by the infrared camera 920 .
- the memory 934 can also include a database of captured infrared images 950 of eyes of multiple users and a user identification application 945 to perform the functions of the programming described herein, for example the operations outlined in further detail in FIGS. 1-8 and 10-12 , for example.
- uniquely identifying the user includes comparing the images containing the reflection variations of the emitted pattern of infrared light 960 of the eye of the user against the database of captured infrared images of eyes of multiple users 950 via a user identification application 945 .
- Such comparison can be done on a device separate from the eyewear device 910 , such as a host computer, which includes the mobile device 990 and server system 998 .
- identification may occur on the eyewear device 910 alone or in combination with the mobile device 990 .
- user identification can occur on essentially any host computer, which includes both the mobile device 990 and server system 998 .
- the eyewear device 910 can include the processors 922 , 932 and the memory 934 , with a user identification application 945 in the memory 934 , to perform the functions of the programming to emit and capture as described herein.
- the host computer 990 and 998 , coupled to the eyewear device 910 via the networks 925 , 937 , and 995 as shown, can include a second processor, a second memory, and the function of the programming to uniquely identify the user of the eyewear device. Which components of the depicted system 900 perform the user identification depends on the security preferences of the user and privacy requirements of the system 900 , because storage of such private identification data may be subject to various rules and regulations.
- Image processor 912 is structured within eyewear device 910 such that it may be powered on and booted under the control of low-power circuitry 920 . Image processor 912 may additionally be powered down by low-power circuitry 920 . Depending on various power design elements associated with image processor 912 , image processor 912 may still consume a small amount of power even when it is in an off state. This power will, however, be negligible compared to the power used by image processor 912 when it is in an on state, and will also have a negligible impact on battery life. As described herein, device elements in an “off” state are still configured within a device such that low-power processor 922 is able to power on and power down the devices. A device that is referred to as “off” or “powered down” during operation of eyewear device 910 does not necessarily consume zero power due to leakage or other aspects of a system design.
- image processor 912 comprises a microprocessor integrated circuit (IC) customized for processing sensor data from a visible light camera 914 and an infrared camera 920 , along with volatile memory used by the microprocessor to operate.
- a non-volatile read-only memory (ROM) may be integrated on the IC with instructions for operating or booting the image processor 912 . This ROM may be minimized to match a minimum size needed to provide basic functionality for gathering sensor data from visible light camera 914 and infrared camera 920 , such that no extra functionality that would cause delays in boot time is present.
- the ROM may be configured with direct memory access (DMA) to the volatile memory of the microprocessor of image processor 912 .
- DMA allows memory-to-memory transfer of data from the ROM to system memory of the image processor 912 independent of operation of a main controller of image processor 912 .
- Providing DMA to this boot ROM further reduces the amount of time from power on of the image processor 912 until sensor data from the visible light camera 914 and infrared camera 920 can be processed and stored.
- minimal processing of the camera signal from the visible light camera 914 and infrared camera 920 is performed by the image processor 912 , and additional processing may be performed by applications operating on the mobile device 990 or server system 998 .
- Low-power circuitry 920 includes low-power processor 922 and low-power wireless circuitry 924 . These elements of low-power circuitry 920 may be implemented as separate elements or may be implemented on a single IC as part of a system on a single chip.
- Low-power processor 922 includes logic for managing the other elements of the eyewear device 910 . As described above, for example, low power processor 922 may accept user input signals from an interface 916 . Low-power processor 922 may also be configured to receive input signals or instruction communications from mobile device 990 via low-power wireless connection 925 . Additional details related to such instructions are described further below.
- Low-power wireless circuitry 924 includes circuit elements for implementing a low-power wireless communication system via a short-range network. Bluetooth™ Smart, also known as Bluetooth™ Low Energy, is one standard implementation of a low power wireless communication system that may be used to implement low-power wireless circuitry 924 . In other embodiments, other low power communication systems may be used.
- High-speed circuitry 930 includes high-speed processor 932 , memory 934 , and high-speed wireless circuitry 936 .
- the infrared emitter 915 is shown as being coupled to the high-speed circuitry 930 and operated by the high-speed processor 932 .
- the infrared emitter 915 can be coupled to the low-power circuitry 920 such that the infrared emitter 915 is operated by low-power processor 922 .
- a low-energy infrared beam pattern can be emitted by the infrared emitter 915 with relatively few pixels in the matrix which equals less power and can also allow for a small package that fits into the design of the eyewear device 910 , including the frame and chunks.
- High-speed processor 932 may be any processor capable of managing high-speed communications and operation of any general computing system needed for eyewear device 910 .
- High speed processor 932 includes processing resources needed for managing high-speed data transfers on high-speed wireless connection 937 to a wireless local area network (WLAN) using high-speed wireless circuitry 936 .
- the high-speed processor 932 executes an operating system such as a LINUX operating system or other such operating system.
- the high-speed processor 932 executing a software architecture for the eyewear device 910 is used to manage data transfers with high-speed wireless circuitry 936 .
- high-speed wireless circuitry 936 is configured to implement Institute of Electrical and Electronic Engineers (IEEE) 802.11 communication standards, also referred to herein as Wi-Fi. In other embodiments, other high-speed communications standards may be implemented by high-speed wireless circuitry 936 .
- Memory 934 includes any storage device capable of storing camera data generated by the infrared camera 920 , the visible light camera 914 , and the image processor 912 . While memory 934 is shown as integrated with high-speed circuitry 930 , in other embodiments, memory 934 may be an independent standalone element of the eyewear device 910 . In certain such embodiments, electrical routing lines may provide a connection through a chip that includes the high-speed processor 932 from the image processor 912 or low-power processor 922 to the memory 934 . In other embodiments, the high-speed processor 932 may manage addressing of memory 934 such that the low-power processor 922 will boot the high-speed processor 932 any time that a read or write operation involving memory 934 is needed.
- FIG. 10 is a high-level functional block diagram of an example of a mobile device 1090 that communicates via the user identification system of FIG. 9 . Shown are elements of a touch screen type of mobile device 1090 having a user identification application 1045 loaded, although other non-touch type mobile devices can be used in the user identification communications and controls under consideration here. Examples of touch screen type mobile devices that may be used include (but are not limited to) a smart phone, a personal digital assistant (PDA), a tablet computer, a laptop computer, or other portable device. However, the structure and operation of the touch screen type devices is provided by way of example; and the subject technology as described herein is not intended to be limited thereto.
- For purposes of this discussion, FIG. 10 provides a block diagram illustration of the example mobile device 1090 having a touch screen display for displaying content and receiving user input as (or as part of) the user interface.
- Mobile device 1090 also includes a camera(s) 1070 , such as visible light camera(s).
- the mobile device 1090 includes at least one digital transceiver (XCVR) 1010 , shown as WWAN XCVRs, for digital wireless communications via a wide area wireless mobile communication network.
- the mobile device 1090 also includes additional digital or analog transceivers, such as short range XCVRs 1020 for short-range network communication, such as via NFC, VLC, DECT, ZigBee, Bluetooth™, or WiFi.
- short range XCVRs 1020 may take the form of any available two-way wireless local area network (WLAN) transceiver of a type that is compatible with one or more standard protocols of communication implemented in wireless local area networks, such as one of the Wi-Fi standards under IEEE 802.11 and WiMAX.
- the mobile device 1090 can include a global positioning system (GPS) receiver.
- the mobile device 1090 can utilize either or both the short range XCVRs 1020 and WWAN XCVRs 1010 for generating location coordinates for positioning.
- cellular network, WiFi, or Bluetooth™ based positioning systems can generate very accurate location coordinates, particularly when used in combination.
- Such location coordinates can be transmitted to the eyewear device over one or more network connections via XCVRs 1020 .
- the transceivers 1010 , 1020 conform to one or more of the various digital wireless communication standards utilized by modern mobile networks.
- WWAN transceivers 1010 include (but are not limited to) transceivers configured to operate in accordance with Code Division Multiple Access (CDMA) and 3rd Generation Partnership Project (3GPP) network technologies including, for example and without limitation, 3GPP type 2 (or 3GPP2) and LTE, at times referred to as “4G.”
- the transceivers 1010 , 1020 provide two-way wireless communication of information including digitized audio signals, still image and video signals, web page information for display as well as web related inputs, and various types of mobile message communications to/from the mobile device 1090 for user identification strategies.
- communications through the transceivers 1010 , 1020 and a network relate to protocols and procedures in support of communications with the eyewear device or the server system for user identity verification utilizing eye scanners, e.g., infrared emitters and infrared cameras, to digitize and process images of the retina or iris of the eye.
- Such communications may transport packet data via the short range XCVRs 1020 over the wireless connections 925 and 937 to and from the eyewear device as shown in FIG. 9 .
- Such communications may also transport data utilizing IP packet data transport via the WWAN XCVRs 1010 over the network (e.g., Internet) 995 shown in FIG. 9 .
- Both WWAN XCVRs 1010 and short range XCVRs 1020 connect through radio frequency (RF) send-and-receive amplifiers (not shown) to an associated antenna (not shown).
- the mobile device 1090 further includes a microprocessor, shown as CPU 1030 , sometimes referred to herein as the host controller.
- a processor is a circuit having elements structured and arranged to perform one or more processing functions, typically various data processing functions. Although discrete logic components could be used, the examples utilize components forming a programmable CPU.
- a microprocessor for example includes one or more integrated circuit (IC) chips incorporating the electronic elements to perform the functions of the CPU.
- the processor 1030 may be based on any known or available microprocessor architecture, such as Reduced Instruction Set Computing (RISC) using an ARM architecture, as commonly used today in mobile devices and other portable electronic devices. Of course, other processor circuitry may be used to form the CPU 1030 or processor hardware in a smartphone, laptop computer, or tablet.
- the microprocessor 1030 serves as a programmable host controller for the mobile device 1090 by configuring the mobile device to perform various operations, for example, in accordance with instructions or programming executable by processor 1030 .
- operations may include various general operations of the mobile device, as well as operations related to user identification and communications with the eyewear device and server system.
- although a processor may be configured by use of hardwired logic, typical processors in mobile devices are general processing circuits configured by execution of programming.
- the mobile device 1090 includes a memory or storage device system, for storing data and programming.
- the memory system may include a flash memory 1040 A and a random access memory (RAM) 1040 B.
- the RAM 1040 B serves as short term storage for instructions and data being handled by the processor 1030 , e.g. as a working data processing memory.
- the flash memory 1040 A typically provides longer term storage.
- the flash memory 1040 A is used to store programming or instructions for execution by the processor 1030 .
- the mobile device 1090 stores and runs a mobile operating system through which specific applications, including user identification application 1045 , are executed.
- Applications such as the user identification application 1045 , may be a native application, a hybrid application, or a web application (e.g., a dynamic web page executed by a web browser) that runs on mobile device 1090 to uniquely identify the user.
- Examples of mobile operating systems include Google Android, Apple iOS (iPhone or iPad devices), Windows Mobile, Amazon Fire OS, RIM BlackBerry operating system, or the like.
- flash memory 1040 A storage device stores a database of captured infrared images of respective eyes of multiple users 1050 .
- the database of captured infrared images of respective eyes of multiple users 1050 is accumulated over time as different users of the eyewear device set up a profile in the user identification system. Initially, each user utilizes the eye scanner 113 to capture various images of an eye. The captured images are then populated into the database of captured infrared images of respective eyes of multiple users 1050 to allow for user identification.
- an eyewear device 100 captures a digital image of reflection variations of the emitted pattern of infrared light 1060 and stores the captured digital image in the flash memory 1040 A.
- current reflection variations of an emitted pattern of infrared light 1060 are compared by the processor 1030 to previously captured infrared images of respective eyes of multiple users 1050 within the database to uniquely identify the user of the eyewear device.
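The enroll-then-identify flow of the preceding lines can be sketched as a small database abstraction. The class, the Euclidean distance metric, and the acceptance threshold are all assumptions for illustration; the patent leaves the storage and comparison details open:

```python
import numpy as np

class EyeScanDatabase:
    """Hypothetical store of captured infrared eye signatures per user.

    Mirrors the described flow: each user's profile is populated with
    captured scans, and a later scan is compared against all enrolled
    scans. The distance metric and threshold are illustrative choices.
    """

    def __init__(self):
        self._signatures = {}  # user_id -> list of enrolled signatures

    def enroll(self, user_id, signature):
        """Populate the database with a captured scan for a user profile."""
        self._signatures.setdefault(user_id, []).append(
            np.asarray(signature, dtype=float))

    def identify(self, signature, max_distance=5.0):
        """Return the user whose enrolled scans lie closest, or None."""
        signature = np.asarray(signature, dtype=float)
        best_id, best_dist = None, float("inf")
        for user_id, scans in self._signatures.items():
            for enrolled in scans:
                dist = float(np.linalg.norm(enrolled - signature))
                if dist < best_dist:
                    best_id, best_dist = user_id, dist
        return best_id if best_dist <= max_distance else None

db = EyeScanDatabase()
db.enroll("user-1", np.ones(64))
db.enroll("user-2", np.full(64, 3.0))
# A fresh scan of user-1's eye, perturbed slightly from enrollment.
who = db.identify(np.ones(64) + 0.01)
```

Whether such a store lives on the mobile device, the server system, or the eyewear device itself varies with the security preferences discussed below.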
- the mobile device 1090 is just one type of host computer in the user identification system and other arrangements may be utilized.
- a server system such as that shown in FIG. 9 may host the database of captured infrared images of respective eyes of multiple users 1050 and perform the comparison to make the unique user identification determination.
- where the database of captured infrared images of respective eyes of multiple users 1050 and the reflection variations of the emitted pattern of infrared light 1060 are stored and processed can vary depending on the security preferences of the user and the system requirements.
- the user identification application 1045 includes programming functions to populate the database with captured infrared images of respective eyes of multiple users 1050 and to uniquely identify the user.
- the programming functions may include comparing the digital image of reflection variations of the emitted pattern of infrared light 1060 with the database of captured infrared images of respective eyes of multiple users 1050 .
- any of the user identification functionality described herein for the eyewear device, mobile device, and server system can be embodied in one or more applications as described previously.
- FIG. 11A shows various alternate locations for the eye scanner on the eyewear device, which can be used individually or in combination.
- multiple eye scanners 1113 A-D can be included in the eyewear device 1100 to reduce errors in the user identification determination and to determine a direction in which the user is looking (e.g., line of sight) for eye tracking.
- there are four eye scanners 1113 A-D and each eye scanner 1113 A-D includes a respective infrared emitter 1115 A-D and infrared camera 1120 A-D.
- the frame 1105 includes opposing first and second lateral sides 1170 A-B.
- a first chunk 1110 A is integrated into the first lateral side 1170 A of frame 1105 .
- a second chunk 1110 B is integrated into the second lateral side 1170 B of frame 1105 .
- a circuit board (not shown) spans the first chunk 1110 A, the frame 1105 , and the second chunk 1110 B.
- the frame 1105 of the eyewear device 1100 includes an upper frame portion 1195 , a middle frame portion 1196 , and a lower frame portion 1197 .
- eye scanner 1113 A is located on the first rim 1107 A on the upper frame portion 1195 .
- Eye scanner 1113 B is located on the second chunk 1110 B.
- Eye scanner 1113 C is located on the first rim 1107 A on the lower frame portion 1197 .
- Eye scanner 1113 D is located on the first rim on the middle frame portion 1196 .
- Eyewear device 1100 includes a first eye scanner 1113 A that includes a first infrared emitter 1115 A and a first infrared camera 1120 A. Eyewear device 1100 also includes a second eye scanner 1113 B that includes a second infrared emitter 1115 B and a second infrared camera 1120 B.
- the second infrared emitter 1115 B is connected to the frame 1105 or the at least one chunk 1110 A-B to emit a second emitted pattern of infrared light.
- the second infrared camera 1120 B is connected to the frame 1105 or the at least one chunk 1110 A-B to capture reflection variations in the second emitted pattern of infrared light.
- first and second eye scanners can include any combination of locations, or number of eye scanners 1113 A-D shown in FIG. 11A , including one, two, three, or four of the eye scanners 1113 A-D. Additionally, the eye scanners 1113 A-D can be located on other portions of the eyewear device 1100 , including the first chunk 1110 A; upper, middle, and lower portions 1195 - 1197 of the second rim 1107 B; the bridge 1106 , or the temples.
- Execution of the programming by a processor of a user identification system configures the system to perform functions.
- the eyewear device 1100 emits, via the second infrared emitter 1115 B, the second emitted pattern of infrared light on a second eye of the user of the eyewear device 1100 ; and captures, via the second infrared camera 1120 B, reflection variations in the second emitted pattern of infrared light on the second eye of the user. Based on the reflection variations of the second emitted pattern of infrared light on the second eye of the user, the system determines a direction of a line of sight of the eyes of the user for eye tracking.
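As a minimal illustration of how reflection variations from a second scanner could yield a directional estimate, the sketch below classifies line of sight from the offset between a detected pupil center and the corneal glint in the captured infrared image. The function name, the image coordinate convention (y increasing downward), and the pixel dead-band threshold are illustrative assumptions, not taken from this specification.

```python
# Illustrative sketch: classify gaze direction from the pupil-to-glint
# offset in a captured infrared image. Coordinates are (x, y) pixel
# positions with y increasing downward; the threshold is an assumed
# dead-band, not a value from the specification.

def gaze_direction(pupil_center, glint_center, threshold=2.0):
    """Return (horizontal, vertical) gaze labels from the pupil-glint offset."""
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    horizontal = "left" if dx < -threshold else "right" if dx > threshold else "center"
    vertical = "up" if dy < -threshold else "down" if dy > threshold else "center"
    return horizontal, vertical
```

With scanners on both the upper and lower frame portions, one such estimate per scanner could be combined to reduce errors, as the surrounding text describes.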
- the eyewear device 1100 emits, via the second infrared emitter 1115 B, the second emitted pattern of infrared light on a different portion of the eye of the user of the eyewear device 1100 than the first infrared emitter 1115 A.
- the eyewear device 1100 captures, via the second infrared camera 1120 B, the reflection variations in the second emitted pattern of infrared light on the different portion of the eye of the user. Based on the reflection variations of the second emitted pattern of infrared light on the different portion of the eye of the user, the system uniquely identifies the user of the eyewear device 1100 .
- the second emitted pattern of infrared light can be the same or different from the first pattern of infrared light emitted by the first infrared emitter 1115 A.
- the second infrared emitter 1115 B and the second infrared camera 1120 B can be co-located on the frame 1105 or the at least one chunk 1110 A-B as shown in FIG. 11A .
- the first infrared emitter 1115 A and the first infrared camera 1120 A can be co-located on a first chunk 1110 A.
- the second infrared emitter 1115 B and the second infrared camera 1120 B can be co-located on a second chunk 1110 B.
- the frame 1105 of the eyewear device 1100 includes first and second eye rims 1107 A-B that have respective apertures to hold a respective optical element and the first and second eye rims 1107 A-B are connected by a bridge 1106 .
- the first infrared emitter 1115 A and the first infrared camera 1120 A are co-located on the first eye rim 1107 A.
- the second infrared emitter 1115 B and the second infrared camera 1120 B can be co-located on the second eye rim 1107 B, including on the upper frame portion 1195 , middle frame portion 1196 , and lower frame portion 1197 .
- FIGS. 11B-D illustrate the effects of the various alternate locations on the eyewear device with respect to different orientations of the eye of the user.
- in FIG. 11B , the eye of the user 1180 B is looking up.
- placement of the eye scanner 1113 A, such as the infrared emitter and infrared camera, on either the upper frame portion (e.g., top frame on the rims, bridge, etc.) or a chunk can accurately capture an image of the retina or iris of the eye of the user 1180 B looking up.
- placement of the eye scanner 1113 B on a lower frame portion (e.g., bottom frame) of the eyewear device also accurately captures an image of the retina or iris of the eye of the user 1180 B looking up.
- both fields of view are depicted as suitable (OK).
- in FIG. 11C , the eye of the user 1180 C is looking straight ahead.
- the eye scanner 1113 A on either the upper frame portion or a chunk can accurately capture an image of the retina or iris of the eye of the user 1180 C looking straight ahead.
- placement of the eye scanner 1113 B on the lower frame portion of the eyewear device accurately captures an image of the retina or iris of the eye of the user 1180 C looking straight ahead.
- in FIG. 11D , the eye of the user 1180 D is looking down.
- placement of the eye scanner 1113 A, on either the upper frame portion or a chunk may be insufficient because the eyelid of the user 1180 D can block the infrared camera.
- the field of view is depicted as not good (NG).
- placement of the eye scanner 1113 B on the lower frame portion of the eyewear device can accurately capture an image of the retina or iris of the eye of the user 1180 D looking down.
- having multiple eye scanners 1113 A-B on the eyewear device can improve performance of the user identification system by improving accuracy and reducing errors in eye scanning.
- multiple eye scanners 1113 A-B can be used for eye tracking directional information, for example, to detect where the user is looking (left, right, up, down, east, west, north, south, etc.).
- location coordinates of the user of the eyewear device can also be generated by the location sensor components of the eyewear device or a mobile device being carried by the user that is in communication via the connections 925 and 937 as described in FIGS. 9-10 .
- specific content can be delivered to the eyewear device. For example, if the user is walking down the street and looking at a store, the eyewear device can be loaded with information about that particular store, such as coupons, for monetization purposes.
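One way the location-plus-gaze content selection above could be sketched is a simple lookup keyed on the user's coordinates and gaze bearing; the store directory structure and exact-match rule are illustrative assumptions, not from this specification.

```python
# Illustrative sketch: combine location coordinates with a gaze bearing to
# select store-specific content (e.g., coupons). The directory fields and
# exact-match rule are assumptions for illustration only.

def select_content(location, gaze_bearing, store_directory):
    """Return content for the store at the user's location matching the gaze bearing."""
    for store in store_directory:
        if store["location"] == location and store["bearing"] == gaze_bearing:
            return store["content"]
    return None  # no matching store in view
```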
- an “application” or “applications” are program(s) that execute functions defined in the programs.
- Various programming languages can be employed to create one or more of the applications, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language).
- a third party application (e.g., an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) can invoke API calls provided by the operating system to facilitate functionality described herein.
- FIG. 12 is a flowchart of the operation of the eyewear device and other components of the user identification system. As noted above, utilizing the eyewear devices and protocols and procedures of the user identification system described herein, the identity of a user can be verified. Although shown as occurring serially, the blocks of FIG. 12 may be reordered or parallelized depending on the implementation.
- the eyewear device initiates scanning of the eye.
- eye scanning is initiated when a user puts the eyewear device on, for example, over the user's eyes.
- Such wearing of the eyewear device can be detected after the eyewear device detects that the temples have been unfolded via an open/close sensor (e.g., magnetic contacts) mounted on a circuit board that is coupled to the temples and hinges.
- a capacitive strip on the bridge, temples, or other portions of the eyewear device may detect that the eyewear device is being worn by the user.
- the remaining blocks of FIG. 12 may be executed.
- eye scanning is initiated within a predetermined time period after the eyewear device is powered on. In another example, eye scanning is initiated when another function of the eyewear device is triggered, for example, a different software executable application is accessed which requires appropriate user or group permissions.
- Eye scanning can be initiated when hardware is accessed on the eyewear device, for example, when a button is pressed to capture images or a video via the visible light camera or another user interface or component of the eyewear device is utilized.
- the eyewear device may also initiate an eye scan under certain conditions, for example, upon detecting motion via an on-board accelerometer or gyroscope, or upon detecting modification of positional location coordinates via a GPS receiver or other positioning system.
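The scan-initiation triggers described above (wear detection, a power-on grace period, application or hardware access, and motion) could be gathered into a single predicate. The event names and the five-second grace period below are illustrative assumptions, not values from this specification.

```python
# Illustrative sketch of the scan-initiation logic described above; event
# names and the power-on grace period are assumed for illustration only.

TRIGGER_EVENTS = {
    "temples_unfolded",      # open/close sensor (e.g., magnetic contacts)
    "wear_detected",         # capacitive strip on bridge or temples
    "button_pressed",        # hardware access, e.g., visible light camera
    "app_permission_check",  # application requiring user/group permissions
    "motion_detected",       # accelerometer/gyroscope or GPS movement
}

def should_initiate_scan(event, seconds_since_power_on, grace_period=5.0):
    """Initiate a scan on any trigger event or shortly after power-on."""
    return event in TRIGGER_EVENTS or seconds_since_power_on <= grace_period
```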
- the eye scanner of the eyewear device emits a pattern of infrared light.
- the infrared emitter emits the pattern of infrared light which can be a standardized matrix or beam of pixels that will outline a uniform light trace on the eye of the user (e.g., retina or iris).
- the emitted pattern can be an unperceived low-energy infrared beam that shines on the eye with a standardized path.
- the eyewear device captures reflection variations in the emitted pattern of infrared light.
- the amount of reflection of the emitted pattern of infrared light varies in different parts of the retina (e.g., retinal blood vessels absorb light more than surrounding tissue) and the iris.
- the infrared camera captures these reflection variations of the emitted pattern of infrared light, which are digitized by the eyewear device.
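Because blood vessels absorb more infrared light than the surrounding tissue, the digitized reflection variations could, for instance, be reduced to a compact binary template. The sampling, mean thresholding, and Hamming comparison below are an illustrative sketch, not the encoding used by this specification.

```python
# Illustrative sketch: binarize sampled reflectance values into a template.
# Blood vessels absorb more infrared, so they read as low reflectance (1s).

def encode_template(reflectances, threshold=None):
    """Binarize reflectance samples around their mean (or a given threshold)."""
    if threshold is None:
        threshold = sum(reflectances) / len(reflectances)
    return [1 if r < threshold else 0 for r in reflectances]

def hamming_distance(template_a, template_b):
    """Count differing bits between two equal-length templates."""
    return sum(a != b for a, b in zip(template_a, template_b))
```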
- a user of the eyewear device is identified based on the currently captured digitized reflection variations, on one or more devices of the user identification system, such as the eyewear device, mobile device, or server system.
- a database with the previously stored digitized reflection variations is analyzed using algorithms to compare them against the currently captured digitized reflection variations.
- the algorithms employ mathematical and statistical techniques for pattern recognition to determine whether the currently captured reflection variations of the user of the eyewear device match one or more of the previously captured digitized images stored in the database. If a match is found, the identity of the user is verified (e.g., known) and corresponding user account information is retrieved.
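One way such a comparison could be sketched, assuming the stored and captured scans are represented as numeric feature vectors, with cosine similarity and a fixed threshold standing in for the unspecified mathematical and statistical techniques:

```python
import math

# Illustrative sketch: match a captured reflection-variation vector against
# stored templates. Cosine similarity and the 0.95 threshold are assumptions
# standing in for the unspecified pattern-recognition techniques.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify_user(captured, database, threshold=0.95):
    """Return the best-matching user id at or above threshold, else None (no match)."""
    best_id, best_score = None, threshold
    for user_id, template in database.items():
        score = cosine_similarity(captured, template)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id
```

A `None` result corresponds to the non-matching outcomes discussed below (locking the devices or enrolling a new user).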
- the eyewear device and associated mobile device may be unlocked and profile settings or configurations of the eyewear device can be loaded based on the associated user account.
- access to certain software executable applications and associated hardware, such as the visible light camera of the eyewear device, can be granted.
- the eyewear device may automatically pair with the mobile device associated with the identified user account in response to user identification.
- the user may automatically be logged into user accounts on third party software applications, for example, an application store or chat application.
- the identity of the user or the identity of the user account can be included in the metadata of images or videos captured by the visible light camera along with geolocation data.
- the eyewear device and mobile device may remain locked and inaccessible. For example, the eyewear device and mobile device lock down and the account associated with the devices receives a message that there was a non-matching access attempt on the devices.
- alternatively, the system may find that no previously captured infrared image exists in the database with digitized reflection variations that match the currently captured reflection variations of the emitted pattern of infrared light.
- the system may update the database to store digitized images of the currently captured reflection variations of the emitted pattern of infrared light. The system may then allow the user access to the eyewear device and mobile device, for example, and request that the user set up a user account.
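The two no-match outcomes described above (lock down and notify the account, or enroll the new scan and prompt account setup) could be expressed as a policy switch. The policy names and the returned fields below are illustrative assumptions, not from this specification.

```python
# Illustrative sketch of the no-match handling described above; the policy
# names and returned fields are assumptions for illustration only.

def handle_unmatched_scan(captured_template, database, policy, new_user_id):
    """Apply the configured security policy when no stored template matches."""
    if policy == "lock":
        # Lock the devices and notify the associated account of the attempt.
        return {"access_granted": False, "notify_account": True}
    # "enroll" policy: store the new template and prompt account setup.
    database[new_user_id] = captured_template
    return {"access_granted": True, "prompt_account_setup": True}
```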
- any and all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ±10% from the stated amount.
Abstract
A system comprises an eyewear device that includes a frame, a temple connected to a lateral side of the frame, an infrared emitter, and an infrared camera. The infrared emitter and the infrared camera are connected to the frame or the temple to emit a pattern of infrared light. The system includes a processor coupled to the eyewear device, a memory accessible to the processor, and programming in the memory. Execution of the programming by the processor configures the system to perform functions, including functions to emit, via the infrared emitter, a pattern of infrared light on an eye of a user of the eyewear device; capture, via the camera, reflection variations in the pattern of infrared light on the eye of the user; and identify a user of the eyewear device based on the reflection variations of the emitted pattern of infrared light on the eye of the user.
Description
- This application is a Continuation of U.S. application Ser. No. 16/188,981 filed on Nov. 13, 2018, which claims priority to U.S. Provisional Application Ser. No. 62/588,700 filed on Nov. 20, 2017, the contents of which are incorporated fully herein by reference.
- The present subject matter relates to eye scanners for an eyewear device, e.g., smart glasses, for user identification and security.
- Portable eyewear devices, such as smartglasses, headwear, and headgear available today integrate cameras and displays. Users of such portable eyewear devices may share such eyewear devices with friends and family members so that any user can borrow the eyewear device to capture images with the integrated camera.
- Verifying the identity of the specific user of the portable eyewear device can be useful. For example, as augmented reality becomes more prevalent in such eyewear devices, applications may be developed that need to verify the identity of the user for security purposes.
- The drawing figures depict one or more implementations, by way of example only, not by way of limitations. In the figures, like reference numerals refer to the same or similar elements.
- FIG. 1 is a rear view of an example hardware configuration of an eyewear device, which includes an eye scanner on a frame, for use in a system for identifying a user of the eyewear device.
- FIG. 2 is a rear view of an example hardware configuration of another eyewear device, which includes an eye scanner on a chunk, for use in a system for identifying a user of the eyewear device.
- FIG. 3 shows a rear perspective sectional view of the eyewear device of FIG. 1 depicting an infrared camera, a frame front, a frame back, and a circuit board.
- FIG. 4 is a cross-sectional view taken through the infrared camera and the frame of the eyewear device of FIG. 3 .
- FIG. 5 shows a rear perspective view of the eyewear device of FIG. 1 depicting an infrared emitter, an infrared camera, a frame front, a frame back, and a circuit board.
- FIG. 6 is a cross-sectional view taken through the infrared emitter and the frame of the eyewear device of FIG. 5 .
- FIG. 7 is a top cross-sectional view of the chunk of the eyewear device of FIG. 2 depicting the infrared emitter, the infrared camera, and a circuit board.
- FIG. 8A depicts an example of a pattern of infrared light emitted by an infrared emitter of the eyewear device and reflection variations of the emitted pattern of infrared light captured by the infrared camera of the eyewear device.
- FIG. 8B depicts the emitted pattern of infrared light being emitted by the infrared emitter of the eyewear device in an inwards facing field of view towards an eye of a user.
- FIG. 9 is a high-level functional block diagram of an example user identification system including the eyewear device, a mobile device, and a server system connected via various networks.
- FIG. 10 shows an example of a hardware configuration for the mobile device of the user identification system of FIG. 9 , in simplified block diagram form.
- FIG. 11A shows various alternate locations for the eye scanner on the eyewear device, which can be used individually or in combination.
- FIGS. 11B, 11C, and 11D illustrate the effects of the various alternate locations on the eyewear device with respect to different orientations of the eye of the user.
- FIG. 12 is a flowchart of the operation of the eyewear device and other components of the user identification system.
- In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent to those skilled in the art that the present teachings may be practiced without such details. In other instances, well known methods, procedures, components, and circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present teachings.
- The term “coupled” as used herein refers to any logical, optical, physical or electrical connection, link or the like by which signals or light produced or supplied by one system element are imparted to another coupled element. Unless described otherwise, coupled elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements or communication media that may modify, manipulate or carry the light or signals.
- The orientations of the eyewear device, associated components, and any complete devices incorporating an eye scanner, such as shown in any of the drawings, are given by way of example only, for illustration and discussion purposes. In operation for a particular variable optical processing application, the eyewear device may be oriented in any other direction suitable to the particular application of the eyewear device, for example up, down, sideways, or any other orientation. Also, to the extent used herein, any directional term, such as front, rear, inwards, outwards, towards, left, right, lateral, longitudinal, up, down, upper, lower, top, bottom, and side, is used by way of example only, and is not limiting as to direction or orientation of any optic or component of an optic constructed as otherwise described herein.
- In an example, a system includes an eyewear device. The eyewear device includes a frame and a temple connected to a lateral side of the frame. The eyewear device further includes an infrared emitter connected to the frame or the temple to emit a pattern of infrared light. The eyewear device further includes an infrared camera connected to the frame or the temple to capture reflection variations in the emitted pattern of infrared light. The system further includes a processor coupled to the eyewear device, a memory accessible to the processor, and programming in the memory.
- Execution of the programming by the processor configures the system to perform functions, including functions to emit, via the infrared emitter, the pattern of infrared light on an eye of a user of the eyewear device. The execution of the programming by the processor further configures the system to capture, via the infrared camera, the reflection variations in the emitted pattern of infrared light on the eye of the user. The execution of the programming by the processor further configures the system to identify a user, or an account, of the eyewear device based on the reflection variations of the emitted pattern of infrared light on the eye of the user.
- Additional objects, advantages and novel features of the examples will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the present subject matter may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out in the appended claims.
- Reference now is made in detail to the examples illustrated in the accompanying drawings and discussed below.
-
FIG. 1 is a rear view of an example hardware configuration of aneyewear device 100, which includes aneye scanner 113 on aframe 105, for use in a system for identifying a user of theeyewear device 100. As shown inFIG. 1 , theeyewear device 100 is in a form configured for wearing by a user, which are eyeglasses in the example ofFIG. 1 . Theeyewear device 100 can take other forms and may incorporate other types of frameworks, for example, a headgear, a headset, or a helmet. - In the eyeglasses example,
eyewear device 100 includes aframe 105 which includes aleft rim 107A connected to a right rim 107B via abridge 106 adapted for a nose of the user. The left andright rims 107A-B includerespective apertures 175A-B which hold a respectiveoptical element 180A-B, such as a lens and a display device. As used herein, the term lens is meant to cover transparent or translucent pieces of glass or plastic having curved and flat surfaces that cause light to converge/diverge or that cause little or no convergence/divergence. - Although shown as having two
optical elements 180A-B, theeyewear device 100 can include other arrangements, such as a single optical element or may not include anyoptical element 180A-B depending on the application or intended user of theeyewear device 100. As further shown,eyewear device 100 includes aleft chunk 110A adjacent the leftlateral side 170A of theframe 105 and aright chunk 110B adjacent the rightlateral side 170B of theframe 105. Thechunks 110A-B may be integrated into theframe 105 on therespective sides 170A-B (as illustrated) or implemented as separate components attached to theframe 105 on therespective sides 170A-B. Alternatively, thechunks 110A-B may be integrated into temples (not shown) attached to theframe 105. - In the example of
FIG. 1 , theeye scanner 113 includes aninfrared emitter 115 and aninfrared camera 120. Visible light cameras typically include a blue light filter to block infrared light detection, in an example, theinfrared camera 120 is a visible light camera, such as a low resolution video graphic array (VGA) camera (e.g., 640×480 pixels for a total of 0.3 megapixels), with the blue filter removed. Theinfrared emitter 115 and theinfrared camera 120 are co-located on theframe 105, for example, both are shown as connected to the upper portion of theleft rim 107A. As described in further detail below, theframe 105 or one or more of the left andright chunks 110A-B include a circuit board that includes theinfrared emitter 115 and theinfrared camera 120. Theinfrared emitter 115 and theinfrared camera 120 can be connected to the circuit board by soldering, for example. - Other arrangements of the
infrared emitter 115 andinfrared camera 120 can be implemented, including arrangements in which theinfrared emitter 115 andinfrared camera 120 are both on the right rim 107B, or in different locations on theframe 105, for example, theinfrared emitter 115 is on theleft rim 107A and theinfrared camera 120 is on the right rim 107B. In another example, theinfrared emitter 115 is on theframe 105 and theinfrared camera 120 is on one of thechunks 110A-B, or vice versa. Theinfrared emitter 115 can be connected essentially anywhere on theframe 105, leftchunk 110A, orright chunk 110B to emit a pattern of infrared light. Similarly, theinfrared camera 120 can be connected essentially anywhere on theframe 105, leftchunk 110A, orright chunk 110B to capture at least one reflection variation in the emitted pattern of infrared light. - The
infrared emitter 115 andinfrared camera 120 are arranged to face inwards towards the eye of the user with a partial or full field of view of the eye in order to pick up an infrared image of the eye for identify verification. For example, theinfrared emitter 115 andinfrared camera 120 are positioned directly in front of the eye, in the upper part of theframe 105 or in thechunks 110A-B at either ends of theframe 105. - In an embodiment, the identification establishes globally unique and unambiguous identifiers of the eye, which serves to distinguish a discrete individual from other like and unlike users. In the example, the
eye scanner 113 is a retina scanner which uses infrared light illumination (e.g., near-infrared, short-wavelength infrared, mid-wavelength infrared, long-wavelength infrared, or far infrared) to identify the unique blood vessel configuration in the eye, for example, an unperceived beam of low-energy infrared light is cast on the person's eye. Since the pattern of infrared light emitted by theinfrared emitter 115 is more easily absorbed by the blood vessels than the surrounding tissue, the amount of light reflected back to theinfrared camera 120 varies, which can then be used to uniquely identify the user. Such retinal scanning is an ocular-based biometric technology that uses the unique patterns on a person's retina blood vessels. A user identification algorithm using digital templates encoded from these patterns by mathematical and statistical algorithms allow for pattern recognition of the retina blood vessels or iris and hence the identification of the user. Although identification can be unique, in other embodiments, the identification establishes that the user is part of a group of users. In response to being identified as part of a group, the user can be provided permissions to access, control, or utilize, one more executable software applications or hardware features (e.g., a visible light camera) of theeyewear device 100. - Although not shown in
FIG. 1 , theeyewear device 100 is coupled to a processor and a memory, for example in theeyewear device 100 itself or another part of the system. Subsequent processing by theeyewear device 100 by the system, for example, using a coupled memory and processor in the system to process the captured image of reflection variations of infrared light from the retina, identifies the unique pattern of the user's eye and thus the particular user of theeyewear device 100. - Alternatively, or additionally, the
eye scanner 113 may include an emitter that emits other wavelengths of light besides infrared and theeye scanner 113 further includes a camera sensitive to that wavelength that receives and captures images with that wavelength. For example, theeye scanner 113 may comprise a visible light camera that captures light in the visible light range from the iris. In some examples, such iris recognition can use infrared illumination by theinfrared emitter 115 and theinfrared camera 120 or a video camera to capture images of the detail-rich, intricate structures of the iris which are visible externally. Theeyewear device 100 or the system can subsequently process images captured of the iris using, for example, a coupled memory and processor in the system to process the captured image of visible light from the iris. Such processing of the captured images can identify the unique pattern of the user's eye and thus the particular user of theeyewear device 100. -
FIG. 2 is a rear view of an example hardware configuration of anothereyewear device 200. In this example configuration, theeyewear device 200 is depicted as including aneye scanner 213 on aright chunk 210B. As shown, theinfrared emitter 215 and theinfrared camera 220 are co-located on theright chunk 210B. It should be understood that theeye scanner 213 or one or more components of theeye scanner 213 can be located on theleft chunk 210A and other locations of theeyewear device 200, for example, the frame 205.Eye scanner 213 has aninfrared emitter 215 andinfrared camera 220 like that ofFIG. 1 , but theeye scanner 213 can be varied to be sensitive to different light wavelengths as described previously inFIG. 1 . - Similar to
FIG. 1 , theeyewear device 200 includes aframe 105 which includes aleft rim 107A which is connected to a right rim 107B via abridge 106; and the left andright rims 107A-B include respective apertures which hold a respectiveoptical element 180A-B. -
FIG. 3 shows a rear perspective view of the eyewear device ofFIG. 1 depicting aninfrared camera 120, aframe front 330, a frame back 335, and a circuit board. It can be seen that the upper portion of theleft rim 107A of theframe 105 of theeyewear device 100 includes aframe front 330 and a frame back 335. Theframe front 330 includes a front-facing side configured to face outwards away from the eye of the user. The frame back 335 includes a rear-facing side configured to face inwards towards the eye of the user. An opening for theinfrared camera 120 is formed on the frame back 335. - As shown in the encircled cross-section 4-4 of the upper middle portion of the left rim of the frame, a circuit board, which is a flexible printed circuit board (PCB) 340, is sandwiched between the
frame front 330 and the frame back 335. Also shown in further detail is the attachment of theleft chunk 110A to theleft temple 325A via aleft hinge 326A. In some examples, components of the eye scanner, including theinfrared camera 120, theflexible PCB 340, or other electrical connectors or contacts may be located on theleft temple 325A or theleft hinge 326A. - In an example, the
left chunk 110A includes achunk body 311, achunk cap 312, an inwards facingsurface 391 and an outwards facing surface 392 (labeled, but not visible). Disposed inside theleft chunk 110A are various interconnected circuit boards, such as PCBs or flexible PCBs, that include controller circuits for charging, a battery, inwards facing light emitting diodes (LEDs), and outwards (forward) facing LEDs. -
FIG. 4 is a cross-sectional view through theinfrared camera 120 and the frame corresponding to the encircled cross-section 4-4 of the eyewear device ofFIG. 3 . Various layers of theeyewear device 100 are visible in the cross-section ofFIG. 4 . As shown theflexible PCB 340 is disposed on theframe front 330 and connected to the frame back 335. The infrared camera 320 is disposed on theflexible PCB 340 and covered by an infraredcamera cover lens 445. For example, theinfrared camera 120 is reflowed to the back of theflexible PCB 340. Reflowing attaches theinfrared camera 120 to electrical contact pad(s) formed on the back of theflexible PCB 340 by subjecting theflexible PCB 340 to controlled heat which melts a solder paste to connect the two components. In one example, reflowing is used to surface mount theinfrared camera 120 on theflexible PCB 340 and electrically connect the two components. However, it should be understood that through-holes can be used to connect leads from theinfrared camera 120 to theflexible PCB 340 via interconnects, for example. - The frame back 335 includes an
infrared camera opening 450 for the infrared camera cover lens 445. The infrared camera opening 450 is formed on a rear-facing side of the frame back 335 that is configured to face inwards towards the eye of the user. In the example, the flexible PCB 340 can be connected to the frame front 330 via a flexible PCB adhesive 460. The infrared camera cover lens 445 can be connected to the frame back 335 via an infrared camera cover lens adhesive 455. The connection can be indirect via intervening components. -
FIG. 5 shows a rear perspective view of the eyewear device of FIG. 1. The eyewear device 100 includes an infrared emitter 115, an infrared camera 120, a frame front 330, a frame back 335, and a circuit board 340. As in FIG. 3, it can be seen in FIG. 5 that the upper portion of the left rim of the frame of the eyewear device 100 includes the frame front 330 and the frame back 335. An opening for the infrared emitter 115 is formed on the frame back 335. - As shown in the encircled cross-section 6-6 in the upper middle portion of the left rim of the frame, a circuit board, which is a
flexible PCB 340, is sandwiched between the frame front 330 and the frame back 335. Also shown in further detail is the attachment of the left chunk 110A to the left temple 325A via the left hinge 326A. In some examples, components of the eye scanner, including the infrared emitter 115, the flexible PCB 340, or other electrical connectors or contacts, may be located on the left temple 325A or the left hinge 326A. -
FIG. 6 is a cross-sectional view through the infrared emitter 115 and the frame corresponding to the encircled cross-section 6-6 of the eyewear device of FIG. 5. Multiple layers of the eyewear device 100 are illustrated in the cross-section of FIG. 6. As shown, the frame includes the frame front 330 and the frame back 335. The flexible PCB 340 is disposed on the frame front 330 and connected to the frame back 335. The infrared emitter 115 is disposed on the flexible PCB 340 and covered by an infrared emitter cover lens 645. For example, the infrared emitter 115 is reflowed to the back of the flexible PCB 340. Reflowing attaches the infrared emitter 115 to contact pad(s) formed on the back of the flexible PCB 340 by subjecting the flexible PCB 340 to controlled heat, which melts a solder paste to connect the two components. In one example, reflowing is used to surface mount the infrared emitter 115 on the flexible PCB 340 and electrically connect the two components. However, it should be understood that through-holes can be used to connect leads from the infrared emitter 115 to the flexible PCB 340 via interconnects, for example. - The frame back 335 includes an infrared emitter opening 650 for the infrared
emitter cover lens 645. The infrared emitter opening 650 is formed on a rear-facing side of the frame back 335 that is configured to face inwards towards the eye of the user. In the example, the flexible PCB 340 can be connected to the frame front 330 via the flexible PCB adhesive 460. The infrared emitter cover lens 645 can be connected to the frame back 335 via an infrared emitter cover lens adhesive 655. The coupling can also be indirect via intervening components. -
FIG. 7 is a top cross-sectional view of the right chunk 210B of the eyewear device of FIG. 2. As shown, the eyewear device 200 includes the infrared emitter 215, the infrared camera 220, and a circuit board, which may be a flexible PCB 740. The right chunk 210B is connected to a right temple 725B of the eyewear device 200 via the right hinge 726B. In some examples, components of the eye scanner, including the infrared emitter 215 and the infrared camera 220, the flexible PCB 740, or other electrical connectors or contacts, may be located on the right temple 725B or the right hinge 726B. - The right chunk 710B includes
a chunk body 711, an inwards facing surface 791, and an outwards facing surface 792. The right chunk 710B also includes a chunk cap (not shown) like the chunk cap 312 for the left chunk in FIG. 3, but the chunk cap is removed in the cross-section of FIG. 7. Disposed inside the right chunk 210B are various interconnected circuit boards, such as PCBs or flexible PCBs, that include controller circuits for a visible light camera 714, microphone(s), low-power wireless circuitry (e.g., for wireless short range network communication via Bluetooth™), and high-speed wireless circuitry (e.g., for wireless local area network communication via WiFi). - The visible
light camera 714 is disposed on a circuit board, is covered by a visible light camera cover lens, and has an outwards facing field of view. The frame front, which is connected to the right chunk 210B, and the right chunk 210B can include opening(s) for the visible light camera cover lens. The frame front includes a front-facing side configured to face outwards away from the eye of the user. The opening for the visible light camera cover lens is formed on and through the front-facing side. The infrared emitter 215 and infrared camera 220 have an inwards facing field of view relative to the visible light camera 714, which has the outwards facing field of view. - As shown, the
infrared emitter 215 and the infrared camera 220 are co-located on the inwards facing surface 791 of the right chunk 210B to point inwards towards the eye of the user. The inwards facing surface 791 can be sloped such that it curves away from the upper portion of the right rim of the frame, where the inwards facing surface 791 intersects the right rim, and towards the right temple 725B to orient the infrared emitter 215 and infrared camera 220 with an inwards facing field of view and a line of sight of the eye of the user. - The
infrared emitter 215 and the infrared camera 220 are coupled to the flexible PCB 740 in a manner that is similar to that shown and described with reference to FIGS. 3-6. For example, the flexible PCB 740 is disposed inside the right chunk 210B between the inwards facing surface 791 and the outwards facing surface 792 of the right chunk 210B. Flexible PCB 740 is coupled to one or more other components housed in the right chunk 210B. The infrared emitter 215 is disposed on the flexible PCB 740 and an infrared emitter cover lens covers the infrared emitter 215. The infrared camera 220 is also disposed on the flexible PCB 740 and an infrared camera cover lens covers the infrared camera 220. Although shown as being formed on the circuit boards of the right chunk 210B, the eye scanner, including the infrared emitter 215 and the infrared camera 220, can be formed on the circuit boards of the left chunk as shown in FIG. 3. - An infrared camera opening and infrared emitter opening are both formed on the inwards facing
surface 791 of the right chunk 210B that is configured to face inwards towards the eye of the user. In the example, the flexible PCB 740 can be connected to the inwards facing surface 791 and outwards facing surface 792 via a flexible PCB adhesive. The infrared emitter cover lens and infrared camera cover lens can be connected to the inwards facing surface 791 via a cover lens adhesive. The coupling can also be indirect via intervening components. -
FIG. 8A depicts an example of a pattern of infrared light emitted by an infrared emitter 815 of the eyewear device and reflection variations of the emitted pattern of infrared light captured by the infrared camera 820 of the eyewear device. FIG. 8B depicts the emitted pattern of infrared light 881 emitted by the infrared emitter 815 of the eyewear device in an inwards facing field of view towards an eye of a user 880. - The pattern of
infrared light 881 can be a standardized matrix or beam of pixels that will outline a uniform light trace on the eye of the user 880 (e.g., retina or iris). As noted above, the eye of each user 880 is unique; for example, both the retina and iris portions uniquely identify a user. The retina is a thin tissue composed of neural cells located in the posterior portion of the eye. Capillaries that supply the retina with blood form a complex structure that makes each user's retina unique to that person. The intricate structures forming the iris are also unique to each person and thus also uniquely identify each user. When the emitted pattern of infrared light 881 strikes the eye of the user 880, the infrared camera 820 captures the reflection variations of the emitted pattern of infrared light 882, which can then be used to uniquely identify the user. - In an example, the emitted pattern of
infrared light 881 is an unperceived low-energy infrared beam that shines on the eye with a standardized path. The amount of reflection of the emitted pattern of infrared light 881 varies in different parts of the retina (e.g., retinal blood vessels absorb light more than surrounding tissue) and the iris. Infrared camera 820 captures these reflection variations of the emitted pattern of infrared light 882, which are digitized by the components of the system. For example, the wearable device includes or is coupled to an image processor, a memory, and a processor for digitizing the reflection variations of the emitted pattern of infrared light 882. The reflection variations of the emitted pattern of infrared light 882 can then be compared to a database of captured infrared images of eyes of multiple users to identify the user. - To initially set up the user in the system, the reflection variations of the emitted pattern of infrared light 882 from the user's eye can be stored in the database of captured infrared images, which includes images of eyes of multiple users. The system may then subsequently compare received reflection variations to this database to uniquely identify the user. In an example, when the user is utilizing an eyewear device for the first time, the
infrared emitter 815 emits the emitted pattern of infrared light 881 and the infrared camera 820 captures one, two, three, or more images of the reflection variations of the emitted pattern of infrared light 882 in different parts of the user's eye(s). If this is the first time the user has used the system, the system will find that no previously captured infrared image exists in the database that matches the currently captured reflection variations of the emitted pattern of infrared light 882. In response to finding no matching captured infrared image, the system updates the database to store digitized images of the currently captured reflection variations of the emitted pattern of infrared light 882. During a subsequent use of the eyewear device at a later time, the digitized reflection variations that were previously stored in the updated database are analyzed using algorithms. The algorithms employ mathematical and statistical techniques for pattern recognition to determine whether at least one subsequently captured image of reflection variations of that same user or a different user of the eyewear device matches one or more of the previously captured digitized images that are stored in the database. If a match is found, the identity of the user is verified (e.g., known) and corresponding user account information is retrieved. A chat application stored on a mobile device may be executed by a processor of the mobile device and utilize the corresponding user account information to post or send images and videos captured by a visible light camera of the eyewear device to the user's account and deliver the images and videos captured by the visible light camera to contacts or associated groups of the verified user in the chat application. 
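The first-use enrollment and subsequent verification flow described above can be sketched as follows. This is a simplified illustration, not the claimed implementation: the function and field names are hypothetical, and the `match` callable stands in for the pattern-recognition algorithms the passage mentions.

```python
def enroll_or_identify(captured_images, database, match):
    """Identify the wearer from captured reflection variations, enrolling on first use.

    captured_images: list of digitized reflection-variation images from the infrared camera
    database: dict mapping user_id -> list of previously stored digitized images
    match: callable(image, stored_images) -> bool, the pattern-recognition step
    Returns (user_id, newly_enrolled).
    """
    for user_id, stored in database.items():
        if any(match(img, stored) for img in captured_images):
            # A match is found: identity verified, account info can be retrieved.
            return user_id, False
    # No matching captured infrared image exists: first use, so update the
    # database to store the currently captured reflection variations.
    new_id = "user-%d" % (len(database) + 1)
    database[new_id] = list(captured_images)
    return new_id, True
```

On first use the function enrolls the wearer; on a later use the same (hypothetical) matcher recognizes the stored images and returns the existing identity without re-enrolling.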
Although the above example describes verifying the identity of the user as knowing their identity or identifying an associated user account, some embodiments can include determining that the same person has used the eyewear device before without specifically knowing the identity or account information of the user. It should be understood that the foregoing functionality can be embodied in programming instructions of a user identification application found in one or more components of the system. -
FIG. 9 is a high-level functional block diagram of an example user identification system. The system 900 includes eyewear device 910, mobile device 990, and server system 998. Mobile device 990 may be a smartphone, tablet, laptop computer, access point, or any other such device capable of connecting with eyewear device 910 using both a low-power wireless connection 925 and a high-speed wireless connection 937. Mobile device 990 is connected to server system 998 and network 995. The network 995 may include any combination of wired and wireless connections. - Server system 998 may be one or more computing devices as part of a service or network computing system, for example, that include a processor, a memory, and a network communication interface to communicate over the
network 995 with the mobile device 990 and eyewear device 910. The memory of the server system 998 can include digital images of the reflection variations of the emitted pattern of infrared light as captured by the eyewear device 910 and transmitted via the depicted networks 925, 937, 995. The memory of the server system 998 can also include a database of captured infrared images of eyes of multiple users and a user identification application to perform functions of the programming described herein. Execution of the programming by the processor of the server system 998 can cause the server system 998 to perform some or all of the functions described herein, for example, to uniquely identify the user of the eyewear device 910 based on the reflection variations. -
Mobile device 990 and elements of network 995, low-power wireless connection 925, and high-speed wireless connection 937 may be implemented using details of the architecture of mobile device 990, for example utilizing the short range XCVRs and WWAN XCVRs of mobile device 990 described in FIG. 10. -
System 900 may optionally include additional peripheral device elements 919 and a display 911 integrated with eyewear device 910. Such peripheral device elements 919 may include biometric sensors, additional sensors, or display elements integrated with eyewear device 910. For example, peripheral device elements 919 may include any I/O components, including output components, motion components, position components, or any other such elements described herein. - Output components include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), or a projector), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth. The input components include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
- For example, the biometric components include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The position components include location sensor components to generate location coordinates (e.g., a Global Positioning System (GPS) receiver component), WiFi or Bluetooth™ transceivers to generate positioning system coordinates, altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like. Such location coordinates can also be received over wireless connections 925 and 937 from the
mobile device 990 via the low-power wireless circuitry 924 or high-speed wireless circuitry 936. -
Eyewear device 910 includes a visible light camera 914, infrared emitter 915, infrared camera 920, image processor 912, interface 916, low-power circuitry 920, and high-speed circuitry 930. The components shown in FIG. 9 for the eyewear device 910 are located on one or more circuit boards, for example a PCB or flexible PCB, in the chunks or frames. Alternatively or additionally, the depicted components can be located in the temples, hinges, or bridge of the eyewear device 910. - The
infrared camera 920 may be a low resolution camera, such as VGA (640×480 resolution), which can provide for low power consumption, since fewer pixels require less power, and also allows the camera module package to be small enough to fit into the design of the eyewear device 910, including the frame and chunks. Infrared camera 920 and visible light camera 914 can include digital camera elements such as a charge coupled device, a lens, or any other respective visible or infrared light capturing elements that may be used to capture data. -
Interface 916 refers to any source of a user command that is provided to eyewear device 910. In one implementation, interface 916 is a respective physical button on a visible light camera 914, infrared emitter 915, or infrared camera 920 that, when depressed, sends a user input signal from interface 916 to low-power processor 922. In some examples, the interface 916 is located on different portions of the eyewear device 910, such as on a different chunk or the frame, but is electrically connected via a circuit board to the visible light camera 914, infrared emitter 915, or infrared camera 920. Interaction with the interface by the user, e.g., tactile input or a depression of a button followed by an immediate release, can be processed by low-power processor 922 as a request to capture a single image. A depression of such a camera button for a first period of time may be processed by low-power processor 922 as a request to capture video data while the button is depressed, and to cease video capture when the button is released, with the video captured while the button was depressed stored as a single video file. In certain embodiments, the low-power processor 922 may have a threshold time period between the press of a button and a release, such as 500 milliseconds or one second, below which the button press and release is processed as an image request, and above which the button press and release is interpreted as a video request. - Use of the
interface 916 on the eyewear device 910 can be immediately followed by user identification via an eye scanner. For example, the infrared emitter 915 emits a pattern of infrared light and the infrared camera 920 captures reflection variations in the emitted pattern of infrared light by capturing various infrared images. Such user identification can occur prior to each image request or video request via the interface 916, or after a predetermined time interval of usage of the eyewear device 910 has elapsed since the user was previously identified via the eye scanner. The low-power processor 922 may make this user identification determination while the video or image processor 912 is booting. In other embodiments, the interface 916 may be a touch screen device, a capacitive or resistive strip or array on a circuit board, or any mechanical switch or physical interface capable of accepting user inputs associated with a request for data from the visible light camera 914, infrared emitter 915, or infrared camera 920. In other embodiments, the interface 916 may have a software component, or may be associated with a command received wirelessly from another source. -
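Two of the decisions described above are simple enough to sketch: the duration-threshold rule that distinguishes an image request from a video request, and the policy deciding when the eye scanner must re-identify the user. Both sketches below use illustrative names and values (the 500 ms threshold and 5-minute interval come from the examples in the text; nothing else is specified).

```python
def classify_press(press_ms, threshold_ms=500):
    """Classify a button press-and-release by duration, as the low-power
    processor might: below the threshold it is an image request, at or
    above it a video request."""
    return "image" if press_ms < threshold_ms else "video"

def needs_rescan(now_s, last_scan_s, interval_s=300.0, scan_per_request=False):
    """Decide whether the eye scanner should re-identify the user.

    scan_per_request: if True, identify before every image/video request;
    otherwise re-identify only once interval_s seconds of usage have
    elapsed since the last identification (last_scan_s, or None if the
    user has never been identified this session).
    """
    if scan_per_request:
        return True
    if last_scan_s is None:
        return True
    return (now_s - last_scan_s) >= interval_s
```

In this sketch a quick tap captures a photo, a held press records video, and identification is refreshed either per request or on a timer, matching the two policies the passage describes.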
Image processor 912 includes circuitry to receive signals from the visible light camera 914 and infrared camera 920 and process those signals into a format suitable for storage in the memory 934. The memory 934 includes various images containing reflection variations 960 of the emitted pattern of infrared light of the eye of the user as captured by the infrared camera 920. In some examples, the memory 934 can also include a database of captured infrared images 950 of eyes of multiple users and a user identification application 945 to perform the functions of the programming described herein, for example the operations outlined in further detail in FIGS. 1-8 and 10-12. - As explained in further detail herein, uniquely identifying the user includes comparing the images containing the reflection variations of the emitted pattern of
infrared light 960 of the eye of the user against the database of captured infrared images of eyes of multiple users 950 via a user identification application 945. Such comparison can be done on a device separate from the eyewear device 910, such as a host computer, which includes the mobile device 990 and server system 998. Due to the private nature of data from retina and iris scans, in some examples, identification may occur on the eyewear device 910 alone or in combination with the mobile device 990. However, it should be understood that user identification can occur on essentially any host computer, which includes both the mobile device 990 and server system 998. For example, as shown, the eyewear device 910 can include the processors, the memory 934, and a user identification application 945 in the memory 934, to perform the functions of the programming to emit and capture as described herein. The host computer 990 and 998, coupled to the eyewear device 910 via the networks 925, 937, and 995 as shown, can include a second processor, a second memory, and the function of the programming to uniquely identify the user of the eyewear device. Where and which components of the depicted system 900 perform the user identification depends on the security preferences of the user and privacy requirements of the system 900, because storage of such private identification data may be subject to various rules and regulations. -
Image processor 912 is structured within eyewear device 910 such that it may be powered on and booted under the control of low-power circuitry 920. Image processor 912 may additionally be powered down by low-power circuitry 920. Depending on various power design elements associated with image processor 912, image processor 912 may still consume a small amount of power even when it is in an off state. This power will, however, be negligible compared to the power used by image processor 912 when it is in an on state, and will also have a negligible impact on battery life. As described herein, device elements in an “off” state are still configured within a device such that low-power processor 922 is able to power on and power down the devices. A device that is referred to as “off” or “powered down” during operation of eyewear device 910 does not necessarily consume zero power due to leakage or other aspects of a system design. - In one example embodiment,
image processor 912 comprises a microprocessor integrated circuit (IC) customized for processing sensor data from a visible light camera 914 and an infrared camera 920, along with volatile memory used by the microprocessor to operate. In order to reduce the amount of time that image processor 912 takes when powering on to begin processing data, a non-volatile read only memory (ROM) may be integrated on the IC with instructions for operating or booting the image processor 912. This ROM may be minimized to match the minimum size needed to provide basic functionality for gathering sensor data from visible light camera 914 and infrared camera 920, such that no extra functionality that would cause delays in boot time is present. The ROM may be configured with direct memory access (DMA) to the volatile memory of the microprocessor of image processor 912. DMA allows memory-to-memory transfer of data from the ROM to system memory of the image processor 912 independent of operation of a main controller of image processor 912. Providing DMA to this boot ROM further reduces the amount of time from power on of the image processor 912 until sensor data from the visible light camera 914 and infrared camera 920 can be processed and stored. In certain embodiments, minimal processing of the camera signal from the visible light camera 914 and infrared camera 920 is performed by the image processor 912, and additional processing may be performed by applications operating on the mobile device 990 or server system 998. - Low-
power circuitry 920 includes low-power processor 922 and low-power wireless circuitry 924. These elements of low-power circuitry 920 may be implemented as separate elements or may be implemented on a single IC as part of a system on a single chip. Low-power processor 922 includes logic for managing the other elements of the eyewear device 910. As described above, for example, low-power processor 922 may accept user input signals from an interface 916. Low-power processor 922 may also be configured to receive input signals or instruction communications from mobile device 990 via low-power wireless connection 925. Additional details related to such instructions are described further below. Low-power wireless circuitry 924 includes circuit elements for implementing a low-power wireless communication system via a short-range network. Bluetooth™ Smart, also known as Bluetooth™ low energy, is one standard implementation of a low power wireless communication system that may be used to implement low-power wireless circuitry 924. In other embodiments, other low power communication systems may be used. - High-
speed circuitry 930 includes high-speed processor 932, memory 934, and high-speed wireless circuitry 936. In the example, the infrared emitter 915 is shown as being coupled to the high-speed circuitry 930 and operated by the high-speed processor 932. However, it should be understood that in some examples the infrared emitter 915 can be coupled to the low-power circuitry 920 such that the infrared emitter 915 is operated by low-power processor 922. For example, a low-energy infrared beam pattern can be emitted by the infrared emitter 915 with relatively few pixels in the matrix, which uses less power and can also allow for a small package that fits into the design of the eyewear device 910, including the frame and chunks. - High-
speed processor 932 may be any processor capable of managing high-speed communications and operation of any general computing system needed for eyewear device 910. High-speed processor 932 includes processing resources needed for managing high-speed data transfers on high-speed wireless connection 937 to a wireless local area network (WLAN) using high-speed wireless circuitry 936. In certain embodiments, the high-speed processor 932 executes an operating system such as a LINUX operating system or other such operating system. In addition to any other responsibilities, the high-speed processor 932 executing a software architecture for the eyewear device 910 is used to manage data transfers with high-speed wireless circuitry 936. In certain embodiments, high-speed wireless circuitry 936 is configured to implement Institute of Electrical and Electronic Engineers (IEEE) 802.11 communication standards, also referred to herein as Wi-Fi. In other embodiments, other high-speed communications standards may be implemented by high-speed wireless circuitry 936. -
Memory 934 includes any storage device capable of storing camera data generated by the infrared camera 920, the visible light camera 914, and the image processor 912. While memory 934 is shown as integrated with high-speed circuitry 930, in other embodiments, memory 934 may be an independent standalone element of the eyewear device 910. In certain such embodiments, electrical routing lines may provide a connection through a chip that includes the high-speed processor 932 from the image processor 912 or low-power processor 922 to the memory 934. In other embodiments, the high-speed processor 932 may manage addressing of memory 934 such that the low-power processor 922 will boot the high-speed processor 932 any time that a read or write operation involving memory 934 is needed. -
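The memory-addressing arrangement described above, where the low-power processor boots the high-speed processor whenever a read or write involving memory 934 is needed, can be sketched as a lazy power-on wrapper around the memory. The class and method names are illustrative, not taken from the disclosure.

```python
class ManagedMemory:
    """Memory whose reads and writes require the high-speed processor to be
    up, mirroring the arrangement where the low-power processor boots the
    high-speed processor for any operation involving memory 934."""

    def __init__(self):
        self._store = {}
        self.high_speed_on = False
        self.boot_count = 0  # how many times the high-speed processor was booted

    def _ensure_high_speed(self):
        # Stand-in for the low-power processor booting the high-speed processor.
        if not self.high_speed_on:
            self.high_speed_on = True
            self.boot_count += 1

    def write(self, addr, value):
        self._ensure_high_speed()
        self._store[addr] = value

    def read(self, addr):
        self._ensure_high_speed()
        return self._store.get(addr)
```

The point of the pattern is that the (already-running) low-power processor never touches memory 934 directly; it only wakes the high-speed path when a memory operation actually occurs, so idle periods cost no high-speed power.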
FIG. 10 is a high-level functional block diagram of an example of a mobile device 1090 that communicates via the user identification system of FIG. 9. Shown are elements of a touch screen type of mobile device 1090 having a user identification application 1045 loaded, although other non-touch type mobile devices can be used in the user identification communications and controls under consideration here. Examples of touch screen type mobile devices that may be used include (but are not limited to) a smart phone, a personal digital assistant (PDA), a tablet computer, a laptop computer, or other portable device. However, the structure and operation of the touch screen type devices is provided by way of example, and the subject technology as described herein is not intended to be limited thereto. For purposes of this discussion, FIG. 10 therefore provides a block diagram illustration of the example mobile device 1090 having a touch screen display for displaying content and receiving user input as (or as part of) the user interface. Mobile device 1090 also includes camera(s) 1070, such as visible light camera(s). - The activities that are the focus of discussions here typically involve data communications related to eye scanning for user identification and security in a portable eyewear device. As shown in
FIG. 10, the mobile device 1090 includes at least one digital transceiver (XCVR) 1010, shown as WWAN XCVRs, for digital wireless communications via a wide area wireless mobile communication network. The mobile device 1090 also includes additional digital or analog transceivers, such as short range XCVRs 1020 for short-range network communication, such as via NFC, VLC, DECT, ZigBee, Bluetooth™, or WiFi. For example, short range XCVRs 1020 may take the form of any available two-way wireless local area network (WLAN) transceiver of a type that is compatible with one or more standard protocols of communication implemented in wireless local area networks, such as one of the Wi-Fi standards under IEEE 802.11 and WiMAX. - To generate location coordinates for positioning of the
mobile device 1090, the mobile device 1090 can include a global positioning system (GPS) receiver. Alternatively, or additionally, the mobile device 1090 can utilize either or both the short range XCVRs 1020 and WWAN XCVRs 1010 for generating location coordinates for positioning. For example, cellular network, WiFi, or Bluetooth™ based positioning systems can generate very accurate location coordinates, particularly when used in combination. Such location coordinates can be transmitted to the eyewear device over one or more network connections via XCVRs 1020. - The
transceivers 1010, 1020 (network communication interface) conform to one or more of the various digital wireless communication standards utilized by modern mobile networks. Examples of WWAN transceivers 1010 include (but are not limited to) transceivers configured to operate in accordance with Code Division Multiple Access (CDMA) and 3rd Generation Partnership Project (3GPP) network technologies including, for example and without limitation, 3GPP type 2 (or 3GPP2) and LTE, at times referred to as "4G." For example, the transceivers 1010, 1020 can provide two-way wireless communication of information to and from the mobile device 1090 for user identification strategies. - Several of these types of communications through the
transceivers 1010, 1020 may transport data via the short range XCVRs 1020 over the wireless connections 925 and 937 to and from the eyewear device as shown in FIG. 9. Such communications, for example, may also transport data utilizing IP packet data transport via the WWAN XCVRs 1010 over the network (e.g., Internet) 995 shown in FIG. 9. Both WWAN XCVRs 1010 and short range XCVRs 1020 connect through radio frequency (RF) send-and-receive amplifiers (not shown) to an associated antenna (not shown). - The
mobile device 1090 further includes a microprocessor, shown as CPU 1030, sometimes referred to herein as the host controller. A processor is a circuit having elements structured and arranged to perform one or more processing functions, typically various data processing functions. Although discrete logic components could be used, the examples utilize components forming a programmable CPU. A microprocessor for example includes one or more integrated circuit (IC) chips incorporating the electronic elements to perform the functions of the CPU. The processor 1030, for example, may be based on any known or available microprocessor architecture, such as Reduced Instruction Set Computing (RISC) using an ARM architecture, as commonly used today in mobile devices and other portable electronic devices. Of course, other processor circuitry may be used to form the CPU 1030 or processor hardware in smartphones, laptop computers, and tablets. - The
microprocessor 1030 serves as a programmable host controller for the mobile device 1090 by configuring the mobile device to perform various operations, for example, in accordance with instructions or programming executable by the processor 1030. For example, such operations may include various general operations of the mobile device, as well as operations related to user identification and communications with the eyewear device and server system. Although a processor may be configured by use of hardwired logic, typical processors in mobile devices are general processing circuits configured by execution of programming. - The
mobile device 1090 includes a memory or storage device system for storing data and programming. In the example, the memory system may include a flash memory 1040A and a random access memory (RAM) 1040B. The RAM 1040B serves as short-term storage for instructions and data being handled by the processor 1030, e.g., as a working data processing memory. The flash memory 1040A typically provides longer-term storage. - Hence, in the example of
mobile device 1090, the flash memory 1040A is used to store programming or instructions for execution by the processor 1030. Depending on the type of device, the mobile device 1090 stores and runs a mobile operating system through which specific applications, including the user identification application 1045, are executed. Applications, such as the user identification application 1045, may be a native application, a hybrid application, or a web application (e.g., a dynamic web page executed by a web browser) that runs on the mobile device 1090 to uniquely identify the user. Examples of mobile operating systems include Google Android, Apple iOS (iPhone or iPad devices), Windows Mobile, Amazon Fire OS, RIM BlackBerry operating system, or the like. - As shown,
flash memory 1040A storage device stores a database of captured infrared images of respective eyes of multiple users 1050. The database of captured infrared images of respective eyes of multiple users 1050 is accumulated over time as different users of the eyewear device set up a profile in the user identification system. Initially, each user utilizes the eye scanner 113 to capture various images of an eye. The captured images are then populated into the database of captured infrared images of respective eyes of multiple users 1050 to allow for user identification. - In the example, an
eyewear device 100 captures a digital image of reflection variations of the emitted pattern of infrared light 1060 and stores the captured digital image in the flash memory 1040A. To uniquely identify the user of the eyewear device 100, current reflection variations of an emitted pattern of infrared light 1060 are compared by the processor 1030 to previously captured infrared images of respective eyes of multiple users 1050 within the database. It will be understood that the mobile device 1090 is just one type of host computer in the user identification system and that other arrangements may be utilized. For example, a server system such as that shown in FIG. 9 may host the database of captured infrared images of respective eyes of multiple users 1050 and perform the comparison to make the unique user identification determination. Where the database of captured infrared images of respective eyes of multiple users 1050 and the reflection variations of the emitted pattern of infrared light 1060 are stored and processed can vary depending on the security preferences of the user and the system requirements. - The
user identification application 1045 includes programming functions to populate the database with captured infrared images of respective eyes of multiple users 1050 and to uniquely identify the user. For example, the programming functions may include comparing the digital image of reflection variations of the emitted pattern of infrared light 1060 with the database of captured infrared images of respective eyes of multiple users 1050. In addition, any of the user identification functionality described herein for the eyewear device, mobile device, and server system can be embodied in one or more applications as described previously. -
FIG. 11A shows various alternate locations for the eye scanner on the eyewear device, which can be used individually or in combination. As shown, multiple eye scanners 1113A-D can be included in the eyewear device 1100 to reduce errors in the user identification determination and to determine a direction in which the user is looking (e.g., line of sight) for eye tracking. In the example, there are four eye scanners 1113A-D, and each eye scanner 1113A-D includes a respective infrared emitter 1115A-D and infrared camera 1120A-D. - As shown, the
frame 1105 includes opposing first and second lateral sides 1170A-B. A first chunk 1110A is integrated into the first lateral side 1170A of frame 1105. A second chunk 1110B is integrated into the second lateral side 1170B of frame 1105. A circuit board (not shown) spans the first chunk 1110A, the frame 1105, and the second chunk 1110B. The frame 1105 of the eyewear device 1100 includes an upper frame portion 1195, a middle frame portion 1196, and a lower frame portion 1197. - As depicted in
FIG. 11A, eye scanner 1113A is located on the first rim 1107A on the upper frame portion 1195. Eye scanner 1113B is located on the second chunk 1110B. Eye scanner 1113C is located on the first rim 1107A on the lower frame portion 1197. Eye scanner 1113D is located on the first rim on the middle frame portion 1196. -
Eyewear device 1100 includes a first eye scanner 1113A that includes a first infrared emitter 1115A and a first infrared camera 1120A. Eyewear device 1100 also includes a second eye scanner 1113B that includes a second infrared emitter 1115B and a second infrared camera 1120B. The second infrared emitter 1115B is connected to the frame 1105 or the at least one chunk 1110A-B to emit a second emitted pattern of infrared light. The second infrared camera 1120B is connected to the frame 1105 or the at least one chunk 1110A-B to capture reflection variations in the second emitted pattern of infrared light. It should be understood that the first and second eye scanners can include any combination of locations or number of eye scanners 1113A-D shown in FIG. 11A, including one, two, three, or four of the eye scanners 1113A-D. Additionally, the eye scanners 1113A-D can be located on other portions of the eyewear device 1100, including the first chunk 1110A; upper, middle, and lower portions 1195-1197 of the second rim 1107B; the bridge 1106; or the temples. - Execution of the programming by a processor of a user identification system, for example in the
eyewear device 1100 or a coupled mobile device or server system, configures the system to perform functions. In an example, the eyewear device 1100 emits, via the second infrared emitter 1115B, the second emitted pattern of infrared light on a second eye of the user of the eyewear device 1100 and captures, via the second infrared camera 1120B, reflection variations in the second emitted pattern of infrared light on the second eye of the user. Based on the reflection variations of the second emitted pattern of infrared light on the second eye of the user, the system determines a direction of a line of sight of the eyes of the user for eye tracking. - In another example, the
eyewear device 1100 emits, via the second infrared emitter 1115B, the second emitted pattern of infrared light on a different portion of the eye of the user of the eyewear device 1100 than the first infrared emitter 1115A. The eyewear device 1100 captures, via the second infrared camera 1120B, the reflection variations in the second emitted pattern of infrared light on the different portion of the eye of the user. Based on those reflection variations, the system uniquely identifies the user of the eyewear device 1100. The second emitted pattern of infrared light can be the same as or different from the first pattern of infrared light emitted by the first infrared emitter 1115A. The second infrared emitter 1115B and the second infrared camera 1120B can be co-located on the frame 1105 or the at least one chunk 1110A-B as shown in FIG. 11A. Although not shown in FIG. 11A, the first infrared emitter 1115A and the infrared camera 1120A can be co-located on a first chunk 1110A. The second infrared emitter 1115B and the second infrared camera 1120B can be co-located on a second chunk 1110B. - As described and depicted in
FIG. 1 and shown in FIG. 11A, the frame 1105 of the eyewear device 1100 includes first and second eye rims 1107A-B that have respective apertures to hold a respective optical element, and the first and second eye rims 1107A-B are connected by a bridge 1106. In an example, the first infrared emitter 1115A and the first infrared camera 1120A are co-located on the first eye rim 1107A. Although not shown in FIG. 11A, the second infrared emitter 1115B and the second infrared camera 1120B can be co-located on the second eye rim 1107B, including on the upper frame portion 1195, middle frame portion 1196, or lower frame portion 1197. -
FIGS. 11B-D illustrate the effects of the various alternate locations on the eyewear device with respect to different orientations of the eye of the user. In FIG. 11B, the eye of the user 1180B is looking up. Accordingly, placement of the eye scanner 1113A, such as the infrared emitter and infrared camera, on either the upper frame portion (e.g., top frame on the rims, bridge, etc.) or a chunk can accurately capture an image of the retina or iris of the eye of the user 1180B looking up. Placement of the eye scanner 1113B on a lower frame portion (e.g., bottom frame) of the eyewear device also accurately captures an image of the retina or iris of the eye of the user 1180B looking up. Hence, both fields of view are depicted as suitable (OK). - In
FIG. 11C, the eye of the user 1180C is looking straight ahead. In this scenario, placement of the eye scanner 1113A on either the upper frame portion or a chunk can again accurately capture an image of the retina or iris of the eye of the user 1180C looking straight ahead. Also, placement of the eye scanner 1113B on the lower frame portion of the eyewear device accurately captures an image of the retina or iris of the eye of the user 1180C looking straight ahead. - In
FIG. 11D, the eye of the user 1180D is looking down. In this orientation of the eye of the user 1180D, placement of the eye scanner 1113A on either the upper frame portion or a chunk may be insufficient because the eyelid of the user 1180D can block the infrared camera. Hence, the field of view is depicted as not good (NG). However, placement of the eye scanner 1113B on the lower frame portion of the eyewear device can accurately capture an image of the retina or iris of the eye of the user 1180D looking down. Thus, having multiple eye scanners 1113A-B on the eyewear device can improve performance of the user identification system by improving accuracy and reducing errors in eye scanning. In addition, multiple eye scanners 1113A-B can be used for eye tracking directional information, for example, to detect where the user is looking (left, right, up, down, east, west, north, south, etc.). - In an example, location coordinates of the user of the eyewear device can also be generated by the location sensor components of the eyewear device or a mobile device being carried by the user that is in communication via the connections 925 and 937 as described in
FIGS. 9-10. With the eye tracking directional information along with the location coordinates of the user, specific content can be delivered to the eyewear device. For example, if the user is walking down the street and looking at a store, the eyewear device can be loaded with information about that particular store, for example, to deliver coupons for monetization purposes. - According to some embodiments, an "application" or "applications" are program(s) that execute functions defined in the programs. Various programming languages can be employed to create one or more of the applications, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language). In a specific example, a third party application (e.g., an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system. In this example, the third party application can invoke API calls provided by the operating system to facilitate functionality described herein.
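By way of illustration only, the combination of location coordinates and eye tracking direction described above might select store-specific content as sketched below. The flat-plane bearing model, the 20-degree tolerance, and all function and store names are assumptions of this sketch, not details from the disclosure.

```python
import math

# Hypothetical sketch: pick which nearby store's content to load based on
# the user's location coordinates and gaze bearing. All names and the
# tolerance value are illustrative assumptions.

def bearing_deg(origin, target):
    """Compass-style bearing (degrees clockwise from +y) on a flat plane."""
    dx, dy = target[0] - origin[0], target[1] - origin[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def store_in_view(user_pos, gaze_deg, stores, tolerance=20.0):
    """Return the name of the store whose bearing matches the gaze, if any."""
    for name, pos in stores.items():
        # Smallest angular difference between the gaze and the store bearing
        diff = abs((bearing_deg(user_pos, pos) - gaze_deg + 180.0) % 360.0 - 180.0)
        if diff <= tolerance:
            return name
    return None
```

A matched store name could then key a lookup of coupons or other content to push to the eyewear device.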
-
FIG. 12 is a flowchart of the operation of the eyewear device and other components of the user identification system. As noted above, utilizing the eyewear devices and protocols and procedures of the user identification system described herein, the identity of a user can be verified. Although shown as occurring serially, the blocks of FIG. 12 may be reordered or parallelized depending on the implementation. - Beginning in
block 1200, the eyewear device initiates scanning of the eye. In one example, eye scanning is initiated when a user puts the eyewear device on, for example, over the user's eyes. Such wearing of the eyewear device can be detected after the eyewear device detects that the temples have been unfolded via an open/close sensor (e.g., magnetic contacts) mounted on a circuit board that is coupled to the temples and hinges. Or, for example, a capacitive strip on the bridge, temples, or other portions of the eyewear device may detect that the eyewear device is being worn by the user. In response to detecting wearing of the eyewear device, for example, for a predetermined time, the remaining blocks of FIG. 12 may be executed. In another example, eye scanning is initiated within a predetermined time period after the eyewear device is powered on. In another example, eye scanning is initiated when another function of the eyewear device is triggered, for example, when a different software executable application is accessed which requires appropriate user or group permissions. - Eye scanning can also be initiated when hardware is accessed on the eyewear device, for example, when a button is pressed to capture images or a video via the visible light camera or when another user interface or component of the eyewear device is utilized. In another embodiment, the eyewear device initiates an eye scan under certain conditions (e.g., detection of motion from an on-board accelerometer or gyroscope) or upon detecting modification of positional location coordinates via a GPS receiver or other positioning system.
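By way of example only, the initiation triggers described for block 1200 (wear detection for a predetermined time, power-on, protected application access, hardware use, and motion or position change) can be summarized in a short sketch; the event labels and the 2.0-second dwell value are assumptions, not values from the disclosure.

```python
# Hypothetical trigger check for block 1200; event names and the dwell
# time are illustrative assumptions.

WEAR_DWELL_SECONDS = 2.0  # assumed "predetermined time" for wear detection

def should_initiate_scan(event: str, worn_seconds: float = 0.0) -> bool:
    """Return True when a described trigger warrants starting an eye scan."""
    if event == "worn":  # open/close sensor or capacitive strip fired
        return worn_seconds >= WEAR_DWELL_SECONDS
    # Power-on, protected application access, camera button press, or
    # detected motion / position change also initiate scanning.
    return event in {"power_on", "protected_app", "camera_button", "motion"}
```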
- Continuing to block 1210, the eye scanner of the eyewear device emits a pattern of infrared light. As described in detail previously, the infrared emitter emits the pattern of infrared light, which can be a standardized matrix or beam of pixels that outlines a uniform light trace on the eye of the user (e.g., retina or iris). The emitted pattern can be an imperceptible low-energy infrared beam that shines on the eye along a standardized path.
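As a sketch of the "standardized matrix or beam of pixels" described for block 1210, the following generates a uniform grid of normalized beam coordinates; the grid dimensions and the 0-to-1 coordinate range are assumptions.

```python
# Hypothetical uniform emission grid for block 1210. Grid size and the
# normalized coordinate range are illustrative assumptions.

def emission_pattern(rows: int = 4, cols: int = 4):
    """Return normalized (x, y) beam coordinates forming a uniform grid."""
    return [(c / (cols - 1), r / (rows - 1))
            for r in range(rows) for c in range(cols)]
```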
- Proceeding to block 1220, the eyewear device captures reflection variations in the emitted pattern of infrared light. As outlined above, the amount of reflection of the emitted pattern of infrared light varies in different parts of the retina (e.g., retinal blood vessels absorb more light than surrounding tissue) and the iris. The infrared camera captures these reflection variations of the emitted pattern of infrared light, which are digitized by the eyewear device.
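The digitization step of block 1220 can be sketched, for illustration, as quantizing the captured reflection intensities to discrete levels; the 0.0-to-1.0 intensity range and the 8-bit depth are assumptions.

```python
# Hypothetical digitization of captured reflection variations (block 1220).
# Intensity range and bit depth are illustrative assumptions.

def digitize(reflections, levels: int = 256):
    """Map reflection intensities in [0.0, 1.0] to integers in [0, levels-1]."""
    return [min(levels - 1, int(r * levels)) for r in reflections]
```

Darker samples (e.g., over retinal blood vessels, which absorb more light) map to lower values, preserving the reflection variations for comparison.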
- Moving to block 1230, a user of the eyewear device is identified based on the currently captured digitized reflection variations, on one or more devices of the user identification system, such as the eyewear device, mobile device, or server system. The digitized reflection variations that were previously stored in a database are analyzed using algorithms to compare against the currently captured digitized reflection variations. The algorithms employ mathematical and statistical techniques for pattern recognition to determine whether the currently captured reflection variations of the user of the eyewear device match one or more of the previously captured digitized images that are stored in the database. If a match is found, the identity of the user is verified (e.g., known) and corresponding user account information is retrieved.
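The disclosure does not specify the matching algorithm for block 1230; as one hedged sketch, a zero-mean normalized correlation against each enrolled image can serve as the pattern-recognition score, with the 0.9 acceptance threshold an assumed value.

```python
import math

# Illustrative matcher for block 1230: normalized correlation stands in for
# the unspecified pattern-recognition algorithm; names and the threshold
# are assumptions.

def normalized_correlation(a, b):
    """Zero-mean normalized cross-correlation of two equal-length vectors."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da, db = [x - ma for x in a], [x - mb for x in b]
    denom = math.sqrt(sum(x * x for x in da)) * math.sqrt(sum(x * x for x in db))
    return sum(x * y for x, y in zip(da, db)) / denom if denom else 0.0

def identify_user(captured, database, threshold=0.9):
    """Return the enrolled user id whose stored scan best matches, or None."""
    best_id, best_score = None, threshold
    for user_id, enrolled in database.items():
        score = normalized_correlation(captured, enrolled)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id
```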
- Finishing now in
block 1240, actions are taken based on identification or lack of identification of the user. For example, the eyewear device and associated mobile device may be unlocked, and profile settings or configurations of the eyewear device can be loaded based on the associated user account. In one example, access to certain software executable applications and associated hardware, such as the visible light camera of the eyewear device, can be granted. In another example, the eyewear device may automatically pair with the mobile device associated with the identified user account in response to user identification. In some embodiments, the user may be automatically logged into user accounts on third party software applications, for example, an application store or chat application. In other examples, the identity of the user or the identity of the user account can be included in the metadata of images or videos captured by the visible light camera along with geolocation data. - If the user is not identified (e.g., no match is found in the database), then the eyewear device and mobile device may remain locked and inaccessible. For example, the eyewear device and mobile device lock down, and the account associated with the devices receives a message that there was a non-matching access attempt. Alternatively, if this is the first time the user utilizes the user identification system, the system will find that no previously captured infrared image exists in the database with digitized reflection variations that match the currently captured reflection variations of the emitted pattern of infrared light. In response to finding that no matching captured infrared image exists, the system may update the database to store digitized images of the currently captured reflection variations of the emitted pattern of infrared light. The system may then allow the user access to the eyewear device and mobile device, for example, and request that the user set up a user account.
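The block 1240 outcomes (unlock and load the profile on a match, enroll and prompt account setup on first use, otherwise lock down and notify the account) can be sketched as below; the field names, action labels, and first-use check are placeholders, not details from the disclosure.

```python
# Hypothetical dispatch of block 1240 outcomes; all names are placeholders.

def handle_scan_result(user_id, captured, database, accounts):
    """Apply unlock / lock-down actions after an identification attempt."""
    if user_id is not None:  # match found: unlock and load the profile
        return {"locked": False, "profile": accounts[user_id],
                "action": "pair_mobile_device"}
    if not database:  # first use: enroll the scan, prompt account setup
        database["new_user"] = captured
        return {"locked": False, "profile": None, "action": "setup_account"}
    # No match against existing enrollments: stay locked and notify owner
    return {"locked": True, "profile": None, "action": "notify_owner"}
```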
- It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or includes a list of elements or steps does not include only those elements or steps but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
- Unless otherwise stated, any and all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ±10% from the stated amount.
- In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the subject matter to be protected lies in less than all features of any single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
- While the foregoing has described what are considered to be the best mode and other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that they may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all modifications and variations that fall within the true scope of the present concepts.
Claims (20)
1. A system comprising:
an eyewear device including:
a frame;
a first temple connected to a first lateral side of the frame;
a second temple connected to a second lateral side of the frame;
a first infrared emitter connected to the frame or the first temple to emit a first pattern of infrared light;
a second infrared emitter connected to the frame, the first temple, or the second temple to emit a second pattern of infrared light;
a first infrared camera connected to the frame or the first temple to capture reflection variations in the first emitted pattern of infrared light; and
a second infrared camera connected to the frame, the first temple, or the second temple to capture reflection variations in the second emitted pattern of infrared light;
a processor coupled to the eyewear device;
a memory accessible to the processor; and
programming in the memory, wherein execution of the programming by the processor configures the system to perform functions, including functions to:
emit, via the first infrared emitter, the first pattern of infrared light on a first eye of a user of the eyewear device;
capture, via the first infrared camera, the reflection variations in the first emitted pattern of infrared light on the first eye of the user;
emit, via the second infrared emitter, the second pattern of infrared light on a second eye of the user of the eyewear device;
capture, via the second infrared camera, the reflection variations in the second emitted pattern of infrared light on the second eye of the user; and
determine a direction of a line of sight of the first and second eyes of the user for eye tracking based on at least the reflection variations in the second emitted pattern of infrared light on the second eye of the user.
2. The system of claim 1 , wherein execution of the programming by the processor configures the system to perform additional functions, including functions to:
receive location coordinates of the user of the eyewear device; and
provide content to the eyewear device based on the location coordinates and the direction of the line of sight of the first and second eyes of the user.
3. The system of claim 1, wherein the first infrared emitter and the first infrared camera are co-located on the frame.
4. The system of claim 3 , wherein the second infrared emitter and the second infrared camera are co-located on the frame, the first temple, the second temple, or a chunk that is integrated into or connected to the frame on the first or second lateral side.
5. The system of claim 4, wherein when the first infrared camera cannot capture an image of the retina or iris of the eye of the user due to the orientation of the eye of the user, the second infrared camera is positioned to capture the image of the retina or iris of the eye of the user.
6. The system of claim 1 , wherein:
the first infrared emitter and the first infrared camera are co-located on a first chunk that is integrated into or connected to the frame on the first lateral side.
7. The system of claim 6 , wherein:
the second infrared emitter and the second infrared camera are co-located on a second chunk that is integrated into or connected to the frame on the second lateral side.
8. The system of claim 6 , wherein the first chunk includes a circuit board that includes the first infrared emitter and the first infrared camera.
9. The system of claim 8 , wherein the frame includes:
a frame front; and
a frame back; and
the circuit board is a flexible printed circuit board disposed between the frame front and the frame back.
10. The system of claim 9 , wherein:
the first infrared camera is disposed on the circuit board and is covered by an infrared camera cover lens; and
the frame back includes a first opening for the infrared camera cover lens.
11. The system of claim 1 , wherein:
the frame includes first and second eye rims that have respective apertures to hold a respective optical element;
the first and second eye rims are connected by a bridge;
the first infrared emitter and the first infrared camera are co-located on the first eye rim; and
the second infrared emitter and the second infrared camera are co-located on the second eye rim.
12. The system of claim 1 , further comprising:
a plurality of infrared emitters connected to the frame, the first temple, the second temple, a first chunk that is integrated into or connected to the frame on the first lateral side, or a second chunk that is integrated or connected to the frame on the second lateral side, the plurality of infrared emitters including the first and second infrared emitters; and
a plurality of infrared cameras connected to the frame, the first temple, the second temple, the first chunk, or the second chunk, the plurality of infrared cameras including the first and second infrared cameras,
wherein the plurality of infrared emitters and plurality of infrared cameras track eye direction of the user to detect whether the user is looking left, right, up, down, east, west, north, or south.
13. A method for tracking the eyes of a user of an eyewear device, the eyewear device including a first infrared emitter, a second infrared emitter, a first infrared camera, and a second infrared camera, the method comprising:
emitting, via the first infrared emitter, a first pattern of infrared light on a first eye of a user of the eyewear device;
capturing, via the first infrared camera, reflection variations in the first emitted pattern of infrared light on the first eye of the user;
emitting, via the second infrared emitter, a second pattern of infrared light on a second eye of the user of the eyewear device;
capturing, via the second infrared camera, reflection variations in the second emitted pattern of infrared light on the second eye of the user; and
determining a direction of a line of sight of the first and second eyes of the user for eye tracking based on at least the reflection variations in the second emitted pattern of infrared light on the second eye of the user.
14. The method of claim 13 , further comprising:
receiving location coordinates of the user of the eyewear device; and
providing content to the eyewear device based on the location coordinates and the direction of the line of sight of the first and second eyes of the user.
15. The method of claim 13, wherein the first infrared emitter and the first infrared camera are co-located on a frame of the eyewear device and the second infrared emitter and the second infrared camera are co-located on the frame, a temple, or a chunk that is integrated into or connected to the frame on a lateral side of the eyewear device, further comprising:
capturing an image of the retina or iris of the eye of the user using the second infrared emitter and second infrared camera when the first infrared camera and first infrared emitter cannot capture the image of the retina or iris of the eye of the user due to the orientation of the eye of the user.
16. The method of claim 13 , wherein a plurality of infrared emitters are connected to the frame, the first temple, the second temple, a first chunk that is integrated into or connected to the frame on the first lateral side, or a second chunk that is integrated or connected to the frame on the second lateral side, the plurality of infrared emitters including the first and second infrared emitters, and wherein a plurality of infrared cameras are connected to the frame, the first temple, the second temple, the first chunk, or the second chunk, the plurality of infrared cameras including the first and second infrared cameras, further comprising:
tracking, using the plurality of infrared emitters and plurality of infrared cameras, eye direction of the user to detect whether the user is looking left, right, up, down, east, west, north, or south.
17. A non-transitory computer readable medium including instructions for implementing functions when executed by a processor of an eyewear device including a first infrared emitter, a second infrared emitter, a first infrared camera, and a second infrared camera, the functions comprising:
emitting, via the first infrared emitter, a first pattern of infrared light on a first eye of a user of the eyewear device;
capturing, via the first infrared camera, reflection variations in the first emitted pattern of infrared light on the first eye of the user;
emitting, via the second infrared emitter, a second pattern of infrared light on a second eye of the user of the eyewear device;
capturing, via the second infrared camera, reflection variations in the second emitted pattern of infrared light on the second eye of the user; and
determining a direction of a line of sight of the first and second eyes of the user for eye tracking based on at least the reflection variations in the second emitted pattern of infrared light on the second eye of the user.
18. The computer readable medium of claim 17 , further comprising instructions for implementing functions when executed by the processor of the eyewear device, the functions including:
receiving location coordinates of the user of the eyewear device; and
providing content to the eyewear device based on the location coordinates and the direction of the line of sight of the first and second eyes of the user.
19. The computer readable medium of claim 17, wherein the first infrared emitter and the first infrared camera are co-located on a frame of the eyewear device and the second infrared emitter and the second infrared camera are co-located on the frame, a temple, or a chunk that is integrated into or connected to the frame on a lateral side of the eyewear device, further comprising instructions for implementing functions when executed by the processor of the eyewear device, the functions including:
capturing an image of the retina or iris of the eye of the user using the second infrared emitter and second infrared camera when the first infrared camera and first infrared emitter cannot capture the image of the retina or iris of the eye of the user due to the orientation of the eye of the user.
20. The computer readable medium of claim 17, wherein a plurality of infrared emitters are connected to the frame, the first temple, the second temple, a first chunk that is integrated into or connected to the frame on the first lateral side, or a second chunk that is integrated into or connected to the frame on the second lateral side, the plurality of infrared emitters including the first and second infrared emitters, further comprising instructions for implementing functions when executed by the processor of the eyewear device, the functions including:
tracking, using the plurality of infrared emitters and plurality of infrared cameras, eye direction of the user to detect whether the user is looking left, right, up, down, east, west, north, or south.
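The claimed eye-tracking flow (emit an infrared pattern on each eye, capture reflection variations with the paired infrared cameras, and determine a coarse gaze direction) can be illustrated with a minimal sketch. This is an assumption-laden approximation, not the patented method: the names `EyeSample`, `gaze_direction`, and `binocular_gaze`, and the pupil-glint offset heuristic, are hypothetical and stand in for whatever signal processing the device actually performs.

```python
# Hypothetical sketch: classifying coarse gaze direction from the offset
# between the pupil center and the corneal reflection ("glint") of an
# infrared emitter, as captured by an infrared camera per eye.
from dataclasses import dataclass


@dataclass
class EyeSample:
    pupil_x: float  # pupil center, image coordinates (pixels)
    pupil_y: float
    glint_x: float  # corneal reflection of the IR emitter
    glint_y: float


def gaze_direction(sample: EyeSample, threshold: float = 5.0) -> str:
    """Classify gaze as left/right/up/down/center from the pupil-glint offset."""
    dx = sample.pupil_x - sample.glint_x
    dy = sample.pupil_y - sample.glint_y
    horizontal = "left" if dx < -threshold else "right" if dx > threshold else ""
    vertical = "up" if dy < -threshold else "down" if dy > threshold else ""
    return (horizontal + " " + vertical).strip() or "center"


def binocular_gaze(first: EyeSample, second: EyeSample) -> str:
    """Combine both eyes by averaging their offsets, echoing the two-camera claim."""
    avg = EyeSample(
        (first.pupil_x + second.pupil_x) / 2,
        (first.pupil_y + second.pupil_y) / 2,
        (first.glint_x + second.glint_x) / 2,
        (first.glint_y + second.glint_y) / 2,
    )
    return gaze_direction(avg)
```

Averaging the two eyes' offsets is one simple way to fuse the first and second cameras' measurements into a single line-of-sight estimate; a production system would instead calibrate a per-user mapping from offsets to gaze angles.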
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/397,790 US20210365535A1 (en) | 2017-11-20 | 2021-08-09 | Eye scanner for user identification and security in an eyewear device |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762588700P | 2017-11-20 | 2017-11-20 | |
US16/188,981 US11138301B1 (en) | 2017-11-20 | 2018-11-13 | Eye scanner for user identification and security in an eyewear device |
US17/397,790 US20210365535A1 (en) | 2017-11-20 | 2021-08-09 | Eye scanner for user identification and security in an eyewear device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/188,981 Continuation US11138301B1 (en) | 2017-11-20 | 2018-11-13 | Eye scanner for user identification and security in an eyewear device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210365535A1 (en) | 2021-11-25 |
Family
ID=77923840
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/188,981 Active 2039-03-16 US11138301B1 (en) | 2017-11-20 | 2018-11-13 | Eye scanner for user identification and security in an eyewear device |
US17/397,790 Pending US20210365535A1 (en) | 2017-11-20 | 2021-08-09 | Eye scanner for user identification and security in an eyewear device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/188,981 Active 2039-03-16 US11138301B1 (en) | 2017-11-20 | 2018-11-13 | Eye scanner for user identification and security in an eyewear device |
Country Status (1)
Country | Link |
---|---|
US (2) | US11138301B1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107864323B (en) * | 2017-12-13 | 2020-06-19 | Oppo广东移动通信有限公司 | Camera assembly and electronic equipment with same |
US11861941B1 (en) * | 2019-02-06 | 2024-01-02 | Apple Inc. | Eye camera systems with polarized light |
US11326763B1 (en) | 2019-02-06 | 2022-05-10 | Apple Inc. | Light-emitting diodes with optical filters |
DE102021126907A1 (en) | 2021-10-18 | 2023-04-20 | Robert Bosch Gesellschaft mit beschränkter Haftung | Device, system and method for biometric user identification in a device |
CN114265212A (en) * | 2021-12-02 | 2022-04-01 | 华为终端有限公司 | Intelligent glasses and intelligent glasses wearing detection method |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2980675A2 (en) * | 2014-07-29 | 2016-02-03 | Samsung Electronics Co., Ltd. | Mobile device and method of pairing the same with electric device |
US20180008141A1 (en) * | 2014-07-08 | 2018-01-11 | Krueger Wesley W O | Systems and methods for using virtual reality, augmented reality, and/or a synthetic 3-dimensional information for the measurement of human ocular performance |
US20180239144A1 (en) * | 2017-02-16 | 2018-08-23 | Magic Leap, Inc. | Systems and methods for augmented reality |
US20190331914A1 (en) * | 2011-07-20 | 2019-10-31 | Google Llc | Experience Sharing with Region-Of-Interest Selection |
US20210173480A1 (en) * | 2010-02-28 | 2021-06-10 | Microsoft Technology Licensing, Llc | Ar glasses with predictive control of external device based on event input |
US11122258B2 (en) * | 2017-06-30 | 2021-09-14 | Pcms Holdings, Inc. | Method and apparatus for generating and displaying 360-degree video based on eye tracking and physiological measurements |
US20210407317A1 (en) * | 2007-02-07 | 2021-12-30 | Skyhawke Technologies, Llc | Coaching Aid for Golf |
KR102495139B1 (en) * | 2016-03-07 | 2023-02-06 | 매직 립, 인코포레이티드 | Blue light adjustment for biometric security |
US20230258929A1 (en) * | 2014-01-21 | 2023-08-17 | Mentor Acquisition One, LLC. | See-through computer display systems |
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7922321B2 (en) * | 2003-10-09 | 2011-04-12 | Ipventure, Inc. | Eyewear supporting after-market electrical components |
AU2003902422A0 (en) * | 2003-05-19 | 2003-06-05 | Intellirad Solutions Pty. Ltd | Access security system |
US10227063B2 (en) * | 2004-02-26 | 2019-03-12 | Geelux Holdings, Ltd. | Method and apparatus for biological evaluation |
JP2006136450A (en) * | 2004-11-11 | 2006-06-01 | Matsushita Electric Ind Co Ltd | Iris certification device |
US11428937B2 (en) * | 2005-10-07 | 2022-08-30 | Percept Technologies | Enhanced optical and perceptual digital eyewear |
US8953849B2 (en) * | 2007-04-19 | 2015-02-10 | Eyelock, Inc. | Method and system for biometric recognition |
US10064552B1 (en) * | 2009-06-04 | 2018-09-04 | Masoud Vaziri | Method and apparatus for a compact and high resolution mind-view communicator |
US20150309316A1 (en) * | 2011-04-06 | 2015-10-29 | Microsoft Technology Licensing, Llc | Ar glasses with predictive control of external device based on event input |
US8941559B2 (en) * | 2010-09-21 | 2015-01-27 | Microsoft Corporation | Opacity filter for display device |
US20130154913A1 (en) * | 2010-12-16 | 2013-06-20 | Siemens Corporation | Systems and methods for a gaze and gesture interface |
EP2923638B1 (en) * | 2011-03-18 | 2019-02-20 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Optical measuring device and system |
US8885877B2 (en) * | 2011-05-20 | 2014-11-11 | Eyefluence, Inc. | Systems and methods for identifying gaze tracking scene reference locations |
US20120326948A1 (en) * | 2011-06-22 | 2012-12-27 | Microsoft Corporation | Environmental-light filter for see-through head-mounted display device |
US20150084864A1 (en) * | 2012-01-09 | 2015-03-26 | Google Inc. | Input Method |
JP6141584B2 (en) * | 2012-01-24 | 2017-06-07 | アリゾナ ボード オブ リージェンツ オン ビハーフ オブ ザ ユニバーシティ オブ アリゾナ | Compact line-of-sight head-mounted display |
JP5887026B2 (en) * | 2012-09-03 | 2016-03-16 | ゼンソモトリック インストゥルメンツ ゲゼルシャフト ヒューア イノベイティブ ゼンソリック エムベーハーSENSOMOTORIC INSTRUMENTS Gesellschaft fur innovative Sensorik mbH | Head mounted system and method for computing and rendering a stream of digital images using the head mounted system |
US10716469B2 (en) * | 2013-01-25 | 2020-07-21 | Wesley W. O. Krueger | Ocular-performance-based head impact measurement applied to rotationally-centered impact mitigation systems and methods |
US20140285402A1 (en) * | 2013-03-13 | 2014-09-25 | Aliphcom | Social data-aware wearable display system |
US9851803B2 (en) * | 2013-03-15 | 2017-12-26 | Eyecam, LLC | Autonomous computing and telecommunications head-up displays glasses |
CN105247861B (en) * | 2013-03-22 | 2017-11-10 | 精工爱普生株式会社 | Infrared video shows glasses |
US20140341441A1 (en) * | 2013-05-20 | 2014-11-20 | Motorola Mobility Llc | Wearable device user authentication |
TWI516804B (en) * | 2014-01-02 | 2016-01-11 | 廣達電腦股份有限公司 | Head mounted display apparatus and backlight adjustment method thereof |
US11054639B2 (en) * | 2014-03-03 | 2021-07-06 | Eyeway Vision Ltd. | Eye projection system |
WO2015195417A1 (en) * | 2014-06-20 | 2015-12-23 | Rambus Inc. | Systems and methods for lensed and lensless optical sensing |
US20160011657A1 (en) * | 2014-07-14 | 2016-01-14 | Futurewei Technologies, Inc. | System and Method for Display Enhancement |
US10152631B2 (en) * | 2014-08-08 | 2018-12-11 | Fotonation Limited | Optical system for an image acquisition device |
US10410535B2 (en) * | 2014-08-22 | 2019-09-10 | Intelligent Technologies International, Inc. | Secure testing device |
US10345768B2 (en) * | 2014-09-29 | 2019-07-09 | Microsoft Technology Licensing, Llc | Environmental control via wearable computing system |
US9568603B2 (en) * | 2014-11-14 | 2017-02-14 | Microsoft Technology Licensing, Llc | Eyewear-mountable eye tracking device |
US9576399B2 (en) * | 2014-12-23 | 2017-02-21 | Meta Company | Apparatuses, methods and systems coupling visual accommodation and visual convergence to the same plane at any depth of an object of interest |
EA033741B1 (en) * | 2015-03-01 | 2019-11-21 | Novasight Ltd | System and method for measuring ocular motility |
US11461936B2 (en) * | 2015-03-17 | 2022-10-04 | Raytrx, Llc | Wearable image manipulation and control system with micro-displays and augmentation of vision and sensing in augmented reality glasses |
KR20160128119A (en) * | 2015-04-28 | 2016-11-07 | 엘지전자 주식회사 | Mobile terminal and controlling metohd thereof |
US10799122B2 (en) * | 2015-06-14 | 2020-10-13 | Facense Ltd. | Utilizing correlations between PPG signals and iPPG signals to improve detection of physiological responses |
US11064892B2 (en) * | 2015-06-14 | 2021-07-20 | Facense Ltd. | Detecting a transient ischemic attack using photoplethysmogram signals |
US9755636B2 (en) * | 2015-06-23 | 2017-09-05 | Microsoft Technology Licensing, Llc | Insulated gate device discharging |
WO2017013913A1 (en) * | 2015-07-17 | 2017-01-26 | ソニー株式会社 | Gaze detection device, eyewear terminal, gaze detection method, and program |
CN105528577B (en) * | 2015-12-04 | 2019-02-12 | 深圳大学 | Recognition methods based on intelligent glasses |
US9946943B2 (en) * | 2015-12-07 | 2018-04-17 | Delta Id, Inc. | Methods and apparatuses for birefringence based biometric authentication |
US10419053B2 (en) * | 2016-04-22 | 2019-09-17 | Seabeck Holdings, Llc | Smart aviation communication headset and peripheral components |
US10521660B2 (en) * | 2016-04-28 | 2019-12-31 | Sharp Kabushiki Kaisha | Image processing method and image processing device |
US20180113216A1 (en) * | 2016-10-25 | 2018-04-26 | Innoviz Technologies Ltd. | Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Active Optical Scanning of a Scene |
JP7090601B2 (en) * | 2016-10-05 | 2022-06-24 | マジック リープ, インコーポレイテッド | Peripheral test for mixed reality calibration |
US10877556B2 (en) * | 2016-10-21 | 2020-12-29 | Apple Inc. | Eye tracking system |
US20180267604A1 (en) * | 2017-03-20 | 2018-09-20 | Neil Bhattacharya | Computer pointer device |
US10635168B2 (en) * | 2017-08-22 | 2020-04-28 | Microsoft Technology Licensing, Llc | MEMS line scanner and silicon photomultiplier based pixel camera for low light large dynamic range eye imaging |
US10521661B2 (en) * | 2017-09-01 | 2019-12-31 | Magic Leap, Inc. | Detailed eye shape model for robust biometric applications |
US10698481B1 (en) * | 2017-09-28 | 2020-06-30 | Apple Inc. | Glint-assisted gaze tracker |
US10739850B2 (en) * | 2017-09-29 | 2020-08-11 | Sony Interactive Entertainment Inc. | Prescription glasses with eye gaze tracking and electro optical signaling to a HMD |
US10474916B2 (en) * | 2017-11-20 | 2019-11-12 | Ashok Krishnan | Training of vehicles to improve autonomous capabilities |
US10564716B2 (en) * | 2018-02-12 | 2020-02-18 | Hong Kong Applied Science and Technology Research Institute Company Limited | 3D gazing point detection by binocular homography mapping |
US11067805B2 (en) * | 2018-04-19 | 2021-07-20 | Magic Leap, Inc. | Systems and methods for operating a display system based on user perceptibility |
- 2018-11-13: US application US16/188,981 filed (US11138301B1, Active)
- 2021-08-09: US application US17/397,790 filed (US20210365535A1, Pending)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220286662A1 (en) * | 2021-03-05 | 2022-09-08 | Largan Precision Co., Ltd. | Head-mounted device |
US11785198B2 (en) * | 2021-03-05 | 2023-10-10 | Largan Precision Co., Ltd. | Head-mounted device |
Also Published As
Publication number | Publication date |
---|---|
US11138301B1 (en) | 2021-10-05 |
Similar Documents
Publication | Title |
---|---|
US11892710B2 (en) | Eyewear device with fingerprint sensor for user input | |
US20210365535A1 (en) | Eye scanner for user identification and security in an eyewear device | |
US11269402B1 (en) | User interface interaction paradigms for eyewear device with limited field of view | |
CN108664783B (en) | Iris recognition-based recognition method and electronic equipment supporting same | |
US10401955B2 (en) | Method for displaying an image and an electronic device thereof | |
US20220103757A1 (en) | Multi-purpose cameras for simultaneous capture and cv on wearable ar devices | |
US11561398B2 (en) | Audio visualizer eyewear device | |
US20230269355A1 (en) | Input parameter based image waves | |
US11721045B2 (en) | Audio-triggered augmented reality eyewear device | |
US11665334B2 (en) | Rolling shutter camera pipeline exposure timestamp error determination | |
US11663992B2 (en) | Fade-in user interface display based on finger distance or hand proximity | |
US11335090B2 (en) | Electronic device and method for providing function by using corneal image in electronic device | |
US11948262B2 (en) | Geospatial image surfacing and selection | |
US11789527B1 (en) | Eyewear device external face tracking overlay generation | |
US11798282B1 (en) | Video highlights with user trimming | |
US11527895B1 (en) | Eyewear bidirectional communication using time gating power transfer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |