WO2024063978A1 - Waveguide illumination for eye assessment - Google Patents

Waveguide illumination for eye assessment

Info

Publication number
WO2024063978A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
light
electronic device
optic
implementations
Application number
PCT/US2023/032453
Other languages
English (en)
Inventor
Ariel Lipson
Moshe Kriman
Original Assignee
Apple Inc.
Application filed by Apple Inc. filed Critical Apple Inc.
Publication of WO2024063978A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Definitions

  • the present disclosure generally relates to electronic devices and, in particular, to systems, methods, and devices for assessing gaze direction, eye orientation, and other eye characteristics of electronic device users.
  • Some existing eye-assessment techniques analyze glints that are formed by reflecting light off a user’s eye.
  • In head mounted devices (HMDs), such reflections/glints may be formed by projecting light towards the eye from light sources located on side portions of the HMD, e.g., on the HMD’s frame, to avoid obscuring a user’s view through or of a central portion of the HMD that provides an optic.
  • Producing light reflections from such side locations may be inefficient, inaccurate, or otherwise undesirable, particularly for HMDs having relatively larger center portions (e.g., optics) and thus side portions that are likely to be relatively more distant from the eye’s optical axis.
  • Various implementations include devices, systems, and methods that assess an eye characteristic (e.g., gaze direction, eye orientation, identifying an iris of the eye, etc.) based on reflections (e.g., glints) produced using one or more light sources.
  • Some implementations provide a device (e.g., AR glasses or other HMD) that has an optic (e.g., stack of one or more transparent elements) through which a user’s physical environment is viewed. Light is emitted from a plurality of exit locations on or in the optic to illuminate the eye to produce glints for eye assessment.
  • One or more light sources may be located on a surrounding frame of the device and produce light that is guided into one or more wave guides.
  • the one or more wave guides may direct the light from the one or more light sources through the optic and then out of the optic at distinct exit locations, e.g., via prisms, gratings, etc.
  • the wave guides and associated elements are configured such that a user is unlikely to notice them, for example, based on their small size, close proximity to the eye, and/or the use of transparent materials.
  • Some implementations provide an electronic device that has a frame comprising one or more light sources (e.g., light emitting diodes (LEDs), vertical-cavity surface-emitting lasers (VCSELs), etc.).
  • the electronic device has an optic (e.g., a stack of glass, lenses, and other optical layers) coupled to the frame and comprising one or more wave guides positioned to direct light received from the one or more light sources of the frame to exit locations at a surface of the optic.
  • the term “wave guide” refers to a light pipe, optical fiber, or other structure that guides waves, such as electromagnetic waves, by restricting transmission to one or more directions.
  • Waveguides may be formed of dielectric material with high permittivity, and thus high index of refraction, surrounded by a material with lower permittivity.
  • the structure of a wave guide may guide a wave by total internal reflection.
  • Wave guides include, but are not limited to, light pipes in the form of hollow tubes with highly-reflective inner surfaces.
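As an illustrative sketch (not part of the disclosure), the total-internal-reflection condition described above can be checked numerically from the core and cladding refractive indices; the index values below are hypothetical:

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Angle of incidence (from the boundary normal) above which light is
    totally internally reflected at the core/cladding interface."""
    if n_core <= n_clad:
        raise ValueError("TIR requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

# Hypothetical indices: an SU-8-like core (~1.58) in a lower-index cladding (~1.45).
theta_c = critical_angle_deg(1.58, 1.45)
# Rays striking the boundary at angles steeper than theta_c stay guided.
```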
  • the exemplary device also includes an image sensor and a processor.
  • the processor may be configured to receive sensor data from the image sensor (e.g., sensor data corresponding to reflections of the light exiting from the exit locations and reflected from an eye) and assess an eye characteristic based on the reflections.
  • Using wave guides to produce light from multiple exit locations on an optic of a device to produce glints for eye assessment can provide numerous advantages. For example, doing so can enable the emission of light from exit locations that are closer to an optical axis of an eye of the user, which may be desirable in some implementations. Locations may be determined by running an optimization that results in some central exit locations and some edge exit locations. Light produced from locations that are relatively far from an eye’s optical axis may produce glints that are less desirable for eye assessments, e.g., some of the glints may end up on the sclera rather than the pupil or otherwise be ill-suited for eye assessment.
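One simple way such an exit-location optimization could be realized (purely a hypothetical sketch; the disclosure does not specify an objective function) is to prefer candidate locations near the eye's optical axis while enforcing a minimum spacing so locations spread between the center and edges:

```python
import math

def choose_exit_locations(candidates, k, min_spacing):
    """Greedy sketch: prefer candidates near the optical axis (origin),
    but keep chosen exit locations at least min_spacing apart."""
    ranked = sorted(candidates, key=lambda p: math.hypot(*p))
    chosen = []
    for p in ranked:
        if all(math.dist(p, q) >= min_spacing for q in chosen):
            chosen.append(p)
        if len(chosen) == k:
            break
    return chosen

# Hypothetical grid of candidate exit locations on the optic (units: mm,
# origin at the eye's optical axis).
grid = [(x, y) for x in range(-10, 11, 2) for y in range(-10, 11, 2)]
picks = choose_exit_locations(grid, k=6, min_spacing=5.0)
```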
  • a non-transitory computer readable storage medium has stored therein instructions that are computer-executable to perform or cause performance of any of the methods described herein.
  • a device includes one or more processors, a non-transitory memory, and one or more programs, the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors and the one or more programs include instructions for performing or causing performance of any of the methods described herein.
  • Figure 1 illustrates an exemplary electronic device according to some implementations.
  • Figure 2 illustrates an exemplary use of one or more wave guides in the exemplary device of Figure 1 in accordance with some implementations.
  • Figure 3 illustrates an exemplary use of one or more wave guides in the exemplary device of Figure 1 in accordance with some implementations.
  • Figure 4 illustrates light directed within and exiting a wave guide of Figure 3 in accordance with some implementations.
  • Figure 5 illustrates an exemplary use of one or more wave guides in the exemplary device of Figure 1 in accordance with some implementations.
  • Figure 6 illustrates an exemplary stack of layers forming an optic of an exemplary device, in accordance with some implementations.
  • Figures 7A-B illustrate exemplary processes for forming at least a portion of an optic of an exemplary device, in accordance with some implementations.
  • Figure 8 illustrates an exemplary prism component in accordance with some implementations.
  • Figures 9A-G illustrate exemplary processes for forming at least a portion of an optic of an exemplary device, in accordance with some implementations.
  • Figure 10 illustrates an exemplary process for forming at least a portion of an optic of an exemplary device, in accordance with some implementations.
  • Figure 11 illustrates positioning an exemplary optic relative to an eye in accordance with some implementations.
  • Figure 12 illustrates exemplary glints reflected off a model of an eye.
  • Figure 13 is a flowchart representation of a method for assessing an eye characteristic of a user based on reflected light in accordance with some implementations.
  • Figure 14 is a block diagram illustrating device components of an exemplary device according to some implementations.
  • Figure 1 illustrates a device 110 that includes a frame 212 that can be worn on a user’s head.
  • the frame 212 includes extensions (e.g., arms) that are placed over ears of the user to hold the frame 212 in place on the user’s head.
  • the device 110 includes two optics 215a-b for a right eye and a left eye of the user, respectively. Each of the optics 215a-b may be configured as a stack of effectively transparent layers.
  • Such layers may include a bias (+/- ) layer for a prescription lens, a display wave guide layer for displaying XR content, an eye assessment light layer comprising one or more waveguides for directing light towards the user’s eye from one or more exit locations within the optics 215a-b to form glints or otherwise for eye assessment purposes, etc.
  • the device 110 enables the user to view at least a portion of a surrounding physical environment by looking through optics 215a-b.
  • the device 110 may be an HMD that includes optics 215a-b through which a portion of a surrounding physical environment is viewed and upon which additional (e.g., virtual) content may be displayed.
  • the device 110 may take the form of “XR glasses.”
  • the device 110 enables the user to view the surrounding physical environment and/or virtual content via optics 215a-b, while the device 110 obtains image data, motion data, and/or physiological data (e.g., pupillary data, facial feature data, etc.) from a user via one or more sensors, e.g., detectors 220a, 220b.
  • the one or more detectors 220a-b may include one or more image sensors, e.g., IR camera(s), that capture images or otherwise provide captured data for detecting reflected light (e.g., glints) from an eye of the user for eye assessment purposes.
  • the device 110 further includes one or more controllers 250a-b that each includes a processor and/or a power source that controls the light being emitted from one or more light sources to be directed towards an eye of the user to form reflections (e.g., glints) for eye assessment.
  • each of the controllers 250a-b is a microcontroller that can control the processes described herein for assessing characteristics of a respective eye (e.g., gaze direction, eye orientation, identifying an iris of the eye) based on the sensor data obtained from one of the detectors 220a-b.
  • the device 110 is a wearable device such as an HMD.
  • the device 110 is a handheld electronic device (e.g., a smartphone or a tablet).
  • the device 110 is a laptop computer or a desktop computer.
  • the device 110 has a touchpad and, in some implementations, the device 110 has a touch-sensitive display (also known as a “touch screen” or “touch screen display”).
  • the device 110 includes an eye-assessment system for assessing an eye of a user, e.g., detecting eye position and eye movements.
  • an eye-assessment system may include one or more light sources (e.g., IR/NIR light source(s)) and one or more cameras sensitive to the wavelengths emitted by the one or more light sources (e.g., IR/NIR camera(s)).
  • images captured by the eye-assessment system may be analyzed to detect positions and movements of the eyes of the user, or to detect other information about the eyes such as color, shape, state (e.g., wide open, squinting, etc.), pupil dilation, or pupil diameter.
  • the point of gaze may be estimated from the eye assessments to enable gaze-based interaction with content displayed at the optics 215a-b of the device 110.
  • the electronic devices described herein may generate and present an extended reality (XR) environment to a user and may assess eye characteristics (e.g., gaze direction) relative to such an XR environment.
  • While this example and other examples discussed herein illustrate a single device 110, the techniques disclosed herein are applicable to multiple devices. For example, the functions of device 110 may be performed by multiple devices.
  • Figure 2 illustrates an example configuration for a lens 215b and surrounding frame portion 212b of the device 110.
  • Lens 215a (and frame portion 212a) may have a similar configuration.
  • Figure 2 illustrates components for an eye assessment system for the device 110.
  • the components include one or more light sources 252a-c, one or more wave guides 245a-g, a detector 220b, and a controller 250b.
  • the controller 250b may control and provide power to the one or more light sources 252a-c.
  • the one or more light sources may be vertical-cavity surface-emitting lasers (VCSELs), light emitting diodes (LEDs), or any other type of light source.
  • Implementations that utilize VCSELs in combination with a wave guide may provide advantages, e.g., enabling the efficient provision of uniform illumination.
  • the VCSELs may be power efficient and the use of a wave guide may help reduce or remove coherence that might otherwise result in speckles on sensor images of the eye.
  • the coupling of a VCSEL into a wave guide may also be relatively efficient.
  • the one or more light sources may produce light having particular characteristics, e.g., producing IR or NIR light.
  • the one or more wave guides 245a-g are positioned within the optic 215b to direct light from the one or more light sources 252a-c to the exit locations 240a-g.
  • the one or more wave guides 245a-g may be configured to have a size/diameter that is small enough and/or may be made at least in part of one or more transparent materials so as to not be detectable by a human eye of a user wearing the device 110.
  • the one or more wave guides 245a-g may be configured to be effectively transparent to a user when wearing the device 110 and viewing content through the optic 215b.
  • the exit locations 240a-g may be the locations at which the one or more wave guides 245a-g direct light from the one or more light sources 252a-c.
  • the exit locations 240a-g may be the locations at which exit components (e.g., prisms, gratings, etc.) are positioned to direct light from within the one or more wave guides 245a-g outwards, e.g., out of the optic 215b and towards an eye of the user while the user is wearing the device 110.
  • Figure 3 illustrates another example configuration for lens 215b and surrounding frame portion 212b of the device 110.
  • Lens 215a (and frame portion 212a) may have a similar configuration.
  • the eye assessment components include one or more light sources 352a-b, one or more wave guides 345a-e, detector 220b, and controller 250b.
  • the controller 250b may control and provide power to the one or more light sources 352a-b via conductors.
  • the one or more light sources may be vertical-cavity surface-emitting lasers (VCSELs), light emitting diodes, or any other type of light source.
  • the one or more light sources may produce light having particular characteristics, e.g., producing IR or NIR light.
  • the one or more wave guides 345a-e are positioned within the optic 215b to direct light from the one or more light sources 352a-b to the multiple exit locations (e.g., exit location 340) along path 335.
  • the one or more wave guides 345a-e may be configured to have a size/diameter that is small enough and/or may be made of one or more transparent materials so as to not be detectable by a human eye of a user wearing the device 110 and thus would be considered transparent when viewing content through the optic 215b.
  • the exit locations may be the locations at which the one or more wave guides 345a-e direct light from the one or more light sources 352a-b.
  • the exit locations (e.g., exit location 340) form a circular path 335 on the optic 215b and may be the locations at which exit components (e.g., prisms, gratings, etc.) are positioned to direct light from the one or more wave guides 345a-e outwards, e.g., out of the optic 215b and towards the eye of the user while the user is wearing the device 110.
  • Figure 4 illustrates light 410 directed within and exiting a wave guide 445.
  • light 410 (which may have originated from a light source such as light source 352a of Figure 3) travels within the wave guide 445 and then is redirected by prism 440 out of the wave guide 445 (and corresponding optic 215b) towards an eye of a user.
  • Figure 5 illustrates another example configuration for lens 215b and surrounding frame portion 212b of the device 110.
  • Lens 215a (and frame portion 212a) may have a similar configuration.
  • the eye assessment components include a light source 552, a wave guide 545, detector 220b, and controller 250b.
  • the controller 250b may control and provide power to the light source 552.
  • the light source may be a vertical-cavity surface-emitting laser (VCSEL), a light emitting diode (LED), or any other type of light source.
  • the light source may produce light having particular characteristics, e.g., producing IR or NIR light.
  • the wave guide 545 is positioned within the optic 215b to direct light from the light source 552 to the exit locations 540a-f.
  • the wave guide 545 may be configured to have a size/diameter that is small enough and/or may be made of one or more transparent materials so as to not be detectable by a human eye of a user wearing the device 110 and thus would be considered transparent when viewing content through the optic 215b.
  • the exit locations 540a-f may be the locations at which the wave guide 545 directs light from the light source 552. In this example, there are multiple locations along a single wave guide 545.
  • the exit locations 540a-f form an approximately circular pattern on the optic 215b and may be the locations at which exit components (e.g., prisms, gratings, etc.) are positioned to direct light from the wave guide 545 outwards, e.g., out of the optic 215b and towards the eye of the user while the user is wearing the device 110.
  • FIG. 6 illustrates an exemplary stack 600 of an optic of an exemplary device relative to frame 605 and an eye 690 of a user.
  • This stack 600 includes a tint cover glass layer 610, a tint layer 630 (e.g., with organic electrochromic glass) and a controller/controller connection 620, a cover glass layer 650, an eye assessment light layer 655, and a display wave guide layer 680.
  • a bias layer (not shown) may additionally be included, e.g., below the display wave guide layer 680.
  • the eye assessment light layer 655 includes a wave guide formed by a channel of high index material 656 surrounded by low index material 660, as well as air gap 670.
  • the low index material 660 may be selected to reduce the visibility of the waveguide. Such selection of the low index material 660 may account for refractive index being wavelength dependent. The selection may be based on criteria requiring low contrast in the visible range (to lower the visibility) and higher contrast in the NIR wavelength (to improve guiding with higher angle content).
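The wavelength-dependent selection criteria above (low contrast in the visible, higher contrast in the NIR) can be illustrated with a simple two-term Cauchy dispersion model; the coefficients below are hypothetical, not values from the disclosure:

```python
def cauchy_index(A: float, B_um2: float, wavelength_um: float) -> float:
    """Two-term Cauchy dispersion model: n(lambda) = A + B / lambda^2."""
    return A + B_um2 / wavelength_um**2

def contrast(wl_um: float) -> float:
    """Core-minus-cladding index contrast at a given wavelength.
    Hypothetical materials: the cladding disperses more strongly, so the
    contrast nearly vanishes in the visible but grows in the NIR."""
    n_core = cauchy_index(1.560, 0.0060, wl_um)
    n_clad = cauchy_index(1.545, 0.0105, wl_um)
    return n_core - n_clad

visible = contrast(0.550)  # ~550 nm: low contrast -> low visibility
nir = contrast(0.940)      # ~940 nm: higher contrast -> better guiding
```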
  • Light from the light source 675 is directed through the wave guide (e.g., through the channel formed by high index material 656 surrounded by low index material 660) and exits the optic at exit location 665 (e.g., a prism or grating at that location).
  • the light source 675 may reside in a frame portion of the device and/or adjacent such a frame portion and may be driven by a driver, e.g., a VCSEL driver.
  • the eye assessment layer 655 is relatively thin, e.g., 2-20 microns.
  • Eye assessment components can provide a number of transparent illumination sources on an optic of an electronic device by means of a wave guide such as a light pipe.
  • a wave guide may be created using conventional wafer technology patterning or other techniques.
  • the light source 675 is configured to produce light within the range from an upper limit of the visible light spectrum (e.g., 700 nm) to a sensor’s upper light range capability limit (e.g., 1500 nm).
  • the light source 675 is configured to produce light in the range from 800-1100 nm.
  • the light source 675 is configured to produce light in the range from 900-1000 nm, e.g., producing light at approximately 940 nm.
  • some or all of the light source 675 is positioned within the optic.
  • the light source 675 is preferably small and/or transparent, or otherwise configured or positioned to avoid obscuring a view provided through the optic.
  • eye assessment light layer 655 is configured with a number of exit components (e.g., prisms, gratings, etc.) configured to produce light towards an eye of a user from multiple exit locations on an optic to produce reflections (e.g., glints) from the eye that can be used for eye assessment.
  • the exit components are coated with a wavelength-specific (e.g., 940 nm) or wavelength range-specific reflective coating that is transparent to human vision.
  • a prism / reflective-surface may be relatively small (e.g., smaller than 75 um x 75 um). Such a prism may be glued in place (e.g., via glue on one edge).
  • the low index material 660 may be transparent and provided to cover the wave guide structure while reducing reflections.
  • a wave guide is created using a nanoimprint technique.
  • gratings (or other exit components) are used to direct the light out of the optic at specific locations and towards a user’s eye.
  • a negative mold of a light pipe is etched in silicon with sub-micron precision. High index transparent material is applied on the glass and shaped by a mold and low index transparent material is added to cover the structure to reduce reflections.
  • Figure 7A illustrates an exemplary process for forming at least a portion an optic of an exemplary device.
  • a transparent substrate (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • a high index polymer 720 (e.g., SU-8, ~1.58 @ 940 nm) is added, e.g., by lithography or laser.
  • a micro prism 730 with coated facets (e.g., for the 940 nm range) is added.
  • a fill structure of low index material 740 is added.
  • an air gap may be used instead of the low index material 740; in that case, the index contrast may be significant enough to make edges visible.
  • the low index material 740 may provide less contrast and thus less potential for visibility.
  • Figure 8 illustrates an exemplary prism 720.
  • the prism 720 is transparent except for a portion 810 having a reflective coating (e.g., gold).
  • the prism 720 is entirely transparent and portion 810 has zero reflectance in the visible range (e.g., 400-700 nm) but is reflective in the range of the light from the light sources (e.g., 940 nm).
  • reflector material is used that comprises metal (e.g., Cu, Au plated material, etc.). Reflector material may be grown/applied directly on a substrate.
  • Figure 7B illustrates an exemplary process for forming at least a portion an optic of an exemplary device.
  • a transparent substrate (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • a high index polymer 720b (e.g., SU-8, ~1.58 @ 940 nm) is added, e.g., by lithography or laser.
  • a micro prism 730b with coated facets (e.g., for the 940 nm range) is added.
  • the micro prism 730b is oriented in a different exemplary orientation than the micro prism 730 of Figure 7A.
  • a fill structure of low index material 740 is added.
  • an air gap may be used instead of the low index material 740; in that case, the index contrast may be significant enough to make edges visible.
  • the low index material 740 may provide less contrast and thus less potential for visibility.
  • Figure 9A illustrates an exemplary process for forming at least a portion of an optic of an exemplary device.
  • a transparent substrate (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • a high index polymer 920 (e.g., SU-8, ~1.58 @ 940 nm) is added, e.g., by lithography or laser.
  • nano-imprinting is used to form shapes 930a-b from a low index material, e.g., a polymer.
  • a mirror coating and patterning technique is used to form exit component 935 on shape 930b.
  • a low index material coating is added, combining with shapes 930a-b to form a region 950 of low index material.
  • the exit component 935 is embedded within the low index material 950 in this way.
  • Figure 9B illustrates an exemplary process for forming at least a portion of an optic of an exemplary device.
  • a transparent substrate (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • a high index polymer 920b (e.g., SU-8, ~1.58 @ 940 nm) is added, e.g., by lithography or laser.
  • nano-imprinting is used to form shapes 930c-d from a low index material, e.g., a polymer.
  • a mirror coating and patterning technique is used to form exit component 935b on shape 930c.
  • a low index material coating is added, combining with shapes 930c-d to form a region 950b of low index material.
  • the exit component 935b is embedded within the low index material 950b in this way.
  • Figure 9C illustrates an exemplary process for forming at least a portion of an optic of an exemplary device.
  • a transparent substrate 970 (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • the glass or other transparent substrate 970 is patterned, e.g., by lithography or laser.
  • an additional etch is optionally performed for prism placement.
  • the etched grooves / channels are filled with high index material 972, e.g., via lithography or an overfill and polish technique.
  • a mirror 976 is attached and portion 974 filled with a low refractive index material such as glass.
  • Figure 9D illustrates an exemplary process for forming at least a portion of an optic of an exemplary device similar to the process of Figure 9C. However, at step 929, a mirror 976b is attached and portion 974 filled with a low refractive index material such as glass. The mirror 976b of Figure 9D has a different position and orientation than the mirror of Figure 9C.
  • Figure 9E illustrates an exemplary process for forming at least a portion of an optic of an exemplary device.
  • a transparent substrate 990 (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • the glass or other transparent substrate 990 is patterned, e.g., by lithography or laser, to create a round edge profile.
  • any etched grooves / channels are filled with high index material 992, e.g., via lithography or an overfill and polish technique.
  • portion 994 is filled with a low refractive index material such as glass.
  • light is guided across a clear aperture to a location and then coupled out towards the user’s eye by diffuse scattering from a waveguide tip, e.g., coated in a high reflectance coating.
  • This approach may provide minimal visibility of the illumination system due to small (e.g., micron-scale) feature dimensions and its transparency.
  • Diffuse scattered light may provide near-Lambertian illumination sources with a wide angle, which may allow illumination of a large area of a user’s face from a very close distance.
  • the shape of the tip may be configured for illumination efficiency and reduction of back-reflections to the light source.
  • the components are configured to produce light in a direction (e.g., vertically) out of a wave guide while also making the light diffusive, similar to an LED.
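The near-Lambertian behavior noted above can be sketched numerically: an ideal Lambertian emitter's radiant intensity falls off as cos(theta) from the surface normal, which is what lets a single tiny tip illuminate a wide facial area from very close range. This is a textbook relation, not a detail from the disclosure:

```python
import math

def lambertian_relative_intensity(theta_deg: float) -> float:
    """Relative radiant intensity of an ideal Lambertian emitter at angle
    theta from its surface normal (1.0 on-axis, 0 at grazing angles)."""
    return max(0.0, math.cos(math.radians(theta_deg)))

# Even 60 degrees off-axis the emitter still delivers half its on-axis
# intensity, so a wide cone of the face is illuminated.
half_power = lambertian_relative_intensity(60.0)
```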
  • reflector material is used that comprises metal (e.g., Cu, Au plated material, etc.). Reflector material may be grown/applied directly on a substrate.
  • Figure 9F illustrates an exemplary process for forming at least a portion of an optic of an exemplary device.
  • a transparent substrate (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • a waveguide 1912 (e.g., a transparent polymer) is formed on the substrate.
  • Figure 9G illustrates an isometric view of an exemplary geometric shape of the waveguide 1912 with tip 1914.
  • the waveguide 1912 is coated with a diffuse layer 1916.
  • a low refractive index (LRI) clad layer is added.
  • a light source 1918 is coupled.
  • Figure 10 illustrates an exemplary process for forming at least a portion an optic of an exemplary device.
  • a transparent substrate (e.g., glass, polycarbonate, cyclo olefin polymer (COP), or other transparent polymeric material) is provided.
  • a high index polymer 1020 (e.g., SU-8, ~1.58 @ 940 nm) is added, e.g., by lithography or laser, including grating couplers 1030a-b.
  • encapsulation with a low index material 1040 is optionally performed.
  • a partial encapsulation can be provided if needed.
  • Figure 11 illustrates positioning an exemplary optic 1100 relative to an eye 1120.
  • light is directed through the optic 1100 to leave at exit locations (e.g., exit location 1110) towards the eye 1120.
  • This light reflects off the eye 1120 to form glints in image data captured by image sensor 1115.
  • Figure 12 illustrates exemplary glints (e.g., glint 1210) reflected off a model of an eye. Glints may appear as bright spots or otherwise distinguishable regions in images captured by image sensors, and their relative positioning can be used to determine the eye’s position and/or orientation.
  • Figure 13 is a flowchart illustrating an exemplary method 1300.
  • a device e.g., device 110 performs method 1300 to assess an eye characteristic.
  • method 1300 is performed on a mobile device, desktop, laptop, HMD, or server device.
  • the method 1300 may be performed by processing logic, including hardware, firmware, software, or a combination thereof.
  • the method 1300 is performed on a processor executing code stored in a non-transitory computer-readable medium (e.g., a memory).
  • the method 1300 is performed in combination of one or more devices as described herein.
  • sensor data from a plurality of light sensors may be acquired at an HMD (e.g., device 110), but the processing of the data (e.g., assessing an eye characteristic) may be performed at a separate device (e.g., a mobile device).
  • the method 1300 directs light through a wave guide to exit positions on an optic of a device.
  • the light produces a glint (e.g., a specular reflection) by producing light that reflects off a portion of an eye.
  • the glint may be a specular glint.
  • the light is IR light.
  • the light sources are VCSELs or LEDs.
  • the method 1300 receives sensor data from an image sensor, the sensor data corresponding to a plurality of reflections of the light reflected from an eye.
  • the sensor may be an IR image sensor/detector that receives the reflections of light off of the eye (e.g., glints).
  • the method 1300 determines a location of the glint based on the reflected light received at the sensor. For example, determining a location of the glint may include determining a centroid of the received light. In some implementations, multiple glints may be produced and located by the sensor (e.g., detector 220).
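A minimal sketch of the centroid computation described above, assuming a grayscale image represented as a list of rows and a fixed brightness threshold (both are assumptions for illustration, not details from the disclosure):

```python
def glint_centroid(image, threshold):
    """Intensity-weighted centroid of pixels brighter than threshold.
    image: 2D list of grayscale values; returns (row, col) or None."""
    total = wr = wc = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v > threshold:
                total += v
                wr += r * v
                wc += c * v
    if total == 0:
        return None  # no glint found above threshold
    return (wr / total, wc / total)

# Tiny synthetic frame with one bright glint centered near (row=2, col=2).
frame = [
    [0,   0,   0,   0],
    [0,  10, 200,  10],
    [0, 200, 255, 200],
    [0,   0, 200,   0],
]
center = glint_centroid(frame, threshold=50)
```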
  • the method 1300 assesses a characteristic of the eye based on the sensor data.
  • assessing the eye characteristic of the eye may be based on a determined location of the glint.
  • the eye characteristic may include a gaze direction, eye orientation, identifying an iris of the eye, or the like.
  • the eye-assessment system for the HMD can track gaze direction, eye orientation, identification of the iris, etc., of a user.
  • determining an orientation of the eye is based on identifying a pattern of the glints/light reflections in an image.
  • gaze direction may be determined using the sensor data to identify two points on the eye, e.g., a cornea center and an eyeball center.
  • gaze direction may be determined using the sensor data (e.g., a pattern of glints) to directly predict the gaze direction.
  • a machine learning model may be trained to directly predict the gaze direction based on the sensor data.
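The two-point approach above (using a cornea center and an eyeball center) can be illustrated with a minimal sketch. The function name is hypothetical, and the recovery of the two 3-D points from the glint pattern is assumed to have happened upstream (e.g., by sphere fitting):

```python
import numpy as np

def gaze_direction(eyeball_center, cornea_center):
    """Unit vector along the optical axis, from the eyeball center
    through the cornea (curvature) center, in the device frame."""
    v = np.asarray(cornea_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return v / np.linalg.norm(v)
```

Note that mapping this optical axis to the user's visual axis typically requires a per-user calibration offset, which is outside this sketch.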
  • the user may be uniquely identified from a registration process or prior iris evaluation.
  • the method 1300 may include assessing the characteristic of the eye by performing an identification or authentication process.
  • Such a process may include identifying an iris of the eye, for example, by matching a pattern of glints/light reflections in an image with a unique pattern associated with the user.
  • Iris identification may be used as a primary authentication mode or as part of a multi-factor or step up authentication.
  • the matching patterns may be stored in a database located on the HMD (e.g., device 110), another device communicatively coupled to the HMD (e.g., a mobile device in electronic communication with the HMD), an external device or server (e.g., connected through a network), or a combination of these or other devices.
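One common way to implement the pattern-matching step is a normalized Hamming distance between a binarized code and an enrolled template. The sketch below is a generic illustration under that assumption; the function name and the threshold value are hypothetical and not taken from the disclosure:

```python
def iris_match(code, template, max_hamming_frac=0.32):
    """Return True when the normalized Hamming distance between a
    binarized code and an enrolled template is below threshold."""
    assert len(code) == len(template)
    # Fraction of disagreeing bits between the two binary patterns.
    dist = sum(a != b for a, b in zip(code, template)) / len(code)
    return dist <= max_hamming_frac
```

In a multi-factor or step-up authentication flow, a match result like this would be combined with other factors rather than used alone.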
  • assessing a characteristic of an eye includes determining locations of multiple portions of the eye based on determining locations of multiple glints. In some implementations, assessing the characteristic of the eye is based on sensor data from a single sensor.
  • receiving the reflected light includes receiving image data from the sensor.
  • an electronic device comprises a frame comprising one or more light sources (e.g., LEDs, VCSELs, etc.) and an optic (e.g., glass, lens, stack of optical components, etc.) coupled to the frame and comprising one or more wave guides positioned to direct light received from the one or more light sources of the frame to exit locations at a surface of the optic.
  • the wave guides may be configured to be transparent.
  • the optic may include a plurality of layers. In one example, the layers include: a first layer comprising the wave guide to direct the light that is reflected from the eye and a second layer comprising a second wave guide positioned to display augmented reality content.
  • This first layer may include a first portion comprising a first transparent material or an air gap having a first transparency index and a second portion comprising a second transparent material having a second transparency index that is greater than the first transparency index.
  • the optic may include a cover glass layer and/or a tint layer.
  • the optic may include one or more exit components, e.g., prisms, gratings, etc.
  • the optic further comprises prisms comprising reflective surfaces directing the light to exit the optic at the exit locations.
  • the prisms or reflective surfaces are coated with a coating that is transparent to at least some light in a visible light range and reflective to at least some of the light outside of the visible light range produced by the one or more light sources.
  • Such reflective surfaces may be relatively small such that they are not visible to the naked eye.
  • Each may be smaller than a threshold associated with naked-eye visibility, e.g., each smaller than 75um x 75um, each smaller than 100um x 100um, each smaller than 125um x 125um, etc.
  • reflector material is used that comprises metal (e.g., Cu, Au plated material, etc.). Reflector material may be grown/applied directly on a substrate.
  • the one or more wave guides may be formed using a wafer patterning technique or a nanoimprint technique.
  • Each of the one or more wave guides may include a polymer patterned by lithography or laser and a micro prism comprising at least one facet coated to reflect light used to produce eye reflections at a particular wave length or wave length range, e.g., between 900nm and 1000nm.
  • Each of the one or more wave guides may further have a fill structure having a lower index than the polymer.
  • each wave guide comprises a polymer patterned by lithography or laser and a reflective shape embedded within a fill structure having a lower index than the polymer.
  • the electronic device further includes an image sensor (e.g., an IR image sensor/detector) and a processor configured to receive sensor data from the image sensor, the sensor data corresponding to reflections of the light exiting from the exit locations and reflected from an eye and assess an eye characteristic based on the reflections.
  • the electronic device may further include a second wave guide positioned to display augmented reality (AR) or other extended reality (XR) content.
  • the electronic device may have exit locations that are spaced from one another relative to a surface of the optic to emit the light towards the eye from multiple directions.
  • each exit location is approximately equidistant from an adjacent light source.
  • the spatial arrangement of the plurality of exit locations may form an evenly spaced grid (e.g., 3x3, 4x4, etc.).
  • each exit location is spaced from each adjacent exit location based on a minimum distance constraint.
  • the exit locations are divided into subgroups, and each subgroup includes two or more exit locations.
  • the subgroups of the exit locations are dispersed throughout the optic.
  • the exit locations may be grouped in numbers of three exit locations per group, and each group may be spread out in any spatial arrangement (e.g., an equidistant grid, an ellipse, a box, etc.).
  • the spatial arrangement of exit locations is based on a geometric shape.
  • the geometric shape includes shapes such as a parabola, an ellipse, a hyperbola, a cycloid, or the like.
  • the geometric shape is based on a transcendental curve or an algebraic curve.
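The arrangements above (an evenly spaced grid, a minimum-distance constraint, and curve-based layouts such as an ellipse) can be sketched with a few helpers. The function names and unit-less coordinates are illustrative assumptions:

```python
import math

def grid_layout(n, spacing):
    """Evenly spaced n x n grid of exit locations (e.g., 3x3, 4x4)."""
    return [(i * spacing, j * spacing) for i in range(n) for j in range(n)]

def ellipse_layout(count, a, b):
    """Exit locations spread along an ellipse with semi-axes a and b."""
    return [(a * math.cos(2 * math.pi * k / count),
             b * math.sin(2 * math.pi * k / count)) for k in range(count)]

def satisfies_min_distance(points, d_min):
    """Check a minimum-distance constraint between all pairs of exit locations."""
    return all(math.dist(p, q) >= d_min
               for i, p in enumerate(points) for q in points[i + 1:])
```

Other curves (a parabola, hyperbola, cycloid, or any algebraic or transcendental curve) could be parameterized the same way as the ellipse helper.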
  • Figure 14 is a block diagram of an example device 1400.
  • Device 1400 illustrates an exemplary device configuration for device 110. While certain specific features are illustrated, those skilled in the art will appreciate from the present disclosure that various other features have not been illustrated for the sake of brevity, and so as not to obscure more pertinent aspects of the implementations disclosed herein.
  • the device 1400 includes one or more processing units 1402 (e.g., microprocessors, ASICs, FPGAs, GPUs, CPUs, processing cores, and/or the like), one or more input/output (I/O) devices and sensors 1406, one or more communication interfaces 1408 (e.g., USB, FIREWIRE, THUNDERBOLT, IEEE 802.3x, IEEE 802.11x, IEEE 802.16x, GSM, CDMA, TDMA, GPS, IR, BLUETOOTH, ZIGBEE, SPI, I2C, and/or the like type interface), one or more programming (e.g., I/O) interfaces 1410, one or more displays 1412, one or more interior and/or exterior facing image sensor systems 1414, a memory 1420, and one or more communication buses 1404 for interconnecting these and various other components.
  • the one or more communication buses 1404 include circuitry that interconnects and controls communications between system components.
  • the one or more I/O devices and sensors 1406 include at least one of an inertial measurement unit (IMU), an accelerometer, a magnetometer, a gyroscope, a thermometer, one or more physiological sensors (e.g., blood pressure monitor, heart rate monitor, blood oxygen sensor, blood glucose sensor, etc.), one or more microphones, one or more speakers, a haptics engine, one or more depth sensors (e.g., a structured light, a time-of-flight, or the like), and/or the like.
  • the one or more displays 1412 are configured to present a view of a physical environment or a graphical environment to the user.
  • the one or more displays 1412 correspond to holographic, digital light processing (DLP), liquid-crystal display (LCD), liquid-crystal on silicon (LCoS), organic light-emitting field-effect transistor (OLET), organic light-emitting diode (OLED), surface-conduction electron-emitter display (SED), field-emission display (FED), quantum-dot light-emitting diode (QD-LED), micro-electromechanical system (MEMS), and/or the like display types.
  • the one or more displays 1412 correspond to diffractive, reflective, polarized, holographic, etc. waveguide displays.
  • the device 1400 includes a single display. In another example, the device 1400 includes a display for each eye of the user (e.g., device 210).
  • the one or more image sensor systems 1414 are configured to obtain image data that corresponds to at least a portion of the physical environment 5.
  • the one or more image sensor systems 1414 include one or more RGB cameras (e.g., with a complementary metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled device (CCD) image sensor), monochrome cameras, IR cameras, depth cameras, event-based cameras, and/or the like.
  • the one or more image sensor systems 1414 further include illumination sources that emit light, such as a flash.
  • the one or more image sensor systems 1414 further include an on-camera image signal processor (ISP) configured to execute a plurality of processing operations on the image data.
  • the memory 1420 includes high-speed random-access memory, such as DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices.
  • the memory 1420 includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • the memory 1420 optionally includes one or more storage devices remotely located from the one or more processing units 1402.
  • the memory 1420 includes a non-transitory computer readable storage medium.
  • the memory 1420 or the non-transitory computer readable storage medium of the memory 1420 stores an optional operating system 1430 and one or more instruction set(s) 1440.
  • the operating system 1430 includes procedures for handling various basic system services and for performing hardware dependent tasks.
  • the instruction set(s) 1440 include executable software defined by binary information stored in the form of electrical charge.
  • the instruction set(s) 1440 are software that is executable by the one or more processing units 1402 to carry out one or more of the techniques described herein.
  • the instruction set(s) 1440 include a glint analysis instruction set 1442, a physiological tracking instruction set 1444, and a light driver instruction set 1446.
  • the instruction set(s) 1440 may be embodied as a single software executable or multiple software executables.
  • the glint analysis instruction set 1442 is executable by the processing unit(s) 1402 to determine a location of a glint based on reflected light received at a sensor.
  • the physiological tracking (e.g., eye gaze characteristics) instruction set 1444 is executable by the processing unit(s) 1402 to track or assess an eye characteristic or other physiological attribute based on the determined location of a glint (e.g., from the glint analysis instruction set 1442) using one or more of the techniques discussed herein or as otherwise may be appropriate.
  • the light driver instruction set 1446 is executable by the processing unit(s) 1402 to activate and control the light sources using one or more of the techniques discussed herein or as otherwise may be appropriate.
  • Figure 14 is intended more as a functional description of the various features present in a particular implementation as opposed to a structural schematic of the implementations described herein. As recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated. The actual number of instruction sets and how features are allocated among them may vary from one implementation to another and may depend in part on the particular combination of hardware, software, and/or firmware chosen for a particular implementation.
  • this gathered data may include personal information data that uniquely identifies a specific person or can be used to identify interests, traits, or tendencies of a specific person.
  • personal information data can include physiological data, demographic data, location-based data, telephone numbers, email addresses, home addresses, device characteristics of personal devices, or any other personal information.
  • the present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users.
  • the personal information data can be used to improve interaction and control capabilities of an electronic device. Accordingly, use of such personal information data enables calculated control of the electronic device. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure.
  • the present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information and/or physiological data will comply with well-established privacy policies and/or privacy practices.
  • such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure.
  • personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users.
  • such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
  • the present disclosure also contemplates implementations in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware or software elements can be provided to prevent or block access to such personal information data.
  • the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services.
  • users can select not to provide personal information data for targeted content delivery services.
  • users can select to not provide personal information, but permit the transfer of anonymous information for the purpose of improving the functioning of the device.
  • while the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, it also contemplates that the various embodiments can be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
  • content can be selected and delivered to users by inferring preferences or settings based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the content delivery services, or publicly available information.
  • data is stored using a public/private key system that only allows the owner of the data to decrypt the stored data.
  • the data may be stored anonymously (e.g., without identifying and/or personal information about the user, such as a legal name, username, time and location data, or the like). In this way, other users, hackers, or third parties cannot determine the identity of the user associated with the stored data.
  • a user may access his or her stored data from a user device that is different than the one used to upload the stored data. In these instances, the user may be required to provide login credentials to access their stored data.
  • a computing device can include any suitable arrangement of components that provides a result conditioned on one or more inputs.
  • Suitable computing devices include multipurpose microprocessor-based computer systems accessing stored software that programs or configures the computing system from a general-purpose computing apparatus to a specialized computing apparatus implementing one or more implementations of the present subject matter. Any suitable programming, scripting, or other type of language or combinations of languages may be used to implement the teachings contained herein in software to be used in programming or configuring a computing device.
  • Implementations of the methods disclosed herein may be performed in the operation of such computing devices. The order of the blocks presented in the examples above can be varied; for example, blocks can be re-ordered, combined, or broken into sub-blocks. Certain blocks or processes can be performed in parallel.
  • the first node and the second node are both nodes, but they are not the same node.
  • the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting,” that a stated condition precedent is true, depending on the context.
  • the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.

Abstract

Various implementations assess an eye characteristic based on reflections (e.g., glints) produced using one or more light sources. A device (e.g., an HMD) includes an optic (e.g., a stack of one or more transparent elements) through which a physical environment of a user is viewed. Light is emitted from a plurality of locations on or in the optic to produce glints on the eye for an eye assessment. One or more light sources may be located on a surrounding frame and may produce light that is guided into one or more wave guides. The wave guide(s) may direct the light through the optic and then out of the optic at specific exit locations, e.g., via prisms, gratings, etc. The wave guides and associated elements are configured such that a user is not likely to notice them given their small size, close proximity to the eye, and/or transparent materials.
PCT/US2023/032453 2022-09-23 2023-09-12 Éclairage de guide d'ondes pour évaluation oculaire WO2024063978A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263409282P 2022-09-23 2022-09-23
US63/409,282 2022-09-23

Publications (1)

Publication Number Publication Date
WO2024063978A1 true WO2024063978A1 (fr) 2024-03-28

Family

ID=88297169

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/032453 WO2024063978A1 (fr) 2022-09-23 2023-09-12 Éclairage de guide d'ondes pour évaluation oculaire

Country Status (1)

Country Link
WO (1) WO2024063978A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10838132B1 (en) * 2018-08-21 2020-11-17 Facebook Technologies, Llc Diffractive gratings for eye-tracking illumination through a light-guide
US20220026719A1 (en) * 2019-08-15 2022-01-27 Magic Leap, Inc. Ghost Image Mitigation in See-Through Displays With Pixel Arrays
US11397465B1 (en) * 2021-05-17 2022-07-26 Microsoft Technology Licensing, Llc Glint-based eye tracker illumination using dual-sided and dual-layered architectures


Similar Documents

Publication Publication Date Title
US9377623B2 (en) Waveguide eye tracking employing volume Bragg grating
US11474358B2 (en) Systems and methods for retinal imaging and tracking
JP6641361B2 (ja) 切換え式回折格子を利用した導波路アイトラッキング
US11340702B2 (en) In-field illumination and imaging for eye tracking
US11650426B2 (en) Holographic optical elements for eye-tracking illumination
KR102273001B1 (ko) 아이 트래킹 장치, 방법 및 시스템
US10241332B2 (en) Reducing stray light transmission in near eye display using resonant grating filter
US10852817B1 (en) Eye tracking combiner having multiple perspectives
US10698204B1 (en) Immersed hot mirrors for illumination in eye tracking
US10775647B2 (en) Systems and methods for obtaining eyewear information
US11073903B1 (en) Immersed hot mirrors for imaging in eye tracking
CN115032795 Head-mounted display system configured to exchange biometric information
US10725302B1 (en) Stereo imaging with Fresnel facets and Fresnel reflections
CN114144717 Apodized optical elements for reducing optical artifacts
EP4314931A1 Eye tracker illumination through a waveguide
US20210378509A1 (en) Pupil assessment using modulated on-axis illumination
US11426070B2 (en) Infrared illuminator and related eye tracking apparatus and method
WO2024063978A1 (fr) Éclairage de guide d'ondes pour évaluation oculaire
US11237628B1 (en) Efficient eye illumination using reflection of structured light pattern for eye tracking
US20230333640A1 (en) Multiple gaze dependent illumination sources for retinal eye tracking
CN107783292 Head-mounted visual device capable of collecting user iris information
US20230324587A1 (en) Glint analysis using multi-zone lens
US20230367117A1 (en) Eye tracking using camera lens-aligned retinal illumination
US11836287B1 (en) Light pattern-based alignment for retinal eye tracking
US20230309824A1 (en) Accommodation tracking based on retinal-imaging