US20230087535A1 - Wavefront sensing from retina-reflected light - Google Patents
- Publication number: US20230087535A1
- Authority: US (United States)
- Prior art keywords: eye, retina, light, infrared, illuminators
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G06F3/013—Eye tracking input arrangements
- G02B27/0093—Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G02B27/0172—Head mounted characterised by optical features
- G02B5/208—Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
- G02B2027/0105—Head-up displays comprising holographic elements; holograms with particular structures
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera
- G02B2027/0174—Head mounted characterised by optical features, holographic
- G02B2027/0178—Head mounted, eyeglass type
- G02B2027/0187—Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
- G02B2027/0194—Supplementary details with combiner of laminated type, for optical or mechanical aspects
- G02B3/0056—Lens arrays arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
- G02C11/04—Spectacles: illuminating means
Definitions
- HMD: head-mounted display
- Conventional methods used in HMDs and other optical systems for determining where an eye is focusing can be inaccurate, especially across age demographics.
- FIG. 1 illustrates an example HMD that may include infrared in-field illuminators and a combiner for redirecting retina-reflected infrared light to a wavefront sensor, in accordance with aspects of the disclosure.
- FIG. 2 is a top view of an example near-eye optical element that includes an illumination layer, a combiner layer, and a display layer, in accordance with aspects of the disclosure.
- FIG. 3 illustrates a front view of an eye through an example illumination layer, in accordance with aspects of the disclosure.
- FIG. 4 illustrates an example optical path of infrared illumination light and retina-reflected infrared light, in accordance with aspects of the disclosure.
- FIG. 5 illustrates an example infrared in-field illuminator including a light source and an example beam-forming element, in accordance with aspects of the disclosure.
- FIGS. 6 A- 6 C illustrate an eye in different positions with respect to an array of infrared in-field illuminators and an example combiner, in accordance with aspects of the disclosure.
- FIG. 7 illustrates a wavefront imaging system that may be utilized in a near-eye optical system, in accordance with aspects of the disclosure.
- FIG. 8 illustrates a flow chart for generating an accommodative eye state value, in accordance with aspects of the disclosure.
- FIG. 9 is a block diagram illustration of a lenslet array focusing a planar wavefront of retina-reflected infrared light onto an image sensor as beam spots, in accordance with aspects of the disclosure.
- FIG. 10 is a block diagram illustration of a lenslet array focusing a converging wavefront of reflected infrared light onto an image sensor as beam spots, in accordance with aspects of the disclosure.
- FIG. 11 is a block diagram illustration of a lenslet array focusing a diverging wavefront of reflected infrared light onto an image sensor as beam spots, in accordance with aspects of the disclosure.
- Embodiments of wavefront sensing with in-field illuminators are described herein.
- numerous specific details are set forth to provide a thorough understanding of the embodiments.
- One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc.
- well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
- Embodiments of an apparatus, system, and method for wavefront sensing described in this disclosure are capable of capturing a wavefront image of infrared light propagating through the lens of an eye. By determining the converging or diverging attributes of the wavefront, an accommodative state of the eye can be determined.
- Conventionally, vergence of the eyes is used as a surrogate to approximate the accommodative state of the eye. For example, when the two eyes are verged inward, the eyes are likely focused on a near-field object (e.g. a book held close), whereas two eyes that are looking straight ahead are likely focused near infinity (e.g. a mountain in the distance).
- However, vergence only approximates the accommodative state of the eye.
- The accommodative response of the eye also varies across age groups. For example, individuals under approximately age 45 may accommodate freely, while older individuals may have a limited accommodative response. For these reasons, it would be advantageous to measure an accommodative state of the eye rather than approximating the accommodative state based on vergence.
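The vergence-as-surrogate approximation can be made concrete with a little geometry: for symmetric convergence, the fixation distance implied by the convergence angle follows from the interpupillary distance, and the accommodation demand is its reciprocal. The sketch below uses illustrative values (63 mm IPD) and hypothetical function names; it is not part of the disclosure:

```python
import math

def vergence_distance_m(ipd_m: float, convergence_deg: float) -> float:
    """Fixation distance implied by symmetric convergence of the two eyes.

    Each eye rotates inward by convergence_deg/2 from parallel gaze,
    so the gaze lines cross at d = (ipd/2) / tan(convergence_deg/2).
    """
    half_angle = math.radians(convergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

def accommodation_demand_diopters(distance_m: float) -> float:
    """Accommodation demand is the reciprocal of the fixation distance."""
    return 1.0 / distance_m

# Example: 63 mm IPD, eyes converged by ~7.2 degrees -> fixation at ~0.5 m,
# implying roughly 2 diopters of accommodation demand.
d = vergence_distance_m(0.063, 7.2)
demand = accommodation_demand_diopters(d)
```

The catch, as noted above, is that the *demand* computed this way is not the eye's actual *response*: a presbyopic eye may converge on a near object while accommodating far less than 2 diopters.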
- Embodiments of the disclosure provide a way to measure an accommodative state of the eye in real time or pseudo real-time.
- an infrared wavefront that has propagated through the lens of the eye is measured by a wavefront sensor.
- A wavefront image captured by the wavefront sensor is analyzed for divergence or convergence to determine the accommodative state of the eye, and a virtual image presented to the eye(s) may be adjusted based on the determined accommodative state of the eye.
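The divergence/convergence analysis can be sketched as a Shack-Hartmann-style computation (the lenslet-array arrangement of FIGS. 9-11): each lenslet's beam spot shifts in proportion to the local wavefront slope, and a least-squares fit of the slopes against lenslet positions recovers the wavefront curvature. A positive curvature indicates a converging wavefront, negative a diverging one, and zero a planar wavefront. This is a hypothetical illustration, not the disclosed implementation:

```python
import numpy as np

def defocus_from_spots(lenslet_xy, spot_shift_xy, focal_length_m):
    """Estimate wavefront curvature (1/m) from Shack-Hartmann spot shifts.

    For a spherical wavefront W(x, y) = C * (x^2 + y^2) / 2 with curvature
    C, the local slope is grad W = C * (x, y), and each spot shifts by
    focal_length * slope. A least-squares fit of spot shifts against
    lenslet positions recovers C: C > 0 converging, C < 0 diverging,
    C = 0 planar.
    """
    slopes = np.asarray(spot_shift_xy) / focal_length_m   # (N, 2) slopes
    positions = np.asarray(lenslet_xy).reshape(-1)        # flatten to (2N,)
    c, *_ = np.linalg.lstsq(positions[:, None], slopes.reshape(-1), rcond=None)
    return float(c[0])

# Synthetic check: 5x5 lenslet grid, 1-diopter converging wavefront.
f = 5e-3  # illustrative 5 mm lenslet focal length
xs, ys = np.meshgrid(np.linspace(-2e-3, 2e-3, 5), np.linspace(-2e-3, 2e-3, 5))
pts = np.stack([xs.ravel(), ys.ravel()], axis=1)
shifts = f * 1.0 * pts  # spot shift = f * C * (x, y) with C = 1.0 [1/m]
print(defocus_from_spots(pts, shifts, f))
```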
- An array of infrared in-field illuminators or a photonic integrated circuit (PIC), for example, may illuminate the eye with infrared illumination light and a combiner is utilized to redirect an infrared wavefront (that propagated through the eye lens and is exiting the pupil) to the wavefront sensor.
- the infrared in-field illuminators may be configured to emit infrared illumination light that is collimated or near-collimated to a center of rotation of an eye.
- FIG. 1 illustrates an example HMD 100 , in accordance with aspects of the present disclosure.
- the illustrated example of HMD 100 is shown as including a frame 102 , temple arms 104 A and 104 B, and near-eye optical elements 110 A and 110 B.
- Wavefront sensors 108 A and 108 B are shown as coupled to temple arms 104 A and 104 B, respectively.
- FIG. 1 also illustrates an exploded view of an example of near-eye optical element 110 A.
- Near-eye optical element 110 A is shown as including an optically transparent layer 120 A, an illumination layer 130 A, an optical combiner layer 140 A, and a display layer 150 A.
- Display layer 150 A may include a waveguide 158 that is configured to direct virtual images to an eye of a user of HMD 100 .
- Illumination layer 130 A is shown as including a plurality of in-field illuminators 126 .
- In-field illuminators 126 are described as “in-field” because they are in a field of view (FOV) of a user of the HMD 100 .
- In-field illuminators 126 may be in a same FOV that a user views a display of the HMD, in an embodiment.
- In-field illuminators 126 may be in a same FOV that a user views an external environment of the HMD 100 via scene light 191 propagating through near-eye optical elements 110 .
- While each in-field illuminator 126 may introduce minor occlusions into the near-eye optical element 110 A, the in-field illuminators 126 , as well as their corresponding electrical routing, may be so small as to be unnoticeable or insignificant to a wearer of HMD 100 . Additionally, any occlusion from in-field illuminators 126 will be placed so close to the eye that the human eye cannot bring it into focus, which further helps make the in-field illuminators 126 unnoticeable or insignificant. In some embodiments, each in-field illuminator 126 has a footprint (or size) that is less than about 200 ⁇ 200 microns.
- the in-field illuminators 126 may be disposed between 10 mm and 30 mm from the eye. In some embodiments, the in-field illuminators 126 may be placed between 15 mm and 25 mm from the eye of a user.
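To see why a sub-200-micron illuminator placed 10-30 mm from the eye is effectively invisible, one can compute its angular subtense and the residual defocus at that distance. The arithmetic below is illustrative, assuming a roughly 10-diopter near point for a young eye (an assumption, not a value from the disclosure):

```python
import math

# Angular subtense of a 200-micron illuminator at 20 mm from the eye
# (illustrative values drawn from the ranges described above).
size_m, dist_m = 200e-6, 20e-3
subtense_deg = 2 * math.degrees(math.atan((size_m / 2) / dist_m))
print(round(subtense_deg, 2))  # roughly half a degree of visual angle

# A young eye's near point is about 0.1 m (~10 diopters of accommodation),
# so an object at 0.02 m leaves ~40 diopters of residual defocus --
# far too much for the eye to ever bring the illuminator into focus.
residual_defocus_d = 1 / dist_m - 10.0
print(residual_defocus_d)
```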
- the in-field illuminators 126 may be infrared in-field illuminators 126 configured to emit infrared illumination light for eye-tracking purposes, for example.
- a photonic integrated circuit may be implemented instead of in-field illuminators 126 to achieve a similar function as in-field illuminators 126 .
- Outcoupling elements may be positioned similarly to the in-field illuminators 126 , and the outcoupling elements may be provided infrared light by transparent waveguides.
- Light sources located at the edge of a frame of the HMD may provide the infrared light into the transparent waveguides, for example.
- the outcoupling elements then redirect the infrared light provided by the waveguides to illuminate an eyeward region.
- the outcoupling elements may have diffractive or refractive features to facilitate beam-shaping of the infrared light received from the waveguides.
- Wavefront sensor(s) 108 of this disclosure may also be disposed in numerous places in the HMD besides the temple position illustrated in FIG. 1 .
- The frame 102 of example HMD 100 is coupled to temple arms 104 A and 104 B for securing the HMD 100 to the head of a user.
- Example HMD 100 may also include supporting hardware incorporated into the frame 102 and/or temple arms 104 A and 104 B.
- the hardware of HMD 100 may include any of processing logic, wired and/or wireless data interface for sending and receiving data, graphic processors, and one or more memories for storing data and computer-executable instructions.
- HMD 100 may be configured to receive wired power and/or may be configured to be powered by one or more batteries.
- HMD 100 may be configured to receive wired and/or wireless data including video data.
- FIG. 1 illustrates near-eye optical elements 110 A and 110 B that are configured to be mounted to the frame 102 .
- near-eye optical elements 110 A and 110 B may appear transparent to the user to facilitate augmented reality or mixed reality such that the user can view visible scene light from the environment while also receiving display light directed to their eye(s) by way of display layer 150 A.
- some or all of near-eye optical elements 110 A and 110 B may be incorporated into a virtual reality headset where the transparent nature of the near-eye optical elements 110 A and 110 B allows the user to view an electronic display (e.g., a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a micro-LED display, etc.) incorporated in the virtual reality headset.
- illumination layer 130 A includes a plurality of in-field illuminators 126 .
- Each in-field illuminator 126 may be disposed on a transparent substrate and may be configured to emit light towards an eyeward side 109 of the near-eye optical element 110 A.
- the in-field illuminators 126 are configured to emit near infrared light (e.g. 750 nm-1.5 ⁇ m).
- Each in-field illuminator 126 may be a micro light emitting diode (micro-LED), an edge emitting LED, a vertical cavity surface emitting laser (VCSEL) diode, or a Superluminescent diode (SLED).
- the in-field illuminators 126 of the illumination layer 130 A may be configured to emit infrared illumination light towards the eyeward side 109 of the near-eye optical element 110 A to illuminate the eye of a user.
- the near-eye optical element 110 A is shown as including optical combiner layer 140 A where the optical combiner layer 140 A is disposed between the illumination layer 130 A and a backside 111 of the near-eye optical element 110 A.
- The optical combiner 140 A is configured to receive retina-reflected infrared light that is reflected by the retina of the eye of the user and to direct the retina-reflected infrared light towards the wavefront sensor 108 A.
- the wavefront sensor(s) 108 may be located in different positions than the positions illustrated.
- the optical combiner 140 A is transmissive to visible light, such as scene light 191 incident on the backside 111 of the near-eye optical element 110 A.
- the optical combiner 140 A may be configured as a volume hologram and/or may include one or more Bragg gratings for directing the retina-reflected infrared light towards the wavefront sensor 108 A.
- the optical combiner 140 A includes a polarization-selective volume hologram (a.k.a. polarized volume hologram) that diffracts (in reflection) a particular polarization orientation of incident light while passing other polarization orientations.
- Display layer 150 A may include one or more other optical elements depending on the design of the HMD 100 .
- display layer 150 A may include a waveguide 158 to direct display light generated by an electronic display to the eye of the user.
- at least a portion of the electronic display is included in the frame 102 of the HMD 100 .
- the electronic display may include an LCD, an organic light emitting diode (OLED) display, micro-LED display, pico-projector, or liquid crystal on silicon (LCOS) display for generating the display light.
- Optically transparent layer 120 A is shown as being disposed between the illumination layer 130 A and the eyeward side 109 of the near-eye optical element 110 A.
- the optically transparent layer 120 A may receive the infrared illumination light emitted by the illumination layer 130 A and pass the infrared illumination light to illuminate the eye of the user.
- the optically transparent layer 120 A may also be transparent to visible light, such as scene light 191 received from the environment and/or display light received from the display layer 150 A.
- the optically transparent layer 120 A has a curvature for focusing light (e.g., display light and/or scene light) to the eye of the user.
- The optically transparent layer 120 A may, in some examples, be referred to as a lens.
- the optically transparent layer 120 A has a thickness and/or curvature that corresponds to the specifications of a user.
- the optically transparent layer 120 A may be a prescription lens.
- the optically transparent layer 120 A may be a non-prescription lens.
- While FIG. 1 illustrates an HMD 100 configured for augmented reality (AR) or mixed reality (MR) contexts, the disclosed embodiments may also be used in other implementations of an HMD.
- The illumination layers of this disclosure may be disposed close to a display plane of a display of a virtual reality (VR) HMD, or prior to a focusing lens of a VR HMD, where the focusing lens is disposed between the illumination layer and the display and focuses display light from the display for an eye of a wearer of the VR HMD.
- FIG. 2 is a top view of an example near-eye optical element 210 that includes an illumination layer 230 , a combiner layer 240 , and a display layer 250 .
- a transparent layer (not illustrated) may optionally be included between illumination layer 230 and eye 202 , in some embodiments.
- a plurality of infrared in-field illuminators 237 emit infrared illumination light 239 to an eyebox area to illuminate eye 202 .
- Plane 206 illustrates a two-dimensional pupil plane 206 in the eyebox area being normal to the curvature of the eye 202 at the center of pupil 203 .
- FIG. 2 illustrates example array of infrared in-field illuminators 237 A- 237 E.
- Each infrared in-field illuminator 237 in the array is configured to emit infrared illumination light 239 to a center of rotation 241 of eye 202 .
- the different infrared in-field illuminators 237 may direct infrared illumination light 239 to the center of rotation of eye 202 at different angles depending on the position of the infrared in-field illuminator with respect to eye 202 .
- infrared in-field illuminators 237 A and 237 E may include beam-forming elements that direct the infrared illumination light to eye 202 at steeper angles compared to infrared illuminator 237 C directing infrared illumination light 239 to eye 202 at an angle closer to normal.
- the center of rotation 241 of eye 202 remains at a substantially same position with respect to illuminators 237 even over a large range of gaze angles of eye 202 .
- infrared in-field illuminators 237 may be VCSELs or SLEDs, and consequently infrared illumination light 239 may be narrow-band infrared illumination light (e.g. linewidth of 1-10 nm).
- The infrared illumination light 239 may be collimated or near-collimated so that at least a portion of the infrared illumination light 239 will propagate through pupil 203 of eye 202 , reflect off of retina 208 , and exit eye 202 through pupil 203 as retina-reflected infrared light.
- the retina-reflected infrared light may be received by combiner optical element 240 and redirected to wavefront sensor 108 A to generate a wavefront image.
- alternative illumination layer implementations that utilize outcoupling elements, waveguides, and/or planar waveguides that achieve a similar function as infrared in-field illuminators 237 may also be utilized to generate infrared illumination light 239 that is collimated or near-collimated.
- Wavefront sensor 108 A is configured to capture wavefront images that may be utilized to determine an accommodative eye state value of eye 202 , for example.
- Wavefront sensor 108 A may include an infrared bandpass filter to pass the wavelength of the infrared illumination light 239 emitted by the infrared illuminators and block other light from becoming incident on an image sensor of the wavefront sensor, in some embodiments.
- FIG. 2 shows that scene light 191 (visible light) from the external environment may propagate through display layer 250 , combiner layer 240 , and illumination layer 230 to become incident on eye 202 so that a user can view the scene of an external environment.
- FIG. 2 also shows that display layer 250 may generate or redirect display light 293 to present virtual images to eye 202 .
- Display light 293 is visible light and propagates through combiner layer 240 and illumination layer 230 to reach eye 202 .
- Illumination layer 230 may include a transparent substrate that the infrared in-field illuminators 237 are disposed on.
- the infrared in-field illuminators 237 may also be encapsulated in a transparent material 232 .
- Transparent material 232 is configured to transmit visible light (e.g. 400 nm-750 nm) and near-infrared light (e.g. 750 nm-1.5 ⁇ m).
- FIG. 3 illustrates a front view of eye 202 through an example illumination layer 330 , in accordance with aspects of the disclosure.
- Illumination layer 330 includes twenty-one infrared in-field illuminators ( 337 A- 337 U).
- infrared illuminators 337 A- 337 H may be considered an “inner ring” of infrared in-field illuminators 337 while infrared illuminators 337 I- 337 U are considered an “outer ring” of infrared in-field illuminators 337 .
- infrared illuminators 337 I- 337 U may direct their infrared illumination light to eye 202 at a steeper angle than infrared illuminators 337 A- 337 H in the inner ring.
- An illumination angle of the infrared illumination light 239 from different infrared in-field illuminators 337 may increase as the distance of a particular infrared in-field illuminator 337 from middle region 231 of the array of infrared in-field illuminators 337 increases.
- FIG. 4 illustrates an example optical path of infrared illumination light 439 and retina-reflected infrared light 449 , in accordance with aspects of the disclosure.
- An array of infrared in-field illuminators 437 emits infrared illumination light 439 to a center of rotation of eye 202 . In FIG. 4 , only the infrared illumination light from infrared in-field illuminator 437 B is shown, for illustration and description of the optical path of the infrared illumination light.
- Portions of infrared illumination light 439 may not necessarily propagate through the pupil and may be scattered by the iris or cornea.
- infrared illumination light 439 propagates substantially normal to pupil plane 206 of eye 202 and propagates through the cornea 201 , anterior chamber, pupil 209 , and lens 204 of eye 202 before becoming incident upon the retina 208 .
- a portion (e.g. ⁇ 10% for 850 nm light) of infrared illumination light 439 reflects off the retina 208 as retina-reflected infrared light 449 .
- the portion of infrared illumination light 439 that propagates through pupil 209 normal to (or at least substantially normal to) pupil plane 206 is the light that can be reflected back out of pupil 209 after reflecting off of retina 208 rather than being absorbed by the interior of eye 202 .
- retina-reflected infrared light 449 propagates through lens 204 , pupil 209 , and cornea 201 to exit eye 202 . Retina-reflected infrared light 449 then propagates through illumination layer 430 and encounters combiner optical element 440 .
- Combiner optical element 440 receives retina-reflected infrared light 449 and redirects the retina-reflected infrared light 449 to a wavefront sensor (e.g. wavefront sensor 108 ).
- Combiner optical element 440 may include a polarization-selective volume hologram that reflects a first polarization orientation (e.g. right-hand circularly polarized light) of the retina-reflected infrared light and passes polarization orientations that are other than the first polarization orientation.
- Combiner optical element 440 may also include a folding mirror, hologram, or linear diffractive grating to redirect retina-reflected infrared light 449 , in some embodiments. The combiner optical element 440 passes visible light.
- FIG. 5 illustrates an example infrared in-field illuminator 537 that may be utilized as infrared illuminators 126 / 237 / 337 / 437 , in accordance with aspects of the disclosure.
- the example infrared in-field illuminator 537 illustrated in FIG. 5 includes an infrared light source 531 having an output aperture 536 and a beam-forming element 535 disposed over output aperture 536 .
- Beam-forming element 535 is configured to direct the infrared illumination light 539 to a center of rotation of an eye.
- Beam-forming element 535 includes a refractive material 538 , and a lens curvature 534 may be formed of the refractive material 538 of the beam-forming element 535 .
- the lens curvature 534 may assist in directing the infrared illumination light 539 to a center of rotation of the eye.
- the beam-forming elements of the infrared light sources may be configured to increase an illumination angle of the infrared illumination light 539 as a distance of a particular beam-forming element increases from a middle region (e.g. 231 ) of the array of infrared in-field illuminators so that the infrared illumination light 539 from each infrared in-field illuminator 537 is directed to a center of rotation of the eye.
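The increase of illumination angle with distance from the middle region reduces to simple geometry: an illuminator offset laterally by r from the eye's optical axis must tilt its beam by atan(r / d) to aim at the center of rotation, where d is the axial distance from the illuminator plane to the center of rotation. The eye-relief and center-of-rotation depth values below are illustrative assumptions (a cornea-to-center-of-rotation depth of ~13 mm is a common anatomical estimate), not values from the disclosure:

```python
import math

def steering_angle_deg(offset_m: float, eye_relief_m: float,
                       cor_depth_m: float = 0.013) -> float:
    """Tilt needed for an illuminator to aim at the eye's center of rotation.

    offset_m: lateral distance of the illuminator from the middle region
    of the array; eye_relief_m: illuminator-plane-to-cornea distance;
    cor_depth_m: cornea-to-center-of-rotation depth. All values here are
    illustrative assumptions.
    """
    return math.degrees(math.atan(offset_m / (eye_relief_m + cor_depth_m)))

# Illumination angle grows with distance from the middle of the array
# (20 mm eye relief assumed): 0 mm offset -> 0 degrees, and the required
# tilt steepens for illuminators further from the middle region.
for offset_mm in (0, 5, 10, 15):
    print(offset_mm, round(steering_angle_deg(offset_mm * 1e-3, 0.020), 1))
```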
- Substrate 532 is a transparent material.
- Refractive material 538 of beam-forming element 535 may be a high-index material having a refractive index of greater than three.
- the illustrated refractive beam-forming element 535 is replaced by, or includes, a diffractive optical element configured to direct the infrared illumination light 539 to the eye.
- beam-forming element 535 is approximately 30 microns wide.
- FIGS. 6 A- 6 C illustrate an eye 202 in different positions with respect to an array of infrared in-field illuminators 437 and an example combiner optical element, in accordance with aspects of the disclosure.
- In FIG. 6 A, infrared in-field illuminators 437 B and 437 C emit infrared illumination light 239 to a center of rotation of eye 202 .
- Infrared illumination light 239 may be collimated light.
- Other infrared in-field illuminators 437 in the array may also emit infrared illumination light 239 to a center of rotation of eye 202 .
- At least a portion of the infrared illumination light 239 propagates through the pupil of eye 202 and reflects off of retina 208 and propagates back through (exiting) the pupil as retina-reflected infrared light.
- FIG. 6 B illustrates infrared in-field illuminators 437 A and 437 B emitting infrared illumination light 239 to a center of rotation of eye 202 when eye 202 has changed a gaze angle of the eye.
- the eye 202 illustrated in FIG. 6 B may be gazing up or gazing to the left, for example.
- FIG. 6 C illustrates infrared in-field illuminators 437 C and 437 D emitting infrared illumination light 239 to a center of rotation of eye 202 when eye 202 is positioned at yet another gaze angle.
- the eye 202 illustrated in FIG. 6 C may be gazing down or gazing to the right, for example.
- FIGS. 6 A- 6 C illustrate that even when the gaze angle and/or position of eye 202 changes, different infrared in-field illuminators are still able to direct infrared illumination light substantially normal to pupil plane 206 . The infrared illumination light 239 therefore propagates through the pupil, reflects off of retina 208 , and propagates back through the pupil (as retina-reflected infrared light 449 , not illustrated) to combiner optical element 440 , which redirects it to a wavefront sensor.
- the infrared in-field illuminators 437 in the array are spaced apart so that at least a portion of the infrared in-field illuminators 437 will be positioned to illuminate a retina of the eye, through a pupil of the eye, with infrared illumination light propagating approximately normal to a pupil plane of the eye, over a range of eye positions.
- the range of eye positions may include the maximum eye position range that humans are capable of.
- the infrared in-field illuminators 437 in the array are selectively illuminated based on where a given infrared in-field illuminator 437 (or group of infrared in-field illuminators 437 ) are positioned.
- the infrared in-field illuminators 437 selected are positioned to illuminate the eye 202 with infrared illumination light 239 that will propagate through the pupil at an angle substantially normal to pupil plane 206 so that the combiner optical element 440 can receive a usable signal of retina-reflected infrared light 449 to direct to the wavefront sensor.
- the infrared in-field illuminators 437 are selectively activated (turned on) based on eye-tracking data collected by a separate eye-tracking system of an HMD. For example, if the eye-tracking system determines that eye 202 is looking up, infrared in-field illuminators 437 A and 437 B may be selectively activated since they may be best positioned to illuminate eye 202 with infrared illumination light 239 that will be reflected off retina 208 , back through the pupil to combiner optical element 440 .
- if the eye-tracking system determines that eye 202 is looking down, infrared in-field illuminators 437 C and 437 D may be selectively activated since they may be best positioned to illuminate eye 202 with infrared illumination light 239 that will be reflected off retina 208 , back through the pupil to combiner optical element 440 .
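The selective-activation scheme described above can be sketched as a simple nearest-illuminator lookup. Everything here is a hypothetical illustration: the normalized coordinates assigned to illuminators 437 A- 437 D and the distance-based selection rule are assumptions, not details from the disclosure.

```python
# Hypothetical positions of illuminators 437A-437D on the illumination
# layer, in normalized lens coordinates (y grows upward). These values
# are illustrative assumptions, not taken from the disclosure.
ILLUMINATORS = {
    "437A": (0.0, 0.6),   # upper region of the lens
    "437B": (0.0, 0.2),
    "437C": (0.0, -0.2),
    "437D": (0.0, -0.6),  # lower region of the lens
}

def select_illuminators(gaze_x, gaze_y, count=2):
    """Pick the illuminators best aligned with the current gaze.

    gaze_x/gaze_y are normalized gaze offsets from eye-tracking data
    (positive gaze_y means the eye is looking up). The illuminators
    closest to the gaze direction are the ones most likely to send
    light through the pupil, normal to the pupil plane.
    """
    ranked = sorted(
        ILLUMINATORS,
        key=lambda name: (ILLUMINATORS[name][0] - gaze_x) ** 2
        + (ILLUMINATORS[name][1] - gaze_y) ** 2,
    )
    return sorted(ranked[:count])

# Eye looking up -> upper illuminators are activated.
print(select_illuminators(0.0, 0.5))   # ['437A', '437B']
# Eye looking down -> lower illuminators are activated.
print(select_illuminators(0.0, -0.5))  # ['437C', '437D']
```

In a real system the selection would be driven by the eye-tracking data 793 described below rather than hand-supplied gaze offsets.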
- FIG. 7 illustrates a wavefront imaging system 700 that may be utilized in an HMD or as a near-eye optical system, in accordance with aspects of the disclosure.
- Wavefront imaging system 700 includes an eye-tracking module 747 for determining a position of eye 202 .
- eye-tracking module 747 includes a camera configured to capture infrared images of eye 202 .
- Eye-tracking module 747 generates eye-tracking data 793 that may include a position of eye 202 .
- eye 202 may change gaze angles in any combination of up, down, right, and left, and eye-tracking module 747 may provide those gaze angles or eye position in eye-tracking data 793 by analyzing images of eye 202 .
- Display 790 generates visible display light 799 for presenting a virtual image to a user of an HMD. Visible display light 799 may propagate through a near-eye optical element that includes illumination layer 430 and combiner optical element 440 with very little (if any) optical loss since the materials in the near-eye optical element are configured to pass visible light and combiner 440 may be configured to diffract a particular bandwidth of infrared light emitted by infrared in-field illuminators.
- Display 790 may include an OLED, micro-LED, or LCD in a virtual reality context. In an augmented reality or mixed reality context, display 790 may include a transparent OLED or an LCOS projector paired with a waveguide included in a near-eye optical element of an HMD, for example.
- illumination logic 770 is configured to control display 790 and drive images onto display 790 .
- Illumination logic 770 is also configured to receive eye-tracking data 793 generated by eye-tracking module 747 .
- illumination logic 770 is configured to selectively activate (turn on) individual or groups of infrared in-field illuminators in an array of infrared in-field illuminators in illumination layer 430 .
- Illumination logic 770 may selectively activate the infrared in-field illuminators based on the received eye-tracking data 793 .
- FIG. 7 shows that retina-reflected infrared light 749 may include a diverging wavefront 751 A, a converging wavefront 751 B, or a planar wavefront 751 C.
- the wavefront is directed to wavefront sensor 745 via combiner optical element 440 so that wavefront sensor 745 can capture a wavefront image 750 that may be provided to illumination logic 770 .
- while the optical paths associated with infrared illumination light 239 / 439 are not illustrated in FIG. 7 , the infrared illumination light generally follows the example optical paths illustrated in FIG. 4 .
- Example wavefront sensor 745 includes an image sensor 748 , a lenslet array 746 , and an optional focusing lens 735 .
- Wavefront sensor 745 may be arranged as a Shack-Hartmann wavefront sensor.
- Image sensor 748 may be included in a camera with additional focusing elements.
- Image sensor 748 may include a complementary metal-oxide semiconductor (CMOS) image sensor, for example.
- the camera may include an infrared filter configured to pass the wavelengths of the retina-reflected infrared light and reject other light wavelengths.
- the lenslet array 746 is disposed in an optical path between the combiner optical element 440 and image sensor 748 , in FIG. 7 .
- Lenslet array 746 may be positioned at a plane that is conjugate to a pupil plane 206 of eye 202 .
- additional optical elements (e.g. mirrors and/or lenses) may be disposed in the optical path between combiner optical element 440 and wavefront sensor 745 .
- Illumination logic 770 may be configured to adjust a virtual image presented to the eye 202 of a user in response to determining an accommodative eye state value based on a wavefront image 750 captured by wavefront sensor 745 . Since the accommodative state of the eye can be derived from wavefront image 750 , a user's refractive error can be measured and corrected for. Display images driven onto display 790 may be tailored to correct for the user's refractive error.
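The adjustment step can be made concrete with a small conversion sketch. The disclosure does not specify units for the accommodative eye state value; expressing it in diopters, as below, and the function name are assumptions. The diopter/distance relationship itself is standard optics (diopters = 1 / focus distance in meters).

```python
# Assumed convention (not from the disclosure): the accommodative eye
# state value is expressed in diopters.

def focus_distance_m(diopters):
    """Distance in meters at which the eye is focused for a given
    accommodative state in diopters; inf means relaxed/far focus."""
    if diopters <= 0:
        return float("inf")
    return 1.0 / diopters

print(focus_distance_m(4.0))  # 0.25 -> near focus (~25 cm, e.g. reading)
print(focus_distance_m(0.0))  # inf  -> focused near infinity
```

Illumination logic 770 could, for example, use such a distance to choose which virtual image plane or blur level to render, though the disclosure leaves the adjustment policy open.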
- FIG. 8 illustrates a flow chart for generating an accommodative eye state value, in accordance with aspects of the disclosure.
- the order in which some or all of the process blocks appear in process 800 should not be deemed limiting. Rather, one of ordinary skill in the art having the benefit of the present disclosure will understand that some of the process blocks may be executed in a variety of orders not illustrated, or even in parallel.
- Process 800 may be executed by illumination logic 770 , for example.
- an eye is illuminated by infrared illumination light (e.g. infrared illumination light 239 ) from an array of infrared in-field illuminators where the infrared illumination light from each infrared in-field illuminator is directed to a center of rotation of the eye.
- the infrared illumination light may be collimated or near-collimated.
- a wavefront image (e.g. 750 ) of retina-reflected infrared light (e.g. 649 ) is generated.
- the retina-reflected infrared light is the infrared illumination light (e.g. 639 ) reflected by a retina and exiting a pupil of the eye.
- generating the wavefront image includes receiving the retina-reflected infrared light with a wavefront sensor (e.g. 745 ) including an image sensor and a lenslet array.
- the lenslet array may be positioned in a plane that is conjugate to a pupil plane of the eye.
- an accommodative eye state value is determined based at least in part on the wavefront image.
- determining the accommodative eye state value includes analyzing a spacing of beam spots of the wavefront image generated by microlenses of the lenslet array focusing the retina-reflected infrared light onto the image sensor.
- process 800 further includes adjusting a virtual image presented to the eye by a head mounted display in response to determining the accommodative eye state value.
- FIG. 9 is a block diagram illustration of a lenslet array 947 focusing a planar wavefront of retina-reflected infrared light 649 onto an image sensor 949 as beam spots 948 , in accordance with aspects of the disclosure.
- lenslet array 947 includes a plurality of microlenses 947 A- 947 Y that focus corresponding beam spots 948 A- 948 Y.
- microlens 947 A focuses infrared light onto image sensor 949 as beam spot 948 A
- microlens 947 B focuses infrared light onto image sensor 949 as beam spot 948 B . . .
- microlens 947 Y focuses infrared light onto image sensor 949 as beam spot 948 Y.
- Microlens 947 M is the middle microlens in the example 5×5 array of microlenses in lenslet array 947 .
- FIG. 9 illustrates that when retina-reflected infrared light 649 is a planar wavefront (e.g. wavefront 751 C), each beam spot 948 is axially aligned with the optical axis of the corresponding microlens that focuses that particular beam spot 948 . Accordingly, the beam spots 948 in the example are evenly spaced. In other examples, beam spots 948 may not necessarily be evenly spaced for incoming planar wavefronts.
- FIG. 10 is a block diagram illustration of a lenslet array 947 focusing a converging wavefront of retina-reflected infrared light 649 onto an image sensor 949 as beam spots 1048 , in accordance with an embodiment of the disclosure.
- FIG. 10 illustrates that when retina-reflected infrared light 649 is a converging wavefront (e.g. wavefront 751 B), beam spots 1048 have converged toward middle beam spot 1048 M. Accordingly, when the beam spots 1048 are converging, a wavefront image that captures beam spots 1048 will indicate that the lens system of eye 202 is focusing at nearer distances. The closer the beam spots 1048 converge, the nearer the distance the eye 202 may be focusing to.
- FIG. 11 is a block diagram illustration of a lenslet array 947 focusing a diverging wavefront of retina-reflected infrared light 649 onto an image sensor 949 as beam spots 1148 , in accordance with an embodiment of the disclosure.
- FIG. 11 illustrates that when retina-reflected infrared light 649 is a diverging wavefront (e.g. wavefront 751 A), beam spots 1148 have diverged away from middle beam spot 1148 M. Accordingly, when the beam spots 1148 are diverging, a wavefront image that captures beam spots 1148 will indicate that eye 202 is focusing at a farther distance.
- wavefront 751 A may not be diverging but merely less convergent than wavefront 751 B, and the beam spots 1148 formed on the wavefront image are then not diverging, but rather converging less than beam spots 1048 of FIG. 10 .
- the lesser extent of the convergence of beam spots 1148 indicates that the eye 202 is focusing at a farther distance than indicated by the more condensed beam spots 1048 .
- a greater condensing of the beam spots from the respective microlenses represents a near-focused accommodative eye state value where the eye is focused at a near distance and a lesser condensing of the beam spots from the respective microlenses represents a far-focused accommodative eye state value where the eye is focused at a farther distance.
- FIGS. 9 - 11 illustrate how analysis of the positioning of the beam spots will indicate the diverging or converging nature of the wavefront of retina-reflected infrared light 649 as well as the magnitude of the divergence or convergence. Accordingly, a magnitude and nature of the accommodative state of the lens system of eye 202 may be determined from a wavefront image generated by wavefront sensor 745 by analyzing the spacing of the beam spots.
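The spacing analysis that FIGS. 9 - 11 describe can be sketched as follows. This is a minimal, assumed implementation (the function name and sign convention are not from the disclosure): beam-spot positions along one row of lenslets are compared with the spacing expected for a planar wavefront, giving a signed value whose sign distinguishes near focus (condensed spots) from far focus (spread spots).

```python
# Minimal sketch (assumed, not from the disclosure) of deriving an
# accommodative eye state value from Shack-Hartmann beam-spot spacing.
# `spot_xs` are beam-spot centroids in pixels along one lenslet row;
# `pitch_px` is the spot spacing expected for a planar wavefront
# (the calibration metric).

def accommodative_state(spot_xs, pitch_px):
    """Signed value: positive when spots have condensed toward the middle
    (converging wavefront -> near focus), negative when they have spread
    apart (diverging wavefront -> far focus), ~0 for a planar wavefront."""
    # Average spacing between adjacent spots along the row.
    spacings = [b - a for a, b in zip(spot_xs, spot_xs[1:])]
    mean_spacing = sum(spacings) / len(spacings)
    # Fractional compression of the spot grid relative to calibration.
    return (pitch_px - mean_spacing) / pitch_px

planar = [0.0, 10.0, 20.0, 30.0, 40.0]  # evenly spaced, as in FIG. 9
near   = [2.0, 11.0, 20.0, 29.0, 38.0]  # condensed toward middle, FIG. 10
far    = [-2.0, 9.0, 20.0, 31.0, 42.0]  # spread apart, FIG. 11

print(accommodative_state(planar, 10.0))  # 0.0
print(accommodative_state(near, 10.0))    # 0.1  (near-focused)
print(accommodative_state(far, 10.0))     # -0.1 (far-focused)
```

A production implementation would work in two dimensions and map the compression to diopters via calibration, but the sign logic is the same.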
- An algorithm to determine the accommodative eye state value of an eye may include detecting bright beam spots with sub-pixel resolution accuracy.
- the pupil of the eye may be segmented based on intensity thresholding or other computer vision or machine learning principles.
- distortion of any optics in the optical path between the optical combiner element and the wavefront sensor may be accounted for.
- the raw data from a wavefront image that includes an array of bright spots over a dark background may be converted to a wavefront map and compared to a calibration metric to determine an offset in a spherical curvature of an incoming wavefront, for example.
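The sub-pixel spot detection step mentioned above is commonly done with an intensity-weighted (center-of-mass) centroid; the sketch below assumes that approach, since the disclosure does not name a specific method.

```python
# Hedged sketch of sub-pixel spot detection: a first-moment
# (center-of-mass) centroid inside a window around each bright spot.
# The windowing strategy and names are assumptions.

def centroid(window):
    """Intensity-weighted centroid of a 2D pixel window.

    Returns (row, col) with sub-pixel resolution: although pixel
    coordinates are integers, the weighted mean lands between pixels.
    """
    total = 0.0
    row_sum = 0.0
    col_sum = 0.0
    for r, row in enumerate(window):
        for c, value in enumerate(row):
            total += value
            row_sum += r * value
            col_sum += c * value
    if total == 0:
        raise ValueError("empty window: no spot energy")
    return (row_sum / total, col_sum / total)

# A 3x3 window whose energy is biased toward the right column: the
# centroid column lands between pixel columns 1 and 2.
spot = [
    [0, 10, 20],
    [0, 40, 80],
    [0, 10, 20],
]
print(centroid(spot))  # row 1.0, col ~1.67
```

Thresholding the image first (consistent with the intensity-based pupil segmentation mentioned above) keeps background pixels from biasing the centroid.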
- Embodiments of the invention may include or be implemented in conjunction with an artificial reality system.
- Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof.
- Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content.
- the artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer).
- artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality.
- the artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
- illumination logic or “processing logic” in this disclosure may include one or more processors, microprocessors, multi-core processors, Application-specific integrated circuits (ASIC), and/or Field Programmable Gate Arrays (FPGAs) to execute operations disclosed herein.
- memories are integrated into the processing logic to store instructions to execute operations and/or store data.
- Processing logic may also include analog or digital circuitry to perform the operations in accordance with embodiments of the disclosure.
- a “memory” or “memories” described in this disclosure may include one or more volatile or non-volatile memory architectures.
- the “memory” or “memories” may be removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
- Example memory technologies may include RAM, ROM, EEPROM, flash memory, CD-ROM, digital versatile disks (DVD), high-definition multimedia/data storage disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
- a computing device may include a desktop computer, a laptop computer, a tablet, a phablet, a smartphone, a feature phone, a server computer, or otherwise.
- a server computer may be located remotely in a data center or be located locally.
- a tangible non-transitory machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.).
- a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
Abstract
Description
- This application is a continuation of U.S. Non-Provisional application Ser. No. 16/917,893 filed Jun. 30, 2020, which claims the benefit of U.S. Provisional Application No. 62/928,948 filed Oct. 31, 2019. U.S. Non-Provisional application Ser. No. 16/917,893, and U.S. Provisional Application No. 62/928,948 are expressly incorporated herein by reference in their entirety.
- In a variety of different optical contexts, the ability to measure or sense a light wavefront is useful. Head mounted displays (HMDs) present virtual images to users of the HMD. In some contexts, it is advantageous for the HMD to determine the location of the eye of the user and/or determine where the eyes of the user are focusing. However, conventional methods used in HMDs and other optical systems for determining where an eye is focusing can be inaccurate, especially across age demographics.
- Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
-
FIG. 1 illustrates an example HMD that may include infrared in-field illuminators and a combiner for redirecting retina-reflected infrared light to a wavefront sensor, in accordance with aspects of the disclosure. -
FIG. 2 is a top view of an example near-eye optical element that includes an illumination layer, a combiner layer, and a display layer, in accordance with aspects of the disclosure. -
FIG. 3 illustrates a front view of an eye through an example illumination layer, in accordance with aspects of the disclosure. -
FIG. 4 illustrates an example optical path of infrared illumination light and retina-reflected infrared light, in accordance with aspects of the disclosure. -
FIG. 5 illustrates an example infrared in-field illuminator including a light source and an example beam-forming element, in accordance with aspects of the disclosure. -
FIGS. 6A-6C illustrate an eye in different positions with respect to an array of infrared in-field illuminators and an example combiner, in accordance with aspects of the disclosure. -
FIG. 7 illustrates a wavefront imaging system that may be utilized in a near-eye optical system, in accordance with aspects of the disclosure. -
FIG. 8 illustrates a flow chart for generating an accommodative eye state value, in accordance with aspects of the disclosure. -
FIG. 9 is a block diagram illustration of a lenslet array focusing a planar wavefront of retina-reflected infrared light onto an image sensor as beam spots, in accordance with aspects of the disclosure. -
FIG. 10 is a block diagram illustration of a lenslet array focusing a converging wavefront of reflected infrared light onto an image sensor as beam spots, in accordance with aspects of the disclosure. -
FIG. 11 is a block diagram illustration of a lenslet array focusing a diverging wavefront of reflected infrared light onto an image sensor as beam spots, in accordance with aspects of the disclosure. - Embodiments of wavefront sensing with in-field illuminators are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
- Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
- Embodiments of an apparatus, system, and method for wavefront sensing described in this disclosure are capable of capturing a wavefront image of infrared light propagating through the lens of an eye. By determining the converging or diverging attributes of the wavefront, an accommodative state of the eye can be determined. Conventionally, Vergence-Accommodation Conflict (VAC) is used as a surrogate to approximate the accommodative state of the eye. For example, when two eyes are converged, the eyes are likely focused on a near-field object (e.g. a book held close), whereas two eyes that are looking straight ahead are likely focused near infinity (e.g. a mountain in the distance). However, VAC only approximates the accommodative state of the eye. Furthermore, the accommodative response of the eye varies over different age groups. For example, individuals under approximately age 45 may accommodate freely while older individuals may have a limited accommodation response. For these reasons, it would be advantageous to measure an accommodative state of the eye rather than approximating the accommodative state based on vergence.
- Embodiments of the disclosure provide a way to measure an accommodative state of the eye in real time or pseudo real-time. To determine the accommodative state of the eye, an infrared wavefront that has propagated through the lens of the eye is measured by a wavefront sensor. A wavefront image captured by the wavefront sensor is analyzed for divergence or convergence to determine the accommodative state of the eye, and a virtual image presented to the eye(s) may be adjusted based on the determined accommodative state of the eye. An array of infrared in-field illuminators or a photonic integrated circuit (PIC), for example, may illuminate the eye with infrared illumination light, and a combiner is utilized to redirect an infrared wavefront (that propagated through the eye lens and is exiting the pupil) to the wavefront sensor. The infrared in-field illuminators may be configured to emit infrared illumination light that is collimated or near-collimated to a center of rotation of an eye. These and other embodiments are described in more detail in connection with FIGS. 1-11 . -
FIG. 1 illustrates an example HMD 100, in accordance with aspects of the present disclosure. The illustrated example of HMD 100 is shown as including a frame 102, temple arms 104A and 104B, and near-eye optical elements 110A and 110B. Wavefront sensors 108A and 108B are shown as being positioned in temple arms 104A and 104B, respectively. FIG. 1 also illustrates an exploded view of an example of near-eye optical element 110A. Near-eye optical element 110A is shown as including an optically transparent layer 120A, an illumination layer 130A, an optical combiner layer 140A, and a display layer 150A. Display layer 150A may include a waveguide 158 that is configured to direct virtual images to an eye of a user of HMD 100. -
Illumination layer 130A is shown as including a plurality of in-field illuminators 126. In-field illuminators 126 are described as "in-field" because they are in a field of view (FOV) of a user of the HMD 100. In-field illuminators 126 may be in a same FOV that a user views a display of the HMD, in an embodiment. In-field illuminators 126 may also be in a same FOV that a user views an external environment of the HMD 100 via scene light 191 propagating through near-eye optical elements 110. While in-field illuminators 126 may introduce minor occlusions into the near-eye optical element 110A, the in-field illuminators 126, as well as their corresponding electrical routing, may be so small as to be unnoticeable or insignificant to a wearer of HMD 100. Additionally, any occlusion from in-field illuminators 126 will be placed so close to the eye as to be unfocusable by the human eye, which further helps keep the in-field illuminators 126 unnoticeable or insignificant. In some embodiments, each in-field illuminator 126 has a footprint (or size) that is less than about 200×200 microns. When HMD 100 is being worn by a user, the in-field illuminators 126 may be disposed between 10 mm and 30 mm from the eye. In some embodiments, the in-field illuminators 126 may be placed between 15 mm and 25 mm from the eye of a user. The in-field illuminators 126 may be infrared in-field illuminators 126 configured to emit infrared illumination light for eye-tracking purposes, for example. - In some embodiments (not illustrated), a photonic integrated circuit (PIC) may be implemented instead of in-field illuminators 126 to achieve a similar function as in-field illuminators 126. For example, outcoupling elements may be positioned similarly to the in-field illuminators 126, and the outcoupling elements may be provided infrared light by transparent waveguides. Light sources located at the edge of a frame of the HMD may provide the infrared light into the transparent waveguides, for example. The outcoupling elements then redirect the infrared light provided by the waveguides to illuminate an eyeward region. The outcoupling elements may have diffractive or refractive features to facilitate beam-shaping of the infrared light received from the waveguides. Other techniques (not necessarily considered to be PICs) may also be implemented to achieve a similar illumination function as described with respect to in-field illuminators 126. In a VR HMD context, wavefront sensor(s) 108 of this disclosure may also be disposed in numerous places in the VR HMD besides a temple position, as illustrated in FIG. 1 . - As shown in
FIG. 1 , frame 102 is coupled to temple arms 104A and 104B for securing the HMD 100 to the head of a user. Example HMD 100 may also include supporting hardware incorporated into the frame 102 and/or temple arms 104A and 104B. The hardware of HMD 100 may include any of processing logic, wired and/or wireless data interfaces for sending and receiving data, graphic processors, and one or more memories for storing data and computer-executable instructions. In one example, HMD 100 may be configured to receive wired power and/or may be configured to be powered by one or more batteries. In addition, HMD 100 may be configured to receive wired and/or wireless data including video data. -
FIG. 1 illustrates near-eye optical elements 110A and 110B that are configured to be mounted to the frame 102. In some examples, near-eye optical elements 110A and 110B may appear transparent to the user to facilitate augmented reality or mixed reality, such that the user can view scene light from the environment while also receiving display light directed to the eye(s) of the user by way of display layer 150A. In further examples, some or all of near-eye optical elements 110A and 110B may be incorporated into a virtual reality headset, where the transparent nature of the near-eye optical elements allows the user to view an electronic display of the headset. - As shown in
FIG. 1 , illumination layer 130A includes a plurality of in-field illuminators 126. Each in-field illuminator 126 may be disposed on a transparent substrate and may be configured to emit light towards an eyeward side 109 of the near-eye optical element 110A. In some aspects of the disclosure, the in-field illuminators 126 are configured to emit near infrared light (e.g. 750 nm-1.5 μm). Each in-field illuminator 126 may be a micro light emitting diode (micro-LED), an edge emitting LED, a vertical cavity surface emitting laser (VCSEL) diode, or a superluminescent diode (SLED). - As mentioned above, the in-field illuminators 126 of the illumination layer 130A may be configured to emit infrared illumination light towards the eyeward side 109 of the near-eye optical element 110A to illuminate the eye of a user. The near-eye optical element 110A is shown as including optical combiner layer 140A, where the optical combiner layer 140A is disposed between the illumination layer 130A and a backside 111 of the near-eye optical element 110A. In some aspects, the optical combiner 140A is configured to receive retina-reflected infrared light that is reflected by the retina of the eye of the user and to direct the retina-reflected infrared light towards the wavefront sensor 108A. The wavefront sensor(s) 108 may be located in different positions than the positions illustrated. In some aspects, the optical combiner 140A is transmissive to visible light, such as scene light 191 incident on the backside 111 of the near-eye optical element 110A. In some examples, the optical combiner 140A may be configured as a volume hologram and/or may include one or more Bragg gratings for directing the retina-reflected infrared light towards the wavefront sensor 108A. In some examples, the optical combiner 140A includes a polarization-selective volume hologram (a.k.a. polarized volume hologram) that diffracts (in reflection) a particular polarization orientation of incident light while passing other polarization orientations. -
Display layer 150A may include one or more other optical elements depending on the design of the HMD 100. For example, display layer 150A may include a waveguide 158 to direct display light generated by an electronic display to the eye of the user. In some implementations, at least a portion of the electronic display is included in the frame 102 of the HMD 100. The electronic display may include an LCD, an organic light emitting diode (OLED) display, micro-LED display, pico-projector, or liquid crystal on silicon (LCOS) display for generating the display light. - Optically
transparent layer 120A is shown as being disposed between the illumination layer 130A and the eyeward side 109 of the near-eye optical element 110A. The optically transparent layer 120A may receive the infrared illumination light emitted by the illumination layer 130A and pass the infrared illumination light to illuminate the eye of the user. As mentioned above, the optically transparent layer 120A may also be transparent to visible light, such as scene light 191 received from the environment and/or display light received from the display layer 150A. In some examples, the optically transparent layer 120A has a curvature for focusing light (e.g., display light and/or scene light) to the eye of the user. Thus, the optically transparent layer 120A may, in some examples, be referred to as a lens. In some aspects, the optically transparent layer 120A has a thickness and/or curvature that corresponds to the specifications of a user. In other words, the optically transparent layer 120A may be a prescription lens. However, in other examples, the optically transparent layer 120A may be a non-prescription lens. - While
FIG. 1 illustrates an HMD 100 configured for augmented reality (AR) or mixed reality (MR) contexts, the disclosed embodiments may also be used in other implementations of an HMD. For example, the illumination layers of this disclosure may be disposed close to a display plane of a display of a virtual reality (VR) HMD, or prior to a focusing lens of a VR HMD where the focusing lens is disposed between the illumination layer and the display and the focusing lens focuses display light from the display for an eye of a wearer of the VR HMD. -
FIG. 2 is a top view of an example near-eye optical element 210 that includes an illumination layer 230, a combiner layer 240, and a display layer 250. A transparent layer (not illustrated) may optionally be included between illumination layer 230 and eye 202, in some embodiments. A plurality of infrared in-field illuminators 237 emit infrared illumination light 239 to an eyebox area to illuminate eye 202. Plane 206 illustrates a two-dimensional pupil plane 206 in the eyebox area that is normal to the curvature of the eye 202 at the center of pupil 203. FIG. 2 illustrates an example array of infrared in-field illuminators 237A-237E. Each infrared in-field illuminator 237 in the array is configured to emit infrared illumination light 239 to a center of rotation 241 of eye 202. The different infrared in-field illuminators 237 may direct infrared illumination light 239 to the center of rotation of eye 202 at different angles depending on the position of the infrared in-field illuminator with respect to eye 202. For example, outer infrared in-field illuminators such as 237A and 237E may direct infrared illumination light 239 to eye 202 at larger angles than infrared in-field illuminator 237C, which directs infrared illumination light 239 to eye 202 at an angle closer to normal. The center of rotation 241 of eye 202 remains at a substantially same position with respect to illuminators 237 even over a large range of gaze angles of eye 202. -
infrared illumination light 239 may be narrow-band infrared illumination light (e.g. linewidth of 1-10 nm). The infrared illumination light 239 may be collimated or near-collimated so that at least a portion of the infrared illumination light 239 will propagate through pupil 203 of eye 202, reflect off of retina 208, and exit eye 202 through pupil 203 as retina-reflected infrared light. As will be described in greater detail below, the retina-reflected infrared light may be received by combiner optical element 240 and redirected to wavefront sensor 108A to generate a wavefront image. As described above, alternative illumination layer implementations that utilize outcoupling elements, waveguides, and/or planar waveguides that achieve a similar function as infrared in-field illuminators 237 may also be utilized to generate infrared illumination light 239 that is collimated or near-collimated. -
Wavefront sensor 108A is configured to capture wavefront images that may be utilized to determine an accommodative eye state value of eye 202, for example. Wavefront sensor 108A may include an infrared bandpass filter to pass the wavelength of the infrared illumination light 239 emitted by the infrared illuminators and block other light from becoming incident on an image sensor of wavefront sensor 108A, in some embodiments. -
FIG. 2 shows that scene light 191 (visible light) from the external environment may propagate through display layer 250, combiner layer 240, and illumination layer 230 to become incident on eye 202 so that a user can view the scene of an external environment. FIG. 2 also shows that display layer 250 may generate or redirect display light 293 to present virtual images to eye 202. Display light 293 is visible light and propagates through combiner layer 240 and illumination layer 230 to reach eye 202. -
Illumination layer 230 may include a transparent substrate that the infrared in-field illuminators 237 are disposed on. The infrared in-field illuminators 237 may also be encapsulated in a transparent material 232. Transparent material 232 is configured to transmit visible light (e.g. 400 nm-750 nm) and near-infrared light (e.g. 750 nm-1.5 μm). -
FIG. 3 illustrates a front view of eye 202 through an example illumination layer 330, in accordance with aspects of the disclosure. In the illustrated embodiment, illumination layer 330 includes twenty-one infrared in-field illuminators (337A-337U). In the illustrated example, infrared illuminators 337A-337H may be considered an "inner ring" of infrared in-field illuminators 337 while infrared illuminators 337I-337U are considered an "outer ring" of infrared in-field illuminators 337. As such, infrared illuminators 337I-337U may direct their infrared illumination light to eye 202 at a steeper angle than infrared illuminators 337A-337H in the inner ring. An illumination angle of the infrared illumination light 239 from different in-field infrared illuminators 337 may increase as the distance of a particular infrared in-field illuminator 337 increases from middle region 231 of the array of infrared in-field illuminators 337. -
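The relationship above, in which the illumination angle grows with an illuminator's distance from the middle region of the array, follows directly from aiming every beam at the center of rotation of the eye. A minimal numerical sketch of that geometry (the distances below are illustrative assumptions, not values from the disclosure):

```python
import math

def aim_angle_deg(radial_offset_mm, layer_to_center_mm):
    """Emission angle, measured from the illumination-layer normal, at which
    an in-field illuminator located radial_offset_mm from the middle of the
    array must emit so its beam passes through the eye's center of rotation,
    assumed to lie layer_to_center_mm behind the layer. Both distances are
    hypothetical values for illustration."""
    return math.degrees(math.atan2(radial_offset_mm, layer_to_center_mm))

# With an assumed 38 mm layer-to-center distance, an inner-ring illuminator
# 5 mm from the middle aims at a shallower angle than an outer-ring
# illuminator 15 mm out.
inner_ring = aim_angle_deg(5.0, 38.0)
outer_ring = aim_angle_deg(15.0, 38.0)
```

Outer-ring illuminators therefore emit at steeper angles than inner-ring illuminators, consistent with the inner-ring/outer-ring description above.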
FIG. 4 illustrates an example optical path of infrared illumination light 439 and retina-reflected infrared light 449, in accordance with aspects of the disclosure. In FIG. 4, an array of infrared in-field illuminators 437 emit infrared illumination light 439 to a center of rotation of eye 202. Only the infrared illumination light from infrared in-field illuminator 437B is shown in FIG. 4, for illustration and description of the optical path of the infrared illumination light. Portions of infrared illumination light 439 (not illustrated) may not necessarily propagate through the pupil and may be scattered by the iris or cornea. However, at least a portion of infrared illumination light 439 propagates substantially normal to pupil plane 206 of eye 202 and propagates through the cornea 201, anterior chamber, pupil 209, and lens 204 of eye 202 before becoming incident upon the retina 208. A portion (e.g. ~10% for 850 nm light) of infrared illumination light 439 reflects off the retina 208 as retina-reflected infrared light 449. The portion of infrared illumination light 439 that propagates through pupil 209 normal to (or at least substantially normal to) pupil plane 206 is the light that can be reflected back out of pupil 209 after reflecting off of retina 208, rather than being absorbed by the interior of eye 202. In FIG. 4, retina-reflected infrared light 449 propagates through lens 204, pupil 209, and cornea 201 to exit eye 202. Retina-reflected infrared light 449 then propagates through illumination layer 430 and encounters combiner optical element 440. - Combiner
optical element 440 receives retina-reflected infrared light 449 and redirects the retina-reflected infrared light 449 to a wavefront sensor (e.g. wavefront sensor 108). Combiner optical element 440 may include a polarization-selective volume hologram that reflects a first polarization orientation (e.g. right-hand circularly polarized light) of the retina-reflected infrared light and passes polarization orientations other than the first polarization orientation. Combiner optical element 440 may also include a folding mirror, hologram, or linear diffractive grating to redirect retina-reflected infrared light 449, in some embodiments. The combiner optical element 440 passes visible light. -
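Because roughly 10% of 850 nm light reflects off the retina, only a small fraction of the emitted infrared power ultimately returns through the pupil to the combiner optical element. A rough, illustrative signal budget (every factor value below is an assumption made for the sketch, not a figure from the disclosure):

```python
def retina_return_fraction(pupil_coupling, retina_reflectance,
                           double_pass_transmission):
    """Fraction of emitted infrared power that exits the eye as
    retina-reflected light: the portion of the emitted beam that enters
    the pupil approximately normal to the pupil plane, times the retinal
    reflectance (~0.10 at 850 nm per the description), times the
    round-trip transmission of the ocular media. All inputs are
    illustrative assumptions."""
    return pupil_coupling * retina_reflectance * double_pass_transmission

# Assumed 20% pupil coupling, 10% retinal reflectance, and 0.9
# transmission per pass (0.81 round trip) leave roughly 1.6% of the
# emitted power as a usable retina-reflected signal.
returned = retina_return_fraction(0.20, 0.10, 0.9 * 0.9)
```

A budget like this is one way to motivate the narrow-band infrared filtering at the wavefront sensor: the returned signal is weak relative to ambient light.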
FIG. 5 illustrates an example infrared in-field illuminator 537 that may be utilized as infrared illuminators 126/237/337/447, in accordance with aspects of the disclosure. The example infrared in-field illuminator 537 illustrated in FIG. 5 includes an infrared light source 531 having an output aperture 536 and a beam-forming element 535 disposed over output aperture 536. Beam-forming element 535 is configured to direct the infrared illumination light 539 to a center of rotation of an eye. In the illustrated embodiment of FIG. 5, beam-forming element 535 includes a refractive material 538, and a lens curvature 534 may be formed of the refractive material 538 of the beam-forming element 535. The lens curvature 534 may assist in directing the infrared illumination light 539 to a center of rotation of the eye. The beam-forming elements of the infrared light sources may be configured to increase an illumination angle of the infrared illumination light 539 as the distance of a particular beam-forming element increases from a middle region (e.g. 231) of the array of infrared in-field illuminators, so that the infrared illumination light 539 from each infrared in-field illuminator 537 is directed to a center of rotation of the eye. -
Substrate 532 is a transparent material. Refractive material 538 of beam-forming element 535 may be a high-index material having a refractive index of greater than three. In some embodiments, the illustrated refractive beam-forming element 535 is replaced by, or includes, a diffractive optical element configured to direct the infrared illumination light 539 to the eye. In some embodiments, beam-forming element 535 is approximately 30 microns wide. -
FIGS. 6A-6C illustrate eye 202 in different positions with respect to an array of infrared in-field illuminators 437 and an example combiner optical element, in accordance with aspects of the disclosure. In FIG. 6A, an infrared in-field illuminator emits infrared illumination light 239 to a center of rotation of eye 202. Infrared illumination light 239 may be collimated light. Other infrared in-field illuminators 437 in the array may also emit infrared illumination light 239 to a center of rotation of eye 202. At least a portion of the infrared illumination light 239 propagates through the pupil of eye 202, reflects off of retina 208, and propagates back through (exiting) the pupil as retina-reflected infrared light. -
FIG. 6B illustrates infrared in-field illuminators directing infrared illumination light 239 to a center of rotation of eye 202 when eye 202 has changed a gaze angle of the eye. The eye 202 illustrated in FIG. 6B may be gazing up or gazing to the left, for example. -
FIG. 6C illustrates infrared in-field illuminators directing infrared illumination light 239 to a center of rotation of eye 202 when eye 202 is positioned at yet another gaze angle. The eye 202 illustrated in FIG. 6C may be gazing down or gazing to the right, for example. - Notably,
FIGS. 6A-6C illustrate that even when the gaze angle and/or position of eye 202 changes, different infrared in-field illuminators are still able to direct infrared illumination light substantially normal to pupil plane 206 and therefore have the infrared illumination light 239 propagate through the pupil, reflect off of retina 208, and propagate back through the pupil (as retina-reflected infrared light 449, not illustrated) to combiner optical element 440 to be redirected to a wavefront sensor. In other words, the infrared in-field illuminators 437 in the array are spaced apart so that at least a portion of the infrared in-field illuminators 437 will be positioned to illuminate a retina of the eye, through a pupil of the eye, with infrared illumination light propagating approximately normal to a pupil plane of the eye, over a range of eye positions. The range of eye positions may include the maximum eye position range of which humans are capable. - In some embodiments, the infrared in-field illuminators 437 in the array are selectively illuminated based on where a given infrared in-field illuminator 437 (or group of infrared in-field illuminators 437) is positioned. The infrared in-field illuminators 437 selected are positioned to illuminate the
eye 202 with infrared illumination light 239 that will propagate through the pupil at an angle substantially normal to pupil plane 206 so that the combiner optical element 440 can receive a usable signal of retina-reflected infrared light 449 to direct to the wavefront sensor. In some embodiments, the infrared in-field illuminators 437 are selectively activated (turned on) based on eye-tracking data collected by a separate eye-tracking system of an HMD. For example, if the eye-tracking system determines that eye 202 is looking up, infrared in-field illuminators positioned to illuminate eye 202 with infrared illumination light 239 that will be reflected off retina 208, back through the pupil to combiner optical element 440, may be activated. Or, if the eye-tracking system determines that eye 202 is looking down, a different group of infrared in-field illuminators may be activated to illuminate eye 202 with infrared illumination light 239 that will be reflected off retina 208, back through the pupil to combiner optical element 440. -
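The selective-activation logic described above can be sketched as a small geometric test: activate only those illuminators whose beams, aimed at the center of rotation, are nearly parallel to the current pupil axis reported by the eye tracker. The layout, distances, and angular tolerance below are illustrative assumptions, not values from the disclosure:

```python
import math

def select_illuminators(positions_mm, gaze_yaw_deg, gaze_pitch_deg,
                        layer_to_center_mm=38.0, tolerance_deg=5.0):
    """Return the names of in-field illuminators whose infrared beams
    would propagate approximately normal to the pupil plane for the given
    gaze. positions_mm maps an illuminator name to its (x, y) offset from
    the middle of the array; geometry values are hypothetical."""
    # Point where the gaze direction, extended from the center of
    # rotation, crosses the illumination layer.
    px = layer_to_center_mm * math.tan(math.radians(gaze_yaw_deg))
    py = layer_to_center_mm * math.tan(math.radians(gaze_pitch_deg))
    selected = []
    for name, (x, y) in positions_mm.items():
        # Approximate angle between this illuminator's beam (aimed at the
        # center of rotation) and the pupil axis.
        offset_deg = math.degrees(math.atan2(
            math.hypot(x - px, y - py), layer_to_center_mm))
        if offset_deg <= tolerance_deg:
            selected.append(name)
    return selected

# Eye gazing 20 degrees up: an illuminator near the top of the array is
# selected, while one in the middle of the array is not.
layout = {"top": (0.0, 14.0), "middle": (0.0, 0.0)}
active = select_illuminators(layout, gaze_yaw_deg=0.0, gaze_pitch_deg=20.0)
```

With a straight-ahead gaze the same test instead picks the middle illuminator, mirroring the looking-up/looking-down example above.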
FIG. 7 illustrates a wavefront imaging system 700 that may be utilized in an HMD or as a near-eye optical system, in accordance with aspects of the disclosure. Wavefront imaging system 700 includes an eye-tracking module 747 for determining a position of eye 202. In some embodiments, eye-tracking module 747 includes a camera configured to capture infrared images of eye 202. Eye-tracking module 747 generates eye-tracking data 793 that may include a position of eye 202. For example, eye 202 may change gaze angles in any combination of up, down, right, and left, and eye-tracking module 747 may provide those gaze angles or eye position in eye-tracking data 793 by analyzing images of eye 202. -
Display 790 generates visible display light 799 for presenting a virtual image to a user of an HMD. Visible display light 799 may propagate through a near-eye optical element that includes illumination layer 430 and combiner optical element 440 with very little (if any) optical loss, since the materials in the near-eye optical element are configured to pass visible light and combiner 440 may be configured to diffract a particular bandwidth of infrared light emitted by infrared in-field illuminators. Display 790 may include an OLED, micro-LED, or LCD in a virtual reality context. In an augmented reality or mixed reality context, display 790 may include a transparent OLED or an LCOS projector paired with a waveguide included in a near-eye optical element of an HMD, for example. - In
FIG. 7, illumination logic 770 is configured to control display 790 and drive images onto display 790. Illumination logic 770 is also configured to receive eye-tracking data 793 generated by eye-tracking module 747. Optionally, illumination logic 770 is configured to selectively activate (turn on) individual or groups of infrared in-field illuminators in an array of infrared in-field illuminators in illumination layer 430. Illumination logic 770 may selectively activate the infrared in-field illuminators based on the received eye-tracking data 793. -
FIG. 7 shows that retina-reflected infrared light 749 may include a diverging wavefront 751A, a converging wavefront 751B, or a planar wavefront 751C. The wavefront is directed to wavefront sensor 745 via combiner optical element 440 so that wavefront sensor 745 can capture a wavefront image 750 that may be provided to illumination logic 770. Although the optical paths associated with infrared illumination light 239/439 are not illustrated in FIG. 7, the infrared illumination light generally follows the example optical paths illustrated in FIG. 4. -
Example wavefront sensor 745 includes an image sensor 748, a lenslet array 746, and an optional focusing lens 735. Wavefront sensor 745 may be arranged as a Shack-Hartmann wavefront sensor. Image sensor 748 may be included in a camera with additional focusing elements. Image sensor 748 may include a complementary metal-oxide semiconductor (CMOS) image sensor, for example. As described previously, the camera may include an infrared filter configured to pass the wavelengths of the retina-reflected infrared light and reject other light wavelengths. The lenslet array 746 is disposed in an optical path between the combiner optical element 440 and image sensor 748, in FIG. 7. Lenslet array 746 may be positioned at a plane that is conjugate to a pupil plane 206 of eye 202. Although not illustrated, additional optical elements (e.g. mirrors and/or lenses) may be included to properly focus the retina-reflected infrared light 749 to wavefront sensor 745, in different arrangements. -
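In a Shack-Hartmann arrangement such as this, each microlens converts the local slope of the incoming wavefront into a lateral displacement of its beam spot on the image sensor, approximately displacement = focal length × local slope. A minimal sketch of that relation (the focal length and slope values are illustrative assumptions):

```python
def spot_displacement_um(local_slope_rad, lenslet_focal_mm):
    """Lateral shift of a beam spot from its microlens's optical axis,
    using the small-angle Shack-Hartmann relation dx = f * slope.
    Inputs are illustrative, not values from the disclosure."""
    return lenslet_focal_mm * 1000.0 * local_slope_rad

# A planar wavefront (zero local slope) lands each spot on-axis; a patch
# tilted by 1 milliradian shifts its spot about 5 micrometers for an
# assumed 5 mm lenslet focal length.
planar_shift = spot_displacement_um(0.0, 5.0)
tilted_shift = spot_displacement_um(1.0e-3, 5.0)
```

Mapping each spot's displacement back through this relation yields the per-lenslet slope samples from which a wavefront can be reconstructed.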
Illumination logic 770 may be configured to adjust a virtual image presented to the eye 202 of a user in response to determining an accommodative eye state value based on a wavefront image 750 captured by wavefront sensor 745. Since the accommodative state of the eye can be derived from wavefront image 750, a user's refractive error can be measured and corrected for. Display images driven onto display 790 may be tailored to correct for the user's refractive error. -
FIG. 8 illustrates a flow chart for generating an accommodative eye state value, in accordance with aspects of the disclosure. The order in which some or all of the process blocks appear in process 800 should not be deemed limiting. Rather, one of ordinary skill in the art having the benefit of the present disclosure will understand that some of the process blocks may be executed in a variety of orders not illustrated, or even in parallel. Process 800 may be executed by illumination logic 770, for example. - In process block 805, an eye is illuminated by infrared illumination light (e.g. infrared illumination light 239) from an array of infrared in-field illuminators, where the infrared illumination light from each infrared in-field illuminator is directed to a center of rotation of the eye. The infrared illumination light may be collimated or near-collimated.
- In
process block 810, a wavefront image (e.g. 750) of retina-reflected infrared light (e.g. 649) is generated. The retina-reflected infrared light is the infrared illumination light (e.g. 639) reflected by a retina and exiting a pupil of the eye. In some embodiments, generating the wavefront image includes receiving the retina-reflected infrared light with a wavefront sensor (e.g. 745) including an image sensor and a lenslet array. The lenslet array may be positioned in a plane that is conjugate to a pupil plane of the eye. - In
process block 815, an accommodative eye state value is determined based at least in part on the wavefront image. In some embodiments, determining the accommodative eye state value includes analyzing a spacing of beam spots of the wavefront image generated by microlenses of the lenslet array focusing the retina-reflected infrared light onto the image sensor. - In an embodiment,
process 800 further includes adjusting a virtual image presented to the eye by a head mounted display in response to determining the accommodative eye state value. -
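The spot-spacing analysis of process block 815 can be sketched by noting that, for a spherical wavefront of vergence V, the local slope at radius r from the axis is approximately r × V, so the Shack-Hartmann spot pitch scales by (1 + f × V) relative to the lenslet pitch. A minimal sketch with illustrative pitch and focal-length values (the sign convention, negative for a converging wavefront, is also an assumption):

```python
def wavefront_vergence_diopters(mean_spot_pitch_um, lenslet_pitch_um,
                                lenslet_focal_mm):
    """Estimate the spherical vergence of the incoming wavefront from the
    average beam-spot pitch: each spot shifts by f * r * V, so the pitch
    scales by (1 + f*V). Negative vergence (spots condensed toward the
    middle) corresponds to a converging wavefront, i.e. an eye focused at
    a nearer distance. All numeric inputs are illustrative."""
    f_m = lenslet_focal_mm / 1000.0
    return (mean_spot_pitch_um / lenslet_pitch_um - 1.0) / f_m

# Assumed 150 um lenslets with 5 mm focal length: equidistant spots read
# as a planar wavefront, while spots condensed to a 148.5 um mean pitch
# read as a converging wavefront of about -2 diopters (nearer focus).
planar = wavefront_vergence_diopters(150.0, 150.0, 5.0)
near_focus = wavefront_vergence_diopters(148.5, 150.0, 5.0)
```

The vergence estimate, or a full wavefront fit, can then serve as the accommodative eye state value determined in process block 815.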
FIG. 9 illustrates a block diagram of a lenslet array 947 focusing a planar wavefront of retina-reflected infrared light 649 onto an image sensor 949 as beam spots 948, in accordance with aspects of the disclosure. In the illustrated block diagram example, lenslet array 947 includes a plurality of microlenses 947A-947Y that focus corresponding beam spots 948A-948Y. For example, microlens 947A focuses infrared light onto image sensor 949 as beam spot 948A, microlens 947B focuses infrared light onto image sensor 949 as beam spot 948B . . . and microlens 947Y focuses infrared light onto image sensor 949 as beam spot 948Y. Microlens 947M is the middle microlens in the example 5×5 array of microlenses in lenslet array 947. FIG. 9 illustrates that when retina-reflected infrared light 649 is a planar wavefront (e.g. wavefront 751C), each beam spot 948 is axially aligned with an optical axis of its corresponding microlens that focuses that particular beam spot 948. Accordingly, each beam spot 948 in the example is equidistant. In other examples, beam spots 948 may not necessarily be equidistant for incoming planar wavefronts. -
FIG. 10 illustrates a block diagram of a lenslet array 947 focusing a converging wavefront of retina-reflected infrared light 649 onto an image sensor 949 as beam spots 1048, in accordance with an embodiment of the disclosure. FIG. 10 illustrates that when retina-reflected infrared light 649 is a converging wavefront (e.g. wavefront 751B), beam spots 1048 have converged toward middle beam spot 1048M. Accordingly, when the beam spots 1048 are converging, a wavefront image that captures beam spots 1048 will indicate that the lens system of eye 202 is focusing at nearer distances. The closer the beam spots 1048 converge, the nearer the distance to which the eye 202 may be focusing. -
FIG. 11 illustrates a block diagram of a lenslet array 947 focusing a diverging wavefront of retina-reflected infrared light 649 onto an image sensor 949 as beam spots 1148, in accordance with an embodiment of the disclosure. FIG. 11 illustrates that when retina-reflected infrared light 649 is a diverging wavefront (e.g. wavefront 751A), beam spots 1148 have diverged away from middle beam spot 1148M. Accordingly, when the beam spots 1148 are diverging, a wavefront image that captures beam spots 1148 will indicate that eye 202 is focusing at a farther distance. In some cases, wavefront 751A is not diverging but merely less divergent than wavefront 751C, and the beam spots 1148 formed on the wavefront image are also not diverging, but rather converging less than beam spots 1048 of FIG. 10. In this case, the lesser extent of the convergence of beam spots 1148 (compared with the convergence of beam spots 1048) indicates that the eye 202 is focusing at a farther distance than the more condensed beam spots 1048 would indicate. Consequently, a greater condensing of the beam spots from the respective microlenses represents a near-focused accommodative eye state value where the eye is focused at a near distance, and a lesser condensing of the beam spots from the respective microlenses represents a far-focused accommodative eye state value where the eye is focused at a farther distance. - Although
lenslet array 947 or 746 may not be configured exactly as illustrated in FIGS. 9-11 in all implementations, FIGS. 9-11 illustrate how analysis of the positioning of the beam spots will indicate the diverging or converging nature of the wavefront of retina-reflected infrared light 649 as well as the magnitude of the divergence or convergence. Accordingly, a magnitude and nature of the accommodative state of the lens system of eye 202 may be determined from a wavefront image generated by wavefront sensor 745 by analyzing the spacing of the beam spots. - An algorithm to determine the accommodative eye state value of an eye may include detecting bright beam spots with sub-pixel resolution accuracy. The pupil of the eye may be segmented based on intensity thresholding or other computer vision or machine learning principles. Of course, distortion of any optics in the optical path between the optical combiner element and the wavefront sensor may be accounted for. The raw data from a wavefront image that includes an array of bright spots over a dark background may be converted to a wavefront map and compared to a calibration metric to determine an offset in a spherical curvature of an incoming wavefront, for example.
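The spot-detection step described above, locating bright beam spots with sub-pixel accuracy over a dark background, is commonly implemented with intensity-weighted centroids after thresholding. A minimal sketch of that step (the plain-list image representation and 4-connected region growing are implementation choices for illustration, not details from the disclosure):

```python
def subpixel_centroids(image, threshold):
    """Find bright beam spots as intensity-weighted centroids.

    image: 2D list of pixel intensities (rows of equal length).
    threshold: intensity above which a pixel is treated as part of a
    spot, segmenting spots from the dark background.
    Returns a list of (x, y) centroids with sub-pixel resolution.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    spots = []
    for y0 in range(h):
        for x0 in range(w):
            if image[y0][x0] <= threshold or seen[y0][x0]:
                continue
            # Grow the 4-connected bright region belonging to one spot.
            stack, region = [(y0, x0)], []
            seen[y0][x0] = True
            while stack:
                y, x = stack.pop()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and not seen[ny][nx]
                            and image[ny][nx] > threshold):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            # Intensity-weighted centroid of the region (sub-pixel).
            total = sum(image[y][x] for y, x in region)
            cx = sum(x * image[y][x] for y, x in region) / total
            cy = sum(y * image[y][x] for y, x in region) / total
            spots.append((cx, cy))
    return spots
```

From the resulting centroid list, spot displacements relative to a calibration map of lenslet axes can be converted into the wavefront map described above.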
- Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
- The term “illumination logic” or “processing logic” in this disclosure may include one or more processors, microprocessors, multi-core processors, Application-specific integrated circuits (ASIC), and/or Field Programmable Gate Arrays (FPGAs) to execute operations disclosed herein. In some embodiments, memories (not illustrated) are integrated into the processing logic to store instructions to execute operations and/or store data. Processing logic may also include analog or digital circuitry to perform the operations in accordance with embodiments of the disclosure.
- A “memory” or “memories” described in this disclosure may include one or more volatile or non-volatile memory architectures. The “memory” or “memories” may be removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Example memory technologies may include RAM, ROM, EEPROM, flash memory, CD-ROM, digital versatile disks (DVD), high-definition multimedia/data storage disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
- A computing device may include a desktop computer, a laptop computer, a tablet, a phablet, a smartphone, a feature phone, a server computer, or otherwise. A server computer may be located remotely in a data center or be stored locally.
- The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a tangible or non-transitory machine (e.g., computer) readable storage medium, that when executed by a machine will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or otherwise.
- A tangible non-transitory machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
- The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
- These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/071,643 US20230087535A1 (en) | 2019-10-31 | 2022-11-30 | Wavefront sensing from retina-reflected light |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962928948P | 2019-10-31 | 2019-10-31 | |
US16/917,893 US11561405B1 (en) | 2019-10-31 | 2020-06-30 | Wavefront sensing with in-field illuminators |
US18/071,643 US20230087535A1 (en) | 2019-10-31 | 2022-11-30 | Wavefront sensing from retina-reflected light |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/917,893 Continuation US11561405B1 (en) | 2019-10-31 | 2020-06-30 | Wavefront sensing with in-field illuminators |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230087535A1 true US20230087535A1 (en) | 2023-03-23 |
Family
ID=84978038
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/917,893 Active US11561405B1 (en) | 2019-10-31 | 2020-06-30 | Wavefront sensing with in-field illuminators |
US18/071,643 Pending US20230087535A1 (en) | 2019-10-31 | 2022-11-30 | Wavefront sensing from retina-reflected light |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/917,893 Active US11561405B1 (en) | 2019-10-31 | 2020-06-30 | Wavefront sensing with in-field illuminators |
Country Status (1)
Country | Link |
---|---|
US (2) | US11561405B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11561405B1 (en) * | 2019-10-31 | 2023-01-24 | Meta Platforms Technologies, Llc | Wavefront sensing with in-field illuminators |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040156015A1 (en) * | 2003-02-10 | 2004-08-12 | Visx, Inc. | Eye refractor with active mirror wavefront sensor |
US20070216867A1 (en) * | 2006-03-14 | 2007-09-20 | Visx, Incorporated | Shack-Hartmann based integrated autorefraction and wavefront measurements of the eye |
US20130176536A1 (en) * | 2012-01-10 | 2013-07-11 | Digitalvision, Llc | Intra-ocular lens optimizer |
US20130286053A1 (en) * | 2012-04-25 | 2013-10-31 | Rod G. Fleck | Direct view augmented reality eyeglass-type display |
US20140313484A1 (en) * | 2013-03-15 | 2014-10-23 | Amo Groningen B.V. | Wavefront generation for ophthalmic applications |
US20160026253A1 (en) * | 2014-03-11 | 2016-01-28 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US20160270656A1 (en) * | 2015-03-16 | 2016-09-22 | Magic Leap, Inc. | Methods and systems for diagnosing and treating health ailments |
US20170039904A1 (en) * | 2015-08-03 | 2017-02-09 | Oculus Vr, Llc | Tile Array for Near-Ocular Display |
US9798147B1 (en) * | 2015-05-28 | 2017-10-24 | Verily Life Sciences Llc | Near-eye display with phase map |
US20180150709A1 (en) * | 2016-11-30 | 2018-05-31 | Samsung Electronics Co., Ltd. | Electronic device and method for displaying image for iris recognition in electronic device |
US20200371370A1 (en) * | 2019-05-20 | 2020-11-26 | Facebook Technologies, Llc | Polarizing beam splitter assembly |
US10852551B1 (en) * | 2019-06-07 | 2020-12-01 | Facebook Technologies, Llc | Wavefront sensing with ellipsoidal lensing structure |
US20210041692A1 (en) * | 2019-08-07 | 2021-02-11 | Facebook Technologies, Llc | Stray light suppression in eye-tracking imaging |
US20220105090A1 (en) * | 2013-08-28 | 2022-04-07 | Lenz Therapeutics, Inc. | Compositions and methods for the treatment of eye conditions |
US20220233434A1 (en) * | 2013-08-28 | 2022-07-28 | Lenz Therapeutics, Inc. | Compositions and methods for the treatment of presbyopia |
US11561405B1 (en) * | 2019-10-31 | 2023-01-24 | Meta Platforms Technologies, Llc | Wavefront sensing with in-field illuminators |
Also Published As
Publication number | Publication date |
---|---|
US11561405B1 (en) | 2023-01-24 |
Similar Documents
Publication | Title |
---|---|
US9625723B2 (en) | Eye-tracking system using a freeform prism |
US10545340B2 (en) | Head mounted display and low conspicuity pupil illuminator |
US10228561B2 (en) | Eye-tracking system using a freeform prism and gaze-detection light |
US10345903B2 (en) | Feedback for optic positioning in display devices |
EP3500887B1 (en) | Scanning in optical systems |
US11407731B2 (en) | Holographic in-field illuminator |
US20230051353A1 (en) | Beam shaping optical structures |
US10852551B1 (en) | Wavefront sensing with ellipsoidal lensing structure |
US11108977B1 (en) | Dual wavelength eye imaging |
US10880542B1 (en) | Near-eye optical element with embedded hot mirror |
US20230087535A1 (en) | Wavefront sensing from retina-reflected light |
US11953679B2 (en) | Dual Purkinje imaging with ellipsoidal lensing structure |
US11205069B1 (en) | Hybrid cornea and pupil tracking |
US20230119935A1 (en) | Gaze-guided image capture |
US11550153B2 (en) | Optical combiner aberration correction in eye-tracking imaging |
US11281160B2 (en) | Holographic pattern generation for head-mounted display (HMD) eye tracking using a fiber exposure |
US11796829B1 (en) | In-field illuminator for eye depth sensing |
US11796804B1 (en) | Eye-tracking with steered eyebox |
US11867900B2 (en) | Bright pupil eye-tracking system |
TWI294528B (en) | System and method for channeling images within a head mounted display |
US11927766B2 (en) | In-field imaging system for eye tracking |
US11579425B1 (en) | Narrow-band peripheral see-through pancake lens assembly and display device with same |
US20240134447A1 (en) | Scanning display with eye-tracking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner name: FACEBOOK TECHNOLOGIES, LLC, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHARMA, ROBIN;HATZILIAS, KAROL CONSTANTINE;OUDERKIRK, ANDREW JOHN;AND OTHERS;SIGNING DATES FROM 20200715 TO 20200716;REEL/FRAME:062319/0053 |
| AS | Assignment | Owner name: META PLATFORMS TECHNOLOGIES, LLC, CALIFORNIA. Free format text: CHANGE OF NAME;ASSIGNOR:FACEBOOK TECHNOLOGIES, LLC;REEL/FRAME:063165/0001. Effective date: 20220318 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |