CN116009243A - Eye tracking based on waveguide imaging - Google Patents


Publication number
CN116009243A
Authority
CN
China
Prior art keywords
light
optical element
optical
light ray
waveguide
Prior art date
Legal status
Pending
Application number
CN202211554573.XA
Other languages
Chinese (zh)
Inventor
巴巴克·埃米尔苏来马尼
帕西·萨里科
耿莹
优素福·尼奥尼·巴克萨姆·苏莱
斯科特·查尔斯·麦克尔道尼
Current Assignee
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date
Filing date
Publication date
Priority claimed from US16/359,117 external-priority patent/US11256086B2/en
Application filed by Meta Platforms Technologies LLC filed Critical Meta Platforms Technologies LLC
Publication of CN116009243A publication Critical patent/CN116009243A/en

Classifications

    • G02B27/017: Head-up displays; head mounted
    • G02B27/283: Optical systems or apparatus for polarising, used for beam splitting or combining
    • G02B2027/0187: Display position adjusting means not related to the information to be displayed, slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

The present application relates to eye tracking based on waveguide imaging. An optical system includes an optical waveguide and a first optical element configured to direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence, such that the first light ray propagates through the optical waveguide toward a second optical element via total internal reflection. The first optical element is also configured to direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence, such that the second light ray propagates away from the second optical element. The second optical element is configured to direct the first light ray propagating through the optical waveguide to a detector.

Description

Eye tracking based on waveguide imaging
This application is a divisional of the application with application number 201980033381.4, entitled "Eye tracking based on waveguide imaging", filed on April 22, 2019.
Technical Field
The present disclosure relates generally to display devices, and more particularly to head mounted display devices.
Background
Head mounted display devices (also referred to herein as head mounted displays) are becoming increasingly popular as a means of providing visual information to users. For example, head mounted display devices are used for virtual reality and augmented reality operations. Eye tracking allows the head mounted display device to determine the user's gaze and provide visual information based on the user's gaze direction.
SUMMARY
Thus, there is a need for a compact and lightweight eye tracking system in a head mounted display device.
The systems and methods disclosed in this specification address these and other technical challenges using waveguides and polarization-dependent optical elements (e.g., polarization volume holographic elements, geometric phase lenses, etc.). A polarization-dependent optical element manipulates light having a particular polarization, such as right-handed circular polarization, and couples that light into a waveguide so that it is guided within the waveguide to an off-axis position, where it is coupled out of the waveguide. This allows the optical elements and detector to be placed away from the line of sight of the user (i.e., away from an on-axis position). Furthermore, the waveguide and the polarization-dependent optical elements employ a telescopic configuration, providing a reduced-size image (e.g., of the eye) and allowing a smaller (and lighter) detector to be used in the eye tracking system. In addition, because the optical elements are polarization selective, the detector in the eye tracking system receives light with a specific polarization (and either does not receive light with a different polarization, or receives it at reduced intensity), which reduces noise in the received light and improves the performance of the eye tracking system. In some embodiments, the waveguide and the polarization-dependent optical elements are wavelength specific, allowing visible light to be transmitted and making the eye tracking system compatible with augmented reality operation.
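The total internal reflection condition that this guiding scheme relies on can be sketched numerically. The following is an illustrative sketch only, not part of the patent; the refractive index values and function names are assumptions.

```python
import math

def critical_angle_deg(n_waveguide, n_outside=1.0):
    """Critical angle (degrees) for total internal reflection at the
    waveguide/outside interface: sin(theta_c) = n_outside / n_waveguide."""
    return math.degrees(math.asin(n_outside / n_waveguide))

def is_guided(angle_deg, n_waveguide, n_outside=1.0):
    """A ray inside the waveguide is guided by TIR when its angle of
    incidence on the surface exceeds the critical angle."""
    return angle_deg > critical_angle_deg(n_waveguide, n_outside)

# For a glass waveguide with n ~ 1.5 in air, theta_c is about 41.8 degrees.
print(round(critical_angle_deg(1.5), 1))  # 41.8
print(is_guided(50.0, 1.5))               # True: guided toward the out-coupler
print(is_guided(30.0, 1.5))               # False: escapes the waveguide
```

A ray that the first optical element deflects beyond the critical angle is trapped and hops along the waveguide toward the second optical element; a ray deflected less steeply simply exits.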
According to some embodiments, an optical system includes an optical waveguide and a first optical element configured to: (i) direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence, such that the first light ray propagates through the optical waveguide toward a second optical element via total internal reflection; and (ii) direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence, such that the second light ray propagates away from the second optical element. The second optical element is configured to direct the first light ray propagating through the optical waveguide to a detector.
According to some embodiments, a method for relaying an image of an eye includes receiving light from an eye of a user at a first optical element, wherein the first optical element is configured to: (i) direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence, such that the first light ray propagates through an optical waveguide toward a second optical element via total internal reflection; and (ii) direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence, such that the second light ray propagates away from the second optical element. The method includes directing the first light ray from the optical waveguide to a detector with the second optical element.
Embodiments according to the invention are specifically disclosed in the appended claims directed to optical systems, imaging systems, and methods, wherein any feature mentioned in one claim category (e.g., optical systems) may also be claimed in another claim category (e.g., imaging systems, methods, storage media, systems, and computer program products). The dependencies or back-references in the appended claims are chosen for formal reasons only. However, any subject matter resulting from a deliberate back-reference to any preceding claim (in particular, multiple dependencies) may also be claimed, so that any combination of claims and their features is disclosed and may be claimed irrespective of the dependencies chosen in the appended claims. The subject matter that may be claimed includes not only the combinations of features set forth in the appended claims, but also any other combination of features in the claims, wherein each feature mentioned in the claims may be combined with any other feature or combination of features in the claims. Furthermore, any of the embodiments and features described or depicted herein may be claimed in a separate claim and/or in any combination with any embodiment or feature described or depicted herein, or with any feature of the appended claims.
In an embodiment, an optical system may include:
an optical waveguide; and
A first optical element configured to:
i) directing a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide towards the second optical element via total internal reflection, and
ii) directing a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at a first angle of incidence such that the second light ray propagates away from the second optical element,
wherein the second optical element is configured to direct the first light ray propagating through the optical waveguide to a detector.
The first optical element may comprise an element selected from the group consisting of a polarization volume holographic element and a geometric phase lens.
The first light may have a wavelength greater than 850 nm.
The first optical element may be configured to transmit a third light ray having a wavelength less than 800 nm such that the third light ray propagates away from the second optical element.
The first optical element may be disposed on the first surface of the optical waveguide such that the first light ray impinges on the optical waveguide after impinging on the first optical element.
In an embodiment, an optical system may include a detector, wherein the first light rays impinging on the first optical element may include imaging light from an object, and the optical system may be configured to project the imaging light onto the detector.
Directing the first light may include causing reflection and diffraction of the first light.
The first optical element may be disposed on the second surface of the optical waveguide such that the first light impinges on the optical waveguide before impinging on the first optical element.
The second optical element may include an element selected from the group consisting of a polarization volume holographic element, a geometric phase lens, an output mirror, and an output grating.
The first light ray may form part of a light beam relayed to the second optical element at a reduced magnification.
The optical waveguide may include an intermediate field lens to reduce the magnification of the light beam.
The first optical element may include a coating that provides focusing power.
The first optical element and the second optical element may form an off-axis Galilean telescope.
The first optical element and the second optical element may form an Offner telescope.
The Offner telescope may include three reflective surfaces, the second of which is located at an intermediate image plane of the optical system.
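The image reduction provided by the telescopic configurations above can be illustrated with the usual afocal-relay relation. This is an illustrative sketch only; the focal-length values and function name are assumptions, not values from the patent.

```python
def lateral_magnification(f1_mm, f2_mm):
    """Lateral magnification of a two-element afocal relay: m = f2 / f1.
    |m| < 1 means the relay forms a reduced image of the eye, allowing a
    smaller (and lighter) detector, as described above."""
    return f2_mm / f1_mm

# E.g., a 50 mm in-coupling element paired with a 10 mm out-coupling
# element demagnifies the eye image by a factor of five.
m = lateral_magnification(50.0, 10.0)
print(m)  # 0.2
```

Under this relation, shortening the focal length of the out-coupling element relative to the in-coupling element shrinks the relayed image proportionally.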
In an embodiment, an imaging system may include:
the optical system of any embodiment herein; and
a detector configured to receive an image of the object from the optical system.
The object may include an eye, the detector may include a camera, the camera may be located outside of a field of view of the eye, and the first optical element may be located in front of the eye to allow the camera to image a direct view of the eye.
The imaging system may be included in a headset.
In an embodiment, a method for relaying an image of an eye (in particular, using an optical system or an imaging system according to any embodiment herein) may comprise:
receiving light from an eye of a user at a first optical element, wherein the first optical element is configured to:
i) directing a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide towards the second optical element via total internal reflection, and
ii) directing a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at a first angle of incidence such that the second light ray propagates away from the second optical element; and
directing the first light ray from the optical waveguide to a detector with the second optical element.
In an embodiment, a method for relaying an image of an eye may further include:
projecting a first light ray onto a detector to form an image; and
the position of the pupil (pupil) of the user's eye is determined from the image.
In an embodiment, one or more computer-readable non-transitory storage media may embody software that is operable, when executed, to perform a method according to any of the above-mentioned embodiments, or within an optical system or imaging system according to any of the above-mentioned embodiments.
In an embodiment, a system may include: one or more processors; and at least one memory coupled to the processors and including processor-executable instructions that, when executed, are operable to perform a method according to any of the above-mentioned embodiments, or within an optical system or imaging system according to any of the above-mentioned embodiments.
In an embodiment, a computer program product, preferably comprising a computer-readable non-transitory storage medium, is operable, when executed on a data processing system, to perform a method according to any of the above-mentioned embodiments, or within an optical system or imaging system according to any of the above-mentioned embodiments.
Brief Description of Drawings
For a better understanding of the various embodiments described, reference should be made to the description of the embodiments below, taken in conjunction with the following drawings, in which like reference numerals refer to corresponding parts throughout.
Fig. 1 is a perspective view of a display device according to some embodiments.
Fig. 2 is a block diagram of a system including a display device according to some embodiments.
FIG. 3 is an isometric view of a display device according to some embodiments.
Fig. 4A is an example optical system according to some embodiments.
Fig. 4B is an example optical system according to some embodiments.
Fig. 5A is an example optical system according to some embodiments.
Fig. 5B is an example optical system according to some embodiments.
Fig. 5C is an example optical system according to some embodiments.
Fig. 5D is an example optical system according to some embodiments.
Fig. 5E shows an example of distortion in an optical system.
Fig. 6A is an example of a paraxial optical system according to some embodiments.
FIG. 6B is an example of an off-axis optical system according to some embodiments.
Fig. 7A is an example optical system according to some embodiments.
Fig. 7B is an example optical system according to some embodiments.
Fig. 7C is an example of distortion in an optical system.
Fig. 8A illustrates an Offner relay optical system in accordance with some embodiments.
Fig. 8B illustrates a cross-elliptical relay optical system according to some embodiments.
The figures are not drawn to scale unless otherwise indicated.
Detailed Description
Reference will now be made to embodiments, examples of which are illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide an understanding of the various described embodiments. It will be apparent, however, to one skilled in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will be further understood that, although the terms first, second, etc. may be used herein to describe various elements in some instances, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first light projector (light projector) may be referred to as a second light projector, and similarly, a second light projector may be referred to as a first light projector, without departing from the scope of the various described embodiments. The first light projector and the second light projector are both light projectors, but they are not the same light projector.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The term "exemplary" is used herein in the sense of "serving as an example, instance, or illustration," and not in the sense of "representing the best of its kind."
Fig. 1 illustrates a display device 100 according to some embodiments. In some embodiments, the display device 100 is configured to be worn on the head of the user (e.g., by having the form of eyeglasses or eyepieces as shown in fig. 1) or included as part of a helmet to be worn by the user. When the display device 100 is configured to be worn on the head of a user or included as part of a helmet, the display device 100 is referred to as a head mounted display. Alternatively, the display device 100 is configured for placement near one or both eyes of the user at a fixed location, rather than being head-mounted (e.g., the display device 100 is mounted in a vehicle (e.g., an automobile or airplane) for placement in front of one or both eyes of the user). As shown in fig. 1, the display device 100 includes a display 110. The display 110 is configured to present visual content (e.g., augmented reality content, virtual reality content, mixed reality content, or any combination thereof) to a user.
In some embodiments, display device 100 includes one or more of the components described herein with reference to fig. 2. In some embodiments, display device 100 includes additional components not shown in fig. 2.
Fig. 2 is a block diagram of a system 200 according to some embodiments. The system 200 shown in fig. 2 includes a display device 205 (which corresponds to the display device 100 shown in fig. 1), an imaging device 235, and an input interface 240, each of which is coupled to the console 210. Although fig. 2 shows an example of a system 200 including one display device 205, one imaging device 235, and one input interface 240, in other embodiments, any number of these components may be included in the system 200. For example, there may be a plurality of display devices 205, each display device 205 having an associated input interface 240 and being monitored by one or more imaging devices 235, wherein each display device 205, input interface 240, and imaging device 235 are in communication with the console 210. In alternative configurations, different and/or additional components may be included in system 200. For example, in some embodiments, console 210 is connected to system 200 via a network (e.g., the internet) or is self-contained as part of the display device 205 (e.g., physically located inside the display device 205). In some embodiments, the display device 205 is used to create a mixed reality by adding a view of the real environment. Accordingly, the display device 205 and system 200 described herein can deliver augmented reality, virtual reality, and mixed reality.
In some embodiments, as shown in fig. 1, the display device 205 is a head mounted display that presents media to a user. Examples of media presented by the display device 205 include one or more images, video, audio, or some combination thereof. In some embodiments, the audio is presented via an external device (e.g., speaker and/or headphones) that receives audio information from the display device 205, the console 210, or both and presents audio data based on the audio information. In some embodiments, the display device 205 immerses the user in the enhanced environment.
In some embodiments, the display device 205 also functions as an Augmented Reality (AR) headset. In these embodiments, the display device 205 augments a view of the physical real world environment with computer-generated elements (e.g., images, video, sound, etc.). Further, in some embodiments, the display device 205 is capable of cycling between different types of operations. Thus, based on instructions from the application engine 255, the display device 205 operates as a Virtual Reality (VR) device, an Augmented Reality (AR) device, as glasses, or some combination thereof (e.g., glasses without optical correction, glasses with optical correction for a user, sunglasses, or some combination thereof).
The display device 205 includes an electronic display 215, one or more processors 216, an eye tracking module 217, an adjustment module 218, one or more locators 220, one or more position sensors 225, one or more position cameras 222, memory 228, an Inertial Measurement Unit (IMU) 230, one or more reflective elements 260, or a subset or superset thereof (e.g., the display device 205 having the electronic display 215, the one or more processors 216, and the memory 228 without any other listed components). Some embodiments of the display device 205 have modules that are different from those described herein. Similarly, functionality may be distributed among modules in a different manner than described herein.
One or more processors 216 (e.g., processing units or cores) execute instructions stored in memory 228. Memory 228 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and may include non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. The memory 228 or alternatively a non-volatile memory device within the memory 228 includes a non-transitory computer-readable storage medium. In some embodiments, memory 228 or a computer readable storage medium of memory 228 stores programs, modules and data structures, and/or instructions for displaying one or more images on electronic display 215.
The electronic display 215 displays images to the user in accordance with data received from the console 210 and/or the processor 216. In various embodiments, electronic display 215 may include a single adjustable display element or multiple adjustable display elements (e.g., one display per eye of a user). In some embodiments, electronic display 215 is configured to display an image to a user by projecting the image onto one or more reflective elements 260.
In some embodiments, the display element includes one or more light emitting devices and a corresponding spatial light modulator array. A spatial light modulator is an array of electro-optic pixels, an array of some other devices that dynamically adjust the amount of light transmitted by each device, or some combination thereof. The pixels are placed behind one or more lenses. In some embodiments, the spatial light modulator is a liquid-crystal-based pixel array in an LCD (liquid crystal display). Examples of the light emitting devices include: organic light emitting diodes, active-matrix organic light emitting diodes, some type of device that can be placed in a flexible display, or some combination thereof. The light emitting devices include devices capable of generating visible light (e.g., red, green, blue, etc.) for image generation. The spatial light modulator is configured to selectively attenuate individual light emitting devices, groups of light emitting devices, or some combination thereof. Alternatively, when the light emitting devices themselves are configured to be selectively attenuated individually and/or in groups, the display element includes an array of such light emitting devices without a separate emission intensity array. In some embodiments, electronic display 215 projects an image onto one or more reflective elements 260, which reflect at least a portion of the light toward the eyes of the user.
One or more lenses direct light from the array of light emitting devices (optionally through an array of emission intensities) to a location within each window (eyebox) and ultimately to the back of the user's retina. A window is the area occupied by the eyes of a user located near the display device 205 (e.g., the user wearing the display device 205) for viewing images from the display device 205. In some cases, the window is represented as a 10 mm × 10 mm square. In some embodiments, the one or more lenses include one or more coatings, such as an anti-reflective coating.
In some embodiments, the display element includes an array of infrared (IR) detectors that detect IR light retro-reflected from the retina of a viewing user, from the corneal surface, from the lens of the eye, or some combination thereof. The IR detector array includes an IR sensor or a plurality of IR sensors, each corresponding to a different position of the pupil of the viewing user's eye. In alternative embodiments, other eye tracking systems may be employed.
The eye tracking module 217 determines the position of each pupil of the user's eye. In some embodiments, the eye tracking module 217 instructs the electronic display 215 to illuminate the window with IR light (e.g., via an IR emitting device in the display element).
A portion of the emitted IR light will pass through the pupil of the viewing user and be retroreflected from the retina toward the IR detector array used to determine the pupil location. Alternatively, light reflected off the surface of the eye is also used to determine the position of the pupil. The IR detector array scans for retroreflection and identifies which IR emitting devices are active when retroreflection is detected. The eye tracking module 217 may use a tracking look-up table and the identified IR emitting devices to determine the pupil position of each eye. The tracking look-up table maps the signals received on the IR detector array to positions in each window (corresponding to pupil positions). In some embodiments, the tracking look-up table is generated via a calibration process (e.g., the user looks at various known reference points in an image, and the eye tracking module 217 maps the positions of the user's pupil while looking at the reference points to corresponding signals received on the IR detector array). As described above, in some embodiments, the system 200 may use other eye tracking systems in addition to the embedded IR eye tracking system described herein.
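As a rough illustration of the tracking look-up table described above, the sketch below maps the set of IR emitters active at detection time to a calibrated pupil position. The data structures and names are assumptions for illustration, not the patent's implementation.

```python
def build_tracking_table(calibration):
    """calibration: list of (active_emitter_ids, pupil_xy) pairs collected
    while the user fixates known reference points (the calibration process
    described above)."""
    table = {}
    for emitter_ids, pupil_xy in calibration:
        table[frozenset(emitter_ids)] = pupil_xy
    return table

def pupil_position(table, active_emitter_ids):
    """Return the calibrated pupil position whose stored emitter signature
    best overlaps the currently active emitters."""
    active = set(active_emitter_ids)
    best = max(table, key=lambda signature: len(signature & active))
    return table[best]

# Three calibration fixations, each recording which emitters retroreflected:
calib = [({1, 2}, (0.0, 0.0)), ({3, 4}, (2.5, 0.0)), ({5, 6}, (0.0, 2.5))]
table = build_tracking_table(calib)
print(pupil_position(table, [3, 4]))  # (2.5, 0.0)
```

A production system would interpolate between calibration points rather than pick the nearest signature, but the table-driven mapping is the same idea.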
The adjustment module 218 generates image frames based on the determined pupil position. In some embodiments, discrete sub-images are sent to a display that tiles them together so that a coherent stitched image appears on the back of the retina. The adjustment module 218 adjusts the output of the electronic display 215 (i.e., the generated image frames) based on the detected pupil position. The adjustment module 218 instructs portions of the electronic display 215 to deliver image light to the determined pupil position. In some embodiments, the adjustment module 218 also instructs the electronic display not to deliver image light to locations other than the determined pupil position. The adjustment module 218 may, for example, block and/or stop light emitting devices whose image light falls outside of the determined pupil position, allow other light emitting devices to emit image light that falls within the determined pupil position, translate and/or rotate one or more display elements, dynamically adjust the curvature and/or refractive power of one or more active lenses in an array of lenses (e.g., microlenses), or some combination thereof.
The optional locators 220 are objects that are located in particular positions on the display device 205 relative to each other and to a particular reference point on the display device 205. A locator 220 may be a light emitting diode (LED), a corner cube reflector, a reflective marker, a type of light source that contrasts with the environment in which the display device 205 operates, or some combination thereof. In embodiments where the locators 220 are active (i.e., LEDs or other types of light emitting devices), the locators 220 may emit light in the visible band (e.g., about 400 nm to 750 nm), in the infrared band (e.g., about 750 nm to 1 mm), in the ultraviolet band (about 100 nm to 400 nm), in some other portion of the electromagnetic spectrum, or in some combination thereof.
In some embodiments, the locator 220 is located below an outer surface of the display device 205 that is transparent to the wavelengths of light emitted or reflected by the locator 220, or is sufficiently thin so as not to substantially attenuate those wavelengths. Additionally, in some embodiments, the outer surface or other portions of the display device 205 are opaque in the visible band. Thus, the locator 220 may emit light in the IR band beneath an outer surface that is transparent in the IR band but opaque in the visible band.
The IMU 230 may be an electronic device that generates calibration data based on measurement signals received from one or more position sensors 225. The position sensor 225 generates one or more measurement signals in response to movement of the display device 205. Examples of the position sensor 225 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, one type of sensor for error correction of the IMU 230, or some combination thereof. The position sensor 225 may be located external to the IMU 230, internal to the IMU 230, or some combination thereof.
Based on one or more measurement signals from one or more position sensors 225, the IMU 230 generates first calibration data indicative of an estimated position of the display device 205 relative to an initial position of the display device 205. For example, the position sensor 225 includes a plurality of accelerometers to measure translational motion (forward/backward, up/down, left/right) and a plurality of gyroscopes to measure rotational motion (e.g., pitch, yaw, roll). In some embodiments, the IMU 230 rapidly samples the measurement signals and calculates an estimated position of the display device 205 from the sampled data. For example, the IMU 230 integrates the measurement signals received from the accelerometer over time to estimate a velocity vector and integrates the velocity vector over time to determine an estimated location of a reference point on the display device 205. Alternatively, the IMU 230 provides the sampled measurement signals to the console 210, which console 210 determines the first calibration data. The reference point is a point that may be used to describe the position of the display device 205. Although the reference point may generally be defined as a point in space, in practice, the reference point is defined as a point within the display device 205 (e.g., the center of the IMU 230).
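The integration chain described above (acceleration to velocity to position) can be sketched numerically. The following is an illustrative dead-reckoning sketch under assumed values (1 kHz sample rate, constant acceleration), not the IMU 230's actual algorithm:

```python
import numpy as np

def dead_reckon(accel, dt):
    """Integrate sampled acceleration (N x 3) into velocity and
    position estimates using a simple rectangular rule."""
    velocity = np.cumsum(accel, axis=0) * dt      # a -> v
    position = np.cumsum(velocity, axis=0) * dt   # v -> x
    return velocity, position

# Constant 1 m/s^2 acceleration along y for 1 s, sampled at 1 kHz.
dt = 1e-3
accel = np.tile([0.0, 1.0, 0.0], (1000, 1))
velocity, position = dead_reckon(accel, dt)
final_v = velocity[-1, 1]    # ~1.0 m/s
final_x = position[-1, 1]    # ~0.5 m, plus a small integration error
```

Because any accelerometer bias is integrated twice, the position estimate drifts roughly quadratically in time, which is why the reference point is periodically re-anchored to a calibrated position, as described below.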
In some embodiments, the IMU 230 receives one or more calibration parameters from the console 210. As discussed further below, one or more calibration parameters are used to keep track of the display device 205. Based on the received calibration parameters, the IMU 230 may adjust one or more IMU parameters (e.g., sample rate). In some embodiments, certain calibration parameters cause the IMU 230 to update the initial position of the reference point so that it corresponds to the next calibration position of the reference point. Updating the initial position of the reference point to the next calibrated position of the reference point helps to reduce the accumulated error associated with the determined estimated position. The accumulated error, also known as drift error (drift error), causes the estimated position of the reference point to deviate from the actual position of the reference point over time.
The imaging device 235 generates second calibration data in accordance with calibration parameters received from the console 210. The second calibration data includes one or more images showing the observed positions of the locators 220 that are detectable by the imaging device 235. In some embodiments, imaging device 235 includes one or more still cameras, one or more video cameras, any other device capable of capturing images that include one or more locators 220, or some combination thereof. In addition, the imaging device 235 may include one or more filters (e.g., to increase signal-to-noise ratio). The imaging device 235 is configured to optionally detect light emitted or reflected from the locators 220 in the field of view of the imaging device 235. In embodiments where the locators 220 include passive elements (e.g., retro-reflectors), the imaging device 235 may include a light source that illuminates some or all of the locators 220, which retro-reflect the light toward the light source in the imaging device 235. The second calibration data is transferred from the imaging device 235 to the console 210, and the imaging device 235 receives one or more calibration parameters from the console 210 to adjust one or more imaging parameters (e.g., focal length, focus, frame rate, ISO, sensor temperature, shutter speed, aperture, etc.).
In some embodiments, the display device 205 optionally includes one or more reflective elements 260. In some embodiments, the electronic display device 205 optionally includes a single reflective element 260 or multiple reflective elements 260 (e.g., one reflective element 260 per eye of a user). In some embodiments, electronic display device 215 projects a computer-generated image onto one or more reflective elements 260, which reflective elements 260 in turn reflect the image toward one or both eyes of the user. The computer-generated images include still images, animated images, and/or combinations thereof. The computer-generated image includes objects that appear to be two-dimensional and/or three-dimensional objects. In some embodiments, one or more reflective elements 260 are partially light transmissive (e.g., one or more reflective elements 260 have a transmittance of at least 15%, 20%, 25%, 30%, 35%, 40%, 45%, or 50%), which allows for the transmission of ambient light. In such embodiments, the computer-generated image projected by electronic display 215 is superimposed with the transmitted ambient light (e.g., the transmitted ambient image) to provide an augmented reality image.
The input interface 240 is a device that allows a user to send an action request to the console 210. An action request is a request to perform a particular action. For example, the action request may be to start or end an application, or to perform a particular action within an application. Input interface 240 may include one or more input devices. Example input devices include: a keyboard, a mouse, a game controller, data from brain signals, data from other parts of the human body, or any other suitable device for receiving action requests and transmitting the received action requests to console 210. An action request received by the input interface 240 may be communicated to the console 210, and the console 210 performs the action corresponding to the action request. In some embodiments, input interface 240 may provide haptic feedback to the user in accordance with instructions received from console 210. For example, haptic feedback is provided when an action request is received, or the console 210 communicates instructions to the input interface 240 causing the input interface 240 to generate haptic feedback when the console 210 performs the action.
The console 210 provides media to the display device 205 for presentation to a user in accordance with information received from one or more of the imaging device 235, the display device 205, and the input interface 240. In the example shown in fig. 2, the console 210 includes an application store 245, a tracking module 250, and an application engine 255. Some embodiments of console 210 have different modules than those described in connection with fig. 2. Similarly, the functions further described herein may be distributed among the components of console 210 in a manner different than that described herein.
When the application store 245 is included in the console 210, the application store 245 stores one or more applications executed by the console 210. An application is a set of instructions that, when executed by a processor, are used to generate content for presentation to a user. The content generated by the processor-based application may be responsive to input received from a user via movement of the display device 205 or the input interface 240. Examples of applications include: a gaming application, a conferencing application, a video playback application, or other suitable application.
When the tracking module 250 is included in the console 210, the tracking module 250 calibrates the system 200 using one or more calibration parameters, and may adjust the one or more calibration parameters to reduce errors in the determination of the position of the display device 205. For example, the tracking module 250 adjusts the focus of the imaging device 235 to obtain a more accurate position of the observed locators on the display device 205. In addition, the calibration performed by the tracking module 250 also takes into account information received from the IMU 230. In addition, if tracking of the display device 205 is lost (e.g., the imaging device 235 loses line of sight to at least a threshold number of the locators 220), the tracking module 250 recalibrates some or all of the system 200.
In some embodiments, the tracking module 250 uses the second calibration data from the imaging device 235 to track movement of the display device 205. For example, the tracking module 250 uses the observed locator to determine the location of the reference point of the display device 205 from the second calibration data and the model of the display device 205. In some embodiments, the tracking module 250 also uses the location information from the first calibration data to determine the location of the reference point of the display device 205. Further, in some embodiments, the tracking module 250 may use portions of the first calibration data, the second calibration data, or some combination thereof to predict a future location of the display device 205. The tracking module 250 provides the estimated or predicted future position of the display device 205 to the application engine 255.
The application engine 255 executes applications within the system 200 and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the display device 205 from the tracking module 250. Based on the received information, the application engine 255 determines content to be provided to the display device 205 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the application engine 255 generates content for the display device 205 that reflects (mirrors) the user's movement in the augmented environment. In addition, the application engine 255 performs actions within applications executing on the console 210 in response to action requests received from the input interface 240 and provides feedback to the user that the actions were performed. The feedback provided may be visual or audible feedback via the display device 205, or tactile feedback via the input interface 240.
Fig. 3 is an isometric view of a display device 300 according to some embodiments. In some other embodiments, the display device 300 is part of some other electronic display (e.g., a digital microscope, a head mounted display device, etc.). In some embodiments, the display device 300 includes an array of light emitting devices 310 and one or more lenses 330, 335. In some embodiments, the display device 300 further includes an array of IR detectors.
The array of light emitting devices 310 emits image light and optionally IR light toward a viewing user. The array of light emitting devices 310 may be, for example, an array of LEDs, a micro LED array, an array of OLEDs, or some combination thereof. The light emitting device array 310 includes light emitting devices 320 that emit light in the visible band (and, optionally, devices that emit light in the IR band).
In some embodiments, display device 300 includes an array of emission intensities configured to selectively attenuate light emitted from light emitting device array 310. In some embodiments, the emission intensity array is comprised of a plurality of liquid crystal cells or pixels, a group of light emitting devices, or some combination thereof. Each liquid crystal cell (or group of liquid crystal cells in some embodiments) is addressable to have a particular level of attenuation. For example, at a given time, some liquid crystal cells may be set to no attenuation, while other liquid crystal cells may be set to maximum attenuation. In this manner, the emission intensity array can control which portion of the image light emitted from the light emitting device array 310 is passed to one or more lenses 330, 335. In some embodiments, the display device 300 uses an array of emission intensities to facilitate providing image light to the pupil 350 location of the user's eye 340 and to minimize the amount of image light provided to other areas in the window.
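The per-cell attenuation described above amounts to an elementwise mask over the emitted image light. A minimal sketch with made-up panel dimensions and intensity values (not from the patent):

```python
import numpy as np

emitted = np.full((4, 4), 100.0)   # image light from light emitting device array 310
attenuation = np.ones((4, 4))      # 1.0 = no attenuation per liquid crystal cell
attenuation[:, 2:] = 0.0           # these cells are set to maximum attenuation
passed = emitted * attenuation     # portion of image light passed to the lenses
```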
One or more lenses 330, 335 receive the modified image light (e.g., attenuated light) from the emission intensity array (or directly from the light emitting device array 310) and direct the modified image light to the location of the pupil 350.
An optional IR detector array detects IR light that has been retroreflected from the retina of eye 340, the cornea of eye 340, the lens of eye 340, or some combination thereof. The IR detector array includes a single IR sensor or a plurality of IR sensitive detectors (e.g., photodiodes). In some embodiments, the IR detector array is separate from the light emitting device array 310. In some embodiments, the IR detector array is integrated into the light emitting device array 310.
In some embodiments, the array of light emitting devices 310 and the array of emission intensities constitute a display element. Alternatively, the display element includes an array of light emitting devices 310 (e.g., when the array of light emitting devices 310 includes individually adjustable pixels) and does not include an array of emission intensities. In some embodiments, the display element further comprises an IR array. In some embodiments, in response to the determined pupil 350 position, the display element adjusts the emitted image light such that the light output by the display element is refracted by the one or more lenses 330, 335 toward the determined pupil 350 position (rather than toward other positions in the window).
In some embodiments, display device 300 includes one or more broadband light sources (e.g., one or more white LEDs) coupled with a plurality of color filters in addition to light emitting device array 310 or in place of light emitting device array 310.
Fig. 6A illustrates an example optical system 600 in an "unfolded" (e.g., all optical elements are arranged without "folding" optical elements, such as waveguides) and on-axis configuration (e.g., the geometric centers of the optical elements coincide with a common principal axis of the optical elements) according to some embodiments. The optical system 600 includes two relay systems; the first relay system 602 includes a first optical element 608 and a second optical element 610. The first relay system 602 receives light from a subject (e.g., an eye 606 of a wearer of a device that includes the optical system 600). The first optical element 608 has a converging/focusing optical power. Light beams from the eye 606 that impinge on the first optical element 608 at a greater height along the y-direction converge to a smaller height by the time they impinge on the second optical element 610. The second optical element 610 has a diverging optical power, and the light beams diverge in the y-direction after interacting with the second optical element 610. In some embodiments, the positive lens effect (e.g., converging) of the first optical element 608 and the negative lens effect (e.g., diverging) of the second optical element 610 allow the first and second optical elements to form a Galilean telescope (e.g., formed by a positive lens followed by a negative lens). In some embodiments, the first relay system 602 includes a Keplerian telescope (e.g., formed of two positive lenses). In some embodiments, the first optical element and the second optical element form a telescope. In some embodiments, the telescope includes additional optical elements within the first relay system 602. In some embodiments, additional optical elements are disposed within optical waveguide 402.
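As a numeric sanity check on the telescope geometry (an illustrative sketch; the focal lengths are assumed, not taken from the patent), thin-lens ray-transfer (ABCD) matrices confirm that a positive lens followed by a negative lens, separated by the sum of their focal lengths, is afocal, i.e., forms a Galilean telescope:

```python
import numpy as np

def thin_lens(f):
    # ABCD matrix of a thin lens with focal length f
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def free_space(d):
    # ABCD matrix of free-space propagation over distance d
    return np.array([[1.0, d], [0.0, 1.0]])

f1, f2 = 100.0, -25.0                  # converging, then diverging (mm); assumed values
system = thin_lens(f2) @ free_space(f1 + f2) @ thin_lens(f1)
afocal = abs(system[1, 0]) < 1e-9      # C == 0: parallel input rays stay parallel
angular_mag = system[1, 1]             # equals -f1/f2 = 4.0 for these values
```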
The second relay system 604 is disposed downstream of the first relay system 602 and images the light beams exiting (emerging) from the second optical element 610 onto a detector 614. In fig. 6A, the second relay system 604 in the example optical system 600 includes a single optical element 612 (e.g., a converging lens). In some embodiments, the second relay system 604 includes additional optical elements. In some embodiments, the second relay system 604 comprises a telescope. In some embodiments, the second relay system 604 is a Galilean telescope. In some embodiments, the second relay system 604 is a Keplerian telescope. The detector 614 (e.g., a CCD camera containing sensor elements) is located at the image plane of the optical element 612. In the example optical system 600, the second relay system 604 images the output of the first relay system 602 onto the detector 614 at reduced magnification (the extent of the imaging light along the y-axis at the detector 614 is smaller than the extent of the imaging light striking the optical element 612). In some embodiments, the second optical relay system is downstream of the first optical relay system and receives as its input the output from the first optical relay system.
Fig. 6B illustrates an optical system 650 in an "unfolded" (e.g., all optical elements are arranged without "folding" optical elements, such as waveguides) and off-axis configuration (e.g., the geometric center of any optical element does not coincide with the principal axis of any other optical element) according to some embodiments. The optical elements in the optical system 600 in fig. 6A are arranged on-axis, whereas the optical elements in the optical system 650 in fig. 6B are arranged in an off-axis manner.
Like optical system 600, optical system 650 has a first relay system 658; the first relay system 658 includes a first optical element 652 and a second optical element 654. Fig. 6A and 6B both show Galilean telescopes in the first relay system. The first optical element 652 has a converging optical power (e.g., a positive lens), and the second optical element 654 has a diverging optical power (e.g., a negative lens). In some embodiments, the first optical element 652 is a positive lens, and the light beams from eye 606 impinge (impinge) on the first optical element 652 in an off-center manner. For example, the first optical element 652 is a decentered lens (e.g., a lens whose principal axis is offset from the geometric center of the lens). In some embodiments, the principal axis of the decentered lens is remote from the lens (e.g., outside the lens). In this way, a light beam 660 from one edge of the eye 606 is refracted by the first optical element 652 by a larger angle θr1, while a light beam 662 from the other edge of the eye is refracted by the first optical element 652 by a smaller angle θr2. Thus, in some embodiments, symmetrically emitted light beams (in the y-z plane) are refracted differently when striking the first optical element in an off-axis/off-center manner. In contrast, the outermost beam 616 in fig. 6A (which impinges on a centered lens) is refracted by an angle θr1 that has the same magnitude as the angle θr2 by which the other outermost beam 618 is refracted; in fig. 6A, the angles θr1 and θr2 have the same magnitude and opposite signs.
In some embodiments, the first optical element 652 is a geometric phase element. In some embodiments, the first optical element 652 is a geometric phase lens. A geometric phase lens imposes its phase profile directly on circularly polarized light through the geometric phase (also known as the Pancharatnam-Berry phase) effect; e.g., the phase profile of the geometric phase lens is added to the original phase (the phase of the light before it passes through the geometric phase lens). The light beam 660 includes a first light ray 664 having a first circular polarization and a second light ray (not shown) having a second circular polarization different from the first circular polarization. The first optical element 652 directs the first light ray 664 along a direction at an angle θr1 with respect to the z-axis. In some embodiments, the second light ray having the second circular polarization different from the first circular polarization is transmitted through the first optical element 652. In some embodiments, the first optical element 652 directs the second light ray along a direction at an angle −θr1 from the z-axis, opposite to the first light ray.
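A hedged numeric illustration of the sign behavior described above (the wavelength and period are assumptions, not values from the patent): a geometric-phase element adds +φ(x) to one circular polarization and −φ(x) to the other, so the two polarizations deflect to mirror-image angles.

```python
import math

lam = 0.94e-6      # assumed near-IR wavelength, m
period = 10e-6     # assumed period of the geometric-phase pattern, m

# First-order deflection angles for the two circular polarizations:
theta_first = math.degrees(math.asin(lam / period))    # +1 order (e.g., one circular polarization)
theta_second = math.degrees(math.asin(-lam / period))  # -1 order (the orthogonal polarization)
```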
The first light rays from the various light beams impinge on the second optical element 654 in an off-center manner (e.g., off-axis, the second optical element is not symmetrically illuminated about its axis of symmetry or its principal axis). In some embodiments, the second optical element 654 is a negative lens, and the first light rays diverge after impinging on the second optical element 654. In some embodiments, the decentered negative lens of the second optical element 654 corrects for aberrations (e.g., distortions). In some embodiments, the first optical element 652 and the second optical element 654 are positive and negative lenses, respectively, of a galilean telescope that forms the first relay system 658.
In some embodiments, a single converging lens 656 forms the second relay system 648. The lens 656 images the output of the first relay system 658 onto the detector 624.
Fig. 4A illustrates an optical system 400 according to some embodiments. The optical system 400 includes a waveguide 402 (e.g., an optical waveguide that guides, along a long axis of the optical waveguide (e.g., the y-axis in fig. 4A), electromagnetic radiation having a wavelength longer than 400nm, greater than 800nm, greater than 1000nm, or greater than 2000 nm), a first optical element 404, a second optical element 406, an imaging optical element 408, and a detector 410 for imaging light from an eye 412 of a user of the optical system 400 (e.g., a user wearing a device that includes the optical system 400, such as a head-mounted display, a VR display head-mounted device, or an AR display head-mounted device). Compared to the on-axis system shown in fig. 6A, the waveguide 402 in fig. 4A allows the ray trace inside it to be "folded," resulting in a more compact system.
Light beam 414 from eye 412 (e.g., light reflected off of eye 412) includes a first light ray 416-1 and a second light ray 418-1. The first light ray 416-1 has a first circular polarization (e.g., right circularly polarized light (RCP)), and the second light ray 418-1 has a second circular polarization (e.g., left circularly polarized light (LCP)) that is different from the first circular polarization. The light beam 414 impinges on the first optical element 404 at a first angle of incidence. In some embodiments, the light beam impinges on the first optical element 404 over a range of angles of incidence. In some embodiments (as shown in fig. 4A), the light beam 414 impinges on the first optical element 404 at normal incidence (i.e., at an angle of incidence of 0°).
In some embodiments, the first optical element 404 is configured to direct the first light ray 416-1 (e.g., as first light ray 416-2) along a first direction at a diffraction angle θD, and to direct the second light ray 418-1 along a second direction different from the first direction. In some embodiments, the first optical element directs the second light ray 418-1 along the second direction without any diffraction, by transmitting the second light ray 418-1 through the optical waveguide 402 (e.g., as second light ray 418-2). In some embodiments, the first optical element directs the second light ray, as second light ray 418-3, by diffracting it at a diffraction angle −θD (e.g., a diffraction angle that is negative relative to that of the first light ray 416-1); for example, the first light ray 416-1 is diffracted into the +1 diffraction order, and the second light ray 418-1 is diffracted into the −1 diffraction order. Both the second light ray 418-2 and the second light ray 418-3 are directed so that the second light ray propagates away from the second optical element 406.
In some embodiments, the first light ray 416-1 has a first wavelength and the second light ray 418-1 has a second wavelength different from the first wavelength. In some embodiments, the first wavelength is greater than 850nm (e.g., greater than 900nm, greater than 1000nm, greater than 1500nm, greater than 2000 nm), and the second wavelength is less than 850nm (e.g., less than 800nm, less than 700nm, less than 600 nm). In some embodiments, the first wavelength and the second wavelength are different and both are greater than 800nm. In some embodiments, the first optical element 404 diffracts light within a wavelength range (e.g., greater than 800nm, between 800nm and 2000 nm) and transmits light outside of the wavelength range. In some embodiments, the first optical element 404 diffracts light impinging thereon that is incident within a range of angles of incidence (e.g., between +20° and-20 ° from the normal to the incident surface, between +10° and-10 ° from the normal to the incident surface, between +5° and-5 ° from the normal to the incident surface, and between +2° and-2 ° from the normal to the incident surface).
In some embodiments, the first optical element 404 is a polarization holographic element. In some embodiments, the first optical element is a geometric phase lens similar to those described in co-pending patent application No. 15/833,676, entitled "Geometric Phase Lens Alignment in an Augmented Reality Head Mounted Display," filed in December 2017, which is incorporated herein by reference in its entirety.
The first optical element 404 diffracts the first light ray 416-1, in the forward direction, into the first light ray 416-2 such that it is incident on the rear surface 422 of the optical waveguide 402 at an incident angle θI that is equal to or greater than the critical angle (critical angle) of the optical waveguide 402. For example, in an embodiment in which the optical waveguide 402 is made of a material having a refractive index n at a wavelength λ1, the critical angle θc at the material-air interface for light of the wavelength λ1 is sin−1(n_air/n) (i.e., the arcsine of the ratio of the refractive index of air, n_air, to the refractive index n at the wavelength λ1). Thus, the first light ray is reflected by total internal reflection (TIR) at the rear surface 422 and impinges again on the front surface 424 at an angle greater than θc (e.g., an angle whose magnitude is the same as that of the incident angle θI). The first optical element 404 couples in light from the eye 412 such that the light is directed along the long axis (e.g., the y-axis) of the optical waveguide 402. After undergoing one or more total internal reflections at the material-air interfaces (e.g., rear surface 422 and front surface 424) of the optical waveguide 402, the first light ray 416-2 impinges on the front surface 424 at the location where the second optical element 406 is disposed. In some embodiments, the second optical element 406 is deposited on the optical waveguide 402. In some embodiments, the second optical element 406 is coated on the optical waveguide 402. The second optical element 406 couples the first light ray 416-2 out of the optical waveguide 402, and thus the first light ray 416-2 is not directed further along the optical waveguide 402 (e.g., the first light ray 416-2 is not reflected again within the optical waveguide 402). In some embodiments, the second optical element 406 is a polarization holographic element.
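The critical-angle condition above can be checked numerically. A minimal sketch, assuming a glass-like refractive index (the patent does not specify the waveguide material):

```python
import math

def critical_angle_deg(n_waveguide, n_air=1.0):
    """theta_c = arcsin(n_air / n) at the waveguide-air interface."""
    return math.degrees(math.asin(n_air / n_waveguide))

def undergoes_tir(theta_incident_deg, n_waveguide):
    """A ray inside the waveguide is totally internally reflected when
    its incidence angle meets or exceeds the critical angle."""
    return theta_incident_deg >= critical_angle_deg(n_waveguide)

theta_c = critical_angle_deg(1.5)   # ~41.8 degrees for n = 1.5
```

A diffracted ray at, say, 50° internal incidence would therefore be guided by TIR, while a 30° ray would escape the waveguide.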
In some embodiments, the second optical element 406 is a geometric phase lens. In some embodiments, the second optical element 406 is a polarization grating. In some embodiments, the second optical element 406 is an output mirror. In some embodiments, the second optical element 406 is an output grating. The second optical element 406 directs the first light ray 416-2 such that the directed first light ray 416-3 propagates substantially parallel to the z-axis after exiting the optical waveguide 402 (e.g., at an angle of less than 20°, less than 10°, less than 5°, less than 2°, or less than 1° to the z-axis).
Fig. 4A shows another light ray 426-1 from eye 412, propagating in a direction parallel to beam 414. In fig. 4A, the first optical element 404 also diffracts light ray 426-1, into light ray 426-2, in a direction parallel to first light ray 416-2. As a result, light ray 426-2 is directed through the optical waveguide 402 along an optical path that is shifted (displaced) in the y-direction relative to first light ray 416-2. The imaging optical element 408, positioned downstream of the second optical element 406 (e.g., positioned after the second optical element 406 along an optical path that begins at eye 412 and ends at detector 410), images first light ray 416-3 and light ray 426-3 onto the detector 410 such that an image of eye 412 (e.g., an image of the pupil of eye 412) is formed at the detector 410 (e.g., the detector 410 is located at the image plane of the imaging optical element 408). In some embodiments, the object plane of the imaging optical element 408 is at or near the exit surface (exit surface) of the second optical element 406. In some embodiments, the exit surface is a surface 428 of the second optical element 406 closest to the detector 410, which may define a material-air interface. In some embodiments, the imaging optical element 408 forms a relay system that images the output from the optical waveguide 402 onto the detector 410.
In some embodiments, the first relay system relays (or images) light rays (e.g., beam 414, ray 426-1) from eye 412 onto a plane (e.g., image plane, output plane of optical waveguide 402), which in turn is the object plane (object plane) of the second relay system (e.g., imaging optics 408). In some embodiments, the first optical element 404 and the second optical element 406 together form a first relay system.
Fig. 4B illustrates an optical system 450 according to some embodiments. Light beam 414 from eye 412 is transmitted through the front surface 424 of the optical waveguide 402 and is coupled into the optical waveguide 402. The light beam 414 includes a first light ray 416-1 having a first circular polarization and a second light ray 418-1 having a second circular polarization different from the first circular polarization. The light beam 414 impinges on a reflective first optical element 452 disposed on the rear surface 422 of the optical waveguide 402. The reflective first optical element 452 reflectively (e.g., in a rearward direction) diffracts the first light ray 416-1 having the first circular polarization toward the front surface 424 of the optical waveguide 402, as diffracted first light ray 416-2, at a diffraction angle θD. The first light ray 416-2 strikes the front surface 424 at an incident angle θI equal to θD. The reflective first optical element 452 is configured such that θD (and thus θI) is equal to or greater than the critical angle of the optical waveguide 402. In this way, the first light ray 416-2 is reflected by total internal reflection within the optical waveguide and is directed along its long axis (e.g., along the y-direction). In some embodiments, the second light ray 418-1 having the second circular polarization different from the first circular polarization is simply transmitted through the reflective first optical element 452, as transmitted second light ray 418-2, and propagates away from the second optical element 406. In some embodiments, the second light ray 418-1 having the second circular polarization is diffracted in the opposite direction at an angle −θD, as diffracted second light ray 418-3, and propagates away from the second optical element 406. The first light ray 416-2 is coupled out of the optical waveguide 402 through the second optical element 406 in a manner similar to that described with reference to fig. 4A.
For ease of illustration, the first optical element 404 is shown in fig. 4A as having no optical power (e.g., focusing optical power, converging optical power, or diverging optical power): the distance between first light ray 416-1 (within beam 414) and light ray 426-1 at the second optical element 406 (e.g., the distance between first light ray 416-3 and light ray 426-3) is substantially the same as the distance between them at the first optical element 404 (e.g., the distance between first light ray 416-1 and light ray 426-1). In some embodiments, the first optical element has a focusing optical power, and the distance between the light rays at the second optical element 406 is reduced (e.g., a reduced image is formed) compared to their distance at the first optical element 404. In some embodiments, the first optical element has a coating that provides the focusing optical power. In some embodiments, the first optical element is formed from a material that provides the focusing optical power.
Fig. 5A shows an "unfolded" configuration of an optical system 500, which includes a telescope for forming an image of an object on a detector. The "unfolded" configuration shows the individual optical elements of the optical system 500 arranged in sequence along the z-axis, without showing the one or more reflections within the waveguide. The light beam exits the object (e.g., eye 502) along the y-axis and impinges on first optical element 504.
In some embodiments, the first optical element 504 (sometimes referred to as an input grating) has a focusing power. For example, the first optical element 504 in Fig. 5A has a focal length f1. The second optical element 520 is located at a distance f1 from the first optical element 504. In some embodiments, as shown in Fig. 5A, the second optical element 520 does not have a focusing power and is used (e.g., only) to couple out light guided by the waveguide. In some embodiments, the width of the light beam from eye 502 is narrowest at the second optical element 520. Placing the second optical element 520 at this location allows a minimally sized second optical element 520 to be used without losing a substantial portion of the light downstream of the first optical element 504.
As in a Galilean telescope, a lens 522 having a focal length f2 is positioned such that the second optical element 520 is placed at a distance f2 away from the lens 522 (i.e., at the back focal plane of the lens 522). A reduced image of eye 502 is formed on a detector 524 (e.g., a CCD camera with sensor elements) at a distance f2 away from the lens 522, i.e., at the front focal plane of the lens 522. The reduction ratio depends on the focal lengths f1 and f2.
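Since the reduction ratio depends only on the two focal lengths, it can be sketched directly; the focal-length values in the example are assumptions for illustration, not values from this disclosure:

```python
def relay_magnification(f1: float, f2: float) -> float:
    """Lateral magnification of the two-element relay: the image on the
    detector is scaled by f2/f1, so f2 < f1 yields a reduced image that
    fits a small sensor."""
    return f2 / f1

# Hypothetical values: an f1 = 40 mm input element with an f2 = 10 mm
# imaging lens gives a 0.25x image, i.e. a 4:1 reduction of the eye.
```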
Fig. 5B shows an optical system 528, corresponding to optical system 500 without a waveguide, in a partially folded configuration. The first optical element 530 is a mirror having a focal length f1 (although depicted as planar), and the second optical element 532 is a planar mirror placed at a distance f1 away from the first optical element 530. All of the optical elements in Fig. 5B are located at the same distances as shown in Fig. 5A (e.g., the distance between the second optical element 532 and the lens 522 is f2, and the distance between the lens 522 and the detector 524 is also f2).
Fig. 5C shows a perspective view of an optical system 550 that includes a waveguide 552. Fig. 5D is a cross-sectional view of optical system 550 in the y-z plane. Eye 502 is represented by a window 553, and light beams from the x-y plane containing window 553 (including light beam 506) impinge on a first optical element 554 disposed on waveguide 552. The first optical element 554 couples in light rays of the beam having the first circular polarization. Light rays having a second circular polarization different from the first circular polarization (e.g., the first circular polarization is RCP and the second circular polarization is LCP, or the first circular polarization is LCP and the second circular polarization is RCP) are not directed by the first optical element 554 to undergo total internal reflection within the waveguide 552. As a result, light rays having the second circular polarization propagate away from the second optical element 556. For example, light of the second polarization is transmitted through waveguide 552, or is diffracted by first optical element 554 in a direction opposite to the diffraction direction of light having the first polarization. In some embodiments, light rays of the first polarization guided within the waveguide 552 travel upward in the y-direction, and a second optical element 556 disposed on the waveguide 552 couples out the light rays, directing them substantially in the z-direction. Lens 557 images the light onto detector 558. In some embodiments, detector 558 is placed vertically above the eye (and closer to the waveguide than the eye is). The first optical element 554 has a focusing power and reduces the size of the imaging beam from eye 502 by the time the guided light rays are coupled out of waveguide 552. In this way, the detector 558 can have a detection surface in the x-y plane that is smaller than the area of the window 553.
In some embodiments, the first optical element 554 is configured to receive input light and steer the input light by a first angle in a first direction parallel to the first optical element 554 and by a second angle in a second direction parallel to the first optical element 554 and perpendicular to the first direction. In some embodiments, the first angle is different from the second angle. For example, in some embodiments, the first optical element 554 steers the input light by a first angle (e.g., less than 10 degrees, less than 5 degrees, less than 3 degrees, less than 2 degrees, less than 1 degree) toward the y-direction and by a second angle (e.g., less than 6 degrees, less than 3 degrees, less than 2 degrees, less than 1 degree, less than 0.5 degrees) toward the x-direction. In some embodiments, the second angle is less than the first angle.
Fig. 5E shows the distortion in the x-y plane observed at detector 558 for light rays incident on first optical element 554 at an angle of incidence within ±0.1° of the surface normal of first optical element 554. Distortion is a form of optical aberration: a deviation from rectilinear projection, in which straight lines in the object remain straight in the image. To determine the magnitude of the distortion, input (incident) light rays forming a checkerboard pattern that is mirror-symmetric along the x-axis and along the y-axis are propagated through the optical system, and deviations from the checkerboard input image reveal the degree of distortion within the optical system.
The pattern 580 recorded by detector 558 shows that the width of the entire pattern in the x-direction is smaller for light rays in the positive y-direction. The bottom of the checkerboard pattern also has some curvature (e.g., along the x-axis for smaller y-coordinate values). This distortion prevents the formation of an accurate image of the window 553 at the detector 558. In some embodiments, a corrective optical element reduces (e.g., eliminates) the distortion. In some embodiments, the distortion is first determined and then used to calibrate the optical system. In some embodiments, a correction algorithm processes the image detected at detector 558 to reduce (e.g., eliminate) the distortion by computationally compensating for those errors.
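The calibration step described above amounts to comparing ideal and recorded checkerboard corner positions; a minimal sketch, where the corner coordinates are made-up illustration data:

```python
import math

def rms_distortion(ideal_pts, measured_pts):
    """RMS deviation between ideal checkerboard corners and their recorded
    positions: a single scalar figure of merit for distortion."""
    n = len(ideal_pts)
    total = sum((xi - xm) ** 2 + (yi - ym) ** 2
                for (xi, yi), (xm, ym) in zip(ideal_pts, measured_pts))
    return math.sqrt(total / n)

# Hypothetical corners: the recorded grid is squeezed in x at large y,
# mimicking the narrowing seen in pattern 580.
ideal = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
measured = [(0.0, 0.0), (1.0, 0.0), (0.05, 1.0), (0.95, 1.0)]
```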
Fig. 7A shows an optical system 700. The light beam from the eye 702 impinges on a first optical element 704 disposed on the front surface of a waveguide 706. The light rays guided within the waveguide 706 are shown in an "unfolded" configuration: the total internal reflections of the light rays guided within the waveguide 706 are not depicted in Fig. 7A, and the propagation distance of the light rays within the waveguide sets the thickness of the waveguide 706. The guided light is coupled out of the waveguide 706 by a second optical element 710. The upper-right inset shows an enlarged view of the guided light propagating near one end of the waveguide. The second optical element 710 corrects various aberrations of the light rays, focuses the light rays (e.g., separated by their wavelength) near its output interface (e.g., as in a Keplerian telescope), and couples out the light rays to propagate along the z-axis. The imaging lens 712 images the light onto the detector 714.
Fig. 7B depicts the optical system 700 of fig. 7A in a folded configuration, showing multiple total internal reflections of guided light within the waveguide 706. The first optical element 704 couples light rays (from the eye 702) having a particular circular polarization into the waveguide 706. Light rays having a different circular polarization are either transmitted through the waveguide 706 or diffracted/refracted into a different direction than light rays of a particular circular polarization. As a result, light rays that do not have a particular circular polarization propagate away from the second optical element 710. The first optical element 704 is designed to respond to light rays having a particular circular polarization. In some embodiments, the first optical element 704 is designed to diffract LCP light into the +1 diffraction order and RCP light into the-1 diffraction order. In some embodiments, the first optical element 704 is designed to diffract RCP light into the +1 diffraction order and LCP light into the-1 diffraction order. In some embodiments, the first optical element 704 is designed to diffract LCP light into the +1 diffraction order, while RCP light is transmitted through the first optical element 704 (e.g., the first optical element 704 does not cause diffraction of RCP light). In some embodiments, the first optical element 704 is designed to diffract RCP light into the +1 diffraction order, while LCP light is transmitted through the first optical element 704 (e.g., the first optical element 704 does not cause diffraction of LCP light).
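The four handedness-to-order design variants enumerated above can be expressed as a small lookup; a sketch in which the encoding (order 0 meaning "transmitted undiffracted") is an assumption of this illustration, not notation from the disclosure:

```python
def diffraction_order(polarization: str, lcp_to_plus1: bool = True,
                      other_transmitted: bool = False) -> int:
    """Diffraction order assigned by a polarization-selective grating.
    With lcp_to_plus1=True, LCP goes to +1 and RCP to -1 (or to order 0,
    i.e. transmitted undiffracted, when other_transmitted is True)."""
    if polarization not in ("LCP", "RCP"):
        raise ValueError("polarization must be 'LCP' or 'RCP'")
    selected = "LCP" if lcp_to_plus1 else "RCP"
    if polarization == selected:
        return +1
    return 0 if other_transmitted else -1
```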
The light is coupled out of the waveguide 706 by the second optical element 710. In some cases, some light rays 720 are not coupled out by the second optical element 710 in a direction toward the imaging lens 712, but instead leak out of the waveguide 706 because they no longer satisfy the total internal reflection condition after interacting with the second optical element 710. The light directed to lens 712 is imaged by lens 712 onto an image plane on detector 714. Some light rays 722 continue to be totally internally reflected within the waveguide and pass beyond the second optical element 710.
Fig. 7C shows a pattern 730 indicating the amount of distortion in the x-y plane. The input (incident) light rays form a checkerboard pattern that is mirror-symmetric along the x-axis and along the y-axis and are transmitted through the optical system to determine how much the checkerboard pattern is deformed after exiting the optical system. After light exits the optical system 700, the width of the overall pattern in the x-direction and in the y-direction remains substantially constant (e.g., less than 10% change, less than 5% change, less than 1% change). Pattern 730 shows some curvature at the top of the checkerboard (i.e., along the x-axis for the largest y-coordinate values). In some embodiments, distortion prevents the formation of an accurate image of the window 553 at the detector 714. Here, the optical system 700 corrects distortion, allowing the chief rays to be imaged with reduced distortion. In some embodiments, additional corrective optics further reduce (e.g., eliminate) distortion errors.
In addition to the relay systems shown in Figs. 5A-5D and the relay systems shown in Figs. 6A and 6B (e.g., Galilean and Keplerian telescopes), other relay systems may be used. In some embodiments, the imaging system includes a single relay system (e.g., Figs. 5A-5D). In some embodiments, the imaging system includes two relay systems (e.g., Figs. 6A and 6B). In some embodiments, the optical waveguide along which light of a particular circular polarization is guided also includes a mid-field lens to reduce the magnification of the light beam (which includes the first light ray) coupled out of the waveguide. The mid-field lens is a lens placed at a position conjugate to the image plane of the optical system (e.g., the plane of the detector).
In some embodiments, the optical system comprises an Offner telescope. Fig. 8A illustrates an Offner telescope 800 in accordance with some embodiments. The Offner telescope 800 comprises three reflective surfaces 802, 804, and 806. Light exiting the object 808 is focused by the first reflective surface 802 onto the second reflective surface 804. Light rays exiting the object 808 at a first angle are focused onto a first location on the second reflective surface 804. Light rays exiting the object 808 at a second angle are focused onto a second location on the second reflective surface 804, different from the first location.
The focused light rays are reflected by the second reflective surface 804, diverge, and reflect off the third reflective surface 806, which then images the light onto image plane 810.
In some embodiments, the center of curvature of the first reflective surface 802 and the center of curvature of the third reflective surface 806 coincide with the second reflective surface 804. In some embodiments, the optical system 800 provides a reduction (i.e., the image at image plane 810 is smaller than the object 808). In some embodiments, the optical system 800 provides a magnification (i.e., the image at image plane 810 is larger than the object 808).
In some embodiments, the optical system 800 is configured as an afocal optical system. An afocal system (i.e., a system without a focal point) is an optical system that produces no net convergence or divergence of the light beam (e.g., it has an effective focal length of infinity). An optical system providing afocal magnification can also correct Petzval field curvature. Such curvature occurs when image points near the optical axis are in sharp focus but off-axis rays come to focus in front of the image sensor. The optical system 800 corrects Petzval field curvature because the curvature of the second reflective surface 804 (e.g., a diverging, convex mirror) is opposite in sign to the curvature of the first reflective surface 802 and the third reflective surface 806 (e.g., converging, concave mirrors). Off-axis rays reflect off the convex mirror in a manner opposite to the concave mirrors, reducing (e.g., canceling) the Petzval field curvature caused by the first and third reflective surfaces.
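The sign cancellation described here can be checked with the mirror Petzval sum; a hedged sketch, where the alternating-sign convention and the radius values are illustrative assumptions rather than numbers from this disclosure:

```python
def petzval_sum(radii):
    """Petzval sum for a train of mirrors, with successive reflections
    contributing with alternating sign: P = 1/R1 - 1/R2 + 1/R3 - ...
    A sum of zero means a flat field (no Petzval field curvature)."""
    return sum(((-1) ** i) / r for i, r in enumerate(radii))

# Concave-convex-concave geometry with R2 = R1 / 2 and R3 = R1 (radii in
# mm, illustrative): the convex mirror's opposite-sign term cancels the
# two concave mirrors exactly.
```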
Fig. 8B shows an optical system 820 that includes two intersecting elliptical surfaces 826 and 828. Light exiting the object 822 reflects off the first elliptical reflecting surface 826 before reflecting off the second elliptical reflecting surface 828. The second elliptical reflecting surface directs the light onto an image plane 824, forming an image of the object 822 at the image plane 824.
In some embodiments, the first optical element (e.g., 404, 452, 504, 530, 554, 608, 652, 704) and the second optical element (e.g., 406, 556, 654, 710) form elements of the Offner relay 800. In some embodiments, the first optical element (e.g., 404, 452, 504, 530, 554, 608, 652, 704) and the second optical element (e.g., 406, 556, 654, 710) form elements of the crossed-ellipse relay 820.
In some embodiments, the first optical element is made of a material that causes diffraction similar to the optical reflection effect of the first reflective surface 802. In some embodiments, the second optical element is made of a material that causes diffraction similar to the optical reflection effect of the third reflective surface 806. In such embodiments, the reflective surface 804 is provided by another optical element inside or outside the waveguide.
In some embodiments, the Offner relay 800 is disposed downstream of the waveguide. In some embodiments, the crossed-ellipse relay 820 is disposed downstream of the waveguide.
In some embodiments, the first optical element is made of a material that causes diffraction similar to the optical reflection effect of the first elliptical reflecting surface 826. In some embodiments, the second optical element is made of a material that causes diffraction similar to the optical reflection effect of the second elliptical reflecting surface 828.
In light of these principles, we now turn to certain embodiments.
According to some embodiments, an optical system includes an optical waveguide and a first optical element configured to direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward a second optical element via total internal reflection. The first optical element is further configured to direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at a first angle of incidence such that the second light ray propagates away from the second optical element (e.g., the second light ray does not propagate through the optical waveguide via total internal reflection but passes through the optical waveguide, or the second light ray is directed away from the second optical element even if the second light ray propagates through the optical waveguide via total internal reflection). The second optical element is configured to direct the first light propagating through the optical waveguide to the detector.
In some embodiments, directing the light includes changing the direction of the light (e.g., by reflection, refraction, and/or diffraction, etc.). In some embodiments, directing the light includes not changing the direction of the light (e.g., directing the light includes allowing the light to pass through the optical element without changing the direction of the light).
In some embodiments, the first optical element comprises an element selected from the group consisting of: a polarization volume holographic element and a geometric phase lens. In some embodiments, the optical system includes a polarization volume holographic element and/or a geometric phase lens. In some embodiments, the geometric phase lens is an off-center geometric phase lens.
In some embodiments, the first optical element is configured to direct near infrared light impinging on the first optical element at a first angle of incidence in a first direction and direct visible light impinging on the first optical element at the first angle of incidence in a direction different from the first direction (e.g., allowing visible light to pass through the first optical element without changing the direction of the visible light). In some embodiments, the first light has a wavelength greater than 850 nm. In some embodiments, the first optical element is configured to transmit visible light (without changing the direction of visible light). In some embodiments, the first optical element is configured to transmit a third light ray having a wavelength less than 800nm such that the third light ray propagates away from the second optical element (e.g., the third light ray does not propagate through the optical waveguide via total internal reflection, but instead passes through the optical waveguide).
In some embodiments, the first optical element has a first diffraction efficiency for near infrared light and a second diffraction efficiency for visible light, and the first diffraction efficiency is greater than the second diffraction efficiency (e.g., the first diffraction efficiency is 90% or greater and the second diffraction efficiency is 10% or less). In some embodiments, the first optical element has a diffraction efficiency for wavelengths greater than 850nm that is higher than a diffraction efficiency for wavelengths less than 800 nm.
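The combined wavelength and polarization selectivity described in the preceding paragraphs reduces to a simple gate; a sketch in which the 850 nm threshold follows the text, while the default selected handedness is an assumption:

```python
def is_coupled_into_waveguide(wavelength_nm: float, polarization: str,
                              selected_polarization: str = "LCP") -> bool:
    """Only near-infrared light (here, wavelength > 850 nm) with the
    selected circular polarization is diffracted efficiently and guided;
    visible light and the other handedness pass through the element."""
    return wavelength_nm > 850.0 and polarization == selected_polarization
```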
In some embodiments, the first optical element is disposed on the first surface of the optical waveguide such that the first light impinges on the optical waveguide after impinging on the first optical element. In some embodiments, the first optical element is located between the object and the optical waveguide.
In some embodiments, the optical system includes a detector (e.g., fig. 5C). In some embodiments, the first light rays impinging on the first optical element comprise imaging light from the object, and the optical system is configured to project the imaging light onto the detector.
In some embodiments, the optical system further comprises an imaging telescope that is different from the combination of the optical waveguide, the first optical element, and the second optical element. In some embodiments, the imaging telescope is configured to receive imaging light from the second optical element and form an image of the object on the detector. In some embodiments, the detector comprises a camera. In some embodiments, the camera and the object are located on the same side of the optical waveguide. In some embodiments, the camera is positioned above the object. In some embodiments, the camera is located below the object. In some embodiments, the optical system is configured to reduce aberrations (e.g., chromatic aberration, distortion, etc.) recorded by the detector (e.g., fig. 6A, 6B, 7A, 7B, and 7C).
In some embodiments, directing the first light includes causing reflection and diffraction of the first light. In some embodiments, the first optical element is disposed on the second surface of the optical waveguide such that the first light impinges on the optical waveguide before impinging on the first optical element. In some embodiments, the optical waveguide receives a first light ray on a first surface of the optical waveguide, and the first light ray that has passed through the first surface of the optical waveguide is reflected by a first optical element located on a second surface of the optical waveguide. In some embodiments, the optical waveguide is located between the object and the first optical element.
In some embodiments, the second optical element includes one of: a polarization volume holographic element, a geometric phase lens (e.g., an off-center geometric phase lens), an output mirror, or an output grating.
In some embodiments, the first light ray forms a portion of a light beam that is relayed to the second optical element at a reduced magnification. In some embodiments, the optical waveguide further includes a mid-field lens to reduce the magnification of the light beam. In some embodiments, the intermediate field lens is disposed on a surface of the optical waveguide. In some embodiments, the intermediate field lens is embedded in the optical waveguide.
In some embodiments, the first optical element includes a coating that provides a focusing power (e.g., the first optical element is a thin film optic having a power).
In some embodiments, the first optical element and the second optical element form an off-axis Galilean telescope (e.g., Fig. 6B). In some embodiments, the first optical element is a positive lens and the second optical element is a negative lens.
In some embodiments, the first optical element and the second optical element (together) form an Offner telescope (e.g., Fig. 8A). In some embodiments, the Offner telescope comprises three reflective surfaces, the second of which is located at an intermediate image plane of the optical system (e.g., Fig. 8A).
In some embodiments, the optical system comprises an off-axis Galilean telescope (e.g., an off-axis Galilean telescope separate from the first optical element and the second optical element). In some embodiments, the off-axis Galilean telescope receives light exiting the optical waveguide (e.g., Figs. 5C and 5D) and images it onto a detector. In some embodiments, the optical system further comprises an optical relay system to image the output of the off-axis Galilean telescope onto the detector. In some embodiments, the off-axis Galilean telescope includes a converging lens and a diverging lens, both of which are decentered. The diverging lens is configured to reduce aberrations associated with the converging lens. In some embodiments, the aberrations include distortion. In some embodiments, the aberrations include chromatic aberration.
In some embodiments, the optical system includes a fourth optical element (e.g., fig. 7A, 7B, and 7C) that corrects distortion. In some embodiments, the fourth optical element comprises a coating.
In some embodiments, the optical system further comprises two off-axis reflective elliptical surfaces (e.g., fig. 8B). In some embodiments, the first optical element comprises one of two off-axis reflective elliptical surfaces.
According to some embodiments, an imaging system includes an optical system and a detector configured to receive an image of an object from the optical system.
In some embodiments, the object comprises an eye, the detector comprises a camera, the camera is located outside of a field of view of the eye, and the first optical element is located in front of the eye to allow the camera to image a direct view of the eye.
In some embodiments, the imaging system is included in a headset (e.g., the imaging system operates as part of an eye tracking unit of the headset).
According to some embodiments, a method for relaying an image of an eye includes receiving light from an eye of a user at a first optical element. The first optical element is configured to direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide towards the second optical element via total internal reflection. The first optical element is also configured to direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at a first angle of incidence such that the second light ray propagates away from the second optical element. The method includes directing the first light from the optical waveguide to the detector with a second optical element.
In some embodiments, the method further comprises projecting the first light ray onto the detector to form an image, and determining the position of the pupil of the user's eye from the image. In some embodiments, imaging the first light onto the camera includes sending the first light coupled out of the waveguide into an optical relay system, and the camera is located at an image plane of the optical system (e.g., Figs. 5A and 5B). In some embodiments, the optical relay system comprises an off-axis Galilean telescope, an off-axis Keplerian telescope, an Offner telescope, and/or two off-axis elliptical surfaces.
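Determining the pupil position from the detected image can be sketched as a dark-pixel centroid; the threshold and grayscale format are assumptions for illustration, and practical eye trackers use more robust ellipse-fitting methods:

```python
def pupil_center(image, threshold=40):
    """Estimate the pupil position as the centroid of dark pixels.

    `image` is a 2D list of grayscale values (0-255); the pupil appears
    darker than iris and sclera, so pixels below `threshold` are treated
    as pupil candidates. Returns (row, col), or None if nothing is dark."""
    row_sum = col_sum = count = 0
    for r, line in enumerate(image):
        for c, value in enumerate(line):
            if value < threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)
```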
Although the various figures illustrate the operation of a particular component or group of components with respect to one eye, one of ordinary skill in the art will appreciate that similar operations may be performed with respect to the other or both eyes. Such details are not repeated herein for the sake of brevity.
Although several of the logic stages are shown in a particular order in some of the different figures, the stages that are not order dependent may be reordered and other stages may be combined or broken down. Although some reordering or other groupings are specifically mentioned, other reordering or groupings will be apparent to those of ordinary skill in the art, and thus the ordering and groupings presented herein are not an exhaustive list of alternatives. Furthermore, it should be appreciated that these stages may be implemented in hardware, firmware, software, or any combination thereof.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the scope of the claims to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen in order to best explain the principles of the claims and their practical application to thereby enable others skilled in the art to best utilize the embodiments with various modifications as are suited to the particular use contemplated.

Claims (20)

1. An optical system, comprising:
an optical waveguide;
a first optical element configured to: direct a first light ray from an eye in a second direction, the first light ray having a first circular polarization and impinging on the first optical element in a first direction, and direct a second light ray from the eye in a third direction different from the second direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element in the first direction, such that one of the first light ray or the second light ray propagates through the optical waveguide via total internal reflection and the other of the first light ray or the second light ray is transmitted through the optical waveguide; and
a second optical element positioned to manipulate light propagating through the optical waveguide so as to direct the light out of the optical waveguide.
2. The optical system of claim 1, wherein the first optical element comprises an element selected from the group consisting of a polarization volume holographic element and a geometric phase lens.
3. The optical system of claim 1, wherein the first light has a wavelength greater than 850 nm.
4. The optical system of claim 3, wherein the first optical element is configured to transmit a third light ray having a wavelength less than 800nm such that the third light ray propagates away from the second optical element.
5. The optical system of claim 1, wherein the first optical element is disposed on a first surface of the optical waveguide such that the first light impinges on the optical waveguide after impinging on the first optical element.
6. The optical system of claim 1, further comprising a detector, wherein the first light rays impinging on the first optical element comprise imaging light from a subject, and the optical system is configured to project the imaging light onto the detector.
7. The optical system of claim 1, wherein directing the first light ray comprises causing reflection and diffraction of the first light ray.
8. The optical system of claim 7, wherein the first optical element is disposed on the second surface of the optical waveguide such that the first light impinges on the optical waveguide before impinging on the first optical element.
9. The optical system of claim 1, wherein the second optical element comprises an element selected from the group consisting of a polarization volume holographic element, a geometric phase lens, an output mirror, and an output grating.
10. The optical system of claim 1, wherein the first light ray forms a portion of a light beam relayed to the second optical element at a reduced magnification.
11. The optical system of claim 10, wherein the optical waveguide further comprises a mid-field lens to reduce the magnification of the light beam.
12. The optical system of claim 1, wherein the first optical element further comprises a coating that provides a focusing power.
13. The optical system of claim 1, wherein the first optical element and the second optical element form an off-axis Galilean telescope.
14. The optical system of claim 1, wherein the first optical element and the second optical element form an Offner telescope.
15. The optical system of claim 14, wherein the Offner telescope comprises three reflective surfaces, a second of the three reflective surfaces being located at an intermediate image plane of the optical system.
16. An imaging system, comprising:
the optical system of claim 1; and
a detector configured to receive an image of an object from the optical system.
17. The imaging system of claim 16, wherein the object comprises an eye, the detector comprises a camera, the camera is located outside of a field of view of the eye, and the first optical element is located in front of the eye to allow the camera to image a direct view of the eye.
18. The imaging system of claim 16, wherein the imaging system is included in a headset.
19. A method for relaying an image of a user's eye, the method comprising:
at a first optical element, receiving a first light ray from an eye and directing the first light ray in a second direction, the first light ray having a first circular polarization and impinging on the first optical element in a first direction;
at the first optical element, receiving a second light ray from the eye and directing the second light ray to a third direction different from the second direction such that one of the first light ray or the second light ray propagates through an optical waveguide via total internal reflection and the other of the first light ray or the second light ray is transmitted through the optical waveguide, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element in the first direction; and
manipulating, with a second optical element, the light propagating through the optical waveguide to eject the light from the optical waveguide.
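The total internal reflection condition invoked in claim 19 depends only on the refractive indices at the waveguide boundary. As a quick worked example with an illustrative index (the patent does not specify the waveguide material), light inside a glass-like waveguide of n ≈ 1.5 surrounded by air is trapped once it strikes the surface beyond the critical angle:

```python
import math

def critical_angle_deg(n_core: float, n_clad: float = 1.0) -> float:
    # Snell's law at grazing exit: sin(theta_c) = n_clad / n_core.
    return math.degrees(math.asin(n_clad / n_core))

theta_c = critical_angle_deg(1.5)  # about 41.8 degrees for glass in air
```

Rays that the first optical element redirects to angles steeper than theta_c propagate down the guide by repeated reflection, while the other circular polarization, left near normal incidence, simply passes through.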
20. The method of claim 19, further comprising:
projecting the manipulated light onto a detector to form an image; and
determining, from the image, a position of a pupil of the user's eye.
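The claims end at determining the pupil position from the image without specifying an algorithm. As a hedged illustration of that last step only, the sketch below models the pupil as the darkest region of a normalized grayscale eye image and locates it by thresholding and taking the centroid of the dark pixels; real trackers use more robust methods (e.g. glint rejection and ellipse fitting):

```python
import numpy as np

def pupil_position(image: np.ndarray, threshold: float = 0.2):
    """Return (x, y) centroid of pixels darker than `threshold`, or None."""
    ys, xs = np.nonzero(image < threshold)  # dark pixels = candidate pupil
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic eye image: bright background with a dark disc centered at (40, 25).
yy, xx = np.mgrid[0:64, 0:64]
img = np.ones((64, 64))
img[(xx - 40) ** 2 + (yy - 25) ** 2 < 8 ** 2] = 0.0
cx, cy = pupil_position(img)
```

By symmetry the centroid recovers the disc center, so on this synthetic frame the estimate lands at approximately (40, 25).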
CN202211554573.XA 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging Pending CN116009243A (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201862673805P 2018-05-18 2018-05-18
US62/673,805 2018-05-18
US201962804136P 2019-02-11 2019-02-11
US62/804,136 2019-02-11
US16/359,117 2019-03-20
US16/359,117 US11256086B2 (en) 2018-05-18 2019-03-20 Eye tracking based on waveguide imaging
CN201980033381.4A CN112136074B (en) 2018-05-18 2019-04-22 Waveguide imaging based eye tracking
PCT/US2019/028452 WO2019221875A1 (en) 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201980033381.4A Division CN112136074B (en) 2018-05-18 2019-04-22 Waveguide imaging based eye tracking

Publications (1)

Publication Number Publication Date
CN116009243A true CN116009243A (en) 2023-04-25

Family

ID=68540713

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211554573.XA Pending CN116009243A (en) 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging
CN201980033381.4A Active CN112136074B (en) 2018-05-18 2019-04-22 Waveguide imaging based eye tracking

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201980033381.4A Active CN112136074B (en) 2018-05-18 2019-04-22 Waveguide imaging based eye tracking

Country Status (3)

Country Link
EP (1) EP3794396A4 (en)
CN (2) CN116009243A (en)
WO (1) WO2019221875A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11256086B2 (en) 2018-05-18 2022-02-22 Facebook Technologies, Llc Eye tracking based on waveguide imaging
CN114690414A (en) * 2020-12-30 2022-07-01 舜宇光学(浙江)研究院有限公司 Waveguide-based augmented reality device and method thereof

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007021036A1 (en) * 2007-05-04 2008-11-06 Carl Zeiss Ag Display device and display method for binocular display of a multicolor image
US9134534B2 (en) * 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9001030B2 (en) * 2012-02-15 2015-04-07 Google Inc. Heads up display
WO2013167864A1 (en) * 2012-05-11 2013-11-14 Milan Momcilo Popovich Apparatus for eye tracking
EP3120071A1 (en) * 2014-03-20 2017-01-25 CSEM Centre Suisse d'Electronique et de Microtechnique SA - Recherche et Développement Imaging system
US9377623B2 (en) * 2014-08-11 2016-06-28 Microsoft Technology Licensing, Llc Waveguide eye tracking employing volume Bragg grating
US20170038591A1 (en) * 2015-08-03 2017-02-09 Oculus Vr, Llc Display with a Tunable Pinhole Array for Augmented Reality
US10983340B2 (en) * 2016-02-04 2021-04-20 Digilens Inc. Holographic waveguide optical tracker
US10254542B2 (en) * 2016-11-01 2019-04-09 Microsoft Technology Licensing, Llc Holographic projector for a waveguide display

Also Published As

Publication number Publication date
EP3794396A4 (en) 2021-07-28
WO2019221875A1 (en) 2019-11-21
CN112136074A (en) 2020-12-25
CN112136074B (en) 2022-12-09
EP3794396A1 (en) 2021-03-24

Similar Documents

Publication Publication Date Title
US11933975B2 (en) Eye tracking based on waveguide imaging
US20220163808A1 (en) Optical assembly with polarization volume holographic element
EP3729173B1 (en) Integrated augmented reality head-mounted display for pupil steering
EP3721286B1 (en) Compact multi-color beam combiner using a geometric phase lens
US10600352B1 (en) Display device with a switchable window and see-through pancake lens assembly
US11624922B2 (en) Optical assemblies having polarization volume gratings for projecting augmented reality content
CN113454515A (en) Holographic illuminator in field
US10942320B2 (en) Dispersion compensation for light coupling through slanted facet of optical waveguide
US10969675B1 (en) Optical assemblies having scanning reflectors for projecting augmented reality content
CN113454504B (en) Holographic pattern generation for Head Mounted Display (HMD) eye tracking using diffractive optical elements
US11914162B1 (en) Display devices with wavelength-dependent reflectors for eye tracking
CN112136074B (en) Waveguide imaging based eye tracking
US11366298B1 (en) Eye tracking based on telecentric polarization sensitive grating
US10955675B1 (en) Variable resolution display device with switchable window and see-through pancake lens assembly
US12025795B1 (en) Wedge combiner for eye-tracking
US11579425B1 (en) Narrow-band peripheral see-through pancake lens assembly and display device with same
US11586024B1 (en) Peripheral see-through pancake lens assembly and display device with same
US11726326B1 (en) Wedge light guide
US20230168506A1 (en) High efficiency optical assembly with folded optical path

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination