CN112136074B - Waveguide imaging based eye tracking

Info

Publication number
CN112136074B
CN112136074B (application CN201980033381.4A)
Authority
CN
China
Prior art keywords
optical element
light
optical
eye
light ray
Prior art date
Legal status
Active
Application number
CN201980033381.4A
Other languages
Chinese (zh)
Other versions
CN112136074A (en)
Inventor
巴巴克·埃米尔苏来马尼
帕西·萨里科
耿莹
优素福·尼奥尼·巴克萨姆·苏莱
斯科特·查尔斯·麦克尔道尼
Current Assignee
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date
Filing date
Publication date
Priority claimed from US16/359,117 (published as US11256086B2)
Application filed by Meta Platforms Technologies LLC
Priority to CN202211554573.XA (published as CN116009243A)
Publication of CN112136074A
Application granted
Publication of CN112136074B

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/28Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
    • G02B27/283Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising used for beam splitting or combining
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

An optical system includes an optical waveguide and a first optical element configured to direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward a second optical element via total internal reflection. The first optical element is also configured to direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence such that the second light ray propagates away from the second optical element. The second optical element is configured to direct the first light propagating through the optical waveguide to the detector.

Description

Waveguide imaging based eye tracking
Technical Field
The present disclosure relates generally to display devices, and more particularly to head mounted display devices.
Background
Head mounted display devices (also referred to herein as head mounted displays) are becoming increasingly popular as a means of providing visual information to users. For example, head mounted display devices are used for virtual reality and augmented reality operations. Eye tracking allows the head-mounted display device to determine the user's gaze and provide visual information based on the user's gaze direction.
Summary
Therefore, there is a need for a compact and lightweight eye tracking system in a head mounted display device.
The systems and methods disclosed in this specification use waveguides and polarization-dependent optical elements (e.g., polarization volume holographic elements, geometric phase lenses, etc.) to address the above-referenced and other technical challenges. The polarization-dependent optical element manipulates light having a particular polarization (e.g., right-handed circular polarization) and couples the light into the waveguide such that the light is directed within the waveguide to an off-axis position where the light is coupled out of the waveguide. This allows the optical elements and the detector to be placed at a location (e.g., an off-axis location) away from the user's line of sight. In addition, the waveguide and the polarization-dependent optical elements form a telescopic configuration, thereby providing a reduced image (of the eye, for example) and allowing the use of smaller (and lighter) detectors in the eye tracking system. Furthermore, by utilizing polarization-dependent optical elements, the detector in the eye tracking system receives light with a particular polarization (and does not receive light with a different polarization, or receives it only at reduced intensity), which in turn reduces noise in the received light and improves the performance of the eye tracking system. In some embodiments, the waveguide and the polarization-dependent optical elements are wavelength specific, thus allowing transmission of visible light and making the eye tracking system compatible with augmented reality operations.
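To make the in-coupling behavior described above concrete, the following minimal Python sketch models the routing decision performed by the polarization-dependent element; the handedness labels, wavelength cutoff, and return strings are illustrative assumptions and not values or terminology taken from this disclosure.

    # Minimal sketch of the polarization- and wavelength-selective routing performed
    # by the in-coupling element. All names and numeric thresholds are illustrative
    # assumptions, not values taken from the patent.
    from dataclasses import dataclass

    @dataclass
    class Ray:
        wavelength_nm: float      # e.g., 940 for IR eye-tracking light, 550 for visible
        handedness: str           # "RCP" or "LCP"

    def route(ray: Ray, ir_cutoff_nm: float = 850.0) -> str:
        """Return where the in-coupling element sends the ray."""
        if ray.wavelength_nm < ir_cutoff_nm:
            # Visible display/ambient light is transmitted, keeping AR see-through.
            return "transmitted toward the eye-side world"
        if ray.handedness == "RCP":
            # The 'working' circular polarization is diffracted into the waveguide
            # at an angle above the critical angle, so it propagates by TIR.
            return "coupled into waveguide (TIR toward out-coupler)"
        # The orthogonal circular polarization is sent away from the out-coupler.
        return "directed away from the out-coupler"

    print(route(Ray(940, "RCP")))   # coupled into waveguide (TIR toward out-coupler)
    print(route(Ray(940, "LCP")))   # directed away from the out-coupler
    print(route(Ray(550, "RCP")))   # transmitted toward the eye-side world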
According to some embodiments, an optical system includes an optical waveguide and a first optical element configured to: (i) direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence, such that the first light ray propagates through the optical waveguide toward a second optical element via total internal reflection, and (ii) direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence, such that the second light ray propagates away from the second optical element. The second optical element is configured to direct the first light ray propagating through the optical waveguide to a detector.
According to some embodiments, a method for relaying an eye image includes receiving light from a user's eye at a first optical element, wherein the first optical element is configured to: (i) direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence, such that the first light ray propagates through an optical waveguide toward a second optical element via total internal reflection, and (ii) direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence, such that the second light ray propagates away from the second optical element. The method includes directing the first light ray from the optical waveguide to a detector with the second optical element.
Embodiments according to the invention are disclosed in particular in the appended claims relating to optical systems, imaging systems and methods, wherein any feature mentioned in one claim category (e.g. optical systems) may also be claimed in another claim category (e.g. imaging systems, methods, storage media, systems and computer program products). The dependencies or back-references in the appended claims are chosen for formal reasons only. However, any subject matter resulting from an intentional back-reference (especially multiple references) to any preceding claim may also be claimed, such that any combination of a claim and its features is disclosed and may be claimed, irrespective of the dependencies chosen in the appended claims. The subject matter which may be claimed comprises not only the combination of features as set out in the appended claims, but also any other combination of features in the claims, wherein each feature mentioned in the claims may be combined with any other feature or combination of other features in the claims. Furthermore, any of the embodiments and features described or depicted herein may be claimed in separate claims and/or in any combination with any of the embodiments or features described or depicted herein or in any combination with any of the features of the appended claims.
In an embodiment, an optical system may include:
an optical waveguide, and
a first optical element configured to:
i) direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward the second optical element via total internal reflection, and
ii) direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence such that the second light ray propagates away from the second optical element,
wherein the second optical element is configured to direct the first light ray propagating through the optical waveguide to the detector.
The first optical element may comprise an element selected from the group consisting of a polarization volume holographic element and a geometric phase lens.
The first light may have a wavelength greater than 850 nm.
The first optical element may be configured to transmit third light having a wavelength of less than 800nm such that the third light propagates away from the second optical element.
The first optical element may be disposed on the first surface of the optical waveguide such that the first light is irradiated onto the optical waveguide after being irradiated onto the first optical element.
In an embodiment, an optical system may include a detector, wherein the first light rays impinging on the first optical element may include imaging light from the object, and the optical system may be configured to project the imaging light onto the detector.
Directing the first light ray may include causing reflection and diffraction of the first light ray.
The first optical element may be disposed on the second surface of the optical waveguide such that the first light impinges on the optical waveguide before impinging on the first optical element.
The second optical element may comprise an element selected from the group consisting of a polarization volume holographic element, a geometric phase lens, an output mirror, and an output grating.
The first light ray may form part of a light beam that is relayed to the second optical element with reduced magnification.
The optical waveguide may include intermediate field lenses to reduce the magnification of the light beam.
The first optical element may include a coating that provides a focusing power.
The first optical element and the second optical element may form an off-axis Galilean telescope.
The first optical element and the second optical element may form an Offner telescope.
The Offner telescope may include three reflective surfaces, the second of which is located at an intermediate image plane of the optical system.
In an embodiment, an imaging system may include:
the optical system of any embodiment herein; and
a detector configured to receive an image of the object from the optical system.
The object may include an eye, the detector may include a camera, the camera may be located outside a field of view of the eye, and the first optical element may be located in front of the eye to allow the camera to image a direct view of the eye.
The imaging system may be included in a headset.
In an embodiment, a method for relaying an eye image (in particular, with an optical system or imaging system according to any embodiment herein) may comprise:
receiving light from a user's eye at a first optical element, wherein the first optical element is configured to:
i) direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward the second optical element via total internal reflection, and
ii) direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence such that the second light ray propagates away from the second optical element; and
directing the first light ray from the optical waveguide to the detector with the second optical element.
In an embodiment, a method for relaying an eye image may further comprise:
projecting the first light ray onto a detector to form an image; and
determining a position of a pupil of the user's eye from the image.
In an embodiment, one or more computer-readable non-transitory storage media may embody software that is operable when executed to perform a method in an optical system or an imaging system according to any of the above-mentioned embodiments.
In an embodiment, a system may include: one or more processors; and at least one memory coupled to the processors and comprising processor-executable instructions, the processors being operable when executing the instructions to perform a method in an optical system or an imaging system according to any of the above-mentioned embodiments.
In an embodiment, a computer program product, preferably comprising a computer-readable non-transitory storage medium, is operable when executed on a data processing system to perform a method in an optical system or an imaging system according to any of the above-mentioned embodiments.
Brief Description of Drawings
For a better understanding of the various embodiments described, reference should be made to the following description of the embodiments taken in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout.
Fig. 1 is a perspective view of a display device according to some embodiments.
Fig. 2 is a block diagram of a system including a display device according to some embodiments.
FIG. 3 is an isometric view of a display device according to some embodiments.
Fig. 4A is an example optical system according to some embodiments.
Fig. 4B is an example optical system according to some embodiments.
Fig. 5A is an example optical system according to some embodiments.
Fig. 5B is an example optical system according to some embodiments.
Fig. 5C is an example optical system according to some embodiments.
Fig. 5D is an example optical system according to some embodiments.
Fig. 5E shows an example of distortion in an optical system.
FIG. 6A is an example of a paraxial optical system according to some embodiments.
FIG. 6B is an example of an off-axis optical system according to some embodiments.
Fig. 7A is an example optical system according to some embodiments.
Fig. 7B is an example optical system according to some embodiments.
Fig. 7C is an example of distortion in an optical system.
Fig. 8A illustrates an Offner relay optical system according to some embodiments.
FIG. 8B illustrates a crossed-ellipse relay optical system according to some embodiments.
Unless otherwise indicated, the figures are not drawn to scale.
Detailed Description
Reference will now be made to embodiments, examples of which are illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide an understanding of various described embodiments. It will be apparent, however, to one skilled in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail as not to unnecessarily obscure aspects of the embodiments.
It will be further understood that, although the terms first, second, etc. may be used herein to describe various elements in some cases, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first light projector may be referred to as a second light projector, and similarly, a second light projector may be referred to as a first light projector, without departing from the scope of the various described embodiments. The first light projector and the second light projector are both light projectors, but they are not the same light projector.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises", "comprising", "includes" and/or "including", when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The term "exemplary" is used herein in the sense of "serving as an example, instance, or illustration," and not "representing the best of its kind."
FIG. 1 illustrates a display device 100 according to some embodiments. In some embodiments, the display device 100 is configured to be worn on the head of a user (e.g., by having the form of eyeglasses, as shown in fig. 1), or included as part of a helmet to be worn by the user. When the display device 100 is configured to be worn on the head of a user or included as part of a helmet, the display device 100 is referred to as a head-mounted display. Alternatively, the display device 100 is configured for placement near one or both eyes of the user at a fixed location, rather than being head-mounted (e.g., the display device 100 is mounted in a vehicle (e.g., a car or airplane) for placement in front of one or both eyes of the user). As shown in fig. 1, the display device 100 includes a display 110. The display 110 is configured to present visual content (e.g., augmented reality content, virtual reality content, mixed reality content, or any combination thereof) to a user.
In some embodiments, display device 100 includes one or more components described herein with reference to fig. 2. In some embodiments, display device 100 includes additional components not shown in FIG. 2.
Fig. 2 is a block diagram of a system 200 according to some embodiments. The system 200 shown in fig. 2 includes a display device 205 (which corresponds to the display device 100 shown in fig. 1), an imaging device 235, and an input interface 240, each of which is coupled to the console 210. Although fig. 2 shows an example of system 200 including one display device 205, imaging device 235, and input interface 240, in other embodiments any number of these components may be included in system 200. For example, there may be multiple display devices 205, each display device 205 having an associated input interface 240 and being monitored by one or more imaging devices 235, where each display device 205, input interface 240, and imaging device 235 are in communication with the console 210. In alternative configurations, different and/or additional components may be included in system 200. For example, in some embodiments, console 210 is connected to system 200 via a network (e.g., the internet), or is self-contained (e.g., physically located inside display device 205) as part of display device 205. In some embodiments, the display device 205 is used to create mixed reality by adding a view of the real environment. Thus, the display device 205 and system 200 described herein may deliver augmented reality, virtual reality, and mixed reality.
In some embodiments, as shown in FIG. 1, display device 205 is a head mounted display that presents media to a user. Examples of media presented by display device 205 include one or more images, video, audio, or some combination thereof. In some embodiments, the audio is presented via an external device (e.g., speakers and/or headphones) that receives audio information from the display device 205, the console 210, or both and presents audio data based on the audio information. In some embodiments, the display device 205 immerses the user in the augmented environment.
In some embodiments, display device 205 also functions as an Augmented Reality (AR) headset. In these embodiments, the display device 205 augments the view of the physical real-world environment with computer-generated elements (e.g., images, videos, sounds, etc.). Further, in some embodiments, the display device 205 is capable of cycling between different types of operations. Thus, based on instructions from the application engine 255, the display device 205 operates as a Virtual Reality (VR) device, an Augmented Reality (AR) device, as glasses, or some combination thereof (e.g., glasses without optical correction, glasses that are optically corrected for the user, sunglasses, or some combination thereof).
The display device 205 includes an electronic display 215, one or more processors 216, an eye tracking module 217, an adjustment module 218, one or more positioners 220, one or more position sensors 225, one or more position cameras 222, a memory 228, an Inertial Measurement Unit (IMU) 230, one or more reflective elements 260, or a subset or superset thereof (e.g., a display device 205 having an electronic display 215, one or more processors 216, and memory 228 without any other listed components). Some embodiments of the display device 205 have modules different from those described herein. Similarly, functionality may be distributed among modules in a different manner than described herein.
One or more processors 216 (e.g., processing units or cores) execute instructions stored in memory 228. The memory 228 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 228, or alternatively a non-volatile memory device within the memory 228, includes non-transitory computer-readable storage media. In some embodiments, memory 228 or the computer-readable storage medium of memory 228 stores programs, modules and data structures, and/or instructions for displaying one or more images on electronic display 215.
The electronic display 215 displays images to the user according to data received from the console 210 and/or the processor 216. In various embodiments, electronic display 215 may include a single adjustable display element or multiple adjustable display elements (e.g., one display for each eye of the user). In some embodiments, the electronic display 215 is configured to display an image to a user by projecting the image onto one or more reflective elements 260.
In some embodiments, the display element includes one or more light emitting devices and a corresponding array of spatial light modulators. The spatial light modulator is an array of electro-optic pixels, some other array of devices that dynamically adjusts the amount of light transmitted by each device, or some combination thereof. The pixels are placed behind one or more lenses. In some embodiments, the spatial light modulator is a liquid crystal-based pixel array in an LCD (liquid crystal display). Examples of the light emitting devices include: organic light emitting diodes, active matrix organic light emitting diodes, some type of device that can be placed in a flexible display, or some combination thereof. The light emitting devices include devices capable of generating visible light (e.g., red, green, blue, etc.) for image generation. The spatial light modulator is configured to selectively attenuate individual light emitting devices, groups of light emitting devices, or some combination thereof. Alternatively, when the light emitting devices themselves are configured to be selectively attenuated, individually and/or in groups, the display element includes the array of light emitting devices without a separate emission intensity array. In some embodiments, the electronic display 215 projects an image onto one or more reflective elements 260, which reflect at least a portion of the light toward the user's eye.
One or more lenses direct light from the array of light emitting devices (optionally through an array of emission intensities) to a location within each viewing window (eyebox) and ultimately to the back of the user's retina. A viewing window is an area occupied by the eye of a user (e.g., a user wearing the display device 205) located near the display device 205 for viewing images from the display device 205. In some cases, the viewing window is represented as a 10 mm by 10 mm square. In some embodiments, the one or more lenses include one or more coatings, such as an anti-reflective coating.
In some embodiments, the display element includes an array of Infrared (IR) detectors that detect retro-reflected (retro-reflected) IR light from the retina of a viewing user, from the corneal surface, the lens of an eye, or some combination thereof. The IR detector array includes one IR sensor or a plurality of IR sensors, each of which corresponds to a different location of the pupil of the eye of the viewing user. In alternative embodiments, other eye tracking systems may be employed.
The eye tracking module 217 determines the location of each pupil of the user's eye. In some embodiments, the eye tracking module 217 instructs the electronic display 215 to illuminate the window with IR light (e.g., via IR emitting devices in the display elements).
A portion of the emitted IR light will pass through the pupil of the viewing user and reflect back from the retina towards the IR detector array used to determine the pupil location. Alternatively, the reflected light off the surface of the eye is also used to determine the location of the pupil. The array of IR detectors scans for retro-reflections and identifies which IR emitting devices are active when retro-reflections are detected. The eye tracking module 217 may use a tracking look-up table and the identified IR emitting devices to determine the pupil location of each eye. The tracking look-up table maps the signals received on the IR detector array to a location in each window (corresponding to the pupil location). In some embodiments, the tracking look-up table is generated via a calibration process (e.g., the user looks at various known reference points in the image, and the eye tracking module 217 maps the location of the user's pupil when looking at the reference points to the corresponding signals received on the IR tracking array). As noted above, in some embodiments, the system 200 may use other eye tracking systems in addition to the embedded IR eye tracking system described herein.
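As an illustration of the calibration-then-lookup scheme described above, the following Python sketch maps a detector signature (which IR emitters were active when a retro-reflection was detected) to a stored pupil position; the data types, fallback behavior, and sample values are assumptions made for illustration only, not details from this disclosure.

    # Illustrative sketch of a tracking look-up table for pupil position.
    # The detector "signature" (which IR emitters produced a detected retro-reflection)
    # is mapped to a pupil location learned during calibration. All names and numbers
    # are hypothetical.
    from typing import Dict, Tuple

    Signature = Tuple[int, ...]          # indices of IR emitters active at detection time
    PupilPos = Tuple[float, float]       # (x, y) within the eyebox, in millimeters

    def calibrate(samples: Dict[Signature, PupilPos]) -> Dict[Signature, PupilPos]:
        """Build the look-up table from calibration samples gathered while the user
        fixates known reference points."""
        return dict(samples)

    def pupil_from_signature(table: Dict[Signature, PupilPos],
                             signature: Signature) -> PupilPos:
        """Return the stored pupil position for a detector signature; fall back to the
        eyebox center if the signature was never seen during calibration."""
        return table.get(signature, (0.0, 0.0))

    table = calibrate({(3, 4): (-2.0, 1.5), (7, 8): (2.5, -1.0)})
    print(pupil_from_signature(table, (7, 8)))   # (2.5, -1.0)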
The adjustment module 218 generates image frames based on the determined pupil position. In some embodiments, this involves sending discrete sub-images to the display, which tiles them together so that a coherent stitched image appears on the back of the retina. The adjustment module 218 adjusts the output of the electronic display 215 (i.e., the generated image frames) based on the detected pupil position. The adjustment module 218 instructs portions of the electronic display 215 to pass image light to the determined pupil location. In some embodiments, the adjustment module 218 also instructs the electronic display not to pass image light to locations other than the determined pupil location. The adjustment module 218 may, for example, block and/or stop light emitting devices whose image light falls outside of the determined pupil location, allow other light emitting devices to emit image light that falls within the determined pupil location, translate and/or rotate one or more display elements, dynamically adjust the curvature and/or refractive power of one or more active lenses in an array of lenses (e.g., microlenses), or some combination thereof.
Optional locators 220 are objects that are located at particular positions on the display device 205 relative to each other and relative to a particular reference point on the display device 205. The locators 220 may be Light Emitting Diodes (LEDs), corner cube reflectors, reflective markers, a type of light source that contrasts with the environment in which the display device 205 operates, or some combination thereof. In embodiments where the locators 220 are active (i.e., LEDs or other types of light emitting devices), the locators 220 may emit light in the visible band (e.g., about 400-750 nm), in the infrared band (e.g., about 750 nm to 1 mm), in the ultraviolet band (about 100-400 nm), some other portion of the electromagnetic spectrum, or some combination thereof.
In some embodiments, the locators 220 are positioned below an outer surface of the display device 205 that is transparent to the wavelengths of light emitted or reflected by the locators 220, or thin enough not to substantially attenuate the wavelengths of light emitted or reflected by the locators 220. Additionally, in some embodiments, an outer surface or other portion of the display device 205 is opaque in the visible band of wavelengths of light. Thus, the locator 220 can emit light in the IR band below an outer surface that is transparent in the IR band but opaque in the visible band.
The IMU 230 may be an electronic device that generates calibration data based on measurement signals received from the one or more position sensors 225. The position sensor 225 generates one or more measurement signals in response to movement of the display device 205. Examples of the position sensor 225 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor to detect motion, one type of sensor for error correction of the IMU 230, or some combination thereof. The location sensor 225 may be located external to the IMU 230, internal to the IMU 230, or some combination thereof.
Based on the one or more measurement signals from the one or more position sensors 225, the IMU 230 generates first calibration data indicative of an estimated position of the display device 205 relative to an initial position of the display device 205. For example, the position sensors 225 include multiple accelerometers that measure translational motion (forward/backward, up/down, left/right) and multiple gyroscopes that measure rotational motion (e.g., pitch, yaw, roll). In some embodiments, the IMU 230 samples the measurement signals quickly and calculates an estimated position of the display device 205 from the sampled data. For example, the IMU 230 integrates the measurement signals received from the accelerometers over time to estimate a velocity vector, and integrates the velocity vector over time to determine an estimated location of a reference point on the display device 205. Alternatively, the IMU 230 provides the sampled measurement signals to the console 210, and the console 210 determines the first calibration data. The reference point is a point that may be used to describe the location of the display device 205. Although the reference point may be defined generally as a point in space, in practice, the reference point is defined as a point within the display device 205 (e.g., the center of the IMU 230).
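As a concrete illustration of the double integration described above, the following Python sketch estimates a reference-point position from sampled accelerometer readings; the sampling interval, initial state, and simple Euler integration are assumptions made for illustration, and, as the next paragraph notes, such an estimate accumulates drift error over time.

    # Illustrative dead-reckoning sketch: integrate accelerometer samples to a velocity
    # vector, then integrate velocity to an estimated position of the reference point.
    # Sampling rate, units, and Euler integration are simplifying assumptions.
    def estimate_position(accel_samples, dt=0.001, v0=(0.0, 0.0, 0.0), p0=(0.0, 0.0, 0.0)):
        v = list(v0)
        p = list(p0)
        for a in accel_samples:                 # a is an (ax, ay, az) tuple in m/s^2
            for i in range(3):
                v[i] += a[i] * dt               # integrate acceleration -> velocity
                p[i] += v[i] * dt               # integrate velocity -> position
        return tuple(p), tuple(v)

    # 100 ms of constant 1 m/s^2 acceleration along x, sampled at 1 kHz:
    pos, vel = estimate_position([(1.0, 0.0, 0.0)] * 100)
    print(pos, vel)   # roughly ((0.005, 0, 0), (0.1, 0, 0)); small errors accumulate as drift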
In some embodiments, the IMU 230 receives one or more calibration parameters from the console 210. As discussed further below, one or more calibration parameters are used to keep track of the display device 205. Based on the received calibration parameters, the IMU 230 may adjust one or more IMU parameters (e.g., sampling rate). In some embodiments, certain calibration parameters cause the IMU 230 to update the initial position of the reference point so that it corresponds to the next calibrated position of the reference point. Updating the initial position of the reference point to the next calibrated position of the reference point helps to reduce the cumulative error associated with the determined estimated position. The accumulated error, also known as drift error, causes the estimated position of the reference point to deviate from the actual position of the reference point over time.
The imaging device 235 generates second calibration data in accordance with calibration parameters received from the console 210. The second calibration data includes one or more images showing the observed positions of the locators 220 that are detectable by the imaging device 235. In some embodiments, the imaging device 235 includes one or more still cameras, one or more video cameras, any other device capable of capturing images including one or more locators 220, or some combination thereof. In addition, the imaging device 235 may include one or more filters (e.g., to increase signal-to-noise ratio). The imaging device 235 is configured to detect light emitted or reflected from the locators 220, optionally in the field of view of the imaging device 235. In embodiments where the locators 220 include passive elements (e.g., retroreflectors), the imaging device 235 may include a light source that illuminates some or all of the locators 220, which retroreflect the light toward the light source in the imaging device 235. The second calibration data is communicated from the imaging device 235 to the console 210, and the imaging device 235 receives one or more calibration parameters from the console 210 to adjust one or more imaging parameters (e.g., focal length, focus, frame rate, ISO, sensor temperature, shutter speed, aperture, etc.).
In some embodiments, the display device 205 optionally includes one or more reflective elements 260. In some embodiments, the display device 205 optionally includes a single reflective element 260 or a plurality of reflective elements 260 (e.g., one reflective element 260 for each eye of the user). In some embodiments, the electronic display 215 projects the computer-generated image onto one or more reflective elements 260, which in turn reflect the image toward one or both eyes of the user. The computer-generated images include still images, animated images, and/or combinations thereof. The computer-generated images include objects that appear to be two-dimensional and/or three-dimensional objects. In some embodiments, the one or more reflective elements 260 are partially light transmissive (e.g., the one or more reflective elements 260 have a transmittance of at least 15%, 20%, 25%, 30%, 35%, 40%, 45%, or 50%), which allows for transmission of ambient light. In such embodiments, the computer-generated image projected by the electronic display 215 is superimposed with the transmitted ambient light (e.g., a transmitted ambient image) to provide an augmented reality image.
Input interface 240 is a device that allows a user to send action requests to console 210. An action request is a request to perform a particular action. For example, the action request may be to start or end an application, or to perform a particular action within an application. Input interface 240 may include one or more input devices. Example input devices include: a keyboard, mouse, game controller, data from brain signals, data from other parts of the human body, or any other suitable device for receiving an action request and transmitting the received action request to the console 210. The action request received by the input interface 240 may be communicated to the console 210, and the console 210 performs an action corresponding to the action request. In some embodiments, the input interface 240 may provide haptic feedback to the user according to instructions received from the console 210. For example, the haptic feedback is provided upon receiving an action request, or the console 210 transmits instructions to the input interface 240 causing the input interface 240 to generate the haptic feedback when the console 210 performs the action.
The console 210 provides media to the display device 205 for presentation to the user in accordance with information received from one or more of the imaging device 235, the display device 205, and the input interface 240. In the example shown in fig. 2, the console 210 includes an application store 245, a tracking module 250, and an application engine 255. Some embodiments of the console 210 have different modules than those described in conjunction with fig. 2. Similarly, the functionality further described herein may be distributed among the components of the console 210 in a manner different than that described herein.
When the application store 245 is included in the console 210, the application store 245 stores one or more applications executed by the console 210. An application is a set of instructions that, when executed by a processor, is used to generate content for presentation to a user. The content generated by the processor based on the application may be responsive to input received from the user via movement of the display device 205 or the input interface 240. Examples of applications include: a gaming application, a conferencing application, a video playback application, or other suitable applications.
When the tracking module 250 is included in the console 210, the tracking module 250 calibrates the system 200 using one or more calibration parameters, and may adjust the one or more calibration parameters to reduce errors in determining the position of the display device 205. For example, the tracking module 250 adjusts the focus of the imaging device 235 to obtain a more accurate position for the observed locators on the display device 205. In addition, the calibration performed by the tracking module 250 also takes into account information received from the IMU 230. Additionally, if tracking of the display device 205 is lost (e.g., the imaging device 235 loses line of sight to at least a threshold number of the locators 220), the tracking module 250 recalibrates some or all of the system 200.
In some embodiments, the tracking module 250 tracks the movement of the display device 205 using the second calibration data from the imaging device 235. For example, the tracking module 250 determines the location of a reference point of the display device 205 using the observed locator based on the second calibration data and the model of the display device 205. In some embodiments, the tracking module 250 also uses the location information from the first calibration data to determine the location of the reference point of the display device 205. Further, in some embodiments, the tracking module 250 may use portions of the first calibration data, the second calibration data, or some combination thereof to predict a future location of the display device 205. The tracking module 250 provides the estimated or predicted future location of the display device 205 to the application engine 255.
The application engine 255 executes applications within the system 200 and receives location information, acceleration information, velocity information, a predicted future location of the display device 205, or some combination thereof, from the tracking module 250. Based on the received information, the application engine 255 determines content to provide to the display device 205 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the application engine 255 generates content for the display device 205 that mirrors the user's movement in the augmented environment. Further, the application engine 255 performs actions within the application executing on the console 210 in response to action requests received from the input interface 240 and provides feedback to the user that the actions were performed. The feedback provided may be visual or auditory feedback via the display device 205, or haptic feedback via the input interface 240.
Fig. 3 is an isometric view of a display device 300 according to some embodiments. In some other embodiments, the display device 300 is part of some other electronic display (e.g., a digital microscope, a head-mounted display device, etc.). In some embodiments, the display device 300 includes an array of light emitting devices 310 and one or more lenses 330, 335. In some embodiments, the display device 300 also includes an array of IR detectors.
The array of light emitting devices 310 emits image light and optionally IR light toward a viewing user. The array of light emitting devices 310 may be, for example, an array of LEDs, an array of micro-LEDs, an array of OLEDs, or some combination thereof. The array of light emitting devices 310 includes light emitting devices 320 that emit light in the visible (and optionally devices that emit light in the IR).
In some embodiments, the display device 300 includes an array of emission intensities configured to selectively attenuate light emitted from the array of light emitting devices 310. In some embodiments, the emission intensity array is comprised of a plurality of liquid crystal cells or pixels, groups of light emitting devices, or some combination thereof. Each liquid crystal cell (or in some embodiments, group of liquid crystal cells) is addressable to have a particular level of attenuation. For example, at a given time, some liquid crystal cells may be set to no attenuation, while other liquid crystal cells may be set to maximum attenuation. In this manner, the array of emission intensities can control which portion of the image light emitted from the array of light emitting devices 310 is passed to the one or more lenses 330, 335. In some embodiments, the display device 300 uses an array of emission intensities to facilitate providing image light to the pupil 350 location of the user's eye 340 and to minimize the amount of image light provided to other areas in the viewing window.
One or more lenses 330, 335 receive modified image light (e.g., attenuated light) from the emission intensity array (or directly from the light emitting device array 310) and direct the modified image light to the location of the pupil 350.
An optional array of IR detectors detects IR light that has been retro-reflected from the retina of the eye 340, the cornea of the eye 340, the lens of the eye 340, or some combination thereof. The IR detector array includes a single IR sensor or a plurality of IR sensitive detectors (e.g., photodiodes). In some embodiments, the array of IR detectors is separate from the array of light emitting devices 310. In some embodiments, the array of IR detectors is integrated into the array of light emitting devices 310.
In some embodiments, the array of light emitting devices 310 and the array of emission intensities comprise a display element. Alternatively, the display element includes the array of light emitting devices 310 (e.g., when the array of light emitting devices 310 includes individually adjustable pixels) without the array of emission intensities. In some embodiments, the display element additionally comprises an IR array. In some embodiments, in response to the determined pupil 350 location, the display element adjusts the emitted image light such that light output by the display element is refracted by one or more lenses 330, 335 toward the determined pupil 350 location (and not toward other locations in the viewing window).
In some embodiments, display apparatus 300 includes one or more broadband light sources (e.g., one or more white LEDs) coupled with a plurality of color filters in addition to or in place of light emitting device array 310.
Fig. 6A illustrates a schematic "unfolded" (e.g., all optical elements arranged without "folding" optical elements, such as waveguides) and coaxial configuration (e.g., the geometric center of an optical element coincides with the principal axis of the optical element) of an example optical system 600 according to some embodiments. The optical system 600 includes two relay systems. A first relay system 602 includes a first optical element 608 and a second optical element 610. The first relay system 602 receives light from an object (e.g., an eye 606 of a wearer of a device that includes the optical system 600). The first optical element 608 has a converging (focusing) optical power. Light beams from the eye 606 that strike the first optical element 608 at a greater height along the y-direction converge to a lesser height by the time they strike the second optical element 610. The second optical element 610 has a diverging optical power, and the light beams diverge in the y-direction after interacting with the second optical element 610. In some embodiments, the positive lens effect (e.g., converging) of the first optical element 608 and the negative lens effect (e.g., diverging) of the second optical element 610 allow the first and second optical elements to form a Galilean telescope (e.g., a positive lens followed by a negative lens). In some embodiments, the first relay system 602 includes a Keplerian telescope (e.g., formed from two positive lenses). In some embodiments, the first optical element and the second optical element form a telescope. In some embodiments, the telescope includes additional optical elements within the first relay system 602. In some embodiments, additional optical elements are disposed within the optical waveguide 402.
The second relay system 604 is arranged downstream of the first relay system 602 and images the light beams exiting the second optical element 610 onto the detector 614. In fig. 6A, the second relay system 604 in the example optical system 600 includes a single optical element 612 (e.g., a converging lens). In some embodiments, the second relay system 604 includes additional optical elements. In some embodiments, the second relay system 604 comprises a telescope. In some embodiments, the second relay system 604 is a Galilean telescope. In some embodiments, the second relay system 604 is a Keplerian telescope. A detector 614 (e.g., a CCD camera containing sensor elements) is located in the image plane of the optical element 612. In the example optical system 600, the second relay system 604 images the output of the first relay system 602 onto the detector 614 at reduced magnification (the extent of the imaging light along the y-axis at the detector 614 is less than the extent of the imaging light striking the optical element 612). In some embodiments, the second optical relay system is downstream of the first optical relay system and receives as its input the output from the first optical relay system.
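For intuition only, the reduced magnification of such a two-element afocal relay can be expressed with the standard thin-lens telescope relation; the focal lengths f_1 and f_2 below are generic textbook symbols under a paraxial, thin-lens assumption, and the numerical values are illustrative rather than design values from this disclosure.

\[
  y_{\mathrm{out}} \;=\; \left|\frac{f_{2}}{f_{1}}\right| \, y_{\mathrm{in}}
\]

For example, assuming f_1 = 50 mm for the converging element and f_2 = -12.5 mm for the diverging element, |f_2/f_1| = 0.25, so a 20 mm extent at the input is relayed to a 5 mm extent at the output, which is why a smaller (and lighter) detector suffices.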
Fig. 6B illustrates an "unfolded" (e.g., all optical elements arranged without "folded" optical elements, such as waveguides) and an off-axis configuration (e.g., the geometric center of any optical element does not coincide with the major axis of any other optical element) of an optical system 650 according to some embodiments. The optical elements in optical system 600 in fig. 6A are arranged paraxially, but the optical elements in optical system 650 in fig. 6B are arranged off-axis.
Like the optical system 600, the optical system 650 has a first relay system 658, the first relay system 658 including a first optical element 652 and a second optical element 654. Figs. 6A and 6B both show a Galilean telescope in the first relay system. The first optical element 652 has a converging optical power (e.g., a positive lens) and the second optical element 654 has a diverging optical power (e.g., a negative lens). In some embodiments, the first optical element 652 is a positive lens, and the light beams from the eye 606 impinge on the first optical element 652 in an eccentric (decentered) manner. For example, the first optical element 652 is a decentered lens (e.g., a lens whose principal axis is offset from the geometric center of the lens). In some embodiments, the principal axis of the decentered lens is distal from the lens (e.g., outside the lens). In this manner, a light beam 660 from one edge of the eye 606 is refracted by a larger angle θr1, while a light beam 662 from the other edge of the eye is refracted by the first optical element 652 by a smaller angle θr2. Thus, in some embodiments, symmetrically emerging light beams (in the y-z plane) are refracted differently when they strike the first optical element in an off-axis/decentered manner. In contrast, the outermost light beam 616 in Fig. 6A (which impinges on a centered lens) is refracted by an angle θr1 of the same magnitude as the angle θr2 by which the other outermost light beam 618 is refracted: in Fig. 6A, the angles θr1 and θr2 have the same magnitude and opposite signs.
In some embodiments, the first optical element 652 is a geometric phase element. In some embodiments, the first optical element 652 is a geometric phase lens. Circularly polarized light acquires the phase profile of such a lens directly through the geometric phase effect (also known as the Pancharatnam-Berry phase); that is, the phase profile of the geometric phase lens is added to the original phase of the light (the phase before the light passes through the geometric phase lens). The light beam 660 includes a first light ray 664 having a first circular polarization and a second light ray (not shown) having a second circular polarization different from the first circular polarization. The first optical element 652 directs the first light ray 664 at an angle θr1 with respect to the z-axis. In some embodiments, second light having the second circular polarization, different from the first circular polarization, is transmitted through the first optical element 652. In some embodiments, the first optical element 652 directs the second light ray at -θr1 with respect to the z-axis, in a direction opposite to that of the first light ray.
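As general background on the geometric (Pancharatnam-Berry) phase mentioned above, and not as a parameter of this disclosure, an idealized half-wave geometric phase lens of focal length f adds opposite-sign parabolic phase profiles to the two circular polarizations:

\[
  \phi_{\pm}(r) \;\approx\; \mp\,\frac{\pi r^{2}}{\lambda f}
\]

so one circular handedness sees an effective focal length of +f (converging) while the orthogonal handedness sees -f (diverging), and in the idealized half-wave case the handedness of the transmitted light is reversed. This sign asymmetry is what allows a single element to treat the first and second circular polarizations differently.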
The first light from the various light beams impinges on the second optical element 654 in an off-center manner (e.g., off-axis, the second optical element is not symmetrically illuminated about its axis of symmetry or its principal axis). In some embodiments, the second optical element 654 is a negative lens, and the first light rays diverge after impinging on the second optical element 654. In some embodiments, the decentered negative lens of the second optical element 654 corrects for aberrations (e.g., distortions). In some embodiments, the first optical element 652 and the second optical element 654 are a positive lens and a negative lens, respectively, of a galilean telescope forming the first relay system 658.
In some embodiments, a single converging lens 656 forms second relay system 648. Lens 656 images the output of first relay system 658 onto detector 624.
Fig. 4A illustrates an optical system 400 according to some embodiments. The optical system 400 includes a waveguide 402 (e.g., an optical waveguide that guides electromagnetic radiation having a wavelength greater than 400 nm, greater than 800 nm, greater than 1000 nm, or greater than 2000 nm along a long axis of the optical waveguide (e.g., the y-axis in Fig. 4A)), a first optical element 404, a second optical element 406, an imaging optical element 408, and a detector 410 for imaging light from an eye 412 of a user of the optical system 400 (e.g., a user wearing a device that includes the optical system 400, such as a head-mounted display, a VR display headset, or an AR display headset). Compared to the coaxial system shown in Fig. 6A, the waveguide 402 in Fig. 4A allows the light ray trajectories inside it to be "folded", resulting in a more compact system.
Light beam 414 from eye 412 (e.g., light reflected off eye 412) includes a first light ray 416-1 and a second light ray 418-1. First light 416-1 has a first circular polarization (e.g., right circularly polarized light (RCP)), and second light 418-1 has a second circular polarization different from the first circular polarization (e.g., left circularly polarized Light (LCP)). The light beam 414 impinges on the first optical element 404 at a first angle of incidence. In some embodiments, the light beam impinges on the first optical element 404 at a range of incident angles. In some embodiments (as shown in fig. 4A), the light beam 414 impinges on the first optical element 404 at normal incidence (i.e., an angle of incidence of 0 °).
In some embodiments, the first optical element 404 is configured to direct the first light ray 416-1 (e.g., as a first light ray 416-2) in a first direction along a diffraction angle θD, and to direct the second light ray 418-1 in a second direction different from the first direction. In some embodiments, the first optical element directs the second light ray 418-1 in the second direction without any diffraction, by transmitting the second light ray 418-1 through the optical waveguide 402 (e.g., as a second light ray 418-2). In some embodiments, the first optical element directs the second light ray as a second light ray 418-3 by diffracting it along a diffraction angle -θD (e.g., the negative of the diffraction angle of the first light ray 416-1; for example, the first light ray 416-1 is diffracted into the +1 diffraction order and the light ray 418-1 is diffracted into the -1 diffraction order). The second light ray 418-2 and the second light ray 418-3 are both directed in directions that cause the second light ray to propagate away from the second optical element 406.
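For orientation, the in-coupling geometry can be related to the standard grating equation; the grating period Λ, waveguide index n, and order labels below are generic assumptions rather than values disclosed here.

\[
  n \sin\theta_{D} \;=\; \sin\theta_{I} \;+\; m\,\frac{\lambda}{\Lambda}, \qquad m = \pm 1
\]

where θI is the angle of incidence in air, θD is the diffraction angle inside a waveguide of refractive index n, and, for a polarization-selective grating, the sign of the order m (and hence of θD) follows the handedness of the circular polarization. Choosing Λ so that θD exceeds the critical angle is what keeps the +1-order light trapped by total internal reflection.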
In some embodiments, the first light ray 416-1 has a first wavelength and the second light ray 418-1 has a second wavelength different from the first wavelength. In some embodiments, the first wavelength is greater than 850 nm (e.g., greater than 900 nm, greater than 1000 nm, greater than 1500 nm, greater than 2000 nm) and the second wavelength is less than 850 nm (e.g., less than 800 nm, less than 700 nm, less than 600 nm). In some embodiments, the first wavelength and the second wavelength are different and both are greater than 800 nm. In some embodiments, the first optical element 404 diffracts light within a wavelength range (e.g., greater than 800 nm, or between 800 nm and 2000 nm) and transmits light outside of the wavelength range. In some embodiments, the first optical element 404 diffracts light impinging on it that is incident within a range of incidence angles (e.g., between +20° and -20° from the normal to the incidence surface, between +10° and -10° from the normal to the incidence surface, between +5° and -5° from the normal to the incidence surface, or between +2° and -2° from the normal to the incidence surface).
In some embodiments, the first optical element 404 is a polarization volume holographic element. In some embodiments, the first optical element is a geometric phase lens, similar to those described in co-pending patent application No. 15/833,676, entitled "Geometric Phase Lens Alignment in an Augmented Reality Head Mounted Display," filed December 6, 2017, which is incorporated herein by reference in its entirety.
First optical element 404 diffracts first light ray 416-1, as first light ray 416-2, in a forward direction such that it is incident on the back surface 422 of the optical waveguide 402 at an angle of incidence θ_I equal to or greater than the critical angle of the optical waveguide 402. For example, in an embodiment in which the optical waveguide 402 is made of a material having a refractive index n at a wavelength λ_1, the critical angle θ_c for light of wavelength λ_1 at the material-air interface is sin⁻¹(n_air/n) (i.e., the arcsine of the ratio of the refractive index n_air of air to the refractive index n at wavelength λ_1). Thus, the first light ray is reflected at the back surface 422 by total internal reflection (TIR) and impinges again on the front surface 424 at an angle greater than θ_c (e.g., an angle whose magnitude equals that of the angle of incidence θ_I). The first optical element 404 couples in light from the eye 412 such that the light is directed along the long axis (e.g., the y-axis) of the optical waveguide 402. After undergoing one or more total internal reflections at the material-air interfaces (e.g., back surface 422 and front surface 424) of the optical waveguide 402, the first light ray 416-2 impinges on the front surface 424 at the location where the second optical element 406 is disposed. In some embodiments, the second optical element 406 is deposited on the optical waveguide 402. In some embodiments, the second optical element 406 is coated on the optical waveguide 402. The second optical element 406 couples the first light ray 416-2 out of the optical waveguide 402, so the first light ray 416-2 is not guided further along the optical waveguide 402 (e.g., the first light ray 416-2 is no longer reflected within the optical waveguide 402). In some embodiments, the second optical element 406 is a polarizer holographic element. In some embodiments, the second optical element 406 is a geometric phase lens. In some embodiments, the second optical element 406 is a polarization grating. In some embodiments, the second optical element 406 is an output mirror. In some embodiments, the second optical element 406 is an output grating. The second optical element 406 directs the first light ray 416-2 such that the directed first light ray 416-3 propagates substantially parallel to the z-axis after exiting the optical waveguide 402 (e.g., the light ray makes an angle of less than 20°, less than 10°, less than 5°, less than 2°, or less than 1° with the z-axis).
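As an illustrative numerical sketch only (the refractive index, wavelength-independent treatment, and angles below are assumed values, not taken from this disclosure), the TIR condition above can be checked directly:

    import math

    def critical_angle_deg(n_waveguide, n_air=1.0):
        # theta_c = arcsin(n_air / n) at the material-air interface.
        return math.degrees(math.asin(n_air / n_waveguide))

    def is_guided_by_tir(theta_incidence_deg, n_waveguide):
        # The ray stays in the waveguide if its angle of incidence on the surface
        # is at least the critical angle.
        return theta_incidence_deg >= critical_angle_deg(n_waveguide)

    print(critical_angle_deg(1.5))        # ~41.8 degrees for a glass-like index of 1.5
    print(is_guided_by_tir(58.0, 1.5))    # True: a 58-degree ray is guided
    print(is_guided_by_tir(30.0, 1.5))    # False: a 30-degree ray leaks out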
Fig. 4A shows another light ray 426-1 from the eye 412, propagating in a direction parallel to light beam 414. In Fig. 4A, first optical element 404 also diffracts light ray 426-1 into light ray 426-2 in a direction parallel to first light ray 416-2. As a result, light ray 426-2 is directed through optical waveguide 402 along an optical path that is displaced in the y-direction relative to first light ray 416-2. Imaging optical element 408 is located downstream of second optical element 406 (e.g., behind second optical element 406 along an optical path that begins at eye 412 and ends at detector 410), and images first light ray 416-3 and light ray 426-3 onto detector 410 such that an image of eye 412 (e.g., an image of the pupil of eye 412) is formed at detector 410 (e.g., detector 410 is located at the image plane of imaging optical element 408). In some embodiments, the object plane of the imaging optical element 408 is on or near the exit surface of the second optical element 406. In some embodiments, the exit surface is the surface 428 of the second optical element 406 closest to the detector 410, which may define a material-air interface. In some embodiments, imaging optical element 408 forms a relay system that images the output from optical waveguide 402 onto detector 410.
In some embodiments, a first relay system relays (or images) light rays (e.g., light beam 414, light ray 426-1) from eye 412 onto a plane (e.g., an image plane, an output plane of optical waveguide 402), which in turn serves as the object plane of a second relay system (e.g., imaging optical element 408). In some embodiments, the first optical element 404 and the second optical element 406 together form the first relay system.
Fig. 4B illustrates an optical system 450 according to some embodiments. Light beam 414 from eye 412 is transmitted through front surface 424 of optical waveguide 402 and is coupled into optical waveguide 402. Light beam 414 includes first light ray 416-1 having a first circular polarization and second light ray 418-1 having a second circular polarization different from the first circular polarization. The light beam 414 strikes a reflective first optical element 452 disposed on the back surface 422 of the optical waveguide 402. The reflective first optical element 452 reflectively diffracts (e.g., in a backward direction) first light ray 416-1, having the first circular polarization, toward front surface 424 of optical waveguide 402 at a diffraction angle θ_D, as diffracted first light ray 416-2. First light ray 416-2 strikes the front surface 424 at an angle of incidence θ_I equal to θ_D. The reflective first optical element 452 is configured such that θ_D (and thus θ_I) is equal to or greater than the critical angle of the optical waveguide 402. In this manner, first light ray 416-2 is reflected by total internal reflection within the optical waveguide and guided along its long axis (e.g., along the y-direction). In some embodiments, second light ray 418-1, having the second circular polarization different from the first circular polarization, is simply transmitted through reflective first optical element 452 as transmitted second light ray 418-2 and propagates away from second optical element 406. In some embodiments, second light ray 418-1 having the second circular polarization is diffracted in the opposite direction, at an angle −θ_D, as diffracted second light ray 418-3, and propagates away from second optical element 406. The first light ray 416-2 is coupled out of the optical waveguide 402 by the second optical element 406 in a manner similar to that described with reference to Fig. 4A.
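For intuition about how far such a guided ray travels between reflections, here is a rough geometric estimate; the waveguide thickness, in-coupler-to-out-coupler distance, and propagation angle are hypothetical values, not taken from this disclosure:

    import math

    def reflections_to_outcoupler(distance_y_m, thickness_m, theta_deg):
        # Each pass across the waveguide thickness t advances the ray by
        # t * tan(theta) along y and ends in one surface reflection.
        advance_per_reflection = thickness_m * math.tan(math.radians(theta_deg))
        return math.ceil(distance_y_m / advance_per_reflection)

    # Example: 20 mm between in-coupler and out-coupler, 2 mm thick waveguide,
    # 58-degree internal propagation angle -> roughly 7 reflections.
    print(reflections_to_outcoupler(0.020, 0.002, 58.0))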
For ease of illustration, first optical element 404 is shown in Fig. 4A without optical power (e.g., focusing power, converging power, diverging power): the distance between first light ray 416-1 (within light beam 414) and light ray 426-1 at the second optical element 406 (e.g., the distance between first light ray 416-3 and light ray 426-3) is substantially the same as the distance between them at the first optical element 404 (e.g., the distance between first light ray 416-1 and light ray 426-1). In some embodiments, the first optical element has focusing power, and the distance between the rays at the second optical element 406 is reduced (e.g., a demagnified image is formed) compared to their distance at the first optical element 404. In some embodiments, the first optical element has a coating that provides a focusing optical power. In some embodiments, the first optical element is formed of a material that provides a focusing power.
Fig. 5A shows an "unfolded" configuration of an optical system 500, which includes a telescope for forming an image of an object on a detector. The "unfolded" configuration shows the various optical elements of optical system 500 arranged sequentially along the z-axis, without depicting the one or more reflections that occur within the waveguide. The light beam exits the object (e.g., eye 502) along the y-axis and impinges on a first optical element 504.
In some embodiments, the first optical element 504 (sometimes referred to as an input grating) has focusing power. For example, the first optical element 504 in Fig. 5A has a focal length f_1. The second optical element 520 is located at a distance f_1 from the first optical element 504. In some embodiments, as shown in Fig. 5A, the second optical element 520 has no focusing optical power and is used (e.g., only used) to couple out light guided by the waveguide. In some embodiments, the width of the light beam from eye 502 is narrowest at the second optical element 520. Placing the second optical element 520 at this location allows the smallest second optical element 520 to be used without losing a significant portion of the light downstream of the first optical element 504.
As in a Galilean telescope, lens 522, having a focal length f_2, is positioned such that the second optical element 520 is at a distance f_2 from lens 522 (i.e., at the back focal plane of lens 522). A demagnified image of eye 502 is formed on a detector 524 (e.g., a CCD camera having sensor elements) at a distance f_2 from lens 522, i.e., at the front focal plane of lens 522. The demagnification depends on the focal lengths f_1 and f_2.
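Under the usual thin-lens approximation for a two-element afocal relay, the transverse demagnification is set by the ratio of the two focal lengths; the focal lengths below are hypothetical example values, not taken from this disclosure:

    def relay_demagnification(f1_m, f2_m):
        # Transverse magnification magnitude of a two-lens afocal relay: |m| = f2 / f1,
        # so f2 < f1 yields a demagnified image of the eye on the detector.
        return f2_m / f1_m

    # Example: f1 = 50 mm, f2 = 10 mm -> 0.2x, i.e., a 5:1 demagnification.
    print(relay_demagnification(0.050, 0.010))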
Fig. 5B shows optical system 528, which corresponds to optical system 500 in a partially folded configuration, without a waveguide. The first optical element 530 is shown as having a focal length f_1, and the second optical element 532 is a flat mirror placed at a distance f_1 from the first optical element 530. All of the optical elements in Fig. 5B are located at the same distances as shown in Fig. 5A (e.g., the representative distance between the second optical element 532 and the lens 522 is f_2, and the distance between the lens 522 and the detector 524 is also f_2).
Fig. 5C shows a perspective view of an optical system 550 that includes a waveguide 552. Fig. 5D is a cross-sectional view of optical system 550 in the y-z plane. Eye 502 is represented by window 553, and light beams from the x-y plane containing window 553 (including light beam 506) impinge on a first optical element 554 disposed on waveguide 552. The first optical element 554 couples in the light rays of each light beam that have a first circular polarization. Light rays having a second circular polarization different from the first circular polarization (e.g., the first circular polarization is RCP and the second circular polarization is LCP; or the first circular polarization is LCP and the second circular polarization is RCP) are not directed by the first optical element 554 to undergo total internal reflection within the waveguide 552. As a result, light having the second circular polarization propagates away from the second optical element 556. For example, light of the second polarization is transmitted through the waveguide 552 or diffracted by the first optical element 554 in a direction opposite to the diffraction direction of light having the first polarization. In some embodiments, light rays of the first polarization guided within waveguide 552 travel upward along the y-direction and are coupled out by a second optical element 556 disposed on waveguide 552, which directs them substantially along the z-direction. Lens 557 images the light onto detector 558. In some embodiments, detector 558 is placed vertically above the eye (and closer to the waveguide than the eye is). The first optical element 554 has focusing power, so the imaging beam from the eye 502 is demagnified by the time the guided light is coupled out of the waveguide 552. In this manner, detector 558 has a detection surface in the x-y plane that is smaller in area than window 553.
In some embodiments, the first optical element 554 is configured to receive input light and steer the input light by a first angle in a first direction parallel to the first optical element 554 and by a second angle in a second direction parallel to the first optical element 554 and perpendicular to the first direction. In some embodiments, the first angle is different from the second angle. For example, in some embodiments, the first optical element 554 steers the input light by a first angle (e.g., less than 10 degrees, less than 5 degrees, less than 3 degrees, less than 2 degrees, or less than 1 degree) toward the y-direction and by a second angle (e.g., less than 6 degrees, less than 3 degrees, less than 2 degrees, less than 1 degree, or less than 0.5 degrees) toward the x-direction. In some embodiments, the second angle is less than the first angle.
Fig. 5E shows the distortion in the x-y plane observed at detector 558 for light rays incident on first optical element 554 at angles of incidence within ±0.1° of the surface normal of first optical element 554. Distortion is a form of optical aberration: a deviation from rectilinear projection (i.e., a projection in which straight lines in the object remain straight in the image). To determine the magnitude of the distortion, input (incident) light rays forming a checkerboard pattern that is mirror-symmetric along the x-axis and along the y-axis are propagated through the optical system, and deviations from the checkerboard input image reveal the degree of distortion within the optical system.
The pattern 580 recorded by detector 558 shows that, for light rays toward the positive y-direction, the width of the overall pattern in the x-direction is reduced. The bottom of the checkerboard pattern also has some curvature (e.g., along the x-axis for smaller y-coordinate values). This distortion prevents an accurate image of the viewing window 553 from being formed at the detector 558. In some embodiments, a corrective optical element reduces (e.g., cancels) the distortion. In some embodiments, the distortion is first characterized and then used to calibrate the optical system. In some embodiments, a correction algorithm processes the image detected at detector 558 to reduce (e.g., eliminate) the distortion by computationally compensating for those errors.
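A minimal sketch of such computational compensation, assuming the distortion has already been characterized as a per-row compression factor in x (the calibration function below is a hypothetical placeholder, not the calibration of this disclosure):

    import numpy as np

    def undistort_rows(image, scale_of_row):
        # image: 2D detector frame; scale_of_row(y) returns the factor (<= 1) by
        # which row y was compressed in x by the optics, obtained from a prior
        # checkerboard calibration. Each row is stretched back about its center
        # using nearest-neighbor resampling.
        h, w = image.shape
        out = np.zeros_like(image)
        x = np.arange(w)
        cx = (w - 1) / 2.0
        for y in range(h):
            s = scale_of_row(y)
            src_x = np.clip(np.round(cx + (x - cx) * s).astype(int), 0, w - 1)
            out[y] = image[y, src_x]
        return out

    # Hypothetical calibration: rows toward larger y were compressed by up to 10%.
    h, w = 120, 160
    frame = np.random.rand(h, w)
    corrected = undistort_rows(frame, lambda y: 1.0 - 0.1 * y / (h - 1))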
Fig. 7A shows an optical system 700. Light beams from the eye 702 impinge on a first optical element 704 disposed on the front surface of a waveguide 706. Light rays guided within waveguide 706 are shown in an "unfolded" configuration: the total internal reflections of the guided light rays are not depicted in Fig. 7A, and the propagation distance of the light rays within the waveguide sets the thickness of waveguide 706. The guided light is coupled out of the waveguide 706 by a second optical element 710. The inset in the upper right corner shows an enlarged view of guided light propagating near one end of the waveguide. The second optical element 710 corrects various aberrations of the light, focuses the light (e.g., separated by wavelength) near its output interface (e.g., as in a Keplerian telescope), and couples out the light to propagate along the z-axis. Imaging lens 712 images the light onto detector 714.
Fig. 7B depicts the optical system 700 of fig. 7A in a folded configuration, illustrating multiple total internal reflections of guided light rays within the waveguide 706. The first optical element 704 couples light rays (from the eye 702) having a particular circular polarization into the waveguide 706. Light rays having a different circular polarization than the light rays of the particular circular polarization are either transmitted through the waveguide 706 or diffracted/refracted into a different direction. As a result, light rays that do not have a particular circular polarization propagate away from the second optical element 710. The first optical element 704 is designed to respond to light rays having a particular circular polarization. In some embodiments, the first optical element 704 is designed to diffract LCP light into the +1 diffraction order and RCP light into the-1 diffraction order. In some embodiments, the first optical element 704 is designed to diffract RCP light into the +1 diffraction order and to diffract LCP light into the-1 diffraction order. In some embodiments, the first optical element 704 is designed to diffract LCP light into the +1 diffraction order while RCP light is transmitted through the first optical element 704 (e.g., the first optical element 704 does not cause diffraction of the RCP light). In some embodiments, the first optical element 704 is designed to diffract RCP light into the +1 diffraction order while LCP light is transmitted through the first optical element 704 (e.g., the first optical element 704 does not cause diffraction of LCP light).
The light is coupled out of the waveguide 706 by the second optical element 710. In some cases, some light rays 720 are not coupled out by the second optical element 710 in a direction toward the imaging lens 712, but instead leak out of the waveguide 706 because they no longer satisfy the total internal reflection condition after interacting with the second optical element 710. Light directed to lens 712 is imaged by lens 712 onto an image plane on detector 714. Some light rays 722 that continue to undergo total internal reflection within the waveguide propagate past the second optical element 710.
Fig. 7C shows a pattern 730 indicating the amount of distortion in the x-y plane. The input (incident) light rays form a mirror-symmetric checkerboard pattern (along the x-axis and along the y-axis) and are transmitted through the optical system to determine the amount of deviation of the checkerboard pattern after exiting the optical system. After the light exits optical system 700, the width of the entire pattern in the x-direction and in the y-direction remains substantially constant (e.g., varies by less than 10%, varies by less than 5%, varies by less than 1%). Pattern 730 shows some curvature of the top of the checkerboard (i.e., along the x-axis for the maximum y-coordinate value). In some embodiments, the distortion prevents an accurate image of the viewing window 553 from being formed at the detector 714. Here, the optical system 700 corrects distortion, allowing the chief rays to be imaged with reduced distortion. In some embodiments, additional corrective optical elements reduce (e.g., eliminate) distortion errors.
Other relay systems may be used in addition to the relay systems shown in Figs. 5A, 5B, 5C, and 5D, or the relay systems shown in Figs. 6A and 6B (e.g., Galilean telescopes and Keplerian telescopes). In some embodiments, the imaging system includes a single relay system (e.g., Figs. 5A-5D). In some embodiments, the imaging system includes two relay systems (e.g., Figs. 6A and 6B). In some embodiments, the optical waveguide along which the light rays of the particular circular polarization are guided may further include an intermediate field lens to reduce the magnification of the light beam (which includes the first light ray) coupled out of the waveguide. An intermediate field lens is a lens placed at a position conjugate to the image plane of the optical system (e.g., the plane of the detector).
In some embodiments, the optical system comprises an Offner telescope. Fig. 8A shows an Offner telescope 800 according to some embodiments. The Offner telescope 800 includes three reflective surfaces 802, 804, and 806. Light emitted from the object 808 is focused by the first reflective surface 802 onto the second reflective surface 804. Light rays emanating from the object 808 at a first angle are focused onto a first location on the second reflective surface 804. Light rays emanating from the object 808 at a second angle are focused onto a second location on the second reflective surface 804 that is different from the first location.
The focused light rays are reflected by the second reflective surface 804, diverge, and reflect off the third reflective surface 806. The third reflective surface 806 then images the light to an image plane 810.
In some embodiments, the center of curvature of the first reflective surface 802 and the center of curvature of the third reflective surface 806 coincide with the second reflective surface 804. In some embodiments, the optical system 800 provides demagnification (i.e., the image 810 is smaller than the object 808). In some embodiments, the optical system 800 provides magnification (i.e., the image 810 is larger than the object 808).
In some embodiments, optical system 800 is configured as an afocal optical system. An afocal system (i.e., a system without a focal point) is an optical system that produces no net convergence or divergence of the light beam (e.g., it has an effective focal length at infinity). An optical system that provides afocal magnification can also correct for Petzval field curvature. This curvature occurs when image points near the optical axis are in focus but off-axis rays come to focus before the image sensor. The optical system 800 corrects for Petzval field curvature because the curvature of the second reflective surface 804 (e.g., a diverging, convex mirror) is opposite in sign to the curvatures of the first reflective surface 802 and the third reflective surface 806 (e.g., converging, concave mirrors). Off-axis light rays reflect off the convex mirror in a manner opposite to the concave mirrors, reducing (e.g., canceling) the Petzval field curvature contributed by the first and third reflective surfaces.
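As a quick check of this cancellation (using one common textbook sign convention for mirror Petzval contributions; conventions vary, and the radius below is an arbitrary example value, not taken from this disclosure):

    def petzval_sum_mirrors(radii):
        # Each reflection contributes +/- 2/R to the Petzval sum, with the sign
        # alternating at every reflection under the convention assumed here.
        total, sign = 0.0, -1.0
        for R in radii:
            total += sign * 2.0 / R
            sign = -sign
        return total

    # Concentric Offner-style geometry: concave primary and tertiary of radius R,
    # convex secondary of radius R/2 -> the contributions cancel (flat field).
    R = 100.0
    print(petzval_sum_mirrors([R, R / 2.0, R]))   # -2/R + 4/R - 2/R = 0.0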
Fig. 8B shows an optical system 820 that includes two intersecting elliptical surfaces 826 and 828. Light emitted from object 822 reflects off of first elliptical reflecting surface 826 before reflecting off of second elliptical reflecting surface 828. The second elliptical reflecting surface directs light onto image plane 824, forming an image of object 822 at image plane 824.
In some embodiments, the first optical element (e.g., 404, 452, 504, 530, 554, 608, 652, 704) and the second optical element (e.g., 406, 556, 654, 710) form elements of the Offner relay 800. In some embodiments, the first optical element (e.g., 404, 452, 504, 530, 554, 608, 652, 704) and the second optical element (e.g., 406, 556, 654, 710) form elements of a crossed elliptical relay 820.
In some embodiments, the first optical element is made of a material that causes diffraction similar to the optical reflection effect of the first reflective surface 802. In some embodiments, the second optical element is made of a material that causes diffraction similar to the optical reflection effect of the third reflective surface 806. In such embodiments, the reflective surface 804 is provided by another optical element within the waveguide or external to the waveguide.
In some embodiments, the Offner relay 800 is disposed downstream of the waveguide. In some embodiments, the crossed elliptical relay 820 is disposed downstream of the waveguide.
In some embodiments, the first optical element is made of a material that causes diffraction similar to the optical reflection effect of the first elliptical reflecting surface 826. In some embodiments, the second optical element is made of a material that causes diffraction similar to the optical reflection effect of the second elliptical reflecting surface 828.
In accordance with these principles, we turn now to certain embodiments.
According to some embodiments, an optical system includes an optical waveguide and a first optical element configured to direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward a second optical element via total internal reflection. The first optical element is further configured to direct second light rays in a second direction different from the first direction, the second light rays having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence such that the second light rays propagate away from the second optical element (e.g., the second light rays do not propagate through the light guide via total internal reflection but pass through the light guide or are directed away from the second optical element even if the second light rays propagate through the light guide via total internal reflection). The second optical element is configured to direct the first light propagating through the optical waveguide to the detector.
In some embodiments, directing the light includes changing the direction of the light (e.g., by reflection, refraction, and/or diffraction, etc.). In some embodiments, directing the light includes not changing the direction of the light (e.g., directing the light includes allowing the light to pass through the optical element without changing the direction of the light).
In some embodiments, the first optical element comprises an element selected from the group consisting of: a polarizer holographic element and a geometric phase lens. In some embodiments, the optical system comprises a polarizer holographic element and/or a geometric phase lens. In some embodiments, the geometric phase lens is an eccentric geometric phase lens.
In some embodiments, the first optical element is configured to direct near-infrared light impinging on the first optical element at a first angle of incidence in a first direction and to direct visible light impinging on the first optical element at the first angle of incidence in a direction different from the first direction (e.g., allowing visible light to pass through the first optical element without changing the direction of the visible light). In some embodiments, the first light has a wavelength greater than 850 nm. In some embodiments, the first optical element is configured to transmit visible light (without changing the direction of the visible light). In some embodiments, the first optical element is configured to transmit third light rays having a wavelength less than 800nm such that the third light rays propagate away from the second optical element (e.g., the third light rays do not propagate through the optical waveguide via total internal reflection, but rather pass through the optical waveguide).
In some embodiments, the first optical element has a first diffraction efficiency for near-infrared light and a second diffraction efficiency for visible light, and the first diffraction efficiency is greater than the second diffraction efficiency (e.g., the first diffraction efficiency is 90% or greater and the second diffraction efficiency is 10% or less). In some embodiments, the first optical element has a higher diffraction efficiency for wavelengths greater than 850nm than for wavelengths less than 800nm.
In some embodiments, the first optical element is disposed on the first surface of the optical waveguide such that the first light rays impinge on the optical waveguide after impinging on the first optical element. In some embodiments, the first optical element is located between the object and the optical waveguide.
In some embodiments, the optical system includes a detector (e.g., fig. 5C). In some embodiments, the first light rays impinging on the first optical element comprise imaging light from the subject, and the optical system is configured to project the imaging light onto the detector.
In some embodiments, the optical system further comprises an imaging telescope distinct from the combination of the optical waveguide, the first optical element, and the second optical element. In some embodiments, the imaging telescope is configured to receive the imaging light from the second optical element and form an image of the object on the detector. In some embodiments, the detector comprises a camera. In some embodiments, the camera and the object are located on the same side of the optical waveguide. In some embodiments, the camera is located above the object. In some embodiments, the camera is located below the object. In some embodiments, the optical system is configured to reduce aberrations (e.g., chromatic aberration, distortion, etc.) recorded by the detector (e.g., fig. 6A, 6B, 7A, 7B, and 7C).
In some embodiments, directing the first light ray includes causing reflection and diffraction of the first light ray. In some embodiments, the first optical element is disposed on the second surface of the optical waveguide such that the first light rays impinge on the optical waveguide before impinging on the first optical element. In some embodiments, the optical waveguide receives the first light on a first surface of the optical waveguide, and the first light that has passed through the first surface of the optical waveguide is reflected by a first optical element located on a second surface of the optical waveguide. In some embodiments, the optical waveguide is located between the object and the first optical element.
In some embodiments, the second optical element includes a polarizer holographic element, a geometric phase lens (e.g., an eccentric geometric phase lens), an output mirror, or an output grating.
In some embodiments, the first light ray forms part of a light beam that is relayed to the second optical element with reduced magnification. In some embodiments, the optical waveguide further comprises an intermediate field lens to reduce the magnification of the optical beam. In some embodiments, the intermediate field lens is disposed on a surface of the optical waveguide. In some embodiments, the intermediate field lens is embedded in the optical waveguide.
In some embodiments, the first optical element includes a coating that provides a focusing optical power (e.g., the first optical element is a thin film optic having optical power).
In some embodiments, the first optical element and the second optical element form an off-axis Galilean telescope (e.g., Fig. 6B). In some embodiments, the first optical element is a positive lens and the second optical element is a negative lens.
In some embodiments, the first optical element and the second optical element (collectively) form an Offner telescope (e.g., Fig. 8A). In some embodiments, the Offner telescope includes three reflective surfaces, the second of which is located at an intermediate image plane of the optical system (e.g., Fig. 8A).
In some embodiments, the optical system includes an off-axis Galilean telescope (e.g., an off-axis Galilean telescope separate from the first optical element and the second optical element). In some embodiments, the off-axis Galilean telescope receives light exiting the optical waveguide (e.g., Figs. 5C and 5D) and images it onto the detector. In some embodiments, the optical system further includes an optical relay system to image the output of the off-axis Galilean telescope onto the detector. In some embodiments, the off-axis Galilean telescope includes a converging lens and a diverging lens. Both the converging lens and the diverging lens are eccentric, and the diverging lens is configured to reduce aberrations associated with the converging lens. In some embodiments, the aberrations include distortion. In some embodiments, the aberrations include chromatic aberration.
In some embodiments, the optical system includes a fourth optical element (e.g., fig. 7A, 7B, and 7C) that corrects for distortion. In some embodiments, the fourth optical element comprises a coating.
In some embodiments, the optical system further comprises two off-axis reflective elliptical surfaces (e.g., fig. 8B). In some embodiments, the first optical element comprises one of two off-axis reflective elliptical surfaces.
According to some embodiments, an imaging system includes an optical system and a detector configured to receive an image of an object from the optical system.
In some embodiments, the object comprises an eye, the detector comprises a camera, the camera is located outside a field of view of the eye, and the first optical element is located in front of the eye to allow the camera to image a direct view of the eye.
In some embodiments, the imaging system is included in a head-mounted device (e.g., the imaging system operates as part of an eye tracking unit of the head-mounted device).
According to some embodiments, a method for relaying an eye image includes receiving light from a user's eye at a first optical element. The first optical element is configured to direct a first light ray in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward the second optical element via total internal reflection. The first optical element is also configured to direct a second light ray in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence such that the second light ray propagates away from the second optical element. The method includes directing the first light from the optical waveguide to a detector with a second optical element.
In some embodiments, the method further comprises projecting the first light ray onto a detector to form an image, and determining the location of the pupil of the user's eye from the image. In some embodiments, imaging the first light rays onto the camera includes sending the first light rays coupled out of the waveguide into an optical relay system, with the camera located at an image plane of the optical system (e.g., Figs. 5A and 5B). In some embodiments, the optical relay system includes an off-axis Galilean telescope, an off-axis Keplerian telescope, an Offner telescope, and/or two off-axis elliptical surfaces.
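A minimal sketch of the pupil-location step, assuming only that the relayed eye image reaches the detector as a 2D intensity array (this centroid-of-dark-pixels approach is a generic illustration, not the tracking algorithm of this disclosure):

    import numpy as np

    def estimate_pupil_center(eye_image):
        # Treat the darkest pixels of the relayed eye image as the pupil and
        # return the (x, y) centroid of those pixels; the 25%-of-range cut-off
        # below is an arbitrary assumed threshold.
        lo, hi = float(eye_image.min()), float(eye_image.max())
        threshold = lo + 0.25 * (hi - lo)
        ys, xs = np.nonzero(eye_image <= threshold)
        if xs.size == 0:
            return None
        return float(xs.mean()), float(ys.mean())

    # Synthetic frame: a dark disc (pupil) on a brighter background.
    yy, xx = np.mgrid[0:120, 0:160]
    frame = np.where((yy - 60) ** 2 + (xx - 90) ** 2 < 15 ** 2, 20.0, 200.0)
    print(estimate_pupil_center(frame))   # approximately (90.0, 60.0)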
Although the various figures show the operation of a particular component or set of components with respect to one eye, those of ordinary skill in the art will appreciate that similar operations may be performed with respect to the other eye or both eyes. For the sake of brevity, such details are not repeated herein.
Although some of the different figures show multiple logical stages in a particular order, stages that are not order dependent may be reordered and other stages may be combined or broken down. Although some reordering or other grouping is specifically mentioned, other reordering or grouping will be apparent to one of ordinary skill in the art, and thus the ordering and grouping presented herein is not an exhaustive list of alternatives. Further, it should be recognized that these stages could be implemented in hardware, firmware, software, or any combination thereof.
The foregoing description has, for purposes of explanation, been presented with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the scope of the claims to the precise forms disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen in order to best explain the principles underlying the claims and their practical application, and to thereby enable others skilled in the art to best utilize the embodiments with various modifications suited to the particular uses contemplated.

Claims (25)

1. An optical system, comprising:
an optical waveguide; and
a first optical element configured to:
i) Directing a first light ray from a user's eye in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward the second optical element via total internal reflection, and
ii) directing a second light ray from the user's eye in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence such that the second light ray propagates away from the second optical element,
wherein the second optical element is configured to direct the first light rays propagating through the optical waveguide to a detector;
wherein directing the first light rays comprises causing reflection and diffraction of the first light rays;
and wherein the first optical element is disposed on the second surface of the optical waveguide such that the first light impinges on the optical waveguide before impinging on the first optical element.
2. The optical system of claim 1, wherein the first optical element comprises an element selected from the group consisting of a polarizer holographic element and a geometric phase lens.
3. The optical system of claim 1, wherein the first light has a wavelength greater than 850 nm.
4. The optical system of claim 2, wherein the first light has a wavelength greater than 850 nm.
5. The optical system of claim 3, wherein the first optical element is configured to transmit third light rays having a wavelength of less than 800nm such that the third light rays propagate away from the second optical element.
6. The optical system of claim 4, wherein the first optical element is configured to transmit third light rays having a wavelength of less than 800nm such that the third light rays propagate away from the second optical element.
7. The optical system of any one of claims 1-6, further comprising the detector, wherein the first light ray impinging on the first optical element comprises imaging light from a subject, and the optical system is configured to project the imaging light onto the detector.
8. The optical system of any of claims 1-6, wherein the second optical element comprises an element selected from the group consisting of a polarizer holographic element, a geometric phase lens, an output mirror, and an output grating.
9. The optical system of claim 7, wherein the second optical element comprises an element selected from the group consisting of a polarizer holographic element, a geometric phase lens, an output mirror, and an output grating.
10. The optical system according to any one of claims 1-6, 9, wherein the first light ray forms part of a light beam relayed to the second optical element with reduced magnification.
11. The optical system of claim 10, wherein the optical waveguide further comprises an intermediate field lens to reduce the magnification of the optical beam.
12. The optical system of any of claims 1-6, 9, 11, wherein the first optical element further comprises a coating that provides a focusing optical power.
13. The optical system of any of claims 1-6, 9, 11, wherein the first and second optical elements form an off-axis Galilean telescope.
14. The optical system of any of claims 1-6, 9, 11, wherein the first optical element and the second optical element form an Offner telescope.
15. The optical system of claim 14, wherein the Offner telescope comprises three reflective surfaces, a second of the three reflective surfaces being located at an intermediate image plane of the optical system.
16. An imaging system, comprising:
the optical system of any one of claims 1-6 and 8; and
a detector configured to receive an image of an object from the optical system.
17. The imaging system of claim 16, wherein the object comprises an eye, the detector comprises a camera, the camera is positioned outside a field of view of the eye, and the first optical element is positioned in front of the eye to allow the camera to image a direct view of the eye.
18. The imaging system of claim 16 or 17, wherein the imaging system is comprised in a head-mounted device.
19. An imaging system, comprising:
an optical system according to any one of claims 7 and 9.
20. The imaging system of claim 19, wherein the object comprises an eye, the detector comprises a camera positioned outside a field of view of the eye, and the first optical element is positioned in front of the eye to allow the camera to image a direct view of the eye.
21. The imaging system of claim 19 or 20, wherein the imaging system is included in a head-mounted device.
22. An imaging system, comprising:
an optical system according to any one of claims 10-15.
23. The imaging system of claim 22, wherein the imaging system is included in a head-mounted device.
24. A method for relaying an eye image, the method comprising:
receiving light from a user's eye at a first optical element, wherein the first optical element is configured to:
i) Directing a first light ray from a user's eye in a first direction, the first light ray having a first circular polarization and impinging on the first optical element at a first angle of incidence such that the first light ray propagates through the optical waveguide toward the second optical element via total internal reflection, and
ii) directing a second light ray from the user's eye in a second direction different from the first direction, the second light ray having a second circular polarization different from the first circular polarization and impinging on the first optical element at the first angle of incidence such that the second light ray propagates away from the second optical element, and
directing the first light from the optical waveguide to a detector with the second optical element;
wherein directing the first light rays comprises causing reflection and diffraction of the first light rays;
and wherein the first optical element is disposed on the second surface of the optical waveguide such that the first light impinges on the optical waveguide before impinging on the first optical element.
25. The method of claim 24, further comprising:
projecting the first light onto the detector to form an image; and
the position of the pupil of the user's eye is determined from the image.
CN201980033381.4A 2018-05-18 2019-04-22 Waveguide imaging based eye tracking Active CN112136074B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211554573.XA CN116009243A (en) 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US201862673805P 2018-05-18 2018-05-18
US62/673,805 2018-05-18
US201962804136P 2019-02-11 2019-02-11
US62/804,136 2019-02-11
US16/359,117 US11256086B2 (en) 2018-05-18 2019-03-20 Eye tracking based on waveguide imaging
US16/359,117 2019-03-20
PCT/US2019/028452 WO2019221875A1 (en) 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211554573.XA Division CN116009243A (en) 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging

Publications (2)

Publication Number Publication Date
CN112136074A CN112136074A (en) 2020-12-25
CN112136074B true CN112136074B (en) 2022-12-09

Family

ID=68540713

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211554573.XA Pending CN116009243A (en) 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging
CN201980033381.4A Active CN112136074B (en) 2018-05-18 2019-04-22 Waveguide imaging based eye tracking

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202211554573.XA Pending CN116009243A (en) 2018-05-18 2019-04-22 Eye tracking based on waveguide imaging

Country Status (3)

Country Link
EP (1) EP3794396A4 (en)
CN (2) CN116009243A (en)
WO (1) WO2019221875A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11256086B2 (en) 2018-05-18 2022-02-22 Facebook Technologies, Llc Eye tracking based on waveguide imaging

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104126143A (en) * 2012-02-15 2014-10-29 Google Inc. Heads-up display including eye tracking
WO2017134412A1 (en) * 2016-02-04 2017-08-10 Milan Momcilo Popovich Holographic waveguide optical tracker

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007021036A1 (en) * 2007-05-04 2008-11-06 Carl Zeiss Ag Display device and display method for binocular display of a multicolor image
US9134534B2 (en) * 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9456744B2 (en) * 2012-05-11 2016-10-04 Digilens, Inc. Apparatus for eye tracking
WO2015139761A1 (en) * 2014-03-20 2015-09-24 Csem Centre Suisse D'electronique Et De Microtechnique Sa - Recherche Et Developpement Imaging system
US9377623B2 (en) * 2014-08-11 2016-06-28 Microsoft Technology Licensing, Llc Waveguide eye tracking employing volume Bragg grating
US9989765B2 (en) * 2015-08-03 2018-06-05 Oculus Vr, Llc Tile array for near-ocular display
US10254542B2 (en) 2016-11-01 2019-04-09 Microsoft Technology Licensing, Llc Holographic projector for a waveguide display


Also Published As

Publication number Publication date
EP3794396A1 (en) 2021-03-24
WO2019221875A1 (en) 2019-11-21
CN116009243A (en) 2023-04-25
CN112136074A (en) 2020-12-25
EP3794396A4 (en) 2021-07-28

Similar Documents

Publication Publication Date Title
US11933975B2 (en) Eye tracking based on waveguide imaging
US20220163808A1 (en) Optical assembly with polarization volume holographic element
EP3729173B1 (en) Integrated augmented reality head-mounted display for pupil steering
KR102391561B1 (en) Field curvature correction display
EP3721286B1 (en) Compact multi-color beam combiner using a geometric phase lens
US11624922B2 (en) Optical assemblies having polarization volume gratings for projecting augmented reality content
US10600352B1 (en) Display device with a switchable window and see-through pancake lens assembly
US10942320B2 (en) Dispersion compensation for light coupling through slanted facet of optical waveguide
CN113454515A (en) Holographic illuminator in field
US10969675B1 (en) Optical assemblies having scanning reflectors for projecting augmented reality content
US11914162B1 (en) Display devices with wavelength-dependent reflectors for eye tracking
CN112136074B (en) Waveguide imaging based eye tracking
US10955675B1 (en) Variable resolution display device with switchable window and see-through pancake lens assembly
US11366298B1 (en) Eye tracking based on telecentric polarization sensitive grating
US11714282B2 (en) Compact array light source for scanning display
US11237389B1 (en) Wedge combiner for eye-tracking
US11579425B1 (en) Narrow-band peripheral see-through pancake lens assembly and display device with same
US11726326B1 (en) Wedge light guide
US20230168506A1 (en) High efficiency optical assembly with folded optical path
US11586024B1 (en) Peripheral see-through pancake lens assembly and display device with same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: California, USA
Applicant after: Yuan Platform Technology Co.,Ltd.
Address before: California, USA
Applicant before: Facebook Technologies, LLC
GR01 Patent grant