CN117724240A - Eye tracking system with in-plane illumination - Google Patents

Eye tracking system with in-plane illumination

Info

Publication number
CN117724240A
Authority
CN
China
Prior art keywords
eye
illuminators
led
display
tracking system
Prior art date
Legal status
Pending
Application number
CN202311196635.9A
Other languages
Chinese (zh)
Inventor
查德·里奇滕汉
莫尔特扎·卡拉米
叶宽培
穆罕默德·穆特鲁
张绮
Current Assignee
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date
Filing date
Publication date
Priority claimed from U.S. patent application No. 18/342,277 (US20240094809A1)
Application filed by Meta Platforms Technologies LLC filed Critical Meta Platforms Technologies LLC
Publication of CN117724240A publication Critical patent/CN117724240A/en


Abstract

An eye tracking system with in-plane illumination of an optical assembly is described herein. A side-emitting Light Emitting Diode (LED) aligned with the plane of the optical assembly of a near-eye display device is used to illuminate the user's eye and produce glints that can be detected by an eye-tracking camera. When the optical assembly includes a corrective optical lens or similar element that may distort the illumination beam from the LED, the distortion is mitigated by using an LED that is modified inside or outside its package to provide a tilted beam. In addition to mitigation at the package level, such as a reflector or a label, the edge portion of the distorting optical element may be shaped, or compensated with a refractive element, to redirect the beam toward the eye.

Description

Eye tracking system with in-plane illumination
Cross Reference to Related Applications
This patent application claims priority from U.S. provisional patent application No. 63/407,587, entitled "Eye Tracking System with In-Plane Illumination," filed on September 16, 2022, and from U.S. patent application No. 18/342,277, filed on June 27, 2023.
Technical Field
The present application relates generally to eye tracking in a near-eye display device, and more particularly to a configuration of Light Emitting Diodes (LEDs) that project illumination in the plane of an optical assembly to achieve eye tracking functionality.
Background
In recent years, with advances in technology, the popularity and proliferation of content creation and delivery have increased greatly. In particular, interactive content, such as virtual reality (VR) content, augmented reality (AR) content, mixed reality (MR) content, and content within and associated with real and/or virtual environments (e.g., the metaverse), is very attractive to consumers.
To facilitate the delivery of these and other related content, service providers strive to provide various forms of wearable display systems. One example of this may be a head-mounted display (HMD) device, such as wearable eyeglasses or a wearable headset. In some examples, a head-mounted display (HMD) device may project or direct light so that virtual objects may be displayed, or images of real objects may be combined with virtual objects, such as in virtual reality (VR), augmented reality (AR), or mixed reality (MR) applications. For example, in an AR system, a user may view both images of virtual objects (e.g., computer-generated images (CGIs)) and the ambient environment.
Disclosure of Invention
The present disclosure provides an eye tracking system comprising: a plurality of illuminators for producing glints on a surface of an eye, wherein the plurality of illuminators are positioned along a frame comprising an optical assembly for a near-eye display device, and illumination of the plurality of illuminators is aligned with a plane of the optical assembly; an eye tracking camera for capturing an image of the eye with the glints; and a processor communicatively coupled with the plurality of illuminators and the eye tracking camera, the processor for determining three-dimensional features of the surface of the eye based on the captured image of the eye with the glints to determine a position and gaze of the eye.
The present disclosure also provides a near-eye display device comprising: an optical assembly comprising a waveguide display for projecting artificial reality content to an eye; an eye tracking system comprising a plurality of illuminators for producing glints on a surface of the eye and an eye tracking camera, wherein the plurality of illuminators are positioned along a frame comprising the optical assembly, illumination of the plurality of illuminators is aligned with a plane of the optical assembly, and the eye tracking camera is for capturing the glints on the surface of the eye; and a processor communicatively coupled with the plurality of illuminators and the eye tracking camera, the processor for determining three-dimensional features of the surface of the eye based on the captured glints to determine a position and gaze of the eye.
The present disclosure also provides a method comprising: illuminating an eye by a plurality of illuminators positioned along a frame comprising an optical assembly for a near-eye display device, wherein illumination of the plurality of illuminators is aligned with a plane of the optical assembly; capturing, by an eye tracking camera, an image of the eye with glints resulting from the illumination; determining, by a processor communicatively coupled to the plurality of illuminators and the eye tracking camera, three-dimensional features of a surface of the eye based on the captured image of the eye with the glints; and determining a position and gaze of the eye based on the determined three-dimensional features.
Drawings
Features of the present disclosure are illustrated by way of example and not limited by the following figures, in which like references indicate similar elements. Those skilled in the art will readily recognize from the following description that alternative examples of the structures and methods illustrated in the figures may be employed without departing from the principles described herein.
Fig. 1 illustrates a block diagram of an artificial reality system environment including a near-eye display according to an example.
Fig. 2 shows a perspective view of a near-eye display in the form of a Head Mounted Display (HMD) device according to an example.
Fig. 3A and 3B show perspective and top views of a near-eye display in the form of a pair of glasses according to an example.
Fig. 4 illustrates a simplified lens peripheral illumination eye tracking system according to an example.
Fig. 5A shows reflection off a waveguide display in a lens periphery illumination eye tracking system, which may result in double glints, according to an example.
Fig. 5B illustrates mitigating reflection off a waveguide display in a lens periphery illumination eye tracking system according to an example.
Fig. 6A illustrates beam scattering by a correction lens in a lens periphery illumination eye tracking system according to an example.
Fig. 6B illustrates a prismatic representation of a correction lens in a lens periphery illumination eye tracking system according to an example.
Fig. 6C shows a representative irradiance diagram of a lens periphery illumination eye tracking system, illustrating the loss of illumination due to the correction lens, which may result in double glints, according to an example.
Fig. 7 illustrates various embodiments of a side-emitting light-emitting diode (LED) or top-emitting LED for beam shaping in a lens periphery illumination eye tracking system according to an example.
Fig. 8A-8E illustrate various embodiments of mitigating distortion of a Light Emitting Diode (LED) beam by a corrective optical lens in an eye tracking system, according to an example.
Detailed Description
For simplicity and illustrative purposes, the present application is described by referring mainly to examples thereof. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. It will be readily apparent, however, that the present application may be practiced without limitation to these specific details. In other instances, some methods and structures that would be readily understood by one of ordinary skill have not been described in detail so as not to unnecessarily obscure the present application. As used herein, the terms "a" or "an" are intended to denote at least one of a particular element, the term "comprising" means including but not limited to, and the term "based on" means based at least in part on.
Tracking the position and orientation of the eyes, as well as the gaze direction, in a head-mounted display (HMD) device may unlock display and rendering architectures that can greatly reduce the power and computing requirements of rendering a three-dimensional (3D) environment. In addition, eye tracking enables gaze prediction and intention inference, which can provide a visually immersive user experience during the user's interaction with the virtual environment and better meet user needs.
Eye tracking may be achieved via a variety of techniques. One technique is fringe projection, in which a periodic pattern is projected onto the eye and the reflected pattern is used to determine 3D features. The fringe pattern is a periodic pattern; when its phase is limited to a specific interval, the phase is referred to as the wrapped phase, and otherwise as the unwrapped phase. Using phase rather than intensity to establish the correspondence between the projector and the camera allows accurate detection without running complex algorithms in the background.
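As a rough illustration only (not part of this disclosure), a wrapped phase map could be estimated from N phase-shifted fringe images using a standard N-step phase-shifting formula; the function and array names below are hypothetical:

import numpy as np

def wrapped_phase(frames):
    # Estimate the wrapped phase of a projected fringe pattern from N
    # phase-shifted camera frames (N-step phase shifting). The n-th frame
    # is assumed to use a phase shift of 2*pi*n/N.
    frames = np.asarray(frames, dtype=np.float64)
    n = frames.shape[0]
    shifts = 2.0 * np.pi * np.arange(n) / n
    num = np.tensordot(np.sin(shifts), frames, axes=1)
    den = np.tensordot(np.cos(shifts), frames, axes=1)
    # arctan2 wraps the result to the interval (-pi, pi]
    return np.arctan2(-num, den)

The returned values are the wrapped phase; recovering the unwrapped (absolute) phase requires an additional unwrapping step, such as the glint-anchored approach discussed later.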
In some examples of the present disclosure, eye tracking systems with in-plane illumination of an optical assembly are described. A side-emitting Light Emitting Diode (LED) aligned with the plane of the optical assembly of a near-eye display device may be used to illuminate the user's eye and produce glints that may be detected by an eye-tracking camera. When a corrective optical lens or similar element is included in the optical assembly, the illumination beam from the LED may be distorted; this distortion may be mitigated by using LEDs modified inside or outside the package to provide tilted beams (rather than beams aligned with the plane of the optical assembly). In addition to mitigation at the package level, such as a reflector or a label, the edge portion of the distorting optical element (e.g., the corrective optical lens) may be shaped to redirect the light beam toward the eye, or compensated with a refractive element that redirects the light beam toward the eye.
While some advantages and benefits of the present disclosure are apparent, other advantages and benefits may include reduced complexity and/or power consumption of the eye tracking system. Increased eye tracking accuracy, through higher illumination efficiency and accommodation of corrective optical lenses, may be an additional advantage.
Fig. 1 illustrates a block diagram of an artificial reality system environment 100 including a near-eye display according to an example. As used herein, a "near-eye display" may refer to a device (e.g., an optical device) that may be in close proximity to a user's eye. As used herein, "artificial reality" may refer to aspects of, for example, "virtual reality" or an environment of real and virtual elements, and may include the use of technologies associated with virtual reality (VR), augmented reality (AR), and/or mixed reality (MR). As used herein, a "user" may refer to a user or wearer of a "near-eye display."
As shown in fig. 1, the artificial reality system environment 100 may include a near-eye display 120, an optional external imaging device 150, and an optional input/output interface 140, each of which may be coupled to the console 110. In some cases, console 110 may be optional because the functionality of console 110 may be integrated into near-eye display 120. In some examples, the near-eye display 120 may be a head-mounted display (HMD) that presents content to a user.
In some cases, it may be generally desirable for a near-eye display system to expand the eyebox, reduce display haze, improve image quality (e.g., resolution and contrast), reduce physical size, increase power efficiency, and increase or expand the field of view (FOV). As used herein, a "field of view (FOV)" may refer to the angular range of an image as seen by a user, typically measured in degrees as observed by one eye (for a monocular head-mounted display (HMD)) or by both eyes (for a binocular head-mounted display (HMD)). Further, as used herein, an "eyebox" may be a two-dimensional box that may be positioned in front of the user's eyes, from which a display image from an image source may be viewed.
In some examples, in a near-eye display system, light from the surrounding environment may pass through a "see-through" region of a waveguide display (e.g., a transparent substrate) to reach the user's eyes. For example, in a near-eye display system, light projecting an image may be coupled into a transparent substrate of a waveguide, propagate within the waveguide, and be coupled out or extracted from the waveguide at one or more locations to replicate the exit pupil and expand the eyebox.
In some examples, the near-eye display 120 may include one or more rigid bodies, which may be rigidly or non-rigidly coupled to each other. In some examples, the rigid coupling between the plurality of rigid bodies may be such that the coupled rigid bodies act as a single rigid entity, while in other examples, the non-rigid coupling between the plurality of rigid bodies may allow the rigid bodies to move relative to one another.
In some examples, the near-eye display 120 may be implemented in any suitable form factor, including a head-mounted display (HMD), a pair of glasses, or other similar wearable glasses or devices. Examples of near-eye display 120 are further described below with respect to fig. 2 and fig. 3A and 3B. Further, in some examples, the functionality described herein may be used in a head-mounted display (HMD) or head-mounted device that combines images of an environment external to the near-eye display 120 with artificial reality content (e.g., computer-generated images). Thus, in some examples, the near-eye display 120 may augment an image of a physical real-world environment external to the near-eye display 120 with generated and/or superimposed digital content (e.g., images, video, sound, etc.) to present augmented reality to a user.
In some examples, near-eye display 120 may include any number of display electronics 122, display optics 124, and eye tracking unit 130. In some examples, the near-eye display 120 may also include one or more positioners 126, one or more position sensors 128, and an inertial measurement unit (Inertial Measurement Unit, IMU) 132. In some examples, the near-eye display 120 may omit any of the eye tracking unit 130, the one or more locators 126, the one or more position sensors 128, and the IMU 132, or may include additional elements.
In some examples, display electronics 122 may display images to a user or facilitate display of images to a user based on data received from, for example, optional console 110. In some examples, display electronics 122 may include one or more display panels. In some examples, display electronics 122 may include any number of pixels that emit light of a dominant color, such as red, green, blue, white, or yellow. In some examples, the display electronics 122 may display a three-dimensional (3D) image, for example, using a stereoscopic effect produced by a two-dimensional panel, to create a subjective perception of image depth.
In some examples, near-eye display 120 may include a projector (not shown) that may form an image in the angular domain for direct viewing by a viewer's eye through the pupil. Projectors may employ controllable light sources (e.g., laser sources) and microelectromechanical systems (Micro-electromechanical System, MEMS) beam scanners to generate a light field from, for example, a collimated beam. In some examples, the same projector or a different projector may be used to project a fringe pattern onto the eye, which may be collected and analyzed by a camera (e.g., by the eye tracking unit 130) to determine the position, gaze, etc. of the eye (pupil).
In some examples, display optics 124 may optically display image content (e.g., using an optical waveguide and/or coupler) or amplify received image light from display electronics 122, correct optical errors associated with the image light, and/or present the corrected image light to a user of near-eye display 120. In some examples, display optics 124 may include a single optical element or any number of combinations of various optical elements, as well as mechanical couplings to maintain the relative spacing and orientation of the optical elements in the combination. In some examples, one or more optical elements in display optics 124 may have an optical coating, such as an anti-reflective coating, a filter coating, and/or a combination of different optical coatings.
In some examples, display optics 124 may also be designed to correct one or more types of optical errors, such as two-dimensional optical errors, three-dimensional optical errors, or any combination thereof. Examples of two-dimensional errors may include barrel distortion, pincushion distortion, longitudinal chromatic aberration, and lateral chromatic aberration. Examples of three-dimensional errors may include spherical aberration, comatic aberration, field curvature, and astigmatism.
In some examples, one or more locators 126 may be objects located at particular positions on the near-eye display 120 relative to each other and relative to a reference point. In some examples, the optional console 110 may identify one or more locators 126 in images acquired by the optional external imaging device 150 to determine the location, orientation, or both of the artificial reality headset. The one or more locators 126 may each be a Light Emitting Diode (LED), a corner cube reflector, reflective markers, a light source that contrasts with the environment in which the near-eye display 120 operates, or any combination thereof.
In some examples, the external imaging device 150 may include one or more cameras, any other device capable of capturing images including one or more locators 126, or any combination thereof. The optional external imaging device 150 may be configured to detect light emitted or reflected from one or more locators 126 in the field of view of the optional external imaging device 150.
In some examples, one or more position sensors 128 may generate one or more measurement signals in response to movement of near-eye display 120. Examples of the one or more position sensors 128 may include any number of accelerometers, gyroscopes, magnetometers, and/or other motion detection or error correction sensors, or any combination thereof.
In some examples, Inertial Measurement Unit (IMU) 132 may be an electronic device that generates fast calibration data based on measurement signals received from the one or more position sensors 128. The one or more position sensors 128 may be located external to the Inertial Measurement Unit (IMU) 132, internal to the Inertial Measurement Unit (IMU) 132, or any combination thereof. Based on one or more measurement signals from the one or more position sensors 128, the Inertial Measurement Unit (IMU) 132 may generate fast calibration data representing an estimated position of the near-eye display 120 relative to an initial position of the near-eye display 120. For example, the Inertial Measurement Unit (IMU) 132 may integrate the measurement signals received from the accelerometer over time to estimate a velocity vector, and integrate the velocity vector over time to determine an estimated location of a reference point on the near-eye display 120. Alternatively, the Inertial Measurement Unit (IMU) 132 may provide the sampled measurement signals to the optional console 110, which may determine the fast calibration data.
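As a simplified, hypothetical sketch of the integration described above (not the device's actual firmware), a single dead-reckoning update might look like the following, assuming the accelerometer sample has already been gravity-compensated and rotated into the world frame:

import numpy as np

def imu_step(position, velocity, accel, dt):
    # One dead-reckoning update: integrate the accelerometer sample over
    # the sample interval dt to update the velocity vector, then integrate
    # the velocity to update the estimated position of the reference point.
    velocity = velocity + np.asarray(accel, dtype=float) * dt
    position = position + velocity * dt
    return position, velocity

In practice such open-loop integration drifts quickly, which is one reason the fast calibration data may be combined with slow calibration information from an external imaging device, as described below.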
Eye tracking unit 130 may include one or more eye tracking systems. As used herein, "eye tracking" may refer to determining the position or relative position of an eye, including the orientation, location, and/or gaze of a user's eye. In some examples, the eye tracking system may include an imaging system that images one or both eyes, and the eye tracking system may optionally include a light emitter that may generate light (e.g., a fringe pattern) directed at the eye such that light reflected by the eye may be captured by the imaging system (e.g., a camera). In other examples, eye tracking unit 130 may capture reflected radio waves emitted by a miniature radar unit. Such eye-related data may be used to determine or predict the position, orientation, activity, location, and/or gaze of the eye. In addition to using fringe pattern reflections, the eye tracking unit 130 may also use one or more illuminators to project light (e.g., infrared or near-infrared light) onto the eye and detect glints, which may be used to detect the eye surface and determine gaze.
In some examples, near-eye display 120 may use the orientation of the eyes to introduce depth cues (e.g., blurring images outside the user's main line of sight), gather heuristics about user interaction in the virtual reality (VR) media (e.g., time spent on any particular object or frame as a function of the experienced stimuli), provide some other function based in part on the orientation of at least one of the user's eyes, or any combination thereof. In some examples, because the orientation may be determined for both eyes of the user, the eye tracking unit 130 may be able to determine where the user is looking or predict user patterns, and so on.
In some examples, the input/output interface 140 may be a device that allows a user to send an action request to the optional console 110. As used herein, an "action request" may be a request to perform a particular action. For example, the action request may be to start an application or end an application, or to perform a particular action within an application. Input/output interface 140 may include one or more input devices. Example input devices may include a keyboard, mouse, game controller, glove, button, touch screen, or any other suitable device for receiving action requests and transmitting the received action requests to the optional console 110. In some examples, the action request received by the input/output interface 140 may be communicated to the optional console 110, which may perform an action corresponding to the requested action.
In some examples, console 110 may provide content to near-eye display 120 for presentation to a user based on information received from one or more of external imaging device 150, near-eye display 120, and input/output interface 140. For example, in the example shown in fig. 1, optional console 110 may include an application library 112, a head mounted device tracking module 114, a virtual reality engine 116, and an eye tracking module 118. Some examples of the optional console 110 may include different modules or additional modules than those described in connection with fig. 1. The functions described further below may be distributed among the various components of the optional console 110 in a manner different from that described herein.
In some examples, the optional console 110 may include a processor and a non-transitory computer readable storage medium storing instructions executable by the processor. A processor may include multiple processing units that execute instructions in parallel. The non-transitory computer readable storage medium may be any memory, such as a hard disk drive, removable memory, or solid state drive (e.g., flash memory or dynamic random access memory (Dynamic Random Access Memory, DRAM)). In some examples, the modules of the optional console 110 described in connection with fig. 1 may be encoded as instructions in a non-transitory computer-readable storage medium that, when executed by a processor, cause the processor to perform the functions further described below. It should be appreciated that the optional console 110 may or may not be required, or the optional console 110 may be integrated with the near-eye display 120 or separate from the near-eye display.
In some examples, the application library 112 may store one or more applications for execution by the optional console 110. An application may include a set of instructions that, when executed by a processor, generate content for presentation to a user. Examples of applications may include a gaming application, a conferencing application, a video playback application, or other suitable application.
In some examples, the head-mounted device tracking module 114 may use slow calibration information from the external imaging device 150 to track movement of the near-eye display 120. For example, the head-mounted device tracking module 114 may determine the location of a reference point of the near-eye display 120 using the locators observed in the slow calibration information and a model of the near-eye display 120. Additionally, in some examples, the head-mounted device tracking module 114 may use portions of the fast calibration information, the slow calibration information, or any combination thereof, to predict a future position of the near-eye display 120. In some examples, the head-mounted device tracking module 114 may provide the estimated or predicted future position of the near-eye display 120 to the virtual reality engine 116.
In some examples, the virtual reality engine 116 may execute an application within the artificial reality system environment 100 and receive position information of the near-eye display 120, acceleration information of the near-eye display 120, velocity information of the near-eye display 120, a predicted future position of the near-eye display 120, or any combination thereof from the head mounted device tracking module 114. In some examples, virtual reality engine 116 may also receive information from eye tracking module 118 of estimated eye position and orientation. Based on the received information, the virtual reality engine 116 may determine content to provide to the near-eye display 120 for presentation to the user.
In some examples, eye tracking module 118, which may be implemented as a processor, may receive eye tracking data from eye tracking unit 130 and determine a position of a user's eye based on the eye tracking data. In some examples, the position of the eye may include an orientation, a position, or both of the eye relative to the near-eye display 120 or any element of the near-eye display. Thus, in these examples, because the axis of rotation of the eye varies with the position of the eye in its orbital, determining the position of the eye in its orbital may allow the eye tracking module 118 to more accurately determine the orientation of the eye.
In some examples, the position of the projector of the display system may be adjusted to enable any number of design modifications. For example, in some cases, the projector may be located in front of the viewer's eye (i.e., a "front-mounted" placement). In some examples, in a front-facing placement, the projector of the display system may be located at a position away from the user's eyes (i.e., "world-side"). In some examples, a Head Mounted Display (HMD) device may utilize a front-facing placement to spread light toward a user's eyes to project images.
Fig. 2 shows a perspective view of a near-eye display in the form of a Head Mounted Display (HMD) device 200 according to an example. In some examples, head Mounted Display (HMD) device 200 may be part of a Virtual Reality (VR) system, an Augmented Reality (AR) system, a Mixed Reality (MR) system, another system using a display or a wearable device, or any combination thereof. In some examples, a Head Mounted Display (HMD) device 200 may include a body 220 and a headband 230. Fig. 2 shows the bottom side 223, front side 225, and left side 227 of the body 220 in perspective view. In some examples, headband 230 may have an adjustable or extensible length. In particular, in some examples, there may be sufficient space between the body 220 and the headband 230 of the head-mounted display (HMD) device 200 to allow a user to wear the head-mounted display (HMD) device 200 onto the user's head. For example, the length of headband 230 may be adjustable to accommodate a range of user head sizes. In some examples, head Mounted Display (HMD) device 200 may include additional components, fewer components, and/or different components.
In some examples, head Mounted Display (HMD) device 200 may present media or other digital content to a user, including virtual and/or enhanced views of a physical real-world environment with computer-generated elements. Examples of media or digital content presented by Head Mounted Display (HMD) device 200 may include images (e.g., two-dimensional images (2D) or three-dimensional images (3D)), video (e.g., 2D video or 3D video), audio, or any combinations thereof. In some examples, images and video may be presented to each eye of a user through one or more display components (not shown in fig. 2) housed in the body 220 of the Head Mounted Display (HMD) device 200.
In some examples, head Mounted Display (HMD) device 200 may include various sensors (not shown), such as a depth sensor, a motion sensor, a position sensor, and/or an eye tracking sensor. Some of these sensors may use any number of structured or unstructured light patterns for sensing purposes. In some examples, as described with respect to fig. 1, a Head Mounted Display (HMD) device 200 may include an input/output interface 140 for communicating with console 110. In some examples, head Mounted Display (HMD) device 200 may include a virtual reality engine (not shown), which, although not shown, is similar to virtual reality engine 116 described with respect to fig. 1, which may execute applications within Head Mounted Display (HMD) device 200 and receive depth information, position information, acceleration information, velocity information, predicted future positions, or any combination thereof, of Head Mounted Display (HMD) device 200 from various sensors.
In some examples, information received by virtual reality engine 116 may be used to generate signals (e.g., display instructions) for the one or more display components. In some examples, Head Mounted Display (HMD) device 200 may include locators (not shown) similar to the one or more locators 126 described with respect to fig. 1, which may be located at fixed positions relative to each other and relative to a reference point on the body 220 of Head Mounted Display (HMD) device 200. Each of these locators may emit light that is detectable by an external imaging device. This may be useful for head tracking or other movement/orientation purposes. It should be understood that other elements or components may also be used in addition to or in place of such locators.
It should be appreciated that in some examples, a projector mounted in a display system may be placed near and/or closer to the user's eye (i.e., "eye-side"). In some examples, as discussed herein, a projector for a display system shaped like glasses may be mounted or positioned in a temple arm of the glasses (i.e., top far corner of the lens side). It should be appreciated that in some cases, utilizing a back-mounted projector placement may help reduce the size or volume of any required housing required for the display system, which may also significantly improve the user experience for the user.
In some examples, the projector may provide structured light (a fringe pattern) to the eye, which may be captured by the eye-tracking camera 212. The eye-tracking camera 212, or a communicatively coupled processor (e.g., eye tracking module 118 in fig. 1), may analyze the captured reflections of the fringe pattern and generate a phase map of the fringe pattern, which may provide depth information for the eye and its structure. For example, when phase unwrapping is used, an absolute phase map may be generated using, for example, a glint reflected from the eye as an anchor point.
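A minimal sketch of such glint-anchored unwrapping, assuming a wrapped phase map and a known absolute phase at the glint pixel are already available (both hypothetical inputs, not part of this disclosure), might look like this:

import numpy as np

def absolute_phase(wrapped, glint_px, glint_phase):
    # Unwrap the 2D wrapped phase map row-wise and then column-wise, and
    # shift it so the phase at the glint pixel matches its known absolute
    # value; the shift is snapped to a multiple of 2*pi to preserve fringes.
    unwrapped = np.unwrap(np.unwrap(wrapped, axis=1), axis=0)
    r, c = glint_px
    offset = glint_phase - unwrapped[r, c]
    offset = 2.0 * np.pi * np.round(offset / (2.0 * np.pi))
    return unwrapped + offset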
Fig. 3A shows a perspective view of a near-eye display 300 in the form of a pair of glasses (or other similar glasses) according to an example. In some examples, near-eye display 300 may be a particular example of near-eye display 120 of fig. 1, and the near-eye display may be configured to operate as a virtual reality display, an Augmented Reality (AR) display, and/or a Mixed Reality (MR) display.
In some examples, near-eye display 300 may include a frame 305 and a display 310. In some examples, display 310 may be configured to present media or other content to a user. In some examples, display 310 may include display electronics and/or display optics similar to the components described with respect to fig. 1-2. For example, as described above with respect to near-eye display 120 of fig. 1, display 310 may include a liquid crystal display (LCD) panel, a Light Emitting Diode (LED) display panel, or an optical display panel (e.g., a waveguide display assembly). In some examples, display 310 may also include any number of optical components, such as waveguides, gratings, lenses, mirrors, and the like. In other examples, display 310 may include a projector, or near-eye display 300 may include a projector in place of display 310.
In some examples, the near-eye display 300 may also include various sensors 350A, 350B, 350C, 350D, and 350E located on the frame 305 or within the frame 305. In some examples, as shown, the various sensors 350A-350E may include any number of depth sensors, motion sensors, position sensors, inertial sensors, and/or ambient light sensors. In some examples, the various sensors 350A-350E may include any number of image sensors configured to generate image data representing different fields of view in one or more different directions. In some examples, sensors 350A-350E may be used as input devices to control or affect the display content of the near-eye display, and/or to provide an interactive Virtual Reality (VR), augmented Reality (AR), and/or Mixed Reality (MR) experience to a user of near-eye display 300. In some examples, the various sensors 350A-350E may also be used for stereoscopic imaging or other similar applications.
In some examples, near-eye display 300 may also include one or more illuminators 330 to project light toward the eye. The projected light may be associated with different frequency bands (e.g., visible light, infrared light, near infrared light, etc.), and the projected light may be used for various purposes. In some examples, light from one or more illuminators 330 may be used to produce glints on the surface of the eye, which may then be used to determine the position (gaze) and other three-dimensional features of the eye.
In some examples, near-eye display 300 may also include a camera 340 or other image acquisition unit. The camera 340 may, for example, capture images of the physical environment in the field of view. In some cases, the acquired images may be processed, for example, by a virtual reality engine (e.g., virtual reality engine 116 of fig. 1) to add virtual objects to the acquired images or to modify physical objects in the acquired images, and the processed images may be displayed to a user by display 310 for an Augmented Reality (AR) application or a Mixed Reality (MR) application. Near-eye display 300 may also include an eye-tracking camera 312.
Fig. 3B shows a top view of a near-eye display in the form of a pair of glasses (or other similar glasses) according to an example. In some examples, the near-eye display 300 may include a frame 305 having a form factor of a pair of eyeglasses. For each eye, the frame 305 supports: a fringe projector 314 (e.g., any fringe projector variation contemplated herein), a display 310 that presents content to the eyebox 366, an eye-tracking camera 312, and one or more illuminators 330. Illuminator 330 may be used to illuminate eyebox 366 as well as to provide flickering illumination to the eye. The fringe projector 314 may provide a periodic fringe pattern to the user's eye. Display 310 may include a pupil replication waveguide to receive the fan beams and provide multiple laterally offset parallel copies of each of the fan beams to expand the projected image over eyebox 366.
In some examples, the pupil replication waveguide may be transparent or translucent to enable a user to view the outside world and the images projected into each eye and superimposed with the outside world view. The image projected into each eye may include objects arranged in a simulated parallax so as to appear immersed in the real world view.
Eye tracking camera 312 may be used to determine the position and/or orientation of the user's two eyes. Once the position and orientation of the user's eyes are known, the gaze convergence distance and direction may be determined. The image displayed by the display 310 may be dynamically adjusted to more realistically immerse the user in the displayed augmented reality scene, and/or to provide specific functionality for interaction with the augmented reality, in view of the user's gaze. In operation, illuminator 330 can illuminate an eye at a corresponding eyebox 366 to enable an eye tracking camera to obtain an image of the eye and provide reference reflection. The reflection (which may also be referred to as "glint") may be used as a reference point in the acquired eye image to facilitate the determination of the eye gaze direction by determining the position of the eye pupil image relative to the glint. To avoid that the illumination light is distracting to the user, the illumination light may be made invisible to the user. For example, infrared light may be used to illuminate the eyebox 366.
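As an illustrative sketch only (the polynomial mapping and coefficient names are assumptions, not part of this disclosure), the pupil-center-to-glint vector measured in the eye image might be mapped to a gaze direction with a per-user calibration:

import numpy as np

def gaze_from_pupil_and_glint(pupil_px, glint_px, coeffs_x, coeffs_y):
    # Map the pupil-center-to-glint vector (in image pixels) to horizontal
    # and vertical gaze angles using polynomial coefficients obtained from
    # a prior, user-specific calibration routine.
    dx, dy = np.subtract(pupil_px, glint_px).astype(float)
    features = np.array([1.0, dx, dy, dx * dy, dx ** 2, dy ** 2])
    theta_h = float(features @ coeffs_x)  # horizontal gaze angle, degrees
    theta_v = float(features @ coeffs_y)  # vertical gaze angle, degrees
    return theta_h, theta_v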
In some examples, the image processing and eye position/orientation determination functions may be performed by a central controller (not shown) of the near-eye display 300. The central controller may also provide control signals to the display 310 to generate an image to be displayed to the user based on the determined eye position, eye orientation, gaze direction, eye vergence, and the like.
Fig. 4 illustrates a simplified lens periphery illumination eye tracking system according to an example. Schematic 400 shows a waveguide display 402, a Light Emitting Diode (LED) 404 with LED beam expansion 406 (LED beam spot), a glass 408, an eye tracking camera 410, and an eye 412. As described herein, eye tracking captures data about one or both eyes of a user, such as detecting a person's presence, attention, focus, pupil position, and pupil size. Data points such as pupil position, eye gaze vector, gaze point, and eye openness may be calculated from the collected eye tracking information. With the collected data, an image may be projected more accurately onto the user's eye, and the user's intent may be detected (e.g., as input to an interactive display system).
Thus, a light beam from Light Emitting Diode (LED) 404 that passes through glass 408 and reaches eye 412 may be used as an eye tracking illumination source, where eye tracking camera 410 detects flicker on the surface of the eye generated by the light beam. The detected glints may be used in a three-dimensional detection algorithm, a phase mapping algorithm or the like to determine the three-dimensional topography of the eye in a non-contact manner, with high resolution and fast data processing.
In some examples, waveguide display 402 may project artificial reality content (i.e., computer-generated content) to eye 412. Waveguide display 402 may be a transparent display and allows light from the environment to pass through. Thus, the artificial reality content may be overlaid with real content from the environment. In some cases, several optical elements (e.g., lenses, polarizers, filters, waveguide plates, and the like) may be positioned between waveguide display 402 and glass 408 to form an optical assembly.
As described herein, glints generated by one or more illuminators are needed to extract the gaze and position of the eye. An in-field Light Emitting Diode (LED) illumination system may provide sufficient illumination. However, in-field systems are expensive, complex to integrate, and may not alleviate the challenges presented by corrective lenses, as discussed below. On the other hand, a lens-perimeter or on-frame Light Emitting Diode (LED) system may allow illumination light from the Light Emitting Diodes (LEDs) to avoid any optical elements in the optical assembly (located between waveguide display 402 and glass 408), allow for efficient illumination of the eye, and mitigate potential problems due to the optical elements (e.g., reflection or diffraction of light, ghost signals, etc.).
In some examples, light sources other than light emitting diodes may also be used as the illuminator. For example, side-emitting laser diodes, vertical cavity surface emitting laser diodes, or superluminescent light emitting diodes (superluminescent LED) are some non-limiting examples of light sources that may be used.
In some examples, a plurality of Light Emitting Diodes (LEDs) may be placed on a flexible printed circuit board (PCB) attached along the perimeter of the frame (which includes waveguide display 402). To provide illumination to eye 412, the Light Emitting Diodes (LEDs) may be positioned in a tilted manner (e.g., in a virtual reality (VR) application). However, this technique may increase the space required for the optical assembly, making the optical assembly thicker. To make the optical assembly thinner for use with lighter near-eye display devices (e.g., augmented reality (AR) glasses), the Light Emitting Diodes (LEDs) may be placed horizontally (with their beam expansion aligned with the plane of the optical assembly), wherein the light beam from the Light Emitting Diode (LED) 404 still passes through the glass 408, illuminates the eye 412, and provides the eye tracking function.
In some examples, one illuminator (e.g., Light Emitting Diode (LED) 404) may be sufficient to provide an anchor glint. However, any number of illuminators may be used. For example, in some practical embodiments, up to 8 to 10 Light Emitting Diodes (LEDs), or up to 20 Light Emitting Diodes (LEDs), may be used. In some embodiments, the illuminators may be infrared or near-infrared (NIR) to avoid distracting the user. Thus, the number and/or location of the illuminators may be selected based on design considerations. The illuminators may be positioned such that the glints are produced within the field of view (FOV) of the eye-tracking camera 410.
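One way to sanity-check a candidate illuminator location is to verify that its predicted glint falls inside the camera's field of view; the following is a hypothetical geometric check offered only as an illustration, not a description of any particular implementation:

import numpy as np

def glint_in_camera_fov(glint_xyz, cam_pos, cam_axis, half_fov_deg):
    # True if the predicted glint location lies inside a symmetric cone of
    # half-angle half_fov_deg around the eye tracking camera's optical axis.
    v = np.asarray(glint_xyz, dtype=float) - np.asarray(cam_pos, dtype=float)
    v = v / np.linalg.norm(v)
    a = np.asarray(cam_axis, dtype=float)
    a = a / np.linalg.norm(a)
    angle = np.degrees(np.arccos(np.clip(np.dot(v, a), -1.0, 1.0)))
    return angle <= half_fov_deg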
Fig. 5A illustrates reflection off a waveguide display in a lens periphery illumination eye tracking system, which may result in double glints, according to an example. Schematic 500A shows waveguide display 502 presenting artificial reality content to eye 512 via glass 508 and/or other optical components (not shown). The waveguide display 502 may be transparent or translucent and may pass light from the environment so that the eye 512 sees a real image of the environment. Schematic 500A also shows a Light Emitting Diode (LED) 504 with LED beam expansion 506, which includes a light beam (e.g., direct LED light beam 514) that passes through glass 508 and reaches eye 512. The reflection of the direct LED beam 514 may then be captured by the eye tracking camera 510 and used for three-dimensional detection of eye surface features to determine gaze and position. In some embodiments, some of the light beams from the Light Emitting Diode (LED) 504 may reach waveguide display 502 and may be reflected toward eye 512 as LED light beam 516 reflected from the waveguide display. Reflections of this secondary beam may also be captured by eye-tracking camera 510 and may cause double glints, thereby creating ghost signals and/or other mismatches in the glint analysis. In addition, a portion of the light (e.g., more than half of the light) may be wasted (i.e., not used for illumination of eye 512).
Fig. 5B illustrates mitigating reflection off the waveguide display in a lens periphery illumination eye tracking system according to an example. Schematic 500B shows the same eye tracking system configuration as in fig. 5A, with the addition of mirror 520 to mitigate the double-glint challenge described above.
In some examples, a mirror 520 or similar reflective element may be placed beside the Light Emitting Diode (LED) 504, between the Light Emitting Diode (LED) 504 and the waveguide display 502. The mirror 520 may reflect the light beam from the Light Emitting Diode (LED) 504 toward the waveguide display 502. The reflected beam 526 from the mirror may completely avoid the pupil of eye 512 and thus not reflect from the eye to the eye tracking camera 510, avoiding double glints. In some examples, mirror 520 may be shaped and/or positioned to control the direction of reflected beam 526 so that it avoids the pupil of eye 512. Thus, in addition to alleviating the double-glint challenge, the mirror 520 may also improve the illumination efficiency of the eye tracking system.
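The geometry behind orienting mirror 520 can be sketched with the standard specular reflection formula; the helpers below are purely illustrative assumptions (in particular, the closest-approach test for missing the pupil), not the patented method:

import numpy as np

def reflect(direction, normal):
    # Specular reflection of a beam direction off a mirror with the given
    # surface normal: r = d - 2 * (d . n) * n.
    d = np.asarray(direction, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

def misses_pupil(origin, direction, pupil_center, pupil_radius):
    # True if the reflected ray passes farther than pupil_radius from the
    # pupil center (closest-approach distance test).
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    p = np.asarray(pupil_center, dtype=float)
    closest = o + np.dot(p - o, d) * d
    return np.linalg.norm(p - closest) > pupil_radius

Sweeping the mirror normal and keeping only orientations for which misses_pupil() returns true is one simple way to pick a mirror tilt that avoids double glints.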
Fig. 6A illustrates beam scattering by a correction lens in a lens periphery illumination eye tracking system according to an example. Schematic 600A shows a waveguide display 602 and a Light Emitting Diode (LED) 604 with LED beam expansion 606, which provide eye-tracking illumination to an eye 612 through glass 608. The glints caused by the illumination may be captured by the eye tracking camera 610 and used to determine the gaze of the eye 612. In the illustrated configuration, corrective optical lens 614 may be attached to or integrated with glass 608.
Corrective optical lens 614 may be used to aid a user with a vision condition such as myopia, astigmatism, and the like. However, the corrective optical lens 614 may also deviate the light beam from the Light Emitting Diode (LED) 604 from its original orientation. Thus, the presence of corrective optical lens 614 may impair and/or distort the illumination of the eye.
Fig. 6B illustrates a prismatic representation of a correction lens in a lens periphery illumination eye tracking system according to an example. Schematic 600B shows prism 624, which represents the optical function of corrective optical lens 614 with respect to a light beam from Light Emitting Diode (LED) 604 (e.g., LED light beam 626). LED beam 626 is refracted (e.g., into deflected LED beam 628) as it passes through prism 624 (i.e., corrective optical lens 614) and may deviate from eyebox 622, meaning that it may not illuminate eye 612. Accordingly, the corrective optical lens 614 may reduce the illumination efficiency of the eye tracking system and may cause other distortion problems due to the change in the optical path from the Light Emitting Diode (LED) 604.
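To get a feel for the magnitude of this deviation, the lens edge can be approximated as a thin prism, for which the deviation is roughly (n - 1) times the apex angle; the index, apex angle, and distance below are illustrative assumptions, not measured values:

import numpy as np

def thin_prism_deviation_deg(apex_angle_deg, n=1.5):
    # Thin-prism (small-angle) estimate of the angular deviation imparted
    # on the LED beam by the lens edge: delta ~= (n - 1) * A.
    return (n - 1.0) * apex_angle_deg

def beam_offset_at_eyebox_mm(deviation_deg, distance_mm):
    # Lateral displacement of the beam at the eyebox plane caused by the
    # deviation, for a given lens-to-eyebox distance.
    return distance_mm * np.tan(np.radians(deviation_deg))

For example, a 20-degree effective apex angle with n = 1.5 deviates the beam by about 10 degrees, which over a 20 mm lens-to-eyebox distance shifts the beam roughly 3.5 mm away from its nominal landing point.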
Fig. 6C shows a representative irradiance diagram of a lens periphery illumination eye tracking system, illustrating the loss of illumination due to the correction lens, which may result in double glints, according to an example. Schematic 600C shows an irradiance map 630 of an eye tracking system with lens-perimeter Light Emitting Diodes (LEDs), which includes substantially uniform illumination 634 of an eyebox 632, wherein an illumination scale 636 represents the illumination level.
With the addition of a high-power (prescription) corrective optical lens (e.g., -6 diopters), irradiance map 640 of the same eye tracking system shows darker areas within eyebox 642, indicating the lack of illumination caused by the corrective optical lens, i.e., the non-uniformity of illumination 644 of eyebox 642. Illumination scale 646 provides a color scale for the irradiance maps. Thus, as shown by the irradiance maps, a corrective optical lens can significantly reduce the performance of an eye tracking system having side-emitting illumination LEDs.
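The non-uniformity visible in irradiance map 640 could be quantified with a simple metric such as the ratio of minimum to mean irradiance over the eyebox; this metric and the array names are assumptions for illustration only, not the analysis used here:

import numpy as np

def eyebox_uniformity(irradiance, eyebox_mask):
    # Ratio of minimum to mean irradiance over the eyebox region; values
    # near 1 indicate uniform illumination, while small values indicate
    # dark regions such as those introduced by a strong corrective lens.
    vals = np.asarray(irradiance, dtype=float)[np.asarray(eyebox_mask, dtype=bool)]
    return float(vals.min() / vals.mean())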
In some examples, the horizontal light emission of a side-emitting Light Emitting Diode (LED) or a top-emitting Light Emitting Diode (LED) may provide sufficient illumination for flicker detection without the need for a complex in-field system. For example, if there is no correction optical lens or the correction optical lens power is low, side emission (horizontal) may be sufficient. In the case of a higher correction optic power, the light beam spread of the Light Emitting Diode (LED) may be tilted to avoid or pass through the edge of the correction optic. Some examples of implementations of Light Emitting Diodes (LEDs) for providing illumination are discussed below.
Fig. 7 illustrates various embodiments of a side-emitting Light Emitting Diode (LED) or top-emitting Light Emitting Diode (LED) for beam shaping in a lens peripheral illumination eye tracking system according to an example. Schematic 700 shows a side-emitting LED 700A with molded label, a side-emitting LED 700B with leadframe, a top-emitting LED 700C with refractive layer, a top-emitting LED 700D with diffractive layer, a custom designed LED chip 700E with tilted output, a side-emitting LED 700F with package internal reflector, and a side-emitting LED 700G with external reflector.
In some examples, side-emitting LED 700A may be attached to a frame including a waveguide display and other optical components by a Printed Circuit Board (PCB) 702. The LED package may include an LED chip 704 and a label 706 may be molded around the LED package to allow for side-emitting light beams 710. The side-emitting LED 700B may include an LED package 708 with a lead frame and may provide a side-emitting light beam 710.
The top-emitting LED 700C may include a refractive layer 712 capable of producing a tilted light beam 714. The refractive layer 712 may be, for example, a Bismuth Silicate (BSO) layer, and may be formed in any suitable shape. The top-emitting LED 700D may include a diffractive layer 716 capable of producing a tilted light beam 714. The diffractive layer 716 may also be a Bismuth Silicate (BSO) layer or another material. In addition, a microlens array or a microprism array may be used as the refractive layer. Alternatively, holographic gratings, metasurfaces, or patterned microstructured surfaces may be used as the diffractive layer. In some examples, custom-designed LED chip 700E may be designed at the chip level (rather than modified at the package level or externally) to provide the tilted light beam 714.
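For the diffractive-layer variants, the tilt angle can be estimated from the first-order grating equation; the wavelength and grating period below are hypothetical values chosen only to illustrate the calculation:

import numpy as np

def diffracted_tilt_deg(wavelength_nm, period_nm, order=1, incidence_deg=0.0):
    # Grating equation for a diffractive layer on a top-emitting LED:
    # sin(theta_m) = sin(theta_i) + m * lambda / period.
    s = np.sin(np.radians(incidence_deg)) + order * wavelength_nm / period_nm
    if abs(s) > 1.0:
        raise ValueError("diffraction order is evanescent for this period and wavelength")
    return float(np.degrees(np.arcsin(s)))

With these assumptions, an 850 nm near-infrared LED behind a 2000 nm-period grating would have its first order tilted by roughly 25 degrees from normal emission.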
In some examples, side-emitting LED 700F may be arranged with reflector 718 within the LED package to provide oblique light beam 720. Alternatively, the side-emitting LED 700G may provide a tilted light beam 724 by way of an external reflector 722 on the Printed Circuit Board (PCB) 702.
Fig. 8A-8E illustrate various embodiments of mitigating distortion of a Light Emitting Diode (LED) beam by a corrective optical lens in an eye tracking system according to an example. Schematic 800A in fig. 8A shows an illumination Light Emitting Diode (LED) 802 attached to a frame 804 that also provides mechanical support for a corrective optical lens 806 (and other optical assembly elements). The corrective optical lens 806 may have a user-specific power and is used to correct the image provided to the eye 810.
In some examples, an edge portion of corrective optical lens 806 covering the Light Emitting Diode (LED) 802 may be shaped (e.g., with a cut 805 on the surface distal from the LED) to direct light beam 801 from the Light Emitting Diode (LED) 802 toward eye 810, mitigating any distortion in the beam path that may otherwise be caused by corrective optical lens 806. The shaping (e.g., the angle of the cut 805 at the edge surface of the corrective optical lens) may be based on the power of the corrective optical lens 806. Diagram 800B shows a slightly different view of the configuration of diagram 800A, with waveguide display 808 added.
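The relationship between the cut angle and the resulting beam direction can be sketched with Snell's law at the exit surface; the refractive index and angles below are assumptions used only to show the form of the calculation, not values from this disclosure:

import numpy as np

def exit_angle_deg(incidence_deg, n_lens=1.5, n_air=1.0):
    # Snell's law at the shaped edge cut: n_lens * sin(theta_i) = n_air * sin(theta_t).
    # The difference theta_t - theta_i is the amount the in-plane LED beam is
    # bent toward the eye as it exits the lens material into air.
    s = n_lens * np.sin(np.radians(incidence_deg)) / n_air
    if abs(s) > 1.0:
        return None  # total internal reflection; the cut angle is too steep
    return float(np.degrees(np.arcsin(s)))

For instance, a beam meeting the cut at 20 degrees inside a lens with n = 1.5 exits at about 31 degrees, i.e., it is bent roughly 11 degrees toward the eye; the cut angle would be chosen per prescription to produce the desired redirection.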
Schematic 800C shows another configuration in which the edge of corrective optical lens 816 is cut into a different shape 815 near the surface facing the Light Emitting Diode (LED) 802. The edge cut shape 818 may also allow the light beam from the Light Emitting Diode (LED) 802 to pass first through air, then through the corrective optical lens, and finally through air again to reach the eye 810. Thus, the edge portion of the corrective optical lens 816 may be shaped (or cut) on either surface to achieve the same effect.
Schematic 800D shows another configuration in which the edge portion of corrective optical lens 826 is not specifically cut or shaped to redirect the beam from Light Emitting Diode (LED) 802. Instead, a directional turning film or molded Fresnel structure 822 may be placed between the Light Emitting Diode (LED) 802 and corrective optical lens 826 to redirect beam 801 toward eye 810. Schematic 800E shows a slightly different view of the same configuration as schematic 800D, including waveguide display 808. Schematic 800E also shows that the beam-redirecting portion may not necessarily be a separate component (or material) but may instead be a refractively shaped portion 824 of corrective optical lens 826.
According to an example, a method of manufacturing an eye tracking system having a lens perimeter Light Emitting Diode (LED) is described herein. Also described herein is a manufacturing system for an eye tracking system having a lens perimeter Light Emitting Diode (LED). The non-transitory computer readable storage medium may have stored thereon an executable file that, when executed, instructs a processor to perform the methods described herein.
In the foregoing description, various examples have been described, including apparatuses, systems, methods, and the like. For purposes of explanation, specific details are set forth in order to provide a thorough understanding of the examples of the present disclosure. It will be apparent, however, that various examples may be practiced without these specific details. For example, devices, systems, structures, components, methods, and other means may be shown as block diagram form in order to avoid obscuring the examples in unnecessary detail. In other instances, well-known devices, processes, systems, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the examples.
These drawings and descriptions are not intended to be limiting. The terms and expressions that have been employed in the present disclosure are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described or portions thereof. The term "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs.
Although the methods and systems as described herein may primarily relate to digital content, such as video or interactive media, it should be understood that the methods and systems as described herein may also be used with other types of content or scenes. Other applications or uses of the methods and systems as described herein may also include social networking, marketing, content-based recommendation engines, and/or other types of knowledge or data driven systems.

Claims (20)

1. An eye tracking system, the eye tracking system comprising:
a plurality of illuminators for producing glints on a surface of an eye, wherein the plurality of illuminators are positioned along a frame comprising an optical assembly for a near-eye display device and illumination of the plurality of illuminators is aligned with a plane of the optical assembly;
an eye tracking camera for acquiring an image of the eye having the glints; and
a processor communicatively coupled with the plurality of illuminators and the eye-tracking camera, the processor for determining three-dimensional features of the surface of the eye based on the acquired images of the eye with the glints to determine a position and gaze of the eye.
2. The eye tracking system of claim 1, wherein the plurality of illuminators comprise one or more of light emitting diodes (LEDs), side-emitting laser diodes, vertical cavity surface emitting laser diodes, or superluminescent light emitting diodes.
3. The eye tracking system according to claim 2, wherein the LEDs emit visible, infrared, or near-infrared light.
4. The eye tracking system according to claim 2, wherein the LEDs comprise side-emitting label package LEDs or lead frame LEDs.
5. The eye tracking system according to claim 2, wherein the LEDs comprise oblique-emitting LEDs comprising a refractive or diffractive layer to achieve oblique emission.
6. The eye tracking system according to claim 2, wherein the LEDs comprise oblique-emitting LEDs comprising an in-package reflector or an external reflector to achieve oblique emission.
7. The eye tracking system according to claim 1, wherein the plurality of illuminators are positioned such that the glints are within a field of view (FOV) of the eye tracking camera.
8. The eye tracking system according to claim 1, wherein the optical assembly comprises a corrective optical lens, and an edge portion of the corrective optical lens is shaped to direct illumination of the plurality of illuminators toward the eye.
9. The eye tracking system according to claim 1, wherein the plurality of illuminators comprises up to twenty illuminators.
10. A near-eye display device, the near-eye display device comprising:
an optical assembly comprising a waveguide display for projecting artificial reality content to an eye;
an eye tracking system, the eye tracking system comprising:
a plurality of illuminators for producing glints on a surface of the eye, wherein the plurality of illuminators are positioned along a frame comprising the optical assembly and illumination of the plurality of illuminators is aligned with a plane of the optical assembly;
an eye tracking camera for acquiring the glints on the surface of the eye; and
a processor communicatively coupled with the plurality of illuminators and the eye-tracking camera, the processor for determining three-dimensional characteristics of the surface of the eye based on the collected glints to determine a position and gaze of the eye.
11. The near-eye display device of claim 10, wherein the plurality of illuminators comprises light emitting diodes (LEDs) that emit visible, infrared, or near-infrared light.
12. The near-eye display device of claim 11, wherein the LED comprises a side-emitting label package LED or a lead frame LED.
13. The near-eye display device of claim 11, wherein the LED comprises:
an oblique-emitting LED comprising a refractive or diffractive layer to achieve oblique emission, or
an oblique-emitting LED comprising an in-package reflector or an external reflector to achieve oblique emission.
14. The near-eye display device of claim 10, wherein the plurality of illuminators are positioned such that the glints are within a field of view (FOV) of the eye tracking camera.
15. The near-eye display device of claim 10, wherein the optical assembly comprises a corrective optical lens and an edge portion of the corrective optical lens is shaped to direct illumination of the plurality of illuminators toward the eye.
16. The near-eye display device of claim 10, wherein the optical assembly comprises a corrective optical lens and an edge portion of the corrective optical lens is compensated with a refractive element to direct illumination of the plurality of illuminators toward the eye.
17. A method, the method comprising:
illuminating an eye by a plurality of illuminators positioned along a frame comprising an optical assembly for a near-eye display device, wherein illumination of the plurality of illuminators is aligned with a plane of the optical assembly;
acquiring, by an eye tracking camera, an image of the eye having glints resulting from the illumination;
determining, by a processor communicatively coupled to the plurality of illuminators and the eye tracking camera, a three-dimensional feature of a surface of the eye based on the acquired image of the eye with the glints; and
determining, based on the determined three-dimensional features, a position and a gaze of the eye.
18. The method of claim 17, wherein illuminating the eye with the plurality of illuminators comprises:
directing light to the surface of the eye through a refractive or diffractive layer of an oblique-emitting light emitting diode (LED).
19. The method of claim 17, wherein illuminating the eye with the plurality of illuminators comprises:
directing light to the surface of the eye via an in-package reflector or an external reflector of an oblique-emitting light emitting diode (LED).
20. The method of claim 17, the method further comprising:
positioning the plurality of illuminators such that the glints are within a field of view (FOV) of the eye-tracking camera.
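Purely as an illustration of the kind of processing recited in the method claims above (and not the implementation of this disclosure), a classic pupil-center/corneal-reflection (PCCR) pipeline detects the glints and the pupil in the camera image and maps the pupil-to-glint offset to a gaze point through a user-calibrated polynomial. The sketch below assumes a near-infrared frame supplied as a NumPy array and a previously fitted calibration matrix; the thresholds, shapes, and function names are hypothetical, and a production system would use a full 3-D corneal model rather than this 2-D mapping.

```python
import numpy as np
from scipy import ndimage

def bright_regions(img, thresh, min_px=3):
    """Return (centroid_xy, pixel_count) for each connected region brighter than `thresh`."""
    mask = img > thresh
    labels, n = ndimage.label(mask)
    regions = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)
        if xs.size >= min_px:
            regions.append(((xs.mean(), ys.mean()), xs.size))
    return regions

def gaze_from_frame(img, calib):
    """Map the pupil-to-glint (PCCR) vector of one frame to a display-space gaze point,
    using a previously fitted second-order polynomial `calib` of shape (2, 6)."""
    glints = bright_regions(img, thresh=200)            # specular LED reflections are bright spots
    dark = bright_regions(img.max() - img, thresh=170)  # pupil appears dark under off-axis NIR light
    if not glints or not dark:
        return None                                     # unusable frame (blink or occlusion)
    glint_xy = np.mean([c for c, _ in glints], axis=0)  # average position of the detected glints
    pupil_xy = max(dark, key=lambda r: r[1])[0]         # largest dark blob taken as the pupil
    dx, dy = np.asarray(pupil_xy) - glint_xy            # PCCR offset vector
    feats = np.array([1.0, dx, dy, dx * dy, dx * dx, dy * dy])
    return tuple(calib @ feats)                         # (x, y) gaze point on the display
```

In such a sketch, the calibration matrix would typically be fitted by least squares from a short routine in which the user fixates known on-display targets.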

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/407,587 2022-09-16
US18/342,277 US20240094809A1 (en) 2022-09-16 2023-06-27 Eye tracking system with in-plane illumination
US18/342,277 2023-06-27

Publications (1)

Publication Number Publication Date
CN117724240A 2024-03-19

Family

ID=90198549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311196635.9A Eye tracking system with in-plane illumination 2022-09-16 2023-09-15 Pending (CN117724240A)

Country Status (1)

Country Link
CN (1) CN117724240A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination