EP4232865A1 - Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry - Google Patents

Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry

Info

Publication number
EP4232865A1
EP4232865A1 (Application EP21810491.7A)
Authority
EP
European Patent Office
Prior art keywords
eye
tracking system
eye tracking
interferometer
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP21810491.7A
Other languages
German (de)
French (fr)
Inventor
Robin Sharma
Alexander FIX
Mohamed EL-HADDAD
Marc Aurèle GILLES
Guohua Wei
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Platforms Technologies LLC filed Critical Meta Platforms Technologies LLC
Publication of EP4232865A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking

Definitions

  • the present disclosure relates to eye tracking using photonic integrated circuits. More particularly, the present disclosure relates to systems and methods for eye tracking in a head-mounted device using low-coherence interferometry. Applications of such systems and methods include, for example and without limitation, intent inference, cognitive load estimation, and health monitoring.
  • glints: two-dimensional landmark point reflections
  • This approach generally requires different electronic components for illumination and detection, and it utilizes two-dimensional sensors to image the eye and the glints.
  • This approach also relies on image processing to find the coordinates of each glint relative to the eye.
  • Typical glint-based tracking methods and systems may provide inaccurate eye tracking results. These shortcomings originate from oversimplified eye models used in typical glint estimations, from the inherent ambiguity encountered when measuring overlapping glints, and from the measurements’ sensitivity to ambient light. As such, it is challenging to design a glint-based tracker that operates with high accuracy for a majority of the population and in varied environmental conditions.
  • an eye tracking system comprising: an interferometer; an emission section configured to direct a light beam from the interferometer to a user’s eye; a lens; and a bezel region adjacent to the lens; wherein the emission section is disposed adjacent to the lens or on the lens.
  • the interferometer includes a detector, a reference arm, a beam splitter, and a beam-shaping element.
  • the reference arm has a length selected from the group of lengths consisting of less than about 100 mm, less than about 80 mm, less than about 50 mm, less than about 25 mm, and less than about 10 mm.
  • the interferometer is configured to output the light beam from the beam splitter.
  • the eye tracking system further comprises a headset, the headset including the lens and the bezel region.
  • the interferometer includes a spatially coherent light source.
  • the interferometer includes a scanning mechanism configured for beam steering.
  • the eye tracking system further comprises another interferometer configured to emit another light beam.
  • respective optical axes of the light beam and of the other light beam are substantially parallel.
  • respective optical axes of the light beam and of the other light beam are angled relative to one another.
  • the eye tracking system further includes one of a sensor and a camera configured to provide complementary information.
  • when including the sensor, the sensor is configured to conduct axial ranging based on self-mixing interferometry.
  • when including the sensor, the sensor is configured to operate as a hybrid eye tracking system.
  • a method for tracking a user’s eye using an eye tracking system, the method comprising: performing at least one interferometric measurement over a predetermined area on the eye; performing at least one depth-profile measurement of the anterior segment of the eye, the retina, or a combination thereof; determining, based on a combination of the at least one interferometric measurement and the at least one depth-profile measurement, an estimate of a position and a direction of the eye by minimizing a difference between an observed depth and a depth computed from an eye model or by a machine learning algorithm; and filtering the measurements spatiotemporally to incorporate measurements taken at different times and under different dynamics of the eye.
  • the method further comprises obtaining phase information from the measurements.
  • the method further comprises computing high-resolution displacements and axial velocity based on the phase information.
  • the method further comprises running one or more computational filters to enhance a performance of the eye tracking system.
  • the method further comprises combining the measurements in a calibration protocol to generate a ground-truth eye model.
  • the eye tracking system further includes a light source, wherein the light source is selected from the group consisting of a vertical-cavity surface-emitting laser (VCSEL), a superluminescent light-emitting diode (SLED), an array of SLEDs, a tunable laser, and an array of tunable lasers; in which case, optionally, the light source, detectors, interferometers, and optical input/output couplers are integrated with photonic integrated circuits.
  • VCSEL: vertical-cavity surface-emitting laser
  • SLED: superluminescent light-emitting diode
  • the light source, detectors, interferometers and optical input/output couplers are integrated with photonic integrated circuits.
  • the embodiments featured herein help solve or mitigate the above-noted issues as well as other issues known in the art.
  • at least one of the embodiments featured herein provides low-coherence interferometry that allows three-dimensional sensing, yielding higher signal-to-noise ratios due to high interferometric gain.
  • low-coherence interferometry is able to resolve surfaces of the eye with less than about 10 µm of axial resolution.
  • the embodiments also provide measurements that are insensitive to ambient light. Moreover, three-dimensional information may be leveraged in a calibration protocol to generate subject-specific eye models that may greatly enhance the overall accuracy of the system.
  • the embodiments are also configured to enable high-speed tracking of saccades and micro-saccades, which may provide valuable signals and measurements for applications such as intent inference, cognitive load estimation, and health monitoring.
  • an embodiment provides an eye tracking system including an interferometer.
  • the system also includes an emission section configured to direct a light beam from the interferometer to a user’s eye and a lens.
  • a bezel region is adjacent to the lens, wherein the emission section is disposed adjacent to the lens or on the lens.
  • FIG. 1 illustrates a swept source-based interferometer according to various aspects of the present disclosure.
  • FIG. 2 illustrates an axial measurement scheme according to various aspects of the present disclosure.
  • FIG. 3 illustrates an assembly according to various aspects of the present disclosure.
  • FIG. 4 illustrates an eye tracking system according to various aspects of the present disclosure.
  • FIG. 5 illustrates a method according to various aspects of the present disclosure.
  • the embodiments featured herein may include interferometer-based systems or hybrid systems that rely on interferometry as well as other sensing modalities, such as image sensing.
  • the embodiments generally include photonic integrated circuits which may include integrated light sources.
  • an exemplary system may be a swept source-based system.
  • the exemplary system includes a swept source that can sweep its output wavelength at a certain rate (for example, at about 100 kHz) over a specified wavelength range.
  • the wavelength range may be from about 5 nm to about 100 nm.
  • the system can include one or more photodetectors.
  • the light source and one or more photodetectors can be integrated on a single chip.
  • the light from the source may be coupled into elements of the photonic circuit that guide the output light to a beam splitter, to reference and sample arms, and to the various photodetectors present in the system, thus forming a low-coherence interferometer.
  • multiple interferometers, such as the one described above, may be distributed on a glass substrate to form a spatial detection system.
  • Laser diode (LD) chips and photodiode (PD) chips are placed in the bezel region (embedded in the frame around the substrate) or near the edge where see-through disturbance is the least.
  • a single light source can be shared by the multiple interferometers by using photonic waveguide splitters or optical switches.
  • FIG. 1 illustrates an example implementation of an interferometer 100 on a chip 105, according to an embodiment.
  • the beam 101 shows the direction from a source 103, and the beam 102 is the reflection from the sample 108 or the user eye in this case.
  • Light from the source is collimated and then split by a beam splitter 106 into reference and sample arms.
  • the reference arm is directed up to mirror 110 and reflected back to the beam splitter.
  • the reflected light from the sample 108 and the reference is recombined at the splitter and an interference pattern is detected by the detector 107.
  • FIG. 2 illustrates an example axial measurement (A-scan) scheme 200 where a chip like the interferometer 100 shown in FIG. 1 is embedded into the viewing optics of a headset 201.
  • the representative A-scan illustrates characteristic peaks corresponding to the primary surfaces of interest in the eye, thereby providing a depth profile of the eye, where each peak corresponds to a specific depth of the eye.
  • FIG. 3 illustrates a panel 300 of different examples of possible arrangements of interferometer chips 100 on the lens of the headset, the lens bezel, or a combination thereof.
  • the emitted beams may be parallel or converging (top row), the integrated circuits may be entirely embedded in the lens, entirely placed in the bezel/frame, or split between both regions and coupled via waveguides, as shown in the bottom row.
  • FIG. 4 illustrates a swept source-based system 400 based on the interferometer 100 as illustrated in FIG. 1.
  • the system 400 is integrated with a headset 201 and may include a laser diode and a photodiode which can be monolithically integrated into a single chip.
  • the interferometer-based eye tracking system 400 can be realized using waveguide beam splitters using directional couplers.
  • the input/output (optical I/O) may be on a single channel, and the system 400 may include a beam steering device to provide high spatial resolution.
  • FIG. 5 illustrates a method 500 according to an embodiment.
  • the method 500 is an exemplary process via which a system such as the system 400 is used to perform gaze-based interaction in a virtual or augmented reality headset.
  • the method 500 begins at step 502.
  • a user wears a virtual or augmented reality headset including a frame that comprises an eye-tracking system such as the exemplary systems previously described.
  • the method 500 can include providing, by the system, signals that reflect the axial profiles of reflectivity from multiple measurement points.
  • the method 500 includes analyzing the provided signals via a postprocessing algorithm.
  • processing may include classifying one or more of the signals as being associated with an ocular structure like the cornea, the sclera, the iris, the lens, or the retina.
  • processing may also include classifying one or more of the signals as being associated with the skin or the eye lashes.
  • the method 500 can include fitting the classified signals to a prespecified reference frame or eye model.
  • the method 500 can include computing a gaze angle and/or pupil center from the fitted signals and the pre-specified reference frame or eye model.
  • the method 500 may further include inferring points of interest on the display of the headset based on the computed parameters such as gaze angle and/or pupil center.
  • the method 500 may also include making decisions based on dwell time or a number of blinks. The method may end at step 516.
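The model-fitting and gaze-computation steps above can be sketched, under strong simplifying assumptions, as a least-squares sphere fit: points classified as corneal reflections are fitted to a spherical corneal model, and the gaze direction is taken as the ray from an eye rotation centre to the fitted corneal centre. The spherical model, the sampling geometry, and the rotation-centre position are all hypothetical values for illustration, not parameters from the disclosure.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit using ||p||^2 = 2 p.c + (r^2 - ||c||^2)."""
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Hypothetical beam/surface intersection points sampled on a corneal
# sphere of radius 7.8 mm centred at (0, 0, 12) mm.
rng = np.random.default_rng(1)
true_center = np.array([0.0, 0.0, 12.0])
dirs = rng.normal(size=(50, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
points = true_center + 7.8 * dirs

center, radius = fit_sphere(points)

# With an assumed eye rotation centre, the gaze direction is the unit
# vector from the rotation centre to the fitted corneal centre.
eye_rotation_center = np.array([0.0, 0.0, 17.0])  # assumed value, mm
gaze = center - eye_rotation_center
gaze /= np.linalg.norm(gaze)
```

A real implementation would use a subject-specific eye model from the calibration protocol described here rather than a plain sphere.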
  • One or more of the embodiments described herein may be configured to perform dynamic bandwidth optical coherence tomography (OCT) for either low/high resolution or for long/short ranging application.
  • OCT: optical coherence tomography
  • a wavelength-swept source may be used to dynamically control the bandwidth over which the source is being swept. Sweeping over a narrow bandwidth yields low axial resolution, whereas sweeping over a wide bandwidth yields high axial resolution.
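The bandwidth/resolution trade-off stated above follows the standard relation for a Gaussian-spectrum low-coherence source, δz = (2 ln 2 / π) · λ₀² / Δλ. A quick numerical check over the 5–100 nm sweep range mentioned earlier (the 850 nm centre wavelength is an assumed value):

```python
import numpy as np

def axial_resolution(center_wavelength_m, bandwidth_m):
    """FWHM axial resolution of a Gaussian-spectrum low-coherence source:
    dz = (2 ln 2 / pi) * lambda0^2 / dlambda."""
    return (2 * np.log(2) / np.pi) * center_wavelength_m ** 2 / bandwidth_m

lam0 = 850e-9  # assumed centre wavelength (m)
dz_narrow = axial_resolution(lam0, 5e-9)    # ~64 µm: coarse, long-range mode
dz_wide = axial_resolution(lam0, 100e-9)    # ~3.2 µm: fine-resolution mode
```

The two ends of the sweep range thus differ in axial resolution by a factor of twenty, which is the lever the dynamic-bandwidth scheme exploits.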
  • embodiments of the present disclosure may be configured as a hybrid system.
  • the hybrid system may combine a long-range, lower-resolution path-length sensor, using self-mixing interferometry (SMI) or optical or acoustic time-of-flight, with low-coherence interferometry for high resolution.
  • a hybrid system may be configured to function as a sensor capable of measuring an arbitrary path length.
  • the hybrid system may be configured to dynamically and optimally control the reference arm position.
  • an OCT sensor according to an embodiment may image a 10 mm range located 12 mm away from a lens. If the frame slips or the user has an unusual eye-relief, this distance may be insufficient.
  • the exemplary system may be configured to shift the starting point of the OCT sensor’s 10 mm range by adjusting the reference arm length.
  • the hybrid system may be configured for speckle tracking for motion and velocity estimation based on separate coherence sources. This can be achieved by selecting a narrowband from the spectrum by using a high coherence source (spatial and temporal) to track motion in conjunction with path-length measurement from the OCT sensor.
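Speckle tracking of this kind can be sketched as estimating the shift between two successive speckle traces from the peak of their cross-correlation; the shift, scaled by the sample pitch and the frame interval, gives a velocity estimate. The synthetic trace and shift below are illustrative, not from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic speckle intensity trace and a copy shifted by 7 samples,
# standing in for two successive acquisitions of a moving surface.
speckle = rng.random(512)
shift_true = 7
moved = np.roll(speckle, shift_true)

# Estimate the shift from the peak of the circular cross-correlation,
# computed via FFT. Wrap indices above N/2 back to negative shifts.
corr = np.fft.ifft(np.fft.fft(moved) * np.conj(np.fft.fft(speckle))).real
shift_est = int(np.argmax(corr))
if shift_est > len(speckle) // 2:
    shift_est -= len(speckle)
```

In practice the traces would be windowed and the correlation peak interpolated for sub-sample motion, but the principle is as above.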
  • the coherent source may also be “synthesized” by picking a very narrow band from the OCT source.
  • the system may be configured as an angle-sensitive detector for surface normals in combination with OCT for path length. If angle-sensitive detectors and OCT are detecting from the same point, the exemplary system can yield information about surface normals in addition to the axial information, which will enhance the tracking/modeling accuracy.
  • the exemplary system may be configured for polarization-based sensing.
  • the system may be configured for detecting orthogonal polarizations on separate channels.
  • the system can yield information about birefringence which may enhance the contrast in the signal, thus rendering segmentation and/or computation easier.
  • the system may be configured to enable sensing of local curvature by measuring the ratio of the coupled power from each polarization. For example, at Brewster’s angle, only one polarization reflects and the other transmits.
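The polarization-ratio idea can be checked against the Fresnel equations: at Brewster’s angle the p-polarized reflectance vanishes while the s-polarized reflectance does not, so the ratio of coupled power in the two polarizations encodes the local incidence angle. A sketch (the corneal refractive index of 1.376 is an assumed textbook value, not from the disclosure):

```python
import numpy as np

def fresnel_reflectance(theta_i_rad, n1=1.0, n2=1.376):
    """Power reflectance for s and p polarisation at a dielectric
    interface; n2 ~ corneal refractive index (assumed value)."""
    theta_t = np.arcsin(n1 * np.sin(theta_i_rad) / n2)  # Snell's law
    rs = (n1 * np.cos(theta_i_rad) - n2 * np.cos(theta_t)) / \
         (n1 * np.cos(theta_i_rad) + n2 * np.cos(theta_t))
    rp = (n2 * np.cos(theta_i_rad) - n1 * np.cos(theta_t)) / \
         (n2 * np.cos(theta_i_rad) + n1 * np.cos(theta_t))
    return rs ** 2, rp ** 2

# At Brewster's angle (~54 degrees for an air/cornea-like interface),
# p-polarised light is fully transmitted and only s reflects.
brewster = np.arctan(1.376 / 1.0)
Rs, Rp = fresnel_reflectance(brewster)
```

Measuring Rs/Rp at each probe point therefore gives a handle on the local surface tilt, complementing the axial OCT measurement.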
  • exemplary systems may be configured for scanning SMI in conjunction with OCT or scanning OCT utilizing a micro-electro-mechanical systems (MEMS) scanner.
  • MEMS: micro-electro-mechanical systems
  • the exemplary system is configured to enable the formation of a 3D image.
  • the system may include an additional sensor, which does not have to be in-field or even pointing at the eye. This additional sensor could sit in the arms of the glasses, pointed at the temple or the cheekbone, sensing a relative distance, and it can serve as a proxy for detecting slippage or vibrations.
  • the embodiments provided herein may include an eye tracking system.
  • the system may include an interferometer, an emission section configured to direct a light beam from the interferometer to a user’s eye.
  • the system further includes a lens and a bezel region adjacent to the lens.
  • the emission section is disposed adjacent to the lens or on the lens.
  • the interferometer further includes a detector, a reference arm, a beam splitter, and a beam-shaping element.
  • the reference arm can have a length of less than about 100 mm, of less than about 80 mm, of less than about 50 mm, of less than about 25 mm, or of less than about 10 mm.
  • the interferometer can be configured to output the light beam from the beam splitter.
  • the eye tracking system can include a headset where the headset includes the lens and the bezel region.
  • the eye tracking system can further include a spatially coherent light source.
  • the eye tracking system can include a scanning mechanism configured for beam steering, as well as another interferometer configured to emit another light beam; the respective optical axes of the two light beams can be substantially parallel. In an alternate implementation, the respective optical axes of the light beams can be angled relative to one another.
  • the eye tracking system can further include a sensor and a camera configured to provide complementary information. The sensor can further be configured to conduct axial ranging based on self-mixing interferometry.
  • the light source can be integrated with a photonic integrated circuit.
  • the photonic integrated circuit can include a waveguide and a coupler configured to couple the photonic integrated circuit with the light source.
  • the coupler can be a grating coupler or a micro-optical element.
  • the interferometer can include a planar photonic circuit and a photonic beam splitter.
  • the photonic beam splitter can be a directional coupler.
  • the emission section can include an output coupler for directing the light beam to the user’s eye.
  • the output coupler can include a beam-shaping element.
  • the beam-shaping element can be a meta lens or a diffractive optical element.
  • the beam-shaping element includes a meta lens and a diffractive optical element.
  • the reference arm can include a waveguide and a directional coupler including a detection port.
  • the directional coupler can be a 2 x 2 coupler with a 50:50 splitting ratio, and the detector can include photodetectors coupled to the directional coupler.
  • the photodetectors can include silicon avalanche photodetectors and they may be integrated with a light source on the same chip.
  • the circuit can include a transparent substrate.
  • the substrate may also be glass.
  • the system can include an electric controller circuit and a computational unit configured to drive the light source and the photodetectors.
  • a method for tracking a user’s eye using the above described eye tracking system can include performing at least one interferometric measurement over a predetermined area on the eye and performing at least one depth-profile measurement of the anterior segment of the eye, retina, or a combination thereof.
  • the method can further include determining, based on a combination of the at least one interferometric measurement and the at least one depth-profile measurement, an estimate of a position and a direction of the eye by minimizing a difference between an observed depth and a depth computed from an eye model or by a machine learning algorithm.
  • the method can further include filtering the measurements spatiotemporally to incorporate measurements taken at different times and under different dynamics of the eye.
  • the method can further include obtaining phase information from the measurements and computing high-resolution displacements and axial velocity based on the phase information.
  • the method can further include running one or more computational filters to enhance a performance of the eye tracking system.
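One example of such a computational filter is a scalar Kalman filter smoothing a noisy gaze-angle trace; the sketch below uses assumed process and measurement noise variances and a synthetic constant-fixation trace, purely for illustration.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.25):
    """Scalar Kalman filter with a random-walk state model smoothing a
    noisy gaze-angle trace; q and r are assumed process/measurement
    noise variances, not values from the disclosure."""
    x, p = measurements[0], 1.0
    out = []
    for z in measurements:
        p += q                  # predict: state variance grows by q
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # update state toward the measurement
        p *= (1 - k)            # update variance
        out.append(x)
    return np.array(out)

# Synthetic trace: steady fixation at 5 degrees with 0.5-degree noise.
rng = np.random.default_rng(3)
true_angle = 5.0
noisy = true_angle + 0.5 * rng.standard_normal(400)
smoothed = kalman_1d(noisy)
```

A production filter would add a velocity state (or switch models during saccades) so that smoothing does not lag rapid eye movements.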
  • the method can further include combining the measurements in a calibration protocol to generate a ground-truth eye model.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instruments For Measurement Of Length By Optical Means (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

There is provided an eye tracking system including an interferometer. The system also includes an emission section configured to direct a light beam from the interferometer to a user's eye and a lens. A bezel region is adjacent to the lens, and the emission section is disposed adjacent to the lens or on the lens.

Description

SYSTEMS AND METHODS FOR EYE TRACKING IN A HEAD-MOUNTED DEVICE USING LOW-COHERENCE INTERFEROMETRY CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present disclosure claims the benefit of U.S. Provisional Patent Application No. 63/105,867, filed on October 26, 2020. The disclosure of this prior application is incorporated herein in its entirety by reference.
TECHNICAL FIELD
[0002] The present disclosure relates to eye tracking using photonic integrated circuits. More particularly, the present disclosure relates to systems and methods for eye tracking in a head-mounted device using low-coherence interferometry. Applications of such systems and methods include, for example and without limitation, intent inference, cognitive load estimation, and health monitoring.
BACKGROUND
[0003] Current head-mounted eye tracking sensors mainly rely on two-dimensional landmark point reflections called glints. The shape and distribution of these glints depend on the illumination configuration and the geometry of a subject’s eye. This approach generally requires different electronic components for illumination and detection, and it utilizes two-dimensional sensors to image the eye and the glints. This approach also relies on image processing to find the coordinates of each glint relative to the eye.
[0004] Typical glint-based tracking methods and systems may provide inaccurate eye tracking results. These shortcomings originate from oversimplified eye models used in typical glint estimations, from the inherent ambiguity encountered when measuring overlapping glints, and from the measurements’ sensitivity to ambient light. As such, it is challenging to design a glint-based tracker that operates with high accuracy for a majority of the population and in varied environmental conditions.
SUMMARY
[0005] According to the present invention there is provided an eye tracking system comprising: an interferometer; an emission section configured to direct a light beam from the interferometer to a user’s eye; a lens; and a bezel region adjacent to the lens; wherein the emission section is disposed adjacent to the lens or on the lens.
[0006] Preferably, the interferometer includes a detector, a reference arm, a beam splitter, and a beam-shaping element.
[0007] Preferably, the reference arm has a length selected from the group of lengths consisting of less than about 100 mm, less than about 80 mm, less than about 50 mm, less than about 25 mm, and less than about 10 mm.
[0008] Preferably, the interferometer is configured to output the light beam from the beam splitter.
Preferably, the eye tracking system further comprises a headset, the headset including the lens and the bezel region.
[0009] Preferably, the interferometer includes a spatially coherent light source.
[0010] Preferably, the interferometer includes a scanning mechanism configured for beam steering.
[0011] Preferably, the eye tracking system further comprises another interferometer configured to emit another light beam.
[0012] Preferably, respective optical axes of the light beam and of the other light beam are substantially parallel.
[0013] Preferably, respective optical axes of the light beam and of the other light beam are angled relative to one another.
[0014] Preferably, the eye tracking system further includes one of a sensor and a camera configured to provide complementary information.
[0015] Preferably, when including the sensor, the sensor is configured to conduct axial ranging based on self-mixing interferometry.
[0016] Preferably, when including the sensor, the sensor is configured to operate as a hybrid eye tracking system.
According to a further aspect of the present invention there is provided a method for tracking a user’s eye using an eye tracking system, the method comprising: performing at least one interferometric measurement over a predetermined area on the eye; performing at least one depth-profile measurement of the anterior segment of the eye, the retina, or a combination thereof; determining, based on a combination of the at least one interferometric measurement and the at least one depth-profile measurement, an estimate of a position and a direction of the eye by minimizing a difference between an observed depth and a depth computed from an eye model or by a machine learning algorithm; and filtering the measurements spatiotemporally to incorporate measurements taken at different times and under different dynamics of the eye.
[0017] Preferably, the method further comprises obtaining phase information from the measurements.
[0018] Preferably, the method further comprises computing high-resolution displacements and axial velocity based on the phase information.
[0019] Preferably, the method further comprises running one or more computational filters to enhance a performance of the eye tracking system.
[0020] Preferably, the method further comprises combining the measurements in a calibration protocol to generate a ground-truth eye model.
[0021] Preferably, the eye tracking system further includes a light source, wherein the light source is selected from the group consisting of a vertical-cavity surface-emitting laser (VCSEL), a superluminescent light-emitting diode (SLED), an array of SLEDs, a tunable laser, and an array of tunable lasers; in which case, optionally, the light source, detectors, interferometers, and optical input/output couplers are integrated with photonic integrated circuits.
[0022] The embodiments featured herein help solve or mitigate the above-noted issues as well as other issues known in the art. For example, at least one of the embodiments featured herein provides low-coherence interferometry that allows three-dimensional sensing, yielding higher signal-to-noise ratios due to high interferometric gain. Further, in some embodiments, low-coherence interferometry is able to resolve surfaces of the eye with less than about 10 µm of axial resolution.
[0023] The embodiments also provide measurements that are insensitive to ambient light. Moreover, three-dimensional information may be leveraged in a calibration protocol to generate subject-specific eye models that may greatly enhance the overall accuracy of the system. The embodiments are also configured to enable high-speed tracking of saccades and micro-saccades, which may provide valuable signals and measurements for applications such as intent inference, cognitive load estimation, and health monitoring.
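Saccades in a high-rate gaze trace are commonly flagged with a velocity-threshold (I-VT style) detector, independent of the particular sensing modality; a sketch on a synthetic trace, with all parameters (sampling rate, saccade shape, threshold) illustrative rather than from the disclosure:

```python
import numpy as np

# Synthetic gaze-angle trace sampled at 1 kHz: fixation at 0 deg, a
# rapid 10-degree saccade around t = 0.2 s, then fixation at 10 deg.
fs = 1000.0
t = np.arange(0.0, 0.4, 1 / fs)
angle = 10.0 / (1 + np.exp(-(t - 0.2) * 400))  # sigmoid "saccade"

# I-VT style detection: samples whose angular velocity exceeds a
# threshold (100 deg/s is a commonly used value) are marked as saccadic.
velocity = np.gradient(angle, 1 / fs)          # deg/s
saccade_mask = np.abs(velocity) > 100.0
saccade_samples = np.flatnonzero(saccade_mask)
```

Micro-saccade detection typically uses adaptive (noise-relative) thresholds rather than the fixed one shown here.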
[0024] Under certain circumstances, an embodiment provides an eye tracking system including an interferometer. The system also includes an emission section configured to direct a light beam from the interferometer to a user’s eye and a lens. A bezel region is adjacent to the lens, wherein the emission section is disposed adjacent to the lens or on the lens.
[0025] Further features and advantages of the disclosure, as well as the structure and operation of various embodiments, are described in detail below with reference to the accompanying drawings. It is noted that the disclosure is not limited to the specific embodiments described herein. Such embodiments are presented herein for illustrative purposes only. Additional embodiments will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0026] Together with the following detailed descriptions, the accompanying drawings illustrate a number of exemplary embodiments in addition to describing and demonstrating various aspects and/or principles set forth in the present disclosure. The accompanying drawings and the brief descriptions are provided to enable one of ordinary skill in the art to practice the various aspects and/or principles set forth in the present disclosure.
[0027] FIG. 1 illustrates a swept source-based interferometer according to various aspects of the present disclosure.
[0028] FIG. 2 illustrates an axial measurement scheme according to various aspects of the present disclosure.
[0029] FIG. 3 illustrates an assembly according to various aspects of the present disclosure.
[0030] FIG. 4 illustrates an eye tracking system according to various aspects of the present disclosure.
[0031] FIG. 5 illustrates a method according to various aspects of the present disclosure.
DETAILED DESCRIPTION
[0032] Embodiments will be described below in more detail with reference to the accompanying drawings. The following detailed descriptions are provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein as well as modifications thereof. Accordingly, various modifications and equivalents of the methods, apparatuses, and/or systems described herein will be apparent to those of ordinary skill in the art. Descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.
[0033] The embodiments featured herein may include interferometer-based systems or hybrid systems that rely on interferometry as well as other sensing modalities, such as image sensing. The embodiments generally include photonic integrated circuits which may include integrated light sources. In one non-limiting implementation, an exemplary system may be a swept source-based system.
[0034] Specifically, the exemplary system includes a swept source that can sweep its output wavelength at a certain rate (for example, at about 100 kHz) over a specified wavelength range. For example, and not by limitation, the wavelength range may be from about 5 nm to about 100 nm. The system can include one or more photodetectors.
[0035] The light source and one or more photodetectors can be integrated on a single chip. The light from the source may be coupled into elements of the photonic circuit that guide the output light to a beam splitter, to reference and sample arms, and to the various photodetectors present in the system, thus forming a low-coherence interferometer.
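Such a swept-source low-coherence interferometer recovers depth by Fourier-transforming the detected interference fringes across the wavelength sweep: each reflector at a given optical path difference contributes a fringe frequency proportional to its depth. A minimal numerical sketch, with all parameters (centre wavelength, bandwidth, reflector depths) illustrative rather than taken from the disclosure:

```python
import numpy as np

# Illustrative swept-source parameters (not taken from the disclosure).
lam0 = 850e-9            # centre wavelength (m)
dlam = 50e-9             # sweep bandwidth (m)
n_samples = 2048         # fringe samples per sweep

# Wavenumber axis, assumed sampled linearly in k for FFT processing.
k = np.linspace(2 * np.pi / (lam0 + dlam / 2),
                2 * np.pi / (lam0 - dlam / 2), n_samples)
dk = k[1] - k[0]

# Two reflectors at depths of 1.0 mm and 1.3 mm relative to the
# reference arm (e.g. two surfaces of the eye).
depths = [1.0e-3, 1.3e-3]
amps = [1.0, 0.5]
fringes = sum(a * np.cos(2 * k * z) for a, z in zip(amps, depths))

# A-scan: Fourier transform of the windowed fringes along k; each
# reflector appears as a peak at its depth.
a_scan = np.abs(np.fft.rfft(fringes * np.hanning(n_samples)))
z_axis = np.fft.rfftfreq(n_samples, d=dk) * np.pi   # depth axis (m)
z_peak = z_axis[np.argmax(a_scan[1:]) + 1]          # skip the DC bin
```

The strongest A-scan peak lands at the depth of the brighter reflector, which is the mechanism behind the depth profiles discussed below.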
[0036] In an alternate exemplary implementation, multiple interferometers, such as the one described above, may be distributed on a glass substrate to form a spatial detection system. Laser diode (LD) chips and photodiode (PD) chips are placed in the bezel region (embedded in the frame around the substrate) or near the edge, where see-through disturbance is the least. In yet another alternate exemplary implementation, a single light source can be shared by the multiple interferometers by using photonic waveguide splitters or optical switches.
[0037] FIG. 1 illustrates an example implementation of an interferometer 100 on a chip 105, according to an embodiment. Beam 101 travels from a source 103, and beam 102 is the reflection from the sample 108, the user's eye in this case. Light from the source is collimated and then split by a beam splitter 106 into reference and sample arms. The reference arm is directed up to mirror 110 and reflected back to the beam splitter. The reflected light from the sample 108 and the reference arm is recombined at the splitter, and an interference pattern is detected by the detector 107.
[0038] FIG. 2 illustrates an example axial measurement (A-scan) scheme 200 where a chip like the interferometer 100 shown in FIG. 1 is embedded into the viewing optics of a headset 201. The representative A-scan illustrates characteristic peaks corresponding to the primary surfaces of interest in the eye, thereby providing a depth profile of the eye, where each peak corresponds to a specific depth of the eye.
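By way of non-limiting illustration, the A-scan scheme described above can be sketched numerically: with a source swept uniformly in wavenumber, each reflecting surface contributes a fringe whose frequency encodes its depth, and a Fourier transform over wavenumber recovers the depth profile. All numerical values below (an 850 nm centre wavelength, a 50 nm sweep, and the reflector depths) are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

# Assumed swept-source parameters: ~850 nm centre, ~50 nm sweep,
# sampled uniformly in wavenumber k.
k = np.linspace(2 * np.pi / 875e-9, 2 * np.pi / 825e-9, 2048)

# Hypothetical reflectors along one A-scan: path-length differences (m)
# and relative reflectivities standing in for ocular surfaces.
depths = np.array([1.0e-3, 1.5e-3, 4.0e-3])
refl = np.array([1.0, 0.4, 0.6])

# Interference fringe: one cosine per reflector (DC terms omitted).
fringe = sum(r * np.cos(2 * k * z) for r, z in zip(refl, depths))

# Fourier transform over wavenumber yields the depth profile (A-scan).
a_scan = np.abs(np.fft.rfft(fringe * np.hanning(k.size)))
dz = np.pi / (k[-1] - k[0])  # depth corresponding to one FFT bin

# Each local maximum above a threshold is one surface peak.
interior = a_scan[1:-1]
is_peak = ((interior > a_scan[:-2]) & (interior > a_scan[2:])
           & (interior > 0.2 * a_scan.max()))
peak_depths_mm = sorted((np.where(is_peak)[0] + 1) * dz * 1e3)
print(peak_depths_mm)  # ≈ [1.0, 1.5, 4.0] mm
```

Each detected peak then corresponds to a specific depth of the eye, as in the representative A-scan of FIG. 2.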
[0039] FIG. 3 illustrates a panel 300 of different examples of possible arrangements of interferometer chips 100 on the lens of the headset, the lens bezel, or a combination thereof. The emitted beams may be parallel or converging (top row); the integrated circuits may be entirely embedded in the lens, entirely placed in the bezel/frame, or split between both regions and coupled via waveguides, as shown in the bottom row.
[0040] FIG. 4 illustrates a swept source-based system 400 based on the interferometer 100 as illustrated in FIG. 1. The system 400 is integrated with a headset 201 and may include a laser diode and a photodiode which can be monolithically integrated into a single chip. The interferometer-based eye tracking system 400 can be realized with waveguide beam splitters using directional couplers. The optical input/output (optical I/O) may be on a single channel, and the system 400 may include a beam steering device to provide high spatial resolution.
[0041] FIG. 5 illustrates a method 500 according to an embodiment. The method 500 is an exemplary process via which a system such as the system 400 is used to perform gaze-based interaction in a virtual or augmented reality headset. The method 500 begins at step 502. At step 504, a user wears a virtual or augmented reality headset including a frame that comprises an eye-tracking system such as the exemplary systems previously described. At step 506, the method 500 can include providing, by the system, signals that reflect the axial profiles of reflectivity from multiple measurement points.
[0042] At step 508, the method 500 includes analyzing the provided signals via a post-processing algorithm. Such processing may include classifying one or more of the signals as being associated with an ocular structure such as the cornea, the sclera, the iris, the lens, or the retina. Such processing may also include classifying one or more of the signals as being associated with the skin or the eyelashes.
[0043] At step 510, the method 500 can include fitting the classified signals to a prespecified reference frame or eye model. At step 512, the method 500 can include computing a gaze angle and/or pupil center from the fitted signals and the pre-specified reference frame or eye model. At step 514, the method 500 may further include inferring points of interest on the display of the headset based on the computed parameters such as gaze angle and/or pupil center. The method 500 may also include making decisions based on dwell time or a number of blinks. The method may end at step 516.
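Steps 508 through 512 can be illustrated with a toy numerical fit: surface points that the classifier has labeled as corneal are fitted to a sphere (a minimal eye model), and the gaze direction is taken as the unit vector from an assumed eye rotation centre to the fitted centre of curvature. All geometry values (rotation centre, corneal radius, noise level, sampling over the full sphere rather than only the corneal cap) are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed geometry: eye rotation centre at the origin, corneal centre of
# curvature 5.6 mm in front of it, eye gazing 10 degrees right (x-z plane).
gaze_true = np.array([np.sin(np.radians(10)), 0.0, np.cos(np.radians(10))])
c_true = 5.6 * gaze_true          # corneal centre of curvature (mm)
r_true = 7.8                      # corneal radius of curvature (mm)

# Stand-in for classified corneal surface points (steps 506/508),
# with a little measurement noise.
dirs = rng.normal(size=(40, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = c_true + r_true * dirs + rng.normal(scale=0.01, size=(40, 3))

# Step 510: fit a sphere by linear least squares, using
# |p|^2 = 2 p.c + (r^2 - |c|^2), which is linear in c and the constant.
A = np.hstack([2 * pts, np.ones((40, 1))])
b = (pts ** 2).sum(axis=1)
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
c_fit = sol[:3]

# Step 512: gaze direction from the (assumed known) rotation centre
# to the fitted corneal centre.
gaze_fit = c_fit / np.linalg.norm(c_fit)
angle_err = np.degrees(np.arccos(np.clip(gaze_fit @ gaze_true, -1, 1)))
print(f"gaze error: {angle_err:.3f} deg")
```

In a real system the rotation centre would itself be estimated during calibration rather than assumed known.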
[0044] One or more of the embodiments described herein may be configured to perform dynamic bandwidth optical coherence tomography (OCT) for either low/high resolution or for long/short ranging applications. In such an exemplary configuration, a wavelength-swept source may be used to dynamically control the bandwidth over which the source is being swept. Sweeping over a short bandwidth yields low axial resolution, whereas sweeping over a wide bandwidth yields high axial resolution.
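The resolution/bandwidth trade-off above follows from the standard coherence-length relation for a Gaussian spectrum, where the axial resolution is δz = (2 ln 2/π)·λ₀²/Δλ. The sketch below applies this to the 5 nm and 100 nm sweep extremes from paragraph [0034]; the 850 nm centre wavelength is an assumption, not taken from the disclosure.

```python
import math

def axial_resolution_um(center_nm: float, bandwidth_nm: float) -> float:
    """FWHM axial resolution for a Gaussian spectrum:
    (2 ln 2 / pi) * lambda^2 / delta_lambda, returned in micrometres."""
    return (2 * math.log(2) / math.pi) * center_nm ** 2 / bandwidth_nm * 1e-3

# Sweep extremes from paragraph [0034], assumed 850 nm centre wavelength.
for bw in (5, 100):
    print(f"{bw:>3} nm sweep -> {axial_resolution_um(850, bw):.1f} um resolution")
```

A 5 nm sweep thus yields on the order of 64 µm resolution, while the full 100 nm sweep yields roughly 3 µm, consistent with the dynamic-bandwidth behaviour described above.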
[0045] Furthermore, embodiments of the present disclosure may be configured as a hybrid system. The hybrid system may combine a long-range, lower-resolution path-length sensor using self-mixing interferometry (SMI) or optical or acoustic time-of-flight, with low-coherence interferometry for high resolution. Generally, such a hybrid system may be configured to function as a sensor capable of measuring arbitrary path lengths.
[0046] In such embodiments, the hybrid system may be configured to optimally control the reference arm position dynamically. For example, and not by limitation, an OCT sensor according to an embodiment may be imaging a 10 mm range, 12 mm away from a lens. If the frame slips or the user has an unusual eye relief, this range may be insufficient. With the low-resolution, long-range option of the sensor, the exemplary system may be configured to shift the starting point of the OCT sensor's 10 mm range by adjusting the reference arm length.
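The reference-arm repositioning of paragraph [0046] can be sketched as a simple control rule: the coarse SMI/time-of-flight distance estimate decides whether the high-resolution OCT window needs to move. The 10 mm window and 12 mm nominal start are taken from the example above; the 2 mm guard margin and the re-centring policy are assumptions for illustration.

```python
def adjust_window_start(start_mm: float, coarse_mm: float,
                        window_mm: float = 10.0, margin_mm: float = 2.0) -> float:
    """Shift the OCT ranging window (by changing the reference-arm length)
    only when the coarse long-range estimate of the eye distance falls
    outside the current window, with a guard margin at each edge."""
    if start_mm + margin_mm <= coarse_mm <= start_mm + window_mm - margin_mm:
        return start_mm                  # eye already well inside the window
    return coarse_mm - window_mm / 2     # re-centre the window on the eye

print(adjust_window_start(12.0, 14.0))   # nominal fit: unchanged -> 12.0
print(adjust_window_start(12.0, 23.0))   # frame slipped: re-centre -> 18.0
```

The returned window start maps directly to a reference-arm length offset in the interferometer.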
[0047] In yet another embodiment, the hybrid system may be configured for speckle tracking for motion and velocity estimation based on separate coherence sources. This can be achieved by using a high-coherence source (spatially and temporally) to track motion in conjunction with path-length measurement from the OCT sensor. The coherent source may also be “synthesized” by picking a very narrow band from the OCT source's spectrum. Furthermore, the system may be configured as an angle-sensitive detector for surface normals in combination with OCT for path length. If the angle-sensitive detectors and OCT are detecting from the same point, the exemplary system can yield information about surface normals in addition to the axial information, which will enhance the tracking/modeling accuracy.
[0048] In yet another embodiment, the exemplary system may be configured for polarization-based sensing. For instance, the system may be configured for detecting orthogonal polarizations on separate channels. In this configuration, the system can yield information about birefringence which may enhance the contrast in the signal, thus rendering segmentation and/or computation easier. In this configuration, the system may be configured to enable sensing of local curvature by measuring the ratio of the coupled power from each polarization. For example, at Brewster’s angle, only one polarization reflects and the other transmits.
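The curvature-sensing idea above can be made concrete with the Fresnel equations: the ratio of reflected power in the two polarizations falls monotonically from roughly 1 near normal incidence to 0 at Brewster's angle, so measuring the ratio constrains the local angle of incidence and hence the surface normal. The corneal refractive index n₂ = 1.376 used below is an assumed illustrative value.

```python
import math

def fresnel_ratio(theta_i_deg: float, n1: float = 1.0, n2: float = 1.376) -> float:
    """Reflected-power ratio Rp/Rs from the Fresnel equations
    (n2 = 1.376 is an assumed corneal refractive index)."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(n1 * math.sin(ti) / n2)   # Snell's law
    rs = (n1 * math.cos(ti) - n2 * math.cos(tt)) / (n1 * math.cos(ti) + n2 * math.cos(tt))
    rp = (n2 * math.cos(ti) - n1 * math.cos(tt)) / (n2 * math.cos(ti) + n1 * math.cos(tt))
    return (rp / rs) ** 2

# Brewster's angle for the assumed index: tan(theta_B) = n2 / n1.
brewster = math.degrees(math.atan(1.376 / 1.0))
for angle in (5, 25, 45, brewster):
    print(f"{angle:6.1f} deg -> Rp/Rs = {fresnel_ratio(angle):.4f}")
```

At Brewster's angle only the s-polarization reflects, matching the limiting case noted in the paragraph above.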
[0049] Furthermore, exemplary systems may be configured for scanning SMI in conjunction with OCT, or for scanning OCT utilizing a micro-electro-mechanical systems (MEMS) scanner. In this configuration, the exemplary system is configured to enable the formation of a 3D image. The system may further include an additional sensor which does not have to be in-field or even pointing at the eye. This additional sensor could sit in the arms of the glasses, pointed at the temple or the cheekbone, sensing a relative distance, and it can serve as a proxy for detecting slippage or vibrations.
[0050] Generally, the embodiments provided herein may include an eye tracking system. The system may include an interferometer and an emission section configured to direct a light beam from the interferometer to a user's eye. The system further includes a lens and a bezel region adjacent to the lens. The emission section is disposed adjacent to the lens or on the lens. In an exemplary implementation, the interferometer further includes a detector, a reference arm, a beam splitter, and a beam-shaping element. Furthermore, in an exemplary implementation, the reference arm can have a length of less than about 100 mm, less than about 80 mm, less than about 50 mm, less than about 25 mm, or less than about 10 mm.
[0051] The interferometer can be configured to output the light beam from the beam splitter. In one implementation, the eye tracking system can include a headset, where the headset includes the lens and the bezel region. The eye tracking system can further include a spatially coherent light source. The eye tracking system can include a scanning mechanism configured for beam steering, as well as another interferometer configured to emit another light beam; the respective optical axes of the two light beams can be substantially parallel. In an alternate implementation, the respective optical axes of the light beams can be angled relative to one another. The eye tracking system can further include a sensor and a camera configured to provide complementary information. The sensor can further be configured to conduct axial ranging based on self-mixing interferometry.
[0052] In yet another implementation based on the teachings provided herein, the light source can be integrated with a photonic integrated circuit. The photonic integrated circuit can include a waveguide and a coupler configured to couple the photonic integrated circuit with the light source. The coupler can be a grating coupler or a micro-optical element. The interferometer can include a planar photonic circuit and a photonic beam splitter. The photonic beam splitter can be a directional coupler.
[0053] In the exemplary system described above, the emission section can include an output coupler for directing the light beam to the user's eye. The output coupler can include a beam-shaping element. The beam-shaping element can be a meta lens, a diffractive optical element, or a combination of both.
[0054] The reference arm can include a waveguide and a directional coupler including a detection port. The directional coupler can be a 2 × 2 coupler with a 50:50 splitting ratio, and the detector can include photodetectors coupled to the directional coupler. The photodetectors can include silicon avalanche photodetectors, and they may be integrated with a light source on the same chip. In implementations where the system includes a photonic integrated circuit, the circuit can include a transparent substrate. For example, the substrate may be glass. Furthermore, the system can include an electric controller circuit and a computational unit configured to drive the light source and the photodetectors.
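One motivation for pairing photodetectors on the two outputs of a 2 × 2 50:50 coupler is balanced detection: the two ports carry the interference fringe with opposite signs on the same common-mode intensity, so subtracting the photocurrents cancels source-intensity noise. The following toy sketch uses wholly illustrative signal values to show the effect.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1e-5, 1000)

# Illustrative signals (values assumed): source intensity I0 with 5%
# fluctuation, and a weak interference fringe riding on it.
I0 = 1.0 + 0.05 * rng.normal(size=t.size)
fringe = 0.01 * np.cos(2 * np.pi * 2e5 * t)

# Outputs of an ideal 2 x 2 50:50 coupler: same common mode,
# fringe with opposite signs.
port_a = 0.5 * I0 * (1 + fringe)
port_b = 0.5 * I0 * (1 - fringe)

# Balanced detection: the subtraction cancels the common-mode term.
balanced = port_a - port_b            # equals I0 * fringe exactly
single_err = np.std(port_a - 0.5 - 0.5 * fringe)
balanced_err = np.std(balanced - fringe)
print(single_err, balanced_err)
```

The balanced residual is dominated only by the (small) noise multiplying the fringe itself, far below the single-port residual.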
[0055] In yet another embodiment, a method for tracking a user's eye using the above-described eye tracking system is provided. The method can include performing at least one interferometric measurement over a predetermined area on the eye and performing at least one depth-profile measurement of the anterior segment of the eye, the retina, or a combination thereof. The method can further include determining, based on a combination of the at least one interferometric measurement and the at least one depth-profile measurement, an estimate of a position and a direction of the eye by minimizing a difference between an observed depth and a depth computed from one of an eye model and a machine learning algorithm.
[0056] Furthermore, the method can include filtering the measurements spatiotemporally to incorporate measurements taken at different times and under different dynamics of the eye. The method can further include obtaining phase information from the measurements and computing high-resolution displacements and axial velocities based on the phase information. The method can further include running one or more computational filters to enhance a performance of the eye tracking system. The method can further include combining the measurements in a calibration protocol to generate a ground-truth eye model.
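The phase-based velocity computation above can be sketched with the standard Doppler-OCT relation v = λ·Δφ/(4π·Δt), applied to the wrapped phase difference between successive A-scans. The 850 nm wavelength is an assumption; the 10 µs A-scan interval follows from the ~100 kHz sweep rate mentioned in paragraph [0034].

```python
import numpy as np

LAM0 = 850e-9    # assumed centre wavelength (m)
DT = 1.0 / 100e3  # A-scan interval at the ~100 kHz sweep rate of [0034]

def axial_velocity(phase_prev: np.ndarray, phase_next: np.ndarray) -> np.ndarray:
    """Doppler estimate v = lambda * dphi / (4 pi dt), with the phase
    difference between successive A-scans wrapped to (-pi, pi]."""
    dphi = np.angle(np.exp(1j * (phase_next - phase_prev)))
    return LAM0 * dphi / (4 * np.pi * DT)

# A 1 rad phase change between sweeps corresponds to roughly 6.8 mm/s.
print(axial_velocity(np.array([0.0]), np.array([1.0])))
```

The same phase difference, via Δz = λ·Δφ/(4π), also gives the sub-wavelength displacement resolution referred to above.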
[0057] Those skilled in the relevant art(s) will readily appreciate that various adaptations and modifications of the exemplary embodiments described above can be achieved without departing from the scope and spirit of the present disclosure. Therefore, it is to be understood that, within the scope of the appended claims, the teachings of the disclosure may be practiced other than as specifically described herein.

Claims

What is claimed is:
1. An eye tracking system comprising:
an interferometer;
an emission section configured to direct a light beam from the interferometer to a user’s eye;
a lens; and
a bezel region adjacent to the lens;
wherein the emission section is disposed adjacent to the lens or on the lens.
2. The eye tracking system of claim 1, wherein the interferometer includes a detector, a reference arm, a beam splitter, and a beam-shaping element.
3. The eye tracking system of claim 2, wherein the reference arm has a length selected from the group of lengths consisting of less than about 100 mm, less than about 80 mm, less than about 50 mm, less than about 25 mm, and less than about 10 mm.
4. The eye tracking system of claim 2, wherein the interferometer is configured to output the light beam from the beam splitter.
5. The eye tracking system of claim 1, further comprising a headset, the headset including the lens and the bezel region.
6. The eye tracking system of claim 1, wherein the interferometer includes a spatially coherent light source.
7. The eye tracking system of claim 1, wherein the interferometer includes a scanning mechanism configured for beam steering.
8. The eye tracking system of claim 1, further comprising another interferometer configured to emit another light beam.
9. The eye tracking system of claim 8, wherein respective optical axes of the light beam and of the other light beam are substantially parallel.
10. The eye tracking system of claim 8, wherein respective optical axes of the light beam and of the other light beam are angled relative to one another.
11. The eye tracking system of claim 1, further including one of a sensor and a camera configured to provide complementary information.
12. The eye tracking system of claim 11, and any one of: a) wherein when including the sensor, the sensor is configured to conduct axial ranging based on self-mixing interferometry; or b) wherein when including the sensor, the sensor is configured to operate as a hybrid eye tracking system.
13. A method for tracking a user’s eye using an eye tracking system, the method comprising:
performing at least one interferometric measurement over a predetermined area on the eye;
performing at least one depth-profile measurement of the anterior segment of the eye, the retina, or a combination thereof;
determining, based on a combination of the at least one interferometric measurement and the at least one depth-profile measurement, an estimate of a position and a direction of the eye by minimizing a difference between an observed depth and a depth computed from one of an eye model and a machine learning algorithm; and
filtering the measurements spatiotemporally to incorporate measurements taken at different times and under different dynamics of the eye.
14. The method of claim 13, and any one of: a) further including obtaining phase information from the measurements; in which case optionally any one of: i) further including computing high-resolution displacements and axial velocity based on the phase information; or ii) further including running one or more computational filters to enhance a performance of the eye tracking system; or b) further including combining the measurements in a calibration protocol to generate a ground-truth eye model.
15. The eye tracking system of claim 1, further including a light source, wherein the light source is selected from the group consisting of a vertical cavity surface-emitting laser (VCSEL), a super luminescence light emission diode (SLED), an array of SLEDs, a tunable laser, and an array of tunable lasers; in which case optionally wherein the light source, detectors, interferometers and optical input/output couplers are integrated with photonic integrated circuits.
EP21810491.7A 2020-10-26 2021-10-23 Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry Withdrawn EP4232865A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063105867P 2020-10-26 2020-10-26
US202117228634A 2021-04-12 2021-04-12
PCT/US2021/056374 WO2022093656A1 (en) 2020-10-26 2021-10-23 Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry

Publications (1)

Publication Number Publication Date
EP4232865A1 2023-08-30

Family

ID=78676664

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21810491.7A Withdrawn EP4232865A1 (en) 2020-10-26 2021-10-23 Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry

Country Status (5)

Country Link
EP (1) EP4232865A1 (en)
JP (1) JP2023547310A (en)
KR (1) KR20230088909A (en)
CN (1) CN116406449A (en)
WO (1) WO2022093656A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019025871A1 (en) * 2017-08-01 2019-02-07 Opher Kinrot Optical systems and methods for measuring rotational movement
US11112613B2 (en) * 2017-12-18 2021-09-07 Facebook Technologies, Llc Integrated augmented reality head-mounted display for pupil steering

Also Published As

Publication number Publication date
KR20230088909A (en) 2023-06-20
CN116406449A (en) 2023-07-07
JP2023547310A (en) 2023-11-10
WO2022093656A1 (en) 2022-05-05

Similar Documents

Publication Publication Date Title
JP7170033B2 (en) Spatial division multiplexing optical coherence tomography using an integrated photon device
US9816803B2 (en) Method and system for low coherence interferometry
JP4677636B2 (en) Optical coherence tomography apparatus and variable wavelength light generator used therefor
JP4916573B2 (en) Optical interference measurement method and optical interference measurement apparatus
US9217707B2 (en) Method and apparatus for eye movement tracking in spectral optical coherence tomography (SD-OCT)
US9226655B2 (en) Image processing apparatus and image processing method
US8199329B2 (en) Apparatus for measurement of the axial length of an eye
US10349829B2 (en) Ophthalmic imaging apparatus
US20140293290A1 (en) Method and System for Compact Optical Coherence Tomography
CN112485904B (en) Eye accommodation distance measuring apparatus and method, and head-mounted display
CN114384695B (en) Eye movement tracking device
US11430262B1 (en) Eye tracking using optical coherence methods
JP2013148482A (en) Optical coherence tomography device
CN113749609A (en) Method for detecting the gaze direction of an eye
WO2015194145A1 (en) Swept source optical coherence tomography apparatus for fundus imaging
KR20120137329A (en) Device of optical coherence tomography
EP4232865A1 (en) Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry
RU2724442C1 (en) Eye focusing distance determining device and method for head-end display device, head-end display device
US20240310906A1 (en) Eye movement determination
CN111770719B (en) Method for generating two-dimensional interference pattern by Michelson free beam interferometer
CN117813540A (en) Eye movement tracking
JP7006874B2 (en) OCT system and OCT method
EP4198607A1 (en) Eye sensing
CN111770719A (en) Method for generating two-dimensional interferograms by michelson type free beam interferometer
WO2023078689A1 (en) Retinal imaging

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230424

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20231216