EP4232865A1 - Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry

Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry

Info

Publication number
EP4232865A1
EP4232865A1 (application EP21810491.7A)
Authority
EP
European Patent Office
Prior art keywords
eye
tracking system
eye tracking
interferometer
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP21810491.7A
Other languages
German (de)
English (en)
Inventor
Robin Sharma
Alexander FIX
Mohamed EL-HADDAD
Marc Aurèle GILLES
Guohua Wei
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Platforms Technologies LLC filed Critical Meta Platforms Technologies LLC
Publication of EP4232865A1
Current legal status: Withdrawn

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking

Definitions

  • the present disclosure relates to eye tracking using photonic integrated circuits. More particularly, the present disclosure relates to systems and methods for eye tracking in a head-mounted device using low-coherence interferometry. Applications of such systems and methods include, for example and without limitation, intent inference, cognitive load estimation, and health monitoring.
  • one conventional approach to eye tracking relies on two-dimensional landmark point reflections ("glints").
  • This approach generally requires different electronic components for illumination and detection, and it utilizes two-dimensional sensors to image the eye and the glints.
  • This approach also relies on image processing to find the coordinates of each glint relative to the eye.
  • Typical glint-based tracking methods and systems may provide inaccurate eye tracking results. These shortcomings originate from oversimplified eye models used in typical glint estimations, from the inherent ambiguity encountered when measuring overlapping glints, and from the measurements’ sensitivity to ambient light. As such, it is challenging to design a glint-based tracker that operates with high accuracy for a majority of the population and in varied environmental conditions.
  • an eye tracking system comprising: an interferometer; an emission section configured to direct a light beam from the interferometer to a user’s eye; a lens; and a bezel region adjacent to the lens; wherein the emission section is disposed adjacent to the lens or on the lens.
  • the interferometer includes a detector, a reference arm, a beam splitter, and a beam-shaping element.
  • the reference arm has a length selected from the group of lengths consisting of less than about 100 mm, less than about 80 mm, less than about 50 mm, less than about 25 mm, and less than about 10 mm.
  • the interferometer is configured to output the light beam from the beam splitter.
  • the eye tracking system further comprises a headset, the headset including the lens and the bezel region.
  • the interferometer includes a spatially coherent light source.
  • the interferometer includes a scanning mechanism configured for beam steering.
  • the eye tracking system further comprises another interferometer configured to emit another light beam.
  • respective optical axes of the light beam and of the other light beam are substantially parallel.
  • respective optical axes of the light beam and of the other light beam are angled relative to one another.
  • the eye tracking system of claim 1 further including one of a sensor and a camera configured to provide complementary information.
  • when the eye tracking system includes the sensor, the sensor is configured to conduct axial ranging based on self-mixing interferometry (SMI).
  • when the eye tracking system includes the sensor, it is configured to operate as a hybrid eye tracking system.
  • a method for tracking a user's eye using an eye tracking system comprising: performing at least one interferometric measurement over a predetermined area on the eye; performing at least one depth-profile measurement of the anterior segment of the eye, the retina, or a combination thereof; determining, based on a combination of the at least one interferometric measurement and the at least one depth-profile measurement, an estimate of a position and a direction of the eye by minimizing a difference between an observed depth and a depth computed from an eye model or by a machine learning algorithm; and filtering the measurements spatiotemporally to incorporate measurements taken at different times and at different dynamics of the eye.
  • the method further comprises obtaining phase information from the measurements.
  • the method further comprises computing high-resolution displacements and axial velocity based on the phase information.
  • the method further comprises running one or more computational filters to enhance a performance of the eye tracking system.
  • the method further comprises combining the measurements in a calibration protocol to generate a ground-truth eye model.
  • the eye tracking system further includes a light source, wherein the light source is selected from the group consisting of a vertical-cavity surface-emitting laser (VCSEL), a superluminescent light-emitting diode (SLED), an array of SLEDs, a tunable laser, and an array of tunable lasers; optionally, the light source, detectors, interferometers, and optical input/output couplers are integrated with photonic integrated circuits.
  • VCSEL: vertical-cavity surface-emitting laser
  • SLED: superluminescent light-emitting diode
  • the light source, detectors, interferometers and optical input/output couplers are integrated with photonic integrated circuits.
  • the embodiments featured herein help solve or mitigate the above-noted issues as well as other issues known in the art.
  • at least one of the embodiments featured herein provides low-coherence interferometry that allows three-dimensional sensing, yielding higher signal-to-noise ratios due to high interferometric gain.
  • low-coherence interferometry is able to resolve surfaces of the eye with less than about 10 µm of axial resolution.
  • the embodiments also provide measurements that are insensitive to ambient light. Moreover, three-dimensional information may be leveraged in a calibration protocol to generate subject-specific eye models that may greatly enhance the overall accuracy of the system.
  • the embodiments are also configured to enable high-speed tracking of saccades and micro-saccades, which may provide valuable signals and measurements for applications such as intent inference, cognitive load estimation, and health monitoring.
  • an embodiment provides an eye tracking system including an interferometer.
  • the system also includes an emission section configured to direct a light beam from the interferometer to a user’s eye and a lens.
  • a bezel region is adjacent to the lens, wherein the emission section is disposed adjacent to the lens or on the lens.
  • FIG. 1 illustrates a swept source-based interferometer according to various aspects of the present disclosure.
  • FIG. 2 illustrates an axial measurement scheme according to various aspects of the present disclosure.
  • FIG. 3 illustrates an assembly according to various aspects of the present disclosure.
  • FIG. 4 illustrates an eye tracking system according to various aspects of the present disclosure.
  • FIG. 5 illustrates a method according to various aspects of the present disclosure.
  • the embodiments featured herein may include interferometer-based systems or hybrid systems that rely on interferometry as well as other sensing modalities, such as image sensing.
  • the embodiments generally include photonic integrated circuits which may include integrated light sources.
  • an exemplary system may be a swept source-based system.
  • the exemplary system includes a swept source that can tune the bandwidth of its output wavelength at a certain speed (for example, at about 100 kHz) and over a specified wavelength range.
  • the wavelength range may be from about 5 nm to about 100 nm.
  • the system can include one or more photodetectors.
  • the light source and one or more photodetectors can be integrated on a single chip.
  • the light from the source may be coupled into elements of the photonic circuit that guide the output light to a beam splitter, the reference and sample arms, and the various photodetectors present in the system, thus forming a low-coherence interferometer.
  • multiple interferometers, such as the one described above, may be distributed on a glass substrate to form a spatial detection system.
  • Laser diode (LD) chips and photodiode (PD) chips are placed in the bezel region (embedded in the frame around the substrate) or near the edge where see-through disturbance is the least.
  • a single light source can be shared by the multiple interferometers by using photonic waveguide splitters or optical switches.
  • FIG. 1 illustrates an example implementation of an interferometer 100 on a chip 105, according to an embodiment.
  • beam 101 indicates the direction of light from a source 103, and beam 102 is the reflection from the sample 108, in this case the user's eye.
  • Light from the source is collimated and then split by a beam splitter 106 into reference and sample arms.
  • the reference arm is directed up to mirror 110 and reflected back to the beam splitter.
  • the light reflected from the sample 108 and from the reference arm is recombined at the splitter, and an interference pattern is detected by the detector 107.
  • FIG. 2 illustrates an example axial measurement (A-scan) scheme 200 where a chip like the interferometer 100 shown in FIG. 1 is embedded into the viewing optics of a headset 201.
  • the representative A-scan illustrates characteristic peaks corresponding to the primary surfaces of interest in the eye, thereby providing a depth profile of the eye, where each peak corresponds to a specific depth of the eye.
  • FIG. 3 illustrates a panel 300 of different examples of possible arrangements of interferometer chips 100 on the lens of the headset, the lens bezel, or a combination thereof.
  • the emitted beams may be parallel or converging (top row); the integrated circuits may be entirely embedded in the lens, entirely placed in the bezel/frame, or split between both regions and coupled via waveguides (bottom row).
  • FIG. 4 illustrates a swept source-based system 400 based on the interferometer 100 as illustrated in FIG. 1.
  • the system 400 is integrated with a headset 201 and may include a laser diode and a photodiode which can be monolithically integrated into a single chip.
  • the interferometer-based eye tracking system 400 can be realized using waveguide beam splitters based on directional couplers.
  • the input/output (optical I/O) may be on a single channel, and the system 400 may include a beam steering device to provide high spatial resolution.
  • FIG. 5 illustrates a method 500 according to an embodiment.
  • the method 500 is an exemplary process via which a system such as the system 400 is used to perform gaze-based interaction in a virtual or augmented reality headset.
  • the method 500 begins at step 502.
  • a user wears a virtual or augmented reality headset including a frame that comprises an eye-tracking system such as the exemplary systems previously described.
  • the method 500 can include providing, by the system, signals that reflect the axial profiles of reflectivity from multiple measurement points.
  • the method 500 includes analyzing the provided signals via a postprocessing algorithm.
  • processing may include classifying one or more of the signals as being associated with an ocular structure like the cornea, the sclera, the iris, the lens, or the retina.
  • processing may also include classifying one or more of the signals as being associated with the skin or the eye lashes.
  • the method 500 can include fitting the classified signals to a prespecified reference frame or eye model.
  • the method 500 can include computing a gaze angle and/or pupil center from the fitted signals and the pre-specified reference frame or eye model (a minimal illustrative sketch of this fitting step is given after this list).
  • the method 500 may further include inferring points of interest on the display of the headset based on the computed parameters such as gaze angle and/or pupil center.
  • the method 500 may also include making decisions based on dwell time or a number of blinks. The method may end at step 516.
  • One or more of the embodiments described herein may be configured to perform dynamic-bandwidth optical coherence tomography (OCT) for either low/high-resolution or long/short-ranging applications.
  • OCT: optical coherence tomography
  • a wavelength-swept source may be used to dynamically control the bandwidth over which the source is swept. Sweeping over a narrow bandwidth yields low axial resolution, whereas sweeping over a wide bandwidth yields high axial resolution (standard relations for this trade-off are sketched after this list).
  • embodiments of the present disclosure may be configured as a hybrid system.
  • the hybrid system may combine a long-range, lower-resolution path-length sensor, using SMI or optical or acoustic time-of-flight, with low-coherence interferometry for high resolution.
  • such a hybrid system may be configured to function as a sensor capable of measuring any path length.
  • the hybrid system may be configured to allow the reference arm position to be controlled dynamically and optimally.
  • an OCT sensor according to an embodiment may be imaging a 10 mm range, 12 mm away from a lens. If the frame slips or the user has an unusual eye-relief, this distance may be insufficient.
  • the exemplary system may be configured to shift the starting point of the OCT sensor's 10 mm range by adjusting the reference arm length.
  • the hybrid system may be configured for speckle tracking for motion and velocity estimation based on separate coherence sources. This can be achieved by selecting a narrow band from the spectrum, using a high-coherence source (spatial and temporal) to track motion in conjunction with the path-length measurement from the OCT sensor.
  • the coherent source may also be “synthesized” by picking a very narrow band from the OCT source.
  • the system may be configured as an angle-sensitive detector for surface normals in combination with OCT for path length. If angle-sensitive detectors and OCT are detecting from the same point, the exemplary system can yield information about surface normals in addition to the axial information, which will enhance the tracking/modeling accuracy.
  • the exemplary system may be configured for polarization-based sensing.
  • the system may be configured for detecting orthogonal polarizations on separate channels.
  • the system can yield information about birefringence which may enhance the contrast in the signal, thus rendering segmentation and/or computation easier.
  • the system may be configured to enable sensing of local curvature by measuring the ratio of the coupled power from each polarization. For example, at Brewster’s angle, only one polarization reflects and the other transmits.
  • exemplary systems may be configured for scanning SMI in conjunction with OCT or scanning OCT utilizing a micro-electro-mechanical systems (MEMS) scanner.
  • MEMS: micro-electro-mechanical systems
  • the exemplary system is configured to enable the formation of a 3D image.
  • the system may include an additional sensor, which does not have to be in-field or even pointing at the eye. This additional sensor could sit in the arms of the glasses, pointed at the temple or the cheekbone and sensing a relative distance, and it can serve as a proxy for detecting slippage or vibrations.
  • the embodiments provided herein may include an eye tracking system.
  • the system may include an interferometer and an emission section configured to direct a light beam from the interferometer to a user's eye.
  • the system further includes a lens and a bezel region adjacent to the lens.
  • the emission section is disposed adjacent to the lens or on the lens.
  • the interferometer further includes a detector, a reference arm, a beam splitter, and a beam-shaping element.
  • the reference arm can have a length of less than about 100 mm, of less than about 80 mm, of less than about 50 mm, of less than about 25 mm, or of less than about 10 mm.
  • the interferometer can be configured to output the light beam from the beam splitter.
  • the eye tracking system can include a headset where the headset includes the lens and the bezel region.
  • the eye tracking system can further include a spatially coherent light source.
  • the eye tracking system can include a scanning mechanism configured for beam steering, as well as another interferometer configured to emit another light beam; the respective optical axes of the light beams can be substantially parallel. In an alternate implementation, the respective optical axes of the light beams can be angled relative to one another.
  • the eye tracking system can further include a sensor and a camera configured to provide complementary information. The sensor can further be configured to conduct axial ranging based on self-mixing interferometry.
  • the light source can be integrated with a photonic integrated circuit.
  • the photonic integrated circuit can include a waveguide and a coupler configured to couple the photonic integrated circuit with the light source.
  • the coupler can be a grating coupler or a micro-optical element.
  • the interferometer can include a planar photonic circuit and a photonic beam splitter.
  • the photonic beam splitter can be a directional coupler.
  • the emission section can include an output coupler for directing the light beam to the user’s eye.
  • the output coupler can include a beam-shaping element.
  • the beam-shaping element can be a meta lens or a diffractive optical element.
  • the beam-shaping element includes a meta lens and a diffractive optical element.
  • the reference arm can include a waveguide and a directional coupler including a detection port.
  • the directional coupler can be a 2 × 2 coupler with a 50:50 splitting ratio, and the detector can include photodetectors coupled to the directional coupler.
  • the photodetectors can include silicon avalanche photodetectors and they may be integrated with a light source on the same chip.
  • the circuit can include a transparent substrate.
  • the substrate may also be glass.
  • the system can include an electric controller circuit and a computational unit configured to drive the light source and the photodetectors.
  • a method for tracking a user’s eye using the above described eye tracking system can include performing at least one interferometric measurement over a predetermined area on the eye and performing at least one depth-profile measurement of the anterior segment of the eye, retina, or a combination thereof.
  • the method can further include determining, based on a combination of the at least one interferometric measurement and the at least one depth profile measurement, an estimate of a position and a direction of the eye by minimizing a difference between an observed depth and a depth computed from one of an eye model and a machine learning algorithm.
  • the method can further include filtering the measurements spatiotemporally to incorporate measurements taken at different times and at different dynamics of the eye (a simple illustrative filter is sketched after this list).
  • the method can further include obtaining phase information from the measurements and computing high-resolution displacements and axial velocity based on the phase information.
  • the method can further include running one or more computational filters to enhance a performance of the eye tracking system.
  • the method can further include combining the measurements in a calibration protocol to generate a ground-truth eye model.
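
The eye tracking method recited above (interferometric depth-profile measurements, fitting against an eye model, and computing a gaze angle) can be illustrated with a short sketch. None of the code below appears in the application; it is a minimal sketch assuming a swept-source fringe sampled on a uniform wavenumber grid, and the function names, the five-parameter pose vector, and the eye_model callable are hypothetical.

```python
# Illustrative only; not taken from the application. Hypothetical names throughout.
import numpy as np
from scipy.optimize import least_squares
from scipy.signal import find_peaks

def a_scan(fringe, wavenumbers):
    """Turn a swept-source fringe, sampled on a uniform wavenumber grid (rad/m),
    into an axial reflectivity profile (the standard Fourier-domain OCT step)."""
    fringe = fringe - np.mean(fringe)                      # remove DC background
    profile = np.abs(np.fft.rfft(fringe * np.hanning(fringe.size)))
    dk = wavenumbers[1] - wavenumbers[0]
    depths = np.arange(profile.size) * np.pi / (dk * fringe.size)  # depth axis (m)
    return depths, profile

def surface_depths(depths, profile, n_peaks=3):
    """Locate local maxima and keep the strongest ones; in the eye these would
    correspond to surfaces such as the cornea, lens, and retina."""
    peaks, props = find_peaks(profile, height=0.0)
    strongest = peaks[np.argsort(props["peak_heights"])[-n_peaks:]]
    return np.sort(depths[strongest])

def fit_gaze(observed_depths, beam_origins, beam_dirs, eye_model, x0=None):
    """Estimate an eye pose by minimizing the difference between observed depths
    and the depths predicted by a user-supplied eye model (hypothetical callable)."""
    x0 = np.zeros(5) if x0 is None else x0                 # e.g. [x, y, z, yaw, pitch]
    def residual(pose):
        return eye_model(pose, beam_origins, beam_dirs) - observed_depths
    return least_squares(residual, x0).x
```

In a complete pipeline, the classification of each detected peak as cornea, sclera, iris, lens, retina, skin, or eyelash (as described for method 500) would sit between surface_depths and fit_gaze.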
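
The bandwidth/resolution trade-off and the phase-based displacement and axial-velocity computations mentioned above are not written out as formulas in the application; the standard optical coherence tomography relations below, which assume a Gaussian source spectrum, illustrate them (λ0: center wavelength, Δλ: swept bandwidth, Δφ: interferometric phase change between A-scans, n: refractive index, τ: A-scan interval).

```latex
\delta z \;\approx\; \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda},
\qquad
\Delta z \;=\; \frac{\lambda_0\,\Delta\varphi}{4\pi n},
\qquad
v_z \;=\; \frac{\lambda_0\,\Delta\varphi}{4\pi n\,\tau}
```

These relations show why sweeping over a wider bandwidth Δλ reduces (improves) the axial resolution δz, and how phase differences between repeated measurements yield sub-resolution displacement Δz and axial velocity v_z.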
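
The spatiotemporal filtering and the "computational filters" recited above are not specified further in the application. As one illustration only, a constant-velocity Kalman filter could smooth per-measurement gaze estimates over time; the class name, state layout, and noise values below are assumptions.

```python
# Illustrative only; not taken from the application. One gaze axis shown for brevity.
import numpy as np

class GazeKalman1D:
    def __init__(self, dt, process_var=1e-3, meas_var=1e-2):
        self.x = np.zeros(2)                        # state: [angle, angular velocity]
        self.P = np.eye(2)                          # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
        self.Q = process_var * np.eye(2)            # process noise covariance
        self.H = np.array([[1.0, 0.0]])             # only the angle is measured
        self.R = np.array([[meas_var]])             # measurement noise covariance

    def update(self, measured_angle):
        # Predict the state forward by one time step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct the prediction with the new measurement.
        innovation = measured_angle - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]                            # filtered gaze angle
```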

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instruments For Measurement Of Length By Optical Means (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An eye tracking system including an interferometer is provided. The system also includes an emission section configured to direct a light beam from the interferometer to a user's eye, and a lens. A bezel region is adjacent to the lens, and the emission section is disposed adjacent to the lens or on the lens.
EP21810491.7A 2020-10-26 2021-10-23 Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry Withdrawn EP4232865A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063105867P 2020-10-26 2020-10-26
US202117228634A 2021-04-12 2021-04-12
PCT/US2021/056374 WO2022093656A1 (fr) Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry

Publications (1)

Publication Number Publication Date
EP4232865A1 (fr) 2023-08-30

Family

ID=78676664

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21810491.7A 2020-10-26 2021-10-23 Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry Withdrawn EP4232865A1 (fr)

Country Status (5)

Country Link
EP (1) EP4232865A1 (fr)
JP (1) JP2023547310A (fr)
KR (1) KR20230088909A (fr)
CN (1) CN116406449A (fr)
WO (1) WO2022093656A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019025871A1 (fr) * 2017-08-01 2019-02-07 Optical systems and methods for measuring rotational movement
US11112613B2 (en) * 2017-12-18 2021-09-07 Facebook Technologies, Llc Integrated augmented reality head-mounted display for pupil steering

Also Published As

Publication number Publication date
WO2022093656A1 (fr) 2022-05-05
JP2023547310A (ja) 2023-11-10
CN116406449A (zh) 2023-07-07
KR20230088909A (ko) 2023-06-20

Similar Documents

Publication Publication Date Title
JP7170033B2 (ja) Space-division multiplexed optical coherence tomography using integrated photonic devices
US9816803B2 (en) Method and system for low coherence interferometry
JP4677636B2 (ja) Optical coherence tomography apparatus and variable-wavelength light generating device used therein
JP4916573B2 (ja) Optical interference measurement method and optical interference measurement apparatus
US9217707B2 (en) Method and apparatus for eye movement tracking in spectral optical coherence tomography (SD-OCT)
US9226655B2 (en) Image processing apparatus and image processing method
US8199329B2 (en) Apparatus for measurement of the axial length of an eye
US10349829B2 (en) Ophthalmic imaging apparatus
JP5903903B2 (ja) Optical coherence tomography apparatus
CN112485904B (zh) Eye accommodation distance measuring device and method, and head-mounted display
JP2007114160A (ja) Optical coherence tomography apparatus
US11430262B1 (en) Eye tracking using optical coherence methods
CN113749609A (zh) Method for detecting the gaze direction of an eye
WO2015194145A1 (fr) Swept-source optical coherence tomography apparatus for fundus imaging
KR20120137329A (ko) Optical coherence tomography apparatus
EP4232865A1 (fr) Systems and methods for eye tracking in a head-mounted device using low-coherence interferometry
RU2724442C1 (ru) Device and method for determining the eye focus distance for a head-mounted display device, and head-mounted display device
CN117813540A (zh) Eye tracking
JP7006874B2 (ja) OCT system and OCT method
EP4198607A1 Eye detection
WO2022243027A1 (fr) Determination of eye movement
CN111770719A (zh) Method for generating two-dimensional interferograms with a Michelson-type free-beam interferometer
WO2023078689A1 (fr) Retinal imaging
CN117617891A (zh) Scanning-laser apparatus for measuring the topography of the eye's refractive media
Rovati et al. Self-mixing low-coherence interferometry

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230424

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20231216