WO2022024104A1 - Eye tracking systems and methods - Google Patents


Info

Publication number
WO2022024104A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
incident
center
parameter
light beam
Prior art date
Application number
PCT/IL2021/050643
Other languages
French (fr)
Inventor
Boris Greenberg
Artem HIKMAN
Roy Kaner
Albert Kashchenevsky
Mark SHOVMAN
Yaron Zimmerman
Yakov Weinberg
Pedro MERCADER
Original Assignee
Eyeway Vision Ltd.
Priority date
Filing date
Publication date
Application filed by Eyeway Vision Ltd. filed Critical Eyeway Vision Ltd.
Publication of WO2022024104A1 publication Critical patent/WO2022024104A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30041 Eye; Retina; Ophthalmic

Definitions

  • the invention is in the field of eye tracking techniques for use in different applications, such as with image projection systems, medical diagnostics and relevant research.
  • Eye tracking is becoming increasingly popular in many applications, and accurate eye tracking is a must in some of them. Some applications require online eye tracking with time resolution on the order of microseconds, dictated by fast eye movements, such as virtual or augmented reality applications; others require offline eye tracking analysis that may include an online tracking component, such as applications in the research and medical diagnostics fields. As technology advances, the need to enhance the accuracy, precision, latency and compactness of eye trackers persists. In the example of virtual/augmented reality applications, the images provided by image projection systems should emulate real-life visual perception. Light carrying the information of the outer world enters and traverses the human eye along known paths until reaching the retina, and from there the brain, where the light signal is interpreted.
  • Tracking the eye and gaze direction would therefore be necessary to provide convincing virtual/augmented reality experience.
  • tracking the eye and gaze direction is especially important in direct retinal projection.
  • In direct retinal projection, a small eye box is produced, typically with a size smaller than the pupil diameter.
  • The image will be lost unless the eye box is expanded by optical means, which adds significant complexity to virtual/augmented reality systems and often degrades image brightness and quality. Tracking the eye and gaze direction allows moving the small eye box in real time, thus enabling images with good brightness and contrast and eliminating the need for eye box expansion.
  • the first eye trackers were built at the end of the 19th century. The devices were difficult to build and caused discomfort to the participants. Specially designed rings, contact lenses, and suction cups were attached to the eyes to help measure eye movements.
  • the first photography-based eye trackers, which examined light reflected from different parts of the eye, were introduced only at the beginning of the 20th century. They were much less intrusive. For most of the 20th century, researchers built their own eye trackers, which were costly and of limited availability. The first commercially available eye trackers appeared only in the 1970s.
  • Video OculoGraphy (VOG)
  • VOG systems usually capture images of an eye of a user and determine certain features of the eye on the basis of the captured images. These systems are non-intrusive and usually rely on infra-red illumination of the eye that does not cause disturbance or discomfort to the user. Some of them rely on small lightweight cameras that are wearable, but more precise systems are stationary.
  • the combined pupil/cornea reflection (1st Purkinje) eye tracker illuminates an eye with a number of infra-red light diodes and images the surface of the eye with one or more cameras, segmenting the pupil (as the darkest part of the eye) and the first Purkinje images of the diodes. A change of the pupil position relative to the 1st Purkinje images of the IR diodes indicates movement of the eye.
  • User calibration must be used to calculate the real angles. Usually, a user is required to focus on a target moving along a known path to calibrate the system. The precision and accuracy of this approach are relatively low, since it relies heavily on alignment and user calibration. If the user does not look precisely, accurately and without latency at the target, the system will not be calibrated correctly.
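The prior-art pupil/corneal-reflection mapping described above can be sketched, to first order, as a linear map from the pupil-center-to-glint vector in the image to gaze angles. The sketch below is purely illustrative; the function name is invented here, and the gain matrix is precisely what the user calibration procedure would estimate:

```python
import numpy as np

def gaze_from_pupil_glint(pupil_px, glint_px, gain):
    # First-order model: gaze angles ~ gain matrix times the
    # pupil-center-to-glint vector measured in the image (pixels).
    return gain @ (np.asarray(pupil_px, float) - np.asarray(glint_px, float))

# e.g. gaze_from_pupil_glint([105.0, 200.0], [100.0, 198.0], np.eye(2) * 0.1)
```

Since the gain depends on the user and on the geometry of the setup, any slippage of the system on the head invalidates it, which is the recalibration problem noted below.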
  • any movement of the system relative to the user's head will require a recalibration. Pupil dilation further decreases measurement precision, and ambient reflections from the cornea confuse image-processing algorithms.
  • Different eye colors, long eyelashes, and contact lenses are additional factors that complicate image-processing systems further. Therefore, these systems are usually noisy and provide no better than 1° precision for angular movements, with no information on lateral eye movements.
  • recalibration is required.
  • the present invention provides a novel approach of eye tracking for use, for example, in virtual and/or augmented reality applications, in medical applications and/or in general eye research.
  • Precise online eye tracking in virtual and/or augmented reality applications enables projecting high-quality and life-like virtual images towards an eye of a user by preserving the location of the virtual images on the eye’s retina with high accuracy.
  • ocular axes have been defined to form a reference for defining the light paths inside the human eye, for use in multiple applications, such as in optometry, ophthalmological diagnosis and surgical procedures.
  • Some of the frequently used ocular reference axes are the optical axis, the pupillary axis, the visual axis, and the line of sight. At least some of these ocular axes can be useful and can serve as a reference for eye tracking applications.
  • the optical axis of the eye can be defined as an axis that passes through and contains the centers of curvature of the optical surfaces of the eye.
  • the optical surfaces of the eye are the anterior corneal surface, the posterior corneal surface, the anterior crystalline lens surface and the posterior crystalline lens surface.
  • the optical axis is a theoretical construct, and a more practical definition of the optical axis may be the “best fit” line through the centers of curvature of the “best fit” spheres of cornea and lens surfaces.
  • the pupillary axis can be defined as the normal line to the anterior corneal surface that passes through the center of the entrance pupil. If the eye were a centered optical system, the pupillary axis would coincide with the optical axis. However, the pupil is often not centered relative to the cornea, and the cornea usually deviates from a regular shape, so that the pupillary axis points in a slightly different direction than the optical axis.
  • the visual axis can be defined as the line connecting the fixation point (e.g., the spatial location of the object being contemplated), with the fovea, passing through the two nodal points of the eye.
  • the angle between the optical axis and the visual axis is about 5° and varies approximately between 4° and 8°.
  • a target fixation point
  • the visual axis is not easily found experimentally because the nodal points of the eye are abstract notions and not physical entities within the eye. Since the nodal points (on the object and image sides) are within 1 mm of the corneal center of curvature, the visual axis is nearly perpendicular to the cornea.
  • the line of sight can be defined as the ray from the fixation point reaching the fovea via the pupil center.
  • the line of sight is basically the chief ray of the bundle of light arriving from an external object and reaching the individual’s fovea.
  • the line of sight can be easily identified experimentally thanks to its close connection to the pupil center.
  • the line of sight may not be considered a fixed axis because the pupil center may move when the pupil size changes.
  • the first optical surface of the eye that light entering the eye encounters is the corneal surface, specifically the anterior corneal surface.
  • the cornea is a sphere
  • the center of curvature of the cornea is the center of the corneal sphere.
  • the optical axis is normal (perpendicular) to the anterior corneal surface since it passes through the center of curvature of the spherical cornea.
  • the present invention utilizes the above-mentioned properties of the eye, including its shape, to track the eye, utilizing light beam propagation, by tracking one or more ocular axes, such as the optical axis of the eye, with good approximation.
  • tracking of an ocular axis of the eye requires monitoring the position of the light beam at two spatially separated surfaces instead of only one, because the human eye does not have a fixed center of rotation, so that angular eye motions also involve lateral displacement of the ocular axis.
  • the tracking thus involves four degrees of freedom, e.g. two angles (pitch and yaw) and two lateral displacements (horizontal/vertical).
  • the first condition, perpendicularity to the cornea, fixes two incidence angles (pitch and yaw) when fulfilled; a second condition is needed to ensure a constant (lateral, horizontal/vertical) intersection point of the axis on the cornea.
  • the second condition is a geometrical condition and involves maintaining a geometrical relationship between the incident light beam propagation path and a geometrical parameter of the eye.
  • the geometrical parameter of the eye relates to one or more physical entities of the eye, such as the pupil and/or the limbus.
  • a successful tracking of an ocular axis involves continuous adjustment of the spatial and angular propagation paths of the tracking light beam in order to maintain the tracking conditions because, for example, the perpendicularity condition of the corneal beam will be breached by a lateral displacement of the cornea with respect to the corneal beam, even though the angular direction of the corneal beam need not change.
  • the cornea is approximated as having a spherical shape.
  • a light beam normal to the corneal surface will be reflected back in the same direction, given that the reflection from the corneal surface is a specular reflection (this is mainly the case when the light beam falls on the portion of the cornea above the pupil).
  • the light beam is reflected from the cornea along the same optical path if and only if it is normal (perpendicular) to the corneal surface.
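Under the spherical-cornea approximation, this retro-reflection property can be checked numerically: a ray is retro-reflected by a sphere exactly when it is aimed at the center of curvature. The sketch below is illustrative only; the 7.8 mm radius is a typical anterior corneal radius, not a value taken from this disclosure:

```python
import numpy as np

def reflect_off_sphere(origin, direction, center, radius):
    # Reflect a ray off a sphere (nearest intersection).
    # Returns the hit point and the specularly reflected direction.
    origin = np.asarray(origin, float); center = np.asarray(center, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    oc = origin - center
    b = np.dot(oc, d)
    t = -b - np.sqrt(b * b - (np.dot(oc, oc) - radius ** 2))  # assumes the ray hits
    hit = origin + t * d
    n = (hit - center) / radius                # outward surface normal
    return hit, d - 2.0 * np.dot(d, n) * n    # specular reflection

# A beam aimed through the center of curvature comes back along its own
# path (zero angular offset); any other beam does not.
```

Detecting whether the reflected direction coincides with the reversed incident direction is, in this picture, equivalent to testing perpendicularity at the corneal surface.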
  • the technique of the invention for tracking ocular axes utilizes beam scanning, i.e. suitably illuminating the eye with an incident light beam propagating along a well-defined spatial and angular illumination propagation path and detecting its respective reflected light beam propagating backwardly along a reflection propagation path.
  • the incident light beam configured to be reflected from the cornea, is a narrow beam, i.e. has a small cross-sectional dimension/area with respect to a dimension/area of the pupil.
  • This enables obtaining spatial and angular information, i.e. information about the exact location of the intersection point of the beam on the cornea as well as the angle of reflection of the beam from the cornea.
  • the information of the spatial and angular propagation paths both in the forward (illumination) and backward (reflection) directions, can be acquired by using relatively simple position sensors/detectors.
  • a method for tracking an eye of an individual comprising: i) illuminating an eye of an individual, over an area of cornea of the eye, with an incident light beam propagating towards the cornea of the eye along an incident direction, detecting a reflected light beam propagating from the cornea of the eye along a reflected direction and determining angular offset between the incident and the reflected directions; ii) analyzing sensing data of the eye to determine a geometrical parameter of the eye; iii) repeatedly adjusting the incident direction of the incident light beam such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with the incident direction of the incident light beam; and iv) repeating steps (i) to (iii) under changes in gaze direction of the eye of the individual.
  • step (iii) is repeated until said angular offset tends to or equals zero.
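A toy two-dimensional sketch of step (iii) may help fix ideas: the beam is kept normal to a spherical cornea by aiming it at the center of curvature, and its corneal intersection point is then nudged until the beam's extension through the eye also passes through the limbus center. The function names, the simple proportional update and the gain are illustrative assumptions, not the disclosed implementation:

```python
import numpy as np

def align_beam(cornea_center, limbus_center, hit_point, gain=0.8, n_iter=200):
    # Perpendicularity is enforced by construction: the beam direction is
    # always along the radius from the center of curvature to the hit point.
    # The geometrical condition is restored by shifting the hit point so the
    # lateral miss of the limbus center from the beam line shrinks to zero.
    p = np.asarray(hit_point, float)
    C = np.asarray(cornea_center, float)
    L = np.asarray(limbus_center, float)
    for _ in range(n_iter):
        d = (p - C) / np.linalg.norm(p - C)    # beam along the corneal radius
        v = L - C
        miss = v - np.dot(v, d) * d            # lateral miss of the limbus center
        p = p + gain * miss                    # shift the intersection point
    d = (p - C) / np.linalg.norm(p - C)
    return p, d
```

At convergence the beam line contains both the center of curvature and the limbus center, i.e. both tracking conditions of step (iii) hold simultaneously.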
  • the method further comprises applying a transformation of coordinates for determining said predetermined geometrical relationship between the geometrical parameter of the eye and the incident direction of the incident light beam.
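Such a transformation of coordinates may, for instance, be a rigid-body map taking points measured in the eye-sensor frame into the frame in which the incident direction is steered. The sketch below is a minimal illustration; the rotation and translation are assumed known, e.g. from a prior system calibration:

```python
import numpy as np

def to_beam_frame(point_sensor, rotation, translation):
    # Rigid-body change of coordinates: sensor frame -> beam-steering frame,
    # so the geometrical parameter and the incident direction can be compared
    # in one common reference coordinate system.
    return rotation @ np.asarray(point_sensor, float) + translation
```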
  • the sensing data comprises stereoscopic images of the eye
  • said analyzing of the sensing data to determine the geometrical parameter comprises triangulation.
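The triangulation of a feature (e.g. the limbus center) from a stereoscopic pair can be sketched as midpoint triangulation of the two viewing rays. The camera origins and ray directions below are placeholders, not calibrated values from this disclosure:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    # Midpoint triangulation: the 3D point halfway between the closest
    # points of two viewing rays (camera center o, direction d).
    d1 = np.asarray(d1, float); d1 = d1 / np.linalg.norm(d1)
    d2 = np.asarray(d2, float); d2 = d2 / np.linalg.norm(d2)
    o1 = np.asarray(o1, float); o2 = np.asarray(o2, float)
    b = np.dot(d1, d2)
    w = o1 - o2
    denom = 1.0 - b * b                       # rays must not be parallel
    s = (b * np.dot(d2, w) - np.dot(d1, w)) / denom
    t = (np.dot(d2, w) - b * np.dot(d1, w)) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))
```

For rays that truly intersect, the midpoint coincides with the intersection; with noisy stereo detections it gives the least-squares closest point.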
  • the sensing data of the eye is provided at a predetermined first pace being slower than a predetermined second pace of said illuminating of the eye with the incident light beam. In some embodiments, analyzing of the sensing data to determine the geometrical parameter of the eye is done off-line after finishing an eye tracking session.
  • an eye tracking system comprising a control and processing unit comprising: a light path analyzer configured and operable to receive light data indicative of incident light beam illuminating an area of cornea of an eye of an individual and a corresponding reflected light beam propagating backwardly from the cornea of the eye, and generate light path data indicative of angular offset between incident and reflected light directions of said incident and reflected light beams respectively; a sensing data analyzer configured and operable to receive sensing data of the eye and analyze the sensing data to determine a geometrical parameter of the eye; and an eye tracking analyzer configured and operable to generate operational data for repeatedly adjusting the incident light direction such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with said incident light direction.
  • the eye tracking analyzer is configured and operable to generate the operational data for repeatedly adjusting the incident light direction until said angular offset tends to or equals zero.
  • the system further comprises a light source configured and operable to generate said incident light beam being configured to be reflected from the cornea of the eye.
  • the eye tracking system further comprises a light directing arrangement configured and operable to direct said incident light beam from said light source towards the cornea of the eye of the individual, and collect said reflected light beam propagating backwardly from the cornea of the eye.
  • the operational data, generated by said eye tracking analyzer for adjusting the incident light direction comprise changing location of a pivot point of the light directing arrangement.
  • the eye tracking system further comprises a light detector, located at an output of said light directing arrangement, and configured and operable for detecting said reflected light beam and generating a detection output indicative thereof, said light data comprising said detection output.
  • the eye tracking system further comprises an imager configured and operable to provide said sensing data of the eye of the individual.
  • the imager may be configured and operable to provide stereoscopic images of the eye, thereby enabling said sensing data analyzer to determine said geometrical parameter by triangulation.
  • the sensing data of the eye is provided to the control and processing utility at a predetermined first pace being slower than a predetermined second pace in which the light data is provided to the control and processing utility.
  • the analyzing of the sensing data by the sensing data analyzer to determine the geometrical parameter of the eye is done off-line after finishing an eye tracking session.
  • the geometrical parameter of the eye is indicative of a symmetry condition of the eye.
  • the geometrical parameter of the eye is a parameter of limbus region of the eye.
  • the parameter of limbus region of the eye may be at least one of the following: a three-dimensional location of center of the limbus region, of spatial points following a diameter of the limbus region, and of spatial points following a perimeter of the limbus region.
  • the predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the limbus region.
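This relationship reduces to a collinearity test: the lateral distance from the limbus center to the line through the corneal intersection point along the incident direction should vanish. A minimal sketch (names and tolerance are illustrative):

```python
import numpy as np

def on_tracking_axis(hit_point, incident_dir, limbus_center, tol=1e-6):
    # Lateral distance from the limbus center to the line through the
    # corneal hit point along the incident direction; (near) zero exactly
    # when the beam's extension through the eye passes the limbus center.
    d = np.asarray(incident_dir, float)
    d = d / np.linalg.norm(d)
    v = np.asarray(limbus_center, float) - np.asarray(hit_point, float)
    lateral = v - np.dot(v, d) * d
    return np.linalg.norm(lateral) < tol
```

The same test applies unchanged to the pupil-center variant described below, with the pupil center substituted for the limbus center.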
  • the geometrical parameter of the eye is a parameter of pupil of the eye.
  • the parameter of pupil of the eye may be at least one of the following: a three-dimensional location of center of the pupil, of spatial points following a diameter of the pupil, and of spatial points following a perimeter of the pupil.
  • the predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the pupil.
  • the geometrical parameter of the eye is center of fovea of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the fovea.
  • the geometrical parameter of the eye is center of optic disk of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the optic disk.
  • a method for tracking an eye of an individual comprising: i) performing online tracking comprising: a) defining an instantaneous pivot point in space as an intersection point for a plurality of incident light beams illuminating an eye of an individual over an area of cornea of the eye, b) illuminating the eye with an incident light beam, of the plurality of incident light beams, propagating along an incident direction; detecting a reflected light beam propagating from the cornea of the eye along a reflected direction; determining an angular offset between the incident and the reflected directions; and repeatedly adjusting the incident direction of the incident light beam such that said angular offset tends to zero, and c) providing, at least partially simultaneously, sensing data indicative of an instantaneous geometrical parameter of the eye (e.g. the limbus center); ii) performing subsequent off-line analysis comprising: a) providing data indicative of the location of the center of the cornea of the eye, and b) determining instantaneous geometrical relationships between said instantaneous pivot point, the instantaneous geometrical parameter and the data indicative of the location of the center of the cornea of the eye, to determine the instantaneous gaze direction of the individual.
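For the off-line step, once the corneal center of curvature and the instantaneous limbus center are known in a common reference frame, the tracked ocular axis (an approximation of the optical axis, per the definitions above) is simply the unit vector through both points. A minimal sketch, with placeholder coordinates:

```python
import numpy as np

def gaze_direction(cornea_center, limbus_center):
    # Off-line sketch: the axis through the corneal center of curvature and
    # the limbus center approximates the eye's optical axis, whose direction
    # serves as the instantaneous gaze estimate.
    v = np.asarray(limbus_center, float) - np.asarray(cornea_center, float)
    return v / np.linalg.norm(v)
```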
  • Fig. 1 illustrates, by way of a flow diagram, a non-limiting example of a method for tracking an eye of an individual, in accordance with some exemplary embodiments of the present invention
  • Fig. 2 illustrates a non-limiting example of a method for adjusting the direction of the incident light, when the limbus center is the geometrical parameter;
  • Fig. 3 illustrates, by way of a flow diagram, a non-limiting example of a method for tracking an eye of an individual, in accordance with some exemplary embodiments of the present invention; the method includes online and offline tracking stages;
  • Fig. 4 illustrates a non-limiting example for offline geometrical analysis for tracking the eye and determining the instantaneous gaze direction
  • Fig. 5 illustrates, by way of a block diagram, a non-limiting example of a system for tracking an eye of an individual, in accordance with some exemplary embodiments of the present invention
  • Fig. 6 illustrates, by way of a block diagram, non-limiting examples of utilities/systems that may be included in the eye tracking system of the present invention.
  • Fig. 7 illustrates a non-limiting example of a method for receiving and analyzing sensing data of the eye in order to determine a geometrical parameter of the eye.
  • a method for tracking an eye of an individual includes at least the following: i) illuminating an eye of an individual, over an area of cornea of the eye, with an incident light beam propagating towards the cornea of the eye along an incident direction, detecting a reflected light beam propagating from the cornea of the eye along a reflected direction and determining angular offset between the incident and the reflected directions; ii) analyzing sensing data of the eye to determine a geometrical parameter of the eye; iii) repeatedly adjusting the incident direction of the incident light beam such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with the incident direction of the incident light beam; and iv) repeating steps (i) to (iii) under changes in gaze direction of the eye of the individual.
  • Reference is made to Fig. 1, illustrating by way of a flow diagram a non-limiting example of a method 100 for tracking an eye of an individual, in accordance with the above-described method and according to some exemplary embodiments of the present invention. It is assumed that each of the individual's two eyes is tracked individually, because the light reaching each eye, when a person looks at an external object, passes through a different path in space. Therefore, the method 100 is typically applied to each eye of the individual, such that two specific tracking paths corresponding to both eyes of the individual are determined.
  • In step 102, the individual's eye is illuminated, over an area of the cornea of the eye, with an incident light beam propagating along an incident direction towards the cornea of the eye, and a reflected light beam propagating along a reflected direction, being the reflection of the incident light beam from the cornea, is detected in step 104.
  • the light direction/path e.g. the incident direction
  • the incident light beam is configured to be reflected from the cornea of the eye, and accordingly it is sometimes referred to herein as a “corneal beam” or “corneal tracker”.
  • because the incident light beam tracks the eye, as will be explained further below, the direction/path of the incident light beam coincides with the eye tracking axis.
  • the optical properties of the incident light beam are chosen to enable at least a detectable portion of the incident light beam to be reflected backwardly from the cornea.
  • the incident light beam is configured to track the incidence angle of the incident light beam with respect to the eye’s optical axis (considered as having zero angle) and to maintain a condition of perpendicularity of the eye tracking axis with respect to the corneal surface.
  • the incident light beam is configured to be normal to the corneal surface for all directions of the individual’s gaze, at all times. Accordingly, the incident light beam is configured to meet the corneal surface at a right (90°) angle with respect to the tangent to the corneal surface at each meeting/intersection point (the point on the corneal surface where the incident light beam falls).
  • the incident light beam used in the technique(s) of the invention can have one or more wavelengths, preferably in an invisible wavelength range, such that illuminating the individual’s eye with the light beam does not disturb, dazzle or distract the individual.
  • This is particularly important in virtual and/or augmented reality applications, in which virtual images (objects and scenes) are continuously projected towards the individual’s eyes while continuous track of individual's eye(s) is required in order to control the specifications of the projected image.
  • the individual is exposed to both real scenery and virtual scenery superposed thereon, in which case it is essential to avoid interfering with the individual’s vision during the activity.
  • the incident light beam can be in any harmless and non-disturbing light range that fulfils the above two conditions.
  • the incident light beam can be in the Infrared (IR) range.
  • the incident light beam can have a wavelength in the range between 800 nm and 1500 nm.
  • In step 104, the reflected portion of the incident light beam, i.e. the reflected light beam, is detected.
  • the perpendicularity condition of the incident light beam is verified by the detection of the reflection (at least a portion thereof) of the incident light beam, i.e. the reflected light beam, from the cornea. Detection of the reflected light beam along the reflected direction enables calculating an angular offset between the incident and reflected light directions.
  • when the incident light beam impinges on the corneal surface at a right angle, it is reflected therefrom also at a right angle, i.e. along the same direction/path but in the opposite direction.
  • the perpendicularity condition can be verified if and when the reflected light beam (being the reflected portion of the incident light beam) propagates backwardly, away from the eye, along the direction/path that was traversed by the incident light beam, i.e. along the incident direction. If it is detected that the reflected light beam is not propagating backwardly along the incident direction, i.e. it is propagating along a different path with an angular offset between the incident and reflected directions, then, in step 116, the angular direction of the incident direction, along which the incident light beam propagates, should be modified and adjusted. If it is detected that the reflected light beam is propagating backwardly along the incident direction, then a second condition, as will be described further below (step 112), should be verified, and only if both conditions are met, independently, is the eye deemed to be tracked (step 114). It is noted that, generally, the angular offset between the incident and reflected directions will not be zero in the first place, and the incident direction is dynamically/repeatedly adjusted (step 118) to minimize the angular offset down to zero, or at least such that the angular offset tends to zero. This repeated adjustment of the incident direction to fulfill the perpendicularity condition is performed continuously, specifically under changes in the gaze direction of the eye.
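The repeated adjustment of step 118 can be illustrated with a two-dimensional toy model: a beam steered from a fixed pivot is reflected off a circular "cornea", and the measured angular offset is driven to zero by inverting the spherical-mirror geometry (offset = 2*alpha, sin(alpha) = h/r, h = distance from pivot to center times the sine of the aiming error). Everything below is an illustrative assumption, not the disclosed control law:

```python
import numpy as np

def angular_offset(pivot, theta, center, radius):
    # Signed angle between the retro-direction of the incident beam and its
    # specular reflection off a circular "cornea" (2D toy). It is zero iff
    # the beam hits the surface perpendicularly, i.e. is aimed at the center.
    pivot = np.asarray(pivot, float); center = np.asarray(center, float)
    d = np.array([np.cos(theta), np.sin(theta)])
    oc = pivot - center
    b = np.dot(oc, d)
    t = -b - np.sqrt(b * b - (np.dot(oc, oc) - radius ** 2))  # nearest hit
    hit = pivot + t * d
    n = (hit - center) / radius                 # outward surface normal
    r = d - 2.0 * np.dot(d, n) * n              # specular reflection
    retro = -d
    return np.arctan2(retro[0] * r[1] - retro[1] * r[0], np.dot(retro, r))

def servo(pivot, center, radius, theta, n_iter=10):
    # Repeatedly adjust the incident direction so the offset tends to zero,
    # inverting offset = 2*alpha, sin(alpha) = h/radius, h = dist*sin(error).
    dist = np.linalg.norm(np.asarray(center, float) - np.asarray(pivot, float))
    for _ in range(n_iter):
        off = angular_offset(pivot, theta, center, radius)
        theta += np.arcsin(radius * np.sin(off / 2.0) / dist)
    return theta
```

In practice the mapping from measured offset to correction would come from the preliminary calibration described below rather than from this idealized geometry.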
  • the method of tracking the eye can include preliminary steps for defining an individual-specific parametric space of all light directions/paths being perpendicular to the cornea surface over a predetermined area thereof extending at least over the pupil and within a predefined range of angular and lateral positions of the eye, by utilizing the properties of the incident and reflected light beams only. While method 100 should be executed continuously because eye movements (angular as well as spatial/lateral) cause the incident beam not to be perpendicular to the cornea, prior knowledge of the parametric space of all light directions/paths being perpendicular to the cornea surface can greatly reduce the latency of the tracking and increase its accuracy.
  • a preliminary method for finding all light directions/paths perpendicular to the cornea surface over an area thereof and with a predefined range of angles and lateral positions with respect to a fixed axis in space can be applied by scanning the cornea with a plurality of incident light beams.
  • the defined perpendicular-to-cornea light directions/paths define respective intersection points where the light beams hit the cornea surface.
  • the distance between every two light directions/paths or intersection points can be as small as possible to obtain highly accurate tracking of the eye.
  • the preliminary method finds: a) the required settings of the light deflectors to effect particular light beam angles with respect to a nominal optical axis of the system used, and b) the translation from the recorded shift of the light beam from the perfect backward direction (i.e. non-perpendicular incidence) to the angular and lateral corrections needed to bring the light beam back to perpendicular incidence.
  • In step 106, performed in parallel (concurrently) with steps 102 and 104, either at the same pace or at a different pace, sensing data of the eye is provided.
  • the sensing data can be of the external or internal sides of the eye, depending on the sensing technique. Non-limiting examples are described herein below.
  • Providing the sensing data may be, in some non-limiting embodiments, by capturing images of the eye.
  • the sensing data is analyzed and a geometrical parameter of the eye is determined from the analyzed sensing data.
  • a geometrical parameter of the eye is determined from the analyzed sensing data.
  • the three-dimensional location of the geometrical parameter with respect to a reference three-dimensional coordinate system, e.g. xyz coordinates in a cartesian coordinate system, is determined.
  • the geometrical relationship between the determined geometrical parameter of the eye and the incident direction of the incident light beam is examined. If the relationship satisfies a predetermined geometrical condition, derived from the kind of the geometrical parameter, the eye is tracked; if not, then even if the incident direction is perpendicular to the cornea, the spatial location of the incident light beam (the incident direction/path) is adjusted, i.e. the intersection point of the incident light beam with the cornea surface is adjusted, while ensuring that the perpendicularity condition described above holds, by repeating step 116. Only when both conditions are met, i.e. the perpendicularity condition and the geometrical condition, is the eye tracked. It is appreciated that whenever the spatial location of the intersection point is changed, given the spherical assumption of the corneal surface, the perpendicularity condition is breached and steps 110 and 116 are repeated to ensure perpendicularity of the incident light beam with respect to the corneal surface.
  • the sensing data may refer to the limbus of the eye and the analysis of the sensing data results in determining the geometrical parameter being the three-dimensional location, with regard to a reference coordinate system, of one or more spatial points of the limbus.
  • the geometrical parameter is the three-dimensional location of the center of the limbus. In this case, tracking the eye is achieved if the center of the limbus is located on the extension through the eye of the direction/path of the perpendicular-to-cornea incident (as well as reflected) light beam.
  • the geometrical parameter is a group of spatial points following/tracking a diameter of the limbus.
  • tracking the eye is achieved if the diameter of the limbus (i.e. the three-dimensional locations of the spatial points tracking the diameter of the limbus) is perpendicular to the extension of the incident direction through the eye, and the extension of the incident direction through the eye passes through the middle of the diameter of the limbus.
  • the geometrical parameter is a group of spatial points following/tracking the perimeter of the limbus. In this case, tracking the eye is achieved if any point on the incident direction is equidistant to two opposite spatial points on the perimeter of the limbus.
  • the geometrical condition can be that any point on the incident direction is equidistant to any two points on the perimeter of the limbus.
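The equidistance condition above lends itself to a direct numerical check. A minimal sketch follows (not taken from the source; the function name `on_limbus_axis` and the tolerance are illustrative): it tests whether two distinct points along the incident direction are each equidistant from all sampled limbus perimeter points, which holds exactly when the extension of the incident direction is the symmetry axis of the limbus circle.

```python
import numpy as np

def on_limbus_axis(axis_point, axis_dir, limbus_points, tol=1e-3):
    """Return True when the incident direction is the symmetry axis of the
    limbus circle: two distinct points on the line must each be equidistant
    from all sampled perimeter points."""
    d = np.asarray(axis_dir, dtype=float)
    d = d / np.linalg.norm(d)
    p0 = np.asarray(axis_point, dtype=float)
    pts = np.asarray(limbus_points, dtype=float)
    for p in (p0, p0 + d):
        dists = np.linalg.norm(pts - p, axis=1)
        if np.ptp(dists) > tol:   # spread of distances -> not equidistant
            return False
    return True
```

Checking two points on the line (rather than one) rules out the case where a single off-axis point happens to be equidistant from a subset of perimeter samples.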
  • the sensing/image data may refer, in another non-limiting embodiment, to the pupil of the eye and the analysis of the sensing data results in determining the geometrical parameter being the three-dimensional location, with regard to a reference coordinate system, of one or more spatial points of the pupil.
  • the geometrical parameter is the three-dimensional location of the center of the pupil.
  • tracking the eye is achieved if the center of the pupil is located on the extension through the eye of the direction of the perpendicular-to-cornea incident (as well as reflected) light beam.
  • the three-dimensional location of a group of spatial points following/tracking the diameter or the perimeter of the pupil can be determined and similar analysis to that of the limbus, as described above can be applied.
  • the eye is tracked by fulfilling a geometrical condition of a symmetry assumption of the eye's structure. Specifically, the eye is tracked by tracking the optical axis of the eye.
  • the visual axis can also be tracked, because according to experiments the visual axis can be tracked by deflecting the incident light beam by a predetermined angle in a predetermined direction. For example, according to the inventors, the visual axis is shifted by 4° from the optical axis. By tracking the optical axis, the incident light beam can be shifted by the aforementioned amount to track the visual axis.
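Applying the fixed angular offset amounts to rotating the tracked optical-axis direction by the predetermined angle. A sketch using Rodrigues' rotation formula (the meridian of the 4° offset, i.e. the rotation axis chosen below, is an assumption for illustration):

```python
import numpy as np

def rotate_about_axis(v, axis, angle_rad):
    """Rodrigues' rotation of vector v about a unit axis by angle_rad."""
    k = np.asarray(axis, dtype=float)
    k = k / np.linalg.norm(k)
    v = np.asarray(v, dtype=float)
    return (v * np.cos(angle_rad)
            + np.cross(k, v) * np.sin(angle_rad)
            + k * np.dot(k, v) * (1.0 - np.cos(angle_rad)))

# tilt a tracked optical axis by 4 deg about a vertical axis (the direction
# of the offset is an illustrative assumption)
optical_axis = np.array([0.0, 0.0, 1.0])
visual_axis = rotate_about_axis(optical_axis, [0.0, 1.0, 0.0], np.deg2rad(4.0))
```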
  • the sensing data may refer to the fovea of the eye and the analysis of the sensing data results in determining the geometrical parameter being the three-dimensional location, with regard to a reference coordinate system, of one or more spatial points of the fovea, specifically the three-dimensional location of the fovea center. It is known that light reflected from the fovea changes polarization, because of Henle fibers. This polarization-changing feature makes it possible to detect when light is reflected from the center of the fovea and to distinguish between light reflected from the fovea and light reflected from the cornea.
  • the sensing data, being a portion of the light beam that enters the eye and is reflected with a specific change in polarization, is indicative of the fovea center (as the geometrical parameter).
  • the light beam that will pass through the cornea center (such that the portion that is reflected from the cornea surface travels along the incident direction) and hits the fovea center can also be used to track the eye.
  • This example of the geometrical parameter enables tracking an axis that is close to the visual axis of the eye.
  • the geometrical parameter of the eye is the center of the so-called optic disk, which is the point on the retina where the optic nerve is connected. As there are no rods or cones overlying the optic disk, it corresponds to a small blind spot in the eye.
  • The optic disk is highly reflective, and since the eye behaves as a retro-reflector, this allows determining when the portion of the light beam entering the eye hits the optic disk because of the detected growth in intensity of the retro-reflected portion of the light beam. Accordingly, another axis that can be used to track the eye is the axis defined by the cornea center and the geometrical parameter as the center of the optic disk.
  • Fig. 2 illustrates a non-limiting example of a method 200 for adjusting the direction of the incident light.
  • the non-limiting exemplified method relates to the limbus center as the geometrical parameter.
  • the method can be accommodated with relevant relations/parameters for any other geometrical parameter used.
  • if the incident beam IB is perpendicular to the cornea surface, then the extension of the incident beam through the eye passes through the cornea center.
  • the extension of the beam needs to pass through the limbus center as well.
  • the incident light direction, supposing it fulfills the first condition of being perpendicular to the cornea, can be calculated as follows: And, the angle between the limbus center and the original pivot point can be calculated as follows:
  • the new required pivot point location that reduces the difference between θ_L and θ_B can be calculated as follows: Where z_P is set to a constant value equal to the nominal limbus z coordinate minus the distance between the limbus center and the new required pivot point, and g is a gain of the iterative process.
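The iterative pivot-point correction above can be sketched as a simple gained update (a schematic reconstruction under stated assumptions, since the referenced formulas are not reproduced here; the function and variable names are illustrative): θ_L is the angle of the line from the pivot to the limbus center, θ_B the angle of the perpendicular-to-cornea beam, and the lateral pivot coordinate is nudged by gain g while z_P is held constant.

```python
import math

def pivot_step(p_x, z_p, limbus_x, limbus_z, theta_b, g=0.5):
    """One gained iteration: theta_L is the angle of the line from the pivot
    to the limbus center; the lateral pivot coordinate is nudged so that
    theta_L approaches theta_B, while z_P is held constant."""
    dz = limbus_z - z_p
    theta_l = math.atan2(limbus_x - p_x, dz)
    return p_x + g * (theta_l - theta_b) * dz  # proportional update, gain g
```

For small angles the error θ_L − θ_B shrinks by a factor of roughly 1 − g per iteration, so any gain in 0 < g < 2 converges.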
  • Tracking of the eye can be beneficial in a variety of applications, some of which are of an offline tracking nature. For example, analysis of eye movement and gaze direction of an individual during a test, such as a medical test, does not have to be online and can be performed retrospectively after performing the test.
  • the method includes an online stage 302 and an offline stage 304.
  • the online stage includes steps for tracking of the eye by the corneal tracker, i.e. by ensuring that the perpendicularity-to-cornea condition is achieved online, while concurrently acquiring sensing data indicative of a geometrical parameter of the eye.
  • the offline stage includes analysis of the sensing data in addition to other provided geometrical data of the eye in order to determine the gaze direction retrospectively.
  • an instantaneous pivot point in space is defined as an intersection point for a plurality of incident light beams illuminating an eye of an individual over an area of cornea of the eye.
  • the pivot point will typically lie inside the borders of the eye's sphere.
  • in steps 312 and 314, while maintaining the pivot point, the eye is illuminated with an incident light beam propagating along an incident direction, and a reflected light beam propagating from the cornea of the eye along a reflected direction is detected.
  • in step 316, an angular offset between the incident and the reflected directions is determined.
  • in step 318, the incident direction is adjusted as needed, while maintaining the pivot point in place, to minimize the angular offset, specifically until the angular offset equals zero.
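Steps 312-318 amount to a feedback loop that drives the measured angular offset to zero while the pivot point stays fixed. A generic sketch (the controller structure and gain are assumptions; `measure_offset` and `adjust` stand in for the detector readout and the light-deflector command):

```python
def track_normal(measure_offset, adjust, gain=0.8, tol=1e-6, max_iter=100):
    """Repeatedly measure the incident/reflected angular offset and correct
    the incident direction until the offset is (numerically) zero."""
    for _ in range(max_iter):
        off = measure_offset()
        if abs(off) < tol:
            return True
        adjust(-gain * off)   # steer against the measured offset
    return False

# toy 1-D model: missing the surface normal by e makes the specular
# reflection deviate from retro-reflection by 2e
state = {"angle": 0.3}
normal = 0.05
converged = track_normal(lambda: 2 * (state["angle"] - normal),
                         lambda d: state.update(angle=state["angle"] + d))
```

In the toy model the angular error is multiplied by |1 − 2·gain| each iteration, so the loop settles onto the surface normal within a few dozen steps.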
  • steps 312 to 318 are similar to steps 102, 104, 110 and 116 of method 100 above.
  • in step 320, sensing data indicative of an instantaneous geometrical parameter of the eye is provided at least partially simultaneously with steps 312-318.
  • the monitored geometrical parameter can be any geometrical parameter of the eye that defines a geometrical relationship with the incident direction of the incident light beam. Such geometrical parameters are described above with respect to method 100.
  • the data of the incident directions gathered over time and the simultaneous sensing data are saved for subsequent offline analysis as follows. Accordingly, a plurality of incident directions and respective simultaneous data of a geometrical parameter of the eye are saved for determination of a respective plurality of gaze directions.
  • in step 322, geometrical data indicative of the location of the center of the cornea of the eye is provided.
  • This data can be, for example, a distance from the geometrical parameter.
  • the geometrical parameter is the center of the limbus
  • the distance between the cornea center and the limbus center can be determined or provided.
  • the distance between centers of the limbus and cornea is empirically known.
  • the distance between centers of the limbus and cornea can be found by determining three-dimensional location of cornea center in addition to determining the three-dimensional location of the limbus center.
  • the location of cornea center can be determined, for example, by shifting the pivot point quickly (such that the eye is relatively stationary) and keeping the reflected beam normal to the cornea.
  • in step 324, instantaneous geometrical relationships between the instantaneous location of the pivot point, the instantaneous geometrical parameter and the instantaneous location of the center of the cornea are determined, and in step 326 the eye is tracked by determining the respective instantaneous gaze directions of the individual.
  • Fig. 4 illustrates a non-limiting example of offline geometrical analysis for tracking the eye and determining the instantaneous gaze direction, in accordance with method 300.
  • a cornea of the eye is shown while being illuminated with incident light beams.
  • a pivot point P is defined arbitrarily in the three-dimensional space (step 310), however typically inside the eye. All of the illuminating incident light beams or their extensions through the eye pass through the pivot point P. For example, three incident directions ID1, ID2 and ID3 of the incident light beam are illustrated in the figure.
  • the incident direction is continuously adjusted until it is perpendicular to the cornea surface (i.e., when the measured angular offset between the incident and reflected light beams equals zero). This is the illustrated incident direction ID1.
  • the spatial propagation of the incident direction ID1 is known. As described above, the incident direction ID1 passes through the cornea center C.
  • sensing data indicative of the limbus center, as a geometrical parameter of the eye is obtained (step 320), thereby enabling determination of the three-dimensional spatial location of the limbus center L.
  • the orientation and length of the side LP in the triangle LPC can be determined, and as the orientation of the incident direction ID1 is known (being defined by the illuminating system), the angle LPC can be calculated.
  • the length of the side LC in the triangle LPC is determined (step 322).
  • the incident direction ID1' represents the optical axis of the eye since it passes through the limbus center L and the cornea center C. Determining the angle PCP' enables adjusting the location of the pivot point from P to P', thus enabling determination of the gaze direction (step 326).
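The Fig. 4 reconstruction reduces to locating the point C on the known ray (through the pivot P along the perpendicular incident direction) that lies at the known distance |LC| from the limbus center L; the gaze (optical-axis) direction then runs from L through C. A sketch under that reading (the function name and the choice of the far quadratic root, placing C on the far side of the limbus, are assumptions):

```python
import numpy as np

def gaze_from_offline_data(P, d, L, r_LC):
    """Solve |P + t*d - L| = r_LC for t to locate the cornea center C on the
    perpendicular incident ray through pivot P (unit direction d), then
    return C and the gaze direction from limbus center L through C."""
    d = np.asarray(d, dtype=float)
    d = d / np.linalg.norm(d)
    P = np.asarray(P, dtype=float)
    L = np.asarray(L, dtype=float)
    w = P - L
    b = np.dot(w, d)
    c = np.dot(w, w) - r_LC ** 2
    disc = b * b - c
    if disc < 0:
        raise ValueError("ray never reaches distance r_LC from the limbus center")
    t = -b + np.sqrt(disc)        # far root: C assumed beyond L along the ray
    C = P + t * d
    g = C - L
    return C, g / np.linalg.norm(g)
```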
  • Fig. 5 illustrates by way of a block diagram a non-limiting example of a system 10 for tracking an eye of an individual, according to some exemplary embodiments of the technique of the present invention.
  • the system 10 can be used in execution of methods 100-300.
  • the system 10 includes a control and processing unit 20 configured and operable for controlling and processing of light and light data respectively in order to track the individual's eye.
  • the control and processing unit 20 includes at least the following utilities: a light path analyzer 22, a sensing data analyzer 24, an eye tracking analyzer 26 and a memory 28.
  • the control and processing unit 20 includes input and output utilities for receiving and for sending/presenting data.
  • the light path analyzer 22 is configured and operable to receive light data indicative of an incident light beam illuminating an area of the cornea of an eye of an individual and light data indicative of a corresponding reflected light beam (a reflection of at least a portion of the incident light beam) propagating backwardly from the cornea of the eye.
  • the incident and reflected light data are analyzed by the light path analyzer 22 to generate light path data including, inter alia, the angular offset between the incident and reflected light directions of the incident and reflected light beams respectively.
  • the incident and reflected direction should be perpendicular to the cornea surface at the intersection point (assuming the cornea surface at the relevant illuminated area is spherical), in case the optical axis of the eye is tracked.
  • the sensing data analyzer 24 is configured and operable to receive sensing data of the eye and analyze the sensing data to determine a geometrical parameter of the eye. Examples of the geometrical parameter have been described above, such as the limbus center or the pupil center.
  • the sensing data analyzer determines the three-dimensional spatial location of the geometrical parameter in a way that enables correlating between the spatial location of the geometrical parameter and the spatial location of the incident light beam (i.e. the incident direction).
  • the eye tracking analyzer 26 is configured and operable to generate operational data for adjusting the incident light direction such that the angular offset is minimized and such that the geometrical parameter of the eye has a predetermined geometrical relationship with the incident light direction.
  • the incident and reflected light beams should be perpendicular to the cornea surface.
  • the angular offset should be minimized down to zero.
  • the adjustment of the incident light direction is carried out repeatedly and continuously in order to bring the angular offset down to zero.
  • the eye tracking analyzer is further configured and operable to execute the required calculations for determining the geometrical relationships between pivot point, geometrical parameter and cornea’s center.
  • the memory 28 is a non-transitory memory configured and operable to save data required for the system operation. For example, the data about the pivot point, incident directions, location of cornea's center, sensing data indicative of the geometrical parameter can all be saved in the memory for offline access and analysis.
  • Fig. 6 illustrates by way of a block diagram non-limiting examples of utilities/systems that can be included in the system 10, according to some exemplary embodiments of the present invention. It is noted that the system 10 may include one or more of the following utilities/systems. Also, it is noted that the following utilities/systems are operable to perform respective functions listed above with respect to method 100, even if not directly mentioned herein below.
  • the system 10 can include utilities such as a light source 30, a light directing arrangement 40, a light detector 50, the control and processing unit 20, and an eye sensing system 60, configured and operable together to determine and track the eye EYE.
  • the light source 30 is configured and operable to generate an incident light beam IB to be projected towards the eye EYE.
  • the light source is configured to generate the incident light beam IB being configured to be reflected from the cornea of the eye, e.g. being in one or more frequency ranges that are reflected or mostly reflected from the cornea.
  • the light source 30 can include one or more light source units configured to generate the incident light beam IB.
  • one or more filters can be used at the output of the one or more light source units to provide different ranges of light wavelengths forming the incident light beam IB.
  • the light source 30 is configured and operable to generate the incident light beam IB having wavelengths in the non-visible range, to minimize disturbance to the individual, as described above.
  • the incident light beam IB can be in the infrared range, e.g. in the range between 800 nm and 1500 nm. Other specifications of the incident light beam IB are described above with reference to method 100, and are equally valid for the system 10.
  • the light directing arrangement 40 is configured and operable to receive the incident light beam IB from the light source 30 and direct it towards the eye EYE along an incident direction defined by central axis of the incident light beam IB, to thereby illuminate the eye over an area of the cornea extending over the pupil of the eye EYE. Additionally, the light directing arrangement 40 is configured and operable to collect a respective reflected light beam RB propagating backwardly from the cornea of the eye EYE.
  • the light directing arrangement 40 can include optical elements responsible for adjusting the light path of the incident and/or reflected light beams, such as optical reflectors, optical deflectors, mirrors, dichroic mirrors, beam splitters, lenses, and other similar elements configured and operable to direct the incident light beam as well as the reflected light beam in accordance with the conditions mentioned above.
  • the light directing arrangement 40 through its various light path adjustment elements is configured and operable to compensate for any angular or lateral displacements of the eye during the tracking of the eye, such that it guarantees that the conditions required, such as the perpendicularity to cornea condition, are fulfilled.
  • the light directing arrangement 40 is responsible for directing the incident light beam IB along the incident direction such that IB is perpendicular to the cornea at the intersection point IP.
  • the light detector 50 is located at the output of the light directing arrangement 40, and is configured and operable for detecting the reflected light beam RB and for generating a detection output indicative thereof.
  • the light detector 50 can include one or more light sensors for detecting the reflected light beam RB.
  • the one or more light sensors can be based on quad sensor(s).
  • the light sensor(s) is/are configured and operable to generate an electrical output signal(s) (the detection output) in response to the light input signal(s).
  • the light sensor(s) is/are configured and operable to generate detection output(s) indicative of the detected light intensity.
  • the light sensor(s) is/are configured and operable to generate detection output(s) indicative of spatial location(s)/propagation path(s) of the detected light beam(s).
  • the control and processing utility 20 is configured and operable to receive the detection output from the light detector 50 and determine the angular offset between the incident and reflected light directions. Further, the control and processing utility 20 is configured and operable to determine the adjustment of the light beams' itinerary in order to track the eye, inter alia by activating a control function and/or control loop that receives the detection output and generates an output for correcting the light beams' itinerary so that the light beams track the eye.
  • the eye sensing system 60 is configured and operable to acquire sensing data SD of the eye, where the sensing data SD is indicative of a geometrical parameter of the eye.
  • the eye sensing system 60 is activated at least partially simultaneously with the light source 30, such that the sensing data overlaps in time with the incident and reflected light beams.
  • the operation pace of the light source 30 is greater than the operation pace of the eye sensing system 60.
  • the eye sensing system 60 is an imager configured and operable to capture images of the eye, specifically to capture images of the area of cornea covering the limbus, iris and pupil.
  • the imager is configured with stereo imaging sensors configured and operable to provide stereo images of the area of the cornea that contains the limbus and/or pupil, enabling to reconstruct the limbus/pupil or a characteristic thereof by using suitable techniques such as triangulation.
  • the reconstruction can be done at the sensing system 60 independently (in which case the sensing system includes a suitable analyzer) or at the control and processing unit 20 as was described above with reference to the sensing data analyzer 24.
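The stereo reconstruction of a limbus/pupil feature can be sketched with the standard closest-approach (midpoint) triangulation of two camera rays; the rays in world coordinates are assumed to be available from calibration, and the function name is illustrative:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Locate a feature in 3-D from two stereo rays o1 + s*d1 and o2 + t*d2:
    return the midpoint of their closest-approach segment."""
    d1 = np.asarray(d1, dtype=float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, dtype=float) / np.linalg.norm(d2)
    o1 = np.asarray(o1, dtype=float)
    o2 = np.asarray(o2, dtype=float)
    w = o1 - o2
    a, b, c = np.dot(d1, d1), np.dot(d1, d2), np.dot(d2, d2)
    d, e = np.dot(d1, w), np.dot(d2, w)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel")
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))
```

Repeating this over sampled limbus perimeter points yields the 3-D point set from which the limbus center, radius, and plane can be fitted.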
  • Non-limiting examples of the acquisition of a geometrical parameter of the eye by the eye sensing system are described herein further below.
  • the control and processing utility 20 may be configured and operable to control the operation of the light source 30 and/or the light directing arrangement 40 and/or the light detector 50 and/or the eye sensing system 60, and the different elements thereof, in order to determine the tracking of the individual’s eye. Accordingly, the control and processing utility 20 may include one or more controllers configured and operable to control the different parts of the system 10. In some exemplary embodiments, each of the light source 30, the light directing arrangement 40, the light detector 50 and the eye sensing system 60 (the system parts) has its own controller(s) configured to control the operation of one or more elements of the respective system part.
  • one or more central controllers are configured to control operation of some or all of the system parts including the light source 30, the light directing arrangement 40, the light detector 50 and the eye sensing system 60.
  • the controllers can be located in one location in the system 10 and connected to the corresponding controlled system part, or the controllers can be distributed in the system such that each system part has its own controller(s) located therewith. Even if no controller is specifically described or shown in the figures, this should not limit the broad aspect of the invention, and it is to be understood that each action performed by the control and processing utility 20 to control the operation of any system part is typically performed by one or more controllers corresponding to the respective system part.
  • the controller(s) can be software or hardware based or a combination thereof.
  • the control and processing utility 20, specifically the eye tracking analyzer, generates the operational data to adjust the incident light direction by changing the spatial location of a pivot point of the light directing arrangement 40, such that the angular offset is minimized and the geometrical relationship between the incident light direction and the geometrical parameter is achieved.
  • Fig. 7 illustrates, in accordance with exemplary embodiments of the invention, a non-limiting example of a method 400 for receiving and analyzing sensing data of the eye in order to determine a geometrical parameter (GP) of the eye.
  • the method 400 can be executed by the eye tracking system 10.
  • the geometrical parameter that is determined includes the limbus or pupil of the eye.
  • the method 400 for determining a geometrical parameter of the eye includes the following: receiving or capturing image data indicative of at least two images from different angles of a user's eye at step 402; identifying regions related to the geometrical parameter in each image at step 404; determining geometrical representation of the GP structure at step 406; and performing a triangulation of the geometrical representation of the GP structure of at least two images at step 410, to thereby determine a three-dimensional spatial location of the geometrical parameter at step 412.
  • step 408 includes performing a triangulation of the geometrical representation of the GP structure in at least two images to determine three-dimensional parameters of the GP.
  • the geometrical parameter is the limbus
  • the three-dimensional parameter may be a radius, diameter, center and/or torsional rotation of the limbus.
  • the method may include determining the geometrical representation of the GP structure by processing the three-dimensional GP parameters to generate further, more precise data indicative of the GP region.
  • determining the geometrical representation of the GP structure may include digital image pre-processing (step 412) such as performing a GP recognition process on each image or performing mathematical transformations of the image including using an intensity gradient map and then running a GP area recognition process on the transformed image.
  • the step of digital image pre-processing may include calculating an image intensity gradient map of the GP region, identifying at least one region of the GP structure in which the local direction of the gradient is substantially uniform, and processing data indicative of the GP structure by weighting the pixels of such regions and determining geometrical representation of the GP structure based on the matching pixels related to the GP. More specifically, digital image pre-processing may include determining the region of the GP structure by identifying the local uniformity, i.e. the direction of the gradient of each point is collinear only with its neighbors. An entropy map may also be used as an intensity gradient map (instead of or in addition to a gradient map), or the processing may be performed directly on the image.
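The gradient-map pre-processing above can be sketched as follows; the neighbourhood size and the |cos| collinearity measure below are illustrative choices, not taken from the source:

```python
import numpy as np

def gradient_uniformity(img):
    """Compute an intensity gradient map plus a per-pixel score of how
    collinear each gradient direction is with its 8 neighbours
    (high score = locally uniform gradient, a limbus/pupil edge cue)."""
    gy, gx = np.gradient(img.astype(float))     # axis 0 (rows), axis 1 (cols)
    mag = np.hypot(gx, gy) + 1e-12              # avoid division by zero
    ux, uy = gx / mag, gy / mag                 # unit gradient directions
    score = np.zeros_like(mag)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            nx = np.roll(np.roll(ux, dy, axis=0), dx, axis=1)
            ny = np.roll(np.roll(uy, dy, axis=0), dx, axis=1)
            score += np.abs(ux * nx + uy * ny)  # |cos| of angle to neighbour
    return mag, score / 8.0
```

Thresholding the uniformity score and weighting the corresponding pixels gives the "matching pixels related to the GP" from which the geometrical representation is fitted.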
  • identifying regions related to the GP in each image may include identifying image data indicative of eye features such as eyelids, sclera, iris and eyelashes and/or identifying an initial GP region based on anatomical parameters by using an iterative pixel filtration process and generating data indicative of the initial GP region.
  • an eyelids region estimation may be implemented in parallel with the GP region.
  • Different eye feature regions may be determined based on any geometrical shape models (e.g. ellipses or circles).
  • identifying an initial GP region may include segmentation based on anatomy estimations. For example, segmenting each image for identifying pixels related to the GP region, performing triangulation between the at least two images to determine three-dimensional parameters of the GP, estimating the location of the initial GP region based on the three-dimensional parameters of eye features, and generating data indicative of the initial region location of the GP in each image. Neural network-based approaches may be used for identifying image data indicative of the initial GP region. GP region estimation may also be performed with a neural network-based approach. In some embodiments, identifying image data indicative of eye features may be implemented by using machine learning. Determining (i.e. predicting) the three-dimensional parameters of the GP may include using a data recognition model based on a neural network.
  • the method may include training the network based on the segmentation results of classical approaches, or by using an existing system for training.
  • Generating a representation of the GP based on the series of images may be obtained by using a deep-learning network (DLN), such as an artificial neural network (ANN).
  • generating a representation of the GP may include calculating probabilities on the placement of the GP and/or generating a model of the GP, adapting a recognition classifier to a person.
  • the present invention advantageously enables accurate online as well as offline eye tracking by using one light beam towards the cornea, data about a geometrical parameter of the eye and a relation between the light beam and the geometrical parameter of the eye.

Abstract

An eye tracking method and system are presented. According to this technique, an eye of an individual is illuminated over an area of cornea of the eye with an incident light beam propagating towards the cornea of the eye along an incident direction, a reflected light beam propagating from the cornea of the eye along a reflected direction is detected, and an angular offset between the incident and the reflected directions is determined. A geometrical parameter of the eye is determined via analysis of sensing data. The incident direction of the incident light beam is repeatedly adjusted such that the angular offset is minimized and the geometrical parameter of the eye has a predetermined geometrical relationship with the incident direction of the incident light beam. This procedure is repeated under changes in gaze direction of the eye of the individual.

Description

EYE TRACKING SYSTEMS AND METHODS
TECHNOLOGICAL FIELD
The invention is in the field of eye tracking techniques for use in different applications, such as with image projection systems, medical diagnostics and relevant research.
BACKGROUND
Eye tracking is becoming increasingly popular in a variety of applications, and accurate eye tracking is a must in some of them. Some applications require online eye tracking with microsecond time resolution, dictated by fast eye movements, such as virtual or augmented reality applications, and some applications require offline eye tracking analysis that may have an online tracking component, such as applications in the research and medical diagnostics fields. As technology advances, the need for enhanced accuracy, precision, low latency and compact size of eye trackers will persist. In the example of virtual/augmented reality applications, image projections provided by the image projection systems should emulate real-life visual perception. Light carrying the information of the outer world enters and traverses the human eye along known paths, until reaching the retina and from there the brain, where the light signal is interpreted. Tracking the eye and gaze direction is therefore necessary to provide a convincing virtual/augmented reality experience. Specifically, tracking the eye and gaze direction is very important in direct retinal projection. Typically, in this case, a small eye box is produced, typically with a size less than the pupil diameter. During natural eye movements, the image will be lost if the eye box is not expanded by optical means, which adds significant complexity to virtual/augmented reality systems and often degrades image brightness and quality. Tracking the eye and gaze direction allows moving the small eye box in real time, thus enabling images with good brightness and contrast and eliminating the need for eye box expansion.
The first eye trackers were built at the end of the 19th century. The devices were difficult to build and caused discomfort to the participants. Specially designed rings, contact lenses, and suction cups were attached to the eyes to help in eye movement measurements. The first photography-based eye trackers, which examined light reflected from the different parts of the eye, were introduced only at the beginning of the 20th century. They were much less intrusive. For most of the 20th century, researchers built their own eye trackers, which were costly and of limited availability. The first commercially available eye trackers appeared only in the 1970s. From around the 1950s, a number of different techniques were developed and are still in use today, such as contact lenses (suction cups, more precisely) with mirrors, contact lenses with coils, electrooculography (EOG), and piezoelectric sensors. Due to their nature, these techniques can track angular eye movements but cannot measure any lateral eye shifts. Also, these systems are stationary and require users' head stabilization, thus making them unsuitable for research of eye movements in a more natural environment.
The second part of the 20th century was dominated by a less intrusive approach based on illumination and light sensors. One limitation of this approach is that cameras are relatively slow, due to the exposure time and processing power required to extract eye movement data. Position-Sensing Photodetector (PSD) and quad-detector based approaches were developed in the middle of the 20th century. Dual Purkinje Imaging eye tracker systems were developed in the 1970s and are still in use today. However, Dual Purkinje Imaging eye tracker systems are demanding because of the complex and time-consuming alignment and calibration of the system required before each experiment, a fact that requires a high level of training of the person performing the experiment. Starting from the 1980s, due to improvements in camera sensors and computer technology, the eye tracking field became dominated by so-called Video OculoGraphy (VOG) systems. VOG systems usually capture images of an eye of a user and determine certain features of the eye on the basis of the captured images. These systems are non-intrusive and usually rely on infra-red illumination of the eye that does not cause disturbance or discomfort to the user. Some of them rely on small lightweight wearable cameras, but the more precise systems are stationary. The combined pupil/cornea reflection (1st Purkinje) eye tracker illuminates an eye with a number of infra-red light emitting diodes and images the surface of the eye with one or more cameras, segmenting the pupil (as the darkest part of the eye) and the first Purkinje images of the diodes. A change of the pupil position relative to the 1st Purkinje images of the IR diodes indicates movement of the eye. User calibration must be used to calculate the real angles: usually, the user is required to focus on a target moving along a known path to calibrate the system. The precision and accuracy of this approach are relatively low, since it relies heavily on the alignment and on the user calibration.
If the user does not look at the target precisely, accurately and without latency, the system will not be calibrated correctly. Also, any movement of the system relative to the user's head will require a recalibration. Pupil dilation further decreases measurement precision, and ambient reflections from the cornea confuse image processing algorithms. Different eye colors, long eye lashes, and contact lenses are all additional factors that complicate image processing systems further. Therefore, these systems are usually noisy and provide angular precision no better than about 1°, with no information on lateral eye movements. A number of other, less common approaches exist, such as imaging the retina, the bright pupil approach, and even examining eye movement with an MRI machine. These approaches have their own limitations.
GENERAL DESCRIPTION
The present invention provides a novel approach to eye tracking for use, for example, in virtual and/or augmented reality applications, in medical applications and/or in general eye research. Precise online eye tracking in virtual and/or augmented reality applications enables projecting high-quality and life-like virtual images towards an eye of a user by preserving the location of the virtual images on the eye’s retina with high accuracy.
Over the years, several ocular axes have been defined to serve as references for defining the light paths inside the human eye, for use in multiple applications, such as optometry, ophthalmological diagnosis and surgical procedures. Some of the frequently used ocular reference axes are the optical axis, the pupillary axis, the visual axis, and the line of sight. At least some of these ocular axes can be useful and can serve as a reference for eye tracking applications.
The optical axis of the eye can be defined as an axis that passes through and contains the centers of curvature of the optical surfaces of the eye. The optical surfaces of the eye are the anterior corneal surface, the posterior corneal surface, the anterior crystalline lens surface and the posterior crystalline lens surface. As the human eye is not a centered optical system, the optical axis is a theoretical construct, and a more practical definition of the optical axis may be the “best fit” line through the centers of curvature of the “best fit” spheres of cornea and lens surfaces.
The pupillary axis can be defined as the line normal to the anterior corneal surface that passes through the center of the entrance pupil. If the eye were a centered optical system, the pupillary axis would coincide with the optical axis. However, the pupil is often not centered relative to the cornea, and the cornea typically deviates from a perfectly regular shape, so that the pupillary axis points in a slightly different direction than the optical axis.
The visual axis can be defined as the line connecting the fixation point (e.g., the spatial location of the object being contemplated) with the fovea, passing through the two nodal points of the eye. Generally, in a typical adult human eye the angle between the optical axis and the visual axis is about 5° and varies approximately between 4° and 8°. When the fixation point is in line with the nodal points and the fovea, that is, when the chief ray of the light is directed along the visual axis, the sharpest vision is realized. However, the visual axis is not easily found experimentally because the nodal points of the eye are abstract notions and are not physical entities within the eye. Since the nodal points (on the object and image sides) are within 1 mm of the corneal center of curvature, the visual axis is nearly perpendicular to the cornea.
The line of sight can be defined as the ray from the fixation point reaching the fovea via the pupil center. The line of sight is basically the chief ray of the bundle of light arriving from an external object and reaching the individual’s fovea. Unlike the visual axis, the line of sight can be easily identified experimentally thanks to its close connection to the pupil center. However, the line of sight may not be considered a fixed axis because the pupil center may move when the pupil size changes.
The first optical surface of the eye that light entering the eye encounters is the corneal surface, specifically the anterior corneal surface. Assuming, as a first approximation, that the cornea is a sphere, the center of curvature of the cornea is the center of the corneal sphere. It is also assumed, as a first approximation, that the optical axis is normal (perpendicular) to the anterior corneal surface, since it passes through the center of curvature of the spherical cornea.
The present invention utilizes the above-mentioned properties of the eye, including its shape, to track the eye, utilizing light beam propagation, by tracking one or more ocular axes, such as the optical axis of the eye, with good approximation.
In general, tracking an ocular axis of the eye requires monitoring the position of the light beam with respect to two spatially separated surfaces instead of only one, because the human eye does not have a fixed center of rotation, so that angular eye motions also involve lateral displacement of the ocular axis. In order to track an axis in space, one needs to fix four degrees of freedom, e.g. two angles (pitch and yaw) and two lateral displacements (horizontal/vertical). The first condition, perpendicularity to the cornea, fixes the two incidence angles (pitch and yaw) when fulfilled, and a second condition is needed to ensure a constant lateral (horizontal/vertical) intersection point of the axis on the cornea. According to the invention, the second condition is a geometrical condition and involves maintaining a geometrical relationship between the incident light beam propagation path and a geometrical parameter of the eye. The geometrical parameter of the eye relates to one or more physical entities of the eye, such as the pupil and/or the limbus.
Accordingly, successful tracking of an ocular axis involves continuous adjustment of the spatial and angular propagation paths of the tracking light beam in order to maintain the tracking conditions. For example, the perpendicularity condition of the corneal beam will be breached by a lateral displacement of the cornea with respect to the corneal beam, even though the angular direction of the corneal beam need not be changed.
As mentioned above, according to the invention, the cornea is approximated as having a spherical shape. A light beam normal to the corneal surface will be reflected back along the same direction, given that the reflection from the corneal surface is specular (this is mainly the case when the light beam falls on the portion of the cornea above the pupil). The light beam is reflected from the cornea along the same optical path if and only if it is normal (perpendicular) to the corneal surface.
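By way of a non-limiting illustration of this property, the following Python sketch (all numerical values are hypothetical, and NumPy is assumed available) models the cornea as a sphere and applies the specular reflection law r = d - 2(d.n)n, showing that a beam aimed at the center of curvature, i.e. normal to the surface, retraces its own path:

```python
import numpy as np

def reflect_off_sphere(origin, direction, center, radius):
    """First intersection of a ray with a sphere, plus the specular
    reflection r = d - 2(d.n)n at that point."""
    d = direction / np.linalg.norm(direction)
    oc = origin - center
    b = np.dot(oc, d)
    disc = b * b - (np.dot(oc, oc) - radius ** 2)
    t = -b - np.sqrt(disc)               # nearest intersection
    hit = origin + t * d
    n = (hit - center) / radius          # outward surface normal
    return hit, d - 2.0 * np.dot(d, n) * n

# A beam aimed at the center of curvature is normal to the surface
# and is reflected straight back along its own path.
center = np.array([0.0, 0.0, 0.0])
radius = 7.8                             # typical corneal radius, mm
origin = np.array([0.0, 0.0, 30.0])
hit, r = reflect_off_sphere(origin, center - origin, center, radius)
```

Here the incident direction is (0, 0, -1), and the returned reflected direction is (0, 0, 1), i.e. exactly backwards; any other aim point yields a reflected direction that deviates from the incident path.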
The technique of the invention for tracking ocular axes utilizes beam scanning, i.e. suitably illuminating the eye with an incident light beam propagating along a well-defined spatial and angular illumination propagation path and detecting its respective reflected light beam propagating backwardly along a reflection propagation path. The incident light beam, configured to be reflected from the cornea, is a narrow beam, i.e. it has a small cross-sectional dimension/area with respect to the dimension/area of the pupil. This enables obtaining spatial and angular information, i.e. information about the exact location of the intersection point of the beam on the cornea as well as the angle of reflection of the beam from the cornea. Moreover, the information about the spatial and angular propagation paths, both in the forward (illumination) and backward (reflection) directions, can be acquired by using relatively simple position sensors/detectors.
Thus, according to a first broad aspect of the invention, there is provided a method for tracking an eye of an individual, the method comprising: i) illuminating an eye of an individual, over an area of cornea of the eye, with an incident light beam propagating towards the cornea of the eye along an incident direction, detecting a reflected light beam propagating from the cornea of the eye along a reflected direction and determining angular offset between the incident and the reflected directions; ii) analyzing sensing data of the eye to determine a geometrical parameter of the eye; iii) repeatedly adjusting the incident direction of the incident light beam such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with the incident direction of the incident light beam; and iv) repeating steps (i) to (iii) under changes in gaze direction of the eye of the individual.
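The adjustment loop of steps (i) and (iii) above may be sketched, under the spherical-cornea assumption, as a simple simulated feedback loop. The following Python sketch is illustrative only; the geometry helper and all numerical values are hypothetical, and the "true" corneal center is known only to the simulator, not to the tracker:

```python
import numpy as np

def reflect(origin, d, center, R):
    """Reflect a narrow beam off a spherical cornea (nearest hit)."""
    d = d / np.linalg.norm(d)
    oc = origin - center
    b = np.dot(oc, d)
    t = -b - np.sqrt(b * b - (np.dot(oc, oc) - R * R))
    hit = origin + t * d
    n = (hit - center) / R                  # outward surface normal
    return hit, d - 2.0 * np.dot(d, n) * n  # specular reflection

# Simulated closed loop: re-aim the incident beam until the
# reflection retraces it (angular offset driven toward zero).
true_center = np.array([0.5, -0.3, 0.0])    # hypothetical, mm
R, origin = 7.8, np.array([0.0, 0.0, 40.0])
aim = np.array([2.0, 2.0, 0.0])             # initial (wrong) aim point
for _ in range(10):
    d = (aim - origin) / np.linalg.norm(aim - origin)
    hit, r = reflect(origin, d, true_center, R)
    r = r / np.linalg.norm(r)
    offset = np.degrees(np.arccos(np.clip(np.dot(-d, r), -1.0, 1.0)))
    if offset < 1e-3:                       # perpendicularity reached
        break
    n_est = (r - d) / np.linalg.norm(r - d) # normal from beam geometry
    aim = hit - R * n_est                   # re-aim along the normal
```

The correction step exploits the fact that, for specular reflection, the surface normal at the hit point lies along r - d, so re-aiming along that normal converges quickly for an ideal sphere; the second (geometrical-parameter) condition of step (iii) is not simulated here.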
In some embodiments, step (iii) is repeated until said angular offset tends to or equals zero.
In some embodiments, the method further comprises applying a transformation of coordinates for determining said predetermined geometrical relationship between the geometrical parameter of the eye and the incident direction of the incident light beam.
In some embodiments, the sensing data comprises stereoscopic images of the eye, said analyzing of the sensing data to determine the geometrical parameter comprises triangulation.
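A minimal sketch of such triangulation, assuming calibrated cameras whose rays toward the detected feature are already known (names and values below are illustrative), is the standard midpoint method for two viewing rays:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two viewing rays
    (camera origin o, direction d) -- a basic two-view triangulation
    of a feature such as the limbus center."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Minimize |o1 + t1*d1 - (o2 + t2*d2)| in closed form.
    a, b, c = np.dot(d1, d1), np.dot(d1, d2), np.dot(d2, d2)
    w = o1 - o2
    e, f = np.dot(d1, w), np.dot(d2, w)
    denom = a * c - b * b                  # non-zero for non-parallel rays
    t1 = (b * f - c * e) / denom
    t2 = (a * f - b * e) / denom
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

p = np.array([1.0, 2.0, 10.0])             # hypothetical 3-D feature
o1, o2 = np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0])
est = triangulate(o1, p - o1, o2, p - o2)
```

With noisy image measurements the two rays do not intersect exactly, and the midpoint of their common perpendicular serves as the estimate.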
In some embodiments, the sensing data of the eye is provided at a predetermined first pace being slower than a predetermined second pace of said illuminating of the eye with the incident light beam. In some embodiments, analyzing of the sensing data to determine the geometrical parameter of the eye is done off-line after finishing an eye tracking session.
According to another broad aspect of the present invention, there is provided an eye tracking system comprising a control and processing unit comprising: a light path analyzer configured and operable to receive light data indicative of incident light beam illuminating an area of cornea of an eye of an individual and a corresponding reflected light beam propagating backwardly from the cornea of the eye, and generate light path data indicative of angular offset between incident and reflected light directions of said incident and reflected light beams respectively; a sensing data analyzer configured and operable to receive sensing data of the eye and analyze the sensing data to determine a geometrical parameter of the eye; and an eye tracking analyzer configured and operable to generate operational data for repeatedly adjusting the incident light direction such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with said incident light direction.
In some embodiments, the eye tracking analyzer is configured and operable to generate the operational data for repeatedly adjusting the incident light direction until said angular offset tends to or equals zero.
In some embodiments, the system further comprises a light source configured and operable to generate said incident light beam being configured to be reflected from the cornea of the eye.
In some embodiments, the eye tracking system further comprises a light directing arrangement configured and operable to direct said incident light beam from said light source towards the cornea of the eye of the individual, and collect said reflected light beam propagating backwardly from the cornea of the eye.
In some embodiments, the operational data, generated by said eye tracking analyzer for adjusting the incident light direction, comprise changing location of a pivot point of the light directing arrangement.
In some embodiments, the eye tracking system further comprises a light detector, located at an output of said light directing arrangement, and configured and operable for detecting said reflected light beam and generating a detection output indicative thereof, said light data comprising said detection output. In some embodiments, the eye tracking system further comprises an imager configured and operable to provide said sensing data of the eye of the individual. The imager may be configured and operable to provide stereoscopic images of the eye, thereby enabling said sensing data analyzer to determine said geometrical parameter by triangulation.
In some embodiments, the sensing data of the eye is provided to the control and processing utility at a predetermined first pace being slower than a predetermined second pace in which the light data is provided to the control and processing utility.
In some embodiments, the analyzing of the sensing data by the sensing data analyzer to determine the geometrical parameter of the eye is done off-line after finishing an eye tracking session.
In some embodiments, the geometrical parameter of the eye is indicative of a symmetry condition of the eye.
In some embodiments, the geometrical parameter of the eye is a parameter of limbus region of the eye. The parameter of limbus region of the eye may be at least one of the following: a three-dimensional location of center of the limbus region, of spatial points following a diameter of the limbus region, and of spatial points following a perimeter of the limbus region. When the parameter of limbus region of the eye is a three-dimensional location of center of the limbus region, the predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the limbus region.
In some embodiments, the geometrical parameter of the eye is a parameter of pupil of the eye. The parameter of pupil of the eye may be at least one of the following: a three-dimensional location of center of the pupil, of spatial points following a diameter of the pupil, and of spatial points following a perimeter of the pupil. When the parameter of pupil of the eye is a three-dimensional location of center of the pupil, the predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the pupil.
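The geometrical relationship named above, i.e. that the extension of the incident direction passes through the pupil (or limbus) center, can be tested numerically as a point-to-line distance. The following sketch uses hypothetical coordinates and a hypothetical tolerance:

```python
import numpy as np

def distance_to_axis(point, axis_origin, axis_dir):
    """Perpendicular distance from a 3-D point (e.g. the pupil or
    limbus center) to the line carrying the incident beam."""
    d = axis_dir / np.linalg.norm(axis_dir)
    v = point - axis_origin
    return np.linalg.norm(v - np.dot(v, d) * d)

axis_o = np.array([0.0, 0.0, 40.0])          # beam pivot (hypothetical)
axis_d = np.array([0.0, 0.0, -1.0])          # beam direction, toward eye
pupil_center = np.array([0.05, 0.0, 3.6])    # hypothetical, mm
dist = distance_to_axis(pupil_center, axis_o, axis_d)
on_axis = dist < 0.1                         # hypothetical tolerance, mm
```

When `on_axis` is false, the intersection point of the beam on the cornea is shifted laterally (while re-establishing perpendicularity) until the condition holds.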
In some embodiments, the geometrical parameter of the eye is center of fovea of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the fovea. In some embodiments, the geometrical parameter of the eye is center of optic disk of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the optic disk.
According to another broad aspect of the present invention, there is provided a method for tracking an eye of an individual, the method comprising: i) performing online tracking comprising: a) defining an instantaneous pivot point in space as an intersection point for a plurality of incident light beams illuminating an eye of an individual over an area of cornea of the eye, b) illuminating the eye with an incident light beam, of the plurality of incident light beams, propagating along an incident direction; detecting a reflected light beam propagating from the cornea of the eye along a reflected direction; determining an angular offset between the incident and the reflected directions; and repeatedly adjusting the incident direction of the incident light beam such that said angular offset tends to zero, and c) providing, at least partially simultaneously, sensing data indicative of an instantaneous geometrical parameter of the eye [e.g. Limbus center], ii) performing subsequent off-line analysis comprising: a) providing data indicative of location of center of the cornea of the eye, and b) determining instantaneous geometrical relationships between said instantaneous pivot point, instantaneous geometrical parameter and data indicative of the location of center of the cornea of the eye, to determine instantaneous gaze direction of the individual.
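As a hedged illustration of the off-line geometric step (b) above, assuming the corneal center of curvature and, e.g., the limbus center have been expressed in one common coordinate frame (the positions below are placeholders), the instantaneous optical-axis direction, used here as a gaze proxy, can be taken as the line through these two points:

```python
import numpy as np

def optical_axis_direction(cornea_center, limbus_center):
    """Unit vector from the corneal center of curvature through the
    limbus center -- a sketch of the instantaneous optical-axis
    (gaze-proxy) direction recovered off-line."""
    v = limbus_center - cornea_center
    return v / np.linalg.norm(v)

# Hypothetical positions in a common frame (mm):
cornea_c = np.array([0.0, 0.0, -7.8])   # center of corneal curvature
limbus_c = np.array([0.0, 0.0, -2.5])   # limbus-plane center
gaze = optical_axis_direction(cornea_c, limbus_c)
```

This is only one possible geometric relationship among the pivot point, geometrical parameter and corneal center; the claim language above leaves the exact combination open.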
BRIEF DESCRIPTION OF THE DRAWINGS In order to better understand the subject matter that is disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
Fig. 1 illustrates, by way of a flow diagram, a non-limiting example of a method for tracking an eye of an individual, in accordance with some exemplary embodiments of the present invention; Fig. 2 illustrates a non-limiting example of a method for adjusting the direction of the incident light, when the limbus center is the geometrical parameter;
Fig. 3 illustrates, by way of a flow diagram, a non-limiting example of a method for tracking an eye of an individual, in accordance with some exemplary embodiments of the present invention, the method includes an online and an offline tracking stages;
Fig. 4 illustrates a non-limiting example for offline geometrical analysis for tracking the eye and determining the instantaneous gaze direction;
Fig. 5 illustrates, by way of a block diagram, a non-limiting example of a system for tracking an eye of an individual, in accordance with some exemplary embodiments of the present invention;
Fig. 6 illustrates, by way of a block diagram, non-limiting examples of utilities/systems that may be included in the eye tracking system of the present invention; and
Fig. 7 illustrates a non-limiting example of a method for receiving and analyzing sensing data of the eye in order to determine a geometrical parameter of the eye.
DETAILED DESCRIPTION OF EMBODIMENTS
In accordance with a first aspect of the invention, a method for tracking an eye of an individual is described, the method includes at least the following: i) illuminating an eye of an individual, over an area of cornea of the eye, with an incident light beam propagating towards the cornea of the eye along an incident direction, detecting a reflected light beam propagating from the cornea of the eye along a reflected direction and determining angular offset between the incident and the reflected directions; ii) analyzing sensing data of the eye to determine a geometrical parameter of the eye; iii) repeatedly adjusting the incident direction of the incident light beam such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with the incident direction of the incident light beam; and iv) repeating steps (i) to (iii) under changes in gaze direction of the eye of the individual.
Reference is made to Fig. 1, illustrating by way of a flow diagram a non-limiting example of a method 100 for tracking an eye of an individual, in accordance with the above-described method and according to some exemplary embodiments of the present invention. It is assumed that each of the two eyes of the individual is tracked individually, because the light reaching each eye, when a person looks at an external object, traverses a different path in space. Therefore, the method 100 is typically applied to each eye of the individual, such that two specific tracking paths corresponding to the two eyes of the individual are determined.
In step 102, the individual's eye is illuminated, over an area of the cornea of the eye, with an incident light beam propagating along an incident direction towards the cornea of the eye, and a reflected light beam propagating along a reflected direction, being the reflection of the incident light beam from the cornea, is detected in step 104. The light direction/path, e.g. the incident direction, is defined as the direction/path along which the central axis of the light beam (the center of the light beam’s transverse cross section), or the chief ray of the light beam, propagates. The incident light beam is configured to be reflected from the cornea of the eye, and accordingly it is sometimes referred to herein as a “corneal beam” or “corneal tracker”. When the incident light beam tracks the eye, as will be explained further below, the direction/path of the incident light beam coincides with the eye tracking axis. The optical properties of the incident light beam, such as the wavelength, frequency, intensity, etc., are chosen to enable at least a detectable portion of the incident light beam to be reflected backwardly from the cornea. In some embodiments, the incident light beam is configured to track the incidence angle of the incident light beam with respect to the eye’s optical axis (considered as having zero angle) and to maintain a condition of perpendicularity of the eye tracking axis with respect to the corneal surface. The incident light beam is configured to be normal to the corneal surface for all directions of the individual’s gaze, at all times. Accordingly, the incident light beam is configured to meet the corneal surface at a right (90°) angle, with respect to a tangent to the cornea surface, at each meeting/intersection point (the point on the corneal surface where the incident light beam falls).
The incident light beam used in the technique(s) of the invention can have one or more wavelengths, preferably in an invisible wavelength range, such that illumination of the light beam towards the individual’s eye does not disturb, dazzle or distract the individual. This is particularly important in virtual and/or augmented reality applications, in which virtual images (objects and scenes) are continuously projected towards the individual’s eyes while continuous tracking of the individual's eye(s) is required in order to control the specifications of the projected image. For example, in augmented reality applications, the individual is exposed to both real scenery and virtual scenery superposed thereon, in which case it is essential to avoid interference with the individual’s vision during the activity. At the same time, it is important for the incident light beam to have a wavelength that causes a measurable part of the beam to be reflected from the cornea. Accordingly, the incident light beam can be in any harmless and non-disturbing light range that fulfils the above two conditions. In some non-limiting embodiments, the incident light beam can be in the infrared (IR) range. In some specific non-limiting embodiments, the incident light beam can have a wavelength in the range between 800 nm and 1500 nm.
As previously mentioned, in step 104, the reflected portion of the incident light beam, i.e. the reflected light beam, is detected.
In step 110, the perpendicularity condition of the incident light beam is verified by the detection of the reflection (at least a portion thereof) of the incident light beam, i.e. the reflected light beam, from the cornea. Detection of the reflected light beam along the reflected direction enables calculating an angular offset between the incident and reflected light directions. When the incident light beam impinges on the cornea surface at a right angle, it is reflected therefrom also at a right angle, i.e. along the same direction/path but in the opposite sense. Accordingly, the perpendicularity condition can be verified if and when the reflected light beam (being the reflected portion of the incident light beam) propagates backwardly, away from the eye, along the direction/path that was traversed by the incident light beam, i.e. along the incident direction. If it is detected that the reflected light beam is not propagating backwardly along the incident direction, i.e. it is propagating along a different path with an angular offset between the incident and reflected directions, then, in step 116, the angular direction of the incident direction, along which the incident light beam propagates, should be modified and adjusted. If it is detected that the reflected light beam is propagating backwardly along the incident direction, i.e. the angular offset between the incident and reflected directions is zero, then a second condition, as will be described further below (step 112), should be verified, and only if both conditions are met, independently, is the eye deemed to be tracked (step 114). It is noted that, generally, the angular offset between the incident and reflected directions will not be zero in the first place, and the incident direction is dynamically/repeatedly adjusted (step 118) to minimize the angular offset down to zero, or at least such that the angular offset tends to zero.
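The angular offset monitored in steps 110/116 can be computed as the angle between the reversed incident direction and the measured reflected direction. The sketch below is illustrative; in practice the two directions would come from the position sensors/detectors mentioned above:

```python
import numpy as np

def angular_offset_deg(incident, reflected):
    """Angle between the reversed incident direction and the
    measured reflected direction; zero means the reflected beam
    retraces the incident path (perpendicular incidence)."""
    i = incident / np.linalg.norm(incident)
    r = reflected / np.linalg.norm(reflected)
    return np.degrees(np.arccos(np.clip(np.dot(-i, r), -1.0, 1.0)))

# Perpendicular incidence: the reflection comes straight back.
zero = angular_offset_deg(np.array([0.0, 0.0, -1.0]),
                          np.array([0.0, 0.0, 1.0]))
# Off-normal incidence: a non-zero offset, to be driven toward
# zero by adjusting the incident direction (step 116).
off = angular_offset_deg(np.array([0.0, 0.0, -1.0]),
                         np.array([0.0, 0.2, 1.0]))
```

The clip guards against floating-point dot products marginally outside [-1, 1], which would otherwise make `arccos` return NaN.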
This repeated adjustment of the incident direction to fulfill the perpendicularity condition is performed almost all the time and specifically under changes in the gaze direction of the eye.
It is noted that according to some embodiments, the method of tracking the eye can include preliminary steps for defining an individual-specific parametric space of all light directions/paths perpendicular to the cornea surface over a predetermined area thereof, extending at least over the pupil and within a predefined range of angular and lateral positions of the eye, by utilizing the properties of the incident and reflected light beams only. While method 100 should be executed continuously, because eye movements (angular as well as spatial/lateral) cause the incident beam not to be perpendicular to the cornea, prior knowledge of the parametric space of all light directions/paths perpendicular to the cornea surface can greatly reduce the latency of the tracking and increase its accuracy. In other words, a preliminary method for finding all light directions/paths perpendicular to the cornea surface over an area thereof, and within a predefined range of angles and lateral positions with respect to a fixed axis in space, can be applied by scanning the cornea with a plurality of incident light beams. The defined perpendicular-to-cornea light directions/paths define respective intersection points where the light beams hit the cornea surface. The distance between every two light directions/paths or intersection points can be made as small as possible to obtain highly accurate tracking of the eye. In one example, the preliminary method finds: a) the required settings of light deflectors to effect particular light beam angles with respect to a nominal optical axis of the system used; and b) the translation from the recorded shift of the light beam from the perfect backward direction (i.e. non-perpendicular incidence) to the angular and lateral corrections needed to bring the light beam back to perpendicular incidence.
In step 106, performed in parallel (concurrently) to steps 102 and 104, either at the same pace or at a different pace, sensing data of the eye is provided. It is noted that the sensing data can be of the external or internal sides of the eye, depending on the sensing technique. Non-limiting examples are described herein below. Providing the sensing data may be, in some non-limiting embodiments, by capturing images of the eye.
In step 108, the sensing data is analyzed and a geometrical parameter of the eye is determined from the analyzed sensing data. In particular, the three-dimensional location of the geometrical parameter, with respect to a reference three-dimensional coordinate system, e.g. xyz coordinates in a cartesian coordinate system, is determined. In step 112, the geometrical relationship between the determined geometrical parameter of the eye and the incident direction of the incident light beam is examined. If the relationship satisfies a predetermined geometrical condition, derived from the kind of the geometrical parameter, the eye is tracked; if not, then, even if the incident direction is perpendicular to the cornea, the spatial location of the incident light beam (the incident direction/path) is adjusted, i.e. the intersection point of the incident light beam with the cornea surface is adjusted, while ensuring that the perpendicularity condition described above remains valid by repeating step 116. Only when both conditions are met, i.e. the perpendicularity condition and the geometrical condition, is the eye tracked. It is appreciated that whenever the spatial location of the intersection point is changed, given the spherical assumption of the corneal surface, the perpendicularity condition is breached and steps 110 and 116 are repeated to ensure perpendicularity of the incident light beam with respect to the corneal surface.
For example, in one non-limiting embodiment, the sensing data (e.g. image data) may refer to the limbus of the eye, and the analysis of the sensing data results in determining the geometrical parameter, being the three-dimensional location, with regard to a reference coordinate system, of one or more spatial points of the limbus. In one example, the geometrical parameter is the three-dimensional location of the center of the limbus. In this case, tracking the eye is achieved if the center of the limbus is located on the extension through the eye of the direction/path of the perpendicular-to-cornea incident (as well as reflected) light beam. In another example, the geometrical parameter is a group of spatial points following/tracking a diameter of the limbus. In this case, tracking the eye is achieved if the diameter of the limbus (i.e. the three-dimensional locations of the spatial points tracking the diameter of the limbus) is perpendicular to the extension of the incident direction through the eye, and the extension of the incident direction through the eye passes through the middle of the diameter of the limbus. In yet another example, the geometrical parameter is a group of spatial points following/tracking the perimeter of the limbus. In this case, tracking the eye is achieved if any point on the incident direction is equidistant from two opposite spatial points on the perimeter of the limbus. If, based on analysis of the sensing data, it is concluded that the limbus has a circular shape, then the geometrical condition can be that any point on the incident direction is equidistant from any two points on the perimeter of the limbus. As with the limbus example above, the sensing/image data may refer, in another non-limiting embodiment, to the pupil of the eye, and the analysis of the sensing data results in determining the geometrical parameter, being the three-dimensional location, with regard to a reference coordinate system, of one or more spatial points of the pupil.
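The perimeter-based symmetry condition described above, namely that a point on the candidate axis is equidistant from opposite perimeter points, may be checked as follows (limbus radius, coordinates and tolerance are hypothetical):

```python
import numpy as np

def equidistant_on_axis(axis_point, p_a, p_b, tol=1e-6):
    """True when a point on the candidate axis is equidistant from
    two opposite perimeter points (limbus or pupil) -- the symmetry
    condition of the axis passing through the center."""
    return abs(np.linalg.norm(axis_point - p_a)
               - np.linalg.norm(axis_point - p_b)) < tol

# Circular limbus of radius 5.9 mm centered under the axis:
axis_pt = np.array([0.0, 0.0, 10.0])
p1 = np.array([5.9, 0.0, 0.0])
p2 = np.array([-5.9, 0.0, 0.0])
centered = equidistant_on_axis(axis_pt, p1, p2)
shifted = equidistant_on_axis(axis_pt + np.array([1.0, 0.0, 0.0]),
                              p1, p2)
```

For a circular limbus the check can be repeated for arbitrary pairs of perimeter points, per the circular-shape remark above; for a non-circular limbus only opposite pairs are used.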
In one example, the geometrical parameter is the three-dimensional location of the center of the pupil. In this case, tracking the eye is achieved if the center of the pupil is located on the extension through the eye of the direction of the perpendicular-to-cornea incident (as well as reflected) light beam. As can be appreciated, the three-dimensional location of a group of spatial points following/tracking the diameter or the perimeter of the pupil can be determined, and similar analysis to that of the limbus, as described above, can be applied.
In the non-limiting examples of the geometrical parameters relating to the limbus, described above, the eye is tracked by fulfilling a geometrical condition of a symmetry assumption of the eye's structure. Specifically, the eye is tracked by tracking the optical axis of the eye.
It is also appreciated that by tracking the optical axis, the visual axis can be tracked as well, since the visual axis can be obtained by deflecting the incident light beam by a predetermined angle in a predetermined direction. For example, according to the inventors, the visual axis is shifted by about 4° from the optical axis. By tracking the optical axis, the incident light beam can be deflected by this shift to track the visual axis.
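The described deflection amounts to applying a fixed rotation to the tracked optical-axis direction. The following sketch is simplified to two dimensions; the 4° magnitude and its sign are per-individual calibration values, and the function name is illustrative only:

```python
import math

def deflect(direction, degrees=4.0):
    """Rotate a 2-D unit direction vector (x, z) by a fixed angle, sketching
    the deflection of the incident beam from the tracked optical axis toward
    the visual axis. The angle and its sign are calibration assumptions."""
    a = math.radians(degrees)
    x, z = direction
    return (x * math.cos(a) - z * math.sin(a),
            x * math.sin(a) + z * math.cos(a))
```

In practice the deflection would be applied in three dimensions, in the predetermined direction found during calibration for the individual.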
In yet another non-limiting embodiment, the sensing data may refer to the fovea of the eye, and the analysis of the sensing data results in determining the geometrical parameter being the three-dimensional location, with regard to a reference coordinate system, of one or more spatial points of the fovea, specifically the three-dimensional location of the fovea center. It is known that light reflected from the fovea changes polarization, because of the Henle fibers. This polarization-changing feature makes it possible to detect when light is reflected from the center of the fovea and to distinguish between light reflected from the fovea and light reflected from the cornea. Therefore, the sensing data, being a portion of the light beam that enters the eye and which is reflected with a specific change in polarization, is indicative of the fovea center (as the geometrical parameter). The light beam that passes through the cornea center (such that the portion that is reflected from the cornea surface travels along the incident direction) and hits the fovea center can also be used to track the eye. This example of the geometrical parameter enables tracking an axis that is close to the visual axis of the eye.
Yet another non-limiting example of the geometrical parameter of the eye is the center of the so-called optic disk, which is the point on the retina where the optic nerve connects. As there are no rods or cones overlying the optic disk, it corresponds to a small blind spot in the eye. The optic disk is highly reflective, and since the eye behaves as a retro-reflector, this allows determining when the portion of the light beam entering the eye hits the optic disk, by detecting the increase in intensity of the retro-reflected portion of the light beam. Accordingly, another axis that can be used to track the eye is the axis defined by the cornea center and the geometrical parameter as the center of the optic disk.
Reference is now made to Fig. 2, illustrating a non-limiting example of a method 200 for adjusting the direction of the incident light. The exemplified method relates to the limbus center as the geometrical parameter; however, the method can be accommodated with the relevant relations/parameters for any other geometrical parameter used. As described above, if the incident beam IB is perpendicular to the cornea surface, then the extension of the incident beam through the eye passes through the cornea center. To track the eye, e.g. by tracking the optical axis of the eye, the extension of the beam needs to pass through the limbus center as well. It is convenient, for control purposes, to define the intersection point of a plurality of incident light beams illuminating an eye of an individual over an area of the cornea of the eye as an instantaneous pivot point in space. Typically, when correcting angular offsets of the incident beam, the location of the pivot point in space is kept constant; however, precise determination of the gaze direction may require adjusting the location of the pivot point in space, as will be described later.
The following radius vectors are defined and shown in the figure:
• rB ≡ [xB yB zB]T is the center of the limbus;
• rC ≡ [xC yC zC]T is the center of the cornea;
• rP ≡ [xP yP zP]T is the pivot point;
• rN ≡ [xN yN zN]T is the required pivot point.
The incident light direction, supposing it fulfills the first condition of being perpendicular to the cornea, can be calculated as follows:
• θL = … [equation rendered as an image in the published document]
The angle between the limbus center and the original pivot point can be calculated as follows:
• θB = atan((xB - xP) / (zB - zP))
When θL is equal to θB, the incident beam passes through the limbus center and the cornea center.
The new required pivot point location, which reduces the difference between θL and θB, can be calculated as follows:
• rN = … [equation rendered as an image in the published document]
where zP is set to a constant value equal to the nominal limbus z coordinate minus the distance between the limbus center and the new required pivot point, and g is a gain of the iterative process.
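The alignment condition used above — the incident beam passing through both the cornea center and the limbus center when θL equals θB — can be sketched in the x-z plane as follows. All coordinates are hypothetical illustration values, and the actual pivot-update equations are those published (as images) in the source:

```python
import math

def beam_angles(limbus, cornea, pivot):
    """Angles (radians, measured about the z axis) of the limbus center and
    of the perpendicular-to-cornea incident beam, both as seen from the
    pivot point. Points are (x, z) pairs. When theta_L equals theta_B, the
    incident beam passes through both the cornea center and the limbus
    center, i.e. it lies along the tracked optical axis."""
    xB, zB = limbus
    xC, zC = cornea
    xP, zP = pivot
    theta_L = math.atan2(xC - xP, zC - zP)  # incident beam direction
    theta_B = math.atan2(xB - xP, zB - zP)  # direction to limbus center
    return theta_L, theta_B
```

When the two angles differ, the pivot point is iteratively shifted with gain g until they coincide.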
Tracking of the eye can be beneficial in plenty of applications, some of which are of an offline tracking nature. For example, analysis of the eye movement and gaze direction of an individual during a test, such as a medical test, does not have to be online and can be performed retrospectively after performing the test.
Reference is now made to Fig. 3, illustrating, by way of a flow diagram, a non-limiting example of a method 300 for tracking an eye of an individual, in accordance with some exemplary embodiments of the present invention. As shown, the method includes an online stage 302 and an offline stage 304. In general, the online stage includes steps for tracking the eye by the corneal tracker, i.e. by ensuring that the perpendicularity-to-cornea condition is achieved online, while concurrently acquiring sensing data indicative of a geometrical parameter of the eye. The offline stage includes analysis of the sensing data, in addition to other provided geometrical data of the eye, in order to determine the gaze direction retrospectively.
In step 310, an instantaneous pivot point in space is defined as an intersection point for a plurality of incident light beams illuminating an eye of an individual over an area of the cornea of the eye. The pivot point will typically be inside the borders of the eye's sphere. In steps 312 and 314, while maintaining the pivot point, the eye is illuminated with an incident light beam propagating along an incident direction, and a reflected light beam propagating from the cornea of the eye along a reflected direction is detected.
In step 316, an angular offset between the incident and the reflected directions is determined.
In step 318, the incident direction is adjusted as needed, while maintaining the pivot point in place, to minimize the angular offset, specifically until the angular offset equals zero.
Basically, steps 312 to 318 are similar to steps 102, 104, 110 and 116 of method 100 above.
In step 320, sensing data indicative of an instantaneous geometrical parameter of the eye is provided at least partially simultaneously with steps 312-318. The monitored geometrical parameter can be any geometrical parameter of the eye that defines a geometrical relationship with the incident direction of the incident light beam. Such geometrical parameters are described above with respect to method 100.
The data of the incident directions gathered over time, together with the simultaneous sensing data, are saved for subsequent offline analysis as follows. Accordingly, a plurality of incident directions and the respective simultaneous data of a geometrical parameter of the eye are saved for determination of a respective plurality of gaze directions.
In step 322, geometrical data indicative of the location of the center of the cornea of the eye is provided. This data can be, for example, a distance from the geometrical parameter. For example, if the geometrical parameter is the center of the limbus, then the distance between the cornea center and the limbus center can be determined or provided. In one non-limiting example, the distance between the centers of the limbus and cornea is empirically known. In another non-limiting example, the distance between the centers of the limbus and cornea can be found by determining the three-dimensional location of the cornea center in addition to determining the three-dimensional location of the limbus center. The location of the cornea center can be determined, for example, by shifting the pivot point quickly (such that the eye is relatively stationary) while keeping the reflected beam normal to the cornea. This way, the intersection of the incident beams before and after shifting the pivot point coincides with the center of the cornea since, as described above, the incident direction of an incident light beam that is perpendicular to the cornea surface passes through the center of the cornea, which is approximated to have a spherical shape.

In step 324, instantaneous geometrical relationships between the instantaneous location of the pivot point, the instantaneous geometrical parameter and the instantaneous location of the center of the cornea are determined, and in step 326 the eye is tracked by determining the respective instantaneous gaze directions of the individual.
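The determination of the cornea center from two perpendicular-to-cornea beams (before and after quickly shifting the pivot point) reduces to intersecting two lines. A minimal two-dimensional sketch, with hypothetical values:

```python
def intersect_2d(p1, d1, p2, d2):
    """Intersection of two 2-D lines p + t*d, representing the two
    perpendicular-to-cornea incident beams before and after shifting the
    pivot point; their intersection coincides with the cornea center."""
    # Solve p1 + t*d1 = p2 + s*d2 by Cramer's rule.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("beams are parallel; no unique intersection")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

In three dimensions, where measurement noise generally makes the two beams skew, the midpoint of their closest approach would be used instead of an exact intersection.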
Reference is made to Fig. 4, illustrating a non-limiting example for offline geometrical analysis for tracking the eye and determining the instantaneous gaze direction, in accordance with method 300.
A cornea of the eye is shown while being illuminated with incident light beams. A pivot point P is defined arbitrarily in the three-dimensional space (step 310), though typically inside the eye. All of the illuminating incident light beams, or their extensions through the eye, pass through the pivot point P. For example, three incident directions ID1, ID2 and ID3 of the incident light beam are illustrated in the figure.
As described above (steps 312-318), the incident direction is continuously adjusted until it is perpendicular to the cornea surface (i.e., when the measured angular offset between the incident and reflected light beams equals zero). This is the illustrated incident direction ID1. At this stage, under the same gaze direction, the spatial propagation of the incident direction ID1 is known. As described above, the incident direction ID1 passes through the cornea center C.
While illuminating the eye along the incident direction ID1, sensing data indicative of the limbus center, as a geometrical parameter of the eye, is obtained (step 320), thereby enabling determination of the three-dimensional spatial location of the limbus center L.
As the three-dimensional coordinates of the pivot point P and the limbus center L are known, the orientation and length of the side LP in the triangle LPC can be determined, and as the orientation of the incident direction ID1 is known (being defined by the illuminating system), the angle LPC can be calculated.
Utilizing known empirical data about the distance between the limbus center L and the cornea center C, the length of the side LC in the triangle LPC is determined (step 322).
Using, for example, the sine law in triangles, it is possible then to calculate the angle LCP, and accordingly the angle PCP' (step 324).
The incident direction ID1' represents the optical axis of the eye since it passes through the limbus center L and the cornea center C. Determining the angle PCP' enables adjusting the location of the pivot point from P to P', thus enabling determination of the gaze direction (step 326).
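The triangle computation of steps 322-326 can be sketched as follows. This is a two-dimensional illustration with hypothetical coordinates; the limbus-to-cornea distance LC stands in for the empirical value mentioned in step 322:

```python
import math

# Pivot point P at the origin; incident direction ID1 along +z.
P = (0.0, 0.0)
L = (1.0, 10.0)   # limbus center from the sensing data (hypothetical)
LC = 5.0          # assumed empirical limbus-to-cornea-center distance

# Side LP: its length, and its angle to the incident direction (angle LPC).
LP = math.hypot(L[0] - P[0], L[1] - P[1])
angle_LPC = math.atan2(L[0] - P[0], L[1] - P[1])

# Law of sines in triangle LPC: sin(LCP)/LP = sin(LPC)/LC.
angle_LCP = math.asin(LP * math.sin(angle_LPC) / LC)

# P' lies on the line through L and C extended beyond C, so at vertex C
# the angle PCP' is the supplement of angle LCP.
angle_PCP = math.pi - angle_LCP
```

The resulting angle PCP' then gives the direction in which the pivot point is moved from P to P', as described above.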
Reference is made to Fig. 5 illustrating by way of a block diagram a non-limiting example of a system 10 for tracking an eye of an individual, according to some exemplary embodiments of the technique of the present invention. The system 10 can be used in execution of methods 100-300. As shown, the system 10 includes a control and processing unit 20 configured and operable for controlling and processing of light and light data respectively in order to track the individual's eye. The control and processing unit 20 includes at least the following utilities: a light path analyzer 22, a sensing data analyzer 24, an eye tracking analyzer 26 and a memory 28. In addition, while not specifically shown, the control and processing unit 20 includes input and output utilities for receiving and for sending/presenting data.
The light path analyzer 22 is configured and operable to receive light data indicative of an incident light beam illuminating an area of the cornea of an eye of an individual and light data indicative of a corresponding reflected light beam (a reflection of at least a portion of the incident light beam) propagating backwardly from the cornea of the eye. The incident and reflected light data are analyzed by the light path analyzer 22 to generate light path data including, inter alia, the angular offset between the incident and reflected light directions of the incident and reflected light beams respectively. As mentioned above, the incident and reflected directions should be perpendicular to the cornea surface at the intersection point (assuming the cornea surface at the relevant illuminated area is spherical), in case the optical axis of the eye is tracked.
At the same time, the sensing data analyzer 24 is configured and operable to receive sensing data of the eye and analyze the sensing data to determine a geometrical parameter of the eye. Examples of the geometrical parameter have been described above, such as the limbus center or the pupil center. The sensing data analyzer determines the three-dimensional spatial location of the geometrical parameter in a way that enables correlating between the spatial location of the geometrical parameter and the spatial location of the incident light beam (i.e. the incident direction).
The eye tracking analyzer 26 is configured and operable to generate operational data for adjusting the incident light direction such that the angular offset is minimized and such that the geometrical parameter of the eye has a predetermined geometrical relationship with the incident light direction. As described above, in case the optical axis of the eye is tracked, the incident and reflected light beams should be perpendicular to the cornea surface. In other words, the angular offset should be minimized down to zero. Generally, the adjustment of the incident light direction is carried out repeatedly and continuously in order to bring the angular offset down to zero. In the case of method 300, the eye tracking analyzer is further configured and operable to execute the required calculations for determining the geometrical relationships between pivot point, geometrical parameter and cornea’s center.
The memory 28 is a non-transitory memory configured and operable to save data required for the system operation. For example, the data about the pivot point, incident directions, location of cornea's center, sensing data indicative of the geometrical parameter can all be saved in the memory for offline access and analysis.
Reference is now made to Fig. 6, illustrating by way of a block diagram non-limiting examples of utilities/systems that can be included in the system 10, according to some exemplary embodiments of the present invention. It is noted that the system 10 may include one or more of the following utilities/systems. Also, it is noted that the following utilities/systems are operable to perform the respective functions listed above with respect to method 100, even if not directly mentioned herein below.
As shown, the system 10 can include utilities such as a light source 30, a light directing arrangement 40, a light detector 50, the control and processing unit 20, and an eye sensing system 60, configured and operable together to determine and track the eye EYE.
The light source 30 is configured and operable to generate an incident light beam IB to be projected towards the eye EYE. The light source is configured to generate the incident light beam IB such that it is reflected from the cornea of the eye, e.g. by being in one or more frequency ranges that are reflected or mostly reflected from the cornea. The light source 30 can include one or more light source units configured to generate the incident light beam IB. In some exemplary embodiments, one or more filters can be used at the output of the one or more light source units to provide different ranges of light wavelengths forming the incident light beam IB. In some exemplary preferred embodiments, the light source 30 is configured and operable to generate the incident light beam IB having wavelengths in the non-visible range, to minimize disturbance to the individual, as described above. For example, the incident light beam IB can be in the infrared range, e.g. between 800 nm and 1500 nm. Other specifications of the incident light beam IB are described above with reference to method 100, and are equally valid for the system 10.
The light directing arrangement 40 is configured and operable to receive the incident light beam IB from the light source 30 and direct it towards the eye EYE along an incident direction defined by central axis of the incident light beam IB, to thereby illuminate the eye over an area of the cornea extending over the pupil of the eye EYE. Additionally, the light directing arrangement 40 is configured and operable to collect a respective reflected light beam RB propagating backwardly from the cornea of the eye EYE. The light directing arrangement 40 can include optical elements responsible for adjusting the light path of the incident and/or reflected light beams, such as optical reflectors, optical deflectors, mirrors, dichroic mirrors, beam splitters, lenses, and other similar elements configured and operable to direct the incident light beam as well as the reflected light beam in accordance with the conditions mentioned above.
The light directing arrangement 40, through its various light path adjustment elements is configured and operable to compensate for any angular or lateral displacements of the eye during the tracking of the eye, such that it guarantees that the conditions required, such as the perpendicularity to cornea condition, are fulfilled.
The light directing arrangement 40 is responsible for directing the incident light beam IB along the incident direction such that IB is perpendicular to the cornea at the intersection point IP.
The light detector 50 is located at the output of the light directing arrangement 40, and is configured and operable for detecting the reflected light beam RB and for generating a detection output indicative thereof. The light detector 50 can include one or more light sensors for detecting the reflected light beam RB. In some exemplary embodiments, the one or more light sensors can be based on quad sensor(s). In some exemplary embodiments, the light sensor(s) is/are configured and operable to generate an electrical output signal(s) (the detection output) in response to the light input signal(s). In some exemplary embodiments, the light sensor(s) is/are configured and operable to generate detection output(s) indicative of the detected light intensity. In some exemplary embodiments, the light sensor(s) is/are configured and operable to generate detection output(s) indicative of spatial location(s)/propagation path(s) of the detected light beam(s). The control and processing utility 20 is configured and operable to receive the detection output from the light detector 50 and determine the angular offset between the incident and reflected light directions. Further, the control and processing utility 20 is configured and operable to determine the adjustment of the light beams itinerary in order to track the eye, inter alia by activating a control function and/or control loop that receives the detection output and generates an output for correcting the light beams itinerary so that the light beams track the eye.
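For a quad-sensor-based detector, the angular offset between incident and reflected beams maps to the displacement of the reflected spot from the sensor center. A common normalized read-out is sketched below; this is an illustration of the general quad-cell principle, not the specific detector of the system:

```python
def quad_offset(a, b, c, d):
    """Normalized (x, y) displacement of a light spot on a quad cell, with
    a, b, c, d the intensities of the top-left, top-right, bottom-left and
    bottom-right quadrants. A centered spot -- i.e. zero angular offset
    between the incident and reflected beams -- yields (0.0, 0.0)."""
    total = a + b + c + d
    x = ((b + d) - (a + c)) / total
    y = ((a + b) - (c + d)) / total
    return x, y
```

A control loop would then drive these two signals toward zero, which corresponds to the perpendicularity-to-cornea condition described above.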
The eye sensing system 60 is configured and operable to acquire sensing data SD of the eye, where the sensing data SD is indicative of a geometrical parameter of the eye. The eye sensing system 60 is activated at least partially simultaneously with the light source 30, such that the sensing data overlaps in time with the incident and reflected light beams. In one non-limiting example, the operation pace of the light source 30 is greater than the operation pace of the eye sensing system 60.
In one non-limiting embodiment, the eye sensing system 60 is an imager configured and operable to capture images of the eye, specifically to capture images of the area of the cornea covering the limbus, iris and pupil. In one non-limiting example, the imager is configured with stereo imaging sensors configured and operable to provide stereo images of the area of the cornea that contains the limbus and/or pupil, enabling reconstruction of the limbus/pupil or a characteristic thereof by using suitable techniques such as triangulation. The reconstruction can be done at the sensing system 60 independently (in which case the sensing system includes a suitable analyzer) or at the control and processing unit 20, as was described above with reference to the sensing data analyzer 24. Non-limiting examples of the acquisition of a geometrical parameter of the eye by the eye sensing system are described herein further below.
The control and processing utility 20 may be configured and operable to control the operation of the light source 30 and/or the light directing arrangement 40 and/or the light detector 50 and/or the eye sensing system 60, and the different elements thereof, in order to determine the tracking of the individual's eye. Accordingly, the control and processing utility 20 may include one or more controllers configured and operable to control the different parts of the system 10. In some exemplary embodiments, each of the light source 30, the light directing arrangement 40, the light detector 50 and the eye sensing system 60 (the system parts) has its own controller(s) configured to control the operation of one or more elements of the respective system part. In some exemplary embodiments, one or more central controllers are configured to control the operation of some or all of the system parts, including the light source 30, the light directing arrangement 40, the light detector 50 and the eye sensing system 60. In case more than one controller is used, the controllers can be located in one location in the system 10 and connected to the corresponding controlled system parts, or the controllers can be distributed in the system such that each system part has its own controller(s) located therewith. Even if no controller is specifically described or shown in the figures, this should not limit the broad aspect of the invention, and it is to be understood that each action performed by the control and processing utility 20 to control the operation of any system part is typically performed by one or more controllers corresponding to the respective system part. The controller(s) can be software or hardware based, or a combination thereof.
In one non-limiting embodiment, the control and processing utility 20, specifically the eye tracking analyzer, generates the operational data to adjust the incident light direction, by changing spatial location of a pivot point of the light directing arrangement 40, such that the angular offset is minimized and the geometrical relationship between the incident light direction and the geometrical parameter is achieved.
Reference is made to Fig. 7 illustrating, in accordance with exemplary embodiments of the invention, a non-limiting example of a method 400 for receiving and analyzing sensing data of the eye in order to determine a geometrical parameter (GP) of the eye. The method 400 can be executed by the eye tracking system 10. Specifically, the geometrical parameter that is determined includes the limbus or pupil of the eye.
The method 400 for determining a geometrical parameter of the eye includes the following: receiving or capturing image data indicative of at least two images from different angles of a user's eye at step 402; identifying regions related to the geometrical parameter in each image at step 404; determining a geometrical representation of the GP structure at step 406; and performing a triangulation of the geometrical representation of the GP structure of at least two images at step 408, to thereby determine a three-dimensional spatial location of the geometrical parameter at step 410.
In some embodiments, step 408 includes performing a triangulation of the geometrical representation of the GP structure in at least two images to determine three-dimensional parameters of the GP. For example, in case the geometrical parameter is the limbus, the three-dimensional parameter may be a radius, diameter, center and/or torsional rotation of the limbus. In some embodiments, after the triangulation of the geometrical representation of the GP structure of at least two images, the method may include determining the geometrical representation of the GP structure by processing the three-dimensional GP parameters to generate further, more precise data indicative of the GP region.
In some embodiments, determining the geometrical representation of the GP structure may include digital image pre-processing (step 412) such as performing a GP recognition process on each image or performing mathematical transformations of the image including using an intensity gradient map and then running a GP area recognition process on the transformed image. The step of digital image pre-processing may include calculating an image intensity gradient map of the GP region, identifying at least one region of the GP structure in which the local direction of the gradient is substantially uniform, and processing data indicative of the GP structure by weighting the pixels of such regions and determining geometrical representation of the GP structure based on the matching pixels related to the GP. More specifically, digital image pre-processing may include determining the region of the GP structure by identifying the local uniformity, i.e. the direction of the gradient of each point is collinear only with its neighbors. An entropy map may also be used as an intensity gradient map (instead of or in addition to a gradient map), or the processing may be performed directly on the image.
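The intensity-gradient pre-processing of step 412 can be sketched as a central-difference gradient map over the image, from which regions of locally uniform gradient direction are then identified. This is a minimal illustration; as noted above, the actual pipeline may instead use an entropy map or operate directly on the image:

```python
import math

def gradient_directions(img):
    """Central-difference intensity gradient direction (radians) per
    interior pixel of an image given as a list of rows of floats; None
    where the gradient vanishes or at the border."""
    h, w = len(img), len(img[0])
    out = [[None] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            if gx or gy:
                out[y][x] = math.atan2(gy, gx)
    return out
```

Pixels whose gradient direction is collinear with that of their neighbours would then be weighted as candidate boundary points of the GP region.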
In some embodiments, identifying regions related to the GP in each image (step 404) may include identifying image data indicative of eye features such as eyelids, sclera, iris and eyelashes and/or identifying an initial GP region based on anatomical parameters by using an iterative pixel filtration process and generating data indicative of the initial GP region. For example, an eyelids region estimation may be implemented in parallel with the GP region. Different eye feature regions may be determined based on any geometrical shape models (e.g. ellipses or circles).
In some embodiments, identifying an initial GP region may include segmentation based on anatomy estimations, for example: segmenting each image to identify pixels related to the GP region, performing triangulation between the at least two images to determine three-dimensional parameters of the GP, estimating the location of the initial GP region based on the three-dimensional parameters of eye features, and generating data indicative of the initial region location of the GP in each image. Neural-network-based approaches may be used for identifying image data indicative of the initial GP region. GP region estimation may also be performed with a neural-network-based approach. In some embodiments, identifying image data indicative of eye features may be implemented by using machine learning. Determining (i.e. predicting) the three-dimensional parameters of the GP may include using a data recognition model based on a neural network. The method may include training the network based on the segmentation results of classical approaches, or by using an existing system for training. Generating a representation of the GP based on the series of images may be obtained by using a deep-learning network (DLN), such as an artificial neural network (ANN). For example, generating a representation of the GP may include calculating probabilities on the placement of the GP and/or generating a model of the GP, adapting a recognition classifier to the person.
More examples about the acquisition of the sensing data and determination of the geometrical parameter, such as the limbus and/or pupil perimeter, center or diameter, can be found in PCT/IL2020/050100 assigned to the assignee of the present invention.
As appreciated, the present invention advantageously enables accurate online as well as offline eye tracking by using one light beam towards the cornea, data about a geometrical parameter of the eye and a relation between the light beam and the geometrical parameter of the eye.

CLAIMS:
1. A method for tracking an eye of an individual, the method comprising: i) illuminating an eye of an individual, over an area of cornea of the eye, with an incident light beam propagating towards the cornea of the eye along an incident direction, detecting a reflected light beam propagating from the cornea of the eye along a reflected direction and determining angular offset between the incident and the reflected directions; ii) analyzing sensing data of the eye to determine a geometrical parameter of the eye; iii) repeatedly adjusting the incident direction of the incident light beam such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with the incident direction of the incident light beam; and iv) repeating steps (i) to (iii) under changes in gaze direction of the eye of the individual.
2. The method according to claim 1, wherein said step (iii) is repeated until said angular offset tends to or equals zero.
3. The method according to claim 1 or 2, further comprising applying a transformation of coordinates for determining said predetermined geometrical relationship between the geometrical parameter of the eye and the incident direction of the incident light beam.
4. The method according to any one of the preceding claims, wherein said geometrical parameter of the eye is indicative of a symmetry condition of the eye.
5. The method according to any one of claims 1 to 3, wherein said sensing data comprises stereoscopic images of the eye, and said analyzing of the sensing data to determine the geometrical parameter comprises triangulation.
6. The method according to any one of the preceding claims, wherein said geometrical parameter of the eye is a parameter of limbus region of the eye.
7. The method according to claim 6, wherein said parameter of limbus region of the eye comprises at least one of the following: a three-dimensional location of center of the limbus region, of spatial points following a diameter of the limbus region, and of spatial points following a perimeter of the limbus region.
8. The method according to claim 6, wherein said parameter of limbus region of the eye is a three-dimensional location of center of the limbus region, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the limbus region.
9. The method according to any one of claims 1 to 5, wherein said geometrical parameter of the eye is a parameter of pupil of the eye.
10. The method according to claim 9, wherein said parameter of pupil of the eye comprises at least one of the following: a three-dimensional location of center of the pupil, of spatial points following a diameter of the pupil, and of spatial points following a perimeter of the pupil.
11. The method according to claim 10, wherein said parameter of pupil of the eye is a three-dimensional location of center of the pupil, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the pupil.
12. The method according to any one of claims 1 to 3, wherein said geometrical parameter of the eye is center of fovea of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the fovea.
13. The method according to any one of claims 1 to 3, wherein said geometrical parameter of the eye is center of optic disk of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident direction passes through the center of the optic disk.
14. The method according to any one of the preceding claims, wherein said sensing data of the eye is provided at a predetermined first pace being slower than a predetermined second pace of said illuminating of the eye with the incident light beam.
15. The method according to any one of the preceding claims, wherein said analyzing of the sensing data to determine the geometrical parameter of the eye is done off-line after finishing an eye tracking session.
16. An eye tracking system comprising a control and processing unit comprising:
a light path analyzer configured and operable to receive light data indicative of incident light beam illuminating an area of cornea of an eye of an individual and a corresponding reflected light beam propagating backwardly from the cornea of the eye, and generate light path data indicative of angular offset between incident and reflected light directions of said incident and reflected light beams respectively;
a sensing data analyzer configured and operable to receive sensing data of the eye and analyze the sensing data to determine a geometrical parameter of the eye; and
an eye tracking analyzer configured and operable to generate operational data for repeatedly adjusting the incident light direction such that said angular offset is minimized and said geometrical parameter of the eye has a predetermined geometrical relationship with said incident light direction.
17. The eye tracking system according to claim 16, wherein said eye tracking analyzer is configured and operable to generate the operational data for repeatedly adjusting the incident light direction until said angular offset tends to or equals zero.
18. The eye tracking system according to claim 16 or 17, further comprising a light source configured and operable to generate said incident light beam being configured to be reflected from the cornea of the eye.
19. The eye tracking system according to claim 18, further comprising a light directing arrangement configured and operable to direct said incident light beam from said light source towards the cornea of the eye of the individual, and collect said reflected light beam propagating backwardly from the cornea of the eye.
20. The eye tracking system according to claim 19, wherein said operational data, generated by said eye tracking analyzer for adjusting the incident light direction, comprise changing location of a pivot point of the light directing arrangement.
21. The eye tracking system according to claim 19 or 20, further comprising a light detector, located at an output of said light directing arrangement, and configured and operable for detecting said reflected light beam and generating a detection output indicative thereof, said light data comprising said detection output.
22. The eye tracking system according to any one of claims 16 to 21, further comprising an imager configured and operable to provide said sensing data of the eye of the individual.
23. The eye tracking system according to claim 22, wherein said imager is configured and operable to provide stereoscopic images of the eye, thereby enabling said sensing data analyzer to determine said geometrical parameter by triangulation.
24. The eye tracking system according to any one of claims 16 to 23, wherein said geometrical parameter of the eye is indicative of a symmetry condition of the eye.
25. The eye tracking system according to any one of claims 16 to 24, wherein said geometrical parameter of the eye is a parameter of limbus region of the eye.
26. The eye tracking system according to claim 25, wherein said parameter of limbus region of the eye comprises at least one of the following: a three-dimensional location of center of the limbus region, of spatial points following a diameter of the limbus region, and of spatial points following a perimeter of the limbus region.
27. The eye tracking system according to claim 25, wherein said parameter of limbus region of the eye is a three-dimensional location of center of the limbus region, and said predetermined geometrical relationship is that extension through the eye of said incident light direction passes through the center of the limbus region.
28. The eye tracking system according to any one of claims 16 to 24, wherein said geometrical parameter of the eye is a parameter of pupil of the eye.
29. The eye tracking system according to claim 28, wherein said parameter of pupil of the eye comprises at least one of the following: a three-dimensional location of center of the pupil, of spatial points following a diameter of the pupil, and of spatial points following a perimeter of the pupil.
30. The eye tracking system according to claim 28, wherein said parameter of pupil of the eye is a three-dimensional location of center of the pupil, and said predetermined geometrical relationship is that extension through the eye of said incident light direction passes through the center of the pupil.
31. The eye tracking system according to any one of claims 16 to 22, wherein said geometrical parameter of the eye is center of fovea of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident light direction passes through the center of the fovea.
32. The eye tracking system according to any one of claims 16 to 22, wherein said geometrical parameter of the eye is center of optic disk of the eye, and said predetermined geometrical relationship is that extension through the eye of said incident light direction passes through the center of the optic disk.
33. The eye tracking system according to any one of the claims 16 to 32, wherein said sensing data of the eye is provided to the control and processing unit at a predetermined first pace being slower than a predetermined second pace in which the light data is provided to the control and processing unit.
34. The eye tracking system according to any one of the claims 16 to 33, wherein said analyzing of the sensing data by the sensing data analyzer to determine the geometrical parameter of the eye is done off-line after finishing an eye tracking session.
35. A method for tracking an eye of an individual, the method comprising:
i) performing online tracking comprising:
a) defining an instantaneous pivot point in space as an intersection point for a plurality of incident light beams illuminating an eye of an individual over an area of cornea of the eye,
b) illuminating the eye with an incident light beam, of the plurality of incident light beams, propagating along an incident direction; detecting a reflected light beam propagating from the cornea of the eye along a reflected direction; determining an angular offset between the incident and the reflected directions; and repeatedly adjusting the incident direction of the incident light beam such that said angular offset tends to zero, and
c) providing, at least partially simultaneously, sensing data indicative of an instantaneous geometrical parameter of the eye [e.g. limbus center];
ii) performing subsequent off-line analysis comprising:
a) providing data indicative of location of center of the cornea of the eye, and
b) determining instantaneous geometrical relationships between said instantaneous pivot point, instantaneous geometrical parameter and data indicative of the location of center of the cornea of the eye, to determine instantaneous gaze direction of the individual.
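The patent text contains no code; the following is an illustrative numerical sketch of the closed-loop steering of claims 1-2 under simplifying assumptions. It models the cornea as a sphere (7.8 mm is a typical anterior corneal radius, not a value stated in the claims) and steers the incident beam until the reflected beam comes straight back, i.e. until the angular offset vanishes, which on a spherical surface happens only when the incident ray's extension passes through the corneal center of curvature. The function and variable names (`track_cornea_center`, `gain`, etc.) are invented for this sketch.

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of unit direction d about unit surface normal n."""
    return d - 2.0 * np.dot(d, n) * n

def track_cornea_center(source, cornea_center, radius=7.8e-3,
                        steps=60, gain=0.5):
    """Repeatedly adjust the incident direction (claim 1, step iii) until
    the retro-reflection error vanishes.  Returns the final unit incident
    direction, whose extension then passes through the corneal center of
    curvature.  All coordinates in meters."""
    # deliberately off-axis initial aim point
    aim = cornea_center + np.array([2e-3, -1e-3, 0.0])
    d = None
    for _ in range(steps):
        d = aim - source
        d = d / np.linalg.norm(d)
        # nearest intersection of the ray with the spherical corneal surface
        oc = source - cornea_center
        b = np.dot(d, oc)
        t = -b - np.sqrt(b * b - (np.dot(oc, oc) - radius ** 2))
        hit = source + t * d
        n = (hit - cornea_center) / radius
        r = reflect(d, n)
        # observable deviation between incident beam and back-propagating
        # reflected beam; zero means exact retro-reflection
        err = (-r) - d
        if np.linalg.norm(err) < 1e-12:
            break
        # small corrective steer of the aim point
        aim = aim + gain * radius * err
    return d
```

In a real system the corrective steer would be driven by the detector output rather than a geometric model, but the feedback structure (measure offset, nudge the incident direction, repeat) is the same.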
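Claims 5 and 23 determine the geometrical eye parameter from stereoscopic images by triangulation. As an illustration only (the patent does not specify the algorithm), a minimal midpoint triangulation of two calibrated viewing rays can be sketched as follows; `triangulate` and its arguments are names invented here:

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Midpoint triangulation: the point closest in the least-squares
    sense to two rays p_i + t_i * d_i, e.g. the rays from two calibrated
    cameras toward the same eye feature (limbus or pupil center)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # normal equations for t1, t2 minimizing |(p1 + t1*d1) - (p2 + t2*d2)|
    a = np.array([[np.dot(d1, d1), -np.dot(d1, d2)],
                  [np.dot(d1, d2), -np.dot(d2, d2)]])
    b = np.array([np.dot(p2 - p1, d1), np.dot(p2 - p1, d2)])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```

Applied to spatial points following the limbus diameter or perimeter (claims 7, 10, 26, 29), the same routine yields a three-dimensional point set from which a center can be fitted.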
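For the off-line analysis of claim 35, once the corneal center of curvature and an instantaneous geometrical parameter such as the limbus center are both known, the optical axis connecting them gives a gaze estimate. The sketch below assumes (as the claims imply but do not spell out) that both points lie on the eye's optical axis; the function name is invented, and the offset between the optical and visual axes (the per-user kappa angle) is deliberately ignored here, since it would come from a calibration step:

```python
import numpy as np

def gaze_direction(cornea_center, limbus_center):
    """Unit vector along the eye's optical axis, pointing outward from
    the corneal center of curvature through the limbus center.  The
    limbus center lies anterior to the center of curvature, so this
    vector points in the viewing direction."""
    g = limbus_center - cornea_center
    return g / np.linalg.norm(g)
```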
PCT/IL2021/050643 2020-07-28 2021-05-31 Eye tracking systems and methods WO2022024104A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL276354 2020-07-28
IL276354A IL276354A (en) 2020-07-28 2020-07-28 Eye tracking systems and methods

Publications (1)

Publication Number Publication Date
WO2022024104A1 true WO2022024104A1 (en) 2022-02-03

Family

ID=80035407

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2021/050643 WO2022024104A1 (en) 2020-07-28 2021-05-31 Eye tracking systems and methods

Country Status (3)

Country Link
IL (1) IL276354A (en)
TW (1) TW202207865A (en)
WO (1) WO2022024104A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180246336A1 (en) * 2015-09-02 2018-08-30 Eyeway Vision Ltd. Eye projection system and method
US20190361250A1 (en) * 2017-12-18 2019-11-28 Facebook Technologies, Llc Eye tracking for pupil steering in head-mounted displays using eye tracking sensors
US20200104589A1 (en) * 2018-09-28 2020-04-02 Apple Inc. Sensor Fusion Eye Tracking

Also Published As

Publication number Publication date
TW202207865A (en) 2022-03-01
IL276354A (en) 2022-02-01

Similar Documents

Publication Publication Date Title
CN107533362B (en) Eye tracking device and method for operating an eye tracking device
US8366273B2 (en) Iris image definition estimation system using the astigmatism of the corneal reflection of a non-coaxial light source
US6659611B2 (en) System and method for eye gaze tracking using corneal image mapping
US7025459B2 (en) Ocular fundus auto imager
JP5555258B2 (en) Adaptive optical scanning ophthalmoscope and method
US20220100268A1 (en) Eye tracking device and a method thereof
WO2015014058A1 (en) System for detecting optical parameter of eye, and method for detecting optical parameter of eye
MXPA03002692A (en) Method for determining distances in the anterior ocular segment.
US20020154269A1 (en) Stereoscopic measurement of cornea and illumination patterns
JP7104516B2 (en) Tomographic imaging device
US7360895B2 (en) Simplified ocular fundus auto imager
EP3821791A1 (en) Ophthalmologic imaging apparatus
US11786119B2 (en) Instant eye gaze calibration systems and methods
JP2023120308A (en) Image processing method, image processing device, and image processing program
Dera et al. Low-latency video tracking of horizontal, vertical, and torsional eye movements as a basis for 3dof realtime motion control of a head-mounted camera
US20060152676A1 (en) Ophthalmological appliance comprising an eye tracker
CN111989030A (en) Image processing method, program, and image processing apparatus
WO2022024104A1 (en) Eye tracking systems and methods
US20220414845A1 (en) Ophthalmic apparatus, method of controlling the same, and recording medium
JP2021166817A (en) Ophthalmologic apparatus
WO2021117031A1 (en) Eye tracking systems and methods
JP2022549561A (en) Patient-induced triggering of measurements for ophthalmic diagnostic devices
JP2018015021A (en) Ophthalmologic apparatus
WO2022085501A1 (en) Ophthalmic device, control method therefor, program, and recording medium
WO2019203314A1 (en) Image processing method, program, and image processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21851268

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28.06.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21851268

Country of ref document: EP

Kind code of ref document: A1