WO1999005988A2 - Eye movement detection system using an off-axis ring illumination source - Google Patents

Eye movement detection system using an off-axis ring illumination source

Info

Publication number
WO1999005988A2
Authority
WO
WIPO (PCT)
Prior art keywords
eye
camera
axis
subject
image
Prior art date
Application number
PCT/US1998/015920
Other languages
English (en)
Other versions
WO1999005988A3 (fr)
Inventor
Joshua D. Borah
Charles Valois
Original Assignee
Applied Science Laboratories
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Applied Science Laboratories filed Critical Applied Science Laboratories
Publication of WO1999005988A2
Publication of WO1999005988A3

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2213/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B2213/02 Viewfinders
    • G03B2213/025 Sightline detection

Definitions

  • the invention relates to an eye tracker for determining line-of-gaze of a subject and, more particularly, it relates to an eye tracker that uses an improved light source for illuminating the subject's eye.
  • All practical eye movement measurement or eye tracking techniques involve tracking one or more features or reflections that can be optically detected on the eye.
  • Many of the systems that are available fall into one of two categories, namely, methods that detect a single feature on the subject's eye and methods that detect two features on the subject's eye.
  • One way of establishing a feature is by reflecting a light source off of the eye.
  • One feature that has been often used for this purpose is the pupil.
  • the equipment determines the position of the pupil center. Through a simple mathematical transformation, changes in the position of the center of the pupil can be easily converted to an indication of the line-of-gaze of the subject.
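The simple transformation described above can be sketched as a linear conversion from pupil-center displacement to gaze angles; the reference position and the degrees-per-pixel scale factor below are hypothetical calibration constants, not values from the patent:

```python
def gaze_from_pupil_center(px, py, ref_x, ref_y, deg_per_pixel=0.1):
    """Convert a pupil-center displacement (in pixels) from a calibrated
    reference position into approximate azimuth/elevation gaze angles.
    deg_per_pixel is a hypothetical per-system calibration constant."""
    azimuth = (px - ref_x) * deg_per_pixel      # positive = rightward
    elevation = (py - ref_y) * deg_per_pixel    # sign follows image rows
    return azimuth, elevation

# Pupil center found 30 px right of and 10 px above the reference:
az, el = gaze_from_pupil_center(350, 230, 320, 240)
```

A real system would calibrate the scale factors per subject rather than assume a single constant.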
  • the retina is highly reflective, but any light reflected back through the pupil will be directed towards its original source. In fact, if the eye is focused at the plane of the source, such retro-reflected light from the retina will be imaged back at the source. Under normal viewing conditions, the retina looks completely black because none of the rays reflected off of the retina return to the observer. If, however, the observer is able to look along the axis of an illumination beam, then the observer will see the retinal reflection and the pupil will appear bright.
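Once the retro-reflection makes the pupil the brightest region in the image, locating it reduces to thresholding and taking a centroid. A minimal sketch on a toy grayscale image; the threshold value is an assumption for this example, not from the patent:

```python
def pupil_centroid(image, threshold=200):
    """Return the (col, row) centroid of bright pixels in a grayscale
    image given as a list of rows of 0-255 values, or None if no pixel
    exceeds the threshold."""
    xs, ys = [], []
    for r, row in enumerate(image):
        for c, val in enumerate(row):
            if val >= threshold:
                xs.append(c)
                ys.append(r)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# A toy 5x5 image with a bright 2x2 "pupil" near the centre:
img = [[10] * 5 for _ in range(5)]
img[2][2] = img[2][3] = img[3][2] = img[3][3] = 250
center = pupil_centroid(img)
```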
  • a beam splitter which typically is a prism with a 45° reflecting surface.
  • the viewing camera looks through the beam splitter (and through the reflecting surface) at the subject's eye.
  • the illumination source which is off to the side of the viewing axis, directs light at the 45° reflecting surface of the beam splitter which reflects that light along and coaxial with the viewing axis toward the subject.
  • the beam splitter adds to the bulk of the device, and it tends to attenuate the light reflected back from the subject's eye to the camera.
  • the invention is a camera unit for use in an eye tracking apparatus.
  • the camera unit includes a camera with a lens having an image axis; and a ring shaped light source disposed around the image axis and near the periphery of the lens aperture.
  • the light source is oriented to direct light along the camera axis toward the target.
  • the invention is an eye line-of-gaze measurement apparatus including an electronic camera with a lens having an image axis; a ring shaped light source disposed around the image axis and near the periphery of the lens aperture and oriented to direct light along the camera axis toward the target; and a digital processor programmed to determine an eye line-of-gaze from the image of a retro-reflection obtained from the subject's eye.
  • the ring-shaped light source includes an array of lights arranged in a circle to form a ring.
  • the plurality of light sources are evenly spaced about the circle.
  • the light sources are LED's.
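As an illustration of the geometry, the centres of such an evenly spaced ring of LED's can be computed directly; the 0.73 inch ring diameter is the figure given later for the described embodiment, and the coordinate frame (origin on the lens axis) is assumed:

```python
import math

def ring_positions(n_leds=8, ring_diameter=0.73):
    """(x, y) centres, in inches, of n LEDs evenly spaced on a ring
    centred on the lens axis at the origin."""
    r = ring_diameter / 2
    return [(r * math.cos(2 * math.pi * k / n_leds),
             r * math.sin(2 * math.pi * k / n_leds))
            for k in range(n_leds)]

leds = ring_positions()   # eight positions, all at radius 0.365 in
```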
  • the invention is a method of generating a retro-reflection from a subject's eye for use in a line-of-gaze measurement system that utilizes a bright pupil detection technique.
  • the method includes the steps of producing an image of the subject's eye by using a camera that is characterized by a viewing axis; and illuminating the subject's eye with an off-axis illumination to produce a retro-reflection from the retina of the subject's eye.
  • the step of illuminating the subject's eye involves illuminating with a light source that is distributed around the viewing axis.
  • Another advantage of the invention is that it provides a larger and brighter illumination source which both improves the accuracy of the eye tracking system and makes the overall system less sensitive to variations in ambient light conditions.
  • the improved light source more effectively produces a retro-reflection from the retina of the subject's eye. Also, it produces a large enough reflection off of the cornea and at the same time it produces a bright enough retinal retro-reflection from which the point-of-gaze determinations can be made.
  • Yet another advantage of the invention is that it eliminates the need for inserting between the viewing camera and the subject's eye a beam splitter which also acts to attenuate the intensity of the image obtained by the camera.
  • FIG. 1 is a block diagram of representative components of an eye tracker system including a camera unit which has a ring illumination source;
  • Fig. 2 is a front view of the ring illumination source that is mounted on the front of the camera unit shown in Fig. 1;
  • Fig. 3 is an embodiment of an eye-head tracker system including a head-mounted scene camera, a stationary scene camera, and an eye tracker camera which includes the ring illumination source; and
  • Fig. 4 is an alternative embodiment of an eye-head tracker system including a stationary scene camera.
  • an eye tracker system with an improved illumination source includes a solid state camera 10, a ring light source 12 for illuminating the subject's eye 14 (also shown from a front perspective in Fig. 2), and a digital processor 16 for executing the programs which process the camera image to locate the pupil, determine its center, and then compute the line-of-gaze of the subject.
  • the novel aspects of the system are in the design and construction of the ring light source that is used to illuminate the subject's eye and its relationship with the camera lens.
  • the camera can be any commercially available system that is suitable for generating an image of the eye.
  • it can include a solid state linear or two-dimensional array device.
  • the arrays can be made by clustering photocells in tightly packed linear or two- dimensional arrangements.
  • they can include arrays of thousands of charge coupled devices (CCD's) or charge injection devices (CID's) such as are found in commercially available solid state cameras or video cameras.
  • the camera is a Sony EVI-D30 pan/tilt color camera, which is a compact solid state device and which uses a CCD array.
  • the Sony camera that is used in the described embodiment includes a lens system 20 which focuses the image onto a CCD array 22.
  • It also includes internal electronic circuitry 24 which converts the signals from the CCD array into a form that is displayable and is analyzable by the image processing components of the system.
  • any camera that generates an image which can be processed and analyzed to extract feature information would be acceptable.
  • because the camera was operated at a distance of between 18 and 40 inches from the subject's eye, we further modified the lens system within the Sony camera to magnify the image to a size that was more useful to the image processing system.
  • This involved adding a positive lens 28 before internal lens system 20 and adding a negative lens 30 after the lens system 20.
  • the added front lens was a plano-convex lens obtained from Edmund Scientific (part no. 44020) with a 20.5 mm diameter and a 415 mm focal length.
  • the added back lens was a negative lens also obtained from Edmund Scientific (part no. 44090) with a 9.2 mm diameter and a negative focal length of 41 mm.
  • we further modified the Sony camera by building an addition to the front of the camera which included a plate 40 supporting the ring of illumination 13 that encircles and is in close proximity to the lens opening.
  • the lens system is contained in a housing 21 onto which plate 40 is mounted.
  • Plate 40 has a centrally located circular hole 45 through which the camera views the target scene.
  • the hole in plate 40 is approximately the size of the aperture of the lens. More specifically, the hole is made as small as possible without either compromising the light gathering efficiency of the lens or the quality of the image that is produced.
  • Surrounding hole 45 there is an array of eight evenly spaced LED's 13 mounted on the plate and oriented to direct light toward the target scene.
  • the LED's have built-in lenses which produce a narrower beam than would be generated by the device without the lens.
  • the LED's produce light that is in the near infra-red region, though of course, light of other wavelengths could also be used.
  • the advantage of near infra-red light is that, since it is not visible to people, it does not distract the subject.
  • the LED's were devices sold by Siemens Corporation (part no. SFH 484) which produce a 16° beam at a center wavelength of about 880 nm.
  • the lens aperture formed in the plate was about 0.5 inch and the diameter of the ring of light was slightly bigger, e.g., 0.73 inch.
  • the size of the array should not be so small as to interfere with the efficient operation of the lens. If the ring diameter is too large, then the light source will become less effective at producing the retro-reflection. Indeed, at a certain diameter, it will lose all ability to produce a retro-reflection that can be observed by the camera.
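The trade-off above can be quantified with simple geometry: at the stated working distances, each LED on the 0.73 inch ring sits only a fraction of a degree to about one degree off the viewing axis as seen from the eye, which is why the off-axis source can still capture the narrow retro-reflection cone. The calculation below is illustrative, not from the patent:

```python
import math

def off_axis_angle_deg(ring_diameter, distance):
    """Angle in degrees between the camera axis and a ring LED,
    as seen from the eye at the given distance (same units)."""
    return math.degrees(math.atan((ring_diameter / 2) / distance))

near = off_axis_angle_deg(0.73, 18)   # closest stated distance, ~1.2 deg
far = off_axis_angle_deg(0.73, 40)    # farthest stated distance, ~0.5 deg
```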
  • the array of closely spaced LED's forms a ring of light surrounding the lens aperture. It should be understood, however, that any effective way of producing a ring of light would be acceptable. For example, one could use a single light source and a group of optical fibers to produce the array of individual light sources. Or alternatively, one could use an optical lens system to produce the ring of light. It should also be noted that the retro-reflection can be generated using less than a complete ring of light. Indeed, what was surprising is that it was possible to effectively produce the retro-reflection with an off-axis light source.
  • the eye tracking system is not limited to using only a single feature of the eye to compute line-of-gaze.
  • the system may use multiple features also including, for example, corneal reflection of the light source.
  • with corneal reflection, the feature is typically at a different position from the center of the pupil.
  • Tracking only the position of a single landmark or feature of the eye does not permit the system to distinguish between eye rotation and eye translation with respect to the camera.
  • further information must be provided such as could be obtained from a head tracker that indicates the position of the head with respect to the camera.
  • Another source of information can be a second feature on the eye, e.g. corneal reflection. Since the second feature is at a different location from the first feature, the system can eliminate the ambiguity between translation and rotation.
  • the techniques for using two features to perform eye tracking are well known in the art and will not be described here. But it should be understood that the invention is meant to cover systems and methods which use these other techniques in addition to the bright pupil technique.
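The two-feature idea can be sketched as a difference vector between the detected pupil centre and corneal-reflection centre; the pixel coordinates below are hypothetical:

```python
def pupil_cr_vector(pupil_center, cr_center):
    """Pupil-to-corneal-reflection vector in pixels. Both features
    shift together under head translation but separate under eye
    rotation, so the difference suppresses the translation ambiguity."""
    return (pupil_center[0] - cr_center[0],
            pupil_center[1] - cr_center[1])

# Pure head translation: both features move by the same offset,
# leaving the vector unchanged.
v1 = pupil_cr_vector((320, 240), (310, 235))
v2 = pupil_cr_vector((340, 250), (330, 245))
```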
  • Fig. 3 shows a typical eye tracking system which uses the improved light source for bright pupil monitoring.
  • a person whose point of gaze is being measured wears a helmet 110 on which are mounted a visor 112, an eye tracker sensor and optics unit 114, having the features described above, and a head mounted scene camera 116.
  • Visor 112 is coated to be very reflective in the near infra-red but transparent in the visual spectrum. This allows the person to look through it while, at the same time, eye tracker sensor and optics unit 114 is able to "look" at a reflection of the person's eye and head-mounted scene camera 116 is able to see a reflection of the field of view of the subject.
  • a stationary scene camera 118 is mounted on the floor, within proximity of the person. It may be mounted on a tripod, as shown, or fixed to the environment in some other way. Stationary scene camera 118 is aimed so that one or more of the surfaces of interest (e.g. scene planes 120 and 122) are visible in the camera video image. Scene planes 120 and 122 may be instruments on a control panel, visual presentations, or any other regions of visual interest. When the user looks forward, scene planes 120 and 122 are within the field of view of both the person and head-mounted scene camera 116.
  • Eye tracker sensor and optics unit 114 which incorporates the off-axis light source (e.g. the ring light source) produces a video image that is preprocessed and digitized by an eye tracker electronics unit 124 and sent to a computer 126.
  • Computer 126 which is programmed appropriately, uses the resulting digital information from unit 124 to determine the relative locations of the pupil center and the reflection of the near infrared light source on the front surface of the cornea. From the pupil-to-corneal reflection vector, computer 126 determines the pointing direction of the eye with respect to the head mounted optics.
  • the pointing direction of the eye is represented by two coordinates in computer memory that are proportional to eye azimuth and eye elevation angle, respectively (or the equivalent).
  • eye tracker electronics unit 124 and computer 126 are both part of an eye-head tracker processor 128, which may be commercially obtained from Applied Science Group, Inc. of Waltham, MA, and is identified as the ASL model 4100H-EHS eye-head tracker system.
  • the pupil to corneal reflection technique for measuring eye pointing direction is described in the literature, and is well-known to those skilled in the art. (See, for example, Young and Sheena, Methods & Designs, Survey of eye movement recording methods, Behavior Research Methods and Instrumentation, 1975, Vol. 7(5), 397-492; Merchant & Morrisette, Remote measurement of eye direction allowing subject motion over one cubic foot of space, IEEE Transactions on Biomedical Engineering, 1974, BME-21, 309-317; and Borah, "Helmet Mounted Eye Tracking for Virtual Panoramic Display Systems", AAMRL-TR-89-019, Harry B. Armstrong Aerospace Medical Research Laboratory, Human Systems Division, Air Force Systems Command, Wright-Patterson AFB, August 1989.)
  • Computer 126 maps the resulting coordinate values to a different set of coordinates which represent a horizontal and vertical location on the video image from head mounted scene camera 116.
  • because head-mounted scene camera 116 moves along with the head and is optically located at the same (or nearly the same) position as the eye, it remains part of the same reference frame as the eye position detection system. In other words, there is a unique relation between eye pointing direction with respect to the head, and point of gaze with respect to the scene camera image.
  • there are various techniques for mapping eye azimuth and elevation values to such a scene camera image field, including interpolation techniques and curve fit techniques. These techniques are described in the literature, and are well-known to those practiced in the art.
  • the technique used in the preferred embodiment is a curve fit technique (see, e.g., Sheena & Borah, "Compensation for Some Second Order Effects to Improve Eye Position Measurements," in D.F. Fisher, R.A. Monty, and J.W. Senders (Eds.): Eye Movements: Cognition and Visual Perception, L. Erlbaum Assoc., 1981).
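A one-dimensional sketch of such a curve-fit mapping, using an exact second-order fit through three hypothetical calibration pairs; the actual method fits a full two-dimensional mapping and is not reproduced here:

```python
def fit_quadratic(points):
    """Exact fit of s = a + b*e + c*e**2 through three calibration
    pairs (eye_coordinate, scene_coordinate), via Cramer's rule."""
    (e0, s0), (e1, s1), (e2, s2) = points

    def det(m):  # determinant of a 3x3 matrix
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det([[1, e0, e0 ** 2], [1, e1, e1 ** 2], [1, e2, e2 ** 2]])
    a = det([[s0, e0, e0 ** 2], [s1, e1, e1 ** 2], [s2, e2, e2 ** 2]]) / d
    b = det([[1, s0, e0 ** 2], [1, s1, e1 ** 2], [1, s2, e2 ** 2]]) / d
    c = det([[1, e0, s0], [1, e1, s1], [1, e2, s2]]) / d
    return a, b, c

# Hypothetical horizontal calibration: eye azimuth (deg) -> scene x (px)
a, b, c = fit_quadratic([(-10, 40), (0, 320), (10, 580)])
predict = lambda e: a + b * e + c * e ** 2
```

The second-order term absorbs the mild nonlinearity that a purely linear map would leave as residual error at the edges of the field.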
  • the position coordinates of point of gaze with respect to the head mounted scene camera image can be displayed on a scene monitor 130 as a cursor, cross hairs, or other indicator, superimposed on the video image from head-mounted scene camera 116.
  • the ASL model 4100H-EHS includes the capability for such cursor superimposition.
  • many other commercially available devices exist which superimpose cursors, cross hairs, or other symbols on a video signal at specified locations, whose coordinates are available in computer memory, and such devices can be used. Note that use of head-mounted scene camera 116 is not a necessary part of the system but is described primarily because it is readily available and is a common part of some eye tracking systems.
  • the eye tracking system in Fig. 3 also includes a head tracker which determines the position and orientation of the person's head.
  • the head tracker is a device based on magnetic principles, such as the 3Space Tracker available from Polhemus, a Kaiser Aerospace & Electronics Company, or The Bird™ available from Ascension Technology, Inc.
  • Other possible embodiments could utilize mechanical goniometers; ultrasonic devices, such as one offered commercially by Logitech, Inc.; optical devices; or any other device that can be used to measure head position and orientation.
  • the magnetic head tracker (MHT) shown in Fig. 3 includes an MHT sensor 132, an MHT transmitter 134, and an MHT control unit 136.
  • MHT sensor 132 is fastened to the subject's helmet, and MHT transmitter 134 is fixed to the environment near the subject's head.
  • MHT control unit 136 determines the position of MHT sensor 132 with respect to MHT transmitter 134 in 6 degrees of freedom and communicates this information to computer 126 via an RS-232 interface.
  • a program in computer 126 uses information from the head tracker, information about eye line of gaze with respect to the head (computed as described above), and stored information about the location of surfaces in the environment (such as scene plane 120 and scene plane 122), to determine the location and direction of the eye line of gaze vector with respect to the environment, the surface intersected by the line of gaze vector, and the location of the intersection point (point P in Fig. 3) with respect to the surface intersected.
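Determining the surface intersected reduces, per scene plane, to a ray-plane intersection test. A minimal sketch with hypothetical coordinates (eye position as ray origin, gaze direction from the tracker, each plane given by a point and a normal):

```python
def intersect_plane(origin, direction, plane_point, plane_normal):
    """Return the 3-D point where the gaze ray meets the plane,
    or None if the ray is parallel to it or the plane lies behind
    the ray origin."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-12:
        return None                      # gaze parallel to plane
    t = sum((p - o) * n for p, o, n in
            zip(plane_point, origin, plane_normal)) / denom
    if t < 0:
        return None                      # plane is behind the eye
    return tuple(o + t * d for o, d in zip(origin, direction))

# Eye at the origin, gazing mostly along +z toward a wall at z = 2 m:
point = intersect_plane((0.0, 0.0, 0.0), (0.1, 0.0, 1.0),
                        (0.0, 0.0, 2.0), (0.0, 0.0, 1.0))
```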
  • this data field, which is identified as "RS-232 data output", includes the number of the scene plane being viewed, and the horizontal and vertical coordinates of point of gaze on that surface (with respect to a coordinate frame predefined on that surface).
  • the data can be read by an external device on a standard RS-232 serial data port.
  • a new data field is available at the same update rate as that being used by the camera imaging the subject's eye. This is generally 60 times per second in the USA, when the eye tracker employs standard NTSC video format cameras, or 50 times per second in Europe or other countries, when the eye tracker employs cameras with standard European PAL video format.
  • Computer 126 uses the point of gaze information to determine the location of gaze within the viewed scene as shown by the video scene monitor and it superimposes a cursor or cross hairs on the image displayed on video scene monitor 130.
  • the system can either use an appropriate set of transformations to map the point of gaze onto the scene image, or it can first calibrate to establish a reference point on the scene image. This latter approach involves having the subject look at a fixed reference point in the image scene to determine a reference line-of-gaze direction associated with that point. Then any changes in the line-of-gaze can be readily translated into an appropriate change in the point of gaze in the image scene.
  • An alternate embodiment of the stationary scene camera implementation is shown in Fig. 4.
  • the standard RS-232 data output available from eye-head tracker processor 128 (e.g., the ASL model 4100-EHS) is provided to an external computer 140.
  • External computer 140 is equipped with an NTSC/VGA conversion board 142 such as the Redlake model NTSC 100 Video Digitizer and VGA Overlay Controller.
  • This commercially available board allows computer 140 to display an image from a standard NTSC format video camera 118 on a computer VGA screen 144, and it also allows computer 140 to superimpose VGA graphics on this image.
  • Computer 140 also includes a mouse 146 (or other pointing device) that enables the user to move the cursor about on the video image and it includes programming capable of capturing and recording in memory the VGA coordinates of the cursor at the location at which the mouse is clicked.
  • the stationary scene camera video image is input to NTSC/VGA conversion board 142 in external computer 140.
  • a program in the external computer superimposes a cursor, cross hairs, or other indicator showing the subject's point of gaze, on the VGA image from stationary scene camera 118.

Abstract

The invention concerns a camera unit for use in an eye movement detection apparatus, the camera having a lens with an image axis, and a ring-shaped light source placed around the image axis and near the periphery of the lens aperture, the light source being oriented so as to direct light along the camera axis toward the target.
PCT/US1998/015920 1997-07-30 1998-07-29 Eye movement detection system using an off-axis ring illumination source WO1999005988A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US90333397A 1997-07-30 1997-07-30
US08/903,333 1997-07-30

Publications (2)

Publication Number Publication Date
WO1999005988A2 true WO1999005988A2 (fr) 1999-02-11
WO1999005988A3 WO1999005988A3 (fr) 1999-04-08

Family

ID=25417323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/015920 WO1999005988A2 (fr) 1997-07-30 1998-07-29 Eye movement detection system using an off-axis ring illumination source

Country Status (1)

Country Link
WO (1) WO1999005988A2 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8885877B2 (en) 2011-05-20 2014-11-11 Eyefluence, Inc. Systems and methods for identifying gaze tracking scene reference locations
US8911087B2 (en) 2011-05-20 2014-12-16 Eyefluence, Inc. Systems and methods for measuring reactions of head, eyes, eyelids and pupils

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4699481A (en) * 1984-09-01 1987-10-13 Canon Kabushiki Kaisha Stereoscopic microscope
US5526148A (en) * 1994-08-02 1996-06-11 Moffat; Robert J. Apparatus and method for full-field calibration of color response to temperature of thermochromic liquid crystals
US5585872A (en) * 1993-06-03 1996-12-17 Canon Kabushiki Kaisha Ophthalmic measuring apparatus for determining the shape of the cornea of an eye


Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001074236A1 (fr) * 2000-03-31 2001-10-11 University Technologies International Inc. Test diagnostic pour les troubles deficitaires de l'attention
FR2846120A1 (fr) * 2002-10-21 2004-04-23 Inst Sciences De La Vision Procede de demasquage de l'information visuelle presente sous une zone visuelle peu ou pas fonctionnelle et le dispositif de mise en oeuvre
WO2004038662A1 (fr) * 2002-10-21 2004-05-06 Institut Des Sciences De La Vision Procede de demasquage de l'information visuelle presente sous une zone visuelle peu ou pas fonctionnelle de l’oeil d’un sujet et dispositif de mise en oeuvre du procede
WO2004060153A1 (fr) * 2002-12-19 2004-07-22 Bausch & Lomb Incorporated Systeme permettant de suivre le deplacement d'un objet spherique
JP2006511305A (ja) * 2002-12-19 2006-04-06 ボシュ・アンド・ロム・インコーポレイテッド 球形対象の運動の追跡のためのシステム
US10039445B1 (en) 2004-04-01 2018-08-07 Google Llc Biosensors, communicators, and controllers monitoring eye movement and methods for using them
WO2007037751A1 (fr) * 2005-09-27 2007-04-05 Penny Ab Dispositif de commande d’unité externe
US8587514B2 (en) 2005-09-27 2013-11-19 Penny Ab Device for controlling an external unit
JP2007289658A (ja) * 2006-03-27 2007-11-08 Fujifilm Corp 画像出力装置、画像出力方法、および画像出力プログラム
EP2004039A4 (fr) * 2006-03-27 2011-12-28 Fujifilm Corp Appareil, procédé et programme de production d'image
US8243132B2 (en) 2006-03-27 2012-08-14 Fujifilm Corporation Image output apparatus, image output method and image output computer readable medium
US9498123B2 (en) 2006-03-27 2016-11-22 Fujifilm Corporation Image recording apparatus, image recording method and image recording program stored on a computer readable medium
EP2004039A1 (fr) * 2006-03-27 2008-12-24 FUJIFILM Corporation Appareil, procédé et programme de production d'image
WO2011024134A1 (fr) 2009-08-26 2011-03-03 Ecole Polytechnique Federale De Lausanne (Epfl) Systèmes portés sur le corps pour surveillance audio, visuelle et du regard
WO2013014084A1 (fr) 2011-07-25 2013-01-31 Leica Geosystems Ag Dispositif de mesure commandable sans contact et son procédé de commande
EP2551636A1 (fr) 2011-07-25 2013-01-30 Leica Geosystems AG Dispositif de mesure pouvant être commandé sans contact et son procédé de commande
US8929589B2 (en) 2011-11-07 2015-01-06 Eyefluence, Inc. Systems and methods for high-resolution gaze tracking
US11412928B2 (en) 2017-08-11 2022-08-16 Carl Zeiss Meditec, Inc. Systems and methods for improved ophthalmic imaging
US11000187B2 (en) 2017-09-07 2021-05-11 Carl Zeiss Meditec, Inc. Systems and methods for improved montaging of ophthalmic imaging data
US11194161B2 (en) 2018-02-09 2021-12-07 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11340461B2 (en) 2018-02-09 2022-05-24 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11393251B2 (en) 2018-02-09 2022-07-19 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11556741B2 (en) 2018-02-09 2023-01-17 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters using a neural network
US11372476B1 (en) 2018-02-20 2022-06-28 Rockwell Collins, Inc. Low profile helmet mounted display (HMD) eye tracker
WO2019187808A1 (fr) * 2018-03-29 2019-10-03 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JPWO2019187808A1 (ja) * 2018-03-29 2021-04-01 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
US11537202B2 (en) 2019-01-16 2022-12-27 Pupil Labs Gmbh Methods for generating calibration data for head-wearable devices and eye tracking system
US11676422B2 (en) 2019-06-05 2023-06-13 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters

Also Published As

Publication number Publication date
WO1999005988A3 (fr) 1999-04-08

Similar Documents

Publication Publication Date Title
WO1999005988A2 (fr) Eye movement detection system using an off-axis ring illumination source
US5345281A (en) Eye tracking system and method
US6433760B1 (en) Head mounted display with eyetracking capability
EP3485356B1 (fr) Eye tracking based on the polarization of light
US6091378A (en) Video processing methods and apparatus for gaze point tracking
US7533989B2 (en) Sight-line detection method and device, and three-dimensional view-point measurement device
US6659611B2 (en) System and method for eye gaze tracking using corneal image mapping
US10692224B1 (en) Estimation of absolute depth from polarization measurements
US6578962B1 (en) Calibration-free eye gaze tracking
US6637883B1 (en) Gaze tracking system and method
CN100421614C (zh) Method and device for detecting and tracking eyes and their gaze direction
US9033502B2 (en) Optical measuring device and method for capturing at least one parameter of at least one eye wherein an illumination characteristic is adjustable
CN103429139B (zh) 具有可调整视场的眼镜装置和方法
US6394602B1 (en) Eye tracking system
US20110170061A1 (en) Gaze Point Tracking Using Polarized Light
Duchowski et al. Eye tracking techniques
US5237351A (en) Visual target apparatus
US20080137909A1 (en) Method and apparatus for tracking gaze position
US10109067B2 (en) Corneal sphere tracking for generating an eye model
JP4500992B2 (ja) Three-dimensional viewpoint measurement device
KR20190004806A (ko) Face and eye tracking and facial animation using face sensors of a head-mounted display
EP3542308B1 (fr) Method and device for acquiring an eye measurement
US20030108350A1 (en) System and method for measuring the refraction
CN110300976A (zh) Eye gaze tracking
EP4158447A1 (fr) Systems and methods for providing mixed reality experiences in low-light conditions

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

AK Designated states

Kind code of ref document: A3

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA