WO2015027289A1 - Method and apparatus for eye detection from glints - Google Patents

Method and apparatus for eye detection from glints

Info

Publication number
WO2015027289A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
frames
reflections
specular
time series
Application number
PCT/AU2014/000868
Other languages
English (en)
Inventor
Sebastian Rougeaux
Original Assignee
Seeing Machines Limited
Priority claimed from AU2013903337A (external priority: AU2013903337A0)
Application filed by Seeing Machines Limited filed Critical Seeing Machines Limited
Priority to CN201480059776.9A, publication CN105765608A (zh)
Priority to EP14840504.6A, publication EP3042341A4 (fr)
Priority to US14/916,082, publication US10552675B2 (en)
Priority to JP2016539360A, publication JP2016532217A (ja)
Publication of WO2015027289A1 (fr)

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G06V40/193 - Preprocessing; Feature extraction

Definitions

  • The present invention relates to the field of object detection and monitoring and, in particular, discloses a method and system for eye detection based on reflection structure.
  • Embodiments of the invention are applicable to tracking the eye location of a user of a computer or mobile device (such as a smartphone or tablet), or of a driver of a vehicle.
  • A method of determining the position of at least one eyeball within an image, including the steps of: (a) capturing a time series of image frames, illuminated in a predetermined temporal manner by at least two spaced apart light sources, by at least one imaging sensor; (b) processing the image frames to determine specular reflection locations in the image frames; and (c) utilising the time series evolution of the location of the specular reflections to isolate corneal reflections from the determined specular reflection locations.
  • Step (c) preferably includes utilising either a velocity or an acceleration model of position evolution to model the location of the specular reflections corresponding to corneal reflections.
  • The isolating step preferably includes utilising an error measure between the model and the actual locations of the specular reflections in the image frames.
  • The model preferably includes maximum velocities or accelerations.
  • First and second light sources are included, wherein the first light source is actuated to illuminate one or both of the eyeballs during capture of even frames of the time series, and the second light source is actuated to illuminate one or both of the eyeballs during capture of odd frames of the time series.
  • A plurality of light sources is included, each light source being actuated to illuminate one or both of the eyeballs during capture of predetermined frames of the time series.
  • An image processing system for detecting the position of an eyeball within an image, the system including: at least two image illumination sources for illuminating the image area in a predetermined temporal manner; an image sensor for capturing a sequence of temporal frames of the image area; a processor configured to process the temporal frames to determine specular reflection locations in the temporal frames; and second processing means for isolating likely corneal reflections from the specular reflection locations of a series of temporal frames.
  • A method of tracking one or more objects within a series of images, including the steps of:
  • The step of applying one or more constraints preferably includes applying a motion model of the one or more objects, based on the position of the specular reflections in a plurality of images.
  • A computer system configured to perform a method according to the third aspect.
  • A device configured to perform a method according to the third aspect.
  • Fig. 1 illustrates a first example complex image having a series of specular reflections
  • Fig. 2 illustrates a second example image having specular reflections
  • Fig. 3 illustrates schematically the geometry of creation of corneal reflections
  • Fig. 4 illustrates a flow chart of the steps of the preferred embodiment
  • Fig. 5 illustrates an example processing system suitable for implementation of the preferred embodiment
  • Fig. 6 illustrates the processing arrangement of the preferred embodiment.
  • The preferred embodiment provides a robust form of eye detection through the utilisation of the corneal reflection in a captured image. As the corneal reflection from the eye is usually still present, even in the presence of other strong reflections and noise, the detection and processing of the corneal reflection location can provide a strong indicator of eye position and gaze.
  • Fig. 1 illustrates an example noisy image 1 of a human head including hat 2, safety glasses 4 and air mask 3. From close examination of the image 1, it can be seen that two corneal reflections 5, 6 are also present in the image.
  • Fig. 2 illustrates a second example image, recorded by an imaging device, of a single eye behind glasses 20.
  • The light source produces a number of specular reflections 21, in addition to the targeted corneal reflection 22.
  • The presence of corneal specular reflections is utilised to advantage.
  • The preferred embodiment uses at least one imaging device and at least two active light sources to determine the location of the corneal reflections.
  • The light sources are synchronised with the imaging devices. A greater number of light sources gives higher-accuracy glint detection and fewer detection errors. Where there is more than one imaging device, their integration periods are also synchronised.
  • Exemplary imaging devices include digital cameras and CCD cameras.
  • The light sources can also be synchronised with the imaging device(s) integration period and can be actively controlled so that any combination of light sources can be ON or OFF for a given frame.
  • Exemplary light sources include LEDs or other electronically controllable lights that can emit light for a predetermined time period in response to a control signal.
  • a light source When a light source is ON, it produces a reflection (also called glint) on the surface of the cornea.
  • Fig. 3 illustrates the process schematically at 30, wherein light sources 31 and 32 are directed towards the eyeball 33 and a corneal reflection 34 is detected by camera 35.
  • Light sources 31 and 32 are spaced apart so as to direct light at the cornea from different angles. This aids better detection of glints, especially when one or both eyes are partially occluded.
  • The cornea surface can be modelled as any parametric surface.
  • The cornea is modelled as a sphere of centre C and radius R.
  • The light sources 31 and 32 can also produce many other specular reflections, as illustrated in Fig. 1 and Fig. 2.
  • The proposed method of the preferred embodiment detects all the specular reflections in a sequence of images and then uses a constant motion model of the cornea (e.g., the cornea centre C is considered to move at constant velocity or constant acceleration in 3D space) to evaluate which of the detected specular reflections correspond to corneal reflections.
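  • For illustration only, the following sketch computes a theoretical glint location under the sphere model, for a camera at the origin and a point light source. The function names, the fixed-point scheme and all numeric values are assumptions made for this sketch; the patent does not prescribe a particular solution method.

```python
import numpy as np

def glint_on_sphere(C, L, R, iters=20):
    """Approximate the theoretical glint location G on a spherical cornea of
    centre C and radius R, for a camera at the origin and a point light
    source at L. At the reflection point, the surface normal bisects the
    directions towards the camera and the light, which suggests a simple
    fixed-point iteration."""
    O = np.zeros(3)                                # camera centre of projection
    G = C + R * (O - C) / np.linalg.norm(O - C)    # initial guess: point nearest the camera
    for _ in range(iters):
        n = (O - G) / np.linalg.norm(O - G) + (L - G) / np.linalg.norm(L - G)
        G = C + R * n / np.linalg.norm(n)          # move G to where the normal bisects
    return G

def project(G, fx, fy, cx, cy):
    """Pinhole projection of a 3D point (camera reference frame) to pixels."""
    return np.array([fx * G[0] / G[2] + cx, fy * G[1] / G[2] + cy])

# Example: cornea centre 60 cm in front of the camera, light 10 cm to its
# right, and a corneal radius of curvature of roughly 8 mm.
C = np.array([0.0, 0.0, 0.6])
L = np.array([0.1, 0.0, 0.0])
g = project(glint_on_sphere(C, L, R=0.008), fx=800.0, fy=800.0, cx=320.0, cy=240.0)
```

For typical geometries (a cornea tens of centimetres from the camera, a radius of a few millimetres), the iteration settles in a few steps; the same helpers are reused in the trajectory-fitting sketch later in this section.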
  • Fig. 4 illustrates a flow chart of the steps involved in a method 40 of determining the position of eyeballs within an image or a time series of images.
  • Method 40 will be described with reference to the exemplary hardware illustrated in arrangement 50 of Fig. 5, having the exemplary configuration of Fig. 6.
  • A monitored subject 51 is subjected to sequenced infrared illumination from lights 52, 53, controlled by light sequencing microcontroller 55.
  • Video is captured by a video capture unit 54.
  • Unit 54 includes one or more digital cameras and optionally an internal processor.
  • The captured video is processed by processor 56 in accordance with method 40 described below.
  • A time series of images of subject 51 is captured using unit 54.
  • A subset of the time series, frames n to n+3 (57-60), is illustrated in Fig. 6.
  • The subject's eyeballs are illuminated by light sources 52 and 53.
  • Illumination of sequential frames is preferably provided by a different light source in an alternating fashion.
  • Light source 0 (52) is ON for the even-numbered frames.
  • Light source 1 (53) is ON for the odd-numbered frames.
  • The illumination profile thus changes by at least one light source from frame to frame.
  • Consecutive image frames in the time series may also be illuminated using a predetermined illumination sequence.
  • The illumination sequence is controlled by sequencing microcontroller 55 in conjunction with processor 56 and capture unit 54.
  • The timing of the illumination is synchronised with the capture of image frames in the time series.
  • The general preference is that there is some variation in the illumination profile (different actuated light sources, or combinations of actuated light sources) between consecutive frames of the time series, to better differentiate the specular reflections from noise. A minimal sketch of such a sequencing loop is given below.
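  • As an illustration of the alternating scheme above, a minimal sequencing loop might look as follows. Here set_light() and capture_frame() are hypothetical stand-ins for the hardware interface of the sequencing microcontroller and capture unit, which the document does not specify:

```python
def acquire_sequence(num_frames, set_light, capture_frame):
    """Capture a time series in which the illumination profile changes every
    frame: light source 0 is ON for even-numbered frames and light source 1
    for odd-numbered frames. Returns (frame index, active light, image) tuples."""
    frames = []
    for n in range(num_frames):
        active = n % 2
        set_light(0, on=(active == 0))   # even frames: light source 0 (52)
        set_light(1, on=(active == 1))   # odd frames: light source 1 (53)
        frames.append((n, active, capture_frame()))
    return frames
```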
  • The specular reflections, or glints, within the image are then detected.
  • From a triplet of frames Fn, Fn+1 and Fn+2 (54-56), a set of 2D glints Gn, Gn+1 and Gn+2 is extracted as two-dimensional coordinates of pixels within each image.
  • Glint extraction can be done using well-known computer vision methods, such as finding local maxima of a Laplacian operator.
  • Each of these glints corresponds either to a corneal reflection or to some other specular reflection in the image.
  • The number of glints detected within an image can range from a few to several hundred, depending on the environment imaged and the lighting.
  • The glint extraction process can be performed in parallel. Due to the small size of glints within an image, the overlap of pixels between the separate processing modules can be significantly reduced. A minimal sketch of such a glint detector is given below.
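  • The following is a minimal sketch of a Laplacian-based glint detector, assuming OpenCV is available; the kernel size, threshold and window size are illustrative values, not taken from the patent:

```python
import cv2
import numpy as np

def detect_glints(gray, thresh=40.0, win=5):
    """Detect candidate glints as local maxima of the negated Laplacian.
    Small bright spots produce strongly negative Laplacian responses, so the
    response is negated and local maxima above a threshold are kept."""
    response = -cv2.Laplacian(gray.astype(np.float32), cv2.CV_32F, ksize=5)
    # A pixel is a local maximum if it equals the max-filtered (dilated) response.
    local_max = cv2.dilate(response, np.ones((win, win), np.uint8))
    ys, xs = np.where((response >= local_max) & (response > thresh))
    return np.stack([xs, ys], axis=1)    # one (x, y) pixel coordinate per glint
```

Any blob or spot detector tuned to bright features a few pixels wide would serve equally well here.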
  • A motion model is then used to determine which specular reflections correspond to corneal reflections (as opposed to other specular reflections, such as those from a person's glasses).
  • An exemplary motion model is a constant velocity model of an eye.
  • Another exemplary motion model is an acceleration model of an eye. Ideally, a minimum of three frames is used for a constant velocity assumption, or four frames for a constant acceleration assumption. The preferred embodiment focuses on the constant velocity model, but extension to constant acceleration or other motion models is possible.
  • The model is applied by passing the captured image data through an algorithm run by processor 56. Each model applies constraints that relate to the typical motion of an eye. Corresponding motion models of other objects can be applied when tracking other objects within images.
  • Candidate glints may first be culled using a threshold distance between glint locations in consecutive frames; the threshold distance may be based on a distance derived from a maximum velocity of the cornea in 3D space. A sketch of such a culling step follows.
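  • A sketch of this culling step under those assumptions; the displacement threshold here is an illustrative image-plane stand-in for a 3D velocity bound, since depth is not yet known at this stage:

```python
import numpy as np
from itertools import product

def cull_triplets(g0, g1, g2, max_disp=20.0):
    """Form candidate triplets of glints across three consecutive frames,
    keeping only those whose frame-to-frame displacement stays below a
    threshold (a 2D proxy for a maximum cornea velocity)."""
    kept = []
    for a, b, c in product(g0, g1, g2):
        if np.linalg.norm(b - a) < max_disp and np.linalg.norm(c - b) < max_disp:
            kept.append((a, b, c))
    return kept
```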
  • A minimisation process can then occur to determine the best cornea trajectory in 3D (six degrees of freedom using a constant velocity model) that fits the triplet of glints (six observations from three 2D locations).
  • Any iterative optimisation process can be used at this stage (e.g. Levenberg-Marquardt), using the geometry of Fig. 3.
  • Alternatively, a specific fast solution to the optimisation problem can be used.
  • The trajectory of the cornea can be computed from a sequence of 2D glint locations captured by a system as illustrated in Fig. 3, with the following considerations:
  • A camera with known intrinsic projection parameters, and a reference frame F aligned with the camera axes (X, Y parallel to the image plane, Z collinear with the optical axis of the camera) and centred on the camera's centre of projection.
  • An infrared illuminator located at a known 3D location L in the camera reference frame F.
  • A motion model C(α, t), where α are the motion parameters (e.g. constant velocity or constant acceleration) describing the trajectory of the cornea centre C.
  • A sequence of 2D glint locations G = {G1, ..., Gn} corresponding to the reflections of the light emanating from the infrared illuminator on the surface of the cornea, as imaged by the camera.
  • An error function between the observed glint locations and the theoretical glint locations predicted by the motion model is minimised; the minimum of this function can be found using well-known optimisation techniques. Once the minimising parameter α_min is found, the trajectory T of the cornea can be computed using the known motion model.
  • The cornea is assumed to be a sphere of known radius R.
  • The method remains valid for any other parametric shape of the cornea (e.g. an ellipsoid), as long as the theoretical location G_L of the glint can be computed from the known position (and optionally orientation) of the cornea. A sketch of this trajectory fitting is given below.
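  • The following sketch illustrates the fitting step under a constant velocity model, using SciPy's Levenberg-Marquardt implementation and the glint_on_sphere() and project() helpers from the earlier sketch. All names and values are illustrative assumptions; the specific fast solution mentioned above is not reproduced here:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_cornea_trajectory(glints, times, lights, R, cam, x0):
    """Fit a constant-velocity cornea trajectory C(t) = C0 + V*t (six motion
    parameters) to observed 2D glint locations, one glint and one active
    light source per frame. 'cam' holds the intrinsics (fx, fy, cx, cy)."""
    fx, fy, cx, cy = cam

    def residuals(params):
        C0, V = params[:3], params[3:]
        res = []
        for g_obs, t, L in zip(glints, times, lights):
            G = glint_on_sphere(C0 + V * t, L, R)           # theoretical 3D glint
            res.extend(project(G, fx, fy, cx, cy) - g_obs)  # predicted minus observed
        return res

    sol = least_squares(residuals, x0, method="lm")         # Levenberg-Marquardt
    return sol.x, sol.cost    # motion parameters and residual error

# Example: three frames 20 ms apart with alternating light sources, starting
# from an initial guess of a stationary cornea about 60 cm from the camera.
times = np.array([0.0, 0.02, 0.04])
lights = [np.array([0.1, 0.0, 0.0]), np.array([-0.1, 0.0, 0.0]), np.array([0.1, 0.0, 0.0])]
# params, cost = fit_cornea_trajectory(triplet, times, lights, 0.008,
#                                      (800.0, 800.0, 320.0, 240.0),
#                                      x0=np.array([0.0, 0.0, 0.6, 0.0, 0.0, 0.0]))
```

The returned cost can then be compared against a residual threshold, and the recovered trajectory against depth and velocity bounds, to implement the acceptance criteria described below.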
  • The above culling process will often reduce the number of candidate glints down to about three or four.
  • The triplet of glints can then be accepted or rejected based on other predetermined criteria. For example, a maximum threshold on the residuals from the optimisation (the error between the observed 2D positions of the glints and their optimised 2D positions computed from the optimised 3D cornea trajectory) can be set. Other thresholds on the optimised cornea trajectory can also be set, such as minimum and maximum depth or velocity.
  • The triplets that pass all the acceptance criteria are considered to arise from actual corneal reflections, and therefore both the 2D position of the eye and the 3D location of the cornea have been computed.
  • Two consecutive glint triplets can then be assessed as a quadruplet using another motion model (e.g. constant velocity or constant acceleration) to further check for false positive detections.
  • The proposed method detects any reflective object with a curvature similar to that of a cornea. It can also occasionally produce false positives in the presence of noise (a high number of specular reflections) in the images. In such cases, further image analysis, such as machine-learning-based classifiers or appearance-based criteria, can be employed to eliminate unwanted false positives.
  • The eye position determined from the corneal reflections is then output.
  • The output data is in the form of either a three-dimensional coordinate of the cornea position in the camera reference frame or a two-dimensional projection in the image. These coordinates may subsequently be used to project the eye positions back onto the image, or onto another image in the time series. Further, the coordinates of the detected eyes may be used to determine a gaze direction through further analysis of the images.
  • The embodiments described herein provide various useful methods of determining the position of eyeballs within an image.
  • The invention has applications for any computer-vision-based face or eye tracking system that requires the detection of eye(s) and/or face(s). It is particularly useful where the face is partially occluded (for example, where the user is wearing a dust or hygienic mask), not entirely visible (for example, where a portion of the face is out of the field of view of the camera), or where the eye texture is partially occluded by glasses rims and reflections on the lenses.
  • Exemplary applications include vehicle operator monitoring systems for detecting signs of fatigue or distraction, gaze tracking systems that compute gaze direction (on 2D screens or in 3D environments) for ergonomic or human behavioural studies, face tracking systems for virtual glasses try-on, and face tracking systems for avatar animation.
  • The present invention can be performed in systems having a single glint detection module or a plurality of glint detection modules running in parallel.
  • The abovementioned overlap problem associated with prior art techniques is significantly reduced, because the glint is a very small feature in the image even at close range (in some embodiments, typically 3 or 4 pixels in diameter).
  • The system and method of the invention can still fit a trajectory to the detected glints from the plurality of glint detectors (removing many false eye candidates), thereby creating a single candidate solution for the eye validation phase to operate on. This makes the process of validating any region containing an eye much more likely to return positive results, with less processing time, when the eye is moving.
  • Any one of the terms "comprising", "comprised of" or "which comprises" is an open term that means including at least the elements/features that follow, but not excluding others.
  • The term "comprising", when used in the claims, should not be interpreted as being limited to the means or elements or steps listed thereafter.
  • The scope of the expression "a device comprising A and B" should not be limited to devices consisting only of elements A and B.
  • Any one of the terms "including", "which includes" or "that includes", as used herein, is also an open term that likewise means including at least the elements/features that follow the term, but not excluding others.
  • "Including" is synonymous with, and means, "comprising".
  • The term "exemplary" is used in the sense of providing examples, as opposed to indicating quality. That is, an "exemplary embodiment" is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
  • "Coupled", when used in the claims, should not be interpreted as being limited to direct connections only.
  • The terms "coupled" and "connected", along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
  • The scope of the expression "a device A coupled to a device B" should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B, which may be a path including other devices or means.
  • "Coupled" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

Abstract

A method of determining the position of eyes within an image, the method including the steps of: (a) capturing a time series of image frames, illuminated in a predetermined temporal manner by at least two spaced apart light sources, by at least one imaging sensor; (b) processing the image frames to determine specular reflection locations in the image frames; and (c) utilising the time series evolution of the location of the specular reflections to isolate corneal reflections from the determined specular reflection locations.
PCT/AU2014/000868 2013-09-02 2014-09-01 Method and apparatus for eye detection from glints WO2015027289A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201480059776.9A CN105765608A (zh) 2013-09-02 2014-09-01 Method and apparatus for eye detection from glints
EP14840504.6A EP3042341A4 (fr) 2013-09-02 2014-09-01 Method and apparatus for eye detection from glints
US14/916,082 US10552675B2 (en) 2014-03-12 2014-09-01 Method and apparatus for eye detection from glints
JP2016539360A JP2016532217A (ja) 2013-09-02 2014-09-01 Method and apparatus for eye detection from glints

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
AU2013903337A AU2013903337A0 (en) 2013-09-02 Method and apparatus for eye detection (Glints)
AU2013903337 2013-09-02
AU2014900842A AU2014900842A0 (en) 2014-03-12 Improvements to Methods and Apparatus for Eye Detection (Glints)
AU2014900842 2014-03-12

Publications (1)

Publication Number Publication Date
WO2015027289A1 (fr) 2015-03-05

Family

Family ID: 52585306

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2014/000868 WO2015027289A1 (fr) 2013-09-02 2014-09-01 Method and apparatus for eye detection from glints

Country Status (4)

Country Link
EP (1) EP3042341A4 (fr)
JP (1) JP2016532217A (fr)
CN (1) CN105765608A (fr)
WO (1) WO2015027289A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017160356A1 (fr) * 2016-03-16 2017-09-21 Google Inc. Systems and methods for enhancing object visibility for overhead imaging
WO2018154271A1 (fr) * 2017-02-22 2018-08-30 Fuel 3D Technologies Limited Systems and methods for obtaining eyewear information
CN109690553A (zh) * 2016-06-29 2019-04-26 醒眸行有限公司 Systems and methods for performing eye gaze tracking
CN110168563A (zh) * 2016-11-11 2019-08-23 3E株式会社 Glint detection method
EP3671541A1 (fr) * 2018-12-21 2020-06-24 Tobii AB Classification of glints using an eye tracking system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023120892A1 (fr) * 2021-12-20 2023-06-29 삼성전자주식회사 Device and method for controlling light sources in gaze tracking using a glint

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7747068B1 (en) 2006-01-20 2010-06-29 Andrew Paul Smyth Systems and methods for tracking the eye

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7280678B2 (en) * 2003-02-28 2007-10-09 Avago Technologies General Ip Pte Ltd Apparatus and method for detecting pupils
WO2007070853A2 (fr) * 2005-12-14 2007-06-21 Digital Signal Corporation System and method for tracking eyeball motion
JP5621456B2 (ja) * 2010-09-21 2014-11-12 富士通株式会社 Gaze detection device, gaze detection method, gaze detection computer program, and mobile terminal

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7747068B1 (en) 2006-01-20 2010-06-29 Andrew Paul Smyth Systems and methods for tracking the eye

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HARO ET AL.: "Detecting and Tracking Eyes by Using Their Physiological Properties, Dynamics, and Appearance", PROCEEDINGS 2000 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2000, 13 June 2000 (2000-06-13), pages 163-168, XP001035597
HARO, A. ET AL.: "Detecting and Tracking Eyes by Using Their Physiological Properties, Dynamics, and Appearance", COMPUTER VISION AND PATTERN RECOGNITION, vol. 1, 2000, pages 163-168, XP001035597 *
MORIMOTO, C. H. ET AL.: "Pupil detection and tracking using multiple light sources", IMAGE AND VISION COMPUTING, vol. 18, 2000, pages 331-335, XP008126446 *
See also references of EP3042341A4

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017160356A1 (fr) * 2016-03-16 2017-09-21 Google Inc. Systems and methods for enhancing object visibility for overhead imaging
US9996905B2 (en) 2016-03-16 2018-06-12 Planet Labs, Inc. Systems and methods for enhancing object visibility for overhead imaging
EP3663714A1 (fr) * 2016-03-16 2020-06-10 Planet Labs Inc. Systèmes et procédés pour améliorer la visibilité d'un objet pour une imagerie aérienne
US10249024B2 (en) 2016-03-16 2019-04-02 Planet Labs, Inc. Systems and methods for enhancing object visibility for overhead imaging
CN109690553A (zh) * 2016-06-29 2019-04-26 醒眸行有限公司 Systems and methods for performing eye gaze tracking
JP2019519859A (ja) * 2016-06-29 2019-07-11 シーイング マシーンズ リミテッド Systems and methods for performing eye gaze tracking
EP3479293A4 (fr) 2016-06-29 2020-03-04 Systems and methods for performing eye gaze tracking
US10878237B2 (en) 2016-06-29 2020-12-29 Seeing Machines Limited Systems and methods for performing eye gaze tracking
CN110168563A (zh) * 2016-11-11 2019-08-23 3E株式会社 Glint detection method
WO2018154271A1 (fr) * 2017-02-22 2018-08-30 Fuel 3D Technologies Limited Systems and methods for obtaining eyewear information
US10775647B2 (en) 2017-02-22 2020-09-15 Fuel 3D Technologies Limited Systems and methods for obtaining eyewear information
EP3671541A1 (fr) * 2018-12-21 2020-06-24 Tobii AB Classification of glints using an eye tracking system
CN111522431A (zh) * 2018-12-21 2020-08-11 托比股份公司 Classifying glints using an eye tracking system
CN111522431B (zh) 2018-12-21 2021-08-20 托比股份公司 Classifying glints using an eye tracking system
CN113608620A (zh) * 2018-12-21 2021-11-05 托比股份公司 Classifying glints using an eye tracking system
US11619990B2 (en) 2018-12-21 2023-04-04 Tobii Ab Classification of glints using an eye tracking system

Also Published As

Publication number Publication date
EP3042341A4 (fr) 2017-04-19
JP2016532217A (ja) 2016-10-13
EP3042341A1 (fr) 2016-07-13
CN105765608A (zh) 2016-07-13

Similar Documents

Publication Publication Date Title
US10552675B2 (en) Method and apparatus for eye detection from glints
US10878237B2 (en) Systems and methods for performing eye gaze tracking
JP5529660B2 (ja) Pupil detection device and pupil detection method
US7682026B2 (en) Eye location and gaze detection system and method
EP2748797B1 (fr) Determination of the distance to an object from an image
US10318831B2 (en) Method and system for monitoring the status of the driver of a vehicle
EP3453316B1 (fr) Eye tracking using eyeball centre position
US7819525B2 (en) Automatic direct gaze detection based on pupil symmetry
JP5467303B1 (ja) Gaze point detection device, gaze point detection method, personal parameter calculation device, personal parameter calculation method, program, and computer-readable recording medium
WO2015027289A1 (fr) Method and apparatus for eye detection from glints
CN106547341B (zh) Gaze tracker and method of tracking gaze
US20160004303A1 (en) Eye gaze tracking system and method
US20140313308A1 (en) Apparatus and method for tracking gaze based on camera array
US9002053B2 (en) Iris recognition systems
JP6583734B2 (ja) Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face pose detection system, face pose detection method, and face pose detection program
KR20130107981A (ko) Gaze tracking device and method
WO2016142489A1 (fr) Eye tracking using a depth sensor
JP6870474B2 (ja) Computer program for gaze detection, gaze detection device, and gaze detection method
JP2010123019A (ja) Motion recognition device and method
EP3542308B1 (fr) Method and device for eye metric acquisition
CN113260299A (zh) System and method for eye tracking
KR101122513B1 (ko) Eyeball position estimation system and eyeball position estimation method using three-dimensional position information
JP6468755B2 (ja) Feature point detection system, feature point detection method, and feature point detection program

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14840504

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016539360

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 14916082

Country of ref document: US

REEP Request for entry into the european phase

Ref document number: 2014840504

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014840504

Country of ref document: EP