WO2009153724A1 - Procédé pour produire des commandes par des mouvements oculaires - Google Patents


Info

Publication number
WO2009153724A1
WO2009153724A1 (PCT/IB2009/052520)
Authority
WO
WIPO (PCT)
Prior art keywords
pupil
centre
displacement vector
modulus
discrete
Prior art date
Application number
PCT/IB2009/052520
Other languages
English (en)
Inventor
Sergio Fonda
Matteo Corradini
Luca Bulf
Original Assignee
Universita' Degli Studi Di Modena E Reggio Emilia
S.C.E. S.R.L.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universita' Degli Studi Di Modena E Reggio Emilia, S.C.E. S.R.L. filed Critical Universita' Degli Studi Di Modena E Reggio Emilia
Publication of WO2009153724A1 publication Critical patent/WO2009153724A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types for determining or recording eye movement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18: Eye characteristics, e.g. of the iris
    • G06V 40/193: Preprocessing; Feature extraction

Definitions

  • The invention relates to a method for giving commands by means of eye movements, in particular commands to an electronic apparatus or a computer.
  • The prior art comprises various methods of this type, all characterised by considerable computational complexity; they require accurate initial calibration of a number of operating parameters in order to produce satisfactory results.
  • The aim of the present invention is to provide a method for giving commands by means of eye movements which is considerably simpler and easier to implement, does not require pre-calibration in order to produce satisfactory results, and is consequently more economical than prior-art methods.
  • figures 1 and 2 show a schematic illustration of an eye;
  • figure 3 illustrates a video monitor subdivided into various areas within which the method of the present invention can be applied;
  • figures 4 and 5 illustrate the performance of some stages of the method of the present invention.
  • A subject's eye can perform three movements: saccade, slow tracking and fixation. If the eyeball is considered approximately a sphere in its anatomical dimensions, the movement of the centre of the pupil O in each of the three movement states is a function of the movement of the whole eyeball within the orbit, and consequently also a function of the direction of the subject's gaze fixation.
  • the method of the present invention comprises detecting and acquiring the movement of the pupil and thus of the eye.
  • The movement of the pupil is acquired via a television recording realised, for example, with a television camera in combination with an infra-red LED lighting device.
  • The television camera can be rigidly fixed to the user's head by means of a support.
  • The lens is positioned in front of one of the two eyes, preferably at a distance of approximately 8 cm, so as to frame the pupil, the anterior segment and the periorbital area.
  • The television images acquired by means of the television recording are digitised.
  • The temporal succession of the television images constitutes the source of information which, after processing, is interpreted to produce commands, such as displacement of a pointer on a screen, movement of mechanical arms, activation of switches or motors, or other commands besides.
  • Each single video image is processed with an algorithm based on the hypothesis that the image of the pupil is a connected two-dimensional object, i.e. an object bounded by a closed flat curve, having an outline very similar to an ellipse, an average grey level that is very dark with respect to the rest of the image, and a significantly large area.
  • The algorithm performs the following operations: an analysis of the various grey levels, aimed at isolating the darker components of the image; identification and calculation of the area of only the connected objects, in order to determine the object which most probably represents the pupil; extraction of the border, in order to calculate the equation of the ellipse which best approximates the border; calculation of the coordinates (x, y) of the centre of the pupil O on the image.
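The chain of operations above can be sketched in Python. This is an illustrative sketch, not the patent's implementation: the ellipse-fitting step is replaced for brevity by the centroid of the largest dark connected object, which approximates the centre of the best-fitting ellipse, and `dark_thresh` is an assumed grey-level cutoff.

```python
import numpy as np

def find_pupil_centre(img, dark_thresh=40):
    """Estimate the pupil centre (x, y) in a grayscale image.

    Steps: threshold the darkest grey levels, label connected
    components by flood fill, keep the largest one (assumed to be the
    pupil), and return its centroid as an approximation of the centre
    of the best-fitting ellipse.
    """
    mask = img < dark_thresh                      # isolate darkest pixels
    labels = np.zeros(img.shape, dtype=int)       # connected-component labels
    current = 0
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            if mask[y, x] and labels[y, x] == 0:
                current += 1
                stack = [(y, x)]                  # flood fill, 4-connectivity
                while stack:
                    cy, cx = stack.pop()
                    if (0 <= cy < img.shape[0] and 0 <= cx < img.shape[1]
                            and mask[cy, cx] and labels[cy, cx] == 0):
                        labels[cy, cx] = current
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    if current == 0:
        return None
    areas = [(labels == k).sum() for k in range(1, current + 1)]
    best = 1 + int(np.argmax(areas))              # largest dark object = pupil
    ys, xs = np.nonzero(labels == best)
    return float(xs.mean()), float(ys.mean())     # centroid ~ ellipse centre
```

A real implementation would add the ellipse fit over the extracted border pixels; the centroid is sufficient to illustrate the (x, y) output the later stages consume.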
  • The device to be controlled is constituted by a computer.
  • The user can pilot the position of the mouse pointer on the monitor by displacing his or her eyes so as to direct the gaze onto the areas of interest of the image shown on the PC monitor.
  • The voluntary activation of an event, once the pointer has reached determined positions, can be done with various actions: for example a blink, a vocal signal, an electromyographic signal, a movement of a limb or part of a limb, blowing, or a combination of two or more of these actions detected by means of sensors.
  • the method of the invention is described herein below in its preferred application which consists in interaction between the eyeball displacement of a user and the effect of that movement on the position of an object to be moved P which belongs to an observed scene, such as for example the pointer of the mouse on a computer monitor.
  • The trace over time of the position of the centre of the pupil O is analysed with the aim of recognising the user's desire to impart a movement command.
  • the command can be defined by means of a displacement vector Vp having a direction, a sense and an intensity (or modulus).
  • The direction and the sense can be recognised within a discrete and predefined sub-set of a bundle of directed rays S1, ..., Sn, for example eight rays separated by angular steps of one eighth of a full angle (45°).
  • the vector modulus can be associated to a distance which the object to be moved has to cover.
  • The desire to move the object is expressed by the user by first fixating the object to be moved in its initial rest position P and then, after a saccade movement, fixating another point, or more exactly an area, of the observed scene which is close to or coincides with the final position T to which the user desires the object to be moved (figure 5).
  • The movement of the centre of the pupil O enables the fixation stages to be identified, inasmuch as in these stages the velocity of the pupil is practically zero.
  • The displacement between the positions taken up by the centre of the pupil in the two successive fixations (first of the initial position of the object to be moved, then of the point or area to which the object is to be moved) defines a displacement vector of the centre of the pupil, Vo, which, after having been processed by the method of the invention, is transformed into a displacement vector Vp of the object to be moved.
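Identifying fixations from the near-zero pupil velocity can be sketched as follows; this is a hypothetical Python helper with illustrative thresholds, not the patent's own procedure.

```python
import math

def detect_fixations(samples, vel_thresh=2.0, min_len=3):
    """Split a temporal trace of pupil-centre positions into fixations.

    A fixation is taken to be a run of at least `min_len` samples whose
    frame-to-frame displacement stays below `vel_thresh` (pixels per
    frame), following the observation that pupil velocity is practically
    zero during fixation.  Returns (start_index, end_index, mean_position)
    tuples; the mean positions are the fixation points whose difference
    defines the displacement vector Vo.
    """
    fixations, start = [], 0
    for i in range(1, len(samples) + 1):
        run_ends = (i == len(samples)) or (
            math.dist(samples[i], samples[i - 1]) > vel_thresh)
        if run_ends:                              # slow run terminated here
            if i - start >= min_len:
                xs = [p[0] for p in samples[start:i]]
                ys = [p[1] for p in samples[start:i]]
                fixations.append((start, i - 1,
                                  (sum(xs) / len(xs), sum(ys) / len(ys))))
            start = i
    return fixations
```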
  • the transformation which the method of the invention operates on the displacement vector produced by the centre of the pupil is performed as follows.
  • The direction and sense of the displacement vector of the object to be moved are identified by positioning the centre of a discrete bundle of straight lines R1, ..., Rn, having predetermined directions and senses, at the centre of the pupil O at the moment of initial fixation of the object to be moved.
  • The lines R1, ..., Rn are associated with a polar reference system centred on the centre O of the pupil, with the reference axis arranged horizontally.
  • The displacement vector Vo of the centre of the pupil, which originates at O, forms a determined angle α with respect to the horizontal reference axis.
  • If the angle α is comprised between −22.5° and +22.5°, a rightwards horizontal direction is associated to the object to be moved.
  • If the angle α is comprised between +22.5° and +67.5°, an oblique rightwards direction of 45° is associated to the object to be moved.
  • If the angle α is comprised between +67.5° and +112.5°, a vertical upwards direction is associated, and so on for the remaining angular sectors into which the image of the pupil is subdivided by the bundle of straight lines Ri.
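The sector rule above amounts to quantising the angle of Vo into one of eight 45° sectors with boundaries at ±22.5°, ±67.5°, and so on. A minimal Python sketch (direction tuples are illustrative unit steps, diagonals left unnormalised):

```python
import math

# One direction/sense per 45-degree sector, starting with "rightwards
# horizontal" for angles between -22.5 and +22.5 degrees.
DIRECTIONS = [(1, 0), (1, 1), (0, 1), (-1, 1),
              (-1, 0), (-1, -1), (0, -1), (1, -1)]

def quantise_direction(vo_x, vo_y):
    """Map the pupil displacement vector Vo onto the nearest of the
    eight discrete rays Ri."""
    angle = math.degrees(math.atan2(vo_y, vo_x)) % 360.0
    sector = int(((angle + 22.5) % 360.0) // 45.0)   # 0: right, 1: 45 deg, ...
    return DIRECTIONS[sector]
```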
  • the vector itself must be assigned an intensity or a modulus.
  • the modulus of the displacement vector of the centre of the pupil Vo (figure 2) is multiplied by a gain factor, initially arbitrary but experimentally reasonable.
  • The result of the multiplication represents the modulus of the displacement vector Vp of the object to be moved, i.e. the distance the object P will move.
  • a displacement corresponding to the vector can be applied to the object P (figure 5).
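Putting the modulus and gain together, the displacement applied to the object P is the pupil displacement Vo scaled by the gain factor. A sketch under the simplifying assumption that the direction of Vo is used directly (the sector quantisation described above would replace the raw angle in practice):

```python
import math

def displace(p, vo, gain=5.0):
    """Return the new position of object P after applying Vp.

    |Vp| = |Vo| * gain, with the gain initially arbitrary but
    experimentally reasonable; the default value here is illustrative.
    """
    modulus = math.hypot(*vo) * gain              # |Vp| = |Vo| * K
    angle = math.atan2(vo[1], vo[0])              # direction/sense of Vo
    return (p[0] + modulus * math.cos(angle),
            p[1] + modulus * math.sin(angle))
```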
  • the final position B taken on by the object P will probably not coincide with the final position T, which is the desired position.
  • The user performs another saccade movement from position B to the desired position T, such that the method is performed again and a new displacement vector Vp is applied to the object to be moved, producing a further displacement of the object.
  • the angular movements of the pupil become progressively smaller and consequently so does the final positioning error of the object to be moved, up until a sufficiently good approximation of the final desired position is achieved.
  • the method therefore enables reaching the final desired position with a convergence process by successive steps.
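The convergence process can be illustrated with a one-dimensional simulation. Here `true_scale` is a hypothetical factor modelling the unknown relation between pupil displacement and screen distance, so each corrective saccade is proportional to the remaining error; the residual error then shrinks geometrically by the factor (1 − gain/true_scale), and the process converges whenever 0 < gain < 2 × true_scale. All numbers are illustrative.

```python
def converge(target, start, true_scale=10.0, gain=6.0, tol=0.5, max_steps=20):
    """Simulate the successive-approximation positioning process.

    At each step the user's corrective saccade amplitude is the
    remaining error divided by `true_scale`, and the object moves by
    that amplitude times the gain (Vp = K * Vo).  Stops when the object
    is within `tol` of the target.
    """
    pos, steps = start, 0
    while abs(target - pos) > tol and steps < max_steps:
        vo = (target - pos) / true_scale      # corrective saccade amplitude
        pos += gain * vo                      # object displacement Vp = K * Vo
        steps += 1
    return pos, steps
```

With the defaults the error shrinks by a factor 0.4 per step, so a 100-pixel error falls below half a pixel in six steps: the "small number of steps" the text describes.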
  • the visual effect is given by the succession of hook-ups to the object that the user makes in order to move it progressively, in a small number of steps, towards the desired objective.
  • a different value for the components of the vertical and horizontal movements can be assigned to the gain parameter.
  • The method in question is furthermore suitable for calibrating and correcting the gain factor.
  • the method comprises comparing the initial position and the final position of the object to be moved with the fixation positions assumed by the centre of the pupil.
  • The initial gain parameter is generally such as not to enable correction of the positioning of the object (in this case the pointer) in a single step.
  • the user performs a first fixation of the point A where the pointer is at rest; the pupil is in position Fo (figure 5).
  • the user fixates the desired and targeted end-area, which we shall call T; the pupil is in position Fl.
  • The application of the method transforms the displacement vector Vo of the centre of the pupil into a displacement vector Vp of the pointer, using an initial gain factor which leads to an incorrect positioning of the pointer, for example at a point B which is at a certain distance from the desired position.
  • the user will have to move the centre of the pupil into position F2 in order to perform a fixation of the pointer which is at point B, which is done by displacing the fixation point from T to B and defining a new displacement vector FoF2 of the centre of the pupil.
  • Vp = AB
  • The gain factor K, recalculated as the ratio between the modulus of vector Vp and the modulus of vector FoF2, makes the displacement vector of the centre of the pupil between the two fixations of points A and B correspond to the displacement vector AB of the pointer.
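The recalculated gain is therefore the pointer displacement actually produced divided by the pupil displacement measured between the fixations of A and B. A minimal sketch (point names follow the figure; the function name is illustrative):

```python
import math

def recalibrate_gain(a, b, f0, f2):
    """Recompute the gain as K = |AB| / |F0F2|: the pointer displacement
    AB that was produced, divided by the pupil displacement F0F2
    measured between the fixations of points A and B."""
    return math.dist(a, b) / math.dist(f0, f2)
```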
  • The screen or observed scene is subdivided into a rectangular N × N grid such that a local horizontal and vertical gain factor can be associated with each rectangular zone, as illustrated in figure 3, in which the pointer moves on a screen subdivided into sixteen zones.
  • the local gain factor of the rectangular zone in question can be updated as a function of the last two positions of the pupil during the fixations on the mobile object.
  • an auto-updating table can be built up, which over time stores the most appropriate gain factors to be applied for each specific rectangular area.
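Such an auto-updating table might be organised as below. This is a sketch under stated assumptions: the class name is hypothetical, and the exponential blending with factor `alpha` is one possible update rule, since the text only specifies that each zone's gain is refined from the latest pupil/pointer observations.

```python
class GainTable:
    """Per-zone horizontal and vertical gain factors for an n x n grid.

    `update` blends the locally measured pointer/pupil ratio into the
    stored gain of the zone containing (x, y); `get` returns the pair
    of gains to apply for a pointer currently in that zone.
    """

    def __init__(self, width, height, n=4, initial_gain=10.0, alpha=0.5):
        self.cell_w, self.cell_h = width / n, height / n
        self.n, self.alpha = n, alpha
        self.gains = [[[initial_gain, initial_gain] for _ in range(n)]
                      for _ in range(n)]

    def _cell(self, x, y):
        col = min(int(x // self.cell_w), self.n - 1)
        row = min(int(y // self.cell_h), self.n - 1)
        return row, col

    def get(self, x, y):
        row, col = self._cell(x, y)
        return tuple(self.gains[row][col])

    def update(self, x, y, pointer_delta, pupil_delta):
        """Blend the measured pointer/pupil ratio into the local gains."""
        row, col = self._cell(x, y)
        for axis in (0, 1):                      # 0: horizontal, 1: vertical
            if pupil_delta[axis] != 0:
                measured = pointer_delta[axis] / pupil_delta[axis]
                old = self.gains[row][col][axis]
                self.gains[row][col][axis] = (
                    (1 - self.alpha) * old + self.alpha * measured)
```

With n = 4 this reproduces the sixteen-zone subdivision of figure 3.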
  • the method of the present invention provides important advantages. Firstly, no preliminary calibration is necessary, as is the case with known methods. Further, the subdivision of the image of the eye into zones to each of which a direction and a sense of the displacement vector is attributed minimises direction and sense errors. A further important advantage is that the reiteration of the method enables the desired position of the mobile object to be reached within a few successive steps. Consequently, small movements of the head with respect to the observed scene do not significantly influence the success of the positioning.
  • a further important advantage is provided by the fact that the repeating of the method, with fixation of the object to be moved when at rest before each movement, enables a local calibration of the gain factor to be made, and therefore an auto-updating table of the gain factors. This minimises the influence of the user's changes of head position.
  • the method of the invention also provides an advantageous method for correcting positioning error of the mobile object.
  • The correction method is based on the corrective saccade movement the user's eye makes to displace the gaze from the position to which the object was intended to be moved, to the position actually reached by the object. Via the detection and measurement of the range of the corrective saccade movement, it is possible to estimate the positioning error during or at the end of the first displacement of the object towards the desired position.
  • Figures 4 and 5 illustrate some stages of the correction method based on the corrective saccade movement.
  • the error of positioning of the object to be moved can be determined at the end of the first movement.
  • a new and correct displacement vector is obtained which enables the mobile object to reach the effectively desired position.
  • The addition of the error correction to the displacement of the mobile object is practically instantaneous, and consequently so is the final positioning of the object; this obviates the need for further steps.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention relates to a method for giving commands by means of eye movements, characterised in that it comprises the following stages: detecting an image of the pupil and detecting an initial fixation position of a centre of a pupil (O) with respect to a predetermined reference system; positioning a centre of a discrete bundle of straight lines having predetermined directions at the centre of the pupil (O) and dividing a plane of the image into discrete regions; positioning a centre of a discrete bundle of rays at the centre of the pupil (O), the bundle of rays having predetermined directions and senses; associating each discrete region with a specific directed ray belonging to the discrete bundle of directed rays; detecting a final fixation position adopted by the centre of the pupil (O) after displacement thereof with respect to the initial fixation position; identifying a first displacement vector of the centre of the pupil (O) having an origin in the initial position of the centre of the pupil (O) and a vertex in the final position of the centre of the pupil (O); associating one of the directed rays of the discrete bundle with the direction and sense of a first displacement vector attributable to an object to be moved (P); attributing, to the first displacement vector of the object (P) associated with the directed ray of the discrete bundle, a modulus which is equal to a modulus of the first displacement vector of the centre of the pupil (O) multiplied by a predetermined gain factor.
PCT/IB2009/052520 2008-06-19 2009-06-12 Procédé pour produire des commandes par des mouvements oculaires WO2009153724A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ITMO2008A000177 2008-06-19
IT000177A ITMO20080177A1 (it) 2008-06-19 2008-06-19 Metodo per impartire comandi per mezzo di movimenti oculari.

Publications (1)

Publication Number Publication Date
WO2009153724A1 true WO2009153724A1 (fr) 2009-12-23

Family

ID=40301925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/052520 WO2009153724A1 (fr) 2008-06-19 2009-06-12 Procédé pour produire des commandes par des mouvements oculaires

Country Status (2)

Country Link
IT (1) ITMO20080177A1 (fr)
WO (1) WO2009153724A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2679147A3 (fr) * 2012-06-28 2014-11-12 Oliver Hein Procédé et dispositif de codage de données de suivi de l'oeil et du regard
US9239956B2 (en) 2012-06-28 2016-01-19 Oliver Hein Method and apparatus for coding of eye and eye movement data

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0631222A1 (fr) * 1993-06-21 1994-12-28 International Business Machines Corporation Dispositif d'estimation de point de regard
EP1715406A1 (fr) * 2005-04-23 2006-10-25 STMicroelectronics (Research & Development) Limited Dispositif de pointage et procédé de fonctionnement de ce dispositif


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHERN-SHENG LIN, CHIEN-WA HO, CHAO-NING CHAN, CHI-RE CHAU, YIENG-CHIANG WU, MAU-SHIUN YEH: "An eye-tracking and head-control system using movement increment-coordinate method", OPTICS & LASER TECHNOLOGY, vol. 39, no. 6, September 2007 (2007-09-01), pages 1218 - 1225, XP002547091, DOI: 10.1016/j.optlastec.2006.08.002 *
ROBERT J.K. JACOB: "Eye Movement-Based Human-Computer InteractionTechniques:Toward Non-Command Interfaces", 1993, XP002547092, Retrieved from the Internet <URL:http://www.cs.tufts.edu/~jacob/papers/hartson.pdf> [retrieved on 20090923] *


Also Published As

Publication number Publication date
ITMO20080177A1 (it) 2009-12-20

Similar Documents

Publication Publication Date Title
US9934435B2 (en) Image processing apparatus and image processing method
US8708490B2 (en) Method and a device for automatically measuring at least one refractive characteristic of both eyes of an individual
US20060281969A1 (en) System and method for operation without touch by operators
EP3154430B1 (fr) Procédé et système pour configurer un système d&#39;imagerie à rayons x
EP2400370B1 (fr) Dispositif et procédé de traitement d&#39;informations
CN104809424B (zh) 一种基于虹膜特征实现视线追踪的方法
US9727130B2 (en) Video analysis device, video analysis method, and point-of-gaze display system
US20220039645A1 (en) Determining a refractive error of an eye
CN109634431B (zh) 无介质浮空投影视觉追踪互动系统
EP3120294A1 (fr) Système et procédé de capture de mouvements
JP2016173313A (ja) 視線方向推定システム、視線方向推定方法及び視線方向推定プログラム
JP3453911B2 (ja) 視線認識装置
CN110766656B (zh) 筛查眼底黄斑区异常的方法、装置、设备和存储介质
JP7159242B2 (ja) 顔生体の検出方法及び検出装置、電子機器、並びにコンピュータ読み取り可能な媒体
JP7388525B2 (ja) 眼科画像処理装置および眼科画像処理プログラム
CN113260299A (zh) 用于眼睛追踪的系统和方法
JP6324119B2 (ja) 回転角度算出方法、注視点検出方法、情報入力方法、回転角度算出装置、注視点検出装置、情報入力装置、回転角度算出プログラム、注視点検出プログラム及び情報入力プログラム
US11576638B2 (en) X-ray imaging apparatus and X-ray image processing method
WO2009153724A1 (fr) Procédé pour produire des commandes par des mouvements oculaires
JP2005261728A (ja) 視線方向認識装置及び視線方向認識プログラム
JP2018508246A (ja) 少なくとも1つの医療機器を伴う外科的処置中に執刀者にイメージ化支援を提供する為の支援デバイス及び方法
CN111358421B (zh) 屈光图形生成方法、装置及计算机可读存储介质
CN116382473A (zh) 一种基于自适应时序分析预测的视线校准、运动追踪及精度测试方法
DE102011002577A1 (de) Fernsteuerungseinrichtung zur Steuerung einer Vorrichtung anhand eines beweglichen Objektes sowie Schnittstellen-Modul zur Kommunikation zwischen Modulen einer derartigen Fernsteuerungseinrichtung oder zwischen einem der Module und einer externen Vorrichtung
CN114967128B (zh) 一种应用于vr眼镜的视线追踪系统及方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09766261

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09766261

Country of ref document: EP

Kind code of ref document: A1