WO2009113026A2 - Apparatus for creating, saving and formatting text documents by gaze control, and associated method based on optimized cursor positioning - Google Patents


Info

Publication number
WO2009113026A2
Authority
WO
WIPO (PCT)
Prior art keywords
management module
data
information
user
states
Prior art date
Application number
PCT/IB2009/051007
Other languages
English (en)
Other versions
WO2009113026A3 (fr)
Inventor
Gianluca Dal Lago
Original Assignee
Sr Labs S.R.L.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sr Labs S.R.L. filed Critical Sr Labs S.R.L.
Priority to DK09719356.9T priority Critical patent/DK2266014T3/da
Priority to US12/922,236 priority patent/US8205165B2/en
Priority to EP09719356.9A priority patent/EP2266014B1/fr
Priority to CA2718441A priority patent/CA2718441C/fr
Priority to ES09719356T priority patent/ES2424662T3/es
Publication of WO2009113026A2 publication Critical patent/WO2009113026A2/fr
Publication of WO2009113026A3 publication Critical patent/WO2009113026A3/fr


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements

Definitions

  • The present invention relates to the field of interaction techniques in gaze-controlled system interfaces, and in particular to a new system that allows users to create, save and format text documents using eye-tracking devices through a method of fast cursor positioning.
  • This interface is difficult to develop for many reasons, in particular because the eyes are perceptive organs: the gaze also moves across the screen while the user is taking in information, without the user intending to produce any control command. Moreover the user, who may be a person with disabilities, may have difficulty controlling his or her gaze with accuracy high enough to control the computer as desired, and this is particularly pronounced when the objects to be controlled on the screen are small.
  • In the state of the art there are many systems that have tried, in different ways, to develop interaction methods based on the complete emulation of the mouse; in particular, some of them move the pointer as a function of gaze movement.
  • One of these interaction techniques magnifies the areas present on the screen so that the user can carry out an action more reliably using the pointer, and gives access to practically all Windows applications.
  • The area of the screen gazed at by the user is enlarged so that the selection of objects is made easier; the components outside this area shrink and/or move in relation to this expansion.
  • This apparatus represents a possible embodiment of an extremely innovative assistive technology for creating, saving and formatting text documents, based on the use of natural and alternative inputs such as the gaze.
  • Fig. 1 shows a block diagram of the architecture of the method according to the present invention.
  • Fig. 2 shows the flow chart of the method according to the present invention.
  • Fig. 3 shows the flow chart of the generation and execution of the action routine according to the present invention.
  • Fig. 4 shows the flow chart of the cursor positioning routine.
  • Fig. 5 shows the flow chart of the cursor positioning routine for text selection (multiple selection).
  • The apparatus that is the object of the present invention includes means for processing data and information, means for storing said data and information, and means for interfacing with the user.
  • Said means of electronic processing of data and information include an appropriate control section, preferably based on at least one microprocessor, and can be implemented, for instance, by a personal computer.
  • Said storage means preferably include a hard disk and flash memory.
  • Said means of user interface include means of data visualization, such as a display, monitor or similar external output unit, and an eye-tracking device to determine the direction of the user's gaze.
  • Said at least one microprocessor is preferably equipped with an appropriate software program whose architecture, described in Fig. 1, includes the following modules: a filtering module 10, which processes the user's gaze coordinates and makes the raw data coming from the eye-tracking device more stable; and a Set Action module 11, which manages the graphic interface of the application, holds the information about the component areas of the interface the user interacts with, and is responsible for determining which area is currently gazed at by the user and for determining and carrying out the action to perform.
  • Said Set Action module 11 holds the information about the type of action associated with the activation of a given component.
  • Said Set Action module 11 is formed by three component modules: an Events Management Module 12, which determines the rules for transforming input on the interface into changes of the application state through a mapping between user actions and application replies; a States Management Module 13, which represents the application data and determines the state and the functionality; and an Interface Management Module 14, which represents the visualization of interface objects and manages the application's graphic interface, since it holds the information related to the component areas of the graphic interface with which the user can interact and determines the interface area currently gazed at by the user.
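The three-module split described above corresponds to a model-view-controller style separation. The following Python sketch is purely illustrative; all class and method names are assumptions, since the patent does not prescribe any code:

```python
# Illustrative sketch (not from the patent) of the Set Action module's
# three components: events (controller), states (model), interface (view).

class StatesManagementModule:
    """Holds the application data and updates its internal state."""
    def __init__(self):
        self.text = ""

    def apply(self, change):
        # Apply a state change produced by the Events Management Module.
        change(self)


class InterfaceManagementModule:
    """Visualizes interface objects; pulls updated data from the states."""
    def __init__(self, states):
        self.states = states

    def refresh(self):
        # In the apparatus this would redraw the GUI; here we return a summary.
        return f"display: {self.states.text!r}"


class EventsManagementModule:
    """Maps interface events to application replies."""
    def __init__(self, states, interface):
        self.states = states
        self.interface = interface
        self.mapping = {}  # event name -> state change

    def register(self, event_name, change):
        self.mapping[event_name] = change

    def process(self, event_name):
        # Send the mapped change to the States Management Module,
        # then let the Interface Management Module update the view.
        self.states.apply(self.mapping[event_name])
        return self.interface.refresh()
```

A registered event such as "a key was typed" would then flow through `process()` exactly in the order the patent describes: event mapping, state update, interface refresh.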
  • The user interface of the application that realises the method in accordance with the present invention, and that allows the user to interact with said program through an eye-tracking device associated with said electronic processor, is displayed 20 on visualization means associated with said electronic processor.
  • The user's gaze coordinates are calculated 21 as raw data by the eye-tracking device, and represent the gaze coordinates along the coordinate axes, obtained at the frequency typical of said eye-tracking device.
  • The raw data related to said coordinates are filtered 22 so as to make them more stable and suitable for providing information about the user's fixations, that is, the dwelling of the user's gaze around a certain area.
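The filtering step 22 can be illustrated with a simple dispersion-based fixation detector, a common way to turn raw gaze samples into stable fixation points. The function name and threshold values below are illustrative assumptions, not taken from the patent:

```python
def detect_fixations(samples, max_dispersion=30, min_samples=5):
    """Group raw gaze samples (x, y) into fixation centroids.

    A run of consecutive samples whose combined x-span + y-span stays
    within `max_dispersion` pixels counts as one fixation; its centroid
    is returned as the stabilized gaze point. Thresholds are arbitrary
    illustrative values, not from the patent.
    """
    fixations = []
    window = []
    for point in samples:
        window.append(point)
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            # The new point broke the dispersion limit: close the
            # previous run if it was long enough to be a fixation.
            if len(window) - 1 >= min_samples:
                done = window[:-1]
                cx = sum(p[0] for p in done) / len(done)
                cy = sum(p[1] for p in done) / len(done)
                fixations.append((cx, cy))
            window = [point]
    if len(window) >= min_samples:
        cx = sum(p[0] for p in window) / len(window)
        cy = sum(p[1] for p in window) / len(window)
        fixations.append((cx, cy))
    return fixations
```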
  • Step e) of the sequence described in Figure 2 occurs in accordance with the sequence explained below and in Figure 3:
  • The Events Management Module processes the event 30 through a mapping between it and the application reply, so that every event/action is joined to a corresponding action in the user interface that involves a change of the data and, where applicable, a change of the user interface itself.
  • The Events Management Module 31 sends this information to the States Management Module.
  • The States Management Module 32 processes this information and updates its internal state on the basis of the received information.
  • The Interface Management Module 33 requires the data for the update from the States Management Module.
  • The cursor positioning - with the cursor placed at the end of the text displayed on said user interface, if it is the first opening of the text window, or placed as per the selection carried out at the end of the previous session - occurs in accordance with the steps, described below, that detail the previous step j). After this positioning the user will be able to carry out operations of erasing, insertion, etc., usually performed during text processing.
  • The Interface Management Module requires the data for the update from the States Management Module.
  • The Events Management Module receives as input the event related to the selection of the word chosen among the buttons of the lateral bar, determines the reply of the application and sends this information to the States Management Module.
  • The States Management Module processes this information and updates its state.
  • The Interface Management Module requires the data for the update from the States Management Module and produces a colored bar placed at the end of the chosen word.
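The placement of the cursor (the colored bar) at the end of a chosen word amounts to computing a character offset from a word index. `cursor_after_word` is a hypothetical helper used only to illustrate the idea; it is not named in the patent:

```python
def cursor_after_word(text, word_index):
    """Return the character offset just after the word at `word_index`,
    i.e. where the colored bar (the cursor) would be placed.
    Illustrative helper; names and data model are assumptions."""
    offset = 0
    for i, word in enumerate(text.split()):
        start = text.index(word, offset)  # locate this word in the text
        offset = start + len(word)        # offset just past the word
        if i == word_index:
            return offset
    raise IndexError("word_index out of range")
```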
  • The positioning of the cursor for text selection occurs following the procedure described previously (steps l - o) for the positioning of the first cursor at the start/end of the text to select, and afterwards following the steps explained below, which result from changes to step j) and the subsequent steps.
  • The Events Management Module receives as input the event related to the selection of the "Select" button from the lateral bar, determines the reply of the application and sends this information to the States Management Module.
  • The States Management Module processes this information and updates its state.
  • The Interface Management Module requires the data for updating from the States Management Module.
  • The Events Management Module receives as input the event related to the selection of the chosen word (the one preceding the first word the user must select, or the one following the last) among the buttons of the lateral bar, determines the reply of the application and sends this information to the States Management Module.
  • The States Management Module processes this information and updates its state.
  • The Interface Management Module requires the data for the update from the States Management Module and colours the text included between the two words as feedback of the selection carried out.
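The multiple-selection step, colouring the text between two chosen words, reduces to computing a character span from two word indices. The helper below is an illustrative assumption, not the patent's implementation:

```python
def select_between(text, first_word_index, last_word_index):
    """Return the (start, end) character span covering the words from
    `first_word_index` through `last_word_index` inclusive: the span
    that would be coloured as selection feedback.
    Illustrative sketch; names and data model are assumptions."""
    offsets = []
    pos = 0
    for word in text.split():
        start = text.index(word, pos)          # locate each word
        offsets.append((start, start + len(word)))
        pos = start + len(word)
    return offsets[first_word_index][0], offsets[last_word_index][1]
```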

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Digital Computer Display Output (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Document Processing Apparatus (AREA)
  • Channel Selection Circuits, Automatic Tuning Circuits (AREA)

Abstract

The invention relates to a method and apparatus for creating, saving and formatting text documents using a gaze-control device and a fast cursor positioning system. Compared with the prior art, the present invention provides faster interaction and requires less effort.
PCT/IB2009/051007 2008-03-12 2009-03-11 Appareil permettant de créer, de sauvegarder et de formater des documents textes par une commande par le regard et procédé associé selon le positionnement optimisé d'un curseur WO2009113026A2 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
DK09719356.9T DK2266014T3 (da) 2008-03-12 2009-03-11 Apparat til at frembringe, gemme og formatere tekstdokumenter under anvendelse af øjenstyring samt tilhørende fremgangsmåde
US12/922,236 US8205165B2 (en) 2008-03-12 2009-03-11 Apparatus to create, save and format text documents using gaze control and method associated based on the optimized positioning of cursor
EP09719356.9A EP2266014B1 (fr) 2008-03-12 2009-03-11 Appareil permettant de créer, de sauvegarder et de formater des documents textes par une commande par le regard et procédé associé
CA2718441A CA2718441C (fr) 2008-03-12 2009-03-11 Appareil permettant de creer, de sauvegarder et de formater des documents textes par une commande par le regard et procede associe selon le positionnement optimise d'un curseur
ES09719356T ES2424662T3 (es) 2008-03-12 2009-03-11 Aparato para crear, grabar y formatear documentos de texto usando control con la mirada y método asociado

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ITFI2008A000049 2008-03-12
IT000049A ITFI20080049A1 (it) 2008-03-12 2008-03-12 Apparato per la creazione, il salvataggio e la formattazione di documenti testuali tramite controllo oculare e metodo associato basato sul posizionamento ottimizzato del cursore.

Publications (2)

Publication Number Publication Date
WO2009113026A2 true WO2009113026A2 (fr) 2009-09-17
WO2009113026A3 WO2009113026A3 (fr) 2010-01-21

Family

ID=40292754

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/051007 WO2009113026A2 (fr) 2008-03-12 2009-03-11 Appareil permettant de créer, de sauvegarder et de formater des documents textes par une commande par le regard et procédé associé selon le positionnement optimisé d'un curseur

Country Status (8)

Country Link
US (1) US8205165B2 (fr)
EP (1) EP2266014B1 (fr)
CA (1) CA2718441C (fr)
DK (1) DK2266014T3 (fr)
ES (1) ES2424662T3 (fr)
IT (1) ITFI20080049A1 (fr)
PT (1) PT2266014E (fr)
WO (1) WO2009113026A2 (fr)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US8793620B2 (en) * 2011-04-21 2014-07-29 Sony Computer Entertainment Inc. Gaze-assisted computer interface
US8730156B2 (en) 2010-03-05 2014-05-20 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US20100118200A1 (en) * 2008-11-10 2010-05-13 Geoffrey Michael Gelman Signage
IT1399456B1 (it) * 2009-09-11 2013-04-19 Sr Labs S R L Metodo e apparato per l'utilizzo di generiche applicazioni software attraverso controllo oculare e opportune metodologie di interazione.
US8888287B2 (en) 2010-12-13 2014-11-18 Microsoft Corporation Human-computer interface system having a 3D gaze tracker
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US8571851B1 (en) * 2012-12-31 2013-10-29 Google Inc. Semantic interpretation using user gaze order
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US9829972B2 (en) * 2014-03-12 2017-11-28 Lenovo (Singapore) Pte. Ltd. Eye tracking for automatically updating document status
US10993837B2 (en) * 2014-04-23 2021-05-04 Johnson & Johnson Surgical Vision, Inc. Medical device data filtering for real time display
WO2016085212A1 (fr) 2016-06-02 Samsung Electronics Co., Ltd. Electronic device and display control method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007017500A1 (fr) 2005-08-10 2007-02-15 Sr Labs S.R.L. Method and apparatus for the secure insertion of an access code using an eye-tracking device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6204828B1 (en) * 1998-03-31 2001-03-20 International Business Machines Corporation Integrated gaze/manual cursor positioning system
US6091378A (en) * 1998-06-17 2000-07-18 Eye Control Technologies, Inc. Video processing methods and apparatus for gaze point tracking
US6873314B1 (en) * 2000-08-29 2005-03-29 International Business Machines Corporation Method and system for the recognition of reading skimming and scanning from eye-gaze patterns
US7092554B2 (en) * 2001-05-01 2006-08-15 Eastman Kodak Company Method for detecting eye and mouth positions in a digital image
US6712468B1 (en) * 2001-12-12 2004-03-30 Gregory T. Edwards Techniques for facilitating use of eye tracking data
GB2396001B (en) * 2002-10-09 2005-10-26 Canon Kk Gaze tracking system
EP1691670B1 (fr) * 2003-11-14 2014-07-16 Queen's University At Kingston Procede et appareil de poursuite oculaire sans etalonnage
DK1607840T3 (da) * 2004-06-18 2015-02-16 Tobii Technology Ab Øjenstyring af et computerapparat
ITFI20040223A1 (it) * 2004-10-29 2005-01-29 Sr Labs S R L Metodo e sistema di visualizzazione,elaborazione ed analisi integrata di immagini mediche
EP1943583B1 (fr) * 2005-10-28 2019-04-10 Tobii AB Appareil de suivi de l'oeil a retour visuel
US7429108B2 (en) 2005-11-05 2008-09-30 Outland Research, Llc Gaze-responsive interface to enhance on-screen user reading tasks
US7556377B2 (en) * 2007-09-28 2009-07-07 International Business Machines Corporation System and method of detecting eye fixations using adaptive thresholds


Also Published As

Publication number Publication date
PT2266014E (pt) 2013-08-22
DK2266014T3 (da) 2013-07-29
US20110022950A1 (en) 2011-01-27
EP2266014B1 (fr) 2013-05-22
WO2009113026A3 (fr) 2010-01-21
CA2718441A1 (fr) 2009-09-17
ES2424662T3 (es) 2013-10-07
EP2266014A2 (fr) 2010-12-29
ITFI20080049A1 (it) 2009-09-13
CA2718441C (fr) 2018-01-16
US8205165B2 (en) 2012-06-19

Similar Documents

Publication Publication Date Title
EP2266014B1 (fr) Apparatus for creating, saving and formatting text documents by gaze control, and associated method
JP4944773B2 (ja) Apparatus, method and computer program for controlling a computer device on the basis of eye tracking
CA2773636C (fr) Method and apparatus for using generic software applications by means of eye control and suitable interaction methods
US20170075420A1 (en) Eye tracker based contextual action
Kumar et al. Eye-controlled interfaces for multimedia interaction
Darbar et al. Exploring smartphone-enabled text selection in ar-hmd
Rodrigues et al. Swat: Mobile system-wide assistive technologies
Po et al. Pointing and visual feedback for spatial interaction in large-screen display environments
Greene et al. Initial ACT-R extensions for user modeling in the mobile touchscreen domain
CN114967918A (zh) 一种头戴显示设备多模交互方法及系统
CN114924646A (zh) 一种头戴显示设备多模交互方法及系统
WO2019091840A1 (fr) System and method for control interface feedback modulation
Bacher Web2cHMI. A multi-modal native user interface implementation for accelerator operations and maintenance applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09719356

Country of ref document: EP

Kind code of ref document: A2

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2718441

Country of ref document: CA

Ref document number: 12922236

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2009719356

Country of ref document: EP