WO2013183533A1 - Display device, display method, and program - Google Patents

Display device, display method, and program

Info

Publication number
WO2013183533A1
Authority
WO
WIPO (PCT)
Prior art keywords
displayed
cursor
display
user
line
Prior art date
Application number
PCT/JP2013/065015
Other languages
English (en)
Japanese (ja)
Inventor
晴彦 杉崎
山崎 仁史
誠二 白石
鈴木 大輔
Original Assignee
株式会社エヌ・ティ・ティ・ドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社エヌ・ティ・ティ・ドコモ filed Critical 株式会社エヌ・ティ・ティ・ドコモ
Publication of WO2013183533A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus

Definitions

  • The present invention relates to a user interface.
  • In a gaze-based interface, processing corresponding to the position of the viewpoint is executed. Consequently, if the user unintentionally looks at a position other than the intended one, the process corresponding to the viewed position is executed, and an operation the user did not intend is performed.
  • The present invention has been made against this background, and an object thereof is to enable operations as intended by the user when an operation is performed using the line of sight.
  • The present invention provides a display device comprising: display means having a display surface for displaying an image; line-of-sight detection means for detecting a user's line of sight; specifying means for specifying a position touched by the user on the display surface; first control means for controlling the display means so that an operation image for operating the device is displayed at the position specified by the specifying means; and second control means that displays a cursor on the display means, fixes the position of the cursor when the operation image is displayed on the display means, and, when the operation image is not displayed on the display means, determines the position of the cursor based on the line of sight detected by the line-of-sight detection means and controls the display means so that the cursor is displayed at the determined position.
  • The first control means may be configured to control the display means so that, when the specifying means specifies a position touched by the user, the operation image corresponding to the image displayed overlapping the cursor is displayed.
  • The display mode of the cursor when the operation image is displayed may differ from the display mode of the cursor when the operation image is not displayed.
  • The present invention also provides a display method comprising: a line-of-sight detection step of detecting a user's line of sight; a specifying step of specifying a position touched by the user on a display surface of display means for displaying an image; a first control step of controlling the display means so that an operation image for operating the device is displayed at the position specified in the specifying step; and a second control step of fixing the position of a displayed cursor while the operation image is displayed and, while it is not, determining the position of the cursor based on the detected line of sight and controlling the display means so that the cursor is displayed at the determined position.
  • According to the present invention, when an operation is performed using the line of sight, the operation can be performed as intended by the user.
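As a rough illustration, the claimed division of labor between the two control means can be sketched as follows in Python. All names here are hypothetical and chosen for this sketch only; the patent specifies no code, just the means described above.

    class GazeCursorController:
        """The cursor follows the detected line of sight, except while an
        operation image (submenu) is displayed, during which it stays fixed."""

        def __init__(self):
            self.cursor_pos = (0, 0)
            self.submenu = None  # currently displayed operation image, if any

        def on_gaze(self, gaze_pos):
            # Second control means: move the cursor to the gaze position
            # only while no operation image is displayed.
            if self.submenu is None:
                self.cursor_pos = gaze_pos

        def on_touch(self, touch_pos, image_under_cursor):
            # First control means: display the operation image corresponding
            # to the image overlapping the cursor, at the touched position.
            self.submenu = {"target": image_under_cursor, "shown_at": touch_pos}

    controller = GazeCursorController()
    controller.on_gaze((120, 80))           # cursor follows the gaze
    controller.on_touch((40, 200), "icon")  # submenu appears; cursor now fixed
    controller.on_gaze((300, 10))           # ignored: cursor stays at (120, 80)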
  • FIG. 1 is a diagram illustrating a hardware configuration of the information processing apparatus 1.
  • FIG. 2 is a functional block diagram of the information processing apparatus 1.
  • The remaining figures each show an example of a screen displayed on the display unit 1042.
  • DESCRIPTION OF SYMBOLS: 1 ... Information processing apparatus, 101 ... Bus, 102 ... Control unit, 102A ... CPU, 102B ... ROM, 102C ... RAM, 103 ... Storage unit
  • The communication unit 105 functions as a communication interface for wireless communication. The information processing apparatus 1 controls the communication unit 105 to connect to a wireless LAN (Local Area Network) or a mobile communication network, and performs data communication and voice communication.
  • The operation unit 104 includes a plurality of hardware buttons for operating the information processing apparatus 1. The operation unit 104 also includes a touch screen 104A in which the display unit 1042, an example of display means for displaying an image, is integrated with a position detection unit 1041; the position detection unit 1041 is provided on the surface of the display unit 1042, transmits the image displayed on the display unit 1042, and detects the position touched by a finger.
  • For the display unit 1042, a liquid crystal display or an organic EL (Electro Luminescence) display is used; however, the display unit is not limited to these, and other displays may be used.
  • For the position detection unit 1041, a unit that detects the positions of a plurality of touching fingers (for example, a capacitive unit) is used.
  • The imaging unit 106 includes a lens and a solid-state image sensor, and generates image data representing the image formed on the image sensor by the lens.
  • The storage unit 103 includes a nonvolatile memory, and stores the program OS that realizes an operating system as well as various application programs.
  • The control unit 102 includes a CPU (Central Processing Unit) 102A, a ROM (Read Only Memory) 102B, a RAM (Random Access Memory) 102C, and the like. When the CPU 102A executes the IPL (Initial Program Loader) stored in the ROM 102B and then the program OS stored in the storage unit 103, an operating system is realized and various application programs become executable.
  • When a predetermined application program is executed in the information processing apparatus 1, a user interface using the touch screen 104A and the imaging unit 106 is realized. FIG. 2 is a functional block diagram showing, among the functions realized in the control unit 102, those related to this user interface according to the present invention.
  • The specifying unit 201 specifies the position touched by the user's finger on the position detection unit 1041. Specifically, the specifying unit 201 acquires position data generated by the position detection unit 1041; this position data represents the position on the display surface of the display unit 1042 touched by the user's finger. The specifying unit 201 specifies the touched position based on this position data.
  • The line-of-sight detection unit 202 detects the user's line of sight. Specifically, the line-of-sight detection unit 202 acquires an image of the user's face captured by the imaging unit 106 and detects the user's line of sight from the acquired face image.
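The excerpt does not spell out how a line of sight is computed from the face image; it defers to known techniques. Purely as an illustrative stand-in, a naive geometric mapping from a detected pupil position to a horizontal screen coordinate could look like the following; real gaze detection is considerably more involved.

    def estimate_gaze_x(pupil_x, eye_left_x, eye_right_x, screen_width):
        """Naive 1-D sketch: interpolate the pupil's horizontal position
        between the detected eye corners (pixel coordinates taken from the
        face image) onto the width of the display screen."""
        ratio = (pupil_x - eye_left_x) / (eye_right_x - eye_left_x)
        ratio = min(max(ratio, 0.0), 1.0)  # clamp to the valid range
        return ratio * screen_width

    # A pupil slightly right of center maps slightly right of screen center.
    print(estimate_gaze_x(66.0, 50.0, 80.0, 1280))  # about 682.7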
  • The first control unit 203 controls the display unit 1042 so that a submenu, which is an example of an operation image for operating the information processing apparatus 1, is displayed at the position specified by the specifying unit 201. The items in the submenu change according to the position of the cursor.
  • The second control unit 204 displays a cursor on the display unit 1042. While the submenu is displayed on the display unit 1042, the second control unit 204 fixes the position of the displayed cursor. While the submenu is not displayed, the second control unit 204 determines the position of the cursor based on the line of sight detected by the line-of-sight detection unit 202 and controls the display unit 1042 so that the cursor is displayed at the determined position.
  • FIG. 6 shows an example of an operation screen displayed on the information processing apparatus 1. On this screen, a cursor 10 is displayed, and icons 20-1 and 20-2, which are examples of pictograms, are displayed. A display area 30 (a so-called window) related to an application program for displaying photographs is also displayed.
  • When the imaging unit 106 captures an image of the user's face, the control unit 102 acquires this face image (FIG. 3: step SA1). The control unit 102 then detects the user's line of sight using, for example, the technique disclosed in Japanese Patent Laid-Open No. 10-39995, and specifies the position the user is looking at on the display screen (step SA2). The technique for detecting the user's line of sight is not limited to this method: other known methods may be used, such as detecting the line of sight from an image captured by a camera attached to the user's head.
  • Next, the control unit 102 determines whether a submenu, described later, is displayed (step SA3). If the submenu is not displayed on the display unit 1042 (NO in step SA3), the control unit 102 controls the display unit 1042 so that the cursor 10 is displayed at the position specified in step SA2 (step SA4). For example, when the user looks at the position of the icon 20-2, the cursor 10 is displayed on the icon 20-2, and when the user looks into the display area 30, the cursor 10 is displayed in the display area 30, as shown in the drawings.
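Steps SA1 to SA4 amount to a per-frame update that skips the cursor move while a submenu is shown. A sketch under that reading follows; camera, detector, and display are hypothetical interfaces, not anything named in the patent.

    def gaze_update(camera, detector, display):
        face_image = camera.capture()           # step SA1: acquire the face image
        gaze_pos = detector.detect(face_image)  # step SA2: detect the line of sight
        if display.submenu_visible:             # step SA3: submenu displayed? (YES)
            return                              # leave the cursor position fixed
        display.move_cursor(gaze_pos)           # step SA4: (NO) move the cursor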
  • When the user touches the touch screen 104A, the position detection unit 1041 generates position data indicating the position touched by the finger, and the control unit 102 acquires this position data (step SB1). Next, the control unit 102 identifies the image overlapping the cursor 10 (step SB2), and controls the display unit 1042 so that the submenu corresponding to the identified image is displayed at the position represented by the position data (step SB3).
  • For example, when the cursor 10 is at the position of the icon 20-2, the control unit 102 controls the display unit 1042 so that the submenu M1 corresponding to the icon 20-2 is displayed, as shown in the drawing. The submenu M1 is displayed at the position of the finger 40 that touched the touch screen 104A.
  • When the cursor 10 is in the display area 30, the control unit 102 controls the display unit 1042 so that the submenu M2 corresponding to the application program related to the display area 30 is displayed, as shown in the drawing. The submenu M2 is likewise displayed at the position of the finger 40 that touched the touch screen 104A.
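The touch handling of steps SB1 to SB3 can be sketched as below. The submenu table and its item names are invented for illustration; the point is that the menu contents depend on the image under the cursor, while the menu location depends on the finger.

    SUBMENUS = {
        "icon": ["execute", "delete"],    # hypothetical items of submenu M1
        "window": ["enlarge", "reduce"],  # hypothetical items of submenu M2
    }

    def on_touch(display, touch_pos):
        # step SB1: touch_pos comes from the position data generated by
        # the position detection unit 1041
        kind = display.image_kind_at(display.cursor_pos)    # step SB2
        display.show_submenu(SUBMENUS[kind], at=touch_pos)  # step SB3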
  • When the user moves his or her line of sight while the submenu is displayed, the control unit 102 determines YES in step SA3 and ends the process of FIG. 3 without executing step SA4. That is, while the submenu is displayed, the process of step SA4 is not executed and the cursor 10 does not move even if the user changes the position being looked at; the position of the cursor 10 is therefore fixed.
  • While the submenu is displayed, the control unit 102 also executes the following process. First, the control unit 102 judges whether a predetermined time (3 seconds in the present embodiment) has elapsed since position data could no longer be acquired, that is, since the user released the finger from the touch screen 104A.
  • If the predetermined time has elapsed, the control unit 102 erases the displayed submenu (step SC2). Thereafter, when the control unit 102 executes the process of FIG. 3, the submenu is no longer displayed, so it determines NO in step SA3, executes the process of step SA4, and the cursor 10 is displayed at the position the user is looking at. That is, once the submenu has been erased, the cursor 10 moves again when the line of sight is moved.
  • If, before the predetermined time elapses, the user touches a position within the displayed submenu, the control unit 102 specifies the item at that position and executes the process corresponding to the specified item (step SC5).
  • For example, the control unit 102 executes the application program related to the icon 20-2, since the cursor 10 is at the position of the icon 20-2.
  • When the specified position is the "enlarge" item of the submenu M2, the control unit 102 controls the display unit 1042 so that the image displayed in the display area 30 is enlarged; similarly, the display unit 1042 is controlled so that the image displayed in the display area 30 is reduced when the corresponding item is specified.
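The erase-or-execute behavior after the finger leaves the screen might be sketched as follows. The 3-second constant comes from the embodiment; the intermediate step numbers between SC2 and SC5 are not given in this excerpt, so the branch structure here is a guess.

    import time

    ERASE_DELAY = 3.0  # seconds without touch before the submenu is erased

    def check_submenu(display, released_at, touch_pos=None):
        if touch_pos is not None:
            item = display.submenu_item_at(touch_pos)
            if item is not None:
                display.execute(item)   # step SC5: run the item's process
                return
        if time.monotonic() - released_at >= ERASE_DELAY:
            display.erase_submenu()     # step SC2: cursor becomes movable again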
  • In the embodiment described above, the imaging unit 106 is built into the information processing apparatus 1, but the configuration is not limited to this. For example, a camera including a lens and a solid-state image sensor may be connected to the information processing apparatus 1 so that the user's face is captured by that camera, or such a camera may be arranged in the frame of a pair of spectacles so that the user's face is captured by the camera provided in the frame.
  • A screen related to a single application program may also be displayed on the entire display screen. In this case as well, the submenu may be displayed at the position touched by the finger, and the items in the submenu may be changed according to the position of the cursor.
  • In the embodiment, the submenu is displayed when the user touches the touch screen 104A, but the trigger for displaying the submenu is not limited to this operation. For example, the submenu may be displayed when the user slides the finger while keeping it in contact with the touch screen 104A.
  • The area of the cursor 10 while its position is fixed may be made larger than while it is movable. The shape of the cursor 10 may also be made to differ between the movable state and the fixed state. These modifications may be combined.
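The display-mode variation just described reduces to a small style policy; for instance (sizes and shapes here are arbitrary illustrations):

    def cursor_style(position_fixed):
        # A fixed cursor is drawn larger and with a different shape, so the
        # user can tell at a glance that gaze movement is currently ignored.
        if position_fixed:
            return {"size": 32, "shape": "square"}
        return {"size": 16, "shape": "circle"}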
  • The program for realizing the functions of the above-described embodiment may be provided stored in a computer-readable recording medium, such as a magnetic recording medium (magnetic tape, or a magnetic disk such as an HDD (Hard Disk Drive) or FD (Flexible Disk)), an optical recording medium (such as an optical disc), a magneto-optical recording medium, or a semiconductor memory, and installed in the information processing apparatus 1. Alternatively, the program may be downloaded to the information processing apparatus 1 and installed via a communication line.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This invention enables operations that accord with the user's intention when an operation is performed using the line of sight. A line-of-sight detector (202) detects the user's line of sight from an image of the user's face captured by an imaging unit (106). A specifying unit (201) specifies the position on the screen of a display unit (1042) touched by the user. A first control unit (203) controls the display unit (1042) so that a submenu for operating the display device is displayed at the position specified by the specifying unit (201). A second control unit (204) controls the display unit (1042) so as to fix the cursor position while the submenu is being displayed on the display unit (1042), and, while the submenu is not being displayed on the display unit (1042), to determine the cursor position based on the line of sight detected by the line-of-sight detector so that the cursor is displayed at the determined position.
PCT/JP2013/065015 2012-06-08 2013-05-30 Display device, display method, and program WO2013183533A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-130469 2012-06-08
JP2012130469A JP2013254406A (ja) 2012-06-08 2012-06-08 Display device, display method, and program

Publications (1)

Publication Number Publication Date
WO2013183533A1 (fr) 2013-12-12

Family

ID=49711925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/065015 WO2013183533A1 (fr) 2012-06-08 2013-05-30 Display device, display method, and program

Country Status (2)

Country Link
JP (1) JP2013254406A (fr)
WO (1) WO2013183533A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6131540B2 (ja) * 2012-07-13 2017-05-24 富士通株式会社 Tablet terminal, operation acceptance method, and operation acceptance program
JP7215254B2 (ja) * 2019-03-13 2023-01-31 株式会社リコー Information processing apparatus, display control method, and program
JP7383471B2 (ja) * 2019-12-20 2023-11-20 キヤノン株式会社 Electronic device and control method therefor

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001100903A (ja) * 1999-09-28 2001-04-13 Sanyo Electric Co Ltd 視線検出機能搭載装置
JP2010224684A (ja) * 2009-03-19 2010-10-07 Smk Corp 操作入力装置、制御方法、およびプログラム

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3070582A1 (fr) * 2015-03-19 2016-09-21 Lenovo (Singapore) Pte. Ltd. Appareil, procédé et produit de programme pour régler une position de curseur
CN105988664A (zh) * 2015-03-19 2016-10-05 联想(新加坡)私人有限公司 用于设置光标位置的设备和方法
US9983695B2 (en) 2015-03-19 2018-05-29 Lenovo (Singapore)Pte. Ltd. Apparatus, method, and program product for setting a cursor position
CN105988664B (zh) * 2015-03-19 2019-04-23 联想(新加坡)私人有限公司 用于设置光标位置的设备和方法

Also Published As

Publication number Publication date
JP2013254406A (ja) 2013-12-19

Similar Documents

Publication Publication Date Title
KR102423826B1 (ko) User terminal apparatus and control method thereof
KR102255774B1 (ko) Interaction with a device using gestures
US9733700B2 (en) Ring-type mobile terminal
EP2808781B1 (fr) Method, storage medium, and electronic device for mirroring screen data
EP3617861A1 (fr) Graphical user interface display method and electronic device
KR20150090840A (ko) Device and method for protecting a region of a display screen
EP2913739A1 (fr) Input identification in an electronic device
KR20150128377A (ko) Fingerprint processing method and electronic device therefor
TW201329835A (zh) Display control device, display control method, and computer program
JP6483452B2 (ja) Electronic device, control method, and control program
WO2013190989A1 (fr) Display device, display method, and program
TW201331825A (zh) Apparatus and method for providing visual transitions between screens
KR20130115174A (ko) Apparatus and method for providing a digital bezel
US9785284B2 (en) Touch screen device
US10095384B2 (en) Method of receiving user input by detecting movement of user and apparatus therefor
JP5342040B1 (ja) Display device, display method, and program
WO2013183533A1 (fr) Display device, display method, and program
KR102113509B1 (ko) Virtual keypad control method and electronic device therefor
JP5835240B2 (ja) Information processing apparatus, information processing method, and program
JP2020017215A (ja) Electronic device, control program, and display control method
KR102429428B1 (ko) Electronic device and operating method thereof
KR20160025914A (ko) Electronic device and method for setting a block
JP5514922B1 (ja) User operation control program, portable device, and user operation control method
JP6616379B2 (ja) Electronic device
JP7034856B2 (ja) Electronic device, control program, and display control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13801270

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13801270

Country of ref document: EP

Kind code of ref document: A1