WO2020218859A1 - User input method and user interface device for performing the same - Google Patents

User input method and user interface device for performing the same

Info

Publication number
WO2020218859A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
gaze
touch screen
cursor
unit
Prior art date
Application number
PCT/KR2020/005398
Other languages
English (en)
Korean (ko)
Inventor
석윤찬
이용은
Original Assignee
주식회사 비주얼캠프
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 비주얼캠프 filed Critical 주식회사 비주얼캠프
Publication of WO2020218859A1 publication Critical patent/WO2020218859A1/fr

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 - Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186 - Touch location disambiguation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 - Drag-and-drop
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G06V40/19 - Sensors therefor

Definitions

  • Embodiments of the present invention relate to user interface technology.
  • The disclosed embodiments are intended to provide a new user interface technology.
  • A user interface device according to a disclosed embodiment is a user interface device for performing user input in a user terminal having a touch screen, and includes: a gaze tracking unit that generates gaze position information by tracking a user's gaze on the touch screen; an assist button unit displayed on the touch screen; and a cursor display unit that, in association with the user's touch of the assist button unit, displays a cursor at the position on the touch screen corresponding to the gaze position information.
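  • For illustration only (this sketch is not part of the original disclosure), the units above could be composed as in the following minimal Python example; the class, attribute, and method names are assumptions:

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

Point = Tuple[float, float]  # (x, y) coordinates on the touch screen


@dataclass
class UserInterfaceDevice:
    """Hypothetical composition of the gaze tracking, assist button,
    and cursor display units described above."""
    track_gaze: Callable[[], Point]      # gaze tracking unit: returns gaze position
    cursor: Optional[Point] = None       # cursor display unit state (None = hidden)
    assist_button_visible: bool = True   # assist button unit shown on the touch screen

    def on_assist_button_touch(self) -> None:
        # In association with the user's touch of the assist button unit,
        # display the cursor at the position corresponding to the tracked gaze.
        self.cursor = self.track_gaze()


# Example: a stand-in gaze tracker that always reports the same position.
device = UserInterfaceDevice(track_gaze=lambda: (120.0, 340.0))
device.on_assist_button_touch()
assert device.cursor == (120.0, 340.0)
```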
  • the preset event may occur when a user touches the assist button or a cursor is displayed on the touch screen.
  • the user interface device may further include a selection determination unit configured to determine a user's selection point on the touch screen in conjunction with the position of the cursor and the drag end operation of the assist button unit.
  • the selection determination unit may determine the position of the cursor on the touch screen as the user's selection point at the end of dragging of the assist button unit.
  • the user interface device may further include a calibration unit for calibrating the gaze tracking unit based on the gaze position information and information on the user's selection point.
  • the calibration unit may map the gaze position information and information on the user’s selection point whenever a user selection operation occurs, and perform calibration on the gaze tracking unit based on the mapped information.
  • The calibration unit may detect a gaze vector based on the gaze position information, detect a point vector on the touch screen based on the information on the user's selection point, and perform calibration by updating a matching matrix that associates gaze vectors with point vectors on a one-to-one basis, using the pair of gaze vector and point vector detected whenever the user's selection operation occurs.
  • A user input method according to a disclosed embodiment is performed in a computing device having one or more processors and a memory that stores one or more programs executed by the one or more processors, and includes: generating gaze position information by tracking a user's gaze on a touch screen; displaying an assist button unit on the touch screen; and, in association with the user's touch of the assist button unit, displaying a cursor at the position on the touch screen corresponding to the gaze position information.
  • The user input method includes: allowing the assist button unit to be dragged on the touch screen according to the occurrence of a preset event; and moving the cursor displayed on the touch screen in association with the drag direction and the drag length of the assist button unit.
  • the user input method may further include an operation of determining a user's selection point on the touch screen in conjunction with a position of the cursor and a drag end operation of the assist button unit.
  • the position of the cursor on the touch screen at the end of dragging of the assist button unit may be determined as the user's selection point.
  • the user input method may further include calibrating a gaze tracking model for tracking the user's gaze based on the gaze position information and information on the user's selection point.
  • the gaze position information and information on the user’s selection point may be mapped, and the gaze tracking model may be calibrated based on the mapped information.
  • The calibration may include: detecting a gaze vector based on the gaze position information; detecting a point vector on the touch screen based on the information on the user's selection point; and updating a matching matrix that associates the gaze vector and the point vector on a one-to-one basis, based on the pair of gaze vector and point vector detected whenever the user's selection operation occurs.
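  • The claims do not specify how the matching matrix is computed; as a hedged illustration, one plausible realization is a linear least-squares fit over the accumulated gaze vector and point vector pairs, sketched below (function and variable names are assumptions):

```python
import numpy as np


def update_matching_matrix(gaze_vectors, point_vectors):
    """Fit a matrix M such that point ≈ M @ gaze over all accumulated pairs.

    gaze_vectors:  (N, 3) array, one gaze vector per user selection operation.
    point_vectors: (N, 2) array, the corresponding selection points on screen.
    Returns M with shape (2, 3).
    """
    G = np.asarray(gaze_vectors, dtype=float)   # (N, 3)
    P = np.asarray(point_vectors, dtype=float)  # (N, 2)
    # Solve G @ X ≈ P for X in the least-squares sense; M is its transpose,
    # so that M @ g ≈ p for each (gaze vector, point vector) pair.
    X, *_ = np.linalg.lstsq(G, P, rcond=None)   # X has shape (3, 2)
    return X.T
```

  • Under this reading, each time the user's selection operation occurs the newly detected pair would be appended to the arrays and the matrix re-fit, which corresponds to the "updating" step described above.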
  • According to the disclosed embodiments, by tracking the user's gaze to place a cursor on the touch screen, allowing the cursor to be finely adjusted through the assist button unit, and letting the user select the intended selection point, the user can accurately select parts of the touch screen that are difficult to touch when holding the device in one hand.
  • In addition, a gaze vector and point vector pair is generated each time the user makes a selection, without a separate calibration process, so that gaze tracking accuracy can be gradually improved.
  • FIG. 1 is a diagram showing the configuration of a user interface device according to an embodiment of the present invention
  • FIGS. 2 to 3 are diagrams showing a process of determining a user's selection point using a user interface device according to an embodiment of the present invention
  • FIG. 4 is a flowchart showing a user input method using a user interface device according to an embodiment of the present invention
  • FIG. 5 is a block diagram illustrating a computing environment that includes a computing device suitable for use in exemplary embodiments.
  • In the following description, "transmission", "communication", "sending", and "reception" of a signal or information, and other terms with a similar meaning, include not only a signal or information transmitted directly from one component to another component, but also a signal or information passed through another component.
  • In particular, "transmitting" or "sending" a signal or information to a component indicates the final destination of the signal or information and does not imply a direct destination. The same applies to "receiving" a signal or information.
  • In addition, when two or more pieces of data or information are "related", it means that when one piece of data (or information) is obtained, at least part of the other data (or information) can be obtained based on it.
  • Also, directional terms such as upper side, lower side, one side, and the other side are used with reference to the orientation of the disclosed drawings. Since the constituent elements of the embodiments may be positioned in various orientations, the directional terminology is used for illustrative purposes and is not limiting.
  • first and second may be used to describe various components, but the components should not be limited by the terms. These terms may be used for the purpose of distinguishing one component from another component.
  • a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component.
  • FIG. 1 is a diagram showing the configuration of a user interface device according to an embodiment of the present invention.
  • Referring to FIG. 1, the user interface device 100 may include a gaze tracking unit 102, an assist button unit 104, a cursor display unit 106, a selection determination unit 108, and a calibration unit 110.
  • The user interface device 100 may be mounted on a user terminal 50 (e.g., a smartphone, a tablet PC, a notebook, etc.) having a touch screen.
  • The user interface device 100 may be a component that provides a user interface in the user terminal 50.
  • the gaze tracking unit 102 may track a user's gaze. Specifically, the gaze tracking unit 102 may generate gaze location information by tracking a user's gaze on the touch screen of the user terminal 50.
  • the gaze tracking unit 102 may include a gaze tracking model for tracking a user's gaze.
  • the gaze tracking model may be implemented through known gaze tracking techniques.
  • the assist button unit 104 may be provided to enable drag on the touch screen according to occurrence of a preset event.
  • the preset event may include a user touching the assist button unit 104 or displaying a cursor on the touch screen.
  • The assist button unit 104 may be provided so that, when a preset event occurs, the user can touch it on the touch screen and drag it in any direction (i.e., across the full 360° range).
  • the position of the cursor displayed on the touch screen may be different from the position actually intended by the user (ie, the position at which the user actually gazes). That is, the gaze tracking unit 102 may not accurately track the gaze position of the user, and thus there may be a difference between the position of the cursor based on gaze position information and the position at which the user actually gazes.
  • the cursor display unit 106 may move the cursor displayed on the touch screen in conjunction with the drag of the assist button unit 104. That is, when the user touches the assist button unit 104 or a cursor is displayed on the touch screen, the assist button unit 104 may be changed to a form capable of being dragged on the touch screen.
  • the cursor display unit 106 may move the cursor on the touch screen in association with the dragging direction and length of the assist button unit 104 by the user.
  • The cursor display unit 106 may move the cursor in the drag direction of the assist button unit 104, by a distance corresponding to the drag length of the assist button unit 104.
  • the user can control the movement of the cursor through the assist button unit 104.
  • the selection determination unit 108 may determine a user's selection point on the touch screen of the user terminal 50.
  • The selection determination unit 108 may determine the user's selection point in conjunction with the position of the cursor on the touch screen and the drag-end operation of the assist button unit 104. For example, the selection determination unit 108 may determine, as the user's selection point, the position of the cursor on the touch screen at the moment the user finishes dragging the assist button unit 104 and releases the finger from it (i.e., at the end of dragging).
  • the user's selection point may be a menu or an icon on a touch screen.
  • When the selection determination unit 108 determines the user's selection point, a menu or icon corresponding to the selection point may be activated or executed.
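  • As a rough sketch only (the touch-event names and the drag-to-cursor scaling are assumptions, not part of the disclosure), the drag-based fine adjustment and the drag-end selection described above could look like this:

```python
from typing import Optional, Tuple

Point = Tuple[float, float]  # (x, y) coordinates on the touch screen


class AssistDragHandler:
    """Hypothetical handler for dragging the assist button unit."""

    def __init__(self, initial_cursor: Point, scale: float = 1.0):
        self.cursor = initial_cursor              # cursor placed from gaze position
        self.drag_origin: Optional[Point] = None  # last touch position during a drag
        self.scale = scale                        # drag-to-cursor movement ratio

    def on_drag_start(self, touch: Point) -> None:
        self.drag_origin = touch

    def on_drag_move(self, touch: Point) -> Point:
        # Move the cursor in the drag direction, by a distance corresponding
        # to the drag length (optionally scaled for finer adjustment).
        dx = (touch[0] - self.drag_origin[0]) * self.scale
        dy = (touch[1] - self.drag_origin[1]) * self.scale
        self.cursor = (self.cursor[0] + dx, self.cursor[1] + dy)
        self.drag_origin = touch
        return self.cursor

    def on_drag_end(self) -> Point:
        # At the end of dragging, the current cursor position is taken as the
        # user's selection point.
        self.drag_origin = None
        return self.cursor
```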
  • The calibration unit 110 may calibrate the gaze tracking unit 102 based on the gaze position information generated by the gaze tracking unit 102 and the information on the user's selection point determined by the selection determination unit 108. That is, the calibration unit 110 may calibrate the gaze tracking model based on the position the user actually gazes at on the touch screen (i.e., the information on the user's selection point) and the gaze position information generated by the gaze tracking unit 102 while the user was looking at that selection point.
  • Whenever the user selects a point on the touch screen of the user terminal 50 through the user interface device 100, the calibration unit 110 may map the gaze position information generated by the gaze tracking unit 102 to the information on the user's selection point determined by the selection determination unit 108.
  • The calibration unit 110 may then calibrate the gaze position information of the gaze tracking unit 102 based on the accumulated mapping information between the gaze position information and the information on the user's selection point. Accordingly, the difference between the displayed position of the cursor according to the gaze position information of the gaze tracking unit 102 and the position at which the user actually gazes is gradually reduced.
  • the calibration unit 110 may detect a gaze vector based on gaze position information generated by the gaze tracking unit 102.
  • the gaze vector may be a vector directed from the center of the user's eyeball to the center of the pupil or the center of the cornea.
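  • Written as a formula (the normalization shown here is an assumption added only for clarity), such a gaze vector is the direction from the eyeball center to the pupil (or cornea) center:

```latex
\vec{g} \;=\; \frac{\mathbf{c}_{\mathrm{pupil}} - \mathbf{c}_{\mathrm{eyeball}}}
                   {\lVert \mathbf{c}_{\mathrm{pupil}} - \mathbf{c}_{\mathrm{eyeball}} \rVert}
```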
  • the calibration unit 110 may detect a point vector on the touch screen based on the user's selection point on the touch screen.
  • In addition, the calibration unit 110 may compute a matching matrix that associates the gaze vector with the point vector (that is, the point vector on the touch screen) on a one-to-one basis.
  • The calibration unit 110 may add the gaze vector and point vector pair generated by the user's selection operation (i.e., the operation in which the user selects a point on the touch screen of the user terminal 50 through the user interface device 100) to a group of pre-stored gaze vector and point vector pairs.
  • The pre-stored gaze vector and point vector pairs may be average gaze vectors and average point vectors obtained from other existing users.
  • When the accumulated gaze vector and point vector pairs exceed a preset number, the calibration unit 110 may improve the mapping speed by removing some of the pairs. In this case, the calibration unit 110 may preferentially remove the pairs obtained from other users.
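  • A minimal sketch of this pruning policy is given below; the tuple layout and the choice to drop the current user's oldest pairs (once other users' pairs are exhausted) are assumptions:

```python
def prune_pairs(pairs, max_pairs):
    """Cap the accumulated (gaze_vec, point_vec, from_other_user) pairs.

    Pre-stored pairs from other users are removed first, as described above;
    if the cap is still exceeded, the current user's oldest pairs are dropped.
    """
    if len(pairs) <= max_pairs:
        return list(pairs)
    own = [p for p in pairs if not p[2]]    # pairs from the current user
    other = [p for p in pairs if p[2]]      # pre-stored pairs from other users
    if len(own) >= max_pairs:
        return own[-max_pairs:]             # keep only the most recent own pairs
    keep_other = max_pairs - len(own)       # how many other-user pairs may remain
    return other[len(other) - keep_other:] + own
```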
  • As described above, by tracking the user's gaze to place a cursor on the touch screen, allowing the cursor to be finely adjusted through the assist button unit, and letting the user select the intended selection point, the user can accurately select parts of the touch screen that are difficult to touch when holding the device in one hand.
  • In addition, a gaze vector and point vector pair is generated each time the user makes a selection, without a separate calibration process, so that gaze tracking accuracy can be gradually improved.
  • FIGS. 2 to 3 are diagrams illustrating a process of determining a user's selection point using a user interface device according to an embodiment of the present invention.
  • the assist button unit 104 may be displayed on the touch screen 51 of the user terminal 50.
  • the user may touch the assist button unit 104 while looking at a desired position on the touch screen 51 with his own eyes.
  • Then, the gaze tracking unit 102 generates gaze position information by tracking the user's gaze on the touch screen 51, and the cursor display unit 106 may display the cursor A on the touch screen 51 based on the gaze position information at the moment the assist button unit 104 is touched.
  • the position of the cursor A displayed on the touch screen 51 (ie, the position of the cursor based on gaze position information) may be different from the position intended by the user. That is, since the gaze tracking unit 102 cannot accurately track the user's gaze position, there may be a difference between the location of the cursor A based on gaze location information and the location intended by the user. Accordingly, it is necessary to adjust the position of the cursor A of the touch screen 51 to a position intended by the user.
  • In this case, the assist button unit 104 may be changed so that it can be dragged on the touch screen 51.
  • FIG. 4 is a flowchart illustrating a user input method using a user interface device according to an embodiment of the present invention.
  • In the illustrated flowchart, the method is described as a series of steps, but at least some of the steps may be performed in a different order, combined with other steps and performed together, omitted, divided into detailed steps, or performed with one or more steps (not shown) added.
  • The user interface device 100 generates gaze position information by tracking the user's gaze on the touch screen 51 (S101).
  • the user interface device 100 displays the assist button unit 104 on the touch screen 51 (S103).
  • the assist button unit 104 may be displayed on the touch screen 51 in conjunction with a user's gaze tracking operation.
  • the user interface device 100 displays a cursor at a position on the touch screen 51 corresponding to the gaze position information in conjunction with the user's touch of the assist button unit 104 (S105).
  • the user interface device 100 allows the assist button unit 104 to be dragged on the touch screen 51 according to the occurrence of a preset event (S107).
  • the user interface device 100 moves the position of the cursor on the touch screen 51 according to the user's dragging operation of the assist button unit 104 (S109).
  • the user interface device 100 determines the point where the cursor is located on the touch screen 51 as the user's selection point according to the user's drag end operation (S111).
  • the user interface device 100 maps the gaze position information and information on the user's selection point (S113).
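  • The following end-to-end sketch mirrors steps S101 to S113 above; it is illustrative only, and the callables passed in stand for the units of the user interface device 100 (their names are assumptions):

```python
from typing import Callable, Iterable, Tuple

Point = Tuple[float, float]  # (x, y) coordinates on the touch screen 51


def user_input_flow(
    track_gaze: Callable[[], Point],                 # S101: gaze tracking unit
    show_assist_button: Callable[[], None],          # S103: display assist button unit
    drag_deltas: Iterable[Point],                    # S107/S109: drag movements after touch
    record_mapping: Callable[[Point, Point], None],  # S113: store (gaze, selection) pair
) -> Point:
    gaze_pos = track_gaze()                          # S101
    show_assist_button()                             # S103
    cursor = gaze_pos                                # S105: cursor displayed at gaze position
    for dx, dy in drag_deltas:                       # S109: move cursor with each drag step
        cursor = (cursor[0] + dx, cursor[1] + dy)
    selection = cursor                               # S111: cursor position at drag end
    record_mapping(gaze_pos, selection)              # S113: map gaze info to selection point
    return selection


# Example invocation with stand-in callables:
selected = user_input_flow(
    track_gaze=lambda: (200.0, 480.0),
    show_assist_button=lambda: None,
    drag_deltas=[(4.0, -2.0), (1.0, 0.0)],
    record_mapping=lambda gaze, sel: None,
)
```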
  • FIG. 5 is a block diagram illustrating a computing environment 10 that includes a computing device suitable for use in exemplary embodiments.
  • each component may have different functions and capabilities in addition to those described below, and may include additional components in addition to those described below.
  • the illustrated computing environment 10 includes a computing device 12.
  • computing device 12 may be a user interface device 100.
  • the computing device 12 includes at least one processor 14, a computer-readable storage medium 16 and a communication bus 18.
  • the processor 14 may cause the computing device 12 to operate according to the exemplary embodiments mentioned above.
  • the processor 14 may execute one or more programs stored in the computer-readable storage medium 16.
  • The one or more programs may include one or more computer-executable instructions, which may be configured to cause the computing device 12 to perform operations according to an exemplary embodiment when executed by the processor 14.
  • the computer-readable storage medium 16 is configured to store computer-executable instructions or program code, program data, and/or other suitable form of information.
  • the program 20 stored in the computer-readable storage medium 16 includes a set of instructions executable by the processor 14.
  • The computer-readable storage medium 16 may be a memory (volatile memory such as random-access memory, non-volatile memory, or a suitable combination thereof), one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, another type of storage medium that can be accessed by the computing device 12 and can store desired information, or a suitable combination thereof.
  • the communication bus 18 interconnects the various other components of the computing device 12, including the processor 14 and computer readable storage medium 16.
  • Computing device 12 may also include one or more input/output interfaces 22 and one or more network communication interfaces 26 that provide interfaces for one or more input/output devices 24.
  • the input/output interface 22 and the network communication interface 26 are connected to the communication bus 18.
  • the input/output device 24 may be connected to other components of the computing device 12 through the input/output interface 22.
  • The exemplary input/output device 24 may include input devices such as a pointing device (a mouse, a trackpad, etc.), a keyboard, a touch input device (a touch pad, a touch screen, etc.), a voice or sound input device, various types of sensor devices, and/or a photographing device, and/or output devices such as a display device, a printer, a speaker, and/or a network card.
  • The exemplary input/output device 24 may be included in the computing device 12 as a component constituting the computing device 12, or may be connected to the computing device 12 as a separate device distinct from the computing device 12.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A user input method and a user interface device for performing the same are disclosed. A user interface device according to a disclosed embodiment is a user interface device for performing user input in a user terminal having a touch screen, and comprises: a gaze tracking unit for generating gaze position information by tracking a user's gaze on the touch screen; an assist button unit displayed on the touch screen; and a cursor display unit which, in association with the user's touch of the assist button unit, displays a cursor at the position on the touch screen corresponding to the gaze position information.
PCT/KR2020/005398 2019-04-25 2020-04-23 User input method and user interface device for performing the same WO2020218859A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190048762A KR102198867B1 (ko) 2019-04-25 2019-04-25 사용자 입력 방법 및 이를 수행하기 위한 사용자 인터페이스 장치
KR10-2019-0048762 2019-04-25

Publications (1)

Publication Number Publication Date
WO2020218859A1 true WO2020218859A1 (fr) 2020-10-29

Family

ID=72941088

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/005398 WO2020218859A1 (fr) 2019-04-25 2020-04-23 User input method and user interface device for performing the same

Country Status (2)

Country Link
KR (1) KR102198867B1 (fr)
WO (1) WO2020218859A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013097763A (ja) * 2011-11-07 2013-05-20 Fujitsu Ltd 情報処理装置およびその入力制御プログラム
KR20140035358A (ko) * 2011-04-21 2014-03-21 소니 컴퓨터 엔터테인먼트 인코포레이티드 시선-보조 컴퓨터 인터페이스
KR20140088487A (ko) * 2013-01-02 2014-07-10 삼성디스플레이 주식회사 단말기 및 그의 조작 방법
KR20140117469A (ko) * 2012-01-04 2014-10-07 토비 테크놀로지 에이비 시선 상호작용을 위한 시스템
KR20150031986A (ko) * 2013-09-17 2015-03-25 삼성전자주식회사 디스플레이장치 및 그 제어방법

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150032019A (ko) 2013-09-17 2015-03-25 한국전자통신연구원 시선 추적 기반의 사용자 인터페이스 방법 및 그 장치

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140035358A (ko) * 2011-04-21 2014-03-21 소니 컴퓨터 엔터테인먼트 인코포레이티드 시선-보조 컴퓨터 인터페이스
JP2013097763A (ja) * 2011-11-07 2013-05-20 Fujitsu Ltd 情報処理装置およびその入力制御プログラム
KR20140117469A (ko) * 2012-01-04 2014-10-07 토비 테크놀로지 에이비 시선 상호작용을 위한 시스템
KR20140088487A (ko) * 2013-01-02 2014-07-10 삼성디스플레이 주식회사 단말기 및 그의 조작 방법
KR20150031986A (ko) * 2013-09-17 2015-03-25 삼성전자주식회사 디스플레이장치 및 그 제어방법

Also Published As

Publication number Publication date
KR20200125062A (ko) 2020-11-04
KR102198867B1 (ko) 2021-01-05

Similar Documents

Publication Publication Date Title
CN110769155B (zh) 一种摄像头控制方法及电子设备
WO2013048054A1 (fr) Procédé d'utilisation d'un canal de communication basé sur un geste et système de terminal portable pour supporter celui-ci
WO2018151449A1 (fr) Dispositif électronique et procédés permettant de déterminer une orientation du dispositif
CN108762634B (zh) 一种控制方法及终端
CN110888707A (zh) 一种消息发送方法及电子设备
CN110989881B (zh) 一种图标整理方法及电子设备
CN110536006B (zh) 一种对象位置调整方法及电子设备
WO2011132910A2 (fr) Procédé et appareil d'interfaçage
CN108681427B (zh) 一种访问权限控制的方法及终端设备
WO2016088981A1 (fr) Procédé, dispositif et système pour établir une interface utilisateur, et support d'enregistrement non transitoire lisible par ordinateur
WO2020192324A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2020130356A1 (fr) Système et procédé pour dispositif d'entrée polyvalent pour environnements bidimensionnels et tridimensionnels
WO2018105955A2 (fr) Procédé d'affichage d'objet et dispositif électronique associé
WO2015030460A1 (fr) Procédé, appareil, et support d'enregistrement pour interfonctionnement avec un terminal externe
CN111459350B (zh) 图标排序方法、装置及电子设备
WO2020209455A1 (fr) Système et procédé d'étalonnage tridimensionnel naturel pour suivi de l'œil fiable
WO2013133624A1 (fr) Appareil d'interface utilisant une reconnaissance de mouvement, et procédé destiné à commander ce dernier
CN111090529A (zh) 共享信息的方法及电子设备
US11526320B2 (en) Multi-screen interface control method and terminal device
WO2020218859A1 (fr) Procédé d'entrée d'utilisateur et dispositif d'interface d'utilisateur pour le réaliser
WO2012118271A1 (fr) Procédé et dispositif permettant de contrôler un contenu à l'aide d'un contact, support d'enregistrement associé, et terminal utilisateur comportant ce support
KR20220154825A (ko) 노트 생성 방법 및 전자기기
WO2015056886A1 (fr) Procédé de commande d'un écran tactile par détection de la position de la ligne de vision de l'utilisateur
WO2019203591A1 (fr) Appareil et procédé d'entrée à haute efficacité pour une réalité virtuelle et une réalité augmentée
CN111142772A (zh) 一种内容显示方法及可穿戴设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20794163

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20794163

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26/04/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20794163

Country of ref document: EP

Kind code of ref document: A1