WO2020218859A1 - User input method and user interface device for performing same - Google Patents
User input method and user interface device for performing same
- Publication number
- WO2020218859A1 (application PCT/KR2020/005398)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- gaze
- touch screen
- cursor
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04186—Touch location disambiguation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
Definitions
- Embodiments of the present invention relate to user interface technology.
- The disclosed embodiments provide a new user interface technology.
- A user interface device for performing user input in a user terminal having a touch screen includes: a gaze tracking unit that generates gaze position information by tracking the user's gaze on the touch screen; an assist button unit displayed on the touch screen; and a cursor display unit that displays a cursor at the position on the touch screen corresponding to the gaze position information in conjunction with the user's touch of the assist button unit.
- The preset event may occur when the user touches the assist button unit or when a cursor is displayed on the touch screen.
- The user interface device may further include a selection determination unit that determines the user's selection point on the touch screen in conjunction with the position of the cursor and a drag end operation of the assist button unit.
- The selection determination unit may determine the position of the cursor on the touch screen at the end of the drag of the assist button unit as the user's selection point.
- The user interface device may further include a calibration unit that calibrates the gaze tracking unit based on the gaze position information and information on the user's selection point.
- The calibration unit may map the gaze position information to the information on the user's selection point whenever a user selection operation occurs, and calibrate the gaze tracking unit based on the mapped information.
- The calibration unit may detect a gaze vector based on the gaze position information, detect a point vector on the touch screen based on the information on the user's selection point, and perform calibration by updating a matching matrix that associates the gaze vector with the point vector on a one-to-one basis, using the gaze vector and point vector pair detected whenever the user's selection operation occurs.
- A user input method according to an embodiment is performed in a computing device having one or more processors and a memory storing one or more programs executed by the one or more processors, and includes: generating gaze position information by tracking a user's gaze on a touch screen; displaying an assist button unit on the touch screen; and displaying a cursor at a position on the touch screen corresponding to the gaze position information in conjunction with the user's touch of the assist button unit.
- The user input method may further include: allowing the assist button unit to be dragged on the touch screen according to occurrence of a preset event; and moving the cursor displayed on the touch screen in conjunction with the drag direction and drag length of the assist button unit.
- The user input method may further include determining the user's selection point on the touch screen in conjunction with the position of the cursor and a drag end operation of the assist button unit.
- The position of the cursor on the touch screen at the end of the drag of the assist button unit may be determined as the user's selection point.
- The user input method may further include calibrating a gaze tracking model for tracking the user's gaze, based on the gaze position information and information on the user's selection point.
- The gaze position information may be mapped to the information on the user's selection point, and the gaze tracking model may be calibrated based on the mapped information.
- The calibrating may include: detecting a gaze vector based on the gaze position information; detecting a point vector on the touch screen based on the information on the user's selection point; and updating a matching matrix that associates the gaze vector with the point vector on a one-to-one basis, based on the gaze vector and point vector pair detected whenever the user's selection operation occurs.
- By tracking the user's gaze to place a cursor on the touch screen and letting the user finely adjust the cursor through the assist button unit before confirming the intended selection point, the disclosed embodiments let the user accurately select parts of the touch screen that are difficult to reach when operating it with one hand.
- In addition, since a gaze vector and point vector pair is generated whenever the user makes a selection, gaze tracking accuracy can be gradually improved without a separate calibration process.
- FIG. 1 is a diagram showing the configuration of a user interface device according to an embodiment of the present invention
- FIGS. 2 to 3 are diagrams showing a process of determining a user's selection point using a user interface device according to an embodiment of the present invention
- FIG. 4 is a flowchart showing a user input method using a user interface device according to an embodiment of the present invention
- FIG. 5 is a block diagram illustrating a computing environment including a computing device suitable for use in example embodiments.
- In the following description, "transmission", "communication", and "reception" of a signal or information, and other terms with similar meanings, include not only the direct passing of the signal or information from one component to another, but also passing through intermediate components.
- In particular, "transmitting" a signal or information to a component indicates the final destination of the signal or information and does not imply its direct destination. The same applies to the "reception" of a signal or information.
- When two or more pieces of data or information are "related", it means that when one piece of data (or information) is obtained, at least part of the other data (or information) can be obtained based on it.
- Directional terms such as upper side, lower side, one side, and the other side are used in relation to the orientation shown in the drawings. Since the components of the embodiments may be positioned in various orientations, such directional terminology is used for convenience of description and is not limiting.
- first and second may be used to describe various components, but the components should not be limited by the terms. These terms may be used for the purpose of distinguishing one component from another component.
- a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component.
- FIG. 1 is a diagram showing the configuration of a user interface device according to an embodiment of the present invention.
- The user interface device 100 may include a gaze tracking unit 102, an assist button unit 104, a cursor display unit 106, a selection determination unit 108, and a calibration unit 110.
- the user interface device 100 may be mounted on a user terminal (eg, a smart phone, a tablet PC, a notebook, etc.) 50 having a touch screen.
- The user interface device 100 may constitute the user interface of the user terminal 50.
- The gaze tracking unit 102 may track the user's gaze. Specifically, the gaze tracking unit 102 may generate gaze position information by tracking the user's gaze on the touch screen of the user terminal 50.
- the gaze tracking unit 102 may include a gaze tracking model for tracking a user's gaze.
- the gaze tracking model may be implemented through known gaze tracking techniques.
- the assist button unit 104 may be provided to enable drag on the touch screen according to occurrence of a preset event.
- the preset event may include a user touching the assist button unit 104 or displaying a cursor on the touch screen.
- When the preset event occurs, the assist button unit 104 may be provided so that the user can touch it on the touch screen and drag it in any direction (through the full 360°).
- the position of the cursor displayed on the touch screen may be different from the position actually intended by the user (ie, the position at which the user actually gazes). That is, the gaze tracking unit 102 may not accurately track the gaze position of the user, and thus there may be a difference between the position of the cursor based on gaze position information and the position at which the user actually gazes.
- The cursor display unit 106 may move the cursor displayed on the touch screen in conjunction with the drag of the assist button unit 104. That is, when the user touches the assist button unit 104 or a cursor is displayed on the touch screen, the assist button unit 104 may be changed to a form that can be dragged on the touch screen.
- the cursor display unit 106 may move the cursor on the touch screen in association with the dragging direction and length of the assist button unit 104 by the user.
- The cursor display unit 106 may move the cursor in the drag direction of the assist button unit 104, by a length corresponding to the drag length of the assist button unit 104.
- the user can control the movement of the cursor through the assist button unit 104.
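- As a rough illustration of this drag-to-cursor coupling, the following sketch (in Python, for illustration only) moves the cursor by the drag delta of the assist button unit; the `Cursor` type, the coordinate convention, and the unity `gain` are assumptions, not details specified in the embodiment.

```python
from dataclasses import dataclass

@dataclass
class Cursor:
    x: float  # cursor position on the touch screen, in pixels
    y: float

def move_cursor(cursor: Cursor,
                drag_start: tuple[float, float],
                drag_now: tuple[float, float],
                gain: float = 1.0) -> Cursor:
    """Move the cursor in the drag direction of the assist button unit,
    by a length corresponding to the drag length (scaled by gain)."""
    dx = drag_now[0] - drag_start[0]
    dy = drag_now[1] - drag_start[1]
    return Cursor(cursor.x + gain * dx, cursor.y + gain * dy)
```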
- the selection determination unit 108 may determine a user's selection point on the touch screen of the user terminal 50.
- The selection determination unit 108 may determine the user's selection point in conjunction with the position of the cursor on the touch screen and the drag end operation of the assist button unit 104. For example, the selection determination unit 108 may determine the position of the cursor on the touch screen at the moment the user releases the assist button unit 104 after dragging it (i.e., at the end of the drag) as the user's selection point.
- the user's selection point may be a menu or an icon on a touch screen.
- When the selection determination unit 108 determines the user's selection point, the menu or icon corresponding to the selection point may be activated or executed.
- The calibration unit 110 may calibrate the gaze tracking unit 102 based on the gaze position information from the gaze tracking unit 102 and the information on the user's selection point from the selection determination unit 108. That is, the calibration unit 110 may calibrate the gaze tracking model based on the position the user actually gazes at on the touch screen (i.e., the information on the user's selection point) and the gaze position information generated by the gaze tracking unit 102 while the user looks at that selection point.
- Whenever the user selects a point on the touch screen of the user terminal 50 through the user interface device 100, the calibration unit 110 may map the gaze position information from the gaze tracking unit 102 to the information on the user's selection point from the selection determination unit 108.
- The calibration unit 110 may calibrate the gaze position information of the gaze tracking unit 102 based on the accumulated mappings between gaze position information and information on the user's selection points. Accordingly, the difference between the displayed position of the cursor according to the gaze position information of the gaze tracking unit 102 and the position at which the user actually gazes is gradually reduced.
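- A minimal sketch of this per-selection accumulation, assuming simple 2-D screen coordinates and an in-memory list (both illustrative assumptions, not structures defined in the embodiment):

```python
GazePoint = tuple[float, float]    # gaze position tracked by the gaze tracking unit
SelectPoint = tuple[float, float]  # point the user actually selected

# Accumulated (gaze position, selection point) mappings used for calibration.
calibration_pairs: list[tuple[GazePoint, SelectPoint]] = []

def on_selection(gaze_pos: GazePoint, selected_pos: SelectPoint) -> None:
    # Each confirmed selection contributes one mapped pair.
    calibration_pairs.append((gaze_pos, selected_pos))
```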
- the calibration unit 110 may detect a gaze vector based on gaze position information generated by the gaze tracking unit 102.
- the gaze vector may be a vector directed from the center of the user's eyeball to the center of the pupil or the center of the cornea.
- the calibration unit 110 may detect a point vector on the touch screen based on the user's selection point on the touch screen.
- The calibration unit 110 may compute a matching matrix that associates the gaze vector with the point vector (i.e., the point vector on the touch screen) on a one-to-one basis.
- The calibration unit 110 may add the gaze vector and point vector pair generated by each selection operation (an operation in which the user selects a point on the touch screen of the user terminal 50 through the user interface device 100) to a group of pre-stored gaze vector and point vector pairs.
- The pre-stored gaze vector and point vector pairs may be average gaze vectors and average point vectors from other existing users.
- When the accumulated gaze vector and point vector pairs exceed a preset number, the calibration unit 110 may remove some of the pairs to improve the mapping speed. In this case, the calibration unit 110 may preferentially remove the pairs contributed by other users.
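- The embodiment specifies only that a one-to-one matching matrix is updated per selection, that pre-stored pairs may come from other users, and that other users' pairs are removed first once a preset number is exceeded. A hedged sketch of one way to maintain such a matrix follows, assuming a linear model fit by least squares; the linear form, the numpy solver, and the cap value are assumptions layered on top of the embodiment.

```python
import numpy as np

MAX_PAIRS = 200  # preset cap on stored pairs (illustrative value)

class MatchingMatrixCalibrator:
    def __init__(self, seed_pairs):
        # seed_pairs: average gaze/point vector pairs from other users,
        # flagged so they are removed first once the cap is exceeded.
        self.pairs = [(np.asarray(g), np.asarray(p), True) for g, p in seed_pairs]

    def add_pair(self, gaze_vec, point_vec):
        """Add the pair detected at a selection and re-fit the matrix."""
        self.pairs.append((np.asarray(gaze_vec), np.asarray(point_vec), False))
        if len(self.pairs) > MAX_PAIRS:
            # Preferentially remove a pair contributed by other users.
            seeds = [i for i, (_, _, is_seed) in enumerate(self.pairs) if is_seed]
            self.pairs.pop(seeds[0] if seeds else 0)
        return self.fit()

    def fit(self):
        # Homogeneous gaze vectors let the fit absorb a constant offset.
        G = np.vstack([np.append(g, 1.0) for g, _, _ in self.pairs])
        P = np.vstack([p for _, p, _ in self.pairs])
        # Least-squares matching matrix M with G @ M ≈ P, associating each
        # gaze vector with its on-screen point vector.
        M, *_ = np.linalg.lstsq(G, P, rcond=None)
        return M
```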
- In this way, by tracking the user's gaze to place a cursor on the touch screen and letting the user finely adjust the cursor through the assist button unit before confirming the intended selection point, the user can accurately select parts of the touch screen that are difficult to reach when operating it with one hand.
- In addition, since a gaze vector and point vector pair is generated whenever the user makes a selection, gaze tracking accuracy can be gradually improved without a separate calibration process.
- FIGS. 2 to 3 are diagrams illustrating a process of determining a user's selection point using a user interface device according to an embodiment of the present invention.
- the assist button unit 104 may be displayed on the touch screen 51 of the user terminal 50.
- the user may touch the assist button unit 104 while looking at a desired position on the touch screen 51 with his own eyes.
- The gaze tracking unit 102 generates gaze position information by tracking the user's gaze on the touch screen 51, and the cursor display unit 106 displays the cursor A on the touch screen 51 based on the gaze position information at the moment the assist button unit 104 is touched.
- the position of the cursor A displayed on the touch screen 51 (ie, the position of the cursor based on gaze position information) may be different from the position intended by the user. That is, since the gaze tracking unit 102 cannot accurately track the user's gaze position, there may be a difference between the location of the cursor A based on gaze location information and the location intended by the user. Accordingly, it is necessary to adjust the position of the cursor A of the touch screen 51 to a position intended by the user.
- When the cursor A is displayed, the assist button unit 104 may be changed to a form that can be dragged on the touch screen 51.
- FIG. 4 is a flowchart illustrating a user input method using a user interface device according to an embodiment of the present invention.
- In the illustrated flowchart, the method is described as a plurality of steps; however, at least some of the steps may be performed in a different order, combined with other steps, performed together, omitted, divided into sub-steps, or supplemented with one or more steps not shown.
- The user interface device 100 generates gaze position information by tracking the user's gaze on the touch screen 51 (S101).
- the user interface device 100 displays the assist button unit 104 on the touch screen 51 (S103).
- the assist button unit 104 may be displayed on the touch screen 51 in conjunction with a user's gaze tracking operation.
- the user interface device 100 displays a cursor at a position on the touch screen 51 corresponding to the gaze position information in conjunction with the user's touch of the assist button unit 104 (S105).
- the user interface device 100 allows the assist button unit 104 to be dragged on the touch screen 51 according to the occurrence of a preset event (S107).
- the user interface device 100 moves the position of the cursor on the touch screen 51 according to the user's dragging operation of the assist button unit 104 (S109).
- the user interface device 100 determines the point where the cursor is located on the touch screen 51 as the user's selection point according to the user's drag end operation (S111).
- the user interface device 100 maps the gaze position information and information on the user's selection point (S113).
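- Putting steps S101 to S113 together, the flow can be read as a single routine. The sketch below uses hypothetical tracker/screen/calibrator objects as stand-ins, since the embodiment does not define such an API.

```python
def user_input_flow(tracker, screen, calibrator):
    gaze_pos = tracker.track_gaze()               # S101: generate gaze position info
    screen.show_assist_button()                   # S103: display the assist button unit
    if screen.assist_button_touched():
        cursor = screen.show_cursor(gaze_pos)     # S105: cursor at the gazed position
        screen.enable_assist_drag()               # S107: button becomes draggable
        while screen.dragging():
            cursor = screen.move_cursor(cursor, screen.drag_delta())  # S109
        selection = cursor                        # S111: cursor position at drag end
        calibrator.add_pair(gaze_pos, selection)  # S113: map gaze info to selection
        return selection
```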
- FIG. 5 is a block diagram illustrating a computing environment 10 that includes a computing device suitable for use in example embodiments.
- each component may have different functions and capabilities in addition to those described below, and may include additional components in addition to those described below.
- the illustrated computing environment 10 includes a computing device 12.
- The computing device 12 may be the user interface device 100.
- the computing device 12 includes at least one processor 14, a computer-readable storage medium 16 and a communication bus 18.
- the processor 14 may cause the computing device 12 to operate according to the exemplary embodiments mentioned above.
- the processor 14 may execute one or more programs stored in the computer-readable storage medium 16.
- The one or more programs may include one or more computer-executable instructions, and the computer-executable instructions may be configured to cause the computing device 12 to perform operations according to an exemplary embodiment when executed by the processor 14.
- the computer-readable storage medium 16 is configured to store computer-executable instructions or program code, program data, and/or other suitable form of information.
- the program 20 stored in the computer-readable storage medium 16 includes a set of instructions executable by the processor 14.
- The computer-readable storage medium 16 may be memory (volatile memory such as random-access memory, nonvolatile memory, or a suitable combination thereof), one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other types of storage media that can be accessed by the computing device 12 and store desired information, or a suitable combination thereof.
- the communication bus 18 interconnects the various other components of the computing device 12, including the processor 14 and computer readable storage medium 16.
- Computing device 12 may also include one or more input/output interfaces 22 and one or more network communication interfaces 26 that provide interfaces for one or more input/output devices 24.
- the input/output interface 22 and the network communication interface 26 are connected to the communication bus 18.
- the input/output device 24 may be connected to other components of the computing device 12 through the input/output interface 22.
- Exemplary input/output devices 24 include input devices such as a pointing device (a mouse, a trackpad, etc.), a keyboard, a touch input device (a touch pad, a touch screen, etc.), a voice or sound input device, and various types of sensor devices and/or photographing devices, and/or output devices such as display devices, printers, speakers, and/or network cards.
- The exemplary input/output device 24 may be included in the computing device 12 as a component constituting the computing device 12, or may be connected to the computing device 12 as a separate device distinct from it.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims (15)
- A user interface device for performing user input in a user terminal having a touch screen, the user interface device comprising: a gaze tracking unit configured to generate gaze position information by tracking a user's gaze on the touch screen; an assist button unit displayed on the touch screen; and a cursor display unit configured to display a cursor at a position on the touch screen corresponding to the gaze position information in conjunction with the user's touch of the assist button unit.
- The user interface device according to claim 1, wherein the assist button unit is provided to be draggable on the touch screen according to occurrence of a preset event, and the cursor display unit moves the cursor displayed on the touch screen in conjunction with a drag direction and a drag length of the assist button unit.
- The user interface device according to claim 2, wherein the preset event occurs when the user touches the assist button unit or when a cursor is displayed on the touch screen.
- The user interface device according to claim 2, further comprising a selection determination unit configured to determine the user's selection point on the touch screen in conjunction with the position of the cursor and a drag end operation of the assist button unit.
- The user interface device according to claim 4, wherein the selection determination unit determines the position of the cursor on the touch screen at the end of the drag of the assist button unit as the user's selection point.
- The user interface device according to claim 4, further comprising a calibration unit configured to calibrate the gaze tracking unit based on the gaze position information and information on the user's selection point.
- The user interface device according to claim 6, wherein the calibration unit maps the gaze position information to the information on the user's selection point whenever a user selection operation occurs, and calibrates the gaze tracking unit based on the mapped information.
- The user interface device according to claim 7, wherein the calibration unit detects a gaze vector based on the gaze position information, detects a point vector on the touch screen based on the information on the user's selection point, and performs calibration by updating a matching matrix that associates the gaze vector with the point vector on a one-to-one basis, based on the gaze vector and point vector pair detected whenever the user's selection operation occurs.
- A user input method performed in a computing device having one or more processors and a memory storing one or more programs executed by the one or more processors, the method comprising: generating gaze position information by tracking a user's gaze on a touch screen; displaying an assist button unit on the touch screen; and displaying a cursor at a position on the touch screen corresponding to the gaze position information in conjunction with the user's touch of the assist button unit.
- The user input method according to claim 9, further comprising: allowing the assist button unit to be dragged on the touch screen according to occurrence of a preset event; and moving the cursor displayed on the touch screen in conjunction with a drag direction and a drag length of the assist button unit.
- The user input method according to claim 10, further comprising determining the user's selection point on the touch screen in conjunction with the position of the cursor and a drag end operation of the assist button unit.
- The user input method according to claim 11, wherein determining the user's selection point comprises determining the position of the cursor on the touch screen at the end of the drag of the assist button unit as the user's selection point.
- The user input method according to claim 11, further comprising calibrating a gaze tracking model for tracking the user's gaze, based on the gaze position information and information on the user's selection point.
- The user input method according to claim 13, wherein the calibrating comprises mapping the gaze position information to the information on the user's selection point whenever a user selection operation occurs, and calibrating the gaze tracking model based on the mapped information.
- The user input method according to claim 14, wherein the calibrating comprises: detecting a gaze vector based on the gaze position information; detecting a point vector on the touch screen based on the information on the user's selection point; and updating a matching matrix that associates the gaze vector with the point vector on a one-to-one basis, based on the gaze vector and point vector pair detected whenever the user's selection operation occurs.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020190048762A KR102198867B1 (en) | 2019-04-25 | 2019-04-25 | Method for user input and user interface device executing the method |
KR10-2019-0048762 | 2019-04-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020218859A1 true WO2020218859A1 (en) | 2020-10-29 |
Family
ID=72941088
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2020/005398 WO2020218859A1 (en) | 2019-04-25 | 2020-04-23 | User input method and user interface device for performing same |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102198867B1 (en) |
WO (1) | WO2020218859A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013097763A (en) * | 2011-11-07 | 2013-05-20 | Fujitsu Ltd | Information processor and input control program therefor |
KR20140035358A * | 2011-04-21 | 2014-03-21 | Sony Computer Entertainment Inc. | Gaze-assisted computer interface |
KR20140088487A * | 2013-01-02 | 2014-07-10 | Samsung Display Co., Ltd. | Terminal and method for controlling thereof |
KR20140117469A * | 2012-01-04 | 2014-10-07 | Tobii Technology AB | System for gaze interaction |
KR20150031986A * | 2013-09-17 | 2015-03-25 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150032019A | 2013-09-17 | 2015-03-25 | Electronics and Telecommunications Research Institute | Method and apparatus for providing user interface by using eye tracking |
-
2019
- 2019-04-25 KR KR1020190048762A patent/KR102198867B1/en active IP Right Grant
-
2020
- 2020-04-23 WO PCT/KR2020/005398 patent/WO2020218859A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20140035358A * | 2011-04-21 | 2014-03-21 | Sony Computer Entertainment Inc. | Gaze-assisted computer interface |
JP2013097763A * | 2011-11-07 | 2013-05-20 | Fujitsu Ltd | Information processor and input control program therefor |
KR20140117469A * | 2012-01-04 | 2014-10-07 | Tobii Technology AB | System for gaze interaction |
KR20140088487A * | 2013-01-02 | 2014-07-10 | Samsung Display Co., Ltd. | Terminal and method for controlling thereof |
KR20150031986A * | 2013-09-17 | 2015-03-25 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
KR102198867B1 (en) | 2021-01-05 |
KR20200125062A (en) | 2020-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015099293A1 (en) | Device and method for displaying user interface of virtual input device based on motion recognition | |
CN110769155B (en) | Camera control method and electronic equipment | |
US20220300302A1 (en) | Application sharing method and electronic device | |
CN108762634B (en) | Control method and terminal | |
EP2761973A1 (en) | Method of operating gesture based communication channel and portable terminal system for supporting the same | |
CN110888707A (en) | Message sending method and electronic equipment | |
CN110989881B (en) | Icon arrangement method and electronic equipment | |
WO2011132910A2 (en) | Method and apparatus for interface | |
WO2018216868A1 (en) | Electronic device and input processing method of input device | |
CN108681427B (en) | Access right control method and terminal equipment | |
WO2016088981A1 (en) | Method, device, and system for providing user interface, and non-transitory computer-readable recording medium | |
WO2020130356A1 (en) | System and method for multipurpose input device for two-dimensional and three-dimensional environments | |
CN111475080B (en) | Misoperation prompting method and electronic equipment | |
WO2020192324A1 (en) | Interface displaying method and terminal device | |
WO2018105955A2 (en) | Method for displaying object and electronic device thereof | |
CN111459350B (en) | Icon sorting method and device and electronic equipment | |
EP3039556A1 (en) | Method, apparatus, and recording medium for interworking with external terminal | |
WO2013133624A1 (en) | Interface apparatus using motion recognition, and method for controlling same | |
CN111638822A (en) | Icon operation method and device and electronic equipment | |
US11526320B2 (en) | Multi-screen interface control method and terminal device | |
WO2020218859A1 (en) | User input method and user interface device for performing same | |
WO2012118271A1 (en) | Method and device for controlling contents using touch, recording medium therefor, and user terminal having same | |
KR20220154825A (en) | How to create notes and electronic devices | |
WO2015056886A1 (en) | Method for controlling touch screen by detecting position of line of sight of user | |
CN109144390A (en) | Information processing equipment and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20794163 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20794163 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26/04/2022) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20794163 Country of ref document: EP Kind code of ref document: A1 |