KR20170092167A - Control device using eye-tracking - Google Patents
- Publication number
- KR20170092167A (application number KR1020160012875A)
- Authority
- KR
- South Korea
- Prior art keywords
- line
- user
- control command
- sight
- pattern
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention

[0002] The present invention relates to a control technique using eye-tracking and, more particularly, to a technique for tracking a user's gaze and executing a control command corresponding to that gaze.
An electronic pen touch method or a hand-based multi-touch method is mainly used to control the interactive whiteboards that have been introduced to replace blackboards. With a touch-based control method on a large screen, however, whiteboard users (teachers, lecturers, etc.) must frequently change position according to the location of the object to be controlled; for example, in a large lecture room or meeting room with a screen of over 120 inches, the user's pen or hand may not even reach the top of the screen. In addition, students who consume the multimedia content displayed on the whiteboard through e-book or tablet monitors need a natural screen-switching and control interface within a display area smaller than the whiteboard. Furthermore, to improve the quality of classes and content by analyzing learners' preference for multimedia content and their concentration, it is necessary to collect and analyze the students' gaze information.
Accordingly, it is an object of the present invention to provide a user-friendly interface through a control device using eye-tracking technology.
It is also an object of the present invention to provide a technique for executing a control command using user's eye tracking.
It is another object of the present invention to provide a technique for analyzing a user's concentration, preference, and the like using the user's gaze-tracking data.
According to embodiments of the present invention, control using eye tracking is enabled.

According to embodiments of the present invention, it is also possible to analyze a user's concentration, preference, and the like using gaze-tracking data.
FIG. 1 is a block diagram of a control apparatus using eye tracking according to an embodiment of the present invention.
FIG. 2 is a flowchart of a control method using eye tracking according to an embodiment of the present invention.
FIG. 3 is a flowchart of a method of executing a control command according to a gaze pattern according to an embodiment of the present invention.
FIG. 4 is a view for explaining a gaze recognition process according to an embodiment of the present invention.
While the present invention is amenable to various modifications and alternative forms, specific embodiments are shown by way of example and described in detail below. It should be understood, however, that the invention is not limited to the particular embodiments disclosed, but covers all modifications, equivalents, and alternatives falling within its spirit and scope. In this specification and the claims, singular expressions should generally be interpreted to mean "one or more" unless otherwise stated.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals refer to like or corresponding components throughout.
FIG. 1 is a block diagram of a control apparatus using eye tracking according to an embodiment of the present invention.
Referring to FIG. 1, the control apparatus using eye tracking includes a gaze information analyzer 110, a control command setting unit 120, a control command execution unit 130, and a gaze information storage unit 140.

The gaze information analyzer 110 analyzes the user's gaze information and extracts a gaze pattern. Specifically, it recognizes the user's gaze in an image including the user's eye region and tracks the position, movement, and fixation of the gaze. In embodiments, it also handles the cases where there are a plurality of users and where the position of a user changes.

The control command setting unit 120 sets a control command to be executed according to the user's gaze pattern and the screen display information. A separate control command may be mapped for each gaze pattern composed of the position, movement, and fixation of the user's gaze, and the mapping may additionally take the screen display information into account so that different control commands are executed for the same gaze pattern depending on the content displayed on the screen.

The control command execution unit 130 executes the control command corresponding to the extracted gaze pattern and the screen display information.

The gaze information storage unit 140 stores the gaze information obtained by tracking the user's gaze. The stored gaze information and the gaze patterns extracted from it can be used to analyze the user's concentration, preference, and the like.
2 is a flowchart of a control method using eye tracking according to an embodiment of the present invention. Hereinafter, the method will be described by way of example as being performed by a control apparatus using eye tracking.
Referring to FIG. 2, in step S210, a control command according to the gaze pattern is set. Specifically, a separate control command is mapped for each of various gaze patterns composed of the position, movement, and fixation of the user's gaze.

In addition, the control command can be set using both the gaze pattern and the screen display information. For example, control commands may be mapped based on the gaze pattern together with the screen display information, so that different control commands are executed for the same gaze pattern depending on the content displayed on the screen.
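As a rough illustration of this mapping, the pattern-to-command table of step S210 could be sketched as a simple lookup keyed by a gaze pattern and the screen display information. All names below (`GAZE_COMMANDS`, `lookup_command`, the tuple encoding of a pattern) are hypothetical illustrations, not taken from the patent:

```python
# Hypothetical sketch of step S210: a gaze pattern is summarized as a
# (region, event) pair, and the screen display information as a content
# type; the pair of both selects a control command.
GAZE_COMMANDS = {
    (("video_thumbnail", "fixation"), "document"): "play_video",
    (("url_text", "fixation"), "document"): "open_url",
    (("last_word", "fixation"), "document"): "next_page",
}

def lookup_command(pattern, screen_info):
    """Return the control command mapped to (pattern, screen_info), or None."""
    return GAZE_COMMANDS.get((pattern, screen_info))

print(lookup_command(("last_word", "fixation"), "document"))  # next_page
```

The same gaze pattern could be mapped to a different command simply by adding an entry with a different `screen_info` key, which is the point of combining the pattern with the screen display information.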
In step S220, the gaze pattern is extracted by analyzing the user's gaze information. Specifically, the user's gaze is recognized from an image including the user's eye region, and the direction, movement, and fixation of the gaze are tracked. The gaze pattern is extracted based on the tracking result.
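A minimal sketch of the fixation side of this step, assuming gaze samples arrive as `(t, x, y)` tuples: a fixation is declared when consecutive samples stay within a small dispersion radius for at least a minimum duration. The thresholds and the dispersion-based approach are assumptions made for illustration, not the patent's stated method:

```python
import math

def extract_fixation(samples, radius=30.0, min_duration=0.5):
    """samples: list of (t, x, y) gaze samples, in time order.
    Returns (x, y, duration) of the first fixation found, or None."""
    start = 0
    for end in range(len(samples)):
        window = samples[start:end + 1]
        cx = sum(s[1] for s in window) / len(window)  # window centroid x
        cy = sum(s[2] for s in window) / len(window)  # window centroid y
        if any(math.hypot(s[1] - cx, s[2] - cy) > radius for s in window):
            start = end  # dispersion too large: restart from current sample
        elif samples[end][0] - samples[start][0] >= min_duration:
            return (cx, cy, samples[end][0] - samples[start][0])
    return None

# 0.6 s of samples jittering tightly around (100, 200) -> one fixation
stream = [(i * 0.1, 100 + i % 3, 200 - i % 2) for i in range(7)]
print(extract_fixation(stream))
```

Saccades (large, fast gaze movements) would show up here simply as windows whose dispersion exceeds the radius; a fuller pattern extractor would record those too.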
In step S230, it is determined whether the extracted gaze pattern corresponds to a control command.

In step S240, when the extracted gaze pattern corresponds to a control command, the corresponding control command is executed.

When the extracted gaze pattern does not correspond to any control command, the gaze information obtained by tracking the user's gaze and the gaze patterns extracted from it are instead used to analyze the user's concentration, preference, and the like.
FIG. 3 is a flowchart of a method of executing a control command according to a gaze pattern according to an embodiment of the present invention. Hereinafter, the method will be described by way of an example performed by the control device using eye tracking. It is assumed that a gaze pattern composed of the gaze position and fixation time is set as a video playback or URL connection control command, and that a gaze pattern fixed on the last word of a page is set as a page turn control command.
Referring to FIG. 3, in step S310, it is determined whether the extracted gaze pattern corresponds to a control command. If it does, a step of searching for the control command corresponding to the gaze pattern is performed. If it does not, the user's concentration or preference is analyzed based on the eye-tracking information.
In step S320, it is determined whether the extracted gaze pattern corresponds to a video playback or URL connection command. Specifically, it is determined whether the user's gaze has remained fixed on a video or on URL connection text for a time equal to or longer than a threshold value. Depending on the result, the video is played or the screen is switched to the URL.
In step S330, it is determined whether the extracted gaze pattern corresponds to a page turn command. Specifically, it is determined whether the user's gaze is fixed on the last word of the document (content) displayed on the screen. Depending on the result, the page of the displayed document is turned.
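The branch structure of steps S310 to S330 can be sketched as a small dispatcher. The function name, the `target` and `fixation_time` fields, and the one-second threshold are all illustrative assumptions, not values from the patent:

```python
FIXATION_THRESHOLD = 1.0  # seconds; an assumed threshold value

def handle_gaze_pattern(pattern):
    """pattern: dict with 'target' and 'fixation_time' keys (assumed shape).
    Returns the action the flow of FIG. 3 would take."""
    target = pattern["target"]
    # S320: long fixation on a video or on URL connection text
    if target in ("video", "url_text") and pattern["fixation_time"] >= FIXATION_THRESHOLD:
        return "play_video" if target == "video" else "open_url"
    # S330: fixation on the last word of the displayed page
    if target == "last_word":
        return "turn_page"
    # S310 'no' branch: no command matched, so feed the analysis path
    return "analyze_engagement"

print(handle_gaze_pattern({"target": "video", "fixation_time": 1.2}))  # play_video
```

Note that a short glance at a video (below the threshold) falls through to the analysis branch rather than triggering playback, which matches the fixation-time condition described for step S320.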
FIG. 4 is a view for explaining a gaze recognition process according to an embodiment of the present invention. Hereinafter, the gaze recognition process is described as being performed by the gaze information analyzer of the control apparatus using eye tracking.
Referring to FIG. 4 (a), an image including a user's eye region is shown. Specifically, an image including the eye region of one of a plurality of users gazing at a large screen is captured, and a portion corresponding to the face region is recognized in the captured image using face recognition technology.
Referring to FIG. 4 (b), an image of the face region recognized in FIG. 4 (a) is shown. To track the user's gaze, the eye region is recognized within the user's face region, and the gaze is recognized by analyzing the eye region image. Gaze information is then obtained by tracking the position, movement, and fixation of the recognized gaze.
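As a heavily simplified sketch of the final mapping from a recognized pupil position to a gaze point on screen, assuming the eye region has already been located by a face/eye detector: the pupil center's normalized position inside the eye bounding box is mapped linearly onto the screen. Real eye trackers calibrate per user and compensate for head pose; the linear map and all names here are illustrative only:

```python
def gaze_point(pupil, eye_box, screen=(1920, 1080)):
    """pupil: (px, py) pupil center in image coordinates.
    eye_box: (x, y, w, h) eye-region bounding box in the same coordinates.
    Returns an estimated (screen_x, screen_y) gaze point."""
    x, y, w, h = eye_box
    u = (pupil[0] - x) / w  # normalized horizontal pupil position in [0, 1]
    v = (pupil[1] - y) / h  # normalized vertical pupil position in [0, 1]
    return (u * screen[0], v * screen[1])

# Pupil at the center of the eye box -> roughly the screen center
print(gaze_point((50, 30), (40, 20, 20, 20)))  # (960.0, 540.0)
```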
The apparatus and method according to embodiments of the present invention may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium. The computer readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
Program instructions recorded on a computer-readable medium may be those specially designed and constructed for the present invention or those known and available to persons of ordinary skill in the computer software arts. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.
The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
The embodiments of the present invention have been described above. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed embodiments should be considered in an illustrative rather than a restrictive sense. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.
110: gaze information analyzer
120: control command setting unit
130: control command execution unit
140: gaze information storage unit
Claims (1)

1. A control device comprising:
a control command setting unit for setting a control command to be executed according to a user's gaze pattern and screen display information;
a gaze information analyzing unit for analyzing gaze information of the user and extracting a gaze pattern; and
a control command execution unit for executing the control command corresponding to the extracted gaze pattern and the screen display information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160012875A KR20170092167A (en) | 2016-02-02 | 2016-02-02 | Control device using eye-tracking |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160012875A KR20170092167A (en) | 2016-02-02 | 2016-02-02 | Control device using eye-tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170092167A true KR20170092167A (en) | 2017-08-11 |
Family
ID=59651560
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160012875A KR20170092167A (en) | 2016-02-02 | 2016-02-02 | Control device using eye-tracking |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170092167A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190078690A (en) * | 2017-12-13 | 2019-07-05 | 현대자동차주식회사 | Display control device and method organizing display environment user primary language direction |
WO2022065575A1 (en) * | 2020-09-25 | 2022-03-31 | 주식회사 비주얼캠프 | Gaze-based contents education method using object recognition and system for executing the method |
- 2016-02-02: Application KR1020160012875A filed in KR; status: not active (Application Discontinuation)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210076105A1 (en) | Automatic Data Extraction and Conversion of Video/Images/Sound Information from a Slide presentation into an Editable Notetaking Resource with Optional Overlay of the Presenter | |
US20210056251A1 (en) | Automatic Data Extraction and Conversion of Video/Images/Sound Information from a Board-Presented Lecture into an Editable Notetaking Resource | |
KR102381801B1 (en) | Systems and methods for guiding handwriting input | |
CN107273002B (en) | Handwriting input answering method, terminal and computer readable storage medium | |
US9049482B2 (en) | System and method for combining computer-based educational content recording and video-based educational content recording | |
CA3161129A1 (en) | Enhancing tangible content on physical activity surface | |
US10546508B2 (en) | System and method for automated literacy assessment | |
US9317486B1 (en) | Synchronizing playback of digital content with captured physical content | |
US20200387276A1 (en) | Virtualization of physical activity surface | |
Chatila et al. | Integrated planning and execution control of autonomous robot actions | |
US10984671B2 (en) | Information display apparatus, information display method, and computer-readable recording medium | |
Yadav et al. | Content-driven multi-modal techniques for non-linear video navigation | |
Zhao et al. | A new visual interface for searching and navigating slide-based lecture videos | |
Stearns et al. | The design and preliminary evaluation of a finger-mounted camera and feedback system to enable reading of printed text for the blind | |
US20150301726A1 (en) | Systems and Methods for Displaying Free-Form Drawing on a Contact-Sensitive Display | |
US20200334290A1 (en) | Facilitating contextual video searching using user interactions with interactive computing environments | |
KR20170092167A (en) | Control device using eye-tracking | |
Margetis et al. | Enhancing education through natural interaction with physical paper | |
CN113625985B (en) | Intelligent blackboard and display method and device thereof | |
Angrave et al. | Creating TikToks, Memes, Accessible Content, and Books from Engineering Videos? First Solve the Scene Detection Problem. | |
KR20220013172A (en) | Electronic device and Method for controlling the electronic device thereof | |
KR20130130396A (en) | Method for producing educational material and educational material system | |
JP2024022847A (en) | Information processing device, information processing method and program | |
Lin et al. | Learning-focused structuring for blackboard lecture videos | |
Rana et al. | A proposal for a novel e-learning system for the visually impaired |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |