KR20170092167A - Control device using eye-tracking - Google Patents

Control device using eye-tracking

Info

Publication number
KR20170092167A
Authority
KR
South Korea
Prior art keywords
line
user
control command
sight
pattern
Prior art date
Application number
KR1020160012875A
Other languages
Korean (ko)
Inventor
이희경
김현철
서정일
이인재
황인욱
Original Assignee
한국전자통신연구원 (Electronics and Telecommunications Research Institute)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 (Electronics and Telecommunications Research Institute)
Priority to KR1020160012875A
Publication of KR20170092167A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A control device using user gaze tracking comprises: a control command setting unit for setting a control command to be executed according to a user's gaze pattern and screen display information; a gaze information analyzing unit for analyzing the user's gaze information and extracting a gaze pattern; and a control command execution unit for executing the control command corresponding to the extracted gaze pattern and the screen display information. The present invention thus executes control commands using the user's gaze tracking.

Description

CONTROL DEVICE USING EYE-TRACKING

BACKGROUND OF THE INVENTION 1. Field of the Invention. The present invention relates to a control technique using eye tracking and, more particularly, to a technique for tracking a user's gaze and executing control commands corresponding to the user's gaze pattern.

An electronic pen touch method or a hand-based multi-touch method is mainly used to control interactive whiteboards that replace conventional blackboards. However, with a touch-based control method, when the screen is large, whiteboard users (teachers, lecturers, and the like) must frequently change position to reach the object to be controlled. In large lecture rooms and meeting rooms with screens of 120 inches or more, the user's pen or hand may not even reach the top of the screen. In addition, students and other learners who consume the multimedia content shown on the whiteboard through e-books or tablet monitors need a natural screen-switching and control interface within a display area more limited than the whiteboard itself. Furthermore, in order to improve the quality of classes and content by analyzing learners' preference for multimedia content and their concentration, it is necessary to collect and analyze the learners' gaze information.

Accordingly, it is an object of the present invention to provide a user-friendly interface through a control device using eye-tracking technology.

It is also an object of the present invention to provide a technique for executing a control command using user's eye tracking.

It is another object of the present invention to provide a technique for analyzing a user's concentration, preference, and the like using the user's gaze tracking data.

According to an embodiment of the present invention, control using eye tracking is enabled.

According to an embodiment of the present invention, a user's concentration, preference, and the like can be analyzed using gaze tracking data.

FIG. 1 is a block diagram of a control apparatus using eye tracking according to an embodiment of the present invention.
FIG. 2 is a flowchart of a control method using eye tracking according to an embodiment of the present invention.
FIG. 3 is a flowchart of a method of executing a control command according to a gaze pattern according to an embodiment of the present invention.
FIG. 4 is a view for explaining a gaze recognition process according to an embodiment of the present invention.

While the present invention is susceptible to various modifications and alternative forms, specific embodiments are shown by way of example in the drawings and described in detail below. It should be understood, however, that the invention is not limited to the particular embodiments disclosed, but covers all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. In addition, singular expressions used in this specification and the claims should generally be interpreted to mean "one or more" unless otherwise stated.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS. Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings, in which like reference numerals refer to like or corresponding components throughout.

FIG. 1 is a block diagram of a control apparatus using eye tracking according to an embodiment of the present invention.

Referring to FIG. 1, the control apparatus using eye tracking includes a gaze information analyzing unit 110, a control command setting unit 120, a control command execution unit 130, and a gaze information storage unit 140.

The gaze information analyzing unit 110 analyzes the user's gaze information to extract a gaze pattern. Specifically, the gaze information analyzing unit 110 captures an image including the user's eye region, analyzes the eye region in the captured image, and recognizes the user's gaze. It then tracks the recognized gaze and extracts gaze information including the position, direction, and fixation of the gaze, and from this gaze information it extracts a gaze pattern composed of the gaze position, direction, and fixation.
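
By way of illustration only, a gaze pattern of this kind might be built on top of fixation detection over a stream of tracked gaze samples. The following is a minimal sketch of a dispersion-threshold fixation detector; the sample format, the pixel dispersion threshold, the minimum duration, and all names are assumptions of this example, not part of the disclosed design.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # gaze position on the screen, in pixels
    y: float

@dataclass
class Fixation:
    x: float         # centroid of the fixated samples
    y: float
    duration: float  # seconds the gaze stayed within the dispersion window

def _dispersion(window: List[GazeSample]) -> float:
    xs = [s.x for s in window]
    ys = [s.y for s in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def extract_fixations(samples: List[GazeSample],
                      max_dispersion: float = 50.0,
                      min_duration: float = 0.2) -> List[Fixation]:
    """Dispersion-threshold (I-DT) fixation detection: a run of samples whose
    spatial spread stays below max_dispersion for at least min_duration
    seconds is reported as one fixation."""
    fixations: List[Fixation] = []
    i, n = 0, len(samples)
    while i < n:
        # Grow an initial window spanning at least min_duration.
        j = i
        while j < n and samples[j].t - samples[i].t < min_duration:
            j += 1
        if j >= n:
            break
        if _dispersion(samples[i:j + 1]) <= max_dispersion:
            # Extend the window while the gaze stays put.
            while j + 1 < n and _dispersion(samples[i:j + 2]) <= max_dispersion:
                j += 1
            window = samples[i:j + 1]
            xs = [s.x for s in window]
            ys = [s.y for s in window]
            fixations.append(Fixation(sum(xs) / len(xs),
                                      sum(ys) / len(ys),
                                      window[-1].t - window[0].t))
            i = j + 1
        else:
            i += 1  # slide past the first sample and try again
    return fixations
```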

In one embodiment, the gaze information analyzing unit 110 may acquire an image including the user's eye region from an external device and extract a gaze pattern from it.

In one embodiment, when there are a plurality of users, the gaze information analyzing unit 110 may extract a gaze pattern for each user or extract a gaze pattern only for a specific user.

In one embodiment, when the user's position changes, the gaze information analyzing unit 110 may track the user's position and extract the gaze pattern from that user's gaze information.

The control command setting unit 120 sets a control command to be executed according to the user's gaze pattern. Specifically, the control command setting unit 120 sets a control command corresponding to each of various gaze patterns composed of the gaze position, direction, and fixation, so that the control command intended by the user is executed according to the user's gaze pattern. For example, for a gaze pattern in which the gaze of a user reading text displayed on the screen moves from the beginning to the end of the text, the control command setting unit 120 sets a control command to move to the next page of the displayed text and maps it to that gaze pattern.
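
As a concrete but purely illustrative sketch, the mapping maintained by the control command setting unit 120 could be as simple as a table from gaze pattern descriptors to command names; the descriptor fields and command names below are hypothetical.

```python
# Hypothetical descriptor: (fixated screen region, gaze movement direction,
# minimum fixation time in seconds) mapped to a command name.
command_map = {
    ("text_end", "left_to_right", 0.0): "next_page",  # finished reading: turn page
    ("video", "fixed", 1.0): "play_video",
    ("url_text", "fixed", 1.0): "open_url",
}

def set_command(pattern: tuple, command: str) -> None:
    """Map a gaze pattern descriptor to the command it should trigger."""
    command_map[pattern] = command
```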

Also, the control command setting unit 120 can modify a previously set control command through learning. For example, when control command B is set for gaze pattern A but the user directly inputs control command C after showing gaze pattern A, the control command setting unit 120 remaps gaze pattern A to control command C, changing the preset control command. Similarly, when control command B is set for gaze pattern A and the user directly inputs control command B after showing a gaze pattern similar to A, the control command setting unit 120 maps the similar gaze pattern to control command B as well.

Also, the control command setting unit 120 can set a new control command through learning. For example, if the user directly inputs a control command B after showing a gaze pattern A for which no command is set, the control command setting unit 120 maps gaze pattern A to control command B, thereby setting a new control command.
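
A minimal sketch of both learning behaviors, under the assumption that a command directly entered right after a gaze pattern is authoritative; the similarity function and threshold are assumptions of this example.

```python
def learn_mapping(observed_pattern, entered_command, command_map,
                  similar, sim_threshold=0.9):
    """Update the pattern-to-command table from user feedback.

    `similar(a, b)` is an assumed similarity score in [0, 1] between two
    gaze pattern descriptors.
    """
    if observed_pattern in command_map:
        # Case 1: the user corrected an existing mapping; overwrite it.
        command_map[observed_pattern] = entered_command
        return
    for known, cmd in list(command_map.items()):
        if cmd == entered_command and similar(observed_pattern, known) >= sim_threshold:
            # Case 2: a similar pattern already triggers this command;
            # register the observed pattern as an alias of it.
            command_map[observed_pattern] = entered_command
            return
    # Case 3: no related mapping exists; create a new one.
    command_map[observed_pattern] = entered_command
```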

In addition, the control command setting unit 120 can set a control command using both the user's gaze pattern and the screen display information, i.e., information about the content displayed on the screen. For example, if the user's gaze pattern is a fixation lasting a certain period of time or longer on one of a plurality of contents displayed on the screen, the control command setting unit 120 can set a control command so that the fixated content is played back if it is a video, or so that the screen is switched according to the URL if the fixated content is linked to a URL.

The control command execution unit 130 executes the control command corresponding to the gaze pattern. Specifically, the control command execution unit 130 searches for the control command mapped to the user's extracted gaze pattern and executes the control command corresponding to that pattern.

The control command execution unit 130 can also execute a control command that is not set by the control command setting unit 120, using the user's extracted gaze pattern information and the screen display information about the content displayed on the screen. For example, when the user's gaze is fixed on text linked to a URL among the text displayed on the screen, the control command execution unit 130 can execute a control command for switching the screen according to the URL. Likewise, when the user's gaze is fixed on a specific portion of an image displayed on the screen, the control command execution unit 130 can execute a control command for enlarging the portion of the image on which the user's gaze is fixed.
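
A hedged sketch of this screen-aware execution step, continuing the Fixation objects from the earlier sketch: the ScreenItem model, the hit_test helper, and the one-second threshold are assumptions rather than part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ScreenItem:
    kind: str                                  # "url_text", "image", "video", ...
    bounds: Tuple[float, float, float, float]  # x0, y0, x1, y1 in pixels
    url: Optional[str] = None                  # set when kind == "url_text"

def hit_test(items: List[ScreenItem], x: float, y: float) -> Optional[ScreenItem]:
    """Return the first item whose bounding box contains (x, y), if any."""
    for item in items:
        x0, y0, x1, y1 = item.bounds
        if x0 <= x <= x1 and y0 <= y <= y1:
            return item
    return None

def command_for_fixation(fixation, items: List[ScreenItem], min_fix: float = 1.0):
    """Derive a command from the fixated screen element even when no
    explicit pattern-to-command mapping exists."""
    if fixation.duration < min_fix:
        return None
    item = hit_test(items, fixation.x, fixation.y)
    if item is None:
        return None
    if item.kind == "url_text":
        return ("open_url", item.url)                    # switch to the linked URL
    if item.kind == "image":
        return ("zoom", item, (fixation.x, fixation.y))  # enlarge the fixated region
    if item.kind == "video":
        return ("play", item)                            # play the fixated video
    return None
```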

Also, the control command execution unit 130 can analyze the user's concentration, preference, and the like using the gaze pattern. For example, when the user's gaze remains fixed on the screen for a long time, the control command execution unit 130 can determine that the user is concentrating on the content displayed on the screen. Similarly, when the user's gaze is fixed only on a specific one of a plurality of contents displayed on the screen, the control command execution unit 130 can determine that the user prefers that content.
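
One plausible quantification, reusing hit_test and the Fixation objects from the sketches above (the names and the interpretation of the scores are assumptions): accumulate fixation time per content item and read the resulting shares as concentration and preference signals.

```python
from collections import defaultdict

def analyze_engagement(fixations, items):
    """Total fixation time per item: the on-content share of gaze time
    approximates concentration, the per-item shares approximate preference."""
    dwell = defaultdict(float)
    session_time = 0.0
    for f in fixations:
        session_time += f.duration
        item = hit_test(items, f.x, f.y)   # helper from the previous sketch
        if item is not None:
            dwell[id(item)] += f.duration  # keyed by object identity for brevity
    on_content = sum(dwell.values())
    concentration = on_content / session_time if session_time else 0.0
    preference = ({k: t / on_content for k, t in dwell.items()}
                  if on_content else {})
    return concentration, preference
```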

The gaze information storage unit 140 stores the user's gaze information. Specifically, it stores the gaze information obtained by tracking the user's gaze and the gaze patterns extracted by analyzing that gaze information.

The gaze information storage unit 140 may also store the mapping relationship between gaze patterns and control commands.

In addition, the gaze information storage unit 140 may store the user's concentration, preference, and other results obtained by analyzing the gaze information.

In addition, the gaze information storage unit 140 may store gaze information, gaze patterns, and the like directly input by the user in order to learn gaze patterns.
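
A minimal sketch of the kinds of records the gaze information storage unit 140 might keep; all field names here are assumptions of this example.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class GazeRecord:
    samples: list                          # raw tracked gaze samples
    pattern: tuple                         # gaze pattern extracted from them
    entered_command: Optional[str] = None  # command the user input right after, if any

@dataclass
class GazeStore:
    records: List[GazeRecord] = field(default_factory=list)      # gaze info and patterns
    command_map: Dict[tuple, str] = field(default_factory=dict)  # pattern -> command mapping
    engagement: Dict[str, float] = field(default_factory=dict)   # concentration/preference results
```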

FIG. 2 is a flowchart of a control method using eye tracking according to an embodiment of the present invention. Hereinafter, the method will be described, by way of example, as being performed by a control apparatus using eye tracking.

Referring to FIG. 2, in step S210, control commands according to gaze patterns are set. Specifically, a separate control command is mapped to each of the various gaze patterns composed of the position, movement, and fixation of the user's gaze.

In addition, the control commands can be set using both the gaze pattern and the screen display information. For example, control commands may be mapped based on the gaze pattern together with the screen display information, so that the same gaze pattern can trigger different control commands depending on the content displayed on the screen.

In step S220, the user's gaze information is analyzed to extract a gaze pattern. Specifically, the user's gaze is recognized from an image including the user's eye region, and the direction, movement, and fixation of the gaze are tracked. The gaze pattern is then extracted based on the tracking result.

In step S230, it is determined whether the extracted gaze pattern corresponds to a control command.

In step S240, when the extracted gaze pattern corresponds to a control command, the control command corresponding to the extracted gaze pattern is executed.

When the extracted gaze pattern does not correspond to any control command, the gaze information obtained by tracking the user's gaze and the gaze pattern extracted from it are instead analyzed to determine the user's concentration, preference, and the like.
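
Putting steps S210 through S240 together, the overall flow might look like the following loop; the component interfaces are assumptions layered on the sketches above.

```python
def control_loop(analyzer, command_map, execute, analyze):
    """FIG. 2 in sketch form: extract a gaze pattern (S220), check the
    mapping (S230), execute the mapped command (S240), or fall back to
    concentration/preference analysis."""
    while True:
        pattern = analyzer.extract_pattern()  # S220: analyze gaze information
        command = command_map.get(pattern)    # S230: does it map to a command?
        if command is not None:
            execute(command)                  # S240: run the mapped command
        else:
            analyze(pattern)                  # otherwise: engagement analysis
```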

FIG. 3 is a flowchart of a method of executing a control command according to a gaze pattern according to an embodiment of the present invention. Hereinafter, the method will be described by way of an example performed by the control device using eye tracking. It is assumed that a gaze pattern composed of the gaze position and fixation time is mapped to a video playback or URL connection control command, and that a gaze pattern fixed on the last word of a displayed document is mapped to a page turn control command.

Referring to FIG. 3, in step S310, it is determined whether the extracted gaze pattern corresponds to a control command. Specifically, if the extracted gaze pattern corresponds to a control command, a step of searching for the control command corresponding to the gaze pattern is performed. If the extracted gaze pattern does not correspond to any control command, the user's concentration or preference is analyzed based on the eye-tracking information.

In step S320, it is determined whether the extracted gaze pattern corresponds to a video playback or URL connection command. Specifically, it is determined whether the user's gaze has remained fixed on a video or on URL-linked text for a time equal to or longer than a threshold value. Depending on the result, the video is played or the screen is switched according to the URL.

In step S330, it is determined whether the extracted gaze pattern corresponds to a page turn command. Specifically, it is determined whether the user's gaze is fixed on the last word of the document (content) displayed on the screen. Depending on the result, the page of the displayed document is turned.
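
In sketch form, the decisions of steps S320 and S330 reduce to a check on the fixation target and its duration; the target labels and the one-second threshold are assumptions of this example.

```python
def classify_fixation(target_kind: str, fixation_time: float,
                      threshold: float = 1.0):
    """S320/S330: a long fixation on a video or on URL-linked text triggers
    playback or a screen switch; a fixation on the last word of the
    displayed document triggers a page turn."""
    if target_kind == "video" and fixation_time >= threshold:
        return "play_video"
    if target_kind == "url_text" and fixation_time >= threshold:
        return "open_url"
    if target_kind == "last_word":
        return "next_page"
    return None  # no control command; fall through to engagement analysis
```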

FIG. 4 is a view for explaining a gaze recognition process according to an embodiment of the present invention. Hereinafter, it is assumed that the gaze recognition process is performed by the gaze information analyzing unit of the control apparatus using eye tracking.

Referring to FIG. 4(a), an image including a user's eye region is shown. Specifically, an image including the eye region of one of a plurality of users gazing at a large screen is captured, and the portion corresponding to the face region is recognized in the captured image using face recognition technology.

Referring to FIG. 4(b), an image of the face region recognized in the image of FIG. 4(a) is shown. Specifically, in order to track the user's gaze, the eye region is located within the user's face region, and the user's gaze is recognized by analyzing the eye region image. The gaze information is then obtained by tracking the position, movement, and fixation of the recognized gaze.
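
The face-then-eye localization of FIG. 4 could be prototyped with standard detectors; the following sketch uses OpenCV Haar cascades, which is an implementation choice of this example, not necessarily the technology of the disclosed device.

```python
import cv2

# Stock OpenCV Haar cascade models for faces and eyes.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_eye_regions(frame):
    """Return eye bounding boxes (x, y, w, h) in full-frame coordinates.

    Mirrors FIG. 4: detect face regions first (FIG. 4(a) to 4(b)), then
    search each face region for eyes; gaze estimation proper (e.g. pupil
    analysis) would operate on the returned eye crops.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3,
                                                      minNeighbors=5):
        face_roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
            eyes.append((x + ex, y + ey, ew, eh))
    return eyes
```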

The apparatus and method according to embodiments of the present invention may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium. The computer readable medium may include program instructions, data files, data structures, and the like, alone or in combination.

Program instructions recorded on a computer-readable medium may be those specially designed and constructed for the present invention, or they may be known and available to those of ordinary skill in the computer software arts. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.

The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

The embodiments of the present invention have been described above. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed embodiments should be considered in an illustrative rather than a restrictive sense. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.

110: gaze information analyzing unit
120: control command setting unit
130: control command execution unit
140: gaze information storage unit

Claims (1)

A control device using user gaze tracking, the control device comprising:
a control command setting unit configured to set a control command to be executed according to a gaze pattern of a user and screen display information;
a gaze information analyzing unit configured to analyze gaze information of the user and extract a gaze pattern; and
a control command execution unit configured to execute the control command corresponding to the extracted gaze pattern and the screen display information.
KR1020160012875A 2016-02-02 2016-02-02 Control device using eye-tracking KR20170092167A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160012875A KR20170092167A (en) 2016-02-02 2016-02-02 Control device using eye-tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160012875A KR20170092167A (en) 2016-02-02 2016-02-02 Control device using eye-tracking

Publications (1)

Publication Number Publication Date
KR20170092167A true KR20170092167A (en) 2017-08-11

Family

ID=59651560

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160012875A KR20170092167A (en) 2016-02-02 2016-02-02 Control device using eye-tracking

Country Status (1)

Country Link
KR (1) KR20170092167A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190078690A (en) * 2017-12-13 2019-07-05 현대자동차주식회사 Display control device and method organizing display environment user primary language direction
WO2022065575A1 (en) * 2020-09-25 2022-03-31 주식회사 비주얼캠프 Gaze-based contents education method using object recognition and system for executing the method

Similar Documents

Publication Publication Date Title
US20210076105A1 (en) Automatic Data Extraction and Conversion of Video/Images/Sound Information from a Slide presentation into an Editable Notetaking Resource with Optional Overlay of the Presenter
US20210056251A1 (en) Automatic Data Extraction and Conversion of Video/Images/Sound Information from a Board-Presented Lecture into an Editable Notetaking Resource
KR102381801B1 (en) Systems and methods for guiding handwriting input
CN107273002B (en) Handwriting input answering method, terminal and computer readable storage medium
US9049482B2 (en) System and method for combining computer-based educational content recording and video-based educational content recording
CA3161129A1 (en) Enhancing tangible content on physical activity surface
US10546508B2 (en) System and method for automated literacy assessment
US9317486B1 (en) Synchronizing playback of digital content with captured physical content
US20200387276A1 (en) Virtualization of physical activity surface
Chatila et al. Integrated planning and execution control of autonomous robot actions
US10984671B2 (en) Information display apparatus, information display method, and computer-readable recording medium
Yadav et al. Content-driven multi-modal techniques for non-linear video navigation
Zhao et al. A new visual interface for searching and navigating slide-based lecture videos
Stearns et al. The design and preliminary evaluation of a finger-mounted camera and feedback system to enable reading of printed text for the blind
US20150301726A1 (en) Systems and Methods for Displaying Free-Form Drawing on a Contact-Sensitive Display
US20200334290A1 (en) Facilitating contextual video searching using user interactions with interactive computing environments
KR20170092167A (en) Control device using eye-tracking
Margetis et al. Enhancing education through natural interaction with physical paper
CN113625985B (en) Intelligent blackboard and display method and device thereof
Angrave et al. Creating TikToks, Memes, Accessible Content, and Books from Engineering Videos? First Solve the Scene Detection Problem.
KR20220013172A (en) Electronic device and Method for controlling the electronic device thereof
KR20130130396A (en) Method for producing educational material and educational material system
JP2024022847A (en) Information processing device, information processing method and program
Lin et al. Learning-focused structuring for blackboard lecture videos
Rana et al. A proposal for a novel e-learning system for the visually impaired

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E601 Decision to refuse application