KR20120058947A - Method and apparatus for displaying contents using eye tracking - Google Patents
- Publication number
- KR20120058947A (publication of application KR1020100120487A)
- Authority
- KR
- South Korea
- Prior art keywords
- user
- area
- content
- unit
- display unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
The present invention relates to a method and apparatus for outputting content, and more particularly, to a method and apparatus for outputting content using eye tracking, which can provide various effects by using eye tracking technology.
Thanks to remarkable developments in information and communication technology and semiconductor technology, recent portable terminals provide not only general communication functions, such as voice calls and message transmission and reception, but also TV viewing functions (e.g., mobile broadcasting such as digital multimedia broadcasting (DMB) or digital video broadcasting (DVB)), a music playback function (e.g., MP3 (MPEG Audio Layer-3)), a photo recording function, a data communication function, an Internet connection function, and a location information providing function. To support these functions, the portable terminal may include various sensors and additional devices, for example, a camera, an illuminance sensor, a proximity sensor, an infrared sensor, a vibration motor, and a fragrance sensor.
Meanwhile, portable terminals with large screens, such as tablet PCs and e-book terminals, have become increasingly common in recent years, and interest in e-books that allow books to be read on portable terminals is growing accordingly. However, the conventional portable terminal provides only a simple function of outputting text and images to the display unit. This can be tedious for the user and can reduce concentration. As such, the conventional portable terminal has a problem in that it does not effectively utilize its various sensors when outputting content.
The present invention was devised to solve the above-mentioned problems of the prior art. An object of the present invention is to provide a method and apparatus for outputting content using eye tracking that can distinguish, by means of eye tracking technology, the area the user is looking at from other areas, and display the two with visual differentiation.
Another object of the present invention is to provide a method and apparatus for outputting content using eye tracking that can provide various auditory, tactile, and olfactory effects (sound effects, vibration, and fragrance) corresponding to the information contained in the area that the user is viewing (reading).
Still another object of the present invention is to provide a method and apparatus for outputting content using eye tracking that varies the output magnification of the content according to the distance between the user and the terminal.
According to a preferred embodiment of the present invention, a method for outputting content using eye tracking includes: outputting content; confirming whether the eye tracking function is activated; activating a camera unit when the eye tracking function is activated; extracting the area viewed by the user from the content display area through a face image of the user photographed by the camera unit; and visually distinguishing the area viewed by the user from other areas when outputting.
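The claimed method steps above (check activation, activate camera, extract gaze area, differentiate output) can be illustrated with a minimal sketch. All function and parameter names here are hypothetical illustrations, not taken from the patent:

```python
# Hypothetical sketch of the claimed output flow; region names and the
# "highlight"/"normal" styles are illustrative placeholders.

def render_content(gaze_region, regions):
    """Return a per-region display style, visually distinguishing the gazed region."""
    return {r: ("highlight" if r == gaze_region else "normal") for r in regions}

def output_content(eye_tracking_active, gaze_region, regions):
    # When the eye tracking function is not activated, all regions render identically.
    if not eye_tracking_active:
        return {r: "normal" for r in regions}
    # When activated, the camera-derived gaze region is differentiated from the rest.
    return render_content(gaze_region, regions)
```

The gaze region itself would come from the camera unit's face-image analysis, which is outside this sketch.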
According to another aspect of the present invention, there is provided a content output apparatus using eye tracking, including: a display unit configured to output content; a camera unit for photographing a face image of a user; and a control unit which checks whether the eye tracking function is activated when the content is output, activates the camera unit when the eye tracking function is activated, extracts the area of the content display area viewed by the user through the face image captured by the camera unit, and controls the display unit to visually differentiate the area viewed by the user from other areas.
As described above, the method and apparatus for outputting content using eye tracking according to an exemplary embodiment of the present invention may improve the readability of the content and hold the user's attention by displaying the area viewed by the user differently from other areas. In addition, the present invention may present content realistically by providing tactile, auditory, and olfactory effects (vibration, sound effects, and scents) corresponding to information included in the area viewed by the user. Battery consumption can be reduced by lowering the brightness of the areas the user is not reading and by turning off the display unit when the user is not viewing the content. Furthermore, because the present invention automatically controls the output magnification of the content according to the distance between the user and the terminal, the user can always view the content at the optimum magnification.
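The automatic magnification control mentioned above can be sketched as a simple distance-proportional scaling rule. The reference distance and clamp limits are assumptions for illustration; the patent does not specify concrete values:

```python
# Hypothetical magnification rule: scale content in proportion to viewing
# distance, clamped to a usable range. reference_cm, min_scale, and
# max_scale are illustrative values, not from the patent.

def output_magnification(distance_cm, reference_cm=30.0, min_scale=0.5, max_scale=3.0):
    """Return the content output magnification for a measured user-to-terminal distance."""
    scale = distance_cm / reference_cm
    return max(min_scale, min(max_scale, scale))
```

For example, a user at twice the reference distance would see the content at twice the base magnification, while extreme distances are clamped.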
FIG. 1 is a block diagram schematically illustrating a configuration of a portable terminal according to an exemplary embodiment of the present invention.
FIG. 2 is a flowchart illustrating a content output method using eye tracking according to an exemplary embodiment of the present invention.
FIG. 3 is a view showing content output using eye tracking according to an embodiment of the present invention.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the drawings, the same components are denoted by the same reference numerals wherever possible. Detailed descriptions of well-known functions and configurations that may obscure the gist of the present invention are omitted.
It should be noted that the embodiments disclosed in the present specification and drawings are merely illustrative, presented to facilitate understanding of the present invention, and are not intended to limit its scope. It will be apparent to those skilled in the art that other modifications based on the technical idea of the present invention can be carried out in addition to the embodiments disclosed herein.
FIG. 1 is a view schematically showing the configuration of a portable terminal according to an exemplary embodiment of the present invention.
Referring to FIG. 1, the
The
The
Meanwhile, the
The
The
The
The data area is an area in which data generated according to the use of the
The
In detail, the
In addition, the
The
The
On the other hand, although not shown in FIG. 1, the
FIG. 2 is a flowchart illustrating a content output method using eye tracking according to an exemplary embodiment of the present invention, and FIG. 3 is a view illustrating a content output method using eye tracking according to an exemplary embodiment of the present invention.
1 to 3, the
The
When the extraction of the
The
The
In contrast, when the eye tracking function is not terminated in
Although not shown in FIG. 2, the
The content output method using eye tracking according to the embodiment of the present invention described above may be implemented in the form of program instructions that can be executed by various computer means and recorded on a computer-readable recording medium. The computer-readable recording medium may include program commands, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the art of computer software.
Computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Program instructions include not only machine code generated by a compiler but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention.
Although specific terms have been used in the above description of preferred embodiments of the method and apparatus for outputting content using eye tracking according to embodiments of the present invention, they are used only in a general sense to describe the technical content of the present invention easily and to aid understanding of the invention; the present invention is not limited to the above-described embodiments. That is, it is apparent to those skilled in the art that various embodiments based on the technical idea of the present invention are possible.
100: portable terminal 110: control unit
120: storage unit 130: display unit
140: camera unit 150: sensor unit
160: audio processor 111: eye tracking unit
Claims (17)
Confirming whether the eye tracking function is activated;
Activating a camera unit when the eye tracking function is activated;
Extracting an area viewed by the user from a content display area through a face image of the user photographed through the camera unit; And
And visually distinguishing, when outputting, the region that the user is viewing from other regions.
The process of visually differentiating and outputting
Increasing brightness of the area viewed by the user;
Enlarging and outputting an area viewed by the user;
Changing a color of an area viewed by the user; And
And a process of reducing the brightness of the other area.
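The differentiation processes enumerated in the claim above (increasing brightness, enlarging, changing color, dimming other areas) can be combined in a minimal sketch. The specific adjustment values and style keys are illustrative assumptions, not specified by the patent:

```python
# Hypothetical per-region styling for gaze-based differentiation.
# brightness_boost, dim, and zoom are illustrative defaults.

def apply_gaze_styles(regions, gazed, brightness_boost=0.2, dim=0.3, zoom=1.25):
    """Map {region: base_brightness} to display styles, emphasizing the gazed region."""
    styled = {}
    for name, base_brightness in regions.items():
        if name == gazed:
            # Gazed region: brighter, enlarged, and color-changed.
            styled[name] = {"brightness": min(1.0, base_brightness + brightness_boost),
                            "scale": zoom, "color": "highlight"}
        else:
            # Other regions: dimmed to save battery and focus attention.
            styled[name] = {"brightness": max(0.0, base_brightness - dim),
                            "scale": 1.0, "color": "default"}
    return styled
```

A real implementation would apply these styles through the display unit rather than returning a dictionary; the dimming of non-gazed regions also serves the battery-saving effect described earlier.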
And outputting the text information included in the area that the user is viewing by converting the text information into voice information.
Analyzing information of an area viewed by the user; And
And providing at least one effect of hearing, touch, and smell corresponding to the analysis result.
And turning off the display unit when the face image of the user is not captured through the camera unit.
Measuring a distance between the user and the display unit;
And adjusting the output magnification of the content according to the measured distance.
The process of measuring the distance
Measuring through a distance measuring sensor; And
And at least one of measuring distances using stereo vision technology using at least two cameras.
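The stereo-vision distance measurement named in the claim above follows the standard pinhole stereo model, where depth is recovered from the disparity between the two cameras. This is a generic sketch of that model, not the patent's implementation; the numeric parameters are illustrative:

```python
# Pinhole stereo depth: Z = f * B / d, where f is the focal length in
# pixels, B the camera baseline, and d the horizontal disparity of the
# same point in the left and right images. Parameter values are examples.

def stereo_distance(focal_px, baseline_cm, x_left_px, x_right_px):
    """Estimate distance to a point (e.g., the user's face) from two camera views."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        # Zero or negative disparity means the point is at or beyond
        # the range the camera pair can resolve.
        raise ValueError("non-positive disparity")
    return focal_px * baseline_cm / disparity
```

For example, with a 500 px focal length, a 6 cm baseline, and a 20 px disparity, the estimated distance is 150 cm; the result could then drive the magnification adjustment of the preceding claim.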
A camera unit for photographing a face image of a user; And
Check whether the eye tracking function is activated when the content is output, activate the camera unit when the eye tracking function is activated, and the user of the content display area through the face image of the user photographed through the camera unit. And a control unit configured to extract the viewing area and control the display unit to visually differentiate and output the area viewed by the user from other areas.
The control unit
And the display unit is controlled to increase the brightness of the area that the user is viewing.
The control unit
And controls the display unit to enlarge and output the area viewed by the user.
The control unit
And controls the display unit to change and output the color of the area viewed by the user.
The control unit
And the display unit is controlled to reduce the brightness of the other area.
And an audio processing unit for converting text information included in an area viewed by the user into voice information and outputting the converted voice information.
And a sensor unit configured to provide tactile and olfactory effects in response to the information included in the area viewed by the user.
The control unit
And adjusting the output magnification of the content according to the distance between the user's pupil and the terminal.
The control unit
A content output device using eye tracking, characterized in that the distance is measured using stereo vision technology with at least two cameras.
The control unit
And the display unit is turned off when the face image of the user is not captured through the camera unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020100120487A KR20120058947A (en) | 2010-11-30 | 2010-11-30 | Method and apparatus for displaying contents using eye tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20120058947A true KR20120058947A (en) | 2012-06-08 |
Family
ID=46610328
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020100120487A KR20120058947A (en) | 2010-11-30 | 2010-11-30 | Method and apparatus for displaying contents using eye tracking |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20120058947A (en) |
- 2010-11-30: Application KR1020100120487A filed in Korea (published as KR20120058947A); status: not active, Application Discontinuation
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101529241B1 (en) * | 2013-10-07 | 2015-06-17 | 황성재 | System and method for controlling an electronic device based upon contents displayed on the electronic device |
KR20160074315A (en) * | 2014-12-18 | 2016-06-28 | 한국과학기술원 | User terminal and method for providing haptic service of the same |
CN112306223A (en) * | 2019-08-30 | 2021-02-02 | 北京字节跳动网络技术有限公司 | Information interaction method, device, equipment and medium |
CN112306223B (en) * | 2019-08-30 | 2024-03-26 | 北京字节跳动网络技术有限公司 | Information interaction method, device, equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102281233B1 (en) | Apparatus and method controlling display | |
CN110572722B (en) | Video clipping method, device, equipment and readable storage medium | |
CN110708596A (en) | Method and device for generating video, electronic equipment and readable storage medium | |
US11720179B1 (en) | System and method for redirecting content based on gestures | |
IES20180181A2 (en) | Maximizing the size of a wide-screen movie on a mobile device with a front camera | |
CN110097428B (en) | Electronic order generation method, device, terminal and storage medium | |
CN109660855B (en) | Sticker display method, device, terminal and storage medium | |
CN110097429B (en) | Electronic order generation method, device, terminal and storage medium | |
CN108419113B (en) | Subtitle display method and device | |
US20230315256A1 (en) | Method for displaying application icon and electronic device | |
KR20150017131A (en) | Mobile terminal and method for controlling the mobile terminal | |
KR20110071349A (en) | Method and apparatus for controlling external output of a portable terminal | |
CN112929687A (en) | Interaction method, device and equipment based on live video and storage medium | |
US9491401B2 (en) | Video call method and electronic device supporting the method | |
WO2021013147A1 (en) | Video processing method, device, terminal, and storage medium | |
US20190012129A1 (en) | Display apparatus and method for controlling display apparatus | |
CN113407291A (en) | Content item display method, device, terminal and computer readable storage medium | |
CN112257006A (en) | Page information configuration method, device, equipment and computer readable storage medium | |
CN113225616A (en) | Video playing method and device, computer equipment and readable storage medium | |
CN113409427A (en) | Animation playing method and device, electronic equipment and computer readable storage medium | |
KR20120058947A (en) | Method and apparatus for displaying contents using eye tracking | |
CN114845152B (en) | Display method and device of play control, electronic equipment and storage medium | |
EP2587359A2 (en) | Method and apparatus for making personalized contents | |
KR102005406B1 (en) | Dispaly apparatus and controlling method thereof | |
CN112995760A (en) | Video processing method, device, equipment and computer storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |