KR101772181B1 - User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation - Google Patents


Info

Publication number
KR101772181B1
Authority
KR
South Korea
Prior art keywords
window
woi
module
display
real time
Prior art date
Application number
KR1020160004742A
Other languages
Korean (ko)
Other versions
KR20170085297A (en)
Inventor
최선영
고은지
Original Assignee
최선영
고은지
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 최선영, 고은지 filed Critical 최선영
Priority to KR1020160004742A priority Critical patent/KR101772181B1/en
Publication of KR20170085297A publication Critical patent/KR20170085297A/en
Application granted granted Critical
Publication of KR101772181B1 publication Critical patent/KR101772181B1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25866Management of end-user data
    • H04N21/25883Management of end-user data being end-user demographical data, e.g. age, family status or address
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8126Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N5/217

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Computer Graphics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A system and method for analyzing a user's multitasking viewing behavior using gaze analysis are disclosed. The system comprises: a gaze position data collection module for collecting the user's gaze position data on a display in real time; a window coordinate collection module for collecting, per window, the coordinates of at least two windows displayed on the display; a window display state collection module for collecting, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to its activation; and a WOI (window of interest) area calculation and gaze analysis module that determines in real time, based on the display state of each window collected by the window display state collection module, on which window the gaze position data collected by the gaze position data collection module stays, thereby calculating the WOI area. According to the multitasking behavior analysis system and method of the present invention, the web viewer's gaze is analyzed while the viewer watches video in a multitasking fashion, and data such as per-window viewing time and multitasking switch counts can be acquired, from which the viewer's broadcast viewing behavior can be understood and studied.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a system and method for analyzing multi-window multitasking viewing behavior.

The present invention relates to a system and method for analyzing a web user's multitasking behavior, and more particularly, to a system and method that uses gaze analysis to analyze the multitasking viewing behavior of a user who views content in multi-window form on a computer or monitor display. More specifically, the present invention provides a method that is useful for analyzing the viewing of video over a general-purpose Internet service or an OTT (Over The Top) service, and for analyzing e-sports live broadcast viewing behavior.

Recently, the consumption of web content has been rising, as viewers watch through a console or computer rather than in the simple one-way fashion of watching TV.

Examples include YouTube, Netflix, and the Korean services Tving, Naver TVCast, and Daum tvPot; web viewing such as e-sports (computer game relay broadcasts) is a typical example of watching video through an OTT service over a general-purpose Internet connection. Unlike viewing through a TV set, such web viewing has a "lean forward" character, with a great deal of interaction between the body and the screen.

When using a PC, notebook, or tablet PC, the viewing distance is much closer than with a TV set: for a TV set the distance is at least one meter, while the distance to a PC or notebook is far shorter. This proximity is what allows the user to reconfigure the GUI (Graphic User Interface) of the display using a mouse, keyboard, and the like.

Specifically, e-sports differ from other sporting events in that the broadcast shows the game screen the players are playing rather than the players themselves. In e-sports relays, interactive viewing such as real-time commenting and reaction is also far more active than in other sports broadcasts. Owing to this nature of e-sports, a single display is often divided into multiple windows: for example, a window for the game itself, a commentary window, a window showing a player's image, a chat window, and so on. Moreover, while watching one e-sports broadcast, a user may open and view multiple additional windows according to personal multitasking preference, such as an e-sports relay window, a portal search window, and a chat window, even when some of them are hidden. The user arranges the overlapping windows with a mouse or keyboard and decides, through the eyes, which window to watch and when to switch. Web viewing is thus achieved through multi-window multitasking on a single display.

Recently, web viewers select among multiple contents while shifting attention among these various windows from moment to moment. Among the younger generation, the viewing time and viewing frequency of each window change with the content, preference, involvement, perceived difficulty, performance, degree of immersion, and so on, and window switching follows accordingly. However, since conventional gaze tracking tracks only a single coordinate within each window, it is difficult to derive accurate measurements of what the user actually watched and to which window the user switched.

Therefore, if multitasking video viewing behavior is analyzed by tracking and measuring the gaze coordinates at which viewing occurs while overlapping multiple windows are in use on a display, the viewing behavior, interest, and preferences of web video viewers can be understood, and the results can serve as research data on the visual attention of game users.

Previously, analysis of multitasking video viewing centered on single-window coordinate analysis, so accurate analysis was not properly performed; in particular, dynamically changing windows have not yet been analyzed.

Accordingly, there is a need for a tool that accurately and in detail analyzes the multitasking behaviors frequently found in OTT service viewing and e-sports viewing on a web display.

More specifically, in measuring and analyzing the timing of multitasking switches, or in studying the simultaneous use of more than one window, it is necessary to measure gaze dwell on a per-window basis; such measurement is useful in analyzing the context and behavior of switching.

For example, in user surveys or research on the frequency of and preference for switching, one can look for concrete cues about which window is opened and used at which moment. This is useful in analyzing the relationship between window use and subjective or emotional changes, such as boredom, visual attention, content factors, and multitasking patterns of SNS use, and it makes it possible to pinpoint the timing of switches concretely.

An object of the present invention is to provide a multitasking behavior analysis system for a user who utilizes multiple windows on a web display screen using a user's gaze analysis.

It is another object of the present invention to provide a multitasking behavior analysis method of a user who utilizes multiple windows on a web display screen using a user's gaze analysis.

A system for analyzing a user's multitasking behavior using gaze analysis according to an object of the present invention includes: a gaze position data collection module for collecting gaze position data of a user on a display in real time; a window coordinate collection module for collecting, per window, the coordinates of at least two windows displayed on the display; a window display state collection module for collecting, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to its activation; and a WOI (window of interest) area calculation and gaze analysis module that grasps in real time, based on the display state of each window collected by the window display state collection module, on which window the collected gaze position data stays, and thereby calculates the WOI area.

Here, the system may further include a log data generation module that collects in real time the gaze position data collected by the gaze position data collection module, the window coordinates collected in real time by the window coordinate collection module, and the window display states collected by the window display state collection module, generates log data, and stores the log data in a log database.

The system may further include a WOI region visualization module that visualizes the WOI region on the display in real time, records the display as a video while the WOI region is visualized, and stores the generated video in a WOI region visualization video database.

The WOI area calculation and gaze analysis module may be further configured to calculate, from the calculated WOI area, the per-window watch time, the watch-switch frequency, and the watch frequency for each time period, to generate WOI log data, and to store the WOI log data in a WOI log data database.

According to another aspect of the present invention, there is provided a method of analyzing a user's multitasking behavior using gaze analysis, the method comprising: collecting, by a gaze position data collection module, gaze position data of a user on a display in real time; collecting, by a window coordinate collection module and per window, the coordinates of at least two windows displayed on the display; collecting, by a window display state collection module and for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to its activation; and grasping in real time, by a WOI area calculation and gaze analysis module and based on the display state of each window collected by the window display state collection module, on which window the gaze position data collected in real time stays, and calculating the WOI (window of interest) area.

Here, the method may further include the log data generation module collecting in real time the gaze position data collected by the gaze position data collection module, the window coordinates collected in real time by the window coordinate collection module, and the window display states collected by the window display state collection module, generating log data, and storing the log data in a log database.

The method may further include the WOI region visualization module visualizing the WOI region on the display in real time, recording the display as a video while the WOI region is visualized, and storing the generated video in the WOI region visualization video database.

The method may further include the WOI area calculation and gaze analysis module calculating, from the calculated WOI area, the per-window watch time, the watch-switch frequency, and the watch frequency for each time period, generating WOI log data, and storing the WOI log data in the WOI log data database.

According to the multitasking behavior analysis system and method of the present invention using gaze analysis, the web viewer's gaze is analyzed while the viewer watches video in a multitasking fashion, and data such as per-window viewing times and multitasking switch counts can be acquired, from which the web viewer's viewing behavior, including dynamically changing scenes, can be understood and studied.

In particular, even when windows change dynamically or multiple windows overlap one another, the system analyzes the web viewer's gaze in conjunction with the coordinates and activation state of each window, so the window of interest (WOI, comparable to an area of interest, AOI) can be grasped accurately.

Ultimately, by analyzing multi-window multitasking video viewing behavior on a web display, the results can be used, for example, to develop a user interface (UI) for an improved user experience (UX).

FIG. 1 is a block diagram of a system for analyzing a user's multitasking behavior using gaze analysis according to an embodiment of the present invention.
FIG. 2 is an exemplary view of a window-of-interest region and gaze classification process according to an embodiment of the present invention.
FIG. 3 is an exemplary view of a screen displaying a WOI according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method for analyzing a user's multitasking behavior using gaze analysis according to an embodiment of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not limited to the particular embodiments disclosed, but covers all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements throughout the drawings.

The terms first, second, A, B, etc. may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items or any one of a plurality of related listed items.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.

Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram of a system for analyzing a user's multitasking behavior using gaze analysis according to an embodiment of the present invention, FIG. 2 is an exemplary view of the window-of-interest region and gaze classification process according to an embodiment of the present invention, and FIG. 3 is an exemplary view of a screen displaying a WOI according to an embodiment of the present invention.

Referring to FIG. 1, a user's multitasking behavior analysis system (hereinafter, the 'multitasking behavior analysis system') 100 using gaze analysis according to an embodiment of the present invention includes a gaze position data collection module 110, a window coordinate collection module 120, a window display state collection module 130, a log data generation module 140, a log database 150, a WOI (window of interest) area calculation and gaze analysis module 160, a WOI log data database 170, a WOI region visualization module 180, and a WOI region visualization video database 190.

The multitasking behavior analysis system 100 acquires and analyzes data for studying the viewing behavior of a web viewer who watches web content such as OTT service video or e-sports.

The multitasking behavior analysis system 100 treats each of the multiple dynamic windows displayed on a single display as a window-level area of interest, the window of interest (WOI), analogous to an area of interest (AOI), and can acquire data on the number of multitasking switches, that is, how many times and for how long each window was watched, for viewing behavior analysis.

Hereinafter, the detailed configuration will be described.

The gaze position data acquisition module 110 may be configured to collect the gaze position data of the user on the display 10 in real time.

The gaze position data can be analyzed through a sensor provided on the display 10, so that the exact window (the WOI) on which the user's gaze rests can be identified on the display 10.

Collection of the gaze position data can be configured to capture accurately, through a camera sensor, a distance sensor, and the like, the distance between the display 10 and the user's eyes and the movement of the user's pupils.

The window coordinate collection module 120 may be configured to collect the coordinates of at least two windows displayed on the display 10. The window coordinate collection module 120 can collect and analyze each window's coordinates more accurately through a display control module (not shown) or the OS (Operating System).
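The window-coordinate record such a module would emit can be sketched as a small data structure; the field names below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class WindowRect:
    """One window's screen-space rectangle (hypothetical record layout)."""
    window_id: str
    left: int
    top: int
    width: int
    height: int

    def contains(self, x: int, y: int) -> bool:
        """True if display coordinate (x, y) falls inside this window."""
        return (self.left <= x < self.left + self.width
                and self.top <= y < self.top + self.height)

# Two overlapping windows on one display.
video = WindowRect("video", 0, 0, 1280, 720)
chat = WindowRect("chat", 1000, 400, 600, 500)
print(video.contains(1100, 500))  # True: the point lies inside both rectangles
print(chat.contains(1100, 500))   # True
```

A real implementation would query these rectangles from the OS windowing API; the sketch only fixes the shape of the data the later modules consume.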

The window display state collection module 130 may be configured to collect, for at least two windows displayed on the display 10, the window coordinates collected by the window coordinate collection module 120 and the display state of each window according to its activation.

The window display state collection module 130 can be configured to collect accurately the states in which windows overlap, and to determine with what stacking priority each overlapped window is displayed. In addition, it can be configured to grasp in real time which window is currently active.

The window display state collection module 130 can be configured to determine accurately, in display coordinates, which window is shown on top when several windows overlap one another.
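One way to resolve "which window is on top at a given coordinate" is to keep a z-order value per window and take the frontmost hit. This is a minimal sketch under that assumption; the dictionary format is invented for illustration, as the patent does not specify one.

```python
def topmost_window_at(x, y, windows):
    """Return the id of the frontmost window containing (x, y), or None.

    windows: list of dicts with 'id', 'rect' = (left, top, right, bottom),
    and 'z' (larger = closer to the front). Hypothetical format.
    """
    hits = [w for w in windows
            if w["rect"][0] <= x < w["rect"][2]
            and w["rect"][1] <= y < w["rect"][3]]
    if not hits:
        return None
    return max(hits, key=lambda w: w["z"])["id"]

windows = [
    {"id": "video", "rect": (0, 0, 1280, 720), "z": 1},
    {"id": "chat",  "rect": (1000, 400, 1600, 900), "z": 2},  # overlaps video, in front
]
print(topmost_window_at(1100, 500, windows))  # chat: the front window wins
print(topmost_window_at(100, 100, windows))   # video
print(topmost_window_at(1700, 950, windows))  # None: outside every window
```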

The log data generation module 140 collects in real time the gaze position data collected by the gaze position data collection module 110, the window coordinates collected in real time by the window coordinate collection module 120, and the window display states collected by the window display state collection module 130, generates log data, and stores it in the log database 150.

These data are each stored as raw data in the log database 150, and the stored raw data can be processed with various analysis techniques. For example, analysis data such as multitasking switch points, the frequency of multitasking switches, the watch time per window, the multitasking switch count per window, and the activation time of each window can be derived from the raw data.
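As an illustration of the kind of analysis such raw log rows support, the sketch below derives per-window watch time and a switch count from a time-ordered list of (timestamp, WOI) samples; the tuple layout is an assumption made for the example.

```python
from collections import defaultdict

def summarize_log(samples):
    """samples: time-ordered list of (timestamp_sec, woi_window_id) pairs.

    Returns per-window watch time (seconds) and the number of WOI switches.
    Each sample's window is credited with the time until the next sample.
    """
    watch_time = defaultdict(float)
    switches = 0
    for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
        watch_time[w0] += t1 - t0
        if w1 != w0:
            switches += 1
    return dict(watch_time), switches

# One sample per second: viewer glances at the chat window once.
log = [(0.0, "video"), (1.0, "video"), (2.0, "chat"), (3.0, "video"), (4.0, "video")]
times, n_switches = summarize_log(log)
print(times)       # {'video': 3.0, 'chat': 1.0}
print(n_switches)  # 2
```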

Table 1 below shows the total watch time of each currently displayed window by window.


Table 2 below illustrates the watching time of each content category.


Table 3 below shows data on multitasking behavior during e-sports relay viewing.


The WOI area calculation and gaze analysis module 160 can be configured to determine in real time, based on the display state of each window collected by the window display state collection module 130, on which window the gaze position data collected by the gaze position data collection module 110 stays, and thereby calculate the WOI region in real time.

The WOI area calculation and gaze analysis module 160 analyzes the user's gaze in real time and can accurately calculate, through the window display states, in which window the user's gaze lies. Even when the user changes a window's activation, size, or position with a mouse or keyboard, the module can be configured to grasp the change accurately.

The WOI area calculation and gaze analysis module 160 may be further configured to calculate, from the calculated WOI area, the per-window watch time and the watch-switch counts for each time period, to generate WOI log data, and to store it in the WOI log data database 170.

As shown in FIG. 2, coordinates are collected per window, and the window in which the WOI lies can be grasped in real time.

The WOI region visualization module 180 may be configured to visualize and display the WOI region on the display 10 in real time. That is, the window on which the user's gaze is resting may be highlighted so that the window the user is viewing is shown in real time.

In addition, the WOI region visualization module 180 may be configured to record the display as a video while the WOI region is visualized in real time, and to store the video in the WOI region visualization video database 190.

The WOI region visualization module 180 is useful because it lets an observer easily grasp, through a video, the viewing behavior of the user's gaze across the entire screen of the display 10.

A researcher studying the viewing behavior of web viewers can analyze web viewing behavior visually and stereoscopically through the video generated by the WOI region visualization module 180.

As shown in FIG. 3, among the plurality of windows, the window where the WOI is located is highlighted in yellow. This makes it easy to see accurately, in real time, which window the user is watching at any given moment.
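The highlighting itself reduces to mapping each window to a display style based on the current WOI. A trivial sketch (the colour names are illustrative, though FIG. 3 does use yellow for the WOI):

```python
def highlight_colors(window_ids, woi_id):
    """Map each window id to a border colour: yellow for the current WOI,
    a neutral grey for every other window."""
    return {wid: ("yellow" if wid == woi_id else "grey") for wid in window_ids}

print(highlight_colors(["video", "chat", "search"], "chat"))
# {'video': 'grey', 'chat': 'yellow', 'search': 'grey'}
```

A real visualization module would redraw these borders on every gaze sample while the recording runs.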

Meanwhile, based on the analysis of each web viewer's viewing behavior, an algorithm can be constructed that helps that viewer watch the web more conveniently. Such algorithms can be configured in various ways.

For example, a web viewing control module (not shown) may be configured to automatically control window activation based on when a viewer's multitasking switches occur and to which window they switch.

If analysis of web viewing behavior shows that web viewer A immediately checks SNS messages upon arrival, the system can be configured to activate and display the SNS messenger window for viewer A every time a message arrives and, depending on the setting, to display the SNS messenger window on the entire screen of the display 10. If the viewing behavior of web viewer B shows that B does not immediately check SNS messages, the system can be configured not to activate the SNS messenger window immediately.

Likewise, if analysis shows that web viewer A, when multitasking while watching a video, returns the gaze to the video window whenever the video's sound suddenly grows louder, the video window can be activated or displayed full screen at such moments. When the viewer's gaze leaves the video window, the original window layout can be restored.
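Such per-viewer rules can be sketched as a small lookup from a behaviour profile and an event to a window-control action; the profile keys and action names below are invented for illustration, not taken from the patent.

```python
def control_action(viewer_profile, event):
    """Toy rule table mapping an observed behaviour profile and an event
    to a window-control action (all names are hypothetical).

    viewer_profile: e.g. {'checks_sns_immediately': bool,
                          'returns_on_loud_audio': bool}
    event: 'sns_message' or 'audio_spike'
    """
    if event == "sns_message" and viewer_profile.get("checks_sns_immediately"):
        return "activate_sns_window"
    if event == "audio_spike" and viewer_profile.get("returns_on_loud_audio"):
        return "activate_video_window"
    return "no_action"

viewer_a = {"checks_sns_immediately": True, "returns_on_loud_audio": True}
viewer_b = {"checks_sns_immediately": False}
print(control_action(viewer_a, "sns_message"))  # activate_sns_window
print(control_action(viewer_b, "sns_message"))  # no_action
```

In practice the profile would be learned from the WOI log data rather than hard-coded.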

Meanwhile, the web viewing control module (not shown) may collect the real-time viewing rate of a video and automatically bring the video window forward at moments when the viewing rate spikes.

Meanwhile, window control can be made more convenient by keeping the window of the web viewer's interest active.

As described above, viewing behavior may vary from viewer to viewer, and the algorithm for controlling web viewing according to that behavior can be configured in various ways.

FIG. 4 is a flowchart illustrating a method for analyzing a user's multitasking behavior using gaze analysis according to an embodiment of the present invention.

Referring to FIG. 4, first, the gaze position data collection module 110 collects the gaze position data of the user on the display 10 in real time (S101).

Next, the window coordinate collection module 120 collects, per window, the coordinates of at least two windows displayed on the display 10 (S102).

Next, the window display state collection module 130 collects, for the at least two windows displayed on the display 10, the window coordinates collected by the window coordinate collection module 120 and the display state of each window according to its activation (S103).

Next, the WOI area calculation and gaze analysis module 160 grasps in real time, based on the display state of each window collected by the window display state collection module 130, on which window the gaze position data collected in real time by the gaze position data collection module 110 stays, and calculates the WOI (Window of Interest) area (S104).

Next, the log data generation module 140 collects in real time the gaze position data collected by the gaze position data collection module 110, the window coordinates collected in real time by the window coordinate collection module 120, and the window display states collected by the window display state collection module 130, generates log data, and stores it in the log database 150 (S105).

Next, the WOI region visualization module 180 visualizes and displays the WOI region on the display 10 in real time, records the display as a video while the WOI region is visualized, and stores the video in the WOI region visualization video database 190 (S106).

Next, the WOI area calculation and gaze analysis module 160 further calculates, from the calculated WOI area, the per-window watch time and the watch-switch counts for each time period, generates WOI log data, and stores it in the WOI log data database 170 (S107).
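Steps S101 through S105 can be sketched end to end as a single sampling loop; the stub gaze source and window table below are assumptions standing in for the real eye-tracker sensor and OS window queries.

```python
import random

def collect_gaze():
    """S101 stub: stands in for an eye-tracker reading (random for the demo)."""
    return random.randint(0, 1599), random.randint(0, 899)

# S102/S103: window coordinates plus a z-order standing in for activation state.
WINDOWS = [
    {"id": "video", "rect": (0, 0, 1280, 720), "z": 1},
    {"id": "chat",  "rect": (1000, 400, 1600, 900), "z": 2},
]

def woi_at(x, y, windows):
    """S104: frontmost window containing the gaze point, or None."""
    hits = [w for w in windows
            if w["rect"][0] <= x < w["rect"][2]
            and w["rect"][1] <= y < w["rect"][3]]
    return max(hits, key=lambda w: w["z"])["id"] if hits else None

# S105: accumulate raw log rows, one per sampling tick.
log = []
for t in range(10):
    x, y = collect_gaze()
    log.append((t, x, y, woi_at(x, y, WINDOWS)))

print(len(log))  # 10 rows of (tick, x, y, woi)
```

The stored rows are exactly the raw material from which S107's watch times and switch counts would then be aggregated.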

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention as defined in the following claims.

110: gaze position data collection module
120: window coordinate acquisition module
130: window display state collecting module
140: log data generation module
150: Log database
160: WOI area calculation and gaze analysis module
170: WOI log data database
180: WOI visualization module
190: WOI area visualization video database

Claims (8)

A gaze position data collection module for collecting gaze position data of a user on a display in real time;
A window coordinate collection module for collecting coordinates of at least two windows displayed on the display for each window;
A window display state collecting module for collecting window coordinates collected by the window coordinate collecting module and display states of respective windows according to activation of each window for at least two windows displayed on the display;
a WOI (window of interest) area calculation and gaze analysis module that grasps in real time, based on the display state of each window collected by the window display state collection module, on which window the gaze position data collected by the gaze position data collection module stays, and calculates the WOI area; and
a WOI region visualization module for visualizing the WOI region on the display in real time, recording the display as a video while the WOI region is visualized, and storing the generated video in a WOI region visualization video database, the system being a system for analyzing a user's multitasking behavior using the user's gaze analysis.
2. The system according to claim 1, further comprising a log data generation module that collects in real time the gaze position data collected by the gaze position data collection module, the window coordinates collected in real time by the window coordinate collection module, and the window display states collected by the window display state collection module, generates log data therefrom, and stores the generated log data in a log database.
3. (Deleted)

4. The system according to claim 1, wherein the WOI area calculation and gaze analysis module generates WOI log data by further calculating the window watching time, the watch switching count, and the watch count from the calculated WOI area, and stores the generated WOI log data in a WOI log data database.
5. A user multitasking behavior analysis method using gaze analysis, comprising:
collecting, by a gaze position data collection module, a user's gaze position data on a display in real time;
collecting, by a window coordinate collection module and for each window, the coordinates of at least two windows displayed on the display;
collecting, by a window display state collection module and for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to its activation;
grasping in real time, by a WOI area calculation and gaze analysis module and according to the display state of each window collected by the window display state collection module, on which window the gaze position data collected in real time by the gaze position data collection module stays, and calculating a WOI (window of interest) area accordingly;
collecting in real time, by a log data generation module, the gaze position data collected in real time by the gaze position data collection module, the window coordinates collected in real time by the window coordinate collection module, and the window display states collected by the window display state collection module, generating log data therefrom, and storing the log data in a log database; and
visualizing, by a WOI area visualization module, the WOI area on the display in real time, recording the visualized display as a video, and storing the recorded video in a WOI area visualization video database.
6. (Deleted)

7. (Deleted)

8. The method according to claim 5, wherein the WOI area calculation and gaze analysis module generates WOI log data by further calculating the window watching time, the watch switching count, and the watch count for each time period from the calculated WOI area, and stores the generated WOI log data in a WOI log data database.
KR1020160004742A 2016-01-14 2016-01-14 User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation KR101772181B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160004742A KR101772181B1 (en) 2016-01-14 2016-01-14 User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation


Publications (2)

Publication Number Publication Date
KR20170085297A KR20170085297A (en) 2017-07-24
KR101772181B1 (en) 2017-08-28

Family

ID=59429242

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160004742A KR101772181B1 (en) 2016-01-14 2016-01-14 User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation

Country Status (1)

Country Link
KR (1) KR101772181B1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102299103B1 (en) * 2019-10-23 2021-09-07 주식회사 비주얼캠프 Apparatus for gaze analysis, system and method for gaze analysis of using the same
KR102157607B1 (en) 2019-11-29 2020-09-18 세종대학교산학협력단 Method and server for visualizing eye movement and sight data distribution using smudge effect
KR102665453B1 (en) * 2022-01-17 2024-05-10 엔에이치엔 주식회사 Apparatus and method for providing customized content based on gaze recognition

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4396262B2 (en) * 2003-12-22 2010-01-13 富士ゼロックス株式会社 Information processing apparatus, information processing method, and computer program


Also Published As

Publication number Publication date
KR20170085297A (en) 2017-07-24

Similar Documents

Publication Publication Date Title
US11287956B2 (en) Systems and methods for representing data, media, and time using spatial levels of detail in 2D and 3D digital applications
US20190387257A1 (en) Method and system for generating highlights from scored data streams
Kurzhals et al. Space-time visual analytics of eye-tracking data for dynamic stimuli
JP4865811B2 (en) Viewing tendency management apparatus, system and program
US7556377B2 (en) System and method of detecting eye fixations using adaptive thresholds
JP6165846B2 (en) Selective enhancement of parts of the display based on eye tracking
KR20100048688A (en) Sharing system of emotion data and method sharing emotion data
KR101772181B1 (en) User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation
Brown et al. HCI over multiple screens
US20230044842A1 (en) Work analyzing device and work analyzing method
KR102466438B1 (en) Cognitive function assessment system and method of assessing cognitive funtion
US20210349620A1 (en) Image display apparatus, control method and non-transitory computer-readable storage medium
JP2012042507A (en) Video display device
CN105933787A (en) Video comment and processing method and device thereof, and server
Probst et al. SportSense: User Interface for Sketch-Based Spatio-Temporal Team Sports Video Scene Retrieval.
Li et al. Designing shared gaze awareness for remote collaboration
JP4536558B2 (en) TV program audience quality measurement method
JP6803431B2 (en) Programs, information processing devices and information processing methods
Rohs et al. Impact of item density on the utility of visual context in magic lens interactions
JP2013164667A (en) Video retrieval device, method for retrieving video, and video retrieval program
Ali-Hasan et al. Best practices for eye tracking of television and video user experiences
JP2019208885A (en) Emotion evaluation support system and emotion evaluation support method
Pelletier et al. Atypical visual display for monitoring multiple CCTV feeds
Radecký et al. Evaluating user reaction to user interface element using eye-tracking technology
Mussgnug et al. Target based analysis-A model to analyse usability tests based on mobile eye tracking recordings

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant