KR101772181B1 - User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation - Google Patents
- Publication number
- KR101772181B1 (application KR1020160004742A)
- Authority
- KR
- South Korea
- Prior art keywords
- window
- woi
- module
- display
- real time
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/258—Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
- H04N21/25866—Management of end-user data
- H04N21/25883—Management of end-user data being end-user demographical data, e.g. age, family status or address
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8126—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
-
- H04N5/217—
Landscapes
- Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Computer Graphics (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A system and method for analyzing a user's multitasking viewing behavior using gaze analysis is disclosed. The system includes: a gaze position data collection module that collects the user's gaze position data on a display in real time; a window coordinate collection module that collects the coordinates of at least two windows displayed on the display; a window display state collection module that collects, for the at least two windows, the window coordinates collected by the window coordinate collection module and the display state of each window according to its activation; and a window of interest (WOI) area calculation and gaze analysis module that determines in real time, based on the display state of each window collected by the window display state collection module, in which window the collected gaze position data stays, and calculates the WOI area. According to the multitasking behavior analysis system and method of the present invention, the web viewer's gaze is analyzed while the viewer watches video in a multitasking manner, so that data such as per-window viewing time and multitasking switching can be acquired for understanding and studying the web viewer's broadcast viewing behavior.
Description
The present invention relates to a system and method for analyzing the multitasking viewing behavior of a web user, and more particularly, to a system and method that use the user's gaze analysis to analyze the behavior of a user who views content in a multi-window form on a display such as a computer monitor. The invention is particularly useful for analyzing the viewing behavior of users who watch video over general-purpose Internet service or OTT (Over The Top) services, such as e-sports live broadcasts.
Recently, the viewing rate of web content has been increasing, as users watch content through a console or a computer rather than in the simple one-way manner of watching TV.
Examples include YouTube, Netflix, and Korean services such as Tving, Naver TVCast, and Daum tvPot; web viewing of e-sports (electronic sports, i.e., relayed broadcasts of computer games) is a typical example of viewing video OTT services over a general-purpose Internet connection. Such web viewing is characterized by a "lean forward" posture, involving far more interaction between the body and the screen than watching through a TV set.
When using a PC, notebook, or tablet PC, the viewing distance is much closer than with a TV set: for TV sets the distance is at least about 1 meter, but the distance to a PC or notebook screen is considerably shorter. This is because the user reconfigures the GUI (Graphic User Interface) of the display using a mouse, a keyboard, and the like.
Specifically, e-sports broadcasts, unlike other sporting events, display the game screen that the players are playing rather than the players themselves. Also, in e-sports relays, interactive viewing such as real-time commenting and reaction is very active compared to relays of other sports. Due to this nature of e-sports, a single display is often divided into multiple windows: for example, a window for the game itself, a game commentary window, a window showing a player's image, a chat window, and the like. Moreover, while watching one e-sports broadcast, a user may open and view multiple additional windows according to his or her multitasking preferences, even when some windows are partly hidden — for example, an e-sports relay window, a portal search site window, and a chat window at the same time. The user arranges the overlapping windows with a mouse or keyboard and determines, through the eyes, which window to watch and when to switch. In this way, web viewing is achieved through multi-window multitasking on one display.
Recently, web viewers have been selecting among multiple contents while paying attention to these various windows from moment to moment. Among the younger generation in particular, the viewing time and viewing frequency of each window change according to the nature of the content, preference, involvement, perceived difficulty, performance, degree of immersion, and so on, and window switching occurs accordingly. However, since conventional gaze tracking methods track only a single coordinate occurring within each window, it is difficult to derive accurate measurements of what the user actually watched and which window the user switched to.
Therefore, if multitasking video viewing behavior is analyzed by tracking and measuring the gaze coordinates at which viewing behavior occurs when overlapping multiple windows are used on a display, the viewing behavior, interest, and preferences of the web video viewer can be grasped, and the results can be utilized as research data on the visual attention of game users.
Previously, analysis of such multitasking video viewing centered on single-window coordinate analysis, so accurate analysis was not properly performed. In particular, windows containing dynamic scenes have not yet been analyzed.
Accordingly, there is a need for a tool for accurately and thoroughly analyzing the multitasking behaviors that are commonly found in OTT service viewing and e-sports viewing on a web display.
More specifically, in measuring and analyzing the timing of multitasking switches, or when two or more of multiple windows are in use, it is necessary to measure gaze dwell on a per-window basis; such measurement is useful in analyzing the context and behavior of window switching.
For example, when conducting user surveys or research on the frequency and preference of switching, one can search for concrete cues about which window is opened and used at which moment. Such measurement also makes it possible to concretely analyze the relationship between switching and subjective or emotional factors such as boredom and visual attention, multitasking patterns of SNS use, and the timing of switching.
An object of the present invention is to provide a system for analyzing the multitasking behavior of a user who uses multiple windows on a web display screen, based on the user's gaze analysis.
It is another object of the present invention to provide a method for analyzing the multitasking behavior of a user who uses multiple windows on a web display screen, based on the user's gaze analysis.
According to an object of the present invention, a system for analyzing a user's multitasking behavior using the user's gaze analysis includes: a gaze position data collection module for collecting gaze position data of a user on a display in real time; a window coordinate collection module for collecting, for each window, coordinates of at least two windows displayed on the display; a window display state collection module for collecting, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to activation of each window; and a window of interest (WOI) area calculation and gaze analysis module that determines in real time, according to the display state of each window collected by the window display state collection module, in which window the gaze position data collected by the gaze position data collection module stays, and calculates the WOI area.
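As a concrete illustration of how the WOI area calculation could work, the sketch below (Python; the data layout and all names are assumptions, not taken from the patent) resolves a gaze coordinate against overlapping windows by treating the collected display state as a z-order:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class WindowState:
    """Hypothetical snapshot of one on-screen window (illustrative fields)."""
    name: str
    x: int          # left edge, in display coordinates
    y: int          # top edge
    width: int
    height: int
    z_order: int    # higher value = closer to the front

def find_woi(gaze_x: int, gaze_y: int,
             windows: List[WindowState]) -> Optional[WindowState]:
    """Return the front-most window containing the gaze point, or None.

    A raw gaze coordinate alone is ambiguous when windows overlap; the
    display state (here, z-order) decides which window is actually watched.
    """
    hits = [w for w in windows
            if w.x <= gaze_x < w.x + w.width
            and w.y <= gaze_y < w.y + w.height]
    if not hits:
        return None
    return max(hits, key=lambda w: w.z_order)
```

For example, if a chat window partly covers an e-sports relay window, a gaze sample in the overlap region is attributed to the chat window because it is in front — which is exactly why the patent pairs gaze coordinates with per-window display states rather than using coordinates alone.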
Here, the system may further include a log data generation module that collects in real time the gaze position data collected by the gaze position data collection module, the coordinates of each window collected in real time by the window coordinate collection module, and the display state of each window collected by the window display state collection module, generates log data, and stores the log data in a log database.
The system may further include a WOI region visualization module that visualizes the WOI region on the display in real time, records the video on which the WOI region is visualized, and stores the generated video in a WOI region visualization video database.
The WOI area calculation and gaze analysis module may further be configured to calculate the window watch time, the watch-switch frequency, and the watch frequency per time period according to the calculated WOI area, generate WOI log data, and store the WOI log data in a WOI log data database.
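The derived quantities named here — watch time per window, watch-switch frequency, and watch frequency per time period — can be computed from a WOI log with straightforward aggregation. A minimal sketch (Python; the log layout, fixed sampling interval, and function names are illustrative assumptions):

```python
from collections import Counter
from typing import Dict, List, Tuple

def woi_metrics(log: List[Tuple[float, str]],
                sample_dt: float, bin_size: float):
    """Aggregate a WOI log of (timestamp, window_name) samples.

    sample_dt: assumed fixed gaze sampling interval, in seconds.
    bin_size:  width of each time bin, in seconds.
    Returns (watch time per window, switch count, samples per window per bin).
    """
    watch_time: Counter = Counter()
    per_bin: Dict[int, Counter] = {}
    switches = 0
    prev = None
    for t, win in log:
        watch_time[win] += sample_dt                      # dwell time accrual
        per_bin.setdefault(int(t // bin_size), Counter())[win] += 1
        if prev is not None and win != prev:              # gaze moved windows
            switches += 1
        prev = win
    return dict(watch_time), switches, per_bin
```

For instance, a log sampled every 0.5 s in which the gaze moves from the game window to a chat window and back yields two switches, with watch time split accordingly across the two windows.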
According to another aspect of the present invention, there is provided a method of analyzing a user's multitasking behavior using the user's gaze analysis, the method comprising: collecting, by a gaze position data collection module, gaze position data of a user on a display in real time; collecting, by a window coordinate collection module, coordinates of at least two windows displayed on the display for each window; collecting, by a window display state collection module, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to activation of each window; and determining in real time, by a WOI area calculation and gaze analysis module, in which window the gaze position data collected in real time by the gaze position data collection module stays, according to the display state of each window collected by the window display state collection module, and calculating a window of interest (WOI) region.
Here, the method may further comprise collecting in real time, by a log data generation module, the gaze position data collected by the gaze position data collection module, the coordinates of each window collected in real time by the window coordinate collection module, and the display state of each window collected by the window display state collection module, generating log data, and storing the log data in a log database.
The method may further comprise visualizing, by a WOI region visualization module, the WOI region on the display in real time, recording the video on which the WOI region is visualized, and storing the generated video in a WOI region visualization video database.
The method may further comprise calculating, by the WOI area calculation and gaze analysis module, the window watch time, the watch-switch frequency, and the watch frequency per time period according to the calculated WOI area, generating WOI log data, and storing the WOI log data in a WOI log data database.
According to the multitasking behavior analysis system and method of the present invention using the user's gaze analysis, the web viewer's gaze is analyzed while the web viewer watches video in a multitasking manner, and data such as the viewing time of each window and the number and timing of multitasking switches can be acquired, making it possible to understand and study the web viewer's broadcast viewing behavior.
In particular, even when a window is dynamically changed or multiple windows overlap or are superimposed, the system analyzes the web viewer's gaze in conjunction with the coordinates and activation state of each window, so that the window of interest (WOI) the viewer is actually watching can be grasped accurately.
Ultimately, by analyzing multi-window multitasking video viewing behavior on a web display, the results can be applied, for example, to developing user interfaces (UI) that improve the user experience (UX).
FIG. 1 is a block diagram of a multitasking behavior analysis system for a user using the user's gaze analysis according to an embodiment of the present invention.
FIG. 2 is an exemplary view of the window-of-interest region and gaze classification process according to an embodiment of the present invention.
FIG. 3 is an exemplary view of a screen displaying a WOI according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method for analyzing a user's multitasking behavior using the user's gaze analysis according to an embodiment of the present invention.
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and described in detail herein. It should be understood, however, that the invention is not limited to the particular embodiments disclosed; rather, it covers all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements throughout the description of the drawings.
The terms first, second, A, B, etc. may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one element from another. For example, without departing from the scope of the present invention, a first element may be referred to as a second element, and similarly, a second element may be referred to as a first element. The term "and/or" includes any combination of a plurality of related listed items, or any item of a plurality of related listed items.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. The singular forms include plural forms unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" are used to specify the presence of features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having meanings consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.
Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a block diagram of a system for analyzing a user's multitasking behavior using the user's gaze analysis according to an embodiment of the present invention, FIG. 2 is an exemplary view of the window-of-interest region and gaze classification process according to an embodiment of the present invention, and FIG. 3 is an exemplary view of a screen displaying a WOI according to an embodiment of the present invention.
Referring to FIG. 1, a user's multitasking behavior analysis system (hereinafter, 'multitasking behavior analysis system') 100 using the user's gaze analysis according to an embodiment of the present invention includes a gaze position data collection module 110, a window coordinate collection module 120, a window display state collection module 130, a log data generation module 140, a log database 150, a WOI area calculation and gaze analysis module 160, a WOI log data database 170, a WOI region visualization module 180, and a WOI region visualization video database 190.
The multitasking behavior analysis system 100 is configured to analyze, through the user's gaze, the viewing behavior of a user who watches at least two windows displayed on a single display.
The multitasking behavior analysis system 100 collects the gaze position data, the window coordinates, and the display state of each window in real time, and analyzes them in conjunction with one another.
Hereinafter, the detailed configuration will be described.
The gaze position data collection module 110 collects the user's gaze position data on the display in real time.
The gaze position data may be collected and analyzed through a sensor, such as an eye tracker, provided on or near the display.
The gaze position data collection may also be configured to accurately account for the distance between the user's eyes and the display.
The window coordinate collection module 120 collects, for each window, the coordinates of the at least two windows displayed on the display.
The window display state collection module 130 collects, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module 120 and the display state of each window according to its activation.
The display state indicates, for example, which window is currently activated and which windows are overlapped or hidden by others.
From this information, the actually visible region of each window can be determined in real time.
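Since the patent does not fix an operating-system API for reading window geometry and activation, the sketch below (Python) shows one way the collected display state could be structured, with the actual enumeration call injected by the caller; all names and fields are assumptions:

```python
import time
from dataclasses import dataclass, asdict
from typing import Callable, List

@dataclass
class WindowDisplayState:
    """Illustrative per-window record: geometry plus activation state."""
    name: str
    rect: tuple     # (x, y, width, height) in display coordinates
    active: bool    # currently focused window
    visible: bool   # not minimized or fully hidden

def snapshot_display_states(
        enumerate_windows: Callable[[], List[WindowDisplayState]]) -> dict:
    """Collect one timestamped snapshot of every window's coordinates and
    display state. `enumerate_windows` stands in for a platform-specific
    call (e.g. a Win32 or X11 window query), which the patent leaves open.
    """
    return {"t": time.time(),
            "windows": [asdict(w) for w in enumerate_windows()]}
```

Such snapshots, taken at the gaze sampling rate, would supply the per-window display state that the WOI calculation consumes.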
The log data generation module 140 collects in real time the gaze position data collected by the gaze position data collection module 110, the window coordinates collected by the window coordinate collection module 120, and the window display states collected by the window display state collection module 130, and generates log data.
These data are respectively stored as raw data in the log database 150.
Table 1 below shows the total watch time of each currently displayed window by window.
Table 2 below illustrates the watching time of each content category.
Table 3 below shows data on multitasking behavior during e-sports relay viewing.
The WOI area calculation and gaze analysis module 160 determines in real time, according to the display state of each window collected by the window display state collection module 130, in which window the gaze position data collected by the gaze position data collection module 110 stays, and calculates the window of interest (WOI) area.
The WOI area calculation and gaze analysis module 160 may further calculate the window watch time, the watch-switch frequency, and the watch frequency per time period according to the calculated WOI area.
The WOI area calculation and gaze analysis module 160 generates WOI log data from these values and stores the WOI log data in the WOI log data database 170.
As shown in FIG. 2, the coordinates of each window are collected, and the window in which the WOI exists can be identified for each window in real time.
The WOI region visualization module 180 visualizes the WOI region on the display in real time.
In addition, the WOI region visualization module 180 may record the video on which the WOI region is visualized in real time.
The WOI region visualization module 180 stores the generated video in the WOI region visualization video database 190.
A researcher studying the viewing behavior of web viewers can visually and stereoscopically analyze web viewing behavior through the video generated by the WOI region visualization module 180.
As shown in FIG. 3, among the plurality of windows, the window where the WOI is located is highlighted in yellow. This makes it easy to identify accurately, in real time, which window the user is watching at any given moment.
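The yellow highlight of FIG. 3 amounts to drawing a frame around the current WOI window. A minimal sketch of the geometry such a visualization layer would need (Python; the patent does not specify a rendering backend, so only the overlay rectangle is computed, and the function name is an assumption):

```python
def highlight_overlay(woi_rect, border=4, color="yellow"):
    """Compute the overlay rectangle framing the current WOI window.

    woi_rect: (x, y, width, height) of the WOI window in display coordinates.
    Returns the geometry and color a drawing layer would use to render the
    highlight, expanded by `border` pixels on every side.
    """
    x, y, w, h = woi_rect
    return {"x": x - border, "y": y - border,
            "width": w + 2 * border, "height": h + 2 * border,
            "color": color}
```

The overlay would be recomputed whenever the WOI changes, so the highlight follows the viewer's gaze from window to window in real time.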
Meanwhile, based on the analysis of each web viewer's viewing behavior, it is possible to construct algorithms that help each web viewer watch the web more conveniently. Such algorithms can be configured in various ways.
For example, a web viewing control module (not shown) may be configured to automatically control the activation of windows based on when a web viewer's multitasking switch occurs and which window is switched to.
If, as a result of analyzing the web viewing behavior, web viewer A shows a pattern of checking SNS messages immediately upon arrival, the system can be configured to immediately activate and display the SNS messenger window every time a message arrives, or even to display the SNS messenger window on the entire screen of the display.
In addition, if the analysis of web viewer A's behavior shows that, while multitasking during video viewing, A's gaze returns to the video window whenever the sound of the video content grows rapidly louder, the system may be configured to activate the video window or display it on the entire screen at such moments. If the viewer's gaze then leaves the video window, the system can restore the original window layout.
Meanwhile, the web viewing control module (not shown) may collect the real-time viewing rate of a video and automatically bring the video window to the front at the moment the viewing rate increases rapidly.
It is also possible to make viewing more convenient by keeping the window containing the web viewer's area of interest (AOI) activated.
As described above, viewing behavior may vary from one web viewer to another, and the algorithm for controlling web viewing according to each viewer's behavior can be configured in various ways.
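The per-viewer control algorithms described above can be organized as a small rule engine: each analyzed behavior pattern becomes a rule mapping an observed event to a window-control action. A sketch under assumed event and action names (none of which come from the patent):

```python
from typing import Callable, Dict, List, Tuple

class ViewingController:
    """Rule-based sketch of the web viewing control module: rules learned
    from one viewer's behavior map observed events to window actions."""

    def __init__(self):
        self.rules: List[Tuple[Callable[[Dict], bool], str]] = []

    def add_rule(self, predicate: Callable[[Dict], bool], action: str):
        """Register a rule: when `predicate(event)` holds, emit `action`."""
        self.rules.append((predicate, action))

    def handle(self, event: Dict) -> List[str]:
        """Return every action whose rule matches the incoming event."""
        return [action for pred, action in self.rules if pred(event)]

controller = ViewingController()
# Pattern: viewer A checks SNS messages immediately -> activate messenger.
controller.add_rule(lambda e: e.get("type") == "sns_message",
                    "activate_sns_window")
# Pattern: gaze returns to the video window while its audio peaks
# -> bring the video window to the front.
controller.add_rule(lambda e: e.get("type") == "gaze_enter"
                    and e.get("window") == "video"
                    and e.get("audio_peak", False),
                    "maximize_video_window")
```

New patterns discovered for a given viewer would simply become additional rules, which matches the text's point that the control algorithm can be configured differently per viewer.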
FIG. 4 is a flowchart illustrating a method for analyzing a user's multitasking behavior using the user's gaze analysis according to an exemplary embodiment of the present invention.
Referring to FIG. 4, first, the gaze position data collection module 110 collects the user's gaze position data on the display in real time.
Next, the window coordinate collection module 120 collects, for each window, the coordinates of at least two windows displayed on the display.
Next, the window display state collection module 130 collects, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module 120 and the display state of each window according to its activation.
Next, the WOI area calculation and gaze analysis module 160 determines in real time in which window the gaze position data collected in real time by the gaze position data collection module 110 stays, according to the display state of each window collected by the window display state collection module 130, and calculates the window of interest (WOI) area.
Next, the log data generation module 140 collects in real time the gaze position data, the window coordinates, and the window display states, generates log data, and stores the log data in the log database 150.
Next, the WOI region visualization module 180 visualizes the WOI region on the display in real time, records the video on which the WOI region is visualized, and stores the generated video in the WOI region visualization video database 190.
Next, the WOI area calculation and gaze analysis module 160 further calculates the window watch time, the watch-switch frequency, and the watch frequency per time period according to the calculated WOI area, generates WOI log data, and stores the WOI log data in the WOI log data database 170.
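Taken together, the steps of FIG. 4 form a per-sample loop. The sketch below (Python) wires hypothetical stand-ins for the collection modules, the WOI classifier, and log storage into one iteration; every callable name is an assumption:

```python
def analysis_step(sample_gaze, enumerate_windows, classify_woi, log):
    """One iteration of the FIG. 4 pipeline (a sketch: the three callables
    and the log list stand in for the patent's modules).

    sample_gaze()               -> (x, y, timestamp): gaze position sample
    enumerate_windows()         -> window coordinates plus display state
    classify_woi(x, y, windows) -> name of the WOI window, or None
    """
    gx, gy, t = sample_gaze()             # gaze position data collection
    windows = enumerate_windows()         # window coordinate / state collection
    woi = classify_woi(gx, gy, windows)   # WOI area determination
    log.append({"t": t, "gaze": (gx, gy), "woi": woi})  # log data generation
    return woi
```

Running this step at the gaze sampler's rate would produce exactly the kind of per-sample WOI log from which the watch-time and switch-frequency statistics of the previous step are later derived.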
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention as defined in the following claims.
110: gaze position data collection module
120: window coordinate collection module
130: window display state collection module
140: log data generation module
150: log database
160: WOI area calculation and gaze analysis module
170: WOI log data database
180: WOI region visualization module
190: WOI region visualization video database
Claims (8)
A gaze position data collection module for collecting gaze position data of a user on a display in real time;
A window coordinate collection module for collecting, for each window, coordinates of at least two windows displayed on the display;
A window display state collection module for collecting, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to activation of each window;
A WOI area calculation and gaze analysis module that determines in real time, according to the display state of each window collected by the window display state collection module, in which window the gaze position data collected by the gaze position data collection module stays, and calculates a window of interest (WOI) area; and
A WOI region visualization module for visualizing the WOI region on the display in real time, recording the video on which the WOI region is visualized in real time, and storing the generated video in a WOI region visualization video database — constituting a user's multitasking behavior analysis system using the user's gaze analysis.
The system may further comprise a log data generation module that collects in real time the gaze position data collected by the gaze position data collection module, the coordinates of each window collected in real time by the window coordinate collection module, and the display state of each window collected by the window display state collection module, generates log data, and stores the generated log data in a log database.
Wherein the WOI area calculation and gaze analysis module further calculates the window watch time, the watch-switch frequency, and the watch frequency per time period according to the calculated WOI area, generates WOI log data, and stores the generated WOI log data in the WOI log data database.
Collecting, by a gaze position data collection module, gaze position data of a user on a display in real time;
Collecting, by a window coordinate collection module, coordinates of at least two windows displayed on the display for each window;
Collecting, by a window display state collection module, for the at least two windows displayed on the display, the window coordinates collected by the window coordinate collection module and the display state of each window according to activation of each window;
Determining in real time, by a WOI area calculation and gaze analysis module, in which window the gaze position data collected in real time by the gaze position data collection module stays, according to the display state of each window collected by the window display state collection module, and calculating a window of interest (WOI) area;
Collecting in real time, by a log data generation module, the gaze position data collected by the gaze position data collection module, the coordinates of each window collected in real time by the window coordinate collection module, and the display state of each window collected by the window display state collection module, generating log data, and storing the log data in a log database;
Visualizing, by a WOI region visualization module, the WOI region on the display in real time, recording the video on which the WOI region is visualized, and storing the recorded video in a WOI region visualization video database; and
Further calculating, by the WOI area calculation and gaze analysis module, the window watch time, the watch-switch frequency, and the watch frequency per time period according to the calculated WOI area, generating WOI log data, and storing the WOI log data in the WOI log data database — constituting a method for analyzing a user's multitasking behavior using the user's gaze analysis.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160004742A KR101772181B1 (en) | 2016-01-14 | 2016-01-14 | User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160004742A KR101772181B1 (en) | 2016-01-14 | 2016-01-14 | User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170085297A KR20170085297A (en) | 2017-07-24 |
KR101772181B1 true KR101772181B1 (en) | 2017-08-28 |
Family
ID=59429242
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160004742A KR101772181B1 (en) | 2016-01-14 | 2016-01-14 | User's multi-tasking watching act on the web display analyzing system and method using user's eye movement analyzation |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101772181B1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102299103B1 (en) * | 2019-10-23 | 2021-09-07 | 주식회사 비주얼캠프 | Apparatus for gaze analysis, system and method for gaze analysis of using the same |
KR102157607B1 (en) | 2019-11-29 | 2020-09-18 | 세종대학교산학협력단 | Method and server for visualizing eye movement and sight data distribution using smudge effect |
KR102665453B1 (en) * | 2022-01-17 | 2024-05-10 | 엔에이치엔 주식회사 | Apparatus and method for providing customized content based on gaze recognition |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4396262B2 (en) * | 2003-12-22 | 2010-01-13 | 富士ゼロックス株式会社 | Information processing apparatus, information processing method, and computer program |
- 2016-01-14: KR application KR1020160004742A filed (patent KR101772181B1, status: active, IP Right Grant)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4396262B2 (en) * | 2003-12-22 | 2010-01-13 | 富士ゼロックス株式会社 | Information processing apparatus, information processing method, and computer program |
Also Published As
Publication number | Publication date |
---|---|
KR20170085297A (en) | 2017-07-24 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |