US10506284B2 - Visual utility analytic method and related eye tracking device and system - Google Patents
Visual utility analytic method and related eye tracking device and system
- Publication number: US10506284B2
- Application number: US15/935,048 (US201815935048A)
- Authority
- US
- United States
- Prior art keywords
- film
- solo
- display
- watched
- time
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
- H04N21/4667—Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
Definitions
- The present invention relates to a visual utility analytic method and a related eye tracking device and system, and more particularly, to a visual utility analytic method and a related eye tracking device and system capable of evaluating the visual utility of a display target based on reactions of the viewer.
- Eye tracking techniques can detect a viewer's eye motions (e.g., gazing time, order of gazing points, pupil dilation, and so on) to track the gazing target of the viewer.
- The eye tracking device may be applied to visual utility evaluation, recording watched data while the viewer is watching web pages, advertisements or films, to find the display target most watched by the viewer and thereby evaluate its visual utility (e.g., the visual contribution and popularity of the display target).
- A movie maker may evaluate the visual contribution and popularity of acting personnel in a movie to determine their payment, or perform image composition analysis of the film according to viewers' watching habits.
- An advertiser may evaluate the effect of embedded advertising based on the visual contribution and popularity of a commercial product in the film to determine a sponsorship amount, or determine a placement of the commercial product on the screen according to viewers' watching habits.
- Visual utility analytics may also be used to select the key segments (or story plots) of the film as a basis for film editing; for example, an editor may keep the key segments (or story plots) in the raw footage and remove less-viewed segments to ensure the popularity of the film.
- The present invention discloses a visual utility analytic method for an eye tracking system, wherein the eye tracking system comprises a screen and an eye detecting device.
- The method includes dividing each of a plurality of film segments into at least one display area, wherein each of the plurality of film segments corresponds to at least one display target, and the at least one display target respectively corresponds to one of the at least one display area; determining a plurality of watched areas corresponding to a plurality of eye tracking detection results generated by the eye detecting device; and comparing the at least one display area corresponding to the at least one display target in the plurality of film segments with the plurality of watched areas corresponding to the plurality of eye tracking detection results, to determine a plurality of visual utilities of the at least one display target in the plurality of film segments.
- The present invention further discloses an eye tracking system for performing a visual utility analytic process.
- The eye tracking system includes a screen for displaying a plurality of film segments, wherein each of the plurality of film segments corresponds to at least one display target, and the at least one display target respectively corresponds to one of at least one display area; an eye detecting device for respectively generating a plurality of eye tracking detection results when the screen is playing the plurality of film segments; and a processing device coupled to the screen and the eye detecting device, for performing a visual utility analytic process according to the plurality of film segments and the plurality of eye tracking detection results, to determine a plurality of visual utilities.
- The process includes dividing each of the plurality of film segments into at least one display area; determining a plurality of watched areas corresponding to the plurality of eye tracking detection results generated by the eye detecting device; and comparing the at least one display area corresponding to the at least one display target in the plurality of film segments with the plurality of watched areas corresponding to the plurality of eye tracking detection results, to determine a plurality of visual utilities of the at least one display target in the plurality of film segments.
- The present invention further discloses an electronic device for an eye tracking system, for performing a visual utility analytic process, wherein the eye tracking system comprises a screen and an eye tracking device.
- The electronic device includes a processing device, and a memory device coupled to the processing device for storing a program code, wherein the program code instructs the processing device to perform a visual utility analytic process according to a plurality of film segments displayed by the screen and a plurality of eye tracking detection results generated by the eye tracking device, to determine a plurality of visual utilities.
- The process includes dividing each of the plurality of film segments into at least one display area, wherein each of the plurality of film segments corresponds to at least one display target, and the at least one display target respectively corresponds to one of the at least one display area; determining a plurality of watched areas corresponding to the plurality of eye tracking detection results generated by the eye tracking device; and comparing the at least one display area corresponding to the at least one display target in the plurality of film segments with the plurality of watched areas corresponding to the plurality of eye tracking detection results, to determine a plurality of visual utilities of the at least one display target in the plurality of film segments.
- The eye tracking system of the present invention can compare whether the display area of the display target is the same as the area watched by the viewer to evaluate the visual utility of the display target (e.g., acting personnel, a commercial product or a story plot) in the film based on the viewer's direct responses, so the film maker and the advertiser can refer to the visual utility when evaluating the popularity and the contribution of the display target.
- FIG. 1 is a schematic diagram of an eye tracking system according to an embodiment of the present invention.
- FIG. 2 illustrates an operating context of the eye tracking system in FIG. 1 according to an embodiment of the present invention.
- FIG. 3 illustrates an operating context of the eye tracking system in FIG. 1 according to an embodiment of the present invention.
- FIG. 4 is a flow chart of a visual utility analytic process according to an embodiment of the present invention.
- FIG. 1 is a schematic diagram of an eye tracking system 1 according to an embodiment of the present invention.
- The eye tracking system 1 includes a screen 10, an eye detecting device 12 and a processing device 14.
- The eye tracking system 1 may be used in a movie theater or an audio-visual room to gather watched data of viewers for each film segment of a film, and to combine the locations of a display target (e.g., acting personnel, a commercial product and so on) in each film segment to analyze the visual utility and popularity of the display target in the film.
- The eye detecting device 12 may be disposed between a seat of the viewer and the screen 10, and is used for tracking the eye motion of the viewer (e.g., watched time, moving positions, and pupil dilation of the eyes).
- The screen 10 may be a display device (e.g., a television, or a projector and curtain) for displaying a film, where the film includes a plurality of film segments F1-FN and each of the film segments F1-FN includes at least one display target TX.
- The eye detecting device 12 is used for respectively generating a plurality of eye tracking detection results E1-EN when the screen 10 is playing the film segments F1-FN, wherein the eye tracking detection results E1-EN correspond to a plurality of watched coordinates W1-WN.
- The processing device 14 is coupled to the screen 10 and the eye detecting device 12, and is used for determining a plurality of visual utilities U1-UN of the at least one display target TX in the film segments F1-FN according to the film segments F1-FN and the eye tracking detection results E1-EN.
- In detail, the processing device 14 may determine the display area corresponding to the display target TX according to a display coordinate of the display target TX on the screen 10 in a film segment FX, and determine the watched area corresponding to a watched coordinate WX according to the watched coordinate WX of the eye tracking detection result EX.
- The processing device 14 may then compare the display area corresponding to the display target TX with the watched area corresponding to the watched coordinate WX to determine a visual utility UX of the display target TX. For example, when the display area corresponding to the display target TX is the same as the watched area corresponding to the watched coordinate WX, the visual utility UX of the display target TX is valid; on the contrary, when the two areas differ, the visual utility UX of the display target TX is invalid. Based on statistics of the valid and invalid rates of the visual utilities U1-UN, a total watched rate of the display target TX over the film segments F1-FN can be obtained.
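As a minimal sketch of this comparison (illustrative only; the `Segment` record, area numbering and function names are assumptions, not the patent's implementation), a segment's visual utility is valid when the target's display area matches the watched area, and the total watched rate follows from the per-segment durations:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One film segment FX: where the target is shown and where the viewer looked."""
    display_area: int  # display area (e.g., A1-A4) holding display target TX
    watched_area: int  # area containing the watched coordinate WX of result EX
    duration: float    # play time of the segment, in seconds

def visual_utility(seg: Segment) -> bool:
    """Valid when the watched area matches the target's display area."""
    return seg.display_area == seg.watched_area

def total_watched_rate(segments: list[Segment]) -> float:
    """Ratio of total watched time to total show-up time over segments F1-FN."""
    watched = sum(s.duration for s in segments if visual_utility(s))
    total = sum(s.duration for s in segments)
    return watched / total if total else 0.0

# Example: the target is watched in two of three segments (15 s of 20 s).
segs = [Segment(1, 1, 10.0), Segment(2, 2, 5.0), Segment(1, 2, 5.0)]
print(f"total watched rate: {total_watched_rate(segs):.0%}")  # 75%
```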
- In this way, the eye tracking system 1 may evaluate the visual utility of the display target based on the viewer's direct responses, which allows the film maker to refer to the visual utility when evaluating the popularity and the contribution of the display target.
- In an embodiment, the eye tracking detection results E1-EN may correspond to a plurality of watched coordinate ranges rather than single coordinates.
- The processing device 14 may be an independent electronic device, or may be integrated with the screen 10 or the eye detecting device 12.
- FIG. 2 and FIG. 3 respectively illustrate an operating context of the eye tracking system performing the visual utility analytic process according to an embodiment of the present invention.
- The processing device 14 may perform visual analysis of the film by executing a deep learning algorithm (e.g., establishing an artificial neural network to execute deep learning face recognition); for example, it may determine which film segments F1-FN of the film show the display target, and recognize associated information of the display target, such as how many acting personnel there are in a film segment, the names of the acting personnel, and the coordinates and areas at which the acting personnel are shown on the screen 10.
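Purely as an illustration of this kind of per-segment analysis, the sketch below assumes a hypothetical `detect_faces` helper standing in for a deep-learning face recognizer; none of these names come from the patent:

```python
from dataclasses import dataclass

@dataclass
class TargetInfo:
    name: str                        # recognized acting personnel
    bbox: tuple[int, int, int, int]  # (x, y, width, height) on screen 10

def detect_faces(frame) -> list[TargetInfo]:
    """Hypothetical deep-learning face recognizer (e.g., a CNN detector plus
    an embedding matcher); returns the targets found in one frame."""
    raise NotImplementedError

def analyze_segment(frames) -> list[TargetInfo]:
    """Collect the display targets appearing anywhere in one film segment FX,
    with the coordinates at which they are shown."""
    seen: dict[str, TargetInfo] = {}
    for frame in frames:
        for target in detect_faces(frame):
            seen.setdefault(target.name, target)  # keep first sighting
    return list(seen.values())
```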
- Then, the processing device 14 may divide each of the film segments F1-FN (e.g., the display range of the screen 10) into at least one display area, wherein each of the film segments F1-FN corresponds to at least one display target, and the at least one display target respectively corresponds to one of the at least one display area.
- The visual composition guideline proposes that an image be imagined as divided into nine equal parts by two equally spaced horizontal lines and two equally spaced vertical lines (also known as the rule of thirds), and that the display target be placed at their intersections. Accordingly, in this embodiment, the display range of the screen 10 is divided into four display areas A1, A2, A3 and A4.
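For illustration, a sketch of this four-area division, assuming A1 is up-left, A2 up-right, A3 down-left and A4 down-right (consistent with the corners named below), with the origin at the screen's top-left corner:

```python
def watched_area(x: float, y: float, width: float, height: float) -> int:
    """Map a coordinate on screen 10 to one of four display areas.

    Assumed numbering: 1 = A1 (up-left), 2 = A2 (up-right),
    3 = A3 (down-left), 4 = A4 (down-right).
    """
    left = x < width / 2
    top = y < height / 2
    if top:
        return 1 if left else 2
    return 3 if left else 4

# A gaze point in the upper-right region of a 1920x1080 screen falls in A2.
assert watched_area(1500, 200, 1920, 1080) == 2
```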
- For example, the processing device 14 may determine that the acting personnel is shown in the display area A1 in the up-left corner of the screen 10.
- Meanwhile, the eye detecting device 12 may instantly output the eye tracking detection result EX to the processing device 14, and the processing device 14 accordingly determines that the watched area corresponding to the watched coordinate WX is the display area A2 in the up-right corner of the screen 10.
- The processing device 14 thus determines that the display area A1 corresponding to the display target TX is different from the watched area A2 corresponding to the watched coordinate WX, and thereby determines that the visual utility UX of the display target TX is invalid (i.e., the viewer does not watch the acting personnel while watching the film segment FX). Noticeably, by dividing the display range of the screen 10 into at least one display area, a misjudgment of visual utility due to a mapping error between the watched coordinate of the eye tracking detection result and the display coordinate of the display target can be avoided.
- In an embodiment, the processing device 14 may determine a total watched rate and a non-solo watched rate of at least one of the display targets according to the visual utilities U1-UN, as the sketch after the following definitions illustrates.
- The total watched rate is the ratio of a total watched time to a total show-up time.
- The non-solo watched rate is the ratio of a non-solo watched time to a non-solo show-up time, wherein the total show-up time includes the solo and non-solo show-up times.
- The processing device 14 may count the total watched time of the acting personnel (i.e., the sum of the watched times in solo film segments and non-solo film segments) according to the valid visual utilities among the visual utilities U1-UN. By computing the ratio of the total watched time to the total show-up time (i.e., the total play time of all the film segments F1-FN), the processing device 14 may obtain the total watched rate of the acting personnel.
- When a film segment FX of the film segments F1-FN shows a sole display target, the film segment FX is a solo film segment; the solo show-up time is the total time of all such solo film segments among the film segments F1-FN, and the solo watched time is the total time of the solo film segments with valid visual utility among the film segments F1-FN.
- Likewise, the processing device 14 may count the non-solo show-up time of the acting personnel (i.e., the total play time of the non-solo film segments, in which the acting personnel shows up with other acting personnel) and, according to the valid visual utilities among the visual utilities U1-UN, the non-solo watched time. By computing the ratio of the non-solo watched time to the non-solo show-up time, the processing device 14 may obtain the non-solo watched rate of the acting personnel.
- When a film segment FX of the film segments F1-FN includes a plurality of display targets, the film segment FX is a non-solo film segment; the non-solo show-up time is the total time of all the non-solo film segments among the film segments F1-FN, and the non-solo watched time is the total time of the non-solo film segments with valid visual utility among the film segments F1-FN.
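A hedged Python sketch of these statistics (the `TargetSegment` record and its field names are assumptions for illustration); the example reproduces the acting personnel A row of Table 1 below:

```python
from dataclasses import dataclass

@dataclass
class TargetSegment:
    """A film segment in which one acting personnel shows up."""
    duration: float  # play time in seconds
    watched: bool    # visual utility valid for this target in this segment?
    solo: bool       # True if the target is the sole display target shown

def watched_rates(segments: list[TargetSegment]) -> tuple[float, float]:
    """Return (total watched rate, non-solo watched rate) for one target."""
    total_time = sum(s.duration for s in segments)
    total_watched = sum(s.duration for s in segments if s.watched)
    non_solo = [s for s in segments if not s.solo]
    non_solo_time = sum(s.duration for s in non_solo)
    non_solo_watched = sum(s.duration for s in non_solo if s.watched)
    return (
        total_watched / total_time if total_time else 0.0,
        non_solo_watched / non_solo_time if non_solo_time else 0.0,
    )

# Acting personnel A from Table 1: 61 s total show-up time (45 s non-solo),
# 38.3 s watched overall, 24.3 s of it in non-solo segments.
segs = [
    TargetSegment(14.0, True, solo=True),    # solo, watched
    TargetSegment(2.0, False, solo=True),    # solo, not watched
    TargetSegment(24.3, True, solo=False),   # non-solo, watched
    TargetSegment(20.7, False, solo=False),  # non-solo, not watched
]
total_rate, non_solo_rate = watched_rates(segs)
print(f"total: {total_rate:.0%}, non-solo: {non_solo_rate:.0%}")  # 63%, 54%
```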
- Table 1 is an example of the visual utility analytics. Comparing the visual utility analytic results of acting personnel A and B shows that the total watched rate of acting personnel A is the same as that of acting personnel B, but the non-solo watched rate of acting personnel A is greater than that of acting personnel B.
- That is, when acting personnel A and B simultaneously show up on the screen, more viewers watched acting personnel A, and thus the visual contribution (popularity) of acting personnel A is the highest.
- The total watched rate and the non-solo watched rate of acting personnel C are the lowest, and thus the visual contribution of acting personnel C is the lowest.
- In this way, the eye tracking system 1 may evaluate the visual utility of the display target based on the viewer's direct responses, which allows the film maker to refer to the visual utility when evaluating the popularity and the contribution of the display target.
- Operations of the eye tracking system 1 may be summarized into a process 40, as shown in FIG. 4. The process 40 may be stored in a memory device for instructing the processing device 14 to execute the visual utility analytic process.
- The process 40 includes the following steps; an illustrative sketch follows them.
- Step 400: Start.
- Step 401: Divide each of a plurality of film segments into at least one display area, wherein each of the plurality of film segments corresponds to at least one display target, and each of the at least one display target respectively corresponds to one of the at least one display area.
- Step 402: Determine a plurality of watched areas corresponding to a plurality of eye tracking detection results, respectively.
- Step 403: Compare the at least one display area corresponding to the at least one display target with the plurality of watched areas corresponding to a plurality of watched coordinates in the plurality of film segments, to determine a visual utility of the at least one display target in the plurality of film segments.
- Step 404: End.
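A minimal end-to-end sketch of process 40, reusing the `watched_area` helper from the earlier sketch and the same illustrative assumptions (quadrant numbering, one watched coordinate per segment); it is not the patent's implementation:

```python
def process_40(film_segments, eye_results, width=1920, height=1080):
    """Steps 401-403: compare each target's display area with the watched area.

    film_segments: list of (target_name, display_area) pairs, one per segment
    eye_results:   list of watched coordinates (x, y), one per segment
    Returns a list of (target_name, visual_utility_is_valid) pairs.
    """
    utilities = []
    for (target, display_area), (x, y) in zip(film_segments, eye_results):
        area = watched_area(x, y, width, height)           # Step 402
        utilities.append((target, display_area == area))   # Step 403
    return utilities

# Target shown in A1 (up-left) while the viewer gazes up-right (A2):
# the visual utility for that segment is invalid.
print(process_40([("personnel A", 1)], [(1500, 200)]))  # [('personnel A', False)]
```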
- In conclusion, the eye tracking system of the present invention can compare whether the display area of the display target is the same as the area watched by the viewer to evaluate the visual utility of the display target (e.g., acting personnel, a commercial product or a story plot) in the film based on the viewer's direct responses, so the film maker and the advertiser can refer to the visual utility when evaluating the popularity and the contribution of the display target.
Description
TABLE 1
 | Total watched time | Total watched rate | Non-solo watched time | Non-solo watched rate
---|---|---|---|---
Acting personnel A | 38.3 seconds | 38.3/61 = 63% | 24.3 seconds | 24.3/45 = 54%
Acting personnel B | 32.0 seconds | 32.0/51 = 63% | 20.5 seconds | 20.5/39 = 53%
Acting personnel C | 15.5 seconds | 15.5/31 = 50% | 8.0 seconds | 8.0/23 = 35%
Claims (19)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW106126864A | 2017-08-09 | ||
TW106126864A TWI642030B (en) | 2017-08-09 | 2017-08-09 | Visual utility analytic method and related eye tracking device and system |
TW106126864 | 2017-08-09 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190052933A1 US20190052933A1 (en) | 2019-02-14 |
US10506284B2 true US10506284B2 (en) | 2019-12-10 |
Family
ID=62217723
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/935,048 Active US10506284B2 (en) | 2017-08-09 | 2018-03-25 | Visual utility analytic method and related eye tracking device and system |
Country Status (4)
Country | Link |
---|---|
US (1) | US10506284B2 (en) |
EP (1) | EP3441850A1 (en) |
CN (1) | CN109388232B (en) |
TW (1) | TWI642030B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11076202B2 (en) * | 2018-04-05 | 2021-07-27 | International Business Machines Corporation | Customizing digital content based on consumer context data |
CN110248241B (en) * | 2019-06-11 | 2021-06-04 | Oppo广东移动通信有限公司 | Video processing method and related device |
CN110337032A (en) * | 2019-06-11 | 2019-10-15 | 福建天泉教育科技有限公司 | Video broadcasting method, storage medium based on attention rate |
CN112130320A (en) * | 2019-06-24 | 2020-12-25 | 宏碁股份有限公司 | Head-mounted display device and adjustment method thereof |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015070182A2 (en) * | 2013-11-09 | 2015-05-14 | Firima Inc. | Optical eye tracking |
US20150227735A1 (en) * | 2014-02-13 | 2015-08-13 | Robert Chappell | System and method for eye tracking authentication |
AU2015297036B2 (en) * | 2014-05-09 | 2017-09-28 | Google Llc | Systems and methods for discerning eye signals and continuous biometric identification |
CN105426399A (en) * | 2015-10-29 | 2016-03-23 | 天津大学 | Eye movement based interactive image retrieval method for extracting image area of interest |
- 2017
  - 2017-08-09: TW TW106126864A patent/TWI642030B/en active
  - 2017-10-12: CN CN201710944727.9A patent/CN109388232B/en active Active
- 2018
  - 2018-03-25: US US15/935,048 patent/US10506284B2/en active Active
  - 2018-05-03: EP EP18170492.5A patent/EP3441850A1/en not_active Ceased
Patent Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020116516A1 (en) * | 2001-02-21 | 2002-08-22 | Fuji Xerox Co., Ltd | Method and apparatus for management and representation of dynamic context |
US20040156020A1 (en) * | 2001-12-12 | 2004-08-12 | Edwards Gregory T. | Techniques for facilitating use of eye tracking data |
US20100092929A1 (en) * | 2008-10-14 | 2010-04-15 | Ohio University | Cognitive and Linguistic Assessment Using Eye Tracking |
US20110298702A1 (en) * | 2009-12-14 | 2011-12-08 | Kotaro Sakata | User interface device and input method |
US20110242486A1 (en) * | 2010-03-30 | 2011-10-06 | Yoshinobu Ebisawa | Autism diagnosis support apparatus |
US20130080974A1 (en) | 2010-06-03 | 2013-03-28 | Nec Corporation | Region recommendation device, region recommendation method and recording medium |
US20120146891A1 (en) | 2010-12-08 | 2012-06-14 | Sony Computer Entertainment Inc. | Adaptive displays using gaze tracking |
US20130091515A1 (en) * | 2011-02-04 | 2013-04-11 | Kotaro Sakata | Degree of interest estimating device and degree of interest estimating method |
US20120317594A1 (en) * | 2011-04-21 | 2012-12-13 | Sony Mobile Communications Ab | Method and system for providing an improved audio experience for viewers of video |
USH2282H1 (en) * | 2011-11-23 | 2013-09-03 | The United States Of America, As Represented By The Secretary Of The Navy | Automatic eye tracking control |
US20130205314A1 (en) * | 2012-02-07 | 2013-08-08 | Arun Ramaswamy | Methods and apparatus to select media based on engagement levels |
US20140007148A1 (en) * | 2012-06-28 | 2014-01-02 | Joshua J. Ratliff | System and method for adaptive data processing |
US20140168056A1 (en) | 2012-12-19 | 2014-06-19 | Qualcomm Incorporated | Enabling augmented reality using eye gaze tracking |
US20150130705A1 (en) | 2013-11-12 | 2015-05-14 | Samsung Electronics Co., Ltd. | Method for determining location of content and an electronic device |
CN103645806A (en) | 2013-12-24 | 2014-03-19 | 惠州Tcl移动通信有限公司 | Commodity browse method and system based on eyeball tracking |
US20150245103A1 (en) * | 2014-02-24 | 2015-08-27 | HotdotTV, Inc. | Systems and methods for identifying, interacting with, and purchasing items of interest in a video |
US20150279418A1 (en) * | 2014-03-18 | 2015-10-01 | Vixs Systems, Inc. | Video system with fovea tracking and methods for use therewith |
US20160260143A1 (en) * | 2015-03-04 | 2016-09-08 | International Business Machines Corporation | Rapid cognitive mobile application review |
TW201636957A (en) | 2015-04-08 | 2016-10-16 | 神雲科技股份有限公司 | Vending machine |
TWM518370U (en) | 2015-09-09 | 2016-03-01 | Heran Co Ltd | Interactive mobile device shopping guidance system |
US20170124399A1 (en) * | 2015-10-29 | 2017-05-04 | International Business Machines Corporation | Computerized video file analysis tool and method |
US20170214951A1 (en) | 2016-01-26 | 2017-07-27 | Adobe Systems Incorporated | Determining Textual Content that is Responsible for Causing a Viewing Spike Within a Video in a Digital Medium Environment |
CN106920129A (en) | 2017-03-09 | 2017-07-04 | 山东师范大学 | A kind of network advertisement effect evaluation system and its method that tracking is moved based on eye |
US20180300096A1 (en) * | 2017-04-17 | 2018-10-18 | Intel Corporation | Regional Adjustment of Render Rate |
Also Published As
Publication number | Publication date |
---|---|
TWI642030B (en) | 2018-11-21 |
EP3441850A1 (en) | 2019-02-13 |
CN109388232B (en) | 2021-11-30 |
US20190052933A1 (en) | 2019-02-14 |
TW201911234A (en) | 2019-03-16 |
CN109388232A (en) | 2019-02-26 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
 | AS | Assignment | Owner name: ACER INCORPORATED, TAIWAN. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: HO, ANDY; YANG, TSUNG-HAN; WANG, SZU-CHIEH; AND OTHERS; REEL/FRAME: 045364/0323. Effective date: 20180217
 | STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED
 | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION
 | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
 | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
 | STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
 | STCF | Information on status: patent grant | PATENTED CASE
 | MAFP | Maintenance fee payment | PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4