CN104731316B - System and method for presenting information on a device based on eye tracking - Google Patents
System and method for presenting information on a device based on eye tracking
- Publication number
- CN104731316B (application no. CN201410534851.4A)
- Authority
- CN
- China
- Prior art keywords
- user
- information
- item
- threshold time
- presented
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Library & Information Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention provides systems and methods for presenting information on a device based on eye tracking. In one aspect, a device includes a display, a processor, and memory accessible to the processor. The memory bears instructions executable by the processor to: receive at least one signal from at least one camera in communication with the device; based at least in part on the signal, determine that a user of the device is looking at a portion of the display; and, in response to determining that the user is looking at the portion, present information associated with an item presented on the portion.
Description
Technical field
The present application relates generally to presenting information on a device using eye tracking.
Background technique
Currently, to have information related to, e.g., an icon or image presented on the device presenting it, a user typically must take a series of actions to cause the information to be presented. This is unintuitive and can be downright laborious.
Summary of the invention
Accordingly, in a first aspect, a device includes a display, a processor, and memory accessible to the processor. The memory bears instructions executable by the processor to: receive at least one signal from at least one camera in communication with the device; based at least in part on the signal, determine that a user of the device is looking at a portion of the display; and, in response to determining that the user is looking at the portion, present information associated with an item presented on the portion.
In another aspect, a method includes: receiving, at a device, data from a camera; based at least in part on the data, determining that a user of the device has looked at a particular region of the device's display for at least a threshold time; and, in response to determining that the user has looked at the region for the threshold time, presenting metadata associated with a feature presented on the region.
In still another aspect, an apparatus includes a first processor, a network adapter, and storage bearing instructions executable by a second processor to: present a first image on a display; receive at least one signal from at least one camera in communication with a device, the device being associated with the second processor; and, based at least in part on the signal, determine that a user of the device has looked at a portion of the first image for at least a threshold time. The instructions executable by the second processor further include: in response to determining that the user has looked at the portion for the threshold time, determining that the portion of the first image is an image of a person; extracting data about the person from the first image; executing a search for information about the person using at least part of the data; and presenting the information on at least a portion of the display. The first processor transfers the instructions to the device over a network via the network adapter.
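The first aspect above can be sketched in a few lines of code. This is a minimal illustration under assumed names (the region map, item table, and function names are all hypothetical, not part of the claimed device): a gaze point from a camera signal is mapped to a display region, and any item presented there has its associated information returned for presentation.

```python
def region_for_gaze(gaze_xy, regions):
    """Return the name of the display region containing the gaze point, or None."""
    x, y = gaze_xy
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def info_for_gaze(gaze_xy, regions, items_by_region):
    """Return the info associated with the item on the looked-at region, if any."""
    region = region_for_gaze(gaze_xy, regions)
    if region is None:
        return None
    item = items_by_region.get(region)
    return None if item is None else item.get("info")

# Example: two display regions, one holding a calendar-entry item.
regions = {"top_left": (0, 0, 400, 300), "top_right": (400, 0, 800, 300)}
items = {"top_left": {"name": "calendar_entry", "info": "Meeting at 3pm"}}
print(info_for_gaze((120, 80), regions, items))  # gaze lands on the calendar item
print(info_for_gaze((500, 80), regions, items))  # gaze lands on an empty region
```

A real implementation would obtain `gaze_xy` from the camera-driven eye tracker rather than as a literal coordinate; the region lookup itself is the same.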
The details of present principles, both as to their structure and operation, can best be understood in reference to the accompanying drawings, in which like reference characters refer to like parts, and in which:
Brief description of the drawings
Fig. 1 is a block diagram of an example system in accordance with present principles;
Figs. 2 and 3 are example flow charts of logic to be executed by the system in accordance with present principles;
Figs. 4 to 8 are example illustrations in accordance with present principles; and
Fig. 9 is an example settings user interface (UI) presentable on the system in accordance with present principles.
Detailed description
This disclosure relates generally to presenting user information on (e.g., consumer electronics (CE)) devices. With respect to any computer systems discussed herein, a system may include server and client components connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices, including televisions (e.g., smart TVs, Internet-enabled TVs), computers such as laptop and tablet computers, and other mobile devices including smart phones. As non-limiting examples, these client devices may employ operating systems from Apple, Google, or Microsoft. A Unix operating system may be used. These operating systems can run one or more browsers, such as a browser made by Microsoft, Google, or Mozilla, or another browser program that can access web applications hosted by Internet servers over a network such as the Internet, a local intranet, or a virtual private network.
As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware, or hardware; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their function.
A processor may be any conventional general-purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines, as well as registers and shift registers. Moreover, in addition to a general-purpose processor, any logical blocks, modules, and circuits described herein can be implemented or performed in or by a digital signal processor (DSP), a field programmable gate array (FPGA), or another programmable logic device such as an application-specific integrated circuit (ASIC) designed to perform the functions described herein, discrete gate or transistor logic, discrete hardware components, or any combination thereof. A processor can be implemented by a controller or state machine, or by a combination of computing devices.
Any software and/or applications described by way of flow charts and/or user interfaces herein may include various sub-routines, procedures, etc. It is to be understood that logic divulged as being executed by, e.g., a module can be redistributed to other software modules, and/or combined together in a single module, and/or made available in a shareable library.
Logic, when implemented in software, can be written in an appropriate language such as, but not limited to, C# or C++, and can be stored on, or transmitted through, a computer-readable storage medium (that, e.g., may not be a carrier wave) such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc. A connection may establish a computer-readable medium. Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and twisted pair wires. Such connections may include wireless communication connections including infrared and radio.
In an example, a processor can access information over its input lines from a data storage device (such as a computer-readable storage medium), and/or the processor can access information wirelessly from an Internet server by activating a wireless transceiver to send and receive data. Data typically is converted from analog signals to digital signals by circuitry between the antenna and the registers of the processor when being received, and from digital to analog when being transmitted. The processor then processes the data through its shift registers to output calculated data on output lines, for presentation of the calculated data on the device.
Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the figures may be combined with, interchanged with, or excluded from other embodiments.
" system at least one A, B and C " (similarly, " system at least one A, B or C " and " with A,
B, the system of at least one C ") it include the only system of A, the system of only B, the only system of C, the system with A and B, tool
There are the system, the system with B and C and/or the system with A, B and C etc. of A and C.
The term "circuit" or "circuitry" is used in the summary, description, and/or claims. As is well known in the art, the term "circuitry" includes all levels of available integration, e.g., from discrete logic circuits to the highest level of circuit integration such as VLSI, and includes programmable logic components programmed to perform the functions of an embodiment, as well as general-purpose or special-purpose processors programmed with instructions to perform those functions.
Referring now specifically to Fig. 1, an example block diagram of a computer system 100 is shown, such as, e.g., an Internet-enabled computerized telephone (e.g., a smart phone), a tablet computer, a notebook or desktop computer, an Internet-enabled computerized wearable device such as a smart watch, a computerized television (TV) such as a smart TV, etc. Thus, in some embodiments the system 100 may be a desktop computer system, such as one of the series of personal computers sold by Lenovo (US) Inc. of Morrisville, NC, or a workstation computer sold by Lenovo (US) Inc. of Morrisville, NC. However, as apparent from the description herein, a client device, a server, or another machine in accordance with present principles may include other features, or only some of the features, of the system 100.
As shown in Fig. 1, the system 100 includes a so-called chipset 110. A chipset refers to a group of integrated circuits, or chips, that are designed to work together. Chipsets are usually marketed as a single product (e.g., consider chipsets marketed under various brands).
In the example of Fig. 1, the chipset 110 has a particular architecture, which may vary to some extent depending on brand or manufacturer. The architecture of the chipset 110 includes a core and memory control group 120 and an I/O hub controller 150 that exchange information (e.g., data, signals, commands, etc.) via, for example, a direct management interface or direct media interface (DMI) 142 or a link controller 144. In the example of Fig. 1, the DMI 142 is a chip-to-chip interface (sometimes referred to as being a link between a "northbridge" and a "southbridge").
The core and memory control group 120 includes one or more processors 122 (e.g., single-core or multi-core, etc.) and a memory controller hub 126 that exchange information via a front side bus (FSB) 124. As described herein, various components of the core and memory control group 120 may be integrated onto, e.g., a single processor die, to make a chip that supplants the conventional "northbridge"-style architecture.
The memory controller hub 126 interfaces with memory 140. For example, the memory controller hub 126 may provide support for DDR SDRAM memory (e.g., DDR, DDR2, DDR3, etc.). In general, the memory 140 is a type of random-access memory (RAM). It is commonly referred to as "system memory."
The memory controller hub 126 further includes a low-voltage differential signaling (LVDS) interface 132. The LVDS 132 may be a so-called LVDS Display Interface (LDI) for support of a display device 192 (e.g., a CRT, a flat panel, a projector, a touch-enabled display, etc.). A block 138 includes some examples of technologies that may be supported via the LVDS interface 132 (e.g., serial digital video, HDMI/DVI, DisplayPort). The memory controller hub 126 also includes one or more PCI-express (PCI-E) interfaces 134, for example, for support of discrete graphics 136. Discrete graphics using a PCI-E interface has become an alternative approach to an accelerated graphics port (AGP). For example, the memory controller hub 126 may include a 16-lane (x16) PCI-E port for an external PCI-E-based graphics card (including, e.g., one or more GPUs). An example system may include AGP or PCI-E for support of graphics.
The I/O hub controller 150 includes a variety of interfaces. The example of Fig. 1 includes a SATA interface 151, one or more PCI-E interfaces 152 (optionally one or more legacy PCI interfaces), one or more USB interfaces 153, a LAN interface 154 (more generally, a network interface for communication over at least one network such as the Internet, a WAN, a LAN, etc. under direction of the processor(s) 122), a general-purpose I/O interface (GPIO) 155, a low-pin-count (LPC) interface 170, a power-management interface 161, a clock generator interface 162, an audio interface 163 (e.g., for the speakers 194 to output audio), a total cost of operation (TCO) interface 164, a system management bus interface (e.g., a multi-master serial computer bus interface) 165, and a serial peripheral flash memory/controller interface (SPI Flash) 166, which, in the example of Fig. 1, includes BIOS 168 and boot code 190. With respect to network connections, the I/O hub controller 150 may include integrated gigabit Ethernet controller lines multiplexed with a PCI-E interface port. Other network features may operate independent of a PCI-E interface.
The interfaces of the I/O hub controller 150 provide for communication with various devices, networks, etc. For example, the SATA interface 151 provides for reading, writing, or reading and writing information on one or more drives 180 (such as HDDs, SDDs, or a combination thereof), but in any case the drives 180 are understood to be, e.g., tangible computer-readable storage media that may not be carrier waves. The I/O hub controller 150 may also include an advanced host controller interface (AHCI) to support one or more drives 180. The PCI-E interface 152 allows for wireless connections 182 to devices, networks, etc. The USB interface 153 provides for input devices 184 such as keyboards (KB), mice, and various other devices (e.g., cameras, phones, storage devices, media players, etc.).
In the example of Fig. 1, the LPC interface 170 provides for use of one or more ASICs 171, a trusted platform module (TPM) 172, a super I/O 173, a firmware hub 174, BIOS support 175, and various types of memory 176 such as ROM 177, Flash 178, and non-volatile RAM (NVRAM) 179. With respect to the TPM 172, this module may be in the form of a chip that can be used to authenticate software and hardware devices. For example, a TPM may be capable of performing platform authentication and may be used to verify that a system seeking access is the expected system.
The system 100 may be configured to, upon power on, execute boot code 190 for the BIOS 168, as stored within the SPI Flash 166, and thereafter process data under the control of one or more operating systems and application software (e.g., stored in the system memory 140). An operating system may be stored in any of a variety of locations and accessed, e.g., according to instructions of the BIOS 168.
In addition, in some embodiments the system 100 may include one or more cameras 196 providing input to the processor 122. The camera 196 may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the system 100 and controllable by the processor 122 to gather pictures, images, and/or video in accordance with present principles (e.g., to gather one or more images of a user and/or to track the user's eye movement, etc.). Furthermore, the system 100 may include one or more motion sensors 197 providing input to the processor 122 in accordance with present principles (e.g., a gesture sensor for sensing gestures and/or gesture commands).
Before moving on to Fig. 2, and as described herein, it is to be understood that systems and devices in accordance with present principles may include fewer or more features than those shown on the system 100 of Fig. 1. In any case, it is to be understood, at least based on the foregoing, that the system 100 is configured to undertake present principles.
Referring now to Fig. 2, an example flow chart of logic to be executed by a device such as the system 100 is shown. Beginning at block 200, the logic presents, on the display of the device undertaking the logic of Fig. 2, at least one item (e.g., a file, a calendar entry, a scrolling news feed, a contact from the user's contacts list, etc.), icon (e.g., a shortcut icon for launching a software application), feature (e.g., a software feature), element (e.g., a selector element), tile (e.g., in a tiled environment), image (e.g., a photograph), etc. For brevity, items, icons, features, elements, images, etc. will be referred to below as "items, etc." The logic then proceeds to block 202, where the logic receives, e.g., from at least one camera in communication with the device, at least one signal and/or image data pertaining to the user (e.g., movement of the user's face and/or eyes). The logic then proceeds to decision diamond 204, where the logic determines whether the user is looking at the portion and/or region of the display including the item, etc. (e.g., within a threshold distance of the object on the display) for at least a first threshold time (e.g., without also providing other input by manipulating a keyboard, a mouse, etc. in communication with the device). Note that in some embodiments, at diamond 204 the logic may determine not only that the user is looking at the portion and/or region, but also that the user is looking specifically at the item, or at least proximate to the item.
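The diamond-204 determination can be sketched as a dwell-time check over gaze samples. This is a minimal illustration, not the claimed logic: the sample format, threshold values, and function name are assumptions, and the determination is affirmative only once the gaze has remained within a threshold distance of the item for at least the first threshold time, resetting whenever the gaze diverts.

```python
import math

def dwell_reached(samples, item_xy, threshold_dist, threshold_time):
    """samples: time-ordered list of (timestamp_sec, (x, y)) gaze fixes.
    Return True if the most recent unbroken run of samples within
    threshold_dist of item_xy spans at least threshold_time seconds."""
    run_start = None
    last_t = None
    for t, (x, y) in samples:
        if math.hypot(x - item_xy[0], y - item_xy[1]) <= threshold_dist:
            if run_start is None:
                run_start = t
            last_t = t
        else:
            run_start, last_t = None, None  # gaze diverted: reset the run
    return run_start is not None and (last_t - run_start) >= threshold_time

icon_center = (100, 100)
steady = [(0.0, (98, 101)), (0.5, (102, 99)), (1.1, (100, 103))]
diverted = [(0.0, (98, 101)), (0.5, (300, 40)), (1.1, (100, 103))]
print(dwell_reached(steady, icon_center, 20, 1.0))    # True: ~1.1 s on the icon
print(dwell_reached(diverted, icon_center, 20, 1.0))  # False: run reset at 0.5 s
```

The reset-on-divert behavior corresponds to the "continuously looking" reading of the threshold; an embodiment that tolerates brief diversions would instead accumulate total dwell.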
In any case, a negative determination at diamond 204 causes the logic to revert back to block 202 and proceed therefrom. However, an affirmative determination at diamond 204 causes the logic to proceed to block 206, where the logic locates and/or accesses first information associated with the item, etc. The first information may be, e.g., metadata stored locally on a storage medium of the device undertaking the logic of Fig. 2; information gathered over the Internet by accessing a website associated with the item, etc. (e.g., the website of the company providing the software associated with an icon presented on the display); and/or information about the item provided by the user and/or input to the device by the user prior to undertaking the logic of Fig. 2.
After block 206, the logic proceeds to block 208, where the logic, responsive to the determination that the user is looking specifically at the portion and/or item, etc., presents the first information to the user. The first information may be presented, e.g., audibly (over speakers on and/or in communication with the device) and/or visually (e.g., on the display of the device). Furthermore, in some embodiments the first information may be presented on the item, etc. itself, while in other embodiments the first information may be presented on a portion of the display other than where the item, etc. is presented. In still other embodiments, the first information may be presented, e.g., both on at least part of where the item, etc. is presented and on other portions of the display. Note further that the first information may be presented in an overlay window and/or pop-up window.
Still in reference to block 208, note also that in instances where the item, etc. is, e.g., a shortcut icon for which input has been detected based on the user's eyes looking at it, the logic may decline to launch the software application associated with the icon at which the user is looking (and/or, e.g., if the application has already been launched, decline to execute another function of it). In this example, the logic may thus determine that, while the user is looking at the icon and thereby providing input to the device regarding the icon, the underlying software application associated therewith is not to be launched. Nonetheless, the logic may gather metadata associated with the icon and present it in a pop-up window over the icon being looked at.
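The block-208 behavior for a shortcut icon can be illustrated as follows. This is a sketch under stated assumptions: the `Icon` class, the event list, and the handler name are illustrative stand-ins, not a device API. The point it demonstrates is that gaze input yields a metadata pop-up while deliberately not invoking the icon's launcher.

```python
class Icon:
    def __init__(self, name, metadata, launcher):
        self.name = name
        self.metadata = metadata
        self.launcher = launcher  # callable that would start the application

def handle_gaze_on_icon(icon, events):
    """Gaze-selecting an icon presents its metadata in a pop-up window;
    the associated application is intentionally NOT launched."""
    events.append(("popup", icon.name, icon.metadata))
    # icon.launcher is deliberately not called for gaze input.
    return events

events = []
mail = Icon("mail", {"version": "2.1", "unread": 4},
            launcher=lambda: events.append(("launched", "mail")))
handle_gaze_on_icon(mail, events)
print(events)  # only the pop-up event; no "launched" entry
```

Contrast with the block-216 behavior described below, where a recognized gesture may trigger the launch that gaze alone does not.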
Still in reference to Fig. 2, after block 208 the logic proceeds to decision diamond 210. At diamond 210, the logic determines whether the user has looked at the portion (e.g., has continued looking without diverting their eyes to another portion of the display since the affirmative determination at diamond 204) and/or specifically at the item, etc. (e.g., within a threshold distance of the object on the display) for at least a second threshold time (e.g., without also providing other input by manipulating a keyboard, a mouse, etc. in communication with the device). In some embodiments, the second threshold time may have the same length as the first threshold time, while in other embodiments it may have a different length. Furthermore, the second threshold time may run, e.g., from when the logic determined that the user initially looked at the item, etc. (even if that is earlier than the expiration of the first threshold time), the logic thereby determining whether the user has looked at the item, etc. for the second threshold time. However, in other embodiments, the second threshold time may run from when the logic determined at diamond 204 that the user looked at least substantially at the item for the first threshold time.
Still in reference to diamond 210, an affirmative determination there causes the logic to proceed to block 212, which will be described shortly. However, a negative determination at diamond 210 causes the logic to proceed to decision diamond 214. At diamond 214, the logic determines, based on input, e.g., from the camera and/or from a motion sensor (such as the sensor 197), whether the user has made a (e.g., predefined) gesture recognizable, discernible, and/or detectable by the device.
A negative determination at diamond 214 causes the logic to proceed to decision diamond 218, which will be described shortly. However, an affirmative determination at diamond 214 causes the logic to proceed to block 212. At block 212, the logic locates and/or accesses second information associated with the item, etc. The second information may be, e.g., other metadata stored locally on a storage medium of the device undertaking the logic of Fig. 2; additional information gathered over the Internet by accessing a website associated with the item, etc.; and/or other information about the item provided to the device by the user prior to undertaking the logic of Fig. 2. It is thus to be understood that the second information may be different from the first information, and/or the second information may include at least some of the first information along with other information.
From block 212, the logic proceeds to block 216, where the logic, responsive to the determination that the user has looked specifically at the portion and/or item, etc. for the second threshold time, presents the second information to the user. The second information may be presented, e.g., audibly (over speakers on and/or in communication with the device) and/or visually (e.g., on the display of the device). Furthermore, in some embodiments the second information may be presented on the item, etc. itself, while in other embodiments it may be presented on a portion of the display other than where the item, etc. is presented. In still other embodiments, the second information may be presented, e.g., both on at least part of where the item, etc. is presented and on other portions. Note further that the second information may be presented in an overlay window and/or pop-up window.
Still in reference to block 216, note also that, e.g., when a gesture made by the user has been detected as determined at diamond 214 and the gesture is associated with launching a software application and/or with the software application being looked at specifically, the logic may launch the software application associated with the item, etc. at which the user is looking (and/or, e.g., if the application has already been launched, execute another function of it).
After block 216, the logic proceeds to decision diamond 218. At diamond 218, the logic determines whether a third threshold time has been reached and/or has elapsed, where the third threshold time establishes when the first information and/or second information should be removed. In some embodiments, the third threshold time may have the same length as the first and second threshold times, while in other embodiments it may differ in length from one or both of them. Furthermore, the third threshold time may run from when the logic determined that the user initially looked at the item, etc. (even if earlier than the expiration of the first threshold time and/or second threshold time), the logic thereby determining whether the user has looked at the item, etc. for the third threshold time. However, in other embodiments, the third threshold time may run from when the logic determined at diamond 204 that the user looked at least substantially at the item, etc. for the first threshold time, and/or from when the logic determined at diamond 210 that the user looked at least substantially at the item, etc. for the second threshold time.
In any case, a negative determination at diamond 218 causes the logic to continue making the determination thereat until such time as an affirmative determination is made. Upon an affirmative determination at diamond 218, the logic proceeds to block 220. At block 220, the logic removes the first information and/or second information from the display (if presented thereon) and/or ceases presenting it audibly.
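The overall Fig. 2 flow can be condensed into a small state function. This is a compact sketch under assumed semantics, with all threshold values illustrative and all thresholds measured from initial viewing (one of the embodiments described above): the first threshold triggers the first information, a longer dwell (or a recognized gesture, per diamond 214) triggers the second information, and the third threshold removes whatever is shown.

```python
def fig2_state(dwell_sec, gesture=False, t1=1.0, t2=2.0, t3=5.0):
    """Map a continuous dwell time on an item to what the device presents."""
    if dwell_sec >= t3:
        return "removed"           # block 220: info taken off the display
    if dwell_sec >= t2 or (gesture and dwell_sec >= t1):
        return "second_info"       # block 216, via diamond 210 or diamond 214
    if dwell_sec >= t1:
        return "first_info"        # block 208
    return "nothing"               # diamond 204 not yet satisfied

assert fig2_state(0.5) == "nothing"
assert fig2_state(1.2) == "first_info"
assert fig2_state(1.2, gesture=True) == "second_info"   # gesture path
assert fig2_state(3.0) == "second_info"                 # long-dwell path
assert fig2_state(6.0) == "removed"
print("fig2_state sketch OK")
```

Embodiments that restart the second or third threshold at the preceding affirmative determination would subtract the earlier thresholds before comparing; the branch structure is unchanged.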
Continuing the detailed description with reference to Fig. 3, it shows logic that may be used in conjunction with and/or incorporated into the logic of Fig. 2, and/or used independently. Regardless, at block 222 the device undertaking the logic of Fig. 3 presents an image on its display in accordance with present principles. Then, at block 224, the logic receives, from at least one camera in communication with the device, at least one signal and/or data related to, e.g., the user's eye movement and/or the direction of the user's gaze toward the image. The logic then proceeds to decision diamond 226, where the logic, based at least in part on the signal, determines whether the user has looked at a particular and/or specific portion of the first image for at least a threshold time (e.g., continuously, such that the user's eyes have not been diverted to another portion of the image and/or elsewhere). A negative determination at diamond 226 causes the logic to continue making the determination thereat until an affirmative determination is made.
Once an affirmative determination is made at diamond 226, the logic proceeds to decision diamond 228. At diamond 228, the logic determines whether an image of a person is part of the image, and may even determine, e.g., whether that part specifically includes an image of a face. A negative determination at diamond 228 returns the logic to block 224 to proceed therefrom. An affirmative determination at diamond 228 moves the logic to block 230, at which the logic extracts data about the person from that part of the image (e.g., object extraction of the image within the image itself). Also at block 230, the logic may, e.g., highlight or gray out the portion of the image being looked at, both to indicate to the user that the device has detected that the user's eye attention is directed there and that the device is accordingly processing to obtain information on the content shown in that portion. The logic then proceeds to block 232, at which the logic, using at least a portion of the data extracted at block 230, locally executes at the device a search for information about the person, e.g., by searching information about people stored on a computer-readable storage medium of the device. For example, the user's contacts list may be accessed to search, using facial recognition, for an image in the contacts list matching the image of the person at whom the user's attention was directed for at least the threshold time, thereby identifying the person in the contacts list and providing information about that person. Note that, nonetheless, local information and information obtained from a remote source may be used together, such as searching the user's contacts list and/or searching a social networking account using locally stored login information of the user to determine a friend of the user whose face matches the extracted data.
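The contacts-list matching at block 232 can be sketched as a nearest-neighbor comparison of face embeddings. This is an assumed implementation for illustration only: the patent does not specify a matching algorithm, and the embedding vectors, the distance metric, and the `0.6` cutoff are all hypothetical.

```python
import math

def face_distance(a, b):
    """Euclidean distance between two face-embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_contact(face_embedding, contacts, max_distance=0.6):
    """Return the contact whose stored embedding is closest to the
    face extracted at block 230, or None if nothing is close enough
    (i.e., the negative branch at diamond 234)."""
    best = None
    best_d = max_distance
    for contact in contacts:
        d = face_distance(face_embedding, contact["embedding"])
        if d < best_d:
            best, best_d = contact, d
    return best
```

Returning `None` when no contact falls within the cutoff corresponds to the negative determination that sends the logic on to the Internet search at block 236.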
From block 232 the logic proceeds to decision diamond 234, at which the logic determines, based on the local search, whether at least some information about the person has been located. An affirmative determination at diamond 234 moves the logic to block 242, described shortly. However, a negative determination at diamond 234 moves the logic to block 236.
At block 236, the logic performs an Internet search for information about the person using at least a portion of the data extracted at block 230, by using a search engine (such as an Internet search engine and/or an image-based facial recognition search engine). The logic then proceeds to decision diamond 238. At diamond 238, the logic determines, based on, e.g., the Internet search, whether at least some information about the person from the part of the image has been located. An affirmative determination at diamond 238 moves the logic to block 242, at which the logic presents at least a portion of the located information. However, a negative determination at diamond 238 moves the logic to block 240, at which the logic may, e.g., indicate audibly and/or on a display of the device that no information could be located for the person in the image portion that was looked at for the threshold time.
Before moving on to Fig. 4, it is to be understood that although in the example logic shown in Fig. 3 the Internet search is executed responsive to a determination that locally stored information could not be located at the device, in some embodiments both a local search and an Internet search may be executed, e.g., without regard to whether information has been located from one source before searching the other. Thus, in some embodiments, at block 242 the logic may present information from both searches.
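The two search flows just described (local-first with an Internet fallback, or both sources combined) can be sketched as one small dispatch function. The callables passed in stand for the block 232 and block 236 searches and are assumed placeholders, not an API from the patent.

```python
def find_person_info(extracted, local_search, internet_search, merge=False):
    """Search flow of blocks 232-242: try the local store first and fall
    back to an Internet search; with merge=True both sources are queried
    and combined, as the alternative embodiment permits."""
    local = local_search(extracted)
    if merge:
        remote = internet_search(extracted)
        return (local or []) + (remote or []) or None
    if local:
        return local
    return internet_search(extracted) or None
```

A `None` result corresponds to block 240, where the device reports that no information could be located.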
Referring now to Fig. 4, an example illustration 250 of a presentation of items on a display, etc. and information related thereto in accordance with present principles is shown. It is to be understood that what is shown in Fig. 4 may be presented, e.g., when a device in accordance with present principles detects that a user has looked at a specific item for, e.g., at least one threshold time as described herein. The illustration 250 shows a display and/or a user interface (UI) presentable thereon that includes plural contact entries 252 for contacts of the user that are accessible to the device and/or stored on the device. The contact entries 252 thus include a contact entry 254 for a specific person, it being understood that the entry 254 is an item detected, at least in part using a camera of the device, as having been looked at by the user for a threshold time. An overlay window 256 is thus presented responsive to the user looking at the contact entry 254 for the threshold time, and includes at least some information and/or metadata in the window 256 in addition to whatever information is already presented for the contact entry 254.
Referring now to Fig. 5, an example illustration 258 is shown of a person 260 making a thumbs-up gesture in free space, which may be detected by a device 264 (e.g., the system 100). As shown in the illustration 258, in accordance with present principles, information 266 (e.g., in some embodiments, the second information in accordance with Fig. 2) is presented on a display 268 of the device 264 (e.g., responsive to the user continuing to look at the item 254 for a threshold time). Thus, it is to be understood that, e.g., what is shown in Fig. 4 may be presented responsive to reaching the first threshold discussed in reference to Fig. 2, while what is shown in Fig. 5 may be presented responsive to reaching the second threshold discussed in reference to Fig. 2.
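The tiered behavior just described (first information after the first threshold, second information after the second) can be sketched as a simple mapping from dwell time to level of detail. The threshold defaults below are taken from the five- and ten-second examples given for Fig. 9 later in the text; the function itself is an illustrative assumption.

```python
def tier_for_dwell(dwell_seconds, first_threshold=5.0, second_threshold=10.0):
    """Map how long the user has looked at an item to the level of
    detail presented: nothing, first (brief) information, or second
    (expanded) information."""
    if dwell_seconds >= second_threshold:
        return "second"
    if dwell_seconds >= first_threshold:
        return "first"
    return "none"
```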
Continuing the detailed description in reference to Fig. 6, it also shows an example illustration 270, this one related to a presentation of audio video (AV) items on a display, etc. and information related thereto in accordance with present principles. It is to be understood that, e.g., what is shown in Fig. 6 may be presented when a device in accordance with present principles detects that a user has looked at a specific item, etc. for at least one threshold time as described herein.
The illustration 270 shows a display and/or a user interface (UI) presentable thereon that includes plural AV items 272 for AV content, video content, and/or audio content accessible to the device and/or stored on the device. It is thus to be understood that, in some embodiments, the UI shown may be an electronic program guide. Regardless, the items 272 may include a movie item 274 for a particular movie, it being understood that the item 274 is an item detected, at least in part using a camera of the device, as having been looked at by the user for a threshold time. Thus, in accordance with present principles, an overlay window 276 is presented responsive to the user looking at the item 274 for the threshold time, and includes at least some information and/or metadata in the window 276. As shown, the window 276 includes the title of the movie along with its release information, an audience rating, a plot summary, and a list of people involved in its production.
Turning to Fig. 7, it also shows an example illustration 278 of an image 282 being looked at by a person 280 on a device 283 (e.g., the system 100 described above) in accordance with present principles. It is to be understood that what is shown in Fig. 7 may be presented, e.g., when the device detects that the user has looked at a portion of the image 282 (in this case, the face of, e.g., Brett Favre as represented in the image) for at least one threshold time as described herein. In accordance with present principles, an overlay window 284 is presented responsive to the user looking at at least a portion of the face of Brett Favre for the threshold time, and includes at least some information and/or metadata in the window 284 related (e.g., generally) to Brett Favre. As shown, the window 284 includes indications of the following: what Brett Favre does for a living (e.g., playing football), his full birth name and information about his football career, as well as his birthday, height, spouse, education and/or schooling, and his children. It is to be understood that the information shown in the window 284 may be gathered, e.g., by extracting data from the portion of the image including Brett Favre's face and then using the data with an image-based Internet search engine to search for Internet-accessible information related to Brett Favre.
Referring now to Fig. 8, another example illustration 286 is shown of a person 288 looking at an image 290 on a device 292 (e.g., the system 100 described above) in accordance with present principles. It is to be understood that what is shown in Fig. 8 may be presented, e.g., when the device detects that the user has looked at a portion of the image 290 (in this case, a specific person in a group photo) for at least one threshold time as described herein. In accordance with present principles, an overlay window 294 is presented responsive to the user looking at at least a portion of the specific person for the threshold time, and includes at least some information and/or metadata related to that person in the window 294. As shown, the window 294 includes indications of the following: what corporate department the person works in, what office they work in, what their contact information is, and their calendar indicating what they are doing now and/or will be doing in the near future.
Continuing the detailed description in reference to Fig. 9, it shows an example settings user interface (UI) presentable on a device (e.g., the system 100) in accordance with present principles for configuring settings associated with detecting the eye gaze of a user and presenting information responsive thereto as set forth herein. The UI 300 includes a first setting 302 for a user to provide input (e.g., using the radio buttons shown) to select one or more types of items for which information is to be presented after, e.g., the item has been looked at for a threshold time (e.g., rather than always and everywhere presenting information whenever the user looks at a portion of the display, which could be distracting when, e.g., watching a full-length movie on the device). A second setting 304 is thus also shown for configuring the device such that, in some instances, information is specifically not presented even when, e.g., the user's gaze can be detected as looking at a portion/item for a threshold time as set forth herein.
Another setting 305 is shown for a user to define the length of the first threshold time described herein, along with an input box and a time-unit box for inputting a desired specific time (e.g., in this example, five seconds). Note that seconds need not be the only time unit enterable by the user; the unit may also be, e.g., minutes or hours. Regardless, a setting 306 is shown for a user to define the length of the second threshold time described herein, along with an input box and a time-unit box for inputting a desired specific time (e.g., in this example, ten seconds). Yet another setting 308 is shown for a user to define the length of the third threshold time described herein for removing information that may have been presented, along with an input box and a time-unit box for inputting a desired specific time (e.g., in this example, twenty-five seconds).
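The user-configurable values of Fig. 9 can be collected into one settings structure, sketched below. The field names are invented for the example; the default values mirror the specific numbers given in the text (5 s / 10 s / 25 s, and the 200- and 800-character limits discussed next).

```python
from dataclasses import dataclass

@dataclass
class GazeSettings:
    """Container for the Fig. 9 settings."""
    first_threshold_s: float = 5.0      # setting 305
    second_threshold_s: float = 10.0    # setting 306
    removal_threshold_s: float = 25.0   # setting 308
    first_info_chars: int = 200         # setting 310
    second_info_chars: int = 800        # setting 312
    audible: bool = False               # setting 314

    def truncate_first(self, text: str) -> str:
        """Apply the character limit of setting 310 to the first information."""
        return text[: self.first_info_chars]
```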
The settings UI 300 may also include a setting 310 for a user to provide input to limit the amount of the first information presented responsive to the user looking at an item for the first threshold time as described above (e.g., in reference to Fig. 2); in this case, as shown, two hundred characters have been entered into an input box. A setting 312 is shown for a user to provide input on whether to limit, if desired, the amount of the second information presented responsive to the user looking at an item for the second threshold time as described above (e.g., in reference to Fig. 2). Thus, a yes selector element and a no selector element are shown for the setting 312, which are selectable to respectively configure the device to limit, or not limit, the amount of the second information presented. An input box is also shown for the setting 312 for limiting the second information to a certain number of characters, in this case, e.g., eight hundred characters.
In addition to the foregoing, the UI 300 includes a setting 314 for configuring the device to audibly present, or not audibly present, the first information and/or the second information based on a corresponding selection of the yes selector element or no selector element shown for the setting 314. Note that although only one setting is shown for audibly presenting information, separate settings may be configured for the first information and the second information (e.g., not audibly presenting the first information but audibly presenting the second information).
A setting 316 is also shown for determining, based on a corresponding selection of the yes selector element or no selector element shown for the setting 316, whether to launch an application that may be associated with the item being looked at when the second threshold time described herein expires. Another setting 318 is shown for configuring the device to receive, identify, and/or associate one or more predefined gestures for the purposes disclosed herein. Thus, a define selector element 320 is shown, which is selectable to, e.g., define one or more gestures input to the device in accordance with user preference (e.g., by presenting a series of configuration prompts for configuring the device to identify a gesture input for present purposes).
Without reference to any particular figure, it is to be understood that a cursor may be presented on screen in accordance with present principles. For example, since the device tracks the user's eyes as the user's attention traverses various portions of the display, the device may move a cursor (e.g., one that is also manipulable by manipulating a mouse in communication with the device) to a position corresponding to the position of the user's attention at any particular moment. Nonetheless, the cursor may also "skip" or "jump" from one place to another based on where the user's attention is directed. For example, if the user looks at the upper right corner of the display screen while the cursor is in the lower left corner, the cursor may remain there until, e.g., the first threshold time described above is reached, at which point the cursor may automatically, without further user input, cease appearing in the lower left corner and instead appear in the upper right corner at, or at least proximate to, the place at which the user's attention is directed.
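The cursor "jump" behavior just described can be sketched as a small update rule: the cursor stays put until the gaze has rested elsewhere for the first threshold time, then snaps to the attended location. The five-second default is the example value from Fig. 9; everything else is an illustrative assumption.

```python
def update_cursor(cursor_pos, gaze_pos, gaze_dwell_s, first_threshold_s=5.0):
    """Return the new cursor position given where the gaze currently
    rests and how long it has rested there."""
    if gaze_dwell_s >= first_threshold_s:
        return gaze_pos   # "skip"/"jump" to the attended location
    return cursor_pos     # not yet: cursor remains where it was
```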
Also without reference to any particular figure, it is to be understood that in some embodiments the first information described above in reference to Fig. 2 may be, e.g., the same type of information presentable responsive to, e.g., a mouse right-click on whatever item and/or a cursor hover over the item. It is to be further understood that in some embodiments the second information described above in reference to Fig. 2 may be, e.g., the same type of information presentable responsive to, e.g., a mouse left-click on whatever item.
It is to be also noted that although time thresholds have been described above for the determinations of whether to present the first information, the second information, and/or image information, other ways of making such determinations may be used in accordance with present principles. For example, eye tracking software may be used in accordance with present principles to make such determinations based on eye motion, including acceleration toward or away from an object above or below an acceleration threshold, deceleration toward an object above or below an acceleration threshold, jitter identification and thresholds, and speed and/or rate identification and thresholds.
Furthermore, present principles recognize that a user's attention directed at a specific item, etc. need not remain completely motionless for the entire time until the first and second thresholds are reached. In such instances, the determinations made at, e.g., decision diamonds 204, 210, and 226 may be, e.g., determinations that the user's eyes have moved less than a threshold amount and/or a threshold distance during the corresponding threshold time (e.g., from the initial eye position directed at the item, etc.).
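That drift-tolerant determination can be sketched as follows: the gaze counts as held on the item as long as every sample stays within a threshold distance of the initial position for the threshold time. The `(x, y, t)` sample tuples and the pixel-based drift bound are assumptions for the example.

```python
import math

def gaze_held(samples, threshold_time, max_drift):
    """Determination in the style of diamonds 204/210/226 that tolerates
    small eye movements while the threshold time elapses."""
    if not samples:
        return False
    x0, y0, t0 = samples[0]
    for x, y, t in samples:
        if math.hypot(x - x0, y - y0) > max_drift:
            return False                 # moved more than the threshold distance
        if t - t0 >= threshold_time:
            return True                  # held long enough, within tolerance
    return False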
Thus, in some embodiments, movement-oriented eye data may be used to determine eye movement and/or position values, which may then be compared to plural thresholds to interpret the user's intent (e.g., whether the user is continuing to look at an item on the display or has turned their attention elsewhere on the display). For example, when the user's eye movement exceeds an acceleration threshold and also exceeds a jitter (also referred to as shake) threshold, it may be determined that the user's eye movement indicates a distracted movement turning the attention away from the object being looked at. Also, in some embodiments, the movement values and/or position values may be compared to plural (e.g., user) profiles to interpret the user's intent. For example, where a rate value matches a bell curve, the user's eye movement may be interpreted as a short-distance movement, thereby determining that the user still intends to look at the particular on-screen object that was being looked at before the eye movement. In some embodiments, the movement values and/or position values may be compared to both thresholds and profiles to interpret the user's intent. For example, where a rate value matches a bell curve and an acceleration value exceeds a threshold, the user's movement may be interpreted as a long-distance movement (e.g., away from the item being looked at).
Furthermore, a device in accordance with present principles may limit the amount of biometric data values to a predetermined "window" size, where the window size corresponds to the user's reaction time. Using a window size greater than the user's reaction time can improve reliability, since it ensures that a detected movement is a conscious movement (i.e., a reaction of turning attention away from the object being looked at) rather than, e.g., an artifact, a false positive caused by noise, or an involuntary movement made while the user still intends to look at the object (e.g., for the threshold time).
It is to be further understood that a device in accordance with present principles may determine movement (e.g., acceleration) values from the movement-oriented eye data. For example, where the eye data includes position values and time values, the device may obtain acceleration values corresponding to the time values. In some embodiments, the device may determine position values, rate values, and/or jitter values from the eye data. The device may include circuitry for computing integrals and/or derivatives to obtain the movement values from the eye data. For example, the device may include circuitry for computing the second derivative of position data.
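The derivative computation just described (velocity as the first derivative of position, acceleration as the second) can be sketched with finite differences over uniformly sampled 1-D gaze positions. This is a plain numerical illustration, not the circuitry of the patent.

```python
def derivatives(positions, dt):
    """Finite-difference velocity and acceleration from uniformly
    sampled positions; acceleration is the second derivative of
    position."""
    velocity = [(positions[i + 1] - positions[i]) / dt
                for i in range(len(positions) - 1)]
    acceleration = [(velocity[i + 1] - velocity[i]) / dt
                    for i in range(len(velocity) - 1)]
    return velocity, acceleration
```

Sampling `x = t^2` at unit intervals, for instance, yields a constant second difference of 2, matching the analytic second derivative.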
The device may thus interpret the user's movement intent based on the determined movement values. For example, the device may determine whether the user intends to perform a short-distance movement (e.g., to keep looking at the same item presented on the display as before) or a long-distance movement (e.g., to turn their gaze away from the item presented on the display). In some embodiments, acceleration values, rate values, position values, and/or jitter values may be compared to thresholds and/or profiles to interpret the user's intent. For example, the device may determine that the user intends a short-distance movement where a rate value matches a bell curve distribution. In some embodiments, the movement values (e.g., acceleration values, rate values, position values, and/or jitter values) may be compared to a combination of thresholds and profiles to interpret the user's intent. For example, where a velocity value matches a bell curve and an acceleration value exceeds a threshold, the user's movement may be interpreted as a long-distance movement (e.g., away from the object being looked at).
It is thus to be understood that, in some embodiments, the device may store one or more position profiles for classifying the user's movements. For example, the device may store a position profile corresponding to a short-distance movement within the display of the device. Furthermore, the movement values may be (e.g., initially) examined in accordance with present principles based on determining whether one or more triggers are satisfied. A trigger may be based on, e.g., position, rate, and/or acceleration, and indicates to the device that a movement requiring interpretation has occurred (e.g., whether a detected eye movement indicates that the user is turning their gaze away from the item being looked at, or is continuing to look at it even though the eyes have moved). Once a trigger is satisfied, the movement values may be interpreted to determine the user's intent.
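The trigger-then-interpret scheme above can be sketched as a two-stage classifier: below the rate trigger nothing needs interpreting; once triggered, a high acceleration is read as a long-distance (look-away) movement and anything else as a short-distance one. The numeric trigger and threshold values are purely illustrative assumptions.

```python
def classify_movement(rate, acceleration,
                      rate_trigger=30.0, accel_threshold=800.0):
    """Two-stage interpretation of an eye movement sample."""
    if rate < rate_trigger:
        return "no-trigger"        # eyes essentially still on the item
    if acceleration > accel_threshold:
        return "long"              # gaze turning away from the item
    return "short"                 # small corrective movement
```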
Before concluding, it is also to be noted that although, e.g., Fig. 3 and some of the illustrations described herein pertain to determining whether a person is in a specific region of an image, the same principles and/or determinations and other logic steps apply, mutatis mutandis, to objects other than people and/or faces in a specific part of an image. For example, responsive to the device determining that a user is looking at a specific region of an image, the logic may determine that the user is looking at a particular object included therein, extract data about the object, and execute a search using the extracted data to return information about the object.
Based on present principles, it may now be appreciated that eye tracking software may be used to detect an item of interest to a user and thereby provide information about the item or essential features associated therewith. For example, a user focusing on a particular date on a calendar may cause details about that date (such as, e.g., birthdays, anniversaries, and appointments annotated on the calendar) to be presented. As another example, looking at a file or photo for a threshold time may cause additional details about the item (such as, e.g., the photo's data and/or location, settings, etc.) to be presented. As still another example, looking at a scrolling tile or news feed for a threshold time may cause more details about the article or news item to be presented, including, e.g., excerpts from the article itself.
Present principles further recognize that, e.g., the logic steps set forth above may be employed on both touch-enabled devices and non-touch-enabled devices.
Present principles also recognize that although, e.g., a software application for undertaking present principles may be sold with a device (such as the system 100), present principles apply in instances where such an application is, e.g., downloaded to the device from a server over a network such as the Internet.
While the particular "Systems and Methods to Present Information on Device Based on Eye Tracking" is herein shown and described in detail, it is to be understood that the subject matter encompassed by the present application is limited only by the claims.
Claims (15)
1. A device, comprising:
a display;
a processor; and
memory accessible to the processor and bearing instructions executable by the processor to:
receive at least one signal from at least one camera in communication with the device;
determine, based at least in part on the signal, that a user of the device is looking at a portion of the display; and
responsive to the determination that the user is looking at the portion, present information associated with an item presented on the portion;
wherein the information is presented responsive to a determination that the user has at least substantially looked at the item for a threshold time;
wherein the information is first information and the threshold time is a first threshold time, and wherein the instructions are further executable by the processor to:
determine whether the user has at least substantially looked at the item for a second threshold time;
responsive to a determination that the second threshold time has not been reached, determine whether the user is making a predetermined gesture;
responsive to a determination that the user is making the predetermined gesture, present second information associated with the item, the second information being different from the first information; and
wherein, if the predetermined gesture is associated with launching a software application and/or with a software application being looked at by the user, launch, in response, the software application associated with the item being looked at by the user.
2. The device of claim 1, wherein the instructions are further executable by the processor to:
determine that the user has at least substantially looked at the item for a second threshold time; and
responsive to the determination that the user has at least substantially looked at the item for the second threshold time, present second information associated with the item, the second information being different from the first information.
3. The device of claim 1, wherein the instructions are further executable by the processor to:
determine that the user has at least substantially looked at the item for a second threshold time; and
responsive to the determination that the user has at least substantially looked at the item for the second threshold time, present second information associated with the item, the second information comprising the first information and additional information associated with the item.
4. The device of claim 1, wherein the instructions are further executable by the processor to:
determine that the user has at least substantially looked at the item for a second threshold time, the determination that the user has at least substantially looked at the item for the second threshold time being made after the determination that the user has looked at the portion for the first threshold time, the length of the second threshold time being different from the length of the first threshold time; and
responsive to the determination that the user has at least substantially looked at the item for the second threshold time, present second information associated with the item.
5. The device of claim 4, wherein the second threshold time begins when the processor determines that the user has initially at least substantially looked at the item.
6. The device of claim 4, wherein the second threshold time begins when the processor determines that the user has at least substantially looked at the item for the first threshold time.
7. The device of claim 1, wherein the portion is a first portion and the information is presented on the display, and wherein the information is presented on a second portion of the display not including the first portion.
8. The device of claim 7, wherein the information is presented in a window on the second portion.
9. The device of claim 7, wherein the information is presented responsive to a determination that the user has at least substantially looked at the item for the first threshold time, and wherein the instructions are further executable by the processor to remove the information from the second portion of the display after a second threshold time.
10. The device of claim 1, wherein the information is audibly presented to the user at least through a speaker in communication with the device.
11. The device of claim 1, wherein the information is presented without launching a software application associated with the item.
12. The device of claim 1, wherein the information is presented on the portion without user input other than the user looking at the portion.
13. A method, comprising:
receiving data from a camera at a device;
determining, based at least in part on the data, that a user of the device is looking at a specific area of a display of the device for at least a threshold time; and
responsive to the determination that the user is looking at the area for the threshold time, presenting metadata associated with a feature presented on the area;
wherein the metadata is first metadata and the threshold time is a first threshold time, and wherein the method further comprises:
presenting second metadata associated with the feature, the second metadata being different from the first metadata, the second metadata being presented responsive to a determination regarding the user selected from the group of actions comprising:
determining whether the user has at least substantially looked at the area for a second threshold time and, responsive to determining that the second threshold time has not been reached, determining whether the user is making a predetermined gesture;
wherein, if the predetermined gesture made is associated with launching a software application and/or with a software application being looked at by the user, a software application associated with the item being looked at by the user is launched.
14. The method of claim 13, wherein the second metadata is also presented responsive to a determination regarding the user selected from the group of actions comprising:
looking at the specific area for a second threshold time.
15. The method of claim 13, wherein the metadata is presented without launching a software application associated with the feature.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/132,663 | 2013-12-18 | ||
US14/132,663 US20150169048A1 (en) | 2013-12-18 | 2013-12-18 | Systems and methods to present information on device based on eye tracking |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104731316A CN104731316A (en) | 2015-06-24 |
CN104731316B true CN104731316B (en) | 2019-04-23 |
Family
ID=53192783
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410534851.4A Active CN104731316B (en) | 2013-12-18 | 2014-10-11 | The system and method for information is presented in equipment based on eyes tracking |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150169048A1 (en) |
CN (1) | CN104731316B (en) |
DE (1) | DE102014118109A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9633252B2 (en) | 2013-12-20 | 2017-04-25 | Lenovo (Singapore) Pte. Ltd. | Real-time detection of user intention based on kinematics analysis of movement-oriented biometric data |
US10180716B2 (en) | 2013-12-20 | 2019-01-15 | Lenovo (Singapore) Pte Ltd | Providing last known browsing location cue using movement-oriented biometric data |
US9535497B2 (en) | 2014-11-20 | 2017-01-03 | Lenovo (Singapore) Pte. Ltd. | Presentation of data on an at least partially transparent display based on user focus |
US10317994B2 (en) * | 2015-06-05 | 2019-06-11 | International Business Machines Corporation | Initiating actions responsive to user expressions of a user while reading media content |
CN105094604A (en) * | 2015-06-30 | 2015-11-25 | 联想(北京)有限公司 | Information processing method and electronic equipment |
US10444972B2 (en) * | 2015-11-28 | 2019-10-15 | International Business Machines Corporation | Assisting a user with efficient navigation between a selection of entries with elements of interest to the user within a stream of entries |
DE102016224246A1 (en) | 2016-12-06 | 2018-06-07 | Volkswagen Aktiengesellschaft | Method and apparatus for interacting with a graphical user interface |
AU2016433740B2 (en) * | 2016-12-28 | 2022-03-24 | Razer (Asia-Pacific) Pte. Ltd. | Methods for displaying a string of text and wearable devices |
DE102017107447A1 (en) * | 2017-04-06 | 2018-10-11 | Eveline Kladov | Display device and method for operating a display device |
US20180357670A1 (en) * | 2017-06-07 | 2018-12-13 | International Business Machines Corporation | Dynamically capturing, transmitting and displaying images based on real-time visual identification of object |
US10332378B2 (en) * | 2017-10-11 | 2019-06-25 | Lenovo (Singapore) Pte. Ltd. | Determining user risk |
ES2717526A1 (en) * | 2017-12-20 | 2019-06-21 | Seat Sa | Method for managing a graphic representation of at least one message in a vehicle (Machine-translation by Google Translate, not legally binding) |
CN109151176A (en) * | 2018-07-25 | 2019-01-04 | Vivo Mobile Communication Co., Ltd. | Information acquisition method and terminal |
CN109815409B (en) * | 2019-02-02 | 2021-01-01 | Beijing 7invensun Information Technology Co., Ltd. | Information pushing method and device, wearable device and storage medium |
CN115762739B (en) * | 2022-11-23 | 2023-08-04 | Guangdong Dexin Medical Technology Co., Ltd. | Medical equipment fault reporting platform and method based on Internet of things |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101379456A (en) * | 2006-02-01 | 2009-03-04 | Tobii Technology AB | Generation of graphical feedback in a computer system |
CN103329163A (en) * | 2011-01-07 | 2013-09-25 | Samsung Electronics Co., Ltd. | Method and apparatus for collecting content |
Family Cites Families (125)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5583795A (en) * | 1995-03-17 | 1996-12-10 | The United States Of America As Represented By The Secretary Of The Army | Apparatus for measuring eye gaze and fixation duration, and method therefor |
JP3025173B2 (en) * | 1995-04-13 | 2000-03-27 | Sharp Corporation | Database search system |
US5649061A (en) * | 1995-05-11 | 1997-07-15 | The United States Of America As Represented By The Secretary Of The Army | Device and method for estimating a mental decision |
US5886683A (en) * | 1996-06-25 | 1999-03-23 | Sun Microsystems, Inc. | Method and apparatus for eyetrack-driven information retrieval |
US5731805A (en) * | 1996-06-25 | 1998-03-24 | Sun Microsystems, Inc. | Method and apparatus for eyetrack-driven text enlargement |
US5898423A (en) * | 1996-06-25 | 1999-04-27 | Sun Microsystems, Inc. | Method and apparatus for eyetrack-driven captioning |
US5831594A (en) * | 1996-06-25 | 1998-11-03 | Sun Microsystems, Inc. | Method and apparatus for eyetrack derived backtrack |
US6437758B1 (en) * | 1996-06-25 | 2002-08-20 | Sun Microsystems, Inc. | Method and apparatus for eyetrack—mediated downloading |
US20080227534A1 (en) * | 1996-11-14 | 2008-09-18 | Bally Gaming, Inc. | Gaming system with savable game states |
US6758755B2 (en) * | 1996-11-14 | 2004-07-06 | Arcade Planet, Inc. | Prize redemption system for games executed over a wide area network |
US8821258B2 (en) * | 1996-11-14 | 2014-09-02 | Agincourt Gaming, Llc | Method for providing games over a wide area network |
WO2000018287A1 (en) * | 1998-09-25 | 2000-04-06 | Case Western Reserve University | Acquired pendular nystagmus treatment device |
US6577329B1 (en) * | 1999-02-25 | 2003-06-10 | International Business Machines Corporation | Method and system for relevance feedback through gaze tracking and ticker interfaces |
US6120461A (en) * | 1999-08-09 | 2000-09-19 | The United States Of America As Represented By The Secretary Of The Army | Apparatus for tracking the human eye with a retinal scanning display, and method thereof |
WO2001040964A1 (en) * | 1999-12-01 | 2001-06-07 | Amicus Software Pty Ltd | Method and apparatus for network access |
JP4235340B2 (en) * | 2000-04-04 | 2009-03-11 | Canon Inc. | Information processing apparatus and information processing method |
US20070078552A1 (en) * | 2006-01-13 | 2007-04-05 | Outland Research, Llc | Gaze-based power conservation for portable media players |
US6873314B1 (en) * | 2000-08-29 | 2005-03-29 | International Business Machines Corporation | Method and system for the recognition of reading skimming and scanning from eye-gaze patterns |
US7197165B2 (en) * | 2002-02-04 | 2007-03-27 | Canon Kabushiki Kaisha | Eye tracking using image data |
US7046924B2 (en) * | 2002-11-25 | 2006-05-16 | Eastman Kodak Company | Method and computer program product for determining an area of importance in an image using eye monitoring information |
US7206022B2 (en) * | 2002-11-25 | 2007-04-17 | Eastman Kodak Company | Camera system with eye monitoring |
US8292433B2 (en) * | 2003-03-21 | 2012-10-23 | Queen's University At Kingston | Method and apparatus for communication between humans and devices |
US7762665B2 (en) * | 2003-03-21 | 2010-07-27 | Queen's University At Kingston | Method and apparatus for communication between humans and devices |
US9274598B2 (en) * | 2003-08-25 | 2016-03-01 | International Business Machines Corporation | System and method for selecting and activating a target object using a combination of eye gaze and key presses |
US8307296B2 (en) * | 2003-10-17 | 2012-11-06 | Palo Alto Research Center, Incorporated | Systems and methods for effective attention shifting |
US7963652B2 (en) * | 2003-11-14 | 2011-06-21 | Queen's University At Kingston | Method and apparatus for calibration-free eye tracking |
DK1607840T3 (en) * | 2004-06-18 | 2015-02-16 | Tobii Technology Ab | Eye control of a computer device |
US7738684B2 (en) * | 2004-11-24 | 2010-06-15 | General Electric Company | System and method for displaying images on a PACS workstation based on level of significance |
US7573439B2 (en) * | 2004-11-24 | 2009-08-11 | General Electric Company | System and method for significant image selection using visual tracking |
US7576757B2 (en) * | 2004-11-24 | 2009-08-18 | General Electric Company | System and method for generating most read images in a PACS workstation |
US7501995B2 (en) * | 2004-11-24 | 2009-03-10 | General Electric Company | System and method for presentation of enterprise, clinical, and decision support information utilizing eye tracking navigation |
JP4645299B2 (en) * | 2005-05-16 | 2011-03-09 | Denso Corporation | In-vehicle display device |
US7429108B2 (en) * | 2005-11-05 | 2008-09-30 | Outland Research, Llc | Gaze-responsive interface to enhance on-screen user reading tasks |
US20060256133A1 (en) * | 2005-11-05 | 2006-11-16 | Outland Research | Gaze-responsive video advertisment display |
US9250703B2 (en) * | 2006-03-06 | 2016-02-02 | Sony Computer Entertainment Inc. | Interface with gaze detection and voice input |
US8725729B2 (en) * | 2006-04-03 | 2014-05-13 | Steven G. Lisa | System, methods and applications for embedded internet searching and result display |
JP5355399B2 (en) * | 2006-07-28 | 2013-11-27 | Koninklijke Philips N.V. | Gaze interaction for displaying information on the gazed-at product |
EP2042969A1 (en) * | 2007-09-28 | 2009-04-01 | Alcatel Lucent | Method for determining user reaction with specific content of a displayed page. |
US8077915B2 (en) * | 2007-10-12 | 2011-12-13 | Sony Ericsson Mobile Communications Ab | Obtaining information by tracking a user |
US8693737B1 (en) * | 2008-02-05 | 2014-04-08 | Bank Of America Corporation | Authentication systems, operations, processing, and interactions |
JP2009259238A (en) * | 2008-03-26 | 2009-11-05 | Fujifilm Corp | Storage device for image sharing and image sharing system and method |
US20100045596A1 (en) * | 2008-08-21 | 2010-02-25 | Sony Ericsson Mobile Communications Ab | Discreet feature highlighting |
US20110141011A1 (en) * | 2008-09-03 | 2011-06-16 | Koninklijke Philips Electronics N.V. | Method of performing a gaze-based interaction between a user and an interactive display system |
US8160311B1 (en) * | 2008-09-26 | 2012-04-17 | Philip Raymond Schaefer | System and method for detecting facial gestures for control of an electronic device |
US20100079508A1 (en) * | 2008-09-30 | 2010-04-01 | Andrew Hodge | Electronic devices with gaze detection capabilities |
JP5296221B2 (en) * | 2008-12-29 | 2013-09-25 | Telefonaktiebolaget LM Ericsson (publ) | Method for installing application in NFC-compatible device, NFC-compatible device, server node, computer-readable medium, and computer program |
US8732623B2 (en) * | 2009-02-17 | 2014-05-20 | Microsoft Corporation | Web cam based user interaction |
JP5208810B2 (en) * | 2009-02-27 | 2013-06-12 | Toshiba Corporation | Information processing apparatus, information processing method, information processing program, and network conference system |
WO2010118292A1 (en) * | 2009-04-09 | 2010-10-14 | Dynavox Systems, Llc | Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods |
KR101596890B1 (en) * | 2009-07-29 | 2016-03-07 | Samsung Electronics Co., Ltd. | Apparatus and method for navigating a digital object using gaze information of a user |
US9507418B2 (en) * | 2010-01-21 | 2016-11-29 | Tobii Ab | Eye tracker based contextual action |
US8922480B1 (en) * | 2010-03-05 | 2014-12-30 | Amazon Technologies, Inc. | Viewer-based device control |
JP2012008686A (en) * | 2010-06-23 | 2012-01-12 | Sony Corp | Information processor and method, and program |
WO2012036324A1 (en) * | 2010-09-13 | 2012-03-22 | LG Electronics Inc. | Mobile terminal and method for controlling operation thereof |
US8493390B2 (en) * | 2010-12-08 | 2013-07-23 | Sony Computer Entertainment America, Inc. | Adaptive displays using gaze tracking |
US8957847B1 (en) * | 2010-12-28 | 2015-02-17 | Amazon Technologies, Inc. | Low distraction interfaces |
US20120169582A1 (en) * | 2011-01-05 | 2012-07-05 | Visteon Global Technologies | System ready switch for eye tracking human machine interaction control system |
JP5278461B2 (en) * | 2011-02-03 | 2013-09-04 | Denso Corporation | Gaze detection device and gaze detection method |
EP3527121B1 (en) * | 2011-02-09 | 2023-08-23 | Apple Inc. | Gesture detection in a 3d mapping environment |
US8594374B1 (en) * | 2011-03-30 | 2013-11-26 | Amazon Technologies, Inc. | Secure device unlock with gaze calibration |
US20130057573A1 (en) * | 2011-09-02 | 2013-03-07 | DigitalOptics Corporation Europe Limited | Smart Display with Dynamic Face-Based User Preference Settings |
US8643680B2 (en) * | 2011-04-08 | 2014-02-04 | Amazon Technologies, Inc. | Gaze-based content display |
US8881051B2 (en) * | 2011-07-05 | 2014-11-04 | Primesense Ltd | Zoom-based gesture user interface |
US8885882B1 (en) * | 2011-07-14 | 2014-11-11 | The Research Foundation For The State University Of New York | Real time eye tracking for human computer interaction |
AU2011204946C1 (en) * | 2011-07-22 | 2012-07-26 | Microsoft Technology Licensing, Llc | Automatic text scrolling on a head-mounted display |
JP5785015B2 (en) * | 2011-07-25 | 2015-09-24 | Kyocera Corporation | Electronic device, electronic document control program, and electronic document control method |
US8719278B2 (en) * | 2011-08-29 | 2014-05-06 | Buckyball Mobile Inc. | Method and system of scoring documents based on attributes obtained from a digital document by eye-tracking data analysis |
EP2587342A1 (en) * | 2011-10-28 | 2013-05-01 | Tobii Technology AB | Method and system for user initiated query searches based on gaze data |
US8611015B2 (en) * | 2011-11-22 | 2013-12-17 | Google Inc. | User interface |
KR101891786B1 (en) * | 2011-11-29 | 2018-08-27 | Samsung Electronics Co., Ltd. | Operation method for user function based on eye tracking and portable device supporting the same |
US10395263B2 (en) * | 2011-12-12 | 2019-08-27 | Intel Corporation | Interestingness scoring of areas of interest included in a display element |
US8824779B1 (en) * | 2011-12-20 | 2014-09-02 | Christopher Charles Smyth | Apparatus and method for determining eye gaze from stereo-optic views |
US8941722B2 (en) * | 2012-01-03 | 2015-01-27 | Sony Corporation | Automatic intelligent focus control of video |
US9684374B2 (en) * | 2012-01-06 | 2017-06-20 | Google Inc. | Eye reflection image analysis |
JP5945417B2 (en) * | 2012-01-06 | 2016-07-05 | Kyocera Corporation | Electronic device |
US20150084864A1 (en) * | 2012-01-09 | 2015-03-26 | Google Inc. | Input Method |
US20130198056A1 (en) * | 2012-01-27 | 2013-08-01 | Verizon Patent And Licensing Inc. | Near field communication transaction management and application systems and methods |
US20130201305A1 (en) * | 2012-02-06 | 2013-08-08 | Research In Motion Corporation | Division of a graphical display into regions |
US9536361B2 (en) * | 2012-03-14 | 2017-01-03 | Autoconnect Holdings Llc | Universal vehicle notification system |
US9096920B1 (en) * | 2012-03-22 | 2015-08-04 | Google Inc. | User interface method |
US9207843B2 (en) * | 2012-03-26 | 2015-12-08 | Nokia Technologies Oy | Method and apparatus for presenting content via social networking messages |
US20130260360A1 (en) * | 2012-03-27 | 2013-10-03 | Sony Corporation | Method and system of providing interactive information |
EP2836889A4 (en) * | 2012-04-12 | 2015-11-18 | Intel Corp | Eye tracking based selectively backlighting a display |
WO2013169237A1 (en) * | 2012-05-09 | 2013-11-14 | Intel Corporation | Eye tracking based selective accentuation of portions of a display |
US8893164B1 (en) * | 2012-05-16 | 2014-11-18 | Google Inc. | Audio system |
US9046917B2 (en) * | 2012-05-17 | 2015-06-02 | Sri International | Device, method and system for monitoring, predicting, and accelerating interactions with a computing device |
US9823742B2 (en) * | 2012-05-18 | 2017-11-21 | Microsoft Technology Licensing, Llc | Interaction and management of devices using gaze detection |
WO2013183811A1 (en) * | 2012-06-08 | 2013-12-12 | Lg Electronics Inc. | Portable device and method for controlling the same |
US20130340005A1 (en) * | 2012-06-14 | 2013-12-19 | Mobitv, Inc. | Eye-tracking program guides |
US20130340006A1 (en) * | 2012-06-14 | 2013-12-19 | Mobitv, Inc. | Eye-tracking navigation |
US20140071163A1 (en) * | 2012-09-11 | 2014-03-13 | Peter Tobias Kinnebrew | Augmented reality information detail |
US10139937B2 (en) * | 2012-10-12 | 2018-11-27 | Microsoft Technology Licensing, Llc | Multi-modal user expressions and user intensity as interactions with an application |
US9477993B2 (en) * | 2012-10-14 | 2016-10-25 | Ari M Frank | Training a predictor of emotional response based on explicit voting on content and eye tracking to verify attention |
US9626072B2 (en) * | 2012-11-07 | 2017-04-18 | Honda Motor Co., Ltd. | Eye gaze control system |
US20140168054A1 (en) * | 2012-12-14 | 2014-06-19 | Echostar Technologies L.L.C. | Automatic page turning of electronically displayed content based on captured eye position data |
US8930269B2 (en) * | 2012-12-17 | 2015-01-06 | State Farm Mutual Automobile Insurance Company | System and method to adjust insurance rate based on real-time data about potential vehicle operator impairment |
US8981942B2 (en) * | 2012-12-17 | 2015-03-17 | State Farm Mutual Automobile Insurance Company | System and method to monitor and reduce vehicle operator impairment |
US9996150B2 (en) * | 2012-12-19 | 2018-06-12 | Qualcomm Incorporated | Enabling augmented reality using eye gaze tracking |
US8854447B2 (en) * | 2012-12-21 | 2014-10-07 | United Video Properties, Inc. | Systems and methods for automatically adjusting audio based on gaze point |
US20140195918A1 (en) * | 2013-01-07 | 2014-07-10 | Steven Friedlander | Eye tracking user interface |
US9829971B2 (en) * | 2013-01-21 | 2017-11-28 | Facebook, Inc. | Systems and methods of eye tracking control |
US9791921B2 (en) * | 2013-02-19 | 2017-10-17 | Microsoft Technology Licensing, Llc | Context-aware augmented reality object commands |
US9864498B2 (en) * | 2013-03-13 | 2018-01-09 | Tobii Ab | Automatic scrolling based on gaze detection |
ES2731560T3 (en) * | 2013-03-01 | 2019-11-15 | Tobii Ab | Delayed warp gaze interaction |
US9035874B1 (en) * | 2013-03-08 | 2015-05-19 | Amazon Technologies, Inc. | Providing user input to a computing device with an eye closure |
US20140267094A1 (en) * | 2013-03-13 | 2014-09-18 | Microsoft Corporation | Performing an action on a touch-enabled device based on a gesture |
US9405771B2 (en) * | 2013-03-14 | 2016-08-02 | Microsoft Technology Licensing, Llc | Associating metadata with images in a personal image collection |
US9041741B2 (en) * | 2013-03-14 | 2015-05-26 | Qualcomm Incorporated | User interface for a head mounted display |
US10216266B2 (en) * | 2013-03-14 | 2019-02-26 | Qualcomm Incorporated | Systems and methods for device interaction based on a detected gaze |
US20140266702A1 (en) * | 2013-03-15 | 2014-09-18 | South East Water Corporation | Safety Monitor Application |
US8876535B2 (en) * | 2013-03-15 | 2014-11-04 | State Farm Mutual Automobile Insurance Company | Real-time driver observation and scoring for driver's education |
US9244527B2 (en) * | 2013-03-26 | 2016-01-26 | Volkswagen Ag | System, components and methodologies for gaze dependent gesture input control |
US20140315531A1 (en) * | 2013-04-17 | 2014-10-23 | Donald Joong | System & method for enabling or restricting features based on an attention challenge |
EP2996020A4 (en) * | 2013-05-08 | 2016-05-25 | Fujitsu Ltd | Input device and input program |
US20140354533A1 (en) * | 2013-06-03 | 2014-12-04 | Shivkumar Swaminathan | Tagging using eye gaze detection |
US9965062B2 (en) * | 2013-06-06 | 2018-05-08 | Microsoft Technology Licensing, Llc | Visual enhancements based on eye tracking |
US9908048B2 (en) * | 2013-06-08 | 2018-03-06 | Sony Interactive Entertainment Inc. | Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display |
US9563283B2 (en) * | 2013-08-06 | 2017-02-07 | Inuitive Ltd. | Device having gaze detection capabilities and a method for using same |
KR20150027614A (en) * | 2013-09-04 | 2015-03-12 | LG Electronics Inc. | Mobile terminal |
US10108258B2 (en) * | 2013-09-06 | 2018-10-23 | Intel Corporation | Multiple viewpoint image capture of a display user |
US9451062B2 (en) * | 2013-09-30 | 2016-09-20 | Verizon Patent And Licensing Inc. | Mobile device edge view display insert |
US10210761B2 (en) * | 2013-09-30 | 2019-02-19 | Sackett Solutions & Innovations, LLC | Driving assistance systems and methods |
US20150113454A1 (en) * | 2013-10-21 | 2015-04-23 | Motorola Mobility Llc | Delivery of Contextual Data to a Computing Device Using Eye Tracking Technology |
US9530067B2 (en) * | 2013-11-20 | 2016-12-27 | Ulsee Inc. | Method and apparatus for storing and retrieving personal contact information |
US20150153826A1 (en) * | 2013-12-01 | 2015-06-04 | Apx Labs, Llc | Systems and methods for providing a virtual menu |
2013
- 2013-12-18 US US14/132,663 patent/US20150169048A1/en not_active Abandoned

2014
- 2014-10-11 CN CN201410534851.4A patent/CN104731316B/en active Active
- 2014-12-08 DE DE102014118109.3A patent/DE102014118109A1/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101379456A (en) * | 2006-02-01 | 2009-03-04 | Tobii Technology AB | Generation of graphical feedback in a computer system |
CN103329163A (en) * | 2011-01-07 | 2013-09-25 | Samsung Electronics Co., Ltd. | Method and apparatus for collecting content |
Also Published As
Publication number | Publication date |
---|---|
CN104731316A (en) | 2015-06-24 |
DE102014118109A1 (en) | 2015-06-18 |
US20150169048A1 (en) | 2015-06-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104731316B (en) | System and method for presenting information on a device based on eye tracking | |
US11087538B2 (en) | Presentation of augmented reality images at display locations that do not obstruct user's view | |
US8549418B2 (en) | Projected display to enhance computer device use | |
US10387570B2 (en) | Enhanced e-reader experience | |
US10895961B2 (en) | Progressive information panels in a graphical user interface | |
US10921979B2 (en) | Display and processing methods and related apparatus | |
CN105408847B (en) | User terminal and display lock-screen method thereof | |
US9329678B2 (en) | Augmented reality overlay for control devices | |
US10922862B2 (en) | Presentation of content on headset display based on one or more condition(s) | |
US11934640B2 (en) | User interfaces for record labels | |
US9317486B1 (en) | Synchronizing playback of digital content with captured physical content | |
CN103703438A (en) | Gaze-based content display | |
JP7099444B2 (en) | Information processing equipment, information processing methods, and programs | |
JP2016126773A (en) | Systems and methods for generating haptic effects based on eye tracking | |
TW201531917A (en) | Control device, control method, and computer program | |
CN111314759B (en) | Video processing method and device, electronic equipment and storage medium | |
US9875075B1 (en) | Presentation of content on a video display and a headset display | |
US10622017B1 (en) | Apparatus, a system, and a method of dynamically generating video data | |
CN104735517B (en) | Information display method and electronic equipment | |
US20140132725A1 (en) | Electronic device and method for determining depth of 3d object image in a 3d environment image | |
US20210051245A1 (en) | Techniques for presenting video stream next to camera | |
US10248306B1 (en) | Systems and methods for end-users to link objects from images with digital content | |
US20150347364A1 (en) | Highlighting input area based on user input | |
US9990117B2 (en) | Zooming and panning within a user interface | |
CN108141474B (en) | Electronic device for sharing content with external device and method for sharing content thereof |
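The family's core mechanism, as reflected in the patent's key terms ("user", "project", "threshold time", "presented"), is to present information about an on-screen item only after the user's gaze has dwelt on it past a threshold time. A minimal sketch of such a dwell detector follows; the class name, method names, and threshold value are illustrative assumptions, not the patented implementation:

```python
import time


class DwellDetector:
    """Tracks how long the user's gaze rests on one item and fires once
    the dwell exceeds a threshold (illustrative sketch, not the patent's code)."""

    def __init__(self, threshold_s=1.0):
        self.threshold_s = threshold_s  # minimum dwell time before presenting info
        self._item = None               # item currently under the gaze point
        self._since = None              # timestamp when the gaze landed on it
        self._fired = False             # info already presented for this dwell?

    def update(self, item, now=None):
        """Feed the item under the current gaze sample (or None if no item).
        Returns the item exactly once, when the dwell first crosses the threshold."""
        now = time.monotonic() if now is None else now
        if item != self._item:
            # Gaze moved to a different item (or away): restart the dwell timer.
            self._item, self._since, self._fired = item, now, False
            return None
        if item is not None and not self._fired and now - self._since >= self.threshold_s:
            self._fired = True
            return item  # caller would now present additional information
        return None
```

A caller would invoke `update()` on each eye-tracker sample and show supplementary information whenever a non-`None` item is returned; passing `now` explicitly makes the logic testable without a real tracker.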
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||