CN106125921B - Gaze detection in 3D map environment - Google Patents


Info

Publication number
CN106125921B
Authority
CN
China
Prior art keywords
user
display
computer
project
point
Prior art date
Legal status
Active
Application number
CN201610451134.4A
Other languages
Chinese (zh)
Other versions
CN106125921A (en)
Inventor
伊亚尔·贝奇科夫
奥伦·布兰泽纳
米莎·加洛尔
奥菲尔·欧尔
乔纳森·波克拉斯
埃米尔·奥夫南
塔米尔·贝利纳
Current Assignee
Apple Inc
Original Assignee
Apple Computer Inc
Priority date
Filing date
Publication date
Application filed by Apple Computer Inc
Publication of CN106125921A
Application granted
Publication of CN106125921B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0381 Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Abstract

This disclosure relates to gaze detection in a 3D mapping environment. A method includes receiving a three-dimensional (3D) map of at least one body part of a user (22) of a computer system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of the user's head (32) are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.

Description

Gaze detection in 3D map environment
This application is a divisional application of the international application entering the national phase in China with international filing date February 9, 2012, application No. 201280007484.1, entitled "Gaze detection in a 3D mapping environment".
Related applications
This application claims the priority of U.S. Provisional Patent Application 61/440,877, filed February 9, 2011, U.S. Provisional Patent Application 61/526,692, filed August 24, 2011, and U.S. Provisional Patent Application 61/538,867, filed September 25, 2011, which are incorporated herein by reference.
Technical field
The present invention relates generally to human-machine interfaces, and more particularly to interfaces that combine multiple user interaction modalities.
Background art
Many different types of user interface devices and methods are currently available. Common tactile interface devices include the computer keyboard, mouse, and joystick. Touch screens detect the presence and location of a touch by a finger or other object within the display area. Infrared remote controls are widely used, and "wearable" hardware devices have been developed as well, for purposes of remote control.
Computer interfaces based on three-dimensional (3D) sensing of parts of the user's body have also been proposed. For example, PCT International Publication WO 03/071410, whose disclosure is incorporated herein by reference, describes a gesture recognition system using depth-perceptive sensors. A 3D sensor, typically positioned in a room in proximity to the user, provides position information, which is used to identify gestures created by a body part of interest. The gestures are recognized based on the shape of the body part and its position and orientation over an interval. The gesture is classified for determining an input into a related electronic device.
Documents incorporated by reference in the present patent application are to be considered an integral part of the application, except that, to the extent that any terms are defined in these incorporated documents in a manner that conflicts with definitions made explicitly or implicitly in the present specification, only the definitions in the present specification should be considered.
As another example, U.S. Patent 7,348,963, whose disclosure is incorporated herein by reference, describes an interactive video display system, in which a display screen displays a visual image, and a camera captures 3D information regarding objects in an interactive area located in front of the display screen. A computer system directs the display screen to change the visual image in response to changes in the objects.
Three-dimensional human interface systems may identify not only the user's hands but also other parts of the body, including the head, torso, and limbs. For example, U.S. Patent Application Publication 2010/0034457, whose disclosure is incorporated herein by reference, describes a method for modeling humanoid forms from depth maps. The depth map is segmented so as to find a contour of the body. The contour is processed in order to identify a torso and one or more limbs of the subject. An input is generated to control an application program running on a computer by analyzing a disposition of at least one of the identified limbs in the depth map.
Some user interface systems track the direction of the user's gaze. For example, U.S. Patent 7,762,665, whose disclosure is incorporated herein by reference, describes a method of modulating operation of a device, comprising: providing an attentive user interface for obtaining information regarding an attentive state of a user; and modulating operation of the device based on the obtained information, wherein the operation that is modulated is initiated by the device. Preferably, the information regarding the attentive state of the user is eye contact of the user with the device that is sensed by the attentive user interface.
Summary of the invention
According to an embodiment of the present invention, there is provided a method that includes receiving a three-dimensional (3D) map of at least one body part of a user of a computer system; receiving a two-dimensional (2D) image of the user, the image including an eye of the user; extracting, from the 3D map and the 2D image, 3D coordinates of a head of the user; and identifying, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user.
According to an embodiment of the present invention, there is additionally provided a method that includes receiving an image including an eye of a user of a computer system; identifying, based on the image of the eye, a direction of a gaze performed by the user; identifying, based on the gaze direction, a region on a display coupled to the computer system; and performing an operation on content presented in that region.
According to an embodiment of the present invention, there is further provided a method that includes presenting, by a computer system, multiple interactive items on a display coupled to the computer; receiving, from a sensing device coupled to the computer, an input representing a gaze direction of a user; identifying, based on the gaze direction, a target point on the display; associating the target point with a first interactive item appearing on the display; and, responsively to the target point, opening one or more second interactive items on the display.
According to an embodiment of the present invention, there is additionally provided a method that includes receiving and segmenting, over time, a first sequence of three-dimensional (3D) maps of at least one body part of a user of a computer system, in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a movement of the second point with respect to a display coupled to the computer system; calculating a line segment that intersects the first point and the second point; identifying a target point where the line segment intersects the display; and engaging an interactive item presented on the display in proximity to the target point.
According to an embodiment of the present invention, there is also provided an apparatus that includes a sensing device, which is configured to receive a three-dimensional (3D) map of at least one body part of a user and to receive a two-dimensional (2D) image of the user, the image including an eye of the user, and a computer, which is coupled to the sensing device and is configured to extract, from the 3D map and the 2D image, 3D coordinates of a head of the user, and to identify, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user.
According to an embodiment of the present invention, there is still further provided an apparatus that includes a sensing device configured to receive an image including an eye of a user, and a computer configured to identify, based on the image of the eye, a direction of a gaze performed by the user, to identify, based on the gaze direction, a region on a display coupled to the computer system, and to perform an operation on content presented in that region.
According to an embodiment of the present invention, there is additionally provided an apparatus that includes a display and a computer, which is coupled to the display and configured to present multiple interactive items on the display; to receive, from a sensing device coupled to the computer, an input representing a gaze direction of a user; to identify, based on the gaze direction, a target point on the display; to associate the target point with a first interactive item appearing on the display; and, responsively to the target point, to open one or more second interactive items on the display.
According to an embodiment of the present invention, there is additionally provided an apparatus that includes a display and a computer, which is coupled to the display and configured to receive and segment, over time, a first sequence of three-dimensional (3D) maps of at least one body part of a user of the computer system, in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a movement of the second point with respect to the display coupled to the computer system; to calculate a line segment that intersects the first point and the second point; to identify a target point where the line segment intersects the display; and to engage an interactive item presented on the display in proximity to the target point.
According to an embodiment of the present invention, there is still further provided a computer software product that includes a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to receive a three-dimensional (3D) map of at least one body part of a user of the computer; to receive a two-dimensional (2D) image of the user, the image including an eye of the user; to extract, from the 3D map and the 2D image, 3D coordinates of the user's head; and to identify, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user.
According to an embodiment of the present invention, there is additionally provided a computer software product that includes a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to receive an image including an eye of a user of the computer system; to identify, based on the image of the eye, a direction of a gaze performed by the user; to identify, based on the gaze direction, a region on a display coupled to the computer system; and to perform an operation on content presented in that region.
According to an embodiment of the present invention, there is additionally provided a computer software product that includes a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to present multiple interactive items on a display coupled to the computer; to receive, from a sensing device coupled to the computer, an input representing a gaze direction of a user; to identify, based on the gaze direction, a target point on the display; to associate the target point with a first interactive item appearing on the display; and, responsively to the target point, to open one or more second interactive items on the display.
According to an embodiment of the present invention, there is still further provided a computer software product that includes a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to receive and segment, over time, a first sequence of three-dimensional (3D) maps of at least one body part of a user of the computer, in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a movement of the second point with respect to a display coupled to the computer; to calculate a line segment that intersects the first point and the second point; to identify a target point where the line segment intersects the display; and to engage an interactive item presented on the display in proximity to the target point.
Brief description of the drawings
The disclosure is herein described, by way of example only, with reference to the accompanying drawings, in which:
Fig. 1 is a schematic, pictorial illustration of a computer system implementing a mixed-modality user interface, in accordance with an embodiment of the present invention;
Fig. 2 is a block diagram that schematically illustrates functional components of a computer system implementing a mixed-modality user interface, in accordance with an embodiment of the present invention;
Fig. 3 is a flow chart that schematically illustrates a method for detecting the gaze direction of a user, in accordance with an embodiment of the present invention;
Fig. 4 is a schematic representation of a numeric keypad configured for entry of a password, in accordance with an embodiment of the present invention;
Fig. 5 is a flow chart that schematically illustrates a method of interacting with a gaze-operated user interface, in accordance with an embodiment of the present invention;
Figs. 6A to 6C are schematic representations showing a sequence of operations performed using a gaze-operated user interface, in accordance with an embodiment of the present invention;
Fig. 7 is a flow chart that schematically illustrates a method of detecting gaze-related pointing gestures, in accordance with an embodiment of the present invention;
Fig. 8 is a schematic, pictorial illustration of a user performing a point-select gesture to select a first given interactive item, in accordance with an embodiment of the present invention;
Fig. 9 is a schematic, pictorial illustration of a user performing a point-touch gesture to manipulate a second given interactive item, in accordance with an embodiment of the present invention;
Fig. 10 is a schematic, pictorial illustration showing an alternative point-select gesture (also referred to herein as a trigger gesture), in accordance with an embodiment of the present invention;
Fig. 11 is a schematic, pictorial illustration of a user pointing a finger at a given icon presented on a display in order to calibrate the computer system, in accordance with an embodiment of the present invention; and
Figs. 12A and 12B are schematic, pictorial illustrations of a computer system assisting the user in selecting a given icon by presenting icons in smaller or larger sizes, in accordance with an embodiment of the present invention.
Detailed description of embodiments
Overview
When using physical tactile input devices such as buttons, rollers, or touch screens, a user typically engages and disengages control of a user interface by touching and/or manipulating the physical device. The present embodiments describe pointing gestures for interacting with interactive items presented on a display of a computer running a mixed-modality user interface, which includes three-dimensional (3D) sensing, by a 3D sensor, of a change in position or motion of one or more body parts of the user, typically a hand or a finger. The pointing gestures described herein include the point-select, point-touch, and point-hold gestures explained in detail hereinbelow.
The point-select gesture enables the user to select an interactive item presented on the display. For example, using the point-select gesture, the user can start watching a movie by performing a point-select gesture toward an icon (on the display) associated with the movie. The point-touch gesture enables the user to manipulate an interactive item presented on the display. For example, the user can horizontally scroll a list of interactive items (e.g., movies) presented on the display by manipulating a horizontal scroll box with a point-touch gesture. The point-hold gesture enables the user to view context information for an interactive item presented on the display. For example, in response to the user performing a point-hold gesture over an icon representing a movie, the computer can present a pop-up window containing information such as a synopsis, reviews, and cast members. In some embodiments, the mixed-modality user interface can also convey visual feedback as the user performs the pointing gestures described above.
When interacting with a traditional two-dimensional (2D) user interface, the physical devices described above typically convey tactile feedback to the user. However, when interacting with a 3D user interface such as the mixed-modality user interface described herein, the user may perform gestures without engaging any physical device and therefore does not receive any tactile feedback. Embodiments of the present invention provide methods and systems for interacting with items presented on a display and receiving non-tactile feedback, thereby compensating for the lack of tactile feedback.
System describe
Fig. 1 is a schematic, pictorial illustration of a mixed-modality user interface 20 operated by a user 22 of a computer 26, in accordance with an embodiment of the present invention. (Although for the sake of simplicity only a single user and user interface are shown in the figure, in practice interface 20 may interact with multiple users concurrently. Alternative embodiments of the present invention may use different user interfaces and/or support multiple user interfaces across different devices.) By way of example, user interface 20 in the illustrated embodiment is based on a 3D sensing device 24, which captures 3D scene information that includes the body, or at least parts of the body, such as a finger 30, a hand 31, a head 32, or eyes 34. Device 24, or a separate camera (not shown), may also capture color video images of the scene. The information captured by device 24 is processed by computer 26, which drives a display screen 28 accordingly so as to present and manipulate interactive items 36 (also referred to herein as on-screen interactive items). Alternatively, the user interface may be used in conjunction with any type of computerized equipment, such as a laptop, a tablet computer, a television set, and so on.
While Fig. 1 shows computer 26 in a tower configuration, other configurations of the computer are considered to be within the spirit and scope of the present invention. For example, computer 26 may be configured as a desktop computer, a portable computer (e.g., a laptop), or an all-in-one computer.
Computer 26 processes data generated by device 24 in order to reconstruct a 3D map of user 22. The term "3D map" (or equivalently, "depth map") refers to a set of 3D coordinates representing the surface of a given object, in this case the user's body. In one embodiment, device 24 projects a pattern of spots onto the object and captures an image of the projected pattern. Computer 26 then computes the 3D coordinates of points on the surface of the user's body by triangulation, based on transverse shifts of the spots in the imaged pattern. By way of example, the 3D coordinates are measured with reference to a generally horizontal X-axis 40, a generally vertical Y-axis 42, and a depth Z-axis 44, based on device 24. Methods and devices for this sort of triangulation-based 3D mapping using a projected pattern are described, for example, in PCT International Publications WO 2007/043036, WO 2007/105205, and WO 2008/120217, whose disclosures are incorporated herein by reference. Alternatively, system 20 may use other methods of 3D mapping, using single or multiple cameras or other types of sensors, as are known in the art.
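By way of illustration only, the following is a minimal sketch of the triangulation principle described above (it is not taken from the patent; the focal length, baseline, and reference distance are assumed values), converting the transverse shift of each projected spot relative to the reference pattern into a depth coordinate:

```python
import numpy as np

def depth_from_spot_shift(shift_px, focal_px=580.0, baseline_m=0.075, z_ref_m=1.0):
    """Convert transverse spot shifts (pixels) into depth values (meters).

    Simple structured-light triangulation model: the reference pattern was
    captured with a plane at z_ref_m, and a positive shift corresponds to a
    surface closer to the device than the reference plane.
    """
    d_ref = focal_px * baseline_m / z_ref_m        # disparity of the reference plane
    return focal_px * baseline_m / (d_ref + shift_px)

# Example: larger shifts map to smaller depths
shifts = np.array([0.0, 5.0, 10.0])
print(depth_from_spot_shift(shifts))               # [1.0, ~0.9, ~0.81] meters
```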
In some embodiments, device 24 detects the position and direction of the eyes 34 of user 22, typically by processing and analyzing an image comprising light reflected from one or both eyes 34 (typically infrared light and/or a color produced by the red-green-blue additive color model), in order to find the direction of the user's gaze. In alternative embodiments, computer 26 (by itself or in combination with device 24) detects the position and direction of the user's eyes 34. The reflected light may originate from a light projecting source of device 24, or from any other natural source (e.g., sunlight) or artificial source (e.g., a lamp). Using techniques that are known in the art, such as detecting the pupil center and corneal reflections (PCCR), device 24 may process and analyze an image comprising light reflected from a component of the eye 34, such as a pupil 38, an iris 39, or a cornea 41, in order to find the direction of the user's gaze. In addition, device 24 may convey the light reflected from the cornea as a glint effect (to computer 26).
The location and features of the user's head (e.g., an edge of the eye, the nose, or a nostril) that computer 26 extracts from the 3D map may be used to find coarse locations of the user's eyes, thereby simplifying the precise measurement of the eye positions and gaze direction and making the gaze measurement more reliable and robust. Furthermore, computer 26 can readily combine the 3D location of parts of head 32 (e.g., the eyes 34) provided by the 3D map with gaze angle information obtained via image analysis of the eye parts, in order to identify a given on-screen item 36 at which the user is looking at any given time. As in some eye tracking systems that are known in the art, using 3D mapping in conjunction with gaze tracking allows user 22 to move head 32 freely while alleviating the need to actively track the head using sensors or emitters on the head.
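As an illustration of how the coarse head location from the 3D map might seed the eye search in the 2D image, here is a minimal sketch (not part of the patent disclosure; the pinhole-camera parameters and window size are assumptions):

```python
import numpy as np

def eye_search_window(head_xyz, fx=580.0, fy=580.0, cx=320.0, cy=240.0, win_mm=80.0):
    """Project the 3D head position into the 2D image and return a pixel
    window around it in which to search for the eyes.

    head_xyz: (X, Y, Z) head coordinates in meters, taken from the 3D map.
    The half-window scales inversely with depth, so the search region stays
    roughly constant in physical size.
    """
    x, y, z = head_xyz
    u = cx + fx * x / z                      # pinhole projection to pixel column
    v = cy + fy * y / z                      # pinhole projection to pixel row
    half = fx * (win_mm / 1000.0) / z        # half-window size in pixels
    return (int(u - half), int(v - half), int(u + half), int(v + half))

print(eye_search_window((0.05, -0.02, 1.2)))   # a box near the image center
```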
By tracking the eyes 34, embodiments of the present invention may reduce the need to re-calibrate user 22 after the user moves head 32. In some embodiments, computer 26 may use depth information for head 32, eyes 34, and pupils 38 in order to track the head's movement, thereby enabling a reliable gaze angle calculation based on a single calibration of user 22. Utilizing techniques that are known in the art, such as PCCR, pupil tracking, and pupil shape, computer 26 may calculate a gaze angle of the eyes 34 from a fixed point of head 32, and use the head's location information to re-calculate the gaze angle and enhance the accuracy of the aforementioned techniques. In addition to reduced re-calibration, further benefits of head tracking include reducing the number of light projecting sources and the number of cameras needed to track the eyes 34.
Computer 26 typically comprises a general-purpose computer processor, which is programmed in software to carry out the functions described hereinbelow. The software may be downloaded to the processor in electronic form, over a network, for example, or it may alternatively be provided on non-transitory tangible computer-readable media, such as optical, magnetic, or electronic memory media. Additionally or alternatively, some or all of the functions of the computer processor may be carried out in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP). Although computer 26 is shown in Fig. 1, by way of example, as a separate unit from sensing device 24, some or all of the processing functions of the computer may be performed by suitable dedicated circuitry within the housing of the sensing device or otherwise associated with the sensing device.
Alternatively, these processing functions may be carried out by a suitable processor that is integrated with display 28 (in a television set, for example) or with any other suitable sort of computerized device, such as a game console or a media player. The sensing functions of device 24 may likewise be integrated into the computer or other computerized apparatus that is to be controlled by the sensor output.
Various techniques may be used to reconstruct the 3D map of the body of user 22. In one embodiment, computer 26 extracts 3D connected components corresponding to parts of the body from the depth data generated by device 24. Techniques that may be used for this purpose are described, for example, in U.S. Patent Application 12/854,187, filed August 11, 2010, whose disclosure is incorporated herein by reference. The computer analyzes these extracted components in order to reconstruct a "skeleton" of the user's body, as described in the above-mentioned U.S. Patent Application Publication 2010/0034457, or in U.S. Patent Application 12/854,188, filed August 11, 2010, whose disclosure is also incorporated herein by reference. In alternative embodiments, other techniques may be used to identify certain parts of the user's body, and there is no need for the entire body to be visible to device 24 or for the skeleton to be reconstructed, in whole or even in part.
Even if a body part (e.g., a fingertip) is not detected in the depth map, due to issues such as minimum object size and reduced resolution at greater distances from device 24, computer 26 can assume the position of the body part, such as the tip of finger 30, by using the reconstructed skeleton. In some embodiments, based on an earlier detection of the body part, or by tracking the body part along several previously received depth maps, computer 26 can auto-complete a body part based on the expected shape of the human body.
In some embodiments, the information generated by computer 26 as a result of this skeleton reconstruction includes the location and direction of the user's head, as well as of the arms, torso, and possibly legs, hands, and other features. Changes in these features from frame to frame (i.e., from depth map to depth map), or changes in the user's posture, can provide an indication of gestures and other motions made by the user. User posture, gestures, and other motions may provide a control input for the user's interaction with interface 20. These body motions may be combined with other interaction modalities that are sensed by device 24, including user eye movements as described above, as well as voice commands and other sounds. Interface 20 thus enables user 22 to perform various remote control functions and to interact with applications, interfaces, video programs, images, games, and other multimedia content appearing on display 28.
Fig. 2 is a block diagram that schematically illustrates the functional components of user interface 20, in accordance with an embodiment of the present invention. Sensing device 24 comprises an illumination subassembly 50, which projects a pattern onto the scene of interest. A depth imaging subassembly 52, such as a suitably configured video camera, captures images of the pattern on the scene. Typically, illumination subassembly 50 and imaging subassembly 52 operate in the infrared range, although other spectral ranges may also be used. Optionally, a color video camera (not shown) in device 24 captures 2D color images of the scene, and a microphone 54 may capture sound as well.
Processor 56 receives the images from subassembly 52 and compares the pattern in each image to a reference pattern stored in memory 58. The reference pattern is typically captured in advance by projecting the pattern onto a reference plane at a known distance from device 24. Processor 56 computes local shifts of parts of the pattern over the area of the 3D map and translates these shifts into depth coordinates. Details of this process are described, for example, in PCT International Publication WO 2010/004542, whose disclosure is incorporated herein by reference. Alternatively, as noted earlier, device 24 may be configured to generate 3D maps by other means that are known in the art, such as stereoscopic imaging, sonar-like devices (sound/acoustic based), wearable implements, lasers, or time-of-flight measurement.
Processor 56 typically comprises an embedded microprocessor, which is programmed in software (or firmware) to carry out the processing functions described hereinbelow. The software may be provided to the processor in electronic form, over a network, for example; alternatively or additionally, the software may be stored on non-transitory tangible computer-readable media, such as optical, magnetic, or electronic memory media. Processor 56 also comprises suitable input and output interfaces and may comprise dedicated and/or programmable hardware logic circuits for carrying out some or all of its functions. Details of some of these processing functions, and circuits that may be used to carry them out, are presented in the above-mentioned WO 2010/004542.
In some embodiments, a gaze sensor 60 detects the gaze direction of the eyes 34 of user 22 by capturing and processing two-dimensional images of user 22. In other embodiments, computer 26 detects the gaze direction by processing a sequence of 3D maps conveyed by device 24. Sensor 60 may use any suitable method of eye tracking that is known in the art, such as the methods described in the above-mentioned U.S. Patent 7,762,665 or in U.S. Patent 7,809,160, whose disclosures are incorporated herein by reference, or the alternative methods described in the references cited in these patents. For example, sensor 60 may capture an image of light (typically infrared light) reflected from the fundus or the cornea of the user's eyes. This light may be projected toward the eyes by illumination subassembly 50 or by another projection element (not shown) associated with sensor 60. Sensor 60 may capture a high-resolution image over the entire region of interest of user interface 20 and may then locate the reflections from the eyes within this region of interest. Alternatively, imaging subassembly 52 may capture the reflections from the user's eyes (ambient light, reflections from the monitor) in addition to capturing the pattern images for 3D mapping.
As another alternative, processor 56 may drive a scan control 62 to direct the field of view of gaze sensor 60 toward the location of the user's face or eyes 34. As noted above, this location may be determined by processor 56 or by computer 26 on the basis of the depth map, or on the basis of the skeleton reconstructed from the 3D map, or using methods of image-based face recognition that are known in the art. Scan control 62 may comprise, for example, an electromechanical gimbal, a scanning optical or optoelectronic element, or any other suitable type of scanner that is known in the art, such as a microelectromechanical system (MEMS) based mirror that is configured to reflect the scene to gaze sensor 60.
In some embodiments, scan control 62 may also comprise an optical or electronic zoom, which adjusts the magnification of sensor 60 according to the distance from device 24 to the user's head, as provided by the 3D map. The above techniques, implemented by scan control 62, enable a gaze sensor 60 of only moderate resolution to capture images of the user's eyes with high precision, and thus to provide precise gaze direction information.
In other embodiments, computer 26 may use the angle of the scan control (i.e., relative to Z-axis 44) to calculate the gaze angle. In further embodiments, computer 26 may compare the scenery captured by gaze sensor 60 with scenery identified in the 3D depth map. In still further embodiments, computer 26 may compare the scenery captured by gaze sensor 60 with scenery captured by a 2D camera having a wide field of view that includes the entire scene of interest. Additionally or alternatively, scan control 62 may comprise sensors (typically optical or electrical) configured to verify the eye movement angle.
Processor 56 processes the images captured by gaze sensor 60 in order to extract the user's gaze angle. By combining the angular measurements made by sensor 60 with the 3D location of the user's head provided by depth imaging subassembly 52, the processor is able to derive accurately the user's true line of sight in 3D space. The combination of 3D mapping with gaze direction sensing reduces or eliminates the need for precise calibration and for comparing multiple reflection signals in order to extract the true gaze direction. The line-of-sight information extracted by processor 56 enables computer 26 to identify reliably the interactive item at which the user is looking.
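A minimal sketch of this combination, under assumed geometry (the display is taken to lie in the Z = 0 plane of the sensor's coordinate frame, and the angle conventions are illustrative, not the patent's):

```python
import numpy as np

def gaze_point_on_display(eye_xyz, yaw_rad, pitch_rad):
    """Intersect the line of sight with the display plane Z = 0.

    eye_xyz   : 3D position of the eye, from the depth map, in meters.
    yaw_rad   : horizontal gaze angle relative to the depth axis.
    pitch_rad : vertical gaze angle relative to the depth axis.
    Returns the (x, y) point on the display plane, in meters.
    """
    eye = np.asarray(eye_xyz, dtype=float)
    direction = np.array([np.tan(yaw_rad), np.tan(pitch_rad), -1.0])  # toward display
    t = eye[2] / -direction[2]               # parameter at which Z reaches 0
    hit = eye + t * direction
    return hit[0], hit[1]

print(gaze_point_on_display((0.1, 0.3, 1.5), np.radians(-4.0), np.radians(-10.0)))
```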
Since detection of glint points (as used, for example, in the PCCR method) is not required, this combination of the two modalities may allow gaze detection to be carried out without an active projecting device (i.e., illumination subassembly 50). Using the combination may also resolve known problems with reflections from eyeglasses that affect other gaze-sensing methods known in the art. Using information derived from natural-light reflections, from the 2D image (i.e., to detect the pupil position), and from the 3D depth map (i.e., to identify the position of the head by detecting the head's features), computer 26 can calculate the gaze angle and identify the given interactive item at which the user is looking.
As noted above, gaze sensor 60 and processor 56 may track either one or both of the user's eyes. If both eyes 34 are tracked with sufficient accuracy, the processor may be able to provide an individual gaze angle measurement for each eye. When the eyes are looking at a distant object, the gaze angles of the two eyes will be parallel; but for nearby objects, the angles will typically converge on a point in proximity to the object of interest. This point may be used, together with the depth information, in extracting the 3D coordinates of the point on which the user's gaze is fixed at any given moment.
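As a purely illustrative sketch of the convergence idea (not the patent's algorithm), the fixation point can be estimated as the closest point between the two gaze rays; parallel rays indicate a distant object:

```python
import numpy as np

def fixation_point(eye_l, dir_l, eye_r, dir_r):
    """Estimate the 3D fixation point as the midpoint of the shortest
    segment between the left-eye and right-eye gaze rays."""
    p1, d1 = np.asarray(eye_l, float), np.asarray(dir_l, float)
    p2, d2 = np.asarray(eye_r, float), np.asarray(dir_r, float)
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                    # ~0 when the rays are parallel
    if abs(denom) < 1e-9:
        return None                          # eyes effectively parallel: distant target
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return (p1 + s * d1 + p2 + t * d2) / 2.0

# Eyes 6 cm apart, both gaze rays converging roughly 1 m in front of them
print(fixation_point((-0.03, 0, 0), (0.03, 0, -1), (0.03, 0, 0), (-0.03, 0, -1)))
```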
As noted above, device 24 may create 3D maps of multiple users who are in its field of view at the same time. Gaze sensor 60 may similarly find the gaze direction of each of these users, either by providing a single high-resolution image of the entire field of view or by scanning of scan control 62 to the head location of each user.
Processor 56 outputs the 3D maps and gaze information via a communication link 64, such as a Universal Serial Bus (USB) connection, to a suitable interface 66 of computer 26. The computer comprises a central processing unit (CPU) 68 with a memory 70, and a user interface 72, which drives display 28 and may include other components as well. As noted above, device 24 may alternatively output only raw images, and the 3D map and gaze computations described above may be performed in software by CPU 68. Middleware for extracting higher-level information from the 3D maps and gaze information may run on processor 56, on CPU 68, or on both. CPU 68 runs one or more application programs, which drive user interface 72 based on information provided by the middleware, typically via an application program interface (API). Such applications may include, for example, games, entertainment, web surfing, and/or office applications.
Although processor 56 and CPU 68 are shown in Fig. 2 as separate functional units with a certain division of processing tasks between them, the functions of the processor and CPU may alternatively be carried out by a single processing unit, or these functions may be divided among three or more processing units. Furthermore, although device 24 is shown as containing a certain combination of components in a particular arrangement, other device configurations may be used for the purposes described herein and are considered to be within the scope of the present invention.
Gaze detection
Fig. 3 is a flow chart that schematically illustrates a method for detecting the gaze direction of user 22, in accordance with an embodiment of the present invention. In an initial step 80, computer 26 receives from depth imaging subassembly 52 a 3D map including at least one body part of user 22, and in an extraction step 82, the computer segments the received 3D map in order to extract the 3D coordinates of head 32. In a receive step 84, computer 26 receives from gaze sensor 60 a two-dimensional image of the user that includes the eyes 34. As described above, illumination subassembly 50 may project light toward user 22, and the received image may include light reflected from the fundus and/or the cornea of the eyes 34. In an identification step 86, computer 26 analyzes the received depth maps and images in order to identify the gaze direction of user 22, and the method ends.
In some embodiments, computer 26 extracts the 3D coordinates of head 32 by identifying, from the 3D map, the position of the head along X-axis 40, Y-axis 42, and Z-axis 44. In other embodiments, computer 26 extracts the 3D coordinates of head 32 by identifying, from the 2D image, a first position of the head along X-axis 40 and Y-axis 42, and identifying, from the 3D map, a second position of the head along Z-axis 44.
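By way of illustration only, the second variant described above might be sketched as follows (the pinhole parameters and the registered depth-map format are assumptions, not part of the patent):

```python
import numpy as np

def head_3d_coordinates(head_uv, depth_map, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
    """Combine a 2D head detection with the depth map to get 3D coordinates.

    head_uv   : (u, v) pixel position of the head found in the 2D image.
    depth_map : 2D array of Z values in meters, registered to the 2D image.
    Returns (X, Y, Z) along the horizontal, vertical, and depth axes.
    """
    u, v = head_uv
    z = float(depth_map[int(v), int(u)])     # Z along the depth axis, from the 3D map
    x = (u - cx) * z / fx                    # back-project to X
    y = (v - cy) * z / fy                    # back-project to Y
    return x, y, z

depth = np.full((480, 640), 1.2)             # toy depth map: everything at 1.2 m
print(head_3d_coordinates((350, 200), depth))
```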
Embodiments of the present invention may use gaze detection to control functions of a computer system, such as computer 26, in response to the direction of the gaze. In some embodiments, computer 26 may identify a given interactive item presented on display 28 at the position of the gaze direction, and change a state of the given interactive item responsively to the gaze direction.
In a first embodiment, changing the state of the given interactive item 36 may comprise performing an operation associated with the given interactive item. For example, the interactive items 36 may comprise menu choices that user 22 can select in order to present specific content (e.g., a movie or a television program) on display 28. In a second embodiment, computer 26 may change the state of the given interactive item by directing input received from the user to the given interactive item. For example, the given interactive item may comprise a text box, and if the user is gazing at the text box, computer 26 can direct alphanumeric input received from a keyboard to the text box.
In other embodiments, computer 26 may identify the given interactive item presented on display 28 at the position of the gaze direction, and change the state of the given interactive item responsively to a vocal command received from the user. For example, the given interactive item may comprise an icon associated with a software application, and the user can gaze at the icon and say the word "start" in order to execute the application. In further embodiments, user interface 20 may be configured to identify the given interactive item responsively to the gaze direction, and to manipulate the given interactive item responsively to a gesture performed by a limb (e.g., finger 30 or hand 31). For example, after the given interactive item has been selected, if the computer receives a sequence of 3D maps indicating that the user is moving hand 31 in a waving motion (i.e., along a plane comprising X-axis 40 and Y-axis 42), computer 26 can responsively reposition the selected interactive item in the direction of the wave (e.g., from left to right).
In another embodiment, embodiments of the present invention may be used to receive an image including the eyes 34 (a 2D image or a 3D map), to identify a gaze direction based on the image, to identify, based on the gaze direction, a region on a display positioned in the gaze direction, and to perform an operation on that region. For example, computer 26 may comprise a desktop computer coupled to a digital camera, display 28 may comprise a monitor of the desktop computer, and the camera may be configured to focus its lens on an item presented in the identified region.
Additionally or alternatively, computer 26 may identify a device that is coupled to the computer and positioned in the gaze direction, and control the device responsively to the gaze direction. For example, if the user gazes at the top of a speaker coupled to the computer, the computer can increase the volume level of the speaker, and if the user gazes at the bottom of the speaker, the computer can decrease the volume level.
Fig. 4 is a schematic representation of a numeric keypad 90 presented on display 28 and configured for entry of a password, in accordance with an embodiment of the present invention. In this figure and the figures that follow, the presence and gaze of user 22 are represented by an eye 34 with a line of sight 92. Sensing device 24 is able to determine the line of sight by finding the 3D location of the user's head within the field of view 94 of the device and the gaze direction of the eyes 34.
In the present description, interactive items 36 are differentiated by using different descriptions (e.g., icon or scroll box instead of interactive item) and by appending a letter to the identifying numeral. For example, in Fig. 4, interactive items 36 comprise numeric input keys 36A and a start button 36B. While presenting the input keys and the start button, computer 26 prompts user 22 to enter a password. Although a numeric keypad is shown in Fig. 4, any sort of on-screen graphical elements may be used as the "keys" for this purpose. When the user wishes to begin operating computer 26, or to operate a particular application running on the computer, the user may direct his line of sight from one given key 36A to the next in the proper sequence, in order to enter the password. Alternatively, as the user's gaze rests on each given key 36A (i.e., in the sequence of keys comprising the password), the user may make a particular gesture or sound command (such as "enter") in order to inform device 24 to record the selection.
In some embodiments, computer 26 may be configured to select a sequence of interactive items 36 (e.g., as the user moves his line of sight from one interactive item to the next), with or without any additional input (i.e., a gesture or sound) from the user indicating the selected items. Likewise, computer 26 may be configured not to provide the user with any visual or audible feedback indicating the selection. Therefore, even if the user enters the password in a public place, an onlooker will not be able to determine or copy the password.
After entering the password, the user may direct his line of sight toward start button 36B in order to continue interacting with the computer.
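A minimal sketch of this sort of silent, gaze-driven password entry (not from the patent; the key layout, the dwell-based confirmation, and the gaze-sample format are all assumptions for illustration):

```python
def enter_password(gaze_key_stream, dwell_s=0.8):
    """Accumulate password digits as the gaze dwells on successive keys.

    gaze_key_stream yields (timestamp, key_label_or_None) samples, e.g. from a
    gaze tracker mapped onto the keypad layout. A digit is accepted once the
    gaze rests on the same key for dwell_s seconds; no visual or audible
    feedback is given, so an onlooker cannot follow the selection.
    """
    entered, current, since = [], None, None
    for ts, key in gaze_key_stream:
        if key != current:
            current, since = key, ts
        elif key is not None and ts - since >= dwell_s:
            entered.append(key)
            current, since = None, None       # require the gaze to move off the key
    return "".join(entered)

# Example: the gaze rests on "4", then on "2", each for about a second
samples = [(t * 0.1, "4") for t in range(12)] + [(1.2 + t * 0.1, "2") for t in range(12)]
print(enter_password(samples))                 # -> "42"
```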
In addition to password entry, gaze detection may be used to enhance security in other ways. For example, computer 26 may learn the user's characteristic eye movement patterns and/or other biometric features of the eyes 34 as an additional means of identification. As another example, device 24 may be configured to find not only the gaze angle of user 22, but also the gaze angles of other people located within field of view 94 (for example, identified by skeleton reconstruction from the 3D maps). In this case, device 24 may be configured to alert the user when another person is looking at display 28 (and may even prompt computer 26 to present on the display an image of the other person captured by device 24). Such functionality can help user 22 protect himself from eavesdroppers and deter them from continuing to look at the display.
As a power-saving feature, device 24 may detect when the user is not looking at display 28, and computer 26 may activate power-saving measures when the user's line of sight has been away from display 28 for more than a threshold period of time. For example, when the user is not looking at the display, computer 26 may dim the display or darken it entirely (i.e., reduce the brightness). When the user looks back toward the display, the computer may deactivate the power-saving measures. For example, upon detecting that user 22 has returned his line of sight to the display, computer 26 may increase the brightness back to full brightness. This sort of gaze-dependent screen control is useful in improving the battery life of portable devices, and also in reducing power consumption generally, for reasons of cost saving and environmental friendliness.
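A minimal sketch of this behavior (the threshold, polling interval, and `set_brightness` callback are assumptions; `set_brightness` is a hypothetical placeholder for whatever brightness control the platform provides):

```python
import time

def run_power_saver(is_looking_at_display, set_brightness, threshold_s=10.0, poll_s=0.5):
    """Dim the display when the gaze has been off it for threshold_s seconds,
    and restore full brightness as soon as the gaze returns.

    is_looking_at_display : callable returning True/False from the gaze detector.
    set_brightness        : callable accepting a level in [0.0, 1.0] (hypothetical).
    """
    last_seen = time.monotonic()
    dimmed = False
    while True:
        if is_looking_at_display():
            last_seen = time.monotonic()
            if dimmed:
                set_brightness(1.0)            # gaze returned: back to full brightness
                dimmed = False
        elif not dimmed and time.monotonic() - last_seen > threshold_s:
            set_brightness(0.2)                # gaze away too long: dim the display
            dimmed = True
        time.sleep(poll_s)
```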
Gaze-operated user interface
Gaze tracking may be used to create an interactive user interface that can detect which interactive item on the screen (e.g., a text box or an application such as a word processor) the user is looking at, thereby eliminating the need for a mouse and/or a keyboard. For example, as the user types text, the text may be automatically directed to the text box at which the user is looking. As another example, when the user makes the first keystroke of a word or sentence, a "mouse click" type event is sent to the text box being gazed at, which directs the text input to that text box. In this manner, the user can fill out a web form without having to move the mouse in order to switch from field to field ("first name," "surname," "password," and so on).
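For illustration only, here is a minimal sketch of routing keystrokes to whichever text box the gaze point falls on (the widget model and hit-testing are assumptions, not part of the patent):

```python
from dataclasses import dataclass

@dataclass
class TextBox:
    name: str
    x: int
    y: int
    w: int
    h: int
    text: str = ""

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def route_keystroke(boxes, gaze_xy, char):
    """Send the typed character to the text box under the gaze point,
    eliminating the need to click a field before typing."""
    for box in boxes:
        if box.contains(*gaze_xy):
            box.text += char
            return box.name
    return None

form = [TextBox("first name", 100, 100, 200, 30), TextBox("surname", 100, 150, 200, 30)]
print(route_keystroke(form, (150, 160), "S"))    # -> "surname"
```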
Furthermore, by combining gaze tracking with other modalities (such as 3D mapping/gesture detection and/or voice detection), the user may fully control the objects on the screen without using a mouse or a touch screen. In this manner, the user may carry out a full range of pointing and selection functions, including searching through and selecting among large numbers of information items. The combined interface modalities may also be used to search for and perform control functions within the context of a given interactive item, such as performing find, cut, copy, and paste functions within an open file.
Fig. 5 is a flow chart that schematically illustrates a method of interacting with a gaze-operated user interface, in accordance with an embodiment of the present invention, and Figs. 6A to 6C are schematic representations showing a sequence of operations performed using the gaze-operated user interface. In an initialization step 100, a computer system such as computer 26 presents multiple interactive items 36 on display 28. In the configuration shown in Figs. 6A-6C, computer 26 initially presents start button 36B at the lower right corner of display 28. As user 22 directs his line of sight 92 toward start button 36B, computer 26 receives, in a receive step 102, an input (e.g., a depth map) indicating the gaze direction of user 22. Upon determining the gaze direction, computer 26 identifies, in an identification step 104, a target point 120 on display 28 based on the gaze direction (i.e., the point on the display at which the user is looking).
In a comparison step 106, if a first interactive item 36 is in proximity to the target point, then computer 26 associates the target point with the first interactive item in an association step 108. In the example shown in Fig. 6A, the first interactive item comprises start button 36B, and target point 120 is positioned on the start button. Returning to step 106, if the target point is not in proximity to any interactive item, then the method proceeds to step 102.
In a response step 110, computer 26 opens (i.e., presents) one or more second interactive items 36 in proximity to the first interactive item, and the method ends. Computer 26 may select one of the second interactive items in response to the user adjusting the gaze direction so that the target point is in proximity to that second interactive item.
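A minimal sketch of the selection logic of steps 104-110 above (the item geometry, proximity radius, and menu callback are assumed for illustration):

```python
import math

def find_item_near(target_xy, items, radius=40.0):
    """Return the first interactive item whose center lies within radius
    pixels of the gaze target point, or None."""
    tx, ty = target_xy
    for item in items:
        if math.hypot(item["x"] - tx, item["y"] - ty) <= radius:
            return item
    return None

def on_gaze_update(target_xy, items, open_menu):
    """Steps 104-110: associate the target point with a first item and,
    if one is found, open the second interactive items around it."""
    first_item = find_item_near(target_xy, items)
    if first_item is not None:
        open_menu(first_item)                  # present the second interactive items
    return first_item

items = [{"name": "start", "x": 600, "y": 440}]
print(on_gaze_update((590, 450), items, lambda item: None))   # start button selected
```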
Although the configuration of the interactive user interface in Figs. 6A-6B shows computer 26 presenting the second interactive items 36 in a concentric pattern radially outward from the first interactive item, other configurations are considered to be within the spirit and scope of the present invention. For example, computer 26 may present the first and second interactive items as nested rectangular icons.
In some embodiments, computer 26 may open the one or more second interactive items in response to a gesture performed by a limb (e.g., finger 30 or hand 31). For example, after associating the target point with the first interactive item, if the computer receives a sequence of 3D maps indicating that the user is moving hand 31 in a waving motion (e.g., along a plane comprising X-axis 40 and Y-axis 42), computer 26 may responsively open the one or more second interactive items.
In the configuration shown in Fig. 6A, computer 26 presents a menu area 122 of second interactive items, comprising tabs 36C (arranged in a concentric pattern) for the various applications available on the computer. In some embodiments, computer 26 opens menu area 122 in response to the user gazing at target point 120, without requiring the user to make any sort of physical gesture or voice command.
In Fig. 6B, user 22 directs his line of sight 92 toward a mail tab 36C, which causes computer 26 to automatically open the user's e-mail inbox, containing a list 36D of received message items. As the user scans his line of sight up and down the list 36D, and as line of sight 92 rests on a particular list item 36D, the computer automatically presents a preview 124 of the content of the message item represented by that list item. The user can preview other list items 36D by moving his line of sight up or down, or can return to the main menu by directing his line of sight 92 toward a return button 36E. All of these actions may be carried out by eye movement alone.
In order to open a selected message item, the user may input an instruction to computer 26 by another modality. For example, the user may say "open," or may perform an opening gesture, while gazing at the list item 36D corresponding to the message item. Sensing device 24 detects the audio input or the three-dimensional motion made by the user, and inputs the appropriate instruction to computer 26. As a result, the screen shown in Fig. 6C opens, presenting the full content 126 of the selected message item. The user can return from this point to the inbox by directing his gaze to a "return to inbox" area 36F, or can return to the main menu using return button 36E, as mentioned above.
When a given interactive item 36 is selected, computer 26 may convey visual feedback to the user indicating the selection (i.e., before performing an action such as presenting the full content 126). Examples of visual feedback include changing the size and/or appearance of the selected item, or highlighting the selected item by surrounding it with a border. Conveying visual feedback helps user 22 keep his gaze concentrated in the vicinity of the target point, thereby optimizing the user experience. For example, when the user selects start button 36B, computer 26 may convey visual feedback via the start button, thereby guiding the user to keep his gaze in proximity to the location where the computer presents the tabs 36C of menu area 122.
Gaze-related pointing gestures
As described in detail hereinbelow, user 22 points at a given interactive item 36 when performing the point-select, point-touch, and point-hold gestures. In embodiments of the present invention, computer 26 identifies where the user is pointing by defining a line between a first point and a second point of the user, and identifying where the line intersects display 28 (i.e., target point 120).
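A minimal sketch of this pointing computation, under assumed geometry (the display is taken as the Z = 0 plane of the sensor frame, and the pixel-mapping values are illustrative only):

```python
import numpy as np

def pointing_target(eye_xyz, fingertip_xyz, px_per_m=3000.0, origin_px=(640, 360)):
    """Extend the segment from the first point (e.g. the eye) through the
    second point (e.g. the fingertip) to the display plane Z = 0 and return
    the target point in screen pixels, or None if it points away from the display."""
    p1 = np.asarray(eye_xyz, dtype=float)        # first point, e.g. eye 34
    p2 = np.asarray(fingertip_xyz, dtype=float)  # second point, e.g. tip of finger 30
    d = p2 - p1
    if d[2] >= 0:                                # not heading toward the display
        return None
    t = -p1[2] / d[2]                            # parameter where Z reaches 0
    hit = p1 + t * d                             # target point 120, in meters
    u = origin_px[0] + hit[0] * px_per_m
    v = origin_px[1] - hit[1] * px_per_m         # screen rows grow downward
    return u, v

print(pointing_target((0.0, 0.3, 1.6), (0.03, 0.25, 1.2)))   # e.g. (1000.0, 60.0)
```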
Fig. 7 is a flow chart that schematically illustrates a method of detecting gaze-related pointing gestures, in accordance with an embodiment of the present invention. Fig. 8 is a schematic pictorial illustration of user 22 performing a point-select gesture to select a first given interactive item 36 presented on display 28, in accordance with an embodiment of the present invention, and Fig. 9 is a schematic pictorial illustration of user 22 performing a point-touch gesture to manipulate a second given interactive item 36 presented on display 28, in accordance with an embodiment of the present invention. Examples of the first interactive item that user 22 can select with the point-select gesture include icons 36G, and examples of the second interactive item that user 22 can manipulate with the point-touch gesture include icons 36G, vertical scroll box 36H, and horizontal scroll box 36I.
The point-select gesture comprises user 22 pointing a given finger 30 (typically the index finger) toward the first given interactive item 36, moving finger 30 (typically along Z-axis 44) toward the first given interactive item, pausing or decelerating the finger, and then pulling the finger back toward the user. The point-touch gesture comprises user 22 pointing the finger toward the second given interactive item, moving finger 30 (typically along Z-axis 44) toward the second given interactive item, pausing the finger, and then moving the hand along a plane comprising X-axis 40 and Y-axis 42.
In a first receive step 130, a computer system such as computer 26 receives a first sequence of 3D maps containing user 22, and in an extract step 132, the computer segments the received 3D maps to extract 3D coordinates of a first point and a second point of the user. While Figs. 8 and 9 show the first point comprising an eye 34 and the second point comprising the tip of finger 30 (the index finger), different first and second points are considered to be within the spirit and scope of the present invention. For example, the first point may comprise a point between eyes 34 or some other point on the user's face, and the second point may comprise any point on any limb of the user. For different gestures of user 22, computer 26 may use different calculations to identify the first and second points.
In a first comparison step 134, if the first sequence of 3D maps indicates that the user is moving the second point relative to (typically toward) display 28, then in a second receive step 136, computer 26 receives and segments a second sequence of 3D maps indicating that the second point is decelerating. In a calculation step 138, computer 26 calculates line segment 160 that intersects the first point and the second point, and in an identification step 140, the computer extends line segment 160 to display 28 and identifies target point 120 where line segment 160 intersects the display.
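For illustration only, the following Python sketch shows one way steps 138 and 140 could be realized, under the assumption that the display lies in the Z = 0 plane of the sensor's coordinate frame and that the eye and fingertip coordinates have already been extracted from the 3D maps; the function name and the plane assumption are not taken from the description.

```python
import numpy as np

def target_point(first_point, second_point, display_z=0.0):
    """Extend the segment from first_point (e.g., an eye) through
    second_point (e.g., a fingertip) until it reaches the display plane.

    Both points are 3D coordinates (x, y, z) in the same frame as the
    display; the display is assumed to lie in the plane z = display_z.
    Returns the (x, y) target point on that plane, or None if the segment
    points away from the display.
    """
    p0 = np.asarray(first_point, dtype=float)
    p1 = np.asarray(second_point, dtype=float)
    direction = p1 - p0
    if abs(direction[2]) < 1e-9:      # segment parallel to the display plane
        return None
    t = (display_z - p0[2]) / direction[2]
    if t <= 0:                        # intersection lies behind the user
        return None
    hit = p0 + t * direction
    return hit[0], hit[1]

# Example: eye 600 mm in front of the display, fingertip 350 mm in front of it.
print(target_point((0.0, 200.0, 600.0), (20.0, 150.0, 350.0)))  # -> (48.0, 80.0)
```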
In some embodiments, user 22 may not be gazing at the target point. For example, computer 26 may determine (i.e., using the gaze detection embodiments described above) that the user is directing his gaze toward a first given interactive item presented on the left side of display 28, while step 140 identifies that the user is pointing toward a second given interactive item 36 positioned on the right side of display 28. In this case, the computer may be configured to select the second interactive item, even though the user is directing his gaze toward the first interactive item.
In a second comparison step 142, if computer 26 is presenting a given interactive item 36 in proximity to target point 120, then the computer engages the given interactive item in proximity to the target point, and in a third receive step 144, receives and segments a third sequence of 3D maps. In a third comparison step 146, if the third sequence of 3D maps indicates that the user is pulling the second point back from the display (i.e., toward user 22), then in a select step 148, computer 26 engages (i.e., selects) the given interactive item presented in proximity to target point 120, and the method ends. In the example shown in Fig. 8, user 22 can select icon "A" using a point-select gesture that comprises pointing finger 30 at icon "A" and pulling the finger back.
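Purely as an illustration of how steps 134, 136, and 146 might be distinguished from one another, the sketch below classifies a sampled track of the second point by its motion along the Z-axis; the thresholds and function names are assumptions made for the example, not values given in the description.

```python
def classify_z_motion(z_samples, approach_thresh=30.0, retract_thresh=20.0):
    """Classify a short window of Z-coordinates (mm) of the second point.

    Returns 'approach' if the point moved toward the display (Z decreasing)
    by more than approach_thresh, 'retract' if it moved back toward the user
    by more than retract_thresh, and 'steady' otherwise.
    """
    dz = z_samples[-1] - z_samples[0]
    if dz < -approach_thresh:
        return "approach"
    if dz > retract_thresh:
        return "retract"
    return "steady"

def is_point_select(first_seq, second_seq, third_seq):
    """A point-select gesture: approach, then decelerate, then retract."""
    slowed = abs(second_seq[-1] - second_seq[-2]) < abs(first_seq[-1] - first_seq[-2])
    return (classify_z_motion(first_seq) == "approach"
            and slowed
            and classify_z_motion(third_seq) == "retract")

# Z of the fingertip over three map sequences (mm from the display plane):
print(is_point_select([600, 560, 510], [510, 495, 490], [490, 530, 580]))  # True
```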
In some embodiments, engaging a given interactive item may comprise performing an action associated with the given interactive item. For example, the given interactive item may comprise a given icon 36G for a movie, and engaging the given icon 36G comprises executing an application that plays the movie.
Optionally, while pointing at icon "A," user 22 may issue a verbal command such as "select." Upon receiving the verbal command from microphone 54, computer 26 can perform an operation associated with the given interactive item presented in proximity to target point 120 (e.g., icon "A").
In some embodiments, computer 26 can select a given interactive item 36 using the point-select gesture, even though the gaze detection embodiments described above indicate that the user is not currently looking at the given interactive item. In a first example, if computer 26 detects that the user has not changed his gaze direction within a specified time, or if the received 2D images and 3D maps do not indicate a gaze direction (e.g., hand 31 is blocking eyes 34), then upon identifying a point-select gesture, computer 26 can "override" the gaze detection and respond to the point-select gesture.
In a second example, computer 26 can respond to the point-select gesture if the pointing finger motion is "significant" (i.e., covers at least a first specified distance), and if target point 120 is at least a second specified distance from the interactive item presented in the gaze direction. For example, if the user is gazing at a first interactive item 36 presented on the left side of display 28, and performs a point-select gesture toward a second interactive item 36 presented on the right side of the display, then computer 26 can be configured to engage the second interactive item.
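As a sketch of this second example only, the check below decides whether the pointing gesture should override the gaze: the motion of the second point must cover at least a first specified distance, and the target point must lie at least a second specified distance from the item being gazed at. The threshold values and function name are placeholders, not figures from the description.

```python
import math

def pointing_overrides_gaze(finger_travel_mm, target_point_xy, gazed_item_xy,
                            min_travel_mm=80.0, min_separation_mm=150.0):
    """Return True if a point-select gesture should be honored even though
    the gaze rests on a different interactive item."""
    separation = math.dist(target_point_xy, gazed_item_xy)
    return finger_travel_mm >= min_travel_mm and separation >= min_separation_mm

# Finger moved 120 mm; gaze rests on an item at the left of the display,
# while the pointing target is well over on the right.
print(pointing_overrides_gaze(120.0, (400.0, 80.0), (-350.0, 60.0)))  # True
```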
If the third sequence of 3D maps does not indicate that the user is pulling the second point back from display 28, then in a fourth comparison step 150, computer 26 analyzes the third sequence of 3D maps to determine whether user 22 is moving the second point along an X-Y plane comprising X-axis 40 and Y-axis 42. If the third sequence of 3D maps indicates that the user is moving the second point along the X-Y plane, then in a reposition step 152, computer 26 repositions the interactive item in proximity to target point 120 responsively to the motion of the second point, and the method ends. In the example shown in Fig. 9, user 22 can move vertical scroll box 36H using a point-touch gesture that comprises pointing finger 30 at the vertical scroll box and moving the finger up and down.
If the third sequence of 3D maps does not indicate that the user is moving the second point along the X-Y plane, then in a fifth comparison step 154, computer 26 analyzes the third sequence of 3D maps to determine whether user 22 is holding the second point relatively steady for at least a specific time period (e.g., two seconds). If the third sequence of 3D maps indicates that the user is holding the second point relatively steady for the specific time period, then in a presentation step 156, computer 26 performs a hold operation on the interactive item positioned in proximity to target point 120. For example, the hold operation may comprise presenting context information for the interactive item positioned in proximity to target point 120. Examples of context information include properties of the interactive item or options available to the user. For instance, computer 26 may present the message "Pull the finger back to select, or move the finger horizontally or vertically to reposition the item." Another example of a hold operation may comprise computer 26 deleting the interactive item positioned in proximity to target point 120.
Returning to step 154, if the third sequence of 3D maps does not indicate that the user is holding the second point relatively steady for the specific time period, then the method proceeds to step 130. Likewise, if in step 142 computer 26 is not presenting a given interactive item 36 in proximity to target point 120, or if in step 134 the first sequence of 3D maps does not indicate that user 22 is moving the second point toward display 28, then the method also continues with step 130.
In some embodiments, computer 26 can control, in response to the gestures described herein, devices coupled to the computer. For example, if the user performs an upward point-touch gesture while pointing at a speaker coupled to the computer, the computer can increase the volume level of the speaker. Likewise, if the user performs a downward point-touch gesture while pointing at the speaker, the computer can decrease the volume level of the speaker.
Although the gestures described in the flow chart of Fig. 7 comprise the point-select and point-touch gestures, other pointing gestures that identify target point 120 via line segment 160 (i.e., the line segment that intersects the first point and the second point) are considered to be within the spirit and scope of the present invention. Additionally or alternatively, the point-select and point-touch gestures can be used in combination with the gaze detection embodiments described above.
For example, computer 26 can be configured to select (i.e., point-select) a given interactive item 36 upon identifying target point 120 based on the user's gaze, a first sequence of 3D maps indicating that user 22 is moving finger 30 toward display 28, a second sequence of 3D maps indicating that the user is decelerating the finger, and a third sequence of 3D maps indicating that the user is pulling finger 30 back from the display. Likewise, computer 26 can be configured to responsively reposition (i.e., point-touch) a given interactive item 36 upon identifying target point 120 based on the user's gaze, a first sequence of 3D maps indicating that user 22 is moving finger 30 toward display 28, a second sequence of 3D maps indicating that the user is decelerating the finger, and a third sequence of 3D maps indicating that the user is moving finger 30 along a plane comprising X-axis 40 and Y-axis 42.
Fig. 10 is a schematic pictorial illustration of an alternative point-select gesture, also referred to herein as a trigger gesture, in accordance with an embodiment of the present invention. After pointing index finger 30 at a given interactive item 36, user 22 can select the given interactive item by raising or folding thumb 170 (as indicated by arrow 172), wherein folding the thumb completes the gesture (i.e., rather than moving the finger toward the interactive item). Optionally, user 22 can select a given icon by pointing finger 30 at the given interactive item and issuing a verbal command (e.g., the user says the word "open"), whereupon computer 26 may select the given icon.
Additionally or alternatively, although the examples shown in Figs. 8 and 9 show the first point positioned on head 32, other first points are considered to be within the spirit and scope of the present invention. In the example shown in Fig. 10, the first point comprises a knuckle of index finger 30, and line segment 160 passes through the fingertip and the knuckle.
Non-tactile user interface calibration
In operation, non-tactile user interface 20 is typically used by more than one user, and each user may point differently at the same given interactive item 36 presented on display 28. In some embodiments, using an element of the non-tactile user interface (i.e., an icon 36G), computer 26 can calculate and store a calibration coefficient for each user of the non-tactile user interface.
Fig. 11 is a schematic pictorial illustration of user 22 pointing finger 30 at a given icon 36G, in accordance with an embodiment of the present invention. As shown in the figure, based on the position of the user's finger 30, line segment 160 points just below icon "A." Therefore, upon identifying a given user 22, computer 26 can apply the appropriate calibration coefficient when detecting gestures performed by that user.
For example, if non-tactile user interface 20 has been inactive for a specific period of time, computer 26 can "lock" the non-tactile user interface and present a single unlock icon 36G at which user 22 can point in order to unlock the user interface. When the user points at the unlock icon, computer 26 can identify target point 120 and calculate a calibration coefficient for that user (i.e., based on the proximity of the target point to the unlock icon).
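The description does not spell out how the calibration coefficient is represented; one simple possibility, sketched below under that assumption, is to store the offset between the computed target point and the known center of the unlock icon and subtract it from subsequent target points. The class and method names are invented for the example.

```python
class PointingCalibration:
    """Per-user calibration derived from pointing at a known unlock icon.

    The 'coefficient' is modeled here as a constant 2D offset; a real
    system could equally fit a scale factor or a full affine correction.
    """
    def __init__(self):
        self.offset = (0.0, 0.0)

    def calibrate(self, target_point, unlock_icon_center):
        # Offset between where the user actually pointed and the icon center.
        self.offset = (target_point[0] - unlock_icon_center[0],
                       target_point[1] - unlock_icon_center[1])

    def correct(self, target_point):
        # Apply the stored correction to a raw target point.
        return (target_point[0] - self.offset[0],
                target_point[1] - self.offset[1])

cal = PointingCalibration()
cal.calibrate(target_point=(212.0, 148.0), unlock_icon_center=(200.0, 160.0))
print(cal.correct((390.0, 95.0)))   # -> (378.0, 107.0)
```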
Additionally or alternatively, there may be instances where target point 120 is positioned between two or more icons 36G when user 22 performs a gesture, and the computer cannot identify which icon the user is pointing at. In instances where computer 26 cannot identify which icon 36G is being pointed at, the computer can identify a subset of the icons in proximity to target point 120, present the identified icons at a larger size, and prompt the user to point again.
Fig. 12A is a schematic pictorial illustration of user 22 positioning finger 30 to point at one of icons 36G, in accordance with an embodiment of the present invention. As shown in the figure, target point 120 is positioned between icons "A," "B," "E," and "F."
Fig. 12B is a schematic pictorial illustration of user 22 positioning finger 30 to point at a given icon 36G, in accordance with an embodiment of the present invention, wherein computer 26 presents the icons at a larger size. Continuing the example shown in Fig. 12A, computer 26 presents icons "A," "B," "E," and "F" at a larger size, and user 22 then positions finger 30 to point at icon "A."
In a similar manner, user interface 20 can calibrate user 22 when the user selects a given icon 36G via the gaze detection derived from the 2D images and/or the 3D maps, as described above. First, computer 26 can "lock" the non-tactile user interface and present an unlock icon 36G at which user 22 can gaze in order to unlock the user interface. When the user gazes at the unlock icon, computer 26 can identify target point 120 and calculate a calibration coefficient for that user (i.e., based on the proximity of the target point to the unlock icon).
In a second example, if the user gazes at a given interactive item 36G for a specific period of time, computer 26 can present context information for the given interactive item in an interactive item 36 (e.g., a pop-up dialog box). If the user then gazes at the interactive item presenting the context information, computer 26 can calculate a calibration coefficient based on the proximity of the target point to the context information.
Middleware
As described above, middleware for extracting higher-level information from the 3D maps and the gaze information can run on processor 50 and/or CPU 68, and CPU 68 can drive the applications of user interface 72 based on the information provided by the middleware, typically via an API.
The following are examples of middleware primitives that computer 26 can use to extract information from the 3D maps received from device 24 (an illustrative sketch of an application-side event interface follows the list):
InteractStart(): identifies the start of an interaction with user interface 20.
InteractHover(Pos2D, Radius): identifies the current target point 120 on display 28 (i.e., the coordinates of the point on the display at which finger 30 of user 22 is pointing). The Pos2D parameter refers to a (two-dimensional) position on display 28.
InteractPointNew(Pos2D, Radius): identifies target point 120 when user 22 performs a point-select gesture.
InteractPointUpdate(Pos2D, Radius): updates target point 120 as user 22 moves finger 30 along the X-Y plane while performing a point-touch gesture.
InteractEnd(Pos2D, Radius): identifies when user 22 moves finger 30 out of field of view 94.
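As a rough illustration of how an application might consume these primitives through an API, the sketch below defines a listener with one callback per event; the class and method names are invented for the example and are not part of the middleware described here.

```python
class InteractionListener:
    """Hypothetical application-side handlers for the middleware events."""

    def interact_start(self):
        print("interaction with the user interface has started")

    def interact_hover(self, pos2d, radius):
        print(f"finger currently points at {pos2d} (uncertainty {radius})")

    def interact_point_new(self, pos2d, radius):
        print(f"point-select gesture identified target point {pos2d}")

    def interact_point_update(self, pos2d, radius):
        print(f"point-touch gesture moved the target point to {pos2d}")

    def interact_end(self, pos2d, radius):
        print(f"finger left the field of view near {pos2d}")

listener = InteractionListener()
listener.interact_start()
listener.interact_hover((120, 80), 6)
listener.interact_point_new((118, 78), 5)
```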
Using the middleware functions described above, computer 26 can identify the following "stages" as user 22 performs the point-select and point-touch gestures (a small state-machine sketch follows the list):
1. Finger recognition. In the finger recognition stage, the middleware identifies and starts tracking finger 30. The middleware tracks the finger and identifies any intent of the finger to move toward the display. This enables user interface 20 to distinguish pointing gestures from other, arbitrary hand motions.
2. Gesture recognition. To reach the gesture recognition stage, two conditions typically need to be met: (a) user 22 is moving finger 30 toward display 28, and (b) target point 120 is within the boundaries of the display. In some configurations, target point 120 may be "positioned" at a point outside display 28, since a valid point-touch gesture may include dragging a given interactive item 36 in from "outside" display 28 (e.g., a backspace key, an alt-tab key, etc.).
Upon entering the second stage, the middleware conveys an InteractStart() event. Once the InteractStart() event has been conveyed, the middleware continuously tracks finger 30 and looks for the change of direction that defines the target point (see the third stage below). As the finger moves, the middleware conveys InteractHover(Pos2D, Radius) events, enabling user interface 20 to detect where user 22 is currently pointing finger 30.
3. Interaction pointing. The interaction pointing stage is reached when user 22 pauses finger 30, or pulls finger 30 back from display 28. Based on the position of the second point, the middleware calculates target point 120 by connecting (for example) eye 34 and the fingertip to create line segment 160, and extending the line segment until it reaches the display. The middleware then conveys an InteractPointNew(Pos2D, Radius) event identifying target point 120, thereby enabling user interface 20 to select the intended interactive item 36.
4. Interaction end. User 22 pulling finger 30 back from display 28 and out of field of view 94 indicates the end of a gesture (e.g., the point-select and point-touch gestures described above), thereby disengaging the user from user interface 20. To re-engage user interface 20, user 22 can reposition finger 30 within field of view 94, thereby re-entering the finger recognition stage described above.
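To make the four stages easier to follow, here is a minimal state-machine sketch that walks through them for a single pointing gesture; the state names mirror the list above, while the update logic and its inputs are illustrative assumptions rather than the middleware's actual implementation.

```python
FINGER_RECOGNITION, GESTURE_RECOGNITION, INTERACTION_POINTING, INTERACTION_END = range(4)

class PointingStages:
    """Tracks which stage of a pointing interaction the user is in."""

    def __init__(self):
        self.stage = FINGER_RECOGNITION

    def update(self, finger_visible, moving_toward_display,
               target_on_display, finger_paused_or_retracted):
        if not finger_visible:
            # Finger left the field of view: the interaction ends.
            self.stage = INTERACTION_END
        elif self.stage == FINGER_RECOGNITION and moving_toward_display and target_on_display:
            self.stage = GESTURE_RECOGNITION      # would convey InteractStart()
        elif self.stage == GESTURE_RECOGNITION and finger_paused_or_retracted:
            self.stage = INTERACTION_POINTING     # would convey InteractPointNew()
        return self.stage

stages = PointingStages()
stages.update(True, True, True, False)           # -> GESTURE_RECOGNITION
print(stages.update(True, False, True, True))    # -> INTERACTION_POINTING (2)
```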
At a higher level of abstraction, the middleware primitives described above can be combined to define the following primitives, which are analogous to touch screen primitives known in the art:
Activation: a point-select gesture that engages (i.e., "clicks on") a given interactive item 36. The activation primitive can be used to activate an application, press a button, or follow a hyperlink.
Pan: a point-touch gesture that moves a given interactive item 36 in any direction in the X-Y plane. In operation, some applications may be configured to respond to motion along only the X-axis or the Y-axis.
Context: as described above, the point-hold gesture comprises user 22 moving the finger toward a given interactive item 36 and holding the finger relatively steady for a specific period of time (i.e., similar to positioning a mouse cursor over an item and pressing the right mouse button in the Microsoft Windows™ environment). In response to the point-hold gesture, computer 26 can convey feedback indicating to the user what to do next (e.g., drag and/or release a given icon 36G), or information about the given interactive item (e.g., the movie plot synopsis described above).
In operation, to allow development of applications that execute on computer 26 and reflect user experience (UX) language primitives at this higher level of abstraction, the middleware can convey the following events:
Start(Pos2D, Radius): starts an interaction with user interface 20 in a manner similar to the InteractStart() event, but at the higher level of abstraction.
Activate(Pos2D, Radius): one of the gesture primitives described above has been activated.
Pan(Pos2D, Radius): user 22 has initiated a point-touch gesture.
PanEnd(Pos2D, Radius): user 22 has completed (i.e., disengaged from) a point-touch pan gesture (e.g., by pulling finger 30 back from the display).
Context(Pos2D, Radius): user 22 has initiated a point-hold gesture. In operation, user 22 can transition directly from the point-hold gesture into a point-touch gesture.
ContextEnd(): user 22 has disengaged from user interface 20 upon completing the point-hold gesture (i.e., without transitioning into a point-touch gesture).
Interactive 3D displays and games
The combined 3D mapping and gaze direction information provided by sensing device 24 can be used in various ways to improve the quality and user experience of 3D map acquisition and 3D image presentation. For example, as user 22 moves through a scene in an interactive game, the landscape presented on display 28 can be brought into focus in the direction of gaze 92. Other areas (such as distant landscape when the user's viewpoint rests on a nearby item, or vice versa) may be deliberately blurred, in order to simulate actual depth of field and/or to save bandwidth.
As another example, a game application running on computer 26 can be programmed to change the "plot" of the game according to the direction of the user's gaze. For example, the computer may, at a given moment, suddenly present an item (such as an "enemy" character) in a region of the display that the user is not looking at, thereby surprising the user. The gaze-directed pointing and selection methods described above with reference to Figs. 6A to 6C can likewise be applied in games and other "virtual worlds" to select interactive items 36 and to "aim" at them (such as pointing a weapon at a target).
The depth and gaze information collected by device 24 can be used to enhance the capabilities and user experience of 3D displays, particularly autostereoscopic displays. Such displays operate by presenting different images to the left and right eyes of the user, but can generally be viewed only from a limited range of positions. By tracking the user's head position and gaze direction, device 24 can direct the display to vary the images it presents, so that the images can be viewed over a larger range of positions and can show different angular views of the items presented on the display. The parallax applied to near-field items 36 presented on an autostereoscopic display (or other 3D display) can be modified according to the distance of the user's head from the display, in order to enhance realism and to reduce the visual discomfort that some users may experience in such an environment.
Such 3D displays can also be driven to interact with the user's 3D gestures. For example, based on the known position of the user and the user's gaze direction, computer 26 can drive the 3D display to present phantom items at spatial locations where the user can "touch" them. The user can then manipulate and interact with the phantom items by moving his hand (or other body part) at the item locations in 3D space. Device 24 senses the user's gestures and provides appropriate input to computer 26, so that the computer can move or otherwise modify the items in response to the user interaction. This sort of interaction model also enables the user to reach for and interact with an on-screen object that is located "behind" another object in the virtual space created by the display.
In another example, computer 26 can present, on display 28, content comprising multiple interactive items 36 (e.g., characters in a game). In some embodiments, computer 26 can identify a region on the display in the gaze direction of the user (i.e., a region around target point 120), render the content within the identified region "in focus" (i.e., presented sharply), and render the content outside the identified region "out of focus" (i.e., presented blurred). In another embodiment, each of the multiple interactive items has an associated depth value, and when the user directs a gaze toward a given interactive item, the computer can simulate a 3D environment by rendering the interactive items whose associated depth values are consistent with the depth value of the given interactive item "in focus" (i.e., presented sharply), and rendering the interactive items whose associated depth values are inconsistent with the depth value of the given interactive item "out of focus" (i.e., presented blurred).
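The following sketch illustrates the second embodiment in the preceding paragraph: items whose depth value is close to the depth of the item being gazed at are marked "in focus," and the rest "out of focus." The tolerance value and data layout are assumptions made only for the example.

```python
def focus_map(items, gazed_item_id, depth_tolerance=0.5):
    """Given interactive items with associated depth values, decide which
    should be rendered sharply and which should be blurred.

    items: dict mapping item id -> depth value (arbitrary scene units).
    Returns a dict mapping item id -> True (in focus) or False (blurred).
    """
    gaze_depth = items[gazed_item_id]
    return {item_id: abs(depth - gaze_depth) <= depth_tolerance
            for item_id, depth in items.items()}

scene = {"hero": 2.0, "enemy": 2.3, "tree": 6.5, "mountain": 14.0}
print(focus_map(scene, gazed_item_id="enemy"))
# {'hero': True, 'enemy': True, 'tree': False, 'mountain': False}
```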
In a further example, while playing a game, user 22 can choose a weapon by pointing at an interactive icon representing the weapon on the display. If the chosen weapon is a gun, then, as described above, the user can "aim" the gun with a point-touch gesture and "fire" the gun with a point-select gesture or a trigger gesture. Optionally, if the chosen weapon is a sword, the user can manipulate the sword in three dimensions (i.e., along the X-Y plane and the Z-axis) by using a combination of point-select and point-touch gestures.
The applications described above are only a few examples of how mixed-modality user interfaces can be used to improve system capabilities and the user experience, and other similar applications are considered to be within the scope of the present invention. As another example, the capabilities of sensing device 24 can be used to measure the user's interest in content, such as a web site or a video program, according to where the user's gaze is fixed on the screen and whether the user is actually looking at the display while a certain program (such as a commercial advertisement) is presented. These capabilities can similarly be used to extract a user interest profile, for example as described in U.S. Patent Application No. 13/295,106, filed November 14, 2011, which is incorporated herein by reference.
It will be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.

Claims (17)

1. A method for controlling a function of a computer system in response to a direction of a gaze, comprising:
receiving a three-dimensional (3D) map of at least one body part of a user of the computer system;
receiving a two-dimensional (2D) image of the user, the image including the eyes of the user;
extracting 3D coordinates of a head of the user from the 3D map and the 2D image;
identifying, based on the 3D coordinates of the head and the image of the eyes, the direction of the gaze performed by the user;
receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least one body part of the user of the computer system, so as to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a movement of the second point relative to a display coupled to the computer system;
calculating a line segment that intersects the first point and the second point, wherein the line segment intersects the display;
identifying a target point where the line segment intersects the display; and
upon detecting a specific event, engaging a second interactive item presented on the display in proximity to the target point, the second interactive item being different from a first interactive item presented on the display in the direction of the gaze,
wherein the specific event comprises the movement of the second point covering at least a first specified distance, and the target point being at least a second specified distance from the first interactive item.
2. The method according to claim 1, wherein identifying the direction of the gaze comprises analyzing light reflected from a component of the eyes.
3. The method according to claim 1, wherein extracting the 3D coordinates of the head comprises segmenting the 3D map so as to extract a position of the head along a horizontal axis, a vertical axis and a depth axis.
4. The method according to claim 1, wherein extracting the 3D coordinates of the head comprises identifying, from the 2D image, a first position of the head along a horizontal axis and a vertical axis, and segmenting the 3D map so as to identify, from the 3D map, a second position of the head along a depth axis.
5. The method according to any one of claims 1 to 4, wherein controlling the function of the computer system comprises changing a state of the second interactive item.
6. The method according to claim 5, further comprising receiving a sequence of three-dimensional maps indicating that the user is moving a limb in a specific direction, and responsively repositioning the second interactive item in the specific direction.
7. The method according to claim 5, further comprising calculating a calibration coefficient based on a proximity of the target point to the second interactive item.
8. The method according to any one of claims 1 to 4, wherein controlling the function of the computer system comprises performing an action associated with the second interactive item upon receiving a first sequence of three-dimensional (3D) maps indicating that a limb is moving toward the display, receiving a second sequence of 3D maps indicating that the limb is decelerating toward the display, and receiving a third sequence of 3D maps indicating that the limb is moving away from the display.
9. The method according to any one of claims 1 to 4, wherein controlling the function of the computer system comprises activating, by the computer system, a power saving technique upon the user directing the gaze away from the display.
10. The method according to claim 9, wherein activating the power saving technique comprises reducing a brightness of the display coupled to the computer system.
11. The method according to any one of claims 1 to 4, wherein controlling the function of the computer system comprises deactivating, by the computer system, a power saving technique upon the user directing the gaze toward the display.
12. The method according to claim 11, wherein deactivating the power saving technique comprises increasing a brightness of the display coupled to the computer system.
13. The method according to any one of claims 1 to 4, wherein controlling the function of the computer system comprises identifying a device that is coupled to the computer system and is located in the direction of the gaze, and changing a state of the device.
14. The method according to any one of claims 1 to 4, wherein controlling the function of the computer system comprises presenting content on the display coupled to the computer system; identifying a region on the display coupled to the computer system in the direction of the gaze; and blurring the content outside the region.
15. The method according to any one of claims 1 to 4, wherein controlling the function of the computer system comprises: presenting, on the display coupled to the computer system, content comprising multiple interactive items, each of the multiple interactive items having an associated depth value; identifying one of the interactive items located in the direction of the gaze; identifying the depth value associated with the one of the interactive items; and blurring any of the interactive items whose depth value is inconsistent with the identified depth value.
16. The method according to any one of claims 1 to 4, further comprising controlling, in response to the direction of the gaze, a device coupled to the computer system.
17. An apparatus for controlling a function of a computer system in response to a direction of a gaze, comprising:
a sensing device, which is configured to receive a three-dimensional (3D) map of at least one body part of a user and an image of the eyes of the user, and to receive a two-dimensional (2D) image of the user, the 2D image including the eyes of the user; and
a computer, which is coupled to the sensing device and is configured to extract 3D coordinates of a head of the user from the 3D map and the 2D image, and to identify, based on the 3D coordinates of the head and the image of the eyes, the direction of the gaze performed by the user,
wherein the computer is configured to: receive and segment a first sequence of three-dimensional (3D) maps over time of at least one body part of the user, so as to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a movement of the second point relative to a display coupled to the apparatus; calculate a line segment that intersects the first point and the second point, wherein the line segment intersects the display; identify a target point where the line segment intersects the display; and, upon detecting a specific event, engage a second interactive item presented on the display in proximity to the target point, the second interactive item being different from a first interactive item presented on the display in the direction of the gaze,
wherein the specific event comprises the movement of the second point covering at least a first specified distance, and the target point being at least a second specified distance from the first interactive item.
CN201610451134.4A 2011-02-09 2012-02-09 Gaze detection in 3D map environment Active CN106125921B (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US201161440877P 2011-02-09 2011-02-09
US61/440,877 2011-02-09
US201161526692P 2011-08-24 2011-08-24
US61/526,692 2011-08-24
US201161538867P 2011-09-25 2011-09-25
US61/538,867 2011-09-25
CN201280007484.1A CN103347437B (en) 2011-02-09 2012-02-09 Gaze detection in 3D mapping environment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201280007484.1A Division CN103347437B (en) 2011-02-09 2012-02-09 Gaze detection in 3D mapping environment

Publications (2)

Publication Number Publication Date
CN106125921A CN106125921A (en) 2016-11-16
CN106125921B true CN106125921B (en) 2019-01-15

Family

ID=46639006

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201610451134.4A Active CN106125921B (en) 2011-02-09 2012-02-09 Gaze detection in 3D map environment
CN201280007484.1A Active CN103347437B (en) 2011-02-09 2012-02-09 Gaze detection in 3D mapping environment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201280007484.1A Active CN103347437B (en) 2011-02-09 2012-02-09 Gaze detection in 3D mapping environment

Country Status (4)

Country Link
US (5) US9285874B2 (en)
EP (2) EP3527121B1 (en)
CN (2) CN106125921B (en)
WO (1) WO2012107892A2 (en)

Families Citing this family (370)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2535364T3 (en) 2004-06-18 2015-05-08 Tobii Ab Eye control of computer equipment
US8885177B2 (en) * 2007-09-26 2014-11-11 Elbit Systems Ltd. Medical wide field of view optical tracking system
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US20150277120A1 (en) 2014-01-21 2015-10-01 Osterhout Group, Inc. Optical configurations for head worn computing
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
CN102959616B (en) 2010-07-20 2015-06-10 苹果公司 Interactive reality augmentation for natural interaction
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US8836777B2 (en) * 2011-02-25 2014-09-16 DigitalOptics Corporation Europe Limited Automatic detection of vertical gaze using an embedded imaging device
US9009746B2 (en) * 2011-03-17 2015-04-14 Ebay Inc. Secure transaction through a television
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US20120257035A1 (en) * 2011-04-08 2012-10-11 Sony Computer Entertainment Inc. Systems and methods for providing feedback by tracking user gaze and gestures
US8840466B2 (en) 2011-04-25 2014-09-23 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
JP5757166B2 (en) * 2011-06-09 2015-07-29 ソニー株式会社 Sound control apparatus, program, and control method
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9733789B2 (en) * 2011-08-04 2017-08-15 Eyesight Mobile Technologies Ltd. Interfacing with a device via virtual 3D objects
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9377852B1 (en) * 2013-08-29 2016-06-28 Rockwell Collins, Inc. Eye tracking as a method to improve the user interface
US9098069B2 (en) 2011-11-16 2015-08-04 Google Technology Holdings LLC Display device, corresponding systems, and methods for orienting output on a display
US9628843B2 (en) * 2011-11-21 2017-04-18 Microsoft Technology Licensing, Llc Methods for controlling electronic devices using gestures
US20140317576A1 (en) * 2011-12-06 2014-10-23 Thomson Licensing Method and system for responding to user's selection gesture of object displayed in three dimensions
WO2013089693A1 (en) * 2011-12-14 2013-06-20 Intel Corporation Gaze activated content transfer system
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US9501152B2 (en) 2013-01-15 2016-11-22 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US8638989B2 (en) 2012-01-17 2014-01-28 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US20150253428A1 (en) 2013-03-15 2015-09-10 Leap Motion, Inc. Determining positional information for an object in space
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US8693731B2 (en) 2012-01-17 2014-04-08 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
US11493998B2 (en) 2012-01-17 2022-11-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US8854433B1 (en) 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9423994B2 (en) * 2012-02-22 2016-08-23 Citrix Systems, Inc. Hierarchical display
US8947382B2 (en) * 2012-02-28 2015-02-03 Motorola Mobility Llc Wearable display device, corresponding systems, and method for presenting output on the same
US8988349B2 (en) 2012-02-28 2015-03-24 Google Technology Holdings LLC Methods and apparatuses for operating a display in an electronic device
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
WO2013136333A1 (en) * 2012-03-13 2013-09-19 Eyesight Mobile Technologies Ltd. Touch free user interface
CN104246682B (en) 2012-03-26 2017-08-25 苹果公司 Enhanced virtual touchpad and touch-screen
EP2847648A4 (en) * 2012-05-09 2016-03-02 Intel Corp Eye tracking based selective accentuation of portions of a display
WO2013168173A1 (en) * 2012-05-11 2013-11-14 Umoove Services Ltd. Gaze-based automatic scrolling
TWI489326B (en) * 2012-06-05 2015-06-21 Wistron Corp Operating area determination method and system
US8934675B2 (en) 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
WO2014020604A1 (en) * 2012-07-31 2014-02-06 Inuitive Ltd. Multiple sensors processing system for natural user interface applications
JP6011165B2 (en) * 2012-08-31 2016-10-19 オムロン株式会社 Gesture recognition device, control method thereof, display device, and control program
US8836768B1 (en) 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US9310895B2 (en) 2012-10-12 2016-04-12 Microsoft Technology Licensing, Llc Touchless input
US10146316B2 (en) 2012-10-31 2018-12-04 Nokia Technologies Oy Method and apparatus for disambiguating a plurality of targets
WO2014068582A1 (en) * 2012-10-31 2014-05-08 Nokia Corporation A method, apparatus and computer program for enabling a user input command to be performed
CN103809733B (en) * 2012-11-07 2018-07-20 北京三星通信技术研究有限公司 Man-machine interactive system and method
US9684372B2 (en) * 2012-11-07 2017-06-20 Samsung Electronics Co., Ltd. System and method for human computer interaction
WO2014106219A1 (en) * 2012-12-31 2014-07-03 Burachas Giedrius Tomas User centric interface for interaction with visual display that recognizes user intentions
US9632658B2 (en) 2013-01-15 2017-04-25 Leap Motion, Inc. Dynamic user interactions for display control and scaling responsiveness of display objects
JP6375591B2 (en) * 2013-01-15 2018-08-22 セイコーエプソン株式会社 Head-mounted display device, head-mounted display device control method, and image display system
WO2014111924A1 (en) * 2013-01-15 2014-07-24 Poow Innovation Ltd. Dynamic icons
US9459697B2 (en) * 2013-01-15 2016-10-04 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
US10365874B2 (en) * 2013-01-28 2019-07-30 Sony Corporation Information processing for band control of a communication stream
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
US20140232640A1 (en) * 2013-02-05 2014-08-21 Umoove Services Ltd. Dynamic range resetting
US9864498B2 (en) 2013-03-13 2018-01-09 Tobii Ab Automatic scrolling based on gaze detection
US9619020B2 (en) 2013-03-01 2017-04-11 Tobii Ab Delay warp gaze interaction
US9898081B2 (en) 2013-03-04 2018-02-20 Tobii Ab Gaze and saccade based graphical manipulation
US10895908B2 (en) 2013-03-04 2021-01-19 Tobii Ab Targeting saccade landing prediction using visual history
US11714487B2 (en) 2013-03-04 2023-08-01 Tobii Ab Gaze and smooth pursuit based continuous foveal adjustment
US9665171B1 (en) 2013-03-04 2017-05-30 Tobii Ab Gaze and saccade based graphical manipulation
US10082870B2 (en) 2013-03-04 2018-09-25 Tobii Ab Gaze and saccade based graphical manipulation
US11747895B2 (en) * 2013-03-15 2023-09-05 Intuitive Surgical Operations, Inc. Robotic system providing user selectable actions associated with gaze tracking
US20140280503A1 (en) 2013-03-15 2014-09-18 John Cronin System and methods for effective virtual reality visitor interface
US9838506B1 (en) 2013-03-15 2017-12-05 Sony Interactive Entertainment America Llc Virtual reality universe representation changes viewing based upon client side parameters
US20140280506A1 (en) 2013-03-15 2014-09-18 John Cronin Virtual reality enhanced through browser connections
US20140280502A1 (en) 2013-03-15 2014-09-18 John Cronin Crowd and cloud enabled virtual reality distributed location network
US20140282113A1 (en) 2013-03-15 2014-09-18 John Cronin Personal digital assistance and virtual reality
US20140280505A1 (en) 2013-03-15 2014-09-18 John Cronin Virtual reality interaction with 3d printing
US20140280644A1 (en) 2013-03-15 2014-09-18 John Cronin Real time unified communications interaction of a predefined location in a virtual reality location
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
CN103197889B (en) * 2013-04-03 2017-02-08 锤子科技(北京)有限公司 Brightness adjusting method and device and electronic device
US10620709B2 (en) 2013-04-05 2020-04-14 Ultrahaptics IP Two Limited Customized gesture interpretation
US9323338B2 (en) 2013-04-12 2016-04-26 Usens, Inc. Interactive input system and method
US20140354602A1 (en) * 2013-04-12 2014-12-04 Impression.Pi, Inc. Interactive input system and method
US9916009B2 (en) 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
KR20140133362A (en) * 2013-05-10 2014-11-19 삼성전자주식회사 display apparatus and user interface screen providing method thereof
KR101803311B1 (en) * 2013-05-10 2018-01-10 삼성전자주식회사 Display appratus and Method for providing User interface thereof
US9436288B2 (en) 2013-05-17 2016-09-06 Leap Motion, Inc. Cursor mode switching
US10620775B2 (en) 2013-05-17 2020-04-14 Ultrahaptics IP Two Limited Dynamic interactive objects
US9747696B2 (en) 2013-05-17 2017-08-29 Leap Motion, Inc. Systems and methods for providing normalized parameters of motions of objects in three-dimensional space
EP2811318B1 (en) * 2013-06-05 2015-07-22 Sick Ag Optoelectronic sensor
US10137363B2 (en) 2013-06-20 2018-11-27 Uday Parshionikar Gesture based user interfaces, apparatuses and control systems
US20140375541A1 (en) * 2013-06-25 2014-12-25 David Nister Eye tracking via depth camera
DE102013011531B4 (en) * 2013-07-10 2018-04-26 Audi Ag Method for operating an assistance system of a motor vehicle and assistance system for a motor vehicle
US20160077670A1 (en) * 2013-07-31 2016-03-17 Hewlett-Packard Development Company, L.P. System with projector unit and computer
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
CN104349197B (en) * 2013-08-09 2019-07-26 联想(北京)有限公司 A kind of data processing method and device
US10281987B1 (en) 2013-08-09 2019-05-07 Leap Motion, Inc. Systems and methods of free-space gestural interaction
US10914951B2 (en) * 2013-08-19 2021-02-09 Qualcomm Incorporated Visual, audible, and/or haptic feedback for optical see-through head mounted display with user interaction tracking
US10073518B2 (en) * 2013-08-19 2018-09-11 Qualcomm Incorporated Automatic calibration of eye tracking for optical see-through head mounted display
US10089786B2 (en) * 2013-08-19 2018-10-02 Qualcomm Incorporated Automatic customization of graphical user interface for optical see-through head mounted display with user interaction tracking
WO2015027241A1 (en) 2013-08-23 2015-02-26 Tobii Technology Ab Systems and methods for providing audio to a user based on gaze input
US9143880B2 (en) 2013-08-23 2015-09-22 Tobii Ab Systems and methods for providing audio to a user based on gaze input
US10846942B1 (en) 2013-08-29 2020-11-24 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US10108258B2 (en) * 2013-09-06 2018-10-23 Intel Corporation Multiple viewpoint image capture of a display user
CN106462178A (en) * 2013-09-11 2017-02-22 谷歌技术控股有限责任公司 Electronic device and method for detecting presence and motion
WO2015048030A1 (en) 2013-09-24 2015-04-02 Sony Computer Entertainment Inc. Gaze tracking variations using visible lights or dots
US9781360B2 (en) 2013-09-24 2017-10-03 Sony Interactive Entertainment Inc. Gaze tracking variations using selective illumination
EP3048949B1 (en) 2013-09-24 2019-11-20 Sony Interactive Entertainment Inc. Gaze tracking variations using dynamic lighting position
US9632572B2 (en) 2013-10-03 2017-04-25 Leap Motion, Inc. Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US9720559B2 (en) 2013-10-14 2017-08-01 Microsoft Technology Licensing, Llc Command authentication
KR101503159B1 (en) * 2013-10-15 2015-03-16 (주)이스트소프트 Method of controlling touch-screen detecting eyesight
TWI532377B (en) * 2013-10-18 2016-05-01 原相科技股份有限公司 Image sesning system, image sensing method, eye tracking system, eye tracking method
US9876966B2 (en) 2013-10-18 2018-01-23 Pixart Imaging Inc. System and method for determining image variation tendency and controlling image resolution
US9727134B2 (en) * 2013-10-29 2017-08-08 Dell Products, Lp System and method for display power management for dual screen display device
US9524139B2 (en) * 2013-10-29 2016-12-20 Dell Products, Lp System and method for positioning an application window based on usage context for dual screen display device
CN104598009A (en) * 2013-10-30 2015-05-06 鸿富锦精密工业(武汉)有限公司 Screen button control method and system
US9996638B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Predictive information for free space gesture control and communication
US9958939B2 (en) * 2013-10-31 2018-05-01 Sync-Think, Inc. System and method for dynamic content delivery based on gaze analytics
US9891707B2 (en) * 2013-11-01 2018-02-13 Sony Corporation Information processing device, information processing method, and program for controlling a state of an application by gaze position
WO2015062750A1 (en) * 2013-11-04 2015-05-07 Johnson Controls Gmbh Infortainment system for a vehicle
US9672649B2 (en) * 2013-11-04 2017-06-06 At&T Intellectual Property I, Lp System and method for enabling mirror video chat using a wearable display device
US10317995B2 (en) 2013-11-18 2019-06-11 Tobii Ab Component determination and gaze provoked interaction
US10558262B2 (en) 2013-11-18 2020-02-11 Tobii Ab Component determination and gaze provoked interaction
US9613202B2 (en) 2013-12-10 2017-04-04 Dell Products, Lp System and method for motion gesture access to an application and limited resources of an information handling system
US20150169048A1 (en) * 2013-12-18 2015-06-18 Lenovo (Singapore) Pte. Ltd. Systems and methods to present information on device based on eye tracking
US9633252B2 (en) 2013-12-20 2017-04-25 Lenovo (Singapore) Pte. Ltd. Real-time detection of user intention based on kinematics analysis of movement-oriented biometric data
US10180716B2 (en) 2013-12-20 2019-01-15 Lenovo (Singapore) Pte Ltd Providing last known browsing location cue using movement-oriented biometric data
JP2015121623A (en) * 2013-12-20 2015-07-02 カシオ計算機株式会社 Electronic equipment, display control method, and program
US9244539B2 (en) * 2014-01-07 2016-01-26 Microsoft Technology Licensing, Llc Target positioning with gaze tracking
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9430040B2 (en) * 2014-01-14 2016-08-30 Microsoft Technology Licensing, Llc Eye gaze detection with multiple light sources and sensors
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US20150228119A1 (en) 2014-02-11 2015-08-13 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9529199B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9811153B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9615742B2 (en) 2014-01-21 2017-04-11 Osterhout Group, Inc. Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9201578B2 (en) * 2014-01-23 2015-12-01 Microsoft Technology Licensing, Llc Gaze swipe selection
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US9588343B2 (en) 2014-01-25 2017-03-07 Sony Interactive Entertainment America Llc Menu navigation in a head-mounted display
US9437159B2 (en) 2014-01-25 2016-09-06 Sony Interactive Entertainment America Llc Environmental interrupt in a head-mounted display and utilization of non field of view real estate
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
KR101533319B1 (en) 2014-02-22 2015-07-03 주식회사 브이터치 Remote control apparatus and method using camera centric virtual touch
US9330302B2 (en) * 2014-02-26 2016-05-03 Microsoft Technology Licensing, Llc Polarized gaze tracking
US11221680B1 (en) * 2014-03-01 2022-01-11 sigmund lindsay clements Hand gestures used to operate a control panel for a device
DE102014114131A1 (en) * 2014-03-10 2015-09-10 Beijing Lenovo Software Ltd. Information processing and electronic device
WO2015143073A1 (en) * 2014-03-19 2015-09-24 Intuitive Surgical Operations, Inc. Medical devices, systems, and methods integrating eye gaze tracking for stereo viewer
KR102404559B1 (en) * 2014-03-19 2022-06-02 Intuitive Surgical Operations, Inc. Medical devices, systems, and methods using eye gaze tracking
CN104951048A (en) * 2014-03-24 2015-09-30 联想(北京)有限公司 Information processing method and device
RU2014111792A (en) * 2014-03-27 2015-10-10 LSI Corporation Image processor containing a face recognition system based on the transformation of a two-dimensional lattice
US9727778B2 (en) * 2014-03-28 2017-08-08 Wipro Limited System and method for guided continuous body tracking for complex interaction
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
US9342147B2 (en) 2014-04-10 2016-05-17 Microsoft Technology Licensing, Llc Non-visual feedback of visual change
US20150302422A1 (en) * 2014-04-16 2015-10-22 2020 Ip Llc Systems and methods for multi-user behavioral research
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9727136B2 (en) 2014-05-19 2017-08-08 Microsoft Technology Licensing, Llc Gaze detection calibration
US9740338B2 (en) * 2014-05-22 2017-08-22 Ubi interactive inc. System and methods for providing a three-dimensional touch screen
US10318016B2 (en) * 2014-06-03 2019-06-11 Harman International Industries, Incorporated Hands free device with directional interface
US9696813B2 (en) * 2015-05-27 2017-07-04 Hsien-Hsiang Chiu Gesture interface robot
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US9203951B1 (en) 2014-07-03 2015-12-01 International Business Machines Corporation Mobile telephone adapted for use with one hand
WO2016005649A1 (en) * 2014-07-09 2016-01-14 Nokia Technologies Oy Device control
WO2016008988A1 (en) 2014-07-16 2016-01-21 Sony Corporation Apparatus for presenting a virtual object on a three-dimensional display and method for controlling the apparatus
US9846522B2 (en) 2014-07-23 2017-12-19 Microsoft Technology Licensing, Llc Alignable user interface
US9645641B2 (en) * 2014-08-01 2017-05-09 Microsoft Technology Licensing, Llc Reflection-based control activation
JP6454851B2 (en) * 2014-08-07 2019-01-23 FOVE, Inc. 3D gaze point location algorithm
DE202014103729U1 (en) 2014-08-08 2014-09-09 Leap Motion, Inc. Augmented reality with motion detection
US9619008B2 (en) * 2014-08-15 2017-04-11 Dell Products, Lp System and method for dynamic thermal management in passively cooled device with a plurality of display surfaces
US10168833B2 (en) * 2014-09-03 2019-01-01 Hewlett-Packard Development Company, L.P. Presentation of a digital image of an object
KR102244620B1 (en) * 2014-09-05 2021-04-26 삼성전자 주식회사 Method and apparatus for controlling rendering quality
DE102014219803A1 (en) * 2014-09-30 2016-03-31 Siemens Aktiengesellschaft Device and method for selecting a device
CN105590015B (en) * 2014-10-24 2019-05-03 中国电信股份有限公司 Hum pattern hot spot acquisition method, treating method and apparatus and hot point system
US9535497B2 (en) 2014-11-20 2017-01-03 Lenovo (Singapore) Pte. Ltd. Presentation of data on an at least partially transparent display based on user focus
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US9823764B2 (en) * 2014-12-03 2017-11-21 Microsoft Technology Licensing, Llc Pointer projection for natural user input
US10248192B2 (en) * 2014-12-03 2019-04-02 Microsoft Technology Licensing, Llc Gaze target application launcher
US10088971B2 (en) * 2014-12-10 2018-10-02 Microsoft Technology Licensing, Llc Natural user interface camera calibration
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
WO2016125083A1 (en) * 2015-02-04 2016-08-11 Spiritus Payments Private Limited Method and system for secure pin entry on computing devices
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
EP3267295B1 (en) * 2015-03-05 2021-12-29 Sony Group Corporation Information processing device, control method, and program
NZ773819A (en) 2015-03-16 2022-07-01 Magic Leap Inc Methods and systems for diagnosing and treating health ailments
CN104699249B (en) * 2015-03-27 2018-04-27 联想(北京)有限公司 Information processing method and electronic equipment
CN104834446B (en) * 2015-05-04 2018-10-26 惠州TCL移动通信有限公司 Display screen multi-screen control method and system based on eye-tracking technology
CN104866824A (en) * 2015-05-17 2015-08-26 华南理工大学 Manual alphabet identification method based on Leap Motion
WO2016189390A2 (en) * 2015-05-28 2016-12-01 Eyesight Mobile Technologies Ltd. Gesture control system and method for smart home
CN107787497B (en) * 2015-06-10 2021-06-22 维塔驰有限公司 Method and apparatus for detecting gestures in a user-based spatial coordinate system
JP6553418B2 (en) * 2015-06-12 2019-07-31 Panasonic Intellectual Property Corporation of America Display control method, display control device and control program
US10043281B2 (en) 2015-06-14 2018-08-07 Sony Interactive Entertainment Inc. Apparatus and method for estimating eye gaze location
DE102015211521A1 (en) * 2015-06-23 2016-12-29 Robert Bosch Gmbh Method for operating an input device, input device
TWI676281B (en) 2015-07-23 2019-11-01 光澄科技股份有限公司 Optical sensor and method for fabricating thereof
US10761599B2 (en) * 2015-08-04 2020-09-01 Artilux, Inc. Eye gesture tracking
WO2017024121A1 (en) 2015-08-04 2017-02-09 Artilux Corporation Germanium-silicon light sensing apparatus
US10861888B2 (en) 2015-08-04 2020-12-08 Artilux, Inc. Silicon germanium imager with photodiode in trench
US10707260B2 (en) 2015-08-04 2020-07-07 Artilux, Inc. Circuit for operating a multi-gate VIS/IR photodiode
US20170045935A1 (en) * 2015-08-13 2017-02-16 International Business Machines Corporation Displaying content based on viewing direction
CN115824395B (en) 2015-08-27 2023-08-15 光程研创股份有限公司 Wide-spectrum optical sensor
US9703387B2 (en) 2015-08-31 2017-07-11 Konica Minolta Laboratory U.S.A., Inc. System and method of real-time interactive operation of user interface
JP6525150B2 (en) * 2015-08-31 2019-06-05 International Business Machines Corporation Method for generating control signals for use with a telepresence robot, telepresence system and computer program
US10186086B2 (en) * 2015-09-02 2019-01-22 Microsoft Technology Licensing, Llc Augmented reality control of computing device
US11194398B2 (en) 2015-09-26 2021-12-07 Intel Corporation Technologies for adaptive rendering using 3D sensors
US10401953B2 (en) 2015-10-26 2019-09-03 Pillantas Inc. Systems and methods for eye vergence control in real and augmented reality environments
US10466780B1 (en) * 2015-10-26 2019-11-05 Pillantas Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor
US9709807B2 (en) * 2015-11-03 2017-07-18 Motorola Solutions, Inc. Out of focus notifications
US10741598B2 (en) 2015-11-06 2020-08-11 Artilux, Inc. High-speed light sensing apparatus II
US10739443B2 (en) 2015-11-06 2020-08-11 Artilux, Inc. High-speed light sensing apparatus II
US10418407B2 (en) 2015-11-06 2019-09-17 Artilux, Inc. High-speed light sensing apparatus III
US10254389B2 (en) 2015-11-06 2019-04-09 Artilux Corporation High-speed light sensing apparatus
US10886309B2 (en) 2015-11-06 2021-01-05 Artilux, Inc. High-speed light sensing apparatus II
CN105301778A (en) * 2015-12-08 2016-02-03 北京小鸟看看科技有限公司 Three-dimensional control device, head-mounted device and three-dimensional control method
US9898256B2 (en) * 2015-12-31 2018-02-20 Microsoft Technology Licensing, Llc Translation of gesture to gesture code description using depth camera
US10310618B2 (en) 2015-12-31 2019-06-04 Microsoft Technology Licensing, Llc Gestures visual builder tool
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US20180164895A1 (en) * 2016-02-23 2018-06-14 Sony Corporation Remote control apparatus, remote control method, remote control system, and program
JP6859999B2 (en) * 2016-02-23 2021-04-14 ソニー株式会社 Remote control devices, remote control methods, remote control systems, and programs
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US10198233B2 (en) * 2016-03-01 2019-02-05 Microsoft Technology Licensing, Llc Updating displays based on attention tracking data
CN105844705B (en) * 2016-03-29 2018-11-09 联想(北京)有限公司 Three-dimensional object model generation method and electronic equipment
WO2017172911A1 (en) * 2016-03-29 2017-10-05 Google Inc. System and method for generating virtual marks based on gaze tracking
US10824253B2 (en) 2016-05-09 2020-11-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US9910284B1 (en) 2016-09-08 2018-03-06 Osterhout Group, Inc. Optical systems for head-worn computers
WO2017176898A1 (en) 2016-04-08 2017-10-12 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
CN107438261B (en) * 2016-05-25 2021-09-07 中兴通讯股份有限公司 Peak-to-average ratio detection device and method, and mobile communication device
KR20170141484A (en) * 2016-06-15 2017-12-26 엘지전자 주식회사 Control device for a vehicle and control method thereof
JP2017228080A (en) * 2016-06-22 2017-12-28 ソニー株式会社 Information processing device, information processing method, and program
CN106066699A (en) * 2016-06-23 2016-11-02 温州美渠传媒有限公司 Intelligent human-computer information exchange device
US10591988B2 (en) * 2016-06-28 2020-03-17 Hiscene Information Technology Co., Ltd Method for displaying user interface of head-mounted display device
US20190258318A1 (en) * 2016-06-28 2019-08-22 Huawei Technologies Co., Ltd. Terminal for controlling electronic device and processing method thereof
TWI610059B (en) * 2016-08-04 2018-01-01 緯創資通股份有限公司 Three-dimensional measurement method and three-dimensional measurement device using the same
US10690936B2 (en) 2016-08-29 2020-06-23 Mentor Acquisition One, Llc Adjustable nose bridge assembly for headworn computer
US10209772B2 (en) 2016-09-19 2019-02-19 International Business Machines Corporation Hands-free time series or chart-based data investigation
CN107958446B (en) * 2016-10-17 2023-04-07 索尼公司 Information processing apparatus, information processing method, and computer program
USD840395S1 (en) 2016-10-17 2019-02-12 Osterhout Group, Inc. Head-worn computer
EP3316075B1 (en) * 2016-10-26 2021-04-07 Harman Becker Automotive Systems GmbH Combined eye and gesture tracking
JP2018073244A (en) * 2016-11-01 2018-05-10 富士通株式会社 Calibration program, calibration apparatus, and calibration method
DE102016015119A1 (en) * 2016-12-20 2018-06-21 Drägerwerk AG & Co. KGaA Apparatus, method and computer program for configuring a medical device, medical device, method and computer program for a medical device
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
JP6814053B2 (en) * 2017-01-19 2021-01-13 株式会社日立エルジーデータストレージ Object position detector
CN108459702B (en) * 2017-02-22 2024-01-26 深圳巧牛科技有限公司 Man-machine interaction method and system based on gesture recognition and visual feedback
KR102601052B1 (en) 2017-02-23 2023-11-09 매직 립, 인코포레이티드 Display system with variable power reflector
US10650405B2 (en) 2017-03-21 2020-05-12 Kellogg Company Media content tracking
US10401954B2 (en) * 2017-04-17 2019-09-03 Intel Corporation Sensory enhanced augmented reality and virtual reality device
CN107368184B (en) * 2017-05-12 2020-04-14 阿里巴巴集团控股有限公司 Password input method and device in virtual reality scene
CN109141461A (en) * 2017-06-13 2019-01-04 博世汽车部件(苏州)有限公司 Automobile digital map navigation control system and method
US10810773B2 (en) * 2017-06-14 2020-10-20 Dell Products, L.P. Headset display control based upon a user's pupil state
US11853469B2 (en) * 2017-06-21 2023-12-26 SMR Patents S.à.r.l. Optimize power consumption of display and projection devices by tracing passenger's trajectory in car cabin
DE102017113763B4 (en) * 2017-06-21 2022-03-17 SMR Patents S.à.r.l. Method for operating a display device for a motor vehicle and motor vehicle
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US11073904B2 (en) * 2017-07-26 2021-07-27 Microsoft Technology Licensing, Llc Intelligent user interface element selection using eye-gaze
US10496162B2 (en) * 2017-07-26 2019-12-03 Microsoft Technology Licensing, Llc Controlling a computer using eyegaze and dwell
KR102048674B1 (en) * 2017-07-31 2019-11-26 코닉오토메이션 주식회사 Lighting stand type multimedia device
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
US20190073040A1 (en) * 2017-09-05 2019-03-07 Future Mobility Corporation Limited Gesture and motion based control of user interfaces
US9940518B1 (en) 2017-09-11 2018-04-10 Tobii Ab Reliability of gaze tracking data for left and right eye
CN107515474B (en) * 2017-09-22 2020-03-31 宁波维真显示科技股份有限公司 Automatic stereo display method and device and stereo display equipment
US10957069B2 (en) * 2017-09-29 2021-03-23 Tobii Ab Head pose estimation from local eye region
KR102518404B1 (en) 2017-09-29 2023-04-06 삼성전자주식회사 Electronic device and method for executing content using sight-line information thereof
CN107864390A (en) * 2017-10-24 2018-03-30 深圳前海茂佳软件科技有限公司 Control method of television set, television set, and computer-readable storage medium
JP6463826B1 (en) * 2017-11-27 2019-02-06 株式会社ドワンゴ Video distribution server, video distribution method, and video distribution program
CN107976183A (en) * 2017-12-18 2018-05-01 北京师范大学珠海分校 Spatial data measuring method and device
KR102476757B1 (en) 2017-12-21 2022-12-09 삼성전자주식회사 Device and method to detect reflection
EP3502836B1 (en) * 2017-12-21 2021-09-08 Atos Information Technology GmbH Method for operating an augmented interactive reality system
US20190253751A1 (en) * 2018-02-13 2019-08-15 Perfect Corp. Systems and Methods for Providing Product Information During a Live Broadcast
WO2019161503A1 (en) 2018-02-22 2019-08-29 Innodem Neurosciences Eye tracking method and system
CN113540142A (en) 2018-02-23 2021-10-22 奥特逻科公司 Optical detection device
US11105928B2 (en) 2018-02-23 2021-08-31 Artilux, Inc. Light-sensing apparatus and light-sensing method thereof
CN108469893B (en) * 2018-03-09 2021-08-27 海尔优家智能科技(北京)有限公司 Display screen control method, device, equipment and computer readable storage medium
TWI780007B (en) 2018-04-08 2022-10-01 美商光程研創股份有限公司 Photo-detecting apparatus and system thereof
US11126257B2 (en) 2018-04-17 2021-09-21 Toyota Research Institute, Inc. System and method for detecting human gaze and gesture in unconstrained environments
WO2019209265A1 (en) * 2018-04-24 2019-10-31 Hewlett-Packard Development Company, L.P. Animated gazes on head mounted displays
US10854770B2 (en) 2018-05-07 2020-12-01 Artilux, Inc. Avalanche photo-transistor
US10969877B2 (en) 2018-05-08 2021-04-06 Artilux, Inc. Display apparatus
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
US10895757B2 (en) * 2018-07-03 2021-01-19 Verb Surgical Inc. Systems and methods for three-dimensional visualization during robotic surgery
US11949943B2 (en) 2018-07-16 2024-04-02 Arris Enterprises Llc Gaze-responsive advertisement
CN110865761B (en) * 2018-08-28 2023-07-28 财团法人工业技术研究院 Direction judging system and direction judging method
CN109145566A (en) * 2018-09-08 2019-01-04 太若科技(北京)有限公司 Method and apparatus for unlocking AR glasses based on gaze point information, and AR glasses
CN109389069B (en) * 2018-09-28 2021-01-05 北京市商汤科技开发有限公司 Gaze point determination method and apparatus, electronic device, and computer storage medium
US20200125175A1 (en) * 2018-10-17 2020-04-23 WiSilica Inc. System using location, video-processing, and voice as user interface for controlling devices
US10782777B2 (en) 2018-11-29 2020-09-22 International Business Machines Corporation Real-time alteration of standard video and immersive video for virtual reality
CN109480767B (en) * 2018-12-13 2021-03-19 孙艳 Medical auxiliary examination device used before ophthalmologic operation
USD914021S1 (en) 2018-12-18 2021-03-23 Intel Corporation Touchpad display screen for computing device
CN109828660B (en) * 2018-12-29 2022-05-17 深圳云天励飞技术有限公司 Method and device for controlling application operation based on augmented reality
KR20200085970A (en) * 2019-01-07 2020-07-16 현대자동차주식회사 Vehicle and control method thereof
US11107265B2 (en) * 2019-01-11 2021-08-31 Microsoft Technology Licensing, Llc Holographic palm raycasting for targeting virtual objects
KR102225342B1 (en) * 2019-02-13 2021-03-09 주식회사 브이터치 Method, system and non-transitory computer-readable recording medium for supporting object control
US20220176817A1 (en) * 2019-03-08 2022-06-09 Indian Institute Of Science A system for man-machine interaction in vehicles
DE112019007085T5 (en) * 2019-03-27 2022-01-20 Intel Corporation Intelligent scoreboard setup and related techniques
US11256342B2 (en) * 2019-04-03 2022-02-22 Facebook Technologies, Llc Multimodal kinematic template matching and regression modeling for ray pointing prediction in virtual reality
CN117590582A (en) * 2019-04-11 2024-02-23 三星电子株式会社 Head-mounted display device and operation method thereof
US10819920B1 (en) 2019-05-22 2020-10-27 Dell Products L.P. Augmented information handling system user presence detection
US11379016B2 (en) 2019-05-23 2022-07-05 Intel Corporation Methods and apparatus to operate closed-lid portable computers
CN110147180B (en) * 2019-05-24 2022-08-02 深圳秋田微电子股份有限公司 Touch display device, touch display method, display and terminal
EP3751812B1 (en) 2019-06-10 2022-10-26 Nokia Technologies Oy Resource access
US11216065B2 (en) * 2019-09-26 2022-01-04 Lenovo (Singapore) Pte. Ltd. Input control display based on eye gaze
US11543873B2 (en) 2019-09-27 2023-01-03 Intel Corporation Wake-on-touch display screen devices and related methods
US11435447B2 (en) 2019-10-11 2022-09-06 Dell Products L.P. Information handling system proximity sensor with mechanically adjusted field of view
US11662695B2 (en) 2019-10-11 2023-05-30 Dell Products L.P. Information handling system infrared proximity detection with distance reduction detection
US11294054B2 (en) 2019-10-11 2022-04-05 Dell Products L.P. Information handling system infrared proximity detection with ambient light management
US11435475B2 (en) 2019-10-11 2022-09-06 Dell Products L.P. Information handling system infrared proximity detection with frequency domain modulation
US11614797B2 (en) 2019-11-05 2023-03-28 Micron Technology, Inc. Rendering enhancement based in part on eye tracking
US11733761B2 (en) 2019-11-11 2023-08-22 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US11809535B2 (en) 2019-12-23 2023-11-07 Intel Corporation Systems and methods for multi-modal user device authentication
US11360528B2 (en) 2019-12-27 2022-06-14 Intel Corporation Apparatus and methods for thermal management of electronic user devices based on user activity
US11334146B2 (en) 2020-01-31 2022-05-17 Dell Products L.P. Information handling system peripheral enhanced user presence detection
US11513813B2 (en) 2020-01-31 2022-11-29 Dell Products L.P. Information handling system notification presentation based upon user presence detection
US11663343B2 (en) 2020-01-31 2023-05-30 Dell Products L.P. Information handling system adaptive user presence detection
JP2023520463A (en) * 2020-04-03 2023-05-17 マジック リープ, インコーポレイテッド Avatar Customization for Optimal Gaze Discrimination
TW202205053A (en) * 2020-07-27 2022-02-01 虹光精密工業股份有限公司 Office machine with intelligent sleep and wake function and control method thereof
EP4281843A1 (en) * 2021-01-20 2023-11-29 Apple Inc. Methods for interacting with objects in an environment
EP4288950A1 (en) 2021-02-08 2023-12-13 Sightful Computers Ltd User interactions in extended reality
EP4288856A1 (en) 2021-02-08 2023-12-13 Sightful Computers Ltd Extended reality for productivity
EP4295314A1 (en) 2021-02-08 2023-12-27 Sightful Computers Ltd Content sharing in extended reality
WO2023009580A2 (en) 2021-07-28 2023-02-02 Multinarity Ltd Using an extended reality appliance for productivity
CN116263598A (en) * 2021-12-13 2023-06-16 追觅创新科技(苏州)有限公司 Relocation method and device for a self-moving device, and storage medium
US20230334795A1 (en) 2022-01-25 2023-10-19 Multinarity Ltd Dual mode presentation of user interface elements
US11948263B1 (en) 2023-03-14 2024-04-02 Sightful Computers Ltd Recording the complete physical and extended reality environments of a user
US20240029437A1 (en) * 2022-07-21 2024-01-25 Sony Interactive Entertainment LLC Generating customized summaries of virtual actions and events
CN115601824B (en) * 2022-10-19 2023-05-26 华中科技大学 Device, system and method for labeling gaze direction of human eye in two-dimensional image

Family Cites Families (242)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4550250A (en) 1983-11-14 1985-10-29 Hei, Inc. Cordless digital graphics input device
US4789921A (en) 1987-02-20 1988-12-06 Minnesota Mining And Manufacturing Company Cone shaped Fresnel reflector
US4988981B1 (en) 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
US5588139A (en) 1990-06-07 1996-12-24 Vpl Research, Inc. Method and system for generating objects for a multi-person virtual world using data flow networks
US5973700A (en) 1992-09-16 1999-10-26 Eastman Kodak Company Method and apparatus for optimizing the resolution of images which have an apparent depth
US5495576A (en) 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5454043A (en) 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US5434370A (en) 1993-11-05 1995-07-18 Microfield Graphics, Inc. Marking system with pen-up/pen-down tracking
WO1996009579A1 (en) 1994-09-22 1996-03-28 Izak Van Cruyningen Popup menus with directional gestures
US5594469A (en) 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US6535210B1 (en) 1995-06-07 2003-03-18 Geovector Corp. Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time
US5852672A (en) 1995-07-10 1998-12-22 The Regents Of The University Of California Image system for three dimensional, 360 DEGREE, time sequence surface mapping of moving objects
RU2109336C1 (en) 1995-07-14 1998-04-20 Nurakhmed Nurislamovich Latypov Method and device for immersing user into virtual world
EP0768511A1 (en) 1995-10-16 1997-04-16 European Community Optical three-dimensional profilometry method based on processing speckle images in partially coherent light, and interferometer implementing such a method
US5864635A (en) 1996-06-14 1999-01-26 International Business Machines Corporation Distinguishing gestures from handwriting in a pen based computer by stroke analysis
US5862256A (en) 1996-06-14 1999-01-19 International Business Machines Corporation Distinguishing gestures from handwriting in a pen based computer by size discrimination
US6084979A (en) 1996-06-20 2000-07-04 Carnegie Mellon University Method for creating virtual reality
US6002808A (en) 1996-07-26 1999-12-14 Mitsubishi Electric Information Technology Center America, Inc. Hand gesture control system
US6118888A (en) 1997-02-28 2000-09-12 Kabushiki Kaisha Toshiba Multi-modal interface apparatus and method
US5917937A (en) 1997-04-15 1999-06-29 Microsoft Corporation Method for performing stereo matching to recover depths, colors and opacities of surface elements
US6049327A (en) 1997-04-23 2000-04-11 Modern Cartoons, Ltd System for data management based on hand gestures
US6008813A (en) 1997-08-01 1999-12-28 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Real-time PC based volume rendering system
US6720949B1 (en) 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
US6072494A (en) 1997-10-15 2000-06-06 Electric Planet, Inc. Method and apparatus for real-time gesture recognition
JP3361980B2 (en) 1997-12-12 2003-01-07 株式会社東芝 Eye gaze detecting apparatus and method
WO1999035633A2 (en) 1998-01-06 1999-07-15 The Video Mouse Group Human motion following computer mouse and game controller
US7844914B2 (en) 2004-07-30 2010-11-30 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US6211848B1 (en) 1998-05-15 2001-04-03 Massachusetts Institute Of Technology Dynamic holographic video with haptic interaction
US6076928A (en) 1998-06-15 2000-06-20 Fateh; Sina Ideal visual ergonomic system for computer users
US6064354A (en) 1998-07-01 2000-05-16 Deluca; Michael Joseph Stereoscopic user interface method and apparatus
US6252988B1 (en) 1998-07-09 2001-06-26 Lucent Technologies Inc. Method and apparatus for character recognition using stop words
US6681031B2 (en) 1998-08-10 2004-01-20 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US6501515B1 (en) 1998-10-13 2002-12-31 Sony Corporation Remote control system
CN1145872C (en) 1999-01-13 2004-04-14 国际商业机器公司 Method for automatically cutting and identifying handwritten Chinese characters and system for using said method
US6200139B1 (en) * 1999-02-26 2001-03-13 Intel Corporation Operator training system
US7003134B1 (en) 1999-03-08 2006-02-21 Vulcan Patents Llc Three dimensional object pose estimation which employs dense depth information
US6614422B1 (en) 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
GB9913687D0 (en) 1999-06-11 1999-08-11 Canon Kk Image processing apparatus
US6512385B1 (en) 1999-07-26 2003-01-28 Paul Pfaff Method for testing a device under test including the interference of two beams
US6512838B1 (en) 1999-09-22 2003-01-28 Canesta, Inc. Methods for enhancing performance and data acquired from three-dimensional image systems
US7548874B2 (en) 1999-10-21 2009-06-16 International Business Machines Corporation System and method for group advertisement optimization
WO2001095061A2 (en) 1999-12-07 2001-12-13 Fraunhofer Institut Fuer Graphische Datenverarbeitung The extended virtual table: an optical extension for table-like projection systems
US6507353B1 (en) 1999-12-10 2003-01-14 Godot Huard Influencing virtual actors in an interactive environment
US6900779B1 (en) 2000-01-28 2005-05-31 Zheng Jason Geng Method and apparatus for an interactive volumetric three dimensional display
AU2001233019A1 (en) 2000-01-28 2001-08-07 Intersense, Inc. Self-referenced tracking
JP2001307134A (en) 2000-04-19 2001-11-02 Sony Corp Three-dimensional model processor, its method and program providing medium
US6483499B1 (en) 2000-04-21 2002-11-19 Hong Kong Productivity Council 3D sculpturing input device
US20070078552A1 (en) * 2006-01-13 2007-04-05 Outland Research, Llc Gaze-based power conservation for portable media players
US6456262B1 (en) 2000-05-09 2002-09-24 Intel Corporation Microdisplay with eye gaze detection
GB0012275D0 (en) 2000-05-22 2000-07-12 Secr Defence Brit Three dimensional human computer interface
US7042442B1 (en) 2000-06-27 2006-05-09 International Business Machines Corporation Virtual invisible keyboard
US7227526B2 (en) 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
US6686921B1 (en) 2000-08-01 2004-02-03 International Business Machines Corporation Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object
JP3974359B2 (en) 2000-10-31 2007-09-12 株式会社東芝 Online character recognition apparatus and method, computer-readable storage medium, and online character recognition program
US6816615B2 (en) 2000-11-10 2004-11-09 Microsoft Corporation Implicit page breaks for digitally represented handwriting
JP3631151B2 (en) 2000-11-30 2005-03-23 キヤノン株式会社 Information processing apparatus, mixed reality presentation apparatus and method, and storage medium
US20040104935A1 (en) 2001-01-26 2004-06-03 Todd Williamson Virtual reality immersion system
JP2004537082A (en) 2001-01-26 2004-12-09 Zaxel Systems, Inc. Real-time virtual viewpoint in virtual reality environment
US6831632B2 (en) 2001-04-09 2004-12-14 I. C. + Technologies Ltd. Apparatus and methods for hand motion tracking and handwriting recognition
US8300042B2 (en) 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
US8035612B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US20040135744A1 (en) 2001-08-10 2004-07-15 Oliver Bimber Virtual showcases
US6741251B2 (en) 2001-08-16 2004-05-25 Hewlett-Packard Development Company, L.P. Method and apparatus for varying focus in a scene
US6822570B2 (en) 2001-12-20 2004-11-23 Calypso Medical Technologies, Inc. System for spatially adjustable excitation of leadless miniature marker
JP4050055B2 (en) 2002-01-10 2008-02-20 株式会社リコー Handwritten character batch conversion apparatus, handwritten character batch conversion method, and program
US7197165B2 (en) 2002-02-04 2007-03-27 Canon Kabushiki Kaisha Eye tracking using image data
AU2003217587A1 (en) 2002-02-15 2003-09-09 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US10242255B2 (en) 2002-02-15 2019-03-26 Microsoft Technology Licensing, Llc Gesture recognition system using depth perceptive sensors
US9959463B2 (en) 2002-02-15 2018-05-01 Microsoft Technology Licensing, Llc Gesture recognition system using depth perceptive sensors
US7821541B2 (en) 2002-04-05 2010-10-26 Bruno Delean Remote control apparatus using gesture recognition
US7203356B2 (en) 2002-04-11 2007-04-10 Canesta, Inc. Subject segmentation and tracking using 3D sensing technology for video compression in multimedia applications
US7348963B2 (en) 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US20050122308A1 (en) 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
US7170492B2 (en) 2002-05-28 2007-01-30 Reactrix Systems, Inc. Interactive video display system
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
US7370883B2 (en) 2002-06-03 2008-05-13 Intelligent Mechatronic Systems, Inc. Three dimensional occupant position sensor
US20030234346A1 (en) 2002-06-21 2003-12-25 Chi-Lei Kao Touch panel apparatus with optical detection for location
US6857746B2 (en) 2002-07-01 2005-02-22 Io2 Technology, Llc Method and system for free-space imaging display and interface
US7646372B2 (en) 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US7151530B2 (en) 2002-08-20 2006-12-19 Canesta, Inc. System and method for determining an input selected by a user through a virtual interface
SE0202664L (en) 2002-09-09 2003-11-04 Zenterio Ab Graphical user interface for navigation and selection from various selectable options presented on a monitor
US7526120B2 (en) 2002-09-11 2009-04-28 Canesta, Inc. System and method for providing intelligent airbag deployment
CN100377043C (en) 2002-09-28 2008-03-26 皇家飞利浦电子股份有限公司 Three-dimensional hand-written identification process and system thereof
US7427996B2 (en) 2002-10-16 2008-09-23 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US6977654B2 (en) 2002-10-30 2005-12-20 Iviz, Inc. Data visualization with animated speedometer dial charts
US20040174770A1 (en) 2002-11-27 2004-09-09 Rees Frank L. Gauss-Rees parametric ultrawideband system
US7576727B2 (en) 2002-12-13 2009-08-18 Matthew Bell Interactive directed light/sound system
JP2004199496A (en) 2002-12-19 2004-07-15 Sony Corp Information processor and method, and program
CN1512298A (en) 2002-12-26 皇家飞利浦电子股份有限公司 Method for three-dimensional handwriting identification and its system
US7298414B2 (en) 2003-01-29 2007-11-20 Hewlett-Packard Development Company, L.P. Digital camera autofocus using eye focus measurement
US7333113B2 (en) 2003-03-13 2008-02-19 Sony Corporation Mobile motion capture cameras
US7573480B2 (en) 2003-05-01 2009-08-11 Sony Corporation System and method for capturing facial and body motion
KR100518824B1 (en) 2003-03-17 2005-10-05 삼성전자주식회사 Motion recognition system capable of distinguishing a stroke for a writing motion and method thereof
KR100465241B1 (en) 2003-03-17 2005-01-13 삼성전자주식회사 Motion recognition system using an imaginary writing plane and method thereof
US7762665B2 (en) 2003-03-21 2010-07-27 Queen's University At Kingston Method and apparatus for communication between humans and devices
US8745541B2 (en) * 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
US7358972B2 (en) 2003-05-01 2008-04-15 Sony Corporation System and method for capturing facial and body motion
US8072470B2 (en) 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
JP4355341B2 (en) 2003-05-29 2009-10-28 本田技研工業株式会社 Visual tracking using depth data
US7515756B2 (en) 2003-06-23 2009-04-07 Shoestring Research, Llc. Region segmentation and characterization systems and methods for augmented reality
JP4723799B2 (en) 2003-07-08 2011-07-13 Sony Computer Entertainment Inc. Control system and control method
JP3977303B2 (en) 2003-08-21 2007-09-19 シャープ株式会社 Position detection system, transmitter and receiver in position detection system
US7874917B2 (en) * 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US7317450B2 (en) 2003-09-26 2008-01-08 Khomo Malome T Spatial chirographic sign reader
US7590941B2 (en) 2003-10-09 2009-09-15 Hewlett-Packard Development Company, L.P. Communication and collaboration system using rich media environments
JP4794453B2 (en) 2003-10-24 2011-10-19 Intellectual Ventures Holding 67 LLC Method and system for managing an interactive video display system
WO2005041579A2 (en) 2003-10-24 2005-05-06 Reactrix Systems, Inc. Method and system for processing captured image information in an interactive video display system
US7302099B2 (en) 2003-11-10 2007-11-27 Microsoft Corporation Stroke segmentation for template-based cursive handwriting recognition
US7809160B2 (en) 2003-11-14 2010-10-05 Queen's University At Kingston Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
US7963652B2 (en) * 2003-11-14 2011-06-21 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
GB2411532B (en) 2004-02-11 2010-04-28 British Broadcasting Corp Position determination
EP1563799B2 (en) 2004-02-11 2012-11-28 BrainLAB AG Adjustable marker arrangement
WO2005082075A2 (en) 2004-02-25 2005-09-09 The University Of North Carolina At Chapel Hill Systems and methods for imperceptibly embedding structured light patterns in projected color images
US20050215319A1 (en) 2004-03-23 2005-09-29 Harmonix Music Systems, Inc. Method and apparatus for controlling a three-dimensional character in a three-dimensional gaming environment
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US7925549B2 (en) 2004-09-17 2011-04-12 Accenture Global Services Limited Personalized marketing architecture
US7289227B2 (en) 2004-10-01 2007-10-30 Nomos Corporation System and tracker for tracking an object, and related methods
US8487879B2 (en) 2004-10-29 2013-07-16 Microsoft Corporation Systems and methods for interacting with a computer through handwriting to a screen
US20100036717A1 (en) 2004-12-29 2010-02-11 Bernard Trest Dynamic Information System
KR20070095407A (en) 2005-01-26 2007-09-28 Bentley Kinetics, Inc. Method and system for athletic motion analysis and instruction
CN101536494B (en) 2005-02-08 2017-04-26 奥布隆工业有限公司 System and method for gesture based control system
WO2006108017A2 (en) 2005-04-04 2006-10-12 Lc Technologies, Inc. Explicit raytracing for gimbal-based gazepoint trackers
US7428542B1 (en) 2005-05-31 2008-09-23 Reactrix Systems, Inc. Method and system for combining nodes into a mega-node
EP1922696B1 (en) 2005-08-19 2013-05-15 Philips Intellectual Property & Standards GmbH System and method of analyzing the movement of a user
CN101238428B (en) 2005-08-22 2012-05-30 叶勤中 Free-space pointing and handwriting
US8390821B2 (en) 2005-10-11 2013-03-05 Primesense Ltd. Three-dimensional sensing using speckle patterns
CN101288105B (en) 2005-10-11 2016-05-25 苹果公司 Method and system for object reconstruction
WO2007053116A1 (en) 2005-10-31 2007-05-10 National University Of Singapore Virtual interface system
TWI301590B (en) 2005-12-30 2008-10-01 Ibm Handwriting input method, apparatus, system and computer recording medium with a program recorded thereon of capturing video data of real-time handwriting strokes for recognition
CN102169415A (en) 2005-12-30 2011-08-31 苹果公司 Portable electronic device with multi-touch input
JP4151982B2 (en) 2006-03-10 2008-09-17 任天堂株式会社 Motion discrimination device and motion discrimination program
EP2002322B1 (en) 2006-03-23 2009-07-15 Koninklijke Philips Electronics N.V. Hotspots for eye track control of image manipulation
US20070230789A1 (en) 2006-04-03 2007-10-04 Inventec Appliances Corp. Method of controlling an electronic device by handwriting
JP5167248B2 (en) 2006-05-11 2013-03-21 PrimeSense Ltd. Modeling of humanoid shape by depth map
GB2438449C (en) 2006-05-24 2018-05-30 Sony Computer Entertainment Europe Ltd Control of data processing
US9696808B2 (en) * 2006-07-13 2017-07-04 Northrop Grumman Systems Corporation Hand-gesture recognition method
US8972902B2 (en) * 2008-08-22 2015-03-03 Northrop Grumman Systems Corporation Compound gesture recognition
US8180114B2 (en) * 2006-07-13 2012-05-15 Northrop Grumman Systems Corporation Gesture recognition interface system with vertical display
US8589824B2 (en) * 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
KR100776801B1 (en) 2006-07-19 2007-11-19 한국전자통신연구원 Gesture recognition method and system in an image processing system
CN100432897C (en) * 2006-07-28 2008-11-12 上海大学 System and method of contactless position input guided by hand-eye relation
US7934156B2 (en) 2006-09-06 2011-04-26 Apple Inc. Deletion gestures on a portable multifunction device
US8005294B2 (en) 2006-11-29 2011-08-23 The Mitre Corporation Cursive character handwriting recognition system and method
EP2087742A2 (en) 2006-11-29 2009-08-12 F. Poszat HU, LLC Three dimensional projection display
FR2911211B1 (en) 2007-01-05 2009-06-12 Total Immersion Sa Method and devices for real-time insertion of virtual objects into an image stream from data of the real scene represented by these images
US7840031B2 (en) 2007-01-12 2010-11-23 International Business Machines Corporation Tracking a range of body movement based on 3D captured image streams of a user
US7971156B2 (en) 2007-01-12 2011-06-28 International Business Machines Corporation Controlling resource access based on user gesturing in a 3D captured image stream of the user
US20080231926A1 (en) * 2007-03-19 2008-09-25 Klug Michael A Systems and Methods for Updating Dynamic Three-Dimensional Displays with User Input
JP2008242929A (en) 2007-03-28 2008-10-09 Oki Data Corp Handwriting input system
TWI433052B (en) 2007-04-02 2014-04-01 Primesense Ltd Depth mapping using projected patterns
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
US20080252596A1 (en) 2007-04-10 2008-10-16 Matthew Bell Display Using a Three-Dimensional vision System
US20080256494A1 (en) 2007-04-16 2008-10-16 Greenfield Mfg Co Inc Touchless hand gesture device controller
WO2008137708A1 (en) 2007-05-04 2008-11-13 Gesturetek, Inc. Camera-based user input for compact devices
US8100769B2 (en) 2007-05-09 2012-01-24 Nintendo Co., Ltd. System and method for using accelerometer outputs to control an object rotating on a display
CN101303634B (en) 2007-05-09 2012-05-23 鸿富锦精密工业(深圳)有限公司 Portable electronic apparatus
US8065624B2 (en) 2007-06-28 2011-11-22 Panasonic Corporation Virtual keypad systems and methods
US8726194B2 (en) 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
TW200907764A (en) 2007-08-01 2009-02-16 Unique Instr Co Ltd Three-dimensional virtual input and simulation apparatus
US7949157B2 (en) 2007-08-10 2011-05-24 Nitin Afzulpurkar Interpreting sign language gestures
CA2699628A1 (en) 2007-09-14 2009-03-19 Matthew Bell Gesture-based user interactions with status indicators for acceptable inputs in volumetric zones
US20090078473A1 (en) 2007-09-26 2009-03-26 Digital Pen Systems Handwriting Capture For Determining Absolute Position Within A Form Layout Using Pen Position Triangulation
TWI343544B (en) 2007-09-26 2011-06-11 Inventec Appliances Corp A handwriting record device
US8195499B2 (en) 2007-09-26 2012-06-05 International Business Machines Corporation Identifying customer behavioral types from a continuous video stream for use in optimizing loss leader merchandizing
US10235827B2 (en) 2007-11-09 2019-03-19 Bally Gaming, Inc. Interaction with 3D space in a gaming system
US8166421B2 (en) * 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US20120204133A1 (en) 2009-01-13 2012-08-09 Primesense Ltd. Gesture-Based User Interface
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US7889073B2 (en) 2008-01-31 2011-02-15 Sony Computer Entertainment America Llc Laugh detector and system and method for tracking an emotional response to a media presentation
CA2714534C (en) 2008-02-28 2018-03-20 Kenneth Perlin Method and apparatus for providing input to a processor, and a sensor pad
US9772689B2 (en) 2008-03-04 2017-09-26 Qualcomm Incorporated Enhanced gesture-based image manipulation
JP2009258884A (en) * 2008-04-15 2009-11-05 Toyota Central R&D Labs Inc User interface
KR100947990B1 (en) 2008-05-15 2010-03-18 성균관대학교산학협력단 Gaze Tracking Apparatus and Method using Difference Image Entropy
US8165398B2 (en) 2008-05-30 2012-04-24 Sony Ericsson Mobile Communications Ab Method and device for handwriting detection
JP5317169B2 (en) 2008-06-13 2013-10-16 洋 川崎 Image processing apparatus, image processing method, and program
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
US9445193B2 (en) 2008-07-31 2016-09-13 Nokia Technologies Oy Electronic device directional audio capture
US20100103103A1 (en) 2008-08-22 2010-04-29 Palanker Daniel V Method And Device for Input Of Information Using Visible Touch Sensors
US7850306B2 (en) * 2008-08-28 2010-12-14 Nokia Corporation Visual cognition aware display and visual data transmission architecture
US20100053151A1 (en) 2008-09-02 2010-03-04 Samsung Electronics Co., Ltd In-line mediation for manipulating three-dimensional content on a display device
US20100071965A1 (en) 2008-09-23 2010-03-25 Panasonic Corporation System and method for grab and drop gesture recognition
US20100149096A1 (en) 2008-12-17 2010-06-17 Migos Charles J Network management using interaction with display surface
US20100162181A1 (en) 2008-12-22 2010-06-24 Palm, Inc. Interpreting Gesture Input Including Introduction Or Removal Of A Point Of Contact While A Gesture Is In Progress
US20120202569A1 (en) 2009-01-13 2012-08-09 Primesense Ltd. Three-Dimensional User Interface for Game Applications
JP2012515966A (en) 2009-01-26 2012-07-12 Zrro Technologies (2009) Ltd. Device and method for monitoring the behavior of an object
US20100199228A1 (en) 2009-01-30 2010-08-05 Microsoft Corporation Gesture Keyboarding
US8624962B2 (en) 2009-02-02 2014-01-07 Ydreams-Informatica, S.A. Systems and methods for simulating three-dimensional virtual interactions from two-dimensional camera images
US20100235786A1 (en) 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
US8760391B2 (en) 2009-05-22 2014-06-24 Robert W. Hawkins Input cueing emersion system and method
US8619029B2 (en) * 2009-05-22 2013-12-31 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting consecutive gestures
JP4837762B2 (en) * 2009-06-18 2011-12-14 本田技研工業株式会社 In-vehicle device operation device
CN101943982B (en) 2009-07-10 2012-12-12 北京大学 Method for manipulating image based on tracked eye movements
KR20110010906A (en) 2009-07-27 2011-02-08 삼성전자주식회사 Apparatus and method for controlling an electronic machine using user interaction
KR101596890B1 (en) 2009-07-29 2016-03-07 삼성전자주식회사 Apparatus and method for navigating a digital object using gaze information of a user
US8565479B2 (en) 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
JP5490664B2 (en) * 2009-11-18 2014-05-14 パナソニック株式会社 Electrooculogram estimation device, electrooculogram calculation method, eye gaze detection device, wearable camera, head mounted display, and electronic glasses
US8812990B2 (en) 2009-12-11 2014-08-19 Nokia Corporation Method and apparatus for presenting a first person world view of content
US8587532B2 (en) 2009-12-18 2013-11-19 Intel Corporation Multi-feature interactive touch user interface
US8232990B2 (en) 2010-01-05 2012-07-31 Apple Inc. Working with 3D objects
US20110164032A1 (en) 2010-01-07 2011-07-07 Prime Sense Ltd. Three-Dimensional User Interface
US9507418B2 (en) 2010-01-21 2016-11-29 Tobii Ab Eye tracker based contextual action
US20110188116A1 (en) 2010-02-02 2011-08-04 Nikolay Ledentsov Ledentsov Device for generation of three-dimensional images
US8659658B2 (en) 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
JP2013521576A (en) 2010-02-28 2013-06-10 Osterhout Group, Inc. Local advertising content on interactive head-mounted eyepieces
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
US8595645B2 (en) 2010-03-11 2013-11-26 Apple Inc. Device, method, and graphical user interface for marquee scrolling within a display area
US20110248914A1 (en) 2010-04-11 2011-10-13 Sherr Alan B System and Method for Virtual Touch Typing
US20110254765A1 (en) 2010-04-18 2011-10-20 Primesense Ltd. Remote text input using handwriting
KR101334107B1 (en) 2010-04-22 2013-12-16 주식회사 굿소프트웨어랩 Apparatus and Method of User Interface for Manipulating Multimedia Contents in Vehicle
US9311724B2 (en) 2010-04-23 2016-04-12 Handscape Inc. Method for user input from alternative touchpads of a handheld computerized device
US8384683B2 (en) 2010-04-23 2013-02-26 Tong Luo Method for user input from the back panel of a handheld computerized device
US20110292036A1 (en) 2010-05-31 2011-12-01 Primesense Ltd. Depth sensor with application interface
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
US20110310010A1 (en) 2010-06-17 2011-12-22 Primesense Ltd. Gesture based user interface
US8907929B2 (en) 2010-06-29 2014-12-09 Qualcomm Incorporated Touchless sensing and gesture recognition using continuous wave ultrasound signals
US9489040B2 (en) 2010-07-19 2016-11-08 Smart Technologies Ulc Interactive input system having a 3D input space
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
CN102959616B (en) 2010-07-20 2015-06-10 苹果公司 Interactive reality augmentation for natural interaction
US20120019703A1 (en) 2010-07-22 2012-01-26 Thorn Karl Ola Camera system and method of displaying photos
US9098931B2 (en) 2010-08-11 2015-08-04 Apple Inc. Scanning projectors and image capture modules for 3D mapping
US9013430B2 (en) 2010-08-20 2015-04-21 University Of Massachusetts Hand and finger registration for control applications
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
KR101156734B1 (en) 2010-11-05 2012-06-14 전자부품연구원 Table-type interactive 3D system
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US20130154913A1 (en) 2010-12-16 2013-06-20 Siemens Corporation Systems and methods for a gaze and gesture interface
US20120169583A1 (en) 2011-01-05 2012-07-05 Primesense Ltd. Scene profiles for non-tactile user interfaces
KR101873405B1 (en) 2011-01-18 2018-07-02 엘지전자 주식회사 Method for providing user interface using drawn pattern and mobile terminal thereof
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US9329469B2 (en) 2011-02-17 2016-05-03 Microsoft Technology Licensing, Llc Providing an interactive experience using a 3D depth camera and a 3D projector
US8782566B2 (en) 2011-02-22 2014-07-15 Cisco Technology, Inc. Using gestures to schedule and manage meetings
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
CN102306053B (en) 2011-08-29 2014-09-10 Tcl集团股份有限公司 Virtual touch screen-based man-machine interaction method and device and electronic equipment
US9395901B2 (en) 2012-02-08 2016-07-19 Blackberry Limited Portable electronic device and method of controlling same
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
CN104246682B (en) 2012-03-26 2017-08-25 苹果公司 Enhanced virtual touchpad and touch-screen
US9552673B2 (en) 2012-10-17 2017-01-24 Microsoft Technology Licensing, Llc Grasping virtual objects in augmented reality

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4836670A (en) * 1987-08-19 1989-06-06 Center For Innovative Technology Eye movement detector
CN101960409A (en) * 2007-12-31 2011-01-26 微软国际控股私有有限公司 3D pointing system
WO2010089989A1 (en) * 2009-02-05 2010-08-12 パナソニック株式会社 Information display device and information display method

Also Published As

Publication number Publication date
EP2672880A2 (en) 2013-12-18
US20130321265A1 (en) 2013-12-05
US11262840B2 (en) 2022-03-01
EP3527121B1 (en) 2023-08-23
EP2672880A4 (en) 2017-09-13
US9342146B2 (en) 2016-05-17
WO2012107892A2 (en) 2012-08-16
CN103347437A (en) 2013-10-09
EP2672880B1 (en) 2019-05-22
US10031578B2 (en) 2018-07-24
US20160370860A1 (en) 2016-12-22
US9285874B2 (en) 2016-03-15
CN106125921A (en) 2016-11-16
US20140028548A1 (en) 2014-01-30
CN103347437B (en) 2016-06-08
US9454225B2 (en) 2016-09-27
US20130321271A1 (en) 2013-12-05
WO2012107892A3 (en) 2012-11-01
EP3527121A1 (en) 2019-08-21
US20180314329A1 (en) 2018-11-01

Similar Documents

Publication Publication Date Title
CN106125921B (en) Gaze detection in 3D map environment
US20220164032A1 (en) Enhanced Virtual Touchpad
US11157725B2 (en) Gesture-based casting and manipulation of virtual content in artificial-reality environments
US11875012B2 (en) Throwable interface for augmented reality and virtual reality environments
US20220091722A1 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
US10511778B2 (en) Method and apparatus for push interaction
KR101823182B1 (en) Three dimensional user interface effects on a display by using properties of motion
CN107787472A (en) Hover behavior for gaze interaction in virtual reality
AU2015252151B2 (en) Enhanced virtual touchpad and touchscreen
van Wezel Gesture-based interaction concepts for mobile augmented reality applications

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant