US20140218281A1 - Systems and methods for eye gaze determination - Google Patents


Info

Publication number
US20140218281A1
US20140218281A1 (application US14/099,900, US201314099900A)
Authority
US
United States
Prior art keywords
user
eye
wearable device
camera
endo
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/099,900
Other languages
English (en)
Inventor
Gholamreza Amayeh
Dave Leblanc
Zhiming Liu
Michael Vacchina
Steve Wood
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eyefluence Inc
Original Assignee
Eyefluence Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eyefluence Inc filed Critical Eyefluence Inc
Priority to US14/099,900
Publication of US20140218281A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/14 Arrangements specially adapted for eye photography
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B 17/38 Releasing-devices separate from shutter
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 2213/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 2213/02 Viewfinders
    • G03B 2213/025 Sightline detection

Definitions

  • the present invention relates generally to systems and methods for eye tracking that are implemented for gaze determination, e.g., determining locations in space or object(s) being viewed by one or both eyes.
  • the gaze-determination systems and methods herein may enable point-of-gaze determination in a wearable device without the need for head-tracking after calibration.
  • The systems and methods herein relate to gaze tracking using a wearable eye-tracking device that utilizes head-pose estimation to improve gaze accuracy.
  • head-tracking allows the system to know the user's head position in relation to the monitor. This enables the user to accurately interact with an electronic display or other monitor (e.g., control a pointer) using his/her gaze.
  • Many wearable eye-tracking devices do not include head pose estimation. However, minor shifts in head pose can introduce ambiguity in eye trackers that use the eye visual axis only when determining the gaze vector. Knowledge of the head pose can extend the range of accuracy of a gaze-tracking system.
  • the present invention is directed to systems and methods for eye tracking that are implemented for gaze determination, e.g., determining locations in space or object(s) being viewed by one or both eyes.
  • the gaze-determination systems and methods herein may enable point-of-gaze determination in a wearable device without the need for head-tracking after calibration.
  • a method for eye tracking includes one or more steps, such as calibrating a wearable device before the wearable device is worn by a user; placing the wearable device on a user's head adjacent one or both of the user's eyes; calibrating the wearable device after placing the wearable device on the user's head; detecting at least one eye feature of a first eye of the user's eyes; performing a compensation algorithm; and calculating a gaze direction of the user.
  • a system for eye tracking that includes a wearable device configured to be worn on a user's head; an exo-camera on the wearable device configured to provide images of a user's surroundings when the wearable device is worn by the user; an endo-camera on the wearable device configured to provide images of a first eye of the user when the wearable device is worn by the user; and one or more processors configured for one or more of calibrating the wearable device before the wearable device is worn by a user; calibrating the wearable device after placing the wearable device on the user's head; detecting at least one eye feature of a first eye of the user's eyes; performing a compensation algorithm; and calculating a gaze direction of the user.
  • a method for compensating for movement of a wearable eye tracking device relative to a user's eye that includes wearing a wearable device on a user's head such that one or more endo-cameras are positioned to acquire images of one or both of the user's eyes, and an exo-camera is positioned to acquire images of the user's surroundings; calculating the location of features in a user's eye that cannot be directly observed from images of the eye acquired by an endo-camera; and spatially transforming camera coordinate systems of the exo- and endo-cameras to place calculated eye features in a known location and alignment.
  • FIGS. 1A and 1B are perspective and back views, respectively, of an exemplary embodiment of a wearable gaze tracking device.
  • FIG. 2 is a flowchart showing an exemplary method for gaze tracking using a wearable device, such as that shown in FIGS. 1A and 1B .
  • FIG. 3 is a flowchart showing an exemplary method for gaze mapping that may be included in the method shown in FIG. 2 .
  • FIG. 4 is a flowchart showing an exemplary method for pupil detection that may be included in the method shown in FIG. 2 .
  • FIGS. 5 and 6 are schematic representations showing a projected pupil point on a virtual plane after normalization and denormalization using a method, such as that shown in FIG. 2 .
  • the present invention may provide apparatus, systems, and methods for head tracking and gaze tracking that include one or more of the following features:
  • One of the hurdles to accurate gaze-mapping in a mobile wearable eye-tracking device is finding a user-friendly method to determine head pose information.
  • a user is comfortable with a short user-specific calibration.
  • the main advantage of the gaze determination method disclosed herein is that point-of-regard may be maintained with or without head tracking after calibration. This is accomplished by estimating the point in space where the user is looking and projecting it onto the scene image. This allows for gaze determination in a plethora of environments not restricted to a computer desk.
  • FIGS. 1A and 1B show an exemplary embodiment of a wearable gaze-tracking device 10 that includes a wearable device 12 , e.g., a frame for glasses (as shown), or a mask, a headset, a helmet, and the like, that is configured to be worn on a user's head (not shown), an exo-camera 20 (mounted on the device to image the user's surroundings), and one or more endo-cameras 30 (mounted on the device to image one or both of the user's eyes).
  • the device 10 may include one or more light sources, processors, memory, and the like (not shown) coupled to other components for operating the device 10 and/or performing the various functions described herein.
  • Exemplary components e.g., wearable devices, cameras, light sources, processors, communication interfaces, and the like, that may be included in the device 10 are disclosed in U.S. Pat. Nos. 6,541,081 and 7,488,294, and U.S. Publication Nos. 2011/0211056 and 2013/0114850, the entire disclosures of which are expressly incorporated by reference herein.
  • the method includes a) a calibration step 110 in which the wearable device (e.g., device 10 ) is calibrated, a marker detection step 112 , a pupil detection step 114 , a glint detection step 116 , a normalization step 118 , a user calibration step 120 , a gaze mapping step 122 , and a three-dimensional (3D) point-of-regard (POR) step 124 .
  • step 112 , head pose estimation, typically operates substantially continuously.
  • gaze determination (steps 114 - 124 ), including user calibration step 120 , generally begins with i) pupil detection 114 , and ii) glint location (identifying glints reflected off one or both eyes in images acquired by the endo-camera(s) 30 ), where i) and ii) may also be performed in reverse order (glint detection before pupil detection).
  • the camera-to-camera calibration step 110 (calibrating the endo-camera(s) 30 and exo-camera 20 ) is generally performed before the user places the wearable device on their face, e.g., as described below.
  • the first step in calibrating the glint locations in endo-camera images with the light source locations on the wearable device is to acquire a set of perspective images with a secondary reflective surface and light source(s). For example, images of a mirror placed near the working distance of the camera may be acquired, where the mirror's edges are surrounded by LEDs and the mirror is placed in front of the camera such that the glint-LEDs may be seen in the image.
  • the second step is to use a software program to mark and extract the positions of the light sources surrounding the mirror and the reflections in the mirror of the glint-generating light sources on the wearable device.
  • the next step is to determine the homography between the image and the plane of the reflective surface.
  • the aforementioned homography is subsequently applied to the glint light source in the image plane to get the three-dimensional (3D) point corresponding to the light source on the reflective surface.
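The homography steps above can be sketched in code. This is an illustrative reconstruction, not the patent's implementation: a plain DLT (direct linear transform) estimate from four or more marked point correspondences, with all point data hypothetical.

```python
import numpy as np

def find_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst via the DLT.

    src, dst: (N, 2) arrays of corresponding points, N >= 4 with no
    three points collinear (e.g. marked light-source positions and
    their reflections in the mirror).
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the last row of Vt.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pt):
    """Map a 2D image point onto the reflective-surface plane (homogeneous divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

In the patent's flow, `apply_homography` would be applied to each glint light source seen in the image plane to obtain the corresponding point on the reflective surface.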
  • 3D three-dimensional
  • Camera-Camera Calibration Step Two standard checkerboards, and/or other known geometric patterns, are positioned such that one pattern substantially fills the field of view of each of the exo-camera 20 and the endo-camera(s) 30 , e.g., positioned at a near-optimal working distance of the respective camera, i.e., such that the object is near best focus.
  • the position of the checkerboards remains substantially fixed during camera-to-camera calibration.
  • the matrix equation may be solved with SVD to get the camera-to-camera transformation.
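The patent does not spell out the matrix equation it solves, but a standard SVD-based route to a camera-to-camera transformation is the Kabsch/Procrustes solution for the rigid transform between corresponding 3D points (e.g., checkerboard corners expressed in each camera's coordinate system). A sketch under that assumption:

```python
import numpy as np

def rigid_transform(P_endo, P_exo):
    """Least-squares rigid transform (R, t) with P_exo ~= R @ P_endo + t,
    solved with the SVD (Kabsch/Procrustes method).

    P_endo, P_exo: (N, 3) arrays of corresponding 3D points, e.g.
    checkerboard corners reconstructed in each camera's frame.
    """
    c_endo = P_endo.mean(axis=0)
    c_exo = P_exo.mean(axis=0)
    H = (P_endo - c_endo).T @ (P_exo - c_exo)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_exo - R @ c_endo
    return R, t
```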
  • the calibration step 110 may then include a User-Specific Calibration.
  • codes displayed on the monitor in the exo-camera images are registered with an established monitor plane. This provides an estimate of head-pose at each calibration and test point in the user's calibration session.
  • the codes may come in the form of a variety of patterns comprising contrasting geometric phenomena.
  • the patterns may be displayed on the monitor, constructed of other materials and attached to the monitor, formed by a series of light sources arranged in a pattern around the monitor, and the like.
  • head pose may be estimated using an accelerometer, MEMS device, or other orientation sensor. Accelerometers were once bulky, but their overall footprint has been significantly reduced with the incorporation of MEMS technology.
  • mapping refers to a mathematical function.
  • the function takes as a variable raw data and evaluates to calibrated points. For example, a polynomial fit is applied to an entire space, and an output value for any point within that space is determined by the function.
  • user-specific calibration may be performed with interpolation. While mapping covers an entire space of interest, interpolation is performed in a piecewise fashion on specific subregions and localized data. For example, the entire space may be subdivided into four subregions, and linear fits may be applied to each of those subregions by using a weighted average of the corner points of each region. If the number of subregions is increased, the interpolation approaches the polynomial fit of the prior exemplary embodiment.
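The mapping-versus-interpolation distinction in the bullets above can be illustrated in one dimension with hypothetical calibration data (the patent's actual calibration is two-dimensional and user-specific):

```python
import numpy as np

# Hypothetical 1-D calibration data: raw pupil coordinate -> screen coordinate.
raw = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
screen = np.array([0.0, 2.0, 3.9, 6.1, 8.0])

# Mapping: one global polynomial fit over the entire space; any point in the
# space is evaluated through the same function.
coeffs = np.polyfit(raw, screen, deg=2)

def mapped(x):
    return np.polyval(coeffs, x)

# Interpolation: piecewise-linear fits on the subregions between calibration
# points; each query uses only localized data.
def interpolated(x):
    return np.interp(x, raw, screen)
```

As the number of subregions grows, the piecewise result approaches the global fit, as the patent notes.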
  • user-specific calibration may be performed with machine learning. While machine learning may appear to behave like mathematical functions as applied to the mapping method, machine learning techniques may internally represent highly irregular mappings that would otherwise require extremely complex mathematical equations like discontinuous functions and high-order polynomials. Machine learning techniques also make no assumptions about the types of equations they will model, meaning that the training procedure is identical regardless of the type of mapping it will ultimately represent. This eliminates, among other things, the need for the author to understand the relationship between inputs and outputs. They may also execute very quickly making them useful in high performance applications.
  • each eye image in the video sequence is first pre-processed and has a threshold applied to acquire marker candidates as contours.
  • the candidates are evaluated for contour size, roundness, and corner count.
  • the final candidates are extracted and matched to marker codes stored within the system directories. The user's head pose and orientation are calculated relative to the marker corners.
  • Pupil Detection Step An exemplary embodiment of the pupil detection step 114 in FIG. 2 is shown in FIG. 4 .
  • One potential method for pupil detection is to first apply a blob detector, e.g., MSER, to a downsized and thresholded image to identify regions similar in features to a pupil from endo-camera images.
  • the blob detector may, for example, be constrained to find circularity (e.g., eccentricity, low order moments, and the like) and stable regions that resemble a pupil.
  • an algorithm such as Dense Stage I Starburst may be applied to find pupil edges, while ignoring glints.
  • an ellipse is fitted to the pupil edge, for example using methods such as Ransac or Hough transforms. Exemplary methods are disclosed in Chinese Publication No. CN102831610 and U.S. Pat. No. 7,110,568, the entire disclosures of which are expressly incorporated by reference herein.
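As an illustration of the final ellipse-fitting stage, here is a plain least-squares algebraic conic fit to candidate edge points; the patent names RANSAC or Hough transforms for robustness to outliers, which this simple sketch omits:

```python
import numpy as np

def fit_ellipse_center(pts):
    """Least-squares conic fit a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
    to pupil-edge candidates; returns the fitted ellipse center.

    pts: (N, 2) array of edge points, N >= 6. Assumes clean edges; a
    robust (RANSAC/Hough) variant would be used in practice.
    """
    x, y = pts[:, 0], pts[:, 1]
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)            # null vector = conic coefficients
    a, b, c, d, e, _ = Vt[-1]
    # Center is where the conic's gradient vanishes:
    # solve [[2a, b], [b, 2c]] @ (cx, cy) = (-d, -e)
    cx, cy = np.linalg.solve(np.array([[2 * a, b], [b, 2 * c]]), [-d, -e])
    return cx, cy
```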
  • Glint Detection Step For the glint detection step 116 of FIG. 2 , in an exemplary embodiment, an adaptive threshold is first applied to a subwindow of the full-resolution image, where the threshold value is based on the mean and median intensity of the iris. The image contrast is enhanced. Then, the glints are segmented out of the image through a combination of the threshold and edge detection. Dilation and erosion filters are applied to the segmented glints to remove noise. The contours of the glint candidates are determined. The glint candidates are screened against predetermined parameters of the actual glints, e.g., size constraints, odd shapes, eccentricity, and the like. The actual glints are selected from a final pool of candidates based on geometric constraints.
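A minimal sketch of the segmentation-and-screening idea, assuming a simple global threshold and size-only screening (the patent's adaptive thresholding, contrast enhancement, edge detection, and dilation/erosion filtering are omitted for brevity):

```python
import numpy as np

def detect_glints(img, thresh, min_area=2, max_area=50):
    """Toy glint segmentation: threshold, 4-connected component labeling,
    then screen candidates by size; returns blob centroids (row, col)."""
    mask = img > thresh
    seen = np.zeros_like(mask, dtype=bool)
    glints = []
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                stack, blob = [(i, j)], []
                seen[i, j] = True
                while stack:                       # flood-fill one component
                    r, c = stack.pop()
                    blob.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < h and 0 <= cc < w and mask[rr, cc] and not seen[rr, cc]:
                            seen[rr, cc] = True
                            stack.append((rr, cc))
                if min_area <= len(blob) <= max_area:   # size screening
                    blob = np.array(blob, dtype=float)
                    glints.append(tuple(blob.mean(axis=0)))
    return glints
```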
  • Cornea Center Calculation Step Next, the normalization step 118 of FIG. 2 may be performed.
  • the location of the light sources on the device 10 that produce the glints reflected at the anterior corneal surface and the eye tracking camera intrinsic parameters are known.
  • the cornea is assumed to be substantially spherical.
  • Each glint establishes a trajectory of possible cornea center positions in three dimensional (3D) space.
  • Each trajectory pair generates a 3D location on which the cornea center resides.
  • the corneal center coordinates are calculated using the aforementioned information together with a default corneal radius of curvature that matches the population average.
  • the gaze vector in the endo-camera coordinate system may be mapped to either a point-of-regard (POR) overlay, or the monitor plane if mouse or pointer control is required.
  • POR point-of-regard
  • 2D two dimensional
  • head pose continues to be calculated.
  • accurate gaze determination with unrestricted head movement may be accomplished through proper normalization and denormalization of the endo- (toward the eye) and exo- (outward-looking) camera spaces relative to a virtual plane.
  • the image pupil point is projected onto the virtual plane (the mapping between the endo-, exo-, and virtual coordinate spaces is determined during calibration). Then the gaze point is found by intersection of the line formed by the cornea and virtual plane point with the monitor.
  • the center of the pupil is normalized.
  • the cornea center is used as a reference point, and in every frame it is transformed to a specific, predetermined position.
  • the normalized pupil position is then determined on the shifted cornea image.
  • the normalization puts the cornea in the same position in every frame of the endo-camera images, e.g., within an x-y-z reference frame.
  • the normalization includes a rotation that will rotate the cornea about the origin and put it on the z-axis. This rotation is determined by restricting the rotation to a combination of rotation around the x-axis followed by rotation around the y-axis.
  • the translation is determined by calculating the required translation to move the rotated cornea to a predetermined value on the z-axis. Because of the rotation done before translation, the translation only contains a z value.
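The rotation-then-translation normalization described above can be written out directly; the target depth `z_target` is a hypothetical stand-in for the patent's "constant, predefined position" on the z-axis:

```python
import numpy as np

def normalize_cornea(c, z_target=30.0):
    """Rotate the cornea center about x, then y, onto the z-axis, then
    translate along z to a predetermined depth. Returns (R, t) such that
    R @ c + t = (0, 0, z_target). z_target is an assumed value."""
    x, y, z = c
    ax = np.arctan2(y, z)                  # rotation about x zeroes the y-component
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(ax), -np.sin(ax)],
                   [0.0, np.sin(ax), np.cos(ax)]])
    z1 = np.hypot(y, z)                    # depth after the x-rotation
    ay = np.arctan2(-x, z1)                # rotation about y zeroes the x-component
    Ry = np.array([[np.cos(ay), 0.0, np.sin(ay)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(ay), 0.0, np.cos(ay)]])
    R = Ry @ Rx
    # After rotation the cornea sits at (0, 0, ||c||); because rotation
    # precedes translation, the translation has only a z component.
    t = np.array([0.0, 0.0, z_target - np.linalg.norm(c)])
    return R, t
```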
  • FIG. 5 shows how the pupil position may be found on the cornea.
  • the pupil position on the image plane in 3D is retrieved and then the intersection of the line formed by the pupil on the image and origin with the non-normalized cornea is found. That point on the cornea is then normalized along with the cornea.
  • the next step is to determine the normalized pupil on the image plane, e.g., at the normalization step 118 shown in FIG. 2 . This is the intersection of the line formed by the normalized pupil on the cornea and origin with the image plane.
  • FIG. 5 demonstrates this.
  • Normalization puts the cornea in a specific position in the endo-camera coordinate system. Because the cornea does not actually move relative to the screen, the screen effectively moves with it: both are fixed in space for the instant of the frame, and the cameras and virtual plane are likewise fixed together. So when normalization moves the cornea into the specific position in the endo-camera coordinate system, it is functionally the same as the cornea remaining still while the coordinate system moves. The new normalized pupil center is projected onto the virtual plane, but because the virtual plane moved with the endo coordinate system, the resulting gaze point would be wrong. The virtual plane must therefore be denormalized to return it to the proper position for the gaze point, e.g., as shown in FIG. 6 .
  • FIG. 3 shows an exemplary method for performing the normalization step 118 shown in FIG. 2 .
  • the cornea center is rotated about the origin to lie on the z-axis in the endo-camera coordinate system (eye camera coordinate system). Rotation is performed about the x-axis first, then the y-axis. The rotated cornea position is translated to a constant, predefined position along the z-axis.
  • the next step is to transform pupil center data from image pixels to image plane in units of millimeters. Now the point where the line intersecting the endo-camera center and the pupil center on the image plane intersects with the cornea may be determined.
  • the cornea is assumed to be a sphere with a radius centered at the normalized cornea center position.
  • the intersection point is endo-normalized and scaled such that it lies on the image plane, and transformed back into pixels.
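The pixel-to-ray conversion and ray-sphere intersection in the preceding steps can be sketched as follows; the camera intrinsics and the default corneal radius here are illustrative values, not the patent's:

```python
import numpy as np

def pupil_on_cornea(pupil_px, fx, fy, cx, cy, cornea_center, radius=7.8):
    """Intersect the ray through the endo-camera origin and the pupil's
    image-plane point with the (assumed spherical) cornea.

    pupil_px: pupil center in pixels; fx, fy, cx, cy: camera intrinsics.
    radius: assumed default corneal radius of curvature in mm (the patent
    only says a population-average default is used).
    Returns the nearer intersection in camera coordinates, or None.
    """
    # pixels -> a unit ray direction through the camera origin
    d = np.array([(pupil_px[0] - cx) / fx, (pupil_px[1] - cy) / fy, 1.0])
    d /= np.linalg.norm(d)
    c = np.asarray(cornea_center, dtype=float)
    # solve |t*d - c|^2 = radius^2 for t (quadratic, |d| = 1)
    b = d @ c
    disc = b * b - (c @ c - radius * radius)
    if disc < 0:
        return None                        # ray misses the cornea sphere
    t = b - np.sqrt(disc)                  # nearer of the two intersections
    return t * d
```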
  • the normalized pupil is then projected onto a virtual plane, where the polynomial projection function is user-dependent and generated during user calibration.
  • the display origin and normal vector are transformed to the exo-camera coordinate system (scene camera coordinate system).
  • the next step is to transform the cornea center to exo-camera coordinates, followed by transforming the endo-normalization into the exo-camera coordinate system to obtain the exo-normalization transformation.
  • the inverse of the exo-normalization transformation is applied to the projected normalized pupil point in the exo-camera coordinate system, e.g., as shown in FIG. 6 .
  • the intersection of the line (exo-cornea and de-normalized projected normalized pupil) with the exo-screen plane is determined.
  • the final step is to transform the result of that intersection to the screen coordinate system of the monitor, and then to pixel to obtain gaze point on the monitor.
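The line-plane intersection and screen-coordinate conversion in the last two steps can be sketched as follows (the plane parameters, in-plane axes, and pixel scale are all hypothetical):

```python
import numpy as np

def gaze_on_screen(cornea, pupil_pt, plane_origin, plane_normal):
    """Intersect the gaze line through the cornea center and the
    de-normalized projected pupil point with the monitor plane
    (all quantities in exo-camera coordinates). Returns the 3D
    intersection, or None if the line is parallel to the plane."""
    c = np.asarray(cornea, dtype=float)
    d = np.asarray(pupil_pt, dtype=float) - c
    n = np.asarray(plane_normal, dtype=float)
    denom = d @ n
    if abs(denom) < 1e-12:
        return None
    t = ((np.asarray(plane_origin, dtype=float) - c) @ n) / denom
    return c + t * d

def to_screen_pixels(point, plane_origin, u_axis, v_axis, px_per_mm):
    """Project the intersection onto the screen's in-plane unit axes and
    scale to pixels (assumed uniform pixel pitch)."""
    rel = np.asarray(point, dtype=float) - np.asarray(plane_origin, dtype=float)
    return (rel @ np.asarray(u_axis, dtype=float) * px_per_mm,
            rel @ np.asarray(v_axis, dtype=float) * px_per_mm)
```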
  • a mobile gaze-determination system must be robust to small shifts in frame position relative to the face for a given user, in addition to accommodating unrestricted head movement. Both conditions may be accomplished through proper normalization of the endo- (toward the eye) and exo- (outward-looking) spaces relative to the viewing plane.
  • gaze point is determined by convergence of the left and right eye gaze vectors.
  • the information may then be relayed to the user through the mobile device as an overlay on the exo-camera (scene) video images.
  • Point-of-Regard Step Next, at step 124 of FIG. 2 , a 3D POR overlay may be performed.
  • the left gaze line is defined by the de-normalized projected normalized pupil and the cornea in the exo-camera coordinate system for the left eye. The same procedure is applied to the right eye. The intersection (or closest point of intersection) between the two lines is determined and then projected onto the exo-camera images.
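The convergence computation above, finding the intersection (or closest point) between the left and right gaze lines, reduces to a small linear solve. This midpoint-of-common-perpendicular sketch is an illustration, not the patent's implementation; in practice the two gaze lines rarely intersect exactly:

```python
import numpy as np

def por_3d(p1, d1, p2, d2):
    """Closest point between two 3D gaze lines: the midpoint of their
    common perpendicular. p1, d1: a point on and the direction of the
    left gaze line; p2, d2: the same for the right. Assumes the lines
    are not parallel (the solve is singular for parallel gaze lines)."""
    p1, d1 = np.asarray(p1, dtype=float), np.asarray(d1, dtype=float)
    p2, d2 = np.asarray(p2, dtype=float), np.asarray(d2, dtype=float)
    # Minimize |p1 + s*d1 - (p2 + t*d2)|: normal equations in (s, t).
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    s, t = np.linalg.solve(A, b)
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))
```

The returned 3D point would then be projected onto the exo-camera images as the POR overlay.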
  • eye movements may be used interchangeably with other input devices, e.g., that utilize hands, feet, and/or other body movements to direct computer and other control applications.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)
  • Position Input By Displaying (AREA)
US14/099,900 2012-12-06 2013-12-06 Systems and methods for eye gaze determination Abandoned US20140218281A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/099,900 US20140218281A1 (en) 2012-12-06 2013-12-06 Systems and methods for eye gaze determination

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201261734354P 2012-12-06 2012-12-06
US201261734294P 2012-12-06 2012-12-06
US201261734342P 2012-12-06 2012-12-06
US14/099,900 US20140218281A1 (en) 2012-12-06 2013-12-06 Systems and methods for eye gaze determination

Publications (1)

Publication Number Publication Date
US20140218281A1 true US20140218281A1 (en) 2014-08-07

Family

ID=50884065

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/099,900 Abandoned US20140218281A1 (en) 2012-12-06 2013-12-06 Systems and methods for eye gaze determination
US14/099,908 Active 2035-11-02 US10025379B2 (en) 2012-12-06 2013-12-06 Eye tracking wearable devices and methods for use

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/099,908 Active 2035-11-02 US10025379B2 (en) 2012-12-06 2013-12-06 Eye tracking wearable devices and methods for use

Country Status (6)

Country Link
US (2) US20140218281A1 (en)
EP (1) EP2929413B1 (en)
JP (1) JP6498606B2 (en)
KR (1) KR102205374B1 (en)
CN (1) CN104903818B (en)
WO (1) WO2014089542A1 (en)

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140232638A1 (en) * 2013-02-21 2014-08-21 Samsung Electronics Co., Ltd. Method and apparatus for user interface using gaze interaction
WO2016037120A1 (en) * 2014-09-05 2016-03-10 Vision Service Plan Computerized replacement temple for standard eyewear
WO2016187457A3 (en) * 2015-05-20 2017-03-23 Magic Leap, Inc. Tilt shift iris imaging
US20170172408A1 (en) * 2015-11-13 2017-06-22 Hennepin Healthcare System, Inc. Method for predicting convergence disorders caused by concussion or other neuropathology
US9704038B2 (en) 2015-01-07 2017-07-11 Microsoft Technology Licensing, Llc Eye tracking
US9898865B2 (en) 2015-06-22 2018-02-20 Microsoft Technology Licensing, Llc System and method for spawning drawing surfaces
US9910298B1 (en) 2017-04-17 2018-03-06 Vision Service Plan Systems and methods for a computerized temple for use with eyewear
US10082866B2 (en) 2016-04-12 2018-09-25 International Business Machines Corporation Gaze point detection using dynamic facial reference points under varying lighting conditions
WO2018222753A1 (en) * 2017-05-31 2018-12-06 Magic Leap, Inc. Eye tracking calibration techniques
US10215568B2 (en) 2015-01-30 2019-02-26 Vision Service Plan Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete
US20190155380A1 (en) * 2017-11-17 2019-05-23 Dolby Laboratories Licensing Corporation Slippage Compensation in Eye Tracking
US10310269B2 (en) 2016-07-29 2019-06-04 Essilor International Method for virtual testing of at least one lens having a predetermined optical feature and associated device
US10327673B2 (en) * 2015-12-21 2019-06-25 Amer Sports Digital Services Oy Activity intensity level determination
EP3547216A1 (en) * 2018-03-30 2019-10-02 Tobii AB Deep learning for three dimensional (3d) gaze prediction
WO2019185150A1 (en) * 2018-03-29 2019-10-03 Tobii Ab Determining a gaze direction using depth information
WO2019190561A1 (en) * 2018-03-30 2019-10-03 Tobii Ab Deep learning for three dimensional (3d) gaze prediction
US10433768B2 (en) 2015-12-21 2019-10-08 Amer Sports Digital Services Oy Activity intensity level determination
US20200076998A1 (en) * 2013-09-03 2020-03-05 Tobii Ab Portable eye tracking device
US10617342B2 (en) 2014-09-05 2020-04-14 Vision Service Plan Systems, apparatus, and methods for using a wearable device to monitor operator alertness
US20200128902A1 (en) * 2018-10-29 2020-04-30 Holosports Corporation Racing helmet with visual and audible information exchange
US10722128B2 (en) 2018-08-01 2020-07-28 Vision Service Plan Heart rate detection system and method
US10856776B2 (en) 2015-12-21 2020-12-08 Amer Sports Digital Services Oy Activity intensity level determination
WO2021164867A1 (en) * 2020-02-19 2021-08-26 Pupil Labs Gmbh Eye tracking module and head-wearable device
US11137820B2 (en) 2015-12-01 2021-10-05 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11144107B2 (en) 2015-12-01 2021-10-12 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11145272B2 (en) 2016-10-17 2021-10-12 Amer Sports Digital Services Oy Embedded computing device
US11159782B2 (en) * 2016-08-03 2021-10-26 Samsung Electronics Co., Ltd. Electronic device and gaze tracking method of electronic device
US11194161B2 (en) 2018-02-09 2021-12-07 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11210299B2 (en) 2015-12-01 2021-12-28 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11215457B2 (en) 2015-12-01 2022-01-04 Amer Sports Digital Services Oy Thematic map based route optimization
US11284807B2 (en) 2015-12-21 2022-03-29 Amer Sports Digital Services Oy Engaging exercising devices with a mobile device
US11301677B2 (en) * 2019-06-14 2022-04-12 Tobii AB Deep learning for three dimensional (3D) gaze prediction
US20220198789A1 (en) * 2019-06-18 2022-06-23 Pupil Labs Gmbh Systems and methods for determining one or more parameters of a user's eye
US11393251B2 (en) 2018-02-09 2022-07-19 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US20220253135A1 (en) * 2019-07-16 2022-08-11 Magic Leap, Inc. Eye center of rotation determination with one or more eye tracking cameras
US20220269341A1 (en) * 2021-02-19 2022-08-25 Beijing Boe Optoelectronics Technology Co., Ltd. Sight positioning method, head-mounted display device, computer device and computer-readable storage medium
US11537202B2 (en) 2019-01-16 2022-12-27 Pupil Labs Gmbh Methods for generating calibration data for head-wearable devices and eye tracking system
US11541280B2 (en) 2015-12-21 2023-01-03 Suunto Oy Apparatus and exercising device
US11556741B2 (en) 2018-02-09 2023-01-17 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters using a neural network
US11587484B2 (en) 2015-12-21 2023-02-21 Suunto Oy Method for controlling a display
US11607144B2 (en) 2015-12-21 2023-03-21 Suunto Oy Sensor based context management
US11676422B2 (en) 2019-06-05 2023-06-13 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11703938B2 (en) 2016-10-17 2023-07-18 Suunto Oy Embedded computing device
US11838990B2 (en) 2015-12-21 2023-12-05 Suunto Oy Communicating sensor data in wireless communication systems
US20240036318A1 (en) * 2021-12-21 2024-02-01 Alexander Sarris System to superimpose information over a users field of view
US11918375B2 (en) 2014-09-05 2024-03-05 Beijing Zitiao Network Technology Co., Ltd. Wearable environmental pollution monitor computer apparatus, systems, and related methods
EP4468124A1 (en) * 2023-05-25 2024-11-27 Tobii AB Method and system for guiding a user in calibrating an eye tracking device
US12481160B2 (en) 2018-07-19 2025-11-25 Magic Leap, Inc. Content interaction driven by eye metrics

Families Citing this family (162)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9158116B1 (en) 2014-04-25 2015-10-13 Osterhout Group, Inc. Temple and ear horn assembly for headworn computer
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
CA3207408A1 (en) 2011-10-28 2013-06-13 Magic Leap, Inc. System and method for augmented and virtual reality
CN104903818B (zh) 2012-12-06 2018-12-14 Google LLC Eye tracking wearable device and method of use
KR102256992B1 (ko) * 2013-04-25 2021-05-27 Essilor International Method of controlling a head-mounted electro-optical device adapted to a wearer
KR102094965B1 (ko) * 2013-12-30 2020-03-31 Samsung Display Co., Ltd. Alertness glasses, vehicle mirror unit, and display device
US10024679B2 (en) 2014-01-14 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9578307B2 (en) * 2014-01-14 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9915545B2 (en) * 2014-01-14 2018-03-13 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10248856B2 (en) 2014-01-14 2019-04-02 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10360907B2 (en) 2014-01-14 2019-07-23 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US20150228119A1 (en) 2014-02-11 2015-08-13 Osterhout Group, Inc. Spatial location presentation in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US12105281B2 (en) 2014-01-21 2024-10-01 Mentor Acquisition One, Llc See-through computer display systems
US12093453B2 (en) 2014-01-21 2024-09-17 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US20160018651A1 (en) 2014-01-24 2016-01-21 Osterhout Group, Inc. See-through computer display systems
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US12112089B2 (en) 2014-02-11 2024-10-08 Mentor Acquisition One, Llc Spatial location presentation in head worn computing
GB2526515A (en) * 2014-03-25 2015-12-02 Jaguar Land Rover Ltd Image capture system
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US20160137312A1 (en) 2014-05-06 2016-05-19 Osterhout Group, Inc. Unmanned aerial vehicle launch system
JP2017527036A (ja) 2014-05-09 2017-09-14 Google Inc. Systems and methods for using eye signals in secure mobile communications
US10564714B2 (en) 2014-05-09 2020-02-18 Google Llc Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US9818114B2 (en) 2014-08-11 2017-11-14 Mastercard International Incorporated Systems and methods for performing payment card transactions using a wearable computing device
US10024678B2 (en) 2014-09-17 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable clip for providing social and environmental awareness
US9922236B2 (en) 2014-09-17 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
US9568603B2 (en) * 2014-11-14 2017-02-14 Microsoft Technology Licensing, Llc Eyewear-mountable eye tracking device
US10490102B2 (en) 2015-02-10 2019-11-26 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for braille assistance
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
US9972216B2 (en) 2015-03-20 2018-05-15 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for storing and playback of information for blind users
IL295437B2 (en) 2015-05-19 2024-11-01 Magic Leap Inc Dual composite light field device
EP3923229A1 (en) 2015-06-24 2021-12-15 Magic Leap, Inc. Augmented reality devices, systems and methods for purchasing
US9939644B2 (en) 2015-06-25 2018-04-10 Intel Corporation Technologies for controlling vision correction of a wearable computing device
US10139966B2 (en) 2015-07-22 2018-11-27 Osterhout Group, Inc. External user interface for head worn computing
US11003246B2 (en) 2015-07-22 2021-05-11 Mentor Acquisition One, Llc External user interface for head worn computing
JP2017060078A (ja) * 2015-09-18 2017-03-23 Casio Computer Co., Ltd. Image recording system, user-worn device, imaging device, image processing device, image recording method, and program
US10618521B2 (en) * 2015-09-21 2020-04-14 Ford Global Technologies, Llc Wearable in-vehicle eye gaze detection
WO2017116662A1 (en) * 2015-12-28 2017-07-06 Artilux Corporation Eye gesture tracking
NZ744383A (en) * 2016-01-19 2019-10-25 Magic Leap Inc Augmented reality systems and methods utilizing reflections
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10850116B2 (en) 2016-12-30 2020-12-01 Mentor Acquisition One, Llc Head-worn therapy device
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US9880441B1 (en) 2016-09-08 2018-01-30 Osterhout Group, Inc. Electrochromic systems for head-worn computer systems
US9826299B1 (en) 2016-08-22 2017-11-21 Osterhout Group, Inc. Speaker systems for head-worn computer systems
CN108701227B (zh) 2016-03-07 2022-01-14 Magic Leap, Inc. Blue light adjustment for biometric security
US10024680B2 (en) 2016-03-11 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Step based guidance system
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
US9910284B1 (en) 2016-09-08 2018-03-06 Osterhout Group, Inc. Optical systems for head-worn computers
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
US9958275B2 (en) 2016-05-31 2018-05-01 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for wearable smart device communications
US10561519B2 (en) 2016-07-20 2020-02-18 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device having a curved back to reduce pressure on vertebrae
US10451895B2 (en) 2016-07-25 2019-10-22 Magic Leap, Inc. Light field processor system
WO2018023242A1 (zh) * 2016-07-31 2018-02-08 Yang Jie Automatic photographing method and glasses
WO2018023243A1 (zh) * 2016-07-31 2018-02-08 Yang Jie Patent information pushing method for automatic photographing, and glasses
WO2018023245A1 (zh) * 2016-07-31 2018-02-08 Yang Jie Method for automatically photographing and transmitting, and glasses
WO2018023247A1 (zh) * 2016-07-31 2018-02-08 Yang Jie Data acquisition method for automatic photographing and transmission technology, and glasses
WO2018023246A1 (zh) * 2016-07-31 2018-02-08 Yang Jie Information pushing method during photographing, and glasses
US10268268B1 (en) 2016-09-02 2019-04-23 Facebook Technologies, Llc Waveguide integrated eye tracking
JP2018061622A (ja) * 2016-10-11 2018-04-19 Optos Plc Fundus observation device
US10432851B2 (en) 2016-10-28 2019-10-01 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device for detecting photography
US10032053B2 (en) * 2016-11-07 2018-07-24 Rockwell Automation Technologies, Inc. Tag based location
USD827143S1 (en) 2016-11-07 2018-08-28 Toyota Motor Engineering & Manufacturing North America, Inc. Blind aid device
US10012505B2 (en) 2016-11-11 2018-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable system for providing walking directions
US10521669B2 (en) 2016-11-14 2019-12-31 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing guidance or feedback to a user
US10168531B1 (en) 2017-01-04 2019-01-01 Facebook Technologies, Llc Lightfield waveguide integrated eye tracking
US10485420B2 (en) * 2017-02-17 2019-11-26 Analog Devices Global Unlimited Company Eye gaze tracking
US20180255250A1 (en) * 2017-03-03 2018-09-06 Microsoft Technology Licensing, Llc Pulsed, gated infrared illuminated camera systems and processes for eye tracking in high ambient light environments
CN106842625B (zh) * 2017-03-03 2020-03-17 Southwest Jiaotong University Target tracking method based on feature consensus
KR102713228B1 (ko) 2017-03-30 2024-10-02 Magic Leap, Inc. Non-blocking dual driver earphones
US10977858B2 (en) 2017-03-30 2021-04-13 Magic Leap, Inc. Centralized rendering
IL269861B2 (en) 2017-04-14 2023-11-01 Magic Leap Inc Multimodal eye tracking
US20180336772A1 (en) * 2017-05-19 2018-11-22 Hcl Technologies Limited System and method for alerting a user within a warehouse
US11079522B1 (en) 2017-05-31 2021-08-03 Magic Leap, Inc. Fiducial design
US10810773B2 (en) * 2017-06-14 2020-10-20 Dell Products, L.P. Headset display control based upon a user's pupil state
EP3628076B1 (en) * 2017-07-13 2024-02-14 Huawei Technologies Co., Ltd. Dual mode headset
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
KR102633727B1 (ko) 2017-10-17 2024-02-05 Magic Leap, Inc. Mixed reality spatial audio
FI20175960A1 (en) 2017-10-30 2019-05-01 Univ Of Eastern Finland Procedure and apparatus for gaze detection
USD849822S1 (en) * 2017-12-29 2019-05-28 Aira Tech Corp. Smart glasses for interactive use cases
IL276510B2 (en) 2018-02-15 2024-02-01 Magic Leap Inc Virtual reverberation in mixed reality
CA3090178A1 (en) 2018-02-15 2019-08-22 Magic Leap, Inc. Mixed reality musical instrument
CA3090281A1 (en) 2018-02-15 2019-08-22 Magic Leap, Inc. Dual listener positions for mixed reality
US10281085B1 (en) * 2018-03-30 2019-05-07 Faspro Systems Co., Ltd. Head-mounted wireless photographic apparatus
CN112602005A (zh) 2018-04-24 2021-04-02 Mentor Acquisition One, LLC See-through computer display systems with vision correction and increased content density
EP3804132A1 (en) 2018-05-30 2021-04-14 Magic Leap, Inc. Index scheming for filter parameters
CN110557552A (zh) * 2018-05-31 2019-12-10 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Portable image acquisition device
US10667072B2 (en) 2018-06-12 2020-05-26 Magic Leap, Inc. Efficient rendering of virtual soundfields
CN110596889A (zh) * 2018-06-13 2019-12-20 Tobii AB Eye tracking device and method of manufacturing an eye tracking device
US10602292B2 (en) 2018-06-14 2020-03-24 Magic Leap, Inc. Methods and systems for audio signal filtering
EP3807872B1 (en) 2018-06-14 2024-04-10 Magic Leap, Inc. Reverberation gain normalization
EP3808108A4 (en) 2018-06-18 2022-04-13 Magic Leap, Inc. SPATIAL AUDIO FOR INTERACTIVE AUDIO ENVIRONMENTS
EP3811360A4 (en) 2018-06-21 2021-11-24 Magic Leap, Inc. PORTABLE SYSTEM VOICE PROCESSING
EP3827359B1 (en) 2018-07-24 2024-04-03 Magic Leap, Inc. Application sharing using scenegraphs
WO2020023721A1 (en) * 2018-07-25 2020-01-30 Natus Medical Incorporated Real-time removal of ir led reflections from an image
EP3853654B1 (en) * 2018-09-21 2025-11-19 Dolby Laboratories Licensing Corporation Incorporating components inside optical stacks of head-mounted devices
CN117572954A (zh) 2018-09-25 2024-02-20 Magic Leap, Inc. Systems and methods for augmented reality
CN113170273B (zh) 2018-10-05 2023-03-28 Magic Leap, Inc. Interaural time difference crossfader for binaural audio rendering
CN113170272B (zh) 2018-10-05 2023-04-04 Magic Leap, Inc. Near-field audio rendering
JP7448530B2 (ja) 2018-10-09 2024-03-12 Magic Leap, Inc. Systems and methods for virtual and augmented reality
EP3871062B1 (en) 2018-10-24 2025-11-26 Magic Leap, Inc. Asynchronous asic
TWI699671B (zh) * 2018-12-12 2020-07-21 National Taiwan University Method for reducing eye tracking computation and eye tracking device thereof
WO2020140078A1 (en) 2018-12-27 2020-07-02 Magic Leap, Inc. Systems and methods for virtual and augmented reality
EP3931827B1 (en) 2019-03-01 2025-03-26 Magic Leap, Inc. Determining input for speech processing engine
WO2020198385A1 (en) 2019-03-25 2020-10-01 Magic Leap, Inc. Systems and methods for virtual and augmented reality
US10877268B2 (en) * 2019-04-16 2020-12-29 Facebook Technologies, Llc Active control of in-field light sources of a head mounted display
CN113994424B (zh) 2019-04-19 2025-04-15 Magic Leap, Inc. Identifying input for a speech recognition engine
WO2020231518A1 (en) 2019-05-10 2020-11-19 Verily Life Sciences Llc Adjustable optical system for intraocular micro-display
WO2020231517A1 (en) 2019-05-10 2020-11-19 Verily Life Sciences Llc Natural physio-optical user interface for intraocular microdisplay
WO2020247863A1 (en) 2019-06-06 2020-12-10 Magic Leap, Inc. Photoreal character configurations for spatial computing
US11328740B2 (en) 2019-08-07 2022-05-10 Magic Leap, Inc. Voice onset detection
US11704874B2 (en) 2019-08-07 2023-07-18 Magic Leap, Inc. Spatial instructions and guides in mixed reality
EP4046138B1 (en) 2019-10-18 2024-11-27 Magic Leap, Inc. Gravity estimation and bundle adjustment for visual-inertial odometry
WO2021081435A1 (en) 2019-10-25 2021-04-29 Magic Leap, Inc. Reverberation fingerprint estimation
EP4049117A4 (en) 2019-10-25 2022-12-14 Magic Leap, Inc. UNEVEN STEREO PLAYBACK
US11959997B2 (en) 2019-11-22 2024-04-16 Magic Leap, Inc. System and method for tracking a wearable device
JP7670714B2 (ja) 2019-12-04 2025-04-30 Magic Leap, Inc. Variable pitch color emitting display
US11627430B2 (en) 2019-12-06 2023-04-11 Magic Leap, Inc. Environment acoustics persistence
EP4073689A4 (en) 2019-12-09 2022-12-14 Magic Leap, Inc. Systems and methods for operating a head-mounted display system based on user identity
US11337023B2 (en) 2019-12-20 2022-05-17 Magic Leap, Inc. Physics-based audio and haptic synthesis
EP4093265B1 (en) * 2020-01-22 2024-08-21 Dolby Laboratories Licensing Corporation Electrooculogram measurement and eye-tracking
US11335070B2 (en) 2020-02-10 2022-05-17 Magic Leap, Inc. Dynamic colocation of virtual content
US11778410B2 (en) 2020-02-14 2023-10-03 Magic Leap, Inc. Delayed audio following
CN115698818B (zh) 2020-02-14 2024-01-23 Magic Leap, Inc. Session manager
US11494528B2 (en) 2020-02-14 2022-11-08 Magic Leap, Inc. Tool bridge
JP7681609B2 (ja) 2020-02-14 2025-05-22 Magic Leap, Inc. 3D object annotation
US11910183B2 (en) 2020-02-14 2024-02-20 Magic Leap, Inc. Multi-application audio rendering
WO2021178454A1 (en) 2020-03-02 2021-09-10 Magic Leap, Inc. Immersive audio platform
US11917384B2 (en) 2020-03-27 2024-02-27 Magic Leap, Inc. Method of waking a device using spoken voice commands
JP2023527561A (ja) 2020-05-29 2023-06-29 Magic Leap, Inc. Surface-appropriate collisions
US11561613B2 (en) 2020-05-29 2023-01-24 Magic Leap, Inc. Determining angular acceleration
US12417766B2 (en) 2020-09-30 2025-09-16 Magic Leap, Inc. Voice user interface using non-linguistic input
EP4305512A4 (en) 2021-03-12 2024-11-20 Magic Leap, Inc. ATHERMALIZATION CONCEPTS FOR POLYMER EYEPIECES USED IN AUGMENTED REALITY OR MIXED REALITY DEVICES
US12135471B2 (en) 2021-09-10 2024-11-05 Tectus Corporation Control of an electronic contact lens using eye gestures
US20230122300A1 (en) * 2021-10-14 2023-04-20 Microsoft Technology Licensing, Llc Eye-tracking waveguides
US11592899B1 (en) 2021-10-28 2023-02-28 Tectus Corporation Button activation within an eye-controlled user interface
US11619994B1 (en) 2022-01-14 2023-04-04 Tectus Corporation Control of an electronic contact lens using pitch-based eye gestures
US12118138B2 (en) 2022-02-14 2024-10-15 Tectus Corporation Vergence authentication
US11874961B2 (en) 2022-05-09 2024-01-16 Tectus Corporation Managing display of an icon in an eye tracking augmented reality device
EP4361704A1 (en) * 2022-10-27 2024-05-01 Pupil Labs GmbH Eye tracking module and head-wearable device
US20240430561A1 (en) * 2023-05-15 2024-12-26 Apple Inc. Head mountable display
FI20236164A1 (en) * 2023-10-19 2025-04-20 Dispelix Oy Display device with integrated eye tracking
CN119094745A (zh) * 2024-10-29 2024-12-06 Qingdao GoerTek Vision Technology Co., Ltd. Calibration method, apparatus, device, and storage medium for an eye tracking camera

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050073136A1 (en) * 2002-10-15 2005-04-07 Volvo Technology Corporation Method and arrangement for interpreting a subject's head and eye activity
US20130201291A1 (en) * 2012-02-08 2013-08-08 Microsoft Corporation Head pose tracking using a depth camera
US20130242056A1 (en) * 2012-03-14 2013-09-19 Rod G. Fleck Imaging structure emitter calibration
US20130304479A1 (en) * 2012-05-08 2013-11-14 Google Inc. Sustained Eye Gaze for Determining Intent to Interact
US8942419B1 (en) * 2012-01-06 2015-01-27 Google Inc. Position estimation using predetermined patterns of light sources

Family Cites Families (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3863243A (en) 1972-01-19 1975-01-28 Max Skolnick Sleep inhibiting alarm
US3798599A (en) 1972-02-24 1974-03-19 H Kafafian Single input controller for a communication system
US4359724A (en) 1980-04-28 1982-11-16 Ronald R. Zimmerman Eyelid movement detector
DE3777461D1 (de) 1986-06-20 1992-04-23 Matsushita Electric Industrial Co Ltd Optical recording and reproducing apparatus.
EP0280124A1 (en) 1987-02-12 1988-08-31 Omron Tateisi Electronics Co. Doze detector
US4850691A (en) 1987-03-18 1989-07-25 University Of Illinois Method and apparatus for determining pupillary response parameters
US4815839A (en) 1987-08-03 1989-03-28 Waldorf Ronald A Infrared/video electronystagmographic apparatus
US5214456A (en) 1991-10-09 1993-05-25 Computed Anatomy Incorporated Mapping of corneal topography with display of pupil perimeter
JPH05191683A (ja) * 1992-01-14 1993-07-30 Canon Inc Photographing and recording apparatus
US5345281A (en) 1992-12-17 1994-09-06 John Taboada Eye tracking system and method
US5517021A (en) 1993-01-19 1996-05-14 The Research Foundation State University Of New York Apparatus and method for eye tracking interface
TW247985B (en) * 1993-04-22 1995-05-21 Canon Kk Image-taking apparatus
JPH07146431A (ja) * 1993-11-25 1995-06-06 Canon Inc Camera
US5402109A (en) 1993-04-29 1995-03-28 Mannik; Kallis H. Sleep prevention device for automobile drivers
US5481622A (en) 1994-03-01 1996-01-02 Rensselaer Polytechnic Institute Eye tracking apparatus and method employing grayscale threshold values
JPH07283974A (ja) * 1994-04-12 1995-10-27 Canon Inc Video camera equipped with a gaze detection device
JPH086708A (ja) 1994-04-22 1996-01-12 Canon Inc Display device
CA2126142A1 (en) 1994-06-17 1995-12-18 David Alexander Kahn Visual communications apparatus
US5469143A (en) 1995-01-10 1995-11-21 Cooper; David E. Sleep awakening device for drivers of motor vehicles
US5566067A (en) 1995-03-23 1996-10-15 The President And Fellows Of Harvard College Eyelid vigilance detector system
US5689241A (en) 1995-04-24 1997-11-18 Clarke, Sr.; James Russell Sleep detection and driver alert apparatus
US5570698A (en) 1995-06-02 1996-11-05 Siemens Corporate Research, Inc. System for monitoring eyes for detecting sleep behavior
US5682144A (en) 1995-11-20 1997-10-28 Mannik; Kallis Hans Eye actuated sleep prevention devices and other eye controlled devices
US6003991A (en) 1996-02-17 1999-12-21 Erik Scott Viirre Eye examination apparatus and method for remote examination of a patient by a health professional
US5912721A (en) 1996-03-13 1999-06-15 Kabushiki Kaisha Toshiba Gaze detection apparatus and its method as well as information display apparatus
US5886683A (en) 1996-06-25 1999-03-23 Sun Microsystems, Inc. Method and apparatus for eyetrack-driven information retrieval
US6246344B1 (en) 1996-08-19 2001-06-12 William C. Torch Method and apparatus for voluntary communication
US6163281A (en) 1996-08-19 2000-12-19 Torch; William C. System and method for communication using eye movement
US6542081B2 (en) 1996-08-19 2003-04-01 William C. Torch System and method for monitoring eye movement
US5748113A (en) 1996-08-19 1998-05-05 Torch; William C. Method and apparatus for communication
US5867587A (en) 1997-05-19 1999-02-02 Northrop Grumman Corporation Impaired operator detection and warning system employing eyeblink analysis
WO1999018842A1 (en) 1997-10-16 1999-04-22 The Board Of Trustees Of The Leland Stanford Junior University Method for inferring mental states from eye movements
US6007202A (en) 1997-10-23 1999-12-28 Lasersight Technologies, Inc. Eye illumination system and method
DE19803158C1 (de) 1998-01-28 1999-05-06 Daimler Chrysler Ag Device for determining a state of vigilance
US6204828B1 (en) 1998-03-31 2001-03-20 International Business Machines Corporation Integrated gaze/manual cursor positioning system
US6867752B1 (en) 1998-08-31 2005-03-15 Semiconductor Energy Laboratory Co., Ltd. Portable information processing system
US6243076B1 (en) 1998-09-01 2001-06-05 Synthetic Environments, Inc. System and method for controlling host system interface with point-of-interest data
US6087941A (en) 1998-09-01 2000-07-11 Ferraz; Mark Warning device for alerting a person falling asleep
AUPP612998A0 (en) 1998-09-23 1998-10-15 Canon Kabushiki Kaisha Multiview multimedia generation system
US6526159B1 (en) 1998-12-31 2003-02-25 Intel Corporation Eye tracking for resource and power management
US6577329B1 (en) 1999-02-25 2003-06-10 International Business Machines Corporation Method and system for relevance feedback through gaze tracking and ticker interfaces
GB2348520B (en) 1999-03-31 2003-11-12 Ibm Assisting user selection of graphical user interface elements
US6116736A (en) 1999-04-23 2000-09-12 Neuroptics, Inc. Pupilometer with pupil irregularity detection capability
JP2001183735A (ja) * 1999-12-27 2001-07-06 Fuji Photo Film Co Ltd Imaging apparatus and method
JP2001281520A (ja) * 2000-03-30 2001-10-10 Minolta Co Ltd Optical device
US6456262B1 (en) 2000-05-09 2002-09-24 Intel Corporation Microdisplay with eye gaze detection
US6608615B1 (en) 2000-09-19 2003-08-19 Intel Corporation Passive gaze-driven browsing
ES2401132T3 (es) 2000-10-07 2013-04-17 Metaio Gmbh Device and method for determining the orientation of an eye
DE10103922A1 (de) 2001-01-30 2002-08-01 Physoptics Opto Electronic GmbH Interactive data viewing and operating system
US20030038754A1 (en) 2001-08-22 2003-02-27 Mikael Goldstein Method and apparatus for gaze responsive text presentation in RSVP display
AUPR872301A0 (en) 2001-11-08 2001-11-29 Sleep Diagnostics Pty Ltd Alertness monitor
US6712468B1 (en) 2001-12-12 2004-03-30 Gregory T. Edwards Techniques for facilitating use of eye tracking data
US6919907B2 (en) * 2002-06-20 2005-07-19 International Business Machines Corporation Anticipatory image capture for stereoscopic remote viewing with foveal priority
US20040061680A1 (en) 2002-07-10 2004-04-01 John Taboada Method and apparatus for computer control
US7347551B2 (en) 2003-02-13 2008-03-25 Fergason Patent Properties, Llc Optical system for monitoring eye movement
US7881493B1 (en) 2003-04-11 2011-02-01 Eyetools, Inc. Methods and apparatuses for use of eye interpretation information
US20050047629A1 (en) 2003-08-25 2005-03-03 International Business Machines Corporation System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking
US9274598B2 (en) 2003-08-25 2016-03-01 International Business Machines Corporation System and method for selecting and activating a target object using a combination of eye gaze and key presses
US7365738B2 (en) 2003-12-02 2008-04-29 International Business Machines Corporation Guides and indicators for eye movement monitoring systems
JP2005252732A (ja) 2004-03-04 2005-09-15 Olympus Corp Imaging device
US7561143B1 (en) 2004-03-19 2009-07-14 The University of the Arts Using gaze actions to interact with a display
GB2412431B (en) * 2004-03-25 2007-11-07 Hewlett Packard Development Co Self-calibration for an eye tracker
CN1960670B (zh) 2004-04-01 2011-02-23 William C. Torch Biosensors, communicators, and controllers for monitoring eye movement and methods for using them
ES2568506T3 (es) 2004-06-18 2016-04-29 Tobii Ab Eye control of a computer apparatus
ATE526866T1 (de) 2005-03-04 2011-10-15 Sleep Diagnostics Pty Ltd Wachheitsmessung
JP2006345276A (ja) * 2005-06-09 2006-12-21 Fujifilm Holdings Corp Imaging device
US8120577B2 (en) 2005-10-28 2012-02-21 Tobii Technology Ab Eye tracker with visual feedback
US7429108B2 (en) 2005-11-05 2008-09-30 Outland Research, Llc Gaze-responsive interface to enhance on-screen user reading tasks
US7760910B2 (en) 2005-12-12 2010-07-20 Eyetools, Inc. Evaluation of visual stimuli using existing viewing data
US8793620B2 (en) 2011-04-21 2014-07-29 Sony Computer Entertainment Inc. Gaze-assisted computer interface
JP2008288767A (ja) 2007-05-16 2008-11-27 Sony Corp Information processing apparatus and method, and program
WO2009073584A1 (en) 2007-11-29 2009-06-11 Oculis Labs, Inc. Method and apparatus for display of secure visual content
US20100045596A1 (en) 2008-08-21 2010-02-25 Sony Ericsson Mobile Communications Ab Discreet feature highlighting
US7850306B2 (en) 2008-08-28 2010-12-14 Nokia Corporation Visual cognition aware display and visual data transmission architecture
US20100245765A1 (en) * 2008-10-28 2010-09-30 Dyer Holdings, Llc Video infrared ophthalmoscope
US8398239B2 (en) 2009-03-02 2013-03-19 Honeywell International Inc. Wearable eye tracking system
JP2010213214A (ja) * 2009-03-12 2010-09-24 Brother Ind Ltd Head-mounted display
EP2238889B1 (en) 2009-04-01 2011-10-12 Tobii Technology AB Adaptive camera and illuminator eyetracker
WO2010118292A1 (en) 2009-04-09 2010-10-14 Dynavox Systems, Llc Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods
CN101943982B (zh) 2009-07-10 2012-12-12 Peking University Image manipulation based on tracked eye movement
EP3338621B1 (en) 2009-07-16 2019-08-07 Tobii AB Eye detection unit using parallel data flow
JP5613025B2 (ja) * 2009-11-18 2014-10-22 Panasonic Corporation Gaze detection device, gaze detection method, electrooculography measurement device, wearable camera, head-mounted display, electronic glasses, and ophthalmic diagnostic device
JP5679655B2 (ja) * 2009-12-24 2015-03-04 Lenovo Innovations Limited (Hong Kong) Mobile terminal device and display control method therefor
US9507418B2 (en) 2010-01-21 2016-11-29 Tobii Ab Eye tracker based contextual action
US20130314303A1 (en) * 2010-02-28 2013-11-28 Osterhout Group, Inc. Ar glasses with user action control of and between internal and external applications with feedback
US9182596B2 (en) * 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US8890946B2 (en) 2010-03-01 2014-11-18 Eyefluence, Inc. Systems and methods for spatially controlled scene illumination
US8593375B2 (en) 2010-07-23 2013-11-26 Gregory A Maltz Eye gaze user interface and method
US8531394B2 (en) 2010-07-23 2013-09-10 Gregory A. Maltz Unitized, vision-controlled, wireless eyeglasses transceiver
US9213405B2 (en) 2010-12-16 2015-12-15 Microsoft Technology Licensing, Llc Comprehension and intent-based content for augmented reality displays
EP2499960B1 (en) * 2011-03-18 2015-04-22 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Method for determining at least one parameter of two eyes by setting data rates and optical measuring device
US8643680B2 (en) 2011-04-08 2014-02-04 Amazon Technologies, Inc. Gaze-based content display
US8911087B2 (en) 2011-05-20 2014-12-16 Eyefluence, Inc. Systems and methods for measuring reactions of head, eyes, eyelids and pupils
US8885877B2 (en) 2011-05-20 2014-11-11 Eyefluence, Inc. Systems and methods for identifying gaze tracking scene reference locations
DK2587341T3 (en) 2011-10-27 2017-04-03 Tobii Ab Power management in an eye tracking system
US8929589B2 (en) 2011-11-07 2015-01-06 Eyefluence, Inc. Systems and methods for high-resolution gaze tracking
KR101891786B1 (ko) 2011-11-29 2018-08-27 Samsung Electronics Co., Ltd. Method for operating user functions based on eye tracking and terminal supporting the same
US8955973B2 (en) 2012-01-06 2015-02-17 Google Inc. Method and system for input detection using structured light projection
US9171198B1 (en) * 2012-04-02 2015-10-27 Google Inc. Image capture technique
WO2013169237A1 (en) 2012-05-09 2013-11-14 Intel Corporation Eye tracking based selective accentuation of portions of a display
DE102012105664A1 (de) 2012-06-28 2014-04-10 Oliver Hein Method and device for encoding eye and gaze path data
US9189064B2 (en) 2012-09-05 2015-11-17 Apple Inc. Delay of display event based on user gaze
US20140092006A1 (en) 2012-09-28 2014-04-03 Joshua Boelter Device and method for modifying rendering based on viewer focus area from eye tracking
CN104903818B (zh) 2012-12-06 2018-12-14 Google LLC Eye tracking wearable device and method of use
WO2014111924A1 (en) 2013-01-15 2014-07-24 Poow Innovation Ltd. Dynamic icons
US9829971B2 (en) 2013-01-21 2017-11-28 Facebook, Inc. Systems and methods of eye tracking control
KR102093198B1 (ko) 2013-02-21 2020-03-25 Samsung Electronics Co., Ltd. User interface method and apparatus using gaze recognition
KR102175853B1 (ko) 2013-02-22 2020-11-06 Samsung Electronics Co., Ltd. Operation control method and electronic device therefor
ES2731560T3 (es) 2013-03-01 2019-11-15 Tobii AB Gaze interaction with delayed warping
US9870060B2 (en) 2013-12-31 2018-01-16 Google Llc Systems and methods for gaze-based media selection and editing

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050073136A1 (en) * 2002-10-15 2005-04-07 Volvo Technology Corporation Method and arrangement for interpreting a subject's head and eye activity
US7460940B2 (en) * 2002-10-15 2008-12-02 Volvo Technology Corporation Method and arrangement for interpreting a subject's head and eye activity
US8942419B1 (en) * 2012-01-06 2015-01-27 Google Inc. Position estimation using predetermined patterns of light sources
US20150098620A1 (en) * 2012-01-06 2015-04-09 Google Inc. Position Estimation
US20130201291A1 (en) * 2012-02-08 2013-08-08 Microsoft Corporation Head pose tracking using a depth camera
US20130242056A1 (en) * 2012-03-14 2013-09-19 Rod G. Fleck Imaging structure emitter calibration
US20130304479A1 (en) * 2012-05-08 2013-11-14 Google Inc. Sustained Eye Gaze for Determining Intent to Interact

Cited By (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140232638A1 (en) * 2013-02-21 2014-08-21 Samsung Electronics Co., Ltd. Method and apparatus for user interface using gaze interaction
US10324524B2 (en) * 2013-02-21 2019-06-18 Samsung Electronics Co., Ltd. Method and apparatus for user interface using gaze interaction
US20200076998A1 (en) * 2013-09-03 2020-03-05 Tobii Ab Portable eye tracking device
US10188323B2 (en) 2014-09-05 2019-01-29 Vision Service Plan Systems, apparatus, and methods for using eyewear, or other wearable item, to confirm the identity of an individual
US10542915B2 (en) 2014-09-05 2020-01-28 Vision Service Plan Systems, apparatus, and methods for using a wearable device to confirm the identity of an individual
US10694981B2 (en) 2014-09-05 2020-06-30 Vision Service Plan Wearable physiology monitor computer apparatus, systems, and related methods
US9795324B2 (en) 2014-09-05 2017-10-24 Vision Service Plan System for monitoring individuals as they age in place
US10448867B2 (en) 2014-09-05 2019-10-22 Vision Service Plan Wearable gait monitoring apparatus, systems, and related methods
US11918375B2 (en) 2014-09-05 2024-03-05 Beijing Zitiao Network Technology Co., Ltd. Wearable environmental pollution monitor computer apparatus, systems, and related methods
WO2016037120A1 (en) * 2014-09-05 2016-03-10 Vision Service Plan Computerized replacement temple for standard eyewear
US10617342B2 (en) 2014-09-05 2020-04-14 Vision Service Plan Systems, apparatus, and methods for using a wearable device to monitor operator alertness
US9649052B2 (en) 2014-09-05 2017-05-16 Vision Service Plan Systems, apparatus, and methods for using eyewear, or other wearable item, to confirm the identity of an individual
US10307085B2 (en) 2014-09-05 2019-06-04 Vision Service Plan Wearable physiology monitor computer apparatus, systems, and related methods
US9704038B2 (en) 2015-01-07 2017-07-11 Microsoft Technology Licensing, Llc Eye tracking
US10215568B2 (en) 2015-01-30 2019-02-26 Vision Service Plan Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete
US10533855B2 (en) 2015-01-30 2020-01-14 Vision Service Plan Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete
WO2016187457A3 (en) * 2015-05-20 2017-03-23 Magic Leap, Inc. Tilt shift iris imaging
IL255734B1 (en) * 2015-05-20 2023-06-01 Magic Leap Inc Tilt-shift iris imaging
IL255734B2 (en) * 2015-05-20 2023-10-01 Magic Leap Inc Tilt-shift iris imaging
US9898865B2 (en) 2015-06-22 2018-02-20 Microsoft Technology Licensing, Llc System and method for spawning drawing surfaces
US20170172408A1 (en) * 2015-11-13 2017-06-22 Hennepin Healthcare System, Inc. Method for predicting convergence disorders caused by concussion or other neuropathology
US11064881B2 (en) * 2015-11-13 2021-07-20 Hennepin Healthcare System, Inc Method for predicting convergence disorders caused by concussion or other neuropathology
US11215457B2 (en) 2015-12-01 2022-01-04 Amer Sports Digital Services Oy Thematic map based route optimization
US11137820B2 (en) 2015-12-01 2021-10-05 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11144107B2 (en) 2015-12-01 2021-10-12 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11210299B2 (en) 2015-12-01 2021-12-28 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11587484B2 (en) 2015-12-21 2023-02-21 Suunto Oy Method for controlling a display
US10856776B2 (en) 2015-12-21 2020-12-08 Amer Sports Digital Services Oy Activity intensity level determination
US10433768B2 (en) 2015-12-21 2019-10-08 Amer Sports Digital Services Oy Activity intensity level determination
US11284807B2 (en) 2015-12-21 2022-03-29 Amer Sports Digital Services Oy Engaging exercising devices with a mobile device
US11541280B2 (en) 2015-12-21 2023-01-03 Suunto Oy Apparatus and exercising device
US11838990B2 (en) 2015-12-21 2023-12-05 Suunto Oy Communicating sensor data in wireless communication systems
US10327673B2 (en) * 2015-12-21 2019-06-25 Amer Sports Digital Services Oy Activity intensity level determination
US11607144B2 (en) 2015-12-21 2023-03-21 Suunto Oy Sensor based context management
US10082866B2 (en) 2016-04-12 2018-09-25 International Business Machines Corporation Gaze point detection using dynamic facial reference points under varying lighting conditions
US10310269B2 (en) 2016-07-29 2019-06-04 Essilor International Method for virtual testing of at least one lens having a predetermined optical feature and associated device
US11159782B2 (en) * 2016-08-03 2021-10-26 Samsung Electronics Co., Ltd. Electronic device and gaze tracking method of electronic device
US11703938B2 (en) 2016-10-17 2023-07-18 Suunto Oy Embedded computing device
US11145272B2 (en) 2016-10-17 2021-10-12 Amer Sports Digital Services Oy Embedded computing device
US9910298B1 (en) 2017-04-17 2018-03-06 Vision Service Plan Systems and methods for a computerized temple for use with eyewear
US20180348861A1 (en) * 2017-05-31 2018-12-06 Magic Leap, Inc. Eye tracking calibration techniques
US11379036B2 (en) 2017-05-31 2022-07-05 Magic Leap, Inc. Eye tracking calibration techniques
US11068055B2 (en) 2017-05-31 2021-07-20 Magic Leap, Inc. Eye tracking calibration techniques
US10671160B2 (en) 2017-05-31 2020-06-02 Magic Leap, Inc. Eye tracking calibration techniques
WO2018222753A1 (en) * 2017-05-31 2018-12-06 Magic Leap, Inc. Eye tracking calibration techniques
CN110945405A (zh) * 2017-05-31 2020-03-31 Magic Leap, Inc. Eye tracking calibration techniques
US11181977B2 (en) * 2017-11-17 2021-11-23 Dolby Laboratories Licensing Corporation Slippage compensation in eye tracking
US20190155380A1 (en) * 2017-11-17 2019-05-23 Dolby Laboratories Licensing Corporation Slippage Compensation in Eye Tracking
US11393251B2 (en) 2018-02-09 2022-07-19 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11340461B2 (en) 2018-02-09 2022-05-24 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11194161B2 (en) 2018-02-09 2021-12-07 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11556741B2 (en) 2018-02-09 2023-01-17 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters using a neural network
WO2019185150A1 (en) * 2018-03-29 2019-10-03 Tobii Ab Determining a gaze direction using depth information
US20220129067A1 (en) * 2018-03-29 2022-04-28 Tobii Ab Determining a gaze direction using depth information
US11675428B2 (en) * 2018-03-29 2023-06-13 Tobii Ab Determining a gaze direction using depth information
WO2019190561A1 (en) * 2018-03-30 2019-10-03 Tobii Ab Deep learning for three dimensional (3d) gaze prediction
EP3547216A1 (en) * 2018-03-30 2019-10-02 Tobii AB Deep learning for three dimensional (3d) gaze prediction
US12481160B2 (en) 2018-07-19 2025-11-25 Magic Leap, Inc. Content interaction driven by eye metrics
US10722128B2 (en) 2018-08-01 2020-07-28 Vision Service Plan Heart rate detection system and method
US20200128902A1 (en) * 2018-10-29 2020-04-30 Holosports Corporation Racing helmet with visual and audible information exchange
US10786033B2 (en) * 2018-10-29 2020-09-29 Robotarmy Corp. Racing helmet with visual and audible information exchange
US11730226B2 (en) 2018-10-29 2023-08-22 Robotarmy Corp. Augmented reality assisted communication
US11537202B2 (en) 2019-01-16 2022-12-27 Pupil Labs Gmbh Methods for generating calibration data for head-wearable devices and eye tracking system
US12154383B2 (en) 2019-06-05 2024-11-26 Pupil Labs Gmbh Methods, devices and systems for determining eye parameters
US11676422B2 (en) 2019-06-05 2023-06-13 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11301677B2 (en) * 2019-06-14 2022-04-12 Tobii AB Deep learning for three dimensional (3D) gaze prediction
US20220198789A1 (en) * 2019-06-18 2022-06-23 Pupil Labs Gmbh Systems and methods for determining one or more parameters of a user's eye
US12353617B2 (en) * 2019-06-18 2025-07-08 Pupil Labs Gmbh Systems and methods for determining one or more parameters of a user's eye
US20220253135A1 (en) * 2019-07-16 2022-08-11 Magic Leap, Inc. Eye center of rotation determination with one or more eye tracking cameras
US11868525B2 (en) * 2019-07-16 2024-01-09 Magic Leap, Inc. Eye center of rotation determination with one or more eye tracking cameras
US12140771B2 (en) 2020-02-19 2024-11-12 Pupil Labs Gmbh Eye tracking module and head-wearable device
WO2021164867A1 (en) * 2020-02-19 2021-08-26 Pupil Labs Gmbh Eye tracking module and head-wearable device
US11662814B2 (en) * 2021-02-19 2023-05-30 Beijing Boe Optoelectronics Technology Co., Ltd. Sight positioning method, head-mounted display device, computer device and computer-readable storage medium
US20220269341A1 (en) * 2021-02-19 2022-08-25 Beijing Boe Optoelectronics Technology Co., Ltd. Sight positioning method, head-mounted display device, computer device and computer-readable storage medium
US20240036318A1 (en) * 2021-12-21 2024-02-01 Alexander Sarris System to superimpose information over a user's field of view
US12210160B2 (en) * 2021-12-21 2025-01-28 Alexander Sarris System to superimpose information over a user's field of view
EP4468124A1 (en) * 2023-05-25 2024-11-27 Tobii AB Method and system for guiding a user in calibrating an eye tracking device

Also Published As

Publication number Publication date
EP2929413A1 (en) 2015-10-14
JP6498606B2 (ja) 2019-04-10
EP2929413B1 (en) 2020-06-03
EP2929413A4 (en) 2016-07-13
US10025379B2 (en) 2018-07-17
KR20150116814A (ko) 2015-10-16
US20140184775A1 (en) 2014-07-03
KR102205374B1 (ko) 2021-01-21
JP2016510517A (ja) 2016-04-07
CN104903818A (zh) 2015-09-09
CN104903818B (zh) 2018-12-14
WO2014089542A1 (en) 2014-06-12

Similar Documents

Publication Publication Date Title
US20140218281A1 (en) Systems and methods for eye gaze determination
EP4383193A1 (en) Line-of-sight direction tracking method and apparatus
JP6902075B2 (ja) 構造化光を用いた視線追跡
CN109801379B (zh) 通用的增强现实眼镜及其标定方法
US11861062B2 (en) Blink-based calibration of an optical see-through head-mounted display
CN112102389B (zh) 确定实物至少一部分的3d重构件的空间坐标的方法和系统
US9779512B2 (en) Automatic generation of virtual materials from real-world materials
Lai et al. Hybrid method for 3-D gaze tracking using glint and contour features
JP7659148B2 (ja) アイトラッキングデバイスおよび方法
Hennessey et al. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions
US20030123027A1 (en) System and method for eye gaze tracking using corneal image mapping
CN108369744B (zh) 通过双目单应性映射的3d注视点检测
JP2016173313A (ja) 視線方向推定システム、視線方向推定方法及び視線方向推定プログラム
JP7030317B2 (ja) 瞳孔検出装置及び瞳孔検出方法
US10620454B2 (en) System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping of camera images
Takemura et al. Estimation of a focused object using a corneal surface image for eye-based interaction
Lee et al. A robust eye gaze tracking method based on a virtual eyeball model
JP7168953B2 (ja) 自動キャリブレーションを行う視線計測装置、視線計測方法および視線計測プログラム
Chi et al. A novel multi-camera global calibration method for gaze tracking system
JPH0351407B2 (en)
Li et al. An efficient method for eye tracking and eye-gazed FOV estimation
Kang et al. A robust extrinsic calibration method for non-contact gaze tracking in the 3-D space
Nitschke et al. I see what you see: point of gaze estimation from corneal images
Plopski et al. Hybrid eye tracking: Combining iris contour and corneal imaging
CN111587397B (zh) Image generation device, spectacle lens selection system, image generation method, and program

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION