US12380603B2 - Extrinsic calibration between display and eye-tracking camera in head-mounted devices - Google Patents

Extrinsic calibration between display and eye-tracking camera in head-mounted devices

Info

Publication number
US12380603B2
US12380603B2 (application US18/412,931)
Authority
US
United States
Prior art keywords
reflection
eye
hmd
display
tracking camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US18/412,931
Other versions
US20240242381A1 (en)
Inventor
Zhiheng Jia
Chao Guo
Scott Fullam
Jeffrey Neil Margolis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Priority to US18/412,931
Assigned to Google LLC (assignors: Scott Fullam, Chao Guo, Zhiheng Jia, Jeffrey Neil Margolis)
Publication of US20240242381A1
Application granted
Publication of US12380603B2
Status: Active

Classifications

    • G02B 27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g., head-tracking, eye-tracking
    • G02B 27/017: Head-up displays; head mounted
    • G02B 27/0172: Head mounted, characterised by optical features
    • G02B 27/0176: Head mounted, characterised by mechanical features
    • G06F 3/013: Eye tracking input arrangements
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e., camera calibration

Definitions

  • This description relates in general to head mounted wearable devices, and in particular, to calibration of position and orientation of a display with respect to a user-facing camera, e.g., an eye-tracking camera.
  • the improvement discussed herein is directed to calibration of extrinsic quantities (e.g., position and orientation) of a display of a head-mounted device (HMD) with respect to a user-facing camera, e.g., an eye-tracking camera after factory calibration, e.g., after usage in the field.
  • the calibration involves determining a current position of a reflected pixel of the display within a field of view of the eye-tracking camera and comparing that position with a calibration position determined at the factory. The pixel may then be shifted according to the difference between the current position and the calibration position.
  • a method can include capturing, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD where the pixel is illuminated within a display associated with a second lens of the HMD.
  • the method can also include determining a current position of the reflection within a field of view of the eye-tracking camera.
  • the method can further include determining a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
  • in another general aspect, a computer program product can comprise a nontransitory storage medium including code that, when executed by processing circuitry, causes the processing circuitry to perform a method.
  • the method can include capturing, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD where the pixel is illuminated within a display associated with a second lens of the HMD.
  • the method can also include determining a current position of the reflection within a field of view of the eye-tracking camera.
  • the method can further include determining a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
  • in another general aspect, an apparatus includes memory and processing circuitry coupled to the memory.
  • the processing circuitry can be configured to capture, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD where the pixel is illuminated within a display associated with a second lens of the HMD.
  • the processing circuitry can also be configured to determine a current position of the reflection within a field of view of the eye-tracking camera.
  • the processing circuitry can further be configured to determine a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
  • FIG. 1 is a diagram illustrating an example system, in accordance with implementations described herein.
  • FIG. 2 is a diagram illustrating an example comparison of a current image of an illuminated pixel of a display to a previous image of the illuminated pixel of the display.
  • FIG. 3 is a diagram illustrating an example electronic environment for calibrating a display of a HMD to a user-facing camera of the HMD.
  • FIG. 4 is a flow chart illustrating a method of calibrating a display of a HMD to a user-facing camera of the HMD.
  • Head-mounted displays (HMDs) for systems such as extended reality (XR) may use a user-facing camera such as an eye-tracking camera to gather data such as eye gaze direction. It is noted that other user-facing cameras can be used as well, e.g., face-tracking cameras. In some implementations that are described herein the eye-tracking camera is used as an example, but some implementations can use other types of user-facing cameras.
  • the eye-tracking camera and the display can be calibrated at the factory. For example, during the manufacture process, the position of pixels of the display can be calibrated with respect to the eye-tracking camera. In some implementations, at least one way this may be done is to image the pixels within a field of view of the eye-tracking camera using an external optical system that maps rays from the display to the eye-tracking camera.
  • the position and orientation of the display can drift with respect to the eye-tracking camera during use.
  • the HMD may have some flexibility in the frame for the comfort of users. That flexibility, however, may cause the display to move during use. The movement of the display can, in turn, cause discomfort for the user.
  • At least one technical solution to the above-described technical problem includes performing a calibration of the display with respect to the eye-tracking camera after usage.
  • the calibration involves determining a current position of a reflected pixel of the display within a field of view of the eye-tracking camera and comparing that position with a calibration position determined at the factory. The illuminated pixel may then be shifted according to the difference between the current position and the calibration position.
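As a rough illustration of the comparison-and-shift step above (not part of the patent's disclosure), the sketch below computes the drift of the reflection within the camera's field of view and shifts the illuminated display pixel accordingly. All function names, coordinates, and the camera-to-display scale factor are hypothetical.

```python
# Hypothetical sketch of the post-usage calibration step. Positions are
# 2D pixel coordinates; none of these names come from the patent itself.

def compute_drift(current_pos, calibration_pos):
    """Vector difference between where the reflection is now and where
    it was at factory calibration."""
    return (current_pos[0] - calibration_pos[0],
            current_pos[1] - calibration_pos[1])

def shift_pixel(display_pos, drift, scale=1.0):
    """Shift the illuminated display pixel opposite to the observed drift.
    `scale` maps camera-pixel drift to display-pixel units (assumed known)."""
    dx, dy = drift
    return (display_pos[0] - scale * dx, display_pos[1] - scale * dy)

drift = compute_drift(current_pos=(212.0, 130.0), calibration_pos=(210.0, 128.0))
corrected = shift_pixel(display_pos=(640.0, 360.0), drift=drift)
```

In practice the factor mapping camera pixels to display pixels would itself come from factory calibration; here it is simply assumed to be 1.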
  • light from the illuminated pixel is reflected from (e.g., off of) a lens (e.g., pancake lens) of the HMD along a path to the eye-tracking camera.
  • the lens can be disposed between an eye of a user and the display and can be configured to image pixels of the display onto the eye.
  • the HMD may be configured to operate in one or more sets of modes. For example, the reflection of the light from the pixel from the lens and captured by the eye-tracking camera can be performed in a calibration mode (e.g., first mode) and the imaging of the pixels of the display onto the eye via the pancake lens can be performed in a user mode (e.g., a second mode).
  • infrared (IR) illumination is used to enhance the reflection of the pixel of the display at the eye-tracking camera.
  • the source of the IR illumination includes an IR diode at an edge of the display. The IR illumination from the IR diode is reflected from the pancake lens and is received at the eye-tracking camera.
  • At least one technical advantage of the technical solution is that the display may remain calibrated with respect to the eye-tracking camera. Such maintenance of this calibration can ensure, in some implementations, that the user is able to remain comfortable during usage of the HMD.
  • FIG. 1 is a diagram illustrating an example HMD system 100 .
  • the HMD system 100 includes a display 110 within a display lens 116 , a user-facing camera 120 , and a pancake lens 130 .
  • the display 110 is configured to provide images of objects to a user.
  • the display includes numerous pixels that may or may not be illuminated depending on the image provided.
  • the display 110 has a position and an orientation in space and also with respect to other components of the HMD system 100 .
  • the position and orientation of the display 110 with respect to other components of the HMD system 100 form the extrinsic quantities, or extrinsics, of the display 110 and the other components.
  • an intrinsic quantity of the display 110 may include quantities such as the temperature, color distribution, and refresh rate
  • an extrinsic quantity of the display 110 can be, for example, a quantity such as position and orientation with respect to another component of the HMD system 100 , e.g., the user-facing camera 120 .
  • the position and/or orientation of the display 110 may change due to deformation of a frame of the HMD system 100 (not pictured). Such a change in the position and/or orientation of the display 110 can have a disorientating effect on the user and affect the comfort of the user while using the HMD system 100 . Accordingly, a calibration of the position and/or orientation of the display 110 with respect to the eye-tracking camera 120 may be needed after use by the user.
  • the display 110 includes illuminated pixels 112 ( 1 ) and 112 ( 2 ).
  • the illuminated pixels 112 ( 1 ) and 112 ( 2 ) are examples of illuminated pixels of the display 110 , and there may be a single illuminated pixel, or many illuminated pixels. It is these illuminated pixels 112 ( 1 ) and 112 ( 2 ) that are imaged by the eye-tracking camera 120 .
  • the eye-tracking camera 120 is configured to form an image of an eye 140 .
  • the eye-tracking camera 120 may be used to determine a gaze direction of the eye 140 .
  • Such a determination of a gaze direction of the eye 140 may in turn determine whether to illuminate certain pixels on the display 110 .
  • the eye-tracking camera 120 is also used to capture reflected pixels 112 ( 1 ) and 112 ( 2 ) of the display 110 .
  • the capturing of the reflected pixels 112 ( 1 ) and 112 ( 2 ) of the display 110 is performed with assistance of the pancake lens 130 .
  • the pancake lens 130 is configured to enable the eye 140 to image illuminated pixels of the display 110 , e.g., illuminated pixels 112 ( 1 ) and 112 ( 2 ).
  • the pancake lens 130 is also configured to enable the eye-tracking camera 120 to form an image of the eye 140 .
  • other types of lenses can be used instead of the pancake lens 130 .
  • a Fresnel lens or a birdbath lens can be used instead of the pancake lens 130 .
  • the pancake lens 130 is used to reflect rays from the illuminated pixels 112 ( 1 ) and 112 ( 2 ) to the eye-tracking camera 120 .
  • the reflected rays (shown as solid in FIG. 1 ) form a reflection of the pixels 112 ( 1 ) and 112 ( 2 ) captured in the field of view of the eye-tracking camera 120 .
  • the reflection capture is performed in a dark environment, such as in a carrying case, so that external light transmitted through the pancake lens does not wash out the illumination reflected from the surface of the pancake lens 130 .
  • the scenario shown in FIG. 1 involving the pancake lens 130 is not the only scenario in which the illuminated pixels 112 ( 1 ) and 112 ( 2 ) of the display 110 may have their reflections captured by the eye-tracking camera 120 .
  • a mirror or a folding optical system could be inserted substantially parallel to the display 110 to reflect the rays from the illuminated pixels 112 ( 1 ) and 112 ( 2 ) to the eye-tracking camera 120 .
  • the eye-tracking camera 120 captures the reflected pixels 112 ( 1 ) and 112 ( 2 ) within a field of view of the eye-tracking camera 120 . Further details of this image are discussed with regard to FIG. 2 .
  • FIG. 2 is a diagram illustrating an example comparison of a current reflection of an illuminated pixel of a display within a field of view 200 of an eye-tracking camera (e.g., eye-tracking camera 120 ) to the reflection of the pixel of the display within the field of view 200 at initial calibration in the factory.
  • the reflection of a first illuminated pixel (e.g., illuminated pixel 112 ( 1 )) is at a current image position 212 ( 1 ) within the field of view 200 of the eye-tracking camera
  • the reflection of a second illuminated pixel (e.g., illuminated pixel 112 ( 2 )) is at current position 212 ( 2 ) within the field of view 200 of the eye-tracking camera.
  • calibration positions 222 ( 1 ) and 222 ( 2 ) may correspond to positions of reflections of the illuminated pixels captured at the time of manufacture in a factory, before use.
  • the calibration positions 222 ( 1 ) and 222 ( 2 ) may be stored in a memory associated with processing circuitry of the HMD system (e.g., HMD system 100 ).
  • the current positions 212 ( 1 ) and 212 ( 2 ) within the field of view 200 are determined using, e.g., computer vision techniques.
  • the processing circuitry then forms differences between the current positions 212 ( 1 ) and 212 ( 2 ) within the field of view 200 and the calibration positions 222 ( 1 ) and 222 ( 2 ) within the field of view 200 .
  • the difference between the current position 212 ( 1 ) and the calibration position 222 ( 1 ) may then be used to determine a position of the pixel (e.g., pixel 112 ( 1 )); similarly, the difference between the current position 212 ( 2 ) and the calibration position 222 ( 2 ) may then be used to determine a position of the pixel (e.g., pixel 112 ( 2 )).
  • the processing circuitry may then adjust the position of the pixel as a calibration measure. That is, the processing circuitry may adjust the position of the pixel to correct for the movement of the display under, e.g., deformation of the frame.
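One way the "computer vision techniques" mentioned above could locate the reflection is an intensity-weighted centroid over a thresholded frame. The sketch below is a hypothetical illustration, not the method disclosed in the patent; all names and values are invented.

```python
# Illustrative detector for the reflection's current position within the
# camera's field of view: threshold the grayscale frame, then take the
# intensity-weighted centroid of the bright pixels.

def reflection_centroid(frame, threshold=200):
    """Return the (x, y) centroid of pixels at or above `threshold` in a
    grayscale frame given as a list of rows, or None if none are found."""
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                total += value
                sx += x * value
                sy += y * value
    if total == 0:
        return None  # no reflection found (e.g., not in a dark environment)
    return (sx / total, sy / total)

# A tiny synthetic frame with one bright spot at (x=2, y=1).
frame = [
    [0, 0,   0, 0],
    [0, 0, 255, 0],
    [0, 0,   0, 0],
]
current = reflection_centroid(frame)
calibration = (2.5, 1.0)  # stored factory position (invented)
difference = (current[0] - calibration[0],
              current[1] - calibration[1])
```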
  • the eye-tracking camera 120 is an infrared (IR) camera.
  • an IR illumination source e.g., an IR diode 114 is attached to the display 110 .
  • the IR diode 114 further illuminates the reflections of the pixels 112 ( 1 ) and 112 ( 2 ) captured by the eye-tracking camera 120 .
  • the illumination from the IR diode 114 reflects from the pancake lens 130 (dashed line).
  • FIG. 3 is a diagram that illustrates example processing circuitry 320 connected to an HMD system (e.g., HMD system 100 ).
  • the processing circuitry 320 is configured to capture a reflection of an illuminated pixel (e.g., pixel 112 ( 1 )) of a display (e.g., display 110 ) using a user-facing camera (e.g., eye-tracking camera 120 ) and determine a position and orientation of the illuminated pixel relative to the user-facing camera based on the reflection.
  • the processing circuitry 320 includes a network interface 322 , one or more processing units 324 , and nontransitory memory 326 .
  • the network interface 322 includes, for example, Ethernet adaptors, Bluetooth adaptors, and the like, for converting electronic and/or optical signals received from the network to electronic form for use by the processing circuitry 320 .
  • the set of processing units 324 include one or more processing chips and/or assemblies.
  • the memory 326 is a storage medium and includes both volatile memory (e.g., RAM) and non-volatile memory, such as one or more read only memories (ROMs), disk drives, solid state drives, and the like.
  • the set of processing units 324 and the memory 326 together form part of the processing circuitry 320 , which is configured to carry out various methods and functions as described herein as a computer program product.
  • one or more of the components of the processing circuitry 320 can be, or can include processors (e.g., processing units 324 ) configured to process instructions stored in the memory 326 . Examples of such instructions as depicted in FIG. 3 include a reflection manager 330 , a current position manager 340 , a difference manager 350 , and an extrinsics manager 360 . Further, as illustrated in FIG. 3 , the memory 326 is configured to store various data, which is described with respect to the respective managers that use such data.
  • the reflection manager 330 is configured to capture, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD, the pixel being illuminated within a display associated with a second lens of the HMD.
  • the reflection data 332 may include the captured reflection of the illuminated pixel.
  • the user-facing camera is an eye-tracking camera.
  • the user-facing camera is a face-tracking camera.
  • the reflection is captured using reflections of the illumination from the illuminated pixel from a pancake lens.
  • an IR diode is used to further illuminate the reflection in the user-facing camera.
  • the reflection manager 330 is configured to capture the reflection of the illuminated pixel in a calibration mode, e.g., when the HMD system is in a dark environment such as a carrying case.
  • the current position manager 340 is configured to determine a current position of the reflection within a field of view of the eye-tracking camera. In some implementations, the current position of the image within the field of view of the user-facing camera is determined using computer vision techniques.
  • the difference manager 350 is configured to determine a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
  • the difference data 352 includes the calibration position of the reflection of the illuminated pixel of the display within the field of view as well as the difference between the current position and the calibration position.
  • the difference is a vector difference corresponding to the case in which the current position and the previous position are vectors.
  • the difference is a scalar quantity.
  • the extrinsics manager 360 is configured to determine a position of the pixel relative to the eye-tracking camera based on the difference. In some implementations, the extrinsics manager is further configured to adjust a position of the illuminated pixel to calibrate the display based on the difference.
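The four managers of FIG. 3 can be pictured as a simple pipeline. The sketch below is a hypothetical reading of that figure; function names and data shapes are illustrative, not actual product code.

```python
# Hypothetical pipeline mirroring the managers of FIG. 3.

def reflection_manager(camera_capture):
    """Capture the reflection of the illuminated pixel (reflection data 332)."""
    return camera_capture()

def current_position_manager(reflection):
    """Locate the reflection in the field of view (current position data 342)."""
    return reflection["position"]

def difference_manager(current_position, calibration_position):
    """Form the difference against the stored calibration position
    (difference data 352)."""
    return tuple(c - k for c, k in zip(current_position, calibration_position))

def extrinsics_manager(difference):
    """Decide the pixel-position adjustment that recalibrates the display:
    shift opposite the observed drift."""
    return tuple(-d for d in difference)

# Example run with a stubbed camera capture.
capture = lambda: {"position": (212.0, 130.0)}
reflection = reflection_manager(capture)
pos = current_position_manager(reflection)
diff = difference_manager(pos, (210.0, 128.0))
adjustment = extrinsics_manager(diff)
```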
  • the components (e.g., modules, processing units 324 ) of processing circuitry 320 can be configured to operate based on one or more platforms (e.g., one or more similar or different platforms) that can include one or more types of hardware, software, firmware, operating systems, runtime libraries, and/or so forth.
  • the components of the processing circuitry 320 can be configured to operate within a cluster of devices (e.g., a server farm). In such an implementation, the functionality and processing of the components of the processing circuitry 320 can be distributed to several devices of the cluster of devices.
  • the components of the processing circuitry 320 can be, or can include, any type of hardware and/or software configured to process attributes.
  • one or more portions of the components shown in the components of the processing circuitry 320 in FIG. 3 can be, or can include, a hardware-based module (e.g., a digital signal processor (DSP), a field programmable gate array (FPGA), a memory), a firmware module, and/or a software-based module (e.g., a module of computer code, a set of computer-readable instructions that can be executed at a computer).
  • the network can be, or can include, a wireless network and/or wireless network implemented using, for example, gateway devices, bridges, switches, and/or so forth.
  • the network can include one or more segments and/or can have portions based on various protocols such as Internet Protocol (IP) and/or a proprietary protocol.
  • the network can include at least a portion of the Internet.
  • one or more of the components of the processing circuitry 320 can be, or can include, processors configured to process instructions stored in a memory. The reflection manager 330 , the current position manager 340 , the difference manager 350 , and the extrinsics manager 360 (and/or portions thereof) are examples of such instructions.
  • the memory 326 can be any type of memory such as a random-access memory, a disk drive memory, flash memory, and/or so forth. In some implementations, the memory 326 can be implemented as more than one memory component (e.g., more than one RAM component or disk drive memory) associated with the components of the processing circuitry 320 . In some implementations, the memory 326 can be a database memory. In some implementations, the memory 326 can be, or can include, a non-local memory. For example, the memory 326 can be, or can include, a memory shared by multiple devices (not shown). In some implementations, the memory 326 can be associated with a server device (not shown) within a network and configured to serve the components of the processing circuitry 320 . As illustrated in FIG. 3 , the memory 326 is configured to store various data, including reflection data 332 and current position data 342 .
  • FIG. 4 is a flow chart illustrating a method 400 of determining a position and orientation of an illuminated pixel relative to an eye-tracking camera based on a reflection of the pixel captured by the eye-tracking camera.
  • a reflection manager captures, via an eye-tracking camera (e.g., eye-tracking camera 120 ) on a head-mounted display (HMD, e.g., HMD system 100 ), a reflection of a pixel (e.g., pixel 112 ( 1 )) from a first lens (e.g., pancake lens 130 ) of the HMD, the pixel being illuminated within a display (e.g., display 110 ) associated with a second lens (e.g., display lens 116 ) of the HMD.
  • a current position manager determines a current position (e.g., 212 ( 1 )) of the reflection within a field of view (e.g., 200 ) of the eye-tracking camera.
  • a difference manager determines a difference between the current position of the reflection within the field of view and a calibration position (e.g., 222 ( 1 )) within the field of view.
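Putting the three steps of method 400 together, a hypothetical sketch (names and stub inputs are invented for illustration, not taken from the patent):

```python
# Illustrative composition of method 400: capture the reflection, locate it
# in the field of view, and form the difference against the calibration
# position. The capture and locate steps are injected as callables so the
# sketch stays self-contained.

def method_400(capture_reflection, locate, calibration_position):
    reflection = capture_reflection()              # step 1: capture via camera
    current = locate(reflection)                   # step 2: current position
    return (current[0] - calibration_position[0],  # step 3: difference
            current[1] - calibration_position[1])

diff = method_400(
    capture_reflection=lambda: "stub-frame",
    locate=lambda frame: (212.0, 130.0),
    calibration_position=(210.0, 128.0),
)
```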
  • spatially relative terms such as “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for ease of description to describe one element or feature in relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may be interpreted accordingly.
  • Example embodiments of the concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of example embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of the described concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. Accordingly, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.

Abstract

Techniques include performing a calibration of a display with respect to an eye-tracking camera of a head-mounted device (HMD) after usage. The calibration involves determining a current position of a reflected pixel of the display within a field of view of the eye-tracking camera and comparing that position with a calibration position, for example, a position determined at the factory. The pixel may then be shifted according to the difference between the current position and the calibration position.

Description

CROSS REFERENCE TO RELATED APPLICATION
This application claims the benefit of U.S. Provisional Application No. 63/479,876, filed Jan. 13, 2023, the disclosure of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This description relates in general to head mounted wearable devices, and in particular, to calibration of position and orientation of a display with respect to a user-facing camera, e.g., an eye-tracking camera.
SUMMARY
The improvement discussed herein is directed to calibration of extrinsic quantities (e.g., position and orientation) of a display of a head-mounted device (HMD) with respect to a user-facing camera, e.g., an eye-tracking camera after factory calibration, e.g., after usage in the field. The calibration involves determining a current position of a reflected pixel of the display within a field of view of the eye-tracking camera and comparing that position with a calibration position determined at the factory. The pixel may then be shifted according to the difference between the current position and the calibration position.
In one general aspect, a method can include capturing, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD where the pixel is illuminated within a display associated with a second lens of the HMD. The method can also include determining a current position of the reflection within a field of view of the eye-tracking camera. The method can further include determining a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
In another general aspect, a computer program product comprises a nontransitory storage medium including code that, when executed by processing circuitry, causes the processing circuitry to perform a method. The method can include capturing, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD where the pixel is illuminated within a display associated with a second lens of the HMD. The method can also include determining a current position of the reflection within a field of view of the eye-tracking camera. The method can further include determining a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
In another general aspect, an apparatus includes memory and processing circuitry coupled to the memory. The processing circuitry can be configured to capture, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD where the pixel is illuminated within a display associated with a second lens of the HMD. The processing circuitry can also be configured to determine a current position of the reflection within a field of view of the eye-tracking camera. The processing circuitry can further be configured to determine a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram illustrating an example system, in accordance with implementations described herein.
FIG. 2 is a diagram illustrating an example comparison of a current image of an illuminated pixel of a display to a previous image of the illuminated pixel of the display.
FIG. 3 is a diagram illustrating an example electronic environment for calibrating a display of a HMD to a user-facing camera of the HMD.
FIG. 4 is a flow chart illustrating a method of calibrating a display of a HMD to a user-facing camera of the HMD.
DETAILED DESCRIPTION
Head-mounted displays (HMDs) for systems such as extended reality (XR) may use a user-facing camera such as an eye-tracking camera to gather data such as eye gaze direction. It is noted that other user-facing cameras can be used as well, e.g., face-tracking cameras. In some implementations that are described herein the eye-tracking camera is used as an example, but some implementations can use other types of user-facing cameras.
The eye-tracking camera and the display can be calibrated at the factory. For example, during the manufacturing process, the positions of pixels of the display can be calibrated with respect to the eye-tracking camera. In some implementations, at least one way this may be done is to image the pixels within a field of view of the eye-tracking camera using an external optical system that maps rays from the display to the eye-tracking camera.
At least one technical problem with the calibration described above is that the position and orientation of the display can drift with respect to the eye-tracking camera during use. For example, the HMD may have some flexibility in the frame for the comfort of users. That flexibility, however, may cause the display to move during use. The movement of the display can, in turn, cause discomfort for the user.
At least one technical solution to the above-described technical problem includes performing a calibration of the display with respect to the eye-tracking camera after usage. The calibration involves determining a current position of a reflected pixel of the display within a field of view of the eye-tracking camera and comparing that position with a calibration position determined at the factory. The illuminated pixel may then be shifted according to the difference between the current position and the calibration position.
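The comparison-and-shift logic described above can be sketched as follows. The function name `recalibrate_pixel` and the simple 2-D offset model (a one-to-one mapping between camera-image drift and display-pixel shift) are illustrative assumptions, not the claimed implementation:

```python
import numpy as np

def recalibrate_pixel(current_pos, calibration_pos, pixel_pos):
    """Shift an illuminated pixel by the drift observed in the
    eye-tracking camera's field of view (illustrative 2-D model)."""
    # Difference between where the reflection is now and where it
    # was at factory calibration, in camera-image coordinates.
    difference = np.asarray(current_pos, float) - np.asarray(calibration_pos, float)
    # Assume a one-to-one camera-to-display mapping and shift the
    # pixel to compensate for the observed drift.
    corrected = np.asarray(pixel_pos, float) - difference
    return difference, corrected
```

In practice the mapping between camera coordinates and display pixels would be determined at the factory rather than assumed to be the identity.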
In some implementations, light from the illuminated pixel is reflected from (e.g., off of) a lens (e.g., pancake lens) of the HMD along a path to the eye-tracking camera. In some implementations, the lens can be disposed between an eye of a user and the display and can be configured to image pixels of the display onto the eye. In some implementations, the HMD may be configured to operate in one or more sets of modes. For example, the reflection of the light from the pixel from the lens and captured by the eye-tracking camera can be performed in a calibration mode (e.g., first mode) and the imaging of the pixels of the display onto the eye via the pancake lens can be performed in a user mode (e.g., a second mode).
In some implementations, infrared (IR) illumination is used to enhance the reflection of the pixel of the display at the eye-tracking camera. In some implementations, the source of the IR illumination includes an IR diode at an edge of the display. The IR illumination from the IR diode is reflected from the pancake lens and is received at the eye-tracking camera.
At least one technical advantage of the technical solution is that the display may remain calibrated with respect to the eye-tracking camera. Maintaining this calibration can help ensure, in some implementations, that the user remains comfortable during usage of the HMD.
FIG. 1 is a diagram illustrating an example HMD system 100. The HMD system 100 includes a display 110 within a display lens 116, a user-facing camera 120, and a pancake lens 130.
The display 110 is configured to provide images of objects to a user. The display includes numerous pixels that may or may not be illuminated depending on the image provided. The display 110 has a position and an orientation in space and also with respect to other components of the HMD system 100. The position and orientation of the display 110 with respect to other components of the HMD system 100 form the extrinsic quantities, or extrinsics, of the display 110 and the other components. For example, while an intrinsic quantity of the display 110 may include quantities such as the temperature, color distribution, and refresh rate, an extrinsic quantity of the display 110 can be, for example, a quantity such as position and orientation with respect to another component of the HMD system 100, e.g., the user-facing camera 120.
The position and/or orientation of the display 110, e.g., with respect to the eye-tracking camera 120, may change due to deformation of a frame of the HMD system 100 (not pictured). Such a change in the position and/or orientation of the display 110 can have a disorientating effect on the user and affect the comfort of the user while using the HMD system 100. Accordingly, a calibration of the position and/or orientation of the display 110 with respect to the eye-tracking camera 120 may be needed after use by the user.
As shown in FIG. 1 , the display 110 includes illuminated pixels 112(1) and 112(2). The illuminated pixels 112(1) and 112(2) are examples of illuminated pixels of the display 110, and there may be a single illuminated pixel, or many illuminated pixels. It is these illuminated pixels 112(1) and 112(2) that are imaged by the eye-tracking camera 120.
The eye-tracking camera 120 is configured to form an image of an eye 140. For example, the eye-tracking camera 120 may be used to determine a gaze direction of the eye 140. Such a determination of a gaze direction of the eye 140 may in turn determine whether to illuminate certain pixels on the display 110.
In the scenario illustrated in FIG. 1 , the eye-tracking camera 120 is also used to capture reflected pixels 112(1) and 112(2) of the display 110. As shown in FIG. 1 , the capturing of the reflected pixels 112(1) and 112(2) of the display 110 is performed with assistance of the pancake lens 130.
The pancake lens 130 is configured to enable the eye 140 to image illuminated pixels of the display 110, e.g., illuminated pixels 112(1) and 112(2). The pancake lens 130 is also configured to enable the eye-tracking camera 120 to form an image of the eye 140. In some implementations, other types of lenses can be used instead of the pancake lens 130. For example, instead of the pancake lens 130, a Fresnel lens or a birdbath lens can be used.
In the scenario shown in FIG. 1 , the pancake lens 130 is used to reflect rays from the illuminated pixels 112(1) and 112(2) to the eye-tracking camera 120. The reflected rays (shown as solid in FIG. 1 ) form a reflection of the pixels 112(1) and 112(2) captured in the field of view of the eye-tracking camera 120. In some implementations, the reflection capture is performed in a dark environment such as in a carrying case so that external light that would transmit through the pancake lens would not wash out the illumination reflected from the surface of the pancake lens 130.
The scenario shown in FIG. 1 involving the pancake lens 130 is not the only scenario in which the illuminated pixels 112(1) and 112(2) of the display 110 may have their reflections captured by the eye-tracking camera 120. For example, a mirror or a folding optical system could be inserted substantially parallel to the display 110 to reflect the rays from the illuminated pixels 112(1) and 112(2) to the eye-tracking camera 120.
In at least some scenarios, the eye-tracking camera 120 captures the reflected pixels 112(1) and 112(2) within a field of view of the eye-tracking camera 120. Further details of this image are discussed with regard to FIG. 2 .
FIG. 2 is a diagram illustrating an example comparison of a current reflection of an illuminated pixel of a display within a field of view 200 of an eye-tracking camera (e.g., eye-tracking camera 120) to a reflection of the pixel of the display within the field of view 200 at initial calibration in the factory.
In FIG. 2 , the reflection of a first illuminated pixel (e.g., illuminated pixel 112(1)) is at a current image position 212(1) within the field of view 200 of the eye-tracking camera, and the reflection of a second illuminated pixel (e.g., illuminated pixel 112(2)) is at current position 212(2) within the field of view 200 of the eye-tracking camera.
To determine whether the position and orientation of the display (e.g., display 110) has changed during use, the current positions 212(1) and 212(2) are to be compared to calibration positions 222(1) and 222(2), respectively. For example, calibration positions 222(1) and 222(2) may correspond to positions of reflections of the illuminated pixels captured at the time of manufacture in a factory, before use.
Accordingly, the calibration positions 222(1) and 222(2) may be stored in a memory associated with processing circuitry of the HMD system (e.g., HMD system 100). When the illuminated pixels of the display are reflected, the current positions 212(1) and 212(2) within the field of view 200 are determined using, e.g., computer vision techniques. The processing circuitry then forms differences between the current positions 212(1) and 212(2) within the field of view 200 and the calibration positions 222(1) and 222(2) within the field of view 200.
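One simple computer-vision approach to determining the current positions 212(1) and 212(2) is to threshold the camera frame and take an intensity-weighted centroid of the bright region. The sketch below assumes a single reflection per frame and a fixed brightness threshold; a production system would likely use a more robust detector:

```python
import numpy as np

def reflection_centroid(frame, threshold=200):
    """Locate the reflection of an illuminated pixel in a camera
    frame via thresholding and an intensity-weighted centroid
    (one of many possible computer-vision techniques)."""
    mask = frame >= threshold            # bright pixels only
    if not mask.any():
        return None                      # no reflection found
    ys, xs = np.nonzero(mask)
    weights = frame[mask].astype(float)
    # Intensity-weighted centroid, in (x, y) image coordinates.
    cx = float(np.average(xs, weights=weights))
    cy = float(np.average(ys, weights=weights))
    return cx, cy
```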
The difference between the current position 212(1) and the calibration position 222(1) may then be used to determine a position of the pixel (e.g., pixel 112(1)); similarly, the difference between the current position 212(2) and the calibration position 222(2) may then be used to determine a position of the pixel (e.g., pixel 112(2)). In some implementations, the position of the pixel (e.g., pixel 112(1)) may be determined by determining a transformation between the calibration position 222(1) and the position of the pixel at the factory. The processing circuitry would then apply the transformation to the difference to determine an amount that the position of the pixel has moved from the calibration position.
Based on the determined amount that the pixel has moved (e.g., the difference), the processing circuitry may then adjust the position of the pixel as a calibration measure. That is, the processing circuitry may adjust the position of the pixel to correct for the movement of the display under, e.g., deformation of the frame.
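The transformation between camera-image differences and display-pixel shifts can, under a linear-mapping assumption, be estimated from factory correspondences by least squares. The helper names below (`estimate_transform`, `pixel_shift`) are illustrative, and a real system might fit a full affine map or homography instead:

```python
import numpy as np

def estimate_transform(camera_pts, display_pts):
    """Least-squares fit of a linear map from camera-image
    coordinates to display-pixel coordinates (translation terms
    omitted for brevity)."""
    A, _, _, _ = np.linalg.lstsq(np.asarray(camera_pts, float),
                                 np.asarray(display_pts, float),
                                 rcond=None)
    return A

def pixel_shift(transform, camera_difference):
    """Map a drift observed in the camera image to the equivalent
    shift in display pixels."""
    return np.asarray(camera_difference, float) @ transform
```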
Returning to FIG. 1 , in some implementations the eye-tracking camera 120 is an infrared (IR) camera. In such an implementation, an IR illumination source, e.g., an IR diode 114 is attached to the display 110. The IR diode 114 further illuminates the reflections of the pixels 112(1) and 112(2) captured by the eye-tracking camera 120. In the implementation shown in FIG. 1 , the illumination from the IR diode 114 reflects from the pancake lens 130 (dashed line).
FIG. 3 is a diagram that illustrates example processing circuitry 320 connected to an HMD system (e.g., HMD system 100). The processing circuitry 320 is configured to capture a reflection of an illuminated pixel (e.g., pixel 112(1)) of a display (e.g., display 110) using a user-facing camera (e.g., eye-tracking camera 120) and determine a position and orientation of the illuminated pixel relative to the user-facing camera based on the reflection.
The processing circuitry 320 includes a network interface 322, one or more processing units 324, and nontransitory memory 326. The network interface 322 includes, for example, Ethernet adaptors, Bluetooth adaptors, and the like, for converting electronic and/or optical signals received from the network to electronic form for use by the processing circuitry 320. The set of processing units 324 include one or more processing chips and/or assemblies. The memory 326 is a storage medium and includes both volatile memory (e.g., RAM) and non-volatile memory, such as one or more read only memories (ROMs), disk drives, solid state drives, and the like. The set of processing units 324 and the memory 326 together form part of the processing circuitry 320, which is configured to carry out various methods and functions as described herein as a computer program product.
In some implementations, one or more of the components of the processing circuitry 320 can be, or can include processors (e.g., processing units 324) configured to process instructions stored in the memory 326. Examples of such instructions as depicted in FIG. 3 include a reflection manager 330, a current position manager 340, a difference manager 350, and an extrinsics manager 360. Further, as illustrated in FIG. 3 , the memory 326 is configured to store various data, which is described with respect to the respective managers that use such data.
The reflection manager 330 is configured to capture, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD, the pixel being illuminated within a display associated with a second lens of the HMD. The reflection data 332 may include the captured reflection of the illuminated pixel. In some implementations, the user-facing camera is an eye-tracking camera. In some implementations, the user-facing camera is a face-tracking camera. In some implementations, the reflection is captured using reflections of the illumination from the illuminated pixel from a pancake lens. In some implementations, an IR diode is used to further illuminate the reflection in the user-facing camera. In some implementations, the reflection manager 330 is configured to capture the reflection of the illuminated pixel in a calibration mode, e.g., when the HMD system is in a dark environment such as a carrying case.
The current position manager 340 is configured to determine a current position of the reflection within a field of view of the eye-tracking camera. In some implementations, the current position of the image within the field of view of the user-facing camera is determined using computer vision techniques.
The difference manager 350 is configured to determine a difference between the current position of the reflection within the field of view and a calibration position within the field of view. The difference data 352 includes the calibration position of the reflection of the illuminated pixel of the display within the field of view as well as the difference between the current position and the calibration position. In some implementations, the difference is a vector difference corresponding to the case in which the current position and the calibration position are vectors. In some implementations, the difference is a scalar quantity.
The extrinsics manager 360 is configured to determine a position of the pixel relative to the eye-tracking camera based on the difference. In some implementations, the extrinsics manager is further configured to adjust a position of the illuminated pixel to calibrate the display based on the difference.
The components (e.g., modules, processing units 324) of processing circuitry 320 can be configured to operate based on one or more platforms (e.g., one or more similar or different platforms) that can include one or more types of hardware, software, firmware, operating systems, runtime libraries, and/or so forth. In some implementations, the components of the processing circuitry 320 can be configured to operate within a cluster of devices (e.g., a server farm). In such an implementation, the functionality and processing of the components of the processing circuitry 320 can be distributed to several devices of the cluster of devices.
The components of the processing circuitry 320 can be, or can include, any type of hardware and/or software configured to process attributes. In some implementations, one or more portions of the components shown in the components of the processing circuitry 320 in FIG. 3 can be, or can include, a hardware-based module (e.g., a digital signal processor (DSP), a field programmable gate array (FPGA), a memory), a firmware module, and/or a software-based module (e.g., a module of computer code, a set of computer-readable instructions that can be executed at a computer). For example, in some implementations, one or more portions of the components of the processing circuitry 320 can be, or can include, a software module configured for execution by at least one processor (not shown). In some implementations, the functionality of the components can be included in different modules and/or different components than those shown in FIG. 3 , including combining functionality illustrated as two components into a single component.
Although not shown, in some implementations, the components of the processing circuitry 320 (or portions thereof) can be configured to operate within, for example, a data center (e.g., a cloud computing environment), a computer system, one or more server/host devices, and/or so forth. In some implementations, the components of the processing circuitry 320 (or portions thereof) can be configured to operate within a network. Thus, the components of the processing circuitry 320 (or portions thereof) can be configured to function within various types of network environments that can include one or more devices and/or one or more server devices. For example, the network can be, or can include, a local area network (LAN), a wide area network (WAN), and/or so forth. The network can be, or can include, a wireless network and/or wireless network implemented using, for example, gateway devices, bridges, switches, and/or so forth. The network can include one or more segments and/or can have portions based on various protocols such as Internet Protocol (IP) and/or a proprietary protocol. The network can include at least a portion of the Internet.
In some implementations, one or more of the components of the processing circuitry 320 can be, or can include, processors configured to process instructions stored in a memory. For example, reflection manager 330 (and/or a portion thereof), current position manager 340 (and/or a portion thereof), difference manager 350 (and/or a portion thereof), and extrinsics manager 360 (and/or a portion thereof) are examples of such instructions.
In some implementations, the memory 326 can be any type of memory such as a random-access memory, a disk drive memory, flash memory, and/or so forth. In some implementations, the memory 326 can be implemented as more than one memory component (e.g., more than one RAM component or disk drive memory) associated with the components of the processing circuitry 320. In some implementations, the memory 326 can be a database memory. In some implementations, the memory 326 can be, or can include, a non-local memory. For example, the memory 326 can be, or can include, a memory shared by multiple devices (not shown). In some implementations, the memory 326 can be associated with a server device (not shown) within a network and configured to serve the components of the processing circuitry 320. As illustrated in FIG. 3 , the memory 326 is configured to store various data, including reflection data 332 and current position data 342.
FIG. 4 is a flow chart illustrating a method 400 of determining a position and orientation of an illuminated pixel relative to an eye-tracking camera based on a reflection of the pixel captured by the eye-tracking camera.
At 402, a reflection manager (e.g., 330) captures, via an eye-tracking camera (e.g., eye-tracking camera 120) on a head-mounted display (HMD, e.g., HMD system 100), a reflection of a pixel (e.g., pixel 112(1)) from a first lens (e.g., pancake lens 130) of the HMD, the pixel being illuminated within a display (e.g., display 110) associated with a second lens (e.g., display lens 116) of the HMD.
At 404, a current position manager (e.g., 340) determines a current position (e.g., 212(1)) of the reflection within a field of view (e.g., 200) of the eye-tracking camera.
At 406, a difference manager (e.g., 350) determines a difference between the current position of the reflection within the field of view and a calibration position (e.g., 222(1)) within the field of view.
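Steps 402-406 can be sketched end to end as follows; the caller supplies `capture_frame` as a stand-in for the eye-tracking camera, and the centroid-based position estimate is an assumed, simplified choice:

```python
import numpy as np

def run_calibration_check(capture_frame, calibration_pos, threshold=200):
    """Illustrative sketch of method 400: capture a frame, locate
    the pixel's reflection, and compute the drift from the stored
    calibration position. Assumes one bright reflection per frame."""
    frame = capture_frame()                                 # step 402 (capture)
    mask = frame >= threshold
    if not mask.any():
        raise ValueError("no reflection found in frame")
    ys, xs = np.nonzero(mask)
    current_pos = (float(xs.mean()), float(ys.mean()))      # step 404 (locate)
    difference = np.subtract(current_pos, calibration_pos)  # step 406 (compare)
    return current_pos, difference
```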
Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Example embodiments, however, may be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used in this specification, specify the presence of the stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
It will be understood that when an element is referred to as being “coupled,” “connected,” or “responsive” to, or “on,” another element, it can be directly coupled, connected, or responsive to, or on, the other element, or intervening elements may also be present. In contrast, when an element is referred to as being “directly coupled,” “directly connected,” or “directly responsive” to, or “directly on,” another element, there are no intervening elements present. As used herein the term “and/or” includes any and all combinations of one or more of the associated listed items.
Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for ease of description to describe one element or feature in relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may be interpreted accordingly.
Example embodiments of the concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of example embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of the described concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. Accordingly, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
It will be understood that although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Thus, a “first” element could be termed a “second” element without departing from the teachings of the present embodiments.
Unless otherwise defined, the terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which these concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and/or the present specification and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components, and/or features of the different implementations described.

Claims (24)

What is claimed is:
1. A method, comprising:
capturing, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD, the pixel being illuminated within a display associated with a second lens of the HMD;
determining a current position of the reflection within a field of view of the eye-tracking camera; and
determining a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
2. The method as in claim 1, further comprising:
determining a position of the pixel relative to the eye-tracking camera based on the difference.
3. The method as in claim 1, further comprising:
adjusting the position of the pixel to calibrate the display based on the difference.
4. The method as in claim 1, wherein the first lens of the HMD is a pancake lens.
5. The method as in claim 1, wherein the reflection is further illuminated with infrared (IR) illumination from an IR diode, the IR diode being attached to the display of the HMD.
6. The method as in claim 5, wherein the IR illumination further illuminates the reflection via a reflection from the first lens of the HMD.
7. The method as in claim 1, wherein the capturing of the reflection is performed in a first mode of a set of modes, the set of modes including a second mode in which the eye-tracking camera captures an image of an eye.
8. The method as in claim 7, wherein the capturing of the reflection in the first mode is performed in a dark environment.
9. A computer program product comprising a nontransitory storage medium, the computer program product including code that, when executed by processing circuitry, causes the processing circuitry to perform a method, the method comprising:
capturing, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD, the pixel being illuminated within a display associated with a second lens of the HMD;
determining a current position of the reflection within a field of view of the eye-tracking camera; and
determining a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
10. The computer program product as in claim 9, wherein the method further comprises:
determining a position of the pixel relative to the eye-tracking camera based on the difference.
11. The computer program product as in claim 9, wherein the method further comprises:
adjusting the position of the pixel to calibrate the display based on the difference.
12. The computer program product as in claim 9, wherein the first lens of the HMD is a pancake lens.
13. The computer program product as in claim 9, wherein the reflection is further illuminated with infrared (IR) illumination from an IR diode, the IR diode being attached to the display of the HMD.
14. The computer program product as in claim 13, wherein the IR illumination further illuminates the reflection via a reflection from the first lens of the HMD.
15. The computer program product as in claim 9, wherein the capturing of the reflection is performed in a first mode of a set of modes, the set of modes including a second mode in which the eye-tracking camera captures an image of an eye.
16. The computer program product as in claim 15, wherein the capturing of the reflection in the first mode is performed in a dark environment.
17. An apparatus, comprising:
memory; and
processing circuitry coupled to the memory, the processing circuitry being configured to:
capture, via an eye-tracking camera on a head-mounted display (HMD), a reflection of a pixel from a first lens of the HMD, the pixel being illuminated within a display associated with a second lens of the HMD;
determine a current position of the reflection within a field of view of the eye-tracking camera; and
determine a difference between the current position of the reflection within the field of view and a calibration position within the field of view.
18. The apparatus as in claim 17, wherein the processing circuitry is further configured to:
determine a position of the pixel relative to the eye-tracking camera based on the difference.
19. The apparatus as in claim 17, wherein the processing circuitry is further configured to:
adjust the position of the pixel to calibrate the display based on the difference.
20. The apparatus as in claim 17, wherein the first lens of the HMD is a pancake lens.
21. The apparatus as in claim 17, wherein the reflection is further illuminated with infrared (IR) illumination from an IR diode, the IR diode being attached to the display of the HMD.
22. The apparatus as in claim 21, wherein the IR illumination further illuminates the reflection via a reflection from the first lens of the HMD.
23. The apparatus as in claim 17, wherein the capturing of the reflection is performed in a first mode of a set of modes, the set of modes including a second mode in which the eye-tracking camera captures an image of an eye.
24. The apparatus as in claim 23, wherein the capturing of the reflection in the first mode is performed in a dark environment.
US18/412,931 2023-01-13 2024-01-15 Extrinsic calibration between display and eye-tracking camera in head-mounted devices Active 2044-02-15 US12380603B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363479876P 2023-01-13 2023-01-13
US18/412,931 US12380603B2 (en) 2023-01-13 2024-01-15 Extrinsic calibration between display and eye-tracking camera in head-mounted devices

Publications (2)

Publication Number Publication Date
US20240242381A1 US20240242381A1 (en) 2024-07-18
US12380603B2 true US12380603B2 (en) 2025-08-05

Family

ID=91854789


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10983341B2 (en) 2018-12-19 2021-04-20 Facebook Technologies, Llc Eye tracking based on polarization volume grating


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Itoh, et al., "Interaction-Free Calibration for Optical See Through Head-Mounted Displays Based On 3D Eye Localization", IEEE Symp. on 3D User Interfaces (3DUI), 2014, pp. 75-82.
Plopski, et al., "Automated Spatial Calibration of HMD Systems With Unconstrained Eye-Cameras", 2016 IEEE International Symposium on Mixed and Augmented Reality, 2016, pp. 94-99.



Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JIA, ZHIHENG;GUO, CHAO;FULLAM, SCOTT;AND OTHERS;REEL/FRAME:066640/0016

Effective date: 20240227

STCF Information on status: patent grant

Free format text: PATENTED CASE