CN117761892A - Lens distance testing for head mounted display devices - Google Patents
- Publication number
- CN117761892A (application number CN202311232924.XA)
- Authority
- CN
- China
- Prior art keywords
- eye
- lens
- hmd
- distance
- user
- Prior art date
- 2022-09-23
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The present disclosure relates to lens distance testing for head mounted display devices. Systems and methods are disclosed that enable a lens distance test to be performed in a Head Mounted Display (HMD) to determine a distance between a user's eye and a lens of the HMD (e.g., a display lens in a virtual or augmented reality device). In an embodiment, the HMD is configured to determine a current pose of the eye based on a series of captured eye images. The pose information is used to determine the distance from the corneal vertex to the nearest point on the lens. If the determined distance is too small or too large, an alert or notification is generated indicating that the HMD should be adjusted or the light seal changed to achieve a better distance, in order to reduce the risk of eye injury and/or improve the user experience. In embodiments, the lens distance test may be repeated during a user session to re-evaluate and/or monitor the lens distance.
Description
Background
The present application claims priority from U.S. provisional application Ser. No. 63/376,954, titled "LENS DISTANCE TEST FOR HEAD-MOUNTED DISPLAY DEVICES," filed September 23, 2022, which is hereby incorporated by reference in its entirety.
Virtual Reality (VR) allows a user to experience and/or interact with an immersive artificial environment so that the user feels as if they were in that environment. For example, a virtual reality system may display a stereoscopic scene to the user to create an illusion of depth, and a computer may adjust the scene content in real time to provide the illusion of the user moving within the scene. When the user views images through a virtual reality system, the user may thus feel as if they were moving within the scene from a first-person perspective. Similarly, a Mixed Reality (MR) system or an Augmented Reality (AR) system combines computer-generated information (referred to as virtual content) with real-world images or a real-world view to augment or add content to the user's view of the world. The simulated environments of VR and/or the mixed environments of MR may thus be utilized to provide an interactive user experience for multiple applications, such as applications that add virtual content to a real-time view of the viewer's environment, applications that interact with a virtual training environment, gaming applications, applications that remotely control drones or other mechanical systems, applications for viewing digital media content, applications for interacting with the internet, and so on.
Disclosure of Invention
Various implementations of methods and devices for performing lens distance testing in a Head Mounted Display (HMD), such as an AR or VR headset, are described. The distance test is performed to determine a distance between the user's eye and a lens of the HMD (e.g., a display lens or a clip-on lens of the HMD). In some embodiments, the HMD is configured to determine a current pose of the eye based on a series of captured eye images. The pose may be determined using a gaze tracking system implemented by the HMD and used by the HMD to determine a distance from the vertex of the cornea to the closest point on the lens. If the determined distance is too small or too large, an alert or notification is generated instructing the user to adjust the position of the HMD or change the light seal to achieve a better distance, in order to reduce the risk of eye injury and/or improve the user experience. In some embodiments, the distance test may be repeated during a user session to re-evaluate and/or monitor the lens distance.
Drawings
Fig. 1 illustrates a Head Mounted Display (HMD) device implementing a lens distance test to determine a distance between a user's eye and a lens of the HMD, according to some embodiments.
Fig. 2A illustrates steps in a lens distance test performed by an HMD, according to some embodiments.
Fig. 2B illustrates a lens distance test to determine a distance from a user's eye to a clip-on lens added to an HMD, according to some embodiments.
Fig. 3 is a flow chart illustrating performance of a lens distance test according to some embodiments.
Fig. 4 is a flow chart illustrating a process of repeating a lens distance test during a user session of an HMD, according to some embodiments.
Fig. 5 is a block diagram illustrating various components of an exemplary VR/AR system implementing lens distance testing in accordance with some embodiments.
The present specification includes references to "one embodiment" or "an embodiment." The appearances of the phrases "in one embodiment" or "in an embodiment" are not necessarily referring to the same embodiment. Particular features, structures, or characteristics may be combined in any suitable manner consistent with the present disclosure.
The term "comprising" is open ended. As used in the claims, the term does not exclude additional structures or steps. Consider the claims referenced below: such claims do not exclude that the apparatus comprises additional components (e.g. a network interface unit, a graphics circuit, etc.).
Various units, circuits, or other components may be described or claimed as "configured to" perform a task or tasks. In such contexts, "configured to" is used to connote structure by indicating that the unit/circuit/component includes structure (e.g., circuitry) that performs the task or tasks during operation. As such, the unit/circuit/component can be said to be configured to perform the task even when the specified unit/circuit/component is not currently operational (e.g., is not turned on). The units/circuits/components used with the "configured to" language include hardware, e.g., circuits, memory storing program instructions executable to implement the operation, and the like. Reciting that a unit/circuit/component is "configured to" perform one or more tasks is expressly intended not to invoke 35 U.S.C. § 112, sixth paragraph, for that unit/circuit/component. Additionally, "configured to" can include generic structure (e.g., generic circuitry) that is manipulated by software or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing the task(s) at issue. "Configured to" may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.
"first", "second", etc. As used herein, these terms serve as labels for the nouns they precede and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, the buffer circuit may be described herein as performing a write operation of a "first" value and a "second" value. The terms "first" and "second" do not necessarily imply that a first value must be written before a second value.
"Based on." As used herein, this term is used to describe one or more factors that affect a determination. This term does not foreclose additional factors that may affect the determination. That is, a determination may be based solely on those factors or based, at least in part, on those factors. Consider the phrase "determine A based on B." In this case, B is a factor that affects the determination of A, and the phrase does not foreclose the determination of A from also being based on C. In other instances, A may be determined based solely on B.
The term "or," as used in the claims, is used as an inclusive, and not an exclusive or. For example, the phrase "at least one of x, y, or z" means any one of x, y, and z, and any combination thereof.
Detailed Description
Various embodiments of methods and apparatus for performing lens distance testing in a Head Mounted Display (HMD) device are described. An HMD may be a device such as a headset, helmet, goggles, or glasses that is designed to be worn by a user and that includes a display mechanism (e.g., left and right near-eye display panels) for displaying visual content to the user. In some embodiments, the display mechanism may include displays for both of the user's eyes to provide a 3D visual view to the user. In some implementations, the HMD may be a Virtual Reality (VR) or Augmented Reality (AR) device. For AR applications, the HMD may include or be coupled to one or more external cameras that capture video of the user's environment for display. The HMD may include a controller component that renders frames for display to left and right displays. Alternatively, the controller component may be implemented by an external device coupled to the HMD via a wired or wireless connection.
In some implementations, the HMD may include a left optical lens and a right optical lens (e.g., display lenses) located between the display and the user's eyes. The distance between the user's eye (e.g., the corneal surface) and the display lens can have a significant impact on the user experience of the device. For example, on VR and AR headsets, an optimal user experience is typically achieved when the user's eyes are within an optimal distance range from the lens. Furthermore, too small a lens distance may cause safety problems, such as damage to the cornea if the user falls. To avoid such problems, embodiments of HMDs described herein are configured to perform automatic lens distance testing and notify the user when the measured distance is too small or too large. In some embodiments, the notification may be generated as an alert visually displayed to the user, suggesting that the user adjust the HMD positioning or use a different light seal that better fits the user's facial structure. In some implementations, the lens distance test may be performed during an initialization process of the HMD that occurs when the user first wears the HMD. In some implementations, the lens distance test may be repeated during a user session of the HMD (e.g., when the HMD is detected to have moved relative to the user's eyes) or according to a periodic retest schedule.
In some implementations, the lens distance test may be performed using a gaze tracking system included in the HMD for detecting a gaze direction of the user's eyes. In some implementations, the gaze tracking system may include at least one eye tracking camera (e.g., an Infrared (IR) or Near Infrared (NIR) camera) positioned on each side of the user's face, and an illumination source (e.g., an array or ring of IR or NIR light sources, such as LEDs) that emits light (e.g., IR or NIR light) toward the user's eyes. The eye-tracking camera may be directed toward the user's eye to receive IR or NIR light reflected directly from the eye by the light source, or alternatively may be directed toward "hot" mirrors located between the user's eye and the display panel that reflect IR or NIR light from the eye to the eye-tracking camera while allowing visible light to pass through. The gaze tracking system may determine a current pose of the user's eyes, which may indicate a vertex position of the cornea. In some embodiments, the distance from the vertex position to the closest point on the lens is used as the lens distance. If the lens distance is too short or too long (e.g., beyond a pre-specified threshold or range), the HMD will generate an alert to the user.
Fig. 1 illustrates an HMD device implementing a lens distance test to determine a distance between a user's eye and a lens of the HMD, according to some embodiments.
As shown, the figure depicts an HMD device 100 worn by a user 102. The HMD 100 may include, but is not limited to, a display 110 (e.g., left and right display panels), two display lenses 120, and a gaze tracking system including at least one eye tracking camera 140 (e.g., an Infrared (IR) or Near Infrared (NIR) camera) positioned on each side of the user's face, and an illumination source 130 (e.g., an IR or NIR light source such as an array or ring of NIR Light Emitting Diodes (LEDs)) that emits light (e.g., IR or NIR light) toward the user's eyes 104. The eye tracking camera 140 may be directed toward a mirror located between the user's eye 104 and the display 110 that reflects IR or NIR light from the eye 104 while allowing visible light to pass through, or alternatively, to the user's eye 104 to receive IR or NIR light reflected from the eye 104, as shown.
As shown, HMD 100 may include a light seal 150 that encloses the light generated by display 110 such that visual content generated by the display appears brighter to user 102. The light seal 150 may also conform to the user 102, such as being adjusted to a particular shape to fit the user's facial structure or to place the display 110 at a particular distance 170 from the user's eyes. In some cases, an HMD with a light seal 150 that is suitable for one user may not properly fit the facial structure of another user, such that the other user may not be able to achieve a proper eye-to-lens distance with that light seal.
In some implementations, the HMD 100 may include a controller 160 that may be configured to render AR or VR content (e.g., left and right frames for left and right display panels) and provide these frames to the display 110. In some implementations, the controller 160 may be integrated in the HMD. In some embodiments, the controller 160 may be a computer device having its own processor and memory. In some embodiments, at least some of the functionality of the controller 160 may be implemented by a device external to the HMD and coupled to the HMD through a wired or wireless connection. The user looks through the display lens 120 to the display 110 (e.g., looks through the left and right display lenses 120 to the left and right display panels).
In some implementations, the controller 160 uses the eye-tracking camera 140 to implement gaze tracking for various purposes. The controller 160 may estimate the gaze point of the user on the display 110 based on gaze tracking input obtained from the eye tracking camera 140 using glints or reflections from the eyes produced by the light source 130. The gaze point estimated from the gaze tracking input may be used to determine the direction in which the user is currently looking.
As shown, in some embodiments, the light sources 130 may be arranged in a circle around each of the display lenses 120. However, in other embodiments, more or fewer light sources 130 may be used, and other arrangements and locations of light sources 130 may be used. In some implementations, the eye-tracking camera 140 may be directed toward a mirror located between the user's eye 104 and the display 110 that reflects IR or NIR light from the eye 104 while allowing visible light to pass. In other embodiments, the eye-tracking camera 140 may be directed toward the user's eye 104 to receive IR or NIR light reflected directly from the eye 104, as shown.
As shown, a lens distance test 180 may be implemented in the controller 160 of the HMD 100. As discussed, this lens distance test 180 may be performed at various times during operation of the HMD to determine the distance 170 between the user's eye 104 and the display lens 120 and inform the user whether the distance is outside of a threshold or tolerance range. In some implementations, these thresholds or tolerance ranges may be configured via a configuration interface of the HMD 100. In some implementations, the HMD may be configured with multiple distance thresholds. For example, exceeding the first threshold may only cause the HMD to generate a warning that the lens is too close or too far, and exceeding the second threshold may prevent the HMD from fully operating.
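As a non-limiting illustration of the two-tier threshold check described above, the following Python sketch classifies a measured distance. The range values and names are assumptions for illustration only, not values specified by this disclosure.

```python
def check_distance(closest_mm, warn_mm=(15.0, 30.0), block_mm=(10.0, 40.0)):
    """Classify a measured eye-to-lens distance in millimeters.

    warn_mm and block_mm are illustrative (min, max) ranges: values
    outside warn_mm trigger a warning only, while values outside
    block_mm prevent full operation of the HMD.
    """
    if not (block_mm[0] <= closest_mm <= block_mm[1]):
        return "block"  # second threshold exceeded: prevent full operation
    if not (warn_mm[0] <= closest_mm <= warn_mm[1]):
        return "warn"   # first threshold exceeded: lens too close or too far
    return "ok"
```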
In some implementations, the lens distance test 180 may be performed using a gaze tracking function of the HMD. For example, a gaze tracking system of the HMD may be configured to continuously determine a current pose of the eye 104 based on the eye model. In some embodiments, the eye model may implement the function of taking an image (or glint reading) of the eye captured by the camera 140 and converting the data into a particular spatial configuration of the eye (e.g., the current position of the cornea center, optical axis, pupil, etc.). In some embodiments, the lens distance test uses such an eye model to quickly determine the current pose of the eye, including the position of the corneal vertex, which is the point of the cornea that protrudes furthest from the eyeball. In some implementations, the lens distance test uses this vertex position to determine the distance 170 from the eye to the lens.
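The per-frame flow just described (eye image to pose, pose to corneal vertex, vertex to closest lens point) can be sketched as follows. This is a minimal sketch assuming a hypothetical eye_model.estimate_pose API and a lens model supplied as a sampled point cloud; none of these names are defined by this disclosure.

```python
import numpy as np

def lens_distance_test(eye_image, eye_model, lens_points_mm, min_mm, max_mm):
    # Estimate the current eye pose 234 from the captured eye image (or
    # glint readings); estimate_pose is an assumed, hypothetical API.
    pose = eye_model.estimate_pose(eye_image)
    vertex = np.asarray(pose.cornea_vertex_mm)  # vertex position 212

    # lens_points_mm: (N, 3) samples of the lens surface from the lens
    # model, expressed in the same reference coordinate system as the
    # vertex. The nearest sample approximates closest point 214.
    dists = np.linalg.norm(lens_points_mm - vertex, axis=1)
    closest_mm = float(dists.min())             # closest distance 220

    if closest_mm < min_mm:
        return closest_mm, "alert: lens too close"
    if closest_mm > max_mm:
        return closest_mm, "alert: lens too far"
    return closest_mm, "ok"
```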
Fig. 2A illustrates steps in a lens distance test performed by an HMD, according to some embodiments.
The top portion of fig. 2A shows some details regarding the user's eye 104 and lens 120 of fig. 1. As shown, the user's eye 104 includes a pupil 210 and a corneal vertex 212. The point on lens 120 closest to vertex 212 is point 214. In some embodiments, the distance determined by lens distance test 180 is the closest distance 220 between vertex 212 and closest point 214 on the lens.
The bottom portion of fig. 2A shows the steps performed by a particular embodiment of the lens distance test 180. As shown, the process begins with the capture of an eye image 230 by the camera 140. In some embodiments, the eye images 230 are reduced to a set of readings corresponding to the observed reflection or glint values generated by the light source 130.
The image or readings are fed into an eye model 232 that determines the current pose 234 of the eye. In some cases, eye model 232 may be a user-specific model that was previously generated for the user, for example, during a registration process for the user. The HMD may determine that the current wearer of the device matches an eye model (e.g., based on a match of the current eye image or biometric authentication of the user) and choose to use that eye model to perform the lens distance test. In some embodiments, the use of the eye model of a known user enhances the accuracy of the test. In other cases, the eye model 232 may be a general eye model that is not specific to any user. Such a general eye model may be based on an average of the eye characteristics of many users and may be used to perform the lens distance test with less accuracy. The use of a general eye model is useful in cases where the HMD is used by someone other than the usual user of the device (which is often a situation in which lens distance problems occur). Once the eye pose 234 is determined, the test determines the vertex position 212 of the cornea of the eye. In some embodiments, the vertex position 212 may be obtained directly from the eye model 232.
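A minimal sketch of the model selection just described, assuming a hypothetical identify_user hook (e.g., an eye-image match or biometric authentication) and a dictionary of enrolled user-specific models:

```python
def select_eye_model(eye_image, enrolled_models, generic_model, identify_user):
    # identify_user is an assumed hook returning a user id or None.
    user_id = identify_user(eye_image)
    if user_id is not None and user_id in enrolled_models:
        # More accurate: a model generated for this user at registration.
        return enrolled_models[user_id]
    # Fallback: population-average model, usable but less accurate.
    return generic_model
```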
As shown, the test process next uses the lens model 236 to determine the point 214 on the lens closest to the vertex position. In some implementations, the lens model 236 may represent the spatial shape of the lens 120 in a reference coordinate system, and the vertex position 212 is also expressed in the same coordinate system. Depending on the implementation, the coordinate system may be centered on the lens 120, the camera 140, or some other point on the HMD. In some embodiments, the display lens 120 is fixed to the HMD, and the lens model 236 is determined by a factory calibration process of the HMD.
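The closest-point determination in a shared coordinate system can be illustrated as follows; R and t stand in for a (hypothetical) factory-calibrated transform from lens-model coordinates into the reference frame of the vertex.

```python
import numpy as np

def closest_point_on_lens(vertex_mm, lens_points_mm, R=np.eye(3), t=np.zeros(3)):
    # Map the sampled lens surface into the reference frame of the vertex.
    pts = lens_points_mm @ R.T + t
    # Nearest sampled surface point to the corneal vertex (point 214).
    i = int(np.argmin(np.linalg.norm(pts - vertex_mm, axis=1)))
    return pts[i]
```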
As shown, the test procedure next uses the vertex position 212 and the closest point 214 on the lens to determine the closest distance 220. In some embodiments, this distance 220 is used as the lens distance 170. If the distance is too short or too long (e.g., shorter or longer than the distance threshold), the HMD will generate an alert 238. In some implementations, the alert may be generated visually via a display on the HMD. In some embodiments, the alert will indicate that the lens is too close or too far away and instruct or suggest to the user to adjust the position of the HMD or use a different light seal. In some implementations, if the user is not identified as a known user of the HMD, the alert may require the user to authenticate (so that the lens distance test may be retried using a more accurate eye model), or register as a new user.
Fig. 2B illustrates a lens distance test to determine a distance from a user's eye to a clip-on lens added to an HMD, according to some embodiments.
As shown, in this example, a clip lens 240 is inserted in front of the display lens 120. Such clip lens 240 may not be a factory installed component of the HMD, but rather added by the user in the field. In some embodiments, the clip-on lens may be a correction lens that is added to correct the vision of a particular user.
In some implementations, when the clip-on lens 240 is added to the HMD, the HMD will perform an in-situ calibration process to accurately determine the position of the clip-on lens relative to the HMD. This position data may be stored (e.g., as part of lens model 236) so that it may be used to perform the lens distance test 180. When a clip-on lens is present, the lens distance test will measure the closest distance 250 from the eye 104 to the clip-on lens instead of to the display lens 120. In some embodiments, the distance thresholds used to generate the alert may also change when a corrective lens is installed, so that the user is encouraged to maintain an appropriate eye distance for the characteristics of the corrective lens. In some embodiments, lens distance test 180 may be performed as part of the in-situ calibration process when a new clip-on lens is installed.
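A sketch of how the measurement target and thresholds might switch when a calibrated clip-on lens is present; the attribute names are assumptions for illustration:

```python
def measurement_target(display_lens, clip_on_lens=None):
    # Hypothetical lens-model objects exposing a sampled surface and a
    # preferred (min, max) distance range. With a clip-on lens calibrated
    # in situ, measure to its surface and use its own distance range.
    lens = clip_on_lens if clip_on_lens is not None else display_lens
    return lens.surface_points_mm, lens.distance_range_mm
```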
Fig. 3 is a flow chart illustrating performance of a lens distance test according to some embodiments. The process shown in fig. 3 may be performed by an implementation of HMD 100 of fig. 1.
The process begins at operation 310, where an image of the user's eye is captured using a camera (e.g., camera 140) on the HMD. In some implementations, the camera may be part of a gaze tracking system implemented by the HMD, and the image may indicate reflections or flickering produced by LED light sources positioned around the eyes. In some embodiments, the image may be acquired at a single point in time and constitute one frame of the captured eye.
At operation 320, a determination is made as to whether a user-specific eye model associated with the user is available. In some embodiments, the eye model may be a function that converts the captured image, or readings derived from the image, into a particular pose of the eye. A user-specific eye model may be created during a registration process for a user. Such a user-specific eye model may be preferred over a general eye model because the user-specific eye model may more accurately determine the eye pose. In some implementations, the HMD may identify a particular user by analyzing the captured image, certain features of the eye, or as a result of user authentication.
If a user-specific eye model is available, the process proceeds to operation 330, where the pose of the eye is determined based on the captured image or readings derived from the image using the user-specific eye model. If a user-specific eye model is not available (e.g., because the user is not identified as a previously registered known user), the process proceeds to operation 340, where the pose of the eye is determined using the general eye model. The general eye model is not user-specific, so poses determined using the general model may be less accurate. In some embodiments, the general eye model may be based on an average of the characteristics of many users.
Once the eye pose is determined, at operation 350, the eye pose is used to determine a vertex position (e.g., vertex position 212) corresponding to the corneal vertex of the eye. The eye pose may fully represent the 3D shape of the eye in space, including the vertex of the cornea. In some embodiments, the vertex position may be directly output by the eye model. The vertex position may be specified in a reference coordinate system that may be centered on a display lens, camera, or other fixed point relative to the HMD.
At operation 360, the point on the lens under consideration (e.g., display lens 120 or clip lens 240) closest to the vertex position is determined. In some embodiments, the spatial position and shape of the lens may be modeled in a lens model in the same coordinate system as the vertex position, and the lens model may be used to determine the closest point. At operation 370, a closest distance between the vertex position and a closest point on the lens is determined.
At operation 380, one or more checks are made to determine if the closest distance is greater than or less than one or more distance thresholds. The distance threshold may specify an optimal or preferred distance range from the eye to the lens. In some implementations, the threshold may be configured via a configuration interface of the HMD.
If a distance problem is found (e.g., if the closest distance is outside of an acceptable distance range), then at operation 390 an alert is output indicating that the lens is too close or too far. In some embodiments, the alert may suggest to the user to adjust the positioning of the HMD or to use a different light seal that better fits the user's face. In some implementations, the HMD may suggest that the user authenticate so that a more accurate user-specific eye model may be used, or register as a new user so that the user-specific eye model may be configured. In some implementations, if the lens distance test fails, the HMD may prevent the user from initiating a user session using the HMD. In some implementations, an alert may be visually generated to the user via the display lens.
As shown, in some embodiments, the lens distance test may be repeated even when the lens distance is determined to be acceptable. For example, the test may be performed over multiple frames to measure the lens distance at multiple eye poses. In some embodiments, an alert will be generated if any one of the eye poses fails the test. In some embodiments, the alert may be generated based on an average of the distances measured for the different poses, or based on the number of poses that failed the test.
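The alternative aggregation policies mentioned above (fail on any pose, on a failure count, or on the average distance) could be combined as in this sketch; the parameter names are assumptions:

```python
def aggregate_results(distances_mm, dist_range_mm, max_failures=0):
    lo, hi = dist_range_mm
    # Count poses whose measured distance falls outside the allowed range.
    failures = sum(1 for d in distances_mm if not (lo <= d <= hi))
    mean_mm = sum(distances_mm) / len(distances_mm)
    # Alert if too many poses failed, or if the average is out of range.
    if failures > max_failures or not (lo <= mean_mm <= hi):
        return "alert"
    return "ok"
```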
Fig. 4 is a flow chart illustrating a process of repeating a lens distance test during a user session of an HMD, according to some embodiments. The process shown in fig. 4 may be performed by an implementation of HMD 100 of fig. 1.
As shown, in some embodiments, one or more runs of a lens distance test (e.g., lens distance test 180) may be performed 420 during an initialization process 410 of a user session. This initialization process may occur when the user first wears (or turns on) the HMD device, at which point the HMD will configure various settings for the user. In some implementations, the initialization process 410 may occur when the HMD device switches from one user to another. As discussed, the lens distance test may be performed multiple times for different poses. In some implementations, the HMD may perform the lens distance test on each eye independently.
Additionally, in some implementations, the lens distance test may be repeated 450 during the user session 430 (e.g., while the user is using an AR or VR application executed by the HMD). The retest may be triggered by one or more events or conditions 440 configured for the HMD. For example, in some implementations, a retest of the lens distance may be triggered in response to detecting a fall or significant movement of the user's eyes relative to the HMD. In some implementations, the lens distance may be retested according to a configured schedule (e.g., once per minute) so that the distance can be continuously monitored.
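A minimal monitoring-loop sketch for such session-time retesting, assuming hypothetical run_test and slip_detected hooks into the HMD runtime:

```python
import time

def monitor_lens_distance(run_test, slip_detected, period_s=60.0):
    # Retest when movement of the HMD relative to the eyes is detected,
    # or when the configured retest period elapses.
    last = time.monotonic()
    while True:
        if slip_detected() or (time.monotonic() - last) >= period_s:
            run_test()      # repeat the lens distance test (e.g., test 180)
            last = time.monotonic()
        time.sleep(0.1)     # polling interval; illustrative only
```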
Fig. 5 is a block diagram illustrating various components of an exemplary VR/AR system implementing lens distance testing in accordance with some embodiments. In some implementations, the VR/AR system may include an HMD 2000, such as a headset, helmet, goggles, or glasses. HMD 2000 may implement any of various types of virtual reality projector technologies. For example, HMD 2000 may include a VR projection system that includes a projector 2020 that displays frames including left and right images on screens or displays 2022A and 2022B that are viewed by a user through eye lenses 2220A and 2220B. The VR projection system may be, for example, a DLP (digital light processing), LCD (liquid crystal display), or LCoS (liquid crystal on silicon) technology projection system. To create a three-dimensional (3D) effect in a 3D virtual view, objects at different depths or distances in the two images are shifted to the left or right as a function of the triangulation of distance, with closer objects being shifted more than more distant objects. It should be noted that in some embodiments, other types of projection systems may be used.
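The triangulation relationship can be illustrated with a short sketch: the horizontal shift (disparity) is inversely proportional to depth, so nearer objects shift more. The baseline and focal-length values below are illustrative assumptions, not parameters from this disclosure.

```python
def stereo_shift_px(depth_m, baseline_m=0.063, focal_px=1400.0):
    # disparity = focal length (pixels) * baseline (meters) / depth (meters)
    return focal_px * baseline_m / depth_m
```

With these assumed values, an object at 0.5 m is shifted about 176 pixels while an object at 5 m is shifted only about 18 pixels, producing the depth illusion described above.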
In some embodiments, HMD 2000 may include controller 2030, which implements the functions of the VR/AR system and generates frames (each frame including a left image and a right image) for display by projector 2020. In some embodiments, HMD 2000 may also include memory 2032, which stores software (code 2034) of the VR/AR system executable by controller 2030, and data 2038 usable by the VR/AR system when executed on controller 2030. For example, in some embodiments, code 2034 may include code for performing lens distance test 180, and data 2038 may include captured eye image 230 and determined closest distance 220.
In some embodiments, the HMD 2000 may also include one or more interfaces (e.g., a Bluetooth technology interface, a USB interface, etc.) that communicate with an external device 2100 via a wired or wireless connection. In some embodiments, at least a portion of the functionality described for the controller 2030 may be implemented by the external device 2100. The external device 2100 may be or include any type of computing system or computing device, such as a desktop computer, notebook or laptop computer, tablet or pad device, smart phone, handheld computing device, game controller, game system, and so on.
In various embodiments, the controller 2030 may be a uniprocessor system including one processor, or a multiprocessor system including several processors (e.g., two, four, eight, or another suitable number). The controller 2030 may include Central Processing Units (CPUs) implementing any suitable instruction set architecture and may execute instructions defined in that instruction set architecture. For example, in various embodiments, the controller 2030 may include general-purpose or embedded processors implementing any of a variety of Instruction Set Architectures (ISAs), such as the x86, PowerPC, SPARC, RISC, or MIPS ISAs, or any other suitable ISA. In multiprocessor systems, each of the processors may commonly, but not necessarily, implement the same ISA. The controller 2030 may employ any microarchitecture, including scalar, superscalar, pipelined, superpipelined, out-of-order, in-order, speculative, non-speculative, etc., or combinations thereof. The controller 2030 may include circuitry to implement microcoding techniques. The controller 2030 may include one or more processing cores that each execute instructions. The controller 2030 may include one or more levels of cache, which may employ any size and any configuration (set associative, direct mapped, etc.). In some embodiments, the controller 2030 may include at least one Graphics Processing Unit (GPU), which may include any suitable graphics processing circuitry. Generally, a GPU may render objects to be displayed into a frame buffer (e.g., a frame buffer that includes pixel data for an entire frame). A GPU may include one or more graphics processors that may execute graphics software to perform some or all of the graphics operations, or hardware acceleration of certain graphics operations. In some embodiments, the controller 2030 may include one or more other components for processing and rendering video and/or images, such as an Image Signal Processor (ISP), encoder/decoder, and so on.
Memory 2032 may include any type of memory, such as Dynamic Random Access Memory (DRAM), Synchronous DRAM (SDRAM), double data rate (DDR, DDR2, DDR3, etc.) SDRAM (including mobile versions of SDRAM such as mDDR3, etc., or low power versions of SDRAM such as LPDDR2, etc.), RAMBUS DRAM (RDRAM), Static RAM (SRAM), and so on. In some embodiments, one or more memory devices may be coupled to a circuit board to form memory modules such as Single Inline Memory Modules (SIMMs), Dual Inline Memory Modules (DIMMs), and so on. Alternatively, the devices may be mounted with an integrated circuit implementing the system in a chip-on-chip configuration, a package-on-package configuration, or a multi-chip module configuration.
In some implementations, the HMD 2000 may include one or more cameras 2050 that capture video of a user environment for AR applications. In some implementations, the HMD 2000 may render and display frames to provide an augmented or mixed reality (AR) view to a user based at least in part on camera 2050 input. The AR view may include rendering the user's environment, including rendering real objects in the user's environment based on video captured by one or more cameras 2050, which capture high quality, high resolution video of the user's environment for display. In some implementations, the camera 2050 may be equipped with an autofocus mechanism. Although not shown, in some embodiments, the HMD 2000 may also include one or more sensors that collect information about the user's environment and actions (depth information, lighting information, user movements and gestures, etc.). The camera 2050 and sensors may provide information to the controller 2030 of the VR/AR system.
As shown, HMD 2000 may be positioned on the user's head such that display 2022A and display 2022B and eye lenses 2220A and 2220B are disposed in front of user's eyes 2292A and 2292B. IR or NIR light sources 2230A and 2230B (e.g., IR or NIR LEDs) may be positioned in HMD 2000 (e.g., around eye lenses 2220A and 2220B, or elsewhere in HMD 2000) to illuminate user's eyes 2292A and 2292B with IR or NIR light. Eye tracking cameras 2240A and 2240B (e.g., IR or NIR cameras, e.g., 400x400 pixel count cameras) are located on each side of the user's face, e.g., at or near the user's cheekbones. Note that the locations of the eye tracking camera 2240A and the eye tracking camera 2240B are given by way of example and are not intended to be limiting. In some implementations, there may be a single eye tracking camera 2240 located on each side of the user's face. In some implementations, there may be two or more eye tracking cameras 2240 on each side of the user's face. For example, in some implementations, a wide angle camera 2240 and a narrower angle camera 2240 may be used on each side of the face of the user. A portion of the IR or NIR light emitted by light sources 2230A and 2230B is reflected from the user's eyes 2292A and 2292B directly or via mirrors 2250A and 2250B located between the user's eyes 2292 and the display 2022 to respective eye tracking cameras 2240A and 2240B and captured by eye tracking cameras 2240A and 2240B to image the user's eyes 2292A and 2292B. Gaze tracking information captured by cameras 2240A and 2240B may be provided to controller 2030. The controller 2030 may analyze gaze tracking information (e.g., images of the user's eyes 2292A and 2292B) to determine gaze direction, eye position and movement, pupil dilation, or other characteristics of the eyes 2292A and 2292B.
The gaze tracking information obtained and analyzed by the controller 2030 may be used by the controller to perform various VR or AR system functions. For example, gaze points on the displays 2022A and 2022B may be estimated from images captured by the eye tracking cameras 2240A and 2240B using a glint-assisted method. For example, the estimated gaze point may be used to render virtual content differently based on the determined user gaze direction.
Embodiments of the HMD 2000 as illustrated herein may also be used in Virtual Reality (VR) applications to provide VR views to the user. In these embodiments, the controller 2030 of the HMD 2000 may render or obtain Virtual Reality (VR) frames including virtual content, and the rendered frames may be provided to the projector 2020 of the HMD 2000 for display on displays 2022A and 2022B. In some implementations, for VR applications, the controller 2030 may obtain distance information for virtual content displayed on the display panels 2022 and may use the distance information to direct the eye lenses 2220 to adjust focus according to the distance of the virtual content that the user is currently looking at, based on the gaze tracking information.
In various embodiments, the methods described herein may be implemented in software, hardware, or a combination thereof. Further, the order of the blocks of the methods may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Various modifications and changes will become apparent to those skilled in the art having the benefit of this disclosure. The various embodiments described herein are intended to be illustrative rather than limiting. Many variations, modifications, additions, and improvements are possible. Accordingly, plural instances may be provided for components described herein as a single instance. Boundaries between various components, operations, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of the claims that follow. Finally, structures and functions presented as discrete components in the exemplary configurations may be implemented as a combined structure or component. These and other variations, modifications, additions, and improvements may fall within the scope of the embodiments as defined in the claims that follow.
Claims (20)
1. A Head Mounted Display (HMD), comprising:
a lens;
one or more cameras; and
one or more processors configured to perform a lens distance test, the lens distance test comprising:
capturing one or more images of the eye using the one or more cameras;
determining a pose of the eye based at least in part on the one or more images;
determining a distance between a portion of the eye and the lens based at least in part on the pose; and
outputting a notification in response to determining that the distance is below a threshold.
2. The HMD of claim 1, wherein:
the lens is a display lens configured to project visual content generated by the HMD; and
the notification is output as part of the visual content.
3. The HMD of claim 1, wherein:
the lens is a clip-on lens added to the HMD; and
a position of the clip-on lens relative to the HMD is determined during an in-situ calibration performed when the clip-on lens is added to the HMD.
4. The HMD of claim 1, wherein:
the HMD implements a gaze tracking system that tracks a gaze direction of the eye; and
the pose of the eye is determined by the gaze tracking system.
5. The HMD of claim 4, wherein:
the gaze tracking system includes a plurality of Light Emitting Diodes (LEDs) that emit light toward the eye;
the one or more cameras capture reflections of the light emitted by the LEDs; and
the pose of the eye is determined based at least in part on the reflections.
6. The HMD of claim 1, wherein to determine the distance between the eye and the lens, the HMD is configured to:
determine a vertex position in a coordinate system corresponding to a corneal vertex of the eye; and
determine a point on the lens closest to the vertex position in the same coordinate system.
7. The HMD of claim 1, wherein to determine the pose of the eye, the HMD is configured to:
determine that the eye corresponds to a user-specific eye model of a user; and
determine the pose using the user-specific eye model.
8. The HMD of claim 1, wherein the pose of the eye is determined using a generic eye model that is not specific to a user.
9. The HMD of claim 1, wherein to perform the lens distance test, the HMD is configured to determine the distance for a plurality of poses of the eye.
10. The HMD of claim 1, wherein the HMD is configured to perform the lens distance test for both eyes of a user.
11. The HMD of claim 1, wherein the HMD is configured to:
perform a second lens distance test to determine that a second distance between the eye and the lens exceeds a second threshold; and in response,
output a second notification indicating that the eye is too far from the lens.
12. The HMD of claim 1, wherein the HMD is configured to perform the lens distance test during an initialization process of a user session of the HMD.
13. The HMD of claim 12, wherein the HMD is configured to repeat the lens distance test after the initialization process to monitor the distance between the eye and the lens during the user session.
14. A method, comprising:
performing, by a Head Mounted Display (HMD) implemented using one or more processors, a lens distance test, the lens distance test comprising:
capturing one or more images of an eye using one or more cameras of the HMD;
determining a pose of the eye based at least in part on the one or more images;
determining a distance between a portion of the eye and a lens of the HMD based at least in part on the pose; and
outputting a notification in response to determining that the distance is below a threshold.
15. The method of claim 14, wherein:
the HMD implements a gaze tracking system that tracks a gaze direction of the eye; and
the pose of the eye is determined by the gaze tracking system.
16. The method of claim 15, wherein:
the gaze tracking system includes a plurality of Light Emitting Diodes (LEDs) that emit light toward the eye;
the one or more cameras capture reflections of the light emitted by the LEDs; and
the pose of the eye is determined based at least in part on the reflections.
17. The method of claim 14, wherein determining the distance between the eye and the lens comprises:
determining a vertex position in a coordinate system corresponding to a corneal vertex of the eye; and
determining a point on the lens closest to the vertex position in the same coordinate system.
18. The method of claim 14, wherein determining the pose of the eye comprises:
determining that the eye corresponds to a user-specific eye model of a user; and
determining the pose using the user-specific eye model.
19. The method of claim 14, wherein performing the lens distance test comprises determining the distance for a plurality of poses of the eye.
20. One or more non-transitory computer-readable media storing program instructions that, when executed by one or more processors of a Head Mounted Display (HMD), cause the HMD to perform a lens distance test comprising:
capturing one or more images of an eye using one or more cameras of the HMD;
determining a pose of the eye based at least in part on the one or more images;
determining a distance between a portion of the eye and a lens of the HMD based at least in part on the pose; and
outputting a notification in response to determining that the distance is below a threshold.
Applications Claiming Priority (3)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| US63/376,954 | 2022-09-23 | | |
| US18/470,748 | | 2023-09-20 | |
| US18/470,748 (US20240105046A1) | 2022-09-23 | 2023-09-20 | Lens Distance Test for Head-Mounted Display Devices |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN117761892A | 2024-03-26 |
Family

ID=90324265

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202311232924.XA (pending) | Lens distance testing for head mounted display devices | 2022-09-23 | 2023-09-22 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN117761892A |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |