CN116338941A - Eye tracking device, display apparatus, and storage medium - Google Patents

Eye tracking device, display apparatus, and storage medium

Info

Publication number: CN116338941A
Application number: CN202111576836.2A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Pending
Prior art keywords: optical, eye, module, light, display device
Inventor: 朱帅帅
Original and current assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Related application: PCT/CN2022/139196, published as WO2023116541A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 With means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B2027/0178 Eyeglass type

Abstract

An embodiment of the application provides an eye tracking device, a display apparatus, and a storage medium. The eye tracking device comprises an optical display module and an eye tracking module. The optical display module comprises a display device and an optical module; the optical module comprises a first optical device, and first light emitted by the display device enters the eyes of a user after passing through the optical module. The eye tracking module comprises at least one light source configured to emit second light toward the user's eye, at least part of which illuminates the eye and is reflected, and a camera module configured to capture at least part of the second light reflected after the at least one light source irradiates the eye. The camera module is located between the first optical device and the display device. Compared with placing a near-infrared camera outside the lens group, this arrangement of the camera module reduces the inclination angle of the eye tracking light, helps the camera capture more of the light reflected by the user's eyes (i.e., the second light), and thereby improves eye tracking accuracy.

Description

Eye tracking device, display apparatus, and storage medium
Technical Field
The application relates to the technical field of gaze tracking, and in particular to an eye tracking device, a display apparatus, and a storage medium.
Background
Currently, some head-mounted display devices (HMDs) have an eye tracking function. Eye tracking follows eye movement by measuring the position of the eye's gaze point or the motion of the eyeball relative to the head; equivalently, eye tracking tracks the gaze direction of the user's eyes.
Eye tracking is generally implemented as follows: a near-infrared light source emits near-infrared light toward the user's eyes, a near-infrared camera arranged outside the lens group (on the path of the near-infrared light reflected by the user's eyes) photographs the eyes, and the user's gaze direction is then deduced from the captured light by back-end analysis.
The lens group of some head-mounted display devices is compact, for example a folded-optical-path (Pancake) lens group. Under the Pancake architecture, the distance from the user's eyeball to the Pancake lens group is relatively short, so if a near-infrared camera is installed outside the Pancake lens group, the inclination angle of the eye tracking light is too large and the captured eyeball image does not meet the requirements of the tracking algorithm.
Disclosure of Invention
Embodiments of the application provide an eye tracking device, a display apparatus, and a storage medium. In the eye tracking device, the camera module that captures the light reflected by the user's eyes is arranged between the first optical device of the optical display module and the display device. Compared with arranging a near-infrared camera outside the lens group, this arrangement of the camera module reduces the inclination angle of the eye tracking light, helps the camera capture more of the light reflected by the user's eyes (i.e., the second light), and thereby improves eye tracking accuracy.
In a first aspect, embodiments of the present application provide an eye tracking device comprising: an optical display module comprising a display device and an optical module, wherein the optical module comprises a first optical device, and first light emitted by the display device enters the eyes of a user after passing through the optical module; and an eye tracking module comprising: at least one light source configured to emit second light toward the user's eye, at least part of the second light illuminating the eye and being reflected; and a camera module configured to capture at least part of the second light reflected after the at least one light source irradiates the eye, wherein the camera module is located between the first optical device and the display device. It should be noted that, for the camera module located between the first optical device and the display device, a columnar space exists between the display device and the lens of the first optical device adjacent to the display device; in one embodiment, the camera module may be located in this columnar space. In another embodiment, the camera module is not limited to the columnar space and may instead be disposed at a point (point B) outside the columnar space, offset by a predetermined distance, in the direction perpendicular to the central axis of the lens barrel, from some point (point A) inside the columnar space.
In some embodiments, at least part of the second light is reflected by the user's eyes, transmitted through the first optical device, and captured by the camera module.
In some embodiments, the focal length of the optical module is variable.
In some embodiments, the eye tracking device further comprises a focal length measurement unit configured to trigger focal length detection after the focal length of the optical module is adjusted, so as to determine the current focal length information. In one embodiment, the focal length measurement unit may include a Hall sensor, a grating scale, or a sliding rheostat.
In some embodiments, the distance between the first optical device and the display device is adjustable, the first optical device being a lens.
In some embodiments, the first optical device is a variable focal length lens. Adjusting the focal length of the variable focal length lens may include adjusting the distance between the lens and the display device, and the focal length measurement unit may detect the current distance between the lens and the display device after that distance is changed. In some cases, the variable focal length lens may be a lens of variable curvature, whose focal length is changed by changing the shape of the lens.
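For intuition about why the lens-to-display spacing matters, the following is a minimal thin-lens sketch based on standard Gaussian optics; it is an illustration rather than a description of this application's optical module, and the sign convention used is one of several in common use.

```latex
% Illustrative thin-lens relation (not taken from the application).
% s_o: display-to-lens distance, f: lens focal length, s_i: image distance.
% With the display inside the focal length (s_o < f), the image is virtual:
\[
\frac{1}{s_o} + \frac{1}{s_i} = \frac{1}{f}
\quad\Longrightarrow\quad
|s_i| = \frac{s_o f}{f - s_o} \qquad (s_o < f).
\]
% As s_o approaches f, |s_i| grows without bound, so small changes in the
% lens-display spacing move the virtual image nearer or farther, which is the
% basis of diopter (myopia) adjustment by translating a lens.
```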
In some embodiments, the optical module further comprises an external lens interface, which is used for mounting a myopia correction lens on the lens barrel by magnetic attachment or a snap fit. In a scenario where myopia correction is performed through an external myopia correction lens, the focal length measurement unit may include a memory chip and an identity chip. In one embodiment, the memory chip may store the correspondence between the identity information and the optical power information of each myopia correction lens, and the memory chip may be disposed in the optical module; the identity chip stores the identity information (such as an identification number) of the myopia correction lens and may be disposed on the myopia correction lens. After the myopia correction lens is attached to the lens barrel of the optical module, the optical power information (i.e., focal length information) of the myopia correction lens can be matched in the memory chip based on the identity information.
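As a concrete illustration of this identity-based matching, a minimal sketch follows; the table contents, lens IDs, and function name are assumptions made for illustration and are not specified by the application.

```python
# Illustrative lookup of a corrective lens's optical power from its identity chip.
# LENS_POWER_TABLE stands in for the correspondence stored on the memory chip;
# the IDs and diopter values are made up.
LENS_POWER_TABLE = {
    "LENS-001": -2.0,   # optical power in diopters
    "LENS-002": -4.0,
    "LENS-003": -6.0,
}

def lookup_lens_power(lens_id: str) -> float:
    """Return the optical power (focal length information) matched to the
    identity information read from the lens's identity chip."""
    try:
        return LENS_POWER_TABLE[lens_id]
    except KeyError:
        raise ValueError(f"unknown corrective lens ID: {lens_id}")

print(lookup_lens_power("LENS-002"))  # -4.0
```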
In some embodiments, when the focal length of the optical module changes while the gaze direction of the user's eye remains unchanged, the gaze point of the user's eye determined by the eye tracking device remains unchanged.
In some embodiments, when the focal length of the optical module is changed, an optical path table applicable to the current focal length information is determined. Specific embodiments for determining the optical path table applicable to the current focal length information are described in the optical path table correction method provided in the second aspect.
In some embodiments, the eye tracking device further includes a processor, and the processor receives the result obtained by the camera module and obtains the gaze point of the user's eye.
In some embodiments, the optical display module further includes a second optical device through which at least a portion of the second light passes before being captured by the camera module.
In some embodiments, the second optical device is located between the first optical device and the display device, or the second optical device is located on a side of the first optical device facing away from the display device. In one embodiment, the second optic may be a lens and the first optic and the second optic may form a lens group, which may be, but is not limited to, a Pancake lens group.
In a second aspect, an embodiment of the present application further provides an eye tracking method for an eye tracking device, where the eye tracking device includes an optical display module and an eye tracking module; the optical display module includes a display device and an optical module, the optical module includes a first optical device, and the eye tracking module includes at least one light source and a camera module. The method comprises: first light emitted by the display device enters the eyes of a user after passing through the optical module; the at least one light source emits second light toward the user's eyes, at least part of the second light illuminating the eyes and being reflected; and at least part of the second light reflected after the at least one light source irradiates the eyes passes through the first optical device and is captured by the camera module.
In some embodiments, the method further comprises: acquiring the focal length information of the optical module; and determining the gaze direction of the user's eyes. The focal length information may include the focal length of the optical module, and the like. In some embodiments, the gaze direction of the user's eyes is determined according to the focal length information of the optical module and the second light captured by the camera module.
In some embodiments, the method further comprises: acquiring an optical path table set, wherein the optical path table set comprises at least two optical path tables corresponding to myopia correction positions; acquiring the current myopia correction parameters after a myopia correction operation; and performing optical path table interpolation on the optical path table set according to the myopia correction parameters to obtain a target optical path table.
In some embodiments, the method further comprises: acquiring the optical path table set from a target memory in which the optical path table set is pre-stored.
In some embodiments, the focal length of the optical module is variable; the eye tracking device further comprises a focal length measuring unit, wherein the focal length measuring unit is configured to trigger focal length detection after the focal length of the optical module is adjusted, and current focal length information is obtained.
In some embodiments, when the focal length of the optical module changes while the gaze direction of the user's eye remains unchanged, the gaze point of the user's eye determined by the eye tracking device remains unchanged.
In a third aspect, an embodiment of the present application further provides a method for correcting an optical path table, comprising: acquiring an optical path table set, wherein the optical path table set comprises at least two optical path tables corresponding to myopia correction positions; acquiring the current myopia correction parameters after a myopia correction operation; and performing optical path table interpolation on the optical path table set according to the myopia correction parameters to obtain a target optical path table.
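As a concrete illustration of the interpolation step, the sketch below linearly interpolates between pre-stored optical path tables keyed by the myopia correction parameter (in diopters). The table layout, the stored values, and the function name are assumptions for illustration only; the application does not specify the format of an optical path table.

```python
from bisect import bisect_left

# Each entry: (myopia correction parameter in diopters, optical path table).
# An optical path table is represented here as a flat list of optical path
# values; the real table layout and values are device-specific and made up here.
PATH_TABLE_SET = [
    (-6.0, [10.2, 10.4, 10.7, 11.1]),
    (-3.0, [10.0, 10.1, 10.3, 10.6]),
    ( 0.0, [ 9.8,  9.9, 10.0, 10.2]),
]

def interpolate_path_table(correction: float) -> list[float]:
    """Linearly interpolate a target optical path table for the current
    myopia correction parameter from the pre-stored table set."""
    params = [p for p, _ in PATH_TABLE_SET]
    if correction <= params[0]:
        return list(PATH_TABLE_SET[0][1])
    if correction >= params[-1]:
        return list(PATH_TABLE_SET[-1][1])
    i = bisect_left(params, correction)
    (p0, t0), (p1, t1) = PATH_TABLE_SET[i - 1], PATH_TABLE_SET[i]
    w = (correction - p0) / (p1 - p0)
    return [a + w * (b - a) for a, b in zip(t0, t1)]

print(interpolate_path_table(-4.5))  # element-wise halfway between the -6 D and -3 D tables
```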
In some embodiments, acquiring the optical path table set comprises: acquiring the optical path table set from a target memory in which the optical path table set is pre-stored.
In some embodiments, acquiring the current myopia correction parameters after the myopia correction operation comprises: acquiring position information of the lens that was translated in the lens group during the myopia correction operation; and converting the position information into the current myopia correction parameters after the myopia correction operation.
In some embodiments, the myopia correction operation comprises translating a lens in the lens group to adjust the virtual image distance; acquiring the position information of the lens translated in the lens group during the myopia correction operation comprises: acquiring the current position sensing information provided by a position sensor arranged on the translated lens, wherein the position sensor comprises a Hall sensor, a grating scale, or a sliding rheostat.
In some embodiments, converting the position information into the current myopia correction parameters after the myopia correction operation comprises: determining the myopia correction parameters corresponding to the current position sensing information based on a pre-calibrated relationship between myopia correction parameters and position sensing information.
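The following sketch shows one way such a pre-calibrated relationship could be applied, assuming the calibration is stored as a small table of (sensor reading, diopter) pairs and evaluated by piecewise-linear interpolation; the calibration values are illustrative, not measured.

```python
# Pre-calibrated (position sensor reading, myopia correction parameter) pairs.
# In practice these would come from factory calibration of the Hall sensor,
# grating scale, or sliding rheostat; the numbers below are made up.
CALIBRATION = [
    (120, 0.0),    # raw sensor reading -> correction in diopters
    (340, -2.0),
    (560, -4.0),
    (780, -6.0),
]

def correction_from_sensor(reading: float) -> float:
    """Map the current position sensing information to a myopia correction
    parameter by piecewise-linear interpolation of the calibration table."""
    pts = sorted(CALIBRATION)
    if reading <= pts[0][0]:
        return pts[0][1]
    if reading >= pts[-1][0]:
        return pts[-1][1]
    for (r0, d0), (r1, d1) in zip(pts, pts[1:]):
        if r0 <= reading <= r1:
            w = (reading - r0) / (r1 - r0)
            return d0 + w * (d1 - d0)
    raise AssertionError("unreachable: reading lies inside the calibrated range")

print(correction_from_sensor(450))  # -3.0, halfway between the 340 and 560 readings
```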
In some embodiments, the myopia correction operation comprises adding a myopia lens in front of the lens barrel to adjust the virtual image distance; acquiring the current myopia correction parameters after the myopia correction operation comprises: acquiring the optical power information of the myopia lens added in front of the lens barrel.
In some embodiments, acquiring the optical power information of the myopia lens added in front of the lens barrel comprises: acquiring the identity information of the myopia lens added in front of the lens barrel; and matching the optical power information of the myopia lens added in front of the lens barrel based on the identity information.
In some embodiments, before performing the optical path table interpolation on the optical path table set according to the myopia correction parameters to obtain the target optical path table, the method further comprises: acquiring a position set of a light source (such as a near-infrared LED), wherein the position set comprises position information of the near-infrared LED under at least two myopia correction parameters; interpolating the position set of the near-infrared LED according to the current myopia correction parameters to obtain the current position of the near-infrared LED under the current myopia correction parameters; and updating the position information of the near-infrared LED according to that current position, as in the sketch below.
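The same interpolation idea can be applied to the light-source coordinates. The sketch below is illustrative only: the coordinate values, correction settings, and function name are assumptions, and the real calibration data are device-specific.

```python
# Calibrated positions of one near-infrared LED at two myopia-correction
# settings; (x, y, z) in millimetres, values made up for illustration.
LED_POSITIONS = {
    -6.0: (12.0, 3.5, 40.0),
     0.0: (12.0, 3.5, 46.0),
}

def led_position(correction: float) -> tuple:
    """Linearly interpolate the LED position for the current correction."""
    (p0, a), (p1, b) = sorted(LED_POSITIONS.items())
    w = min(max((correction - p0) / (p1 - p0), 0.0), 1.0)  # clamp to [0, 1]
    return tuple(x + w * (y - x) for x, y in zip(a, b))

print(led_position(-3.0))  # (12.0, 3.5, 43.0)
```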
In a fourth aspect, embodiments of the present application further provide a method for correcting an optical path table, comprising: after a myopia correction operation in which a target lens in the lens group is translated from a first position to a second position, determining the current myopia correction parameters after the myopia correction operation according to the distance information between the first position and the second position; and performing optical path table interpolation on an optical path table set according to the myopia correction parameters to obtain a target optical path table, wherein the optical path table set is pre-stored data comprising at least two optical path tables corresponding to myopia correction positions.
In a fifth aspect, still another embodiment of the present application further provides an optical path table correction apparatus comprising a processor and a memory storing at least one instruction, wherein the optical path table correction method according to the second aspect or the third aspect is implemented when the at least one instruction is loaded and executed by the processor.
In a sixth aspect, still another embodiment of the present application further provides a head-mounted display device, which includes the optical path table correction apparatus provided in the fifth aspect. In one embodiment, the optical path table correction apparatus may be a component of the head-mounted display device, for example a chip.
In a seventh aspect, embodiments of the present application also provide a head-mounted display apparatus, the apparatus comprising the eye tracking device provided in the first aspect.
In an eighth aspect, still another embodiment of the present application further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method provided in the second aspect, the third aspect or the fourth aspect.
Through the above technical solutions, the camera module is arranged between the first optical device of the optical display module and the display device. Compared with arranging a near-infrared camera outside the lens group, this arrangement of the camera module reduces the inclination angle of the eye tracking light, helps capture more of the light reflected by the user's eyes (i.e., the second light), and thereby improves eye tracking accuracy.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort to a person skilled in the art.
Fig. 1 is a schematic diagram of a VR system provided in one embodiment of the present application;
Fig. 2A is a schematic structural diagram of a VR head-mounted display device according to an embodiment of the present application;
Fig. 2B is a simplified schematic diagram of a VR head-mounted display device provided in one embodiment of the present application;
Fig. 3 is a schematic diagram of a related-art head-mounted display device based on a Pancake folded light path;
Fig. 4 is a schematic diagram of the polarization of light of a Pancake mirror group according to the related art;
Fig. 5 is a schematic structural diagram of a VR head-mounted display device according to an embodiment of the present application;
Fig. 6 is an exemplary software architecture block diagram of a VR head-mounted display device of an embodiment of the present application;
Fig. 7 is a schematic diagram of eye tracking based on corneal reflection in the related art;
Fig. 8 is a schematic diagram of related-art refraction of a corneal reflected light ray through a Pancake lens group and a near-infrared camera lens;
Fig. 9 is a schematic diagram of an optical path table calibration for eye tracking in the related art;
Fig. 10a is a schematic diagram of a myopia correction system according to the related art;
Fig. 10b is a schematic diagram of another myopia correction system according to the present invention;
Fig. 11 is a schematic diagram of a system architecture according to yet another embodiment of the present application;
Fig. 12 is a flowchart of a method for correcting an optical path table according to an embodiment of the present disclosure;
Fig. 13 is a schematic view of an optical path table calibration according to still another embodiment of the present application;
Fig. 14 is a schematic view of obtaining parameters for myopia correction according to yet another embodiment of the present application;
Fig. 15 is a schematic view of a myopia correction lens mounting system according to yet another embodiment of the present application;
Fig. 16 is a schematic diagram of transmission of myopia correction parameters according to yet another embodiment of the present application;
Fig. 17a is a schematic view of myopia correction by rotating the lens barrel according to yet another embodiment of the present application;
Fig. 17b is a schematic view of myopia correction through a telescopic lens barrel according to yet another embodiment of the present application;
Fig. 18 is a flowchart of a method for correcting an optical path table according to still another embodiment of the present disclosure;
Fig. 19 is a schematic structural view of an optical path table correction device according to still another embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
(1) In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. In addition, it should be understood that in the description of this application, words such as "first" and "second" are used merely to distinguish between the objects described and are not intended to indicate or imply relative importance or order. For example, a first region and a second region merely distinguish two regions and do not indicate the importance or order of the two. In the embodiments of the present application, "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
(2) Virtual reality (VR) technology is a human-computer interaction means created by means of computer and sensor technologies. VR technology integrates a variety of technologies such as computer graphics, computer simulation, sensing, and display, and can create a virtual environment. The virtual environment comprises three-dimensional, realistic images generated by a computer and played dynamically in real time, giving the user visual perception. Moreover, besides the visual perception generated by computer graphics technology, there are also perceptions such as hearing, touch, force, and motion, and even smell and taste; this is also called multi-perception. In addition, the user's head rotation, eye movement, gestures, or other behaviors can be detected, and the computer processes data adapted to the user's actions, responds to those actions in real time, and feeds the responses back to the user's senses, thereby forming the virtual environment. For example, a user wearing a VR wearable device can see a VR game interface and can interact with that interface through gestures, handles, and the like, as if in the game.
(3) Augmented reality (Augmented Reality, AR) technology refers to overlaying computer-generated virtual objects over a real-world scene, thereby enabling augmentation of the real world. That is, the AR technology needs to acquire a real-world scene and then add a virtual environment on the real world.
Thus, VR technology differs from AR technology in that VR technology creates a complete virtual environment, and everything the user sees is a virtual object; AR technology, by contrast, superimposes virtual objects on the real world, i.e., the scene includes both the real world and virtual objects. For example, a user wears transparent glasses through which the surrounding real environment can be seen, and virtual objects can be displayed on the glasses, so that the user sees both real objects and virtual objects.
(4) Mixed reality (MR) technology introduces real-scene information into the virtual environment and sets up an interactive feedback loop among the virtual environment, the real world, and the user, so as to enhance the realism of the user experience. Specifically, a real object is virtualized (for example, a camera scans the real object for three-dimensional reconstruction to generate a virtual object), and the virtualized real object is introduced into the virtual environment, so that the user can see the real object in the virtual environment.
It should be noted that the technical solutions provided in the embodiments of the present application may be applied to scenarios in which the electronic device performing eye tracking is a head-mounted device, such as a VR, AR, or MR scenario, and may also be applied to scenarios in which the electronic device performing eye tracking is not head-mounted, for example eye tracking with a terminal device (such as a mobile phone or tablet computer), a computer monitor, a smart car, or a large-screen device such as a television. For example, in a smart-car driving scenario, the technical solutions provided in the embodiments of the present application can determine the gaze point of the human eye more accurately, so that eye tracking is performed more rapidly and accurately. In short, the solutions are suitable for any scenario in which the gaze point of the human eye needs to be determined accurately for eye tracking.
For ease of understanding, VR scenarios are mainly described below as examples.
For example, please refer to Fig. 1, which is a schematic diagram of a VR system according to an embodiment of the present application. The VR system, which may be referred to as a VR split machine, includes a VR wearable device (the VR head-mounted display device 100 is taken as an example in this embodiment) and an image processing device 200. The VR head-mounted display device 100 may be connected to the image processing device 200. The connection between the VR head-mounted display device 100 and the image processing device 200 may be wired or wireless; the wireless connection may be Bluetooth (BT), conventional Bluetooth or Bluetooth Low Energy (BLE), wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Zigbee, frequency modulation (FM), near field communication (NFC), infrared (IR), or a general 2.4G/5G band wireless communication connection, etc.
In some embodiments, the image processing device 200 may perform processing calculations, for example, the image processing device 200 may generate and process an image (the manner of processing will be described below), and then send the processed image to a VR head mounted display device for display. The image processing device 200 may include a host (e.g., VR host) or a server (e.g., VR server), among others. The VR host or VR server may be a device with greater computing capabilities. For example, the VR host may be a device such as a cell phone, tablet, notebook, etc., and the VR server may be a cloud server, etc.
In some embodiments, VR head mounted display device 100 may be glasses, a helmet, or the like. Typically, VR head mounted display device 100 has two display devices, namely display device 110 and display device 120 disposed thereon. The display device of VR head mounted display device 100 may display images to the human eye. In the embodiment shown in fig. 1, the display device 110 and the display device 120 are wrapped inside VR glasses, so the arrows for indicating the display device 110 and the display device 120 in fig. 1 are indicated with dotted lines.
In some embodiments, the VR head-mounted display device 100 natively provides functions such as image generation and processing, i.e., the VR head-mounted display device 100 does not require the image processing device 200 in Fig. 1; such a VR head-mounted display device 100 may be referred to as a VR all-in-one machine.
Fig. 2A is a schematic diagram of the VR head-mounted display device 100. As shown in (a) of Fig. 2A, the VR head-mounted display device 100 includes an optical display module 210 and an optical display module 220. The optical display module 210 includes a display device and an optical device 211, and the optical display module 220 includes a display device and an optical device 221. In some embodiments, the optical display module 210 and the optical display module 220 further include additional optical devices. In some embodiments, the optical display module 210 and the optical display module 220 may each be a lens barrel, i.e., a hollow cylinder that houses the optical devices; through the lens barrels, the optical devices and the display devices are mounted on the VR head-mounted display device 100. When the VR head-mounted display device 100 is worn by a user, the optical display module 210 may be used to present images to the user's left eye, and the optical display module 220 may be used to present images to the user's right eye. It will be appreciated that the VR head-mounted display device 100 shown in Fig. 2A may include other components, for example a support 230 and a stand 240, where the support 230 supports the VR head-mounted display device 100 on the bridge of the nose and the stand 240 supports it on both ears so that the device is worn stably. As shown in (b) of Fig. 2A, at least one eye tracking module (including M light sources 2501 and at least one camera module 2502) may be disposed on the VR head-mounted display device 100 to track the movement of the human eye and further determine the gaze point of the human eye. In some embodiments, the light sources 2501 and the camera module 2502 may be disposed on the end face 100a of the VR head-mounted display device 100 facing the face (or the eyes of the user). For example, the light sources 2501 are located on the eye-facing end surface 210a of the lens barrel, and the eye-facing end surface 210a may be understood as a portion of the end face 100a. Taking the eye tracking module 250 on the optical display module 210 as an example, in some embodiments the eye tracking module 250 includes light sources 2501 and a camera module 2502. For example, 8 light sources 2501 may be arranged on the eye-facing end surface 210a and uniformly distributed in a circle, and the camera module 2502 may also be disposed on the eye-facing end surface 210a. The light sources 2501 and the camera module 2502 may each be disposed around the eye-facing side of the optical device 211.
For convenience of description, refer to Fig. 2B, which may be understood as a simplification of the VR head-mounted display device 100 in Fig. 2A; for example, only the optical display module 210 and the optical display module 220 are shown in Fig. 2B, and other components are omitted. As shown in Fig. 2B, when the VR head-mounted display device 100 is worn by a user, the display device 110 is located on the side of the optical device 211 facing away from the left eye, the display device 120 is located on the side of the optical device 221 facing away from the right eye, and the optical device 211 and the optical device 221 are symmetrical with respect to the center line of the face or the center line D of the VR head-mounted display device 100, where the center line of the face may be the perpendicular bisector between the left eye and the right eye, and the center line D of the VR head-mounted display device 100 may be the center line of the end face 100a, the center line of the stand 240, or the like. When the display device 110 displays an image, the light emitted from the display device 110 is converged to the person's left eye through the optical device 211; when the display device 120 displays an image, the light emitted from the display device 120 is converged to the person's right eye through the optical device 221. In some embodiments, the VR head-mounted display device 100 may further include an optical device 212 and an optical device 222, where the optical device 212 and the optical device 211 form one lens group and the optical device 222 and the optical device 221 form another lens group (a lens group may be understood as an optical module). A lens group may include at least one optical device, and one or more of the optical devices in the lens group may be adjusted to change the optical power of the lens group; for example, one or more optical devices in the lens group may be moved away from or toward the display device to change the optical power.
Note that the composition of the VR head-mounted display device 100 shown in Fig. 2A or 2B is merely one schematic logical representation. In a specific implementation, the number of optical devices and/or display devices may be set flexibly according to different requirements. For example, in some embodiments, the display device 110 and the display device 120 may be two separate display devices, or two display areas on the same display device. In some embodiments, the display device 110 and the display device 120 may be a display screen, such as a liquid crystal screen, a light emitting diode (LED) display screen, or other types of display devices, which are not limited in this application. In the embodiment shown in Fig. 2B, the optical module may include two devices, the optical device 222 and the optical device 221; in other embodiments, the optical module may include one optical device or more than three optical devices. An optical device may be one or several of a reflecting mirror, a transmission mirror, an optical waveguide, or the like, or a device that enlarges the field of view; for example, the optical device may be a lens, and a plurality of lenses may form a lens group. Exemplary optical devices include Fresnel lenses and/or aspheric lenses.
Fig. 3 is a schematic diagram of a related-art head-mounted display device based on a Pancake folded optical path. The Pancake folded optical path is commonly used in head-mounted display devices to fold the optical path; the optical display module 210 (left eye) or the optical display module 220 (right eye) is taken as an example, where the optical module may include two devices, namely the optical device 222 and the optical device 221. As shown in Fig. 3, the head-mounted display device based on the Pancake folded optical path may include a Pancake lens group 10 (i.e., the optical module 130 or the optical module 140) and a display assembly 20 (i.e., the display device 110 or the display device 120). The Pancake lens group 10 may consist of a plurality of lenses (a lens may be understood as an optical device) and may include multiple coating layers; specifically, the Pancake lens group 10 may include a beam splitter (BS) 101, a first quarter-wave plate (QWP) 102 (hereinafter referred to as QWP1), and a polarization reflective film (polarization reflector, PR) 103. The display assembly may include a display screen 201, a polarizer (P) 202, and a second quarter-wave plate (QWP) 203 (hereinafter referred to as QWP2).
Fig. 4 is a schematic diagram of the polarization states of light in a Pancake lens group in the related art. As shown in Fig. 4, light from the display screen is folded inside the Pancake lens group 10 and finally exits into the user's eyes. The multiple coating layers in the Pancake lens group 10 allow the light to be folded between the film layers. Specifically, light emitted by the display screen 201 is modulated into linearly polarized light after passing through the polarizer 202; without loss of generality, the polarization direction can be taken along the y-axis. After passing through QWP2 (203), whose fast axis is at 45 degrees to the y-axis, the light becomes right-handed circularly polarized. The light then reaches the semi-reflective, semi-transmissive film 101 (the beam splitter); part of the light is reflected, and another part is transmitted and passes through QWP1 (102) to reach the polarization reflective film 103. The fast axis direction of QWP1 (102) is the same as that of QWP2 (203), so at this point the light is again linearly polarized, with the polarization direction along the x-axis. The polarization reflective film 103 reflects polarized light along the x-axis and transmits polarized light along the y-axis. The light is therefore reflected, passes back through QWP1 (102), and reaches the transflective film 101, at which point it is right-handed circularly polarized. As before, part of the light is transmitted and another part is reflected. The reflected light becomes left-handed circularly polarized, and after passing through QWP1 (102) it is modulated again into linearly polarized light with the polarization direction along the y-axis. According to the characteristics of the polarization reflective film 103, this light exits through the polarization reflective film 103 and finally enters the human eye.
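The polarization bookkeeping above can be checked with Jones calculus. The sketch below is a minimal illustration under one common sign convention (global phases and handedness labels depend on the convention and are not taken from the application): a quarter-wave plate with its fast axis at 45 degrees to the y-axis converts y-polarized light into circularly polarized light, and two such passes convert it into x-polarized light, matching the linear-to-circular-to-linear sequence described above.

```latex
% Illustrative Jones-calculus check (one convention; not from the application).
% Quarter-wave plate with fast axis at 45 degrees, up to a global phase:
\[
Q_{45} = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix},
\qquad
Q_{45}\begin{pmatrix}0\\1\end{pmatrix}
      = \frac{1}{\sqrt{2}}\begin{pmatrix} i \\ 1 \end{pmatrix}
      \;\text{(circular)},
\qquad
Q_{45}^{2}\begin{pmatrix}0\\1\end{pmatrix}
      = i\begin{pmatrix}1\\0\end{pmatrix}
      \;\text{(linear along } x\text{)}.
\]
```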
The head-mounted display device has an eye tracking function and obtains the visual axis of the user's eyeballs through eye tracking calibration. Eye tracking follows eye movement by measuring the position of the eye's gaze point or the motion of the eyeball relative to the head; equivalently, eye tracking tracks the gaze direction of the user's eyes.
It is understood that more devices may be included in VR head mounted display device 100. For example, please refer to fig. 5, which illustrates a schematic structural diagram of a VR headset 100 according to an embodiment of the present application. As shown in fig. 5, VR head mounted display device 100 may include a processor 401, a memory 402, a sensor module 403 (which may be used to obtain a user's gesture), a microphone 404, keys 405, an input/output interface 406, a communication module 407, a camera 408, a battery 409, an optical display module 410, and an eye tracking module 412, among others.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the VR head mounted display device 100. In other embodiments of the present application, VR head mounted display device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 401, which is typically used to control the overall operation of the VR head mounted display device 100, may include one or more processing units, such as: processor 401 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a video processing unit (video processing unit, VPU) controller, memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. In some embodiments, the processor 401 may also be a micro control unit (Microcontroller Unit, MCU).
A memory may also be provided in the processor 401 for storing instructions and data. In some embodiments, the memory in the processor 401 is a cache memory. The memory may hold instructions or data that has just been used or recycled by the processor 401. If the processor 401 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 401 is reduced, thus improving the efficiency of the system.
In some embodiments of the present application, the processor 401 may acquire the eye glint image sent by the camera module in the eye tracking module 412 and may determine the position of the user's eyes, so as to calculate the user's gaze point.
In some embodiments, the processor 401 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, a serial peripheral interface (serial peripheral interface, SPI) interface, and the like.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 401 may include multiple sets of I2C buses.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 401 with the communication module 407. For example: the processor 401 communicates with the bluetooth module in the communication module 407 through a UART interface, and implements a bluetooth function.
The MIPI interface may be used to connect the processor 401 to peripheral devices such as a display screen, camera 180, etc. in the optical display module 410.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 401 with the camera 180, a display in the optical display module 410, the communication module 407, the sensor module 403, the microphone 404, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc. In some embodiments, the camera 180 may acquire an image including a real object, and the processor 401 may fuse the image acquired by the camera with the virtual object, and the fused image may be realistically displayed through the optical display module 410.
The USB interface is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface may be used to connect a charger to charge the VR head-mounted display device 100, to transfer data between the VR head-mounted display device 100 and a peripheral device, or to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as mobile phones. The USB interface may be USB 3.0, compatible with high-speed DisplayPort (DP) signal transmission, and may carry high-speed video and audio data.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the VR head-mounted display device 100. In other embodiments of the present application, VR head mounted display device 100 may also employ different interfacing manners, or a combination of multiple interfacing manners, as in the above embodiments.
In addition, VR head mounted display device 100 may include wireless communication functionality, e.g., VR head mounted display device 100 may receive images from other electronic devices (e.g., VR hosts) for display. The communication module 407 may include a wireless communication module and a mobile communication module. The wireless communication function may be implemented by an antenna (not shown), a mobile communication module (not shown), a modem processor (not shown), a baseband processor (not shown), and the like. The antenna is used for transmitting and receiving electromagnetic wave signals. The VR head mounted display device 100 may include multiple antennas therein, each of which may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module may provide a solution for wireless communication applied on the VR head-mounted display device 100, including second generation (2G), third generation (3G), fourth generation (4G), and fifth generation (5G) networks, etc. The mobile communication module may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module can receive electromagnetic waves via the antenna, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation via the antenna. In some embodiments, at least some of the functional modules of the mobile communication module may be provided in the processor 401. In some embodiments, at least some of the functional modules of the mobile communication module may be provided in the same device as at least some of the modules of the processor 401.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speakers, etc.), or displays images or video through a display screen in the optical display module 410. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module or other functional module, independent of the processor 401.
The wireless communication module may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the VR head mounted display device 100. The wireless communication module may be one or more devices that integrate at least one communication processing module. The wireless communication module receives electromagnetic waves via an antenna, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 401. The wireless communication module may also receive a signal to be transmitted from the processor 401, frequency modulate it, amplify it, and convert it into electromagnetic waves to radiate.
In some embodiments, the antenna and mobile communication module of VR head mounted display device 100 are coupled such that VR head mounted display device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
VR head mounted display device 100 implements display functions through a GPU, optical display module 410, and an application processor, among other things. The GPU is a microprocessor for image processing, and is connected to the optical display module 410 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 401 may include one or more GPUs that execute program instructions to generate or change display information.
Memory 402 may be used to store computer executable program code that includes instructions. The processor 401 executes instructions stored in the memory 402 to thereby perform various functional applications and data processing of the VR head mounted display device 100. The memory 402 may include a stored program area and a stored data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of VR head mounted display device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash memory (universal flash storage, UFS), and the like.
VR head mounted display device 100 may implement audio functionality through an audio module, speakers, microphone 404, an earphone interface, an application processor, and so forth. Such as music playing, recording, etc. The audio module is used for converting digital audio information into analog audio signals for output and also used for converting analog audio input into digital audio signals. The audio module may also be used to encode and decode audio signals. In some embodiments, the audio module may be disposed in the processor 401, or a part of the functional modules of the audio module may be disposed in the processor 401. Speakers, also known as "horns," are used to convert audio electrical signals into sound signals. VR headset 100 may listen to music through a speaker or to hands-free conversations.
A microphone 404, also called a "microphone" or "microphone", is used to convert sound signals into electrical signals. The VR headset 100 may be provided with at least one microphone 404. In other embodiments, VR head mounted display device 100 may be provided with two microphones 404 that may enable noise reduction in addition to capturing sound signals. In other embodiments, the VR head mounted display device 100 may also be provided with three, four, or more microphones 404 to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface is used for connecting wired earphones. The earphone interface may be a USB interface, a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
In some embodiments, VR headset 100 may include one or more keys 405 that may control the VR headset to provide a user with access to functionality on VR headset 100. The keys 405 may be in the form of buttons, switches, dials, and touch or near touch sensing devices (e.g., touch sensors). Specifically, for example, the user may turn on the optical display module 410 of the VR head-mounted display device 100 by pressing a button. The keys 405 include a power-on key, a volume key, etc. The key 405 may be a mechanical key. Or may be a touch key. The VR head mounted display device 100 may receive key inputs, generating key signal inputs related to user settings and function control of the VR head mounted display device 100.
In some embodiments, VR head mounted display device 100 may include an input-output interface 406, and input-output interface 406 may connect other means to VR head mounted display device 100 through suitable components. The components may include, for example, audio/video jacks, data connectors, and the like.
The optical display module 410 is used to present images to the user under the control of the processor 401. The optical display module 410 may convert a real pixel image into a near-eye projected virtual image through one or more optical devices such as a reflector, a transmission mirror, or an optical waveguide, so as to implement a virtual interactive experience or an interactive experience combining virtuality and reality. For example, the optical display module 410 receives image data information sent by the processor 401 and presents the corresponding image to the user. In some embodiments, the optical display module 410 may include an optical display module 210 and an optical display module 220.
In an embodiment of the present application, the VR head-mounted display device 100 further includes an eye tracking module 412. The eye tracking module 412 is used to track the movement of the human eye and thereby determine the gaze point of the human eye. For example, the pupil position can be located by image processing, the pupil center coordinates obtained, and the gaze point then calculated. In some embodiments, the eye tracking system may determine the gaze point position of the user (or determine the gaze direction of the user) by video oculography, a photodiode response method, or the pupil-cornea reflection method, so as to implement eye tracking of the user.
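By way of illustration only, the following is a minimal sketch of the first step of such a pupil-based approach: locating the pupil center in a near-infrared eye image by thresholding and centroid computation. The threshold value, image source, and function name are assumptions for illustration, not part of the claimed device.

```python
import cv2

def pupil_center(eye_image_gray, threshold=40):
    """Estimate the pupil center (u, v) in pixel coordinates.

    Under near-infrared illumination the pupil usually appears as the
    darkest blob, so an inverse threshold followed by a centroid
    computation gives a rough first estimate of the pupil center.
    """
    _, mask = cv2.threshold(eye_image_gray, threshold, 255,
                            cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)  # assume pupil = largest dark blob
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```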
It should be noted that, in some embodiments of the present disclosure, an eye tracking module may be provided for each of the user's eyes, and the two modules may perform eye tracking synchronously or asynchronously. In other embodiments of the present disclosure, an eye tracking module may be disposed near only a single eye of the user; this module obtains the line-of-sight direction of the corresponding eye, and the line-of-sight direction or gaze point position of the user's other eye is then determined from the relationship between the gaze points of the two eyes (for example, the gaze points of the two eyes are generally close to or coincident when the user observes an object with both eyes), in combination with the distance between the user's two eyes, as illustrated by the sketch below.
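A minimal sketch of the single-module case, under the stated assumption that both eyes fixate approximately the same 3D point, and with an illustrative head-fixed coordinate system (the x axis pointing from the tracked eye toward the other eye):

```python
import numpy as np

def other_eye_gaze(gaze_point, tracked_eye_pos, interpupillary_dist):
    """Infer the untracked eye's line-of-sight direction, assuming both
    eyes fixate (approximately) the same 3D gaze point."""
    gaze_point = np.asarray(gaze_point, dtype=float)
    tracked_eye_pos = np.asarray(tracked_eye_pos, dtype=float)
    other_eye_pos = tracked_eye_pos + np.array([interpupillary_dist, 0.0, 0.0])
    direction = gaze_point - other_eye_pos
    return direction / np.linalg.norm(direction)
```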
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the VR head-mounted display device 100. In other embodiments of the present application, the VR head-mounted display device 100 may include more or fewer components than shown in fig. 2A, some components may be combined or split, or the components may be arranged differently; the embodiments of the present application are not limited in this respect.
It will be appreciated that the VR head-mounted display device 100 is one example of an electronic device in the embodiments of the present application; many other forms of electronic device are possible, such as an AR wearable device, an MR wearable device, a vehicle-mounted eye tracking display device, a smart mobile device, a large-screen display, a smart car, or a computer monitor, which are not limited herein.
Fig. 6 is an exemplary software architecture block diagram of VR head mounted display device 100 of an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system is divided into four layers, from top to bottom: an application layer 501, an application framework layer 502, a runtime 503 and system library 504, and a kernel layer 505.
The application layer 501 may include a series of application packages.
As shown in fig. 6, the application packages may include applications (which may also be referred to as apps) such as camera 501A, calendar 501B, map 501C, WLAN 501D, music 501E, short message 501F, gallery 501G, call 501H, navigation 501I, Bluetooth 501J, video 501K, and the like.
The application framework layer 502 provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 6, the application framework layer 502 may include a window manager 5021, a content provider 5022, a phone manager 5023, a resource manager 5024, a notification manager 5025, a view system 5026, and the like.
The window manager 5021 is used for managing window programs. The window manager 5021 can obtain the display size, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider 5022 is used to store and retrieve data and make the data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The phone manager 5023 is used to provide the communication functions of the VR head-mounted display device 100, for example management of call states (connected, hung up, and the like).
The resource manager 5024 provides various resources such as localization strings, icons, pictures, layout files, video files, and the like to the application program.
The notification manager 5025 allows an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction; for example, it is used to notify that a download is complete, to present message reminders, and so on. The notification manager may also present notifications in the form of a chart or scrolling text in the system top status bar, such as notifications of applications running in the background, or present notifications on the screen in the form of a dialog interface. For example, text may be prompted in the status bar, a prompt tone emitted, the electronic device vibrated, or an indicator light blinked.
The view system 5026 comprises visual controls, such as controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The runtime 503 includes core libraries and a virtual machine, and is responsible for scheduling and management of the system.
The core library consists of two parts: one part is the functions that the programming language (for example, the Java language) needs to call, and the other part is the core library of the system.
The application layer 501 and the application framework layer 502 run in a virtual machine. The virtual machine executes the programming files (e.g., java files) of the application layer 501 and the application framework layer 502 as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library 504 may include a plurality of functional modules. For example: surface manager 5041 (surface manager), three-dimensional graphics processing library 5042 (e.g., openGL ES), two-dimensional graphics engine 5043 (e.g., SGL), media library 5044 (Media Libraries), and the like.
The surface manager 5041 is used to manage the display subsystem and provides a fusion of two-Dimensional (2D) and three-Dimensional (3D) layers for multiple applications.
The media library 5044 supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
Three-dimensional graphics processing library 5042 is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, among others.
Two-dimensional graphics engine 5043 is a drawing engine for 2D drawing.
The kernel layer 505 is a layer between hardware and software. The kernel layer 505 contains at least a display driver 5051, a camera driver 5052, an audio driver 5053, and a sensor driver 5054.
In some embodiments of the present application, the application framework layer 502 may further include an eye tracking function module 5027, configured to match the light spots in the human-eye image acquired through the camera driver 5052 with the light sources, and to calculate the user's line-of-sight direction so as to determine the user's gaze point. In other embodiments of the present application, the eye tracking function module 5027 may also be located in the application layer 501, the system library 504, or the kernel layer 505, which is not limited herein.
Fig. 7 is a schematic diagram of eye tracking based on cornea reflection in the related art. As shown in fig. 7, the line connecting the fovea 30a and the cornea sphere center 30b of the user's eyeball 30 is the visual axis 30c of the eyeball 30, i.e. the line-of-sight direction. The line connecting the pupil center 30d and the cornea sphere center 30b is the optical axis 30e of the eyeball 30. There is an angle between the optical axis 30e and the visual axis 30c of the eyeball 30, which some measurements put at about 5°. The cornea-reflection eye tracking architecture shown in fig. 7 may include a plurality of near-infrared light emitting diodes (LEDs) D1 and a plurality of near-infrared cameras D2. A near-infrared camera can capture the light spots formed by the near-infrared LEDs D1 reflecting off the cornea of the eyeball, and can also capture the pupil of the eyeball 30 and thereby find the pupil center 30d. According to the law of reflection, the normal determined by the light reflected by the cornea passes through the cornea sphere center 30b, so the position of the cornea sphere center 30b can be calculated from the positions of all the LED light spots 30g, and the optical axis 30e of the eyeball 30 can then be obtained by combining it with the position of the image 30f of the pupil center. In some embodiments, the visual axis 30c of the eyeball 30 is obtained by calibration.
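To make the geometry concrete, the following is a minimal sketch assuming the cornea sphere center 30b and pupil center 30d have already been reconstructed in 3D from the LED glints and the pupil image; the rotation-matrix representation of the calibrated offset between the optical and visual axes is an assumption for illustration only.

```python
import numpy as np

def optical_axis(cornea_center, pupil_center):
    """Unit vector along the eyeball optical axis 30e, i.e. from the
    cornea sphere center 30b toward the pupil center 30d (3D points)."""
    v = np.asarray(pupil_center, dtype=float) - np.asarray(cornea_center, dtype=float)
    return v / np.linalg.norm(v)

def visual_axis(opt_axis, kappa_rotation):
    """Apply a per-user calibrated rotation (3x3 matrix, roughly a 5°
    offset) mapping the optical axis 30e to the visual axis 30c."""
    return kappa_rotation @ opt_axis
```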
Under the Pancake architecture described above, the structure of the system is compact and the distance from the user's eyeball 30 to the Pancake lens group 10 is relatively short. If the near-infrared camera D2 is installed outside the Pancake lens group 10, its tilt angle is therefore too large, and the captured eyeball image cannot meet the requirements of the algorithm.
To overcome the above technical problem, an embodiment of the present application provides an eye tracking device, which may include an optical display module, where the optical display module includes a display device and an optical module. The optical module includes a first optical device, and the first light emitted by the display device enters the user's eyes after passing through the optical module. The eye tracking device further includes an eye tracking module comprising: at least one light source (e.g., a near-infrared LED D1) configured to emit second light toward the user's eye, at least part of the second light illuminating the eye and being reflected; and a camera module (e.g., a near-infrared camera D2) configured to acquire at least part of the second light reflected after the at least one light source irradiates the eye. The camera module is located between the first optical device and the display device. It should be noted that, in the foregoing embodiment, a columnar space exists between the display device and the lens of the first optical device adjacent to the display device; in one embodiment, the camera module may be located in this columnar space. In another embodiment, the camera module is not limited to being disposed within the columnar space, and may be disposed at a point (point B) outside the columnar space lying a predetermined distance, perpendicular to the central axis of the lens barrel, from some point (point A) within the columnar space.
In one embodiment, the optical module comprises only the first optical device, i.e. a single-lens barrel is provided. At least part of the second light is reflected by the user's eyes, passes through the first optical device, and is then acquired by the camera module. In this embodiment, the focal length of the optical module is variable; specifically, the distance between the first optical device and the display device is adjustable, and the first optical device is a lens. In other words, myopia correction for the user may be achieved by adjusting the distance between the lens and the display screen 201.
The eye tracking device provided by the embodiments of the present application may further comprise a focal length measurement unit configured to trigger focal length detection after the focal length of the optical module is adjusted, so as to determine the current focal length information. In one embodiment, the focal length measurement unit may include a Hall sensor, a grating scale, or a sliding rheostat. Further, the focal length measurement unit may detect the current distance between the lens and the display screen 201 after the distance between the lens and the display device has changed.
In some embodiments, the myopia correction system may further include an external myopia correction lens, and the optical module may further include an external lens interface, where the external lens interface is used to mount the myopia correction lens on the lens barrel by magnetic attraction or a snap fit. In the scenario of myopia correction through an external myopia correction lens, the focal length measurement unit may include a storage chip and an identity chip. In one embodiment, the storage chip, which may be disposed in the optical module, stores the correspondence between the identity information and the focal power information of each myopia correction lens; the identity chip, which may be disposed on the myopia correction lens, stores the identity information (such as an identification number) of that lens. After the myopia correction lens is attached to the lens barrel of the optical module, the focal power information (i.e. focal length information) of the myopia correction lens can be matched in the storage chip based on the identity information.
In the myopia correction scenario with an external myopia correction lens, the condition for triggering the focal length measurement unit to perform focal length detection is as follows:
the external lens interface may include a contact point. When a myopia correction lens is attached to the lens barrel, its installation state can be detected through the contact point; once it is determined that the myopia correction lens is installed on the lens barrel, the identity information of the myopia correction lens can be obtained from its identity chip, and the corresponding focal power information is matched from the storage chip based on that identity information.
When the focal length of the optical module is changed, the gaze point direction of the user's eye is unchanged, and the gaze point of the user's eye determined by the eye tracking device is also unchanged. In some embodiments, when the focal length of the optical module is changed, an optical path table applicable to the current focal length information is determined. The eye tracking device further comprises a processor, which receives the result obtained by the camera module and determines the gaze point of the user's eyes.
In another embodiment, the optical display module further includes a second optical device, and at least a portion of the second light is transmitted through the second optical device before being captured by the camera module. The second optical device is located between the first optical device and the display device, or the second optical device is located on a side of the first optical device facing away from the display device. In one embodiment, the second optical device may be a lens, and the first optical device and the second optical device may form a lens group, which may be, but is not limited to, a Pancake lens group.
Fig. 8 is a schematic diagram of the refraction of a cornea-reflected light beam through a Pancake lens group and a near-infrared camera lens in the related art. As shown in fig. 8, the cornea-reflected beam and the pupil-refracted beam are both refracted through the Pancake lens group 10 and the lens of the near-infrared camera D2 before striking the sensor of the near-infrared camera D2. Therefore, in order to calculate the positions of the cornea sphere center 30b and the pupil center 30d, it is necessary to calibrate the correspondence between the pixels on the sensor of the near-infrared camera D2 and the eyeball-side chief ray 41, in other words, to determine the optical path table.
Fig. 9 is a schematic diagram of calibrating the optical path table for eye tracking in the related art. As shown in fig. 9, in the related-art calibration scheme, a calibration plate 51 is placed in front of the lens barrel, an image of the calibration plate 51 is captured by the near-infrared camera D2, and the correspondence between the feature points on the calibration plate 51 and the pixels on the sensor of the near-infrared camera D2 is obtained through feature point detection. The calibration plate 51 is then translated a certain distance along the optical axis of the lens barrel, another image of the calibration plate 51 is captured with the near-infrared camera D2, and the correspondence between the feature points on the calibration plate 51 and the pixels on the sensor of the near-infrared camera D2 is calculated again. Through interpolation, each pixel on the sensor of the near-infrared camera D2 can then be associated with two conjugate points on the calibration plates 51 at the two different positions; the line connecting the two conjugate points is the eyeball-side chief ray 41, which yields the calibrated optical path table of the eye tracking system.
In the related art, some head-mounted display devices also provide a myopia correction function so that they can be used by myopic users. Fig. 10a is a schematic diagram of one myopia correction method in the related art. As shown in fig. 10a, a myopia correction lens may be added in front of the Pancake lens barrel, mounted by magnetic attraction or a snap fit. Without the myopia correction lens the virtual image is located at position 1; when the myopia correction lens is added, the change in optical power translates the virtual image from position 1 to position 2, thereby correcting the user's myopia. Fig. 10b is a schematic diagram of another myopia correction method in the related art, in which one or more lenses of the lens group, or the display screen 201, can be translated along the optical axis of the lens barrel. Taking the translation of one lens in the Pancake lens group as an example: when the P2 lens is at the solid-line position, the virtual image is at position 1; when the P2 lens is translated to the dashed-line position, the change in the focal power of the Pancake lens group translates the virtual image from position 1 to position 2, thereby correcting the user's myopia. It should be noted that, when the optical module is a single lens and the position of the user's eye is fixed, the virtual image is translated from position 1 to position 2 by adjusting the distance between the single lens and the display screen 201, likewise correcting the user's myopia.
When different myopic users use the same head-mounted display device with myopia correction, each user needs to perform myopia correction again (translating a lens in the Pancake lens group, or adding a myopia correction lens of the corresponding power in front of the lens barrel) to suit their current degree of myopia. After myopia correction, however, the optical path table of the eye tracking system of the head-mounted display device changes; if calibration is performed only at one particular myopia correction position, the optical path table used during eye tracking has a large error.
In order to overcome the technical problem described above, embodiments of the present application provide an optical path table calibration method, through which the myopia correction parameter of the current user of the head-mounted display device may be obtained; optical path table interpolation may then be performed based on the obtained myopia correction parameter to obtain a corrected optical path table, and the user's line-of-sight direction may be calculated through the corrected optical path table, so that a relatively accurate line-of-sight direction of the current user is obtained. Reducing the error introduced into the optical path table by myopia correction improves the accuracy of eye tracking.
In order to implement the above optical path table calibration method, a further embodiment of the present application provides a system architecture whose components carry out the corresponding operations of the method. Fig. 11 is a schematic diagram of a system architecture provided in a further embodiment of the present application. As shown in fig. 11, the system architecture may include a parameter acquisition device 71 and a processor 72, where the parameter acquisition device 71 may be configured to obtain the myopia correction parameter of the current user of the head-mounted display device, and the processor 72 may perform optical path table interpolation based on the obtained myopia correction parameter to obtain a corrected optical path table, and calculate the user's line-of-sight direction through the corrected optical path table, so as to obtain a relatively accurate line-of-sight direction of the current user. In one embodiment, the processor 72 may be a microcontroller unit (MCU).
Fig. 12 is a flowchart of a light path table correction method according to an embodiment of the present application, as shown in fig. 12, the method may include the following steps:
step 801: the processor acquires a light path table set, wherein the light path table set comprises light path tables corresponding to at least two myopia correction positions.
Step 802: the processor obtains current myopia correction parameters after a myopia correction operation.
Step 803: the processor performs optical path table interpolation on the optical path table set according to the myopia correction parameters to obtain a corrected target optical path table, and performs line-of-sight direction calculation using the target optical path table to obtain the current user's line-of-sight direction.
In a specific implementation of step 801, a set of optical path tables may be obtained prior to performing optical path table correction based on the myopia correction operation. Fig. 13 is a schematic diagram of optical path table calibration according to still another embodiment of the present application. As shown in fig. 13, the optical path table set may be obtained by acquiring the optical path tables produced by calibrating the optical path table at at least two myopia correction positions.
The optical path table calibration operation may be completed in advance. In one embodiment, all optical path tables obtained from this pre-calibration (the optical path table set) may be stored in a corresponding memory, from which the processor 72 may then obtain them. That is, in one embodiment, "the processor acquires a light path table set" in step 801 may mean obtaining the set from the above-mentioned memory.
In the previously executed optical path table calibration process, the operation of calibrating the optical path table at one myopia correction position to obtain the corresponding optical path table includes: at the current myopia correction position, the calibration plates at two different positions (position P1 and position P2) are photographed by the near-infrared camera C1, so as to find, for each pixel on the near-infrared camera C1, its conjugate points on the calibration plates at the two positions. Specifically, when the calibration plate is at position P1, it can be photographed by the near-infrared camera C1 to find the conjugate point on the calibration plate corresponding to each pixel on the near-infrared camera C1. Taking one pixel as an example, as shown in fig. 13, a pixel (point O) on the sensor of the near-infrared camera C1 corresponds to a conjugate point, point A, on the calibration plate at position P1. After the conjugate point on the calibration plate at position P1 has been obtained for each pixel, the calibration plate is translated to position P2, and the conjugate point on the calibration plate at position P2 corresponding to each pixel on the near-infrared camera C1 is found in the same manner. Taking one pixel as an example, after the calibration plate is translated to position P2, the conjugate point of point O on the calibration plate at position P2 is point B. In some embodiments, connecting conjugate point A and conjugate point B (vector AB) gives the position and direction of the chief ray 41 corresponding to that pixel (point O); in this way the chief ray 41 and its direction corresponding to each pixel can be obtained, and the optical path table calibration performed at one myopia correction position yields the corresponding optical path table, as shown in Table 1.
Table 1

u/pixel    v/pixel    x_A/mm    y_A/mm    z_A/mm    x_B/mm    y_B/mm    z_B/mm
1          1
1          2

In one embodiment, the optical path table shown in Table 1 may be an M×8 matrix, where M represents the number of pixels of the near-infrared camera. Columns 1-2 (u/pixel and v/pixel) are the camera pixel coordinates (which may be understood as pixel indices), and columns 3-8 (x_A, y_A, z_A, x_B, y_B, z_B) are the three-dimensional coordinates, in mm, of the conjugate points A and B corresponding to that pixel, where x_A, y_A, z_A are the coordinates of point A on the three axes and x_B, y_B, z_B are the coordinates of point B on the three axes. From the optical path table, the position and direction of the chief ray 41 corresponding to each camera pixel in the space near the eyeball can be obtained.
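As an illustration of how such a table might be used, the following sketch holds the calibrated optical path table as an M×8 array and looks up the eyeball-side chief ray 41 (origin at conjugate point A, direction along vector AB) for a given camera pixel; the array layout and helper name follow Table 1 but are otherwise assumptions.

```python
import numpy as np

# Columns: u, v, xA, yA, zA, xB, yB, zB (pixel coordinates, then mm).
# `table` has shape (M, 8), one row per calibrated camera pixel.

def chief_ray(table, u, v):
    """Return (origin A, unit direction A->B) of the eyeball-side chief
    ray 41 calibrated for camera pixel (u, v), or None if not present."""
    rows = table[(table[:, 0] == u) & (table[:, 1] == v)]
    if rows.size == 0:
        return None
    a = rows[0, 2:5]          # conjugate point A (calibration plate at P1)
    b = rows[0, 5:8]          # conjugate point B (calibration plate at P2)
    d = b - a                 # vector AB
    return a, d / np.linalg.norm(d)
```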
After the head-mounted display device undergoes a myopia correction, the optical path table is, based on the above operation, calibrated once more at the current myopia correction position to obtain the optical path table corresponding to that position.
In one embodiment, the above optical path table calibration operation may be performed at a set number of myopia correction positions to obtain a corresponding number of optical path tables. For example, if the myopia correction range of the myopia correction system of the head-mounted display device is -1D to -7D, the optical path table may be calibrated at two myopia correction positions, the -1D correction position and the -7D correction position. During calibration, the optical path table is thus calibrated when the myopia correction parameter is -1D and when it is -7D, giving two different optical path tables, recorded as optical path table 1D and optical path table 7D respectively. The obtained optical path table 1D and optical path table 7D are stored in a corresponding memory of the head-mounted display device for the processor 72 to obtain; the memory storing the optical path tables may be a built-in memory of the head-mounted display device, an external memory, or a cloud storage space, which is not limited herein.
In a specific implementation of step 802, in one myopia correction manner, the virtual image distance may be adjusted by manually or electrically translating a lens in the lens group of the head-mounted display device, thereby achieving myopia correction. The lens group in the head-mounted display device may be any lens group with an optical path folding function, which is not limited herein. In one embodiment, the lens group in the head-mounted display device may be a Pancake lens group, and a lens in the Pancake lens group may be moved to adjust the virtual image distance, thereby achieving myopia correction.
During the user's myopia correction operation, the position information after lens translation may be detected by the parameter acquisition device 71, and the acquired position information may be sent to the processor 72.
The parameter acquisition device 71 may acquire the position information after lens translation in a manner adapted to the myopia correction system of the head-mounted display device.
In one embodiment, in the above scenario where myopia correction is achieved by translating a lens in the lens group to adjust the virtual image distance, the processor 72 obtaining the myopia correction parameters corresponding to the current myopia correction operation in step 802 may include the following steps:
Step 802a: after myopia correction operation, the parameter acquisition device acquires the position sensing information of the lens subjected to translation and sends the position sensing information of the lens subjected to translation to the processor;
step 802b: and the processor converts corresponding myopia correction parameters according to the received position sensing information.
In a specific implementation of step 802a, the position sensing information of the translated lens may be obtained by a position sensor within the lens group, including but not limited to a Hall sensor, a grating scale, a sliding rheostat, and the like. In one embodiment, the magnetic flux after lens translation may be detected by a Hall sensor. Fig. 14 is a schematic view of obtaining myopia correction parameters according to yet another embodiment of the present application. As shown in fig. 14, a Hall device 1001 may be mounted on the lens barrel structure of the head-mounted display device; the Hall device 1001 does not translate with the P2 lens. In some embodiments, a magnet 1002 may be mounted on the P2 lens, opposite the sensing surface of the Hall device 1001, so that when the P2 lens translates, the spacing between the magnet 1002 and the Hall device 1001 changes, and so does the magnetic flux received by the Hall device 1001. The position sensing information acquired by the parameter acquisition device may include the magnetic flux received by the Hall device 1001 after the P2 lens translation is completed, and in some embodiments the Hall device 1001 may send the acquired magnetic flux to the processor 72.
In the implementation of step 802b, after the user adjusts the virtual image position by translating the P2 lens, different virtual image positions correspond to different myopia correction parameters. After the Hall device 1001 detects and transmits the magnetic flux corresponding to the P2 lens translation, the processor 72 may convert the magnetic flux provided by the Hall device 1001 into the corresponding myopia correction parameter. For example, when the virtual image distance (the distance from the virtual image to the user's eyeball) is 1 m, the myopia correction parameter is -1D (the farthest point that a -1D user can see clearly is at 1 m); when the virtual image distance is 0.2 m, the myopia correction parameter is -5D. Accordingly, the relationship between the magnetic flux received by the Hall device and the myopia correction parameter may be obtained by calibration (this calibration may be completed before the parameter acquisition device 71 or the head-mounted display device leaves the factory); that is, the processor 72 may convert the magnetic flux provided by the Hall device 1001 into the myopia correction parameter set by the user in the myopia correction operation. It should be noted that diopter (or optical power) is commonly expressed in degrees, obtained by multiplying the diopter value D by one hundred; for example, -1D corresponds to 100-degree myopic spectacles (concave lenses).
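The flux-to-parameter conversion can be realized with a small factory calibration table; the following is a minimal sketch under that assumption, using simple linear interpolation. The sample values and function name are illustrative placeholders, not calibrated data.

```python
import numpy as np

# Placeholder factory calibration: Hall flux readings vs. the myopia
# correction parameter in diopters (values are illustrative only).
CAL_FLUX = np.array([120.0, 90.0, 70.0, 55.0])    # flux from Hall device 1001
CAL_DIOPTER = np.array([-1.0, -3.0, -5.0, -7.0])  # corresponding correction

def flux_to_diopter(flux):
    """Convert a Hall-device flux reading into the myopia correction
    parameter set by translating the P2 lens, via linear interpolation."""
    order = np.argsort(CAL_FLUX)          # np.interp needs increasing x
    return float(np.interp(flux, CAL_FLUX[order], CAL_DIOPTER[order]))
```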
In another myopia correction mode, a myopia correction lens may be added in front of the lens barrel of the head-mounted display device to adjust the virtual image distance, thereby achieving myopia correction. In some embodiments, the myopia correction lens may be mounted on the lens barrel by magnetic attraction or a snap fit. Fig. 15 is a schematic diagram of mounting the myopia correction lens according to still another embodiment of the present application. As shown in fig. 15, a magnet 1102 and a memory chip 1103 may be disposed on the lens barrel 1101, the myopia correction lens is mounted on the lens barrel 1101 by magnetic attraction, and the optical power information of the myopia correction lens may be stored in the memory chip 1103.
In one embodiment, in the above scenario where myopia correction is achieved by adding a myopia correction lens to adjust the virtual image distance, the processor 72 obtaining the myopia correction parameters corresponding to the current myopia correction operation in step 802 may include the following steps:
step 802g: the processor 72 obtains identity information for the myopia correcting lens in the current myopia correction operation.
Step 802h: the processor 72 determines the power information of the myopia correcting lens based on the acquired identity information of the myopia correcting lens.
In a specific implementation of step 802g, during a myopia correction operation, the currently used myopia correction lens may establish a connection with the processor 72 and inform the processor 72 of its identity information through that connection; the processor 72 may then obtain the corresponding lens power information from the memory chip 1103 based on the identity information of the currently used myopia correction lens.
Fig. 16 is a schematic diagram of the transmission of myopia correction parameters according to still another embodiment of the present application. As shown in fig. 16, a contact 1201 may be disposed at a corresponding position on the lens barrel of the head-mounted display device, and each myopia correction lens 1202 is configured with an identity chip 1203 that stores the identity information of the corresponding myopia correction lens 1202, such as its identifier. When the myopia correction lens 1202 is mounted on the lens barrel, it may establish a connection with the processor 72 through the contact 1201 on the lens barrel, and the identity information stored in its identity chip 1203 is provided to the processor 72 through that connection; in other words, the processor 72 may acquire the identity information of the currently mounted myopia correction lens 1202 through the connection.
In other embodiments, the myopia correction lens 1202 configured with the identity chip 1203 may also be connected to the processor 72 by other means to provide its identity information; the communication connection manner is not limited to that of the embodiment described above. For example, the identity chip 1203 may be a radio frequency chip in which the identity information corresponding to the myopia correction lens 1202 is stored; the processor 72 may establish a near-field wireless communication connection with the identity chip 1203 and read the identity information through that connection, where the identity information may include information such as the focal length or diopter of the myopia correction lens 1202. In some embodiments, the identity chip 1203 and/or the memory chip 1103, the contacts 1201, and so on may be understood as the focal length measurement unit.
In an implementation of step 802h, after obtaining the identity information of the currently installed myopia correction lens 1202, the processor 72 may obtain the power information of the myopia correction lens 1202 from the memory chip 1103 based on that identity information. In one embodiment, obtaining the myopia correction parameter may include obtaining the power information of the myopia correction lens 1202.
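A minimal sketch of this identity-to-power lookup is given below, assuming the contents of the memory chip 1103 have been read out into a mapping from lens identification number to optical power; the identifiers, values, and function name are assumptions for illustration.

```python
# Contents of memory chip 1103, read out at start-up: lens ID -> power (D).
# The entries below are illustrative placeholders.
LENS_POWER_TABLE = {
    "LENS-0100": -1.0,
    "LENS-0400": -4.0,
    "LENS-0700": -7.0,
}

def power_for_lens(lens_id):
    """Return the optical power of the externally attached myopia
    correction lens 1202, identified via its identity chip 1203."""
    try:
        return LENS_POWER_TABLE[lens_id]
    except KeyError:
        raise ValueError(f"unknown myopia correction lens: {lens_id}")
```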
In the implementation of step 803, the processor 72 interpolates the optical path table set according to the myopia correction parameter to obtain a corrected target optical path table, and performs line-of-sight direction calculation using the target optical path table to obtain the current user's line-of-sight direction. The interpolation method includes, but is not limited to, linear interpolation, polynomial interpolation, spline interpolation, and the like. For example, if after the user performs a myopia correction operation the myopia correction parameter obtained by the processor 72 is -4D, and the optical path table set obtained by the processor includes optical path table 1D (correction parameter -1D) and optical path table 7D (correction parameter -7D), then under linear interpolation the target optical path table after this myopia correction operation (-4D) is the average of optical path table 1D and optical path table 7D. In some embodiments, the line-of-sight direction calculation may be performed based on the target optical path table obtained after interpolation, so as to obtain the user's line-of-sight direction.
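A minimal sketch of the linear case, assuming the two-position calibration described above (-1D and -7D) and M×8 tables laid out as in Table 1; at -4D the result reduces to the element-wise average of the two tables. Variable names are illustrative.

```python
import numpy as np

def interpolate_table(table_1d, table_7d, correction_d):
    """Linearly interpolate two M x 8 optical path tables calibrated at
    -1D and -7D to obtain the target table at `correction_d`.

    Only columns 2..7 (the 3D coordinates of conjugate points A and B)
    are interpolated; the pixel coordinate columns are kept as-is.
    """
    # Weight of the -7D table: 0 at -1D, 1 at -7D.
    w = float(np.clip((correction_d - (-1.0)) / ((-7.0) - (-1.0)), 0.0, 1.0))
    target = table_1d.copy()
    target[:, 2:8] = (1.0 - w) * table_1d[:, 2:8] + w * table_7d[:, 2:8]
    return target

# Example: at correction_d = -4.0 the weight w is 0.5, i.e. the average.
```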
The above describes the optical path table correction performed for the scenario in which, during the myopia correction operation, the position of the near-infrared LED D1 that emits near-infrared light does not affect the eye tracking result; the correction yields an optical path table suitable for the current myopia correction position, thereby reducing the influence of myopia correction on the optical path table and improving the accuracy of eye tracking.
If the position of the near-infrared LED D1 that emits near-infrared light does affect the eye tracking result, the influence of the LED position must also be considered during the optical path table correction.
In another myopia correction mode, the way of adjusting the lenses in the lens group of the head-mounted display device to adjust the virtual image distance includes: adjusting the distance between the lenses in the lens group by rotating the lens barrel, similar to the focusing mechanism of a single-lens reflex camera lens. Fig. 17a is a schematic diagram of myopia correction by rotating the lens barrel according to still another embodiment of the present application. As shown in fig. 17a, when the lens barrel is rotated, the distance between the lenses in the lens group of the head-mounted display device can be adjusted to adjust the virtual image distance, thereby achieving myopia correction.
However, under this rotating-barrel operation, the position of the near-infrared LED D1 in the xoy plane changes with the degree of myopia, and if the lens adjustment along the optical axis also drives the LED, its position in the z-axis direction changes as well. As shown in fig. 17a, the rotation of the lens barrel moves the LED from the solid-line square (position P3) to the dashed-line square (position P4).
In another myopia correction mode, the way of adjusting the lenses in the lens group of the head-mounted display device to adjust the virtual image distance includes: adjusting the distance between the lenses in the lens group by extending or retracting the lens barrel. Fig. 17b is a schematic diagram of myopia correction by a telescopic lens barrel according to still another embodiment of the present application. As shown in fig. 17b, when the lens barrel extends or retracts, the distance between the lenses in the lens group of the head-mounted display device can be adjusted to adjust the virtual image distance, thereby achieving myopia correction.
Under the telescopic-barrel operation, the position of the near-infrared LED D1 in xyz space changes with the degree of myopia. Specifically, as shown in fig. 17b, when the lens barrel is adjusted telescopically, the near-infrared LED D1 mounted on the lens barrel is driven to translate along the z-axis, so that it moves from the solid-line square (position P5) to the dashed-line square (position P6).
In these myopia correction modes, where the position of the near-infrared LED D1 changes with the myopia correction operation, the eye tracking accuracy decreases after the LED position changes, which affects the user experience.
In order to overcome this technical problem, still another embodiment of the present application provides another optical path table calibration method that accounts for the LED offset caused by myopia correction: while the optical path table is corrected, the position of the near-infrared LED D1 after myopia correction is also determined, so that the user's line-of-sight can be resolved with the LED position known, improving the accuracy of the line-of-sight calculation.
Fig. 18 is a flowchart of a light path table correction method according to still another embodiment of the present application, as shown in fig. 18, the method may include the following steps:
step 1401: the processor acquires a light path table set and a near infrared LED position set, wherein the light path table set comprises light path tables corresponding to at least two myopia correction positions, and the near infrared LED position set comprises the position information of the near infrared LEDs under at least two myopia correction parameters.
Step 1402: after obtaining the myopia correction parameters corresponding to the current myopia correction operation, the processor executes step 1403 and step 1404 respectively.
Step 1403: the processor interpolates the position set of the near infrared LEDs according to the myopia correction parameters corresponding to the current myopia correction operation to obtain the positions of the near infrared LEDs under the myopia correction parameters corresponding to the current myopia correction operation.
Step 1404: the processor performs optical path table interpolation on the optical path table set according to the myopia correction parameters corresponding to the current myopia correction operation, obtains the corrected target optical path table, and then executes step 1405.
Step 1405: perform line-of-sight direction calculation according to the near-infrared LED position under the myopia correction parameters corresponding to the current myopia correction operation and the target optical path table, to obtain the current user's line-of-sight direction.
In a specific implementation of step 1401, a set of optical path tables and a set of near-infrared LED positions may be obtained prior to performing the optical path table correction based on the myopia correction operation. The operation of acquiring the optical path table set may be the same as or similar to that provided in the embodiment shown in fig. 12, and is not repeated here.
Since the position of the near-infrared LED in space is a known input to the eye tracking algorithm, and the myopia correction operation (myopia adjustment) affects the position of the near-infrared LED in xyz space, the position of the near-infrared LED must be re-determined after a myopia correction operation. Obtaining the position set of the near-infrared LEDs may include: obtaining the positions of the near-infrared LEDs under at least two myopia correction parameters.
The calibration operation of the position of the near-infrared LED may be a pre-completed operation, and in an embodiment, the pre-calibrated position of the near-infrared LED (the position set of the near-infrared LED) may be stored in a corresponding memory, and then the processor 72 may obtain the position set of the near-infrared LED in the memory.
In the pre-executed calibration of the near-infrared LED positions, the LED positions may be calibrated at least at two myopia correction positions before the device leaves the factory; for example, the LED positions at -1D and -7D are calibrated in advance. Specifically, the coordinate positions of the near-infrared LEDs may be determined at -1D and at -7D, respectively, with a fixed point on the lens barrel (one that does not move during myopia correction, such as the center of the lens barrel) as the origin.
The specific implementation manner of step 1402 may be the same as or similar to the obtaining manner of the myopia correction parameters corresponding to the current myopia correction operation provided in the embodiment shown in fig. 12, and will not be described herein.
In an implementation of step 1403, after obtaining the myopia correction parameter, the processor 72 may interpolate the previously obtained set of near-infrared LED positions (step 1401) according to the myopia correction parameter to determine the current LED position after the myopia correction operation (i.e. the position corresponding to the myopia correction parameter). For example, when the user adjusts to the -4D position, the LED position at -4D is obtained by interpolation. It should be noted that interpolation of the xy coordinates requires converting them into polar coordinates with the center of the lens barrel as the origin, interpolating the polar angle, and then converting back into rectangular coordinates; the z coordinate is interpolated directly by linear, polynomial, or spline interpolation. The position of the near-infrared LED after the myopia correction operation is obtained through this interpolation.
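The LED position interpolation above can be sketched as follows, assuming the two calibrated positions at -1D and -7D are given in a coordinate system whose origin is the barrel-center fixed point, and assuming a simple linear scheme (interpolating radius as well as angle in the xy plane); names and weights are illustrative.

```python
import numpy as np

def interpolate_led_position(p_1d, p_7d, correction_d):
    """Interpolate a near-infrared LED position between its calibrated
    positions at -1D and -7D (3D points, barrel center as origin)."""
    w = float(np.clip((correction_d - (-1.0)) / ((-7.0) - (-1.0)), 0.0, 1.0))

    # xy: convert to polar coordinates about the barrel center,
    # interpolate the angle (and radius), then convert back.
    r1, a1 = np.hypot(p_1d[0], p_1d[1]), np.arctan2(p_1d[1], p_1d[0])
    r7, a7 = np.hypot(p_7d[0], p_7d[1]), np.arctan2(p_7d[1], p_7d[0])
    a7 = a1 + np.angle(np.exp(1j * (a7 - a1)))  # take the short way around
    r = (1.0 - w) * r1 + w * r7
    a = (1.0 - w) * a1 + w * a7

    # z: direct linear interpolation.
    z = (1.0 - w) * p_1d[2] + w * p_7d[2]
    return np.array([r * np.cos(a), r * np.sin(a), z])
```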
The specific implementation of steps 1404 to 1405 may be similar to step 803 in the embodiment shown in fig. 12, with the difference that, before the user's line-of-sight calculation, the LED position is updated to the current near-infrared LED position after the myopia correction operation, as calculated in step 1403. After the LED position information has been updated, the processor 72 performs optical path table interpolation on the optical path table set according to the myopia correction parameter to obtain the corrected target optical path table, and performs the line-of-sight calculation using the target optical path table to obtain the current user's line-of-sight direction.
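Putting steps 1401-1405 together, a sketch of the overall flow, assuming the interpolation helpers sketched above and treating the pupil-cornea reflection solver as an externally supplied callable (its internals are outside this sketch):

```python
def correct_and_solve_gaze(led_position_set, table_1d, table_7d,
                           correction_d, eye_image, solve_gaze):
    """Sketch of steps 1403-1405: update LED positions, interpolate the
    optical path table, then run the gaze solver on the updated inputs."""
    # Step 1403: LED positions under the current correction parameter.
    led_positions = [interpolate_led_position(p1, p7, correction_d)
                     for p1, p7 in led_position_set]
    # Step 1404: corrected target optical path table.
    target_table = interpolate_table(table_1d, table_7d, correction_d)
    # Step 1405: line-of-sight calculation with the updated inputs.
    return solve_gaze(eye_image, led_positions, target_table)
```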
Fig. 19 is a schematic structural diagram of an optical path table correction device according to still another embodiment of the present application, as shown in fig. 19, the device may include a processor 1501 and a memory 1502, where the memory 1502 is configured to store at least one instruction, where the instruction is loaded and executed by the processor 1501 to implement the optical path table correction method according to any embodiment of the present application. In one embodiment, the optical path table correction device shown in fig. 19 may be a programmable chip of a head-mounted display device.
Still another embodiment of the present application further provides a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the optical path table correction method provided by any one of the embodiments of the present application.
It should be noted that, the terminals in the embodiments of the present application may include, but are not limited to, a personal Computer (Personal Computer, PC), a personal digital assistant (Personal Digital Assistant, PDA), a wireless handheld device, a Tablet Computer (Tablet Computer), a mobile phone, an MP3 player, an MP4 player, and the like.
It may be understood that the application may be a native application (native app) installed on the terminal, or a web application (webApp) accessed through a browser on the terminal, which is not limited in the embodiments of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a Processor (Processor) to perform part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing description of the embodiments is provided for the purpose of illustration only and is not intended to limit the invention to the particular embodiments disclosed; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present application, and not for limiting them; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (20)

1. An eye tracking device, comprising:
the optical display module comprises a display device and an optical module, wherein the optical module comprises a first optical device, and first light emitted by the display device is emitted into eyes of a user after passing through the optical module; and
an eye tracking module comprising:
at least one light source configured to emit a second light to the user's eye, at least a portion of the second light illuminating the eye for reflection;
an imaging module configured to obtain at least part of the second light reflected after the at least one light source irradiates the eye;
the camera module is located between the first optical device and the display device.
2. The eye tracking device according to claim 1, wherein at least a portion of the second light is reflected by the user's eyes and transmitted through the first optical device for acquisition by the camera module.
3. An eye tracking device according to claim 1 or claim 2 wherein the focal length of the optical module is variable.
4. The eye tracking device according to claim 3, further comprising a focal length measurement unit configured to trigger focal length detection after the focal length of the optical module is adjusted, so as to determine current focal length information.
5. An eye tracking device according to claim 3 wherein the distance between the first optical device and the display device is adjustable, the first optical device being a lens.
6. An eye tracking device according to claim 3 wherein the first optical device is a variable focus lens.
7. The eye tracking device according to claim 6, wherein the optical module further comprises an external lens interface, and the external lens interface is used to mount a myopia correction lens on the lens barrel by magnetic attraction or a snap fit.
8. The eye tracking device according to any one of claims 3 to 7, wherein the gaze point direction of the user's eye is unchanged when the focal length of the optical module is changed, and the gaze point of the user's eye determined by the eye tracking device is unchanged.
9. The eye tracking device according to claim 8, wherein an optical path table suitable for the current focal length information is acquired when the focal length of the optical module is changed.
10. The eye tracking device according to claim 9 wherein the optical path table varies with a focal length of the optical module.
11. The eye tracking device according to any one of claims 1 to 10, further comprising a processor that receives the result obtained by the camera module and determines the gaze point of the user's eye.
12. The eye tracker according to any one of claims 1 to 11, wherein the optical display module further comprises a second optical device through which at least a portion of the second light is transmitted before being captured by the camera module.
13. The eye tracking apparatus according to claim 12 wherein the second optical device is located between the first optical device and the display device or the second optical device is located on a side of the first optical device facing away from the display device.
14. A head-mounted display device, characterized in that the head-mounted display device comprises the eye tracking device according to any one of claims 1 to 13.
15. An eye tracking method for an eye tracking device, wherein the eye tracking device comprises an optical display module and an eye tracking module; the optical display module comprises a display device and an optical module, and the optical module comprises a first optical device; the eye tracking module comprises at least one light source and a camera module; and the method comprises:
first light emitted by the display device enters the eyes of a user after passing through the optical module;
the at least one light source emits second light toward the user's eye, at least a portion of the second light illuminating the eye and being reflected; and
at least part of the second light reflected after the at least one light source illuminates the eye is transmitted through the first optical device and acquired by the camera module.
16. The method of claim 15, wherein the method further comprises:
acquiring focal length information of the optical module;
determining the gaze direction of the user's eyes.
17. The method of claim 16, wherein the method further comprises:
acquiring an optical path table set, wherein the optical path table set comprises at least two optical path tables corresponding to myopia correction positions;
acquiring current myopia correction parameters after a myopia correction operation; and
performing optical path table interpolation on the optical path table set according to the myopia correction parameters to obtain a target optical path table.
18. The method of claim 17, wherein the method further comprises:
acquiring the optical path table set from a target memory in which the optical path table set is pre-stored.
19. The method according to any one of claims 15 to 18, wherein the focal length of the optical module is variable;
the eye tracking device further comprises a focal length measurement unit, and the focal length measurement unit is configured to trigger focal length detection after the focal length of the optical module is adjusted, so as to obtain current focal length information.
20. The method according to any one of claims 15 to 19, wherein, when the focal length of the optical module is changed and the gaze direction of the user's eye remains unchanged, the gaze point of the user's eye determined by the eye tracking device remains unchanged.
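For illustration only, the following is a minimal sketch of the optical path table interpolation recited in claims 17 and 18: a pre-stored optical path table set is acquired, a current myopia correction parameter is acquired after a myopia correction operation, and a target optical path table is obtained by interpolating between the stored tables. The variable names, table contents, correction positions, and the linear interpolation scheme are assumptions introduced for this sketch; the claims do not specify a particular data layout or interpolation method.

```python
import numpy as np

# Pre-stored optical path table set: one table per myopia correction position.
# The keys (correction positions) and the 2x2 tables are placeholder values.
OPTICAL_PATH_TABLE_SET = {
    0.0: np.array([[0.00, 0.10], [0.20, 0.30]]),
    2.0: np.array([[0.05, 0.16], [0.27, 0.38]]),
    4.0: np.array([[0.11, 0.23], [0.35, 0.47]]),
}

def interpolate_optical_path_table(correction: float) -> np.ndarray:
    """Linearly interpolate between the two stored tables whose myopia
    correction positions bracket the current correction parameter."""
    positions = sorted(OPTICAL_PATH_TABLE_SET)
    if correction <= positions[0]:
        return OPTICAL_PATH_TABLE_SET[positions[0]]
    if correction >= positions[-1]:
        return OPTICAL_PATH_TABLE_SET[positions[-1]]
    upper = next(p for p in positions if p >= correction)
    lower = positions[positions.index(upper) - 1]
    weight = (correction - lower) / (upper - lower)
    return ((1.0 - weight) * OPTICAL_PATH_TABLE_SET[lower]
            + weight * OPTICAL_PATH_TABLE_SET[upper])

if __name__ == "__main__":
    # After a myopia correction operation reports a parameter of 1.5,
    # the target optical path table is obtained by interpolation and would
    # then be used for subsequent gaze point calculation.
    print(interpolate_optical_path_table(1.5))
```

Under these assumptions, the interpolated table would be recomputed whenever the myopia correction position or the focal length of the optical module changes, consistent with the table lookup described in claims 9, 10, and 19.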
CN202111576836.2A 2021-12-22 2021-12-22 Eye tracking device, display apparatus, and storage medium Pending CN116338941A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111576836.2A CN116338941A (en) 2021-12-22 2021-12-22 Eye tracking device, display apparatus, and storage medium
PCT/CN2022/139196 WO2023116541A1 (en) 2021-12-22 2022-12-15 Eye tracking apparatus, display device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111576836.2A CN116338941A (en) 2021-12-22 2021-12-22 Eye tracking device, display apparatus, and storage medium

Publications (1)

Publication Number Publication Date
CN116338941A true CN116338941A (en) 2023-06-27

Family ID=86879223

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111576836.2A Pending CN116338941A (en) 2021-12-22 2021-12-22 Eye tracking device, display apparatus, and storage medium

Country Status (2)

Country Link
CN (1) CN116338941A (en)
WO (1) WO2023116541A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9791924B2 (en) * 2014-12-23 2017-10-17 Mediatek Inc. Eye tracking with mobile device in a head-mounted display
US10599215B2 (en) * 2016-04-26 2020-03-24 Facebook Technologies, Llc Off-axis eye tracker
CN106598260A (en) * 2017-02-06 2017-04-26 上海青研科技有限公司 Eyeball-tracking device, VR (Virtual Reality) equipment and AR (Augmented Reality) equipment by use of eyeball-tracking device
CN112346558A (en) * 2019-08-06 2021-02-09 苹果公司 Eye tracking system
CN110727111A (en) * 2019-10-23 2020-01-24 深圳惠牛科技有限公司 Head-mounted display optical system and head-mounted display equipment
CN111949131B (en) * 2020-08-17 2023-04-25 陈涛 Eye movement interaction method, system and equipment based on eye movement tracking technology
CN113419350B (en) * 2021-06-18 2023-05-23 深圳市腾讯计算机系统有限公司 Virtual reality display device, picture presentation method, device and storage medium

Also Published As

Publication number Publication date
WO2023116541A1 (en) 2023-06-29

Similar Documents

Publication Publication Date Title
US11340702B2 (en) In-field illumination and imaging for eye tracking
US9870049B2 (en) Reflective lenses to auto-calibrate a wearable system
US10852817B1 (en) Eye tracking combiner having multiple perspectives
US10635182B2 (en) Head mounted display device and control method for head mounted display device
CN108139806A (en) Relative to the eyes of wearable device tracking wearer
CN108205374B (en) Eyeball tracking module and method of video glasses and video glasses
CN104641278A (en) Method and apparatus for determining representations of displayed information based on focus distance
US11847258B2 (en) Method for wireless connection in augmented reality environment and electronic device therefor
KR20180069466A (en) Optical lens assembly and electronic apparatus having the same
WO2021103990A1 (en) Display method, electronic device, and system
US11455031B1 (en) In-field illumination for eye tracking
CN114255204A (en) Amblyopia training method, device, equipment and storage medium
US11307654B1 (en) Ambient light eye illumination for eye-tracking in near-eye display
CN116338941A (en) Eye tracking device, display apparatus, and storage medium
US20190086677A1 (en) Head mounted display device and control method for head mounted display device
CN112565735B (en) Virtual reality measuring and displaying method, device and system
CN114445605A (en) Free-form surface simulation method and device
CN115686181A (en) Display method and electronic equipment
CN116916809A (en) Ophthalmic imaging using a head-mounted device
CN116830065A (en) Electronic device for tracking user gaze and providing augmented reality service and method thereof
CN116301301A (en) Eye movement tracking device and eye movement tracking method
CN112558847B (en) Method for controlling interface display and head-mounted display
CN114446262B (en) Color cast correction method and head-mounted display device
WO2022247482A1 (en) Virtual display device and virtual display method
CN116107421A (en) Display method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination