WO2024067657A1 - Eye tracking system and device - Google Patents
Eye tracking system and device
- Publication number
- WO2024067657A1 PCT/CN2023/121832 CN2023121832W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optical lens
- light source
- eyeball
- tracking system
- eye tracking
- Prior art date
Links
- 230000003287 optical effect Effects 0.000 claims abstract description 162
- 210000001508 eye Anatomy 0.000 claims abstract description 77
- 210000005252 bulbus oculi Anatomy 0.000 claims abstract description 56
- 230000000712 assembly Effects 0.000 claims description 4
- 238000000429 assembly Methods 0.000 claims description 4
- 238000010521 absorption reaction Methods 0.000 claims description 3
- 230000000295 complement effect Effects 0.000 claims description 3
- 229910044991 metal oxide Inorganic materials 0.000 claims description 3
- 150000004706 metal oxides Chemical class 0.000 claims description 3
- 239000004065 semiconductor Substances 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 11
- 210000001747 pupil Anatomy 0.000 description 10
- 238000000034 method Methods 0.000 description 8
- 239000011521 glass Substances 0.000 description 4
- 210000000744 eyelid Anatomy 0.000 description 3
- 238000002834 transmittance Methods 0.000 description 3
- 230000009286 beneficial effect Effects 0.000 description 2
- 230000004424 eye movement Effects 0.000 description 2
- 238000001028 reflection method Methods 0.000 description 2
- GWEVSGVZZGPLCZ-UHFFFAOYSA-N Titan oxide Chemical compound O=[Ti]=O GWEVSGVZZGPLCZ-UHFFFAOYSA-N 0.000 description 1
- 230000003190 augmentative effect Effects 0.000 description 1
- 239000000919 ceramic Substances 0.000 description 1
- 210000004087 cornea Anatomy 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 229940056932 lead sulfide Drugs 0.000 description 1
- 229910052981 lead sulfide Inorganic materials 0.000 description 1
- ORUIBWPALBXDOA-UHFFFAOYSA-L magnesium fluoride Chemical compound [F-].[F-].[Mg+2] ORUIBWPALBXDOA-UHFFFAOYSA-L 0.000 description 1
- 229910001635 magnesium fluoride Inorganic materials 0.000 description 1
- OGIDPMRJRNCKJF-UHFFFAOYSA-N titanium oxide Inorganic materials [Ti]=O OGIDPMRJRNCKJF-UHFFFAOYSA-N 0.000 description 1
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Definitions
- the present application relates to the technical fields of augmented reality, virtual reality, and mixed reality, for example, to eye tracking systems and devices.
- VR virtual reality
- the overall size of VR glasses is also getting smaller, and the corresponding exit pupil distance (the distance from the eyeball to the VR lens) is shortened accordingly.
- the problem is that the light source 2 in the related art is generally arranged on the surface of the lens 1 closest to the human eye 4 (as shown in Figure 1).
- when the exit pupil distance is reduced, the original position of the light source 2 means that the light it emits may be blocked by the corner of the eye or the eyelid and cannot reach the eyeball, so the camera 3 cannot track the position of the human eye 4; the user then cannot use the VR glasses normally and the experience is poor.
- the present application provides an eye tracking system and device to solve the problem in the related art that the exit pupil distance is small, resulting in the inability to track the position of the eye.
- the embodiment of the present application provides an eye tracking system, including: an optical component, an image capturing component and a light source component;
- the optical component at least includes: a first optical lens and a second optical lens arranged along a first direction; the light source component is located on a side of the first optical lens away from the eyeball, and/or, is located on a side of the second optical lens away from the eyeball, the light source component is configured to emit an infrared light beam, the infrared light beam is incident on the first optical lens and/or the second optical lens, and at least passes through the first optical lens to be incident on the eyeball to form a Purkinje spot on the eyeball, the image capture component is configured to capture an image of the eyeball, wherein the first direction is a line of sight direction of a human eye looking directly at the first optical lens.
- the embodiment of the present application also proposes a device, comprising two of the above-mentioned eye tracking systems;
- a left eye viewing component for mounting one eye tracking system and a right eye viewing component for mounting another eye tracking system.
- FIG1 is a schematic diagram of the structure of an eye tracking system provided by the related art
- FIG2 is a schematic diagram of the structure of an eye tracking system provided in an embodiment of the present application.
- FIG3 is a schematic diagram of the structure of another eye tracking system provided in an embodiment of the present application.
- FIG4 is a schematic diagram of a side view structure of a first optical lens or a second optical lens in an eye tracking system provided in an embodiment of the present application;
- FIG5 is a schematic diagram of the front view structure of a first optical lens or a second optical lens in another eye tracking system provided in an embodiment of the present application;
- FIG6 is a schematic diagram of the structure of another eye tracking system provided in an embodiment of the present application.
- FIG7 is a schematic diagram of the structure of another eye tracking system provided in an embodiment of the present application.
- FIG8 is a block diagram of a device provided in an embodiment of the present application.
- FIG1 is a schematic diagram of the structure of an eye tracking system provided by the related art.
- the eye tracking system has a lens 1; taking the orientation in FIG1 as an example, a light source 2 is embedded in or attached to the left side of the lens 1, and the emitted light can strike the human eye 4.
- the camera 3 can collect the light spot in the human eye 4 and then track the shift of the line of sight of the human eye 4.
- when the human eye 4 moves to the right, that is, when the exit pupil distance is shortened, the human eye 4 comes close to the lens 1.
- the light emitted by the uppermost light source and the lowermost light source on the left side of the lens 1 may be blocked by the eyelid of the human eye and cannot enter the eyeball.
- the above eye tracking system is implemented using an optical recording method.
- the principle of the optical recording method is to use an infrared camera to record the eye movement of the subject, that is, to obtain an eye image that can reflect the eye movement, and extract eye features from the obtained eye image to establish a line of sight estimation model.
- eye features can include: pupil position, pupil shape, iris position, iris shape, eyelid position, eye corner position, Purkinje image position, etc.
- Optical recording methods include the pupil-corneal reflection method.
- the principle of the pupil-corneal reflection method is that a near-infrared light source illuminates the eye, the eye is photographed by an infrared camera, and the reflection point of the light source on the cornea, namely the Purkinje spot, is captured at the same time, thereby obtaining an eye image with a light spot.
- the absolute position of the Purkinje spot does not change with the rotation of the eyeball, but its position relative to the pupil and the eyeball changes constantly.
- when the human eye stares at the camera, that is, looks straight ahead, the Purkinje spot lies in the middle of the pupil, and when the human eye looks up, the Purkinje spot is below the pupil.
- as long as the positions of the pupil and the Purkinje spot in the eye image are located in real time and the corneal reflection vector is calculated, the user's gaze direction can be estimated from a geometric model.
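- As a concrete illustration of the pupil-corneal reflection idea described above, the following is a minimal sketch of how a gaze point can be estimated from the pupil center and the Purkinje spot. The second-order polynomial mapping, the calibration step, and all function names are illustrative assumptions, not taken from this publication.

```python
# Illustrative pupil-corneal-reflection gaze estimation (assumed mapping, not the patented method).
import numpy as np

def glint_vector(pupil_center, purkinje_spot):
    """Corneal reflection vector: pupil center minus Purkinje spot, in image pixels."""
    return np.asarray(pupil_center, float) - np.asarray(purkinje_spot, float)

def _features(v):
    """Second-order polynomial features of a glint vector (vx, vy)."""
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx ** 2, vy ** 2])

def fit_gaze_mapping(glint_vectors, gaze_targets):
    """Least-squares fit of the glint-vector -> gaze-target mapping from calibration data."""
    A = np.stack([_features(v) for v in glint_vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(gaze_targets, float), rcond=None)
    return coeffs

def estimate_gaze(coeffs, pupil_center, purkinje_spot):
    """Map a newly measured pupil center and Purkinje spot to an estimated gaze point."""
    return _features(glint_vector(pupil_center, purkinje_spot)) @ coeffs
```

- During a calibration phase the user fixates a few known targets, fit_gaze_mapping is trained on the recorded glint vectors, and estimate_gaze is then applied to each new eye image captured by the image capture component.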
- Fig. 2 is a schematic diagram of the structure of an eye tracking system provided by an embodiment of the present application.
- the eye tracking system includes: an optical component 101, an image capture component 102 and a light source component 103;
- the optical component 101 includes at least: a first optical lens 1011 and a second optical lens 1012 arranged along a first direction x;
- the light source component 103 is located on a side of the first optical lens 1011 away from the eyeball 104, and/or, located on a side of the second optical lens 1012 away from the eyeball 104,
- the light source component 103 is configured to emit an infrared light beam, the infrared light beam is incident on the first optical lens 1011 and/or the second optical lens 1012, and at least passes through the first optical lens 1011 to be incident on the eyeball 104, forming a Purkinje spot on the eyeball 104
- the image capture component 102 is configured to collect an image of the eyeball 104, wherein the first direction x is the line of sight direction of a human eye looking directly at the first optical lens 1011.
- the first optical lens 1011 may be a polarizing reflector
- the second optical lens 1012 may be a beam splitter
- both the first optical lens 1011 and the second optical lens 1012 may be Fresnel lenses.
- the light source assembly 103 includes a plurality of light-emitting diode (LED) light sources.
- when each lens meets the conditions for carrying the light sources, the multiple LED light sources in the light source assembly 103 can all be arranged on one of the lenses; if each lens only meets the conditions for carrying part of the light sources, the LED light sources in the light source assembly 103 can be split between the two lenses. For example, if there are 10 LED light sources in total and the first optical lens 1011 can accommodate all 10, the 10 light sources can be arranged on the first optical lens 1011 only.
- likewise, when the second optical lens 1012 can accommodate all 10 light sources, the 10 light sources can be arranged on the second optical lens 1012 only; when the first optical lens 1011 or the second optical lens 1012 cannot carry all 10 light sources because of size, shape or other constraints, light sources can be arranged on both the first optical lens 1011 and the second optical lens 1012 so that the requirement of 10 light sources is met. A short sketch of this allocation rule follows below.
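- The allocation rule above can be written down in a few lines. This is a hypothetical sketch that only restates the 10-LED example; the function name and capacity figures are not from this publication.

```python
# Hypothetical sketch of the LED allocation rule described above.
def allocate_leds(total_leds, capacity_lens1, capacity_lens2):
    """Return (LEDs on the first optical lens, LEDs on the second optical lens)."""
    if capacity_lens1 >= total_leds:      # the first lens can carry all of them
        return total_leds, 0
    if capacity_lens2 >= total_leds:      # the second lens can carry all of them
        return 0, total_leds
    # otherwise split the light sources across both lenses
    on_first = min(capacity_lens1, total_leds)
    on_second = total_leds - on_first
    if on_second > capacity_lens2:
        raise ValueError("the two lenses together cannot carry all light sources")
    return on_first, on_second

print(allocate_leds(10, 10, 0))   # (10, 0): all on the first optical lens
print(allocate_leds(10, 6, 6))    # (6, 4): split across the two lenses
```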
- because the light source assembly 103 is arranged on the side of the right-hand surface of the first optical lens 1011 and/or the second optical lens 1012, the distance between the light source assembly 103 and the eyeball 104 becomes larger, so that the light emitted from the LED light sources arranged at the top or bottom of that surface can enter the eyeball 104, thereby illuminating the eyeball 104 and completing the tracking of the eyeball.
- when the light source assembly 103 is located at one side of the first optical lens 1011, the light source assembly 103 may be directly located on the right side of the first optical lens 1011, may be located between the first optical lens 1011 and the second optical lens 1012, or may be directly located on the left side of the second optical lens 1012.
- when the light source assembly 103 is located at one side of the second optical lens 1012, the light source assembly 103 may be directly located on the right side of the second optical lens 1012; if there are other lenses on the right side of the second optical lens 1012, the light source assembly 103 may also be located between the second optical lens 1012 and the other optical lenses, or directly on the left side of the other lenses. This application does not limit this.
- the light emitting surface of the light source assembly 103 is attached to the side of the first optical lens 1011 away from the eyeball 104 .
- the light-emitting surface of the light source assembly 103 is fitted against the side of the first optical lens 1011 away from the eyeball 104, so that the light from the multiple LED light sources in the light source assembly 103 can be incident perpendicularly on the first optical lens 1011, undergo slight refraction and scattering in the first optical lens 1011, and exit from the left side of the first optical lens 1011 toward the eyeball 104; the portion of the first optical lens 1011 that fits against the light source assembly 103 can be a plane perpendicular to the first direction x.
- the eye tracking system also includes a plurality of correction lenses 105 whose number is the same as the number of the light source assemblies 103, and the plurality of correction lenses 105 are located between the first optical lens 1011 and the second optical lens 1012, and correspond one-to-one to the light sources of the plurality of light source assemblies 103, and are configured to correct the infrared light beam passing through the second optical lens 1012, and to incident the corrected infrared light beam on the first optical lens 1011.
- the light emitting surface of the light source assembly 103 is in contact with the side of the second optical lens 1012 away from the eyeball 104, so that the light from the multiple LED light sources in the light source assembly 103 can be incident perpendicularly on the second optical lens 1012, undergo slight refraction and scattering in the second optical lens 1012, exit from the left side of the second optical lens 1012 toward the right side of the first optical lens 1011, and exit from the left side of the first optical lens 1011 toward the eyeball 104; the portion of the second optical lens 1012 that fits against the light source assembly 103 may be a plane perpendicular to the first direction x.
- the correction lens 105 may be a converging lens or the like, which converges the infrared light beam exiting from the left side of the second optical lens 1012 so that it can be incident perpendicularly on the right side of the first optical lens 1011; this avoids the situation in which the infrared light beam is scattered after passing through the second optical lens 1012 and, after entering the first optical lens 1011 again, is scattered so strongly that no Purkinje spot can be formed on the eyeball 104.
- the correction lens 105 may be a convex lens, etc.
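- As a rough optical aside that is not stated in this publication, a converging correction lens keeps the beam close to parallel when the point from which the infrared beam appears to diverge lies near the lens's front focal plane. With the Gaussian thin-lens formula (real-is-positive convention):

$$\frac{1}{f} = \frac{1}{s_o} + \frac{1}{s_i}, \qquad s_o \to f \;\Rightarrow\; s_i \to \infty,$$

so placing the correction lens 105 roughly one focal length from that apparent source keeps the rays nearly parallel to the first direction x before they reach the first optical lens 1011; the spacing and focal length here are illustrative assumptions.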
- the correction lens 105 and the second optical lens 1012 or the first optical lens 1011 are integrally formed. This is beneficial to reduce the distance between the optical lenses and reduce the overall size of the device.
- the second optical lens 1012 and the first optical lens 1011 both have an edge ring 106 and a central area 107, and the edge ring 106 is configured for arranging the light source assembly 103, wherein, when the light source assembly 103 is located on the side of the first optical lens 1011 away from the eyeball 104, the edge ring 106 of the first optical lens 1011 is provided with an infrared anti-reflection film layer.
- the infrared anti-reflection film layer can be an infrared anti-reflection film well known to those skilled in the art, such as magnesium fluoride, titanium oxide, lead sulfide, and a ceramic infrared anti-reflection film, which is not limited here.
- the edge ring 106 of the second optical lens 1012 is provided with an infrared anti-reflection film layer, and the central area 107 of the second optical lens 1012 is provided with an infrared absorption film layer.
- an infrared anti-reflection film layer can be set on the edge ring 106 of the first optical lens 1011 to increase the transmittance of the infrared light beam emitted by the light source assembly 103 and reduce the loss of the infrared light beam in the first optical lens 1011.
- an infrared absorption film layer is set in the central area 107 of the second optical lens 1012 to absorb the infrared light beam reflected from the first optical lens 1011 back to the second optical lens 1012, so as to prevent stray light from disturbing the beams that the first optical lens 1011 and the second optical lens 1012 transmit for normal display.
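- For orientation only (the refractive indices below are typical textbook values, not figures from this publication), the benefit of an anti-reflection layer can be estimated from the normal-incidence Fresnel reflectance of an uncoated surface and the quarter-wave condition for an ideal single-layer coating:

$$R = \left(\frac{n_1 - n_2}{n_1 + n_2}\right)^2, \qquad n_c = \sqrt{n_1 n_2}, \qquad t = \frac{\lambda}{4 n_c}.$$

For an air-glass interface with $n_1 = 1$ and $n_2 = 1.5$, $R \approx 4\%$ per surface; a coating whose index is close to $\sqrt{1.5} \approx 1.22$ (magnesium fluoride, with $n \approx 1.38$, is the usual practical compromise) and whose optical thickness is a quarter of the infrared wavelength suppresses much of this loss, which is what increasing the transmittance of the edge ring 106 amounts to.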
- the image capture component 102 is located on a side of the first optical lens 1011 adjacent to the eyeball 104.
- the image capture component 102 is a charge-coupled device (CCD) camera or a complementary metal-oxide-semiconductor (CMOS) camera.
- CCD Charge Coupled Device
- CMOS Complementary Metal Oxide Semiconductor
- in other embodiments, if the light source assembly 103 is located on the side of the right-hand surface of the first optical lens 1011, the image capture assembly 102 can also be located between the first optical lens 1011 and the second optical lens 1012 (as shown in Figures 6 and 7).
- the central area 107 of the left side of the second optical lens 1012 can be provided with an infrared reflective film layer.
- Fig. 8 is a block diagram of a device provided by an embodiment of the present application.
- the device 200 includes two eye tracking systems 100 provided by any embodiment of the present application; and further includes: a left eye viewing component 110 installed with one eye tracking system 100 and a right eye viewing component 120 installed with another eye tracking system 100, wherein the left eye viewing component 110 and the right eye viewing component 120 are symmetrically arranged.
- the device 200 may be VR glasses or the like.
- the eye tracking system proposed in the embodiments of the present application includes: an optical component, an image capture component and a light source component; the optical component includes at least: a first optical lens and a second optical lens arranged along a first direction; the light source component is located on the side of the first optical lens away from the eyeball, and/or on the side of the second optical lens away from the eyeball; the light source component is configured to emit an infrared light beam, the infrared light beam is incident on the first optical lens and/or the second optical lens and at least passes through the first optical lens to reach the eyeball, forming a Purkinje spot on the eyeball; and the image capture component is configured to collect an image of the eyeball, wherein the first direction is the line of sight direction of the human eye looking directly at the first optical lens.
- in this way, the light source component is farther from the eyeball and the angle of view of the light it emits is reduced, so that more of the light emitted by the light source component can enter the eye, thereby enabling eye tracking and improving the user experience.
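- A back-of-the-envelope example of this geometric effect (the distances are invented for illustration and do not appear in this publication): a light source a height $h$ off the visual axis, at distance $d$ from the eye, reaches the eye at an off-axis angle

$$\theta = \arctan\!\left(\frac{h}{d}\right), \qquad h = 20\ \text{mm}:\; d = 15\ \text{mm} \Rightarrow \theta \approx 53^{\circ}, \quad d = 35\ \text{mm} \Rightarrow \theta \approx 30^{\circ},$$

so moving the light sources behind the lens stack lets the illumination approach the eye at a shallower angle, where it is less likely to be cut off by the eyelid or the corner of the eye.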
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Vascular Medicine (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ophthalmology & Optometry (AREA)
- Optics & Photonics (AREA)
- Eye Examination Apparatus (AREA)
Abstract
An eye tracking system and device. The eye tracking system includes: an optical component (101), an image capture component (102) and a light source component (103); the optical component (101) includes at least: a first optical lens (1011) and a second optical lens (1012) arranged along a first direction; the light source component (103) is located on a side of the first optical lens (1011) away from the eyeball (104) and/or on a side of the second optical lens (1012) away from the eyeball (104); the light source component (103) is configured to emit an infrared light beam, the infrared light beam is incident on the first optical lens (1011) and/or the second optical lens (1012) and passes at least through the first optical lens (1011) to reach the eyeball (104), forming a Purkinje spot on the eyeball (104); the image capture component (102) is configured to capture an image of the eyeball (104), wherein the first direction is the line-of-sight direction of a human eye looking directly at the first optical lens (1011).
Description
This application claims priority to Chinese patent application No. 202211203589.6, filed with the Chinese Patent Office on September 29, 2022, the entire contents of which are incorporated herein by reference.
This application relates to the technical fields of augmented reality, virtual reality and mixed reality, for example to an eye tracking system and device.
With the development of technology, virtual reality (VR) glasses and the like have developed rapidly and play an important role in daily life, the military, sports and other fields. The overall size of VR glasses is also shrinking, and the corresponding exit pupil distance (the distance from the eyeball to the VR lens) is shortened accordingly. The resulting problem is that the light source 2 in the related art is generally arranged on the surface of the lens 1 closest to the human eye 4 (as shown in FIG. 1); when the exit pupil distance is reduced, the original position of the light source 2 means that the light it emits may be blocked by the corner of the eye or the eyelid and cannot reach the eyeball, so that the camera 3 cannot track the position of the human eye 4, the user cannot use the VR glasses normally, and the experience is poor.
Summary
The present application provides an eye tracking system and device, to solve the problem in the related art that a small exit pupil distance makes it impossible to track the position of the eyeball.
An embodiment of the present application provides an eye tracking system, including: an optical component, an image capture component and a light source component;
the optical component includes at least: a first optical lens and a second optical lens arranged along a first direction; the light source component is located on a side of the first optical lens away from the eyeball and/or on a side of the second optical lens away from the eyeball; the light source component is configured to emit an infrared light beam, the infrared light beam is incident on the first optical lens and/or the second optical lens and passes at least through the first optical lens to reach the eyeball, forming a Purkinje spot on the eyeball; the image capture component is configured to capture an image of the eyeball, wherein the first direction is the line-of-sight direction of a human eye looking directly at the first optical lens.
An embodiment of the present application further provides a device, including two of the above eye tracking systems;
the device further includes: a left-eye viewing component on which one eye tracking system is mounted and a right-eye viewing component on which the other eye tracking system is mounted.
FIG. 1 is a schematic structural diagram of an eye tracking system provided in the related art;
FIG. 2 is a schematic structural diagram of an eye tracking system provided in an embodiment of the present application;
FIG. 3 is a schematic structural diagram of another eye tracking system provided in an embodiment of the present application;
FIG. 4 is a schematic side-view structural diagram of a first optical lens or a second optical lens in an eye tracking system provided in an embodiment of the present application;
FIG. 5 is a schematic front-view structural diagram of a first optical lens or a second optical lens in another eye tracking system provided in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of another eye tracking system provided in an embodiment of the present application;
FIG. 7 is a schematic structural diagram of another eye tracking system provided in an embodiment of the present application;
FIG. 8 is a block diagram of a device provided in an embodiment of the present application.
Reference numerals:
1. lens; 2. light source; 3. camera; 4. human eye; 101. optical component; 102. image capture component; 103. light source component; 1011. first optical lens; 1012. second optical lens; 104. eyeball; 105. correction lens; 106. edge ring; 107. central area;
200. device; 100. eye tracking system; 110. left-eye viewing component; 120. right-eye viewing component.
The technical solutions in the embodiments of the present application will be described below with reference to the drawings of the embodiments of the present application; the described embodiments are only some of the embodiments of the present application.
The terms "first", "second" and the like in the specification, claims and drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present application described here can be implemented in an order other than that illustrated or described here. In addition, the terms "include" and "have" and any variations thereof are intended to cover a non-exclusive inclusion; for example, a process, method, system, product or device that includes a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to such a process, method, product or device.
FIG. 1 is a schematic structural diagram of an eye tracking system provided in the related art. As shown in FIG. 1, the eye tracking system has one lens 1. Taking the orientation in FIG. 1 as an example, a light source 2 is embedded in or attached to the left side of the lens 1, the emitted light can strike the human eye 4, and the camera 3 can capture the light spot in the human eye 4 and then track the shift of the line of sight of the human eye 4. When the human eye 4 moves to the right, that is, when the exit pupil distance is shortened, the human eye 4 comes close to the lens 1; at this time, the light emitted by the uppermost and lowermost light sources on the left side of the lens 1 may be blocked by the eyelid of the human eye and cannot enter the eyeball.
The above eye tracking system is implemented using an optical recording method. The principle of the optical recording method is to use an infrared camera to record the eye movements of the subject, that is, to obtain eye images that reflect the eye movements, and to extract eye features from the obtained eye images in order to build a gaze estimation model. The eye features may include: pupil position, pupil shape, iris position, iris shape, eyelid position, eye corner position, Purkinje image position and the like.
Optical recording methods include the pupil-corneal reflection method. The principle of the pupil-corneal reflection method is that a near-infrared light source illuminates the eye, the eye is photographed by an infrared camera, and the reflection point of the light source on the cornea, namely the Purkinje spot, is captured at the same time, thereby obtaining an eye image with a light spot.
The absolute position of the Purkinje spot does not change with the rotation of the eyeball, but its position relative to the pupil and the eyeball changes constantly: when the human eye stares at the camera, that is, looks straight ahead, the Purkinje spot lies in the middle of the pupil, and when the human eye looks up, the Purkinje spot is below the pupil. Therefore, as long as the positions of the pupil and the Purkinje spot in the eye image are located in real time and the corneal reflection vector is calculated, the user's gaze direction can be estimated from a geometric model.
FIG. 2 is a schematic structural diagram of an eye tracking system provided in an embodiment of the present application. As shown in FIG. 2, the eye tracking system includes: an optical component 101, an image capture component 102 and a light source component 103; the optical component 101 includes at least a first optical lens 1011 and a second optical lens 1012 arranged along a first direction x; the light source component 103 is located on a side of the first optical lens 1011 away from the eyeball 104 and/or on a side of the second optical lens 1012 away from the eyeball 104; the light source component 103 is configured to emit an infrared light beam, the infrared light beam is incident on the first optical lens 1011 and/or the second optical lens 1012 and passes at least through the first optical lens 1011 to reach the eyeball 104, forming a Purkinje spot on the eyeball 104; the image capture component 102 is configured to capture an image of the eyeball 104, wherein the first direction x is the line-of-sight direction of a human eye looking directly at the first optical lens 1011.
The first optical lens 1011 may be a polarizing reflector and the second optical lens 1012 may be a beam splitter, or both the first optical lens 1011 and the second optical lens 1012 may be Fresnel lenses. The light source component 103 includes multiple light-emitting diode (LED) light sources.
When at least two optical lenses are arranged along the first direction x and each lens meets the conditions for carrying the light sources, the multiple LED light sources in the light source component 103 may all be arranged on one of the lenses; if each lens only meets the conditions for carrying part of the light sources, the LED light sources in the light source component 103 may be split between the two lenses. For example, if there are 10 LED light sources in total: when the first optical lens 1011 can accommodate all 10 light sources, the 10 light sources may be arranged on the first optical lens 1011 only; when the second optical lens 1012 can accommodate all 10 light sources, the 10 light sources may be arranged on the second optical lens 1012 only; when the first optical lens 1011 or the second optical lens 1012 cannot carry all 10 light sources because of size, shape or other constraints, light sources may be arranged on both the first optical lens 1011 and the second optical lens 1012 so that the requirement of 10 light sources is met.
Because the light source component 103 is arranged on the side of the right-hand surface of the first optical lens 1011 and/or the second optical lens 1012, the distance between the light source component 103 and the eyeball 104 becomes larger, so that the light emitted from the LED light sources arranged at the top or bottom of that surface can enter the eyeball 104, thereby illuminating the eyeball 104 and completing the tracking of the eyeball.
When the light source component 103 is located on one side of the first optical lens 1011, the light source component 103 may be directly located on the right-hand surface of the first optical lens 1011, may be located between the first optical lens 1011 and the second optical lens 1012, or may be directly located on the left-hand surface of the second optical lens 1012. When the light source component 103 is located on one side of the second optical lens 1012, the light source component 103 may be directly located on the right-hand surface of the second optical lens 1012; if there are other lenses to the right of the second optical lens 1012, the light source component 103 may also be located between the second optical lens 1012 and the other optical lenses, or directly on the left-hand surface of the other lenses. The present application does not limit this.
According to an embodiment of the present application, as shown in FIG. 2, when the light source component 103 is located on the side of the first optical lens 1011 away from the eyeball 104, the light-emitting surface of the light source component 103 is attached to the side of the first optical lens 1011 away from the eyeball 104.
With the light-emitting surface of the light source component 103 attached to the side of the first optical lens 1011 away from the eyeball 104, the light from the multiple LED light sources in the light source component 103 can be incident perpendicularly on the first optical lens 1011, undergo slight refraction and scattering in the first optical lens 1011, and exit from the left side of the first optical lens 1011 toward the eyeball 104, wherein the portion of the first optical lens 1011 to which the light source component 103 is attached may be a plane perpendicular to the first direction x.
According to an embodiment of the present application, as shown in FIG. 3, when the light source component 103 is located on the side of the second optical lens 1012 away from the eyeball 104, the light-emitting surface of the light source component 103 is attached to the side of the second optical lens 1012 away from the eyeball 104, and the infrared light beam emitted by the light source component 103 is incident on the second optical lens 1012; the eye tracking system further includes a plurality of correction lenses 105 equal in number to the light source components 103, the plurality of correction lenses 105 are located between the first optical lens 1011 and the second optical lens 1012, correspond one-to-one to the light sources of the plurality of light source components 103, and are configured to correct the infrared light beam that has passed through the second optical lens 1012 and to direct the corrected infrared light beam onto the first optical lens 1011.
With the light-emitting surface of the light source component 103 attached to the side of the second optical lens 1012 away from the eyeball 104, the light from the multiple LED light sources in the light source component 103 can be incident perpendicularly on the second optical lens 1012, undergo slight refraction and scattering in the second optical lens 1012, exit from the left side of the second optical lens 1012 toward the right side of the first optical lens 1011, and exit from the left side of the first optical lens 1011 toward the eyeball 104, wherein the portion of the second optical lens 1012 to which the light source component 103 is attached may be a plane perpendicular to the first direction x.
The correction lens 105 may be a converging lens or the like, which converges the infrared light beam exiting from the left side of the second optical lens 1012 so that it can be incident perpendicularly on the right side of the first optical lens 1011; this avoids the situation in which the infrared light beam is scattered after passing through the second optical lens 1012 and, after entering the first optical lens 1011 again, is scattered so strongly that no Purkinje spot can be formed on the eyeball 104. The correction lens 105 may be a convex lens or the like.
According to an embodiment of the present application, as shown in FIG. 4, the correction lens 105 is integrally formed with the second optical lens 1012 or the first optical lens 1011. This helps reduce the spacing between the optical lenses and the overall size of the device.
According to an embodiment of the present application, as shown in FIG. 5, both the second optical lens 1012 and the first optical lens 1011 have an edge ring 106 and a central area 107, and the edge ring 106 is configured for arranging the light source component 103, wherein, when the light source component 103 is located on the side of the first optical lens 1011 away from the eyeball 104, the edge ring 106 of the first optical lens 1011 is provided with an infrared anti-reflection film layer.
This helps the infrared light beam emitted by the light source component 103 pass through the first optical lens 1011, increases the transmittance of the infrared light beam emitted by the light source component 103, and reduces the loss of the infrared light beam in the first optical lens 1011. The infrared anti-reflection film layer may be an infrared anti-reflection film well known to those skilled in the art, such as magnesium fluoride, titanium oxide, lead sulfide or a ceramic infrared anti-reflection film, which is not limited here.
According to an embodiment of the present application, as shown in FIG. 5, when the light source component 103 is located on the side of the second optical lens 1012 away from the eyeball 104, the edge ring 106 of the second optical lens 1012 is provided with an infrared anti-reflection film layer, and the central area 107 of the second optical lens 1012 is provided with an infrared absorption film layer.
This helps the infrared light beam emitted by the light source component 103 pass through the second optical lens 1012, increases the transmittance of the infrared light beam emitted by the light source component 103, and reduces the loss of the infrared light beam in the second optical lens 1012. At the same time, an infrared anti-reflection film layer may also be provided on the edge ring 106 of the first optical lens 1011 to increase the transmittance of the infrared light beam emitted by the light source component 103 and reduce its loss in the first optical lens 1011. Moreover, the central area 107 of the second optical lens 1012 is provided with an infrared absorption film layer to absorb the infrared light beam reflected from the first optical lens 1011 back to the second optical lens 1012, so as to prevent stray light from disturbing the beams that the first optical lens 1011 and the second optical lens 1012 transmit for normal display.
According to an embodiment of the present application, as shown in FIG. 2 and FIG. 3, the image capture component 102 is located on the side of the first optical lens 1011 adjacent to the eyeball 104. The image capture component 102 is a charge-coupled device (CCD) camera or a complementary metal-oxide-semiconductor (CMOS) camera.
In other embodiments, if the light source component 103 is located on the side of the right-hand surface of the first optical lens 1011, the image capture component 102 may also be located between the first optical lens 1011 and the second optical lens 1012 (as shown in FIG. 6 and FIG. 7). In FIG. 7, the central area 107 of the left side of the second optical lens 1012 may be provided with an infrared reflection film layer.
FIG. 8 is a block diagram of a device provided in an embodiment of the present application. As shown in FIG. 8, the device 200 includes two eye tracking systems 100 according to any embodiment of the present application; it further includes: a left-eye viewing component 110 on which one eye tracking system 100 is mounted and a right-eye viewing component 120 on which the other eye tracking system 100 is mounted, wherein the left-eye viewing component 110 and the right-eye viewing component 120 are arranged symmetrically.
The device 200 may be VR glasses or the like.
The eye tracking system proposed in the embodiments of the present application includes: an optical component, an image capture component and a light source component; the optical component includes at least a first optical lens and a second optical lens arranged along a first direction; the light source component is located on the side of the first optical lens away from the eyeball and/or on the side of the second optical lens away from the eyeball; the light source component is configured to emit an infrared light beam, the infrared light beam is incident on the first optical lens and/or the second optical lens and passes at least through the first optical lens to reach the eyeball, forming a Purkinje spot on the eyeball; the image capture component is configured to capture an image of the eyeball, wherein the first direction is the line-of-sight direction of a human eye looking directly at the first optical lens. In this way, the light source component is farther from the eyeball and the angle of view of the light it emits is reduced, so that more of the light emitted by the light source component can enter the eye, thereby enabling eye tracking and improving the user experience.
Claims (10)
- An eye tracking system, comprising: an optical component (101), an image capture component (102) and a light source component (103); wherein the optical component (101) comprises at least: a first optical lens (1011) and a second optical lens (1012) arranged along a first direction; the light source component (103) is located on at least one of a side of the first optical lens (1011) away from an eyeball (104) and a side of the second optical lens (1012) away from the eyeball (104); the light source component (103) is configured to emit an infrared light beam, the infrared light beam is incident on at least one of the first optical lens (1011) and the second optical lens (1012) and passes at least through the first optical lens (1011) to reach the eyeball (104), forming a Purkinje spot on the eyeball (104); and the image capture component (102) is configured to capture an image of the eyeball (104), wherein the first direction is a line-of-sight direction of a human eye looking directly at the first optical lens (1011).
- The eye tracking system according to claim 1, wherein, in a case where the light source component (103) is located on the side of the first optical lens (1011) away from the eyeball (104), a light-emitting surface of the light source component (103) is attached to the side of the first optical lens (1011) away from the eyeball (104).
- The eye tracking system according to claim 1, wherein, in a case where the light source component (103) is located on the side of the second optical lens (1012) away from the eyeball (104), a light-emitting surface of the light source component (103) is attached to the side of the second optical lens (1012) away from the eyeball (104), and the infrared light beam emitted by the light source component (103) is incident on the second optical lens (1012); the eye tracking system further comprises a plurality of correction lenses (105) equal in number to the light source components (103), the plurality of correction lenses (105) being located between the first optical lens (1011) and the second optical lens (1012), corresponding one-to-one to light sources of the plurality of light source components (103), and being configured to correct the infrared light beam that has passed through the second optical lens (1012) and to direct the corrected infrared light beam onto the first optical lens (1011).
- The eye tracking system according to claim 3, wherein the correction lens (105) is integrally formed with the second optical lens (1012) or the first optical lens (1011).
- The eye tracking system according to claim 1, wherein both the second optical lens (1012) and the first optical lens (1011) have an edge ring (106) and a central area (107), and the edge ring (106) is configured for arranging the light source component (103), wherein, in a case where the light source component (103) is located on the side of the first optical lens (1011) away from the eyeball (104), the edge ring (106) of the first optical lens (1011) is provided with an infrared anti-reflection film layer.
- The eye tracking system according to claim 5, wherein, in a case where the light source component (103) is located on the side of the second optical lens (1012) away from the eyeball (104), the edge ring (106) of the second optical lens (1012) is provided with an infrared anti-reflection film layer, and the central area (107) of the second optical lens (1012) is provided with an infrared absorption film layer.
- The eye tracking system according to claim 6, wherein the image capture component (102) is located on a side of the first optical lens (1011) adjacent to the eyeball (104).
- The eye tracking system according to claim 1, wherein the light source component (103) comprises: a plurality of infrared light-emitting diode (LED) light sources; and the image capture component (102) is a charge-coupled device (CCD) camera or a complementary metal-oxide-semiconductor (CMOS) camera.
- The eye tracking system according to claim 1, wherein the first optical lens (1011) is a polarizing reflector and the second optical lens (1012) is a beam splitter, or both the first optical lens (1011) and the second optical lens (1012) are Fresnel lenses.
- A device, comprising two eye tracking systems (100) according to any one of claims 1 to 9; the device further comprising: a left-eye viewing component (110) on which one eye tracking system (100) is mounted and a right-eye viewing component (120) on which the other eye tracking system (100) is mounted.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211203589.6 | 2022-09-29 | ||
CN202211203589.6A CN117826416A (zh) | 2022-09-29 | 2022-09-29 | Eye tracking system and device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024067657A1 true WO2024067657A1 (zh) | 2024-04-04 |
Family
ID=90476450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2023/121832 WO2024067657A1 (zh) | 2023-09-27 | Eye tracking system and device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN117826416A (zh) |
WO (1) | WO2024067657A1 (zh) |
- 2022-09-29 CN CN202211203589.6A patent/CN117826416A/zh active Pending
- 2023-09-27 WO PCT/CN2023/121832 patent/WO2024067657A1/zh unknown
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08234141A (ja) * | 1994-12-01 | 1996-09-13 | Olympus Optical Co Ltd | Head-mounted image display device |
CN206178658U (zh) * | 2016-08-10 | 2017-05-17 | 北京七鑫易维信息技术有限公司 | Eye tracking module for video glasses |
CN213934926U (zh) * | 2021-01-05 | 2021-08-10 | 南昌虚拟现实研究院股份有限公司 | Eye tracking module |
WO2022170287A2 (en) * | 2021-06-07 | 2022-08-11 | Panamorph, Inc. | Near-eye display system |
CN219066130U (zh) * | 2022-10-18 | 2023-05-23 | 腾讯科技(深圳)有限公司 | Head-mounted display device |
CN116449566A (zh) * | 2023-03-28 | 2023-07-18 | 歌尔光学科技有限公司 | Near-eye display module and head-mounted display device |
Also Published As
Publication number | Publication date |
---|---|
CN117826416A (zh) | 2024-04-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23870884; Country of ref document: EP; Kind code of ref document: A1 |