WO2022205769A1 - Gaze rendering method and system for virtual reality system based on monocular eye tracking - Google Patents

Gaze rendering method and system for virtual reality system based on monocular eye tracking

Info

Publication number
WO2022205769A1
WO2022205769A1 (PCT/CN2021/116750; CN2021116750W)
Authority
WO
WIPO (PCT)
Prior art keywords
monocular
virtual reality
user
eyeball
rendering
Prior art date
Application number
PCT/CN2021/116750
Other languages
English (en)
French (fr)
Inventor
吴涛
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司 filed Critical 青岛小鸟看看科技有限公司
Priority to EP21934408.2A priority Critical patent/EP4206979A4/en
Priority to US17/816,408 priority patent/US11715176B2/en
Publication of WO2022205769A1 publication Critical patent/WO2022205769A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Definitions

  • the present disclosure relates to the technical field of virtual reality, and more particularly, to a gaze rendering method and system for a virtual reality system based on monocular eye tracking.
  • virtual reality systems are becoming increasingly common and are used in many fields, such as computer games, health and safety, industry, and education and training.
  • mixed virtual reality systems are being integrated into mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university labs, student classrooms, hospital exercise rooms, and more.
  • virtual reality is a form of reality that has been adjusted in some way before being presented to the user, and may include virtual reality (Virtual Reality, VR), augmented reality (Augmented Reality, AR), mixed reality (Mixed Reality, MR), or some combination and/or derivative thereof.
  • a typical virtual reality system includes one or more devices configured to present and display content to a user.
  • a virtual reality system may include a Head Mounted Display (HMD) worn by a user and configured to output virtual reality content to the user.
  • a virtual reality system configured as an all-in-one machine is more popular at present, that is, hardware such as the mobile computing processing unit and the image and graphics renderer are integrated into the all-in-one device. Because virtual reality all-in-one devices are now applied and popularized in many fields, some scenarios place high demands on quality parameters such as the image clarity of the rendered content they present, which poses no small challenge to the processing and rendering capabilities of the device's mobile platform.
  • current solutions take several directions: 1. Reduce the display resolution of the virtual reality headset as a whole, thereby reducing the computing and rendering resources required for the virtual content. 2. Render a central portion of the screen at high resolution while rendering and processing the other areas at reduced resolution, optimizing the computing and rendering resources for the virtual content. 3. Use eye tracking to obtain the positions on the display screen gazed at by the two eyes, render the gaze areas at high resolution, and render and process the content of the other areas at low resolution.
  • the above solutions 1 and 2 both have a certain negative impact on the display clarity of the virtual content presented by the virtual reality all-in-one device and greatly affect the user experience.
  • solution 3 solves, to a certain extent, the problem of the display clarity of the virtual content in the user's gaze area. However, the current mainstream eye tracking technology on virtual reality all-in-one devices installs two eye tracking modules, one at the left-eye position and one at the right-eye position of the screen, and uses the same light source in both modules. As a result, the light emitted by the light sources of the two modules easily interferes during calibration or use, especially for users wearing myopia glasses, which increases the error of the calculation results and degrades the positional accuracy of eye tracking.
  • a gaze rendering method and system for a virtual reality system based on single eye tracking can effectively avoid the problem of the light sources of two eye tracking modules interfering with each other during calibration or use, and can track the position areas of the user's eyes with high precision in real time, so as to meet the needs of the user's gaze rendering.
  • the embodiments of the present disclosure provide a gaze rendering method and system for a virtual reality system based on monocular eye tracking, so as to solve the problems caused by installing two eye tracking modules at the left and right eye positions of the all-in-one machine screen and using the same light source in both: the light emitted by the two light sources easily interferes during calibration or use, especially for users wearing myopia glasses, which increases the error of the calculation results and degrades the positional accuracy of eye tracking.
  • An embodiment of the present disclosure provides a gaze rendering method for a virtual reality system based on single eye tracking, including:
  • collecting the first monocular mapping position on the display screen of the virtual reality system corresponding to the user's monocular eyeball;
  • acquiring the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes, and calculating the second monocular mapping position corresponding to the user's other monocular eyeball on the display screen according to the distance interpupillary distance and the first monocular mapping position;
  • taking the first monocular mapping position and the second monocular mapping position respectively as circle centers, extending the main rendering areas with the preset threshold as the radius;
  • performing main rendering in the main rendering areas, and performing cooperative rendering on the area of the display screen of the virtual reality system outside the main rendering areas; wherein the rendering resolution of the main rendering is higher than that of the cooperative rendering.
  • the process of collecting the first monocular mapping position on the display screen of the virtual reality system corresponding to the user's monocular eyeball includes:
  • the light reflected by the user's monocular eyeball is captured, and the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system is obtained from the relative position of the light through computer vision technology.
  • the light is infrared light or visible light.
  • when the light is infrared light, an infrared light source component is set to emit the infrared light; the infrared light reflected by the user's monocular eyeball is captured by an infrared light tracking camera, and the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system is acquired from the relative position of the infrared light through computer vision technology.
  • the infrared light tracking camera is built into the virtual reality all-in-one device at the position corresponding to the user's monocular eyeball; the infrared light source components are arranged around the infrared light tracking camera.
  • an infrared tracking image, formed by the light of the infrared light source assembly reflected off the eyeball, is captured by the infrared light tracking camera, and the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system is acquired through computer vision technology.
  • when the light is visible light, the visible light is emitted by a visible light source component; the visible light reflected by the user's monocular eyeball is captured by a visible light tracking camera, and the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system is acquired from the relative position of the visible light through computer vision technology.
  • the visible light tracking camera is built in the virtual reality all-in-one device at a position corresponding to the user's monocular eyeball; the visible light source component is arranged around the visible light tracking camera.
  • acquiring the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes includes: adapting the IPD adjustment function module built into the virtual reality all-in-one device to the user's eyes, so as to acquire the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes.
  • Embodiments of the present disclosure also provide a gaze rendering system for a virtual reality system based on monocular eye tracking, configured to implement the aforementioned gaze rendering method, including a virtual reality system display screen set in a virtual reality all-in-one machine, a monocular tracking module built into the all-in-one machine, a distance interpupillary distance (Inter-Pupillary Distance, IPD) adjustment function module, a processor and a rendering module, wherein,
  • the monocular tracking module is configured to collect the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system;
  • the IPD adjustment function module is configured to be adapted to the user's eyes to obtain the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes;
  • the processor is configured to calculate, according to the distance interpupillary distance and the first monocular mapping position, the second monocular mapping position corresponding to the user's other monocular eyeball on the display screen of the virtual reality system, and, taking the first monocular mapping position and the second monocular mapping position respectively as circle centers, to extend the main rendering areas with the preset threshold as the radius;
  • the rendering module is configured to perform main rendering in the main rendering areas, and to perform cooperative rendering on the area of the display screen of the virtual reality system outside the main rendering areas; wherein the rendering resolution of the main rendering is higher than that of the cooperative rendering.
  • the monocular tracking module is an infrared tracking module or a visible light tracking module.
  • the infrared tracking module includes an infrared light-emitting source assembly and an infrared tracking camera; wherein,
  • the infrared light source assembly is configured to emit infrared light
  • the infrared light tracking camera is configured to capture the infrared light reflected by the user's monocular eyeball, and to obtain, from the relative position of the infrared light through computer vision technology, the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system.
  • the infrared light tracking camera is built into the virtual reality all-in-one device at the position corresponding to the user's monocular eyeball;
  • the infrared light source assembly is arranged around the infrared light tracking camera.
  • when the infrared tracking camera tracks the position of the user's monocular eyeball, the infrared light tracking camera is configured to capture the infrared tracking image formed by the light of the infrared light source assembly reflected off the eyeball, and the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system is acquired through computer vision technology.
  • the visible light tracking module includes a visible light source assembly and a visible light tracking camera;
  • the visible light source component is configured to emit visible light
  • the visible light tracking camera is configured to capture the visible light reflected by the user's monocular eyeball, and to obtain, from the relative position of the visible light through computer vision technology, the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system.
  • the visible light tracking camera is built into the virtual reality all-in-one device at the position corresponding to the user's monocular eyeball;
  • the visible light source components are arranged around the visible light tracking camera.
  • the wavelength range of light traceable by the visible light tracking camera is 400-900 nm.
  • when the visible light tracking camera tracks the position of the user's monocular eyeball, the visible light tracking camera is configured to capture the tracking image formed by the light of the visible light source component reflected off the eyeball, and the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system is acquired through computer vision technology.
  • Embodiments of the present disclosure also provide a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method described in any of the foregoing embodiments or exemplary embodiments.
  • the gaze rendering method and system for a virtual reality system based on monocular eye tracking provided by the embodiments of the present disclosure first collect the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system, then acquire the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes, and calculate from the distance interpupillary distance and the first monocular mapping position the second monocular mapping position corresponding to the user's other monocular eyeball on the display screen. Taking the first monocular mapping position and the second monocular mapping position respectively as circle centers, main rendering areas are extended with the preset threshold as the radius; higher-resolution main rendering is performed inside these areas and lower-resolution cooperative rendering on the area outside them, presenting higher image clarity to the user's eyes and improving the user experience. Single eye tracking solves the problems in dual-eye tracking of light emitted by the light sources interfering mutually and of large calculation errors degrading the positional accuracy of eye tracking, and the position areas of the user's two eyes are tracked with high precision in real time, greatly meeting the needs of the user's gaze rendering.
  • FIG. 1 is a flowchart of a gaze rendering method for a virtual reality system based on monocular eye tracking according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a gaze rendering system of a virtual reality system based on monocular eye tracking according to an embodiment of the present disclosure.
  • the quality parameters, such as image clarity, of the rendered content presented by a virtual reality all-in-one device are relatively demanding. Using eye tracking to obtain the gaze positions of the two eyes on the display screen, rendering the gaze areas at high resolution and rendering and processing the other areas at low resolution solves, to a certain extent, the problem of the display clarity of the virtual content in the user's gaze area. However, the current mainstream eye tracking technology on virtual reality all-in-one devices installs two eye tracking modules at the left and right eye positions of the screen and uses the same light source in both, so the light emitted by the light sources of the two modules easily interferes during calibration or use, especially for users wearing myopia glasses, which increases the error of the calculation results and degrades the positional accuracy of eye tracking.
  • embodiments of the present disclosure provide a method and system for gaze rendering in a virtual reality system based on monocular eye tracking. Specific embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
  • FIG. 1 illustrates the gaze rendering method of a virtual reality system based on monocular eye tracking according to an embodiment of the present disclosure, and FIG. 2 exemplarily illustrates the gaze rendering system of the monocular-tracking-based virtual reality system of the embodiment.
  • the gaze rendering method for a virtual reality system based on monocular eye tracking includes:
  • S110: Collect the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system;
  • S120: Obtain the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes, and calculate the second monocular mapping position corresponding to the user's other monocular eyeball on the display screen according to the distance interpupillary distance and the first monocular mapping position;
  • S130: Taking the first monocular mapping position and the second monocular mapping position respectively as circle centers, extend the main rendering areas with the preset threshold as the radius;
  • S140: Perform main rendering in the main rendering areas, and perform cooperative rendering on the area of the display screen of the virtual reality system outside the main rendering areas; wherein the rendering resolution of the main rendering is higher than that of the cooperative rendering.
  • in step S110, the process of collecting the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system includes:
  • S11: Emit light to the user's monocular eyeball; the light can be infrared or visible light, and the monocular eyeball can be either the left eye or the right eye;
  • S12: Capture the light reflected by the user's monocular eyeball, and obtain the first monocular mapping position corresponding to the eyeball on the display screen of the virtual reality system from the relative position of the light through computer vision technology.
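As a hedged illustration of the computer-vision step in S12 (the dark-pupil heuristic, the threshold value, the function names, and the affine calibration numbers are assumptions of this sketch, not details given in the disclosure), locating the pupil in a captured eye image and mapping it to a screen position might look like:

```python
def find_pupil_center(ir_image, threshold=60):
    """Locate the pupil in a grayscale eye image (list of pixel rows) as the
    centroid of the dark-pixel region: under IR illumination the pupil
    typically appears darker than the surrounding iris and sclera."""
    xs, ys = [], []
    for y, row in enumerate(ir_image):
        for x, value in enumerate(row):
            if value < threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # pupil not found in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def pupil_to_screen(pupil_xy, calib):
    """Map a pupil-image position to a display-screen position with a
    per-user affine calibration (a, b, tx, c, d, ty) obtained during setup."""
    a, b, tx, c, d, ty = calib
    x, y = pupil_xy
    return (a * x + b * y + tx, c * x + d * y + ty)

# Synthetic 100x100 frame: bright sclera (200), dark pupil block centred at (40, 30).
frame = [[200] * 100 for _ in range(100)]
for y in range(25, 36):
    for x in range(35, 46):
        frame[y][x] = 10
center = find_pupil_center(frame)   # -> (40.0, 30.0)
screen = pupil_to_screen(center, (20.0, 0.0, 100.0, 0.0, 20.0, 50.0))
```

A production tracker would add glint detection and a proper gaze model; the sketch only shows the pipeline shape: eye image in, screen coordinate out.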
  • the gaze rendering method for a virtual reality system based on monocular eye tracking provided by the present disclosure first collects the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system, then acquires the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes, and calculates from the distance interpupillary distance and the first monocular mapping position the second monocular mapping position corresponding to the user's other monocular eyeball on the display screen. Taking the first monocular mapping position and the second monocular mapping position respectively as circle centers, main rendering areas are extended with the preset threshold as the radius; higher-resolution main rendering is performed inside these areas and lower-resolution cooperative rendering on the rest of the display screen, presenting higher image clarity to the user's eyes and improving the user experience. Single eye tracking avoids the mutual interference of the light emitted by the light sources in dual-eye tracking and the resulting large calculation error that degrades the positional accuracy of eye tracking, while the position areas of the user's eyes are tracked with high precision in real time, greatly meeting the needs of the user's gaze rendering.
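The core computation of steps S120 and S130 can be sketched numerically. This is a minimal illustration under assumed conventions: the purely horizontal offset model for the untracked eye and the millimetre-to-pixel conversion factor are inventions of this sketch, since the disclosure does not spell out the formulas.

```python
def second_eye_position(first_xy, ipd_mm, mm_per_pixel, tracked_eye="right"):
    """Derive the untracked eye's mapping position from the tracked eye's
    position and the distance interpupillary distance (assumed model:
    a purely horizontal shift of the IPD converted to display pixels)."""
    offset_px = ipd_mm / mm_per_pixel
    x, y = first_xy
    return (x - offset_px, y) if tracked_eye == "right" else (x + offset_px, y)

def in_main_region(pixel_xy, centers, radius):
    """A pixel belongs to a main (high-resolution) rendering area if it lies
    within `radius` (the preset threshold) of either mapping position."""
    px, py = pixel_xy
    return any((px - cx) ** 2 + (py - cy) ** 2 <= radius ** 2
               for cx, cy in centers)

first = (1500.0, 540.0)   # tracked right eye's mapping position on the screen
second = second_eye_position(first, ipd_mm=64.0, mm_per_pixel=0.0625)
centers = [first, second]                   # second -> (476.0, 540.0)
near_gaze = in_main_region((1490, 550), centers, radius=150)   # True
periphery = in_main_region((1000, 100), centers, radius=150)   # False
```

The exact mm-to-pixel factor would come from the display geometry and optics of the specific all-in-one device.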
  • the present disclosure also provides a gaze rendering system 100 for a virtual reality system based on monocular eye tracking, which is configured to implement the aforementioned gaze rendering method for a virtual reality system based on monocular eye tracking.
  • the system includes a virtual reality system display screen 101, a monocular tracking module 102, an IPD adjustment function module 103, a processor 104 and a rendering module 105, built into the virtual reality all-in-one machine, wherein:
  • the monocular tracking module 102 is configured to collect the first monocular mapping position corresponding to the user's monocular eyeball on the display screen 101 of the virtual reality system;
  • the IPD adjustment function module 103 is configured to be adapted to the user's eyes to obtain the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes. In this embodiment, the IPD adjustment function module 103 is built into the virtual reality all-in-one device; the user can adjust the IPD value (distance interpupillary distance) to suit his or her own pupillary distance, so that the distance between the two tube lenses of the device matches the pupillary distance of the user's eyes;
  • the processor 104 is configured to calculate, according to the distance interpupillary distance and the first monocular mapping position, the second monocular mapping position corresponding to the user's other monocular eyeball on the display screen of the virtual reality system, and, taking the first monocular mapping position and the second monocular mapping position respectively as circle centers, to extend the main rendering areas with the preset threshold as the radius; in this embodiment, the preset threshold is not specifically limited and is adjusted according to the specific application.
  • the rendering module 105 is configured to perform main rendering in the main rendering areas, and to perform cooperative rendering on the area of the display screen of the virtual reality system outside the main rendering areas; wherein the rendering resolution of the main rendering is higher than that of the cooperative rendering.
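The two-resolution rendering performed by the rendering module can be sketched as a coarse full-frame pass (cooperative rendering) followed by a fine per-pixel pass inside the circular main areas. The `shade` scene function and the 4x coarsening factor below are placeholders invented for this toy illustration, not part of the disclosure:

```python
def shade(x, y):
    """Placeholder scene function standing in for the real renderer."""
    return (x + y) % 256

def render_foveated(width, height, centers, radius, scale=4):
    # Coarse pass: shade every `scale`-th pixel and replicate the result,
    # which mimics lower-resolution cooperative rendering.
    frame = [[shade((x // scale) * scale, (y // scale) * scale)
              for x in range(width)] for y in range(height)]
    # Fine pass: full per-pixel shading inside each circular main area.
    for cx, cy in centers:
        for y in range(max(0, cy - radius), min(height, cy + radius + 1)):
            for x in range(max(0, cx - radius), min(width, cx + radius + 1)):
                if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                    frame[y][x] = shade(x, y)
    return frame

# Small 64x32 frame with two main areas, one per monocular mapping position.
frame = render_foveated(64, 32, centers=[(16, 16), (48, 16)], radius=6)
```

Real headsets implement the same idea on the GPU (e.g. via variable shading rates) rather than shading pixels one by one; the sketch only conveys the high-inside/low-outside structure.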
  • the eye gaze position areas are rendered at high resolution while low-resolution content rendering and calculation processing are performed in the other areas, improving the overall calculation rate; the position areas of the user's eyes are tracked with high precision in real time, meeting the needs of the user's gaze rendering.
  • the monocular tracking module 102 is an infrared tracking module A or a visible light tracking module B.
  • the infrared tracking module A includes an infrared light source assembly A-1 and an infrared tracking camera A-2; wherein,
  • the infrared light source assembly A-1 is set to emit infrared light
  • the infrared light tracking camera A-2 is configured to capture the infrared light reflected by the user's monocular eyeball, and to obtain, from the relative position of the infrared light through computer vision technology, the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system.
  • an infrared tracking camera is used; the infrared light tracking camera is built into the virtual reality all-in-one device at the position corresponding to the user's monocular eyeball, and the infrared light source components are arranged around it, that is, a certain number of infrared light source components are installed near the infrared tracking camera.
  • taking the user's monocular eyeball to be the right eye as an example: when the infrared tracking camera tracks the position of the user's monocular eyeball, the light of the infrared light source component reflected off the eyeball forms an infrared tracking image that is captured, and the position information on the screen corresponding to the right eye's eyeball position is then obtained through computer vision technology.
  • the visible light tracking module B includes a visible light source assembly B-1 and a visible light tracking camera B-2;
  • the visible light source component B-1 is set to emit visible light
  • the visible light tracking camera B-2 is configured to capture the visible light reflected by the user's monocular eyeball, and to obtain, from the relative position of the visible light through computer vision technology, the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system.
  • a visible light tracking camera B-2 is set.
  • the visible light tracking camera B-2 may be a color camera or a monochrome grayscale camera.
  • the visible light tracking camera B-2 is built into the virtual reality all-in-one device at the position corresponding to the user's monocular eye, and the wavelength range of light traceable by the camera is 400-900 nm; the visible light source components B-1 are arranged around the visible light tracking camera.
  • the visible light source component B-1 is not limited to visible light; it may be any light source component whose light can be captured by the visible light tracking camera B-2.
  • taking the user's monocular eyeball to be the right eye as an example: when the visible light tracking camera tracks the user's eyeball position, the light of the visible light source component reflected off the eyeball forms a tracking image that is captured, and the position information on the screen corresponding to the right eye's eyeball position is then obtained through computer vision technology.
  • the gaze rendering system of the virtual reality system based on monocular eye tracking provided by the present disclosure first collects the first monocular mapping position corresponding to the user's monocular eyeball on the display screen of the virtual reality system, then acquires the distance interpupillary distance corresponding to the distance between the pupils of the user's two eyes, and calculates from the distance interpupillary distance and the first monocular mapping position the second monocular mapping position corresponding to the user's other monocular eyeball on the display screen. Taking the first monocular mapping position and the second monocular mapping position respectively as circle centers, main rendering areas are extended with the preset threshold as the radius; higher-resolution main rendering is performed inside these areas and lower-resolution cooperative rendering on the rest of the display screen, presenting higher image clarity to the user's eyes and improving the user experience. Single eye tracking solves the problems in dual-eye tracking of the light emitted by the light sources interfering mutually and of large calculation errors degrading the positional accuracy of eye tracking; moreover, the position areas of the user's eyes are tracked with high precision in real time, greatly meeting the needs of the user's gaze rendering.
  • the gaze rendering method and system for a virtual reality system based on monocular eye tracking proposed according to the present disclosure are described by way of example with reference to the accompanying drawings.
  • those skilled in the art should understand that various improvements can also be made to the gaze rendering method and system for a virtual reality system based on monocular eye tracking proposed in the present disclosure without departing from the content of the present disclosure. Accordingly, the scope of protection of the present disclosure should be determined by the content of the appended claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Processing Or Creating Images (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present disclosure provides a foveated rendering method of a virtual reality system based on monocular eyeball tracking. First, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of the user is collected; the distance interpupillary distance (IPD) corresponding to the pupil distance of the user's two eyes is then acquired, and the second monocular mapping position, on the display screen, corresponding to the user's other monocular eyeball is calculated from the distance IPD and the first monocular mapping position. Main rendering areas are then radiated out with the first monocular mapping position and the second monocular mapping position as circle centers and a preset threshold as the radius, and higher-resolution main rendering is performed in the main rendering areas. Monocular eyeball tracking solves the problem in binocular tracking that the light emitted by the two light sources easily interferes with each other and enlarges the calculation error, degrading the positional accuracy of eye tracking; moreover, the positions of both of the user's eyes are tracked with high precision in real time, which greatly satisfies the demand for gaze-based rendering.

Description

Foveated rendering method and system of virtual reality system based on monocular eyeball tracking
Technical Field
The present disclosure relates to the technical field of virtual reality, and more specifically, to a foveated rendering method and system of a virtual reality system based on monocular eyeball tracking.
Background
Owing to technological progress and the diversified development of market demand, virtual reality systems are becoming increasingly common and are applied in many fields, such as computer games, health and safety, industry, and education and training. To name a few examples, mixed virtual reality systems are being integrated into mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university laboratories, student classrooms, hospital exercise and fitness rooms, and many other corners of life.
Generally speaking, virtual reality is a form of reality that has been adjusted in some way before being presented to a user, and may include Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), or some combination and/or derivative thereof.
A typical virtual reality system includes one or more devices configured to present and display content to the user. For example, a virtual reality system may include a head-mounted display (HMD) worn by the user and configured to output virtual reality content to the user. Currently popular are all-in-one virtual reality systems, in which various hardware components, such as the mobile computing and processing unit and the image and graphics renderer, are integrated into a single all-in-one device. As all-in-one virtual reality devices are being applied and popularized in many fields and scenarios, some scenarios impose rather high requirements on quality parameters, such as the image clarity of the rendered content presented by the device, which poses no small challenge to the processing and rendering capabilities of the mobile side of the all-in-one virtual reality device.
Current solutions follow several directions: ① reduce the overall display resolution of the all-in-one virtual reality headset, thereby reducing the computing and rendering resources needed for the virtual content; ② render the virtual content in a central portion of the display screen at high resolution and render and process the other areas at reduced resolution, thereby optimizing the computing and rendering resources for the virtual content; ③ use eye tracking technology to obtain the gaze positions on the display screen corresponding to the positions of both eyeballs, render the gaze areas at high resolution, and render and process the other areas at low resolution.
Solutions ① and ② both negatively affect the display clarity of the virtual content presented by the all-in-one virtual reality device, which considerably degrades the user experience. Solution ③ solves, to a certain extent, the display clarity problem of the virtual content in the user's gaze areas. However, the mainstream eye tracking technology currently used on all-in-one virtual reality devices installs two eye tracking modules at the left-eye and right-eye positions of the screen, with the same light source used in both modules. As a result, during calibration or use, the light emitted by the light sources of the two modules easily interferes with each other, especially for users wearing myopia glasses, which enlarges the error of the calculation results and degrades the positional accuracy of eye tracking.
Therefore, there is an urgent need for a foveated rendering method and system of a virtual reality system based on monocular eyeball tracking that can effectively avoid mutual interference between the light sources of two eye tracking modules during calibration or use, and that can track the positions of both of the user's eyes with high precision in real time, so as to satisfy the demand for gaze-based rendering.
Summary
In view of the above problems, embodiments of the present disclosure provide a foveated rendering method and system of a virtual reality system based on monocular eyeball tracking, to solve the problem that two eye tracking modules are respectively installed at the left-eye and right-eye positions of the screen of an all-in-one virtual reality device and use the same light source, so that during calibration or use the light emitted by the light sources of the two modules easily interferes with each other, especially for users wearing myopia glasses, which enlarges the error of the calculation results and degrades the positional accuracy of eye tracking.
A foveated rendering method of a virtual reality system based on monocular eyeball tracking provided by an embodiment of the present disclosure includes:
collecting a first monocular mapping position, on a display screen of the virtual reality system, corresponding to one monocular eyeball of a user;
acquiring a distance interpupillary distance (IPD) corresponding to the pupil distance of the user's two eyes, and calculating, according to the distance IPD and the first monocular mapping position, a second monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's other monocular eyeball;
radiating out main rendering areas by respectively taking the first monocular mapping position and the second monocular mapping position as circle centers and a preset threshold as a radius;
performing main rendering in the main rendering areas, and performing cooperative rendering on the areas of the display screen of the virtual reality system other than the main rendering areas, where the rendering resolution of the main rendering is higher than the rendering resolution of the cooperative rendering.
In at least one exemplary embodiment, the process of collecting the first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of the user includes:
emitting light toward one monocular eyeball of the user;
capturing the light reflected by the user's monocular eyeball, and obtaining, through computer vision techniques and according to the relative position of the light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
In at least one exemplary embodiment, the light is infrared light or visible light.
In at least one exemplary embodiment, in a case where the light is infrared light, an infrared light source component is configured to emit the infrared light; an infrared tracking camera captures the infrared light reflected by the user's monocular eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques according to the relative position of the infrared light.
In at least one exemplary embodiment, the infrared tracking camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball; the infrared light source component is arranged around the infrared tracking camera.
In at least one exemplary embodiment, when the infrared tracking camera tracks the position of the user's monocular eyeball, the infrared tracking camera captures an infrared tracking image formed by the light of the infrared light source component reflected on the eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques.
In at least one exemplary embodiment, in a case where the light is visible light, a visible light source component emits the visible light; a visible light tracking camera captures the visible light reflected by the user's monocular eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques according to the relative position of the visible light.
In at least one exemplary embodiment, the visible light tracking camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball; the visible light source component is arranged around the visible light tracking camera.
In at least one exemplary embodiment, acquiring the distance IPD corresponding to the pupil distance of the user's two eyes includes: adapting an IPD adjustment function module built into the all-in-one virtual reality device to the user's two eyes, so as to acquire the distance IPD corresponding to the pupil distance of the user's two eyes.
An embodiment of the present disclosure further provides a foveated rendering system of a virtual reality system based on monocular eyeball tracking, configured to implement the foregoing foveated rendering method, which includes a virtual reality system display screen arranged in an all-in-one virtual reality device, a monocular tracking module built into the all-in-one virtual reality device, a distance interpupillary distance (Inter Pupillary Distance, IPD) adjustment function module, a processor, and a rendering module, wherein:
the monocular tracking module is configured to collect a first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of a user;
the IPD adjustment function module is configured to adapt to the user's two eyes so as to acquire a distance IPD corresponding to the pupil distance of the user's two eyes;
the processor is configured to calculate, according to the distance IPD and the first monocular mapping position, a second monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's other monocular eyeball, and to radiate out main rendering areas by respectively taking the first monocular mapping position and the second monocular mapping position as circle centers and a preset threshold as a radius;
the rendering module is configured to perform main rendering in the main rendering areas and to perform cooperative rendering on the areas of the display screen of the virtual reality system other than the main rendering areas, where the rendering resolution of the main rendering is higher than the rendering resolution of the cooperative rendering.
In at least one exemplary embodiment, the monocular tracking module is an infrared tracking module or a visible light tracking module.
In at least one exemplary embodiment, the infrared tracking module includes an infrared light source component and an infrared tracking camera, wherein:
the infrared light source component is configured to emit infrared light;
the infrared tracking camera is configured to capture the infrared light reflected by the user's monocular eyeball and to obtain, through computer vision techniques and according to the relative position of the infrared light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
In at least one exemplary embodiment, the infrared tracking camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball;
the infrared light source component is arranged around the infrared tracking camera.
In at least one exemplary embodiment, when the infrared tracking camera tracks the position of the user's monocular eyeball, the infrared tracking camera is configured to capture an infrared tracking image formed by the light of the infrared light source component reflected on the eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques.
In at least one exemplary embodiment, the visible light tracking module includes a visible light source component and a visible light tracking camera,
where the visible light source component is configured to emit visible light;
the visible light tracking camera is configured to capture the visible light reflected by the user's monocular eyeball and to obtain, through computer vision techniques and according to the relative position of the visible light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
In at least one exemplary embodiment, the visible light tracking camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball;
the visible light source component is arranged around the visible light tracking camera.
In at least one exemplary embodiment, the trackable wavelength band of the visible light tracking camera is 400-900 nm.
In at least one exemplary embodiment, when the visible light tracking camera tracks the position of the user's monocular eyeball, the visible light tracking camera is configured to capture a tracking image formed by the light of the visible light source component reflected on the eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques.
An embodiment of the present invention further provides a non-transitory computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the method according to any one of the foregoing embodiments or exemplary embodiments.
It can be seen from the above technical solutions that the foveated rendering method and system of a virtual reality system based on monocular eyeball tracking provided by the embodiments of the present disclosure first collect the first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of the user, then acquire the distance IPD corresponding to the pupil distance of the user's two eyes, and calculate, from the distance IPD and the first monocular mapping position, the second monocular mapping position, on the display screen, corresponding to the user's other monocular eyeball; main rendering areas are then radiated out with the first and second monocular mapping positions as circle centers and a preset threshold as the radius, higher-resolution main rendering is performed in the main rendering areas, and lower-resolution cooperative rendering is performed on the areas of the display screen other than the main rendering areas, thereby presenting higher image clarity to the user's eyes and improving the user experience. Monocular eyeball tracking also solves the problem in binocular tracking that the light emitted by the two light sources easily interferes with each other and enlarges the calculation error, degrading the positional accuracy of eye tracking; moreover, the positions of both of the user's eyes are tracked with high precision in real time, which greatly satisfies the demand for gaze-based rendering.
Brief Description of the Drawings
Other objects and results of the present disclosure will become clearer and easier to understand by reference to the following description taken in conjunction with the accompanying drawings, and as the present disclosure is more comprehensively understood. In the drawings:
FIG. 1 is a flowchart of a foveated rendering method of a virtual reality system based on monocular eyeball tracking according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a foveated rendering system of a virtual reality system based on monocular eyeball tracking according to an embodiment of the present disclosure.
Detailed Description
In some scenarios, rather high requirements are imposed on quality parameters, such as the image clarity of the rendered content presented by an all-in-one virtual reality device. If binocular eye tracking technology is used to obtain the gaze positions on the display screen corresponding to the positions of both eyeballs, rendering the gaze areas at high resolution and rendering and processing the other areas at low resolution can solve, to a certain extent, the display clarity problem of the virtual content in the user's gaze areas. However, the mainstream eye tracking technology currently used on all-in-one virtual reality devices installs two eye tracking modules at the left-eye and right-eye positions of the screen, with the same light source used in both modules; as a result, during calibration or use, the light emitted by the light sources of the two modules easily interferes with each other, especially for users wearing myopia glasses, which enlarges the error of the calculation results and degrades the positional accuracy of eye tracking.
In view of the above problems, embodiments of the present disclosure provide a foveated rendering method and system of a virtual reality system based on monocular eyeball tracking. Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
To illustrate the foveated rendering method and system provided by the embodiments of the present disclosure, FIG. 1 exemplarily illustrates the foveated rendering method of a virtual reality system based on monocular eyeball tracking according to an embodiment of the present disclosure, and FIG. 2 exemplarily illustrates the corresponding foveated rendering system.
The following description of the exemplary embodiments is merely illustrative and in no way limits the present disclosure or its application or use. Techniques and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques and devices should be considered part of the specification.
As shown in FIG. 1, the foveated rendering method of a virtual reality system based on monocular eyeball tracking provided by an embodiment of the present disclosure includes:
S110: collecting a first monocular mapping position, on a display screen of the virtual reality system, corresponding to one monocular eyeball of a user;
S120: acquiring a distance IPD corresponding to the pupil distance of the user's two eyes, and calculating, according to the distance IPD and the first monocular mapping position, a second monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's other monocular eyeball;
S130: radiating out main rendering areas by respectively taking the first monocular mapping position and the second monocular mapping position as circle centers and a preset threshold as a radius;
S140: performing main rendering in the main rendering areas, and performing cooperative rendering on the areas of the display screen of the virtual reality system other than the main rendering areas, where the rendering resolution of the main rendering is higher than the rendering resolution of the cooperative rendering.
As shown in FIG. 1, in step S110, the process of collecting the first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of the user includes:
S11: emitting light toward one monocular eyeball of the user, where the light may be infrared light or visible light, and the monocular eyeball may be the left eye or the right eye;
S12: capturing the light reflected by the user's monocular eyeball, and obtaining, through computer vision techniques and according to the relative position of the light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
As described above, in the foveated rendering method of a virtual reality system based on monocular eyeball tracking provided by the present disclosure, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of the user is first collected; the distance IPD corresponding to the pupil distance of the user's two eyes is then acquired, and the second monocular mapping position, on the display screen, corresponding to the user's other monocular eyeball is calculated from the distance IPD and the first monocular mapping position. Main rendering areas are then radiated out with the first and second monocular mapping positions as circle centers and a preset threshold as the radius; higher-resolution main rendering is performed in the main rendering areas, and lower-resolution cooperative rendering is performed on the areas of the display screen other than the main rendering areas, presenting higher image clarity to the user's eyes and improving the user experience. Monocular eyeball tracking also solves the problem in binocular tracking that the light emitted by the two light sources easily interferes with each other and enlarges the calculation error, degrading the positional accuracy of eye tracking; moreover, the positions of both of the user's eyes are tracked with high precision in real time, which greatly satisfies the demand for gaze-based rendering.
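The geometric core of steps S120 and S130 can be sketched as follows. This is only an illustrative sketch under stated assumptions, not the patent's implementation: the millimeter-to-pixel conversion factor, the sample coordinates, and the assumption that both gaze points share the same vertical coordinate are all hypothetical.

```python
# Sketch: derive the second monocular mapping position from the first one and
# the distance IPD (steps S120/S130). All constants here are hypothetical.

def second_mapping_position(first_pos, ipd_mm, pixels_per_mm, tracked_eye="right"):
    """Estimate the other eye's mapping position on the display screen.

    Assumes both gaze points share the same vertical coordinate and are
    horizontally offset by the distance IPD converted to pixels.
    """
    x, y = first_pos
    offset_px = ipd_mm * pixels_per_mm
    # If the right eye is the tracked one, the left-eye point lies to its left.
    return (x - offset_px, y) if tracked_eye == "right" else (x + offset_px, y)

def in_main_rendering_area(point, center, radius_px):
    """Check whether a pixel falls inside a circular main rendering area (S130)."""
    px, py = point
    cx, cy = center
    return (px - cx) ** 2 + (py - cy) ** 2 <= radius_px ** 2

first = (1400.0, 720.0)   # first monocular mapping position in pixels (right eye)
second = second_mapping_position(first, ipd_mm=63.0, pixels_per_mm=10.0)
# second == (770.0, 720.0): the estimated left-eye gaze point, 630 px to the left
```

The preset threshold (the radius) is left open by the disclosure; in practice it would be tuned per device and field of view.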
As shown in FIG. 2, the present disclosure further provides a foveated rendering system 100 of a virtual reality system based on monocular eyeball tracking, configured to implement the foregoing foveated rendering method, which includes a virtual reality system display screen 101 arranged in an all-in-one virtual reality device, a monocular tracking module 102 built into the all-in-one virtual reality device, an IPD adjustment function module 103, a processor 104, and a rendering module 105.
The monocular tracking module 102 is configured to collect the first monocular mapping position, on the virtual reality system display screen 101, corresponding to one monocular eyeball of the user.
The IPD adjustment function module 103 is configured to adapt to the user's two eyes to acquire the distance IPD corresponding to the pupil distance of the user's two eyes. In this embodiment, the IPD adjustment function module 103 is built into the all-in-one virtual reality device, and users can adjust the IPD value (distance IPD) to suit their own pupil distance, so that the distance between the two lens barrels of the device matches the user's pupil distance.
The processor 104 is configured to calculate, according to the distance IPD and the first monocular mapping position, the second monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's other monocular eyeball, and to radiate out main rendering areas by respectively taking the first monocular mapping position and the second monocular mapping position as circle centers and a preset threshold as a radius. In this embodiment, the preset threshold is not specifically limited and is adjusted according to the specific application.
The rendering module 105 is configured to perform main rendering in the main rendering areas and to perform cooperative rendering on the areas of the display screen of the virtual reality system other than the main rendering areas, where the rendering resolution of the main rendering is higher than the rendering resolution of the cooperative rendering. In this way, the gaze areas are rendered at high resolution while the other areas are rendered and processed at low resolution, which improves the overall computation rate; and the positions of both of the user's eyes are tracked with high precision in real time, which greatly satisfies the demand for gaze-based rendering.
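The two-resolution composition performed by the rendering module can be modeled roughly as follows. This is an assumption-laden illustration, not the device's actual pipeline: the frame shapes, the downsample factor, and the nearest-neighbour upscale are arbitrary choices for the sketch.

```python
import numpy as np

# Sketch: cooperative rendering modeled as a downsampled frame upscaled back
# to full size, with full-resolution pixels restored inside each circular
# main rendering area (one circle per monocular mapping position).

def foveate(high_res, scale, centers, radius):
    h, w = high_res.shape
    # Cooperative rendering: subsample, then nearest-neighbour upscale back.
    low = high_res[::scale, ::scale]
    frame = np.repeat(np.repeat(low, scale, axis=0), scale, axis=1)[:h, :w]
    # Main rendering: keep full-resolution pixels inside each gaze circle.
    ys, xs = np.mgrid[0:h, 0:w]
    for cx, cy in centers:
        mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
        frame[mask] = high_res[mask]
    return frame

rng = np.random.default_rng(0)
img = rng.random((64, 128))
# Two circle centers standing in for the first and second monocular mapping positions.
out = foveate(img, scale=4, centers=[(32, 32), (96, 32)], radius=10)
```

A production renderer would of course render the high-resolution content only inside the circles rather than discard an already-rendered full frame; the sketch only shows the resulting composition.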
In the embodiment shown in FIG. 2, the monocular tracking module 102 is an infrared tracking module A or a visible light tracking module B.
If the infrared tracking module A is used, the infrared tracking module A includes an infrared light source component A-1 and an infrared tracking camera A-2, wherein:
the infrared light source component A-1 is configured to emit infrared light;
the infrared tracking camera A-2 is configured to capture the infrared light reflected by the user's monocular eyeball and to obtain, through computer vision techniques and according to the relative position of the infrared light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
In this embodiment, a single infrared tracking camera is used, built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball; the infrared light source components are arranged around the infrared tracking camera, that is, a certain number of infrared light source components are installed in the vicinity of the camera. Taking the right eye as the tracked monocular eyeball, when the infrared tracking camera tracks the position of the user's monocular eyeball, the reflected light of the infrared light source components on the eyeball forms an infrared tracking image that is captured, after which the position information on the right-eye screen corresponding to the eyeball position is obtained through computer vision techniques.
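The computer-vision step is not specified in detail by the disclosure. One common ingredient in this kind of tracker is locating the bright corneal reflections (glints) of the infrared light sources in the captured tracking image. The sketch below finds a glint centroid by simple thresholding; it is a hypothetical stand-in, and the pupil-contour fitting and per-user calibration that map eye features to a screen position are omitted.

```python
import numpy as np

# Sketch: locate the glint (corneal reflection of the infrared light source)
# in an infrared tracking image as the centroid of the brightest pixels.

def glint_centroid(ir_image, threshold=0.9):
    """Return the (x, y) centroid of pixels at or above the brightness threshold."""
    ys, xs = np.nonzero(ir_image >= threshold)
    if xs.size == 0:
        return None  # no glint visible in this frame
    return float(xs.mean()), float(ys.mean())

frame = np.zeros((120, 160))
frame[40:43, 80:83] = 1.0   # synthetic 3x3 glint on the eyeball image
# glint_centroid(frame) == (81.0, 41.0)
```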
If the visible light tracking module B is used, the visible light tracking module B includes a visible light source component B-1 and a visible light tracking camera B-2,
where the visible light source component B-1 is configured to emit visible light;
the visible light tracking camera B-2 is configured to capture the visible light reflected by the user's monocular eyeball and to obtain, through computer vision techniques and according to the relative position of the visible light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
In this embodiment, a single visible light tracking camera B-2 is provided, which may be a color camera or a monochrome grayscale camera. The camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball, and its trackable wavelength band is 400-900 nm. The visible light source components B-1 are arranged around the visible light tracking camera, that is, a certain number of visible light source components B-1 are installed in its vicinity. The visible light source component B-1 is not limited to visible light: it includes any light source component whose light can be captured by the visible light tracking camera B-2, with a wavelength band between 400 and 900 nm, and it may be a visible light source component in the traditional sense or an infrared light source component with a wavelength of 850 nm. Taking the right eye as the tracked monocular eyeball, when the visible light tracking camera tracks the position of the user's eyeball, the reflected light of the visible light source components on the eyeball forms a tracking image that is captured, after which the position information on the right-eye screen corresponding to the eyeball position is obtained through computer vision techniques.
It can be seen from the above embodiments that the foveated rendering system of a virtual reality system based on monocular eyeball tracking provided by the present disclosure first collects the first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of the user, then acquires the distance IPD corresponding to the pupil distance of the user's two eyes, and calculates, from the distance IPD and the first monocular mapping position, the second monocular mapping position, on the display screen, corresponding to the user's other monocular eyeball. Main rendering areas are then radiated out with the first and second monocular mapping positions as circle centers and a preset threshold as the radius; higher-resolution main rendering is performed in the main rendering areas, and lower-resolution cooperative rendering is performed on the areas of the display screen other than the main rendering areas, thereby presenting higher image clarity to the user's eyes and improving the user experience. Monocular eyeball tracking also solves the problem in binocular tracking that the light emitted by the two light sources easily interferes with each other and enlarges the calculation error, degrading the positional accuracy of eye tracking; moreover, the positions of both of the user's eyes are tracked with high precision in real time, which greatly satisfies the demand for gaze-based rendering.
The foveated rendering method and system of a virtual reality system based on monocular eyeball tracking proposed according to the present disclosure have been described above by way of example with reference to the accompanying drawings. However, those skilled in the art should understand that various improvements can still be made to them without departing from the content of the present disclosure. Accordingly, the scope of protection of the present disclosure should be determined by the content of the appended claims.

Claims (20)

  1. A foveated rendering method of a virtual reality system based on monocular eyeball tracking, comprising:
    collecting a first monocular mapping position, on a display screen of the virtual reality system, corresponding to one monocular eyeball of a user;
    acquiring a distance interpupillary distance (IPD) corresponding to the pupil distance of the user's two eyes, and calculating, according to the distance IPD and the first monocular mapping position, a second monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's other monocular eyeball;
    radiating out main rendering areas by respectively taking the first monocular mapping position and the second monocular mapping position as circle centers and a preset threshold as a radius;
    performing main rendering in the main rendering areas, and performing cooperative rendering on areas of the display screen of the virtual reality system other than the main rendering areas, wherein a rendering resolution of the main rendering is higher than a rendering resolution of the cooperative rendering.
  2. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 1, wherein the process of collecting the first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of the user comprises:
    emitting light toward one monocular eyeball of the user;
    capturing the light reflected by the user's monocular eyeball, and obtaining, through computer vision techniques and according to the relative position of the light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
  3. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 2, wherein
    the light is infrared light or visible light.
  4. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 3, wherein, in a case where the light is infrared light,
    an infrared light source component is configured to emit the infrared light;
    an infrared tracking camera captures the infrared light reflected by the user's monocular eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques according to the relative position of the infrared light.
  5. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 4, wherein
    the infrared tracking camera is built into an all-in-one virtual reality device at a position corresponding to the user's monocular eyeball;
    the infrared light source component is arranged around the infrared tracking camera.
  6. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 4, wherein, when the infrared tracking camera tracks the position of the user's monocular eyeball, the infrared tracking camera captures an infrared tracking image formed by the light of the infrared light source component reflected on the eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques.
  7. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 3, wherein, in a case where the light is visible light,
    a visible light source component emits the visible light;
    a visible light tracking camera captures the visible light reflected by the user's monocular eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques according to the relative position of the visible light.
  8. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 7, wherein
    the visible light tracking camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball;
    the visible light source component is arranged around the visible light tracking camera.
  9. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 7, wherein, when the visible light tracking camera tracks the position of the user's monocular eyeball, the visible light tracking camera captures a tracking image formed by the light of the visible light source component reflected on the eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques.
  10. The foveated rendering method of a virtual reality system based on monocular eyeball tracking according to claim 2, wherein acquiring the distance IPD corresponding to the pupil distance of the user's two eyes comprises:
    adapting an IPD adjustment function module built into the all-in-one virtual reality device to the user's two eyes, so as to acquire the distance IPD corresponding to the pupil distance of the user's two eyes.
  11. A foveated rendering system of a virtual reality system based on monocular eyeball tracking, configured to implement the foveated rendering method of a virtual reality system based on monocular eyeball tracking according to any one of claims 1-10, comprising a virtual reality system display screen arranged in an all-in-one virtual reality device, a monocular tracking module built into the all-in-one virtual reality device, a distance interpupillary distance (IPD) adjustment function module, a processor, and a rendering module, wherein
    the monocular tracking module is configured to collect a first monocular mapping position, on the display screen of the virtual reality system, corresponding to one monocular eyeball of a user;
    the IPD adjustment function module is configured to adapt to the user's two eyes so as to acquire a distance IPD corresponding to the pupil distance of the user's two eyes;
    the processor is configured to calculate, according to the distance IPD and the first monocular mapping position, a second monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's other monocular eyeball, and to radiate out main rendering areas by respectively taking the first monocular mapping position and the second monocular mapping position as circle centers and a preset threshold as a radius;
    the rendering module is configured to perform main rendering in the main rendering areas and to perform cooperative rendering on areas of the display screen of the virtual reality system other than the main rendering areas, wherein a rendering resolution of the main rendering is higher than a rendering resolution of the cooperative rendering.
  12. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 11, wherein
    the monocular tracking module is an infrared tracking module or a visible light tracking module.
  13. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 12, wherein
    the infrared tracking module comprises an infrared light source component and an infrared tracking camera, wherein
    the infrared light source component is configured to emit infrared light;
    the infrared tracking camera is configured to capture the infrared light reflected by the user's monocular eyeball and to obtain, through computer vision techniques and according to the relative position of the infrared light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
  14. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 13, wherein
    the infrared tracking camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball;
    the infrared light source component is arranged around the infrared tracking camera.
  15. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 13, wherein, when the infrared tracking camera tracks the position of the user's monocular eyeball, the infrared tracking camera is configured to capture an infrared tracking image formed by the light of the infrared light source component reflected on the eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques.
  16. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 12, wherein
    the visible light tracking module comprises a visible light source component and a visible light tracking camera,
    wherein the visible light source component is configured to emit visible light;
    the visible light tracking camera is configured to capture the visible light reflected by the user's monocular eyeball and to obtain, through computer vision techniques and according to the relative position of the visible light, the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball.
  17. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 16, wherein
    the visible light tracking camera is built into the all-in-one virtual reality device at a position corresponding to the user's monocular eyeball;
    the visible light source component is arranged around the visible light tracking camera.
  18. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 17, wherein
    the trackable wavelength band of the visible light tracking camera is 400-900 nm.
  19. The foveated rendering system of a virtual reality system based on monocular eyeball tracking according to claim 16, wherein, when the visible light tracking camera tracks the position of the user's monocular eyeball, the visible light tracking camera is configured to capture a tracking image formed by the light of the visible light source component reflected on the eyeball, and the first monocular mapping position, on the display screen of the virtual reality system, corresponding to the user's monocular eyeball is obtained through computer vision techniques.
  20. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1-10.
PCT/CN2021/116750 2021-03-30 2021-09-06 Foveated rendering method and system of virtual reality system based on monocular eyeball tracking WO2022205769A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21934408.2A EP4206979A4 (en) 2021-03-30 2021-09-06 METHOD AND SYSTEM FOR FOVEA DISPLAY IN A VIRTUAL REALITY BASED ON SINGLE EYEBALL TRACKING
US17/816,408 US11715176B2 (en) 2021-03-30 2022-07-30 Foveated rendering method and system of virtual reality system based on monocular eyeball tracking

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110340440.1 2021-03-30
CN202110340440.1A CN113177434A (zh) 2021-03-30 2021-03-30 基于单眼球追踪的虚拟现实系统注视渲染方法、系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/816,408 Continuation US11715176B2 (en) 2021-03-30 2022-07-30 Foveated rendering method and system of virtual reality system based on monocular eyeball tracking

Publications (1)

Publication Number Publication Date
WO2022205769A1 true WO2022205769A1 (zh) 2022-10-06

Family

ID=76922580

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/116750 WO2022205769A1 (zh) 2021-03-30 2021-09-06 基于单眼球追踪的虚拟现实系统注视渲染方法、系统

Country Status (4)

Country Link
US (1) US11715176B2 (zh)
EP (1) EP4206979A4 (zh)
CN (1) CN113177434A (zh)
WO (1) WO2022205769A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112416125A (zh) * 2020-11-17 2021-02-26 青岛小鸟看看科技有限公司 VR all-in-one headset
CN113177434A (zh) * 2021-03-30 2021-07-27 青岛小鸟看看科技有限公司 Foveated rendering method and system of virtual reality system based on monocular eyeball tracking

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106445167A (zh) * 2016-10-20 2017-02-22 网易(杭州)网络有限公司 Monocular field-of-view self-adaptive adjustment method and apparatus, and head-mounted visual device
CN106980983A (zh) * 2017-02-23 2017-07-25 阿里巴巴集团控股有限公司 Service authentication method and apparatus based on a virtual reality scenario
CN107317987A (zh) * 2017-08-14 2017-11-03 歌尔股份有限公司 Display data compression method, device and system for virtual reality
CN108604116A (zh) * 2015-09-24 2018-09-28 托比股份公司 Wearable device capable of eye tracking
CN110855972A (zh) * 2019-11-21 2020-02-28 Oppo广东移动通信有限公司 Image processing method, electronic device and storage medium
CN113177434A (zh) * 2021-03-30 2021-07-27 青岛小鸟看看科技有限公司 Foveated rendering method and system of virtual reality system based on monocular eyeball tracking

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK179537B1 (en) * 2015-02-04 2019-02-08 Itu Business Development A/S Tin traces and eye tracking methods
KR102415502B1 (ko) * 2015-08-07 2022-07-01 삼성전자주식회사 Light field rendering method and apparatus for a plurality of users
US10373592B2 (en) * 2016-08-01 2019-08-06 Facebook Technologies, Llc Adaptive parameters in image regions based on eye tracking information
KR102564479B1 (ko) * 2016-11-22 2023-08-07 삼성전자주식회사 3D rendering method and apparatus for a user's eyes
CN106686365A (zh) * 2016-12-16 2017-05-17 歌尔科技有限公司 Lens adjustment method and apparatus for a head-mounted display device, and head-mounted display device
AU2018239511A1 (en) * 2017-03-22 2019-10-17 Magic Leap, Inc. Depth based foveated rendering for display systems
CN109429060B (zh) * 2017-07-07 2020-07-28 京东方科技集团股份有限公司 Pupil distance measurement method, wearable eye device and storage medium
TWI646355B (zh) * 2017-12-26 2019-01-01 宏碁股份有限公司 Head-mounted display and adjustment method
US11435577B2 (en) * 2018-04-25 2022-09-06 Dhanushan Balachandreswaran Foveated projection system to produce ocular resolution near-eye displays
US11347056B2 (en) * 2018-08-22 2022-05-31 Microsoft Technology Licensing, Llc Foveated color correction to improve color uniformity of head-mounted displays
JP7423659B2 (ja) * 2019-05-20 2024-01-29 マジック リープ, インコーポレイテッド Systems and techniques for estimating eye pose
CN111128068B (zh) * 2019-11-28 2022-10-21 武汉天马微电子有限公司 Display device and driving display method of a display panel
CN111556305B (zh) * 2020-05-20 2022-04-15 京东方科技集团股份有限公司 Image processing method, VR device, terminal, display system and computer-readable storage medium
CN111491159A (zh) * 2020-05-29 2020-08-04 上海鸿臣互动传媒有限公司 Augmented reality display system and method
CN111988598B (zh) * 2020-09-09 2022-06-21 江苏普旭科技股份有限公司 Visual scene image generation method based on layered rendering of near and far views

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108604116A (zh) * 2015-09-24 2018-09-28 托比股份公司 Wearable device capable of eye tracking
CN106445167A (zh) * 2016-10-20 2017-02-22 网易(杭州)网络有限公司 Monocular field-of-view self-adaptive adjustment method and apparatus, and head-mounted visual device
CN106980983A (zh) * 2017-02-23 2017-07-25 阿里巴巴集团控股有限公司 Service authentication method and apparatus based on a virtual reality scenario
CN107317987A (zh) * 2017-08-14 2017-11-03 歌尔股份有限公司 Display data compression method, device and system for virtual reality
CN110855972A (zh) * 2019-11-21 2020-02-28 Oppo广东移动通信有限公司 Image processing method, electronic device and storage medium
CN113177434A (zh) * 2021-03-30 2021-07-27 青岛小鸟看看科技有限公司 Foveated rendering method and system of virtual reality system based on monocular eyeball tracking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4206979A4 *

Also Published As

Publication number Publication date
US11715176B2 (en) 2023-08-01
CN113177434A (zh) 2021-07-27
EP4206979A1 (en) 2023-07-05
US20220366529A1 (en) 2022-11-17
EP4206979A4 (en) 2024-01-24

Similar Documents

Publication Publication Date Title
US11755106B1 (en) Glint-assisted gaze tracker
Plopski et al. Corneal-imaging calibration for optical see-through head-mounted displays
WO2022205769A1 (zh) Foveated rendering method and system of virtual reality system based on monocular eyeball tracking
CN110708533B (zh) 基于增强现实的视觉辅助方法及智能穿戴设备
US20120050493A1 (en) Geometric calibration of head-worn multi-camera eye tracking system
US20140375680A1 (en) Tracking head movement when wearing mobile device
TWI507729B (zh) 頭戴式視覺輔助系統及其成像方法
US20140375531A1 (en) Method of roviding to the user an image from the screen of the smartphome or tablet at a wide angle of view, and a method of providing to the user 3d sound in virtual reality
US11838494B2 (en) Image processing method, VR device, terminal, display system, and non-transitory computer-readable storage medium
US20230255476A1 (en) Methods, devices and systems enabling determination of eye state variables
CN108428375A (zh) 一种基于增强现实的教学辅助方法及设备
US11983310B2 (en) Gaze tracking apparatus and systems
JP2021515278A (ja) ホログラフィック実空間屈折システム
US11640201B2 (en) Virtual reality-based eyeball tracking method and system
US20220365342A1 (en) Eyeball Tracking System and Method based on Light Field Sensing
CN108446011A (zh) 一种基于增强现实的医疗辅助方法及设备
Lander et al. hEYEbrid: A hybrid approach for mobile calibration-free gaze estimation
US11743447B2 (en) Gaze tracking apparatus and systems
US11747897B2 (en) Data processing apparatus and method of using gaze data to generate images
US20200159027A1 (en) Head-mounted display with unobstructed peripheral viewing
WO2019116675A1 (ja) Information processing apparatus, information processing method, and program
TW202017368A (zh) Smart glasses, system and method of use thereof
WO2023102500A1 (en) Methods for controlling performance of extended reality display systems
CN114578940A Control method and apparatus, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21934408

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021934408

Country of ref document: EP

Effective date: 20230330

NENP Non-entry into the national phase

Ref country code: DE