WO2018165906A1 - Head-mounted display device and display method thereof - Google Patents

Head-mounted display device and display method thereof

Info

Publication number
WO2018165906A1
WO2018165906A1 (PCT/CN2017/076766, CN2017076766W)
Authority
WO
WIPO (PCT)
Prior art keywords
eye
unit
image
display device
mounted display
Prior art date
Application number
PCT/CN2017/076766
Other languages
English (en)
French (fr)
Inventor
廖建强
Original Assignee
廖建强
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 廖建强 filed Critical 廖建强
Priority to PCT/CN2017/076766 priority Critical patent/WO2018165906A1/zh
Publication of WO2018165906A1 publication Critical patent/WO2018165906A1/zh


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays

Definitions

  • the present invention relates to the field of virtual scene display technologies, and in particular, to a head mounted display device and a display method thereof.
  • the head mounted display device creates parallax between the user's left and right eyes by providing a different image to each eye; because the display is kept very close to the user's eyes, it is easy to supply each eye with its own image, and the head mounted display device (HMD) is therefore often used as the display of an embedded game console.
  • unlike a conventional goggle-type display that completely blocks out the surrounding environment, the head mounted display device adopts a see-through display mode, so the user can observe not only the image on the display of the head mounted display device but also the actual surroundings, which provides the user with an augmented-reality style of virtual display.
  • since the head mounted display device is fixed to the user's head, the head may tilt or vibrate when the user rides a vehicle or walks; a deviation then arises between the optical axis along which the user's eyes view space and the optical axis of the display image. A person, who senses gravitational acceleration through the vestibular organs, becomes dizzy because of this deviation, and the quality of the stereoscopic viewing experience is greatly reduced.
  • the technical problem to be solved by the present invention is that, with a prior-art head mounted display device, a deviation exists between the optical axis along which the user's eyes view space and the optical axis of the display image whenever the user's head tilts or vibrates; this causes an abnormal stereoscopic field of view, viewing vertigo, and a reduced sense of realistic depth for the user, which degrades the picture quality of the head mounted display device.
  • an embodiment of the present invention provides a head mounted display device including a display unit, a control unit, an image processing unit, and a tracking unit, wherein:
  • the display unit is configured to provide a stereoscopic image
  • the tracking unit is configured to detect an initial position and real-time position data of the eye
  • the control unit is configured to obtain a position offset of the eye from the initial position and real-time position data, and to generate motion amount data for the stereoscopic image based on that position offset;
  • the image processing unit performs motion processing on the stereoscopic image according to the motion amount data, so that an optical axis of the stereoscopic image after the motion processing is consistent with an optical axis of the eye;
  • the display unit is a see-through display unit, and is capable of generating a left eye image and a right eye image to achieve a parallax between the left eye and the right eye;
  • the tracking unit is an imaging unit or an infrared detecting unit, and the infrared detecting unit detects the eye position by detecting infrared light reflected from the eye;
  • the control unit calculates the position offset from the difference between the vector directions of the initial position data and the real-time position data (a minimal sketch of this calculation follows this group of features);
  • the image processing unit performs a motion processing of shifting or rotating the stereoscopic image.
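  • as a minimal illustration (Python; the vector representation, the helper name, and the numeric values are assumptions, not part of the patent), the position offset can be expressed as the difference between the reference and real-time optical-axis directions, together with the angle between them:

```python
import numpy as np

def position_offset(reference_axis, realtime_axis):
    """Deviation between the reference optical-axis direction (from the
    initial eye position) and the real-time optical-axis direction.
    Returns the deviation vector and the angle between the two axes."""
    ref = np.array(reference_axis, dtype=float)
    cur = np.array(realtime_axis, dtype=float)
    ref = ref / np.linalg.norm(ref)
    cur = cur / np.linalg.norm(cur)
    deviation = cur - ref                       # difference of the vector directions
    angle_deg = np.degrees(np.arccos(np.clip(np.dot(ref, cur), -1.0, 1.0)))
    return deviation, angle_deg

# example: the head tilts slightly, so the live axis drifts from the reference axis
print(position_offset((0.0, 0.0, 1.0), (0.0, 0.05, 1.0)))
```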
  • the embodiment of the invention further provides a display method of a head-mounted display device, which comprises:
  • S1 acquiring initial position data of the eye relative to the head mounted display device, and generating a corresponding reference eye position
  • S2 acquiring real-time position data of the eye during viewing of the image, and generating a corresponding real-time eye position
  • S3 acquiring a deviation vector between the real-time eye position and the reference eye position, and generating motion amount data about the image based on the deviation vector;
  • S4 performing corresponding motion processing on the image based on the motion amount data, so that the image is displayed at a new position;
  • the initial position data and the real-time position data are acquired by capturing an image of the eye that carries depth-of-field information or by performing infrared reflection tracking on the eye;
  • in step S3 the deviation vector is obtained from the respective optical-axis vectors of the eye at the real-time eye position and at the reference eye position;
  • the motion amount data includes translation amount data or rotation amount data
  • the motion processing is achieved by applying a two-dimensional affine transformation to the image on the eye's viewing plane, based on the motion amount data; a sketch of the overall S1-S4 loop follows.
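  • a compact sketch of that S1-S4 loop (Python; the three callables are assumed hooks for illustration, not interfaces defined by the patent):

```python
import time

def stabilization_loop(get_eye_axis, compute_motion, apply_motion,
                       period_s=0.02, steps=100):
    """Illustrative S1-S4 control loop.  get_eye_axis() returns the current
    optical-axis direction, compute_motion(deviation) maps the deviation to
    (translation, rotation), and apply_motion(...) redraws the stereo image."""
    reference_axis = get_eye_axis()                    # S1: reference eye position
    for _ in range(steps):
        realtime_axis = get_eye_axis()                 # S2: real-time eye position
        deviation = [c - r for c, r in zip(realtime_axis, reference_axis)]  # S3
        translation, rotation = compute_motion(deviation)
        apply_motion(translation, rotation)            # S4: display at the new position
        time.sleep(period_s)
```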
  • through the above technical solution, the present invention provides a head mounted display device and a display method thereof that detect the offset of the user's head or eyes while the user views a stereoscopic image, and translate or rotate the displayed stereoscopic image according to that offset, so that the optical axis along which the user's eyes view space always coincides with the optical axis of the display image, thereby avoiding distortion of the user's stereoscopic field of view and the occurrence of viewing fatigue and vertigo.
  • FIG. 1 is a schematic structural diagram of a head-mounted display device according to an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of a display method of a head mounted display device according to an embodiment of the present invention.
  • the head-mounted display device includes a display unit, a control unit, an image processing unit, a storage unit, and a tracking unit.
  • the display unit includes a left-eye display sub-unit and a right-eye display sub-unit, which respectively provide a left-eye sub-image and a right-eye sub-image to the user, thereby realizing parallax between the left and right eyes.
  • the display unit may be a shutter display unit, a polarized display unit or a wavelength division display unit.
  • the control unit inputs an image signal to the display unit, and the image signal may carry a specific clock signal.
  • the display unit can be a see-through display unit, such as a see-through glasses display
  • the control unit controls the light emitted by the image source so that it is transmitted to the see-through display unit through a light guide member; the see-through display unit refracts and diffuses the image light through its internal microstructure, so that the image is presented in front of the user while the user can still see the surrounding real environment, which enhances the user's sense of reality when viewing stereoscopic images.
  • the image source can be a liquid crystal microdisplay, a DMC or an LED microdisplay.
  • the light guiding member may be an optical fiber or a light guide tube or the like.
  • the tracking unit is configured to detect position information of the user's head or eyes; since the relative position between the user's eyes and head is fixed, the tracking unit can determine tilt or vibration of the user's head by tracking the position of the user's eyes. The operation of the head mounted display device is therefore described below only for the case in which the tracking unit detects the position of the user's eyes.
  • the tracking unit may be, but is not limited to, an imaging unit or an infrared detecting unit; to avoid occluding the user's line of sight, it is generally disposed at the lower edge of the display unit.
  • the tracking unit may take a feature such as the pupil or the inner or outer corner of the eye as the detection target; when the tracking unit is an imaging unit, the imaging unit directly captures an image of the feature with depth-of-field information, and the imaging unit is preferably a binocular imaging unit.
  • when the tracking unit is an infrared detecting unit, it includes an infrared emitting unit and an infrared receiving unit, both disposed on the same side of the eye; the infrared emitting unit emits infrared light toward the feature;
  • the feature reflects the infrared light onto the infrared receiving unit, and the infrared receiving unit receives the reflected infrared light and derives an offset of the infrared light;
  • the infrared receiving unit can be a four-quadrant sensor, an infrared CCD or an infrared CMOS.
  • the offset is derived based on the displacement of the infrared light on the receiving surface of the infrared receiving unit.
  • the control unit receives the infrared offset and calculates the current position of the eye, thereby locating the eye;
  • the control unit acquires the infrared offset at a specific time interval, so that the eye positioning information is updated in real time; a sketch of this spot-to-offset calculation is given below.
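  • a minimal sketch of how a four-quadrant receiver could turn the reflected spot into an eye offset (Python; the quadrant labelling and the linear sum/difference read-out are common conventions assumed here for illustration, not details given by the patent):

```python
def spot_position(q_a, q_b, q_c, q_d):
    """Normalized (x, y) position of the reflected infrared spot on a
    four-quadrant sensor, from the four photocurrents
    (A = upper-left, B = upper-right, C = lower-left, D = lower-right)."""
    total = q_a + q_b + q_c + q_d
    if total == 0:
        raise ValueError("no infrared signal received")
    x = ((q_b + q_d) - (q_a + q_c)) / total    # right half minus left half
    y = ((q_a + q_b) - (q_c + q_d)) / total    # upper half minus lower half
    return x, y

def eye_offset(reference_spot, current_spot):
    """Displacement of the spot on the receiving surface, which the control
    unit reads at each polling interval as the offset of the eye."""
    return (current_spot[0] - reference_spot[0],
            current_spot[1] - reference_spot[1])

# usage: compare the spot at wearing time with the spot read one interval later
ref = spot_position(1.0, 1.0, 1.0, 1.0)         # centered spot
now = spot_position(0.8, 1.2, 0.9, 1.1)         # spot drifted to the right
print(eye_offset(ref, now))
```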
  • the storage unit is configured to store the data detected or calculated by the tracking unit and the control unit. For example, after the user puts on the head mounted display device, the tracking unit detects the position of the user's eyes relative to the display device and obtains initial position data of the eyes; the storage unit stores this initial position data and transmits it to the control unit; the control unit applies data processing such as triangulation to the initial position data, obtains the vector direction of the optical axis of the eye at this moment, and takes that vector direction as the reference eye position; the storage unit stores the reference eye position data for the subsequent comparison.
  • while the user uses the head mounted display device, the tracking unit detects changes in the position of the user's eyes in real time, and the storage unit stores the position change data and transmits it to the control unit;
  • the control unit converts the position change data into the vector direction of the corresponding eye optical axis and takes it as the real-time eye position; the control unit then performs a vector operation on the real-time eye position and the reference eye position to obtain the deviation vector between the two, which is stored in the storage unit.
  • the control unit calculates from the deviation vector the motion amount for the displayed stereoscopic image, such as a translation amount or a rotation amount, where the translation amount may be the movement of the stereoscopic image in the horizontal or vertical direction on the eye's viewing plane and the rotation amount may be the angle through which the stereoscopic image rotates about its geometric center (a sketch of this mapping follows).
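  • one way to realize that mapping is sketched below (Python; the viewing-plane distance, the small-angle projection, and the choice to read the inter-axis angle as the image's roll are illustrative assumptions, since the text does not fix the mapping):

```python
import numpy as np

def motion_amounts(deviation, plane_distance=1.0):
    """Map the deviation vector between the reference and real-time
    optical-axis directions (both unit vectors) to the two motion amounts
    described in the text: a horizontal/vertical shift of the stereo image
    on the eye's viewing plane and a rotation about its geometric center."""
    dev = np.array(deviation, dtype=float)
    # horizontal / vertical shift on a viewing plane at distance `plane_distance`
    # (small-angle approximation: lateral change of the axis scales with distance)
    dx = plane_distance * dev[0]
    dy = plane_distance * dev[1]
    # the chord length between two unit axis directions gives the angle between
    # them; treating that angle as the image's rotation amount is an assumption
    chord = np.linalg.norm(dev)
    rotation_deg = np.degrees(2.0 * np.arcsin(np.clip(chord / 2.0, 0.0, 1.0)))
    return (dx, dy), rotation_deg

# usage with the deviation produced by position_offset() above
(dx, dy), rot = motion_amounts([0.0, 0.05, 0.0], plane_distance=1.5)
print(dx, dy, rot)
```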
  • the amount of motion of the stereoscopic image is stored in the storage unit and sent to the image processing unit.
  • the image processing unit is configured to perform motion processing on the displayed stereoscopic image: after receiving the motion amount data, it applies the corresponding translation or rotation to the stereoscopic image by a two-dimensional affine transformation. First, the image processing unit acquires the coordinates A of one or more feature points in the pre-motion stereoscopic image; the translation amount is converted into a 2x1 matrix T1 and the rotation amount into a 2x2 matrix T2 in the eye's viewing space, so that the post-motion feature-point coordinates are B = T1*A when the image is only translated, B = T2*A when it is only rotated, and B = T1*T2*A when it is translated and rotated simultaneously (a sketch in standard affine form follows).
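  • a minimal sketch of that motion processing (Python): the text writes the combinations as matrix products, and the sketch below uses the equivalent standard affine form, rotating the feature points about their geometric center with T2 and then adding the translation T1, which is one reasonable reading rather than the only one:

```python
import numpy as np

def transform_feature_points(points_a, translation, rotation_deg):
    """Apply the translation amount (2x1 matrix T1) and the rotation amount
    (2x2 matrix T2) to the feature-point coordinates A of the pre-motion
    stereoscopic image, rotating about the image's geometric center."""
    pts = np.asarray(points_a, dtype=float)            # A: shape (N, 2)
    t1 = np.asarray(translation, dtype=float)          # 2x1 translation amount
    theta = np.radians(rotation_deg)
    t2 = np.array([[np.cos(theta), -np.sin(theta)],    # 2x2 rotation amount
                   [np.sin(theta),  np.cos(theta)]])
    center = pts.mean(axis=0)                          # geometric center as rotation point
    rotated = (pts - center) @ t2.T + center
    return rotated + t1                                # B: coordinates after motion

# usage: shift the image 5 px right and 2 px up, and roll it by 1.5 degrees
corners = [(0, 0), (100, 0), (100, 60), (0, 60)]
print(transform_feature_points(corners, translation=(5, 2), rotation_deg=1.5))
```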
  • the stereoscopic image processed by the above motion keeps its optical axis consistent with the optical axis of the user's eye at all times, so that the stereoscopic image viewed by the eye matches the actual environment viewed by the eye.
  • the head mounted display device tracks the position of the user's eye in real time and obtains the offset of the eye position relative to the reference eye position during the change; the offset is calculated from the difference between the vector direction of the reference eye position and the vector direction of the changed eye position, and the translation or rotation amount of the stereoscopic image is derived from that offset so that the optical axis of the stereoscopic image stays aligned with the optical axis of the user's eye, which avoids distortion of the viewing field and viewing vertigo and enhances the realism of the head mounted display device.
  • the display method includes:
  • S1 Acquire initial position data of the eye relative to the head mounted display device and generate a corresponding reference eye position.
  • specifically, the control unit in the head mounted display device issues an instruction to the tracking unit, and based on that instruction the tracking unit acquires the initial position of the eyes relative to the head mounted display device while the user is wearing it normally.
  • the tracking unit may be an imaging unit or an infrared detecting unit, and may take a feature such as the pupil or the inner or outer corner of the eye as the detection target; when the tracking unit is an imaging unit, the imaging unit directly captures an image of the feature with depth-of-field information, and is preferably a binocular imaging unit;
  • when the tracking unit is an infrared detecting unit, it includes an infrared emitting unit and an infrared receiving unit, both disposed on the same side of the eye; the infrared emitting unit emits infrared light toward the feature, the feature reflects the infrared light onto the infrared receiving unit, and the infrared receiving unit derives an offset of the infrared light from the reflected light it receives;
  • the infrared receiving unit may be a four-quadrant sensor, an infrared CCD, or an infrared CMOS, and the offset is derived from the displacement of the infrared light on the receiving surface of the infrared receiving unit.
  • the control unit processes the initial position data of the eye, for example by triangulation, to obtain the vector direction corresponding to the optical axis of the eye at this moment, and takes that vector direction as the reference eye position; the storage unit of the head mounted display device stores the reference eye position data for further processing.
  • S2 Acquire real-time position data of the eye during viewing of the image, and generate a corresponding real-time eye position.
  • because tilting or vibration of the user's head changes the eye position, the tracking unit acquires the position data of the eye in real time while the user views the stereoscopic image.
  • the control unit converts the real-time position data of the eye into the vector direction of the corresponding eye optical axis and takes it as the real-time eye position.
  • the acquisition of the real-time position of the eye is the same as the acquisition of the initial position data in the above step S1.
  • S3 Obtain a deviation vector between the real-time eye position and the reference eye position, and generate motion amount data about the image based on the deviation vector.
  • since the real-time eye position and the reference eye position are both vector directions of the eye's optical axis, the control unit obtains the deviation vector between the two by a corresponding vector operation.
  • the control unit calculates from the deviation vector the motion amount data, such as a translation amount or a rotation amount, for the stereoscopic image displayed by the head mounted display device, where the translation amount may be the movement of the stereoscopic image in the horizontal or vertical direction on the eye's viewing plane and the rotation amount may be the angle through which the stereoscopic image rotates about its geometric center.
  • S4 Perform corresponding motion processing on the image based on the motion amount data, so that the image is displayed at a new position.
  • after receiving the motion amount data, the image processing unit of the head mounted display device applies the corresponding translation or rotation to the stereoscopic image.
  • the image processing unit translates or rotates the image by a two-dimensional affine transformation according to the translation or rotation amount it received, and the motion-processed stereoscopic image is displayed at a new position, ensuring that the optical axis of the stereoscopic image at the new position always coincides with the optical axis of the user's eye, so that the stereoscopic image viewed by the eye matches the actual environment viewed by the eye.
  • the display method of the head mounted display device tracks the user's eye position in real time, generates an offset with respect to the eye position after movement, and guides the stereoscopic image through the corresponding translation or rotation according to that offset, so that the optical axis of the stereoscopic image stays aligned with the optical axis of the user's eye, thereby reducing the vertigo felt when the eye views the image.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A head-mounted display device and a display method thereof. The display device and the display method detect in real time the offset of the eye position relative to a reference position and, according to that offset, translate or rotate the displayed stereoscopic image, so that the optical axis of the user's eyes and the optical axis of the stereoscopic image always remain consistent. This avoids distortion of the user's viewing field and viewing vertigo, and thereby enhances the sense of reality when viewing with the head-mounted display device.

Description

Head-Mounted Display Device and Display Method Thereof

Technical Field
The present invention relates to the field of virtual scene display technologies, and in particular to a head-mounted display device and a display method thereof.
Background Art
A head-mounted display device creates parallax between the user's left and right eyes by providing each eye with a different image. Because the display is kept very close to the user's eyes, it is easier to supply each eye with its own image, so the head-mounted display device (HMD) is often used as the display of an embedded game console. In addition, unlike a conventional goggle-type display that completely blocks out the surrounding environment, the head-mounted display device adopts a see-through display mode: the user can observe not only the image on the display of the head-mounted display device but also the actual surroundings, which gives the user an augmented-reality style of virtual display.
Because the head-mounted display device is fixed to the user's head, the user's head may tilt or vibrate when the device is used while riding a vehicle or walking. A deviation then arises between the optical axis along which the user's eyes view space and the optical axis of the display image. This deviation makes people, who sense gravitational acceleration through the vestibular organs, feel dizzy, and it also greatly degrades the user's experience when viewing stereoscopic images.
Summary of the Invention
In view of the above defects of the prior art, the technical problem to be solved by the present invention is that, with a prior-art head-mounted display device, a deviation exists between the optical axis along which the user's eyes view space and the optical axis of the display image whenever the user's head tilts or vibrates, causing an abnormal stereoscopic field of view, viewing vertigo, and a reduced sense of realistic depth for the user, which degrades the picture quality of the head-mounted display device.
To solve the above technical problem, an embodiment of the present invention provides a head-mounted display device comprising a display unit, a control unit, an image processing unit, and a tracking unit, wherein:
the display unit is configured to provide a stereoscopic image;
the tracking unit is configured to detect initial position and real-time position data of the eye;
the control unit is configured to derive a position offset of the eye from the initial position and real-time position data, and to generate motion amount data for the stereoscopic image based on the position offset;
the image processing unit performs motion processing on the stereoscopic image according to the motion amount data, so that the optical axis of the motion-processed stereoscopic image remains consistent with the optical axis of the eye;
further, the display unit is a see-through display unit and is capable of generating a left-eye image and a right-eye image to achieve parallax between the left eye and the right eye;
further, the tracking unit is an imaging unit or an infrared detecting unit, and the infrared detecting unit detects the eye position by detecting infrared light reflected from the eye;
further, the control unit calculates the position offset from the difference between the vector directions of the initial position data and the real-time position data;
further, the image processing unit performs translation or rotation motion processing on the stereoscopic image.
Correspondingly, an embodiment of the present invention further provides a display method of a head-mounted display device, comprising:
S1: acquiring initial position data of the eye relative to the head-mounted display device, and generating a corresponding reference eye position;
S2: acquiring real-time position data of the eye while the image is being viewed, and generating a corresponding real-time eye position;
S3: acquiring a deviation vector between the real-time eye position and the reference eye position, and generating motion amount data about the image based on the deviation vector;
S4: performing corresponding motion processing on the image based on the motion amount data, so that the image is displayed at a new position;
further, in steps S1 and S2, the initial position data and the real-time position data are acquired by capturing an image of the eye with depth-of-field information or by performing infrared reflection tracking on the eye;
further, in step S3, the deviation vector is obtained from the respective optical-axis vectors of the eye at the real-time eye position and at the reference eye position;
further, the motion amount data includes translation amount data or rotation amount data;
further, in step S4, the motion processing is achieved by performing a two-dimensional affine transformation of the image on the eye's viewing plane based on the motion amount data.
Through the above technical solution, the present invention provides a head-mounted display device and a display method thereof that can detect the offset of the user's head or eyes while the user views a stereoscopic image and, according to that offset, translate or rotate the stereoscopic image shown on the display, so that the optical axis along which the user's eyes view space always coincides with the optical axis of the display image, thereby avoiding distortion of the user's stereoscopic field of view and the occurrence of viewing fatigue and vertigo.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a head-mounted display device according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a display method of a head-mounted display device according to an embodiment of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to FIG. 1, a schematic structural diagram of a head-mounted display device according to an embodiment of the present invention, in this embodiment the head-mounted display device includes a display unit, a control unit, an image processing unit, a storage unit, and a tracking unit. Specifically, the display unit includes a left-eye display sub-unit and a right-eye display sub-unit, which provide the user with a left-eye sub-image and a right-eye sub-image respectively, thereby achieving parallax between the left and right eyes. The display unit may be a shutter display unit, a polarized display unit, or a wavelength-division display unit. In actual operation, the control unit inputs an image signal to the display unit, and the image signal may carry a specific clock signal, which allows the left-eye and right-eye display sub-units to display the corresponding sub-images according to that clock signal. The display unit may be a see-through display unit, such as a see-through glasses display. The control unit controls the light emitted by the image source so that it is transmitted to the see-through display unit through a light guide member, and the see-through display unit refracts and diffuses the image light through its internal microstructure, so that the image is presented in front of the user while the user can still see the surrounding real environment, which enhances the user's sense of reality when viewing stereoscopic images. The image source may be a liquid crystal microdisplay, a DMC, or an LED microdisplay. The light guide member may be an optical fiber, a light pipe, or the like.
The tracking unit is configured to detect position information of the user's head or eyes. Since the relative position between the user's eyes and head is fixed, the tracking unit can determine tilt or vibration of the user's head by tracking the position of the user's eyes; the operation of the head-mounted display device is therefore described below only for the case in which the tracking unit detects the position of the user's eyes. The tracking unit may be, but is not limited to, an imaging unit or an infrared detecting unit, and is generally disposed at the lower edge of the display unit to avoid occluding the user's line of sight. The tracking unit may take a feature such as the pupil or the inner or outer corner of the eye as the detection target. When the tracking unit is an imaging unit, the imaging unit directly captures an image of the feature with depth-of-field information, and the imaging unit is preferably a binocular imaging unit. When the tracking unit is an infrared detecting unit, the infrared detecting unit includes an infrared emitting unit and an infrared receiving unit, both disposed on the same side of the eye; the infrared emitting unit emits infrared light toward the feature, the feature reflects the infrared light onto the infrared receiving unit, and the infrared receiving unit receives the reflected infrared light and derives an offset of the infrared light. The infrared receiving unit may be a four-quadrant sensor, an infrared CCD, or an infrared CMOS, and the offset is derived from the displacement of the infrared light on the receiving surface of the infrared receiving unit. In practice, after receiving the infrared offset the control unit calculates the current position of the eye and thereby locates the eye, and the control unit acquires the infrared offset at a specific time interval so as to update the eye positioning information in real time.
The storage unit is configured to store the data detected or calculated by the tracking unit and the control unit. For example, after the user puts on the head-mounted display device, the tracking unit detects the position of the user's eyes relative to the display device and obtains initial position data of the eyes; the storage unit stores this initial position data and transmits it to the control unit; the control unit applies data processing such as triangulation to the initial position data, obtains the vector direction of the optical axis of the eye at this moment, and takes that vector direction as the reference eye position, while the storage unit stores the reference eye position data for the subsequent comparison. While the user uses the head-mounted display device, the tracking unit detects changes in the position of the user's eyes in real time, and the storage unit stores the position change data and transmits it to the control unit; the control unit converts the position change data into the vector direction of the corresponding eye optical axis and takes it as the real-time eye position; the control unit then performs a vector operation on the real-time eye position and the reference eye position to obtain the deviation vector between the two, which is stored in the storage unit. The control unit calculates from the deviation vector the motion amount for the displayed stereoscopic image, such as a translation amount or a rotation amount, where the translation amount may be the movement of the stereoscopic image in the horizontal or vertical direction on the eye's viewing plane and the rotation amount may be the angle through which the stereoscopic image rotates about its geometric center. The motion amount of the stereoscopic image is stored in the storage unit and sent to the image processing unit.
The image processing unit is configured to perform motion processing on the displayed stereoscopic image. Specifically, after the image processing unit receives the motion amount data, it applies the corresponding translation or rotation to the stereoscopic image. The image processing unit translates or rotates the image by a two-dimensional affine transformation according to the translation or rotation amount data it received. First, the image processing unit acquires the coordinates A of one or more feature points in the pre-motion stereoscopic image and converts the translation amount or rotation amount into the corresponding transformation matrix in the eye's viewing space, where the transformation matrix corresponding to the translation amount is a 2x1 matrix T1 and the transformation matrix corresponding to the rotation amount is a 2x2 matrix T2. Therefore, when the stereoscopic image is only translated, the coordinates of one or more feature points in the post-motion stereoscopic image are B = T1*A; when the stereoscopic image is only rotated, the coordinates of one or more feature points in the post-motion stereoscopic image are B = T2*A; and when the stereoscopic image is translated and rotated simultaneously, the coordinates of one or more feature points in the post-motion stereoscopic image are B = T1*T2*A. The stereoscopic image processed by the above motion keeps its optical axis consistent with the optical axis of the user's eye at all times, so that the stereoscopic image viewed by the eye matches the actual environment viewed by the eye.
As can be seen from the above embodiment, the head-mounted display device tracks the user's eye position in real time and obtains the offset of the eye position relative to the reference eye position during the change; the offset is calculated from the difference between the vector direction of the reference eye position and the vector direction of the changed eye position. From that offset, the translation amount or rotation amount for the stereoscopic image is calculated, and the stereoscopic image is translated or rotated accordingly, so that the optical axis of the stereoscopic image finally remains consistent with the optical axis of the user's eye, thereby avoiding distortion of the user's viewing field and viewing vertigo and enhancing the sense of reality of the head-mounted display device.
Referring to FIG. 2, a schematic flowchart of a display method of a head-mounted display device according to an embodiment of the present invention, in this embodiment the display method includes:
S1: Acquire initial position data of the eye relative to the head-mounted display device, and generate a corresponding reference eye position.
Specifically, the control unit in the head-mounted display device issues an instruction to the tracking unit, and based on that instruction the tracking unit acquires the initial position of the eyes relative to the head-mounted display device while the user is wearing it normally. The tracking unit may be an imaging unit or an infrared detecting unit, and may take a feature such as the pupil or the inner or outer corner of the eye as the detection target. When the tracking unit is an imaging unit, the imaging unit directly captures an image of the feature with depth-of-field information, and the imaging unit is preferably a binocular imaging unit. When the tracking unit is an infrared detecting unit, the infrared detecting unit includes an infrared emitting unit and an infrared receiving unit, both disposed on the same side of the eye; the infrared emitting unit emits infrared light toward the feature, the feature reflects the infrared light onto the infrared receiving unit, and the infrared receiving unit receives the reflected infrared light and derives an offset of the infrared light. The infrared receiving unit may be a four-quadrant sensor, an infrared CCD, or an infrared CMOS, and the offset is derived from the displacement of the infrared light on the receiving surface of the infrared receiving unit.
The control unit processes the initial position data of the eye, for example by triangulation, to obtain the vector direction corresponding to the optical axis of the eye at this moment, and takes that vector direction as the reference eye position; the storage unit of the head-mounted display device stores the reference eye position data for subsequent processing.
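The text does not spell out the triangulation itself. As a minimal illustration only, assuming a rectified binocular imaging unit with a known focal length and baseline (assumptions not stated in the patent), the depth and 3D position of a tracked eye feature such as the pupil could be recovered as follows:

```python
def triangulate_depth(x_left_px, x_right_px, focal_length_px, baseline_m):
    """Depth of the tracked eye feature from a rectified binocular pair,
    using the standard disparity relation Z = f * B / d."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("the feature must have positive disparity")
    return focal_length_px * baseline_m / disparity

def pupil_position_3d(x_left_px, y_left_px, x_right_px,
                      cx_px, cy_px, focal_length_px, baseline_m):
    """Back-project the pupil into the left camera's coordinates (meters);
    the optical-axis direction can then be derived from this point."""
    z = triangulate_depth(x_left_px, x_right_px, focal_length_px, baseline_m)
    x = (x_left_px - cx_px) * z / focal_length_px
    y = (y_left_px - cy_px) * z / focal_length_px
    return (x, y, z)

# usage with assumed camera parameters (800 px focal length, 1 cm baseline)
print(pupil_position_3d(500.0, 250.0, 233.0, 320.0, 240.0, 800.0, 0.01))
```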
S2: Acquire real-time position data of the eye while the image is being viewed, and generate a corresponding real-time eye position.
Specifically, because tilting or vibration of the user's head changes the eye position, the tracking unit acquires the position data of the eye in real time while the user views the stereoscopic image. The control unit converts the real-time position data of the eye into the vector direction of the corresponding eye optical axis and takes it as the real-time eye position. In practice, the real-time position of the eye is acquired in the same way as the initial position data in step S1 above.
S3: Obtain the deviation vector between the real-time eye position and the reference eye position, and generate motion amount data about the image based on the deviation vector.
Specifically, since the real-time eye position and the reference eye position are both vector directions of the eye's optical axis, the control unit obtains the deviation vector between the two by a corresponding vector operation. The control unit calculates from the deviation vector the motion amount data, such as a translation amount or a rotation amount, for the stereoscopic image displayed by the head-mounted display device, where the translation amount may be the movement of the stereoscopic image in the horizontal or vertical direction on the eye's viewing plane and the rotation amount may be the angle through which the stereoscopic image rotates about its geometric center.
S4: Based on the motion amount data, perform the corresponding motion processing on the image, so that the image is displayed at a new position.
Specifically, after receiving the motion amount data, the image processing unit of the head-mounted display device applies the corresponding translation or rotation to the stereoscopic image. The image processing unit translates or rotates the image by a two-dimensional affine transformation according to the translation or rotation amount data it received; the motion-processed stereoscopic image can then be displayed at the new position, ensuring that the optical axis of the stereoscopic image at the new position always stays consistent with the optical axis of the user's eye, so that the stereoscopic image viewed by the eye matches the actual environment viewed by the eye.
As can be seen from the above embodiment, the display method of the head-mounted display device tracks the user's eye position in real time, generates an offset with respect to the eye position after movement, and guides the stereoscopic image through the corresponding translation or rotation according to that offset, so that the optical axis of the stereoscopic image stays consistent with the optical axis of the user's eye, thereby reducing the vertigo felt when the eye views the image.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by instructing the relevant hardware through a computer program; the program may be stored in a computer-readable storage medium, and when executed, the program may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
What is disclosed above is only a preferred embodiment of the present invention and certainly cannot be used to limit the scope of the rights of the present invention. A person of ordinary skill in the art can understand all or part of the processes for implementing the above embodiment, and equivalent changes made according to the claims of the present invention still fall within the scope covered by the invention.

Claims (10)

  1. A head-mounted display device, the head-mounted display device comprising a display unit, a control unit, an image processing unit, and a tracking unit, wherein:
    the display unit is configured to provide a stereoscopic image;
    the tracking unit is configured to detect initial position and real-time position data of the eye;
    the control unit is configured to derive a position offset of the eye from the initial position and real-time position data, and to generate motion amount data for the stereoscopic image based on the position offset;
    the image processing unit performs motion processing on the stereoscopic image according to the motion amount data, so that the optical axis of the motion-processed stereoscopic image remains consistent with the optical axis of the eye.
  2. The head-mounted display device according to claim 1, wherein the display unit is a see-through display unit and is capable of generating a left-eye image and a right-eye image to achieve parallax between the left eye and the right eye.
  3. The head-mounted display device according to claim 1, wherein the tracking unit is an imaging unit or an infrared detecting unit, and the infrared detecting unit detects the eye position by detecting infrared light reflected from the eye.
  4. The head-mounted display device according to claim 1, wherein the control unit calculates the position offset from the difference between the vector directions of the initial position data and the real-time position data.
  5. The head-mounted display device according to claim 1, wherein the image processing unit performs translation or rotation motion processing on the stereoscopic image.
  6. A display method of a head-mounted display device, comprising:
    S1: acquiring initial position data of the eye relative to the head-mounted display device, and generating a corresponding reference eye position;
    S2: acquiring real-time position data of the eye while the image is being viewed, and generating a corresponding real-time eye position;
    S3: acquiring a deviation vector between the real-time eye position and the reference eye position, and generating motion amount data about the image based on the deviation vector;
    S4: performing corresponding motion processing on the image based on the motion amount data, so that the image is displayed at a new position.
  7. The display method of a head-mounted display device according to claim 6, wherein, in steps S1 and S2, the initial position data and the real-time position data are acquired by capturing an image of the eye with depth-of-field information or by performing infrared reflection tracking on the eye.
  8. The display method of a head-mounted display device according to claim 6, wherein, in step S3, the deviation vector is obtained from the respective optical-axis vectors of the eye at the real-time eye position and at the reference eye position.
  9. The display method of a head-mounted display device according to claim 6, wherein the motion amount data includes translation amount data or rotation amount data.
  10. The display method of a head-mounted display device according to claim 6, wherein, in step S4, the motion processing is achieved by performing a two-dimensional affine transformation of the image on the eye's viewing plane based on the motion amount data.
PCT/CN2017/076766 2017-03-15 2017-03-15 Head-mounted display device and display method thereof WO2018165906A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/076766 WO2018165906A1 (zh) 2017-03-15 2017-03-15 Head-mounted display device and display method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/076766 WO2018165906A1 (zh) 2017-03-15 2017-03-15 Head-mounted display device and display method thereof

Publications (1)

Publication Number Publication Date
WO2018165906A1 (zh)

Family

ID=63522631

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/076766 WO2018165906A1 (zh) 2017-03-15 2017-03-15 Head-mounted display device and display method thereof

Country Status (1)

Country Link
WO (1) WO2018165906A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002014300A (ja) * 2000-06-28 2002-01-18 Seiko Epson Corp Head-mounted display device
JP2010232718A (ja) * 2009-03-25 2010-10-14 Olympus Corp Head-mounted image display device
CN102884803A (zh) * 2011-05-11 2013-01-16 Panasonic Corporation Image processing device, video processing method, program, and integrated circuit
CN103380625A (zh) * 2011-06-16 2013-10-30 Panasonic Corporation Head-mounted display and position deviation adjustment method thereof
CN103439794A (zh) * 2013-09-11 2013-12-11 Baidu Online Network Technology (Beijing) Co., Ltd. Calibration method for a head-mounted device and head-mounted device
CN103593044A (zh) * 2012-08-13 2014-02-19 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. Electronic device correction system and method


Similar Documents

Publication Publication Date Title
JP6860488B2 (ja) Mixed reality system
CN108292489B (zh) Information processing device and image generation method
EP3281406B1 (en) Retina location in late-stage re-projection
CN107810633B (zh) 立体渲染系统
US9904056B2 (en) Display
US10241329B2 (en) Varifocal aberration compensation for near-eye displays
JP6454851B2 (ja) Algorithm for locating a three-dimensional gaze point
CN107368192B (zh) Real-scene observation method for VR glasses and VR glasses
KR20160094190A (ko) Gaze tracking apparatus and method
JP2010072477A (ja) Image display device, image display method, and program
CN114730094A (zh) Artificial reality system with varifocal display of artificial reality content
US9681122B2 (en) Modifying displayed images in the coupled zone of a stereoscopic display based on user comfort
JP2018050179A (ja) Information processing device, image generation method, and head-mounted display
WO2020003860A1 (ja) Information processing device, information processing method, and program
JP2011010126A (ja) Image processing device and image processing method
WO2020185405A1 (en) Registration of local content between first and second augmented reality viewers
JP7367689B2 (ja) Information processing device, information processing method, and recording medium
JP6411244B2 (ja) Video presentation method and video presentation device
JP2017046065A (ja) Information processing device
US20230215023A1 (en) Apparatus and Method for Virtual Reality
US11187895B2 (en) Content generation apparatus and method
JP2012244466A (ja) Stereoscopic image processing device
KR20200115631A (ko) Multi-viewing virtual reality user interface
WO2017163649A1 (ja) Image processing device
WO2018165906A1 (zh) Head-mounted display device and display method thereof

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17901096; Country of ref document: EP; Kind code of ref document: A1)

NENP  Non-entry into the national phase (Ref country code: DE)

122  Ep: pct application non-entry in european phase (Ref document number: 17901096; Country of ref document: EP; Kind code of ref document: A1)