WO2020192359A1 - Near-eye wearable device and display method thereof - Google Patents

Near-eye wearable device and display method thereof Download PDF

Info

Publication number
WO2020192359A1
WO2020192359A1 (PCT/CN2020/077218)
Authority
WO
WIPO (PCT)
Prior art keywords
display panel
pixel units
display
light
pixel unit
Prior art date
Application number
PCT/CN2020/077218
Other languages
English (en)
French (fr)
Inventor
彭金豹
苗京花
陈丽莉
张浩
李文宇
索健文
范清文
王雪丰
李茜
王立新
赵斌
孙玉坤
李治富
Original Assignee
京东方科技集团股份有限公司
北京京东方光电科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司, 北京京东方光电科技有限公司
Publication of WO2020192359A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays

Definitions

  • the embodiments of the present disclosure relate to the field of display technology, and in particular to a near-eye wearable device and a display method thereof.
  • Virtual reality technology is a computer simulation technique for creating and experiencing virtual worlds. It uses a computer to generate a simulated environment, a system simulation of multi-source information fusion and of interactive, three-dimensional dynamic scenes and entity behavior, that immerses the user in that environment.
  • Virtual reality devices in the related art display pictures of the same object taken from different angles on the left and right screens, and rely on the offset between the images seen by the two eyes to create a sense of depth.
  • However, the light emitted by the screen carries no depth information and the eyes remain focused on the screen itself, so the accommodation of the eyes does not match this sense of depth. The resulting vergence-accommodation conflict leads to a series of symptoms such as visual fatigue, double vision, headaches and nausea.
  • Light field display technology can reproduce a real viewing scene and can solve the above problems well.
  • However, the light field display equipment currently in use usually requires separate display devices for the left eye and the right eye, and some pixels in those devices go unused, which wastes display area and makes the equipment cumbersome and heavy.
  • the embodiments of the present disclosure provide a near-eye wearable device and a display method thereof to realize light field display and improve pixel utilization.
  • the embodiments of the present disclosure provide a near-eye wearable device, including: a first display panel, a second display panel located on the light-emitting side of the first display panel, a plurality of prisms located on the light-emitting side of the second display panel, and two imaging lenses, corresponding respectively to the two eyes, located on the side of the prisms facing away from the second display panel.
  • the distance between each of the two imaging lenses and the second display panel is equal;
  • the second display panel includes a plurality of pixel units arranged in an array; each of the prisms corresponds to at least one column of the pixel units;
  • Each prism includes an inclined flat surface, set at a predetermined angle to the surface on the light-exit side of the second display panel that serves as the display surface, acting as the light-exit surface of the prism, and a flat bottom surface, arranged facing the display surface of the second display panel, acting as the light-incident surface of the prism. Some of the prisms are configured to receive, at their bottom surfaces, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to transmit it through their inclined flat surfaces toward the imaging lens corresponding to the left eye; the other prisms are configured to receive, at their bottom surfaces, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to transmit it through their inclined flat surfaces toward the imaging lens corresponding to the right eye.
  • the bottom surface is parallel to the display surface of the second display panel.
  • the prisms that transmit light to the imaging lens corresponding to the left eye and the prisms that transmit light to the imaging lens corresponding to the right eye alternate along the direction of the pixel unit rows.
  • one prism corresponds to at least one column of pixel units.
  • the prism further includes: a bottom surface parallel to the display surface of the second display panel, and a connecting surface joining the inclined flat surface and the bottom surface; the connecting surface is a flat surface or a curved surface.
  • two adjacent prisms are arranged in mirror symmetry about the common center line of the pixel unit columns to which they correspond.
  • the number of pixel units included in the first display panel is greater than the number of pixel units included in the second display panel.
  • the near-eye wearable device is glasses or a helmet.
  • the embodiments of the present disclosure provide a display method of a near-eye wearable device, the near-eye wearable device comprising: a first display panel and a second display panel located on the light-emitting side of the first display panel; each of the first display panel and the second display panel includes a plurality of pixel units; the display method includes: determining, according to the numbers of pixel units in the first display panel and the second display panel and a predetermined correspondence between the pixel units of the first display panel and those of the second display panel, a sparse matrix describing the rays linking corresponding pixel units; constructing a light field equation from the sparse matrix, the transmittances of the pixel units and the light field information matrix corresponding to a predetermined target image, and solving for the transmittance of each pixel unit; and determining the data signal of each pixel unit from its transmittance.
  • the determined data signals of the pixel units are then loaded onto the first display panel and the second display panel to display the target image.
  • determining the sparse matrix according to the numbers of pixel units in the first display panel and the second display panel and the predetermined correspondence between the pixel units of the first display panel and those of the second display panel includes: encoding each pixel unit of the first display panel and of the second display panel separately to obtain an encoding vector corresponding to each pixel unit; and
  • combining the encoding vectors corresponding to the pixel units of the first display panel and the second display panel according to the predetermined correspondence between the pixel units to obtain the sparse matrix.
  • calculating the transmittance of each pixel unit includes:
  • solving for the transmittance of each pixel unit by the least squares method.
  • FIG. 1 is a schematic diagram of a three-dimensional structure of a near-eye wearable device provided by an embodiment of the disclosure
  • FIG. 2 is a schematic cross-sectional structure diagram of the near-eye wearable device taken along I-I' in FIG. 1;
  • FIG. 3 is a first schematic diagram of the prism imaging principle provided by an embodiment of the disclosure.
  • FIG. 4 is a second schematic diagram of the prism imaging principle provided by an embodiment of the disclosure.
  • FIG. 5 is a schematic diagram of the structure of a prism provided by an embodiment of the disclosure.
  • FIG. 6 is the second schematic diagram of the cross-sectional structure of the near-eye wearable device provided by an embodiment of the disclosure.
  • FIG. 7 is an imaging principle diagram of a near-eye wearable device provided by an embodiment of the disclosure.
  • FIG. 8 is a flowchart of a display method of a near-eye wearable device provided by an embodiment of the disclosure.
  • FIG. 9 is a schematic diagram of the correspondence between pixel units provided by the embodiments of the disclosure.
  • As shown in FIG. 1, the near-eye wearable device includes: a first display panel 11; a second display panel 12 on the light-emitting side of the first display panel 11; a plurality of prisms 13 on the light-emitting side of the second display panel 12; and two imaging lenses 14R and 14L, located on the side of the prisms 13 away from the second display panel 12 and arranged to correspond to the two eyes.
  • 14R represents the imaging lens corresponding to the right eye
  • 14L represents the imaging lens corresponding to the left eye.
  • Refer to FIG. 2 for a schematic cross-sectional view of the near-eye wearable device taken along the direction I-I' in FIG. 1.
  • As shown in FIG. 2, the two imaging lenses 14R and 14L are each at an equal distance from the second display panel 12; the second display panel 12 includes a plurality of pixel units 121 arranged in an array; and each prism 13 corresponds to at least one column of pixel units (FIG. 2 shows, by way of example, one prism 13 corresponding to two columns of pixel units 121). It should be noted that, because FIG. 2 is a section taken along the direction I-I' in FIG. 1, the "column" direction is the direction perpendicular to the plane of the page in FIG. 2.
  • Each prism 13 located at the light-exit side of the second display panel includes an inclined flat surface 131, set at a predetermined angle to the display surface of the second display panel 12 (that is, the surface at the light-exit side of the second display panel 12), serving as the light-exit surface of the prism 13.
  • As shown in FIGS. 3 to 5, each prism 13 located at the light-exit side of the second display panel further includes: a flat bottom surface, arranged facing the display surface of the second display panel, serving as its light-incident surface, the bottom surface being, for example, parallel to the display surface of the second display panel; and a connecting surface joining the inclined flat surface and the bottom surface.
  • the connecting surface is, for example, a curved transition surface as shown in FIGS. 3 and 4, or alternatively a flat transition surface as shown in FIG. 5; the embodiments of the present disclosure do not limit the shape of the connecting surface of the prism.
  • in practice, the position of the connecting surface corresponds to the non-opening area of the pixel units, so the human eye cannot view the image through the connecting surface.
  • Some of the prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to emit it through their inclined flat surfaces 131 toward the imaging lens 14L corresponding to the left eye; the other prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to emit it through their inclined flat surfaces 131 toward the imaging lens 14R corresponding to the right eye.
  • the imaging principle of the prism 13 on the light emitted from the pixel unit 121 is shown in FIG. 3.
  • the light-exit surface of the prism 13, through which the light from the corresponding column of pixel units exits, is a flat surface set at a certain angle to the display surface, that is, the aforementioned inclined flat surface 131. According to the principle of refraction, the inclined flat surface 131 of the prism 13 refracts the incident light toward the direction in which the inclined flat surface 131 is tilted.
  • Accordingly, when two such prisms are arranged with their bottom surfaces coplanar and their inclined flat surfaces 131 facing away from each other (as shown in FIG. 3), the light emitted by their respective corresponding columns of pixel units is directed toward the two imaging lenses along two directions that diverge obliquely from each other.
  • In this way, the light from different columns of pixel units exits toward the imaging lens 14L corresponding to the left eye and the imaging lens 14R corresponding to the right eye respectively, separating the left-eye image from the right-eye image.
  • A three-dimensional image can then be viewed through image fusion in the brain.
  • the embodiments of the present disclosure therefore need only a single set of display devices for both eyes to realize light field display, and the two eyes can share some of the pixel units in the first display panel and the second display panel, which improves pixel unit utilization.
  • Alternatively, when the two prisms 13 are arranged with their bottom surfaces coplanar and their inclined flat surfaces 131 facing toward each other (as shown in FIG. 4), the light emitted by their respective corresponding columns of pixel units is directed toward the two imaging lenses along two directions that converge slightly toward each other.
  • the light from the pixel unit columns still exits toward the imaging lens 14L corresponding to the left eye and the imaging lens 14R corresponding to the right eye respectively, achieving a limited degree of separation of the left-eye and right-eye images, and a three-dimensional image can again be viewed through image fusion in the brain.
  • Furthermore, because the light-exit surface of the prism through which the light from the pixel unit columns exits is flat, it does not alter the convergence or divergence of the light entering the prism (that is, in the cross-section orthogonal to the propagation direction, the relative positions of the sub-beams within the light remain unchanged), so the pattern carried by the light originally emitted from the second display panel (for example, the pattern formed on the cross-section of the light) is preserved and the final image reproduces the real scene.
  • In practice, the orientation of the inclined flat surface 131 of the prism 13 can be set flexibly, as shown in FIGS. 3 and 4.
  • With the structure of FIG. 3, the upper row of prisms is configured to deflect light toward the upper imaging lens and the lower row of prisms is configured to deflect light toward the lower imaging lens.
  • With the structure of FIG. 4, the upper row of prisms is configured to deflect light toward the lower imaging lens and the lower row of prisms is configured to deflect light toward the upper imaging lens.
  • As shown in FIG. 2, the prisms that transmit light to the imaging lens 14L corresponding to the left eye and the prisms that transmit light to the imaging lens 14R corresponding to the right eye alternate along the direction of the pixel unit rows.
  • the prisms that transmit light to the imaging lens 14L corresponding to the left eye have the same structure as the prisms that transmit light to the imaging lens 14R corresponding to the right eye; they differ only in orientation.
  • In a specific implementation, two adjacent prisms 13 (that is, a prism transmitting light to the imaging lens 14L corresponding to the left eye and a prism transmitting light to the imaging lens 14R corresponding to the right eye) are arranged in mirror symmetry about the common center line of the pixel unit columns to which they correspond, so that the pixel unit columns corresponding to the two adjacent prisms serve left-eye and right-eye imaging respectively.
  • one prism 13 can correspond to multiple columns of pixel units, and each column of pixel units can display the same image or different images.
  • when one prism corresponds to multiple columns of pixel units, the number of prisms required can be reduced.
  • one prism 13 can correspond to only one column of corresponding pixel units, and each column of pixel units is used to display a different image. This configuration can improve the image resolution.
  • two virtual image planes can be viewed by both eyes through the imaging lens and prism.
  • the two virtual image planes are respectively the positions where the first display panel and the second display panel are imaged.
  • the two eyes can share pixels in the two display panels, so the images of the same display panel formed by the two imaging lenses partially overlap.
  • the two line segments with solid arrows at both ends in FIG. 7 jointly define the position of the plane where the virtual image formed by the second display panel 12 is located.
  • the two line segments with dashed arrows at each end together define the position of the plane where the virtual image formed by the first display panel 11 is located; and as shown in the figure, because the two virtual image planes do not overlap, a depth of field is formed, and a light field is formed between the two virtual image planes to achieve three-dimensional display.
  • because the virtual image formed by the first display panel 11 is farther from the viewer than that formed by the second display panel 12, the image of each pixel unit of the first display panel is magnified relative to that of the second display panel.
  • to fuse with the image of the second display panel 12 in front of it and match the image resolutions, the first display panel is therefore provided with more pixel units than the second display panel, for example as shown in FIG. 7.
  • the aforementioned near-eye wearable device provided by the embodiment of the present disclosure is, for example, glasses or a helmet.
  • the near-eye wearable device can also take the form of other display accessories, which is not limited here.
  • the above-mentioned three-dimensional display glasses or helmets provided by the embodiments of the present disclosure are lighter to wear and have better display effects.
  • the above-mentioned first display panel 11 is, for example, one of a liquid crystal display panel, an organic light-emitting diode display panel, or a micro light-emitting diode display panel.
  • the second display panel 12 is, for example, a liquid crystal display panel or another transmissive display panel, which is not limited here.
  • according to a second aspect, the embodiments of the present disclosure provide a display method of a near-eye wearable device.
  • the near-eye wearable device to which the display method is applicable includes: a first display panel and a second display panel located on the light emitting side of the first display panel; both the first display panel and the second display panel include a plurality of pixel units.
  • as shown in FIG. 8, the display method of the near-eye wearable device includes, for example: S10, determining, according to the numbers of pixel units in the first display panel and the second display panel and the predetermined correspondence between their pixel units, a sparse matrix describing the rays linking corresponding pixel units; S20, constructing a light field equation from the sparse matrix, the transmittances of the pixel units and the light field information matrix corresponding to a predetermined target image, and solving for the transmittance of each pixel unit; S30, determining the data signal of each pixel unit from its transmittance; and S40, loading the determined data signals onto the first display panel and the second display panel to display the target image.
  • the above-mentioned display method provided by the embodiments of the present disclosure is applicable to the above-mentioned near-eye wearable device.
  • the light emitted from each pixel unit in the first display panel passes through the corresponding pixel unit in the second display panel and travels toward the imaging lens. More specifically, assuming that the first display panel and the second display panel are both liquid crystal display panels, the transmittance of the backlight through the pixel in the i-th row and j-th column of the first display panel is f(i,j), and the transmittance of the transmitted light through the pixel in the k-th row and l-th column of the second display panel is g(k,l). The intensity of that ray when it reaches the eye is then the product of the two, that is, L(i,j,k,l) = f(i,j)·g(k,l).
  • This formula can be regarded as an equation for the light rays passing through corresponding pixels in the first display panel and the second display panel. Since multiple rays of light may pass through the same pixel in the light field, the number of light field equations established in the light field space is much larger than the unknown number of pixel transmittance required to be solved. Therefore, the equations are overdetermined equations, which will cause multiple or no solutions.
  • the prism array on the light-exit surface of the second display panel transmits light to the left eye and the right eye separately, which is equivalent to adding constraints on the light field and helps the solution.
  • the first display panel 11 includes 9 pixel units P11-P19 in three rows and three columns
  • the second display panel 12 includes 4 pixel units P21-P24 in two rows and two columns
  • the correspondence between each pixel unit in the first display panel and each pixel unit in the second display panel is predetermined, and the sparse matrix can be determined from this correspondence.
  • Determining the sparse matrix describing the correspondence between the pixel units includes the following sub-steps:
  • encoding each pixel unit of the first display panel and of the second display panel separately to obtain an encoding vector corresponding to each pixel unit;
  • combining the encoding vectors corresponding to the pixel units of the first display panel and the second display panel according to the predetermined correspondence between the pixel units to obtain the sparse matrix.
  • the above sub-steps are still described by taking the two display panel structures shown in FIG. 9 as an example.
  • in the first display panel 11 there are nine pixel units; accordingly, nine-element vectors are constructed to represent the pixel units P11 to P19, that is, the aforementioned "encoding vectors corresponding to the pixel units of the first display panel" (for example, P11 as [1 0 0 0 0 0 0 0 0], P12 as [0 1 0 0 0 0 0 0 0], and so on).
  • similarly, each pixel unit of the second display panel corresponds to a four-element encoding vector:
  • the pixel unit P21 is represented as a vector [1 0 0 0], for example;
  • the pixel unit P22 is, for example, represented as a vector [0 1 0 0];
  • the pixel unit P23 is expressed as a vector [0 0 1 0], for example;
  • the pixel unit P24 is represented as a vector [0 0 0 1], for example.
  • the vectors obtained by the above two sets of encodings are combined together.
  • the number of pixel units in the second display panel is less than that of the pixel units in the first display panel. Therefore, each pixel unit in the second display panel corresponds to multiple pixel units in the first display panel.
  • the specific correspondence relationship is determined by the optical characteristics of the prism and imaging lens in the near-eye wearable device.
  • the vector array obtained after simultaneous combination is as follows:
  • the sparse matrix T can be obtained as follows:
  • in the above light field equation, an unknown vector X to be solved characterizes the transmittance of each pixel unit in the two display panels. Since the resolution of the first display panel is 3×3 and that of the second display panel is 2×2, when the unknown X is written as a vector its dimension is the sum of the dimension of the encoding vectors corresponding to the pixel units of the first display panel and the dimension of the encoding vectors corresponding to the pixel units of the second display panel, that is, 9 + 4 = 13 dimensions.
  • if TP11 denotes the transmittance of the pixel unit P11 in the first display panel, TP12 the transmittance of the pixel unit P12, and so on up to TP19 for the pixel unit P19, and similarly TP21 denotes the transmittance of the pixel unit P21 in the second display panel, TP22 that of the pixel unit P22, and so on up to TP24 for the pixel unit P24, then the above vector X characterizing the transmittances of the pixel units in the two display panels is expressed as the corresponding 13-element vector.
  • multiplying the sparse matrix T, which represents the rays between the pixel units, by the transmittance vector X to be solved gives the target light field L, which is the light field information vector corresponding to the target image.
  • the values in L are known quantities of the target light field content.
  • in the equation TX = L, the column vector on the right-hand side of the equals sign is L, and the 13 entries const1, const2, const3, ..., const13 that it contains are essentially the measured elements of the target light field L (that is, the elements of the light field information vector corresponding to the target image). Specifically, taking the intensity of the light emitted toward each pixel of the first display panel as unity, each entry is the intensity of the light emitted after passing in turn through the corresponding pixels of the first and second panels. These entries are therefore fixed once measured and are treated as constants in the equation.
  • after the transmittance of each pixel unit has been obtained, the data signal to be loaded on each pixel unit can be determined from that transmittance; loading the determined data signals onto the pixel units of the first display panel and the second display panel then displays the above target image.
  • the near-eye wearable device can use the above-mentioned method to determine the data signal when displaying any picture.
  • the display device and the display method provided by the embodiments of the present disclosure have at least the following superior technical effects:
  • the near-eye wearable device and the display method thereof provided by the embodiments of the present disclosure include: a first display panel; a second display panel located on the light-emitting side of the first display panel; a plurality of prisms located on the light-emitting side of the second display panel; and two imaging lenses, corresponding respectively to the two eyes, located on the side of the prisms away from the second display panel; wherein the two imaging lenses are each at an equal distance from the second display panel;
  • the second display panel includes a plurality of pixel units arranged in an array
  • Each prism corresponds to at least one column of pixel units; each prism includes an inclined flat surface, set at a predetermined angle to the display surface of the second display panel, serving as the light-exit surface of the prism 13, and a flat bottom surface, arranged facing the display surface of the second display panel, serving as the light-incident surface of the prism 13.
  • Some of the prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to emit it through their inclined flat surfaces 131 toward the imaging lens corresponding to the left eye; the other prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and to emit it through their inclined flat surfaces 131 toward the imaging lens corresponding to the right eye.
  • the inclined flat surface of the prism refracts incident light toward the direction in which that surface is tilted, so the light emitted by the respective corresponding columns of pixel units can be directed toward the two imaging lenses along two directions that diverge obliquely from each other.
  • the light emitted by different columns of pixel units thus exits toward the imaging lens corresponding to the left eye and the imaging lens corresponding to the right eye respectively, separating the left-eye image from the right-eye image, and a three-dimensional image can be viewed through image fusion of the brain.
  • the embodiments of the present disclosure only need to set up a set of display devices for both eyes to realize light field display, and the two eyes can share some pixel units in the first display panel and the second display panel, which improves the utilization rate of the pixel units.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

A near-eye wearable device, including: a first display panel (11); a second display panel (12) located on the light-emitting side of the first display panel (11); a plurality of prisms (13) located on the light-emitting side of the second display panel (12); and two imaging lenses (14R, 14L) corresponding respectively to the two eyes, each at an equal distance from the second display panel (12). The second display panel (12) includes a plurality of pixel units (121) arranged in an array; each prism (13) corresponds to at least one column of pixel units (121). Some of the prisms (13) are configured to receive, at their bottom surfaces facing the second display panel (12), the light emitted by the corresponding columns of pixel units (121) and to emit it from their inclined flat surfaces (131) toward the imaging lens (14L) corresponding to the left eye; the other prisms (13) are configured to receive, at their bottom surfaces facing the second display panel (12), the light emitted by the corresponding columns of pixel units (121) and to emit it from their inclined flat surfaces (131) toward the imaging lens (14R) corresponding to the right eye. A display method of a near-eye wearable device is also disclosed.

Description

Near-eye wearable device and display method thereof
Cross-Reference to Related Applications
The embodiments of the present disclosure claim the benefit of Chinese Patent Application No. 201910242431.1, filed with the Chinese Patent Office on March 28, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present disclosure relate to the field of display technology, and in particular to a near-eye wearable device and a display method thereof.
Background
Virtual reality technology is a computer simulation technique that can create and experience virtual worlds. It uses a computer to generate a simulated environment, a system simulation of multi-source information fusion and of interactive, three-dimensional dynamic scenes and entity behavior, that immerses the user in that environment. Virtual reality devices in the related art all display, on the left and right screens, pictures of the same object taken from different angles, and rely on the offset between the images seen by the two eyes to create a stereoscopic impression. However, the light emitted by the screen carries no depth information, and the eyes remain focused on the screen, so the accommodation of the eyes does not match this sense of depth. The resulting vergence-accommodation conflict leads to a series of symptoms such as visual fatigue, double vision, headaches and nausea.
Light field display technology can simulate a real viewing scene and can solve the above problems well. However, the light field display equipment currently in use usually requires separate display devices for the left eye and the right eye, and some pixels in those display devices go unused, which wastes display area and makes the equipment cumbersome and heavy.
Summary
To at least partially overcome the above defects and/or deficiencies in the related art, the embodiments of the present disclosure provide a near-eye wearable device and a display method thereof, so as to realize light field display and improve pixel utilization.
The embodiments of the present disclosure provide the following technical solutions:
According to a first aspect of the embodiments of the present disclosure, a near-eye wearable device is provided, including: a first display panel, a second display panel located on the light-emitting side of the first display panel, a plurality of prisms located on the light-emitting side of the second display panel, and two imaging lenses, corresponding respectively to the two eyes, located on the side of the prisms facing away from the second display panel; wherein,
the two imaging lenses are each at an equal distance from the second display panel;
the second display panel includes a plurality of pixel units arranged in an array; each of the prisms corresponds to at least one column of the pixel units;
each of the prisms includes an inclined flat surface, set at a predetermined angle to the surface at the light-exit side of the second display panel that serves as the display surface, acting as the light-exit surface of the prism, and a flat bottom surface, arranged facing the display surface of the second display panel, acting as the light-incident surface of the prism; some of the prisms are configured to receive, at their bottom surfaces, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to transmit it through their inclined flat surfaces toward the imaging lens corresponding to the left eye; the other prisms are configured to receive, at their bottom surfaces, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to transmit it through their inclined flat surfaces toward the imaging lens corresponding to the right eye.
In an exemplary implementation, the bottom surface is parallel to the display surface of the second display panel.
In an exemplary implementation, in the above near-eye wearable device provided by the embodiments of the present disclosure, the prisms that transmit light to the imaging lens corresponding to the left eye and the prisms that transmit light to the imaging lens corresponding to the right eye are arranged alternately along the direction of the pixel unit rows.
In an exemplary implementation, in the above near-eye wearable device provided by the embodiments of the present disclosure, one prism corresponds to at least one column of the pixel units.
In an exemplary implementation, in the above near-eye wearable device provided by the embodiments of the present disclosure, the prism further includes: a bottom surface parallel to the display surface of the second display panel, and a connecting surface joining the inclined flat surface and the bottom surface; the connecting surface is a flat surface or a curved surface.
In an exemplary implementation, in the above near-eye wearable device provided by the embodiments of the present disclosure, two adjacent prisms are arranged in mirror symmetry about the common center line of the pixel unit columns to which they correspond.
In an exemplary implementation, in the above near-eye wearable device provided by the embodiments of the present disclosure, the number of pixel units included in the first display panel is greater than the number of pixel units included in the second display panel.
In an exemplary implementation, in the above near-eye wearable device provided by the embodiments of the present disclosure, the near-eye wearable device is glasses or a helmet.
According to a second aspect of the embodiments of the present disclosure, a display method of a near-eye wearable device is provided. The near-eye wearable device includes: a first display panel and a second display panel located on the light-emitting side of the first display panel; each of the first display panel and the second display panel includes a plurality of pixel units. The display method includes:
determining, according to the numbers of pixel units in the first display panel and the second display panel and a predetermined correspondence between the pixel units of the first display panel and the pixel units of the second display panel, a sparse matrix describing the rays linking corresponding pixel units;
constructing a light field equation from the sparse matrix, the transmittances of the pixel units of the first display panel and the second display panel, and the light field information matrix corresponding to a predetermined target image, and solving for the transmittance of each pixel unit;
determining the data signal corresponding to each pixel unit according to the transmittance of that pixel unit;
loading the determined data signals of the pixel units onto the first display panel and the second display panel to display the target image.
In an exemplary implementation, in the above display method provided by the embodiments of the present disclosure, determining the sparse matrix describing the correspondence between the pixel units according to the numbers of pixel units in the first display panel and the second display panel and the predetermined correspondence between the pixel units of the first display panel and those of the second display panel includes:
encoding each pixel unit of the first display panel and of the second display panel separately to obtain an encoding vector corresponding to each pixel unit;
combining the encoding vectors corresponding to the pixel units of the first display panel and the second display panel according to the predetermined correspondence between the pixel units to obtain the sparse matrix.
In an exemplary implementation, in the above display method provided by the embodiments of the present disclosure, solving for the transmittance of each pixel unit includes:
solving for the transmittance of each pixel unit by the least squares method.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present disclosure more clearly, the drawings needed for the embodiments of the present disclosure are briefly introduced below. Obviously, the drawings described below illustrate only some of the embodiments of the present disclosure, and those of ordinary skill in the art can obtain other drawings from them without creative effort. The above and/or additional aspects and advantages of the present application will become apparent and easy to understand from the following description of the embodiments in conjunction with the drawings, in which:
FIG. 1 is a schematic perspective view of a near-eye wearable device provided by an embodiment of the present disclosure;
FIG. 2 is a schematic cross-sectional view of the near-eye wearable device taken along I-I' in FIG. 1;
FIG. 3 is a first schematic diagram of the prism imaging principle provided by an embodiment of the present disclosure;
FIG. 4 is a second schematic diagram of the prism imaging principle provided by an embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of a prism provided by an embodiment of the present disclosure;
FIG. 6 is a second schematic cross-sectional view of the near-eye wearable device provided by an embodiment of the present disclosure;
FIG. 7 is an imaging principle diagram of the near-eye wearable device provided by an embodiment of the present disclosure;
FIG. 8 is a flowchart of a display method of a near-eye wearable device provided by an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of the correspondence between pixel units provided by an embodiment of the present disclosure.
Detailed Description
To make the objectives, technical solutions and advantages of the embodiments of the present disclosure clearer, the embodiments of the present disclosure are described in further detail below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. Based on the embodiments of the present disclosure, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the embodiments of the present disclosure.
The near-eye wearable device and its display method provided by specific embodiments of the present disclosure are described in detail below with reference to the drawings. For clarity, the parts in the drawings are not drawn to scale. In addition, some well-known parts may not be shown in the drawings.
Many specific details of the embodiments of the present disclosure, such as the structures, materials, dimensions, processes and techniques of components, are described below for a clearer understanding of the embodiments of the present disclosure. However, as those skilled in the art will understand, the embodiments of the present disclosure may be practiced without following these specific details.
The dimensions and shapes of the components in the drawings do not reflect the true proportions of the components of a near-eye wearable device according to the embodiments of the present disclosure; they are intended only to illustrate the content of the embodiments of the present disclosure.
According to the general technical concept of the embodiments of the present disclosure, in a first aspect of the embodiments of the present disclosure, a near-eye wearable device is provided. As shown in FIG. 1, the near-eye wearable device includes: a first display panel 11; a second display panel 12 located on the light-emitting side of the first display panel 11; a plurality of prisms 13 located on the light-emitting side of the second display panel 12; and two imaging lenses 14R and 14L, located on the side of the prisms 13 facing away from the second display panel 12 and arranged to correspond to the two eyes. Here, 14R denotes the imaging lens corresponding to the right eye and 14L denotes the imaging lens corresponding to the left eye. "Corresponding" here means that a light beam propagating from the first display panel 11 through the second display panel 12 and then through the imaging lens 14R is directed toward the observer's right eye, and a light beam propagating from the first display panel 11 through the second display panel 12 and then through the imaging lens 14L is directed toward the observer's left eye; more specifically, for example, the imaging lenses 14R and 14L serve as the right and left lenses of the near-eye wearable device worn by the observer, arranged in front of the observer's right eye and left eye respectively. FIG. 2 is a schematic cross-sectional view of the near-eye wearable device taken along the direction I-I' in FIG. 1.
As shown in FIG. 2, the two imaging lenses 14R and 14L are each at an equal distance from the second display panel 12; the second display panel 12 includes a plurality of pixel units 121 arranged in an array; and each prism 13 corresponds to at least one column of pixel units (FIG. 2 shows, by way of example, one prism 13 corresponding to two columns of pixel units 121). It should be noted that, because FIG. 2 is a section taken along the direction I-I' in FIG. 1, the "column" direction is the direction perpendicular to the plane of the page in FIG. 2.
Each prism 13 located at the light-exit side of the second display panel includes an inclined flat surface 131, set at a predetermined angle to the display surface of the second display panel 12 (that is, the surface at the light-exit side of the second display panel 12), serving as the light-exit surface of the prism 13. In addition, as shown in FIGS. 3 to 5, each prism 13 located at the light-exit side of the second display panel further includes: a flat bottom surface, arranged facing the display surface of the second display panel, serving as its light-incident surface, the bottom surface being, for example, parallel to the display surface of the second display panel; and a connecting surface joining the inclined flat surface and the bottom surface. The connecting surface is, for example, a curved transition surface as shown in FIGS. 3 and 4, or alternatively a flat transition surface as shown in FIG. 5. The embodiments of the present disclosure do not limit the shape of the connecting surface of the prism; in practice, the position of the connecting surface corresponds to the non-opening area of the pixel units, so the human eye cannot view the image through the connecting surface.
Some of the prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to emit it through their inclined flat surfaces 131 toward the imaging lens 14L corresponding to the left eye; the other prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to emit it through their inclined flat surfaces 131 toward the imaging lens 14R corresponding to the right eye.
The imaging principle of the prism 13 for the light emitted by the pixel unit 121 is shown in FIG. 3. In the embodiments of the present disclosure, the light-exit surface of the prism 13, through which the light from the corresponding column of pixel units exits, is a flat surface set at a certain angle to the display surface, that is, the aforementioned inclined flat surface 131; according to the principle of refraction, the inclined flat surface 131 of the prism 13 therefore refracts the incident light toward the direction in which the inclined flat surface 131 is tilted.
In a further embodiment, for example, when two such prisms are arranged with their bottom surfaces coplanar and their inclined flat surfaces 131 facing away from each other (as shown in FIG. 3), the light emitted by their respective corresponding columns of pixel units can be directed toward the two imaging lenses along two directions that diverge obliquely from each other, so that the light of different columns of pixel units exits toward the imaging lens 14L corresponding to the left eye and the imaging lens 14R corresponding to the right eye respectively. This separates the left-eye image from the right-eye image, and a three-dimensional image can be viewed through image fusion in the brain. The embodiments of the present disclosure therefore need only a single set of display devices for both eyes to realize light field display, and the two eyes can share some of the pixel units in the first display panel and the second display panel, which improves pixel unit utilization.
In an alternative further embodiment, for example, when the two prisms are arranged with their bottom surfaces coplanar and their inclined flat surfaces 131 facing toward each other (as shown in FIG. 4), the light emitted by their respective corresponding columns of pixel units can be directed toward the two imaging lenses along two directions that converge slightly toward each other, so that the light of different columns of pixel units exits toward the imaging lens 14L corresponding to the left eye and the imaging lens 14R corresponding to the right eye respectively, achieving a limited degree of separation of the left-eye and right-eye images; a three-dimensional image can be viewed through image fusion in the brain.
Furthermore, because the light-exit surface of the prism through which the light from the pixel unit columns exits is flat, it does not alter the convergence or divergence of the light entering the prism (that is, in the cross-section orthogonal to the propagation direction, the relative positions of the sub-beams within the light remain unchanged). The pattern carried by the light originally emitted from the second display panel (for example, the pattern formed on the cross-section of the light) is thus preserved, so the final image reproduces the real scene. In practice, the orientation of the inclined flat surface 131 of the prism 13 can be set flexibly, as shown in FIGS. 3 and 4: with the structure of FIG. 3, the upper row of prisms is configured to deflect light toward the upper imaging lens and the lower row of prisms toward the lower imaging lens; with the structure of FIG. 4, the upper row of prisms is configured to deflect light toward the lower imaging lens and the lower row of prisms toward the upper imaging lens.
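To make the refraction behavior described above concrete, the following minimal sketch applies Snell's law at an inclined flat exit facet. It is an illustration only: the refractive index, the facet tilt and the assumption of a ray leaving the pixel perpendicular to the display surface are hypothetical values, not parameters given in the present disclosure.

```python
import math

def deflection_at_inclined_facet(tilt_deg, n_prism=1.5, n_air=1.0):
    """Deflection of a ray that leaves the pixel perpendicular to the display
    surface and exits through a flat facet tilted by tilt_deg relative to that
    surface. The refractive indices are assumed values for illustration."""
    # For a ray normal to the display surface, the angle of incidence measured
    # from the facet normal equals the facet tilt angle.
    theta_in = math.radians(tilt_deg)
    # Snell's law at the prism-to-air interface: n_prism*sin(theta_in) = n_air*sin(theta_out)
    sin_out = n_prism * math.sin(theta_in) / n_air
    if abs(sin_out) > 1.0:
        raise ValueError("total internal reflection at this tilt angle")
    theta_out = math.asin(sin_out)
    # The extra bend beyond the original direction points toward the tilt
    # direction, i.e. toward the left- or right-eye imaging lens.
    return math.degrees(theta_out - theta_in)

# Example: a facet tilted by 10 degrees bends a normally incident ray by about
# 5 degrees toward the tilt direction (assuming n_prism = 1.5).
print(round(deflection_at_inclined_facet(10.0), 1))  # 5.1
```

Flipping the sign of the tilt flips the sign of the deflection, which is the mirror-symmetric arrangement of adjacent prisms described below.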
As shown in FIG. 2, in the above near-eye wearable device provided by the embodiments of the present disclosure, the prisms that transmit light to the imaging lens 14L corresponding to the left eye and the prisms that transmit light to the imaging lens 14R corresponding to the right eye alternate along the direction of the pixel unit rows. The prisms that transmit light to the imaging lens 14L corresponding to the left eye have the same structure as the prisms that transmit light to the imaging lens 14R corresponding to the right eye; they differ only in orientation. In a specific implementation, two adjacent prisms 13 (that is, a prism transmitting light to the imaging lens 14L corresponding to the left eye and a prism transmitting light to the imaging lens 14R corresponding to the right eye) are arranged in mirror symmetry about the common center line of the pixel unit columns to which they correspond, so that the pixel unit columns corresponding to the two adjacent prisms serve left-eye and right-eye imaging respectively.
In practice, one prism 13 may correspond to multiple columns of pixel units, and the columns of pixel units may display the same image or different images; when one prism corresponds to multiple columns of pixel units, the number of prisms required can be reduced. In another possible implementation, as shown in FIG. 6, one prism 13 may correspond to only one column of pixel units, with each column of pixel units displaying a different image; this configuration can improve the image resolution.
FIG. 7 shows the imaging principle of the above near-eye wearable device provided by the embodiments of the present disclosure. Through the imaging lenses and the prisms, the two eyes can view two virtual image planes, which are the positions at which the first display panel and the second display panel are imaged respectively. The two eyes can share the pixels in the two display panels, so the images of the same display panel formed by the two imaging lenses partially overlap. In FIG. 7, the two line segments with solid arrows at both ends jointly define the position of the plane of the virtual image formed by the second display panel 12, and the two line segments with dashed arrows at both ends jointly define the position of the plane of the virtual image formed by the first display panel 11. As shown in the figure, because the two virtual image planes do not coincide, a depth of field is formed, and a light field is formed between the two virtual image planes, realizing three-dimensional display. In a specific implementation, because the virtual image formed by the first display panel 11 is farther from the viewer than the virtual image formed by the second display panel 12, the image of each pixel unit of the first display panel is magnified relative to that of the second display panel; to fuse with the image of the second display panel 12 in front of it and match the image resolutions, the number of pixel units included in the first display panel 11 is set, for example as shown in FIG. 7, to be greater than the number of pixel units included in the second display panel 12.
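As a rough numerical illustration of why the nearer virtual image plane (second panel) and the farther, magnified one (first panel) call for different pixel counts, the sketch below uses the thin-lens magnifier relation. The focal length and panel distances are assumed values chosen only to show the trend; they are not taken from the disclosure.

```python
def virtual_image(f_mm, d_obj_mm):
    """Thin-lens magnifier imaging: an object placed inside the focal length
    forms an enlarged, upright virtual image on the same side as the object.
    Returns (magnitude of the virtual image distance in mm, lateral magnification)."""
    assert d_obj_mm < f_mm, "object must sit inside the focal length for a virtual image"
    d_img = d_obj_mm * f_mm / (f_mm - d_obj_mm)   # magnitude of the virtual image distance
    magnification = f_mm / (f_mm - d_obj_mm)
    return d_img, magnification

f = 40.0                                   # assumed focal length of an imaging lens (mm)
second_panel = virtual_image(f, 30.0)      # assumed distance of the second display panel
first_panel = virtual_image(f, 35.0)       # the first panel sits farther from the lens

print("second panel -> virtual image at %.0f mm, %.1fx" % second_panel)   # 120 mm, 4.0x
print("first panel  -> virtual image at %.0f mm, %.1fx" % first_panel)    # 280 mm, 8.0x
# The farther panel is imaged farther away and more magnified, so it needs more
# pixel units for the two virtual images to match in resolution when fused.
```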
In a specific implementation, the above near-eye wearable device provided by the embodiments of the present disclosure is, for example, glasses or a helmet. In addition, the near-eye wearable device may also take the form of another display accessory, which is not limited here. Compared with the three-dimensional display glasses or helmets of the related art, the above three-dimensional display glasses or helmet provided by the embodiments of the present disclosure are lighter to wear and provide a better display effect. The first display panel 11 is, for example, one of a liquid crystal display panel, an organic light-emitting diode display panel or a micro light-emitting diode display panel, and the second display panel 12 is, for example, a liquid crystal display panel or another transmissive display panel, which is not limited here.
According to a second aspect of the embodiments of the present disclosure, a display method of a near-eye wearable device is provided. The near-eye wearable device to which the display method applies includes: a first display panel and a second display panel located on the light-emitting side of the first display panel; each of the first display panel and the second display panel includes a plurality of pixel units.
As shown in FIG. 8, the display method of the near-eye wearable device provided by the embodiments of the present disclosure includes, for example:
S10, determining, according to the numbers of pixel units in the first display panel and the second display panel and a predetermined correspondence between the pixel units of the first display panel and the pixel units of the second display panel, a sparse matrix describing the rays linking corresponding pixel units;
S20, constructing a light field equation from the sparse matrix, the transmittances of the pixel units of the first display panel and the second display panel, and the light field information matrix corresponding to a predetermined target image, and solving for the transmittance of each pixel unit;
S30, determining the data signal corresponding to each pixel unit according to the transmittance of that pixel unit;
S40, loading the determined data signals of the pixel units onto the first display panel and the second display panel to display the target image.
The above display method provided by the embodiments of the present disclosure is applicable to the above near-eye wearable device, in which the light emitted by each pixel unit of the first display panel passes through the corresponding pixel unit of the second display panel and travels toward an imaging lens. More specifically, assume that the first display panel and the second display panel are both liquid crystal display panels, that the transmittance of the pixel in row i and column j of the first display panel for the backlight is f(i,j), and that the transmittance of the pixel in row k and column l of the second display panel for the transmitted light is g(k,l). The intensity of that ray when it reaches the eye is then the product of the two, that is, L(i,j,k,l) = f(i,j)·g(k,l). This formula can be regarded as one equation for a ray of backlight passing through corresponding pixels of the first display panel and the second display panel. Because several rays in the light field may pass through the same pixel, the number of light field equations established in the light field space is much larger than the number of unknown pixel transmittances to be solved; the system of equations is therefore overdetermined, which can lead to multiple solutions or no solution. Transmitting the light separately to the left eye and the right eye through the prism array on the light-exit surface of the second display panel is equivalent to adding constraints on the light field, which helps the solution.
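The sketch below is a minimal numerical check of the multiplicative relation L(i,j,k,l) = f(i,j)·g(k,l) described above. The transmittance values and the pairing of panel-1 and panel-2 pixels along each ray are assumed for illustration; in the device the pairing would be fixed by the prism and imaging-lens geometry.

```python
import numpy as np

# Assumed 3x3 and 2x2 transmittance maps with values in [0, 1]; illustrative only.
f = np.array([[0.9, 0.5, 0.7],
              [0.4, 0.8, 0.6],
              [0.3, 0.9, 0.5]])      # first display panel, f(i, j)
g = np.array([[0.8, 0.6],
              [0.7, 0.9]])           # second display panel, g(k, l)

# Each ray is identified by the panel-1 pixel (i, j) it leaves and the panel-2
# pixel (k, l) it crosses; this pairing is hypothetical.
rays = [((0, 0), (0, 0)), ((0, 1), (0, 0)), ((0, 2), (0, 1)), ((2, 2), (1, 1))]

# Intensity reaching the eye for each ray: L(i, j, k, l) = f(i, j) * g(k, l).
for (i, j), (k, l) in rays:
    print(f"ray via P1[{i},{j}] and P2[{k},{l}]: L = {f[i, j] * g[k, l]:.2f}")

# With only 9 + 4 = 13 unknown transmittances but one equation per ray, a dense
# light field quickly gives far more equations than unknowns (an overdetermined system).
```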
As shown in FIG. 9, assume that the first display panel 11 includes 9 pixel units P11-P19 in three rows and three columns, that the second display panel 12 includes 4 pixel units P21-P24 in two rows and two columns, and that the correspondence between the pixel units of the first display panel and the pixel units of the second display panel is predetermined; the sparse matrix can then be determined from this correspondence.
Specifically, in the above step S10, determining the sparse matrix describing the correspondence between the pixel units according to the numbers of pixel units in the first display panel and the second display panel and the predetermined correspondence between the pixel units of the first display panel and those of the second display panel includes, for example, the following sub-steps:
encoding each pixel unit of the first display panel and of the second display panel separately to obtain an encoding vector corresponding to each pixel unit;
combining the encoding vectors corresponding to the pixel units of the first display panel and the second display panel according to the predetermined correspondence between the pixel units to obtain the sparse matrix.
The above sub-steps are again described using the two display panel structures shown in FIG. 9 as an example. Specifically, in the first display panel 11 there are nine pixel units; accordingly, nine-element vectors are constructed to represent the pixel units P11 to P19, that is, the aforementioned "encoding vectors corresponding to the pixel units of the first display panel", specifically:
Figure PCTCN2020077218-appb-000001
and so on;
Similarly, in the second display panel there are four pixel units; accordingly, four-element vectors are constructed to represent the pixel units P21 to P24, that is, the aforementioned "encoding vectors corresponding to the pixel units of the second display panel", specifically:
the pixel unit P21 is represented, for example, as the vector [1 0 0 0];
the pixel unit P22 is represented, for example, as the vector [0 1 0 0];
the pixel unit P23 is represented, for example, as the vector [0 0 1 0];
the pixel unit P24 is represented, for example, as the vector [0 0 0 1].
To obtain the rays linking the pixel units of the two display panels, the vectors obtained from the above two encodings are combined. The number of pixel units in the second display panel is smaller than the number of pixel units in the first display panel, so each pixel unit of the second display panel corresponds to several pixel units of the first display panel; the specific correspondence is determined by the optical characteristics of the prisms and imaging lenses in the near-eye wearable device. The vector array obtained after combination is as follows:
[1 0 0 0 0 0 0 0 0 1 0 0 0]
[0 1 0 0 0 0 0 0 0 1 0 0 0]
[0 0 1 0 0 0 0 0 0 0 1 0 0]
.
.
.
[0 0 0 0 0 0 0 0 1 0 0 0 1]
From the above vector array, the sparse matrix T can be obtained as follows:
Figure PCTCN2020077218-appb-000002
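A small sketch of the encode-and-combine sub-steps described above is given below. The one-hot encoding vectors follow the example in the text; the particular mapping from panel-1 pixels to panel-2 pixels beyond the first few rows listed above is an assumption, since in practice it is fixed by the optical characteristics of the prisms and imaging lenses.

```python
import numpy as np

N1, N2 = 9, 4          # pixel units in the first (3x3) and second (2x2) display panels

def encode(index, length):
    """One-hot encoding vector for a pixel unit, as in the example above."""
    v = np.zeros(length)
    v[index] = 1.0
    return v

# Assumed correspondence: for each panel-1 pixel P11..P19 (index 0..8), the
# panel-2 pixel P21..P24 (index 0..3) that its ray crosses. The first rows match
# the rows listed above; the remaining entries are hypothetical.
p1_to_p2 = [0, 0, 1, 0, 2, 1, 2, 3, 3]

# One row of T per ray: the panel-1 encoding vector concatenated with the
# panel-2 encoding vector, e.g. [1 0 0 0 0 0 0 0 0 | 1 0 0 0] for P11 -> P21.
T = np.stack([np.concatenate([encode(i, N1), encode(k, N2)])
              for i, k in enumerate(p1_to_p2)])

print(T.shape)    # (9, 13): 9 rays by 13 unknown transmittances
print(T[0])       # [1. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
```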
In the above light field equation, for example, an unknown vector X to be solved is used to characterize the transmittance of each pixel unit in the two display panels. Since the resolution of the first display panel is 3×3 and the resolution of the second display panel is 2×2, when the unknown X is expressed as a vector, its dimension is the sum of the dimension of the encoding vectors corresponding to the pixel units of the first display panel and the dimension of the encoding vectors corresponding to the pixel units of the second display panel, that is, 9 + 4 = 13 dimensions. If TP11 denotes the transmittance of the pixel unit P11 in the first display panel, TP12 the transmittance of the pixel unit P12, and so on up to TP19 for the pixel unit P19, and similarly TP21 denotes the transmittance of the pixel unit P21 in the second display panel, TP22 that of the pixel unit P22, and so on up to TP24 for the pixel unit P24, then the above vector X characterizing the transmittances of the pixel units in the two display panels is expressed, for example, as the following vector:
Figure PCTCN2020077218-appb-000003
Multiplying the sparse matrix T, which characterizes the rays between the pixel units, by the transmittance vector X to be solved gives the target light field L, which is the light field information vector corresponding to the target image. The values in L are known quantities of the target light field content. Combining the above parameters according to the light field equation TX = L gives, for example, the following equation:
Figure PCTCN2020077218-appb-000004
In this equation, the column vector on the right-hand side of the equals sign is L, and the 13 entries const1, const2, const3, ..., const13 that it contains are essentially the measured values of the elements of the target light field L (that is, the elements of the light field information vector corresponding to the target image). Specifically, taking the intensity of the light emitted toward each pixel of the first display panel as unity, each entry is the intensity of the light emitted after passing in turn through the corresponding pixels of the first and second panels; these entries are therefore fixed once measured and are treated as constants in the equation. The equation TX = L can be solved by the least squares method, and the set of least-squares solutions coincides with the non-empty solution set of the normal equation TᵀTX = TᵀL. Using a high-dimensional linear least-squares algorithm, with the transmittance range [l_max, l_min] given by the liquid crystal characteristics and an initial value X0 given by a set of screenshots of the light field, an optimal solution of the overdetermined linear system TX = L is obtained after several iterations; X is then the desired transmittance of each pixel unit in the two display panels.
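As a sketch of the solving step, the example below uses SciPy's bounded linear least-squares routine as a stand-in for the high-dimensional iterative least-squares algorithm described above; the matrix T, the target light field L and the transmittance range are placeholder values, and a real implementation would use the measured light field and the range given by the liquid crystal characteristics.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Placeholder data standing in for the real ray/pixel matrix T and the measured
# target light field L (here L is synthesized from a known transmittance vector).
rng = np.random.default_rng(0)
T = rng.integers(0, 2, size=(40, 13)).astype(float)
x_true = rng.uniform(0.2, 0.9, size=13)
L = T @ x_true

# Assumed transmittance range allowed by the liquid crystal characteristics.
l_min, l_max = 0.05, 0.95

# Bounded least squares for the overdetermined system T X = L: minimizes
# ||T X - L||^2 subject to l_min <= X <= l_max, i.e. the least-squares solution
# the text associates with the normal equation T^T T X = T^T L.
result = lsq_linear(T, L, bounds=(l_min, l_max))
X = result.x      # transmittance of each pixel unit in the two display panels

print(np.round(X, 3))
```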
After the transmittance of each pixel unit has been obtained, the data signal that each pixel unit needs to be loaded with can be determined from that transmittance; loading the determined data signals onto the pixel units of the first display panel and the second display panel then displays the above target image. The near-eye wearable device can determine the data signals in this way for any picture to be displayed.
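The mapping from a solved transmittance to a loadable data signal depends on the panel's electro-optical response, which the disclosure does not specify. The sketch below is therefore only a hypothetical illustration that assumes an 8-bit data signal and a simple gamma-type response.

```python
import numpy as np

def transmittance_to_gray(transmittance, gamma=2.2, bit_depth=8):
    """Map a solved transmittance in [0, 1] to a display data signal.
    The gamma value and bit depth are assumptions for illustration; a real panel
    would use its own measured electro-optical transfer curve."""
    t = np.clip(np.asarray(transmittance, dtype=float), 0.0, 1.0)
    levels = (1 << bit_depth) - 1
    return np.round(levels * t ** (1.0 / gamma)).astype(int)

# Example: solved transmittances of a few pixel units -> 8-bit data signals.
print(transmittance_to_gray([0.05, 0.25, 0.5, 0.9]))   # [ 65 136 186 243]
```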
Compared with the related art, and based on the above technical solutions, the display device and the display method provided by the embodiments of the present disclosure have at least the following advantageous technical effects:
The near-eye wearable device and the display method thereof provided by the embodiments of the present disclosure include: a first display panel; a second display panel located on the light-emitting side of the first display panel; a plurality of prisms located on the light-emitting side of the second display panel; and two imaging lenses, corresponding respectively to the two eyes, located on the side of the prisms facing away from the second display panel. The two imaging lenses are each at an equal distance from the second display panel; the second display panel includes a plurality of pixel units arranged in an array; and each prism corresponds to at least one column of pixel units. Each prism includes an inclined flat surface, set at a predetermined angle to the display surface of the second display panel, serving as the light-exit surface of the prism 13, and a flat bottom surface, arranged facing the display surface of the second display panel, serving as the light-incident surface of the prism 13. Some of the prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to emit it through their inclined flat surfaces 131 toward the imaging lens corresponding to the left eye; the other prisms are configured to receive, at their flat light-incident surfaces facing the display surface of the second display panel, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to emit it through their inclined flat surfaces 131 toward the imaging lens corresponding to the right eye. The inclined flat surface of a prism refracts incident light toward the direction in which that surface is tilted, so when two such prisms are arranged with their bottom surfaces coplanar and their inclined flat surfaces 131 facing away from each other, the light emitted by their respective corresponding columns of pixel units can be directed toward the two imaging lenses along two directions that diverge obliquely from each other. In this way, the light of different columns of pixel units exits toward the imaging lens corresponding to the left eye and the imaging lens corresponding to the right eye respectively, separating the left-eye image from the right-eye image, and a three-dimensional image can be viewed through image fusion in the brain. The embodiments of the present disclosure therefore need only a single set of display devices for both eyes to realize light field display, and the two eyes can share some of the pixel units in the first display panel and the second display panel, which improves pixel unit utilization.
Although preferred embodiments of the present disclosure have been described, those skilled in the art can make further changes and modifications to these embodiments once they learn of the basic inventive concept. The appended claims are therefore intended to be interpreted as covering the preferred embodiments and all changes and modifications that fall within the scope of the embodiments of the present disclosure.
Obviously, those skilled in the art can make various changes and variations to the embodiments of the present disclosure without departing from the spirit and scope of the embodiments of the present disclosure. If these modifications and variations of the embodiments of the present disclosure fall within the scope of the claims of the embodiments of the present disclosure and their equivalents, the embodiments of the present disclosure are also intended to include them.

Claims (11)

  1. A near-eye wearable device, comprising: a first display panel, a second display panel located on the light-emitting side of the first display panel, a plurality of prisms located on the light-emitting side of the second display panel, and two imaging lenses, corresponding respectively to the two eyes, located on the side of the prisms facing away from the second display panel; wherein,
    the two imaging lenses are each at an equal distance from the second display panel;
    the second display panel comprises a plurality of pixel units arranged in an array; each of the prisms corresponds to at least one column of the pixel units;
    each of the prisms comprises an inclined flat surface, set at a predetermined angle to the surface at the light-exit side of the second display panel that serves as the display surface, acting as the light-exit surface of the prism, and a flat bottom surface, arranged facing the display surface of the second display panel, acting as the light-incident surface of the prism; some of the prisms are configured to receive, at their bottom surfaces, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to transmit it through their inclined flat surfaces toward the imaging lens corresponding to the left eye; the other prisms are configured to receive, at their bottom surfaces, the light emitted by the corresponding columns of pixel units and, after transmission inside the prism, to transmit it through their inclined flat surfaces toward the imaging lens corresponding to the right eye.
  2. The near-eye wearable device according to claim 1, wherein the bottom surface is parallel to the display surface of the second display panel.
  3. The near-eye wearable device according to claim 1, wherein the prisms that transmit light to the imaging lens corresponding to the left eye and the prisms that transmit light to the imaging lens corresponding to the right eye are arranged alternately along the direction of the pixel unit rows.
  4. The near-eye wearable device according to claim 1, wherein one prism corresponds to at least one column of the pixel units.
  5. The near-eye wearable device according to claim 1, wherein the prism further comprises: a connecting surface joining the inclined flat surface and the bottom surface; the connecting surface is a flat surface or a curved surface.
  6. The near-eye wearable device according to claim 4, wherein two adjacent prisms are arranged in mirror symmetry about the common center line of the pixel unit columns to which they correspond.
  7. The near-eye wearable device according to any one of claims 1-6, wherein the number of pixel units comprised in the first display panel is greater than the number of pixel units comprised in the second display panel.
  8. The near-eye wearable device according to any one of claims 1-6, wherein the near-eye wearable device is glasses or a helmet.
  9. A display method of a near-eye wearable device, wherein the near-eye wearable device comprises: a first display panel and a second display panel located on the light-emitting side of the first display panel; each of the first display panel and the second display panel comprises a plurality of pixel units; the display method comprises:
    determining, according to the numbers of pixel units in the first display panel and the second display panel and a predetermined correspondence between the pixel units of the first display panel and the pixel units of the second display panel, a sparse matrix describing the rays linking corresponding pixel units;
    constructing a light field equation from the sparse matrix, the transmittances of the pixel units of the first display panel and the second display panel, and the light field information matrix corresponding to a predetermined target image, and solving for the transmittance of each pixel unit;
    determining the data signal corresponding to each pixel unit according to the transmittance of that pixel unit;
    loading the determined data signals of the pixel units onto the first display panel and the second display panel to display the target image.
  10. The display method according to claim 9, wherein determining the sparse matrix describing the correspondence between the pixel units according to the numbers of pixel units in the first display panel and the second display panel and the predetermined correspondence between the pixel units of the first display panel and those of the second display panel comprises:
    encoding each pixel unit of the first display panel and of the second display panel separately to obtain an encoding vector corresponding to each pixel unit;
    combining the encoding vectors corresponding to the pixel units of the first display panel and the second display panel according to the predetermined correspondence between the pixel units to obtain the sparse matrix.
  11. The display method according to claim 9, wherein solving for the transmittance of each pixel unit comprises:
    solving for the transmittance of each pixel unit by the least squares method.
PCT/CN2020/077218 2019-03-28 2020-02-28 Near-eye wearable device and display method thereof WO2020192359A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910242431.1 2019-03-28
CN201910242431.1A CN109917549B (zh) 2019-03-28 2019-03-28 Near-eye wearable device and display method thereof

Publications (1)

Publication Number Publication Date
WO2020192359A1 true WO2020192359A1 (zh) 2020-10-01

Family

ID=66967309

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077218 WO2020192359A1 (zh) 2019-03-28 2020-02-28 Near-eye wearable device and display method thereof

Country Status (2)

Country Link
CN (1) CN109917549B (zh)
WO (1) WO2020192359A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917549B (zh) * 2019-03-28 2021-03-12 京东方科技集团股份有限公司 Near-eye wearable device and display method thereof
CN110471190B (zh) * 2019-09-11 2022-09-09 京东方科技集团股份有限公司 一种显示装置
CN112068326B (zh) * 2020-09-17 2022-08-09 京东方科技集团股份有限公司 3d显示装置
KR20220067950A (ko) * 2020-11-18 2022-05-25 삼성전자주식회사 디스플레이 장치 및 그의 제어 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106125315A (zh) * 2016-06-24 2016-11-16 北京国承万通信息科技有限公司 显示装置及方法
CN106291959A (zh) * 2016-10-31 2017-01-04 京东方科技集团股份有限公司 一种虚拟显示面板及显示装置
CN106773064A (zh) * 2017-01-22 2017-05-31 网易(杭州)网络有限公司 影像画面的显示控制方法、装置及头戴式显示设备
CN108630154A (zh) * 2018-05-10 2018-10-09 深圳市华星光电技术有限公司 一种光栅片、光栅系统及3d显示装置
CN109917549A (zh) * 2019-03-28 2019-06-21 京东方科技集团股份有限公司 一种近眼可穿戴设备及其显示方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104423137B (zh) * 2013-09-04 2018-08-10 联想(北京)有限公司 三维显示装置、显示方法和电子设备
US10565925B2 (en) * 2014-02-07 2020-02-18 Samsung Electronics Co., Ltd. Full color display with intrinsic transparency
EP3479163A4 (en) * 2016-07-14 2019-07-24 Samsung Electronics Co., Ltd. HIGH TRANSPARENCY MULTILAYER DISPLAY FOR LIGHT FIELD GENERATION

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106125315A (zh) * 2016-06-24 2016-11-16 北京国承万通信息科技有限公司 显示装置及方法
CN106291959A (zh) * 2016-10-31 2017-01-04 京东方科技集团股份有限公司 一种虚拟显示面板及显示装置
CN106773064A (zh) * 2017-01-22 2017-05-31 网易(杭州)网络有限公司 影像画面的显示控制方法、装置及头戴式显示设备
CN108630154A (zh) * 2018-05-10 2018-10-09 深圳市华星光电技术有限公司 一种光栅片、光栅系统及3d显示装置
CN109917549A (zh) * 2019-03-28 2019-06-21 京东方科技集团股份有限公司 一种近眼可穿戴设备及其显示方法

Also Published As

Publication number Publication date
CN109917549B (zh) 2021-03-12
CN109917549A (zh) 2019-06-21

Similar Documents

Publication Publication Date Title
WO2020192359A1 (zh) Near-eye wearable device and display method thereof
US11651566B2 (en) Systems and methods for mixed reality
JP7391842B2 (ja) 表示装置および表示システム
US8628196B2 (en) Display device and display method
US11683472B2 (en) Superstereoscopic display with enhanced off-angle separation
WO2019179136A1 (zh) 显示装置及显示方法
US9632406B2 (en) Three-dimension light field construction apparatus
US10557994B1 (en) Waveguide grating with spatial variation of optical phase
JP2018524952A (ja) クローキングシステム及び方法
US9291821B1 (en) Wide-angle head-up display with three-component combiner
US20230152592A1 (en) Augmented reality display device
US20210021804A1 (en) Method for displaying stereoscopic image and stereoscopic image display apparatus
TWI622805B (zh) Near-eye display method with focusing effect
Kikuta et al. Development of SVGA resolution 128-directional display
CN105137599A (zh) 头戴式显示器及其图像和透射率/反射率确定方法和装置
US20230213772A1 (en) Display systems with collection optics for disparity sensing detectors
US20240061246A1 (en) Light field directional backlighting based three-dimensional (3d) pupil steering
US20230156175A1 (en) Autostereoscopic display device and method
US20220163816A1 (en) Display apparatus for rendering three-dimensional image and method therefor
US11828936B1 (en) Light field display tilting
US20230209032A1 (en) Detection, analysis and correction of disparities in a display system utilizing disparity sensing port
CN117413215A (zh) 双反射器光学部件
US20210294119A1 (en) Display apparatus for rendering three-dimensional image and method therefor
CN109471259B (zh) 近眼显示装置
KR20180026894A (ko) 홀로그램 장치에서 색수차를 보정하는 방법 및 시스템

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20778760

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20778760

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 10/02/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20778760

Country of ref document: EP

Kind code of ref document: A1