WO2019014843A1 - Method for restoring a light field using a lens - Google Patents

Method for restoring a light field using a lens

Info

Publication number
WO2019014843A1
WO2019014843A1 PCT/CN2017/093345
Authority
WO
WIPO (PCT)
Prior art keywords
lens
image
screen
playing
video
Prior art date
Application number
PCT/CN2017/093345
Other languages
English (en)
French (fr)
Inventor
李乔
Original Assignee
辛特科技有限公司
李乔
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 辛特科技有限公司, 李乔 filed Critical 辛特科技有限公司
Priority to CN201780093181.9A priority Critical patent/CN111183634B/zh
Priority to PCT/CN2017/093345 priority patent/WO2019014843A1/zh
Publication of WO2019014843A1 publication Critical patent/WO2019014843A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • The invention relates to the field of light field restoration technology, and in particular to a method for restoring a light field using a lens.
  • However, 3D equipment based on the "polarization" principle cannot solve the dizziness that users experience.
  • In a natural environment, binocular parallax and the eye's focusing system corroborate each other, so the brain knows that these two functions are working in concert.
  • When watching polarization-based 3D images, the focusing system is not engaged, so the brain's two distance-sensing systems disagree with observation in a natural environment; this difference makes the brain very uncomfortable, and a sense of vertigo is created.
  • the industry has introduced a light field theory solution.
  • A representative approach in the field of 3D display is the solution based on light field theory made by Magic Leap.
  • This scheme uses optical fiber scanning technology to realize the light field display.
  • Because the rotation, angle, and light emission of the optical fiber must all be controlled, the fiber is difficult to control.
  • The multi-focus display method proposed by Magic Leap uses an eye detection system to detect the point the eye is observing, then re-renders the picture and adjusts the picture projected to the eye, projecting an image of one depth at a time; it is therefore difficult to achieve complete light field restoration from a single viewing angle, and likewise difficult to restore the light field from different spatial angles.
  • The image playback screen acquires a spatial image with depth information, and the light field of the spatial image is restored by changing the distance between the image playback screen and the lens.
  • The distance between the image playback screen and the lens is changed as follows:
  • the image playback screen and/or the lens is reciprocally moved, and when the distance between the image playback screen and the lens satisfies the depth position of a given play point of the spatial image, the image playback screen plays that play point of the spatial image;
  • when the distance does not satisfy the depth position of a play point of the spatial image, the image playback screen stops playing the spatial image.
  • The image playback screen and the lens are placed in a lens unit wrapped in an opaque material, and a plurality of such lens units are arrayed in the same plane.
  • The interval between the start and stop of playback of a play point of the spatial image is less than 0.4 seconds.
  • The lens is a single lens or a combination of lenses.
  • When the lens is a single lens, the image playback screen and/or the single lens is reciprocally moved within one focal length of the single lens.
  • When the lens is a lens combination, the lens combination includes a first lens at the front of the image playback screen and a second lens farthest from the image playback screen, and the image playback screen and/or the lens combination is reciprocally moved between one focal length and two focal lengths of the first lens.
  • The first lens and the second lens are spaced apart so that the inverted real image formed by the first lens lies within one focal length of the second lens.
  • The image playback screen is arranged parallel to the lens.
  • Alternatively, the image playback screen is arranged at an angle to the lens.
  • The invention provides a method for restoring a light field using a lens.
  • By controlling the position between the image playback screen and the lens, the depth position of the magnified virtual image in space can be changed, so that the depth information of the image is clearly restored.
  • Figure 1 is a schematic view showing the structure of a display wall of the present invention
  • Figure 2 is a schematic view showing the structure of a lens unit of the present invention.
  • Figures 3a-3c are schematic diagrams showing the process of calculating the depth positions of all play points of a spatial image on the image playback screen of the present invention
  • FIG. 4 is a schematic diagram showing changing the distance between a video playing screen and a lens according to an embodiment of the present invention
  • FIG. 5 is a schematic view showing another embodiment of the present invention for changing a distance between a video playing screen and a lens
  • FIG. 6 is a schematic diagram showing the positional relationship between a video playing screen and a lens in some embodiments of the present invention.
  • Virtual reality presents images within the user's field of view as a 3D scene, and the light field of the spatial image with depth information is restored.
  • The afterglow effect, also called persistence of vision, means that when the human eye observes a scene, the light signal takes a short time to reach the brain, and the visual image does not disappear immediately after the light stops.
  • FIG. 1 is a schematic structural view of the display wall of the present invention.
  • Figure 2 shows a structural diagram of a lens unit of the present invention.
  • The present invention uses an image playback screen to restore a spatial image with depth information, through a lens, into a magnified virtual image.
  • a plurality of lens units 110 are arrayed on a plane to form a display wall 100.
  • the lens units 110 of the display wall 100 are spaced apart from each other, and each lens unit 110 incorporates a video playing screen 20 and a lens 10.
  • the video playback screen 20 and the lens 10 are wrapped in the lens unit 110 by the opaque material 111 to prevent image interference in different lens units.
  • Each lens unit plays a spatial image of different viewing angles, and all the spatial images played constitute a complete visual space.
  • A method for restoring a light field using a lens according to the present invention will now be described in detail.
  • A lens is arranged in front of the image playback screen to form a lens unit, and the lens 10 in each lens unit 110 and the image playback screen can move relative to each other.
  • A method for restoring a light field using a lens of the present invention includes:
  • S1: arranging a lens in front of the image playback screen to form a lens unit, a plurality of lens units being arrayed in the same plane to form a display wall.
  • Each lens unit is wrapped by an opaque material 111 to prevent image interference in different lens units.
  • the video playing screen acquires spatial image information with depth information, and the acquired spatial image information includes image information such as color, brightness, plane position, and spatial position of each point.
  • the light field of the spatial image is restored by changing the distance between the image playing screen and the lens, specifically changing the distance between the image playing screen and the lens by the following method:
  • For a spatial position P(x, y, z) in the user's field of view, with the lens fixed, the lens plane at z = 0, the lens center at O(xo, yo, 0), and the lens focal length f, the coordinates of the play point U to be calculated are U(xu, yu, zu), with P, U, and O on the same straight line.
  • The distance v from point P to the lens plane, the distance u from point U to the lens plane, and the focal length f of the lens satisfy the lens imaging formula 1/u + 1/v = 1/f.
  • Because the image here is a virtual image, the image distance is taken as negative, giving u = f·v/(f + v); solving together with the collinearity condition yields the coordinate position of the play point U.
  • the video playback screen stops playing the spatial image.
  • The lens 10 is located in front of the human eye 40; the image playback screen 20 lies within one focal length behind the lens, and the human eye can observe the virtual image 30 formed by the spatial image on the image playback screen 20.
  • The image playback screen has acquired a spatial image 201 with depth information.
  • different depth information of the image 201 includes a play point 2011a (2011b) of the first plane and a play point 2012a (2012b) of the second plane.
  • the play points have different depth coordinates, that is, the coordinate positions of the play points.
  • The number of playback planes is not limited to two; the first plane and the second plane here serve only to explain more clearly the spatial information of the spatial image having depth information.
  • The computer controls the image playback screen 20 and/or the lens 10 to reciprocate according to the depth position of the play point calculated above, that is, the position coordinates of play point 2011a (2011b), until 2011a (2011b) satisfies the calculated depth coordinate.
  • The computer then controls the image playback screen to start playing, and the human eye 40 observes the virtual image of the first-plane play point 2011a (2011b) through the lens 10. At this time, the image playback screen stops playing the play point 2012a (2012b) of the second plane.
  • Likewise, the computer controls the image playback screen 20 and/or the lens 10 to reciprocate until 2012a (2012b) satisfies the calculated depth coordinate; the computer then controls the image playback screen to start playing, and the human eye 40 observes the virtual image of the second-plane play point 2012a (2012b) through the lens 10.
  • the video playback screen stops playing the playback point 2011a (2011b) of the first plane.
  • The play point of the first plane and the play point of the second plane are played cyclically, and the interval between a play point starting and stopping is less than 0.4 seconds; in some embodiments, the interval is less than 0.1 seconds.
  • Because the human eye has persistence of vision, when the play point 2012a (2012b) of the second plane stops playing, the human eye still perceives its presence; when the persistence of vision ends, the play point 2012a (2012b) of the second plane starts playing again, so that the human eye 40 continuously perceives the virtual image of the second-plane play point 2012a (2012b). The light field of the spatial image 201 on the image playback screen 20 is thereby restored, and the dizziness caused by playing all the play points of the spatial image 201 at the same time is eliminated. Moreover, in the process of restoring the light field of the spatial image 201, the depth position of the magnified virtual image in space can be changed, so that the depth information of the image is clearly restored.
  • an embodiment of the present invention changes the distance between the video playing screen and the lens.
  • the lens of this embodiment is a single lens 10a.
  • The single lens is a single convex lens, and the image playback screen 20a is placed within one focal length of the single lens 10a.
  • The human eye 40a observes, through the single lens 10a, the virtual images of all the play points of the spatial image played by the image playback screen 20a.
  • Together, the virtual images of all the play points of the spatial image completely restore the light field of the spatial image.
  • another embodiment of the present invention changes the distance between the video playing screen and the lens.
  • The lens is a lens combination 10b, and the lens combination 10b includes a first lens 10b1 at the front of the image playback screen and a second lens 10b2 farthest from the image playback screen.
  • The first lens 10b1 and the second lens 10b2 are spaced apart, and the image playback screen 20b and/or the lens combination 10b is reciprocally moved between one focal length f1 and two focal lengths 2f1 of the first lens 10b1.
  • As a result, the inverted real image 201b' formed by the first lens 10b1 of the play points of the spatial image 201b with depth information on the image playback screen 20b always moves within one focal length f2 of the second lens 10b2.
  • The human eye 40b observes, through the second lens 10b2, the magnified virtual image of the inverted real image 201b' formed by the first lens 10b1 of the play points of the spatial image 201b; because the spatial image 201b on the image playback screen 20b undergoes two stages of magnification,
  • the light field of the spatial image 201b is restored more clearly.
  • the video playback screen is arranged in parallel with the lens, and the video playback screen and the lens reciprocate in the direction of the main optical axis of the lens.
  • The image playback screen may also be disposed at an angle to the lens. FIG. 6 shows the positional relationship between the image playback screen and the lens in some embodiments of the present invention.
  • In these embodiments, the image playback screen 20c and the lens 10c are arranged at a certain angle; the lens 10c is fixed, and the image playback screen 20c reciprocates in a direction perpendicular to the main optical axis of the lens.
  • The human eye 40c observes through the lens 10c the virtual image of the spatial image with depth information on the image playback screen 20c, completely restoring the light field of the spatial image.
  • The invention provides a method for restoring a light field using a lens, which eliminates the dizziness caused by playing all the play points of a spatial image at the same time; in the process of restoring the light field of the spatial image, the depth position of the magnified virtual image in space can be changed, so that the depth information of the image is clearly restored.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A method for restoring a light field using a lens (10), comprising: arranging the lens (10) in front of an image playback screen (20); the image playback screen (20) acquiring spatial image (201) information containing depth information, and restoring the light field of the spatial image (201) by changing the distance between the image playback screen (20) and the lens (10). In this method for restoring a light field using the lens (10), by controlling the position between the image playback screen (20) and the lens (10), the depth position of the magnified virtual image (30) in space can be changed, so that the depth information of the spatial image (201) is restored.

Description

Method for restoring a light field using a lens
Technical Field
The present invention relates to the field of light field restoration technology, and in particular to a method for restoring a light field using a lens.
Background Art
In 1839, the British scientist 温斯特 discovered a remarkable phenomenon: the distance between a person's two eyes is about 5 cm (the average for Europeans), so when looking at any object the angles of the two eyes do not coincide, that is, there are two viewing angles. This subtle difference in viewing angle, transmitted to the brain via the retinas, makes it possible to distinguish how near or far objects are and produces a strong sense of depth. This is the polarization principle, and almost all 3D imaging technology to date has been developed on the basis of it.
However, 3D equipment based on the "polarization principle" cannot solve the dizziness that users experience. In a natural environment, binocular parallax and the eye's focusing system corroborate each other, so the brain knows that these two functions are working in concert. When a user watches 3D images based on the "polarization principle", the focusing system is not engaged, so the brain's two distance-sensing systems disagree with observation in a natural environment; this difference makes the brain very uncomfortable, and a sense of vertigo arises.
To solve the dizziness problem in 3D video, the industry has introduced solutions based on light field theory. A representative example in the 3D display field is the light-field-theory-based solution made by Magic Leap. However, that solution uses optical fiber scanning technology to realize the light field display, and because the rotation, angle, and light emission of the fiber must all be controlled, it is difficult to control. In addition, the multi-focus display method proposed by Magic Leap uses an eye detection system to detect the point the eye is observing, then re-renders the picture and adjusts the picture projected to the eye, projecting an image of one depth at a time; it is therefore difficult to achieve complete light field restoration from a single viewing angle, and likewise difficult to restore the light field from different spatial angles.
Therefore, to solve the above problems, what is needed is a method for restoring a light field using a lens that, by controlling the position between the image playback screen and the lens, changes the depth position of the magnified virtual image in space and thereby restores the depth information of the image.
Summary of the Invention
An object of the present invention is to provide a method for restoring a light field using a lens, the method comprising: arranging a lens in front of an image playback screen;
the image playback screen acquiring a spatial image with depth information, and restoring the light field of the spatial image by changing the distance between the image playback screen and the lens.
Preferably, the distance between the image playback screen and the lens is changed as follows:
calculating the depth positions of all play points of the spatial image on the image playback screen;
reciprocally moving the image playback screen and/or the lens, and when the distance between the image playback screen and the lens satisfies the depth position of a given play point of the spatial image, the image playback screen playing that play point of the spatial image;
when the distance between the image playback screen and the lens does not satisfy the depth position of a play point of the spatial image, the image playback screen stopping playback of the spatial image.
Preferably, the image playback screen and the lens are placed in a lens unit wrapped in an opaque material, and a plurality of such lens units are arrayed in the same plane.
Preferably, the interval between the start and stop of playback of a play point of the spatial image is less than 0.4 seconds.
Preferably, the lens is a single lens or a lens combination.
Preferably, the lens is a single lens, and the image playback screen and/or the single lens is reciprocally moved within one focal length of the single lens.
Preferably, the lens is a lens combination, the lens combination comprising a first lens at the front of the image playback screen and a second lens farthest from the image playback screen, wherein the image playback screen and/or the lens combination is reciprocally moved between one focal length and two focal lengths of the first lens.
Preferably, the first lens and the second lens are spaced apart so that the inverted real image formed by the first lens lies within one focal length of the second lens.
Preferably, the image playback screen is arranged parallel to the lens.
Preferably, the image playback screen is arranged at an angle to the lens.
The method for restoring a light field using a lens provided by the present invention can, by controlling the position between the image playback screen and the lens, change the depth position of the magnified virtual image in space, so that the depth information of the image is clearly restored.
It should be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claimed invention.
Brief Description of the Drawings
Further objects, functions, and advantages of the present invention will be clarified by the following description of embodiments of the invention with reference to the accompanying drawings, in which:
Figure 1 schematically shows the structure of a display wall of the present invention;
Figure 2 shows the structure of a lens unit of the present invention;
Figures 3a to 3c illustrate the process of calculating the depth positions of all play points of a spatial image on the image playback screen of the present invention;
Figure 4 shows one embodiment of the present invention for changing the distance between the image playback screen and the lens;
Figure 5 shows another embodiment of the present invention for changing the distance between the image playback screen and the lens;
Figure 6 shows the positional relationship between the image playback screen and the lens in some embodiments of the present invention.
Detailed Description of the Embodiments
The objects and functions of the present invention, and the methods for achieving those objects and functions, will be clarified by reference to exemplary embodiments. However, the present invention is not limited to the exemplary embodiments disclosed below; it may be implemented in different forms. The substance of the description is merely to help those skilled in the relevant art gain a comprehensive understanding of the specific details of the invention.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. In the drawings, the same reference numerals denote the same or similar components, or the same or similar steps, unless otherwise stated.
The method for restoring a light field using a lens provided by the present invention is described in detail below with reference to specific embodiments. Virtual reality presents images within the user's field of view as a 3D scene, restoring the light field of a spatial image with depth information into a 3D effect. The afterglow effect, also called persistence of vision, refers to the phenomenon that when the human eye observes a scene, the light signal takes a short time to reach the brain, and the visual image does not disappear immediately after the light stops acting.
To clearly explain the method for restoring a light field using a lens of the present invention, the lens arrangement used to restore the light field is first described. Figure 1 shows the structure of the display wall of the present invention, and Figure 2 shows the structure of a lens unit of the present invention. The present invention uses an image playback screen to restore a spatial image with depth information, through a lens, into a magnified virtual image. According to the present invention, a plurality of lens units 110 are arrayed on a plane to form a display wall 100; the lens units 110 arrayed on the display wall 100 are spaced apart from one another, and each lens unit 110 contains an image playback screen 20 and a lens 10. The image playback screen 20 and the lens 10 are wrapped inside the lens unit 110 by an opaque material 111 to prevent interference between the images in different lens units. Each lens unit plays a spatial image from a different viewing angle, and all the spatial images played together constitute a complete visual space.
The method for restoring a light field using a lens of the present invention is now described in detail. According to the present invention, a lens is arranged in front of the image playback screen to form a lens unit, and the lens 10 in each lens unit 110 and the image playback screen can move relative to each other. The method for restoring a light field using a lens of the present invention comprises:
S1. A lens is arranged in front of the image playback screen to form a lens unit, and a plurality of lens units are arrayed in the same plane to form a display wall. Each lens unit is wrapped in an opaque material 111 to prevent interference between the images in different lens units.
S2. The image playback screen acquires spatial image information with depth information; the acquired spatial image information includes, for each point, image information such as color, brightness, planar position, and spatial position. The light field of the spatial image is restored by changing the distance between the image playback screen and the lens; specifically, the distance between the image playback screen and the lens is changed as follows:
The depth positions of all play points of the spatial image on the image playback screen are calculated, as illustrated in Figures 3a to 3c. As shown in Figure 3a, when a spatial position P(x, y, z) in the scene in the user's field of view is to be restored as the depth position of a play point of the spatial image on the image playback screen, the depth position of the play point corresponding to point P is calculated. Taking point P as an example, the lens is fixed, the lens plane is z = 0, the lens center is at O(xo, yo, 0), and the focal length of the lens is f.
Assume that the play point U to be calculated has coordinates U(xu, yu, zu), and that P, U, and O lie on the same straight line, that is:
(xu - xo) / (x - xo) = (yu - yo) / (y - yo) = zu / z
Further, the distance v from point P to the lens plane, the distance u from point U to the lens plane, and the focal length f of the lens satisfy the lens imaging formula:
1/u + 1/v = 1/f
The image here is a virtual image, so the image distance is taken with a negative sign, which gives:
1/u - 1/v = 1/f, that is, u = f·v / (f + v)
Therefore the position coordinates of the play point U satisfy:
xu - xo = (u/v)(x - xo), yu - yo = (u/v)(y - yo), zu = (u/v)·z, with u/v = f / (f + v) and v = |z|
That is, the coordinate position of the play point U is:
U( xo + f(x - xo)/(f + |z|), yo + f(y - yo)/(f + |z|), f·z/(f + |z|) )
Through the above calculation, when the human eye looks through the lens at a play point on the image playback screen located within one focal length of the lens, it sees only point P, the virtual image of the play point U.
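As a worked illustration of the calculation above, and not part of the patent text itself, the following minimal sketch computes the play point U for a target position P under the stated assumptions (lens plane at z = 0, distances taken as magnitudes); the function name play_point_for is chosen for this example only.

```python
def play_point_for(p, lens_center, f):
    """Compute the screen play point U whose virtual image through a thin lens
    (focal length f, lens plane z = 0, center at lens_center = (xo, yo, 0))
    appears at the target spatial position P = (x, y, z).

    Uses the thin-lens formula 1/u + 1/v = 1/f with a negative image distance
    for the virtual image, i.e. u = f*v / (f + v), and places U on the straight
    line through P and the lens center O.
    """
    x, y, z = p
    xo, yo, _ = lens_center
    v = abs(z)                  # distance of the virtual image P from the lens plane
    u = f * v / (f + v)         # object distance of the play point U (always < f)
    t = u / v                   # scaling factor along the line O-P
    return (xo + t * (x - xo), yo + t * (y - yo), t * z)


# Example: target point 300 units behind the lens plane, focal length 50 units
print(play_point_for((10.0, 5.0, 300.0), (0.0, 0.0, 0.0), 50.0))
```

With f = 50 and the target 300 units behind the lens plane, the computed play point lies about 42.9 units from the lens, inside one focal length, as the description requires.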
S3. The image playback screen and/or the lens is reciprocally moved; when the distance between the image playback screen and the lens satisfies the depth position of a given play point of the spatial image, the image playback screen plays that play point of the spatial image.
When the distance between the image playback screen and the lens does not satisfy the depth position of a play point of the spatial image, the image playback screen stops playing the spatial image.
As shown in Figures 3b and 3c, the lens 10 is located in front of the human eye 40, the image playback screen 20 lies within one focal length behind the lens, and the human eye can observe the virtual image 30 formed by the spatial image on the image playback screen 20. The image playback screen has acquired a spatial image 201 with depth information. In this embodiment, the different depth information of the image 201 includes a play point 2011a (2011b) on a first plane and a play point 2012a (2012b) on a second plane; each play point has a different depth coordinate, that is, a different coordinate position. It should be understood that the number of playback planes is not limited to two; the first and second planes here serve only to explain more clearly the spatial information of a spatial image with depth information.
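As an aside that is not part of the patent text, the per-point information and the grouping of play points into depth planes described above can be pictured with a small data sketch; the class name PlayPoint and its field names are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class PlayPoint:
    """One play point of the spatial image (illustrative layout only)."""
    color: tuple        # (r, g, b) color of the point
    brightness: float   # brightness of the point
    screen_xy: tuple    # planar position on the image playback screen
    depth_mm: float     # depth coordinate of this play point (screen-to-lens distance)

# Two depth planes, as in the embodiment above: each plane groups the play
# points that share one calculated depth coordinate.
first_plane = [PlayPoint((255, 0, 0), 0.9, (1.0, 2.0), 42.0)]    # e.g. points 2011a, 2011b
second_plane = [PlayPoint((0, 255, 0), 0.7, (3.0, 1.5), 45.0)]   # e.g. points 2012a, 2012b
```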
When the human eye is to see the virtual image 30 through the lens 10, the image playback screen 20 and/or the lens 10 is reciprocally moved while the distance between the image playback screen 20 and the lens 10 is kept within one focal length of the lens 10. When the play point 2011a (2011b) of the first plane is to be imaged, the computer controls the image playback screen 20 and/or the lens 10 to reciprocate according to the depth position calculated above, that is, the position coordinates of play point 2011a (2011b); once 2011a (2011b) satisfies the calculated depth coordinate, the computer controls the image playback screen to start playing that play point, and the human eye 40 observes through the lens 10 the virtual image formed by the first-plane play point 2011a (2011b). At this time, the image playback screen stops playing the play point 2012a (2012b) of the second plane.
Likewise, when the play point 2012a (2012b) of the second plane is to be imaged, the computer controls the image playback screen 20 and/or the lens 10 to reciprocate according to the depth position calculated above, that is, the position coordinates of play point 2012a (2012b); once 2012a (2012b) satisfies the calculated depth coordinate, the computer controls the image playback screen to start playing that play point, and the human eye 40 observes through the lens 10 the virtual image formed by the second-plane play point 2012a (2012b). At this time, the image playback screen stops playing the play point 2011a (2011b) of the first plane. In the above process, the play point of the first plane and the play point of the second plane are played cyclically, with the interval between a play point starting and stopping less than 0.4 seconds; in some embodiments, the interval is even less than 0.1 seconds.
Because of the persistence of vision of the human eye, when the first-plane play point 2011a (2011b) stops playing, the human eye still perceives its presence; when the persistence of vision ends, the first-plane play point 2011a (2011b) starts playing again, so that the human eye 40 continuously perceives the virtual image formed by the first-plane play point 2011a (2011b).
Similarly, because of the persistence of vision, when the second-plane play point 2012a (2012b) stops playing, the human eye still perceives its presence; when the persistence of vision ends, the second-plane play point 2012a (2012b) starts playing again, so that the human eye 40 continuously perceives the virtual image formed by the second-plane play point 2012a (2012b). The light field of the spatial image 201 on the image playback screen 20 is thus restored, and the dizziness caused by playing all the play points of the spatial image 201 at the same time is eliminated. Moreover, in the process of restoring the light field of the spatial image 201, the depth position of the magnified virtual image in space can be changed, so that the depth information of the image is clearly restored.
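The time-multiplexed playback just described can be sketched as a simple control loop. This is only an illustration under assumptions: set_screen_distance, show_points, and blank_screen stand for hypothetical hardware hooks, and the per-plane interval is kept well below the 0.4 second bound (10 ms here) so that persistence of vision fuses the planes.

```python
import time

# Hypothetical hardware hooks; real control of the actuator and the screen
# would depend entirely on the device in question.
def set_screen_distance(d_mm): ...   # move screen/lens to the separation d_mm
def show_points(points): ...         # light only the given play points
def blank_screen(): ...              # stop playing (screen dark)

def restore_light_field(planes, interval_s=0.01):
    """Cycle through depth planes; each plane is (screen_distance_mm, play_points).

    A plane is shown only while the screen-to-lens distance matches its
    calculated depth position; the on/off interval per plane stays well
    below 0.4 s so persistence of vision fuses all planes into one image.
    """
    while True:
        for distance_mm, points in planes:
            set_screen_distance(distance_mm)   # distance must satisfy this plane's depth position
            show_points(points)                # play only this plane's play points
            time.sleep(interval_s)
            blank_screen()                     # other planes stay off while moving on
```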
Figure 4 is a schematic view of one embodiment of the present invention for changing the distance between the image playback screen and the lens. In this embodiment the lens is a single lens 10a, meaning a single convex lens, and the image playback screen 20a is placed within one focal length f of the single lens 10a. The human eye 40a observes through the single lens 10a the virtual images of all the play points of the spatial image played by the image playback screen 20a, and these virtual images completely restore the light field of the spatial image.
Figure 5 is a schematic view of another embodiment of the present invention for changing the distance between the image playback screen and the lens. In this embodiment the lens is a lens combination 10b, which includes a first lens 10b1 at the front of the image playback screen and a second lens 10b2 farthest from the image playback screen, wherein:
The first lens 10b1 and the second lens 10b2 are spaced apart, and the image playback screen 20b and/or the lens combination 10b is reciprocally moved between one focal length f1 and two focal lengths 2f1 of the first lens 10b1, so that the inverted real image 201b' formed by the first lens 10b1 of the play points of the spatial image 201b with depth information on the image playback screen 20b always moves within one focal length f2 of the second lens 10b2. The human eye 40b observes through the second lens 10b2 a magnified virtual image of the inverted real image 201b' formed by the first lens 10b1 of the play points of the spatial image 201b; because the spatial image 201b on the image playback screen 20b is magnified in two stages, the light field of the spatial image 201b is restored more clearly.
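To illustrate the two-lens geometry described above, and again not as part of the patent text, the following sketch checks for assumed example values that the inverted real image formed by the first lens falls within one focal length of the second lens; the function name two_lens_check and the numerical values are illustrative.

```python
def two_lens_check(d1, f1, f2, separation):
    """Verify the two-lens geometry described above.

    d1: distance from the screen play point to the first lens (f1 < d1 < 2*f1)
    f1, f2: focal lengths of the first and second lens
    separation: distance between the two lenses

    Returns (real_image_distance_from_lens1, distance_from_lens2, ok), where
    ok is True when the inverted real image formed by the first lens lies
    within one focal length of the second lens, so that the second lens can
    form a magnified virtual image of it.
    """
    if not (f1 < d1 < 2 * f1):
        raise ValueError("screen must lie between f1 and 2*f1 of the first lens")
    v1 = f1 * d1 / (d1 - f1)   # thin-lens formula 1/d1 + 1/v1 = 1/f1 gives a real image beyond 2*f1
    d2 = separation - v1       # object distance of that real image for the second lens
    return v1, d2, 0 < d2 < f2


# Example with illustrative numbers: f1 = 30, f2 = 40, lenses 120 units apart
print(two_lens_check(d1=45.0, f1=30.0, f2=40.0, separation=120.0))
```

With f1 = 30, f2 = 40, and a lens separation of 120, a play point 45 units from the first lens forms a real image 90 units behind it, that is 30 units in front of the second lens, which is inside f2 as required.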
In the above embodiments, the image playback screen is arranged parallel to the lens, and the image playback screen and the lens reciprocate along the direction of the main optical axis of the lens. In some embodiments, the image playback screen is arranged at an angle to the lens, as shown in Figure 6, which illustrates the positional relationship between the image playback screen and the lens in some embodiments of the present invention. In these embodiments the image playback screen 20c is arranged at a certain angle to the lens 10c; the lens 10c is fixed, the image playback screen 20c reciprocates in a direction perpendicular to the main optical axis of the lens, and the human eye 40c observes through the lens 10c the virtual image of the spatial image with depth information on the image playback screen 20c, completely restoring the light field of the spatial image.
The method for restoring a light field using a lens provided by the present invention eliminates the dizziness caused by playing all the play points of a spatial image at the same time, and in the process of restoring the light field of the spatial image, the depth position of the magnified virtual image in space can be changed, so that the depth information of the image is clearly restored.
Other embodiments of the present invention will be readily apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. The specification and examples are to be regarded as exemplary only, with the true scope and spirit of the invention being defined by the claims.

Claims (10)

  1. A method for restoring a light field using a lens, characterized in that the method comprises: arranging a lens in front of an image playback screen;
    the image playback screen acquiring spatial image information containing depth information, and restoring the light field of the spatial image by changing the distance between the image playback screen and the lens.
  2. The method according to claim 1, characterized in that the distance between the image playback screen and the lens is changed as follows:
    calculating the depth positions of all play points of the spatial image on the image playback screen;
    reciprocally moving the image playback screen and/or the lens, and when the distance between the image playback screen and the lens satisfies the depth position of a given play point of the spatial image, the image playback screen playing that play point of the spatial image;
    when the distance between the image playback screen and the lens does not satisfy the depth position of a play point of the spatial image, the image playback screen stopping playback of the spatial image.
  3. The method according to claim 1 or 2, characterized in that the image playback screen and the lens are placed in a lens unit wrapped in an opaque material, and a plurality of such lens units are arrayed in the same plane.
  4. The method according to claim 2, characterized in that the interval between the start and stop of playback of a play point of the spatial image is less than 0.4 seconds.
  5. The method according to claim 1, 2, or 3, characterized in that the lens is a single lens or a lens combination.
  6. The method according to claim 5, characterized in that the lens is a single lens, and the image playback screen and/or the single lens is reciprocally moved within one focal length of the single lens.
  7. The method according to claim 5, characterized in that the lens is a lens combination, the lens combination comprising a first lens at the front of the image playback screen and a second lens farthest from the image playback screen, wherein
    the image playback screen and/or the lens combination is reciprocally moved between one focal length and two focal lengths of the first lens.
  8. The method according to claim 7, characterized in that the first lens and the second lens are spaced apart so that the inverted real image formed by the first lens lies within one focal length of the second lens.
  9. The method according to claim 1, characterized in that the image playback screen is arranged parallel to the lens.
  10. The method according to claim 1, characterized in that the image playback screen is arranged at an angle to the lens.
PCT/CN2017/093345 2017-07-18 2017-07-18 Method for restoring a light field using a lens WO2019014843A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780093181.9A 2017-07-18 2017-07-18 Method for restoring a light field using a lens
PCT/CN2017/093345 2017-07-18 2017-07-18 Method for restoring a light field using a lens

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/093345 2017-07-18 2017-07-18 Method for restoring a light field using a lens

Publications (1)

Publication Number Publication Date
WO2019014843A1 true WO2019014843A1 (zh) 2019-01-24

Family

ID=65014858

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/093345 WO2019014843A1 (zh) 2017-07-18 2017-07-18 Method for restoring a light field using a lens

Country Status (2)

Country Link
CN (1) CN111183634B (zh)
WO (1) WO2019014843A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060152580A1 (en) * 2005-01-07 2006-07-13 Synthosys, Llc Auto-stereoscopic volumetric imaging system and method
CN103353677A (zh) * 2013-06-28 2013-10-16 北京智谷睿拓技术服务有限公司 Imaging device and method
CN104410784A (zh) * 2014-11-06 2015-03-11 北京智谷技术服务有限公司 Light field acquisition control method and device
CN104469110A (zh) * 2014-11-26 2015-03-25 西北工业大学 Light field acquisition device with a variable number of angular samples
CN106254857A (zh) * 2015-12-31 2016-12-21 北京智谷睿拓技术服务有限公司 Light field display control method and device, and light field display apparatus
CN106375694A (zh) * 2015-12-31 2017-02-01 北京智谷睿拓技术服务有限公司 Light field display control method and device, and light field display apparatus

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7077523B2 (en) * 2004-02-13 2006-07-18 Angstorm Inc. Three-dimensional display using variable focusing lens
EP1751499B1 (en) * 2004-06-03 2012-04-04 Making Virtual Solid, L.L.C. En-route navigation display method and apparatus using head-up display
CN101038371A (zh) * 2006-12-07 2007-09-19 四川大学 Volumetric three-dimensional display device using variable-focus lens glasses
CN100483184C (zh) * 2007-05-29 2009-04-29 东南大学 Variable-focus lens three-dimensional display
CN201765418U (zh) * 2010-08-19 2011-03-16 华映视讯(吴江)有限公司 Naked-eye stereoscopic display device
CN102497571B (zh) * 2011-12-25 2013-10-23 吉林大学 Method for improving the display resolution of combined stereoscopic images using synchronous time-division multiplexing
KR101598049B1 (ko) * 2012-04-03 2016-03-07 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Autostereoscopic screen and method for reproducing a 3D image
CN103995356B (zh) * 2014-05-30 2016-01-20 北京理工大学 Light field head-mounted display device with a true sense of depth
CN105988228B (zh) * 2015-02-13 2020-07-31 北京三星通信技术研究有限公司 Three-dimensional display device and three-dimensional display method thereof
CN105068659A (zh) * 2015-09-01 2015-11-18 陈科枫 Augmented reality system
EP3179289B1 (en) * 2015-12-08 2021-08-11 Facebook Technologies, LLC Focus adjusting virtual reality headset
CN106254858B (zh) * 2015-12-31 2018-05-04 北京智谷睿拓技术服务有限公司 Light field display control method and device, and light field display apparatus

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060152580A1 (en) * 2005-01-07 2006-07-13 Synthosys, Llc Auto-stereoscopic volumetric imaging system and method
CN103353677A (zh) * 2013-06-28 2013-10-16 北京智谷睿拓技术服务有限公司 Imaging device and method
CN104410784A (zh) * 2014-11-06 2015-03-11 北京智谷技术服务有限公司 Light field acquisition control method and device
CN104469110A (zh) * 2014-11-26 2015-03-25 西北工业大学 Light field acquisition device with a variable number of angular samples
CN106254857A (zh) * 2015-12-31 2016-12-21 北京智谷睿拓技术服务有限公司 Light field display control method and device, and light field display apparatus
CN106375694A (zh) * 2015-12-31 2017-02-01 北京智谷睿拓技术服务有限公司 Light field display control method and device, and light field display apparatus

Also Published As

Publication number Publication date
CN111183634A (zh) 2020-05-19
CN111183634B (zh) 2022-01-04

Similar Documents

Publication Publication Date Title
CN106791784B (zh) 一种虚实重合的增强现实显示方法和装置
JP3802630B2 (ja) 立体画像生成装置および立体画像生成方法
JP2004309930A (ja) 立体観察システム
US20060132915A1 (en) Visual interfacing apparatus for providing mixed multiple stereo images
KR101313797B1 (ko) 머리 위치 추적을 이용한 입체 영상 표시 장치 및 이의 동작 방법
WO2018056155A1 (ja) 情報処理装置、画像生成方法およびヘッドマウントディスプレイ
JP2000354257A (ja) 画像処理装置、画像処理方法、およびプログラム提供媒体
JP2017204674A (ja) 撮像装置、ヘッドマウントディスプレイ、情報処理システム、および情報処理方法
RU2689971C2 (ru) Стереоскопическое визуализирующее устройство
CN105068659A (zh) 一种增强现实系统
JP2022183177A (ja) ヘッドマウントディスプレイ装置
JP2005312605A (ja) 注視点位置表示装置
JPH07129792A (ja) 画像処理方法および画像処理装置
CN110794590B (zh) 虚拟现实显示系统及其显示方法
Pietrzak et al. Three-dimensional visualization in laparoscopic surgery.
JPH0685590B2 (ja) 立体表示システム
CN104216126A (zh) 一种变焦3d显示技术
JP3425402B2 (ja) 立体画像を表示する装置および方法
WO2019014843A1 (zh) 一种利用透镜还原光场的方法
JPH02291787A (ja) 広視野表示装置
CN211786414U (zh) 虚拟现实显示系统
JP2003348622A (ja) 立体映像表示方法及び記憶媒体
JPWO2017191703A1 (ja) 画像処理装置
US20060152580A1 (en) Auto-stereoscopic volumetric imaging system and method
RU2609285C9 (ru) Способ формирования многопланового изображения и мультифокальный стереоскопический дисплей

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17918434

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17918434

Country of ref document: EP

Kind code of ref document: A1