WO2014146619A1 - A display device and visual display method for simulating a holographic 3D scene - Google Patents

A display device and visual display method for simulating a holographic 3D scene

Info

Publication number
WO2014146619A1
WO2014146619A1 (PCT/CN2014/075951)
Authority
WO
WIPO (PCT)
Prior art keywords
scene
view
human eye
display screen
camera
Prior art date
Application number
PCT/CN2014/075951
Other languages
English (en)
French (fr)
Inventor
刘美鸿
高炜
Original Assignee
深圳市亿思达显示科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市亿思达显示科技有限公司
Priority to JP2015518844A priority Critical patent/JP2015528234A/ja
Priority to EP14770997.6A priority patent/EP2978217A1/en
Priority to KR1020157027103A priority patent/KR20160042808A/ko
Priority to US14/415,603 priority patent/US9983546B2/en
Publication of WO2014146619A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/0005Adaptation of holography to specific applications
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/04Processes or apparatus for producing holograms
    • G03H1/08Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
    • G03H1/0866Digital holographic imaging, i.e. synthesizing holobjects from holograms
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/0005Adaptation of holography to specific applications
    • G03H2001/0088Adaptation of holography to specific applications for video-holography, i.e. integrating hologram acquisition, transmission and display

Definitions

  • the present invention relates to the field of display technologies, and in particular, to a display device and a visual display method that mimic a holographic 3D scene.
  • the existing multi-view 3D televisions and multi-view lenticular craft pictures allow multi-view television programs to be viewed from multiple viewpoints within a certain narrow range.
  • the disadvantage is that the number of view channels has a limit, generally 8 viewpoints (or slightly more), and the stereoscopic content must be shot with a rig of 8 cameras and then reconstructed.
  • the present invention provides a display device and a visual display method that mimic a holographic 3D scene.
  • the invention mainly provides a display device and a visual display method for simulating a holographic 3D scene: the front 3D dual camera captures 3D position information of the human eye, the 3D position information is used to calculate the spatial coordinates of the human eye, and the spatial coordinates are then used to control the 3D scene generation unit to render and acquire the corresponding left-view 3D scene and right-view 3D scene and to control the working angle of the electronic grating, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person.
  • a technical solution adopted by the present invention is to provide a display device that simulates a holographic 3D scene, including:
  • the first front camera is disposed at an upper left corner of the display screen
  • the second front camera is disposed at an upper right corner of the display screen
  • the first front camera and the second front camera are used to capture 3D position information of the human eye
  • the 3D human eye tracking algorithm processing unit is electrically connected to the first front camera and the second front camera, and is configured to generate, according to the human eye 3D position information, a first signal that controls the 3D scene generating unit to generate the 3D scenes corresponding to the left and right view channels and a second signal that controls the working angle of the electronic grating;
  • the 3D scene generating unit is electrically connected to the display screen, and is configured to receive the first signal, render and acquire the corresponding left-view 3D scene and right-view 3D scene according to the first signal, and send the left-view 3D scene and the right-view 3D scene to the display screen for display;
  • the electronic grating is for receiving the second signal and adjusting the working angle according to the second signal such that the left-view 3D scene is incident into the left eye of the person, and the right-view 3D scene is incident into the right eye of the person.
  • the first front camera is configured to acquire a left eye format picture of a human eye position
  • the second camera is configured to acquire a right eye format picture of a human eye position
  • the 3D human eye tracking algorithm processing unit calculates, from the difference in the position of the human eye between the left-eye format picture and the right-eye format picture, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
  • the origin O of the spatial Cartesian coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial Cartesian coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial Cartesian coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
  • the first signal is a signal containing the parameters α and β
  • the second signal is a signal containing α
  • the 3D scene generation unit controls the OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
  • the parameter x is a preset parameter related to the depth of field.
  • the 3D scene generation unit controls the DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
  • the parameter x is a preset parameter related to the depth of field.
  • the electronic grating comprises: a first glass plate, a second glass plate, a liquid crystal layer and a control unit, wherein a first polarizing plate is provided on the first surface of the first glass plate, and a first ITO conductive layer is provided on the second surface of the first glass plate, facing away from the first polarizing plate
  • a second polarizing plate is provided on the first surface of the second glass plate, the polarizing directions of the first polarizing plate and the second polarizing plate are perpendicular, and a second ITO conductive layer is provided on the second surface of the second glass plate, facing away from the second polarizing plate
  • the second ITO conductive layer comprises a plurality of equally spaced ITO electrodes and insulating black strips disposed between adjacent ITO electrodes
  • the liquid crystal layer is sandwiched between the first ITO conductive layer and the second ITO conductive layer
  • the control unit is configured to control, according to the second signal, the on/off of the alternating voltage between the first ITO conductive layer and the respective ITO electrodes, so that the positions of the light and dark stripes of the grating change adaptively to follow the position of the human eye
  • the display device is preferably a computer.
  • the display device is preferably a mobile phone.
  • another technical solution provided by the present invention is to provide a visual display method for simulating a holographic 3D scene, including:
  • the 3D human eye tracking algorithm processing unit calculates the spatial coordinates of the human eye according to the 3D position information of the human eye
  • the step of calculating, by using the 3D human eye tracking algorithm processing unit, the spatial coordinates of the human eye according to the 3D position information of the human eye includes:
  • the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye 3D position information captured by the front 3D dual camera, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
  • the origin O of the spatial Cartesian coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial Cartesian coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial Cartesian coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
  • the step of rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene and sending them to the display screen for display includes:
  • using the 3D scene generation unit to control the OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
  • the parameter x is a preset parameter related to the depth of field.
  • the step of rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene and sending them to the display screen for display includes:
  • using the 3D scene generation unit to control the DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
  • the parameter x is a preset parameter related to the depth of field.
  • the invention has the beneficial effects that, different from the prior art, the display device and the visual display method for simulating a holographic 3D scene provided by the present invention capture the 3D position information of the human eye through the front 3D dual camera and calculate the spatial coordinates of the human eye from that information,
  • and then use the human eye spatial coordinates so that the 3D human eye tracking algorithm processing unit generates the first signal, which controls the 3D scene generating unit to generate the 3D scenes corresponding to the left and right view channels, and the second signal, which controls the working angle of the electronic grating,
  • so that the left-view 3D scene enters the person's left eye and the right-view 3D scene enters the person's right eye. Therefore, the present invention can achieve the same effect of seeing different stereoscopic images from different angles as in reality, and can realize a holographic display of an object or a scene.
  • FIG. 1 is a schematic view showing the working principle of a display device emulating a holographic 3D scene of the present invention
  • FIGS. 2A-2B are schematic structural views of preferred embodiments of a display device for a holographic 3D scene of the present invention.
  • FIG. 3 is a flow chart showing a visual display method of a holographic 3D scene of the present invention.
  • FIG. 1 is a schematic diagram showing the working principle of a display device emulating a holographic 3D scene according to the present invention.
  • the display device emulating the holographic 3D scene of the present invention comprises: a display screen 11, a front 3D dual camera 12 (including a first front camera and a second front camera), a 3D human eye tracking algorithm processing unit 13, a 3D scene generation unit 14 and an electronic grating 15.
  • the display screen 11 described in this embodiment may be an LCD display screen or a TFT display screen, which is not limited thereto.
  • FIG. 2A is a schematic structural diagram of a first preferred embodiment of a display device for a holographic 3D scene according to the present invention.
  • the first front camera is disposed at the upper left corner of the display screen 11.
  • the second front camera is disposed at the upper right corner of the display screen 11, and the first front camera and the second front camera are used to capture human eye 3D position information; the 3D human eye tracking algorithm processing unit 13 is electrically connected to the first front camera and the second front camera, and is configured to generate, according to the human eye 3D position information, a first signal that controls the 3D scene generating unit 14 to generate the 3D scenes corresponding to the left and right view channels and a second signal that controls the working angle of the electronic grating 15;
  • the 3D scene generating unit 14 is electrically connected to the display screen 11 and is configured to receive the first signal, render and acquire a corresponding left-view 3D scene and a right-view 3D scene according to the first signal, and send the left-view 3D scene and the right-view 3D scene to the display screen 11 for display;
  • the electronic grating 15 is configured to receive the second signal and adjust the working angle according to the second signal, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person.
  • the first front camera is used to acquire a left eye format picture of a human eye position
  • the second camera is used to acquire a right eye format picture of a human eye position
  • the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye position between the left-eye format picture and the right-eye format picture, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point O, as well as the angle β between that line and its projection,
  • the origin O of the spatial rectangular coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen 11, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen 11.
  • the first signal is a signal containing the parameters α and β
  • the second signal is a signal containing α
  • the 3D scene generation unit controls the OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene,
  • and then moves the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene, wherein the parameter x is a preset parameter related to the depth of field.
  • FIG. 2B is a schematic structural view of a second preferred embodiment of a display device for a holographic 3D scene of the present invention.
  • FIG. 2B differs from FIG. 2A in that the display screen of FIG. 2A is a portrait (vertical-screen) display, with the corresponding front 3D dual cameras preferably disposed at the upper left corner and the upper right corner of the vertical display screen 11, whereas the display screen of FIG. 2B is a landscape (horizontal-screen) display,
  • with the corresponding front 3D dual cameras preferably disposed at the upper right corner and the lower right corner of the display screen 11, and the spatial rectangular coordinate system O-XYZ is adjusted adaptively according to the display orientation of the display screen.
  • the spatial Cartesian coordinate system O-XYZ can be established according to actual needs, and is not limited to the above embodiments.
  • the 3D scene generation unit controls the DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene, wherein the parameter x is a preset parameter related to the depth of field.
  • of course, in other preferred embodiments of the present invention, the 3D scene generating unit 14 may also use other types of scene rendering cameras to perform the scene rendering functions of the OpenGL 3D scene rendering camera and the DirectX 3D scene rendering camera, which is not limited by the present invention.
  • the electronic grating 15 includes: a first glass plate (not shown), a second glass plate (not shown), a liquid crystal layer (not labeled) and a control unit 153; a first polarizing plate (not shown) is disposed on the first surface of the first glass plate, and a first ITO conductive layer 151 is disposed on the second surface of the first glass plate, facing away from the first polarizing plate; a second polarizing plate (not shown) is disposed on the first surface of the second glass plate, the polarizing directions of the first polarizing plate and the second polarizing plate are perpendicular, and a second ITO conductive layer 152 is disposed on the second surface of the second glass plate, facing away from the second polarizing plate;
  • the second ITO conductive layer 152 includes a plurality of equally spaced ITO electrodes and insulating black strips disposed between adjacent ITO electrodes;
  • the liquid crystal layer is interposed between the first ITO conductive layer 151 and the second ITO conductive layer 152, and the control unit is configured to control, according to the second signal, the on/off of the alternating voltage between the first ITO conductive layer 151 and the respective ITO electrodes, so that the positions of the light and dark stripes of the grating change adaptively to follow the position of the human eye, whereby the left-view 3D scene displayed by the display screen 11 is incident into the left eye of the person via the electronic grating 15 and the right-view 3D scene displayed by the display screen 11 is incident into the right eye of the person via the electronic grating 15.
  • the positions of the first ITO conductive layer 151 and the second ITO conductive layer 152 may be interchanged, which also falls within the scope of the present invention.
  • a preferred embodiment of the display device of the holographic 3D scene of the present invention is a computer.
  • another preferred embodiment provided by the present invention is a mobile phone.
  • in other embodiments of the present invention, the display device of the holographic 3D scene may also be another display device.
  • FIG. 3 is a schematic flow chart of a visual display method for a holographic 3D scene according to the present invention, including: S1, capturing human eye 3D position information with the front 3D dual camera of the display screen; S2, calculating, with the 3D human eye tracking algorithm processing unit, the spatial coordinates of the human eye according to the human eye 3D position information; S3, rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye, and sending them to the display screen for display; S4, adjusting the working angle of the electronic grating according to the spatial coordinates of the human eye, so that the left-view 3D scene displayed by the display screen is incident into the left eye of the person via the electronic grating and the right-view 3D scene displayed by the display screen is incident into the right eye of the person via the electronic grating.
  • step S2 comprises:
  • the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye 3D position information captured by the front 3D dual camera, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
  • the origin O of the spatial Cartesian coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial Cartesian coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial Cartesian coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
  • step S3 includes:
  • the 3D scene generation unit is used to control the OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then to control the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then to move the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
  • the parameter x is a preset parameter related to the depth of field.
  • in another preferred embodiment of the present invention, step S3 includes:
  • using the 3D scene generation unit to control the DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
  • the parameter x is a preset parameter related to the depth of field.
  • in other embodiments of the present invention, other scene rendering cameras may be used in step S3 to perform the functions of the OpenGL 3D scene rendering camera and the DirectX 3D scene rendering camera, which is not limited by the present invention.
  • in the above embodiments of the visual display method of the present invention, the display screen is a portrait (vertical-screen) display
  • and the corresponding front 3D dual cameras are respectively disposed at the upper left corner and the upper right corner of the vertical display screen,
  • the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen,
  • and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
  • in another preferred embodiment of the present invention, the display screen is a landscape (horizontal-screen) display
  • and the corresponding front 3D dual cameras are respectively disposed at the upper right corner and the lower right corner of the horizontal display screen,
  • the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the upper and lower opposite sides of the display screen,
  • and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the upper and lower opposite sides of the display screen. The first front camera and the second front camera may of course also be placed at other positions, depending on the specific application, which is not limited by the present invention.
  • the beneficial effects of the present invention are: different from the prior art, the display device and the visual display method for simulating a holographic 3D scene provided by the present invention capture the 3D position information of the human eye through the front 3D dual camera, calculate the spatial coordinates of the human eye according to the human eye 3D position information, and then use the human eye spatial coordinates so that the 3D human eye tracking algorithm processing unit generates the first signal, which controls the 3D scene generating unit to generate the 3D scenes corresponding to the left and right view channels, and the second signal, which controls the working angle of the electronic grating, so that the left-view 3D scene enters the person's left eye and the right-view 3D scene enters the person's right eye. The present invention can therefore achieve the same effect of seeing different stereoscopic images from different angles as in reality, and can realize a holographic display of an object or a scene.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention provides a display device and a visual display method that simulate a 3D scene. The display device comprises: a display screen; a front 3D dual camera for capturing 3D position information of the human eye; a 3D human eye tracking algorithm processing unit for generating a first signal and a second signal according to the human eye 3D position information; a 3D scene generating unit for receiving the first signal and rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the first signal; and an electronic grating for receiving the second signal and adjusting its working angle according to the second signal, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person. In this way, the display device and visual display method simulating a 3D scene provided by the present invention can achieve the same effect as in reality of seeing different stereoscopic images from different angles, and can realize a holographic display of an object or a scene.

Description

A display device and visual display method for simulating a holographic 3D scene
【Technical Field】
The present invention relates to the field of display technologies, and in particular to a display device and a visual display method that simulate a holographic 3D scene.
【Background Art】
Existing multi-view 3D televisions and multi-view craft pictures allow multi-view television programs to be viewed from multiple viewpoints within a certain narrow range. Their disadvantage is that the number of view channels has a limit, generally 8 viewpoints (or slightly more), and the stereoscopic pictures must be shot with 8 cameras during filming and then reconstructed.
With the development of modern science, all kinds of equipment are becoming miniaturized and more precise, yet current display devices cannot match this trend, and people increasingly need a new display technology to solve this problem.
To solve the above technical problem, the present invention provides a display device and a visual display method that simulate a holographic 3D scene.
【Summary of the Invention】
The present invention mainly provides a display device and a visual display method for simulating a holographic 3D scene: a front 3D dual camera captures 3D position information of the human eye, the human eye 3D position information is used to calculate the spatial coordinates of the human eye, and the spatial coordinates are then used to control a 3D scene generating unit to render and acquire the corresponding left-view 3D scene and right-view 3D scene and to control the working angle of an electronic grating, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person.
To solve the above technical problem, one technical solution adopted by the present invention is to provide a display device that simulates a holographic 3D scene, comprising:
a display screen;
a first front camera and a second front camera, the first front camera being disposed at the upper left corner of the display screen, the second front camera being disposed at the upper right corner of the display screen, the first front camera and the second front camera being used to capture 3D position information of the human eye;
a 3D human eye tracking algorithm processing unit, electrically connected to the first front camera and the second front camera, for generating, according to the human eye 3D position information, a first signal that controls a 3D scene generating unit to generate the 3D scenes corresponding to the left and right view channels and a second signal that controls the working angle of an electronic grating;
the 3D scene generating unit, electrically connected to the display screen, for receiving the first signal, rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the first signal, and sending the left-view 3D scene and the right-view 3D scene to the display screen for display;
the electronic grating, for receiving the second signal and adjusting its working angle according to the second signal, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person.
The first front camera is used to acquire a left-eye format picture of the human eye position, the second camera is used to acquire a right-eye format picture of the human eye position, and the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye position between the left-eye format picture and the right-eye format picture, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
wherein the origin O of the spatial rectangular coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
The first signal is a signal containing the parameters α and β, and the second signal is a signal containing α.
The 3D scene generating unit controls an OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
wherein the parameter x is a preset parameter related to the depth of field.
The 3D scene generating unit controls a DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
wherein the parameter x is a preset parameter related to the depth of field.
The electronic grating comprises: a first glass plate, a second glass plate, a liquid crystal layer and a control unit. A first polarizing plate is provided on the first surface of the first glass plate, and a first ITO conductive layer is provided on the second surface of the first glass plate, facing away from the first polarizing plate; a second polarizing plate is provided on the first surface of the second glass plate, the polarizing directions of the first polarizing plate and the second polarizing plate are perpendicular, and a second ITO conductive layer is provided on the second surface of the second glass plate, facing away from the second polarizing plate. The second ITO conductive layer comprises a plurality of equally spaced ITO electrodes and insulating black strips disposed between adjacent ITO electrodes, and the liquid crystal layer is sandwiched between the first ITO conductive layer and the second ITO conductive layer. The control unit is used to control, according to the second signal, the on/off of the alternating voltage between the first ITO conductive layer and each ITO electrode, so that the positions of the light and dark stripes of the grating change adaptively to follow the position of the human eye, whereby the left-view 3D scene displayed by the display screen is incident into the left eye of the person via the electronic grating and the right-view 3D scene displayed by the display screen is incident into the right eye of the person via the electronic grating.
The display device described in any of the above is preferably a computer.
The display device described in any of the above is preferably a mobile phone.
To solve the above technical problem, another technical solution provided by the present invention is to provide a visual display method for simulating a holographic 3D scene, comprising:
capturing human eye 3D position information with the front 3D dual camera of the display screen;
calculating, with a 3D human eye tracking algorithm processing unit, the spatial coordinates of the human eye according to the human eye 3D position information;
rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye, and sending the left-view 3D scene and the right-view 3D scene to the display screen for display;
adjusting the working angle of the electronic grating according to the spatial coordinates of the human eye, so that the left-view 3D scene displayed by the display screen is incident into the left eye of the person via the electronic grating and the right-view 3D scene displayed by the display screen is incident into the right eye of the person via the electronic grating.
The step of calculating, with the 3D human eye tracking algorithm processing unit, the spatial coordinates of the human eye according to the human eye 3D position information includes:
the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye 3D position information captured by the front 3D dual camera, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
wherein the origin O of the spatial rectangular coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
The step of rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye and sending the left-view 3D scene and the right-view 3D scene to the display screen for display includes:
using the 3D scene generating unit to control the OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
wherein the parameter x is a preset parameter related to the depth of field.
The step of rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye and sending the left-view 3D scene and the right-view 3D scene to the display screen for display includes:
using the 3D scene generating unit to control the DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
wherein the parameter x is a preset parameter related to the depth of field.
The beneficial effects of the present invention are: different from the prior art, the display device and the visual display method for simulating a holographic 3D scene provided by the present invention capture the 3D position information of the human eye through the front 3D dual camera, calculate the spatial coordinates of the human eye according to the human eye 3D position information, and then use the human eye spatial coordinates so that the 3D human eye tracking algorithm processing unit generates the first signal, which controls the 3D scene generating unit to generate the 3D scenes corresponding to the left and right view channels, and the second signal, which controls the working angle of the electronic grating, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person. The present invention can therefore achieve the same effect of seeing different stereoscopic images from different angles as in reality, and can realize a holographic display of an object or a scene.
【Brief Description of the Drawings】
FIG. 1 is a schematic diagram of the working principle of a display device simulating a holographic 3D scene according to the present invention;
FIGS. 2A-2B are schematic structural diagrams of preferred embodiments of a display device for a holographic 3D scene according to the present invention;
FIG. 3 is a schematic flow chart of a visual display method for a holographic 3D scene according to the present invention.
【Detailed Description of the Embodiments】
Referring to FIG. 1, FIG. 1 is a schematic diagram of the working principle of a display device simulating a holographic 3D scene according to the present invention. As shown in FIG. 1, the display device simulating a holographic 3D scene of the present invention comprises: a display screen 11, a front 3D dual camera 12 (including a first front camera and a second front camera), a 3D human eye tracking algorithm processing unit 13, a 3D scene generation unit 14 and an electronic grating 15. The display screen 11 described in this embodiment may be an LCD display screen or a TFT display screen, which is not limited here.
Referring to FIG. 2A, FIG. 2A is a schematic structural diagram of a first preferred embodiment of a display device for a holographic 3D scene according to the present invention. As shown in FIG. 2A, the first front camera is disposed at the upper left corner of the display screen 11 and the second front camera is disposed at the upper right corner of the display screen 11; the first front camera and the second front camera are used to capture human eye 3D position information. The 3D human eye tracking algorithm processing unit 13 is electrically connected to the first front camera and the second front camera, and is used to generate, according to the human eye 3D position information, a first signal that controls the 3D scene generation unit 14 to generate the 3D scenes corresponding to the left and right view channels and a second signal that controls the working angle of the electronic grating 15. The 3D scene generation unit 14 is electrically connected to the display screen 11, and is used to receive the first signal, render and acquire the corresponding left-view 3D scene and right-view 3D scene according to the first signal, and send the left-view 3D scene and the right-view 3D scene to the display screen 11 for display. The electronic grating 15 is used to receive the second signal and adjust its working angle according to the second signal, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person.
In the embodiments of the present invention, as long as the human eyes move within the display range of the display device and, at the same time, within the capture range of the front 3D dual camera, the visual effect of a holographic 3D scene can be obtained.
Continuing to refer to FIG. 2A, in this embodiment the first front camera is used to acquire a left-eye format picture of the human eye position and the second camera is used to acquire a right-eye format picture of the human eye position. The 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye position between the left-eye format picture and the right-eye format picture, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point O, as well as the angle β between that line and its projection,
wherein the origin O of the spatial rectangular coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen 11, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen 11.
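The patent describes this geometry but not a numerical procedure. Purely as an illustrative sketch, assuming a calibrated and rectified pair of front cameras (baseline_m, focal_px, cx and cy are hypothetical calibration inputs, not values from the patent, and the small offset between the camera pair and the screen centre is ignored), the eye centre could be triangulated from the pixel disparity between the two pictures and the angles α and β then derived from the resulting point in O-XYZ:

    import math

    def eye_point_from_stereo(u_left, v_left, u_right, v_right,
                              baseline_m, focal_px, cx, cy):
        """Triangulate the eye-centre point from a rectified stereo pair.

        (u, v) are the pixel coordinates of the detected eye centre in the
        left/right pictures; baseline_m, focal_px and (cx, cy) are assumed
        calibration values for the front 3D dual camera, not values given
        in the patent.  The returned point is treated as being expressed in
        the screen-centred O-XYZ frame.
        """
        disparity = u_left - u_right                      # pixels
        if disparity <= 0:
            raise ValueError("invalid disparity: eye must be in front of both cameras")
        z = focal_px * baseline_m / disparity             # distance from the screen
        x = (u_left - cx) * z / focal_px                  # along the X-axis
        y = (v_left - cy) * z / focal_px                  # along the Y-axis
        return x, y, z

    def alpha_beta(eye_xyz):
        """Angles of the line from the screen centre O to the eye centre.

        alpha: angle between the Y-axis and the projection of that line
        onto the XY plane; beta: angle between the line and its projection,
        following the O-XYZ convention described in the text.
        """
        x, y, z = eye_xyz
        alpha = math.atan2(x, y)                          # measured from the Y-axis
        beta = math.atan2(z, math.hypot(x, y))            # lift out of the XY plane
        return alpha, beta

In this sketch a larger disparity means the viewer is closer to the screen; α then reflects how far the viewer has moved sideways and β how far the line to the viewer tilts out of the screen's XY plane.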
In this embodiment, the first signal is a signal containing the parameters α and β, and the second signal is a signal containing α.
In a preferred embodiment of the present invention, the 3D scene generation unit controls an OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene, where the parameter x is a preset parameter related to the depth of field.
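The patent names the rendering APIs but not the camera mathematics. The following is one plausible sketch of the described camera motion using a standard look-at construction, under the assumption that the virtual rendering camera sits at some chosen viewing distance along the direction given by α and β and looks back at the scene origin; viewing_distance and the look_at helper are assumptions of this sketch, not part of the patent.

    import numpy as np

    def look_at(eye, target, up=(0.0, 1.0, 0.0)):
        """Right-handed view matrix in the style of gluLookAt."""
        eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
        f = target - eye
        f /= np.linalg.norm(f)
        s = np.cross(f, up)
        s /= np.linalg.norm(s)
        u = np.cross(s, f)
        view = np.identity(4)
        view[0, :3], view[1, :3], view[2, :3] = s, u, -f
        view[:3, 3] = -view[:3, :3] @ eye
        return view

    def stereo_view_matrices(alpha, beta, viewing_distance, x):
        """Left-view and right-view matrices for the tracked viewer.

        The rendering camera is first moved to the position corresponding
        to (alpha, beta) at an assumed viewing_distance from the screen
        centre, then shifted left by x for the left-view 3D scene and right
        by 2x from there for the right-view 3D scene, mirroring the motion
        described in the text; x is the preset depth-of-field parameter.
        """
        # Direction from the screen centre O towards the viewer in O-XYZ,
        # consistent with alpha measured from the Y-axis and beta measured
        # between the line and its XY-plane projection.
        direction = np.array([np.cos(beta) * np.sin(alpha),
                              np.cos(beta) * np.cos(alpha),
                              np.sin(beta)])
        head = viewing_distance * direction
        sideways = np.cross((0.0, 1.0, 0.0), direction)   # horizontal shift axis
        sideways /= np.linalg.norm(sideways)
        left_cam = head - x * sideways                    # move left by x
        right_cam = left_cam + 2.0 * x * sideways         # then right by 2x
        target = np.zeros(3)
        return look_at(left_cam, target), look_at(right_cam, target)

In an actual OpenGL or DirectX renderer these two matrices would simply be loaded as the view transform before drawing the left-view and right-view scenes.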
Referring to FIG. 2B, FIG. 2B is a schematic structural diagram of a second preferred embodiment of a display device for a holographic 3D scene according to the present invention. FIG. 2B differs from FIG. 2A in that the display screen in FIG. 2A is a portrait (vertical-screen) display, with the corresponding front 3D dual cameras preferably disposed at the upper left corner and the upper right corner of the vertical display screen 11, whereas the display screen in FIG. 2B is a landscape (horizontal-screen) display, with the corresponding front 3D dual cameras preferably disposed at the upper right corner and the lower right corner of the display screen 11; the spatial rectangular coordinate system O-XYZ is adjusted adaptively according to the display orientation of the display screen.
In other specific applications of the present invention, the first front camera and the second front camera may of course also be placed at other positions, depending on the specific application, which is not limited by the present invention.
In other embodiments of the present invention, the spatial rectangular coordinate system O-XYZ may be established according to actual needs and is not limited to the above embodiments.
In another preferred embodiment of the present invention, the 3D scene generation unit controls a DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene, where the parameter x is a preset parameter related to the depth of field.
Of course, in other preferred embodiments of the present invention, the 3D scene generation unit 14 may also use other types of scene rendering cameras to perform the scene rendering functions of the OpenGL 3D scene rendering camera and the DirectX 3D scene rendering camera, which is not limited by the present invention.
Referring further to FIG. 1, the electronic grating 15 comprises: a first glass plate (not shown), a second glass plate (not shown), a liquid crystal layer (not labeled) and a control unit 153. A first polarizing plate (not shown) is disposed on the first surface of the first glass plate, and a first ITO conductive layer 151 is disposed on the second surface of the first glass plate, facing away from the first polarizing plate; a second polarizing plate (not shown) is disposed on the first surface of the second glass plate, the polarizing directions of the first polarizing plate and the second polarizing plate are perpendicular, and a second ITO conductive layer 152 is disposed on the second surface of the second glass plate, facing away from the second polarizing plate. The second ITO conductive layer 152 comprises a plurality of equally spaced ITO electrodes and insulating black strips disposed between adjacent ITO electrodes, and the liquid crystal layer is sandwiched between the first ITO conductive layer 151 and the second ITO conductive layer 152. The control unit is used to control, according to the second signal, the on/off of the alternating voltage between the first ITO conductive layer 151 and each ITO electrode, so that the positions of the light and dark stripes of the grating change adaptively to follow the position of the human eye, whereby the left-view 3D scene displayed by the display screen 11 is incident into the left eye of the person via the electronic grating 15 and the right-view 3D scene displayed by the display screen 11 is incident into the right eye of the person via the electronic grating 15.
Of course, in other embodiments of the present invention, the positions of the first ITO conductive layer 151 and the second ITO conductive layer 152 may be interchanged, which also falls within the protection scope of the present invention.
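The patent states that the control unit switches the AC voltage of individual electrode columns according to the signal containing α, but gives no drive equations. The sketch below is one assumed control scheme for illustration only: the dark (voltage-on) stripes of a simple vertical barrier are shifted sideways by a number of electrode columns proportional to α, where gain_electrodes_per_rad, electrodes_per_period and duty are hypothetical calibration constants, not values from the patent.

    def electrode_pattern(alpha, num_electrodes, electrodes_per_period,
                          gain_electrodes_per_rad, duty=0.5):
        """Boolean drive pattern for the ITO electrode columns of the grating.

        True means the alternating voltage between that electrode and the
        common first ITO conductive layer is switched on, turning the
        liquid crystal above it into a dark (blocking) stripe; False leaves
        it transparent.  Shifting the whole pattern with alpha is how this
        sketch models adapting the light/dark stripe positions to the eye
        position; the linear gain is an assumed calibration constant.
        """
        offset = int(round(alpha * gain_electrodes_per_rad))
        dark_per_period = max(1, int(electrodes_per_period * duty))
        return [((i + offset) % electrodes_per_period) < dark_per_period
                for i in range(num_electrodes)]

    # Example: a 0.05 rad sideways head movement with a gain of 40 electrode
    # columns per radian shifts the barrier pattern by two columns.
    pattern = electrode_pattern(alpha=0.05, num_electrodes=3840,
                                electrodes_per_period=8,
                                gain_electrodes_per_rad=40.0)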
One preferred embodiment of the display device for a holographic 3D scene of the present invention is a computer, another preferred embodiment provided by the present invention is a mobile phone, and of course, in other embodiments of the present invention, the display device for a holographic 3D scene may also be another display device.
Referring to FIG. 3, FIG. 3 is a schematic flow chart of a visual display method for a holographic 3D scene according to the present invention, including:
S1, capturing human eye 3D position information with the front 3D dual camera of the display screen;
S2, calculating, with the 3D human eye tracking algorithm processing unit, the spatial coordinates of the human eye according to the human eye 3D position information;
S3, rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye, and sending the left-view 3D scene and the right-view 3D scene to the display screen for display;
S4, adjusting the working angle of the electronic grating according to the spatial coordinates of the human eye, so that the left-view 3D scene displayed by the display screen is incident into the left eye of the person via the electronic grating and the right-view 3D scene displayed by the display screen is incident into the right eye of the person via the electronic grating.
Step S2 includes:
the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye 3D position information captured by the front 3D dual camera, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
wherein the origin O of the spatial rectangular coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
In a preferred embodiment of the present invention, step S3 includes:
using the 3D scene generation unit to control the OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
wherein the parameter x is a preset parameter related to the depth of field.
In another preferred embodiment of the present invention, step S3 includes:
using the 3D scene generation unit to control the DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
wherein the parameter x is a preset parameter related to the depth of field.
Of course, in other embodiments of the present invention, other scene rendering cameras may also be used in step S3 to perform the functions of the OpenGL 3D scene rendering camera and the DirectX 3D scene rendering camera, which is not limited by the present invention.
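Putting steps S1-S4 together, one frame of the method could be orchestrated as in the sketch below, which reuses the helper functions from the earlier sketches; cameras, panel, grating, scene and calib are hypothetical wrapper objects for the camera driver, the display panel, the electronic grating and the render engine, since the patent does not define such interfaces.

    def display_one_frame(cameras, panel, grating, scene, calib,
                          x, viewing_distance):
        """One illustrative iteration of steps S1-S4 for the tracked viewer."""
        # S1: capture the eye centre in the pictures from both front cameras.
        (ul, vl), (ur, vr) = cameras.detect_eye_centers()          # hypothetical driver call
        # S2: spatial coordinates of the eyes, then the angles alpha and beta.
        eye_xyz = eye_point_from_stereo(ul, vl, ur, vr,
                                        calib.baseline_m, calib.focal_px,
                                        calib.cx, calib.cy)
        alpha, beta = alpha_beta(eye_xyz)
        # S3: render the left-view and right-view 3D scenes and show them.
        view_left, view_right = stereo_view_matrices(alpha, beta,
                                                     viewing_distance, x)
        panel.show(scene.render(view_left), scene.render(view_right))  # hypothetical panel API
        # S4: steer the electronic grating so each view reaches the correct eye.
        grating.drive(electrode_pattern(alpha, grating.num_electrodes,
                                        grating.electrodes_per_period,
                                        calib.grating_gain))
        return alpha, beta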
In the above embodiments of the visual display method of the present invention, the display screen is a portrait (vertical-screen) display, and the corresponding front 3D dual cameras are preferably disposed at the upper left corner and the upper right corner of the vertical display screen, respectively; the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
In another preferred embodiment of the present invention, the display screen is a landscape (horizontal-screen) display, and the corresponding front 3D dual cameras are preferably disposed at the upper right corner and the lower right corner of the horizontal display screen, respectively; the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the upper and lower opposite sides of the display screen, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the upper and lower opposite sides of the display screen. Of course, the first front camera and the second front camera may also be placed at other positions, depending on the specific application, which is not limited by the present invention.
In the above manner, the beneficial effects of the present invention are: different from the prior art, the display device and the visual display method for simulating a holographic 3D scene provided by the present invention capture the 3D position information of the human eye through the front 3D dual camera, calculate the spatial coordinates of the human eye according to the human eye 3D position information, and then use the human eye spatial coordinates so that the 3D human eye tracking algorithm processing unit generates the first signal, which controls the 3D scene generating unit to generate the 3D scenes corresponding to the left and right view channels, and the second signal, which controls the working angle of the electronic grating, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person. The present invention can therefore achieve the same effect of seeing different stereoscopic images from different angles as in reality, and can realize a holographic display of an object or a scene.
The above is only an embodiment of the present invention and is not intended to limit the patent scope of the present invention. Any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (12)

  1. A display device simulating a holographic 3D scene, the display device comprising:
    a display screen;
    a first front camera and a second front camera, the first front camera being disposed at the upper left corner of the display screen, the second front camera being disposed at the upper right corner of the display screen, the first front camera and the second front camera being used to capture 3D position information of the human eye;
    a 3D human eye tracking algorithm processing unit, electrically connected to the first front camera and the second front camera, for generating, according to the human eye 3D position information, a first signal that controls a 3D scene generating unit to generate the 3D scenes corresponding to the left and right view channels and a second signal that controls the working angle of an electronic grating;
    the 3D scene generating unit, electrically connected to the display screen, for receiving the first signal, rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the first signal, and sending the left-view 3D scene and the right-view 3D scene to the display screen for display;
    the electronic grating, for receiving the second signal and adjusting its working angle according to the second signal, so that the left-view 3D scene is incident into the left eye of the person and the right-view 3D scene is incident into the right eye of the person.
  2. The display device according to claim 1, wherein the first front camera is used to acquire a left-eye format picture of the human eye position, the second camera is used to acquire a right-eye format picture of the human eye position, and the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye position between the left-eye format picture and the right-eye format picture, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
    wherein the origin O of the spatial rectangular coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
  3. The display device according to claim 2, wherein the first signal is a signal containing the parameters α and β, and the second signal is a signal containing α.
  4. The display device according to claim 3, wherein the 3D scene generating unit controls an OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
    wherein the parameter x is a preset parameter related to the depth of field.
  5. The display device according to claim 3, wherein the 3D scene generating unit controls a DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controls the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moves the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
    wherein the parameter x is a preset parameter related to the depth of field.
  6. The display device according to claim 3, wherein the electronic grating comprises: a first glass plate, a second glass plate, a liquid crystal layer and a control unit; a first polarizing plate is provided on the first surface of the first glass plate, and a first ITO conductive layer is provided on the second surface of the first glass plate, facing away from the first polarizing plate; a second polarizing plate is provided on the first surface of the second glass plate, the polarizing directions of the first polarizing plate and the second polarizing plate are perpendicular, and a second ITO conductive layer is provided on the second surface of the second glass plate, facing away from the second polarizing plate; the second ITO conductive layer comprises a plurality of equally spaced ITO electrodes and insulating black strips disposed between adjacent ITO electrodes; the liquid crystal layer is sandwiched between the first ITO conductive layer and the second ITO conductive layer; the control unit is used to control, according to the second signal, the on/off of the alternating voltage between the first ITO conductive layer and each ITO electrode, so that the positions of the light and dark stripes of the grating change adaptively to follow the position of the human eye, whereby the left-view 3D scene displayed by the display screen is incident into the left eye of the person via the electronic grating and the right-view 3D scene displayed by the display screen is incident into the right eye of the person via the electronic grating.
  7. The display device according to any one of claims 1-6, wherein the display device is a computer.
  8. The display device according to any one of claims 1-6, wherein the display device is a mobile phone.
  9. A visual display method for simulating a holographic 3D scene, the visual display method comprising:
    capturing human eye 3D position information with the front 3D dual camera of the display screen;
    calculating, with a 3D human eye tracking algorithm processing unit, the spatial coordinates of the human eye according to the human eye 3D position information;
    rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye, and sending the left-view 3D scene and the right-view 3D scene to the display screen for display;
    adjusting the working angle of the electronic grating according to the spatial coordinates of the human eye, so that the left-view 3D scene displayed by the display screen is incident into the left eye of the person via the electronic grating and the right-view 3D scene displayed by the display screen is incident into the right eye of the person via the electronic grating.
  10. The visual display method according to claim 9, wherein the step of calculating, with the 3D human eye tracking algorithm processing unit, the spatial coordinates of the human eye according to the human eye 3D position information includes:
    the 3D human eye tracking algorithm processing unit calculates, from the difference in the human eye 3D position information captured by the front 3D dual camera, the angle α between the Y-axis and the projection, on the XY plane of the spatial rectangular coordinate system O-XYZ, of the line connecting the human eye center point and the display screen center point, as well as the angle β between that line and its projection,
    wherein the origin O of the spatial rectangular coordinate system O-XYZ is located at the center point of the display screen, the X-axis of the spatial rectangular coordinate system O-XYZ is parallel to the line connecting the center points of the left and right opposite sides of the display screen, and the Y-axis of the spatial rectangular coordinate system O-XYZ is perpendicular to the line connecting the center points of the left and right opposite sides of the display screen.
  11. The visual display method according to claim 9, wherein the step of rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye and sending the left-view 3D scene and the right-view 3D scene to the display screen for display includes:
    using the 3D scene generating unit to control the OpenGL 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the OpenGL 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the OpenGL 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
    wherein the parameter x is a preset parameter related to the depth of field.
  12. The visual display method according to claim 9, wherein the step of rendering and acquiring the corresponding left-view 3D scene and right-view 3D scene according to the spatial coordinates of the human eye and sending the left-view 3D scene and the right-view 3D scene to the display screen for display includes:
    using the 3D scene generating unit to control the DirectX 3D scene rendering camera to move to the position corresponding to the parameters α and β, then controlling the DirectX 3D scene rendering camera to move left by a distance x to render and acquire the left-view 3D scene, and then moving the DirectX 3D scene rendering camera right by a distance 2x to render and acquire the right-view 3D scene,
    wherein the parameter x is a preset parameter related to the depth of field.
PCT/CN2014/075951 2013-03-22 2014-04-22 A display device and visual display method for simulating a holographic 3D scene WO2014146619A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2015518844A JP2015528234A (ja) 2013-03-22 2014-04-22 ホログラフィーの3dシーンを模倣した表示装置及び視覚表示方法
EP14770997.6A EP2978217A1 (en) 2013-03-22 2014-04-22 Display device and visual display method for simulating holographic 3d scene
KR1020157027103A KR20160042808A (ko) 2013-03-22 2014-04-22 홀로그래픽 3d 장면을 시뮬레이팅하는 디스플레이 장치와 시각 디스플레이 방법
US14/415,603 US9983546B2 (en) 2013-03-22 2014-04-22 Display apparatus and visual displaying method for simulating a holographic 3D scene

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310094938XA CN103248905A (zh) 2013-03-22 2013-03-22 一种模仿全息3d场景的显示装置和视觉显示方法
CN201310094938.X 2013-03-22

Publications (1)

Publication Number Publication Date
WO2014146619A1 (zh)

Family

ID=48928093

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/075951 WO2014146619A1 (zh) 2013-03-22 2014-04-22 一种模仿全息3d场景的显示装置和视觉显示方法

Country Status (6)

Country Link
US (1) US9983546B2 (zh)
EP (1) EP2978217A1 (zh)
JP (1) JP2015528234A (zh)
KR (1) KR20160042808A (zh)
CN (1) CN103248905A (zh)
WO (1) WO2014146619A1 (zh)

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103248905A (zh) 2013-03-22 2013-08-14 深圳市云立方信息科技有限公司 一种模仿全息3d场景的显示装置和视觉显示方法
CN103595993A (zh) * 2013-11-08 2014-02-19 深圳市奥拓电子股份有限公司 一种基于智能识别技术的led裸眼3d显示系统及其工作方法
CN103995620A (zh) * 2013-12-02 2014-08-20 深圳市云立方信息科技有限公司 一种空中触控系统
CN104661012B (zh) * 2014-11-28 2017-12-01 深圳市魔眼科技有限公司 个人全息三维显示方法及设备
CN104581114A (zh) * 2014-12-03 2015-04-29 深圳市亿思达科技集团有限公司 基于人眼图像追踪的自适应全息显示方法及全息显示装置
CN104581113B (zh) * 2014-12-03 2018-05-15 深圳市魔眼科技有限公司 基于观看角度的自适应全息显示方法及全息显示装置
CN104601974A (zh) * 2014-12-30 2015-05-06 深圳市亿思达科技集团有限公司 基于人眼追踪的全息显示器及全息显示装置
CN104601980A (zh) * 2014-12-30 2015-05-06 深圳市亿思达科技集团有限公司 基于眼镜追踪的全息显示装置、系统及方法
US10444972B2 (en) 2015-11-28 2019-10-15 International Business Machines Corporation Assisting a user with efficient navigation between a selection of entries with elements of interest to the user within a stream of entries
CN105467604B (zh) * 2016-02-16 2018-01-12 京东方科技集团股份有限公司 一种3d显示装置及其驱动方法
EP3485322A4 (en) 2016-07-15 2020-08-19 Light Field Lab, Inc. SELECTIVE PROPAGATION OF ENERGY IN A LUMINOUS FIELD AND HOLOGRAPHIC WAVE GUIDE NETWORKS
CN105954992B (zh) 2016-07-22 2018-10-30 京东方科技集团股份有限公司 显示系统和显示方法
CN106331688A (zh) * 2016-08-23 2017-01-11 湖南拓视觉信息技术有限公司 基于视觉追踪技术的三维显示系统及方法
CN106125322A (zh) * 2016-09-05 2016-11-16 万维云视(上海)数码科技有限公司 裸眼3d显示面板及裸眼3d显示装置
CN107172242A (zh) * 2017-07-06 2017-09-15 重庆瑞景信息科技有限公司 智能手机
TW201919391A (zh) * 2017-11-09 2019-05-16 英屬開曼群島商麥迪創科技股份有限公司 顯示系統及顯示影像的顯示方法
RU2686576C1 (ru) 2017-11-30 2019-04-29 Самсунг Электроникс Ко., Лтд. Компактное устройство голографического дисплея
JP7274682B2 (ja) 2018-01-14 2023-05-17 ライト フィールド ラボ、インコーポレイテッド 3d環境からデータをレンダリングするためのシステムおよび方法
KR20200116941A (ko) 2018-01-14 2020-10-13 라이트 필드 랩 인코포레이티드 정렬된 구조를 사용해 에너지 릴레이의 횡방향 에너지 편재를 위한 시스템 및 방법
US11163176B2 (en) 2018-01-14 2021-11-02 Light Field Lab, Inc. Light field vision-correction device
WO2019140398A1 (en) 2018-01-14 2019-07-18 Light Field Lab, Inc. Holographic and diffractive optical encoding systems
CN108182659A (zh) * 2018-02-01 2018-06-19 周金润 一种基于视点跟踪、单视角立体画的裸眼3d显示技术
CN108881893A (zh) * 2018-07-23 2018-11-23 上海玮舟微电子科技有限公司 基于人眼跟踪的裸眼3d显示方法、装置、设备和介质
CN109218701B (zh) * 2018-11-13 2020-07-28 深圳市靓工创新应用科技有限公司 裸眼3d的显示设备、方法、装置及可读存储介质
CN112929642A (zh) * 2019-12-05 2021-06-08 北京芯海视界三维科技有限公司 人眼追踪装置、方法及3d显示设备、方法
CN113947632A (zh) * 2020-07-15 2022-01-18 北京芯海视界三维科技有限公司 实现目标物体定位的方法、装置及显示器件
CN113038116B (zh) * 2021-03-09 2022-06-28 中国人民解放军海军航空大学航空作战勤务学院 一种空中加受油模拟训练视景系统
CN113660476A (zh) * 2021-08-16 2021-11-16 纵深视觉科技(南京)有限责任公司 一种基于Web页面的立体显示系统及方法
CN113900273B (zh) * 2021-10-09 2023-12-26 广东未来科技有限公司 裸眼3d显示方法及相关设备
CN114928739A (zh) * 2022-02-11 2022-08-19 广东未来科技有限公司 3d显示方法、装置及存储介质
WO2024174050A1 (zh) * 2023-02-20 2024-08-29 京东方科技集团股份有限公司 视频通信方法和装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101072366A (zh) * 2007-05-24 2007-11-14 上海大学 基于光场和双目视觉技术的自由立体显示系统和方法
CN202172467U (zh) * 2011-06-21 2012-03-21 鼎创电子股份有限公司 具人眼定位的光栅随动装置的显示器
CN202453584U (zh) * 2011-11-22 2012-09-26 深圳市亿思达显示科技有限公司 立体显示装置
CN102710956A (zh) * 2012-06-04 2012-10-03 天马微电子股份有限公司 一种裸眼立体追踪显示方法及装置
CN103248905A (zh) * 2013-03-22 2013-08-14 深圳市云立方信息科技有限公司 一种模仿全息3d场景的显示装置和视觉显示方法

Family Cites Families (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08106070A (ja) * 1994-08-08 1996-04-23 Sanyo Electric Co Ltd 光シャッター及び3次元画像表示装置
AUPN003894A0 (en) * 1994-12-13 1995-01-12 Xenotech Research Pty Ltd Head tracking system for stereoscopic display apparatus
JP3459721B2 (ja) * 1995-05-22 2003-10-27 キヤノン株式会社 立体画像表示方法及びそれを用いた立体画像表示装置
JPH09127494A (ja) * 1995-11-06 1997-05-16 Sharp Corp 液晶表示装置およびその製造方法
GB2337388A (en) * 1998-05-12 1999-11-17 Sharp Kk Directional autereoscopic 3D display having directional illumination system
US6757422B1 (en) * 1998-11-12 2004-06-29 Canon Kabushiki Kaisha Viewpoint position detection apparatus and method, and stereoscopic image display system
KR100399787B1 (ko) * 2001-05-04 2003-09-29 삼성에스디아이 주식회사 기판과 이 기판의 제조방법 및 이 기판을 가지는 플라즈마표시장치
GB2405543A (en) * 2003-08-30 2005-03-02 Sharp Kk Multiple view directional display having means for imaging parallax optic or display.
US7623105B2 (en) * 2003-11-21 2009-11-24 Sharp Laboratories Of America, Inc. Liquid crystal display with adaptive color
KR100580632B1 (ko) * 2003-12-05 2006-05-16 삼성전자주식회사 2차원과 3차원 영상을 선택적으로 표시할 수 있는디스플레이
US20050207486A1 (en) * 2004-03-18 2005-09-22 Sony Corporation Three dimensional acquisition and visualization system for personal electronic devices
US7433021B2 (en) * 2004-08-10 2008-10-07 Joseph Saltsman Stereoscopic targeting, tracking and navigation device, system and method
WO2007036936A1 (en) * 2005-09-28 2007-04-05 Mirage Innovations Ltd. Stereoscopic binocular system, device and method
US7451022B1 (en) * 2006-12-28 2008-11-11 Lockheed Martin Corporation Calibration of ship attitude reference
JP2008191569A (ja) * 2007-02-07 2008-08-21 Nano Loa Inc 液晶デバイス
US20100283843A1 (en) * 2007-07-17 2010-11-11 Yang Cai Multiple resolution video network with eye tracking based control
US8089479B2 (en) * 2008-04-11 2012-01-03 Apple Inc. Directing camera behavior in 3-D imaging system
JP5852956B2 (ja) * 2009-06-23 2016-02-03 シーリアル テクノロジーズ ソシエテ アノニムSeereal Technologies S.A. 2次元及び3次元の少なくともいずれかの画像コンテンツ提示用ディスプレイに用いられる光変調デバイス
CN102640502B (zh) * 2009-10-14 2015-09-23 诺基亚公司 自动立体渲染和显示装置
US8976327B2 (en) * 2010-05-05 2015-03-10 3M Innovative Properties Company Optical shutter applicable in stereoscopic viewing glasses
WO2012021967A1 (en) * 2010-08-16 2012-02-23 Tandemlaunch Technologies Inc. System and method for analyzing three-dimensional (3d) media content
US20120200676A1 (en) * 2011-02-08 2012-08-09 Microsoft Corporation Three-Dimensional Display with Motion Parallax
KR101270780B1 (ko) * 2011-02-14 2013-06-07 김영대 가상 강의실 강의 방법 및 장치
CN201966999U (zh) * 2011-03-17 2011-09-07 黑龙江省四维影像数码科技有限公司 三维自由立体显示手机
JP2012216951A (ja) * 2011-03-31 2012-11-08 Toshiba Corp 電子機器およびインジケータの制御方法
JP5813434B2 (ja) * 2011-09-22 2015-11-17 株式会社ジャパンディスプレイ 液晶表示装置
US20130077154A1 (en) * 2011-09-23 2013-03-28 Milan Momcilo Popovich Autostereoscopic display
CN103135815B (zh) * 2011-11-25 2017-02-22 上海天马微电子有限公司 内嵌触摸屏液晶显示装置及其触控驱动方法
ITTO20111150A1 (it) * 2011-12-14 2013-06-15 Univ Degli Studi Genova Rappresentazione stereoscopica tridimensionale perfezionata di oggetti virtuali per un osservatore in movimento
JP5167439B1 (ja) * 2012-02-15 2013-03-21 パナソニック株式会社 立体画像表示装置及び立体画像表示方法
WO2013121468A1 (ja) * 2012-02-15 2013-08-22 パナソニック株式会社 立体画像表示装置及び立体画像表示方法
US9786094B2 (en) * 2012-03-19 2017-10-10 Electronics For Imaging, Inc. Method and apparatus for creating a dimensional layer for an image file
CN102665087B (zh) * 2012-04-24 2014-08-06 浙江工业大学 3d立体摄像设备的拍摄参数自动调整系统
KR20130140960A (ko) * 2012-05-22 2013-12-26 엘지디스플레이 주식회사 액티브 리타더 역할을 하는 패널과 이의 제조 방법 및 이를 구비한 입체 영상 구현 시스템
US10025089B2 (en) * 2012-10-05 2018-07-17 Microsoft Technology Licensing, Llc Backlight for viewing three-dimensional images from a display from variable viewing angles
US20140132595A1 (en) * 2012-11-14 2014-05-15 Microsoft Corporation In-scene real-time design of living spaces
US20140181910A1 (en) * 2012-12-21 2014-06-26 Jim Fingal Systems and methods for enabling parental controls based on user engagement with a media device
US20140267772A1 (en) * 2013-03-15 2014-09-18 Novatel Inc. Robotic total station with image-based target re-acquisition


Also Published As

Publication number Publication date
EP2978217A1 (en) 2016-01-27
US9983546B2 (en) 2018-05-29
CN103248905A (zh) 2013-08-14
JP2015528234A (ja) 2015-09-24
KR20160042808A (ko) 2016-04-20
US20150227112A1 (en) 2015-08-13

Similar Documents

Publication Publication Date Title
WO2014146619A1 (zh) 一种模仿全息3d场景的显示装置和视觉显示方法
JP7443314B2 (ja) 3dテレプレゼンスシステム
CN101909219B (zh) 一种立体显示方法及跟踪式立体显示器
WO2015037796A1 (en) Display device and method of controlling the same
CN101984670A (zh) 一种立体显示方法、跟踪式立体显示器及图像处理装置
WO2015081029A1 (en) Video interaction between physical locations
WO2017004859A1 (zh) 3d显示装置
WO2015046684A1 (en) Display device and control method thereof
WO2013174249A1 (zh) 立体显示装置
CN201114214Y (zh) 具有立体拍照功能的手机
CN103945122A (zh) 利用云台摄像机和投影机实现虚拟窗户的方法
WO2023231674A1 (zh) 液晶光栅的驱动方法及显示装置、其显示方法
CN205510302U (zh) 一种三维立体投影仪
WO2014058187A2 (ko) 가변 초점 렌즈, 이를 이용한 디스플레이 장치 및 디스플레이 방법
WO2013105794A1 (en) 3d display apparatus and method thereof
WO2018018357A1 (zh) Vr图像拍摄装置及其基于移动终端的vr图像拍摄系统
CN201820070U (zh) 立体拍摄装置
CN102970498A (zh) 菜单立体显示的显示方法及显示装置
CN208092355U (zh) 一种柱镜光栅裸眼3d显示器
CN203775345U (zh) 基于集成成像的2d-3d混合显示系统
CN202512298U (zh) 立体显示装置
CN103777412A (zh) 3d显示手机皮
WO2015020423A1 (en) Display apparatus and control method for providing a 3d image
CN103376558A (zh) 立体显示装置
CN103376557A (zh) 立体显示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14770997

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015518844

Country of ref document: JP

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2014770997

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014770997

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 14415603

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20157027103

Country of ref document: KR

Kind code of ref document: A