WO2024032057A1 - Stereoscopic display device, stereoscopic display apparatus, and stereoscopic display method - Google Patents


Info

Publication number
WO2024032057A1
Authority
WO
WIPO (PCT)
Prior art keywords
light beam
angle
waveguide
light
grating group
Prior art date
Application number
PCT/CN2023/093016
Other languages
English (en)
French (fr)
Inventor
李肖
卢庆聪
刘欣
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2024032057A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/33Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving directional light or back-light sources

Definitions

  • Embodiments of the present application relate to the field of display, and in particular to a stereoscopic display device, a stereoscopic display apparatus, and a stereoscopic display method.
  • 3D displays are increasingly favored by consumers.
  • A current 3D display device is composed of multiple waveguide device units stacked into a multi-layer composite directional waveguide plate, and frequency division is used to control the sequential illumination of each layer.
  • The light emitted by the waveguide device units of different layers corresponds to the left-eye and right-eye viewing angles respectively, so as to achieve naked-eye 3D display.
  • However, stacking multiple waveguide device units results in a relatively thick product, and the stacking produces crosstalk between devices, degrading the user's viewing experience.
  • The present application provides a stereoscopic display device, a stereoscopic display apparatus, and a stereoscopic display method that realize naked-eye 3D display, also called autostereoscopic display. The approach avoids the crosstalk caused by stacking multiple waveguide device units and at the same time reduces the thickness of the product, thereby improving the user's viewing experience as much as possible.
  • In a first aspect, a stereoscopic display device is provided, which includes a light source, a waveguide layer, and an image generation module.
  • the light source is used to emit a first light beam at a first time and a second light beam at a second time.
  • the waveguide layer is used for projecting the first light beam to the image generation module at a first angle.
  • The image generation module is used to modulate the first light beam based on the first image data to obtain the first imaging light.
  • the waveguide layer is also used to project the second light beam to the image generation module at a second angle.
  • the image generation module is also used to modulate the second light beam based on the second image data to obtain the second imaging light.
  • The first imaging light and the second imaging light are converged to the observer's left eye and right eye respectively.
  • For example, the light source can be located below the waveguide layer and the image generation module above the waveguide layer, or both the light source and the image generation module can be located above or below the waveguide layer, or other arrangements are possible; this is not specifically limited here.
  • the first beam and the second beam are projected to the image generation module at different angles to obtain the first imaging light and the second imaging light corresponding to the left and right eyes respectively, thereby realizing naked-eye 3D display.
  • only one waveguide layer is needed to realize naked-eye 3D display, which avoids crosstalk caused by superposition of different levels of waveguides, reduces the thickness of the three-dimensional display device, and improves the user experience.
  • the waveguide layer includes a first grating group and a first waveguide medium.
  • the first grating group is used to transmit the first light beam in the first waveguide medium at a first diffraction angle and to project the first light beam to the image generation module at a first angle; it is also used to transmit the second light beam in the first waveguide medium at a second diffraction angle and to project the second light beam to the image generation module at a second angle.
  • In this way, a single first grating group projects the first light beam to the image generation module at a first angle and the second light beam at a second angle, reducing the component units of the stereoscopic display device as much as possible and realizing a stereoscopic display device that is light, thin, and low in crosstalk.
  • the stereoscopic display device further includes a voltage control module, which is used to load a first voltage on the first grating group to adjust the first diffraction angle and the first angle.
  • the voltage control module loads a second voltage on the first grating group for adjusting the second diffraction angle and the second angle.
  • the refractive index of the first grating group can be controlled through different voltages, thereby controlling the projection angles of the first light beam and the second light beam, making the stereoscopic display device more flexible.
  • the material of the first grating group is an electro-optical polymer material.
  • the first grating group is made of an electro-optical polymer material, and the first light beam and the second light beam are projected to the image generation module at different angles under voltage control. This ensures that the three-dimensional display device can realize naked-eye 3D display based on one grating group and one waveguide medium, keeping the device as light and thin as possible and avoiding crosstalk interference as much as possible.
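To make the voltage-controlled steering concrete, the scalar grating equation can be combined with the linear electro-optic (Pockels) effect in a short sketch. This is only an illustrative model: the wavelength, grating period, refractive index, electro-optic coefficient, and electrode gap below are assumptions, not values from the patent.

```python
import math

def pockels_delta_n(n0: float, r_eff_pm_per_v: float, voltage: float, gap_um: float) -> float:
    """Linear electro-optic (Pockels) index change: dn = -0.5 * n0**3 * r_eff * E,
    with the field E = V / d across the electrode gap."""
    e_field = voltage / (gap_um * 1e-6)  # V/m
    return -0.5 * n0 ** 3 * (r_eff_pm_per_v * 1e-12) * e_field

def diffraction_angle_deg(wavelength_nm: float, period_nm: float,
                          incidence_deg: float, n_medium: float, order: int = 1) -> float:
    """Scalar grating equation for coupling into the waveguide medium:
    n_medium * sin(theta_m) = sin(theta_i) + m * lambda / period."""
    s = (math.sin(math.radians(incidence_deg)) + order * wavelength_nm / period_nm) / n_medium
    if abs(s) > 1.0:
        raise ValueError("no propagating diffracted order for these parameters")
    return math.degrees(math.asin(s))

# Illustrative values only: 532 nm light, 400 nm period, normal incidence,
# a 100 pm/V effective coefficient, and a 5 um electrode gap.
n0 = 1.70
theta_unbiased = diffraction_angle_deg(532.0, 400.0, 0.0, n0)
n_biased = n0 + pockels_delta_n(n0, 100.0, 10.0, 5.0)
theta_biased = diffraction_angle_deg(532.0, 400.0, 0.0, n_biased)
print(f"diffraction angle: {theta_unbiased:.2f} deg -> {theta_biased:.2f} deg under bias")
```

Under this toy model, the applied voltage lowers the grating-region index slightly, which shifts the diffraction angle and hence the angle at which the beam reaches the image generation module.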
  • the waveguide layer includes a first grating group, a first waveguide medium, a second grating group, and a second waveguide medium.
  • the first grating group is used to cause the first light beam to be transmitted in the first waveguide medium at a first diffraction angle and to be projected to the image generation module at a first angle;
  • the second grating group is used to transmit the second light beam in the second waveguide medium at a second diffraction angle, and to project the second light beam to the image generation module at a second angle.
  • the grating parameters of the first grating group and the second grating group are different.
  • the parameters of the first grating group and the second grating group can be determined according to the specific directions of the first imaging light and the second imaging light corresponding to the left-eye and right-eye viewing angles respectively; the grating parameters are not limited here.
  • In this way, the first grating group and the second grating group, which have different grating parameters, respectively project the first light beam to the image generation module at a first angle and the second light beam at a second angle, obtaining the first imaging light and the second imaging light corresponding to the left and right eyes, thereby realizing naked-eye 3D display with a stereoscopic display device that is light, thin, and low in crosstalk.
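The effect of different grating parameters can be sketched with the scalar grating equation for out-coupling from the waveguide into air: the same guided beam exits at two different angles purely because the two grating groups have different periods. The periods, guided angle, and index below are illustrative assumptions; the patent does not give numerical values.

```python
import math

def exit_angle_deg(wavelength_nm: float, period_nm: float,
                   guided_deg: float, n_wg: float, order: int = -1) -> float:
    """Scalar grating equation for out-coupling from the waveguide into air:
    sin(theta_out) = n_wg * sin(theta_guided) + m * lambda / period."""
    s = n_wg * math.sin(math.radians(guided_deg)) + order * wavelength_nm / period_nm
    if abs(s) > 1.0:
        raise ValueError("no propagating exit beam for these parameters")
    return math.degrees(math.asin(s))

# Same guided beam, two different grating periods (all numbers illustrative).
n_wg, guided_deg, wavelength_nm = 1.50, 60.0, 532.0
angle_group_1 = exit_angle_deg(wavelength_nm, 450.0, guided_deg, n_wg)  # e.g. toward one eye
angle_group_2 = exit_angle_deg(wavelength_nm, 380.0, guided_deg, n_wg)  # e.g. toward the other eye
```

With these assumed numbers the two grating groups send the beam to either side of the normal, which is the geometric basis for addressing the left and right eyes separately.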
  • the waveguide layer further includes a third grating group, a third waveguide medium, a fourth grating group, and a fourth waveguide medium.
  • the first waveguide medium, the second waveguide medium, the third waveguide medium, and the fourth waveguide medium are located on the same horizontal plane of the waveguide layer.
  • the third grating group is used to transmit the first light beam in the third waveguide medium at a third diffraction angle, and to project the first light beam to the image generation module at a third angle.
  • the fourth grating group is used to transmit the second light beam in the fourth waveguide medium at a fourth diffraction angle, and to project the second light beam to the image generation module at a fourth angle.
  • In this way, the three-dimensional display device obtains first imaging light and second imaging light corresponding to multiple left and right viewing angles based on a waveguide layer composed of multiple grating groups and multiple waveguide media, thereby supporting at least a two-person display mode; it can realize multi-person naked-eye 3D display and is suitable for more complex application scenarios.
  • each of the above waveguide media includes a light entrance surface and a light exit surface.
  • the light entrance surface is used to receive the first light beam and the second light beam.
  • the light exit surface is used to project the first light beam and the second light beam.
  • In a second aspect, a stereoscopic display apparatus is provided, which includes a stereoscopic display device and a control module, wherein the stereoscopic display device includes a light source, a waveguide layer, and an image generation module.
  • The control module is used to control the light source to emit a first beam at a first moment and a second beam at a second moment, and to control the image generation module to obtain first image data at the first moment and second image data at the second moment.
  • the waveguide layer is used for projecting the first light beam to the image generation module at a first angle.
  • The image generation module is configured to modulate the first light beam based on the first image data to obtain the first imaging light.
  • The waveguide layer is also used to project the second beam to the image generation module at a second angle.
  • The image generation module is also used to modulate the second light beam based on the second image data to obtain the second imaging light.
  • the stereoscopic display device is a stereoscopic display device as described in the first aspect and the possible implementation manner of the first aspect.
  • the waveguide layer further includes a third grating group, a third waveguide medium, a fourth grating group, and a fourth waveguide medium
  • the display mode of the stereoscopic display device is a multi-person mode.
  • the control module is also used to control the third grating group of the stereoscopic display device so that the first light beam is transmitted in the third waveguide medium at a third diffraction angle and projected to the image generation module at a third angle, and to control the fourth grating group so that the second light beam is transmitted in the fourth waveguide medium at a fourth diffraction angle and projected to the image generation module at a fourth angle.
  • The control module is also used to determine the display mode based on information fed back by the sensor.
  • the information fed back by the sensor includes at least one of human eye position information, head information, or thermal imaging information.
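As a minimal sketch of such a mode decision: the patent only states that the control module determines the display mode from sensor feedback (eye position, head, or thermal imaging information), so the data structure, mode names, and counting rule below are invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ViewerObservation:
    """One viewer as reported by the sensor. The fields mirror the kinds of
    feedback listed in the text (eye position, head, thermal imaging); the
    structure itself is hypothetical."""
    eye_positions: Optional[Tuple[float, float]] = None
    head_detected: bool = False
    thermal_signature: bool = False

def determine_display_mode(observations: List[ViewerObservation]) -> str:
    """Pick a display mode from sensor feedback. The mode names and the
    counting rule are illustrative assumptions."""
    confirmed = [o for o in observations
                 if o.eye_positions is not None or o.head_detected or o.thermal_signature]
    if not confirmed:
        return "standby"
    return "single-person" if len(confirmed) == 1 else "multi-person"
```

A multi-person result would then enable the additional grating groups (third and fourth) described above, while a single-person result needs only the first pair.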
  • A vehicle is also provided, including the stereoscopic display apparatus described in the second aspect and its possible implementations.
  • the vehicle further includes a windshield, and the three-dimensional display device projects the first imaging light and the second imaging light to the windshield respectively.
  • A stereoscopic display method is also provided, and the method includes:
  • A first light beam is emitted by the light source at a first time, and a second light beam at a second time. The first light beam is then projected through the waveguide layer at a first angle and modulated based on the first image data to obtain first imaging light. The second light beam is likewise projected through the waveguide layer at a second angle and modulated based on the second image data to obtain second imaging light.
  • When the waveguide layer includes a first grating group and a first waveguide medium, the first grating group causes the first light beam to be transmitted in the first waveguide medium at a first diffraction angle and to be projected at a first angle.
  • the second light beam is transmitted in the first waveguide medium at a second diffraction angle through the first grating group, and the second light beam is projected at a second angle.
  • a first voltage is applied to the first grating group to adjust the first diffraction angle and the first angle.
  • a second voltage is applied to the first grating group to adjust the second diffraction angle and the second angle.
  • the material of the first grating group is an electro-optical polymer material.
  • the waveguide layer includes a first grating group, a first waveguide medium, a second grating group, and a second waveguide medium
  • the first waveguide medium and the second waveguide medium are located on the same horizontal plane
  • the first grating group causes the first light beam to propagate in the first waveguide medium at a first diffraction angle, and causes the first light beam to project at a first angle
  • the second grating group causes the second light beam to propagate within the second waveguide medium at a second diffraction angle and causes the second light beam to be projected at a second angle.
  • the waveguide layer further includes a third grating group, a third waveguide medium, a fourth grating group and a fourth waveguide medium
  • the first waveguide medium, the second waveguide medium , the third waveguide medium and the fourth waveguide medium are located on the same horizontal plane.
  • the first light beam is transmitted in the third waveguide medium at a third diffraction angle through the third grating group, and the first light beam is projected at a third angle.
  • the second beam is transmitted in the fourth waveguide medium at a fourth diffraction angle through the fourth grating group, and the second beam is projected at a fourth angle.
  • the aforementioned waveguide medium includes a light entrance surface and a light exit surface, the light entrance surface is used to receive the first light beam and the second light beam, and the light exit surface is used to project the first light beam and the second light beam.
  • Figure 1 is a schematic diagram of a multi-layer composite directional waveguide plate
  • Figure 2 is a schematic structural diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of the grating group and waveguide medium provided by the embodiment of the present application.
  • Figure 4 is a schematic diagram of projecting a first light beam provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of projecting a second light beam provided by an embodiment of the present application.
  • Figure 6 is another structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 7 is another structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 8a is another schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 8b is a schematic diagram of the waveguide layer provided by the embodiment of the present application.
  • Figure 9 is a schematic structural diagram of a stereoscopic display device provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of the product form of the stereoscopic display device provided by the embodiment of the present application.
  • Figure 11a is a schematic diagram of the table display form of the stereoscopic display device provided by the embodiment of the present application.
  • Figure 11b is a schematic diagram of the HUD form of the stereoscopic display device provided by the embodiment of the present application.
  • Figure 11c is a schematic diagram of a possible functional framework of a vehicle provided by an embodiment of the present application.
  • Figure 12 is a schematic flowchart of a stereoscopic display method provided by an embodiment of the present application.
  • Embodiments of the present application provide a stereoscopic display device, a stereoscopic display device, and a stereoscopic display method, which are applied in the field of stereoscopic imaging and specifically realize naked-eye 3D display or free stereoscopic display. It can avoid crosstalk caused by the superposition of multi-layer waveguide device units, and at the same time reduce the thickness of the product, thereby improving the user's viewing experience as much as possible.
  • Naked-eye 3D technology: a 3D display technology in which the display device itself separates the left-eye and right-eye images, so viewers do not need to wear any device to separate them.
  • 3D display technology: stereoscopic 3D images are formed in the human brain by projecting imaging light from different viewing angles to the left and right eyes respectively.
  • 3D display technology can project the imaging light of the left and right eyes to the corresponding human eyes through various means. Among them, in naked-eye 3D technology, the imaging light is divided into left and right eye imaging light, and the imaging light from the left eye perspective is projected to the left eye, and the imaging light from the right eye perspective is projected to the right eye.
  • Naked-eye 3D technology does not require viewers to wear wearable devices and can separate the imaging light for the left and right eyes. It is a technology with high usability for viewers.
  • The left-eye viewing angle is the viewing angle at which the two-dimensional image corresponding to the left eye is captured during the acquisition of the three-dimensional image.
  • The right-eye viewing angle is the viewing angle at which the two-dimensional image corresponding to the right eye is captured during the acquisition of the three-dimensional image.
  • 3D display technology can be used to project 3D images, making the projected images more three-dimensional and vivid.
  • A current naked-eye 3D display device is composed of multiple waveguide device units stacked into a multi-layer composite directional waveguide plate.
  • the naked-eye 3D display is realized by using frequency division to control the illumination of each layer corresponding to the left and right viewing angles.
  • Figure 1 is a schematic diagram of a multi-layer composite directional waveguide plate, in which frequency division is used to sequentially control the illumination beams of each layer: the waveguide device unit 101 corresponds to the left-eye viewing angle, and the waveguide device unit 102 corresponds to the right-eye viewing angle.
  • the waveguide device unit 101/102 also includes a plurality of nanogratings that can be regarded as pixels.
  • Naked-eye 3D display at multiple viewing angles can be achieved by controlling the color and gray scale of the device 103 and by using frequency division to sequentially illuminate the waveguide device units 101 and 102 at different levels.
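The frequency-division driving of this prior-art stack can be sketched as a per-frame time-slot schedule in which each layer is lit in turn. The slot layout below is an illustrative assumption, not a description of any specific product.

```python
from typing import List, Tuple

def layer_lighting_schedule(num_layers: int, frame_rate_hz: float) -> List[Tuple[int, float]]:
    """Time slots for frequency-division driving of a stacked waveguide plate:
    within one display frame, each layer is lit in its own slot in turn.
    Returns (layer index, slot start time in seconds within the frame)."""
    slot_s = 1.0 / (frame_rate_hz * num_layers)
    return [(layer, layer * slot_s) for layer in range(num_layers)]
```

Note that the per-layer slot shrinks as layers are added, which is one reason, alongside thickness and crosstalk, that the single-layer approach of the present application is attractive.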
  • The present application provides a stereoscopic display device, a stereoscopic display apparatus, and a stereoscopic display method, which are applied in the field of 3D display.
  • The stereoscopic display device includes a light source, a waveguide layer, and an image generation module.
  • the light source is used to emit a first light beam at a first time and a second light beam at a second time.
  • the waveguide layer is used for projecting the first light beam to the image generation module at a first angle.
  • The image generation module is used to modulate the first light beam based on the first image data to obtain the first imaging light.
  • the waveguide layer is also used to project the second light beam to the image generation module at a second angle.
  • the image generation module is also used to modulate the second light beam based on the second image data to obtain the second imaging light.
  • the obtained first imaging light and the second imaging light are then converged to the left eye and right eye of the observer respectively.
  • This enables naked-eye 3D display.
  • only one waveguide layer is needed to realize naked-eye 3D display, which avoids crosstalk caused by superposition of different levels of waveguides, reduces the thickness of the three-dimensional display device, and improves the user experience.
  • Figure 2 is a schematic structural diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the three-dimensional display device 200 specifically includes:
  • the light source 201 is used to emit a first light beam at a first time and a second light beam at a second time.
  • the waveguide layer 202 is used to project the first light beam to the image generation module 203 at a first angle, and is also used to project the second light beam to the image generation module 203 at a second angle.
  • the image generation module 203 is used to modulate the first light beam based on the first image data to obtain first imaging light, and to modulate the second light beam based on the second image data to obtain second imaging light. The obtained first imaging light and second imaging light are then converged to the left eye and right eye of the observer respectively.
  • the light source 201 and the image generation module 203 can also be located below or above the waveguide layer, or in other composition modes, which are not limited here.
  • the left and right eyes need to alternately receive images from corresponding viewing angles. Therefore, during the modulation process of the first light beam and the second light beam by the image generation module 203, the loaded first image data and second image data need to be switched according to the time when the light source emits the first light beam and the second light beam. That is, the image generation module 203 loads the first image data at the first time, and the image generation module 203 loads the second image data at the second time. And the first imaging light and the second imaging light modulated by the image generation module 203 are respectively focused on the left eye and the right eye of the observer, thereby realizing naked-eye 3D display.
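The synchronization described above, pairing the beam emitted at each moment with the matching image data, can be sketched as follows. The driver objects and method names are invented purely for illustration; the patent defines no software interface.

```python
class Recorder:
    """Stand-in for the light source 201 and image generation module 203
    drivers; these method names are hypothetical."""
    def __init__(self) -> None:
        self.events = []
    def emit(self, beam: str) -> None:        # light source action
        self.events.append(("emit", beam))
    def load(self, image_data: str) -> None:  # image generation module action
        self.events.append(("load", image_data))

def drive_subframe(t_index: int, driver: Recorder) -> None:
    """One sub-frame of the time-multiplexed scheme: the beam emitted at each
    moment must be paired with the matching image data."""
    if t_index % 2 == 0:   # "first moment": one eye's view
        driver.emit("first_beam")
        driver.load("first_image_data")
    else:                  # "second moment": the other eye's view
        driver.emit("second_beam")
        driver.load("second_image_data")
```

Alternating sub-frames at a sufficiently high rate lets each eye receive its own image stream without visible flicker.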
  • the light source 201 is the left light source at the first moment and the right light source at the second moment; or the light source 201 is the right light source at the first moment and the left light source at the second moment.
  • the light beam emitted by the left light source is projected through the waveguide layer 202 to the image generation module 203 at a certain angle, and the image generation module 203 generates imaging light based on the corresponding image data and converges it to the observer's left eye.
  • Similarly, the light beam emitted by the right light source is projected through the waveguide layer 202 to the image generation module 203 at a certain angle, and the image generation module 203 generates imaging light based on the corresponding image data and converges it to the observer's right eye.
  • In this way, the first light beam and the second light beam emitted by the light source 201 at different times are projected to the image generation module 203 at different angles through the waveguide layer 202; the image generation module modulates the first light beam based on the first image data to obtain the first imaging light, modulates the second light beam based on the second image data to obtain the second imaging light, and the first imaging light and the second imaging light are converged to the observer's left and right eyes respectively.
  • The number of the above-mentioned light sources 201 may be one.
  • In this case, the light source 201 emits the first beam at the first moment and the second beam at the second moment, matching multiple viewing angles as far as possible.
  • At the same time, the volume and weight of the stereoscopic display device are reduced, keeping it as light and thin as possible.
  • The number of the above-mentioned light sources 201 can also be two, one serving as the left light source and the other as the right light source. In this way, the first light beam and the second light beam can be emitted separately at different times, which reduces wear on each light source device, minimizes equipment downtime, and broadens the application scenarios.
  • The example of a single light source 201 above is only intended to aid understanding of the embodiments of the present application and does not substantially limit the solution. In practice, the number of light sources can be determined according to the specific circumstances and is not limited here.
  • the waveguide layer 202 may be a planar waveguide with a rectangular cross-section as shown in FIG. 2 , or may be other strip waveguides or curved surface waveguides, which are not specifically limited here.
  • the light source 201 may be a backlight source, such as a light emitting diode, a cold cathode fluorescent tube, a point light source, a hot cathode fluorescent tube, a flat fluorescent lamp, or other laser light sources, which are not limited here.
  • the image generation module 203 may be a transmissive image generation module, used to modulate the incident illumination beam and transmit the modulated imaging light.
  • the image generation module 203 can be a liquid crystal display (LCD), an organic light-emitting diode (OLED) or other transmissive image generation module, which is not limited here.
  • the material of the grating group may be an electro-optical polymer material or a non-electro-optical polymer material.
  • the refractive index of the grating group can be adjusted according to the voltage to project the first light beam and the second light beam at different angles.
  • When the grating group is made of a non-electro-optical polymer material, the first beam and the second beam can be projected at different angles by using grating groups composed of such materials with different grating parameters, as described in subsequent embodiments and not detailed here.
  • the waveguide layer 202 includes a first grating group and a first waveguide medium 20212, where the first grating group consists of the first grating 20211 and the second grating 20213.
  • the first grating group is used to transmit the first light beam in the first waveguide medium 20212 at a first diffraction angle, and to project the first light beam to the image generation module 203 at a first angle.
  • the first grating group is also used to cause the second light beam to transmit within the first waveguide medium 20212 at a second diffraction angle, and to cause the second light beam to be projected to the image generation module 203 at a second angle.
  • The grating may be processed directly on the waveguide medium, or processed on a film that is attached to the side opposite the light entrance surface or opposite the light exit surface, or embedded in the waveguide medium; the details are not limited here.
  • Figure 3 is a schematic diagram of a grating group and a waveguide medium provided by an embodiment of the present application.
  • The first waveguide medium 20212 also includes a light entrance surface and a light exit surface.
  • The light entrance surface is used to receive the first light beam and the second light beam, and the light exit surface is used to project the first light beam and the second light beam.
  • The first grating 20211 is located above the light entrance surface, and the second grating 20213 is located below the light exit surface.
  • the area of the light-emitting surface is not less than the area of the image generation module 203, so as to ensure the quality of the imaging light and improve the user experience. It can be understood that in other application scenarios, the area of the light emitting surface can also be smaller than the area of the image generation module 203, and there is no specific limitation here.
  • FIG. 4 is a schematic diagram of projecting the first light beam provided by the embodiment of the present application.
  • the first grating 20211 of the first grating group receives the first light beam from the light incident surface, and the first grating 20211 transmits the first light beam to the second grating 20213 in the first waveguide medium 20212 at a first diffraction angle.
  • The surfaces of the first waveguide medium 20212 are reflective, so after the first grating 20211 diffracts the first light beam into the first waveguide medium 20212 at the first diffraction angle, the beam continues to propagate within the medium at the first diffraction angle until it reaches the second grating 20213, and the second grating 20213 projects the first light beam from the light exit surface to the image generation module 203 at a first angle.
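The propagation between the two gratings relies on total internal reflection inside the waveguide medium. A minimal sketch, assuming a planar guide in air and purely illustrative dimensions (the patent specifies none):

```python
import math

def is_guided(theta_deg: float, n_wg: float, n_clad: float = 1.0) -> bool:
    """The diffracted beam stays inside the waveguide medium only above the
    critical angle, i.e. sin(theta) > n_clad / n_wg (total internal reflection)."""
    return math.sin(math.radians(theta_deg)) > n_clad / n_wg

def bounces_to_outcoupler(distance_mm: float, thickness_mm: float, theta_deg: float) -> int:
    """Each reflection advances the guided beam by thickness * tan(theta) along
    the plate, so this counts the reflections before the beam reaches the
    out-coupling grating. All dimensions are illustrative assumptions."""
    advance_mm = thickness_mm * math.tan(math.radians(theta_deg))
    return int(distance_mm // advance_mm)
```

For a guide with index 1.5 the critical angle is about 41.8 degrees, so a diffraction angle of 60 degrees is guided while 30 degrees simply leaks out; the first grating's job is to put the beam above that threshold.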
  • FIG. 5 is a schematic diagram of projecting the second light beam provided by the embodiment of the present application.
  • the first grating 20211 receives the second light beam from the light entrance surface and transmits it to the second grating 20213 in the first waveguide medium 20212 at a second diffraction angle. As in the aforementioned Figure 4, the surfaces of the first waveguide medium 20212 are reflective, so the first grating 20211 diffracts the second light beam into the first waveguide medium 20212 at the second diffraction angle.
  • the second beam can continue to be diffracted to the second grating 20213 at a second diffraction angle.
  • the second grating 20213 projects the second light beam from the light emitting surface to the image generation module 203 at a second angle.
  • The angles at which the light source 201 emits the first light beam and the second light beam in the above example are only used for understanding the embodiments of the present application. In practice, the angles at which the two light beams reach the light entrance surface can differ as shown in Figure 2, or the beams can be emitted at the same angle; there is no specific limit here.
  • a first grating group and a first waveguide medium are used to project the first light beam to the image generation module at a first angle and the second light beam at a second angle, which reduces the number of component units of the three-dimensional display device and realizes a light, thin, low-crosstalk three-dimensional display device.
  • the material of the first grating group is an electro-optic polymer material.
  • the refractive index of the first grating group may be controlled by the voltage applied to the first grating group. That is, the first diffraction angle and the second diffraction angle can be controlled by the voltage applied to the first grating 20211, and the first angle and the second angle can be controlled by the voltage applied to the second grating 20213.
  • the first grating 20211 and the second grating 20213 may be composed of different electro-optic polymer materials, or of the same electro-optic polymer material; there is no specific limitation here.
  • the first grating group is composed of an electro-optic polymer material, and the first light beam and the second light beam are projected to the image generation module at different angles under voltage control, so that the three-dimensional display device can realize naked-eye 3D display with one grating group and one waveguide medium, keeping the device as light and thin as possible while avoiding crosstalk interference as much as possible.
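  • As an illustration of voltage-controlled refraction, the sketch below models the index shift with a linear (Pockels) electro-optic term; the base index, coefficient, and layer thickness are hypothetical and serve only to show the direction and rough magnitude of the effect:

```python
def index_under_voltage(n0, r_pm_per_v, voltage_v, layer_um):
    """Pockels-effect estimate: delta_n = -0.5 * n0^3 * r * E,
    with E the field applied across the polymer grating layer."""
    field = voltage_v / (layer_um * 1e-6)                 # field in V/m
    return n0 - 0.5 * n0 ** 3 * (r_pm_per_v * 1e-12) * field

# Hypothetical polymer: n0 = 1.7, r = 100 pm/V, 5 um grating layer
n_at_0v = index_under_voltage(1.7, 100, 0.0, 5.0)
n_at_10v = index_under_voltage(1.7, 100, 10.0, 5.0)
# A higher voltage lowers the effective index slightly, which in turn
# shifts the diffraction angle set by the grating equation.
```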
  • FIG. 6 is another schematic structural diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the three-dimensional display device 200 further includes a voltage control module 204 connected to the first grating group, that is, to the first grating 20211 and the second grating 20213. At the first moment, when the light source 201 emits the first light beam, the voltage control module 204 applies a first voltage to the first grating group. The first grating group adjusts the first diffraction angle and the first angle based on the first voltage, so that the first light beam is transmitted within the first waveguide medium 20212 at the first diffraction angle and is projected to the image generation module 203 at the first angle.
  • the first voltage includes a combination of two voltage values: the voltage control module 204 applies the two voltage values of the first voltage to the first grating 20211 and the second grating 20213 respectively, so that the first grating 20211 adjusts the first diffraction angle based on its applied voltage value and the second grating 20213 adjusts the first angle based on its applied voltage value.
  • since the materials of the first grating 20211 and the second grating 20213 may be the same or different, the two voltage values of the first voltage may be the same or different; the details are not limited here.
  • the voltage control module 204 is also used to apply a second voltage to the first grating group, and the first grating group then adjusts the second diffraction angle and the second angle, so that the second light beam is transmitted in the first waveguide medium 20212 at the second diffraction angle and is projected to the image generation module 203 at the second angle.
  • the second voltage includes a combination of two voltage values.
  • the voltage control module 204 applies the two voltage values of the second voltage to the first grating 20211 and the second grating 20213 respectively, so that the first grating 20211 adjusts the second diffraction angle based on its applied voltage value and the second grating 20213 adjusts the second angle based on its applied voltage value. It should be noted that since the materials of the first grating 20211 and the second grating 20213 may be the same or different, the two voltage values of the second voltage may be the same or different; the details are not limited here.
  • different voltages can thus be used to control the refractive index of the grating group and hence the projection angles of the first light beam and the second light beam, making the stereoscopic display device more flexible.
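  • One way to picture the time-division control described above is a frame sequence that alternates the lit beam and the applied voltage pair; the voltage values below are hypothetical placeholders, not values from the embodiment:

```python
from itertools import cycle

# Hypothetical voltage pairs (in-coupling grating, out-coupling grating)
V_FIRST = (3.0, 2.5)    # first voltage, applied while the first beam is lit
V_SECOND = (5.0, 4.2)   # second voltage, applied while the second beam is lit

def frame_sequence():
    """Alternate beams and grating voltages frame by frame (time division)."""
    return cycle([("first_beam", V_FIRST), ("second_beam", V_SECOND)])

seq = frame_sequence()
frames = [next(seq) for _ in range(4)]
# frames alternate first/second/first/second: one eye's image per sub-frame
```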
  • the stereoscopic display device can operate in display modes such as a single-person mode, a two-person mode, and/or other multi-person modes, and different display modes correspond to different voltages. In other words, the values of the first voltage and the second voltage corresponding to the single-person mode may differ from the values of the first voltage and the second voltage corresponding to a multi-person mode.
  • in the single-person mode, the first imaging light and the second imaging light can correspond to the left and right eyes of a single person respectively.
  • in a multi-person mode, the first imaging light and the second imaging light need to correspond to the left and right eyes of multiple people respectively.
  • the first diffraction angle and the first angle can be adjusted based on the first voltage corresponding to the multi-person mode, so that the first light beam is finally projected to the image generation module 203 at the first angle and the resulting first imaging light is projected to the left eyes of multiple people simultaneously.
  • the value of the first voltage can also be adjusted in a time-division manner, so that the first light beam is projected to the image generation module 203 at different first angles and the first imaging light is projected to the left eyes of different people in different time slots.
  • the second imaging light corresponding to the right eyes of multiple people is obtained in a way similar to the first imaging light corresponding to the left eyes of multiple people described above, and the details are not repeated here. Therefore, adjusting the first diffraction angle and the first angle based on the first voltage, and the second diffraction angle and the second angle based on the second voltage, allows the stereoscopic display device to be flexibly adapted to multiple application scenarios, increasing the device's range of applications.
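  • The time-division steering for multiple left eyes can be sketched as a sub-frame schedule, where each slot re-tunes the first voltage to steer the beam to a different viewer; the angles and refresh rate are hypothetical:

```python
def subframe_schedule(viewer_angles_deg, refresh_hz=60):
    """Time-division plan: within one frame period, steer the left-eye beam
    to each viewer in turn by re-tuning the first voltage."""
    n = len(viewer_angles_deg)
    slot_ms = 1000.0 / (refresh_hz * n)      # duration of one viewer's slot
    return [(round(i * slot_ms, 3), angle)
            for i, angle in enumerate(viewer_angles_deg)]

plan = subframe_schedule([-6.0, 0.0, 6.0])   # three viewers' left eyes
# Each tuple: (start time within the frame in ms, target first angle in deg)
```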
  • the waveguide layer 202 may include a first grating group, a first waveguide medium, a second grating group, and a second waveguide medium.
  • the following will continue to describe this application by taking another three-dimensional display device as an example.
  • FIG. 7 is another structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the waveguide layer 702 includes a first grating group (ie, the first grating 70211 and the second grating 70213) and a first waveguide medium 70212, and a second grating group (ie, the third grating 70221 and the fourth grating 70223) and a second waveguide medium 70222.
  • the first waveguide medium 70212 and the second waveguide medium 70222 are on the same horizontal plane,
  • the first grating 70211 and the third grating 70221 are on the same horizontal plane, and
  • the second grating 70213 and the fourth grating 70223 are on the same horizontal plane.
  • in actual situations, the first waveguide medium 70212 and the second waveguide medium 70222 can be arranged at staggered heights within a certain range; they are not strictly required to be on the same level. It is only necessary to avoid, as far as possible, the waveguide media stacking into multiple layers and occupying extra space, so as to reduce the thickness of the waveguide layer.
  • the first grating 70211 and the third grating 70221, and the second grating 70213 and the fourth grating 70223, can likewise be arranged at different heights; the details are not limited here.
  • the first grating group is used to cause the first light beam to transmit in the first waveguide medium 70212 at a first diffraction angle, and to cause the first light beam to be projected to the image generation module 703 at a first angle.
  • the second grating group is used to transmit the second light beam in the second waveguide medium 70222 at a second diffraction angle, and to project the second light beam to the image generation module 703 at a second angle.
  • the first waveguide medium 70212 and the second waveguide medium 70222 respectively include a light incident surface and a light exit surface.
  • the light incident surface is used to receive the first light beam or the second light beam, and the light exit surface is used to project the first light beam or the second light beam.
  • the first grating 70211 and the third grating 70221 are located above the light incident surface, and the second grating 70213 and the fourth grating 70223 are located below the light exit surface.
  • the first grating 70211 receives the first light beam from the light incident surface, and the first grating 70211 causes the first light beam to transmit to the second grating 70213 in the first waveguide medium 70212 at a first diffraction angle.
  • the second grating 70213 causes the first light beam to be projected from the light exit surface to the image generation module 703 at a first angle.
  • the third grating 70221 receives the second light beam from the light incident surface, and the third grating 70221 causes the second light beam to transmit to the fourth grating 70223 in the second waveguide medium 70222 at a second diffraction angle.
  • the fourth grating 70223 causes the second light beam to be projected from the light exit surface to the image generation module 703 at a second angle.
  • the grating parameters of the first grating group and the second grating group are different; in actual situations they can be determined according to the specific directions of the first imaging light and the second imaging light corresponding respectively to the left-eye and right-eye viewing angles. The details are not limited here.
  • the number of light sources 701 is two, used respectively as a left light source and a right light source to emit the first light beam at the first moment and the second light beam at the second moment. This is only an example for understanding, and the embodiments of the present application do not substantively limit this solution. It can be understood that a single light source can also emit the first light beam at the first moment and the second light beam at the second moment; the details are not limited here.
  • the first grating group and the second grating group, which have different grating parameters, project the first light beam to the image generation module at the first angle and the second light beam at the second angle respectively, obtaining the first imaging light and the second imaging light corresponding to the left and right eyes, thereby realizing naked-eye 3D display with a light, thin, low-crosstalk three-dimensional display device.
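  • The statement that the two grating groups need different grating parameters can be illustrated by solving the out-coupling grating equation for the pitch; the wavelength, indices, guided angle, and target exit angles below are hypothetical design values:

```python
import math

def pitch_for_exit_angle(target_deg, guided_deg, wavelength_nm,
                         n_wg=1.5, n_out=1.0, order=-1):
    """Out-coupling grating equation solved for the pitch:
    n_out*sin(target) = n_wg*sin(guided) + m*lambda/pitch."""
    delta = (n_out * math.sin(math.radians(target_deg))
             - n_wg * math.sin(math.radians(guided_deg)))
    return order * wavelength_nm / delta

# Hypothetical design: both groups guide 532 nm light at 52 deg inside
# n = 1.5 material, but exit toward the left eye (-4 deg) or right eye (+4 deg).
pitch_left = pitch_for_exit_angle(-4.0, 52.0, 532)
pitch_right = pitch_for_exit_angle(+4.0, 52.0, 532)
# Different target angles yield different pitches, so the two grating
# groups must indeed have different grating parameters.
```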
  • the waveguide layer 702 also includes a third grating group, a third waveguide medium, a fourth grating group, and a fourth waveguide medium.
  • Figure 8a is another structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the waveguide layer 702 includes a first grating group 7021 and a first waveguide medium 70212, a second grating group 7022 and a second waveguide medium 70222, a third grating group 7023 and a third waveguide medium 70232, and a fourth grating group 7024 and a fourth waveguide medium 70242; each combination of waveguide medium and grating group is located on the same horizontal plane, that is, arranged side by side as shown in Figure 8b.
  • Figure 8b is a schematic diagram of the waveguide layer provided by the embodiment of the present application.
  • the third grating group 7023 and the fourth grating group 7024 have different grating parameters.
  • at least two light sources 701 emit the first light beam to the light incident surfaces of the first waveguide medium 70212 and the third waveguide medium 70232 respectively at the first moment, and emit the second light beam to the light incident surfaces of the second waveguide medium 70222 and the fourth waveguide medium 70242 respectively at the second moment.
  • the first grating group 7021 is used to transmit the first light beam in the first waveguide medium 70212 at the first diffraction angle, and to project the first light beam to the image generation module 703 at the first angle.
  • the third grating group 7023 is used to transmit the first light beam in the third waveguide medium 70232 at a third diffraction angle and to project the first light beam to the image generation module 703 at a third angle; the image generation module 703 modulates the first light beam based on the first image data to obtain first imaging light corresponding to multiple left-eye or right-eye viewing angles.
  • the second grating group 7022 is used to transmit the second light beam in the second waveguide medium 70222 at a second diffraction angle, and to project the second light beam to the image generation module 703 at a second angle.
  • the fourth grating group 7024 is used to transmit the second light beam within the fourth waveguide medium 70242 at a fourth diffraction angle and to project the second light beam to the image generation module 703 at a fourth angle; the image generation module 703 modulates the second light beam based on the second image data to obtain second imaging light corresponding to multiple left-eye or right-eye viewing angles. The details are similar to those described for Figure 7 and are not repeated here.
  • the three-dimensional display device obtains the first imaging light and the second imaging light corresponding to multiple left and right viewing angles based on a waveguide layer composed of multiple grating groups and waveguide media, thereby achieving at least a two-person display mode and suiting more complex application scenarios.
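  • A minimal sketch of the four-channel arrangement, assuming a hypothetical two-person layout in which each (viewer, eye) pair is served by one grating group and one exit angle; the viewer names and angle values are placeholders:

```python
# Hypothetical two-person mapping: each (viewer, eye) pair owns one
# grating-group/waveguide channel and one exit angle.
CHANNELS = {
    ("viewer_A", "left"):  {"grating_group": 1, "exit_angle_deg": -8.0},
    ("viewer_A", "right"): {"grating_group": 2, "exit_angle_deg": -4.0},
    ("viewer_B", "left"):  {"grating_group": 3, "exit_angle_deg": 4.0},
    ("viewer_B", "right"): {"grating_group": 4, "exit_angle_deg": 8.0},
}

def route(viewer, eye):
    """Look up which grating group serves a given eye of a given viewer."""
    return CHANNELS[(viewer, eye)]
```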
  • the material of the first grating group 7021, the second grating group 7022, the third grating group 7023 and/or the fourth grating group 7024 is a non-electro-optical polymer material.
  • the first grating group 7021 and the second grating group 7022 with different grating parameters, and the third grating group 7023 and the fourth grating group 7024 with different grating parameters can be realized by using different non-electro-optical polymer materials.
  • different non-electro-optical polymer materials are used to realize grating groups with different grating parameters, which reflects the diversity and selectivity of the stereoscopic display device.
  • alternatively, the materials of the first grating group 7021, the second grating group 7022, the third grating group 7023 and/or the fourth grating group 7024 may be electro-optic polymer materials with different grating parameters, in which case no voltage needs to be applied.
  • each waveguide medium and each grating group being shown on the same horizontal plane in Figure 8b is only an example for understanding the embodiment of the present application and does not substantively limit this solution. It can be understood that, similarly to what was described for Figure 7, there can be height differences within a certain range between the waveguide media or between the grating groups; it is only necessary to avoid the waveguide media stacking into multiple layers, so as to reduce the thickness of the waveguide layer as much as possible. The details are not limited here.
  • in the foregoing embodiments, one grating of the first grating group is located above the waveguide medium and the other below it; this merely corresponds to the embodiment of Figure 2, in which the light source and the image generation module are on opposite sides of the waveguide layer. It can be understood that in other application scenarios, for example when the light source and the image generation module are both above or both below the waveguide layer, the two gratings of the first grating group can be located on the same side of the waveguide medium; this is not limited here.
  • FIG. 9 is a schematic structural diagram of a stereoscopic display device provided by an embodiment of the present application.
  • the stereoscopic display device 900 includes a control module 902 and a stereoscopic display device 903 .
  • the three-dimensional display device 903 is the three-dimensional display device described in any of Figures 2 to 8a.
  • the control module is used to control the light source of the stereoscopic display device 903 to emit the first light beam at the first moment and the second light beam at the second moment, and to control the image generation module of the stereoscopic display device, at the first moment, to modulate the first light beam based on the first image data to obtain the first imaging light, and, at the second moment, to modulate the second light beam based on the second image data to obtain the second imaging light.
  • the control module 902 may be a hardware module such as a processor, a computing unit, an integrated circuit or an integrated chip, or may be implemented in software. It can be understood that the control module 902 may include a hardware structure and/or a software module, and may be implemented in the form of a hardware structure, a software module, or a hardware structure plus a software module. There are no specific limitations here.
  • the stereoscopic display device realizes control of the three-dimensional display device through the control module, thereby realizing naked-eye 3D display, with the advantages of being light and thin and having low crosstalk.
  • when the waveguide layer of the three-dimensional display device further includes a third grating group, a third waveguide medium, a fourth grating group and a fourth waveguide medium, and the display mode of the stereoscopic display device 900 is at least a two-person mode, the control module 902 is also used to control the third grating group of the three-dimensional display device 903 so that the first light beam is transmitted in the third waveguide medium at a third diffraction angle and is projected to the image generation module at a third angle, and to control the fourth grating group so that the second light beam is transmitted in the fourth waveguide medium at a fourth diffraction angle and is projected to the image generation module at a fourth angle.
  • the control module 902 can control the light source to emit the first light beam and the second light beam to the light incident surfaces of the third waveguide medium and the fourth waveguide medium respectively, thereby projecting the first light beam and the second light beam at different angles. This can be applied to at least a two-person mode, corresponding to application scenarios with multiple left and right viewing angles.
  • the above-mentioned stereoscopic display device 900 also includes a sensor 901.
  • the sensor 901 is used to obtain information about external observers and feed it back to the control module 902; the feedback information includes at least one of human eye position information, head information, and thermal imaging information.
  • the sensor 901 may be a thermal sensor, a position sensor, an infrared sensor, or another type of sensor; the details are not limited here.
  • control module 902 is also used to determine the display mode based on the information fed back by the sensor 901 .
  • this adds application scenarios and implementation methods for the stereoscopic display device, increasing its diversity.
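  • The mode selection driven by sensor feedback might be sketched as follows, assuming (as a simplification of the feedback information listed above) that the tracker reports one eye-position pair per detected viewer:

```python
def select_mode(tracked_eye_pairs):
    """Choose the display mode from sensor feedback: one list entry per
    detected viewer, e.g. an (left_eye, right_eye) position pair."""
    n = len(tracked_eye_pairs)
    if n <= 1:
        return "single-person"
    if n == 2:
        return "two-person"
    return "multi-person"

# Two viewers detected at hypothetical eye coordinates (mm)
mode = select_mode([((-30, 0), (30, 0)), ((-30, 500), (30, 500))])
# The control module would then pick the voltage set for that mode.
```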
  • the three-dimensional display device 900 has various product forms. As shown in FIG. 10 , FIG. 10 is a schematic diagram of the product form of a stereoscopic display device provided by an embodiment of the present application.
  • the stereoscopic display device 900 may include a 3D display, a 3D projector, a 3D wearable device, etc.
  • the 3D display can be the display screen of a computer monitor, a mobile phone, a laptop, a personal digital assistant (PDA), a game console, or another mobile device.
  • the 3D projector can be used in front projection scenes and rear projection scenes, and this application does not limit this.
  • the stereoscopic display device 900 may be a car light, a desktop display device, a head up display (HUD) device, etc.
  • 3D wearable devices can be augmented reality (AR)/virtual reality (VR) glasses, AR/VR helmets, smart watches, etc. This application does not limit this.
  • the three-dimensional display device 900 provided by the embodiment of the present application can be applied to vehicles such as cars and boats, and the details are not limited here.
  • Figure 11a is a schematic diagram of the desktop display form of the stereoscopic display device provided by the embodiment of the present application.
  • when the stereoscopic display device 900 is a desktop display device, the three-dimensional display device 1100 in the stereoscopic display device 900 outputs the first imaging light and the second imaging light.
  • the first imaging light and the second imaging light are reflected by the optical element and the free-form surface mirror and are projected through the optical element to the human eye, where an image is presented.
  • the optical element is used to reflect or transmit a part of the imaging light.
  • the stereoscopic display device can be applied in a head-up display (HUD) scene.
  • a vehicle includes the aforementioned three-dimensional display device and a windshield, as shown in Figure 11b.
  • Figure 11b is a schematic diagram of the HUD form of the three-dimensional display device provided by an embodiment of the present application.
  • the optical element in Figure 11a mentioned above is the windshield.
  • the stereoscopic display device on the stereoscopic display device outputs first imaging light and second imaging light.
  • the first imaging light and the second imaging light pass through optical elements such as a diffusion screen and a free-form surface mirror (freeform mirror for short) and are projected onto the windshield.
  • the windshield reflects the first imaging light and the second imaging light to the human eye, thereby presenting an image on the human eye.
  • the windshield is only an example of an optical element.
  • the optical element can also be made of other materials, which is not limited here.
  • the three-dimensional display device can be a HUD; the HUD can be applied to cars, airplanes and other means of transportation, and can also be applied to central control rooms, architectural landscapes, advertising and other scenarios, which is not limited here.
  • the main function of the windshield in Figure 11b is to reflect imaging light, so the types of optical elements in these scenarios are not limited.
  • FIG. 11c is a schematic diagram of a possible functional framework of a vehicle provided by an embodiment of the present application.
  • the functional framework of the vehicle may include various subsystems, such as the sensor system 12, the control system 14, one or more peripheral devices 16 (one is shown as an example in the figure), a power supply 18, a computer system 20 and a head-up display system 22 in the figure. The vehicle may also include other functional systems, such as an engine system that provides power for the vehicle, which is not limited in this application.
  • the sensor system 12 may include several detection devices, which can sense the measured information and convert the sensed information into electrical signals or other required forms of information output according to certain rules.
  • these detection devices may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera device, a wheel speed sensor, a steering sensor, a gear sensor, or other components used for automatic detection, which are not limited in this application.
  • the control system 14 may include several elements, such as the illustrated steering unit, braking unit, lighting system, automatic driving system, map navigation system, network time synchronization system and obstacle avoidance system.
  • the control system 14 may also include components such as a throttle controller and an engine controller for controlling the driving speed of the vehicle, which are not limited in this application.
  • Peripheral device 16 may include several elements, such as a communication system, a touch screen, a user interface, a microphone and a speaker as shown, among others.
  • the communication system is used to realize network communication between vehicles and other devices other than vehicles.
  • the communication system can use wireless communication technology or wired communication technology to realize network communication between vehicles and other devices.
  • the wired communication technology may refer to communication between vehicles and other devices through network cables or optical fibers.
  • the power source 18 represents a system that provides power or energy to the vehicle, which may include, but is not limited to, rechargeable lithium batteries or lead-acid batteries, etc. In practical applications, one or more battery components in the power supply are used to provide electric energy or energy for starting the vehicle. The type and material of the power supply are not limited in this application.
  • the computer system 20 may include one or more processors 2001 (one processor is shown as an example) and a memory 2002 (which may also be referred to as a storage device). The memory 2002 may be inside the computer system 20 or outside it, for example serving as a cache in the vehicle, which is not limited in this application.
  • Processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (GPU).
  • the processor 2001 may be used to run relevant programs or instructions corresponding to the programs stored in the memory 2002 to implement corresponding functions of the vehicle.
  • the memory 2002 may include volatile memory, such as RAM; it may also include non-volatile memory, such as ROM, flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the memory 2002 may also include a combination of the above types of memory.
  • the memory 2002 may be used to store a set of program codes for vehicle control; the processor 2001 calls the program codes to control the safe driving of the vehicle. How safe driving of the vehicle is achieved will be described in detail below in this application.
  • the memory 2002 may also store information such as road maps, driving routes, sensor data, and the like.
  • the computer system 20 can be combined with other components in the vehicle, such as sensors in the sensor system, GPS, etc., to implement vehicle-related functions.
  • the computer system 20 can control the driving direction or driving speed of the vehicle based on data input from the sensor system 12 , which is not limited in this application.
  • the head-up display system 22 may include several elements, such as the illustrated windshield, a controller and a head-up display.
  • the controller 222 is configured to generate an image according to user instructions (for example, an image containing vehicle status such as vehicle speed and power/fuel level, or an image of augmented reality (AR) content) and send the image to the head-up display for display; the head-up display may include an image generation module and a reflector combination, and the windshield cooperates with the head-up display to realize the light path of the head-up display system, so that the target image is presented in front of the driver.
  • the functions of some components in the head-up display system can also be implemented by other subsystems of the vehicle.
  • the controller can also be a component in the control system.
  • it should be noted that Figure 11c of this application shows four subsystems, namely the sensor system 12, the control system 14, the computer system 20 and the head-up display system 22, only as an example; this does not constitute a limitation.
  • vehicles can combine several components in the vehicle according to different functions to obtain subsystems with corresponding different functions.
  • the vehicle may include more or fewer systems or components, which is not limited by this application.
  • the above-mentioned means of transportation can be cars, trucks, motorcycles, buses, boats, airplanes, helicopters, lawn mowers, recreational vehicles, playground vehicles, construction equipment, trams, golf carts, trains, trolleys, and so on; the embodiments of this application are not particularly limited.
  • FIG. 12 is a schematic flowchart of a stereoscopic display method provided by an embodiment of the present application. This method can be applied to any of the aforementioned three-dimensional display devices. As shown in Figure 12, the method includes:
  • the first grating group causes the first light beam to be transmitted in the first waveguide medium at a first diffraction angle, and causes the first light beam to be projected at a first angle.
  • the second light beam is transmitted in the first waveguide medium at a second diffraction angle through the first grating group, and the second light beam is projected at a second angle.
  • a first voltage is applied to the first grating group, and the first diffraction angle and the first angle are adjusted.
  • a second voltage is applied to the first grating group to adjust the second diffraction angle and the second angle.
  • the material of the first grating group is an electro-optical polymer material.
  • the waveguide layer when the waveguide layer includes a first grating group, a first waveguide medium, a second grating group, and a second waveguide medium, the first waveguide medium and the second waveguide medium are located on the same horizontal plane.
  • the first grating group causes the first light beam to be transmitted within the first waveguide medium at a first diffraction angle and causes the first light beam to be projected at a first angle
  • the second grating group causes the second light beam to be transmitted within the second waveguide medium at a second diffraction angle, and causes the second light beam to be projected at a second angle.
  • the waveguide layer further includes a third grating group, a third waveguide medium, a fourth grating group, and a fourth waveguide medium; the third waveguide medium and the fourth waveguide medium are located on the same horizontal plane.
  • the first light beam is transmitted in the third waveguide medium at a third diffraction angle through the third grating group, and the first light beam is projected at a third angle.
  • the second beam is transmitted in the fourth waveguide medium at a fourth diffraction angle through the fourth grating group, and the second beam is projected at a fourth angle.
  • the waveguide medium includes a light incident surface and a light exit surface, the light entrance surface is used to receive the first light beam and the second light beam, and the light exit surface is used to project the first light beam and the second light beam.
  • the three-dimensional display apparatus, three-dimensional display device, and three-dimensional display method provided by the embodiments of this application can be used in offices, education, medical care, entertainment, games, advertising, architectural decoration, and event broadcasts, as well as in exhibitions of handicrafts and collections and in screenings of performances such as dramas, operas, and concerts.
  • in office and education scenarios, they can be used in computer displays, conference projectors, conference flat-panel displays, and other equipment.
  • in medical scenarios, they can be used in medical monitors or surgical microscopes to enrich the displayed content (3D imaging can show objects and the depth distance between objects), upgrading the object information available to medical staff from 2D to 3D and thereby improving the accuracy of remote medical diagnosis or medical examination.
  • in entertainment, event-broadcast, and performance scenarios, 3D images can be displayed on the screens of game consoles, mobile phones, tablets, and similar devices, or through game projectors, making the displayed image more three-dimensional and vivid and enhancing the user's sense of presence.
  • the data loaded on the image generation module may be a digital signal corresponding to a pre-prepared 3D image, or a digital signal corresponding to a 3D image generated in real time.
  • for example, in an event-broadcast scenario, two cameras at the competition venue can capture images from the left-eye perspective and the right-eye perspective respectively; the binocular images captured in real time are converted into digital signals and loaded onto the image modulator in real time, realizing a live on-site broadcast.
  • the data may also be free-viewpoint image data; that is, the viewing angle can be changed to enhance interactivity.
  • the device embodiments described above are only illustrative.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the connection relationship between modules indicates that there are communication connections between them, which can be specifically implemented as one or more communication buses or signal lines.
  • the present application can be implemented by software plus the necessary general-purpose hardware, or by dedicated hardware including dedicated integrated circuits, dedicated CPUs, dedicated memories, dedicated components, and the like. In general, any function performed by a computer program can easily be implemented with corresponding hardware, and the specific hardware structures used to implement the same function can be diverse, such as analog circuits, digital circuits, or special-purpose circuits. However, for this application, a software implementation is the better choice in most cases. Based on this understanding, the technical solution of the present application, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product.
  • the computer software product is stored in a readable storage medium, such as a computer floppy disk, USB flash drive, removable hard disk, read-only memory (ROM), random access memory (RAM), magnetic disk, or optical disc, and includes several instructions to cause a computer device (which may be a personal computer, training device, network device, etc.) to execute the methods of the various embodiments of this application.
  • a computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, computer instructions may be transmitted from a website, computer, training device, or data center to another website, computer, training device, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that a computer can store, or a data storage device such as a training device or data center integrating one or more available media.
  • available media may be magnetic media (e.g., floppy disks, hard disks, tapes), optical media (e.g., high-density digital video discs (DVD)), or semiconductor media (e.g., solid-state drives (SSD)).
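The time-multiplexed operation summarized in the bullets above (first beam with first image data at the first moment, second beam with second image data at the second moment, converging to the left and right eyes respectively) can be sketched in code. This is an illustrative sketch only; the hook names `emit_beam` and `load_image` and the two-slot frame structure are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical sketch of the frame-sequential control loop described above:
# in the first time slot the source emits the first beam and the image
# generation module loads the left-eye image data; in the second time slot
# it emits the second beam and loads the right-eye image data.

def run_frame(left_image, right_image, emit_beam, load_image):
    """One display frame: two time slots, one per eye."""
    schedule = []
    for slot, (beam_id, image) in enumerate(
        [(1, left_image), (2, right_image)], start=1
    ):
        emit_beam(beam_id)   # source emits beam 1 or beam 2
        load_image(image)    # module modulates that beam with this image data
        schedule.append((slot, beam_id, image))
    return schedule

# minimal demo with stub hardware hooks that just record the call order
events = []
sched = run_frame(
    "L-view", "R-view",
    emit_beam=lambda b: events.append(("beam", b)),
    load_image=lambda img: events.append(("image", img)),
)
```

Run fast enough, the alternation is invisible and each eye receives only its own view, which is what lets the two modulated beams fuse into a stereoscopic image.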

Abstract

一种立体显示装置(200)、立体显示设备(900)以及立体显示方法,应用于立体显示领域中。立体显示装置(200)包括光源(201)、波导层(202)以及图像生成模组(203)。其光源(201)用于在第一时刻发射第一光束以及在第二时刻发射第二光束。波导层(202)用于将第一光束以第一角度投射至图像生成模组(203)。图像生成模组(203)用于基于第一图像数据调制第一光束得到第一成像光。波导层(202)还将第二光束以第二角度投射至图像生成模组(203)。图像生成模组(203)还用于基于第二图像数据调制第二光束得到第二成像光。且得到的第一成像光以及第二成像光分别汇聚至观察者的左眼以及右眼,实现了裸眼3D显示。无需多层波导即可实现3D裸眼显示,减轻了装置的厚度,实现了低串扰,提升了用户的体验。

Description

一种立体显示装置、立体显示设备以及立体显示方法
本申请要求于2022年8月12日提交中国国家知识产权局、申请号为202210970221.6、申请名称为“一种立体显示装置、立体显示设备以及立体显示方法”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及显示领域,尤其涉及一种立体显示装置、立体显示设备以及立体显示方法。
背景技术
随着经济和科技的飞速发展,物质文化生活日益丰富,对于显示设备的需求也不断增长。现今社会中三维(three-dimension,3D)显示越来越受到消费者的青睐。
当前的一种3D显示装置是由多层波导器件单元叠加为多层复合型指向性导管板构成,进而采用分频控制各层依次照明的方式,不同层的波导器件单元照明的光线分别对应左右眼视角,以此实现裸眼3D显示。但是,通过采用多层波导器件单元叠加导致产品体积比较厚重,且器件叠加会产生串扰,影响用户的观看体验。
发明内容
本申请提供了一种立体显示装置、立体显示设备以及立体显示方法。实现了裸眼3D显示或者自由立体显示。可以避免多层波导器件单元叠加导致的串扰,同时能减少产品的厚度,从而尽可能的提高用户的观看体验。
第一方面,提供了一种立体显示装置,该立体显示装置包括光源、波导层以及图像生成模组。
其中,光源用于在第一时刻发射第一光束,以及在第二时刻发射第二光束。
波导层,用于将第一光束以第一角度投射至图像生成模组。
图像生成模组,用于基于第一图像数据调制第一光束得到第一成像光。
波导层,还用于将第二光束以第二角度投射至图像生成模组。
图像生成模组,还用于基于第二图像数据调制第二光束得到第二成像光。
且得到的第一成像光以及第二成像光分别汇聚至观察者的左眼以及右眼。
示例性的,其光源可以位于波导层的下方,图像生成模组位于波导层的上方,也可以光源与图像生成模组均为与波导层的上方或下方,或者还有其他组成方式,具体此处不做限定。
在本申请的实施方式中,基于波导层将第一光束以及第二光束以不同角度投射至图像生成模组得到分别对应左右眼的第一成像光以及第二成像光,实现了裸眼3D显示。而且本申请实施例中只需一个波导层即可实现裸眼3D显示,避免了不同层次的波导叠加产生的串扰,且减轻了立体显示装置的厚度,提升了用户的体验。
在第一方面的一种可能的实现方式中,波导层包括第一光栅组以及第一波导介质。
其中,第一光栅组用于,使得第一光束以第一衍射角度在第一波导介质内传输,并使得第一光束以第一角度投射至图像生成模组。且还用于使得第二光束以第二衍射角度在第一波导介质内传输,并使得第二光束以第二角度投射至图像生成模组。
在本申请的实施方式中,以一个第一光栅组将第一光束以第一角度投射至图像生成模组,以及将第二光束以第二角度投射至图像生成模组,尽可能的减少了立体显示装置的组成单元,实现了体积轻薄、低串扰的立体显示装置。
在第一方面的一种可能的实现方式中,立体显示装置还包括电压控制模块,该电压控制模块用于在第一光栅组上加载第一电压,用于调整第一衍射角度和第一角度。电压控制模块在第一光栅组上加载第二电压,用于调整第二衍射角度和第二角度。
在本申请的实施方式中,通过不同的电压可以控制第一光栅组的折射率,以此控制第一光束以及第二光束的投射角度,使得立体显示装置更具有灵活性。
在第一方面的一种可能的实现方式中,第一光栅组的材料为电光聚合物材料。
在本申请的实施方式中,由电光聚合物材料组成第一光栅组,通过电压实现第一光束以及第二光束以不同角度投射至图像生成模组,确保了该立体显示装置基于一个光栅组以及波导介质即可实现裸眼3D显示,尽可能的确保立体显示装置的轻薄,并尽可能的避免串扰的干扰。
在第一方面的一种可能的实现方式中,波导层包括第一光栅组、第一波导介质、第二光栅组以及第二波导介质。
其中,第一光栅组,用于使得第一光束以第一衍射角度在第一波导介质内传输,并使得第一光束以第一角度投射至图像生成模组;
第二光栅组,用于使得第二光束以第二衍射角度在第二波导介质内传输,并使得第二光束以第二角度投射至图像生成模组。
具体的,第一光栅组以及第二光栅组的光栅参数不同,实际情况中可以根据具体第一成像光以及第二成像光分别对应左右眼视角的方向确定第一光栅组以及第二光栅组的光栅参数,具体此处不做限定。
在本申请的实施方式中,光栅参数不同的第一光栅组以及第二光栅组分别将第一光束以第一角度投射至图像生成模组,以及将第二光束以第二角度投射至图像生成模组,并得到对应左右眼的第一成像光以及第二成像光,以此实现裸眼3D显示,实现了体积轻薄、低串扰的立体显示装置。
在第一方面的一种可能的实现方式中,波导层还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质,第一波导介质、第二波导介质、第三波导介质以及第四波导介质位于波导层的同一水平面。
其中,第三光栅组用于,使得第一光束以第三衍射角度在第三波导介质内传输,并使得第一光束以第三角度投射至图像生成模组。
第四光栅组用于,使得第二光束以第四衍射角度在第四波导介质内传输,并使得第二光束以第四角度投射至图像生成模组。
在本申请的实施方式中,立体显示装置基于多个光栅组以及多个波导介质组成的波导层得到了对应多个左右视角的第一成像光以及第二成像光,以此实现了至少双人模式的显示模式,能实现多人3D裸眼显示,适用于更复杂的应用场景。
在第一方面的一种可能的实现方式中,上述各个波导介质包括入光面以及出光面,入光面用于接收第一光束和第二光束,出光面用于投射第一光束和第二光束。
在本申请的实施方式中,进一步说明了波导介质的具体组成,说明了光束的出入,体现了本申请的可靠性。
第二方面,提供了一种立体显示设备,该立体显示设备包括立体显示装置以及控制模组,其中,立体显示装置包括光源、波导层以及图像生成模组。
其控制模组,用于控制光源在第一时刻发射第一光束,以及在第二时刻发射第二光束,以及控制图像生成模组在第一时刻获取得到第一图像数据,以及在第二时刻获取得到第二图像数据。
波导层,用于将第一光束以第一角度投射至图像生成模组。
图像生成模组,用于基于第一图像数据调制所述第一光束得到第一成像光;
所述波导层,还用于将所述第二光束以第二角度投射至所述图像生成模组;
所述图像生成模组,还用于基于所述第二图像数据调制所述第二光束得到第二成像光。
在第二方面的一种可能的实现方式中,立体显示装置为如第一方面以及第一方面的可能实现方式所描述的立体显示装置。
在第二方面的一种可能的实现方式中,在波导层还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质,且立体显示设备的显示模式为多人模式的情况下,控制模组还用于,控制立体显示装置的第三光栅组使得第一光束以第三衍射角度在第三波导介质内传输,并使得第一光束以第三角度投射至图像生成模组,以及控制第四光栅组,使得第二光束以第四衍射角度在第四波导介质内传输,并使得第二光束以第四角度投射至图像生成模组。
在第二方面的一种可能的实现方式中,控制模组还用于根据传感器反馈的信息确定显示模式。
在第二方面的一种可能的实现方式中,传感器反馈的信息包括人眼位置信息、头部信息或热成像信息中的至少一个。
第三方面,提供一种交通工具,包括第二方面以及第二方面的可能实现方式所描述的立体显示设备。
在第三方面的一种可能的实现方式中,交通工具还包括挡风玻璃,所述立体显示设备将所述第一成像光以及所述第二成像光分别投射至所述挡风玻璃。
第四方面,提供一种立体显示方法,该方法包括:
通过光源在第一时刻获取第一光束,以及在第二时刻获取第二光束。然后,通过波导层以第一角度投射第一光束,且基于第一图像数据调制第一光束得到第一成像光。还通过波导层以第二角度投射第二光束,且基于第二图像数据调制第二光束得到第二成像光。
在第四方面的一种可能的实现方式中,波导层包括第一光栅组以及第一波导介质的情况下,通过第一光栅组使得第一光束以第一衍射角度在第一波导介质内传输,并使得第一光束以第一角度投射。
且通过第一光栅组使得第二光束以第二衍射角度在第一波导介质内传输,并使得第二光束以第二角度投射。
在第四方面的一种可能的实现方式中,在第一光栅组上加载第一电压,调整第一衍射角度和第一角度。
另外在第一光栅组上加载第二电压,调整第二衍射角度和第二角度。
在第四方面的一种可能的实现方式中,第一光栅组的材料为电光聚合物材料。
在第四方面的一种可能的实现方式中,在波导层包括第一光栅组、第一波导介质、第二光栅组以及第二波导介质的情况下,第一波导介质以及第二波导介质位于同一水平面,通过第一光栅组使得第一光束以第一衍射角度在第一波导介质内传输,并使得第一光束以第一角度投射,且通过第二光栅组使得第二光束以第二衍射角度在第二波导介质内传输,并使得第二光束以第二角度投射。
在第四方面的一种可能的实现方式中,在波导层还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质的情况下,第一波导介质、第二波导介质、第三波导介质以及第四波导介质位于同一水平面。
其中,通过第三光栅组使得第一光束以第三衍射角度在第三波导介质内传输,并使得第一光束以第三角度投射。
通过第四光栅组使得第二光束以第四衍射角度在第四波导介质内传输,并使得第二光束以第四角度投射。
在第四方面的一种可能的实现方式中,前述波导介质包括入光面以及出光面,入光面用于接收第一光束和第二光束,出光面用于投射第一光束和第二光束。
第二方面、第三方面、第四方面的有益效果参见第一方面,此处不再赘述。
附图说明
图1为多层复合型指向性导管板的一个示意图;
图2为本申请实施例提供的立体显示装置的一个结构示意图;
图3为本申请实施例提供的光栅组与波导介质的一个示意图;
图4为本申请实施例提供的投射第一光束的一个示意图;
图5为本申请实施例提供的投射第二光束的一个示意图;
图6为本申请实施例提供的立体显示装置的另一个结构示意图;
图7为本申请实施例提供的立体显示装置的另一个结构示意图;
图8a为本申请实施例提供的立体显示装置的另一个示意图;
图8b为本申请实施例提供的波导层的一个示意图;
图9为本申请实施例提供的立体显示设备的一个结构示意图;
图10为本申请实施例提供的立体显示设备的产品形态示意图;
图11a为本申请实施例提供的立体显示设备的桌显形态示意图;
图11b为本申请实施例提供的立体显示设备的HUD形态示意图;
图11c是本申请实施例提供的一种交通工具的一种可能的功能框架示意图;
图12为本申请实施例提供的立体显示方法的一个流程示意图。
具体实施方式
本申请实施例提供了一种立体显示装置、立体显示设备以及立体显示方法,应用于立体成像领域中,具体实现了裸眼3D显示或者自由立体显示。可以避免多层波导器件单元叠加导致的串扰,同时能减少产品的厚度,从而尽可能的提高用户的观看体验。
本申请的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的术语在适当情况下可以互换,这仅仅是描述本申请的实施例中对相同属性的对象在描述时所采用的区分方式。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,以便包含一系列单元的过程、方法、系统、产品或设备不必限于那些单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它单元。
在介绍本申请实施例之前,以便于后续理解本申请实施例,首先对本申请实施例中出现的一些专业术语进行解释:
裸眼3D技术:由显示装置实现左右眼图像的分离,不需要观众佩戴穿戴设备来分离左右眼图像的一种3D显示技术。在3D显示技术中,通过向左右眼分别投射不同视角的成像光,从而在人脑中获得立体的3D图像。3D显示技术可以通过多种手段将左右眼的成像光分别投射至对应人眼。其中,在裸眼3D技术中,将成像光分为左右眼成像光,并使左眼视角的成像光投射至左眼,使右眼视角的成像光投射至右眼。裸眼3D技术不需要观众佩戴穿戴设备,即可实现左右眼成像光的分离,对于观众来说是一种高易用性(usability)的技术。
左眼视角:由于人的双眼位置不同,因此对于同一个三维画面,左右眼所接收到的二维图像是不同的。也就是说,对于同一个三维画面,左右眼是站在不同的视角接收二维画面的。因此,在三维画面的图像采集(即本申请实施例中图像生成模组所放映的图像数据的采集)过程中,需要用左右眼分别对应的采集左右眼视角的二维图像。左眼视角为三维画面的采集过程中,对应于左眼的立体显示装置的发送二维图像的视角。
右眼视角:右眼视角为三维画面的采集过程中,对应于右眼的立体显示装置发送二维图像的视角。
随着3D显示技术的发展,3D显示的应用场景也在增加。例如,在办公、教育、医疗、娱乐、游戏、广告投放、建筑装饰、赛事转播,以及工艺品、藏品等的展出,话剧、歌剧、演唱会等演出的放映等场景下,可通过3D显示技术投射3D图像,从而使投射出的图像更加立体生动。
当前一种裸眼3D显示装置是由多层波导器件单元叠加为多层复合型指向性导管板构成,采用分频控制各层依次对应左右视角照明的方式实现裸眼3D显示。示例性的如图1所示,图1为多层复合型指向性导管板的一个示意图,其中通过分频控制各层依次控制波导器件单元101的照明光束对应于左眼视角,以及波导器件单元102的照明光束对应于右眼视角,且波导器件单元101/102还包括多个可以看作像素的纳米光栅。由于纳米光栅结构的周期(空频)和取向在各亚像素之间的变化连续,即可实现光场的调控和变换。因此,在一块波导器件单元上制作出多个按需设定的不同取向角和周期的纳米光栅之后,就构成一个指向性导光板,理论上就可以获得足够多的不同视点,配合空间光调制器103对颜色和灰度的控制,以及通过分频控制不同层次的波导器件单元101以及102依次照明的方式就能实现多视角下的裸眼3D显示。
但是,通过采用波导器件单元多层叠加导致产品体积比较厚重,且波导器件单元叠加会产生串扰,影响用户的观看体验。
为解决上述所述问题,本申请实施例首先提供了一种立体显示装置、立体显示设备以及立体显示方法,应用于3D显示领域中。其立体显示装置包括光源、波导层以及图像生成模组。其中,光源用于在第一时刻发射第一光束,以及在第二时刻发射第二光束。波导层,用于将第一光束以第一角度投射至图像生成模组。图像生成模组,用于基于第一图像数据调制第一光束得到第一成像光。波导层,还用于将第二光束以第二角度投射至图像生成模组。图像生成模组,还用于基于第二图像数据调制第二光束得到第二成像光。然后得到的第一成像光以及第二成像光分别汇聚至观察者的左眼以及右眼。从而实现了裸眼3D显示。而且本申请实施例中只需一个波导层即可实现裸眼3D显示,避免了不同层次的波导叠加产生的串扰,且减轻了立体显示装置的厚度,提升了用户的体验。
为了更好的理解本申请的实施例,下面结合附图,首先对本申请的实施例提供的一种立体显示装置进行详细描述。本领域普通技术人员可知,随着技术的发展和新场景的出现,本申请实施例提供的技术方案对于类似的技术问题,同样适用。具体请参阅图2,图2为本申请实施例提供的立体显示装置的一个结构示意图,立体显示装置200具体包括:
光源201、一个波导层202以及图像生成模组203,其光源201位于波导层202的下方,而图像生成模组203位于波导层202的上方。其中,光源201用于在第一时刻发射第一光束,以及在第二时刻发射第二光束。波导层202用于将第一光束以第一角度投射至图像生成模组203,还用于将第二光束以第二角度投射至图像生成模组203,该图像生成模组203用于基于第一图像数据调制第一光束得到第一成像光,以及基于第二图像数据调制第二光束得到第二成像光。然后得到的第一成像光以及第二成像光分别汇聚至观察者的左眼以及右眼。
需要说明的是,其光源201还可以与图像生成模组203均位于波导层的下方或上方,或者其他组成模式,具体此处不做限定。
为了在人脑中呈现立体的3D图像,左右眼需要交替接收对应视角的图像。因此图像生成模组203在对第一光束以及第二光束的调制过程中,所加载的第一图像数据以及第二图像数据需要按照光源发射第一光束以及第二光束的时刻进行切换。即图像生成模组203在第一时刻加载第一图像数据,图像生成模组203在第二时刻加载第二图像数据。且图像生成模组203调制得到的第一成像光以及第二成像光分别汇聚至观察者的左眼以及右眼中,以此实现裸眼3D显示。
在本申请的实施方式中,可选的,光源201在第一时刻为左光源,在第二时刻为右光源;或者,光源201在第一时刻为右光源,在第二时刻为左光源,具体此处不做限定。其中,左光源发射的光束经由波导202以一定角度投射至图像生成模组203,且图像生成模组203基于对应的图像数据生成成像光汇聚至观察者的左眼,右光源发射的光束经由波导202以一定角度投射至图像生成模组203,且图像生成模组203基于对应的图像数据生成成像光汇聚至观察者的右眼。
在本申请实施例提供的立体显示装置200中,通过波导层202将光源201在不同时刻发射的第一光束以及第二光束以不同的角度投射至图像生成模组203,且图像生成模组分别基于第一图像数据调制第一光束得到第一成像光,以及基于第二图像数据调制第二光束得到第二成像光,且第一成像光以及第二成像光分别汇聚至观察者的左右眼,从而实现了裸眼3D显示。而且本申请实施例中只需一个波导层即可实现裸眼3D显示,避免了不同层次的波导叠加产生的串扰,且减轻了立体显示装置的厚度,提升了用户的体验。
一种可能的实现方式中,上述光源201的数量可以是一个,该光源201在第一时刻发射第一光束,在第二时刻发射第二光束即可实现对多个视角的匹配,尽可能的减少了立体显示设备的体积以及重量,尽可能的保障该立体显示设备的轻薄。可选的,上述光源201的数量还可以是两个,一个作为左光源,一个作为右光源,以此能单独的控制在不同的时刻分别发射第一光束以及第二光束,可以降低对光源器件的损耗,尽可能的减少设备失效时限,且增加了应用场景。上述图2的示例中以一个光源201作为示例仅仅用于理解本申请实施例,不对本方案产生实质性的限定,可以理解的是,实际情况中可以根据具体情况确定,具体此处不做限定。
示例性的,其中波导层202可以是图2中所示的矩形横截面的平板波导,或者还可以是其他的条形波导或曲面波导,具体此处不做限定。
其中,光源201可以是背光光源,例如发光二极管、冷阴极荧光管、点状光源、热阴极荧光管、扁平荧光灯、或其他激光光源,具体此处不做限定。
其中,图像生成模组203可以为透射式图像生成模组,用于将入射的照明光束调制,并将调制得到的成像光透射。示例地,图像生成模组203可以为液晶显示器(liquid crystal display,LCD)、有机电激光显示(organic light-emitting diode,OLED)或者其他透射式图像生成模组,具体此处不做限定。
为便于理解后续实施例,先说明光栅组的材料可以是电光聚合物材料,也可以是非电光聚合物材料。具体的,当光栅组为电光聚合物材料时,可以根据电压调整光栅组的折射率实现不同角度投射第一光束和第二光束。当光栅组为非电光聚合物材料时,可以根据不同光栅参数的非电光聚合物材料组成的光栅组实现不同角度投射第一光束和第二光束,具体如后续实施例所示,具体此处不再赘述。
一种可能的实现方式中,如图2所示,上述波导层202包括第一光栅组以及第一波导介质20212,其第一光栅组即第一光栅20211以及第二光栅20213。
其第一光栅组用于,使得第一光束以第一衍射角度在第一波导介质20212内传输,并使得第一光束以第一角度投射至图像生成模组203。
第一光栅组还用于,使得第二光束以第二衍射角度在第一波导介质20212内传输,并使得第二光束以第二角度投射至图像生成模组203。
其中,光栅直接加工于波导介质上,或,加工于薄膜上,并将薄膜贴合于入光面的对面,或出光面的对面或嵌设在波导介质中,具体此处不做限定。
需要说明的是,如图3所示,图3为本申请实施例提供的光栅组与波导介质的一个示意图,其中,第一波导介质20212还包括入光面以及出光面,该入光面用于接收第一光束和第二光束,出光面用于投射第一光束和第二光束。且第一光栅20211位于入光面的上方,第二光栅20213位于出光面的下方。
另外,一种可能的实现方式中,其出光面的面积不小于图像生成模组203的面积,以此可以确保成像光的质量,提升用户的体验感。可以理解的是,在其他应用场景中,出光面的面积也可以小于图像生成模组203的面积,具体此处不做限定。
具体的,基于图4的示例进行说明,图4为本申请实施例提供的投射第一光束的一个示意图。其中,第一光栅组的第一光栅20211从入光面接收第一光束,且第一光栅20211以第一衍射角度在第一波导介质20212内传输第一光束至第二光栅20213,示例性的,第一波导介质20212除入光面以及出光面外,其余均为反射面,因此第一光栅20211以第一衍射角度将第一光束衍射到第一波导介质20212以内,则第一波导介质20212可以继续以第一衍射角度衍射该第一光束至第二光栅20213。且该第二光栅20213将第一光束以第一角度从出光面投射至图像生成模组203。
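As a hedged aside (not part of the original disclosure), the relationship described above between the in-coupling grating and the propagation angle inside the waveguide can be written with the standard first-order grating equation. Here \( \lambda \) is the wavelength, \( \Lambda \) the grating period, \( n \) the refractive index of the waveguide medium, and \( \theta_i, \theta_d \) the incidence and diffraction angles; these are generic symbols, not reference signs from the figures:

```latex
% first-order (m = 1) diffraction into the waveguide medium
n \sin\theta_d = \sin\theta_i + \frac{\lambda}{\Lambda}
% the diffracted beam remains guided by total internal reflection when
\sin\theta_d > \frac{1}{n}
```

Under this relation, changing either the grating period or the effective index of the grating region changes \( \theta_d \), which is consistent with the two variants described in this document (different grating parameters, or a voltage-tuned index).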
另外,基于图5的示例进行说明,图5为本申请实施例提供的投射第二光束的一个示意图。其中,第一光栅20211从入光面接收第二光束,且第一光栅20211以第二衍射角度在第一波导介质20212内传输第二光束至第二光栅20213,示例性的,如前述图4的第一波导介质20212除入光面以及出光面外,其余均为反射面,因此第一光栅20211以第二衍射角度将第二光束衍射到第一波导介质20212以内,则第一波导介质20212可以继续以第二衍射角度衍射该第二光束至第二光栅20213。且该第二光栅20213将第二光束以第二角度从出光面投射至图像生成模组203。
需要说明的是,前述示例中光源201发射第一光束与第二光束的角度只是作为示例,用于理解本申请实施例,可以理解的是,在实际情况中,光源201发射第一光束以及第二光束的到入光面的角度可以如图2所示不同,也可以是相同角度发射,具体此处不做限定。
在本申请实施例中,以一个第一光栅组以及第一波导介质将第一光束以第一角度投射至图像生成模组,以及将第二光束以第二角度投射至图像生成模组,尽可能的减少了立体显示装置的组成单元,实现了体积轻薄、低串扰的立体显示装置。
一种可能的实现方式中,第一光栅组的材料为电光聚合物材料。可以基于加载在第一光栅组上的电压控制第一光栅组的折射率。即可以基于加载在第一光栅20211上的电压控制第一衍射角度以及第二衍射角度,基于第二光栅20213上的电压控制第一角度以及第二角度。可以理解的是,第一光栅20211与第二光栅20213可以由不同的电光聚合物材料组成,也可以由相同的电光聚合物材料组成,具体此处不做限定。在本申请的实施方式中,由电光聚合物材料组成第一光栅组,通过电压实现第一光束以及第二光束以不同角度投射至图像生成模组,确保了该立体显示装置基于一个光栅组与一个波导介质即可实现裸眼3D显示,尽可能的确保立体显示装置的轻薄,并尽可能的避免串扰的干扰。
一种可能的实现方式中,如图6的示例,图6为本申请实施例提供的立体显示装置的另一个结构示意图。其中,立体显示装置200还包括电压控制模块204,该电压控制模块204与第一光栅组即第一光栅20211与第二光栅20213相连。在光源201发射第一光束的第一时刻,该电压控制模组204用于在第一光栅组上加载第一电压。则第一光栅组基于第一电压调整第一衍射角度以及第一角度,使得第一光束以第一衍射角度在第一波导介质20212内传输,并使得第一光束以第一角度投射至图像生成模组203。
示例性的,该第一电压包括两个电压值组合,电压控制模组204在将两个电压值组合的第一电压分别加载在第一光栅20211以及第二光栅20213上,使得第一光栅20211基于加载的电压值调整第一衍射角度,且使得第二光栅20213基于加载的电压值调整第一角度。需要说明的是,由于第一光栅20211与第二光栅20213的材料可以相同也可以不同,则第一电压的两个电压值可以相同,也可以不相同,具体此处不做限定。
在光源201发射第二光束的第二时刻,电压控制模块204还用于,在第一光栅组上加载第二电压,则第一光栅组调整第二衍射角度和第二角度,使得第二光束以第二衍射角度在第一波导介质20212内传输,并使得第二光束以第二角度投射至图像生成模组203。
示例性的,该第二电压包括两个电压值组合,电压控制模组204在将两个电压值组合的第二电压分别加载在第一光栅20211以及第二光栅20213上,使得第一光栅20211基于加载的电压值调整第二衍射角度,且使得第二光栅20213基于加载的电压值调整第一角度。需要说明的是,由于第一光栅20211与第二光栅20213的材料可以相同也可以不同,则第二电压的两个电压值可以相同,也可以不相同,具体此处不做限定。
在本申请的实施方式中,通过不同的电压可以控制光栅组的折射率,控制第一光束以及第二光束的投射角度,使得立体显示装置更具有灵活性。
一种可能的实现方式中,对应单人模式的第一电压以及第二电压的值,与多人模式对应的第一电压以及第二电压的值可能不同。示例性的,立体显示装置可以用于单人模式、双人模式、和/或其他多人模式等显示模式,而不同的显示模式对应不同的电压,在单人模式的情况下,第一成像光以及第二成像光分别对应单人的左右眼即可。而在多人模式的情况下,需要将第一成像光以及第二成像光分别对应多人的左右眼,具体的,可以基于多人模式对应的第一电压,调整第一衍射角度以及第一角度,使得第一光束最终以第一角度投射至图像生成模组203,得到的第一成像光分别同时投射至多人的左眼。可选的,也可时分调控第一电压的值,使得第一光束分别以不同的第一角度投射至图像生成模组203,得到第一成像光分别时分投射至不同人的左眼。基于多人模式对应的第二电压最终得到对应多人右眼的第二成像光与前述对应多人左眼的第一成像光描述的类似,具体此处不再赘述。因此基于第一电压调整第一衍射角度和第一角度,以及第二电压调整第二衍射角度和第二角度,以此实现立体显示装置可以灵活调控应用于多个应用场景,增加了该立体显示装置的应用场景。
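The voltage control described above can be made concrete with a small numerical sketch. It assumes a hypothetical linear electro-optic response n(V) = n0 + k·V for the grating-region refractive index and uses the standard first-order grating equation; the coefficient k, the wavelength, the grating period, and all numerical values are illustrative assumptions, not values from the disclosure.

```python
import math

def diffraction_angle_deg(wavelength_um, period_um, n_medium,
                          incidence_deg=0.0, order=1):
    """First-order grating equation: n*sin(theta_d) = sin(theta_i) + m*lambda/L."""
    s = (math.sin(math.radians(incidence_deg))
         + order * wavelength_um / period_um) / n_medium
    if abs(s) > 1.0:
        raise ValueError("no propagating diffracted order for these parameters")
    return math.degrees(math.asin(s))

def tuned_index(n0, k, voltage):
    """Hypothetical linear electro-optic model for the polymer grating region."""
    return n0 + k * voltage

# Illustrative numbers only: 532 nm light, 0.40 um grating period, n0 = 1.52.
# A different applied voltage shifts the index, which in turn shifts the angle
# at which the beam propagates in (and is projected out of) the waveguide.
angle_v1 = diffraction_angle_deg(0.532, 0.40, tuned_index(1.52, 0.002, 0.0))
angle_v2 = diffraction_angle_deg(0.532, 0.40, tuned_index(1.52, 0.002, 20.0))
```

With these sample values, the two voltages yield two distinct diffraction angles, which is the mechanism the text uses to steer the first and second beams toward the left and right eyes.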
一种可能的实现方式中,上述波导层202可以包括第一光栅组、第一波导介质、第二光栅组以及第二波导介质实现,下面以另一种立体显示装置为例继续说明本申请实施例的另一种可能的实现方式,具体请参阅图7的示例,图7为本申请实施例提供的立体显示装置的另一个结构示意图。其中,波导层702包括第一光栅组(即第一光栅70211以及第二光栅70213)以及第一波导介质70212,第二光栅组(即第三光栅70221以及第四光栅70223)以及第二波导介质70222。且第一波导介质70212以及第二波导介质70222在同一水平面,且第一光栅70211以及第三光栅70221在同一水平面,第二光栅70213与第四光栅70223在同一水平面。可以理解的是,在实际应用场景中,第一波导介质70212以及第二波导介质70222可以在一定范围内高低错落排列,无需完全要求在同一水平面,只需尽可能的避免波导介质形成多层空间占用,减少波导层的厚度即可。同理第一光栅70211以及第三光栅70221,第二光栅70213与第四光栅70223也可高低错落排列,具体此处不做限定。
其中,第一光栅组,用于使得第一光束以第一衍射角度在第一波导介质70212内传输,并使得第一光束以第一角度投射至图像生成模组703。第二光栅组,用于使得第二光束以第二衍射角度在第二波导介质70222传输第二光束,并使得第二光束以第二角度投射至图像生成模组703。
示例性的,如前述图3所述的类似,第一波导介质70212以及第二波导介质70222分别包括入光面以及出光面,该入光面用于接收第一光束或第二光束,出光面用于投射第一光束或第二光束。且第一光栅70211以及第三光栅70221位于入光面的上方,第二光栅70213以及第四光栅70223位于出光面的下方。
其中,第一光栅70211从入光面接收第一光束,且第一光栅70211使得第一光束以第一衍射角度在第一波导介质70212内传输至第二光栅70213。该第二光栅70213使得第一光束以第一角度从出光面投射至图像生成模组703。具体如前述图4所述的类似,具体此处不再赘述。
第三光栅70221从入光面接收第二光束,且第三光栅70221使得第二光束以第二衍射角度在第二波导介质70222内传输至第四光栅70223。该第四光栅70223使得第二光束以第二角度从出光面投射至图像生成模组703。具体如前述图5所述的类似,具体此处不再赘述。
需要说明的是,第一光栅组与第二光栅组的光栅参数不同,实际情况中可以根据具体第一成像光以及第二成像光分别对应左右眼视角的方向确定,具体此处不做限定。
另外,如图7的示例中所示,光源701的数量为两个,分别作为左光源以及右光源在第一时刻发射第一光束以及在第二时刻发射第二光束,仅仅作为示例用于理解本申请实施例,不对本方案产生实质性的限定,可以理解的是,还可以是一个光源在第一时刻发射第一光束,以及在第二时刻发射第二光束,具体此处不做限定。
在本申请的实施方式中,光栅参数不同的第一光栅组以及第二光栅组分别将第一光束以第一角度投射至图像生成模组,以及将第二光束以第二角度投射至图像生成模组,并得到对应左右眼的第一成像光以及第二成像光,以此实现裸眼3D显示,实现了体积轻薄、低串扰的立体显示装置。
一种可能的实现方式中,如图8a的示例,波导层702还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质,图8a为本申请实施例提供的立体显示装置的另一个示意图。其中,波导层702包括第一光栅组7021以及第一波导介质70212、第二光栅组7022以及第二波导介质70222、第三光栅组7023以及第三波导介质70232,第四光栅组7024以及第四波导介质70242,且,各个波导介质以及各个光栅组的组合均位于同一水平面,即如图8b所示并列排列,图8b为本申请实施例提供的波导层的一个示意图。其中第三光栅组7023与第四光栅组7024的光栅参数不同。
其中,至少两个光源701在第一时刻分别向第一波导介质70212以及第三波导介质70232的入光面发射第一光束,在第二时刻分别向第二波导介质70222以及第四波导介质70242的入光面发射第二光束。
其中,第一光栅组7021用于使得第一光束以第一衍射角度在第一波导介质70212内传输,并使得第一光束以第一角度投射至图像生成模组703,第三光栅组7023用于使得第一光束以第三衍射角度在第三波导介质70232内传输,并使得第一光束以第三角度投射至图像生成模组703,且图像生成模组703基于第一图像数据调制第一光束,得到对应多个左视角或右视角的第一成像光。第二光栅组7022用于使得第二光束以第二衍射角度在第二波导介质70222内传输,并使得第二光束以第二角度投射至图像生成模组703,第四光栅组7024用于使得第二光束以第四衍射角度在第四波导介质70242内传输,并使得第二光束以第四角度投射至图像生成模组703,且图像生成模组703基于第二图像数据调制第二光束,得到对应多个左视角或右视角的第二成像光。具体如前述图7所述的类似,具体此处不再赘述。
在本申请的实施方式中,立体显示装置基于多个光栅组以及波导介质组成的波导层得到了对应多个左右视角的第一成像光以及第二成像光,以此实现了至少双人模式的显示模式,适用于更复杂的应用场景。
一种可能的实现方式中,前述第一光栅组7021、第二光栅组7022、第三光栅组7023和/或第四光栅组7024的材料为非电光聚合物材料。具体的,可以通过不同的非电光聚合物材料实现光栅参数不同的第一光栅组7021与第二光栅组7022,以及光栅参数不同的第三光栅组7023与第四光栅组7024。在本申请的实施方式中,利用不同的非电光聚合物材料实现光栅参数不同的光栅组,体现了立体显示装置的多样性,以及可选择性。可选的,前述第一光栅组7021、第二光栅组7022、第三光栅组7023和/或第四光栅组7024的材料可以为光栅参数不同的电光聚合物材料,此时可不加载电压。
另外,需要说明的是,图8b中示在同一水平面的各个波导介质以及各个光栅组的组合仅仅作为示例用于理解本申请实施例,不对本方案产生实质性的限定,可以理解的是,与前述图7中所述的类似,各个波导介质之间或各个光栅组之间可以有一定范围的高低差,只需尽可能的避免波导介质形成多层空间占用,减少波导层的厚度即可,具体此处不做限定。
需要说明的是,前述图2至图8b的示例中,第一光栅组其中一个位于波导介质的上方,一个位于波导介质的下方,仅仅对应图2中光源在波导层的下方,图像生成模组在波导层的下方的实施例。可以理解的是,在其他应用场景中,例如光源与图像生成模组均在波导层的上方或下方时,第一光栅组的两个光栅可以同时位于波导介质的同一面,具体此处不做限定。
如图9所示,本申请还提供了一种立体显示设备。具体请参阅图9,图9为本申请实施例提供的立体显示设备的一个结构示意图。立体显示设备900包括控制模组902以及立体显示装置903。其中,立体显示装置903为图2至图8a的立体显示装置。其中,控制模组用于,控制立体显示装置903的光源在第一时刻发射第一光束,以及在第二时刻发射第二光束,还控制立体显示装置903的图像生成模组在第一时刻获取第一图像数据,以及在第二时刻获取第二图像数据。图像生成模组基于该第一图像数据调制第一光束得到第一成像光,图像生成模组基于该第二图像数据调制第二光束得到第二成像光。
可选的,控制模组902可以是处理器、计算单元、集成电路或集成芯片等硬件模块,也可以是软件实现,可以理解的是,其控制模组902可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上。具体此处不做限定。
在本申请的实施方式中,该立体显示设备基于控制模组实现对立体显示装置的控制,进而实现裸眼3D显示,且该立体显示设备具备体积较薄,低串扰等优点。
一种可能的设计中,上述控制模组902,在上述立体显示装置的波导层还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质,且上述立体显示设备900的显示模式至少为双人模式的情况下,还用于控制立体显示装置903的第三光栅组使得第一光束以第三衍射角度在第三波导介质内传输,并使得第一光束以第三角度投射至图像生成模组,以及控制第四光栅组使得第二光束以第四衍射角度在第四波导介质内传输,并使得第二光束以第四角度投射至图像生成模组。示例性的,控制模组902可以控制光源将第一光束以及第二光束分别发射到第三波导介质以及第四波导介质的入光面,进而实现将第一光束以及第二光束以不同角度进行投射。以此能适用于至少双人模式,对应于多个左右视角的应用场景。
一种可能的设计中,上述立体显示设备还包括传感器901,该传感器901用于获取外部观察者的信息,并向控制模组902反馈信息,反馈的信息包括人眼位置信息、头部信息、热成像信息中的至少一个。传感器901可以是热敏传感器、位置传感器、红外传感器或其他电阻传感器,具体此处不做限定。
一种可能的设计中,上述控制模组902,还用于根据传感器901反馈的信息确定显示模式。增加了立体显示设备的应用场景,以及多种实现方式,增加了立体显示设备的多样性。
立体显示设备900具有多种产品形态。如图10所示,图10为本申请实施例提供的立体显示设备的产品形态示意图,立体显示设备900可以包括3D显示器、3D投影仪、3D穿戴设备等。其中,3D显示器可以是计算机显示器、手机、笔记本电脑、个人数字助手(personal digital assistant,PDA)、游戏机等移动设备的显示屏。3D投影仪可以应用于前投式场景、背投式场景中,本申请对此不做限定。例如,立体显示设备900可以是车灯、桌面显示设备、抬头显示(head up display,HUD)设备等。3D穿戴设备可以是增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)眼镜、AR/VR头盔、智能手表等,本申请对此不做限定。本申请实施例提供的立体显示设备900,可以应用于车、船等交通工具上,具体此处不做限定。
如图11a所示,图11a为本申请实施例提供的立体显示设备的桌显形态示意图,当立体显示设备900为桌面显示设备时,立体显示设备900上的立体显示装置1100输出第一成像光以及第二成像光。第一成像光以及第二成像光经过光学元件和自由曲面反射镜的反射,透过光学元件投射到人眼上,在人眼上呈现出成像,光学元件用于反射或透射一部分成像光。
以上对本申请实施例所提供的一种立体显示装置以及立体显示设备进行了详细介绍,本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请。同时,对于本领域的一般技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。
对场景再细化,可以将立体显示设备应用在抬头显示(head up display,HUD)场景中。
一种可能的实现方式中,一种交通工具包括前述立体显示设备和挡风玻璃,如图11b所示,图11b为本申请实施例提供的立体显示设备的HUD形态示意图。在抬头显示HUD场景中,前述图11a中的光学元件为挡风玻璃。立体显示设备上的立体显示装置输出第一成像光以及第二成像光。第一成像光以及第二成像光经过扩散屏、自由曲面反射镜(简称自由曲面镜)等光学元件,投射到挡风玻璃上。挡风玻璃将第一成像光以及第二成像光反射至人眼,在人眼上呈现出成像。
值得注意的是,挡风玻璃仅是对光学元件的一种示例,除了玻璃,光学元件也可以是其他材质,此处不做限定。
在抬头显示场景中,立体显示设备可以为HUD;HUD可以应用在车辆、飞机等交通工具上,除此之外,还可以应用在中控室、建筑景观、广告投放等场景下,此处不做限定。在交通工具之外的场景下,图11b中的挡风玻璃主要的作用是用于反射成像光,因此不限定这些场景下光学元件的种类。
上述立体显示设备即HUD可以安装在交通工具上,具体请参见图11c,图11c是本申请实施例提供的一种交通工具的一种可能的功能框架示意图。
如图11c所示,交通工具的功能框架中可包括各种子系统,例如图示中的传感器系统12、控制系统14、一个或多个外围设备16(图示以一个为例示出)、电源18、计算机系统20和抬头显示系统22。可选地,交通工具还可包括其他功能系统,例如为交通工具提供动力的引擎系统等等,本申请这里不做限定。
其中,传感器系统12可包括若干检测装置,这些检测装置能感受到被测量的信息,并将感受到的信息按照一定规律将其转换为电信号或者其他所需形式的信息输出。如图示出,这些检测装置可包括全球定位系统(global positioning system,GPS)、车速传感器、惯性测量单元(inertial measurement unit,IMU)、雷达单元、激光测距仪、摄像装置、轮速传感器、转向传感器、档位传感器、或者其他用于自动检测的元件等等,本申请并不做限定。
控制系统14可包括若干元件,例如图示出的转向单元、制动单元、照明系统、自动驾驶系统、地图导航系统、网络对时系统和障碍规避系统。可选地,控制系统14还可包括诸如用于控制车辆行驶速度的油门控制器及发动机控制器等元件,本申请不做限定。
外围设备16可包括若干元件,例如图示中的通信系统、触摸屏、用户接口、麦克风以及扬声器等等。其中,通信系统用于实现交通工具和除交通工具之外的其他设备之间的网络通信。在实际应用中,通信系统可采用无线通信技术或有线通信技术实现交通工具和其他设备之间的网络通信。该有线通信技术可以是指车辆和其他设备之间通过网线或光纤等方式通信。
电源18代表为车辆提供电力或能源的系统,其可包括但不限于再充电的锂电池或铅酸电池等。在实际应用中,电源中的一个或多个电池组件用于提供车辆启动的电能或能量,电源的种类和材料本申请并不限定。
交通工具的若干功能均由计算机系统20控制实现。计算机系统20可包括一个或多个处理器2001(图示以一个处理器为例示出)和存储器2002(也可称为存储装置)。在实际应用中,该存储器2002也在计算机系统20内部,也可在计算机系统20外部,例如作为交通工具中的缓存等,本申请不做限定。其中,
处理器2001可包括一个或多个通用处理器,例如图形处理器(graphic processing unit,GPU)。处理器2001可用于运行存储器2002中存储的相关程序或程序对应的指令,以实现车辆的相应功能。
存储器2002可以包括易失性存储器(volatile memory),例如RAM;存储器也可以包括非易失性存储器(non-volatile memory),例如ROM、快闪存储器(flash memory)、HDD或固态硬盘SSD;存储器2002还可以包括上述种类的存储器的组合。存储器2002可用于存储一组程序代码或程序代码对应的指令,以便于处理器2001调用存储器2002中存储的程序代码或指令以实现车辆的相应功能。该功能包括但不限于车辆中的部分功能或全部功能。本申请中,存储器2002中可存储一组用于车辆控制的程序代码,处理器2001调用该程序代码可控制车辆安全行驶,关于如何实现车辆安全行驶具体在本申请下文详述。
可选地,存储器2002除了存储程序代码或指令之外,还可存储诸如道路地图、驾驶线路、传感器数据等信息。计算机系统20可以结合车辆中的其他元件,例如传感器系统中的传感器、GPS等,实现车辆的相关功能。例如,计算机系统20可基于传感器系统12的数据输入控制交通工具的行驶方向或行驶速度等,本申请不做限定。
抬头显示系统22可包括若干元件,例如图示出的前挡玻璃,控制器和抬头显示器。控制器222用于根据用户指令生成图像(例如生成包含车速、电量/油量等车辆状态的图像以及增强现实AR内容的图像),并将该图像发送至抬头显示器进行显示;抬头显示器可以包括图像生成模组、反射镜组合,前挡玻璃用于配合抬头显示器以实现抬头显示系统的光路,以使在驾驶员前方呈现目标图像。需要说明的是,抬头显示系统中的部分元件的功能也可以由车辆的其它子系统来实现,例如,控制器也可以为控制系统中的元件。
其中,本申请图11c示出包括四个子系统,传感器系统12、控制系统14、计算机系统20和抬头显示系统22仅为示例,并不构成限定。在实际应用中,交通工具可根据不同功能对车辆中的若干元件进行组合,从而得到相应不同功能的子系统。在实际应用中,交通工具可包括更多或更少的系统或元件,本申请不做限定。
上述交通工具可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等,本申请实施例不做特别的限定。
如图12所示,本申请实施例还提供了一种立体显示方法,应用于裸眼3D技术。具体请参阅图12,图12为本申请实施例提供的立体显示方法的一个流程示意图。该方法可以应用于前述任一种立体显示装置。如图12所示,该方法包括:
1201、通过光源在第一时刻获取第一光束,以及在第二时刻获取第二光束。
1202、通过波导层以第一角度投射第一光束。
1203、基于第一图像数据调制第一光束得到第一成像光。
1204、通过波导层以第二角度投射第二光束。
1205、基于第二图像数据调制第二光束得到第二成像光。
在一种可能的实现方式中,在波导层包括第一光栅组以及第一波导介质的情况下,通过第一光栅组使得第一光束以第一衍射角度在第一波导介质内传输,并使得第一光束以第一角度投射。
且通过第一光栅组使得第二光束以第二衍射角度在第一波导介质内传输,并使得第二光束以第二角度投射。
在一种可能的实现方式中,在第一光栅组上加载第一电压,调整第一衍射角度和第一角度。
且在第一光栅组上加载第二电压,调整第二衍射角度和第二角度。
在一种可能的实现方式中,第一光栅组的材料为电光聚合物材料。
在一种可能的实现方式中,在波导层包括第一光栅组、第一波导介质、第二光栅组以及第二波导介质的情况下,第一波导介质以及第二波导介质位于同一水平面,通过第一光栅组使得第一光束以第一衍射角度在第一波导介质内传输,并使得第一光束以第一角度投射,且 通过第二光栅组使得第二光束以第二衍射角度在第二波导介质内传输,并使得第二光束以第二角度投射。
在一种可能的实现方式中,在波导层还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质的情况下,第一波导介质、第二波导介质、第三波导介质以及第四波导介质位于同一水平面。
其中,通过第三光栅组使得第一光束以第三衍射角度在第三波导介质内传输,并使得第一光束以第三角度投射。
通过第四光栅组使得第二光束以第四衍射角度在第四波导介质内传输,并使得第二光束以第四角度投射。
在一种可能的实现方式中,前述波导介质包括入光面以及出光面,入光面用于接收第一光束和第二光束,出光面用于投射第一光束和第二光束。
本申请实施例提供的立体显示装置、立体显示设备以及立体显示方法,可以应用于办公、教育、医疗、娱乐、游戏、广告投放、建筑装饰、赛事转播,以及工艺品、藏品等的展出,话剧、歌剧、演唱会等演出的放映等场景中。例如,在办公、教育等场景中,可以应用在计算机显示屏、会议投影仪、会议平板显示屏等设备中。在医疗场景中,可以应用在医用显示器或手术显微镜等中,以丰富显示内容(3D成像可以显示物体或物体之间的深度距离),使医护人员获取的物体信息从2维升级至3维,从而提升远程医疗诊断或医学检查等的准确性。在娱乐、赛事转播、演出放映等场景中,可以在游戏机、手机、平板等设备的屏幕上显示3D图像,或通过游戏投影仪显示3D图像,使图像显示更加立体生动,提升用户的临场感(presence)。
其中,加载在图像生成模组上的数据可以是预先准备好的3D图像对应的数字信号,也可以是实时生成的3D图像对应的数字信号。例如,在赛事转播场景中,可以在比赛现场通过两个相机分别采集左眼视角和右眼视角的图像,将该实时采集的双眼图像转换成数字信号并实时加载在图像调制器上,实现现场实时转播。可选地,该数据还可以是自由视角的图像数据。即,可以改变观看视角,以增强交互性。
另外需说明的是,以上所描述的装置实施例仅仅是示意性的,其中作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。另外,本申请提供的装置实施例附图中,模块之间的连接关系表示它们之间具有通信连接,具体可以实现为一条或多条通信总线或信号线。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到本申请可借助软件加必需的通用硬件的方式来实现,当然也可以通过专用硬件包括专用集成电路、专用CPU、专用存储器、专用元器件等来实现。一般情况下,凡由计算机程序完成的功能都可以很容易地用相应的硬件来实现,而且,用来实现同一功能的具体硬件结构也可以是多种多样的,例如模拟电路、数字电路或专用电路等。但是,对本申请而言更多情况下软件程序实现是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在可读取的存储介质中,如计算机的软盘、U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,训练设备,或者网络设备等)执行本申请各个实施例的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。
计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、训练设备或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、训练设备或数据中心进行传输。计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包含一个或多个可用介质集成的训练设备、数据中心等数据存储设备。可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,高密度数字视频光盘(digital video disc,DVD))、或者半导体介质(例如,固态硬盘(solid state drive,SSD))等。

Claims (20)

  1. 一种立体显示装置,其特征在于,所述立体显示装置包括光源、波导层以及图像生成模组;
    所述光源,用于在第一时刻发射第一光束,以及在第二时刻发射第二光束;
    所述波导层,用于将所述第一光束以第一角度投射至所述图像生成模组;
    所述图像生成模组,用于基于第一图像数据调制所述第一光束得到第一成像光;
    所述波导层,还用于将所述第二光束以第二角度投射至所述图像生成模组;
    所述图像生成模组,还用于基于第二图像数据调制所述第二光束得到第二成像光。
  2. 根据权利要求1所述的装置,其特征在于,所述波导层包括第一光栅组以及第一波导介质,所述第一光栅组用于:
    使得所述第一光束以第一衍射角度在所述第一波导介质内传输,并使得所述第一光束以所述第一角度投射至所述图像生成模组;
    使得所述第二光束以第二衍射角度在所述第一波导介质内传输,并使得所述第二光束以所述第二角度投射至所述图像生成模组。
  3. 根据权利要求2所述的装置,其特征在于,所述立体显示装置还包括电压控制模块,所述电压控制模块用于:
    在所述第一光栅组上加载第一电压,用于调整所述第一衍射角度和所述第一角度;
    所述电压控制模块还用于:
    在所述第一光栅组上加载第二电压,用于调整所述第二衍射角度和所述第二角度。
  4. 根据权利要求3所述的装置,其特征在于,所述第一光栅组的材料为电光聚合物材料。
  5. 根据权利要求1所述的装置,其特征在于,所述波导层包括第一光栅组、第一波导介质、第二光栅组以及第二波导介质,所述第一波导介质以及所述第二波导介质位于所述波导层的同一水平面;
    其中,所述第一光栅组,使得所述第一光束以第一衍射角度在所述第一波导介质内传输,并使得所述第一光束以所述第一角度投射至所述图像生成模组;
    所述第二光栅组,使得所述第二光束以第二衍射角度在所述第二波导介质内传输,并使得所述第二光束以所述第二角度投射至所述图像生成模组。
  6. 根据权利要求5所述的装置,其特征在于,所述波导层还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质,所述第一波导介质、所述第二波导介质、所述第三波导介质以及所述第四波导介质位于所述波导层的同一水平面;
    所述第三光栅组用于:
    使得所述第一光束以第三衍射角度在所述第三波导介质内传输,并使得所述第一光束以第三角度投射至所述图像生成模组;
    所述第四光栅组用于:
    使得所述第二光束以第四衍射角度在所述第四波导介质内传输,并使得所述第二光束以第四角度投射至所述图像生成模组。
  7. 根据权利要求2-6中任一项所述的装置,其特征在于,所述波导介质包括入光面以及出光面,所述入光面用于接收所述第一光束和所述第二光束,所述出光面用于投射所述第一光束和所述第二光束。
  8. 一种立体显示设备,其特征在于,所述立体显示设备包括立体显示装置,以及控制模组,所述立体显示装置包括光源、波导层以及图像生成模组;
    所述控制模组用于:
    控制所述光源在第一时刻发射第一光束,以及在第二时刻发射第二光束;
    控制所述图像生成模组在所述第一时刻获取得到第一图像数据,以及在所述第二时刻获取得到第二图像数据;
    所述波导层,用于将所述第一光束以第一角度投射至所述图像生成模组;
    所述图像生成模组,用于基于所述第一图像数据调制所述第一光束得到第一成像光;
    所述波导层,还用于将所述第二光束以第二角度投射至所述图像生成模组;
    所述图像生成模组,还用于基于所述第二图像数据调制所述第二光束得到第二成像光。
  9. 根据权利要求8所述的设备,其特征在于,所述立体显示装置为如权利要求2、5或7中任一项所述的立体显示装置。
  10. 根据权利要求8所述的设备,其特征在于,所述立体显示装置为如权利要求6所述的立体显示装置,在所述立体显示设备的显示模式为多人模式的情况下,所述控制模组还用于:
    控制所述立体显示装置的第三光栅组使得所述第一光束以第三衍射角度在所述第三波导介质内传输,并使得所述第一光束以第三角度投射至所述图像生成模组;
    控制所述立体显示装置的第四光栅组,使得所述第二光束以第四衍射角度在所述第四波导介质内传输,并使得所述第二光束以第四角度投射至所述图像生成模组。
  11. 根据权利要求10所述的设备,其特征在于,所述控制模组还用于根据传感器反馈的信息确定所述显示模式。
  12. 根据权利要求11所述的设备,其特征在于,所述传感器反馈的信息包括人眼位置信息、头部信息或热成像信息中的至少一个。
  13. 一种交通工具,其特征在于,包括权利要求8至12中任一项所述的立体显示设备。
  14. 根据权利要求13所述的交通工具,其特征在于,所述交通工具还包括挡风玻璃,所述立体显示设备将所述第一成像光以及所述第二成像光分别投射至所述挡风玻璃。
  15. 一种立体显示方法,其特征在于,包括:
    通过光源在第一时刻获取第一光束,以及在第二时刻获取第二光束;
    通过波导层以第一角度投射所述第一光束;
    基于第一图像数据调制所述第一光束得到第一成像光;
    通过所述波导层以第二角度投射所述第二光束;
    基于第二图像数据调制所述第二光束得到第二成像光。
  16. 根据权利要求15所述的方法,其特征在于,在所述波导层包括第一光栅组以及第一波导介质的情况下,所述通过波导层以第一角度投射所述第一光束包括:
    通过所述第一光栅组使得所述第一光束以第一衍射角度在所述第一波导介质内传输,并使得所述第一光束以所述第一角度投射;
    通过所述波导层以第二角度投射所述第二光束包括:
    通过所述第一光栅组使得所述第二光束以第二衍射角度在所述第一波导介质内传输,并使得所述第二光束以所述第二角度投射。
  17. 根据权利要求16所述的方法,其特征在于,所述方法还包括:
    在所述第一光栅组上加载第一电压,调整所述第一衍射角度和所述第一角度;
    在所述第一光栅组上加载第二电压,调整所述第二衍射角度和所述第二角度。
  18. 根据权利要求17所述的方法,其特征在于,所述第一光栅组的材料为电光聚合物材料。
  19. 根据权利要求15所述的方法,其特征在于,在所述波导层包括第一光栅组、第一波导介质、第二光栅组以及第二波导介质的情况下,所述第一波导介质以及所述第二波导介质位于所述波导层的同一水平面,所述通过波导层以第一角度投射所述第一光束包括:
    通过所述第一光栅组使得所述第一光束以第一衍射角度在所述第一波导介质内传输,并使得所述第一光束以所述第一角度投射;
    所述通过所述波导层以第二角度投射所述第二光束包括:
    通过所述第二光栅组使得所述第二光束以第二衍射角度在所述第二波导介质内传输,并使得所述第二光束以所述第二角度投射。
  20. 根据权利要求19所述的方法,其特征在于,在所述波导层还包括第三光栅组、第三波导介质、第四光栅组以及第四波导介质的情况下,所述第一波导介质、所述第二波导介质、所述第三波导介质以及所述第四波导介质位于所述波导层的同一水平面,所述方法还包括:
    通过所述第三光栅组使得所述第一光束以第三衍射角度在所述第三波导介质内传输,并使得所述第一光束以第三角度投射;
    通过所述第四光栅组使得所述第二光束以第四衍射角度在所述第四波导介质内传输,并使得所述第二光束以第四角度投射。
PCT/CN2023/093016 2022-08-12 2023-05-09 一种立体显示装置、立体显示设备以及立体显示方法 WO2024032057A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210970221.6A CN117590621A (zh) 2022-08-12 2022-08-12 一种立体显示装置、立体显示设备以及立体显示方法
CN202210970221.6 2022-08-12

Publications (1)

Publication Number Publication Date
WO2024032057A1 true WO2024032057A1 (zh) 2024-02-15

Family

ID=89850618

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/093016 WO2024032057A1 (zh) 2022-08-12 2023-05-09 一种立体显示装置、立体显示设备以及立体显示方法

Country Status (2)

Country Link
CN (1) CN117590621A (zh)
WO (1) WO2024032057A1 (zh)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001013457A (ja) * 1999-06-25 2001-01-19 Sanyo Electric Co Ltd 眼鏡無し立体映像表示装置
US20100232016A1 (en) * 2005-09-28 2010-09-16 Mirage Innovations Ltd. Stereoscopic Binocular System, Device and Method
CN201886210U (zh) * 2010-12-07 2011-06-29 京东方科技集团股份有限公司 立体显示装置
CN102768406A (zh) * 2012-05-28 2012-11-07 中国科学院苏州纳米技术与纳米仿生研究所 一种空间分割式裸眼立体显示器
CN103221873A (zh) * 2010-09-17 2013-07-24 拜耳知识产权有限责任公司 自由立体3d显示器
US20160202594A1 (en) * 2015-01-13 2016-07-14 Electronics And Telecommunications Research Institute Backlight unit and display apparatus including the same
KR101673547B1 (ko) * 2015-07-27 2016-11-07 주식회사 엘엠에스 백라이트 유닛 및 이를 포함하는 입체 영상 표시장치
CN106461956A (zh) * 2014-02-18 2017-02-22 科思创德国股份有限公司 使用全息光学元件的自动立体3d显示设备
CN111273457A (zh) * 2020-02-24 2020-06-12 广州弥德科技有限公司 基于投影光学引擎的指向光源裸眼3d显示器和显示方法
TW202030519A (zh) * 2018-12-20 2020-08-16 美商雷亞有限公司 具有可移動會聚平面的多視像顯示器、系統和方法
CN112925108A (zh) * 2019-12-05 2021-06-08 北京芯海视界三维科技有限公司 场序列显示器


Also Published As

Publication number Publication date
CN117590621A (zh) 2024-02-23

Similar Documents

Publication Publication Date Title
CN103261942B (zh) 双取向自动立体背光源和显示器
Balogh et al. Holovizio 3D display system
US11683472B2 (en) Superstereoscopic display with enhanced off-angle separation
CN105143964B (zh) 多激光驱动系统
US10955685B2 (en) Volumetric display arrangement and a method for representing content of an image
WO2023142568A1 (zh) 显示装置和交通工具
US11555960B1 (en) Waveguide array illuminator with light scattering mitigation
WO2024021852A1 (zh) 立体显示装置、立体显示系统和交通工具
WO2024032057A1 (zh) 一种立体显示装置、立体显示设备以及立体显示方法
WO2023143505A1 (zh) 一种图像生成装置、显示设备和图像生成方法
WO2024021574A1 (zh) 立体投影系统、投影系统和交通工具
US11726252B2 (en) Self-lit display panel
CN116184686A (zh) 立体显示装置和交通工具
WO2024041034A1 (zh) 一种显示模组、光学显示系统、终端设备及成像方法
WO2024124519A1 (zh) 一种图像生成装置、显示设备、交通工具和图像生成方法
CN115542644B (zh) 投影装置、显示设备及交通工具
WO2024098828A1 (zh) 投影系统、投影方法和交通工具
US20230314716A1 (en) Emission of particular wavelength bands utilizing directed wavelength emission components in a display system
US11860395B2 (en) Self-lit display panel
WO2024037061A1 (zh) 一种显示设备和交通工具
EP4432006A1 (en) Wide color gamut enabled edge-lit blu for high ppi vr-lcd display
WO2024045704A1 (zh) 显示装置、显示设备及交通工具
WO2023146939A1 (en) Image generation and delivery in a display system utilizing a two-dimensional (2d) field of view expander
CN118192078A (zh) 显示装置和交通工具
CN117075340A (zh) 一种基于多层透明背光板的增强现实近眼显示装置及显示方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23851287

Country of ref document: EP

Kind code of ref document: A1