WO2021196369A1 - Three-dimensional display method based on spatial superposition of sub-pixel emitted light - Google Patents

Three-dimensional display method based on spatial superposition of sub-pixel emitted light

Info

Publication number
WO2021196369A1
WO2021196369A1 · PCT/CN2020/091873 · CN2020091873W
Authority
WO
WIPO (PCT)
Prior art keywords
sub-pixel
light
aperture
Prior art date
Application number
PCT/CN2020/091873
Other languages
English (en)
French (fr)
Inventor
滕东东 (Teng Dongdong)
刘立林 (Liu Lilin)
Original Assignee
中山大学 (Sun Yat-sen University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中山大学 (Sun Yat-sen University)
Priority to US 17/226,093 (published as US20210314553A1)
Publication of WO2021196369A1


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/30Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers

Definitions

  • the present invention relates to the field of three-dimensional display technology, and more specifically to a three-dimensional display method based on spatial superposition of sub-pixel emitted light.
  • three-dimensional displays can provide optical information whose dimensions are consistent with the real world that people perceive, and are receiving more and more attention.
  • Stereoscopic vision technology, including autostereoscopic display, performs three-dimensional presentation based on the principle of binocular parallax: it projects a corresponding two-dimensional image to each of the observer's eyes, and the fusion of the two images by binocular vision triggers the observer's depth perception of the displayed scene.
  • However, the eyes of the observer must always focus on the two-dimensional image display surface, which leads to the focus-convergence conflict: the observer's monocular focusing depth and binocular convergence depth are inconsistent.
  • Monocular multi-view is an effective technical path to solving the focus-convergence conflict. It uses a light splitting device to guide the display device to project at least two two-dimensional images of the scene to be displayed to the same eye of the observer, so that at least two light beams pass through each displayed object point and enter that eye.
  • The light intensity distribution superimposed at the displayed object point forms a light spot on which the observer's eye can freely focus, thereby overcoming the focus-convergence conflict.
  • the present invention proposes a three-dimensional display method based on spatial superposition of sub-pixel emitted light.
  • In the present invention the sub-pixel of the display device is used as the basic display unit, and all sub-pixels emitting light of the same color are together used as one sub-pixel group or divided into several sub-pixel groups.
  • The multiple sub-pixel groups thus formed project multiple images of the scene to be displayed into the area where the observer's pupil is located, realizing monocular-focusable three-dimensional scene display based on monocular multi-view.
  • The existing monocular multi-view display technologies are all based on the spatial superposition of light beams with color rendering capability projected by different pixels to form a monocular-focusable spatial light spot.
  • The color rendering capability of each pixel's projected beam comes from the mixing of the basic color beams emitted by the sub-pixels that the pixel contains.
  • Forming a spatial light spot of this type therefore requires the spatial superposition of the projected beams of at least two pixels; see, for example, PCT/CN2017/080874 (THREE-DIMENTIONAL DISPLAY SYSTEM BASED ON DIVISION MULTIPLEXING OF VIEWER'S ENTRANCE-PUPIL AND DISPLAY METHOD) and PCT/IB2017/055664 (NEAR-EYE SEQUENTIAL LIGHT-FIELD PROJECTOR WITH CORRECT MONOCULAR DEPTH CUES).
  • In the present invention, by contrast, the basic color light beams projected by different sub-pixels are superimposed in space to form a monocular-focusable spatial light spot.
  • The method described in this patent therefore requires at least two sub-pixels rather than at least two pixels.
  • The three-dimensional display method of this patent based on the spatial superposition of sub-pixel emitted light can effectively increase the number of two-dimensional views projected by the display device, which is more conducive to enlarging the viewing area of the observer's eyes, or, by increasing the spatial density of the viewing zones corresponding to the projected views, to extending the depth range over which a single eye can focus on the displayed scene.
  • This patent also uses a projection device to project an enlarged image of the display device, extending the method to near-eye display, and uses a relay device to optimize the spatial layout of the optical structure.
  • the method can be directly applied to a binocular three-dimensional display optical engine, and can also be applied to a monocular optical engine.
  • Sub-pixels are used as display units to realize monocular focusable three-dimensional display based on monocular multi-views.
  • the present invention provides the following solutions:
  • the three-dimensional display method based on spatial superposition of sub-pixel emitted light includes the following steps:
  • (i) all sub-pixels emitting light of the same color are separately used as one sub-pixel group or divided into several sub-pixel groups;
  • the sub-pixels of the display device are classified into K′ types of basic color sub-pixels with different colors of emitted light, including K types of primitive color sub-pixels, and the sub-pixels of each sub-pixel group are arranged all over the display device, where K′ ≥ 2 and K′ ≥ K ≥ 2;
  • the K types of primitive color sub-pixels correspond one-to-one to K types of filters; for the light emitted by each type of primitive color sub-pixel, the ratio of its transmittance through its own corresponding filter to its transmittance through each of the filters corresponding to the other (K-1) types of primitive color sub-pixels is greater than 9;
  • the color of the light emitted by a basic color sub-pixel is defined as a basic color, giving K′ basic colors in total, and the color of the light emitted by a primitive color sub-pixel is defined as a primitive color, giving K primitive colors in total;
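The transmittance-ratio condition above can be sketched numerically. The function and the R/G/B transmittance values below are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch: verify the primitive-color/filter matching condition.
# For each primitive color i, the transmittance of its light through its own
# filter must be more than 9x its transmittance through every other filter.

def satisfies_filter_condition(transmittance, ratio=9.0):
    """transmittance[i][j]: fraction of color-i light passed by filter j."""
    k = len(transmittance)
    for i in range(k):
        for j in range(k):
            if i == j:
                continue
            # Own-filter transmittance must dominate each other filter's.
            if transmittance[i][i] <= ratio * transmittance[i][j]:
                return False
    return True

# Illustrative R/G/B values (assumed, not from the patent):
t = [
    [0.90, 0.05, 0.04],  # red light through R, G, B filters
    [0.06, 0.88, 0.05],  # green light
    [0.04, 0.06, 0.91],  # blue light
]
print(satisfies_filter_condition(t))  # True
```

A crosstalk-heavy filter set, e.g. `[[0.5, 0.4], [0.4, 0.5]]`, fails the condition.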
  • (ii) a light splitting device is arranged in front of the display device to guide each sub-pixel of the display device to project a light beam to the viewing zone corresponding to the sub-pixel group to which it belongs, and to limit the divergence angle of each projected light beam so that, on the surface where the observer's pupil is located and along at least one direction, the size of the light distribution whose light intensity exceeds 50% of the peak intensity is smaller than the observer's pupil diameter;
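The 50%-of-peak width constraint can be checked with a simple model. A Gaussian beam footprint on the pupil plane is assumed here purely for illustration; the patent does not mandate any particular beam profile:

```python
import math

# Sketch (assumed Gaussian intensity profile): the width of the beam footprint
# on the pupil plane where intensity exceeds 50% of the peak (the FWHM) must
# be smaller than the observer's pupil diameter along at least one direction.

def fwhm_of_gaussian(sigma_mm):
    # Full width at half maximum of a Gaussian intensity profile.
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma_mm

def beam_fits_pupil(sigma_mm, pupil_diameter_mm=4.0):
    return fwhm_of_gaussian(sigma_mm) < pupil_diameter_mm

print(round(fwhm_of_gaussian(1.0), 3))  # 2.355 (mm)
```

For a 4 mm pupil, a beam with a 1 mm standard deviation satisfies the constraint, while one with a 2 mm standard deviation (FWHM about 4.7 mm) does not.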
  • (iii) the control device connected to the display device controls each sub-pixel group to load and display the corresponding image; the image information loaded by each sub-pixel is, along the transmission vector of the beam that the sub-pixel projects into the observer's pupil, the projection light information of the scene to be displayed at the intersection of the straight line containing the transmission vector and the plane where the observer's pupil is located;
  • the image displayed by one sub-pixel group is a view of the scene to be displayed, and the image displayed by a spliced sub-pixel group, formed by complementary splicing of parts of different sub-pixel groups, is a spliced view;
  • the spatial position distribution of the viewing zones corresponding to the sub-pixel groups of the display device is set such that the information of at least two views and/or spliced views in total enters the pupil of the same observer.
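The loading rule for each sub-pixel reduces to simple ray geometry: the beam travels from the sub-pixel toward its group's viewing zone, and the loaded value is the scene's projection light information along that line. The 2D coordinates below are hypothetical, chosen only to show the geometry:

```python
# Minimal geometric sketch (2D, assumed coordinates): a sub-pixel at x_sub on
# the display (z = 0) projects a beam toward the centre of its group's viewing
# zone at x_zone on the pupil plane (z = z_pupil).  The line's direction is
# the "transmission vector"; the scene is sampled along this line.

def transmission_vector(x_sub, x_zone, z_pupil):
    # Unnormalised direction from the sub-pixel to its viewing zone.
    return (x_zone - x_sub, z_pupil)

def point_on_ray(x_sub, x_zone, z_pupil, z):
    # Point reached at depth z along the sub-pixel's projection line.
    x = x_sub + (x_zone - x_sub) * (z / z_pupil)
    return (x, z)

# A sub-pixel at x = 10 mm aiming at a viewing zone at x = 2 mm, 500 mm away:
print(point_on_ray(10.0, 2.0, 500.0, 250.0))  # (6.0, 250.0), the midpoint
```

Sampling the scene at such intermediate depths along each loaded ray is what lets beams from different views superimpose at displayed object points.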
  • step (i) further includes naming each sub-pixel and the sub-pixel group to which it belongs according to the color of the light they emit, for example the green sub-pixel and the green sub-pixel group.
  • the light splitting device is an aperture array placed corresponding to the observer's pupil; the aperture array includes at least one aperture group consisting of K apertures, the K apertures being fitted with the K types of filters in one-to-one correspondence with the K types of primitive color sub-pixels; each aperture corresponds to the sub-pixel group formed by the primitive color sub-pixels whose projected light passes through it, and that sub-pixel group takes the aperture as its corresponding viewing zone.
  • the aperture array includes M aperture groups, and different aperture groups respectively allow only mutually different orthogonal characteristic light to pass through, where M ≥ 2.
  • the mutually different orthogonal characteristics are: time-sequential characteristics, transmitted at different time points; two linear polarization characteristics with mutually perpendicular polarization directions; two optical rotation characteristics with opposite rotation directions; the combination of the time-sequential characteristics with the two mutually perpendicular linear polarization characteristics; or the combination of the time-sequential characteristics with the two opposite optical rotation characteristics.
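The polarization-based characteristics can be modeled with Jones vectors, a standard optics formalism used here only for illustration (it is not the patent's notation): two states are orthogonal when their Hermitian inner product is zero, so an aperture passing one blocks the other.

```python
# Sketch: orthogonality of polarization/optical-rotation states via Jones
# vectors.  Zero inner product means an analyzer passing one state fully
# rejects the other.

def inner(a, b):
    # Hermitian inner product of two 2-component Jones vectors.
    return sum(x.conjugate() * y for x, y in zip(a, b))

H = (1 + 0j, 0 + 0j)         # horizontal linear polarization
V = (0 + 0j, 1 + 0j)         # vertical linear polarization (perpendicular)
s = 2 ** -0.5
LCP = (s + 0j, 1j * s)       # circular state, one rotation direction
RCP = (s + 0j, -1j * s)      # circular state, opposite rotation direction

print(abs(inner(H, V)))      # 0.0 -> orthogonal
print(abs(inner(LCP, RCP)))  # 0.0 -> orthogonal
```

Time-sequential "orthogonality" needs no such algebra: beams transmitted at disjoint time points simply never coexist, which is why it can be freely combined with either polarization pair.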
  • the light splitting device is an aperture array placed corresponding to the observer's pupil; the aperture array includes at least one aperture group composed of K′ apertures, the K′ apertures corresponding one-to-one to the K′ types of basic color sub-pixels; each aperture corresponds to the sub-pixel group formed by the sub-pixels of its basic color, and the sub-pixel group whose projected light passes through the aperture takes the aperture as its corresponding viewing zone; the K apertures corresponding to the K types of primitive color sub-pixels are fitted with the corresponding K types of filters, where K′ > K;
  • the K apertures fitted with the K types of filters allow light of the same orthogonal characteristic to pass through, and each of the remaining (K′-K) apertures allows only one of the other (K′-K) orthogonal characteristics to pass through.
  • the (K′-K+1) orthogonal characteristics are different from each other.
  • each aperture corresponds to the sub-pixel group formed by the sub-pixels of its corresponding basic color; the sub-pixel group whose projected light passes through the aperture takes the aperture as its corresponding viewing zone.
  • the aperture array includes M aperture groups, and different aperture groups respectively allow only mutually different orthogonal characteristic light to pass through, where M ≥ 2.
  • the mutually different orthogonal characteristics are: time-sequential characteristics, transmitted at different time points; two linear polarization characteristics with mutually perpendicular polarization directions; two optical rotation characteristics with opposite rotation directions; the combination of the time-sequential characteristics with the two mutually perpendicular linear polarization characteristics; or the combination of the time-sequential characteristics with the two opposite optical rotation characteristics.
  • the display device is a passive display device with a backlight source array.
  • the backlight source array includes at least one backlight source group consisting of K backlight sources projecting the K types of primitive color light, and the light splitting device is an optical device that projects a real image of each backlight source of the backlight source array; the light distribution area of each backlight source's real image serves as the viewing zone corresponding to the sub-pixel group whose projected light color matches that backlight source's color and passes through that light distribution area.
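The location of each backlight source's real image can be estimated with the thin-lens equation, assuming (for illustration only, not the patent's specific optics) that the light splitting device acts as a single thin lens:

```python
# Sketch (thin-lens assumption): the light splitting device images each
# backlight source to a real image on the observer's side; that image's
# light distribution area acts as the viewing zone of the sub-pixel group
# lit by this source.

def real_image_distance(u, f):
    """Object distance u and focal length f -> image distance v (1/f = 1/u + 1/v)."""
    if u <= f:
        raise ValueError("no real image for u <= f")
    return u * f / (u - f)

def image_magnification(u, f):
    # Linear magnification of the backlight's light distribution area.
    return real_image_distance(u, f) / u

# A backlight 60 mm from a 50 mm lens images 300 mm away, magnified 5x:
print(real_image_distance(60.0, 50.0))  # 300.0 (mm)
```

The magnification sets the viewing-zone size, so the backlight shape and the lens geometry together determine how densely viewing zones can be packed at the pupil.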
  • the backlight source array includes M backlight source groups, and different backlight source groups respectively emit mutually different orthogonal characteristic light, where M ≥ 2.
  • the mutually different orthogonal characteristics are: time-sequential characteristics, projected at different time points; two linear polarization characteristics with mutually perpendicular polarization directions; two optical rotation characteristics with opposite rotation directions; the combination of the time-sequential characteristics with the two mutually perpendicular linear polarization characteristics; or the combination of the time-sequential characteristics with the two opposite optical rotation characteristics.
  • the display device is a passive display device with a backlight source array.
  • the backlight source array includes at least one backlight source group consisting of K′ backlight sources respectively projecting the K′ types of basic color light, and the light splitting device is an optical device that projects a real image of each backlight source of the backlight source array; the light distribution area of each backlight source's real image serves as the viewing zone corresponding to the sub-pixel group whose projected light color matches that backlight source's color and passes through that light distribution area.
  • the output light of the K backlight sources projecting the K types of primitive color light has the same orthogonal characteristic, and the output light of each of the remaining (K′-K) backlight sources has one of the other (K′-K) orthogonal characteristics.
  • the (K′-K+1) orthogonal characteristics are different from each other.
  • the backlight source array includes M backlight source groups, and different backlight source groups respectively emit mutually different orthogonal characteristic light, where M ≥ 2.
  • the mutually different orthogonal characteristics are: time-sequential characteristics, projected at different time points; two linear polarization characteristics with mutually perpendicular polarization directions; two optical rotation characteristics with opposite rotation directions; the combination of the time-sequential characteristics with the two mutually perpendicular linear polarization characteristics; or the combination of the time-sequential characteristics with the two opposite optical rotation characteristics.
  • step (ii) also includes placing a projection device at a position corresponding to the display device to form an enlarged image of the display device.
  • step (ii) further includes placing a relay device on the transmission path of the projection light of the display device to guide the projection light beam of the display device to enter the area where the pupil of the observer is located.
  • the relay device is a reflective surface, a semi-transparent semi-reflective surface, a combination of free-form surfaces, or an optical waveguide device.
  • step (iii) also includes connecting the tracking device with the control device, and determining the position of the pupil of the observer in real time through the tracking device (70).
  • step (iii) also includes determining, according to the position of the observer's pupil, the light information loaded on each sub-pixel whose projected light enters the observer's pupil: along the transmission vector of that beam, the projection light information of the scene to be displayed at the intersection of the straight line containing the transmission vector and the observer's pupil.
  • step (iii) also includes determining, by the control device according to the position of the observer's pupil, the sub-pixel groups whose projected beams enter the observer's pupil, and driving only these, as effective sub-pixel groups, for the monocular multi-view display.
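The tracking-driven selection of effective sub-pixel groups might look like the following sketch; the one-dimensional viewing-zone coordinates (in millimetres) and the group names are hypothetical:

```python
# Sketch of the tracking-driven driving strategy (assumed 1D layout): given
# the pupil position reported by the tracking device, drive only those
# sub-pixel groups whose viewing-zone centres fall within the pupil, leaving
# the rest dark.

def effective_groups(zone_centers, pupil_center, pupil_diameter=4.0):
    half = pupil_diameter / 2.0
    return [name for name, x in zone_centers.items()
            if abs(x - pupil_center) <= half]

# Hypothetical viewing-zone centres for six sub-pixel groups (mm):
zones = {"R1": -2.5, "G1": -1.5, "B1": -0.5, "R2": 0.5, "G2": 1.5, "B2": 2.5}

print(effective_groups(zones, 0.0))  # ['G1', 'B1', 'R2', 'G2']
```

As the pupil moves (say, to x = 2.0 mm), the effective set shifts accordingly, which is what keeps at least two views inside the pupil without driving every group at once.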
  • The present invention uses sub-pixels as the basic display unit. Compared with monocular multi-view display methods that use whole pixels as the display unit, it can effectively increase the number of projected two-dimensional views, and, combined with time multiplexing, further meet the greater demand of monocular multi-view display for the number of projected two-dimensional views.
  • The present invention has the following technical effects: the three-dimensional display method based on the spatial superposition of sub-pixel emitted light provides an implementation of three-dimensional display without focus-convergence conflict.
  • This three-dimensional display method based on the spatial superposition of sub-pixel emitted light can effectively increase the number of two-dimensional views projected by the display device, which is more conducive to enlarging the viewing area of the observer's eyes, or, by increasing the spatial density of the viewing zones corresponding to the projected views, to extending the depth range over which a single eye can focus on the displayed scene.
  • This patent also uses a projection device to project an enlarged image of the display device, extending the method to near-eye display, and uses a relay device to optimize the spatial layout of the optical structure.
  • the method can be directly applied to a binocular three-dimensional display optical engine, and can also be applied to a monocular optical engine.
  • FIG. 1 is a schematic diagram of a conventional monocular multi-view display principle using pixels as display units.
  • Fig. 2 is a schematic diagram of a three-dimensional display method of this patent based on spatial superposition of sub-pixel emitted light.
  • Fig. 3 is an explanatory diagram of the merging rules for merging sub-pixel groups.
  • Figure 4 is a schematic diagram of the depth area where no beam superposition occurs.
  • Fig. 5 is a schematic diagram of the imaging function of the projection device.
  • Fig. 6 is an example of the propagation path of the deflected light of the relay device.
  • Fig. 7 is a schematic diagram of the light splitting principle of an aperture type light splitting device.
  • Fig. 8 is an explanatory diagram of the working principle of an aperture type spectroscopic device based on linear polarization characteristics.
  • Fig. 9 is an explanatory diagram of the working principle of the aperture type spectroscopic device based on the orthogonal characteristic of the timing.
  • Fig. 10 is an explanatory diagram of the working principle of an aperture type spectroscopic device based on hybrid orthogonal characteristics.
  • Fig. 11 is a diagram illustrating the working principle of another aperture type spectroscopic device based on hybrid orthogonal characteristics.
  • Fig. 12 is a schematic diagram of a binocular display structure based on an aperture type spectroscopic device.
  • Fig. 13 is a schematic diagram of a near-eye monocular display optical engine based on an aperture type spectroscopic device.
  • Fig. 14 is a schematic diagram of a near-eye binocular display optical engine based on an aperture type spectroscopic device.
  • Fig. 15 is an example 1 of a composite structure type near-eye monocular optical engine based on an aperture type spectroscopic device.
  • Fig. 16 is an example 2 of a near-eye monocular optical engine with a composite structure based on an aperture-type spectroscopic device.
  • Fig. 17 is an example 3 of a near-eye monocular optical engine with a composite structure based on an aperture-type spectroscopic device.
  • Fig. 18 is an example 4 of a near-eye monocular optical engine with a composite structure based on an aperture-type spectroscopic device.
  • Fig. 19 is a structural assembly diagram of a light and thin monocular optical engine based on an aperture type spectroscopic device.
  • Fig. 20 is an example 5 of a composite structure type near-eye monocular optical engine based on an aperture type spectroscopic device.
  • FIG. 21 is a schematic diagram of an example of a free-form surface type relay device based on an aperture type light splitting device.
  • FIG. 22 is a schematic diagram of Example 1 of an optical waveguide type relay device based on an aperture type optical splitting device.
  • FIG. 23 is a schematic diagram of the stacked structure of multiple optical waveguides.
  • FIG. 24 is a schematic diagram of Example 2 of an optical waveguide type relay device based on an aperture type optical splitting device.
  • FIG. 25 is a schematic diagram of Example 3 of an optical waveguide type relay device based on an aperture type optical splitting device.
  • Fig. 26 is a schematic diagram of the light splitting principle of an imaging type spectroscopic device.
  • Fig. 27 is an explanatory diagram of the working principle of an imaging type spectroscopic device based on linear polarization characteristics.
  • Fig. 28 is an explanatory diagram of the working principle of an imaging type spectroscopic device based on the orthogonal characteristic of the time series.
  • Fig. 29 is an explanatory diagram of the working principle of an imaging type spectroscopic device based on hybrid orthogonal characteristics.
  • Fig. 30 is a schematic diagram of a binocular display structure based on an imaging type spectroscopic device.
  • Figure 31 is a schematic diagram of the shape of the backlight.
  • FIG. 32 is a schematic diagram of the oblique arrangement of the strip-shaped viewing area relative to the direction of the binocular connection of the observer.
  • Fig. 33 is a schematic diagram of a composite structure type display device.
  • FIG. 34 is a schematic diagram of Example 1 of a near-eye monocular display optical engine based on an imaging type spectroscopic device.
  • FIG. 35 is a schematic diagram of Example 2 of a near-eye monocular display optical engine based on an imaging type spectroscopic device.
  • Fig. 36 is a schematic diagram of a near-eye binocular display optical engine based on an imaging type spectroscopic device.
  • Fig. 37 is a schematic diagram of Example 3 of a near-eye monocular display optical engine based on an imaging type spectroscopic device.
  • FIG. 38 is a schematic diagram of Example 4 of a near-eye monocular display optical engine based on an imaging type spectroscopic device.
  • FIG. 39 is a schematic diagram of Example 5 of a near-eye monocular display optical engine based on an imaging type spectroscopic device.
  • FIG. 40 is a schematic diagram of Example 6 of a near-eye monocular display optical engine based on an imaging type spectroscopic device.
  • Fig. 41 is a schematic diagram of an example of a free-form surface type relay device based on an imaging type spectroscopic device.
  • FIG. 42 is a schematic diagram of Example 1 of an optical waveguide type relay device based on an imaging type spectroscopic device.
  • FIG. 43 is a schematic diagram of Example 2 of an optical waveguide type relay device based on an imaging type spectroscopic device.
  • The present invention is a three-dimensional display method based on the spatial superposition of sub-pixel emitted light. It directly uses sub-pixels as the basic display unit and uses multiple sub-pixel groups to project at least two views to the same pupil of the observer; the spatial superposition of differently directed beams from different views forms a monocular-focusable display light spot, realizing a three-dimensional display without focus-convergence conflict.
  • The existing monocular multi-view technologies all use pixels as the basic display unit and project at least two views to the same pupil 50 of the observer; through the spatial superposition, at a displayed object point, of at least two beams from the at least two views that pass through that point, a spatially superimposed light spot is formed at the displayed object point.
  • The observer's eye can thus be drawn to focus on the spatially superimposed light spot, thereby overcoming the focus-convergence conflict.
  • The light distribution size of each beam whose intensity is greater than 50% of the peak intensity is smaller than the diameter Dp of the observer's pupil 50, so as to ensure that, relative to the light intensity distribution where the beam exits the pixel, the light intensity distribution at the spatially superimposed light spot can attract the focus of the observer's eye.
  • The at least two views are derived from at least two pixel groups on the display device 10, and the at least two pixel groups respectively project the corresponding views to their respective viewing zones under the action of the light splitting device 20, as shown in FIG. 1, where the display device 10 is drawn as a thin-structure display by way of example.
  • the projected light beam of the pixel group 1 carries the light information of the view 1 through the corresponding viewing zone VZ 1
  • the projected light beam of the pixel group 2 carries the light information of the view 2 through the corresponding viewing zone VZ 2 , and respectively enters the pupil 50 of the observer.
  • the x direction is the arrangement direction of the viewing zone.
  • the light beam 1 from the pixel group 1 and the light beam 2 from the pixel group 2 are superimposed to form a spatially superimposed light spot.
  • FIG. 1 is intended to illustrate the basic principle of monocular multi-view display without involving the specific structure of the light splitting device 20, which is therefore represented by a dashed frame.
  • The spatial position of the light splitting device 20 relative to the display device 10 in FIG. 1 is also schematic, for ease of illustration, and does not imply a fixed actual positional relationship when the light splitting device 20 adopts different specific structures.
  • More views can be made incident on the pupil 50 of the same observer.
  • In that case, more superimposed light beams enter the pupil 50 of the observer along their respective vector directions.
  • The superposition of a larger number of light beams can improve the ability of the spatially superimposed light spots to attract the focus of the observer's eyes, which is beneficial to displaying scenes at a larger distance from the screen.
  • More viewing zones can also provide a larger viewing area for the pupil 50 of the observer, so that when the pupil 50 moves within this larger viewing area, the observer continues to see, based on the monocular multi-view principle, a displayed scene free of focus-convergence conflict.
  • An increase in the number of viewing zones means an increase in the number of corresponding views.
  • The light splitting device 20 is then required to divide the pixels of the display device 10 into more pixel groups to project the additional views.
  • the division of pixels of the display device 10 into pixel groups is often implemented by the light splitting device 20 based on space division multiplexing or time division multiplexing.
  • the spatial division multiplexing divides the pixels of the display device 10 into different spatial pixel groups corresponding to different viewing areas.
  • The pixels contained in different spatial pixel groups are different from each other; more spatial pixel groups mean fewer pixels in each group, that is, a lower resolution of each projected view.
  • Time division multiplexing divides the pixels of the display device 10 in time sequence into different timing pixel groups corresponding to different viewing zones. Each timing pixel group contains the same pixels but projects light information to a different viewing zone at a different time point within the same time period; based on persistence of vision, the light beams projected by the different timing pixel groups are superimposed in space.
  • Generating more timing pixel groups by time division multiplexing means that displaying a scene based on persistence of vision requires more time points within a time period, that is, a relative decrease in the display frequency of the three-dimensional scene.
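The two costs described above, resolution loss under space division multiplexing and refresh-rate loss under time division multiplexing, can be made concrete with illustrative numbers (the panel specs are assumptions, not from the patent):

```python
# Rough bookkeeping of the two multiplexing trade-offs: space division
# multiplexing divides the panel's resolution among the views, while time
# division multiplexing divides the panel's refresh rate among them.

def per_view_resolution(total_pixels, n_spatial_groups):
    # Pixels available to each view when the panel is split spatially.
    return total_pixels // n_spatial_groups

def per_scene_rate(panel_hz, n_time_points):
    # Effective 3D-scene refresh rate when views are shown sequentially.
    return panel_hz / n_time_points

# Splitting an assumed 3840x2160 panel into 4 spatial groups:
print(per_view_resolution(3840 * 2160, 4))  # 2073600 pixels per view
# Serving 4 views from an assumed 240 Hz panel by time multiplexing:
print(per_scene_rate(240, 4))               # 60.0 Hz per 3D frame
```

Either way, the per-view budget shrinks linearly with the number of views, which is why increasing the view count per pixel, as this patent does with sub-pixels, is valuable.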
  • FIG. 1 is a schematic diagram of a conventional monocular multi-view display method in which each light beam is emitted from a corresponding pixel.
  • Each pixel includes more than one sub-pixel. The sub-pixels of the same pixel emit light of different colors, and these colors mix to form the multi-color mixed light that serves as the pixel's colored output.
  • In the monocular multi-view display process, the colored light emitted by each pixel, under the divergence-angle restriction and directional guidance of the light splitting device 20, is superimposed in space in the form of directional beams with limited divergence angles to generate a monocular-focusable spatially superimposed light spot.
  • the two views required for the process shown in FIG. 1 are projected from two pixel groups formed by dividing the pixels of the display device 10 by the light splitting device 20.
  • In the present invention, all sub-pixels emitting light of the same color are separately used as a sub-pixel group or divided into multiple sub-pixel groups, and each sub-pixel and its sub-pixel group are named after the color of the light they emit, such as the green sub-pixel and its green sub-pixel group.
  • FIG. 2 takes as an example a display device 10 in which each pixel is composed of R (red), G (green), and B (blue) sub-pixels, i.e. three types of sub-pixels emitting red, green, and blue light respectively.
  • More generally, the display device 10 has K′ types of basic-color sub-pixels with different emitted-light colors, including K types of primitive-color sub-pixels.
  • The K types of primitive-color sub-pixels meet the following condition: they correspond one-to-one to K types of filters, and the ratio of the transmittance of the light emitted by each primitive-color sub-pixel through its corresponding filter to its transmittance through the filter corresponding to each of the other (K−1) types of primitive-color sub-pixels is greater than 9.
  • Define the colors of the light emitted by the basic-color sub-pixels as the basic colors, K′ types in total, and define the colors of the light emitted by the primitive-color sub-pixels as the primitive colors, K types in total.
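The transmittance condition above can be expressed as a small check; the transmittance values in the example are invented for illustration:

```python
def is_primitive_color_set(T, ratio=9.0):
    """T[i][j] is the transmittance of light from primitive-color
    sub-pixel i through the filter corresponding to primitive color j.
    The condition in the text: each light's transmittance through its own
    filter must exceed `ratio` times its transmittance through each of
    the other (K-1) filters."""
    K = len(T)
    return all(T[i][i] > ratio * T[i][j]
               for i in range(K) for j in range(K) if i != j)

# Example: nearly ideal R/G/B filters (assumed values).
T_good = [[0.90, 0.05, 0.04],
          [0.03, 0.85, 0.06],
          [0.02, 0.05, 0.88]]
```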
  • the three sub-pixels in each pixel are arranged along the x direction.
  • The 6 sub-pixel groups, under the guidance of the light splitting device 20, respectively project 6 views of the scene to be displayed through the 6 viewing zones VZ R1 , VZ G1 , VZ B1 , VZ R2 , VZ G2 and VZ B2 in one-to-one correspondence.
  • The image information loaded by each sub-pixel is the projection light information of the scene to be displayed at the intersection of (a) the straight line along which the sub-pixel's beam travels, from the sub-pixel through the viewing zone corresponding to its sub-pixel group into the area where the observer's pupil 50 is located, and (b) the surface where the observer's pupil 50 lies.
  • In other words, each sub-pixel group is loaded with a view relative to its corresponding viewing zone. It should be pointed out that the projected light of each sub-pixel has a single basic color and can carry only light information of that basic color.
  • Therefore, the "projection light information at the intersection of the straight line along which the beam travels and the surface where the observer's pupil 50 is located" refers only to the information component whose color is consistent with the color of the light emitted by that sub-pixel. This also applies to other parts of this patent.
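Geometrically, the loading rule amounts to extending the line from a sub-pixel through its viewing zone to the pupil plane and sampling the scene's projection light there, keeping only the matching color component. A minimal sketch, with assumed plane positions and coordinates:

```python
def pupil_plane_intersection(subpixel_xy, vz_xy, z_display, z_vz, z_pupil):
    """Point where the line from a sub-pixel (on plane z_display) through
    the centre of its group's viewing zone (on plane z_vz) crosses the
    plane of the observer's pupil 50 (z_pupil); the sub-pixel is loaded
    with the scene's projection light information at this point,
    colour component only."""
    t = (z_pupil - z_display) / (z_vz - z_display)  # line parameter
    return tuple(s + t * (v - s) for s, v in zip(subpixel_xy, vz_xy))
```

For a sub-pixel at the origin of the display plane z = 0 and a viewing zone centred at x = 1 on the plane z = 1, the loading point on the pupil plane z = 2 is (2, 0).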
  • the distance between the viewing zones is designed so that the light information of at least two views emitted through at least two viewing zones enters the pupil 50 of the same observer.
  • For example, the three directional beams 3, 4, and 5 from the three basic-color sub-pixel groups are superimposed to form a monocularly focusable display light spot.
  • Grouping the sub-pixels that emit light of different colors to project views separately increases the number of views, and corresponding viewing zones, that the display device 10 projects through the light splitting device 20.
  • FIG. 2 illustrates the principle and method of implementing monocular multi-view display with sub-pixels as the basic display unit; it does not involve the specific structure of the light splitting device 20, which is represented by a virtual frame.
  • The spatial position of the light splitting device 20 relative to the display device 10 is shown schematically for ease of illustration, and does not mean that the actual positional relationship between the two is necessarily as shown in FIG. 2 when the light splitting device 20 adopts different specific structures. Comparing FIG. 1 and FIG. 2, if both structures use the same display device 10 and the projected views have the same resolution, using sub-pixels as the basic display unit increases the number of projected views.
  • Compared with the existing monocular multi-view display method using pixels as the basic display unit, the monocular multi-view display method of this patent using sub-pixels as the basic display unit can effectively increase the number of projected views and corresponding viewing zones, so that more viewing zones can provide a larger viewing area for the observer's pupil 50, or the display depth can be expanded by increasing the distribution density of the viewing zones.
  • the spatially superimposed light point P shown in FIG. 2 is located between the display device 10 and the pupil 50 of the observer, and is formed by the real superposition of light beams from different sub-pixels.
  • A spatially superimposed light spot can also be generated on the other side of the display device 10, by the intersection of the reverse extension lines of light beams.
  • For example, the point P′ in FIG. 2 is formed by the intersection of the reverse extension lines of the light beams 6, 7, and 8.
  • When the observer's eyes receive the light beams 6, 7, and 8, they perceive the superimposed light distribution equivalent to a light spot at the point P′, obtained by tracing the beams 6, 7, and 8 backward along their propagation directions. In this patent this is also called the spatially superimposed light spot at the point P′, and it likewise corresponds to a real light spot on the retina of the observer's eye.
  • The display scenes on the two sides of the display device 10 are generated for the observer based on the same principle. In the following, only the display scene on the side toward which the display device 10 emits light is taken as an example for description.
  • Each superimposed beam is required, on the surface where the observer's pupil 50 is located, to have along at least one direction a light distribution whose width at 50% of the peak light intensity is smaller than the diameter of the observer's pupil 50, to ensure that the light intensity distribution of the spatially superimposed light spot, relative to the light intensity at the exit sub-pixels of the beams, can attract the focus of the observer's eye.
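If the beam profile at the pupil plane is approximately Gaussian, the 50%-peak width is the FWHM; a minimal check of the condition, with assumed beam width and pupil diameter:

```python
import math

def width_at_half_peak(sigma):
    """FWHM of a 1-D Gaussian intensity profile exp(-x^2 / (2 sigma^2)):
    the width of the region where intensity stays above 50% of peak."""
    return 2.0 * sigma * math.sqrt(2.0 * math.log(2.0))

def can_attract_focus(sigma, pupil_diameter):
    """The text's condition along one direction: the 50%-peak width of
    the superimposed beam at the pupil plane must be smaller than the
    observer's pupil diameter."""
    return width_at_half_peak(sigma) < pupil_diameter
```

With sigma = 1 mm the 50%-peak width is about 2.35 mm, which satisfies the condition for a 3 mm pupil; sigma = 2 mm (width about 4.71 mm) does not.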
  • The light beam projected by each sub-pixel enters the area where the observer's pupil 50 is located through the viewing zone corresponding to the sub-pixel group to which the sub-pixel belongs. There are two cases for the shape of each viewing zone.
  • In the first case, along one direction the size of the viewing zone is smaller than the diameter D p of the observer's pupil 50, but along some other direction its size is not smaller than D p ; this patent calls this type of viewing zone a strip viewing zone.
  • In the second case, along all directions the size of the viewing zone is smaller than the observer's pupil diameter D p ; this patent calls this type a point viewing zone. When strip viewing zones are used, the viewing zones are arranged along a one-dimensional direction; when point viewing zones are used, they can be arranged along a one-dimensional direction or a two-dimensional direction.
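The strip/point classification reduces to comparing the viewing zone's extent with D p along each direction; a sketch with arbitrary sizes:

```python
def classify_viewing_zone(size_x, size_y, pupil_d):
    """'point' if the zone is smaller than the pupil diameter D_p along
    every direction, 'strip' if smaller along one direction only, and
    'oversized' otherwise (a case not used in the text)."""
    small = (size_x < pupil_d, size_y < pupil_d)
    if all(small):
        return "point"
    if any(small):
        return "strip"
    return "oversized"
```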
  • the pupil 50 of the observer is placed close to the surface of the visual zone.
  • the pupil 50 of the observer may not be able to completely receive all the light beams of at least two views.
  • the pupil 50 of the observer at the position shown in FIG. 3 can receive a complete view incident through the viewing zone VZ B1 , which is projected by the blue sub-pixel group 1 corresponding to the viewing zone VZ B1.
  • The observer pupil 50 at this position can only receive, through the viewing zone VZ R2 , the partial view projected by the sub-pixels of red sub-pixel group 2 in the M s2 M r1 area, and, through the viewing zone VZ G1 , the partial view projected by the sub-pixels of green sub-pixel group 1 in the M s1 M r2 area.
  • M p1 and M p2 are the two edge points of the observer pupil 50 along the viewing-zone arrangement direction x.
  • M s1 and M s2 are the two edge points of the sub-pixel distribution area of the display device 10 along the viewing-zone arrangement direction x.
  • M r1 is the intersection with the display device 10 of the line connecting M p2 and the -x edge point of the viewing zone VZ R2 .
  • M r2 is the intersection with the display device 10 of the line connecting M p1 and the x edge point of the viewing zone VZ G1 .
  • Where the M s2 M r1 area and the M s1 M r2 area overlap, take their spatially complementary parts M s2 M t and M t M s1 and join them into a joined sub-pixel group; the image displayed on it is a joined view of the scene to be displayed.
  • M t is any point in the overlapping area M r1 M r2 .
  • Thus the observer pupil 50 at the position shown in FIG. 3 can receive one complete view and one joined view.
  • For each displayed object point, at least two light beams, one from the complete view and one from the joined view, enter the observer's pupil 50.
  • They can therefore be superimposed based on the monocular multi-view principle to form a monocularly focusable spatial light spot.
  • When the observer's pupil is at other positions, the joined sub-pixel group visible to the observer is complementarily combined from different parts of more sub-pixel groups.
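The complementary joining of two partial views can be sketched as an array operation, where the cut index plays the role of M t inside the overlap region (indices and values are illustrative):

```python
def join_partial_views(part_a, part_b, cut):
    """part_a covers the display's -x side, part_b the +x side, with
    overlapping coverage around index `cut` (the role of M_t, chosen
    inside the overlap region M_r1..M_r2).  The joined view takes
    part_a before the cut and part_b from the cut onward."""
    assert len(part_a) == len(part_b)
    return [a if i < cut else b
            for i, (a, b) in enumerate(zip(part_a, part_b))]
```

For example, joining `[1, 1, 1, 0]` and `[0, 2, 2, 2]` at cut index 2 yields `[1, 1, 2, 2]`.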
  • color light information is often presented through the synthesis of multiple basic colors.
  • The pixel-level presentation of color light information on various displays is realized by mixing the K′ basic colors emitted by the sub-pixels.
  • The K′ basic colors of common displays are R, G, B or R, G, B, W (white).
  • the two light beams can be superimposed to form a monocular focusable spatial superimposed light point within a certain distance from the screen.
  • However, the presentation of its color is inaccurate due to missing basic colors.
  • Therefore, the superimposed beams passing through each displayed object point are optimally at least K′ beams covering the K′ basic colors.
  • Correspondingly, the views received by the observer's pupil 50 optimally include views or/and joined views whose projected light colors together cover the K′ basic colors.
  • For example, the colors of the light emitted through adjacent K′ viewing zones are the K′ basic colors.
  • Due to the discrete distribution of the sub-pixels and the discrete distribution of the viewing zones, a displayed object point whose number of superimposed beams is less than K′ may occur.
  • The point P r is the intersection of the light beams cross-projected by two adjacent sub-pixels to two adjacent viewing zones.
  • P l is the intersection of the reverse extension lines of the non-crossed light beams projected by the adjacent sub-pixels to the viewing zones with the largest spacing.
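A one-dimensional similar-triangles sketch of where those intersection points fall, with the display at z = 0 and the viewing zones on the plane z = z_vz (the pitch and spacing values in the example are arbitrary):

```python
def crossed_beam_depths(pitch, vz_gap, z_vz):
    """Depths (z-coordinates) of P_r and P_l for two adjacent sub-pixels
    separated by `pitch` on the display plane z = 0, projecting to two
    viewing zones separated by `vz_gap` on the plane z = z_vz.
    Derived from similar triangles: crossed beams meet in front of the
    screen at P_r; the reverse extensions of the un-crossed beams meet
    at P_l (a negative value means behind the display)."""
    z_r = z_vz * pitch / (pitch + vz_gap)   # crossed beams, in front
    z_l = z_vz * pitch / (pitch - vz_gap)   # reverse extensions, behind
    return z_r, z_l
```

With pitch 1, viewing-zone spacing 3, and z_vz = 8, the crossed beams meet at z = 2 in front of the screen and the reverse extensions at z = −4 behind it; a denser viewing-zone distribution (smaller vz_gap relative to pitch) pushes both points farther from the screen, expanding the display depth.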
  • the viewing zones shown in FIG. 2 are illustrated by taking the existence of a certain space gap between the viewing zones as an example.
  • adjacent viewing areas can also be arranged adjacently or partially overlapping.
  • In the following, viewing zones are often shown adjacently arranged or arranged with gaps, but this does not mean that adjacent viewing zones must adopt the arrangement shown.
  • The tracking device 70 shown in FIG. 2 can also be used to obtain the position of the observer's pupil 50 in real time. Its functions are as follows. First, when the light distribution of the beam projected by each sub-pixel into the area where the observer's pupil is located is larger than the observer's pupil diameter D p along a certain direction, the position of the observer's pupil 50 is used to determine
  • the image information loaded by each sub-pixel, namely the projection light information of the scene to be displayed at the intersection of the straight line along which the sub-pixel's beam travels toward the observer's pupil 50 and the plane of the observer's pupil 50.
  • Second, the control device 30 determines the sub-pixel groups whose projected beams enter the observer's pupil 50, and uses these sub-pixel groups for light information loading and projection.
  • When the number of viewing zones projected by the display device 10 through the light splitting device 20 is large enough to project at least two views or/and joined views to each of the observer's two pupils, the optical structure built according to the three-dimensional display method based on the spatial superposition of sub-pixel emitted light described in this patent
  • can be used as a binocular display optical engine. If the viewing zones projected by the display device 10 through the light splitting device 20 only support the projection of at least two views or/and joined views to a single pupil of the observer, the optical structure built according to the three-dimensional display method described in this patent
  • can only be used as a monocular display optical engine, for example as an eyepiece for head-mounted virtual reality (VR)/augmented reality (AR) devices.
  • By introducing the projection device 40, an enlarged image I 10 of the display device 10 can be projected, as shown in FIG. 5.
  • The image I 10 of the display device 10 with respect to the projection device 40 can be regarded as an equivalent display device composed of equivalent sub-pixel groups, where each equivalent sub-pixel group is the image of a sub-pixel group of the display device 10 with respect to the projection device 40.
  • For example, the image I VZR2 of the viewing zone VZ R2 is regarded as the equivalent viewing zone corresponding to the equivalent sub-pixel group that is the image of red sub-pixel group 2.
  • When the projection device 40 is introduced, information loading and monocular multi-view display can be performed directly based on the equivalent sub-pixel groups of the equivalent display device and their corresponding equivalent viewing zones, completely analogously to the case above where the projection device 40 is not introduced.
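The equivalent display device and equivalent viewing zones are ordinary optical images, so a thin-lens relation suffices to locate them; the focal length and object distance below are assumed values, not parameters from the patent:

```python
def thin_lens_image(u, f):
    """Image distance v and transverse magnification v/u for an object at
    distance u from a thin lens of focal length f (1/u + 1/v = 1/f,
    real object with u > f giving a real image); used here only to place
    the equivalent display device I_10 and the equivalent viewing zones
    formed by a projection device."""
    v = u * f / (u - f)
    return v, v / u
```

An object at u = 15 with f = 10 images at v = 30 with magnification 2, i.e. the equivalent display device is an enlarged copy of the display device.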
  • When equivalent viewing zones exist, the above constraint on the size of the viewing zone becomes a constraint on the size of the corresponding equivalent viewing zone.
  • The equivalent viewing zone may be strip-shaped, with a size smaller than the diameter D p of the observer's pupil 50 along one direction but not smaller than D p along some other direction.
  • The equivalent viewing zone can also be point-shaped, with a size smaller than D p along all directions.
  • The size of the equivalent viewing zone along a certain direction refers to the size, in that direction, of the area covered by light intensity above 50% of the peak light intensity.
  • the equivalent viewing zones are strip-shaped, the equivalent viewing zones are arranged in a one-dimensional direction; if the equivalent viewing zones are dot-shaped, the equivalent viewing zones can be arranged in a one-dimensional direction or in a two-dimensional direction.
  • information loading and monocular multi-view display can be performed directly based on each equivalent sub-pixel group of the equivalent display device and their respective corresponding viewing zones.
  • The position of the projection device 40 relative to the light splitting device 20 is related to the specific structure of the light splitting device 20. Depending on that structure, the projection device 40 can also be placed at the position P o2 shown in FIG. 5, or at a position P o3 between different components of the light splitting device 20.
  • The relay device 60 can be used to guide the light information projected by the display device 10 along a deflected path to the area where the observer's pupil is located, as shown in FIG. 6.
  • FIG. 6 takes as an example a relay device 60 that is a half-transmissive, half-reflective mirror.
  • In this case the equivalent display device is the image of the display device 10 with respect to the projection device 40 and the relay device 60, shown as I 10 in FIG. 6; each equivalent sub-pixel group is the image of a sub-pixel group with respect to the projection device 40 and the relay device 60.
  • If viewing zones exist, the image of each viewing zone with respect to the projection device 40 and the relay device 60 serves as the corresponding equivalent viewing zone, such as I VZR1 and I VZB1 in FIG. 6.
  • The positional relationship of the light splitting device 20, the projection device 40, and the relay device 60 shown in FIG. 6 is only illustrative and may change with the specific structures adopted. When the optical structure built according to the three-dimensional display method based on the spatial superposition of sub-pixel emitted light described in this patent is used as a monocular display optical engine, constructing a binocular optical engine requires two such monocular display optical engines, corresponding respectively to the observer's two eyes.
  • In FIGS. 1 to 6, the display device 10 is shown as a display with a thin structure as an example; it may be an active display or a passive display with a backlight. FIGS. 1 to 6 do not involve the specific structure of the light splitting device 20, which is replaced by a virtual frame.
  • The spatial position of the light splitting device 20 relative to the display device 10 is shown schematically for ease of illustration, and does not mean that the actual positional relationship between the two is necessarily as shown in FIGS. 1 to 6 when the light splitting device 20 adopts different specific structures.
  • the display device 10 takes a common RGB display as an example.
  • Each pixel is composed of three sub-pixels emitting R, G, and B light respectively arranged in the x direction, and the sub-pixels emitting light of the same color are arranged adjacently in a row along the y direction perpendicular to the x direction.
  • FIG. 7 only takes a row of sub-pixels in the x-direction as an example, and each sub-pixel is respectively identified by the colors R, G, and B of the light emitted by them.
  • the sub-pixel SP Bn1 has a subscript B to identify the blue light emitted from it.
  • the sub-pixels that emit light of the same color are individually regarded as a sub-pixel group, that is, the sub-pixels of the display device 10 are grouped into a red sub-pixel group, a green sub-pixel group, and a blue sub-pixel group.
  • This patent defines that the ratio of the transmittance of the light emitted by each primitive-color sub-pixel through its corresponding filter to its transmittance through the filters corresponding to the other primitive-color sub-pixels is greater than 9.
  • Thus, when the light emitted by a primitive-color sub-pixel passes through the filters corresponding to other primitive colors,
  • the transmitted light is treated as noise whose impact on the display quality is within a tolerable range.
  • This noise will not be discussed in the following sections, and the projected light of each type of primitive-color sub-pixel is considered unable to pass through the filters corresponding to the other primitive-color sub-pixels.
  • The aperture attached with the red filter F R is the aperture viewing zone VZ R corresponding to the red sub-pixel group, the R in its subscript indicating that the viewing zone carries a red filter.
  • Likewise, the aperture attached with the green filter F G is the aperture viewing zone VZ G corresponding to the green sub-pixel group, the subscript G indicating that the viewing zone carries a green filter; the aperture attached with the blue filter F B is
  • the aperture viewing zone VZ B corresponding to the blue sub-pixel group, the B in its subscript indicating that the viewing zone carries a blue filter.
  • This subscript convention for indicating the color of the filter attached to a viewing zone applies to the following parts of this embodiment. According to the principle described in FIG. 2, when the distance between adjacent viewing zones is small enough, at least two views or/and a joined view can enter the observer's pupil 50, realizing monocular multi-view display.
  • the three primitive color views all enter the pupil 50 of the observer.
  • Introducing orthogonal characteristics to the aperture type spectroscopic device 20 can effectively solve this problem.
  • the orthogonal characteristic may be two linear polarization characteristics whose polarization directions are perpendicular to each other.
  • For example, the aperture viewing zones VZ B1 , VZ G1 , and VZ R1 allow only "·"-polarized blue, green, and red light, respectively, to pass;
  • the aperture viewing zones VZ B2 , VZ G2 , and VZ R2 allow only "-"-polarized blue, green, and red light, respectively, to pass.
  • the sub-pixels SP Bn1 , SP Bn3 , SP Bn5 , ... constitute the spatial blue sub-pixel group 1
  • the sub-pixels SP Bn2 , SP Bn4 , ... constitute the spatial blue sub-pixel group 2.
  • The two mutually perpendicular linear polarization states shown in FIG. 8 can also be replaced by two optical rotation states with opposite rotation directions.
  • the above-mentioned orthogonal characteristic may also be a time-series orthogonal characteristic of transmission at different time points.
  • the apertures VZ B1 , VZ G1 , and VZ R1 are opened at the time point t
  • the apertures VZ B2 , VZ G2 , and VZ R2 are opened at the time point t+ ⁇ t/2.
  • Figure 9 shows the corresponding situation at time t.
  • In this case, the sub-pixels that emit light of the same color are divided in time into two timing sub-pixel groups.
  • The sub-pixels of the two timing sub-pixel groups are exactly the same in space, but in time they work separately at the time points t and
  • t+Δt/2; when the loading information of each sub-pixel is determined, the corresponding viewing zone of the sub-pixel group it belongs to at that time point is different.
  • For example, the timing blue sub-pixel group 1 composed of SP Bn1 , SP Bn2 , SP Bn3 , ... takes aperture VZ B1 as its corresponding aperture viewing zone at time t,
  • and the timing blue sub-pixel group 2 composed of the same sub-pixels SP Bn1 , SP Bn2 , SP Bn3 , ... takes aperture VZ B2 as its corresponding aperture viewing zone at time t+Δt/2.
  • the spatial positions of the viewing zones can be interchanged.
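The timing orthogonal characteristic amounts to a simple schedule over each period Δt; the aperture names follow the VZ labels of the text, and the period value in the example is arbitrary:

```python
def open_aperture_group(t, t0, dt):
    """During each period [t0, t0 + dt): aperture group 1 (VZ_B1, VZ_G1,
    VZ_R1) is open in the first half, group 2 (VZ_B2, VZ_G2, VZ_R2) in
    the second half, matching the t and t + dt/2 time points in the
    text."""
    phase = (t - t0) % dt
    if phase < dt / 2:
        return ["VZ_B1", "VZ_G1", "VZ_R1"]
    return ["VZ_B2", "VZ_G2", "VZ_R2"]
```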
  • The timing orthogonal characteristic can be realized by using a controllable liquid crystal panel, signal-connected to the control device 30, as the aperture array.
  • The above-mentioned orthogonal characteristic may also be a mixed characteristic, for example a mixture of a timing orthogonal characteristic and a linear polarization characteristic.
  • The aperture viewing zones VZ R1 , VZ G1 , and VZ B1 , which allow only "·" light to pass, are attached with filters corresponding to the primitive colors R, G, and B respectively, and within a time period t~t+Δt are opened at time t and closed at time t+Δt/2; the aperture viewing zones VZ R2 , VZ G2 , and VZ B2 , which allow only "·" light to pass, are attached with filters corresponding to the primitive colors R, G, and B respectively,
  • and within a time period t~t+Δt are opened at t+Δt/2 and closed at t;
  • the aperture viewing zones VZ R3 , VZ G3 , and VZ B3 , which allow only "-" light to pass, are respectively attached with filters corresponding to the primitive colors R,
  • G, and B, and within a time period t~t+Δt are opened only at time t and closed at time t+Δt/2;
  • the aperture viewing zones VZ R4 , VZ G4 , and VZ B4 , which allow only "-" light to pass, are respectively attached with filters corresponding to the primitive colors R, G, and B, and within a time period t~t+Δt are opened only at t+Δt/2 and closed at time t.
  • Correspondingly, the sub-pixels with the same emitted-light color are spatially divided into two alternately arranged spatial sub-pixel groups, and each spatial sub-pixel group is divided within a time period into two timing sub-pixel groups. Then, within a time period t~t+Δt, the four mutually independent sub-pixel groups of each emitted-light color can respectively project four views to their corresponding four aperture viewing zones, finally generating a total of 12 viewing zones.
  • The other time periods repeat similarly, and based on persistence of vision the 12 viewing zones can provide the observer's pupil 50 with denser projected views and a larger viewing space.
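Counting the viewing zones generated by this mixed characteristic (3 colors × 2 polarization groups × 2 time slots) can be sketched as:

```python
def viewing_zone_labels(colors=("R", "G", "B"),
                        n_polarization=2, n_time=2):
    """Enumerate the independent aperture viewing zones generated by
    combining a polarization split with a timing split, following the
    VZ_<colour><index> naming used in the text."""
    n = n_polarization * n_time
    return [f"VZ_{c}{i}" for c in colors for i in range(1, n + 1)]
```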
  • the light emitted from each sub-pixel of the same sub-pixel group has the same orthogonal characteristic.
  • Sub-pixels in the same sub-pixel group may also not have completely consistent orthogonal characteristics.
  • The sub-pixels of adjacent area blocks emit light in different orthogonal states. For example, the sub-pixels on area block B 1 all emit "·" light; the sub-pixels on area block B 2 all emit "-" light; the sub-pixels on area block B 3 all emit "·" light.
  • The sub-pixels with the same emitted-light color are each regarded as one sub-pixel group.
  • Although the color of their emitted light is the same, the sub-pixels of each sub-pixel group in different area blocks have different linear polarization characteristics and correspond to different apertures.
  • For example, for the blue sub-pixel group: its sub-pixels on B 1 correspond, at time t within a time period t~t+Δt, to the aperture viewing zone VZ B1 opened at that time, and at t+Δt/2 to the aperture viewing zone VZ B2 opened at that time; its sub-pixels on B 2 correspond at time t to the opened aperture viewing zone VZ B3 , and at t+Δt/2 to the opened aperture viewing zone VZ B4 ; its sub-pixels on B 3 correspond at time t to the opened aperture viewing zone VZ B5 , and at t+Δt/2 to the opened aperture viewing zone VZ B6 .
  • That is, within the sub-pixel group that emits blue light, the sub-pixels on different area blocks project light information through different corresponding viewing zones.
  • In this case, the distribution area of the sub-pixels corresponding to each viewing zone becomes smaller, which can alleviate the restriction that a size-limited aperture viewing zone places on the visible sub-pixel distribution area.
  • This arrangement of sub-pixels and viewing zones means that a sub-pixel group no longer corresponds to a single viewing zone but to BN (≥2) viewing zones.
  • The viewing zone corresponding to each sub-pixel is one of the BN viewing zones corresponding to the sub-pixel group to which it belongs.
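The block-and-slot mapping of the example (B 1 → zones 1/2, B 2 → zones 3/4, B 3 → zones 5/6) follows a simple index rule; the helper below is hypothetical:

```python
def aperture_index(block, time_slot):
    """Viewing-zone index for a sub-pixel in area block `block` (1-based)
    at `time_slot` 1 (time t) or 2 (time t + dt/2): block B1 uses zones
    1 and 2, B2 uses 3 and 4, B3 uses 5 and 6, and so on for BN
    blocks."""
    return 2 * (block - 1) + time_slot
```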
  • Because the number of selectable orthogonal characteristics is limited (for example, linear polarization offers only two mutually perpendicular polarization states), it may happen that the light of sub-pixels on one area block can pass through the aperture viewing zones corresponding to non-adjacent area blocks.
  • In FIG. 11, the sub-pixel SP Bn1 projects noise light transmitted through the viewing zone VZ B5 , and the sub-pixel SP Gn2 projects noise light transmitted through the viewing zone VZ G5 .
  • the above-mentioned noise can be prevented from entering the pupil 50 of the observer through the design of the distribution of the viewing zone.
  • For example, the useful beam projected by the sub-pixel SP Bn1 through its corresponding viewing zone VZ B1 and the noise beam projected through the non-corresponding viewing zone VZ B5 have a relatively large separation angle, which ensures that the noise cannot enter the observer's pupil 50 within the observation area.
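Whether a same-state noise beam can be kept out of the pupil reduces, to first order, to the spacing on the viewing-zone plane between apertures sharing an orthogonal state; a rough screening check with assumed positions:

```python
def min_same_state_spacing(apertures):
    """apertures: list of (x_position, state) on the viewing-zone plane,
    where `state` labels the orthogonal characteristic ('dot'/'dash'
    here).  Returns the smallest spacing between two apertures sharing a
    state; keeping it well above the pupil diameter D_p prevents a
    useful beam and its same-state noise beam from entering one pupil
    together."""
    gaps = [abs(a[0] - b[0])
            for i, a in enumerate(apertures)
            for b in apertures[i + 1:] if a[1] == b[1]]
    return min(gaps) if gaps else float("inf")
```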
  • When the number of viewing zones is large enough, the viewing zones can be divided into two groups, that is, the apertures of the aperture-type light splitting device 20 are divided into two groups, placed respectively at the observer's left pupil 50′ and right pupil 50, as in FIG. 12; the method described in this patent is then applied directly to both eyes of the observer. Furthermore, when the number of groups can correspond to the two eyes of more than one observer, binocular display for multiple observers can be performed.
  • Each aperture can be a strip aperture, which can only be arranged along a one-dimensional direction:
  • along one direction, the size of each aperture is smaller than the observer's pupil diameter D p , while along some other direction its size is larger than D p .
  • An aperture whose size along every direction is smaller than the observer's pupil diameter D p is called a point aperture; point apertures can be arranged along a one-dimensional direction or a two-dimensional direction.
  • When point apertures are used, the above examples of arranging apertures along a one-dimensional direction can be extended to arranging the point apertures in a two-dimensional direction.
  • In the above, adjacent sub-pixels are all arranged spatially offset from one another.
  • The K′ basic-color sub-pixels can also spatially overlap at one point, as in a display method that sequentially projects K′ colors of backlight onto the spatially same sub-pixel through a color wheel.
  • In that case, the time period in which the above process displays one image, for example t~t+Δt/2, needs to be further divided into K′ sub-periods, during which the K′ colors of backlight enter in turn; the same sub-pixel is then correspondingly equivalent to K′
  • sub-pixels of different colors in time sequence, and light information is loaded correspondingly.
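The color-wheel scheme divides one image period into K′ backlight sub-periods; a sketch with an arbitrary period length:

```python
def backlight_schedule(t0, dt, colors=("R", "G", "B")):
    """Split the image period [t0, t0 + dt) into K' equal sub-periods,
    one backlight colour each; during each sub-period the shared
    sub-pixel is loaded as a sub-pixel of that colour."""
    sub = dt / len(colors)
    return [(c, t0 + i * sub, t0 + (i + 1) * sub)
            for i, c in enumerate(colors)]
```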
  • A special case arises when the scene to be displayed, or a part of it, is of a single primitive color,
  • for example when the scene is green.
  • In that case only the green sub-pixel groups project effective light, and fewer than K′ beams superimpose at each displayed object point. To avoid this, the light information of a green spot to be displayed can be designed as white W (R+G+B) plus green G, so that its presentation requires the superposition of one beam from each of the three sub-pixel groups emitting primitive-color light.
  • the application of the method described in this patent is not limited to the values of K′ and K.
  • For example, when each pixel contains R, G, B, and W sub-pixels, monocular multi-view display can also be performed based on the above principles and methods.
  • Because the light emitted by the white sub-pixels can pass through the filters corresponding to the other three primitive colors,
  • an aperture viewing zone that transmits white light must be generated for the white sub-pixel group, and it and the aperture viewing zones
  • corresponding to the other three primitive-color sub-pixel groups must have different orthogonal characteristics, to ensure that the light emitted by each sub-pixel group is transmitted only through its corresponding aperture viewing zones.
  • For example, the sub-pixels emitting primitive-color light and the sub-pixels emitting white light are staggered in time when projecting light information, and their corresponding aperture viewing zones are correspondingly opened at staggered times;
  • or the light emitted by the primitive-color sub-pixels and by the white sub-pixels is respectively left-handed and right-handed polarized light, and their corresponding aperture viewing zones correspondingly allow only left-handed or only right-handed light to pass.
  • the method described in this patent does not restrict the shape of the sub-pixels of the display device 10.
  • Besides the rectangular sub-pixels common in existing display devices, displays whose sub-pixels are designed as squares can also be used.
  • The arrangement of the sub-pixels can also be designed in arrangements other than the above RGB arrangement, for example the PenTile arrangement.
  • the display device 10 is a display with a thin structure for demonstration.
  • the display device 10 can also be other various types of displays, for example, a transmissive or reflective display with a thicker structure that requires a backlight.
  • Each aperture in the aperture type spectroscopic device 20 may also be a reflection type aperture.
  • a projection device 40 may be introduced, such as the projection device 40 placed at the position Po1 in FIG. 5.
  • the introduced projection device 40 and the light splitting device 20 can be placed at a relatively small distance, as shown in FIG. 13.
  • the positions of the projection device 40 and the beam splitting device 20 in FIG. 13 can also be interchanged.
  • the image I 10 of the display device 10 with respect to the projection device 40 is used as an equivalent display device instead of the aforementioned display device 10, which can perform monocular multi-view display based on the same method and process.
  • two groups of apertures are taken as an example; there may also be only one group, or more groups.
  • the different groups may have different orthogonal characteristics as described in FIGS. 7 to 11, such as different timing characteristics, or different optical rotation characteristics, or composite characteristics.
  • the structure of introducing the projection device 40 is often used as an eyepiece of the near-eye display optical engine, and two such eyepieces build a binocular display optical structure, as shown in FIG. 14.
  • the structure shown in FIG. 13 can be further expanded into a composite structure to optimize its display performance and optical structure size.
  • the structure shown in Figure 15 increases the number and density of projection views by overlapping the images of the two display devices formed by their corresponding projection devices; the structure shown in Figure 16 seamlessly splices the images of the two display devices formed by their corresponding projection devices, and the resulting spliced image I 10 +I 10 ′ enlarges the display viewing angle;
  • Figure 17 is similar to Figure 16, but the spliced image adopts curved surface splicing.
  • Fig. 18 introduces an auxiliary imaging device 80, which performs secondary imaging of the spliced image formed by each display device through its corresponding projection device.
  • the image I 10 and the image I 10 ′ are combined to form a combined image I 10 +I 10 ′, where the image I 10 is the image of the display device 10 formed by the projection device 40, and the image I 10 ′ is the image of the display device 10′ formed by the projection device 40′.
  • the combined structure of a display device, a projection device, and an aperture-type light splitting device, for example the combined structure of the display device 10, the projection device 40, and the aperture-type light splitting device 20, can be designed with a relatively small overall size and placed in the small hole of the thin eyepiece structure shown in Figure 19, so that the display structure is truly light and thin.
  • the compensation device 801 is used to eliminate the influence of the auxiliary imaging device 80 on the external ambient light, so that the external ambient light enters the pupil 50 of the observer without distortion or small distortion. In the case that no external ambient light is required to be incident, the compensation device 801 can be eliminated.
  • between the auxiliary imaging device 80 and the compensation device 801 is a supporting medium, such as optical glass, which provides support for mounting the combined structure of the display device, the projection device, and the aperture-type beam splitting device.
  • the auxiliary imaging device 80 and the compensation device 801 can be removed.
  • more display devices, projection devices, and aperture-type light splitting devices are combined into a structure that can realize the projection of more mosaic images. These mosaic images can overlap, or partially overlap, or be located at different depths.
  • the images of the display device 10 and the display device 10′′ are combined into a combined image I 10 +I 10′′
  • the images of the display device 10′ and the display device 10′′′ are combined into a combined image I 10′ +I 10′′′ .
  • the two mosaic images can overlap in depth or be staggered in depth; on the plane perpendicular to the depth direction, they can overlap completely or overlap only partially with a lateral offset. For clarity, some components are not marked in the figure.
  • the two mosaic images in Fig. 20 are an example of being staggered in the depth direction and partially overlapping on the plane perpendicular to it.
  • each beam splitting device 20 can also be placed at a position between the corresponding projection device and display device; the combined structures can also be arranged along a curved surface, or the arrangement can be expanded along a two-dimensional surface. Each of the above figures may also include more combined structures.
  • a relay device 60 can also be introduced to guide the projection light of each sub-pixel to propagate to the area where the pupil 50 of the observer is located, such as the transflective surface shown in FIG. 6.
  • the relay device 60 can be selected from various other optical devices or optical device components.
  • the relay device 60 and the projection device 40 can be placed independently of each other, or they can share some components with the projection device 40.
  • for example, the free-form surface composite device shown in FIG. 21.
  • the free-form surface composite device is composed of a transmissive curved surface FS1, a reflective curved surface FS2, a transflective curved surface FS3, a transmissive curved surface FS4, and a transmissive curved surface FS5.
  • FS1, FS2, FS3, and FS4 perform the function of the projection device 40 together
  • FS2, FS3 perform the function of the relay device 60
  • FS5 has a compensation modulation function, allowing external ambient light to enter the pupil 50 of the observer without being affected by FS3 and FS4.
  • the relay device 60 can also be an optical waveguide device, which is referred to as an optical waveguide type relay device 60.
  • the optical waveguide type relay device 60 is placed between the display device 10 and the aperture type beam splitting device 20, and includes an entrance pupil 611, an in-coupling device 612, an optical waveguide body 613, reflecting surfaces 614a and 614b, an out-coupling device 615, and an exit pupil 616.
  • the projection device 40 includes two components, a lens 40a and a lens 40b.
  • the emitted light is converted into parallel light by the component 40a of the projection device 40; and then enters the coupling device 612 through the entrance pupil 611 of the optical waveguide type relay device 60;
  • the in-coupling device 612 guides the parallel light from the sub-pixel p 0 into the optical waveguide body 613, where it propagates to the out-coupling device 615 via reflections at the reflective surfaces 614a and 614b; the out-coupling device 615 modulates the incident light beam and guides it through the exit pupil 616 into the component 40b of the projection device 40;
  • the component 40b of the projection device 40 then guides the light projected by the sub-pixel p 0 toward the aperture type light splitting device 20 and modulates it so that its reverse extension converges at the virtual image p′ 0 .
  • the virtual image p′ 0 is the virtual image of the sub-pixel p 0 .
  • p′ 1 is the corresponding virtual image of the sub-pixel p 1 .
  • the sub-pixel images such as p′ 0 and p′ 1 form the image I 10 of the display device 10.
  • the out-coupling device 615 can have a pupil expansion function: it guides part of the incident light beam out toward the exit pupil while letting the rest continue along the original transmission path; that remaining light is reflected by the reflective surface 614b, re-enters the out-coupling device 615 for further out-coupling and reflection, and this repeats until the light beam from each sub-pixel covers the exit pupil 616.
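The repeated partial out-coupling described above lends itself to a small numeric sketch. The function below is our own illustration, not part of the patent: a beam of unit power meets the out-coupling device n times, and choosing the k-th out-coupling efficiency as 1/(n−k) makes every out-coupled sub-beam carry equal power, a common design goal for uniform coverage of the expanded exit pupil.

```python
def outcoupled_powers(n):
    """Power of each of n out-coupled sub-beams when the k-th interaction
    (k = 0..n-1) couples out a fraction 1/(n-k) of the remaining power.

    This efficiency schedule is an illustrative assumption (not a value
    stated in the patent); it yields equal sub-beam powers of 1/n each,
    so the expanded exit pupil is covered uniformly.
    """
    remaining = 1.0
    powers = []
    for k in range(n):
        frac = 1.0 / (n - k)       # out-coupling efficiency at this bounce
        powers.append(remaining * frac)
        remaining *= 1.0 - frac    # the rest continues along the waveguide
    return powers
```

With five interactions, each sub-beam carries one fifth of the input power, and the whole beam is extracted by the last bounce.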
  • the compensation device 801 is used to compensate the influence of the component 40b of the projection device 40 on the incident light of the external environment, and can be removed when the external environment light is not needed.
  • the projection device component 40b in the figure can also be combined with the out-coupling device 615: for example, a holographic device placed at the position of the out-coupling device 615 in the figure can perform the functions of both the out-coupling device 615 and the projection device component 40b.
  • FIG. 22 only takes a commonly used optical waveguide device as an example.
  • the existing optical waveguide devices of various structures can actually be used as the optical waveguide relay device 60 of this patent.
  • the coupling device 612 is a reflective surface.
  • the three optical waveguide device components are respectively responsible for propagating and guiding the R, G, and B light beams, and their in-coupling and out-coupling devices can each be designed for the wavelength of the beam they transmit, reducing dispersion effects.
  • the optical waveguide type relay device 60 is placed between the display device 10 and the aperture type light splitting device 20.
  • the optical waveguide type relay device 60 may also be placed in front of the aperture type beam splitting device 20 along the beam transmission direction, as shown in FIG. 24.
  • the light emitted from each sub-pixel is converted into parallel light by the component 40a of the projection device 40, and then enters the aperture type beam splitting device 20; then enters the coupling device 612 through the entrance pupil 611 of the optical waveguide type relay device 60;
  • the parallel light of each sub-pixel in the optical waveguide 613 propagates to the out-coupling device 615 based on the reflection of the reflective surfaces 614a and 614b;
  • the out-coupling device 615 modulates the incident light beam and guides it to enter the assembly 40b of the projection device 40 through the exit pupil 616;
  • the component 40b of the projection device 40 guides the light projected by each sub-pixel toward the aperture type light splitting device 20 and modulates it so that its reverse extension converges at a virtual image; thus, based on the projection device 40 composed of the components 40a and 40b and the waveguide type relay device 60, a virtual image I 10 of the display device 10 is generated.
  • the transmitted light at each point on the aperture type spectroscopic device 20 is divergent light.
  • each point on the aperture type beam splitting device 20 forms a unique image through the optical waveguide relay device 60 and the component 40b of the projection device 40, and the combination of these point images is the unique image of the aperture type light splitting device 20.
  • the projected position of the image I 20 of the spectroscopic device 20 is related to the specific parameters of each component of the optical structure. For example, it may be the position shown in FIG. 24.
  • the image I 10 of the display device 10 is used as an effective display device, and each aperture image on the I 20 is used as an effective viewing zone, and a monocular multi-view display is realized based on the foregoing principles and methods.
  • when the optical waveguide type relay device 60 has a pupil expansion function, multiple images of the aperture type light splitting device 20 are generated by the pupil expansion.
  • in this case, the beams that each sub-pixel projects toward the different images of the aperture type beam splitter at the same time point are required to have a sufficiently large angular spacing, to ensure that they do not enter the pupil of the observer simultaneously.
  • according to the pupil position, the control device 30 determines the unique beam projected by each sub-pixel that is incident on the pupil 50 of the observer; based on the direction of this beam, the light information loaded on the sub-pixel is determined according to the aforementioned method.
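The selection performed by the control device can be illustrated with a simplified one-dimensional sketch (our own construction; the zone positions and pupil values below are hypothetical, not patent parameters): each sub-pixel can aim a beam at any viewing-zone center, and only the zone or zones falling inside the tracked pupil are assigned light information.

```python
def zones_entering_pupil(zone_xs, pupil_x, pupil_d):
    """Return the viewing-zone centers whose beams enter the pupil.

    Simplifying assumption: the viewing zones lie on the plane of the
    observer's pupil, so a beam aimed at zone center x reaches the pupil
    plane at x and enters the pupil iff |x - pupil_x| < pupil_d / 2.
    """
    return [x for x in zone_xs if abs(x - pupil_x) < pupil_d / 2]

# Zones pitched 2 mm apart with a 4 mm pupil: at least two zones fall
# inside the pupil, which is the monocular multi-view condition.
selected = zones_entering_pupil([-4.0, -2.0, 0.0, 2.0, 4.0], 0.5, 4.0)
```

Each selected zone then fixes the direction of the corresponding sub-pixel beam, and the sub-pixel is loaded with the scene light information along that direction.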
  • the aperture type spectroscopic device 20 may also be at the position Po5 shown.
  • FIG. 24 takes the reflective surface as the coupling-in device 612 and the coupling-out device 615.
  • Fig. 24 can also use the optical waveguide device selected in Fig. 22 or other optical waveguide devices.
  • the compensation device 801 can also be introduced, or the component 40b of the projection device 40 can be combined with the coupling out device 615.
  • the components 40c, 40d, and 40e are designed to constitute the projection device 40; through 40c, 40d, the components of the relay device 60, and 40e, the image I 20 of the aperture type light splitting device 20 is formed.
  • the projection light of each sub-pixel is converted into a parallel state by the component 40c of the projection device 40; the divergent light transmitted at each point on the aperture type light splitting device 20 passes through the component 40d of the projection device 40 and enters the optical waveguide relay device 60 in a parallel state.
  • the assembly 40e of the projection device 40 condenses the parallel light projected from each point on the aperture type light splitting device 20 to form an image I 20 of the aperture type light splitting device 20.
  • in the optical structure shown in FIG. 25, the light projected by each sub-pixel can be made to correspond to a unique image, and these images combine into the image I 10 of the display device 10.
  • the position of I 10 is related to the specific parameters of the light path, such as the position shown in Figure 25.
  • when the optical waveguide type relay device 60 has a pupil expansion function, each sub-pixel generates multiple images due to the pupil expansion. In this case, the different images corresponding to a sub-pixel at the same time point must have a sufficiently large spacing, to ensure that they do not enter the pupil 50 of the observer simultaneously.
  • the tracking device 70 determines the position of the pupil 50 of the observer in real time
  • according to the pupil position, the control device 30 determines the light beam projected by each sub-pixel that is incident on the pupil 50 of the observer, and accordingly determines the loading information of the sub-pixels.
  • in FIG. 25, the component 40c of the projection device 40 can also be eliminated.
  • when the optical waveguide type relay device 60 is used, FIGS. 22, 24, and 25 take as examples the cases where the light transmitted at each point of the aperture type beam splitting device, or the light emitted from each sub-pixel of the display device 10, is converted into parallel light before entering the optical waveguide type relay device 60.
  • the display device 10 selects a passive display that requires a backlight source, and multiple backlight sources are required to form the backlight source array 110.
  • the device that images each backlight source of the backlight source array 110 serves as the light splitting device 20 and is named the imaging type light splitting device 20, as shown in FIG. 26.
  • the display device 10 takes a common RGB display as an example. Each pixel is composed of three sub-pixels that modulate and project R (red), G (green), and B (blue) light, arranged along the x direction.
  • FIG. 26 only takes a row of sub-pixels in the x-direction as an example, and the sub-pixels are respectively identified by the colors R, G, and B of the light projected by them.
  • the sub-pixel SP Bn1 has a subscript B to identify that its projected light is blue.
  • the sub-pixels that emit light of the same color are individually regarded as a sub-pixel group, that is, the sub-pixels of the display device 10 are grouped into a red sub-pixel group, a green sub-pixel group, and a blue sub-pixel group.
  • FIG. 26 specifically uses a lens as an example of the imaging type spectroscopic device 20.
  • the backlight source BS B corresponds to the blue sub-pixel group, BS G to the green sub-pixel group, and BS R to the red sub-pixel group.
  • when the projection light of a backlight source transmits through a non-corresponding sub-pixel group, noise may be generated because the light transmittance is not exactly zero. This noise can be ignored, and it is considered that the primary-color light emitted by a backlight source does not transmit through the non-corresponding primary-color sub-pixels.
  • at the image of each backlight source, only the light information projected by that backlight source's corresponding sub-pixel group is visible; that is, the light distribution area of the image of each backlight source, defined by the backlight source's corresponding sub-pixel group, serves as the corresponding viewing zone.
  • the light emitted from each sub-pixel of any sub-pixel group is guided to the viewing zone corresponding to the sub-pixel group through the imaging type light splitting device 20.
  • according to the principle described in FIG. 2, when the distance between adjacent viewing zones is small enough, at least two views or/and a split view can enter the pupil 50 of the observer, realizing monocular multi-view display.
  • each viewing zone is the image of the corresponding backlight source formed through the imaging type light splitting device; its size along a certain direction refers to the extent, within the light distribution range of the backlight image in that direction, occupied by the light distribution area whose intensity exceeds 50% of the peak intensity.
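The 50%-of-peak size criterion can be made concrete with a small numeric sketch (our own illustration, not from the patent): for a sampled intensity profile, the viewing-zone size is the extent of the samples above half the peak, which for a Gaussian backlight-image profile is the familiar full width at half maximum, about 2.355 σ.

```python
import numpy as np

def half_peak_width(xs, intensity):
    """Extent of the light distribution whose intensity exceeds 50% of
    the peak -- the viewing-zone size definition used in the text."""
    above = xs[intensity > intensity.max() / 2.0]
    return float(above.max() - above.min())

xs = np.linspace(-5.0, 5.0, 100001)
gauss = np.exp(-xs**2 / 2.0)          # assumed sigma = 1 image profile
width = half_peak_width(xs, gauss)    # the Gaussian FWHM, ~2.355
```

The same measurement applied along the zone arrangement direction tells whether a backlight image is smaller than the observer's pupil diameter D p, as the later dot-shaped/strip-shaped distinction requires.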
  • the relative positional relationship between the display device 10 and the imaging type light splitting device 20 shown in the figure is not mandatory; provided that the light projected by each backlight covers the display device 10, the display device 10 can also be placed at the position Po2, Po3, or Po4 in the figure.
  • "-" and “ ⁇ ” are two orthogonal linear polarization states with polarization directions perpendicular to each other.
  • the backlight sources BS B1 , BS G1 , and BS R1 respectively project blue, green, and red " ⁇ ” light
  • the backlight sources BS B2 , BS G2 , and BS R2 respectively project blue, green, and red "-” lights.
  • each backlight source is imaged by the imaging type light splitting device 20 as the viewing zone of its corresponding spatial sub-pixel group, through which the projected light of each sub-pixel of that group passes.
  • the two mutually perpendicular linear polarization states shown in Fig. 27 can also be replaced by two optical rotation states with opposite rotation directions.
  • the above-mentioned orthogonal characteristic may also be a time-series orthogonal characteristic of transmission at different time points.
  • the backlight source group composed of the backlight sources BS B1 , BS G1 , and BS R1 projects the backlight at the time point t and does not project at the time point t+Δt/2; the backlight source group composed of BS B2 , BS G2 , and BS R2 is turned on at time t+Δt/2 and does not project at time t.
  • Figure 28 shows the corresponding situation at time t.
  • the sub-pixels emitting light of the same color are divided into two groups of time-series sub-pixel groups in time sequence.
  • the sub-pixels of the two time-sequential sub-pixel groups are exactly the same in space, but work at the time points t and t+Δt/2 respectively, and their corresponding viewing zones are different.
  • at time t, the blue sub-pixel group composed of SP Bn1 , SP Bn2 , SP Bn3 , ... takes VZ B1 as its corresponding viewing zone;
  • at time t+Δt/2, the blue sub-pixel group composed of the same sub-pixels takes VZ B2 as its corresponding viewing zone.
  • a larger M value requires the display device 10 to have a higher frame rate to avoid the occurrence of flicker effects.
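The frame-rate requirement can be quantified under a hedged assumption: if each of the M time-sequential groups must be refreshed once per flicker-free period (60 Hz is an assumed threshold here, not a figure from the patent), the panel frame rate scales linearly with M. The staggered turn-on schedule is the same arithmetic.

```python
def panel_frame_rate(m_groups, flicker_free_hz=60):
    """Minimum panel frame rate for M time-multiplexed sub-pixel groups:
    every group needs one refresh within each flicker-free period."""
    return m_groups * flicker_free_hz

def group_open_times(m_groups, dt=1.0):
    """Staggered turn-on offsets within one period dt: group k opens at
    k*dt/M. The two-group case gives t and t + dt/2, as in the text."""
    return [k * dt / m_groups for k in range(m_groups)]
```

For example, M = 2 groups need a 120 Hz panel, and M = 4 (the later polarization-plus-timing case) needs 240 Hz, which is why a larger M demands a higher frame rate to avoid flicker.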
  • the spatial positions of the backlight sources can be interchanged.
  • the realization of the timing orthogonal characteristic can be realized by the controller 30 controlling the switches of each light source group, and refreshing the sub-pixels of the display device 10 synchronously with corresponding light information.
  • the orthogonal characteristic may also be a mixed characteristic, for example, a mixed characteristic of a timing orthogonal characteristic and a linear offset characteristic.
  • the backlight sources BS R1 , BS G1 , and BS B1 emitting "⊥" light are turned on at time t and turned off at time t+Δt/2;
  • the backlight sources BS R2 , BS G2 , and BS B2 emitting "-" light are turned on at time t and turned off at time t+Δt/2;
  • the backlight sources BS R3 , BS G3 , and BS B3 emitting "⊥" light are turned on at time t+Δt/2 and turned off at time t;
  • the backlight sources BS R4 , BS G4 , and BS B4 emitting "-" light are turned on at time t+Δt/2 and turned off at time t.
  • each sub-pixel with the same color of the emitted light is spatially divided into two spatial sub-pixel groups, and the two spatial sub-pixel groups only receive and modulate the " ⁇ " light and the "-" light respectively.
  • each spatial sub-pixel group is further divided into two time-sequential sub-pixel groups within the time period t~t+Δt. Then, within the time period t~t+Δt, the four independent sub-pixel groups of the sub-pixels with the same emitted-light color respectively take the images of their corresponding backlight sources as their viewing zones and project the corresponding views, for example SP Bn1 , SP Bn3 , ...
  • the pupil 50 of the observer can be provided with a denser viewing area and a larger viewing space.
  • the number of view zones shown in Figure 27 to Figure 29 increases as the degree of multiplexing increases.
  • the generated viewing zones can be divided spatially into two groups, corresponding respectively to the left pupil 50' and the right pupil 50 of the observer, as shown in Figure 30, so that the method described in this patent is directly applied to both eyes of the observer.
  • the spatial position distribution of the backlight sources of the backlight source array 110 also changes accordingly.
  • when the number of viewing zone groups corresponds to the eyes of more than one observer, binocular display for multiple observers can be performed.
  • each backlight source can be a strip-shaped backlight source that can only be arranged in a one-dimensional direction.
  • the image of each backlight source has a size smaller than the observer's pupil diameter D p along the arrangement direction, and a size larger than D p along some other direction. Other situations are also possible.
  • in another case, the size of each backlight image is smaller than the pupil diameter D p of the observer; this is called a dot-shaped backlight, which can be arranged in a one-dimensional or a two-dimensional direction.
  • Each backlight source can have a certain size.
  • the size of the image of each backlight source in a certain direction herein refers to the size occupied by the light distribution area with a light intensity value greater than 50% of the peak light intensity within the light distribution range of the backlight image in that direction.
  • when the strip backlight is selected, another design method can also be adopted to realize coverage of the observer's two eyes by the viewing zones. As shown in Fig. 32, by designing a larger angle by which the viewing zone arrangement direction deviates from the observer's binocular connection direction, the coverage along the binocular connection direction x′ of the light information projected through each viewing zone is increased. Fig. 32 takes as an example the case where the pupils 50 and 50′ of the observer lie exactly on the distribution plane of the viewing zones. Each sub-pixel group of the display device 10 projects light through the light splitting device 20 to the viewing zones VZ R1 , VZ G1 , VZ B1 , ..., respectively.
  • the coverage size of the multiple viewing areas on the surface along the arrangement direction x is represented by D cv .
  • the more the viewing zone arrangement direction x deviates from the observer's binocular connection direction x′, that is, the smaller the angle shown in the figure, the larger the coverage size of the viewing zones along the x′ direction, which is more favorable for the viewing zones to provide a larger viewing area along the binocular connection direction x′.
  • the figure takes the case where the x direction is rotated clockwise away from the x′ direction as an example; it can also be rotated counterclockwise away from the x′ direction.
  • each viewing zone also requires that the distance between adjacent viewing zones along the x-direction be smaller than the pupil diameter D p of the observer.
  • D ee is the distance between the observer's eyes.
  • the minimum value of the angle should also be restricted to prevent the light information emitted through the same viewing zone from being incident on the observer's eyes at the same time.
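The rotated-layout constraints can be sketched with a simple trigonometric model. This is our own parametrization of the Fig. 32 idea, with illustrative numbers: strip-shaped zones pitched along x, each strip elongated perpendicular to x, and x rotated away from the binocular direction x′.

```python
import math

def strip_zone_layout(strip_len, zone_pitch, dev_deg, d_p, d_ee):
    """Hedged sketch of the rotated viewing-zone geometry (assumptions:
    point pitch `zone_pitch` along x, strip length `strip_len`
    perpendicular to x, deviation angle `dev_deg` between x and x').

    All parameter names and both checks are illustrative, not figures
    taken from the patent.
    """
    th = math.radians(dev_deg)
    per_zone_span = strip_len * math.sin(th)   # one strip's span along x'
    pitch_xp = zone_pitch * math.cos(th)       # adjacent-zone pitch along x'
    return {
        "per_zone_span": per_zone_span,
        "pitch_along_xp": pitch_xp,
        # adjacent zones should stay closer than one pupil diameter
        "dense_enough": pitch_xp < d_p,
        # one zone's light must not span both eyes at once
        "single_eye_only": per_zone_span < d_ee,
    }
```

The second check mirrors the angle restriction above: a larger deviation widens each zone's span along x′, so the angle must stay small enough (here, span below the interocular distance D ee) that light through a single viewing zone cannot reach both eyes simultaneously.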
  • the adjacent sub-pixels are all arranged in a spatially offset manner.
  • the K′ basic color sub-pixels can also be spatially overlapped at one point.
  • the K′ backlight sources in each backlight source group of the backlight source array 110 are required to sequentially project K′ kinds of basic colors in turn.
  • in this case, the time period for displaying one image, for example t~t+Δt/2, needs to be further divided into K′ sub-periods, during which the backlights of the K′ colors are turned on in turn, and the corresponding equivalent time-sequential sub-pixels of the different colors are correspondingly loaded with light information.
  • a special case occurs when the scene to be displayed, or part of it, is a single primary color, for example green.
  • the light information of a green light spot to be displayed can be designed as white W (R+G+B) plus green G, which requires the superposition of several primary beams instead of just one green beam, avoiding the occurrence of the above phenomenon.
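One way to realize this design is to split each target color into a white component plus residual primaries. The sketch below is our own illustration; the `w_floor` parameter is a hypothetical minimum white level, not a value from the patent.

```python
def decompose_with_white(r, g, b, w_floor=0.05):
    """Split an RGB target into a white component plus residual primaries.

    `w_floor` is a hypothetical minimum white level: even for a saturated
    color it keeps all primary beams active (at the cost of a slight
    desaturation), so each displayed light spot is always built from the
    superposition of several beams rather than a single one.
    """
    w = max(min(r, g, b), w_floor)
    return w, max(r - w, 0.0), max(g - w, 0.0), max(b - w, 0.0)

# A pure green target (0, 1, 0) becomes white 0.05 plus green 0.95,
# so red, green, and blue beams all stay active for this spot.
w, r2, g2, b2 = decompose_with_white(0.0, 1.0, 0.0)
```

The reconstructed color is (w+r2, w+g2, w+b2) = (0.05, 1.0, 0.05), slightly desaturated; that trade-off is the assumption behind the floor.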
  • the application of the method described in this patent is not limited to the above values of K′ and K, for example K = 3.
  • the white light projected by the W-light backlight will also transmit through the other three types of primary-color sub-pixels and generate noise.
  • the backlight source that emits primitive color light and the backlight source that emits white light are turned on by the control device 30 in a timing staggered manner and emit the backlight, and their corresponding sub-pixel groups respectively load light information synchronously and correspondingly.
  • the light emitted by the backlight source emitting elementary color light and the backlight source emitting white light are left-handed light and right-handed light respectively, and their corresponding sub-pixel groups respectively receive and modulate only left-handed light and right-handed light respectively.
  • the method described in this patent allows the sub-pixels of the display device 10 to choose different shapes, for example, it can be a rectangle commonly seen in existing display devices, or it can be designed as a square.
  • the arrangement of the sub-pixels can also be designed in various ways, such as a pencil arrangement.
  • the display device 10 is a transmissive display device 10 as an example.
  • the display device 10 may also be a reflective display device.
  • the spatial position of each backlight is not limited to the plane arrangement, and they can also be arranged in a staggered depth.
  • Each of the above structures can also be used as a basic structure, and two or more of the basic structures are combined into a composite structure to increase the viewing angle.
  • the backlight source array 110, the display device 10 and the imaging type light splitting device 20 constitute the basic structure
  • the backlight source array 110', the display device 10' and the imaging type light splitting device 20' constitute another basic structure.
  • the viewing zones generated by the two basic structures are both placed in front of the pupil 50 of the observer, and their display devices are arranged adjacent to each other, which can realize the expansion of the display viewing angle.
  • the projection device 40 can be introduced to project the image of the display device 10. Since both the imaging type spectroscopic device 20 and the projection device 40 have imaging functions, there is mutual influence between the two, and there are often shared components.
  • the components 21 and 22 constitute an imaging type spectroscopic device 20, and the component 22 is also a projection device 40, imaging the enlarged virtual image I 10 of the display device 10.
  • the components 21 and 22 of the imaging type spectroscopic device 20 both take lenses as examples, and the backlight sources of the backlight source array 110 are placed on the front focal plane of the component 21.
  • the distance between the backlight source of the backlight source array 110 and the component 21 may not be equal to the focal length of the component 21, and the backlight sources of the backlight source array 110 may even be at different depths along the direction of the emitted light transmission.
  • the component 22 and the component 21 may be other physical optical devices.
  • the imaging type spectroscopic device 20 and the projection device 40 can be realized by a single lens device.
  • the image I 10 of the display device 10 with respect to the projection device 40 is used as an equivalent display device instead of the aforementioned display device 10, which can perform monocular multi-view display based on the same method and process.
  • the structure of introducing the projection device 40 is often used as an eyepiece of the near-eye display optical engine, and two such eyepieces build a binocular display optical structure, as shown in FIG. 36.
  • a relay device 60 may also be introduced to guide the projected light of each sub-pixel to propagate to the area where the pupil 50 of the observer is located, such as the transflective surface shown in FIG.
  • the relay device 60 can also be selected from various other optical devices or optical device components. For example, in FIG. 37, a relay device 60 constructed by mirrors 61a, 61b, and 61c placed at each viewing zone; in FIG. 38, a relay device 60 constructed by mirrors 62, 63a, 63b, and 63c; in FIG.
  • the relay device 60 constructed by the transflective surface 64, the reflective surface 65a, the reflective surface 65b, and the reflective surface 65c; in FIG. 40, the relay device 60 constructed by the angular characteristic surface 66, the reflective surface 67a, the reflective surface 67b, and the reflective surface 67c .
  • the display device 10 is a reflective display device.
  • the angle characteristic surface 66 in FIG. 40 has a transmissive property for the light from the backlight that is incident nearly perpendicularly, and has a reflective property for the light beam that is reflected from the display device 10 and incident at a larger incident angle.
  • Figure 41 shows the optical structure using a free-form surface composite device.
  • the free-form surface composite device is composed of a transmissive curved surface FS1, a reflective curved surface FS2, a transflective curved surface FS3, a transmissive curved surface FS4, and a transmissive curved surface FS5.
  • FS1, FS2, FS3, and FS4 together perform the functions of the imaging type spectroscopic device 20 and the projection device 40;
  • FS2 and FS3 perform the function of the relay device 60;
  • FS5 has a compensating modulation function, allowing external ambient light to enter the pupil 50 of the observer without being affected by FS3 and FS4.
  • a lens may also be placed between the backlight source array 110 and the display device 10 as a component of the imaging type light splitting device 20 to converge the light emitted by each backlight source.
  • the backlight source array 110 may include one backlight source group or two backlight source groups, as shown by way of example, and may also contain more backlight source groups.
  • the different groups may have the different orthogonal characteristics described in FIGS. 27 to 29, such as different time-sequential characteristics, different optical rotation characteristics, different polarization characteristics, or composite characteristics.
  • the relay device 60 can also be an optical waveguide device, called the optical waveguide relay device 60, which includes an entrance pupil 611, a coupling-in device 612, an optical waveguide body 613, reflecting surfaces 614a and 614b, a coupling-out device 615, and an exit pupil 616.
  • the optical waveguide type relay device 60 is placed between the backlight source array 110 and the display device 10 to guide the projected light of each backlight source of the array 110 into the display device 10 along its respective vector direction.
  • as shown in FIG. 42, the light emitted by the backlight source BS B1 is converted into parallel light by the component 21 of the imaging type light splitting device 20; it then enters the coupling-in device 612 through the entrance pupil 611 of the optical waveguide relay device 60; the coupling-in device 612 guides the parallel light from the backlight source BS B1 into the optical waveguide body 613, where it propagates toward the coupling-out device 615 via reflections at the surfaces 614a and 614b; the coupling-out device 615 modulates the incident beam and guides it through the exit pupil 616 onto the display device 10; the component 22 of the imaging type light splitting device 20, which also serves as the projection device 40, condenses the projected light to form the image of the backlight source BS B1, and this image serves as the viewing zone VZ B1 of the sub-pixel group corresponding to the backlight source BS B1.
  • the component 22 of the imaging type spectroscopic device 20 also serves as the projection device 40 and projects the image I 10 of the display device 10.
  • the optical waveguide type relay device 60 also participates in the imaging of the backlight and the imaging of the display device 10.
  • the coupling-out device 615, constructed from surfaces 615a, 615b, and 615c that each partially reflect the incident light, can have a pupil-dilation function: part of the incident light continues along the original transmission path, is reflected by the reflective surface 614b, and then enters the next component of the coupling-out device 615 to be coupled out and reflected, repeating until the beams from the backlight sources cover the display device 10.
  • the optical waveguide type relay device 60 may also be placed downstream of the display device 10 along the light propagation direction. As shown in FIG. 43, the light emitted from each backlight source is converted into parallel light by the component 21 of the light splitting device 20 and enters the display device 10; it is then guided by the optical waveguide relay device 60 into the component 22 of the light splitting device 20, and is condensed so that the images of the multiple backlight sources form the viewing zones of the sub-pixel groups corresponding to those backlight sources.
  • the component 22 of the light splitting device 20 also serves as the projection device 40.
  • the optical waveguide type relay device 60 also participates in imaging the backlight sources and imaging the display device 10. In FIG. 43, the front-to-back positional relationship of the component 21 of the light splitting device 20 and the display device 10 can also be interchanged.
  • each backlight is a point light source.
  • the backlight source may be the aforementioned point light source or a strip light source of a certain size; correspondingly, the light emitted from each point on the backlight source is converted into parallel light propagating along the corresponding vector direction.
  • FIG. 42 and FIG. 43 take the case where the light emitted from each point of the backlight is converted into parallel light and enters the optical waveguide type relay device 60 as an example.
  • the positional relationship of the relevant components in FIG. 42 and FIG. 43 can also be changed, or new components introduced, so that the light emitted from each point of the backlight enters the optical waveguide type relay device 60 in a non-parallel state; this includes both the case where the emitted light of each sub-pixel of the display device 10 is converted into parallel light before entering the optical waveguide type relay device 60, and the case where it is not.
  • the core idea of the present invention is to use sub-pixels as basic display units, and display monocular focusable spatial scenes through spatial superposition of sub-pixel projection beams.
  • with the display device 10 comprising K′ classes of sub-pixels emitting light of different colors, the projected light of K′ sub-pixels belonging to different classes can be superimposed to form a spatial color light spot.
  • the method described in this patent can thereby increase the number of viewing zones to K′ times that of pixel-based methods, effectively improving the feasibility of monocular multi-view technology.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A three-dimensional display method based on the spatial superposition of sub-pixel emitted light. Taking the sub-pixels of a display device (10) as the basic display units, the sub-pixels emitting light of the same color each form a single sub-pixel group or are divided into several sub-pixel groups. Under the control of a light splitting device (20), these sub-pixel groups project more than one image of the scene to be displayed into the same pupil (50) of an observer, realizing monocularly focusable three-dimensional scene display based on monocular multi-view. Projected light from sub-pixels of different colors is superimposed, along the respective corresponding vector directions, at each displayed object point to form spatial color light spots, converting conventional display based on planar color pixels stitched together from sub-pixels of different colors into display based on spatially distributed color light spots formed by superimposing projected beams from sub-pixels of different colors. The light splitting device (20) guides the beam from each sub-pixel, with its divergence angle restricted, along its corresponding vector direction toward the viewing zone corresponding to the sub-pixel group to which the sub-pixel belongs.

Description

Three-Dimensional Display Method Based on Spatial Superposition of Sub-Pixel Emitted Light

Technical Field

The present invention relates to the field of three-dimensional display technology, and more specifically to a three-dimensional display method based on spatial superposition of sub-pixel emitted light.
Background Art

Compared with conventional two-dimensional display, three-dimensional display can provide optical information whose dimensionality matches the real world people perceive, and it is receiving increasing attention. Stereoscopic techniques based on binocular parallax (including autostereoscopy) project a corresponding two-dimensional image to each of the observer's eyes and trigger depth perception through the intersection of the two viewing directions at the off-screen scene. However, to see its corresponding two-dimensional image clearly, each eye must remain focused on the image display plane, causing the focus-convergence conflict: an inconsistency between the monocular focusing depth and the binocular convergence depth. This inconsistency contradicts the natural physiological habit of consistent monocular focusing depth and binocular convergence depth when viewing real three-dimensional scenes, causes visual discomfort, and is currently a bottleneck hindering the adoption of three-dimensional display technology. Monocular multi-view is an effective technical route to solving the focus-convergence conflict. It uses a light splitting device to guide a display device to project at least two two-dimensional images of the scene to be displayed into the same eye of the observer, so that at least two beams pass through each displayed object point and enter that same eye; when the intensity distribution of the spot formed by superimposing these beams at the displayed object point can draw the observer's eye to focus freely on the superimposed spot, the focus-convergence conflict is overcome.
Summary of the Invention

The present invention proposes a three-dimensional display method based on spatial superposition of sub-pixel emitted light. Taking the sub-pixels of a display device as the basic display units, the sub-pixels emitting light of the same color each serve alone as one sub-pixel group or are divided into several sub-pixel groups. The resulting sub-pixel groups project multiple images of the scene to be displayed into the region where the observer's pupil is located, realizing monocularly focusable three-dimensional display based on monocular multi-view. Existing monocular multi-view display techniques all form monocularly focusable spatial light spots by spatially superimposing beams with full-color rendering capability projected by different pixels, where each pixel's color capability comes from mixing the basic-color beams emitted by its constituent sub-pixels. Forming one such spatial light spot requires superimposing the beams of at least two pixels, as described in PCT/CN2017/080874 (THREE-DIMENTIONAL DISPLAY SYSTEM BASED ON DIVISION MULTIPLEXING OF VIEWER'S ENTRANCE-PUPIL AND DISPLAY METHOD) and PCT/IB2017/055664 (NEAR-EYE SEQUENTIAL LIGHT-FIELD PROJECTOR WITH CORRECT MONOCULAR DEPTH CUES). The method of this patent forms monocularly focusable spatial light spots by spatially superimposing basic-color beams projected by different sub-pixels; compared with the at least two pixels required by existing monocular multi-view techniques, it needs as few as two sub-pixels per spot. By taking sub-pixels as the basic display units, the method effectively increases the number of two-dimensional views the display device can project, which favors enlarging the observer's viewing area, or extending the display depth of monocularly focusable scenes by increasing the spatial density of the viewing zones corresponding to the projected views. Further, a projection device is used to project a magnified image of the display device, extending the method to near-eye display, and a relay device is used to optimize the spatial layout of the optical structure. The method can be applied directly to a binocular three-dimensional display optical engine, or to an optical engine for a single eye.
Taking sub-pixels as display units and realizing monocularly focusable three-dimensional display based on monocular multi-view, the present invention provides the following scheme:

A three-dimensional display method based on spatial superposition of sub-pixel emitted light, comprising the following steps:

(i) taking the sub-pixels of each pixel of a display device as basic display units, the sub-pixels emitting light of the same color each serve alone as one sub-pixel group or are divided into several sub-pixel groups;

the sub-pixels of the display device belong to K′ classes of basic-color sub-pixels whose emitted light colors differ from one another, including K classes of primary-color sub-pixels, and the sub-pixels of each sub-pixel group are arranged throughout the display device, where K′ ≥ 2 and K′ ≥ K ≥ 2;

wherein the K classes of primary-color sub-pixels correspond one-to-one to K classes of color filters, and for each class of primary-color sub-pixel, the ratio of the transmittance of its emitted light through its corresponding filter to that through the filters corresponding to the other (K−1) classes of primary-color sub-pixels is greater than 9; the color of light emitted by a basic-color sub-pixel is defined as a basic color, K′ basic colors in total, and the color of light emitted by a primary-color sub-pixel is defined as a primary color, K primary colors in total;

(ii) along the transmission direction of the light emitted by the sub-pixels of the display device, placing a light splitting device in front of the display device to guide each sub-pixel of the display device to project a beam toward the viewing zone corresponding to the sub-pixel group to which it belongs, and restricting the divergence angle of each projected beam so that, on the plane where the observer's pupil is located, along at least one direction, the size of the light distribution whose intensity exceeds 50% of the peak intensity is smaller than the observer's pupil diameter;

(iii) a control device connected to the display device controls each sub-pixel group to load and display its corresponding image, where the image information loaded on each sub-pixel is the projection light information of the scene to be displayed at the intersection of the line along the transmission vector direction of the beam projected by that sub-pixel into the region where the observer's pupil is located with the plane where the observer's pupil is located;

wherein the image displayed by one sub-pixel group is one view of the scene to be displayed, and the image displayed by a stitched sub-pixel group, formed by complementarily combining different parts of different sub-pixel groups, is one stitched view;

wherein the spatial position distribution of the viewing zones corresponding to the sub-pixel groups of the display device is arranged so that information of at least two views and/or stitched views in total enters the same pupil of the observer.
Further, step (i) also includes naming each sub-pixel and the sub-pixel group it belongs to by the color of the light it emits, for example a green sub-pixel and the green sub-pixel group it belongs to.

Further, the light splitting device is an aperture array placed corresponding to the observer's pupil. The aperture array contains at least one aperture group consisting of K apertures, to which the K classes of filters corresponding to the K classes of primary-color sub-pixels are attached in one-to-one correspondence; among the sub-pixel groups formed from the sub-pixel class corresponding to an aperture, the group whose projected light passes through that aperture takes that aperture as its corresponding viewing zone.

Further, the aperture array contains M aperture groups, different aperture groups each allowing only light of mutually different orthogonal characteristics to pass, where M ≥ 2.

Further, the mutually different orthogonal characteristics are: time-sequential orthogonal characteristics transmitting non-simultaneously at different time points; or two linear polarization characteristics with mutually perpendicular polarization directions; or two optical rotation characteristics with opposite rotation directions; or a combination of the time-sequential orthogonal characteristics and the two mutually perpendicular linear polarization characteristics; or a combination of the time-sequential orthogonal characteristics and the two opposite optical rotation characteristics.

Further, the light splitting device is an aperture array placed corresponding to the observer's pupil. The aperture array contains at least one aperture group consisting of K′ apertures corresponding one-to-one to the K′ classes of basic-color sub-pixels; among the sub-pixel groups formed from the sub-pixel class corresponding to an aperture, the group whose projected light passes through that aperture takes that aperture as its corresponding viewing zone, and the K apertures corresponding to the K classes of primary-color sub-pixels are respectively fitted with the corresponding K classes of filters, where K′ > K;

wherein, within the same aperture group, the K apertures fitted with the K classes of filters allow light of one and the same orthogonal characteristic to pass, while the remaining (K′−K) apertures each allow only one of the other (K′−K) orthogonal characteristics to pass, these (K′−K+1) orthogonal characteristics being mutually different.

Further, the aperture array contains M aperture groups, different aperture groups each allowing only light of mutually different orthogonal characteristics to pass, where M ≥ 2.

Further, the mutually different orthogonal characteristics are: time-sequential orthogonal characteristics transmitting non-simultaneously at different time points; or two linear polarization characteristics with mutually perpendicular polarization directions; or two optical rotation characteristics with opposite rotation directions; or a combination of the time-sequential orthogonal characteristics and the two mutually perpendicular linear polarization characteristics; or a combination of the time-sequential orthogonal characteristics and the two opposite optical rotation characteristics.
Further, the display device is a passive display apparatus with a backlight source array; the backlight source array includes at least one backlight source group consisting of K backlight sources respectively projecting the K classes of primary-color light; the light splitting device is an optical device projecting a real image of each backlight source of the array; and the light distribution region of the real image of each backlight source serves as the viewing zone corresponding to the sub-pixel group whose projected light color matches that backlight source's color and whose projected light passes through that light distribution region.

Further, the backlight source array contains M backlight source groups, different backlight source groups emitting light of mutually different orthogonal characteristics, where M ≥ 2.

Further, the mutually different orthogonal characteristics are: time-sequential orthogonal characteristics projecting non-simultaneously at different time points; or two linear polarization characteristics with mutually perpendicular polarization directions; or two optical rotation characteristics with opposite rotation directions; or a combination of the time-sequential orthogonal characteristics and the two mutually perpendicular linear polarization characteristics; or a combination of the time-sequential orthogonal characteristics and the two opposite optical rotation characteristics.

Further, the display device is a passive display apparatus with a backlight source array; the backlight source array includes at least one backlight source group consisting of K′ backlight sources respectively projecting the K′ classes of basic-color light; the light splitting device is an optical device projecting a real image of each backlight source of the array; and the light distribution region of the real image of each backlight source serves as the viewing zone corresponding to the sub-pixel group whose projected light color matches that backlight source's color and whose projected light passes through that light distribution region, where K′ > K;

wherein, within the same backlight source group, the K backlight sources projecting the K classes of primary-color light emit light of one and the same orthogonal characteristic, while the remaining (K′−K) backlight sources each emit light of one of the other (K′−K) orthogonal characteristics, these (K′−K+1) orthogonal characteristics being mutually different.

Further, the backlight source array contains M backlight source groups, different backlight source groups emitting light of mutually different orthogonal characteristics, where M ≥ 2.

Further, the mutually different orthogonal characteristics are: time-sequential orthogonal characteristics projecting non-simultaneously at different time points; or two linear polarization characteristics with mutually perpendicular polarization directions; or two optical rotation characteristics with opposite rotation directions; or a combination of the time-sequential orthogonal characteristics and the two mutually perpendicular linear polarization characteristics; or a combination of the time-sequential orthogonal characteristics and the two opposite optical rotation characteristics.
Further, step (ii) also includes placing a projection device at a position corresponding to the display device to form a magnified image of the display device.

Further, step (ii) also includes placing a relay device on the transmission path of the light projected by the display device, to guide the projected beams of the display device into the region where the observer's pupil is located.

Further, the relay device is a reflective surface, a transflective surface, a free-form surface combination, or an optical waveguide device.

Further, step (iii) also includes connecting a tracking device to the control device and determining the position of the observer's pupil in real time through the tracking device (70).

Further, step (iii) also includes, according to the position of the observer's pupil, determining the light information loaded on each sub-pixel whose projected light enters the observer's pupil, namely the projection light information of the scene to be displayed at the intersection of the line along the transmission vector direction of the beam projected by that sub-pixel into the observer's pupil with the observer's pupil.

Further, step (iii) also includes, according to the position of the observer's pupil, the control device determining the sub-pixel groups whose projected beams enter the observer's pupil, and driving only these sub-pixel groups as effective sub-pixel groups for monocular multi-view display.
Taking sub-pixels as the basic display units, the present invention can effectively increase the number of projected two-dimensional views relative to pixel-based monocular multi-view display methods and, combined with the characteristic design of the light splitting device through spatial multiplexing and/or time multiplexing, further meets the large view-count requirement of monocular multi-view display.

The present invention has the following technical effects: the three-dimensional display method based on spatial superposition of sub-pixel emitted light provides an implementation of three-dimensional display free of the focus-convergence conflict. It effectively increases the number of two-dimensional views projected by the display device, favoring enlargement of the observer's viewing area, or extension of the display depth of monocularly focusable scenes by increasing the spatial density of the viewing zones corresponding to the projected views. Further, a projection device is used to project a magnified image of the display device, extending the method to near-eye display, and a relay device is used to optimize the spatial layout of the optical structure. The method can be applied directly to a binocular three-dimensional display optical engine, or to an optical engine for a single eye.

Details of embodiments of the present invention are given in the drawings and the following description. Other features, objects, and advantages of the present invention will become more apparent from the description and the drawings.
Brief Description of the Drawings

The drawings are provided to aid a better understanding of the invention and form part of this specification. Together with the description, these drawings illustrating the embodiments serve to explain the principles of the invention.
FIG. 1 is a schematic diagram of the principle of existing pixel-based monocular multi-view display.

FIG. 2 is a schematic diagram of the three-dimensional display method of this patent based on spatial superposition of sub-pixel emitted light.

FIG. 3 illustrates the stitching rule for a stitched sub-pixel group.

FIG. 4 is a schematic diagram of the depth region in which no beam superposition occurs.

FIG. 5 is a schematic diagram of the imaging function of the projection device.

FIG. 6 is an example of the relay device deflecting the light propagation path.

FIG. 7 is a schematic diagram of the splitting principle of the aperture-type light splitting device.

FIG. 8 illustrates the working principle of an aperture-type light splitting device based on linear polarization characteristics.

FIG. 9 illustrates the working principle of an aperture-type light splitting device based on time-sequential orthogonal characteristics.

FIG. 10 illustrates the working principle of one aperture-type light splitting device based on mixed orthogonal characteristics.

FIG. 11 illustrates the working principle of another aperture-type light splitting device based on mixed orthogonal characteristics.

FIG. 12 is a schematic diagram of a binocular display structure based on the aperture-type light splitting device.

FIG. 13 is a schematic diagram of a near-eye monocular display optical engine based on the aperture-type light splitting device.

FIG. 14 is a schematic diagram of a near-eye binocular display optical engine based on the aperture-type light splitting device.

FIG. 15 is Example 1 of a composite-structure near-eye monocular optical engine based on the aperture-type light splitting device.

FIG. 16 is Example 2 of a composite-structure near-eye monocular optical engine based on the aperture-type light splitting device.

FIG. 17 is Example 3 of a composite-structure near-eye monocular optical engine based on the aperture-type light splitting device.

FIG. 18 is Example 4 of a composite-structure near-eye monocular optical engine based on the aperture-type light splitting device.

FIG. 19 is a structural assembly diagram of a thin and light monocular optical engine based on the aperture-type light splitting device.

FIG. 20 is Example 5 of a composite-structure near-eye monocular optical engine based on the aperture-type light splitting device.

FIG. 21 is a schematic diagram of an example free-form-surface relay device based on the aperture-type light splitting device.

FIG. 22 is a schematic diagram of Example 1 of an optical-waveguide relay device based on the aperture-type light splitting device.

FIG. 23 is a schematic diagram of a stacked multi-waveguide structure.

FIG. 24 is a schematic diagram of Example 2 of an optical-waveguide relay device based on the aperture-type light splitting device.

FIG. 25 is a schematic diagram of Example 3 of an optical-waveguide relay device based on the aperture-type light splitting device.

FIG. 26 is a schematic diagram of the splitting principle of the imaging-type light splitting device.

FIG. 27 illustrates the working principle of an imaging-type light splitting device based on linear polarization characteristics.

FIG. 28 illustrates the working principle of an imaging-type light splitting device based on time-sequential orthogonal characteristics.

FIG. 29 illustrates the working principle of an imaging-type light splitting device based on mixed orthogonal characteristics.

FIG. 30 is a schematic diagram of a binocular display structure based on the imaging-type light splitting device.

FIG. 31 is a schematic diagram of backlight source shapes.

FIG. 32 is a schematic diagram of the oblique arrangement of strip viewing zones relative to the observer's interocular direction.

FIG. 33 is a schematic diagram of a composite-structure display apparatus.

FIG. 34 is a schematic diagram of Example 1 of a near-eye monocular display optical engine based on the imaging-type light splitting device.

FIG. 35 is a schematic diagram of Example 2 of a near-eye monocular display optical engine based on the imaging-type light splitting device.

FIG. 36 is a schematic diagram of a near-eye binocular display optical engine based on the imaging-type light splitting device.

FIG. 37 is a schematic diagram of Example 3 of a near-eye monocular display optical engine based on the imaging-type light splitting device.

FIG. 38 is a schematic diagram of Example 4 of a near-eye monocular display optical engine based on the imaging-type light splitting device.

FIG. 39 is a schematic diagram of Example 5 of a near-eye monocular display optical engine based on the imaging-type light splitting device.

FIG. 40 is a schematic diagram of Example 6 of a near-eye monocular display optical engine based on the imaging-type light splitting device.

FIG. 41 is a schematic diagram of an example free-form-surface relay device based on the imaging-type light splitting device.

FIG. 42 is a schematic diagram of Example 1 of an optical-waveguide relay device based on the imaging-type light splitting device.

FIG. 43 is a schematic diagram of Example 2 of an optical-waveguide relay device based on the imaging-type light splitting device.
Detailed Description of the Embodiments

The three-dimensional display method of the present invention, based on spatial superposition of sub-pixel emitted light, directly takes sub-pixels as basic display units, uses multiple sub-pixel groups to project at least two views into the same pupil of the observer, and forms monocularly focusable display light spots through the spatial superposition of beams of different vector directions from different views, realizing three-dimensional display free of the focus-convergence conflict.
Existing monocular multi-view techniques all take pixels as the basic display units, projecting at least two views into the same pupil 50 of the observer and forming, at each displayed object point, a spatially superimposed light spot through the spatial superposition of at least two beams from those views passing through the point. When the intensity distribution of the superimposed spot attracts the observer's eye more strongly than the intensity distribution of each superimposed beam at its emitting pixel, the eye can be drawn to focus on the superimposed spot, overcoming the focus-convergence conflict. In this process, on the plane where the observer's pupil 50 is located, along at least one direction, the size of each beam's light distribution whose intensity exceeds 50% of its peak must be smaller than the pupil diameter D p, to ensure that the intensity distribution at the superimposed spot can attract the eye's focus relative to the distribution at the emitting pixels. The at least two views come from at least two pixel groups on the display device 10, which, under the action of the light splitting device 20, each project toward their respective viewing zone the view corresponding to that zone. FIG. 1 shows an example based on two monocular views, with the display device 10 exemplified as a thin-structure display. Pixel group 1 projects beams carrying view 1 information through the corresponding viewing zone VZ 1, and pixel group 2 projects beams carrying view 2 information through the corresponding viewing zone VZ 2; both enter the observer's pupil 50. The x direction is the viewing-zone arrangement direction. At an object point P to be displayed, beam 1 from pixel group 1 and beam 2 from pixel group 2 superimpose to form a spatial light spot. When the intensity distribution of this spot can attract the eye corresponding to pupil 50 to focus on the object point P, the eye's focus is no longer forced onto the emitting pixels of beam 1 or beam 2, i.e. no longer forced onto the display device 10, and the focus-convergence conflict is overcome. Other monocularly focusable object points presented in the same way jointly compose a monocularly focusable three-dimensional scene. FIG. 1 illustrates the basic principle of monocular multi-view; it does not involve the concrete structure of the light splitting device 20, which is represented by a dashed box, and the position of the light splitting device 20 relative to the display device 10 is shown schematically for convenience of illustration only; it does not imply that their actual positional relationship must be as shown when the light splitting device 20 takes different concrete forms.
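The relationship above between a display unit, its viewing zone, and the loaded light information can be sketched in code. This is an illustrative 1-D geometry, not part of the patent: a display unit at `x_sub` on the display plane (z = 0) projects a beam through its viewing zone at `x_zone` on the pupil plane (z = `d_zone`), and `sample_at_depth` gives the x-coordinate where that line crosses an arbitrary scene depth `z`, i.e. where the scene to be displayed is sampled for this unit.

```python
# Illustrative sketch (coordinates and names are not from the patent):
# the line from a display unit through its viewing zone determines which
# scene point's light information the unit must load.

def sample_at_depth(x_sub, x_zone, d_zone, z):
    """x-coordinate where the display-unit -> viewing-zone line crosses depth z."""
    return x_sub + (x_zone - x_sub) * z / d_zone
```

For example, a unit at x = 0 aimed at a zone at x = 10 on a pupil plane 100 units away samples the scene at x = 5 at half that depth, which is where two such beams would superimpose into a spatial light spot.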
In fact, increasing the number and distribution density of the viewing zones allows more views to enter the same pupil 50. Then, through each displayed object point, more superimposed beams enter the pupil 50 along their respective vector directions. Superimposing more beams strengthens the spatial spot's attraction on the eye's focus, which favors displaying scenes at larger off-screen distances. Meanwhile, more viewing zones also provide the pupil 50 with a larger viewing area, so that as the pupil moves within that larger area the observer continues to see, based on the monocular multi-view principle, a display scene free of the focus-convergence conflict. In monocular multi-view display, more viewing zones mean more corresponding views, requiring the light splitting device 20 to divide the pixels of the display device 10 into more pixel groups to project those views. This division is usually realized by the light splitting device 20 through spatial-division or time-division multiplexing. Spatial-division multiplexing divides the pixels of the display device 10 spatially into different spatial pixel groups corresponding to different viewing zones, with mutually disjoint membership; more spatial pixel groups mean fewer pixels per group, i.e. lower resolution per projected view. Time-division multiplexing divides the pixels temporally into different time-sequential pixel groups corresponding to different viewing zones; these groups contain the same pixels but project light information toward different viewing zones at different time points within one time period, superimposing the beams of different time-sequential groups through persistence of vision. Generating more time-sequential pixel groups means that displaying one scene through persistence of vision requires more time points per period, i.e. a relative drop in the three-dimensional display frame rate.

FIG. 1 is a schematic diagram of the existing monocular multi-view display method, in which each beam is emitted by a corresponding pixel. Each pixel contains more than one sub-pixel; these emit light of different colors, which mixes into multi-color light as the pixel's color output. During monocular multi-view display, the color light thus emitted by each pixel, with its divergence angle constrained and its vector direction guided by the light splitting device 20, spatially superimposes, in the form of divergence-limited directional beams, into monocularly focusable spatial light spots. The two views required by the process of FIG. 1 are projected by the two pixel groups into which the pixels of the display device 10 are divided by the light splitting device 20.
Unlike the pixel-based monocular multi-view display of FIG. 1, this patent performs monocular multi-view display with sub-pixels as the basic display units. The sub-pixels emitting the same color each serve alone as one sub-pixel group or are divided into several sub-pixel groups, and each sub-pixel and its group are named by the color of its emitted light, e.g. a green sub-pixel and its green sub-pixel group. FIG. 2 takes as an example a display device 10 whose pixels each consist of R (red), G (green), and B (blue) sub-pixels, three classes of sub-pixels respectively emitting red, green, and blue light. In this patent, all sub-pixels of the display device 10 belong to K′ classes of basic-color sub-pixels of mutually different emitted colors, including K classes of primary-color sub-pixels. The K classes of primary-color sub-pixels satisfy the following condition: they correspond one-to-one to K classes of filters, and each class's emitted light has a transmittance ratio greater than 9 between its corresponding filter and the filters corresponding to the other (K−1) classes. Here K′ ≥ 2 and K′ ≥ K ≥ 2. The color emitted by a basic-color sub-pixel is defined as a basic color, K′ in total; the color emitted by a primary-color sub-pixel is defined as a primary color, K in total. For the display device 10 chosen in FIG. 2, the K′ = 3 classes of basic-color sub-pixels are also the K = 3 classes of primary-color sub-pixels. The three sub-pixels of each pixel are arranged along the x direction. FIG. 2 takes as an example the case where each class of basic-color sub-pixels is divided into two sub-pixel groups, giving 2 × 3 = 6 groups: red sub-pixel groups 1 and 2, green sub-pixel groups 1 and 2, and blue sub-pixel groups 1 and 2. Guided by the light splitting device 20, these 6 groups project 6 views of the scene to be displayed through 6 viewing zones VZ R1, VZ G1, VZ B1, VZ R2, VZ G2, and VZ B2 in one-to-one correspondence. The image information loaded on each sub-pixel is the projection light information of the scene at the intersection of the line, along the transmission vector of the beam projected by that sub-pixel through its group's corresponding viewing zone into the region of the observer's pupil 50, with the plane of the pupil 50; that is, each sub-pixel group loads the view corresponding to its own viewing zone. Note that each sub-pixel projects only its basic color and can only project light information of that basic color, so the projection light information mentioned above refers only to the information component whose color matches that sub-pixel's emitted color; this also applies to the rest of this patent. The viewing-zone spacing is designed so that light information of at least two views exiting through at least two viewing zones enters the same pupil 50. As shown in FIG. 2, when the three beams exiting through VZ B1, VZ R2, and VZ G2, respectively from blue sub-pixel group 1, red sub-pixel group 2, and green sub-pixel group 2, all enter the observer's pupil 50, then at each displayed object point, such as point P, the three directional beams 3, 4, 5 from these three basic-color sub-pixel groups superimpose into a monocularly focusable display light spot. Compared with the pixel-based case, taking sub-pixels as basic display units and grouping sub-pixels of different colors separately increases the number of views and corresponding viewing zones that the display device 10 can project through the light splitting device 20. FIG. 2 illustrates the principle and method of sub-pixel-based monocular multi-view display only; it does not involve the concrete structure of the light splitting device 20, which is drawn as a dashed box, and the position of the device 20 relative to the display device 10 is schematic only. Comparing FIG. 1 and FIG. 2 with the same display device 10 and the same per-view resolution, using sub-pixels as basic display units multiplies the number of projected views by K′ = 3 relative to using pixels. That is, without sacrificing per-view resolution or display frame rate, the sub-pixel-based monocular multi-view display method of this patent effectively increases the number of projected views and corresponding viewing zones relative to the existing pixel-based method, so that more viewing zones can provide the pupil 50 with a larger viewing area, or display depth can be extended by increasing the viewing-zone density. The spatial superimposed spot P shown in FIG. 2 lies between the display device 10 and the pupil 50 and is formed by real superposition of beams from different sub-pixels. In fact, a spatial superimposed spot can also be generated on the other side of the display device, such as point P′ in FIG. 2, formed by the intersection of the backward extensions of beams 6, 7, and 8. When the observer's eye receives beams 6, 7, and 8, what it sees is the superimposed light distribution at P′ of the equivalent distributions obtained by propagating those beams backward by diffraction; this patent likewise calls it the spatial superimposed spot at P′, and it also corresponds to a real light spot on the observer's retina. In this patent, display scenes on both sides of the display device 10 are generated for the observer on the same principle; in the following, only the display scene on the side of the emitted-light transmission direction of the display device 10 is taken as an example.
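The division into K′ × M sub-pixel groups with interleaved same-color membership can be sketched as follows. This is an illustrative bookkeeping example (pixel counts and labels are not from the patent): same-color sub-pixels alternate along x between the M spatial groups, so each group's sub-pixels are spread across the display.

```python
# Illustrative sketch: assign each sub-pixel (identified by its pixel column
# and color) to one of M interleaved spatial groups per color, giving
# K' x M sub-pixel groups in total.

def assign_subpixel_groups(n_pixels, colors, m_groups):
    """Map (color, group index) -> list of pixel column indices."""
    groups = {}
    for color in colors:
        for p in range(n_pixels):
            # adjacent same-color sub-pixels along x belong to different groups
            groups.setdefault((color, p % m_groups), []).append(p)
    return groups

groups = assign_subpixel_groups(4, ["R", "G", "B"], 2)
```

With K′ = 3 colors and M = 2 spatial groups this yields the 6 sub-pixel groups of the FIG. 2 example, each corresponding to one viewing zone.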
Forming a monocularly focusable spatial light spot by superimposing more than one beam requires that, on the plane of the observer's pupil 50, along at least one direction, the size of each superimposed beam's light distribution whose intensity exceeds 50% of the peak be smaller than the pupil diameter, ensuring that, relative to the intensity at each beam's emitting sub-pixel, the intensity distribution of the superimposed spot can attract the eye's focus. Each sub-pixel's projected beam enters the region of the pupil 50 through the viewing zone corresponding to its sub-pixel group. These corresponding viewing zones have two possible shapes. In the first case, along one direction the zone size is smaller than the pupil diameter D p, but there exists another direction along which the zone size is not smaller than D p; this patent calls such zones strip viewing zones. In the second case, the zone size is smaller than D p along every direction; this patent calls such zones point viewing zones. Strip viewing zones are arranged along one dimension; point viewing zones may be arranged along one or two dimensions.
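The 50%-of-peak size criterion above can be checked numerically. This is an illustrative sketch (the sampling grid and intensity profile are made up, not from the patent): given a 1-D sampled intensity profile of one projected beam on the pupil plane, the extent of the region above half of the peak must be smaller than the pupil diameter.

```python
# Illustrative sketch: width of a beam's light distribution above 50% of
# peak intensity, as used in the divergence-angle condition of step (ii).

def above_half_peak_width(xs, intensity):
    """Extent of the samples whose intensity exceeds 50% of the peak."""
    peak = max(intensity)
    hits = [x for x, v in zip(xs, intensity) if v > 0.5 * peak]
    return max(hits) - min(hits) if hits else 0.0

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]           # mm, hypothetical samples
profile = [0.05, 0.2, 0.7, 1.0, 0.7, 0.2, 0.05]    # hypothetical intensities
width = above_half_peak_width(xs, profile)
```

Here the above-half-peak width is 1.0 mm, smaller than a typical pupil diameter of a few millimeters, so this hypothetical beam would satisfy the condition along this direction.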
In FIG. 2, the observer's pupil 50 is placed near the plane of the viewing zones. When the pupil deviates forward or backward from that plane along the beam transmission direction, the pupil 50 may no longer receive all beams of at least two complete views. The pupil 50 at the position shown in FIG. 3 can receive the complete view entering through viewing zone VZ B1, projected by the blue sub-pixel group 1 corresponding to that zone; but it can only receive the partial view projected through VZ R2 by the sub-pixels of red sub-pixel group 2 in region M s2 M r1, and the partial view projected through VZ G1 by the sub-pixels of green sub-pixel group 1 in region M s1 M r2. In the figure, M p1 and M p2 are the two edge points of the pupil 50 along the viewing-zone arrangement direction x; M s1 and M s2 are the two edge points of the sub-pixel distribution region of the display device 10 along x; M r1 is the intersection with the display device 10 of the line connecting M p2 and the −x edge point of zone VZ R2; and M r2 is the intersection with the display device 10 of the line connecting M p1 and the x edge point of zone VZ G1. The regions M s2 M r1 and M s1 M r2 overlap; taking spatially complementary parts of them, M s2 M t and M t M s1, and stitching them together yields a stitched sub-pixel group, whose displayed image is one stitched view of the scene to be displayed, where M t is a point within the overlap region M r1 M r2. The pupil 50 at the position of FIG. 3 can thus receive one view and one stitched view; through each displayed object point, at least two beams, one from each, enter the pupil 50 and, within a certain off-screen range, superimpose into a monocularly focusable spatial light spot based on the monocular multi-view principle. By analogy, as the distance between the pupil 50 and the viewing-zone plane grows, the observable stitched sub-pixel group will be complementarily stitched from different parts of more sub-pixel groups.
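The FIG. 3 geometry can be sketched as a 1-D back-projection. This is an illustrative model under simplifying assumptions (straight rays, pupil plane behind the zone plane, made-up coordinates): the display interval whose light passes through a viewing zone and still reaches the pupil is bounded by the back-projections of the extreme pupil-edge-through-zone-edge rays, analogous to the points M r1 and M r2.

```python
# Illustrative sketch: display interval (at z = 0) visible through a viewing
# zone [v_lo, v_hi] (at z = z_zone) from a pupil [p_lo, p_hi] (at z = z_pupil,
# with z_pupil > z_zone). 1-D central projection.

def visible_interval(zone, pupil, z_zone, z_pupil):
    def back_project(v, p):
        # extend the line from pupil point p through zone point v back to z = 0
        return v + (v - p) * z_zone / (z_pupil - z_zone)
    (v_lo, v_hi), (p_lo, p_hi) = zone, pupil
    return back_project(v_lo, p_hi), back_project(v_hi, p_lo)
```

Intersecting such intervals for different zones, as in FIG. 3, determines which complementary parts of different sub-pixel groups must be stitched into a stitched sub-pixel group.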
In existing display technologies, color light information is usually presented by combining multiple basic colors; the pixel-level presentation of color light information in various displays is realized by mixing the K′ basic colors emitted by the sub-pixels. Common displays use the K′ basic colors R, G, B or R, G, B, W (white). As noted above, when monocular multi-view display is performed with the minimum of two superimposed beams through each displayed object point, the two beams can form a monocularly focusable spatial spot within a certain off-screen distance, but its color rendition is inaccurate because basic colors are missing. Considering accurate color rendition, when performing monocular multi-view display based on the spatial superposition of sub-pixel projected beams, the superimposed beams through each displayed object point are optimally at least K′ basic-color beams. That is, the views received by the pupil 50 optimally include a total of K′ views and/or stitched views whose projected light colors are the K′ basic colors. Arranging the exit colors of K′ adjacent viewing zones along the arrangement direction to be the K′ basic colors is a common design. Note that even when the pupil 50 receives at least K′ views and/or stitched views of the K′ basic colors, within a certain range near the display device 10 there will still be displayed object points crossed by fewer than K′ superimposed beams, due to the discrete distribution of the sub-pixels and of the viewing zones. With the sub-pixel and viewing-zone arrangement of FIG. 4, the region between points P r and P l is a no-beam-superposition region: each display point there is crossed by at most one beam, so display based on beam overlap is impossible. Here P r is the intersection of the crossed beams projected by two adjacent sub-pixels toward two adjacent viewing zones, and P l is the intersection of the backward extensions of the non-crossing beams projected by adjacent sub-pixels toward the maximally separated viewing zones. This no-beam-superposition region lies near the display device 10, where the visual discomfort caused by the focus-convergence conflict is slight; this range is not considered or discussed further below.
The viewing zones in FIG. 2 are shown with certain spatial gaps between them as an example. In fact, adjacent viewing zones may also abut each other or partially overlap. The following often shows abutting or gapped arrangements, but this does not mean that adjacent viewing zones must be arranged in the manner shown.

When performing monocular multi-view display with sub-pixels as display units, a tracking device 70 as shown in FIG. 2 can also be used to acquire the position of the observer's pupil 50 in real time. Its roles are as follows. First, when the light distribution size of the beam projected by a sub-pixel into the pupil region exceeds the pupil diameter D p along some direction, the image information loaded on each sub-pixel is determined, according to the position of the pupil 50, as the projection light information of the scene to be displayed at the intersection of the line along the transmission vector of the beam projected by that sub-pixel into the pupil 50 with the plane of the pupil 50. Second, when the pupil 50 can move within a certain region, the control device 30 determines, according to the position of the pupil 50, the sub-pixel groups whose projected beams enter the pupil 50, and performs monocular multi-view display with these sub-pixel groups as the effective sub-pixel groups.
When the display device 10 projects enough viewing zones through the light splitting device 20 to project at least two views and/or stitched views to each of the observer's two pupils, the optical structure displaying by the method of this patent based on spatial superposition of sub-pixel emitted light can serve as a binocular display optical engine. If the viewing zones projected by the display device 10 through the light splitting device 20 only support projecting at least two views and/or stitched views to a single pupil, the structure can serve only as a monocular display optical engine, for example one eyepiece of a head-mounted virtual reality (VR)/augmented reality (AR) device. In that case a projection device 40 can be introduced to project a magnified image I 10 of the display device 10, as shown in FIG. 5. The image I 10 of the display device 10 with respect to the projection device 40 can serve as an equivalent display device composed of equivalent sub-pixel groups, each equivalent sub-pixel group being the image of a sub-pixel group of the display device 10 with respect to the projection device 40. When the image of a sub-pixel group's viewing zone with respect to the projection device 40 exists, for example the image I VZR2 of the viewing zone VZ R2 of red sub-pixel group 2, it serves as the equivalent viewing zone of the corresponding equivalent sub-pixel group (the image of red sub-pixel group 2). Then, when the projection device 40 is introduced, information loading and monocular multi-view display can be performed directly on the equivalent sub-pixel groups of the equivalent display device and their corresponding equivalent viewing zones, entirely analogously to the information loading and monocular multi-view display performed on the sub-pixel groups of the display device 10 and their viewing zones when no projection device 40 is introduced. When equivalent viewing zones exist, the size constraints on the viewing zones transfer to the corresponding equivalent viewing zones. The equivalent viewing zone may be strip-shaped, smaller than the pupil diameter D p along one direction but not smaller than D p along some other direction, or point-shaped, smaller than D p along every direction. The size of an equivalent viewing zone along a direction means the size covered, along that direction, by the region where the intensity is 50% of the peak. Strip equivalent zones are arranged along one dimension; point equivalent zones may be arranged along one or two dimensions. When the images of the viewing zones with respect to the projection device 40 do not exist, information loading and monocular multi-view display can be performed directly on the equivalent sub-pixel groups and the original viewing zones, again entirely analogously to the case without the projection device 40. In addition, the position of the projection device 40 relative to the light splitting device 20 depends on the concrete structure of the latter; depending on that structure, the projection device 40 may also be placed at position P o2 shown in FIG. 5, or at position P o3 between components of the light splitting device 20.

Further, a relay device 60 can be used to guide the display light information of the display device 10 along a deflected path to the region of the observer's pupil, as shown in FIG. 6, where the relay device 60 is exemplified by a transflective surface. Then the equivalent display device is the image of the display device 10 with respect to the projection device 40 and the relay device 60, such as I 10 in FIG. 6; each equivalent sub-pixel group is the image of the corresponding sub-pixel group with respect to the projection device 40 and the relay device 60; and when the image of a viewing zone exists, it serves as the corresponding equivalent viewing zone and is likewise the image of that zone with respect to the projection device 40 and the relay device 60, such as I VZR1 and I VZB1 in FIG. 6. The positional relationship of the light splitting device 20, projection device 40, and relay device 60 shown in FIG. 6 is only schematic; their order along the projection direction of the display device 10 may change with the actual optical devices chosen. When the optical structure displaying by the method of this patent serves as a monocular display optical engine, building a binocular optical engine requires two such monocular display optical engines, one for each of the observer's eyes. In the figures above, for simplicity, the display device 10 is drawn as a thin-structure display; it may be an active display or a passive display with a backlight. FIGS. 1 to 6 do not involve the concrete structure of the light splitting device 20, which is drawn as a dashed box, and its position relative to the display device 10 is schematic only; it does not imply that their actual positional relationship must be as shown in FIGS. 1 to 6 when the light splitting device 20 takes different concrete forms.
Below, the three-dimensional display method of this patent based on spatial superposition of sub-pixel emitted light is further illustrated and explained with concrete devices chosen for the light splitting device 20.

Embodiment 1

An aperture array is taken as the light splitting device 20, named the aperture-type light splitting device 20, and placed corresponding to the display device 10, as shown in FIG. 7. The display device 10 is exemplified by a common RGB display. Each pixel consists of three sub-pixels emitting R, G, and B light arranged along the x direction; along the y direction perpendicular to x, sub-pixels emitting the same color are arranged adjacently in columns. FIG. 7 shows only one row of sub-pixels along x, each labeled by its emitted color R, G, or B; for example, sub-pixel SP Bn1 is marked by the subscript B as emitting blue light. These K′ = 3 classes of basic-color sub-pixels with mutually different emitted colors are also all primary-color classes, i.e. K′ = K = 3, corresponding respectively to a red filter, a green filter, and a blue filter. Sub-pixels emitting the same color each form one sub-pixel group: the sub-pixels of the display device 10 are grouped into a red sub-pixel group, a green sub-pixel group, and a blue sub-pixel group. Along the emitted-light transmission direction of the sub-pixels of the display device 10, an aperture array containing K′ = 3 apertures is placed as the light splitting device 20, the apertures bearing the red filter F R, the green filter F G, and the blue filter F B respectively. Then the red group's emitted light exits through the aperture with F R, the green group's through the aperture with F G, and the blue group's through the aperture with F B. This patent stipulates that each primary-color class's emitted light has a transmittance ratio greater than 9 between its corresponding filter and the filters corresponding to the other primary-color classes; then the light of each sub-pixel leaking through the filters of the other classes acts as noise whose effect on display quality is within a tolerable range. On this premise, this noise is not discussed further below, and each class's projected light is considered unable to pass through the filters corresponding to the other primary-color classes. Thus the aperture with F R is the aperture viewing zone VZ R corresponding to the red sub-pixel group, the subscript R indicating that the zone bears the red filter; the aperture with F G is the aperture viewing zone VZ G of the green sub-pixel group, the subscript G indicating the green filter; and the aperture with F B is the aperture viewing zone VZ B of the blue sub-pixel group, the subscript B indicating the blue filter. This convention of subscripts denoting the filter color attached to a viewing zone applies in the rest of this embodiment. By the principle of FIG. 2, when the spacing of adjacent viewing zones is small enough, at least two views and/or stitched views can enter the observer's pupil 50, realizing monocular multi-view display.
For accurate color rendition of the displayed scene, optimally all three primary-color views should enter the observer's pupil 50. In that case, presenting only K′ = 3 viewing zones confines the position of the pupil 50 relative to the aperture-type light splitting device 20 to a small spatial region; more viewing zones are desired so that the pupil has some freedom of movement. Introducing orthogonal characteristics into the aperture-type light splitting device 20 solves this problem effectively. The orthogonal characteristics may be two linear polarization characteristics with mutually perpendicular polarization directions. As shown in FIG. 8, the aperture-type light splitting device 20 contains M = 2 aperture groups, each containing K′ = 3 apertures bearing the filters F R, F G, and F B, and the two groups pass two mutually perpendicular linearly polarized lights, "-" light and "·" light, respectively. Specifically, aperture zones VZ B1, VZ G1, and VZ R1 pass only blue, green, and red "·" light, and VZ B2, VZ G2, and VZ R2 pass only blue, green, and red "-" light. The selective passage of "-" and "·" light can be realized by attaching polarizers to the apertures. Correspondingly, the sub-pixels emitting each color are spatially divided into M = 2 groups, one emitting "·" light and the other "-" light. For example, in FIG. 8, sub-pixels SP Bn1, SP Bn3, SP Bn5, … form spatial blue sub-pixel group 1, and SP Bn2, SP Bn4, … form spatial blue sub-pixel group 2. To ensure that each group's sub-pixels are spread across the display device 10, the sub-pixels of the M = 2 same-color groups are optimally interleaved along the sub-pixel arrangement direction; e.g. in FIG. 8, adjacent same-color sub-pixels along x belong to different groups. Thus K′ × M = 6 sub-pixel groups correspond one-to-one to 6 apertures, each aperture serving as the viewing zone of its corresponding sub-pixel group and passing only that group's projected light. The two mutually perpendicular linear polarization states shown in FIG. 8 may also be replaced by two optical rotation states of opposite handedness.
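The selective passage of the FIG. 8 aperture array can be sketched as a lookup. This is an illustrative model (state labels "dot"/"dash" stand in for the "·" and "-" polarizations; they are not the patent's notation): each aperture passes only light matching both its color filter and its polarization state.

```python
# Illustrative sketch of the FIG. 8 aperture array: M = 2 groups of
# K' = 3 filtered apertures; an aperture passes light only when both the
# color filter and the polarization state match.

def build_aperture_array(colors, states):
    return [{"filter": c, "state": s} for s in states for c in colors]

apertures = build_aperture_array(["R", "G", "B"], ["dot", "dash"])

def passes(aperture, color, state):
    """True if light of this color and polarization exits the aperture."""
    return aperture["filter"] == color and aperture["state"] == state
```

Each of the K′ × M = 6 apertures thereby admits exactly one of the six sub-pixel groups' projected light, which is what makes each aperture usable as that group's viewing zone.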
The above orthogonal characteristics may also be time-sequential orthogonal characteristics, transmitting non-simultaneously at different time points. As shown in FIG. 9, the aperture array contains M = 2 aperture groups, each containing K′ = 3 apertures bearing the filters F R, F G, and F B, and the two groups are opened non-simultaneously, under the control device 30, at the two time points t and t+Δt/2 of one time period Δt. Specifically, apertures VZ B1, VZ G1, and VZ R1 open at time t, and VZ B2, VZ G2, and VZ R2 open at time t+Δt/2; FIG. 9 shows the situation at time t. Correspondingly, the sub-pixels emitting the same color are divided temporally into two time-sequential sub-pixel groups, spatially identical but working at times t and t+Δt/2 respectively, with different corresponding viewing zones when each sub-pixel's loaded information is determined. For example, at time t, the time-sequential blue sub-pixel group 1 formed by SP Bn1, SP Bn2, SP Bn3, … takes aperture VZ B1 as its corresponding aperture viewing zone; at time t+Δt/2, the time-sequential blue sub-pixel group 2 formed by the same sub-pixels takes VZ B2. Time-sequential multiplexing likewise achieves projection through K′ × M = 6 viewing zones. A larger M requires a higher frame rate of the display device 10 to avoid flicker. In FIGS. 8 and 9, under the premise of optimally projecting at least K′ basic-color views and/or stitched views into the same pupil 50, the spatial positions of the viewing zones are interchangeable. The time-sequential orthogonal characteristics can be realized by using, as the aperture array, a controllable liquid crystal surface signal-connected to the controller 30.
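The FIG. 9 timing can be sketched as an opening schedule. This is an illustrative sketch (function name and values are not from the patent): M aperture groups open at evenly spaced offsets within one period Δt, e.g. t and t+Δt/2 for M = 2.

```python
# Illustrative sketch of the FIG. 9 time-sequential multiplexing: opening
# time offsets, within one period delta_t, for M aperture groups.

def aperture_open_times(m_groups, delta_t):
    return [m * delta_t / m_groups for m in range(m_groups)]

times = aperture_open_times(2, 1.0)  # M = 2 groups, unit period
```

Because all M groups must be refreshed within one period to exploit persistence of vision, the display frame rate must scale with M, which is the flicker constraint noted above.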
Further, the above orthogonal characteristics may also be mixed characteristics, for example a mixture of time-sequential orthogonal characteristics and linear polarization characteristics. As shown in FIG. 10, the apertures VZ R1, VZ G1, and VZ B1, which pass only "·" light and bear the filters of the primary colors R, G, and B, open only at time t of a period t~t+Δt and close at t+Δt/2; the apertures VZ R2, VZ G2, and VZ B2, which pass only "·" light and bear the R, G, B filters, open only at t+Δt/2 and close at t; the aperture zones VZ R3, VZ G3, and VZ B3, which pass only "-" light and bear the R, G, B filters, open only at t and close at t+Δt/2; and the aperture zones VZ R4, VZ G4, and VZ B4, which pass only "-" light and bear the R, G, B filters, open only at t+Δt/2 and close at t. Correspondingly, the sub-pixels of each emitted color are spatially divided into two interleaved spatial sub-pixel groups, each of which is further divided temporally into two time-sequential sub-pixel groups within one period. Then within one period t~t+Δt, the four mutually independent sub-pixel groups of each emitted color project four views to their four corresponding aperture viewing zones, ultimately generating 12 viewing zones in total. Repeating likewise in other periods, by persistence of vision these 12 viewing zones provide the observer's pupil 50 with denser projected views and a larger viewing space.

In the mixed-characteristic example of FIG. 10, the sub-pixels of a given group emit light of the same orthogonal characteristic. Arrangements in which the sub-pixels of one group do not have fully identical orthogonal characteristics also exist. As shown in FIG. 11, take dividing the display device 10 along x into BN = 3 region blocks as an example (BN ≥ 2); among the three blocks B 1, B 2, B 3, the sub-pixels of adjacent blocks emit light of different orthogonal states: for example, all sub-pixels on B 1 emit "·" light, all on B 2 emit "-" light, and all on B 3 emit "·" light. The sub-pixels of each emitted color form one sub-pixel group. The sub-pixels of a group on different blocks, though emitting the same color, have different linear polarization characteristics and correspond to different apertures. Taking the blue-emitting sub-pixel group as an example: its sub-pixels on B 1 correspond, at time t of a period t~t+Δt, to the then-open aperture zone VZ B1, and at t+Δt/2 to the then-open VZ B2; its sub-pixels on B 2 correspond at t to the then-open VZ B3 and at t+Δt/2 to the then-open VZ B4; and its sub-pixels on B 3 correspond at t to the then-open VZ B5 and at t+Δt/2 to the then-open VZ B6. Thus the blue-emitting group's sub-pixels on different region blocks project their light information through different corresponding viewing zones. The sub-pixel distribution region corresponding to each zone becomes smaller, relieving the restriction on the viewing zones caused by the limited aperture size constraining the visible sub-pixel distribution region. As shown in FIG. 11, the sub-pixels of the other sub-pixel groups on the BN = 3 region blocks likewise project information through BN = 3 aperture viewing zones at the respective time points. With this sub-pixel and viewing-zone arrangement, one sub-pixel group corresponds not to one viewing zone but to BN (≥ 2) viewing zones, and each sub-pixel's corresponding viewing zone is one of the BN zones of its group. Limited by the number of available orthogonal characteristics, for example linear polarization offering only two mutually perpendicular orthogonal states, noise may arise when the light of a block's sub-pixels can exit through the viewing zone of the same group's sub-pixels on a non-adjacent block, such as the light of sub-pixel SP Bn1 leaking through zone VZ B5, or that of SP Gn2 through VZ G5, in FIG. 11. But thanks to the combined use of time-sequential and linear-polarization orthogonal multiplexing, such noise can be prevented from entering the observer's pupil 50 through the design of the viewing-zone distribution. For example, in FIG. 11, 11 viewing zones separate VZ B1 and VZ B5; the fairly large angular separation between the useful beam projected by SP Bn1 through its corresponding zone VZ B1 and its noise beam projected through the non-corresponding zone VZ B5 ensures that the noise cannot enter a pupil 50 within the observation region.
In the structures of FIGS. 8 to 11, the viewing zones can be divided into two groups, i.e. the apertures of the aperture-type light splitting device 20 can be divided into two groups placed corresponding to the observer's left pupil 50′ and right pupil 50 respectively, as shown in FIG. 12, applying the method of this patent directly to both of the observer's eyes. Further, when the number of groups into which the viewing zones are divided can correspond to the eyes of more than one observer, binocular display for multiple viewers is possible.

In the figures above, each aperture may be an elongated strip aperture that can only be arranged along one dimension, its size being smaller than the pupil diameter D p along the arrangement direction and larger than D p along some other direction. In the other case, apertures whose size is smaller than D p along every direction are called point apertures and may be arranged along one or two dimensions. When point apertures are used, the examples above with apertures arranged along one dimension extend to point apertures arranged along two dimensions.

In the example figures above, adjacent sub-pixels are spatially staggered. In fact, the K′ basic-color sub-pixels may also spatially coincide at one point, for example in a display method that time-sequentially projects K′ colors of backlight onto one and the same spatial sub-pixel through a color wheel. Then a time segment displaying one image in the above process, for example the segment t~t+Δt/2, must be further divided into K′ sub-segments in which the K′ colors of backlight are incident in turn, equivalent to K′ sub-pixels of different colors loading light information time-sequentially.

When K′ = 3 views and/or stitched views of different primary colors enter the same pupil 50 to build a three-dimensional scene, the displayed scene may also be of a single primary color, for example a scene or partial scene that is green. In that case, if only one beam from one green sub-pixel group passes through the green spatial light point to be displayed and enters the pupil 50, the requirement that at least two beams through that green spatial point enter the pupil cannot be met, breaking monocular focusing. The light information of such a green spatial point can instead be designed as white W (R+G+B) plus green G, whose presentation requires superimposing one beam from each of the three sub-pixel groups emitting the primary colors, avoiding the above phenomenon.

The display device 10 above takes K′ = K = 3 as an example. In fact, the application of the method of this patent is not limited by the values of K′ and K. For example, a common display whose pixel consists of the four sub-pixels R, G, B, and W can likewise perform monocular multi-view display based on the above principle and method, with K′ = 4 and K = 3. The white sub-pixel's emitted light can pass through the filters corresponding to the other three primary-color classes. To avoid the noise produced by white-light projection transmitting those filters, the aperture viewing zones corresponding to the white sub-pixel group, which allow white light to pass, must have orthogonal characteristics different from those of the zones corresponding to the three primary-color sub-pixel groups, ensuring that each sub-pixel group's emitted light has only its corresponding aperture viewing zone to pass through. For example, the primary-color sub-pixel groups and the white sub-pixel group may project light information staggered in time, their corresponding aperture zones opening correspondingly staggered; or the light emitted by the primary-color groups and by the white group may be left-handed and right-handed circularly polarized respectively, their corresponding aperture zones correspondingly passing only left-handed or right-handed light.

In addition, the method of this patent does not constrain the shape of the sub-pixels of the display device 10; they may be the rectangular sub-pixels common in existing display devices, or the display may be designed with square sub-pixels. The sub-pixel arrangement may also differ from the RGB arrangement above, for example a pencil arrangement. Moreover, the display device 10 in the figures above is exemplified by a thin-structure display; in fact it may be any other type of display, such as a thicker transmissive or reflective display requiring a backlight. The apertures of the aperture-type light splitting device 20 may also be reflective apertures.
In the structures of FIGS. 7 to 11 above, a projection device 40 can be introduced, for example the projection device 40 placed at position Po1 in FIG. 5. With the aperture-type light splitting device 20, the introduced projection device 40 and the light splitting device 20 can be placed at a small distance from each other, as in FIG. 13, where their positions may also be interchanged. In this case, the image I 10 of the display device 10 with respect to the projection device 40 serves as an equivalent display device replacing the aforementioned display device 10, and monocular multi-view display can proceed by the same method and process. FIG. 13 shows two aperture groups as an example; there may also be only one group, or more groups. Different groups may have the different orthogonal characteristics described in FIGS. 7 to 11, for example different time-sequential characteristics, different optical rotation characteristics, or composite characteristics. The structure incorporating the projection device 40 is often used as one eyepiece of a near-eye display optical engine; two such eyepieces build a binocular display optical structure, as in FIG. 14.

The structure of FIG. 13 can be further extended into composite structures to optimize its display performance and optical size. The structure of FIG. 15 increases the number and density of projected views by overlapping the images of two display devices formed by their respective projection devices. The structure of FIG. 16 enlarges the display field of view through the stitched image I 10+I 10′ formed by seamlessly joining the images of two display devices formed by their respective projection devices. FIG. 17 is similar to FIG. 16, but its stitched image is stitched along a curved surface. FIG. 18 introduces an auxiliary imaging device 80 that re-images the stitched image of the display devices, formed by their respective projection devices, into a secondary stitched image. In FIG. 18, images I 10 and I 10′ are stitched into the stitched image I 10+I 10′, where I 10 is the image of display device 10 formed by projection device 40 and I 10′ is the image of display device 10′ formed by projection device 40′; the auxiliary imaging device 80 images this stitched image I 10+I 10′ again, yielding the secondary stitched image I I10+I I10′ formed by the image I I10 of I 10 and the image I I10′ of I 10′. A feature of the structure of FIG. 18 is that each corresponding combination of display device, projection device, and aperture-type light splitting device, for example the combination of display device 10, projection device 40, and aperture-type light splitting device 20, can be made quite small overall when the display device 10 is very small, and placed in the small hole of the thin eyepiece structure shown in FIG. 19, making the display structure genuinely thin and light. The compensation device 801 cancels the effect of the auxiliary imaging device 80 on external ambient light, so that ambient light enters the observer's pupil 50 with no or small distortion; when no external ambient light is needed, the compensation device 801 can be removed. Between the auxiliary imaging device 80 and the compensation device 801 is a supporting medium, for example optical glass, providing a support body for mounting the combined display-device, projection-device, and aperture-type-light-splitting-device structures. In fact, when the image of the display device projected by the projection device in such a combined structure is large enough, i.e. when the above stitched image is large enough, the secondary stitched image obtained by re-imaging is unnecessary, and the auxiliary imaging device 80 and compensation device 801 can be removed. Further, more combined structures of display devices, projection devices, and aperture-type light splitting devices can project more stitched images, which may overlap, partially overlap, or lie at different depths. In FIG. 20, the images of display devices 10 and 10″ are stitched into the stitched image I 10+I 10″, and the images of display devices 10′ and 10″′ into the stitched image I 10′+I 10″′. These two stitched images may overlap or be offset in depth, and on the plane perpendicular to depth they may fully overlap or be offset with partial overlap. For clarity, some components are not labeled in the figure; the two stitched images in FIG. 20 are exemplified as offset along the depth direction and partially overlapping with an offset on the depth-perpendicular plane. The auxiliary imaging device 80 of FIGS. 18 and 20 may also be placed between the projection devices and the light splitting devices; the combined structures may also be arranged along a curved surface, or extended to arrangement over a two-dimensional surface. The figures above may also contain more combined structures.

A relay device 60 can also be introduced into the structure of FIG. 13 to guide each sub-pixel's projected light toward the region of the observer's pupil 50, like the transflective surface of FIG. 6. The relay device 60 may be chosen from various other optical devices or optical assemblies. When the relay device 60 consists of multiple components, it may be placed independently of the projection device 40 or share some components with it, for example the free-form surface composite device shown in FIG. 21. That free-form surface composite device is composed of the transmissive curved surface FS1, the reflective curved surface FS2, the transflective curved surface FS3, the transmissive curved surface FS4, and the transmissive curved surface FS5; FS1, FS2, FS3, and FS4 together perform the function of the projection device 40, FS2 and FS3 perform the function of the relay device 60, and FS5 has a compensating modulation function, allowing external ambient light to enter the observer's pupil 50 without being affected by FS3 and FS4.

The relay device 60 may also be an optical waveguide device, called the optical-waveguide relay device 60. In the structure of FIG. 22, the optical-waveguide relay device 60 is placed between the display device 10 and the aperture-type light splitting device 20 and comprises an entrance pupil 611, a coupling-in device 612, a waveguide body 613, reflective surfaces 614a and 614b, a coupling-out device 615, and an exit pupil 616. The projection device 40 comprises two components, lens 40a and lens 40b. The light emitted by a sub-pixel of the display device 10, for example sub-pixel p 0, is converted into parallel light by component 40a of the projection device 40; it then enters the coupling-in device 612 through the entrance pupil 611 of the optical-waveguide relay device 60; the coupling-in device 612 guides the parallel light from sub-pixel p 0 within the waveguide body 613 toward the coupling-out device 615 via reflections at surfaces 614a and 614b; the coupling-out device 615 modulates the incident beam and guides it through the exit pupil 616 into component 40b of the projection device 40; and component 40b guides the light projected from sub-pixel p 0 toward the aperture-type light splitting device 20 while modulating it to converge backward to the virtual image p′ 0, which is the virtual image of sub-pixel p 0. Likewise, p′ 1 is the corresponding virtual image of sub-pixel p 1; the sub-pixel images p′ 0, p′ 1, etc. form the image I 10 of the display device 10. The coupling-out device 615 may have a pupil-dilation function: it guides part of the incident beam toward the exit pupil while allowing the rest to continue along the original transmission path, be reflected by surface 614b, re-enter the coupling-out device 615 for out-coupling and reflection, and so on, repeating until the beams from the sub-pixels cover the exit pupil 616. Then, taking the image I 10 of the display device 10 as the equivalent display device, monocular multi-view display can proceed based on the above principle and method. The compensation device 801 compensates the effect of component 40b of the projection device 40 on externally incident ambient light and can be removed when no ambient light is needed. The projection device component 40b in the figure may also be compounded into the coupling-out device 615, for example by placing a holographic element at the position of the coupling-out device 615 in the figure to perform the functions of both the coupling-out device 615 and the projection component 40b; when this compounded coupling-out device 615 is angle-selective, i.e. modulates only the beams arriving from the coupling-in device 612 and does not act on externally incident ambient light, the compensation device 801 can also be removed. FIG. 22 uses only one common waveguide device as an example for illustration; in fact, the various existing waveguide structures can all serve as the optical-waveguide relay device 60 of this patent, for example a waveguide whose coupling-in device 612 is a reflective surface, or a waveguide using multiple transflective surfaces for pupil dilation. Regarding the dispersion problem of waveguide devices, a stacked multi-waveguide structure may also be used, as shown in FIG. 23, with three waveguide components respectively responsible for propagating and guiding the R, G, and B beams; their coupling-in and coupling-out devices can be designed for the wavelengths of the beams they carry, reducing the dispersion effect.

In the structure of FIG. 22, the optical-waveguide relay device 60 lies between the display device 10 and the aperture-type light splitting device 20. The optical-waveguide relay device 60 may also be placed after the aperture-type light splitting device 20 along the beam propagation direction, as shown in FIG. 24. The light emitted by each sub-pixel is converted into parallel light by component 40a of the projection device 40 and enters the aperture-type light splitting device 20; it then enters the coupling-in device 612 through the entrance pupil 611 of the optical-waveguide relay device 60; the coupling-in device 612 guides the parallel light from each sub-pixel within the waveguide body 613 toward the coupling-out device 615 via reflections at surfaces 614a and 614b; the coupling-out device 615 modulates the incident beam and guides it through the exit pupil 616 into component 40b of the projection device 40; and component 40b guides the light projected from each sub-pixel onward while modulating it to converge backward into a virtual image, so that the projection device 40 composed of components 40a and 40b, together with the waveguide relay device 60, generates the virtual image I 10 of the display device 10. The light transmitted at each point of the aperture-type light splitting device 20 is divergent. To ensure that each sub-pixel's emitted light enters the observer's pupil along only one vector direction, each point of the aperture-type light splitting device 20 can be required to produce, through the optical-waveguide relay device 60 and component 40b of the projection device 40, a unique image, these point images combining into a unique image of the aperture-type light splitting device 20. The position at which the image I 20 of the light splitting device 20 is projected depends on the concrete parameters of the optical components, and may for example be the position shown in FIG. 24. Taking the image I 10 of the display device 10 as the effective display device and the aperture images on I 20 as the effective viewing zones, monocular multi-view display is realized based on the foregoing principle and method. In this case, if the optical-waveguide relay device 60 has a pupil-dilation function, dilation generates multiple images of the aperture-type light splitting device 20. Then the beams that each sub-pixel projects at one and the same time point through different images of the device 20, along different vector directions, must have a sufficiently large angular spacing to ensure they do not enter the observer's pupil simultaneously. A tracking device 70 is then needed to determine the position of the pupil 50 in real time, and the control device 30 determines from that position the unique directional beam each sub-pixel projects into the pupil 50 and, based on that beam's vector direction, determines the sub-pixel's loaded light information by the foregoing method. In the figure, the aperture-type light splitting device 20 may also be at the position Po5 shown. FIG. 24 takes reflective surfaces as the coupling-in device 612 and coupling-out device 615; the waveguide device chosen in FIG. 22, or other waveguide devices, may also be used; a compensation device 801 may also be introduced; and component 40b of the projection device 40 may also be compounded into the coupling-out device 615.

When the optical-waveguide relay device 60 is placed after the aperture-type light splitting device 20 along the beam propagation direction, the light transmitted at each point of the device 20 can also be designed to enter the optical-waveguide relay device 60 in a parallel state. As shown in FIG. 25, components 40c, 40d, and 40e are designed to form the projection device 40; through 40c, 40d, the components of the relay device 60, and 40e, the image I 20 of the aperture-type light splitting device 20 is formed. The light projected by each sub-pixel is converted to a parallel state by component 40c of the projection device 40; the divergent light transmitted at each point of the aperture-type light splitting device 20 enters the waveguide relay device in a parallel state via component 40d; and component 40e converges the parallel light projected from each point of the device 20, forming the image I 20 of the aperture-type light splitting device 20. To ensure that each sub-pixel's emitted light enters the observer's pupil along only one vector direction, the optical structure of FIG. 25 can be required to give each sub-pixel's projected light, through 40c, 40d, the components of the optical-waveguide relay device 60, and 40e, only one corresponding image, these images combining into the image I 10 of the display device 10, whose position depends on the concrete parameters of the light path, for example the position shown in FIG. 25. When the optical-waveguide relay device 60 has a pupil-dilation function, each sub-pixel generates multiple images because of the dilation. In that case, the different images corresponding to each sub-pixel at one and the same time point must be spaced widely enough to ensure they do not enter the observer's pupil 50 simultaneously; a tracking device 70 is then needed to determine the position of the pupil 50 in real time, and the control device 30 determines from that position the beam each sub-pixel projects into the pupil 50 and, based on its vector direction, determines the sub-pixel's loaded information by the foregoing method. In FIG. 25, component 40c of the projection device 40 may also be removed. When the optical-waveguide relay device 60 is used, FIGS. 22, 24, and 25 take as examples the cases where the light transmitted at each point of the aperture-type light splitting device, or the light emitted by each sub-pixel of the display device 10, is converted into parallel light before entering the optical-waveguide relay device 60. In fact, provided that light information of at least two views and/or stitched views enters the same pupil 50 of the observer, the positional relationships of the relevant components in FIGS. 22, 24, and 25 may also be changed, or new components introduced, and the light transmitted at each point of the aperture-type light splitting device or emitted by each sub-pixel may even enter the optical-waveguide relay device 60 in a non-parallel state.

Embodiment 2
The display device 10 is a passive display requiring a backlight, and the multiple backlight sources needed form a backlight source array 110. The imaging device that images each backlight source of the array 110 is taken as the light splitting device 20, named the imaging-type light splitting device 20, as shown in FIG. 26. The display device 10 is exemplified by a common RGB display, each pixel consisting of three sub-pixels modulating and projecting R (red), G (green), and B (blue) light, arranged along the x direction. FIG. 26 shows only one row of sub-pixels along x, each labeled by the color R, G, or B of the light it projects; for example, sub-pixel SP Bn1 carries the subscript B to mark its projected light as blue. These K′ = 3 classes of basic-color sub-pixels with mutually different emitted colors are also all primary-color classes, i.e. K′ = K = 3. The backlight source array 110 of FIG. 26 consists of one group of K′ = 3 backlight sources BS B, BS G, and BS R, which emit the K′ = 3 basic colors: BS B emits B light, BS G emits G light, and BS R emits R light. Sub-pixels emitting the same color each form one sub-pixel group: the sub-pixels of the display device 10 are grouped into a red sub-pixel group, a green sub-pixel group, and a blue sub-pixel group. Along the transmission path of each backlight source's emitted light, an imaging device forming a real image of each backlight source is placed as the imaging-type light splitting device 20; the images of the K′ = 3 sources are correspondingly labeled I BSB, I BSG, and I BSR. FIG. 26 concretely takes a lens as an example of the imaging-type light splitting device 20. Since each pixel group modulates only incident light of its corresponding color, the K′ = 3 backlight sources become the backlights of their respective corresponding pixel groups: BS B is the backlight corresponding to the blue sub-pixel group, BS G to the green sub-pixel group, and BS R to the red sub-pixel group. Light of a backlight source passing through non-corresponding sub-pixel groups may produce noise due to a nonzero transmittance; between a backlight source emitting a primary color and its corresponding sub-pixel group, this noise is negligible, and the primary-color light emitted by a backlight source is considered not to pass through the non-corresponding primary-color sub-pixel classes. By the object-image relationship, at the image of each backlight source the light information projected by that source's corresponding sub-pixel group is visible; that is, the light distribution region of each source's image is the viewing zone corresponding to that source's sub-pixel group. The emitted light of every sub-pixel of any sub-pixel group is directed by the imaging-type light splitting device 20 toward that group's corresponding viewing zone. By the principle of FIG. 2, when the spacing of adjacent viewing zones is small enough, at least two views and/or stitched views can enter the observer's pupil 50, realizing monocular multi-view display. Here each viewing zone is the image of the corresponding backlight source formed by the imaging-type light splitting device; its size along a given direction means the size occupied, within the light distribution of the source's image along that direction, by the region where the intensity exceeds 50% of the peak. The relative positions of the display device 10 and the imaging-type light splitting device 20 shown in the figure are not mandatory; provided that each backlight source's projected light can cover the display device 10, the display device 10 may also be placed at position Po2, Po3, or Po4 in the figure.
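With a lens exemplifying the imaging-type light splitting device, the position and size of each backlight source's image, and hence of each viewing zone, can be sketched with the thin-lens equation. This is an illustrative first-order approximation, not the patent's exact optics; function and parameter names are assumptions.

```python
# Illustrative thin-lens sketch: each backlight source at object distance
# s_obj from a lens of focal length f is imaged at s_img with lateral
# magnification mag; the image's light distribution serves as the viewing
# zone of that source's sub-pixel group.

def thin_lens_image(s_obj, f):
    """Image distance and lateral magnification: 1/s_img = 1/f - 1/s_obj."""
    s_img = 1.0 / (1.0 / f - 1.0 / s_obj)
    return s_img, -s_img / s_obj

def zone_size(source_size, s_obj, f):
    """Size of the viewing zone formed as the image of a backlight source."""
    _, mag = thin_lens_image(s_obj, f)
    return abs(mag) * source_size
```

This makes explicit why a backlight source may have a certain physical size: the zone size scales with the magnification, so source size and imaging geometry together must keep the zone image smaller than the pupil diameter along at least one direction.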
为了实现显示场景色彩的准确呈现,最优的应该K′个基本色视图均入射观察者瞳孔50。这种情况下,仅K′个视区的呈现,导致观察者瞳孔50相对于成像型分光器件20的位置比较有限。希望更多的视区,以允许观察者瞳孔可以有一定的移动空间。引入正交特性至背光源阵列110,可以有效地解决该问题。该正交特性可以是偏振方向相互垂直的两个线偏特性。如图27所示,其背光源阵列110包含M=2组背光源组,各背光源组均包含出射K′=3种基本色光的K′=3个背光源。且该M=2组背光源组分别出射相互垂直的线偏光“-”光和“·”光。“-”和“·”为偏光方向相互垂直的两个正交线偏态。具体地,背光源BS B1、BS G1、BS R1分别投射蓝色、绿色、红色“·”光,背光源BS B2、BS G2、BS R2分别投射蓝色、绿色、红色“-”光。对应地,各出射相同颜色光的子像素,也对应地分别空间分割为M=2个空间子像素组,一组仅接收并调制“·”光,另一组仅接收并调制“-”光。如图,子像素SP Bn1、SP Bn3、…组成蓝色子像素组1,子像素SP Bn2、…组成蓝色子像素组2;子像素SP Gn1、SP Gn3、…组成绿色子像素组1,子像素SP Gn2、…组成绿色子像素组2;如此类推。为了保证各空间子像素组的子像素遍布显示器件10排列,出射相同颜色光的M=2个空间子像素组的子像素,最优地沿子像素排列方向相间排列,例如图27中,沿x向相邻M=2同色子像素分属不同子像素组。如此,共K′×M=6个空间子像素组和6个背光源一一对应,各背光源经成像型分光器件20所成像,作为该背光源对应空间子像素组的视区,仅有对应空间子像素 组各子像素投射光通过。图27中所示两个相互垂直的线偏态,也可以用两个旋向相反的旋光态代替。
上述正交特性,也可以是不同时间点不同时透射的时序正交特性。如图28所示,其背光源阵列110包含M=2组背光源组,各背光源组均包含出射K′=3种基本色光的K′=3个背光源。该M=2组背光源组分别在一个时间周期Δt的M=2个时间点t和t+Δt/2,由控制器件30控制,不同时地投射背光。具体地,背光源BS B1、BS G1、BS R1所组成背光源组在时间点t投射背光,在时间点t+Δt/2不投射;背光源BS B2、BS G2、BS R2所组成背光源组在时间点t+Δt/2打开,在时间点t不投射。图28所示为时间点t对应情况。对应地,出射相同颜色光的子像素,从时序上分割为两组时序子像素组,该两组时序子像素组的子像素于空间上完全一样,但于时间上,分别于点t和点t+Δt/2进行信息加载时,其对应视区不同。例如,在时间点t,SP Bn1、SP Bn2、SP Bn3、…组成的蓝色子像素组以VZ B1作为其对应视区;在时间点t+Δt/2,SP Bn1、SP Bn2、SP Bn3、…组成的蓝色子像素组以VZ B2作为其对应视区。其它各子像素组同理。基于时序复用,可以实现K′×M=6个视区对应视图的投射。更大的M值要求显示器件10具有更高的帧频,以避免闪烁效应的出现。图27和图28中,在最优地投射至少K′个基本色视图或/和拼合视图至观察者同一瞳孔50的前提下,各背光源的空间位置可以互换。所述时序正交特性的实现,可以通过控制器30,控制各光源组的开关,并以相应光信息同步刷新显示器件10各子像素来实现。
Further, the orthogonal characteristics may be mixed, for example a combination of temporal orthogonality and linear polarization. As shown in Fig. 29, within one period t ~ t+Δt, the "·"-light sources BS_R1, BS_G1, BS_B1 turn on at time t and off at t+Δt/2; the "-"-light sources BS_R2, BS_G2, BS_B2 turn on at t and off at t+Δt/2; the "·"-light sources BS_R3, BS_G3, BS_B3 turn on at t+Δt/2 and are off at t; the "-"-light sources BS_R4, BS_G4, BS_B4 turn on at t+Δt/2 and are off at t. Correspondingly, the sub-pixels of each emission color are spatially divided into two spatial sub-pixel groups, which receive and modulate only "·" light and only "-" light respectively; and each spatial group is further divided temporally into two temporal sub-pixel groups within the period t ~ t+Δt. Thus, within one period t ~ t+Δt, the four mutually independent sub-pixel groups of each emission color each take the image of their corresponding source as their viewing zone and project the view associated with that zone. For example, the blue sub-pixel group SP_Bn1, SP_Bn3, … projects the view of VZ_B1 toward VZ_B1 at time t and the view of VZ_B3 toward VZ_B3 at time t+Δt/2, while the blue sub-pixel group SP_Bn2, … projects the view of VZ_B2 toward VZ_B2 at time t and the view of VZ_B4 toward VZ_B4 at time t+Δt/2. The process repeats in subsequent periods. Based on persistence of vision, the 12 viewing zones thus generated provide the observer's pupil 50 with a denser viewing-zone distribution and a larger viewing space.
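The bookkeeping of this combined polarization and time multiplexing can be sketched as a mapping from (color, polarization state, time slot) to viewing zone. The slot labels and the zone numbering below are our own, chosen to match the VZ_B1 … VZ_B4 example in the text:

```python
from itertools import product

colors = ["R", "G", "B"]            # K' = 3 basic colors
slots = ["t", "t+dt/2"]             # M = 2 time points per period
polarizations = ["dot", "dash"]     # the two orthogonal linear states

zones = {}
for color, (si, slot), (pi, pol) in product(
        colors, enumerate(slots), enumerate(polarizations)):
    index = 2 * si + pi + 1         # zones 1..4 per color, slot-major
    zones[(color, pol, slot)] = f"VZ_{color}{index}"

print(len(zones))                       # 12 independent viewing zones
print(zones[("B", "dot", "t")])         # VZ_B1
print(zones[("B", "dash", "t+dt/2")])   # VZ_B4
```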
The number of viewing zones shown in Figs. 27 to 29 grows with the degree of multiplexing. When enough viewing zones are generated, they can be divided spatially into two groups placed to correspond to the observer's left pupil 50′ and right pupil 50 respectively, as in Fig. 30, applying the method of this patent directly to both eyes. Correspondingly, the spatial distribution of the sources of the backlight array 110 changes accordingly. Going further, when the number of viewing-zone groups can correspond to the two eyes of more than one observer, binocular display for multiple viewers becomes possible.
In the figures above, each backlight source may be a strip source that can be arranged only along one dimension: the image of such a source is smaller than the observer's pupil diameter D_p along the arrangement direction and larger than D_p along some other direction. Alternatively, when the image of a source is smaller than D_p along every direction, the source is called a point source; point sources may be arranged along one or two dimensions. That is, each source may have a finite size. The strip sources in the left panel of Fig. 31 and the point sources in the right panel have images whose light-distribution sizes, serving as viewing zones, satisfy the above description. The size of a source's image along a given direction here means the size occupied, along that direction, by the region of the image's light distribution whose intensity exceeds 50% of the peak intensity. When point sources are adopted, the above examples with sources arranged along one dimension extend to point sources arranged along two dimensions.
When strip sources are selected, coverage of both of the observer's eyes by the viewing zones can also be achieved by another design. As in Fig. 32, by tilting the viewing-zone arrangement direction away from the line connecting the observer's eyes by a larger angle, the coverage, along that binocular direction x′, of the light projected through the viewing zones is enlarged. Fig. 32 takes the case in which the pupils 50 and 50′ lie exactly on the viewing-zone distribution plane. The sub-pixel groups of the display device 10 project light through the light-splitting device 20 to viewing zones VZ_R1, VZ_G1, VZ_B1, …. Denote by D_cv the extent these zones cover on their plane along their arrangement direction x. The more the arrangement direction x is tilted away from the binocular direction x′, i.e. the smaller the angle φ shown in the figure, the larger the coverage D_cv/sin φ of the viewing zones along x′, which favors providing a larger viewing region along the binocular direction x′. The figure shows x rotated clockwise away from x′; it may equally be rotated counterclockwise. Meanwhile, the distribution of the viewing zones must still satisfy the requirement that adjacent zones along x be spaced by less than the pupil diameter D_p. Even when D_cv < D_e-e, where D_e-e is the observer's interocular distance, a suitable design of the angle φ can still provide simultaneous monocular multi-view presentation to both eyes. In fact, when the observer's eyes are not on the viewing-zone distribution plane, a smaller angle φ likewise enlarges the coverage of the viewing zones along the binocular direction, except that in this case the views received by each eye may be composite views. In this process, the minimum of the angle φ is also constrained, to prevent light information exiting through the same viewing zone from entering both eyes simultaneously.
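If the tilt enters the coverage as D_cv/sin φ (our reading of the geometric relation in Fig. 32, with φ the angle labeled there), the benefit of a small φ can be sketched with illustrative numbers that are not from the patent:

```python
import math

def coverage_along_eyes(d_cv, phi_deg):
    """Coverage of the viewing-zone row along the binocular line x'
    for a tilt angle phi, under the assumed D_cv / sin(phi) relation."""
    return d_cv / math.sin(math.radians(phi_deg))

# A 30 mm zone row versus a typical ~65 mm interocular distance:
print(coverage_along_eyes(30, 90))       # 30.0, no tilt benefit
print(coverage_along_eyes(30, 20) > 65)  # True, small phi spans both eyes
```

As the text notes, φ cannot be made arbitrarily small, or one zone's light would reach both eyes at once.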
In the example figures above, adjacent sub-pixels are spatially staggered. In fact, the K′ basic-color sub-pixels may also overlap spatially at one point; this requires the K′ sources of each backlight group of the array 110 to project the K′ basic colors in temporal turn. In that case, each time segment used above to display one image, e.g. the segment t ~ t+Δt/2, must be further divided into K′ sub-segments during which the K′ colored backlights are incident in turn, equivalent to K′ sub-pixels of different colors loading light information sequentially.
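The sub-slot duration implied by this color-sequential scheme can be sketched as follows; the 60 Hz period is an illustrative assumption:

```python
def color_subslot(delta_t, m, k_prime):
    """Duration of one color sub-slot when each of the M time slots of a
    period delta_t is further divided into K' color flashes, as in the
    overlapped-sub-pixel case described above."""
    return delta_t / (m * k_prime)

# One 1/60 s period, M = 2 slots, K' = 3 colors:
print(color_subslot(1 / 60, 2, 3))  # ≈ 0.00278 s per color, i.e. ~2.78 ms
```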
When K=3 views and/or composite views of the K=3 primary colors enter the same observer pupil 50 to present a three-dimensional scene, the displayed scene may itself be of a single primary color, e.g. the scene or part of it is green. In that case, if only one beam from one green sub-pixel group passing through a green spatial point to be displayed enters the pupil 50, the requirement that at least two beams through that green spatial point enter the pupil cannot be met, breaking monocular focusing. The light information of that green spatial point can instead be designed as white W (R+G+B) plus green G, whose presentation requires the superposition of three primary beams rather than a single green beam, avoiding the above problem.
The display device 10 in the figures above takes K′=K=3 as an example. In fact, the application of the method of this patent is not limited to particular values of K′ and K. For example, when a common display whose pixels each consist of K′=4 sub-pixels R, G, B and W (white) is used as the display device 10, at least one backlight group of the array 110 contains K′=4 sources emitting R, G, B and W light respectively; in this case K=3. The white light projected by the W source for the white sub-pixels would pass through the other three classes of primary-color sub-pixels and produce noise. To avoid this noise, the white light projected by the W source and the light projected by the K=3 primary-color sources must have different orthogonal characteristics. For example, the primary-color sources and the white source are switched on at staggered times by the control device 30, with their corresponding sub-pixel groups loading light information synchronously; or the light emitted by the primary-color sources and by the white source is left-handed and right-handed circularly polarized respectively, with their corresponding sub-pixel groups receiving and modulating only the respective handedness.
In addition, the method of this patent allows the sub-pixels of the display device 10 to take different shapes, e.g. the rectangles common in existing display devices, or squares, etc. The sub-pixels may also be arranged in various ways, e.g. a pencil arrangement. Moreover, although the figures above all take a transmissive display device 10 as an example, the display device 10 may also be a reflective display apparatus. The spatial positions of the sources are not limited to a planar arrangement either; they may also be staggered in depth.
Each of the above structures may further serve as a basic structure, with two or more basic structures combined into a composite structure to enlarge the field of view. As in Fig. 33, taking two basic structures as an example, backlight array 110, display device 10 and imaging-type light-splitting device 20 form one basic structure, while backlight array 110′, display device 10′ and imaging-type light-splitting device 20′ form another. The viewing zones generated by both basic structures are placed before the observer's pupil 50, and their display devices are arranged adjacently, extending the display field of view.
In the structures of Figs. 26 to 30 and the basic structures of Fig. 33, a projection device 40 may be introduced to project an image of the display device 10. Since the imaging-type light-splitting device 20 and the projection device 40 both perform imaging, they influence each other and often share components. In Fig. 34, components 21 and 22 form the imaging-type light-splitting device 20, and component 22 simultaneously serves as the projection device 40, forming a magnified virtual image I_10 of the display device 10. In Fig. 34, components 21 and 22 are both lenses, and the sources of the backlight array 110 lie on the front focal plane of component 21. In fact, the distance between the plane of the sources and component 21 need not equal the focal length of component 21, and the sources may even lie at different depths along the emission direction. Provided that the sources of the array 110 are imaged and the image of the display device 10 is projected, components 22 and 21 may be other physical optical devices. In Fig. 35, the imaging-type light-splitting device 20 and the projection device 40 are realized by a single lens device.
In Figs. 34 and 35, the image I_10 of the display device 10 formed by the projection device 40 serves as an equivalent display device, replacing the display device 10, and monocular multi-view display proceeds by the same method and process. A structure incorporating the projection device 40 is often used as one eyepiece of a near-eye display optical engine; two such eyepieces build a binocular display optical structure, as in Fig. 36.
In the structures of Figs. 34 and 35, a relay device 60 may also be introduced in the light-transmission path to guide the light projected by the sub-pixels toward the region of the observer's pupil 50, e.g. the half-transmitting half-reflecting surface of Fig. 6. The relay device 60 may also be any of various other optical devices or assemblies: in Fig. 37, a relay device 60 built from reflecting mirrors 61a, 61b, 61c placed at the viewing zones; in Fig. 38, one built from mirrors 62, 63a, 63b, 63c; in Fig. 39, one built from half-transmitting half-reflecting surface 64 and reflecting surfaces 65a, 65b, 65c; in Fig. 40, one built from angle-selective surface 66 and reflecting surfaces 67a, 67b, 67c. In Figs. 39 and 40 the display device 10 is a reflective display apparatus. The angle-selective surface 66 in Fig. 40 transmits the near-normally incident light coming from the backlight sources and reflects the beams reflected from the display device 10 that arrive at larger incidence angles. Fig. 41 shows an optical structure adopting a freeform-surface combiner, composed of transmissive surface FS1, reflective surface FS2, half-transmitting half-reflecting surface FS3, transmissive surface FS4 and transmissive surface FS5. FS1, FS2, FS3 and FS4 together perform the functions of the imaging-type light-splitting device 20 and the projection device 40; FS2 and FS3 perform the function of the relay device 60; FS5 performs compensating modulation, allowing ambient light to enter the observer's pupil 50 unaffected by FS3 and FS4. In the figure, a lens may also be placed between the backlight array 110 and the display device 10 as a component of the imaging-type light-splitting device 20, converging the light emitted by the sources. In Figs. 34 to 41 the backlight array 110 is shown containing a small number of backlight groups, one or two; it may contain more groups, and different groups may have the different orthogonal characteristics described with Figs. 27 to 29, e.g. different temporal characteristics, opposite circular polarizations, perpendicular linear polarizations, or composite characteristics.
The relay device 60 may also be an optical waveguide device, called a waveguide-type relay device 60, comprising entrance pupil 611, in-coupling device 612, waveguide body 613, reflecting surfaces 614a and 614b, out-coupling device 615 and exit pupil 616. In the structure of Fig. 42, the waveguide-type relay device 60 is placed between the backlight array 110 and the display device 10, guiding the light projected by each source of the array 110 onto the display device 10 with its corresponding directional characteristic. The light emitted by a source of the array, e.g. BS_B1, is converted into parallel light by component 21 of the imaging-type light-splitting device 20; it then enters the in-coupling device 612 through the entrance pupil 611; the in-coupling device 612 guides the parallel light from BS_B1 within the waveguide body 613, via reflections at surfaces 614a and 614b, toward the out-coupling device 615; the out-coupling device 615 modulates the incident beam and guides it through the exit pupil 616 onto the display device 10; component 22 of the imaging-type light-splitting device 20, which is also the projection device 40, converges the view projected by the sub-pixel group corresponding to BS_B1 toward the image of BS_B1, that image being the viewing zone VZ_B1 of the sub-pixel group corresponding to BS_B1. Component 22 simultaneously projects the image I_10 of the display device 10. The waveguide-type relay device 60 thus participates in imaging both the sources and the display device 10. An out-coupling device 615 built from partially reflecting surfaces 615a, 615b, 615c as components can expand the exit pupil: each component of the out-coupling device 615 directs part of an incident beam toward the exit pupil while allowing the rest to continue along its original path, be reflected by surface 614b, and strike the next component of the out-coupling device 615 for further out-coupling and reflection, and so on until the light from each source covers the display device 10. With the image I_10 of the display device 10 as the equivalent display device, monocular multi-view display then proceeds by the principles and methods above. The projection-device component 40b in the figure may also be integrated into the out-coupling device 615.
The waveguide-type relay device 60 may also be placed before the display device 10 along the light-propagation direction. As in Fig. 43, the light emitted by each source is converted into parallel light by component 21 of the light-splitting device 20 and is incident on the display device 10; it is then guided by the waveguide-type relay device 60 into component 22 of the light-splitting device 20, which converges it to the image of that source, forming the viewing zone of that source's corresponding sub-pixel group. Component 22 of the light-splitting device 20 is simultaneously the projection device 40, and the waveguide-type relay device 60 likewise participates in imaging both the sources and the display device 10. In Fig. 43, the front-back order of component 21 of the light-splitting device 20 and the display device 10 may also be interchanged.
In the figures of this embodiment, when the light emitted by a source is said to be converted into parallel light by other devices, this refers to the case of a point light source. When a source is a finite-size point-like or strip source as described above, the corresponding situation is that the light emitted from each point of that source is converted into parallel light propagating along the corresponding direction. When the waveguide-type relay device 60 is adopted, Figs. 42 and 43 are described for the case in which the light from each point of a source is converted into parallel light before entering the waveguide-type relay device 60. In fact, provided that light information of at least two views and/or composite views enters the same observer pupil 50, the positional relations of the relevant components in Figs. 42 and 43 may be changed, or new components introduced, so that the light from each point of a source enters the waveguide-type relay device 60 as non-parallel light, including the cases in which the light emitted by each sub-pixel of the display device 10 is, or is not, converted into parallel light before entering the waveguide-type relay device 60.
The core idea of the present invention is to take the sub-pixel as the basic display unit and to display a monocularly focusable spatial scene through the spatial superposition of the beams projected by sub-pixels. Among the K′ classes of sub-pixels of the display device 10 emitting light of different colors, the light projected by K′ sub-pixels belonging to different classes can immediately superpose to form a colored spatial light point. Compared with existing monocular multi-view display methods that take the pixel as the basic display unit, the method of this patent increases the number of presented viewing zones by (K′-1)-fold, i.e. to K′ times as many, effectively improving the realizability of monocular multi-view techniques.
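The claimed zone-count gain can be stated in one line; the figure of 4 pixel-based zones below is purely illustrative:

```python
def zone_count(pixel_based_zones, k_prime):
    """With sub-pixels rather than pixels as the display unit, each of
    the K' color classes contributes its own set of viewing zones, so
    the total grows to K' times that of a pixel-based design, i.e. the
    (K'-1)-fold increase stated above."""
    return pixel_based_zones * k_prime

print(zone_count(4, 3))  # 12: an RGB panel triples a 4-zone design
```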
The above are only preferred embodiments of the present invention, but the design concept of the present invention is not limited thereto; any insubstantial modification of the present invention made using this concept also falls within the protection scope of the present invention. Accordingly, all related embodiments fall within the protection scope of the present invention.

Claims (19)

  1. A three-dimensional display method based on spatial superposition of the light emitted by sub-pixels, characterized by comprising the following steps:
    (i) taking the sub-pixels of each pixel of a display device (10) as basic display units, the sub-pixels emitting light of the same color each forming one sub-pixel group alone or being divided into several sub-pixel groups;
    the sub-pixels of the display device (10) belonging to K′ classes of basic-color sub-pixels whose emitted colors differ from one another, including K classes of primary-color sub-pixels, the sub-pixels of each sub-pixel group being arranged throughout the display device (10), where K′≧2 and K′≧K≧2;
    wherein the K classes of primary-color sub-pixels correspond one-to-one to K classes of filters, and the ratio of the transmittance of the light emitted by each class of primary-color sub-pixels through its corresponding filter to its transmittance through each of the filters corresponding to the other (K-1) classes of primary-color sub-pixels is greater than 9; the color of the light emitted by a basic-color sub-pixel is defined as a basic color, K′ basic colors in total, and the color of the light emitted by a primary-color sub-pixel is defined as a primary color, K primary colors in total;
    (ii) along the propagation direction of the light emitted by the sub-pixels of the display device (10), placing a light-splitting device (20) before the display device (10) to guide each sub-pixel of the display device (10) to project a beam toward the viewing zone corresponding to the sub-pixel group to which that sub-pixel belongs, and restricting the divergence angle of each projected beam so that, on the plane of the observer's pupil (50), along at least one direction, the light-distribution size within which the intensity exceeds 50% of the peak intensity is smaller than the diameter of the observer's pupil (50);
    (iii) controlling, by a control device (30) connected to the display device (10), each sub-pixel group to load and display its corresponding image, wherein the image information loaded by each sub-pixel is, along the propagation direction of the beam projected by that sub-pixel into the region of the observer's pupil (50), the projected light information of the scene to be displayed at the intersection of the line of that propagation direction with the plane of the observer's pupil (50);
    wherein the image displayed by one sub-pixel group is one view of the scene to be displayed, and the image displayed by a composite sub-pixel group formed by complementary parts of different sub-pixel groups is one composite view;
    wherein the spatial distribution of the viewing zones corresponding to the sub-pixel groups of the display device (10) is arranged such that information of a total of at least two views or/and composite views enters the same observer pupil (50).
  2. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 1, characterized in that the light-splitting device (20) is an aperture array placed to correspond to the observer's pupil (50), the aperture array containing at least one aperture group composed of K apertures, the K apertures bearing, in one-to-one correspondence, the K classes of filters corresponding to the K classes of primary-color sub-pixels, and, among the sub-pixel groups formed by the primary-color sub-pixels of the class corresponding to an aperture, the sub-pixel group whose projected light passes through that aperture takes that aperture as its corresponding viewing zone.
  3. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 2, characterized in that the aperture array contains M aperture groups, different aperture groups each allowing only light of mutually different orthogonal characteristics to pass, where M≧2.
  4. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 3, characterized in that the mutually different orthogonal characteristics are temporal orthogonal characteristics of transmission at different time points, or two linear polarization characteristics with mutually perpendicular polarization directions, or two circular polarization characteristics of opposite handedness, or a combination of temporal orthogonal characteristics of non-simultaneous transmission at different time points with two linear polarization characteristics of mutually perpendicular polarization directions, or a combination of temporal orthogonal characteristics of non-simultaneous transmission at different time points with two circular polarization characteristics of opposite handedness.
  5. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 1, characterized in that the light-splitting device (20) is an aperture array placed to correspond to the observer's pupil (50), the aperture array containing at least one aperture group composed of K′ apertures corresponding one-to-one to the K′ classes of basic-color sub-pixels, and, among the sub-pixel groups formed by the basic-color sub-pixels of the class corresponding to an aperture, the sub-pixel group whose projected light passes through that aperture takes that aperture as its corresponding viewing zone, the K apertures corresponding to the K classes of primary-color sub-pixels bearing the corresponding K classes of filters, where K′﹥K;
    wherein, within one aperture group, the K apertures bearing the K classes of filters allow light of one and the same orthogonal characteristic to pass, and the remaining (K′-K) apertures each allow only one of the other (K′-K) orthogonal characteristics to pass, the (K′-K+1) orthogonal characteristics differing from one another; among the sub-pixel groups formed by the basic-color sub-pixels of the class corresponding to an aperture, the sub-pixel group whose projected light passes through that aperture takes that aperture as its corresponding viewing zone.
  6. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 5, characterized in that the aperture array contains M aperture groups, different aperture groups each allowing only light of mutually different orthogonal characteristics to pass, where M≧2.
  7. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to any one of claims 5 and 6, characterized in that the mutually different orthogonal characteristics are temporal orthogonal characteristics of transmission at different time points, or two linear polarization characteristics with mutually perpendicular polarization directions, or two circular polarization characteristics of opposite handedness, or a combination of temporal orthogonal characteristics of non-simultaneous transmission at different time points with two linear polarization characteristics of mutually perpendicular polarization directions, or a combination of temporal orthogonal characteristics of non-simultaneous transmission at different time points with two circular polarization characteristics of opposite handedness.
  8. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 1, characterized in that the display device (10) is a passive display apparatus with a backlight array (110), the backlight array (110) comprising at least one backlight group composed of K backlight sources respectively projecting the K classes of primary-color light; the light-splitting device (20) is an optical device projecting a real image of each source of the backlight array (110), and the light-distribution region of the real image of each source serves, among the sub-pixel groups whose projected-light color coincides with that source's projected-light color, as the viewing zone corresponding to the sub-pixel group whose projected light passes through that light-distribution region.
  9. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 8, characterized in that the backlight array (110) contains M backlight groups, different backlight groups emitting light of mutually different orthogonal characteristics, where M≧2.
  10. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 9, characterized in that the mutually different orthogonal characteristics are temporal orthogonal characteristics of non-simultaneous projection at different time points, or two linear polarization characteristics with mutually perpendicular polarization directions, or two circular polarization characteristics of opposite handedness, or a combination of temporal orthogonal characteristics of non-simultaneous projection at different time points with two linear polarization characteristics of mutually perpendicular polarization directions, or a combination of temporal orthogonal characteristics of non-simultaneous projection at different time points with two circular polarization characteristics of opposite handedness.
  11. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 1, characterized in that the display device (10) is a passive display apparatus with a backlight array (110), the backlight array (110) comprising at least one backlight group composed of K′ backlight sources respectively projecting the K′ classes of basic-color light; the light-splitting device (20) is an optical device projecting a real image of each source of the backlight array (110), and the light-distribution region of the real image of each source serves, among the sub-pixel groups whose projected-light color coincides with that source's projected-light color, as the viewing zone corresponding to the sub-pixel group whose projected light passes through that light-distribution region, where K′﹥K;
    wherein, within one backlight group, the light emitted by the K sources projecting the K classes of primary-color light has one and the same orthogonal characteristic, and the light emitted by the remaining (K′-K) sources each has one of the other (K′-K) orthogonal characteristics, the (K′-K+1) orthogonal characteristics differing from one another.
  12. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 11, characterized in that the backlight array (110) contains M backlight groups, different backlight groups emitting light of mutually different orthogonal characteristics, where M≧2.
  13. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to any one of claims 11 and 12, characterized in that the mutually different orthogonal characteristics are temporal orthogonal characteristics of non-simultaneous projection at different time points, or two linear polarization characteristics with mutually perpendicular polarization directions, or two circular polarization characteristics of opposite handedness, or a combination of temporal orthogonal characteristics of non-simultaneous projection at different time points with two linear polarization characteristics of mutually perpendicular polarization directions, or a combination of temporal orthogonal characteristics of non-simultaneous projection at different time points with two circular polarization characteristics of opposite handedness.
  14. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 1, characterized in that step (ii) further comprises placing a projection device (40) at a position corresponding to the display device (10) to form a magnified image of the display device (10).
  15. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 1, characterized in that step (ii) further comprises placing a relay device (60) in the light-transmission path to guide the beams projected by the display device (10) into the region of the observer's pupil (50).
  16. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 15, characterized in that the relay device (60) is a reflective surface, or a half-transmitting half-reflecting surface, or a freeform-surface combination, or an optical waveguide device.
  17. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 1, characterized in that step (iii) further comprises connecting a tracking device (70) to the control device (30) and determining the position of the observer's pupil (50) in real time through the tracking device (70).
  18. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 17, characterized in that step (iii) further comprises, according to the position of the observer's pupil (50), determining, for each sub-pixel whose projected light enters the observer's pupil (50), the light information it loads, namely, along the propagation direction of the beam projected by that sub-pixel into the observer's pupil (50), the projected light information of the scene to be displayed at the intersection of the line of that propagation direction with the observer's pupil (50).
  19. The three-dimensional display method based on spatial superposition of the light emitted by sub-pixels according to claim 17, characterized in that step (iii) further comprises, according to the position of the observer's pupil (50), determining by the control device (30) the sub-pixel groups whose projected beams enter the observer's pupil (50), and driving only those sub-pixel groups as effective sub-pixel groups for monocular multi-view display.
PCT/CN2020/091873 2020-04-03 2020-05-22 Three-dimensional display method based on spatial superposition of sub-pixel emitted light WO2021196369A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/226,093 US20210314553A1 (en) 2020-04-03 2021-04-09 Three-dimensional display method based on spatial superposition of sub-pixels' emitted beams

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010259368.5A CN113495366B (zh) 2020-04-03 2020-04-03 Three-dimensional display method based on spatial superposition of sub-pixel emitted light
CN202010259368.5 2020-04-03

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/226,093 Continuation US20210314553A1 (en) 2020-04-03 2021-04-09 Three-dimensional display method based on spatial superposition of sub-pixels' emitted beams

Publications (1)

Publication Number Publication Date
WO2021196369A1 true WO2021196369A1 (zh) 2021-10-07

Family

ID=77927351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/091873 WO2021196369A1 (zh) 2020-04-03 2020-05-22 基于子像素出射光空间叠加的三维显示方法

Country Status (2)

Country Link
CN (1) CN113495366B (zh)
WO (1) WO2021196369A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114545653A * 2022-01-10 2022-05-27 Sun Yat-sen University Optical display structure with pupil tracking based on orthogonal-characteristic aperture groups
CN114545652A * 2022-01-10 2022-05-27 Sun Yat-sen University Optical display structure in which the light emitted by each pixel block is directed to a corresponding small-size aperture
CN115128811A * 2022-06-20 2022-09-30 Sun Yat-sen University Near-eye display module based on orthogonal-characteristic pixel blocks

Citations (5)

Publication number Priority date Publication date Assignee Title
EP0829743A2 * 1996-09-12 1998-03-18 Sharp Kabushiki Kaisha Observer tracking directional display
US5959664A * 1994-12-29 1999-09-28 Sharp Kabushiki Kaisha Observer tracking autostereoscopic display and method of tracking an observer
CN1539095A * 2001-08-06 2004-10-20 Optical switching apparatus
CN103472589A * 2013-09-29 2013-12-25 Sun Yat-sen University Portable three-dimensional image display system and method
CN107340567A * 2017-09-01 2017-11-10 上海誉沛光电科技有限公司 Slab optical waveguide and display device

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
CN101957523B (zh) * 2010-05-26 2013-01-09 Tianma Micro-electronics Co., Ltd. Liquid crystal grating module and 2D/3D switchable liquid crystal display device
EP2618585B1 (en) * 2010-09-13 2016-07-27 Fujifilm Corporation Monocular 3d-imaging device, shading correction method for monocular 3d-imaging device, and program for monocular 3d-imaging device
JP2012194274A (ja) * 2011-03-15 2012-10-11 Japan Display West Co Ltd Display device
US10652526B2 (en) * 2016-04-25 2020-05-12 Sun Yat-Sen University Three-dimentional display system based on division multiplexing of viewer's entrance-pupil and display method thereof
CN105866963B (zh) * 2016-05-10 2019-07-16 Sun Yat-sen University Spatial multiplexing module and method for increasing the number of presented viewpoints
CN106291958B (zh) * 2016-10-21 2021-04-23 BOE Technology Group Co., Ltd. Display device and image display method
CN106324847B (zh) * 2016-10-21 2018-01-23 BOE Technology Group Co., Ltd. Three-dimensional display device
KR102222825B1 (ko) * 2016-11-15 2021-03-05 CREAL SA Near-eye sequential light-field projector with correct monocular depth cues
CN110035274B (zh) * 2018-01-12 2020-10-16 Sun Yat-sen University Grating-based three-dimensional display method
CN108375840B (zh) * 2018-02-23 2021-07-27 北京耐德佳显示技术有限公司 Light-field display unit based on a small array image source and three-dimensional near-eye display device using the same
US11272168B2 (en) * 2018-07-16 2022-03-08 Boe Technology Group Co., Ltd. Three-dimensional display apparatus, three-dimensional imaging apparatus, and method of displaying three-dimensional image
CN110908134B (zh) * 2018-08-28 2021-01-26 BOE Technology Group Co., Ltd. Display device and display system
CN110401829B (zh) * 2019-08-26 2022-05-13 BOE Technology Group Co., Ltd. Glasses-free 3D display device and display method thereof
CN110738697B (zh) * 2019-10-10 2023-04-07 Fuzhou University Monocular depth estimation method based on deep learning


Cited By (6)

Publication number Priority date Publication date Assignee Title
CN114545653A * 2022-01-10 2022-05-27 Sun Yat-sen University Optical display structure with pupil tracking based on orthogonal-characteristic aperture groups
CN114545652A * 2022-01-10 2022-05-27 Sun Yat-sen University Optical display structure in which the light emitted by each pixel block is directed to a corresponding small-size aperture
CN114545652B * 2022-01-10 2024-01-12 Sun Yat-sen University Optical display structure in which the light emitted by each pixel block is directed to a corresponding small-size aperture
CN114545653B * 2022-01-10 2024-02-06 Sun Yat-sen University Optical display structure with pupil tracking based on orthogonal-characteristic aperture groups
CN115128811A * 2022-06-20 2022-09-30 Sun Yat-sen University Near-eye display module based on orthogonal-characteristic pixel blocks
CN115128811B * 2022-06-20 2024-01-12 Sun Yat-sen University Near-eye display module based on orthogonal-characteristic pixel blocks

Also Published As

Publication number Publication date
CN113495366B (zh) 2022-05-17
CN113495366A (zh) 2021-10-12

Similar Documents

Publication Publication Date Title
WO2021196369A1 (zh) 基于子像素出射光空间叠加的三维显示方法
CN106291958B (zh) 一种显示装置及图像显示方法
WO2021062941A1 (zh) 基于光栅的光波导光场显示系统
US11480796B2 (en) Three-dimensional display module using optical wave-guide for providing directional backlights
CN112882248B (zh) 一种光束发散角偏转孔径二次约束的显示模组
WO2021196370A1 (zh) 以子像素为显示单元的单目多视图显示方法
CN104380157A (zh) 定向照明波导布置方式
JP2007524111A (ja) カラープロジェクションディスプレイシステム
CN112305776B (zh) 基于光波导耦出光出瞳分割-组合控制的光场显示系统
WO2021169065A1 (zh) 孔径时序选通复用的光波导显示模组
CN112925098B (zh) 基于光出射受限像素块-孔径对的近眼显示模组
CN112925110B (zh) 基于光出射受限像素块-孔径对的三维显示模组
JP2006091333A (ja) 三次元映像表示装置
US20200379268A1 (en) Near-eye display device and near-eye display method
US20210314553A1 (en) Three-dimensional display method based on spatial superposition of sub-pixels&#39; emitted beams
CN113946054A (zh) 一种显示装置
JP6202806B2 (ja) 虚像表示装置
WO2021175341A1 (zh) 一种多背光光源的显示模组
CN112748570A (zh) 正交特性光栅-像素阵列对及基于其的近眼光场显示模组
US20030020883A1 (en) Image display device
CN113359312B (zh) 基于多光源的光波导显示模组
US11555962B1 (en) Waveguide illuminator with optical interference mitigation
WO2021016761A1 (zh) 基于光波导耦出光出瞳分割-组合控制的光场显示系统
WO2019157986A1 (zh) 单眼大视场近眼显示模组、显示方法及头戴式显示设备
CN113703164A (zh) 光波导指向背光全息显示模组

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20928600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07.02.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20928600

Country of ref document: EP

Kind code of ref document: A1