WO2021196370A1 - Monocular multi-view display method using sub-pixels as display units - Google Patents

Monocular multi-view display method using sub-pixels as display units

Info

Publication number
WO2021196370A1
WO2021196370A1 (PCT/CN2020/091877; CN2020091877W)
Authority
WO
WIPO (PCT)
Prior art keywords
sub-pixel
display
pixel
light
Prior art date
Application number
PCT/CN2020/091877
Other languages
English (en)
French (fr)
Inventor
刘立林
滕东东
Original Assignee
中山大学
Priority date
Filing date
Publication date
Application filed by 中山大学
Priority to US18/580,381 (published as US20240223744A1)
Publication of WO2021196370A1

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/30Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/156Mixing image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/32Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using arrays of controllable light sources; using moving apertures or moving light sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/324Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/349Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
    • H04N13/351Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking for displaying simultaneously
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/04Structural and physical details of display devices
    • G09G2300/0439Pixel structures

Definitions

  • the present invention relates to the field of three-dimensional display technology, and more specifically to a monocular multi-view display method using sub-pixels as display units.
  • three-dimensional displays can provide optical information whose dimensions are consistent with the real world that people perceive, and are receiving more and more attention.
  • Stereoscopic display technology (including autostereoscopic display) performs three-dimensional presentation based on the principle of binocular parallax: it projects a corresponding two-dimensional image to each of the observer's eyes and triggers the observer's depth perception through the crossing of the two lines of sight at the off-screen scene.
  • However, to see its corresponding two-dimensional image clearly, each eye of the observer must stay focused on the display surface, which leads to the focus-convergence conflict, that is, an inconsistency between the observer's monocular focusing depth and binocular convergence depth.
  • Monocular multi-view is an effective technical path for solving the focus-convergence conflict. It uses optical devices to guide different pixel groups of the display device to project at least two two-dimensional images of the scene to be displayed to the same eye of the observer, so that at least two light beams passing through each display object point enter that eye.
  • When the light intensity distribution of the spot formed by superimposing these beams at the display object point can draw the observer's eye to focus freely on the superimposed spot, the above focus-convergence conflict is overcome.
  • the present invention proposes a monocular multi-view display method.
  • the sub-pixels of the display device are used as the basic display unit, and light splitting by the grating device guides multiple sub-pixel groups to project multiple images of the scene to be displayed into the area where the observer's pupil is located, realizing monocularly focusable three-dimensional scene display based on monocular multi-view.
  • the existing technology for monocular multi-view display based on grating light splitting uses pixels as the basic display unit, and guides different pixel groups through the grating device to project different views to the area where the observer’s pupil is located, such as PCT/CN2019/070029(GRATING BASED THREE- DIMENTIONAL DISPLAY METHOD FOR PRESENTING MORE THAN ONE VIEWS TO EACH PUPIL).
  • the method described in this patent uses sub-pixels as the basic display unit and projects different views to the area where the observer's pupil is located through different sub-pixel groups.
  • To form one monocularly focusable spatial light spot, the existing monocular multi-view display method requires at least two pixels, whereas the method described in this patent requires only two sub-pixels at minimum.
  • the monocular multi-view display method of this patent using sub-pixels as the display unit can therefore effectively improve the two-dimensional view projection capability of the display device, which is more conducive to enlarging the viewing area of the observer's eye, or, by increasing the spatial density of the viewing zones corresponding to the projected views, to extending the displayable depth of the monocularly focusable scene.
  • this patent uses a projection device to project an enlarged image of the display device, so that the application range of the method is extended to near-eye display; and a relay device is used to optimize the spatial structure distribution of the optical structure.
  • the method can be directly applied to a binocular three-dimensional display optical engine, and can also be applied to a monocular optical engine.
  • To realize monocularly focusable three-dimensional display based on monocular multi-view with sub-pixels as the display unit, the present invention provides the following solution:
  • the monocular multi-view display method using sub-pixels as the display unit includes the following steps:
  • (i) a grating device is arranged in front of the display device to guide each sub-pixel of the display device to project its light beam to its corresponding viewing zone;
  • the sub-pixels corresponding to the same view zone are formed into a sub-pixel group, and different sub-pixel groups have no shared sub-pixels at the same time point;
  • (ii) a control device connected to the display device controls each sub-pixel group to load and display its corresponding image.
  • the image information loaded by each sub-pixel is the projection light information of the scene to be displayed at the intersection of the plane where the observer's pupil is located with the straight line along the transmission vector of the beam projected by that sub-pixel into the area where the observer's pupil is located;
  • the image displayed by one sub-pixel group is a view of the scene to be displayed, and the image displayed by the spliced sub-pixel group formed by complementary splicing of different parts of different sub-pixel groups is a spliced view;
  • the spatial position distribution of the viewing zones corresponding to the sub-pixel groups of the display device is set so that the light information of a total of at least two views or/and spliced views is incident on the same observer pupil.
  • the grating unit of the grating device is a cylindrical lens or a slit.
  • the grating device is composed of microstructure units, and each of the microstructure units and each sub-pixel of the display device are placed in a one-to-one correspondence for modulating the light emitted by the corresponding sub-pixel.
  • step (i) further includes: along the arrangement direction of the grating units, grating units separated by (T-1) grating units form a grating unit group; step (ii) further includes: the control device controls the T grating unit groups to be opened for light transmission sequentially within each time period composed of T adjacent time points, with only one grating unit group opened at any one time point, where T ≥ 2.
  • step (i) further includes: each sub-pixel of the display device emits light of one of M different colors; along the arrangement direction of the grating units, grating units separated by (M-1) grating units form a grating unit group, and the M grating unit groups are set, in one-to-one correspondence, to each allow only one of the M colors of light emitted by the display device to pass, where M ≥ 2.
  • step (i) also includes placing the projection device at a position corresponding to the display device to form an enlarged image of the display device.
  • step (i) further includes placing a relay device on the transmission path of the projection light of the display device, and guiding the projection light beam of the display device to enter the area where the pupil of the observer is located.
  • the relay device is a reflective surface, a transflective (half-transmissive, half-reflective) surface, a combination of free-form surfaces, or an optical waveguide device.
  • step (ii) further includes connecting the tracking device with the control device, and tracking the position of the pupil of the observer in real time through the tracking device.
  • step (ii) also includes: according to the position of the observer's pupil, determining, for each sub-pixel whose projected light enters the observer's pupil, the image information it loads, namely the projection light information of the scene to be displayed at the intersection of the observer's pupil with the straight line along the transmission vector of the beam projected by that sub-pixel into the observer's pupil.
  • the present invention uses sub-pixels as the basic display unit. Compared with the monocular multi-view display method using pixels as the display unit, it can effectively increase the number of projected two-dimensional views and, combined with the characteristic design of the grating device, further satisfy the requirement of monocular multi-view display on the number of projected two-dimensional views.
  • the monocular multi-view display method using sub-pixels as the display unit of the invention provides an implementation method for three-dimensional display without focus-convergence conflict.
  • the monocular multi-view display method using sub-pixels as the display unit of the present invention can effectively improve the two-dimensional view projection capability of the display device, which is more conducive to enlarging the viewing area of the observer's eye, or, by increasing the spatial density of the viewing zones corresponding to the projected views, to extending the displayable depth of the monocularly focusable scene.
  • the present invention uses the projection device to project the magnified image of the display device, so that the application range of the method is extended to near-eye display; and the relay device is used to optimize the spatial structure distribution of the optical structure.
  • the method can be directly applied to a binocular three-dimensional display optical engine, and can also be applied to a monocular optical engine.
  • FIG. 1 is a schematic diagram of a conventional monocular multi-view display principle using pixels as display units.
  • Fig. 2 is a schematic diagram of a monocular multi-view display method using sub-pixels as display units in this patent.
  • Fig. 3 is an explanatory diagram of the merging rules for merging sub-pixel groups.
  • Fig. 4 is a schematic diagram of the corresponding positional relationship between the grating device and the sub-pixels.
  • Figure 5 is a schematic diagram of the light splitting principle of a grating device.
  • Fig. 6 is a partial enlarged view of the corresponding positions of the grating device and sub-pixels shown in Fig. 4.
  • FIG. 7 is a schematic diagram of an arrangement of sub-pixels for generating a pure color view.
  • Fig. 8 is a schematic diagram of a pure color view through a grating device for light splitting and projection.
  • FIG. 9 is a partial enlarged view of the corresponding positions of the grating device and sub-pixels shown in FIG. 7.
  • FIG. 10 is a schematic diagram of the gate state of a grating device with timing characteristics at a time.
  • FIG. 11 is a schematic diagram of the gate state of the grating device with timing characteristics at another time.
  • Fig. 12 is a schematic diagram of the working principle of a grating device with color selection characteristics.
  • FIG. 13 is a schematic diagram of another application environment of a grating device with color selection characteristics.
  • FIG. 14 is a schematic diagram of the influence of the inclination of the striped viewing zone on the coverage area of the viewing zone in the direction of the binocular connection.
  • Fig. 15 is a schematic diagram of a near-eye monocular display optical module incorporating a projection device.
  • FIG. 16 is a schematic diagram of a binocular display structure based on the near-eye monocular display optical module.
  • Figure 17 is a schematic diagram of a near-eye monocular display optical module incorporating a relay device.
  • Figure 18 is an example of a relay device based on a free-form surface.
  • Fig. 19 is an example of a relay device based on an optical waveguide device.
  • Fig. 20 is a schematic diagram of a stacked structure of multiple optical waveguides.
  • Fig. 21 is a second example of a relay device based on an optical waveguide device.
  • the monocular multi-view display method using sub-pixels as the display unit of the present invention directly takes sub-pixels as the basic display unit and uses multiple sub-pixel groups to project multiple views to the area where the observer's pupil is located; based on the spatial superposition of light beams with different sagittal directions coming from different views, a monocularly focusable display light spot is formed, realizing three-dimensional display free of focus-convergence conflict.
  • the existing monocular multi-view technology uses pixels as the basic display unit and projects at least two views to the area where the observer's pupil 50 is located through different pixel groups of the display device 10.
  • at least two light beams from different pixel groups passing through the object point are superimposed to form a spatial display light spot that can be focused by the observer's monocular vision.
  • the observer's eye can be drawn to focus on the spatial superimposed light spot, thereby overcoming the focus-convergence conflict.
  • FIG. 1 specifically takes a single eye and two views as an example for description.
  • pixel group 1 projects view 1 corresponding to viewing zone VZ 1 , and the projected light of each pixel of pixel group 1 is guided by the grating device 20 to exit through viewing zone VZ 1 without passing through viewing zone VZ 2 ; pixel group 2 projects view 2 corresponding to viewing zone VZ 2 , and the projected light of each pixel of pixel group 2 is guided by the grating device 20 to exit through viewing zone VZ 2 without passing through viewing zone VZ 1 .
  • the projected light beam of the pixel group 1 carries the light information of the view 1 through the viewing zone VZ 1
  • the projected light beam of the pixel group 2 carries the light information of the view 2 through the viewing zone VZ 2 , and respectively enters the pupil 50 of the observer.
  • the x direction is the arrangement direction of the viewing zone.
  • the light beam 1 from the pixel group 1 and the light beam 2 from the pixel group 2 are superimposed to form a spatially superimposed light spot.
  • when the light intensity distribution of this spatially superimposed light spot can attract the eye corresponding to the observer's pupil 50 to focus on the object point P, the focus of the observer's eye is no longer forced to stay at the exit pixels of beam 1 or beam 2, that is, no longer forced to stay on the display device 10, so the focus-convergence conflict is overcome.
  • other display object points that can be monocularly focused together form a monocularly focusable display scene.
  • in fact, increasing the number and distribution density of the viewing zones allows more view light information to be incident on the observer's pupil 50.
  • more superimposed light beams will enter the pupil 50 of the observer along the respective sagittal directions.
  • the superposition of the larger number of superimposed light beams can improve the attraction ability of the spatial superimposed light points to the focus of the observer's eyes, and is beneficial to the display of scenes with a larger screen distance.
  • more projected viewing zones can also provide a larger viewing area for the observer's pupil 50, so that when the observer's pupil 50 moves within this larger viewing area, it continues to see, based on the monocular multi-view principle, a displayed scene free of focus-convergence conflict.
  • an increase in the number of viewing zones corresponds to an increase in the number of view projections, and the grating device 20 is required to divide the pixels of the display device 10 into more pixel groups to project more views. This also corresponds to a decrease in the number of pixels contained in each pixel group, that is, a decrease in the resolution of the projected view.
  • This patent uses sub-pixels as the basic display unit for monocular multi-view display.
  • compared with the pixel-based method, with the same display device 10 and grating device 20, the number of projected views and their corresponding viewing zones can be increased to N times.
  • N ⁇ 2 is the number of sub-pixels included in each pixel.
  • the display device 10 emits light of M colors, where M ≥ 2. Fig. 2 takes as an example a display device 10 in which each pixel consists of R, G, and B sub-pixels, corresponding to M = 3.
  • the grating device 20 is designed to split the light into 6 viewing zones VZ 1 , VZ 2 , VZ 3 , VZ 4 , VZ 5 and VZ 6 , and the number of sub-pixels corresponding to each viewing zone equals the number of pixels contained in the pixel group corresponding to each of the two viewing zones shown in Figure 1.
  • the observer pupil 50 needs to receive the information of the two views projected by all two view zones;
  • the pupil 50 of the observer only needs to receive two view information projected from two of the viewing zones.
  • the 6 viewing zones in Figure 2 can provide a larger viewing area for the observer's pupil 50, or, by increasing the distribution density of the viewing zones, guide more views into the observer's pupil, so as to increase the ability of the superimposed light spot to attract the focus of the eye and optimize the monocular multi-view display effect.
  • in other words, without sacrificing the resolution or display frequency of each projected view, the monocular multi-view display method of this patent using sub-pixels as the basic display unit can, compared with the existing monocular multi-view display using pixels as the basic display unit, effectively increase the number of projected views and corresponding viewing zones, so that more viewing zones can be used to provide a larger viewing area for the observer's pupil 50, or the monocular multi-view display effect can be optimized by increasing the distribution density of the viewing zones.
  • the sub-pixels corresponding to the same viewing zone are grouped.
  • the six viewing zones VZ 1 , VZ 2 , VZ 3 , VZ 4 , VZ 5 and VZ 6 in Figure 2 correspond to sub-pixel group 1, sub-pixel group 2, sub-pixel group 3, sub-pixel group 4, sub-pixel group 5, and sub-pixel group 6, respectively.
  • the control device 30 controls the image information loaded by each sub-pixel, which is the projection light information of the scene to be displayed at the intersection of the plane where the observer's pupil 50 is located with the straight line along the transmission vector of the beam projected by that sub-pixel into the area where the pupil 50 is located. In other words, each sub-pixel group loads the view corresponding to its own viewing zone.
  • the distance between the viewing zones is designed so that the light information of at least two views emitted through at least two viewing zones enters the pupil 50 of the same observer.
  • passing through the display object point P, three sagittal beams 3, 4, and 5 coming from sub-pixel group 3, sub-pixel group 4, and sub-pixel group 5 via the viewing zones VZ 3 , VZ 4 and VZ 5 respectively are superimposed to form a monocularly focusable display light spot.
  • the display light spot is located between the display device 10 and the pupil 50 of the observer, and is formed by the real superposition of light beams from different sub-pixels.
  • straight lines represent the light beams projected by each sub-pixel to the area where the pupil 50 of the observer is located, such as the sagittal light beams 3, 4, and 5.
  • the light emitted by each sub-pixel is divergent light with a certain divergence angle.
  • one function of the grating device 20 is that each grating unit restricts the divergence angle of the beam projected by its corresponding sub-pixel, so that, on the plane where the observer's pupil 50 is located and along the arrangement direction of the grating units, the region in which the light intensity of each sub-pixel's projected beam exceeds 50% of its peak intensity is smaller than the diameter of the observer's pupil 50.
  • spatially superimposed light points can also be generated.
  • the point P′ in Fig. 2 is formed by the intersection of the backward extensions of light beams 6, 7, and 8.
  • this patent likewise calls it the spatially superimposed light spot at point P′, whose image forms a real light spot on the retina of the observer's eye.
  • the display scenes on both sides of the display device 10 are generated based on the same multi-beam superposition for the observer. In the following part, only the display scene on the side of the transmission direction of the light emitted by the display device 10 is taken as an example for description.
  • the pupil 50 of the observer is placed close to the surface of the visual zone.
  • the pupil 50 of the observer may not be able to completely receive all the light beams of at least two views.
  • the observer's pupil 50 at the position shown in FIG. 3 can receive the view information incident through viewing zone VZ 3 , which is projected by the corresponding sub-pixel group 3. But at this position it can only receive the partial view projected through viewing zone VZ 4 by the sub-pixels of sub-pixel group 4 in the M s2 M r1 area, and the partial view projected through viewing zone VZ 2 by the sub-pixels of sub-pixel group 2 in the M s1 M r2 area.
  • M p1 and M p2 are the two edge points of the observer's pupil 50 along the viewing-zone arrangement direction x.
  • M s1 and M s2 are the two edge points of the sub-pixel distribution area of the display device 10.
  • M r1 is the intersection of the display device 10 with the line connecting M p2 and the -x-direction edge point of viewing zone VZ 4 .
  • M r2 is the intersection of the display device 10 with the line connecting M p1 and the x-direction edge point of viewing zone VZ 2 .
  • the M s2 M r1 area and the M s1 M r2 area overlap; taking the sub-pixels of the two groups on the spatially complementary M s2 M t and M t M s1 areas respectively and splicing them together forms a spliced sub-pixel group, whose displayed image is a spliced view of the scene to be displayed.
  • M t is a point in the overlap area M r1 M r2 .
  • the observer's pupil 50 at the position shown in FIG. 3 can thus receive one complete view and one complete spliced view. Passing through each display object point, there will be at least two beams, one from that view and one from the spliced view, entering the observer's pupil 50; within a certain off-screen range they can be superimposed based on the monocular multi-view principle to form a monocularly focusable spatial light spot.
  • by analogy, as the distance between the observer's pupil 50 and the plane of the viewing zones increases, the spliced sub-pixel group that it can observe will be complementarily spliced from different parts of more sub-pixel groups.
  • a grating device 20 with cylindrical lenses as grating units is taken as an example, together with a display device 10 having a common RGB arrangement, as shown in Figure 4: each pixel is composed of three sub-pixels emitting R light, G light, and B light respectively, arranged along the x′ direction, and sub-pixels emitting the same color light are arranged adjacently along the y′ direction.
  • the grating device 20 uses cylindrical lenses arranged along the one-dimensional x direction as grating units and is placed corresponding to the display device 10; it is designed based on the grating light-splitting relations (a commonly used reconstruction of these relations is sketched at the end of this list).
  • N zone = 6 adjacent sub-pixels constitute one sub-pixel periodic unit.
  • two sub-pixel periodic units correspond to the grating units G 1 and G 2 respectively, and O k+1 and O k+2 are the optical centers of the cylindrical lenses G 1 and G 2 on the xz plane.
  • p is the pitch of the sub-pixels along the x direction within the same sub-pixel periodic unit.
  • e is the pitch of the viewing zones.
  • D b is the distance between the grating device 20 and the display device 10.
  • D e is the distance between the viewing-zone plane and the display device 10.
  • b is the pitch of adjacent grating units.
  • from FIG. 4 it can be seen that, unlike the in-line arrangement shown in FIG. 5, the sub-pixels of the same sub-pixel periodic unit are offset from one another along the y direction.
  • the angle θ between the y′ direction and the long direction y of the grating unit satisfies a corresponding slant relation (see the sketch at the end of this list), where dx′ and dy′ are the sub-pixel pitches along the x′ and y′ directions respectively, and N row is the number of rows of sub-pixels occupied by the same sub-pixel periodic unit.
  • when N row = 1, the sub-pixels of the same sub-pixel periodic unit lie in the same row.
  • Fig. 6 is a partial enlarged view of Fig. 4 in order to illustrate the arrangement of the sub-pixels more clearly.
  • the two beams can be superimposed to form a monocular focusable spatial light spot within a certain distance from the screen.
  • the presentation of its colors is inaccurate due to the lack of basic colors.
  • the superimposed light beams incident on the pupil 50 of the same observer through each display object point are preferably at least M light beams of different colors.
  • the observer's same pupil 50 preferably receives at least M views or/and spliced views, the M views or/and spliced views each displaying pure-color image information of a different color, for example a pure green view or spliced view, or a pure white view or spliced view.
  • that is, all sub-pixels whose projected light is incident on the observer's pupil 50 are preferably grouped into M pure-color sub-pixel groups or spliced sub-pixel groups that respectively emit the M different colors.
  • in the arrangement described above, the view corresponding to each viewing zone contains light of two different colors, so the corresponding sub-pixel groups are not pure-color sub-pixel groups, nor are the corresponding spliced sub-pixel groups.
  • each sub-pixel group corresponding to adjacent M viewing zones is designed as a pure-color sub-pixel group that projects R, G, and B color lights, respectively.
  • the sub-pixel arrangement shown in FIG. 7 can be used to achieve the purpose of emitting a pure color view through each viewing area, and the M pure color views respectively outputting through adjacent M viewing areas have different colors, as shown in FIG. 8 . This design is conducive to the ideal presentation of colors.
  • FIG. 9 is a partial enlarged view of FIG. 7 to illustrate the arrangement of sub-pixels more clearly.
  • the grating device 20 may also have timing characteristics.
  • along the arrangement direction of the grating units, grating units separated by (T-1) grating units are grouped, forming T grating unit groups, where T ≥ 2.
  • the control device 30 controls the T grating unit groups to be opened for light transmission sequentially within each time period composed of T adjacent time points, with only one grating unit group opened at any one time point.
  • taking two groups as an example: at time t within the time period t to t+Δt, the grating units of grating unit group 1 are open for light transmission and those of grating unit group 2 are closed; at time t+Δt/2 within the same period, the grating units of grating unit group 2 are open and those of grating unit group 1 are closed.
  • the opening and closing of each grating unit is controlled by the control device 30.
  • this can be realized, for example, by the control device 30 controlling the switching of each aperture of an aperture array 201, each aperture being placed in correspondence with a grating unit, as shown in FIG. 10 and FIG. 11.
  • when Δt is small, based on persistence of vision, this is equivalent to increasing the resolution of the view the observer receives through each viewing zone.
  • in FIGS. 10 and 11, along the arrangement direction of the grating units, when the distances △ 1 and △ 2 between a grating unit and its adjacent grating units of different groups are equal, the viewing zones generated at different time points within each time period overlap spatially; it is also possible to design △ 1 ≠ △ 2 so that the viewing zones generated at different time points within each time period are spatially staggered, which increases the distribution density of the viewing zones.
  • the grating device 20 may also have color selection characteristics.
  • along the arrangement direction of the grating units, grating units separated by (M-1) grating units are grouped, forming M grating unit groups.
  • the M grating unit groups are set, in one-to-one correspondence, to each allow only one of the M colors of light emitted by the display device 10 to pass.
  • the naming of each grating unit uses subscripts to indicate the color of light allowed by the respective attached filter.
  • the grating unit G G1 is one whose attached filter allows only G light to pass, and whose serial number within its group is 1.
  • Grating unit groups composed of similar grating units are also named after the color that allows light to pass through, for example, a G-color grating unit group.
  • the sub-pixel emitting B light corresponds to the B-color grating unit group
  • the sub-pixel emitting G light corresponds to the G-color grating unit group
  • the sub-pixel emitting R light corresponds to the R-color grating unit group. That is, the sub-pixels that emit light of different colors each project views to their corresponding viewing zones independently of each other through their corresponding grating unit group.
  • the spacings between adjacent M grating units, that is, △ 3 , △ 4 , and △ 5 in FIG. 12, are preferably designed so that the M adjacent viewing zones emit light of M different colors.
  • the advantage of the grating device 20 with color-selection characteristics is that it can also be applied to a display device 10 whose sub-pixels emit light of different colors time-sequentially. As shown in FIG. 13, each sub-pixel of the display device 10 projects R light, G light, and B light sequentially under the action of a time-sequential backlight.
  • the above-mentioned grating device 20 with color selection characteristics can make the viewing zones of different color lights projected by sub-pixels at the same spatial position in a spatially misaligned arrangement.
  • FIG. 13 shows the situation when R light is projected by each sub-pixel at time t within a time period t to t+ ⁇ t required to project R light, G light, and B light sequentially.
  • a B-color light beam emitted by the sub-pixel SP 4 at t+Δt/3 and a G-color light beam emitted by the sub-pixel SP 6 at t+2Δt/3 are also shown in dotted lines in FIG. 13.
  • the spacing of adjacent M grating units often needs to be designed to be unequal.
  • this can be combined with a grating device 20 having timing characteristics: the grating unit groups corresponding to sub-pixels emitting light of different colors are opened for light transmission sequentially at adjacent time points, with only one grating unit group opened at a time, and the sub-pixels of the corresponding color are lit and loaded with light information in synchronization with the opening of their corresponding grating unit group.
  • the above-mentioned grating device 20 may also be a slit grating with a slit as a grating unit, and display is performed in the same way.
  • the tracking device 70 as shown in FIG. 2 can also be used to connect the tracking device 70 with the control device 30 to obtain the position of the pupil 50 of the observer in real time. Its function is to determine the loaded image information for each sub-pixel where the projected light enters the observer’s pupil 50 according to the position of the observer’s pupil 50 when the viewing zone is in a strip shape.
  • the display device 10 may also adopt other sub-pixel arrangement structures, such as a display with four-color R, G, B, and W sub-pixels, or a display with a PenTile sub-pixel arrangement; monocular multi-view display is performed in a similar way based on the above principle. It should be noted that when sub-pixels emitting white light are introduced, white light is mixed light and cannot be isolated from the R, G, and B light by a color filter.
  • the grating unit group corresponding to the white-light sub-pixels therefore needs to block, based on other characteristics, the transmission of the light projected by the R, G, and B sub-pixels. For example, the grating unit group corresponding to W light and the other grating unit groups are opened at different time points, and the sub-pixels corresponding to each grating unit group are loaded with light information only when that grating unit group is open; or the grating unit group corresponding to W light and the other grating unit groups allow only mutually orthogonal polarized light to pass, respectively.
  • for example, the polarizers attached to the grating units for W light allow only vertically polarized light to pass, while the polarizers attached to the grating units for R, G, and B light allow only horizontally polarized light to pass, and the projected light of the sub-pixels corresponding to each grating unit group is set to the polarization state passed by that grating unit group.
  • the two orthogonal polarization states here can also be replaced by the optical rotation states with opposite rotation directions.
  • the sub-pixels can also be designed in other shapes, such as square sub-pixels, or different sub-pixels have different shapes.
  • the structure shown in Figs. 4 and 7 can be directly used as a binocular optical display engine.
  • Figure 14 takes the case when the observer's eyes are exactly on the distribution surface of the viewing zone as an example.
  • Each sub-pixel group of the display device 10 projects light to the viewing zones VZ 1 , VZ 2 , VZ 3 ,... Through the grating device 20, respectively.
  • the coverage size of the multiple viewing areas on the surface along the arrangement direction x is represented by D cv .
  • the greater the angle by which the viewing-zone arrangement direction x is designed to deviate from the direction x′ of the line connecting the observer's two eyes, that is, the smaller the angle between the strip-shaped viewing zones' long direction and x′ shown in the figure, the larger the coverage size of the viewing zones along the x′ direction becomes, which is more beneficial for providing the observer with a larger viewing area along the direction of the binocular line.
  • when D cv ≥ D ee , the angle design can also allow monocular multi-view projection to both of the observer's eyes simultaneously.
  • the distribution of each viewing zone also requires that the distance between adjacent viewing zones along the x-direction is smaller than the pupil diameter D p of the observer.
  • the figure takes the x direction deviating from the x′ direction by a clockwise rotation as an example; it can also deviate from the x′ direction by a counterclockwise rotation.
  • the above angle design can also increase the coverage of the viewing zones along the direction of the observer's binocular line, but in this case the views received by the observer may be spliced views.
  • the minimum value of the angle also needs to be constrained to prevent the light information emitted through the same viewing zone from being incident on both of the observer's eyes at the same time.
  • the above figures take a one-dimensional grating device 20 composed of a one-dimensional array of grating units as an example for description, which can also be extended in two-dimensional directions in the same way.
  • in that case, the light modulation function of the grating device 20 is the composite of the modulation functions of two of the above one-dimensional grating devices, with the grating units of the two one-dimensional grating devices arranged along the two dimensional directions respectively.
  • the viewing zones whose size is smaller than the pupil diameter of the observer are distributed in a two-dimensional direction.
  • the above-mentioned grating device 20 may also be a grating device composed of microstructure units, each of which is placed in a one-to-one correspondence with each sub-pixel of the display device 10, and guides the light emitted by the corresponding sub-pixel to propagate toward its corresponding viewing zone.
  • a micro-grating corresponding to each sub-pixel of the display device 10 is used as a micro-structure unit of the grating device. Due to the ability to independently control the light emitted by each sub-pixel, the grating device 20 composed of microstructure units can split the viewing area generated by the light emitted by the display device 10, which can be arranged in a one-dimensional direction or a two-dimensional direction.
  • when the number of viewing zones projected by the display device 10 through the grating device 20 is large enough to project at least two views or/and spliced views to each of the observer's two pupils, the optical structure of the monocular multi-view display method can be used as a binocular display optical engine.
  • if the viewing zones projected by the display device 10 through the grating device 20 only support projecting at least two views or/and spliced views to a single pupil of the observer, the optical structure of the monocular multi-view display method with sub-pixels as the display unit described in this patent can only be used as a monocular display optical engine, such as an eyepiece of a head-mounted virtual reality (VR) / augmented reality (AR) device.
  • the projection device 40 is often needed to project the image I 10 of the display device 10.
  • the image I 10 of the display device 10 with respect to the projection device 40 serves as an equivalent display device;
  • the image I 20 of the grating device 20 with respect to the projection device 40 serves as an equivalent grating device.
  • the image of each sub-pixel group of the display device 10 with respect to the projection device 40 is an equivalent sub-pixel group, and each equivalent sub-pixel group is combined into an equivalent display device I 10 .
  • the image of the corresponding viewing zone of each sub-pixel group with respect to the projection device 40 is taken as the equivalent viewing zone corresponding to the equivalent sub-pixel group corresponding to the sub-pixel group.
  • a specific example is shown in Fig. 15.
  • the light from the display device 10 is split by the grating device 20, and the 6 sub-pixel groups of the display device 10 project light information through the corresponding 6 viewing zones VZ 1 , VZ 2 , VZ 3 , VZ 4 , VZ 5 , VZ 6 , respectively.
  • under the modulation of the projection device 40, this is equivalent to the equivalent display device I 10 being split by the equivalent grating device I 20 , with the 6 equivalent sub-pixel groups projecting 6 views through the corresponding 6 equivalent viewing zones I vz1 , I vz2 , I vz3 , I vz4 , I vz5 , and I vz6 into the area where the observer's pupil 50 is located.
  • with the image of the display device 10 (the equivalent display device) substituted for the aforementioned display device 10 and the image of the grating device 20 (the equivalent grating device) substituted for the aforementioned grating device 20, more than one projected view enters the observer's pupil 50, achieving monocular multi-view display.
  • for binocular display, each eye requires its own corresponding eyepiece structure, as shown in Figure 16.
  • a relay device 60 can also be introduced to guide the projected light of the display device 10 along a deflected path to the area where the observer's pupil is located, as shown in FIG. 17.
  • in FIG. 17, the relay device 60 is exemplified as a transflective (half-transmissive, half-reflective) surface that allows external ambient light to enter.
  • in this case, the equivalent display device is the image I 10 of the display device 10 with respect to the projection device 40 and the relay device 60; the equivalent grating device is the image I 20 of the grating device 20 with respect to the projection device 40 and the relay device 60; and the equivalent viewing zones I VZ1 , I VZ2 , I VZ3 , I VZ4 , I VZ5 , I VZ6 are the images of the viewing zones VZ 1 , VZ 2 , VZ 3 , VZ 4 , VZ 5 , VZ 6 with respect to the projection device 40 and the relay device 60.
  • the free-form surface composite device is composed of a transmissive curved surface FS1, a reflective curved surface FS2, a transflective curved surface FS3, a transmissive curved surface FS4, and a transmissive curved surface FS5.
  • FS1, FS2, FS3, and FS4 perform the function of the projection device 40 together
  • FS2, FS3 perform the function of the relay device 60
  • FS5 has a compensation modulation function, allowing external ambient light to enter the pupil 50 of the observer without being affected by FS3 and FS4.
  • the relay device 60 can also be an optical waveguide device, which is referred to as an optical waveguide type relay device 60.
  • the optical waveguide type relay device 60 includes an entrance pupil 601, a coupling device 602, an optical waveguide body 603, reflecting surfaces 604 a and 604 b, a coupling out device 605 and an exit pupil 606.
  • the projection device 40 includes a component 40a and a component 40b.
  • the light emitted by the sub-pixel p m is converted into parallel light by the component 40a of the projection device 40 and then enters the in-coupling device 602 through the entrance pupil 601 of the optical waveguide type relay device 60;
  • the in-coupling device 602 guides the parallel light from the sub-pixel p m into the optical waveguide body 603, where it propagates to the out-coupling device 605 by reflection off the reflective surfaces 604a and 604b; the out-coupling device 605 modulates the incident beam and guides it through the exit pupil 606 into the component 40b of the projection device 40;
  • the component 40b of the projection device 40 guides the light projected by the sub-pixel p m to propagate to the area where the observer's pupil 50 is located, modulating it so that its backward extensions converge on the virtual image I pm .
  • the virtual image I pm is the virtual image of the sub-pixel p m .
  • I pn corresponds to the virtual image of the sub-pixel p n.
  • sub-pixel images such as I pm and I pn together form the image I 10 of the display device 10, which serves as the equivalent display device. Then, when the light information of at least two views or/and spliced views from the equivalent display device I 10 is incident on the observer's pupil 50, monocular multi-view display can be performed.
  • the compensation device 80 is used to compensate the influence of the component 40b of the projection device 40 on the incident light of the external environment, and can be removed when the external environment light is not needed.
  • the projection device assembly 40b in the figure can also be combined with the coupling-out device 605, such as a holographic device or a convex reflective surface placed at the coupling-out device 605, to have the common function of the coupling-out device 605 and the projection device assembly 40b.
  • when the composite device combining the functions of the projection device component 40b and the out-coupling device 605 has angular selectivity, that is, it modulates only the beams propagated by the in-coupling device 602 and has no effect on ambient light incident from outside, the compensation device 80 can also be removed.
  • Fig. 19 only takes a commonly used optical waveguide device as an example.
  • the existing optical waveguide devices of various structures can actually be used as the optical waveguide relay device 60 of this patent.
  • the coupling device 602 is a reflective surface.
  • to address the dispersion problem of optical waveguide devices, a multi-optical-waveguide stack structure can also be used. As shown in Figure 20, three optical waveguide components are respectively responsible for the propagation and guidance of R light, G light, and B light.
  • the in-coupling device and out-coupling device of each waveguide can then be designed according to the wavelength of the light it is responsible for transmitting, to reduce the dispersion effect.
  • the light emitted from each sub-pixel passes through the assembly 40 a of the projection device 40 and enters the optical waveguide relay device 60 in a parallel state. It can also be different.
  • in FIG. 21, the light from the display device 10 that passes through each point of the viewing zones formed by the grating device 20 is converted into parallel light by the component 40a of the projection device 40 and then enters the in-coupling device 602 through the entrance pupil 601 of the optical waveguide type relay device 60;
  • the in-coupling device 602 guides this parallel light into the optical waveguide body 603, where it propagates to the out-coupling device 605 by reflection off the reflective surfaces 604a and 604b;
  • the out-coupling device 605 modulates the incident beams and guides them through the exit pupil 606 into the component 40b of the projection device 40;
  • the component 40b of the projection device 40 converges the light transmitted from the display device 10 through each point of the viewing zones formed by the grating device 20, for example converging the light passing through viewing zone VZ 1 onto its image I VZ1 .
  • when the light information from at least two sub-pixel groups or spliced sub-pixel groups of the display device 10 is incident on the same observer pupil 50, display can be performed on the basis of monocular multi-view.
  • in this structure, pupil expansion causes the same sub-pixel to project more than one light beam, along different vector directions, into the area where the observer's pupil 50 is located.
  • it is then required that, of the different beams projected by the same sub-pixel toward the area where the observer's pupil 50 is located at the same time point, their separation on the plane of the observer's pupil 50 is greater than the observer's pupil diameter, to ensure that they are not incident on the pupil at the same time.
  • in this case it is necessary to use the tracking device 70 to determine the position of the observer's pupil 50 in real time; according to this position, the control device 30 determines the unique sagittal beam projected by each sub-pixel that is incident on the observer's pupil 50, and, based on the sagittal direction of that beam, determines the light information loaded by that sub-pixel according to the aforementioned method.
  • the core idea of the present invention is to use sub-pixels as the basic display unit and, through the light splitting of the grating device 20, guide multiple sub-pixel groups to project at least two images to the same observer pupil 50; based on the spatial superposition of the sagittal beams corresponding to the at least two images, monocularly focusable three-dimensional scene presentation is achieved.
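The light-splitting and slant-angle formulas referred to above (for the cylindrical-lens grating of FIGS. 4-6) are not reproduced in this text. The following is only an assumed reconstruction of commonly used lenticular design relations, written in terms of the parameters defined above (p, e, b, D_b, D_e, dx′, dy′, N_row); it is not quoted from the patent.

```latex
% Assumed reconstruction of standard lenticular light-splitting relations (not the
% patent's verbatim formulas). A sub-pixel offset by p from the optical center of its
% lens (at distance D_b from the display) projects through that center onto the
% viewing-zone plane (at distance D_e from the display), giving the zone pitch e:
\[
  \frac{e}{p} \;=\; \frac{D_e - D_b}{D_b}
\]
% For the zones of all lenses to converge onto one common set of viewing zones, the
% lens pitch b is slightly smaller than the width of one sub-pixel periodic unit of
% N_zone sub-pixels (N_zone = 6 in the example above):
\[
  \frac{b}{N_{zone}\,p} \;=\; \frac{D_e - D_b}{D_e}
\]
% Slanted case (FIG. 4): in the common layout where one sub-pixel periodic unit is
% spread over N_row rows, advancing by one sub-pixel pitch d_{x'} per N_row rows,
% the slant angle \theta between the column direction y' and the lens axis y is
\[
  \tan\theta \;=\; \frac{d_{x'}}{N_{row}\,d_{y'}}
\]
```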

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Diffracting Gratings Or Hologram Optical Elements (AREA)

Abstract

A monocular multi-view display method using sub-pixels as display units. Sub-pixels of a display device (10) are taken as the basic display unit; under the light-splitting control of a grating device (20), the light projected by multiple sub-pixel groups of the display device (10) is guided through their respective corresponding viewing zones to the area where an observer's pupil (50) is located. Each sub-pixel group modulates and projects a different view of the target scene, and monocularly focusable three-dimensional scene display is achieved by means of the more than one view incident on the same pupil (50) of the observer. Projected light from sub-pixels of different colors is superimposed along the respective corresponding sagittal directions at each display object point to form spatial color light spots, converting the conventional display based on surface-distributed color pixels spliced together from sub-pixels of different colors into a display based on spatially distributed color light spots formed by superimposing the projected beams of sub-pixels of different colors. The grating device (20) guides the beams from the sub-pixels, with restricted divergence angles, along their respective corresponding sagittal directions to the viewing zones corresponding to the sub-pixel groups to which they belong.

Description

Monocular multi-view display method using sub-pixels as display units
Technical Field
The present invention relates to the field of three-dimensional display technology, and more specifically to a monocular multi-view display method using sub-pixels as display units.
Background Art
Compared with conventional two-dimensional display, three-dimensional display can provide optical information whose dimensionality matches the real world that people perceive, and is attracting more and more attention. Stereoscopic display technology (including autostereoscopic display), which performs three-dimensional presentation based on the principle of binocular parallax, projects a corresponding two-dimensional image to each of the observer's eyes and triggers the observer's depth perception through the crossing of the two lines of sight at the off-screen scene. However, to see its corresponding two-dimensional image clearly, each eye of the observer must stay focused on the display surface, which leads to the focus-convergence conflict, that is, an inconsistency between the observer's monocular focusing depth and binocular convergence depth. This inconsistency contradicts the physiological habit of natural viewing, in which monocular focusing depth and binocular convergence depth agree when people observe a real three-dimensional scene; it causes visual discomfort for the observer and is currently a bottleneck hindering the wide application of three-dimensional display technology. Monocular multi-view is an effective technical path for solving the focus-convergence conflict. It uses optical devices to guide different pixel groups of a display device to project at least two two-dimensional images of the scene to be displayed to the same eye of the observer, so that at least two light beams passing through each display object point enter that eye; when the light intensity distribution of the spot formed by superimposing these at least two beams at the display object point can draw the observer's eye to focus freely on the superimposed spot, the above focus-convergence conflict is overcome.
Summary of the Invention
The present invention proposes a monocular multi-view display method that takes the sub-pixels of a display device as the basic display unit and, through light splitting by a grating device, guides multiple sub-pixel groups to project multiple images of the scene to be displayed into the area where the observer's pupil is located, realizing monocularly focusable three-dimensional scene display based on monocular multi-view. The existing technology for monocular multi-view display based on grating light splitting uses pixels as the basic display unit and guides different pixel groups, through the grating device, to project different views to the area where the observer's pupil is located, as described for example in PCT/CN2019/070029 (GRATING BASED THREE-DIMENTIONAL DISPLAY METHOD FOR PRESENTING MORE THAN ONE VIEWS TO EACH PUPIL). The method of this patent uses sub-pixels as the basic display unit and projects different views to the area where the observer's pupil is located through different sub-pixel groups. To form one monocularly focusable spatial light spot, the existing monocular multi-view display method requires at least two pixels, whereas the method of this patent requires only two sub-pixels at minimum. Comparatively, the monocular multi-view display method of this patent with sub-pixels as the display unit can effectively improve the two-dimensional view projection capability of the display device, which is more conducive to enlarging the viewing area of the observer's eye, or, by increasing the spatial density of the viewing zones corresponding to the projected views, to extending the displayable depth of the monocularly focusable scene. Further, this patent uses a projection device to project a magnified image of the display device, extending the applicability of the method to near-eye display, and uses a relay device to optimize the spatial layout of the optical structure. The method can be applied directly to a binocular three-dimensional display optical engine, and can also be applied to an optical engine for a single eye.
To realize monocularly focusable three-dimensional display based on monocular multi-view with sub-pixels as the display unit, the present invention provides the following solution:
A monocular multi-view display method using sub-pixels as display units includes the following steps:
(i) taking each sub-pixel of a display device as the basic display unit, arranging a grating device in front of the display device along the transmission direction of the light emitted by the sub-pixels, to guide each sub-pixel of the display device to project its light beam to its corresponding viewing zone;
wherein the sub-pixels corresponding to the same viewing zone form a sub-pixel group, and different sub-pixel groups share no sub-pixel at the same time point;
(ii) a control device connected to the display device controls each sub-pixel group to load and display its corresponding image, wherein the image information loaded by each sub-pixel is the projection light information of the scene to be displayed at the intersection of the plane where the observer's pupil is located with the straight line along the transmission vector of the beam projected by that sub-pixel into the area where the observer's pupil is located;
wherein the image displayed by one sub-pixel group is a view of the scene to be displayed, and the image displayed by a spliced sub-pixel group formed by complementarily splicing different parts of different sub-pixel groups is a spliced view;
wherein the spatial position distribution of the viewing zones corresponding to the sub-pixel groups of the display device is set such that the light information of a total of at least two views or/and spliced views is incident on the same observer pupil.
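As a concrete illustration of steps (i) and (ii), the sketch below groups sub-pixels by the viewing zone the grating device directs them to, and loads each sub-pixel with the scene's projection light information along its own transmission vector. It is a minimal sketch under assumed helper names (`viewing_zone_of`, `render_ray`) and an assumed sub-pixel object with position attributes; none of these identifiers come from the patent.

```python
from collections import defaultdict

def build_subpixel_groups(subpixels, viewing_zone_of):
    """Step (i): sub-pixels directed to the same viewing zone form one group.

    viewing_zone_of(sp) is an assumed helper returning the index of the viewing
    zone toward which the grating device guides sub-pixel sp."""
    groups = defaultdict(list)
    for sp in subpixels:
        groups[viewing_zone_of(sp)].append(sp)
    return groups  # at any one time point, no sub-pixel belongs to two groups

def load_images(groups, zone_centers, z_pupil_plane, render_ray):
    """Step (ii): each sub-pixel loads the projection light information of the scene
    along the line from the sub-pixel through its viewing zone, evaluated at the
    point where that line meets the plane of the observer's pupil.

    render_ray(origin, point_on_pupil_plane) is an assumed renderer returning the
    scene's projection light information along the line through the two points."""
    loaded = {}
    for zone, members in groups.items():
        zx, zy = zone_centers[zone]  # zone center; the viewing-zone plane is assumed
                                     # here to coincide with the pupil plane
        for sp in members:
            loaded[sp] = render_ray((sp.x, sp.y, sp.z), (zx, zy, z_pupil_plane))
    return loaded
```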
Further, the grating units of the grating device are cylindrical lenses or slits.
Further, the grating device is composed of microstructure units, each microstructure unit being placed in one-to-one correspondence with a sub-pixel of the display device to modulate the light emitted by that sub-pixel.
Further, step (i) also includes: along the arrangement direction of the grating units, grating units separated by (T-1) grating units form a grating unit group; step (ii) also includes: the control device controls the T grating unit groups to be opened for light transmission sequentially within each time period composed of T adjacent time points, with only one grating unit group opened at any one time point, where T ≥ 2.
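To make the time-sequential variant of step (ii) concrete, the sketch below shows one possible organization of the schedule: grating units are assigned to T groups by index modulo T, and within each period of T adjacent time points exactly one group is open while the image data is refreshed in step with it. The group assignment and the callbacks `set_unit_open` and `refresh_subpixels` are illustrative assumptions, not interfaces defined by the patent.

```python
def grating_group(unit_index: int, T: int) -> int:
    """Grating units separated by (T-1) units fall into the same group."""
    return unit_index % T

def run_period(num_units: int, T: int, set_unit_open, refresh_subpixels):
    """One time period = T adjacent time points; exactly one group open per point."""
    for t in range(T):                            # time point t within the period
        for u in range(num_units):
            set_unit_open(u, grating_group(u, T) == t)
        refresh_subpixels(t)   # load image data in sync with the currently open group
```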
Further, step (i) also includes: each sub-pixel of the display device emits light of one of M different colors; along the arrangement direction of the grating units, grating units separated by (M-1) grating units form a grating unit group, and the M grating unit groups are set, in one-to-one correspondence, to each allow only one of the M colors of light emitted by the display device to pass, where M ≥ 2.
Further, step (i) also includes placing a projection device at a position corresponding to the display device, to form a magnified image of the display device.
Further, step (i) also includes placing a relay device on the transmission path of the light projected by the display device, to guide the projected beams of the display device into the area where the observer's pupil is located.
Further, the relay device is a reflective surface, a transflective (half-transmissive, half-reflective) surface, a combination of free-form surfaces, or an optical waveguide device.
Further, step (ii) also includes connecting a tracking device to the control device and tracking the position of the observer's pupil in real time through the tracking device.
Further, step (ii) also includes: according to the position of the observer's pupil, determining, for each sub-pixel whose projected light enters the observer's pupil, the image information it loads, namely the projection light information of the scene to be displayed at the intersection of the observer's pupil with the straight line along the transmission vector of the beam projected by that sub-pixel into the observer's pupil.
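For the tracked-pupil variant, determining the information loaded by a given sub-pixel reduces to intersecting the line from the sub-pixel toward the tracked pupil with the plane of the pupil and sampling the scene's projection light along that line. The sketch below is a minimal illustration assuming the pupil plane is z = z_pupil and that a hypothetical `sample_scene(origin, target)` returns the scene's projection light information along the line through the two points.

```python
def loaded_info_for_subpixel(sp_pos, pupil_center, z_pupil, sample_scene):
    """sp_pos, pupil_center: (x, y, z) tuples; the observer's pupil lies in z = z_pupil.

    Returns the projection light information of the scene to be displayed at the
    point where the line from the sub-pixel through the tracked pupil crosses the
    pupil plane (the loading rule of step (ii))."""
    sx, sy, sz = sp_pos
    px, py, pz = pupil_center
    t = (z_pupil - sz) / (pz - sz)     # parameter where the line crosses z = z_pupil
    hit = (sx + t * (px - sx), sy + t * (py - sy), z_pupil)
    return sample_scene(sp_pos, hit)
```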
The present invention uses sub-pixels as the basic display unit; compared with monocular multi-view display methods using pixels as the display unit, it can effectively increase the number of projected two-dimensional views and, combined with the characteristic design of the grating device, further satisfy the requirement of monocular multi-view display on the number of projected two-dimensional views.
The present invention has the following technical effects: the monocular multi-view display method using sub-pixels as the display unit provides an implementation of three-dimensional display free of focus-convergence conflict. The method can effectively improve the two-dimensional view projection capability of the display device, which is more conducive to enlarging the viewing area of the observer's eye, or, by increasing the spatial density of the viewing zones corresponding to the projected views, to extending the displayable depth of the monocularly focusable scene. Further, the present invention uses a projection device to project a magnified image of the display device, extending the applicability of the method to near-eye display, and uses a relay device to optimize the spatial layout of the optical structure. The method can be applied directly to a binocular three-dimensional display optical engine, and can also be applied to an optical engine for a single eye.
Details of the embodiments of the present invention are embodied in the accompanying drawings and the following description. Other features, objects, and advantages of the present invention will become more apparent from the following description and drawings.
Brief Description of the Drawings
The drawings are intended to help a better understanding of the present invention and form part of this specification. These drawings illustrating the embodiments, together with the description, serve to explain the principle of the present invention.
Fig. 1 is a schematic diagram of the principle of an existing monocular multi-view display with pixels as display units.
Fig. 2 is a schematic diagram of the monocular multi-view display method of this patent with sub-pixels as display units.
Fig. 3 is an explanatory diagram of the stitching rule of a stitched sub-pixel group.
Fig. 4 is a schematic diagram of the corresponding positional relationship between the grating device and the sub-pixels.
Fig. 5 is a schematic diagram of the light-splitting principle of the grating device.
Fig. 6 is a partial enlarged view of the corresponding positions of the grating device and the sub-pixels shown in Fig. 4.
Fig. 7 is a schematic diagram of a sub-pixel arrangement that produces pure-colour views.
Fig. 8 is a schematic diagram of pure-colour views projected through the light splitting of the grating device.
Fig. 9 is a partial enlarged view of the corresponding positions of the grating device and the sub-pixels shown in Fig. 7.
Fig. 10 is a schematic diagram of the gating state, at one time point, of a grating device with time-sequential characteristics.
Fig. 11 is a schematic diagram of the gating state, at another time point, of a grating device with time-sequential characteristics.
Fig. 12 is a schematic diagram of the working principle of a grating device with colour-selective characteristics.
Fig. 13 is a schematic diagram of another application environment of a grating device with colour-selective characteristics.
Fig. 14 is a schematic diagram of the influence of the inclination of strip-shaped viewing zones on the viewing-zone coverage along the direction of the line connecting the two eyes.
Fig. 15 is a schematic diagram of a near-eye monocular display optical module incorporating a projection device.
Fig. 16 is a schematic diagram of a binocular display structure based on near-eye monocular display optical modules.
Fig. 17 is a schematic diagram of a near-eye monocular display optical module incorporating a relay device.
Fig. 18 is an example of a relay device based on free-form surfaces.
Fig. 19 is a first example of a relay device based on an optical waveguide device.
Fig. 20 is a schematic diagram of a multi-waveguide stacked structure.
Fig. 21 is a second example of a relay device based on an optical waveguide device.
Detailed Description of the Embodiments
The monocular multi-view display method of the present invention with sub-pixels as display units directly takes the sub-pixel as the basic display unit, uses multiple sub-pixel groups to project multiple views to the region where the observer's pupil is located and, based on the spatial superposition of differently directed beams from the different views, forms monocularly focusable display light points, thereby realizing three-dimensional display free of the focusing-convergence conflict.
Existing monocular multi-view techniques take the pixel as the basic display unit and project at least two views to the region where the observer's pupil 50 is located through different pixel groups of the display device 10. At any displayed object point, at least two beams from different pixel groups passing through that point superpose to form a spatial display light point on which the observer's single eye can focus. When this spatially superposed light point is more attractive to the observer's eye than the intensity of each superposed beam at its emitting pixel, it can pull the observer's eye to focus on the superposed point, thereby overcoming the focusing-convergence conflict. Fig. 1 illustrates this concretely with two views for one eye. Pixel group 1 projects view 1 associated with viewing zone VZ1, and the light projected by each pixel of pixel group 1 is guided by the grating device 20 to exit through viewing zone VZ1 without passing through viewing zone VZ2; pixel group 2 projects view 2 associated with viewing zone VZ2, and the light projected by each pixel of pixel group 2 is guided by the grating device 20 to exit through viewing zone VZ2 without passing through viewing zone VZ1. The beams projected by pixel group 1 carry the light information of view 1 through viewing zone VZ1, and the beams projected by pixel group 2 carry the light information of view 2 through viewing zone VZ2, and they respectively enter the observer's pupil 50. The x direction is the arrangement direction of the viewing zones. At an object point P to be displayed, beam 1 from pixel group 1 and beam 2 from pixel group 2 superpose to form a spatially superposed light point. When the intensity distribution of this point can attract the eye corresponding to the observer's pupil 50 to focus on the object point P, the focus of the observer's eye is no longer forced to stay at the pixels emitting beam 1 or beam 2, i.e. no longer forced to stay on the display device 10, so the focusing-convergence conflict is overcome. In the same way, the other object points on which the single eye can focus together constitute a monocularly focusable displayed scene.
In fact, increasing the number and distribution density of the viewing zones allows the light information of more views to enter the observer's pupil 50. In that case, more superposed beams pass through each displayed object point and enter the observer's pupil 50 along their respective directions. The superposition of this larger number of beams can strengthen the attraction of the spatially superposed light point on the focus of the observer's eye, which favours the display of scenes with a larger out-of-screen distance. At the same time, more projected viewing zones also provide a larger viewing area for the observer's pupil 50, so that when the pupil 50 moves within this larger viewing area it continues to see, based on the monocular multi-view principle, a displayed scene free of the focusing-convergence conflict. In monocular multi-view display based on the grating device 20, an increase in the number of viewing zones corresponds to an increase in the number of projected views, which requires the grating device 20 to divide the pixels of the display device 10 into more pixel groups in order to project more views. This correspondingly reduces the number of pixels contained in each pixel group, i.e. reduces the resolution of the projected views.
This patent performs monocular multi-view display with the sub-pixel as the basic display unit. Compared with the method taking the pixel as the basic display unit, and with the same display device 10 and grating device 20, taking the sub-pixel as the basic display unit can increase the number of projected views and of their corresponding viewing zones by a factor of N, where N ≥ 2 is the number of sub-pixels contained in each pixel. The display device 10 emits light of M colours, where M ≥ 2. Fig. 2 takes as an example a display device 10 whose pixels each consist of R (red), G (green) and B (blue) sub-pixels, corresponding to M = 3, the R, G and B sub-pixels being the M = 3 types of sub-pixels that emit red light (R light), green light (G light) and blue light (B light) respectively. If the grating device 20 is designed to split the light into 6 viewing zones VZ1, VZ2, VZ3, VZ4, VZ5 and VZ6, the number of sub-pixels corresponding to each viewing zone equals the number of pixels in each of the pixel groups corresponding to the two viewing zones of Fig. 1. According to the principle of monocular multi-view, with the same display device 10 and grating device 20, in the situation of Fig. 1 the observer's pupil 50 has to receive the two views projected through both viewing zones, whereas in the situation of Fig. 2 the pupil 50 only has to receive two views projected through two of the viewing zones. By comparison, the 6 viewing zones of Fig. 2 can provide a larger viewing area for the observer's pupil 50, or, by increasing the distribution density of the viewing zones, guide more views into the observer's pupil, strengthening the attraction of the superposed light points on the focus of the eye and optimizing the monocular multi-view display effect. In other words, without sacrificing the resolution or the display rate of each projected view, the monocular multi-view display method of this patent with the sub-pixel as the basic display unit can, relative to the existing monocular multi-view display with the pixel as the basic display unit, effectively increase the number of views that can be projected and of their corresponding viewing zones, thereby using more viewing zones to give the observer's pupil 50 a larger viewing area, or optimizing the monocular multi-view display effect by increasing the viewing-zone distribution density.
The sub-pixels corresponding to the same viewing zone form a group; the 6 viewing zones VZ1, VZ2, VZ3, VZ4, VZ5 and VZ6 in Fig. 2 correspond to sub-pixel group 1, sub-pixel group 2, sub-pixel group 3, sub-pixel group 4, sub-pixel group 5 and sub-pixel group 6 respectively. The control device 30 controls the image information loaded on each sub-pixel to be the projection light information of the scene to be displayed, taken along the propagation direction of the beam projected by that sub-pixel into the region where the observer's pupil 50 is located, at the point where the straight line of that direction intersects the plane in which the observer's pupil 50 lies. In other words, each sub-pixel group loads the view associated with its own viewing zone. The spacing of the viewing zones is designed so that the light information of at least two views exiting through at least two viewing zones enters the same observer pupil 50. As shown in Fig. 2, through the displayed object point P, the three directed beams 3, 4 and 5 coming from sub-pixel group 3, sub-pixel group 4 and sub-pixel group 5 via viewing zones VZ3, VZ4 and VZ5 superpose to form a monocularly focusable display light point. This display light point lies between the display device 10 and the observer's pupil 50 and is formed by the real superposition of beams from different sub-pixels. In the figure the beams projected by the sub-pixels toward the region of the observer's pupil 50, for example the directed beams 3, 4 and 5, are drawn as straight lines. In reality, the light emitted by each sub-pixel is divergent light with a certain divergence angle. One function of the grating device 20 is that each of its grating units constrains the divergence angle of the beam projected by the corresponding sub-pixel, so that on the plane of the observer's pupil 50, along the arrangement direction of the grating units, the region of the beam in which the intensity exceeds 50% of the peak intensity is smaller than the diameter of the observer's pupil 50. Under this condition, the intensity distribution formed at an object point by the superposition of two or more differently directed beams passing through that point is, relative to the intensity distribution of these beams at their emitting sub-pixels, more likely to exert a stronger attraction on the focus of the observer's eye, so that, based on the superposition of at least two beams, a monocularly focusable spatial light-point display can be realized within a certain out-of-screen range on the basis of monocular multi-view. In this patent, the beams from the sub-pixels that enter the region of the observer's pupil 50 and whose divergence angles satisfy the above requirement are represented by rays. Spatially superposed light points can also be generated in the -z region of the figure, such as the point P′ in Fig. 2, formed by the intersection of the backward extensions of beams 6, 7 and 8. When the observer's eye is at a position where it can receive beams 6, 7 and 8, what it sees is the superposed light distribution, at the point P′, of the equivalent light distributions obtained by propagating beams 6, 7 and 8 backwards by diffraction; this patent likewise calls it the spatially superposed light point at the point P′, whose image forms a real light point on the retina of the observer's eye. In this patent, the displayed scenes on both sides of the display device 10 are, for the observer, generated by the same kind of multi-beam superposition. In the following, only the displayed scene on the side of the propagation direction of the light emitted by the display device 10 is taken as an example for explanation.
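The divergence-angle constraint just described (the half-intensity width of a sub-pixel beam on the pupil plane, along the grating-unit arrangement direction, should stay below the pupil diameter) can be checked with a back-of-the-envelope sketch, assuming for simplicity that the half-intensity width grows linearly with propagation distance; the function name and all numbers are illustrative assumptions, not values from the patent.

    import math

    def beam_footprint_ok(w0_mm, fwhm_divergence_deg, distance_mm, pupil_diameter_mm=4.0):
        """Rough check of the constraint in the text: the half-intensity width of a
        sub-pixel beam, measured on the pupil plane along the grating-unit
        arrangement direction, should stay below the pupil diameter."""
        half_angle = math.radians(fwhm_divergence_deg) / 2
        width_at_pupil = w0_mm + 2 * distance_mm * math.tan(half_angle)
        return width_at_pupil, width_at_pupil < pupil_diameter_mm

    # Purely illustrative numbers: a 0.05 mm wide sub-pixel beam with a 0.2 degree
    # full half-intensity divergence, observed 400 mm from the display.
    print(beam_footprint_ok(0.05, 0.2, 400.0))   # about 1.45 mm, constraint satisfied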
In Fig. 2 the observer's pupil 50 is placed close to the plane where the viewing zones lie. When the observer's pupil moves forward or backward along the beam propagation direction and deviates from the viewing-zone plane, the pupil 50 may no longer receive all the beams of at least two complete views. An observer pupil 50 at the position shown in Fig. 3 can receive the view information entering through viewing zone VZ3, projected by the corresponding sub-pixel group 3. But a pupil 50 at this position can only receive the partial view projected through viewing zone VZ4 by the sub-pixels of sub-pixel group 4 in the region M_s2M_r1, and the partial view projected through viewing zone VZ2 by the sub-pixels of sub-pixel group 2 in the region M_s1M_r2. In the figure, M_p1 and M_p2 are the two edge points of the observer's pupil 50 along the viewing-zone arrangement direction x, M_s1 and M_s2 are the two edge points of the sub-pixel distribution region of the display device 10, M_r1 is the intersection with the display device 10 of the line connecting M_p2 and the -x edge point of viewing zone VZ4, and M_r2 is the intersection with the display device 10 of the line connecting M_p1 and the x edge point of viewing zone VZ2. The regions M_s2M_r1 and M_s1M_r2 overlap; taking the sub-pixels in their spatially complementary regions M_s2M_t and M_tM_s1 and stitching them together gives a stitched sub-pixel group, whose displayed image is one stitched view of the scene to be displayed, where M_t is a point within the overlap region M_r1M_r2. The observer's pupil 50 at the position of Fig. 3 can thus receive one complete view and one complete stitched view; through each displayed object point there will be at least two beams, one from the view and one from the stitched view, entering the observer's pupil 50, and within a certain out-of-screen range they can superpose, based on the monocular multi-view principle, into a monocularly focusable spatial light point. By analogy, as the distance between the observer's pupil 50 and the viewing-zone plane increases, the stitched sub-pixel group it can observe will be complementarily stitched from different parts of more sub-pixel groups.
Specifically, take a grating device 20 whose grating units are cylindrical lenses as an example, and take a common RGB-arranged display as the display device 10, as shown in Fig. 4. Each pixel is composed of three sub-pixels emitting R light, G light and B light arranged along the x′ direction, and sub-pixels emitting the same colour of light are arranged in adjacent columns along the y′ direction. The grating device 20, whose grating units are cylindrical lenses arranged along the one-dimensional x direction, is placed in correspondence with the display device 10. Based on the grating light-splitting formulas:
p / e = D_b / (D_e - D_b)    1)
b / (N_zone × p) = (D_e - D_b) / D_e    2),
along the x direction, the emitted light of each group of N_zone = 6 adjacent, mutually staggered sub-pixels is guided by the corresponding cylindrical-lens unit to the corresponding one of N_zone = 6 strip-shaped viewing zones, where N_zone ≥ 2. The N_zone = 6 adjacent sub-pixels whose emitted light is guided by one corresponding grating unit to the N_zone = 6 strip-shaped viewing zones form one sub-pixel period unit. Taking the two sub-pixel period units shown in the two dashed boxes of Fig. 4 as an example, the principle by which the light projected by their sub-pixels is split and guided by the grating device 20 is shown in Fig. 5. These two sub-pixel period units correspond to the grating units G1 and G2 respectively, and O_k+1 and O_k+2 are the optical centres of the cylindrical-lens grating units G1 and G2 in the xz plane. Here p is the spacing of the sub-pixels of one sub-pixel period unit along the x direction, e is the viewing-zone pitch, D_b is the distance between the grating device 20 and the display device 10, D_e is the distance between the viewing zones and the display device 10, and b is the pitch of adjacent grating units. Comparing with Fig. 4, the sub-pixels shown in Fig. 5 are staggered relative to one another along the y direction. The angle θ between the y′ direction and the long direction y of the grating units satisfies:
tan(θ) × N_row = dx′ / dy′, (θ ≠ 0, N_row ≥ 2)    3)
θ = 0    4).
Here dx′ and dy′ are the sub-pixel pitches along the x′ and y′ directions respectively, and N_row is the number of sub-pixel rows occupied by one sub-pixel period unit. θ = 0 corresponds to the case where the x′ direction coincides with the x direction, in which case all the sub-pixels of one sub-pixel period unit lie in the same row. Considering the balance of resolution along the different directions, the case θ ≠ 0 is usually adopted. Then, from the desired N_zone and the chosen N_row value, the pitch b of adjacent grating units is determined by formula 2) above. Specifically, Fig. 4 takes N_row = 2 and N_zone = 6. The sub-pixels SP_Raa, SP_Rad, SP_Rag, SP_Raj, SP_Ram, ..., SP_Gcb, SP_Gce, SP_Gch, SP_Gck, SP_Gcn, ..., ... of the display device 10 form sub-pixel group 1 corresponding to viewing zone VZ1; the sub-pixels SP_Gbb, SP_Gbe, SP_Gbh, SP_Gbk, SP_Gbn, ..., SP_Bdc, SP_Bdf, SP_Bdi, SP_Bdl, SP_Bdo, ..., ... form sub-pixel group 2 corresponding to viewing zone VZ2, and so on, determining the N_zone = 6 sub-pixel groups. With the viewing-zone pitch designed to be small enough that the projected light information of at least two views and/or stitched views enters the same pupil 50 of the observer, a monocularly focusable three-dimensional display can be realized on the basis of monocular multi-view. Fig. 6 is a partial enlargement of Fig. 4 to show the arrangement of the sub-pixels more clearly.
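A small sketch, assuming only the relations stated in formulas 1) to 3), that solves for the viewing-zone pitch e, the grating-unit pitch b and the slant angle θ from chosen values of p, D_b, D_e, N_zone, N_row, dx′ and dy′; the numerical values are illustrative only and are not taken from the patent.

    import math

    def grating_design(p, D_b, D_e, N_zone, dx_prime, dy_prime, N_row):
        """Solve formulas 1)-3) of the text for the viewing-zone pitch e, the
        grating-unit pitch b and the slant angle theta of the grating-unit axis."""
        e = p * (D_e - D_b) / D_b                       # from 1): p/e = D_b/(D_e - D_b)
        b = N_zone * p * (D_e - D_b) / D_e              # from 2): b/(N_zone*p) = (D_e - D_b)/D_e
        theta = math.atan2(dx_prime, N_row * dy_prime)  # from 3): tan(theta)*N_row = dx'/dy'
        return e, b, math.degrees(theta)

    # Illustrative numbers: 0.01 mm sub-pixel pitch along x, 2 mm grating gap,
    # 400 mm viewing-zone distance, 6 zones, dx' = 0.01 mm, dy' = 0.03 mm, N_row = 2.
    print(grating_design(0.01, 2.0, 400.0, 6, 0.01, 0.03, 2))
    # -> e about 1.99 mm, b about 0.0597 mm, theta about 9.46 degrees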
Among the various existing displays that can be selected as the display device 10, colour light information is presented by mixing the M basic colours of light emitted by the sub-pixels, for example the common RGB display with M = 3 or the RGBW display with M = 4, where W denotes a sub-pixel emitting white light. As mentioned above, when monocular multi-view display is performed with the minimum of two superposed beams through each displayed object point, the two beams can superpose into a monocularly focusable spatial light point within a certain out-of-screen distance, but the presentation of its colour is inaccurate because of the missing basic colours. For accurate colour presentation in a monocular multi-view display based on the spatial superposition of the beams projected by the sub-pixels, the superposed beams entering the same observer pupil 50 through each displayed object point are optimally at least M beams of different colours. This also requires that the same pupil 50 of the observer optimally receives at least M views and/or stitched views, and that these M views and/or stitched views display pure-colour image information of different colours, for example a pure green view or stitched view, or a pure white view or stitched view. In other words, all the sub-pixels whose projected light information enters the observer's pupil 50 can optimally be combined into M pure-colour sub-pixel groups or stitched sub-pixel groups which respectively emit the M different colours. With the design parameters of Figs. 4 and 5, the view corresponding to each viewing zone emits light of two different colours, so its corresponding sub-pixel group is not a pure-colour sub-pixel group. In this case, for an ideal colour presentation, the optimal situation is that the sub-pixels corresponding to the views or partial views received by the observer's pupil 50 can be divided and regrouped into M = 3 pure-colour stitched sub-pixel groups projecting R, G and B light respectively.
It is also possible to adjust the arrangement of the sub-pixels of the different colours on the display device 10 so that the sub-pixels corresponding to each viewing zone form a pure-colour sub-pixel group emitting light of the same colour. For example, the sub-pixel groups corresponding to M adjacent viewing zones can be designed as pure-colour sub-pixel groups projecting R, G and B light respectively. For example, with the sub-pixel arrangement shown in Fig. 7, a pure-colour view exits through each viewing zone, and the M pure-colour views exiting through M adjacent viewing zones have mutually different colours, as shown in Fig. 8. This design is favourable for ideal colour presentation. Fig. 9 is a partial enlargement of Fig. 7 to show the sub-pixel arrangement more clearly.
The grating device 20 may also have time-sequential characteristics. Along the arrangement direction of the grating units, grating units spaced (T-1) grating units apart are grouped, forming T grating-unit groups, where T ≥ 2. The control device 30 controls the T grating-unit groups to be opened for light transmission in time sequence within each time period composed of T adjacent time points, with only one grating-unit group opened at any one time point. Fig. 10 takes T = 2 as an example: G1(t), G2(t), G3(t), ... form grating-unit group 1, and G1(t+Δt/2), G2(t+Δt/2), G3(t+Δt/2), ... form grating-unit group 2. At the time point t within the time period t to t+Δt, the grating units of group 1 are opened and the grating units of group 2 are closed; at the time point t+Δt/2 within the same period, the grating units of group 2 are opened and those of group 1 are closed. The opening and closing of the grating units is controlled by the control device 30, for example by switching the apertures of an aperture array 201, whose apertures are placed in correspondence with the grating units. Then, as shown in Figs. 10 and 11, at the T = 2 time points of one time period, each viewing zone receives light projected from T = 2 different sub-pixel groups. For small Δt, thanks to persistence of vision, this is equivalent to increasing the resolution of the view received by the observer through that viewing zone. In Figs. 10 and 11, when the spacings δ1 and δ2 between a grating unit and its adjacent grating units of the other group, along the grating-unit arrangement direction, are equal, the viewing zones generated at the different time points of each time period spatially coincide. It is also possible to design δ1 ≠ δ2 so that the viewing zones generated at the different time points of each time period are spatially staggered, which can increase the distribution density of the viewing zones.
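The time-sequential gating just described can be summarized by a short sketch, assuming the grating units are indexed consecutively along their arrangement direction so that the units whose indices are congruent modulo T form one grating-unit group; the indexing convention and function name are assumptions made for illustration.

    def open_grating_units(num_units, T, time_point):
        """Time-sequential gating: the grating units whose index is congruent to the
        current time point modulo T form the group switched open for light
        transmission; all other units stay closed (T >= 2, one group per time point)."""
        k = time_point % T
        return [u for u in range(num_units) if u % T == k]

    # With T = 2 the even-indexed units open at time point 0 and the odd-indexed
    # units open at time point 1, as in the example of Figs. 10 and 11.
    for t in range(2):
        print(t, open_grating_units(8, 2, t))   # 0 [0, 2, 4, 6] / 1 [1, 3, 5, 7]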
The grating device 20 may also have colour-selective characteristics. Along the arrangement direction of the grating units, grating units spaced (M-1) grating units apart are grouped, forming M grating-unit groups, and the M grating-unit groups are arranged so that each of them, in one-to-one correspondence, allows only one of the M colours of light emitted by the display device 10 to pass. Fig. 12 uses an RGB display as the display device 10, with M = 3. The M = 3 adjacent grating units of the grating device 20 carry filters that pass only R light, G light and B light respectively. Each grating unit is named with a subscript indicating the colour of light its filter passes; for example, grating unit G_G1 denotes a grating unit whose filter passes only G light and which is number 1 among grating units of that kind. A grating-unit group composed of grating units of the same kind is also named after the colour it passes, for example the G-colour grating-unit group. Thus sub-pixels emitting B light correspond to the B-colour grating-unit group, sub-pixels emitting G light to the G-colour grating-unit group, and sub-pixels emitting R light to the R-colour grating-unit group. In other words, the sub-pixels emitting light of different colours project their views to their respective viewing zones through their own corresponding grating-unit groups, without interfering with one another. The distances between M adjacent grating units, i.e. δ3, δ4 and δ5 in Fig. 12, are optimally designed so that M adjacent viewing zones respectively emit the M different colours of light. The advantage of such a colour-selective grating device 20 is that it can also be applied to a display device 10 in which the same sub-pixel emits different colours of light in time sequence. For the display device 10 shown in Fig. 13, each sub-pixel, driven by a time-sequential backlight, projects R light, G light and B light in sequence. For this type of display device 10, the colour-selective grating device 20 described above makes the viewing zones corresponding to the different colours of light projected in sequence by a sub-pixel at the same spatial position spatially staggered. Fig. 13 shows the situation at the time t, within one time period t to t+Δt required for the sequential projection of R light, G light and B light, when the sub-pixels project R light. A B-colour beam emitted by sub-pixel SP4 at the time t+Δt/3 and a G-colour beam emitted by sub-pixel SP6 at the time t+2Δt/3 are also shown in Fig. 13 as dashed lines, to illustrate that, in order to make the viewing zones of views of different colours spatially staggered, the spacings of M adjacent grating units often need to be designed unequal. The colour-selective grating device 20 of Figs. 12 and 13 can also be replaced by a grating device 20 with time-sequential characteristics: the grating-unit groups corresponding to the sub-pixels emitting different colours of light are opened in turn at adjacent time points, with only one grating-unit group open at any one time point, and the sub-pixels emitting the corresponding colour of light are loaded with light information synchronously with the opening of their grating-unit group.
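A minimal sketch of the colour-selective gating, assuming the filters are assigned to the grating units cyclically in the order R, G, B along the arrangement direction; the cyclic assignment and the function name are illustrative assumptions, not the specific layout of Fig. 12.

    def transmitted(subpixel_color, grating_unit_index, color_order=("R", "G", "B")):
        """Colour-selective gating: grating unit i carries a filter passing only
        color_order[i % M]; a sub-pixel's light therefore reaches its viewing zone
        only through the grating-unit group of the matching colour."""
        M = len(color_order)
        return color_order[grating_unit_index % M] == subpixel_color

    print(transmitted("G", 4))   # True  - unit 4 belongs to the G-colour group (4 % 3 == 1)
    print(transmitted("G", 5))   # False - unit 5 belongs to the B-colour group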
The grating device 20 described above may also be a slit grating whose grating units are slits, and the display is performed in the same way.
In the above process, a tracking device 70 as shown in Fig. 2 may also be used. The tracking device 70 is connected to the control device 30 to obtain the position of the observer's pupil 50 in real time. Its function is that, when the viewing zones are strip-shaped, according to the position of the observer's pupil 50, for each sub-pixel whose projected light enters the observer's pupil 50, its loaded image information is determined to be the projection light information of the scene to be displayed, taken along the propagation direction of the beam projected by that sub-pixel and entering the observer's pupil 50, at the point where the straight line of that direction intersects the plane where the observer's pupil 50 lies.
Figs. 4 and 7 use an RGB display as the display device 10 for illustration. The display device 10 may also adopt other sub-pixel arrangement structures, for example a display with R, G, B and W four-colour sub-pixels, or a display with a Pentile sub-pixel arrangement; based on the above principle, monocular multi-view display is carried out by a similar method. It should be noted that, when sub-pixels emitting white light are introduced, white light, being mixed light, cannot be isolated from the R, G and B light by filters. When designing a colour-selective grating device 20 in which the grating units corresponding to R, G and B light carry the corresponding filters, the grating-unit group corresponding to the sub-pixels emitting white light needs to block the transmission of the light projected by the sub-pixels emitting R light, G light and B light on the basis of other characteristics. For example, the grating-unit group for W light and the other grating-unit groups are opened at different time points, and the sub-pixels corresponding to each grating-unit group load the corresponding light information synchronously only when that grating-unit group is open; or the grating-unit group for W light and the other grating-unit groups allow only mutually orthogonal polarization states to pass, for example the polarizers on the grating units for W light pass only vertically polarized light while the polarizers on the grating units for R, G and B light pass only horizontally polarized light, and the light projected by the sub-pixels corresponding to each grating-unit group is set to be only the polarization state that can pass the corresponding grating-unit group. The two orthogonal polarization states here can also be replaced by circular polarization states of opposite handedness. In addition, besides the rectangular sub-pixels shown in the figures above, the sub-pixels can also be designed with other shapes, for example square sub-pixels, or different sub-pixels with different shapes. Moreover, the display device 10 is not limited to a thin-structure display; it can also be any of various other types of display, for example a transmissive or reflective display requiring a backlight, or a transmissive or reflective projection screen receiving projected information.
When the number of projected viewing zones is sufficient to project at least two views and/or stitched views to each eye of the observer, the structures shown in Figs. 4 and 7 can be used directly as a binocular optical display engine. In this case, the coverage of the projected light information of the viewing zones along the direction x′ of the line connecting the observer's two eyes can be enlarged by designing the viewing-zone arrangement direction to deviate from the interocular direction by a larger angle. Fig. 14 takes as an example the case where the observer's two eyes lie exactly in the viewing-zone distribution plane. The sub-pixel groups of the display device 10 project light through the grating device 20 to the viewing zones VZ1, VZ2, VZ3, .... The size covered by these viewing zones on their plane along the arrangement direction x is denoted D_cv. The larger the angle by which the viewing-zone arrangement direction x is designed to deviate from the interocular direction x′, that is, the smaller the angle marked in the figure, the larger the coverage size of the viewing zones along the x′ direction, which is more favourable for providing the observer with a larger viewing area along the interocular direction. Even when D_cv < D_e-e, a suitable design of this angle can still allow simultaneous monocular multi-view projection to both of the observer's eyes. At the same time, the distribution of the viewing zones also requires that the spacing of adjacent viewing zones along the x direction be smaller than the observer's pupil diameter D_p. The figure takes as an example the x direction rotated clockwise away from the x′ direction; it can equally be rotated counterclockwise away from the x′ direction. In fact, when the observer's eyes are not in the viewing-zone distribution plane, the same angle design can still enlarge the coverage of the viewing zones along the interocular direction, except that in this case the views received by each eye of the observer may be stitched views. In the above process, the minimum value of this angle must also be constrained, to prevent light information exiting through the same viewing zone from entering both eyes of the observer simultaneously.
The figures above are illustrated with a one-dimensional grating device 20 composed of grating units arranged along one dimension; it can likewise be extended to two dimensions, in which case the light-modulation function of the grating device 20 is the composition of the modulation functions of two such one-dimensional grating devices whose grating-unit arrangement directions lie along the two dimensional directions. In this case the viewing zones, each smaller than the observer's pupil diameter, are distributed along two dimensions.
The grating device 20 described above may also be a grating device composed of microstructure units, each of which is placed in one-to-one correspondence with a sub-pixel of the display device 10 and guides the light emitted by the corresponding sub-pixel to propagate toward its corresponding viewing zone. For example, a micro-grating placed on each sub-pixel of the display device 10 serves as one microstructure unit of the grating device. Since the light emitted by each individual sub-pixel can be controlled independently, the viewing zones generated by such a microstructure-unit grating device 20 splitting the light emitted by the display device 10 can be arranged along one dimension or along two dimensions.
When the number of viewing zones projected by the display device 10 through the grating device 20 is large enough to project at least two views and/or stitched views to each of the observer's two pupils, the optical structure performing display according to the monocular multi-view display method of this patent with sub-pixels as display units can serve as a binocular display optical engine. If the viewing zones projected by the display device 10 through the grating device 20 only support projecting at least two views and/or stitched views to a single pupil of the observer, the optical structure can only serve as a monocular display optical engine, for example the eyepiece of a head-mounted virtual reality (VR) / augmented reality (AR) device. In this case a projection device 40 is often needed to project an image I_10 of the display device 10. The image I_10 of the display device 10 formed by the projection device 40 then acts as the equivalent display device, and the image I_20 of the grating device 20 formed by the projection device 40 acts as the equivalent grating device. The images of the sub-pixel groups of the display device 10 formed by the projection device 40 are the equivalent sub-pixel groups, and the equivalent sub-pixel groups are stitched into the equivalent display device I_10. The image, formed by the projection device 40, of the viewing zone corresponding to each sub-pixel group acts as the equivalent viewing zone corresponding to the equivalent sub-pixel group of that sub-pixel group. A specific example is shown in Fig. 15: through the grating device 20, the 6 sub-pixel groups of the display device 10 project light information through their corresponding 6 viewing zones VZ1, VZ2, VZ3, VZ4, VZ5 and VZ6. After the modulation of the projection device 40, this is equivalent to the equivalent display device I_10 being split by the equivalent grating device I_20, with its 6 equivalent sub-pixel groups projecting 6 views through their corresponding 6 equivalent viewing zones I_VZ1, I_VZ2, I_VZ3, I_VZ4, I_VZ5 and I_VZ6 into the region where the observer's pupil 50 is located. In other words, with the image of the display device 10 (the equivalent display device) replacing the aforementioned display device 10 and the image of the grating device 20 (the equivalent grating device) replacing the aforementioned grating device 20, more than one view is projected into the observer's pupil 50 on the same principle, realizing monocular multi-view display. For the observer's two eyes, corresponding eyepiece structures are required respectively, as shown in Fig. 16. In Fig. 16, the spatial offsets σ_l and σ_r shown between each projection device and its corresponding display device are used to set the degree of overlap of the images of the display devices corresponding to the left and right eyes, for example complete overlap, or partial overlap in the case σ_l = 0, σ_r = 0.
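The position and size of the equivalent display device I_10 can be estimated, in a first approximation, by modelling the projection device 40 as an ideal thin lens; this paraxial model, the sign convention and the numerical values below are simplifying assumptions for illustration, not the disclosed optical design.

    def thin_lens_image(s_o, f):
        """Ideal thin-lens model of the projection device 40: an object (the display
        device 10) at distance s_o in front of a lens of focal length f forms an
        image at distance s_i with lateral magnification m; for s_o < f the image
        is virtual and magnified, playing the role of the equivalent display I_10."""
        s_i = 1.0 / (1.0 / f - 1.0 / s_o)   # from 1/s_o + 1/s_i = 1/f
        m = -s_i / s_o
        return s_i, m

    # Illustrative numbers: display 18 mm in front of a 20 mm focal-length eyepiece.
    print(thin_lens_image(18.0, 20.0))      # s_i = -180 mm (virtual image), m = 10x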
In the structure of Fig. 15 a relay device 60 may further be introduced to guide the light projected by the display device 10 along a deflected path to the region where the observer's pupil is located, as shown in Fig. 17. In Fig. 17 the relay device 60 is exemplified by a transflective surface that allows ambient light to enter; in this case the equivalent display device is the image I_10 of the display device 10 formed through the projection device 40 and the relay device 60, the equivalent grating device is the image I_20 of the grating device 20 formed through the projection device 40 and the relay device 60, and the equivalent viewing zones are the images I_VZ1, I_VZ2, I_VZ3, I_VZ4, I_VZ5 and I_VZ6 of the corresponding viewing zones VZ1, VZ2, VZ3, VZ4, VZ5 and VZ6 formed through the projection device 40 and the relay device 60. The transflective surface in Fig. 17 can also be replaced by a reflective surface, or even further by a concave reflective surface, combining the functions of the projection device 40 and the relay device 60 into one device. The relay device 60 can also be chosen as various other optical devices or combinations of optical devices, for example the free-form-surface combination device shown in Fig. 18. This free-form-surface combination device consists of a transmissive curved surface FS1, a reflective curved surface FS2, a transflective curved surface FS3, a transmissive curved surface FS4 and a transmissive curved surface FS5, where FS1, FS2, FS3 and FS4 together perform the function of the projection device 40, FS2 and FS3 perform the function of the relay device 60, and FS5 has a compensating modulation function that allows ambient light to enter the observer's pupil 50 unaffected by FS3 and FS4.
The relay device 60 can also be an optical waveguide device, called an optical-waveguide relay device 60. As shown in Fig. 19, the optical-waveguide relay device 60 comprises an entrance pupil 601, a coupling-in device 602, a waveguide body 603, reflective surfaces 604a and 604b, a coupling-out device 605 and an exit pupil 606. The projection device 40 comprises components 40a and 40b. The light emitted by one sub-pixel of the display device 10, for example the sub-pixel p_m, is converted into parallel light by component 40a of the projection device 40; it then enters the coupling-in device 602 through the entrance pupil 601 of the optical-waveguide relay device 60; the coupling-in device 602 guides the parallel light from the sub-pixel p_m within the waveguide body 603, where, by reflection at the reflective surfaces 604a and 604b, it propagates toward the coupling-out device 605; the coupling-out device 605 modulates the incident beam and guides it through the exit pupil 606 into component 40b of the projection device 40; component 40b of the projection device 40 guides the light projected from the sub-pixel p_m toward the region where the observer's pupil 50 is located, and modulates it so that its backward extension converges at the virtual image I_pm. This virtual image I_pm is the virtual image of the sub-pixel p_m. Likewise, I_pn corresponds to the virtual image of the sub-pixel p_n. The sub-pixel images I_pm, I_pn and so on form the image I_10 of the display device 10. Then, when the light information of at least two views and/or stitched views from the equivalent display device I_10 can enter the observer's pupil 50, display based on monocular multi-view is possible. A compensation device 80 is used to compensate the influence of component 40b of the projection device 40 on incident ambient light, and can be removed when ambient light is not needed. The projection-device component 40b in the figure can also be combined with the coupling-out device 605, for example as a holographic device or a convex reflective surface placed at the coupling-out device 605, performing the joint functions of the coupling-out device 605 and the projection-device component 40b. When the device combining the functions of the projection-device component 40b and the coupling-out device 605 is angle-selective, i.e. modulates only the beams propagated from the coupling-in device 602 and has no effect on ambient light incident from outside, the compensation device 80 can also be removed. Fig. 19 only uses one common optical waveguide device as an example; in fact, existing optical waveguide devices of various structures can all serve as the optical-waveguide relay device 60 of this patent, for example an optical waveguide device whose coupling-in device 602 is a reflective surface. To deal with the dispersion problem of the optical waveguide device, a multi-waveguide stacked structure can also be adopted, as shown in Fig. 20: three optical waveguide components are respectively responsible for the propagation and guiding of R light, G light and B light, and their coupling-in and coupling-out devices can each be designed for the wavelength of the light it is responsible for transmitting, reducing the dispersion effect.
In the structure of Fig. 19, the light emitted by each sub-pixel enters the optical-waveguide relay device 60 in a parallel state through component 40a of the projection device 40. Other situations are also possible. As shown in Fig. 21, the light passing through each point of the viewing zones formed by the display device 10 through the grating device 20 is converted into parallel light by component 40a of the projection device 40; it then enters the coupling-in device 602 through the entrance pupil 601 of the optical-waveguide relay device 60; the coupling-in device 602 guides the parallel light from the sub-pixels within the waveguide body 603, where, by reflection at the reflective surfaces 604a and 604b, it propagates toward the coupling-out device 605; the coupling-out device 605 modulates the incident beams and guides them through the exit pupil 606 into component 40b of the projection device 40; component 40b of the projection device 40 guides the light passing through each point of the viewing zones formed by the display device 10 through the grating device 20 to converge at the corresponding point, forming an image of each viewing zone, for example the image I_VZ1 of viewing zone VZ1 shown in Fig. 21. Then, when, through the images of the viewing zones, the light information of at least two sub-pixel groups or stitched sub-pixel groups of the display device 10 can enter the same pupil 50 of the observer, display based on monocular multi-view is possible.
If the relay device 60 described above has an exit-pupil-expanding function, the pupil expansion causes more than one beam projected by the same sub-pixel to enter the region of the observer's pupil 50 along different directions. In this case it is required that the different beams projected by a sub-pixel toward the region of the observer's pupil 50 at the same time point be spaced, on the plane of the observer's pupil 50, by more than the observer's pupil diameter, to ensure that they do not enter the observer's pupil 50 simultaneously. The tracking device 70 is then needed to determine the position of the observer's pupil 50 in real time, and the control device 30 determines, according to that position, the unique directed beam projected by each sub-pixel that enters the observer's pupil 50; based on the direction of that beam, the light information loaded on the sub-pixel is determined according to the method described above.
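A sketch of the beam selection under exit-pupil expansion, assuming the tracking device supplies the pupil centre and that the crossing points of the replicated beams with the pupil plane are already known; all names and numbers are hypothetical placeholders.

    import numpy as np

    def select_unique_beam(beam_hits_xy, pupil_center_xy, pupil_diameter):
        """With an exit-pupil-expanding relay, one sub-pixel reaches the pupil plane
        as several replicated beams; provided their mutual spacing exceeds the pupil
        diameter, at most one of them falls inside the tracked pupil.  Return that
        beam's index (or None), so the control device loads the sub-pixel according
        to that beam's direction only."""
        r = pupil_diameter / 2.0
        inside = [i for i, hit in enumerate(beam_hits_xy)
                  if np.hypot(hit[0] - pupil_center_xy[0], hit[1] - pupil_center_xy[1]) <= r]
        return inside[0] if inside else None

    hits = [(-6.0, 0.0), (0.5, 0.2), (7.0, -0.1)]      # hypothetical replicated-beam positions, mm
    print(select_unique_beam(hits, (0.0, 0.0), 4.0))   # -> 1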
In fact, provided that the light information projected by at least two sub-pixel groups or stitched sub-pixel groups enters the same pupil 50 of the observer, the positional relationships of the components in Figs. 19 and 21 can also be changed, new components can be introduced, and monocular multi-view display can even be realized when the light transmitted through the points of the generated viewing zones, or the light emitted by the sub-pixels, enters the optical-waveguide relay device 60 in a non-parallel state.
The core idea of the present invention is to take the sub-pixel as the basic display unit and, through the light splitting of the grating device 20, guide multiple sub-pixel groups to project at least two images to the same observer pupil 50; based on the spatial superposition of the directed beams corresponding to the at least two images, a monocularly focusable three-dimensional scene presentation is realized.
The above are only preferred embodiments of the present invention, but the design concept of the present invention is not limited thereto; any insubstantial modification made to the present invention using this concept also falls within the scope of protection of the present invention. Accordingly, all related embodiments fall within the scope of protection of the present invention.

Claims (10)

  1. A monocular multi-view display method with sub-pixels as display units, characterized by comprising the following steps:
    (i) taking each sub-pixel of a display device (10) as a basic display unit, arranging a grating device (20) in front of the display device (10) along the propagation direction of the light emitted by the sub-pixels of the display device (10), and guiding each sub-pixel of the display device (10) to project a beam toward its corresponding viewing zone;
    wherein the sub-pixels corresponding to the same viewing zone form one sub-pixel group, and different sub-pixel groups share no sub-pixel at the same time point;
    (ii) controlling, by a control device (30) connected to the display device (10), each sub-pixel group to load and display its corresponding image, wherein the image information loaded on each sub-pixel is the projection light information of the scene to be displayed, taken along the propagation direction of the beam projected by the sub-pixel and entering the region where the observer's pupil (50) is located, at the point where the straight line of that propagation direction intersects the plane in which the observer's pupil (50) lies;
    wherein the image displayed by one sub-pixel group is one view of the scene to be displayed, and the image displayed by a stitched sub-pixel group formed by complementarily stitching different parts of different sub-pixel groups is one stitched view;
    wherein the spatial distribution of the viewing zones corresponding to the sub-pixel groups of the display device (10) is arranged such that the light information of at least two views and/or stitched views in total enters the same observer pupil (50).
  2. The monocular multi-view display method with sub-pixels as display units according to claim 1, characterized in that the grating units of the grating device (20) are cylindrical lenses or slits.
  3. The monocular multi-view display method with sub-pixels as display units according to claim 1, characterized in that the grating device (20) is composed of microstructure units, each microstructure unit being placed in one-to-one correspondence with a sub-pixel of the display device (10) and used to modulate the light emitted by the corresponding sub-pixel.
  4. The monocular multi-view display method with sub-pixels as display units according to claim 2, characterized in that step (i) further comprises grouping, along the arrangement direction of the grating units, grating units spaced (T-1) grating units apart into grating-unit groups, and step (ii) further comprises the control device (30) controlling the T grating-unit groups to be opened for light transmission in time sequence within each time period composed of T adjacent time points, with only one grating-unit group opened at any one time point, where T ≥ 2.
  5. The monocular multi-view display method with sub-pixels as display units according to claim 2, characterized in that step (i) further comprises the sub-pixels of the display device (10) emitting light of M different colours, and grating units spaced (M-1) grating units apart along the arrangement direction of the grating units forming grating-unit groups, the M grating-unit groups being arranged so that each of them, in one-to-one correspondence, allows only one of the M colours of light emitted by the display device (10) to pass, where M ≥ 2.
  6. The monocular multi-view display method with sub-pixels as display units according to claim 1, characterized in that step (i) further comprises placing a projection device (40) at a position corresponding to the display device (10) to form a magnified image of the display device (10).
  7. The monocular multi-view display method with sub-pixels as display units according to claim 6, characterized in that step (i) further comprises placing a relay device (60) on the propagation path of the light projected by the display device (10), to guide the beams projected by the display device (10) into the region where the observer's pupil (50) is located.
  8. The monocular multi-view display method with sub-pixels as display units according to claim 7, characterized in that the relay device (60) is a reflective surface, a transflective surface, a free-form-surface combination, or an optical waveguide device.
  9. The monocular multi-view display method with sub-pixels as display units according to claim 1, characterized in that step (ii) further comprises connecting a tracking device (70) to the control device (30) and tracking the position of the observer's pupil (50) in real time by means of the tracking device (70).
  10. The monocular multi-view display method with sub-pixels as display units according to claim 9, characterized in that step (ii) further comprises, according to the position of the observer's pupil (50), determining, for each sub-pixel whose projected light enters the observer's pupil, that its loaded image information is the projection light information of the scene to be displayed, taken along the propagation direction of the beam projected by the sub-pixel and entering the observer's pupil (50), at the point where the straight line of that propagation direction intersects the observer's pupil (50).
PCT/CN2020/091877 2020-04-03 2020-05-22 Multiple-views-one-eye display method with sub-pixels as basic display units WO2021196370A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/580,381 US20240223744A1 (en) 2020-04-03 2020-05-22 Multiple-views-one-eye display method with sub-pixels as basic display units

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010258846.0 2020-04-03
CN202010258846.0A CN113495365B (zh) 2020-04-03 2020-04-03 Multiple-views-one-eye display method with sub-pixels as basic display units

Publications (1)

Publication Number Publication Date
WO2021196370A1 true WO2021196370A1 (zh) 2021-10-07

Family

ID=77927364

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/091877 2020-04-03 2020-05-22 Multiple-views-one-eye display method with sub-pixels as basic display units WO2021196370A1 (zh)

Country Status (3)

Country Link
US (1) US20240223744A1 (zh)
CN (1) CN113495365B (zh)
WO (1) WO2021196370A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114545652A (zh) * 2022-01-10 2022-05-27 Sun Yat-Sen University Optical display structure in which the light emitted by each pixel block is directed to a corresponding small-size aperture
CN115128811A (zh) * 2022-06-20 2022-09-30 Sun Yat-Sen University Near-eye display module based on pixel blocks with orthogonal characteristics

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114545653B (zh) * 2022-01-10 2024-02-06 Sun Yat-Sen University Optical display structure based on pupil tracking with orthogonal-characteristic aperture group pairs

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104950458A (zh) * 2014-03-27 2015-09-30 Sony Corporation Spatial image display apparatus and spatial image display method
CN105807438A (zh) * 2016-04-25 2016-07-27 Sun Yat-Sen University Time-division multiplexing module and method for increasing the number of presented viewpoints
CN107147895A (zh) * 2017-04-18 2017-09-08 Sun Yat-Sen University Video processing method for time-sequential presentation of multiple views
CN109782453A (zh) * 2018-12-04 2019-05-21 Sun Yat-Sen University Monocular multi-view three-dimensional display method
JP2019078852A (ja) * 2017-10-23 2019-05-23 Japan Display Inc. Display device and display method
CN110035274A (zh) * 2018-01-12 2019-07-19 Sun Yat-Sen University Grating-based three-dimensional display method
CN110632767A (zh) * 2019-10-30 2019-12-31 BOE Technology Group Co., Ltd. Display device and display method thereof

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2008282213A1 (en) * 2007-07-30 2009-02-05 Magnetic Media Holdings Inc. Multi-stereoscopic viewing apparatus
KR101922722B1 (ko) * 2012-08-22 2018-11-27 LG Display Co., Ltd. Stereoscopic image display device
CN104681001A (zh) * 2015-03-23 2015-06-03 BOE Technology Group Co., Ltd. Display driving method and display driving device
KR102526751B1 (ko) * 2016-01-25 2023-04-27 Samsung Electronics Co., Ltd. Directional backlight unit, three-dimensional image display apparatus, and three-dimensional image display method
CN106324847B (zh) * 2016-10-21 2018-01-23 BOE Technology Group Co., Ltd. Three-dimensional display device
CN106291958B (zh) * 2016-10-21 2021-04-23 BOE Technology Group Co., Ltd. Display device and image display method
CN106873170A (zh) * 2016-12-29 2017-06-20 Sun Yat-Sen University System and method for improving the resolution of views presented by a grating-type three-dimensional display
WO2019137272A1 (en) * 2018-01-12 2019-07-18 Sun Yat-Sen University Grating based three-dimentional display method for presenting more than one views to each pupil
CN110401829B (zh) * 2019-08-26 2022-05-13 BOE Technology Group Co., Ltd. Naked-eye 3D display device and display method thereof


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114545652A (zh) * 2022-01-10 2022-05-27 Sun Yat-Sen University Optical display structure in which the light emitted by each pixel block is directed to a corresponding small-size aperture
CN114545652B (zh) * 2022-01-10 2024-01-12 Sun Yat-Sen University Optical display structure in which the light emitted by each pixel block is directed to a corresponding small-size aperture
CN115128811A (zh) * 2022-06-20 2022-09-30 Sun Yat-Sen University Near-eye display module based on pixel blocks with orthogonal characteristics
CN115128811B (zh) * 2022-06-20 2024-01-12 Sun Yat-Sen University Near-eye display module based on pixel blocks with orthogonal characteristics

Also Published As

Publication number Publication date
CN113495365A (zh) 2021-10-12
CN113495365B (zh) 2022-09-30
US20240223744A1 (en) 2024-07-04

Similar Documents

Publication Publication Date Title
US9164351B2 (en) Freeform-prism eyepiece with illumination waveguide
CN106291958B (zh) 2021-04-23 Display device and image display method
WO2021196370A1 (zh) 2021-10-07 Monocular multi-view display method with sub-pixels as display units
CN108803023B (zh) 2021-10-22 Single-eye large-field-of-view near-eye display module, display method and head-mounted display device
WO2021062941A1 (zh) 2021-04-08 Grating-based optical waveguide light field display system
WO2019179136A1 (zh) 2019-09-26 Display device and display method
WO2017080089A1 (zh) 2017-05-18 Directional colour filter and naked-eye 3D display device
US11480796B2 (en) Three-dimensional display module using optical wave-guide for providing directional backlights
CN106773057A (zh) 2017-05-31 Monolithic holographic diffractive waveguide three-dimensional display device
CN104380157A (zh) 2015-02-25 Directional illumination waveguide arrangement
US11054661B2 (en) Near-eye display device and near-eye display method
JP2007524111A (ja) 2007-08-23 Colour projection display system
CN112882248B (zh) 2021-06-01 Display module with the beam divergence angle secondarily constrained by deflection apertures
CN112305776B (zh) 2021-02-02 Light field display system based on exit-pupil splitting and combination control of light coupled out of an optical waveguide
WO2021169065A1 (zh) 2021-09-02 Optical waveguide display module with time-sequential aperture gating and multiplexing
WO2021196369A1 (zh) 2021-10-07 Three-dimensional display method based on spatial superposition of light emitted by sub-pixels
CN108873332A (zh) 2018-11-23 Single-eye large-field-of-view near-eye display module, display method and head-mounted display device
CN112925098B (zh) 2021-06-08 Near-eye display module based on pixel-block and aperture pairs with restricted light emission
JP2006091333A (ja) 2006-04-06 Three-dimensional image display device
US20210314553A1 (en) 2021-10-07 Three-dimensional display method based on spatial superposition of sub-pixels' emitted beams
WO2021175341A1 (zh) 2021-09-10 Display module with multiple backlight sources
JPH0738825A (ja) 1995-02-07 Glasses-type display device
WO2019157986A1 (zh) 2019-08-22 Single-eye large-field-of-view near-eye display module, display method and head-mounted display device
JP2021189379A (ja) 2021-12-13 Image display device
CN114545652B (zh) 2024-01-12 Optical display structure in which the light emitted by each pixel block is directed to a corresponding small-size aperture

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20929377

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112 (1) EPC - (EPO FORM 1205A) - 07.02.2023

122 Ep: pct application non-entry in european phase

Ref document number: 20929377

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18580381

Country of ref document: US