CN113009710B - Projector for forming images on multiple planes - Google Patents
Projector for forming images on multiple planes
Info
- Publication number
- CN113009710B (application CN202011542614.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- plane
- hologram
- reconstruction
- pattern
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3102—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using two-dimensional electronic spatial light modulators
- H04N9/3105—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using two-dimensional electronic spatial light modulators for displaying all colours simultaneously, e.g. by using two or more electronic spatial light modulators
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2249—Holobject properties
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/08—Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
- G03H1/0808—Methods of numerical synthesis, e.g. coherent ray tracing [CRT], diffraction specific
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/50—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
- G02B30/52—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B27/0103—Head-up displays characterised by optical features comprising holographic elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/02—Diffusing elements; Afocal elements
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F1/00—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
- G02F1/01—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
- G02F1/13—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
- G02F1/137—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/02—Details of features involved during the holographic process; Replication of holograms without interference recording
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/0443—Digital holography, i.e. recording holograms with digital recording means
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/08—Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
- G03H1/0866—Digital holographic imaging, i.e. synthesizing holobjects from holograms
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/16—Processes or apparatus for producing holograms using Fourier transform
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2294—Addressing the hologram to an active spatial light modulator
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/26—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/26—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique
- G03H1/30—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique discrete holograms only
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B27/0103—Head-up displays characterised by optical features comprising holographic elements
- G02B2027/0105—Holograms with particular structures
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B27/0103—Head-up displays characterised by optical features comprising holographic elements
- G02B2027/0109—Head-up displays characterised by optical features comprising holographic elements comprising details concerning the making of holograms
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
- G03H2001/0088—Adaptation of holography to specific applications for video-holography, i.e. integrating hologram acquisition, transmission and display
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/02—Details of features involved during the holographic process; Replication of holograms without interference recording
- G03H2001/0208—Individual components other than the hologram
- G03H2001/0224—Active addressable light modulator, i.e. Spatial Light Modulator [SLM]
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/08—Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
- G03H1/0808—Methods of numerical synthesis, e.g. coherent ray tracing [CRT], diffraction specific
- G03H2001/0825—Numerical processing in hologram space, e.g. combination of the CGH [computer generated hologram] with a numerical optical element
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2249—Holobject properties
- G03H2001/2263—Multicoloured holobject
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2249—Holobject properties
- G03H2001/2263—Multicoloured holobject
- G03H2001/2271—RGB holobject
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2249—Holobject properties
- G03H2001/2284—Superimposing the holobject with other visual information
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/26—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique
- G03H2001/2605—Arrangement of the sub-holograms, e.g. partial overlapping
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/26—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique
- G03H2001/2625—Nature of the sub-holograms
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/26—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique
- G03H1/30—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique discrete holograms only
- G03H2001/303—Interleaved sub-holograms, e.g. three RGB sub-holograms having interleaved pixels for reconstructing coloured holobject
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/26—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique
- G03H1/30—Processes or apparatus specially adapted to produce multiple sub- holograms or to obtain images from them, e.g. multicolour technique discrete holograms only
- G03H2001/306—Tiled identical sub-holograms
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2210/00—Object characteristics
- G03H2210/10—Modulation characteristics, e.g. amplitude, phase, polarisation
- G03H2210/13—Coloured object
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2210/00—Object characteristics
- G03H2210/30—3D object
- G03H2210/33—3D/2D, i.e. the object is formed of stratified 2D planes, e.g. tomographic data
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2210/00—Object characteristics
- G03H2210/40—Synthetic representation, i.e. digital or optical object decomposition
- G03H2210/45—Representation of the decomposed object
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2225/00—Active addressable light modulator
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2225/00—Active addressable light modulator
- G03H2225/52—Reflective modulator
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Nonlinear Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Chemical & Material Sciences (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Crystallography & Structural Chemistry (AREA)
- Mathematical Physics (AREA)
- Holography (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Projection Apparatus (AREA)
Abstract
A projector, and a corresponding method, arranged to form a plurality of image reconstructions on different planes disposed on a common projection axis are disclosed. A hologram engine is arranged to determine a hologram corresponding to each image to be reconstructed and to form, for each image, a diffraction pattern comprising the corresponding hologram. A display engine is arranged to display each diffraction pattern and to receive light such that an image reconstruction corresponding to each hologram is formed on a plane of the plurality of different planes. Each image reconstruction comprises image points arranged in a pattern. The image points of a first image reconstruction formed on a first plane are interposed between the image points of a second image reconstruction formed on a second plane.
Description
Technical Field
The present disclosure relates to a projector. More particularly, the present disclosure relates to a holographic projector, a holographic projection method, and a holographic projection system. Some embodiments relate to head-up displays and head-mounted displays. Some embodiments relate to projectors and corresponding methods arranged to form multiple image reconstructions on different planes disposed on a common projection axis.
Background
Light scattered from an object contains both amplitude and phase information. This amplitude and phase information can be captured by well-known interference techniques, for example on a photosensitive plate, to form a holographic record, or "hologram", comprising interference fringes. The hologram may be reconstructed by illumination with suitable light to form a two- or three-dimensional holographic reconstruction, or playback image, representative of the original object.
Computer-generated holography may numerically simulate this interference process. A computer-generated hologram may be calculated by a technique based on a mathematical transformation such as a Fresnel or Fourier transform. These types of holograms may be referred to as Fresnel/Fourier transform holograms, or simply Fresnel/Fourier holograms. A Fourier hologram may be considered a Fourier-domain (or frequency-domain) representation of the object. A computer-generated hologram may also be calculated by coherent ray tracing or a point cloud technique, for example.
The computer-generated hologram may be encoded on a spatial light modulator arranged to modulate the amplitude and/or phase of the incident light. For example, electrically addressable liquid crystals, optically addressable liquid crystals, or micromirrors may be used to effect light modulation.
Spatial light modulators typically comprise a plurality of individually addressable pixels, which may also be referred to as cells or elements. The light modulation scheme may be binary, multi-level or continuous. Alternatively, the device may be continuous (i.e. not comprise pixels), in which case the light modulation may be continuous across the device. The spatial light modulator may be reflective, meaning that modulated light is output in reflection. The spatial light modulator may equally be transmissive, meaning that modulated light is output in transmission.
A holographic projector may be provided using the systems described herein. Such projectors have found application in head-up displays, "HUDs", and head-mounted displays, "HMDs", including near-eye devices, for example.
In devices using coherent light, such as holographic projectors, a moving diffuser may be used to improve image quality.
Disclosure of Invention
Aspects of the present disclosure are defined in the appended independent claims.
A projector is provided that is arranged to form a plurality of image reconstructions on different planes disposed on a common projection axis. The viewer may directly view the image reconstructions. For example, the image reconstructions are viewed from an eye-box in a direction substantially parallel to the common projection axis. In embodiments, the image reconstructions are at the same distance, or depth, from the viewer. In other examples, different image reconstructions are at different distances, or depths, from the viewer. The viewer may perceive the different image reconstructions as a composite or combined image. The projector comprises a hologram engine and a display engine. The hologram engine is arranged to determine a hologram corresponding to each image to be reconstructed. The hologram engine is further arranged to form a diffraction pattern comprising a hologram for an image reconstruction, or a diffraction pattern comprising the hologram for each image reconstruction. The display engine is arranged to display the diffraction pattern and to receive light such that at least one image reconstruction corresponding to each hologram is formed on a plane of the plurality of different planes. Each image reconstruction comprises image points (image spots) arranged in a pattern. The image points of the first image reconstruction formed on the first plane are interposed between the image points of the second image reconstruction formed on the second plane.
Thus, the image points of the first image reconstruction formed on the first plane are interposed between the image points of the second image reconstruction formed on the second plane in two dimensions-the (first and second) dimensions (e.g., x, y) of the first and second planes. In addition, the image points of the first image reconstruction are spatially separated from the image points of the second image reconstruction in a third dimension (e.g., z) that is the dimension of (or parallel to) the common projection axis. The reconstructed image points of the first image may be said to be spatially intermediate in three dimensions between the reconstructed image points of the second image.
Embodiments of the present disclosure may be implemented using a "direct view" configuration, i.e., an optical configuration in which a viewer looks directly at a display device (e.g., a spatial light modulator). In these embodiments, the image reconstructions are not formed on an intermediate screen (e.g., a diffuser) and the viewer receives light directly from the display device. In such cases, when the hologram is a Fourier or Fresnel hologram, it is sometimes said that the lens of the viewer's eye performs the Fourier transform of the displayed hologram. In some embodiments, the image reconstructions are formed in free space between the display device and the viewer, and the viewer is able to focus on these planes during display.
Some embodiments of the present disclosure may also be implemented using an "indirect view" configuration, in which image reconstruction is formed on one or more screens. The screen may be movable such that each image reconstruction is focused on the corresponding screen. Alternatively, a plurality of switchable screens may be used, as for example described in WO/2015/173556, wherein a stack of parallel switchable screens is operated in transmissive or diffuse mode, respectively, using liquid crystals. The image reconstruction is only visible when the corresponding screen is operated in diffuse mode. In some embodiments, only one screen is operated at a time in diffuse mode. In other embodiments, more than one screen is operated simultaneously in diffuse mode. Those skilled in the art will appreciate that some of the embodiments described herein are not suitable for such a scenario.
Regardless of the viewing configuration, the viewer can see the superimposed content of the plurality of different image reconstructions. Although different image reconstructions are formed on different display planes, the distance between the display planes is small enough that the viewer cannot discern that the reconstructions are on different planes on a common projection axis. In an embodiment, a viewer perceives a single composite image formed by image reconstruction of at least first and second components (or portions) of the same target image. In some embodiments, the holographic projector forms multiple holographic reconstructions along its projection axis on different playback planes. Thus, the viewer views the image reconstruction (actually lying on different planes) as a "whole" image. The different planes are close enough for the viewer to see a composite image comprising a plurality of two-dimensional image reconstructions that appear to have been formed on the same plane. It can be said that the different image reconstructions appear to be coplanar. In some embodiments, the configuration is such that the viewer is not provided with any visual cues that alter the co-planar feel. In some embodiments, at least one of the partial images is processed prior to hologram calculation to ensure that there are no different depth cues. For example, a first image reconstruction on a first plane may be scaled to ensure that it appears coplanar with a second image reconstruction that is actually formed on a second plane. This type of image processing is within the ability of those skilled in the art of image display, and thus a detailed description of image scaling is not provided herein.
Each image reconstruction includes a plurality of image points formed on a respective plane. However, viewing difficulties may occur if two or more image reconstructions formed on different planes include image points at equivalent spatial locations on their respective two-dimensional planes. Equivalent spatial locations can be said to have substantially the same coordinates (e.g., x, y) in a two-dimensional coordinate system whose origin is defined by the position of the common projection axis (e.g., at the center of each plane). In particular, image points formed at the same two-dimensional spatial location but on different planes may not be fully visible because they coincide, or fully overlap, in the composite image. This is a result of the planes being arranged in parallel along the common projection axis.
The inventors herein disclose an approach that involves interposing the image points of different image reconstructions formed on different planes. In particular, the image points of the first image reconstruction formed on the first plane are interposed between, and thus spatially separated from, the image points of the second image reconstruction formed on the second plane. Thus, image points of different image reconstructions are not formed at equivalent two-dimensional spatial positions, and so do not coincide or overlap in the composite image seen by the viewer. It can be said that the image points of the different image reconstructions are spatially separated in the first and second dimensions (x, y) of their respective parallel image planes, and thus in the plane of the composite two-dimensional image. Furthermore, the image points of the different image reconstructions are spatially separated in the third dimension (z) of the common projection axis. The viewer can therefore see the composite image (which appears to be formed on a single plane but includes different image reconstructions formed on different planes) more clearly. The present disclosure thereby addresses the technical problem of image pixel crosstalk, as described further below.
The hologram engine is arranged to form a diffraction pattern comprising holograms corresponding to a plurality of image reconstructions, wherein the diffraction pattern determines a plane of the image reconstruction. In some embodiments, the hologram engine forms a diffraction pattern comprising a lens function having a focal length. The focal length of the lens function may affect the position of the image reconstruction plane on the common projection axis (e.g., the distance from the spatial light modulator along the axis of the projection path, referred to herein as the "propagation distance").
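By way of illustration only, a lens function may be realised in software as a quadratic phase profile that is added to the hologram; its focal length then influences where along the common projection axis the corresponding reconstruction comes to focus. A minimal sketch follows, in which the wavelength, pixel pitch and the choice of a simple thin-lens phase profile are assumptions made for the example and are not details of this disclosure:

```python
import numpy as np

def lens_phase(shape, pixel_pitch, wavelength, focal_length):
    """Quadratic (thin-lens) phase profile in radians; the focal length sets the focus plane."""
    ny, nx = shape
    y = (np.arange(ny) - ny / 2) * pixel_pitch
    x = (np.arange(nx) - nx / 2) * pixel_pitch
    X, Y = np.meshgrid(x, y)
    k = 2 * np.pi / wavelength
    return -k * (X ** 2 + Y ** 2) / (2 * focal_length)

# Two diffraction patterns whose reconstructions focus on different planes (illustrative only):
# pattern_1 = (hologram_phase_1 + lens_phase(shape, dx, wl, f1)) % (2 * np.pi)
# pattern_2 = (hologram_phase_2 + lens_phase(shape, dx, wl, f2)) % (2 * np.pi)
```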
The first image reconstruction and the second image reconstruction are formed substantially simultaneously. In some embodiments, the first image reconstruction and the second image reconstruction are formed simultaneously. In other examples, the first image reconstruction and the second image reconstruction are formed in rapid succession with one another.
In an embodiment, the first image reconstructed image points are arranged in a first pattern and the second image reconstructed image points are arranged in a second pattern, wherein the first pattern is opposite to the second pattern. For example, the first pattern may be a first checkerboard pattern and the second pattern may be a second checkerboard pattern opposite the first checkerboard pattern. Thus, the image points of the first image reconstruction fill in gaps between the image points of the second image reconstruction and vice versa.
The projector may be arranged to form the image reconstruction sequence on the first plane by alternating between image points arranged in the first pattern and image points arranged in the second pattern. The projector may be further arranged to form the image reconstruction sequence on the second plane by alternating between image points arranged in the second pattern and image points arranged in the first pattern in synchronization with the image reconstruction sequence formed on the first plane.
In some embodiments, the projector may be arranged to form a monochromatic image reconstruction to display a monochromatic image. In this case, the first image reconstruction and the second image reconstruction are monochromatic image reconstructions having the same color.
In other embodiments, the projector may be arranged to form image reconstructions of different colors in order to display a multi-color image. In some embodiments, each image reconstruction in the first/second plane is a single color, but successive image reconstructions in the sequence of image reconstructions formed in the first/second plane are different colors. In other examples, the first image reconstruction formed on the first plane and the second image reconstruction formed substantially simultaneously on the second plane may have different colors. In these examples, at any point in time, the color of the image point formed on the first plane is different from the color of the image point formed on the second plane.
In an embodiment, the first image reconstruction is a first holographic reconstruction and the second image reconstruction is a second holographic reconstruction. Thus, the first plane is a first playback plane and the second plane is a second playback plane.
In some embodiments, the projector further comprises an image processing engine. The image processing engine is arranged to receive a plurality of source images. The image processing engine is further arranged to process each source image according to a sampling scheme selected from a plurality of different sampling schemes so as to reduce the number of image pixels formed on each playback plane. The pattern of image points of each image reconstruction formed by the display engine is determined by the corresponding sampling scheme. The first sampling scheme associated with the first image reconstruction is different from the second sampling scheme associated with the second image reconstruction such that image points of the first image formed on the first plane are interposed between image points of the second image formed on the second plane.
By sampling each source image according to a sampling scheme, holograms can be calculated to form an image reconstruction comprising groups or subsets of image points of the image arranged in a pattern. Therefore, the interval between image points in the image point pattern formed on each playback plane can be increased.
The technical advancement according to the present disclosure will be understood from the following. Notably, some embodiments of the present disclosure relate to interlacing in space/depth in order to address the problem of pixel crosstalk in holographic images. It has been found that the simultaneous formation of adjacent closely spaced image points (i.e., in an array of image points) can cause pixel crosstalk or inter-pixel interference, thereby degrading image quality. According to embodiments of the present disclosure, pixel crosstalk is reduced or even eliminated by increasing the spacing between image points on the playback field. However, the perceived resolution of the holographic image is maintained by inserting image points onto the second holographic playback plane. Image pixels on the first plane do not interfere with image points on the second plane. The first holographic reconstruction on the first plane comprises image points arranged in a first checkerboard pattern and the second holographic reconstruction on the second plane comprises image points arranged in a second checkerboard pattern. The first checkerboard pattern is complementary to the second checkerboard pattern. The second holographically reconstructed image points are perceived between the first holographically reconstructed image points so that there is no significant loss of resolution. However, inter-pixel crosstalk in the holographic image is reduced.
A first checkerboard and a second checkerboard of a source image for projection are formed. For example, image pixels of the source image may be sampled according to a sampling scheme comprising a first checkerboard pattern and a second checkerboard pattern to derive the first and second checkerboards of the source image. A single hologram may be calculated which forms an image reconstruction of both the first and the second checkerboard. The single hologram may be a biplane hologram, such as a Fresnel hologram, or a biplane hologram calculated using a biplane Gerchberg-Saxton algorithm. Alternatively, a first hologram channel may be used for a first image reconstruction forming the first checkerboard on a first plane and a second hologram channel may be used for a second image reconstruction forming the second checkerboard on a second plane. The first holographic projection channel and the second holographic projection channel may be substantially collinear. The first image reconstruction and the second image reconstruction are formed on the same projection axis. The image points of the first and second reconstructions may be of the same color. Each reconstruction may be made up of red, green and blue image pixels formed by respective red, green and blue holographic projection channels.
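As an illustration only of the single-hologram option, one plausible numerical approach (not necessarily the method of this disclosure) is to back-propagate each checkerboard sub-image from its target plane to the hologram plane, for example with the angular spectrum method, sum the two complex fields and encode the result as a phase-only pattern. The propagation distances z1 and z2, the pixel pitch dx, the wavelength wl, the random starting phases and the sub_image variables are all assumptions made for the sketch:

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex field by a distance z using the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))  # evanescent terms suppressed
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

# Back-propagate each checkerboard sub-image from its plane to the hologram plane and sum:
field_1 = sub_image_1 * np.exp(1j * 2 * np.pi * np.random.rand(*sub_image_1.shape))
field_2 = sub_image_2 * np.exp(1j * 2 * np.pi * np.random.rand(*sub_image_2.shape))
hologram_field = angular_spectrum(field_1, wl, dx, -z1) + angular_spectrum(field_2, wl, dx, -z2)
phase_hologram = np.angle(hologram_field)  # one simple phase-only encoding of the biplane hologram
```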
Notably, in some embodiments, multiple holographic playback fields (on different playback planes) are formed simultaneously using the same hologram (e.g., a Fresnel hologram), or using two holographic projection channels for each color, or by two different holograms displayed simultaneously on each monochromatic spatial light modulator (e.g., two different Fourier holograms combined with different lens functions).
The image processing engine may be arranged to magnify the source image before processing it according to the sampling scheme. The image processing engine may be arranged to change the size of the (optionally enlarged) source image before forming one of the first and second checkerboards, such that the first and second image reconstructions appear to be formed on the same plane (i.e. at the same depth from the viewer), even though they are formed on different planes.
In some embodiments, the first sampling scheme comprises setting alternate pixel values to zero according to a first checkerboard pattern, and the second sampling scheme comprises setting alternate pixel values to zero according to a second checkerboard pattern, wherein the first checkerboard pattern is opposite (complementary) to the second checkerboard pattern. Thus, under the first and second sampling schemes, every other pixel value in each row/column is sampled and the remaining pixel values are set to zero.
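A minimal sketch of such a pair of complementary sampling schemes follows, in which the source image is simply a numpy array and the variable names (including source_image) are illustrative only:

```python
import numpy as np

def checkerboard_masks(shape):
    """Return two complementary checkerboard masks that together cover every pixel exactly once."""
    rows, cols = np.indices(shape)
    first = (rows + cols) % 2 == 0       # first checkerboard pattern
    return first, ~first                 # second pattern is the complement of the first

source = np.asarray(source_image, dtype=float)   # source image (optionally enlarged beforehand)
mask_1, mask_2 = checkerboard_masks(source.shape)

sub_image_1 = np.where(mask_1, source, 0.0)      # first sampling scheme: alternate pixels zeroed
sub_image_2 = np.where(mask_2, source, 0.0)      # second, complementary sampling scheme
# sub_image_1 + sub_image_2 reproduces the source, so the two reconstructions together carry every pixel.
```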
In some embodiments, each sampling scheme includes averaging pixel values within a sampling window at a plurality of sampling window positions. For example, the average may be a weighted average based on the locations of the pixels within the sampling window. For example, the weighting given to each pixel value in the respective sampling window decreases with distance from the center of the sampling window. In some examples, the first sampling scheme includes a first set of sampling window positions and the second sampling scheme includes a second set of sampling window positions, wherein the first set of sampling window positions are diagonally offset from the second set of sampling window positions. The first set of sampling window positions may partially overlap the second set of sampling window positions.
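For illustration, one way such window averaging could look is sketched below: 3x3 sampling windows centred on a stride-2 grid, with a centre-weighted kernel, and with the second scheme's window positions offset by one pixel diagonally relative to the first so that the two sets of windows partially overlap. The window size, weights and stride are assumptions chosen only to make the idea concrete, and the source array is the illustrative one from the previous sketch:

```python
import numpy as np

def window_sample(image, offset, stride=2):
    """Weighted-average 3x3 windows centred on a stride-2 grid of sampling window positions."""
    weights = np.array([[1.0, 2.0, 1.0],
                        [2.0, 4.0, 2.0],
                        [1.0, 2.0, 1.0]])
    weights /= weights.sum()                 # weighting decreases away from the window centre
    padded = np.pad(image, 1, mode="edge")
    samples = {}
    h, w = image.shape
    for r in range(offset, h, stride):
        for c in range(offset, w, stride):
            window = padded[r:r + 3, c:c + 3]        # 3x3 window centred on pixel (r, c)
            samples[(r, c)] = float((window * weights).sum())
    return samples                                   # {image-point position: sampled value}

points_plane_1 = window_sample(source, offset=0)     # first set of sampling window positions
points_plane_2 = window_sample(source, offset=1)     # diagonally offset, partially overlapping second set
```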
In other embodiments, the hologram engine is arranged to form a diffraction pattern having a phase ramp function. The first phase ramp function associated with the first image reconstruction has a ramp gradient that differs from a ramp gradient of the second phase ramp function associated with the second image reconstruction to provide a displacement between the image points of the first image reconstruction relative to the image points of the second image reconstruction. Thus, the different first and second ramp gradients enable spatial shifting of the image point of the first image reconstruction relative to the image point of the second image reconstruction. In such an embodiment, sub-sampling of the source image may not be required. In an example, a difference between the slope gradient of the first phase ramp function and the slope gradient of the second phase ramp function is such that image points of the first image formed on the first plane are interposed between image points of the second image formed on the second plane.
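For illustration, a linear phase ramp (a software grating) added to a diffraction pattern translates the corresponding replay field laterally; giving the two diffraction patterns slightly different ramp gradients therefore displaces the first set of image points relative to the second, for example by half the image-point pitch. A minimal sketch, in which the gradient values are assumptions:

```python
import numpy as np

def phase_ramp(shape, gradient_x, gradient_y):
    """Linear phase ramp (software grating); the gradients, in radians per pixel, set the lateral shift."""
    ny, nx = shape
    X, Y = np.meshgrid(np.arange(nx), np.arange(ny))
    return gradient_x * X + gradient_y * Y

# Slightly different ramp gradients displace the first reconstruction relative to the second:
# pattern_1 = (hologram_phase_1 + phase_ramp(shape, gx, gy)) % (2 * np.pi)
# pattern_2 = (hologram_phase_2 + phase_ramp(shape, gx + dgx, gy + dgy)) % (2 * np.pi)
```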
Head-up displays including the projectors disclosed herein are also provided.
Methods of holographic projection using the projectors disclosed herein are also provided.
The term "hologram" is used to refer to a record containing amplitude information or phase information about an object, or some combination thereof. The term "holographic reconstruction" is used to refer to the optical reconstruction of an object formed by illuminating a hologram. The system disclosed herein is described as a "holographic projector" because the holographic reconstruction is a real image and is spatially separated from the hologram. The term "playback field" is used to refer to a 2D region within which a holographic reconstruction is formed and which is fully focused. If the hologram is displayed on a spatial light modulator comprising pixels, the playback field will repeat in the form of a plurality of diffraction orders, each diffraction order being a copy of the zero order playback field. The zero order playback field generally corresponds to the preferred or primary playback field because it is the brightest playback field. The term "playback field" shall be taken to mean a zero-order playback field unless explicitly stated otherwise. The term "playback plane" is used to refer to a plane in space that contains all playback fields. The terms "image", "playback image" and "image region" refer to the region of the playback field illuminated by the light of the holographic reconstruction. In some embodiments, an "image" may include discrete points, which may be referred to as "image points," or, for convenience only, as "image pixels. The image spot is a light spot formed in the playback field. Thus, the terms "image point" and "image pixel" refer to discrete points or light points forming a holographic reconstruction in a playback field formed on a playback plane. As will be appreciated by those skilled in the art, each image point or image pixel represents the smallest resolvable element of the holographic reconstruction formed on the replay plane.
The terms "encoding", "writing" or "addressing" are used to describe the process of providing a plurality of pixels of the SLM with a corresponding plurality of control values that respectively determine the modulation level of each pixel. The pixels of the SLM are said to be configured to "display" the light modulation profile in response to receiving a plurality of control values. Thus, the SLM can be said to "display" a hologram, and the hologram can be considered as an array of light modulating values or levels.
It has been found that a holographic reconstruction of acceptable quality can be formed from a "hologram" that contains only phase information related to the Fourier transform of the original object. Such a holographic recording may be referred to as a phase-only hologram. Some embodiments relate to phase-only holograms, but the disclosure is equally applicable to amplitude-only holograms.
The present disclosure is equally applicable to the formation of holographic reconstructions using amplitude and phase information related to the Fourier transform of the original object. In some embodiments, this is achieved by complex modulation using a so-called fully complex hologram containing both amplitude and phase information about the original object. Such a hologram may be referred to as a fully complex hologram because the value (gray level) assigned to each pixel of the hologram has an amplitude and a phase component. The value (gray level) assigned to each pixel can be represented as a complex number having amplitude and phase components. In some embodiments, a fully complex computer-generated hologram is calculated.
Reference may be made to the phase value, phase component, phase information or simply the phase of a pixel of a computer generated hologram or spatial light modulator, as shorthand for "phase delay". That is, any of the phase values described are actually numbers (e.g., in the range of 0 to 2π) representing the amount of phase delay provided by that pixel. For example, a pixel described as having a pi/2 phase value of a spatial light modulator will delay the phase of the received light by pi/2 radians. In some embodiments, each pixel of the spatial light modulator may operate in one of a plurality of possible modulation values (e.g., phase delay values). The term "gray scale" may be used to refer to a number of available modulation levels. For example, the term "gray level" may be used for convenience to refer to a plurality of available phase levels in a phase-only modulator, even though different phase levels do not provide different shades of gray. For convenience, the term "gray scale" may also be used to refer to a plurality of available complex modulation levels in a complex modulator.
Thus, the hologram comprises an array of gray levels, i.e. an array of light modulation values, such as an array of phase-delay values or complex modulation values. The hologram is also considered a diffraction pattern because it is a pattern that causes diffraction when displayed on a spatial light modulator and illuminated with light having a wavelength comparable to, and generally smaller than, the pixel pitch of the spatial light modulator. Reference is made herein to combining the hologram with other diffraction patterns, such as diffraction patterns functioning as a lens or a grating. For example, a diffraction pattern functioning as a grating may be combined with a hologram to translate the playback field in the playback plane, or a diffraction pattern functioning as a lens may be combined with a hologram to focus the holographic reconstruction on a playback plane in the near field.
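Putting these elements together, a diffraction pattern for display might be formed by summing the hologram with the lens and grating phase functions, wrapping the result modulo 2π and quantising it to the gray levels available on the device. A sketch, assuming a phase-only modulator with 256 gray levels and re-using the illustrative lens_phase and phase_ramp helpers sketched earlier:

```python
import numpy as np

def encode_diffraction_pattern(hologram_phase, lens, grating, levels=256):
    """Combine hologram, lens and grating phases, wrap to [0, 2*pi) and quantise to gray levels."""
    total = (hologram_phase + lens + grating) % (2 * np.pi)
    gray = np.round(total / (2 * np.pi) * levels).astype(int) % levels
    return gray                        # integer gray levels written to the SLM pixels
```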
In the following description, the term "image" is used herein to refer to a desired image (also referred to as a "target image," which may be derived from a "source image"). The term "image reconstruction" refers to reconstructing a desired image or a target image on a plane. Thus, in the context of a holographic display, the term "image reconstruction" refers to "holographic reconstruction". The term "image" may also be used herein as shorthand for "image reconstruction" when referring to an image formed on a plane. As previously mentioned, an "image reconstruction" may include a pattern of discrete points or spots called "image points" or "image pixels".
The term "display event" generally refers to the time interval during which a single image reconstruction is formed on a display plane. In the context of the holographic display described herein, this corresponds to a time interval in which a single hologram written to the spatial light modulator is displayed. The display event may correspond to the display of a frame or sub-frame of a video image. In the following description, the term "display event" is also used to refer to a time interval during which multiple image reconstructions are formed substantially simultaneously in multiple planes.
Although various embodiments and groups of embodiments may be separately disclosed in the detailed description that follows, any feature of any embodiment or group of embodiments may be combined with any other feature or combination of features of any embodiment or group of embodiments. That is, all possible combinations and permutations of features disclosed in the present disclosure are contemplated.
Drawings
Specific embodiments are described by way of example only with reference to the following drawings:
FIG. 1 is a schematic diagram showing a reflective SLM producing a holographic reconstruction on a screen;
FIG. 2A shows a first iteration of an example Gerchberg-Saxton type algorithm;
FIG. 2B illustrates the second and subsequent iterations of an example Gerchberg-Saxton type algorithm;
FIG. 2C shows alternative second and subsequent iterations of the example Gerchberg-Saxton type algorithm;
FIG. 3 is a schematic diagram of a reflective LCOS SLM;
FIG. 4A illustrates an example of multiple image reconstructions formed on respective planes disposed along a common projection axis relative to a viewer, and FIG. 4B illustrates a composite image including the multiple image reconstructions of FIG. 4A as seen by the viewer;
FIG. 5A shows a first pattern of image points of a first image reconstruction, and FIG. 5B shows a second pattern of image points of a second image reconstruction, in accordance with an embodiment;
FIG. 6 illustrates the first image reconstruction of FIG. 5A formed on a first plane and the second image reconstruction of FIG. 5B formed on a second plane;
FIG. 7 shows a composite image comprising the first and second image reconstructions of FIG. 6 as seen by a viewer;
FIGS. 8A-8C illustrate an example sequence of first and second image reconstructions, including changing patterns of image points formed on respective first and second planes, according to an embodiment; and
Fig. 9 is a schematic diagram illustrating a holographic projection system according to an embodiment.
The same reference numbers will be used throughout the drawings to refer to the same or like parts.
Detailed Description
The invention is not limited to the embodiments described below, but extends to the full scope of the appended claims. That is, the present invention may be embodied in different forms and should not be construed as limited to the described embodiments set forth herein for purposes of illustration.
Unless otherwise indicated, singular terms may include the plural.
Structures described as being formed on/under another structure should be interpreted as including a case where the structures are in contact with each other and a case where a third structure is provided therebetween.
In describing temporal relationships, such as when the temporal sequence of events is described as "after," "subsequent," "next," "previous," etc., the present disclosure should be considered to include continuous and discontinuous events unless specified otherwise. For example, the description should be considered to include a discontinuous condition unless words such as "only," "immediately adjacent," or "directly" are used.
Although the terms "first," "second," etc. may be used herein to describe various elements, these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the appended claims.
Features of different embodiments may be partially or wholly coupled to or combined with one another and interoperate differently from one another. Some embodiments may be performed independently of each other or may be performed together in interdependent relationship.
Optical arrangement
Fig. 1 shows an arrangement in which a computer-generated hologram is encoded on a single spatial light modulator. The computer-generated hologram is a Fourier transform of the object intended for reconstruction. The hologram can thus be said to be a Fourier-domain, frequency-domain or spectral-domain representation of the object. In this arrangement, the spatial light modulator is a reflective liquid crystal on silicon "LCOS" device. The hologram is encoded on the spatial light modulator and a holographic reconstruction is formed at the playback field (e.g. on a light-receiving surface such as a screen or diffuser). In an alternative arrangement in accordance with the present disclosure, the viewer looks directly at the spatial light modulator displaying the hologram and the light-receiving surface is omitted. Instead, the holographic reconstruction is formed at a location in free space, or on the retina of the viewer's eye.
A light source 110, such as a laser or laser diode, is provided to illuminate the SLM 140 via a collimating lens 111. The collimating lens causes a substantially planar wavefront of light to be incident on the SLM. In Fig. 1, the wavefront is incident off-normal (e.g., at two or three degrees away from being truly orthogonal to the plane of the transparent layer). However, in other arrangements, the substantially planar wavefront is provided at normal incidence and a beam splitter arrangement is used to separate the input and output optical paths. In the arrangement shown in Fig. 1, light from the light source is reflected by the mirrored back surface of the SLM and interacts with the light-modulating layer to form an exit wavefront 112. The exit wavefront 112 is applied to optics including a Fourier transform lens 120, having its focus at a screen 125. More specifically, the Fourier transform lens 120 receives the modulated light beam from the SLM 140 and performs a frequency-space transformation to produce a holographic reconstruction at the screen 125.
Notably, in this type of hologram, each pixel of the hologram contributes to the overall reconstruction. There is no one-to-one correlation between a particular point (or image pixel) on the playback field and a particular light modulation element (or hologram pixel). In other words, the modulated light leaving the light modulation layer is distributed over the entire playback field.
In these arrangements, the position of the holographic reconstruction in space is determined by the dioptric (focusing) power of the Fourier transform lens. In the arrangement shown in fig. 1, the Fourier transform lens is a physical lens. That is, the Fourier transform lens is an optical Fourier transform lens, and the Fourier transform is performed optically. Any lens can act as a Fourier transform lens, but the performance of the lens will limit the accuracy of the Fourier transform it performs. The skilled person understands how to use a lens to perform an optical Fourier transform.
Hologram calculation
In some embodiments, the computer generated hologram is a fourier transform hologram, or simply a fourier hologram or a fourier-based hologram, in which the image is reconstructed in the far field by utilizing the fourier transform characteristics of a positive lens. The fourier hologram is calculated by fourier transforming the desired light field in the playback plane back to the lens plane. The fourier transform may be used to calculate a computer generated fourier hologram. In an embodiment, the hologram engine is used to calculate a computer generated hologram using any suitable technique or algorithm. An example of an algorithm for computing a computer-generated hologram is described below.
An algorithm, such as the Gerchberg-Saxton algorithm, may be used to calculate the fourier transform hologram. Furthermore, the Gerchberg-Saxton algorithm may be used to calculate holograms in the Fourier domain (i.e., fourier transformed holograms) from pure amplitude information in the spatial domain (e.g., photographs). Phase information about the object is effectively "retrieved" from the pure amplitude information in the spatial domain. In some embodiments, computer-generated holograms are calculated from pure amplitude information using the Gerchberg-Saxton algorithm or variants thereof.
The Gerchberg-Saxton algorithm considers the situation in which the intensity cross-sections of a light beam, I_A(x, y) and I_B(x, y), in planes A and B respectively, are known, and I_A(x, y) and I_B(x, y) are related by a single Fourier transform. For the given intensity cross-sections, approximations to the phase distributions in planes A and B, ψ_A(x, y) and ψ_B(x, y) respectively, are found. The Gerchberg-Saxton algorithm finds a solution to this problem by following an iterative process. More specifically, the Gerchberg-Saxton algorithm iteratively applies spatial and spectral constraints while repeatedly transferring a data set (amplitude and phase), representative of I_A(x, y) and I_B(x, y), between the spatial domain and the Fourier (spectral or frequency) domain. A corresponding computer-generated hologram in the spectral domain is obtained by at least one iteration of the algorithm. The algorithm is convergent and arranged to produce a hologram representing the input image. The hologram may be a pure amplitude hologram, a pure phase hologram, or a full complex hologram.
In some embodiments, the phase-only hologram is calculated using an algorithm based on the Gerchberg-Saxton algorithm, such as described in UK patents 2498170 or 2501112, the entire contents of which are incorporated herein by reference. However, the embodiments disclosed herein describe calculating phase-only holograms by way of example only. In these embodiments, the Gerchberg-Saxton algorithm retrieves the phase information ψ[u, v] of the Fourier transform of a data set which gives rise to known amplitude information T[x, y], where the amplitude information T[x, y] represents a target image (e.g. a photograph). Since amplitude and phase are inherently combined in the Fourier transform, the transformed amplitude and phase contain useful information about the accuracy of the calculated data set. Thus, the algorithm may be used iteratively with feedback of both amplitude and phase information. However, in these embodiments, only the phase information ψ[u, v] is used as the hologram to form a holographic representation of the target image at the image plane. The hologram is a data set (e.g. a 2D array) of phase values.
In other embodiments, an algorithm based on the Gerchberg-Saxton algorithm is used to calculate the full complex hologram. A full complex hologram is a hologram having an amplitude component and a phase component. A hologram is a data set (e.g., a 2D array) comprising an array of complex data values, wherein each complex data value comprises an amplitude component and a phase component.
In some embodiments, the algorithm processes complex data and the fourier transform is a complex fourier transform. Complex data may be considered to include (i) a real component and an imaginary component, or (ii) an amplitude component and a phase component. In some embodiments, the two components of complex data are processed differently at various stages of the algorithm.
Fig. 2A illustrates a first iteration of an algorithm for computing a phase-only hologram, in accordance with some embodiments. The input to the algorithm is an input image 210 comprising a 2D array of pixels or data values, wherein each pixel or data value is an amplitude value. That is, each pixel or data value of the input image 210 has no phase component. Thus, the input image 210 may be considered a pure amplitude or intensity distribution. An example of such an input image 210 is a photograph or one frame of a video comprising a temporal sequence of frames. The first iteration of the algorithm begins with a data forming step 202A that includes assigning a random phase value to each pixel of the input image using a random phase distribution (or random phase seed) 230 to form a starting complex data set, wherein each data element of the data set includes an amplitude and a phase. It can be said that the initial complex data set represents the input image in the spatial domain.
The first processing block 250 receives the initial complex data set and performs a complex fourier transform to form a fourier transformed complex data set. The second processing block 253 receives the fourier transformed complex data set and outputs a hologram 280A. In some embodiments, hologram 280A is a phase-only hologram. In these embodiments, the second processing block 253 quantizes each phase value and sets each amplitude value to 1 to form the hologram 280A. Each phase value is quantized according to a phase level that can be represented on a pixel of a spatial light modulator that will be used to "display" a phase-only hologram. For example, if each pixel of the spatial light modulator provides 256 different phase levels, each phase value of the hologram is quantized to one of the 256 possible phase levels. Hologram 280A is a phase-only fourier hologram representing an input image. In other embodiments, hologram 280A is a full complex hologram that includes an array of complex data values (each including an amplitude component and a phase component) derived from a received fourier transformed complex data set. In some embodiments, the second processing block 253 constrains each complex data value to one of a plurality of allowable complex modulation levels to form the hologram 280A. The constraining step may include setting each complex data value to a nearest allowable complex modulation level in the complex plane. The hologram 280A may be said to represent an input image in the spectral or fourier or frequency domain. In some embodiments, the algorithm stops at this point.
However, in other embodiments, the algorithm continues as indicated by the dashed arrow in FIG. 2A. In other words, the steps following the dashed arrows in fig. 2A are optional (i.e., not essential to all embodiments).
The third processing block 256 receives the modified complex data set from the second processing block 253 and performs an inverse fourier transform to form an inverse fourier transformed complex data set. The complex data set of the inverse fourier transform can be said to represent the input image in the spatial domain.
The fourth processing block 259 receives the inverse Fourier transformed complex data set and extracts the distribution of amplitude values 211A and the distribution of phase values 213A. Optionally, the fourth processing block 259 evaluates the distribution of amplitude values 211A. In particular, the fourth processing block 259 may compare the distribution of amplitude values 211A of the inverse Fourier transformed complex data set with the input image 210, the input image 210 itself of course being a distribution of amplitude values. If the difference between the distribution of amplitude values 211A and the input image 210 is sufficiently small, the fourth processing block 259 may determine that the hologram 280A is acceptable. That is, if the difference between the distribution of amplitude values 211A and the input image 210 is sufficiently small, the fourth processing block 259 may determine that the hologram 280A is a sufficiently accurate representation of the input image 210. In some embodiments, the distribution of phase values 213A of the inverse Fourier transformed complex data set is ignored for the purposes of the comparison. It will be appreciated that any number of different methods may be employed to compare the distribution of amplitude values 211A with the input image 210, and that the present disclosure is not limited to any particular method. In some embodiments, the mean square error is calculated, and if the mean square error is less than a threshold, the hologram 280A is considered acceptable. If the fourth processing block 259 determines that the hologram 280A is not acceptable, further iterations of the algorithm may be performed. However, this comparison step is not required, and in other embodiments the number of iterations of the algorithm performed is predetermined or preset or user-defined.
Fig. 2B shows a second iteration of the algorithm and any further iterations of the algorithm. The distribution of phase values 213A of the previous iteration is fed back by the processing block of the algorithm. The distribution of amplitude values 211A is rejected, giving priority to the distribution of amplitude values of the input image 210. In a first iteration, the data forming step 202A forms a first complex data set by combining a distribution of amplitude values of the input image 210 with a random phase distribution 230. However, in the second and subsequent iterations, the data forming step 202B includes forming a complex data set by combining (i) the distribution of phase values 213A from the previous iteration of the algorithm with (ii) the distribution of amplitude values of the input image 210.
The complex data set formed by the data forming step 202B of fig. 2B is then processed in the same manner as described with reference to fig. 2A to form a second iteration hologram 280B. Therefore, a description of the process is not repeated here. When the second iteration hologram 280B has been calculated, the algorithm may stop. However, any number of further iterations of the algorithm may be performed. It will be appreciated that the third processing block 256 is only required when the fourth processing block 259 is required or when further iterations are required. The output hologram 280B generally becomes better with each iteration. In practice, however, a point is typically reached where no measurable improvement is observed, or the positive benefits of performing further iterations are offset by the negative effects of the additional processing time. Thus, the algorithm is described as iterative and convergent.
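By way of illustration only, the following minimal Python (NumPy) sketch shows the structure of such an iterative loop for retrieving a phase-only Fourier hologram. It is not the hologram engine of the disclosed projector; the image size, number of iterations, random seed and 256 phase levels are assumed example values, and the function names are illustrative.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=8, phase_levels=256):
    """Iteratively retrieve a phase-only Fourier hologram of a target amplitude image.

    Mirrors the structure of figs. 2A/2B: start from a random phase seed, Fourier
    transform, keep the (quantized) phase as the hologram, inverse transform, then
    re-impose the target amplitude and repeat.
    """
    rng = np.random.default_rng(0)
    phase = rng.uniform(-np.pi, np.pi, target_amplitude.shape)  # random phase seed (cf. 230)

    for _ in range(iterations):
        # Data forming step: target amplitude combined with the current phase estimate
        field = target_amplitude * np.exp(1j * phase)

        # First processing block: forward Fourier transform to the hologram plane
        fourier = np.fft.fft2(field)

        # Second processing block: quantize the phase and set the amplitude to unity
        hologram_phase = np.angle(fourier)
        step = 2 * np.pi / phase_levels
        hologram_phase = np.round(hologram_phase / step) * step

        # Third processing block: inverse Fourier transform back to the image plane
        replay = np.fft.ifft2(np.exp(1j * hologram_phase))

        # Fourth processing block: keep the phase, reject the amplitude for the next iteration
        phase = np.angle(replay)

    return hologram_phase

# Example: a synthetic grayscale "photograph" used as pure amplitude information
target = np.random.default_rng(1).random((64, 64))
hologram = gerchberg_saxton(target)
```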
Fig. 2C shows an alternative embodiment for the second and subsequent iterations. The distribution of phase values 213A of the previous iteration is fed back through the processing blocks of the algorithm. The distribution of amplitude values 211A is rejected in favor of an alternative distribution of amplitude values. In this alternative embodiment, the alternative distribution of amplitude values is derived from the distribution of amplitude values 211 of the previous iteration. Specifically, the processing block 258 subtracts the distribution of amplitude values of the input image 210 from the distribution of amplitude values 211 of the previous iteration, scales the difference by a gain factor α, and subtracts the scaled difference from the input image 210. This is expressed mathematically by the following equations, where the subscript text and numbers indicate the iteration number:
R_{n+1}[x, y] = F'{exp(iψ_n[u, v])}

ψ_n[u, v] = ∠F{η·exp(i∠R_n[x, y])}

η = T[x, y] - α(|R_n[x, y]| - T[x, y])

where:

F' is the inverse Fourier transform;

F is the forward Fourier transform;

R_n[x, y] is the complex data set output by the third processing block 256;

T[x, y] is the input or target image;

∠ is the phase component;

ψ_n[u, v] is the phase-only hologram 280B;

η is the new distribution of amplitude values 211B; and

α is the gain factor.
The gain factor α may be fixed or variable. In some embodiments, the gain factor α is determined based on the size and rate of the input target image data. In some embodiments, the gain factor α depends on the number of iterations. In some embodiments, the gain factor α is only a function of the number of iterations.
In all other respects, the embodiment of fig. 2C is identical to the embodiments of figs. 2A and 2B. It can be said that the phase-only hologram ψ[u, v] comprises a phase distribution in the frequency or Fourier domain.
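By way of illustration only, the amplitude feedback of fig. 2C can be sketched as follows; the gain factor value is an assumed example and the function name is illustrative.

```python
import numpy as np

def feedback_amplitude(target, replay_amplitude, alpha=0.5):
    """Fig. 2C style amplitude constraint: eta = T[x, y] - alpha * (|R_n[x, y]| - T[x, y])."""
    return target - alpha * (replay_amplitude - target)

# In the iterative loop sketched above, the data forming step 202B would then combine this
# new distribution of amplitude values with the fed-back phase distribution:
#   field = feedback_amplitude(target, np.abs(replay)) * np.exp(1j * np.angle(replay))
```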
In some embodiments, the Fourier transform is performed using a spatial light modulator. In particular, the hologram data is combined with second data providing optical power. That is, the data written to the spatial light modulator includes hologram data representing an object and lens data representing a lens. When displayed on a spatial light modulator and illuminated with light, the lens data emulates a physical lens, i.e. it brings light to a focus in the same manner as the corresponding physical optical element. Thus, the lens data provides optical or focusing power. In these embodiments, the physical Fourier transform lens 120 of fig. 1 may be omitted. It is known how to calculate data representing a lens. The data representing a lens may be referred to as a software lens. For example, a phase-only lens may be formed by calculating the phase delay caused by each point of the lens owing to its refractive index and spatially varying optical path length. For example, the optical path length at the center of a convex lens is greater than the optical path length at the edges of the lens. A pure amplitude lens may be formed by a Fresnel zone plate. In the field of computer-generated holography, it is also known how to combine data representing a lens with a hologram so that the Fourier transform of the hologram can be performed without the need for a physical Fourier lens. In some embodiments, the lensing data is combined with the hologram by simple addition (such as simple vector addition). In some embodiments, a physical lens is used in conjunction with a software lens to perform the Fourier transform. Alternatively, in other embodiments, the Fourier transform lens is omitted altogether, such that the holographic reconstruction takes place in the far field. In further embodiments, the hologram may be combined in the same way with grating data, i.e. data arranged to perform the function of a grating, such as image steering. Again, it is known in the art how to calculate such data. For example, a pure phase grating may be formed by modeling the phase delay caused by each point on the surface of a blazed grating. A pure amplitude grating may simply be superimposed with a pure amplitude hologram to provide angular steering of the holographic reconstruction. The second data providing lensing and/or steering may be referred to as a light processing function or light processing pattern, to distinguish it from the hologram data, which may be referred to as an image forming function or image forming pattern.
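By way of illustration only, the following sketch shows how a light processing pattern (a software lens plus a software grating/phase ramp) might be added to a phase hologram by simple addition modulo 2π. The pixel pitch, wavelength, focal length and grating period are assumed example values, and the function names are illustrative.

```python
import numpy as np

def software_lens(shape, pixel_pitch, wavelength, focal_length):
    """Quadratic phase profile emulating a thin lens of the given focal length."""
    ny, nx = shape
    y = (np.arange(ny) - ny / 2) * pixel_pitch
    x = (np.arange(nx) - nx / 2) * pixel_pitch
    xx, yy = np.meshgrid(x, y)
    return -np.pi * (xx**2 + yy**2) / (wavelength * focal_length)

def software_grating(shape, cycles_x=0.0, cycles_y=0.0):
    """Linear phase ramp (blazed grating) used to steer the playback field."""
    ny, nx = shape
    ramp_x = 2 * np.pi * cycles_x * np.arange(nx) / nx
    ramp_y = 2 * np.pi * cycles_y * np.arange(ny) / ny
    return ramp_y[:, None] + ramp_x[None, :]

# Assumed example values: 10 um pixels, 532 nm light, 500 mm focal length
shape = (1024, 1024)
hologram_phase = np.zeros(shape)                    # image forming pattern (placeholder hologram)
lens = software_lens(shape, 10e-6, 532e-9, 0.5)     # light processing pattern: focusing power
grating = software_grating(shape, cycles_x=64)      # light processing pattern: steering
diffraction_pattern = np.mod(hologram_phase + lens + grating, 2 * np.pi)
```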
In some embodiments, the fourier transform is performed by a combination of a physical fourier transform lens and a software lens. That is, some of the optical power that contributes to the fourier transform is provided by the software lens, while the remaining optical power that contributes to the fourier transform is provided by one or more physical optics.
In some embodiments, a real-time engine is provided that is arranged to receive image data and calculate holograms in real-time using an algorithm. For example, the image processing engine may receive and process a source image corresponding to a desired or target image, and the hologram engine may calculate a hologram corresponding to an image output by the image processing engine. In some embodiments, the image data is video comprising a sequence of image frames. In other embodiments, the hologram is pre-computed, stored in computer memory and recalled as needed for display on the SLM. That is, in some embodiments, a repository of predetermined holograms is provided.
Embodiments relate, by way of example only, to fourier holography and the Gerchberg-Saxton type algorithm. Some embodiments relate to forming reconstructed holograms on multiple planes simultaneously. Such holograms may be fresnel holograms or holograms calculated using the multiplanar Gerchberg-Saxton algorithm. The present disclosure also applies to holograms calculated by other techniques (e.g., point cloud method based techniques).
Light modulation
A spatial light modulator may be used to display a diffraction pattern comprising a computer-generated hologram. For example, the diffraction pattern may include a hologram representing the desired image, and optionally a software lens or the like (e.g., a grating/phase ramp function) to determine the location of the image reconstruction, as described herein. If the hologram is a phase-only hologram, a spatial light modulator that modulates the phase is required. If the hologram is a full complex hologram, a spatial light modulator that modulates phase and amplitude may be used, or a first spatial light modulator that modulates phase and a second spatial light modulator that modulates amplitude may be used.
In some embodiments, the light modulating elements (i.e., pixels) of the spatial light modulator are cells containing liquid crystal. That is, in some embodiments, the spatial light modulator is a liquid crystal device in which the optically active component is liquid crystal. Each liquid crystal cell is configured to selectively provide a plurality of light modulation levels. That is, each liquid crystal cell is configured to operate at a selected one of a plurality of possible light modulation levels at any time. Each liquid crystal cell is dynamically reconfigurable to a different light modulation level selected from the plurality of light modulation levels. In some embodiments, the spatial light modulator is a reflective Liquid Crystal On Silicon (LCOS) spatial light modulator, although the present disclosure is not limited to this type of spatial light modulator.
LCOS devices provide dense arrays of light modulating elements, or pixels, within a small aperture (e.g., a few centimeters wide). The pixels are typically about 10 microns or less, which results in a diffraction angle of a few degrees, meaning that the optical system can be compact. It is easier to adequately illuminate the small aperture of an LCOS SLM than the larger aperture of other liquid crystal devices. LCOS devices are typically reflective, which means that the circuitry driving the pixels of an LCOS SLM can be buried under the reflective surface, resulting in a higher aperture ratio. In other words, the pixels are closely packed, which means that there is little dead space between the pixels. This is advantageous because it reduces optical noise in the playback field. LCOS SLMs use a silicon backplane, which has the advantage that the pixels are optically flat. This is particularly important for phase modulation devices.
A suitable LCOS SLM is described below, by way of example only, with reference to fig. 3. An LCOS device is formed using a single crystal silicon substrate 302. It has a 2D array of square planar aluminum electrodes 301, spaced apart by a gap 301a, arranged on the upper surface of the substrate. Each electrode 301 may be addressed by circuitry 302a buried in the substrate 302. Each electrode forms a respective plane mirror. An alignment layer 303 is disposed on the electrode array, and a liquid crystal layer 304 is disposed on the alignment layer 303. A second alignment layer 305 is disposed on a planar transparent layer 306, for example made of glass. A single transparent electrode 307 made of, for example, ITO is disposed between the transparent layer 306 and the second alignment layer 305.
Each square electrode 301 together with the coverage area of the transparent electrode 307 and the intermediate liquid crystal material define a controllable phase modulating element 308, commonly referred to as a pixel. The effective pixel area or fill factor is the percentage of the total pixel that is optically active, taking into account the space between pixels 301 a. By controlling the voltage applied to each electrode 301 relative to the transparent electrode 307, the characteristics of the liquid crystal material of the respective phase modulating element can be varied to provide a variable retardation to light incident thereon. The effect is to provide pure phase modulation to the wavefront, i.e. no amplitude effects occur.
The described LCOS SLM outputs spatially modulated light in a reflective manner. Reflective LCOS SLMs have the advantage that the signal lines, gate lines and transistors are located under the mirror, which results in a high fill factor (typically greater than 90%) and high resolution. Another advantage of using a reflective LCOS spatial light modulator is that the thickness of the liquid crystal layer can be half that required when using a transmissive device. This greatly increases the switching speed of the liquid crystal (a key advantage of projecting moving video images). However, the teachings of the present disclosure can be equally implemented using a transmissive LCOS SLM.
Forming multiple image reconstructions on multiple planes
Fig. 4A shows an example in which a plurality of target images (i.e., image reconstructions), in this case four, are formed at different planes arranged with respect to a viewer 400. In particular, a first image "A" is formed at a first plane "P1", a second image "B" is formed at a second plane "P2", a third image "C" is formed at a third plane "P3", and a fourth image "D" is formed at a fourth plane "P4". The planes P1 to P4 are disposed at different positions along a common projection axis (not shown). Thus, the planes P1 to P4 are arranged parallel to each other and perpendicular to the projection axis, which extends through the center of each plane P1 to P4. The first plane P1 is at a maximum distance along the projection axis from the spatial light modulator (not shown), while the fourth plane P4 is at a minimum distance along the projection axis from the spatial light modulator. Fig. 4B shows the composite image perceived by the viewer, comprising the first image A, the second image B, the third image C and the fourth image D of fig. 4A. The second image B, the third image C and the fourth image D may be resized so as to avoid giving any sense of depth. For ease of illustration, the plurality of images A through D are formed at spatially separated locations (i.e., x, y coordinate locations) in their respective two-dimensional planes, such that the images do not coincide or overlap in the composite image as seen by the viewer. In other examples, multiple images may be formed at positions in their respective planes such that they are adjacent to each other, but at least do not overlap, in the composite image as seen by the viewer.
As described above, when the spatial light modulator displays a hologram, hologram reconstructions (i.e. image reconstructions) are formed on at least one plane, wherein each image reconstruction comprises an image point or image pixel. In particular, a set of image points arranged in an array may be selectively formed on at least one playback plane.
The inventors have found that the quality of the image reconstruction can be improved by forming (i.e. displaying) different groups or subsets of its image points on different planes. In particular, the image quality may be improved by spatially multiplexing the display of different groups or subsets of image points. Different subsets of the image point array are formed on different planes. In particular, adjacent image points on each plane are spaced wider than adjacent image points in the full perceptible array of image points. Displaying different subsets of image points in a pattern having fewer and widely spaced image points in different planes in rapid succession or substantially simultaneously may improve image quality. For example, arrays of image points having opposite checkerboard patterns may be displayed on different planes. The method avoids the formation of adjacent closely spaced image points on the same plane (i.e. in an image point array for image reconstruction) and as a result has been found to reduce pixel cross-talk or inter-pixel interference and thereby improve image quality. Since all image points of the target image are perceived by the human eye, displaying the subset of image points does not result in a reduced image quality. Thus, the full resolution image reconstruction can be seen by the viewer.
According to embodiments of the present disclosure, multiple image reconstructions are formed on different planes substantially simultaneously. Figs. 5 to 7 show examples of the display of subsets of image points of the same target image formed on different planes, each arranged in a checkerboard pattern, according to an embodiment. The present disclosure is not limited to patterns of image points arranged in a checkerboard pattern. Instead, various other patterns of image points that enable insertion of image points, as described herein, are possible and contemplated.
Fig. 5A shows first image points of a first image reconstruction and fig. 5B shows second image points of a second image reconstruction. The first image points comprise a subset of image points arranged in a first pattern and the second image points comprise a subset of image points arranged in a second pattern. In the example shown, the set of image points that can be formed by the spatial light modulator in the replay field is a 5 x 5 array of image points (i.e. the combination of the first and second image points). It will be appreciated that, in practice, the spatial light modulator forms image reconstructions comprising much larger arrays of image points. In addition, it should be understood that the values of the image points are not shown in figs. 5A and 5B, meaning, for example, that some of the image points shown may have zero values and thus may not appear as light spots in the image reconstruction.
The first pattern of image points of fig. 5A comprises a first checkerboard pattern and the second pattern of image points of fig. 5B comprises a second checkerboard pattern, wherein the first pattern is opposite (i.e., inverted) to the second checkerboard pattern. For example, the first pattern comprises a first subset of image points arranged in a first checkerboard pattern (e.g. corresponding to black squares of the checkerboard) and the second pattern comprises a second subset of image points arranged in a second (opposite) checkerboard pattern (e.g. corresponding to white squares of the checkerboard). Thus, each of the first and second patterns comprises a reduced number of image points in the array of image points (i.e. a subset of the 5 x 5 array of image points).
According to an embodiment, first and second image reconstructions of the same target image are formed synchronously at respective first and second planes, as shown in fig. 6 and 7. In particular, the first and second image reconstructions are formed substantially simultaneously such that they are perceived by the human eye as coplanar. The image used to calculate one of the first and second holograms may be resized by an amount based on the distance between the first and second planes such that the first and second image reconstructions appear to be coplanar. The first and second image reconstructions are formed or displayed simultaneously (e.g., by displaying the respective first and second single plane holograms substantially simultaneously on respective first and second arrays of the same or different spatial light modulators, or displaying the biplane holograms on the spatial light modulators).
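By way of illustration only, under a simple angular-size (pinhole viewing) assumption, the resize amount could be estimated from the ratio of the viewing distances of the two planes; the numerical distances below are assumed example values and the function name is illustrative.

```python
def coplanar_scale_factor(viewer_to_plane_1, viewer_to_plane_2):
    """Scale for the image on the second plane so that it subtends the same visual angle
    as the image on the first plane (simple angular-size assumption)."""
    return viewer_to_plane_2 / viewer_to_plane_1

# e.g. if the second plane P2 is at 1.9 m and the first plane P1 at 2.0 m from the viewer,
# the image reconstructed on P2 is drawn at 95% size so the two appear coplanar.
scale = coplanar_scale_factor(2.0, 1.9)
```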
As shown in fig. 6, a first image point 601 is formed on a first plane P1 and a second image point 602 is formed on a second plane P2, wherein the first plane P1 and the second plane P2 are arranged in parallel along a common projection axis (not shown). Since each of the first and second image points comprises a subset of image points of the same target image arranged in a checkerboard pattern, the spacing between adjacent image points on the respective planes is increased by a factor of two (i.e. compared to the spacing between adjacent image points of a 5x5 array). Furthermore, since the first image points 601 and the second image points 602 are arranged in opposite checkerboard patterns, the second image points 602 are interposed between the first image points 601 when the composite image is viewed. The second image point 602 can be said to fill in the gap between the first image points 601 in the composite two-dimensional image. In particular, as shown in fig. 6, the image point 621 of the second image point 602 formed on the second plane P2 is interposed at a position 623 between the three first image points 601 formed on the first plane P1. Thus, a first image point 601 of a first image reconstruction formed on a first plane P1 is spatially separated from a second image point 602 of a second image reconstruction formed on a second plane P2 and interposed between the second image points 602. It can be said that the coordinate positions of the first image point 601 and the second image point 602 (defined by the two-dimensional coordinate system (x, y) of their respective planes) are offset or shifted from each other so that the first and second image points are interpolated and spatially separated. Fig. 7 shows a composite image comprising the first image point 601 of the first image reconstruction formed on the plane P1 and the second image point 602 of the second image reconstruction formed on the plane P2 of fig. 6, as seen by a viewer.
An image projection method is provided for forming a plurality of image reconstructions on different planes disposed on a common projection axis. The method includes determining at least one hologram corresponding to a target image for image reconstruction. The method further includes forming a diffraction pattern including a corresponding hologram. The method includes displaying each diffraction pattern using a display engine. The method further includes illuminating the display engine with light and forming image reconstructions corresponding to each hologram on a plane of the plurality of different planes, wherein each image reconstruction includes image points arranged in a pattern, and wherein image points of a first image reconstruction formed on a first plane are interposed between image points of a second image reconstruction formed on a second plane.
Thus, spatial interpolation or interlacing of image points of two-dimensional image reconstructions formed on different planes (spaced apart in the direction of the projection axis), i.e. in three dimensions, is disclosed. The principle of spatial interpolation of image points in three dimensions can be applied to image points of the same color (for example in the case of forming a monochromatic image) and to image points of different colors (for example in the case of forming a polychromatic (panchromatic) image). Examples are described below.
Hologram calculation for checkerboard
In the embodiments described above with reference to figs. 5 to 7, each image reconstruction comprises a subset of image points of a single desired or target image arranged in a checkerboard pattern. This is referred to herein as a "checkerboard". A single hologram displayed at a single point in time on the spatial light modulator (e.g., a biplane hologram forming image points with opposite checkerboard patterns on different planes) may reconstruct the complete target image spanning both planes. Alternatively, two separate holograms (e.g., a pair of holograms forming image points with opposite checkerboard patterns on respective planes) may together reconstruct the complete target image across both planes. It can be said that a subset of image points corresponding to a partial image reconstruction of the target image is formed on each plane. Various techniques are possible for calculating holograms that form such partial image reconstructions of the target image. An example using the checkerboard method is described below, in which each individual hologram represents a subset of image points of the target image arranged in a checkerboard pattern, as described above.
Checkerboard pattern
In particular, image data including a source image representing a desired or target image may be received and processed by an image processing engine. In some examples, the source image may be an image for processing. The image processing engine may process (e.g., sub-sample) the image to generate a pair of sub-images. The sub-sampling may be based on a checkerboard pattern. The first sub-image may be generated by selecting every other image pixel of the image in the first checkerboard pattern and making the remaining pixels zero (i.e., filling the remaining pixels with "zeros"). The second sub-image may be generated by selecting every other image pixel of the image in the second checkerboard pattern and making the remaining pixels zero (i.e. filling the remaining pixels with "zeros"), wherein the first checkerboard pattern is opposite to the second checkerboard, and vice versa. Thus, the first sub-image comprises a first subset of image pixels and the second sub-image comprises a second subset of image pixels, wherein the first and second subsets do not overlap and together form an image pixel set.
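By way of illustration only, the following sketch generates the pair of complementary checkerboard sub-images by zero-filling; a two-dimensional (grayscale) source image is assumed and the function name is illustrative.

```python
import numpy as np

def checkerboard_sub_images(image):
    """Split a 2D image into two complementary checkerboard sub-images.

    Every other pixel is kept in the first sub-image and zeroed in the second,
    and vice versa, so the two subsets do not overlap and together contain
    every pixel of the source image.
    """
    rows, cols = np.indices(image.shape)
    first_mask = (rows + cols) % 2 == 0       # first checkerboard ("black squares")
    second_mask = ~first_mask                 # second, opposite checkerboard ("white squares")
    first = np.where(first_mask, image, 0)
    second = np.where(second_mask, image, 0)
    return first, second

source = np.random.default_rng(2).random((5, 5))   # toy 5 x 5 example, as in figs. 5A/5B
sub_image_1, sub_image_2 = checkerboard_sub_images(source)
```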
In some embodiments of the present disclosure, a single hologram is calculated based on all image pixels of a desired or target image. The single hologram forms first and second image reconstructions comprising respective subsets of image points (one subset on each plane) having a checkerboard pattern, which are perceived as complete reconstructions of the target image arranged in an array. In other embodiments, a pair of holograms is calculated, where each hologram is calculated based on a subset of image pixels of the target image. Thus, the first and second holograms form respective first and second image reconstructions comprising respective subsets of image points having a checkerboard pattern in respective planes. Each subset of pixels includes a reduced number of pixels. A sampling scheme based on a checkerboard pattern may be used to determine a subset of pixels. In particular, the sampling scheme for the checkerboard selects (or sub-samples) every other image pixel of the image such that the number of image pixels is reduced by a factor of two. Each image reconstruction on each plane includes a subset of image points forming a partial image reconstruction of the target image. By means of the checkerboard method, the spacing between image points in the array on each plane is increased by a factor of two. However, the spatial resolution (i.e. the density of image points) of the perceived composite image reconstruction is not reduced by a factor of two.
In some embodiments, each hologram is a Fourier hologram comprised in a diffraction pattern, wherein the Fourier hologram is superimposed with a lens function that determines the position of the playback field on the projection axis. In these embodiments, the lens function superimposed with the first hologram is different from the lens function superimposed with the second hologram. The first image reconstruction on the first plane and the second image reconstruction on the second plane may be the same color or different colors. In these embodiments, two holographic projection channels may be provided for each color. For example, a first red holographic projection channel may be used to form a first red image reconstruction on a first plane and a second red holographic projection channel may be used to form a second red image reconstruction on a second plane. In some embodiments, red, green and blue image reconstructions are formed on multiple planes on the projection axis.
The viewer sees a single complete image reconstruction corresponding to each target image, comprising the combined first and second partial image reconstructions. Thus, the projected image appears to be a faithful and complete reconstruction of the target image.
As described herein, the above techniques may be used to calculate holograms of each target image to be displayed on different ones of a plurality of planes disposed along a common projection axis.
Phase ramp function for image point insertion
In other embodiments using different holographic projection channels to form multiple holographic reconstructions on different planes, the image points of a first (partial) image reconstruction of the target image are made to appear intermediate between the image points of a second (partial) image reconstruction of the target image by adding a phase ramp or software grating function to translate the (partial) image reconstructions relative to each other. That is, different phase ramp functions are applied to the different holographic projection channels such that image points of one image reconstruction are perceived to be inserted between image points of the other image reconstruction. This helps to prevent any overlap between adjacent image points formed on the same plane (i.e. it reduces or prevents "pixel cross-talk" or "inter-pixel interference"). Overlap of adjacent image points or image pixels on the same plane may create interference, which may appear as graininess/noise to the viewer. Such interference can be managed by forming the first (partial) image reconstruction and the second (partial) image reconstruction substantially simultaneously on different parallel planes. The first and second holograms may be calculated from respective reduced resolution images derived from the target image, examples of which are described below.
Sub-sampling with kernel
One example sampling scheme is to sample image pixels of a source image using a so-called "kernel". The kernel may be considered a moving sampling window that captures and operates on a small set of image pixels (e.g., a 4 x 4 array of pixels). The kernel is progressively moved to a series of sampling window positions that cover or capture successive (i.e., adjacent and non-overlapping) groups of pixels of the image. The kernel can be said to operate on successive groups of pixels of the source image. For each sampling window position, the kernel operates to determine a single sub-sampled pixel value for the sub-image, which represents the value of the image pixel captured in the sampling window. For example, a single sub-sampled pixel value for a sub-image may be calculated as an average of pixel values or an average of weighted pixel values, where the weighting applied to the pixel values is defined by a kernel operation. Thus, the sub-image comprises a sub-sampled or reduced resolution image (i.e. a reduced number of image pixels). Different kernel-based sampling schemes (e.g., defining different sampling window locations and/or different kernel operations) may be used to determine the first and second sub-images for computing the corresponding one or more holograms.
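By way of illustration only, the following sketch shows kernel-based sub-sampling over non-overlapping windows; the 4 x 4 window size and uniform (averaging) kernel weights are assumed example choices, and the function name is illustrative.

```python
import numpy as np

def kernel_subsample(image, kernel):
    """Sub-sample an image with a moving kernel over non-overlapping windows.

    Each kernel-sized window of the source image is reduced to a single
    sub-sampled pixel value: the weighted average of the captured pixels,
    with the weighting defined by the kernel.
    """
    kh, kw = kernel.shape
    h, w = image.shape
    h, w = h - h % kh, w - w % kw                 # crop to a whole number of windows
    windows = image[:h, :w].reshape(h // kh, kh, w // kw, kw)
    weights = kernel / kernel.sum()
    return np.einsum('ijkl,jl->ik', windows, weights)

source = np.random.default_rng(3).random((64, 64))
kernel = np.ones((4, 4))                          # assumed: a simple 4 x 4 averaging kernel
sub_image = kernel_subsample(source, kernel)      # 16 x 16 reduced resolution sub-image
```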
Subsampling an intermediate image with warp correction
Another example sampling scheme is to process image pixels of a source image to determine an intermediate image and use the intermediate image as a primary image for sampling to determine a secondary image for hologram calculation. For example, the intermediate image may be determined using a displacement map comprising displacement values for each pixel of the source image (e.g., in the x-and y-directions) that represent image distortions caused by an optical relay system (extending in the direction of the common projection axis) arranged to form each image reconstruction. In particular, the optical relay system of the holographic projector may comprise optics and/or an image combiner having optical power. The image reconstruction formed by the optical system may be distorted. The distortion can be modeled by considering a single displacement (x and y) for each pixel. In practical applications such as heads-up displays, such distortion may be caused by magnification optics, free form optics, windshields, etc. in the light path from the playback plane to the eye-box. This effect is referred to as "warping", and thus the intermediate image may be referred to as a "warped image" determined by using the "warp map".
An intermediate image, or main image for processing, is derived from the source image using the warp map or a pair of warp maps (e.g., for x and y). The intermediate image may be formed by warping the source image. The position of each pixel in the intermediate image can be determined by establishing the translation in the x and y directions caused by the warp. This may be determined by ray-tracing calculations, or by measuring the actual displacements using a camera and interpolating the results. The warped image (i.e., the intermediate image, rather than the source or target image) is sub-sampled to generate a plurality of sub-images. The sub-images used to calculate the hologram(s) effectively compensate for the warping effect because they are calculated from the intermediate image (i.e., the warped image) rather than the source image. Therefore, this technique has the additional advantage of simultaneously compensating for warping caused by the optical relay system of the projector.
As described above, the warped image may be sub-sampled using a kernel or other type of fixed or moving sampling window. For example, a first set of symmetrically arranged circular sampling windows covering substantially the entire warped image may be used to determine pixel values of the first sub-image. An equivalent second set of symmetrically arranged circular sampling windows, located at a position diagonally offset from but partially overlapping the first set, may be used to determine pixel values of the second sub-image. Each sampling window may determine a single sub-sampled pixel value of the corresponding sub-image by weighting the pixel values (i.e., gray values) captured in the sampling window and calculating an average of the weighted values. For example, the weighting may be assumed to be a gaussian distribution such that image pixels at the center of the sampling window have the highest weighting and pixel values at the edges of the sampling window have the lowest weighting according to the gaussian distribution.
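By way of illustration only, the following sketch produces two sub-images from the warped intermediate image using Gaussian-weighted sampling windows, the second set of windows being diagonally offset from (and partially overlapping) the first. The window size, offset and Gaussian width are assumed example values, square rather than circular windows are used for simplicity, and the function names are illustrative.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Gaussian weighting: highest weight at the window center, lowest at its edges."""
    ax = np.arange(size) - (size - 1) / 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return g / g.sum()

def windowed_subsample(image, size, sigma, offset=(0, 0)):
    """One sub-sampled pixel per window position; the second sub-image uses windows
    diagonally offset from (and partially overlapping) the first set of windows."""
    weights = gaussian_kernel(size, sigma)
    oy, ox = offset
    h, w = image.shape
    rows = range(oy, h - size + 1, size)
    cols = range(ox, w - size + 1, size)
    return np.array([[np.sum(image[r:r + size, c:c + size] * weights)
                      for c in cols] for r in rows])

warped = np.random.default_rng(4).random((64, 64))         # stands in for the warped intermediate image
sub_image_1 = windowed_subsample(warped, size=4, sigma=1.5)
sub_image_2 = windowed_subsample(warped, size=4, sigma=1.5, offset=(2, 2))  # diagonal offset
```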
Subsampling based on warp map
Another example sub-sampling scheme is sub-sampling pixels of a high resolution image to derive a plurality of sub-images, wherein sampling window positions for sub-sampling groups of pixels of the sub-sampled image are determined based on a warp map. For example, a high resolution image may be formed by enlarging a source image or a target image. The displacement map (or warp map) of the projector may be used to determine the pixel displacement caused by warp. Therefore, since the displacement of the pixel in the x and y directions caused by the warp effect is known, the displacement position can be used for sub-sampling to compensate for the warp effect. Thus, the technique derives multiple sub-images from a high resolution image by sub-sampling groups of image pixels captured by a sampling window (e.g., a block comprising a 4 x 4 pixel array) at displaced pixel locations to compensate for warping effects.
Thus, the source image is sub-sampled using the sampling windows at a set of sampling window positions corresponding to the displacement positions, whereby individual sub-sampled pixel values for each pixel of each sub-image are derived from pixel values of image pixels captured within the respective sampling window. In contrast to other sub-sampling techniques, where the sampling window position is typically predefined, the sampling window position is calculated based on a displacement/warp map of the optical relay system.
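By way of illustration only, the following sketch sub-samples a high resolution image at sampling window positions displaced according to a warp (displacement) map; the window size and the synthetic displacement maps are assumed example values, and the function name is illustrative.

```python
import numpy as np

def warp_guided_subsample(high_res, warp_x, warp_y, window=4):
    """Sub-sample a high resolution image using windows placed at warped positions.

    warp_x / warp_y give, for each sub-image pixel, the displacement (in high
    resolution pixels) of its sampling window, so the sub-image pre-compensates
    the distortion of the optical relay system.
    """
    out_h, out_w = warp_x.shape
    sub = np.zeros((out_h, out_w))
    h, w = high_res.shape
    for i in range(out_h):
        for j in range(out_w):
            # Nominal window position plus the displacement from the warp map
            r = int(np.clip(i * window + warp_y[i, j], 0, h - window))
            c = int(np.clip(j * window + warp_x[i, j], 0, w - window))
            sub[i, j] = high_res[r:r + window, c:c + window].mean()
    return sub

rng = np.random.default_rng(5)
high_res = rng.random((256, 256))                  # e.g. a magnified source image
warp_x = rng.normal(0, 1.0, (64, 64))              # synthetic displacement maps for illustration
warp_y = rng.normal(0, 1.0, (64, 64))
sub_image = warp_guided_subsample(high_res, warp_x, warp_y)
```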
Thus, the first and second sub-images can be generated. The first and second sub-images automatically compensate for warping effects by means of a sampling scheme, i.e. the position of the sampling window. The first and second holograms may be calculated using the first and second sub-images. Alternatively, a hologram (e.g., a biplane hologram such as a fresnel hologram) may be calculated that holographically reconstructs a first sub-image on a first plane and a second sub-image on a second plane that is different from the first plane. Thus, the viewer can see a true reconstruction of the target image on the projection axis of the holographic projector, which compensates for the warping effect of the optical relay system.
Synchronizing multiple checkerboard image reconstructions on multiple planes
In some embodiments, image reconstructions of a single color target image may be formed on different planes, where each image reconstruction includes image points of the same color. During a first display event, a first partial image reconstruction of an image formed on a first plane is a first checkerboard pattern of image points and a second partial image reconstruction of an image formed on a second plane is a second checkerboard pattern of image points, as shown in FIG. 6. During a second display event, a complementary first partial image reconstruction of the image formed on the first plane is a second checkerboard pattern of image points and a complementary second partial image reconstruction of the image formed on the second plane is a first checkerboard pattern of image points. During each display event, the formation of the first and second image reconstructions is synchronized. In some embodiments, the first and second image reconstructions are formed simultaneously from the same hologram. In some embodiments, the first and second image reconstructions are formed simultaneously in a spatially separated manner (e.g., by using two separate holographic channels/spatial light modulators to display the respective holograms). In other embodiments, the first and second image reconstructions are formed in rapid succession in a frame sequential method as described below (e.g., by sequentially displaying the corresponding holograms on a single spatial light modulator at high speed).
Example 1-monochrome (red) image on two planes:
Display event | First plane | Second plane
1 | Red - first checkerboard pattern | Red - second checkerboard pattern
2 | Red - second checkerboard pattern | Red - first checkerboard pattern
3 | Red - first checkerboard pattern | Red - second checkerboard pattern
Example 1
In the scenario of example 1, when the image points of the image reconstruction formed on the second plane have the second checkerboard pattern ("second checkerboard"), the image points of the image reconstruction formed on the first plane have the first checkerboard pattern ("first checkerboard"), and vice versa. As a result, image points of the image reconstruction formed on the first plane are interposed between image points of the image reconstruction formed on the second plane when viewed by a viewer. In addition, image points of the image reconstructions formed one after the other (successively/consecutively) on the first plane alternate between the first and the second checkerboard pattern, and image points of the image reconstructions formed one after the other on the second plane alternate between the second and the first checkerboard pattern. The composite image seen by the viewer comprises the image reconstructions formed on the different planes without the problem of inter-pixel interference within each playback field.
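By way of illustration only, the alternating schedule of example 1 can be sketched as follows; the names and structure are illustrative only.

```python
# Example 1 schedule: the two checkerboard patterns alternate between the
# planes on successive display events.
def checkerboard_for(display_event, plane):
    """Return which checkerboard pattern ('first' or 'second') the red partial
    image reconstruction uses on a given plane during a given display event."""
    patterns = ('first', 'second')
    if plane == 'P1':
        return patterns[(display_event + 1) % 2]   # event 1 -> first, event 2 -> second, ...
    return patterns[display_event % 2]             # the opposite pattern on P2

for event in (1, 2, 3):
    print(event, checkerboard_for(event, 'P1'), checkerboard_for(event, 'P2'))
# 1 first second
# 2 second first
# 3 first second
```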
Examples 2 and 3-two color (red and green) images on two planes:
In other embodiments, more complex schemes are contemplated, such as using different colors to form a multi-color or "panchromatic" image reconstruction on each plane. Two different methods of forming a color image reconstruction are known. A first method, called spatially separated color "SSC", uses three spatially separated arrays of light modulating pixels to simultaneously display three monochromatic holograms (e.g., red, blue, and green holograms). A second method, called frame sequential color "FSC", uses all pixels of a common spatial light modulator to perform sequential display of three monochromatic holograms within the integration time of the human eye.
In the following example scheme, the target image includes a composite of red, green (and blue) images corresponding to its red, green (and blue) image pixels. As described above, a separate hologram is calculated for each color or wavelength. According to the present disclosure, during a display event, image points formed on a first display plane are interposed between image points formed on a second display plane.
Example 2
In the scenario of example 2, when the image reconstructed red image points formed on the second plane have the second checkerboard pattern, the image reconstructed red image points formed on the first plane have the first checkerboard pattern, and vice versa. Similarly, when the image reconstructed green image points formed on the second plane have a first checkerboard pattern, the image reconstructed green image points formed on the first plane have a second checkerboard pattern, and vice versa. As a result, during a display event, red image points of a first partial image reconstruction of a red image are interposed between red image points of a second partial image reconstruction of a red image, and during a display event, green image points of a first partial image reconstruction of a green image are interposed between green image points of a second partial image reconstruction of a green image. As will be appreciated by a person skilled in the art, the scheme may be extended to include a third color, i.e. a blue image point of each of the first and second images on the respective first and second planes.
Example 3
In the scenario of example 3, when the green image points on the first plane have the second checkerboard pattern, the red image points on the first plane have the first checkerboard pattern, and vice versa. Similarly, when the green image points on the second plane have the second checkerboard pattern, the red image points on the second plane have the first checkerboard pattern, and vice versa. As a result, during two subframes of a display event, the red image points of the first partial image reconstruction of the red image are interposed between the green image points of the second partial image reconstruction of the green image.
Examples 4 and 5-three color (red, green, and blue) images on two planes:
Example 4
In the scenario of example 4, when the blue image points on the first plane have the second checkerboard pattern, the red/green image points on the first plane have the first checkerboard pattern, and vice versa. In addition, when the red/green image points on the second plane have the second checkerboard pattern, the blue image points on the second plane have the first checkerboard pattern, and vice versa. As a result, during each display event, red, green and blue image points formed on the first plane are interposed between corresponding red, green and blue image points formed on the second plane.
Display event | First plane | Second plane
1 | RGB - all first checkerboard pattern | RGB - all second checkerboard pattern
2 | RGB - all second checkerboard pattern | RGB - all first checkerboard pattern
3 | RGB - all first checkerboard pattern | RGB - all second checkerboard pattern
Example 5
In the scheme of example 5, a plurality of target images are displayed, wherein each target image is a composite of red, green, and blue images corresponding to its red, green, and blue image pixels. Each display event may display a different one of the plurality of target images. When the reconstructed red, green and blue image points of the same image formed on the second plane have the second checkerboard pattern, the reconstructed red, green and blue image points of the corresponding image formed on the first plane have the first checkerboard pattern, and vice versa. As a result, superimposed red, green and blue image points formed on the first plane are interposed between superimposed red, green and blue image points formed on the second plane during each display event. In this scheme, three separate holographic channels (red, green and blue channels) are required for each of the first and second image reconstruction/planes to be able to display the same color simultaneously on different planes.
Figs. 8A to 8C show the display of respective first, second and third target images using the scheme of example 5. Thus, during the first display event shown in fig. 8A, the RGB image points of the first partial image reconstruction of the first image, formed on the first plane P1, have the first checkerboard pattern, and the RGB image points of the second partial image reconstruction of the first image, formed on the second plane P2, have the second checkerboard pattern. During the second display event shown in fig. 8B, the RGB image points of the first partial image reconstruction of the second image, formed on the first plane P1, have the second checkerboard pattern, and the RGB image points of the second partial image reconstruction of the second image, formed on the second plane P2, have the first checkerboard pattern. Finally, during the third display event shown in fig. 8C, the RGB image points of the first partial image reconstruction of the third image, formed on the first plane P1, have the first checkerboard pattern, and the RGB image points of the second partial image reconstruction of the third image, formed on the second plane P2, have the second checkerboard pattern. As a result, during each display event, RGB image points formed on the first plane are interposed between RGB image points formed on the second plane.
Other schemes for synchronizing color and checkerboard patterns in accordance with the principles of the present disclosure will be apparent to those skilled in the art.
As described above, the hologram displayed by the spatial light modulator of the projector disclosed herein may be a Fourier or Fresnel hologram. In embodiments that use Fourier holograms, a single Fourier hologram forms an image reconstruction in only one plane. As described above, the position of that plane is determined by the variable software lens included in the diffraction pattern with the hologram. In embodiments that use Fresnel holograms, a single Fresnel hologram may be used to form image reconstructions on each of multiple planes simultaneously. In other embodiments, a biplane hologram is used to form two holographic playback fields on different planes at the same time.
In a particular embodiment, Fourier holography is used, with a variable software lens included in the diffraction pattern, to form image reconstructions of the holograms on the respective first and second planes. One red, one green and one blue holographic channel are used to selectively form red, green and blue image reconstructions on the respective first and second planes.
System diagram
Fig. 9 is a schematic diagram illustrating a holographic system according to an embodiment. A Spatial Light Modulator (SLM) 940 is arranged to display holograms received from controller 930. In operation, light source 910 illuminates a hologram displayed on SLM 940 and a holographic reconstruction is formed in a playback field on playback plane 925a. According to an embodiment, the playback plane 925a may be in free space. Additionally or alternatively, a holographic reconstruction may be formed on playback plane 925b, depending on the focal length of the respective variable software lens. The controller 930 receives an image from the image source 920. For example, the image source 920 may be an image capture device, such as a still camera arranged to capture a single still image or a video camera arranged to capture a video sequence of moving images.
The controller 930 includes an image processing engine 950, a hologram engine 960, a data frame generator 980, and a display engine 990. The image processing engine 950 receives a plurality of source images from the image source 920 for display on respective planes in the composite image. The image processing engine 950 may process the received images as described below, or may pass the received images directly to the hologram engine 960 to calculate the corresponding holograms.
The image processing engine 950 comprises a sub-image generator 955 arranged to generate a plurality of sub-images from each received image (e.g. a source image representing a desired or target image) according to a defined scheme, as described herein. The image processing engine 950 may receive control signals or otherwise determine the scheme used to generate the sub-images. Each sub-image includes fewer pixels than the received image. The image processing engine 950 may generate the plurality of sub-images using the source image as a primary image. The source image may be a magnified version of the target image, or the image processing engine may perform the magnification as described herein. Alternatively, the image processing engine 950 may process the source image to determine an intermediate image, and use the intermediate image as the primary image. For example, as described herein, the intermediate image may be a "warped image". The warped image may be determined using a displacement map comprising displacement values for each pixel of the source image (e.g. in the x and y directions) that represent the image distortion introduced by an optical relay system arranged to image each holographic reconstruction. As described herein, the image processing engine 950 may generate the plurality of sub-images by sub-sampling (or under-sampling) the primary image. The image processing engine 950 may determine a first sub-image and a second sub-image, wherein the pixel value of each pixel of the first sub-image is calculated from a first set of pixels of the primary image and the pixel value of each pixel of the second sub-image is calculated from a second set of pixels of the primary image. In some embodiments, the sampling windows used to select the second set of pixels are offset from, and/or partially overlap, the sampling windows used to select the first set of pixels. In other embodiments, the sampling window positions are arranged in a checkerboard pattern, with a different checkerboard pattern used for each sub-image. In some embodiments, a displacement map is used to determine the sampling window positions for selecting the first and second sets of pixels. The image processing engine 950 passes the plurality of sub-images to the hologram engine 960.
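The following Python sketch is one plausible (hypothetical) implementation of this sub-sampling step: the primary image is averaged within small sampling windows, and two sub-images are formed from sampling-window positions laid out on complementary checkerboards. The function names, window size and image sizes are assumptions for illustration only.

```python
import numpy as np

def subsample(primary, window=2, phase=0):
    """Under-sample a primary image into a sub-image with fewer active pixels.

    Each sub-image pixel is the mean of a window x window block of the primary
    image. phase selects which checkerboard of block positions is retained;
    the other positions are set to zero (no image point at that site)."""
    h, w = primary.shape
    rows, cols = h // window, w // window
    blocks = primary[:rows * window, :cols * window].reshape(rows, window, cols, window)
    averaged = blocks.mean(axis=(1, 3))
    y, x = np.indices((rows, cols))
    mask = ((x + y) % 2 == phase)          # checkerboard of retained sampling windows
    return np.where(mask, averaged, 0.0)

primary = np.random.rand(64, 64)           # stand-in for the (magnified or warped) primary image
sub_image_1 = subsample(primary, phase=0)  # first sub-image -> hologram for the first plane
sub_image_2 = subsample(primary, phase=1)  # second sub-image -> hologram for the second plane
```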
The hologram engine 960 is arranged to determine a hologram corresponding to each image received from the image processing engine 950, as described herein. In addition, the hologram engine is arranged to determine, for each image, a diffraction pattern comprising the hologram and, optionally, a software lens and/or a grating/phase-ramp function, as described herein. The hologram engine 960 passes the diffraction patterns to the data frame generator 980. The data frame generator 980 is arranged to generate a data frame (e.g. an HDMI frame) comprising the plurality of diffraction patterns, as described herein. In particular, the data frame generator 980 generates a data frame including hologram data for each of the plurality of holograms. The data frame generator 980 passes the data frames to the display engine 990. The display engine 990 is arranged to display each of the plurality of holograms on the SLM 940 in turn. The display engine 990 comprises a hologram extractor 992, a tiling engine 970 and software optics 994. As described herein, the display engine 990 extracts each hologram from the data frame using the hologram extractor 992 and tiles each hologram according to a tiling scheme generated by the tiling engine 970. In particular, the tiling engine 970 may receive control signals to determine the tiling scheme, or may otherwise determine the tiling scheme based on the hologram. The display engine 990 may optionally add a phase-ramp function (software grating function) using the software optics 994 to translate the position of the playback field on the playback plane, as described herein. Thus, for each hologram, the display engine 990 is arranged to output a drive signal to the SLM 940 to display each diffraction pattern comprising a hologram of the plurality of holograms on a respective plane, in a synchronized scheme and according to the corresponding tiling scheme, as described herein.
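The sketch below (an illustrative assumption about how these pieces fit together, not the patent's implementation) composes a diffraction pattern from a hologram phase, a software lens and a software grating (phase ramp), and packs two such patterns into one data frame. Pixel pitch, wavelength, focal lengths and ramp gradients are placeholder values.

```python
import numpy as np

PITCH, WAVELENGTH = 8e-6, 520e-9  # assumed SLM pixel pitch and laser wavelength

def lens_phase(n, focal_length):
    c = (np.arange(n) - n / 2) * PITCH
    x, y = np.meshgrid(c, c)
    return -np.pi * (x**2 + y**2) / (WAVELENGTH * focal_length)

def grating_phase(n, gradient_x, gradient_y):
    """Linear phase ramp; its gradient translates the playback field on its plane."""
    c = np.arange(n) - n / 2
    x, y = np.meshgrid(c, c)
    return gradient_x * x + gradient_y * y

def diffraction_pattern(hologram_phase, focal_length, gradient=(0.0, 0.0)):
    n = hologram_phase.shape[0]
    total = hologram_phase + lens_phase(n, focal_length) + grating_phase(n, *gradient)
    return np.angle(np.exp(1j * total))  # wrap the summed phase back into [-pi, pi]

n = 256
hologram_1 = np.random.uniform(-np.pi, np.pi, (n, n))  # stand-in hologram for the first plane
hologram_2 = np.random.uniform(-np.pi, np.pi, (n, n))  # stand-in hologram for the second plane

# Different focal lengths place the reconstructions on different planes; different
# ramp gradients give a relative (x, y) displacement so the image points interleave.
pattern_1 = diffraction_pattern(hologram_1, focal_length=0.30, gradient=(0.00, 0.00))
pattern_2 = diffraction_pattern(hologram_2, focal_length=0.45, gradient=(0.01, 0.01))

data_frame = np.stack([pattern_1, pattern_2])  # both diffraction patterns in one data frame
```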
As described herein, the controller 930 may dynamically control how the sub-image generator 955 generates the sub-images. The controller 930 may dynamically control the refresh rate of the holograms. As described herein, the refresh rate may be considered as the frequency at which the hologram engine recalculates holograms from the next source image in the sequence received by the image processing engine 950 from the image source 920. As described herein, the dynamically controllable features and parameters may be determined based on external factors indicated by control signals. The controller 930 may receive control signals relating to such external factors, or may include a module for determining such external factors and generating such control signals accordingly.
As will be appreciated by those skilled in the art, the above-described features of controller 930 may be implemented in software, firmware, or hardware, as well as any combination thereof.
Accordingly, a projector is provided that is arranged to form an image viewable from a viewing area. The projector comprises at least one spatial light modulator and at least one light source (per color). Each spatial light modulator is arranged to display a computer-generated diffraction pattern. Each respective light source is arranged to illuminate the spatial light modulator during display such that a first image is formed on a first plane and a second image is formed on a second plane. The first plane is spatially separated from the spatial light modulator by a first propagation distance and the second plane is spatially separated from the spatial light modulator by a second propagation distance different from the first propagation distance. The first image comprises first image points arranged in a first pattern and the second image comprises second image points arranged in a second pattern. The first image points and the second image points are interleaved (i.e. spatially interleaved and offset/separated in the two dimensions (x, y) of the image and in a third dimension (z) parallel to the projection axis) such that the first image points and the second image points are visible from the viewing area at the same time. The computer-generated diffraction pattern may be a Fresnel hologram arranged to form the first image and the second image simultaneously. Alternatively, the computer-generated diffraction pattern may comprise two spatially separated Fourier holograms, each combined (i.e. superimposed/added) with a respective lens function defining a respective propagation distance to a respective (playback) plane. That is, a first Fourier hologram is displayed on a first subset (e.g. a first half) of the pixels of the spatial light modulator and a second Fourier hologram is displayed on a second subset (e.g. a second half) of the pixels of the spatial light modulator substantially simultaneously with the first Fourier hologram.
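As a minimal sketch of the second option (with hypothetical array sizes and helper names), the SLM pixel area is split into two halves, each half carrying one Fourier hologram already combined with its own lens function so that the two reconstructions form on different planes.

```python
import numpy as np

def combine_with_lens(hologram_phase, focal_length, pitch=8e-6, wavelength=520e-9):
    """Superimpose a lens function with the given focal length on a hologram phase."""
    rows, cols = hologram_phase.shape
    y = (np.arange(rows) - rows / 2) * pitch
    x = (np.arange(cols) - cols / 2) * pitch
    xx, yy = np.meshgrid(x, y)
    lens = -np.pi * (xx**2 + yy**2) / (wavelength * focal_length)
    return np.angle(np.exp(1j * (hologram_phase + lens)))

slm_rows, slm_cols = 1024, 2048                       # assumed SLM resolution
hologram_a = np.random.uniform(-np.pi, np.pi, (slm_rows, slm_cols // 2))
hologram_b = np.random.uniform(-np.pi, np.pi, (slm_rows, slm_cols // 2))

# First half of the SLM pixels: hologram + lens for the first plane;
# second half: hologram + lens for the second plane, displayed simultaneously.
slm_pattern = np.empty((slm_rows, slm_cols))
slm_pattern[:, : slm_cols // 2] = combine_with_lens(hologram_a, focal_length=0.30)
slm_pattern[:, slm_cols // 2 :] = combine_with_lens(hologram_b, focal_length=0.45)
```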
In some embodiments, the image processing engine is arranged to under-sample the first primary image for projection according to a first sampling scheme to form a first sub-image comprising pixels arranged in a first checkerboard pattern. The image processing engine is further arranged to under-sample the first primary image for projection according to a second sampling scheme to form a second sub-image comprising pixels arranged in a second checkerboard pattern. The second sampling scheme may be different from the first sampling scheme, and the second checkerboard pattern may be opposite to the first checkerboard pattern. The hologram engine is arranged to determine a first hologram from the first sub-image and a second hologram from the second sub-image. Alternatively, the hologram engine may be arranged to determine (e.g. calculate) a single hologram (e.g. a Fresnel hologram) encoding the first and second sub-images. The light source is arranged to illuminate the spatial light modulator such that a first image reconstruction corresponding to the first hologram is formed on a first plane and a second image reconstruction corresponding to the second hologram is formed simultaneously on a second plane. The first plane is spatially separated from the spatial light modulator by a first propagation distance and the second plane is spatially separated from the spatial light modulator by a second propagation distance. The second propagation distance is different from the first propagation distance. The first image reconstruction comprises first image points arranged in a first pattern and the second image reconstruction comprises second image points arranged in a second pattern. The first image points are interposed between the second image points such that both the first image points and the second image points are viewable from the viewing area simultaneously.
Additional features
Embodiments relate, by way of example only, to electrically activated LCOS spatial light modulators. The teachings of the present disclosure may be equivalently implemented on any spatial light modulator capable of displaying computer-generated holograms according to the present disclosure, such as any electrically activated SLM, optically activated SLM, digital micromirror device, or microelectromechanical device.
In some embodiments, the light source is a laser, such as a laser diode. In some embodiments, the light receiving surface is a diffusing surface or screen, such as a diffuser. The holographic projection systems of the present disclosure may be used to provide an improved head-up display (HUD) or head-mounted display. In some embodiments, a vehicle is provided that includes a holographic projection system mounted in the vehicle to provide a HUD. The vehicle may be a motor vehicle such as an automobile, truck, van, cargo truck, motorcycle, train, aircraft, boat or ship.
In some embodiments, the size of the hologram (the number of pixels in each direction) is equal to the size of the spatial light modulator, such that the hologram fills the spatial light modulator. That is, the hologram uses all of the pixels of the spatial light modulator. In other embodiments, the hologram is smaller than the spatial light modulator. More specifically, the number of hologram pixels is less than the number of light-modulating pixels available on the spatial light modulator. In some of these other embodiments, a portion of the hologram (i.e. a contiguous subset of the pixels of the hologram) is repeated in the unused pixels. This technique may be referred to as "tiling", in which the surface area of the spatial light modulator is divided into a plurality of "tiles", each representing at least a subset of the hologram. Thus, the size of each tile is smaller than the size of the spatial light modulator. In some embodiments, the "tiling" technique is implemented to improve image quality. In particular, some embodiments implement the tiling technique to minimize the size of the image pixels while maximizing the amount of signal content entering the holographic reconstruction. In some embodiments, the holographic pattern written to the spatial light modulator comprises at least one complete tile (i.e. a complete hologram) and at least a fraction of a tile (i.e. a contiguous subset of the pixels of the hologram).
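A minimal sketch of this tiling idea follows (hypothetical hologram and SLM sizes): a hologram smaller than the SLM is repeated across the SLM surface, so the written pattern contains at least one complete tile plus contiguous partial tiles at the edges.

```python
import numpy as np

def tile_hologram(hologram, slm_rows, slm_cols):
    """Fill the SLM pixel area by repeating the hologram; edge tiles may be partial."""
    h, w = hologram.shape
    reps = (-(-slm_rows // h), -(-slm_cols // w))   # ceiling division per axis
    return np.tile(hologram, reps)[:slm_rows, :slm_cols]

hologram = np.random.uniform(-np.pi, np.pi, (512, 512))   # hologram smaller than the SLM
slm_pattern = tile_hologram(hologram, slm_rows=720, slm_cols=1280)
# The written pattern contains one complete 512x512 tile plus partial tiles at the edges.
```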
In an embodiment, only the main playback field is used to form the image reconstruction on each plane, and the system comprises physical blocks, such as baffles, arranged to limit the propagation of higher order playback fields through the system.
In an embodiment, as described herein, the holographic reconstruction is colored. In some embodiments, a method called spatially separated color "SSC" is used to provide color holographic reconstruction. In other embodiments, a method called frame sequential color "FSC" is used.
The SSC method uses three spatially separated arrays of light modulating pixels for three monochromatic holograms. The advantage of the SSC method is that the image can be very bright, since all three holographic reconstructions can be formed simultaneously. However, if, due to spatial constraints, three spatially separated arrays of light modulating pixels are provided on a common SLM, the quality of each monochrome image will not be optimal, as only a subset of the available light modulating pixels is used for each color. Thus, a relatively low resolution color image is provided.
The FSC method displays three monochromatic holograms in sequence using all of the pixels of a common spatial light modulator. The monochromatic reconstructions are cycled (e.g. red, green, blue, red, green, blue, etc.) fast enough that a human viewer perceives a polychromatic image from the integration of the three monochromatic images. The advantage of FSC is that the entire SLM is used for each color. This means that the quality of the three color images produced is optimal, because all of the pixels of the SLM are used for each color image. However, the disadvantage of the FSC method is that the composite color image is approximately three times less bright than with the SSC method, because each single-color illumination event can only occupy about one third of the frame time. This disadvantage can be addressed by overdriving the lasers or by using more powerful lasers, but this requires more power, resulting in higher cost and increased system size.
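The brightness trade-off can be made concrete with a back-of-the-envelope calculation (illustrative relative units only, assuming equal laser power per channel): FSC gives each color roughly a one-third duty cycle at full SLM resolution, whereas SSC drives all three channels continuously but with only about one third of the SLM pixels each.

```python
# Illustrative comparison of the two color schemes (relative units, assumed equal laser power).
frame_time = 1.0

# FSC: all SLM pixels per color, but each color lit for ~1/3 of the frame.
fsc_duty_cycle = frame_time / 3
fsc_relative_brightness = fsc_duty_cycle / frame_time    # ~0.33 of the SSC brightness
fsc_resolution_fraction = 1.0                            # full SLM resolution per color

# SSC: each color lit for the whole frame, but with ~1/3 of the SLM pixels.
ssc_relative_brightness = 1.0
ssc_resolution_fraction = 1 / 3                          # reduced resolution per color

print(f"FSC: brightness ~{fsc_relative_brightness:.2f}, resolution fraction {fsc_resolution_fraction:.2f}")
print(f"SSC: brightness ~{ssc_relative_brightness:.2f}, resolution fraction {ssc_resolution_fraction:.2f}")
```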
The methods and processes described herein may be embodied on a computer-readable medium. The term "computer-readable medium" includes media arranged to store data temporarily or permanently, such as random-access memory (RAM), read-only memory (ROM), flash memory, and cache memory. The term "computer-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions for execution by a machine such that the instructions, when executed by one or more processors, cause the machine to perform any one or more of the methodologies described herein, in whole or in part.
The term "computer-readable medium" also encompasses cloud-based storage systems. The term "computer-readable medium" includes, but is not limited to, one or more tangible and non-transitory data stores (e.g., data volumes) in the example form of solid state memory chips, optical disks, magnetic disks, or any suitable combination thereof. In some example embodiments, the instructions for execution may be conveyed by a carrier medium. Examples of such carrier media include transient media (e.g., propagated signals conveying instructions).
It will be apparent to those skilled in the art that various modifications and variations can be made without departing from the scope of the appended claims. The disclosure covers all modifications and variations within the scope of the appended claims and their equivalents.
Claims (21)
1. A projector arranged to form a plurality of image reconstructions on different planes disposed on a common projection axis, wherein the projector comprises:
a hologram engine arranged to determine a plurality of holograms corresponding to at least one image for image reconstruction and, for each hologram of the plurality of holograms, form a diffraction pattern comprising the corresponding hologram;
a display engine arranged to display each diffraction pattern and receive light, thereby forming an image reconstruction corresponding to each hologram on a plane of a plurality of different planes, wherein each image reconstruction comprises image points arranged in a pattern; and
wherein the image points of a first image reconstruction formed on a first plane are interposed between the image points of a second image reconstruction formed on a second plane.
2. The projector of claim 1, wherein the first and second image reconstructions are formed simultaneously.
3. The projector of claim 1 or 2, wherein the first image reconstruction is a partial reconstruction of a target image and the second image reconstruction is a partial reconstruction of the target image.
4. The projector of claim 1 or 2, wherein the image points of the first image reconstruction are arranged in a first pattern and the image points of the second image reconstruction are arranged in a second pattern, wherein the first pattern is opposite to the second pattern.
5. The projector of claim 4 wherein the first pattern is a first checkerboard pattern and the second pattern is a second checkerboard pattern.
6. The projector of claim 4 arranged to form an image reconstruction sequence on the first plane by alternating between image points arranged in the first pattern and image points arranged in the second pattern, and to form an image reconstruction sequence on the second plane by alternating between image points arranged in the second pattern and image points arranged in the first pattern synchronously.
7. The projector of claim 6 wherein successive image reconstructions in the sequence of image reconstructions on the first plane are different colors.
8. The projector according to claim 1 or 2, wherein at any point in time the color of the image point formed on the first plane is different from the color of the image point formed on the second plane.
9. The projector of claim 1 or 2, wherein the first image reconstruction is a first holographic reconstruction and the second image reconstruction is a second holographic reconstruction.
10. The projector of claim 1, further comprising:
an image processing engine arranged to receive a plurality of source images comprising image pixels for projection and to process each source image according to a sampling scheme so as to reduce the number of image pixels having pixel values before the source image is processed by the hologram engine;
wherein each diffraction pattern formed by the hologram engine further comprises a lens function having a focal length, wherein the focal length of the lens function determines a plane of image reconstruction;
wherein the pattern of image points of each image reconstruction formed by the display engine is determined by the corresponding sampling scheme,
wherein the first sampling scheme associated with the first image reconstruction is different from the second sampling scheme associated with the second image reconstruction such that image points of the first image formed on the first plane are interposed between image points of the second image formed on the second plane.
11. The projector of claim 10 wherein the image processing engine is further arranged to magnify each source image prior to processing according to a sampling scheme.
12. The projector of claim 10 or 11, wherein the first sampling scheme comprises nulling alternate pixel values according to a first checkerboard pattern and the second sampling scheme comprises nulling alternate pixel values according to a second checkerboard pattern, wherein the first checkerboard pattern is opposite to the second checkerboard pattern.
13. The projector of claim 10 or 11 wherein each sampling scheme includes averaging pixel values within a sampling window at a plurality of sampling window positions, wherein the average is a weighted average based on pixel position within the sampling window.
14. The projector of claim 13 wherein the weighting assigned to each pixel value in the respective sampling window decreases with distance from the center of the sampling window.
15. The projector of claim 13 wherein the first sampling scheme includes a first set of sampling window positions and the second sampling scheme includes a second set of sampling window positions, wherein the first set of sampling window positions is diagonally offset from the second set of sampling window positions.
16. The projector of claim 15 wherein the first set of sampling window positions partially overlap the second set of sampling window positions.
17. The projector of claim 1 or 2, wherein each diffraction pattern formed by the hologram engine further comprises a phase ramp function, wherein a ramp gradient of a first phase ramp function associated with a first image reconstruction is different from a ramp gradient of a second phase ramp function associated with a second image reconstruction to provide a displacement between image points of the first image reconstruction relative to image points of the second image reconstruction.
18. The projector of claim 17 wherein a difference between the ramp gradient of the first phase ramp function and the ramp gradient of the second phase ramp function is such that image points of a first image formed on a first plane are interposed between image points of a second image formed on a second plane.
19. A head-up display comprising the projector of claim 1 or 2, wherein the first plane contains near-field image content and the second plane contains far-field image content.
20. A method for forming a plurality of image reconstructions on different planes disposed on a common projection axis, the method comprising:
determining a plurality of holograms corresponding to at least one image for image reconstruction;
for each hologram of the plurality of holograms, forming a diffraction pattern comprising the corresponding hologram;
displaying each diffraction pattern using a display engine;
illuminating the display engine with light and forming an image reconstruction corresponding to each hologram on a plane of a plurality of different planes,
wherein each image reconstruction includes image points arranged in a pattern, and wherein the image points of a first image reconstruction formed on a first plane are interposed between the image points of a second image reconstruction formed on a second plane.
21. The method of claim 20, comprising simultaneously forming the first and second image reconstructions.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1918966.1A GB2590621B (en) | 2019-12-20 | 2019-12-20 | A projector for forming images on multiple planes |
GB1918966.1 | 2019-12-20 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113009710A CN113009710A (en) | 2021-06-22 |
CN113009710B true CN113009710B (en) | 2023-08-01 |
Family
ID=69323006
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011542614.4A Active CN113009710B (en) | 2019-12-20 | 2020-12-21 | Projector for forming images on multiple planes |
Country Status (5)
Country | Link |
---|---|
US (1) | US11736667B2 (en) |
EP (1) | EP3839639A1 (en) |
CN (1) | CN113009710B (en) |
GB (1) | GB2590621B (en) |
TW (1) | TWI820365B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2587400B (en) * | 2019-09-27 | 2022-02-16 | Dualitas Ltd | Hologram display using a liquid crystal display device |
GB2595696B (en) * | 2020-06-04 | 2022-12-28 | Envisics Ltd | Forming a hologram of a target image for projection using data streaming |
GB2604119B (en) * | 2021-02-24 | 2023-10-11 | Envisics Ltd | Holographic projection |
CN113791529B (en) * | 2021-08-13 | 2022-07-08 | 北京航空航天大学 | Crosstalk-free holographic 3D display method based on diffraction fuzzy imaging principle |
US11915395B2 (en) * | 2022-07-20 | 2024-02-27 | GM Global Technology Operations LLC | Holographic display system for a motor vehicle with real-time reduction of graphics speckle noise |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2461294B (en) * | 2008-06-26 | 2011-04-06 | Light Blue Optics Ltd | Holographic image display systems |
GB2498170B (en) | 2011-10-26 | 2014-01-08 | Two Trees Photonics Ltd | Frame inheritance |
GB2501112B (en) | 2012-04-12 | 2014-04-16 | Two Trees Photonics Ltd | Phase retrieval |
GB2526158B (en) | 2014-05-16 | 2017-12-20 | Two Trees Photonics Ltd | Imaging device for moving a virtual image |
GB201513333D0 (en) * | 2015-07-29 | 2015-09-09 | Khan Javid | Volumetric display |
WO2017198713A2 (en) * | 2016-05-18 | 2017-11-23 | Seereal Technologies S.A. | Method for producing holograms |
CN106773590B (en) * | 2017-02-10 | 2019-08-30 | 清华大学深圳研究生院 | A kind of holographic display system based on multiple Digital Micromirror Device |
CN107024849B (en) * | 2017-05-02 | 2019-06-28 | 四川大学 | A kind of holographic veiling glare elimination system and method for colored calculating based on digital lens |
FR3068489B1 (en) | 2017-06-29 | 2019-08-02 | B<>Com | METHOD FOR DIGITALLY GENERATING A HOLOGRAM, DEVICE, TERMINAL EQUIPMENT, SYSTEM AND COMPUTER PROGRAM THEREOF |
US10317684B1 (en) * | 2018-01-24 | 2019-06-11 | K Laser Technology, Inc. | Optical projector with on axis hologram and multiple beam splitter |
GB2574058B (en) * | 2018-05-25 | 2021-01-13 | Envisics Ltd | Holographic light detection and ranging |
US20220075200A1 (en) * | 2019-02-14 | 2022-03-10 | Hangzhou Uphoton Optoelectronics Technology Co., Ltd. | Beam-splitting optical module and manufacturing method thereof |
GB2586511B (en) * | 2019-08-23 | 2021-12-01 | Dualitas Ltd | Holographic projector |
GB2578523B (en) * | 2019-09-25 | 2021-08-11 | Dualitas Ltd | Holographic projection |
EP4058837A1 (en) * | 2019-11-15 | 2022-09-21 | CY Vision Inc. | Augmented reality head-up display with steerable eyebox |
GB2584513B (en) * | 2019-12-18 | 2022-09-14 | Envisics Ltd | Conjugate suppression |
- 2019-12-20: GB GB1918966.1A patent/GB2590621B/en active Active
- 2020-12-04: TW TW109142867A patent/TWI820365B/en active
- 2020-12-08: EP EP20212358.4A patent/EP3839639A1/en active Pending
- 2020-12-11: US US17/119,715 patent/US11736667B2/en active Active
- 2020-12-21: CN CN202011542614.4A patent/CN113009710B/en active Active
Also Published As
Publication number | Publication date |
---|---|
GB2590621A (en) | 2021-07-07 |
EP3839639A1 (en) | 2021-06-23 |
TW202125128A (en) | 2021-07-01 |
TWI820365B (en) | 2023-11-01 |
US11736667B2 (en) | 2023-08-22 |
GB201918966D0 (en) | 2020-02-05 |
GB2590621B (en) | 2022-05-25 |
CN113009710A (en) | 2021-06-22 |
US20210195146A1 (en) | 2021-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113009710B (en) | Projector for forming images on multiple planes | |
US11765328B2 (en) | Holographic projection | |
US11480919B2 (en) | Holographic projector | |
EP3650950B1 (en) | A spatial light modulator for holographic projection | |
CN112558452B (en) | Holographic projection | |
CN111176092B (en) | Pixel mapping on holographic projection display devices | |
KR102481541B1 (en) | hologram projector | |
CN114967400B (en) | Holographic projection | |
US12126944B2 (en) | Holographic projection | |
US20220299940A1 (en) | Image Projection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |