TWI421540B - Holographic display device and method (1) - Google Patents

Holographic display device and method (1)

Info

Publication number
TWI421540B
Authority
TW
Taiwan
Prior art keywords
spatial light
light modulator
emitting diode
array
hologram
Prior art date
Application number
TW96140505A
Other languages
Chinese (zh)
Other versions
TW200827771A (en)
Inventor
Armin Schwertner
Original Assignee
Seereal Technologies Sa
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to GBGB0621360.7A priority Critical patent/GB0621360D0/en
Priority to GBGB0625838.8A priority patent/GB0625838D0/en
Priority to GB0705410A priority patent/GB0705410D0/en
Priority to GB0705408A priority patent/GB0705408D0/en
Priority to GB0705401A priority patent/GB0705401D0/en
Priority to GB0705398A priority patent/GB0705398D0/en
Priority to GB0705409A priority patent/GB0705409D0/en
Priority to GBGB0705404.2A priority patent/GB0705404D0/en
Priority to GB0705403A priority patent/GB0705403D0/en
Priority to GBGB0705411.7A priority patent/GB0705411D0/en
Priority to GB0705407A priority patent/GB0705407D0/en
Priority to GB0705412A priority patent/GB0705412D0/en
Priority to GB0705406A priority patent/GB0705406D0/en
Priority to GBGB0705402.6A priority patent/GB0705402D0/en
Priority to GB0705399A priority patent/GB0705399D0/en
Priority to GBGB0705405.9A priority patent/GB0705405D0/en
Application filed by Seereal Technologies Sa filed Critical Seereal Technologies Sa
Publication of TW200827771A publication Critical patent/TW200827771A/en
Application granted granted Critical
Publication of TWI421540B publication Critical patent/TWI421540B/en


Description

Holographic display device and method (1)

The present invention relates to a holographic display device for generating a three-dimensional image, in particular a compact device comprising a display on which a computer-generated video hologram is encoded on one or two optically addressed spatial light modulators. The device produces a three-dimensional holographic reconstruction. It is particularly useful in portable and handheld devices, such as mobile phones.

Computer-generated video holograms (CGHs) are encoded on one or more spatial light modulators (SLMs); a spatial light modulator may comprise electronically or optically controllable elements. These elements encode hologram values derived from the video hologram in order to modulate the amplitude and/or phase of the light. The computer-generated video hologram can be calculated, for example, by coherent ray tracing, by simulating the interference between light reflected by the scene and a reference wave, or by a Fourier or Fresnel transform. An ideal spatial light modulator would be capable of representing arbitrary complex values, i.e. of controlling the phase and the amplitude of an incoming light wave independently. However, a typical spatial light modulator controls only one of these properties, amplitude or phase, with the undesirable side effect of also affecting the other. The amplitude or phase of the modulated light can be varied in several ways, for example using an electronically addressed liquid crystal spatial light modulator, an optically addressed liquid crystal spatial light modulator, a magneto-optical spatial light modulator, a micromirror device, or an acousto-optic modulator. The modulation of the light can be spatially continuous or composed of individually addressable elements, and can be one- or two-dimensional, binary, multi-level, or continuous.
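As an illustrative sketch of the Fresnel-transform calculation mentioned above (all parameter values here are hypothetical, chosen for illustration and not taken from this document), the complex field that a single object point produces in the hologram plane can be computed with a paraxial Fresnel kernel:

```python
import numpy as np

# Hypothetical SLM geometry: 512 x 512 cells at 10 µm pitch
N, pitch = 512, 10e-6
wavelength = 532e-9           # assumed green source
z = 0.1                       # object point 10 cm from the SLM plane

x = (np.arange(N) - N // 2) * pitch
X, Y = np.meshgrid(x, x)

# Paraxial Fresnel kernel: field of a point source at (0, 0, z)
k = 2 * np.pi / wavelength
field = np.exp(1j * k * (X**2 + Y**2) / (2 * z))

# A phase-only SLM would encode the phase of this field;
# an amplitude-only SLM would encode a biased real part instead.
phase_hologram = np.angle(field)
```

A hologram of a full scene would coherently superpose such point contributions over all object points before encoding the result on the SLM.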

In the present invention, the term "encoding" means supplying a spatial light modulator with control values of the hologram such that a three-dimensional scene can be reconstructed from the spatial light modulator. Accordingly, "the spatial light modulator encodes the hologram" means that the hologram is encoded on the spatial light modulator.

In contrast to a purely autostereoscopic display, with a video hologram the observer sees an optical reconstruction of the wavefront of a three-dimensional scene. The three-dimensional scene is reconstructed in a space that extends between the observer's eyes and the spatial light modulator, or even behind the spatial light modulator. The spatial light modulator can also be encoded with video holograms such that the observer sees objects of the reconstructed three-dimensional scene in front of the spatial light modulator and other objects on or behind it.

The cells of the spatial light modulator are preferably transmissive and are illuminated with light capable of generating interference at least at a defined position and over a spatial coherence length of a few millimetres. This enables holographic reconstruction with adequate resolution in at least one dimension. This kind of light will be referred to as "sufficiently coherent light".

To ensure sufficient temporal coherence, the spectrum of the light emitted by the light source must be limited to an adequately narrow wavelength range, i.e. it must be near-monochromatic. The spectral bandwidth of high-brightness light-emitting diodes (LEDs) is sufficiently narrow to ensure temporal coherence for holographic reconstruction. The diffraction angle at the spatial light modulator is proportional to the wavelength, which means that only a monochromatic source will lead to a sharp reconstruction of object points. A broadened spectrum leads to broadened object points and a smeared reconstruction. The spectrum of a laser source can be regarded as monochromatic. The spectral linewidth of an LED is sufficiently narrow to facilitate good reconstruction.
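The wavelength dependence described above can be made concrete with the grating equation; the pixel pitch and LED linewidth used below are hypothetical illustration values, not figures from this document:

```python
import math

pitch = 10e-6  # assumed SLM pixel pitch (10 µm)

def first_order_angle_deg(wavelength):
    # Grating equation, first diffraction order: sin(theta) = lambda / pitch
    return math.degrees(math.asin(wavelength / pitch))

# A laser line at 532 nm diffracts to a single, well-defined angle:
theta_laser = first_order_angle_deg(532e-9)   # roughly 3 degrees

# An LED with ~20 nm bandwidth spreads each object point over a range
# of angles, which blurs the reconstruction:
spread = first_order_angle_deg(542e-9) - first_order_angle_deg(522e-9)
```

The angular spread of the LED is small compared with the diffraction angle itself, which is why the text above regards an LED linewidth as narrow enough for useful reconstruction.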

Spatial coherence relates to the lateral extent of the light source. Conventional light sources, such as LEDs or cold cathode fluorescent lamps (CCFLs), can also meet these requirements if they radiate light through an adequately narrow aperture. Light from a laser source can be regarded as emanating from a point source within diffraction limits and, depending on the modal purity, leads to a sharp reconstruction of the object, i.e. each object point is reconstructed as a diffraction-limited point.

Light from a spatially incoherent source is laterally extended and causes a smearing of the reconstructed object. The amount of smearing is given by the broadened size of an object point reconstructed at a given position. In order to use a spatially incoherent light source for hologram reconstruction, a trade-off has to be found between brightness and the limitation of the lateral extent of the source with an aperture. The smaller the light source, the better its spatial coherence.

A line light source can be considered to be a point source if it is viewed at a right angle to its longitudinal extension. Light waves can thus propagate coherently in that direction, but incoherently in all other directions.

In general, a hologram reconstructs a scene holographically by coherent superposition of waves in the horizontal and the vertical directions. Such a video hologram is called a full-parallax hologram. The reconstructed object exhibits motion parallax in the horizontal and the vertical directions, like a real object. However, a large viewing angle requires high resolution of the spatial light modulator in both the horizontal and the vertical direction.

Often, the requirements on the spatial light modulator are lessened by restriction to a horizontal-parallax-only (HPO) hologram. The holographic reconstruction takes place only in the horizontal direction, whereas there is no holographic reconstruction in the vertical direction. This results in a reconstructed object with horizontal motion parallax. The perspective view does not change upon vertical motion. An HPO hologram requires a spatial light modulator with less vertical resolution than a full-parallax hologram. A vertical-parallax-only (VPO) hologram is also possible but uncommon. The holographic reconstruction occurs only in the vertical direction and results in a reconstructed object with vertical motion parallax. There is no motion parallax in the horizontal direction. The different perspective views for the left and the right eye have to be created separately.

Discussion of related art

Devices for generating three-dimensional images are typically not compact, requiring complex and bulky optical systems that make them unusable in portable devices or in handheld devices such as mobile phones. In US 4,208,086, for example, the length of the device used to generate a large three-dimensional image is of the order of metres. In WO 2004/044659 (US 2006/0055994), the device for reconstructing a three-dimensional video image has a thickness of more than 10 cm. The conventional devices described above are therefore much too thick for mobile phones or other portable, handheld or small display devices.

A device for reconstructing three-dimensional scenes with sufficiently coherent light is described in WO 2004/044659 (US 2006/0055994); the device comprises a point or line light source, a lens for focusing the light, and a spatial light modulator. In contrast to conventional holographic displays, the spatial light modulator, operating in transmission mode, reconstructs the three-dimensional scene in at least one "virtual observer window" (for a description of the virtual observer window and related technology, see Appendices I and II). Each virtual observer window is situated near the observer's eyes and is limited in size so that it lies within a single diffraction order; each eye thus sees the complete reconstruction of the three-dimensional scene within a frustum-shaped reconstruction space that extends between the surface of the spatial light modulator and the virtual observer window. For an undisturbed holographic reconstruction, the size of a virtual observer window must not exceed the periodic interval of one diffraction order of the reconstruction. However, it must be at least large enough for the observer to see the entire reconstruction of the three-dimensional scene through the window. The other eye can look through the same virtual observer window, or through a second virtual observer window created by a second light source. Here, the typically large visible region is limited to the locally positioned virtual observer windows. This solution uses the reduced size of the virtual observer window to reconstruct, with a conventional high-resolution spatial light modulator surface, what would otherwise require a much larger area. For geometric reasons this results in smaller diffraction angles, and in data volumes small enough that consumer-level computing hardware suffices for high-quality real-time holographic reconstruction.
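The constraint that the virtual observer window must fit within one diffraction order can be sketched numerically: the period of the diffraction orders in the observer plane is λd/p for wavelength λ, observer distance d, and pixel pitch p. The figures below are hypothetical, chosen only for illustration:

```python
def order_period(wavelength, distance, pixel_pitch):
    """Spacing of the diffraction orders in the observer plane;
    the virtual observer window must not exceed this width."""
    return wavelength * distance / pixel_pitch

# Hypothetical figures: 532 nm light, observer 0.5 m away, 10 µm pitch
w = order_period(532e-9, 0.5, 10e-6)   # about 0.0266 m, i.e. ~26.6 mm
# Comfortably wider than an eye pupil, so one eye can see the complete
# reconstruction through a single diffraction order.
```

This also shows the design trade-off the passage describes: a finer pixel pitch widens the observer window but raises the resolution and data-rate demands on the spatial light modulator.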

However, this known approach to generating three-dimensional images has the disadvantage that, owing to the large surface area of the spatial light modulator, a large-volume, heavy and expensive lens is required for focusing; the device therefore has considerable thickness and weight. A further disadvantage is that with such a large lens, chromatic aberration towards the edges would seriously degrade the quality of the reconstruction. An improved illumination approach comprising a lenticular array is described in US 2006/250671, although there it is applied to large-area video holograms.

A mobile phone that produces a three-dimensional image is described in US 2004/0223049. However, the three-dimensional image is produced using an autostereoscopic display. One problem with using an autostereoscopic display to produce a three-dimensional image is that the viewer typically perceives the image to be inside the display, while the viewer's eyes tend to focus on the surface of the display. In many cases, the disparity between the focus of the viewer's eyes and the perceived position of the three-dimensional image causes viewer discomfort. Where holographic techniques are used to generate the three-dimensional image, these problems do not occur or are greatly reduced.

In a first aspect, a holographic display device is provided comprising an organic light emitting diode (OLED) array which writes onto an optically addressed spatial light modulator; the OLED array and the optically addressed spatial light modulator form adjacent layers, and the optically addressed spatial light modulator encodes a video hologram. When a read beam illuminates the optically addressed spatial light modulator, and the optically addressed spatial light modulator is suitably controlled via the OLED array, a holographic reconstruction is produced by the device. The OLED array and the optically addressed spatial light modulator can form adjacent layers facing each other, with no intermediate imaging optics between them. The OLED array and the optically addressed spatial light modulator can be fixed and in direct physical contact with each other, or fixed and in indirect physical contact with each other. The OLED array and the optically addressed spatial light modulator can be physically connected by an isolating layer. The isolating layer can be an angular filter, such as a Bragg filter.

In one embodiment, an array of infrared OLEDs is provided on a substrate that is transparent to visible light, the array of infrared OLEDs being adjacent to the optically addressed spatial light modulator. The infrared light enables control of the amplitude or the phase, or of a combination of amplitude and phase, of visible light transmitted by the optically addressed spatial light modulator.

In such an array, the infrared OLEDs enable control of the amplitude or the phase, or of a combination of amplitude and phase, of visible light transmitted by the optically addressed spatial light modulator. The OLED array and the optically addressed spatial light modulator are placed in close proximity such that they form a compact pair. The compact pair of OLED array and optically addressed spatial light modulator acts on the visible light such that a hologram can be generated in the optically addressed spatial light modulator. The three-dimensional image can then be viewed by an observer positioned at some distance from the compact pair.

The OLED array can emit at a wavelength that is not a display primary colour, and the read wavelength can be one or more of red, green and blue (RGB). The OLED array can emit in the infrared and write onto an infrared-sensitive layer of the optically addressed spatial light modulator. The OLED array and optically addressed spatial light modulator layers can be reflective, with visible light reflected from these layers towards the observer. The OLED array may be composed of multiple smaller tiled OLED arrays. The optically addressed spatial light modulator can comprise a liquid crystal material. The optically addressed spatial light modulator can include a photosensitive dye as the photosensitive layer.

The display can be illuminated with a backlight and a microlens array. The microlens array provides local coherence over small regions of the display, each such region being the only part of the display that encodes the information used for a given point in the reconstructed object. The display can include a reflective polarizer. The display can include a prismatic optical film.

The optically addressed spatial light modulator can be arranged as a Freedericksz cell to provide phase control. The holographic reconstruction can be viewed through a virtual observer window. The virtual observer windows can be laid out using spatial or temporal multiplexing. The display can be operable to encode a hologram for the observer's left eye followed, in time sequence, by a re-encoded hologram for the right eye.

The display can produce a holographic reconstruction for viewing by a single user.

The display can have a light-emitting diode as its light source.

The display can produce a two-dimensional image that is focused on a screen without any projection lens and, in the optical far field, independent of the distance of the screen from the device.

The display device can use a beam splitter to transmit a holographic image to each eye.

The optically addressed spatial light modulator can be placed within 30 mm of the light source, and placed in a portable casing.

The display device can perform virtual observer window tracking using a beam steering element, wherein the beam steering element consists of liquid crystal domains within an isotropic host material, wherein the interface between the domains and the matrix has the shape of prisms, sections of spheres, or sections of cylinders, and the orientation of the liquid crystal is controlled by an applied electric field so as to change the local refractive or diffractive properties of the beam steering element.

The display device enables the optically addressed spatial light modulator, the light sources, and a lens array arranged over the light sources all to be placed in a portable casing, in which the light sources are magnified 10 to 60 times by the lens array.

The OLED array and the optically addressed spatial light modulator layers can be transparent, and the visible read light can pass through these layers to the observer.

The optically addressed spatial light modulator can be sensitive to the write wavelength emitted by the OLED array, but insensitive to the read wavelength.

The OLED array can emit yellow light, and the read wavelength can be one or more of red, green and blue.

The optically addressed spatial light modulator can be continuous.

An optically addressed spatial light modulator can be composed of multiple smaller tiled optically addressed spatial light modulators.

The display device can encode a hologram and can enable a holographic reconstruction to be produced.

The display can be arranged such that the holographic reconstruction is seen correctly only in a plane, near the observer's eyes, in which an image of the light source is formed.

The display device enables the size of the reconstructed three-dimensional scene to be a function of the size of the hologram-bearing medium, and the reconstructed three-dimensional scene can lie anywhere within the volume defined by the hologram-bearing medium and the virtual observer window from which the reconstructed three-dimensional scene is viewable.

The display device can cause the display to encode a hologram comprising a region bearing information for reconstructing a single point of the three-dimensional scene, the point being viewable from a defined viewing position, where this region (a) encodes information for only that single point of the reconstructed scene, (b) is the only region of the hologram encoding information for that point, and (c) is limited in size, forming part of the total hologram, such that multiple reconstructions of the point by higher diffraction orders are not visible from the defined viewing position.
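The size of such a single-point region follows from simple projection geometry: the cone from the virtual observer window through the scene point is intersected with the hologram plane. This sketch uses hypothetical distances, not values from this document:

```python
def subhologram_width(window_width, observer_dist, point_dist):
    # Similar triangles: a point at point_dist in front of the hologram,
    # viewed through a window of window_width at observer_dist from the
    # hologram, projects onto a hologram region of this width.
    return window_width * point_dist / (observer_dist - point_dist)

# Hypothetical: 10 mm observer window at 0.5 m, point 5 cm in front
a = subhologram_width(0.01, 0.5, 0.05)   # roughly 1.1 mm
# Only this small region of the hologram carries the point's information,
# so the encoded fringe pattern for that point stays locally confined.
```

Points closer to the hologram plane produce smaller regions, which is consistent with limiting the region size so that higher-order reconstructions of the point fall outside the defined viewing position.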

The display can encode a hologram calculated by determining the wavefront that a real version of the reconstructed object would generate at the position of the observer's eyes.

The display can be such that the reconstruction is a Fresnel transform of the hologram rather than a Fourier transform of the hologram.

In another aspect, a holographic display device is provided comprising an OLED array which writes onto a pair of optically addressed spatial light modulators; the OLED array and the optically addressed spatial light modulators form adjacent layers, and the pair of optically addressed spatial light modulators encodes a hologram. When a read beam illuminates the pair of optically addressed spatial light modulators, and the pair is suitably controlled via the OLED array, a holographic reconstruction is produced by the device. The OLED array can emit at two different wavelengths, one for writing to and controlling one optically addressed spatial light modulator for phase modulation, and the other for writing to and controlling the other optically addressed spatial light modulator for amplitude modulation. The OLED array can be composed of two kinds of OLEDs emitting at different wavelengths. Time multiplexing between the two emission wavelengths of the OLED array can be used to facilitate independent control of the two optically addressed spatial light modulators.

In another aspect, a method of generating a holographic reconstruction is provided, comprising the step of using a display device as described herein.

In another aspect, a method of fabricating a display device is provided, comprising the steps of taking a glass substrate, printing or otherwise producing an OLED array on the substrate, and subsequently producing an optically addressed spatial light modulator layer. The method allows the insulating layer between the OLED array and the optically addressed spatial light modulator to be a sputtered or otherwise deposited coating with a thickness of less than 10 microns. The method allows the OLED array and the optically addressed spatial light modulator layer both to be printed or produced as distinct steps within a single manufacturing process.


A. Close combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator

This embodiment provides a close combination of an optically addressed spatial light modulator with an infrared-emitting display that can write onto the optically addressed spatial light modulator; such a combination is capable of producing a three-dimensional image under appropriate illumination conditions.

An optically addressed spatial light modulator comprises a photosensor layer and a liquid crystal (LC) layer positioned between conductive electrodes. When a voltage is applied to the electrodes, the light pattern incident on the photosensor layer is transferred to the liquid crystal layer, which modulates the read beam. In conventional techniques, the incident light pattern is provided by a write beam modulated by an electronically addressed spatial light modulator (EASLM). The electronically addressed spatial light modulator is illuminated by a light source and imaged onto the optically addressed spatial light modulator. Typically, the write beam is incoherent to avoid speckle patterns, while the read beam is coherent so as to be able to produce a diffraction pattern.

An advantage of an optically addressed spatial light modulator over an electronically addressed spatial light modulator is that the optically addressed spatial light modulator can have a continuous, non-pixellated, non-patterned structure, whereas the electronically addressed spatial light modulator has a pixellated structure. Pixels produce sharp edges in the spatial distribution of the light: such sharp edges correspond to high spatial frequencies.

High spatial frequencies lead to wide-angle diffraction in the optical far field. Electronically addressed spatial light modulators therefore produce diffraction artefacts in the optical far field that are undesirable and must be removed using known techniques such as spatial filtering. Spatial filtering requires additional steps in the optical processing, which may make the device thicker and waste light. An advantage of devices of the optically addressed spatial light modulator type is that they permit the generation of continuous patterns in the optically addressed spatial light modulator. A continuous pattern has less sharp changes in light intensity in any given direction normal to the direction of beam propagation. Less sharp changes correspond to a lower concentration of high spatial frequencies than the pixel edges produced by an electronically addressed spatial light modulator device. In devices comprising an optically addressed spatial light modulator, this lower concentration of high spatial frequencies facilitates optical processing and makes it more efficient than in devices comprising electronically addressed spatial light modulators. Furthermore, in contrast to an electronically addressed spatial light modulator, an optically addressed spatial light modulator device can be bistable. An optically addressed spatial light modulator can therefore have lower power requirements than an electronically addressed spatial light modulator device, which can extend the battery life of a portable or handheld device.

In this embodiment, a compact device is described that does not require imaging optics. The optically addressed spatial light modulator is written to using an infrared OLED display. The OLED display is in direct contact with the optically addressed spatial light modulator, forming a compact device without imaging optics. The OLEDs may be tiled to form the OLED array. The optically addressed spatial light modulator can be composed of multiple smaller tiled optically addressed spatial light modulators.

The close combination of the OLED display and the optically addressed spatial light modulator can be transparent. Transparent OLED displays are known, for example as described in the "Organic Light Emitting Diode Materials" section. In one example, the close combination of the OLED display and the optically addressed spatial light modulator is illuminated so as to form the three-dimensional image, the visible light being transmitted through the OLED display and the optically addressed spatial light modulator to the observer. Preferably, the OLED display emits infrared light, which writes onto the infrared-sensitive photosensor layer of the optically addressed spatial light modulator. Because the human eye is insensitive to infrared light, the observer does not perceive any light originating from the infrared write beam.

In another example, the close combination of the OLED display and the optically addressed spatial light modulator has the write beam and the read beam incident on opposite sides of the optically addressed spatial light modulator. In a further example, a reflective layer is provided on the side of the optically addressed spatial light modulator opposite the OLED display, so that the three-dimensional image is viewed from the same side of the optically addressed spatial light modulator as the OLED display, with the illumination source also on the same side of the optically addressed spatial light modulator as the OLED display: this is an example of a reflective display.

In the embodiment comprising an infrared OLED array, the infrared-emitting OLEDs enable control of the amplitude, the phase, or a combination of amplitude and phase, of visible light transmitted by the optically addressed spatial light modulator, such that a hologram is generated in the optically addressed spatial light modulator. The optically addressed spatial light modulator can comprise a pair of transparent plates coated with two electrically conductive films, as described in US 4,941,735. A continuous or discontinuous photosensitive film can be applied to one of the conductive films.

A bistable ferroelectric liquid crystal, or some other kind of liquid crystal, may be confined between the other conductive film and the photosensitive film. An activation voltage can be applied across the conductive films. In the optically addressed spatial light modulator, the optical write beam can be programmed, pixel by pixel, to activate rotation of the polarization of the optical read beam. The write beam programs the optically addressed spatial light modulator by activating individual photosensitive regions of the optically addressed spatial light modulator. The optically addressed spatial light modulator, so programmed by activation with the write beam, rotates the polarization of the read beam.

Figure 1 depicts an embodiment. 10 is a lighting device for providing illumination of a planar area, wherein the illumination is sufficiently homophonic to enable generation of a three-dimensional image. in US 2006/250671 refers to an example of a lighting device for a large area image hologram, an example of which is shown in Figure 4. A device like 10 may be in the form of an array of white light sources, such as a cold cathode fluorescent lamp or a white light emitting diode that emits light incident on a focusing system, wherein the focusing system may be compact, such as a lenticular array or a microlens array. . Alternatively, the light source for 10 may be comprised of red, green, and blue lasers, or red, green, and blue light emitting diodes that emit sufficient tonal light. However, a non-laser light source (for example, a light-emitting diode, an organic light-emitting diode, a cold cathode fluorescent lamp) having a sufficient spatial coherence is preferable. Disadvantages of laser sources, such as laser spots on holographic reconstruction, are relatively expensive, and all possible safety issues with respect to the holographic display of the viewer or the eyes of a holographic display assembly worker. The thickness of elements 10-13 can all be on the order of a few centimeters or less. Element 11 can be a color filter array such that pixels of colored light (e.g., red, green, and blue light) are directed toward element 12, although a color filter is not required if a colored light source is used. Element 12 is an array of infrared organic light emitting diodes on a transparent substrate. The infrared organic light emitting diode array will cause each of the infrared organic light emitting diodes to emit light in the direction of the element 13 in parallel and conform to the light emitted from the unique corresponding color pixel. Element 13 is an optically addressed spatial light modulator. 
With regard to the optically addressed spatial light modulator, the infrared organic light emitting diode array provides the write beam; the colored beam emitted by element 11 is the read beam. A viewer located at a distance 14 from the device comprising the compact hologram generator 15 can view a three-dimensional image from the direction of 15. Elements 10, 11, 12, and 13 are configured to be physically connected, each forming a layer of the structure, making the whole a single, unified object. The physical connection can be direct, or indirect if a thin intermediate layer or film lies between adjacent layers. The physical connection can be limited to small areas that ensure correct mutual alignment, or can extend to larger areas, even the entire surface of the layer. The physical connection can be achieved by layer-to-layer bonding, for example by using an optically transmissive adhesive, to form the compact hologram generator 15, or by any other means (refer to the outline manufacturing procedure section).

Element 10 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example, as described in US 5,056,892 and US 5,919,551. Element 10 can comprise a polarizing element or a collection of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linearly polarized state and reflects the orthogonal linearly polarized state - such sheets are known, for example, as described in US 5,828,488. Another example is a reflective polarizer that transmits one circularly polarized state and reflects the orthogonal circularly polarized state - such sheets are known, for example, as described in US 6,181,395. Element 10 can include a focusing system that can be compact, such as a lenticular array or a microlens array. Element 10 can include other optical components known in the art of backlighting.

Figure 4 is a side view of a prior-art arrangement showing three focusing elements 1101, 1102, 1103 of the vertical focusing system 1104, in the form of cylindrical lenses arranged horizontally in an array; reference is made to WO 2006/119920. As an example, the nearly collimated beam from the horizontal line source LS2 passes through the focusing element 1102 of the illumination unit to the observer plane OP. According to Figure 4, a number of line sources LS1, LS2, LS3 are arranged one above the other. The light emitted by each light source is spatially coherent in the vertical direction and spatially incoherent in the horizontal direction. This light passes through the transmissive elements of the spatial light modulator SLM. The light is diffracted only in the vertical direction, because of the components of the hologram encoded on the spatial light modulator SLM. The focusing element 1102 images the light source LS2 into the observer plane OP in a number of diffraction orders, only one of which is useful. The light beam emitted by the light source LS2 and passing through the focusing element 1102 serves as an example for the whole focusing system 1104. In Figure 4, the three beams shown represent the first diffraction order 1105, the zeroth diffraction order 1106, and the minus-first diffraction order 1107. Compared with a single point source, line sources allow very high light intensities to be produced. The effective light intensity can be enhanced by using a plurality of hologram regions, aligned with the line sources, for each portion of the reconstructed three-dimensional scene, which increases efficiency. Another advantage is that, without the use of a laser, multiple sources (e.g., behind slits that can be part of a shutter) can produce light of sufficient coherence.
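As an illustrative aside (not part of the patent text), the angular positions of the diffraction orders 1105-1107 follow the standard grating equation sin(theta) = m*lambda/d, where d is the vertical pixel pitch of the spatial light modulator. A minimal sketch, with a hypothetical wavelength and pitch:

```python
import math

def diffraction_angle_deg(order, wavelength_m, pixel_pitch_m):
    """Angle of diffraction order m for a vertically diffracting SLM,
    from the grating equation sin(theta) = m * lambda / d."""
    s = order * wavelength_m / pixel_pitch_m
    if abs(s) > 1.0:
        raise ValueError("order is evanescent for this pitch and wavelength")
    return math.degrees(math.asin(s))

# Hypothetical values: green light, 10 micrometre vertical pixel pitch.
wavelength = 532e-9
pitch = 10e-6
for m in (1, 0, -1):  # first, zeroth and minus-first orders (cf. 1105-1107)
    print(m, round(diffraction_angle_deg(m, wavelength, pitch), 3))
```

The zeroth order is undeflected and the plus/minus first orders are symmetric about it, which is why only one order is useful for imaging the line source into the observer plane.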

B. Compact combination of two pairs, each comprising an organic light emitting diode array and an optically addressed spatial light modulator.

In still further embodiments, a compact combination of two pairs, each comprising an organic light emitting diode array and an optically addressed spatial light modulator, can be used to modulate amplitude and phase successively and compactly. A complex number consisting of amplitude and phase can therefore be encoded, pixel by pixel, in the transmitted light.

This embodiment comprises a first compact combination of an infrared organic light emitting diode array paired with an optically addressed spatial light modulator, and a second such compact combination of an infrared organic light emitting diode array paired with an optically addressed spatial light modulator.

The first pair modulates the amplitude of the transmitted light, and the second pair modulates the phase of the transmitted light. Alternatively, the first pair may modulate the phase of the transmitted light while the second pair modulates the amplitude. Each compact combination of an infrared organic light emitting diode array and an optically addressed spatial light modulator can be as described in Section A. The two compact pairs are separated by an infrared filter that absorbs infrared light without affecting visible light.
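The combined action of the amplitude-modulating pair and the phase-modulating pair on a unit-amplitude read beam can be written as a complex transmittance t = A*exp(i*phi). The following minimal Python sketch is illustrative only (the function name and values are not from the patent):

```python
import cmath

def complex_transmittance(amplitude, phase_rad):
    """Net per-pixel effect on the read beam of an amplitude-modulating
    OASLM pair followed by a phase-modulating OASLM pair:
    t = A * exp(i * phi)."""
    if not 0.0 <= amplitude <= 1.0:
        raise ValueError("amplitude modulation must lie between 0 and 1")
    return amplitude * cmath.exp(1j * phase_rad)

# A pixel set to half amplitude and a quarter-wave (pi/2) phase shift
# yields the complex value 0.5j:
t = complex_transmittance(0.5, cmath.pi / 2)
```

Encoding such complex values pixel by pixel is exactly what permits the hologram to shape both the magnitude and the phase of the reconstructed wavefront.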

In a first step, the first infrared organic light emitting diode array is patterned to provide amplitude modulation in the first optically addressed spatial light modulator. In a second step, the second infrared organic light emitting diode array is patterned to provide phase modulation in the second optically addressed spatial light modulator. The infrared filter blocks leakage of infrared light from the first compact pair of infrared organic light emitting diode array and optically addressed spatial light modulator into the second compact pair. The infrared filter likewise prevents infrared light from the second compact pair of infrared organic light emitting diode array and optically addressed spatial light modulator from leaking into the first compact pair. However, the infrared filter transmits visible light from the first compact pair to serve as the read beam in the second compact pair of infrared organic light emitting diode array and optically addressed spatial light modulator. The light transmitted by the second optically addressed spatial light modulator has been modulated in both amplitude and phase, so that when a viewer views the light emitted by the device comprising the two compact pairs, the observer can observe a three-dimensional image.

Since the combined phase and amplitude modulation enables the representation of complex values, and both the organic light emitting diode display and the optically addressed spatial light modulator have high resolution, this embodiment can be applied to generate a hologram such that the viewer sees a three-dimensional image.

In Figure 2, an example of an embodiment is shown. 20 is a lighting device for providing illumination of a planar area, where the illumination is sufficiently coherent to produce a three-dimensional image. An example of a lighting device for a large-area video hologram is provided in US 2006/250671. This type of device can take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, that emit light incident on a focusing system, wherein the focusing system can be compact, such as a lenticular array or a microlens array. Alternatively, the light source for 20 may consist of red, green, and blue lasers, or of red, green, and blue light emitting diodes that emit light of sufficient coherence. However, a non-laser light source of sufficient spatial coherence (for example, a light emitting diode, an organic light emitting diode, or a cold cathode fluorescent lamp) is preferable. Laser sources have disadvantages, such as laser speckle in the holographic reconstruction, relatively high cost, and possible safety issues with respect to the eyes of viewers of the holographic display or of workers assembling it.

The thickness of elements 20-23 and 26-28 can all be on the order of a few centimeters or less. Element 21 may comprise a color filter array such that pixels of colored light (e.g., red, green, and blue light) are directed toward element 22, although a color filter is not required if a colored light source is used. Element 22 is an array of infrared organic light emitting diodes on a transparent substrate. The infrared organic light emitting diode array is arranged so that the light emitted by each infrared organic light emitting diode in the direction of element 23 is parallel to, and registered with, the light emitted from its unique corresponding color pixel. Element 23 is an optically addressed spatial light modulator. With regard to this optically addressed spatial light modulator, the infrared organic light emitting diode array provides the write beam; the colored beam emitted by element 21 is the read beam. Element 26 is an infrared filter that transmits only visible light and blocks infrared light, so that the infrared light emitted by element 22 does not affect element 27. Element 27 is an optically addressed spatial light modulator. Element 28 is an array of infrared organic light emitting diodes on a transparent substrate. The infrared organic light emitting diode array is arranged so that the light emitted by each infrared organic light emitting diode in the direction of element 27 is parallel to, and registered with, the light emitted from its unique corresponding color pixel. With regard to the optically addressed spatial light modulator 27, the infrared organic light emitting diode array 28 provides the write beam; the colored beam transmitted by element 26 is the read beam. With respect to the transmitted light, element 23 modulates the amplitude and element 27 modulates the phase. Alternatively, element 27 may modulate the amplitude and element 23 the phase.
Since the light from the infrared organic light emitting diode array on the transparent substrate 28 is emitted in the direction of element 26, element 26 can absorb this infrared light, preventing the light of element 28 from addressing the optically addressed spatial light modulator 23. With such an arrangement, the light emitted by the two organic light emitting diode arrays 22 and 28 travels in substantially opposite directions, ensuring that the two optically addressed spatial light modulators 23 and 27 can be placed in close proximity. Bringing the optically addressed spatial light modulators 23 and 27 into close proximity reduces the problems of pixel loss and pixel crosstalk caused by beam divergence: when the optically addressed spatial light modulators 23 and 27 are in close proximity, a good approximation to non-overlapping propagation of the colored light beams through the optically addressed spatial light modulators is achieved. The order of elements 27 and 28 of Figure 2 can be reversed, but this is not considered an ideal arrangement for achieving the goals of low crosstalk between the colored light beams and high transmission through the optically addressed spatial light modulators 23 and 27.

Element 20 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example, as described in US 5,056,892 and US 5,919,551. Element 20 can comprise a polarizing element or a collection of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linearly polarized state and reflects the orthogonal linearly polarized state - such sheets are known, for example, as described in US 5,828,488. Another example is a reflective polarizer that transmits one circularly polarized state and reflects the orthogonal circularly polarized state - such sheets are known, for example, as described in US 6,181,395. Element 20 can include a focusing system that can be compact, such as a lenticular array or a microlens array. Element 20 can include other optical components known in the art of backlighting.

A viewer located at point 24, some distance from the device comprising the compact hologram generator 25, can view the three-dimensional image from the direction of 25. Elements 20, 21, 22, 23, 26, 27, and 28 are configured to be physically connected, each forming a layer of the structure, such that the whole is a single, unified object. The physical connection can be direct, or indirect if a thin intermediate layer or film lies between adjacent layers. The physical connection can be limited to small areas that ensure correct mutual alignment, or can extend to larger areas, even the entire surface of the layer. The physical connection can be achieved by layer-to-layer bonding, for example by using an optically transmissive adhesive, to form the compact hologram generator 25, or by any other means (refer to the outline manufacturing procedure section).

In Figure 2, the light emitted by the organic light emitting diode arrays 22 and 28 is shown as ideally collimated. However, the light emitted by a real organic light emitting diode may be uncollimated, for example Lambertian (completely diffuse). When the light emission of the organic light emitting diode is not well collimated, the organic light emitting diode can be placed as close as possible to the corresponding optically addressed spatial light modulator. In such a case, the intensity incident on the surface of the optically addressed spatial light modulator will vary approximately as the square of the cosine of the angle of incidence. Light incident at 45° or 60° will have an intensity of only one half or one quarter, respectively, of that of normally incident light. Therefore, if the organic light emitting diodes are spaced apart, and the visible-light pixels are sufficiently small and sufficiently close to the optically addressed spatial light modulator, this geometric effect will produce a significant variation in the generated potential difference across the optically addressed spatial light modulator, even where the light emission of the organic light emitting diode is Lambertian. The intensity of the incident infrared light may not fall to zero between the points of the optically addressed spatial light modulator on which the light of the organic light emitting diodes is normally incident, which may reduce the achievable contrast of the device. However, if the device structure can thereby be simplified, the reduced contrast may be acceptable.
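The cos-squared falloff stated above can be sketched numerically. The pixel pitch and OLED-to-modulator gap below are hypothetical values chosen only to illustrate the geometry, not dimensions from the patent:

```python
import math

def incident_intensity(lateral_offset_m, gap_m):
    """Relative write-beam intensity on the OASLM surface at a lateral
    offset from the point directly above an OLED pixel, for an
    OLED-to-OASLM gap of gap_m, using the approximate cos^2 falloff
    described in the text."""
    angle = math.atan2(lateral_offset_m, gap_m)
    return math.cos(angle) ** 2

# Hypothetical 20 um pixel pitch with a 10 um gap: midway between two
# pixels the offset is 10 um, i.e. a 45 degree incidence angle, so the
# intensity drops to about half of the on-axis value.
print(round(incident_intensity(10e-6, 10e-6), 2))
```

This is why a small gap relative to the pixel pitch still yields a usable intensity (and hence potential-difference) contrast even with Lambertian emission.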

In Figure 2, the light emitted by the organic light emitting diode arrays 22 and 28 is shown as ideally collimated. However, the light emitted by a real organic light emitting diode may be uncollimated, for example Lambertian (completely diffuse). When the light emission of the organic light emitting diode is not collimated, the geometric light distribution of the organic light emitting diode can be corrected using a Bragg filter holographic optical element, such as described in US 5,153,670. A Bragg filter holographic optical element can collimate the light, or collimate it better than would be the case without this component. Figure 8 shows an example of the action of a Bragg filter holographic optical element. In Figure 8, 80 is an array of organic light emitting diodes, 81 is a holographic optical element Bragg filter containing Bragg planes, such as Bragg plane 84, and 82 is an optically addressed spatial light modulator. A single organic light emitting diode 83 in the organic light emitting diode array 80 has an emitted infrared distribution as indicated by 85. The light ray 86 emitted by the organic light emitting diode array 80 is diffracted in the holographic optical element 81 and is then incident approximately perpendicularly on the optically addressed spatial light modulator 82. In this way, the collimation of the infrared light incident on the optically addressed spatial light modulator 82 can be improved.

Another embodiment is shown in Figure 5. 57 is a lighting device for providing illumination of a planar area, where the illumination is sufficiently coherent to produce a three-dimensional image. An example of a lighting device for a large-area video hologram is provided in US 2006/250671. This type of device may take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, that emit light incident on a focusing system 50, wherein the focusing system can be compact, such as a lenticular array or a microlens array. Alternatively, the light source for 57 may consist of red, green, and blue lasers, or of red, green, and blue light emitting diodes that emit light of sufficient coherence. However, a non-laser light source of sufficient spatial coherence (for example, a light emitting diode, an organic light emitting diode, or a cold cathode fluorescent lamp) is preferable. Laser sources have disadvantages, such as laser speckle in the holographic reconstruction, relatively high cost, and possible safety issues with respect to the eyes of viewers of the holographic display or of workers assembling it.

Element 57 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example, as described in US 5,056,892 and US 5,919,551. Element 57 can comprise a polarizing element or a collection of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linearly polarized state and reflects the orthogonal linearly polarized state - such sheets are known, for example, as described in US 5,828,488. Another example is a reflective polarizer that transmits one circularly polarized state and reflects the orthogonal circularly polarized state - such sheets are known, for example, as described in US 6,181,395. Element 57 may comprise other optical elements known in the art of backlighting.

The thickness of the elements 57, 50-54 may all be on the order of a few centimeters or less. Element 51 may comprise a color filter array such that pixels of colored light (e.g., red, green, and blue light) are directed toward element 52, although a color filter is not required if a colored light source is used.

Element 52 is an array of infrared organic light emitting diodes on a transparent substrate. The infrared organic light emitting diode array is arranged so that, for each color pixel, a single pair of two infrared organic light emitting diodes emits light in the direction of element 53, parallel to and registered with the light from their corresponding color pixel. The first infrared organic light emitting diode of the pair emits infrared light of a first wavelength. The second infrared organic light emitting diode emits infrared light of a second wavelength, different from the first wavelength. Element 53 is an optically addressed spatial light modulator. Element 54 is another optically addressed spatial light modulator. With regard to an optically addressed spatial light modulator, the infrared organic light emitting diode array provides the write beam; the colored beam emitted by element 51 is the read beam. The optically addressed spatial light modulator 53 is controlled by the first of the two infrared wavelengths emitted by the organic light emitting diode array 52. The optically addressed spatial light modulator 53 is insensitive to the second of the two infrared wavelengths emitted by the organic light emitting diode array 52, and transmits light of that second wavelength. The optically addressed spatial light modulator 54 is controlled by the second of the two infrared wavelengths emitted by the organic light emitting diode array 52.
The optically addressed spatial light modulator 54 is insensitive to the first of the two infrared wavelengths emitted by the organic light emitting diode array 52. Alternatively, absorption by the optically addressed spatial light modulator 53 may be used to prevent light of the first infrared wavelength from reaching the optically addressed spatial light modulator 54; in that case, an optically addressed spatial light modulator 54 that is insensitive to the first infrared wavelength is not necessarily required in the compact hologram generator 55. Alternatively, a single type of organic light emitting diode emitting two different wavelengths may be used, the relative intensities of the two wavelengths being determined by a parameter such as the voltage across the organic light emitting diode. Emission at the two different wavelengths can also be controlled using time multiplexing.

With respect to the transmitted light, element 53 modulates the amplitude and element 54 modulates the phase. Alternatively, element 54 may modulate the amplitude and element 53 the phase. With such an arrangement, the organic light emitting diode array 52 emits light of two different wavelengths, ensuring that the two optically addressed spatial light modulators 53 and 54 can be placed in close proximity. Bringing the optically addressed spatial light modulators 53 and 54 into close proximity reduces pixel loss and the pixel crosstalk caused by beam divergence: when the optically addressed spatial light modulators 53 and 54 are in close proximity, a good approximation to non-overlapping propagation of the colored light beams through the optically addressed spatial light modulators is achieved.

A viewer located at point 56, some distance from the device comprising the compact hologram generator 55, can view the three-dimensional image from the direction of 55. Elements 57, 50, 51, 52, 53, and 54 are configured to be physically connected, each forming a layer of the structure, such that the whole is a single, unified object. The physical connection can be direct, or indirect if a thin intermediate layer or film lies between adjacent layers. The physical connection can be limited to small areas that ensure correct mutual alignment, or can extend to larger areas, even the entire surface of the layer. The physical connection can be achieved by layer-to-layer bonding, for example by using an optically transmissive adhesive, to form the compact hologram generator 55, or by any other means (refer to the outline manufacturing procedure section).

Where the optically addressed spatial light modulator performs amplitude modulation, in a typical arrangement the incident read beam is linearly polarized by passing it through a linear polarizer. The amplitude modulation is controlled by the rotation of the liquid crystal in the applied electric field, where the electric field is generated via the photosensitive layer, and this rotation affects the polarization state of the light. In such a device, light exiting the optically addressed spatial light modulator passes through another linear polarizer. The intensity is reduced according to the change in the polarization state imparted to the light as it passes through the optically addressed spatial light modulator.
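This amplitude control is, in effect, Malus's law: the output polarizer transmits a fraction cos^2 of the intensity, where the angle is the polarization rotation imparted by the liquid crystal relative to the analyzer's transmission axis. A small illustrative sketch (the function name is our own):

```python
import math

def transmitted_fraction(rotation_deg):
    """Malus's law: fraction of read-beam intensity passing the output
    linear polarizer when the liquid crystal has rotated the polarization
    by rotation_deg away from the analyzer's transmission axis."""
    return math.cos(math.radians(rotation_deg)) ** 2

print(transmitted_fraction(0.0))   # aligned polarization: full transmission
print(transmitted_fraction(90.0))  # crossed polarization: extinction
```

Intermediate rotation angles thus give a continuous grey scale, which is what encodes the amplitude component of the hologram.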

Where the optically addressed spatial light modulator performs phase modulation, in a typical arrangement the incident read beam, unless it is already in a defined linearly polarized state, is linearly polarized by passing it through a linear polarizer. The phase modulation is controlled by an applied electric field, where the electric field is generated via the photosensitive layer, and this field affects the phase state of the light. In one example of phase modulation, a nematic liquid crystal is used whose optical axis direction is spatially fixed but whose birefringence is a function of the applied voltage. In another example of phase modulation, a ferroelectric liquid crystal is used whose birefringence is fixed but whose optical axis direction is controlled by the applied voltage. In either implementation of phase modulation, the output beam has a phase difference with respect to the input beam that is controlled by the applied voltage. One example of a liquid crystal element that can perform phase modulation is a Freedericksz cell arrangement, in which an anti-parallel alignment of a nematic liquid crystal with positive dielectric anisotropy is used, as described in US 5,973,817.
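The voltage-dependent phase delay of such a birefringent cell follows the standard relation delta_phi = 2*pi*delta_n*d/lambda. A hedged numeric sketch; the birefringence, cell thickness, and wavelength below are hypothetical values, not parameters from the patent:

```python
import math

def phase_retardation_rad(delta_n, thickness_m, wavelength_m):
    """Phase retardation of a birefringent liquid crystal layer:
    delta_phi = 2 * pi * delta_n * d / lambda, where delta_n is the
    (voltage-controlled) birefringence and d the cell thickness."""
    return 2.0 * math.pi * delta_n * thickness_m / wavelength_m

# A hypothetical voltage-controlled birefringence of 0.1 over a 5.32 um
# cell at 532 nm gives a full-wave (2*pi) retardation, i.e. the full
# phase range needed for hologram encoding:
phi = phase_retardation_rad(0.1, 5.32e-6, 532e-9)
```

Varying delta_n (nematic case) or the optical axis direction (ferroelectric case) with the photogenerated voltage sweeps this retardation, and hence the output phase, per pixel.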

C. Compact combination of a compact light source and an electronically addressed spatial light modulator.

This embodiment provides a compact combination of an electronically addressed spatial light modulator and a compact light source that is sufficiently coherent, with proper illumination, to generate a three-dimensional image.

In this embodiment, a compact combination of an electronically addressed spatial light modulator and a compact light source, requiring no imaging optics, is described. This embodiment provides a compact combination of a light source or multiple light sources, a focusing means, an electronically addressed spatial light modulator (EASLM), and an optional beam splitter element, which generates a three-dimensional image with proper illumination.

In Figure 11 an embodiment is shown. 110 is an illumination device for providing illumination of a planar area, wherein the illumination is sufficiently coherent to enable generation of a three-dimensional image. An example of a lighting device for a large-area video hologram is mentioned in US 2006/250671, an example of which is shown in Figure 4. A device like 110 can take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, that emit light incident on a focusing system, wherein the focusing system can be compact, such as a lenticular array or a microlens array. Alternatively, the light source for 110 may consist of red, green, and blue lasers, or of red, green, and blue light emitting diodes that emit light of sufficient coherence. The red, green, and blue light emitting diodes can be organic light emitting diodes (OLEDs). However, a non-laser light source of sufficient spatial coherence (for example, a light emitting diode, an organic light emitting diode, or a cold cathode fluorescent lamp) is preferable. Laser sources have disadvantages, such as laser speckle in the holographic reconstruction, relatively high cost, and possible safety issues with respect to the eyes of viewers of the holographic display or of workers assembling it.

Element 110 may have a thickness of about a few centimeters or less. In the preferred embodiment, elements 110-113 will all be less than three centimeters thick so as to provide a compact light source in close combination. Element 111 can be a color filter array such that pixels of colored light (e.g., red, green, and blue light) are directed toward element 112, although a color filter is not required if a colored light source is used. Element 112 is an electronically addressed spatial light modulator. Element 113 is an optional beam splitter element. A viewer located at point 114, some distance from the device comprising the compact hologram generator 115, can view the three-dimensional image from the direction of 115.

Element 110 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example, as described in US 5,056,892 and US 5,919,551. Element 110 can comprise a polarizing element or a collection of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linearly polarized state and reflects the orthogonal linearly polarized state - such sheets are known, for example, as described in US 5,828,488. Another example is a reflective polarizer that transmits one circularly polarized state and reflects the orthogonal circularly polarized state - such sheets are known, for example, as described in US 6,181,395. Element 110 can include other optical components known in the art of backlighting.

An electronically addressed spatial light modulator is a kind of spatial light modulator in which each element of the array can be addressed electronically. Each element performs some effect on the incident light, for example modulating the amplitude of the light it transmits, or modulating the phase of the light it transmits, or modulating a combination of the amplitude and phase of the light it transmits. An example of an electronically addressed spatial light modulator, which performs phase modulation, is provided in US 5,973,817. A liquid crystal electronically addressed spatial light modulator is one example of an electronically addressed spatial light modulator. A magneto-optical electronically addressed spatial light modulator is another example.

Elements 110, 111, 112, and 113 are configured to be physically connected, each forming a layer of the structure, such that the whole is a single, unified object. The physical connection can be direct, or indirect if a thin intermediate layer or film lies between adjacent layers. The physical connection can be limited to small areas that ensure correct mutual alignment, or can extend to larger areas, even the entire surface of the layer. The physical connection can be achieved by layer-to-layer bonding, for example by using an optically transmissive adhesive, to form the compact hologram generator 115, or by any other means (refer to the outline manufacturing procedure section).

Figure 4 is a side view of a prior-art arrangement showing three focusing elements 1101, 1102, 1103 of the vertical focusing system 1104, in the form of cylindrical lenses arranged horizontally in an array. As an example, the nearly collimated beam from the horizontal line source LS2 passes through the focusing element 1102 of the illumination unit to the observer plane OP. According to Figure 4, a number of line sources LS1, LS2, LS3 are arranged one above the other. The light emitted by each light source is spatially coherent in the vertical direction and spatially incoherent in the horizontal direction. This light passes through the transmissive elements of the spatial light modulator SLM. The light is diffracted only in the vertical direction, because of the components of the hologram encoded on the spatial light modulator SLM. The focusing element 1102 images the light source LS2 into the observer plane OP in a number of diffraction orders, only one of which is useful. The light beam emitted by the light source LS2 and passing through the focusing element 1102 serves as an example for the whole focusing system 1104. In Figure 4, the three beams shown represent the first diffraction order 1105, the zeroth order 1106, and the minus-first order 1107. Compared with a single point source, line sources allow very high light intensities to be produced. The effective light intensity can be enhanced by using a plurality of hologram regions, aligned with the line sources, for each portion of the reconstructed three-dimensional scene, which increases efficiency. Another advantage is that, without the use of a laser, multiple sources (e.g., behind slits that can be part of a shutter) can produce light of sufficient coherence.

Typically, the hologram display is used to reconstruct a wavefront in a virtual observer window. The wavefront is the one that a real object would generate if it existed. The observer sees the reconstructed object when one of his eyes is positioned in a virtual observer window; there may be multiple virtual observer windows (VOWs). As shown in Figure 6A, the hologram display consists of the following components: a light source, a lens, a spatial light modulator, and an optional beam splitter.

In order to facilitate a compact combination of a spatial light modulator and a light source that can display holograms, the single light source and single lens of Fig. 6A may be replaced by a light source array and a lens array or lenticular array, respectively, as shown in Fig. 6B. In Figure 6B, the light sources illuminate the spatial light modulator and the lenses image the light sources onto the observer plane. The spatial light modulator encodes the hologram and modulates the incoming wavefront so that the wavefront is reconstructed in the virtual observer window. The optional beam splitter element can be used to create several virtual observer windows, for example one virtual observer window for the left eye and one for the right eye.

If a light source array is used with a lens array or a lenticular array, the light sources in the array must be spaced such that the light passing through all lenses of the lens array or lenticular array coincides at the virtual observer window.
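The spacing condition can be made concrete with a small geometry sketch. The distances and the 1 mm lens pitch below are assumed illustrative values, not figures from the text:

```python
# Geometry sketch (assumed values): for a light source array placed a distance
# u behind a lens array, with the observer plane a distance v in front of it,
# the images of all sources coincide in one virtual observer window only if
# the source pitch slightly exceeds the lens pitch.

def source_pitch(lens_pitch_mm, u_mm, v_mm):
    """Source pitch so that every lens images 'its' source to the same spot.

    A ray from source n through the centre of lens n crosses the observer
    plane at X_n = x_n + v*(x_n - s_n)/u; requiring X_n to be independent of n
    gives p_source = p_lens * (1 + u/v).
    """
    return lens_pitch_mm * (1.0 + u_mm / v_mm)

def image_position(x_lens_mm, s_source_mm, u_mm, v_mm):
    # Where the chief ray (through the lens centre) crosses the observer plane.
    return x_lens_mm + v_mm * (x_lens_mm - s_source_mm) / u_mm

p_lens = 1.0         # mm, assumed lens pitch
u, v = 20.0, 400.0   # mm, assumed source-lens and lens-observer distances
p_src = source_pitch(p_lens, u, v)   # 1.05 mm

# All lenses now image their assigned source to the same observer-plane spot:
spots = [image_position(n * p_lens, n * p_src, u, v) for n in range(-3, 4)]
```

With these assumed distances the source pitch must be 5% larger than the lens pitch; the `spots` list verifies numerically that all seven lenses aim at the same point.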

The device of Figure 6B is suitable for a compact design that can be applied to a compact hologram display. Such a holographic display can be used in mobile applications, for example in a mobile phone or a personal digital assistant. Typically, such a hologram display will have a screen size of one inch or a few inches; the hologram display screen may be as small as about one centimetre. Suitable components are described in detail below.

1) Light source / light source array

A fixed single light source can be used in simple cases. If the observer moves, the observer can be tracked and the display adjusted so that the generated image remains visible to the observer at the new position. Alternatively, if the virtual observer window is not tracked in this way, tracking can be performed using a beam steering element placed after the spatial light modulator.

A configurable light source array can be implemented by a liquid crystal display (LCD) illuminated by a backlight. To generate an array of point or line sources, only the appropriate pixels are switched to the transmissive state. The apertures of these sources must be small enough to provide sufficient spatial coherence for the intended holographic reconstruction. An array of point sources is preferably used with a lens array comprising two-dimensionally arranged lenses; an array of line sources is preferably used with a lenticular array comprising cylindrical lenses arranged in parallel.

It is preferable to use an organic light emitting diode display as the light source array. As a self-emitting device, it offers better compactness and better power efficiency than a liquid crystal display, in which most of the generated light is absorbed by components such as the colour filters or by pixels that are not in the fully transmissive state. However, a liquid crystal display may have an overall price advantage over an organic light emitting diode display, even though the latter provides light more efficiently. When an organic light emitting diode display is used as the light source array, only those pixels needed to create a virtual observer window at the eye positions are switched on. The organic light emitting diode display may have a two-dimensional array of pixels or a one-dimensional array of line sources. The light-emitting area of each point source, or the width of each line source, must be small enough to ensure sufficient spatial coherence for the intended holographic reconstruction. As before, an array of point sources is preferably used with a lens array comprising two-dimensionally arranged lenses, and an array of line sources with a lenticular array comprising cylindrical lenses arranged in parallel.
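The source-size requirement can be estimated with the van Cittert-Zernike rule of thumb, under which the transverse coherence width at the modulator is roughly λu/w for a source of width w at distance u. This rule and all numerical values below are textbook assumptions for illustration, not figures from the patent:

```python
# Rough spatial-coherence estimate (van Cittert-Zernike rule of thumb, with
# assumed values): coherence width at the SLM ~ lambda * u / w.

def coherence_width_um(wavelength_nm, source_width_um, distance_mm):
    lam = wavelength_nm * 1e-9   # wavelength in metres
    w = source_width_um * 1e-6   # source width in metres
    u = distance_mm * 1e-3       # source-to-SLM distance in metres
    return lam * u / w * 1e6     # coherence width in micrometres

# A 20 um wide OLED pixel assumed 20 mm behind the SLM, green light:
cw = coherence_width_um(500, 20, 20)   # ~500 um coherence width
```

Under these assumptions a 20 μm source yields a coherence width of roughly half a millimetre, i.e. comparable to a millimetre-scale lens aperture, which is consistent with the partially coherent behaviour described later in the example section.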

2) Focus method: single lens, lens array or lenticular array

The focusing means images the light source or light sources onto the observer plane. As the spatial light modulator is very close to the focusing means, the Fourier transform of the information encoded in the spatial light modulator appears in the observer plane. The focusing means contains one or several focusing elements. The positions of the spatial light modulator and the focusing means may be exchanged.

For the compact combination of an electronically addressed spatial light modulator with a sufficiently coherent, compact light source, a thin focusing means is necessary: conventional refractive lenses are too thick, so a diffractive or holographic lens is used instead. A diffractive or holographic lens can have the function of a single lens, a lens array, or a lenticular array. Such elements exist, for example the surface-relief holographic products supplied by Physical Optics Corporation, Torrance, CA, USA. Another option is a lens array comprising two-dimensionally arranged lenses, each lens being assigned to one light source of the light source array. A further option is a lenticular array comprising a one-dimensional array of cylindrical lenses, each lens again having a corresponding source in the light source array. As described above, if a light source array is used with a lens array or a lenticular array, the light sources in the array must be spaced such that the light passing through all lenses of the lens array or lenticular array coincides at the virtual observer window.

Light passing through one lens of the lens array or lenticular array is incoherent with the light passing through any other lens. Therefore, the hologram encoded on the spatial light modulator consists of sub-holograms, each sub-hologram corresponding to one lens. The aperture of each lens must be large enough to ensure sufficient resolution of the reconstructed object. A lens with an aperture almost as large as the typical size of a hologram encoding region may be used, as exemplified in US 2006/0055994. That is, the aperture of each lens is one or several millimetres.

3) Spatial light modulator

The hologram is encoded on the spatial light modulator. Typically, a hologram encoding consists of a two-dimensional array of complex numbers. Ideally, therefore, the spatial light modulator should be able to modulate both the amplitude and the phase of the light beam passing through each of its pixels. However, a typical spatial light modulator can modulate only the amplitude or only the phase, and cannot modulate both independently.

An amplitude-modulating spatial light modulator can be used in combination with detour-phase encoding, for example Burckhardt encoding. Its disadvantages are that three pixels are needed to encode one complex number and that the reconstructed object has low brightness.

A phase-modulating spatial light modulator yields a reconstruction with higher brightness. For example, so-called two-phase encoding can be used, in which two pixels encode one complex number.
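The pixel counts of the two encodings just mentioned can be illustrated numerically. The patent gives no formulas; the decompositions below are the standard textbook ones (Burckhardt: three non-negative amplitudes on the cube roots of unity; two-phase: a sum of two unit phasors), shown only as a sketch:

```python
# Sketch of Burckhardt (3 amplitude pixels per complex value) and two-phase
# (2 phase pixels per complex value) encoding of a single complex number.
import cmath
import math

def burckhardt(c):
    """Write c as a0 + a1*exp(2pi i/3) + a2*exp(4pi i/3) with a_k >= 0.

    Using a_k = (2/3)*Re(c * conj(e_k)) reproduces c exactly; since the three
    basis phasors sum to zero, subtracting the minimum keeps the sum unchanged
    while making all coefficients non-negative (minimum exactly zero).
    """
    basis = [cmath.exp(2j * math.pi * k / 3) for k in range(3)]
    a = [(2.0 / 3.0) * (c * b.conjugate()).real for b in basis]
    m = min(a)
    return [x - m for x in a]

def two_phase(c):
    """Write c as exp(i p1) + exp(i p2); requires |c| <= 2."""
    assert abs(c) <= 2.0
    half = math.acos(abs(c) / 2.0)
    arg = cmath.phase(c)
    return arg + half, arg - half

c = 0.7 - 0.4j
a0, a1, a2 = burckhardt(c)
p1, p2 = two_phase(c)
# Both decompositions reproduce the target complex value:
rec_b = a0 + a1 * cmath.exp(2j * math.pi / 3) + a2 * cmath.exp(4j * math.pi / 3)
rec_p = cmath.exp(1j * p1) + cmath.exp(1j * p2)
```

The reconstruction checks (`rec_b`, `rec_p`) confirm why three amplitude pixels, or two phase pixels, suffice to represent one complex value.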

Electronically addressed spatial light modulators typically have pixel apertures with sharp edges, which give rise to undesired higher diffraction orders in their diffraction pattern. These problems can be reduced or eliminated by using soft apertures, i.e. apertures without a sharp transmission cutoff. An example of a soft-aperture transmission profile is a Gaussian profile. Gaussian profiles are known to be useful in diffraction systems because the Fourier transform of a Gaussian function is, as a mathematical result, again a Gaussian function. The diffracted beam intensity profile therefore acquires no additional structure, in contrast to transmission through an aperture with a sharp cutoff in its transmission profile. A sheet containing an array of Gaussian transmission profiles can be used. When such a sheet is provided in alignment with the apertures of the electronically addressed spatial light modulator, a system without higher diffraction orders, or with substantially reduced higher diffraction orders, is obtained compared with a system having sharp cutoffs in the beam transmission profile. Such a Gaussian or soft-aperture filter suppresses diffraction artefacts at high spatial frequencies and minimizes crosstalk between the virtual observer windows for the left and right eyes.
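The soft-aperture argument can be checked numerically. The sketch below (grid and aperture sizes are arbitrary illustrative choices, not values from the text) compares the far-field magnitude of a sharp-edged aperture with that of a Gaussian transmission profile:

```python
# Far field of a sharp-edged (rect) aperture vs a Gaussian transmission
# profile: the rect produces sinc sidelobes, the Gaussian does not, because
# the Fourier transform of a Gaussian is again a Gaussian.
import numpy as np

n = 4096
x = np.linspace(-8.0, 8.0, n)            # aperture coordinate, arbitrary units
hard = (np.abs(x) <= 1.0).astype(float)  # sharp transmission cutoff
soft = np.exp(-x**2 / (2 * 0.5**2))      # Gaussian transmission profile

def far_field_db(aperture):
    f = np.abs(np.fft.fftshift(np.fft.fft(aperture)))
    f /= f.max()
    return 20.0 * np.log10(f + 1e-15)

hard_db = far_field_db(hard)
soft_db = far_field_db(soft)

# Compare the two spectra in a band well outside the main lobe:
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=x[1] - x[0]))
band = (freqs > 1.0) & (freqs < 1.5)
hard_peak_db = hard_db[band].max()   # sinc sidelobe, roughly -18 dB
soft_peak_db = soft_db[band].max()   # Gaussian tail, tens of dB lower
```

The rect aperture's sidelobes sit only about 13-18 dB below the main lobe, while the Gaussian profile falls off monotonically, which is the crosstalk-suppression property described above.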

4) Beam splitter component

The virtual observer window is limited to one periodicity interval of the Fourier transform of the information encoded in the spatial light modulator. With the highest-resolution spatial light modulators currently available, the virtual observer window is about 10 millimetres in size. For some applications of a holographic display without tracking this may be too small. Spatial multiplexing of virtual observer windows is one solution to this problem: several virtual observer windows are created. In the case of spatial multiplexing, the virtual observer windows are generated simultaneously from different locations on the spatial light modulator. This can be achieved with a beam splitter. For example, one set of pixels on the spatial light modulator encodes information for virtual observer window 1 and another set encodes information for virtual observer window 2. The beam splitter separates the light of the two sets so that virtual observer window 1 and virtual observer window 2 are juxtaposed in the observer plane. Virtual observer window 1 and virtual observer window 2 can thus be tiled to create a larger virtual observer window. Multiplexing can also be used to create virtual observer windows for the left and right eyes. In that case no seamless juxtaposition is required, and there may be a gap between the virtual observer window(s) for the left eye and the virtual observer window(s) for the right eye. Care must be taken that higher diffraction orders of one virtual observer window do not overlap with other virtual observer windows.

A simple example of a beam splitter element is a parallax barrier comprising black stripes with transparent regions between them, as described in US 2004/223049. Another example is a lenticular sheet, also described in US 2004/223049. A further example of a beam splitter element is a lens array combined with a prism mask. In a compact hologram display a beam splitter element will typically be desirable, since a typical 10 mm virtual observer window is only large enough for one eye, whereas a typical viewer has two eyes approximately 10 cm apart. However, time multiplexing can be used as an alternative to spatial multiplexing; without spatial multiplexing, no beam splitter element is needed.

Spatial multiplexing can also be used to generate a colour hologram reconstruction. For spatial colour multiplexing, the pixels are grouped, each group containing pixels for the red, green, and blue colour components. The groups are spatially separated on the spatial light modulator and are simultaneously illuminated with red, green and blue light. Each group carries a hologram encoding calculated for the corresponding colour component of the object, and each group reconstructs its colour component of the holographic object reconstruction.

5) Time multiplexing

In the case of time multiplexing, the virtual observer windows are generated one after the other at the same location on the spatial light modulator. This can be achieved by alternating the position of the light sources with simultaneous re-encoding of the spatial light modulator. The alternating positions of the light sources must be such that the virtual observer windows are seamlessly juxtaposed in the observer plane. If the time multiplexing is fast enough, i.e. the full-cycle repetition rate is greater than 25 Hz, the eye sees a continuously enlarged virtual observer window.

Multiplexing can also be used to create virtual observer windows for the left and right eyes. In that case no seamless juxtaposition is required, and there may be a gap between the virtual observer window(s) for the left eye and the virtual observer window(s) for the right eye. Such multiplexing may be spatial or temporal.

Spatial and time multiplexing can also be combined. As an example, three virtual observer windows are spatially multiplexed to create an enlarged virtual observer window for one eye. This enlarged virtual observer window is then time multiplexed to create one enlarged virtual observer window for the left eye and one for the right eye.

Care must be taken that the higher diffraction level of the virtual observer window does not overlap with other virtual observer windows.

Multiplexing of the enlarged virtual observer window with re-encoding of the spatial light modulator is preferred, because it provides an enlarged virtual observer window with a continuous change of parallax as the observer moves. Put simply, multiplexing without re-encoding would present the same content in the different parts of the enlarged virtual observer window.

Time multiplexing can also be used to generate a colour hologram reconstruction. For time multiplexing of the three colour components, they are encoded sequentially on the spatial light modulator, and the three light sources are switched in synchrony with the re-encoding of the spatial light modulator. If the repetition rate of the full cycle is fast enough, i.e. greater than 25 Hz, the eye sees a continuous colour reconstruction.

6) Handling of undesired higher diffraction orders

If a larger virtual observer window is tiled from smaller virtual observer windows, higher diffraction orders of one virtual observer window are likely to cause disturbing crosstalk in the other virtual observer windows unless steps are taken to avoid this. As an example, if each virtual observer window lies in the zeroth diffraction order of the Fourier transform of the information encoded in the spatial light modulator, the first diffraction order of one virtual observer window may overlap with the adjacent virtual observer window. Such overlap can produce a disturbing background, which becomes particularly noticeable if the unwanted image intensity exceeds about 5% of the wanted image intensity. In such cases it is desirable to compensate or suppress the higher diffraction orders.

If the angle at which the spatial light modulator is illuminated is constant, a fixed angular filter can be used. This is the case if the holographic display has no tracking, or if tracking is performed with a beam steering element located after the spatial light modulator. The fixed angular filter can be a Bragg filter or a Fabry-Perot etalon.

Bragg-filter imaging optics can be used to modify the geometric light intensity distribution, as described in US 5,153,670, where the spatial light modulator produces a geometric light intensity distribution containing unwanted diffraction orders. A Bragg-filter holographic optical element can produce a light intensity distribution different from the one obtained without the element. Figure 7 shows the function of a Bragg-filter holographic optical element. In Figure 7, 70 is a spatial light modulator and 71 is a holographic optical element Bragg filter containing Bragg planes, such as the Bragg plane 74. A single element 73 on the spatial light modulator 70 produces the diffracted light intensity distribution shown at 75. Light ray 76, which has been diffracted by the spatial light modulator 70, is diffracted again in the holographic optical element 71 and then propagates in a direction different from its original direction of propagation between 70 and 71. If the direction of ray 76 between 70 and 71 corresponds to unwanted first-order diffracted light, it is readily seen that the Bragg filter 71 successfully redirects this light so that it does not produce unwanted, possibly disturbing, optical artefacts for a typical viewer, who would be located approximately perpendicular to the plane of 70.

An adjustable method of suppressing diffraction orders is mentioned in patent application DE 10 2006 030 503. It describes a liquid crystal layer between two plane-parallel glass plates coated with a partially reflective coating. At each encounter with a coating, the beam is partially reflected and partially transmitted. The transmitted beams interfere, and the phase differences between them determine whether the interference is constructive or destructive, as in a Fabry-Perot etalon. For a given wavelength, the interference and hence the transmission vary with the angle of incidence of the beam.

For a given direction of light propagation, the interference can be tuned by changing the refractive index of the liquid crystal for that propagation direction. The refractive index is controlled by an electric field applied to the liquid crystal layer. Therefore, within the limits of the Fabry-Perot etalon, the angular transmission characteristic can be adjusted and diffraction orders can be selectively transmitted or reflected as desired. For example, if the etalon is set for optimal transmission of the zeroth order and optimal reflection of the first order, there may still be some unwanted transmission of the second and higher orders. Within these limits, the device allows diffraction orders to be selected, either statically or sequentially, for transmission or reflection as required.
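The tuning behaviour described above can be illustrated with the standard Airy transmission formula for an ideal Fabry-Perot etalon. The formula and all parameter values below are textbook assumptions for illustration, not figures from the patent:

```python
# Airy transmission of an ideal Fabry-Perot etalon: T = 1/(1 + F*sin^2(d/2)),
# with round-trip phase d = 4*pi*n*L*cos(theta_internal)/lambda and
# coefficient of finesse F = 4R/(1-R)^2.  Shows how changing the liquid
# crystal index n switches a given direction between transmission and
# reflection.  All values assumed.
import math

def etalon_transmission(theta_rad, n, thickness_um, wavelength_nm, R):
    lam = wavelength_nm * 1e-9
    L = thickness_um * 1e-6
    theta_i = math.asin(math.sin(theta_rad) / n)  # Snell's law, air outside
    delta = 4.0 * math.pi * n * L * math.cos(theta_i) / lam
    F = 4.0 * R / (1.0 - R) ** 2
    return 1.0 / (1.0 + F * math.sin(delta / 2.0) ** 2)

# Normal incidence, 10 um cavity, 500 nm light, 90% mirrors (assumed):
T_on = etalon_transmission(0.0, 1.5, 10, 500, 0.9)     # n*L = 60*lambda/2: resonant
T_off = etalon_transmission(0.0, 1.5125, 10, 500, 0.9) # quarter-wave detuned
```

Switching the index from 1.5 to 1.5125 moves this direction from near-total transmission to below 1% transmission, which is the electrically selectable transmit/reflect behaviour described in the text.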

Spatial filters can be used to select diffraction orders. A spatial filter, containing transparent and opaque regions, can be placed between the spatial light modulator and the virtual observer window. Such filters transmit the desired diffraction orders and block unwanted ones. They can be fixed or configurable; for example, an electronically addressed spatial light modulator placed between the spatial light modulator and the virtual observer window can serve as a configurable spatial filter.

7) Eye tracking

In a compact combination of an electronically addressed spatial light modulator with eye tracking and a sufficiently coherent, compact light source, an eye position detector detects the positions of the observer's eyes. One or several virtual observer windows can then be placed automatically at the eye positions so that the observer can see the reconstructed object through the virtual observer windows.

However, tracking is not always possible, because the additional device requirements and the power demand limit performance, especially for portable or handheld devices. Without tracking, the observer must instead adjust the position of the display. This is easily achievable because, in a preferred embodiment, the compact display is a handheld display that may be incorporated in a personal digital assistant or mobile phone. A user of a personal digital assistant or mobile phone usually views the display head-on, so adjusting it to bring the virtual observer windows to the positions of the user's eyes is not much of a burden. It is well known that users of handheld devices tend to change the orientation of the device in the hand to obtain the best viewing conditions, as described in WO 01/96941. In such a device, therefore, tracking of the user's eyes is not required, and no tracking optics, such as scanning mirrors, which would compromise compactness, are needed. Eye tracking can nevertheless be applied to other devices, for which the extra device and power requirements are not an excessive burden.

Without tracking, the compact combination of an electronically addressed spatial light modulator with a sufficiently coherent, compact light source requires a large virtual observer window so that display adjustment is simple. A preferred virtual observer window size is several times the size of the eye pupil. This can be achieved either with a single large virtual observer window, using a spatial light modulator with small pixel pitch, or by tiling several smaller virtual observer windows, using a spatial light modulator with larger pixel pitch.

The position of the virtual observer window is determined by the position of the light source in the light source array. The eye position detector detects the eye positions, and the light source positions are set such that the virtual observer windows coincide with the eye positions. This type of tracking is described in US 2006/055994 and US 2006/250671.
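The relation between source position and virtual observer window position follows from the imaging geometry: the lens images the source plane onto the observer plane, so a lateral source shift is magnified (and inverted) by the ratio of the two distances. The distances below are assumed illustrative values, not figures from the text:

```python
# Geometry sketch (assumed distances): a lateral source shift ds moves the
# virtual observer window by -ds * v/u, where u is the source-to-lens distance
# and v the lens-to-observer distance (inverted image, magnification v/u).

def vow_shift_mm(source_shift_mm, u_mm, v_mm):
    return -source_shift_mm * v_mm / u_mm

u, v = 20.0, 400.0             # mm, assumed distances
dx = vow_shift_mm(0.5, u, v)   # a 0.5 mm source shift steers the window 10 mm
```

With a 20x magnification, sub-millimetre source steps suffice to track an eye over centimetres, which is why source-position tracking can be fine-grained.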

Alternatively, the virtual observer window can be moved while the light sources remain in fixed positions. Light source tracking requires a spatial light modulator that is relatively insensitive to changes in the angle of incidence of the light from the sources. If the virtual observer window is moved by moving the light source, it may be difficult to achieve a tight combination of a compact light source and a spatial light modulator, because of possible anomalous light propagation within the tight combination. In such a case it is helpful to have a fixed light path inside the display and a beam steering element as the last optical component of the display.

Beam steering elements are shown in Figures 20 and 21. A beam steering element changes the angle of the beam at the output of the display. It can have controllable prism properties for x and y tracking and controllable lens properties for z tracking. For example, either or both of the beam steering elements of Figures 20 and 21 can be applied in a single device. The beam steering element is a controllable diffractive element or a controllable refractive element. The controllable refractive element may comprise an array of cavities filled with liquid crystal, embedded in a host material (matrix) with an isotropic linear dielectric susceptibility tensor. The cavities have the shape of a prism or of a lens. An electric field controls the effective refractive index of the liquid crystal and thus enables the beam steering. The electric field can differ between elements so as to produce beam steering characteristics that vary from element to element. As shown in Figure 20, the electric field is applied between transparent electrodes. The liquid crystal has uniaxial refractive properties and can be chosen such that its refractive index perpendicular to its optical axis equals the refractive index of the host material or "matrix". The remaining details of the arrangement can be taken from the conventional technique. The host material has an isotropic refractive index. If the optical axis of the liquid crystal is aligned along the z direction by applying a suitable electric field, as shown in Fig. 20, a plane wave propagating along the z direction is not refracted when it passes through the beam steering element, because it does not encounter any refractive index change perpendicular to its Poynting vector.
However, if an electric field is applied to the electrodes such that the optical axis of the liquid crystal is perpendicular to the z direction, a plane wave propagating along the z direction and polarized parallel to the optical axis experiences the maximum refraction when it passes through the beam steering element, because it encounters the largest refractive index change the system can provide along its polarization direction. The degree of refraction can be adjusted between these two extremes by selecting an appropriate electric field applied across the host material.

If the cavities are prism-shaped rather than lens-shaped, the beam is steered. Figure 21 shows beam steering with a suitable prism. If the optical axis of the liquid crystal is aligned along the z direction by applying a suitable electric field, as shown in Fig. 21, a plane wave propagating along the z direction is not refracted when it passes through the beam steering element, because it does not encounter any refractive index change in its polarization direction.

However, if an electric field is applied to the electrodes such that the optical axis of the liquid crystal is perpendicular to the z direction, a plane wave propagating along the z direction and polarized parallel to the optical axis experiences the maximum refraction when it passes through the beam steering element, because it encounters the largest refractive index change the system can provide perpendicular to its Poynting vector. The degree of refraction lies between these two extremes and is adjusted by selecting an appropriate electric field applied across the host material.
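The steering range of such a prism element can be estimated with the thin-prism, small-angle formula; all indices and the apex angle below are assumed values, not figures from the text:

```python
# Order-of-magnitude sketch (thin-prism, small-angle limit, assumed values):
# deflection ~ (n_lc - n_host) * prism_angle.  With the host index matched to
# the ordinary index n_o, switching the liquid crystal between n_o and n_e
# steers the beam between zero and (n_e - n_o) * prism_angle.

def deflection_deg(n_lc, n_host, prism_angle_deg):
    return (n_lc - n_host) * prism_angle_deg

n_o, n_e, n_host = 1.5, 1.7, 1.5   # assumed ordinary/extraordinary/host indices
alpha = 5.0                        # degrees, assumed prism apex angle
off = deflection_deg(n_o, n_host, alpha)   # index-matched state: no steering
on = deflection_deg(n_e, n_host, alpha)    # maximum deflection, about 1 degree
```

Even a modest birefringence of 0.2 with a 5-degree prism gives a steering range of about a degree, which at a typical arm's-length viewing distance corresponds to several millimetres of virtual observer window movement.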

8) Examples

An example of a compact combination of an electronically addressed spatial light modulator and a sufficiently coherent, compact light source will now be described. This combination can generate a three-dimensional image under appropriate illumination and can be housed in a personal digital assistant or mobile phone. The compact combination comprises an organic light emitting diode display as the light source array, an electronically addressed spatial light modulator, and a lens array, as shown in Fig. 12.

Depending on the required position of the virtual observer window (denoted OW in Figure 12), specific pixels in the organic light emitting diode display are activated. These pixels illuminate the electronically addressed spatial light modulator and are imaged by the lens array onto the observer plane. For each lens of the lens array, at least one pixel of the organic light emitting diode display is activated. For dimensions typical of such a design, a pixel pitch of 20 μm allows the virtual observer window to be tracked in lateral increments of 400 μm. Such tracking is quasi-continuous.

Organic light emitting diode pixels are light sources with partial spatial coherence, and partial coherence leads to a smeared reconstruction of an object point. For dimensions typical of such a design, a pixel width of 20 microns produces a reconstruction with a lateral smear of about 100 microns for an object point located 100 mm from the display. This is adequate for the resolution of the human visual system.

Light passing through different lenses of the lens array has no significant mutual coherence; coherence is needed only across each single lens of the lens array. Therefore, the resolution of a reconstructed object point is determined by the pitch of the lens array. For the human visual system, a typical lens pitch on the order of 1 mm ensures sufficient resolution. With an organic light emitting diode pitch of 20 micrometres, this means a ratio of lens pitch to organic light emitting diode pitch of 50:1. If only one organic light emitting diode is lit per lens, then only one in 50^2 = 2,500 organic light emitting diodes is lit, so this will be a low-power display. The difference between the holographic display described here and a conventional organic light emitting diode display is that the former concentrates the light onto the viewer's eyes, whereas the latter emits light into 2π steradians. A conventional organic light emitting diode display achieves a luminance of about 1,000 cd/m^2; in practice, the organic light emitting diodes used here for illumination should be able to achieve a multiple of 1,000 cd/m^2.

The virtual observer window is limited to one diffraction order of the Fourier spectrum of the information encoded in the spatial light modulator. If the spatial light modulator has a pixel pitch of 10 μm and two pixels are needed to encode one complex number, i.e. if two-phase encoding is used on a phase-modulating electronically addressed spatial light modulator, the virtual observer window is 10 mm wide at a wavelength of 500 nm. Several virtual observer windows can be tiled into an enlarged virtual observer window by spatial or time multiplexing. In the case of spatial multiplexing, additional optical components such as beam splitters are needed.
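The 10 mm figure can be reproduced from the periodicity-interval relation w = λd/p, where p is the pitch of one encoded complex value. The observer distance is not stated in this paragraph, so d = 0.4 m is assumed here, matching the practical example given later in the text:

```python
# Virtual observer window width from the SLM periodicity interval:
# w = lambda * d / p, with p = (pixels per complex value) * (pixel pitch).
# The observer distance d = 0.4 m is an assumption (taken from the practical
# example elsewhere in the text).

def vow_width_mm(wavelength_nm, distance_m, pixel_pitch_um, pixels_per_complex):
    p = pixels_per_complex * pixel_pitch_um * 1e-6   # complex-value pitch in metres
    return wavelength_nm * 1e-9 * distance_m / p * 1e3

w = vow_width_mm(500, 0.4, 10, 2)   # 10.0 mm, matching the figure in the text
```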

Colour hologram reconstruction can be achieved by time multiplexing. The red, green, and blue pixels of a colour organic light emitting diode display are activated sequentially, synchronized with re-encoding of the spatial light modulator with holograms for the red, green, and blue optical wavelengths.

The display can include an eye position detector to detect the observer's eye position. The eye position detector is connected to a control unit that controls the pixel activity of the organic light emitting diode display.

The calculation of the hologram encoded on the spatial light modulator is preferably performed by an external encoding unit, because it requires considerable computational power. The display data are then sent to the personal digital assistant or mobile phone, which displays the three-dimensional holographic image.

As a practical example, a 2.6 inch screen size XGA liquid crystal display electronically addressed spatial light modulator manufactured by Sanyo (RTM) Epson (RTM) Imaging Devices Corporation of Japan can be used. The pitch of the sub-pixels is 17 μm. If this panel is used to construct a red, green and blue hologram display with Burckhardt amplitude-modulation encoding of the hologram, the viewing window lies at a distance of 0.4 m from the electronically addressed spatial light modulator. For the monochrome case, the viewing window is calculated to be 4 mm wide. With the same settings but phase modulation with two-phase encoding, the viewing window is calculated to be 6 mm wide; with phase modulation and Kinoform encoding, it is calculated to be 12 mm wide.

Other high resolution electronically addressed spatial light modulators also exist. Seiko (RTM) Epson (RTM) Corporation of Japan has published a monochrome electronically addressed spatial light modulator, the D4:L3D13U, a 1.3 inch screen size panel with a 15 micron pixel pitch. The company has also published a panel of the same type, the D5:L3D09U-61G00, with a 0.9 inch screen size and a pixel pitch of 10 μm. On December 12, 2006, the company announced another panel of the same type, the L3D07U-81G00, with a 0.7 inch screen size and a pixel pitch of 8.5 μm. If the D4:L3D13U 1.3 inch panel is used to construct a monochrome holographic display with Burckhardt amplitude-modulation encoding of the hologram, the virtual observer window at a distance of 0.4 m from the electronically addressed spatial light modulator is calculated to be 5.6 mm wide.
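The viewing-window figures quoted in this example section can be cross-checked with the same relation w = λd/(k·pitch), where k is the number of pixels per complex value (Burckhardt 3, two-phase 2, Kinoform 1). The wavelengths below are assumptions: 500 nm reproduces the Sanyo Epson figures, and the 5.6 mm figure for the D4 panel is consistent with a red wavelength near 633 nm (not stated in the text):

```python
# Cross-check of the quoted viewing-window widths.  d = 0.4 m as stated;
# wavelengths of 500 nm (Sanyo Epson figures) and 633 nm (Epson D4 figure)
# are assumptions for illustration.

def vow_mm(wavelength_nm, pitch_um, pixels_per_complex, d_m=0.4):
    return wavelength_nm * 1e-9 * d_m / (pixels_per_complex * pitch_um * 1e-6) * 1e3

w_burckhardt = vow_mm(500, 17, 3)   # ~3.9 mm, quoted as 4 mm
w_twophase = vow_mm(500, 17, 2)     # ~5.9 mm, quoted as 6 mm
w_kinoform = vow_mm(500, 17, 1)     # ~11.8 mm, quoted as 12 mm
w_d4_red = vow_mm(633, 15, 3)       # ~5.6 mm, quoted for the D4:L3D13U panel
```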

D. Close combination of pairs of electronically addressed spatial light modulators

In another embodiment, a combination of two electronically addressed spatial light modulators can be used to modulate the amplitude and the phase of the light sequentially in a compact manner. Complex numbers, comprising amplitude and phase, can thus be encoded pixel by pixel in the transmitted light.

This embodiment comprises a compact combination of two electronically addressed spatial light modulators. The first electronically addressed spatial light modulator modulates the amplitude of the transmitted light and the second modulates its phase; alternatively, the first may modulate the phase and the second the amplitude. Each electronically addressed spatial light modulator can be as described in Section C, and apart from the use of two electronically addressed spatial light modulators, the overall configuration can also be as described in Section C. Any other combination of modulation characteristics of two electronically addressed spatial light modulators that is equivalent to independent modulation of amplitude and phase is possible as well.

In a first step, the first electronically addressed spatial light modulator encodes a pattern for amplitude modulation. In a second step, the second electronically addressed spatial light modulator encodes a pattern for phase modulation. The light transmitted by the second electronically addressed spatial light modulator has then been modulated in both amplitude and phase, so that an observer viewing the light emitted by the device comprising the two electronically addressed spatial light modulators can observe a three-dimensional image.

Modulating both phase and amplitude permits the representation of complex values. In addition, electronically addressed spatial light modulators can have high resolution. This embodiment can therefore be used to generate a hologram from which a three-dimensional image is viewable by an observer.

Figure 13 shows an embodiment. 130 is a lighting device for illuminating a planar area, wherein the illumination is sufficiently homogeneous to enable the generation of a three-dimensional image. An example of a lighting device for a large-area video hologram is given in US 2006/250671, an example of which is shown in Figure 4. A device such as 130 can take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, that emit light incident on a focusing system, wherein the focusing system can be compact, such as a lenticular array or a microlens array. Alternatively, the light source for 130 may comprise red, green, and blue lasers, or red, green, and blue light emitting diodes that emit sufficiently coherent light. The red, green, and blue light emitting diodes may be organic light emitting diodes (OLEDs). However, a non-laser light source with sufficient spatial coherence (for example, a light emitting diode, an organic light emitting diode, or a cold cathode fluorescent lamp) is preferable, as it avoids the disadvantages of laser sources, such as laser speckle in the holographic reconstruction, relatively high cost, and possible safety issues concerning the eyes of viewers of the holographic display or of workers assembling the display.

Element 130 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example, as described in US 5,056,892 and US 5,919,551. Element 130 can comprise a polarizing element or a collection of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linearly polarized state and reflects the orthogonal linearly polarized state - such sheets are known, for example, as described in US 5,828,488. Another example is a reflective polarizer that transmits one circularly polarized state and reflects the orthogonal circularly polarized state - such sheets are known, for example, as described in US 6,181,395. Element 130 can include a focusing system, which can be compact, such as a lenticular array or a microlens array. Element 130 can include other optical components known in the art of backlighting.

Element 130 can have a thickness of about a few centimeters or less. In a preferred implementation, the thicknesses of elements 130-134 are all less than 3 cm in total, providing a compact, closely combined light source. Element 131 can be a color filter array such that pixels of colored light (e.g., red, green, and blue light) are directed toward element 132, although a color filter is not required if colored light sources are used. Element 132 is an electronically addressed spatial light modulator. Element 133 is an electronically addressed spatial light modulator. Element 134 is an optional beam splitter element. For the transmitted light, element 132 modulates the amplitude and element 133 modulates the phase. Alternatively, element 133 may modulate the amplitude and element 132 the phase. Placing the electronically addressed spatial light modulators 132 and 133 in close proximity reduces optical losses and pixel crosstalk caused by beam divergence: when 132 and 133 are in close proximity, the beams modulated by corresponding pixels of the two modulators propagate with little overlap into neighboring pixels. A viewer located at point 135, some distance from the device comprising the compact hologram generator 136, can view the three-dimensional image from the direction of 136.

Elements 130, 131, 132, 133, and 134 are configured to be physically connected, each forming a layer of the structure, such that the whole is a single, unified object. The physical connection can be direct, or indirect where a thin intermediate layer, such as a covering film, lies between adjacent layers. The physical connection can be limited to small areas that ensure proper alignment, or can extend over larger areas, even the entire surface of a layer. The physical connection can be achieved by bonding layer to layer, for example using optically transmissive adhesive, to form the compact hologram generator 136, or by any other means (refer to the Summary Manufacturing Procedures section).

Where an electronically addressed spatial light modulator performs amplitude modulation, in a typical setup the incident read beam is linearly polarized by passing it through a linear polarizer. The amplitude modulation is controlled by the rotation of the liquid crystal in an applied electric field, the application of the electric field affecting the polarization state of the light. In such a device, light exiting the electronically addressed spatial light modulator passes through a further linear polarizer, which reduces the intensity according to the change in polarization state that the light underwent while passing through the electronically addressed spatial light modulator.

Where an electronically addressed spatial light modulator performs phase modulation, in a typical setup the incident read beam is linearly polarized by passing it through a linear polarizer, unless it is already in a defined linearly polarized state. The phase modulation is controlled by the application of an electric field that affects the phase state of the light. In one example of phase modulation, a nematic liquid crystal is used, in which the direction of the optical axis is spatially fixed but the birefringence is a function of the applied voltage. In another example of phase modulation, a ferroelectric liquid crystal is used, in which the birefringence is fixed but the direction of the optical axis is controlled by the applied voltage. In either implementation of phase modulation, the output beam has a phase difference with respect to the input beam that is a function of the applied voltage. One example of a liquid crystal cell that can perform phase modulation is a Freedericksz cell arrangement, in which an anti-parallel aligned nematic liquid crystal with positive dielectric anisotropy is used, as described in US 5,973,817.

For a compact holographic display, the two electronically addressed spatial light modulators are combined with small or minimal separation. In a preferred embodiment, the two spatial light modulators have the same number of pixels. Because the two electronically addressed spatial light modulators are not equidistant from the observer, their pixel pitches may need to differ slightly (while remaining approximately equal) to compensate for the difference in distance to the observer. Light that has passed through a pixel of the first spatial light modulator passes through the corresponding pixel of the second spatial light modulator. The light is therefore modulated by both spatial light modulators, and independent modulation of amplitude and phase can be achieved. As an example, the first spatial light modulator performs amplitude modulation and the second performs phase modulation. Equally, any other combination of the modulation characteristics of the two spatial light modulators that is equivalent to independent modulation of amplitude and phase is possible.

It must be noted that light passing through a pixel of the first spatial light modulator should pass only through the corresponding pixel of the second spatial light modulator. Crosstalk occurs if light exiting a pixel of the first spatial light modulator passes through a non-corresponding, adjacent pixel of the second spatial light modulator. Such crosstalk can degrade image quality. Four possible ways to minimize crosstalk between pixels are given here. As will be apparent to those skilled in the art, these methods are equally applicable to the embodiments of Part B.

(1) The first and simplest way is to directly connect or bond the two spatial light modulators after aligning their pixels. Diffraction at the pixels of the first spatial light modulator causes the light to spread as it propagates. The separation between the spatial light modulators must therefore be thin enough to keep the crosstalk between adjacent pixels of the second spatial light modulator acceptable. As an example, the separation of two electronically addressed spatial light modulators with a pixel pitch of 10 μm must be of the order of 10-100 μm or less. This is almost impossible to achieve with conventionally manufactured spatial light modulators, because the thickness of the cover glass alone is of the order of 1 mm. A "sandwich" approach that enables a thin separation layer between the spatial light modulators is therefore preferred. The fabrication methods described in the Summary Manufacturing Procedures section can be applied to fabricate a device comprising two electronically addressed spatial light modulators with a small or minimal separation distance.

Figure 14 shows the calculated Fresnel diffraction pattern of a slit 10 μm wide. The pattern is calculated in a two-dimensional model as a function of distance from the slit, with the vertical axis being the distance z from the slit and the horizontal axis the lateral position x. The uniformly illuminated slit extends from -5 μm to +5 μm on the x-axis, at z equal to zero. The optical transmission medium is taken to have a refractive index of 1.5, typical of the media used in compact devices. The light is red light with a vacuum wavelength of 633 nm. The green and blue wavelengths are shorter than the red, so of the three colors red, green, and blue, the red-light calculation exhibits the strongest diffraction effects. The calculations were performed using MathCad (RTM) software from Parametric Technology (RTM) Corp., Needham, MA, USA. Figure 15 shows the fraction of the intensity that remains within the central 10 μm wide range of the slit as a function of distance from the slit. At a distance of 20 μm from the slit, Figure 15 shows that more than 90% of the intensity is still within the 10 μm width of the slit. Therefore, in this two-dimensional model, less than 5% of the pixel intensity is incident on each of the adjacent pixels. This is the result of a calculation assuming zero boundary width between pixels. The actual boundary width between pixels is greater than zero, so the crosstalk problem in a real system will be smaller than calculated here. In Figure 14, the Fresnel diffraction pattern close to the slit, for example 50 μm from the slit, approximates the top-hat intensity function of the slit itself. There are therefore no wide diffraction features close to the slit. Wide diffraction features are characteristic of the far-field diffraction pattern of a top-hat function, which is the well-known sinc-squared function.
The wide diffraction pattern can be seen in Figure 14 at, for example, a distance of 300 μm from the slit. This shows that the diffraction effects can be controlled by placing the two electronically addressed spatial light modulators in close proximity, and that a further advantage of placing them very close together is that the diffraction pattern changes from its far-field form to a form that more efficiently confines the light close to the axis perpendicular to the slit. This advantage runs contrary to conventional thinking in holographic techniques, which tends to assume that strong, broad, and unavoidable diffraction effects occur when light passes through the small apertures of a spatial light modulator. Conventional practice therefore provides no motivation to bring two spatial light modulators close together, and would expect such an arrangement to suffer from severe and unavoidable pixel crosstalk caused by diffraction effects.
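The slit calculation above can be sketched numerically using the standard Fresnel-integral form of the field behind a uniformly illuminated slit (a sketch using Python with NumPy/SciPy rather than the MathCad software named in the text; the parameter values are those stated above):

```python
import numpy as np
from scipy.special import fresnel

# 2-D Fresnel diffraction of a uniformly illuminated 10 um slit,
# along the lines of the Figure 14/15 calculation.
# Values from the text: vacuum wavelength 633 nm, medium index 1.5.
LAMBDA = 633e-9 / 1.5   # wavelength in the medium
A = 5e-6                # slit half-width: slit spans -5 um .. +5 um

def slit_intensity(x, z):
    """Fresnel-integral intensity at lateral position x, distance z
    behind the slit (unit incident intensity)."""
    s = np.sqrt(2.0 / (LAMBDA * z))
    s1, c1 = fresnel(s * (-A - x))   # scipy.special.fresnel returns (S, C)
    s2, c2 = fresnel(s * (A - x))
    return 0.5 * ((c2 - c1) ** 2 + (s2 - s1) ** 2)

def fraction_within_slit_width(z, x_max=100e-6, n=20001):
    """Fraction of the transmitted power that stays inside the central
    10 um (the slit width) at distance z, cf. Figure 15."""
    x = np.linspace(-x_max, x_max, n)   # uniform grid: dx cancels in the ratio
    i = slit_intensity(x, z)
    return float(i[np.abs(x) <= A].sum() / i.sum())

# Near the slit most of the power stays within the slit width;
# far from it (e.g. 300 um) the wide far-field pattern appears.
print(f"fraction at z = 20 um:  {fraction_within_slit_width(20e-6):.2f}")
print(f"fraction at z = 300 um: {fraction_within_slit_width(300e-6):.2f}")
```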

Figure 16 shows a contour plot of the intensity distribution as a function of distance from the slit. The contours are on a logarithmic scale, not a linear scale. Ten contour lines are used, together spanning a factor of 100 in intensity. For a slit width of 10 μm, significant broadening of the intensity distribution beyond the slit width becomes apparent only at distances greater than about 50 μm from the slit.

In a further embodiment, the pixel aperture area of the first electronically addressed spatial light modulator can be reduced to mitigate crosstalk problems in the second electronically addressed spatial light modulator.

(2) The second method is to use a lens array between the two spatial light modulators, as shown in Figure 17. A preferred approach is for the number of lenses to equal the number of pixels in each spatial light modulator. The pitches of the two spatial light modulators and of the lens array can differ slightly to compensate for the difference in distance to the observer. Each lens images a pixel of the first spatial light modulator onto the corresponding pixel of the second spatial light modulator, as shown by the beams 171 in Figure 17. It is also possible for light to pass through adjacent lenses and cause crosstalk, as indicated by the beams 172. This can be ignored if its intensity is low enough, or if its direction differs sufficiently that it cannot reach the virtual observer window.

The numerical aperture (NA) of each lens must be large enough to image a pixel with sufficient resolution. As an example, a resolution of 5 μm requires a numerical aperture of about 0.2. For a pixel pitch of 10 μm, this also means that the maximum distance between the lens array and each spatial light modulator is about 25 μm.

It is also possible to assign several pixels of each spatial light modulator to one lens of the lens array. As an example, a group of four pixels of the first spatial light modulator can be imaged by one lens of the lens array onto a group of four pixels of the second spatial light modulator. The number of lenses in such a lens array is one quarter of the number of pixels in each spatial light modulator. This allows the use of lenses with higher numerical apertures, and thus higher-resolution imaging of the pixels.

(3) The third method is to reduce the pixel aperture of the first electronically addressed spatial light modulator as far as possible. From the perspective of diffraction, the area of the second spatial light modulator illuminated by one pixel of the first spatial light modulator is determined by the pixel aperture width D and the diffraction angle of the first electronically addressed spatial light modulator, as shown in Figure 18. In Figure 18, d is the distance between the two electronically addressed spatial light modulators, and w is the distance between the two first-order diffraction minima that occur on either side of the zeroth-order maximum. This assumes Fraunhofer diffraction, or that Fraunhofer diffraction is a reasonable approximation.

Reducing the aperture width D on the one hand reduces the extent of the directly projected central portion of the illuminated area, as indicated by the dashed lines in Figure 18. On the other hand, it increases the diffraction angle, which in Fraunhofer diffraction is proportional to 1/D. This increases the width of the illuminated area on the second electronically addressed spatial light modulator. The full width of the illuminated area is w. In the Fraunhofer approximation, w = D + 2dλ/D, derived from the distance between the two first-order minima of the Fraunhofer diffraction pattern; for given d and λ, D can be chosen to minimize w.

For example, if λ is 0.5 μm and d is 100 μm, the minimum w of 20 μm is obtained at D = 10 μm. In this case, however, the Fraunhofer approximation may not be a good one. The example nevertheless illustrates the principle of using the distance between the electronically addressed spatial light modulators to control the diffraction in the Fraunhofer regime.
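The minimization can be carried out in closed form: differentiating w(D) = D + 2dλ/D with respect to D and setting the derivative to zero gives D = sqrt(2dλ), at which point w = 2D. A short sketch of this calculation:

```python
import math

def illuminated_width(D, d, wavelength):
    """Full width w = D + 2*d*lambda/D of the area illuminated on the
    second spatial light modulator (Fraunhofer approximation)."""
    return D + 2.0 * d * wavelength / D

def optimal_aperture(d, wavelength):
    """dw/dD = 1 - 2*d*lambda/D**2 = 0  =>  D = sqrt(2*d*lambda),
    giving the minimum width w = 2*D."""
    D = math.sqrt(2.0 * d * wavelength)
    return D, illuminated_width(D, d, wavelength)

# Example values from the text: lambda = 0.5 um, d = 100 um.
D, w = optimal_aperture(d=100e-6, wavelength=0.5e-6)
print(f"D = {D * 1e6:.0f} um, w = {w * 1e6:.0f} um")  # D = 10 um, w = 20 um
```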

(4) The fourth method uses a fiber optic panel to image the pixels of the first spatial light modulator onto the pixels of the second spatial light modulator. The fiber optic panel consists of a two-dimensional arrangement of parallel fibers. The length of the fibers, and therefore the thickness of the panel, is typically a few centimeters, and the diagonal of the panel surface can be as long as several inches. As an example, the fiber pitch can be 6 μm. Edmund Optics Inc. of Barrington, New Jersey, USA sells fiber optic panels with such a fiber pitch. Each fiber guides light from one of its ends to the other. The image at one end of the panel is therefore transmitted to the other end with high resolution and without focusing components. Such a panel can serve as the separation layer between the two spatial light modulators, as shown in Figure 19. Multimode fiber is preferred over single-mode fiber because it has better coupling efficiency. Coupling efficiency is best when the refractive index of the fiber core is matched to the refractive index of the liquid crystal, because this minimizes Fresnel back-reflection losses.

There is no additional cover glass between the two spatial light modulators. The polarizers, electrodes, and alignment layers are attached directly to the fiber optic panel. Each of these layers is very thin, i.e. of the order of 1-10 μm. The liquid crystal (LC) layers LC1 and LC2 are therefore located close to the panel. Light passing through a pixel of the first spatial light modulator is guided to the corresponding pixel of the second spatial light modulator. This minimizes crosstalk to neighboring pixels. The panel transmits the light distribution at the output of the first spatial light modulator to the input of the second spatial light modulator. On average, there should be at least one fiber per pixel. If there is less than one fiber per pixel on average, the spatial light modulator loses resolution, resulting in reduced image quality in holographic display applications.

In Figure 19, the first spatial light modulator modulates the amplitude and the second modulates the phase. Other combinations of the modulation characteristics of the two electronically addressed spatial light modulators that achieve full complex modulation are possible.

Figure 10 shows an example of a close combination in which the amplitude and phase information of the hologram are encoded.

104 is an illumination device for illuminating a planar area, wherein the illumination is sufficiently homogeneous to enable the generation of a three-dimensional image. An example of a lighting device for a large-area video hologram is given in US 2006/250671. A device such as 104 may take the form of a white light source array, such as cold cathode fluorescent lamps or white light emitting diodes, that emit light incident on the focusing system 100, wherein the focusing system may be compact, such as a lenticular array or a microlens array. Alternatively, the light source for 104 may comprise red, green, and blue lasers, or red, green, and blue light emitting diodes that emit sufficiently coherent light. However, a non-laser light source with sufficient spatial coherence (for example, a light emitting diode, an organic light emitting diode, or a cold cathode fluorescent lamp) is preferable, as it avoids the disadvantages of laser sources, such as laser speckle in the holographic reconstruction, relatively high cost, and possible safety issues concerning the eyes of viewers of the holographic display or of workers assembling the display.

Element 104 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example, as described in US 5,056,892 and US 5,919,551. Element 104 can comprise a polarizing element or a collection of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linearly polarized state and reflects the orthogonal linearly polarized state - such sheets are known, for example, as described in US 5,828,488. Another example is a reflective polarizer that transmits one circularly polarized state and reflects the orthogonal circularly polarized state - such sheets are known, for example, as described in US 6,181,395. Element 104 can include other optical components known in the art of backlighting.

The thicknesses of elements 104 and 100-103 may all be of the order of a few centimeters or less. Element 101 may comprise a color filter array such that pixels of colored light (e.g., red, green, and blue light) are directed toward element 102, although a color filter is not required if colored light sources are used. Element 102 is an electronically addressed spatial light modulator that encodes phase information, such as a Freedericksz cell. Element 103 is an electronically addressed spatial light modulator that encodes amplitude information, such as a commercially available liquid crystal display device. Each element of element 102, denoted 107 here, is aligned with the corresponding element of element 103, denoted 108. Although the elements of 102 and 103 have the same lateral pitch or spacing, the element size in element 102 may be less than or equal to that in element 103, because light exiting element 107 typically experiences some diffraction before reaching element 108 of element 103. The encoding order of amplitude and phase may also be the reverse of that shown in Figure 10.

A viewer located at point 106, some distance from the device comprising the compact hologram generator 105, can view the three-dimensional image from the direction of 105. Elements 104, 100, 101, 102, and 103 are configured to be physically connected as previously described, forming the compact hologram generator 105.

E. Large-magnification three-dimensional image display device with holographic reconstruction of a target, whose components comprise a close combination of one or two pairs of an organic light emitting diode array with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators

Figure 24 shows a large-magnification three-dimensional image display device with holographic reconstruction of a target, comprising a close combination of one or two pairs of an organic light emitting diode array with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators. The components of this device include a close combination of a spatial light modulator and a well-conditioned compact light source (such as those described in Sections A, B, C, and D) which, under appropriate illumination conditions, can produce a three-dimensional image visible in the virtual observer window (labeled OW in Figure 24); the device may be integrated, for example, in a personal digital assistant or mobile phone. As shown in Figure 24, the close combination of the spatial light modulator and the well-conditioned compact light source comprises a light source array, a spatial light modulator, and a lens array. The spatial light modulator in Figure 24 comprises a close combination of one or two pairs of an organic light emitting diode array and an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators, or one pair of an organic light emitting diode array and an optically addressed spatial light modulator combined with one electronically addressed spatial light modulator.

In a simple example, the array of light sources can be formed as follows. A single source, such as a monochromatic light emitting diode, is placed in close proximity to an array of apertures to illuminate the apertures. If the apertures are slits in a one-dimensional array, the light transmitted by the slits forms a one-dimensional array of light sources. If the apertures are circles in a two-dimensional array, the illuminated set of circles forms a two-dimensional array of light sources. A typical aperture width is approximately 20 μm. Such an array of light sources is suitable for generating an observer window for one eye.

In Figure 24, the light source array is disposed at a distance u from the lens array. The light source array can be like the light sources of element 10 of Figure 1, and can optionally include the elements 11 of Figure 1. Specifically, each light source in the array is located at a distance u from the corresponding lens of the lens array. In a preferred embodiment, the light source array is parallel to the plane of the lens array. The spatial light modulator can be located on either side of the lens array. The distance between the virtual observer window and the lens array is v. Each lens in the lens array is a converging lens, with focal length f given by 1/f = 1/u + 1/v. In a preferred embodiment, the value of v is in the range of 300 mm to 600 mm. In a more preferred embodiment, v is approximately 400 mm. In a preferred embodiment, the value of u is in the range of 10 mm to 30 mm. In a more preferred embodiment, u is approximately 20 mm. The magnification factor M is given by v/u. M is the factor by which the light modulated by the spatial light modulator is magnified in the virtual observer window. In a preferred embodiment, the value of M is in the range of 10 to 60. In a more preferred embodiment, M is about 20. To achieve such a magnification factor with good holographic image quality, the light source array and the lens array must be accurately aligned. To maintain this precise alignment and keep the distance between the light source array and the lens array constant, the device components need strong mechanical stability throughout the lifetime of the device.

The virtual observer window can be trackable or non-trackable. If the virtual observer window is trackable, particular light sources in the light source array are activated depending on the desired position of the virtual observer window. The activated light sources illuminate the spatial light modulator and are imaged by the lens array into the observer plane. For each lens in the lens array, at least one light source in the array is activated. If u is 20 mm and v is 400 mm, and the pitch of the light sources is 20 μm, the virtual observer window can be tracked in lateral increments of 400 μm. Such tracking is quasi-continuous. With u = 20 mm and v = 400 mm, f is about 19 mm.
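The numbers above follow directly from the thin-lens relation and the magnification M = v/u; a short sketch of the geometry:

```python
# Imaging geometry of the light-source array and lens array (Figure 24).
def lens_parameters(u, v):
    """Thin-lens relation 1/f = 1/u + 1/v and magnification M = v/u."""
    f = 1.0 / (1.0 / u + 1.0 / v)
    return f, v / u

u, v = 20e-3, 400e-3            # preferred values from the text
f, M = lens_parameters(u, v)

# A 20 um light-source pitch imaged to the observer plane gives the
# lateral tracking increment of the virtual observer window.
tracking_step = M * 20e-6
print(f"f = {f * 1e3:.1f} mm, M = {M:.0f}, step = {tracking_step * 1e3:.1f} mm")
# f ~ 19 mm, M = 20, tracking increment 0.4 mm (400 um), as in the text
```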

The light sources in the light source array may have only partial spatial coherence. Partial coherence leads to blurred reconstruction of object points. If u is 20 mm and v is 400 mm, and the width of a light source is 20 μm, the reconstruction of an object point located 100 mm from the display will have a lateral blur of 100 μm. This is sufficient for the resolution of the human visual system.
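The 100 μm figure is consistent with a simple geometric scaling of the source width; the sketch below assumes the blur grows in proportion to the object point's distance from the display divided by u (an assumption that reproduces the number quoted above, not a relation stated in the text):

```python
# Lateral blur of a reconstructed object point due to the finite
# (partially coherent) light-source width. Assumed geometric model:
# blur ~ source_width * z / u for a point at distance z from the display.
def reconstruction_blur(source_width, z_from_display, u):
    return source_width * z_from_display / u

blur = reconstruction_blur(source_width=20e-6, z_from_display=100e-3, u=20e-3)
print(f"{blur * 1e6:.0f} um")  # 100 um, consistent with the text
```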

No significant mutual coherence is needed between light passing through different lenses of the lens array. The requirement for coherence is limited to each single lens of the lens array. The resolution of a reconstructed object point is therefore determined by the pitch of the lens array. A typical lens pitch is of the order of 1 mm, to ensure adequate resolution for the human visual system.

The virtual observer window is limited to one diffraction order of the Fourier spectrum of the information encoded in the spatial light modulator. If the pixel pitch of the spatial light modulator is 10 μm and two pixels are needed to encode one complex number, i.e. if two-phase encoding is used on a phase-modulating electronically addressed spatial light modulator, then at a wavelength of 500 nm the virtual observer window will be 10 mm wide. Spatial or temporal multiplexing can be used to tile several virtual observer windows into an enlarged virtual observer window. In the case of spatial multiplexing, additional optical components, such as beam splitters, are required. Some multiplexing methods are described in Section C, and these methods may also be applied in the present embodiment.
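The 10 mm width follows from w = λd/(Np) with N = 2 for two-phase encoding. A minimal sketch, assuming the observer distance is the preferred v = 400 mm given earlier (the distance is not restated in this paragraph):

```python
def observer_window_width(wavelength, distance, pixel_pitch, pixels_per_complex):
    # One diffraction order of the hologram's Fourier spectrum:
    # w = lambda * d / (N * p)
    return wavelength * distance / (pixels_per_complex * pixel_pitch)

# 10 um pitch, two-phase encoding (N = 2), 500 nm wavelength;
# assumed observer distance: the preferred v = 400 mm from the text.
w = observer_window_width(500e-9, 0.4, 10e-6, 2)
print(f"{w * 1e3:.0f} mm")  # 10 mm, as stated
```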

Color hologram reconstruction can be achieved by temporal multiplexing. The red, green, and blue pixels of a color organic light emitting diode display are activated in succession, synchronously with re-encoding of the spatial light modulator with holograms calculated for the red, green, and blue optical wavelengths.

The display formed by the device components can include an eye position detector for detecting the position of the observer's eyes. The eye position detector is connected to a control unit that controls the activation of the light source in the array of light sources.

The computation of the hologram encoded on the spatial light modulator is preferably performed by an external encoding unit, because it requires substantial computational power. The display data are then sent to the personal digital assistant or mobile phone, which displays the holographic three-dimensional image.

As a practical example, a 2.6 inch screen size XGA liquid crystal display electronically addressed spatial light modulator manufactured by Sanyo (RTM) Epson (RTM) Imaging Devices Corporation of Japan can be used. The pitch of the sub-pixels is 17 μm. If this is used to construct a red-green-blue holographic display with amplitude-modulation encoding of the hologram, the observer window at a distance of 0.4 m from the electronically addressed spatial light modulator is calculated to be 1.3 mm wide. For the monochrome case, the observer window is calculated to be 4 mm wide. With the same settings but phase modulation using two-phase encoding, the observer window is calculated to be 6 mm wide. With the same settings but phase modulation using Kinoform encoding, the observer window is calculated to be 12 mm wide.
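The four widths quoted are reproduced by w = λd/(Np) under the following assumptions (none stated explicitly in this paragraph): a wavelength of 500 nm; N = 3 for amplitude-modulation (Burckhardt-type) encoding, N = 2 for two-phase encoding, N = 1 for Kinoform encoding; and, for the color case, a full color pixel spanning three 17 μm sub-pixels:

```python
def observer_window_width(wavelength, distance, pixel_pitch, pixels_per_complex):
    """w = lambda * d / (N * p), one diffraction order."""
    return wavelength * distance / (pixels_per_complex * pixel_pitch)

d, lam = 0.4, 500e-9   # observer distance from the text; wavelength assumed
sub = 17e-6            # sub-pixel pitch of the 2.6 inch XGA panel
cases = {
    "color, amplitude (N=3, color pixel = 3 sub-pixels)": (3 * sub, 3),
    "monochrome, amplitude (N=3)": (sub, 3),
    "monochrome, two-phase (N=2)": (sub, 2),
    "monochrome, Kinoform (N=1)": (sub, 1),
}
for name, (pitch, n) in cases.items():
    w = observer_window_width(lam, d, pitch, n)
    print(f"{name}: {w * 1e3:.1f} mm")
# ~1.3, 3.9, 5.9, 11.8 mm: consistent with the rounded
# 1.3 / 4 / 6 / 12 mm figures quoted above
```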

There are also other high-resolution electronically addressed spatial light modulators. Seiko (RTM) Epson (RTM) Corporation of Japan has released a monochrome electronically addressed spatial light modulator, the D4:L3D13U, with a 1.3 inch screen size and a 15 μm pixel pitch. The company has also released a panel of the same type, the D5:L3D09U-61G00, with a 0.9 inch screen size and a pixel pitch of 10 μm. On December 12, 2006, the company announced another panel of the same type, the L3D07U-81G00, with a 0.7 inch screen size and a pixel pitch of 8.5 μm. If the D4:L3D13U 1.3 inch panel is used to construct a monochrome holographic display with Burckhardt amplitude-modulation encoding of the hologram, the virtual observer window at a distance of 0.4 m from the electronically addressed spatial light modulator can be calculated to be 5.6 mm wide.

F. Compact three-dimensional image display device with holographic reconstruction of objects, comprising a compact combination of one or two organic light emitting diode arrays with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators

A compact combination of one or two organic light emitting diode arrays with an optically addressed spatial light modulator, or of one or two electronically addressed spatial light modulators, is preferred for hand-held 3D display devices, or for larger 3D display devices, because such combinations are very compact. Such a combination can be integrated into, for example, a mobile phone, a satellite navigation device, a car display, a computer game device, a personal digital assistant (PDA), a notebook computer display, a desktop computer screen, or a slim television display. Such a three-dimensional display is primarily intended for a single user. The user is typically positioned on a normal to the light-emitting surface of the device, at the distance from which the device is best viewed, for example approximately 500 mm. It is well known that users of hand-held devices tend to adjust the orientation of the device in their hand to obtain the best viewing conditions, as described in WO 01/96941. Therefore, in such a device, tracking of the user's eyes is not required, and no tracking optics, such as scanning mirrors, which would compromise compactness, are needed. However, eye tracking can be applied in other devices, for which the additional demands on the device and its power supply are not an excessive burden.

A three-dimensional image display device for satellite navigation, with holographic reconstruction of objects and comprising one or two organic light emitting diode arrays combined with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators, has the following advantages. The driver can view a three-dimensional image of route information, such as the manoeuvre to be performed at the next intersection; because three-dimensional image information corresponds more closely to the driver's perception while driving, it may be grasped more readily than two-dimensional image information. Other information on the display, such as menus, can be displayed in three dimensions. Some or all of the information on the display can be displayed in three dimensions.

A three-dimensional image display device for a vehicle, with holographic reconstruction of objects and comprising one or two organic light emitting diode arrays combined with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators, has the following advantages. The device can display three-dimensional information directly, for example when reversing, or when attempting to steer through a passage that is only slightly wider, or even narrower, than the vehicle, by displaying a three-dimensional image of the proximity of the car bumper to adjacent objects, such as a wall. Where the passage is narrower than the vehicle, the three-dimensional image display device helps the driver to recognize that the vehicle cannot pass through it. The three-dimensional image can be created using information provided by sensors mounted on the vehicle. Other vehicle information can be displayed in three dimensions, such as speed, temperature, engine revolutions per minute, or other information displayed in the vehicle. Satellite navigation information can be displayed in three dimensions on the display. Some or all of the information on the display can be displayed in three dimensions.

The size of the virtual observer window is limited by the periodic spacing of the diffraction pattern in the Fourier plane. If the pixel pitch of the organic light emitting diode display or electronically addressed spatial light modulator is close to 10 μm, then for visible light with a wavelength of 500 nm, at a distance of 500 mm, the virtual observer window (VOW) is about 10 mm to 25 mm wide, depending on the encoding of the hologram on the spatial light modulator. This is wide enough for one eye. A second virtual observer window, for the other eye, can be established by spatial or temporal multiplexing of the content of the spatial light modulator. In the absence of tracking, in order to see the best three-dimensional image, the observer must rotate or move the device and/or adjust his own position so that his eyes are in the virtual observer windows and at the optimal distance from the device.

Tiling of several virtual observer windows makes it easier to adjust the position and orientation of the display device. Two or three virtual observer windows can be juxtaposed in the x- and y-directions, so that the tiled virtual observer windows cover a larger area. Tiling can be achieved by spatial or temporal multiplexing, or by a combination of spatial and temporal multiplexing. In temporal multiplexing, light is projected sequentially into the virtual observer windows. If the virtual observer windows have different content, the spatial light modulator must be re-encoded for each window. In spatial multiplexing, the content of the different virtual observer windows is encoded in the spatial light modulator at the same time, but in different regions of the spatial light modulator. A beam splitter directs the light from the different regions of the spatial light modulator into the different virtual observer windows. A combination of spatial and temporal multiplexing can also be used.
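The tiling just described amounts to simple geometric bookkeeping: juxtaposed windows of equal width cover a contiguous region, and the system only needs to know which tile an eye falls into (for temporal multiplexing, that tile's content is what must be encoded in its time slot). The sketch below is a one-dimensional illustration; the function names and the 10 mm width are illustrative assumptions, not from the patent.

```python
# Sketch of virtual observer window (VOW) tiling bookkeeping in 1D (x only).

def tile_positions(n_tiles, vow_width_mm, center_mm=0.0):
    """Centres of n_tiles juxtaposed VOWs covering n_tiles * vow_width_mm."""
    start = center_mm - (n_tiles - 1) * vow_width_mm / 2.0
    return [start + i * vow_width_mm for i in range(n_tiles)]

def tile_containing(eye_x_mm, centers, vow_width_mm):
    """Index of the VOW tile containing the eye, or None if outside all tiles."""
    for i, c in enumerate(centers):
        if abs(eye_x_mm - c) <= vow_width_mm / 2.0:
            return i
    return None

centers = tile_positions(3, 10.0)           # three 10 mm tiles
print(centers)                              # [-10.0, 0.0, 10.0]
print(tile_containing(7.0, centers, 10.0))  # eye at +7 mm is in tile index 2
```

With three 10 mm tiles, the covered region grows from 10 mm to 30 mm, which is why tiling relaxes the positioning requirement on the observer.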

The size of a hand-held three-dimensional display device, as typically used in mobile phones or personal digital assistants, ranges from about one inch to several inches. The holographic display can have a screen size as small as one centimeter.

The three-dimensional image display device can be switched to display two-dimensional images, for example by displaying the same image to each of the viewer's eyes.

Figure 3 shows an embodiment of a three-dimensional image display device comprising a compact combination of one or two organic light emitting diode arrays with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators. The device in Figure 3 is a mobile phone 30. On the mobile phone, the user can make a call while a three-dimensional image of the other party, who is equipped with a similar device, is displayed in the screen area 31. The mobile phone has an antenna 32 for mobile communication. In other embodiments, the antenna can be located inside the body of the mobile phone 30. The mobile phone 30 is equipped with two cameras 33 and 34 for recording images corresponding to the user's left and right eyes. The left-eye and right-eye images contain stereoscopic image data. The mobile phone 30 is equipped with buttons 35 for the digits and the "*" and "#" symbols, as well as other function buttons 36, for example for navigating menus on the screen, or for switching the device on and off. Indications displayed on the buttons, such as "ON", "OFF" or "2", help to avoid confusion over orientation, and prevent the two parties to a three-dimensional video telephone conversation from becoming confused when viewing each other. In use, the viewer's two eyes are preferably coplanar with the two cameras 33 and 34, and the user's face is positioned approximately perpendicular to the screen area 31. This ensures that the two cameras 33 and 34 record parallax in the plane containing the viewer's eyes. The viewer's head is then at the optimal viewing position predetermined for the display, so that the two cameras 33 and 34 can record images of the viewer's head with the best quality at this position. The same applies to the other party in a three-dimensional video telephone call, so that both parties can conduct a two-way three-dimensional video telephone conversation with the best image quality.
In order to ensure that each viewer accurately faces the cameras 33 and 34, it may be desirable to ensure that the virtual observer window for each eye is not much larger than the eye, as this limits the error in the position and orientation of the viewer relative to the cameras. By pointing the device towards a photographic subject, the user can take a three-dimensional photograph of the subject. Alternatively, the user can be guided by small icons on the device screen in setting the optimal orientation of the device. The device can also have an eye tracking function. The device format and usage described herein can be used for devices that produce a three-dimensional image holographically, autostereoscopically, or by any other method.

During a two-way three-dimensional video call, cameras 33 and 34 record the user's right-eye and left-eye images, respectively. The data obtained from these images are used on the corresponding hand-held device of the other party in the 3D video call to create a three-dimensional image. If the three-dimensional image is produced on an autostereoscopic display, the two eye views from cameras 33 and 34 can be used directly on the autostereoscopic display. If the three-dimensional image is generated holographically, the data contained in the views from cameras 33 and 34 must be processed, for example by computing a hologram, so as to produce a suitable encoding of the image data on one or two spatial light modulators. When the three-dimensional image is produced holographically, the three-dimensional display is a holographic display. Compared to autostereoscopic displays, holographic displays provide full depth information, i.e. accommodation (eye focus) as well as parallax. A holographic display provides a holographic reconstruction of the object, i.e. a holographic reconstruction of all object points at the correct depth.

Applications of the hand-held three-dimensional display described herein include conducting a two-way three-dimensional videophone call. Another application involves the display of a three-dimensional view of an object or scene by the other party in the call, for example viewing a product prior to purchase, or inspecting damage. Another application is to assist in confirming personal identity, which can be aided by a three-dimensional display. A three-dimensional display enhances the ability to distinguish individuals who look very similar, such as twins or disguised persons. Another application involves viewing images of individuals with a view to further contact, such as in dating services, where three-dimensional images can aid decision-making. Another application includes viewing adult content using a three-dimensional display, where viewers may prefer a three-dimensional display over a two-dimensional display.

Different individuals have different distances between their eyes. In one embodiment, a three-dimensional display device with holographic reconstruction of objects has a menu option that allows the user of the display to vary the separation between the virtual observer windows projected for the left eye and the right eye. Having selected the menu option, the user presses a button on the device to increase or decrease the separation between the virtual observer windows. While this is being set, viewing the display and attempting to view the three-dimensional image, the user can select the virtual observer window separation that allows the best achievable three-dimensional image to be viewed. The selected separation can then be stored among the user's preferences. If several individuals use the device, multiple user preferences can be stored in the device. Such a menu option can be implemented even though the device may be able to track the positions of the viewer's eyes separately, because the user may select the precise separation of the virtual observer windows better than the software can. Once such a selection has been made, tracking is accelerated, because less precision is required in determining the positions of the observer's eyes once the distance between the eyes has become a fixed parameter. The ability to select a better separation of the two virtual observer windows also provides an advantage over autostereoscopic display systems, in which the separation between the left-eye and right-eye images tends to be fixed by the device hardware.

G. Planar projector system comprising one or two organic light emitting diode arrays combined with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators

Instead of projecting the light into several virtual observer windows as described in Section F, the light emitted from the device can also be projected onto a screen, a wall, or some other surface. Thus a three-dimensional display device in a mobile phone, a personal digital assistant, or another device can also be used as a pocket projector.

The quality of the hologram can be improved by using a spatial light modulator that modulates both the amplitude and the phase of the incident light. A complex-valued hologram can then be encoded on the spatial light modulator, allowing better quality of the images reconstructed on the screen or wall.

The compact combination, described in the previous sections, of one or two organic light emitting diode arrays with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators, can be used as the spatial light modulator in the projector. Since this combination is compact, the projector will also be compact. The projector can even be a mobile phone, a personal digital assistant or some other device that can be switched between a "three-dimensional display" mode and a "projector" mode.

Compared to conventional two-dimensional projectors, holographic two-dimensional projectors have the advantage of not requiring a projection lens, and the projected image is in focus at all distances in the optical far field. Conventional holographic two-dimensional projectors, such as those described in WO 2005/059881, use a single spatial light modulator, so that complex-valued modulation cannot be performed. The holographic two-dimensional projector described herein is able to perform complex-valued modulation and therefore has superior image quality.

H. Autostereoscopic or holographic display using a compact combination of one or two infrared organic light emitting diode displays and an optically addressed spatial light modulator

The compact combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator (such as described in Section A) can also be used in an autostereoscopic display (ASD), especially a hand-held autostereoscopic display in a mobile phone or personal digital assistant. For a typical viewer, however, viewing an autostereoscopic display is not as comfortable as viewing a holographic display, although in some cases an autostereoscopic display may be cheaper, or it may be easier to generate or provide the image data. The autostereoscopic display provides several viewing zones, each viewing zone displaying a different perspective view of the three-dimensional scene. If the viewer's eyes are in different viewing zones, the viewer sees a stereoscopic image. The difference between autostereoscopic displays and holography is that an autostereoscopic display provides two planar images, whereas holography provides z-information for each object point in the three-dimensional scene.

Typically, autostereoscopic displays are based on spatial multiplexing of the viewing zones on the display and use beam-splitting optical elements such as lenticular arrays, barrier masks or prism masks. Barrier masks are also called "parallax barriers". A disadvantage of autostereoscopic displays is that the resolution of each viewing zone is typically inversely proportional to the number of viewing zones. However, this disadvantage can be compensated by the advantages of the autostereoscopic display described above.
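The inverse relationship between the number of viewing zones and the per-zone resolution can be illustrated with a trivial calculation; the panel dimensions below are illustrative (an XGA-like panel), not taken from the text.

```python
# Sketch: per-view resolution of a spatially multiplexed autostereoscopic
# display. With n viewing zones multiplexed horizontally, each zone receives
# roughly 1/n of the panel's horizontal pixels.

def per_view_resolution(panel_h, panel_v, n_views):
    """Approximate (horizontal, vertical) resolution of each viewing zone."""
    return panel_h // n_views, panel_v

print(per_view_resolution(1024, 768, 2))  # two-view display: (512, 768)
print(per_view_resolution(1024, 768, 4))  # four-view display: (256, 768)
```

This is the resolution loss that the high native resolution of the compact combination is said to compensate.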

The compact combination of an infrared organic light emitting diode display and an amplitude-modulating optically addressed spatial light modulator (such as described in Section A) can be used to provide a high-resolution amplitude-modulating display. If this compact combination is combined with a beam splitter element, a high-resolution autostereoscopic display can be constructed. The high resolution of the compact combination compensates for the loss of resolution caused by spatial multiplexing.

For autostereoscopic displays, the use of a compact combination of one or more organic light emitting diode arrays with one or more optically addressed spatial light modulators (e.g. as described in Sections A and B) has the advantage that the optically addressed spatial light modulator is unpatterned. An autostereoscopic display comprising a beam splitter and a bare organic light emitting diode array may exhibit artifacts due to the patterning of the organic light emitting diode array, for example moiré effects between the periods of the beam splitter and of the organic light emitting diodes. In contrast, the information on the optically addressed spatial light modulator of the compact combination is continuous: since only the beam splitter is periodic, periodic artifacts do not occur.

The light source of the autostereoscopic display can be one or more light sources, such as light emitting diodes, lasers, organic light emitting diodes or cold cathode fluorescent lamps. The light source does not need to be coherent. If organic light emitting diodes are used and the autostereoscopic display shows a color image, a color filter layer, with for example red, green and blue filters, is required between the light source and the compact combination of the light emitting display and the amplitude-modulating optically addressed spatial light modulator.

The compact combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator (such as described in Section A) can also be used in holographic displays, especially hand-held holographic displays in mobile phones or personal digital assistants. Such a holographic display is based on spatial multiplexing of the viewing zones on the display and uses beam-splitting optical elements such as lenticular arrays, barrier masks or prism masks. Barrier masks are also called "parallax barriers". The compact combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator (such as described in Section A) can be used to make a holographic display with high resolution. If the compact combination of the infrared organic light emitting diode display and the amplitude-modulating optically addressed spatial light modulator is combined with beam splitter elements, a high-resolution holographic display can be constructed. The high resolution of the compact combination compensates for the loss of resolution caused by spatial multiplexing. In another embodiment, two compact combinations, each of an organic light emitting diode array and an optically addressed spatial light modulator, can be used in series, in a compact manner, to modulate the amplitude and the phase of the light, as described in Section B. Thus a complex number consisting of amplitude and phase can be encoded pixel by pixel in the transmitted light. If this compact combination of two infrared organic light emitting diode displays and optically addressed spatial light modulators is combined with beam splitter elements, a high-resolution holographic display can be constructed. The high resolution of the compact combination compensates for the loss of resolution caused by spatial multiplexing.
A holographic display with a beam splitter element can provide several viewing zones, each viewing zone displaying a different view of the three-dimensional scene. If the viewer's eyes are in different viewing zones, the viewer sees a stereoscopic image.

I. Data processing system for three-dimensional transmission

Figure 22 shows a data processing system for three-dimensional transmission. In Figure 22, one party 220 and another party 221 are engaged in three-dimensional transmission. The camera data used to create the images can be collected using the mobile telephone device 30 shown in Figure 3, or a device with similar functions. The data processing for the three-dimensional image display can be performed in the device of the first party 220, which can be a mobile phone 30 or an equivalent device, or can be performed in the device of the other party 221, but is preferably performed in an intermediate system 224 located on the transmission network between the two mobile phones. The transmission network comprises a first connection 222, the intermediate system 224 and a second connection 223. The two connections 222 and 223 can be wireless or wired. The intermediate system 224 can include a computer that performs the calculations required so that a three-dimensional image, such as a computer-generated hologram or an autostereoscopic image, can be displayed. It is preferable to perform the calculations on a computer on the transmission network between the two mobile phones, since the calculations then do not consume the battery power of the mobile phones, but draw on mains power instead. Computers located on the transmission network can simultaneously process the images of a large number of three-dimensional videophone calls, which allows more efficient use of computing resources, for example by reducing the amount of idle computational processing power. If the computing power required in the handset is reduced, the weight of the mobile phone or similar device is reduced, and it requires less computer circuitry and memory, because the computation is performed by a computer located on the transmission network.
Finally, the software that performs the calculations needs to be installed only on a computer located on the transmission network, and need not be installed in the mobile phone or similar device. This reduces the memory requirements of the mobile phone and the scope for software piracy, and increases the protection of any trade secrets in the code. While most of the computation required for the three-dimensional image display can be performed by the intermediate system 224, it is also possible for some image calculations to be performed in the user's device prior to data transfer. For example, if the two recorded images are very similar, the difference image between them is very easy to compute, and transmitting the first image together with the difference image is a data compression technique that facilitates data transfer. Similarly, the three-dimensional image display device can perform some image calculations, such as decompressing compressed image data.

In one example of the system of Figure 22, a first image and a second image form a stereoscopic image pair and are transmitted by the device of user 220 to the intermediate system 224 via connection 222. The second transmitted image may be a difference image between the two stereoscopic images, as the difference image typically requires less data than the full image. If a three-dimensional telephone conversation is in progress, the first image may be the difference between the current image and the image at a previous point in time; similarly, the second image may be the difference between the current image and the image at a previous point in time. Next, using a known computational procedure for conversion between two-dimensional (2D) and three-dimensional (3D) image representations, the intermediate system 224 can calculate from the received data a two-dimensional image with its corresponding depth map. For a color image, the three components of the two-dimensional image in the three primary colors are required, together with their corresponding depth maps. Next, the data comprising the two-dimensional image and the depth map are transmitted to the device of user 221 via connection 223. The device of user 221 encodes the hologram, based on the received two-dimensional image and depth map, on the spatial light modulator of its compact three-dimensional display device. In order to use the transmission bandwidth efficiently, the data transmitted in this system can be subjected to a known compression procedure, with the corresponding decompression performed in the receiving device. In choosing the amount of data compression, the battery power the mobile device spends on compression and decompression must be balanced against the bandwidth required when less data compression is used.
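The difference-image idea above can be sketched in a few lines: for two similar images, the element-wise difference is nearly constant and therefore compresses much better than the second image itself, while the receiver can reconstruct the second image losslessly. The toy images and the use of zlib below are illustrative assumptions, not the patent's actual data format or codec.

```python
# Sketch: transmit the first image plus (second - first) instead of both
# images; the difference of similar images compresses well.
import zlib

left = bytes([100, 101, 102, 103] * 64)        # toy 8-bit left-eye image
right = bytes([(v + 1) % 256 for v in left])   # nearly identical right eye

# Modular per-pixel difference image:
diff = bytes([(r - l) % 256 for l, r in zip(left, right)])

# The near-constant difference compresses at least as well as the raw image:
print(len(zlib.compress(diff)) <= len(zlib.compress(right)))  # True

# The receiver reconstructs the second image losslessly:
recovered = bytes([(l + d) % 256 for l, d in zip(left, diff)])
print(recovered == right)  # True
```

The same scheme applies between successive frames in time, as the text notes for an ongoing three-dimensional telephone conversation.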

The intermediate system 224 can access a library containing a collection of known three-dimensional shapes, in which it attempts to find a match for the three-dimensional data it computes, or it can access a library containing a collection of known two-dimensional shapes, in which it attempts to find a match for the incoming two-dimensional image data. If a good match with a known shape can be found, this can speed up the calculation process, since the two- or three-dimensional image can then be represented as corresponding to the known shape. The three-dimensional shape library can provide the face or body shapes of a group of sports stars, such as leading tennis players or footballers, as well as all or part of major sports venues, such as a famous tennis court or a famous football stadium. For example, a three-dimensional image of a human face can be represented as an item already held by the intermediate system 224, plus changes of facial expression, such as a smile or a frown, plus changes in hair length, since hair may have grown or been cut since the data were stored. If a set of persistent differences arises, indicating that the records held by the intermediate system 224 are significantly out of date, for example because over a long period the length of a person's hair has changed markedly, then the data held by the intermediate system 224 can be updated by it. If the intermediate system 224 encounters a two- or three-dimensional image for which no match is found among the records it holds, it can add the new shape to its set of records.
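The library-matching step can be sketched as a nearest-neighbour search with a match threshold, adding a new record when nothing matches. The shape representation (a flat tuple of sampled values), the distance measure and the threshold are illustrative assumptions, not details from the patent.

```python
# Sketch: match newly computed shape data against a library of known shapes,
# appending the new shape as a fresh record if no stored shape matches well.

def shape_distance(a, b):
    """Sum of squared differences between two equal-length shape vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match_or_add(library, shape, threshold=1.0):
    """Return the index of the best match; add `shape` if none is close."""
    best = min(range(len(library)),
               key=lambda i: shape_distance(library[i], shape),
               default=None)
    if best is not None and shape_distance(library[best], shape) <= threshold:
        return best
    library.append(shape)
    return len(library) - 1

library = [(0.0, 0.0, 1.0), (5.0, 5.0, 5.0)]
print(match_or_add(library, (0.1, 0.0, 1.0)))  # close to record 0 -> 0
print(match_or_add(library, (9.0, 9.0, 9.0)))  # no match -> appended as 2
```

A matched shape can then be transmitted as a library index plus small corrections (expression, hair length), which is the compression benefit the text describes.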

J. System for assisting the conversion of 2D image content to 3D image content

One of the difficulties in the widespread adoption of 3D display technology is the fact that very little content is produced in a three-dimensional format; most content continues to be produced in a two-dimensional format. This is partly because most image recording devices in use today continue to record two-dimensional images, providing no data usable for three-dimensional images. In addition, viewers currently have very few opportunities to request three-dimensional content, or to obtain three-dimensional content generated from two-dimensional content.

What is clearly needed is a system that supports the generation of 3D content from 2D content. Such a system is shown in Figure 23. In Figure 23, even though the viewer 2302 has a three-dimensional display device at home, the television broadcasting company 2300 continues to broadcast two-dimensional television images 2304. In this system, with an intermediate system 2301, the two-dimensional content can be converted into three-dimensional content 2305. Such a conversion procedure may be paid for by the viewer, or may be paid for by other parties, such as an advertiser 2303. In Figure 23, when an advertisement of the advertiser 2303 is broadcast by the television company 2300, the advertiser 2303 pays a fee 2306 to the intermediate system 2301, which converts the known two-dimensional content into three-dimensional content by the conversion procedure. The benefit to the advertiser is that its advertisement is presented to the viewer 2302 as a three-dimensional television commercial, which attracts more attention than a two-dimensional television commercial. Alternatively, the viewer 2302 may pay a fee to the intermediate system 2301 to have some or all of the television broadcast converted to and received in a three-dimensional format. The intermediate system ensures that the three-dimensional content is provided in a correct and synchronized format. For example, if the two-dimensional images have corresponding depth maps, the two data sets are provided in a synchronized manner, that is, the three-dimensional display device uses a depth map with its corresponding two-dimensional image, and does not use a depth map with a non-corresponding two-dimensional image. The three-dimensional display device can be a holographic display device, an autostereoscopic display device, or any conventional three-dimensional display device. The data provided to the three-dimensional display device should be suitable for the type of three-dimensional display device.
Systems similar to the above are also applicable to content provided by companies other than television broadcasters, such as film or video tape providers.

In another system, the viewer can pay the intermediate system for the conversion of two-dimensional content and receive a three-dimensional form of the provided two-dimensional content. The provided two-dimensional content can be, for example, an MPEG file of a home movie, other video content, or an image such as a photograph or a picture.

The intermediate system can include a computer that performs the calculations required so that the three-dimensional image, such as a computer-generated hologram or an autostereoscopic image, can be displayed. It is preferable to perform the calculations using a computer on the transmission network between the two-dimensional content provider and the viewer who wishes to view the three-dimensional image content, since this is more efficient than executing such a procedure on the viewer's side. Computers located on the transmission network can perform image processing for a large number of 2D-to-3D content conversions at the same time, which allows more efficient use of computing resources, for example by reducing the amount of idle computational processing power. If the required computing power is reduced, the cost of the viewer's 3D display device is reduced, because it requires less computer circuitry and memory, the computation being performed by a computer located on the transmission network. Finally, the software that performs the calculations needs to be installed only on a computer located on the transmission network, and need not be installed in the viewer's 3D display device. This reduces the memory requirements of the viewer's three-dimensional display device and the scope for software piracy, and increases the protection of any trade secrets in the code. Although most of the calculations required for the 3D image display can be performed by the intermediate system, some image calculations may be executed in the viewer's three-dimensional display device. The three-dimensional image display device may perform some image calculations, such as decompressing compressed image data, or generating the holographic encoding for the spatial light modulator from the two-dimensional image and its corresponding depth map.

In one example, the intermediate system can calculate a corresponding depth map for a received two-dimensional image, using a computational procedure for conversion between two-dimensional and three-dimensional image representations. For a color image, the three components of the two-dimensional image in the three primary colors are required, together with their corresponding depth maps. Next, the data comprising the two-dimensional image and the depth map are transmitted to the viewer's three-dimensional display device. The viewer's 3D display device encodes the hologram, based on the received 2D image and depth map, on its spatial light modulator. In order to use the transmission bandwidth efficiently, the data transmitted in this system can be subjected to a known compression procedure, with the corresponding decompression performed in the receiving device. In choosing the amount of data compression, the cost of providing data decompression in the 3D display device must be balanced against the bandwidth required when less data compression is used.
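The final step, generating a hologram from a 2D image plus its depth map, can be sketched as superposing the wavefront contributions of each image point placed at its depth. The one-dimensional treatment, the wavelength, pitch and point placement below are illustrative assumptions; a real encoder would work in 2D, use faster (e.g. paraxial) approximations, and then map the complex values to the spatial light modulator's encoding scheme (e.g. Burckhardt).

```python
# Sketch: 1D computer-generated hologram from (amplitude, depth) object points
# derived from a 2D image and its depth map.
import cmath

WAVELENGTH = 633e-9   # assumed red laser wavelength
PITCH = 10e-6         # assumed SLM pixel pitch
N_PIXELS = 512

def hologram_1d(amplitudes, depths_m):
    """Complex hologram: superpose spherical-wave phases of all object points."""
    k = 2 * cmath.pi / WAVELENGTH
    holo = [0j] * N_PIXELS
    for p, (a, z) in enumerate(zip(amplitudes, depths_m)):
        x_obj = p * PITCH * N_PIXELS / len(amplitudes)  # spread points laterally
        for i in range(N_PIXELS):
            r = ((i * PITCH - x_obj) ** 2 + z ** 2) ** 0.5  # point-to-pixel path
            holo[i] += a * cmath.exp(1j * k * r)
    return holo

# Two object points from a tiny "image + depth map":
h = hologram_1d([1.0, 0.5], [0.10, 0.12])
print(len(h))  # 512 complex samples to be encoded on the SLM
```

Each point contributes a Fresnel-zone-like phase pattern, so reconstructing the hologram places every object point at its correct depth, which is the property the text attributes to holographic displays.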

The intermediate device can access data of a known set of three-dimensional shapes and attempt to find a match to the three-dimensional data it computes, or it can access a set of known two-dimensional graphics and attempt to find a stable pairing with the two-dimensional image data. This can speed up the calculation process if a good pairing with a known shape can be found, since a two- or three-dimensional image can then be represented as corresponding to a known shape. A three-dimensional shape library can provide the face or body shapes of a group of sports stars, such as the leading tennis players or football players, and all or part of the main sports venues, such as famous tennis courts or famous football stadiums. For example, a three-dimensional image of a human face can be represented as an item already accessed by the intermediate device, plus facial expression changes, such as a smile or frown, plus a change in the length of the hair, because the hair may have grown or been cut short after the data was stored. If a set of persistent differences occurs, indicating that the records accessed by the intermediate device are significantly out of date, for example if over a long period of time the length of a person's hair has changed significantly, then the data accessed by the intermediate device can be updated by the intermediate device 224. If the intermediate device encounters a two- or three-dimensional image for which no pairing can be found in the records it has accessed, it can add the newly calculated three-dimensional shape to the set of records.

K. Spatial multiplexing and two-dimensional encoding of observer windows

This embodiment relates to the spatial multiplexing of virtual observer windows (VOWs) for holographic displays, combined with the use of two-dimensional encoding. In addition, the holographic display can be as described in Sections A, B, C or D, or any conventional holographic display.

It is known to generate a number of virtual observer windows, such as a virtual observer window for the left eye and a virtual observer window for the right eye, by spatial or temporal multiplexing. With spatial multiplexing, the two virtual observer windows are generated at the same point in time and are separated by a beam splitter, similar to an autostereoscopic display, as described in WO 2006/027228. With temporal multiplexing, the virtual observer windows are generated sequentially in time.

However, conventional holographic display systems have some drawbacks. For spatial multiplexing, the illumination system used is spatially non-coherent in the horizontal direction and is based on horizontal line sources and a lenticular array, as shown in Figure 4 of the prior art WO 2006/027228. This has the advantage that known techniques from autostereoscopic displays can be utilized. However, its disadvantage is that holographic reconstruction in the horizontal direction is impossible. Instead, so-called 1D encoding is used, which produces holographic reconstruction and motion parallax only in the vertical direction. Thus, the vertical focus is in the plane of the reconstructed object while the horizontal focus is in the plane of the spatial light modulator. This astigmatism reduces the quality of spatial vision, meaning that it reduces the quality of the holographic reconstruction received by the viewer. Temporal multiplexing systems have the disadvantage that they require fast spatial light modulators, which are not yet readily available in all display sizes.

Only two-dimensional encoding provides holographic reconstruction in both the horizontal and the vertical direction; therefore two-dimensional encoding does not produce the astigmatism that reduces the quality of spatial vision, i.e. the quality of the holographic reconstruction received by the viewer. The purpose of this embodiment is therefore to achieve spatial multiplexing of virtual observer windows in conjunction with two-dimensional encoding.

In this embodiment, the illumination has local spatial coherence in both the horizontal and the vertical direction, in combination with a beam splitter that splits the light into light for the left eye virtual observer window and light for the right eye virtual observer window. Therefore, the diffraction at the beam splitter must be considered. The beam splitter can be a prism array, a second lens array (such as a static array or a variable array, as shown in Figure 20) or a barrier mask.

An example of this embodiment is shown in Figure 25. Figure 25 is a schematic diagram of a holographic display comprising a light source of a two-dimensional light source array, a lens of a two-dimensional lens array, a spatial light modulator, and a beam splitter. The beam splitter splits the light exiting the spatial light modulator into two bundles that illuminate the virtual observer window for the left eye (VOWL) and the virtual observer window for the right eye (VOWR), respectively. In this example, the number of light sources is one or more, and the number of lenses equals the number of light sources.

In this example, the beam splitter is located after the spatial light modulator. The positions of the beam splitter and the spatial light modulator can also be interchanged. An example of this embodiment is shown in Fig. 26, in which a prism array is used as a beam splitter, in plan view. The illumination device comprises a two-dimensional array of n light sources (LS1, LS2, ... LSn) and a two-dimensional lens array of n lenses (L1, L2, ... Ln); only two light sources and two lenses are shown in Figure 26. Each light source is imaged onto the observer plane by its associated lens. The pitch of the light source array and the pitch of the lens array are such that all of the light source images coincide on the observer plane, i.e. the plane containing the two virtual observer windows. In Figure 26, the left eye virtual observer window (VOWL) and the right eye virtual observer window (VOWR) are not shown because they lie outside the figure, to its right. An additional field lens can be added. In order to provide sufficient spatial coherence, the pitch of the lens array is similar to the typical size of a sub-image, i.e. one to several millimetres. The illumination from each lens is spatially coherent in the horizontal and vertical directions because each light source is small or a point source and because a two-dimensional lens array is used. The lens array can be refractive, diffractive or holographic.

In this example, the beam splitter is a one-dimensional array of vertical prisms. Light incident on one prism face is deflected to the left eye virtual observer window (to VOWL), and light incident on the other prism face is deflected to the right eye virtual observer window (to VOWR). Light rays originating from the same light source and the same lens remain mutually coherent after passing through the beam splitter. Therefore, two-dimensional encoding with vertical and horizontal focusing and vertical and horizontal motion parallax is possible.

The hologram is encoded on the spatial light modulator with two-dimensional encoding. The holograms for the left and right eyes are interleaved column by column, meaning that columns of holographic information for the left eye and for the right eye alternate. Preferably there is one column of left eye hologram information and one column of right eye hologram information under each prism. Alternatively, there may be two or more hologram columns under each prism face, such as three columns for the left eye virtual observer window followed by three columns for the right eye virtual observer window. The pitch of the prisms of the beam splitter can be the same as the pitch of the spatial light modulator, or an integer (for example, two or three) multiple of it; or, in order to allow for perspective shortening, the pitch of the prisms can be slightly smaller than the pitch of the spatial light modulator, or slightly smaller than an integer (for example, two or three) multiple of it.

Light emitted from a column encoded with the left eye hologram reconstructs the object for the left eye and illuminates the left eye virtual observer window (VOWL); light emitted from a column encoded with the right eye hologram reconstructs the object for the right eye and illuminates the right eye virtual observer window (VOWR). Therefore, each eye sees its appropriate reconstruction. If the pitch of the prism array is sufficiently small, the eye cannot resolve the prism structure and the prism structure does not interfere with the reconstruction of the hologram. Each eye sees a reconstruction with full focus, full motion parallax and no astigmatism.

There will be diffraction at the beam splitter, because coherent light illuminates it. The beam splitter can be viewed as a diffraction grating that produces multiple diffraction orders. The inclined prism faces have the effect of a blazed grating. For a blazed grating, the maximum intensity is directed to a particular diffraction order. For the prism array, one intensity maximum will be directed from one face of each prism to a diffraction order at the left eye virtual observer window, and the other intensity maximum will be directed from the other face to a diffraction order at the right eye virtual observer window. More precisely, the maximum of the enveloping sinc-squared intensity function is moved to these positions, while the diffraction orders remain at fixed positions. The prism array thus produces one maximum of the sinc-squared function at the position of the left eye virtual observer window and another maximum of the sinc-squared function at the position of the right eye virtual observer window. The intensity in the other diffraction orders will be small (meaning that the maximum of the sinc-squared intensity function is narrow) and will not cause disturbing crosstalk, because the fill factor of the prism array is large, for example close to 100%.
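
The blazed-grating behaviour described above can be illustrated numerically. In the simple scalar model below (a sketch, assuming a rectangular aperture profile and a fill factor of 100%), the intensity of diffraction order m is proportional to sinc²(m − m_b), where m_b is the order onto which the blaze moves the envelope maximum; with a 100% fill factor all other integer orders then receive essentially zero intensity, which is why crosstalk from the other diffraction orders is negligible:

```python
import math

def sinc2(x):
    """Normalised sinc squared, (sin(pi*x)/(pi*x))**2, with sinc2(0) = 1."""
    if x == 0:
        return 1.0
    s = math.sin(math.pi * x) / (math.pi * x)
    return s * s

# Assumption for illustration: the blaze directs the envelope maximum
# exactly onto order +1 (e.g. the order at the left eye VOW).
blaze_order = 1
intensities = {m: sinc2(m - blaze_order) for m in range(-3, 4)}

assert intensities[blaze_order] == 1.0       # full intensity at the blazed order
assert all(intensities[m] < 1e-30            # all other orders suppressed
           for m in intensities if m != blaze_order)
```

With a fill factor below 100% the envelope widens (sinc²(f·(m − m_b)) with f < 1) and the suppression of neighbouring orders is no longer exact, consistent with the text's remark that a large fill factor is needed.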

As can be seen in the prior art, in order to provide virtual observer windows to two or more observers, multiple virtual observer windows can be generated by using a more complex prism array, for example one with two types of prisms having the same apex angle but different asymmetry, arranged adjacently in alternation. However, a static prism array cannot track observers individually.

In another example, more than one light source can be used per lens. The additional light sources of each lens can be used to create additional virtual observer windows for additional observers. This is analogous to the example in WO 2004/044659 (US 2006/0055994), in which one lens and m light sources are provided for m observers. In the present example, m light sources per lens and twofold spatial multiplexing are used to generate m left virtual observer windows and m right virtual observer windows for m observers. The m light sources of each lens are thus in an m-to-one correspondence with that lens, where m is an integer.

An example of this embodiment follows. A 20 inch screen size was used with the following parameter values: observer distance 2 m, pixel pitch 69 μm in the vertical direction and 207 μm in the horizontal direction, Burckhardt encoding, and optical wavelength 633 nm. The Burckhardt encoding is in the vertical direction, with a sub-pixel pitch of 69 μm and a virtual observer window height (vertical period) of 6 mm. Ignoring the perspective shortening, the vertical prism array has a pitch of 414 μm; that is, there are two spatial light modulator columns under each prism. The horizontal period in the observer plane is therefore 3 mm. This is also the width of the virtual observer window. This width is smaller than the pupil of the eye, which is typically about 4 mm in diameter. In another, similar example, if the spatial light modulator has a smaller pitch of 50 μm, the virtual observer window will have a width of 25 mm.
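
The virtual observer window sizes quoted above follow from the standard diffraction relation w = λ·d / p, where p is the pitch of one encoded complex number (vertically) or of the prism array (horizontally). A sketch, using only the numbers stated in the example:

```python
# Virtual observer window size: w = wavelength * observer_distance / pitch
def vow_size(wavelength_m, distance_m, pitch_m):
    return wavelength_m * distance_m / pitch_m

wavelength = 633e-9   # m
distance = 2.0        # m, observer distance

# Vertical: Burckhardt encoding, 3 sub-pixels of 69 um per complex number
vow_height = vow_size(wavelength, distance, 3 * 69e-6)    # ~6 mm
# Horizontal: prism array pitch of 414 um
vow_width = vow_size(wavelength, distance, 414e-6)        # ~3 mm
# Variant with a 50 um spatial light modulator pitch
vow_width_fine = vow_size(wavelength, distance, 50e-6)    # ~25 mm
```

Evaluating these reproduces the 6 mm, 3 mm and 25 mm figures given in the text (6.12 mm, 3.06 mm and 25.3 mm before rounding).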

If the observer's eyes are separated by 65 mm (a typical adult value), the prisms must deflect the light by ±32.5 mm at the point where the light intersects the plane containing the virtual observer windows. More precisely, the maximum of the enveloping sinc-squared intensity function needs to be deflected by ±32.5 mm. This corresponds to an angle of ±0.93° for an observer distance of 2 m. For a prism refractive index n = 1.5, a suitable prism angle is ±1.86°. The prism angle is defined as the angle between the base and the inclined face of the prism.
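
These deflection figures can be checked with the thin-prism approximation, in which a prism of refractive index n and prism angle α deviates a ray by approximately (n − 1)·α. A sketch using the numbers above:

```python
import math

eye_separation = 65e-3   # m, typical adult value from the text
distance = 2.0           # m, observer distance
n = 1.5                  # prism refractive index

# Each prism face must deflect the light by half the eye separation
deflection = eye_separation / 2                            # +/- 32.5 mm
deflection_deg = math.degrees(math.atan(deflection / distance))

# Thin-prism approximation: deviation ~ (n - 1) * prism_angle
prism_angle_deg = deflection_deg / (n - 1)

print(round(deflection_deg, 2), round(prism_angle_deg, 2))  # 0.93 1.86
```

This reproduces the ±0.93° deflection angle and ±1.86° prism angle stated in the text.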

For a horizontal period of 3 mm in the observer plane, the position of the other eye is at a distance of approximately 21 diffraction orders (i.e. 65 mm divided by 3 mm). Crosstalk between the left eye virtual observer window and the right eye virtual observer window caused by the higher diffraction orders of the other virtual observer window is therefore negligible.

In order to implement tracking, light source tracking is a simple method, meaning that the positions of the light sources are adapted. If the spatial light modulator is not in the same plane as the prism array, there will be a tracking-dependent lateral offset caused by the parallax between the spatial light modulator pixels and the prisms. This may cause disturbing crosstalk. In the above example with a 20-inch screen size, a pixel may have a fill factor of 70% in the direction perpendicular to the axis formed by the prism tips; that is, for a pixel size of 145 μm there is a 31 μm inactive area on each side. If the structured side of the prism array faces the spatial light modulator, the separation between the prism array and the spatial light modulator may be approximately 1 mm. The horizontal tracking range without crosstalk will then be ±31 μm / 1 mm * 2 m = ±62 mm. If small crosstalk is tolerable, the tracking range will be larger. This tracking range is not very large, but it is sufficient to allow some tracking, so that the viewer is subject to fewer restrictions on the placement of his or her eyes.
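
The quoted tracking range follows from simple similar-triangle geometry between the pixel's inactive border, the separation of the prism array from the spatial light modulator, and the observer distance. A sketch with the numbers from the example:

```python
# Tracking range without crosstalk, from similar triangles:
# range = (inactive border per pixel side / SLM-to-prism separation) * distance
inactive_border = 31e-6   # m per side (70% fill factor of a 207 um pixel)
separation = 1e-3         # m, prism array to spatial light modulator
distance = 2.0            # m, observer distance

tracking_range = inactive_border / separation * distance   # +/- 62 mm
print(round(tracking_range * 1e3, 1))                       # 62.0 (mm)
```

This reproduces the ±62 mm horizontal tracking range stated in the text.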

The parallax between the spatial light modulator and the prism array can be avoided. Preferably, the prism array is integrated into, or attached directly to, the spatial light modulator (for example as a refractive, diffractive or holographic prism array). This would be a specialized component for the product. Another option is lateral mechanical movement of the prism array, although this is less recommended because moving mechanical parts make the device more complicated.

Another key issue is the fixed separation of the virtual observer windows, as determined by the prism angle. This can be a problem for observers with non-standard eye separation, or for z-tracking. One solution is to use prisms with encapsulated liquid-crystal domains, as shown in Figure 21. The electric field can then control the refractive index, and hence the deflection angle. This solution can be combined with a prism array, to provide a variable deflection in addition to the fixed deflection. In another solution, a liquid crystal layer can cover the structured side of the prism array; again, the electric field can control the refractive index and hence the deflection angle. If the virtual observer windows have a width large enough to accommodate different eye separations and z-tracking, a variable deflection is not required.

A more complex solution is to use a controllable prism array, such as an electrowetting prism array (as shown in Figure 27) or prisms filled with liquid crystal (as shown in Figure 21). In Fig. 27, the layer containing the prism elements 159 includes electrodes 1517, 1518 and a cavity filled with two separated liquids 1519, 1520. Each liquid fills a prismatic portion of the cavity. As an example, the liquids can be oil and water. The slope of the interface between the liquids 1519, 1520 is determined by the voltage applied to the electrodes 1517, 1518. If the liquids have different refractive indices, the beam will be deflected, as determined by the voltage applied to the electrodes 1517, 1518. The prism element 159 is thus a controllable beam-steering element. Providing tracking of the virtual observer windows to the observer's eyes is an important feature of the applicant's approach to electro-holography. Tracking of the virtual observer windows to the observer's eyes with prism elements is described in patent applications nos. DE 102007024237.0 and DE 102007024236.2.

This is an embodiment for a compact handheld display. Seiko (RTM) Epson (RTM) Corporation of Japan has published a monochrome electrically addressed spatial light modulator, the D4:L3D13U, with a 1.3 inch screen size. An illustrative example uses the D4:L3D13U liquid crystal display panel as the spatial light modulator. It has HDTV resolution (1920 x 1080 pixels), 15 μm pixel pitch and a 28.8 mm x 16.2 mm panel area. This panel is typically used in 2D image projection displays.

This example is calculated for a wavelength of 633 nm and an observer distance of 50 cm. For this amplitude-modulating spatial light modulator, detour-phase encoding (Burckhardt encoding) is used: three pixels are required to encode one complex number. These three associated pixels are arranged vertically. If the prism-array beam splitter is integrated into the spatial light modulator, the pitch of the prism array will be 30 μm. If there is a separation between the spatial light modulator and the prism array, the pitch of the prism array will be slightly different, to handle the perspective shortening.

The height of the virtual observer window is determined by the 3 * 15 μm = 45 μm vertical pitch of one encoded complex number and is 7.0 mm. The width of the virtual observer window is determined by the 30 μm pitch of the prism array and is 10.6 mm. Both values are greater than the pupil of the eye. Therefore, if the virtual observer windows are at the positions of the eyes, the hologram reconstruction can be seen by each eye. The holographic reconstruction is derived from a two-dimensionally encoded hologram, so there is none of the astigmatism inherent in the one-dimensional encoding described above. This ensures high spatial visual quality and a high-quality depth impression.
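
These handheld-display window sizes can be verified with the same relation w = λ·d / p used throughout this embodiment (a sketch; the 633 nm wavelength is taken from the example):

```python
wavelength = 633e-9   # m, as in the example
distance = 0.5        # m, observer distance

# Vertical: 3 pixels of 15 um encode one complex number (Burckhardt encoding)
vow_height = wavelength * distance / (3 * 15e-6)   # ~7.0 mm
# Horizontal: prism array pitch of 30 um
vow_width = wavelength * distance / 30e-6          # ~10.6 mm

print(round(vow_height * 1e3, 2), round(vow_width * 1e3, 2))  # 7.03 10.55
```

Both results round to the 7.0 mm and 10.6 mm values stated in the text.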

When the eye separation is 65 mm, the prisms must deflect the light by ±32.5 mm. More precisely, the maximum of the enveloping sinc-squared intensity function needs to be deflected by ±32.5 mm. For an observer distance of 0.5 m, this corresponds to an angle of ±3.72°. For a refractive index n = 1.5, a suitable prism angle is ±7.44°. The prism angle is defined as the angle between the base and the inclined face.

For a horizontal period of 10.6 mm in the observer plane, the position of the other eye is at a distance of about 6 diffraction orders (i.e. 65 mm divided by 10.6 mm). The crosstalk caused by the higher diffraction orders is negligible because the prism array has a high fill factor, meaning close to 100%.

This is an embodiment for use with large displays. The holographic display can be designed with a phase-modulating spatial light modulator with a pixel pitch of 50 μm and a screen size of 20 inches. For applications such as television, the screen size can be larger, for example 40 inches. The observer distance for this design is 2 m and the wavelength is 633 nm.

Two phase-modulating pixels of the spatial light modulator are used to encode one complex number. The two associated pixels are arranged vertically and the corresponding vertical pitch is 2 * 50 μm = 100 μm. If the prism array is integrated into the spatial light modulator, the horizontal pitch of the prism array is also 2 * 50 μm = 100 μm, since each prism contains two faces and each face is used for one spatial light modulator column. The resulting virtual observer window has a width and height of 12.7 mm, which is greater than the pupil of the eye. Therefore, if the virtual observer windows are at the positions of the eyes, the hologram reconstruction can be seen by each eye. The holographic reconstruction is derived from a two-dimensionally encoded hologram, so there is none of the astigmatism inherent in one-dimensional encoding. This ensures high spatial visual quality and a high-quality depth impression.
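
Again the 12.7 mm window size follows from w = λ·d / p with the 100 μm complex-number pitch (a sketch using only numbers stated above):

```python
wavelength = 633e-9   # m
distance = 2.0        # m, observer distance
pitch = 2 * 50e-6     # m, two phase pixels encode one complex number

vow = wavelength * distance / pitch    # ~12.7 mm in both directions
print(round(vow * 1e3, 2))             # 12.66
```

The exact value is 12.66 mm, which the text rounds to 12.7 mm.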

When the eye separation is 65 mm, the prisms must deflect the light by ±32.5 mm. More precisely, the maximum of the enveloping sinc-squared intensity function needs to be deflected by ±32.5 mm. For an observer distance of 2 m, this corresponds to an angle of ±0.93°. For a refractive index n = 1.5, a suitable prism angle is ±1.86°. The prism angle is defined as the angle between the base and the inclined face.

The above examples are for observers at 50 cm and 2 m from the spatial light modulator. In summary, this embodiment can be applied for an observer at a distance of between 50 cm and 2 m from the spatial light modulator. The screen size can range from 1 cm (such as a mobile phone sub-display) to 50 inches (such as a large television).

Laser source

RGB solid-state laser sources, such as those based on GaInAs or GaInAsN, are suitable light sources for compact holographic displays because they are compact and have high light directionality. Such light sources include the RGB vertical cavity surface emitting lasers (VCSELs) manufactured by Novalux (RTM) Inc., CA, USA. Such a light source can be provided as a single laser or as a laser array, and each light source can use a diffractive optical element to produce multiple beams. The beams can be transmitted in a multimode fibre: if the coherence is too high for use in a compact holographic display, this may reduce the degree of coherence so that unwanted artefacts, such as laser speckle patterns, are not produced. The array of laser light sources can be one- or two-dimensional.

Organic light emitting diode material

Infrared organic light emitting diode materials have been described. For example, Del Caño et al. reported electroluminescence in an organic light emitting diode material based on perylenediimide-doped tris(8-quinolinolato) aluminium, in Applied Physics Letters vol. 88, 071117 (2006).

Electroluminescence at a wavelength of 805 nm was demonstrated. Domercq et al., J. Phys. Chem. B vol. 108, 8647-8651 (2004), describe materials for near-infrared organic light emitting diodes. The preparation of organic light emitting diode materials on transparent substrates has been described. For example, in US 7,098,591, organic light emitting diode materials are prepared on transparent indium tin oxide electrodes. The electrode is prepared on a transparent substrate, which may be borosilicate glass. These constituent elements may be included in an organic light emitting diode device having a transparent substrate. The indium tin oxide layer can be sputtered onto the substrate using a radio-frequency magnetron sputtering tool. Indium tin oxide can be sputtered using a target containing indium oxide and tin oxide. The indium tin oxide layer can have an optical transmission of about 85% in the visible range. The indium tin oxide should be smooth, to avoid locally enhanced electric fields, which may degrade the performance of the organic light emitting diode material. A root-mean-square roughness of less than about 2 nm is preferred. One or several active organic layers may be disposed on the patterned electrode surface. The thickness of the organic layers is typically between 2 nm and 200 nm. A conductive layer can be patterned on the organic layer to form an anode and a cathode on either side of the organic layer. The device may be sealed with a layer of glass to protect the active layers from environmental damage.

Outline of manufacturing process

An outline of the procedure for making the apparatus of Figure 2 is described below, although many variations of this procedure will be found in the prior art.

In the process of manufacturing the device of Fig. 2, a transparent substrate is selected. Such a substrate may be a rigid substrate, such as a borosilicate glass sheet about 200 μm thick, or it may be a flexible substrate, such as a polymer substrate of polycarbonate, acrylic, polypropylene, polyurethane, polystyrene, polyvinyl chloride or the like. As described in the previous section, a transparent electrode is prepared on the glass. As described in the previous section, the infrared organic light emitting diode material is disposed on the glass, and electrical contacts are disposed on the other side of the transparent electrode, so that pixellated emission of infrared light by the organic light emitting diodes is possible. The glass substrate can have recesses in which the organic light emitting diode pixel material is provided. The infrared organic light emitting diode material can be printed, sprayed or solution-processed onto the transparent substrate. A sealing layer, which is also an electrically insulating layer, is then disposed on the organic light emitting diode pixel layer. Such a sealing layer may be an inorganic insulator layer such as silicon dioxide, silicon nitride or silicon carbide, or it may be a polymer layer, for example epoxy. The layer can be deposited by sputtering or by chemical vapour deposition in the case of an inorganic insulating layer, or by printing or coating in the case of a polymer layer. The sealing layer, which is also an electrically insulating layer, may have a thickness of a few microns, or less than 10 microns. Next, the photosensitive layer of the optically addressed spatial light modulator is deposited over the sealing layer. The photosensitive layer is sensitive to infrared light and transparent to visible light, and it can have a thickness of several micrometres. Such optical properties can be provided by dyes that absorb infrared light.
The optically addressed spatial light modulator is then completed by disposing a liquid crystal layer between two conductive layers. The liquid crystal layer can be configured for amplitude modulation or phase modulation, and typically has a thickness of a few microns. Next, an infrared filter layer is disposed on the device. This may take the form of a thin polymer layer containing infrared-absorbing pigments, or it may be an inorganic layer, such as a thin layer of silica containing an infrared-absorbing component, grown by sputtering or chemical vapour deposition.

The layer between the two optically addressed spatial light modulator devices must be sufficiently thick to ensure that the electric field in one optically addressed spatial light modulator does not affect the performance of the other optically addressed spatial light modulator device. The infrared filter layer can be made thick enough to achieve this. However, if the infrared filter layer is not thick enough, the optically addressed spatial light modulator devices can be joined by a glass sheet of sufficient thickness, for example using an optical adhesive, or the layer thickness can be increased by disposing another optically transparent layer, for example an inorganic layer or a polymer layer as described above. In any event, the two optically addressed spatial light modulator devices must not be too far apart, or optical diffraction effects will cause pixel crosstalk. For example, if the pixel width is 10 microns, the optically addressed spatial light modulator layers should preferably be less than 100 microns apart. The liquid crystal layer in one of the optically addressed spatial light modulators is configured to perform amplitude modulation; the liquid crystal layer in the other optically addressed spatial light modulator is configured to perform phase modulation.

Other portions of the device can be fabricated using the methods described above for each of the optically addressed spatial light modulator and organic light emitting diode layers. Alternatively, the other portions of the device may be fabricated as a single component and then bonded to the first portion of the device, using, for example, a glass layer to ensure adequate separation between the optically addressed spatial light modulator layers, so that the electric field of each optically addressed spatial light modulator does not affect the function of the other optically addressed spatial light modulator. Preparing the remainder of the device by disposing additional material onto the first portion of the device has the advantage of facilitating precise alignment of the pixels of the second organic light emitting diode layer with the pixels of the first organic light emitting diode layer.

It is also possible to use a thin spacer layer coated with a conducting transparent electrode (e.g. indium tin oxide), instead of a spacer layer of sufficient thickness, in close proximity to the optically addressed spatial light modulator. This electrode acts as a common electrode for the two liquid crystal layers. Furthermore, being a conductive electrode, it is an equipotential surface. It therefore screens the electric field and prevents electric field leakage from one optically addressed spatial light modulator to the other optically addressed spatial light modulator.

Fig. 9 shows an example of a device structure which can be manufactured by the above procedure or a similar one. During use, surface 909 illuminates device structure 910 in Fig. 9 with substantially coherent visible light, such that a viewer at point 911 can see a three-dimensional image at some distance from the device (the distance being related to the dimensions of the device). The layers in the device, from 90 to 908, are not drawn to scale relative to one another. Layer 90 is a base layer, such as a glass layer. Layer 91 is an organic light emitting diode backplane layer that provides the organic light emitting diode power supply and may be wholly or partially transparent. Layer 92 is an array of infrared organic light emitting diodes. Layer 93 is a Bragg-filter holographic element for at least partially directing the infrared light. In some embodiments, layer 93 can be omitted. Layer 94 is an electrically insulating layer. Layer 95 is an optically addressed spatial light modulator photosensitive and electrode layer. Layer 96 is a liquid crystal layer for amplitude modulation of the visible beam. Layer 97 is a separator layer, in particular a thin separator layer. Layer 98 is a transparent electrode layer. Layer 99 is a linear polarizing layer. Layer 900 is an infrared filter layer that transmits visible light but blocks infrared light from the organic light emitting diode arrays 92 and 906. Layer 901 is a liquid crystal layer for phase modulation of the visible beam. Layer 902 is a separator layer, in particular a thin separator layer. Layer 903 is an optically addressed spatial light modulator photosensitive and electrode layer. Layer 904 is an electrically insulating layer. Layer 905 is a Bragg-filter holographic element for at least partially directing the infrared light. In some embodiments, layer 905 can be omitted. Layer 906 is an array of infrared organic light emitting diodes.
Layer 907 is an organic light emitting diode backplane layer that provides the organic light emitting diode power supply and may be wholly or partially transparent. Layer 908 is a cover layer of a material such as glass. During fabrication, the manufacture of device 910 can begin with substrate layer 90, with each layer being disposed in sequence until the last layer 908 is added. This procedure has the advantage of facilitating highly accurate alignment of the layer structures. Alternatively, the manufacture of the layers can be divided into two or more sections that are then bonded together with sufficient alignment accuracy.

For the manufacture of the device, it is very important to keep unwanted birefringence, such as stress-induced birefringence, to a minimum. Stress-induced birefringence causes a linear or circular polarization state of light to change to an elliptical polarization state. In devices designed for ideal linear or circular polarization states of light, the presence of elliptically polarized light reduces contrast and color fidelity, and thus reduces device performance.

Implementations

Based on conventional techniques, the optically addressed spatial light modulators of the above embodiments require a photosensitive layer that is transparent in the visible range but absorbs infrared light. In another implementation, the photosensitive layer can be patterned to provide transparent gaps that transmit visible light, such as the red, green, and blue beams, as well as non-transparent regions that are sensitive to the light from the organic light-emitting diodes. In this example, the photosensitive material need not be transparent to visible light. In addition, the writing beam need not be infrared light. In one implementation, the write beam can be produced at a non-primary display colour, for example by a yellow organic light emitting diode. The filter between the two optically addressed spatial light modulators would then need strong optical absorption in the yellow so as to block the yellow light, while still transmitting sufficiently at the other optical wavelengths for the purpose of producing an effective optical display. In another implementation, the write beam can be produced by an ultraviolet organic light emitting diode. The filter between the two optically addressed spatial light modulators would then need strong optical absorption in the ultraviolet so as to block the ultraviolet light, while still transmitting sufficiently at the other optical wavelengths for the purpose of producing an effective optical display. Ultraviolet organic light-emitting diode materials have been published by Qiu et al., Applied Physics Letters 79, 2276 (2001), and Wong et al., Org. Lett. 7 (23), 5131 (2005). In addition, although emphasis is placed on the use of organic light-emitting diode materials, other light-emitting diode materials, or other display technologies such as Surface-conduction Electron-emitter Display (SED) technology, may be used.

Although the embodiments described herein emphasize continuous encoding of amplitude and phase in a spatial light modulator, based on conventional techniques any continuous weighting of two unequal combinations of amplitude and phase can be used to encode a holographic pixel, provided the two combinations are independent under multiplication by any real number (they need not be independent under multiplication by complex numbers other than the reals). The reason is that the vector space of possible holographic encodings of a pixel is spanned, in the vector-space sense, by any two unequal combinations of amplitude and phase that are independent under multiplication by any real number. In the referenced figures, the dimensions shown are not necessarily to scale.
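
This linear-algebra claim can be checked numerically: any target complex value can be written as a real-weighted sum of two fixed amplitude/phase combinations, provided the two are not real multiples of each other. A minimal sketch in Python; the specific combinations `z1`, `z2` and the target value are illustrative assumptions, not values from the patent:

```python
import numpy as np

def real_weights(target: complex, z1: complex, z2: complex):
    """Solve target = a*z1 + b*z2 for real a, b. Solvable whenever z1 and z2
    are linearly independent over the reals (i.e. z1/z2 is not real)."""
    m = np.array([[z1.real, z2.real],
                  [z1.imag, z2.imag]])
    a, b = np.linalg.solve(m, [target.real, target.imag])
    return a, b

# Two fixed amplitude/phase combinations (hypothetical values):
z1 = 1.0 + 0.0j             # pure amplitude
z2 = np.exp(1j * np.pi / 3)  # unit amplitude at 60 degrees phase
a, b = real_weights(0.5 - 0.8j, z1, z2)
assert np.isclose(a * z1 + b * z2, 0.5 - 0.8j)  # target is recovered
```

Any other pair works equally well, as long as the two combinations are not related by a real factor; two combinations with the same phase but different amplitudes would make the system singular.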

The technology disclosed herein can be implemented by a person skilled in the art, and its novel practice is patentable; an application for a patent is filed accordingly. The above embodiments, however, do not limit the scope of the patent protection sought in this case, for which the appended claims are attached.

XII. Annex I: Introduction to the technology

The purpose of this section is to introduce several important techniques used in implementing the system of the present invention.

In conventional holography, the observer can see a holographic reconstruction of an object (which can be a changing scene); the observer's distance from the hologram is, however, not relevant. In a typical optical arrangement, the reconstruction is at or near the imaging plane of the light source illuminating the hologram, and hence in the Fourier plane of the hologram. Thus, a far-field light distribution identical to that of the reconstructed real-world object is reconstructed.

An earlier system (described in WO 2004/044659 and US 2006/0055994) defines a very different arrangement, in which the reconstructed object is not at or near the Fourier plane of the hologram. Instead, a virtual observer window is in the Fourier plane of the hologram; the observer sees the correct reconstruction only if his eyes are placed at this position. The hologram is encoded on a liquid crystal display (or other type of spatial light modulator) and illuminated such that the virtual observer window becomes the Fourier transform of the hologram (hence, the Fourier transform is imaged directly onto the eye); the reconstructed object is then a Fresnel transform of the hologram, because it is not in the focal plane of the lens. It is instead defined by a near-field light distribution (modelled by spherical wavefronts, as opposed to the plane wavefronts assigned to the far field). This reconstruction can appear anywhere between the virtual observer window (which, as described above, is in the Fourier plane of the hologram) and the liquid crystal display, or even behind the liquid crystal display as a virtual object.

This approach has several consequences. First, a basic limitation faced by designers of holographic imaging systems is the pixel pitch of liquid crystal displays (or other types of light modulators). The goal is to produce a large holographic reconstruction using a liquid crystal display whose pixel pitch is commercially available and reasonably priced. But in the past this was impossible, for the following reason. The periodic spacing between adjacent diffraction orders in the Fourier plane is given by λD/p, where λ is the wavelength of the illuminating light, D is the distance from the hologram to the Fourier plane, and p is the pixel pitch of the liquid crystal display. In a conventional holographic display the reconstructed object lies in the Fourier plane, so it must remain smaller than the periodic interval; if it is larger, its edges blur into the reconstruction from the adjacent diffraction order. This results in very small reconstructed objects, typically only a few centimetres wide, even with expensive, specialised small-pitch displays. With the present method, however, the virtual observer window (which, as described above, is placed in the Fourier plane of the hologram) only needs to be as large as the pupil of the eye, so even a liquid crystal display with a moderate pitch can be used. And because the reconstructed object completely fills the frustum between the virtual observer window and the hologram, it can be very large, much larger than the periodic interval. Furthermore, with optically addressed spatial light modulators there is no pixelation, and therefore no periodicity, so the requirement that the virtual observer window be smaller than the periodic interval no longer applies.
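
The formula λD/p is easy to evaluate. A minimal sketch, with illustrative numbers (green light, a 0.5 m viewing distance, and a 50 µm pixel pitch, none of which are taken from the patent):

```python
def periodic_interval(wavelength_m: float, distance_m: float, pitch_m: float) -> float:
    """Width lambda*D/p of one diffraction order in the Fourier plane, in metres."""
    return wavelength_m * distance_m / pitch_m

# Green light (532 nm), hologram-to-Fourier-plane distance 0.5 m, 50 um pitch:
interval = periodic_interval(532e-9, 0.5, 50e-6)
print(f"{interval * 1e3:.2f} mm")  # prints "5.32 mm"
```

A few millimetres is about the size of an eye pupil, which illustrates the point of the text: a moderate-pitch display suffices for a pupil-sized virtual observer window, but could never host a conventional reconstruction tens of centimetres wide.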

There is another advantage. When calculating a hologram, one starts from knowledge of the whole object: for example, you might have a 3D image file of a car. That file describes how the object should be seen from several different viewing positions. In conventional holography, the hologram required to generate a reconstruction of the car is computed directly from the 3D image file, in a computationally intensive procedure. The virtual observer window approach, however, enables a different and more computationally efficient technique. Starting with one plane of the reconstructed object, we can calculate the field in the virtual observer window, because this is the Fresnel transform of that plane. We then do this for all object planes and sum the results to produce a cumulative Fresnel transform; this defines the wave field across the virtual observer window. We then calculate the hologram as the Fourier transform of this virtual observer window. Although the virtual observer window contains all the information about the object, only the single plane of the virtual observer window has to be transformed into the hologram, not a multi-plane object. This is particularly advantageous if the transformation from the virtual observer window to the hologram is not a single conversion step but a repeated conversion, such as an Iterative Fourier Transformation Algorithm: each repeated step then contains only a single Fourier transform of the virtual observer window, rather than one per object plane, which greatly reduces the amount of computation.
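
The pipeline described above (Fresnel-transform each object plane to the observer-window plane, sum, then one transform back to the hologram plane) can be sketched numerically. This is a simplified single-FFT Fresnel propagation with constant amplitude and phase factors dropped; the grid size, wavelength, and distances are illustrative assumptions, not parameters from the patent:

```python
import numpy as np

def fresnel_to_vow(plane_field, wavelength, z, dx):
    """Propagate one object plane a distance z to the VOW plane with a
    single-FFT Fresnel transform (constant prefactors omitted)."""
    n = plane_field.shape[0]
    coords = (np.arange(n) - n // 2) * dx
    X, Y = np.meshgrid(coords, coords)
    chirp = np.exp(1j * np.pi * (X**2 + Y**2) / (wavelength * z))
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(plane_field * chirp)))

def hologram_from_planes(object_planes, wavelength, distances, dx):
    """Sum the Fresnel transforms of all object planes in the VOW plane,
    then take a single inverse Fourier transform back to the hologram."""
    vow = sum(fresnel_to_vow(p, z_dist, wavelength=wavelength, dx=dx)
              if False else fresnel_to_vow(p, wavelength, z_dist, dx)
              for p, z_dist in zip(object_planes, distances))
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(vow)))

# Toy usage: two 64x64 object planes, each holding one bright point.
planes = [np.zeros((64, 64), dtype=complex) for _ in range(2)]
planes[0][32, 32] = 1.0
planes[1][20, 40] = 1.0
holo = hologram_from_planes(planes, 532e-9, [0.30, 0.35], 50e-6)
print(holo.shape)  # (64, 64)
```

Whatever the number of object planes, only one transform (the final inverse FFT) maps the summed observer-window field to the hologram, which is the computational saving the text describes.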

Another interesting consequence of the virtual observer window approach is that all the information needed to reconstruct a given object point is contained in a fairly small region of the hologram; by contrast, in conventional holography the information for a given object point is distributed across the entire hologram. Because the information needs to be encoded into a much smaller region of the hologram, the amount of information to be processed and encoded is much lower than for a conventional hologram. This, in turn, makes real-time video holography feasible with conventional computing devices (e.g., a digital signal processor (DSP)) at a price and performance compatible with the mass market.
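
One way to see why the region is small: geometrically, the hologram region encoding a point is the projection of the virtual observer window through that point onto the hologram plane, giving a similar-triangles estimate. This estimate and its numbers are illustrative assumptions, not a formula stated in the text:

```python
def subhologram_width(vow_width_m: float, point_dist_m: float, vow_dist_m: float) -> float:
    """Width of the hologram region encoding one object point, assuming the
    point lies between the hologram plane and the VOW: rays from the VOW
    edges through the point, extended back to the hologram plane."""
    return vow_width_m * point_dist_m / (vow_dist_m - point_dist_m)

# A 10 mm VOW at 0.5 m, and a point 0.1 m in front of the hologram:
w = subhologram_width(10e-3, 0.1, 0.5)
print(f"{w * 1e3:.1f} mm")  # prints "2.5 mm"
```

A region a few millimetres wide on a display tens of centimetres wide carries only a small fraction of the hologram's samples, which is why far less information must be computed per point than in conventional holography.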

However, there are some less welcome consequences. First, the viewing distance from the hologram matters: with this method of encoding and illuminating the hologram, the best reconstruction is seen only when the eye is placed in the Fourier plane of the hologram, whereas in a conventional hologram the viewing distance is not critical. There are, however, a variety of methods for reducing this Z-sensitivity or designing around it, and in practice the Z-sensitivity of the holographic reconstruction is generally not very large.

Similarly, because of the way the hologram is encoded and illuminated, the best holographic reconstruction can only be seen from a precise and small viewing position (meaning that Z is precisely defined, as described above, and the X and Y eye coordinates as well), which may require eye tracking. As with the Z-sensitivity, there are a variety of methods for reducing the X and Y sensitivity or designing around it. For example, as the pixel pitch decreases (as it will, following the manufacturing progress of liquid crystal displays), the size of the virtual observer window will increase. In addition, more efficient encoding techniques (such as kinoform encoding) allow a larger portion of the periodic interval to be used as the virtual observer window, enlarging the virtual observer window accordingly.

The above description assumes that we are dealing with Fourier holograms. The virtual observer window is in the Fourier plane of the hologram, i.e. in the imaging plane of the light source. One advantage of this is that the non-diffracted light is focused into a so-called DC spot. The method can also be used with Fresnel holograms, where the virtual observer window is not in the imaging plane of the light source; however, care must be taken that the non-diffracted light is not visible as a disturbing background. A further point to note is that the word transform should be construed to include any mathematical or computational technique that is equivalent or similar to a transform and that describes the propagation of light. Transforms are merely approximations to physical processes that are more precisely described by Maxwellian wave propagation equations; the Fresnel and Fourier transforms are second-order approximations, but they have the advantages that (i) being algebraic rather than differential, they can be handled in a computationally efficient manner, and (ii) they can be implemented accurately in optical systems.

Further details are described in US Patent Application 2006-0138711, US 2006-0139710, and US 2006-0250671, the disclosures of each of which are incorporated herein by reference.

XIII. Annex II:

Terminology used in the description

Computer-generated hologram

A computer-generated hologram (CGH) is a hologram calculated from a scene. The CGH may comprise complex values representing the amplitude and phase of the light waves required to reconstruct the scene. The CGH can be calculated using, for example, coherent ray tracing, by simulating the interference between the scene and a reference wave, or by Fourier or Fresnel transforms.

Encoding

Encoding is the procedure by which a spatial light modulator (e.g. its constituent cells, or a contiguous region of a continuous spatial light modulator such as an optically addressed spatial light modulator) is supplied with the control values of the hologram. In general, a hologram comprises complex values representing amplitude and phase.

Encoded region

The encoded region is typically a spatially limited region of the hologram in which the hologram data for a single scene point is encoded. The spatial limitation may be realized either by abrupt truncation or by a smooth transition, achieved by the Fourier transform from the virtual observer window to the hologram.

Fourier transform

The Fourier transform is used to calculate light propagation in the far field of the spatial light modulator. The wavefront is described by plane waves.

Fourier plane

The Fourier plane contains the Fourier transform of the light distribution on the spatial light modulator. Without a focusing lens, the Fourier plane is at infinity. If a focusing lens is placed in the light path close to the spatial light modulator, the Fourier plane coincides with the plane containing the image of the light source.

Fresnel transform

The Fresnel transform is used to calculate light propagation in the near field of the spatial light modulator. The wavefront is described by spherical waves. The phase factor of the light wave contains a term that depends quadratically on the lateral coordinates.

Frustum

The virtual frustum is constructed between the virtual observer window and the spatial light modulator, and extends behind the spatial light modulator. The scene is reconstructed inside this frustum. The size of the reconstructed scene is limited by the frustum, not by the periodic interval of the spatial light modulator.

Imaging optics

Imaging optics are one or more optical elements, such as lenses, lenticular arrays, or microlens arrays, used to form an image of one or more light sources. In the arrangements described herein, the holographic reconstruction is constructed without imaging optics being used to form an image of the one or two spatial light modulators.

Light system

The light system may include a coherent light source, such as a laser, or a partially coherent light source, such as a light emitting diode. The temporal and spatial coherence of a partially coherent light source must be sufficient to produce a good scene reconstruction, i.e. the spectral linewidth and the lateral extent of the emitting surface must be sufficiently small.

Virtual Observer Window (VOW)

The virtual observer window is a virtual window in the observer plane through which the reconstructed three-dimensional object can be seen. The virtual observer window is the Fourier transform of the hologram, and is positioned within one periodic interval in order to avoid multiple reconstructions of the object being visible. Its size must be at least the size of the pupil of the eye. In a system with eye tracking, where at least one virtual observer window is placed at the observer's eye position, the virtual observer window can be much smaller than the observer's range of lateral movement. This facilitates the use of spatial light modulators of moderate resolution, and hence of small periodic interval.

The virtual observer window can be thought of as a keyhole through which the reconstructed three-dimensional object can be seen, with either one virtual observer window for each eye or one virtual observer window large enough for both eyes.

Periodic interval

If the computer-generated hologram is displayed on a spatial light modulator composed of individually addressable cells, the hologram is sampled. This sampling results in a periodic repetition of the diffraction pattern.

The periodic interval is λD/p, where λ is the wavelength, D is the distance from the hologram to the Fourier plane, and p is the pitch of the spatial light modulator cells. Optically addressed spatial light modulators, however, involve no sampling, so there is no periodic repetition of the diffraction pattern; the repetition is effectively suppressed.

reconstruction

A spatial light modulator that is encoded with a hologram and illuminated reconstructs the original light distribution from which the hologram was calculated. Ideally, an observer would be unable to distinguish the reconstructed light distribution from the original. In most holographic displays, the light distribution of the scene is reconstructed. In our display, the light distribution in the virtual observer window is reconstructed instead.

Scenes

The reconstructed scene is a three-dimensional light distribution, real or computer generated. As a special case, it may also be a two-dimensional light distribution. A scene can comprise a variety of fixed or moving objects arranged in space.

Spatial Light Modulator (SLM)

A spatial light modulator is used to modulate the wavefront of incoming light. An ideal spatial light modulator would be able to represent arbitrary complex values, i.e. to control the amplitude and the phase of a light wave independently. A typical conventional spatial light modulator, however, controls only one of these characteristics, either amplitude or phase, with undesired side effects on the other.

10‧‧‧Lighting device

11‧‧‧Color Filter Array

12‧‧‧Infrared organic light emitting diode array

13‧‧‧Optical addressed spatial light modulator

14‧‧‧Point

15‧‧‧Compact hologram generator

20‧‧‧Lighting device

21‧‧‧Color Filter Array

22‧‧‧Infrared organic light emitting diode array

23‧‧‧Optical addressed spatial light modulator

24‧‧‧Point

25‧‧‧Compact hologram generator

26‧‧‧Infrared filter

27‧‧‧Optical addressed spatial light modulator

28‧‧‧Infrared organic light emitting diode array

30‧‧‧Mobile Phone

31‧‧‧Screen area

32‧‧‧Antenna

33‧‧‧ camera

34‧‧‧ camera

35‧‧‧ button

36‧‧‧ button

1101‧‧‧ Focusing components

1102‧‧‧ Focusing components

1103‧‧‧ Focusing components

1104‧‧‧Vertical Focusing System

1105‧‧‧First diffraction order

1106‧‧‧Zeroth diffraction order

1107‧‧‧Negative diffraction order

50‧‧‧Microlens array

51‧‧‧Color Filter Array

52‧‧‧Infrared organic light emitting diode array

53‧‧‧Optical addressed spatial light modulator

54‧‧‧Optical addressed spatial light modulator

55‧‧‧Compact hologram generator

56‧‧‧Point

57‧‧‧Lighting device

70‧‧‧Space light modulator

71‧‧‧Holographic optical element Bragg filter

73‧‧‧ single component

74‧‧‧Bragg plane

75‧‧‧Diffractive light intensity distribution

76‧‧‧Light

80‧‧‧Organic LED array

81‧‧‧Holographic optical element Bragg filter

82‧‧‧Optical addressed spatial light modulator

83‧‧‧Single organic light-emitting diode

84‧‧‧Bragg plane

85‧‧‧ Distribution of infrared rays emitted

86‧‧‧Light rays

90‧‧‧ basal layer

91‧‧‧Organic light-emitting diode backplane layer

92‧‧‧Infrared organic light emitting diode array

93‧‧‧Bragg filter hologram element

94‧‧‧Electrical insulation

95‧‧‧Optical addressed spatial light modulator photosensitive and electrode layer

96‧‧‧Liquid crystal layer

97‧‧‧Separation layer

98‧‧‧Transparent electrode layer

99‧‧‧linear polarizing layer

900‧‧‧Infrared filter

901‧‧‧Liquid crystal layer

902‧‧‧Separation layer

903‧‧‧Optical addressed spatial light modulator photosensitive and electrode layer

904‧‧‧Electrical insulation

905‧‧‧Bragg filter hologram element

906‧‧‧Infrared organic light emitting diode array

907‧‧‧Organic light-emitting diode backplane layer

908‧‧‧Cover layer

909‧‧‧ surface

910‧‧‧Device structure

911‧‧ points

100‧‧‧Microlens array

101‧‧‧Color Filter Array

102‧‧‧Electronically addressed spatial light modulator

103‧‧‧Electronically addressed spatial light modulator

104‧‧‧Lighting device

105‧‧‧Compact hologram generator

106‧‧‧ points

107‧‧‧ components

108‧‧‧ components

110‧‧‧Lighting device

111‧‧‧Color Filter Array

112‧‧‧Electronically addressed spatial light modulator

113‧‧‧Beam splitter element

114‧‧‧ points

115‧‧‧Compact hologram generator

130‧‧‧Lighting device

131‧‧‧Color Filter Array

132‧‧‧Electronically addressed spatial light modulator

133‧‧‧Electronically addressed spatial light modulator

134‧‧‧Beam splitter element

135‧‧‧Point

136‧‧‧Compact hologram generator

171‧‧‧ Beam

172‧‧‧ Beam

220‧‧‧Users

221‧‧‧Users

222‧‧‧Connected

223‧‧‧Connected

224‧‧‧Intermediate system

2300‧‧‧TV Communications

2301‧‧‧Intermediate system

2302‧‧‧ Viewers

2303‧‧‧Advertisers

2304‧‧‧Two-dimensional content

2305‧‧‧3D content

2306‧‧‧Payment fees

159‧‧‧Prism element

1517‧‧‧electrode

1518‧‧‧electrode

1519‧‧‧dove

1520‧‧‧Deep

FIG. 1 is a schematic diagram of a holographic display device comprising a single optically addressed spatial light modulator and a single organic light emitting diode array; FIG. 2 is a schematic diagram of a holographic display device comprising a pair of components, each comprising a single optically addressed spatial light modulator and a single organic light emitting diode array; FIG. 3 is a schematic diagram of a mobile three-dimensional display device; FIG. 4 is a schematic diagram of a conventional holographic display; FIG. 5 is a schematic diagram of a holographic display in which a single organic light emitting diode array controls two optically addressed spatial light modulators; FIG. 6A is a schematic diagram of a holographic display; FIG. 6B is a schematic diagram suitable for realizing a compact holographic display; FIG. 7 is a schematic diagram of a component of a holographic display, including a Bragg-filter holographic optical element for reducing problems associated with higher diffraction orders; FIG. 8 is a schematic diagram of a component of a holographic display, including a Bragg-filter holographic optical element for enhancing the collimation of the light emitted by the organic light emitting diode array; FIG. 9 is a schematic diagram of a holographic display device; FIG. 10 is a schematic diagram of a holographic display comprising two electronically addressed spatial light modulators for continuous encoding of amplitude and phase; FIG. 11 is a schematic diagram of a holographic display device comprising a single electronically addressed spatial light modulator; FIG. 12 is a specific embodiment of a holographic display according to an embodiment;
FIG. 13 is a schematic diagram of a holographic display device comprising two electronically addressed spatial light modulators for continuous encoding of amplitude and phase; FIGS. 14, 15, and 16 show diffraction simulation results obtained using MathCad (RTM); FIG. 17 is a schematic diagram of an arrangement of a lens layer between two electronically addressed spatial light modulators according to an embodiment; FIG. 18 is a schematic diagram of the diffraction that occurs as light travels from a first electronically addressed spatial light modulator to a second electronically addressed spatial light modulator; FIG. 19 is a schematic diagram of a configuration of two electronically addressed spatial light modulators with a fibre optic faceplate between them; FIG. 20 is a schematic diagram of a beam pointing element; FIG. 21 is a schematic diagram of a beam pointing element; FIG. 22 is a schematic diagram of a possible three-dimensional visual communication system; FIG. 23 is a schematic diagram of a method of converting two-dimensional image content into three-dimensional image content; FIG. 24 is a schematic diagram of a holographic display component according to an embodiment; FIG. 25 is a schematic diagram of a holographic display comprising a light source of a two-dimensional light source array, a lens of a two-dimensional lens array, a spatial light modulator, and a beam splitter.
The beam splitter splits the light leaving the spatial light modulator into two beams, illuminating a virtual observer window for the left eye (VOWL) and a virtual observer window for the right eye (VOWR); FIG. 26 is a schematic diagram of a holographic display comprising two light sources of a two-dimensional light source array, two lenses of a two-dimensional lens array, a spatial light modulator, and a beam splitter, the beam splitter splitting the light leaving the spatial light modulator into two beams, illuminating the virtual observer window for the left eye (VOWL) and the virtual observer window for the right eye (VOWR); and FIG. 27 is a schematic cross-sectional view of a beam pointing element.

Claims (23)

  1. A holographic display device comprising: an organic light emitting diode array which writes to an optically addressed spatial light modulator, the organic light emitting diode array and the optically addressed spatial light modulator forming adjacent layers, wherein the organic light emitting diode array and the optically addressed spatial light modulator are physically connected to each other through an isolation layer, namely an angular filter or a layer including the angular filter, and wherein the optically addressed spatial light modulator encodes a hologram such that, when a read beam illuminates the optically addressed spatial light modulator and the optically addressed spatial light modulator is suitably controlled via the organic light emitting diode array, a holographic reconstruction is produced by the holographic display device.
  2. The hologram display device of claim 1, wherein the organic light emitting diode array and the optically addressed spatial light modulator form adjacent layers facing each other, with no intermediate imaging optics between the organic light emitting diode array and the optically addressed spatial light modulator.
  3. The holographic display device of claim 1, wherein the organic light-emitting diode array and the optically-addressed spatial light modulator are fixed and physically directly connected to each other.
  4. The holographic display device of claim 1, wherein the organic light emitting diode array and the optically addressed spatial light modulator are fixed and physically indirectly connected to each other.
  5. The hologram display device of claim 4, wherein the spacer layer is the angular filter, for example a Bragg filter.
  6. The hologram display device according to any one of claims 1 to 5, wherein the organic light emitting diode array emits at a wavelength which is not a primary display colour, and the read-out wavelengths are one or more of red, green and blue (RGB).
  7. The hologram display device according to any one of claims 1 to 5, wherein the organic light emitting diode array emits infrared light, which writes to an infrared-sensitive layer of the optically addressed spatial light modulator.
  8. The hologram display device according to any one of claims 1 to 5, wherein the organic light emitting diode array and the optically addressed spatial light modulator layers operate in reflection, visible light being reflected by the organic light emitting diode array and the optically addressed spatial light modulator layers towards an observer.
  9. The hologram display device according to any one of claims 1 to 5, wherein the organic light emitting diode array is tiled from a plurality of smaller organic light emitting diode arrays.
  10. The hologram display device according to any one of claims 1 to 5, wherein the optically addressed spatial light modulator comprises a liquid crystal material, or wherein the optically addressed spatial light modulator comprises a photosensitive dye as the photosensitive layer.
  11. The hologram display device of any one of claims 1 to 5, wherein the display is illuminated with a backlight and a microlens array.
  12. The holographic display device of claim 11, wherein the microlens array provides local coherence over a small portion of the display, that portion being the only part of the display that encodes the information used to reconstruct a given point of the reconstructed object.
  13. The holographic display device of any one of claims 1 to 5, wherein the optically addressed spatial light modulator is arranged as a Freedericksz cell to provide phase control.
  14. A hologram display device according to any one of claims 1 to 5, wherein the holographic reconstruction can be observed through a virtual observer window.
  15. A holographic display device according to any one of claims 1 to 5, wherein a plurality of virtual observer windows can be tiled together by spatial or temporal multiplexing.
  16. A hologram display device according to any one of claims 1 to 5, wherein the display is operable to re-encode holograms time-sequentially, with a hologram for an observer's left eye followed by one for the right eye.
  17. The holographic display device of any one of claims 1 to 5, wherein the display produces a holographic reconstruction that provides a single user view.
  18. The hologram display device according to any one of claims 1 to 5, wherein the display is capable of generating, without any projection lens, a two-dimensional image focused on a screen, regardless of the distance of the screen from the device within the optical far field.
  19. A hologram display device according to any one of claims 1 to 5, wherein a holographic image is transmitted through a beam splitter to each eye of an observer.
  20. The holographic display device according to any one of claims 1 to 5, wherein the optically addressed spatial light modulator is disposed within 30 mm of a light source, and is housed within a portable casing.
  21. A hologram display device according to any one of claims 1 to 5, comprising a beam pointing element for tracking a plurality of virtual observer windows, the beam pointing element comprising a plurality of liquid crystal domains within an isotropic host material, wherein the interfaces between the domains and the host are prism-shaped, or part-spherical, or part-cylindrical, and the orientation of the liquid crystals is controlled by an applied electric field so as to vary the local refraction or diffraction characteristics of the beam pointing element.
  22. The holographic display device according to any one of claims 1 to 5, wherein the optically addressed spatial light modulator, a light source, and a lens array associated with the light source are all housed within a portable casing, and wherein the light source is magnified between 10 and 60 times by the lens array.
  23. A method of producing a holographic reconstruction comprising the steps of using a hologram display device according to any one of claims 1 to 5.
TW96140505A 2006-10-26 2007-10-26 Universal image display device and method (1) TWI421540B (en)

Priority Applications (16)

Application Number Priority Date Filing Date Title
GBGB0621360.7A GB0621360D0 (en) 2006-10-26 2006-10-26 Compact three dimensional image display device
GBGB0625838.8A GB0625838D0 (en) 2006-10-26 2006-12-22 Compact three dimensional image display device
GB0705401A GB0705401D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705398A GB0705398D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705409A GB0705409D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705404.2A GB0705404D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705403A GB0705403D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705411.7A GB0705411D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705407A GB0705407D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705412A GB0705412D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705406A GB0705406D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705402.6A GB0705402D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705399A GB0705399D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705405.9A GB0705405D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705410A GB0705410D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705408A GB0705408D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device

Publications (2)

Publication Number Publication Date
TW200827771A TW200827771A (en) 2008-07-01
TWI421540B true TWI421540B (en) 2014-01-01

Family

ID=44771492

Family Applications (6)

Application Number Title Priority Date Filing Date
TW96140508A TWI442763B (en) 2006-10-26 2007-10-26 3d content generation system
TW96140506A TWI421541B (en) 2006-10-26 2007-10-26 Full image display device and method (2)
TW096140510A TWI454742B (en) 2006-10-26 2007-10-26 Compact three dimensional image display device
TW96140509A TWI406115B (en) 2006-10-26 2007-10-26 Holographic display device and method for generating holographic reconstruction of three dimensional scene
TW96140507A TWI432002B (en) 2006-10-26 2007-10-26 Mobile telephone system and method for using the same
TW96140505A TWI421540B (en) 2006-10-26 2007-10-26 Universal image display device and method (1)


Country Status (2)

Country Link
JP (1) JP2014209247A (en)
TW (6) TWI442763B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5397190B2 (en) * 2009-11-27 2014-01-22 ソニー株式会社 Image processing apparatus, image processing method, and program
CN102736393B (en) 2011-04-07 2014-12-17 台达电子工业股份有限公司 Display apparatus for displaying multiple images of viewing angles
TWI501053B (en) * 2011-10-28 2015-09-21 Jing Heng Chen Holographic imaging device and method thereof
WO2013136358A1 (en) 2012-03-12 2013-09-19 Empire Technology Development Llc Holographic image reproduction mechanism using ultraviolet light
TWI508040B (en) * 2013-01-07 2015-11-11 Chunghwa Picture Tubes Ltd Stereoscopic display apparatus and electric apparatus thereof
TWI493160B (en) * 2013-05-13 2015-07-21 Global Fiberoptics Inc Method for measuring the color uniformity of a light spot and apparatus for measuring the same
TWI537605B (en) 2014-08-28 2016-06-11 台達電子工業股份有限公司 Autostereoscopic display device and autostereoscopic display method using the same
CN104463964A (en) * 2014-12-12 2015-03-25 英华达(上海)科技有限公司 Method and equipment for acquiring three-dimensional model of object
TWI670850B (en) * 2019-03-08 2019-09-01 友達光電股份有限公司 Display device

Citations (5)

Publication number Priority date Publication date Assignee Title
US6683665B1 (en) * 2000-11-20 2004-01-27 Sarnoff Corporation Tiled electronic display structure and method for modular repair thereof
TW594069B (en) * 1999-06-09 2004-06-21 Holographic Imaging Llc Holographic display
US20040196524A1 (en) * 2003-04-05 2004-10-07 Holographic Imaging Llc. Spatial light modulator imaging systems
US20050030608A1 (en) * 2001-09-11 2005-02-10 Kwasnick Robert F. Light emitting device addressed spatial light modulator
US20060139711A1 (en) * 2004-12-23 2006-06-29 Seereal Technologies Gmbh Method of computing a hologram

Family Cites Families (15)

Publication number Priority date Publication date Assignee Title
DE69128103D1 (en) * 1990-04-05 1997-12-11 Seiko Epson Corp An optical device
CA2305735C (en) * 1997-12-05 2008-01-08 Dynamic Digital Depth Research Pty. Ltd. Improved image conversion and encoding techniques
JP2000078611A (en) * 1998-08-31 2000-03-14 Toshiba Corp Stereoscopic video image receiver and stereoscopic video image system
GB2350963A (en) * 1999-06-09 2000-12-13 Secr Defence Holographic Displays
JP2002123688A (en) * 2000-10-16 2002-04-26 Sony Corp Holographic stereogram print order receipt system and its method
JP2002223456A (en) * 2001-01-24 2002-08-09 Morita Mfg Co Ltd Image data distribution method and image distributor, and recording medium
JP3679744B2 (en) * 2001-09-26 2005-08-03 三洋電機株式会社 Image composition method and apparatus
JP2003289552A (en) * 2002-03-28 2003-10-10 Toshiba Corp Image display terminal and stereoscopic image display system
GB2390172A (en) * 2002-06-28 2003-12-31 Sharp Kk Polarising optical element and display
JP2004040445A (en) * 2002-07-03 2004-02-05 Sharp Corp Portable equipment having 3d display function and 3d transformation program
CN100437393C (en) * 2002-11-13 2008-11-26 希瑞尔技术有限公司 Video hologram and device for reconstructing video holograms
GB2406730A (en) * 2003-09-30 2005-04-06 Ocuity Ltd Directional display.
JP4230331B2 (en) * 2003-10-21 2009-02-25 富士フイルム株式会社 Stereoscopic image generation apparatus and image distribution server
EP1827465B1 (en) * 2004-12-21 2009-08-19 Alpha-Biocare GmbH Preparation made from diptera larvae for the treatment of wounds
US20060187297A1 (en) * 2005-02-24 2006-08-24 Levent Onural Holographic 3-d television


Also Published As

Publication number Publication date
TWI454742B (en) 2014-10-01
TW200845698A (en) 2008-11-16
TWI406115B (en) 2013-08-21
JP2014209247A (en) 2014-11-06
TW200844693A (en) 2008-11-16
TW200824426A (en) 2008-06-01
TW200827771A (en) 2008-07-01
TWI442763B (en) 2014-06-21
TW200841042A (en) 2008-10-16
TWI432002B (en) 2014-03-21
TWI421541B (en) 2014-01-01
TW200839295A (en) 2008-10-01

Similar Documents

Publication Publication Date Title
TWI646375B (en) A display system having an optical element for coupling the multiplexed optical flow of the
CN104704821B (en) Scan two-way light-field camera and display
JP2017142506A (en) Two-dimensional/three-dimensional holographic display system
US20150355597A1 (en) Projection device and method for holographic reconstruction of scenes
US10175478B2 (en) Methods and systems for generating virtual content display with a virtual or augmented reality apparatus
JP6320451B2 (en) Display device
US20190113751A9 (en) Diffractive projection apparatus
CN104884862B (en) Lighting apparatus
US20160004090A1 (en) Wearable data display
US10409144B2 (en) Diffractive waveguide providing structured illumination for object detection
US9354604B2 (en) Optically addressable spatial light modulator divided into plurality of segments, and holographic three-dimensional image display apparatus and method using the light modulator
Hainich et al. Displays: fundamentals & applications
US7974007B2 (en) Display device
JP2019512745A (en) Method and apparatus for providing polarization selective holographic waveguide device
US8711466B2 (en) Illumination unit for a direct-view display
Hong et al. Three-dimensional display technologies of recent interest: principles, status, and issues [Invited]
TWI559105B (en) Kombinierte lichtmodulationsvorrichtung zur benutzernachfuhrung
US9244286B2 (en) Display, instrument panel, optical system and optical instrument
US8639072B2 (en) Compact wearable display
JP4454429B2 (en) Direct view LC display
DE602004003474T2 (en) Switchable display device
US7683989B2 (en) Directional display apparatus
JP3238755B2 (en) Hologram creation and stereoscopic display method and stereoscopic display device
US8319828B2 (en) Highly efficient 2D-3D switchable display device
US7639210B2 (en) Multi-depth displays