TWI432002B - Mobile telephone system and method for using the same - Google Patents

Mobile telephone system and method for using the same

Info

Publication number
TWI432002B
Authority
TW
Taiwan
Prior art keywords
mobile phone
spatial light
display
light modulator
image
Application number
TW96140507A
Other languages
Chinese (zh)
Other versions
TW200845698A (en)
Inventor
Armin Schwerdtner
Original Assignee
Seereal Technologies Sa
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority to GBGB0621360.7A priority Critical patent/GB0621360D0/en
Priority to GBGB0625838.8A priority patent/GB0625838D0/en
Priority to GB0705409A priority patent/GB0705409D0/en
Priority to GB0705403A priority patent/GB0705403D0/en
Priority to GB0705401A priority patent/GB0705401D0/en
Priority to GBGB0705404.2A priority patent/GB0705404D0/en
Priority to GB0705410A priority patent/GB0705410D0/en
Priority to GB0705399A priority patent/GB0705399D0/en
Priority to GBGB0705405.9A priority patent/GB0705405D0/en
Priority to GB0705408A priority patent/GB0705408D0/en
Priority to GBGB0705411.7A priority patent/GB0705411D0/en
Priority to GBGB0705402.6A priority patent/GB0705402D0/en
Priority to GB0705407A priority patent/GB0705407D0/en
Priority to GB0705398A priority patent/GB0705398D0/en
Priority to GB0705406A priority patent/GB0705406D0/en
Priority to GB0705412A priority patent/GB0705412D0/en
Application filed by Seereal Technologies Sa filed Critical Seereal Technologies Sa
Publication of TW200845698A publication Critical patent/TW200845698A/en
Application granted granted Critical
Publication of TWI432002B publication Critical patent/TWI432002B/en

Description

Mobile telephone system and method for using the same

The present invention relates to a mobile telephone system that provides three-dimensional images, and in particular to a mobile telephone system which includes a called party's mobile phone, the called party's mobile phone using a display encoded with a hologram to generate a holographic reconstruction of the caller.

Computer-generated video holograms (CGHs) are encoded on one or more spatial light modulators (SLMs); the spatial light modulators may comprise electrically or optically controllable cells. The cells modulate the amplitude and/or phase of the light by being encoded with hologram values corresponding to a video hologram. The computer-generated video hologram may be calculated, for example, by coherent ray tracing, by simulating the interference between the light scattered by the scene and a reference wave, or by Fourier or Fresnel transforms. An ideal spatial light modulator would be capable of representing arbitrary complex values, i.e. of controlling separately the amplitude and the phase of an incoming light wave. However, a typical spatial light modulator controls only one property, either amplitude or phase, with the undesirable side effect of also affecting the other property. There are different ways of modulating the light in amplitude or phase, e.g. electrically addressed liquid crystal spatial light modulators, optically addressed liquid crystal spatial light modulators, magneto-optical spatial light modulators, micromirror devices or acousto-optic modulators. The modulation of the light may be spatially continuous or composed of individually addressable cells, one-dimensionally or two-dimensionally arranged, binary, multi-level or continuous.
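
Purely as an illustrative sketch of the point-superposition approach mentioned above (the wavelength, pixel pitch, resolution and scene points are all assumed example values, not taken from this patent), a hologram pattern can be computed by summing spherical waves from a few object points at the SLM plane and interfering them with a plane reference wave:

    import numpy as np

    # Assumed parameters for illustration only.
    wavelength = 532e-9            # green read-out wavelength [m]
    pitch = 10e-6                  # SLM cell pitch [m]
    nx = ny = 512                  # SLM resolution
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)

    # Hypothetical scene: object points given as (x, y, z, amplitude).
    points = [(0.0, 0.0, 0.10, 1.0), (1e-3, -0.5e-3, 0.12, 0.8)]

    k = 2 * np.pi / wavelength
    field = np.zeros((ny, nx), dtype=complex)
    for px, py, pz, amp in points:
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += amp * np.exp(1j * k * r) / r        # spherical wave from the point

    reference = np.exp(1j * k * X * np.sin(np.deg2rad(1.0)))   # tilted plane reference wave
    hologram = np.abs(field + reference) ** 2        # interference pattern to encode on the SLM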

In this document the term "encoding" denotes the way in which a spatial light modulator is supplied with control values so that a three-dimensional scene can be reconstructed from it. "Encoding a hologram on the spatial light modulator" accordingly means that the hologram is encoded on the spatial light modulator.

In contrast to purely autostereoscopic displays, with a video hologram the observer sees an optical reconstruction of the light wavefront of a three-dimensional scene. The three-dimensional scene is reconstructed in a space that stretches between the observer's eyes and the spatial light modulator, or even behind the spatial light modulator. The spatial light modulator can also be encoded with video holograms such that the observer sees some objects of the reconstructed three-dimensional scene in front of the spatial light modulator and other objects on or behind it.

The cells of the spatial light modulator are preferably transmissive cells which are illuminated by light capable of interference, at least at a defined position and over a spatial coherence length of a few millimetres. This allows a holographic reconstruction with adequate resolution in at least one dimension. This kind of light will be referred to as "sufficiently coherent light".

In order to ensure sufficient temporal coherence, the spectrum of the light emitted by the source must be limited to an adequately narrow wavelength range, i.e. it must be near-monochromatic. The spectral bandwidth of high-brightness light emitting diodes (LEDs) is sufficiently narrow to ensure temporal coherence for the holographic reconstruction. The diffraction angle at the spatial light modulator is proportional to the wavelength, which means that only a monochromatic source will lead to a sharp reconstruction of an object point; a broadened spectrum leads to broadened object points and a smeared reconstruction. The spectrum of a laser source can be regarded as monochromatic, and the spectral line width of an LED is sufficiently narrow to facilitate a good reconstruction.
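
A small numerical illustration of the wavelength dependence just described (the pixel pitch and wavelength spread are assumed values, not from this patent): the first-order diffraction angle satisfies sin θ = λ/p, so a spread of wavelengths smears the reconstructed object point.

    import numpy as np

    pitch = 10e-6                               # assumed SLM cell pitch [m]
    for lam in (520e-9, 532e-9, 544e-9):        # centre wavelength +/- ~12 nm, an LED-like spread
        theta = np.degrees(np.arcsin(lam / pitch))
        print(f"{lam * 1e9:.0f} nm -> first-order diffraction angle {theta:.2f} deg")
    # The ~0.14 deg spread across this band is what blurs the reconstructed point.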

Spatial coherence relates to the lateral extent of the light source. Conventional light sources, such as LEDs or cold cathode fluorescent lamps (CCFLs), can also meet these requirements if they emit light through an adequately narrow aperture. Light from a laser source can be regarded as emanating from a point source within diffraction limits and, depending on the modal purity, leads to a sharp reconstruction of the object, i.e. each object point is reconstructed as a diffraction-limited point.

Light from spatially incoherent sources is laterally extended and causes smearing of the reconstructed object. The amount of smearing is given by the broadened size of an object point reconstructed at a given position. In order to use a spatially incoherent source for hologram reconstruction, a trade-off has to be found between brightness and the limitation of the lateral extent of the source with an aperture. The smaller the light source, the better its spatial coherence.

A line light source can be regarded as a point source if seen at right angles to its longitudinal extension. Light waves can therefore propagate coherently in that direction, but incoherently in all other directions.

In general, a hologram reconstructs a scene holographically by the coherent superposition of waves in the horizontal and the vertical directions. Such a video hologram is called a full-parallax hologram. The reconstructed object exhibits motion parallax in the horizontal and the vertical directions, like a real object. However, a large viewing angle requires high resolution of the spatial light modulator in both the horizontal and the vertical directions.

Often, the demands on the spatial light modulator are reduced by restriction to a horizontal-parallax-only (HPO) hologram. The holographic reconstruction then takes place only in the horizontal direction, while there is no holographic reconstruction in the vertical direction. The result is a reconstructed object with horizontal motion parallax; the perspective view does not change with vertical motion. An HPO hologram requires a spatial light modulator with less resolution in the vertical direction than a full-parallax hologram. A vertical-parallax-only (VPO) hologram is also possible but uncommon: the holographic reconstruction occurs only in the vertical direction, yielding a reconstructed object with vertical motion parallax and no motion parallax in the horizontal direction. The different perspective views for the left eye and the right eye then have to be generated separately.

Discussion of related art

Typically, devices for generating three-dimensional images are not compact: they require complex and bulky optical systems, which makes them unsuitable for portable or handheld devices such as mobile phones. In US 4,208,086, for example, the device for generating a large three-dimensional image is of the order of one metre in length. In WO 2004/044659 (US 2006/0055994), the device for holographic reconstruction of three-dimensional scenes has a thickness of more than 10 cm. The conventional devices described above are therefore much too thick for mobile phones or other portable, handheld or small display devices.

WO 2004/044659 (US 2006/0055994) describes a device for reconstructing three-dimensional scenes by diffraction of sufficiently coherent light; the device comprises a point light source or line light source, a lens for focusing the light, and a spatial light modulator. In contrast to conventional holographic displays, the spatial light modulator, operating in transmission, reconstructs the three-dimensional scene in at least one "virtual observer window" (for a description of the virtual observer window and related techniques, see Appendices I and II). Each virtual observer window is located near the observer's eyes and is limited in size so that it lies within a single diffraction order; each eye therefore sees the complete reconstruction of the three-dimensional scene within a frustum-shaped reconstruction space that stretches between the surface of the spatial light modulator and the virtual observer window. For an undisturbed holographic reconstruction, the size of the virtual observer window must not exceed the periodicity interval of one diffraction order of the reconstruction. It must, however, be at least large enough for the observer to see the entire reconstruction of the three-dimensional scene through it. The other eye can look through the same virtual observer window, or through a second virtual observer window generated by a second light source. Here, the typically large visible region is limited to the locally positioned virtual observer windows. This solution uses the large area provided by the surface of a conventional high-resolution spatial light modulator to reconstruct the scene towards a reduced-size virtual observer window. For geometric reasons only small diffraction angles are then required, and the computational load is low enough for consumer-level computing hardware to achieve high-quality real-time holographic reconstruction.
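
As a rough worked example (all numbers assumed, not taken from the cited documents), the maximum width of such a virtual observer window is the periodicity interval of one diffraction order at the viewing distance, approximately λ·d/p for wavelength λ, viewing distance d and SLM pixel pitch p; a few millimetres suffices, since the window only needs to cover the eye pupil.

    wavelength = 532e-9     # assumed read-out wavelength [m]
    pitch = 50e-6           # assumed SLM pixel pitch [m]
    distance = 0.4          # assumed viewing distance for a handheld device [m]

    window = wavelength * distance / pitch   # periodicity interval of one diffraction order
    print(f"maximum virtual observer window width ~ {window * 1e3:.1f} mm")   # ~4.3 mm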

However, the known methods of generating three-dimensional images have the disadvantage of requiring a focusing lens with a surface area as large as that of the spatial light modulator, which makes the device bulky, heavy and expensive. A further disadvantage is that with such a large lens, chromatic aberrations towards its edges severely degrade the quality of the reconstruction. An improved illumination arrangement comprising a lenticular array is described in US 2006/250671, which is incorporated by reference, although there it is applied to large-area video holograms.

A mobile phone that produces a three-dimensional image is described in US 2004/0223049. However, the three-dimensional images there are produced using an autostereoscopic display. One problem with using an autostereoscopic display to produce a three-dimensional image is that the viewer typically perceives the image to be located away from the surface of the display, while the viewer's eyes tend to focus on the surface of the display. In many cases, this mismatch between where the viewer's eyes focus and where the three-dimensional image is perceived causes discomfort to the user. Where holographic techniques are used to generate the three-dimensional image, these problems do not occur or are greatly reduced.

In a first aspect, a mobile telephone system is provided comprising a calling party's mobile phone having an imaging system and a display, the imaging system being used to capture an image of the calling party; the calling party's mobile phone transmits the calling party's image over a wireless link to the called party's mobile phone, and the called party's mobile phone locally generates a holographic reconstruction of the calling party using a holographic display encoded with a hologram.

The holographic display may include an organic light emitting diode (OLED) array which writes to an optically addressed spatial light modulator, the two forming adjacent layers. The holographic display may comprise two pairs, each pair consisting of an organic light emitting diode array and an optically addressed spatial light modulator, the organic light emitting diode array writing to the optically addressed spatial light modulator and the pairs being formed as a stack of adjacent layers.

The holographic display may include an electronically addressed spatial light modulator. The holographic display may include two electronically addressed spatial light modulators.

The mobile telephone system may reconstruct video holograms. The mobile telephone system may include a remote server or intermediate system that adds a depth map, so that the calling party's image and a depth map are transmitted to the called party's mobile phone. The called party's mobile phone may include a synchronization device to compensate for the delay introduced by the remote server. The remote server may hold data defining a three-dimensional solid model of the calling party's face.

The called party's mobile phone may include a stop function to generate a static hologram reconstruction.

The called party's mobile phone may include a zoom function that allows the user to magnify part of the holographic reconstruction.

The called party's mobile phone and/or the calling party's mobile phone may include a stereo camera.

The called party's mobile phone and/or the calling party's mobile phone may include a single camera and software that uses the data obtained from a single camera to generate a depth map.
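
For the stereo-camera variant mentioned above, one conventional way to obtain such a depth map is disparity estimation by block matching between the two views; the sketch below uses OpenCV for illustration, with placeholder file names and the assumption that the frames are already calibrated and rectified (the single-camera variant would instead rely on monocular depth-estimation software, which is not shown here).

    import cv2

    # Placeholder file names: rectified left/right frames from the phone's stereo camera.
    left = cv2.imread("caller_left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("caller_right.png", cv2.IMREAD_GRAYSCALE)

    # Block-matching stereo: disparity is inversely proportional to depth.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right)   # fixed-point disparity, scaled by 16 in OpenCV

    # With known calibration: depth = focal_length_px * baseline_m / (disparity / 16.0)
    # for every pixel where the disparity is positive.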

The called party's mobile phone and/or the calling party's mobile phone can display an indication on the screen to guide the user to ideally set the camera's position or orientation for optimal image capture and/or holographic reconstruction.

The called party's mobile phone and/or the calling party's mobile phone may have a display arranged such that the resulting holographic reconstruction is seen correctly and clearly when the user is at a preset distance from the display.

The called party's mobile phone and/or the calling party's mobile phone may be a display device that can switch from the holographic reconstruction mode to the conventional two-dimensional display mode.

The called party's mobile phone and/or the calling party's mobile phone may be a handheld portable device.

The called party's mobile phone and/or the calling party's mobile phone can be a personal digital assistant (PDA).

The mobile phone of the called party and/or the mobile phone of the calling party may be a video game device.

The mobile phone system allows the holographic reconstruction of the holographic display to be viewed by a single user.

The holographic display may be able to produce a two-dimensional image that is focused on a screen without the need for any projection lens, regardless of the distance of the screen from the device, provided the screen is in the optical far field.

In the mobile telephone system, the spatial light modulator of the holographic display may be located within 30 mm of the light source, the whole being housed in a portable casing.

The mobile telephone system may have a beam-steering element in the holographic display for tracking the virtual observer window. The beam-steering element may consist of liquid crystal domains within an isotropic host material, where the interfaces between the domains and the host are shaped as prisms, as portions of spheres or as portions of cylinders, and the orientation of the liquid crystal is controlled by an applied electric field so as to change the local refractive or diffractive properties of the beam-steering element.

The called party's mobile phone and/or the calling party's mobile phone may be a device in which the optically addressed spatial light modulator encodes a hologram, and in which a holographic reconstruction is produced when a read beam array illuminates the optically addressed spatial light modulator and the optically addressed spatial light modulator is appropriately controlled via the organic light emitting diode array.

The called party's mobile phone and/or the calling party's mobile phone may be a device in which an organic light emitting diode array writes to a paired optically addressed spatial light modulator, the organic light emitting diode array and the optically addressed spatial light modulator forming adjacent layers, and the paired optically addressed spatial light modulator encodes a hologram, such that a holographic reconstruction is produced when a read beam array illuminates the paired optically addressed spatial light modulator and the paired optically addressed spatial light modulator is appropriately controlled via the organic light emitting diode array.

The called party's mobile phone and/or the calling party's mobile phone may be a device including a first organic light emitting diode array that writes to a first optically addressed spatial light modulator and a second organic light emitting diode array that writes to a second optically addressed spatial light modulator, the first organic light emitting diode array and the first optically addressed spatial light modulator forming adjacent layers, and the second organic light emitting diode array and the second optically addressed spatial light modulator forming adjacent layers, the first and second optically addressed spatial light modulators encoding holograms, such that a holographic reconstruction is produced when a read beam array illuminates the first and second optically addressed spatial light modulators and the first and second optically addressed spatial light modulators are appropriately controlled via the first and second organic light emitting diode arrays.

The called party's mobile phone and/or the calling party's mobile phone may be a device in which the first and second organic light emitting diode array / optically addressed spatial light modulator pairs are controlled so as to modulate the amplitude and the phase of the read beam array.

The called party's mobile phone and/or the calling party's mobile phone may be a device in which a first pair, consisting of an organic light emitting diode array and an optically addressed spatial light modulator, modulates a first combination of the amplitude and phase of the read beam array, and a second pair, consisting of a second organic light emitting diode array and an optically addressed spatial light modulator, modulates a second, different combination of the amplitude and phase of the read beam array.

The called party's mobile phone and/or the calling party's mobile phone may be a display device that generates a holographic reconstruction for viewing by a single user, the display including a first organic light emitting diode array that writes to a first optically addressed spatial light modulator and forms adjacent layers with it, and a second organic light emitting diode array that writes to a second optically addressed spatial light modulator and forms adjacent layers with it.

The called party's mobile phone and/or the calling party's mobile phone may be a display device having a display mode that generates a two-dimensional image focused on a screen, without any projection lens and regardless of the distance of the screen from the device. The display includes a first organic light emitting diode array that writes to a first optically addressed spatial light modulator and forms adjacent layers with it, and a second organic light emitting diode array that writes to a second optically addressed spatial light modulator and forms adjacent layers with it.

The called party's mobile phone and/or the calling party's mobile phone may have a display mode in which the device operates as an autostereoscopic display, including an organic light emitting diode array that writes to an amplitude-modulating optically addressed spatial light modulator and forms adjacent layers with it, and a beam splitter, such that an observer's eyes see a stereoscopic image when the read beam array illuminates the optically addressed spatial light modulator and the optically addressed spatial light modulator is appropriately controlled via the organic light emitting diode array.

In another aspect, a method can be used that includes the steps of using a mobile telephone system as described herein.

In another aspect, a method of providing a telecommunications service is provided in which a network operator provides a calling party's mobile phone, a called party's mobile phone, a wireless link and a remote server; the calling party's mobile phone has an imaging system and a display, the imaging system being used to capture an image of the calling party; the calling party's mobile phone transmits the calling party's image over the wireless link to the called party's mobile phone, and the called party's mobile phone locally generates a holographic reconstruction of the calling party using a holographic display encoded with a hologram.

The method of providing telecommunications services allows the called party's mobile phone display to include at least one organic light emitting diode array that writes to at least one optically addressed spatial light modulator and forms adjacent layers with it.

In another aspect, a method is provided for making an image call from a calling party's mobile phone having an imaging system and a display, the imaging system being used to capture an image of the calling party; the calling party's mobile phone transmits the calling party's image to the called party's mobile phone, and the called party's mobile phone locally generates a holographic reconstruction of the calling party using a holographic display encoded with a hologram.

The method of making an image call from the calling party's mobile phone allows the called party's mobile phone display to include an organic light emitting diode array that writes to an optically addressed spatial light modulator and forms adjacent layers with it.

In the method of making an image call from the calling party's mobile phone, a remote server or intermediate system may add a depth map so that the calling party's image and depth map are transmitted to the called party's mobile phone.

The method of making an image call from the calling party's mobile phone allows the called party's mobile phone to include a synchronization device to compensate for the delay caused by the remote server.

The method of making an image call from the calling party's mobile phone allows the remote server to include data defining a three-dimensional solid model of the calling party's face.

The method of making an image call from the calling party's mobile phone allows the called party's mobile phone to include a stop function to generate a static holographic reconstruction.

The method of making an image call from the calling party's mobile phone allows the called party's mobile phone to include a zoom function, allowing the user to magnify part of the holographic reconstruction.

The method of making an image call from the calling party's mobile phone allows the called party's mobile phone and/or the calling party's mobile phone to include a stereo camera.

The method of making an image call from the calling party's mobile phone allows the called party's mobile phone and/or the calling party's mobile phone to include a single camera and software that uses the data obtained from the single camera to generate a depth map.

The method of making an image call from the calling party's mobile phone allows the called party's mobile phone and/or the calling party's mobile phone to display an indication on the screen, guiding the user to set the ideal position or orientation of the camera so as to obtain the best image capture and/or holographic reconstruction.

Using the "space light modulator to encode a full image map" means that the hologram is spatially tuned Encoding on the transformer.

A. Close combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator

This embodiment provides a close combination of an optically addressed spatial light modulator with an infrared-emitting display that writes a pattern onto the optically addressed spatial light modulator, such a combination being capable of producing a three-dimensional image under suitable illumination.

An optically addressed spatial light modulator includes a photoreceptor layer and a liquid crystal (LC) layer positioned between conductive electrodes. When a voltage is applied to the electrodes, the light pattern incident on the photoreceptor layer is transferred to the liquid crystal layer, which then modulates the read beam. In conventional arrangements, the incident light pattern is provided by a write beam modulated by an electronically addressed spatial light modulator (EASLM); the electronically addressed spatial light modulator is illuminated by a light source and imaged onto the optically addressed spatial light modulator. Typically, the write beam is incoherent so as to avoid speckle patterns, while the read beam is coherent and is therefore able to produce a diffraction pattern.

An advantage of an optically addressed spatial light modulator over an electronically addressed spatial light modulator is that the optically addressed spatial light modulator can have a continuous, non-pixelated or non-patterned structure, whereas the electronically addressed spatial light modulator has a pixelated structure. Pixels produce sharp edges in the spatial distribution of the light; such sharp edges correspond to high spatial frequencies.

High spatial frequencies result in wide-angle diffraction in the optical far field. Electronically addressed spatial light modulators therefore produce undesirable diffraction artefacts in the optical far field, which have to be removed using known techniques such as spatial filtering. In optical processing, spatial filtering requires additional steps, which make the device thicker and waste light. An advantage of devices of the optically addressed spatial light modulator type is that a continuous pattern can be generated in the optically addressed spatial light modulator. In a continuous pattern, the light intensity exhibits less abrupt changes in any direction transverse to the direction of beam propagation. Fewer abrupt changes mean a lower concentration of high spatial frequencies than at the pixel edges produced by an electronically addressed spatial light modulator device. In devices that include optically addressed spatial light modulators, this lower concentration of high spatial frequencies facilitates optical processing and makes it more efficient than in devices that include electronically addressed spatial light modulators. Furthermore, unlike a typical electronically addressed spatial light modulator, an optically addressed spatial light modulator can be a bistable device. An optically addressed spatial light modulator can therefore have lower power requirements than an electronically addressed spatial light modulator device, which can increase the battery life of a portable or handheld device.

In this embodiment, a compact device that requires no imaging optics is described. The optically addressed spatial light modulator is written using an infrared organic light emitting diode display. The organic light emitting diode display is attached directly to the optically addressed spatial light modulator, forming a compact device without imaging optics. The organic light emitting diodes may be of a tileable type so that an organic light emitting diode array can be built up. Likewise, the optically addressed spatial light modulator may be assembled from several smaller, tileable optically addressed spatial light modulators.

The close combination of the organic light emitting diode display and the optically addressed spatial light modulator can be transparent. Transparent organic light emitting diode displays are known, for example as described in the "Organic Light Emitting Diode Materials" section. In one example, the close combination of the organic light emitting diode display and the optically addressed spatial light modulator is illuminated from the edge to form the three-dimensional image, the visible light being transmitted through the organic light emitting diode display and the optically addressed spatial light modulator towards the observer. Preferably, the organic light emitting diode display emits infrared light that writes to an infrared-sensitive photoreceptor layer of the optically addressed spatial light modulator. Because the human eye is insensitive to infrared light, the observer does not see any light originating from the infrared write beam.

In another example, the close combination of an organic light emitting diode display and an optically addressed spatial light modulator allows the write beam and the read beam to be incident on opposite sides of the optically addressed spatial light modulator. In a further example, a reflective layer is placed on the side of the optically addressed spatial light modulator opposite the organic light emitting diode display, so that the three-dimensional image is viewed from the same side of the optically addressed spatial light modulator as the organic light emitting diode display, with the illumination source also on that same side of the optically addressed spatial light modulator: this is an example of a reflective display.

In an embodiment comprising an infrared organic light emitting diode array, the infrared-emitting organic light emitting diodes allow control of the amplitude, the phase, or a combination of the amplitude and phase, of the visible light transmitted by the optically addressed spatial light modulator, so that a hologram is generated in the optically addressed spatial light modulator. The optically addressed spatial light modulator may comprise a pair of transparent plates coated with two electrically conductive films, as described in US 4,941,735, which is incorporated by reference. A continuous or discontinuous photosensitive film may be applied to one of the conductive films.

A bistable ferroelectric liquid crystal, or some other type of liquid crystal, may be confined between the other conductive film and the photosensitive film. An activation voltage may be applied across the conductive films. In the optically addressed spatial light modulator, the optical write beam can be programmed pixel by pixel so as to switch the polarization of the optical read beam. The write beam programs the optically addressed spatial light modulator by activating individual photosensitive regions of the optically addressed spatial light modulator; where it has been activated by the write beam, the optically addressed spatial light modulator rotates the polarization of the read beam.

Figure 1 depicts an embodiment. 10 is a lighting device for providing illumination of a planar area, the illumination being sufficiently coherent to enable the generation of a three-dimensional image. An example of a lighting device for large-area video holograms is given in US 2006/250671, one example of which is shown in Figure 4. A device such as 10 may take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, whose light is incident on a focusing system, where the focusing system may be compact, such as a lenticular array or a microlens array. Alternatively, the light sources for 10 may consist of red, green and blue lasers, or of red, green and blue light emitting diodes that emit sufficiently coherent light. However, non-laser light sources with sufficient spatial coherence (e.g. light emitting diodes, organic light emitting diodes, cold cathode fluorescent lamps) are preferred. Laser sources have disadvantages such as speckle in the holographic reconstruction, relatively high cost, and possible safety issues for the eyes of viewers of the holographic display or of workers assembling the holographic display. The thicknesses of elements 10-13 can all be of the order of a few centimetres or less. Element 11 may be a colour filter array such that pixels of coloured light (e.g. red, green and blue light) are directed towards element 12, although a colour filter is not required if coloured light sources are used. Element 12 is an array of infrared organic light emitting diodes on a transparent substrate. The infrared organic light emitting diode array is arranged such that each infrared organic light emitting diode emits light towards element 13 in parallel with, and in registration with, the light emitted from its unique corresponding colour pixel. Element 13 is an optically addressed spatial light modulator. With respect to the optically addressed spatial light modulator, the infrared organic light emitting diode array provides the write beam; the coloured beams emitted via element 11 are the read beam. A viewer located at point 14, some distance from the device that includes the compact hologram generator 15, can view a three-dimensional image by looking towards 15. Elements 10, 11, 12 and 13 are arranged to be physically connected, each forming a layer of the structure, so that the whole is a single, unified object. The physical connection may be direct, or indirect where a thin intermediate layer such as a cover film lies between adjacent layers. The physical connection may be limited to small areas that ensure correct mutual alignment, or may extend over larger areas, even over the entire surface of the layers. The physical connection may be achieved by bonding the layers together, for example using an optically transmissive adhesive, so as to form the compact hologram generator 15, or by any other means (see the section outlining manufacturing procedures).

Element 10 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example as described in US 5,056,892 and US 5,919,551. Element 10 may comprise a polarization element or a set of polarization elements. A linear polarizer sheet is one example. Another example is a reflective polarizer that transmits one linear polarization state and reflects the orthogonal linear polarization state; such sheets are known, for example as described in US 5,828,488. A further example is a reflective polarizer that transmits one circular polarization state and reflects the orthogonal circular polarization state; such sheets are known, for example as described in US 6,181,395. Element 10 may include a focusing system that can be compact, such as a lenticular array or a microlens array. Element 10 may include other optical components known in the field of backlight technology.

Figure 4 is a side view of a prior-art arrangement, described in WO 2006/119920, showing three focusing elements 1101, 1102, 1103 of a vertical focusing system 1104 in the form of cylindrical lenses arranged horizontally in an array, and, as an example, the nearly collimated beams of a horizontal line light source LS2 passing through focusing element 1102 of the illumination unit towards the observer plane OP. As shown in Figure 4, several line light sources LS1, LS2, LS3 are arranged one above the other. The light emitted by each light source is spatially coherent in the vertical direction and spatially incoherent in the horizontal direction. This light passes through the transmissive cells of the light modulator SLM and, because the cells of the light modulator SLM are encoded with holograms, is diffracted in the vertical direction only. Focusing element 1102 images the light source LS2 onto the observer plane OP in a number of diffraction orders, of which only one is used. The beams emitted by light source LS2 are shown, by way of example, passing through only focusing element 1102 of the focusing system 1104. In Figure 4, the three beams shown represent the first diffraction order 1105, the zeroth diffraction order 1106, and the minus-first diffraction order 1107. Compared with a single point source, line light sources allow much higher light intensities to be generated. The effective light intensity can be increased further by using several holographic regions, each aligned with a line source, for reconstructing respective portions of the three-dimensional scene. Another advantage is that, without the use of a laser, multiple sources (e.g. behind slits that may form part of a shutter) can produce sufficiently coherent light.

B. Close combination of two pairs, each pair consisting of an organic light emitting diode array and an optically addressed spatial light modulator

In a further embodiment, a close combination of two organic light emitting diode array / optically addressed spatial light modulator pairs can be used to modulate amplitude and phase one after the other in a compact manner. A complex number consisting of an amplitude and a phase can thus be encoded onto the transmitted light, cell by cell.

This embodiment includes a first closely combined pair of an infrared organic light emitting diode array and an optically addressed spatial light modulator, and a second closely combined pair of an infrared organic light emitting diode array and an optically addressed spatial light modulator.

The first pair modulates the amplitude of the transmitted light and the second pair modulates its phase; alternatively, the first pair may modulate the phase and the second pair the amplitude of the transmitted light. Each close combination of an infrared organic light emitting diode array and an optically addressed spatial light modulator may be as described in Section A. The two closely combined pairs of infrared organic light emitting diode array and optically addressed spatial light modulator are separated by an infrared filter that absorbs infrared light while transmitting visible light.

In a first step, the first infrared organic light emitting diode array is patterned so as to provide amplitude modulation in the first optically addressed spatial light modulator. In a second step, the second infrared organic light emitting diode array is patterned so as to provide phase modulation in the second optically addressed spatial light modulator. The infrared filter blocks leakage of infrared light from the first closely combined pair of infrared organic light emitting diode array and optically addressed spatial light modulator to the second closely combined pair of infrared organic light emitting diode array and optically addressed spatial light modulator. The infrared filter likewise prevents infrared leakage from the second closely combined pair of infrared organic light emitting diode array and optically addressed spatial light modulator to the first closely combined pair of infrared organic light emitting diode array and optically addressed spatial light modulator. However, the infrared filter transmits the visible light from the first closely combined pair of infrared organic light emitting diode array and optically addressed spatial light modulator, which serves as the read beam for the second closely combined pair of infrared organic light emitting diode array and optically addressed spatial light modulator. The light transmitted by the second optically addressed spatial light modulator has then been modulated in both amplitude and phase, so that an observer viewing the light emitted by the device comprising the two closely combined pairs sees a three-dimensional image.

Modulating the phase and the amplitude in succession in this way enables complex values to be represented, provided that both the organic light emitting diode displays and the optically addressed spatial light modulators have sufficiently high resolution. This embodiment can therefore be used to generate a holographic image such that the viewer sees a three-dimensional image.
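
A minimal numerical sketch of this successive amplitude-and-phase modulation (idealised modulators and example values assumed, not taken from this patent): the amplitude-modulating pair and the phase-modulating pair together impose an arbitrary complex value on the read beam.

    import numpy as np

    target = 0.6 * np.exp(1j * 1.2)       # desired complex transmittance for one cell (example value)

    amplitude_pattern = np.abs(target)    # written via the amplitude-modulating OLED/OASLM pair
    phase_pattern = np.angle(target)      # written via the phase-modulating OLED/OASLM pair

    read_beam = 1.0 + 0j                  # unit-amplitude coherent read beam
    out = read_beam * amplitude_pattern * np.exp(1j * phase_pattern)
    assert np.isclose(out, target)        # the cascade reproduces the desired complex value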

Figure 2 shows an example of an implementation. 20 is a lighting device for providing illumination of a planar area, the illumination being sufficiently coherent to enable the generation of a three-dimensional image. An example of such a lighting device for large-area video holograms is given in US 2006/250671. A device of this type may take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, whose light is incident on a focusing system, where the focusing system may be compact, such as a lenticular array or a microlens array. Alternatively, the light sources for 20 may consist of red, green and blue lasers, or of red, green and blue light emitting diodes that emit sufficiently coherent light. However, non-laser light sources with sufficient spatial coherence (e.g. light emitting diodes, organic light emitting diodes, cold cathode fluorescent lamps) are preferred. Laser sources have disadvantages such as speckle in the holographic reconstruction, relatively high cost, and possible safety issues for the eyes of viewers of the holographic display or of workers assembling the holographic display.

The thicknesses of elements 20-23 and 26-28 can all be of the order of a few centimetres or less. Element 21 may comprise a colour filter array such that pixels of coloured light (e.g. red, green and blue light) are directed towards element 22, although a colour filter is not required if coloured light sources are used. Element 22 is an array of infrared organic light emitting diodes on a transparent substrate. The infrared organic light emitting diode array is arranged such that each infrared organic light emitting diode emits light towards element 23 in parallel with, and in registration with, the light emitted from its unique corresponding colour pixel. Element 23 is an optically addressed spatial light modulator. With respect to this optically addressed spatial light modulator, the infrared organic light emitting diode array provides the write beam; the coloured beams emitted via element 21 are the read beam. Element 26 is an infrared filter that transmits only visible light and blocks infrared light, so that the infrared light emitted by element 22 does not affect element 27. Element 27 is an optically addressed spatial light modulator. Element 28 is an array of infrared organic light emitting diodes on a transparent substrate. This infrared organic light emitting diode array is arranged such that each infrared organic light emitting diode emits light towards element 27 in parallel with, and in registration with, the light emitted from its unique corresponding colour pixel. With respect to the optically addressed spatial light modulator 27, the infrared organic light emitting diode array 28 provides the write beam; the coloured beams transmitted by element 26 are the read beam. With respect to the transmitted light, element 23 modulates the amplitude and element 27 modulates the phase; alternatively, element 27 may modulate the amplitude and element 23 the phase. Since the infrared organic light emitting diode array on transparent substrate 28 emits light towards element 26, element 26 can absorb this infrared light and prevent light from element 28 from addressing optically addressed spatial light modulator 23. In this arrangement, the two organic light emitting diode arrays 22 and 28 emit light in substantially opposite directions, which allows the two optically addressed spatial light modulators 23 and 27 to be placed in close proximity. Placing optically addressed spatial light modulators 23 and 27 close together reduces the problems of pixel loss and pixel crosstalk caused by beam divergence: when optically addressed spatial light modulators 23 and 27 are in close proximity, a better approximation to non-overlapping propagation of the coloured light beams through the optically addressed spatial light modulators is achieved. The order of elements 27 and 28 in Figure 2 may be reversed, but this is not considered ideal for achieving the goals of low crosstalk between the coloured light beams and high transmission through optically addressed spatial light modulators 23 and 27.

Element 20 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example as described in US 5,056,892 and US 5,919,551. Element 20 may comprise a polarization element or a set of polarization elements. A linear polarizer sheet is one example. Another example is a reflective polarizer that transmits one linear polarization state and reflects the orthogonal linear polarization state; such sheets are known, for example as described in US 5,828,488. A further example is a reflective polarizer that transmits one circular polarization state and reflects the orthogonal circular polarization state; such sheets are known, for example as described in US 6,181,395. Element 20 may include a focusing system that can be compact, such as a lenticular array or a microlens array. Element 20 may include other optical components known in the field of backlight technology.

A viewer located at point 24, some distance from the device that includes the compact hologram generator 25, can view the three-dimensional image by looking towards 25. Elements 20, 21, 22, 23, 26, 27 and 28 are arranged to be physically connected, each forming a layer of the structure, so that the whole is a single, unified object. The physical connection may be direct, or indirect where a thin intermediate layer such as a cover film lies between adjacent layers. The physical connection may be limited to small areas that ensure correct mutual alignment, or may extend over larger areas, even over the entire surface of the layers. The physical connection may be achieved by bonding the layers together, for example using an optically transmissive adhesive, so as to form the compact hologram generator 25, or by any other means (see the section outlining manufacturing procedures).

In Figure 2, the light emitted by organic light emitting diode arrays 22 and 28 is ideally collimated. In practice, however, the light emitted by real organic light emitting diodes may be uncollimated, for example Lambertian (fully diffuse). If the emission of the organic light emitting diodes is not well collimated, the organic light emitting diodes should be placed as close as possible to the corresponding optically addressed spatial light modulator. In such a case, the intensity incident on the surface of the optically addressed spatial light modulator varies approximately as the square of the cosine of the angle of incidence. Light incident at 45° or 60° therefore has an intensity of only one half or one quarter, respectively, of that of normally incident light. Consequently, if the organic light emitting diodes are sufficiently widely separated, and the visible-light pixels are sufficiently small and sufficiently close to the optically addressed spatial light modulator, this geometric effect produces a significant variation in the potential difference across the optically addressed spatial light modulator, even where the light distribution of the organic light emitting diodes is Lambertian. The intensity of the incident infrared light may not fall to zero between the points of the optically addressed spatial light modulator at which the light of the organic light emitting diodes is normally incident, which may reduce the achievable contrast of the device. The reduced contrast may, however, be acceptable if it allows the device structure to be simplified.
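
A quick check of the geometric factor quoted above (no device parameters assumed): the irradiance on the photoreceptor falls off roughly as the square of the cosine of the angle of incidence.

    import numpy as np

    for angle_deg in (0, 45, 60):
        factor = np.cos(np.radians(angle_deg)) ** 2
        print(f"{angle_deg:2d} deg -> {factor:.2f} of the normal-incidence intensity")
    # 0 deg -> 1.00, 45 deg -> 0.50, 60 deg -> 0.25, matching the one-half and one-quarter figures above.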

In Figure 2, the light emitted by organic light emitting diode arrays 22 and 28 is ideally collimated. In practice, however, the light emitted by real organic light emitting diodes may be uncollimated, for example Lambertian (fully diffuse). Where the emission of the organic light emitting diodes is not collimated, the geometric light distribution of the organic light emitting diodes can be corrected using a Bragg-filter holographic optical element, as described for example in US 5,153,670. A Bragg-filter holographic optical element can collimate the light, or at least collimate it better than without this component. Figure 8 shows an example of the action of a Bragg-filter holographic optical element. In Figure 8, 80 is an organic light emitting diode array, 81 is a Bragg-filter holographic optical element containing Bragg planes, such as Bragg plane 84, and 82 is an optically addressed spatial light modulator. For a single organic light emitting diode 83 in the organic light emitting diode array 80, the emitted infrared light has a distribution such as that indicated by 85. Light ray 86 emitted by organic light emitting diode array 80 is diffracted in holographic optical element 81 and is then incident on optically addressed spatial light modulator 82 at approximately normal incidence. In this way, the collimation of the infrared light incident on optically addressed spatial light modulator 82 can be improved.

A further implementation is shown in Figure 5. 57 is a lighting device for providing illumination of a planar area, the illumination being sufficiently coherent to enable the generation of a three-dimensional image. An example of such a lighting device for large-area video holograms is given in US 2006/250671. A device of this type may take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, whose light is incident on a focusing system 50, where the focusing system may be compact, such as a lenticular array or a microlens array. Alternatively, the light sources for 57 may consist of red, green and blue lasers, or of red, green and blue light emitting diodes that emit sufficiently coherent light. However, non-laser light sources with sufficient spatial coherence (e.g. light emitting diodes, organic light emitting diodes, cold cathode fluorescent lamps) are preferred. Laser sources have disadvantages such as speckle in the holographic reconstruction, relatively high cost, and possible safety issues for the eyes of viewers of the holographic display or of workers assembling the holographic display.

Element 57 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example as described in US 5,056,892 and US 5,919,551. Element 57 may comprise a polarization element or a set of polarization elements. A linear polarizer sheet is one example. Another example is a reflective polarizer that transmits one linear polarization state and reflects the orthogonal linear polarization state; such sheets are known, for example as described in US 5,828,488. A further example is a reflective polarizer that transmits one circular polarization state and reflects the orthogonal circular polarization state; such sheets are known, for example as described in US 6,181,395. Element 57 may comprise other optical elements known in the field of backlight technology.

The thickness of the elements 57, 50-54 may all be on the order of a few centimeters or less. Element 51 may comprise a color filter array such that pixels of colored light (e.g., red, green, and blue light) are directed toward element 52, although a color filter is not required if a colored light source is used.

Element 52 is an array of infrared organic light emitting diodes on a transparent substrate. The infrared organic light emitting diode array is arranged such that, for each colour pixel, a single pair of two infrared organic light emitting diodes emits light towards element 53 in parallel with, and in registration with, the light from its corresponding colour pixel. The first infrared organic light emitting diode of each pair emits infrared light of a first wavelength. The second infrared organic light emitting diode emits infrared light of a second wavelength, the second wavelength being different from the first. Element 53 is an optically addressed spatial light modulator. Element 54 is a further optically addressed spatial light modulator. With respect to the optically addressed spatial light modulators, the infrared organic light emitting diode array provides the write beams; the coloured beams emitted via element 51 are the read beam. Optically addressed spatial light modulator 53 is controlled by the first of the two infrared wavelengths emitted by organic light emitting diode array 52. Optically addressed spatial light modulator 53 is insensitive to the second of the two infrared wavelengths emitted by organic light emitting diode array 52 and transmits light at that second wavelength. Optically addressed spatial light modulator 54 is controlled by the second of the two infrared wavelengths emitted by organic light emitting diode array 52. Optically addressed spatial light modulator 54 is insensitive to the first of the two infrared wavelengths emitted by organic light emitting diode array 52; alternatively, absorption in optically addressed spatial light modulator 53 and/or in other layers may prevent light of the first infrared wavelength from reaching optically addressed spatial light modulator 54, in which case an optically addressed spatial light modulator 54 that is insensitive to the first infrared wavelength is not strictly required in the compact hologram generator 55. Alternatively, a single organic light emitting diode emitting two different wavelengths may be used, the relative intensities of the two wavelengths being determined by a parameter such as the voltage across the organic light emitting diode. Emission at the two different wavelengths may also be controlled using time multiplexing.

With respect to the transmitted light, element 53 modulates the amplitude and element 54 modulates the phase; alternatively, element 54 may modulate the amplitude and element 53 the phase. In this arrangement, organic light emitting diode array 52 emits light at two different wavelengths, which allows the two optically addressed spatial light modulators 53 and 54 to be placed in close proximity. Placing optically addressed spatial light modulators 53 and 54 close together reduces the problems of pixel loss and pixel crosstalk caused by beam divergence: when optically addressed spatial light modulators 53 and 54 are in close proximity, a better approximation to non-overlapping propagation of the coloured light beams through the optically addressed spatial light modulators is achieved.

A viewer located at point 56, some distance from the device that includes the compact hologram generator 55, can view the three-dimensional image by looking towards 55. Elements 57, 50, 51, 52, 53 and 54 are arranged to be physically connected, each forming a layer of the structure, so that the whole is a single, unified object. The physical connection may be direct, or indirect where a thin intermediate layer such as a cover film lies between adjacent layers. The physical connection may be limited to small areas that ensure correct mutual alignment, or may extend over larger areas, even over the entire surface of the layers. The physical connection may be achieved by bonding the layers together, for example using an optically transmissive adhesive, so as to form the compact hologram generator 55, or by any other means (see the section outlining manufacturing procedures).

Where the optically addressed spatial light modulator performs amplitude modulation, in a typical arrangement the incident read beam is linearly polarized by passing it through a linear polarizer. The amplitude modulation is controlled by the rotation of the liquid crystal in the applied electric field, which is generated by the photosensitive layer and affects the polarization state of the light. In such a device, light leaving the optically addressed spatial light modulator passes through a further linear polarizer, so that the change in polarization state imparted within the optically addressed spatial light modulator is converted into a reduction in intensity.

Where the optically addressed spatial light modulators perform phase modulation, in a typical arrangement the incident read beam, unless it is already in a defined linear polarization state, is linearly polarized by passing it through a linear polarizer. The phase modulation is controlled by the applied electric field, which is generated by the photosensitive layer and affects the phase of the light. In one example of phase modulation a nematic liquid crystal is used, in which the direction of the optical axis is spatially fixed but the birefringence is a function of the applied voltage. In another example of phase modulation a ferroelectric liquid crystal is used, in which the birefringence is fixed but the direction of the optical axis is controlled by the applied voltage. In either implementation of phase modulation, the output beam has a phase difference with respect to the input beam that is controlled by the applied voltage. One example of a liquid crystal element that can perform phase modulation is the Freedericksz cell arrangement, in which an anti-parallel aligned nematic liquid crystal with positive dielectric anisotropy is used, as described in US 5,973,817.

C. Close combination of compact light source and electronically addressed spatial light modulator.

This embodiment provides a close combination of an electronically addressed spatial light modulator and a sufficiently coherent compact light source, capable of producing an adequately illuminated three-dimensional image.

In this embodiment a close combination of an electronically addressed spatial light modulator and a compact light source is described which requires no imaging optics. The embodiment provides a close combination of a light source or light sources, a focusing means, an electronically addressed spatial light modulator (EASLM) and an optional beam splitter element, the combination being capable of producing an adequately illuminated three-dimensional image.

Figure 11 shows an embodiment. 110 is an illumination device providing illumination of a planar area, the illumination being sufficiently coherent to enable the generation of a three-dimensional image. An example of an illumination device for a large-area video hologram is given in US 2006/250671; one such example is shown in Figure 4. A device such as 110 may take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, emitting light towards a focusing system, which may be compact, such as a lenticular array or a microlens array. Alternatively, the light sources for 110 may be red, green and blue lasers, or red, green and blue light emitting diodes that emit light of sufficient coherence. The red, green and blue light emitting diodes may be organic light emitting diodes (OLEDs). However, non-laser light sources with sufficient spatial coherence (for example light emitting diodes, organic light emitting diodes or cold cathode fluorescent lamps) are preferred. Laser sources have disadvantages such as laser speckle in the holographic reconstruction, relatively high cost, and possible safety issues concerning the eyes of viewers of the holographic display or of workers assembling the holographic display.

Element 110 may have a thickness of about a few centimeters or less. In the preferred embodiment, elements 110 to 113 together are less than three centimeters thick, so as to provide a closely combined, compact source. Element 111 may be a color filter array such that pixels of colored light (for example red, green and blue light) are directed towards element 112, although a color filter is not required if colored light sources are used. Element 112 is an electronically addressed spatial light modulator. Element 113 is an optional beam splitter element. A viewer at point 114, located some distance from the device including the compact hologram generator 115, can view the three-dimensional image by looking towards 115.

Element 110 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example as described in US 5,056,892 and US 5,919,551. Element 110 may comprise a polarizing element or a set of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linear polarization state and reflects the orthogonal linear polarization state - such sheets are known, for example as described in US 5,828,488. A further example is a reflective polarizer that transmits one circular polarization state and reflects the orthogonal circular polarization state - such sheets are known, for example as described in US 6,181,395. Element 110 may include other optical components known in the art of backlighting.

An electronically addressed spatial light modulator is a spatial light modulator in which each element of an array of elements can be addressed electronically. Each element acts on the incident light in some way, for example by modulating the amplitude of the light it transmits, by modulating the phase of the light it transmits, or by modulating some combination of the amplitude and phase. An example of an electronically addressed spatial light modulator is given in US 5,973,817, which describes a phase-modulating electronically addressed spatial light modulator. A liquid crystal electronically addressed spatial light modulator is one example of an electronically addressed spatial light modulator; a magneto-optic electronically addressed spatial light modulator is another.

Elements 110, 111, 112 and 113 are physically connected, each forming a layer of the structure, so that the whole forms a single, unified object. The physical connection may be direct, or indirect where a thin intermediate layer or film lies between adjacent layers. The physical connection may be limited to small areas that ensure correct mutual alignment, or may extend over larger areas, even the entire surface of a layer. The physical connection may be achieved by bonding the layers together, for example using an optically transmissive adhesive, to form the compact hologram generator 115, or by any other means (see the section outlining the manufacturing process).

Figure 4 is a side view of a prior art arrangement showing three focusing elements 1101, 1102, 1103 of a vertical focusing system 1104 in the form of horizontally arranged cylindrical lenses in an array. By way of example, a nearly collimated light bundle from the horizontal line light source LS2 passes through focusing element 1102 of the illumination unit towards the observer plane OP. As shown in Figure 4, a number of line light sources LS1, LS2, LS3 are arranged one above the other. The light emitted by each light source is spatially coherent in the vertical direction and spatially incoherent in the horizontal direction. The light passes through the transmissive cells of the spatial light modulator SLM and, because the cells of the spatial light modulator SLM are encoded with the hologram, is diffracted in the vertical direction only. The focusing element 1102 images the light source LS2 into the observer plane OP in several diffraction orders, only one of which is used. By way of example, light bundles emitted by the light source LS2 are shown passing through focusing element 1102 of the focusing system 1104 only. In Figure 4, the three bundles shown correspond to the first diffraction order 1105, the zeroth order 1106 and the minus first order 1107. Compared with a single point light source, line light sources allow much higher light intensities to be achieved. The usable light intensity can be increased further by using several hologram regions, each assigned to a line light source, for each portion of the reconstructed three-dimensional scene. A further advantage is that, without using a laser, many light sources (for example behind slits, which may form part of a shutter) can generate sufficiently coherent light.

Typically, the holographic display reconstructs a wavefront in one or more virtual observer windows. The wavefront is the one that a real object would generate if it existed. When an observer's eye is positioned in one of the virtual observer windows (VOWs), the observer sees the reconstructed object. As shown in Figure 6A, the holographic display consists of the following components: a light source, a lens, a spatial light modulator and an optional beam splitter.

To facilitate a close combination of a spatial light modulator and a compact light source capable of displaying a holographic image, the single light source and single lens of Figure 6A may be replaced by a light source array and a lens array or lenticular array, respectively, as shown in Figure 6B. In Figure 6B the light sources illuminate the spatial light modulator and the lenses image the light sources into the observer plane. The spatial light modulator encodes the hologram and modulates the incoming wavefront so that the desired wavefront is reconstructed in the virtual observer window. The optional beam splitter element may be used to create several virtual observer windows, for example one virtual observer window for the left eye and one virtual observer window for the right eye.

If an array of light sources is used with a lens array or a lenticular array, the separation of the light sources in the array must be such that the light passing through all the lenses of the lens array or lenticular array coincides at the virtual observer window.

The arrangement of Figure 6B is suited to a compact design that can be applied to a compact holographic display. Such a holographic display may be used in mobile applications, for example in a mobile phone or a personal digital assistant. Typically such a holographic display has a screen size of one inch or a few inches; the screen size of the holographic display may be as small as one centimeter. Suitable components are described in detail below.

1) Light source / light source array

A fixed single light source can be used in simple cases. If the observer moves, the observer can be tracked and the display adjusted so that the generated image remains visible to the observer at the new position. With a fixed light source there is no tracking of the virtual observer window by the light source; instead, tracking is performed using a beam steering element placed after the spatial light modulator.

A configurable array of light sources may be implemented as a liquid crystal display (LCD) illuminated by a backlight. To generate an array of point or line sources, only the appropriate pixels are switched to the transmissive state. The apertures of these sources must be small enough to ensure sufficient spatial coherence for the holographic reconstruction of the object. An array of point sources is preferably used with a lens array comprising two-dimensionally arranged lenses; an array of line sources is preferably used with a lenticular array comprising parallel cylindrical lenses.

It is preferable to use an organic light emitting diode display as the light source array. Being a self-emissive device, it offers better compactness and better power efficiency than a liquid crystal display, in which most of the generated light is absorbed in components such as the color filters or in pixels that are not in a fully transmissive state. However, liquid crystal displays may retain an overall price advantage over organic light emitting diode displays, even though organic light emitting diode displays can provide light more efficiently. When an organic light emitting diode display is used as the light source array, only those pixels needed to create a virtual observer window at the eye positions are switched on. The organic light emitting diode display may have a two-dimensional array of pixels or a one-dimensional array of line sources. The emitting area of each point source, or the width of each line source, must be sufficiently small to ensure sufficient spatial coherence for the holographic reconstruction of the object. As before, an array of point sources is preferably used with a lens array comprising two-dimensionally arranged lenses, and an array of line sources is preferably used with a lenticular array comprising parallel cylindrical lenses.

2) Focus method: single lens, lens array or lenticular array

The focusing means images the light source or light sources into the observer plane. Because the spatial light modulator is located very close to the focusing means, the Fourier transform of the information encoded in the spatial light modulator lies in the observer plane. The focusing means comprises one or several focusing elements. The positions of the spatial light modulator and the focusing means may also be exchanged.

For a close combination of an electronically addressed spatial light modulator and a sufficiently coherent compact light source, a thin focusing means is necessary: conventional refractive lenses are too thick. Instead, a diffractive or holographic lens may be used. A diffractive or holographic lens can perform the function of a single lens, a lens array or a lenticular array. Such elements are commercially available, for example the surface-relief holographic products supplied by Physical Optics Corporation, Torrance, CA, USA. Alternatively a lens array may be used, comprising two-dimensionally arranged lenses, each lens being assigned to one light source of the light source array. A further option is a lenticular array, comprising a one-dimensional array of cylindrical lenses, each lens having a corresponding source in the light source array. As noted above, if a light source array is used with a lens array or lenticular array, the separation of the light sources in the array must be such that the light passing through all the lenses reaches the virtual observer window.

Light passing through one lens of the lens array or lenticular array is incoherent with light passing through any other lens. Therefore the hologram encoded on the spatial light modulator is composed of sub-holograms, each sub-hologram corresponding to one lens. The aperture of each lens must be sufficiently large to ensure adequate resolution of the reconstructed object. A lens aperture that is approximately as large as the typical size of the hologram encoding region may be used, as described for example in US 2006/0055994. That is, the aperture of each lens is of the order of one or a few millimeters.

3) Spatial light modulator

The hologram is encoded on the spatial light modulator. Typically the hologram encoding is a two-dimensional array of complex numbers. Ideally, therefore, the spatial light modulator should be able to modulate both the amplitude and the phase of the light beam passing through each of its pixels. However, a typical spatial light modulator can modulate only the amplitude or only the phase, and cannot modulate both independently.

An amplitude-modulating spatial light modulator may be used with detour-phase encoding, for example Burckhardt encoding. The disadvantages are that three pixels are needed to encode one complex number, and that the reconstructed object is less bright.

A phase-modulating spatial light modulator yields a brighter reconstruction. As an example, so-called two-phase encoding may be used, in which two pixels encode one complex number.
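
As an illustration of the two encoding schemes just mentioned, the following minimal sketch shows one simple way of mapping a complex hologram value onto three non-negative amplitudes (a Burckhardt-type detour-phase decomposition) and onto two phase values (two-phase encoding). The function names and the normalisation |c| <= 1 are assumptions made for illustration; this is a sketch of the general techniques, not the exact encoding used in the display described here.

import numpy as np

def burckhardt_amplitudes(c):
    # Decompose complex c into three non-negative amplitudes a0, a1, a2 on the
    # unit vectors exp(0j), exp(2j*pi/3), exp(4j*pi/3), so that
    # a0 + a1*exp(2j*pi/3) + a2*exp(4j*pi/3) == c  (three pixels per complex value).
    thetas = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
    a = (2.0 / 3.0) * np.real(c * np.exp(-1j * thetas))
    # Shifting all three amplitudes by the same constant adds a multiple of
    # (u0 + u1 + u2) = 0, so c is unchanged; shift so that all amplitudes are >= 0.
    return a - a.min()

def two_phase(c):
    # Represent c (with |c| <= 1) as the mean of two unit phasors:
    # c = (exp(1j*phi1) + exp(1j*phi2)) / 2  (two pixels per complex value).
    amp = np.clip(np.abs(c), 0.0, 1.0)
    phi = np.angle(c)
    delta = np.arccos(amp)
    return phi + delta, phi - delta

# Quick self-check with an arbitrary complex value.
c = 0.4 * np.exp(1j * 1.1)
a = burckhardt_amplitudes(c)
print(np.allclose(a @ np.exp(1j * np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])), c))
p1, p2 = two_phase(c)
print(np.allclose(0.5 * (np.exp(1j * p1) + np.exp(1j * p2)), c))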

Electronically addressed spatial light modulators have pronounced pixel edges, which give rise to unwanted higher diffraction orders in their diffraction patterns. These problems can be reduced or eliminated by using soft apertures. A soft aperture is an aperture without a sharp transmission cut-off. One example of a soft aperture transmission profile is a Gaussian profile. Gaussian profiles are known to be advantageous in diffractive systems, because the Fourier transform of a Gaussian function is, as a mathematical result, itself a Gaussian function. A beam with a Gaussian intensity profile therefore retains the functional form of its profile on diffraction, in contrast to transmission through an aperture with a sharp cut-off in its transmission profile. A sheet bearing an array of Gaussian transmission profiles may be used: if such a sheet is provided in registration with the apertures of the electronically addressed spatial light modulator, a system is obtained without higher diffraction orders, or with substantially reduced higher diffraction orders, compared with a system having sharp cut-offs in its beam transmission profile. A Gaussian filter or soft aperture filter suppresses the diffraction products at high spatial frequencies. Gaussian or soft aperture filters can therefore minimize crosstalk between the virtual observer windows for the left and right eyes.
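
The benefit of the Gaussian soft aperture can be checked numerically: the far-field (Fourier) pattern of a hard-edged aperture carries roughly ten per cent of its energy outside the central lobe, whereas a Gaussian transmission profile of comparable width sends considerably less energy into those higher spatial frequencies. The sketch below makes only that qualitative comparison; the sampling parameters, the 10 μm aperture width and the choice of Gaussian width (sigma equal to a quarter of the aperture) are assumptions.

import numpy as np

n, dx = 4096, 0.05e-6                      # sample count and sampling interval (assumed)
x = (np.arange(n) - n / 2) * dx
width = 10e-6                              # pixel aperture width (assumed)

hard = (np.abs(x) <= width / 2).astype(float)      # sharp transmission cut-off
soft = np.exp(-x**2 / (2 * (width / 4)**2))        # Gaussian profile, sigma = width/4

def far_field_intensity(aperture):
    # Far-field pattern is the squared magnitude of the Fourier transform.
    spectrum = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(aperture)))
    return np.abs(spectrum)**2

I_hard = far_field_intensity(hard)
I_soft = far_field_intensity(soft)

# Energy beyond the central lobe of the hard aperture (first zeros at +/- 1/width).
f = np.fft.fftshift(np.fft.fftfreq(n, dx))
outside = np.abs(f) > 1.0 / width
print("hard aperture, energy outside central lobe: %.1f%%"
      % (100 * I_hard[outside].sum() / I_hard.sum()))
print("Gaussian aperture, energy in the same region: %.1f%%"
      % (100 * I_soft[outside].sum() / I_soft.sum()))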

4) Beam splitter component

The virtual observer window is limited to one periodicity interval of the Fourier transform of the information encoded in the spatial light modulator. With the highest-resolution spatial light modulators currently available, the virtual observer window is of the order of 10 mm in size. In some cases this may be too small for use in a holographic display without tracking. Enlarging the virtual observer window by multiplexing is a solution to this problem: several virtual observer windows are created. With spatial multiplexing, the virtual observer windows are created at the same time from different locations on the spatial light modulator. This can be achieved with a beam splitter. For example, one group of pixels on the spatial light modulator encodes the information for virtual observer window 1 and another group of pixels encodes the information for virtual observer window 2. The beam splitter separates the light from the two groups so that virtual observer window 1 and virtual observer window 2 are juxtaposed in the observer plane. Virtual observer window 1 and virtual observer window 2 can thus be tiled to create a larger virtual observer window. Multiplexing may also be used to create virtual observer windows for the left and right eyes. In that case seamless juxtaposition is not required, and there may be a gap between the virtual observer window or windows for the left eye and the virtual observer window or windows for the right eye. Care must be taken that higher diffraction orders of one virtual observer window do not overlap with other virtual observer windows.

A simple example of a beam splitter element is a parallax barrier consisting of black stripes with transparent regions between them, as described in US 2004/223049. Another example is a lenticular sheet, also described in US 2004/223049. A further example of a beam splitter element is a lens array in combination with a prism mask. In a compact holographic display a beam splitter element will typically be desirable, since a typical 10 mm virtual observer window is only large enough to serve one eye, which is inadequate for a typical viewer who has two eyes spaced approximately 10 cm apart. However, time multiplexing may be used as an alternative to spatial multiplexing; in the absence of spatial multiplexing, no beam splitter element is needed.

Spatial multiplexing may also be used to generate a color holographic reconstruction. For spatial color multiplexing there are separate groups of pixels for the red, green and blue color components. These groups are spatially separated on the spatial light modulator and are illuminated simultaneously with red, green and blue light. Each group is encoded with a hologram calculated for the corresponding color component of the object, and each group reconstructs the color component of the object assigned to it.

5) Time multiplexing

With time multiplexing, the virtual observer windows are generated one after another at the same position on the spatial light modulator. This is achieved by alternating the position of the light source in synchronism with re-encoding of the spatial light modulator. The alternate positions of the light sources must be such that the virtual observer windows are seamlessly juxtaposed in the observer plane. If the time multiplexing is fast enough, i.e. the full cycle is repeated at more than 25 Hz, the eye perceives a continuously enlarged virtual observer window.

Multiplexing may also be used to create virtual observer windows for the left and right eyes. In that case seamless juxtaposition is not required, and there may be a gap between the virtual observer window or windows for the left eye and those for the right eye. Such multiplexing may be spatial or temporal.

Spatial and time multiplexing may also be combined. As an example, three virtual observer windows are spatially multiplexed to create an enlarged virtual observer window for one eye; this enlarged virtual observer window is then time multiplexed to produce an enlarged virtual observer window for the left eye and an enlarged virtual observer window for the right eye.

Care must be taken that higher diffraction orders of one virtual observer window do not overlap with other virtual observer windows.

Multiplexing of the enlarged virtual observer window is preferably accompanied by re-encoding of the spatial light modulator, because this provides an enlarged virtual observer window with a continuous change of parallax as the observer moves. Simple multiplexing without re-encoding would merely repeat the same content in the different parts of the enlarged virtual observer window.

Time multiplexing may also be used to generate a color holographic reconstruction. For temporal color multiplexing, the three color components are encoded sequentially on the spatial light modulator, and the three light sources are switched in synchronism with the re-encoding of the spatial light modulator. If the full cycle is repeated fast enough, i.e. at more than 25 Hz, the eye perceives a continuous color reconstruction.

6) Handling of undesired higher diffraction orders

If a larger virtual observer window is tiled from smaller virtual observer windows, the higher diffraction orders of one virtual observer window are likely to cause disturbing crosstalk in the other virtual observer windows unless steps are taken to avoid this problem. As an example, if each virtual observer window lies in the zeroth diffraction order of the Fourier transform of the information encoded in the spatial light modulator, the first diffraction order of one virtual observer window is likely to overlap an adjacent virtual observer window. Such overlap may produce a disturbing background, which becomes particularly noticeable if the unwanted image intensity exceeds about 5% of the wanted image intensity. In such cases it may be desirable to compensate for or suppress the higher diffraction orders.

If the angle at which the spatial light modulator is illuminated is constant, a fixed angle filter may be used. This is the case if the holographic display has no tracking function, or if tracking is performed by a beam steering element located after the spatial light modulator. The fixed angle filter may be a Bragg filter or a Fabry-Perot etalon.

A Bragg filter holographic optical element may be used to modify the geometry of the light intensity distribution, as described in US 5,153,670, where the spatial light modulator produces a light intensity distribution containing unwanted diffraction orders. The Bragg filter holographic optical element gives rise to a light intensity distribution different from that obtained without the element. Figure 7 shows the function of a Bragg filter holographic optical element. In Figure 7, 70 is a spatial light modulator and 71 is a Bragg filter holographic optical element containing Bragg planes such as the Bragg plane 74. A single element 73 of the spatial light modulator 70 produces a diffracted light intensity distribution as shown at 75. A light ray 76 diffracted by the spatial light modulator 70 undergoes scattering in the holographic optical element 71 and is then transmitted in a direction different from its original direction of propagation between 70 and 71. If the direction of ray 76 between 70 and 71 corresponds to unwanted first-order diffracted light, it can readily be seen that the Bragg filter 71 redirects this light into a different direction so that it does not reach, and therefore does not disturb, the viewer. A typical viewer is located approximately along the direction perpendicular to 70.

A tunable means of suppressing diffraction orders is described in patent application DE 10 2006 030 503. It comprises a liquid crystal layer between two plane-parallel glass plates coated with partially reflective coatings. At each encounter with a coated surface the beam is partially reflected and partially transmitted. The transmitted beams interfere, and the phase differences between them determine whether the interference is constructive or destructive, as described by the theory of the Fabry-Perot etalon. For a given wavelength, the interference, and hence the transmission, varies with the angle of incidence of the beam.

For a given direction of light propagation, the interference can be tuned by changing the refractive index of the liquid crystal for that direction. The refractive index is controlled by an electric field applied across the liquid crystal layer. Within the limits of the Fabry-Perot etalon, the angular transmission characteristic can therefore be tuned, and diffraction orders can be selectively transmitted or reflected as required. For example, if the Fabry-Perot etalon is set for optimum transmission of the zeroth order and optimum reflection of the first order, there may still be some unwanted transmission of the second and higher orders. Within these limits, such a device assists in selecting a particular diffraction order, either permanently or sequentially, for transmission or for reflection.
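
The angular behavior described here follows the Airy transmission function of an ideal Fabry-Perot etalon, in which the round-trip phase depends on the refractive index of the liquid crystal layer, the layer thickness and the internal propagation angle; tuning the index shifts the transmission maxima in angle, so a given diffraction order can be moved onto or off a maximum. The sketch below evaluates that standard relation. The mirror reflectivity, layer thickness, wavelength, angles and the two refractive index settings are assumed, illustrative values only, not design data from the patent application cited above.

import numpy as np

def etalon_transmission(theta_deg, n, L, wavelength, R):
    # Airy transmission of an ideal lossless Fabry-Perot etalon.
    # theta_deg: internal propagation angle, n: refractive index of the layer,
    # L: layer thickness, R: reflectivity of the partially reflective coatings.
    theta = np.radians(theta_deg)
    delta = 4 * np.pi * n * L * np.cos(theta) / wavelength   # round-trip phase
    F = 4 * R / (1 - R)**2                                   # coefficient of finesse
    return 1.0 / (1.0 + F * np.sin(delta / 2)**2)

wavelength, L, R = 500e-9, 10e-6, 0.85     # assumed values
for n in (1.50, 1.52):                     # two settings of the liquid crystal index
    t0 = etalon_transmission(0.0, n, L, wavelength, R)   # direction of the zeroth order
    t1 = etalon_transmission(2.9, n, L, wavelength, R)   # direction of a first order (assumed)
    print(f"n = {n:.2f}: T(0 deg) = {t0:.2f}, T(2.9 deg) = {t1:.2f}")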

Spatial filters may be used to select diffraction orders. A spatial filter may be placed between the spatial light modulator and the virtual observer window and contains transparent and opaque regions. Such spatial filters can transmit the desired diffraction orders and block the unwanted ones. The spatial filter may be fixed or configurable; for example, an electronically addressed spatial light modulator placed between the spatial light modulator and the virtual observer window may be used as a configurable spatial filter.

7) Eye tracking

In a close combination, with eye tracking, of an electronically addressed spatial light modulator and a sufficiently coherent compact light source, an eye position detector detects the positions of the observer's eyes. One or several virtual observer windows can then be placed automatically at the eye positions, so that the observer sees the reconstructed object through the virtual observer windows.

However, tracking is not always practicable, because of the additional hardware and power requirements, which affect performance particularly in portable or handheld devices. Without tracking, the observer must adjust the position of the display. This is easily achievable because, in the preferred embodiment, the compact display is a handheld display, such as may be incorporated in a personal digital assistant or mobile phone. The user of a personal digital assistant or mobile phone typically views the display approximately perpendicularly, so it is not a great burden to adjust the display so that the virtual observer window coincides with the positions of the user's eyes. It is well known that users of handheld devices tend to change the orientation of the device in the hand to obtain the most favorable viewing conditions, as described in WO 01/96941. In such a device, therefore, eye tracking of the user is not required, and neither are tracking optics such as scanning mirrors, which are not compact. Eye tracking may nevertheless be applied in other devices, for which the additional hardware and power requirements do not impose an excessive burden.

In the absence of tracking, a close combination of an electronically addressed spatial light modulator and a sufficiently coherent compact light source requires a large virtual observer window in order to simplify adjustment of the display. A preferred virtual observer window size is several times the size of the eye pupil. This may be achieved either with a single large virtual observer window, using a spatial light modulator with a small pixel pitch, or by tiling several smaller virtual observer windows, using a spatial light modulator with a larger pixel pitch.

The position of the virtual observer window is determined by the position of the light source in the light source array. An eye position detector detects the positions of the eyes and sets the positions of the light sources so that the virtual observer windows coincide with the eye positions. This kind of tracking is described in US 2006/055994 and US 2006/250671.

Alternatively, the virtual observer window may be moved while the light sources remain in fixed positions. Tracking by moving the light sources requires a spatial light modulator that is relatively insensitive to changes in the angle of incidence of the light from the sources. If the light sources are moved in order to move the position of the virtual observer window, such an arrangement may make a close combination of the compact light source and the spatial light modulator difficult to achieve, because of the oblique light propagation that can occur within a tight combination. In such a case it is helpful to have a fixed light path within the display and a beam steering element as the last optical component of the display.

Beam steering elements are shown in Figures 20 and 21. The beam steering element changes the angle of the beam at the output of the display. It may have the optical properties of controllable prisms for x and y tracking and of a controllable lens for z tracking. For example, either or both of the beam steering elements of Figures 20 and 21 may be used in a single device. The beam steering element is a controllable diffractive element or a controllable refractive element. The controllable refractive element may comprise an array of cavities filled with liquid crystal, embedded in a host material (matrix) with an isotropic linear dielectric susceptibility tensor. The cavities have the shape of prisms or lenses. An electric field controls the effective refractive index of the liquid crystal and thereby controls the beam steering. The electric field may differ from element to element, producing beam steering properties that vary from element to element. As shown in Figure 20, the electric field is applied between transparent electrodes. The liquid crystal has uniaxial refractive properties and may be chosen such that its refractive index perpendicular to its optical axis equals the refractive index of the host material or matrix; other configurations may be derived from the prior art. The host material has an isotropic refractive index. If the optical axis of the liquid crystal is aligned along the z direction by applying a suitable electric field, as shown in Figure 20, a plane wave propagating along the z direction is not refracted on passing through the beam steering element, because it encounters no refractive index change for any polarization direction perpendicular to its Poynting vector. However, if an electric field is applied to the electrodes such that the optical axis of the liquid crystal is perpendicular to the z direction, a plane wave propagating along the z direction and polarized parallel to the optical axis experiences maximum refraction on passing through the beam steering element, because it experiences the largest refractive index change along its polarization direction that the system can provide. The degree of refraction can be adjusted between these two extremes by selecting the appropriate applied electric field.

If the cavities are prism-shaped rather than lens-shaped, the beam is steered. Figure 21 shows beam steering with suitable prisms. If the optical axis of the liquid crystal is aligned along the z direction by applying a suitable electric field, as shown in Figure 21, a plane wave propagating along the z direction is not refracted on passing through the beam steering element, because it encounters no refractive index change in its polarization direction. However, if an electric field is applied to the electrodes such that the optical axis of the liquid crystal is perpendicular to the z direction, a plane wave propagating along the z direction and polarized parallel to the optical axis experiences maximum refraction on passing through the beam steering element, because it experiences the largest refractive index change, perpendicular to its Poynting vector, that the system can provide. The degree of refraction can be adjusted between these two extremes by selecting the appropriate applied electric field.
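
The order of magnitude of the steering angle obtainable from the prism-shaped cavities can be estimated with the thin-prism approximation: for small angles and a planar exit face, the deflection of the output beam in air is roughly the prism wedge angle multiplied by the difference between the effective liquid crystal index and the host index. The wedge angle and the index values in the sketch below are assumptions chosen only to illustrate the scaling, not design data.

def thin_prism_deflection_deg(n_eff, n_host, wedge_angle_deg):
    # Small-angle estimate of the output-beam deflection (in air) produced by a
    # liquid-crystal-filled prism cavity embedded in an isotropic host material,
    # assuming a planar exit face perpendicular to the optical axis.
    return (n_eff - n_host) * wedge_angle_deg

n_host = 1.50                              # isotropic host index (assumed)
wedge = 20.0                               # prism wedge angle in degrees (assumed)
for n_eff in (1.50, 1.55, 1.60, 1.70):     # effective LC index set by the applied field
    d = thin_prism_deflection_deg(n_eff, n_host, wedge)
    print(f"n_eff = {n_eff:.2f}: deflection of about {d:.1f} degrees")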

8) Examples

An example of a close combination of an electronically addressed spatial light modulator and a sufficiently coherent compact light source will now be described. The combination can produce an adequately illuminated three-dimensional image and can be housed in a personal digital assistant or mobile phone. The close combination comprises an organic light emitting diode display as the light source array, an electronically addressed spatial light modulator and a lens array, as shown in Figure 12.

Depending on the required position of the virtual observer window (denoted OW in Figure 12), particular pixels of the organic light emitting diode display are activated. These pixels illuminate the electronically addressed spatial light modulator and are imaged into the observer plane by the lens array. For each lens of the lens array, at least one pixel of the organic light emitting diode display is activated. For the dimensions given in the drawing, with a pixel pitch of 20 μm the virtual observer window can be tracked in lateral increments of 400 μm; such tracking is quasi-continuous.
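
The 20 μm pixel pitch and the 400 μm tracking increment quoted above are consistent with the lens array imaging the light source plane into the observer plane at a magnification of about twenty times; the actual source-to-lens and lens-to-observer distances of Figure 12 are not stated in the text, so the distances used in the sketch below are assumptions chosen only to reproduce that ratio.

# Lateral step of the virtual observer window per one-pixel step of the light source.
# The lens images the source plane into the observer plane, so the step equals the
# source pitch multiplied by the imaging magnification M = d_observer / d_source.
d_source = 25e-3      # source-to-lens distance in meters (assumed)
d_observer = 0.5      # lens-to-observer distance in meters (assumed)
oled_pitch = 20e-6    # organic light emitting diode pixel pitch, from the text

magnification = d_observer / d_source
step = oled_pitch * magnification
print(f"magnification of about {magnification:.0f}x, "
      f"tracking increment of about {step * 1e6:.0f} um")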

The pixels of an organic light emitting diode display are light sources with only partial spatial coherence. Partial coherence leads to a blurred reconstruction of an object point. For the dimensions given in the drawing, with a pixel width of 20 μm the reconstruction of an object point located 100 mm from the display exhibits a lateral blur of 100 μm. This is adequate for the resolution of the human visual system.

Light passing through different lenses of the lens array has no significant mutual coherence; coherence is required only across each single lens of the lens array. The resolution of a reconstructed object point is therefore determined by the pitch of the lens array. For the human visual system a typical lens pitch of the order of 1 mm ensures sufficient resolution. With an organic light emitting diode pitch of 20 μm, this corresponds to a ratio of lens pitch to organic light emitting diode pitch of 50:1. If only one organic light emitting diode is illuminated for each lens, only one in 50^2 = 2,500 organic light emitting diodes is switched on at any time. This display will therefore be a low-power display. The difference between the holographic display described here and a conventional organic light emitting diode display is that the former concentrates the light into the viewer's eyes, whereas the latter emits light into 2π steradians. A conventional organic light emitting diode display achieves a luminance of about 1,000 cd/m^2 (a value found by the inventors in practice), whereas organic light emitting diodes used for illumination in this way should in practice be able to achieve a luminance that is a multiple of 1,000 cd/m^2.

The virtual observer window is limited to one diffraction order of the Fourier spectrum of the information encoded in the spatial light modulator. If the pixel pitch of the spatial light modulator is 10 μm and two pixels are needed to encode one complex number, i.e. if two-phase encoding is used on a phase-modulating electronically addressed spatial light modulator, the virtual observer window is 10 mm wide at a wavelength of 500 nm. Several virtual observer windows may be tiled into an enlarged virtual observer window by spatial or time multiplexing. For spatial multiplexing, additional optical components such as beam splitters are needed.
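
The 10 mm figure follows from the standard relation for the width of one diffraction order, w = λ * d / p, where p is the pitch corresponding to one encoded complex value (here two 10 μm pixels, i.e. 20 μm) and d is the distance to the observer plane. The distance is not stated in the paragraph above; the sketch assumes d = 0.4 m, the value used in the practical example further below.

wavelength = 500e-9        # meters
pixel_pitch = 10e-6        # meters
pixels_per_complex = 2     # two-phase encoding
d = 0.4                    # observer distance in meters (assumed, as in the example below)

p = pixels_per_complex * pixel_pitch   # pitch of one encoded complex value
w = wavelength * d / p                 # width of one diffraction order
print(f"virtual observer window width of about {w * 1e3:.0f} mm")   # about 10 mm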

Color holographic reconstruction may be achieved by time multiplexing. The red, green and blue pixels of a color organic light emitting diode display are activated sequentially, in synchronism with re-encoding of the spatial light modulator with holograms calculated for the red, green and blue wavelengths.

The display can include an eye position detector to detect the observer's eye position. The eye position detector is connected to a control unit that controls the pixel activity of the organic light emitting diode display.

The hologram to be encoded on the spatial light modulator is preferably calculated in an external encoding unit, because the calculation requires considerable computational power. The display data are then transmitted to the personal digital assistant or mobile phone for holographic display of the three-dimensional image.

As a practical example, a 2.6 inch XGA liquid crystal electronically addressed spatial light modulator manufactured by Sanyo (RTM) Epson (RTM) Imaging Devices Corporation of Japan may be used. The sub-pixel pitch is 17 μm. If this panel is used to construct a red, green and blue holographic display with amplitude-modulation encoding of the hologram, the observer window is calculated to be 1.3 mm wide at a distance of 0.4 m from the electronically addressed spatial light modulator. For the monochrome case, the observer window is calculated to be 4 mm wide. With the same arrangement but phase modulation with two-phase encoding instead, the observer window is calculated to be 6 mm wide. With the same arrangement but phase modulation with Kinoform encoding, the observer window is calculated to be 12 mm wide.

Other high-resolution electronically addressed spatial light modulators also exist. Seiko (RTM) Epson (RTM) Corporation of Japan has released monochrome electronically addressed spatial light modulators such as the D4:L3D13U panel with a 1.3 inch screen size and a 15 μm pixel pitch. The company has also released a panel of the same type, the D5:L3D09U-61G00, with a 0.9 inch screen size and a pixel pitch of 10 μm, and on 12 December 2006 it announced a panel of the same type, the L3D07U-81G00, with a 0.7 inch screen size and a pixel pitch of 8.5 μm. If the D4:L3D13U 1.3 inch panel is used to construct a monochrome holographic display with Burckhardt amplitude-modulation encoding of the hologram, the virtual observer window at a distance of 0.4 m from the electronically addressed spatial light modulator is calculated to be 5.6 mm wide.
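
The observer window widths quoted for these panels can be reproduced with the same relation w = λ * d / (N * p), where p is the sub-pixel pitch and N is the number of sub-pixels per encoded complex value (three for Burckhardt amplitude encoding, two for two-phase encoding, one for Kinoform encoding, with a further factor of three in the color case, where only every third sub-pixel carries a given color). The wavelengths are not stated in the text; the sketch assumes roughly 500 nm for the 17 μm panel and roughly 630 nm for the 15 μm monochrome panel, and with these assumptions the quoted figures are reproduced.

def observer_window_mm(wavelength, d, subpixel_pitch, pixels_per_complex):
    # Width of one diffraction order (the virtual observer window) in millimeters.
    return wavelength * d / (pixels_per_complex * subpixel_pitch) * 1e3

d = 0.4   # observer distance in meters, as stated in the text

# Sanyo Epson 2.6 inch XGA panel, 17 um sub-pixel pitch (wavelength assumed to be 500 nm)
print(observer_window_mm(500e-9, d, 17e-6, 3 * 3))   # color, Burckhardt: about 1.3 mm
print(observer_window_mm(500e-9, d, 17e-6, 3))       # monochrome, Burckhardt: about 3.9 mm
print(observer_window_mm(500e-9, d, 17e-6, 2))       # monochrome, two-phase: about 5.9 mm
print(observer_window_mm(500e-9, d, 17e-6, 1))       # monochrome, Kinoform: about 11.8 mm

# Seiko Epson D4:L3D13U panel, 15 um pixel pitch (wavelength assumed to be 630 nm)
print(observer_window_mm(630e-9, d, 15e-6, 3))       # monochrome, Burckhardt: about 5.6 mm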

D. Close combination of pairs of electronically addressed spatial light modulators

In a further embodiment, a combination of two electronically addressed spatial light modulators is used to modulate the amplitude and phase of the light sequentially and in a compact manner. A complex number comprising amplitude and phase can thus be encoded in the transmitted light at each pixel.

This embodiment comprises a close combination of two electronically addressed spatial light modulators. The first electronically addressed spatial light modulator modulates the amplitude of the transmitted light and the second modulates its phase; alternatively, the first may modulate the phase and the second the amplitude. Each electronically addressed spatial light modulator may be as described in section C. Apart from the use of two electronically addressed spatial light modulators, the overall configuration may be as described in section C. Any other combination of modulation properties of the two electronically addressed spatial light modulators that is equivalent to independent modulation of amplitude and phase is also possible.

In a first step, the first electronically addressed spatial light modulator is encoded with a pattern for amplitude modulation. In a second step, the second electronically addressed spatial light modulator is encoded with a pattern for phase modulation. The light transmitted by the second electronically addressed spatial light modulator has then been modulated in both amplitude and phase, so that an observer viewing the light emitted by the device comprising the two electronically addressed spatial light modulators can observe a three-dimensional image.

Modulation based on conventional amplitude and phase encoding facilitates the representation of complex values, and the electronically addressed spatial light modulators can have high resolution. This embodiment can therefore be used to generate a hologram such that a three-dimensional image is viewable by an observer.

Figure 13 shows an embodiment. 130 is an illumination device providing illumination of a planar area, the illumination being sufficiently coherent to enable the generation of a three-dimensional image. An example of an illumination device for a large-area video hologram is given in US 2006/250671; one such example is shown in Figure 4. A device such as 130 may take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, emitting light towards a focusing system, which may be compact, such as a lenticular array or a microlens array. Alternatively, the light sources for 130 may be red, green and blue lasers, or red, green and blue light emitting diodes that emit light of sufficient coherence. The red, green and blue light emitting diodes may be organic light emitting diodes (OLEDs). However, non-laser light sources with sufficient spatial coherence (for example light emitting diodes, organic light emitting diodes or cold cathode fluorescent lamps) are preferred. Laser sources have disadvantages such as laser speckle in the holographic reconstruction, relatively high cost, and possible safety issues concerning the eyes of viewers of the holographic display or of workers assembling the holographic display.

Element 130 may comprise one or two prismatic optical films to increase the brightness of the display: such films are known, for example as described in US 5,056,892 and US 5,919,551. Element 130 may comprise a polarizing element or a set of polarizing elements. A linear polarizing sheet is one example. Another example is a reflective polarizer that transmits one linear polarization state and reflects the orthogonal linear polarization state - such sheets are known, for example as described in US 5,828,488. A further example is a reflective polarizer that transmits one circular polarization state and reflects the orthogonal circular polarization state - such sheets are known, for example as described in US 6,181,395. Element 130 may include a focusing system, which may be compact, such as a lenticular array or a microlens array. Element 130 may include other optical components known in the art of backlighting.

Element 130 may have a thickness of about a few centimeters or less. In the preferred implementation, elements 130 to 134 together are less than 3 cm thick, so as to provide a closely combined, compact source. Element 131 may be a color filter array such that pixels of colored light (for example red, green and blue light) are directed towards element 132, although a color filter is not required if colored light sources are used. Element 132 is an electronically addressed spatial light modulator. Element 133 is a further electronically addressed spatial light modulator. Element 134 is an optional beam splitter element. For transmitted light, element 132 modulates the amplitude and element 133 modulates the phase; alternatively, element 133 may modulate the amplitude and element 132 the phase. Placing the electronically addressed spatial light modulators 132 and 133 close together reduces optical loss and pixel crosstalk caused by beam divergence: when the electronically addressed spatial light modulators 132 and 133 are very close, a good approximation to non-overlapping propagation of the colored light beams through the spatial light modulators is achieved. A viewer at point 135, located some distance from the device including the compact hologram generator 136, can view the three-dimensional image by looking towards 136.

Elements 130, 131, 132, 133 and 134 are physically connected, each forming a layer of the structure, so that the whole forms a single, unified object. The physical connection may be direct, or indirect where a thin intermediate layer or film lies between adjacent layers. The physical connection may be limited to small areas that ensure correct mutual alignment, or may extend over larger areas, even the entire surface of a layer. The physical connection may be achieved by bonding the layers together, for example using an optically transmissive adhesive, to form the compact hologram generator 136, or by any other means (see the section outlining the manufacturing process).

Where the electronically addressed spatial light modulator performs amplitude modulation, in a typical arrangement the incident read beam is linearly polarized by passing it through a linear polarizer. The amplitude modulation is controlled by the rotation of the liquid crystal in the applied electric field, which affects the polarization state of the light. In such a device, light leaving the electronically addressed spatial light modulator passes through a further linear polarizer, so that the change in polarization state imparted within the electronically addressed spatial light modulator is converted into a reduction in intensity.

Where the electronically addressed spatial light modulators perform phase modulation, in a typical arrangement the incident read beam, unless it is already in a defined linear polarization state, is linearly polarized by passing it through a linear polarizer. The phase modulation is controlled by the applied electric field, which affects the phase of the light. In one example of phase modulation a nematic liquid crystal is used, in which the direction of the optical axis is spatially fixed but the birefringence is a function of the applied voltage. In another example of phase modulation a ferroelectric liquid crystal is used, in which the birefringence is fixed but the direction of the optical axis is controlled by the applied voltage. In either implementation of phase modulation, the output beam has a phase difference with respect to the input beam that is a function of the applied voltage. One example of a liquid crystal element that can perform phase modulation is the Freedericksz cell arrangement, in which an anti-parallel aligned nematic liquid crystal with positive dielectric anisotropy is used, as described in US 5,973,817.

For a compact holographic display, the two electronically addressed spatial light modulators are combined with a small or minimal separation. A preferred embodiment is for the two spatial light modulators to have the same number of pixels. Because the two electronically addressed spatial light modulators are not at exactly the same distance from the observer, their pixel pitches may need to differ slightly (while remaining approximately equal) to compensate for the difference in distance to the observer. Light that has passed through a pixel of the first spatial light modulator passes through the corresponding pixel of the second spatial light modulator. The light is therefore modulated by both spatial light modulators, and complex-valued modulation of amplitude and phase can be achieved independently. As an example, the first spatial light modulator performs amplitude modulation and the second performs phase modulation. Similarly, any other combination of modulation properties of the two spatial light modulators that is equivalent to independent modulation of amplitude and phase is possible.

It must be noted that light passing through a pixel of the first spatial light modulator should pass only through the corresponding pixel of the second spatial light modulator. Crosstalk occurs if light emitted from a pixel of the first spatial light modulator passes through non-corresponding, adjacent pixels of the second spatial light modulator. Such crosstalk can cause a loss of image quality. Four possible ways of minimizing crosstalk between pixels are given here. As will be apparent to those skilled in the art, these methods are equally applicable to the embodiment of section B.

(1) The first and simplest approach is to connect or bond the two spatial light modulators directly to one another after aligning their pixels. Diffraction at the pixels of the first spatial light modulator causes the light to spread as it propagates. The separation between the spatial light modulators must therefore be small enough that crosstalk into adjacent pixels of the second spatial light modulator remains acceptable. As an example, the separation of two electronically addressed spatial light modulators with a pixel pitch of 10 μm must be of the order of 10-100 μm or less. This is almost impossible to achieve with conventionally manufactured spatial light modulators, because the thickness of the cover glass is of the order of 1 mm. A "sandwich" fabrication approach that achieves a thin separation layer between the spatial light modulators is therefore preferred. The fabrication methods described in the section outlining the manufacturing process may be applied to fabricate a device comprising two electronically addressed spatial light modulators with a small or minimal separation.

Figure 14 shows Fresnel diffraction data calculated for diffraction at a slit 10 μm wide. The calculation is a two-dimensional model in which the distance from the slit is varied; the vertical axis is the distance from the slit (z) and the horizontal axis is the transverse position (x). The uniformly illuminated slit extends from -5 μm to +5 μm on the x axis, at z equal to zero. A light-transmitting medium with a refractive index of 1.5 was assumed, a typical value for the media used in compact devices. Red light with a vacuum wavelength of 633 nm was chosen; the green and blue wavelengths are shorter than the red, so the calculation for red light represents the strongest diffraction effects among the three colors red, green and blue. The calculations may be performed using the MathCad (RTM) software product from Parametric Technology (RTM) Corp., Needham, MA, USA. Figure 15 shows the fraction of the intensity that remains within the 10 μm wide region centered on the slit, as a function of distance from the slit. At a distance of 20 μm from the slit, Figure 15 shows that more than 90% of the intensity remains within the 10 μm width of the slit. Therefore, in this two-dimensional model, less than 5% of the pixel intensity falls on each of the two adjacent pixels. This result was calculated for a pixel boundary width of zero; the actual boundary width between pixels is greater than zero, so crosstalk in a real system will be lower than calculated here. In Figure 14, close to the slit, for example 50 μm from it, the Fresnel diffraction pattern still approximates the top-hat intensity function of the slit. Close to the slit there are therefore no broad diffraction features; broad diffraction features are characteristic of the far-field diffraction pattern of a top-hat function, which is the well-known sinc-squared function. Broad diffraction features can be seen in Figure 14 at a distance of 300 μm from the slit, for example. This indicates that diffraction effects can be controlled by placing the two electronically addressed spatial light modulators in close proximity: the advantage of placing them very close together is that the diffraction pattern changes from the far-field form to a more favorable form in which the light remains concentrated close to the axis perpendicular to the slit. This advantage runs contrary to the expectation of conventional holography, where the passage of light through the small apertures of a spatial light modulator is assumed to cause strong, broad and unavoidable diffraction effects. The prior art therefore provides no motivation to place two spatial light modulators close together, and would expect such an arrangement to suffer from severe and unavoidable pixel crosstalk caused by diffraction.

Figure 16 shows a contour plot of the intensity distribution as a function of distance from the slit. The contours are plotted on a logarithmic scale rather than a linear scale; ten contour lines are used, together covering a factor of 100 in intensity. For a slit width of 10 μm, the intensity distribution remains largely confined within its boundaries up to a distance of about 50 μm from the slit.
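
The kind of near-field slit calculation described above in connection with Figures 14 to 16 can be reproduced with a standard angular-spectrum propagation. The sketch below uses the stated parameters (a uniformly illuminated 10 μm slit, 633 nm vacuum wavelength, refractive index 1.5) and reports the fraction of the intensity remaining within the 10 μm slit width at a distance of 20 μm. The sampling parameters are assumptions, and this is an independent check in Python rather than the MathCad calculation referred to in the text, which reports that more than 90% of the intensity remains within the slit width at that distance.

import numpy as np

n_samples, dx = 8192, 0.02e-6              # grid size and sampling interval (assumed)
x = (np.arange(n_samples) - n_samples / 2) * dx
wavelength = 633e-9 / 1.5                  # wavelength inside the n = 1.5 medium
slit_width = 10e-6

field0 = (np.abs(x) <= slit_width / 2).astype(complex)   # uniformly illuminated slit

def propagate(field, z):
    # Angular-spectrum propagation of a one-dimensional scalar field over distance z.
    fx = np.fft.fftfreq(field.size, dx)
    propagating = fx**2 <= 1.0 / wavelength**2
    kz = 2 * np.pi * np.sqrt(np.where(propagating, 1.0 / wavelength**2 - fx**2, 0.0))
    transfer = np.where(propagating, np.exp(1j * kz * z), 0.0)   # discard evanescent waves
    return np.fft.ifft(np.fft.fft(field) * transfer)

z = 20e-6
intensity = np.abs(propagate(field0, z))**2
inside = np.abs(x) <= slit_width / 2
print("fraction of the intensity within the slit width at z = 20 um: %.2f"
      % (intensity[inside].sum() / intensity.sum()))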

In a further embodiment, the pixel aperture area of the first electronically addressed spatial light modulator can be reduced to mitigate crosstalk problems in the second electronically addressed spatial light modulator.

(2) The second approach is to use a lens array between the two spatial light modulators, as shown in Figure 17. Preferably the number of lenses is equal to the number of pixels in each spatial light modulator. The pitches of the two spatial light modulators and of the lens array may differ slightly to compensate for the difference in distance to the observer. Each lens images a pixel of the first spatial light modulator onto the corresponding pixel of the second spatial light modulator, as indicated by the bundle of rays 171 in Figure 17. Light may also pass through adjacent lenses and cause crosstalk, as indicated by the bundle of rays 172. This can be neglected if its intensity is sufficiently low, or if its direction is sufficiently different that it cannot reach the virtual observer window.

The numerical aperture (NA) of each lens must be large enough to image a pixel with sufficient resolution. As an example, a resolution of 5 μm requires a numerical aperture of about 0.2. This also means that, for a lens aperture of 10 μm, the maximum distance between the lens array and each spatial light modulator is about 25 μm.

It is also possible to assign several pixels of each spatial light modulator to one lens of the lens array. As an example, a group of four pixels of the first spatial light modulator may be imaged by one lens of the lens array onto a group of four pixels of the second spatial light modulator. The number of lenses in such a lens array is then one quarter of the number of pixels in each spatial light modulator. This permits lenses with higher numerical apertures and thus imaging of the pixels at higher resolution.

(3) The third method is to reduce the pixel aperture of the first electronically addressed spatial light modulator as far as possible. From the viewpoint of diffraction, the area of the second spatial light modulator illuminated by one pixel of the first spatial light modulator is determined by the pixel aperture width D of the first electronically addressed spatial light modulator and by the diffraction angle, as shown in Figure 18. In Figure 18, d is the distance between the two electronically addressed spatial light modulators, and w is the distance between the two first-order diffraction minima that occur on either side of the zeroth-order maximum. This assumes Fraunhofer diffraction, or a reasonable approximation to Fraunhofer diffraction.

Reducing the aperture width D reduces, on the one hand, the extent of the directly projected central portion of the illuminated area, as indicated by the dashed lines in Figure 18. On the other hand, the diffraction angle increases, since in Fraunhofer diffraction the diffraction angle is proportional to 1/D. This increases the width w of the illuminated area on the second electronically addressed spatial light modulator; the full width of the illuminated area is w. In the Fraunhofer diffraction regime, for a given separation d the aperture D can be chosen to minimize w, using the equation w = D + 2dλ/D, which is derived from the distance between the two first-order minima of the Fraunhofer diffraction pattern.

For example, if λ is 0.5 μm and d is 100 μm, then w takes its minimum value of 20 μm at D = 10 μm. In this case, however, the Fraunhofer treatment may not be a good approximation. This example nevertheless illustrates the principle of using the separation between the electronically addressed spatial light modulators to control the diffraction, in the Fraunhofer diffraction regime.
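
The minimum quoted above follows directly from the equation for w: setting dw/dD = 1 − 2dλ/D² = 0 gives D = √(2dλ) and a minimum width w = 2√(2dλ). With λ = 0.5 μm and d = 100 μm this gives D = √(100 μm²) = 10 μm and w = 20 μm, as stated.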

(4) The fourth method uses a fibre optic panel to image the pixels of the first spatial light modulator onto the pixels of the second spatial light modulator. The fibre optic panel consists of a two-dimensional arrangement of parallel fibres. The length of the fibres, and therefore the thickness of the panel, is typically a few centimetres, and the diagonal of the panel surface can be as large as several inches. As an example, the fibre pitch can be 6 μm; Edmund Optics Inc. of Barrington, New Jersey, USA sells fibre optic panels with such a fibre pitch. Each fibre guides light from one of its ends to the other. The image at one end of the panel is therefore transmitted to the other end with high resolution and without any focusing components. Such a panel can serve as the separation layer between the two spatial light modulators, as shown in Figure 19. Multimode fibre is preferred over single-mode fibre, because multimode fibre has better coupling efficiency than single-mode fibre. The best coupling efficiency is obtained when the refractive index of the fibre core is matched to the refractive index of the liquid crystal, because this minimizes the Fresnel back-reflection loss.

There is no additional glass cover between the two spatial light modulators; the polarizers, electrodes and alignment layers are attached directly to the fibre optic panel. Each of these layers is very thin, i.e. of the order of 1-10 μm. The liquid crystal (LC) layers LC1 and LC2 are therefore located close to the panel. Light passing through a pixel of the first spatial light modulator is guided to the corresponding pixel of the second spatial light modulator. This minimizes crosstalk from neighbouring pixels. The panel transmits the light distribution at the output of the first spatial light modulator to the input of the second spatial light modulator. On average, there should be at least one fibre per pixel. If, on average, there is less than one fibre per pixel, the spatial light modulator loses resolution, which reduces the image quality for the application in which it is used in a holographic display.
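
As a rough check of the "at least one fibre per pixel" requirement: with the 6 μm fibre pitch quoted above, a pixel pitch of p μm corresponds on average to about (p/6)² fibres per pixel, for example roughly 2.8 fibres per pixel for a 10 μm pixel pitch and about 8 for a 17 μm pitch; these pixel pitches are illustrative examples rather than values prescribed here.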

In Figure 19, the first spatial light modulator modulates the amplitude and the second spatial light modulator modulates the phase. Other combinations of the modulation characteristics of the two electronically addressed spatial light modulators that achieve full complex modulation are also possible.

Figure 10 shows an example of the tight alignment of the encoded amplitude and phase information in the hologram.

104 is an illumination device providing illumination of a planar area, where the illumination is sufficiently homogeneous to enable the generation of a three-dimensional image. An example of an illumination device for large-area video holograms is given in US 2006/250671. A device such as 104 may take the form of an array of white light sources, such as cold cathode fluorescent lamps or white light emitting diodes, whose light is incident on a focusing system that may be compact, such as a lenticular array or microlens array 100. Alternatively, the light sources for 104 may consist of red, green and blue lasers, or red, green and blue light emitting diodes that emit sufficiently coherent light. However, a non-laser light source with sufficient spatial coherence (for example a light emitting diode, an organic light emitting diode or a cold cathode fluorescent lamp) is preferable. This avoids the disadvantages of laser sources, such as laser speckle in the holographic reconstruction, their relatively high cost, and possible safety issues concerning the eyes of viewers of the holographic display or of workers assembling the holographic display.

Element 104 may comprise one or two prismatic optical films to increase the brightness of the display; such films are known, for example from US 5,056,892 and US 5,919,551. Element 104 may comprise a polarization element or a set of polarization elements. A linear polarizer sheet is one example. Another example is a reflective polarizer that transmits one linear polarization state and reflects the orthogonal linear polarization state; such sheets are known, for example from US 5,828,488. A further example is a reflective polarizer that transmits one circular polarization state and reflects the orthogonal circular polarization state; such sheets are known, for example from US 6,181,395. Element 104 may include other optical components known in the art of backlighting.

The thicknesses of elements 104 and 100-103 may all be of the order of a few centimetres or less. Element 101 may comprise a colour filter array, so that pixels of coloured light (e.g. red, green and blue light) are directed towards element 102, although a colour filter is not required if coloured light sources are used. Element 102 is an electronically addressed spatial light modulator that encodes phase information, such as a Freedericksz cell arrangement. Element 103 is an electronically addressed spatial light modulator that encodes amplitude information, as in a commercially available liquid crystal display device. Each element of element 102, designated 107 here, is aligned with the corresponding element of element 103, indicated at 108. However, although the elements of 102 and 103 have the same lateral pitch, the element size in element 102 may be less than or equal to that of the elements in element 103, because light leaving element 107 will typically experience some diffraction before it reaches element 108 of element 103. The encoding order of amplitude and phase may also be reversed, as shown in the Figure.

A viewer located at a point 106, some distance from the device that includes the compact hologram generator 105, can view the three-dimensional image by looking in the direction of 105. Elements 104, 100, 101, 102 and 103 are physically joined as described previously, so as to form the compact hologram generator 105.

E. Large-magnification three-dimensional image display device with holographic reconstruction of the target, the components of which comprise a compact combination of one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators

Figure 24 shows a large-magnification three-dimensional image display device with holographic reconstruction of the target, comprising a compact combination of one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators. The components of this device include a compact combination of a spatial light modulator and a sufficiently coherent compact light source (such as those described in Sections A, B, C and D) which, under appropriate illumination conditions, produces a three-dimensional image visible in the virtual observer window (labelled OW in Figure 24); the device may be integrated, for example, in a personal digital assistant or a mobile phone. As shown in Figure 24, the compact combination of the spatial light modulator and the coherent compact light source comprises a light source array, a spatial light modulator and a lens array. The spatial light modulator of Figure 24 comprises one or two pairs of an organic light emitting diode display combined with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators, or a combination of an organic light emitting diode display with an optically addressed spatial light modulator together with an electronically addressed spatial light modulator.

In a simple example, the array of light sources can be formed in the following way. A single source, such as a monochromatic light emitting diode, is placed close to an array of apertures to illuminate the apertures. If the apertures are slits in a one-dimensional array, the light transmitted by the slits forms a one-dimensional array of light sources. If the apertures are circles in a two-dimensional array, the illuminated set of circles forms a two-dimensional array of light sources. A typical aperture width is approximately 20 μm. Such an array of light sources is suitable for generating an observer window for one eye.

In Figure 24, the light source array is arranged at a distance u from the lens array. The array of light sources may be the light sources of element 10 of Figure 1, and may optionally include element 11 of Figure 1. Specifically, each light source in the array is located at a distance u from the corresponding lens of the lens array. In a preferred embodiment, the light source array is parallel to the plane of the lens array. The spatial light modulator can be located on either side of the lens array. The distance between the virtual observer window and the lens array is v. Each lens in the lens array is a converging lens, with focal length f given by f = 1 / [1/u + 1/v]. In a preferred embodiment, the value of v is in the range 300 mm to 600 mm; in a more preferred embodiment, v is approximately 400 mm. In a preferred embodiment, the value of u is in the range 10 mm to 30 mm; in a more preferred embodiment, u is approximately 20 mm. The magnification factor M is given by v/u. M is the factor by which the content encoded on the spatial light modulator is magnified at the virtual observer window. In a preferred embodiment, the value of M is in the range 10 to 60; in a more preferred embodiment, M is about 20. In order to achieve such a magnification factor with good holographic image quality, the light source array and the lens array must be accurately aligned. To maintain this precise alignment, and to keep the distance between the light source array and the lens array constant, the device components need strong mechanical stability throughout the lifetime of the component.

The virtual observer window can be tracked or left untracked. If the virtual observer window is tracked, particular light sources in the array are activated depending on the desired position of the virtual observer window. The activated light source illuminates the spatial light modulator and is imaged by the lens array into the observer plane. For each lens in the lens array, at least one light source in the array is activated. Tracking is quasi-continuous: if u is 20 mm and v is 400 mm, and the pixel pitch of the light source array is 20 μm, the virtual observer window can be tracked in lateral increments of 400 μm. With u = 20 mm and v = 400 mm, f is approximately 19 mm.

The light sources in the light source array may have only partial spatial coherence. Partial coherence leads to a blurred reconstruction of the target point. If u is 20 mm and v is 400 mm, and the light source width is 20 μm, the reconstruction of a target point 100 mm from the display will have a lateral blur of 100 μm. This is adequate for the resolution of the human visual system.
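
For reference, the figures quoted in the last few paragraphs (f of about 19 mm, magnification 20, 400 μm tracking increments and 100 μm blur) follow from the formulas stated there; the short sketch below simply evaluates them. Variable names are illustrative and not part of the patent.

    # Example values from the text: u = 20 mm, v = 400 mm, 20 um source pitch and width.
    u = 20e-3          # light source array to lens array distance [m]
    v = 400e-3         # lens array to observer plane distance [m]
    src_pitch = 20e-6  # pitch of the light sources [m]
    src_width = 20e-6  # width of one light source [m]

    f = 1.0 / (1.0 / u + 1.0 / v)   # focal length, from f = 1/[1/u + 1/v]
    M = v / u                       # magnification factor
    tracking_step = src_pitch * M   # lateral tracking increment of the virtual observer window
    blur = src_width * 0.100 / u    # similar-triangles estimate of the blur of an object point 100 mm from the display

    print(f"f = {f * 1e3:.1f} mm, M = {M:.0f}")
    print(f"tracking increment = {tracking_step * 1e6:.0f} um")
    print(f"blur at 100 mm = {blur * 1e6:.0f} um")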

No mutual coherence is needed between light passing through different lenses of the lens array. The coherence requirement is limited to each single lens of the lens array. The resolution of a reconstructed target point is therefore determined by the pitch of the lens array. A typical lens pitch will be of the order of 1 mm, to ensure adequate resolution for the human visual system.

The virtual observer window is limited to one diffraction order of the Fourier spectrum of the information encoded in the spatial light modulator. If the pixel pitch of the spatial light modulator is 10 μm and two pixels are needed to encode one complex number, i.e. if two-phase encoding is used on a phase-modulating electronically addressed spatial light modulator, then at a wavelength of 500 nm the virtual observer window will be 10 mm wide. Several virtual observer windows can be tiled together into an enlarged virtual observer window by spatial or temporal multiplexing. In the case of spatial multiplexing, additional optical components such as beam splitters are required. Some multiplexing methods are described in Section C, and those methods may also be applied to this embodiment.

Colour holographic reconstruction can be achieved by time multiplexing: the red, green and blue pixels of a colour organic light emitting diode display are activated sequentially, with synchronous re-encoding of the spatial light modulator with the holograms for the red, green and blue optical wavelengths.

The display formed from these device components may include an eye position detector for detecting the position of the viewer's eyes. The eye position detector is connected to a control unit that controls the activation of the light sources in the light source array.

The calculation of the hologram encoded on the spatial light modulator is preferably performed by an external encoding unit, because it requires substantial computational power. The display data are then sent to the personal digital assistant or mobile phone for display of the holographic three-dimensional image.

As a practical example, a 2.6 inch screen-size XGA liquid crystal display electronically addressed spatial light modulator manufactured by Sanyo (RTM) Epson (RTM) Imaging Devices Corporation of Japan may be used. The sub-pixel pitch is 17 μm. If this is used to construct a red, green and blue holographic display with amplitude-modulation encoding of the hologram, the virtual observer window is calculated to be 1.3 mm wide at a distance of 0.4 m from the electronically addressed spatial light modulator. For the monochrome case, the virtual observer window is calculated to be 4 mm wide. With the same settings but phase modulation using two-phase encoding, the virtual observer window is calculated to be 6 mm wide; with phase modulation using Kinoform encoding, it is calculated to be 12 mm wide.

Other high-resolution electronically addressed spatial light modulators are also available. Seiko (RTM) Epson (RTM) Corporation of Japan has published a monochrome electronically addressed spatial light modulator, the D4: L3D13U panel, with a 1.3 inch screen size and a pixel pitch of 15 μm. The company has also published a panel of the same type, the D5: L3D09U-61G00, with a 0.9 inch screen size and a pixel pitch of 10 μm, and on 12 December 2006 it announced the panel L3D07U-81G00 of the same type, with a 0.7 inch screen size and a pixel pitch of 8.5 μm. If the D4: L3D13U 1.3 inch panel is used to construct a monochrome holographic display using Burckhardt amplitude-modulation encoding of the hologram, the virtual observer window at a distance of 0.4 m from the electronically addressed spatial light modulator is calculated to be 5.6 mm wide.
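
The virtual observer window widths quoted in the two preceding paragraphs are consistent with taking the window as one diffraction period, w = λ·d / (N·p), where N is the number of pixels used to encode one complex value (3 for Burckhardt amplitude encoding, 2 for two-phase encoding, 1 for Kinoform encoding) and p is the pitch of the pixels available to one colour. The sketch below assumes wavelengths of roughly 500 nm for the colour-panel examples and 633 nm for the monochrome D4 example; these wavelengths are assumptions, not stated explicitly in the text.

    # Virtual observer window width as one diffraction period: w = wavelength * d / (N * p).
    def vow_width(wavelength, distance, pitch, pixels_per_complex):
        return wavelength * distance / (pixels_per_complex * pitch)

    d = 0.4  # observer distance [m]

    print(vow_width(500e-9, d, 3 * 17e-6, 3))  # colour, Burckhardt      -> ~1.3 mm
    print(vow_width(500e-9, d, 17e-6, 3))      # monochrome, Burckhardt  -> ~4 mm
    print(vow_width(500e-9, d, 17e-6, 2))      # two-phase encoding      -> ~6 mm
    print(vow_width(500e-9, d, 17e-6, 1))      # Kinoform encoding       -> ~12 mm
    print(vow_width(633e-9, d, 15e-6, 3))      # D4 panel, Burckhardt    -> ~5.6 mm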

F. Three-dimensional image display device with holographic reconstruction of the target, comprising a compact combination of one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators

A compact combination of one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator, or of one or two electronically addressed spatial light modulators, is preferred for handheld three-dimensional display devices, or for larger three-dimensional display devices, because such combinations are very compact. Such a combination can be integrated, for example, into a mobile phone, a satellite navigation device, a car display, a computer game device, a personal digital assistant (PDA), a notebook computer display, a desktop computer monitor or a thin television display. Such three-dimensional displays are more particularly intended for a single user. The user is generally positioned perpendicular to the light emitting surface of the device, and at the distance from the device that gives the best viewing, for example a distance of approximately 500 mm. It is well known that users of handheld devices tend to adjust the orientation of the device in their hand to obtain the best viewing conditions, as described in WO 01/96941. Therefore, in such a device, neither eye tracking of the user nor tracking optics, such as scanning mirrors, are required. However, eye tracking may be applied in other devices, for which the additional demands on the device and its power supply are not an excessive burden.

A satellite navigation device that includes a three-dimensional image display device with holographic reconstruction of the target, comprising one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator or one or two electronically addressed spatial light modulators, has the following advantages. The driver can see route information, such as the manoeuvre to be performed at the next junction, as a three-dimensional image; because three-dimensional image information corresponds more closely to the driver's perception while driving, it can be assimilated better than two-dimensional image information. Other information on the display, such as menus, can also be displayed in three dimensions. Some or all of the information on the display can be displayed in three dimensions.

A vehicle that includes a three-dimensional image display device with holographic reconstruction of the target, comprising one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator or one or two electronically addressed spatial light modulators, has the following advantages. The device may display three-dimensional information directly, for example when reversing, or when attempting to pass through a gap only slightly wider than, or narrower than, the vehicle, by displaying a three-dimensional image of the proximity of the car bumper to an adjacent object such as a wall. Where a passage is narrower than the vehicle, the three-dimensional image display device helps the driver to appreciate that the vehicle cannot pass through it. The three-dimensional image can be created using information provided by sensors mounted on the vehicle. Other vehicle information can be displayed in three dimensions on the display, such as speed, temperature, engine revolutions per minute, or other information displayed in the vehicle. Satellite navigation information can be displayed in three dimensions on the display. Some or all of the information on the display can be displayed in three dimensions.

The size of the virtual observer window is limited by the periodicity interval of the diffraction pattern in the Fourier plane. If the pixel pitch of the organic light emitting diode display or the electronically addressed spatial light modulator is close to 10 μm, then for visible light with a wavelength of 500 nm, at a distance of 500 mm, the virtual observer window (VOW) is approximately 10 mm to 25 mm wide, depending on the hologram encoding used on the spatial light modulator. This is wide enough for one eye. A second virtual observer window for the other eye can be created by spatial or temporal multiplexing of the spatial light modulator content. Without tracking, in order to see the best three-dimensional image the observer must rotate or move the device and/or himself so that his eyes are within the virtual observer windows and at the optimum distance from the device.

Tiling several virtual observer windows together makes adjustment of the position and orientation of the display device less critical. Two or three virtual observer windows can be placed next to each other in the x and y directions, so that the virtual observer windows cover a larger area. The tiling can be achieved by spatial or temporal multiplexing, or by a combination of the two. In temporal multiplexing, light is directed to the virtual observer windows sequentially in time; if the virtual observer windows have different content, the spatial light modulator must be re-encoded. In spatial multiplexing, the content for the different virtual observer windows is encoded in the spatial light modulator at the same time, but in different regions of the spatial light modulator; a beam splitter directs the light from the different regions of the spatial light modulator into the different virtual observer windows. A combination of spatial and temporal multiplexing can also be used.

The size of a handheld three-dimensional display device typically used in mobile phones or personal digital assistants ranges from one inch to several inches. A holographic display can have a screen size as small as one centimetre.

The three-dimensional image display device can switch to display a two-dimensional image, for example, by displaying the same image to each of the viewer's eyes.

Figure 3 shows an example implementation of a three-dimensional image display device comprising a compact combination of one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators. The device in Figure 3 is a mobile phone 30 with which a user can make a call while a three-dimensional image of the other party, who is equipped with a similar device, is displayed in the screen area 31. The mobile phone has an antenna 32 for mobile communication; in other embodiments the antenna may be located inside the main body of the mobile phone 30. The mobile phone 30 is equipped with two cameras 33 and 34 for recording images for the user's left and right eyes; the left and right images contain stereoscopic image data. The mobile phone 30 has a keypad 35 with digit keys and "*" and "#" keys, as well as other function buttons 36, for example for navigating the on-screen menus or for switching the phone on and off. The markings displayed on the buttons, such as "ON", "OFF" or "2", help to avoid the device being held the wrong way up, which could otherwise cause confusion for the two parties of a three-dimensional video telephone conversation when viewing each other. In use, the viewer's eyes are preferably coplanar with the two cameras 33 and 34, and the user's face is positioned approximately perpendicular to the screen area 31. This ensures that the two cameras 33 and 34 record the parallax in the plane containing the viewer's eyes. The optimum viewing position of the viewer's head relative to the display is predetermined, so that the two cameras 33 and 34 obtain the best image quality of the viewer's head at this position. The same applies to the other party in a three-dimensional video telephone call, so that both parties can hold a two-way three-dimensional video telephone conversation with the best image quality. To ensure that each viewer faces the cameras 33 and 34 accurately, it may be desirable to ensure that the virtual observer window for each eye is not much larger than the eye, as this limits the errors in position and orientation of the viewer's eyes relative to the cameras. By placing the device in a mount directed towards the object to be photographed, a three-dimensional photograph of the object can be taken. Alternatively, the user can be guided by a small icon on the device screen to achieve the optimum orientation of the device. The device may also have an eye tracking function. The device format and uses described here apply to devices that produce a three-dimensional image holographically, autostereoscopically, or by any other method.

During a two-way three-dimensional video call, cameras 33 and 34 record images for the user's right and left eyes, respectively. The data obtained from these images are used on the corresponding handheld device in the three-dimensional video call to create a three-dimensional image. If the three-dimensional image is produced on an autostereoscopic display, the views from cameras 33 and 34 can be used directly as the two eye images in the autostereoscopic display. If the three-dimensional image is generated holographically, the data contained in the views from cameras 33 and 34 must be processed, for example to compute a hologram, such as the appropriate encoding of the data on one or two spatial light modulators. When the three-dimensional image is produced holographically, the three-dimensional display is a holographic display. Compared with autostereoscopic displays, holographic displays provide full depth information, i.e. accommodation (eye focus) and parallax. A holographic display provides a holographic reconstruction of the target, i.e. a holographic reconstruction of all target points at the correct depth.

Applications of the handheld three-dimensional display described here include holding a two-way three-dimensional videophone call. Another application involves the other party to a call showing a three-dimensional display of an object or scene, for example viewing a product before purchase, or inspecting damage. Another application is the confirmation of personal identity, which can be assisted by a three-dimensional display; a three-dimensional display improves the ability to distinguish individuals who look very similar, such as twins or people in disguise. Another application involves viewing images of individuals with a view to further contact, such as in dating services, where a three-dimensional image can help with the decision. A further application is viewing adult content on a three-dimensional display, where viewers may prefer a three-dimensional display to a two-dimensional one.

Different individuals have different distances between their eyes. In one embodiment, a three-dimensional display device with holographic reconstruction of the target has a menu option that allows the user of the display to vary the distance between the virtual observer windows projected for the left eye and the right eye. Having selected the menu option, the user presses buttons on the device to increase or decrease the separation between the virtual observer windows. This is done while viewing the display and attempting to view the three-dimensional image, so that the virtual observer window separation giving the best achievable three-dimensional image can be selected. The selected distance can then be stored among the user's preferences. If several individuals use the device, multiple sets of user preferences can be stored in the device. Such a menu option can be implemented even if the device is able to track the positions of the viewer's eyes separately, because the user may select the precise distance between the virtual observer windows better than the software can. Once such a selection has been made, tracking will also be faster, because a less precise determination of the positions of the observer's eyes is needed once the distance between the eyes is a fixed parameter. The ability to select a better distance between the two virtual observer windows also provides an advantage over autostereoscopic display systems, in which the distance between the left-eye and right-eye images tends to be fixed by the device hardware.

G. Planar projector system comprising a compact combination of one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators

Instead of the emitted light being directed into one or several virtual observer windows, as described in Section F, the light emitted from the device can also be projected onto a screen, a wall or some other surface. A three-dimensional display device in a mobile phone, a personal digital assistant or another device can therefore also be used as a pocket projector.

The quality of the hologram can be improved by using a spatial light modulator to modulate the amplitude and phase of the incident light. Therefore, a complex-valued hologram can be encoded on a spatial light modulator, allowing for better quality of images reconstructed on the screen or wall.

The compact combination, described in the previous sections, of one or two pairs of an organic light emitting diode display with an optically addressed spatial light modulator, or one or two electronically addressed spatial light modulators, can be used as the spatial light modulator in the projector. Since this combination is compact, the projector will also be compact. The projector may even be a mobile phone or a personal digital assistant or some other device that can be switched between a "three-dimensional display" mode and a "projector" mode.

Compared with conventional two-dimensional projectors, holographic two-dimensional projectors have the advantages that no projection lens is required and that the projected image is in focus at all distances in the optical far field. Conventional holographic two-dimensional projectors, such as those described in WO 2005/059881, use a single spatial light modulator, so complex modulation cannot be performed. The holographic two-dimensional projector described here can perform complex modulation and therefore has superior image quality.

H. Autostereoscopic or holographic display using a compact combination of one or two infrared organic light emitting diode displays with an optically addressed spatial light modulator

The compact combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator (such as described in Section A) can also be used in autostereoscopic displays (ASDs), especially handheld autostereoscopic displays in mobile phones or personal digital assistants. For a typical viewer, viewing an autostereoscopic display is not as comfortable as viewing a holographic display, although in some cases an autostereoscopic display may be cheaper, or it may be easier to generate or provide the image data, than for a holographic display. An autostereoscopic display provides several viewing zones, each displaying a different perspective view of the three-dimensional scene. If the viewer's eyes are in different viewing zones, the viewer sees a stereoscopic image. The difference between autostereoscopic displays and holographic technology is that an autostereoscopic display provides two planar images, whereas holographic technology provides z-information for every target point of the three-dimensional scene.

Typically, autostereoscopic displays are based on spatial multiplexing of the viewing zones on the display and use beam-splitting elements such as lenticulars, barrier masks or prism masks. Barrier masks are also called "parallax barriers". A disadvantage of autostereoscopic displays is that the resolution of each viewing zone is typically inversely proportional to the number of viewing zones. However, this disadvantage can be compensated by the advantages of the autostereoscopic display described above.

The compact combination of an infrared organic light emitting diode display and an amplitude-modulating optically addressed spatial light modulator (such as described in Section A) can be used to provide a high-resolution amplitude-modulating display. If this compact combination is combined with a beam splitter element, a high-resolution autostereoscopic display can be constructed. The high resolution of the compact combination compensates for the loss of resolution caused by spatial multiplexing.

For autostereoscopic displays, an advantage of using a compact combination of one or more organic light emitting diode arrays with one or more optically addressed spatial light modulators (e.g. as described in Sections A and B) is that the optically addressed spatial light modulator is not patterned. An autostereoscopic display comprising a beam splitter and an organic light emitting diode array may suffer from artefacts caused by the patterning of the organic light emitting diode display, for example Moiré effects between the pattern of the beam splitter and the pattern of the organic light emitting diodes. By comparison, the information on the optically addressed spatial light modulator of the compact combination is continuous; as only the beam splitter is patterned, such periodic artefacts do not occur.

The light source of the autostereoscopic display can be one or more light sources, such as light emitting diodes, lasers, organic light emitting diodes or cold cathode fluorescent lamps. The light source does not need to be homogeneous. If an organic light emitting diode is used as the light source and the autostereoscopic display shows colour images, a colour filter layer with, for example, red, green and blue filters is required between the light source and the compact combination of the light emitting display and the amplitude-modulating optically addressed spatial light modulator.

The compact combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator (such as described in Section A) can also be used in holographic displays, especially handheld displays in mobile phones or personal digital assistants. Such a holographic display is based on spatial multiplexing of the viewing zones on the display and uses beam-splitting elements such as lenticulars, barrier masks or prism masks. Barrier masks are also called "parallax barriers". The compact combination of an infrared organic light emitting diode display and an optically addressed spatial light modulator (such as described in Section A) can be used to make a holographic display with high resolution. If the compact combination of the infrared organic light emitting diode display and the amplitude-modulating optically addressed spatial light modulator is combined with a beam splitter element, a high-resolution holographic display can be constructed. The high resolution of the compact combination compensates for the loss of resolution caused by spatial multiplexing. In another embodiment, two compact combinations, each of an organic light emitting diode array and an optically addressed spatial light modulator, can be used in series and in a compact manner to modulate the amplitude and the phase of the light, as described in Section B. In this way a complex number, consisting of amplitude and phase, can be encoded pixel by pixel in the transmitted light. If the two compact combinations of an infrared organic light emitting diode display and an optically addressed spatial light modulator are combined with a beam splitter element, a high-resolution holographic display can be constructed. The high resolution of the compact combinations compensates for the loss of resolution caused by spatial multiplexing. A holographic display with a beam splitter element can provide several viewing zones, each displaying a different view of the three-dimensional scene. If the viewer's eyes are in different viewing zones, the viewer sees a stereoscopic image.

I. A data processing system required for three-dimensional transmission.

Figure 22 shows the data processing system required for three-dimensional transmission. In Figure 22, one party 220 and another party 221 are engaged in three-dimensional transmission. The camera data for creating an image can be collected using the mobile telephone device 30 shown in Figure 3, or a device with similar functions. The data processing for three-dimensional image display can be performed in the device of the first party 220, which may be the mobile phone 30 or an equivalent device, or in the device of the other party 221, but is preferably performed in an intermediate system 224 located on the transmission network between the two mobile phones. The transmission network comprises a first link 222, the intermediate system 224 and a second link 223. The two links 222 and 223 may be wireless or wired. The intermediate system 224 may include a computer that performs the calculations so that a three-dimensional image, such as a computer-generated hologram or an autostereoscopic image, can be displayed. It is preferable to perform the calculations with a computer on the transmission network between the two mobile phones, since the calculation then does not consume the battery power of a mobile phone but uses mains power instead. A computer located on the transmission network can process the images of a large number of three-dimensional video telephone calls simultaneously, which allows more efficient use of computing resources, for example by reducing the amount of unused computational processing power. If the computing power required in the handset is reduced, the weight of the mobile phone or similar device can be reduced, since it needs less computer circuitry and memory, the computation being performed by the computer on the transmission network. Finally, the software that performs the calculations only needs to be installed on the computer on the transmission network, and does not need to be installed in the mobile phone or similar device. This reduces the memory requirements of the mobile phone and the scope for software piracy, and improves the protection of any trade secrets in the code. While most of the computation required for the three-dimensional image display may be performed by the intermediate system 224, some image calculations may also be performed in the user devices before data transmission. For example, if the two captured images are very similar, and they are transmitted as a first image plus a difference image representing the difference between the two images, the difference image is very amenable to data compression, which facilitates data transfer. Likewise, the three-dimensional image display device may perform some image calculations, such as decompressing compressed image data.

In an example of the system of Figure 22, a first image and a second image forming a stereoscopic image pair are captured and transmitted by the device of user 220 to the intermediate system 224 via link 222. The second transmitted image may be a difference image with respect to the first image of the stereoscopic pair, since the difference image typically requires less data than the full image. If a three-dimensional telephone conversation is in progress, the first image may be the difference between the current image and the image at a previous point in time; similarly, the second image may be the difference between the current image and the image at a previous point in time. From the received data, the intermediate system 224 can then calculate a two-dimensional (2D) image and its corresponding depth map, using a conventional calculation procedure for conversion between two-dimensional and three-dimensional (3D) image formats. For a colour image, the three components of the two-dimensional image in the three primary colours are required, together with their corresponding depth maps. The data for the two-dimensional image and the depth map are then transmitted to the device of user 221 via link 223. The device of user 221 encodes the hologram in its compact three-dimensional display device on the basis of the received two-dimensional image and depth map. To use the transmission bandwidth efficiently, the data transmitted in this system may be subjected to a conventional compression procedure, with the corresponding decompression performed in the receiving device. The most efficient amount of data compression balances the battery power consumed by the mobile devices in compressing and decompressing the data against the bandwidth required when less data compression is used.
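
As an illustration of the difference-image idea described above, the following minimal sketch encodes the second image of a pair as a signed difference from the first and reconstructs it exactly at the receiver. The function names and test data are purely illustrative; a real system would follow this with a conventional compression step, which is not shown.

    import numpy as np

    # Illustrative helpers (names not from the patent): the second view is sent as a
    # signed difference from the reference view and reconstructed exactly at the receiver.
    def encode_pair(reference, second):
        diff = second.astype(np.int16) - reference.astype(np.int16)
        return reference, diff

    def decode_pair(reference, diff):
        return (reference.astype(np.int16) + diff).astype(np.uint8)

    rng = np.random.default_rng(0)
    left = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
    # a second view that differs only slightly from the first
    right = np.clip(left.astype(np.int16) + rng.integers(-3, 4, size=left.shape), 0, 255).astype(np.uint8)

    ref, diff = encode_pair(left, right)
    assert np.array_equal(decode_pair(ref, diff), right)
    print("difference-image round trip OK; diff range:", diff.min(), diff.max())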

The intermediate system 224 may access a library containing a set of known three-dimensional shapes and attempt to find a match to the three-dimensional data it computes, or it may access a library containing a set of known two-dimensional shapes and attempt to find a match to the incoming two-dimensional image data. If a good match to a known shape can be found, this can speed up the calculation, because the two- or three-dimensional image can then be represented as corresponding to the known shape. The three-dimensional shape library could contain the faces or body shapes of a group of sports stars, such as leading tennis or football players, as well as all or part of major sports venues, such as a famous tennis court or a famous football pitch. For example, a three-dimensional image of a human face can be represented as an item to which the intermediate system 224 already has access, plus changes of facial expression, such as a smile or a frown, plus changes in hair length, since the hair may have grown longer or been cut shorter since the data were stored. If a set of persistent differences arises, indicating that the records the intermediate system 224 has access to are significantly out of date compared with the incoming data, for example because the length of a person's hair has changed markedly over a long period, then the data accessed by the intermediate system 224 can be updated by the intermediate system 224. If the intermediate system 224 encounters a two- or three-dimensional image for which no match can be found among the records it has access to, it can add the new shape to its set of records.

J. System for supporting the conversion of 2D image content into 3D image content

One of the difficulties in the widespread adoption of 3D display technology is the fact that very little content is produced in three-dimensional format; most content continues to be produced in two-dimensional format. This is partly because most image recording devices in use today continue to record two-dimensional images, without data that could be used to form three-dimensional images. In addition, viewers currently have few opportunities to request three-dimensional content, or to obtain three-dimensional content generated from two-dimensional content.

What is clearly needed is a system that supports the generation of 3D content from 2D content. Such a system is shown in Figure 23. In Figure 23, even though the viewer 2302 has a three-dimensional display device at home, the television broadcasting company 2300 continues to broadcast two-dimensional television images 2304. In this system, by means of an intermediate system 2301, two-dimensional content can be converted into three-dimensional content 2305. Such a conversion procedure may be paid for by the viewer, or it may be paid for by other parties, such as the advertiser 2303. In Figure 23, when an advertisement of the advertiser 2303 is broadcast by the television company 2300, the advertiser 2303 pays a fee 2306 to the intermediate system 2301, which converts the advertiser's two-dimensional content into three-dimensional content by means of the conversion procedure. The benefit to the advertiser is that its commercial is presented to the viewer 2302 as a three-dimensional television commercial, which attracts more attention than a two-dimensional television commercial. Alternatively, the viewer 2302 may pay a fee to the intermediate system 2301 to convert and receive some or all of the television broadcast in three-dimensional format. The intermediate system ensures that the three-dimensional content is provided in a correct and synchronized format. For example, if a two-dimensional image has a corresponding depth map, the two data sets are provided in a synchronized manner, so that the three-dimensional display device uses each depth map with its corresponding two-dimensional image, and does not use a depth map with a non-corresponding two-dimensional image. The three-dimensional display device can be a holographic display device, an autostereoscopic display device, or any conventional three-dimensional display device. The data provided to the three-dimensional display device should be suitable for the type of three-dimensional display device. Systems similar to the one described above are also applicable to content provided by providers other than television broadcasting companies, such as movie or video tape providers.

In another system, the viewer can provide two-dimensional content to the intermediate system, pay a fee, and receive the provided two-dimensional content back in three-dimensional form. The provided two-dimensional content can be, for example, an MP3 file of a home movie, or other video content, or an image such as a photograph or a picture.

The intermediate system may include a computer that performs the calculations so that the three-dimensional image, such as a computer-generated hologram or an autostereoscopic image, can be displayed. It is preferable to perform the calculations with a computer located on the transmission network between the two-dimensional content provider and the viewer who wishes to view the three-dimensional image content, since this is more efficient than executing such a procedure at the viewer's side. A computer located on the transmission network can perform the image processing for a large number of 2D-to-3D content conversions at the same time, which allows more efficient use of computing resources, for example by reducing the amount of unused computational processing power. If the computing power required in the viewer's device is reduced, the cost of the viewer's three-dimensional display device is reduced, because it needs less computer circuitry and memory, the computation being performed by the computer on the transmission network. Finally, the software that performs the calculations only needs to be installed on the computer on the transmission network, and does not need to be installed in the viewer's three-dimensional display device. This reduces the memory requirements of the viewer's three-dimensional display device and the scope for software piracy, and improves the protection of any trade secrets in the code. While most of the computation required for three-dimensional image display may be performed by the intermediate system, some image calculations may also be performed in the viewer's three-dimensional display device. The three-dimensional image display device may perform some image calculations, such as decompressing compressed image data, or generating the holographic encoding for the spatial light modulator from the two-dimensional image and its corresponding depth map.

In one example, the intermediate system calculates a corresponding depth map for the received two-dimensional image, using a calculation procedure for conversion between two-dimensional and three-dimensional image formats. For a colour image, the three components of the two-dimensional image in the three primary colours are required, together with their corresponding depth maps. The data for the two-dimensional image and the depth map are then transmitted to the viewer's three-dimensional display device. The viewer's three-dimensional display device encodes the hologram on its spatial light modulator on the basis of the received two-dimensional image and depth map. To use the transmission bandwidth efficiently, the data transmitted in this system may be subjected to a conventional compression procedure, with the corresponding decompression performed in the receiving device. The most efficient amount of data compression balances the cost of providing data decompression in the three-dimensional display device against the bandwidth required when less data compression is used.
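
A minimal sketch of the final step at the receiver, turning a two-dimensional image plus depth map into a complex hologram for the spatial light modulator, is given below. It treats each image pixel as an object point at the depth given by the depth map and sums paraxial spherical-wave contributions in the modulator plane; all parameter values, array sizes and names are illustrative assumptions, not the encoding actually prescribed by the patent.

    import numpy as np

    # Illustrative parameters only.
    wavelength = 500e-9
    slm_pitch = 10e-6
    slm_n = 256                      # SLM samples per side (kept small for the sketch)

    def hologram_from_depth(image, depth, obj_pitch=50e-6):
        # image: 2D array of amplitudes in [0, 1]; depth: distance of each pixel from the SLM [m].
        ys, xs = np.indices(image.shape)
        px = (xs - image.shape[1] / 2) * obj_pitch     # object point coordinates
        py = (ys - image.shape[0] / 2) * obj_pitch
        u = (np.arange(slm_n) - slm_n / 2) * slm_pitch # SLM plane coordinates
        U, V = np.meshgrid(u, u)
        k = 2 * np.pi / wavelength
        field = np.zeros((slm_n, slm_n), dtype=complex)
        for amp, x0, y0, z0 in zip(image.ravel(), px.ravel(), py.ravel(), depth.ravel()):
            if amp == 0:
                continue
            r2 = (U - x0) ** 2 + (V - y0) ** 2
            field += amp * np.exp(1j * k * (z0 + r2 / (2 * z0)))  # paraxial spherical wave
        return field                                              # complex hologram for the SLM

    # Toy example: a 4 x 4 image with two bright points on different depth planes.
    img = np.zeros((4, 4)); img[1, 1] = 1.0; img[2, 3] = 0.5
    dep = np.full((4, 4), 0.10); dep[2, 3] = 0.15
    holo = hologram_from_depth(img, dep)
    print(holo.shape, float(np.abs(holo).max()))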

The intermediate system can access data on a set of known three-dimensional shapes and attempt to find a match to the three-dimensional data it computes, or it can access a set of known two-dimensional shapes and attempt to find a match to the incoming two-dimensional image data. If a good match to a known shape can be found, this can speed up the calculation, because the two- or three-dimensional image can then be represented as corresponding to the known shape. The three-dimensional shape library could contain the faces or body shapes of a group of sports stars, such as leading tennis or football players, as well as all or part of major sports venues, such as a famous tennis court or a famous football pitch. For example, a three-dimensional image of a human face can be represented as data to which the intermediate system already has access, plus changes of facial expression, such as a smile or a frown, plus changes in hair length, since the hair may have grown longer or been cut shorter since the data were stored. If a set of persistent differences arises, indicating that the records to which the intermediate system has access are significantly out of date compared with the incoming data, for example because the length of a person's hair has changed markedly over a long period, then the data accessed by the intermediate system can be updated by the intermediate system. If the intermediate system encounters a two- or three-dimensional image for which no match can be found among the records it has access to, it can add the newly calculated three-dimensional shape to its set of records.

K. Space multiplexing and two-dimensional coding of observer windows

This embodiment relates to the spatial multiplexing of virtual observer windows (VOWs) for holographic displays, combined with the use of two-dimensional encoding. In addition, the holographic display can be as described in Sections A, B, C or D, or any conventional holographic display.

It is known that multiple virtual observer windows, such as a virtual observer window for the left eye and a virtual observer window for the right eye, can be generated by spatial or temporal multiplexing. With spatial multiplexing, both virtual observer windows are generated at the same time and are separated by a beam splitter, similarly to an autostereoscopic display, as described in WO 2006/027228. With temporal multiplexing, the virtual observer windows are generated sequentially in time.

However, conventional holographic display systems of this kind have drawbacks. For spatial multiplexing, the illumination system used is spatially incoherent in the horizontal direction and is based on horizontal line light sources and a lenticular array, as shown in Figure 4 of the prior art WO 2006/027228. This has the advantage that known techniques from autostereoscopic displays can be used. Its disadvantage, however, is that holographic reconstruction in the horizontal direction is not possible. Instead, so-called 1D encoding is used, which provides holographic reconstruction and motion parallax only in the vertical direction. The vertical focus is thus in the plane of the reconstructed object, while the horizontal focus is in the plane of the spatial light modulator. This astigmatism reduces the quality of the spatial impression, that is, it reduces the quality of the holographic reconstruction perceived by the viewer. Temporal multiplexing systems also have a disadvantage: they require fast spatial light modulators, which are not yet readily available in all display sizes.

Only two-dimensional encoding provides holographic reconstruction in both the horizontal and vertical directions; two-dimensional encoding therefore does not produce the astigmatism that reduces the quality of the spatial impression, i.e. the quality of the holographic reconstruction perceived by the viewer. The purpose of this embodiment is therefore to achieve spatial multiplexing of the virtual observer windows in combination with two-dimensional encoding.

In this embodiment, illumination with horizontal and vertical local spatial coherence is combined with a beam splitter that splits the light into light for the left-eye virtual observer window and light for the right-eye virtual observer window. Diffraction at the beam splitter must therefore be taken into account. The beam splitter can be a prism array, a second lens array (for example a static array or a variable array, as shown in Figure 20), or a barrier mask.

An example of this embodiment is shown in Figure 25. Figure 25 is a schematic diagram of a holographic display comprising a light source of a two-dimensional light source array, a lens of a two-dimensional lens array, a spatial light modulator and a beam splitter. The beam splitter splits the light leaving the spatial light modulator into two bundles, which illuminate the virtual observer window for the left eye (VOWL) and the virtual observer window for the right eye (VOWR). In this example the number of light sources is one or more, and the number of lenses is the same as the number of light sources.

In this example the beam splitter is positioned after the spatial light modulator; the positions of the beam splitter and the spatial light modulator can also be interchanged. An example of this embodiment is shown in Figure 26, in which a prism array is used as the beam splitter, seen in plan view. The illumination device comprises a two-dimensional array of n light sources (LS1, LS2, ... LSn) and a two-dimensional lens array of n lenses (L1, L2, ... Ln); only two light sources and two lenses are shown in Figure 26. Each light source is imaged to the observer plane by its associated lens. The pitch of the light source array and the pitch of the lens array are chosen so that all the light source images appear together in the observer plane, i.e. the plane containing the two virtual observer windows. In Figure 26 the left-eye virtual observer window (VOWL) and the right-eye virtual observer window (VOWR) are not shown, because they lie outside the figure, to its right. An additional field lens may be added. In order to provide sufficient spatial coherence, the pitch of the lens array is of the order of the typical size of a sub-hologram, i.e. one to several millimetres. The illumination within each lens has horizontal and vertical spatial coherence, because each light source is small or a point source and because a two-dimensional lens array is used. The lens array may be refractive, diffractive or holographic.

In this example, the beam splitter is a one-dimensional array of vertical prisms. Light incident on one facet of a prism is deflected towards the left-eye virtual observer window (towards VOWL), and light incident on the other facet of the prism is deflected towards the right-eye virtual observer window (towards VOWR). Light originating from the same light source and passing through the same lens remains mutually coherent after passing through the beam splitter. Two-dimensional encoding with vertical and horizontal focusing and with vertical and horizontal motion parallax is therefore possible.

The hologram is encoded on the spatial light modulator with two-dimensional encoding. The holograms for the left eye and the right eye are interleaved column by column, meaning that columns of hologram information for the left eye alternate with columns of hologram information for the right eye. Preferably, under each prism there is one column of left-eye hologram information and one column of right-eye hologram information. Alternatively, there may be two or more hologram columns under each prism facet, for example three columns for the left-eye virtual observer window followed by three columns for the right-eye virtual observer window. The beam splitter may have the same pitch as the spatial light modulator, or an integer multiple of it (for example two or three times); alternatively, to allow for perspective shortening, the pitch of the beam splitter may be slightly smaller than the pitch of the spatial light modulator, or slightly smaller than an integer multiple (for example two or three times) of it.

Light emitted from a field carrying left-eye hologram information reconstructs the object for the left eye and illuminates the left-eye virtual observer window (VOWL); light emitted from a field carrying right-eye hologram information reconstructs the object for the right eye and illuminates the right-eye virtual observer window (VOWR). Each eye therefore sees its appropriate reconstruction. If the pitch of the prism array is sufficiently small, the eye cannot resolve the prism structure and the prism structure does not disturb the reconstruction. Each eye sees a reconstruction with full focus, full motion parallax and no astigmatism.

There will be diffraction at the beam splitter, because it is illuminated with coherent light. The beam splitter can be regarded as a diffraction grating that produces multiple diffraction orders. The slanted prism facets act like a blazed grating: for a blazed grating, the maximum intensity is directed into a particular diffraction order. For the prism array, one intensity maximum is directed from one facet of each prism into a diffraction order at the left-eye virtual observer window, and the other intensity maximum is directed from the other facet into a diffraction order at the right-eye virtual observer window. More precisely, the maximum of the enveloping sinc-squared intensity function is shifted to these positions, while the diffraction orders themselves remain at fixed positions. The prism array thus produces one maximum of the sinc-squared intensity envelope at the position of the left-eye virtual observer window and another at the position of the right-eye virtual observer window. The intensity in the other diffraction orders is very small (i.e. the sinc-squared envelope is narrow) and does not cause disturbing crosstalk, because the fill factor of the prism array is large, for example close to 100%.
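
The size of this residual crosstalk can be estimated with elementary Fraunhofer diffraction. The short sketch below is an illustrative calculation only, not part of the original disclosure; it evaluates the sinc-squared envelope of a single slanted facet at the position of the opposite virtual observer window, using parameter values borrowed from the 20-inch example given further below (facet width 207 μm, wavelength 633 nm, observer distance 2 m, window offsets ±32.5 mm).

    import numpy as np

    # Illustrative parameters (assumed, taken from the 20-inch example below):
    wavelength = 633e-9          # m
    facet_width = 207e-6         # m, one prism facet (half the 414 um prism pitch)
    observer_distance = 2.0      # m
    window_offset = 32.5e-3      # m, lateral offset of each VOW from the axis

    sin_blaze = window_offset / observer_distance   # direction the facet is blazed to
    sin_other = -sin_blaze                          # direction of the opposite VOW

    # Fraunhofer envelope of one tilted facet:
    # I(theta) ~ sinc^2( facet_width * (sin(theta) - sin(theta_blaze)) / wavelength )
    def envelope(sin_theta):
        return np.sinc(facet_width * (sin_theta - sin_blaze) / wavelength) ** 2

    print(envelope(sin_blaze))   # 1.0  (maximum, at the intended VOW)
    print(envelope(sin_other))   # ~8e-4, i.e. below 0.1 % residual intensity at the other VOW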

As a further development, in order to provide virtual observer windows for two or more observers, multiple virtual observer windows can be generated by using a more complex prism array, e.g. two types of prisms with the same apex angle but different degrees of asymmetry, arranged adjacent to each other in alternation. A static prism array, however, does not allow individual tracking of the observers.

In another example, more than one light source may be used per lens. The additional light sources of each lens can be used to generate additional virtual observer windows for additional observers. An example with one lens and m light sources for m observers is given in WO 2004/044659 (US 2006/0055994). In this further example, m light sources per lens and twofold spatial multiplexing are used to generate m left virtual observer windows and m right virtual observer windows for m observers. The light sources and lenses are in an m-to-one correspondence, where m is an integer.

An example of this embodiment follows, for a 20-inch screen with the following parameter values: observer distance 2 m, pixel pitch 69 μm vertical and 207 μm horizontal, Burckhardt encoding, wavelength 633 nm. The Burckhardt encoding is applied in the vertical direction with a sub-pixel pitch of 69 μm, giving a virtual observer window height (vertical period) of 6 mm. Neglecting perspective foreshortening, the vertical prism array has a pitch of 414 μm, i.e. there are two spatial light modulator columns under each prism. The horizontal period in the observer plane is therefore 3 mm; this is also the width of the virtual observer window. This width is smaller than the eye pupil, which is typically about 4 mm in diameter. In another, similar example with a spatial light modulator pitch as small as 50 μm, the virtual observer window would have a width of 25 mm.

For a typical adult eye separation of 65 mm, the prisms must deflect the light by ±32.5 mm in the plane containing the virtual observer windows. More precisely, the maximum of the enveloping sinc-squared intensity function must be shifted by ±32.5 mm. For an observer distance of 2 m this corresponds to an angle of ±0.93°. For a prism refractive index n = 1.5, the appropriate prism angle is ±1.86°. The prism angle is defined as the angle between the base and the slanted facet of the prism.

With a horizontal period of 3 mm in the observer plane, the other eye is located at a distance of approximately 21 diffraction orders (i.e. 65 mm divided by 3 mm). Crosstalk between the left-eye and the right-eye virtual observer window caused by higher diffraction orders of the other window is therefore negligible.
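
The figures quoted in this example follow from two elementary relations: the period of the diffraction orders in the observer plane is λD/p (wavelength λ, observer distance D, relevant pitch p), and a thin prism of refractive index n and apex angle α deflects light by approximately (n − 1)α. The following short Python sketch reproduces the numbers above; it is given purely as an illustrative check and is not part of the original disclosure.

    import math

    wavelength = 633e-9       # m
    D = 2.0                   # m, observer distance
    n = 1.5                   # prism refractive index
    eye_sep = 65e-3           # m

    # Virtual observer window sizes = wavelength * D / pitch
    print(wavelength * D / 207e-6 * 1e3)   # ~6.1 mm vertical period (3 x 69 um sub-pixels)
    print(wavelength * D / 414e-6 * 1e3)   # ~3.1 mm horizontal period (414 um prism pitch)

    # Required deflection: half the eye separation at the observer plane
    deflection = math.degrees(math.atan(0.5 * eye_sep / D))
    print(deflection)                      # ~0.93 degrees

    # Thin-prism approximation: deflection ~ (n - 1) * apex angle
    print(deflection / (n - 1))            # ~1.86 degrees apex angle

    # Distance of the other eye, measured in diffraction orders
    print(eye_sep / (wavelength * D / 414e-6))   # ~21 orders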

Tracking can be implemented most simply by light-source tracking, i.e. by adapting the positions of the light sources. If the spatial light modulator is not in the same plane as the prism array, tracking causes a lateral offset between the spatial light modulator pixels and the prisms due to parallax, which may lead to disturbing crosstalk. In the 20-inch example above, the pixels may have a fill factor of 70% in the direction perpendicular to the prism ridges, i.e. an active area of 145 μm with an inactive margin of 31 μm on each side. If the structured side of the prism array faces the spatial light modulator, the separation between the prism array and the spatial light modulator may be approximately 1 mm. The crosstalk-free horizontal tracking range is then ±31 μm / 1 mm * 2 m = ±62 mm. If a small amount of crosstalk is tolerable, the tracking range is larger. This tracking range is not very large, but it is sufficient to allow some tracking, so that the observer has more freedom in positioning his or her eyes.
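
The crosstalk-free tracking range quoted above follows from similar-triangle geometry: shifting a light source moves the virtual observer window across the observer plane, and the corresponding shift of the illumination behind each prism must stay within the inactive pixel margin. The minimal sketch below, illustrative only and using the separation and margin values of the example above, reproduces that estimate.

    # Illustrative check of the tracking-range estimate given above.
    inactive_margin = 31e-6      # m, inactive pixel area on each side
    slm_prism_gap = 1e-3         # m, separation between SLM and prism array
    observer_distance = 2.0      # m

    # The allowed offset at the prism plane scales up to the observer plane
    # by observer_distance / slm_prism_gap (similar triangles).
    tracking_range = inactive_margin / slm_prism_gap * observer_distance
    print(tracking_range * 1e3)  # ~62 mm, i.e. +/- 62 mm crosstalk-free tracking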

Parallax between the spatial light modulator and the prism array can be avoided. This is best done by integrating the prism array directly into the spatial light modulator, for example as a refractive, diffractive or holographic prism array. This would, however, be a customized component of the product. Another option is lateral mechanical movement of the prism array, although this is less preferred, since moving mechanical parts make the device more complex.

Another issue is the fixed separation of the virtual observer windows, which is determined by the prism angle. This can be a problem for observers with non-standard eye separation, or for z-tracking. One solution is to use encapsulated liquid crystal domains, as shown in Fig. 21: an electric field then controls the refractive index and hence the deflection angle. Such a solution can be combined with a prism array, so that a fixed deflection and a continuously variable deflection are provided together. In another solution, the structured side of the prism array is covered with a liquid crystal layer; again, an electric field controls the refractive index and hence the deflection angle. If the virtual observer windows are wide enough to accommodate different eye separations and z-tracking, no variable-deflection element is needed.

A more elaborate solution is a controllable prism array, such as an electrowetting prism array (as shown in Fig. 27) or a liquid-crystal-filled prism array (as shown in Fig. 21). In Fig. 27, the layer containing prism element 159 comprises electrodes 1517, 1518 and a cavity filled with two separate liquids 1519, 1520. Each liquid fills a prism-shaped portion of the cavity; the liquids may, for example, be oil and water. The slope of the interface between the liquids 1519, 1520 is determined by the voltage applied to the electrodes 1517, 1518. If the liquids have different refractive indices, the beam is deflected, the deflection being determined by the voltage applied to the electrodes 1517, 1518. Prism element 159 therefore acts as a controllable beam steering element. Providing tracking of the virtual observer windows to the observer's eyes is an important feature of the applicant's approach to electro-holography. The applicant's patent applications DE 102007024237.0 and DE 102007024236.2 describe the tracking of a virtual observer window to the observer's eyes using prism elements.
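
The deflection produced by such a liquid prism cell can be estimated by applying Snell's law at the tilted liquid-liquid interface and at the flat exit face. The sketch below is only an illustrative small-angle model, not a description of the devices in the cited applications; the single normally incident ray, the liquid indices and the interface tilt are all assumptions chosen for illustration.

    import math

    def liquid_prism_deflection(n1, n2, tilt_deg):
        """Deflection (degrees, in air) of a ray entering a liquid prism cell at
        normal incidence, crossing a liquid-liquid interface tilted by tilt_deg,
        then leaving through a flat face into air."""
        tilt = math.radians(tilt_deg)
        # Refraction at the tilted interface between liquid 1 and liquid 2
        t2 = math.asin(n1 * math.sin(tilt) / n2)
        ray_angle_in_2 = tilt - t2            # ray angle w.r.t. the cell normal
        # Refraction at the flat exit face into air (n = 1)
        return math.degrees(math.asin(n2 * math.sin(ray_angle_in_2)))

    # Example: water (n ~ 1.33) over an oil (n ~ 1.50), interface tilted by 5 degrees
    print(liquid_prism_deflection(1.33, 1.50, 5.0))   # ~0.85 deg; ~(n2 - n1)*tilt for small angles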

The following is an embodiment of a compact handheld display. Seiko (RTM) Epson (RTM) Corporation of Japan has published monochrome electronically addressed spatial light modulators such as the D4:L3D13U with a 1.3-inch screen diagonal. This illustrative example uses the D4:L3D13U liquid crystal display panel as the spatial light modulator. It has HDTV resolution (1920 x 1080 pixels), a 15 μm pixel pitch and a panel area of 28.8 mm x 16.2 mm. This panel is typically used in 2D image projection displays.

This example is calculated for a wavelength of 633 nm and an observer distance of 50 cm. For this amplitude-modulating spatial light modulator, detour-phase encoding (Burckhardt encoding) is used: three pixels are required to encode one complex number, and these three associated pixels are arranged vertically. If the prism-array beam splitter is integrated into the spatial light modulator, the prism array pitch is 30 μm. If there is a separation between the spatial light modulator and the prism array, the pitch of the prism array is slightly different, to account for perspective foreshortening.
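
In Burckhardt-type detour-phase encoding, each complex hologram value is represented by three non-negative amplitude sub-pixels associated with effective phase factors of 0°, 120° and 240°; any complex number can be written as a non-negative combination of two of these three unit phasors. The following sketch shows a common textbook decomposition of this kind; it is given only as an illustration of the three-pixel idea and is not necessarily the exact scheme used in this patent.

    import cmath
    import math

    def burckhardt_components(c):
        """Decompose complex c into non-negative weights (a0, a1, a2) with
        c = a0*1 + a1*exp(2j*pi/3) + a2*exp(4j*pi/3), one weight being zero."""
        basis = [cmath.exp(2j * math.pi * k / 3) for k in range(3)]
        phi = math.atan2(c.imag, c.real) % (2 * math.pi)
        k = int(phi // (2 * math.pi / 3))          # sector containing arg(c)
        b1, b2 = basis[k], basis[(k + 1) % 3]
        # Solve c = a1*b1 + a2*b2 for real a1, a2 (2x2 linear system)
        det = b1.real * b2.imag - b1.imag * b2.real
        a1 = (c.real * b2.imag - c.imag * b2.real) / det
        a2 = (b1.real * c.imag - b1.imag * c.real) / det
        out = [0.0, 0.0, 0.0]
        out[k], out[(k + 1) % 3] = a1, a2
        return out

    a = burckhardt_components(0.4 - 0.3j)
    print(a)  # three non-negative sub-pixel amplitudes
    print(sum(w * cmath.exp(2j * math.pi * k / 3) for k, w in enumerate(a)))  # ~ (0.4-0.3j)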

The height of the virtual observer window is determined by the pitch of one encoded complex value, 3 * 15 μm = 45 μm, and is 7.0 mm. The width of the virtual observer window is determined by the 30 μm pitch of the prism array and is 10.6 mm. Both values are larger than the eye pupil, so if the virtual observer windows are at the eye positions, each eye can see the holographic reconstruction. The holographic reconstruction results from a two-dimensionally encoded hologram, so the astigmatism inherent in the one-dimensional encoding described above does not occur. This ensures high spatial visual quality and a high-quality depth impression.

For an eye separation of 65 mm, the prisms must deflect the light by ±32.5 mm. More precisely, the maximum of the enveloping sinc-squared intensity function must be shifted by ±32.5 mm. For an observer distance of 0.5 m this corresponds to an angle of ±3.72°. For a refractive index n = 1.5, a suitable prism angle is ±7.44°. The prism angle is defined as the angle between the base and the slanted facet of the prism.

With a horizontal period of 10.6 mm in the observer plane, the other eye is located at a distance of about 6 diffraction orders (i.e. 65 mm divided by 10.6 mm). Crosstalk caused by higher diffraction orders is therefore negligible, because the prism array has a high fill factor, i.e. close to 100%.

The following is an embodiment for a large display. The holographic display may be designed around a phase-modulating spatial light modulator with a pixel pitch of 50 μm and a screen diagonal of 20 inches. For applications such as television the screen diagonal may instead be close to 40 inches. The observer distance for this design is 2 m and the wavelength is 633 nm.

Two phase-modulating pixels of the spatial light modulator are used to encode one complex number. These two associated pixels are arranged vertically, so the corresponding vertical pitch is 2 * 50 μm = 100 μm. If the prism array is integrated into the spatial light modulator, the horizontal pitch of the prism array is also 2 * 50 μm = 100 μm, since each prism has two facets and each facet corresponds to one column of the spatial light modulator. The resulting virtual observer window is 12.7 mm in width and height, which is larger than the eye pupil. Therefore, if the virtual observer windows are at the eye positions, the holographic reconstruction can be seen with each eye. The holographic reconstruction results from a two-dimensionally encoded hologram, so the astigmatism inherent in one-dimensional encoding does not occur. This ensures high spatial visual quality and a high-quality depth impression.
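
With two phase-only pixels per complex value, one common approach is the so-called double-phase decomposition: any complex number c with |c| ≤ 2 can be written as the sum of two unit phasors, exp(iθ1) + exp(iθ2), with θ1,2 = arg(c) ± arccos(|c|/2). The sketch below illustrates this kind of two-pixel phase encoding; it is an example of the general idea and not necessarily the exact scheme intended in this patent.

    import cmath
    import math

    def double_phase(c):
        """Return (theta1, theta2) such that exp(1j*theta1) + exp(1j*theta2) == c.
        Requires |c| <= 2 (amplitudes are normalised to that range beforehand)."""
        mag = abs(c)
        assert mag <= 2.0
        base = cmath.phase(c)
        delta = math.acos(mag / 2.0)
        return base + delta, base - delta

    t1, t2 = double_phase(0.9 + 0.5j)
    print(cmath.exp(1j * t1) + cmath.exp(1j * t2))   # ~ (0.9+0.5j)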

For an eye separation of 65 mm, the prisms must deflect the light by ±32.5 mm. More precisely, the maximum of the enveloping sinc-squared intensity function must be shifted by ±32.5 mm. For an observer distance of 2 m this corresponds to an angle of ±0.93°. For a refractive index n = 1.5, a suitable prism angle is ±1.86°. The prism angle is defined as the angle between the base and the slanted facet of the prism.

The examples above are for observers at 50 cm and at 2 m from the spatial light modulator. In general, this embodiment can be applied to observer distances of roughly 50 cm to 2 m from the spatial light modulator, and to screen diagonals ranging from about 1 cm (e.g. a mobile phone sub-display) to 50 inches (e.g. a large television).

Laser source

RGB solid-state laser sources, such as lasers based on GaInAs or GaInAsN, are suitable light sources for compact holographic displays because they are compact and emit highly directional light. Such light sources include the RGB vertical cavity surface emitting lasers (VCSELs) manufactured by Novalux (RTM) Inc., CA, USA. Such a light source can be provided as a single laser or as a laser array, and each light source can be split into multiple beams by a diffractive optical element. The beams may be transmitted through multimode optical fibres: if the coherence is too high for use in a compact holographic display, this can reduce the degree of coherence so that undesirable artefacts such as laser speckle patterns are avoided. The array of laser light sources may be one- or two-dimensional.

Organic light emitting diode material

Infrared organic light emitting diode materials have been reported. For example, Del Caño et al. reported electroluminescence in organic light emitting diode materials based on perylene diimide-doped tris(8-quinolinolato) aluminium, as described in Applied Physics Letters vol. 88, 071117 (2006).

Electroluminescence at a wavelength of 805 nm is demonstrated there. Domercq et al., J. Phys. Chem. B vol. 108, 8647-8651 (2004), disclose materials for near-infrared organic light emitting diodes. The preparation of organic light emitting diode materials on transparent substrates has also been described. For example, in US 7,098,591, organic light emitting diode materials are prepared on transparent indium tin oxide electrodes. The electrode is prepared on a transparent substrate, which may be borosilicate glass. These elements may be included in an organic light emitting diode device with a transparent substrate. The indium tin oxide layer can be sputtered onto the substrate with a radio-frequency magnetron sputtering tool, using a target containing indium oxide and tin oxide. The indium tin oxide layer can have an optical transmission of about 85% in the visible range. The indium tin oxide should be smooth, to avoid locally enhanced electric fields which could degrade the performance of the organic light emitting diode material; a root mean square roughness of less than about 2 nm is preferred. One or several functional organic layers may be deposited on the patterned electrode surface; the thickness of the organic layers is typically between 2 nm and 200 nm. A conductive layer can be patterned on top of the organic layers to form an anode and a cathode on either side of the organic layers. The device may be sealed with a glass layer to protect the active layers from environmental damage.

Outline of the manufacturing process

An outline of a procedure for making the device of Fig. 2 is described below, although many variations of this procedure may be found in the prior art.

In the process of manufacturing the device of Fig. 2, a transparent substrate is selected. Such a substrate may be a rigid substrate, such as a borosilicate glass sheet about 200 μm thick, or it may be a flexible substrate, such as a polymer substrate made of polycarbonate, acrylic, polypropylene, polyurethane, polystyrene, polyvinyl chloride or a similar material. As described in the previous section, the transparent electrode is prepared on the glass. As also described in the previous section, the infrared organic light emitting diode material is deposited on the glass, and electrical contacts are provided on the side opposite the transparent electrode, so that pixellated emission of infrared light from the organic light emitting diodes is possible. The glass substrate may have recesses in which the organic light emitting diode pixel material is provided. The infrared organic light emitting diode material can be printed, sprayed or solution-processed onto the transparent substrate. A sealing layer, which is also an electrically insulating layer, is then deposited on the organic light emitting diode pixel layer. Such a sealing layer may be an inorganic insulator layer, such as silicon dioxide, silicon nitride or silicon carbide, or it may be a polymer layer, for example an epoxy. The deposition can be performed by sputtering or by chemical vapour deposition for an inorganic insulating layer, or by printing or coating for a polymer layer. The sealing layer, which is also an electrically insulating layer, may have a thickness of a few microns, or less than 10 microns. Next, the photosensitive layer of the optically addressed spatial light modulator is applied over the sealing layer. The photosensitive layer is sensitive to infrared light, transparent to visible light, and may be several micrometres thick; such optical properties can be provided by dyes that absorb infrared light. The optically addressed spatial light modulator is then completed by providing a liquid crystal layer enclosed between two conductive layers. The liquid crystal layer can be configured for amplitude modulation or for phase modulation, and its typical thickness is a few microns. Next, an infrared filter layer is deposited on the device. This may take the form of a thin polymer layer containing infrared-absorbing pigments, or it may be an inorganic layer, such as a thin layer of ceria containing infrared-absorbing elements, grown by sputtering or chemical vapour deposition.

The layer between the two optically addressed spatial light modulator devices must be thick enough to ensure that the electric field in one optically addressed spatial light modulator does not affect the performance of the other optically addressed spatial light modulator. The infrared filter layer may be thick enough to achieve this. If it is not, the optically addressed spatial light modulator devices can be joined to a glass sheet of sufficient thickness, for example with an optical adhesive, or the layer thickness can be increased by providing a further optically transparent layer, for example an inorganic or polymer layer as described above. In any event, the two optically addressed spatial light modulator devices must not be too far apart, so that pixel crosstalk caused by optical diffraction remains small. For example, if the pixel width is 10 microns, the optically addressed spatial light modulator layers should preferably be less than 100 microns apart. The liquid crystal layer in one of the optically addressed spatial light modulators is configured for amplitude modulation; the liquid crystal layer in the other optically addressed spatial light modulator is configured for phase modulation.

The remaining parts of the device can be fabricated using the methods described above for the optically addressed spatial light modulators and the organic light emitting diode layers. Alternatively, the remaining part of the device can be prepared as a single component and then bonded to the first part of the device, for example using a glass layer that ensures adequate separation between the optically addressed spatial light modulator layers, so that the electric field of one optically addressed spatial light modulator does not affect the other. Preparing the remainder of the device by depositing further material onto the first part of the device has the advantage of facilitating precise alignment of the pixels of the second organic light emitting diode layer with the pixels of the first organic light emitting diode layer.

Instead of using a spacer layer of sufficient thickness, a thin spacer layer coated with a conducting transparent electrode (e.g. indium tin oxide) may be placed between the two optically addressed spatial light modulators. This electrode acts as a common electrode for the two liquid crystal layers. Moreover, being a conductive electrode, it is an equipotential surface: it therefore screens the electric field and prevents field leakage from one optically addressed spatial light modulator into the other.

Fig. 9 shows an example of a device structure which can be manufactured by the above procedure or similar procedures. During use, surface 909 of device structure 910 in Fig. 9 is illuminated with substantially coherent visible light, such that a viewer at point 911, at some distance from the device (relative to the dimensions of the device), can see a three-dimensional image. The layers 90 to 908 in the device are not necessarily drawn to scale relative to each other. Layer 90 is a substrate layer, such as a glass layer. Layer 91 is an organic light emitting diode backplane layer, which provides the power supply for the organic light emitting diodes and may be wholly or partially transparent. Layer 92 is an array of infrared organic light emitting diodes. Layer 93 is a Bragg-filter holographic optical element for at least partially collimating the infrared light. In some embodiments, layer 93 can be omitted. Layer 94 is an electrically insulating layer. Layer 95 is the photosensitive and electrode layer of an optically addressed spatial light modulator. Layer 96 is a liquid crystal layer for amplitude modulation of the visible beam. Layer 97 is a spacer layer, in particular a thin spacer layer. Layer 98 is a transparent electrode layer. Layer 99 is a linear polarizer layer. Layer 900 is an infrared filter layer, which transmits visible light but blocks infrared light from the organic light emitting diode arrays 92 and 906. Layer 901 is a liquid crystal layer for phase modulation of the visible beam. Layer 902 is a spacer layer, in particular a thin spacer layer. Layer 903 is the photosensitive and electrode layer of an optically addressed spatial light modulator. Layer 904 is an electrically insulating layer. Layer 905 is a Bragg-filter holographic optical element for at least partially collimating the infrared light. In some embodiments, layer 905 can be omitted. Layer 906 is an array of infrared organic light emitting diodes. Layer 907 is an organic light emitting diode backplane layer, which provides the power supply for the organic light emitting diodes and may be wholly or partially transparent. Layer 908 is a plane of covering material, such as glass. During fabrication, the manufacture of device 910 can begin with substrate layer 90, each layer being deposited in sequence until the last layer 908 is added. This procedure has the advantage of facilitating highly accurate alignment of the layer structures. Alternatively, the manufacture of the layers can be divided into two or more sections which are bonded together with sufficient alignment accuracy.

In manufacturing the device, it is important to keep unwanted birefringence, such as stress-induced birefringence, to a minimum. Stress-induced birefringence can change the linear or circular polarization state of the light into an elliptical polarization state. In a device designed for ideally linear or circular polarization states, the presence of elliptically polarized light reduces contrast and colour fidelity, and therefore reduces device performance.

Practice

Based on conventional techniques, the optically addressed spatial light modulators of the above embodiments need a photosensitive layer that is transparent in the visible range but absorbs infrared light. In another implementation, the photosensitive layer can be patterned so as to provide transparent gaps that transmit visible light, such as the red, green and blue beams, together with non-transparent regions that are sensitive to the light from the organic light emitting diodes. In this case the photosensitive material need not be transparent to visible light. Furthermore, the writing beam need not be infrared. In one implementation, the writing beam can be produced at a non-primary display colour, for example by yellow organic light emitting diodes. The filter between the two optically addressed spatial light modulators then needs strong optical absorption in the yellow, so as to block the yellow light, while still transmitting sufficiently at the other optical wavelengths for an effective optical display. In another implementation, the writing beam can be produced by ultraviolet organic light emitting diodes. The filter between the two optically addressed spatial light modulators then needs strong optical absorption in the ultraviolet, so as to block the ultraviolet light, while still transmitting sufficiently at the other optical wavelengths for an effective optical display. Ultraviolet organic light emitting diode materials have been prepared by Qiu et al., Applied Physics Letters 79, 2276 (2001), and by Wong et al., Org. Lett. 7 (23), 5131 (2005). In addition, although emphasis has been placed on the use of organic light emitting diode materials, other light emitting diode materials, or other display technologies such as surface-conduction electron-emitter display (SED) technology, may be used instead.

Although the embodiments described here emphasize the successive encoding of amplitude and phase in the spatial light modulators, based on conventional techniques any two unequal combinations of amplitude and phase modulation can be used to encode the hologram pixels, provided the two combinations are not related by multiplication by a real number; they may, however, be related by multiplication by a complex number that is not a real number. The reason is that the vector space of possible holographic encodings of a pixel is spanned, in the vector-space sense, by any two unequal combinations of amplitude and phase that are not related by multiplication by a real number.

In the referenced figures, the relative dimensions shown are not necessarily to scale.

The technology disclosed herein can be implemented by a person skilled in the art; its novel aspects are patentable, and patent protection is applied for accordingly. The embodiments described above do not, however, limit the scope of protection sought in this case, which is defined by the appended claims.

10‧‧‧Lighting device

11‧‧‧Color Filter Array

12‧‧‧Infrared organic light emitting diode array

13‧‧‧Optical addressed spatial light modulator

14‧‧‧ points

15‧‧‧Complete hologram generator

20‧‧‧Lighting device

21‧‧‧Color Filter Array

22‧‧‧Infrared organic light emitting diode array

23‧‧‧Optical addressed spatial light modulator

24‧‧‧ points

25‧‧‧Complete hologram generator

26‧‧‧Infrared filter

27‧‧‧Optical addressed spatial light modulator

28‧‧‧Infrared organic light emitting diode array

30‧‧‧Mobile Phone

31‧‧‧Screen area

32‧‧‧Antenna

33‧‧‧ camera

34‧‧‧ camera

35‧‧‧ button

36‧‧‧ button

1101‧‧‧ Focusing components

1102‧‧‧ Focusing components

1103‧‧‧ Focusing components

1104‧‧‧Vertical Focusing System

1105‧‧‧First diffraction order

1106‧‧‧Zeroth diffraction order

1107‧‧‧Negative diffraction order

50‧‧‧Microlens array

51‧‧‧Color Filter Array

52‧‧‧Infrared organic light emitting diode array

53‧‧‧Optical addressed spatial light modulator

54‧‧‧Optical addressed spatial light modulator

55‧‧‧Complete hologram generator

56‧‧‧ points

57‧‧‧Lighting device

70‧‧‧Space light modulator

71‧‧‧Holographic optical element Bragg filter

73‧‧‧ single component

74‧‧‧Bragg plane

75‧‧‧Diffractive light intensity distribution

76‧‧‧Light

80‧‧‧Organic LED array

81‧‧‧Holographic optical element Bragg filter

82‧‧‧Optical addressed spatial light modulator

83‧‧‧Single organic light-emitting diode

84‧‧‧Bragg plane

85‧‧‧ Distribution of infrared rays emitted

86‧‧‧Light rays

90‧‧‧ basal layer

91‧‧‧Organic light emitting diode backplane layer

92‧‧‧Infrared organic light emitting diode array

93‧‧‧Bragg filter hologram element

94‧‧‧Electrical insulation

95‧‧‧Optical addressed spatial light modulator photosensitive and electrode layer

96‧‧‧Liquid crystal layer

97‧‧‧Separation layer

98‧‧‧Transparent electrode layer

99‧‧‧linear polarizing layer

900‧‧‧Infrared filter

901‧‧‧Liquid crystal layer

902‧‧‧Separation layer

903‧‧‧Optical addressed spatial light modulator photosensitive and electrode layer

904‧‧‧Electrical insulation

905‧‧‧Bragg filter hologram element

906‧‧‧Infrared organic light emitting diode array

907‧‧‧Organic light emitting diode backplane layer

908‧‧‧Plane of covering material

909‧‧‧ surface

910‧‧‧Device structure

911‧‧ points

100‧‧‧Microlens array

101‧‧‧Color Filter Array

102‧‧‧Electronically addressed spatial light modulator

103‧‧‧Electronically addressed spatial light modulator

104‧‧‧Lighting device

105‧‧‧Compact hologram generator

106‧‧‧ points

107‧‧‧ components

108‧‧‧ components

110‧‧‧Lighting device

111‧‧‧Color Filter Array

112‧‧‧Electronically addressed spatial light modulator

113‧‧‧ Beam beam splitter components

114‧‧‧ points

115‧‧‧Compact hologram generator

130‧‧‧Lighting device

131‧‧‧Color Filter Array

132‧‧‧Electronically addressed spatial light modulator

133‧‧‧Electronically addressed spatial light modulator

134‧‧‧beam beam splitter element

135‧‧ points

136‧‧‧Complete hologram generator

171‧‧‧ Beam

172‧‧‧ Beam

220‧‧‧Users

221‧‧‧Users

222‧‧‧ links

223‧‧‧ links

224‧‧‧Intermediate system

2300‧‧‧TV Communications

2301‧‧‧Intermediate system

2302‧‧‧ Viewers

2303‧‧‧Advertisers

2304‧‧‧Two-dimensional content

2305‧‧‧3D content

2306‧‧‧Payment fees

159‧‧‧ prism elements

1517‧‧‧electrode

1518‧‧‧electrode

1519‧‧‧Liquid

1520‧‧‧Liquid

Fig. 1 is a schematic diagram of a holographic display device comprising a single optically addressed spatial light modulator and a single organic light emitting diode array;
Fig. 2 is a schematic diagram of a holographic display device comprising a pair of components, each component comprising a single optically addressed spatial light modulator and a single organic light emitting diode array;
Fig. 3 is a schematic diagram of a mobile three-dimensional display device;
Fig. 4 is a schematic diagram of a conventional holographic display;
Fig. 5 is a schematic diagram of a holographic display in which a single organic light emitting diode array controls two optically addressed spatial light modulators;
Fig. 6A is a schematic diagram of a holographic display;
Fig. 6B is a schematic diagram suitable for achieving a compact holographic display;
Fig. 7 is a schematic diagram of components of a holographic display including a Bragg-filter holographic optical element for reducing problems associated with higher diffraction orders;
Fig. 8 is a schematic diagram of components of a holographic display including a Bragg-filter holographic optical element for enhancing the collimation of the light emitted by the organic light emitting diode array;
Fig. 9 is a schematic diagram of a holographic display device;
Fig. 10 is a schematic diagram of a holographic display device comprising two electronically addressed spatial light modulators for encoding amplitude and phase in succession;
Fig. 11 is a schematic diagram of a holographic display device comprising a single electronically addressed spatial light modulator;
Fig. 12 shows a specific embodiment of a holographic display according to one embodiment;
Fig. 13 is a schematic diagram of a holographic display device comprising two electronically addressed spatial light modulators for encoding amplitude and phase in succession;
Fig. 14 shows diffraction simulation results obtained using MathCad (RTM);
Fig. 15 shows diffraction simulation results obtained using MathCad (RTM);
Fig. 16 shows diffraction simulation results obtained using MathCad (RTM);
Fig. 17 is a schematic diagram of the arrangement of lens layers between two electronically addressed spatial light modulators according to one embodiment;
Fig. 18 is a schematic diagram of the diffraction process that occurs as light travels from a first electronically addressed spatial light modulator to a second electronically addressed spatial light modulator;
Fig. 19 is a schematic diagram of a structure with two electronically addressed spatial light modulators, in which a fibre optic panel is placed between the two electronically addressed spatial light modulators;
Fig. 20 is a schematic diagram of a beam steering element;
Fig. 21 is a schematic diagram of a beam steering element;
Fig. 22 is a schematic diagram of a system enabling three-dimensional visual communication;
Fig. 23 is a schematic diagram of a method of converting two-dimensional image content into three-dimensional image content;
Fig. 24 is a schematic diagram of a holographic display element according to one embodiment;
Fig. 25 is a schematic diagram of a holographic display comprising a light source of a two-dimensional light source array, a lens of a two-dimensional lens array, a spatial light modulator and a beam splitter, in which the beam splitter splits the light exiting the spatial light modulator into two beams that illuminate the virtual observer window for the left eye (VOWL) and the virtual observer window for the right eye (VOWR), respectively;
Fig. 26 is a schematic diagram of a holographic display comprising two light sources of a two-dimensional light source array, two lenses of a two-dimensional lens array, a spatial light modulator and a beam splitter, in which the beam splitter splits the light exiting the spatial light modulator into two beams that illuminate the virtual observer window for the left eye (VOWL) and the virtual observer window for the right eye (VOWR), respectively;
Fig. 27 is a schematic cross-sectional view of a prism-based beam steering element.


Claims (20)

  1. A mobile telephone system comprising: a calling party's mobile phone having an imaging system and a display and operable to capture an image of the calling party, the calling party's mobile phone transmitting the image of the calling party to a called party's mobile phone, and the called party's mobile phone locally generating a holographic reconstruction of the calling party using a holographic display encoded with a computer-generated hologram, wherein the data processing for the three-dimensional image display is performed by an intermediate system located on a transmission network between the two mobile phones, the intermediate system comprising a computer that performs the calculations enabling the three-dimensional computer-generated hologram to be displayed.
  2. The mobile telephone system of claim 1, wherein the holographic display comprises an organic light emitting diode array (OLED) which writes to an optically addressed spatial light modulator (OASLM), the array and the modulator forming a plurality of adjacent layers.
  3. The mobile telephone system of claim 1, wherein the holographic display comprises two pairs of an organic light emitting diode array and an optically addressed spatial light modulator, the organic light emitting diode array of each pair writing to the optically addressed spatial light modulator of that pair, the pairs forming a plurality of adjacent layers.
  4. The mobile telephone system of claim 1 or 2, wherein the holographic display comprises an electronically addressed spatial light modulator (EASLM), or wherein the holographic display comprises two electronically addressed spatial light modulators.
  5. The mobile telephone system of claim 1, wherein a remote server or an intermediate system adds a depth map and transmits the image of the calling party and the depth map to the called party's mobile phone, or wherein the remote server is programmed with a profile that defines a three-dimensional model of the calling party's face.
  6. The mobile telephone system of claim 5, wherein the called party's mobile telephone system includes a synchronization device to compensate for delays caused by the remote server.
  7. The mobile telephone system of claim 1, wherein the called party's mobile phone includes a stop function for generating a static holographic reconstruction, or wherein the called party's mobile phone includes a zoom function that allows a user to zoom in on a portion of the holographic reconstruction.
  8. The mobile telephone system of claim 1, wherein the called party's mobile phone and/or the calling party's mobile phone comprises a plurality of stereo cameras, or wherein the called party's mobile phone and/or the calling party's mobile phone includes a single camera and software that generates a depth map from the data obtained by the single camera.
  9. The mobile telephone system of claim 1, wherein the called party's mobile phone and/or the calling party's mobile phone displays an indication on the screen to guide the user towards the ideal camera position or orientation for optimal image capture and/or holographic reconstruction.
  10. The mobile telephone system of claim 1, wherein the called party's mobile phone and/or the calling party's mobile phone is a display device whose generated holographic reconstruction is seen correctly and sharply by a user positioned at a predetermined distance from the display.
  11. The mobile telephone system of claim 1, wherein the called party's mobile phone and/or the calling party's mobile phone is a display device capable of switching from a holographic reconstruction mode to a conventional two-dimensional display mode, or wherein the called party's mobile phone and/or the calling party's mobile phone is a handheld portable device.
  12. The mobile telephone system of claim 1, wherein the called party's mobile phone and/or the calling party's mobile phone is a personal digital assistant (PDA) or a video game device.
  13. The mobile telephone system of claim 1, wherein the holographic reconstruction of the holographic display is provided for viewing by a single user.
  14. The mobile telephone system of claim 1, wherein the holographic display can generate a two-dimensional image focused on a screen without any projection lens, the screen being at a distance from the device that is optically in the far field.
  15. The mobile telephone system of claim 1, wherein the spatial light modulators (SLMs) of the holographic display are disposed within 30 mm of a light source and are placed in a portable housing.
  16. The mobile telephone system of claim 1, wherein a beam steering element is present for tracking a plurality of virtual observer windows (VOWs), the beam steering element comprising an array of controllable cells in the form of an electrowetting array, each cell comprising a plurality of electrodes, a cavity filled with two separate liquids, and an interface between the liquids, the slope of the interface between the liquids being electrically controllable by means of the voltage applied to the electrodes.
  17. The mobile telephone system of claim 1, wherein the holographic display has a beam steering element for tracking a plurality of virtual observer windows, the beam steering element comprising a plurality of liquid crystal domains within an isotropic matrix material, the interfaces between the domains and the matrix being prism-shaped, or shaped as part of a sphere, or as part of a cylinder, and the orientation of the liquid crystals being controlled by an applied electric field so as to vary the local refractive or diffractive properties of the beam steering element.
  18. A communication method using a mobile telephone system according to any one of the preceding claims, wherein the mobile telephone system comprises a calling party's mobile phone having an imaging system and a display and operable to capture an image of the calling party, the calling party's mobile phone transmitting the image of the calling party to a called party's mobile phone via a wireless connection, and the called party's mobile phone locally generating a holographic reconstruction of the calling party using a holographic display encoded with a computer-generated hologram, wherein the data processing for the three-dimensional image display is performed by an intermediate system on a transmission network between the two mobile phones, the intermediate system comprising a computer that performs the calculations enabling the three-dimensional computer-generated hologram to be displayed.
  19. A method of providing a telecommunication service, in which a network operator provides a calling party's mobile phone, a called party's mobile phone, a wireless link and a remote server, the calling party's mobile phone having an imaging system and a display and being operable to capture an image of the calling party, the calling party's mobile phone transmitting the image of the calling party to the called party's mobile phone via the wireless link, and the called party's mobile phone locally generating a holographic reconstruction of the calling party using a holographic display encoded with a computer-generated hologram, wherein the data processing for the three-dimensional image display is performed by the remote server on a transmission network between the two mobile phones, the remote server comprising a computer that performs the calculations enabling the three-dimensional computer-generated hologram to be displayed.
  20. A method of making an image call from a calling party's mobile phone having an imaging system and a display and operable to capture an image of the calling party, the calling party's mobile phone transmitting the image of the calling party to a called party's mobile phone, and the called party's mobile phone locally generating a holographic reconstruction of the calling party using a holographic display encoded with a computer-generated hologram, wherein the data processing for the three-dimensional image display is performed by an intermediate system on a transmission network between the two mobile phones, the intermediate system comprising a computer that performs the calculations enabling the three-dimensional computer-generated hologram to be displayed.
TW96140507A 2006-10-26 2007-10-26 Mobile telephone system and method for using the same TWI432002B (en)

Priority Applications (16)

Application Number Priority Date Filing Date Title
GBGB0621360.7A GB0621360D0 (en) 2006-10-26 2006-10-26 Compact three dimensional image display device
GBGB0625838.8A GB0625838D0 (en) 2006-10-26 2006-12-22 Compact three dimensional image display device
GB0705401A GB0705401D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705404.2A GB0705404D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705410A GB0705410D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705399A GB0705399D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705405.9A GB0705405D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705408A GB0705408D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705411.7A GB0705411D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GBGB0705402.6A GB0705402D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705407A GB0705407D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705398A GB0705398D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705406A GB0705406D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705412A GB0705412D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705409A GB0705409D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device
GB0705403A GB0705403D0 (en) 2006-10-26 2007-03-21 Compact three dimensional image display device

Publications (2)

Publication Number Publication Date
TW200845698A TW200845698A (en) 2008-11-16
TWI432002B true TWI432002B (en) 2014-03-21

Family

ID=44771492

Family Applications (6)

Application Number Title Priority Date Filing Date
TW96140506A TWI421541B (en) 2006-10-26 2007-10-26 Full image display device and method (2)
TW96140508A TWI442763B (en) 2006-10-26 2007-10-26 3d content generation system
TW96140509A TWI406115B (en) 2006-10-26 2007-10-26 Holographic display device and method for generating holographic reconstruction of three dimensional scene
TW96140505A TWI421540B (en) 2006-10-26 2007-10-26 Universal image display device and method (1)
TW96140507A TWI432002B (en) 2006-10-26 2007-10-26 Mobile telephone system and method for using the same
TW096140510A TWI454742B (en) 2006-10-26 2007-10-26 Compact three dimensional image display device

Family Applications Before (4)

Application Number Title Priority Date Filing Date
TW96140506A TWI421541B (en) 2006-10-26 2007-10-26 Full image display device and method (2)
TW96140508A TWI442763B (en) 2006-10-26 2007-10-26 3d content generation system
TW96140509A TWI406115B (en) 2006-10-26 2007-10-26 Holographic display device and method for generating holographic reconstruction of three dimensional scene
TW96140505A TWI421540B (en) 2006-10-26 2007-10-26 Universal image display device and method (1)

Family Applications After (1)

Application Number Title Priority Date Filing Date
TW096140510A TWI454742B (en) 2006-10-26 2007-10-26 Compact three dimensional image display device

Country Status (2)

Country Link
JP (1) JP2014209247A (en)
TW (6) TWI421541B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5397190B2 (en) * 2009-11-27 2014-01-22 ソニー株式会社 Image processing apparatus, image processing method, and program
CN102736393B (en) 2011-04-07 2014-12-17 台达电子工业股份有限公司 Display apparatus for displaying multiple images of viewing angles
TWI501053B (en) * 2011-10-28 2015-09-21 Jing Heng Chen Holographic imaging device and method thereof
US9019584B2 (en) 2012-03-12 2015-04-28 Empire Technology Development Llc Holographic image reproduction mechanism using ultraviolet light
TWI508040B (en) * 2013-01-07 2015-11-11 Chunghwa Picture Tubes Ltd Stereoscopic display apparatus and electric apparatus thereof
TWI493160B (en) * 2013-05-13 2015-07-21 Global Fiberoptics Inc Method for measuring the color uniformity of a light spot and apparatus for measuring the same
TWI537605B (en) 2014-08-28 2016-06-11 台達電子工業股份有限公司 Autostereoscopic display device and autostereoscopic display method using the same
CN104463964A (en) * 2014-12-12 2015-03-25 英华达(上海)科技有限公司 Method and equipment for acquiring three-dimensional model of object
TWI670850B (en) * 2019-03-08 2019-09-01 友達光電股份有限公司 Display device

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0451681B1 (en) * 1990-04-05 1997-11-05 Seiko Epson Corporation Optical apparatus
ID27878A (en) * 1997-12-05 2001-05-03 Dynamic Digital Depth Res Pty Enhanced image conversion and encoding techniques
JP2000078611A (en) * 1998-08-31 2000-03-14 Toshiba Corp Stereoscopic video image receiver and stereoscopic video image system
GB2350962A (en) * 1999-06-09 2000-12-13 Secr Defence Brit Holographic displays
GB2350963A (en) * 1999-06-09 2000-12-13 Secr Defence Holographic Displays
JP2002123688A (en) * 2000-10-16 2002-04-26 Sony Corp Holographic stereogram print order receipt system and its method
US6683665B1 (en) * 2000-11-20 2004-01-27 Sarnoff Corporation Tiled electronic display structure and method for modular repair thereof
JP2002223456A (en) * 2001-01-24 2002-08-09 Morita Mfg Co Ltd Image data distribution method and image distributor, and recording medium
US6721077B2 (en) * 2001-09-11 2004-04-13 Intel Corporation Light emitting device addressed spatial light modulator
JP3679744B2 (en) * 2001-09-26 2005-08-03 三洋電機株式会社 Image composition method and apparatus
JP2003289552A (en) * 2002-03-28 2003-10-10 Toshiba Corp Image display terminal and stereoscopic image display system
GB2390172A (en) * 2002-06-28 2003-12-31 Sharp Kk Polarising optical element and display
JP2004040445A (en) * 2002-07-03 2004-02-05 Sharp Corp Portable equipment having 3d display function and 3d transformation program
DE10353439B4 (en) * 2002-11-13 2009-07-09 Seereal Technologies Gmbh Device for the reconstruction of video holograms
GB0307923D0 (en) * 2003-04-05 2003-05-14 Holographic Imaging Llc Spatial light modulator imaging system
GB2406730A (en) * 2003-09-30 2005-04-06 Ocuity Ltd Directional display.
JP4230331B2 (en) * 2003-10-21 2009-02-25 富士フイルム株式会社 Stereoscopic image generation apparatus and image distribution server
US20090285906A1 (en) * 2004-12-21 2009-11-19 Alpha-Biocare Gmbh Preparation Made From Diptera Larvae For The Treatment Of Wounds
DE102004063838A1 (en) * 2004-12-23 2006-07-06 Seereal Technologies Gmbh Method and apparatus for calculating computer generated video holograms
US20060187297A1 (en) * 2005-02-24 2006-08-24 Levent Onural Holographic 3-d television

Also Published As

Publication number Publication date
TWI442763B (en) 2014-06-21
JP2014209247A (en) 2014-11-06
TWI421541B (en) 2014-01-01
TW200839295A (en) 2008-10-01
TW200845698A (en) 2008-11-16
TW200824426A (en) 2008-06-01
TWI454742B (en) 2014-10-01
TW200841042A (en) 2008-10-16
TWI406115B (en) 2013-08-21
TWI421540B (en) 2014-01-01
TW200844693A (en) 2008-11-16
TW200827771A (en) 2008-07-01
